Compare commits

...

148 Commits

Author SHA1 Message Date
copilot-swe-agent[bot] 4a30a697f1 check-requirements: update existing comment in place instead of delete+recreate
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/0c57df2f-81a3-4ab1-9343-465523db657f

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-29 06:24:07 +00:00
copilot-swe-agent[bot] 0f738ce5b0 check-requirements: add workflow_dispatch trigger, deduplicate comment on each run
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/7ca80b22-68f1-4a3b-ad94-2d4c054ac0f0

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-24 14:57:03 +00:00
Robert Resch 93a8f4d94e Apply suggestions from code review
Co-authored-by: Robert Resch <robert@resch.dev>
2026-04-24 01:06:21 +02:00
copilot-swe-agent[bot] 3303339797 check-requirements: move overall summary line to top of comment (before table)
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/9228f3c1-ac84-42f9-aed1-c8c6156cef03

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-23 23:03:28 +00:00
copilot-swe-agent[bot] eacfd0ce50 check-requirements: use icon-only table, add collapsible per-package detail sections
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/5314c056-b511-48aa-bace-bb9c43fac637

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-23 22:59:18 +00:00
Robert Resch 609f935430 Recompile 2026-04-23 22:19:29 +00:00
copilot-swe-agent[bot] 30151a484b Exclude auto-generated lock file from prettier check
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/0ab3b575-2a57-48e7-a15f-cd55aa410f41

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-23 22:15:44 +00:00
copilot-swe-agent[bot] df1cf178e8 Exclude auto-generated lock file from yamllint and zizmor checks
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/f728bfbc-371b-44a3-bce9-3ecdc9cce4fb

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-23 09:23:26 +00:00
copilot-swe-agent[bot] fe2214e071 check-requirements: tighten step 4a, add public-repo check, always comment
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/dc04d9a1-1c24-4abd-8379-58a473ba3f25

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-22 15:17:44 +00:00
copilot-swe-agent[bot] 36488d5d26 Revert "Add PyPI wheel availability info output to hassfest requirements check"
This reverts commit 4a895255d6.

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-22 14:14:35 +00:00
copilot-swe-agent[bot] 4a895255d6 Add PyPI wheel availability info output to hassfest requirements check
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/846855d5-b238-485c-ad9c-9def58ab5de5

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-22 13:51:58 +00:00
copilot-swe-agent[bot] b3075ecc9b Restore forks trigger; generalize release pipeline check to GitLab and other hosts
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/774c8674-5a55-4b8c-a48c-44ebfe4ca73d

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-22 13:28:56 +00:00
copilot-swe-agent[bot] fdfe4365a1 Restrict workflow to non-fork PRs; add PyPI CI-upload and release pipeline sanity checks
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/a9e9c91a-f16e-4237-8693-f301733062a3

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-22 13:17:15 +00:00
copilot-swe-agent[bot] 5043c8b87d Expand requirements check: test deps, repo-specific link validation, diff consistency
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/552c9b6d-5829-411f-b3cd-a86c7ffb7ac7

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-22 12:56:08 +00:00
copilot-swe-agent[bot] 7864a661e1 Fix duplicate .gitattributes entry for lock files
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/175e4dde-73f0-4164-bf5f-7a839518bf1f

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-22 12:39:13 +00:00
copilot-swe-agent[bot] aa40340068 Add agentic workflow to check requirements licenses and PR description links
Agent-Logs-Url: https://github.com/home-assistant/core/sessions/175e4dde-73f0-4164-bf5f-7a839518bf1f

Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-22 12:38:07 +00:00
Robert Resch 1345356bdc Validate local_only user property during ws auth phase (#168812) 2026-04-22 14:07:47 +02:00
Shay Levy be07fed774 Remove unused hass.data[DOMAIN] in LG webOS TV (#168813) 2026-04-22 13:58:44 +02:00
Erwin Douna d17f6a1509 Firefly III consistency with access token (#168565) 2026-04-22 11:12:40 +02:00
Thijs W. f3932f2342 Improve exception handling for frontier_silicon (#168635)
Co-authored-by: Copilot Autofix powered by AI <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Erwin Douna <e.douna@gmail.com>
2026-04-22 10:58:09 +02:00
Mick Vleeshouwer 598be31daf Improve test structure for Overkiz (#168728) 2026-04-22 10:10:18 +02:00
epenet 9b2a81614f Simplify Tuya runtime_data (#168718)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-04-22 10:02:24 +02:00
Øyvind Matheson Wergeland f53c89d3bc Translate override_type options in nobo_hub (#168752) 2026-04-22 09:59:51 +02:00
dependabot[bot] ac6991072f Bump github/codeql-action from 4.35.1 to 4.35.2 (#168754)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-22 09:53:11 +02:00
Jan Bouwhuis 018e8e06fa Cancel and await idle_start future if the task was canceled after an IMAP connection was lost (#168662)
Co-authored-by: J. Nick Koston <nick@koston.org>
2026-04-22 09:43:22 +02:00
Ronald van der Meer 0ffc9694a7 Bump python-duco-client to 0.3.4 (#168757) 2026-04-22 09:41:21 +02:00
Marc Mueller 8d8b30a41e Update mypy to 1.20.2 (#168741) 2026-04-22 09:38:08 +02:00
Tomer 9b7f61d862 Victron GX: Diagnostics (#168700)
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Co-authored-by: Copilot Autofix powered by AI <175728472+Copilot@users.noreply.github.com>
2026-04-22 09:36:49 +02:00
epenet 368f2f44be Use HassKey in zeroconf (#168707)
Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-22 09:26:13 +02:00
LG-ThinQ-Integration ad6a910244 Bump thinqconnect to 1.0.12 (#168753)
Co-authored-by: YunseonPark-LGE <yunseon.park@lge.com>
2026-04-22 09:21:15 +02:00
Leonardo Rivera 840b44039d Fix OneDrive upload service to support multiple files (#168512) 2026-04-22 09:11:27 +02:00
Ronald van der Meer 1943675a64 Add DHCP discovery to Duco integration (#168730) 2026-04-22 08:32:05 +02:00
Linkplay2020 161e05b075 Updata wiim to 0.1.2 (#168671)
Co-authored-by: Tao Jiang <tao.jiang@linkplay.com>
2026-04-22 08:07:17 +02:00
Paulus Schoutsen f2d5ca3582 Rename SerialSelector to SerialPortSelector (#168744)
Co-authored-by: Claude <noreply@anthropic.com>
2026-04-22 07:47:28 +02:00
Florent Thoumie 551af8caef Rename iAqualink to iAquaLink (#168743) 2026-04-22 07:26:48 +02:00
Johan Henkens 201c575316 Bump aioesphomeapi to 44.18.0 (#168749) 2026-04-22 06:12:32 +02:00
tronikos 703860ee6e Add support for away mode in ESPHome water heater (#167951)
Co-authored-by: J. Nick Koston <nick@home-assistant.io>
Co-authored-by: J. Nick Koston <nick+github@koston.org>
Co-authored-by: Copilot Autofix powered by AI <175728472+Copilot@users.noreply.github.com>
2026-04-22 05:37:47 +02:00
puddly cb021f0b6b Allow integrations to contribute serial port scanning helpers (#168660)
Co-authored-by: Paulus Schoutsen <paulus@home-assistant.io>
2026-04-21 21:15:57 -04:00
Øyvind Matheson Wergeland 50dbff31b0 Fix nobo_hub override type description (#168740) 2026-04-21 23:30:06 +02:00
MohamedBarrak3 800299077e Fix case-sensitive MIME type check in Google Generative AI TTS (#168458) 2026-04-21 23:26:31 +02:00
Andrew Jackson f40b269752 Version checking of Transmission (#168429)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-04-21 23:26:14 +02:00
David f2105c07de Expose Lutron Caseta shade battery status on covers (#165180) 2026-04-21 23:25:45 +02:00
Erwin Douna d23dbfb214 Add volumes to Portainer (#167326) 2026-04-21 23:23:27 +02:00
Erwin Douna de6586684a Add recreate container button to Portainer (#167163) 2026-04-21 23:21:45 +02:00
Avi Miller 9a08b941bb Limit LIFX bulb changes to the values that are actually changing (#168618) 2026-04-21 23:08:04 +02:00
Øyvind Matheson Wergeland 51b9f004e9 Introduce NoboBaseEntity in nobo_hub (#168724)
Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-04-21 23:03:45 +02:00
epenet fe443f4ce9 Use runtime_data in wyoming integration (#168619) 2026-04-21 22:50:06 +02:00
Thijs W. b0ba7ec6ec Frontier silicon: use correct command to restart stopped stream (#168633)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-04-21 22:36:44 +02:00
Florent Thoumie 156901c290 iaqualink: Add basic DHCP discovery for iAquaLink devices (#168256)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-04-21 22:34:37 +02:00
Franck Nijhof b6271e59fa Add sensor platform to Fumis integration (#168680)
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-04-21 22:24:13 +02:00
Franck Nijhof 17cd0aa474 Add DHCP discovery to Fumis integration (#168735) 2026-04-21 22:20:51 +02:00
Stefan Agner 79f12f658a Improve Supervisor update entity progress and data refresh (#168712)
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-21 21:01:01 +02:00
Simone Chemelli e13b63342e Disable DNS queries in tests (#165603)
Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>
Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2026-04-21 20:42:30 +02:00
Erik Montnemery 3500f0a195 Revert "Add Broadlink infrared emitter support to native infrared platform" (#168717) 2026-04-21 18:19:22 +02:00
Øyvind Matheson Wergeland 4a93dcb936 Add data descriptions for nobo_hub config and options flows (#168723)
Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-21 18:02:54 +02:00
Ronald van der Meer 27ddb5b6a4 Claim platinum quality scale for Duco integration (#168719) 2026-04-21 17:30:58 +02:00
Raphael Hehl 0ff38cdc7f Fix/unifi access uah door and thumbnail (#168708) 2026-04-21 17:04:49 +02:00
Mick Vleeshouwer 1a8adea358 Add sensor entity tests to Overkiz (#168701) 2026-04-21 16:53:14 +02:00
Ariel Ebersberger 2a85046584 Fix shelly tests - bluetooth config flow (#166850) 2026-04-21 16:46:33 +02:00
Florent Thoumie fc85d35d4c Add initial quality scale assessment to iaqualink, set to bronze (#167738)
Co-authored-by: Ariel Ebersberger <31776703+justanotherariel@users.noreply.github.com>
2026-04-21 16:39:25 +02:00
Raphael Hehl 608b92be40 unifi: implement action-exceptions quality scale rule (#168559)
Co-authored-by: RaHehl <rahehl@users.noreply.github.com>
2026-04-21 16:25:41 +02:00
renovate[bot] af01b41e52 Update infrared-protocols to 2.0.0 (#168667)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Paulus Schoutsen <balloob@gmail.com>
2026-04-21 15:13:58 +01:00
MohamedBarrak3 f257d54d1e Bump mcstatus to 13.1.0 (#168716) 2026-04-21 16:09:14 +02:00
Denis Shulyaka 7c7c075df4 Filter Anthropic schema (#168542) 2026-04-21 09:55:00 -04:00
Denis Shulyaka 5a487d452d Remove retired Claude Haiku 3 model (#168657) 2026-04-21 09:53:56 -04:00
arsenicks a4138fa4cd Sonos - Add support for TV Autoplay and Ungroup on Autoplay (#167956)
Co-authored-by: Gustav Åkerström <23389010+gustavakerstrom@users.noreply.github.com>
2026-04-21 15:28:39 +02:00
epenet a6b4609313 Combine AWS hass.data entries into a single dataclass (#168711) 2026-04-21 15:24:14 +02:00
Aaron Ten Clay 95e9405cd0 Preserve Fahrenheit precision in google_assistant temperature range (#168672) 2026-04-21 15:22:21 +02:00
bkobus-bbx d990ec1b65 Bump blebox_uniapi to v2.5.1 (#168713) 2026-04-21 15:21:24 +02:00
epenet 52d7dcbcc8 Drop redundant BackupManager annotation in aws_s3/google_drive diagnostics (#168714)
Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-21 15:18:57 +02:00
Ronald van der Meer 8e1346fd1f Add dynamic device discovery and stale device removal to Duco integration (#168675) 2026-04-21 15:18:27 +02:00
epenet a2485960d8 Move Tuya listener classes to separate module (#168636) 2026-04-21 15:15:14 +02:00
epenet a06ffe6379 Use runtime_data in abode integration (#168709) 2026-04-21 15:05:49 +02:00
Martin Claesson 966e8aeca4 Add Kiosker binary sensor platform (#168507)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-04-21 14:52:15 +02:00
Abílio Costa d7f666a661 Implement doorbell.rang trigger (#168388) 2026-04-21 14:43:34 +02:00
Thomas Rupprecht 671b3e01ad Allow requesting spaceapi without authentication and with cors headers (#160797) 2026-04-21 14:31:07 +02:00
Erwin Douna a85c82ae24 Add dynamic update interval to Tado (#160723)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Erik Montnemery <erik@montnemery.com>
2026-04-21 14:28:41 +02:00
Denis Shulyaka d9af83a03f Fix telegram_bot.send_message_draft action description (#168212)
Co-authored-by: c0ffeeca7 <38767475+c0ffeeca7@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-04-21 14:54:08 +03:00
Erik Montnemery c489980551 Add duration to more conditions (#168383) 2026-04-21 13:41:53 +02:00
epenet 06400ab688 Use runtime_data in zamg (#168699) 2026-04-21 13:06:14 +02:00
epenet 9d7d56c5bf Use runtime_data in Yardian (#168697) 2026-04-21 13:05:09 +02:00
epenet b1fcc0ebde Use runtime_data in youtube (#168696) 2026-04-21 13:04:49 +02:00
epenet 12af4bd0f4 Use runtime_data in yolink (#168693) 2026-04-21 13:04:19 +02:00
Retha Runolfsson 6bb083ee61 Bump pySwitchbot to 2.1.0 (#168692) 2026-04-21 13:03:47 +02:00
Denis Shulyaka a6f9246c2f Add myself as a codeowner for OpenAI integration (#168705) 2026-04-21 13:01:45 +02:00
epenet 3222472f10 Use runtime_data in youless (#168694)
Co-authored-by: Copilot Autofix powered by AI <175728472+Copilot@users.noreply.github.com>
2026-04-21 12:44:18 +02:00
epenet e620426002 Use runtime_data in yamaha_musiccast (#168691) 2026-04-21 11:33:02 +02:00
Mike Degatano 6e61a60eba refactor(hassio): store aiohasupervisor models directly in hass.data using typed HassKey (#168400)
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
2026-04-21 11:24:07 +02:00
epenet 6942066930 Use runtime_data in wiffi integration (#168687) 2026-04-21 10:58:47 +02:00
Marc Mueller 7c1fd1a237 Update aiousbwatcher to 1.1.2 (#168688) 2026-04-21 10:56:00 +02:00
epenet 3fd77b0d7a Use runtime_data in wilight integration (#168686) 2026-04-21 10:47:53 +02:00
Allen Porter f73f1df5a2 Add Roborock fan speed validation and error handling (#168623) 2026-04-21 10:47:32 +02:00
Florent Thoumie fb89d94957 Add missing data_description strings to iaqualink (#168670) 2026-04-21 10:30:15 +02:00
epenet a9c3854d69 Use runtime_data in whois (#168684) 2026-04-21 10:28:45 +02:00
renovate[bot] ef1a5ea2df Update zizmor (#168666)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-04-21 10:14:26 +02:00
Raphael Hehl 514d5e570a Bump py-unifi-access to version 1.2.0 (#168679) 2026-04-21 10:13:31 +02:00
epenet 9de658b918 Use runtime_data in WeatherKit (#168682) 2026-04-21 09:43:54 +02:00
Franck Nijhof ac4e746977 Add reauthentication flow to Fumis integration (#168645) 2026-04-21 09:32:13 +02:00
Mick Vleeshouwer e10f59c936 Add additional cover fixtures to Overkiz (#168661) 2026-04-21 08:57:28 +02:00
Andres Ruiz fb171809ec Update waterfurnace to 1.7.1 (#168665) 2026-04-21 08:56:45 +02:00
epenet 137122ebb5 Use runtime_data in weatherflow integration (#168622) 2026-04-21 08:55:50 +02:00
epenet 502dc5075d Use runtime_data in weatherflow_cloud integration (#168624)
Co-authored-by: Michael <35783820+mib1185@users.noreply.github.com>
2026-04-21 08:55:29 +02:00
Marc Mueller 42232cfe3f Fix esphome test ResourceWarning (#168181) 2026-04-21 08:55:05 +02:00
epenet 0ae1236acb Use runtime_data in ws66i integration (#168628) 2026-04-21 08:54:49 +02:00
Ariel Ebersberger 63f84af4ff Fix tplink tests for Python 3.14.3 (#168361) 2026-04-21 08:54:21 +02:00
Ronald van der Meer 89fe56c599 Add reconfiguration flow to Duco integration (#168652) 2026-04-21 07:46:50 +02:00
Rene Nulsch 2fb1ed443a Validate directory_path and file_name in telegram_bot.download_file (#168656) 2026-04-21 07:46:43 +02:00
Glenn Vandeuren (aka Iondependent) ea8f82e9ba Bump nhc to 0.8.0 (#168651)
Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: VandeurenGlenn <8685280+VandeurenGlenn@users.noreply.github.com>
2026-04-20 22:09:19 +01:00
puddly 31dc02c3ee Bump universal-silabs-flasher to 1.1.0 (#168647) 2026-04-20 23:02:53 +02:00
Nils Ove Erstad 70ec6fa654 Fix MQTT JSON light restoring None color_mode on startup (#168608)
Co-authored-by: Jan Bouwhuis <jbouwh@users.noreply.github.com>
2026-04-20 21:59:03 +02:00
puddly c2946404ea Bump ZHA to 1.2.1 (#168644) 2026-04-20 15:42:04 -04:00
Abílio Costa f715bcd7c1 Change Claude gh review agent back to skill (#168642) 2026-04-20 20:59:20 +02:00
Manu 0c0e61e133 Remove hunterjm from Xbox integration codeowners (#167024) 2026-04-20 20:58:43 +02:00
Tomer 305761e7de Victron GX: device_tracker platfrom (#168462)
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
2026-04-20 20:54:58 +02:00
puddly 3b81f09765 Bump serialx to 1.4.1 (#168640) 2026-04-20 20:53:51 +02:00
epenet a2cc7d0fca Use runtime_data in watttime integration (#168630) 2026-04-20 20:46:41 +02:00
Ronald van der Meer 038b56e5eb Claim Silver quality scale for Duco integration (#168620) 2026-04-20 19:45:57 +01:00
Franck Nijhof 0edcb8d60f Set parallel updates for PVOutput sensor platform (#168643) 2026-04-20 20:45:11 +02:00
Stefan Agner cc8000ed89 Remove hassio-main panel registration (#168626)
Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-20 19:42:10 +02:00
Mick Vleeshouwer a92dcaaf5f Add first cover entity tests to Overkiz (#165670)
Co-authored-by: Copilot <copilot@github.com>
2026-04-20 18:30:26 +01:00
Joakim Plate e889541d2e Correct state/device class for water in gardena (#168637) 2026-04-20 19:02:29 +02:00
Michael 85e9d3c6a8 Migrate Z-Wave.Me to use runtime_data (#168562) 2026-04-20 18:29:46 +02:00
Robert Resch fe9db39684 Add docker syntax to all Docker files (#168350) 2026-04-20 17:31:04 +02:00
Assaf Akrabi 253d3e1758 Migrate lib to aiorussound for Russound RNET (#168484) 2026-04-20 17:21:45 +02:00
Raphael Hehl dcb5f0d533 Improve UniFi config flow quality scale: config-flow and config-flow-test-coverage (#168477)
Co-authored-by: RaHehl <rahehl@users.noreply.github.com>
2026-04-20 17:16:51 +02:00
epenet d5e4be317c Use runtime_data in wolflink integration (#168625) 2026-04-20 17:09:49 +02:00
albaintor 0ebf4d86f5 Fixed Kodi Media Browsing (#165819) 2026-04-20 17:09:03 +02:00
Klaas Schoute 1a86913239 Merge config flows for powerfox integration (#164019) 2026-04-20 17:07:45 +02:00
Max R f2c010aaaf feat(citybikes): add number of ebikes attribute (#166229) 2026-04-20 17:02:24 +02:00
Erik Montnemery 74de32377e Improve async_get_system_info tests (#168586)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-04-20 16:47:40 +02:00
Franck Nijhof 901925ad54 Add Fumis pellet stove integration (#168515) 2026-04-20 16:25:12 +02:00
Øyvind Matheson Wergeland defbfe17a3 Fix nobo_hub via_device warning (#168595) 2026-04-20 16:25:05 +02:00
Merlin Schumacher 9795f55af3 Remove reference to deprecated state STANDBY from universal media player (#160930)
Co-authored-by: Erik Montnemery <erik@montnemery.com>
2026-04-20 16:10:34 +02:00
johanzander 967c5d2092 Fix KeyError in Growatt server login response handling (#168482)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-20 15:41:38 +02:00
Paulus Schoutsen cdecff9380 Use dedicated power commands for LG infrared (#168488)
Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: balloob <1444314+balloob@users.noreply.github.com>
2026-04-20 15:36:00 +02:00
Robert Resch 59ceb7c58c Revert "Update PyTurboJPEG to 2.2.0" (#168617) 2026-04-20 15:28:17 +02:00
Kurt Chrisford d66b9f4316 Bump actron-neo-api to 0.5.3 (#167732) 2026-04-20 14:58:43 +02:00
Christophe Gagnier 40477ff87b Bump python-technove to 2.1.1 (#168403)
Co-authored-by: Moustachauve <2206577+Moustachauve@users.noreply.github.com>
2026-04-20 14:52:07 +02:00
epenet d96b626497 Use runtime_data in vallox integration (#168604)
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-20 14:41:53 +02:00
epenet 0c294b342c Use runtime_data in verisure integration (#168605)
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-20 14:39:53 +02:00
Marc Mueller 1f64ca4a8d Update pydantic to 2.13.2 (#168601)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>
2026-04-20 14:36:07 +02:00
epenet 79ae0e6c49 Use runtime_data in venstar integration (#168613) 2026-04-20 14:32:13 +02:00
epenet dc0052552a Use runtime_data in vera integration (#168614) 2026-04-20 14:31:22 +02:00
epenet 77f4baa79e Use runtime_data in volumio integration (#168616) 2026-04-20 14:30:34 +02:00
Matthias Alphart 52377b958b Update knx-frontend to 2026.4.19.175239 (#168568) 2026-04-20 14:28:46 +02:00
Denis Shulyaka 09105693c7 Filter OpenAI schema (#168543) 2026-04-20 14:28:13 +02:00
epenet db838f67d7 Move vallox service registration to services.py (#168612)
Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-20 14:26:57 +02:00
renovate[bot] 720fd6d802 Update ruff (#168240)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-20 14:25:43 +02:00
598 changed files with 70426 additions and 3268 deletions
@@ -1,7 +1,6 @@
 ---
 name: github-pr-reviewer
-description: Reviews GitHub pull requests and provides feedback comments.
-disallowedTools: Write, Edit
+description: Reviews GitHub pull requests and provides feedback comments. This is the top skill to use for reviewing Pull Requests from GitHub.
 ---
 
 # Review GitHub Pull Request
@@ -23,3 +23,4 @@ requirements_all.txt linguist-generated=true
 requirements_test_all.txt linguist-generated=true
 requirements_test_pre_commit.txt linguist-generated=true
 script/hassfest/docker/Dockerfile linguist-generated=true
+.github/workflows/*.lock.yml linguist-generated=true merge=ours
@@ -6,6 +6,7 @@
     "pep621",
     "pip_requirements",
     "pre-commit",
+    "regex",
     "homeassistant-manifest"
   ],
@@ -26,6 +27,16 @@
     ]
   },
+  "regexManagers": [
+    {
+      "description": "Update ruff required-version in pyproject.toml",
+      "managerFilePatterns": ["/^pyproject\\.toml$/"],
+      "matchStrings": ["required-version = \">=(?<currentValue>[\\d.]+)\""],
+      "depNameTemplate": "ruff",
+      "datasourceTemplate": "pypi"
+    }
+  ],
   "minimumReleaseAge": "7 days",
   "prConcurrentLimit": 10,
   "prHourlyLimit": 2,
File diff suppressed because it is too large
@@ -0,0 +1,402 @@
---
on:
pull_request:
types: [opened, synchronize, reopened]
paths:
- "requirements*.txt"
- "homeassistant/package_constraints.txt"
- "pyproject.toml"
forks: ["*"]
workflow_dispatch:
inputs:
pull_request_number:
description: "Pull request number to (re-)check"
required: true
type: number
permissions:
contents: read
pull-requests: read
issues: read
network:
allowed:
- python
tools:
web-fetch: {}
github:
toolsets: [default]
safe-outputs:
add-comment:
max: 1
description: >
Checks changed Python package requirements on PRs targeting the core repo
(including fork PRs): verifies licenses match PyPI metadata, source
repositories are publicly accessible, PyPI releases were uploaded via
automated CI (Trusted Publisher attestation), the package's release pipeline
uses OIDC or equivalent automated credentials (not static tokens), and the PR
description contains the required links.
---
# Requirements License and Availability Check
You are a code review assistant for the Home Assistant project. Your job is to
review changes to Python package requirements and verify they meet the project's
standards.
## Context
- Home Assistant uses `requirements_all.txt` (all integration packages),
`requirements.txt` (core packages), `requirements_test.txt` (test
dependencies), and `requirements_test_all.txt` (all test dependencies) to
declare Python dependencies.
- Each integration lists its packages in `homeassistant/components/<name>/manifest.json`
under the `requirements` field.
- Allowed licenses are maintained in `script/licenses.py` under
`OSI_APPROVED_LICENSES_SPDX` (SPDX identifiers) and `OSI_APPROVED_LICENSES`
(classifier strings).
## Step 1 — Identify Changed Packages
Use the GitHub tool to fetch the PR diff. Look for lines that were added (`+`)
or removed (`-`) in **all** of these files:
- `requirements.txt`
- `requirements_all.txt`
- `requirements_test.txt`
- `requirements_test_all.txt`
- `homeassistant/package_constraints.txt`
- `pyproject.toml`
For each changed line that contains a package pin (e.g. `SomePackage==1.2.3`),
classify it as:
- **New package**: the package name appears only in `+` lines, with no
corresponding `-` line for the same package name.
- **Version bump**: the same package name appears in both `+` lines (new
version) and `-` lines (old version), with different version numbers.
Record the **old version** and **new version** for every version bump — you
will need these values in Step 4.
Ignore comment lines (starting with `#`), lines that start with `-r ` (file
includes), and lines that don't contain `==`.
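The parsing and classification rules above can be sketched in Python. This is an illustrative sketch only, not part of the workflow; the function name `classify_changes` and the simplified pin handling (exact `==` pins, environment markers stripped) are assumptions:

```python
def classify_changes(diff_lines):
    """Classify requirement pins from unified-diff lines.

    Returns (new, bumped): `new` maps package -> version for brand-new pins,
    `bumped` maps package -> (old_version, new_version).
    """
    added, removed = {}, {}
    for line in diff_lines:
        # Only +/- hunk lines; skip the +++/--- file headers.
        if not line or line[0] not in "+-" or line.startswith(("+++", "---")):
            continue
        body = line[1:].strip()
        # Skip comments, file includes, and lines without an exact pin.
        if body.startswith("#") or body.startswith("-r ") or "==" not in body:
            continue
        name, _, ver = body.partition("==")
        name = name.strip().lower()
        ver = ver.split(";")[0].split("#")[0].strip()
        (added if line[0] == "+" else removed)[name] = ver
    new = {p: v for p, v in added.items() if p not in removed}
    bumped = {p: (removed[p], v) for p, v in added.items()
              if p in removed and removed[p] != v}
    return new, bumped
```

A pin appearing only in `+` lines lands in `new`; a pin appearing in both `+` and `-` lines with differing versions lands in `bumped` with the (old, new) pair that Step 4 needs.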
## Step 2 — Check License via PyPI
For each new or bumped package:
1. Fetch `https://pypi.org/pypi/{package_name}/json` (use the exact
package name as it appears on PyPI).
2. From the JSON response, extract:
- `info.license` — free-text license field
- `info.license_expression` — SPDX expression (if present)
- `info.classifiers` — filter for entries starting with `"License ::"`.
3. Determine if the license is in the approved list from `script/licenses.py`:
- SPDX identifiers: compare against `OSI_APPROVED_LICENSES_SPDX`
- Classifier strings: compare against `OSI_APPROVED_LICENSES`
4. Flag a package as ❌ if the license is unknown, missing, or not in the
approved list. Flag as ⚠️ if the license information is ambiguous or cannot
be definitively determined.
## Step 2b — Verify PyPI Release Was Uploaded by CI
For each new or bumped package, verify that the release on PyPI was published
automatically by a CI pipeline (via OIDC Trusted Publisher), not uploaded
manually.
1. Fetch the PyPI JSON for the specific version being introduced or bumped:
`https://pypi.org/pypi/{package_name}/{version}/json`
2. Inspect the `urls` array in the response. For each distribution file (wheel
or sdist), note the filename.
3. For each filename, attempt to fetch the PyPI provenance attestation:
`https://pypi.org/integrity/{package_name}/{version}/{filename}/provenance`
- If the response is HTTP 200 and contains a valid attestation object,
inspect `attestation_bundles[*].publisher`. A Trusted Publisher attestation
will have a `kind` identifying the CI system (e.g. `"GitHub Actions"`,
`"GitLab"`) and a `repository` or `project` field matching the source
repository.
- If at least one distribution file has a valid Trusted Publisher attestation,
mark ✅ CI-uploaded.
- If no attestation is found for any file (404 for all), mark ❌ — "Release
has no provenance attestation; it may have been uploaded manually".
- If an attestation exists but the `publisher` does not identify a recognized
CI system or Trusted Publisher, mark ⚠️ — "Attestation present but
publisher cannot be verified as automated CI".
Note: if PyPI returns an error fetching the per-version JSON, fall back to the
latest JSON (`https://pypi.org/pypi/{package_name}/json`) and look up the
specific version in the `releases` dict.
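The ✅/⚠️/❌ decision for Step 2b can be sketched as a classifier over the integrity endpoint's response. The shape of `attestation_bundles[*].publisher` follows the description above and may differ in detail from the live API; the `trusted_kinds` list is an assumption:

```python
def provenance_verdict(attestation_json, trusted_kinds=("GitHub", "GitLab")):
    """Classify a PyPI /integrity provenance response.

    Pass None when the endpoint returned HTTP 404 (no attestation).
    """
    if attestation_json is None:
        return "fail"   # no provenance attestation: possibly a manual upload
    for bundle in attestation_json.get("attestation_bundles", []):
        kind = (bundle.get("publisher") or {}).get("kind", "")
        # A recognized CI system in the publisher kind => Trusted Publisher.
        if any(k.lower() in kind.lower() for k in trusted_kinds):
            return "ok"
    return "warn"       # attestation present, publisher unrecognized
```

In the workflow's terms: `"ok"` is ✅ CI-uploaded, `"fail"` is the ❌ "no provenance attestation" case, and `"warn"` is the ⚠️ "publisher cannot be verified" case.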
## Step 3 — Check Repository Availability
For each new or bumped package:
1. From the PyPI JSON at `info.project_urls`, find the source repository URL
(keys such as `"Source"`, `"Homepage"`, `"Repository"`, or `"Source Code"`).
2. Use web-fetch to perform a GET request to the repository URL.
3. If the response returns HTTP 200 and the page is publicly accessible, mark ✅.
4. If the URL is missing, returns a non-200 status, or redirects to a login
page, mark ❌ with a note that the repository could not be verified as public.
## Step 4 — Check PR Description
Read the PR body from the GitHub API using the PR number `${{ github.event.pull_request.number }}`.
Extract all URLs present in the PR body.
### 4a — New packages: repository link required
For **new packages** (brand-new dependency not previously in any requirements
file): the PR description must contain a link that points to the package's
**source repository** as identified in Step 3 (the URL recorded from
`info.project_urls`). A PyPI page link alone is **not** acceptable — the link
must point directly to the source repository (e.g. a GitHub or GitLab URL).
- If a URL in the PR body matches (or is a sub-path of) the source repository
URL identified via PyPI, mark ✅.
- If the PR body contains a source repository URL that does **not** match the
repository URL found in the package's PyPI metadata (`info.project_urls`),
mark ❌ — "PR description links to `<pr_url>` but PyPI reports the source
repository as `<pypi_repo_url>`; please use the correct repository URL."
- If no source repository URL is present in the PR body at all, mark ❌ —
"PR description must link to the source repository at `<repo_url>` (found
via PyPI). A PyPI page link is not sufficient."
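The "matches or is a sub-path of" test in 4a is easy to get wrong with plain substring checks (e.g. `somepkg` vs `somepkg-extra`). A hedged sketch of a stricter comparison, using only the standard library:

```python
from urllib.parse import urlparse

def links_to_repo(pr_body_urls, repo_url):
    """True if any PR-body URL equals, or is a sub-path of, the repo URL."""
    repo = urlparse(repo_url)
    repo_path = repo.path.rstrip("/").lower()
    for url in pr_body_urls:
        u = urlparse(url)
        if u.netloc.lower() != repo.netloc.lower():
            continue
        path = u.path.rstrip("/").lower()
        # Exact match, or a true sub-path ("/owner/repo/releases"), but
        # never a sibling repo ("/owner/repo-extra").
        if path == repo_path or path.startswith(repo_path + "/"):
            return True
    return False
```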
### 4b — Version bumps: changelog or diff link required
For **version bumps**: the PR description must contain a link to a changelog,
release notes page, or a diff/comparison URL that references the **correct
versions** being bumped (old → new).
Checks to perform for each bumped package (old version = X, new version = Y):
1. Extract all URLs from the PR body that contain the repository's domain or
path (as identified in Step 3).
2. Verify that at least one such URL includes both the old version string and
new version string in some form — e.g. a GitHub compare URL like
`compare/vX...vY`, a releases URL mentioning version Y, or a
`CHANGELOG.md` anchor referencing Y.
3. If no URL matches, check if the PR body contains any changelog/diff link at
all for this package.
Outcome:
- ✅ — a URL pointing to the correct repo with version references covering the
exact bump (X → Y).
- ⚠️ — a changelog/diff link exists but does not clearly reference the correct
versions or the correct repository; explain what was found and what is
expected.
- ❌ — no changelog or diff link found at all in the PR description for this
package.
### 4c — Diff consistency check
For each **version bump**, verify that the version change recorded in the diff
(Step 1) is internally consistent:
- The `-` line must contain the old version and the `+` line must contain the
new version for the same package name.
- Flag ❌ if the diff shows a downgrade (new version < old version) without an
explanation, or if the version strings cannot be parsed.
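Detecting a downgrade requires numeric comparison, since lexicographic string comparison misorders versions like `1.10.0` vs `1.2.3`. A minimal sketch covering only dotted numeric releases (the real workflow would want full PEP 440 handling):

```python
def parse_version(v):
    """Tiny PEP 440 subset: dotted numeric releases only; None if unparseable."""
    try:
        return tuple(int(p) for p in v.split("."))
    except ValueError:
        return None

def bump_verdict(old, new):
    """'fail' on a downgrade or unparseable versions, else 'ok'."""
    o, n = parse_version(old), parse_version(new)
    if o is None or n is None:
        return "fail"   # version strings cannot be parsed
    if n < o:
        return "fail"   # downgrade without explanation
    return "ok"
```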
## Step 5 — Verify Source Repository is Publicly Accessible
Before inspecting the release pipeline, confirm that the source repository
identified in Step 3 is publicly reachable.
For each new or bumped package:
1. Use the source repository URL recorded in Step 3.
2. If no repository URL was found in `info.project_urls`, mark ❌ — "No source
repository URL found in PyPI metadata; a public source repository is
required."
3. If a repository URL was found, perform a GET request to that URL (using
web-fetch). If the response is HTTP 200 and returns a publicly accessible
page (not a login redirect or error page), mark ✅.
4. If the response is non-200, the URL redirects to a login/authentication page,
or the repository appears private or unavailable, mark ❌ — "Source
repository at `<repo_url>` is not publicly accessible. Home Assistant
requires all dependencies to have publicly available source code." **Do not
proceed with the release pipeline check (Step 6) for this package.**
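The pass/fail decision in steps 3 and 4 can be separated from the fetch itself (which is done with web-fetch). A sketch of the classifier; the login-path markers are an illustrative heuristic:

```python
def classify_repo_response(status: int, final_url: str) -> str:
    """Classify the result of fetching a source repository URL.

    `status` is the HTTP status after following redirects and `final_url`
    is where the request ended up, so login redirects can be detected.
    """
    login_markers = ("/login", "/signin", "/users/sign_in", "/sessions")
    if status != 200:
        return "fail"  # non-200: private, deleted, or unavailable
    if any(marker in final_url.lower() for marker in login_markers):
        return "fail"  # redirected to an authentication page
    return "pass"
```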
## Step 6 — Check Release Pipeline Sanity
For each new or bumped package, determine the source repository host from the
URL identified in Step 3, then inspect whether the project's release/publish CI
workflow is sane. The checks differ by hosting provider.
### GitHub repositories (`github.com`)
1. Using the GitHub API, list the workflows in the source repository:
`GET /repos/{owner}/{repo}/actions/workflows`
2. Identify any workflow whose name or filename suggests publishing to PyPI
(e.g., contains "release", "publish", "pypi", or "deploy").
3. Fetch the workflow file content and check the following:
a. **Trigger sanity**: The publish job should be triggered by `push` to tags,
`release: published`, or `workflow_run` on a release job — **not** solely
by `workflow_dispatch` with no additional guards. A `workflow_dispatch`
trigger alongside other triggers is acceptable. Mark ❌ if the only trigger
is manual `workflow_dispatch` with no environment protection rules.
   b. **OIDC / Trusted Publisher**: The workflow should use OIDC-based publishing.
      Look for the `id-token: write` permission together with one of:
      - the `pypa/gh-action-pypi-publish` action
      - the `actions/attest-build-provenance` action
      Conversely, flag any step that sets `TWINE_PASSWORD` from
      `secrets.PYPI_TOKEN` (or another static secret) directly; that indicates
      a long-lived API token is used instead of OIDC.
      Mark ✅ if OIDC is used, ⚠️ if the publish method cannot be determined,
      ❌ if a static secret token is the only credential.
   c. **No manual upload bypass**: Verify there is no step that calls
      `twine upload` (or an equivalent manual upload command) outside of a
      properly gated job (e.g., one that requires an environment approval).
      Flag ⚠️ if such steps exist.
4. If no publish workflow is found in the repository, mark ⚠️ — "No publish
workflow found; it is unclear how this package is released to PyPI."
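The trigger-sanity rule in step 3a can be sketched over a workflow file that has already been parsed from YAML. The helper name is illustrative; note that YAML 1.1 loaders parse a bare `on:` key as the boolean `True`, which the sketch normalizes:

```python
def check_publish_triggers(workflow: dict) -> str:
    """Apply the step 3a trigger-sanity rule to a parsed workflow."""
    triggers = workflow.get("on", workflow.get(True, {}))  # YAML 1.1 quirk
    if isinstance(triggers, str):
        triggers = {triggers: None}
    elif isinstance(triggers, list):
        triggers = {name: None for name in triggers}
    automated = {"push", "release", "workflow_run"}
    if automated & triggers.keys():
        return "pass"  # at least one automated trigger is present
    if set(triggers) == {"workflow_dispatch"}:
        return "fail"  # manual-only trigger, no additional guard
    return "warn"  # trigger setup could not be determined
```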
### GitLab repositories (`gitlab.com` or self-hosted GitLab)
1. Use the GitLab REST API to list CI/CD pipeline configuration files. First
resolve the project ID via
`GET https://gitlab.com/api/v4/projects/{url-encoded-namespace-and-name}`
and note the `id` field.
2. Fetch the repository's `.gitlab-ci.yml` (and any included files) using
`GET https://gitlab.com/api/v4/projects/{id}/repository/files/.gitlab-ci.yml/raw?ref=HEAD`
(use web-fetch for public repos).
3. Identify any job whose name or `stage` suggests publishing to PyPI
(e.g., "publish", "deploy", "release", "pypi").
4. For each such job, check:
a. **Trigger sanity**: The job should run only on tag pipelines (`only: tags`
or `rules: - if: $CI_COMMIT_TAG`) or on protected branches — **not**
solely on manual triggers (`when: manual`) with no additional protection.
Mark ❌ if the only trigger is manual with no environment or protected-branch
guard.
   b. **Automated credentials**: The job should use GitLab's OIDC ID token
      (`id_tokens:` block) with PyPI Trusted Publishing, or reference a
      `$PYPI_TOKEN` variable injected from a protected GitLab CI/CD variable
      (flag ❌ if the token is hard-coded or the variable is unprotected).
      Mark ✅ if OIDC or protected CI variables are used, ⚠️ if the method
      cannot be determined, ❌ if credentials appear to be insecure.
c. **No manual upload bypass**: Flag ⚠️ if any job calls `twine upload`
without being behind a protected-variable or environment guard.
5. If no publish job is found, mark ⚠️ — "No publish job found in .gitlab-ci.yml;
it is unclear how this package is released to PyPI."
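The step 4a rule for GitLab jobs, sketched over a job definition parsed from `.gitlab-ci.yml`. The helper name and the exact rule matching are illustrative; real `rules:` entries can combine several conditions:

```python
def check_gitlab_publish_job(job: dict) -> str:
    """Apply the step 4a trigger-sanity rule to a parsed publish job."""
    only = job.get("only") or []
    if only == "tags" or "tags" in only:
        return "pass"  # classic tag-only pipeline
    rules = job.get("rules") or []
    if any(
        "$CI_COMMIT_TAG" in str(rule.get("if", ""))
        for rule in rules
        if isinstance(rule, dict)
    ):
        return "pass"  # rules-based tag pipeline
    if job.get("when") == "manual":
        return "fail"  # manual-only, no tag or protected-branch guard
    return "warn"  # trigger could not be determined
```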
### Other code hosting providers
For repositories hosted on platforms other than GitHub or GitLab (e.g.,
Bitbucket, Codeberg, Gitea, Sourcehut):
1. Use web-fetch to retrieve the repository's root page and look for any
publicly visible CI configuration files (e.g., `.circleci/config.yml`,
`Jenkinsfile`, `azure-pipelines.yml`, `bitbucket-pipelines.yml`,
`.builds/*.yml` for Sourcehut).
2. Apply the same conceptual checks as above:
- Does publishing run on automated triggers (tags/releases), not solely
manual ones?
- Are credentials injected by the CI system (not hard-coded)?
- Is there a `twine upload` or equivalent step that could be run manually?
3. If no CI configuration can be retrieved, mark ⚠️ — "Release pipeline could
not be inspected; hosting provider is not GitHub or GitLab."
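For Gitea/Codeberg-style hosts, the probe in step 1 amounts to building candidate raw-file URLs and fetching each one. A sketch; the `/raw/branch/` layout is a Gitea convention and an assumption here, other hosts need their own path mapping, and Sourcehut's `.builds/*.yml` must be enumerated via the repository tree instead:

```python
CI_CONFIG_PATHS = (
    ".circleci/config.yml",
    "Jenkinsfile",
    "azure-pipelines.yml",
    "bitbucket-pipelines.yml",
)


def candidate_ci_urls(repo_url: str, branch: str = "main") -> list[str]:
    """Raw-file URLs to probe for publicly visible CI configuration."""
    base = repo_url.rstrip("/")
    return [f"{base}/raw/branch/{branch}/{path}" for path in CI_CONFIG_PATHS]
```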
## Step 7 — Post a Review Comment
**Always** post a review comment using `add-comment`, regardless of whether
packages pass or fail. Use the following structure:
> **Note on deduplication**: The workflow automatically updates any previous
> requirements-check comment on the PR in place (preserving its position in the
> thread). If no previous comment exists, the newly created comment is kept as-is.
> You do not need to search for or update previous comments yourself.
### Comment structure
Begin every comment with the HTML marker `<!-- requirements-check -->` on its
own line (this is used by the workflow to find the previous comment and update
it on the next run).
### 7a — Overall summary line
Immediately after the `<!-- requirements-check -->` marker, add a single
summary line before anything else:
- If everything passed: `All requirements checks passed. ✅`
- If there are failures or warnings: `⚠️ Some checks require attention — see the details below.`
### 7b — Summary table
Render a compact table where every check column contains **only the status
icon** (✅, ⚠️, or ❌). No explanatory text belongs inside the table cells —
all detail goes in the per-package sections below.
Use `—` (em dash) when a check was skipped (e.g. Release Pipeline is skipped
when the repository is not publicly accessible).
```
<!-- requirements-check -->
## Requirements Check
| Package | Type | Old→New | License | Repo Public | CI Upload | Release Pipeline | PR Link | Diff Consistent |
|---------|------|---------|---------|-------------|-----------|------------------|---------|-----------------|
| PackageA | bump | 1.2.3→1.3.0 | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| PackageB | new | —→4.5.6 | ❌ | ✅ | ❌ | ⚠️ | ❌ | ✅ |
| PackageC | bump | 2.0.0→2.1.0 | ✅ | ❌ | — | — | ⚠️ | ✅ |
```
### 7c — Per-package detail sections
After the table, add one collapsible `<details>` block per package.
- If **all checks passed** for that package, render the block **collapsed**
(no `open` attribute) so the comment stays concise.
- If **any check failed or produced a warning**, render the block **open**
(`<details open>`) so the contributor sees the issues immediately.
Each block must include the full detail for every check: the license found, the
repository URL, whether a provenance attestation was found, the release
pipeline findings, the PR link found (or missing), and whether the diff is
consistent. For failed or warned checks, explain exactly what the contributor
must fix, including the expected source repository URL, expected version range,
etc.
Template (repeat for each package):
```
<details open>
<summary><strong>PackageB 📦 new —→4.5.6</strong></summary>
- **License**: ❌ License is `UNKNOWN` — not in the approved list. Check PyPI metadata and `script/licenses.py`.
- **Repository Public**: ✅ https://github.com/example/packageb is publicly accessible.
- **CI Upload**: ❌ No provenance attestation found for any distribution file. The release may have been uploaded manually.
- **Release Pipeline**: ⚠️ No publish workflow found in the repository; it is unclear how this package is released to PyPI.
- **PR Link**: ❌ PR description must link to the source repository at https://github.com/example/packageb (a PyPI page link is not sufficient).
- **Diff Consistent**: ✅
</details>
```
Collapsed example (all checks passed):
```
<details>
<summary><strong>PackageA 📦 bump 1.2.3→1.3.0</strong></summary>
- **License**: ✅ MIT
- **Repository Public**: ✅ https://github.com/example/packagea
- **CI Upload**: ✅ Trusted Publisher attestation found (GitHub Actions).
- **Release Pipeline**: ✅ OIDC via `pypa/gh-action-pypi-publish`; triggered on `release: published`; `environment: release` gate.
- **PR Link**: ✅ https://github.com/example/packagea/compare/v1.2.3...v1.3.0
- **Diff Consistent**: ✅
</details>
```
## Notes
- Be constructive and helpful. Provide direct links where possible so the
contributor can quickly fix the issue.
- If PyPI returns an error for a package, mention that it could not be found and
suggest the contributor verify the package name.
- For packages that only appear in `homeassistant/package_constraints.txt` or
`pyproject.toml` without being tied to a specific integration, the PR
description link requirement still applies.
- When checking test-only packages (from `requirements_test.txt` or
`requirements_test_all.txt`), apply the same license, repository, and PR
description checks as for production dependencies.
- A package that appears in both a production file and a test file should only
be reported once; use the production file entry as the canonical one.
- This workflow is only triggered when a commit actually changes one of the
tracked requirements files (for `synchronize` events GitHub compares the
before/after SHAs of the push, not the entire PR diff). Members can manually
retrigger the workflow via `workflow_dispatch` with the PR number to re-run
the check after updating the PR description or fixing issues without changing
any requirements files. On a retrigger the existing comment is updated in
place so there is always exactly one requirements-check comment in the PR.
+2 -2
@@ -28,11 +28,11 @@ jobs:
persist-credentials: false
- name: Initialize CodeQL
uses: github/codeql-action/init@c10b8064de6f491fea524254123dbe5e09572f13 # v4.35.1
uses: github/codeql-action/init@95e58e9a2cdfd71adc6e0353d5c52f41a045d225 # v4.35.2
with:
languages: python
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@c10b8064de6f491fea524254123dbe5e09572f13 # v4.35.1
uses: github/codeql-action/analyze@95e58e9a2cdfd71adc6e0353d5c52f41a045d225 # v4.35.2
with:
category: "/language:python"
+4 -2
@@ -1,6 +1,6 @@
repos:
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.15.1
rev: v0.15.10
hooks:
- id: ruff-check
args:
@@ -18,11 +18,12 @@ repos:
exclude_types: [csv, json, html]
exclude: ^tests/fixtures/|homeassistant/generated/|tests/components/.*/snapshots/
- repo: https://github.com/zizmorcore/zizmor-pre-commit
rev: v1.24.0
rev: v1.24.1
hooks:
- id: zizmor
args:
- --pedantic
exclude: ^\.github/workflows/check-requirements\.lock\.yml$
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v6.0.0
hooks:
@@ -46,6 +47,7 @@ repos:
additional_dependencies:
- prettier@3.6.2
- prettier-plugin-sort-json@4.2.0
exclude: ^\.github/workflows/check-requirements\.lock\.yml$
- repo: https://github.com/cdce8p/python-typing-update
rev: v0.6.0
hooks:
+2
@@ -46,6 +46,7 @@ homeassistant.components.accuweather.*
homeassistant.components.acer_projector.*
homeassistant.components.acmeda.*
homeassistant.components.actiontec.*
homeassistant.components.actron_air.*
homeassistant.components.adax.*
homeassistant.components.adguard.*
homeassistant.components.aftership.*
@@ -223,6 +224,7 @@ homeassistant.components.fronius.*
homeassistant.components.frontend.*
homeassistant.components.fujitsu_fglair.*
homeassistant.components.fully_kiosk.*
homeassistant.components.fumis.*
homeassistant.components.fyta.*
homeassistant.components.generic_hygrostat.*
homeassistant.components.generic_thermostat.*
+1
@@ -1,5 +1,6 @@
ignore: |
tests/fixtures/core/config/yaml_errors/
.github/workflows/check-requirements.lock.yml
rules:
braces:
level: error
+8 -2
@@ -400,6 +400,8 @@ CLAUDE.md @home-assistant/core
/tests/components/dnsip/ @gjohansson-ST
/homeassistant/components/door/ @home-assistant/core
/tests/components/door/ @home-assistant/core
/homeassistant/components/doorbell/ @home-assistant/core
/tests/components/doorbell/ @home-assistant/core
/homeassistant/components/doorbird/ @oblogic7 @bdraco @flacjacket
/tests/components/doorbird/ @oblogic7 @bdraco @flacjacket
/homeassistant/components/dormakaba_dkey/ @emontnemery
@@ -592,6 +594,8 @@ CLAUDE.md @home-assistant/core
/tests/components/fujitsu_fglair/ @crevetor
/homeassistant/components/fully_kiosk/ @cgarwood
/tests/components/fully_kiosk/ @cgarwood
/homeassistant/components/fumis/ @frenck
/tests/components/fumis/ @frenck
/homeassistant/components/fyta/ @dontinelli
/tests/components/fyta/ @dontinelli
/homeassistant/components/garage_door/ @home-assistant/core
@@ -1251,6 +1255,8 @@ CLAUDE.md @home-assistant/core
/tests/components/open_meteo/ @frenck
/homeassistant/components/open_router/ @joostlek @ab3lson
/tests/components/open_router/ @joostlek @ab3lson
/homeassistant/components/openai_conversation/ @Shulyaka
/tests/components/openai_conversation/ @Shulyaka
/homeassistant/components/opendisplay/ @g4bri3lDev
/tests/components/opendisplay/ @g4bri3lDev
/homeassistant/components/openerz/ @misialq
@@ -1989,8 +1995,8 @@ CLAUDE.md @home-assistant/core
/tests/components/wsdot/ @ucodery
/homeassistant/components/wyoming/ @synesthesiam
/tests/components/wyoming/ @synesthesiam
/homeassistant/components/xbox/ @hunterjm @tr4nt0r
/tests/components/xbox/ @hunterjm @tr4nt0r
/homeassistant/components/xbox/ @tr4nt0r
/tests/components/xbox/ @tr4nt0r
/homeassistant/components/xiaomi_aqara/ @danielhiversen @syssi
/tests/components/xiaomi_aqara/ @danielhiversen @syssi
/homeassistant/components/xiaomi_ble/ @Jc2k @Ernst79
+1
@@ -1,3 +1,4 @@
# syntax=docker/dockerfile@sha256:2780b5c3bab67f1f76c781860de469442999ed1a0d7992a5efdf2cffc0e3d769
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
+1
@@ -1,3 +1,4 @@
# syntax=docker/dockerfile@sha256:2780b5c3bab67f1f76c781860de469442999ed1a0d7992a5efdf2cffc0e3d769
FROM mcr.microsoft.com/vscode/devcontainers/base:debian
SHELL ["/bin/bash", "-o", "pipefail", "-c"]
+26 -19
@@ -30,7 +30,7 @@ from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.typing import ConfigType
from .const import CONF_POLLING, DOMAIN, DOMAIN_DATA, LOGGER
from .const import CONF_POLLING, DOMAIN, LOGGER
from .services import async_setup_services
ATTR_DEVICE_NAME = "device_name"
@@ -67,13 +67,16 @@ class AbodeSystem:
logout_listener: CALLBACK_TYPE | None = None
type AbodeConfigEntry = ConfigEntry[AbodeSystem]
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the Abode component."""
async_setup_services(hass)
return True
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_setup_entry(hass: HomeAssistant, entry: AbodeConfigEntry) -> bool:
"""Set up Abode integration from a config entry."""
username = entry.data[CONF_USERNAME]
password = entry.data[CONF_PASSWORD]
@@ -99,50 +102,54 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
except (AbodeException, ConnectTimeout, HTTPError) as ex:
raise ConfigEntryNotReady(f"Unable to connect to Abode: {ex}") from ex
hass.data[DOMAIN_DATA] = AbodeSystem(abode, polling)
entry.runtime_data = AbodeSystem(abode, polling)
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
await setup_hass_events(hass)
await hass.async_add_executor_job(setup_abode_events, hass)
await setup_hass_events(hass, entry)
await hass.async_add_executor_job(setup_abode_events, hass, entry)
return True
async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
def _shutdown_client(abode: Abode) -> None:
"""Shutdown client."""
abode.events.stop()
abode.logout()
async def async_unload_entry(hass: HomeAssistant, entry: AbodeConfigEntry) -> bool:
"""Unload a config entry."""
unload_ok = await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
await hass.async_add_executor_job(hass.data[DOMAIN_DATA].abode.events.stop)
await hass.async_add_executor_job(hass.data[DOMAIN_DATA].abode.logout)
await hass.async_add_executor_job(_shutdown_client, entry.runtime_data.abode)
if logout_listener := hass.data[DOMAIN_DATA].logout_listener:
if logout_listener := entry.runtime_data.logout_listener:
logout_listener()
hass.data.pop(DOMAIN_DATA)
return unload_ok
async def setup_hass_events(hass: HomeAssistant) -> None:
async def setup_hass_events(hass: HomeAssistant, entry: AbodeConfigEntry) -> None:
"""Home Assistant start and stop callbacks."""
def logout(event: Event) -> None:
"""Logout of Abode."""
if not hass.data[DOMAIN_DATA].polling:
hass.data[DOMAIN_DATA].abode.events.stop()
if not entry.runtime_data.polling:
entry.runtime_data.abode.events.stop()
hass.data[DOMAIN_DATA].abode.logout()
entry.runtime_data.abode.logout()
LOGGER.info("Logged out of Abode")
if not hass.data[DOMAIN_DATA].polling:
await hass.async_add_executor_job(hass.data[DOMAIN_DATA].abode.events.start)
if not entry.runtime_data.polling:
await hass.async_add_executor_job(entry.runtime_data.abode.events.start)
hass.data[DOMAIN_DATA].logout_listener = hass.bus.async_listen_once(
entry.runtime_data.logout_listener = hass.bus.async_listen_once(
EVENT_HOMEASSISTANT_STOP, logout
)
def setup_abode_events(hass: HomeAssistant) -> None:
def setup_abode_events(hass: HomeAssistant, entry: AbodeConfigEntry) -> None:
"""Event callbacks."""
def event_callback(event: str, event_json: dict[str, str]) -> None:
@@ -179,6 +186,6 @@ def setup_abode_events(hass: HomeAssistant) -> None:
]
for event in events:
hass.data[DOMAIN_DATA].abode.events.add_event_callback(
entry.runtime_data.abode.events.add_event_callback(
event, partial(event_callback, event)
)
@@ -9,21 +9,20 @@ from homeassistant.components.alarm_control_panel import (
AlarmControlPanelEntityFeature,
AlarmControlPanelState,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN_DATA
from . import AbodeConfigEntry
from .entity import AbodeDevice
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: AbodeConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Abode alarm control panel device."""
data = hass.data[DOMAIN_DATA]
data = entry.runtime_data
async_add_entities(
[AbodeAlarm(data, await hass.async_add_executor_job(data.abode.get_alarm))]
)
@@ -10,22 +10,21 @@ from homeassistant.components.binary_sensor import (
BinarySensorDeviceClass,
BinarySensorEntity,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.util.enum import try_parse_enum
from .const import DOMAIN_DATA
from . import AbodeConfigEntry
from .entity import AbodeDevice
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: AbodeConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Abode binary sensor devices."""
data = hass.data[DOMAIN_DATA]
data = entry.runtime_data
device_types = [
"connectivity",
+4 -5
@@ -12,14 +12,13 @@ import requests
from requests.models import Response
from homeassistant.components.camera import Camera
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import Event, HomeAssistant
from homeassistant.helpers.dispatcher import async_dispatcher_connect
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.util import Throttle
from . import AbodeSystem
from .const import DOMAIN_DATA, LOGGER
from . import AbodeConfigEntry, AbodeSystem
from .const import LOGGER
from .entity import AbodeDevice
MIN_TIME_BETWEEN_UPDATES = timedelta(seconds=90)
@@ -27,11 +26,11 @@ MIN_TIME_BETWEEN_UPDATES = timedelta(seconds=90)
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: AbodeConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Abode camera devices."""
data = hass.data[DOMAIN_DATA]
data = entry.runtime_data
async_add_entities(
AbodeCamera(data, device, timeline.CAPTURE_IMAGE)
-7
@@ -3,17 +3,10 @@
from __future__ import annotations
import logging
from typing import TYPE_CHECKING
from homeassistant.util.hass_dict import HassKey
if TYPE_CHECKING:
from . import AbodeSystem
LOGGER = logging.getLogger(__package__)
DOMAIN = "abode"
DOMAIN_DATA: HassKey[AbodeSystem] = HassKey(DOMAIN)
ATTRIBUTION = "Data provided by goabode.com"
CONF_POLLING = "polling"
+3 -4
@@ -5,21 +5,20 @@ from typing import Any
from jaraco.abode.devices.cover import Cover
from homeassistant.components.cover import CoverEntity
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN_DATA
from . import AbodeConfigEntry
from .entity import AbodeDevice
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: AbodeConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Abode cover devices."""
data = hass.data[DOMAIN_DATA]
data = entry.runtime_data
async_add_entities(
AbodeCover(data, device)
+2 -2
@@ -7,7 +7,7 @@ from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity import Entity
from . import AbodeSystem
from .const import ATTRIBUTION, DOMAIN, DOMAIN_DATA
from .const import ATTRIBUTION, DOMAIN
class AbodeEntity(Entity):
@@ -29,7 +29,7 @@ class AbodeEntity(Entity):
self._update_connection_status,
)
self.hass.data[DOMAIN_DATA].entity_ids.add(self.entity_id)
self._data.entity_ids.add(self.entity_id)
async def async_will_remove_from_hass(self) -> None:
"""Unsubscribe from Abode connection status updates."""
+3 -4
@@ -16,21 +16,20 @@ from homeassistant.components.light import (
ColorMode,
LightEntity,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN_DATA
from . import AbodeConfigEntry
from .entity import AbodeDevice
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: AbodeConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Abode light devices."""
data = hass.data[DOMAIN_DATA]
data = entry.runtime_data
async_add_entities(
AbodeLight(data, device)
+3 -4
@@ -5,21 +5,20 @@ from typing import Any
from jaraco.abode.devices.lock import Lock
from homeassistant.components.lock import LockEntity
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN_DATA
from . import AbodeConfigEntry
from .entity import AbodeDevice
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: AbodeConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Abode lock devices."""
data = hass.data[DOMAIN_DATA]
data = entry.runtime_data
async_add_entities(
AbodeLock(data, device)
+3 -5
@@ -14,13 +14,11 @@ from homeassistant.components.sensor import (
SensorEntityDescription,
SensorStateClass,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import LIGHT_LUX, PERCENTAGE, UnitOfTemperature
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import AbodeSystem
from .const import DOMAIN_DATA
from . import AbodeConfigEntry, AbodeSystem
from .entity import AbodeDevice
ABODE_TEMPERATURE_UNIT_HA_UNIT = {
@@ -66,11 +64,11 @@ SENSOR_TYPES: tuple[AbodeSensorDescription, ...] = (
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: AbodeConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Abode sensor devices."""
data = hass.data[DOMAIN_DATA]
data = entry.runtime_data
async_add_entities(
AbodeSensor(data, device, description)
+18 -4
@@ -2,15 +2,21 @@
from __future__ import annotations
from typing import TYPE_CHECKING
from jaraco.abode.exceptions import Exception as AbodeException
import voluptuous as vol
from homeassistant.const import ATTR_ENTITY_ID
from homeassistant.core import HomeAssistant, ServiceCall, callback
from homeassistant.exceptions import ServiceValidationError
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.dispatcher import dispatcher_send
from .const import DOMAIN, DOMAIN_DATA, LOGGER
from .const import DOMAIN, LOGGER
if TYPE_CHECKING:
from . import AbodeConfigEntry, AbodeSystem
ATTR_SETTING = "setting"
ATTR_VALUE = "value"
@@ -25,13 +31,21 @@ CAPTURE_IMAGE_SCHEMA = vol.Schema({ATTR_ENTITY_ID: cv.entity_ids})
AUTOMATION_SCHEMA = vol.Schema({ATTR_ENTITY_ID: cv.entity_ids})
def _get_abode_system(hass: HomeAssistant) -> AbodeSystem:
"""Return the Abode system for the loaded config entry."""
entries: list[AbodeConfigEntry] = hass.config_entries.async_loaded_entries(DOMAIN)
if not entries:
raise ServiceValidationError("Abode integration is not loaded")
return entries[0].runtime_data
def _change_setting(call: ServiceCall) -> None:
"""Change an Abode system setting."""
setting = call.data[ATTR_SETTING]
value = call.data[ATTR_VALUE]
try:
call.hass.data[DOMAIN_DATA].abode.set_setting(setting, value)
_get_abode_system(call.hass).abode.set_setting(setting, value)
except AbodeException as ex:
LOGGER.warning(ex)
@@ -42,7 +56,7 @@ def _capture_image(call: ServiceCall) -> None:
target_entities = [
entity_id
for entity_id in call.hass.data[DOMAIN_DATA].entity_ids
for entity_id in _get_abode_system(call.hass).entity_ids
if entity_id in entity_ids
]
@@ -57,7 +71,7 @@ def _trigger_automation(call: ServiceCall) -> None:
target_entities = [
entity_id
for entity_id in call.hass.data[DOMAIN_DATA].entity_ids
for entity_id in _get_abode_system(call.hass).entity_ids
if entity_id in entity_ids
]
+3 -4
@@ -7,12 +7,11 @@ from typing import Any, cast
from jaraco.abode.devices.switch import Switch
from homeassistant.components.switch import SwitchEntity
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.dispatcher import async_dispatcher_connect
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN_DATA
from . import AbodeConfigEntry
from .entity import AbodeAutomation, AbodeDevice
DEVICE_TYPES = ["switch", "valve"]
@@ -20,11 +19,11 @@ DEVICE_TYPES = ["switch", "valve"]
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: AbodeConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Abode switch devices."""
data = hass.data[DOMAIN_DATA]
data = entry.runtime_data
entities: list[SwitchEntity] = [
AbodeSwitch(data, device)
@@ -5,5 +5,5 @@
"documentation": "https://www.home-assistant.io/integrations/acer_projector",
"iot_class": "local_polling",
"quality_scale": "legacy",
"requirements": ["serialx==1.2.2"]
"requirements": ["serialx==1.4.1"]
}
+16 -5
@@ -15,8 +15,10 @@ from homeassistant.components.climate import (
)
from homeassistant.const import ATTR_TEMPERATURE, UnitOfTemperature
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ServiceValidationError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN
from .coordinator import ActronAirConfigEntry, ActronAirSystemCoordinator
from .entity import ActronAirAcEntity, ActronAirZoneEntity, actron_air_command
@@ -139,20 +141,24 @@ class ActronSystemClimate(ActronAirAcEntity, ActronAirClimateEntity):
@actron_air_command
async def async_set_fan_mode(self, fan_mode: str) -> None:
"""Set a new fan mode."""
api_fan_mode = FAN_MODE_MAPPING_HA_TO_ACTRONAIR.get(fan_mode)
api_fan_mode = FAN_MODE_MAPPING_HA_TO_ACTRONAIR[fan_mode]
await self._status.user_aircon_settings.set_fan_mode(api_fan_mode)
@actron_air_command
async def async_set_hvac_mode(self, hvac_mode: HVACMode) -> None:
"""Set the HVAC mode."""
ac_mode = HVAC_MODE_MAPPING_HA_TO_ACTRONAIR.get(hvac_mode)
ac_mode = HVAC_MODE_MAPPING_HA_TO_ACTRONAIR[hvac_mode]
await self._status.ac_system.set_system_mode(ac_mode)
@actron_air_command
async def async_set_temperature(self, **kwargs: Any) -> None:
"""Set the temperature."""
temp = kwargs.get(ATTR_TEMPERATURE)
await self._status.user_aircon_settings.set_temperature(temperature=temp)
if (temperature := kwargs.get(ATTR_TEMPERATURE)) is None:
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="temperature_missing",
)
await self._status.user_aircon_settings.set_temperature(temperature=temperature)
class ActronZoneClimate(ActronAirZoneEntity, ActronAirClimateEntity):
@@ -221,4 +227,9 @@ class ActronZoneClimate(ActronAirZoneEntity, ActronAirClimateEntity):
@actron_air_command
async def async_set_temperature(self, **kwargs: Any) -> None:
"""Set the temperature."""
await self._zone.set_temperature(temperature=kwargs.get(ATTR_TEMPERATURE))
if (temperature := kwargs.get(ATTR_TEMPERATURE)) is None:
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="temperature_missing",
)
await self._zone.set_temperature(temperature=temperature)
@@ -23,7 +23,7 @@ class ActronAirConfigFlow(ConfigFlow, domain=DOMAIN):
self._user_code: str = ""
self._verification_uri: str = ""
self._expires_minutes: str = "30"
self.login_task: asyncio.Task | None = None
self.login_task: asyncio.Task[None] | None = None
async def async_step_user(
self, user_input: dict[str, Any] | None = None
@@ -94,7 +94,7 @@ class ActronAirConfigFlow(ConfigFlow, domain=DOMAIN):
_LOGGER.error("Error getting user info: %s", err)
return self.async_abort(reason="oauth2_error")
unique_id = str(user_data["id"])
unique_id = user_data.sub
await self.async_set_unique_id(unique_id)
# Check if this is a reauth flow
@@ -107,7 +107,7 @@ class ActronAirConfigFlow(ConfigFlow, domain=DOMAIN):
self._abort_if_unique_id_configured()
return self.async_create_entry(
title=user_data["email"],
title=user_data.email,
data={CONF_API_TOKEN: self._api.refresh_token_value},
)
@@ -78,7 +78,14 @@ class ActronAirSystemCoordinator(DataUpdateCoordinator[ActronAirStatus]):
translation_placeholders={"error": repr(err)},
) from err
self.status = self.api.state_manager.get_status(self.serial_number)
status = self.api.state_manager.get_status(self.serial_number)
if status is None:
raise UpdateFailed(
translation_domain=DOMAIN,
translation_key="update_error",
translation_placeholders={"error": "Status not available"},
)
self.status = status
self.last_seen = dt_util.utcnow()
return self.status
@@ -24,7 +24,7 @@ def actron_air_command[_EntityT: ActronAirEntity, **_P](
"""
@wraps(func)
async def wrapper(self: _EntityT, *args: _P.args, **kwargs: _P.kwargs) -> None:
async def wrapper(self: _EntityT, /, *args: _P.args, **kwargs: _P.kwargs) -> None:
"""Wrap API calls with exception handling."""
try:
await func(self, *args, **kwargs)
@@ -13,5 +13,5 @@
"integration_type": "hub",
"iot_class": "cloud_polling",
"quality_scale": "silver",
"requirements": ["actron-neo-api==0.5.0"]
"requirements": ["actron-neo-api==0.5.3"]
}
@@ -69,4 +69,4 @@ rules:
# Platinum
async-dependency: done
inject-websession: todo
strict-typing: todo
strict-typing: done
@@ -58,6 +58,9 @@
"setup_connection_error": {
"message": "Failed to connect to the Actron Air API"
},
"temperature_missing": {
"message": "Provide a temperature value when adjusting the climate entity."
},
"update_error": {
"message": "An error occurred while retrieving data from the Actron Air API: {error}"
}
@@ -36,7 +36,9 @@ def _make_detected_condition(
) -> type[Condition]:
"""Create a detected condition for a binary sensor device class."""
return make_entity_state_condition(
{BINARY_SENSOR_DOMAIN: DomainSpec(device_class=device_class)}, STATE_ON
{BINARY_SENSOR_DOMAIN: DomainSpec(device_class=device_class)},
STATE_ON,
support_duration=True,
)
@@ -45,7 +47,9 @@ def _make_cleared_condition(
) -> type[Condition]:
"""Create a cleared condition for a binary sensor device class."""
return make_entity_state_condition(
{BINARY_SENSOR_DOMAIN: DomainSpec(device_class=device_class)}, STATE_OFF
{BINARY_SENSOR_DOMAIN: DomainSpec(device_class=device_class)},
STATE_OFF,
support_duration=True,
)
@@ -249,6 +249,11 @@
.condition_binary_common: &condition_binary_common
fields:
behavior: *condition_behavior
for:
required: true
default: 00:00:00
selector:
duration:
is_gas_detected:
<<: *condition_binary_common
@@ -1,6 +1,7 @@
{
"common": {
"condition_behavior_name": "Condition passes if",
"condition_for_name": "For at least",
"condition_threshold_name": "Threshold type",
"trigger_behavior_name": "Trigger when",
"trigger_for_name": "For at least",
@@ -24,6 +25,9 @@
"fields": {
"behavior": {
"name": "[%key:component::air_quality::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::air_quality::common::condition_for_name%]"
}
},
"name": "Carbon monoxide cleared"
@@ -33,6 +37,9 @@
"fields": {
"behavior": {
"name": "[%key:component::air_quality::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::air_quality::common::condition_for_name%]"
}
},
"name": "Carbon monoxide detected"
@@ -54,6 +61,9 @@
"fields": {
"behavior": {
"name": "[%key:component::air_quality::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::air_quality::common::condition_for_name%]"
}
},
"name": "Gas cleared"
@@ -63,6 +73,9 @@
"fields": {
"behavior": {
"name": "[%key:component::air_quality::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::air_quality::common::condition_for_name%]"
}
},
"name": "Gas detected"
@@ -168,6 +181,9 @@
"fields": {
"behavior": {
"name": "[%key:component::air_quality::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::air_quality::common::condition_for_name%]"
}
},
"name": "Smoke cleared"
@@ -177,6 +193,9 @@
"fields": {
"behavior": {
"name": "[%key:component::air_quality::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::air_quality::common::condition_for_name%]"
}
},
"name": "Smoke detected"
@@ -4,6 +4,7 @@ from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.automation import DomainSpec
from homeassistant.helpers.condition import (
ENTITY_STATE_CONDITION_SCHEMA_ANY_ALL_FOR,
Condition,
EntityStateConditionBase,
make_entity_state_condition,
@@ -25,6 +26,7 @@ class EntityStateRequiredFeaturesCondition(EntityStateConditionBase):
"""State condition."""
_required_features: int
_schema = ENTITY_STATE_CONDITION_SCHEMA_ANY_ALL_FOR
def entity_filter(self, entities: set[str]) -> set[str]:
"""Filter entities of this domain with the required features."""
@@ -82,9 +84,11 @@ CONDITIONS: dict[str, type[Condition]] = {
AlarmControlPanelState.ARMED_VACATION,
AlarmControlPanelEntityFeature.ARM_VACATION,
),
"is_disarmed": make_entity_state_condition(DOMAIN, AlarmControlPanelState.DISARMED),
"is_disarmed": make_entity_state_condition(
DOMAIN, AlarmControlPanelState.DISARMED, support_duration=True
),
"is_triggered": make_entity_state_condition(
DOMAIN, AlarmControlPanelState.TRIGGERED
DOMAIN, AlarmControlPanelState.TRIGGERED, support_duration=True
),
}
@@ -1,9 +1,9 @@
.condition_common: &condition_common
target:
target: &condition_common_target
entity:
domain: alarm_control_panel
fields: &condition_common_fields
behavior:
behavior: &condition_common_behavior
required: true
default: any
selector:
@@ -13,10 +13,20 @@
- all
- any
.condition_common_for: &condition_common_for
target: *condition_common_target
fields: &condition_common_for_fields
behavior: *condition_common_behavior
for:
required: true
default: 00:00:00
selector:
duration:
is_armed: *condition_common
is_armed_away:
fields: *condition_common_fields
fields: *condition_common_for_fields
target:
entity:
domain: alarm_control_panel
@@ -24,7 +34,7 @@ is_armed_away:
- alarm_control_panel.AlarmControlPanelEntityFeature.ARM_AWAY
is_armed_home:
fields: *condition_common_fields
fields: *condition_common_for_fields
target:
entity:
domain: alarm_control_panel
@@ -32,7 +42,7 @@ is_armed_home:
- alarm_control_panel.AlarmControlPanelEntityFeature.ARM_HOME
is_armed_night:
fields: *condition_common_fields
fields: *condition_common_for_fields
target:
entity:
domain: alarm_control_panel
@@ -40,13 +50,13 @@ is_armed_night:
- alarm_control_panel.AlarmControlPanelEntityFeature.ARM_NIGHT
is_armed_vacation:
fields: *condition_common_fields
fields: *condition_common_for_fields
target:
entity:
domain: alarm_control_panel
supported_features:
- alarm_control_panel.AlarmControlPanelEntityFeature.ARM_VACATION
is_disarmed: *condition_common
is_disarmed: *condition_common_for
is_triggered: *condition_common
is_triggered: *condition_common_for
@@ -1,6 +1,7 @@
{
"common": {
"condition_behavior_name": "Condition passes if",
"condition_for_name": "For at least",
"trigger_behavior_name": "Trigger when",
"trigger_for_name": "For at least"
},
@@ -19,6 +20,9 @@
"fields": {
"behavior": {
"name": "[%key:component::alarm_control_panel::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::alarm_control_panel::common::condition_for_name%]"
}
},
"name": "Alarm is armed away"
@@ -28,6 +32,9 @@
"fields": {
"behavior": {
"name": "[%key:component::alarm_control_panel::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::alarm_control_panel::common::condition_for_name%]"
}
},
"name": "Alarm is armed home"
@@ -37,6 +44,9 @@
"fields": {
"behavior": {
"name": "[%key:component::alarm_control_panel::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::alarm_control_panel::common::condition_for_name%]"
}
},
"name": "Alarm is armed night"
@@ -46,6 +56,9 @@
"fields": {
"behavior": {
"name": "[%key:component::alarm_control_panel::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::alarm_control_panel::common::condition_for_name%]"
}
},
"name": "Alarm is armed vacation"
@@ -55,6 +68,9 @@
"fields": {
"behavior": {
"name": "[%key:component::alarm_control_panel::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::alarm_control_panel::common::condition_for_name%]"
}
},
"name": "Alarm is disarmed"
@@ -64,6 +80,9 @@
"fields": {
"behavior": {
"name": "[%key:component::alarm_control_panel::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::alarm_control_panel::common::condition_for_name%]"
}
},
"name": "Alarm is triggered"
@@ -43,7 +43,6 @@ from homeassistant.helpers.selector import (
from homeassistant.helpers.typing import VolDictType
from .const import (
CODE_EXECUTION_UNSUPPORTED_MODELS,
CONF_CHAT_MODEL,
CONF_CODE_EXECUTION,
CONF_MAX_TOKENS,
@@ -66,7 +65,6 @@ from .const import (
DOMAIN,
MIN_THINKING_BUDGET,
TOOL_SEARCH_UNSUPPORTED_MODELS,
WEB_SEARCH_UNSUPPORTED_MODELS,
PromptCaching,
)
from .coordinator import model_alias
@@ -389,8 +387,6 @@ class ConversationSubentryFlowHandler(ConfigSubentryFlow):
else cv.positive_int,
}
model = self.options[CONF_CHAT_MODEL]
if (
self.model_info.capabilities
and self.model_info.capabilities.thinking.supported
@@ -445,43 +441,34 @@ class ConversationSubentryFlowHandler(ConfigSubentryFlow):
else:
self.options.pop(CONF_THINKING_EFFORT, None)
if not model.startswith(tuple(CODE_EXECUTION_UNSUPPORTED_MODELS)):
step_schema[
step_schema.update(
{
vol.Optional(
CONF_CODE_EXECUTION,
default=DEFAULT[CONF_CODE_EXECUTION],
)
] = bool
else:
self.options.pop(CONF_CODE_EXECUTION, None)
if not model.startswith(tuple(WEB_SEARCH_UNSUPPORTED_MODELS)):
step_schema.update(
{
vol.Optional(
CONF_WEB_SEARCH,
default=DEFAULT[CONF_WEB_SEARCH],
): bool,
vol.Optional(
CONF_WEB_SEARCH_MAX_USES,
default=DEFAULT[CONF_WEB_SEARCH_MAX_USES],
): int,
vol.Optional(
CONF_WEB_SEARCH_USER_LOCATION,
default=DEFAULT[CONF_WEB_SEARCH_USER_LOCATION],
): bool,
}
)
else:
self.options.pop(CONF_WEB_SEARCH, None)
self.options.pop(CONF_WEB_SEARCH_MAX_USES, None)
self.options.pop(CONF_WEB_SEARCH_USER_LOCATION, None)
): bool,
vol.Optional(
CONF_WEB_SEARCH,
default=DEFAULT[CONF_WEB_SEARCH],
): bool,
vol.Optional(
CONF_WEB_SEARCH_MAX_USES,
default=DEFAULT[CONF_WEB_SEARCH_MAX_USES],
): int,
vol.Optional(
CONF_WEB_SEARCH_USER_LOCATION,
default=DEFAULT[CONF_WEB_SEARCH_USER_LOCATION],
): bool,
}
)
self.options.pop(CONF_WEB_SEARCH_CITY, None)
self.options.pop(CONF_WEB_SEARCH_REGION, None)
self.options.pop(CONF_WEB_SEARCH_COUNTRY, None)
self.options.pop(CONF_WEB_SEARCH_TIMEZONE, None)
model = self.options[CONF_CHAT_MODEL]
if not model.startswith(tuple(TOOL_SEARCH_UNSUPPORTED_MODELS)):
step_schema[
vol.Optional(
@@ -50,15 +50,6 @@ DEFAULT = {
CONF_WEB_SEARCH_MAX_USES: 5,
}
WEB_SEARCH_UNSUPPORTED_MODELS = [
"claude-3-haiku",
]
CODE_EXECUTION_UNSUPPORTED_MODELS = [
"claude-3-haiku",
]
TOOL_SEARCH_UNSUPPORTED_MODELS = [
"claude-3",
"claude-haiku",
]
@@ -28,9 +28,7 @@ _model_short_form = re.compile(r"[^\d]-\d$")
@callback
def model_alias(model_id: str) -> str:
"""Resolve alias from versioned model name."""
if model_id == "claude-3-haiku-20240307" or model_id.endswith("-preview"):
return model_id
if model_id[-2:-1] != "-":
if model_id[-2:-1] != "-" and not model_id.endswith("-preview"):
model_id = model_id[:-9]
if _model_short_form.search(model_id):
return model_id + "-0"
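The alias rule in the hunk above can be sketched as a standalone function (a simplified sketch for illustration; the real `model_alias` is a Home Assistant `@callback` and may handle cases not shown here):

```python
import re

# Matches ids whose last segment is a single-digit version, e.g. "...-opus-4"
_model_short_form = re.compile(r"[^\d]-\d$")


def model_alias_sketch(model_id: str) -> str:
    """Resolve an alias from a versioned model name.

    Strips a trailing date suffix (e.g. "-20250514") unless the id is already
    a short form (ends in "-N") or a "-preview" id, then normalizes
    single-digit versions by appending "-0".
    """
    if model_id[-2:-1] != "-" and not model_id.endswith("-preview"):
        model_id = model_id[:-9]  # drop "-YYYYMMDD"
    if _model_short_form.search(model_id):
        return model_id + "-0"
    return model_id
```

For example, `model_alias_sketch("claude-opus-4-20250514")` yields `"claude-opus-4-0"`, while an already-short id like `"claude-opus-4-0"` passes through unchanged.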
@@ -124,10 +124,14 @@ def _format_tool(
tool: llm.Tool, custom_serializer: Callable[[Any], Any] | None
) -> ToolParam:
"""Format tool specification."""
unsupported_keys = {"oneOf", "anyOf", "allOf"}
schema = convert(tool.parameters, custom_serializer=custom_serializer)
schema = {k: v for k, v in schema.items() if k not in unsupported_keys}
return ToolParam(
name=tool.name,
description=tool.description or "",
input_schema=convert(tool.parameters, custom_serializer=custom_serializer),
input_schema=schema,
)
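The filtering step added in `_format_tool` drops top-level JSON Schema combinators before building the `ToolParam`. Its core can be sketched with plain dicts (no Home Assistant or Anthropic imports; key names taken from the diff):

```python
# Top-level schema keys the tool API does not accept, per the diff above.
UNSUPPORTED_KEYS = {"oneOf", "anyOf", "allOf"}


def strip_unsupported(schema: dict) -> dict:
    """Drop top-level JSON Schema combinator keys from a tool input schema."""
    return {k: v for k, v in schema.items() if k not in UNSUPPORTED_KEYS}
```

Note this only filters the top level; nested occurrences of these keys deeper in the schema are left untouched, matching the dict comprehension in the hunk.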
@@ -7,13 +7,17 @@ from .const import DOMAIN
from .entity import AssistSatelliteState
CONDITIONS: dict[str, type[Condition]] = {
"is_idle": make_entity_state_condition(DOMAIN, AssistSatelliteState.IDLE),
"is_listening": make_entity_state_condition(DOMAIN, AssistSatelliteState.LISTENING),
"is_idle": make_entity_state_condition(
DOMAIN, AssistSatelliteState.IDLE, support_duration=True
),
"is_listening": make_entity_state_condition(
DOMAIN, AssistSatelliteState.LISTENING, support_duration=True
),
"is_processing": make_entity_state_condition(
DOMAIN, AssistSatelliteState.PROCESSING
DOMAIN, AssistSatelliteState.PROCESSING, support_duration=True
),
"is_responding": make_entity_state_condition(
DOMAIN, AssistSatelliteState.RESPONDING
DOMAIN, AssistSatelliteState.RESPONDING, support_duration=True
),
}
@@ -12,6 +12,11 @@
options:
- all
- any
for:
required: true
default: 00:00:00
selector:
duration:
is_idle: *condition_common
is_listening: *condition_common
@@ -1,6 +1,7 @@
{
"common": {
"condition_behavior_name": "Condition passes if",
"condition_for_name": "For at least",
"trigger_behavior_name": "Trigger when",
"trigger_for_name": "For at least"
},
@@ -10,6 +11,9 @@
"fields": {
"behavior": {
"name": "[%key:component::assist_satellite::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::assist_satellite::common::condition_for_name%]"
}
},
"name": "Satellite is idle"
@@ -19,6 +23,9 @@
"fields": {
"behavior": {
"name": "[%key:component::assist_satellite::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::assist_satellite::common::condition_for_name%]"
}
},
"name": "Satellite is listening"
@@ -28,6 +35,9 @@
"fields": {
"behavior": {
"name": "[%key:component::assist_satellite::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::assist_satellite::common::condition_for_name%]"
}
},
"name": "Satellite is processing"
@@ -37,6 +47,9 @@
"fields": {
"behavior": {
"name": "[%key:component::assist_satellite::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::assist_satellite::common::condition_for_name%]"
}
},
"name": "Satellite is responding"
@@ -169,6 +169,7 @@ _EXPERIMENTAL_TRIGGER_PLATFORMS = {
"cover",
"device_tracker",
"door",
"doorbell",
"event",
"fan",
"garage_door",
@@ -1,8 +1,12 @@
"""Support for Amazon Web Services (AWS)."""
from __future__ import annotations
import asyncio
from collections import OrderedDict
from dataclasses import dataclass
import logging
from typing import Any
from aiobotocore.session import AioSession
import voluptuous as vol
@@ -30,14 +34,22 @@ from .const import (
CONF_REGION,
CONF_SECRET_ACCESS_KEY,
CONF_VALIDATE,
DATA_CONFIG,
DATA_HASS_CONFIG,
DATA_SESSIONS,
DATA_AWS,
DOMAIN,
)
_LOGGER = logging.getLogger(__name__)
@dataclass
class AWSData:
"""Runtime data for the AWS integration."""
hass_config: ConfigType
config: dict[str, Any]
sessions: OrderedDict[str, AioSession]
AWS_CREDENTIAL_SCHEMA = vol.Schema(
{
vol.Required(CONF_NAME): cv.string,
@@ -88,14 +100,13 @@ CONFIG_SCHEMA = vol.Schema(
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up AWS component."""
hass.data[DATA_HASS_CONFIG] = config
if (conf := config.get(DOMAIN)) is None:
# create a default conf using default profile
conf = CONFIG_SCHEMA({ATTR_CREDENTIALS: DEFAULT_CREDENTIAL})
hass.data[DATA_CONFIG] = conf
hass.data[DATA_SESSIONS] = OrderedDict()
hass.data[DATA_AWS] = AWSData(
hass_config=config, config=conf, sessions=OrderedDict()
)
hass.async_create_task(
hass.config_entries.flow.async_init(
@@ -111,8 +122,8 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
Validate and save sessions per aws credential.
"""
config = hass.data[DATA_HASS_CONFIG]
conf = hass.data[DATA_CONFIG]
data = hass.data[DATA_AWS]
conf = data.config
if entry.source == config_entries.SOURCE_IMPORT:
if conf is None:
@@ -143,14 +154,14 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
)
validation = False
else:
hass.data[DATA_SESSIONS][name] = result
data.sessions[name] = result
# set up notify platform, no entry support for notify component yet,
# have to use discovery to load platform.
for notify_config in conf[CONF_NOTIFY]:
hass.async_create_task(
discovery.async_load_platform(
hass, Platform.NOTIFY, DOMAIN, notify_config, config
hass, Platform.NOTIFY, DOMAIN, notify_config, data.hass_config
)
)
@@ -1,10 +1,17 @@
"""Constant for AWS component."""
from __future__ import annotations
from typing import TYPE_CHECKING
from homeassistant.util.hass_dict import HassKey
if TYPE_CHECKING:
from . import AWSData
DOMAIN = "aws"
DATA_CONFIG = "aws_config"
DATA_HASS_CONFIG = "aws_hass_config"
DATA_SESSIONS = "aws_sessions"
DATA_AWS: HassKey[AWSData] = HassKey(DOMAIN)
CONF_ACCESS_KEY_ID = "aws_access_key_id"
CONF_CONTEXT = "context"
@@ -27,7 +27,7 @@ from homeassistant.core import HomeAssistant
from homeassistant.helpers.json import JSONEncoder
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from .const import CONF_CONTEXT, CONF_CREDENTIAL_NAME, CONF_REGION, DATA_SESSIONS
from .const import CONF_CONTEXT, CONF_CREDENTIAL_NAME, CONF_REGION, DATA_AWS
_LOGGER = logging.getLogger(__name__)
@@ -76,10 +76,12 @@ async def async_get_service(
if CONF_CONTEXT in aws_config:
del aws_config[CONF_CONTEXT]
sessions = hass.data[DATA_AWS].sessions
if not aws_config:
# no platform config, use the first aws component credential instead
if hass.data[DATA_SESSIONS]:
session = next(iter(hass.data[DATA_SESSIONS].values()))
if sessions:
session = next(iter(sessions.values()))
else:
_LOGGER.error("Missing aws credential for %s", config[CONF_NAME])
return None
@@ -87,7 +89,7 @@ async def async_get_service(
if session is None:
credential_name = aws_config.get(CONF_CREDENTIAL_NAME)
if credential_name is not None:
session = hass.data[DATA_SESSIONS].get(credential_name)
session = sessions.get(credential_name)
if session is None:
_LOGGER.warning("No available aws session for %s", credential_name)
del aws_config[CONF_CREDENTIAL_NAME]
@@ -5,10 +5,7 @@ from __future__ import annotations
import dataclasses
from typing import Any
from homeassistant.components.backup import (
DATA_MANAGER as BACKUP_DATA_MANAGER,
BackupManager,
)
from homeassistant.components.backup import DATA_MANAGER as BACKUP_DATA_MANAGER
from homeassistant.components.diagnostics import async_redact_data
from homeassistant.core import HomeAssistant
@@ -31,7 +28,7 @@ async def async_get_config_entry_diagnostics(
) -> dict[str, Any]:
"""Return diagnostics for a config entry."""
coordinator = entry.runtime_data
backup_manager: BackupManager = hass.data[BACKUP_DATA_MANAGER]
backup_manager = hass.data[BACKUP_DATA_MANAGER]
backups = await async_list_backups_from_s3(
coordinator.client,
bucket=entry.data[CONF_BUCKET],
@@ -34,7 +34,7 @@ def get_device(hass: HomeAssistant, unique_id: str) -> DeviceEntry:
def get_serial_number_from_jid(jid: str) -> str:
"""Get serial number from Beolink JID."""
return jid.split(".")[2].split("@")[0]
return jid.split(".")[2].split("@", maxsplit=1)[0]
async def get_remotes(client: MozartClient) -> list[PairedRemote]:
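The `maxsplit=1` change above guards against extra `@` characters in the host part of a JID. A minimal sketch of the helper, assuming a JID shape of `"<x>.<y>.<serial>@<host>"` as implied by the indexing in the diff:

```python
def serial_from_jid(jid: str) -> str:
    """Extract the serial number from a Beolink JID.

    The third dot-separated segment holds "<serial>@<host-prefix>";
    maxsplit=1 ensures only the first "@" delimits the serial.
    """
    return jid.split(".")[2].split("@", maxsplit=1)[0]
```

For instance, with a hypothetical JID `"1111.2222222.33333333@products.bang-olufsen.com"`, the third `.`-segment is `"33333333@products"` and splitting on the first `@` yields `"33333333"`.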
@@ -29,11 +29,17 @@ BATTERY_PERCENTAGE_DOMAIN_SPECS = {
}
CONDITIONS: dict[str, type[Condition]] = {
"is_low": make_entity_state_condition(BATTERY_DOMAIN_SPECS, STATE_ON),
"is_not_low": make_entity_state_condition(BATTERY_DOMAIN_SPECS, STATE_OFF),
"is_charging": make_entity_state_condition(BATTERY_CHARGING_DOMAIN_SPECS, STATE_ON),
"is_low": make_entity_state_condition(
BATTERY_DOMAIN_SPECS, STATE_ON, support_duration=True
),
"is_not_low": make_entity_state_condition(
BATTERY_DOMAIN_SPECS, STATE_OFF, support_duration=True
),
"is_charging": make_entity_state_condition(
BATTERY_CHARGING_DOMAIN_SPECS, STATE_ON, support_duration=True
),
"is_not_charging": make_entity_state_condition(
BATTERY_CHARGING_DOMAIN_SPECS, STATE_OFF
BATTERY_CHARGING_DOMAIN_SPECS, STATE_OFF, support_duration=True
),
"is_level": make_entity_numerical_condition(
BATTERY_PERCENTAGE_DOMAIN_SPECS, PERCENTAGE
@@ -13,6 +13,11 @@
options:
- all
- any
for: &condition_for
required: true
default: 00:00:00
selector:
duration:
.battery_threshold_entity: &battery_threshold_entity
- domain: input_number
@@ -39,6 +44,7 @@ is_charging:
device_class: battery_charging
fields:
behavior: *condition_behavior
for: *condition_for
is_not_charging:
target:
@@ -47,6 +53,7 @@ is_not_charging:
device_class: battery_charging
fields:
behavior: *condition_behavior
for: *condition_for
is_level:
target:
@@ -1,6 +1,7 @@
{
"common": {
"condition_behavior_name": "Condition passes if",
"condition_for_name": "For at least",
"condition_threshold_name": "Threshold type",
"trigger_behavior_name": "Trigger when",
"trigger_for_name": "For at least",
@@ -12,6 +13,9 @@
"fields": {
"behavior": {
"name": "[%key:component::battery::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::battery::common::condition_for_name%]"
}
},
"name": "Battery is charging"
@@ -33,6 +37,9 @@
"fields": {
"behavior": {
"name": "[%key:component::battery::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::battery::common::condition_for_name%]"
}
},
"name": "Battery is low"
@@ -42,6 +49,9 @@
"fields": {
"behavior": {
"name": "[%key:component::battery::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::battery::common::condition_for_name%]"
}
},
"name": "Battery is not charging"
@@ -51,6 +61,9 @@
"fields": {
"behavior": {
"name": "[%key:component::battery::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::battery::common::condition_for_name%]"
}
},
"name": "Battery is not low"
@@ -7,6 +7,6 @@
"integration_type": "device",
"iot_class": "local_polling",
"loggers": ["blebox_uniapi"],
"requirements": ["blebox-uniapi==2.5.0"],
"requirements": ["blebox-uniapi==2.5.1"],
"zeroconf": ["_bbxsrv._tcp.local."]
}
@@ -6,7 +6,6 @@ DOMAIN = "broadlink"
DOMAINS_AND_TYPES = {
Platform.CLIMATE: {"HYS"},
Platform.INFRARED: {"RM4MINI", "RM4PRO", "RMMINI", "RMMINIB", "RMPRO"},
Platform.LIGHT: {"LB1", "LB2"},
Platform.REMOTE: {"RM4MINI", "RM4PRO", "RMMINI", "RMMINIB", "RMPRO"},
Platform.SELECT: {"HYS"},
@@ -45,6 +44,3 @@ DEVICE_TYPES = set.union(*DOMAINS_AND_TYPES.values())
DEFAULT_PORT = 80
DEFAULT_TIMEOUT = 5
# Broadlink IR packet format - repeat count byte offset
IR_PACKET_REPEAT_INDEX = 1
@@ -1,184 +0,0 @@
"""Infrared platform for Broadlink remotes."""
from __future__ import annotations
from typing import TYPE_CHECKING
from broadlink.exceptions import BroadlinkException
from broadlink.remote import pulses_to_data as _bl_pulses_to_data
import infrared_protocols
from homeassistant.components.infrared import InfraredCommand, InfraredEntity
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN, IR_PACKET_REPEAT_INDEX
from .entity import BroadlinkEntity
if TYPE_CHECKING:
from .device import BroadlinkDevice
PARALLEL_UPDATES = 1
class BroadlinkIRCommand(InfraredCommand):
"""Raw IR command with optional Broadlink hardware repeat count.
This class lets you send raw timing data through a Broadlink infrared
entity. The repeat_count maps directly to the Broadlink packet repeat
byte: the device will re-transmit the entire IR burst that many
additional times after the first transmission.
Use this when you have existing Broadlink-encoded IR data (e.g. from
IR code databases like SmartIR) and want to use it with the new
infrared platform.
Protocol-aware commands (infrared_protocols.NECCommand, LgTVCommand,
etc.) manage repeats *inside* get_raw_timings() and should use the
default repeat=0. Only BroadlinkIRCommand should set hardware repeat.
Example: Migrating IR code database base64 codes to the infrared platform:
import base64
from broadlink.remote import data_to_pulses
from homeassistant.components.broadlink.infrared import BroadlinkIRCommand
from homeassistant.components.broadlink.const import IR_PACKET_REPEAT_INDEX
# Decode base64 IR code (e.g. from IR code database)
packet_data = base64.b64decode(b64_code)
repeat_count = packet_data[IR_PACKET_REPEAT_INDEX]
# Parse Broadlink packet to microsecond timings
pulses = data_to_pulses(packet_data)
timings = list(zip(pulses[::2], pulses[1::2]))
if len(pulses) % 2:
timings.append((pulses[-1], 0))
# Create command
cmd = BroadlinkIRCommand(timings, repeat_count=repeat_count)
await infrared.async_send_command(hass, entity_id, cmd)
"""
# Standard IR carrier frequency. Broadlink hardware handles the carrier
# internally, so this value is informational only.
MODULATION = 38000
def __init__(
self,
timings: list[tuple[int, int]],
repeat_count: int = 0,
) -> None:
"""Initialize with timing pairs and optional repeat count.
Args:
timings: List of (mark_us, space_us) pairs in microseconds.
repeat_count: Broadlink hardware repeat count (0 = send once).
Must be 0-255 (the hardware repeat byte is a single unsigned byte).
Raises:
ValueError: If repeat_count is outside the 0-255 range.
"""
if not 0 <= repeat_count <= 255:
raise ValueError(f"repeat_count must be 0-255, got {repeat_count}")
super().__init__(modulation=self.MODULATION, repeat_count=repeat_count)
self._timings = [
infrared_protocols.Timing(high_us=high, low_us=low) for high, low in timings
]
def get_raw_timings(self) -> list[infrared_protocols.Timing]:
"""Return timing pairs for transmission."""
return self._timings
def timings_to_broadlink_packet(
timings: list[tuple[int, int]],
repeat: int = 0,
) -> bytes:
"""Convert raw timing pairs (high_us, low_us) to a Broadlink IR packet.
Args:
timings: List of (mark_us, space_us) pairs in microseconds.
repeat: Number of extra repeats (0 = send once).
Returns:
Binary packet ready for Broadlink send_data().
"""
if not 0 <= repeat <= 255:
raise ValueError(f"repeat must be 0-255, got {repeat}")
# Flatten (mark, space) pairs into a pulse list, omitting any zero-length spaces
pulses: list[int] = []
for high_us, low_us in timings:
pulses.append(high_us)
if low_us:
pulses.append(low_us)
# Use broadlink library's encoder (tick=32.84 µs)
packet = bytearray(_bl_pulses_to_data(pulses))
packet[IR_PACKET_REPEAT_INDEX] = repeat
return bytes(packet)
async def async_setup_entry(
hass: HomeAssistant,
config_entry: ConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Broadlink infrared entity."""
device = hass.data[DOMAIN].devices[config_entry.entry_id]
async_add_entities([BroadlinkInfraredEntity(device)])
class BroadlinkInfraredEntity(BroadlinkEntity, InfraredEntity):
"""Broadlink infrared transmitter entity."""
_attr_has_entity_name = True
_attr_translation_key = "infrared"
def __init__(self, device: BroadlinkDevice) -> None:
"""Initialize the entity."""
super().__init__(device)
self._attr_unique_id = f"{device.unique_id}-infrared"
async def async_send_command(self, command: InfraredCommand) -> None:
"""Send an IR command via the Broadlink device.
Handles two types of repeat behavior:
1. Protocol-aware commands (NECCommand, etc.): These encode repeats
(like NEC repeat codes) inside their get_raw_timings() data. The
Broadlink packet is sent with repeat=0.
2. BroadlinkIRCommand: Carries Broadlink hardware repeat count,
which tells the device to re-transmit the entire burst N times.
This is used for protocols/commands that need multiple full frame
transmissions (e.g. legacy SmartIR data).
Using isinstance check ensures protocol-level repeats (already in
timing data) don't get conflated with hardware repeats.
"""
timings = [
(timing.high_us, timing.low_us) for timing in command.get_raw_timings()
]
# Only BroadlinkIRCommand uses Broadlink hardware repeat. Protocol-aware
# commands (NECCommand, etc.) encode repeats inside get_raw_timings()
# and must use hardware repeat=0 to avoid double-repeating.
if isinstance(command, BroadlinkIRCommand):
repeat = command.repeat_count
else:
repeat = 0
packet = timings_to_broadlink_packet(timings, repeat=repeat)
try:
await self._device.async_request(self._device.api.send_data, packet)
except (BroadlinkException, OSError) as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="send_command_failed",
translation_placeholders={"error": str(err)},
) from err
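The core step of the removed `timings_to_broadlink_packet` helper, flattening `(mark, space)` pairs into a pulse list while omitting zero-length spaces, can be sketched without the `broadlink` dependency (a simplified sketch; the real helper then passes the pulses through the library's `pulses_to_data` encoder and stamps the repeat byte):

```python
def flatten_timings(timings: list[tuple[int, int]]) -> list[int]:
    """Flatten (mark_us, space_us) pairs into a single pulse list.

    Zero-length spaces (used to pad an odd pulse count) are dropped,
    mirroring the loop in the removed helper above.
    """
    pulses: list[int] = []
    for high_us, low_us in timings:
        pulses.append(high_us)
        if low_us:
            pulses.append(low_us)
    return pulses
```

So `flatten_timings([(560, 560), (560, 0)])` returns `[560, 560, 560]`: the trailing zero space from the padded final pair is omitted.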
@@ -3,7 +3,6 @@
"name": "Broadlink",
"codeowners": ["@danielhiversen", "@felipediel", "@L-I-Am", "@eifinger"],
"config_flow": true,
"dependencies": ["infrared"],
"dhcp": [
{
"registered_devices": true
@@ -49,11 +49,6 @@
}
},
"entity": {
"infrared": {
"infrared": {
"name": "IR transmitter"
}
},
"select": {
"day_of_week": {
"name": "Day of week",
@@ -82,10 +77,5 @@
"name": "Total consumption"
}
}
},
"exceptions": {
"send_command_failed": {
"message": "Failed to send IR command: {error}"
}
}
}
@@ -7,7 +7,9 @@ from homeassistant.helpers.condition import Condition, make_entity_state_conditi
from .const import DOMAIN
CONDITIONS: dict[str, type[Condition]] = {
"is_event_active": make_entity_state_condition(DOMAIN, STATE_ON),
"is_event_active": make_entity_state_condition(
DOMAIN, STATE_ON, support_duration=True
),
}
@@ -12,3 +12,8 @@ is_event_active:
options:
- all
- any
for:
required: true
default: 00:00:00
selector:
duration:
@@ -1,6 +1,7 @@
{
"common": {
"condition_behavior_name": "Condition passes if"
"condition_behavior_name": "Condition passes if",
"condition_for_name": "For at least"
},
"conditions": {
"is_event_active": {
@@ -8,6 +9,9 @@
"fields": {
"behavior": {
"name": "[%key:component::calendar::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::calendar::common::condition_for_name%]"
}
},
"name": "Calendar event is active"
@@ -7,5 +7,5 @@
"documentation": "https://www.home-assistant.io/integrations/camera",
"integration_type": "entity",
"quality_scale": "internal",
"requirements": ["PyTurboJPEG==2.2.0"]
"requirements": ["PyTurboJPEG==1.8.3"]
}
@@ -50,7 +50,9 @@ ATTR_UID = "uid"
ATTR_LATITUDE = "latitude"
ATTR_LONGITUDE = "longitude"
ATTR_EMPTY_SLOTS = "empty_slots"
ATTR_FREE_EBIKES = "free_ebikes"
ATTR_TIMESTAMP = "timestamp"
EXTRA_EBIKES = "ebikes"
CONF_NETWORK = "network"
CONF_STATIONS_LIST = "stations"
@@ -238,5 +240,6 @@ class CityBikesStation(SensorEntity):
ATTR_LATITUDE: station.latitude,
ATTR_LONGITUDE: station.longitude,
ATTR_EMPTY_SLOTS: station.empty_slots,
ATTR_FREE_EBIKES: station.extra.get(EXTRA_EBIKES),
ATTR_TIMESTAMP: station.timestamp,
}
@@ -67,7 +67,7 @@ class ClimateTargetTemperatureCondition(EntityNumericalConditionWithUnitBase):
CONDITIONS: dict[str, type[Condition]] = {
"is_hvac_mode": ClimateHVACModeCondition,
"is_off": make_entity_state_condition(DOMAIN, HVACMode.OFF),
"is_off": make_entity_state_condition(DOMAIN, HVACMode.OFF, support_duration=True),
"is_on": make_entity_state_condition(
DOMAIN,
{
@@ -39,7 +39,16 @@
- domain: number
device_class: temperature
is_off: *condition_common
is_off:
target: *condition_climate_target
fields:
behavior: *condition_behavior
for:
required: true
default: 00:00:00
selector:
duration:
is_on: *condition_common
is_cooling: *condition_common
is_drying: *condition_common
@@ -1,6 +1,7 @@
{
"common": {
"condition_behavior_name": "Condition passes if",
"condition_for_name": "For at least",
"condition_threshold_name": "Threshold type",
"trigger_behavior_name": "Trigger when",
"trigger_for_name": "For at least",
@@ -52,6 +53,9 @@
"fields": {
"behavior": {
"name": "[%key:component::climate::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::climate::common::condition_for_name%]"
}
},
"name": "Climate-control device is off"
@@ -15,7 +15,7 @@ from homeassistant.helpers.selector import (
SelectSelector,
SelectSelectorConfig,
SelectSelectorMode,
SerialSelector,
SerialPortSelector,
)
from .const import DOMAIN, LOGGER
@@ -110,7 +110,7 @@ class DenonRS232ConfigFlow(ConfigFlow, domain=DOMAIN):
translation_key="model",
)
),
vol.Required(CONF_DEVICE): SerialSelector(),
vol.Required(CONF_DEVICE): SerialPortSelector(),
}
),
user_input or {},
@@ -0,0 +1,15 @@
"""Integration for doorbell triggers."""
from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.typing import ConfigType
DOMAIN = "doorbell"
CONFIG_SCHEMA = cv.empty_config_schema(DOMAIN)
__all__ = []
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the component."""
return True
@@ -0,0 +1,7 @@
{
"triggers": {
"rang": {
"trigger": "mdi:doorbell"
}
}
}
@@ -0,0 +1,8 @@
{
"domain": "doorbell",
"name": "Doorbell",
"codeowners": ["@home-assistant/core"],
"documentation": "https://www.home-assistant.io/integrations/doorbell",
"integration_type": "system",
"quality_scale": "internal"
}
@@ -0,0 +1,9 @@
{
"title": "Doorbell",
"triggers": {
"rang": {
"description": "Triggers after one or more doorbells rang.",
"name": "Doorbell rang"
}
}
}
@@ -0,0 +1,50 @@
"""Provides triggers for doorbells."""
from homeassistant.components.event import (
ATTR_EVENT_TYPE,
DOMAIN as EVENT_DOMAIN,
DoorbellEventType,
EventDeviceClass,
)
from homeassistant.const import STATE_UNAVAILABLE, STATE_UNKNOWN
from homeassistant.core import HomeAssistant, State
from homeassistant.helpers.automation import DomainSpec
from homeassistant.helpers.trigger import (
ENTITY_STATE_TRIGGER_SCHEMA,
EntityTriggerBase,
Trigger,
)
class DoorbellRangTrigger(EntityTriggerBase):
"""Trigger for doorbell event entity when a ring event is received."""
_domain_specs = {EVENT_DOMAIN: DomainSpec(device_class=EventDeviceClass.DOORBELL)}
_schema = ENTITY_STATE_TRIGGER_SCHEMA
def is_valid_state(self, state: State) -> bool:
"""Check if the entity is available and the event type is ring."""
return (
state.state not in (STATE_UNAVAILABLE, STATE_UNKNOWN)
and state.attributes.get(ATTR_EVENT_TYPE) == DoorbellEventType.RING
)
def is_valid_transition(self, from_state: State, to_state: State) -> bool:
"""Check if the origin state is valid and different from the current state."""
# UNKNOWN is a valid from_state, otherwise the first time the event is received
# would not trigger
if from_state.state == STATE_UNAVAILABLE:
return False
return from_state.state != to_state.state
TRIGGERS: dict[str, type[Trigger]] = {
"rang": DoorbellRangTrigger,
}
async def async_get_triggers(hass: HomeAssistant) -> dict[str, type[Trigger]]:
"""Return the triggers for doorbells."""
return TRIGGERS
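The transition rule in `is_valid_transition` above can be sketched in isolation (a minimal standalone sketch; the two constants stand in for `STATE_UNAVAILABLE`/`STATE_UNKNOWN` and nothing here imports Home Assistant):

```python
STATE_UNAVAILABLE = "unavailable"  # stand-ins for homeassistant.const values
STATE_UNKNOWN = "unknown"

def is_valid_transition(from_state: str, to_state: str) -> bool:
    """Reject transitions out of 'unavailable'; allow 'unknown' so the first event fires."""
    if from_state == STATE_UNAVAILABLE:
        return False
    return from_state != to_state

# The first ring after startup (from 'unknown') still triggers:
assert is_valid_transition(STATE_UNKNOWN, "2026-01-01T00:00:00+00:00")
# A device merely coming back online does not retrigger:
assert not is_valid_transition(STATE_UNAVAILABLE, "2026-01-01T00:00:00+00:00")
```

Treating `unknown` as a valid origin is the key design choice: without it, the very first doorbell event after a restart would be silently dropped.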
@@ -0,0 +1,5 @@
rang:
target:
entity:
domain: event
device_class: doorbell
@@ -13,6 +13,7 @@ from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_HOST
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.device_registry import format_mac
from homeassistant.helpers.service_info.dhcp import DhcpServiceInfo
from homeassistant.helpers.service_info.zeroconf import ZeroconfServiceInfo
from .const import DOMAIN
@@ -35,6 +36,27 @@ class DucoConfigFlow(ConfigFlow, domain=DOMAIN):
_host: str
_box_name: str
async def async_step_dhcp(
self, discovery_info: DhcpServiceInfo
) -> ConfigFlowResult:
"""Handle DHCP discovery."""
await self.async_set_unique_id(format_mac(discovery_info.macaddress))
self._abort_if_unique_id_configured(updates={CONF_HOST: discovery_info.ip})
try:
box_name, _ = await self._validate_input(discovery_info.ip)
except DucoConnectionError:
return self.async_abort(reason="cannot_connect")
except DucoError:
_LOGGER.exception("Unexpected error discovering Duco box via DHCP")
return self.async_abort(reason="unknown")
self._host = discovery_info.ip
self._box_name = box_name
self.context["title_placeholders"] = {"name": box_name}
return await self.async_step_discovery_confirm()
async def async_step_zeroconf(
self, discovery_info: ZeroconfServiceInfo
) -> ConfigFlowResult:
@@ -72,6 +94,38 @@ class DucoConfigFlow(ConfigFlow, domain=DOMAIN):
description_placeholders={"name": self._box_name},
)
async def async_step_reconfigure(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle reconfiguration of the integration."""
errors: dict[str, str] = {}
reconfigure_entry = self._get_reconfigure_entry()
if user_input is not None:
try:
box_name, mac = await self._validate_input(user_input[CONF_HOST])
except DucoConnectionError:
errors["base"] = "cannot_connect"
except DucoError:
_LOGGER.exception("Unexpected error connecting to Duco box")
errors["base"] = "unknown"
else:
await self.async_set_unique_id(format_mac(mac))
self._abort_if_unique_id_mismatch()
return self.async_update_reload_and_abort(
reconfigure_entry,
title=box_name,
data_updates={CONF_HOST: user_input[CONF_HOST]},
)
return self.async_show_form(
step_id="reconfigure",
data_schema=self.add_suggested_values_to_schema(
STEP_USER_SCHEMA, reconfigure_entry.data
),
errors=errors,
)
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
@@ -64,6 +64,7 @@ async def async_setup_entry(
"""Set up Duco fan entities."""
coordinator = entry.runtime_data
# BOX is always node 1 and is never dynamically added or removed, so no listener needed.
async_add_entities(
DucoVentilationFanEntity(coordinator, node)
for node in coordinator.data.nodes.values()
@@ -3,12 +3,17 @@
"name": "Duco",
"codeowners": ["@ronaldvdmeer"],
"config_flow": true,
"dhcp": [
{
"hostname": "duco_[0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f]"
}
],
"documentation": "https://www.home-assistant.io/integrations/duco",
"integration_type": "hub",
"iot_class": "local_polling",
"loggers": ["duco"],
"quality_scale": "bronze",
"requirements": ["python-duco-client==0.3.2"],
"quality_scale": "platinum",
"requirements": ["python-duco-client==0.3.4"],
"zeroconf": [
{
"name": "duco [[][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][]].*",
@@ -55,24 +55,22 @@ rules:
docs-supported-functions: done
docs-troubleshooting: done
docs-use-cases: done
dynamic-devices:
status: todo
comment: >-
Users can pair new modules (CO2 sensors, humidity sensors, zone valves)
to their Duco box. Dynamic device support to be added in a follow-up PR.
dynamic-devices: done
entity-category: done
entity-device-class: done
entity-disabled-by-default: done
entity-translations: done
exception-translations: done
icon-translations: done
reconfiguration-flow: todo
repair-issues: todo
stale-devices:
status: todo
reconfiguration-flow: done
repair-issues:
status: exempt
comment: >-
To be implemented together with dynamic device support in a follow-up PR.
The integration has no actionable repair scenarios. Connection failures are
handled by the coordinator (unavailable entities) and resolve automatically.
There are no credentials to expire and no versioned API to become
incompatible with.
stale-devices: done
# Platinum
async-dependency: done
inject-websession: done
@@ -19,9 +19,11 @@ from homeassistant.const import (
SIGNAL_STRENGTH_DECIBELS_MILLIWATT,
EntityCategory,
)
from homeassistant.core import HomeAssistant
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import device_registry as dr
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN
from .coordinator import DucoConfigEntry, DucoCoordinator
from .entity import DucoEntity
@@ -111,22 +113,52 @@ async def async_setup_entry(
"""Set up Duco sensor entities."""
coordinator = entry.runtime_data
async_add_entities(
[
*[
# Track the node IDs for which entities have already been created, so we
# can detect both newly added and stale (deregistered) nodes on every
# coordinator update.
known_nodes: set[int] = set()
@callback
def _async_add_new_entities() -> None:
# Remove devices whose nodes have disappeared from the API.
# The firmware removes deregistered RF/wired nodes automatically.
# BSRH box sensors that are physically unplugged from the PCB are
# not deregistered by the firmware and will never appear here as stale.
stale_node_ids = known_nodes - coordinator.data.nodes.keys()
if stale_node_ids:
device_reg = dr.async_get(hass)
mac = entry.unique_id
for node_id in stale_node_ids:
device = device_reg.async_get_device(
identifiers={(DOMAIN, f"{mac}_{node_id}")}
)
if device:
device_reg.async_update_device(
device.id,
remove_config_entry_id=entry.entry_id,
)
known_nodes.difference_update(stale_node_ids)
new_entities: list[SensorEntity] = []
for node in coordinator.data.nodes.values():
if node.node_id in known_nodes:
continue
known_nodes.add(node.node_id)
new_entities.extend(
DucoSensorEntity(coordinator, node, description)
for node in coordinator.data.nodes.values()
for description in SENSOR_DESCRIPTIONS
if node.general.node_type in description.node_types
],
*[
)
new_entities.extend(
DucoBoxSensorEntity(coordinator, node, description)
for node in coordinator.data.nodes.values()
for description in BOX_SENSOR_DESCRIPTIONS
if node.general.node_type == NodeType.BOX
],
]
)
)
if new_entities:
async_add_entities(new_entities)
entry.async_on_unload(coordinator.async_add_listener(_async_add_new_entities))
_async_add_new_entities()
class DucoSensorEntity(DucoEntity, SensorEntity):
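The add/remove bookkeeping in `_async_add_new_entities` above reduces to set arithmetic over node IDs (a minimal sketch, not the coordinator code itself; `diff_nodes` is a hypothetical helper):

```python
def diff_nodes(known: set[int], current: set[int]) -> tuple[set[int], set[int]]:
    """Return (stale, new) node IDs for one coordinator update.

    stale: previously known nodes no longer reported -> remove their devices.
    new:   freshly reported nodes -> create their entities.
    """
    return known - current, current - known

stale, new = diff_nodes({1, 2, 3}, {1, 3, 4})
assert stale == {2}  # node 2 was deregistered from the box
assert new == {4}    # node 4 was newly paired
```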
@@ -4,6 +4,8 @@
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"already_in_progress": "[%key:common::config_flow::abort::already_in_progress%]",
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"reconfigure_successful": "[%key:common::config_flow::abort::reconfigure_successful%]",
"unique_id_mismatch": "The device you entered belongs to a different Duco box.",
"unknown": "[%key:common::config_flow::error::unknown%]"
},
"error": {
@@ -14,6 +16,14 @@
"discovery_confirm": {
"description": "Do you want to set up {name}?"
},
"reconfigure": {
"data": {
"host": "[%key:common::config_flow::data::host%]"
},
"data_description": {
"host": "[%key:component::duco::config::step::user::data_description::host%]"
}
},
"user": {
"data": {
"host": "[%key:common::config_flow::data::host%]"
@@ -35,11 +35,7 @@ class EsphomeInfraredEntity(EsphomeEntity[InfraredInfo, EntityState], InfraredEn
@convert_api_error_ha_error
async def async_send_command(self, command: InfraredCommand) -> None:
"""Send an IR command."""
timings = [
interval
for timing in command.get_raw_timings()
for interval in (timing.high_us, -timing.low_us)
]
timings = command.get_raw_timings()
_LOGGER.debug("Sending command: %s", timings)
self._client.infrared_rf_transmit_raw_timings(
@@ -17,7 +17,7 @@
"mqtt": ["esphome/discover/#"],
"quality_scale": "platinum",
"requirements": [
"aioesphomeapi==44.13.3",
"aioesphomeapi==44.18.0",
"esphome-dashboard-api==1.3.0",
"bleak-esphome==3.7.3"
],
@@ -11,6 +11,7 @@ from aioesphomeapi import (
WaterHeaterInfo,
WaterHeaterMode,
WaterHeaterState,
WaterHeaterStateFlag,
)
from homeassistant.components.water_heater import (
@@ -72,6 +73,8 @@ class EsphomeWaterHeater(
self._attr_operation_list = None
if static_info.supported_features & WaterHeaterFeature.SUPPORTS_ON_OFF:
features |= WaterHeaterEntityFeature.ON_OFF
if static_info.supported_features & WaterHeaterFeature.SUPPORTS_AWAY_MODE:
features |= WaterHeaterEntityFeature.AWAY_MODE
self._attr_supported_features = features
@property
@@ -92,6 +95,12 @@ class EsphomeWaterHeater(
"""Return current operation mode."""
return _WATER_HEATER_MODES.from_esphome(self._state.mode)
@property
@esphome_state_property
def is_away_mode_on(self) -> bool | None:
"""Return true if away mode is on."""
return bool(self._state.state & WaterHeaterStateFlag.AWAY)
@convert_api_error_ha_error
async def async_set_temperature(self, **kwargs: Any) -> None:
"""Set new target temperature."""
@@ -128,6 +137,24 @@ class EsphomeWaterHeater(
device_id=self._static_info.device_id,
)
@convert_api_error_ha_error
async def async_turn_away_mode_on(self) -> None:
"""Turn away mode on."""
self._client.water_heater_command(
key=self._key,
away=True,
device_id=self._static_info.device_id,
)
@convert_api_error_ha_error
async def async_turn_away_mode_off(self) -> None:
"""Turn away mode off."""
self._client.water_heater_command(
key=self._key,
away=False,
device_id=self._static_info.device_id,
)
async_setup_entry = partial(
platform_async_setup_entry,
@@ -7,8 +7,8 @@ from homeassistant.helpers.condition import Condition, make_entity_state_conditi
from . import DOMAIN
CONDITIONS: dict[str, type[Condition]] = {
"is_off": make_entity_state_condition(DOMAIN, STATE_OFF),
"is_on": make_entity_state_condition(DOMAIN, STATE_ON),
"is_off": make_entity_state_condition(DOMAIN, STATE_OFF, support_duration=True),
"is_on": make_entity_state_condition(DOMAIN, STATE_ON, support_duration=True),
}
@@ -12,6 +12,11 @@
options:
- all
- any
for:
required: true
default: 00:00:00
selector:
duration:
is_off: *condition_common
is_on: *condition_common
@@ -1,6 +1,7 @@
{
"common": {
"condition_behavior_name": "Condition passes if",
"condition_for_name": "For at least",
"trigger_behavior_name": "Trigger when",
"trigger_for_name": "For at least"
},
@@ -10,6 +11,9 @@
"fields": {
"behavior": {
"name": "[%key:component::fan::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::fan::common::condition_for_name%]"
}
},
"name": "Fan is off"
@@ -19,6 +23,9 @@
"fields": {
"behavior": {
"name": "[%key:component::fan::common::condition_behavior_name%]"
},
"for": {
"name": "[%key:component::fan::common::condition_for_name%]"
}
},
"name": "Fan is on"
@@ -1,4 +1,9 @@
{
"common": {
"api_key": "Access token",
"api_key_description": "The access token for authenticating with Firefly III",
"verify_ssl_description": "Verify the SSL certificate of the Firefly III instance"
},
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
@@ -14,39 +19,39 @@
"step": {
"reauth_confirm": {
"data": {
"api_key": "[%key:common::config_flow::data::api_key%]"
"api_key": "[%key:component::firefly_iii::common::api_key%]"
},
"data_description": {
"api_key": "The new API access token for authenticating with Firefly III"
"api_key": "[%key:component::firefly_iii::common::api_key_description%]"
},
"description": "The access token for your Firefly III instance is invalid and needs to be updated. Go to **Options > Remote access and tokens**. Create a new personal access token and copy it (it will only display once)."
"description": "The access token for your Firefly III instance is invalid and needs to be updated. Go to **Options > Remote access and tokens**. Create a new **personal access token** and copy it (it will only display once)."
},
"reconfigure": {
"data": {
"api_key": "[%key:common::config_flow::data::api_key%]",
"api_key": "[%key:component::firefly_iii::common::api_key%]",
"url": "[%key:common::config_flow::data::url%]",
"verify_ssl": "[%key:common::config_flow::data::verify_ssl%]"
},
"data_description": {
"api_key": "[%key:component::firefly_iii::config::step::user::data_description::api_key%]",
"api_key": "[%key:component::firefly_iii::common::api_key_description%]",
"url": "[%key:common::config_flow::data::url%]",
"verify_ssl": "[%key:component::firefly_iii::config::step::user::data_description::verify_ssl%]"
"verify_ssl": "[%key:component::firefly_iii::common::verify_ssl_description%]"
},
"description": "Use the following form to reconfigure your Firefly III instance.",
"title": "Reconfigure Firefly III Integration"
},
"user": {
"data": {
"api_key": "[%key:common::config_flow::data::api_key%]",
"api_key": "[%key:component::firefly_iii::common::api_key%]",
"url": "[%key:common::config_flow::data::url%]",
"verify_ssl": "[%key:common::config_flow::data::verify_ssl%]"
},
"data_description": {
"api_key": "The API key for authenticating with Firefly III",
"api_key": "[%key:component::firefly_iii::common::api_key_description%]",
"url": "[%key:common::config_flow::data::url%]",
"verify_ssl": "Verify the SSL certificate of the Firefly III instance"
"verify_ssl": "[%key:component::firefly_iii::common::verify_ssl_description%]"
},
"description": "You can create an API key in the Firefly III UI. Go to **Options > Remote access and tokens**. Create a new personal access token and copy it (it will only display once)."
"description": "You can create an access token in the Firefly III UI. Go to **Options > Remote access and tokens**. Create a new **personal access token** and copy it (it will only display once)."
}
}
},
@@ -198,7 +198,7 @@ class FritzBoxToolsFlowHandler(ConfigFlow, domain=DOMAIN):
def is_matching(self, other_flow: Self) -> bool:
"""Return True if other_flow is matching this flow."""
return other_flow._host == self._host # noqa: SLF001
return other_flow._host == self._host
async def async_step_confirm(
self, user_input: dict[str, Any] | None = None
@@ -148,7 +148,7 @@ class FritzboxConfigFlow(ConfigFlow, domain=DOMAIN):
def is_matching(self, other_flow: Self) -> bool:
"""Return True if other_flow is matching this flow."""
return other_flow._host == self._host # noqa: SLF001
return other_flow._host == self._host
async def async_step_confirm(
self, user_input: dict[str, Any] | None = None
@@ -2,11 +2,14 @@
from __future__ import annotations
from collections.abc import Awaitable, Callable, Coroutine
from functools import wraps
import logging
from typing import Any
from typing import Any, Concatenate
from afsapi import (
AFSAPI,
FSApiError,
FSConnectionError,
FSNotImplementedError,
PlayCaps,
@@ -24,6 +27,7 @@ from homeassistant.components.media_player import (
RepeatMode,
)
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.util import dt as dt_util
@@ -35,6 +39,37 @@ from .const import DOMAIN, MEDIA_CONTENT_ID_PRESET
_LOGGER = logging.getLogger(__name__)
def fs_command_exception_wrap[
_AFSAPIDeviceT: AFSAPIDevice,
**_P,
_R,
](
func: Callable[Concatenate[_AFSAPIDeviceT, _P], Awaitable[_R]],
) -> Callable[Concatenate[_AFSAPIDeviceT, _P], Coroutine[Any, Any, _R]]:
"""Wrap command methods and map API exceptions to HA errors."""
@wraps(func)
async def _wrap(self: _AFSAPIDeviceT, *args: _P.args, **kwargs: _P.kwargs) -> _R:
try:
return await func(self, *args, **kwargs)
except FSConnectionError as err:
command = func.__name__.removeprefix("async_")
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="connection_error",
translation_placeholders={"command": command},
) from err
except FSApiError as err:
command = func.__name__.removeprefix("async_")
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="api_error",
translation_placeholders={"command": command, "message": str(err)},
) from err
return _wrap
async def async_setup_entry(
hass: HomeAssistant,
config_entry: FrontierSiliconConfigEntry,
@@ -272,57 +307,74 @@ class AFSAPIDevice(MediaPlayerEntity):
# Management actions
# power control
@fs_command_exception_wrap
async def async_turn_on(self) -> None:
"""Turn on the device."""
await self.fs_device.set_power(True)
@fs_command_exception_wrap
async def async_turn_off(self) -> None:
"""Turn off the device."""
await self.fs_device.set_power(False)
@fs_command_exception_wrap
async def async_media_play(self) -> None:
"""Send play command."""
await self.fs_device.play()
if (await self.fs_device.get_play_state()) == PlayState.STOPPED:
# The 'play' command only seems to work when the current stream is paused.
# We need to send a 'stop' command instead to resume a stopped stream.
await self.fs_device.stop()
else:
await self.fs_device.play()
@fs_command_exception_wrap
async def async_media_pause(self) -> None:
"""Send pause command."""
await self.fs_device.pause()
@fs_command_exception_wrap
async def async_media_stop(self) -> None:
"""Send stop command."""
await self.fs_device.stop()
@fs_command_exception_wrap
async def async_media_previous_track(self) -> None:
"""Send previous track command (results in rewind)."""
await self.fs_device.rewind()
@fs_command_exception_wrap
async def async_media_next_track(self) -> None:
"""Send next track command (results in fast-forward)."""
await self.fs_device.forward()
@fs_command_exception_wrap
async def async_mute_volume(self, mute: bool) -> None:
"""Send mute command."""
await self.fs_device.set_mute(mute)
# volume
@fs_command_exception_wrap
async def async_volume_up(self) -> None:
"""Send volume up command."""
volume = await self.fs_device.get_volume()
volume = int(volume or 0) + 1
await self.fs_device.set_volume(min(volume, self._max_volume or 1))
@fs_command_exception_wrap
async def async_volume_down(self) -> None:
"""Send volume down command."""
volume = await self.fs_device.get_volume()
volume = int(volume or 0) - 1
await self.fs_device.set_volume(max(volume, 0))
@fs_command_exception_wrap
async def async_set_volume_level(self, volume: float) -> None:
"""Set volume command."""
if self._max_volume: # Can't do anything sensible if not set
volume = int(volume * self._max_volume)
await self.fs_device.set_volume(volume)
@fs_command_exception_wrap
async def async_select_source(self, source: str) -> None:
"""Select input source."""
await self.fs_device.set_power(True)
@@ -332,6 +384,7 @@ class AFSAPIDevice(MediaPlayerEntity):
):
await self.fs_device.set_mode(mode)
@fs_command_exception_wrap
async def async_select_sound_mode(self, sound_mode: str) -> None:
"""Select EQ Preset."""
if (
@@ -340,6 +393,7 @@ class AFSAPIDevice(MediaPlayerEntity):
):
await self.fs_device.set_eq_preset(mode)
@fs_command_exception_wrap
async def async_set_repeat(self, repeat: RepeatMode) -> None:
"""Set repeat mode."""
await self.fs_device.play_repeat(
@@ -350,10 +404,12 @@ class AFSAPIDevice(MediaPlayerEntity):
}.get(repeat, PlayRepeatMode.OFF)
)
@fs_command_exception_wrap
async def async_set_shuffle(self, shuffle: bool) -> None:
"""Set shuffle mode."""
await self.fs_device.set_play_shuffle(shuffle)
@fs_command_exception_wrap
async def async_media_seek(self, position: float) -> None:
"""Seek to a position in seconds."""
await self.fs_device.set_play_position(int(position * 1000))
@@ -369,6 +425,7 @@ class AFSAPIDevice(MediaPlayerEntity):
return await browse_node(self.fs_device, media_content_type, media_content_id)
@fs_command_exception_wrap
async def async_play_media(
self, media_type: MediaType | str, media_id: str, **kwargs: Any
) -> None:
@@ -33,5 +33,13 @@
}
}
}
},
"exceptions": {
"api_error": {
"message": "Failed to execute {command}: {message}"
},
"connection_error": {
"message": "Failed to execute {command}: could not connect to device"
}
}
}
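The `fs_command_exception_wrap` pattern above (catch library exceptions in every command method once, re-raise as a user-facing error that names the failed command) can be sketched without Home Assistant imports; `AppError` and `UserFacingError` are hypothetical stand-ins for `FSApiError` and `HomeAssistantError`:

```python
import asyncio
from functools import wraps

class AppError(Exception):
    """Stand-in for a library exception such as FSApiError."""

class UserFacingError(Exception):
    """Stand-in for HomeAssistantError."""

def wrap_command(func):
    """Map library exceptions raised by an async command method to UserFacingError."""
    @wraps(func)  # preserves func.__name__ so the command name survives wrapping
    async def _wrap(self, *args, **kwargs):
        try:
            return await func(self, *args, **kwargs)
        except AppError as err:
            command = func.__name__.removeprefix("async_")
            raise UserFacingError(f"Failed to execute {command}: {err}") from err
    return _wrap

class Device:
    @wrap_command
    async def async_turn_on(self):
        raise AppError("boom")

try:
    asyncio.run(Device().async_turn_on())
except UserFacingError as err:
    assert str(err) == "Failed to execute turn_on: boom"
```

Because the decorator derives the command name from `func.__name__`, the same wrapper serves every command method without per-method boilerplate, which is why the diff simply prepends it to each `async_*` action.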
@@ -0,0 +1,26 @@
"""Support for Fumis pellet stoves."""
from __future__ import annotations
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from .coordinator import FumisConfigEntry, FumisDataUpdateCoordinator
PLATFORMS = [Platform.CLIMATE, Platform.SENSOR]
async def async_setup_entry(hass: HomeAssistant, entry: FumisConfigEntry) -> bool:
"""Set up Fumis from a config entry."""
coordinator = FumisDataUpdateCoordinator(hass, entry)
await coordinator.async_config_entry_first_refresh()
entry.runtime_data = coordinator
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True
async def async_unload_entry(hass: HomeAssistant, entry: FumisConfigEntry) -> bool:
"""Unload Fumis config entry."""
return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
@@ -0,0 +1,128 @@
"""Support for Fumis climate entities."""
from __future__ import annotations
from typing import Any
from fumis import StoveStatus
from homeassistant.components.climate import (
ClimateEntity,
ClimateEntityFeature,
HVACAction,
HVACMode,
)
from homeassistant.const import ATTR_TEMPERATURE, UnitOfTemperature
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import FumisConfigEntry, FumisDataUpdateCoordinator
from .entity import FumisEntity
from .helpers import fumis_exception_handler
PARALLEL_UPDATES = 1
STOVE_STATUS_TO_HVAC_ACTION: dict[StoveStatus, HVACAction | None] = {
StoveStatus.OFF: HVACAction.OFF,
StoveStatus.COLD_START_OFF: HVACAction.OFF,
StoveStatus.WOOD_BURNING_OFF: HVACAction.OFF,
StoveStatus.PRE_HEATING: HVACAction.PREHEATING,
StoveStatus.IGNITION: HVACAction.PREHEATING,
StoveStatus.PRE_COMBUSTION: HVACAction.PREHEATING,
StoveStatus.COLD_START: HVACAction.PREHEATING,
StoveStatus.COMBUSTION: HVACAction.HEATING,
StoveStatus.ECO: HVACAction.HEATING,
StoveStatus.HYBRID_INIT: HVACAction.HEATING,
StoveStatus.HYBRID_START: HVACAction.HEATING,
StoveStatus.WOOD_START: HVACAction.HEATING,
StoveStatus.WOOD_COMBUSTION: HVACAction.HEATING,
StoveStatus.COOLING: HVACAction.IDLE,
StoveStatus.UNKNOWN: None,
}
async def async_setup_entry(
hass: HomeAssistant,
entry: FumisConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Fumis climate entity based on a config entry."""
async_add_entities([FumisClimateEntity(entry.runtime_data)])
class FumisClimateEntity(FumisEntity, ClimateEntity):
"""Defines a Fumis climate entity."""
_attr_hvac_modes = [HVACMode.OFF, HVACMode.HEAT]
_attr_max_temp = 35.0
_attr_min_temp = 10.0
_attr_name = None
_attr_supported_features = (
ClimateEntityFeature.TARGET_TEMPERATURE
| ClimateEntityFeature.TURN_OFF
| ClimateEntityFeature.TURN_ON
)
_attr_target_temperature_step = 0.5
_attr_temperature_unit = UnitOfTemperature.CELSIUS
def __init__(self, coordinator: FumisDataUpdateCoordinator) -> None:
"""Initialize the Fumis climate entity."""
super().__init__(coordinator)
self._attr_unique_id = coordinator.config_entry.unique_id
@property
def hvac_mode(self) -> HVACMode:
"""Return the current HVAC mode."""
if self.coordinator.data.controller.on:
return HVACMode.HEAT
return HVACMode.OFF
@property
def hvac_action(self) -> HVACAction | None:
"""Return the current HVAC action."""
return STOVE_STATUS_TO_HVAC_ACTION[
self.coordinator.data.controller.stove_status
]
@property
def current_temperature(self) -> float | None:
"""Return the current temperature."""
if (temp := self.coordinator.data.controller.main_temperature) is None:
return None
return temp.actual
@property
def target_temperature(self) -> float | None:
"""Return the target temperature."""
if (temp := self.coordinator.data.controller.main_temperature) is None:
return None
return temp.setpoint
@fumis_exception_handler
async def async_set_hvac_mode(self, hvac_mode: HVACMode) -> None:
"""Set the HVAC mode."""
if hvac_mode == HVACMode.HEAT:
await self.coordinator.client.turn_on()
else:
await self.coordinator.client.turn_off()
await self.coordinator.async_request_refresh()
@fumis_exception_handler
async def async_set_temperature(self, **kwargs: Any) -> None:
"""Set the target temperature."""
if (temperature := kwargs.get(ATTR_TEMPERATURE)) is None:
return
await self.coordinator.client.set_target_temperature(temperature)
await self.coordinator.async_request_refresh()
@fumis_exception_handler
async def async_turn_on(self) -> None:
"""Turn on the stove."""
await self.coordinator.client.turn_on()
await self.coordinator.async_request_refresh()
@fumis_exception_handler
async def async_turn_off(self) -> None:
"""Turn off the stove."""
await self.coordinator.client.turn_off()
await self.coordinator.async_request_refresh()
@@ -0,0 +1,190 @@
"""Config flow to configure the Fumis integration."""
from __future__ import annotations
from collections.abc import Mapping
from typing import Any
from fumis import (
Fumis,
FumisAuthenticationError,
FumisConnectionError,
FumisStoveOfflineError,
)
import voluptuous as vol
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_MAC, CONF_PIN
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.device_registry import format_mac
from homeassistant.helpers.selector import (
TextSelector,
TextSelectorConfig,
TextSelectorType,
)
from homeassistant.helpers.service_info.dhcp import DhcpServiceInfo
from .const import DOMAIN, LOGGER
class FumisFlowHandler(ConfigFlow, domain=DOMAIN):
"""Handle a Fumis config flow."""
_discovered_mac: str
async def async_step_dhcp(
self, discovery_info: DhcpServiceInfo
) -> ConfigFlowResult:
"""Handle DHCP discovery of a Fumis WiRCU module."""
mac = discovery_info.macaddress.replace(":", "").replace("-", "").upper()
await self.async_set_unique_id(format_mac(mac))
self._abort_if_unique_id_configured()
self._discovered_mac = mac
return await self.async_step_dhcp_confirm()
async def async_step_dhcp_confirm(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle DHCP discovery confirmation."""
errors: dict[str, str] = {}
if user_input is not None:
fumis = Fumis(
mac=self._discovered_mac,
password=user_input[CONF_PIN],
session=async_get_clientsession(self.hass),
)
try:
info = await fumis.update_info()
except FumisAuthenticationError:
errors[CONF_PIN] = "invalid_auth"
except FumisStoveOfflineError:
errors["base"] = "device_offline"
except FumisConnectionError:
errors["base"] = "cannot_connect"
except Exception: # noqa: BLE001
LOGGER.exception("Unexpected exception")
errors["base"] = "unknown"
else:
return self.async_create_entry(
title=info.controller.model_name or "Fumis",
data={
CONF_MAC: self._discovered_mac,
CONF_PIN: user_input[CONF_PIN],
},
)
return self.async_show_form(
step_id="dhcp_confirm",
data_schema=vol.Schema(
{
vol.Required(CONF_PIN): TextSelector(
TextSelectorConfig(type=TextSelectorType.PASSWORD)
),
}
),
errors=errors,
)
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle a flow initiated by the user."""
errors: dict[str, str] = {}
if user_input is not None:
mac = user_input[CONF_MAC].replace(":", "").replace("-", "").upper()
fumis = Fumis(
mac=mac,
password=user_input[CONF_PIN],
session=async_get_clientsession(self.hass),
)
try:
info = await fumis.update_info()
except FumisAuthenticationError:
errors[CONF_PIN] = "invalid_auth"
except FumisStoveOfflineError:
errors["base"] = "device_offline"
except FumisConnectionError:
errors["base"] = "cannot_connect"
except Exception: # noqa: BLE001
LOGGER.exception("Unexpected exception")
errors["base"] = "unknown"
else:
await self.async_set_unique_id(format_mac(mac), raise_on_progress=False)
self._abort_if_unique_id_configured()
return self.async_create_entry(
title=info.controller.model_name or "Fumis",
data={
CONF_MAC: mac,
CONF_PIN: user_input[CONF_PIN],
},
)
return self.async_show_form(
step_id="user",
data_schema=self.add_suggested_values_to_schema(
vol.Schema(
{
vol.Required(CONF_MAC): TextSelector(
TextSelectorConfig(autocomplete="off")
),
vol.Required(CONF_PIN): TextSelector(
TextSelectorConfig(type=TextSelectorType.PASSWORD)
),
}
),
user_input,
),
errors=errors,
)
async def async_step_reauth(
self, entry_data: Mapping[str, Any]
) -> ConfigFlowResult:
"""Handle re-authentication of a Fumis stove."""
return await self.async_step_reauth_confirm()
async def async_step_reauth_confirm(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle re-authentication confirmation."""
errors: dict[str, str] = {}
if user_input is not None:
reauth_entry = self._get_reauth_entry()
fumis = Fumis(
mac=reauth_entry.data[CONF_MAC],
password=user_input[CONF_PIN],
session=async_get_clientsession(self.hass),
)
try:
await fumis.update_info()
except FumisAuthenticationError:
errors[CONF_PIN] = "invalid_auth"
except FumisStoveOfflineError:
errors["base"] = "device_offline"
except FumisConnectionError:
errors["base"] = "cannot_connect"
except Exception: # noqa: BLE001
LOGGER.exception("Unexpected exception")
errors["base"] = "unknown"
else:
return self.async_update_reload_and_abort(
reauth_entry,
data_updates={CONF_PIN: user_input[CONF_PIN]},
)
return self.async_show_form(
step_id="reauth_confirm",
data_schema=vol.Schema(
{
vol.Required(CONF_PIN): TextSelector(
TextSelectorConfig(type=TextSelectorType.PASSWORD)
),
}
),
errors=errors,
)
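Both the DHCP and user steps above normalize the MAC address before it is stored or compared as a unique ID; the normalization can be sketched standalone (`normalize_mac` is a hypothetical helper mirroring the inline `.replace(...).upper()` chain, distinct from Home Assistant's `format_mac`):

```python
def normalize_mac(raw: str) -> str:
    """Strip ':' and '-' separators and upper-case, as the flow does before storing CONF_MAC."""
    return raw.replace(":", "").replace("-", "").upper()

# Colon- and dash-separated input collapse to the same canonical form:
assert normalize_mac("aa:bb:cc:dd:ee:ff") == "AABBCCDDEEFF"
assert normalize_mac("AA-BB-CC-DD-EE-FF") == "AABBCCDDEEFF"
```

Canonicalizing first means a device discovered via DHCP and one entered manually resolve to the same unique ID, so `_abort_if_unique_id_configured` catches the duplicate regardless of input format.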
@@ -0,0 +1,11 @@
"""Constants for the Fumis integration."""
from __future__ import annotations
from datetime import timedelta
import logging
from typing import Final
DOMAIN: Final = "fumis"
LOGGER = logging.getLogger(__package__)
SCAN_INTERVAL = timedelta(seconds=30)
@@ -0,0 +1,71 @@
"""DataUpdateCoordinator for Fumis."""
from __future__ import annotations
from fumis import (
Fumis,
FumisAuthenticationError,
FumisConnectionError,
FumisError,
FumisInfo,
FumisStoveOfflineError,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_MAC, CONF_PIN
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import DOMAIN, LOGGER, SCAN_INTERVAL
type FumisConfigEntry = ConfigEntry[FumisDataUpdateCoordinator]
class FumisDataUpdateCoordinator(DataUpdateCoordinator[FumisInfo]):
"""Class to manage fetching Fumis data."""
config_entry: FumisConfigEntry
def __init__(self, hass: HomeAssistant, entry: FumisConfigEntry) -> None:
"""Initialize the coordinator."""
self.client = Fumis(
mac=entry.data[CONF_MAC],
password=entry.data[CONF_PIN],
session=async_get_clientsession(hass),
)
super().__init__(
hass,
LOGGER,
config_entry=entry,
name=f"{DOMAIN}_{entry.unique_id}",
update_interval=SCAN_INTERVAL,
)
async def _async_update_data(self) -> FumisInfo:
"""Fetch data from the Fumis API."""
try:
return await self.client.update_info()
except FumisAuthenticationError as err:
raise ConfigEntryAuthFailed(
translation_domain=DOMAIN,
translation_key="authentication_error",
) from err
except FumisStoveOfflineError as err:
raise UpdateFailed(
translation_domain=DOMAIN,
translation_key="stove_offline",
) from err
except FumisConnectionError as err:
raise UpdateFailed(
translation_domain=DOMAIN,
translation_key="communication_error",
translation_placeholders={"error": str(err)},
) from err
except FumisError as err:
raise UpdateFailed(
translation_domain=DOMAIN,
translation_key="unknown_error",
translation_placeholders={"error": str(err)},
) from err
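The coordinator above reduces to a small pattern: hold a client, refresh on demand, cache the latest result, and translate client errors into a single "update failed" type. A stdlib-only sketch of that idea (`MiniCoordinator`, `StoveInfo`, `UpdateFailedError`, and `fake_fetch` are hypothetical names, not Home Assistant classes):

```python
import asyncio
from collections.abc import Awaitable, Callable
from dataclasses import dataclass


class UpdateFailedError(Exception):
    """Raised when a refresh cannot complete (mirrors UpdateFailed above)."""


@dataclass
class StoveInfo:
    """Tiny stand-in for FumisInfo."""

    temperature: float


class MiniCoordinator:
    """Cache the latest poll result and translate client errors on refresh."""

    def __init__(self, fetch: Callable[[], Awaitable[StoveInfo]]) -> None:
        self._fetch = fetch
        self.data: StoveInfo | None = None

    async def async_refresh(self) -> None:
        """Pull fresh data, converting transport errors to UpdateFailedError."""
        try:
            self.data = await self._fetch()
        except OSError as err:  # stand-in for FumisConnectionError
            raise UpdateFailedError(f"communication_error: {err}") from err


async def fake_fetch() -> StoveInfo:
    """Pretend API call returning one reading."""
    return StoveInfo(temperature=71.5)
```

In the real integration, `SCAN_INTERVAL` drives the periodic refresh and entities read `coordinator.data` instead of calling the client directly.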
@@ -0,0 +1,35 @@
"""Base entity for the Fumis integration."""
from __future__ import annotations
from homeassistant.const import CONF_MAC
from homeassistant.helpers.device_registry import (
CONNECTION_NETWORK_MAC,
DeviceInfo,
format_mac,
)
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN
from .coordinator import FumisDataUpdateCoordinator
class FumisEntity(CoordinatorEntity[FumisDataUpdateCoordinator]):
"""Defines a Fumis entity."""
_attr_has_entity_name = True
def __init__(self, coordinator: FumisDataUpdateCoordinator) -> None:
"""Initialize a Fumis entity."""
super().__init__(coordinator=coordinator)
info = coordinator.data
mac = format_mac(coordinator.config_entry.data[CONF_MAC])
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, mac)},
connections={(CONNECTION_NETWORK_MAC, mac)},
manufacturer=info.controller.manufacturer or "Fumis",
model=info.controller.model_name,
name=info.controller.model_name or "Pellet stove",
sw_version=str(info.controller.version),
hw_version=str(info.unit.version),
)
@@ -0,0 +1,63 @@
"""Helpers for Fumis."""
from __future__ import annotations
from collections.abc import Callable, Coroutine
from typing import Any, Concatenate
from fumis import (
FumisAuthenticationError,
FumisConnectionError,
FumisError,
FumisStoveOfflineError,
)
from homeassistant.exceptions import HomeAssistantError
from .const import DOMAIN
from .entity import FumisEntity
def fumis_exception_handler[_FumisEntityT: FumisEntity, **_P](
func: Callable[Concatenate[_FumisEntityT, _P], Coroutine[Any, Any, Any]],
) -> Callable[Concatenate[_FumisEntityT, _P], Coroutine[Any, Any, None]]:
"""Decorate Fumis calls to handle exceptions.
A decorator that wraps the passed in function, catches Fumis errors.
"""
async def handler(self: _FumisEntityT, *args: _P.args, **kwargs: _P.kwargs) -> None:
try:
await func(self, *args, **kwargs)
self.coordinator.async_update_listeners()
except FumisAuthenticationError as error:
self.hass.config_entries.async_schedule_reload(
self.coordinator.config_entry.entry_id
)
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="authentication_error",
) from error
except FumisStoveOfflineError as error:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="stove_offline",
) from error
except FumisConnectionError as error:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="communication_error",
translation_placeholders={"error": str(error)},
) from error
except FumisError as error:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="unknown_error",
translation_placeholders={"error": str(error)},
) from error
return handler
@@ -0,0 +1,48 @@
{
"entity": {
"sensor": {
"combustion_chamber_temperature": {
"default": "mdi:thermometer-high"
},
"detailed_stove_status": {
"default": "mdi:fireplace"
},
"fan_1_speed": {
"default": "mdi:fan"
},
"fan_2_speed": {
"default": "mdi:fan"
},
"fuel_quantity": {
"default": "mdi:gauge"
},
"fuel_used": {
"default": "mdi:counter"
},
"igniter_starts": {
"default": "mdi:counter"
},
"misfires": {
"default": "mdi:alert-outline"
},
"overheatings": {
"default": "mdi:thermometer-alert"
},
"power_output": {
"default": "mdi:fire"
},
"pressure": {
"default": "mdi:gauge"
},
"stove_status": {
"default": "mdi:fireplace"
},
"time_to_service": {
"default": "mdi:wrench-clock"
},
"wifi_signal_strength": {
"default": "mdi:wifi"
}
}
}
}
@@ -0,0 +1,17 @@
{
"domain": "fumis",
"name": "Fumis",
"codeowners": ["@frenck"],
"config_flow": true,
"dhcp": [
{
"macaddress": "0016D0*"
}
],
"documentation": "https://www.home-assistant.io/integrations/fumis",
"integration_type": "device",
"iot_class": "cloud_polling",
"loggers": ["fumis"],
"quality_scale": "bronze",
"requirements": ["fumis==0.2.1"]
}
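The `dhcp` matcher in the manifest above registers the integration for discovery when a device's MAC address begins with the vendor prefix `00:16:D0`. A quick stdlib sketch of that glob-style match (`matches_oui` is a hypothetical helper for illustration, not how Home Assistant's discovery is actually implemented):

```python
from fnmatch import fnmatch


def matches_oui(mac: str, pattern: str = "0016D0*") -> bool:
    """Return True if a MAC in any common notation matches the manifest glob."""
    # Strip separators and upper-case so "00:16:d0:..." and "00-16-D0-..." compare equal.
    normalized = mac.replace(":", "").replace("-", "").upper()
    return fnmatch(normalized, pattern)
```

For example, `matches_oui("00:16:d0:aa:bb:cc")` matches, while a MAC with a different vendor prefix does not.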