Compare commits

..

162 Commits

Author SHA1 Message Date
Franck Nijhof c347311851 2024.5.5 (#118049) 2024-05-24 22:01:44 +02:00
J. Nick Koston 750ec261be Add state check to config entry setup to ensure it cannot be setup twice (#117193) 2024-05-24 21:28:04 +02:00
Franck Nijhof 8128449879 Fix rc pylint warning in MQTT (#118050) 2024-05-24 18:41:46 +02:00
Franck Nijhof 3f7e57dde2 Bump version to 2024.5.5 2024-05-24 16:13:44 +02:00
Marcel van der Veldt 81bf31bbb1 Extend the blocklist for Matter transitions with more models (#118038) 2024-05-24 16:13:32 +02:00
J. Nick Koston f5c20b3528 Bump pySwitchbot to 0.46.1 (#118025) 2024-05-24 16:13:29 +02:00
Erik Montnemery 3238bc83b8 Improve async_get_issue_tracker for custom integrations (#118016) 2024-05-24 16:13:26 +02:00
J. Nick Koston f4b653a767 Update pySwitchbot to 0.46.0 to fix lock key retrieval (#118005)
* Update pySwitchbot to 0.46.0 to fix lock key retrieval

needs https://github.com/Danielhiversen/pySwitchbot/pull/236

* bump

* fixes
2024-05-24 16:13:22 +02:00
Shay Levy 09779b5f6e Add Shelly debug logging for async_reconnect_soon (#117945) 2024-05-24 16:13:19 +02:00
On Freund ac97f25d6c Bump pyrympro to 0.0.8 (#117919) 2024-05-24 16:13:15 +02:00
Joakim Plate 7e18527dfb Update philips_js to 3.2.1 (#117881)
* Update philips_js to 3.2.0

* Update to 3.2.1
2024-05-24 16:13:12 +02:00
Peter 7d5f9b1adf Prevent time pattern reschedule if cancelled during job execution (#117879)
Co-authored-by: J. Nick Koston <nick@koston.org>
2024-05-24 16:13:09 +02:00
Bernardus Jansen 0fb5aaf0f8 Tesla Wall Connector fix spelling error/typo (#117841) 2024-05-24 16:13:05 +02:00
puddly 6956d0d65a Account for disabled ZHA discovery config entries when migrating SkyConnect integration (#117800)
* Properly handle disabled ZHA discovery config entries

* Update tests/components/homeassistant_sky_connect/test_util.py

Co-authored-by: TheJulianJES <TheJulianJES@users.noreply.github.com>

---------

Co-authored-by: TheJulianJES <TheJulianJES@users.noreply.github.com>
2024-05-24 16:13:02 +02:00
Franck Nijhof db73074185 Update wled to 0.18.0 (#117790) 2024-05-24 16:12:59 +02:00
J. Nick Koston dae4d316ae Fix race in config entry setup (#117756) 2024-05-24 16:12:55 +02:00
J. Nick Koston 56b55a0df5 Block older versions of custom integration mydolphin_plus since they cause crashes (#117751) 2024-05-24 16:12:50 +02:00
Ricardo Steijn 8d24f68f55 Bump crownstone-sse to 2.0.5, crownstone-cloud to 1.4.11 (#117748) 2024-05-24 16:12:46 +02:00
Anrijs b44821b805 Bump aranet4 to 2.3.4 (#117738)
bump aranet4 lib version to 2.3.4
2024-05-24 16:12:43 +02:00
Michael 66c52e144e Consider only active config entries as media source in Synology DSM (#117691)
consider only active config entries as media source
2024-05-24 16:12:40 +02:00
On Freund 66fccb7296 Bump pyrisco to 0.6.2 (#117682) 2024-05-24 16:12:36 +02:00
J. Nick Koston ecb587c4ca Fix setting MQTT socket buffer size with WebsocketWrapper (#117672) 2024-05-24 16:12:33 +02:00
Bouwe Westerdijk c6a9388aea Add options-property to Plugwise Select (#117655) 2024-05-24 16:12:29 +02:00
Ville Skyttä 85f0fffa5a Filter out HTML greater/less than entities from huawei_lte sensor values (#117209) 2024-05-24 16:12:26 +02:00
Pete Sage 9dc66404e7 Fix Sonos album artwork performance (#116391) 2024-05-24 16:12:05 +02:00
Franck Nijhof 3dc3de95fa 2024.5.4 (#117631) 2024-05-17 15:04:13 +02:00
Franck Nijhof 5c8f7fe52a Fix rc pylint warning for Home Assistant Analytics (#117635) 2024-05-17 14:13:10 +02:00
Franck Nijhof 8896d134e9 Bump version to 2024.5.4 2024-05-17 13:45:47 +02:00
Robert Svensson f043b2db49 Improve syncing light states to deCONZ groups (#117588) 2024-05-17 13:45:03 +02:00
Joost Lekkerkerker 5cd101d2b1 Fix poolsense naming (#117567) 2024-05-17 13:45:00 +02:00
Joost Lekkerkerker ab9ed0eba4 Handle uncaught exceptions in Analytics insights (#117558) 2024-05-17 13:44:57 +02:00
starkillerOG 4548ff619c Bump reolink-aio to 0.8.10 (#117501) 2024-05-17 13:44:53 +02:00
Erik Montnemery b1746faa47 Fix API creation for passwordless pi_hole (#117494) 2024-05-17 13:43:50 +02:00
starkillerOG 615ae780ca Reolink fix not unregistering webhook during ReAuth (#117490) 2024-05-17 13:38:56 +02:00
J. Nick Koston b86513c3a4 Fix non-thread-safe state write in tellduslive (#117487) 2024-05-17 13:38:51 +02:00
Maikel Punie 970ad8c07c Bump pyduotecno to 2024.5.0 (#117446) 2024-05-17 13:38:48 +02:00
Franck Nijhof 819e9860a8 Update wled to 0.17.1 (#117444) 2024-05-17 13:38:44 +02:00
mk-81 e7ff552de6 Fix Kodi on/off status (#117436)
* Fix Kodi Issue 104603

Fixes the issue that the Kodi media player is displayed as online even when offline. The issue occurs when using the HTTP(S)-only (no WebSocket) integration after Kodi was found online once.
Issue: In async_update the connection exceptions from self._kodi.get_players are not caught, and therefore self._players (and the like) are not reset. self._connection.connected always returns true for HTTP(S) connections.

Solution: Catch exceptions from self._kodi.get_players and reset state in the HTTP(S)-only case. Otherwise keep the current behaviour.

* Fix Kodi Issue 104603 / code style adjustments

as requested
2024-05-17 13:38:41 +02:00
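The fix described in that commit can be sketched in plain Python. Everything below is an illustrative stand-in (`ConnectionLostError`, `FakeKodi`, `KodiMediaPlayer` are hypothetical names), not the actual integration code:

```python
import asyncio


class ConnectionLostError(Exception):
    """Stand-in for the transport errors raised by the Kodi HTTP client."""


class FakeKodi:
    """Minimal client double: flips between answering and raising."""

    def __init__(self) -> None:
        self.online = True

    async def get_players(self) -> list[dict]:
        if not self.online:
            raise ConnectionLostError("connection refused")
        return [{"playerid": 1, "type": "video"}]


class KodiMediaPlayer:
    """Sketch of the fix: catch poll errors and reset cached state."""

    def __init__(self, kodi: FakeKodi) -> None:
        self._kodi = kodi
        self._players: list[dict] | None = None
        self._connected = False

    async def async_update(self) -> None:
        try:
            players = await self._kodi.get_players()
        except ConnectionLostError:
            # Over HTTP(S) only (no WebSocket) a failed poll is the only
            # offline signal, so drop the cached players instead of
            # pretending the previous state is still valid.
            self._players = None
            self._connected = False
            return
        self._players = players
        self._connected = True
```

Without the `except` branch, a single successful poll would leave the entity reporting "online" forever, which is the bug the commit addresses.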
Jiaqi Wu f48f8eefe7 Fix Lutron Serena Tilt Only Wood Blinds set tilt function (#117374) 2024-05-17 13:38:38 +02:00
J. Nick Koston c90818e10c Fix squeezebox blocking startup (#117331)
fixes #117079
2024-05-17 13:38:33 +02:00
tronikos 642a6b44eb Call Google Assistant SDK service using async_add_executor_job (#117325) 2024-05-17 13:38:30 +02:00
Joost Lekkerkerker bca20646bb Fix Aurora naming (#117314) 2024-05-17 13:38:26 +02:00
Jan Bouwhuis dba4785c9b Increase MQTT broker socket buffer size (#117267)
* Increase MQTT broker socket buffer size

* Revert unrelated change

* Try to increase buffer size

* Set INITIAL_SUBSCRIBE_COOLDOWN back to 0.5 sec

* Simplify and add test

* comments

* comments

---------

Co-authored-by: J. Nick Koston <nick@koston.org>
2024-05-17 13:38:23 +02:00
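Increasing a socket receive buffer is usually done with `SO_RCVBUF`, reading the value back because the kernel may silently cap it. The constant and function names below are illustrative, not taken from the MQTT integration:

```python
import socket

PREFERRED_RCVBUF = 2**21  # illustrative target: ask for ~2 MiB


def increase_socket_buffer(sock: socket.socket, target: int = PREFERRED_RCVBUF) -> int:
    """Request a larger kernel receive buffer, halving on failure.

    Some platforms reject oversized values with OSError; others accept
    the call but clamp the size, so the effective value is read back.
    """
    size = target
    while size > 4096:
        try:
            sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, size)
            break
        except OSError:
            size //= 2
    # The kernel reports the buffer it actually granted.
    return sock.getsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF)
```

A larger receive buffer lets a busy broker burst many retained messages at startup without the client dropping behind.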
Raman Gupta 57cf91a8d4 Fix zwave_js discovery logic for node device class (#117232)
* Fix zwave_js discovery logic for node device class

* simplify check
2024-05-17 13:38:19 +02:00
jjlawren 17c6a49ff8 Bump SoCo to 0.30.4 (#117212) 2024-05-17 13:38:17 +02:00
Tom Harris 5941cf05e4 Fix issue changing Insteon Hub configuration (#117204)
Add Hub version to config schema
2024-05-17 13:38:14 +02:00
Thomas55555 a53b8cc0e2 Add reauth for missing token scope in Husqvarna Automower (#117098)
* Add repair for wrong token scope to Husqvarna Automower

* avoid new installations with missing scope

* tweaks

* just reauth

* texts

* Add link to correct account

* Update homeassistant/components/husqvarna_automower/strings.json

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>

* Update homeassistant/components/husqvarna_automower/strings.json

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>

* Update homeassistant/components/husqvarna_automower/strings.json

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>

* Add comment

* directly assert mock_missing_scope_config_entry.state is loaded

* assert that a flow is started

* pass complete url to strings and simplify texts

* shorten long line

* address review

* simplify tests

* grammar

* remove obsolete fixture

* fix test

* Update tests/components/husqvarna_automower/test_init.py

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>

* test if reauth flow has started

---------

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2024-05-17 13:38:11 +02:00
Michal Čihař 9d25d228ab Reduce update interval in Ondilo Ico (#116989)
Ondilo: reduce update interval

The API seems to have stricter rate limiting, and frequent requests fail
with HTTP 400.

Fixes #116593
2024-05-17 13:38:08 +02:00
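The effect of lengthening a polling interval is simple arithmetic. The numbers below are illustrative, not taken from the Ondilo integration; the commit only states that the interval was increased:

```python
from datetime import timedelta

# Hypothetical before/after values for a polled coordinator.
OLD_INTERVAL = timedelta(minutes=5)
NEW_INTERVAL = timedelta(minutes=20)


def requests_per_hour(interval: timedelta) -> float:
    """How many polls per hour a given update interval produces."""
    return timedelta(hours=1) / interval
```

Cutting the hourly request count keeps a client under a strict server-side rate limit that otherwise answers with HTTP 400.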
tronikos 652ee1b90d Avoid exceptions when Gemini responses are blocked (#116847)
* Bump google-generativeai to v0.5.2

* Avoid exceptions when Gemini responses are blocked

* pytest --snapshot-update

* set error response

* add test

* ruff
2024-05-17 13:38:05 +02:00
Thomas55555 afb5e622cd Catch auth exception in husqvarna automower (#115365)
* Catch AuthException in Husqvarna Automower

* don't use getattr

* raise ConfigEntryAuthFailed
2024-05-17 13:38:01 +02:00
Maikel Punie 4501658a16 Mark Duotecno entities unavailable when tcp goes down (#114325)
When the tcp connection to the duotecno smartbox goes down, mark all entities as unavailable.
2024-05-17 13:37:58 +02:00
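The pattern described there — all entities share one TCP link and report unavailable when it drops — can be sketched as follows; the class names are illustrative stand-ins, not the real Duotecno code:

```python
class SmartboxConnection:
    """Stand-in for the shared Duotecno smartbox TCP link."""

    def __init__(self) -> None:
        self.connected = True


class DuotecnoEntity:
    """Sketch: entity availability mirrors the shared connection state."""

    def __init__(self, connection: SmartboxConnection) -> None:
        self._connection = connection

    @property
    def available(self) -> bool:
        # When the single TCP link drops, every entity built on it
        # should report unavailable rather than serve a stale state.
        return self._connection.connected
```

Because every entity holds a reference to the same connection object, one state flip marks the whole device tree unavailable at once.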
amura11 52147e5196 Fix Fully Kiosk set config service (#112840)
* Fixed a bug that prevented setting Fully Kiosk config values using a template

* Added test to cover change

* Fixed issue identified by Ruff

* Update services.py

---------

Co-authored-by: Erik Montnemery <erik@montnemery.com>
2024-05-17 13:37:52 +02:00
Paulus Schoutsen 9b6500582a 2024.5.3 (#117203) 2024-05-10 15:08:20 -04:00
Paulus Schoutsen e2da28fbdb Bump version to 2024.5.3 2024-05-10 18:14:24 +00:00
Robert Resch 2c8b3ac8bb Bump deebot-client to 7.2.0 (#117189) 2024-05-10 18:10:05 +00:00
Diogo Gomes f07c00a05b Bump pytrydan to 0.6.0 (#117162) 2024-05-10 18:10:04 +00:00
Jan Bouwhuis 56b38cd842 Fix typo in xiaomi_ble translation strings (#117144) 2024-05-10 18:10:03 +00:00
J. Nick Koston 1b519a4610 Handle tilt position being None in HKC (#117141) 2024-05-10 18:10:02 +00:00
mletenay 09490d9e0a Bump goodwe to 0.3.5 (#117115) 2024-05-10 18:10:02 +00:00
Jan Bouwhuis c0cd76b3bf Make the mqtt discovery update tasks eager and fix race (#117105)
* Fix mqtt discovery race for update rapidly followed on creation

* Revert unrelated renaming local var
2024-05-10 18:10:01 +00:00
MatthewFlamm b9ed2dab5f Fix nws blocking startup (#117094)
Co-authored-by: J. Nick Koston <nick@koston.org>
2024-05-10 18:10:00 +00:00
J. Nick Koston 11f86d9e0b Improve config entry has already been setup error message (#117091) 2024-05-10 18:10:00 +00:00
mletenay 82fab7df39 Goodwe Increase max value of export limit to 200% (#117090) 2024-05-10 18:09:59 +00:00
puddly d40689024a Add a missing addon_name placeholder to the SkyConnect config flow (#117089) 2024-05-10 18:09:58 +00:00
Arie Catsman 9e7e839f03 Bump pyenphase to 1.20.3 (#117061) 2024-05-10 18:09:57 +00:00
Chris Talkington 08ba5304fe Bump rokuecp to 0.19.3 (#117059) 2024-05-10 18:09:57 +00:00
Jan Bouwhuis f34a0dc5ce Log an exception mqtt client call back throws (#117028)
* Log an exception mqtt client call back throws

* Supress exceptions and add test
2024-05-10 18:09:56 +00:00
J. Nick Koston 1a13e1d024 Simplify MQTT subscribe debouncer execution (#117006) 2024-05-10 18:09:55 +00:00
Marc Mueller bee518dc78 Update jinja2 to 3.1.4 (#116986) 2024-05-10 18:09:54 +00:00
Matrix fdc59547e0 Bump Yolink api to 0.4.4 (#116967) 2024-05-10 18:09:54 +00:00
Mr. Bubbles 57861dc091 Update strings for Bring notification service (#116181)
update translations
2024-05-10 18:09:53 +00:00
Pete Sage 624baebbaa Fix Sonos select_source timeout error (#115640) 2024-05-10 18:09:52 +00:00
Franck Nijhof a8f3b699b3 2024.5.2 (#116937) 2024-05-06 19:49:06 +02:00
Bram Kragten eb6ccea8aa Update frontend to 20240501.1 (#116939) 2024-05-06 18:43:20 +02:00
Franck Nijhof 6b93f8d997 Bump version to 2024.5.2 2024-05-06 17:19:04 +02:00
Jan Bouwhuis ab113570c3 Fix initial mqtt subscribe cooldown timeout (#116904) 2024-05-06 17:18:26 +02:00
Robert Hillis ed6788ca3f fix radarr coordinator updates (#116874) 2024-05-06 17:18:23 +02:00
J. Nick Koston 9533f5b490 Fix non-thread-safe operations in amcrest (#116859)
* Fix non-thread-safe operations in amcrest

fixes #116850

* fix locking

* fix locking

* fix locking
2024-05-06 17:18:19 +02:00
mletenay 7c9653e397 Bump goodwe to 0.3.4 (#116849) 2024-05-06 17:18:16 +02:00
tronikos 73eabe821c Bump androidtvremote2 to v0.0.15 (#116844) 2024-05-06 17:18:13 +02:00
Michael 834c2e2a09 Avoid duplicate data fetch during Synology DSM setup (#116839)
don't do the first refresh of the central coordinator; it is already done by api.setup before
2024-05-06 17:18:10 +02:00
Michael 421f74cd7f Increase default timeout to 30 seconds in Synology DSM (#116836)
increase default timeout to 30s and use it consistently
2024-05-06 17:18:06 +02:00
J. Nick Koston c049888b00 Fix non-thread-safe state write in lutron event (#116829)
fixes #116746
2024-05-06 17:18:01 +02:00
Paulus Schoutsen ad5e0949b6 Hide conversation agents that are exposed as agent entities (#116813) 2024-05-06 16:58:30 +02:00
J. Nick Koston ae28c604e5 Fix airthings-ble data drop outs when Bluetooth connection is flakey (#116805)
* Fix airthings-ble data drop outs when Bluetooth adapter is flakey

fixes #116770

* add missing file

* update
2024-05-06 16:57:55 +02:00
Jan Bouwhuis dbe303d95e Fix IMAP config entry setup (#116797) 2024-05-06 16:51:29 +02:00
J. Nick Koston ad7688197f Ensure all synology_dsm coordinators handle expired sessions (#116796)
* Ensure all synology_dsm coordinators handle expired sessions

* Ensure all synology_dsm coordinators handle expired sessions

* Ensure all synology_dsm coordinators handle expired sessions

* handle cancellation

* add a debug log message

---------

Co-authored-by: mib1185 <mail@mib85.de>
2024-05-06 16:51:26 +02:00
Patrick Decat 79460cb017 fix UnboundLocalError on modified_statistic_ids in compile_statistics (#116795)
Co-authored-by: J. Nick Koston <nick@koston.org>
2024-05-06 16:51:23 +02:00
J. Nick Koston 18bcc61427 Bump bluetooth-adapters to 0.19.2 (#116785) 2024-05-06 16:51:19 +02:00
J. Nick Koston 57bbd10517 Refactor statistics to avoid creating tasks (#116743) 2024-05-06 16:51:16 +02:00
Joost Lekkerkerker f068b8cdb8 Remove suggested UoM from Opower (#116728) 2024-05-06 16:51:12 +02:00
Joost Lekkerkerker bbb94d9e17 Fix Bosch-SHC switch state (#116721) 2024-05-06 16:51:09 +02:00
J. Nick Koston 6d537e2a66 Bump aiohttp-isal to 0.3.1 (#116720) 2024-05-06 16:51:06 +02:00
J. Nick Koston 17c5aa2871 Improve logging of _TrackPointUTCTime objects (#116711) 2024-05-06 16:51:03 +02:00
Erik Montnemery f4830216a8 Add workaround for data entry flow show progress (#116704)
Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>
2024-05-06 16:50:52 +02:00
Franck Nijhof ab8a811c9f 2024.5.1 (#116696) 2024-05-03 17:17:43 +02:00
Franck Nijhof 9d2fd8217f Bump version to 2024.5.1 2024-05-03 13:38:38 +02:00
Felipe Martins Diel 7e8cbafc6f Fix BroadlinkRemote._learn_command() (#116692) 2024-05-03 13:36:03 +02:00
Marc Mueller a4f9a64588 Fix fyta test timezone handling (#116689) 2024-05-03 13:36:00 +02:00
J. Nick Koston 7a56ba1506 Block dreame_vacuum versions older than 1.0.4 (#116673) 2024-05-03 13:35:56 +02:00
Glenn Waters 66bb3ecac9 Bump env_canada lib to 0.6.2 (#116662) 2024-05-03 13:35:53 +02:00
J. Nick Koston ac302f38b1 Bump habluetooth to 2.8.1 (#116661) 2024-05-03 13:35:49 +02:00
puddly abeb65e43d Bump ZHA dependency bellows to 0.38.4 (#116660)
Bump ZHA dependencies

Co-authored-by: TheJulianJES <TheJulianJES@users.noreply.github.com>
2024-05-03 13:35:46 +02:00
Galorhallen c36fd5550b Bump govee-light-local library and fix wrong information for Govee lights (#116651) 2024-05-03 13:35:42 +02:00
Robert Svensson 6be25c784d Bump aiounifi to v77 (#116639) 2024-05-03 13:35:38 +02:00
Marc Mueller c338f1b964 Add constraint for tuf (#116627) 2024-05-03 13:35:35 +02:00
Kevin Stillhammer 8193b82f4a Bump pywaze to 1.0.1 (#116621)
Co-authored-by: J. Nick Koston <nick@koston.org>
2024-05-03 13:35:31 +02:00
Ståle Storø Hauknes 7c1502fa05 Bump Airthings BLE to 0.8.0 (#116616)
Co-authored-by: J. Nick Koston <nick@koston.org>
2024-05-03 13:35:27 +02:00
Matthias Alphart 575a3da772 Fix inheritance order for KNX notify (#116600) 2024-05-03 13:35:23 +02:00
Joost Lekkerkerker 0e488ef505 Improve coordinator in Ondilo ico (#116596)
* Improve coordinator in Ondilo ico

* Improve coordinator in Ondilo ico
2024-05-03 13:35:19 +02:00
Ståle Storø Hauknes fabbe2f28f Fix Airthings BLE model names (#116579) 2024-05-03 13:35:16 +02:00
Tomasz 99ab8d2956 Bump sanix to 1.0.6 (#116570)
dependency version bump
2024-05-03 13:35:12 +02:00
Marcel van der Veldt 523de94184 Fix Matter startup when Matter bridge is present (#116569) 2024-05-03 13:35:09 +02:00
Glenn Waters 65839067e3 Bump elkm1_lib to 2.2.7 (#116564)
Co-authored-by: J. Nick Koston <nick@koston.org>
2024-05-03 13:35:05 +02:00
Glenn Waters 5da6f83d10 Bump upb_lib to 0.5.6 (#116558) 2024-05-03 13:35:02 +02:00
Jan Bouwhuis ea6a9b8316 Fix MQTT discovery cooldown too short with large setup (#116550)
* Fix MQTT discovery cooldown too short with large setup

* Set to 5 sec

* Only change the discovery cooldown

* Fire immediately when the debouncing period is over
2024-05-03 13:34:58 +02:00
J. Nick Koston 49de59432e Add a lock to homekit_controller platform loads (#116539) 2024-05-03 13:34:55 +02:00
GraceGRD 624e4a2b48 Bump opentherm_gw to 2.2.0 (#116527) 2024-05-03 13:34:52 +02:00
MatthewFlamm 4e4ac79595 Fix nws forecast coordinators and remove legacy forecast handling (#115857)
Co-authored-by: J. Nick Koston <nick@koston.org>
2024-05-03 13:34:48 +02:00
Franck Nijhof 2f47668422 2024.5.0 (#116538) 2024-05-01 20:47:10 +02:00
J. Nick Koston 343d97527c Ensure mqtt handler is restored if its already registered in bootstrap test (#116549) 2024-05-01 20:23:14 +02:00
J. Nick Koston 21466180aa Ensure mock mqtt handler is cleaned up after test_bootstrap_dependencies (#116544) 2024-05-01 19:46:54 +02:00
Franck Nijhof 858874f0da Bump version to 2024.5.0 2024-05-01 18:59:07 +02:00
Franck Nijhof 1641f24314 Bump version to 2024.5.0b7 2024-05-01 18:32:41 +02:00
J. Nick Koston e1c08959b0 Fix stop event cleanup when reloading MQTT (#116525) 2024-05-01 18:32:33 +02:00
Marcel van der Veldt b42f367128 Add blocklist for known Matter devices with faulty transitions (#116524) 2024-05-01 18:32:24 +02:00
Franck Nijhof 1e4e891f0b Bump version to 2024.5.0b6 2024-05-01 16:24:03 +02:00
Joost Lekkerkerker 15aa8949ee Improve scrape strings (#116519) 2024-05-01 16:23:54 +02:00
J. Nick Koston 780a6b314f Fix blocking I/O to import modules in mysensors (#116516) 2024-05-01 16:23:50 +02:00
Marcel van der Veldt 082721e1ab Bump python matter server library to 5.10.0 (#116514) 2024-05-01 16:23:47 +02:00
J. Nick Koston 4312f36dbe Fix non-thread-safe operations in ihc (#116513) 2024-05-01 16:23:43 +02:00
puddly f89677cd76 Bump ZHA dependencies (#116509) 2024-05-01 16:23:39 +02:00
Franck Nijhof 0eb734b6bf Bump version to 2024.5.0b5 2024-05-01 13:41:34 +02:00
Bram Kragten ad16c5bc25 Update frontend to 20240501.0 (#116503) 2024-05-01 13:41:18 +02:00
max2697 fabc3d751e Bump opower to 0.4.4 (#116489) 2024-05-01 13:40:25 +02:00
Luke Lashley 31cfabc44d Fix roborock image crashes (#116487) 2024-05-01 13:40:22 +02:00
J. Nick Koston 7d51556e1e Hold a lock to prevent concurrent setup of config entries (#116482) 2024-05-01 13:40:18 +02:00
Bram Kragten ac24105777 Update frontend to 20240430.0 (#116481) 2024-05-01 13:40:15 +02:00
Jan Bouwhuis 3d86577cab Add test MQTT subscription is completed when birth message is sent (#116476) 2024-05-01 13:40:12 +02:00
J. Nick Koston 6971898a43 Fix non-thread-safe operation in roon volume callback (#116475) 2024-05-01 13:40:08 +02:00
puddly c54d53b88a Change SkyConnect integration type back to hardware and fix multi-PAN migration bug (#116474)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2024-05-01 13:40:05 +02:00
J. Nick Koston c574d86ddb Fix local_todo blocking the event loop (#116473) 2024-05-01 13:40:01 +02:00
J. Nick Koston 3d13345575 Ensure MQTT resubscribes happen before birth message (#116471) 2024-05-01 13:39:58 +02:00
J. Nick Koston c77cef0391 Bump bluetooth-adapters to 0.19.1 (#116465) 2024-05-01 13:39:55 +02:00
Joost Lekkerkerker 3351b82667 Fix zoneminder async v2 (#116451) 2024-05-01 13:39:52 +02:00
Richard Kroegel 78d19854dd Bump bimmer_connected to 0.15.2 (#116424)
Co-authored-by: Richard <rikroe@users.noreply.github.com>
2024-05-01 13:39:48 +02:00
Marcel van der Veldt c0d529b072 Some fixes for the Matter light discovery schema (#116108)
* Fix discovery schema for light platform

* fix switch platform discovery schema

* extend light tests

* Update switch.py

* clarify comment

* use parameter for supported_color_modes
2024-05-01 13:39:16 +02:00
Franck Nijhof 5b7e09b886 Bump version to 2024.5.0b4 2024-04-30 12:47:51 +02:00
Joost Lekkerkerker 7cbb2892c1 Add user id to coordinator name in Withings (#116440)
* Add user id to coordinator name in Withings

* Add user id to coordinator name in Withings

* Fix
2024-04-30 12:47:44 +02:00
Joost Lekkerkerker 5510315b87 Fix zoneminder async (#116436) 2024-04-30 12:47:41 +02:00
Michael bd8ded1e55 Fix error handling in Shell Command integration (#116409)
* raise proper HomeAssistantError on command timeout

* raise proper HomeAssistantError on non-utf8 command output

* add error translation and test it

* Update homeassistant/components/shell_command/strings.json

* Update tests/components/shell_command/test_init.py

---------

Co-authored-by: G Johansson <goran.johansson@shiftit.se>
2024-04-30 12:47:38 +02:00
Joost Lekkerkerker 1a1dfbd489 Remove semicolon in Modbus (#116399) 2024-04-30 12:47:34 +02:00
Graham Wetzler 3477c81ed1 Bump smart_meter_texas to 0.5.5 (#116321) 2024-04-30 12:47:31 +02:00
Collin Fair 5d9abf9ac5 Fix stale prayer times from islamic-prayer-times (#115683) 2024-04-30 12:47:28 +02:00
Joost Lekkerkerker 8843780aab Set Synology camera device name as entity name (#109123) 2024-04-30 12:47:24 +02:00
Franck Nijhof a7faf2710f Bump version to 2024.5.0b3 2024-04-29 19:44:22 +02:00
Bram Kragten 06e032b838 Update frontend to 20240429.0 (#116404) 2024-04-29 19:44:12 +02:00
Robert Resch 8f2d10c49a Remove strict connection (#116396) 2024-04-29 19:44:09 +02:00
Steve Easley 39d923dc02 Fix jvcprojector command timeout with some projectors (#116392)
* Fix projector timeout in pyprojector lib v1.0.10

* Fix projector timeout by increasing time between power command and refresh.

* Bump jvcprojector lib to ensure unknown power states are handled
2024-04-29 19:44:06 +02:00
Mr. Bubbles 99e3236fb7 Deprecate YAML configuration of Habitica (#116374)
Add deprecation issue for yaml import
2024-04-29 19:44:03 +02:00
dontinelli 7ee79002b3 Store access token in entry for Fyta (#116260)
* save access_token and expiration date in ConfigEntry

* add MINOR_VERSION and async_migrate_entry

* shorten reading of expiration from config entry

* add additional consts and test for config entry migration

* Update homeassistant/components/fyta/coordinator.py

Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>

* Update homeassistant/components/fyta/__init__.py

Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>

* omit check for datetime data type

* Update homeassistant/components/fyta/__init__.py

Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>

* Update homeassistant/components/fyta/coordinator.py

Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>

---------

Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2024-04-29 19:43:59 +02:00
dontinelli ac45d20e1f Bump fyta_cli to 0.4.1 (#115918)
* bump fyta_cli to 0.4.0

* Update PLANT_STATUS and add PLANT_MEASUREMENT_STATUS

* bump fyta_cli to v0.4.0

* minor adjustments of states to API documentation
2024-04-29 19:43:55 +02:00
214 changed files with 14781 additions and 1828 deletions
@@ -939,6 +939,7 @@ omit =
homeassistant/components/omnilogic/switch.py
homeassistant/components/ondilo_ico/__init__.py
homeassistant/components/ondilo_ico/api.py
homeassistant/components/ondilo_ico/coordinator.py
homeassistant/components/ondilo_ico/sensor.py
homeassistant/components/onkyo/media_player.py
homeassistant/components/onvif/__init__.py
@@ -414,6 +414,9 @@ async def async_from_config_dict(
start = monotonic()
hass.config_entries = config_entries.ConfigEntries(hass, config)
# Prime custom component cache early so we know if registry entries are tied
# to a custom integration
await loader.async_get_custom_components(hass)
await async_load_base_functionality(hass)
# Set up core.
@@ -16,7 +16,7 @@ from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from homeassistant.util.unit_system import METRIC_SYSTEM
-from .const import DEFAULT_SCAN_INTERVAL, DOMAIN
+from .const import DEFAULT_SCAN_INTERVAL, DOMAIN, MAX_RETRIES_AFTER_STARTUP
PLATFORMS: list[Platform] = [Platform.SENSOR]
@@ -61,6 +61,12 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
await coordinator.async_config_entry_first_refresh()
# Once its setup and we know we are not going to delay
# the startup of Home Assistant, we can set the max attempts
# to a higher value. If the first connection attempt fails,
# Home Assistant's built-in retry logic will take over.
airthings.set_max_attempts(MAX_RETRIES_AFTER_STARTUP)
hass.data[DOMAIN][entry.entry_id] = coordinator
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
@@ -7,3 +7,5 @@ VOLUME_BECQUEREL = "Bq/m³"
VOLUME_PICOCURIE = "pCi/L"
DEFAULT_SCAN_INTERVAL = 300
MAX_RETRIES_AFTER_STARTUP = 5
@@ -24,5 +24,5 @@
"dependencies": ["bluetooth_adapters"],
"documentation": "https://www.home-assistant.io/integrations/airthings_ble",
"iot_class": "local_polling",
-"requirements": ["airthings-ble==0.7.1"]
+"requirements": ["airthings-ble==0.9.0"]
}
@@ -225,7 +225,7 @@ class AirthingsSensor(
manufacturer=airthings_device.manufacturer,
hw_version=airthings_device.hw_version,
sw_version=airthings_device.sw_version,
-model=airthings_device.model.name,
+model=airthings_device.model.product_name,
)
@property
@@ -35,7 +35,7 @@ from homeassistant.const import (
HTTP_BASIC_AUTHENTICATION,
Platform,
)
-from homeassistant.core import HomeAssistant, ServiceCall
+from homeassistant.core import HomeAssistant, ServiceCall, callback
from homeassistant.exceptions import Unauthorized, UnknownUser
from homeassistant.helpers import discovery
import homeassistant.helpers.config_validation as cv
@@ -177,7 +177,8 @@ class AmcrestChecker(ApiWrapper):
"""Return event flag that indicates if camera's API is responding."""
return self._async_wrap_event_flag
-def _start_recovery(self) -> None:
+@callback
+def _async_start_recovery(self) -> None:
self.available_flag.clear()
self.async_available_flag.clear()
async_dispatcher_send(
@@ -222,50 +223,98 @@ class AmcrestChecker(ApiWrapper):
yield
except LoginError as ex:
async with self._async_wrap_lock:
-self._handle_offline(ex)
+self._async_handle_offline(ex)
raise
except AmcrestError:
async with self._async_wrap_lock:
-self._handle_error()
+self._async_handle_error()
raise
async with self._async_wrap_lock:
-self._set_online()
+self._async_set_online()
-def _handle_offline(self, ex: Exception) -> None:
+def _handle_offline_thread_safe(self, ex: Exception) -> bool:
"""Handle camera offline status shared between threads and event loop.
Returns if the camera was online as a bool.
"""
with self._wrap_lock:
was_online = self.available
was_login_err = self._wrap_login_err
self._wrap_login_err = True
if not was_login_err:
_LOGGER.error("%s camera offline: Login error: %s", self._wrap_name, ex)
if was_online:
self._start_recovery()
return was_online
def _handle_error(self) -> None:
def _handle_offline(self, ex: Exception) -> None:
"""Handle camera offline status from a thread."""
if self._handle_offline_thread_safe(ex):
self._hass.loop.call_soon_threadsafe(self._async_start_recovery)
@callback
def _async_handle_offline(self, ex: Exception) -> None:
if self._handle_offline_thread_safe(ex):
self._async_start_recovery()
def _handle_error_thread_safe(self) -> bool:
"""Handle camera error status shared between threads and event loop.
Returns if the camera was online and is now offline as
a bool.
"""
with self._wrap_lock:
was_online = self.available
errs = self._wrap_errors = self._wrap_errors + 1
offline = not self.available
_LOGGER.debug("%s camera errs: %i", self._wrap_name, errs)
if was_online and offline:
_LOGGER.error("%s camera offline: Too many errors", self._wrap_name)
self._start_recovery()
return was_online and offline
def _set_online(self) -> None:
def _handle_error(self) -> None:
"""Handle camera error status from a thread."""
if self._handle_error_thread_safe():
_LOGGER.error("%s camera offline: Too many errors", self._wrap_name)
self._hass.loop.call_soon_threadsafe(self._async_start_recovery)
@callback
def _async_handle_error(self) -> None:
"""Handle camera error status from the event loop."""
if self._handle_error_thread_safe():
_LOGGER.error("%s camera offline: Too many errors", self._wrap_name)
self._async_start_recovery()
def _set_online_thread_safe(self) -> bool:
"""Set camera online status shared between threads and event loop.
Returns if the camera was offline as a bool.
"""
with self._wrap_lock:
was_offline = not self.available
self._wrap_errors = 0
self._wrap_login_err = False
if was_offline:
assert self._unsub_recheck is not None
self._unsub_recheck()
self._unsub_recheck = None
_LOGGER.error("%s camera back online", self._wrap_name)
self.available_flag.set()
self.async_available_flag.set()
async_dispatcher_send(
self._hass, service_signal(SERVICE_UPDATE, self._wrap_name)
)
return was_offline
def _set_online(self) -> None:
"""Set camera online status from a thread."""
if self._set_online_thread_safe():
self._hass.loop.call_soon_threadsafe(self._async_signal_online)
@callback
def _async_set_online(self) -> None:
"""Set camera online status from the event loop."""
if self._set_online_thread_safe():
self._async_signal_online()
@callback
def _async_signal_online(self) -> None:
"""Signal that camera is back online."""
assert self._unsub_recheck is not None
self._unsub_recheck()
self._unsub_recheck = None
_LOGGER.error("%s camera back online", self._wrap_name)
self.available_flag.set()
self.async_available_flag.set()
async_dispatcher_send(
self._hass, service_signal(SERVICE_UPDATE, self._wrap_name)
)
async def _wrap_test_online(self, now: datetime) -> None:
"""Test if camera is back online."""
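The amcrest diff above repeatedly applies one pattern: a lock-guarded, thread-safe core method plus two thin wrappers, one for worker threads (which must hop onto the event loop via `call_soon_threadsafe`) and one `@callback` for code already on the loop. A generic sketch of that pattern, with illustrative names:

```python
import asyncio
import threading


class AvailabilityChecker:
    """One thread-safe core, called from either a thread or the loop."""

    def __init__(self, loop: asyncio.AbstractEventLoop) -> None:
        self._loop = loop
        self._lock = threading.Lock()
        self.online = True
        self.recoveries = 0

    def _handle_error_thread_safe(self) -> bool:
        """Shared state change; returns True if we just went offline."""
        with self._lock:
            was_online = self.online
            self.online = False
        return was_online

    def _handle_error(self) -> None:
        """Thread entry point: hop onto the loop for loop-only work."""
        if self._handle_error_thread_safe():
            self._loop.call_soon_threadsafe(self._async_start_recovery)

    def _async_handle_error(self) -> None:
        """Event-loop entry point: no thread hop needed."""
        if self._handle_error_thread_safe():
            self._async_start_recovery()

    def _async_start_recovery(self) -> None:
        """Loop-only work (dispatcher signals, timers, ...)."""
        self.recoveries += 1
```

Splitting the core from the wrappers means the state transition happens exactly once under the lock, while loop-only side effects always run on the event loop, which is what the "non-thread-safe operations" fix enforces.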
@@ -16,7 +16,7 @@ import voluptuous as vol
from homeassistant.components.camera import Camera, CameraEntityFeature
from homeassistant.components.ffmpeg import FFmpegManager, get_ffmpeg_manager
from homeassistant.const import ATTR_ENTITY_ID, CONF_NAME, STATE_OFF, STATE_ON
-from homeassistant.core import HomeAssistant
+from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.aiohttp_client import (
async_aiohttp_proxy_stream,
@@ -325,7 +325,8 @@ class AmcrestCam(Camera):
# Other Entity method overrides
-async def async_on_demand_update(self) -> None:
+@callback
+def async_on_demand_update(self) -> None:
"""Update state."""
self.async_schedule_update_ha_state(True)
@@ -82,6 +82,9 @@ class HomeassistantAnalyticsConfigFlow(ConfigFlow, domain=DOMAIN):
except HomeassistantAnalyticsConnectionError:
LOGGER.exception("Error connecting to Home Assistant analytics")
return self.async_abort(reason="cannot_connect")
except Exception: # pylint: disable=broad-except
LOGGER.exception("Unexpected error")
return self.async_abort(reason="unknown")
options = [
SelectOptionDict(
@@ -13,7 +13,8 @@
}
},
"abort": {
-"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]"
+"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
+"unknown": "[%key:common::config_flow::error::unknown%]"
},
"error": {
"no_integration_selected": "You must select at least one integration to track"
@@ -8,6 +8,6 @@
"iot_class": "local_push",
"loggers": ["androidtvremote2"],
"quality_scale": "platinum",
-"requirements": ["androidtvremote2==0.0.14"],
+"requirements": ["androidtvremote2==0.0.15"],
"zeroconf": ["_androidtvremote2._tcp.local."]
}
@@ -19,5 +19,5 @@
"documentation": "https://www.home-assistant.io/integrations/aranet",
"integration_type": "device",
"iot_class": "local_push",
-"requirements": ["aranet4==2.3.3"]
+"requirements": ["aranet4==2.3.4"]
}
@@ -15,6 +15,7 @@ class AuroraEntity(CoordinatorEntity[AuroraDataUpdateCoordinator]):
"""Implementation of the base Aurora Entity."""
_attr_attribution = ATTRIBUTION
_attr_has_entity_name = True
def __init__(
self,
@@ -16,10 +16,10 @@
"requirements": [
"bleak==0.21.1",
"bleak-retry-connector==3.5.0",
"bluetooth-adapters==0.19.0",
"bluetooth-adapters==0.19.2",
"bluetooth-auto-recovery==1.4.2",
"bluetooth-data-tools==1.19.0",
"dbus-fast==2.21.1",
"habluetooth==2.8.0"
"habluetooth==2.8.1"
]
}
@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/bmw_connected_drive",
"iot_class": "cloud_polling",
"loggers": ["bimmer_connected"],
"requirements": ["bimmer-connected[china]==0.14.6"]
"requirements": ["bimmer-connected[china]==0.15.2"]
}
@@ -43,21 +43,21 @@ SWITCH_TYPES: dict[str, SHCSwitchEntityDescription] = {
"smartplug": SHCSwitchEntityDescription(
key="smartplug",
device_class=SwitchDeviceClass.OUTLET,
on_key="state",
on_key="switchstate",
on_value=SHCSmartPlug.PowerSwitchService.State.ON,
should_poll=False,
),
"smartplugcompact": SHCSwitchEntityDescription(
key="smartplugcompact",
device_class=SwitchDeviceClass.OUTLET,
on_key="state",
on_key="switchstate",
on_value=SHCSmartPlugCompact.PowerSwitchService.State.ON,
should_poll=False,
),
"lightswitch": SHCSwitchEntityDescription(
key="lightswitch",
device_class=SwitchDeviceClass.SWITCH,
on_key="state",
on_key="switchstate",
on_value=SHCLightSwitch.PowerSwitchService.State.ON,
should_poll=False,
),
@@ -60,8 +60,8 @@
"description": "Type of push notification to send to list members."
},
"item": {
"name": "Item (Required if message type `Breaking news` selected)",
"description": "Item name to include in a breaking news message e.g. `Breaking news - Please get cilantro!`"
"name": "Article (Required if message type `Urgent Message` selected)",
"description": "Article name to include in an urgent message e.g. `Urgent Message - Please buy Cilantro urgently`"
}
}
}
@@ -69,10 +69,10 @@
"selector": {
"notification_type_selector": {
"options": {
"going_shopping": "I'm going shopping! - Last chance for adjustments",
"changed_list": "List changed - Check it out",
"shopping_done": "Shopping done - you can relax",
"urgent_message": "Breaking news - Please get `item`!"
"going_shopping": "I'm going shopping! - Last chance to make changes",
"changed_list": "List updated - Take a look at the articles",
"shopping_done": "Shopping done - The fridge is well stocked",
"urgent_message": "Urgent Message - Please buy `Article name` urgently"
}
}
}
@@ -373,8 +373,11 @@ class BroadlinkRemote(BroadlinkEntity, RemoteEntity, RestoreEntity):
start_time = dt_util.utcnow()
while (dt_util.utcnow() - start_time) < LEARNING_TIMEOUT:
await asyncio.sleep(1)
found = await device.async_request(device.api.check_frequency)[0]
if found:
is_found, frequency = await device.async_request(
device.api.check_frequency
)
if is_found:
_LOGGER.info("Radiofrequency detected: %s MHz", frequency)
break
else:
await device.async_request(device.api.cancel_sweep_frequency)
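The Broadlink hunk above fixes an operator-precedence bug: in `await device.async_request(...)[0]` the subscript binds to the coroutine object itself, not to the awaited result, so the old line raised `TypeError` at runtime. A minimal sketch, using a hypothetical `check_frequency` stand-in rather than the real Broadlink API:

```python
import asyncio


async def check_frequency() -> tuple[bool, float]:
    """Hypothetical stand-in for device.api.check_frequency."""
    return True, 433.92


async def main() -> tuple[bool, float]:
    # In `await f()[0]`, the subscript applies to the coroutine object
    # before anything is awaited, raising
    # "TypeError: 'coroutine' object is not subscriptable".
    coro = check_frequency()
    try:
        coro[0]  # type: ignore[index]
    except TypeError:
        coro.close()  # avoid a "coroutine was never awaited" warning
    # Awaiting first, then unpacking the tuple, is the working form:
    is_found, frequency = await check_frequency()
    return is_found, frequency


result = asyncio.run(main())
```

The corrected diff takes the same route: it awaits the request once and unpacks `(is_found, frequency)` from the result.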
@@ -30,7 +30,6 @@ from homeassistant.core import (
HomeAssistant,
ServiceCall,
ServiceResponse,
SupportsResponse,
callback,
)
from homeassistant.exceptions import (
@@ -458,10 +457,3 @@ def _setup_services(hass: HomeAssistant, prefs: CloudPreferences) -> None:
"url": f"https://login.home-assistant.io?u={quote_plus(url)}",
"direct_url": url,
}
hass.services.async_register(
DOMAIN,
"create_temporary_strict_connection_url",
create_temporary_strict_connection_url,
supports_response=SupportsResponse.ONLY,
)
@@ -365,16 +365,7 @@ class CloudPreferences:
@property
def strict_connection(self) -> http.const.StrictConnectionMode:
"""Return the strict connection mode."""
mode = self._prefs.get(PREF_STRICT_CONNECTION)
if mode is None:
# Set to default value
# We store None in the store as the default value to detect if the user has changed the
# value or not.
mode = http.const.StrictConnectionMode.DISABLED
elif not isinstance(mode, http.const.StrictConnectionMode):
mode = http.const.StrictConnectionMode(mode)
return mode
return http.const.StrictConnectionMode.DISABLED
async def get_cloud_user(self) -> str:
"""Return ID of Home Assistant Cloud system user."""
@@ -142,6 +142,9 @@ async def websocket_list_agents(
agent = manager.async_get_agent(agent_info.id)
assert agent is not None
if isinstance(agent, ConversationEntity):
continue
supported_languages = agent.supported_languages
if language and supported_languages != MATCH_ALL:
supported_languages = language_util.matches(
@@ -13,8 +13,8 @@
"crownstone_uart"
],
"requirements": [
"crownstone-cloud==1.4.9",
"crownstone-sse==2.0.4",
"crownstone-cloud==1.4.11",
"crownstone-sse==2.0.5",
"crownstone-uart==2.1.0",
"pyserial==3.5"
]
@@ -2,13 +2,13 @@
from __future__ import annotations
from typing import Any, TypedDict, TypeVar
from typing import Any, TypedDict, TypeVar, cast
from pydeconz.interfaces.groups import GroupHandler
from pydeconz.interfaces.lights import LightHandler
from pydeconz.models import ResourceType
from pydeconz.models.event import EventType
from pydeconz.models.group import Group
from pydeconz.models.group import Group, TypedGroupAction
from pydeconz.models.light.light import Light, LightAlert, LightColorMode, LightEffect
from homeassistant.components.light import (
@@ -105,6 +105,23 @@ class SetStateAttributes(TypedDict, total=False):
xy: tuple[float, float]
def update_color_state(
group: Group, lights: list[Light], override: bool = False
) -> None:
"""Sync group color state with light."""
data = {
attribute: light_attribute
for light in lights
for attribute in ("bri", "ct", "hue", "sat", "xy", "colormode", "effect")
if (light_attribute := light.raw["state"].get(attribute)) is not None
}
if override:
group.raw["action"] = cast(TypedGroupAction, data)
else:
group.update(cast(dict[str, dict[str, Any]], {"action": data}))
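The comprehension in the new `update_color_state` helper folds the state of several lights into a single `action` dict, with attributes from later lights overwriting earlier ones. A self-contained sketch of that merge, using made-up sample state dicts in place of real deCONZ `Light` objects:

```python
# Two hypothetical light states; None values are treated as "not reported".
lights_state = [
    {"bri": 200, "ct": None, "hue": 10000},
    {"bri": 150, "hue": 20000, "sat": 120},
]

# Same shape as the diff: iterate lights outer, attributes inner, and
# skip missing/None values. Later lights win on conflicting keys.
data = {
    attribute: value
    for state in lights_state
    for attribute in ("bri", "ct", "hue", "sat")
    if (value := state.get(attribute)) is not None
}
```

Here `data` ends up as `{"bri": 150, "hue": 20000, "sat": 120}`: `ct` is dropped because it was `None`, and the second light's `bri` and `hue` replace the first light's values.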
async def async_setup_entry(
hass: HomeAssistant,
config_entry: ConfigEntry,
@@ -148,11 +165,12 @@ async def async_setup_entry(
if (group := hub.api.groups[group_id]) and not group.lights:
return
first = True
for light_id in group.lights:
if (light := hub.api.lights.lights.get(light_id)) and light.reachable:
group.update_color_state(light, update_all_attributes=first)
first = False
lights = [
light
for light_id in group.lights
if (light := hub.api.lights.lights.get(light_id)) and light.reachable
]
update_color_state(group, lights, True)
async_add_entities([DeconzGroup(group, hub)])
@@ -326,7 +344,7 @@ class DeconzLight(DeconzBaseLight[Light]):
if self._device.reachable and "attr" not in self._device.changed_keys:
for group in self.hub.api.groups.values():
if self._device.resource_id in group.lights:
group.update_color_state(self._device)
update_color_state(group, [self._device])
class DeconzGroup(DeconzBaseLight[Group]):
@@ -41,6 +41,11 @@ class DuotecnoEntity(Entity):
"""When a unit has an update."""
self.async_write_ha_state()
@property
def available(self) -> bool:
"""Available state for the unit."""
return self._unit.is_available()
_T = TypeVar("_T", bound="DuotecnoEntity")
_P = ParamSpec("_P")
@@ -7,5 +7,5 @@
"iot_class": "local_push",
"loggers": ["pyduotecno", "pyduotecno-node", "pyduotecno-unit"],
"quality_scale": "silver",
"requirements": ["pyDuotecno==2024.3.2"]
"requirements": ["pyDuotecno==2024.5.0"]
}
@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/ecovacs",
"iot_class": "cloud_push",
"loggers": ["sleekxmppfs", "sucks", "deebot_client"],
"requirements": ["py-sucks==0.9.9", "deebot-client==7.1.0"]
"requirements": ["py-sucks==0.9.9", "deebot-client==7.2.0"]
}
@@ -15,5 +15,5 @@
"documentation": "https://www.home-assistant.io/integrations/elkm1",
"iot_class": "local_push",
"loggers": ["elkm1_lib"],
"requirements": ["elkm1-lib==2.2.6"]
"requirements": ["elkm1-lib==2.2.7"]
}
@@ -6,7 +6,7 @@
"documentation": "https://www.home-assistant.io/integrations/enphase_envoy",
"iot_class": "local_polling",
"loggers": ["pyenphase"],
"requirements": ["pyenphase==1.20.1"],
"requirements": ["pyenphase==1.20.3"],
"zeroconf": [
{
"type": "_enphase-envoy._tcp.local."
@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/environment_canada",
"iot_class": "cloud_polling",
"loggers": ["env_canada"],
"requirements": ["env-canada==0.6.0"]
"requirements": ["env-canada==0.6.2"]
}
@@ -20,5 +20,5 @@
"documentation": "https://www.home-assistant.io/integrations/frontend",
"integration_type": "system",
"quality_scale": "internal",
"requirements": ["home-assistant-frontend==20240426.0"]
"requirements": ["home-assistant-frontend==20240501.1"]
}
@@ -69,18 +69,21 @@ async def async_setup_services(hass: HomeAssistant) -> None:
async def async_set_config(call: ServiceCall) -> None:
"""Set a Fully Kiosk Browser config value on the device."""
for coordinator in await collect_coordinators(call.data[ATTR_DEVICE_ID]):
key = call.data[ATTR_KEY]
value = call.data[ATTR_VALUE]
# Fully API has different methods for setting string and bool values.
# check if call.data[ATTR_VALUE] is a bool
if isinstance(call.data[ATTR_VALUE], bool) or call.data[
ATTR_VALUE
].lower() in ("true", "false"):
await coordinator.fully.setConfigurationBool(
call.data[ATTR_KEY], call.data[ATTR_VALUE]
)
if isinstance(value, bool) or (
isinstance(value, str) and value.lower() in ("true", "false")
):
await coordinator.fully.setConfigurationBool(key, value)
else:
await coordinator.fully.setConfigurationString(
call.data[ATTR_KEY], call.data[ATTR_VALUE]
)
# Convert any int values to string
if isinstance(value, int):
value = str(value)
await coordinator.fully.setConfigurationString(key, value)
# Register all the above services
service_mapping = [
@@ -111,7 +114,7 @@ async def async_setup_services(hass: HomeAssistant) -> None:
{
vol.Required(ATTR_DEVICE_ID): cv.ensure_list,
vol.Required(ATTR_KEY): cv.string,
vol.Required(ATTR_VALUE): vol.Any(str, bool),
vol.Required(ATTR_VALUE): vol.Any(str, bool, int),
}
)
),
@@ -2,15 +2,23 @@
from __future__ import annotations
from datetime import datetime
import logging
from typing import Any
from zoneinfo import ZoneInfo
from fyta_cli.fyta_connector import FytaConnector
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_PASSWORD, CONF_USERNAME, Platform
from homeassistant.const import (
CONF_ACCESS_TOKEN,
CONF_PASSWORD,
CONF_USERNAME,
Platform,
)
from homeassistant.core import HomeAssistant
from .const import DOMAIN
from .const import CONF_EXPIRATION, DOMAIN
from .coordinator import FytaCoordinator
_LOGGER = logging.getLogger(__name__)
@@ -22,11 +30,16 @@ PLATFORMS = [
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Set up the Fyta integration."""
tz: str = hass.config.time_zone
username = entry.data[CONF_USERNAME]
password = entry.data[CONF_PASSWORD]
access_token: str = entry.data[CONF_ACCESS_TOKEN]
expiration: datetime = datetime.fromisoformat(
entry.data[CONF_EXPIRATION]
).astimezone(ZoneInfo(tz))
fyta = FytaConnector(username, password)
fyta = FytaConnector(username, password, access_token, expiration, tz)
coordinator = FytaCoordinator(hass, fyta)
@@ -47,3 +60,36 @@ async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
hass.data[DOMAIN].pop(entry.entry_id)
return unload_ok
async def async_migrate_entry(hass: HomeAssistant, config_entry: ConfigEntry) -> bool:
"""Migrate old entry."""
_LOGGER.debug("Migrating from version %s", config_entry.version)
if config_entry.version > 1:
# This means the user has downgraded from a future version
return False
if config_entry.version == 1:
new = {**config_entry.data}
if config_entry.minor_version < 2:
fyta = FytaConnector(
config_entry.data[CONF_USERNAME], config_entry.data[CONF_PASSWORD]
)
credentials: dict[str, Any] = await fyta.login()
await fyta.client.close()
new[CONF_ACCESS_TOKEN] = credentials[CONF_ACCESS_TOKEN]
new[CONF_EXPIRATION] = credentials[CONF_EXPIRATION].isoformat()
hass.config_entries.async_update_entry(
config_entry, data=new, minor_version=2, version=1
)
_LOGGER.debug(
"Migration to version %s.%s successful",
config_entry.version,
config_entry.minor_version,
)
return True
@@ -17,7 +17,7 @@ import voluptuous as vol
from homeassistant.config_entries import ConfigEntry, ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_PASSWORD, CONF_USERNAME
from .const import DOMAIN
from .const import CONF_EXPIRATION, DOMAIN
_LOGGER = logging.getLogger(__name__)
@@ -31,14 +31,19 @@ class FytaConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle a config flow for Fyta."""
VERSION = 1
_entry: ConfigEntry | None = None
MINOR_VERSION = 2
def __init__(self) -> None:
"""Initialize FytaConfigFlow."""
self.credentials: dict[str, Any] = {}
self._entry: ConfigEntry | None = None
async def async_auth(self, user_input: Mapping[str, Any]) -> dict[str, str]:
"""Reusable Auth Helper."""
fyta = FytaConnector(user_input[CONF_USERNAME], user_input[CONF_PASSWORD])
try:
await fyta.login()
self.credentials = await fyta.login()
except FytaConnectionError:
return {"base": "cannot_connect"}
except FytaAuthentificationError:
@@ -51,6 +56,10 @@ class FytaConfigFlow(ConfigFlow, domain=DOMAIN):
finally:
await fyta.client.close()
self.credentials[CONF_EXPIRATION] = self.credentials[
CONF_EXPIRATION
].isoformat()
return {}
async def async_step_user(
@@ -62,6 +71,7 @@ class FytaConfigFlow(ConfigFlow, domain=DOMAIN):
self._async_abort_entries_match({CONF_USERNAME: user_input[CONF_USERNAME]})
if not (errors := await self.async_auth(user_input)):
user_input |= self.credentials
return self.async_create_entry(
title=user_input[CONF_USERNAME], data=user_input
)
@@ -85,6 +95,7 @@ class FytaConfigFlow(ConfigFlow, domain=DOMAIN):
assert self._entry is not None
if user_input and not (errors := await self.async_auth(user_input)):
user_input |= self.credentials
return self.async_update_reload_and_abort(
self._entry, data={**self._entry.data, **user_input}
)
@@ -1,3 +1,4 @@
"""Const for fyta integration."""
DOMAIN = "fyta"
CONF_EXPIRATION = "expiration"
@@ -12,10 +12,13 @@ from fyta_cli.fyta_exceptions import (
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_ACCESS_TOKEN
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .const import CONF_EXPIRATION
_LOGGER = logging.getLogger(__name__)
@@ -39,17 +42,33 @@ class FytaCoordinator(DataUpdateCoordinator[dict[int, dict[str, Any]]]):
) -> dict[int, dict[str, Any]]:
"""Fetch data from API endpoint."""
if self.fyta.expiration is None or self.fyta.expiration < datetime.now():
if (
self.fyta.expiration is None
or self.fyta.expiration.timestamp() < datetime.now().timestamp()
):
await self.renew_authentication()
return await self.fyta.update_all_plants()
async def renew_authentication(self) -> None:
async def renew_authentication(self) -> bool:
"""Renew access token for FYTA API."""
credentials: dict[str, Any] = {}
try:
await self.fyta.login()
credentials = await self.fyta.login()
except FytaConnectionError as ex:
raise ConfigEntryNotReady from ex
except (FytaAuthentificationError, FytaPasswordError) as ex:
raise ConfigEntryAuthFailed from ex
new_config_entry = {**self.config_entry.data}
new_config_entry[CONF_ACCESS_TOKEN] = credentials[CONF_ACCESS_TOKEN]
new_config_entry[CONF_EXPIRATION] = credentials[CONF_EXPIRATION].isoformat()
self.hass.config_entries.async_update_entry(
self.config_entry, data=new_config_entry
)
_LOGGER.debug("Credentials successfully updated")
return True
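The switch to comparing `.timestamp()` values in the Fyta coordinator sidesteps a `TypeError` that Python raises when a timezone-aware `datetime` is compared directly with a naive one. A minimal illustration with fixed dates:

```python
from datetime import datetime, timezone

aware = datetime(2024, 6, 1, tzinfo=timezone.utc)
naive = datetime(2024, 7, 1)  # no tzinfo attached

# Direct comparison of aware and naive datetimes raises TypeError.
try:
    aware < naive
    mixed_comparison_ok = True
except TypeError:
    mixed_comparison_ok = False

# .timestamp() converts both to POSIX seconds (a naive datetime is
# interpreted in local time), so the comparison always succeeds:
later = aware.timestamp() < naive.timestamp()
```

Here `mixed_comparison_ok` is `False` and `later` is `True`: once both sides are plain floats, the expiry check in `_async_update_data` can no longer blow up on a tz-aware expiration from the API.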
@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/fyta",
"integration_type": "hub",
"iot_class": "cloud_polling",
"requirements": ["fyta_cli==0.3.5"]
"requirements": ["fyta_cli==0.4.1"]
}
@@ -7,7 +7,7 @@ from dataclasses import dataclass
from datetime import datetime
from typing import Final
from fyta_cli.fyta_connector import PLANT_STATUS
from fyta_cli.fyta_connector import PLANT_MEASUREMENT_STATUS, PLANT_STATUS
from homeassistant.components.sensor import (
SensorDeviceClass,
@@ -34,7 +34,15 @@ class FytaSensorEntityDescription(SensorEntityDescription):
)
PLANT_STATUS_LIST: list[str] = ["too_low", "low", "perfect", "high", "too_high"]
PLANT_STATUS_LIST: list[str] = ["deleted", "doing_great", "need_attention", "no_sensor"]
PLANT_MEASUREMENT_STATUS_LIST: list[str] = [
"no_data",
"too_low",
"low",
"perfect",
"high",
"too_high",
]
SENSORS: Final[list[FytaSensorEntityDescription]] = [
FytaSensorEntityDescription(
@@ -52,29 +60,29 @@ SENSORS: Final[list[FytaSensorEntityDescription]] = [
key="temperature_status",
translation_key="temperature_status",
device_class=SensorDeviceClass.ENUM,
options=PLANT_STATUS_LIST,
value_fn=PLANT_STATUS.get,
options=PLANT_MEASUREMENT_STATUS_LIST,
value_fn=PLANT_MEASUREMENT_STATUS.get,
),
FytaSensorEntityDescription(
key="light_status",
translation_key="light_status",
device_class=SensorDeviceClass.ENUM,
options=PLANT_STATUS_LIST,
value_fn=PLANT_STATUS.get,
options=PLANT_MEASUREMENT_STATUS_LIST,
value_fn=PLANT_MEASUREMENT_STATUS.get,
),
FytaSensorEntityDescription(
key="moisture_status",
translation_key="moisture_status",
device_class=SensorDeviceClass.ENUM,
options=PLANT_STATUS_LIST,
value_fn=PLANT_STATUS.get,
options=PLANT_MEASUREMENT_STATUS_LIST,
value_fn=PLANT_MEASUREMENT_STATUS.get,
),
FytaSensorEntityDescription(
key="salinity_status",
translation_key="salinity_status",
device_class=SensorDeviceClass.ENUM,
options=PLANT_STATUS_LIST,
value_fn=PLANT_STATUS.get,
options=PLANT_MEASUREMENT_STATUS_LIST,
value_fn=PLANT_MEASUREMENT_STATUS.get,
),
FytaSensorEntityDescription(
key="temperature",
@@ -36,6 +36,16 @@
"plant_status": {
"name": "Plant state",
"state": {
"deleted": "Deleted",
"doing_great": "Doing great",
"need_attention": "Needs attention",
"no_sensor": "No sensor"
}
},
"temperature_status": {
"name": "Temperature state",
"state": {
"no_data": "No data",
"too_low": "Too low",
"low": "Low",
"perfect": "Perfect",
@@ -43,44 +53,37 @@
"too_high": "Too high"
}
},
"temperature_status": {
"name": "Temperature state",
"state": {
"too_low": "[%key:component::fyta::entity::sensor::plant_status::state::too_low%]",
"low": "[%key:component::fyta::entity::sensor::plant_status::state::low%]",
"perfect": "[%key:component::fyta::entity::sensor::plant_status::state::perfect%]",
"high": "[%key:component::fyta::entity::sensor::plant_status::state::high%]",
"too_high": "[%key:component::fyta::entity::sensor::plant_status::state::too_high%]"
}
},
"light_status": {
"name": "Light state",
"state": {
"too_low": "[%key:component::fyta::entity::sensor::plant_status::state::too_low%]",
"low": "[%key:component::fyta::entity::sensor::plant_status::state::low%]",
"perfect": "[%key:component::fyta::entity::sensor::plant_status::state::perfect%]",
"high": "[%key:component::fyta::entity::sensor::plant_status::state::high%]",
"too_high": "[%key:component::fyta::entity::sensor::plant_status::state::too_high%]"
"no_data": "[%key:component::fyta::entity::sensor::temperature_status::state::no_data%]",
"too_low": "[%key:component::fyta::entity::sensor::temperature_status::state::too_low%]",
"low": "[%key:component::fyta::entity::sensor::temperature_status::state::low%]",
"perfect": "[%key:component::fyta::entity::sensor::temperature_status::state::perfect%]",
"high": "[%key:component::fyta::entity::sensor::temperature_status::state::high%]",
"too_high": "[%key:component::fyta::entity::sensor::temperature_status::state::too_high%]"
}
},
"moisture_status": {
"name": "Moisture state",
"state": {
"too_low": "[%key:component::fyta::entity::sensor::plant_status::state::too_low%]",
"low": "[%key:component::fyta::entity::sensor::plant_status::state::low%]",
"perfect": "[%key:component::fyta::entity::sensor::plant_status::state::perfect%]",
"high": "[%key:component::fyta::entity::sensor::plant_status::state::high%]",
"too_high": "[%key:component::fyta::entity::sensor::plant_status::state::too_high%]"
"no_data": "[%key:component::fyta::entity::sensor::temperature_status::state::no_data%]",
"too_low": "[%key:component::fyta::entity::sensor::temperature_status::state::too_low%]",
"low": "[%key:component::fyta::entity::sensor::temperature_status::state::low%]",
"perfect": "[%key:component::fyta::entity::sensor::temperature_status::state::perfect%]",
"high": "[%key:component::fyta::entity::sensor::temperature_status::state::high%]",
"too_high": "[%key:component::fyta::entity::sensor::temperature_status::state::too_high%]"
}
},
"salinity_status": {
"name": "Salinity state",
"state": {
"too_low": "[%key:component::fyta::entity::sensor::plant_status::state::too_low%]",
"low": "[%key:component::fyta::entity::sensor::plant_status::state::low%]",
"perfect": "[%key:component::fyta::entity::sensor::plant_status::state::perfect%]",
"high": "[%key:component::fyta::entity::sensor::plant_status::state::high%]",
"too_high": "[%key:component::fyta::entity::sensor::plant_status::state::too_high%]"
"no_data": "[%key:component::fyta::entity::sensor::temperature_status::state::no_data%]",
"too_low": "[%key:component::fyta::entity::sensor::temperature_status::state::too_low%]",
"low": "[%key:component::fyta::entity::sensor::temperature_status::state::low%]",
"perfect": "[%key:component::fyta::entity::sensor::temperature_status::state::perfect%]",
"high": "[%key:component::fyta::entity::sensor::temperature_status::state::high%]",
"too_high": "[%key:component::fyta::entity::sensor::temperature_status::state::too_high%]"
}
},
"light": {
@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/goodwe",
"iot_class": "local_polling",
"loggers": ["goodwe"],
"requirements": ["goodwe==0.3.2"]
"requirements": ["goodwe==0.3.5"]
}
@@ -63,7 +63,7 @@ NUMBERS = (
native_unit_of_measurement=PERCENTAGE,
native_step=1,
native_min_value=0,
native_max_value=100,
native_max_value=200,
getter=lambda inv: inv.get_grid_export_limit(),
setter=lambda inv, val: inv.set_grid_export_limit(val),
filter=lambda inv: _get_setting_unit(inv, "grid_export_limit") == "%",
@@ -169,7 +169,9 @@ class GoogleAssistantConversationAgent(conversation.AbstractConversationAgent):
self.language = user_input.language
self.assistant = TextAssistant(credentials, self.language)
resp = self.assistant.assist(user_input.text)
resp = await self.hass.async_add_executor_job(
self.assistant.assist, user_input.text
)
text_response = resp[0] or "<empty response>"
intent_response = intent.IntentResponse(language=user_input.language)
@@ -79,7 +79,7 @@ async def async_send_text_commands(
) as assistant:
command_response_list = []
for command in commands:
resp = assistant.assist(command)
resp = await hass.async_add_executor_job(assistant.assist, command)
text_response = resp[0]
_LOGGER.debug("command: %s\nresponse: %s", command, text_response)
audio_response = resp[2]
@@ -182,11 +182,11 @@ class GoogleGenerativeAIAgent(conversation.AbstractConversationAgent):
conversation_id = ulid.ulid_now()
messages = [{}, {}]
intent_response = intent.IntentResponse(language=user_input.language)
try:
prompt = self._async_generate_prompt(raw_prompt)
except TemplateError as err:
_LOGGER.error("Error rendering prompt: %s", err)
intent_response = intent.IntentResponse(language=user_input.language)
intent_response.async_set_error(
intent.IntentResponseErrorCode.UNKNOWN,
f"Sorry, I had a problem with my template: {err}",
@@ -210,7 +210,6 @@ class GoogleGenerativeAIAgent(conversation.AbstractConversationAgent):
genai_types.StopCandidateException,
) as err:
_LOGGER.error("Error sending message: %s", err)
intent_response = intent.IntentResponse(language=user_input.language)
intent_response.async_set_error(
intent.IntentResponseErrorCode.UNKNOWN,
f"Sorry, I had a problem talking to Google Generative AI: {err}",
@@ -220,9 +219,15 @@ class GoogleGenerativeAIAgent(conversation.AbstractConversationAgent):
)
_LOGGER.debug("Response: %s", chat_response.parts)
if not chat_response.parts:
intent_response.async_set_error(
intent.IntentResponseErrorCode.UNKNOWN,
"Sorry, I had a problem talking to Google Generative AI. Likely blocked",
)
return conversation.ConversationResult(
response=intent_response, conversation_id=conversation_id
)
self.history[conversation_id] = chat.history
intent_response = intent.IntentResponse(language=user_input.language)
intent_response.async_set_speech(chat_response.text)
return conversation.ConversationResult(
response=intent_response, conversation_id=conversation_id
@@ -17,7 +17,7 @@ from homeassistant.components.light import (
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.device_registry import CONNECTION_NETWORK_MAC, DeviceInfo
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity
@@ -94,7 +94,7 @@ class GoveeLight(CoordinatorEntity[GoveeLocalApiCoordinator], LightEntity):
name=device.sku,
manufacturer=MANUFACTURER,
model=device.sku,
connections={(CONNECTION_NETWORK_MAC, device.fingerprint)},
serial_number=device.fingerprint,
)
@property
@@ -6,5 +6,5 @@
"dependencies": ["network"],
"documentation": "https://www.home-assistant.io/integrations/govee_light_local",
"iot_class": "local_push",
"requirements": ["govee-local-api==1.4.4"]
"requirements": ["govee-local-api==1.4.5"]
}
@@ -10,9 +10,10 @@ import voluptuous as vol
from homeassistant.config_entries import ConfigFlow
from homeassistant.const import CONF_API_KEY, CONF_NAME, CONF_URL
from homeassistant.core import HomeAssistant
from homeassistant.core import DOMAIN as HOMEASSISTANT_DOMAIN, HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.issue_registry import IssueSeverity, async_create_issue
from .const import CONF_API_USER, DEFAULT_URL, DOMAIN
@@ -79,6 +80,20 @@ class HabiticaConfigFlow(ConfigFlow, domain=DOMAIN):
async def async_step_import(self, import_data):
"""Import habitica config from configuration.yaml."""
async_create_issue(
self.hass,
HOMEASSISTANT_DOMAIN,
f"deprecated_yaml_{DOMAIN}",
is_fixable=False,
breaks_in_ha_version="2024.11.0",
severity=IssueSeverity.WARNING,
translation_key="deprecated_yaml",
translation_placeholders={
"domain": DOMAIN,
"integration_title": "Habitica",
},
)
return await self.async_step_user(import_data)
@@ -95,7 +95,10 @@ class BaseFirmwareInstallFlow(ConfigEntryBaseFlow, ABC):
_LOGGER.error(err)
raise AbortFlow(
"addon_set_config_failed",
description_placeholders=self._get_translation_placeholders(),
description_placeholders={
**self._get_translation_placeholders(),
"addon_name": addon_manager.addon_name,
},
) from err
async def _async_get_addon_info(self, addon_manager: AddonManager) -> AddonInfo:
@@ -597,6 +600,21 @@ class HomeAssistantSkyConnectMultiPanOptionsFlowHandler(
"""Return the name of the hardware."""
return self._hw_variant.full_name
async def async_step_flashing_complete(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Finish flashing and update the config entry."""
self.hass.config_entries.async_update_entry(
entry=self.config_entry,
data={
**self.config_entry.data,
"firmware": ApplicationType.EZSP.value,
},
options=self.config_entry.options,
)
return await super().async_step_flashing_complete(user_input)
class HomeAssistantSkyConnectOptionsFlowHandler(
BaseFirmwareInstallFlow, OptionsFlowWithConfigEntry
@@ -5,7 +5,7 @@
"config_flow": true,
"dependencies": ["hardware", "usb", "homeassistant_hardware"],
"documentation": "https://www.home-assistant.io/integrations/homeassistant_sky_connect",
"integration_type": "device",
"integration_type": "hardware",
"usb": [
{
"vid": "10C4",
@@ -50,9 +50,9 @@ def get_hardware_variant(config_entry: ConfigEntry) -> HardwareVariant:
return HardwareVariant.from_usb_product_name(config_entry.data["product"])
def get_zha_device_path(config_entry: ConfigEntry) -> str:
def get_zha_device_path(config_entry: ConfigEntry) -> str | None:
"""Get the device path from a ZHA config entry."""
return cast(str, config_entry.data["device"]["path"])
return cast(str | None, config_entry.data.get("device", {}).get("path", None))
@singleton(OTBR_ADDON_MANAGER_DATA)
@@ -94,13 +94,15 @@ async def guess_firmware_type(hass: HomeAssistant, device_path: str) -> Firmware
for zha_config_entry in hass.config_entries.async_entries(ZHA_DOMAIN):
zha_path = get_zha_device_path(zha_config_entry)
device_guesses[zha_path].append(
FirmwareGuess(
is_running=(zha_config_entry.state == ConfigEntryState.LOADED),
firmware_type=ApplicationType.EZSP,
source="zha",
if zha_path is not None:
device_guesses[zha_path].append(
FirmwareGuess(
is_running=(zha_config_entry.state == ConfigEntryState.LOADED),
firmware_type=ApplicationType.EZSP,
source="zha",
)
)
)
if is_hassio(hass):
otbr_addon_manager = get_otbr_addon_manager(hass)
@@ -153,6 +153,7 @@ class HKDevice:
self._subscriptions: dict[tuple[int, int], set[CALLBACK_TYPE]] = {}
self._pending_subscribes: set[tuple[int, int]] = set()
self._subscribe_timer: CALLBACK_TYPE | None = None
self._load_platforms_lock = asyncio.Lock()
@property
def entity_map(self) -> Accessories:
@@ -327,7 +328,8 @@ class HKDevice:
)
# BLE devices always get an RSSI sensor as well
if "sensor" not in self.platforms:
await self._async_load_platforms({"sensor"})
async with self._load_platforms_lock:
await self._async_load_platforms({"sensor"})
@callback
def _async_start_polling(self) -> None:
@@ -804,6 +806,7 @@ class HKDevice:
async def _async_load_platforms(self, platforms: set[str]) -> None:
"""Load a group of platforms."""
assert self._load_platforms_lock.locked(), "Must be called with lock held"
if not (to_load := platforms - self.platforms):
return
self.platforms.update(to_load)
@@ -813,22 +816,23 @@ class HKDevice:
async def async_load_platforms(self) -> None:
"""Load any platforms needed by this HomeKit device."""
to_load: set[str] = set()
for accessory in self.entity_map.accessories:
for service in accessory.services:
if service.type in HOMEKIT_ACCESSORY_DISPATCH:
platform = HOMEKIT_ACCESSORY_DISPATCH[service.type]
if platform not in self.platforms:
to_load.add(platform)
for char in service.characteristics:
if char.type in CHARACTERISTIC_PLATFORMS:
platform = CHARACTERISTIC_PLATFORMS[char.type]
async with self._load_platforms_lock:
to_load: set[str] = set()
for accessory in self.entity_map.accessories:
for service in accessory.services:
if service.type in HOMEKIT_ACCESSORY_DISPATCH:
platform = HOMEKIT_ACCESSORY_DISPATCH[service.type]
if platform not in self.platforms:
to_load.add(platform)
if to_load:
await self._async_load_platforms(to_load)
for char in service.characteristics:
if char.type in CHARACTERISTIC_PLATFORMS:
platform = CHARACTERISTIC_PLATFORMS[char.type]
if platform not in self.platforms:
to_load.add(platform)
if to_load:
await self._async_load_platforms(to_load)
@callback
def async_update_available_state(self, *_: Any) -> None:
@@ -212,13 +212,15 @@ class HomeKitWindowCover(HomeKitEntity, CoverEntity):
)
@property
def current_cover_tilt_position(self) -> int:
def current_cover_tilt_position(self) -> int | None:
"""Return current position of cover tilt."""
tilt_position = self.service.value(CharacteristicsTypes.VERTICAL_TILT_CURRENT)
if not tilt_position:
tilt_position = self.service.value(
CharacteristicsTypes.HORIZONTAL_TILT_CURRENT
)
if tilt_position is None:
return None
# Recalculate to convert from arcdegree scale to percentage scale.
if self.is_vertical_tilt:
scale = 0.9
@@ -10,7 +10,7 @@ import os
import socket
import ssl
from tempfile import NamedTemporaryFile
from typing import Any, Final, Required, TypedDict, cast
from typing import Any, Final, TypedDict, cast
from urllib.parse import quote_plus, urljoin
from aiohttp import web
@@ -36,7 +36,6 @@ from homeassistant.core import (
HomeAssistant,
ServiceCall,
ServiceResponse,
SupportsResponse,
callback,
)
from homeassistant.exceptions import (
@@ -146,9 +145,6 @@ HTTP_SCHEMA: Final = vol.All(
[SSL_INTERMEDIATE, SSL_MODERN]
),
vol.Optional(CONF_USE_X_FRAME_OPTIONS, default=True): cv.boolean,
vol.Optional(
CONF_STRICT_CONNECTION, default=StrictConnectionMode.DISABLED
): vol.Coerce(StrictConnectionMode),
}
),
)
@@ -172,7 +168,6 @@ class ConfData(TypedDict, total=False):
login_attempts_threshold: int
ip_ban_enabled: bool
ssl_profile: str
strict_connection: Required[StrictConnectionMode]
@bind_hass
@@ -239,7 +234,7 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
login_threshold=login_threshold,
is_ban_enabled=is_ban_enabled,
use_x_frame_options=use_x_frame_options,
strict_connection_non_cloud=conf[CONF_STRICT_CONNECTION],
strict_connection_non_cloud=StrictConnectionMode.DISABLED,
)
async def stop_server(event: Event) -> None:
@@ -620,7 +615,7 @@ def _setup_services(hass: HomeAssistant, conf: ConfData) -> None:
if not user.is_admin:
raise Unauthorized(context=call.context)
if conf[CONF_STRICT_CONNECTION] is StrictConnectionMode.DISABLED:
if StrictConnectionMode.DISABLED is StrictConnectionMode.DISABLED:
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="strict_connection_not_enabled_non_cloud",
@@ -652,10 +647,3 @@ def _setup_services(hass: HomeAssistant, conf: ConfData) -> None:
"url": f"https://login.home-assistant.io?u={quote_plus(url)}",
"direct_url": url,
}
hass.services.async_register(
DOMAIN,
"create_temporary_strict_connection_url",
create_temporary_strict_connection_url,
supports_response=SupportsResponse.ONLY,
)
@@ -54,7 +54,7 @@ def format_default(value: StateType) -> tuple[StateType, str | None]:
if value is not None:
# Clean up value and infer unit, e.g. -71dBm, 15 dB
if match := re.match(
r"([>=<]*)(?P<value>.+?)\s*(?P<unit>[a-zA-Z]+)\s*$", str(value)
r"((&[gl]t;|[><])=?)?(?P<value>.+?)\s*(?P<unit>[a-zA-Z]+)\s*$", str(value)
):
try:
value = float(match.group("value"))
@@ -57,6 +57,11 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
hass.data.setdefault(DOMAIN, {})[entry.entry_id] = coordinator
if "amc:api" not in entry.data["token"]["scope"]:
# We raise ConfigEntryAuthFailed here because the websocket can't be used
# without the scope. So only polling would be possible.
raise ConfigEntryAuthFailed
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True
@@ -13,7 +13,9 @@ from homeassistant.helpers import config_entry_oauth2_flow
from .const import DOMAIN, NAME
_LOGGER = logging.getLogger(__name__)
CONF_USER_ID = "user_id"
HUSQVARNA_DEV_PORTAL_URL = "https://developer.husqvarnagroup.cloud/applications"
class HusqvarnaConfigFlowHandler(
@@ -29,8 +31,14 @@ class HusqvarnaConfigFlowHandler(
async def async_oauth_create_entry(self, data: dict[str, Any]) -> ConfigFlowResult:
"""Create an entry for the flow."""
token = data[CONF_TOKEN]
if "amc:api" not in token["scope"] and not self.reauth_entry:
return self.async_abort(reason="missing_amc_scope")
user_id = token[CONF_USER_ID]
if self.reauth_entry:
if "amc:api" not in token["scope"]:
return self.async_update_reload_and_abort(
self.reauth_entry, data=data, reason="missing_amc_scope"
)
if self.reauth_entry.unique_id != user_id:
return self.async_abort(reason="wrong_account")
return self.async_update_reload_and_abort(self.reauth_entry, data=data)
@@ -56,6 +64,9 @@ class HusqvarnaConfigFlowHandler(
self.reauth_entry = self.hass.config_entries.async_get_entry(
self.context["entry_id"]
)
if self.reauth_entry is not None:
if "amc:api" not in self.reauth_entry.data["token"]["scope"]:
return await self.async_step_missing_scope()
return await self.async_step_reauth_confirm()
async def async_step_reauth_confirm(
@@ -65,3 +76,19 @@ class HusqvarnaConfigFlowHandler(
if user_input is None:
return self.async_show_form(step_id="reauth_confirm")
return await self.async_step_user()
async def async_step_missing_scope(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Confirm reauth for missing scope."""
if user_input is None and self.reauth_entry is not None:
token_structured = structure_token(
self.reauth_entry.data["token"]["access_token"]
)
return self.async_show_form(
step_id="missing_scope",
description_placeholders={
"application_url": f"{HUSQVARNA_DEV_PORTAL_URL}/{token_structured.client_id}"
},
)
return await self.async_step_user()
@@ -4,12 +4,17 @@ import asyncio
from datetime import timedelta
import logging
from aioautomower.exceptions import ApiException, HusqvarnaWSServerHandshakeError
from aioautomower.exceptions import (
ApiException,
AuthException,
HusqvarnaWSServerHandshakeError,
)
from aioautomower.model import MowerAttributes
from aioautomower.session import AutomowerSession
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import DOMAIN
@@ -46,6 +51,8 @@ class AutomowerDataUpdateCoordinator(DataUpdateCoordinator[dict[str, MowerAttrib
return await self.api.get_status()
except ApiException as err:
raise UpdateFailed(err) from err
except AuthException as err:
raise ConfigEntryAuthFailed(err) from err
@callback
def callback(self, ws_data: dict[str, MowerAttributes]) -> None:
@@ -5,6 +5,10 @@
"title": "[%key:common::config_flow::title::reauth%]",
"description": "The Husqvarna Automower integration needs to re-authenticate your account"
},
"missing_scope": {
"title": "Your account is missing some API connections",
"description": "For the best experience with this integration both the `Authentication API` and the `Automower Connect API` should be connected. Please make sure that both of them are connected to your account in the [Husqvarna Developer Portal]({application_url})."
},
"pick_implementation": {
"title": "[%key:common::config_flow::title::oauth2_pick_implementation%]"
}
@@ -22,7 +26,8 @@
"oauth_unauthorized": "[%key:common::config_flow::abort::oauth2_unauthorized%]",
"oauth_failed": "[%key:common::config_flow::abort::oauth2_failed%]",
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]",
"wrong_account": "You can only reauthenticate this entry with the same Husqvarna account."
"wrong_account": "You can only reauthenticate this entry with the same Husqvarna account.",
"missing_amc_scope": "The `Authentication API` and the `Automower Connect API` are not connected to your application in the Husqvarna Developer Portal."
},
"create_entry": {
"default": "[%key:common::config_flow::create_entry::authenticated%]"
@@ -90,24 +90,24 @@ def setup_service_functions(hass: HomeAssistant) -> None:
ihc_controller = _get_controller(call)
await async_pulse(hass, ihc_controller, ihc_id)
hass.services.async_register(
hass.services.register(
DOMAIN,
SERVICE_SET_RUNTIME_VALUE_BOOL,
async_set_runtime_value_bool,
schema=SET_RUNTIME_VALUE_BOOL_SCHEMA,
)
hass.services.async_register(
hass.services.register(
DOMAIN,
SERVICE_SET_RUNTIME_VALUE_INT,
async_set_runtime_value_int,
schema=SET_RUNTIME_VALUE_INT_SCHEMA,
)
hass.services.async_register(
hass.services.register(
DOMAIN,
SERVICE_SET_RUNTIME_VALUE_FLOAT,
async_set_runtime_value_float,
schema=SET_RUNTIME_VALUE_FLOAT_SCHEMA,
)
hass.services.async_register(
hass.services.register(
DOMAIN, SERVICE_PULSE, async_pulse_runtime_input, schema=PULSE_SCHEMA
)
@@ -75,7 +75,7 @@ CONFIG_SCHEMA = vol.Schema(
vol.Optional(CONF_FOLDER, default="INBOX"): str,
vol.Optional(CONF_SEARCH, default="UnSeen UnDeleted"): str,
# The default for new entries is to not include text and headers
vol.Optional(CONF_EVENT_MESSAGE_DATA, default=[]): cv.ensure_list,
vol.Optional(CONF_EVENT_MESSAGE_DATA, default=[]): EVENT_MESSAGE_DATA_SELECTOR,
}
)
CONFIG_SCHEMA_ADVANCED = {
@@ -22,6 +22,7 @@ from .const import (
CONF_CAT,
CONF_DIM_STEPS,
CONF_HOUSECODE,
CONF_HUB_VERSION,
CONF_SUBCAT,
CONF_UNITCODE,
HOUSECODES,
@@ -143,6 +144,7 @@ def build_hub_schema(
schema = {
vol.Required(CONF_HOST, default=host): str,
vol.Required(CONF_PORT, default=port): int,
vol.Required(CONF_HUB_VERSION, default=hub_version): int,
}
if hub_version == 2:
schema[vol.Required(CONF_USERNAME, default=username)] = str
@@ -2,7 +2,7 @@
from __future__ import annotations
from datetime import datetime, timedelta
from datetime import date, datetime, timedelta
import logging
from typing import Any, cast
@@ -70,8 +70,8 @@ class IslamicPrayerDataUpdateCoordinator(DataUpdateCoordinator[dict[str, datetim
"""Return the school."""
return self.config_entry.options.get(CONF_SCHOOL, DEFAULT_SCHOOL)
def get_new_prayer_times(self) -> dict[str, Any]:
"""Fetch prayer times for today."""
def get_new_prayer_times(self, for_date: date) -> dict[str, Any]:
"""Fetch prayer times for the specified date."""
calc = PrayerTimesCalculator(
latitude=self.latitude,
longitude=self.longitude,
@@ -79,7 +79,7 @@ class IslamicPrayerDataUpdateCoordinator(DataUpdateCoordinator[dict[str, datetim
latitudeAdjustmentMethod=self.lat_adj_method,
midnightMode=self.midnight_mode,
school=self.school,
date=str(dt_util.now().date()),
date=str(for_date),
iso8601=True,
)
return cast(dict[str, Any], calc.fetch_prayer_times())
@@ -88,51 +88,18 @@ class IslamicPrayerDataUpdateCoordinator(DataUpdateCoordinator[dict[str, datetim
def async_schedule_future_update(self, midnight_dt: datetime) -> None:
"""Schedule future update for sensors.
Midnight is a calculated time. The specifics of the calculation
depends on the method of the prayer time calculation. This calculated
midnight is the time at which the time to pray the Isha prayers have
expired.
The least surprising behaviour is to load the next day's prayer times only
after the current day's prayers are complete. We will take the fiqhi opinion
that Isha should be prayed before Islamic midnight (which may be before or after 12:00 midnight),
and thus we will switch to the next day's timings at Islamic midnight.
Calculated Midnight: The Islamic midnight.
Traditional Midnight: 12:00AM
Update logic for prayer times:
If the Calculated Midnight is before the traditional midnight then wait
until the traditional midnight to run the update. This way the day
will have changed over and we don't need to do any fancy calculations.
If the Calculated Midnight is after the traditional midnight, then wait
until after the calculated Midnight. We don't want to update the prayer
times too early or else the timings might be incorrect.
Example:
calculated midnight = 11:23PM (before traditional midnight)
Update time: 12:00AM
calculated midnight = 1:35AM (after traditional midnight)
update time: 1:36AM.
The +1s is to ensure that any automations predicated on the arrival of Islamic midnight will run.
"""
_LOGGER.debug("Scheduling next update for Islamic prayer times")
now = dt_util.utcnow()
if now > midnight_dt:
next_update_at = midnight_dt + timedelta(days=1, minutes=1)
_LOGGER.debug(
"Midnight is after the day changes so schedule update for after Midnight the next day"
)
else:
_LOGGER.debug(
"Midnight is before the day changes so schedule update for the next start of day"
)
next_update_at = dt_util.start_of_local_day(now + timedelta(days=1))
_LOGGER.debug("Next update scheduled for: %s", next_update_at)
self.event_unsub = async_track_point_in_time(
self.hass, self.async_request_update, next_update_at
self.hass, self.async_request_update, midnight_dt + timedelta(seconds=1)
)
async def async_request_update(self, _: datetime) -> None:
@@ -140,8 +107,34 @@ class IslamicPrayerDataUpdateCoordinator(DataUpdateCoordinator[dict[str, datetim
await self.async_request_refresh()
async def _async_update_data(self) -> dict[str, datetime]:
"""Update sensors with new prayer times."""
prayer_times = self.get_new_prayer_times()
"""Update sensors with new prayer times.
Prayer time calculations "roll over" at 12:00 midnight - but this does not mean that all prayers
occur within that Gregorian calendar day. For instance Jasper, Alta. sees Isha occur after 00:00 in the summer.
It is similarly possible (albeit less likely) that Fajr occurs before 00:00.
As such, to ensure that no prayer times are "unreachable" (e.g. we always see the Isha timestamp pass before loading the next day's times),
we calculate 3 days' worth of times (-1, 0, +1 days) and select the appropriate set based on Islamic midnight.
The calculation is inexpensive, so there is no need to cache it.
"""
# Zero out the us component to maintain consistent rollover at T+1s
now = dt_util.now().replace(microsecond=0)
yesterday_times = self.get_new_prayer_times((now - timedelta(days=1)).date())
today_times = self.get_new_prayer_times(now.date())
tomorrow_times = self.get_new_prayer_times((now + timedelta(days=1)).date())
if (
yesterday_midnight := dt_util.parse_datetime(yesterday_times["Midnight"])
) and now <= yesterday_midnight:
prayer_times = yesterday_times
elif (
tomorrow_midnight := dt_util.parse_datetime(today_times["Midnight"])
) and now > tomorrow_midnight:
prayer_times = tomorrow_times
else:
prayer_times = today_times
# introduced in prayer-times-calculator 0.0.8
prayer_times.pop("date", None)
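The three-day window described in the docstring can be sketched generically: compute yesterday's, today's, and tomorrow's timings, then pick the set whose Islamic midnight brackets "now". Here `fetch_times` is a hypothetical stand-in for `get_new_prayer_times` (the real timings come from `prayer-times-calculator`):

```python
from datetime import datetime, timedelta


def select_prayer_times(now: datetime, fetch_times) -> dict:
    """Pick yesterday's, today's or tomorrow's timings based on Islamic midnight.

    fetch_times is a callable taking a date and returning a dict with an
    ISO-formatted "Midnight" entry.
    """
    yesterday = fetch_times(now.date() - timedelta(days=1))
    today = fetch_times(now.date())
    tomorrow = fetch_times(now.date() + timedelta(days=1))

    # Still before yesterday's Islamic midnight: the previous day's prayers
    # are not complete yet, so keep showing them.
    if now <= datetime.fromisoformat(yesterday["Midnight"]):
        return yesterday
    # Past today's Islamic midnight: roll over to tomorrow's timings.
    if now > datetime.fromisoformat(today["Midnight"]):
        return tomorrow
    return today
```

This keeps every prayer timestamp "reachable" even when Isha or Fajr falls on the other side of 00:00 Gregorian.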
@@ -7,5 +7,5 @@
"integration_type": "device",
"iot_class": "local_polling",
"loggers": ["jvcprojector"],
"requirements": ["pyjvcprojector==1.0.9"]
"requirements": ["pyjvcprojector==1.0.11"]
}
@@ -2,6 +2,7 @@
from __future__ import annotations
import asyncio
from collections.abc import Iterable
import logging
from typing import Any
@@ -74,11 +75,13 @@ class JvcProjectorRemote(JvcProjectorEntity, RemoteEntity):
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn the device on."""
await self.device.power_on()
await asyncio.sleep(1)
await self.coordinator.async_refresh()
async def async_turn_off(self, **kwargs: Any) -> None:
"""Turn the device off."""
await self.device.power_off()
await asyncio.sleep(1)
await self.coordinator.async_refresh()
async def async_send_command(self, command: Iterable[str], **kwargs: Any) -> None:
@@ -97,7 +97,7 @@ def _create_notification_instance(xknx: XKNX, config: ConfigType) -> XknxNotific
)
class KNXNotify(NotifyEntity, KnxEntity):
class KNXNotify(KnxEntity, NotifyEntity):
"""Representation of a KNX notification entity."""
_device: XknxNotification
@@ -480,7 +480,13 @@ class KodiEntity(MediaPlayerEntity):
self._reset_state()
return
self._players = await self._kodi.get_players()
try:
self._players = await self._kodi.get_players()
except (TransportError, ProtocolError):
if not self._connection.can_subscribe:
self._reset_state()
return
raise
if self._kodi_is_off:
self._reset_state()
@@ -18,6 +18,7 @@ from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.setup import SetupPhases, async_pause_setup
from homeassistant.util import dt as dt_util
from .const import CONF_TODO_LIST_NAME, DOMAIN
@@ -67,9 +68,16 @@ async def async_setup_entry(
) -> None:
"""Set up the local_todo todo platform."""
store = hass.data[DOMAIN][config_entry.entry_id]
store: LocalTodoListStore = hass.data[DOMAIN][config_entry.entry_id]
ics = await store.async_load()
calendar = IcsCalendarStream.calendar_from_ics(ics)
with async_pause_setup(hass, SetupPhases.WAIT_IMPORT_PACKAGES):
# calendar_from_ics will dynamically load packages
# the first time it is called, so we need to do it
# in a separate thread to avoid blocking the event loop
calendar: Calendar = await hass.async_add_import_executor_job(
IcsCalendarStream.calendar_from_ics, ics
)
migrated = _migrate_calendar(calendar)
calendar.prodid = PRODID
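Offloading the first `calendar_from_ics` call keeps the event loop responsive while the library imports its packages. The same pattern with plain asyncio (HA's `async_add_import_executor_job` is a wrapper around this idea; `parse_blocking` is a hypothetical stand-in for the ICS parse):

```python
import asyncio


def parse_blocking(data: str) -> list[str]:
    # Stand-in for IcsCalendarStream.calendar_from_ics: any CPU- or
    # import-heavy parse that would otherwise block the event loop.
    return [line for line in data.splitlines() if line.startswith("SUMMARY:")]


async def load_calendar(data: str) -> list[str]:
    loop = asyncio.get_running_loop()
    # run_in_executor(None, ...) runs the blocking call in the default
    # thread pool while the event loop keeps servicing other tasks.
    return await loop.run_in_executor(None, parse_blocking, data)
```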
@@ -106,4 +106,4 @@ class LutronEventEntity(LutronKeypad, EventEntity):
}
self.hass.bus.fire("lutron_event", data)
self._trigger_event(action)
self.async_write_ha_state()
self.schedule_update_ha_state()
@@ -96,7 +96,7 @@ class LutronCasetaTiltOnlyBlind(LutronCasetaDeviceUpdatableEntity, CoverEntity):
async def async_set_cover_tilt_position(self, **kwargs: Any) -> None:
"""Move the blind to a specific tilt."""
self._smartbridge.set_tilt(self.device_id, kwargs[ATTR_TILT_POSITION])
await self._smartbridge.set_tilt(self.device_id, kwargs[ATTR_TILT_POSITION])
PYLUTRON_TYPE_TO_CLASSES = {
@@ -107,9 +107,6 @@ class MatterEntity(Entity):
attr_path_filter=attr_path,
)
)
await self.matter_client.subscribe_attribute(
self._endpoint.node.node_id, sub_paths
)
# subscribe to node (availability changes)
self._unsubscribes.append(
self.matter_client.subscribe_events(
@@ -43,6 +43,21 @@ COLOR_MODE_MAP = {
}
DEFAULT_TRANSITION = 0.2
# there's a bug in (at least) Espressif's implementation of light transitions
# on devices based on Matter 1.0. Mark potential devices with this issue.
# https://github.com/home-assistant/core/issues/113775
# vendorid (attributeKey 0/40/2)
# productid (attributeKey 0/40/4)
# hw version (attributeKey 0/40/8)
# sw version (attributeKey 0/40/10)
TRANSITION_BLOCKLIST = (
(4488, 514, "1.0", "1.0.0"),
(4488, 260, "1.0", "1.0.0"),
(5010, 769, "3.0", "1.0.0"),
(4999, 25057, "1.0", "27.0"),
(4448, 36866, "V1", "V1.0.0.5"),
)
async def async_setup_entry(
hass: HomeAssistant,
@@ -61,6 +76,7 @@ class MatterLight(MatterEntity, LightEntity):
_supports_brightness = False
_supports_color = False
_supports_color_temperature = False
_transitions_disabled = False
async def _set_xy_color(
self, xy_color: tuple[float, float], transition: float = 0.0
@@ -260,6 +276,8 @@ class MatterLight(MatterEntity, LightEntity):
color_temp = kwargs.get(ATTR_COLOR_TEMP)
brightness = kwargs.get(ATTR_BRIGHTNESS)
transition = kwargs.get(ATTR_TRANSITION, DEFAULT_TRANSITION)
if self._transitions_disabled:
transition = 0
if self.supported_color_modes is not None:
if hs_color is not None and ColorMode.HS in self.supported_color_modes:
@@ -295,7 +313,10 @@ class MatterLight(MatterEntity, LightEntity):
# brightness support
if self._entity_info.endpoint.has_attribute(
None, clusters.LevelControl.Attributes.CurrentLevel
):
) and self._entity_info.endpoint.device_types != {device_types.OnOffLight}:
# We need to filter out the OnOffLight device type here because
# that can have an optional LevelControl cluster present
# which we should ignore.
supported_color_modes.add(ColorMode.BRIGHTNESS)
self._supports_brightness = True
# colormode(s)
@@ -333,8 +354,12 @@ class MatterLight(MatterEntity, LightEntity):
supported_color_modes = filter_supported_color_modes(supported_color_modes)
self._attr_supported_color_modes = supported_color_modes
self._check_transition_blocklist()
# flag support for transition as soon as we support setting brightness and/or color
if supported_color_modes != {ColorMode.ONOFF}:
if (
supported_color_modes != {ColorMode.ONOFF}
and not self._transitions_disabled
):
self._attr_supported_features |= LightEntityFeature.TRANSITION
LOGGER.debug(
@@ -373,6 +398,23 @@ class MatterLight(MatterEntity, LightEntity):
else:
self._attr_color_mode = ColorMode.ONOFF
def _check_transition_blocklist(self) -> None:
"""Check if this device is reported to have non working transitions."""
device_info = self._endpoint.device_info
if isinstance(device_info, clusters.BridgedDeviceBasicInformation):
return
if (
device_info.vendorID,
device_info.productID,
device_info.hardwareVersionString,
device_info.softwareVersionString,
) in TRANSITION_BLOCKLIST:
self._transitions_disabled = True
LOGGER.warning(
"Detected a device that has been reported to have firmware issues "
"with light transitions. Transitions will be disabled for this light"
)
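The blocklist check added in `_check_transition_blocklist` boils down to tuple membership on the device's basic-information attributes. A minimal sketch of that lookup, with a hypothetical `DeviceInfo` standing in for the Matter basic-information cluster:

```python
from typing import NamedTuple

# (vendor id, product id, hardware version, software version)
TRANSITION_BLOCKLIST = (
    (4488, 514, "1.0", "1.0.0"),
    (4488, 260, "1.0", "1.0.0"),
    (5010, 769, "3.0", "1.0.0"),
    (4999, 25057, "1.0", "27.0"),
    (4448, 36866, "V1", "V1.0.0.5"),
)


class DeviceInfo(NamedTuple):
    vendor_id: int
    product_id: int
    hw_version: str
    sw_version: str


def transitions_blocked(info: DeviceInfo) -> bool:
    """Return True if this vendor/product/hw/sw combo has known broken transitions."""
    return tuple(info) in TRANSITION_BLOCKLIST
```

Matching on all four fields lets a fixed firmware (new software version string) fall out of the blocklist automatically.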
# Discovery schema(s) to map Matter Attributes to HA entities
DISCOVERY_SCHEMAS = [
@@ -406,11 +448,11 @@ DISCOVERY_SCHEMAS = [
entity_class=MatterLight,
required_attributes=(
clusters.OnOff.Attributes.OnOff,
clusters.LevelControl.Attributes.CurrentLevel,
clusters.ColorControl.Attributes.CurrentHue,
clusters.ColorControl.Attributes.CurrentSaturation,
),
optional_attributes=(
clusters.LevelControl.Attributes.CurrentLevel,
clusters.ColorControl.Attributes.ColorTemperatureMireds,
clusters.ColorControl.Attributes.ColorMode,
clusters.ColorControl.Attributes.CurrentX,
@@ -426,11 +468,11 @@ DISCOVERY_SCHEMAS = [
entity_class=MatterLight,
required_attributes=(
clusters.OnOff.Attributes.OnOff,
clusters.LevelControl.Attributes.CurrentLevel,
clusters.ColorControl.Attributes.CurrentX,
clusters.ColorControl.Attributes.CurrentY,
),
optional_attributes=(
clusters.LevelControl.Attributes.CurrentLevel,
clusters.ColorControl.Attributes.ColorTemperatureMireds,
clusters.ColorControl.Attributes.ColorMode,
clusters.ColorControl.Attributes.CurrentHue,
@@ -451,36 +493,4 @@ DISCOVERY_SCHEMAS = [
),
optional_attributes=(clusters.ColorControl.Attributes.ColorMode,),
),
# Additional schema to match generic dimmable lights with incorrect/missing device type
MatterDiscoverySchema(
platform=Platform.LIGHT,
entity_description=LightEntityDescription(
key="MatterDimmableLightFallback", name=None
),
entity_class=MatterLight,
required_attributes=(
clusters.OnOff.Attributes.OnOff,
clusters.LevelControl.Attributes.CurrentLevel,
),
optional_attributes=(
clusters.ColorControl.Attributes.ColorMode,
clusters.ColorControl.Attributes.CurrentHue,
clusters.ColorControl.Attributes.CurrentSaturation,
clusters.ColorControl.Attributes.CurrentX,
clusters.ColorControl.Attributes.CurrentY,
clusters.ColorControl.Attributes.ColorTemperatureMireds,
),
# important: make sure to rule out all device types that are also based on the
# onoff and levelcontrol clusters !
not_device_type=(
device_types.Fan,
device_types.GenericSwitch,
device_types.OnOffPlugInUnit,
device_types.HeatingCoolingUnit,
device_types.Pump,
device_types.CastingVideoClient,
device_types.VideoRemoteControl,
device_types.Speaker,
),
),
]
@@ -6,6 +6,6 @@
"dependencies": ["websocket_api"],
"documentation": "https://www.home-assistant.io/integrations/matter",
"iot_class": "local_push",
"requirements": ["python-matter-server==5.7.0"],
"requirements": ["python-matter-server==5.10.0"],
"zeroconf": ["_matter._tcp.local.", "_matterc._udp.local."]
}
@@ -81,12 +81,8 @@ DISCOVERY_SCHEMAS = [
device_types.ColorTemperatureLight,
device_types.DimmableLight,
device_types.ExtendedColorLight,
device_types.OnOffLight,
device_types.DoorLock,
device_types.ColorDimmerSwitch,
device_types.DimmerSwitch,
device_types.Thermostat,
device_types.RoomAirConditioner,
device_types.OnOffLight,
),
),
]
@@ -245,7 +245,7 @@ async def async_modbus_setup(
translation_key="deprecated_restart",
)
_LOGGER.warning(
"`modbus.restart`: is deprecated and will be removed in version 2024.11"
"`modbus.restart` is deprecated and will be removed in version 2024.11"
)
async_dispatcher_send(hass, SIGNAL_START_ENTITY)
hub = hub_collect[service.data[ATTR_HUB]]
@@ -25,19 +25,12 @@ from homeassistant.const import (
CONF_PORT,
CONF_PROTOCOL,
CONF_USERNAME,
EVENT_HOMEASSISTANT_STARTED,
EVENT_HOMEASSISTANT_STOP,
)
from homeassistant.core import (
CALLBACK_TYPE,
CoreState,
Event,
HassJob,
HomeAssistant,
callback,
)
from homeassistant.core import CALLBACK_TYPE, Event, HassJob, HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.dispatcher import async_dispatcher_send
from homeassistant.helpers.start import async_at_started
from homeassistant.helpers.typing import ConfigType
from homeassistant.loader import bind_hass
from homeassistant.util import dt as dt_util
@@ -90,8 +83,18 @@ if TYPE_CHECKING:
_LOGGER = logging.getLogger(__name__)
DISCOVERY_COOLDOWN = 2
INITIAL_SUBSCRIBE_COOLDOWN = 1.0
MIN_BUFFER_SIZE = 131072 # Minimum buffer size to use if preferred size fails
PREFERRED_BUFFER_SIZE = 2097152 # Set receive buffer size to 2MB
DISCOVERY_COOLDOWN = 5
# The initial subscribe cooldown controls how long to wait to group
# subscriptions together. This is to avoid making too many subscribe
# requests in a short period of time. If the number is too low, the
# system will be flooded with subscribe requests. If the number is too
# high, we risk being flooded with responses to the subscribe requests
# which can exceed the receive buffer size of the socket. To mitigate
# this, we increase the receive buffer size of the socket as well.
INITIAL_SUBSCRIBE_COOLDOWN = 0.5
SUBSCRIBE_COOLDOWN = 0.1
UNSUBSCRIBE_COOLDOWN = 0.1
TIMEOUT_ACK = 10
@@ -324,7 +327,7 @@ class EnsureJobAfterCooldown:
self._loop = asyncio.get_running_loop()
self._timeout = timeout
self._callback = callback_job
self._task: asyncio.Future | None = None
self._task: asyncio.Task | None = None
self._timer: asyncio.TimerHandle | None = None
def set_timeout(self, timeout: float) -> None:
@@ -339,22 +342,23 @@ class EnsureJobAfterCooldown:
_LOGGER.error("%s", ha_error)
@callback
def _async_task_done(self, task: asyncio.Future) -> None:
def _async_task_done(self, task: asyncio.Task) -> None:
"""Handle task done."""
self._task = None
@callback
def _async_execute(self) -> None:
def async_execute(self) -> asyncio.Task:
"""Execute the job."""
if self._task:
# Task already running,
# so we schedule another run
self.async_schedule()
return
return self._task
self._async_cancel_timer()
self._task = create_eager_task(self._async_job())
self._task.add_done_callback(self._async_task_done)
return self._task
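The change above makes `async_execute` public and returns the running task so callers can await a flush of the debouncer. A minimal sketch of that cooldown-debouncer pattern with plain asyncio (not HA's actual helper; names are illustrative):

```python
import asyncio


class CooldownJob:
    """Run an async job once the cooldown elapses; coalesce repeated requests."""

    def __init__(self, timeout: float, job) -> None:
        self._timeout = timeout
        self._job = job  # async callable taking no arguments
        self._timer: asyncio.TimerHandle | None = None
        self._task: asyncio.Task | None = None

    def schedule(self) -> None:
        # Restart the cooldown window on every call.
        if self._timer:
            self._timer.cancel()
        loop = asyncio.get_running_loop()
        self._timer = loop.call_later(self._timeout, self.execute)

    def execute(self) -> asyncio.Task:
        # Flush immediately; returning the task lets callers await completion.
        if self._task:
            # Already running, so schedule another run afterwards.
            self.schedule()
            return self._task
        if self._timer:
            self._timer.cancel()
            self._timer = None
        self._task = asyncio.create_task(self._job())
        self._task.add_done_callback(lambda _t: setattr(self, "_task", None))
        return self._task
```

Calling `await cooldown_job.execute()` both flushes pending work and blocks until it is done — the property the birth-message path relies on.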
@callback
def _async_cancel_timer(self) -> None:
@@ -369,7 +373,7 @@ class EnsureJobAfterCooldown:
# We want to reschedule the timer in the future
# every time this is called.
self._async_cancel_timer()
self._timer = self._loop.call_later(self._timeout, self._async_execute)
self._timer = self._loop.call_later(self._timeout, self.async_execute)
async def async_cleanup(self) -> None:
"""Cleanup any pending task."""
@@ -429,24 +433,22 @@ class MQTT:
UNSUBSCRIBE_COOLDOWN, self._async_perform_unsubscribes
)
self._pending_unsubscribes: set[str] = set() # topic
if self.hass.state is CoreState.running:
self._ha_started.set()
else:
@callback
def ha_started(_: Event) -> None:
self._ha_started.set()
self.hass.bus.async_listen_once(EVENT_HOMEASSISTANT_STARTED, ha_started)
async def async_stop_mqtt(_event: Event) -> None:
"""Stop MQTT component."""
await self.async_disconnect()
self._cleanup_on_unload.append(
hass.bus.async_listen_once(EVENT_HOMEASSISTANT_STOP, async_stop_mqtt)
self._cleanup_on_unload.extend(
(
async_at_started(hass, self._async_ha_started),
hass.bus.async_listen(EVENT_HOMEASSISTANT_STOP, self._async_ha_stop),
)
)
self._socket_buffersize: int | None = None
@callback
def _async_ha_started(self, _hass: HomeAssistant) -> None:
"""Handle HA started."""
self._ha_started.set()
async def _async_ha_stop(self, _event: Event) -> None:
"""Handle HA stop."""
await self.async_disconnect()
def start(
self,
@@ -501,6 +503,9 @@ class MQTT:
mqttc.on_subscribe = self._async_mqtt_on_callback
mqttc.on_unsubscribe = self._async_mqtt_on_callback
# suppress exceptions at callback
mqttc.suppress_exceptions = True
if will := self.conf.get(CONF_WILL_MESSAGE, DEFAULT_WILL):
will_message = PublishMessage(**will)
mqttc.will_set(
@@ -535,6 +540,37 @@ class MQTT:
self.hass, self._misc_loop(), name="mqtt misc loop"
)
def _increase_socket_buffer_size(self, sock: SocketType) -> None:
"""Increase the socket buffer size."""
if not hasattr(sock, "setsockopt") and hasattr(sock, "_socket"):
# The WebsocketWrapper does not wrap setsockopt
# so we need to get the underlying socket
# Remove this once
# https://github.com/eclipse/paho.mqtt.python/pull/843
# is available.
sock = sock._socket # pylint: disable=protected-access
new_buffer_size = PREFERRED_BUFFER_SIZE
while True:
try:
# Some operating systems do not allow us to set the preferred
# buffer size. In that case we try some other size options.
sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, new_buffer_size)
except OSError as err:
if new_buffer_size <= MIN_BUFFER_SIZE:
_LOGGER.warning(
"Unable to increase the socket buffer size to %s; "
"The connection may be unstable if the MQTT broker "
"sends data at volume or a large amount of subscriptions "
"need to be processed: %s",
new_buffer_size,
err,
)
return
new_buffer_size //= 2
else:
return
def _on_socket_open(
self, client: mqtt.Client, userdata: Any, sock: SocketType
) -> None:
@@ -551,6 +587,7 @@ class MQTT:
fileno = sock.fileno()
_LOGGER.debug("%s: connection opened %s", self.config_entry.title, fileno)
if fileno > -1:
self._increase_socket_buffer_size(sock)
self.loop.add_reader(sock, partial(self._async_reader_callback, client))
self._async_start_misc_loop()
@@ -856,7 +893,7 @@ class MQTT:
for topic, qos in subscriptions.items():
_LOGGER.debug("Subscribing to %s, mid: %s, qos: %s", topic, mid, qos)
self._last_subscribe = time.time()
self._last_subscribe = time.monotonic()
if result == 0:
await self._wait_for_mid(mid)
@@ -878,6 +915,25 @@ class MQTT:
await self._wait_for_mid(mid)
async def _async_resubscribe_and_publish_birth_message(
self, birth_message: PublishMessage
) -> None:
"""Resubscribe to all topics and publish birth message."""
await self._async_perform_subscriptions()
await self._ha_started.wait() # Wait for Home Assistant to start
await self._discovery_cooldown() # Wait for MQTT discovery to cool down
# Update subscribe cooldown period to a shorter time
# and make sure we flush the debouncer
await self._subscribe_debouncer.async_execute()
self._subscribe_debouncer.set_timeout(SUBSCRIBE_COOLDOWN)
await self.async_publish(
topic=birth_message.topic,
payload=birth_message.payload,
qos=birth_message.qos,
retain=birth_message.retain,
)
_LOGGER.info("MQTT client initialized, birth message sent")
@callback
def _async_mqtt_on_connect(
self,
@@ -919,36 +975,34 @@ class MQTT:
result_code,
)
self.hass.async_create_task(self._async_resubscribe())
self._async_queue_resubscribe()
birth: dict[str, Any]
if birth := self.conf.get(CONF_BIRTH_MESSAGE, DEFAULT_BIRTH):
async def publish_birth_message(birth_message: PublishMessage) -> None:
await self._ha_started.wait() # Wait for Home Assistant to start
await self._discovery_cooldown() # Wait for MQTT discovery to cool down
# Update subscribe cooldown period to a shorter time
self._subscribe_debouncer.set_timeout(SUBSCRIBE_COOLDOWN)
await self.async_publish(
topic=birth_message.topic,
payload=birth_message.payload,
qos=birth_message.qos,
retain=birth_message.retain,
)
birth_message = PublishMessage(**birth)
self.config_entry.async_create_background_task(
self.hass,
publish_birth_message(birth_message),
name="mqtt birth message",
self._async_resubscribe_and_publish_birth_message(birth_message),
name="mqtt re-subscribe and birth",
)
else:
# Update subscribe cooldown period to a shorter time
self.config_entry.async_create_background_task(
self.hass,
self._async_perform_subscriptions(),
name="mqtt re-subscribe",
)
self._subscribe_debouncer.set_timeout(SUBSCRIBE_COOLDOWN)
_LOGGER.info("MQTT client initialized")
self._async_connection_result(True)
async def _async_resubscribe(self) -> None:
"""Resubscribe on reconnect."""
@callback
def _async_queue_resubscribe(self) -> None:
"""Queue subscriptions on reconnect.
self._async_perform_subscriptions must be called
after this method to actually subscribe.
"""
self._max_qos.clear()
self._retained_topics.clear()
# Group subscriptions to only re-subscribe once for each topic.
@@ -963,7 +1017,6 @@ class MQTT:
],
queue_only=True,
)
await self._async_perform_subscriptions()
@lru_cache(None) # pylint: disable=method-cache-max-size-none
def _matching_subscriptions(self, topic: str) -> list[Subscription]:
@@ -981,10 +1034,21 @@ class MQTT:
def _async_mqtt_on_message(
self, _mqttc: mqtt.Client, _userdata: None, msg: mqtt.MQTTMessage
) -> None:
topic = msg.topic
# msg.topic is a property that decodes the topic to a string
# every time it is accessed. Save the result to avoid
# decoding the same topic multiple times.
try:
# msg.topic is a property that decodes the topic to a string
# every time it is accessed. Save the result to avoid
# decoding the same topic multiple times.
topic = msg.topic
except UnicodeDecodeError:
bare_topic: bytes = getattr(msg, "_topic")
_LOGGER.warning(
"Skipping received%s message on invalid topic %s (qos=%s): %s",
" retained" if msg.retain else "",
bare_topic,
msg.qos,
msg.payload[0:8192],
)
return
_LOGGER.debug(
"Received%s message on %s (qos=%s): %s",
" retained" if msg.retain else "",
@@ -1052,7 +1116,9 @@ class MQTT:
# The callback signature for on_unsubscribe is different from on_subscribe
# see https://github.com/eclipse/paho.mqtt.python/issues/687
# properties and reasoncodes are not used in Home Assistant
self.hass.async_create_task(self._mqtt_handle_mid(mid))
self.config_entry.async_create_task(
self.hass, self._mqtt_handle_mid(mid), name=f"mqtt handle mid {mid}"
)
async def _mqtt_handle_mid(self, mid: int) -> None:
# Create the mid event if not created, either _mqtt_handle_mid or _wait_for_mid
@@ -1117,7 +1183,7 @@ class MQTT:
async def _discovery_cooldown(self) -> None:
"""Wait until all discovery and subscriptions are processed."""
now = time.time()
now = time.monotonic()
# Reset discovery and subscribe cooldowns
self._mqtt_data.last_discovery = now
self._last_subscribe = now
@@ -1129,7 +1195,7 @@ class MQTT:
)
while now < wait_until:
await asyncio.sleep(wait_until - now)
now = time.time()
now = time.monotonic()
last_discovery = self._mqtt_data.last_discovery
last_subscribe = (
now if self._pending_subscriptions else self._last_subscribe
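The hunks above replace `time.time()` with `time.monotonic()` for the cooldown bookkeeping, so wall-clock adjustments cannot distort the wait. A minimal, hypothetical sketch of that cooldown loop (the names and the 0.05 s window are illustrative, not Home Assistant's):

```python
import asyncio
import time

COOLDOWN = 0.05  # illustrative cooldown window in seconds

class CooldownWaiter:
    """Wait until no activity has been recorded for COOLDOWN seconds."""

    def __init__(self) -> None:
        self.last_activity = time.monotonic()

    def record_activity(self) -> None:
        self.last_activity = time.monotonic()

    async def wait_for_quiet(self) -> None:
        now = time.monotonic()
        wait_until = self.last_activity + COOLDOWN
        while now < wait_until:
            await asyncio.sleep(wait_until - now)
            now = time.monotonic()
            # Activity during the sleep pushes the deadline forward.
            wait_until = self.last_activity + COOLDOWN

async def demo() -> bool:
    waiter = CooldownWaiter()
    start = time.monotonic()
    await waiter.wait_for_quiet()
    return time.monotonic() - start >= COOLDOWN * 0.9

print(asyncio.run(demo()))
```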
@@ -177,7 +177,7 @@ async def async_start( # noqa: C901
@callback
def async_discovery_message_received(msg: ReceiveMessage) -> None: # noqa: C901
"""Process the received message."""
mqtt_data.last_discovery = time.time()
mqtt_data.last_discovery = time.monotonic()
payload = msg.payload
topic = msg.topic
topic_trimmed = topic.replace(f"{discovery_topic}/", "", 1)
@@ -370,7 +370,7 @@ async def async_start( # noqa: C901
)
)
mqtt_data.last_discovery = time.time()
mqtt_data.last_discovery = time.monotonic()
mqtt_integrations = await async_get_mqtt(hass)
for integration, topics in mqtt_integrations.items():
@@ -1015,8 +1015,7 @@ class MqttDiscoveryUpdate(Entity):
self.hass.async_create_task(
_async_process_discovery_update_and_remove(
payload, self._discovery_data
),
eager_start=False,
)
)
elif self._discovery_update:
if old_payload != self._discovery_data[ATTR_DISCOVERY_PAYLOAD]:
@@ -1025,8 +1024,7 @@ class MqttDiscoveryUpdate(Entity):
self.hass.async_create_task(
_async_process_discovery_update(
payload, self._discovery_update, self._discovery_data
),
eager_start=False,
)
)
else:
# Non-empty, unchanged payload: Ignore to avoid changing states
@@ -1059,6 +1057,15 @@ class MqttDiscoveryUpdate(Entity):
# rediscovered after a restart
await async_remove_discovery_payload(self.hass, self._discovery_data)
@final
async def add_to_platform_finish(self) -> None:
"""Finish adding entity to platform."""
await super().add_to_platform_finish()
# Only send the discovery done after the entity is fully added
# and the state is written to the state machine.
if self._discovery_data is not None:
send_discovery_done(self.hass, self._discovery_data)
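The override above defers `send_discovery_done` until `add_to_platform_finish` has run, i.e. until after the entity's first state write. A toy sketch of that ordering guarantee (class and event names are illustrative, not the real MQTT entity classes):

```python
import asyncio

events: list[str] = []

class Entity:
    async def add_to_platform_finish(self) -> None:
        # In the real code this is where the state is written.
        events.append("state_written")

class DiscoveredEntity(Entity):
    async def add_to_platform_finish(self) -> None:
        await super().add_to_platform_finish()
        # Acknowledge discovery only after the base class finished,
        # so the ack can never race the first state write.
        events.append("discovery_done")

asyncio.run(DiscoveredEntity().add_to_platform_finish())
print(events)
```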
@callback
def add_to_platform_abort(self) -> None:
"""Abort adding an entity to a platform."""
@@ -1218,8 +1225,6 @@ class MqttEntity(
self._prepare_subscribe_topics()
await self._subscribe_topics()
await self.mqtt_async_added_to_hass()
if self._discovery_data is not None:
send_discovery_done(self.hass, self._discovery_data)
async def mqtt_async_added_to_hass(self) -> None:
"""Call before the discovery message is acknowledged.
@@ -10,7 +10,7 @@ import socket
import sys
from typing import Any
from mysensors import BaseAsyncGateway, Message, Sensor, mysensors
from mysensors import BaseAsyncGateway, Message, Sensor, get_const, mysensors
import voluptuous as vol
from homeassistant.components.mqtt import (
@@ -24,6 +24,7 @@ from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_DEVICE, EVENT_HOMEASSISTANT_STOP
from homeassistant.core import Event, HomeAssistant, callback
import homeassistant.helpers.config_validation as cv
from homeassistant.setup import SetupPhases, async_pause_setup
from homeassistant.util.unit_system import METRIC_SYSTEM
from .const import (
@@ -162,6 +163,12 @@ async def _get_gateway(
) -> BaseAsyncGateway | None:
"""Return gateway after setup of the gateway."""
with async_pause_setup(hass, SetupPhases.WAIT_IMPORT_PACKAGES):
# get_const will import a const module based on the version
# so we need to import it here to avoid it being imported
# in the event loop
await hass.async_add_import_executor_job(get_const, version)
if persistence_file is not None:
# Interpret relative paths to be in hass config folder.
# Absolute paths will be left as they are.
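The mysensors hunk above imports the version-dependent const module in an executor job, under `async_pause_setup`, so the blocking import does not stall the event loop. A minimal sketch of running an import off the loop, with `importlib.import_module` standing in for `mysensors.get_const`:

```python
import asyncio
import importlib

async def import_off_loop(module_name: str):
    """Run a potentially slow, blocking import in a worker thread."""
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(None, importlib.import_module, module_name)

async def main() -> str:
    mod = await import_off_loop("json")  # "json" is only an example module
    return mod.__name__

print(asyncio.run(main()))
```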
@@ -5,18 +5,17 @@ from __future__ import annotations
from collections.abc import Awaitable, Callable
from dataclasses import dataclass
import datetime
from functools import partial
import logging
from typing import TYPE_CHECKING
from pynws import SimpleNWS
from pynws import SimpleNWS, call_with_retry
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_API_KEY, CONF_LATITUDE, CONF_LONGITUDE, Platform
from homeassistant.core import HomeAssistant, callback
from homeassistant.core import HomeAssistant
from homeassistant.helpers import debounce
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.device_registry import DeviceEntryType, DeviceInfo
from homeassistant.helpers.event import async_track_point_in_utc_time
from homeassistant.helpers.update_coordinator import TimestampDataUpdateCoordinator
from homeassistant.util.dt import utcnow
@@ -27,8 +26,10 @@ _LOGGER = logging.getLogger(__name__)
PLATFORMS = [Platform.SENSOR, Platform.WEATHER]
DEFAULT_SCAN_INTERVAL = datetime.timedelta(minutes=10)
FAILED_SCAN_INTERVAL = datetime.timedelta(minutes=1)
DEBOUNCE_TIME = 60 # in seconds
RETRY_INTERVAL = datetime.timedelta(minutes=1)
RETRY_STOP = datetime.timedelta(minutes=10)
DEBOUNCE_TIME = 10 * 60 # in seconds
def base_unique_id(latitude: float, longitude: float) -> str:
@@ -41,62 +42,9 @@ class NWSData:
"""Data for the National Weather Service integration."""
api: SimpleNWS
coordinator_observation: NwsDataUpdateCoordinator
coordinator_forecast: NwsDataUpdateCoordinator
coordinator_forecast_hourly: NwsDataUpdateCoordinator
class NwsDataUpdateCoordinator(TimestampDataUpdateCoordinator[None]): # pylint: disable=hass-enforce-coordinator-module
"""NWS data update coordinator.
Implements faster data update intervals for failed updates and exposes a last successful update time.
"""
def __init__(
self,
hass: HomeAssistant,
logger: logging.Logger,
*,
name: str,
update_interval: datetime.timedelta,
failed_update_interval: datetime.timedelta,
update_method: Callable[[], Awaitable[None]] | None = None,
request_refresh_debouncer: debounce.Debouncer | None = None,
) -> None:
"""Initialize NWS coordinator."""
super().__init__(
hass,
logger,
name=name,
update_interval=update_interval,
update_method=update_method,
request_refresh_debouncer=request_refresh_debouncer,
)
self.failed_update_interval = failed_update_interval
@callback
def _schedule_refresh(self) -> None:
"""Schedule a refresh."""
if self._unsub_refresh:
self._unsub_refresh()
self._unsub_refresh = None
# We _floor_ utcnow to create a schedule on a rounded second,
# minimizing the time between the point and the real activation.
# That way we obtain a constant update frequency,
# as long as the update process takes less than a second
if self.last_update_success:
if TYPE_CHECKING:
# the base class allows None, but this one doesn't
assert self.update_interval is not None
update_interval = self.update_interval
else:
update_interval = self.failed_update_interval
self._unsub_refresh = async_track_point_in_utc_time(
self.hass,
self._handle_refresh_interval,
utcnow().replace(microsecond=0) + update_interval,
)
coordinator_observation: TimestampDataUpdateCoordinator[None]
coordinator_forecast: TimestampDataUpdateCoordinator[None]
coordinator_forecast_hourly: TimestampDataUpdateCoordinator[None]
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
@@ -112,41 +60,72 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
nws_data = SimpleNWS(latitude, longitude, api_key, client_session)
await nws_data.set_station(station)
async def update_observation() -> None:
"""Retrieve recent observations."""
await nws_data.update_observation(start_time=utcnow() - UPDATE_TIME_PERIOD)
def async_setup_update_observation(
retry_interval: datetime.timedelta | float,
retry_stop: datetime.timedelta | float,
) -> Callable[[], Awaitable[None]]:
async def update_observation() -> None:
"""Retrieve recent observations."""
await call_with_retry(
nws_data.update_observation,
retry_interval,
retry_stop,
start_time=utcnow() - UPDATE_TIME_PERIOD,
)
coordinator_observation = NwsDataUpdateCoordinator(
return update_observation
def async_setup_update_forecast(
retry_interval: datetime.timedelta | float,
retry_stop: datetime.timedelta | float,
) -> Callable[[], Awaitable[None]]:
return partial(
call_with_retry,
nws_data.update_forecast,
retry_interval,
retry_stop,
)
def async_setup_update_forecast_hourly(
retry_interval: datetime.timedelta | float,
retry_stop: datetime.timedelta | float,
) -> Callable[[], Awaitable[None]]:
return partial(
call_with_retry,
nws_data.update_forecast_hourly,
retry_interval,
retry_stop,
)
# Don't use retries in setup
coordinator_observation = TimestampDataUpdateCoordinator(
hass,
_LOGGER,
name=f"NWS observation station {station}",
update_method=update_observation,
update_method=async_setup_update_observation(0, 0),
update_interval=DEFAULT_SCAN_INTERVAL,
failed_update_interval=FAILED_SCAN_INTERVAL,
request_refresh_debouncer=debounce.Debouncer(
hass, _LOGGER, cooldown=DEBOUNCE_TIME, immediate=True
),
)
coordinator_forecast = NwsDataUpdateCoordinator(
coordinator_forecast = TimestampDataUpdateCoordinator(
hass,
_LOGGER,
name=f"NWS forecast station {station}",
update_method=nws_data.update_forecast,
update_method=async_setup_update_forecast(0, 0),
update_interval=DEFAULT_SCAN_INTERVAL,
failed_update_interval=FAILED_SCAN_INTERVAL,
request_refresh_debouncer=debounce.Debouncer(
hass, _LOGGER, cooldown=DEBOUNCE_TIME, immediate=True
),
)
coordinator_forecast_hourly = NwsDataUpdateCoordinator(
coordinator_forecast_hourly = TimestampDataUpdateCoordinator(
hass,
_LOGGER,
name=f"NWS forecast hourly station {station}",
update_method=nws_data.update_forecast_hourly,
update_method=async_setup_update_forecast_hourly(0, 0),
update_interval=DEFAULT_SCAN_INTERVAL,
failed_update_interval=FAILED_SCAN_INTERVAL,
request_refresh_debouncer=debounce.Debouncer(
hass, _LOGGER, cooldown=DEBOUNCE_TIME, immediate=True
),
@@ -164,6 +143,17 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
await coordinator_forecast.async_refresh()
await coordinator_forecast_hourly.async_refresh()
# Use retries
coordinator_observation.update_method = async_setup_update_observation(
RETRY_INTERVAL, RETRY_STOP
)
coordinator_forecast.update_method = async_setup_update_forecast(
RETRY_INTERVAL, RETRY_STOP
)
coordinator_forecast_hourly.update_method = async_setup_update_forecast_hourly(
RETRY_INTERVAL, RETRY_STOP
)
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True
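The NWS hunks above use a factory plus `functools.partial` to build update methods with bound retry parameters: setup runs with zero retries, then the coordinators are switched to retrying update methods once setup succeeds. A sketch of that two-phase wiring, with a simplified stand-in for `pynws.call_with_retry` (its real semantics may differ):

```python
import asyncio
from functools import partial

async def call_with_retry(func, retry_interval: float, retry_stop: float):
    """Simplified retry wrapper: retry every retry_interval seconds
    until roughly retry_stop seconds of retrying have elapsed."""
    elapsed = 0.0
    while True:
        try:
            return await func()
        except OSError:
            if elapsed >= retry_stop:
                raise
            await asyncio.sleep(retry_interval)
            elapsed += retry_interval

def make_update(func, retry_interval: float, retry_stop: float):
    # Mirrors async_setup_update_forecast: bind retry settings up front.
    return partial(call_with_retry, func, retry_interval, retry_stop)

attempts = 0

async def flaky_fetch() -> str:
    global attempts
    attempts += 1
    if attempts < 3:
        raise OSError("transient")
    return "ok"

# With (0, 0) the first failure propagates immediately, matching
# "Don't use retries in setup"; here we enable retries instead.
update = make_update(flaky_fetch, 0.01, 1.0)
print(asyncio.run(update()))
```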
@@ -7,5 +7,5 @@
"iot_class": "cloud_polling",
"loggers": ["metar", "pynws"],
"quality_scale": "platinum",
"requirements": ["pynws==1.6.0"]
"requirements": ["pynws[retry]==1.7.0"]
}
@@ -25,7 +25,10 @@ from homeassistant.const import (
)
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from homeassistant.helpers.update_coordinator import (
CoordinatorEntity,
TimestampDataUpdateCoordinator,
)
from homeassistant.util.dt import utcnow
from homeassistant.util.unit_conversion import (
DistanceConverter,
@@ -34,7 +37,7 @@ from homeassistant.util.unit_conversion import (
)
from homeassistant.util.unit_system import US_CUSTOMARY_SYSTEM
from . import NWSData, NwsDataUpdateCoordinator, base_unique_id, device_info
from . import NWSData, base_unique_id, device_info
from .const import ATTRIBUTION, CONF_STATION, DOMAIN, OBSERVATION_VALID_TIME
PARALLEL_UPDATES = 0
@@ -158,7 +161,7 @@ async def async_setup_entry(
)
class NWSSensor(CoordinatorEntity[NwsDataUpdateCoordinator], SensorEntity):
class NWSSensor(CoordinatorEntity[TimestampDataUpdateCoordinator[None]], SensorEntity):
"""An NWS Sensor Entity."""
entity_description: NWSSensorEntityDescription
@@ -2,6 +2,7 @@
from __future__ import annotations
from functools import partial
from types import MappingProxyType
from typing import TYPE_CHECKING, Any, cast
@@ -34,7 +35,6 @@ from homeassistant.const import (
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import entity_registry as er
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.util.dt import utcnow
from homeassistant.util.unit_conversion import SpeedConverter, TemperatureConverter
from . import NWSData, base_unique_id, device_info
@@ -46,7 +46,6 @@ from .const import (
DOMAIN,
FORECAST_VALID_TIME,
HOURLY,
OBSERVATION_VALID_TIME,
)
PARALLEL_UPDATES = 0
@@ -140,96 +139,69 @@ class NWSWeather(CoordinatorWeatherEntity):
self.nws = nws_data.api
latitude = entry_data[CONF_LATITUDE]
longitude = entry_data[CONF_LONGITUDE]
self.coordinator_forecast_legacy = nws_data.coordinator_forecast
self.station = self.nws.station
self.observation: dict[str, Any] | None = None
self._forecast_hourly: list[dict[str, Any]] | None = None
self._forecast_legacy: list[dict[str, Any]] | None = None
self._forecast_twice_daily: list[dict[str, Any]] | None = None
self.station = self.nws.station
self._attr_unique_id = _calculate_unique_id(entry_data, DAYNIGHT)
self._attr_device_info = device_info(latitude, longitude)
self._attr_name = self.station
async def async_added_to_hass(self) -> None:
"""Set up a listener and load data."""
"""When entity is added to hass."""
await super().async_added_to_hass()
self.async_on_remove(
self.coordinator_forecast_legacy.async_add_listener(
self._handle_legacy_forecast_coordinator_update
self.async_on_remove(partial(self._remove_forecast_listener, "daily"))
self.async_on_remove(partial(self._remove_forecast_listener, "hourly"))
self.async_on_remove(partial(self._remove_forecast_listener, "twice_daily"))
for forecast_type in ("twice_daily", "hourly"):
if (coordinator := self.forecast_coordinators[forecast_type]) is None:
continue
self.unsub_forecast[forecast_type] = coordinator.async_add_listener(
partial(self._handle_forecast_update, forecast_type)
)
)
# Load initial data from coordinators
self._handle_coordinator_update()
self._handle_hourly_forecast_coordinator_update()
self._handle_twice_daily_forecast_coordinator_update()
self._handle_legacy_forecast_coordinator_update()
@callback
def _handle_coordinator_update(self) -> None:
"""Load data from integration."""
self.observation = self.nws.observation
self.async_write_ha_state()
@callback
def _handle_hourly_forecast_coordinator_update(self) -> None:
"""Handle updated data from the hourly forecast coordinator."""
self._forecast_hourly = self.nws.forecast_hourly
@callback
def _handle_twice_daily_forecast_coordinator_update(self) -> None:
"""Handle updated data from the twice daily forecast coordinator."""
self._forecast_twice_daily = self.nws.forecast
@callback
def _handle_legacy_forecast_coordinator_update(self) -> None:
"""Handle updated data from the legacy forecast coordinator."""
self._forecast_legacy = self.nws.forecast
self.async_write_ha_state()
@property
def native_temperature(self) -> float | None:
"""Return the current temperature."""
if self.observation:
return self.observation.get("temperature")
if observation := self.nws.observation:
return observation.get("temperature")
return None
@property
def native_pressure(self) -> int | None:
"""Return the current pressure."""
if self.observation:
return self.observation.get("seaLevelPressure")
if observation := self.nws.observation:
return observation.get("seaLevelPressure")
return None
@property
def humidity(self) -> float | None:
"""Return the relative humidity."""
if self.observation:
return self.observation.get("relativeHumidity")
if observation := self.nws.observation:
return observation.get("relativeHumidity")
return None
@property
def native_wind_speed(self) -> float | None:
"""Return the current windspeed."""
if self.observation:
return self.observation.get("windSpeed")
if observation := self.nws.observation:
return observation.get("windSpeed")
return None
@property
def wind_bearing(self) -> int | None:
"""Return the current wind bearing (degrees)."""
if self.observation:
return self.observation.get("windDirection")
if observation := self.nws.observation:
return observation.get("windDirection")
return None
@property
def condition(self) -> str | None:
"""Return current condition."""
weather = None
if self.observation:
weather = self.observation.get("iconWeather")
time = cast(str, self.observation.get("iconTime"))
if observation := self.nws.observation:
weather = observation.get("iconWeather")
time = cast(str, observation.get("iconTime"))
if weather:
return convert_condition(time, weather)
@@ -238,8 +210,8 @@ class NWSWeather(CoordinatorWeatherEntity):
@property
def native_visibility(self) -> int | None:
"""Return visibility."""
if self.observation:
return self.observation.get("visibility")
if observation := self.nws.observation:
return observation.get("visibility")
return None
def _forecast(
@@ -302,33 +274,12 @@ class NWSWeather(CoordinatorWeatherEntity):
@callback
def _async_forecast_hourly(self) -> list[Forecast] | None:
"""Return the hourly forecast in native units."""
return self._forecast(self._forecast_hourly, HOURLY)
return self._forecast(self.nws.forecast_hourly, HOURLY)
@callback
def _async_forecast_twice_daily(self) -> list[Forecast] | None:
"""Return the twice daily forecast in native units."""
return self._forecast(self._forecast_twice_daily, DAYNIGHT)
@property
def available(self) -> bool:
"""Return if state is available."""
last_success = (
self.coordinator.last_update_success
and self.coordinator_forecast_legacy.last_update_success
)
if (
self.coordinator.last_update_success_time
and self.coordinator_forecast_legacy.last_update_success_time
):
last_success_time = (
utcnow() - self.coordinator.last_update_success_time
< OBSERVATION_VALID_TIME
and utcnow() - self.coordinator_forecast_legacy.last_update_success_time
< FORECAST_VALID_TIME
)
else:
last_success_time = False
return last_success or last_success_time
return self._forecast(self.nws.forecast, DAYNIGHT)
async def async_update(self) -> None:
"""Update the entity.
@@ -336,4 +287,7 @@ class NWSWeather(CoordinatorWeatherEntity):
Only used by the generic entity update service.
"""
await self.coordinator.async_request_refresh()
await self.coordinator_forecast_legacy.async_request_refresh()
for forecast_type in ("twice_daily", "hourly"):
if (coordinator := self.forecast_coordinators[forecast_type]) is not None:
await coordinator.async_request_refresh()
@@ -7,6 +7,7 @@ from homeassistant.helpers import config_entry_oauth2_flow
from . import api, config_flow
from .const import DOMAIN
from .coordinator import OndiloIcoCoordinator
from .oauth_impl import OndiloOauth2Implementation
PLATFORMS = [Platform.SENSOR]
@@ -26,8 +27,13 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
)
)
hass.data.setdefault(DOMAIN, {})
hass.data[DOMAIN][entry.entry_id] = api.OndiloClient(hass, entry, implementation)
coordinator = OndiloIcoCoordinator(
hass, api.OndiloClient(hass, entry, implementation)
)
await coordinator.async_config_entry_first_refresh()
hass.data.setdefault(DOMAIN, {})[entry.entry_id] = coordinator
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
@@ -0,0 +1,37 @@
"""Define an object to coordinate fetching Ondilo ICO data."""
from datetime import timedelta
import logging
from typing import Any
from ondilo import OndiloError
from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from . import DOMAIN
from .api import OndiloClient
_LOGGER = logging.getLogger(__name__)
class OndiloIcoCoordinator(DataUpdateCoordinator[list[dict[str, Any]]]):
"""Class to manage fetching Ondilo ICO data from API."""
def __init__(self, hass: HomeAssistant, api: OndiloClient) -> None:
"""Initialize."""
super().__init__(
hass,
logger=_LOGGER,
name=DOMAIN,
update_interval=timedelta(minutes=20),
)
self.api = api
async def _async_update_data(self) -> list[dict[str, Any]]:
"""Fetch data from API endpoint."""
try:
return await self.hass.async_add_executor_job(self.api.get_all_pools_data)
except OndiloError as err:
raise UpdateFailed(f"Error communicating with API: {err}") from err
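The new coordinator above offloads the blocking Ondilo client call to an executor and translates library errors into `UpdateFailed`. A self-contained sketch of that fetch pattern, with hypothetical stand-ins for the Home Assistant and ondilo classes:

```python
import asyncio

class UpdateFailed(Exception):
    """Stand-in for homeassistant.helpers.update_coordinator.UpdateFailed."""

class ApiError(Exception):
    """Stand-in for ondilo.OndiloError."""

def get_all_pools_data() -> list[dict]:
    # Blocking client call, as OndiloClient.get_all_pools_data would be.
    return [{"id": 1, "sensors": []}]

async def async_update_data() -> list[dict]:
    """Run the blocking call in a thread; map library errors to UpdateFailed."""
    loop = asyncio.get_running_loop()
    try:
        return await loop.run_in_executor(None, get_all_pools_data)
    except ApiError as err:
        raise UpdateFailed(f"Error communicating with API: {err}") from err

print(asyncio.run(async_update_data()))
```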
@@ -2,12 +2,6 @@
from __future__ import annotations
from datetime import timedelta
import logging
from typing import Any
from ondilo import OndiloError
from homeassistant.components.sensor import (
SensorDeviceClass,
SensorEntity,
@@ -24,14 +18,10 @@ from homeassistant.const import (
from homeassistant.core import HomeAssistant
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.update_coordinator import (
CoordinatorEntity,
DataUpdateCoordinator,
UpdateFailed,
)
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .api import OndiloClient
from .const import DOMAIN
from .coordinator import OndiloIcoCoordinator
SENSOR_TYPES: tuple[SensorEntityDescription, ...] = (
SensorEntityDescription(
@@ -78,66 +68,30 @@ SENSOR_TYPES: tuple[SensorEntityDescription, ...] = (
)
SCAN_INTERVAL = timedelta(minutes=5)
_LOGGER = logging.getLogger(__name__)
async def async_setup_entry(
hass: HomeAssistant, entry: ConfigEntry, async_add_entities: AddEntitiesCallback
) -> None:
"""Set up the Ondilo ICO sensors."""
api: OndiloClient = hass.data[DOMAIN][entry.entry_id]
coordinator: OndiloIcoCoordinator = hass.data[DOMAIN][entry.entry_id]
async def async_update_data() -> list[dict[str, Any]]:
"""Fetch data from API endpoint.
This is the place to pre-process the data to lookup tables
so entities can quickly look up their data.
"""
try:
return await hass.async_add_executor_job(api.get_all_pools_data)
except OndiloError as err:
raise UpdateFailed(f"Error communicating with API: {err}") from err
coordinator = DataUpdateCoordinator(
hass,
_LOGGER,
# Name of the data. For logging purposes.
name="sensor",
update_method=async_update_data,
# Polling interval. Will only be polled if there are subscribers.
update_interval=SCAN_INTERVAL,
async_add_entities(
OndiloICO(coordinator, poolidx, description)
for poolidx, pool in enumerate(coordinator.data)
for sensor in pool["sensors"]
for description in SENSOR_TYPES
if description.key == sensor["data_type"]
)
# Fetch initial data so we have data when entities subscribe
await coordinator.async_refresh()
entities = []
for poolidx, pool in enumerate(coordinator.data):
entities.extend(
[
OndiloICO(coordinator, poolidx, description)
for sensor in pool["sensors"]
for description in SENSOR_TYPES
if description.key == sensor["data_type"]
]
)
async_add_entities(entities)
class OndiloICO(
CoordinatorEntity[DataUpdateCoordinator[list[dict[str, Any]]]], SensorEntity
):
class OndiloICO(CoordinatorEntity[OndiloIcoCoordinator], SensorEntity):
"""Representation of a Sensor."""
_attr_has_entity_name = True
def __init__(
self,
coordinator: DataUpdateCoordinator[list[dict[str, Any]]],
coordinator: OndiloIcoCoordinator,
poolidx: int,
description: SensorEntityDescription,
) -> None:
@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/opentherm_gw",
"iot_class": "local_push",
"loggers": ["pyotgw"],
"requirements": ["pyotgw==2.1.3"]
"requirements": ["pyotgw==2.2.0"]
}
@@ -7,5 +7,5 @@
"documentation": "https://www.home-assistant.io/integrations/opower",
"iot_class": "cloud_polling",
"loggers": ["opower"],
"requirements": ["opower==0.4.3"]
"requirements": ["opower==0.4.4"]
}
@@ -69,7 +69,6 @@ ELEC_SENSORS: tuple[OpowerEntityDescription, ...] = (
name="Current bill electric cost to date",
device_class=SensorDeviceClass.MONETARY,
native_unit_of_measurement="USD",
suggested_unit_of_measurement="USD",
state_class=SensorStateClass.TOTAL,
suggested_display_precision=0,
value_fn=lambda data: data.cost_to_date,
@@ -79,7 +78,6 @@ ELEC_SENSORS: tuple[OpowerEntityDescription, ...] = (
name="Current bill electric forecasted cost",
device_class=SensorDeviceClass.MONETARY,
native_unit_of_measurement="USD",
suggested_unit_of_measurement="USD",
state_class=SensorStateClass.TOTAL,
suggested_display_precision=0,
value_fn=lambda data: data.forecasted_cost,
@@ -89,7 +87,6 @@ ELEC_SENSORS: tuple[OpowerEntityDescription, ...] = (
name="Typical monthly electric cost",
device_class=SensorDeviceClass.MONETARY,
native_unit_of_measurement="USD",
suggested_unit_of_measurement="USD",
state_class=SensorStateClass.TOTAL,
suggested_display_precision=0,
value_fn=lambda data: data.typical_cost,
@@ -101,7 +98,6 @@ GAS_SENSORS: tuple[OpowerEntityDescription, ...] = (
name="Current bill gas usage to date",
device_class=SensorDeviceClass.GAS,
native_unit_of_measurement=UnitOfVolume.CENTUM_CUBIC_FEET,
suggested_unit_of_measurement=UnitOfVolume.CENTUM_CUBIC_FEET,
state_class=SensorStateClass.TOTAL,
suggested_display_precision=0,
value_fn=lambda data: data.usage_to_date,
@@ -111,7 +107,6 @@ GAS_SENSORS: tuple[OpowerEntityDescription, ...] = (
name="Current bill gas forecasted usage",
device_class=SensorDeviceClass.GAS,
native_unit_of_measurement=UnitOfVolume.CENTUM_CUBIC_FEET,
suggested_unit_of_measurement=UnitOfVolume.CENTUM_CUBIC_FEET,
state_class=SensorStateClass.TOTAL,
suggested_display_precision=0,
value_fn=lambda data: data.forecasted_usage,
@@ -121,7 +116,6 @@ GAS_SENSORS: tuple[OpowerEntityDescription, ...] = (
name="Typical monthly gas usage",
device_class=SensorDeviceClass.GAS,
native_unit_of_measurement=UnitOfVolume.CENTUM_CUBIC_FEET,
suggested_unit_of_measurement=UnitOfVolume.CENTUM_CUBIC_FEET,
state_class=SensorStateClass.TOTAL,
suggested_display_precision=0,
value_fn=lambda data: data.typical_usage,
@@ -131,7 +125,6 @@ GAS_SENSORS: tuple[OpowerEntityDescription, ...] = (
name="Current bill gas cost to date",
device_class=SensorDeviceClass.MONETARY,
native_unit_of_measurement="USD",
suggested_unit_of_measurement="USD",
state_class=SensorStateClass.TOTAL,
suggested_display_precision=0,
value_fn=lambda data: data.cost_to_date,
@@ -141,7 +134,6 @@ GAS_SENSORS: tuple[OpowerEntityDescription, ...] = (
name="Current bill gas forecasted cost",
device_class=SensorDeviceClass.MONETARY,
native_unit_of_measurement="USD",
suggested_unit_of_measurement="USD",
state_class=SensorStateClass.TOTAL,
suggested_display_precision=0,
value_fn=lambda data: data.forecasted_cost,
@@ -151,7 +143,6 @@ GAS_SENSORS: tuple[OpowerEntityDescription, ...] = (
name="Typical monthly gas cost",
device_class=SensorDeviceClass.MONETARY,
native_unit_of_measurement="USD",
suggested_unit_of_measurement="USD",
state_class=SensorStateClass.TOTAL,
suggested_display_precision=0,
value_fn=lambda data: data.typical_cost,
@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/philips_js",
"iot_class": "local_polling",
"loggers": ["haphilipsjs"],
"requirements": ["ha-philipsjs==3.1.1"]
"requirements": ["ha-philipsjs==3.2.1"]
}
@@ -55,7 +55,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
use_tls = entry.data[CONF_SSL]
verify_tls = entry.data[CONF_VERIFY_SSL]
location = entry.data[CONF_LOCATION]
api_key = entry.data.get(CONF_API_KEY)
api_key = entry.data.get(CONF_API_KEY, "")
# remove obsolete CONF_STATISTICS_ONLY from entry.data
if CONF_STATISTICS_ONLY in entry.data:
@@ -91,13 +91,17 @@ class PlugwiseSelectEntity(PlugwiseEntity, SelectEntity):
super().__init__(coordinator, device_id)
self.entity_description = entity_description
self._attr_unique_id = f"{device_id}-{entity_description.key}"
self._attr_options = self.device[entity_description.options_key]
@property
def current_option(self) -> str:
"""Return the selected entity option to represent the entity state."""
return self.device[self.entity_description.key]
@property
def options(self) -> list[str]:
"""Return the available select-options."""
return self.device[self.entity_description.options_key]
async def async_select_option(self, option: str) -> None:
"""Change to the selected entity option."""
await self.entity_description.command(
@@ -1,9 +1,10 @@
"""Base entity for poolsense integration."""
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity import EntityDescription
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import ATTRIBUTION
from .const import ATTRIBUTION, DOMAIN
from .coordinator import PoolSenseDataUpdateCoordinator
@@ -11,6 +12,7 @@ class PoolSenseEntity(CoordinatorEntity[PoolSenseDataUpdateCoordinator]):
"""Implements a common class elements representing the PoolSense component."""
_attr_attribution = ATTRIBUTION
_attr_has_entity_name = True
def __init__(
self,
@@ -21,5 +23,8 @@ class PoolSenseEntity(CoordinatorEntity[PoolSenseDataUpdateCoordinator]):
"""Initialize poolsense sensor."""
super().__init__(coordinator)
self.entity_description = description
self._attr_name = f"PoolSense {description.name}"
self._attr_unique_id = f"{email}-{description.key}"
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, email)},
model="PoolSense",
)
@@ -46,7 +46,7 @@ class RadarrDataUpdateCoordinator(DataUpdateCoordinator[T], Generic[T], ABC):
"""Data update coordinator for the Radarr integration."""
config_entry: ConfigEntry
update_interval = timedelta(seconds=30)
_update_interval = timedelta(seconds=30)
def __init__(
self,
@@ -59,7 +59,7 @@ class RadarrDataUpdateCoordinator(DataUpdateCoordinator[T], Generic[T], ABC):
hass=hass,
logger=LOGGER,
name=DOMAIN,
update_interval=self.update_interval,
update_interval=self._update_interval,
)
self.api_client = api_client
self.host_configuration = host_configuration
@@ -133,7 +133,7 @@ class QueueDataUpdateCoordinator(RadarrDataUpdateCoordinator):
class CalendarUpdateCoordinator(RadarrDataUpdateCoordinator[None]):
"""Calendar update coordinator."""
update_interval = timedelta(hours=1)
_update_interval = timedelta(hours=1)
def __init__(
self,
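The Radarr hunks above rename the class-level interval to `_update_interval` so it no longer collides with the `update_interval` instance attribute that the base coordinator's `__init__` sets. A minimal sketch of that pattern (class names are illustrative):

```python
from datetime import timedelta

class BaseCoordinator:
    """Stand-in for DataUpdateCoordinator: stores an instance attribute."""

    def __init__(self, update_interval: timedelta) -> None:
        self.update_interval = update_interval

class RadarrLikeCoordinator(BaseCoordinator):
    # Private class attribute: subclasses override it without shadowing
    # the public instance attribute set by the base __init__.
    _update_interval = timedelta(seconds=30)

    def __init__(self) -> None:
        super().__init__(update_interval=self._update_interval)

class CalendarLikeCoordinator(RadarrLikeCoordinator):
    _update_interval = timedelta(hours=1)

print(RadarrLikeCoordinator().update_interval)
print(CalendarLikeCoordinator().update_interval)
```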
@@ -485,6 +485,12 @@ def compile_statistics(instance: Recorder, start: datetime, fire_events: bool) -
The actual calculation is delegated to the platforms.
"""
# Define modified_statistic_ids outside of the "with" statement as
# _compile_statistics may raise and be trapped by
# filter_unique_constraint_integrity_error which would make
# modified_statistic_ids unbound.
modified_statistic_ids: set[str] | None = None
# Return if we already have 5-minute statistics for the requested period
with session_scope(
session=instance.get_session(),
@@ -85,6 +85,7 @@ async def async_setup_entry(hass: HomeAssistant, config_entry: ConfigEntry) -> b
try:
await host.update_states()
except CredentialsInvalidError as err:
await host.stop()
raise ConfigEntryAuthFailed(err) from err
except ReolinkError as err:
raise UpdateFailed(str(err)) from err
@@ -18,5 +18,5 @@
"documentation": "https://www.home-assistant.io/integrations/reolink",
"iot_class": "local_push",
"loggers": ["reolink_aio"],
"requirements": ["reolink-aio==0.8.9"]
"requirements": ["reolink-aio==0.8.10"]
}
@@ -7,5 +7,5 @@
"iot_class": "local_push",
"loggers": ["pyrisco"],
"quality_scale": "platinum",
"requirements": ["pyrisco==0.6.1"]
"requirements": ["pyrisco==0.6.2"]
}
@@ -49,7 +49,7 @@ class RoborockDataUpdateCoordinator(DataUpdateCoordinator[DeviceProp]):
)
device_data = DeviceData(device, product_info.model, device_networking.ip)
self.api: RoborockLocalClientV1 | RoborockMqttClientV1 = RoborockLocalClientV1(
-device_data
+device_data, queue_timeout=5
)
self.cloud_api = cloud_api
self.device_info = DeviceInfo(
@@ -66,17 +66,26 @@ class RoborockMap(RoborockCoordinatedEntity, ImageEntity):
)
self._attr_image_last_updated = dt_util.utcnow()
self.map_flag = map_flag
-self.cached_map = self._create_image(starting_map)
+try:
+    self.cached_map = self._create_image(starting_map)
+except HomeAssistantError:
+    # If we failed to update the image on init, we set cached_map to empty bytes so that we are unavailable and can try again later.
+    self.cached_map = b""
self._attr_entity_category = EntityCategory.DIAGNOSTIC
+@property
+def available(self):
+    """Determines if the entity is available."""
+    return self.cached_map != b""
@property
def is_selected(self) -> bool:
"""Return if this map is the currently selected map."""
return self.map_flag == self.coordinator.current_map
def is_map_valid(self) -> bool:
"""Update this map if it is the current active map, and the vacuum is cleaning."""
return (
"""Update this map if it is the current active map, and the vacuum is cleaning or if it has never been set at all."""
return self.cached_map == b"" or (
self.is_selected
and self.image_last_updated is not None
and self.coordinator.roborock_device_info.props.status is not None
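The change above uses empty bytes as a sentinel meaning "no image fetched yet", which both drives availability and lets a later update retry. A simplified, self-contained sketch of the pattern (stand-in names and exception type, not the real Roborock entity):

```python
class MapEntity:
    """Sketch: b"" marks 'no image yet' and drives availability."""

    def __init__(self, decode_ok: bool):
        try:
            self.cached_map = self._create_image(decode_ok)
        except RuntimeError:
            # Failed on init: keep empty bytes so the entity reports
            # unavailable and a later update can retry.
            self.cached_map = b""

    def _create_image(self, ok: bool) -> bytes:
        if not ok:
            raise RuntimeError("map decode failed")
        return b"\x89PNG fake image"

    @property
    def available(self) -> bool:
        return self.cached_map != b""

good_map = MapEntity(decode_ok=True)
bad_map = MapEntity(decode_ok=False)
print(good_map.available)  # True
print(bad_map.available)   # False
```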
@@ -96,7 +105,16 @@ class RoborockMap(RoborockCoordinatedEntity, ImageEntity):
async def async_image(self) -> bytes | None:
"""Update the image if it is not cached."""
if self.is_map_valid():
-map_data: bytes = await self.cloud_api.get_map_v1()
+response = await asyncio.gather(
+    *(self.cloud_api.get_map_v1(), self.coordinator.get_rooms()),
+    return_exceptions=True,
+)
+if not isinstance(response[0], bytes):
+    raise HomeAssistantError(
+        translation_domain=DOMAIN,
+        translation_key="map_failure",
+    )
+map_data = response[0]
self.cached_map = self._create_image(map_data)
return self.cached_map
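With `return_exceptions=True`, `asyncio.gather` delivers failures as result values instead of propagating the first one, so each result can be inspected individually. A small runnable sketch of the same guard (hypothetical fetchers, not the Roborock API):

```python
import asyncio

async def fetch_map() -> bytes:
    # Hypothetical failing cloud call.
    raise ConnectionError("cloud API unreachable")

async def fetch_rooms() -> list[str]:
    return ["kitchen", "hall"]

async def main():
    # return_exceptions=True turns failures into result values, so one
    # failed fetch does not abort the other.
    results = await asyncio.gather(
        fetch_map(), fetch_rooms(), return_exceptions=True
    )
    map_data = results[0] if isinstance(results[0], bytes) else b""
    return map_data, results[1]

map_data, rooms = asyncio.run(main())
print(map_data, rooms)  # b'' ['kitchen', 'hall']
```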
@@ -141,9 +159,10 @@ async def create_coordinator_maps(
await asyncio.sleep(MAP_SLEEP)
# Get the map data
map_update = await asyncio.gather(
-*[coord.cloud_api.get_map_v1(), coord.get_rooms()]
+*[coord.cloud_api.get_map_v1(), coord.get_rooms()], return_exceptions=True
)
-api_data: bytes = map_update[0]
+# If we fail to get the map -> We should set it to empty byte, still create it, and set it as unavailable.
+api_data: bytes = map_update[0] if isinstance(map_update[0], bytes) else b""
entities.append(
RoborockMap(
f"{slugify(coord.roborock_device_info.device.duid)}_map_{map_info.name}",
@@ -11,7 +11,7 @@
"iot_class": "local_polling",
"loggers": ["rokuecp"],
"quality_scale": "silver",
"requirements": ["rokuecp==0.19.2"],
"requirements": ["rokuecp==0.19.3"],
"ssdp": [
{
"st": "roku:ecp",
@@ -72,7 +72,6 @@ class RoonEventEntity(EventEntity):
via_device=(DOMAIN, self._server.roon_id),
)
-@callback
def _roonapi_volume_callback(
self, control_key: str, event: str, value: int
) -> None:
@@ -88,7 +87,7 @@ class RoonEventEntity(EventEntity):
event = "volume_down"
self._trigger_event(event)
-self.async_write_ha_state()
+self.schedule_update_ha_state()
async def async_added_to_hass(self) -> None:
"""Register volume hooks with the roon api."""

Some files were not shown because too many files have changed in this diff.