Compare commits

..

129 Commits

Author SHA1 Message Date
Franck Nijhof c757c9b99f 2022.11.2 (#81780) 2022-11-08 17:41:05 +01:00
J. Nick Koston d88b2bf19c Fix off by one in HomeKit iid allocator (#81793) 2022-11-08 16:24:15 +01:00
ztamas83 7ab2029071 Retry tibber setup (#81785)
* Handle integration setup retries

* Fix black error

* Update to falsy check

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>

* Remove duplicated log

* Update exception message

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2022-11-08 16:24:11 +01:00
Bram Kragten 59ec829106 Update frontend to 20221108.0 (#81787) 2022-11-08 15:01:23 +01:00
epenet 345d356e9a Fix rest import (#81784) 2022-11-08 13:35:12 +01:00
Franck Nijhof 42c09d8811 Bumped version to 2022.11.2 2022-11-08 12:44:44 +01:00
Allen Porter f614df29bd Partially revert google local sync for search cases (#81761)
2022-11-08 12:43:39 +01:00
J. Nick Koston 815249eaeb Use more efficient async_progress_by_handler call in async_start_reauth (#81757) 2022-11-08 12:43:36 +01:00
J. Nick Koston 1f878433ac Fix check for duplicate config entry reauth when context is passed or augmented (#81753)
fixes https://github.com/home-assistant/core/issues/77578
2022-11-08 12:43:32 +01:00
epenet 797ea3ace4 Adjust REST schema validation (#81723)
2022-11-08 12:43:28 +01:00
J. Nick Koston 3e8bea8fbd Fix flapping logbook tests (#81695) 2022-11-08 12:43:25 +01:00
Robert Hillis b9db84ed57 Bump aiopyarr to 22.11.0 (#81694) 2022-11-08 12:43:21 +01:00
J. Nick Koston 823ec88c52 Bump aiohomekit to 2.2.18 (#81693) 2022-11-08 12:43:18 +01:00
Simone Chemelli 34ae83b4e2 Restore negative values for shelly power factors (#81689)
2022-11-08 12:43:15 +01:00
J. Nick Koston efb984aa83 Bump bleak to 0.19.2 (#81688) 2022-11-08 12:43:11 +01:00
J. Nick Koston 7ca5bd341b Bump aioesphomeapi to 11.4.3 (#81676) 2022-11-08 12:43:08 +01:00
J. Nick Koston c8ed3fd302 Bump bleak-retry-connector to 2.8.3 (#81675)
Improves chances of making a BLE connection with the ESP32s

changelog: https://github.com/Bluetooth-Devices/bleak-retry-connector/compare/v2.8.2...v2.8.3
2022-11-08 12:43:05 +01:00
J. Nick Koston a8a3f012f6 Bump bluetooth-adapters to 0.7.0 (#81576)
changelog: https://github.com/Bluetooth-Devices/bluetooth-adapters/releases/tag/v0.7.0
2022-11-08 12:43:00 +01:00
Aaron Bach 11013bd780 Fix missing RainMachine restrictions switches (#81673) 2022-11-08 12:33:17 +01:00
J. Nick Koston 2684a6e8ed Bump aiohomekit to 2.2.17 (#81657)
Improve BLE pairing reliability, especially with esp32 proxies

changelog: https://github.com/Jc2k/aiohomekit/compare/2.2.16...2.2.17
2022-11-08 12:33:13 +01:00
Artem Draft 63afb30f57 Fix Bravia TV options flow when device is off (#81644)
* Fix options flow when tv is off

* abort with message
2022-11-08 12:33:10 +01:00
Tobias Sauerwein f861137de4 Bump pyatmo to 7.4.0 (#81636) 2022-11-08 12:33:06 +01:00
J. Nick Koston 08debee94f Add missing h2 dep to iaqualink (#81630)
fixes https://github.com/home-assistant/core/issues/81439
2022-11-08 12:33:02 +01:00
Robert Svensson c8981f78b7 Fix situation where deCONZ sensor platform setup would fail (#81629)
* Fix situation where deCONZ sensor platform setup would fail

* Don't use try
2022-11-08 12:32:58 +01:00
J. Nick Koston e4269ff8b2 Fix creating multiple ElkM1 systems with TLS 1.2 (#81627)
fixes https://github.com/home-assistant/core/issues/81516
2022-11-08 12:32:55 +01:00
J. Nick Koston 6fb5c93182 Bump oralb-ble to 0.13.0 (#81622)
* Bump oralb-ble to 0.11.1

adds some more missing pressure mappings

changelog: https://github.com/Bluetooth-Devices/oralb-ble/compare/v0.10.2...v0.11.1

* bump again to update for additional reports

* bump again for more data from issue reports
2022-11-08 12:32:51 +01:00
J. Nick Koston 6fa69022f4 Bump aiohomekit to 2.2.16 (#81621) 2022-11-08 12:32:48 +01:00
J. Nick Koston 4391640734 Ignore unspecified addresses from zeroconf (#81620) 2022-11-08 12:32:44 +01:00
Tim Rightnour c60c99bd74 Bump venstarcolortouch to 0.19 to fix API rev 3 devices (#81614) 2022-11-08 12:32:41 +01:00
J. Nick Koston ac15f2cf9d Fix homekit bridge iid allocations (#81613)
2022-11-08 12:32:35 +01:00
Bouwe Westerdijk b9757235a7 Bump plugwise to v0.25.7 (#81612) 2022-11-08 12:32:31 +01:00
Steven Looman 9beb9f6fc0 Fix repeating SSDP errors by checking address scope_ids and proper hostname (#81611) 2022-11-08 12:32:27 +01:00
David F. Mulcahey 9771147a1e Fix invalid min and max color temp in bad ZHA light devices (#81604)
* Fix ZHA default color temps

* update test
2022-11-08 12:32:24 +01:00
Sebastian Muszynski 0983f8aadf Bump PyXiaomiGateway to 0.14.3 (#81603)
Fixes: #80249
2022-11-08 12:32:20 +01:00
Maciej Bieniek 5a6423a944 Always use Celsius in Shelly integration, part 2 (#81602)
* Always use Celsius in Shelly integration

* Update homeassistant/components/shelly/sensor.py

Co-authored-by: Aarni Koskela <akx@iki.fi>

* Restore unit from the registry during HA startup

Co-authored-by: Aarni Koskela <akx@iki.fi>
2022-11-08 12:32:16 +01:00
David F. Mulcahey f9c7732090 Bump ZHA quirks and associated changes (#81587) 2022-11-08 12:32:13 +01:00
J. Nick Koston 55c87c733a Ensure HomeKit temperature controls appear before fan controls on thermostat accessories (#81586) 2022-11-08 12:32:09 +01:00
J. Nick Koston 6110700e18 Fix HomeKit reset accessory procedure (#81573)
fixes https://github.com/home-assistant/core/issues/81571
2022-11-08 12:32:06 +01:00
Nathan Spencer a8e1afb966 Bump pylitterbot to 2022.11.0 (#81572) 2022-11-08 12:32:02 +01:00
Klaas Schoute d24e272d5e Fix watermeter issue for old P1 Monitor versions (#81570)
* Bump the python package version

* Add exception to check if user has a water meter
2022-11-08 12:31:59 +01:00
Shay Levy bf5ecc30ed Fix Shelly Plus HT missing battery entity (#81564) 2022-11-08 12:31:55 +01:00
Allen Porter 2a34d3a56f Bump gcal_sync to 4.0.0 (#81562)
* Bump gcal_sync to 2.2.4

* Bump gcal sync to 4.0.0

* Add Calendar accessRole fields which are now required
2022-11-08 12:31:52 +01:00
Aaron Bach 7124cedd7a Bump pyairvisual to 2022.11.1 (#81556) 2022-11-08 12:31:46 +01:00
J. Nick Koston 087ede959d Bump oralb-ble to 0.10.2 (#81537)
Fixes some more missing pressure mappings

changelog: https://github.com/Bluetooth-Devices/oralb-ble/compare/v0.10.1...v0.10.2
2022-11-08 12:31:43 +01:00
J. Nick Koston 7b769b39c2 Add additional coverage for adding multiple elkm1 instances (#81528)
* Add additional coverage for adding multiple elkm1 instances

* fix copy error
2022-11-08 12:31:40 +01:00
Avi Miller 51ab5d1808 Fix lifx.set_state so it works with kelvin and color_temp_kelvin and color names (#81515) 2022-11-08 12:31:36 +01:00
J. Nick Koston 7832a7fd80 Bump oralb-ble to 0.10.1 (#81491)
fixes #81489

changelog: https://github.com/Bluetooth-Devices/oralb-ble/compare/v0.10.0...v0.10.1
2022-11-08 12:31:33 +01:00
J. Nick Koston c93c13d8bf Bump nexia to 2.0.6 (#81474)
* Bump nexia to 2.0.6

- Marks thermostat unavailable when it is offline

* is property
2022-11-08 12:31:30 +01:00
J. Nick Koston 42444872b9 Align esphome ble client notify behavior to match BlueZ (#81463) 2022-11-08 12:31:26 +01:00
Steven Looman d3bd80b876 Fix ignored upnp discoveries not being matched when device changes its unique identifier (#81240)
Fixes https://github.com/home-assistant/core/issues/78454
2022-11-08 12:31:20 +01:00
epenet a53d1e072d Fix scrape scan interval (#81763) 2022-11-08 10:56:08 +01:00
Paulus Schoutsen 229d60e678 2022.11.1 (#81488) 2022-11-03 17:52:59 +01:00
Paulus Schoutsen dd004d62d4 Bumped version to 2022.11.1 2022-11-03 17:00:47 +01:00
mkmer 8cbe303677 Bump AIOAladdinConnect to 0.1.47 (#81479) 2022-11-03 17:00:38 +01:00
J. Nick Koston 48edd54e62 Fix HomeKit thermostat to take priority over fans (#81473) 2022-11-03 17:00:37 +01:00
Franck Nijhof 8cb4e8452d Update cryptography to 38.0.3 (#81455) 2022-11-03 17:00:36 +01:00
J. Nick Koston d4b7c00ed6 Bump aiohomekit to 2.2.14 (#81454) 2022-11-03 17:00:35 +01:00
Franck Nijhof 758e06b4b6 Fix SSDP failure to start on missing URLs (#81453) 2022-11-03 17:00:34 +01:00
J. Nick Koston 632231912e Skip flume devices with location missing (#81441)
fixes #81438
2022-11-03 17:00:31 +01:00
Raman Gupta e25cf0b338 Fix eight sleep client creation (#81440)
Fix eight sleep bug
2022-11-03 17:00:30 +01:00
Austin Brunkhorst 32c5248ddb Update pysnooz to 0.8.3 (#81428) 2022-11-03 17:00:28 +01:00
Dennis Schroer 21baf50fc9 Update energyflip-client dependency to 0.2.2 (#81426) 2022-11-03 17:00:25 +01:00
Franck Nijhof 1f0073f450 2022.11.0 (#81423) 2022-11-02 21:47:47 +01:00
Franck Nijhof f14a84211f Bumped version to 2022.11.0 2022-11-02 20:29:00 +01:00
Bram Kragten 1ea0d0e47f Update frontend to 20221102.1 (#81422) 2022-11-02 20:28:05 +01:00
Allen Porter 28832e1c2e Bump gcal_sync to 2.2.3 (#81414) 2022-11-02 20:28:02 +01:00
Daniel Hjelseth Høyer 970fd9bdba Update adax library to 0.1.5 (#81407) 2022-11-02 20:27:57 +01:00
Franck Nijhof 3409dea28c Bumped version to 2022.11.0b7 2022-11-02 12:46:33 +01:00
Bram Kragten 06d22d8249 Update frontend to 20221102.0 (#81405) 2022-11-02 12:46:19 +01:00
J. Nick Koston a6e745b687 Bump aiohomekit to 2.2.13 (#81398) 2022-11-02 12:46:16 +01:00
Mike Degatano f6c094b017 Improve supervisor repairs (#81387)
Co-authored-by: Paulus Schoutsen <balloob@gmail.com>
2022-11-02 12:46:12 +01:00
J. Nick Koston 9f54e332ec Bump dbus-fast to 1.61.1 (#81386) 2022-11-02 12:46:08 +01:00
Paulus Schoutsen 3aca376374 Add unit conversion for energy costs (#81379)
Co-authored-by: Franck Nijhof <git@frenck.dev>
2022-11-02 12:46:05 +01:00
J. Nick Koston a5f209b219 Bump aiohomekit to 2.2.12 (#81372)
* Bump aiohomekit to 2.2.12

Fixes a missing lock which was noticeable on the esp32s
since they disconnect right away when you ask for gatt
notify.

https://github.com/Jc2k/aiohomekit/compare/2.2.11...2.2.12

* empty
2022-11-02 12:46:02 +01:00
J. Nick Koston 0dbf0504ff Bump bleak-retry-connector to 2.8.2 (#81370)
* Bump bleak-retry-connector to 2.8.2

Tweaks for the esp32 proxies now that we have better error
reporting. This change improves the retry cases a bit with
the new https://github.com/esphome/esphome/pull/3971

* empty
2022-11-02 12:45:58 +01:00
puddly 95ce20638a Bump zigpy-zigate to 0.10.3 (#81363) 2022-11-02 12:45:55 +01:00
Franck Nijhof b9132e78b4 Improve error logging of WebSocket API (#81360) 2022-11-02 12:45:50 +01:00
Paulus Schoutsen e8f93d9c7f Bumped version to 2022.11.0b6 2022-11-01 13:09:48 -04:00
J. Nick Koston 1efec8323a Bump aiohomekit to 2.2.11 (#81358) 2022-11-01 13:09:42 -04:00
J. Nick Koston 8c63a9ce5e Immediately prefer advertisements from alternate sources when a scanner goes away (#81357) 2022-11-01 13:09:42 -04:00
J. Nick Koston 9dff7ab6b9 Adjust time to remove stale connectable devices from the esphome ble to closer match bluez (#81356) 2022-11-01 13:09:41 -04:00
David F. Mulcahey d0ddbb5f58 Fix individual LED range for ZHA device action (#81351)
The inovelli individual LED effect device action can address 7 LEDs. I had set the range 1-7 but it should be 0-6.
2022-11-01 13:09:40 -04:00
Maciej Bieniek f265c160d1 Lower log level for non-JSON payload in MQTT update (#81348)
Change log level
2022-11-01 13:09:39 -04:00
Jan Bouwhuis a2d432dfd6 Revert "Do not write state if payload is ''" for MQTT sensor (#81347)
* Revert "Do not write state if payload is ''"

This reverts commit 869c11884e.

* Add test
2022-11-01 13:09:37 -04:00
Franck Nijhof c4bb225060 Fix power/energy mixup in Youless (#81345) 2022-11-01 13:09:36 -04:00
Shay Levy 473490aee7 Bump aioshelly to 4.1.2 (#81342) 2022-11-01 13:09:35 -04:00
Allen Porter f9493bc313 Bump gcal_sync to 2.2.2 and fix recurring event bug (#81339)
* Bump gcal_sync to 2.2.2 and fix recurring event bug

* Bump to 2.2.2
2022-11-01 13:09:34 -04:00
Ron Klinkien 1cc85f77e3 Add task id attribute to fireservicerota sensor (#81323) 2022-11-01 13:09:33 -04:00
javicalle c2c57712d2 Tuya configuration for tuya_manufacturer cluster (#81311)
* Tuya configuration for tuya_manufacturer cluster

* fix codespell

* Add attributes initialization

* Fix pylint complaints
2022-11-01 13:09:32 -04:00
Maciej Bieniek 4684101a85 Improve MQTT update platform (#81131)
* Allow JSON as state_topic payload

* Add title

* Add release_url

* Add release_summary

* Add entity_picture

* Fix typo

* Add abbreviations
2022-11-01 13:09:31 -04:00
J. Nick Koston 9b87f7f6f9 Fix homekit diagnostics test when version changes (#81046) 2022-11-01 13:09:31 -04:00
Maciej Bieniek 8965a1322c Always use Celsius in Shelly integration (#80842) 2022-11-01 13:09:30 -04:00
Franck Nijhof dfe399e370 Cherry-pick translation updates for Supervisor (#81341) 2022-11-01 13:08:26 -04:00
Paulus Schoutsen 0ac0e9c0d5 Bumped version to 2022.11.0b5 2022-10-31 21:23:21 -04:00
J. Nick Koston 941512641b Improve esphome bluetooth error reporting (#81326) 2022-10-31 21:23:14 -04:00
Bram Kragten 599c23c1d7 Update frontend to 20221031.0 (#81324) 2022-10-31 21:23:14 -04:00
J. Nick Koston 882ad31a99 Fix Yale Access Bluetooth not being available again after being unavailable (#81320) 2022-10-31 21:23:13 -04:00
Franck Nijhof 356953c8bc Update base image to 2022.10.0 (#81317) 2022-10-31 21:23:12 -04:00
J. Nick Koston 19a5c87da6 Bump oralb-ble to 0.10.0 (#81315) 2022-10-31 21:23:11 -04:00
J. Nick Koston d7e76fdf3a Bump zeroconf to 0.39.4 (#81313) 2022-10-31 21:23:10 -04:00
J. Nick Koston 9b4f2df8f3 Bump aiohomekit to 2.2.10 (#81312) 2022-10-31 21:23:09 -04:00
TheJulianJES 7046f5f19e Only try initializing Hue motion LED on endpoint 2 with ZHA (#81205) 2022-10-31 21:23:08 -04:00
Mike Degatano 3ddcc637da Create repairs for unsupported and unhealthy (#80747) 2022-10-31 21:23:08 -04:00
Paulus Schoutsen 0a476baf16 Bumped version to 2022.11.0b4 2022-10-31 09:54:14 -04:00
J. Nick Koston f3a96ce14b Bump dbus-fast to 1.60.0 (#81296) 2022-10-31 09:54:04 -04:00
Tobias Sauerwein 4fbbb7ba6d Bump pyatmo to 7.3.0 (#81290)
* Bump pyatmo to 7.3.0

* Update test fixture data and tests
2022-10-31 09:54:03 -04:00
Chris Talkington 8eef55ed60 Bump pyipp to 0.12.1 (#81287)
bump pyipp to 0.12.1
2022-10-31 09:54:02 -04:00
J. Nick Koston 8f843b3046 Do not fire the esphome ble disconnected callback if we were not connected (#81286) 2022-10-31 09:53:26 -04:00
J. Nick Koston 13562d271e Bump bleak-retry-connector to 2.8.1 (#81285)
* Bump bleak-retry-connector to 2.8.1

reduces logging now that we have found the problem
with esphome devices not disconnecting ble devices
after timeout

changelog: https://github.com/Bluetooth-Devices/bleak-retry-connector/compare/v2.8.0...v2.8.1

* empty
2022-10-31 09:52:03 -04:00
J. Nick Koston 3cf63ec88e Include esphome device name in BLE logs (#81284)
* Include esphome device name in BLE logs

This makes it easier to debug what is going on when there
are multiple esphome proxies

* revert unintended change
2022-10-31 09:52:02 -04:00
J. Nick Koston 1f70941f6d Do not fire the esphome ble disconnected callback if we were not connected (#81286) 2022-10-31 09:51:21 -04:00
J. Nick Koston 81dde5cfdf Bump bleak-retry-connector to 2.8.0 (#81283) 2022-10-31 09:49:55 -04:00
J. Nick Koston eccf61a546 Bump aioesphomeapi to 11.4.1 (#81282) 2022-10-31 09:49:54 -04:00
J. Nick Koston 9fac632dcd Bump bleak-retry-connector to 2.7.0 (#81280) 2022-10-31 09:49:53 -04:00
J. Nick Koston e26149d0c3 Bump aioesphomeapi to 11.4.0 (#81277) 2022-10-31 09:49:52 -04:00
J. Nick Koston 8bafb56f04 Bump bleak-retry-connector to 2.6.0 (#81270) 2022-10-31 09:49:51 -04:00
J. Nick Koston 94f92e7f8a Try to switch to a different esphome BLE proxy if we run out of slots while connecting (#81268) 2022-10-31 09:49:50 -04:00
J. Nick Koston 5e3fb6ee9f Provide a human readable error when an esphome ble proxy connection fails (#81266) 2022-10-31 09:49:50 -04:00
J. Nick Koston c36260dd17 Move esphome gatt services cache to be per device (#81265) 2022-10-31 09:49:49 -04:00
J. Nick Koston 0af69a1014 Significantly reduce clock_gettime syscalls on platforms with broken vdso (#81257) 2022-10-31 09:49:48 -04:00
Jc2k 5f81f968ee Set the correct state class for Eve Energy in homekit_controller (#81255) 2022-10-31 09:49:47 -04:00
J. Nick Koston 9d88c95314 Bump aiohomekit to 2.2.9 (#81254) 2022-10-31 09:49:46 -04:00
Tobias Sauerwein 90a3689489 Make Netatmo/Legrande/BTicino lights and switches optimistic (#81246)
* Make Netatmo lights optimistic

* Same for switches
2022-10-31 09:49:45 -04:00
Maciej Bieniek 11bdddc1dc Catch ApiError while checking credentials in NAM integration (#81243)
* Catch ApiError while checking credentials

* Update tests

* Suggested change
2022-10-31 09:49:44 -04:00
J. Nick Koston a6bb7a0832 Bump dbus-fast to 1.59.1 (#81229)
* Bump dbus-fast to 1.59.1

fixes incorrect logging of an exception when it was already handled

changelog: https://github.com/Bluetooth-Devices/dbus-fast/compare/v1.59.0...v1.59.1

* empty
2022-10-31 09:49:44 -04:00
Tobias Sauerwein 24b3d21815 Mute superfluous exception when no Netatmo webhook is to be dropped (#81221)
* Mute superfluous exception when no webhook is to be dropped

* Update homeassistant/components/netatmo/__init__.py

Co-authored-by: Paulus Schoutsen <paulus@home-assistant.io>
2022-10-31 09:49:43 -04:00
Kevin Stillhammer be138adb23 Add missing string for option traffic_mode for google_travel_time (#81213)
Add missing string for option traffic_mode
2022-10-31 09:49:42 -04:00
Guido Schmitz 8d3ed60986 Fix Danfoss thermostat support in devolo Home Control (#81200)
Fix Danfoss thermostat
2022-10-31 09:49:41 -04:00
Raj Laud 0465510ed7 Fix Squeezebox media browsing (#81197)
* Squeezebox media browser fix icons

* Update pysqueezebox to 0.6.1
2022-10-31 09:49:40 -04:00
169 changed files with 4214 additions and 1481 deletions
+5 -5
@@ -1,11 +1,11 @@
 image: homeassistant/{arch}-homeassistant
 shadow_repository: ghcr.io/home-assistant
 build_from:
-  aarch64: ghcr.io/home-assistant/aarch64-homeassistant-base:2022.07.0
-  armhf: ghcr.io/home-assistant/armhf-homeassistant-base:2022.07.0
-  armv7: ghcr.io/home-assistant/armv7-homeassistant-base:2022.07.0
-  amd64: ghcr.io/home-assistant/amd64-homeassistant-base:2022.07.0
-  i386: ghcr.io/home-assistant/i386-homeassistant-base:2022.07.0
+  aarch64: ghcr.io/home-assistant/aarch64-homeassistant-base:2022.10.0
+  armhf: ghcr.io/home-assistant/armhf-homeassistant-base:2022.10.0
+  armv7: ghcr.io/home-assistant/armv7-homeassistant-base:2022.10.0
+  amd64: ghcr.io/home-assistant/amd64-homeassistant-base:2022.10.0
+  i386: ghcr.io/home-assistant/i386-homeassistant-base:2022.10.0
 codenotary:
   signer: notary@home-assistant.io
   base_image: notary@home-assistant.io
+1 -1
@@ -3,7 +3,7 @@
   "name": "Adax",
   "config_flow": true,
   "documentation": "https://www.home-assistant.io/integrations/adax",
-  "requirements": ["adax==0.2.0", "Adax-local==0.1.4"],
+  "requirements": ["adax==0.2.0", "Adax-local==0.1.5"],
   "codeowners": ["@danielhiversen"],
   "iot_class": "local_polling",
   "loggers": ["adax", "adax_local"]
@@ -7,13 +7,9 @@ from math import ceil
 from typing import Any

 from pyairvisual import CloudAPI, NodeSamba
-from pyairvisual.errors import (
-    AirVisualError,
-    InvalidKeyError,
-    KeyExpiredError,
-    NodeProError,
-    UnauthorizedError,
-)
+from pyairvisual.cloud_api import InvalidKeyError, KeyExpiredError, UnauthorizedError
+from pyairvisual.errors import AirVisualError
+from pyairvisual.node import NodeProError

 from homeassistant.config_entries import ConfigEntry
 from homeassistant.const import (
@@ -6,14 +6,14 @@ from collections.abc import Mapping
 from typing import Any

 from pyairvisual import CloudAPI, NodeSamba
-from pyairvisual.errors import (
-    AirVisualError,
+from pyairvisual.cloud_api import (
     InvalidKeyError,
     KeyExpiredError,
-    NodeProError,
     NotFoundError,
     UnauthorizedError,
 )
+from pyairvisual.errors import AirVisualError
+from pyairvisual.node import NodeProError
 import voluptuous as vol

 from homeassistant import config_entries
@@ -3,7 +3,7 @@
   "name": "AirVisual",
   "config_flow": true,
   "documentation": "https://www.home-assistant.io/integrations/airvisual",
-  "requirements": ["pyairvisual==2022.07.0"],
+  "requirements": ["pyairvisual==2022.11.1"],
   "codeowners": ["@bachya"],
   "iot_class": "cloud_polling",
   "loggers": ["pyairvisual", "pysmb"],
@@ -2,7 +2,7 @@
   "domain": "aladdin_connect",
   "name": "Aladdin Connect",
   "documentation": "https://www.home-assistant.io/integrations/aladdin_connect",
-  "requirements": ["AIOAladdinConnect==0.1.46"],
+  "requirements": ["AIOAladdinConnect==0.1.47"],
   "codeowners": ["@mkmer"],
   "iot_class": "cloud_polling",
   "loggers": ["aladdin_connect"],
@@ -3,13 +3,13 @@ from __future__ import annotations

 from collections.abc import Callable, Coroutine
 import logging
-import time
 from typing import Any, Generic, TypeVar

 from bleak import BleakError

 from homeassistant.core import HomeAssistant, callback
 from homeassistant.helpers.debounce import Debouncer
+from homeassistant.util.dt import monotonic_time_coarse

 from . import BluetoothChange, BluetoothScanningMode, BluetoothServiceInfoBleak
 from .passive_update_processor import PassiveBluetoothProcessorCoordinator
@@ -94,7 +94,7 @@ class ActiveBluetoothProcessorCoordinator(
         """Return true if time to try and poll."""
         poll_age: float | None = None
         if self._last_poll:
-            poll_age = time.monotonic() - self._last_poll
+            poll_age = monotonic_time_coarse() - self._last_poll
         return self._needs_poll_method(service_info, poll_age)

     async def _async_poll_data(
@@ -124,7 +124,7 @@ class ActiveBluetoothProcessorCoordinator(
             self.last_poll_successful = False
             return
         finally:
-            self._last_poll = time.monotonic()
+            self._last_poll = monotonic_time_coarse()

         if not self.last_poll_successful:
             self.logger.debug("%s: Polling recovered")
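The hunks above (from "Significantly reduce clock_gettime syscalls on platforms with broken vdso", #81257) swap `time.monotonic` for `monotonic_time_coarse`: on hosts where the vDSO fast path is unavailable, every `time.monotonic()` call is a full syscall, while the kernel's coarse monotonic clock is updated once per timer tick and is much cheaper to read at millisecond resolution. A minimal sketch of such a helper, assuming Linux semantics (the real helper lives in `homeassistant.util.dt`):

```python
import time


def monotonic_time_coarse() -> float:
    """Return a monotonic timestamp from the coarse kernel clock.

    CLOCK_MONOTONIC_COARSE is only updated once per timer tick
    (typically every 1-4 ms), so reading it avoids an expensive
    clock_gettime syscall on platforms with a broken vDSO.
    """
    try:
        return time.clock_gettime(time.CLOCK_MONOTONIC_COARSE)
    except (AttributeError, OSError):
        # Not on Linux, or the coarse clock is unavailable:
        # fall back to the regular monotonic clock.
        return time.monotonic()


start = monotonic_time_coarse()
# ... do work ...
elapsed = monotonic_time_coarse() - start
```

The trade-off fits the polling code above well: poll ages are compared against intervals measured in seconds, so millisecond granularity is plenty.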
@@ -7,7 +7,6 @@ from dataclasses import replace
 from datetime import datetime, timedelta
 import itertools
 import logging
-import time
 from typing import TYPE_CHECKING, Any, Final

 from bleak.backends.scanner import AdvertisementDataCallback
@@ -22,6 +21,7 @@ from homeassistant.core import (
 )
 from homeassistant.helpers import discovery_flow
 from homeassistant.helpers.event import async_track_time_interval
+from homeassistant.util.dt import monotonic_time_coarse

 from .advertisement_tracker import AdvertisementTracker
 from .const import (
@@ -69,7 +69,7 @@ APPLE_START_BYTES_WANTED: Final = {
     APPLE_DEVICE_ID_START_BYTE,
 }

-MONOTONIC_TIME: Final = time.monotonic
+MONOTONIC_TIME: Final = monotonic_time_coarse

 _LOGGER = logging.getLogger(__name__)
@@ -127,6 +127,7 @@ class BluetoothManager:
         self._non_connectable_scanners: list[BaseHaScanner] = []
         self._connectable_scanners: list[BaseHaScanner] = []
         self._adapters: dict[str, AdapterDetails] = {}
+        self._sources: set[str] = set()

     @property
     def supports_passive_scan(self) -> bool:
@@ -379,6 +380,7 @@ class BluetoothManager:
         if (
             (old_service_info := all_history.get(address))
             and source != old_service_info.source
+            and old_service_info.source in self._sources
             and self._prefer_previous_adv_from_different_source(
                 old_service_info, service_info
             )
@@ -398,6 +400,7 @@ class BluetoothManager:
                 # the old connectable advertisement
                 or (
                     source != old_connectable_service_info.source
+                    and old_connectable_service_info.source in self._sources
                     and self._prefer_previous_adv_from_different_source(
                         old_connectable_service_info, service_info
                     )
@@ -597,8 +600,10 @@ class BluetoothManager:
         def _unregister_scanner() -> None:
            self._advertisement_tracker.async_remove_source(scanner.source)
            scanners.remove(scanner)
+           self._sources.remove(scanner.source)

         scanners.append(scanner)
+        self._sources.add(scanner.source)
         return _unregister_scanner

     @hass_callback
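The `_sources` set added above gives the manager an O(1) membership check so an old advertisement is only "preferred" if its scanner is still registered — the point of "Immediately prefer advertisements from alternate sources when a scanner goes away" (#81357). A stripped-down sketch of the register/unregister bookkeeping (class and method names are simplified stand-ins for the real `BluetoothManager`):

```python
from collections.abc import Callable


class ScannerRegistry:
    """Track which advertisement sources are currently live."""

    def __init__(self) -> None:
        self._scanners: list[str] = []
        self._sources: set[str] = set()

    def register(self, source: str) -> Callable[[], None]:
        """Register a scanner; return a callback that unregisters it."""
        self._scanners.append(source)
        self._sources.add(source)

        def _unregister() -> None:
            self._scanners.remove(source)
            self._sources.discard(source)

        return _unregister

    def is_live(self, source: str) -> bool:
        """Prefer an older advertisement only if its source still exists."""
        return source in self._sources


registry = ScannerRegistry()
unregister = registry.register("hci0")
assert registry.is_live("hci0")
unregister()
assert not registry.is_live("hci0")
```

Returning the unregister closure from the register call mirrors Home Assistant's usual callback-unsubscribe pattern.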
@@ -6,11 +6,11 @@
   "after_dependencies": ["hassio"],
   "quality_scale": "internal",
   "requirements": [
-    "bleak==0.19.1",
-    "bleak-retry-connector==2.5.0",
-    "bluetooth-adapters==0.6.0",
+    "bleak==0.19.2",
+    "bleak-retry-connector==2.8.3",
+    "bluetooth-adapters==0.7.0",
     "bluetooth-auto-recovery==0.3.6",
-    "dbus-fast==1.59.0"
+    "dbus-fast==1.61.1"
   ],
   "codeowners": ["@bdraco"],
   "config_flow": true,
+11 -3
@@ -264,6 +264,7 @@ class HaBleakClientWrapper(BleakClient):
         self.__address = address_or_ble_device
         self.__disconnected_callback = disconnected_callback
         self.__timeout = timeout
+        self.__ble_device: BLEDevice | None = None
         self._backend: BaseBleakClient | None = None  # type: ignore[assignment]

     @property
@@ -283,14 +284,21 @@ class HaBleakClientWrapper(BleakClient):
     async def connect(self, **kwargs: Any) -> bool:
         """Connect to the specified GATT server."""
-        if not self._backend:
+        if (
+            not self._backend
+            or not self.__ble_device
+            or not self._async_get_backend_for_ble_device(self.__ble_device)
+        ):
             assert MANAGER is not None
             wrapped_backend = (
                 self._async_get_backend() or self._async_get_fallback_backend()
             )
-            self._backend = wrapped_backend.client(
+            self.__ble_device = (
                 await freshen_ble_device(wrapped_backend.device)
-                or wrapped_backend.device,
+                or wrapped_backend.device
+            )
+            self._backend = wrapped_backend.client(
+                self.__ble_device,
                 disconnected_callback=self.__disconnected_callback,
                 timeout=self.__timeout,
                 hass=MANAGER.hass,
@@ -6,7 +6,6 @@ from collections.abc import Callable
 from datetime import datetime
 import logging
 import platform
-import time
 from typing import Any

 import async_timeout
@@ -22,6 +21,7 @@ from dbus_fast import InvalidMessageError

 from homeassistant.core import CALLBACK_TYPE, HomeAssistant, callback as hass_callback
 from homeassistant.exceptions import HomeAssistantError
 from homeassistant.helpers.event import async_track_time_interval
+from homeassistant.util.dt import monotonic_time_coarse
 from homeassistant.util.package import is_docker_env

 from .const import (
@@ -35,7 +35,7 @@ from .models import BaseHaScanner, BluetoothScanningMode, BluetoothServiceInfoBl
 from .util import adapter_human_name, async_reset_adapter

 OriginalBleakScanner = bleak.BleakScanner
-MONOTONIC_TIME = time.monotonic
+MONOTONIC_TIME = monotonic_time_coarse

 # or_patterns is a workaround for the fact that passive scanning
 # needs at least one matcher to be set. The below matcher
+2 -2
@@ -2,11 +2,11 @@
 from __future__ import annotations

 import platform
-import time

 from bluetooth_auto_recovery import recover_adapter

 from homeassistant.core import callback
+from homeassistant.util.dt import monotonic_time_coarse

 from .const import (
     DEFAULT_ADAPTER_BY_PLATFORM,
@@ -29,7 +29,7 @@ async def async_load_history_from_system() -> dict[str, BluetoothServiceInfoBlea
     bluez_dbus = BlueZDBusObjects()
     await bluez_dbus.load()
-    now = time.monotonic()
+    now = monotonic_time_coarse()
     return {
         address: BluetoothServiceInfoBleak(
             name=history.advertisement_data.local_name
@@ -262,7 +262,11 @@ class BraviaTVOptionsFlowHandler(config_entries.OptionsFlow):
             self.config_entry.entry_id
         ]
-        await coordinator.async_update_sources()
+        try:
+            await coordinator.async_update_sources()
+        except BraviaTVError:
+            return self.async_abort(reason="failed_update")
         sources = coordinator.source_map.values()
         self.source_list = [item["title"] for item in sources]
         return await self.async_step_user()
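The Bravia TV fix wraps the source refresh in try/except and ends the options flow with a translatable abort reason instead of an unhandled traceback when the TV is off. A self-contained sketch of the pattern, with `BraviaTVError`, the coordinator, and the flow-result dicts modeled as plain stand-ins rather than Home Assistant's real classes:

```python
import asyncio


class BraviaTVError(Exception):
    """Stand-in for the library's base error."""


class OfflineCoordinator:
    """Simulates a coordinator whose device is unreachable."""

    async def async_update_sources(self) -> None:
        raise BraviaTVError("TV is off")


class OptionsFlowSketch:
    """Minimal model of the abort-on-error options-flow step."""

    def __init__(self, coordinator) -> None:
        self.coordinator = coordinator

    def async_abort(self, reason: str) -> dict:
        return {"type": "abort", "reason": reason}

    async def step_init(self) -> dict:
        try:
            await self.coordinator.async_update_sources()
        except BraviaTVError:
            # Device unreachable: surface a user-facing abort message
            # (looked up via the "failed_update" key in strings.json)
            # instead of letting the exception escape the flow.
            return self.async_abort(reason="failed_update")
        return {"type": "form", "step_id": "user"}


result = asyncio.run(OptionsFlowSketch(OfflineCoordinator()).step_init())
# result == {"type": "abort", "reason": "failed_update"}
```

The companion strings.json hunks below supply the "failed_update" text shown to the user.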
@@ -48,6 +48,9 @@
           "ignored_sources": "List of ignored sources"
         }
       }
-    }
+    },
+    "abort": {
+      "failed_update": "An error occurred while updating the list of sources.\n\n Ensure that your TV is turned on before trying to set it up."
+    }
   }
 }
@@ -41,6 +41,9 @@
       }
     },
     "options": {
+      "abort": {
+        "failed_update": "An error occurred while updating the list of sources.\n\n Ensure that your TV is turned on before trying to set it up."
+      },
       "step": {
         "user": {
           "data": {
+15 -1
@@ -89,6 +89,7 @@ T = TypeVar(
 class DeconzSensorDescriptionMixin(Generic[T]):
     """Required values when describing secondary sensor attributes."""

+    supported_fn: Callable[[T], bool]
     update_key: str
     value_fn: Callable[[T], datetime | StateType]
@@ -105,6 +106,7 @@ class DeconzSensorDescription(SensorEntityDescription, DeconzSensorDescriptionMi
 ENTITY_DESCRIPTIONS: tuple[DeconzSensorDescription, ...] = (
     DeconzSensorDescription[AirQuality](
         key="air_quality",
+        supported_fn=lambda device: device.air_quality is not None,
         update_key="airquality",
         value_fn=lambda device: device.air_quality,
         instance_check=AirQuality,
@@ -112,6 +114,7 @@ ENTITY_DESCRIPTIONS: tuple[DeconzSensorDescription, ...] = (
     ),
     DeconzSensorDescription[AirQuality](
         key="air_quality_ppb",
+        supported_fn=lambda device: device.air_quality_ppb is not None,
         update_key="airqualityppb",
         value_fn=lambda device: device.air_quality_ppb,
         instance_check=AirQuality,
@@ -122,6 +125,7 @@ ENTITY_DESCRIPTIONS: tuple[DeconzSensorDescription, ...] = (
     ),
     DeconzSensorDescription[Consumption](
         key="consumption",
+        supported_fn=lambda device: device.consumption is not None,
         update_key="consumption",
         value_fn=lambda device: device.scaled_consumption,
         instance_check=Consumption,
@@ -131,6 +135,7 @@ ENTITY_DESCRIPTIONS: tuple[DeconzSensorDescription, ...] = (
     ),
     DeconzSensorDescription[Daylight](
         key="daylight_status",
+        supported_fn=lambda device: True,
         update_key="status",
         value_fn=lambda device: DAYLIGHT_STATUS[device.daylight_status],
         instance_check=Daylight,
@@ -139,12 +144,14 @@ ENTITY_DESCRIPTIONS: tuple[DeconzSensorDescription, ...] = (
     ),
     DeconzSensorDescription[GenericStatus](
         key="status",
+        supported_fn=lambda device: device.status is not None,
         update_key="status",
         value_fn=lambda device: device.status,
         instance_check=GenericStatus,
     ),
     DeconzSensorDescription[Humidity](
         key="humidity",
+        supported_fn=lambda device: device.humidity is not None,
         update_key="humidity",
         value_fn=lambda device: device.scaled_humidity,
         instance_check=Humidity,
@@ -154,6 +161,7 @@ ENTITY_DESCRIPTIONS: tuple[DeconzSensorDescription, ...] = (
     ),
     DeconzSensorDescription[LightLevel](
         key="light_level",
+        supported_fn=lambda device: device.light_level is not None,
         update_key="lightlevel",
         value_fn=lambda device: device.scaled_light_level,
         instance_check=LightLevel,
@@ -163,6 +171,7 @@ ENTITY_DESCRIPTIONS: tuple[DeconzSensorDescription, ...] = (
     ),
     DeconzSensorDescription[Power](
         key="power",
+        supported_fn=lambda device: device.power is not None,
         update_key="power",
         value_fn=lambda device: device.power,
         instance_check=Power,
@@ -172,6 +181,7 @@ ENTITY_DESCRIPTIONS: tuple[DeconzSensorDescription, ...] = (
     ),
     DeconzSensorDescription[Pressure](
         key="pressure",
+        supported_fn=lambda device: device.pressure is not None,
         update_key="pressure",
         value_fn=lambda device: device.pressure,
         instance_check=Pressure,
@@ -181,6 +191,7 @@ ENTITY_DESCRIPTIONS: tuple[DeconzSensorDescription, ...] = (
     ),
     DeconzSensorDescription[Temperature](
         key="temperature",
+        supported_fn=lambda device: device.temperature is not None,
         update_key="temperature",
         value_fn=lambda device: device.scaled_temperature,
         instance_check=Temperature,
@@ -190,6 +201,7 @@ ENTITY_DESCRIPTIONS: tuple[DeconzSensorDescription, ...] = (
     ),
     DeconzSensorDescription[Time](
         key="last_set",
+        supported_fn=lambda device: device.last_set is not None,
         update_key="lastset",
         value_fn=lambda device: dt_util.parse_datetime(device.last_set),
         instance_check=Time,
@@ -197,6 +209,7 @@ ENTITY_DESCRIPTIONS: tuple[DeconzSensorDescription, ...] = (
     ),
     DeconzSensorDescription[SensorResources](
         key="battery",
+        supported_fn=lambda device: device.battery is not None,
         update_key="battery",
         value_fn=lambda device: device.battery,
         name_suffix="Battery",
@@ -208,6 +221,7 @@ ENTITY_DESCRIPTIONS: tuple[DeconzSensorDescription, ...] = (
     ),
     DeconzSensorDescription[SensorResources](
         key="internal_temperature",
+        supported_fn=lambda device: device.internal_temperature is not None,
         update_key="temperature",
         value_fn=lambda device: device.internal_temperature,
         name_suffix="Temperature",
@@ -268,7 +282,7 @@ async def async_setup_entry(
             continue

             no_sensor_data = False
-            if description.value_fn(sensor) is None:
+            if not description.supported_fn(sensor):
                 no_sensor_data = True

             if description.instance_check is None:
@@ -35,6 +35,7 @@ async def async_setup_entry(
             "devolo.model.Thermostat:Valve",
             "devolo.model.Room:Thermostat",
             "devolo.model.Eurotronic:Spirit:Device",
+            "unk.model.Danfoss:Thermostat",
         ):
             entities.append(
                 DevoloClimateDeviceEntity(
@@ -3,7 +3,7 @@
   "name": "DLNA Digital Media Renderer",
   "config_flow": true,
   "documentation": "https://www.home-assistant.io/integrations/dlna_dmr",
-  "requirements": ["async-upnp-client==0.32.1"],
+  "requirements": ["async-upnp-client==0.32.2"],
   "dependencies": ["ssdp"],
   "after_dependencies": ["media_source"],
   "ssdp": [
@@ -3,7 +3,7 @@
"name": "DLNA Digital Media Server",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/dlna_dms",
"requirements": ["async-upnp-client==0.32.1"],
"requirements": ["async-upnp-client==0.32.2"],
"dependencies": ["ssdp"],
"after_dependencies": ["media_source"],
"ssdp": [
@@ -95,7 +95,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
entry.data[CONF_USERNAME],
entry.data[CONF_PASSWORD],
hass.config.time_zone,
async_get_clientsession(hass),
client_session=async_get_clientsession(hass),
)
# Authenticate, build sensors
@@ -7,11 +7,11 @@ import logging
import re
from types import MappingProxyType
from typing import Any, cast
from urllib.parse import urlparse
import async_timeout
from elkm1_lib.elements import Element
from elkm1_lib.elk import Elk
from elkm1_lib.util import parse_url
import voluptuous as vol
from homeassistant.config_entries import SOURCE_IMPORT, ConfigEntry
@@ -96,6 +96,11 @@ SET_TIME_SERVICE_SCHEMA = vol.Schema(
)
def hostname_from_url(url: str) -> str:
"""Return the hostname from a url."""
return parse_url(url)[1]
def _host_validator(config: dict[str, str]) -> dict[str, str]:
"""Validate that a host is properly configured."""
if config[CONF_HOST].startswith("elks://"):
@@ -231,7 +236,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Set up Elk-M1 Control from a config entry."""
conf: MappingProxyType[str, Any] = entry.data
host = urlparse(entry.data[CONF_HOST]).hostname
host = hostname_from_url(entry.data[CONF_HOST])
_LOGGER.debug("Setting up elkm1 %s", conf["host"])
@@ -4,7 +4,6 @@ from __future__ import annotations
import asyncio
import logging
from typing import Any
from urllib.parse import urlparse
from elkm1_lib.discovery import ElkSystem
from elkm1_lib.elk import Elk
@@ -26,7 +25,7 @@ from homeassistant.helpers.typing import DiscoveryInfoType
from homeassistant.util import slugify
from homeassistant.util.network import is_ip_address
from . import async_wait_for_elk_to_sync
from . import async_wait_for_elk_to_sync, hostname_from_url
from .const import CONF_AUTO_CONFIGURE, DISCOVER_SCAN_TIMEOUT, DOMAIN, LOGIN_TIMEOUT
from .discovery import (
_short_mac,
@@ -170,7 +169,7 @@ class ConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
for entry in self._async_current_entries(include_ignore=False):
if (
entry.unique_id == mac
or urlparse(entry.data[CONF_HOST]).hostname == host
or hostname_from_url(entry.data[CONF_HOST]) == host
):
if async_update_entry_from_discovery(self.hass, entry, device):
self.hass.async_create_task(
@@ -214,7 +213,7 @@ class ConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
current_unique_ids = self._async_current_ids()
current_hosts = {
urlparse(entry.data[CONF_HOST]).hostname
hostname_from_url(entry.data[CONF_HOST])
for entry in self._async_current_entries(include_ignore=False)
}
discovered_devices = await async_discover_devices(
@@ -344,7 +343,7 @@ class ConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
if self._url_already_configured(url):
return self.async_abort(reason="address_already_configured")
host = urlparse(url).hostname
host = hostname_from_url(url)
_LOGGER.debug(
"Importing is trying to fill unique id from discovery for %s", host
)
@@ -367,10 +366,10 @@ class ConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
def _url_already_configured(self, url: str) -> bool:
"""See if we already have a elkm1 matching user input configured."""
existing_hosts = {
urlparse(entry.data[CONF_HOST]).hostname
hostname_from_url(entry.data[CONF_HOST])
for entry in self._async_current_entries()
}
return urlparse(url).hostname in existing_hosts
return hostname_from_url(url) in existing_hosts
class InvalidAuth(exceptions.HomeAssistantError):
@@ -2,6 +2,7 @@
from __future__ import annotations
import asyncio
from collections.abc import Callable
import copy
from dataclasses import dataclass
import logging
@@ -22,6 +23,7 @@ from homeassistant.const import (
VOLUME_GALLONS,
VOLUME_LITERS,
UnitOfEnergy,
UnitOfVolume,
)
from homeassistant.core import (
HomeAssistant,
@@ -34,29 +36,35 @@ from homeassistant.helpers import entity_registry as er
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.event import async_track_state_change_event
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from homeassistant.util import unit_conversion
import homeassistant.util.dt as dt_util
from homeassistant.util.unit_system import METRIC_SYSTEM
from .const import DOMAIN
from .data import EnergyManager, async_get_manager
SUPPORTED_STATE_CLASSES = [
SUPPORTED_STATE_CLASSES = {
SensorStateClass.MEASUREMENT,
SensorStateClass.TOTAL,
SensorStateClass.TOTAL_INCREASING,
]
VALID_ENERGY_UNITS = [
}
VALID_ENERGY_UNITS: set[str] = {
UnitOfEnergy.WATT_HOUR,
UnitOfEnergy.KILO_WATT_HOUR,
UnitOfEnergy.MEGA_WATT_HOUR,
UnitOfEnergy.GIGA_JOULE,
]
VALID_ENERGY_UNITS_GAS = [VOLUME_CUBIC_FEET, VOLUME_CUBIC_METERS] + VALID_ENERGY_UNITS
VALID_VOLUME_UNITS_WATER = [
}
VALID_ENERGY_UNITS_GAS = {
VOLUME_CUBIC_FEET,
VOLUME_CUBIC_METERS,
*VALID_ENERGY_UNITS,
}
VALID_VOLUME_UNITS_WATER = {
VOLUME_CUBIC_FEET,
VOLUME_CUBIC_METERS,
VOLUME_GALLONS,
VOLUME_LITERS,
]
}
_LOGGER = logging.getLogger(__name__)
@@ -252,8 +260,24 @@ class EnergyCostSensor(SensorEntity):
self.async_write_ha_state()
@callback
def _update_cost(self) -> None: # noqa: C901
def _update_cost(self) -> None:
"""Update incurred costs."""
if self._adapter.source_type == "grid":
valid_units = VALID_ENERGY_UNITS
default_price_unit: str | None = UnitOfEnergy.KILO_WATT_HOUR
elif self._adapter.source_type == "gas":
valid_units = VALID_ENERGY_UNITS_GAS
# No conversion for gas.
default_price_unit = None
elif self._adapter.source_type == "water":
valid_units = VALID_VOLUME_UNITS_WATER
if self.hass.config.units is METRIC_SYSTEM:
default_price_unit = UnitOfVolume.CUBIC_METERS
else:
default_price_unit = UnitOfVolume.GALLONS
energy_state = self.hass.states.get(
cast(str, self._config[self._adapter.stat_energy_key])
)
@@ -298,52 +322,27 @@ class EnergyCostSensor(SensorEntity):
except ValueError:
return
if energy_price_state.attributes.get(ATTR_UNIT_OF_MEASUREMENT, "").endswith(
f"/{UnitOfEnergy.WATT_HOUR}"
):
energy_price *= 1000.0
energy_price_unit: str | None = energy_price_state.attributes.get(
ATTR_UNIT_OF_MEASUREMENT, ""
).partition("/")[2]
if energy_price_state.attributes.get(ATTR_UNIT_OF_MEASUREMENT, "").endswith(
f"/{UnitOfEnergy.MEGA_WATT_HOUR}"
):
energy_price /= 1000.0
if energy_price_state.attributes.get(ATTR_UNIT_OF_MEASUREMENT, "").endswith(
f"/{UnitOfEnergy.GIGA_JOULE}"
):
energy_price /= 1000 / 3.6
# For backwards compatibility we don't validate the unit of the price
# If it is not valid, we assume it's our default price unit.
if energy_price_unit not in valid_units:
energy_price_unit = default_price_unit
else:
energy_price_state = None
energy_price = cast(float, self._config["number_energy_price"])
energy_price_unit = default_price_unit
if self._last_energy_sensor_state is None:
# Initialize as it's the first time all required entities are in place.
self._reset(energy_state)
return
energy_unit = energy_state.attributes.get(ATTR_UNIT_OF_MEASUREMENT)
energy_unit: str | None = energy_state.attributes.get(ATTR_UNIT_OF_MEASUREMENT)
if self._adapter.source_type == "grid":
if energy_unit not in VALID_ENERGY_UNITS:
energy_unit = None
elif self._adapter.source_type == "gas":
if energy_unit not in VALID_ENERGY_UNITS_GAS:
energy_unit = None
elif self._adapter.source_type == "water":
if energy_unit not in VALID_VOLUME_UNITS_WATER:
energy_unit = None
if energy_unit == UnitOfEnergy.WATT_HOUR:
energy_price /= 1000
elif energy_unit == UnitOfEnergy.MEGA_WATT_HOUR:
energy_price *= 1000
elif energy_unit == UnitOfEnergy.GIGA_JOULE:
energy_price *= 1000 / 3.6
if energy_unit is None:
if energy_unit is None or energy_unit not in valid_units:
if not self._wrong_unit_reported:
self._wrong_unit_reported = True
_LOGGER.warning(
@@ -373,10 +372,30 @@ class EnergyCostSensor(SensorEntity):
energy_state_copy = copy.copy(energy_state)
energy_state_copy.state = "0.0"
self._reset(energy_state_copy)
# Update with newly incurred cost
old_energy_value = float(self._last_energy_sensor_state.state)
cur_value = cast(float, self._attr_native_value)
self._attr_native_value = cur_value + (energy - old_energy_value) * energy_price
if energy_price_unit is None:
converted_energy_price = energy_price
else:
if self._adapter.source_type == "grid":
converter: Callable[
[float, str, str], float
] = unit_conversion.EnergyConverter.convert
elif self._adapter.source_type in ("gas", "water"):
converter = unit_conversion.VolumeConverter.convert
converted_energy_price = converter(
energy_price,
energy_unit,
energy_price_unit,
)
self._attr_native_value = (
cur_value + (energy - old_energy_value) * converted_energy_price
)
self._last_energy_sensor_state = energy_state
@@ -249,6 +249,8 @@ async def async_setup_entry( # noqa: C901
async def on_disconnect() -> None:
"""Run disconnect callbacks on API disconnect."""
name = entry_data.device_info.name if entry_data.device_info else host
_LOGGER.debug("%s: %s disconnected, running disconnected callbacks", name, host)
for disconnect_cb in entry_data.disconnect_callbacks:
disconnect_cb()
entry_data.disconnect_callbacks = []
@@ -30,13 +30,15 @@ def _async_can_connect_factory(
@hass_callback
def _async_can_connect() -> bool:
"""Check if a given source can make another connection."""
can_connect = bool(entry_data.available and entry_data.ble_connections_free)
_LOGGER.debug(
"Checking if %s can connect, available=%s, ble_connections_free=%s",
"%s: Checking can connect, available=%s, ble_connections_free=%s result=%s",
source,
entry_data.available,
entry_data.ble_connections_free,
can_connect,
)
return bool(entry_data.available and entry_data.ble_connections_free)
return can_connect
return _async_can_connect
@@ -55,7 +57,7 @@ async def async_connect_scanner(
version = entry_data.device_info.bluetooth_proxy_version
connectable = version >= 2
_LOGGER.debug(
"Connecting scanner for %s, version=%s, connectable=%s",
"%s: Connecting scanner version=%s, connectable=%s",
source,
version,
connectable,
@@ -7,6 +7,11 @@ import logging
from typing import Any, TypeVar, cast
import uuid
from aioesphomeapi import (
ESP_CONNECTION_ERROR_DESCRIPTION,
ESPHOME_GATT_ERRORS,
BLEConnectionError,
)
from aioesphomeapi.connection import APIConnectionError, TimeoutAPIError
import async_timeout
from bleak.backends.characteristic import BleakGATTCharacteristic
@@ -60,7 +65,7 @@ def verify_connected(func: _WrapFuncType) -> _WrapFuncType:
if disconnected_event.is_set():
task.cancel()
raise BleakError(
f"{self._ble_device.name} ({self._ble_device.address}): " # pylint: disable=protected-access
f"{self._source}: {self._ble_device.name} - {self._ble_device.address}: " # pylint: disable=protected-access
"Disconnected during operation"
)
return next(iter(done)).result()
@@ -119,25 +124,41 @@ class ESPHomeClient(BaseBleakClient):
self._cancel_connection_state()
except (AssertionError, ValueError) as ex:
_LOGGER.debug(
"Failed to unsubscribe from connection state (likely connection dropped): %s",
"%s: %s - %s: Failed to unsubscribe from connection state (likely connection dropped): %s",
self._source,
self._ble_device.name,
self._ble_device.address,
ex,
)
self._cancel_connection_state = None
def _async_ble_device_disconnected(self) -> None:
"""Handle the BLE device disconnecting from the ESP."""
_LOGGER.debug("%s: BLE device disconnected", self._source)
self._is_connected = False
was_connected = self._is_connected
self.services = BleakGATTServiceCollection() # type: ignore[no-untyped-call]
self._is_connected = False
self._notify_cancels.clear()
if self._disconnected_event:
self._disconnected_event.set()
self._disconnected_event = None
self._async_call_bleak_disconnected_callback()
if was_connected:
_LOGGER.debug(
"%s: %s - %s: BLE device disconnected",
self._source,
self._ble_device.name,
self._ble_device.address,
)
self._async_call_bleak_disconnected_callback()
self._unsubscribe_connection_state()
def _async_esp_disconnected(self) -> None:
"""Handle the esp32 client disconnecting from hass."""
_LOGGER.debug("%s: ESP device disconnected", self._source)
_LOGGER.debug(
"%s: %s - %s: ESP device disconnected",
self._source,
self._ble_device.name,
self._ble_device.address,
)
self.entry_data.disconnect_callbacks.remove(self._async_esp_disconnected)
self._async_ble_device_disconnected()
@@ -167,7 +188,10 @@ class ESPHomeClient(BaseBleakClient):
) -> None:
"""Handle a connect or disconnect."""
_LOGGER.debug(
"Connection state changed: connected=%s mtu=%s error=%s",
"%s: %s - %s: Connection state changed to connected=%s mtu=%s error=%s",
self._source,
self._ble_device.name,
self._ble_device.address,
connected,
mtu,
error,
@@ -182,8 +206,19 @@ class ESPHomeClient(BaseBleakClient):
return
if error:
try:
ble_connection_error = BLEConnectionError(error)
ble_connection_error_name = ble_connection_error.name
human_error = ESP_CONNECTION_ERROR_DESCRIPTION[ble_connection_error]
except (KeyError, ValueError):
ble_connection_error_name = str(error)
human_error = ESPHOME_GATT_ERRORS.get(
error, f"Unknown error code {error}"
)
connected_future.set_exception(
BleakError(f"Error while connecting: {error}")
BleakError(
f"Error {ble_connection_error_name} while connecting: {human_error}"
)
)
return
@@ -191,6 +226,12 @@ class ESPHomeClient(BaseBleakClient):
connected_future.set_exception(BleakError("Disconnected"))
return
_LOGGER.debug(
"%s: %s - %s: connected, registering for disconnected callbacks",
self._source,
self._ble_device.name,
self._ble_device.address,
)
self.entry_data.disconnect_callbacks.append(self._async_esp_disconnected)
connected_future.set_result(connected)
@@ -218,7 +259,10 @@ class ESPHomeClient(BaseBleakClient):
if self.entry_data.ble_connections_free:
return
_LOGGER.debug(
"%s: Out of connection slots, waiting for a free one", self._source
"%s: %s - %s: Out of connection slots, waiting for a free one",
self._source,
self._ble_device.name,
self._ble_device.address,
)
async with async_timeout.timeout(timeout):
await self.entry_data.wait_for_ble_connections_free()
@@ -255,25 +299,34 @@ class ESPHomeClient(BaseBleakClient):
A :py:class:`bleak.backends.service.BleakGATTServiceCollection` with this device's services tree.
"""
address_as_int = self._address_as_int
domain_data = self.domain_data
entry_data = self.entry_data
if dangerous_use_bleak_cache and (
cached_services := domain_data.get_gatt_services_cache(address_as_int)
cached_services := entry_data.get_gatt_services_cache(address_as_int)
):
_LOGGER.debug(
"Cached services hit for %s - %s",
"%s: %s - %s: Cached services hit",
self._source,
self._ble_device.name,
self._ble_device.address,
)
self.services = cached_services
return self.services
_LOGGER.debug(
"Cached services miss for %s - %s",
"%s: %s - %s: Cached services miss",
self._source,
self._ble_device.name,
self._ble_device.address,
)
esphome_services = await self._client.bluetooth_gatt_get_services(
address_as_int
)
_LOGGER.debug(
"%s: %s - %s: Got services: %s",
self._source,
self._ble_device.name,
self._ble_device.address,
esphome_services,
)
max_write_without_response = self.mtu_size - GATT_HEADER_SIZE
services = BleakGATTServiceCollection() # type: ignore[no-untyped-call]
for service in esphome_services.services:
@@ -297,11 +350,12 @@ class ESPHomeClient(BaseBleakClient):
)
self.services = services
_LOGGER.debug(
"Cached services saved for %s - %s",
"%s: %s - %s: Cached services saved",
self._source,
self._ble_device.name,
self._ble_device.address,
)
domain_data.set_gatt_services_cache(address_as_int, services)
entry_data.set_gatt_services_cache(address_as_int, services)
return services
def _resolve_characteristic(
@@ -410,12 +464,20 @@ class ESPHomeClient(BaseBleakClient):
UUID or directly by the BleakGATTCharacteristic object representing it.
callback (function): The function to be called on notification.
"""
ble_handle = characteristic.handle
if ble_handle in self._notify_cancels:
raise BleakError(
"Notifications are already enabled on "
f"service:{characteristic.service_uuid} "
f"characteristic:{characteristic.uuid} "
f"handle:{ble_handle}"
)
cancel_coro = await self._client.bluetooth_gatt_start_notify(
self._address_as_int,
characteristic.handle,
ble_handle,
lambda handle, data: callback(data),
)
self._notify_cancels[characteristic.handle] = cancel_coro
self._notify_cancels[ble_handle] = cancel_coro
@api_error_as_bleak_error
async def stop_notify(
@@ -430,5 +492,7 @@ class ESPHomeClient(BaseBleakClient):
directly by the BleakGATTCharacteristic object representing it.
"""
characteristic = self._resolve_characteristic(char_specifier)
coro = self._notify_cancels.pop(characteristic.handle)
await coro()
# Do not raise KeyError if notifications are not enabled on this characteristic
# to be consistent with the behavior of the BlueZ backend
if coro := self._notify_cancels.pop(characteristic.handle, None):
await coro()
@@ -6,6 +6,7 @@ import datetime
from datetime import timedelta
import re
import time
from typing import Final
from aioesphomeapi import BluetoothLEAdvertisement
from bleak.backends.device import BLEDevice
@@ -19,9 +20,19 @@ from homeassistant.components.bluetooth import (
)
from homeassistant.core import CALLBACK_TYPE, HomeAssistant, callback
from homeassistant.helpers.event import async_track_time_interval
from homeassistant.util.dt import monotonic_time_coarse
TWO_CHAR = re.compile("..")
# The maximum time between advertisements for a device to be considered
# stale when the advertisement tracker can determine the interval for
# connectable devices.
#
# BlueZ uses 180 seconds by default but we give it a bit more time
# to account for the esp32's bluetooth stack being a bit slower
# than BlueZ's.
CONNECTABLE_FALLBACK_MAXIMUM_STALE_ADVERTISEMENT_SECONDS: Final = 195
class ESPHomeScanner(BaseHaScanner):
"""Scanner for esphome."""
@@ -44,8 +55,12 @@ class ESPHomeScanner(BaseHaScanner):
self._connector = connector
self._connectable = connectable
self._details: dict[str, str | HaBluetoothConnector] = {"source": scanner_id}
self._fallback_seconds = FALLBACK_MAXIMUM_STALE_ADVERTISEMENT_SECONDS
if connectable:
self._details["connector"] = connector
self._fallback_seconds = (
CONNECTABLE_FALLBACK_MAXIMUM_STALE_ADVERTISEMENT_SECONDS
)
@callback
def async_setup(self) -> CALLBACK_TYPE:
@@ -60,7 +75,7 @@ class ESPHomeScanner(BaseHaScanner):
expired = [
address
for address, timestamp in self._discovered_device_timestamps.items()
if now - timestamp > FALLBACK_MAXIMUM_STALE_ADVERTISEMENT_SECONDS
if now - timestamp > self._fallback_seconds
]
for address in expired:
del self._discovered_device_advertisement_datas[address]
@@ -84,7 +99,7 @@ class ESPHomeScanner(BaseHaScanner):
@callback
def async_on_advertisement(self, adv: BluetoothLEAdvertisement) -> None:
"""Call the registered callback."""
now = time.monotonic()
now = monotonic_time_coarse()
address = ":".join(TWO_CHAR.findall("%012X" % adv.address)) # must be upper
name = adv.name
if prev_discovery := self._discovered_device_advertisement_datas.get(address):
@@ -1,13 +1,9 @@
"""Support for esphome domain data."""
from __future__ import annotations
from collections.abc import MutableMapping
from dataclasses import dataclass, field
from typing import TypeVar, cast
from bleak.backends.service import BleakGATTServiceCollection
from lru import LRU # pylint: disable=no-name-in-module
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.json import JSONEncoder
@@ -17,7 +13,6 @@ from .entry_data import RuntimeEntryData
STORAGE_VERSION = 1
DOMAIN = "esphome"
MAX_CACHED_SERVICES = 128
_DomainDataSelfT = TypeVar("_DomainDataSelfT", bound="DomainData")
@@ -29,21 +24,6 @@ class DomainData:
_entry_datas: dict[str, RuntimeEntryData] = field(default_factory=dict)
_stores: dict[str, Store] = field(default_factory=dict)
_entry_by_unique_id: dict[str, ConfigEntry] = field(default_factory=dict)
_gatt_services_cache: MutableMapping[int, BleakGATTServiceCollection] = field(
default_factory=lambda: LRU(MAX_CACHED_SERVICES) # type: ignore[no-any-return]
)
def get_gatt_services_cache(
self, address: int
) -> BleakGATTServiceCollection | None:
"""Get the BleakGATTServiceCollection for the given address."""
return self._gatt_services_cache.get(address)
def set_gatt_services_cache(
self, address: int, services: BleakGATTServiceCollection
) -> None:
"""Set the BleakGATTServiceCollection for the given address."""
self._gatt_services_cache[address] = services
def get_by_unique_id(self, unique_id: str) -> ConfigEntry:
"""Get the config entry by its unique ID."""
@@ -2,7 +2,7 @@
from __future__ import annotations
import asyncio
from collections.abc import Callable
from collections.abc import Callable, MutableMapping
from dataclasses import dataclass, field
import logging
from typing import Any, cast
@@ -30,6 +30,8 @@ from aioesphomeapi import (
UserService,
)
from aioesphomeapi.model import ButtonInfo
from bleak.backends.service import BleakGATTServiceCollection
from lru import LRU # pylint: disable=no-name-in-module
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import Platform
@@ -57,6 +59,7 @@ INFO_TYPE_TO_PLATFORM: dict[type[EntityInfo], str] = {
SwitchInfo: Platform.SWITCH,
TextSensorInfo: Platform.SENSOR,
}
MAX_CACHED_SERVICES = 128
@dataclass
@@ -92,12 +95,37 @@ class RuntimeEntryData:
_ble_connection_free_futures: list[asyncio.Future[int]] = field(
default_factory=list
)
_gatt_services_cache: MutableMapping[int, BleakGATTServiceCollection] = field(
default_factory=lambda: LRU(MAX_CACHED_SERVICES) # type: ignore[no-any-return]
)
@property
def name(self) -> str:
"""Return the name of the device."""
return self.device_info.name if self.device_info else self.entry_id
def get_gatt_services_cache(
self, address: int
) -> BleakGATTServiceCollection | None:
"""Get the BleakGATTServiceCollection for the given address."""
return self._gatt_services_cache.get(address)
def set_gatt_services_cache(
self, address: int, services: BleakGATTServiceCollection
) -> None:
"""Set the BleakGATTServiceCollection for the given address."""
self._gatt_services_cache[address] = services
@callback
def async_update_ble_connection_limits(self, free: int, limit: int) -> None:
"""Update the BLE connection limits."""
name = self.device_info.name if self.device_info else self.entry_id
_LOGGER.debug("%s: BLE connection limits: %s/%s", name, free, limit)
_LOGGER.debug(
"%s: BLE connection limits: used=%s free=%s limit=%s",
self.name,
limit - free,
free,
limit,
)
self.ble_connections_free = free
self.ble_connections_limit = limit
if free:
@@ -168,7 +196,8 @@ class RuntimeEntryData:
subscription_key = (type(state), state.key)
self.state[type(state)][state.key] = state
_LOGGER.debug(
"Dispatching update with key %s: %s",
"%s: dispatching update with key %s: %s",
self.name,
subscription_key,
state,
)
@@ -3,7 +3,7 @@
"name": "ESPHome",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/esphome",
"requirements": ["aioesphomeapi==11.2.0"],
"requirements": ["aioesphomeapi==11.4.3"],
"zeroconf": ["_esphomelib._tcp.local."],
"dhcp": [{ "registered_devices": true }],
"codeowners": ["@OttoWinter", "@jesserockz"],
@@ -79,6 +79,7 @@ class IncidentsSensor(RestoreEntity, SensorEntity):
"type",
"responder_mode",
"can_respond_until",
"task_ids",
):
if data.get(value):
attr[value] = data[value]
@@ -14,5 +14,6 @@ def get_valid_flume_devices(flume_devices: FlumeDeviceList) -> list[dict[str, An
return [
device
for device in flume_devices.device_list
if KEY_DEVICE_LOCATION_NAME in device[KEY_DEVICE_LOCATION]
if KEY_DEVICE_LOCATION in device
and KEY_DEVICE_LOCATION_NAME in device[KEY_DEVICE_LOCATION]
]
@@ -2,7 +2,7 @@
"domain": "frontend",
"name": "Home Assistant Frontend",
"documentation": "https://www.home-assistant.io/integrations/frontend",
"requirements": ["home-assistant-frontend==20221027.0"],
"requirements": ["home-assistant-frontend==20221108.0"],
"dependencies": [
"api",
"auth",
@@ -3,11 +3,12 @@
from __future__ import annotations
import asyncio
from collections.abc import Iterable
from datetime import datetime, timedelta
import logging
from typing import Any
from gcal_sync.api import SyncEventsRequest
from gcal_sync.api import GoogleCalendarService, ListEventsRequest, SyncEventsRequest
from gcal_sync.exceptions import ApiException
from gcal_sync.model import DateOrDatetime, Event
from gcal_sync.store import ScopedCalendarStore
@@ -196,21 +197,30 @@ async def async_setup_entry(
entity_registry.async_remove(
entity_entry.entity_id,
)
request_template = SyncEventsRequest(
calendar_id=calendar_id,
search=data.get(CONF_SEARCH),
start_time=dt_util.now() + SYNC_EVENT_MIN_TIME,
)
sync = CalendarEventSyncManager(
calendar_service,
store=ScopedCalendarStore(store, unique_id or entity_name),
request_template=request_template,
)
coordinator = CalendarUpdateCoordinator(
hass,
sync,
data[CONF_NAME],
)
coordinator: CalendarSyncUpdateCoordinator | CalendarQueryUpdateCoordinator
if search := data.get(CONF_SEARCH):
coordinator = CalendarQueryUpdateCoordinator(
hass,
calendar_service,
data[CONF_NAME],
calendar_id,
search,
)
else:
request_template = SyncEventsRequest(
calendar_id=calendar_id,
start_time=dt_util.now() + SYNC_EVENT_MIN_TIME,
)
sync = CalendarEventSyncManager(
calendar_service,
store=ScopedCalendarStore(store, unique_id or entity_name),
request_template=request_template,
)
coordinator = CalendarSyncUpdateCoordinator(
hass,
sync,
data[CONF_NAME],
)
entities.append(
GoogleCalendarEntity(
coordinator,
@@ -242,8 +252,8 @@ async def async_setup_entry(
)
class CalendarUpdateCoordinator(DataUpdateCoordinator[Timeline]):
"""Coordinator for calendar RPC calls."""
class CalendarSyncUpdateCoordinator(DataUpdateCoordinator[Timeline]):
"""Coordinator for calendar RPC calls that use an efficient sync."""
def __init__(
self,
@@ -251,7 +261,7 @@ class CalendarUpdateCoordinator(DataUpdateCoordinator[Timeline]):
sync: CalendarEventSyncManager,
name: str,
) -> None:
"""Create the Calendar event device."""
"""Create the CalendarSyncUpdateCoordinator."""
super().__init__(
hass,
_LOGGER,
@@ -271,6 +281,87 @@ class CalendarUpdateCoordinator(DataUpdateCoordinator[Timeline]):
dt_util.DEFAULT_TIME_ZONE
)
async def async_get_events(
self, start_date: datetime, end_date: datetime
) -> Iterable[Event]:
"""Get all events in a specific time frame."""
if not self.data:
raise HomeAssistantError(
"Unable to get events: Sync from server has not completed"
)
return self.data.overlapping(
dt_util.as_local(start_date),
dt_util.as_local(end_date),
)
@property
def upcoming(self) -> Iterable[Event] | None:
"""Return upcoming events if any."""
if self.data:
return self.data.active_after(dt_util.now())
return None
class CalendarQueryUpdateCoordinator(DataUpdateCoordinator[list[Event]]):
"""Coordinator for calendar RPC calls.
This sends a polling RPC, not using sync, as a workaround
for limitations in the calendar API for supporting search.
"""
def __init__(
self,
hass: HomeAssistant,
calendar_service: GoogleCalendarService,
name: str,
calendar_id: str,
search: str | None,
) -> None:
"""Create the CalendarQueryUpdateCoordinator."""
super().__init__(
hass,
_LOGGER,
name=name,
update_interval=MIN_TIME_BETWEEN_UPDATES,
)
self.calendar_service = calendar_service
self.calendar_id = calendar_id
self._search = search
async def async_get_events(
self, start_date: datetime, end_date: datetime
) -> Iterable[Event]:
"""Get all events in a specific time frame."""
request = ListEventsRequest(
calendar_id=self.calendar_id,
start_time=start_date,
end_time=end_date,
search=self._search,
)
result_items = []
try:
result = await self.calendar_service.async_list_events(request)
async for result_page in result:
result_items.extend(result_page.items)
except ApiException as err:
self.async_set_update_error(err)
raise HomeAssistantError(str(err)) from err
return result_items
async def _async_update_data(self) -> list[Event]:
"""Fetch data from API endpoint."""
request = ListEventsRequest(calendar_id=self.calendar_id, search=self._search)
try:
result = await self.calendar_service.async_list_events(request)
except ApiException as err:
raise UpdateFailed(f"Error communicating with API: {err}") from err
return result.items
@property
def upcoming(self) -> Iterable[Event] | None:
"""Return the next upcoming event if any."""
return self.data
class GoogleCalendarEntity(CoordinatorEntity, CalendarEntity):
"""A calendar event entity."""
@@ -279,7 +370,7 @@ class GoogleCalendarEntity(CoordinatorEntity, CalendarEntity):
def __init__(
self,
coordinator: CalendarUpdateCoordinator,
coordinator: CalendarSyncUpdateCoordinator | CalendarQueryUpdateCoordinator,
calendar_id: str,
data: dict[str, Any],
entity_id: str,
@@ -352,14 +443,7 @@ class GoogleCalendarEntity(CoordinatorEntity, CalendarEntity):
self, hass: HomeAssistant, start_date: datetime, end_date: datetime
) -> list[CalendarEvent]:
"""Get all events in a specific time frame."""
if not (timeline := self.coordinator.data):
raise HomeAssistantError(
"Unable to get events: Sync from server has not completed"
)
result_items = timeline.overlapping(
dt_util.as_local(start_date),
dt_util.as_local(end_date),
)
result_items = await self.coordinator.async_get_events(start_date, end_date)
return [
_get_calendar_event(event)
for event in filter(self._event_filter, result_items)
@@ -367,14 +451,12 @@ class GoogleCalendarEntity(CoordinatorEntity, CalendarEntity):
def _apply_coordinator_update(self) -> None:
"""Copy state from the coordinator to this entity."""
if (timeline := self.coordinator.data) and (
api_event := next(
filter(
self._event_filter,
timeline.active_after(dt_util.now()),
),
None,
)
if api_event := next(
filter(
self._event_filter,
self.coordinator.upcoming or [],
),
None,
):
self._event = _get_calendar_event(api_event)
(self._event.summary, self._offset_value) = extract_offset(
@@ -4,7 +4,7 @@
"config_flow": true,
"dependencies": ["application_credentials"],
"documentation": "https://www.home-assistant.io/integrations/calendar.google/",
"requirements": ["gcal-sync==2.2.0", "oauth2client==4.1.3"],
"requirements": ["gcal-sync==4.0.0", "oauth2client==4.1.3"],
"codeowners": ["@allenporter"],
"iot_class": "cloud_polling",
"loggers": ["googleapiclient"]
@@ -30,6 +30,7 @@
"time_type": "Time Type",
"time": "Time",
"avoid": "Avoid",
"traffic_mode": "Traffic Mode",
"transit_mode": "Transit Mode",
"transit_routing_preference": "Transit Routing Preference",
"units": "Units"
@@ -28,6 +28,7 @@
"mode": "Travel Mode",
"time": "Time",
"time_type": "Time Type",
"traffic_mode": "Traffic Mode",
"transit_mode": "Transit Mode",
"transit_routing_preference": "Transit Routing Preference",
"units": "Units"
@@ -77,6 +77,7 @@ from .discovery import HassioServiceInfo, async_setup_discovery_view # noqa: F4
from .handler import HassIO, HassioAPIError, api_data
from .http import HassIOView
from .ingress import async_setup_ingress_view
from .repairs import SupervisorRepairs
from .websocket_api import async_load_websocket_api
_LOGGER = logging.getLogger(__name__)
@@ -103,6 +104,7 @@ DATA_SUPERVISOR_INFO = "hassio_supervisor_info"
DATA_ADDONS_CHANGELOGS = "hassio_addons_changelogs"
DATA_ADDONS_INFO = "hassio_addons_info"
DATA_ADDONS_STATS = "hassio_addons_stats"
DATA_SUPERVISOR_REPAIRS = "supervisor_repairs"
HASSIO_UPDATE_INTERVAL = timedelta(minutes=5)
ADDONS_COORDINATOR = "hassio_addons_coordinator"
@@ -758,6 +760,10 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool: # noqa:
hass.config_entries.flow.async_init(DOMAIN, context={"source": "system"})
)
# Start listening for problems with supervisor and making repairs
hass.data[DATA_SUPERVISOR_REPAIRS] = repairs = SupervisorRepairs(hass, hassio)
await repairs.setup()
return True
@@ -11,19 +11,26 @@ ATTR_CONFIG = "config"
ATTR_DATA = "data"
ATTR_DISCOVERY = "discovery"
ATTR_ENABLE = "enable"
ATTR_ENDPOINT = "endpoint"
ATTR_FOLDERS = "folders"
ATTR_HEALTHY = "healthy"
ATTR_HOMEASSISTANT = "homeassistant"
ATTR_INPUT = "input"
ATTR_METHOD = "method"
ATTR_PANELS = "panels"
ATTR_PASSWORD = "password"
ATTR_RESULT = "result"
ATTR_SUPPORTED = "supported"
ATTR_TIMEOUT = "timeout"
ATTR_TITLE = "title"
ATTR_UNHEALTHY = "unhealthy"
ATTR_UNHEALTHY_REASONS = "unhealthy_reasons"
ATTR_UNSUPPORTED = "unsupported"
ATTR_UNSUPPORTED_REASONS = "unsupported_reasons"
ATTR_UPDATE_KEY = "update_key"
ATTR_USERNAME = "username"
ATTR_UUID = "uuid"
ATTR_WS_EVENT = "event"
-ATTR_ENDPOINT = "endpoint"
-ATTR_METHOD = "method"
-ATTR_RESULT = "result"
-ATTR_TIMEOUT = "timeout"
X_AUTH_TOKEN = "X-Supervisor-Token"
X_INGRESS_PATH = "X-Ingress-Path"
@@ -38,6 +45,11 @@ WS_TYPE_EVENT = "supervisor/event"
WS_TYPE_SUBSCRIBE = "supervisor/subscribe"
EVENT_SUPERVISOR_EVENT = "supervisor_event"
EVENT_SUPERVISOR_UPDATE = "supervisor_update"
EVENT_HEALTH_CHANGED = "health_changed"
EVENT_SUPPORTED_CHANGED = "supported_changed"
UPDATE_KEY_SUPERVISOR = "supervisor"
ATTR_AUTO_UPDATE = "auto_update"
ATTR_VERSION = "version"
@@ -51,7 +63,6 @@ ATTR_STARTED = "started"
ATTR_URL = "url"
ATTR_REPOSITORY = "repository"
DATA_KEY_ADDONS = "addons"
DATA_KEY_OS = "os"
DATA_KEY_SUPERVISOR = "supervisor"
@@ -190,6 +190,14 @@ class HassIO:
"""
return self.send_command(f"/discovery/{uuid}", method="get")
@api_data
def get_resolution_info(self):
"""Return data for Supervisor resolution center.
This method returns a coroutine.
"""
return self.send_command("/resolution/info", method="get")
@_api_bool
async def update_hass_api(self, http_config, refresh_token):
"""Update Home Assistant API data on Hass.io."""
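`get_resolution_info` above proxies the Supervisor's `/resolution/info` endpoint, and the repairs code added in this release reads exactly two keys out of its payload. A minimal parsing sketch; the sample payload below is hypothetical, only the `unhealthy`/`unsupported` keys come from the source:

```python
def parse_resolution_info(data: dict) -> tuple[set[str], set[str]]:
    """Extract the unhealthy/unsupported reason sets from a /resolution/info payload."""
    return set(data["unhealthy"]), set(data["unsupported"])

# Hypothetical payload, shaped like the fields the repairs code consumes.
sample = {"unhealthy": ["docker"], "unsupported": ["os", "content_trust"]}
unhealthy, unsupported = parse_resolution_info(sample)
```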
@@ -0,0 +1,185 @@
"""Supervisor events monitor."""
from __future__ import annotations
from typing import Any
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.dispatcher import async_dispatcher_connect
from homeassistant.helpers.issue_registry import (
IssueSeverity,
async_create_issue,
async_delete_issue,
)
from .const import (
ATTR_DATA,
ATTR_HEALTHY,
ATTR_SUPPORTED,
ATTR_UNHEALTHY,
ATTR_UNHEALTHY_REASONS,
ATTR_UNSUPPORTED,
ATTR_UNSUPPORTED_REASONS,
ATTR_UPDATE_KEY,
ATTR_WS_EVENT,
DOMAIN,
EVENT_HEALTH_CHANGED,
EVENT_SUPERVISOR_EVENT,
EVENT_SUPERVISOR_UPDATE,
EVENT_SUPPORTED_CHANGED,
UPDATE_KEY_SUPERVISOR,
)
from .handler import HassIO
ISSUE_ID_UNHEALTHY = "unhealthy_system"
ISSUE_ID_UNSUPPORTED = "unsupported_system"
INFO_URL_UNHEALTHY = "https://www.home-assistant.io/more-info/unhealthy"
INFO_URL_UNSUPPORTED = "https://www.home-assistant.io/more-info/unsupported"
UNSUPPORTED_REASONS = {
"apparmor",
"connectivity_check",
"content_trust",
"dbus",
"dns_server",
"docker_configuration",
"docker_version",
"cgroup_version",
"job_conditions",
"lxc",
"network_manager",
"os",
"os_agent",
"restart_policy",
"software",
"source_mods",
"supervisor_version",
"systemd",
"systemd_journal",
"systemd_resolved",
}
# Some unsupported reasons also mark the system as unhealthy. If the unsupported reason
# provides no additional information beyond the unhealthy one then skip that repair.
UNSUPPORTED_SKIP_REPAIR = {"privileged"}
UNHEALTHY_REASONS = {
"docker",
"supervisor",
"setup",
"privileged",
"untrusted",
}
class SupervisorRepairs:
"""Create repairs from supervisor events."""
def __init__(self, hass: HomeAssistant, client: HassIO) -> None:
"""Initialize supervisor repairs."""
self._hass = hass
self._client = client
self._unsupported_reasons: set[str] = set()
self._unhealthy_reasons: set[str] = set()
@property
def unhealthy_reasons(self) -> set[str]:
"""Get unhealthy reasons. Returns empty set if system is healthy."""
return self._unhealthy_reasons
@unhealthy_reasons.setter
def unhealthy_reasons(self, reasons: set[str]) -> None:
"""Set unhealthy reasons. Create or delete repairs as necessary."""
for unhealthy in reasons - self.unhealthy_reasons:
if unhealthy in UNHEALTHY_REASONS:
translation_key = f"unhealthy_{unhealthy}"
translation_placeholders = None
else:
translation_key = "unhealthy"
translation_placeholders = {"reason": unhealthy}
async_create_issue(
self._hass,
DOMAIN,
f"{ISSUE_ID_UNHEALTHY}_{unhealthy}",
is_fixable=False,
learn_more_url=f"{INFO_URL_UNHEALTHY}/{unhealthy}",
severity=IssueSeverity.CRITICAL,
translation_key=translation_key,
translation_placeholders=translation_placeholders,
)
for fixed in self.unhealthy_reasons - reasons:
async_delete_issue(self._hass, DOMAIN, f"{ISSUE_ID_UNHEALTHY}_{fixed}")
self._unhealthy_reasons = reasons
@property
def unsupported_reasons(self) -> set[str]:
"""Get unsupported reasons. Returns empty set if system is supported."""
return self._unsupported_reasons
@unsupported_reasons.setter
def unsupported_reasons(self, reasons: set[str]) -> None:
"""Set unsupported reasons. Create or delete repairs as necessary."""
for unsupported in reasons - UNSUPPORTED_SKIP_REPAIR - self.unsupported_reasons:
if unsupported in UNSUPPORTED_REASONS:
translation_key = f"unsupported_{unsupported}"
translation_placeholders = None
else:
translation_key = "unsupported"
translation_placeholders = {"reason": unsupported}
async_create_issue(
self._hass,
DOMAIN,
f"{ISSUE_ID_UNSUPPORTED}_{unsupported}",
is_fixable=False,
learn_more_url=f"{INFO_URL_UNSUPPORTED}/{unsupported}",
severity=IssueSeverity.WARNING,
translation_key=translation_key,
translation_placeholders=translation_placeholders,
)
for fixed in self.unsupported_reasons - (reasons - UNSUPPORTED_SKIP_REPAIR):
async_delete_issue(self._hass, DOMAIN, f"{ISSUE_ID_UNSUPPORTED}_{fixed}")
self._unsupported_reasons = reasons
async def setup(self) -> None:
"""Create supervisor events listener."""
await self.update()
async_dispatcher_connect(
self._hass, EVENT_SUPERVISOR_EVENT, self._supervisor_events_to_repairs
)
async def update(self) -> None:
"""Update repairs from Supervisor resolution center."""
data = await self._client.get_resolution_info()
self.unhealthy_reasons = set(data[ATTR_UNHEALTHY])
self.unsupported_reasons = set(data[ATTR_UNSUPPORTED])
@callback
def _supervisor_events_to_repairs(self, event: dict[str, Any]) -> None:
"""Create repairs from supervisor events."""
if ATTR_WS_EVENT not in event:
return
if (
event[ATTR_WS_EVENT] == EVENT_SUPERVISOR_UPDATE
and event.get(ATTR_UPDATE_KEY) == UPDATE_KEY_SUPERVISOR
):
self._hass.async_create_task(self.update())
elif event[ATTR_WS_EVENT] == EVENT_HEALTH_CHANGED:
self.unhealthy_reasons = (
set()
if event[ATTR_DATA][ATTR_HEALTHY]
else set(event[ATTR_DATA][ATTR_UNHEALTHY_REASONS])
)
elif event[ATTR_WS_EVENT] == EVENT_SUPPORTED_CHANGED:
self.unsupported_reasons = (
set()
if event[ATTR_DATA][ATTR_SUPPORTED]
else set(event[ATTR_DATA][ATTR_UNSUPPORTED_REASONS])
)
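Both setters above apply the same reconciliation pattern: diff the incoming reason set against the stored one, open an issue for each newly reported reason, and close the issue for each reason that cleared. A standalone sketch of that pattern, with plain callables standing in for `async_create_issue`/`async_delete_issue`:

```python
def reconcile(current: set[str], incoming: set[str], create, delete) -> set[str]:
    """Open an entry per newly reported reason, close entries for cleared ones."""
    for reason in incoming - current:  # newly reported
        create(reason)
    for reason in current - incoming:  # no longer reported
        delete(reason)
    return incoming

created: list[str] = []
deleted: list[str] = []
reasons: set[str] = set()
reasons = reconcile(reasons, {"docker", "untrusted"}, created.append, deleted.append)
reasons = reconcile(reasons, {"docker"}, created.append, deleted.append)
# Each reason was created once; "untrusted" was deleted when it cleared.
```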
@@ -15,5 +15,115 @@
"update_channel": "Update Channel",
"version_api": "Version API"
}
},
"issues": {
"unhealthy": {
"title": "Unhealthy system - {reason}",
"description": "System is currently unhealthy due to {reason}. Use the link to learn more and how to fix this."
},
"unhealthy_docker": {
"title": "Unhealthy system - Docker misconfigured",
"description": "System is currently unhealthy because Docker is configured incorrectly. Use the link to learn more and how to fix this."
},
"unhealthy_supervisor": {
"title": "Unhealthy system - Supervisor update failed",
"description": "System is currently unhealthy because an attempt to update Supervisor to the latest version has failed. Use the link to learn more and how to fix this."
},
"unhealthy_setup": {
"title": "Unhealthy system - Setup failed",
"description": "System is currently unhealthy because setup failed to complete. There are a number of reasons this can occur, use the link to learn more and how to fix this."
},
"unhealthy_privileged": {
"title": "Unhealthy system - Not privileged",
"description": "System is currently unhealthy because it does not have privileged access to the docker runtime. Use the link to learn more and how to fix this."
},
"unhealthy_untrusted": {
"title": "Unhealthy system - Untrusted code",
"description": "System is currently unhealthy because it has detected untrusted code or images in use. Use the link to learn more and how to fix this."
},
"unsupported": {
"title": "Unsupported system - {reason}",
"description": "System is unsupported due to {reason}. Use the link to learn more and how to fix this."
},
"unsupported_apparmor": {
"title": "Unsupported system - AppArmor issues",
"description": "System is unsupported because AppArmor is working incorrectly and add-ons are running in an unprotected and insecure way. Use the link to learn more and how to fix this."
},
"unsupported_cgroup_version": {
"title": "Unsupported system - CGroup version",
"description": "System is unsupported because the wrong version of Docker CGroup is in use. Use the link to learn the correct version and how to fix this."
},
"unsupported_connectivity_check": {
"title": "Unsupported system - Connectivity check disabled",
"description": "System is unsupported because Home Assistant cannot determine when an internet connection is available. Use the link to learn more and how to fix this."
},
"unsupported_content_trust": {
"title": "Unsupported system - Content-trust check disabled",
"description": "System is unsupported because Home Assistant cannot verify content being run is trusted and not modified by attackers. Use the link to learn more and how to fix this."
},
"unsupported_dbus": {
"title": "Unsupported system - D-Bus issues",
"description": "System is unsupported because D-Bus is working incorrectly. Many things fail without this as Supervisor cannot communicate with the host. Use the link to learn more and how to fix this."
},
"unsupported_dns_server": {
"title": "Unsupported system - DNS server issues",
"description": "System is unsupported because the provided DNS server does not work correctly and the fallback DNS option has been disabled. Use the link to learn more and how to fix this."
},
"unsupported_docker_configuration": {
"title": "Unsupported system - Docker misconfigured",
"description": "System is unsupported because the Docker daemon is running in an unexpected way. Use the link to learn more and how to fix this."
},
"unsupported_docker_version": {
"title": "Unsupported system - Docker version",
"description": "System is unsupported because the wrong version of Docker is in use. Use the link to learn the correct version and how to fix this."
},
"unsupported_job_conditions": {
"title": "Unsupported system - Protections disabled",
"description": "System is unsupported because one or more job conditions have been disabled which protect from unexpected failures and breakages. Use the link to learn more and how to fix this."
},
"unsupported_lxc": {
"title": "Unsupported system - LXC detected",
"description": "System is unsupported because it is being run in an LXC virtual machine. Use the link to learn more and how to fix this."
},
"unsupported_network_manager": {
"title": "Unsupported system - Network Manager issues",
"description": "System is unsupported because Network Manager is missing, inactive or misconfigured. Use the link to learn more and how to fix this."
},
"unsupported_os": {
"title": "Unsupported system - Operating System",
"description": "System is unsupported because the operating system in use is not tested or maintained for use with Supervisor. Use the link to learn which operating systems are supported and how to fix this."
},
"unsupported_os_agent": {
"title": "Unsupported system - OS-Agent issues",
"description": "System is unsupported because OS-Agent is missing, inactive or misconfigured. Use the link to learn more and how to fix this."
},
"unsupported_restart_policy": {
"title": "Unsupported system - Container restart policy",
"description": "System is unsupported because a Docker container has a restart policy set which could cause issues on startup. Use the link to learn more and how to fix this."
},
"unsupported_software": {
"title": "Unsupported system - Unsupported software",
"description": "System is unsupported because additional software outside the Home Assistant ecosystem has been detected. Use the link to learn more and how to fix this."
},
"unsupported_source_mods": {
"title": "Unsupported system - Supervisor source modifications",
"description": "System is unsupported because Supervisor source code has been modified. Use the link to learn more and how to fix this."
},
"unsupported_supervisor_version": {
"title": "Unsupported system - Supervisor version",
"description": "System is unsupported because an out-of-date version of Supervisor is in use and auto-update has been disabled. Use the link to learn more and how to fix this."
},
"unsupported_systemd": {
"title": "Unsupported system - Systemd issues",
"description": "System is unsupported because Systemd is missing, inactive or misconfigured. Use the link to learn more and how to fix this."
},
"unsupported_systemd_journal": {
"title": "Unsupported system - Systemd Journal issues",
"description": "System is unsupported because Systemd Journal and/or the gateway service is missing, inactive or misconfigured. Use the link to learn more and how to fix this."
},
"unsupported_systemd_resolved": {
"title": "Unsupported system - Systemd-Resolved issues",
"description": "System is unsupported because Systemd Resolved is missing, inactive or misconfigured. Use the link to learn more and how to fix this."
}
}
}
@@ -1,4 +1,14 @@
{
"issues": {
"unhealthy": {
"description": "El sistema no \u00e9s saludable a causa de '{reason}'. Clica l'enlla\u00e7 per obtenir m\u00e9s informaci\u00f3 sobre qu\u00e8 falla aix\u00f2 i com solucionar-ho.",
"title": "Sistema no saludable - {reason}"
},
"unsupported": {
"description": "El sistema no \u00e9s compatible a causa de '{reason}'. Clica l'enlla\u00e7 per obtenir m\u00e9s informaci\u00f3 sobre qu\u00e8 significa aix\u00f2 i com tornar a un sistema compatible.",
"title": "Sistema no compatible - {reason}"
}
},
"system_health": {
"info": {
"agent_version": "Versi\u00f3 de l'agent",
@@ -1,4 +1,114 @@
{
"issues": {
"unhealthy": {
"description": "System is currently unhealthy due to {reason}. Use the link to learn more and how to fix this.",
"title": "Unhealthy system - {reason}"
},
"unhealthy_docker": {
"description": "System is currently unhealthy because Docker is configured incorrectly. Use the link to learn more and how to fix this.",
"title": "Unhealthy system - Docker misconfigured"
},
"unhealthy_privileged": {
"description": "System is currently unhealthy because it does not have privileged access to the docker runtime. Use the link to learn more and how to fix this.",
"title": "Unhealthy system - Not privileged"
},
"unhealthy_setup": {
"description": "System is currently unhealthy because setup failed to complete. There are a number of reasons this can occur, use the link to learn more and how to fix this.",
"title": "Unhealthy system - Setup failed"
},
"unhealthy_supervisor": {
"description": "System is currently unhealthy because an attempt to update Supervisor to the latest version has failed. Use the link to learn more and how to fix this.",
"title": "Unhealthy system - Supervisor update failed"
},
"unhealthy_untrusted": {
"description": "System is currently unhealthy because it has detected untrusted code or images in use. Use the link to learn more and how to fix this.",
"title": "Unhealthy system - Untrusted code"
},
"unsupported": {
"description": "System is unsupported due to {reason}. Use the link to learn more and how to fix this.",
"title": "Unsupported system - {reason}"
},
"unsupported_apparmor": {
"description": "System is unsupported because AppArmor is working incorrectly and add-ons are running in an unprotected and insecure way. Use the link to learn more and how to fix this.",
"title": "Unsupported system - AppArmor issues"
},
"unsupported_cgroup_version": {
"description": "System is unsupported because the wrong version of Docker CGroup is in use. Use the link to learn the correct version and how to fix this.",
"title": "Unsupported system - CGroup version"
},
"unsupported_connectivity_check": {
"description": "System is unsupported because Home Assistant cannot determine when an internet connection is available. Use the link to learn more and how to fix this.",
"title": "Unsupported system - Connectivity check disabled"
},
"unsupported_content_trust": {
"description": "System is unsupported because Home Assistant cannot verify content being run is trusted and not modified by attackers. Use the link to learn more and how to fix this.",
"title": "Unsupported system - Content-trust check disabled"
},
"unsupported_dbus": {
"description": "System is unsupported because D-Bus is working incorrectly. Many things fail without this as Supervisor cannot communicate with the host. Use the link to learn more and how to fix this.",
"title": "Unsupported system - D-Bus issues"
},
"unsupported_dns_server": {
"description": "System is unsupported because the provided DNS server does not work correctly and the fallback DNS option has been disabled. Use the link to learn more and how to fix this.",
"title": "Unsupported system - DNS server issues"
},
"unsupported_docker_configuration": {
"description": "System is unsupported because the Docker daemon is running in an unexpected way. Use the link to learn more and how to fix this.",
"title": "Unsupported system - Docker misconfigured"
},
"unsupported_docker_version": {
"description": "System is unsupported because the wrong version of Docker is in use. Use the link to learn the correct version and how to fix this.",
"title": "Unsupported system - Docker version"
},
"unsupported_job_conditions": {
"description": "System is unsupported because one or more job conditions have been disabled which protect from unexpected failures and breakages. Use the link to learn more and how to fix this.",
"title": "Unsupported system - Protections disabled"
},
"unsupported_lxc": {
"description": "System is unsupported because it is being run in an LXC virtual machine. Use the link to learn more and how to fix this.",
"title": "Unsupported system - LXC detected"
},
"unsupported_network_manager": {
"description": "System is unsupported because Network Manager is missing, inactive or misconfigured. Use the link to learn more and how to fix this.",
"title": "Unsupported system - Network Manager issues"
},
"unsupported_os": {
"description": "System is unsupported because the operating system in use is not tested or maintained for use with Supervisor. Use the link to learn which operating systems are supported and how to fix this.",
"title": "Unsupported system - Operating System"
},
"unsupported_os_agent": {
"description": "System is unsupported because OS-Agent is missing, inactive or misconfigured. Use the link to learn more and how to fix this.",
"title": "Unsupported system - OS-Agent issues"
},
"unsupported_restart_policy": {
"description": "System is unsupported because a Docker container has a restart policy set which could cause issues on startup. Use the link to learn more and how to fix this.",
"title": "Unsupported system - Container restart policy"
},
"unsupported_software": {
"description": "System is unsupported because additional software outside the Home Assistant ecosystem has been detected. Use the link to learn more and how to fix this.",
"title": "Unsupported system - Unsupported software"
},
"unsupported_source_mods": {
"description": "System is unsupported because Supervisor source code has been modified. Use the link to learn more and how to fix this.",
"title": "Unsupported system - Supervisor source modifications"
},
"unsupported_supervisor_version": {
"description": "System is unsupported because an out-of-date version of Supervisor is in use and auto-update has been disabled. Use the link to learn more and how to fix this.",
"title": "Unsupported system - Supervisor version"
},
"unsupported_systemd": {
"description": "System is unsupported because Systemd is missing, inactive or misconfigured. Use the link to learn more and how to fix this.",
"title": "Unsupported system - Systemd issues"
},
"unsupported_systemd_journal": {
"description": "System is unsupported because Systemd Journal and/or the gateway service is missing, inactive or misconfigured. Use the link to learn more and how to fix this.",
"title": "Unsupported system - Systemd Journal issues"
},
"unsupported_systemd_resolved": {
"description": "System is unsupported because Systemd Resolved is missing, inactive or misconfigured. Use the link to learn more and how to fix this.",
"title": "Unsupported system - Systemd-Resolved issues"
}
},
"system_health": {
"info": {
"agent_version": "Agent Version",
@@ -1,4 +1,14 @@
{
"issues": {
"unhealthy": {
"description": "Actualmente el sistema no est\u00e1 en buen estado debido a ''{reason}''. Utiliza el enlace para obtener m\u00e1s informaci\u00f3n sobre lo que est\u00e1 mal y c\u00f3mo solucionarlo.",
"title": "Sistema en mal estado: {reason}"
},
"unsupported": {
"description": "El sistema no es compatible debido a ''{reason}''. Utiliza el enlace para obtener m\u00e1s informaci\u00f3n sobre lo que esto significa y c\u00f3mo volver a un sistema compatible.",
"title": "Sistema no compatible: {reason}"
}
},
"system_health": {
"info": {
"agent_version": "Versi\u00f3n del agente",
@@ -1,4 +1,14 @@
{
"issues": {
"unhealthy": {
"description": "S\u00fcsteem ei ole praegu korras '{reason}' t\u00f5ttu. Kasuta linki, et saada rohkem teavet selle kohta, mis on valesti ja kuidas seda parandada.",
"title": "Vigane s\u00fcsteem \u2013 {reason}"
},
"unsupported": {
"description": "S\u00fcsteemi ei toetata '{reason}' t\u00f5ttu. Kasuta linki, et saada lisateavet selle kohta, mida see t\u00e4hendab ja kuidas toetatud s\u00fcsteemi naasta.",
"title": "Toetamata s\u00fcsteem \u2013 {reason}"
}
},
"system_health": {
"info": {
"agent_version": "Agendi versioon",
@@ -1,4 +1,14 @@
{
"issues": {
"unhealthy": {
"description": "A rendszer jelenleg renellenes \u00e1llapotban van '{reason}' miatt. A link seg\u00edts\u00e9g\u00e9vel t\u00f6bbet is megtudhat arr\u00f3l, hogy mi a probl\u00e9ma, \u00e9s hogyan jav\u00edthatja ki.",
"title": "Rendellenes \u00e1llapot \u2013 {reason}"
},
"unsupported": {
"description": "A rendszer nem t\u00e1mogatott a k\u00f6vetkez\u0151 miatt: '{reason}'. A hivatkoz\u00e1s seg\u00edts\u00e9g\u00e9vel t\u00f6bbet megtudhat arr\u00f3l, mit jelent ez, \u00e9s hogyan t\u00e9rhet vissza egy t\u00e1mogatott rendszerhez.",
"title": "Nem t\u00e1mogatott rendszer \u2013 {reason}"
}
},
"system_health": {
"info": {
"agent_version": "\u00dcgyn\u00f6k verzi\u00f3",
@@ -1,4 +1,14 @@
{
"issues": {
"unhealthy": {
"description": "O sistema n\u00e3o est\u00e1 \u00edntegro devido a '{reason}'. Use o link para saber mais sobre o que est\u00e1 errado e como corrigi-lo.",
"title": "Sistema insalubre - {reason}"
},
"unsupported": {
"description": "O sistema n\u00e3o \u00e9 suportado devido a '{reason}'. Use o link para saber mais sobre o que isso significa e como retornar a um sistema compat\u00edvel.",
"title": "Sistema n\u00e3o suportado - {reason}"
}
},
"system_health": {
"info": {
"agent_version": "Vers\u00e3o do Agent",
@@ -1,4 +1,14 @@
{
"issues": {
"unhealthy": {
"description": "\u0421\u0438\u0441\u0442\u0435\u043c\u0430 \u0432 \u043d\u0430\u0441\u0442\u043e\u044f\u0449\u0435\u0435 \u0432\u0440\u0435\u043c\u044f \u043d\u0435\u0440\u0430\u0431\u043e\u0442\u043e\u0441\u043f\u043e\u0441\u043e\u0431\u043d\u0430 \u043f\u043e \u043f\u0440\u0438\u0447\u0438\u043d\u0435 '{reason}'. \u041f\u0435\u0440\u0435\u0439\u0434\u0438\u0442\u0435 \u043f\u043e \u0441\u0441\u044b\u043b\u043a\u0435, \u0447\u0442\u043e\u0431\u044b \u0443\u0437\u043d\u0430\u0442\u044c \u0447\u0442\u043e \u043d\u0435 \u0442\u0430\u043a \u0438 \u043a\u0430\u043a \u044d\u0442\u043e \u0438\u0441\u043f\u0440\u0430\u0432\u0438\u0442\u044c.",
"title": "\u041d\u0435\u0440\u0430\u0431\u043e\u0442\u043e\u0441\u043f\u043e\u0441\u043e\u0431\u043d\u0430\u044f \u0441\u0438\u0441\u0442\u0435\u043c\u0430 - {reason}"
},
"unsupported": {
"description": "\u0421\u0438\u0441\u0442\u0435\u043c\u0430 \u043d\u0435 \u043f\u043e\u0434\u0434\u0435\u0440\u0436\u0438\u0432\u0430\u0435\u0442\u0441\u044f \u043f\u043e \u043f\u0440\u0438\u0447\u0438\u043d\u0435 '{reason}'. \u041f\u0435\u0440\u0435\u0439\u0434\u0438\u0442\u0435 \u043f\u043e \u0441\u0441\u044b\u043b\u043a\u0435, \u0447\u0442\u043e\u0431\u044b \u0443\u0437\u043d\u0430\u0442\u044c \u0447\u0442\u043e \u044d\u0442\u043e \u0437\u043d\u0430\u0447\u0438\u0442 \u0438 \u043a\u0430\u043a \u0432\u0435\u0440\u043d\u0443\u0442\u044c\u0441\u044f \u043a \u043f\u043e\u0434\u0434\u0435\u0440\u0436\u0438\u0432\u0430\u0435\u043c\u043e\u0439 \u0441\u0438\u0441\u0442\u0435\u043c\u0435.",
"title": "\u041d\u0435\u043f\u043e\u0434\u0434\u0435\u0440\u0436\u0438\u0432\u0430\u0435\u043c\u0430\u044f \u0441\u0438\u0441\u0442\u0435\u043c\u0430 - {reason}"
}
},
"system_health": {
"info": {
"agent_version": "\u0412\u0435\u0440\u0441\u0438\u044f \u0430\u0433\u0435\u043d\u0442\u0430",
@@ -10,8 +10,10 @@ import os
from typing import Any, cast
from aiohttp import web
from pyhap.characteristic import Characteristic
from pyhap.const import STANDALONE_AID
from pyhap.loader import get_loader
from pyhap.service import Service
import voluptuous as vol
from zeroconf.asyncio import AsyncZeroconf
@@ -74,13 +76,7 @@ from . import ( # noqa: F401
type_switches,
type_thermostats,
)
-from .accessories import (
-HomeAccessory,
-HomeBridge,
-HomeDriver,
-HomeIIDManager,
-get_accessory,
-)
+from .accessories import HomeAccessory, HomeBridge, HomeDriver, get_accessory
from .aidmanager import AccessoryAidStorage
from .const import (
ATTR_INTEGRATION,
@@ -139,7 +135,7 @@ STATUS_WAIT = 3
PORT_CLEANUP_CHECK_INTERVAL_SECS = 1
_HOMEKIT_CONFIG_UPDATE_TIME = (
-5 # number of seconds to wait for homekit to see the c# change
+10 # number of seconds to wait for homekit to see the c# change
)
@@ -529,6 +525,7 @@ class HomeKit:
self.status = STATUS_READY
self.driver: HomeDriver | None = None
self.bridge: HomeBridge | None = None
self._reset_lock = asyncio.Lock()
def setup(self, async_zeroconf_instance: AsyncZeroconf, uuid: str) -> None:
"""Set up bridge and accessory driver."""
@@ -548,7 +545,7 @@ class HomeKit:
async_zeroconf_instance=async_zeroconf_instance,
zeroconf_server=f"{uuid}-hap.local.",
loader=get_loader(),
-iid_manager=HomeIIDManager(self.iid_storage),
+iid_storage=self.iid_storage,
)
# If we do not load the mac address will be wrong
@@ -558,21 +555,24 @@ class HomeKit:
async def async_reset_accessories(self, entity_ids: Iterable[str]) -> None:
"""Reset the accessory to load the latest configuration."""
-if not self.bridge:
-await self.async_reset_accessories_in_accessory_mode(entity_ids)
-return
-await self.async_reset_accessories_in_bridge_mode(entity_ids)
+async with self._reset_lock:
+if not self.bridge:
+await self.async_reset_accessories_in_accessory_mode(entity_ids)
+return
+await self.async_reset_accessories_in_bridge_mode(entity_ids)
async def _async_shutdown_accessory(self, accessory: HomeAccessory) -> None:
"""Shutdown an accessory."""
assert self.driver is not None
await accessory.stop()
# Deallocate the IIDs for the accessory
-iid_manager = self.driver.iid_manager
-for service in accessory.services:
-iid_manager.remove_iid(iid_manager.remove_obj(service))
-for char in service.characteristics:
-iid_manager.remove_iid(iid_manager.remove_obj(char))
+iid_manager = accessory.iid_manager
+services: list[Service] = accessory.services
+for service in services:
+iid_manager.remove_obj(service)
+characteristics: list[Characteristic] = service.characteristics
+for char in characteristics:
+iid_manager.remove_obj(char)
async def async_reset_accessories_in_accessory_mode(
self, entity_ids: Iterable[str]
@@ -581,7 +581,6 @@ class HomeKit:
assert self.driver is not None
acc = cast(HomeAccessory, self.driver.accessory)
-await self._async_shutdown_accessory(acc)
if acc.entity_id not in entity_ids:
return
if not (state := self.hass.states.get(acc.entity_id)):
@@ -589,6 +588,7 @@ class HomeKit:
"The underlying entity %s disappeared during reset", acc.entity_id
)
return
+await self._async_shutdown_accessory(acc)
if new_acc := self._async_create_single_accessory([state]):
self.driver.accessory = new_acc
self.hass.async_add_job(new_acc.run)
@@ -270,7 +270,7 @@ class HomeAccessory(Accessory): # type: ignore[misc]
driver=driver,
display_name=cleanup_name_for_homekit(name),
aid=aid,
-iid_manager=driver.iid_manager,
+iid_manager=HomeIIDManager(driver.iid_storage),
*args,
**kwargs,
)
@@ -570,7 +570,7 @@ class HomeBridge(Bridge): # type: ignore[misc]
def __init__(self, hass: HomeAssistant, driver: HomeDriver, name: str) -> None:
"""Initialize a Bridge object."""
-super().__init__(driver, name, iid_manager=driver.iid_manager)
+super().__init__(driver, name, iid_manager=HomeIIDManager(driver.iid_storage))
self.set_info_service(
firmware_revision=format_version(__version__),
manufacturer=MANUFACTURER,
@@ -603,7 +603,7 @@ class HomeDriver(AccessoryDriver): # type: ignore[misc]
entry_id: str,
bridge_name: str,
entry_title: str,
-iid_manager: HomeIIDManager,
+iid_storage: AccessoryIIDStorage,
**kwargs: Any,
) -> None:
"""Initialize a AccessoryDriver object."""
@@ -612,7 +612,7 @@ class HomeDriver(AccessoryDriver): # type: ignore[misc]
self._entry_id = entry_id
self._bridge_name = bridge_name
self._entry_title = entry_title
-self.iid_manager = iid_manager
+self.iid_storage = iid_storage
@pyhap_callback # type: ignore[misc]
def pair(
@@ -31,6 +31,8 @@ async def async_get_config_entry_diagnostics(
"options": dict(entry.options),
},
}
if homekit.iid_storage:
data["iid_storage"] = homekit.iid_storage.allocations
if not homekit.driver: # not started yet or startup failed
return data
driver: AccessoryDriver = homekit.driver
@@ -17,7 +17,7 @@ from homeassistant.helpers.storage import Store
from .util import get_iid_storage_filename_for_entry_id
-IID_MANAGER_STORAGE_VERSION = 1
+IID_MANAGER_STORAGE_VERSION = 2
IID_MANAGER_SAVE_DELAY = 2
ALLOCATIONS_KEY = "allocations"
@@ -26,6 +26,40 @@ IID_MIN = 1
IID_MAX = 18446744073709551615
ACCESSORY_INFORMATION_SERVICE = "3E"
class IIDStorage(Store):
"""Storage class for IIDManager."""
async def _async_migrate_func(
self,
old_major_version: int,
old_minor_version: int,
old_data: dict,
):
"""Migrate to the new version."""
if old_major_version == 1:
# Convert v1 to v2 format which uses a unique iid set per accessory
# instead of per pairing since we need the ACCESSORY_INFORMATION_SERVICE
# to always have iid 1 for each bridged accessory as well as the bridge
old_allocations: dict[str, int] = old_data.pop(ALLOCATIONS_KEY, {})
new_allocation: dict[str, dict[str, int]] = {}
old_data[ALLOCATIONS_KEY] = new_allocation
for allocation_key, iid in old_allocations.items():
aid_str, new_allocation_key = allocation_key.split("_", 1)
service_type, _, char_type, *_ = new_allocation_key.split("_")
accessory_allocation = new_allocation.setdefault(aid_str, {})
if service_type == ACCESSORY_INFORMATION_SERVICE and not char_type:
accessory_allocation[new_allocation_key] = 1
elif iid != 1:
accessory_allocation[new_allocation_key] = iid
return old_data
raise NotImplementedError
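Run against sample data, the v1-to-v2 migration above regroups the flat `{aid}_{service}_{uid}_{char}_{uid}` keys per accessory and pins the accessory-information service (`3E`) to iid 1. The sketch below reproduces just that key-splitting logic on hypothetical v1 allocations:

```python
ACCESSORY_INFORMATION_SERVICE = "3E"

def migrate_allocations_v1_to_v2(
    old_allocations: dict[str, int]
) -> dict[str, dict[str, int]]:
    """Regroup flat '{aid}_{service}_{uid}_{char}_{uid}' keys into per-accessory dicts."""
    new_allocation: dict[str, dict[str, int]] = {}
    for allocation_key, iid in old_allocations.items():
        aid_str, new_key = allocation_key.split("_", 1)
        service_type, _, char_type, *_ = new_key.split("_")
        accessory = new_allocation.setdefault(aid_str, {})
        if service_type == ACCESSORY_INFORMATION_SERVICE and not char_type:
            accessory[new_key] = 1  # information service is always iid 1
        elif iid != 1:
            accessory[new_key] = iid
    return new_allocation

# Hypothetical v1 data: accessory 2 has an information service and one characteristic.
v2 = migrate_allocations_v1_to_v2({"2_3E___": 1, "2_43__25_abc": 5})
```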
class AccessoryIIDStorage:
"""
Provide stable allocation of IIDs for the lifetime of an accessory.
@@ -37,15 +71,15 @@ class AccessoryIIDStorage:
def __init__(self, hass: HomeAssistant, entry_id: str) -> None:
"""Create a new iid store."""
self.hass = hass
-self.allocations: dict[str, int] = {}
-self.allocated_iids: list[int] = []
+self.allocations: dict[str, dict[str, int]] = {}
+self.allocated_iids: dict[str, list[int]] = {}
self.entry_id = entry_id
-self.store: Store | None = None
+self.store: IIDStorage | None = None
async def async_initialize(self) -> None:
"""Load the latest IID data."""
iid_store = get_iid_storage_filename_for_entry_id(self.entry_id)
-self.store = Store(self.hass, IID_MANAGER_STORAGE_VERSION, iid_store)
+self.store = IIDStorage(self.hass, IID_MANAGER_STORAGE_VERSION, iid_store)
if not (raw_storage := await self.store.async_load()):
# There is no data about iid allocations yet
@@ -53,7 +87,8 @@ class AccessoryIIDStorage:
assert isinstance(raw_storage, dict)
self.allocations = raw_storage.get(ALLOCATIONS_KEY, {})
-self.allocated_iids = sorted(self.allocations.values())
+for aid_str, allocations in self.allocations.items():
+self.allocated_iids[aid_str] = sorted(allocations.values())
def get_or_allocate_iid(
self,
@@ -68,16 +103,25 @@ class AccessoryIIDStorage:
char_hap_type: str | None = uuid_to_hap_type(char_uuid) if char_uuid else None
# Allocation key must be a string since we are saving it to JSON
allocation_key = (
-f'{aid}_{service_hap_type}_{service_unique_id or ""}_'
+f'{service_hap_type}_{service_unique_id or ""}_'
f'{char_hap_type or ""}_{char_unique_id or ""}'
)
-if allocation_key in self.allocations:
-return self.allocations[allocation_key]
-next_iid = self.allocated_iids[-1] + 1 if self.allocated_iids else 1
-self.allocations[allocation_key] = next_iid
-self.allocated_iids.append(next_iid)
# AID must be a string since JSON keys cannot be int
aid_str = str(aid)
accessory_allocation = self.allocations.setdefault(aid_str, {})
accessory_allocated_iids = self.allocated_iids.setdefault(aid_str, [1])
if service_hap_type == ACCESSORY_INFORMATION_SERVICE and char_uuid is None:
return 1
if allocation_key in accessory_allocation:
return accessory_allocation[allocation_key]
if accessory_allocated_iids:
allocated_iid = accessory_allocated_iids[-1] + 1
else:
allocated_iid = 2
accessory_allocation[allocation_key] = allocated_iid
accessory_allocated_iids.append(allocated_iid)
self._async_schedule_save()
-return next_iid
+return allocated_iid
@callback
def _async_schedule_save(self) -> None:
@@ -91,6 +135,6 @@ class AccessoryIIDStorage:
return await self.store.async_save(self._data_to_save())
@callback
-def _data_to_save(self) -> dict[str, dict[str, int]]:
+def _data_to_save(self) -> dict[str, dict[str, dict[str, int]]]:
"""Return data of entity map to store in a file."""
return {ALLOCATIONS_KEY: self.allocations}
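Taken together, the storage migration and the per-accessory allocator above can be sketched in isolation. The helpers below are illustrative stand-ins for `AccessoryIIDStorage` (the simplified signatures and standalone names are assumptions, not the integration's API): iids are tracked per accessory, the accessory-information service (`"3E"`) is pinned to iid 1, and the first regular allocation gets iid 2, which is the off-by-one this release fixes.

```python
# Sketch of the v2 IID scheme, under the assumptions stated above.
ACCESSORY_INFORMATION_SERVICE = "3E"


def migrate_v1_allocations(old: dict[str, int]) -> dict[str, dict[str, int]]:
    """Regroup flat v1 keys ("<aid>_<svc>_<sid>_<char>_<cid>") by accessory."""
    new_allocation: dict[str, dict[str, int]] = {}
    for allocation_key, iid in old.items():
        aid_str, new_key = allocation_key.split("_", 1)
        service_type, _, char_type, *_ = new_key.split("_")
        accessory = new_allocation.setdefault(aid_str, {})
        if service_type == ACCESSORY_INFORMATION_SERVICE and not char_type:
            accessory[new_key] = 1  # information service is pinned to iid 1
        elif iid != 1:
            accessory[new_key] = iid
    return new_allocation


def allocate_iid(
    allocated_iids: dict[str, list[int]], aid: int, is_accessory_info: bool
) -> int:
    """Hand out the next free iid for one accessory."""
    # JSON object keys must be strings, so the aid is stringified.
    iids = allocated_iids.setdefault(str(aid), [1])  # iid 1 is reserved
    if is_accessory_info:
        return 1
    next_iid = iids[-1] + 1 if iids else 2
    iids.append(next_iid)
    return next_iid
```

With this shape, two different accessories can each start their regular iids at 2 without colliding, which the old per-pairing counter could not guarantee.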
@@ -3,7 +3,7 @@
"name": "HomeKit Controller",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/homekit_controller",
-"requirements": ["aiohomekit==2.2.8"],
+"requirements": ["aiohomekit==2.2.18"],
"zeroconf": ["_hap._tcp.local.", "_hap._udp.local."],
"bluetooth": [{ "manufacturer_id": 76, "manufacturer_data_start": [6] }],
"dependencies": ["bluetooth", "zeroconf"],
@@ -183,7 +183,7 @@ SIMPLE_SENSOR: dict[str, HomeKitSensorEntityDescription] = {
key=CharacteristicsTypes.VENDOR_EVE_ENERGY_KW_HOUR,
name="Energy kWh",
device_class=SensorDeviceClass.ENERGY,
-state_class=SensorStateClass.MEASUREMENT,
+state_class=SensorStateClass.TOTAL_INCREASING,
native_unit_of_measurement=ENERGY_KILO_WATT_HOUR,
),
CharacteristicsTypes.VENDOR_EVE_ENERGY_VOLTAGE: HomeKitSensorEntityDescription(
@@ -3,7 +3,7 @@
"name": "Huisbaasje",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/huisbaasje",
-"requirements": ["energyflip-client==0.2.1"],
+"requirements": ["energyflip-client==0.2.2"],
"codeowners": ["@dennisschroer"],
"iot_class": "cloud_polling",
"loggers": ["huisbaasje"]
@@ -4,7 +4,7 @@
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/iaqualink/",
"codeowners": ["@flz"],
-"requirements": ["iaqualink==0.5.0"],
+"requirements": ["iaqualink==0.5.0", "h2==4.1.0"],
"iot_class": "cloud_polling",
"loggers": ["iaqualink"]
}
@@ -3,7 +3,7 @@
"name": "Internet Printing Protocol (IPP)",
"documentation": "https://www.home-assistant.io/integrations/ipp",
"integration_type": "device",
-"requirements": ["pyipp==0.12.0"],
+"requirements": ["pyipp==0.12.1"],
"codeowners": ["@ctalkington"],
"config_flow": true,
"quality_scale": "platinum",
@@ -2,7 +2,7 @@
"domain": "lidarr",
"name": "Lidarr",
"documentation": "https://www.home-assistant.io/integrations/lidarr",
-"requirements": ["aiopyarr==22.10.0"],
+"requirements": ["aiopyarr==22.11.0"],
"codeowners": ["@tkdrob"],
"config_flow": true,
"iot_class": "local_polling",
@@ -14,8 +14,11 @@ from awesomeversion import AwesomeVersion
from homeassistant.components.light import (
ATTR_BRIGHTNESS,
ATTR_BRIGHTNESS_PCT,
ATTR_COLOR_NAME,
ATTR_COLOR_TEMP_KELVIN,
ATTR_HS_COLOR,
ATTR_KELVIN,
ATTR_RGB_COLOR,
ATTR_XY_COLOR,
)
@@ -24,7 +27,7 @@ from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import device_registry as dr
import homeassistant.util.color as color_util
-from .const import DOMAIN, INFRARED_BRIGHTNESS_VALUES_MAP, OVERALL_TIMEOUT
+from .const import _LOGGER, DOMAIN, INFRARED_BRIGHTNESS_VALUES_MAP, OVERALL_TIMEOUT
FIX_MAC_FW = AwesomeVersion("3.70")
@@ -80,6 +83,17 @@ def find_hsbk(hass: HomeAssistant, **kwargs: Any) -> list[float | int | None] |
"""
hue, saturation, brightness, kelvin = [None] * 4
if (color_name := kwargs.get(ATTR_COLOR_NAME)) is not None:
try:
hue, saturation = color_util.color_RGB_to_hs(
*color_util.color_name_to_rgb(color_name)
)
except ValueError:
_LOGGER.warning(
"Got unknown color %s, falling back to neutral white", color_name
)
hue, saturation = (0, 0)
if ATTR_HS_COLOR in kwargs:
hue, saturation = kwargs[ATTR_HS_COLOR]
elif ATTR_RGB_COLOR in kwargs:
@@ -93,6 +107,13 @@ def find_hsbk(hass: HomeAssistant, **kwargs: Any) -> list[float | int | None] |
saturation = int(saturation / 100 * 65535)
kelvin = 3500
if ATTR_KELVIN in kwargs:
_LOGGER.warning(
"The 'kelvin' parameter is deprecated. Please use 'color_temp_kelvin' for all service calls"
)
kelvin = kwargs.pop(ATTR_KELVIN)
saturation = 0
if ATTR_COLOR_TEMP_KELVIN in kwargs:
kelvin = kwargs.pop(ATTR_COLOR_TEMP_KELVIN)
saturation = 0
@@ -100,6 +121,9 @@ def find_hsbk(hass: HomeAssistant, **kwargs: Any) -> list[float | int | None] |
if ATTR_BRIGHTNESS in kwargs:
brightness = convert_8_to_16(kwargs[ATTR_BRIGHTNESS])
if ATTR_BRIGHTNESS_PCT in kwargs:
brightness = convert_8_to_16(round(255 * kwargs[ATTR_BRIGHTNESS_PCT] / 100))
hsbk = [hue, saturation, brightness, kelvin]
return None if hsbk == [None] * 4 else hsbk
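The color-name fallback added to `find_hsbk` can be illustrated standalone. The tiny name table below is a stand-in for `homeassistant.util.color.color_name_to_rgb` (an assumption for this sketch, not the real table): an unknown color name now logs a warning and falls back to neutral white instead of raising.

```python
import colorsys
import logging

_LOGGER = logging.getLogger(__name__)
# Stand-in for the CSS3 color-name table used by homeassistant.util.color.
_NAMED_COLORS = {"red": (255, 0, 0), "blue": (0, 0, 255)}


def hs_from_color_name(color_name: str) -> tuple[float, float]:
    """Return (hue in degrees, saturation in %), falling back to neutral white."""
    try:
        rgb = _NAMED_COLORS[color_name.lower()]
    except KeyError:
        # Mirrors the new behavior: warn and fall back instead of raising.
        _LOGGER.warning(
            "Got unknown color %s, falling back to neutral white", color_name
        )
        return (0.0, 0.0)
    hue, sat, _value = colorsys.rgb_to_hsv(*(c / 255 for c in rgb))
    return (hue * 360, sat * 100)
```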
@@ -3,7 +3,7 @@
"name": "Litter-Robot",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/litterrobot",
-"requirements": ["pylitterbot==2022.10.2"],
+"requirements": ["pylitterbot==2022.11.0"],
"codeowners": ["@natekspencer", "@tkdrob"],
"dhcp": [{ "hostname": "litter-robot4" }],
"iot_class": "cloud_push",
@@ -52,6 +52,7 @@ ABBREVIATIONS = {
"e": "encoding",
"en": "enabled_by_default",
"ent_cat": "entity_category",
"ent_pic": "entity_picture",
"err_t": "error_topic",
"err_tpl": "error_template",
"fanspd_t": "fan_speed_topic",
@@ -169,6 +170,8 @@ ABBREVIATIONS = {
"pr_mode_val_tpl": "preset_mode_value_template",
"pr_modes": "preset_modes",
"r_tpl": "red_template",
"rel_s": "release_summary",
"rel_u": "release_url",
"ret": "retain",
"rgb_cmd_tpl": "rgb_command_template",
"rgb_cmd_t": "rgb_command_topic",
@@ -242,6 +245,7 @@ ABBREVIATIONS = {
"tilt_opt": "tilt_optimistic",
"tilt_status_t": "tilt_status_topic",
"tilt_status_tpl": "tilt_status_template",
"tit": "title",
"t": "topic",
"uniq_id": "unique_id",
"unit_of_meas": "unit_of_measurement",
@@ -271,8 +271,8 @@ class MqttSensor(MqttEntity, RestoreSensor):
)
elif self.device_class == SensorDeviceClass.DATE:
payload = payload.date()
-if payload != "":
-self._state = payload
+self._state = payload
def _update_last_reset(msg):
payload = self._last_reset_template(msg.payload)
@@ -19,6 +19,7 @@ from homeassistant.const import CONF_DEVICE_CLASS, CONF_NAME, CONF_VALUE_TEMPLAT
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.json import JSON_DECODE_EXCEPTIONS, json_loads
from homeassistant.helpers.restore_state import RestoreEntity
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
@@ -30,6 +31,7 @@ from .const import (
CONF_QOS,
CONF_RETAIN,
CONF_STATE_TOPIC,
PAYLOAD_EMPTY_JSON,
)
from .debug_info import log_messages
from .mixins import MQTT_ENTITY_COMMON_SCHEMA, MqttEntity, async_setup_entry_helper
@@ -40,20 +42,28 @@ _LOGGER = logging.getLogger(__name__)
DEFAULT_NAME = "MQTT Update"
CONF_ENTITY_PICTURE = "entity_picture"
CONF_LATEST_VERSION_TEMPLATE = "latest_version_template"
CONF_LATEST_VERSION_TOPIC = "latest_version_topic"
CONF_PAYLOAD_INSTALL = "payload_install"
CONF_RELEASE_SUMMARY = "release_summary"
CONF_RELEASE_URL = "release_url"
CONF_TITLE = "title"
PLATFORM_SCHEMA_MODERN = MQTT_RO_SCHEMA.extend(
{
-vol.Optional(CONF_DEVICE_CLASS): DEVICE_CLASSES_SCHEMA,
+vol.Optional(CONF_COMMAND_TOPIC): valid_publish_topic,
+vol.Optional(CONF_DEVICE_CLASS): DEVICE_CLASSES_SCHEMA,
+vol.Optional(CONF_ENTITY_PICTURE): cv.string,
vol.Optional(CONF_LATEST_VERSION_TEMPLATE): cv.template,
-vol.Required(CONF_LATEST_VERSION_TOPIC): valid_subscribe_topic,
+vol.Optional(CONF_LATEST_VERSION_TOPIC): valid_subscribe_topic,
vol.Optional(CONF_NAME, default=DEFAULT_NAME): cv.string,
vol.Optional(CONF_PAYLOAD_INSTALL): cv.string,
vol.Optional(CONF_RELEASE_SUMMARY): cv.string,
vol.Optional(CONF_RELEASE_URL): cv.string,
vol.Optional(CONF_RETAIN, default=DEFAULT_RETAIN): cv.boolean,
vol.Optional(CONF_TITLE): cv.string,
},
).extend(MQTT_ENTITY_COMMON_SCHEMA.schema)
@@ -99,10 +109,22 @@ class MqttUpdate(MqttEntity, UpdateEntity, RestoreEntity):
"""Initialize the MQTT update."""
self._config = config
self._attr_device_class = self._config.get(CONF_DEVICE_CLASS)
self._attr_release_summary = self._config.get(CONF_RELEASE_SUMMARY)
self._attr_release_url = self._config.get(CONF_RELEASE_URL)
self._attr_title = self._config.get(CONF_TITLE)
self._entity_picture: str | None = self._config.get(CONF_ENTITY_PICTURE)
UpdateEntity.__init__(self)
MqttEntity.__init__(self, hass, config, config_entry, discovery_data)
@property
def entity_picture(self) -> str | None:
"""Return the entity picture to use in the frontend."""
if self._entity_picture is not None:
return self._entity_picture
return super().entity_picture
@staticmethod
def config_schema() -> vol.Schema:
"""Return the config schema."""
@@ -138,15 +160,59 @@ class MqttUpdate(MqttEntity, UpdateEntity, RestoreEntity):
@callback
@log_messages(self.hass, self.entity_id)
-def handle_installed_version_received(msg: ReceiveMessage) -> None:
-"""Handle receiving installed version via MQTT."""
-installed_version = self._templates[CONF_VALUE_TEMPLATE](msg.payload)
+def handle_state_message_received(msg: ReceiveMessage) -> None:
+"""Handle receiving state message via MQTT."""
+payload = self._templates[CONF_VALUE_TEMPLATE](msg.payload)
-if isinstance(installed_version, str) and installed_version != "":
-self._attr_installed_version = installed_version
if not payload or payload == PAYLOAD_EMPTY_JSON:
_LOGGER.debug(
"Ignoring empty payload '%s' after rendering for topic %s",
payload,
msg.topic,
)
return
json_payload = {}
try:
json_payload = json_loads(payload)
_LOGGER.debug(
"JSON payload detected after processing payload '%s' on topic %s",
json_payload,
msg.topic,
)
except JSON_DECODE_EXCEPTIONS:
_LOGGER.debug(
"No valid (JSON) payload detected after processing payload '%s' on topic %s",
payload,
msg.topic,
)
json_payload["installed_version"] = payload
if "installed_version" in json_payload:
self._attr_installed_version = json_payload["installed_version"]
get_mqtt_data(self.hass).state_write_requests.write_state_request(self)
-add_subscription(topics, CONF_STATE_TOPIC, handle_installed_version_received)
if "latest_version" in json_payload:
self._attr_latest_version = json_payload["latest_version"]
get_mqtt_data(self.hass).state_write_requests.write_state_request(self)
if CONF_TITLE in json_payload and not self._attr_title:
self._attr_title = json_payload[CONF_TITLE]
get_mqtt_data(self.hass).state_write_requests.write_state_request(self)
if CONF_RELEASE_SUMMARY in json_payload and not self._attr_release_summary:
self._attr_release_summary = json_payload[CONF_RELEASE_SUMMARY]
get_mqtt_data(self.hass).state_write_requests.write_state_request(self)
if CONF_RELEASE_URL in json_payload and not self._attr_release_url:
self._attr_release_url = json_payload[CONF_RELEASE_URL]
get_mqtt_data(self.hass).state_write_requests.write_state_request(self)
if CONF_ENTITY_PICTURE in json_payload and not self._entity_picture:
self._entity_picture = json_payload[CONF_ENTITY_PICTURE]
get_mqtt_data(self.hass).state_write_requests.write_state_request(self)
+add_subscription(topics, CONF_STATE_TOPIC, handle_state_message_received)
@callback
@log_messages(self.hass, self.entity_id)
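The reworked state handler accepts either a bare version string or a JSON document on the state topic. Its parsing step can be sketched as below, with the standard library's `json.loads` standing in for Home Assistant's `json_loads` helper (an assumption of this sketch):

```python
import json


def parse_update_payload(payload: str) -> dict:
    """Parse an MQTT update payload: a JSON object, or a bare installed version."""
    if not payload or payload == "{}":
        return {}  # empty payloads are ignored upstream
    try:
        data = json.loads(payload)
        if not isinstance(data, dict):
            # e.g. "2022" parses as an int; treat it as plain text instead.
            raise ValueError
    except ValueError:
        # Not a JSON object: the whole payload is the installed version.
        data = {"installed_version": payload}
    return data
```

In the integration, keys such as `latest_version`, `title`, `release_summary`, `release_url`, and `entity_picture` from the parsed object then update the corresponding entity attributes.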
@@ -56,6 +56,8 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
try:
await nam.async_check_credentials()
except ApiError as err:
raise ConfigEntryNotReady from err
except AuthFailed as err:
raise ConfigEntryAuthFailed from err
@@ -271,7 +271,10 @@ async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
if CONF_WEBHOOK_ID in entry.data:
webhook_unregister(hass, entry.data[CONF_WEBHOOK_ID])
await data[entry.entry_id][AUTH].async_dropwebhook()
try:
await data[entry.entry_id][AUTH].async_dropwebhook()
except pyatmo.ApiError:
_LOGGER.debug("No webhook to be dropped")
_LOGGER.info("Unregister Netatmo webhook")
unload_ok = await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
@@ -193,17 +193,20 @@ class NetatmoLight(NetatmoBase, LightEntity):
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn light on."""
_LOGGER.debug("Turn light '%s' on", self.name)
if ATTR_BRIGHTNESS in kwargs:
await self._dimmer.async_set_brightness(kwargs[ATTR_BRIGHTNESS])
else:
await self._dimmer.async_on()
self._attr_is_on = True
self.async_write_ha_state()
async def async_turn_off(self, **kwargs: Any) -> None:
"""Turn light off."""
_LOGGER.debug("Turn light '%s' off", self.name)
await self._dimmer.async_off()
self._attr_is_on = False
self.async_write_ha_state()
@callback
def async_update_callback(self) -> None:
@@ -2,7 +2,7 @@
"domain": "netatmo",
"name": "Netatmo",
"documentation": "https://www.home-assistant.io/integrations/netatmo",
-"requirements": ["pyatmo==7.2.0"],
+"requirements": ["pyatmo==7.4.0"],
"after_dependencies": ["cloud", "media_source"],
"dependencies": ["application_credentials", "webhook"],
"codeowners": ["@cgtobi"],
@@ -77,7 +77,11 @@ class NetatmoSwitch(NetatmoBase, SwitchEntity):
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn the zone on."""
await self._switch.async_on()
self._attr_is_on = True
self.async_write_ha_state()
async def async_turn_off(self, **kwargs: Any) -> None:
"""Turn the zone off."""
await self._switch.async_off()
self._attr_is_on = False
self.async_write_ha_state()
@@ -80,6 +80,11 @@ class NexiaThermostatEntity(NexiaEntity):
self.hass, f"{SIGNAL_THERMOSTAT_UPDATE}-{self._thermostat.thermostat_id}"
)
@property
def available(self) -> bool:
"""Return True if thermostat is available and data is available."""
return super().available and self._thermostat.is_online
class NexiaThermostatZoneEntity(NexiaThermostatEntity):
"""Base class for nexia devices attached to a thermostat."""
@@ -1,7 +1,7 @@
{
"domain": "nexia",
"name": "Nexia/American Standard/Trane",
-"requirements": ["nexia==2.0.5"],
+"requirements": ["nexia==2.0.6"],
"codeowners": ["@bdraco"],
"documentation": "https://www.home-assistant.io/integrations/nexia",
"config_flow": true,
@@ -8,7 +8,7 @@
"manufacturer_id": 220
}
],
-"requirements": ["oralb-ble==0.9.0"],
+"requirements": ["oralb-ble==0.13.0"],
"dependencies": ["bluetooth"],
"codeowners": ["@bdraco"],
"iot_class": "local_push"
@@ -5,6 +5,7 @@ from typing import TypedDict
from p1monitor import (
P1Monitor,
P1MonitorConnectionError,
P1MonitorNoDataError,
Phases,
Settings,
@@ -101,8 +102,8 @@ class P1MonitorDataUpdateCoordinator(DataUpdateCoordinator[P1MonitorData]):
try:
data[SERVICE_WATERMETER] = await self.p1monitor.watermeter()
self.has_water_meter = True
-except P1MonitorNoDataError:
-LOGGER.debug("No watermeter data received from P1 Monitor")
+except (P1MonitorNoDataError, P1MonitorConnectionError):
+LOGGER.debug("No water meter data received from P1 Monitor")
if self.has_water_meter is None:
self.has_water_meter = False
@@ -3,7 +3,7 @@
"name": "P1 Monitor",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/p1_monitor",
-"requirements": ["p1monitor==2.1.0"],
+"requirements": ["p1monitor==2.1.1"],
"codeowners": ["@klaasnicolaas"],
"quality_scale": "platinum",
"iot_class": "local_polling",
@@ -2,7 +2,7 @@
"domain": "plugwise",
"name": "Plugwise",
"documentation": "https://www.home-assistant.io/integrations/plugwise",
-"requirements": ["plugwise==0.25.3"],
+"requirements": ["plugwise==0.25.7"],
"codeowners": ["@CoMPaTech", "@bouwew", "@brefra", "@frenck"],
"zeroconf": ["_plugwise._tcp.local."],
"config_flow": true,
@@ -2,7 +2,7 @@
"domain": "radarr",
"name": "Radarr",
"documentation": "https://www.home-assistant.io/integrations/radarr",
-"requirements": ["aiopyarr==22.10.0"],
+"requirements": ["aiopyarr==22.11.0"],
"codeowners": ["@tkdrob"],
"config_flow": true,
"iot_class": "local_polling",
@@ -237,6 +237,7 @@ async def async_setup_entry(
# Add switches to control restrictions:
for description in RESTRICTIONS_SWITCH_DESCRIPTIONS:
coordinator = data.coordinators[description.api_category]
if not key_exists(coordinator.data, description.data_key):
continue
entities.append(RainMachineRestrictionSwitch(entry, data, description))
@@ -89,6 +89,13 @@ COMBINED_SCHEMA = vol.Schema(
)
CONFIG_SCHEMA = vol.Schema(
{DOMAIN: vol.All(cv.ensure_list, [COMBINED_SCHEMA])},
{
DOMAIN: vol.All(
# convert empty dict to empty list
lambda x: [] if x == {} else x,
cv.ensure_list,
[COMBINED_SCHEMA],
)
},
extra=vol.ALLOW_EXTRA,
)
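The schema change above coerces an empty mapping (a bare `rest:` key with no entries in YAML) into an empty list before `cv.ensure_list` runs, so validation no longer fails on that input. A plain-Python equivalent of the `vol.All` chain (the helper name is ours, for illustration only):

```python
def normalize_rest_domain(value):
    """Mimic the vol.All chain: {} -> [], non-list -> [value], list kept as-is."""
    if value == {}:
        return []  # empty YAML mapping: no rest entries configured
    if isinstance(value, list):
        return value
    return [value]  # ensure_list wraps a single config block
```

In the real schema each resulting list element is then validated against `COMBINED_SCHEMA`.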
@@ -7,7 +7,7 @@
"samsungctl[websocket]==0.7.1",
"samsungtvws[async,encrypted]==2.5.0",
"wakeonlan==2.1.0",
-"async-upnp-client==0.32.1"
+"async-upnp-client==0.32.2"
],
"ssdp": [
{
@@ -23,6 +23,7 @@ from homeassistant.const import (
CONF_NAME,
CONF_PASSWORD,
CONF_RESOURCE,
CONF_SCAN_INTERVAL,
CONF_UNIQUE_ID,
CONF_UNIT_OF_MEASUREMENT,
CONF_USERNAME,
@@ -43,7 +44,7 @@ from .coordinator import ScrapeCoordinator
_LOGGER = logging.getLogger(__name__)
-SCAN_INTERVAL = timedelta(minutes=10)
+DEFAULT_SCAN_INTERVAL = timedelta(minutes=10)
CONF_ATTR = "attribute"
CONF_SELECT = "select"
@@ -111,7 +112,8 @@ async def async_setup_platform(
rest = RestData(hass, method, resource, auth, headers, None, payload, verify_ssl)
-coordinator = ScrapeCoordinator(hass, rest, SCAN_INTERVAL)
+scan_interval: timedelta = config.get(CONF_SCAN_INTERVAL, DEFAULT_SCAN_INTERVAL)
+coordinator = ScrapeCoordinator(hass, rest, scan_interval)
await coordinator.async_refresh()
if coordinator.data is None:
raise PlatformNotReady
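With this change the scrape platform honors a user-supplied `scan_interval`, falling back to the previously hard-coded 10 minutes. A minimal sketch, with a plain dict standing in for the validated platform config:

```python
from datetime import timedelta

CONF_SCAN_INTERVAL = "scan_interval"
DEFAULT_SCAN_INTERVAL = timedelta(minutes=10)


def resolve_scan_interval(config: dict) -> timedelta:
    """Return the configured polling interval, or the 10-minute default."""
    return config.get(CONF_SCAN_INTERVAL, DEFAULT_SCAN_INTERVAL)
```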
@@ -16,7 +16,6 @@ from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers import aiohttp_client, device_registry
import homeassistant.helpers.config_validation as cv
from homeassistant.helpers.typing import ConfigType
-from homeassistant.util.unit_system import METRIC_SYSTEM
from .const import (
CONF_COAP_PORT,
@@ -113,13 +112,10 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def _async_setup_block_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Set up Shelly block based device from a config entry."""
-temperature_unit = "C" if hass.config.units is METRIC_SYSTEM else "F"
options = aioshelly.common.ConnectionOptions(
entry.data[CONF_HOST],
entry.data.get(CONF_USERNAME),
entry.data.get(CONF_PASSWORD),
-temperature_unit,
)
coap_context = await get_coap_context(hass)
@@ -9,6 +9,7 @@ from aioshelly.block_device import Block
from aioshelly.exceptions import DeviceConnectionError, InvalidAuthError, RpcCallError
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import ATTR_UNIT_OF_MEASUREMENT
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import device_registry, entity, entity_registry
@@ -615,6 +616,7 @@ class ShellySleepingBlockAttributeEntity(ShellyBlockAttributeEntity, RestoreEnti
"""Initialize the sleeping sensor."""
self.sensors = sensors
self.last_state: StateType = None
self.last_unit: str | None = None
self.coordinator = coordinator
self.attribute = attribute
self.block: Block | None = block # type: ignore[assignment]
@@ -644,6 +646,7 @@ class ShellySleepingBlockAttributeEntity(ShellyBlockAttributeEntity, RestoreEnti
if last_state is not None:
self.last_state = last_state.state
self.last_unit = last_state.attributes.get(ATTR_UNIT_OF_MEASUREMENT)
@callback
def _update_callback(self) -> None:
@@ -696,6 +699,7 @@ class ShellySleepingRpcAttributeEntity(ShellyRpcAttributeEntity, RestoreEntity):
) -> None:
"""Initialize the sleeping sensor."""
self.last_state: StateType = None
self.last_unit: str | None = None
self.coordinator = coordinator
self.key = key
self.attribute = attribute
@@ -725,3 +729,4 @@ class ShellySleepingRpcAttributeEntity(ShellyRpcAttributeEntity, RestoreEntity):
if last_state is not None:
self.last_state = last_state.state
self.last_unit = last_state.attributes.get(ATTR_UNIT_OF_MEASUREMENT)
@@ -3,7 +3,7 @@
"name": "Shelly",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/shelly",
-"requirements": ["aioshelly==4.1.1"],
+"requirements": ["aioshelly==4.1.2"],
"dependencies": ["http"],
"zeroconf": [
{
@@ -47,12 +47,7 @@ from .entity import (
async_setup_entry_rest,
async_setup_entry_rpc,
)
-from .utils import (
-get_device_entry_gen,
-get_device_uptime,
-is_rpc_device_externally_powered,
-temperature_unit,
-)
+from .utils import get_device_entry_gen, get_device_uptime
@dataclass
@@ -84,7 +79,7 @@ SENSORS: Final = {
("device", "deviceTemp"): BlockSensorDescription(
key="device|deviceTemp",
name="Device Temperature",
-unit_fn=temperature_unit,
+native_unit_of_measurement=TEMP_CELSIUS,
value=lambda value: round(value, 1),
device_class=SensorDeviceClass.TEMPERATURE,
state_class=SensorStateClass.MEASUREMENT,
@@ -145,7 +140,7 @@ SENSORS: Final = {
key="emeter|powerFactor",
name="Power Factor",
native_unit_of_measurement=PERCENTAGE,
-value=lambda value: abs(round(value * 100, 1)),
+value=lambda value: round(value * 100, 1),
device_class=SensorDeviceClass.POWER_FACTOR,
state_class=SensorStateClass.MEASUREMENT,
),
@@ -226,7 +221,7 @@ SENSORS: Final = {
("sensor", "temp"): BlockSensorDescription(
key="sensor|temp",
name="Temperature",
-unit_fn=temperature_unit,
+native_unit_of_measurement=TEMP_CELSIUS,
value=lambda value: round(value, 1),
device_class=SensorDeviceClass.TEMPERATURE,
state_class=SensorStateClass.MEASUREMENT,
@@ -235,7 +230,7 @@ SENSORS: Final = {
("sensor", "extTemp"): BlockSensorDescription(
key="sensor|extTemp",
name="Temperature",
-unit_fn=temperature_unit,
+native_unit_of_measurement=TEMP_CELSIUS,
value=lambda value: round(value, 1),
device_class=SensorDeviceClass.TEMPERATURE,
state_class=SensorStateClass.MEASUREMENT,
@@ -407,7 +402,6 @@ RPC_SENSORS: Final = {
value=lambda status, _: status["percent"],
device_class=SensorDeviceClass.BATTERY,
state_class=SensorStateClass.MEASUREMENT,
-removal_condition=is_rpc_device_externally_powered,
entity_registry_enabled_default=True,
entity_category=EntityCategory.DIAGNOSTIC,
),
@@ -505,8 +499,6 @@ class BlockSensor(ShellyBlockAttributeEntity, SensorEntity):
super().__init__(coordinator, block, attribute, description)
self._attr_native_unit_of_measurement = description.native_unit_of_measurement
-if unit_fn := description.unit_fn:
-self._attr_native_unit_of_measurement = unit_fn(block.info(attribute))
@property
def native_value(self) -> StateType:
@@ -553,10 +545,6 @@ class BlockSleepingSensor(ShellySleepingBlockAttributeEntity, SensorEntity):
"""Initialize the sleeping sensor."""
super().__init__(coordinator, block, attribute, description, entry, sensors)
self._attr_native_unit_of_measurement = description.native_unit_of_measurement
-if block and (unit_fn := description.unit_fn):
-self._attr_native_unit_of_measurement = unit_fn(block.info(attribute))
@property
def native_value(self) -> StateType:
"""Return value of sensor."""
@@ -565,6 +553,14 @@ class BlockSleepingSensor(ShellySleepingBlockAttributeEntity, SensorEntity):
return self.last_state
@property
def native_unit_of_measurement(self) -> str | None:
"""Return the unit of measurement of the sensor, if any."""
if self.block is not None:
return self.entity_description.native_unit_of_measurement
return self.last_unit
class RpcSleepingSensor(ShellySleepingRpcAttributeEntity, SensorEntity):
"""Represent a RPC sleeping sensor."""
@@ -578,3 +574,11 @@ class RpcSleepingSensor(ShellySleepingRpcAttributeEntity, SensorEntity):
return self.attribute_value
return self.last_state
@property
def native_unit_of_measurement(self) -> str | None:
"""Return the unit of measurement of the sensor, if any."""
if self.coordinator.device.initialized:
return self.entity_description.native_unit_of_measurement
return self.last_unit
@@ -5,13 +5,13 @@ from datetime import datetime, timedelta
from typing import Any, cast
from aiohttp.web import Request, WebSocketResponse
-from aioshelly.block_device import BLOCK_VALUE_UNIT, COAP, Block, BlockDevice
+from aioshelly.block_device import COAP, Block, BlockDevice
from aioshelly.const import MODEL_NAMES
from aioshelly.rpc_device import RpcDevice, WsServer
from homeassistant.components.http import HomeAssistantView
from homeassistant.config_entries import ConfigEntry
-from homeassistant.const import EVENT_HOMEASSISTANT_STOP, TEMP_CELSIUS, TEMP_FAHRENHEIT
+from homeassistant.const import EVENT_HOMEASSISTANT_STOP
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import device_registry, entity_registry, singleton
from homeassistant.helpers.typing import EventType
@@ -43,13 +43,6 @@ def async_remove_shelly_entity(
entity_reg.async_remove(entity_id)
-def temperature_unit(block_info: dict[str, Any]) -> str:
-"""Detect temperature unit."""
-if block_info[BLOCK_VALUE_UNIT] == "F":
-return TEMP_FAHRENHEIT
-return TEMP_CELSIUS
def get_block_device_name(device: BlockDevice) -> str:
"""Naming for device."""
return cast(str, device.settings["name"] or device.settings["device"]["hostname"])
@@ -364,13 +357,6 @@ def is_rpc_channel_type_light(config: dict[str, Any], channel: int) -> bool:
return con_types is not None and con_types[channel].lower().startswith("light")
-def is_rpc_device_externally_powered(
-config: dict[str, Any], status: dict[str, Any], key: str
-) -> bool:
-"""Return true if device has external power instead of battery."""
-return cast(bool, status[key]["external"]["present"])
def get_rpc_input_triggers(device: RpcDevice) -> list[tuple[str, str]]:
"""Return list of input triggers for RPC device."""
triggers = []
@@ -82,7 +82,7 @@ class SnoozConfigFlow(ConfigFlow, domain=DOMAIN):
if user_input is not None:
name = user_input[CONF_NAME]
-discovered = self._discovered_devices.get(name)
+discovered = self._discovered_devices[name]
assert discovered is not None
@@ -3,7 +3,7 @@
"name": "Snooz",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/snooz",
-"requirements": ["pysnooz==0.8.2"],
+"requirements": ["pysnooz==0.8.3"],
"dependencies": ["bluetooth"],
"codeowners": ["@AustinBrunkhorst"],
"bluetooth": [
@@ -3,7 +3,7 @@
"name": "Sonarr",
"documentation": "https://www.home-assistant.io/integrations/sonarr",
"codeowners": ["@ctalkington"],
-"requirements": ["aiopyarr==22.10.0"],
+"requirements": ["aiopyarr==22.11.0"],
"config_flow": true,
"quality_scale": "silver",
"iot_class": "local_polling",
@@ -156,7 +156,6 @@ async def library_payload(hass, player):
media_content_type=item,
can_play=True,
can_expand=True,
-thumbnail="https://brands.home-assistant.io/_/squeezebox/logo.png",
)
)
@@ -3,7 +3,7 @@
"name": "Squeezebox (Logitech Media Server)",
"documentation": "https://www.home-assistant.io/integrations/squeezebox",
"codeowners": ["@rajlaud"],
-"requirements": ["pysqueezebox==0.6.0"],
+"requirements": ["pysqueezebox==0.6.1"],
"config_flow": true,
"dhcp": [
{
@@ -52,7 +52,7 @@ from homeassistant.helpers import discovery_flow
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.event import async_track_time_interval
from homeassistant.helpers.instance_id import async_get as async_get_instance_id
-from homeassistant.helpers.network import get_url
+from homeassistant.helpers.network import NoURLAvailableError, get_url
from homeassistant.helpers.system_info import async_get_system_info
from homeassistant.helpers.typing import ConfigType
from homeassistant.loader import async_get_ssdp, bind_hass
@@ -697,7 +697,16 @@ class Server:
udn = await self._async_get_instance_udn()
system_info = await async_get_system_info(self.hass)
model_name = system_info["installation_type"]
-presentation_url = get_url(self.hass, allow_ip=True, prefer_external=False)
+try:
+presentation_url = get_url(self.hass, allow_ip=True, prefer_external=False)
except NoURLAvailableError:
_LOGGER.warning(
"Could not set up UPnP/SSDP server, as a presentation URL could"
" not be determined; Please configure your internal URL"
" in the Home Assistant general configuration"
)
return
serial_number = await async_get_instance_id(self.hass)
HassUpnpServiceDevice.DEVICE_DEFINITION = (
HassUpnpServiceDevice.DEVICE_DEFINITION._replace(
@@ -2,7 +2,7 @@
"domain": "ssdp",
"name": "Simple Service Discovery Protocol (SSDP)",
"documentation": "https://www.home-assistant.io/integrations/ssdp",
-"requirements": ["async-upnp-client==0.32.1"],
+"requirements": ["async-upnp-client==0.32.2"],
"dependencies": ["network"],
"after_dependencies": ["zeroconf"],
"codeowners": [],
@@ -53,6 +53,9 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
try:
await tibber_connection.update_info()
if not tibber_connection.name:
raise ConfigEntryNotReady("Could not fetch Tibber data.")
except asyncio.TimeoutError as err:
raise ConfigEntryNotReady from err
except aiohttp.ClientError as err:
@@ -20,6 +20,7 @@ from .const import (
CONFIG_ENTRY_ST,
CONFIG_ENTRY_UDN,
DOMAIN,
DOMAIN_DISCOVERIES,
LOGGER,
ST_IGD_V1,
ST_IGD_V2,
@@ -47,7 +48,7 @@ def _is_complete_discovery(discovery_info: ssdp.SsdpServiceInfo) -> bool:
)
-async def _async_discover_igd_devices(
+async def _async_discovered_igd_devices(
hass: HomeAssistant,
) -> list[ssdp.SsdpServiceInfo]:
"""Discovery IGD devices."""
@@ -79,9 +80,19 @@ class UpnpFlowHandler(config_entries.ConfigFlow, domain=DOMAIN):
# - ssdp(discovery_info) --> ssdp_confirm(None) --> ssdp_confirm({}) --> create_entry()
# - user(None): scan --> user({...}) --> create_entry()
-def __init__(self) -> None:
-"""Initialize the UPnP/IGD config flow."""
-self._discoveries: list[SsdpServiceInfo] | None = None
@property
def _discoveries(self) -> dict[str, SsdpServiceInfo]:
"""Get current discoveries."""
domain_data: dict = self.hass.data.setdefault(DOMAIN, {})
return domain_data.setdefault(DOMAIN_DISCOVERIES, {})
def _add_discovery(self, discovery: SsdpServiceInfo) -> None:
"""Add a discovery."""
self._discoveries[discovery.ssdp_usn] = discovery
def _remove_discovery(self, usn: str) -> SsdpServiceInfo:
"""Remove a discovery by its USN/unique_id."""
return self._discoveries.pop(usn)
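The flow now keeps discoveries in a dict shared through `hass.data`, keyed by USN, instead of a per-flow list. A standalone sketch of that bookkeeping, with a module-level dict standing in for `hass.data` (an assumption for illustration):

```python
# Plain dict standing in for hass.data in this sketch.
hass_data: dict = {}
DOMAIN = "upnp"
DOMAIN_DISCOVERIES = "discoveries"


def discoveries() -> dict:
    """Return the shared USN -> discovery mapping, creating it on first use."""
    domain_data = hass_data.setdefault(DOMAIN, {})
    return domain_data.setdefault(DOMAIN_DISCOVERIES, {})


def add_discovery(usn: str, info: object) -> None:
    """Store or refresh a discovery by its USN/unique_id."""
    discoveries()[usn] = info


def remove_discovery(usn: str) -> object:
    """Remove and return a discovery by its USN/unique_id."""
    return discoveries().pop(usn)
```

Keying by USN means a re-discovered device overwrites its earlier entry rather than being listed twice.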
async def async_step_user(
self, user_input: Mapping[str, Any] | None = None
@@ -95,7 +106,7 @@ class UpnpFlowHandler(config_entries.ConfigFlow, domain=DOMAIN):
discovery = next(
iter(
discovery
-for discovery in self._discoveries
+for discovery in self._discoveries.values()
if discovery.ssdp_usn == user_input["unique_id"]
)
)
@@ -103,21 +114,19 @@ class UpnpFlowHandler(config_entries.ConfigFlow, domain=DOMAIN):
return await self._async_create_entry_from_discovery(discovery)
# Discover devices.
-discoveries = await _async_discover_igd_devices(self.hass)
+discoveries = await _async_discovered_igd_devices(self.hass)
# Store discoveries which have not been configured.
current_unique_ids = {
entry.unique_id for entry in self._async_current_entries()
}
-self._discoveries = [
-    discovery
-    for discovery in discoveries
+for discovery in discoveries:
    if (
        _is_complete_discovery(discovery)
        and _is_igd_device(discovery)
        and discovery.ssdp_usn not in current_unique_ids
-    )
-]
+    ):
+        self._add_discovery(discovery)
# Ensure anything to add.
if not self._discoveries:
@@ -128,7 +137,7 @@ class UpnpFlowHandler(config_entries.ConfigFlow, domain=DOMAIN):
vol.Required("unique_id"): vol.In(
{
discovery.ssdp_usn: _friendly_name_from_discovery(discovery)
-for discovery in self._discoveries
+for discovery in self._discoveries.values()
}
),
}
@@ -163,12 +172,13 @@ class UpnpFlowHandler(config_entries.ConfigFlow, domain=DOMAIN):
mac_address = await _async_mac_address_from_discovery(self.hass, discovery_info)
host = discovery_info.ssdp_headers["_host"]
self._abort_if_unique_id_configured(
-# Store mac address for older entries.
+# Store mac address and other data for older entries.
+# The location is stored in the config entry such that when the location changes, the entry is reloaded.
updates={
CONFIG_ENTRY_MAC_ADDRESS: mac_address,
CONFIG_ENTRY_LOCATION: discovery_info.ssdp_location,
CONFIG_ENTRY_HOST: host,
CONFIG_ENTRY_ST: discovery_info.ssdp_st,
},
)
@@ -204,7 +214,7 @@ class UpnpFlowHandler(config_entries.ConfigFlow, domain=DOMAIN):
return self.async_abort(reason="config_entry_updated")
# Store discovery.
-self._discoveries = [discovery_info]
+self._add_discovery(discovery_info)
# Ensure user recognizable.
self.context["title_placeholders"] = {
@@ -221,10 +231,27 @@ class UpnpFlowHandler(config_entries.ConfigFlow, domain=DOMAIN):
if user_input is None:
return self.async_show_form(step_id="ssdp_confirm")
-assert self._discoveries
-discovery = self._discoveries[0]
+assert self.unique_id
+discovery = self._remove_discovery(self.unique_id)
return await self._async_create_entry_from_discovery(discovery)
+async def async_step_ignore(self, user_input: dict[str, Any]) -> FlowResult:
+    """Ignore this config flow."""
+    usn = user_input["unique_id"]
+    discovery = self._remove_discovery(usn)
+    mac_address = await _async_mac_address_from_discovery(self.hass, discovery)
+    data = {
+        CONFIG_ENTRY_UDN: discovery.upnp[ssdp.ATTR_UPNP_UDN],
+        CONFIG_ENTRY_ST: discovery.ssdp_st,
+        CONFIG_ENTRY_ORIGINAL_UDN: discovery.upnp[ssdp.ATTR_UPNP_UDN],
+        CONFIG_ENTRY_MAC_ADDRESS: mac_address,
+        CONFIG_ENTRY_HOST: discovery.ssdp_headers["_host"],
+        CONFIG_ENTRY_LOCATION: discovery.ssdp_location,
+    }
+    await self.async_set_unique_id(user_input["unique_id"], raise_on_progress=False)
+    return self.async_create_entry(title=user_input["title"], data=data)
async def _async_create_entry_from_discovery(
self,
discovery: SsdpServiceInfo,
@@ -243,5 +270,6 @@ class UpnpFlowHandler(config_entries.ConfigFlow, domain=DOMAIN):
CONFIG_ENTRY_ORIGINAL_UDN: discovery.upnp[ssdp.ATTR_UPNP_UDN],
CONFIG_ENTRY_LOCATION: discovery.ssdp_location,
CONFIG_ENTRY_MAC_ADDRESS: mac_address,
+CONFIG_ENTRY_HOST: discovery.ssdp_headers["_host"],
}
return self.async_create_entry(title=title, data=data)
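The config-flow hunks above replace a per-flow `self._discoveries` list with a dict keyed by USN that lives in shared domain data, so every flow handler instance (user, ssdp, ignore) reads and removes from the same store. A minimal sketch of that pattern with plain dicts (`DiscoveryStore` and its method names are illustrative stand-ins, not the integration's API):

```python
class DiscoveryStore:
    """Sketch of the pattern above: discoveries keyed by USN in shared domain data."""

    def __init__(self, hass_data: dict) -> None:
        # hass_data stands in for hass.data; every flow handler gets the same dict.
        self._hass_data = hass_data

    @property
    def _discoveries(self) -> dict:
        # Lazily create the per-domain bucket, mirroring the setdefault() chain above.
        domain_data = self._hass_data.setdefault("upnp", {})
        return domain_data.setdefault("discoveries", {})

    def add_discovery(self, usn: str, discovery: dict) -> None:
        self._discoveries[usn] = discovery

    def remove_discovery(self, usn: str) -> dict:
        # Pop by USN/unique_id, as _remove_discovery does in the diff.
        return self._discoveries.pop(usn)


hass_data: dict = {}
flow_a = DiscoveryStore(hass_data)
flow_b = DiscoveryStore(hass_data)
flow_a.add_discovery("uuid:1::igd", {"host": "192.0.2.1"})
# A different flow instance sees the same discovery because storage is shared.
print(flow_b.remove_discovery("uuid:1::igd"))
```

Popping on consume (rather than indexing `[0]`) is what lets `async_step_ssdp_confirm` and `async_step_ignore` each claim exactly one discovery by its unique ID.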
@@ -7,6 +7,7 @@ from homeassistant.const import TIME_SECONDS
LOGGER = logging.getLogger(__package__)
DOMAIN = "upnp"
+DOMAIN_DISCOVERIES = "discoveries"
BYTES_RECEIVED = "bytes_received"
BYTES_SENT = "bytes_sent"
PACKETS_RECEIVED = "packets_received"
@@ -3,7 +3,7 @@
"name": "UPnP/IGD",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/upnp",
-"requirements": ["async-upnp-client==0.32.1", "getmac==0.8.2"],
+"requirements": ["async-upnp-client==0.32.2", "getmac==0.8.2"],
"dependencies": ["network", "ssdp"],
"codeowners": ["@StevenLooman"],
"ssdp": [
@@ -3,7 +3,7 @@
"name": "Venstar",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/venstar",
-"requirements": ["venstarcolortouch==0.18"],
+"requirements": ["venstarcolortouch==0.19"],
"codeowners": ["@garbled1"],
"iot_class": "local_polling",
"loggers": ["venstarcolortouch"]
@@ -9,6 +9,7 @@ from typing import TYPE_CHECKING, Any
import voluptuous as vol
from homeassistant.auth.models import RefreshToken, User
+from homeassistant.components.http import current_request
from homeassistant.core import Context, HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError, Unauthorized
@@ -137,6 +138,13 @@ class ActiveConnection:
err_message = "Unknown error"
log_handler = self.logger.exception
-log_handler("Error handling message: %s (%s)", err_message, code)
self.send_message(messages.error_message(msg["id"], code, err_message))
+if code:
+    err_message += f" ({code})"
+if request := current_request.get():
+    err_message += f" from {request.remote}"
+    if user_agent := request.headers.get("user-agent"):
+        err_message += f" ({user_agent})"
+log_handler("Error handling message: %s", err_message)
@@ -3,7 +3,7 @@
"name": "Xiaomi Gateway (Aqara)",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/xiaomi_aqara",
-"requirements": ["PyXiaomiGateway==0.14.1"],
+"requirements": ["PyXiaomiGateway==0.14.3"],
"after_dependencies": ["discovery"],
"codeowners": ["@danielhiversen", "@syssi"],
"zeroconf": ["_miio._udp.local."],
@@ -94,6 +94,19 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
entry.title, push_lock
)
+@callback
+def _async_device_unavailable(
+    _service_info: bluetooth.BluetoothServiceInfoBleak,
+) -> None:
+    """Handle device no longer being seen by the bluetooth stack."""
+    push_lock.reset_advertisement_state()
+entry.async_on_unload(
+    bluetooth.async_track_unavailable(
+        hass, _async_device_unavailable, push_lock.address
+    )
+)
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
entry.async_on_unload(entry.add_update_listener(_async_update_listener))
return True
