Compare commits


232 Commits

Author SHA1 Message Date
Franck Nijhof
2d5a75d4f2 2025.2.2 (#138231)
* LaCrosse View new endpoint (#137284)

* Switch to new endpoint in LaCrosse View

* Coverage

* Avoid merge conflict

* Switch to UpdateFailed

* Convert coinbase account amounts as floats to properly add them together (#137588)

Convert coinbase account amounts as floats to properly add

* Bump ohmepy to 1.2.9 (#137695)

* Bump onedrive_personal_sdk to 0.0.9 (#137729)

* Limit habitica ConfigEntrySelect to integration domain (#137767)

* Limit nordpool ConfigEntrySelect to integration domain (#137768)

* Limit transmission ConfigEntrySelect to integration domain (#137769)

* Fix tplink child updates taking up to 60s (#137782)

* Fix tplink child updates taking up to 60s

fixes #137562

* Fix tplink child updates taking up to 60s

fixes #137562

* Fix tplink child updates taking up to 60s

fixes #137562

* Fix tplink child updates taking up to 60s

fixes #137562

* Fix tplink child updates taking up to 60s

fixes #137562

* Fix tplink child updates taking up to 60s

fixes #137562

* Fix tplink child updates taking up to 60s

fixes #137562

* Revert "Fix tplink child updates taking up to 60s"

This reverts commit 5cd20a120f772b8df96ec32890b071b22135895e.

* Call backup listener during setup in Google Drive (#137789)

* Use the external URL set in Settings > System > Network if my is disabled as redirect URL for Google Drive instructions (#137791)

* Use the Assistant URL set in Settings > System > Network if my is disabled

* fix

* Remove async_get_redirect_uri

* Fix manufacturer_id matching for 0 (#137802)

fix manufacturer_id matching for 0

* Fix DAB radio in Onkyo (#137852)

* Fix LG webOS TV fails to setup when device is off (#137870)

* Fix heos migration (#137887)

* Fix heos migration

* Fix for loop

* Bump pydrawise to 2025.2.0 (#137961)

* Bump aioshelly to version 12.4.2 (#137986)

* Prevent crash if telegram message failed and did not generate an ID (#137989)

Fix #137901 - Regression introduced in 6fdccda225

* Bump habiticalib to v0.3.7 (#137993)

* bump habiticalib to 0.3.6

* bump to v0.3.7

* Refresh the nest authentication token on integration start before invoking the pub/sub subscriber (#138003)

* Refresh the nest authentication token on integration start before invoking the pub/sub subscriber

* Apply suggestions from code review

---------

Co-authored-by: Paulus Schoutsen <paulus@home-assistant.io>

* Use resumable uploads in Google Drive (#138010)

* Use resumable uploads in Google Drive

* tests

* Bump py-synologydsm-api to 2.6.2 (#138060)

bump py-synologydsm-api to 2.6.2

* Handle generic agent exceptions when getting and deleting backups (#138145)

* Handle generic agent exceptions when getting backups

* Update hassio test

* Update delete_backup

* Bump onedrive-personal-sdk to 0.0.10 (#138186)

* Keep one backup per backup agent when executing retention policy (#138189)

* Keep one backup per backup agent when executing retention policy

* Add tests

* Use defaultdict instead of dict.setdefault

* Update hassio tests

* Improve inexogy logging when failed to update (#138210)

* Bump pyheos to v1.0.2 (#138224)

Bump pyheos

* Update frontend to 20250210.0 (#138227)

* Bump version to 2025.2.2

* Bump lacrosse-view to 1.1.1 (#137282)

---------

Co-authored-by: IceBotYT <34712694+IceBotYT@users.noreply.github.com>
Co-authored-by: Nathan Spencer <natekspencer@gmail.com>
Co-authored-by: Dan Raper <me@danr.uk>
Co-authored-by: Josef Zweck <josef@zweck.dev>
Co-authored-by: Marc Mueller <30130371+cdce8p@users.noreply.github.com>
Co-authored-by: J. Nick Koston <nick@koston.org>
Co-authored-by: tronikos <tronikos@users.noreply.github.com>
Co-authored-by: Patrick <14628713+patman15@users.noreply.github.com>
Co-authored-by: Artur Pragacz <49985303+arturpragacz@users.noreply.github.com>
Co-authored-by: Shay Levy <levyshay1@gmail.com>
Co-authored-by: Paulus Schoutsen <balloob@gmail.com>
Co-authored-by: David Knowles <dknowles2@gmail.com>
Co-authored-by: Maciej Bieniek <bieniu@users.noreply.github.com>
Co-authored-by: Daniel O'Connor <daniel.oconnor@gmail.com>
Co-authored-by: Manu <4445816+tr4nt0r@users.noreply.github.com>
Co-authored-by: Allen Porter <allen@thebends.org>
Co-authored-by: Paulus Schoutsen <paulus@home-assistant.io>
Co-authored-by: Michael <35783820+mib1185@users.noreply.github.com>
Co-authored-by: Abílio Costa <abmantis@users.noreply.github.com>
Co-authored-by: Erik Montnemery <erik@montnemery.com>
Co-authored-by: Jan-Philipp Benecke <jan-philipp@bnck.me>
Co-authored-by: Andrew Sayre <6730289+andrewsayre@users.noreply.github.com>
Co-authored-by: Bram Kragten <mail@bramkragten.nl>
2025-02-10 22:08:18 +01:00
IceBotYT
e1ad3f05e6 Bump lacrosse-view to 1.1.1 (#137282) 2025-02-10 20:21:54 +00:00
Franck Nijhof
b9280edbfa Bump version to 2025.2.2 2025-02-10 19:52:33 +00:00
Bram Kragten
010993fc5f Update frontend to 20250210.0 (#138227) 2025-02-10 19:50:59 +00:00
Andrew Sayre
713931661e Bump pyheos to v1.0.2 (#138224)
Bump pyheos
2025-02-10 19:49:54 +00:00
Jan-Philipp Benecke
af06521f66 Improve inexogy logging when failed to update (#138210) 2025-02-10 19:49:51 +00:00
Erik Montnemery
c32f57f85a Keep one backup per backup agent when executing retention policy (#138189)
* Keep one backup per backup agent when executing retention policy

* Add tests

* Use defaultdict instead of dict.setdefault

* Update hassio tests
2025-02-10 19:49:48 +00:00
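The "Use defaultdict instead of dict.setdefault" change in the commit above refers to a common Python idiom for grouping items by key. A minimal illustrative sketch (the data and variable names are hypothetical, not the actual Home Assistant backup code):

```python
from collections import defaultdict

# Hypothetical (agent, backup) pairs to group by agent.
backups = [("agent_a", "b1"), ("agent_b", "b2"), ("agent_a", "b3")]

# With dict.setdefault: the default list is created/looked up on every iteration.
by_agent_setdefault = {}
for agent, backup in backups:
    by_agent_setdefault.setdefault(agent, []).append(backup)

# With defaultdict: the missing-key factory is declared once, up front.
by_agent = defaultdict(list)
for agent, backup in backups:
    by_agent[agent].append(backup)

# Both approaches produce the same grouping.
assert dict(by_agent) == by_agent_setdefault
```

Both forms are correct; `defaultdict` just moves the default-value policy out of the loop body, which reads more clearly when the grouping logic grows.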
Josef Zweck
171061a778 Bump onedrive-personal-sdk to 0.0.10 (#138186) 2025-02-10 19:49:44 +00:00
Abílio Costa
476ea35bdb Handle generic agent exceptions when getting and deleting backups (#138145)
* Handle generic agent exceptions when getting backups

* Update hassio test

* Update delete_backup
2025-02-10 19:49:41 +00:00
Michael
00e6866664 Bump py-synologydsm-api to 2.6.2 (#138060)
bump py-synologydsm-api to 2.6.2
2025-02-10 19:49:38 +00:00
tronikos
201bf95ab8 Use resumable uploads in Google Drive (#138010)
* Use resumable uploads in Google Drive

* tests
2025-02-10 19:49:34 +00:00
Allen Porter
ff22bbd0e4 Refresh the nest authentication token on integration start before invoking the pub/sub subscriber (#138003)
* Refresh the nest authentication token on integration start before invoking the pub/sub subscriber

* Apply suggestions from code review

---------

Co-authored-by: Paulus Schoutsen <paulus@home-assistant.io>
2025-02-10 19:49:29 +00:00
Manu
fd8d4e937c Bump habiticalib to v0.3.7 (#137993)
* bump habiticalib to 0.3.6

* bump to v0.3.7
2025-02-10 19:48:56 +00:00
Daniel O'Connor
7903348d79 Prevent crash if telegram message failed and did not generate an ID (#137989)
Fix #137901 - Regression introduced in 6fdccda225
2025-02-10 19:48:52 +00:00
Maciej Bieniek
090dbba06e Bump aioshelly to version 12.4.2 (#137986) 2025-02-10 19:48:49 +00:00
David Knowles
af77e69eb0 Bump pydrawise to 2025.2.0 (#137961) 2025-02-10 19:48:43 +00:00
Paulus Schoutsen
23e7638687 Fix heos migration (#137887)
* Fix heos migration

* Fix for loop
2025-02-10 19:45:47 +00:00
Shay Levy
36b722960a Fix LG webOS TV fails to setup when device is off (#137870) 2025-02-10 19:45:44 +00:00
Artur Pragacz
3dd241a398 Fix DAB radio in Onkyo (#137852) 2025-02-10 19:45:40 +00:00
Patrick
b5a9c3d1f6 Fix manufacturer_id matching for 0 (#137802)
fix manufacturer_id matching for 0
2025-02-10 19:45:36 +00:00
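A "matching for 0" fix like the one above is typically a falsy-zero bug: truth-testing an ID conflates a legitimate value of 0 with "not set". A generic sketch of the bug class (hypothetical function names, not the actual integration code):

```python
def matches_buggy(pattern_id, device_id):
    # BUG: `if pattern_id` is False for 0, so an ID of 0 is treated as "no filter".
    if pattern_id:
        return pattern_id == device_id
    return True

def matches_fixed(pattern_id, device_id):
    # Compare against None explicitly so 0 acts as a real constraint.
    if pattern_id is not None:
        return pattern_id == device_id
    return True

assert matches_buggy(0, 1) is True    # wrongly matches everything
assert matches_fixed(0, 1) is False   # 0 now only matches 0
assert matches_fixed(None, 1) is True # None still means "no filter"
```

The general rule: use `is not None` (or an explicit sentinel) whenever 0, `""`, or an empty collection is a meaningful value.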
tronikos
eca714a45a Use the external URL set in Settings > System > Network if my is disabled as redirect URL for Google Drive instructions (#137791)
* Use the Assistant URL set in Settings > System > Network if my is disabled

* fix

* Remove async_get_redirect_uri
2025-02-10 19:45:33 +00:00
tronikos
8049699efb Call backup listener during setup in Google Drive (#137789) 2025-02-10 19:45:30 +00:00
J. Nick Koston
7c6afd50dc Fix tplink child updates taking up to 60s (#137782)
* Fix tplink child updates taking up to 60s

fixes #137562

* Fix tplink child updates taking up to 60s

fixes #137562

* Fix tplink child updates taking up to 60s

fixes #137562

* Fix tplink child updates taking up to 60s

fixes #137562

* Fix tplink child updates taking up to 60s

fixes #137562

* Fix tplink child updates taking up to 60s

fixes #137562

* Fix tplink child updates taking up to 60s

fixes #137562

* Revert "Fix tplink child updates taking up to 60s"

This reverts commit 5cd20a120f772b8df96ec32890b071b22135895e.
2025-02-10 19:45:26 +00:00
Marc Mueller
42d8889778 Limit transmission ConfigEntrySelect to integration domain (#137769) 2025-02-10 19:45:23 +00:00
Marc Mueller
a4c0304e1f Limit nordpool ConfigEntrySelect to integration domain (#137768) 2025-02-10 19:45:20 +00:00
Marc Mueller
c63e688ba8 Limit habitica ConfigEntrySelect to integration domain (#137767) 2025-02-10 19:45:16 +00:00
Josef Zweck
16298b4195 Bump onedrive_personal_sdk to 0.0.9 (#137729) 2025-02-10 19:45:13 +00:00
Dan Raper
da23eb22db Bump ohmepy to 1.2.9 (#137695) 2025-02-10 19:45:09 +00:00
Nathan Spencer
4bd1d0199b Convert coinbase account amounts as floats to properly add them together (#137588)
Convert coinbase account amounts as floats to properly add
2025-02-10 19:45:06 +00:00
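The coinbase fix above addresses a common pitfall: many JSON APIs return monetary amounts as strings, and summing them without conversion either fails or concatenates. A minimal sketch of the class of fix, with hypothetical data (not the Coinbase integration itself):

```python
# Amounts as strings, the way many JSON APIs deliver them.
amounts = ["1.25", "0.50", "2.00"]

# sum() on the raw strings would raise TypeError; convert each first.
total = sum(float(a) for a in amounts)
assert total == 3.75
```

Note that binary floats can lose precision on arbitrary monetary values; `decimal.Decimal` is the stricter choice when exact cents matter, but the commit above opted for floats.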
IceBotYT
efe7050030 LaCrosse View new endpoint (#137284)
* Switch to new endpoint in LaCrosse View

* Coverage

* Avoid merge conflict

* Switch to UpdateFailed
2025-02-10 19:44:56 +00:00
Franck Nijhof
79ff85f517 2025.2.1 (#137688)
* Fix hassio test using wrong fixture (#137516)

* Change Electric Kiwi authentication (#135231)

Co-authored-by: Joostlek <joostlek@outlook.com>

* Update govee-ble to 0.42.1 (#137371)

* Bump holidays to 0.66 (#137449)

* Bump aiohttp-asyncmdnsresolver to 0.1.0 (#137492)

changelog: https://github.com/aio-libs/aiohttp-asyncmdnsresolver/compare/v0.0.3...v0.1.0

Switches to the new AsyncDualMDNSResolver class, which tries both mDNS and DNS for .local domains, since we have so many different user DNS configurations to support.

fixes #137479
fixes #136922

* Bump aiohttp to 3.11.12 (#137494)

changelog: https://github.com/aio-libs/aiohttp/compare/v3.11.11...v3.11.12

* Bump govee-ble to 0.43.0 to fix compat with new H5179 firmware (#137508)

changelog: https://github.com/Bluetooth-Devices/govee-ble/compare/v0.42.1...v0.43.0

fixes #136969

* Bump habiticalib to v0.3.5 (#137510)

* Fix Mill issue, where no sensors were shown (#137521)

Fix mill issue #137477

Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>

* Don't overwrite setup state in async_set_domains_to_be_loaded (#137547)

* Use separate metadata files for onedrive (#137549)

* Fix sending polls to Telegram threads (#137553)

Fix sending poll to Telegram thread

* Skip building wheels for electrickiwi-api (#137556)

* Add excluded domains to broadcast intent (#137566)

* Revert "Add `PaddleSwitchPico` (Pico Paddle Remote) device trigger to Lutron Caseta" (#137571)

* Fix Overseerr webhook configuration JSON (#137572)

Co-authored-by: Lars Jouon <schm.lars@googlemail.com>

* Do not rely on pyserial for port scanning with the CM5 + ZHA (#137585)

Do not rely on pyserial for port scanning with the CM5

* Bump eheimdigital to 1.0.6 (#137587)

* Bump pyfireservicerota to 0.0.46 (#137589)

* Bump reolink-aio to 0.11.10 (#137591)

* Allow to omit the payload attribute to MQTT publish action to allow an empty payload to be sent by default (#137595)

Allow to omit the payload attribute to MQTT publish action to allow an empty payload to be sent by default

* Handle previously migrated HEOS device identifier (#137596)

* Bump `aioshelly` to version `12.4.1` (#137598)

* Bump aioshelly to 12.4.0

* Bump to 12.4.1

* Bump electrickiwi-api  to 0.9.13 (#137601)

* bump ek api version to fix deps

* Revert "Skip building wheels for electrickiwi-api (#137556)"

This reverts commit 5f6068eea4.

---------

Co-authored-by: Marc Mueller <30130371+cdce8p@users.noreply.github.com>

* Bump ZHA to 0.0.48 (#137610)

* Bump Electrickiwi-api to 0.9.14 (#137614)

* bump library to fix bug with post

* rebuild

* Update google-nest-sdm to 7.1.3 (#137625)

* Update google-nest-sdm to 7.1.2

* Bump nest to 7.1.3

* Don't use the current temperature from Shelly BLU TRV as a state for External Temperature number entity (#137658)

Introduce RpcBluTrvExtTempNumber for External Temperature entity

* Fix LG webOS TV turn off when device is already off (#137675)

* Bump version to 2025.2.1

---------

Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>
Co-authored-by: Erik Montnemery <erik@montnemery.com>
Co-authored-by: Michael Arthur <mikey0000@users.noreply.github.com>
Co-authored-by: Joostlek <joostlek@outlook.com>
Co-authored-by: Marc Mueller <30130371+cdce8p@users.noreply.github.com>
Co-authored-by: G Johansson <goran.johansson@shiftit.se>
Co-authored-by: J. Nick Koston <nick@koston.org>
Co-authored-by: Manu <4445816+tr4nt0r@users.noreply.github.com>
Co-authored-by: Daniel Hjelseth Høyer <github@dahoiv.net>
Co-authored-by: Josef Zweck <josef@zweck.dev>
Co-authored-by: Jasper Wiegratz <656460+jwhb@users.noreply.github.com>
Co-authored-by: Michael Hansen <mike@rhasspy.org>
Co-authored-by: Dennis Effing <dennis.effing@outlook.com>
Co-authored-by: Lars Jouon <schm.lars@googlemail.com>
Co-authored-by: puddly <32534428+puddly@users.noreply.github.com>
Co-authored-by: Sid <27780930+autinerd@users.noreply.github.com>
Co-authored-by: Ron <ron@cyberjunky.nl>
Co-authored-by: starkillerOG <starkiller.og@gmail.com>
Co-authored-by: Jan Bouwhuis <jbouwh@users.noreply.github.com>
Co-authored-by: Andrew Sayre <6730289+andrewsayre@users.noreply.github.com>
Co-authored-by: Maciej Bieniek <bieniu@users.noreply.github.com>
Co-authored-by: TheJulianJES <TheJulianJES@users.noreply.github.com>
Co-authored-by: Allen Porter <allen@thebends.org>
Co-authored-by: Shay Levy <levyshay1@gmail.com>
2025-02-07 19:34:32 +01:00
Franck Nijhof
73ad4caf94 Bump version to 2025.2.1 2025-02-07 16:39:53 +00:00
Shay Levy
e3d649d349 Fix LG webOS TV turn off when device is already off (#137675) 2025-02-07 16:37:52 +00:00
Maciej Bieniek
657e3488ba Don't use the current temperature from Shelly BLU TRV as a state for External Temperature number entity (#137658)
Introduce RpcBluTrvExtTempNumber for External Temperature entity
2025-02-07 16:37:49 +00:00
Allen Porter
7508c14a53 Update google-nest-sdm to 7.1.3 (#137625)
* Update google-nest-sdm to 7.1.2

* Bump nest to 7.1.3
2025-02-07 16:37:43 +00:00
Michael Arthur
ac84970da8 Bump Electrickiwi-api to 0.9.14 (#137614)
* bump library to fix bug with post

* rebuild
2025-02-07 16:37:40 +00:00
TheJulianJES
30073f3493 Bump ZHA to 0.0.48 (#137610) 2025-02-07 16:37:36 +00:00
Michael Arthur
3abd7b8ba3 Bump electrickiwi-api to 0.9.13 (#137601)
* bump ek api version to fix deps

* Revert "Skip building wheels for electrickiwi-api (#137556)"

This reverts commit 5f6068eea4.

---------

Co-authored-by: Marc Mueller <30130371+cdce8p@users.noreply.github.com>
2025-02-07 16:37:33 +00:00
Maciej Bieniek
62bc6e4bf6 Bump aioshelly to version 12.4.1 (#137598)
* Bump aioshelly to 12.4.0

* Bump to 12.4.1
2025-02-07 16:37:30 +00:00
Andrew Sayre
5faa189fef Handle previously migrated HEOS device identifier (#137596) 2025-02-07 16:37:26 +00:00
Jan Bouwhuis
e09ae1c83d Allow to omit the payload attribute to MQTT publish action to allow an empty payload to be sent by default (#137595)
Allow to omit the payload attribute to MQTT publish action to allow an empty payload to be sent by default
2025-02-07 16:37:23 +00:00
starkillerOG
7b20299de7 Bump reolink-aio to 0.11.10 (#137591) 2025-02-07 16:37:19 +00:00
Ron
81e501aba1 Bump pyfireservicerota to 0.0.46 (#137589) 2025-02-07 16:37:16 +00:00
Sid
568ac22ce8 Bump eheimdigital to 1.0.6 (#137587) 2025-02-07 16:37:12 +00:00
puddly
c71ab054f1 Do not rely on pyserial for port scanning with the CM5 + ZHA (#137585)
Do not rely on pyserial for port scanning with the CM5
2025-02-07 16:37:09 +00:00
Dennis Effing
bea201f9f6 Fix Overseerr webhook configuration JSON (#137572)
Co-authored-by: Lars Jouon <schm.lars@googlemail.com>
2025-02-07 16:37:05 +00:00
J. Nick Koston
dda90bc04c Revert "Add PaddleSwitchPico (Pico Paddle Remote) device trigger to Lutron Caseta" (#137571) 2025-02-07 16:37:02 +00:00
Michael Hansen
a033e4c88d Add excluded domains to broadcast intent (#137566) 2025-02-07 16:36:59 +00:00
Marc Mueller
42b6f83e7c Skip building wheels for electrickiwi-api (#137556) 2025-02-07 16:36:55 +00:00
Jasper Wiegratz
cb937bc115 Fix sending polls to Telegram threads (#137553)
Fix sending poll to Telegram thread
2025-02-07 16:36:51 +00:00
Josef Zweck
bec569caf9 Use separate metadata files for onedrive (#137549) 2025-02-07 16:36:47 +00:00
Erik Montnemery
3390fb32a8 Don't overwrite setup state in async_set_domains_to_be_loaded (#137547) 2025-02-07 16:36:43 +00:00
Daniel Hjelseth Høyer
3ebb58f780 Fix Mill issue, where no sensors were shown (#137521)
Fix mill issue #137477

Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>
2025-02-07 16:36:40 +00:00
Manu
30b131d3b9 Bump habiticalib to v0.3.5 (#137510) 2025-02-07 16:36:36 +00:00
J. Nick Koston
cd40232beb Bump govee-ble to 0.43.0 to fix compat with new H5179 firmware (#137508)
changelog: https://github.com/Bluetooth-Devices/govee-ble/compare/v0.42.1...v0.43.0

fixes #136969
2025-02-07 16:36:29 +00:00
J. Nick Koston
f27fe365c5 Bump aiohttp to 3.11.12 (#137494)
changelog: https://github.com/aio-libs/aiohttp/compare/v3.11.11...v3.11.12
2025-02-07 16:34:31 +00:00
J. Nick Koston
1c769418fb Bump aiohttp-asyncmdnsresolver to 0.1.0 (#137492)
changelog: https://github.com/aio-libs/aiohttp-asyncmdnsresolver/compare/v0.0.3...v0.1.0

Switches to the new AsyncDualMDNSResolver class, which tries both mDNS and DNS for .local domains, since we have so many different user DNS configurations to support.

fixes #137479
fixes #136922
2025-02-07 16:32:21 +00:00
G Johansson
db7c2dab52 Bump holidays to 0.66 (#137449) 2025-02-07 16:28:43 +00:00
Marc Mueller
627377872b Update govee-ble to 0.42.1 (#137371) 2025-02-07 16:28:37 +00:00
Michael Arthur
8504162539 Change Electric Kiwi authentication (#135231)
Co-authored-by: Joostlek <joostlek@outlook.com>
2025-02-07 16:28:31 +00:00
Erik Montnemery
67c6a1d436 Fix hassio test using wrong fixture (#137516) 2025-02-06 09:04:49 +01:00
Franck Nijhof
5c383f3d88 2025.2.0 (#137448) 2025-02-05 20:11:04 +01:00
Franck Nijhof
3a88c9d6f4 Bump version to 2025.2.0 2025-02-05 17:35:07 +00:00
Franck Nijhof
5c7cabed1e Bump version to 2025.2.0b12 2025-02-05 17:30:55 +00:00
J. Nick Koston
65fde6042f Bump dbus-fast to 2.33.0 (#137446)
changelog: https://github.com/Bluetooth-Devices/dbus-fast/compare/v2.32.0...v2.33.0
2025-02-05 17:30:19 +00:00
Michael Hansen
d5dd0f6ec1 Bump hassil and intents (#137440) 2025-02-05 17:28:33 +00:00
Marc Mueller
95410586b1 Update bluetooth-data-tools to 1.23.4 (#137374)
Co-authored-by: J. Nick Koston <nick@koston.org>
2025-02-05 17:24:18 +00:00
Marc Mueller
d5ad91fce3 Update bluetooth dependencies (#137353) 2025-02-05 17:21:28 +00:00
Franck Nijhof
04b0d587c5 Bump version to 2025.2.0b11 2025-02-05 16:18:01 +00:00
Bram Kragten
72a3c5296c Update frontend to 20250205.0 (#137441) 2025-02-05 16:16:12 +00:00
Erik Montnemery
d6414b9849 Bump aiohasupervisor to version 0.3.0 (#137437) 2025-02-05 16:14:42 +00:00
starkillerOG
c4e2ddd28b Bump reolink_aio to 0.11.9 (#137430)
* Add push callbacks

* Bump reolink_aio to 0.11.9
2025-02-05 16:14:39 +00:00
Josef Zweck
5687a4d718 Bump onedrive to 0.0.8 (#137423)
* Bump onedrive to 0.0.6

* bump to 0.0.7

* bump to 0.0.8

* Improve coverage
2025-02-05 16:14:06 +00:00
Franck Nijhof
a4474b2794 Bump version to 2025.2.0b10 2025-02-05 12:26:27 +00:00
Erik Montnemery
72a69d7e41 Adjust backup filename scheme (#137424)
* Adjust backup filename scheme

* Update tests
2025-02-05 12:16:11 +00:00
Erik Montnemery
e8314fb286 Adjust logic for per-backup agent encryption (#137420) 2025-02-05 12:16:07 +00:00
Erik Montnemery
30c099ef4e Allow creating backup if at least one agent is available (#137409) 2025-02-05 12:16:04 +00:00
Paulus Schoutsen
c506c9080a Simplify llm calendar tool (#137402)
* Simplify calendar tool

* Clean up exposed entities
2025-02-05 12:16:01 +00:00
Brett Adams
79563f3746 Handle powerwall at zero percent in Tesla Fleet and Tessie (#137393)
* Handle powerwall zero

* Add missing value_fn call
2025-02-05 12:15:56 +00:00
Brett Adams
0764c7e773 Bump Tesla Fleet API to v0.9.8 (#137379)
* v0.9.7

* v0.9.8
2025-02-05 12:14:14 +00:00
J. Nick Koston
fa83591148 Allow ignored Bluetooth adapters to be set up from the user flow (#137373) 2025-02-05 11:57:16 +00:00
J. Nick Koston
df2b29aef1 Bump led-ble to 1.1.6 (#137369) 2025-02-05 11:57:13 +00:00
Jan-Philipp Benecke
da8d300f29 Fix sqlalchemy deprecation warning that declarative_base has moved (#137360) 2025-02-05 11:57:09 +00:00
Marc Mueller
2c5fd4ee2a Update led-ble to 1.1.5 (#137347) 2025-02-05 11:57:06 +00:00
J. Nick Koston
16d9270833 Fix memory leak when unloading DataUpdateCoordinator (#137338)
* check wiz

* Fix memory leak when unloading DataUpdateCoordinator

fixes #137237

* handle namespace conflict

* handle namespace conflict

* address review comments
2025-02-05 11:57:02 +00:00
Erik Montnemery
d8179dacc6 Report progress while restoring supervisor backup (#137313) 2025-02-05 11:56:56 +00:00
TimL
3dc075f287 Bump pysmlight to v0.1.7 (#137390) 2025-02-05 09:43:38 +01:00
Bram Kragten
b5e4fee9aa Bump version to 2025.2.0b9 2025-02-04 21:42:50 +01:00
Shay Levy
1c8ced2c2d Fix Tado missing await (#137364) 2025-02-04 21:41:27 +01:00
Robert Resch
1a5b8cf854 Bump deebot-client to 12.0.0 (#137361) 2025-02-04 21:41:26 +01:00
Shay Levy
af40bb39ad Bump aranet4 to 2.5.1 (#137359) 2025-02-04 21:41:25 +01:00
Teemu R.
14034ed7f8 Polish tplink vacuum sensors (#137355) 2025-02-04 21:41:24 +01:00
Glenn Waters
d7f0a55568 Fix incorrect UPB service entity type (#137346) 2025-02-04 21:41:24 +01:00
J. Nick Koston
1038a849c4 Bump uiprotect to 7.5.1 (#137343) 2025-02-04 21:41:23 +01:00
Bram Kragten
c4b08d3d57 Update frontend to 20250204.0 (#137342) 2025-02-04 21:39:32 +01:00
J. Nick Koston
0e9658b5ff Copy area from remote parent device when creating Bluetooth devices (#137340) 2025-02-04 21:36:43 +01:00
Duco Sebel
0463b90d36 Fix HomeWizard reconfigure flow throwing error for v2-API devices (#137337)
Fix reconfigure flow not working for v2
2025-02-04 21:36:42 +01:00
Jan Bouwhuis
37f0832c8b Don't show active user initiated data entry config flows (#137334)
Do not show active user initiated data entry config flows
2025-02-04 21:36:41 +01:00
Erik Montnemery
2005e14d5f Improve error handling when supervisor backups are deleted (#137331)
* Improve error handling when supervisor backups are deleted

* Move exception definitions
2025-02-04 21:36:41 +01:00
Josef Zweck
99219a9a73 Bump onedrive-personal-sdk to 0.0.4 (#137330) 2025-02-04 21:36:40 +01:00
Bram Kragten
1f967f7f77 Minor adjustments of hassio backup tests (#137324) 2025-02-04 21:35:04 +01:00
Kevin Worrel
8de64b8b1f Allow ignored screenlogic devices to be set up from the user flow (#137315)
Allow ignored ScreenLogic devices to be set up from the user flow
2025-02-04 21:25:18 +01:00
Matthias Lohr
48c88d8fa1 Bump tololib to 1.2.2 (#137303) 2025-02-04 21:25:18 +01:00
Erik Montnemery
d478f906df Include extra metadata in backup WS API (#137296)
* Include extra metadata in backup WS API

* Update onboarding backup view

* Update google_drive tests
2025-02-04 21:25:17 +01:00
Michael
09e02493b7 Improve backup file naming in Synology DSM backup agent (#137278)
* improve backup file naming

* use built-in suggested_filename
2025-02-04 21:25:16 +01:00
Abílio Costa
55c746f909 Add view to download support package to Cloud component (#135856) 2025-02-04 21:25:15 +01:00
Franck Nijhof
834a04ac49 Bump version to 2025.2.0b8 2025-02-04 12:26:19 +00:00
Josef Zweck
fa9b4c3524 Bump onedrive-personal-sdk to 0.0.3 (#137309) 2025-02-04 12:25:47 +00:00
Erik Montnemery
13bfa82038 Report progress while creating supervisor backup (#137301)
* Report progress while creating supervisor backup

* Use enum util
2025-02-04 12:25:44 +00:00
epenet
0766b47161 Fix data update coordinator garbage collection (#137299) 2025-02-04 12:25:36 +00:00
Brett Adams
fa8225d0a2 Bump tesla-fleet-api to 0.9.2 (#137295) 2025-02-04 12:23:23 +00:00
Daniel Hjelseth Høyer
623c82e5d1 Bump pymill to 0.12.3 (#137264)
Mill lib 0.12.3

Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>
2025-02-04 12:11:18 +00:00
Bram Kragten
728a1a4be5 Update frontend to 20250203.0 (#137263) 2025-02-04 12:11:15 +00:00
Duco Sebel
4bbb3e351b Remove v2 API support for HomeWizard P1 Meter (#137261) 2025-02-04 12:11:12 +00:00
Erik Montnemery
044bafd6aa Improve shutdown of _CipherBackupStreamer (#137257)
* Improve shutdown of _CipherBackupStreamer

* Catch the right exception
2025-02-04 12:11:08 +00:00
Abílio Costa
1e1069b647 Allow ignored idasen_desk devices to be set up from the user flow (#137253) 2025-02-04 12:11:04 +00:00
Josef Zweck
455af9179b Bump onedrive-personal-sdk to 0.0.2 (#137252) 2025-02-04 12:10:59 +00:00
Regev Brody
30b309d7a1 Bump python-roborock to 2.11.1 (#137244) 2025-02-04 12:10:55 +00:00
Markus Adrario
7e32342eb2 Fix minor issues in Homee (#137239) 2025-02-04 12:10:51 +00:00
RJPoelstra
bb9740991e Fix retrieving PIN when no pin is set on mount in motionmount integration (#137230) 2025-02-04 12:10:47 +00:00
Erik Montnemery
88e5d1c18f Check for errors when creating backups using supervisor (#137220)
* Check for errors when creating backups using supervisor

* Improve error reporting when there's no backup reference
2025-02-04 12:10:43 +00:00
Erik Montnemery
e960053226 Check for errors when restoring backups using supervisor (#137217)
* Check for errors when restoring backups using supervisor

* Break long line in test

* Improve comments
2025-02-04 12:10:39 +00:00
cdnninja
b318fb46a0 Vesync bump pyvesync library (#137208) 2025-02-04 12:10:36 +00:00
Andre Lengwenus
523835080b Bump pypck to 0.8.5 (#137176) 2025-02-04 12:10:32 +00:00
Norbert Rittel
5a63138581 Fixes in user-facing strings of Tado integration (#137158) 2025-02-04 12:10:28 +00:00
Indu Prakash
90ddb6cce1 Humidifier turn display off for sleep mode (#137133) 2025-02-04 12:10:25 +00:00
Josef Zweck
81783dcfd3 Migrate OneDrive to onedrive_personal_sdk library (#137064) 2025-02-04 12:10:21 +00:00
Michael
405cc47157 Don't blow up when a backup doesn't exist on Synology DSM (#136913)
* don't raise while deleting a non-existing backup

* only raise when the error is not 408
2025-02-04 12:10:17 +00:00
Aaron Godfrey
809f5eea49 Bump todoist-api-python to 2.1.7 (#136549)
Co-authored-by: Allen Porter <allen@thebends.org>
Co-authored-by: J. Diego Rodríguez Royo <jdrr1998@hotmail.com>
2025-02-04 12:10:12 +00:00
Paulus Schoutsen
63c153d671 Bump version to 2025.2.0b7 2025-02-03 02:27:53 +00:00
TimL
c8c6eddc65 Simplify config entry title for SMLIGHT (#137206) 2025-02-03 02:27:42 +00:00
J. Nick Koston
ddb40cb4a8 Bump dbus-fast to 2.23.0 (#137205)
changelog: https://github.com/Bluetooth-Devices/dbus-fast/compare/v2.31.0...v2.32.0
2025-02-03 02:27:42 +00:00
TimL
38975775ac Switch to using IP Addresses for connecting to smlight devices (#137204) 2025-02-03 02:27:41 +00:00
J. Nick Koston
4fa043e6ff Bump bleak-esphome to 2.7.0 (#137199)
changelog: https://github.com/Bluetooth-Devices/bleak-esphome/compare/v2.6.0...v2.7.0
2025-02-03 02:27:40 +00:00
J. Nick Koston
433a51f6d5 Bump aiodhcpwatcher to 1.0.3 (#137188)
changelog: https://github.com/bdraco/aiodhcpwatcher/compare/v1.0.2...v1.0.3
2025-02-03 02:27:40 +00:00
J. Nick Koston
48511986bb Bump dbus-fast to 2.31.0 (#137180)
changelog: https://github.com/Bluetooth-Devices/dbus-fast/compare/v2.30.4...v2.31.0
2025-02-03 02:27:39 +00:00
Steven B.
f1128adec4 Bump python-kasa to 0.10.1 (#137173) 2025-02-03 02:27:38 +00:00
Jan Bouwhuis
54a718c1d7 Fix mqtt reconfigure does not use broker entry password when it is not changed (#137169) 2025-02-03 02:27:38 +00:00
Jeef
63d1dddc76 Bump monarchmoney to 0.4.4 (#137168)
feat: update the backing lib
2025-02-03 02:27:37 +00:00
Manu
7d1b72a581 Bump habiticalib to v0.3.4 (#137148)
Bump habiticalib to version 0.3.4
2025-02-03 02:27:36 +00:00
J. Nick Koston
6c172705d1 Bump bluetooth-data-tools to 1.23.3 (#137147) 2025-02-03 02:27:36 +00:00
J. Nick Koston
505f089a73 Bump dbus-fast to 2.30.4 (#137151)
changelog: https://github.com/Bluetooth-Devices/dbus-fast/compare/v2.30.2...v2.30.4
2025-02-03 02:26:40 +00:00
TimL
dbf9e370a8 Allow manual smlight user setup to override discovery (#137136)
Co-authored-by: J. Nick Koston <nick@koston.org>
2025-02-03 02:25:22 +00:00
Paulus Schoutsen
dc1c2f24e6 Bump version to 2025.2.0b6 2025-02-02 02:06:10 +00:00
Robert Resch
78dcf8b18e Bump deebot-client to 12.0.0b0 (#137137) 2025-02-02 02:06:07 +00:00
J. Nick Koston
613168fd62 Add missing brackets to ESPHome configuration URLs with IPv6 addresses (#137132)
fixes #137125
2025-02-02 02:06:06 +00:00
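The "missing brackets" fix above reflects RFC 3986: an IPv6 literal used as a URL host must be wrapped in square brackets, since colons otherwise read as a port separator. A sketch of the kind of helper such a fix adds (the function name is hypothetical, not the ESPHome code):

```python
from ipaddress import ip_address

def format_host(host: str) -> str:
    """Wrap IPv6 literals in brackets so they are valid as URL hosts."""
    try:
        if ip_address(host).version == 6:
            return f"[{host}]"
    except ValueError:
        pass  # not an IP literal (e.g. a hostname): leave unchanged
    return host

assert format_host("2001:db8::1") == "[2001:db8::1]"
assert format_host("192.168.1.10") == "192.168.1.10"
assert format_host("homeassistant.local") == "homeassistant.local"
```

With the helper, a configuration URL like `f"http://{format_host(host)}:6052"` stays valid for IPv4, IPv6, and hostname inputs alike.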
J. Nick Koston
5f28e95bdc Bump habluetooth to 3.21.0 (#137129) 2025-02-02 02:06:05 +00:00
Allen Porter
1db5da4037 Remove entity state from mcp-server prompt (#137126)
* Create a stateless assist API for MCP server

* Update stateless API

* Fix areas in exposed entity fields

* Add tests that verify areas are returned

* Revert the getstate intent

* Revert whitespace change

* Revert whitespace change

* Revert method name changes to avoid breaking openai and google tests
2025-02-02 02:06:05 +00:00
Alex Thompson
6bf5e95089 Allow ignored tilt_ble devices to be set up from user flow (#137123)
Co-authored-by: J. Nick Koston <nick@koston.org>
2025-02-02 02:06:04 +00:00
Shay Levy
1ea23fda10 Allow ignored Aranet devices to be set up from the user flow (#137121) 2025-02-02 02:06:03 +00:00
J. Nick Koston
21a85c014a Allow ignored xiaomi_ble devices to be set up from the user flow (#137115) 2025-02-02 02:06:03 +00:00
J. Nick Koston
4c8f716320 Allow ignored sensorpush devices to be set up from the user flow (#137113)
Every few days we get an issue report about a device a user ignored and forgot about, and then can no longer get set up. Sometimes it's a govee device, sometimes it's a switchbot device, but the pattern is consistent.

Allow ignored devices to be selected in the user step and replace the ignored entry.

Same as #137056 and #137052 but for sensorpush
2025-02-02 02:06:02 +00:00
J. Nick Koston
63bd67f6cd Allow ignored qingping devices to be set up from the user flow (#137111)
Every few days we get an issue report about a device a user ignored and forgot about, and then can no longer get set up. Sometimes it's a govee device, sometimes it's a switchbot device, but the pattern is consistent.

Allow ignored devices to be selected in the user step and replace the ignored entry.

Same as #137056 and #137052 but for qingping
2025-02-02 02:06:01 +00:00
Assaf Inbal
73b874c5e6 Fix Homekit camera profiles schema (#137110) 2025-02-02 02:06:00 +00:00
J. Nick Koston
3b67dc3651 Allow ignored oralb devices to be set up from the user flow (#137109)
Every few days we get an issue report about a device a user ignored and forgot about, and then can no longer get set up. Sometimes it's a govee device, sometimes it's a switchbot device, but the pattern is consistent.

Allow ignored devices to be selected in the user step and replace the ignored entry.

Same as #137056 and #137052 but for oralb
2025-02-02 02:06:00 +00:00
J. Nick Koston
434a4ebc9f Allow ignored mopeka devices to be set up from the user flow (#137107)
Every few days we get an issue report about a device a user ignored and forgot about, and then can no longer get set up. Sometimes it's a govee device, sometimes it's a switchbot device, but the pattern is consistent.

Allow ignored devices to be selected in the user step and replace the ignored entry.

Same as #137056 and #137052 but for mopeka
2025-02-02 02:05:59 +00:00
J. Nick Koston
cb4b7e71af Allow ignored inkbird devices to be set up from the user flow (#137106)
Every few days we get an issue report about a device a user ignored and forgot about, and then can no longer get set up. Sometimes it's a govee device, sometimes it's a switchbot device, but the pattern is consistent.

Allow ignored devices to be selected in the user step and replace the ignored entry.

Same as #137056 and #137052 but for inkbird
2025-02-02 02:05:58 +00:00
J. Nick Koston
4c6fda2096 Allow ignored bthome devices to be set up from the user flow (#137105) 2025-02-02 02:05:58 +00:00
J. Nick Koston
9b5c21524c Allow ignored thermopro devices to be set up from the user flow (#137104)
Every few days we get an issue report about a device a user ignored and forgot about, and then can no longer get set up. Sometimes it's a govee device, sometimes it's a switchbot device, but the pattern is consistent.

Allow ignored devices to be selected in the user step and replace the ignored entry.

Same as #137056 and #137052 but for thermopro
2025-02-02 02:05:57 +00:00
J. Nick Koston
76937541f1 Allow ignored yalexs_ble devices to be set up from the user flow (#137103)
Every few days we get an issue report about a device a user ignored and forgot about, and then can no longer get set up. Sometimes it's a govee device, sometimes it's a switchbot device, but the pattern is consistent.

Allow ignored devices to be selected in the user step and replace the ignored entry.

Same as #137056 and #137052 but for yalexs_ble
2025-02-02 02:05:56 +00:00
J. Nick Koston
bad966f3ab Allow ignored airthings_ble devices to be set up from the user flow (#137102)
Every few days we get an issue report about a device a user ignored and forgot about, and then can no longer get set up. Sometimes it's a govee device, sometimes it's a switchbot device, but the pattern is consistent.

Allow ignored devices to be selected in the user step and replace the ignored entry.

Same as #137056 and #137052 but for airthings
2025-02-02 02:05:55 +00:00
J. Nick Koston
2d1d9bbe5a Set via_device for remote Bluetooth adapters to link to the parent device (#137091) 2025-02-02 02:05:55 +00:00
Marc Mueller
e76ff0a0de Update RestrictedPython to 8.0 (#137075) 2025-02-02 02:05:54 +00:00
IceBotYT
fa8d1b4dc4 Bump lacrosse-view to 1.0.4 (#137058) 2025-02-02 02:05:53 +00:00
Paulus Schoutsen
b3c44ca03a Bump version to 2025.2.0b5 2025-02-01 13:58:56 +00:00
Jan-Philipp Benecke
6efa6f9687 Load hassio before backup at frontend stage (#137067) 2025-02-01 13:58:53 +00:00
J. Nick Koston
3588b88cbb Bump habluetooth to 3.20.1 (#137063) 2025-02-01 13:58:52 +00:00
tronikos
a51846a8cd For consistency use suggested_filename in Google Drive (#137061)
Use suggested_filename in Google Drive
2025-02-01 13:58:52 +00:00
J. Nick Koston
ec22479733 Allow ignored switchbot devices to be set up from the user flow (#137056) 2025-02-01 13:58:51 +00:00
J. Nick Koston
3a11e8df6a Allow ignored govee-ble devices to be set up from the user flow (#137052)
* Allow ignored govee-ble devices to be set up from the user flow

Every few days we get an issue report about a device
a user ignored and forgot about, and then can no longer
get set up. Allow ignored devices to be selected in
the user step and replace the ignored entry.

* Add the ability to skip ignored config entries when calling _abort_if_unique_id_configured

see https://github.com/home-assistant/core/pull/137052

* coverage

* revert
2025-02-01 13:58:50 +00:00
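The flow change these commits describe can be sketched in plain Python. The names below (`ConfigEntry`, `current_ids`, `SOURCE_IGNORE`) are illustrative stand-ins for Home Assistant's config-entry machinery, not its real API:

```python
from dataclasses import dataclass

SOURCE_IGNORE = "ignore"  # stand-in marker for an entry the user ignored


@dataclass
class ConfigEntry:
    unique_id: str
    source: str


def current_ids(entries: list[ConfigEntry], include_ignore: bool = True) -> set[str]:
    """Collect unique IDs, optionally skipping entries the user ignored."""
    return {
        e.unique_id for e in entries if include_ignore or e.source != SOURCE_IGNORE
    }


# One ignored device and one fully set-up device.
entries = [ConfigEntry("aa:bb", SOURCE_IGNORE), ConfigEntry("cc:dd", "user")]
discovered = ["aa:bb", "cc:dd", "ee:ff"]

# Excluding ignored IDs makes the forgotten device selectable again in the
# user step, so setting it up can replace the ignored entry.
selectable = [
    d for d in discovered if d not in current_ids(entries, include_ignore=False)
]
```

With `include_ignore=False`, the ignored `aa:bb` reappears alongside the genuinely new `ee:ff`, while the configured `cc:dd` stays hidden.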
Nathan Spencer
a4eab35e01 Raise HomeAssistantError from camera snapshot service (#137051)
* Raise HomeAssistantError from camera snapshot service

* Improve error message

---------

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2025-02-01 13:58:50 +00:00
Paulus Schoutsen
829a6271af Bump version to 2025.2.0b4 2025-02-01 01:04:55 +00:00
Jan Bouwhuis
9935528dd3 Bump aioimaplib to version 2.0.1 (#137049) 2025-02-01 01:04:48 +00:00
J. Nick Koston
df35d226d6 Bump habluetooth to 3.17.1 (#137045) 2025-02-01 01:04:47 +00:00
J. Nick Koston
2b510caa1c Bump aiohttp-asyncmdnsresolver to 0.0.3 (#137040) 2025-02-01 01:04:47 +00:00
J. Nick Koston
90c357c01f Bump bthome-ble to 3.12.3 (#137036) 2025-02-01 01:04:46 +00:00
J. Nick Koston
321ce698be Bump zeroconf to 0.143.0 (#137035) 2025-02-01 01:04:45 +00:00
Ernst Klamer
ea519268b6 Bump bthome-ble to 3.11.0 (#137032)
bump bthome-ble to 3.11.0
2025-02-01 01:04:45 +00:00
Josef Zweck
4687b2e455 Use readable backup names for onedrive (#137031)
* Use readable names for onedrive

* ensure filename is fixed

* fix import
2025-02-01 01:04:44 +00:00
Joost Lekkerkerker
bbb03d6731 Update Overseerr string to mention CSRF (#137001)
* Update Overseerr string to mention CSRF

* Update homeassistant/components/overseerr/strings.json

* Update homeassistant/components/overseerr/strings.json

---------

Co-authored-by: Paulus Schoutsen <paulus@home-assistant.io>
2025-02-01 01:04:43 +00:00
Jan Bouwhuis
b9884f72c3 Shorten the integration name for incomfort (#136930) 2025-02-01 01:04:42 +00:00
Paulus Schoutsen
e1105ef2fa Bump version to 2025.2.0b3 2025-01-31 19:25:16 +00:00
Robert Resch
5450ed8445 Bump deebot-client to 11.1.0b2 (#137030) 2025-01-31 19:24:42 +00:00
J. Nick Koston
7deb1715dd Bump SQLAlchemy to 2.0.37 (#137028)
changelog: https://docs.sqlalchemy.org/en/20/changelog/changelog_20.html#change-2.0.37

There is a bug fix that likely affects us that could lead to corrupted queries
https://docs.sqlalchemy.org/en/20/changelog/changelog_20.html#change-e4d04d8eb1bccee16b74f5662aff8edd
2025-01-31 19:24:41 +00:00
J. Nick Koston
ca2a555037 Bump bleak-esphome to 2.6.0 (#137025) 2025-01-31 19:24:41 +00:00
Bram Kragten
ae79b09401 Update frontend to 20250131.0 (#137024) 2025-01-31 19:24:40 +00:00
J. Nick Koston
e86a633c23 Bump habluetooth to 3.17.0 (#137022) 2025-01-31 19:24:39 +00:00
Erik Montnemery
b412164440 Make supervisor backup file names more user friendly (#137020) 2025-01-31 19:24:39 +00:00
Norbert Rittel
4fe76ec78c Revert previous PR and remove URL from error message instead (#137018) 2025-01-31 19:24:38 +00:00
Erik Montnemery
f4166c5390 Make sure we load the backup integration before frontend (#137010) 2025-01-31 19:24:37 +00:00
Cyrill Raccaud
3107b81333 Remove the unparsed config flow error from Swiss public transport (#136998) 2025-01-31 19:24:36 +00:00
Joost Lekkerkerker
07b85163d5 Use device name as entity name in Eheim digital climate (#136997) 2025-01-31 19:24:35 +00:00
Duco Sebel
c28d465f3b Bump python-homewizard-energy to 8.3.2 (#136995) 2025-01-31 19:24:34 +00:00
Josef Zweck
00298db465 Call backup listener during setup in onedrive (#136990) 2025-01-31 19:24:34 +00:00
Cyrill Raccaud
6bab5b2c32 Fix missing duration translation for Swiss public transport integration (#136982) 2025-01-31 19:24:33 +00:00
Josef Zweck
0272d37e88 Retry backup uploads in onedrive (#136980)
* Retry backup uploads in onedrive

* no exponential backoff on timeout
2025-01-31 19:24:32 +00:00
Erik Montnemery
26ae498974 Delete old addon update backups when updating addon (#136977)
* Delete old addon update backups when updating addon

* Address review comments

* Add tests
2025-01-31 19:24:31 +00:00
J. Nick Koston
c77bca1e44 Bump habluetooth to 3.15.0 (#136973) 2025-01-31 19:24:30 +00:00
Michael Hansen
ad86f9efd5 Consume extra system prompt in first pipeline (#136958) 2025-01-31 19:24:30 +00:00
Matthias Alphart
71a40d9234 Update knx-frontend to 2025.1.30.194235 (#136954) 2025-01-31 19:24:29 +00:00
J. Nick Koston
eb344ba335 Bump aiohttp-asyncmdnsresolver to 0.0.2 (#136942) 2025-01-31 19:22:11 +00:00
J. Nick Koston
eca30717a9 Bump zeroconf to 0.142.0 (#136940)
changelog: https://github.com/python-zeroconf/python-zeroconf/compare/0.141.0...0.142.0
2025-01-31 19:22:10 +00:00
Erik Montnemery
6e55ba137a Make backup file names more user friendly (#136928)
* Make backup file names more user friendly

* Strip backup name

* Strip backup name

* Underscores
2025-01-31 19:22:10 +00:00
tronikos
a391f0a7cc Bump opower to 0.8.9 (#136911)
* Bump opower to 0.8.9

* mypy
2025-01-31 19:22:09 +00:00
tronikos
c9fd27555c Include the redirect URL in the Google Drive instructions (#136906)
* Include the redirect URL in the Google Drive instructions

* Apply suggestions from code review

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>

---------

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2025-01-31 19:22:08 +00:00
Allen Porter
9cd48dd452 Persist roborock maps to disk only on shutdown (#136889)
* Persist roborock maps to disk only on shutdown

* Rename on_unload to on_stop

* Spawn 1 executor thread and block writes to disk

* Update tests/components/roborock/test_image.py

Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>

* Use config entry setup instead of component setup

---------

Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2025-01-31 19:22:07 +00:00
Avi Miller
a74328e600 Suppress color_temp warning if color_temp_kelvin is provided (#136884) 2025-01-31 19:22:06 +00:00
Jan Stienstra
5cec045cac Bump jellyfin-apiclient-python to 1.10.0 (#136872) 2025-01-31 19:22:06 +00:00
Norbert Rittel
04a7c6f15e Fixes to the user-facing strings of energenie_power_sockets (#136844) 2025-01-31 19:22:05 +00:00
Austin Mroczek
833b17a8ee Bump total-connect-client to 2025.1.4 (#136793) 2025-01-31 19:22:04 +00:00
Sid
a955901d40 Refactor eheimdigital platform async_setup_entry (#136745) 2025-01-31 19:22:03 +00:00
starkillerOG
9a55b5e3f7 Ensure Reolink can start when privacy mode is enabled (#136514)
* Allow startup when privacy mode is enabled

* Add tests

* remove duplicate privacy_mode

* fix tests

* Apply suggestions from code review

Co-authored-by: Robert Resch <robert@resch.dev>

* Store in subfolder and cleanup when removed

* Add tests and fixes

* fix styling

* rename CONF_PRIVACY to CONF_SUPPORTS_PRIVACY_MODE

* use helper store

---------

Co-authored-by: Robert Resch <robert@resch.dev>
2025-01-31 19:22:03 +00:00
Bram Kragten
3847057444 Bump version to 2025.2.0b2 2025-01-30 19:28:55 +01:00
Bram Kragten
659a0df9ab Update frontend to 20250130.0 (#136937) 2025-01-30 19:21:55 +01:00
Maciej Bieniek
74f0af1ba1 Fix KeyError for Shelly virtual number component (#136932) 2025-01-30 19:21:54 +01:00
Michael
ad6c3f9e10 Fix backup related translations in Synology DSM (#136931)
reference backup-related strings in options-flow strings
2025-01-30 19:21:53 +01:00
Josef Zweck
252b13e63a Pick onedrive owner from a more reliable source (#136929)
* Pick onedrive owner from a more reliable source

* fix
2025-01-30 19:21:52 +01:00
Joost Lekkerkerker
07acabdb36 Create Xbox signed session in executor (#136927) 2025-01-30 19:21:51 +01:00
Joost Lekkerkerker
f479ed4ff0 Fix Sonos importing deprecating constant (#136926) 2025-01-30 19:21:51 +01:00
Joost Lekkerkerker
b70598673b Show name of the backup agents in issue (#136925)
* Show name of the backup agents in issue

* Show name of the backup agents in issue

* Update homeassistant/components/backup/manager.py

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>

---------

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2025-01-30 19:21:50 +01:00
tronikos
08bb027eac Don't log errors when raising a backup exception in Google Drive (#136916) 2025-01-30 19:21:49 +01:00
Maciej Bieniek
613f0add76 Convert valve position to int for Shelly BLU TRV (#136912) 2025-01-30 19:21:48 +01:00
Josef Zweck
9e23ff9a4d Fix onedrive does not fail on delete not found (#136910)
* Fix onedrive does not fail on delete not found

* Fix onedrive does not fail on delete not found
2025-01-30 19:21:47 +01:00
Erik Montnemery
fad3d5d293 Don't blow up when a backup doesn't exist on supervisor (#136907) 2025-01-30 19:21:46 +01:00
Erik Montnemery
b300fb1fab Fix handling of renamed backup files in the core writer (#136898)
* Fix handling of renamed backup files in the core writer

* Adjust mocking

* Raise BackupAgentError instead of KeyError in get_backup_path

* Add specific error indicating backup not found

* Fix tests

* Ensure backups are loaded

* Fix tests
2025-01-30 19:21:46 +01:00
Erik Montnemery
aed779172d Ignore dangling symlinks when restoring backup (#136893) 2025-01-30 19:21:45 +01:00
epenet
5e646a3cb6 Add missing discovery string from onewire (#136892) 2025-01-30 19:21:44 +01:00
Erik Montnemery
0764aca2f1 Poll supervisor job state when creating or restoring a backup (#136891)
* Poll supervisor job state when creating or restoring a backup

* Update tests

* Add tests for create and restore jobs finishing early
2025-01-30 19:21:43 +01:00
Allen Porter
8babdc0b71 Bump nest to 7.1.1 (#136888) 2025-01-30 19:21:43 +01:00
TheJulianJES
ff64e5a312 Bump ZHA to 0.0.47 (#136883) 2025-01-30 19:21:42 +01:00
TimL
55ac0b0f37 Fix loading of SMLIGHT integration when no internet is available (#136497)
* Don't fail to load integration if internet unavailable

* Add test case for no internet

* Also test we recover after internet returns
2025-01-30 19:21:41 +01:00
Paulus Schoutsen
f391438d0a Add start_conversation service to Assist Satellite (#134921)
* Add start_conversation service to Assist Satellite

* Fix tests

* Implement start_conversation in voip

* Update homeassistant/components/assist_satellite/entity.py

---------

Co-authored-by: Michael Hansen <mike@rhasspy.org>
2025-01-30 19:21:39 +01:00
342 changed files with 8870 additions and 3619 deletions


@@ -146,6 +146,7 @@ def _extract_backup(
             config_dir,
             dirs_exist_ok=True,
             ignore=shutil.ignore_patterns(*(keep)),
+            ignore_dangling_symlinks=True,
         )
     elif restore_content.restore_database:
         for entry in KEEP_DATABASE:


@@ -161,6 +161,16 @@ FRONTEND_INTEGRATIONS = {
     # integrations can be removed and database migration status is
     # visible in frontend
     "frontend",
+    # Hassio is an after dependency of backup, after dependencies
+    # are not promoted from stage 2 to earlier stages, so we need to
+    # add it here. Hassio needs to be setup before backup, otherwise
+    # the backup integration will think we are a container/core install
+    # when using HAOS or Supervised install.
+    "hassio",
+    # Backup is an after dependency of frontend, after dependencies
+    # are not promoted from stage 2 to earlier stages, so we need to
+    # add it here.
+    "backup",
 }
 RECORDER_INTEGRATIONS = {
     # Setup after frontend


@@ -144,7 +144,7 @@ class AirthingsConfigFlow(ConfigFlow, domain=DOMAIN):
             return self.async_create_entry(title=discovery.name, data={})
-        current_addresses = self._async_current_ids()
+        current_addresses = self._async_current_ids(include_ignore=False)
         for discovery_info in async_discovered_service_info(self.hass):
             address = discovery_info.address
             if address in current_addresses or address in self._discovered_devices:


@@ -92,7 +92,7 @@ class AranetConfigFlow(ConfigFlow, domain=DOMAIN):
                 title=self._discovered_devices[address][0], data={}
             )
-        current_addresses = self._async_current_ids()
+        current_addresses = self._async_current_ids(include_ignore=False)
         for discovery_info in async_discovered_service_info(self.hass, False):
             address = discovery_info.address
             if address in current_addresses or address in self._discovered_devices:


@@ -19,5 +19,5 @@
   "documentation": "https://www.home-assistant.io/integrations/aranet",
   "integration_type": "device",
   "iot_class": "local_push",
-  "requirements": ["aranet4==2.5.0"]
+  "requirements": ["aranet4==2.5.1"]
 }


@@ -1122,6 +1122,7 @@ class PipelineRun:
                 context=user_input.context,
                 language=user_input.language,
                 agent_id=user_input.agent_id,
+                extra_system_prompt=user_input.extra_system_prompt,
             )
             speech = conversation_result.response.speech.get("plain", {}).get(
                 "speech", ""


@@ -63,6 +63,21 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
         "async_internal_announce",
         [AssistSatelliteEntityFeature.ANNOUNCE],
     )
+    component.async_register_entity_service(
+        "start_conversation",
+        vol.All(
+            cv.make_entity_service_schema(
+                {
+                    vol.Optional("start_message"): str,
+                    vol.Optional("start_media_id"): str,
+                    vol.Optional("extra_system_prompt"): str,
+                }
+            ),
+            cv.has_at_least_one_key("start_message", "start_media_id"),
+        ),
+        "async_internal_start_conversation",
+        [AssistSatelliteEntityFeature.START_CONVERSATION],
+    )
     hass.data[CONNECTION_TEST_DATA] = {}
     async_register_websocket_api(hass)
     hass.http.register_view(ConnectionTestView())


@@ -26,3 +26,6 @@ class AssistSatelliteEntityFeature(IntFlag):
     ANNOUNCE = 1
     """Device supports remotely triggered announcements."""
+
+    START_CONVERSATION = 2
+    """Device supports starting conversations."""


@@ -10,7 +10,7 @@ import logging
 import time
 from typing import Any, Final, Literal, final
-from homeassistant.components import media_source, stt, tts
+from homeassistant.components import conversation, media_source, stt, tts
 from homeassistant.components.assist_pipeline import (
     OPTION_PREFERRED,
     AudioSettings,
@@ -27,6 +27,7 @@ from homeassistant.components.tts import (
     generate_media_source_id as tts_generate_media_source_id,
 )
 from homeassistant.core import Context, callback
+from homeassistant.exceptions import HomeAssistantError
 from homeassistant.helpers import entity
 from homeassistant.helpers.entity import EntityDescription
@@ -117,6 +118,7 @@ class AssistSatelliteEntity(entity.Entity):
     _run_has_tts: bool = False
     _is_announcing = False
+    _extra_system_prompt: str | None = None
     _wake_word_intercept_future: asyncio.Future[str | None] | None = None
     _attr_tts_options: dict[str, Any] | None = None
     _pipeline_task: asyncio.Task | None = None
@@ -216,6 +218,59 @@
         """
         raise NotImplementedError
 
+    async def async_internal_start_conversation(
+        self,
+        start_message: str | None = None,
+        start_media_id: str | None = None,
+        extra_system_prompt: str | None = None,
+    ) -> None:
+        """Start a conversation from the satellite.
+
+        If start_media_id is not provided, message is synthesized to
+        audio with the selected pipeline.
+
+        If start_media_id is provided, it is played directly. It is possible
+        to omit the message and the satellite will not show any text.
+
+        Calls async_start_conversation.
+        """
+        await self._cancel_running_pipeline()
+
+        # The Home Assistant built-in agent doesn't support conversations.
+        pipeline = async_get_pipeline(self.hass, self._resolve_pipeline())
+        if pipeline.conversation_engine == conversation.HOME_ASSISTANT_AGENT:
+            raise HomeAssistantError(
+                "Built-in conversation agent does not support starting conversations"
+            )
+
+        if start_message is None:
+            start_message = ""
+
+        announcement = await self._resolve_announcement_media_id(
+            start_message, start_media_id
+        )
+
+        if self._is_announcing:
+            raise SatelliteBusyError
+
+        self._is_announcing = True
+
+        # Provide our start info to the LLM so it understands context of incoming message
+        if extra_system_prompt is not None:
+            self._extra_system_prompt = extra_system_prompt
+        else:
+            self._extra_system_prompt = start_message or None
+
+        try:
+            await self.async_start_conversation(announcement)
+        finally:
+            self._is_announcing = False
+
+    async def async_start_conversation(
+        self, start_announcement: AssistSatelliteAnnouncement
+    ) -> None:
+        """Start a conversation from the satellite."""
+        raise NotImplementedError
+
     async def async_accept_pipeline_from_satellite(
         self,
         audio_stream: AsyncIterable[bytes],
@@ -226,6 +281,10 @@
         """Triggers an Assist pipeline in Home Assistant from a satellite."""
         await self._cancel_running_pipeline()
 
+        # Consume system prompt in first pipeline
+        extra_system_prompt = self._extra_system_prompt
+        self._extra_system_prompt = None
+
         if self._wake_word_intercept_future and start_stage in (
             PipelineStage.WAKE_WORD,
             PipelineStage.STT,
@@ -302,6 +361,7 @@
                 ),
                 start_stage=start_stage,
                 end_stage=end_stage,
+                conversation_extra_system_prompt=extra_system_prompt,
             ),
             f"{self.entity_id}_pipeline",
         )


@@ -7,6 +7,9 @@
     "services": {
       "announce": {
         "service": "mdi:bullhorn"
+      },
+      "start_conversation": {
+        "service": "mdi:forum"
       }
     }
   }


@@ -1,5 +1,7 @@
 """Assist Satellite intents."""
+from typing import Final
+
 import voluptuous as vol
 from homeassistant.core import HomeAssistant
@@ -7,6 +9,8 @@ from homeassistant.helpers import entity_registry as er, intent
 from .const import DOMAIN, AssistSatelliteEntityFeature
+EXCLUDED_DOMAINS: Final[set[str]] = {"voip"}
+
 async def async_setup_intents(hass: HomeAssistant) -> None:
     """Set up the intents."""
@@ -30,19 +34,36 @@ class BroadcastIntentHandler(intent.IntentHandler):
         ent_reg = er.async_get(hass)
         # Find all assist satellite entities that are not the one invoking the intent
-        entities = {
-            entity: entry
-            for entity in hass.states.async_entity_ids(DOMAIN)
-            if (entry := ent_reg.async_get(entity))
-            and entry.supported_features & AssistSatelliteEntityFeature.ANNOUNCE
-        }
+        entities: dict[str, er.RegistryEntry] = {}
+        for entity in hass.states.async_entity_ids(DOMAIN):
+            entry = ent_reg.async_get(entity)
+            if (
+                (entry is None)
+                or (
+                    # Supports announce
+                    not (
+                        entry.supported_features & AssistSatelliteEntityFeature.ANNOUNCE
+                    )
+                )
+                # Not the invoking device
+                or (intent_obj.device_id and (entry.device_id == intent_obj.device_id))
+            ):
+                # Skip satellite
+                continue
 
-        if intent_obj.device_id:
-            entities = {
-                entity: entry
-                for entity, entry in entities.items()
-                if entry.device_id != intent_obj.device_id
-            }
+            # Check domain of config entry against excluded domains
+            if (
+                entry.config_entry_id
+                and (
+                    config_entry := hass.config_entries.async_get_entry(
+                        entry.config_entry_id
+                    )
+                )
+                and (config_entry.domain in EXCLUDED_DOMAINS)
+            ):
+                continue
+
+            entities[entity] = entry
 
         await hass.services.async_call(
             DOMAIN,
@@ -54,7 +75,6 @@ class BroadcastIntentHandler(intent.IntentHandler):
         )
         response = intent_obj.create_response()
-        response.async_set_speech("Done")
         response.response_type = intent.IntentResponseType.ACTION_DONE
         response.async_set_results(
             success_results=[


@@ -14,3 +14,23 @@ announce:
       required: false
       selector:
         text:
+start_conversation:
+  target:
+    entity:
+      domain: assist_satellite
+      supported_features:
+        - assist_satellite.AssistSatelliteEntityFeature.START_CONVERSATION
+  fields:
+    start_message:
+      required: false
+      example: "You left the lights on in the living room. Turn them off?"
+      selector:
+        text:
+    start_media_id:
+      required: false
+      selector:
+        text:
+    extra_system_prompt:
+      required: false
+      selector:
+        text:


@@ -25,6 +25,24 @@
           "description": "The media ID to announce instead of using text-to-speech."
         }
-      }
+      },
+      "start_conversation": {
+        "name": "Start Conversation",
+        "description": "Start a conversation from a satellite.",
+        "fields": {
+          "start_message": {
+            "name": "Message",
+            "description": "The message to start with."
+          },
+          "start_media_id": {
+            "name": "Media ID",
+            "description": "The media ID to start with instead of using text-to-speech."
+          },
+          "extra_system_prompt": {
+            "name": "Extra system prompt",
+            "description": "Provide background information to the AI about the request."
+          }
+        }
+      }
     }
   }
 }


@@ -26,15 +26,19 @@ from .manager import (
BackupReaderWriterError,
CoreBackupReaderWriter,
CreateBackupEvent,
CreateBackupStage,
CreateBackupState,
IdleEvent,
IncorrectPasswordError,
ManagerBackup,
NewBackup,
RestoreBackupEvent,
RestoreBackupStage,
RestoreBackupState,
WrittenBackup,
)
from .models import AddonInfo, AgentBackup, Folder
from .models import AddonInfo, AgentBackup, BackupNotFound, Folder
from .util import suggested_filename, suggested_filename_from_name_date
from .websocket import async_register_websocket_handlers
__all__ = [
@@ -44,10 +48,13 @@ __all__ = [
"BackupAgentError",
"BackupAgentPlatformProtocol",
"BackupManagerError",
"BackupNotFound",
"BackupPlatformProtocol",
"BackupReaderWriter",
"BackupReaderWriterError",
"CreateBackupEvent",
"CreateBackupStage",
"CreateBackupState",
"Folder",
"IdleEvent",
"IncorrectPasswordError",
@@ -55,9 +62,12 @@ __all__ = [
"ManagerBackup",
"NewBackup",
"RestoreBackupEvent",
"RestoreBackupStage",
"RestoreBackupState",
"WrittenBackup",
"async_get_manager",
"suggested_filename",
"suggested_filename_from_name_date",
]
CONFIG_SCHEMA = cv.empty_config_schema(DOMAIN)


@@ -11,13 +11,7 @@ from propcache.api import cached_property
 from homeassistant.core import HomeAssistant, callback
-from .models import AgentBackup, BackupError
-
-
-class BackupAgentError(BackupError):
-    """Base class for backup agent errors."""
-
-    error_code = "backup_agent_error"
+from .models import AgentBackup, BackupAgentError
 
 
 class BackupAgentUnreachableError(BackupAgentError):
@@ -94,11 +88,16 @@ class LocalBackupAgent(BackupAgent):
     @abc.abstractmethod
     def get_backup_path(self, backup_id: str) -> Path:
-        """Return the local path to a backup.
+        """Return the local path to an existing backup.
 
         The method should return the path to the backup file with the specified id.
+        Raises BackupAgentError if the backup does not exist.
         """
 
+    @abc.abstractmethod
+    def get_new_backup_path(self, backup: AgentBackup) -> Path:
+        """Return the local path to a new backup."""
+
 
 class BackupAgentPlatformProtocol(Protocol):
     """Define the format of backup platforms which implement backup agents."""

@@ -13,8 +13,8 @@ from homeassistant.helpers.hassio import is_hassio
 from .agent import BackupAgent, LocalBackupAgent
 from .const import DOMAIN, LOGGER
-from .models import AgentBackup
-from .util import read_backup
+from .models import AgentBackup, BackupNotFound
+from .util import read_backup, suggested_filename
 
 
 async def async_get_backup_agents(
@@ -39,7 +39,7 @@ class CoreLocalBackupAgent(LocalBackupAgent):
         super().__init__()
         self._hass = hass
         self._backup_dir = Path(hass.config.path("backups"))
-        self._backups: dict[str, AgentBackup] = {}
+        self._backups: dict[str, tuple[AgentBackup, Path]] = {}
         self._loaded_backups = False
 
     async def _load_backups(self) -> None:
@@ -49,13 +49,13 @@ class CoreLocalBackupAgent(LocalBackupAgent):
         self._backups = backups
         self._loaded_backups = True
 
-    def _read_backups(self) -> dict[str, AgentBackup]:
+    def _read_backups(self) -> dict[str, tuple[AgentBackup, Path]]:
         """Read backups from disk."""
-        backups: dict[str, AgentBackup] = {}
+        backups: dict[str, tuple[AgentBackup, Path]] = {}
         for backup_path in self._backup_dir.glob("*.tar"):
             try:
                 backup = read_backup(backup_path)
-                backups[backup.backup_id] = backup
+                backups[backup.backup_id] = (backup, backup_path)
             except (OSError, TarError, json.JSONDecodeError, KeyError) as err:
                 LOGGER.warning("Unable to read backup %s: %s", backup_path, err)
         return backups
@@ -76,13 +76,13 @@ class CoreLocalBackupAgent(LocalBackupAgent):
         **kwargs: Any,
     ) -> None:
         """Upload a backup."""
-        self._backups[backup.backup_id] = backup
+        self._backups[backup.backup_id] = (backup, self.get_new_backup_path(backup))
 
     async def async_list_backups(self, **kwargs: Any) -> list[AgentBackup]:
         """List backups."""
         if not self._loaded_backups:
             await self._load_backups()
-        return list(self._backups.values())
+        return [backup for backup, _ in self._backups.values()]
 
     async def async_get_backup(
         self,
@@ -93,10 +93,10 @@ class CoreLocalBackupAgent(LocalBackupAgent):
         if not self._loaded_backups:
             await self._load_backups()
 
-        if not (backup := self._backups.get(backup_id)):
+        if backup_id not in self._backups:
             return None
 
-        backup_path = self.get_backup_path(backup_id)
+        backup, backup_path = self._backups[backup_id]
         if not await self._hass.async_add_executor_job(backup_path.exists):
             LOGGER.debug(
                 (
@@ -112,15 +112,28 @@ class CoreLocalBackupAgent(LocalBackupAgent):
         return backup
 
     def get_backup_path(self, backup_id: str) -> Path:
-        """Return the local path to a backup."""
-        return self._backup_dir / f"{backup_id}.tar"
+        """Return the local path to an existing backup.
+
+        Raises BackupAgentError if the backup does not exist.
+        """
+        try:
+            return self._backups[backup_id][1]
+        except KeyError as err:
+            raise BackupNotFound(f"Backup {backup_id} does not exist") from err
+
+    def get_new_backup_path(self, backup: AgentBackup) -> Path:
+        """Return the local path to a new backup."""
+        return self._backup_dir / suggested_filename(backup)
 
     async def async_delete_backup(self, backup_id: str, **kwargs: Any) -> None:
         """Delete a backup file."""
-        if await self.async_get_backup(backup_id) is None:
-            return
+        if not self._loaded_backups:
+            await self._load_backups()
 
-        backup_path = self.get_backup_path(backup_id)
+        try:
+            backup_path = self.get_backup_path(backup_id)
+        except BackupNotFound:
+            return
         await self._hass.async_add_executor_job(backup_path.unlink, True)
         LOGGER.debug("Deleted backup located at %s", backup_path)
         self._backups.pop(backup_id)
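The change above swaps an id-derived path (`{backup_id}.tar`) for an index mapping each backup id to its metadata and actual location, since filenames are now human-readable. A minimal standalone sketch of that index, with a plain dict standing in for `AgentBackup`:

```python
from pathlib import Path


class BackupNotFound(Exception):
    """Raised when a backup id is not in the index."""


class LocalAgent:
    """Sketch of the id -> (metadata, path) index (illustrative only)."""

    def __init__(self) -> None:
        # Paths are derived from readable filenames, so the agent must
        # remember where each backup actually lives on disk.
        self._backups: dict[str, tuple[dict, Path]] = {}

    def add(self, backup_id: str, meta: dict, path: Path) -> None:
        self._backups[backup_id] = (meta, path)

    def get_backup_path(self, backup_id: str) -> Path:
        # Looking up an unknown id is now an explicit, typed error
        # instead of a silently wrong guessed filename.
        try:
            return self._backups[backup_id][1]
        except KeyError as err:
            raise BackupNotFound(f"Backup {backup_id} does not exist") from err


agent = LocalAgent()
agent.add("abc123", {"name": "Core 2025.2"}, Path("backups/Core_2025.2_2025-01-31.tar"))
```

Callers such as a delete routine can then catch `BackupNotFound` and treat it as "nothing to do" rather than crashing.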


@@ -2,8 +2,6 @@
 from __future__ import annotations
-import asyncio
-from collections.abc import Callable
 from dataclasses import dataclass, field, replace
 import datetime as dt
 from datetime import datetime, timedelta
@@ -252,7 +250,7 @@ class RetentionConfig:
         """Delete backups older than days."""
         self._schedule_next(manager)
 
-        def _backups_filter(
+        def _delete_filter(
             backups: dict[str, ManagerBackup],
         ) -> dict[str, ManagerBackup]:
             """Return backups older than days to delete."""
@@ -269,7 +267,9 @@
                 < now
             }
 
-        await _delete_filtered_backups(manager, _backups_filter)
+        await manager.async_delete_filtered_backups(
+            include_filter=_automatic_backups_filter, delete_filter=_delete_filter
+        )
 
         manager.remove_next_delete_event = async_call_later(
             manager.hass, timedelta(days=1), _delete_backups
@@ -521,74 +521,21 @@ class CreateBackupParametersDict(TypedDict, total=False):
     password: str | None
 
 
-async def _delete_filtered_backups(
-    manager: BackupManager,
-    backup_filter: Callable[[dict[str, ManagerBackup]], dict[str, ManagerBackup]],
-) -> None:
-    """Delete backups parsed with a filter.
-
-    :param manager: The backup manager.
-    :param backup_filter: A filter that should return the backups to delete.
-    """
-    backups, get_agent_errors = await manager.async_get_backups()
-    if get_agent_errors:
-        LOGGER.debug(
-            "Error getting backups; continuing anyway: %s",
-            get_agent_errors,
-        )
-
-    # only delete backups that are created with the saved automatic settings
-    backups = {
+def _automatic_backups_filter(
+    backups: dict[str, ManagerBackup],
+) -> dict[str, ManagerBackup]:
+    """Return automatic backups."""
+    return {
         backup_id: backup
         for backup_id, backup in backups.items()
        if backup.with_automatic_settings
     }
-    LOGGER.debug("Total automatic backups: %s", backups)
-
-    filtered_backups = backup_filter(backups)
-    if not filtered_backups:
-        return
-
-    # always delete oldest backup first
-    filtered_backups = dict(
-        sorted(
-            filtered_backups.items(),
-            key=lambda backup_item: backup_item[1].date,
-        )
-    )
-
-    if len(filtered_backups) >= len(backups):
-        # Never delete the last backup.
-        last_backup = filtered_backups.popitem()
-        LOGGER.debug("Keeping the last backup: %s", last_backup)
-
-    LOGGER.debug("Backups to delete: %s", filtered_backups)
-
-    if not filtered_backups:
-        return
-
-    backup_ids = list(filtered_backups)
-    delete_results = await asyncio.gather(
-        *(manager.async_delete_backup(backup_id) for backup_id in filtered_backups)
-    )
-    agent_errors = {
-        backup_id: error
-        for backup_id, error in zip(backup_ids, delete_results, strict=True)
-        if error
-    }
-    if agent_errors:
-        LOGGER.error(
-            "Error deleting old copies: %s",
-            agent_errors,
-        )
-
-
 async def delete_backups_exceeding_configured_count(manager: BackupManager) -> None:
     """Delete backups exceeding the configured retention count."""
 
-    def _backups_filter(
+    def _delete_filter(
         backups: dict[str, ManagerBackup],
     ) -> dict[str, ManagerBackup]:
         """Return oldest backups more numerous than copies to delete."""
@@ -603,4 +550,6 @@ async def delete_backups_exceeding_configured_count(manager: BackupManager) -> N
         )[: max(len(backups) - manager.config.data.retention.copies, 0)]
     )
 
-    await _delete_filtered_backups(manager, _backups_filter)
+    await manager.async_delete_filtered_backups(
+        include_filter=_automatic_backups_filter, delete_filter=_delete_filter
+    )
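The retention refactor above splits the logic into two composable filters: an include filter that keeps only automatic backups, and a delete filter that picks the oldest ones beyond the configured count. A standalone sketch of that composition (plain dicts stand in for `ManagerBackup`):

```python
def automatic_filter(backups: dict[str, dict]) -> dict[str, dict]:
    # include_filter: only backups created with automatic settings.
    return {bid: b for bid, b in backups.items() if b["auto"]}


def retention_delete_filter(backups: dict[str, dict], copies: int) -> dict[str, dict]:
    # delete_filter: the oldest backups beyond the configured count,
    # oldest first so they are removed before newer ones.
    excess = max(len(backups) - copies, 0)
    oldest = sorted(backups.items(), key=lambda item: item[1]["date"])
    return dict(oldest[:excess])


backups = {
    "a": {"auto": True, "date": "2025-01-01"},
    "b": {"auto": True, "date": "2025-01-02"},
    "c": {"auto": False, "date": "2025-01-03"},  # manual: never auto-deleted
}

# Compose: restrict to automatic backups, then pick what exceeds retention.
to_delete = retention_delete_filter(automatic_filter(backups), copies=1)
```

With `copies=1`, only the oldest automatic backup (`"a"`) is selected; the manual backup `"c"` is never considered because the include filter removes it first.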


@@ -21,6 +21,7 @@ from . import util
 from .agent import BackupAgent
 from .const import DATA_MANAGER
 from .manager import BackupManager
+from .models import BackupNotFound
 
 
 @callback
@@ -69,13 +70,16 @@ class DownloadBackupView(HomeAssistantView):
             CONTENT_DISPOSITION: f"attachment; filename={slugify(backup.name)}.tar"
         }
-        if not password or not backup.protected:
-            return await self._send_backup_no_password(
-                request, headers, backup_id, agent_id, agent, manager
+        try:
+            if not password or not backup.protected:
+                return await self._send_backup_no_password(
+                    request, headers, backup_id, agent_id, agent, manager
+                )
+            return await self._send_backup_with_password(
+                hass, request, headers, backup_id, agent_id, password, agent, manager
             )
-        return await self._send_backup_with_password(
-            hass, request, headers, backup_id, agent_id, password, agent, manager
-        )
+        except BackupNotFound:
+            return Response(status=HTTPStatus.NOT_FOUND)
 
     async def _send_backup_no_password(
         self,

View File

@@ -4,11 +4,13 @@ from __future__ import annotations
import abc
import asyncio
from collections import defaultdict
from collections.abc import AsyncIterator, Callable, Coroutine
from dataclasses import dataclass, replace
from enum import StrEnum
import hashlib
import io
from itertools import chain
import json
from pathlib import Path, PurePath
import shutil
@@ -50,7 +52,14 @@ from .const import (
EXCLUDE_FROM_BACKUP,
LOGGER,
)
from .models import AgentBackup, BackupError, BackupManagerError, BaseBackup, Folder
from .models import (
AgentBackup,
BackupError,
BackupManagerError,
BackupReaderWriterError,
BaseBackup,
Folder,
)
from .store import BackupStore
from .util import (
AsyncIteratorReader,
@@ -274,12 +283,6 @@ class BackupReaderWriter(abc.ABC):
"""Get restore events after core restart."""
class BackupReaderWriterError(BackupError):
"""Backup reader/writer error."""
error_code = "backup_reader_writer_error"
class IncorrectPasswordError(BackupReaderWriterError):
"""Raised when the password is incorrect."""
@@ -558,8 +561,15 @@ class BackupManager:
return_exceptions=True,
)
for idx, result in enumerate(list_backups_results):
agent_id = agent_ids[idx]
if isinstance(result, BackupAgentError):
agent_errors[agent_ids[idx]] = result
agent_errors[agent_id] = result
continue
if isinstance(result, Exception):
agent_errors[agent_id] = result
LOGGER.error(
"Unexpected error for %s: %s", agent_id, result, exc_info=result
)
continue
if isinstance(result, BaseException):
raise result # unexpected error
@@ -586,7 +596,7 @@ class BackupManager:
name=agent_backup.name,
with_automatic_settings=with_automatic_settings,
)
backups[backup_id].agents[agent_ids[idx]] = AgentBackupStatus(
backups[backup_id].agents[agent_id] = AgentBackupStatus(
protected=agent_backup.protected,
size=agent_backup.size,
)
@@ -609,8 +619,15 @@ class BackupManager:
return_exceptions=True,
)
for idx, result in enumerate(get_backup_results):
agent_id = agent_ids[idx]
if isinstance(result, BackupAgentError):
agent_errors[agent_ids[idx]] = result
agent_errors[agent_id] = result
continue
if isinstance(result, Exception):
agent_errors[agent_id] = result
LOGGER.error(
"Unexpected error for %s: %s", agent_id, result, exc_info=result
)
continue
if isinstance(result, BaseException):
raise result # unexpected error
@@ -638,7 +655,7 @@ class BackupManager:
name=result.name,
with_automatic_settings=with_automatic_settings,
)
backup.agents[agent_ids[idx]] = AgentBackupStatus(
backup.agents[agent_id] = AgentBackupStatus(
protected=result.protected,
size=result.size,
)
@@ -661,10 +678,13 @@ class BackupManager:
return None
return with_automatic_settings
async def async_delete_backup(self, backup_id: str) -> dict[str, Exception]:
async def async_delete_backup(
self, backup_id: str, *, agent_ids: list[str] | None = None
) -> dict[str, Exception]:
"""Delete a backup."""
agent_errors: dict[str, Exception] = {}
agent_ids = list(self.backup_agents)
if agent_ids is None:
agent_ids = list(self.backup_agents)
delete_backup_results = await asyncio.gather(
*(
@@ -674,8 +694,15 @@ class BackupManager:
return_exceptions=True,
)
for idx, result in enumerate(delete_backup_results):
agent_id = agent_ids[idx]
if isinstance(result, BackupAgentError):
agent_errors[agent_ids[idx]] = result
agent_errors[agent_id] = result
continue
if isinstance(result, Exception):
agent_errors[agent_id] = result
LOGGER.error(
"Unexpected error for %s: %s", agent_id, result, exc_info=result
)
continue
if isinstance(result, BaseException):
raise result # unexpected error
@@ -685,6 +712,106 @@ class BackupManager:
return agent_errors
async def async_delete_filtered_backups(
self,
*,
include_filter: Callable[[dict[str, ManagerBackup]], dict[str, ManagerBackup]],
delete_filter: Callable[[dict[str, ManagerBackup]], dict[str, ManagerBackup]],
) -> None:
"""Delete backups selected by the given filters.
:param include_filter: A filter that should return the backups to consider for
deletion. Note: The newest of the backups returned by include_filter will
unconditionally be kept, even if delete_filter returns all backups.
:param delete_filter: A filter that should return the backups to delete.
"""
backups, get_agent_errors = await self.async_get_backups()
if get_agent_errors:
LOGGER.debug(
"Error getting backups; continuing anyway: %s",
get_agent_errors,
)
# Run the include filter first to ensure we only consider backups that
# should be included in the deletion process.
backups = include_filter(backups)
backups_by_agent: dict[str, dict[str, ManagerBackup]] = defaultdict(dict)
for backup_id, backup in backups.items():
for agent_id in backup.agents:
backups_by_agent[agent_id][backup_id] = backup
LOGGER.debug("Backups returned by include filter: %s", backups)
LOGGER.debug(
"Backups returned by include filter by agent: %s",
{agent_id: list(backups) for agent_id, backups in backups_by_agent.items()},
)
backups_to_delete = delete_filter(backups)
LOGGER.debug("Backups returned by delete filter: %s", backups_to_delete)
if not backups_to_delete:
return
# always delete oldest backup first
backups_to_delete_by_agent: dict[str, dict[str, ManagerBackup]] = defaultdict(
dict
)
for backup_id, backup in sorted(
backups_to_delete.items(),
key=lambda backup_item: backup_item[1].date,
):
for agent_id in backup.agents:
backups_to_delete_by_agent[agent_id][backup_id] = backup
LOGGER.debug(
"Backups returned by delete filter by agent: %s",
{
agent_id: list(backups)
for agent_id, backups in backups_to_delete_by_agent.items()
},
)
for agent_id, to_delete_from_agent in backups_to_delete_by_agent.items():
if len(to_delete_from_agent) >= len(backups_by_agent[agent_id]):
# Never delete the last backup.
last_backup = to_delete_from_agent.popitem()
LOGGER.debug(
"Keeping the last backup %s for agent %s", last_backup, agent_id
)
LOGGER.debug(
"Backups to delete by agent: %s",
{
agent_id: list(backups)
for agent_id, backups in backups_to_delete_by_agent.items()
},
)
backup_ids_to_delete: dict[str, set[str]] = defaultdict(set)
for agent_id, to_delete in backups_to_delete_by_agent.items():
for backup_id in to_delete:
backup_ids_to_delete[backup_id].add(agent_id)
if not backup_ids_to_delete:
return
backup_ids = list(backup_ids_to_delete)
delete_results = await asyncio.gather(
*(
self.async_delete_backup(backup_id, agent_ids=list(agent_ids))
for backup_id, agent_ids in backup_ids_to_delete.items()
)
)
agent_errors = {
backup_id: error
for backup_id, error in zip(backup_ids, delete_results, strict=True)
if error
}
if agent_errors:
LOGGER.error(
"Error deleting old copies: %s",
agent_errors,
)
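The "never delete the last backup" safeguard and the inversion from per-agent plans to per-backup agent sets can be sketched in isolation (simplified types, not the manager's real signatures):

```python
from collections import defaultdict


def plan_deletions(
    backups_by_agent: dict[str, dict[str, object]],
    to_delete_by_agent: dict[str, dict[str, object]],
) -> dict[str, set[str]]:
    """Drop the newest planned deletion for any agent that would be left
    empty, then invert the per-agent plan to backup_id -> agent_ids."""
    for agent_id, to_delete in to_delete_by_agent.items():
        if len(to_delete) >= len(backups_by_agent[agent_id]):
            # Entries were inserted oldest first, so popitem() (LIFO)
            # removes the newest backup from the deletion plan.
            to_delete.popitem()
    ids: dict[str, set[str]] = defaultdict(set)
    for agent_id, to_delete in to_delete_by_agent.items():
        for backup_id in to_delete:
            ids[backup_id].add(agent_id)
    return dict(ids)


# All three backups on one agent are slated; the newest (b3) survives.
plan = plan_deletions(
    {"agent.local": {"b1": None, "b2": None, "b3": None}},
    {"agent.local": {"b1": None, "b2": None, "b3": None}},
)
```

This depends on `dict` preserving insertion order, which is why the delete filter output is re-inserted sorted by backup date before the per-agent grouping.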
async def async_receive_backup(
self,
*,
@@ -762,7 +889,7 @@ class BackupManager:
password=None,
)
await written_backup.release_stream()
self.known_backups.add(written_backup.backup, agent_errors)
self.known_backups.add(written_backup.backup, agent_errors, [])
return written_backup.backup.backup_id
async def async_create_backup(
@@ -886,19 +1013,30 @@ class BackupManager:
with_automatic_settings: bool,
) -> NewBackup:
"""Initiate generating a backup."""
if not agent_ids:
raise BackupManagerError("At least one agent must be selected")
if invalid_agents := [
unavailable_agents = [
agent_id for agent_id in agent_ids if agent_id not in self.backup_agents
]:
raise BackupManagerError(f"Invalid agents selected: {invalid_agents}")
]
if not (
available_agents := [
agent_id for agent_id in agent_ids if agent_id in self.backup_agents
]
):
raise BackupManagerError(
f"At least one available backup agent must be selected, got {agent_ids}"
)
if unavailable_agents:
LOGGER.warning(
"Backup agents %s are not available, will back up to %s",
unavailable_agents,
available_agents,
)
if include_all_addons and include_addons:
raise BackupManagerError(
"Cannot include all addons and specify specific addons"
)
backup_name = (
name
(name if name is None else name.strip())
or f"{'Automatic' if with_automatic_settings else 'Custom'} backup {HAVERSION}"
)
extra_metadata = extra_metadata or {}
@@ -908,7 +1046,7 @@ class BackupManager:
new_backup,
self._backup_task,
) = await self._reader_writer.async_create_backup(
agent_ids=agent_ids,
agent_ids=available_agents,
backup_name=backup_name,
extra_metadata=extra_metadata
| {
@@ -927,7 +1065,9 @@ class BackupManager:
raise BackupManagerError(str(err)) from err
backup_finish_task = self._backup_finish_task = self.hass.async_create_task(
self._async_finish_backup(agent_ids, with_automatic_settings, password),
self._async_finish_backup(
available_agents, unavailable_agents, with_automatic_settings, password
),
name="backup_manager_finish_backup",
)
if not raise_task_error:
@@ -944,7 +1084,11 @@ class BackupManager:
return new_backup
async def _async_finish_backup(
self, agent_ids: list[str], with_automatic_settings: bool, password: str | None
self,
available_agents: list[str],
unavailable_agents: list[str],
with_automatic_settings: bool,
password: str | None,
) -> None:
"""Finish a backup."""
if TYPE_CHECKING:
@@ -963,7 +1107,7 @@ class BackupManager:
LOGGER.debug(
"Generated new backup with backup_id %s, uploading to agents %s",
written_backup.backup.backup_id,
agent_ids,
available_agents,
)
self.async_on_backup_event(
CreateBackupEvent(
@@ -976,13 +1120,15 @@ class BackupManager:
try:
agent_errors = await self._async_upload_backup(
backup=written_backup.backup,
agent_ids=agent_ids,
agent_ids=available_agents,
open_stream=written_backup.open_stream,
password=password,
)
finally:
await written_backup.release_stream()
self.known_backups.add(written_backup.backup, agent_errors)
self.known_backups.add(
written_backup.backup, agent_errors, unavailable_agents
)
if not agent_errors:
if with_automatic_settings:
# create backup was successful, update last_completed_automatic_backup
@@ -991,7 +1137,7 @@ class BackupManager:
backup_success = True
if with_automatic_settings:
self._update_issue_after_agent_upload(agent_errors)
self._update_issue_after_agent_upload(agent_errors, unavailable_agents)
# delete old backups more numerous than copies
# try this regardless of agent errors above
await delete_backups_exceeding_configured_count(self)
@@ -1151,10 +1297,10 @@ class BackupManager:
)
def _update_issue_after_agent_upload(
self, agent_errors: dict[str, Exception]
self, agent_errors: dict[str, Exception], unavailable_agents: list[str]
) -> None:
"""Update issue registry after a backup is uploaded to agents."""
if not agent_errors:
if not agent_errors and not unavailable_agents:
ir.async_delete_issue(self.hass, DOMAIN, "automatic_backup_failed")
return
ir.async_create_issue(
@@ -1166,7 +1312,17 @@ class BackupManager:
learn_more_url="homeassistant://config/backup",
severity=ir.IssueSeverity.WARNING,
translation_key="automatic_backup_failed_upload_agents",
translation_placeholders={"failed_agents": ", ".join(agent_errors)},
translation_placeholders={
"failed_agents": ", ".join(
chain(
(
self.backup_agents[agent_id].name
for agent_id in agent_errors
),
unavailable_agents,
)
)
},
)
async def async_can_decrypt_on_download(
@@ -1233,11 +1389,12 @@ class KnownBackups:
self,
backup: AgentBackup,
agent_errors: dict[str, Exception],
unavailable_agents: list[str],
) -> None:
"""Add a backup."""
self._backups[backup.backup_id] = KnownBackup(
backup_id=backup.backup_id,
failed_agent_ids=list(agent_errors),
failed_agent_ids=list(chain(agent_errors, unavailable_agents)),
)
self._manager.store.save()
@@ -1343,13 +1500,31 @@ class CoreBackupReaderWriter(BackupReaderWriter):
manager = self._hass.data[DATA_MANAGER]
agent_config = manager.config.data.agents.get(self._local_agent_id)
if agent_config and not agent_config.protected:
if (
self._local_agent_id in agent_ids
and agent_config
and not agent_config.protected
):
password = None
backup = AgentBackup(
addons=[],
backup_id=backup_id,
database_included=include_database,
date=date_str,
extra_metadata=extra_metadata,
folders=[],
homeassistant_included=True,
homeassistant_version=HAVERSION,
name=backup_name,
protected=password is not None,
size=0,
)
local_agent_tar_file_path = None
if self._local_agent_id in agent_ids:
local_agent = manager.local_backup_agents[self._local_agent_id]
local_agent_tar_file_path = local_agent.get_backup_path(backup_id)
local_agent_tar_file_path = local_agent.get_new_backup_path(backup)
on_progress(
CreateBackupEvent(
@@ -1391,19 +1566,7 @@ class CoreBackupReaderWriter(BackupReaderWriter):
# ValueError from json_bytes
raise BackupReaderWriterError(str(err)) from err
else:
backup = AgentBackup(
addons=[],
backup_id=backup_id,
database_included=include_database,
date=date_str,
extra_metadata=extra_metadata,
folders=[],
homeassistant_included=True,
homeassistant_version=HAVERSION,
name=backup_name,
protected=password is not None,
size=size_in_bytes,
)
backup = replace(backup, size=size_in_bytes)
async_add_executor_job = self._hass.async_add_executor_job
@@ -1517,7 +1680,7 @@ class CoreBackupReaderWriter(BackupReaderWriter):
manager = self._hass.data[DATA_MANAGER]
if self._local_agent_id in agent_ids:
local_agent = manager.local_backup_agents[self._local_agent_id]
tar_file_path = local_agent.get_backup_path(backup.backup_id)
tar_file_path = local_agent.get_new_backup_path(backup)
await async_add_executor_job(make_backup_dir, tar_file_path.parent)
await async_add_executor_job(shutil.move, temp_file, tar_file_path)
else:

View File

@@ -41,12 +41,6 @@ class BaseBackup:
homeassistant_version: str | None # None if homeassistant_included is False
name: str
def as_frontend_json(self) -> dict:
"""Return a dict representation of this backup for sending to frontend."""
return {
key: val for key, val in asdict(self).items() if key != "extra_metadata"
}
@dataclass(frozen=True, kw_only=True)
class AgentBackup(BaseBackup):
@@ -83,7 +77,25 @@ class BackupError(HomeAssistantError):
error_code = "unknown"
class BackupAgentError(BackupError):
"""Base class for backup agent errors."""
error_code = "backup_agent_error"
class BackupManagerError(BackupError):
"""Backup manager error."""
error_code = "backup_manager_error"
class BackupReaderWriterError(BackupError):
"""Backup reader/writer error."""
error_code = "backup_reader_writer_error"
class BackupNotFound(BackupAgentError, BackupManagerError):
"""Raised when a backup is not found."""
error_code = "backup_not_found"

View File

@@ -4,6 +4,7 @@ from __future__ import annotations
import asyncio
from collections.abc import AsyncIterator, Callable, Coroutine
from concurrent.futures import CancelledError, Future
import copy
from dataclasses import dataclass, replace
from io import BytesIO
@@ -12,6 +13,7 @@ import os
from pathlib import Path, PurePath
from queue import SimpleQueue
import tarfile
import threading
from typing import IO, Any, Self, cast
import aiohttp
@@ -20,8 +22,8 @@ from securetar import SecureTarError, SecureTarFile, SecureTarReadError
from homeassistant.backup_restore import password_to_key
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.util import dt as dt_util
from homeassistant.util.json import JsonObjectType, json_loads_object
from homeassistant.util.thread import ThreadWithException
from .const import BUF_SIZE, LOGGER
from .models import AddonInfo, AgentBackup, Folder
@@ -117,6 +119,17 @@ def read_backup(backup_path: Path) -> AgentBackup:
)
def suggested_filename_from_name_date(name: str, date_str: str) -> str:
"""Suggest a filename for the backup."""
date = dt_util.parse_datetime(date_str, raise_on_error=True)
return "_".join(f"{name} {date.strftime('%Y-%m-%d %H.%M %S%f')}.tar".split())
def suggested_filename(backup: AgentBackup) -> str:
"""Suggest a filename for the backup."""
return suggested_filename_from_name_date(backup.name, backup.date)
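The helper builds the filename by formatting the parsed date and then collapsing every whitespace run into a single underscore via `split()`/`join`. A standalone sketch, using `datetime.fromisoformat` in place of Home Assistant's `dt_util.parse_datetime`:

```python
from datetime import datetime


def suggested_filename_from_name_date(name: str, date_str: str) -> str:
    """Suggest a backup filename; whitespace collapses to underscores."""
    date = datetime.fromisoformat(date_str)
    return "_".join(f"{name} {date.strftime('%Y-%m-%d %H.%M %S%f')}.tar".split())


fn = suggested_filename_from_name_date(
    "My backup", "2025-01-02T03:04:05.678901+00:00"
)
```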
def validate_password(path: Path, password: str | None) -> bool:
"""Validate the password."""
with tarfile.open(path, "r:", bufsize=BUF_SIZE) as backup_file:
@@ -155,23 +168,38 @@ class AsyncIteratorReader:
def __init__(self, hass: HomeAssistant, stream: AsyncIterator[bytes]) -> None:
"""Initialize the wrapper."""
self._aborted = False
self._hass = hass
self._stream = stream
self._buffer: bytes | None = None
self._next_future: Future[bytes | None] | None = None
self._pos: int = 0
async def _next(self) -> bytes | None:
"""Get the next chunk from the iterator."""
return await anext(self._stream, None)
def abort(self) -> None:
"""Abort the reader."""
self._aborted = True
if self._next_future is not None:
self._next_future.cancel()
def read(self, n: int = -1, /) -> bytes:
"""Read data from the iterator."""
result = bytearray()
while n < 0 or len(result) < n:
if not self._buffer:
self._buffer = asyncio.run_coroutine_threadsafe(
self._next_future = asyncio.run_coroutine_threadsafe(
self._next(), self._hass.loop
).result()
)
if self._aborted:
self._next_future.cancel()
raise AbortCipher
try:
self._buffer = self._next_future.result()
except CancelledError as err:
raise AbortCipher from err
self._pos = 0
if not self._buffer:
# The stream is exhausted
@@ -193,9 +221,11 @@ class AsyncIteratorWriter:
def __init__(self, hass: HomeAssistant) -> None:
"""Initialize the wrapper."""
self._aborted = False
self._hass = hass
self._pos: int = 0
self._queue: asyncio.Queue[bytes | None] = asyncio.Queue(maxsize=1)
self._write_future: Future[bytes | None] | None = None
def __aiter__(self) -> Self:
"""Return the iterator."""
@@ -207,13 +237,28 @@ class AsyncIteratorWriter:
return data
raise StopAsyncIteration
def abort(self) -> None:
"""Abort the writer."""
self._aborted = True
if self._write_future is not None:
self._write_future.cancel()
def tell(self) -> int:
"""Return the current position in the iterator."""
return self._pos
def write(self, s: bytes, /) -> int:
"""Write data to the iterator."""
asyncio.run_coroutine_threadsafe(self._queue.put(s), self._hass.loop).result()
self._write_future = asyncio.run_coroutine_threadsafe(
self._queue.put(s), self._hass.loop
)
if self._aborted:
self._write_future.cancel()
raise AbortCipher
try:
self._write_future.result()
except CancelledError as err:
raise AbortCipher from err
self._pos += len(s)
return len(s)
@@ -403,7 +448,9 @@ def _encrypt_backup(
class _CipherWorkerStatus:
done: asyncio.Event
error: Exception | None = None
thread: ThreadWithException
reader: AsyncIteratorReader
thread: threading.Thread
writer: AsyncIteratorWriter
class _CipherBackupStreamer:
@@ -456,11 +503,13 @@ class _CipherBackupStreamer:
stream = await self._open_stream()
reader = AsyncIteratorReader(self._hass, stream)
writer = AsyncIteratorWriter(self._hass)
worker = ThreadWithException(
worker = threading.Thread(
target=self._cipher_func,
args=[reader, writer, self._password, on_done, self.size(), self._nonces],
)
worker_status = _CipherWorkerStatus(done=asyncio.Event(), thread=worker)
worker_status = _CipherWorkerStatus(
done=asyncio.Event(), reader=reader, thread=worker, writer=writer
)
self._workers.append(worker_status)
worker.start()
return writer
@@ -468,9 +517,8 @@ class _CipherBackupStreamer:
async def wait(self) -> None:
"""Wait for the worker threads to finish."""
for worker in self._workers:
if not worker.thread.is_alive():
continue
worker.thread.raise_exc(AbortCipher)
worker.reader.abort()
worker.writer.abort()
await asyncio.gather(*(worker.done.wait() for worker in self._workers))

View File

@@ -15,7 +15,7 @@ from .manager import (
IncorrectPasswordError,
ManagerStateEvent,
)
from .models import Folder
from .models import BackupNotFound, Folder
@callback
@@ -57,7 +57,7 @@ async def handle_info(
"agent_errors": {
agent_id: str(err) for agent_id, err in agent_errors.items()
},
"backups": [backup.as_frontend_json() for backup in backups.values()],
"backups": list(backups.values()),
"last_attempted_automatic_backup": manager.config.data.last_attempted_automatic_backup,
"last_completed_automatic_backup": manager.config.data.last_completed_automatic_backup,
"last_non_idle_event": manager.last_non_idle_event,
@@ -91,7 +91,7 @@ async def handle_details(
"agent_errors": {
agent_id: str(err) for agent_id, err in agent_errors.items()
},
"backup": backup.as_frontend_json() if backup else None,
"backup": backup,
},
)
@@ -151,6 +151,8 @@ async def handle_restore(
restore_folders=msg.get("restore_folders"),
restore_homeassistant=msg["restore_homeassistant"],
)
except BackupNotFound:
connection.send_error(msg["id"], "backup_not_found", "Backup not found")
except IncorrectPasswordError:
connection.send_error(msg["id"], "password_incorrect", "Incorrect password")
else:
@@ -179,6 +181,8 @@ async def handle_can_decrypt_on_download(
agent_id=msg["agent_id"],
password=msg.get("password"),
)
except BackupNotFound:
connection.send_error(msg["id"], "backup_not_found", "Backup not found")
except IncorrectPasswordError:
connection.send_error(msg["id"], "password_incorrect", "Incorrect password")
except DecryptOnDowloadNotSupported:
@@ -199,7 +203,7 @@ async def handle_can_decrypt_on_download(
vol.Optional("include_database", default=True): bool,
vol.Optional("include_folders"): [vol.Coerce(Folder)],
vol.Optional("include_homeassistant", default=True): bool,
vol.Optional("name"): str,
vol.Optional("name"): vol.Any(str, None),
vol.Optional("password"): vol.Any(str, None),
}
)

View File

@@ -5,7 +5,7 @@ from __future__ import annotations
import datetime
import logging
import platform
from typing import TYPE_CHECKING
from typing import TYPE_CHECKING, Any
from bleak_retry_connector import BleakSlotManager
from bluetooth_adapters import (
@@ -80,6 +80,7 @@ from .const import (
CONF_DETAILS,
CONF_PASSIVE,
CONF_SOURCE_CONFIG_ENTRY_ID,
CONF_SOURCE_DEVICE_ID,
CONF_SOURCE_DOMAIN,
CONF_SOURCE_MODEL,
DOMAIN,
@@ -297,7 +298,11 @@ async def async_discover_adapters(
async def async_update_device(
hass: HomeAssistant, entry: ConfigEntry, adapter: str, details: AdapterDetails
hass: HomeAssistant,
entry: ConfigEntry,
adapter: str,
details: AdapterDetails,
via_device_id: str | None = None,
) -> None:
"""Update device registry entry.
@@ -306,7 +311,8 @@ async def async_update_device(
update the device with the new location so they can
figure out where the adapter is.
"""
dr.async_get(hass).async_get_or_create(
device_registry = dr.async_get(hass)
device_entry = device_registry.async_get_or_create(
config_entry_id=entry.entry_id,
name=adapter_human_name(adapter, details[ADAPTER_ADDRESS]),
connections={(dr.CONNECTION_BLUETOOTH, details[ADAPTER_ADDRESS])},
@@ -315,6 +321,11 @@ async def async_update_device(
sw_version=details.get(ADAPTER_SW_VERSION),
hw_version=details.get(ADAPTER_HW_VERSION),
)
if via_device_id and (via_device_entry := device_registry.async_get(via_device_id)):
kwargs: dict[str, Any] = {"via_device_id": via_device_id}
if not device_entry.area_id and via_device_entry.area_id:
kwargs["area_id"] = via_device_entry.area_id
device_registry.async_update_device(device_entry.id, **kwargs)
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
@@ -349,6 +360,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
entry,
source_entry.title,
details,
entry.data.get(CONF_SOURCE_DEVICE_ID),
)
return True
manager = _get_manager(hass)

View File

@@ -181,10 +181,16 @@ def async_register_scanner(
source_domain: str | None = None,
source_model: str | None = None,
source_config_entry_id: str | None = None,
source_device_id: str | None = None,
) -> CALLBACK_TYPE:
"""Register a BleakScanner."""
return _get_manager(hass).async_register_hass_scanner(
scanner, connection_slots, source_domain, source_model, source_config_entry_id
scanner,
connection_slots,
source_domain,
source_model,
source_config_entry_id,
source_device_id,
)

View File

@@ -37,6 +37,7 @@ from .const import (
CONF_PASSIVE,
CONF_SOURCE,
CONF_SOURCE_CONFIG_ENTRY_ID,
CONF_SOURCE_DEVICE_ID,
CONF_SOURCE_DOMAIN,
CONF_SOURCE_MODEL,
DOMAIN,
@@ -139,7 +140,7 @@ class BluetoothConfigFlow(ConfigFlow, domain=DOMAIN):
title=adapter_title(adapter, details), data={}
)
configured_addresses = self._async_current_ids()
configured_addresses = self._async_current_ids(include_ignore=False)
bluetooth_adapters = get_adapters()
await bluetooth_adapters.refresh()
self._adapters = bluetooth_adapters.adapters
@@ -154,12 +155,8 @@ class BluetoothConfigFlow(ConfigFlow, domain=DOMAIN):
and not (system == "Linux" and details[ADAPTER_ADDRESS] == DEFAULT_ADDRESS)
]
if not unconfigured_adapters:
ignored_adapters = len(
self._async_current_entries(include_ignore=True)
) - len(self._async_current_entries(include_ignore=False))
return self.async_abort(
reason="no_adapters",
description_placeholders={"ignored_adapters": str(ignored_adapters)},
)
if len(unconfigured_adapters) == 1:
self._adapter = list(self._adapters)[0]
@@ -194,6 +191,7 @@ class BluetoothConfigFlow(ConfigFlow, domain=DOMAIN):
CONF_SOURCE_MODEL: user_input[CONF_SOURCE_MODEL],
CONF_SOURCE_DOMAIN: user_input[CONF_SOURCE_DOMAIN],
CONF_SOURCE_CONFIG_ENTRY_ID: user_input[CONF_SOURCE_CONFIG_ENTRY_ID],
CONF_SOURCE_DEVICE_ID: user_input[CONF_SOURCE_DEVICE_ID],
}
self._abort_if_unique_id_configured(updates=data)
manager = get_manager()

View File

@@ -22,7 +22,7 @@ CONF_SOURCE: Final = "source"
CONF_SOURCE_DOMAIN: Final = "source_domain"
CONF_SOURCE_MODEL: Final = "source_model"
CONF_SOURCE_CONFIG_ENTRY_ID: Final = "source_config_entry_id"
CONF_SOURCE_DEVICE_ID: Final = "source_device_id"
SOURCE_LOCAL: Final = "local"

View File

@@ -25,6 +25,7 @@ from homeassistant.helpers.dispatcher import async_dispatcher_connect
from .const import (
CONF_SOURCE,
CONF_SOURCE_CONFIG_ENTRY_ID,
CONF_SOURCE_DEVICE_ID,
CONF_SOURCE_DOMAIN,
CONF_SOURCE_MODEL,
DOMAIN,
@@ -254,6 +255,7 @@ class HomeAssistantBluetoothManager(BluetoothManager):
source_domain: str | None = None,
source_model: str | None = None,
source_config_entry_id: str | None = None,
source_device_id: str | None = None,
) -> CALLBACK_TYPE:
"""Register a scanner."""
cancel = self.async_register_scanner(scanner, connection_slots)
@@ -261,9 +263,6 @@ class HomeAssistantBluetoothManager(BluetoothManager):
isinstance(scanner, BaseHaRemoteScanner)
and source_domain
and source_config_entry_id
and not self.hass.config_entries.async_entry_for_domain_unique_id(
DOMAIN, scanner.source
)
):
self.hass.async_create_task(
self.hass.config_entries.flow.async_init(
@@ -274,6 +273,7 @@ class HomeAssistantBluetoothManager(BluetoothManager):
CONF_SOURCE_DOMAIN: source_domain,
CONF_SOURCE_MODEL: source_model,
CONF_SOURCE_CONFIG_ENTRY_ID: source_config_entry_id,
CONF_SOURCE_DEVICE_ID: source_device_id,
},
)
)

View File

@@ -16,11 +16,11 @@
"quality_scale": "internal",
"requirements": [
"bleak==0.22.3",
"bleak-retry-connector==3.8.0",
"bluetooth-adapters==0.21.1",
"bleak-retry-connector==3.8.1",
"bluetooth-adapters==0.21.4",
"bluetooth-auto-recovery==1.4.2",
"bluetooth-data-tools==1.22.0",
"dbus-fast==2.30.2",
"habluetooth==3.14.0"
"bluetooth-data-tools==1.23.4",
"dbus-fast==2.33.0",
"habluetooth==3.21.1"
]
}

View File

@@ -411,7 +411,7 @@ def ble_device_matches(
) and service_data_uuid not in service_info.service_data:
return False
if manufacturer_id := matcher.get(MANUFACTURER_ID):
if (manufacturer_id := matcher.get(MANUFACTURER_ID)) is not None:
if manufacturer_id not in service_info.manufacturer_data:
return False
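The one-character fix above matters because `:=` combined with a truthiness check silently skips a legitimate manufacturer ID of `0`. A minimal sketch contrasting the two forms:

```python
def matches_truthy(matcher: dict, manufacturer_data: dict) -> bool:
    # Buggy: manufacturer_id 0 is falsy, so the check is skipped entirely
    # and the matcher wrongly accepts the advertisement.
    if manufacturer_id := matcher.get("manufacturer_id"):
        if manufacturer_id not in manufacturer_data:
            return False
    return True


def matches_none_check(matcher: dict, manufacturer_data: dict) -> bool:
    # Fixed: only skip the check when the key is genuinely absent.
    if (manufacturer_id := matcher.get("manufacturer_id")) is not None:
        if manufacturer_id not in manufacturer_data:
            return False
    return True


matcher = {"manufacturer_id": 0}
buggy = matches_truthy(matcher, {76: b"\x00"})
fixed = matches_none_check(matcher, {76: b"\x00"})
```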

View File

@@ -23,7 +23,7 @@
},
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_service%]",
"no_adapters": "No unconfigured Bluetooth adapters found. There are {ignored_adapters} ignored adapters."
"no_adapters": "No unconfigured Bluetooth adapters found."
}
},
"options": {

View File

@@ -132,7 +132,7 @@ class BTHomeConfigFlow(ConfigFlow, domain=DOMAIN):
return self._async_get_or_create_entry()
current_addresses = self._async_current_ids()
current_addresses = self._async_current_ids(include_ignore=False)
for discovery_info in async_discovered_service_info(self.hass, False):
address = discovery_info.address
if address in current_addresses or address in self._discovered_devices:

View File

@@ -20,5 +20,5 @@
"dependencies": ["bluetooth_adapters"],
"documentation": "https://www.home-assistant.io/integrations/bthome",
"iot_class": "local_push",
"requirements": ["bthome-ble==3.9.1"]
"requirements": ["bthome-ble==3.12.3"]
}

View File

@@ -67,6 +67,16 @@ SENSOR_DESCRIPTIONS = {
state_class=SensorStateClass.MEASUREMENT,
entity_category=EntityCategory.DIAGNOSTIC,
),
# Conductivity (µS/cm)
(
BTHomeSensorDeviceClass.CONDUCTIVITY,
Units.CONDUCTIVITY,
): SensorEntityDescription(
key=f"{BTHomeSensorDeviceClass.CONDUCTIVITY}_{Units.CONDUCTIVITY}",
device_class=SensorDeviceClass.CONDUCTIVITY,
native_unit_of_measurement=UnitOfConductivity.MICROSIEMENS_PER_CM,
state_class=SensorStateClass.MEASUREMENT,
),
# Count (-)
(BTHomeSensorDeviceClass.COUNT, None): SensorEntityDescription(
key=str(BTHomeSensorDeviceClass.COUNT),
@@ -99,6 +109,12 @@ SENSOR_DESCRIPTIONS = {
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
state_class=SensorStateClass.MEASUREMENT,
),
# Directions (°)
(BTHomeExtendedSensorDeviceClass.DIRECTION, Units.DEGREE): SensorEntityDescription(
key=f"{BTHomeExtendedSensorDeviceClass.DIRECTION}_{Units.DEGREE}",
native_unit_of_measurement=DEGREE,
state_class=SensorStateClass.MEASUREMENT,
),
# Distance (mm)
(
BTHomeSensorDeviceClass.DISTANCE,
@@ -221,6 +237,16 @@ SENSOR_DESCRIPTIONS = {
native_unit_of_measurement=UnitOfPower.WATT,
state_class=SensorStateClass.MEASUREMENT,
),
# Precipitation (mm)
(
BTHomeExtendedSensorDeviceClass.PRECIPITATION,
Units.LENGTH_MILLIMETERS,
): SensorEntityDescription(
key=f"{BTHomeExtendedSensorDeviceClass.PRECIPITATION}_{Units.LENGTH_MILLIMETERS}",
device_class=SensorDeviceClass.PRECIPITATION,
native_unit_of_measurement=UnitOfLength.MILLIMETERS,
state_class=SensorStateClass.MEASUREMENT,
),
# Pressure (mbar)
(BTHomeSensorDeviceClass.PRESSURE, Units.PRESSURE_MBAR): SensorEntityDescription(
key=f"{BTHomeSensorDeviceClass.PRESSURE}_{Units.PRESSURE_MBAR}",
@@ -357,16 +383,6 @@ SENSOR_DESCRIPTIONS = {
native_unit_of_measurement=UnitOfVolume.LITERS,
state_class=SensorStateClass.TOTAL,
),
# Conductivity (µS/cm)
(
BTHomeSensorDeviceClass.CONDUCTIVITY,
Units.CONDUCTIVITY,
): SensorEntityDescription(
key=f"{BTHomeSensorDeviceClass.CONDUCTIVITY}_{Units.CONDUCTIVITY}",
device_class=SensorDeviceClass.CONDUCTIVITY,
native_unit_of_measurement=UnitOfConductivity.MICROSIEMENS_PER_CM,
state_class=SensorStateClass.MEASUREMENT,
),
}

View File

@@ -1175,12 +1175,17 @@ async def async_handle_snapshot_service(
f"Cannot write `{snapshot_file}`, no access to path; `allowlist_external_dirs` may need to be adjusted in `configuration.yaml`"
)
async with asyncio.timeout(CAMERA_IMAGE_TIMEOUT):
image = (
await _async_get_stream_image(camera, wait_for_next_keyframe=True)
if camera.use_stream_for_stills
else await camera.async_camera_image()
)
try:
async with asyncio.timeout(CAMERA_IMAGE_TIMEOUT):
image = (
await _async_get_stream_image(camera, wait_for_next_keyframe=True)
if camera.use_stream_for_stills
else await camera.async_camera_image()
)
except TimeoutError as err:
raise HomeAssistantError(
f"Unable to get snapshot: Timed out after {CAMERA_IMAGE_TIMEOUT} seconds"
) from err
if image is None:
return
@@ -1194,7 +1199,7 @@ async def async_handle_snapshot_service(
try:
await hass.async_add_executor_job(_write_image, snapshot_file, image)
except OSError as err:
_LOGGER.error("Can't write image to file: %s", err)
raise HomeAssistantError(f"Can't write image to file: {err}") from err
async def async_handle_play_stream_service(

View File

@@ -29,6 +29,7 @@ from homeassistant.components.google_assistant import helpers as google_helpers
from homeassistant.components.homeassistant import exposed_entities
from homeassistant.components.http import KEY_HASS, HomeAssistantView, require_admin
from homeassistant.components.http.data_validator import RequestDataValidator
from homeassistant.components.system_health import get_info as get_system_health_info
from homeassistant.const import CLOUD_NEVER_EXPOSED_ENTITIES
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
@@ -107,6 +108,7 @@ def async_setup(hass: HomeAssistant) -> None:
hass.http.register_view(CloudRegisterView)
hass.http.register_view(CloudResendConfirmView)
hass.http.register_view(CloudForgotPasswordView)
hass.http.register_view(DownloadSupportPackageView)
_CLOUD_ERRORS.update(
{
@@ -389,6 +391,59 @@ class CloudForgotPasswordView(HomeAssistantView):
return self.json_message("ok")
class DownloadSupportPackageView(HomeAssistantView):
"""Download support package view."""
url = "/api/cloud/support_package"
name = "api:cloud:support_package"
def _generate_markdown(
self, hass_info: dict[str, Any], domains_info: dict[str, dict[str, str]]
) -> str:
def get_domain_table_markdown(domain_info: dict[str, Any]) -> str:
if len(domain_info) == 0:
return "No information available\n"
markdown = ""
first = True
for key, value in domain_info.items():
markdown += f"{key} | {value}\n"
if first:
markdown += "--- | ---\n"
first = False
return markdown + "\n"
markdown = "## System Information\n\n"
markdown += get_domain_table_markdown(hass_info)
for domain, domain_info in domains_info.items():
domain_info_md = get_domain_table_markdown(domain_info)
markdown += (
f"<details><summary>{domain}</summary>\n\n"
f"{domain_info_md}"
"</details>\n\n"
)
return markdown
async def get(self, request: web.Request) -> web.Response:
"""Download support package file."""
hass = request.app[KEY_HASS]
domain_health = await get_system_health_info(hass)
hass_info = domain_health.pop("homeassistant", {})
markdown = self._generate_markdown(hass_info, domain_health)
return web.Response(
body=markdown,
content_type="text/markdown",
headers={
"Content-Disposition": 'attachment; filename="support_package.md"'
},
)
@websocket_api.require_admin
@websocket_api.websocket_command({vol.Required("type"): "cloud/remove_data"})
@websocket_api.async_response

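The `DownloadSupportPackageView` above builds a GitHub-style markdown table by emitting the `--- | ---` separator immediately after the first key/value row, so the first pair doubles as the table header. The same logic as a standalone sketch (function name is illustrative, not the integration's):

```python
def domain_table_markdown(domain_info: dict[str, str]) -> str:
    # The first key/value pair becomes the table header; the markdown
    # separator row is inserted right after it, as in the diff above.
    if not domain_info:
        return "No information available\n"
    lines: list[str] = []
    for index, (key, value) in enumerate(domain_info.items()):
        lines.append(f"{key} | {value}")
        if index == 0:
            lines.append("--- | ---")
    return "\n".join(lines) + "\n\n"
```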

@@ -140,8 +140,10 @@ def get_accounts(client, version):
API_ACCOUNT_ID: account[API_V3_ACCOUNT_ID],
API_ACCOUNT_NAME: account[API_ACCOUNT_NAME],
API_ACCOUNT_CURRENCY: account[API_ACCOUNT_CURRENCY],
API_ACCOUNT_AMOUNT: account[API_ACCOUNT_AVALIABLE][API_ACCOUNT_VALUE]
+ account[API_ACCOUNT_HOLD][API_ACCOUNT_VALUE],
API_ACCOUNT_AMOUNT: (
float(account[API_ACCOUNT_AVALIABLE][API_ACCOUNT_VALUE])
+ float(account[API_ACCOUNT_HOLD][API_ACCOUNT_VALUE])
),
ACCOUNT_IS_VAULT: account[API_RESOURCE_TYPE] == API_V3_TYPE_VAULT,
}
for account in accounts

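The Coinbase change above exists because the API returns amounts as strings, and `+` on two strings concatenates instead of adding. A minimal illustration of the bug and the fix:

```python
def total_amount(available: str, hold: str) -> float:
    # API amounts arrive as strings; convert before summing so the
    # values are added numerically rather than concatenated.
    return float(available) + float(hold)
```

Without the conversion, `"1.5" + "2.25"` yields the string `"1.52.25"`.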

@@ -302,7 +302,8 @@ def config_entries_progress(
[
flw
for flw in hass.config_entries.flow.async_progress()
if flw["context"]["source"] != config_entries.SOURCE_USER
if flw["context"]["source"]
not in (config_entries.SOURCE_RECONFIGURE, config_entries.SOURCE_USER)
],
)


@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/conversation",
"integration_type": "system",
"quality_scale": "internal",
"requirements": ["hassil==2.2.0", "home-assistant-intents==2025.1.28"]
"requirements": ["hassil==2.2.3", "home-assistant-intents==2025.2.5"]
}


@@ -14,7 +14,7 @@
],
"quality_scale": "internal",
"requirements": [
"aiodhcpwatcher==1.0.2",
"aiodhcpwatcher==1.0.3",
"aiodiscover==2.1.0",
"cached-ipaddress==0.8.0"
]


@@ -44,9 +44,7 @@ class DiscovergyUpdateCoordinator(DataUpdateCoordinator[Reading]):
)
except InvalidLogin as err:
raise ConfigEntryAuthFailed(
f"Auth expired while fetching last reading for meter {self.meter.meter_id}"
"Auth expired while fetching last reading"
) from err
except (HTTPError, DiscovergyClientError) as err:
raise UpdateFailed(
f"Error while fetching last reading for meter {self.meter.meter_id}"
) from err
raise UpdateFailed(f"Error while fetching last reading: {err}") from err


@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/ecovacs",
"iot_class": "cloud_push",
"loggers": ["sleekxmppfs", "sucks", "deebot_client"],
"requirements": ["py-sucks==0.9.10", "deebot-client==11.1.0b1"]
"requirements": ["py-sucks==0.9.10", "deebot-client==12.0.0"]
}


@@ -2,6 +2,7 @@
from typing import Any
from eheimdigital.device import EheimDigitalDevice
from eheimdigital.heater import EheimDigitalHeater
from eheimdigital.types import EheimDigitalClientError, HeaterMode, HeaterUnit
@@ -39,17 +40,23 @@ async def async_setup_entry(
"""Set up the callbacks for the coordinator so climate entities can be added as devices are found."""
coordinator = entry.runtime_data
async def async_setup_device_entities(device_address: str) -> None:
"""Set up the light entities for a device."""
device = coordinator.hub.devices[device_address]
def async_setup_device_entities(
device_address: str | dict[str, EheimDigitalDevice],
) -> None:
"""Set up the climate entities for one or multiple devices."""
entities: list[EheimDigitalHeaterClimate] = []
if isinstance(device_address, str):
device_address = {device_address: coordinator.hub.devices[device_address]}
for device in device_address.values():
if isinstance(device, EheimDigitalHeater):
entities.append(EheimDigitalHeaterClimate(coordinator, device))
coordinator.known_devices.add(device.mac_address)
if isinstance(device, EheimDigitalHeater):
async_add_entities([EheimDigitalHeaterClimate(coordinator, device)])
async_add_entities(entities)
coordinator.add_platform_callback(async_setup_device_entities)
for device_address in entry.runtime_data.hub.devices:
await async_setup_device_entities(device_address)
async_setup_device_entities(coordinator.hub.devices)
class EheimDigitalHeaterClimate(EheimDigitalEntity[EheimDigitalHeater], ClimateEntity):
@@ -69,6 +76,7 @@ class EheimDigitalHeaterClimate(EheimDigitalEntity[EheimDigitalHeater], ClimateE
_attr_temperature_unit = UnitOfTemperature.CELSIUS
_attr_preset_mode = PRESET_NONE
_attr_translation_key = "heater"
_attr_name = None
def __init__(
self, coordinator: EheimDigitalUpdateCoordinator, device: EheimDigitalHeater


@@ -2,8 +2,7 @@
from __future__ import annotations
from collections.abc import Callable, Coroutine
from typing import Any
from collections.abc import Callable
from aiohttp import ClientError
from eheimdigital.device import EheimDigitalDevice
@@ -19,7 +18,9 @@ from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, Upda
from .const import DOMAIN, LOGGER
type AsyncSetupDeviceEntitiesCallback = Callable[[str], Coroutine[Any, Any, None]]
type AsyncSetupDeviceEntitiesCallback = Callable[
[str | dict[str, EheimDigitalDevice]], None
]
class EheimDigitalUpdateCoordinator(
@@ -61,7 +62,7 @@ class EheimDigitalUpdateCoordinator(
if device_address not in self.known_devices:
for platform_callback in self.platform_callbacks:
await platform_callback(device_address)
platform_callback(device_address)
async def _async_receive_callback(self) -> None:
self.async_set_updated_data(self.hub.devices)


@@ -3,6 +3,7 @@
from typing import Any
from eheimdigital.classic_led_ctrl import EheimDigitalClassicLEDControl
from eheimdigital.device import EheimDigitalDevice
from eheimdigital.types import EheimDigitalClientError, LightMode
from homeassistant.components.light import (
@@ -37,24 +38,28 @@ async def async_setup_entry(
"""Set up the callbacks for the coordinator so lights can be added as devices are found."""
coordinator = entry.runtime_data
async def async_setup_device_entities(device_address: str) -> None:
"""Set up the light entities for a device."""
device = coordinator.hub.devices[device_address]
def async_setup_device_entities(
device_address: str | dict[str, EheimDigitalDevice],
) -> None:
"""Set up the light entities for one or multiple devices."""
entities: list[EheimDigitalClassicLEDControlLight] = []
if isinstance(device_address, str):
device_address = {device_address: coordinator.hub.devices[device_address]}
for device in device_address.values():
if isinstance(device, EheimDigitalClassicLEDControl):
for channel in range(2):
if len(device.tankconfig[channel]) > 0:
entities.append(
EheimDigitalClassicLEDControlLight(
coordinator, device, channel
)
)
coordinator.known_devices.add(device.mac_address)
if isinstance(device, EheimDigitalClassicLEDControl):
for channel in range(2):
if len(device.tankconfig[channel]) > 0:
entities.append(
EheimDigitalClassicLEDControlLight(coordinator, device, channel)
)
coordinator.known_devices.add(device.mac_address)
async_add_entities(entities)
coordinator.add_platform_callback(async_setup_device_entities)
for device_address in entry.runtime_data.hub.devices:
await async_setup_device_entities(device_address)
async_setup_device_entities(coordinator.hub.devices)
class EheimDigitalClassicLEDControlLight(


@@ -8,7 +8,7 @@
"iot_class": "local_polling",
"loggers": ["eheimdigital"],
"quality_scale": "bronze",
"requirements": ["eheimdigital==1.0.5"],
"requirements": ["eheimdigital==1.0.6"],
"zeroconf": [
{ "type": "_http._tcp.local.", "name": "eheimdigital._http._tcp.local." }
]


@@ -4,12 +4,16 @@ from __future__ import annotations
import aiohttp
from electrickiwi_api import ElectricKiwiApi
from electrickiwi_api.exceptions import ApiException
from electrickiwi_api.exceptions import ApiException, AuthException
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers import aiohttp_client, config_entry_oauth2_flow
from homeassistant.helpers import (
aiohttp_client,
config_entry_oauth2_flow,
entity_registry as er,
)
from . import api
from .coordinator import (
@@ -44,7 +48,9 @@ async def async_setup_entry(
raise ConfigEntryNotReady from err
ek_api = ElectricKiwiApi(
api.AsyncConfigEntryAuth(aiohttp_client.async_get_clientsession(hass), session)
api.ConfigEntryElectricKiwiAuth(
aiohttp_client.async_get_clientsession(hass), session
)
)
hop_coordinator = ElectricKiwiHOPDataCoordinator(hass, entry, ek_api)
account_coordinator = ElectricKiwiAccountDataCoordinator(hass, entry, ek_api)
@@ -53,6 +59,8 @@ async def async_setup_entry(
await ek_api.set_active_session()
await hop_coordinator.async_config_entry_first_refresh()
await account_coordinator.async_config_entry_first_refresh()
except AuthException as err:
raise ConfigEntryAuthFailed from err
except ApiException as err:
raise ConfigEntryNotReady from err
@@ -70,3 +78,53 @@ async def async_unload_entry(
) -> bool:
"""Unload a config entry."""
return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
async def async_migrate_entry(
hass: HomeAssistant, config_entry: ElectricKiwiConfigEntry
) -> bool:
"""Migrate old entry."""
if config_entry.version == 1 and config_entry.minor_version == 1:
implementation = (
await config_entry_oauth2_flow.async_get_config_entry_implementation(
hass, config_entry
)
)
session = config_entry_oauth2_flow.OAuth2Session(
hass, config_entry, implementation
)
ek_api = ElectricKiwiApi(
api.ConfigEntryElectricKiwiAuth(
aiohttp_client.async_get_clientsession(hass), session
)
)
try:
await ek_api.set_active_session()
connection_details = await ek_api.get_connection_details()
except AuthException:
config_entry.async_start_reauth(hass)
return False
except ApiException:
return False
unique_id = str(ek_api.customer_number)
identifier = ek_api.electricity.identifier
hass.config_entries.async_update_entry(
config_entry, unique_id=unique_id, minor_version=2
)
entity_registry = er.async_get(hass)
entity_entries = er.async_entries_for_config_entry(
entity_registry, config_entry_id=config_entry.entry_id
)
for entity in entity_entries:
assert entity.config_entry_id
entity_registry.async_update_entity(
entity.entity_id,
new_unique_id=entity.unique_id.replace(
f"{unique_id}_{connection_details.id}", f"{unique_id}_{identifier}"
),
)
return True

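The `async_migrate_entry` above rewrites each entity's unique ID by swapping the old `customer_connection` prefix for the new `customer_identifier` form. The string surgery on its own, with fabricated IDs:

```python
def migrate_unique_id(
    old_unique_id: str, customer: str, old_conn: str, new_ident: str
) -> str:
    # Replace the '<customer>_<old connection id>' prefix with the
    # '<customer>_<new identifier>' form; IDs that do not contain the
    # old prefix pass through unchanged.
    return old_unique_id.replace(
        f"{customer}_{old_conn}", f"{customer}_{new_ident}"
    )
```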

@@ -2,17 +2,16 @@
from __future__ import annotations
from typing import cast
from aiohttp import ClientSession
from electrickiwi_api import AbstractAuth
from homeassistant.helpers import config_entry_oauth2_flow
from homeassistant.core import HomeAssistant
from homeassistant.helpers import aiohttp_client, config_entry_oauth2_flow
from .const import API_BASE_URL
class AsyncConfigEntryAuth(AbstractAuth):
class ConfigEntryElectricKiwiAuth(AbstractAuth):
"""Provide Electric Kiwi authentication tied to an OAuth2 based config entry."""
def __init__(
@@ -29,4 +28,21 @@ class AsyncConfigEntryAuth(AbstractAuth):
"""Return a valid access token."""
await self._oauth_session.async_ensure_token_valid()
return cast(str, self._oauth_session.token["access_token"])
return str(self._oauth_session.token["access_token"])
class ConfigFlowElectricKiwiAuth(AbstractAuth):
"""Provide Electric Kiwi authentication tied to an OAuth2 based config flow."""
def __init__(
self,
hass: HomeAssistant,
token: str,
) -> None:
"""Initialize ConfigFlowFitbitApi."""
super().__init__(aiohttp_client.async_get_clientsession(hass), API_BASE_URL)
self._token = token
async def async_get_access_token(self) -> str:
"""Return the token for the Electric Kiwi API."""
return self._token


@@ -6,9 +6,14 @@ from collections.abc import Mapping
import logging
from typing import Any
from homeassistant.config_entries import ConfigFlowResult
from electrickiwi_api import ElectricKiwiApi
from electrickiwi_api.exceptions import ApiException
from homeassistant.config_entries import SOURCE_REAUTH, ConfigFlowResult
from homeassistant.const import CONF_NAME
from homeassistant.helpers import config_entry_oauth2_flow
from . import api
from .const import DOMAIN, SCOPE_VALUES
@@ -17,6 +22,8 @@ class ElectricKiwiOauth2FlowHandler(
):
"""Config flow to handle Electric Kiwi OAuth2 authentication."""
VERSION = 1
MINOR_VERSION = 2
DOMAIN = DOMAIN
@property
@@ -40,12 +47,30 @@ class ElectricKiwiOauth2FlowHandler(
) -> ConfigFlowResult:
"""Dialog that informs the user that reauth is required."""
if user_input is None:
return self.async_show_form(step_id="reauth_confirm")
return self.async_show_form(
step_id="reauth_confirm",
description_placeholders={CONF_NAME: self._get_reauth_entry().title},
)
return await self.async_step_user()
async def async_oauth_create_entry(self, data: dict) -> ConfigFlowResult:
"""Create an entry for Electric Kiwi."""
existing_entry = await self.async_set_unique_id(DOMAIN)
if existing_entry:
return self.async_update_reload_and_abort(existing_entry, data=data)
return await super().async_oauth_create_entry(data)
ek_api = ElectricKiwiApi(
api.ConfigFlowElectricKiwiAuth(self.hass, data["token"]["access_token"])
)
try:
session = await ek_api.get_active_session()
except ApiException:
return self.async_abort(reason="connection_error")
unique_id = str(session.data.customer_number)
await self.async_set_unique_id(unique_id)
if self.source == SOURCE_REAUTH:
self._abort_if_unique_id_mismatch(reason="wrong_account")
return self.async_update_reload_and_abort(
self._get_reauth_entry(), data=data
)
self._abort_if_unique_id_configured()
return self.async_create_entry(title=unique_id, data=data)


@@ -8,4 +8,4 @@ OAUTH2_AUTHORIZE = "https://welcome.electrickiwi.co.nz/oauth/authorize"
OAUTH2_TOKEN = "https://welcome.electrickiwi.co.nz/oauth/token"
API_BASE_URL = "https://api.electrickiwi.co.nz"
SCOPE_VALUES = "read_connection_detail read_billing_frequency read_account_running_balance read_consumption_summary read_consumption_averages read_hop_intervals_config read_hop_connection save_hop_connection read_session"
SCOPE_VALUES = "read_customer_details read_connection_detail read_connection read_billing_address get_bill_address read_billing_frequency read_billing_details read_billing_bills read_billing_bill read_billing_bill_id read_billing_bill_file read_account_running_balance read_customer_account_summary read_consumption_summary download_consumption_file read_consumption_averages get_consumption_averages read_hop_intervals_config read_hop_intervals read_hop_connection read_hop_specific_connection save_hop_connection save_hop_specific_connection read_outage_contact get_outage_contact_info_for_icp read_session read_session_data_login"


@@ -10,7 +10,7 @@ import logging
from electrickiwi_api import ElectricKiwiApi
from electrickiwi_api.exceptions import ApiException, AuthException
from electrickiwi_api.model import AccountBalance, Hop, HopIntervals
from electrickiwi_api.model import AccountSummary, Hop, HopIntervals
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
@@ -34,7 +34,7 @@ class ElectricKiwiRuntimeData:
type ElectricKiwiConfigEntry = ConfigEntry[ElectricKiwiRuntimeData]
class ElectricKiwiAccountDataCoordinator(DataUpdateCoordinator[AccountBalance]):
class ElectricKiwiAccountDataCoordinator(DataUpdateCoordinator[AccountSummary]):
"""ElectricKiwi Account Data object."""
def __init__(
@@ -51,13 +51,13 @@ class ElectricKiwiAccountDataCoordinator(DataUpdateCoordinator[AccountBalance]):
name="Electric Kiwi Account Data",
update_interval=ACCOUNT_SCAN_INTERVAL,
)
self._ek_api = ek_api
self.ek_api = ek_api
async def _async_update_data(self) -> AccountBalance:
async def _async_update_data(self) -> AccountSummary:
"""Fetch data from Account balance API endpoint."""
try:
async with asyncio.timeout(60):
return await self._ek_api.get_account_balance()
return await self.ek_api.get_account_summary()
except AuthException as auth_err:
raise ConfigEntryAuthFailed from auth_err
except ApiException as api_err:
@@ -85,7 +85,7 @@ class ElectricKiwiHOPDataCoordinator(DataUpdateCoordinator[Hop]):
# Polling interval. Will only be polled if there are subscribers.
update_interval=HOP_SCAN_INTERVAL,
)
self._ek_api = ek_api
self.ek_api = ek_api
self.hop_intervals: HopIntervals | None = None
def get_hop_options(self) -> dict[str, int]:
@@ -100,7 +100,7 @@ class ElectricKiwiHOPDataCoordinator(DataUpdateCoordinator[Hop]):
async def async_update_hop(self, hop_interval: int) -> Hop:
"""Update selected hop and data."""
try:
self.async_set_updated_data(await self._ek_api.post_hop(hop_interval))
self.async_set_updated_data(await self.ek_api.post_hop(hop_interval))
except AuthException as auth_err:
raise ConfigEntryAuthFailed from auth_err
except ApiException as api_err:
@@ -118,7 +118,7 @@ class ElectricKiwiHOPDataCoordinator(DataUpdateCoordinator[Hop]):
try:
async with asyncio.timeout(60):
if self.hop_intervals is None:
hop_intervals: HopIntervals = await self._ek_api.get_hop_intervals()
hop_intervals: HopIntervals = await self.ek_api.get_hop_intervals()
hop_intervals.intervals = OrderedDict(
filter(
lambda pair: pair[1].active == 1,
@@ -127,7 +127,7 @@ class ElectricKiwiHOPDataCoordinator(DataUpdateCoordinator[Hop]):
)
self.hop_intervals = hop_intervals
return await self._ek_api.get_hop()
return await self.ek_api.get_hop()
except AuthException as auth_err:
raise ConfigEntryAuthFailed from auth_err
except ApiException as api_err:


@@ -7,5 +7,5 @@
"documentation": "https://www.home-assistant.io/integrations/electric_kiwi",
"integration_type": "hub",
"iot_class": "cloud_polling",
"requirements": ["electrickiwi-api==0.8.5"]
"requirements": ["electrickiwi-api==0.9.14"]
}


@@ -53,8 +53,8 @@ class ElectricKiwiSelectHOPEntity(
"""Initialise the HOP selection entity."""
super().__init__(coordinator)
self._attr_unique_id = (
f"{coordinator._ek_api.customer_number}" # noqa: SLF001
f"_{coordinator._ek_api.connection_id}_{description.key}" # noqa: SLF001
f"{coordinator.ek_api.customer_number}"
f"_{coordinator.ek_api.electricity.identifier}_{description.key}"
)
self.entity_description = description
self.values_dict = coordinator.get_hop_options()


@@ -6,7 +6,7 @@ from collections.abc import Callable
from dataclasses import dataclass
from datetime import datetime, timedelta
from electrickiwi_api.model import AccountBalance, Hop
from electrickiwi_api.model import AccountSummary, Hop
from homeassistant.components.sensor import (
SensorDeviceClass,
@@ -39,7 +39,15 @@ ATTR_HOP_PERCENTAGE = "hop_percentage"
class ElectricKiwiAccountSensorEntityDescription(SensorEntityDescription):
"""Describes Electric Kiwi sensor entity."""
value_func: Callable[[AccountBalance], float | datetime]
value_func: Callable[[AccountSummary], float | datetime]
def _get_hop_percentage(account_balance: AccountSummary) -> float:
"""Return the hop percentage from account summary."""
if power := account_balance.services.get("power"):
if connection := power.connections[0]:
return float(connection.hop_percentage)
return 0.0
ACCOUNT_SENSOR_TYPES: tuple[ElectricKiwiAccountSensorEntityDescription, ...] = (
@@ -72,9 +80,7 @@ ACCOUNT_SENSOR_TYPES: tuple[ElectricKiwiAccountSensorEntityDescription, ...] = (
translation_key="hop_power_savings",
native_unit_of_measurement=PERCENTAGE,
state_class=SensorStateClass.MEASUREMENT,
value_func=lambda account_balance: float(
account_balance.connections[0].hop_percentage
),
value_func=_get_hop_percentage,
),
)
@@ -165,8 +171,8 @@ class ElectricKiwiAccountEntity(
super().__init__(coordinator)
self._attr_unique_id = (
f"{coordinator._ek_api.customer_number}" # noqa: SLF001
f"_{coordinator._ek_api.connection_id}_{description.key}" # noqa: SLF001
f"{coordinator.ek_api.customer_number}"
f"_{coordinator.ek_api.electricity.identifier}_{description.key}"
)
self.entity_description = description
@@ -194,8 +200,8 @@ class ElectricKiwiHOPEntity(
super().__init__(coordinator)
self._attr_unique_id = (
f"{coordinator._ek_api.customer_number}" # noqa: SLF001
f"_{coordinator._ek_api.connection_id}_{description.key}" # noqa: SLF001
f"{coordinator.ek_api.customer_number}"
f"_{coordinator.ek_api.electricity.identifier}_{description.key}"
)
self.entity_description = description

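The `_get_hop_percentage` helper above replaces a direct `connections[0]` access with guarded walrus lookups, defaulting to `0.0` when the power service or its connections are absent. The same shape modeled with plain dicts rather than the library's typed `AccountSummary` models:

```python
def hop_percentage(services: dict) -> float:
    # Pull the hop percentage off the first power connection; fall back
    # to 0.0 when the service or its connection list is missing.
    if (power := services.get("power")) and power["connections"]:
        return float(power["connections"][0]["hop_percentage"])
    return 0.0
```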

@@ -21,7 +21,8 @@
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]",
"oauth_timeout": "[%key:common::config_flow::abort::oauth2_timeout%]",
"oauth_unauthorized": "[%key:common::config_flow::abort::oauth2_unauthorized%]",
"oauth_failed": "[%key:common::config_flow::abort::oauth2_failed%]"
"oauth_failed": "[%key:common::config_flow::abort::oauth2_failed%]",
"connection_error": "[%key:common::config_flow::error::cannot_connect%]"
},
"create_entry": {
"default": "[%key:common::config_flow::create_entry::authenticated%]"


@@ -3,7 +3,7 @@
"config": {
"step": {
"user": {
"title": "Searching for Energenie-Power-Sockets Devices.",
"title": "Searching for Energenie Power Sockets devices",
"description": "Choose a discovered device.",
"data": {
"device": "[%key:common::config_flow::data::device%]"
@@ -13,7 +13,7 @@
"abort": {
"usb_error": "Couldn't access USB devices!",
"no_device": "Unable to discover any (new) supported device.",
"device_not_found": "No device was found for the given id.",
"device_not_found": "No device was found for the given ID.",
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]"
}
},


@@ -22,5 +22,5 @@
"integration_type": "device",
"iot_class": "local_polling",
"loggers": ["eq3btsmart"],
"requirements": ["eq3btsmart==1.4.1", "bleak-esphome==2.2.0"]
"requirements": ["eq3btsmart==1.4.1", "bleak-esphome==2.7.0"]
}


@@ -28,6 +28,7 @@ def async_connect_scanner(
entry_data: RuntimeEntryData,
cli: APIClient,
device_info: DeviceInfo,
device_id: str,
) -> CALLBACK_TYPE:
"""Connect scanner."""
client_data = connect_scanner(cli, device_info, entry_data.available)
@@ -45,6 +46,7 @@ def async_connect_scanner(
source_domain=DOMAIN,
source_model=device_info.model,
source_config_entry_id=entry_data.entry_id,
source_device_id=device_id,
),
scanner.async_setup(),
],

View File

@@ -425,7 +425,9 @@ class ESPHomeManager:
if device_info.bluetooth_proxy_feature_flags_compat(api_version):
entry_data.disconnect_callbacks.add(
async_connect_scanner(hass, entry_data, cli, device_info)
async_connect_scanner(
hass, entry_data, cli, device_info, self.device_id
)
)
else:
bluetooth.async_remove_scanner(hass, device_info.mac_address)
@@ -571,7 +573,9 @@ def _async_setup_device_registry(
configuration_url = None
if device_info.webserver_port > 0:
configuration_url = f"http://{entry.data['host']}:{device_info.webserver_port}"
entry_host = entry.data["host"]
host = f"[{entry_host}]" if ":" in entry_host else entry_host
configuration_url = f"http://{host}:{device_info.webserver_port}"
elif (
(dashboard := async_get_dashboard(hass))
and dashboard.data

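The `configuration_url` change above handles ESPHome hosts that are literal IPv6 addresses: an address containing `:` must be bracketed before a port can be appended unambiguously in a URL. The check sketched on its own:

```python
def build_config_url(host: str, port: int) -> str:
    # Literal IPv6 addresses contain ':' and must be wrapped in brackets
    # so the port separator in the URL is unambiguous.
    host_part = f"[{host}]" if ":" in host else host
    return f"http://{host_part}:{port}"
```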

@@ -18,7 +18,7 @@
"requirements": [
"aioesphomeapi==29.0.0",
"esphome-dashboard-api==1.2.3",
"bleak-esphome==2.2.0"
"bleak-esphome==2.7.0"
],
"zeroconf": ["_esphomelib._tcp.local."]
}


@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/fireservicerota",
"iot_class": "cloud_polling",
"loggers": ["pyfireservicerota"],
"requirements": ["pyfireservicerota==0.0.43"]
"requirements": ["pyfireservicerota==0.0.46"]
}


@@ -21,5 +21,5 @@
"documentation": "https://www.home-assistant.io/integrations/frontend",
"integration_type": "system",
"quality_scale": "internal",
"requirements": ["home-assistant-frontend==20250129.0"]
"requirements": ["home-assistant-frontend==20250210.0"]
}


@@ -7,7 +7,7 @@ from collections.abc import Callable
from google_drive_api.exceptions import GoogleDriveApiError
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.helpers import instance_id
from homeassistant.helpers.aiohttp_client import async_get_clientsession
@@ -49,6 +49,8 @@ async def async_setup_entry(hass: HomeAssistant, entry: GoogleDriveConfigEntry)
except GoogleDriveApiError as err:
raise ConfigEntryNotReady from err
_async_notify_backup_listeners_soon(hass)
return True
@@ -56,10 +58,15 @@ async def async_unload_entry(
hass: HomeAssistant, entry: GoogleDriveConfigEntry
) -> bool:
"""Unload a config entry."""
hass.loop.call_soon(_notify_backup_listeners, hass)
_async_notify_backup_listeners_soon(hass)
return True
def _notify_backup_listeners(hass: HomeAssistant) -> None:
def _async_notify_backup_listeners(hass: HomeAssistant) -> None:
for listener in hass.data.get(DATA_BACKUP_AGENT_LISTENERS, []):
listener()
@callback
def _async_notify_backup_listeners_soon(hass: HomeAssistant) -> None:
hass.loop.call_soon(_async_notify_backup_listeners, hass)


@@ -11,7 +11,7 @@ from aiohttp import ClientSession, ClientTimeout, StreamReader
from aiohttp.client_exceptions import ClientError, ClientResponseError
from google_drive_api.api import AbstractAuth, GoogleDriveApi
from homeassistant.components.backup import AgentBackup
from homeassistant.components.backup import AgentBackup, suggested_filename
from homeassistant.config_entries import ConfigEntryState
from homeassistant.const import CONF_ACCESS_TOKEN
from homeassistant.exceptions import (
@@ -132,7 +132,7 @@ class DriveClient:
"""Upload a backup."""
folder_id, _ = await self.async_create_ha_root_folder_if_not_exists()
backup_metadata = {
"name": f"{backup.name} {backup.date}.tar",
"name": suggested_filename(backup),
"description": json.dumps(backup.as_dict()),
"parents": [folder_id],
"properties": {
@@ -146,9 +146,10 @@ class DriveClient:
backup.backup_id,
backup_metadata,
)
await self._api.upload_file(
await self._api.resumable_upload_file(
backup_metadata,
open_stream,
backup.size,
timeout=ClientTimeout(total=_UPLOAD_AND_DOWNLOAD_TIMEOUT),
)
_LOGGER.debug(


@@ -2,6 +2,10 @@
from homeassistant.components.application_credentials import AuthorizationServer
from homeassistant.core import HomeAssistant
from homeassistant.helpers.config_entry_oauth2_flow import (
AUTH_CALLBACK_PATH,
MY_AUTH_CALLBACK_PATH,
)
async def async_get_authorization_server(hass: HomeAssistant) -> AuthorizationServer:
@@ -14,8 +18,14 @@ async def async_get_authorization_server(hass: HomeAssistant) -> AuthorizationSe
async def async_get_description_placeholders(hass: HomeAssistant) -> dict[str, str]:
"""Return description placeholders for the credentials dialog."""
if "my" in hass.config.components:
redirect_url = MY_AUTH_CALLBACK_PATH
else:
ha_host = hass.config.external_url or "https://YOUR_DOMAIN:PORT"
redirect_url = f"{ha_host}{AUTH_CALLBACK_PATH}"
return {
"oauth_consent_url": "https://console.cloud.google.com/apis/credentials/consent",
"more_info_url": "https://www.home-assistant.io/integrations/google_drive/",
"oauth_creds_url": "https://console.cloud.google.com/apis/credentials",
"redirect_url": redirect_url,
}


@@ -80,16 +80,14 @@ class GoogleDriveBackupAgent(BackupAgent):
try:
await self._client.async_upload_backup(open_stream, backup)
except (GoogleDriveApiError, HomeAssistantError, TimeoutError) as err:
_LOGGER.error("Upload backup error: %s", err)
raise BackupAgentError("Failed to upload backup") from err
raise BackupAgentError(f"Failed to upload backup: {err}") from err
async def async_list_backups(self, **kwargs: Any) -> list[AgentBackup]:
"""List backups."""
try:
return await self._client.async_list_backups()
except (GoogleDriveApiError, HomeAssistantError, TimeoutError) as err:
_LOGGER.error("List backups error: %s", err)
raise BackupAgentError("Failed to list backups") from err
raise BackupAgentError(f"Failed to list backups: {err}") from err
async def async_get_backup(
self,
@@ -121,9 +119,7 @@ class GoogleDriveBackupAgent(BackupAgent):
stream = await self._client.async_download(file_id)
return ChunkAsyncStreamIterator(stream)
except (GoogleDriveApiError, HomeAssistantError, TimeoutError) as err:
_LOGGER.error("Download backup error: %s", err)
raise BackupAgentError("Failed to download backup") from err
_LOGGER.error("Download backup_id: %s not found", backup_id)
raise BackupAgentError(f"Failed to download backup: {err}") from err
raise BackupAgentError("Backup not found")
async def async_delete_backup(
@@ -143,5 +139,4 @@ class GoogleDriveBackupAgent(BackupAgent):
await self._client.async_delete(file_id)
_LOGGER.debug("Deleted backup_id: %s", backup_id)
except (GoogleDriveApiError, HomeAssistantError, TimeoutError) as err:
_LOGGER.error("Delete backup error: %s", err)
raise BackupAgentError("Failed to delete backup") from err
raise BackupAgentError(f"Failed to delete backup: {err}") from err


@@ -10,5 +10,5 @@
"iot_class": "cloud_polling",
"loggers": ["google_drive_api"],
"quality_scale": "platinum",
"requirements": ["python-google-drive-api==0.0.2"]
"requirements": ["python-google-drive-api==0.1.0"]
}


@@ -35,6 +35,6 @@
}
},
"application_credentials": {
"description": "Follow the [instructions]({more_info_url}) for [OAuth consent screen]({oauth_consent_url}) to give Home Assistant access to your Google Drive. You also need to create Application Credentials linked to your account:\n1. Go to [Credentials]({oauth_creds_url}) and select **Create Credentials**.\n1. From the drop-down list select **OAuth client ID**.\n1. Select **Web application** for the Application Type."
"description": "Follow the [instructions]({more_info_url}) to configure the Cloud Console:\n\n1. Go to the [OAuth consent screen]({oauth_consent_url}) and configure\n1. Go to [Credentials]({oauth_creds_url}) and select **Create Credentials**.\n1. From the drop-down list select **OAuth client ID**.\n1. Select **Web application** for the Application Type.\n1. Add `{redirect_url}` under *Authorized redirect URI*."
}
}


@@ -78,7 +78,7 @@ class GoveeConfigFlow(ConfigFlow, domain=DOMAIN):
title=title, data={CONF_DEVICE_TYPE: device.device_type}
)
current_addresses = self._async_current_ids()
current_addresses = self._async_current_ids(include_ignore=False)
for discovery_info in async_discovered_service_info(self.hass, False):
address = discovery_info.address
if address in current_addresses or address in self._discovered_devices:


@@ -38,6 +38,10 @@
"local_name": "GV5126*",
"connectable": false
},
{
"local_name": "GV5179*",
"connectable": false
},
{
"local_name": "GVH5127*",
"connectable": false
@@ -131,5 +135,5 @@
"dependencies": ["bluetooth_adapters"],
"documentation": "https://www.home-assistant.io/integrations/govee_ble",
"iot_class": "local_push",
"requirements": ["govee-ble==0.42.0"]
"requirements": ["govee-ble==0.43.0"]
}

@@ -11,6 +11,7 @@ from typing import Any
from aiohttp import ClientError
from habiticalib import (
Avatar,
ContentData,
Habitica,
HabiticaException,
@@ -19,7 +20,6 @@ from habiticalib import (
TaskFilter,
TooManyRequestsError,
UserData,
UserStyles,
)
from homeassistant.config_entries import ConfigEntry
@@ -159,12 +159,10 @@ class HabiticaDataUpdateCoordinator(DataUpdateCoordinator[HabiticaData]):
else:
await self.async_request_refresh()
async def generate_avatar(self, user_styles: UserStyles) -> bytes:
async def generate_avatar(self, avatar: Avatar) -> bytes:
"""Generate Avatar."""
avatar = BytesIO()
await self.habitica.generate_avatar(
fp=avatar, user_styles=user_styles, fmt="PNG"
)
png = BytesIO()
await self.habitica.generate_avatar(fp=png, avatar=avatar, fmt="PNG")
return avatar.getvalue()
return png.getvalue()

@@ -23,5 +23,5 @@ async def async_get_config_entry_diagnostics(
CONF_URL: config_entry.data[CONF_URL],
CONF_API_USER: config_entry.data[CONF_API_USER],
},
"habitica_data": habitica_data.to_dict()["data"],
"habitica_data": habitica_data.to_dict(omit_none=False)["data"],
}

@@ -2,10 +2,9 @@
from __future__ import annotations
from dataclasses import asdict
from enum import StrEnum
from habiticalib import UserStyles
from habiticalib import Avatar, extract_avatar
from homeassistant.components.image import ImageEntity, ImageEntityDescription
from homeassistant.core import HomeAssistant
@@ -45,7 +44,7 @@ class HabiticaImage(HabiticaBase, ImageEntity):
translation_key=HabiticaImageEntity.AVATAR,
)
_attr_content_type = "image/png"
_current_appearance: UserStyles | None = None
_current_appearance: Avatar | None = None
_cache: bytes | None = None
def __init__(
@@ -60,7 +59,7 @@ class HabiticaImage(HabiticaBase, ImageEntity):
def _handle_coordinator_update(self) -> None:
"""Check if equipped gear and other things have changed since last avatar image generation."""
new_appearance = UserStyles.from_dict(asdict(self.coordinator.data.user))
new_appearance = extract_avatar(self.coordinator.data.user)
if self._current_appearance != new_appearance:
self._current_appearance = new_appearance

@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/habitica",
"iot_class": "cloud_polling",
"loggers": ["habiticalib"],
"requirements": ["habiticalib==0.3.3"]
"requirements": ["habiticalib==0.3.7"]
}

@@ -77,7 +77,7 @@ SERVICE_API_CALL_SCHEMA = vol.Schema(
SERVICE_CAST_SKILL_SCHEMA = vol.Schema(
{
vol.Required(ATTR_CONFIG_ENTRY): ConfigEntrySelector(),
vol.Required(ATTR_CONFIG_ENTRY): ConfigEntrySelector({"integration": DOMAIN}),
vol.Required(ATTR_SKILL): cv.string,
vol.Optional(ATTR_TASK): cv.string,
}
@@ -85,12 +85,12 @@ SERVICE_CAST_SKILL_SCHEMA = vol.Schema(
SERVICE_MANAGE_QUEST_SCHEMA = vol.Schema(
{
vol.Required(ATTR_CONFIG_ENTRY): ConfigEntrySelector(),
vol.Required(ATTR_CONFIG_ENTRY): ConfigEntrySelector({"integration": DOMAIN}),
}
)
SERVICE_SCORE_TASK_SCHEMA = vol.Schema(
{
vol.Required(ATTR_CONFIG_ENTRY): ConfigEntrySelector(),
vol.Required(ATTR_CONFIG_ENTRY): ConfigEntrySelector({"integration": DOMAIN}),
vol.Required(ATTR_TASK): cv.string,
vol.Optional(ATTR_DIRECTION): cv.string,
}
@@ -98,7 +98,7 @@ SERVICE_SCORE_TASK_SCHEMA = vol.Schema(
SERVICE_TRANSFORMATION_SCHEMA = vol.Schema(
{
vol.Required(ATTR_CONFIG_ENTRY): ConfigEntrySelector(),
vol.Required(ATTR_CONFIG_ENTRY): ConfigEntrySelector({"integration": DOMAIN}),
vol.Required(ATTR_ITEM): cv.string,
vol.Required(ATTR_TARGET): cv.string,
}
@@ -106,7 +106,7 @@ SERVICE_TRANSFORMATION_SCHEMA = vol.Schema(
SERVICE_GET_TASKS_SCHEMA = vol.Schema(
{
vol.Required(ATTR_CONFIG_ENTRY): ConfigEntrySelector(),
vol.Required(ATTR_CONFIG_ENTRY): ConfigEntrySelector({"integration": DOMAIN}),
vol.Optional(ATTR_TYPE): vol.All(
cv.ensure_list, [vol.All(vol.Upper, vol.In({x.name for x in TaskType}))]
),
@@ -510,7 +510,10 @@ def async_setup_services(hass: HomeAssistant) -> None: # noqa: C901
or (task.notes and keyword in task.notes.lower())
or any(keyword in item.text.lower() for item in task.checklist)
]
result: dict[str, Any] = {"tasks": response}
result: dict[str, Any] = {
"tasks": [task.to_dict(omit_none=False) for task in response]
}
return result
hass.services.async_register(

@@ -6,7 +6,7 @@ import asyncio
from collections.abc import AsyncIterator, Callable, Coroutine, Mapping
import logging
import os
from pathlib import Path
from pathlib import Path, PurePath
from typing import Any, cast
from uuid import UUID
@@ -20,6 +20,7 @@ from aiohasupervisor.models import (
backups as supervisor_backups,
mounts as supervisor_mounts,
)
from aiohasupervisor.models.backups import LOCATION_CLOUD_BACKUP, LOCATION_LOCAL_STORAGE
from homeassistant.components.backup import (
DATA_MANAGER,
@@ -27,30 +28,39 @@ from homeassistant.components.backup import (
AgentBackup,
BackupAgent,
BackupManagerError,
BackupNotFound,
BackupReaderWriter,
BackupReaderWriterError,
CreateBackupEvent,
CreateBackupStage,
CreateBackupState,
Folder,
IdleEvent,
IncorrectPasswordError,
ManagerBackup,
NewBackup,
RestoreBackupEvent,
RestoreBackupStage,
RestoreBackupState,
WrittenBackup,
async_get_manager as async_get_backup_manager,
suggested_filename as suggested_backup_filename,
suggested_filename_from_name_date,
)
from homeassistant.const import __version__ as HAVERSION
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.dispatcher import async_dispatcher_connect
from homeassistant.util import dt as dt_util
from homeassistant.util.enum import try_parse_enum
from .const import DOMAIN, EVENT_SUPERVISOR_EVENT
from .handler import get_supervisor_client
LOCATION_CLOUD_BACKUP = ".cloud_backup"
LOCATION_LOCAL = ".local"
MOUNT_JOBS = ("mount_manager_create_mount", "mount_manager_remove_mount")
RESTORE_JOB_ID_ENV = "SUPERVISOR_RESTORE_JOB_ID"
# Set on backups automatically created when updating an addon
TAG_ADDON_UPDATE = "supervisor.addon_update"
_LOGGER = logging.getLogger(__name__)
@@ -61,7 +71,9 @@ async def async_get_backup_agents(
"""Return the hassio backup agents."""
client = get_supervisor_client(hass)
mounts = await client.mounts.info()
agents: list[BackupAgent] = [SupervisorBackupAgent(hass, "local", None)]
agents: list[BackupAgent] = [
SupervisorBackupAgent(hass, "local", LOCATION_LOCAL_STORAGE)
]
for mount in mounts.mounts:
if mount.usage is not supervisor_mounts.MountUsage.BACKUP:
continue
@@ -101,7 +113,7 @@ def async_register_backup_agents_listener(
def _backup_details_to_agent_backup(
details: supervisor_backups.BackupComplete, location: str | None
details: supervisor_backups.BackupComplete, location: str
) -> AgentBackup:
"""Convert a supervisor backup details object to an agent backup."""
homeassistant_included = details.homeassistant is not None
@@ -113,12 +125,14 @@ def _backup_details_to_agent_backup(
AddonInfo(name=addon.name, slug=addon.slug, version=addon.version)
for addon in details.addons
]
location = location or LOCATION_LOCAL
extra_metadata = details.extra or {}
return AgentBackup(
addons=addons,
backup_id=details.slug,
database_included=database_included,
date=details.date.isoformat(),
date=extra_metadata.get(
"supervisor.backup_request_date", details.date.isoformat()
),
extra_metadata=details.extra or {},
folders=[Folder(folder) for folder in details.folders],
homeassistant_included=homeassistant_included,
@@ -134,7 +148,7 @@ class SupervisorBackupAgent(BackupAgent):
domain = DOMAIN
def __init__(self, hass: HomeAssistant, name: str, location: str | None) -> None:
def __init__(self, hass: HomeAssistant, name: str, location: str) -> None:
"""Initialize the backup agent."""
super().__init__()
self._hass = hass
@@ -149,10 +163,15 @@ class SupervisorBackupAgent(BackupAgent):
**kwargs: Any,
) -> AsyncIterator[bytes]:
"""Download a backup file."""
return await self._client.backups.download_backup(
backup_id,
options=supervisor_backups.DownloadBackupOptions(location=self.location),
)
try:
return await self._client.backups.download_backup(
backup_id,
options=supervisor_backups.DownloadBackupOptions(
location=self.location
),
)
except SupervisorNotFoundError as err:
raise BackupNotFound from err
async def async_upload_backup(
self,
@@ -174,7 +193,8 @@ class SupervisorBackupAgent(BackupAgent):
return
stream = await open_stream()
upload_options = supervisor_backups.UploadBackupOptions(
location={self.location}
location={self.location},
filename=PurePath(suggested_backup_filename(backup)),
)
await self._client.backups.upload_backup(
stream,
@@ -186,7 +206,7 @@ class SupervisorBackupAgent(BackupAgent):
backup_list = await self._client.backups.list()
result = []
for backup in backup_list:
if not backup.locations or self.location not in backup.locations:
if self.location not in backup.location_attributes:
continue
details = await self._client.backups.backup_info(backup.slug)
result.append(_backup_details_to_agent_backup(details, self.location))
@@ -198,8 +218,11 @@ class SupervisorBackupAgent(BackupAgent):
**kwargs: Any,
) -> AgentBackup | None:
"""Return a backup."""
details = await self._client.backups.backup_info(backup_id)
if self.location not in details.locations:
try:
details = await self._client.backups.backup_info(backup_id)
except SupervisorNotFoundError:
return None
if self.location not in details.location_attributes:
return None
return _backup_details_to_agent_backup(details, self.location)
@@ -212,10 +235,6 @@ class SupervisorBackupAgent(BackupAgent):
location={self.location}
),
)
except SupervisorBadRequestError as err:
if err.args[0] != "Backup does not exist":
raise
_LOGGER.debug("Backup %s does not exist", backup_id)
except SupervisorNotFoundError:
_LOGGER.debug("Backup %s does not exist", backup_id)
@@ -276,8 +295,8 @@ class SupervisorBackupReaderWriter(BackupReaderWriter):
# will be handled by async_upload_backup.
# If the lists are the same length, it does not matter which one we send,
# we send the encrypted list to have a well defined behavior.
encrypted_locations: list[str | None] = []
decrypted_locations: list[str | None] = []
encrypted_locations: list[str] = []
decrypted_locations: list[str] = []
agents_settings = manager.config.data.agents
for hassio_agent in hassio_agents:
if password is not None:
@@ -302,6 +321,9 @@ class SupervisorBackupReaderWriter(BackupReaderWriter):
locations = []
locations = locations or [LOCATION_CLOUD_BACKUP]
date = dt_util.now().isoformat()
extra_metadata = extra_metadata | {"supervisor.backup_request_date": date}
filename = suggested_filename_from_name_date(backup_name, date)
try:
backup = await self._client.backups.partial_backup(
supervisor_backups.PartialBackupOptions(
@@ -315,6 +337,7 @@ class SupervisorBackupReaderWriter(BackupReaderWriter):
homeassistant_exclude_database=not include_database,
background=True,
extra=extra_metadata,
filename=PurePath(filename),
)
)
except SupervisorError as err:
@@ -323,40 +346,56 @@ class SupervisorBackupReaderWriter(BackupReaderWriter):
self._async_wait_for_backup(
backup,
locations,
on_progress=on_progress,
remove_after_upload=locations == [LOCATION_CLOUD_BACKUP],
),
name="backup_manager_create_backup",
eager_start=False, # To ensure the task is not started before we return
)
return (NewBackup(backup_job_id=backup.job_id), backup_task)
return (NewBackup(backup_job_id=backup.job_id.hex), backup_task)
async def _async_wait_for_backup(
self,
backup: supervisor_backups.NewBackup,
locations: list[str | None],
locations: list[str],
*,
on_progress: Callable[[CreateBackupEvent], None],
remove_after_upload: bool,
) -> WrittenBackup:
"""Wait for a backup to complete."""
backup_complete = asyncio.Event()
backup_id: str | None = None
create_errors: list[dict[str, str]] = []
@callback
def on_job_progress(data: Mapping[str, Any]) -> None:
"""Handle backup progress."""
nonlocal backup_id
if not (stage := try_parse_enum(CreateBackupStage, data.get("stage"))):
_LOGGER.debug("Unknown create stage: %s", data.get("stage"))
else:
on_progress(
CreateBackupEvent(
reason=None, stage=stage, state=CreateBackupState.IN_PROGRESS
)
)
if data.get("done") is True:
backup_id = data.get("reference")
create_errors.extend(data.get("errors", []))
backup_complete.set()
unsub = self._async_listen_job_events(backup.job_id, on_job_progress)
try:
unsub = self._async_listen_job_events(backup.job_id, on_job_progress)
await self._get_job_state(backup.job_id, on_job_progress)
await backup_complete.wait()
finally:
unsub()
if not backup_id:
raise BackupReaderWriterError("Backup failed")
if not backup_id or create_errors:
# We should add more specific error handling here in the future
raise BackupReaderWriterError(
f"Backup failed: {create_errors or 'no backup_id'}"
)
async def open_backup() -> AsyncIterator[bytes]:
try:
@@ -469,7 +508,7 @@ class SupervisorBackupReaderWriter(BackupReaderWriter):
else None
)
restore_location: str | None
restore_location: str
if manager.backup_agents[agent_id].domain != DOMAIN:
# Download the backup to the supervisor. Supervisor will clean up the backup
# two days after the restore is done.
@@ -495,6 +534,8 @@ class SupervisorBackupReaderWriter(BackupReaderWriter):
location=restore_location,
),
)
except SupervisorNotFoundError as err:
raise BackupNotFound from err
except SupervisorBadRequestError as err:
# Supervisor currently does not transmit machine parsable error types
message = err.args[0]
@@ -503,16 +544,30 @@ class SupervisorBackupReaderWriter(BackupReaderWriter):
raise HomeAssistantError(message) from err
restore_complete = asyncio.Event()
restore_errors: list[dict[str, str]] = []
@callback
def on_job_progress(data: Mapping[str, Any]) -> None:
"""Handle backup progress."""
"""Handle backup restore progress."""
if not (stage := try_parse_enum(RestoreBackupStage, data.get("stage"))):
_LOGGER.debug("Unknown restore stage: %s", data.get("stage"))
else:
on_progress(
RestoreBackupEvent(
reason=None, stage=stage, state=RestoreBackupState.IN_PROGRESS
)
)
if data.get("done") is True:
restore_complete.set()
restore_errors.extend(data.get("errors", []))
unsub = self._async_listen_job_events(job.job_id, on_job_progress)
try:
unsub = self._async_listen_job_events(job.job_id, on_job_progress)
await self._get_job_state(job.job_id, on_job_progress)
await restore_complete.wait()
if restore_errors:
# We should add more specific error handling here in the future
raise BackupReaderWriterError(f"Restore failed: {restore_errors}")
finally:
unsub()
@@ -522,28 +577,52 @@ class SupervisorBackupReaderWriter(BackupReaderWriter):
on_progress: Callable[[RestoreBackupEvent | IdleEvent], None],
) -> None:
"""Check restore status after core restart."""
if not (restore_job_id := os.environ.get(RESTORE_JOB_ID_ENV)):
if not (restore_job_str := os.environ.get(RESTORE_JOB_ID_ENV)):
_LOGGER.debug("No restore job ID found in environment")
return
restore_job_id = UUID(restore_job_str)
_LOGGER.debug("Found restore job ID %s in environment", restore_job_id)
sent_event = False
@callback
def on_job_progress(data: Mapping[str, Any]) -> None:
"""Handle backup restore progress."""
nonlocal sent_event
if not (stage := try_parse_enum(RestoreBackupStage, data.get("stage"))):
_LOGGER.debug("Unknown restore stage: %s", data.get("stage"))
if data.get("done") is not True:
on_progress(
RestoreBackupEvent(
reason="", stage=None, state=RestoreBackupState.IN_PROGRESS
if stage or not sent_event:
sent_event = True
on_progress(
RestoreBackupEvent(
reason=None,
stage=stage,
state=RestoreBackupState.IN_PROGRESS,
)
)
)
return
on_progress(
RestoreBackupEvent(
reason="", stage=None, state=RestoreBackupState.COMPLETED
restore_errors = data.get("errors", [])
if restore_errors:
_LOGGER.warning("Restore backup failed: %s", restore_errors)
# We should add more specific error handling here in the future
on_progress(
RestoreBackupEvent(
reason="unknown_error",
stage=stage,
state=RestoreBackupState.FAILED,
)
)
else:
on_progress(
RestoreBackupEvent(
reason=None, stage=stage, state=RestoreBackupState.COMPLETED
)
)
)
on_progress(IdleEvent())
unsub()
@@ -556,7 +635,7 @@ class SupervisorBackupReaderWriter(BackupReaderWriter):
@callback
def _async_listen_job_events(
self, job_id: str, on_event: Callable[[Mapping[str, Any]], None]
self, job_id: UUID, on_event: Callable[[Mapping[str, Any]], None]
) -> Callable[[], None]:
"""Listen for job events."""
@@ -571,7 +650,7 @@ class SupervisorBackupReaderWriter(BackupReaderWriter):
if (
data.get("event") != "job"
or not (event_data := data.get("data"))
or event_data.get("uuid") != job_id
or event_data.get("uuid") != job_id.hex
):
return
on_event(event_data)
@@ -582,10 +661,10 @@ class SupervisorBackupReaderWriter(BackupReaderWriter):
return unsub
async def _get_job_state(
self, job_id: str, on_event: Callable[[Mapping[str, Any]], None]
self, job_id: UUID, on_event: Callable[[Mapping[str, Any]], None]
) -> None:
"""Poll a job for its state."""
job = await self._client.jobs.get_job(UUID(job_id))
job = await self._client.jobs.get_job(job_id)
_LOGGER.debug("Job state: %s", job)
on_event(job.to_dict())
@@ -613,10 +692,20 @@ async def backup_addon_before_update(
else:
password = None
def addon_update_backup_filter(
backups: dict[str, ManagerBackup],
) -> dict[str, ManagerBackup]:
"""Return addon update backups."""
return {
backup_id: backup
for backup_id, backup in backups.items()
if backup.extra_metadata.get(TAG_ADDON_UPDATE) == addon
}
try:
await backup_manager.async_create_backup(
agent_ids=[await _default_agent(client)],
extra_metadata={"supervisor.addon_update": addon},
extra_metadata={TAG_ADDON_UPDATE: addon},
include_addons=[addon],
include_all_addons=False,
include_database=False,
@@ -627,6 +716,14 @@ async def backup_addon_before_update(
)
except BackupManagerError as err:
raise HomeAssistantError(f"Error creating backup: {err}") from err
else:
try:
await backup_manager.async_delete_filtered_backups(
include_filter=addon_update_backup_filter,
delete_filter=lambda backups: backups,
)
except BackupManagerError as err:
raise HomeAssistantError(f"Error deleting old backups: {err}") from err
async def backup_core_before_update(hass: HomeAssistant) -> None:

@@ -6,6 +6,6 @@
"documentation": "https://www.home-assistant.io/integrations/hassio",
"iot_class": "local_polling",
"quality_scale": "internal",
"requirements": ["aiohasupervisor==0.2.2b6"],
"requirements": ["aiohasupervisor==0.3.0"],
"single_config_entry": true
}

@@ -37,11 +37,24 @@ async def async_setup_entry(hass: HomeAssistant, entry: HeosConfigEntry) -> bool
for device in device_registry.devices.get_devices_for_config_entry_id(
entry.entry_id
):
for domain, player_id in device.identifiers:
if domain == DOMAIN and not isinstance(player_id, str):
device_registry.async_update_device( # type: ignore[unreachable]
device.id, new_identifiers={(DOMAIN, str(player_id))}
for ident in device.identifiers:
if ident[0] != DOMAIN or isinstance(ident[1], str):
continue
player_id = int(ident[1]) # type: ignore[unreachable]
# Create set of identifiers excluding this integration
identifiers = {ident for ident in device.identifiers if ident[0] != DOMAIN}
migrated_identifiers = {(DOMAIN, str(player_id))}
# Add migrated if not already present in another device, which occurs if the user downgraded and then upgraded
if not device_registry.async_get_device(migrated_identifiers):
identifiers.update(migrated_identifiers)
if len(identifiers) > 0:
device_registry.async_update_device(
device.id, new_identifiers=identifiers
)
else:
device_registry.async_remove_device(device.id)
break
coordinator = HeosCoordinator(hass, entry)

@@ -8,7 +8,7 @@
"iot_class": "local_push",
"loggers": ["pyheos"],
"quality_scale": "silver",
"requirements": ["pyheos==1.0.1"],
"requirements": ["pyheos==1.0.2"],
"single_config_entry": true,
"ssdp": [
{

@@ -5,5 +5,5 @@
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/holiday",
"iot_class": "local_polling",
"requirements": ["holidays==0.65", "babel==2.15.0"]
"requirements": ["holidays==0.66", "babel==2.15.0"]
}

@@ -1,6 +1,7 @@
"""Constants for the homee integration."""
from homeassistant.const import (
DEGREE,
LIGHT_LUX,
PERCENTAGE,
REVOLUTIONS_PER_MINUTE,
@@ -32,6 +33,7 @@ HOMEE_UNIT_TO_HA_UNIT = {
"W": UnitOfPower.WATT,
"m/s": UnitOfSpeed.METERS_PER_SECOND,
"km/h": UnitOfSpeed.KILOMETERS_PER_HOUR,
"°": DEGREE,
"°F": UnitOfTemperature.FAHRENHEIT,
"°C": UnitOfTemperature.CELSIUS,
"K": UnitOfTemperature.KELVIN,
@@ -51,7 +53,7 @@ OPEN_CLOSE_MAP_REVERSED = {
0.0: "closed",
1.0: "open",
2.0: "partial",
3.0: "cosing",
3.0: "closing",
4.0: "opening",
}
WINDOW_MAP = {

@@ -78,6 +78,7 @@ from .const import (
CONF_VIDEO_CODEC,
CONF_VIDEO_MAP,
CONF_VIDEO_PACKET_SIZE,
CONF_VIDEO_PROFILE_NAMES,
DEFAULT_AUDIO_CODEC,
DEFAULT_AUDIO_MAP,
DEFAULT_AUDIO_PACKET_SIZE,
@@ -90,6 +91,7 @@ from .const import (
DEFAULT_VIDEO_CODEC,
DEFAULT_VIDEO_MAP,
DEFAULT_VIDEO_PACKET_SIZE,
DEFAULT_VIDEO_PROFILE_NAMES,
DOMAIN,
FEATURE_ON_OFF,
FEATURE_PLAY_PAUSE,
@@ -163,6 +165,9 @@ CAMERA_SCHEMA = BASIC_INFO_SCHEMA.extend(
vol.Optional(CONF_VIDEO_CODEC, default=DEFAULT_VIDEO_CODEC): vol.In(
VALID_VIDEO_CODECS
),
vol.Optional(CONF_VIDEO_PROFILE_NAMES, default=DEFAULT_VIDEO_PROFILE_NAMES): [
cv.string
],
vol.Optional(
CONF_AUDIO_PACKET_SIZE, default=DEFAULT_AUDIO_PACKET_SIZE
): cv.positive_int,

@@ -25,7 +25,9 @@ async def async_setup_entry(hass: HomeAssistant, entry: HomeWizardConfigEntry) -
api: HomeWizardEnergy
if token := entry.data.get(CONF_TOKEN):
is_battery = entry.unique_id.startswith("HWE-BAT") if entry.unique_id else False
if (token := entry.data.get(CONF_TOKEN)) and is_battery:
api = HomeWizardEnergyV2(
entry.data[CONF_IP_ADDRESS],
token=token,
@@ -37,7 +39,8 @@ async def async_setup_entry(hass: HomeAssistant, entry: HomeWizardConfigEntry) -
clientsession=async_get_clientsession(hass),
)
await async_check_v2_support_and_create_issue(hass, entry)
if is_battery:
await async_check_v2_support_and_create_issue(hass, entry)
coordinator = HWEnergyDeviceUpdateCoordinator(hass, api)
try:

@@ -272,9 +272,14 @@ class HomeWizardConfigFlow(ConfigFlow, domain=DOMAIN):
) -> ConfigFlowResult:
"""Handle reconfiguration of the integration."""
errors: dict[str, str] = {}
reconfigure_entry = self._get_reconfigure_entry()
if user_input:
try:
device_info = await async_try_connect(user_input[CONF_IP_ADDRESS])
device_info = await async_try_connect(
user_input[CONF_IP_ADDRESS],
token=reconfigure_entry.data.get(CONF_TOKEN),
)
except RecoverableError as ex:
LOGGER.error(ex)
@@ -288,7 +293,6 @@ class HomeWizardConfigFlow(ConfigFlow, domain=DOMAIN):
self._get_reconfigure_entry(),
data_updates=user_input,
)
reconfigure_entry = self._get_reconfigure_entry()
return self.async_show_form(
step_id="reconfigure",
data_schema=vol.Schema(
@@ -306,7 +310,7 @@ class HomeWizardConfigFlow(ConfigFlow, domain=DOMAIN):
)
async def async_try_connect(ip_address: str) -> Device:
async def async_try_connect(ip_address: str, token: str | None = None) -> Device:
"""Try to connect.
Make connection with device to test the connection
@@ -317,7 +321,7 @@ async def async_try_connect(ip_address: str) -> Device:
# Determine if device is v1 or v2 capable
if await has_v2_api(ip_address):
energy_api = HomeWizardEnergyV2(ip_address)
energy_api = HomeWizardEnergyV2(ip_address, token=token)
else:
energy_api = HomeWizardEnergyV1(ip_address)

@@ -12,6 +12,6 @@
"iot_class": "local_polling",
"loggers": ["homewizard_energy"],
"quality_scale": "platinum",
"requirements": ["python-homewizard-energy==v8.3.0"],
"requirements": ["python-homewizard-energy==v8.3.2"],
"zeroconf": ["_hwenergy._tcp.local.", "_homewizard._tcp.local."]
}

@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/hydrawise",
"iot_class": "cloud_polling",
"loggers": ["pydrawise"],
"requirements": ["pydrawise==2025.1.0"]
"requirements": ["pydrawise==2025.2.0"]
}

@@ -87,7 +87,7 @@ class IdasenDeskConfigFlow(ConfigFlow, domain=DOMAIN):
if discovery := self._discovery_info:
self._discovered_devices[discovery.address] = discovery
else:
current_addresses = self._async_current_ids()
current_addresses = self._async_current_ids(include_ignore=False)
for discovery in async_discovered_service_info(self.hass):
if (
discovery.address in current_addresses

@@ -7,5 +7,5 @@
"documentation": "https://www.home-assistant.io/integrations/imap",
"iot_class": "cloud_push",
"loggers": ["aioimaplib"],
"requirements": ["aioimaplib==2.0.0"]
"requirements": ["aioimaplib==2.0.1"]
}

@@ -1,6 +1,6 @@
{
"domain": "incomfort",
"name": "Intergas InComfort/Intouch Lan2RF gateway",
"name": "Intergas gateway",
"codeowners": ["@jbouwh"],
"config_flow": true,
"dhcp": [

@@ -2,20 +2,20 @@
"config": {
"step": {
"user": {
"description": "Set up new Intergas InComfort Lan2RF Gateway, some older systems might not need credentials to be set up. For newer devices authentication is required.",
"description": "Set up new Intergas gateway, some older systems might not need credentials to be set up. For newer devices authentication is required.",
"data": {
"host": "[%key:common::config_flow::data::host%]",
"username": "[%key:common::config_flow::data::username%]",
"password": "[%key:common::config_flow::data::password%]"
},
"data_description": {
"host": "Hostname or IP-address of the Intergas InComfort Lan2RF Gateway.",
"host": "Hostname or IP-address of the Intergas gateway.",
"username": "The username to log into the gateway. This is `admin` in most cases.",
"password": "The password to log into the gateway, is printed at the bottom of the Lan2RF Gateway or is `intergas` for some older devices."
"password": "The password to log into the gateway, is printed at the bottom of the gateway or is `intergas` for some older devices."
}
},
"dhcp_auth": {
"title": "Set up Intergas InComfort Lan2RF Gateway",
"title": "Set up Intergas gateway",
"description": "Please enter authentication details for gateway {host}",
"data": {
"username": "[%key:common::config_flow::data::username%]",
@@ -23,12 +23,12 @@
},
"data_description": {
"username": "The username to log into the gateway. This is `admin` in most cases.",
"password": "The password to log into the gateway, is printed at the bottom of the Lan2RF Gateway or is `intergas` for some older devices."
"password": "The password to log into the gateway, is printed at the bottom of the Gateway or is `intergas` for some older devices."
}
},
"dhcp_confirm": {
"title": "Set up Intergas InComfort Lan2RF Gateway",
"description": "Do you want to set up the discovered Intergas InComfort Lan2RF Gateway ({host})?"
"title": "Set up Intergas gateway",
"description": "Do you want to set up the discovered Intergas gateway ({host})?"
},
"reauth_confirm": {
"data": {
@@ -48,9 +48,9 @@
"error": {
"auth_error": "Invalid credentials.",
"no_heaters": "No heaters found.",
"not_found": "No Lan2RF gateway found.",
"timeout_error": "Time out when connecting to Lan2RF gateway.",
"unknown": "Unknown error when connecting to Lan2RF gateway."
"not_found": "No gateway found.",
"timeout_error": "Time out when connecting to the gateway.",
"unknown": "Unknown error when connecting to the gateway."
}
},
"exceptions": {
@@ -70,7 +70,7 @@
"options": {
"step": {
"init": {
"title": "Intergas InComfort Lan2RF Gateway options",
"title": "Intergas gateway options",
"data": {
"legacy_setpoint_status": "Legacy setpoint handling"
},

@@ -72,7 +72,7 @@ class INKBIRDConfigFlow(ConfigFlow, domain=DOMAIN):
title=self._discovered_devices[address], data={}
)
current_addresses = self._async_current_ids()
current_addresses = self._async_current_ids(include_ignore=False)
for discovery_info in async_discovered_service_info(self.hass, False):
address = discovery_info.address
if address in current_addresses or address in self._discovered_devices:

@@ -7,6 +7,6 @@
"integration_type": "service",
"iot_class": "local_polling",
"loggers": ["jellyfin_apiclient_python"],
"requirements": ["jellyfin-apiclient-python==1.9.2"],
"requirements": ["jellyfin-apiclient-python==1.10.0"],
"single_config_entry": true
}

@@ -12,7 +12,7 @@
"requirements": [
"xknx==3.5.0",
"xknxproject==3.8.1",
"knx-frontend==2025.1.28.225404"
"knx-frontend==2025.1.30.194235"
],
"single_config_entry": true
}

@@ -10,8 +10,8 @@ from lacrosse_view import HTTPError, LaCrosse, Location, LoginError, Sensor
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import SCAN_INTERVAL
@@ -26,6 +26,7 @@ class LaCrosseUpdateCoordinator(DataUpdateCoordinator[list[Sensor]]):
name: str
id: str
hass: HomeAssistant
devices: list[Sensor] | None = None
def __init__(
self,
@@ -60,24 +61,34 @@ class LaCrosseUpdateCoordinator(DataUpdateCoordinator[list[Sensor]]):
except LoginError as error:
raise ConfigEntryAuthFailed from error
if self.devices is None:
_LOGGER.debug("Getting devices")
try:
self.devices = await self.api.get_devices(
location=Location(id=self.id, name=self.name),
)
except HTTPError as error:
raise UpdateFailed from error
try:
# Fetch last hour of data
sensors = await self.api.get_sensors(
location=Location(id=self.id, name=self.name),
tz=self.hass.config.time_zone,
start=str(now - 3600),
end=str(now),
)
except HTTPError as error:
raise ConfigEntryNotReady from error
for sensor in self.devices:
sensor.data = (
await self.api.get_sensor_status(
sensor=sensor,
tz=self.hass.config.time_zone,
)
)["data"]["current"]
_LOGGER.debug("Got data: %s", sensor.data)
_LOGGER.debug("Got data: %s", sensors)
except HTTPError as error:
raise UpdateFailed from error
# Verify that we have permission to read the sensors
for sensor in sensors:
for sensor in self.devices:
if not sensor.permissions.get("read", False):
raise ConfigEntryAuthFailed(
f"This account does not have permission to read {sensor.name}"
)
return sensors
return self.devices

@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/lacrosse_view",
"iot_class": "cloud_polling",
"loggers": ["lacrosse_view"],
"requirements": ["lacrosse-view==1.0.3"]
"requirements": ["lacrosse-view==1.1.1"]
}

@@ -45,10 +45,10 @@ class LaCrosseSensorEntityDescription(SensorEntityDescription):
def get_value(sensor: Sensor, field: str) -> float | int | str | None:
"""Get the value of a sensor field."""
field_data = sensor.data.get(field)
field_data = sensor.data.get(field) if sensor.data is not None else None
if field_data is None:
return None
value = field_data["values"][-1]["s"]
value = field_data["spot"]["value"]
try:
value = float(value)
except ValueError:
@@ -178,7 +178,7 @@ async def async_setup_entry(
continue
# if the API returns a different unit of measurement from the description, update it
if sensor.data.get(field) is not None:
if sensor.data is not None and sensor.data.get(field) is not None:
native_unit_of_measurement = UNIT_OF_MEASUREMENT_MAP.get(
sensor.data[field].get("unit")
)
@@ -240,7 +240,9 @@ class LaCrosseViewSensor(
@property
def available(self) -> bool:
"""Return True if entity is available."""
data = self.coordinator.data[self.index].data
return (
super().available
and self.entity_description.key in self.coordinator.data[self.index].data
and data is not None
and self.entity_description.key in data
)

@@ -8,5 +8,5 @@
"documentation": "https://www.home-assistant.io/integrations/lcn",
"iot_class": "local_push",
"loggers": ["pypck"],
"requirements": ["pypck==0.8.3", "lcn-frontend==0.2.3"]
"requirements": ["pypck==0.8.5", "lcn-frontend==0.2.3"]
}

@@ -20,5 +20,5 @@
   "documentation": "https://www.home-assistant.io/integrations/ld2410_ble",
   "integration_type": "device",
   "iot_class": "local_push",
-  "requirements": ["bluetooth-data-tools==1.22.0", "ld2410-ble==0.1.1"]
+  "requirements": ["bluetooth-data-tools==1.23.4", "ld2410-ble==0.1.1"]
 }


@@ -35,5 +35,5 @@
   "dependencies": ["bluetooth_adapters"],
   "documentation": "https://www.home-assistant.io/integrations/led_ble",
   "iot_class": "local_polling",
-  "requirements": ["bluetooth-data-tools==1.22.0", "led-ble==1.1.4"]
+  "requirements": ["bluetooth-data-tools==1.23.4", "led-ble==1.1.6"]
 }


@@ -113,7 +113,7 @@ def find_hsbk(hass: HomeAssistant, **kwargs: Any) -> list[float | int | None] |
         saturation = int(saturation / 100 * 65535)
         kelvin = 3500
-    if _ATTR_COLOR_TEMP in kwargs:
+    if ATTR_COLOR_TEMP_KELVIN not in kwargs and _ATTR_COLOR_TEMP in kwargs:
         # added in 2025.1, can be removed in 2026.1
         _LOGGER.warning(
             "The 'color_temp' parameter is deprecated. Please use 'color_temp_kelvin' for"

@@ -277,20 +277,6 @@ FOUR_GROUP_REMOTE_TRIGGER_SCHEMA = LUTRON_BUTTON_TRIGGER_SCHEMA.extend(
     }
 )
-PADDLE_SWITCH_PICO_BUTTON_TYPES_TO_LIP = {
-    "button_0": 2,
-    "button_2": 4,
-}
-PADDLE_SWITCH_PICO_BUTTON_TYPES_TO_LEAP = {
-    "button_0": 0,
-    "button_2": 2,
-}
-PADDLE_SWITCH_PICO_TRIGGER_SCHEMA = LUTRON_BUTTON_TRIGGER_SCHEMA.extend(
-    {
-        vol.Required(CONF_SUBTYPE): vol.In(PADDLE_SWITCH_PICO_BUTTON_TYPES_TO_LIP),
-    }
-)
 DEVICE_TYPE_SCHEMA_MAP = {
     "Pico2Button": PICO_2_BUTTON_TRIGGER_SCHEMA,
@@ -302,7 +288,6 @@ DEVICE_TYPE_SCHEMA_MAP = {
     "Pico4ButtonZone": PICO_4_BUTTON_ZONE_TRIGGER_SCHEMA,
     "Pico4Button2Group": PICO_4_BUTTON_2_GROUP_TRIGGER_SCHEMA,
     "FourGroupRemote": FOUR_GROUP_REMOTE_TRIGGER_SCHEMA,
-    "PaddleSwitchPico": PADDLE_SWITCH_PICO_TRIGGER_SCHEMA,
 }
 DEVICE_TYPE_SUBTYPE_MAP_TO_LIP = {
@@ -315,7 +300,6 @@ DEVICE_TYPE_SUBTYPE_MAP_TO_LIP = {
     "Pico4ButtonZone": PICO_4_BUTTON_ZONE_BUTTON_TYPES_TO_LIP,
     "Pico4Button2Group": PICO_4_BUTTON_2_GROUP_BUTTON_TYPES_TO_LIP,
     "FourGroupRemote": FOUR_GROUP_REMOTE_BUTTON_TYPES_TO_LIP,
-    "PaddleSwitchPico": PADDLE_SWITCH_PICO_BUTTON_TYPES_TO_LIP,
 }
 DEVICE_TYPE_SUBTYPE_MAP_TO_LEAP = {
@@ -328,7 +312,6 @@ DEVICE_TYPE_SUBTYPE_MAP_TO_LEAP = {
     "Pico4ButtonZone": PICO_4_BUTTON_ZONE_BUTTON_TYPES_TO_LEAP,
     "Pico4Button2Group": PICO_4_BUTTON_2_GROUP_BUTTON_TYPES_TO_LEAP,
     "FourGroupRemote": FOUR_GROUP_REMOTE_BUTTON_TYPES_TO_LEAP,
-    "PaddleSwitchPico": PADDLE_SWITCH_PICO_BUTTON_TYPES_TO_LEAP,
 }
 LEAP_TO_DEVICE_TYPE_SUBTYPE_MAP: dict[str, dict[int, str]] = {
@@ -343,7 +326,6 @@ TRIGGER_SCHEMA = vol.Any(
     PICO_4_BUTTON_ZONE_TRIGGER_SCHEMA,
     PICO_4_BUTTON_2_GROUP_TRIGGER_SCHEMA,
     FOUR_GROUP_REMOTE_TRIGGER_SCHEMA,
-    PADDLE_SWITCH_PICO_TRIGGER_SCHEMA,
 )
