Compare commits

...

495 Commits

Author SHA1 Message Date
Chris Talkington 1ee6f5e24a more tweaks 2024-12-20 19:21:58 -06:00
Chris Talkington dd2b092b70 more tweaks 2024-12-20 18:53:58 -06:00
Chris Talkington 1122217326 freeze media player 2024-12-19 23:23:16 -06:00
Chris Talkington 57aaee3dbc work on tests 2024-12-19 22:48:19 -06:00
Chris Talkington cba83620b9 improve code quality for roku 2024-12-19 20:29:08 -06:00
Christopher Fenner afae257a12 Bump PyViCare to 2.39.1 (#133619) 2024-12-20 01:14:48 +01:00
Quentame 64aba0c1a3 Bump Freebox to 1.2.1 (#133455) 2024-12-20 00:48:03 +01:00
J. Nick Koston 551a584ca6 Handle mqtt.WebsocketConnectionError when connecting to the MQTT broker (#133610)
fixes #132985
2024-12-19 21:39:39 +01:00
Jan-Philipp Benecke b261c7f18a Mark docs-installation-parameters for SABnzbd as done (#133609) 2024-12-19 20:29:12 +01:00
Joost Lekkerkerker 61e5f10d12 Fix Twinkly raise on progress (#133601) 2024-12-19 20:27:08 +01:00
adam-the-hero 2413fc4c0d Fix Watergate Water meter volume sensor (#133606) 2024-12-19 20:25:24 +01:00
Abílio Costa e6ef3fe507 Update Idasen Desk user flow step strings (#133605) 2024-12-19 20:24:10 +01:00
J. Nick Koston 04bcc8d3d3 Bump yalexs-ble to 2.5.6 (#133593) 2024-12-19 09:13:51 -10:00
Joost Lekkerkerker 52683c5f75 Improve Airgradient config flow tests (#133594) 2024-12-19 19:58:33 +01:00
Raphael Hehl 2f77cda822 Add basic UniFi Protect AiPort support (#133523)
* UnifiProtect add basic support for AiPort devices

* Sort ignore-words

---------

Co-authored-by: J. Nick Koston <nick@koston.org>
2024-12-19 08:18:21 -10:00
Marcel van der Veldt a97434976e Handle null value for elapsed time in Music Assistant (#133597) 2024-12-19 19:00:18 +01:00
epenet e357e0a406 Set default min/max color temperature in template lights (#133549) 2024-12-19 18:40:04 +01:00
Andrew Jackson 1a068d99d6 Add data descriptions to Mealie integration (#133590) 2024-12-19 18:28:50 +01:00
Joost Lekkerkerker 95b3d27b60 Update Airgradient quality scale (#133569) 2024-12-19 18:23:40 +01:00
Allen Porter a3ef3cce3e Improve Google Tasks coordinator updates behavior (#133316) 2024-12-19 16:41:47 +01:00
Erik Montnemery 255f85eb2f Fix boot loop after restoring backup (#133581) 2024-12-19 16:04:59 +01:00
Josef Zweck 94c7d18346 Bump pylamarzocco to 1.4.1 (#133557) 2024-12-19 13:36:32 +01:00
Noah Husby eb8ee1339c Set Russound RIO quality scale to silver (#133494) 2024-12-19 12:40:23 +01:00
Stefan Agner 962f1bad32 Add mW as unit of measurement for Matter electrical power sensors (#133504) 2024-12-19 11:40:05 +00:00
Erik Montnemery dd215b3d5d Revert "Revert "Simplify recorder RecorderRunsManager (#131785)"" (#133564)
Revert "Revert "Simplify recorder RecorderRunsManager" (#133201)"

This reverts commit 980b8a91e6.
2024-12-19 12:32:15 +01:00
Erik Montnemery bb7abd037c Revert "Revert "Improve recorder history queries (#131702)"" (#133561)
Revert "Revert "Improve recorder history queries (#131702)" (#133203)"

This reverts commit 74e4654c26.
2024-12-19 11:50:12 +01:00
J. Nick Koston d35b34f142 Replace start time state query with single correlated scalar subquery (#133553) 2024-12-19 00:14:32 -10:00
dependabot[bot] 1c119518db Bump codecov/codecov-action from 5.1.1 to 5.1.2 (#133547)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 5.1.1 to 5.1.2.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](https://github.com/codecov/codecov-action/compare/v5.1.1...v5.1.2)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-19 10:52:10 +01:00
Norbert Rittel 9a6c749714 Change 'GSuite' to 'Workspace', fix 'Start' field label (#133554)
* Change 'GSuite' to 'Workspace', fix 'Start' field label

Several years ago Google renamed "G Suite" to "Google Workspace"; this commit applies the same change to one of the field descriptions of the set_vacation action.

In addition, the "Start" field of the action currently uses the common action (!) label for "Start", which is wrong in this context, where it stands for the beginning.

This commit changes this back to a local definition of this label, just like "End".

In German, for example, "Start" needs to be "Beginn" in this context, while the common action is translated as "Starten".

* Use "Google Workspace" for more clarity

Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>

---------

Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2024-12-19 10:51:30 +01:00
Norbert Rittel 79484ea7f5 Grammar fixes for action names and descriptions (#133559)
Several KNX actions have a spurious "s" at the end of their verbs, while that "s" is missing from several of the descriptions.

This commit changes all of those to make them consistent with the remaining KNX actions and the standard terminology in Home Assistant.
2024-12-19 10:50:12 +01:00
Franck Nijhof 3568bdca65 Update Home Assistant base image to 2024.12.0 (#133558) 2024-12-19 10:48:43 +01:00
Erik Montnemery a76f82080b Create repair issues when automatic backup fails (#133513)
* Create repair issues when automatic backup fails

* Improve test coverage

* Adjust issues
2024-12-19 10:40:07 +01:00
Christopher Fenner cd384cadbe Fulfill IQS rule config-flow in ViCare integration (#133524)
* add data_description

* Apply suggestions from code review

Co-authored-by: Josef Zweck <josef@zweck.dev>

---------

Co-authored-by: Josef Zweck <josef@zweck.dev>
2024-12-19 10:04:26 +01:00
J. Nick Koston 69a8d3f3c1 Revert "Optimize start time state queries for PostgreSQL" (#133555) 2024-12-18 23:01:58 -10:00
J. Nick Koston a3fb6e8f92 Bump pydantic to 2.10.4 (#133539)
changelog: https://github.com/pydantic/pydantic/compare/v2.10.3...v2.10.4
2024-12-19 10:01:40 +01:00
Erik Montnemery c8480627ca Add comment motivating magic number for MySQL error codes (#133516)
* Add comment motivating magic number for MySQL error codes

* Pick nits
2024-12-19 09:56:32 +01:00
Franck Nijhof 893f605d61 Revert "Update docker base image to 2024.12.1" (#133552)
Revert "Update docker base image to 2024.12.1 (#133323)"

This reverts commit 66dcd38701.
2024-12-19 09:42:22 +01:00
epenet ddd2ba6c4a Set default min/max color temperature in hue lights (#133548) 2024-12-19 08:36:29 +01:00
Stefan Agner 681863f80e Use mV and mA as units for electrical power measurement in Matter (#133505) 2024-12-19 08:32:46 +01:00
J. Nick Koston 99698ef95d Optimize start time state queries for PostgreSQL (#133228) 2024-12-18 19:41:53 -10:00
Franck Nijhof 3fe08a7223 Add zeroconf discovery to Peblar Rocksolid EV chargers (#133529) 2024-12-19 00:39:14 +01:00
J. Nick Koston 35601480d2 Bump aiohttp to 3.11.11 (#133530) 2024-12-18 23:48:39 +01:00
Abílio Costa 0076bd8389 Simplify Idasen Desk entity properties (#133536) 2024-12-18 23:47:24 +01:00
Franck Nijhof 9f3c549f8d Add integration setup tests to Peblar Rocksolid EV Chargers (#133532) 2024-12-18 23:46:18 +01:00
Norbert Rittel 03707e6308 Improve field descriptions for Download file action (#133413)
* Improve field descriptions for Download file action

Currently, two of the field descriptions for the Download file action don't explain exactly what should be entered but instead read like descriptions of additional actions.

The third, the Overwrite file option, is misleading because it does not refer to an existing file.

This commit fixes both issues by explaining the purpose of all three fields in slightly more detail.

* Update homeassistant/components/downloader/strings.json

Co-authored-by: Josef Zweck <josef@zweck.dev>

* Update homeassistant/components/downloader/strings.json

Co-authored-by: Josef Zweck <josef@zweck.dev>

---------

Co-authored-by: Josef Zweck <josef@zweck.dev>
2024-12-18 22:40:30 +01:00
Abílio Costa 9e6a8638dd Bump idasen-ha to 2.6.3 (#133508)
This is a minor bump that adds py.typed
2024-12-18 22:38:57 +01:00
Norbert Rittel 2a9082559a Fix names and description of two actions (#133528)
The two actions enable_motion_recording and disable_motion_recording use "Enables" and "Disables" in their names.

This is inconsistent with all other actions of this component and with the standard way of naming them.

In addition, the description of the latter is missing the "s", which causes further inconsistency, especially in translations.
2024-12-18 22:35:58 +01:00
starkillerOG ba3fca53b0 Reolink platinum quality scale (#133514) 2024-12-18 21:49:32 +01:00
Raphael Hehl e4bb351d2d Bump uiprotect to 7.1.0 (#133520)
* Bump uiprotect to version 7.1.0

* Add aiports to bootstrap fixture in unifiprotect tests
2024-12-18 21:41:22 +01:00
Christopher Fenner 1bdda0249e Bump PyViCare to 2.39.0 (#133519) 2024-12-18 21:38:52 +01:00
Erik Montnemery ff8bc763c3 Ensure indices needed by data migrators exist (#133367)
* Ensure indices needed by data migrators exist

* Update test

* Improve test

* Ignore index error on char(0) columns

* Adjust tests

* Address review comments

* Add comment motivating magic number
2024-12-18 21:29:52 +01:00
dontinelli 8a8be71f96 Add tests for cover and increase test coverage for slide_local (#133515) 2024-12-18 20:53:05 +01:00
starkillerOG 19e6867f1a Reolink translate errors (#132301) 2024-12-18 20:22:33 +01:00
Norbert Rittel c8f050ecbc Fix the local_file.update_file_path action's name and description (#133509) 2024-12-18 20:08:57 +01:00
IceBotYT b7ff27122a Add support for Nice G.O. HAE00080 wall station (#133186) 2024-12-18 19:47:41 +01:00
Shay Levy 3a8b0b3ea6 Use Switcher _async_call_api in climate (#133230) 2024-12-18 19:46:52 +01:00
mvn23 0ff2a0d66d Add "cancel room setpoint override" button to opentherm_gw (#132162) 2024-12-18 19:46:30 +01:00
Joakim Plate 4daf6dd41d Bump gardena_bluetooth to 1.5.0 (#133502) 2024-12-18 19:39:35 +01:00
Thomas55555 51bead3229 Update number platform values before add in APSystems and add tests (#131938)
Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>
2024-12-18 19:34:49 +01:00
Manu 352e948d56 Add tests for already_configured errors in IronOS integration (#132265) 2024-12-18 19:33:33 +01:00
Manu 70ad4ee454 Add select platform to IronOS (#132218) 2024-12-18 19:32:51 +01:00
TJ Horner 53ef96c63e weatherkit: use stale data for up to an hour if updates fail (#130398) 2024-12-18 19:21:03 +01:00
Franck Nijhof bb2d027532 Add Peblar Rocksolid EV Chargers integration (#133501)
* Add Peblar Rocksolid EV Chargers integration

* Process review comments
2024-12-18 19:11:13 +01:00
Erik Montnemery 51d63ba508 Store automatic backup flag in backup metadata (#133500) 2024-12-18 18:30:46 +01:00
Arie Catsman fc622e398f add exception translation to enphase_envoy (#132483) 2024-12-18 18:24:12 +01:00
peteS-UK 920de90603 Increase Squeezebox config_flow test coverage to 100% (#133484)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2024-12-18 18:22:22 +01:00
Joakim Plate a6089b497a Update fjäråskupan to 2.3.2 (#133499) 2024-12-18 18:03:27 +01:00
Erik Montnemery 5516f3609d Rename strategy backup to automatic backup (#133489)
* Rename strategy backup to automatic backup

* Update homeassistant/components/backup/config.py

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>

---------

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2024-12-18 17:35:11 +01:00
Joakim Plate a1558213c4 Update fjäråskupan to 2.3.1 (#133493) 2024-12-18 16:53:15 +01:00
Luke Lashley 2564533dae Update Roborock to 2.8.1 (#133492) 2024-12-18 16:22:39 +01:00
Noah Husby f46e764982 Update quality scale for Russound RIO (#133093) 2024-12-18 16:06:48 +01:00
dontinelli d6c201de4a Add exceptions and translations for slide_local (#133490) 2024-12-18 15:33:11 +01:00
mkmer c9f1829c0b Add (de)humidifier platform to Honeywell (#132287)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2024-12-18 15:27:40 +01:00
dontinelli 1e075cdac7 Add diagnostics to slide_local (#133488) 2024-12-18 15:21:17 +01:00
Philip Baylas fce6d6246f Change log level of connection failure to info (#132625)
Co-authored-by: Franck Nijhof <git@frenck.dev>
2024-12-18 15:07:03 +01:00
Maciej Bieniek 3132700492 Add ability to translate ENUM sensor states in Unifi integration (#131921) 2024-12-18 15:02:44 +01:00
adam-the-hero 943b1d9f08 Add sensors platform to Watergate integration (#133015) 2024-12-18 14:52:25 +01:00
Markus Jacobsen 2d6d313e5c Complete adding custom integration action sections support to hassfest (#132443) 2024-12-18 14:50:12 +01:00
Guido Schmitz 9716183997 Add entity translations to devolo Home Control (#132927) 2024-12-18 14:38:29 +01:00
Andre Lengwenus a46a0ad2b4 Add device_id parameter to LCN actions (service calls) (#129590) 2024-12-18 14:35:02 +01:00
J. Diego Rodríguez Royo c06bc53724 Deprecate Home Connect program switches (#131641) 2024-12-18 14:26:37 +01:00
Bas Brussee 4399d09820 Allow data description in sections (#128965)
* Allow data description in sections

* update format with ruff

* Add data_description to kitchen_sink input section

---------

Co-authored-by: Erik <erik@montnemery.com>
2024-12-18 14:02:08 +01:00
Abílio Costa ca2c7280eb Remove unneeded logger param from Idasen Desk Coordinator (#133485) 2024-12-18 13:59:56 +01:00
Erik Montnemery ecb3bf79f3 Revert "Add support for subentries to config entries" (#133470)
Revert "Add support for subentries to config entries (#117355)"

This reverts commit ad15786115.
2024-12-18 13:51:05 +01:00
Joost Lekkerkerker 2aba1d399b Rename test file to singular form (#133482) 2024-12-18 12:47:30 +00:00
greyeee be25cb7aa7 Add support for SwitchBot Relay Switch 1 and Relay Switch 1PM (#132327) 2024-12-18 13:19:45 +01:00
Mick Vleeshouwer 3bb6256572 Add test button for SmokeSensor in Overkiz (#133476) 2024-12-18 11:48:10 +01:00
Mick Vleeshouwer fc4100833e Change device class from Volume to Volume Storage in Overkiz (#133473)
Change device class from Volume to Volume Storage
2024-12-18 11:43:04 +01:00
Erik Montnemery 992afc4cd3 Set the with_strategy_settings to None for unknown backups (#133466) 2024-12-18 11:27:07 +01:00
Mick Vleeshouwer 7730f423b3 Add identify device class in Overkiz (#133474) 2024-12-18 11:22:32 +01:00
Mick Vleeshouwer 05b0c56191 Use enum instead of string for button entities key in Overkiz (#133472) 2024-12-18 11:22:22 +01:00
Mick Vleeshouwer fa0e54e658 Don't raise Overkiz user flow unique_id check (#133471) 2024-12-18 11:05:52 +01:00
Joakim Sørensen 869a0d7abc Add name to cloud connection info response (#133468) 2024-12-18 11:01:38 +01:00
dotvav 90208d2eb1 Bump pypalazzetti to 0.1.15 (#133433) 2024-12-18 10:58:25 +01:00
J. Diego Rodríguez Royo a6520d2627 Handle Home Connect error at diagnostics (#131644) 2024-12-18 10:52:45 +01:00
epenet 8b8c409916 Fix test-before-setup IQS check (#133467) 2024-12-18 10:44:19 +01:00
Ron Weikamp a2be5a383c Bugfix: also schedule time based integration when source is 0 (#133438)
* Bugfix: also schedule time based integration when source is 0

* Update tests/components/integration/test_sensor.py

Co-authored-by: Diogo Gomes <diogogomes@gmail.com>

* Improve comment in test. Remove redundant assertion.

---------

Co-authored-by: Diogo Gomes <diogogomes@gmail.com>
2024-12-18 10:41:46 +01:00
Tomer Shemesh 39d781905d Add ssdp discovery to Onkyo (#131066) 2024-12-18 10:21:37 +01:00
Abílio Costa 5fb5e933e2 Use a common base entity for Idasen Desk (#132496)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2024-12-18 10:20:14 +01:00
Mick Vleeshouwer 413a578fdb Bump pyOverkiz to 1.15.3 (#133458) 2024-12-18 10:19:57 +01:00
Jan-Philipp Benecke c1cf0e23b2 Lift SABnzbd to bronze quality scale (#133453) 2024-12-18 10:10:42 +01:00
Noah Husby a449ca65be Improve test coverage for Russound RIO (#133096)
* Improve test coverage for Russound RIO

* Update

* Update
2024-12-18 09:33:17 +01:00
Arie Catsman 4c91d1b402 Add support for ACB batteries to Enphase Envoy (#131298)
* Add support for ACB batteries to Enphase Envoy

* Add tests for ACB battery support in Enphase Envoy

* make acb state sensordeviceclass ENUM

* Capitalize strings and use common idle
2024-12-18 08:48:37 +01:00
Noah Husby fab92d1cf8 Add reconfigure flow to Russound RIO (#133091)
* Add reconfigure flow to Russound RIO

* Mark reconfiguration flow as done

* Update

* Update
2024-12-18 08:40:27 +01:00
Assaf Inbal c10473844f Add sensors to Ituran integration (#133359)
Add sensors to Ituran
2024-12-18 08:36:42 +01:00
dependabot[bot] dfdd83789a Bump actions/upload-artifact from 4.4.3 to 4.5.0 (#133461) 2024-12-18 08:05:39 +01:00
J. Nick Koston 9bff9c5e7b Ensure screenlogic retries if the protocol adapter is still booting (#133444)
* Ensure screenlogic retries if the protocol adapter is still booting

If the protocol adapter is still booting, it will disconnect and never retry.

```
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/config_entries.py", line 640, in __async_setup_with_context
    result = await component.async_setup_entry(hass, self)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/components/screenlogic/__init__.py", line 65, in async_setup_entry
    await gateway.async_connect(**connect_info)
  File "/usr/local/lib/python3.13/site-packages/screenlogicpy/gateway.py", line 142, in async_connect
    connectPkg = await async_connect_to_gateway(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    ...<4 lines>...
    )
    ^
  File "/usr/local/lib/python3.13/site-packages/screenlogicpy/requests/login.py", line 107, in async_connect_to_gateway
    mac_address = await async_gateway_connect(transport, protocol, max_retries)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.13/site-packages/screenlogicpy/requests/login.py", line 77, in async_gateway_connect
    raise ScreenLogicConnectionError("Host unexpectedly disconnected.")
screenlogicpy.const.common.ScreenLogicConnectionError: Host unexpectedly disconnected.
```

* coverage
2024-12-17 20:57:43 -05:00
Abílio Costa e73512e11c Add integration_type to Idasen Desk (#132486)
* Add Idasen Desk quality scale record

* Update wrong checks

* Add integration_type to Idasen Desk
2024-12-17 23:49:04 +01:00
G Johansson 4c60e36f4f Add Get price service to Nord Pool (#130185)
* Add get_price service to Nord Pool

* Tests and fixes

* Fixes

* Not used fixtures

* update qs

* Fixes

* docstring

* Remove selector from strings

* Mod service
2024-12-17 21:59:20 +01:00
G Johansson f8cd6204ca Fix reconfigure in Nord Pool (#133431) 2024-12-17 21:30:49 +01:00
Jan-Philipp Benecke eae25023e7 Do not remove services when last config entry is unloaded in SABnzbd (#133449) 2024-12-17 21:27:41 +01:00
Klaas Schoute 21c3bf48f9 Allow only single instance of easyenergy integration (#133447) 2024-12-17 21:02:39 +01:00
Jan-Philipp Benecke 5014f305bf Mark docs-removal-instructions for SABnzbd as done (#133446) 2024-12-17 20:57:04 +01:00
benjamin-dcs b124ebeb1f Differentiate File integration entries by prefixing the title with the platform instead (#131016)
Differentiate File integration entries by prefixing the title with the platform
2024-12-17 20:54:30 +01:00
jimmyd-be 935bf3fb11 Bump renson-endura-delta to 1.7.2 (#129491) 2024-12-17 20:49:42 +01:00
Louis Christ 9c26654db7 Use entity services in bluesound integration (#129266) 2024-12-17 20:44:38 +01:00
Klaas Schoute c9ca1f63ea Allow only single instance of energyzero integration (#133443) 2024-12-17 20:44:24 +01:00
Jan-Philipp Benecke 5e5bebd7eb Remove unused constants from SABnzbd (#133445) 2024-12-17 20:43:53 +01:00
Richard Kroegel 8bbbbb00d5 Limit unique_id migration to platform for BMW (#131582) 2024-12-17 20:43:09 +01:00
Mick Vleeshouwer a7ba63bf86 Add missing CozyTouch servers to ConfigFlow exception handler in Overkiz (#131696) 2024-12-17 20:22:07 +01:00
G Johansson d785c4b0b1 Add optional category in OptionsFlow to holiday (#129514) 2024-12-17 20:20:26 +01:00
Mick Vleeshouwer e9e8228f07 Improve empty state handling for SomfyThermostat in Overkiz (#131700) 2024-12-17 20:18:16 +01:00
Erik Montnemery d22668a166 Don't run recorder data migration on new databases (#133412)
* Don't run recorder data migration on new databases

* Add tests
2024-12-17 20:02:12 +01:00
Erik Montnemery 633433709f Clean up backups after manual backup (#133434)
* Clean up backups after manual backup

* Address review comments
2024-12-17 20:00:02 +01:00
Artur Pragacz af1222e97b Distinct sources per zone in Onkyo (#130547) 2024-12-17 19:31:25 +01:00
epenet b5f6734197 Simplify modern_forms config flow (part 2) (#130494) 2024-12-17 19:23:54 +01:00
Kevin Stillhammer 98d5020690 Support units and filters in async_get_travel_times_service for waze_travel_time (#130776) 2024-12-17 18:00:23 +01:00
DrBlokmeister da85c497bf Add transmission download path to events + add_torrent service (#121371)
Co-authored-by: Erik Montnemery <erik@montnemery.com>
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2024-12-17 17:48:54 +01:00
Norbert Rittel 1de8d63a63 Remove three duplicated space characters in strings.json (#133436) 2024-12-17 17:48:18 +01:00
Erik Montnemery 89eda9e068 Don't raise when removing non-existing cloud backup (#133429) 2024-12-17 17:47:17 +01:00
Norbert Rittel 3341e3d95b Fix two occurrences of "HomeAssistant" adding the missing space (#133435) 2024-12-17 17:43:56 +01:00
Erik Montnemery 25a63863cb Adapt hassio backup agent to supervisor changes (#133428) 2024-12-17 17:21:13 +01:00
Matthias Alphart 44a86f537f Add quality scale for Fronius (#131770) 2024-12-17 17:12:11 +01:00
Jan-Philipp Benecke d9fb5a7582 Record current IQS state for SABnzbd (#131656)
* Record current IQS state for SABnzbd

* Convert review comments to IQS comments
2024-12-17 17:10:04 +01:00
Krisjanis Lejejs a14aca31e5 Add MFA login flow support for cloud component (#132497)
* Add MFA login flow support for cloud component

* Add tests for cloud MFA login

* Update code to reflect used package changes

* Update code to use underlying package changes

* Remove unused change

* Fix login required parameters

* Fix parameter validation

* Use cv.has_at_least_one_key for param validation

---------

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2024-12-17 16:44:50 +01:00
Franck Nijhof 5b1c5bf9f6 Record current IQS scale for Tailwind (#133158)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2024-12-17 16:34:48 +01:00
Josef Zweck a9f6982ac0 Mark acaia as platinum quality (#131723)
Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>
2024-12-17 15:45:16 +01:00
Josef Zweck 9cc5f7ff84 Mark lamarzocco as platinum quality (#131609) 2024-12-17 15:41:34 +01:00
Erik Montnemery 4adfd52dc0 Improve hassio backup agent test coverage (#133426) 2024-12-17 15:08:03 +01:00
Erik Montnemery 8b3cd41396 Improve hassio backup agent test coverage (#133424) 2024-12-17 13:55:04 +01:00
Cyrill Raccaud 89946348df Add reconfigure to Cookidoo integration (#133144)
* add reconfigure

* merge steps

* comments
2024-12-17 13:54:07 +01:00
Erik Montnemery a4588c80d5 Bump aiohasupervisor to version 0.2.2b2 (#133417)
* Bump aiohasupervisor to version 0.2.2b2

* Update test
2024-12-17 13:18:26 +01:00
epenet e61142c2c2 Check if requirement is typed in strict_typing IQS validation (#133415)
* Check if requirement is typed in strict_typing IQS validation

* Apply suggestions from code review

* Apply suggestions from code review

* Return a list

* Adjust

* Improve
2024-12-17 12:53:27 +01:00
G Johansson 637614299c Fix strptime in python_script (#133159)
Co-authored-by: Erik Montnemery <erik@montnemery.com>
2024-12-17 12:41:18 +01:00
epenet 991864b38c Fix schema translation checks for nested config-flow sections (#133392) 2024-12-17 12:02:53 +01:00
Jonas Fors Lellky ce0117b2b8 Fix fan setpoints for flexit_bacnet (#133388) 2024-12-17 11:36:45 +01:00
Arie Catsman 084ef20695 Add quality_scale.yaml to enphase_envoy (#132489) 2024-12-17 11:33:04 +01:00
epenet 0dbd5bffe6 Fix incorrect schema in config tests (#133404) 2024-12-17 11:26:51 +01:00
G Johansson d8e853941a Bump holidays to 0.63 (#133391) 2024-12-17 11:10:38 +01:00
dotvav c0264f73b0 Add palazzetti status sensor (#131348)
* Add status sensor

* Lower the case of strings keys

* Make const Final

* Fix typo

* Fix typo

* Merge similar statuses

* Increase readability

* Update snapshot
2024-12-17 10:17:50 +01:00
Jan Bouwhuis ac6d718094 Fix mqtt reconfigure flow (#133315)
* Fix mqtt reconfigure flow

* Follow up on code review
2024-12-17 09:37:46 +01:00
Manu 9ca9e787b2 Add tests for Habitica integration (#131780)
* Add tests for Habitica integration

* update iqs
2024-12-17 09:07:18 +01:00
Vivien Chene fc9d32ef65 Fix issue when no data, where the integer sensor value is given a string (#132123)
* Fix issue when no data, where the integer sensor value is given a string

* Use None and not '0'
2024-12-17 08:57:43 +01:00
Marc Mueller 2d8e693cdb Update mypy-dev to 1.14.0a7 (#133390) 2024-12-17 07:34:59 +01:00
Ludovic BOUÉ 1512cd5fb7 Add Matter battery replacement description (#132974) 2024-12-17 00:03:32 +01:00
G Johansson 73e3e91af2 Nord Pool iqs platinum (#133389) 2024-12-16 23:54:56 +01:00
Dan Raper a374c7e4ca Add reauth flow to Ohme (#133275)
* Add reauth flow to ohme

* Reuse config flow user step for reauth

* Tidying up

* Add common _validate_account method for reauth and user config flow steps

* Add reauth fail test
2024-12-16 23:54:33 +01:00
Franck Nijhof 9cdc36681a Remove setup entry mock assert from LaMetric config flow (#133387) 2024-12-16 23:01:24 +01:00
Marc Mueller 8c67819f50 Update axis to v64 (#133385) 2024-12-16 22:40:00 +01:00
Michael Hansen 308200781f Add required domain to vacuum intents (#133166) 2024-12-16 21:49:15 +01:00
Franck Nijhof 3a622218f4 Improvements to the LaMetric config flow tests (#133383) 2024-12-16 21:47:31 +01:00
G Johansson 40182fc197 Load sun via entity component (#132598)
* Load sun via entity component

* Remove unique id

* Remove entity registry
2024-12-16 21:35:55 +01:00
dontinelli 2da7a93139 Add switch platform to local_slide (#133369) 2024-12-16 20:53:17 +01:00
Alexandre CUER 6a54edce19 Gives a friendly name to emoncms entities if unit is not specified (#133358) 2024-12-16 19:26:47 +01:00
Erik Montnemery 34ab3e033f Remove support for live recorder data post migration of entity IDs (#133370) 2024-12-16 19:23:05 +01:00
Simon e6e9788ecd Add quality scale to ElevenLabs (#133276) 2024-12-16 19:18:09 +01:00
Joakim Sørensen 482ad6fbee Increase backup upload timeout (#132990) 2024-12-16 19:12:15 +01:00
Maciej Bieniek 77fb440ed4 Bump imgw-pib to version 1.0.7 (#133364) 2024-12-16 19:06:06 +01:00
epenet 239767ee62 Set default min/max color temperature in mqtt lights (#133356) 2024-12-16 17:48:59 +01:00
Andrew Sayre cefb4a4ccc Add HEOS reconfigure flow (#133326)
* Add reconfig flow

* Add reconfigure tests

* Mark reconfigure_flow done

* Review feedback

* Update tests to always end in terminal state

* Correct test name and docstring
2024-12-16 10:08:14 -06:00
Åke Strandberg 5adb7f4542 Translate exception messages in myUplink (#131626)
* Translate exceptions

* Add one more translation

* Adding more translations

* Make message easier to understand for end-user

* Clarify message

* Address review comments
2024-12-16 15:42:15 +01:00
Erik Montnemery 14f4f8aeb5 Update hassio backup agents on mount added or removed (#133344)
* Update hassio backup agents on mount added or removed

* Address review comments
2024-12-16 15:37:29 +01:00
Maikel Punie a34992c0b5 Velbus add PARALLEL_UPDATES to all platforms (#133155) 2024-12-16 15:13:50 +01:00
Matthias Alphart 6f278fb856 Remove custom "unknown" state from Fronius Enum sensor (#133361) 2024-12-16 14:13:19 +01:00
Assaf Inbal a953abf5c3 Add reauth flow to Ituran (#132755) 2024-12-16 14:00:06 +01:00
Maikel Punie 38fdfba169 Velbus finish config-flow-test-coverage (#133149)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2024-12-16 13:56:17 +01:00
Åke Strandberg 0a0f482702 Update myuplink quality scale (#133083)
Updated documentation
2024-12-16 13:39:46 +01:00
Guido Schmitz cc27c95bad Use unique_id in devolo Home Network tests (#133147) 2024-12-16 13:35:55 +01:00
Franck Nijhof 836fd94a56 Record current IQS state for LaMetric (#133040) 2024-12-16 13:31:13 +01:00
Manu 34911a78bd Add Habitica quality scale record (#131429)
Co-authored-by: Franck Nijhof <frenck@frenck.nl>
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2024-12-16 13:17:38 +01:00
Abílio Costa 739832691e Add Idasen Desk quality scale record (#132368) 2024-12-16 13:14:01 +01:00
epenet cd2cc1d99f Reduce false-positives in test-before-setup IQS check (#133349) 2024-12-16 13:10:15 +01:00
epenet 4b3893eadf Set default min/max color temperature in homekit_controller lights (#133334) 2024-12-16 12:26:29 +01:00
jb101010-2 d062171be3 Suez_water: mark reached bronze scale level (#133352) 2024-12-16 12:19:21 +01:00
epenet 9667a12030 Set default min/max color temperature in matter lights (#133340) 2024-12-16 10:32:57 +01:00
Jan-Philipp Benecke d78a24ba33 Use ConfigEntry.runtime_data in Twitch (#133337)
* Use `ConfigEntry.runtime_data` in Twitch

* Process code review

* Process code review
2024-12-16 09:54:01 +01:00
epenet f2674f3262 Set default min/max color temperature in deconz lights (#133333) 2024-12-16 09:49:18 +01:00
Erik Montnemery 06f6869da5 Avoid string manipulations in hassio backup reader/writer (#133339) 2024-12-16 09:47:49 +01:00
epenet 22d03afb9b Set default min/max color temperature in wemo lights (#133338) 2024-12-16 09:08:37 +01:00
Franck Nijhof 3129151ea9 Merge branch 'master' into dev 2024-12-16 07:52:34 +00:00
Chris Talkington 4566ebbb3d Add reconfigure flow to Roku (#132986)
* add reconfigure flow to roku

* Update strings.json

* simplify

* Apply suggestions from code review

Co-authored-by: Josef Zweck <josef@zweck.dev>

* Update test_config_flow.py

* Update config_flow.py

* Update config_flow.py

---------

Co-authored-by: Josef Zweck <josef@zweck.dev>
2024-12-16 08:51:01 +01:00
epenet 5f2b1bd622 Set default min/max color temperature in demo lights (#133330) 2024-12-16 08:45:59 +01:00
epenet 909eb045cc Set default min/max color temperature in abode lights (#133331) 2024-12-16 08:27:10 +01:00
Marc Mueller 66dcd38701 Update docker base image to 2024.12.1 (#133323) 2024-12-16 08:10:37 +01:00
Paulus Schoutsen e24dc33259 Conversation: Use [] when we know key exists (#133305) 2024-12-15 21:45:50 +01:00
Josef Zweck 0030a970a1 Split coordinator in lamarzocco (#133208) 2024-12-15 21:31:18 +01:00
Josef Zweck 89387760d3 Cleanup tests for tedee (#133306) 2024-12-15 20:44:28 +01:00
Simone Chemelli 5cc8d9e105 Full test coverage for Vodafone Station button platform (#133281) 2024-12-15 20:27:19 +01:00
Allen Porter b77e42e8f3 Increase test coverage for google tasks init (#133252) 2024-12-15 20:23:56 +01:00
Matthias Alphart 81c12db6cd Fix missing Fronius data_description translation for reconfigure flow (#133304) 2024-12-15 20:19:56 +01:00
Jan Bouwhuis 2003fc7ae0 Adjust MQTT tests not to assert on deprecated color_temp attribute (#133198) 2024-12-15 19:42:54 +01:00
Allen Porter 6ca5f3e828 Mark Google Tasks test-before-setup quality scale rule as done (#133298) 2024-12-15 19:42:22 +01:00
Matthias Alphart be6ed05aa2 Improve Fronius tests (#132872) 2024-12-15 19:40:51 +01:00
Norbert Rittel 544ebcf310 Fix typo "configurered" in MQTT (#133295) 2024-12-15 19:35:50 +01:00
Bouwe Westerdijk 9e8a158c89 Bump plugwise to v1.6.4 and adapt (#133293) 2024-12-15 19:35:36 +01:00
J. Nick Koston e81add5a06 Set code_arm_required to False for homekit_controller (#133284) 2024-12-15 19:28:29 +01:00
G Johansson 6d6445bfcf Update quality scale for Nord Pool (#133282) 2024-12-15 19:28:10 +01:00
Michael e951511132 Allow load_verify_locations with only cadata passed (#133299) 2024-12-15 19:26:46 +01:00
Tomer Shemesh 2a49378f4c Refactor Onkyo tests to patch underlying pyeiscp library (#132653)
* Refactor Onkyo tests to patch underlying pyeiscp library instead of home assistant methods

* limit test patches to specific component, move patches into conftest

* use patch.multiple and restrict patches to specific component

* use side effect instead of mocking method
2024-12-15 10:27:17 -07:00
Allen Porter f069f340a3 Explicitly set PARALLEL_UPDATES for Google Tasks (#133296) 2024-12-15 17:53:36 +01:00
Conor Eager 042d4cd39b Bump starlink-grpc-core to 1.2.1 to fix missing ping (#133183) 2024-12-15 17:43:21 +01:00
G Johansson 51422a4502 Bump pynordpool 0.2.3 (#133277) 2024-12-15 17:41:43 +01:00
Norbert Rittel 95babbef21 Fix two typos in KEF strings (#133294) 2024-12-15 17:39:25 +01:00
Richard Kroegel b4b6067e8e Use typed BMWConfigEntry (#133272) 2024-12-15 14:41:35 +01:00
Dan Raper b13a54f605 Add button platform to Ohme (#133267)
* Add button platform and reauth flow

* CI fixes

* Test comment change

* Remove reauth from this PR

* Move is_supported_fn to OhmeEntityDescription

* Set parallel updates to 1

* Add coordinator refresh to button press

* Add exception handling to button async_press
2024-12-15 14:22:21 +01:00
Manu c2ee020eee Update quality scale documentation rules in IronOS integration (#133245) 2024-12-15 13:14:32 +01:00
Jan Bouwhuis 16ad2d52c7 Improve MQTT json color_temp validation (#133174)
* Improve MQTT json color_temp validation

* Revert unrelated changes and assert on logs

* Typo
2024-12-15 13:07:10 +01:00
Erik Montnemery 74e4654c26 Revert "Improve recorder history queries (#131702)" (#133203) 2024-12-15 12:28:32 +01:00
Matthias Alphart aa4b64386e Don't update existing Fronius config entries from config flow (#132886) 2024-12-15 12:25:35 +01:00
Claudio Ruggeri - CR-Tech 760c3ac98c Bump pymodbus version 3.7.4 (#133175)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2024-12-15 12:24:27 +01:00
Allen Porter 85ef2c0fb1 Mark Google Tasks action-exceptions quality scale as done (#133253) 2024-12-15 12:19:57 +01:00
Marc Mueller d1e466e615 Update elevenlabs to 1.9.0 (#133264) 2024-12-15 12:19:25 +01:00
Richard Kroegel 8953ac1357 Improve BMW translations (#133236) 2024-12-15 12:16:10 +01:00
Norbert Rittel ebc8ca8419 Replace "this" with "a" to fix Install Update action description (#133210) 2024-12-15 12:10:54 +01:00
Jan Bouwhuis 73cb3fa88d Fix lingering mqtt device_trigger unload entry test (#133202) 2024-12-15 11:55:33 +01:00
rappenze 14a61d94e2 Use entry.runtime_data in fibaro (#133235) 2024-12-15 11:49:23 +01:00
Manu 314076b85f Replace aiogithub dependency with pynecil update check (#133213) 2024-12-15 11:48:11 +01:00
rappenze 879d809e5a Enhance translation strings in fibaro (#133234) 2024-12-15 11:47:18 +01:00
Sid 412aa60e8f Fix enigma2 integration for devices not reporting MAC address (#133226) 2024-12-15 11:05:17 +01:00
Thomas55555 f8da2c3e5c Bump aioautomower to 2024.12.0 (#132962) 2024-12-15 11:04:11 +01:00
rappenze 80e4d7ee12 Fix fibaro climate hvac mode (#132508) 2024-12-15 11:02:26 +01:00
Marc Mueller af6948a911 Fix pydantic warnings in purpleair (#133247) 2024-12-15 10:34:33 +01:00
Avi Miller 9494128395 Bump aiolifx to 1.1.2 and add new HomeKit product prefixes (#133191)
Signed-off-by: Avi Miller <me@dje.li>
2024-12-15 11:24:41 +02:00
jb101010-2 1b2cf68e82 Suez_water: store coordinator in runtime_data (#133204)
* Suez_water: store coordinator in runtime_data
2024-12-15 09:46:14 +01:00
Arie Catsman 229a68dc73 set PARALLEL_UPDATES to 1 for enphase_envoy (#132373)
* set PARALLEL_UPDATES to 1 for enphase_envoy

* move PARALLEL_UPDATES from _init_ to platform files.

* Implement review feedback

* set PARALLEL_UPDATES to 0 for read-only platforms
2024-12-15 09:27:14 +01:00
J. Nick Koston 2117e35d53 Bump yalexs-ble to 2.5.5 (#133229)
changelog: https://github.com/bdraco/yalexs-ble/compare/v2.5.4...v2.5.5
2024-12-14 23:06:26 +02:00
Matthias Alphart 74aa1a8f7e Update Fronius translations (#132876)
* Remove exception translation that's handled by configflow errors dict

* Remove entity name translations handled by device class

* Add data_description for Fronius config flow

* Remove unnecessary exception case

* review suggestion
2024-12-14 21:47:27 +01:00
Jan Bouwhuis 4dc1405e99 Bump incomfort-client to v0.6.4 (#133205) 2024-12-14 20:51:30 +01:00
Manu 35d5a16a3c Bump pynecil to 2.1.0 (#133211) 2024-12-14 20:47:06 +01:00
jb101010-2 79ecb4a87c Suez_water: add removal instructions (#133206) 2024-12-14 20:43:27 +01:00
YogevBokobza ff1df757b1 Switcher move _async_call_api to entity.py (#132877)
* Switcher move _async_call_api to entity.py

* fix based on requested changes

* fix based on requested changes
2024-12-14 21:06:36 +02:00
Dan Raper 9e2a3ea0e5 Add Ohme integration (#132574) 2024-12-14 18:12:44 +01:00
Erik Montnemery 980b8a91e6 Revert "Simplify recorder RecorderRunsManager" (#133201)
Revert "Simplify recorder RecorderRunsManager (#131785)"

This reverts commit cf0ee63507.
2024-12-14 14:21:19 +01:00
dontinelli d85d986075 Add button entity to slide_local (#133141)
Co-authored-by: Joostlek <joostlek@outlook.com>
2024-12-14 12:19:42 +01:00
dontinelli 06391d4635 Add reconfiguration to slide_local (#133182)
Co-authored-by: Joostlek <joostlek@outlook.com>
2024-12-14 12:10:28 +01:00
Sid ca1bcbf5d5 Bump openwebifpy to 4.3.0 (#133188) 2024-12-14 12:07:38 +01:00
Joost Lekkerkerker d2dfba3116 Improve Slide Local device tests (#133197) 2024-12-14 12:00:28 +01:00
IceBotYT bce6127264 Bump nice-go to 1.0.0 (#133185)
* Bump Nice G.O. to 1.0.0

* Mypy

* Pytest
2024-12-14 09:36:15 +01:00
J. Nick Koston 165ca5140c Bump uiprotect to 7.0.2 (#132975) 2024-12-13 20:05:41 -06:00
J. Nick Koston 1aabbec3dd Bump yalexs-ble to 2.5.4 (#133172) 2024-12-13 22:37:26 +01:00
Sid 0c8db8c8d6 Add eheimdigital integration (#126757)
Co-authored-by: Franck Nijhof <git@frenck.dev>
2024-12-13 22:29:18 +01:00
Michael Hansen f06fda8023 Add response slot to HassRespond intent (#133162) 2024-12-13 15:19:43 -05:00
Michael Hansen 50b897bdaa Add STT error code for cloud authentication failure (#133170) 2024-12-13 14:59:46 -05:00
Franck Nijhof e13fa8346a Update debugpy to 1.8.11 (#133169) 2024-12-13 20:15:05 +01:00
Sid 8b6495f456 Bump ruff to 0.8.3 (#133163) 2024-12-13 19:06:44 +01:00
Franck Nijhof a812b594aa Fix Tailwind config entry typing in async_unload_entry signature (#133153) 2024-12-13 16:55:30 +01:00
epenet 1fbe880c5f Deprecate light constants (#132680)
* Deprecate light constants

* Reference deprecated values in MQTT light

* Reference deprecated values in test_recorder

* Adjust

* Adjust

* Add specific test
2024-12-13 16:52:47 +01:00
Jan Bouwhuis 97da8481d2 Add reconfigure flow to MQTT (#132246)
* Add reconfigure flow for MQTT integration

* Add test and translation strings

* Update quality scale configuration

* Do not cache ConfigEntry in flow

* Make source condition explicit

* Rework from suggested changes

* Do not allow reconfigure_entry and reconfigure_entry_data to be `None`
2024-12-13 16:11:45 +01:00
Maikel Punie f03f24f036 Velbus test before setup (#133069)
* Velbus test before setup

* Update homeassistant/components/velbus/__init__.py

Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>

* Add the connect named argument to make it clear we are testing the connection

* Correctly cleanup after the test

* Sync code for velbusaio 2024.12.2

* follow up

* rename connect_task to scan_task

---------

Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>
2024-12-13 16:05:20 +01:00
Christopher Fenner 5f91676df0 Bump PyViCare to 2.38.0 (#133126) 2024-12-13 16:02:13 +01:00
dontinelli d6c81830a4 Fix missing password for slide_local (#133142) 2024-12-13 15:42:40 +01:00
epenet 8080ad14bf Add warning when light entities do not provide kelvin attributes or properties (#132723) 2024-12-13 15:34:02 +01:00
Klaas Schoute 067daad70e Set quality scale to silver for Powerfox integration (#133095) 2024-12-13 15:29:34 +01:00
Guido Schmitz 579ac01eb1 Fix typos in devolo Home Network tests (#133139) 2024-12-13 15:26:02 +01:00
Maikel Punie 5d8e997319 Bump velbusaio to 2024.12.2 (#133130)
* Bump velbusaio to 2024.12.2

* mistakenly pushed this file
2024-12-13 13:49:00 +01:00
Cyrill Raccaud fe46fd24bd Improve data description and title for Cookidoo integration (#133106)
* fix data description typo for cookidoo

* use placeholder for cookidoo as it is non-translatable

* set title of language step

* fix for reauth

* fix reauth
2024-12-13 13:34:17 +01:00
Joost Lekkerkerker b4e065d331 Bump yt-dlp to 2024.12.13 (#133129) 2024-12-13 13:30:22 +01:00
epenet a131497e1f Reduce functools.partial with ServiceCall.hass in easyenergy (#133133) 2024-12-13 13:30:05 +01:00
epenet 4a5e47d2f0 Replace functools.partial with ServiceCall.hass in tibber (#133132) 2024-12-13 13:29:42 +01:00
epenet c7adc98408 Replace functools.partial with ServiceCall.hass in unifiprotect (#133131) 2024-12-13 13:28:54 +01:00
epenet f816a0667c Reduce functools.partial with ServiceCall.hass in energyzero (#133134) 2024-12-13 13:28:11 +01:00
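The cluster of `ServiceCall.hass` commits above (easyenergy, tibber, unifiprotect, energyzero, knx) share one refactor: service handlers no longer need `hass` bound in at registration time via `functools.partial`, because the `ServiceCall` object itself carries a `hass` attribute. A hedged before/after sketch — the `HomeAssistant`/`ServiceCall` classes below are minimal hypothetical stand-ins, not the real Home Assistant types:

```python
# Minimal stand-ins illustrating the refactor; real classes live in
# homeassistant.core and carry far more state.
from dataclasses import dataclass, field
from functools import partial
from typing import Any


@dataclass
class HomeAssistant:
    data: dict[str, Any] = field(default_factory=dict)


@dataclass
class ServiceCall:
    hass: HomeAssistant  # the attribute the refactor relies on
    data: dict[str, Any] = field(default_factory=dict)


# Before: hass captured by functools.partial when the service is registered.
def old_handler(hass: HomeAssistant, call: ServiceCall) -> Any:
    return hass.data["api_client"]


# After: the handler takes only the call and reads call.hass directly.
def new_handler(call: ServiceCall) -> Any:
    return call.hass.data["api_client"]


hass = HomeAssistant(data={"api_client": "client"})
call = ServiceCall(hass=hass)
assert partial(old_handler, hass)(call) == new_handler(call)
```

The payoff is that handlers become plain module-level functions with a single `ServiceCall` parameter, which simplifies registration and typing.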
Franck Nijhof 684667e8e7 Update open-meteo to v0.3.2 (#133122) 2024-12-13 13:24:46 +01:00
Jan-Philipp Benecke d658073246 Make Twitch sensor state and attributes translatable (#133127) 2024-12-13 13:01:55 +01:00
Martijn Russchen 81c8d7153b Push Nibe package to 2.14.0 (#133125) 2024-12-13 12:50:50 +01:00
Franck Nijhof 46db3964f3 2024.12.3 (#133123) 2024-12-13 12:16:14 +01:00
Marc Mueller 7e2d3eb482 Add contact vip info to fritzbox_callmonitor sensor (#132913) 2024-12-13 11:59:55 +01:00
Ludovic BOUÉ c0f6535d11 Fix typo in WaterHeaterEntityDescription name (#132888) 2024-12-13 11:11:47 +01:00
Franck Nijhof 9b83a00285 Bump version to 2024.12.3 2024-12-13 11:04:47 +01:00
Joost Lekkerkerker 9a7fda5b25 Bump aiowithings to 3.1.4 (#133117) 2024-12-13 11:04:34 +01:00
Robert Resch f9bdc29546 Bump deebot-client to 9.4.0 (#133114) 2024-12-13 11:04:31 +01:00
Brandon Rothweiler d9bb1f6035 Bump py-aosmith to 1.0.12 (#133100) 2024-12-13 11:04:28 +01:00
David Bonnes 01359b32c4 Bugfix to use evohome's new hostname (#133085) 2024-12-13 11:04:25 +01:00
jb101010-2 d0c00aaa67 Bump pysuezV2 to 1.3.5 (#133076) 2024-12-13 11:04:22 +01:00
Bram Kragten 73465a7aa8 Update frontend to 20241127.8 (#133066) 2024-12-13 11:04:19 +01:00
Franck Nijhof ed03c0a294 Fix LaMetric config flow for cloud import path (#133039) 2024-12-13 11:04:16 +01:00
Michael Hansen b38a7186d2 Change warning to debug for VAD timeout (#132987) 2024-12-13 11:04:13 +01:00
J. Nick Koston 31348930cc Bump led-ble to 1.1.1 (#132977)
changelog: https://github.com/Bluetooth-Devices/led-ble/compare/v1.0.2...v1.1.1
2024-12-13 11:04:09 +01:00
Simone Chemelli 83e1353c01 Guard Vodafone Station updates against bad data (#132921)
guard Vodafone Station updates against bad data
2024-12-13 11:04:07 +01:00
Simone Chemelli ede9c3ecd2 fix AndroidTV logging when disconnected (#132919) 2024-12-13 11:04:04 +01:00
Michael Hansen c08ffcff9b Fix pipeline conversation language (#132896) 2024-12-13 11:04:01 +01:00
Stefano Angeleri 038115fea2 Bump pydaikin to 2.13.8 (#132759) 2024-12-13 11:03:58 +01:00
Simon Lamon 4e5ceb3aa4 Bump python-linkplay to v0.1.1 (#132091) 2024-12-13 11:03:53 +01:00
Cyrill Raccaud 91f7afc2c5 Cookidoo reauth config flow for silver (#133110)
* reauth

* add check for duplicate email in reauth

* fix reauth double email check

* parametrize tests

* check reauth double entry data as well
2024-12-13 10:40:23 +01:00
Allen Porter 7f3373d233 Add a quality scale for Google Tasks (#131497)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2024-12-13 10:27:35 +01:00
Joost Lekkerkerker c0ef60bb98 Bump aiowithings to 3.1.4 (#133117) 2024-12-13 10:22:46 +01:00
Franck Nijhof fb5cca877b Fix failing CI due to Russound Rio incorrect IQS (#133118) 2024-12-13 10:12:35 +01:00
G Johansson 8cde404997 Raise issue for deprecated imperial unit system (#130979) 2024-12-13 10:05:46 +01:00
epenet 8b579d83ce Add data/data_description translation checks (#131705) 2024-12-13 09:50:10 +01:00
epenet f7b6f4b927 Replace functools.partial with ServiceCall.hass in knx (#133111) 2024-12-13 09:48:24 +01:00
Jan Rieger 3d93561e0a Remove native_unit_of_measurement from rfxtrx counters (#133108) 2024-12-13 09:47:39 +01:00
Andrew Sayre 566843591e Remove HEOS yaml import (#133082) 2024-12-13 09:46:52 +01:00
Robert Resch 2cd4ebbfb2 Bump deebot-client to 9.4.0 (#133114) 2024-12-13 09:45:38 +01:00
Stefan Agner 9ab69aa41c Add mWh as unit of measurement for Matter energy sensors (#133005) 2024-12-13 09:33:58 +01:00
epenet a0e49ebc97 Use internal min/max mireds in template (#133113) 2024-12-13 09:33:40 +01:00
epenet 899fb091fc Simplify access to hass in service calls (#133062) 2024-12-13 09:31:21 +01:00
Maikel Punie f9f37b9932 Velbus docs quality bump (#133070) 2024-12-13 09:23:53 +01:00
Marc Mueller e4cca3fe40 Update devcontainer to Python 3.13 (#132313) 2024-12-13 09:22:01 +01:00
Martin Weinelt 11b65b1eb3 Bump watchdog to 6.0.0 (#132895) 2024-12-13 09:21:14 +01:00
jb101010-2 e3d14e6993 Bump pysuezV2 to 1.3.5 (#133076) 2024-12-13 09:01:48 +01:00
Åke Strandberg 53439d6e2a Handle step size correctly in myuplink number platform (#133016) 2024-12-13 08:55:44 +01:00
David Bonnes de89be0512 Bugfix to use evohome's new hostname (#133085) 2024-12-13 08:54:14 +01:00
Brandon Rothweiler 8bd2c183e2 Bump py-aosmith to 1.0.12 (#133100) 2024-12-13 08:46:15 +01:00
epenet 263eb41e79 Remove unused constant from blink (#133109) 2024-12-13 08:24:18 +01:00
Klaas Schoute 0ffb588d5c Move config entry type of energyzero integration (#133094)
Move config_entry type to coordinator file
2024-12-13 07:53:25 +01:00
dependabot[bot] 09b06f839d Bump github/codeql-action from 3.27.7 to 3.27.9 (#133104) 2024-12-13 07:47:40 +01:00
epenet 72cc1f4d39 Use correct ATTR_KELVIN constant in yeelight tests (#133088) 2024-12-13 06:51:55 +01:00
Allen Porter 2af5c5ecda Update Rainbird quality scale grading on the Silver quality checks (#131498)
* Grade Rainbird on the Silver quality scale

* Remove done comments

* Update quality_scale.yaml

* Update config-flow-test-coverage
2024-12-12 20:26:30 -08:00
epenet bf9788b9c4 Fix CI failure in russound_rio (#133081)
* Fix CI in russound_rio

* Adjust
2024-12-12 22:16:28 +01:00
epenet 2cff7526d0 Add test-before-setup rule to quality_scale validation (#132255)
* Add test-before-setup rule to quality_scale validation

* Use ast_parse_module

* Add rules_done

* Add Config argument
2024-12-12 22:15:49 +01:00
Franck Nijhof 61b1b50c34 Improve Solar.Forecast configuration flow tests (#133077) 2024-12-12 21:19:05 +01:00
epenet aa7e024853 Migrate lifx light tests to use Kelvin (#133020) 2024-12-12 21:17:52 +01:00
epenet d02bceb6f3 Migrate alexa color_temp handlers to use Kelvin (#132995) 2024-12-12 21:17:31 +01:00
epenet b9a7307df8 Refactor light reproduce state to use kelvin attribute (#132854) 2024-12-12 21:17:05 +01:00
Noah Husby d79dc8d22f Add source zone exclusion to Russound RIO (#130392)
* Add source zone exclusion to Russound RIO

* Ruff format
2024-12-12 22:13:37 +02:00
Franck Nijhof 839f06b2dc Small improvements to the AdGuard tests (#133073) 2024-12-12 21:12:11 +01:00
Maikel Punie 3baa432bae Use runtime_data in velbus (#132988) 2024-12-12 20:48:01 +01:00
epenet b189bc6146 Migrate smartthings light tests to use Kelvin (#133022) 2024-12-12 20:38:49 +01:00
epenet 708084d300 Migrate switch_as_x light tests to use Kelvin (#133023) 2024-12-12 20:38:13 +01:00
epenet 7c9992f5d3 Migrate demo light tests to use Kelvin (#133003) 2024-12-12 20:37:32 +01:00
Franck Nijhof 483688dba2 Promote Twente Milieu quality scale to silver (#133074) 2024-12-12 20:32:59 +01:00
epenet e276f8ee89 Migrate zwave_js light tests to use Kelvin (#133034) 2024-12-12 20:32:39 +01:00
epenet de35bfce77 Migrate yeelight light tests to use Kelvin (#133033) 2024-12-12 20:29:15 +01:00
epenet f0391f4963 Migrate tradfri light tests to use Kelvin (#133030) 2024-12-12 20:28:42 +01:00
epenet fd811c85e9 Migrate wemo light tests to use Kelvin (#133031) 2024-12-12 20:28:08 +01:00
Cyrill Raccaud 56db536883 Add Cookidoo integration (#129800) 2024-12-12 20:23:14 +01:00
epenet 55fa717f10 Migrate flux_led light tests to use Kelvin (#133009) 2024-12-12 20:18:27 +01:00
dontinelli c164507952 Add new integration slide_local (#132632)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2024-12-12 20:18:19 +01:00
epenet 798f3a34f3 Migrate abode light tests to use Kelvin (#133001) 2024-12-12 20:17:45 +01:00
epenet a358491970 Migrate wiz light tests to use Kelvin (#133032) 2024-12-12 20:16:54 +01:00
Erik Montnemery ad15786115 Add support for subentries to config entries (#117355)
* Add support for subentries to config entries

* Improve error handling and test coverage

* Include subentry_id in subentry containers

* Auto-generate subentry_id and add optional unique_id

* Tweak

* Update tests

* Fix stale docstring

* Address review comments

* Typing tweaks

* Add methods to ConfigEntries to add and remove subentry

* Improve ConfigSubentryData typed dict

* Update test snapshots

* Adjust tests

* Fix unique_id logic

* Allow multiple subentries with None unique_id

* Add number of subentries to config entry JSON representation

* Add subentry translation support

* Allow integrations to implement multiple subentry flows

* Update translations schema

* Adjust exception text

* Change subentry flow init step to user

* Prevent creating a subentry with colliding unique_id

* Update tests

* Address review comments

* Remove duplicated unique_id collision check

* Remove change from the future

* Improve test coverage

* Add default value for unique_id
2024-12-12 20:16:18 +01:00
Marc Mueller 32c1b519ad Improve auth generic typing (#133061) 2024-12-12 20:14:56 +01:00
Klaas Schoute ce70cb9e33 Use ConfigEntry runtime_data in easyEnergy (#133053) 2024-12-12 20:13:41 +01:00
epenet 40c3dd2095 Migrate group light tests to use Kelvin (#133010) 2024-12-12 20:08:07 +01:00
Franck Nijhof 3c7502dd5d Explicitly pass config entry to coordinator in Tailwind (#133065) 2024-12-12 19:46:35 +01:00
Franck Nijhof b8ce1b010f Update demetriek to v1.1.0 (#133064) 2024-12-12 19:39:24 +01:00
Andrew Sayre 1205178702 Add HEOS quality scale (#132311) 2024-12-12 19:32:00 +01:00
Bram Kragten a6b785d937 Update frontend to 20241127.8 (#133066) 2024-12-12 19:11:07 +01:00
Martin Hjelmare 39e4719a43 Fix backup strategy retention filter (#133060)
* Fix lint

* Update tests

* Fix backup strategy retention filter
2024-12-12 18:47:37 +01:00
epenet e7a43cfe09 Migrate deconz light tests to use Kelvin (#133002) 2024-12-12 18:13:24 +01:00
Maikel Punie 0726809228 Bump velbusaio to 2024.12.1 (#133056) 2024-12-12 17:00:11 +01:00
Erik Montnemery 3d201690ce Fix load of backup store (#133024)
* Fix load of backup store

* Tweak type annotations in test

* Fix tests

* Remove the new test

* Remove snapshots
2024-12-12 16:54:21 +01:00
epenet 0b18e51a13 Remove reference to self.min/max_mireds in mqtt light (#133055) 2024-12-12 16:49:50 +01:00
epenet 2ce2765e67 Adjust light test helpers to use Kelvin, and cleanup unused helpers (#133048)
Cleanup light test helper methods
2024-12-12 16:49:25 +01:00
epenet 33c799b2d0 Migrate mqtt light tests to use Kelvin (#133035) 2024-12-12 16:42:10 +01:00
Marc Mueller 5c6e4ad191 Use PEP 695 TypeVar syntax (#133049) 2024-12-12 16:01:57 +01:00
Marc Mueller 0a748252e7 Improve Callable annotations (#133050) 2024-12-12 15:14:28 +01:00
epenet 839312c65c Migrate homekit light tests to use Kelvin (#133011) 2024-12-12 15:11:52 +01:00
epenet 37f2bde6f5 Migrate esphome light tests to use Kelvin (#133008) 2024-12-12 15:11:34 +01:00
epenet 6d042d987f Migrate emulated_hue light tests to use Kelvin (#133006) 2024-12-12 15:11:13 +01:00
Robert Resch 006b3b0e22 Bump uv to 0.5.8 (#133036) 2024-12-12 14:51:15 +01:00
Franck Nijhof f05d18ea70 Small test improvements to Tailwind tests (#133051) 2024-12-12 14:42:05 +01:00
Sid dc18e62e1e Bump ruff to 0.8.2 (#133041) 2024-12-12 14:38:55 +01:00
Marc Mueller 4b5d717898 Fix music_assistant decorator typing (#133044) 2024-12-12 14:35:11 +01:00
Franck Nijhof 8e15287662 Add data descriptions to Twente Milieu config flow (#133046) 2024-12-12 14:26:34 +01:00
Marc Mueller 2e133df549 Improve husqvarna_automower decorator typing (#133047) 2024-12-12 14:26:17 +01:00
Krisjanis Lejejs c18cbf5994 Bump hass-nabucasa from 0.86.0 to 0.87.0 (#133043) 2024-12-12 14:25:54 +01:00
Franck Nijhof bcaf1dc20b Clean up Elgato config flow tests (#133045) 2024-12-12 14:24:38 +01:00
Franck Nijhof 6005b6d01c Explicitly pass config entry to coordinator in Elgato (#133014)
* Explicitly pass config entry to coordinator in Elgato

* Make it noice!

* Apply suggestions from code review

Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>

* Adjustment from review comment

---------

Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>
2024-12-12 13:55:57 +01:00
epenet 7bdf034b93 Migrate template light tests to use Kelvin (#133025) 2024-12-12 13:54:22 +01:00
Franck Nijhof 5c80ddb891 Fix LaMetric config flow for cloud import path (#133039) 2024-12-12 13:49:17 +01:00
Erik Montnemery 85d4572a17 Adjust backup agent platform (#132944)
* Adjust backup agent platform

* Adjust according to discussion

* Clean up the local agent dict too

* Add test

* Update kitchen_sink

* Apply suggestions from code review

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>

* Adjust tests

* Clean up

* Fix kitchen sink reload

---------

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2024-12-12 13:41:56 +01:00
Franck Nijhof f2aaf2ac4a Small test cleanups in Twente Milieu (#133028) 2024-12-12 12:55:25 +01:00
epenet 52491bb75e Migrate tplink light tests to use Kelvin (#133026) 2024-12-12 12:52:01 +01:00
Simone Chemelli ded7cee6e5 fix AndroidTV logging when disconnected (#132919) 2024-12-12 11:42:00 +01:00
Klaas Schoute 0006672489 Improve diagnostics code of EnergyZero integration (#133019) 2024-12-12 11:39:55 +01:00
Klaas Schoute a9d71e0a5f Add reconfigure flow for Powerfox integration (#132260) 2024-12-12 11:34:36 +01:00
epenet 0e45ccb956 Migrate google_assistant color_temp handlers to use Kelvin (#132997) 2024-12-12 11:13:24 +01:00
Franck Nijhof 7dc31dec3b Fix config entry import in Twente Milieu diagnostic (#133017) 2024-12-12 10:52:03 +01:00
Erik Montnemery a30c942fa7 Don't use kitchen_sink integration in config entries tests (#133012) 2024-12-12 10:42:27 +01:00
Klaas Schoute d49b1b2d6b Use ConfigEntry runtime_data in EnergyZero (#132979) 2024-12-12 10:28:41 +01:00
Maikel Punie 4a7039f51d Bump velbusaio to 2024.12.0 (#132989) 2024-12-12 10:25:21 +01:00
Franck Nijhof 0377dc5b5a Move coordinator for TwenteMilieu into own module (#133000) 2024-12-12 10:18:11 +01:00
epenet bb610acb86 Migrate elgato light tests to use Kelvin (#133004) 2024-12-12 09:53:55 +01:00
Franck Nijhof 85d4c48d6f Set parallel updates in Elgato (#132998) 2024-12-12 09:53:26 +01:00
Michael Hansen 053f03ac58 Change warning to debug for VAD timeout (#132987) 2024-12-12 09:03:05 +01:00
Chris Talkington 0d4780e91b Set parallel updates for roku (#132892)
* Set parallel updates for roku

* Update sensor.py

* Update media_player.py

* Update remote.py

* Update select.py

* Update media_player.py

* Update remote.py

* Update select.py

* Update remote.py

* Update media_player.py
2024-12-12 08:00:24 +01:00
Noah Husby 2d0c4e4a59 Improve config flow test coverage for Russound RIO (#132981) 2024-12-12 07:56:29 +01:00
Noah Husby e39897ff9a Enforce strict typing for Russound RIO (#132982) 2024-12-12 07:55:29 +01:00
Tom 7e071d1fc6 Introduce parallel updates for Plugwise (#132940)
* Plugwise indicate parallel updates

* Update homeassistant/components/plugwise/number.py

Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>

---------

Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>
2024-12-12 07:49:08 +01:00
Christopher Fenner b02ccd0813 Add missing body height icon in Withings integration (#132991)
Update icons.json
2024-12-12 07:47:57 +01:00
J. Nick Koston eea781f34a Bump led-ble to 1.1.1 (#132977)
changelog: https://github.com/Bluetooth-Devices/led-ble/compare/v1.0.2...v1.1.1
2024-12-11 23:46:31 -05:00
Åke Strandberg 95f48963d4 Set strict typing for myuplink (#132972)
Set strict typing
2024-12-11 23:11:11 +01:00
Åke Strandberg 4c5965ffc9 Add reconfiguration flow to myuplink (#132970)
* Add reconfiguration flow

* Tick reconfiguration-flow rule
2024-12-11 22:47:14 +01:00
Erik Montnemery 8e991fc92f Merge feature branch with backup changes to dev (#132954)
* Reapply "Make WS command backup/generate send events" (#131530)

This reverts commit 9b8316df3f.

* MVP implementation of Backup sync agents (#126122)

* init sync agent

* add syncing

* root import

* rename list to info and add sync state

* Add base backup class

* Revert unneeded change

* adjust tests

* move to kitchen_sink

* split

* move

* Adjustments

* Adjustment

* update

* Tests

* Test unknown agent

* adjust

* Adjust for different test environments

* Change /info WS to contain a dictionary

* reorder

* Add websocket command to trigger sync from the supervisor

* cleanup

* Make mypy happier

---------

Co-authored-by: Erik <erik@montnemery.com>

* Make BackupSyncMetadata model a dataclass (#130555)

Make backup BackupSyncMetadata model a dataclass

* Rename backup sync agent to backup agent (#130575)

* Rename sync agent module to agent

* Rename BackupSyncAgent to BackupAgent

* Fix test typo

* Rename async_get_backup_sync_agents to async_get_backup_agents

* Rename and clean up remaining sync things

* Update kitchen sink

* Apply suggestions from code review

* Update test_manager.py

---------

Co-authored-by: Erik Montnemery <erik@montnemery.com>

* Add additional options to WS command backup/generate (#130530)

* Add additional options to WS command backup/generate

* Improve test

* Improve test

* Align parameter names in backup/agents/* WS commands (#130590)

* Allow setting password for backups (#110630)

* Allow setting password for backups

* use is_hassio from helpers

* move it

* Fix getting psw

* Fix restoring with psw

* Address review comments

* Improve docstring

* Adjust kitchen sink

* Adjust

---------

Co-authored-by: Erik <erik@montnemery.com>

* Export relevant names from backup integration (#130596)

* Tweak backup agent interface (#130613)

* Tweak backup agent interface

* Adjust kitchen_sink

* Test kitchen sink backup (#130609)

* Test agents_list_backups

* Test agents_info

* Test agents_download

* Export Backup from manager

* Test agents_upload

* Update tests after rebase

* Use backup domain

* Remove WS command backup/upload (#130588)

* Remove WS command backup/upload

* Disable failing kitchen_sink test

* Make local backup a backup agent (#130623)

* Make local backup a backup agent

* Adjust

* Adjust

* Adjust

* Adjust tests

* Adjust

* Adjust

* Adjust docstring

* Adjust

* Protect members of CoreLocalBackupAgent

* Remove redundant check for file

* Make the backup.create service use the first local agent

* Add BackupAgent.async_get_backup

* Fix some TODOs

* Add support for downloading backup from a remote agent

* Fix restore

* Fix test

* Adjust kitchen_sink test

* Remove unused method BackupManager.async_get_backup_path

* Re-enable kitchen sink test

* Remove BaseBackupManager.async_upload_backup

* Support restore from remote agent

* Fix review comments

* Include backup agent error in response to WS command backup/info (#130884)

* Adjust code related to WS command backup/info (#130890)

* Include backup agent error in response to WS command backup/details (#130892)

* Remove LOCAL_AGENT_ID constant from backup manager (#130895)

* Add backup config storage (#130871)

* Add base for backup config

* Allow updating backup config

* Test loading backup config

* Add backup config update method

* Add temporary check for BackupAgent.async_remove_backup (#130893)

* Rename backup slug to backup_id (#130902)

* Improve backup websocket API tests (#130912)

* Improve backup websocket API tests

* Add missing snapshot

* Fix tests leaving files behind

* Improve backup manager backup creation tests (#130916)

* Remove class backup.backup.LocalBackup (#130919)

* Add agent delete backup (#130921)

* Add backup agent delete backup

* Remove agents delete websocket command

* Update docstring

Co-authored-by: Erik Montnemery <erik@montnemery.com>

---------

Co-authored-by: Erik Montnemery <erik@montnemery.com>

* Disable core local backup agent in hassio (#130933)

* Rename remove backup to delete backup (#130940)

* Rename remove backup to delete backup

* Revert "backup/delete"

* Refactor BackupManager (#130947)

* Refactor BackupManager

* Adjust

* Adjust backup creation

* Copy in executor

* Fix BackupManager.async_get_backup (#130975)

* Fix typo in backup tests (#130978)

* Adjust backup NewBackup class (#130976)

* Remove class backup.BackupUploadMetadata (#130977)

Remove class backup.BackupMetadata

* Report backup size in bytes instead of MB (#131028)

Co-authored-by: Robert Resch <robert@resch.dev>

* Speed up CI for feature branch (#131030)

* Speed up CI for feature branch

* adjust

* fix

* fix

* fix

* fix

* Rename remove to delete in backup websocket type (#131023)

* Revert "Speed up CI for feature branch" (#131074)

Revert "Speed up CI for feature branch (#131030)"

This reverts commit 791280506d.

* Rename class BaseBackup to AgentBackup (#131083)

* Rename class BaseBackup to AgentBackup

* Update tests

* Speed up CI for backup feature branch (#131079)

* Add backup platform to the hassio integration (#130991)

* Add backup platform to the hassio integration

* Add hassio to after_dependencies of backup

* Address review comments

* Remove redundant hassio parametrization of tests

* Add tests

* Address review comments

* Bump CI cache version

* Revert "Bump CI cache version"

This reverts commit 2ab4d2b179.

* Extend backup info class AgentBackup (#131110)

* Extend backup info class AgentBackup

* Update kitchen sink

* Update kitchen sink test

* Update kitchen sink test

* Exclude cloud and hassio from core files (#131117)

* Remove unnecessary **kwargs from backup API (#131124)

* Fix backup tests (#131128)

* Freeze backup dataclasses (#131122)

* Protect CoreLocalBackupAgent.load_backups (#131126)

* Use backup metadata v2 in core/container backups (#131125)

* Extend backup creation API (#131121)

* Extend backup creation API

* Add tests

* Fix merge

* Fix merge

* Return agent errors when deleting a backup (#131142)

* Return agent errors when deleting a backup

* Remove redundant calls to dict.keys()

* Add enum type for backup folder (#131158)

* Add method AgentBackup.from_dict (#131164)

* Remove WS command backup/agents/list_backups (#131163)

* Handle backup schedule (#131127)

* Add backup schedule handling

* Fix unrelated incorrect type annotation in test

* Clarify delay save

* Make the backup time compatible with the recorder nightly job

* Update create backup parameters

* Use typed dict for create backup parameters

* Simplify schedule state

* Group create backup parameters

* Move parameter

* Fix typo

* Use Folder model

* Handle deserialization of folders better

* Fail on attempt to include addons or folders in core backup (#131204)

* Fix AgentBackup test (#131201)

* Add options to WS command backup/restore (#131194)

* Add options to WS command backup/restore

* Add tests

* Fix test

* Teach core backup to restore only database or only settings (#131225)

* Exclude tmp_backups/*.tar from backups (#131243)

* Add WS command backup/subscribe_events (#131250)

* Clean up temporary directory after restoring backup (#131263)

* Improve hassio backup agent list (#131268)

* Include `last_automatic_backup` in reply to backup/info (#131293)

Include last_automatic_backup in reply to backup/info

* Handle backup delete after config (#131259)

* Handle delete after copies

* Handle delete after days

* Add some test examples

* Test config_delete_after_logic

* Test config_delete_after_copies_logic

* Test more delete after days

* Add debug logs

* Always delete the oldest backup first

* Never remove the last backup

* Clean up words

Co-authored-by: Erik Montnemery <erik@montnemery.com>

* Fix after cleaning words

* Use utcnow

* Remove duplicate guard

* Simplify sorting

* Delete backups even if there are agent errors on get backups

---------

Co-authored-by: Erik Montnemery <erik@montnemery.com>

* Rename backup delete after to backup retention (#131364)

* Rename backup delete after to backup retention

* Tweak

* Remove length limit on `agent_ids` when configuring backup (#132057)

Remove length limit on agent_ids when configuring backup

* Rename backup retention_config to retention (#132068)

* Modify backup agent API to be stream oriented (#132090)

* Modify backup agent API to be stream oriented

* Fix tests

* Adjust after code review

* Remove no longer needed pylint override

* Improve test coverage

* Change BackupAgent API to work with AsyncIterator objects

* Don't close files in the event loop

* Don't close files in the event loop

* Fix backup manager create backup log (#132174)

* Fix debug log level (#132186)

* Add cloud backup agent (#129621)

* Init cloud backup sync

* Add more metadata

* Fix typo

* Adjust to base changes

* Don't raise on list if more than one backup is available

* Adjust to base branch

* Fetch always and verify on download

* Update homeassistant/components/cloud/backup.py

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>

* Adjust to base branch changes

* Not required anymore

* Workaround

* Fix blocking event loop

* Fix

* Add some tests

* some tests

* Add cloud backup delete functionality

* Enable check

* Fix ruff

* Use fixture

* Use iter_chunks instead

* Remove read

* Remove explicit export of read_backup

* Align with BackupAgent API changes

* Improve test coverage

* Improve error handling

* Adjust docstrings

* Catch aiohttp.ClientError bubbling up from hass_nabucasa

* Improve iteration

---------

Co-authored-by: Erik <erik@montnemery.com>
Co-authored-by: Robert Resch <robert@resch.dev>
Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
Co-authored-by: Krisjanis Lejejs <krisjanis.lejejs@gmail.com>

* Extract file receiver from `BackupManager.async_receive_backup` to util (#132271)

* Extract file receiver from BackupManager.async_receive_backup to util

* Apply suggestions from code review

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>

---------

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>

* Make sure backup directory exists (#132269)

* Make sure backup directory exists

* Hand off directory creation to executor

* Use mkdir's exist_ok feature

* Organize BackupManager instance attributes (#132277)

* Don't store received backups in a TempDir (#132272)

* Don't store received backups in a TempDir

* Fix tests

* Make sure backup directory exists

* Address review comments

* Fix tests

* Rewrite backup manager state handling (#132375)

* Rewrite backup manager state handling

* Address review comments

* Modify backup reader/writer API to be stream oriented (#132464)

* Internalize backup tasks (#132482)

* Internalize backup tasks

* Update test after rebase

* Handle backup error during automatic backup (#132511)

* Improve backup manager state logging (#132549)

* Fix backup manager state when restore completes (#132548)

* Remove WS command backup/agents/download (#132664)

* Add WS command backup/generate_with_stored_settings (#132671)

* Add WS command backup/generate_with_stored_settings

* Register the new command, add tests

* Refactor local agent backup tests (#132683)

* Refactor test_load_backups

* Refactor test loading agents

* Refactor test_delete_backup

* Refactor test_upload

* Clean up duplicate tests

* Refactor backup manager receive tests (#132701)

* Refactor backup manager receive tests

* Clean up

* Refactor pre and post platform tests (#132708)

* Refactor backup pre platform test

* Refactor backup post platform test

* Bump aiohasupervisor to version 0.2.2b0 (#132704)

* Bump aiohasupervisor to version 0.2.2b0

* Adjust tests

* Publish event when manager is idle after creating backup (#132724)

* Handle busy backup manager when uploading backup (#132736)

* Adjust hassio backup agent to supervisor changes (#132732)

* Adjust hassio backup agent to supervisor changes

* Fix typo

* Refactor test for create backup with wrong parameters (#132763)

* Refactor test not loading bad backup platforms (#132769)

* Improve receive backup coverage (#132758)

* Refactor initiate backup test (#132829)

* Rename Backup to ManagerBackup (#132841)

* Refactor backup config (#132845)

* Refactor backup config

* Remove unnecessary condition

* Adjust tests

* Improve initiate backup test (#132858)

* Store the time of automatic backup attempts (#132860)

* Store the time of automatic backup attempts

* Address review comments

* Update test

* Update cloud test

* Save agent failures when creating backups (#132850)

* Save agent failures when creating backups

* Update tests

* Store KnownBackups

* Add test

* Only clear known_backups on no error, add tests

* Address review comments

* Store known backups as a list

* Update tests

* Track all backups created with backup strategy settings (#132916)

* Track all backups created with saved settings

* Rename

* Add explicit call to save the store

* Don't register service backup.create in HassOS installations (#132932)

* Revert changes to action service backup.create (#132938)

* Fix logic for cleaning up temporary backup file (#132934)

* Fix logic for cleaning up temporary backup file

* Reduce scope of patch

* Fix with_strategy_settings info not sent over websocket (#132939)

* Fix with_strategy_settings info not sent over websocket

* Fix kitchen sink tests

* Fix cloud and hassio tests

* Revert backup ci changes (#132955)

Revert changes speeding up CI

* Fix revert of CI changes (#132960)

---------

Co-authored-by: Joakim Sørensen <joasoe@gmail.com>
Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
Co-authored-by: Robert Resch <robert@resch.dev>
Co-authored-by: Paul Bottein <paul.bottein@gmail.com>
Co-authored-by: Krisjanis Lejejs <krisjanis.lejejs@gmail.com>
2024-12-11 21:49:34 +01:00
G Johansson a1e4b3b0af Update quality scale for nordpool (#132964)
* Update quality scale for nordpool

* more
2024-12-11 21:23:26 +01:00
Noah Husby d43d84a67f Add parallel updates & use typed config entry for Russound RIO (#132958) 2024-12-11 21:07:29 +01:00
Josef Zweck 525614b7cd Bump pylamarzocco to 1.4.0 (#132917)
* Bump pylamarzocco to 1.4.0

* update device snapshot
2024-12-11 21:52:20 +02:00
Franck Nijhof 73e68971e8 Remove port from Elgato configuration flow (#132961) 2024-12-11 20:48:55 +01:00
Marc Mueller 833557fad5 Trigger full ci run on global mypy config change (#132909) 2024-12-11 19:16:49 +01:00
epenet 0e8fe1eb41 Improve coverage in light reproduce state (#132929) 2024-12-11 19:15:36 +01:00
Allen Porter fa05cc5e70 Add quality scale for nest integration (#131330)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
Co-authored-by: Franck Nijhof <frenck@frenck.nl>
2024-12-11 19:04:16 +01:00
Noah Husby 096d653059 Record current IQS state for Russound RIO (#131219) 2024-12-11 19:03:43 +01:00
Jan Bouwhuis 3a7fc15656 Add Dutch locale on supported Alexa interfaces (#132936) 2024-12-11 19:01:20 +01:00
Matthias Alphart 233d927c01 Update xknx to 3.4.0 (#132943) 2024-12-11 18:56:21 +01:00
Michael Hansen 94260147d7 Fix pipeline conversation language (#132896) 2024-12-11 18:52:02 +01:00
Robert Resch 502a221feb Set go2rtc quality scale to internal (#132945) 2024-12-11 17:20:49 +01:00
epenet 39f8de0159 Fix mqtt light attributes (#132941) 2024-12-11 17:18:54 +01:00
Maikel Punie 00ab5db661 Split the velbus services code in its own file (#131375) 2024-12-11 16:41:48 +01:00
epenet 0d71828def Migrate mqtt lights to use Kelvin (#132828)
* Migrate mqtt lights to use Kelvin

* Adjust restore_cache tests

* Adjust tests
2024-12-11 16:11:14 +01:00
jb101010-2 ee4db13c2a Add data description to suez_water config flow (#132466)
* Suez_water: config flow data_descriptions

* Rename counter by meter

* Use placeholders
2024-12-11 15:52:43 +01:00
Simone Chemelli 555d7f1ea4 Guard Vodafone Station updates against bad data (#132921)
guard Vodafone Station updates against bad data
2024-12-11 15:40:18 +01:00
epenet 1753382307 Adjust lifx to use local _ATTR_COLOR_TEMP constant (#132840) 2024-12-11 14:11:29 +00:00
Åke Strandberg 05b23d081b Set quality_scale for myUplink to Silver (#132923) 2024-12-11 13:09:33 +00:00
Maikel Punie f974479970 Velbus add quality_scale.yaml (#131377)
Co-authored-by: Allen Porter <allen.porter@gmail.com>
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2024-12-11 13:53:14 +01:00
Matthias Alphart ecfa888918 Create quality_scale.yaml from integration scaffold script (#132199)
Co-authored-by: Josef Zweck <24647999+zweckj@users.noreply.github.com>
2024-12-11 13:52:53 +01:00
Åke Strandberg 7103b7fd80 Use snapshot tests for remaining myuplink platforms (#132915)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2024-12-11 13:01:02 +01:00
Marc Mueller dc8b7cfede Allow bytearray for mqtt payload type (#132906) 2024-12-11 11:51:16 +01:00
Simon Lamon b26583b0bf Bump python-linkplay to v0.1.1 (#132091) 2024-12-11 11:12:05 +01:00
shapournemati-iotty beda273721 upgrade iottycloud lib to 0.3.0 (#132836) 2024-12-11 10:52:47 +01:00
Marc Mueller 0e8961276f Enable pydantic.v1 mypy plugin (#132907) 2024-12-11 10:50:42 +01:00
epenet 9c9e82a93e Migrate zha lights to use Kelvin (#132816) 2024-12-11 09:58:08 +01:00
epenet 7ef3e92e2d Migrate tasmota lights to use Kelvin (#132798) 2024-12-11 09:57:29 +01:00
G Johansson 2bb05296b8 Add remaining test coverage to yale_smart_alarm (#132869) 2024-12-11 09:46:53 +01:00
epenet b780f31e63 Migrate flux to use Kelvin over Mireds (#132839) 2024-12-11 08:55:23 +01:00
Robert Resch af838077cc Fix docker hassfest (#132823) 2024-12-11 08:55:00 +01:00
shapournemati-iotty 5e17721568 Remove old codeowner no longer working on the integration (#132807) 2024-12-11 08:53:19 +01:00
epenet 4ff41ed2f8 Refactor light significant change to use kelvin attribute (#132853) 2024-12-11 08:42:48 +01:00
dependabot[bot] f0f0b4b8fa Bump github/codeql-action from 3.27.6 to 3.27.7 (#132900)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-11 08:24:25 +01:00
Ludovic BOUÉ 9f40074d66 Fix typo in water heater integration (#132891)
Fix typo in water heater component
2024-12-11 07:36:09 +01:00
Chris Talkington 73feeacc39 Use runtime_data for roku (#132781)
* use runtime_data for roku

* unload cleanup

* tweaks

* tweaks

* fix tests

* fix tests

* Update config_flow.py

* Update config_flow.py
2024-12-11 06:55:58 +01:00
Allen Porter 355e80aa56 Test the google tasks api connection in setup (#132657)
Improve google tasks setup
2024-12-10 19:01:50 -08:00
Marc Mueller 77debcbe8b Update numpy to 2.2.0 (#132874) 2024-12-10 22:28:30 +01:00
Franck Nijhof 3fe2c14a79 2024.12.2 (#132846) 2024-12-10 21:45:06 +01:00
Jonas Fors Lellky b46392041f Add model_id to flexit (bacnet) entity (#132875)
* Add model_id to flexit (bacnet) entity

* Add model to mock
2024-12-10 21:44:00 +01:00
epenet fb3ffaf18d Migrate demo lights to use Kelvin (#132837)
* Migrate demo lights to use Kelvin

* Adjust google_assistant tests
2024-12-10 20:59:12 +01:00
Manu 1b300a4389 Set config-flow rule in IQS to todo in Bring integration (#132855)
Set config-flow rule in IQS to todo
2024-12-10 20:52:39 +01:00
G Johansson 5dc2757324 Add quality scale to Nord Pool (#132415)
* Add quality scale to Nord Pool

* Update

* a

* fix
2024-12-10 19:35:21 +01:00
G Johansson 76b73fa9b1 Use floats instead of datetime in statistics (#132746)
* Use floats instead of datetime in statistics

* check if debug log
2024-12-10 19:03:43 +01:00
Stefano Angeleri 7fb5b17ac5 Bump pydaikin to 2.13.8 (#132759) 2024-12-10 19:29:28 +02:00
J. Nick Koston d2303eb83f Bump pydantic to 2.10.3 and update required deps (#131963) 2024-12-10 18:27:40 +01:00
G Johansson f99239538c Add retry to api calls in Nord Pool (#132768) 2024-12-10 19:26:49 +02:00
Markus Jacobsen dba405dd88 Bump mozart-api to 4.1.1.116.4 (#132859)
Bump API
2024-12-10 19:21:59 +02:00
Markus Jacobsen d4546c94b0 Add beolink_join source_id parameter to Bang & Olufsen (#132377)
* Add source as parameter to beolink join service

* Add beolink join source and responses

* Improve comment
Add translation

* Remove result from beolink join custom action

* Cleanup

* Use options selector instead of string for source ID
Fix test docstring

* Update options

* Use translation dict for source ids
Add input validation
Add tests for invalid sources
Improve source id description

* Use list instead of translation dict
Remove platform prefixes
Add test for Beolink Converter source

* Fix source_id naming and order
2024-12-10 11:01:12 -06:00
Allen Porter 8fd64d2ca4 Add a quality scale for fitbit integration (#131326)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2024-12-10 17:04:00 +01:00
Tom 2b17037edc Plugwise improve platform tests (#132748) 2024-12-10 16:43:08 +01:00
epenet 6a323a1d3c Fix wrong name attribute in mqtt ignore list (#132831) 2024-12-10 15:32:08 +01:00
epenet 7014317e9e Cleanup unnecessary mired attributes in esphome (#132833)
* Cleanup unnecessary mired attributes in esphome

* Adjust
2024-12-10 15:29:33 +01:00
Guido Schmitz 0a786394f5 Add data descriptions to devolo Home Control (#132703) 2024-12-10 15:15:57 +01:00
Franck Nijhof 238cf691a4 Bump version to 2024.12.2 2024-12-10 15:07:18 +01:00
Josef Zweck 5a5bb139fa Bump aioacaia to 0.1.11 (#132838) 2024-12-10 14:59:48 +01:00
Robert Resch 01a9a58327 Bump deebot-client to 9.3.0 (#132834) 2024-12-10 14:59:45 +01:00
David Knowles fc34c6181c Pass an application identifier to the Hydrawise API (#132779) 2024-12-10 14:59:42 +01:00
David Knowles 60e8a38ba3 Catch Hydrawise authorization errors in the correct place (#132727) 2024-12-10 14:59:37 +01:00
starkillerOG e4765c40fe Bump reolink-aio to 0.11.5 (#132757) 2024-12-10 14:55:48 +01:00
Bram Kragten e239871566 Update frontend to 20241127.7 (#132729)
Co-authored-by: Franck Nijhof <git@frenck.dev>
2024-12-10 14:55:43 +01:00
Michael Hansen c8e5a6df5d Bump intents to 2024.12.9 (#132726) 2024-12-10 14:54:48 +01:00
Simone Rescio cac4eef795 Revert "Bump pyezviz to 0.2.2.3" (#132715) 2024-12-10 14:51:04 +01:00
Joost Lekkerkerker 8fc50c776e Bump yt-dlp to 2024.12.06 (#132684) 2024-12-10 14:51:00 +01:00
Bouwe Westerdijk da344a44e5 Bump plugwise to v1.6.3 (#132673) 2024-12-10 14:50:57 +01:00
puddly 1993142e44 Bump ZHA dependencies (#132630) 2024-12-10 14:50:54 +01:00
Thomas55555 382d32c7a7 Fix config flow in Husqvarna Automower (#132615) 2024-12-10 14:50:50 +01:00
Bouwe Westerdijk ef89563bad Bump plugwise to v1.6.2 and adapt (#132608) 2024-12-10 14:50:46 +01:00
Bouwe Westerdijk 26012ac922 Bump plugwise to v1.6.1 (#131950) 2024-12-10 14:50:42 +01:00
J. Nick Koston a33c69a2a2 Bump yalexs-ble to 2.5.2 (#132560) 2024-12-10 14:45:34 +01:00
Franck Nijhof 0096ffb659 Update twentemilieu to 2.2.0 (#132554) 2024-12-10 14:45:31 +01:00
Robert Svensson db141ce449 Bump aiounifi to v81 to fix partitioned cookies on python 3.13 (#132540) 2024-12-10 14:45:26 +01:00
Austin Mroczek af5f718a71 bump total_connect_client to 2023.12 (#132531) 2024-12-10 14:43:52 +01:00
Franck Nijhof f1284178ed Update debugpy to 1.8.8 (#132519) 2024-12-10 14:41:15 +01:00
J. Nick Koston b0005cedff Bump pycups to 2.0.4 (#132514) 2024-12-10 14:41:11 +01:00
Erwin Douna 5d01f7db85 Fix PyTado dependency (#132510) 2024-12-10 14:41:08 +01:00
Alex d6a4a7f052 Update pyrisco to 0.6.5 (#132493) 2024-12-10 14:41:02 +01:00
Ravaka Razafimanantsoa 1f6c5b4d8b Fix API change for AC not supporting floats in SwitchBot Cloud (#132231) 2024-12-10 14:36:09 +01:00
David Knowles 4e56f9c014 Bump pydrawise to 2024.12.0 (#132015) 2024-12-10 14:36:06 +01:00
Åke Strandberg f343dce418 Enable additional entities on myUplink model SMO20 (#131688)
* Add a couple of entities to SMO 20

* Enable additional entities on SMO20
2024-12-10 14:35:58 +01:00
Franck Nijhof cf53a9743f 2024.12.1 (#132509) 2024-12-06 20:21:31 +01:00
Franck Nijhof 4884891b2c Bump version to 2024.12.1 2024-12-06 18:54:13 +01:00
Allen Porter 30504fc9bd Fix google tasks due date timezone handling (#132498) 2024-12-06 18:53:42 +01:00
Bram Kragten 8827454dbd Update frontend to 20241127.6 (#132494) 2024-12-06 18:53:39 +01:00
Bram Kragten 3b30bbb85e Update frontend to 20241127.5 (#132475) 2024-12-06 18:53:35 +01:00
epenet df9eb482b5 Bump samsungtvws to 2.7.2 (#132474) 2024-12-06 18:53:32 +01:00
Steven B. 32aee61441 Bump tplink python-kasa dependency to 0.8.1 (#132472) 2024-12-06 18:53:29 +01:00
Robert Resch 35873cbe27 Point to the Ecovacs issue in the library for unsupported devices (#132470)
Co-authored-by: Franck Nijhof <git@frenck.dev>
2024-12-06 18:53:26 +01:00
Robert Resch 6fe492a51c Bump deebot-client to 9.2.0 (#132467) 2024-12-06 18:53:22 +01:00
G Johansson b1bc35f1c3 Fix nordpool dont have previous or next price (#132457) 2024-12-06 18:53:19 +01:00
Joakim Sørensen 56d10a0a7a Bump hass-nabucasa from 0.85.0 to 0.86.0 (#132456)
Bump hass-nabucasa from 0.85.0 to 0.86.0
2024-12-06 18:53:16 +01:00
Allen Porter d091936ac6 Update exception handling for python3.13 for getpass.getuser() (#132449)
* Update exception handling for python3.13 for getpass.getuser()

* Add comment

Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>

* Cleanup trailing space

---------

Co-authored-by: Franck Nijhof <frenck@frenck.nl>
Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>
2024-12-06 18:53:12 +01:00
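The commit above refers to a real behavior change: in Python 3.13, `getpass.getuser()` raises `OSError` when the username cannot be determined, where older versions could raise `KeyError` (or `ImportError` on some platforms). A minimal sketch of cross-version handling — the helper name and fallback value here are illustrative, not Home Assistant's actual code:

```python
import getpass


def get_user_or_default(default: str = "unknown") -> str:
    """Return the current username, tolerating pre- and post-3.13 failures.

    Python 3.13 raises OSError when the user cannot be determined;
    earlier versions could raise KeyError (or ImportError) instead.
    """
    try:
        return getpass.getuser()
    except (KeyError, OSError):
        # Fall back rather than crash when no username is available,
        # e.g. in a stripped-down container environment.
        return default


print(get_user_or_default())
```

Catching both exception types keeps one code path working across interpreter versions instead of branching on `sys.version_info`.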
J. Nick Koston 1dfd4e80b9 Bump aioesphomeapi to 28.0.0 (#132447) 2024-12-06 18:53:09 +01:00
J. Nick Koston d919de6734 Bump aiohttp to 3.11.10 (#132441) 2024-12-06 18:53:06 +01:00
Blake Bryant 3f9f0f8ac2 Bump pydeako to 0.6.0 (#132432)
feat: update deako integration to use improved version of pydeako

Some things of note:
- simplified errors
- pydeako has introduced some connection improvements

See here: https://github.com/DeakoLights/pydeako/releases/tag/0.6.0
2024-12-06 18:53:03 +01:00
Glenn Waters bf20ffae96 Bump upb-lib to 0.5.9 (#132411) 2024-12-06 18:53:00 +01:00
Diogo Gomes dad81927cb Removes references to croniter from utility_meter (#132364)
remove croniter
2024-12-06 18:52:56 +01:00
robinostlund 92392ab3d4 Add missing UnitOfPower to sensor (#132352)
* Add missing UnitOfPower to sensor

* Update homeassistant/components/sensor/const.py

Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>

* adding to number

---------

Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>
2024-12-06 18:52:53 +01:00
Brett Adams a47e5398f0 Bump tesla-fleet-api to 0.8.5 (#132339) 2024-12-06 18:52:50 +01:00
J. Nick Koston cf6d33635b Fix deprecated call to mimetypes.guess_type in CachingStaticResource (#132299) 2024-12-06 18:52:47 +01:00
Alberto Geniola 6a4031a383 Bump elmax-api to 0.0.6.3 (#131876) 2024-12-06 18:52:39 +01:00
971 changed files with 54252 additions and 7590 deletions
+2
@@ -6,6 +6,7 @@ core: &core
- homeassistant/helpers/**
- homeassistant/package_constraints.txt
- homeassistant/util/**
+- mypy.ini
- pyproject.toml
- requirements.txt
- setup.cfg
@@ -131,6 +132,7 @@ tests: &tests
- tests/components/conftest.py
- tests/components/diagnostics/**
- tests/components/history/**
+- tests/components/light/common.py
- tests/components/logbook/**
- tests/components/recorder/**
- tests/components/repairs/**
+2 -2
@@ -69,7 +69,7 @@ jobs:
run: find ./homeassistant/components/*/translations -name "*.json" | tar zcvf translations.tar.gz -T -
- name: Upload translations
-uses: actions/upload-artifact@v4.4.3
+uses: actions/upload-artifact@v4.5.0
with:
name: translations
path: translations.tar.gz
@@ -517,7 +517,7 @@ jobs:
tags: ${{ env.HASSFEST_IMAGE_TAG }}
- name: Run hassfest against core
-run: docker run --rm -v ${{ github.workspace }}/homeassistant:/github/workspace/homeassistant ${{ env.HASSFEST_IMAGE_TAG }} --core-integrations-path=/github/workspace/homeassistant/components
+run: docker run --rm -v ${{ github.workspace }}:/github/workspace ${{ env.HASSFEST_IMAGE_TAG }} --core-path=/github/workspace
- name: Push Docker image
if: needs.init.outputs.channel != 'dev' && needs.init.outputs.publish == 'true'
+13 -13
@@ -537,7 +537,7 @@ jobs:
python --version
uv pip freeze >> pip_freeze.txt
- name: Upload pip_freeze artifact
-uses: actions/upload-artifact@v4.4.3
+uses: actions/upload-artifact@v4.5.0
with:
name: pip-freeze-${{ matrix.python-version }}
path: pip_freeze.txt
@@ -661,7 +661,7 @@ jobs:
. venv/bin/activate
python -m script.licenses extract --output-file=licenses-${{ matrix.python-version }}.json
- name: Upload licenses
-uses: actions/upload-artifact@v4.4.3
+uses: actions/upload-artifact@v4.5.0
with:
name: licenses-${{ github.run_number }}-${{ matrix.python-version }}
path: licenses-${{ matrix.python-version }}.json
@@ -877,7 +877,7 @@ jobs:
. venv/bin/activate
python -m script.split_tests ${{ needs.info.outputs.test_group_count }} tests
- name: Upload pytest_buckets
-uses: actions/upload-artifact@v4.4.3
+uses: actions/upload-artifact@v4.5.0
with:
name: pytest_buckets
path: pytest_buckets.txt
@@ -979,14 +979,14 @@ jobs:
2>&1 | tee pytest-${{ matrix.python-version }}-${{ matrix.group }}.txt
- name: Upload pytest output
if: success() || failure() && steps.pytest-full.conclusion == 'failure'
-uses: actions/upload-artifact@v4.4.3
+uses: actions/upload-artifact@v4.5.0
with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{ matrix.group }}
path: pytest-*.txt
overwrite: true
- name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true'
-uses: actions/upload-artifact@v4.4.3
+uses: actions/upload-artifact@v4.5.0
with:
name: coverage-${{ matrix.python-version }}-${{ matrix.group }}
path: coverage.xml
@@ -1106,7 +1106,7 @@ jobs:
2>&1 | tee pytest-${{ matrix.python-version }}-${mariadb}.txt
- name: Upload pytest output
if: success() || failure() && steps.pytest-partial.conclusion == 'failure'
-uses: actions/upload-artifact@v4.4.3
+uses: actions/upload-artifact@v4.5.0
with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.mariadb }}
@@ -1114,7 +1114,7 @@ jobs:
overwrite: true
- name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true'
-uses: actions/upload-artifact@v4.4.3
+uses: actions/upload-artifact@v4.5.0
with:
name: coverage-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.mariadb }}
@@ -1236,7 +1236,7 @@ jobs:
2>&1 | tee pytest-${{ matrix.python-version }}-${postgresql}.txt
- name: Upload pytest output
if: success() || failure() && steps.pytest-partial.conclusion == 'failure'
-uses: actions/upload-artifact@v4.4.3
+uses: actions/upload-artifact@v4.5.0
with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.postgresql }}
@@ -1244,7 +1244,7 @@ jobs:
overwrite: true
- name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true'
-uses: actions/upload-artifact@v4.4.3
+uses: actions/upload-artifact@v4.5.0
with:
name: coverage-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.postgresql }}
@@ -1273,7 +1273,7 @@ jobs:
pattern: coverage-*
- name: Upload coverage to Codecov
if: needs.info.outputs.test_full_suite == 'true'
-uses: codecov/codecov-action@v5.1.1
+uses: codecov/codecov-action@v5.1.2
with:
fail_ci_if_error: true
flags: full-suite
@@ -1378,14 +1378,14 @@ jobs:
2>&1 | tee pytest-${{ matrix.python-version }}-${{ matrix.group }}.txt
- name: Upload pytest output
if: success() || failure() && steps.pytest-partial.conclusion == 'failure'
-uses: actions/upload-artifact@v4.4.3
+uses: actions/upload-artifact@v4.5.0
with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{ matrix.group }}
path: pytest-*.txt
overwrite: true
- name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true'
-uses: actions/upload-artifact@v4.4.3
+uses: actions/upload-artifact@v4.5.0
with:
name: coverage-${{ matrix.python-version }}-${{ matrix.group }}
path: coverage.xml
@@ -1411,7 +1411,7 @@ jobs:
pattern: coverage-*
- name: Upload coverage to Codecov
if: needs.info.outputs.test_full_suite == 'false'
-uses: codecov/codecov-action@v5.1.1
+uses: codecov/codecov-action@v5.1.2
with:
fail_ci_if_error: true
token: ${{ secrets.CODECOV_TOKEN }}
+2 -2
@@ -24,11 +24,11 @@ jobs:
uses: actions/checkout@v4.2.2
- name: Initialize CodeQL
-uses: github/codeql-action/init@v3.27.6
+uses: github/codeql-action/init@v3.27.9
with:
languages: python
- name: Perform CodeQL Analysis
-uses: github/codeql-action/analyze@v3.27.6
+uses: github/codeql-action/analyze@v3.27.9
with:
category: "/language:python"
+3 -30
@@ -79,7 +79,7 @@ jobs:
) > .env_file
- name: Upload env_file
-uses: actions/upload-artifact@v4.4.3
+uses: actions/upload-artifact@v4.5.0
with:
name: env_file
path: ./.env_file
@@ -87,7 +87,7 @@ jobs:
overwrite: true
- name: Upload requirements_diff
-uses: actions/upload-artifact@v4.4.3
+uses: actions/upload-artifact@v4.5.0
with:
name: requirements_diff
path: ./requirements_diff.txt
@@ -99,7 +99,7 @@ jobs:
python -m script.gen_requirements_all ci
- name: Upload requirements_all_wheels
-uses: actions/upload-artifact@v4.4.3
+uses: actions/upload-artifact@v4.5.0
with:
name: requirements_all_wheels
path: ./requirements_all_wheels_*.txt
@@ -197,33 +197,6 @@ jobs:
split -l $(expr $(expr $(cat requirements_all.txt | wc -l) + 1) / 3) requirements_all_wheels_${{ matrix.arch }}.txt requirements_all.txt
- name: Create requirements for cython<3
if: matrix.abi == 'cp312'
run: |
# Some dependencies still require 'cython<3'
# and don't yet use isolated build environments.
# Build these first.
# pydantic: https://github.com/pydantic/pydantic/issues/7689
touch requirements_old-cython.txt
cat homeassistant/package_constraints.txt | grep 'pydantic==' >> requirements_old-cython.txt
- name: Build wheels (old cython)
uses: home-assistant/wheels@2024.11.0
if: matrix.abi == 'cp312'
with:
abi: ${{ matrix.abi }}
tag: musllinux_1_2
arch: ${{ matrix.arch }}
wheels-key: ${{ secrets.WHEELS_KEY }}
env-file: true
apk: "bluez-dev;libffi-dev;openssl-dev;glib-dev;eudev-dev;libxml2-dev;libxslt-dev;libpng-dev;libjpeg-turbo-dev;tiff-dev;cups-dev;gmp-dev;mpfr-dev;mpc1-dev;ffmpeg-dev;gammu-dev;yaml-dev;openblas-dev;fftw-dev;lapack-dev;gfortran;blas-dev;eigen-dev;freetype-dev;glew-dev;harfbuzz-dev;hdf5-dev;libdc1394-dev;libtbb-dev;mesa-dev;openexr-dev;openjpeg-dev;uchardet-dev"
skip-binary: aiohttp;charset-normalizer;grpcio;multidict;SQLAlchemy;propcache;protobuf;pydantic;pymicro-vad;yarl
constraints: "homeassistant/package_constraints.txt"
requirements-diff: "requirements_diff.txt"
requirements: "requirements_old-cython.txt"
pip: "'cython<3'"
- name: Build wheels (part 1)
uses: home-assistant/wheels@2024.11.0
with:
+2 -2
@@ -1,6 +1,6 @@
repos:
- repo: https://github.com/astral-sh/ruff-pre-commit
-rev: v0.8.1
+rev: v0.8.3
hooks:
- id: ruff
args:
@@ -12,7 +12,7 @@ repos:
hooks:
- id: codespell
args:
-- --ignore-words-list=astroid,checkin,currenty,hass,iif,incomfort,lookin,nam,NotIn
+- --ignore-words-list=aiport,astroid,checkin,currenty,hass,iif,incomfort,lookin,nam,NotIn
- --skip="./.*,*.csv,*.json,*.ambr"
- --quiet-level=2
exclude_types: [csv, json, html]
+5
@@ -137,6 +137,7 @@ homeassistant.components.co2signal.*
homeassistant.components.command_line.*
homeassistant.components.config.*
homeassistant.components.configurator.*
+homeassistant.components.cookidoo.*
homeassistant.components.counter.*
homeassistant.components.cover.*
homeassistant.components.cpuspeed.*
@@ -169,6 +170,7 @@ homeassistant.components.easyenergy.*
homeassistant.components.ecovacs.*
homeassistant.components.ecowitt.*
homeassistant.components.efergy.*
+homeassistant.components.eheimdigital.*
homeassistant.components.electrasmart.*
homeassistant.components.electric_kiwi.*
homeassistant.components.elevenlabs.*
@@ -269,6 +271,7 @@ homeassistant.components.ios.*
homeassistant.components.iotty.*
homeassistant.components.ipp.*
homeassistant.components.iqvia.*
+homeassistant.components.iron_os.*
homeassistant.components.islamic_prayer_times.*
homeassistant.components.isy994.*
homeassistant.components.jellyfin.*
@@ -360,6 +363,7 @@ homeassistant.components.otbr.*
homeassistant.components.overkiz.*
homeassistant.components.p1_monitor.*
homeassistant.components.panel_custom.*
+homeassistant.components.peblar.*
homeassistant.components.peco.*
homeassistant.components.persistent_notification.*
homeassistant.components.pi_hole.*
@@ -402,6 +406,7 @@ homeassistant.components.romy.*
homeassistant.components.rpi_power.*
homeassistant.components.rss_feed_template.*
homeassistant.components.rtsp_to_webrtc.*
+homeassistant.components.russound_rio.*
homeassistant.components.ruuvi_gateway.*
homeassistant.components.ruuvitag_ble.*
homeassistant.components.samsungtv.*
+14 -4
@@ -284,6 +284,8 @@ build.json @home-assistant/supervisor
/tests/components/control4/ @lawtancool
/homeassistant/components/conversation/ @home-assistant/core @synesthesiam
/tests/components/conversation/ @home-assistant/core @synesthesiam
+/homeassistant/components/cookidoo/ @miaucl
+/tests/components/cookidoo/ @miaucl
/homeassistant/components/coolmaster/ @OnFreund
/tests/components/coolmaster/ @OnFreund
/homeassistant/components/counter/ @fabaff
@@ -385,6 +387,8 @@ build.json @home-assistant/supervisor
/homeassistant/components/efergy/ @tkdrob
/tests/components/efergy/ @tkdrob
/homeassistant/components/egardia/ @jeroenterheerdt
+/homeassistant/components/eheimdigital/ @autinerd
+/tests/components/eheimdigital/ @autinerd
/homeassistant/components/electrasmart/ @jafar-atili
/tests/components/electrasmart/ @jafar-atili
/homeassistant/components/electric_kiwi/ @mikey0000
@@ -727,8 +731,8 @@ build.json @home-assistant/supervisor
/tests/components/ios/ @robbiet480
/homeassistant/components/iotawatt/ @gtdiehl @jyavenard
/tests/components/iotawatt/ @gtdiehl @jyavenard
-/homeassistant/components/iotty/ @pburgio @shapournemati-iotty
-/tests/components/iotty/ @pburgio @shapournemati-iotty
+/homeassistant/components/iotty/ @shapournemati-iotty
+/tests/components/iotty/ @shapournemati-iotty
/homeassistant/components/iperf3/ @rohankapoorcom
/homeassistant/components/ipma/ @dgomes
/tests/components/ipma/ @dgomes
@@ -1049,6 +1053,8 @@ build.json @home-assistant/supervisor
/homeassistant/components/octoprint/ @rfleming71
/tests/components/octoprint/ @rfleming71
/homeassistant/components/ohmconnect/ @robbiet480
+/homeassistant/components/ohme/ @dan-r
+/tests/components/ohme/ @dan-r
/homeassistant/components/ollama/ @synesthesiam
/tests/components/ollama/ @synesthesiam
/homeassistant/components/ombi/ @larssont
@@ -1060,8 +1066,8 @@ build.json @home-assistant/supervisor
/tests/components/ondilo_ico/ @JeromeHXP
/homeassistant/components/onewire/ @garbled1 @epenet
/tests/components/onewire/ @garbled1 @epenet
-/homeassistant/components/onkyo/ @arturpragacz
-/tests/components/onkyo/ @arturpragacz
+/homeassistant/components/onkyo/ @arturpragacz @eclair4151
+/tests/components/onkyo/ @arturpragacz @eclair4151
/homeassistant/components/onvif/ @hunterjm
/tests/components/onvif/ @hunterjm
/homeassistant/components/open_meteo/ @frenck
@@ -1107,6 +1113,8 @@ build.json @home-assistant/supervisor
/tests/components/palazzetti/ @dotvav
/homeassistant/components/panel_custom/ @home-assistant/frontend
/tests/components/panel_custom/ @home-assistant/frontend
+/homeassistant/components/peblar/ @frenck
+/tests/components/peblar/ @frenck
/homeassistant/components/peco/ @IceBotYT
/tests/components/peco/ @IceBotYT
/homeassistant/components/pegel_online/ @mib1185
@@ -1359,6 +1367,8 @@ build.json @home-assistant/supervisor
/homeassistant/components/sleepiq/ @mfugate1 @kbickar
/tests/components/sleepiq/ @mfugate1 @kbickar
/homeassistant/components/slide/ @ualex73
/homeassistant/components/slide_local/ @dontinelli
/tests/components/slide_local/ @dontinelli
/homeassistant/components/slimproto/ @marcelveldt
/tests/components/slimproto/ @marcelveldt
/homeassistant/components/sma/ @kellerza @rklomp
+1 -1
@@ -13,7 +13,7 @@ ENV \
ARG QEMU_CPU
# Install uv
RUN pip3 install uv==0.5.4
RUN pip3 install uv==0.5.8
WORKDIR /usr/src
+1 -1
@@ -1,4 +1,4 @@
FROM mcr.microsoft.com/devcontainers/python:1-3.12
FROM mcr.microsoft.com/devcontainers/python:1-3.13
SHELL ["/bin/bash", "-o", "pipefail", "-c"]
+5 -5
@@ -1,10 +1,10 @@
image: ghcr.io/home-assistant/{arch}-homeassistant
build_from:
aarch64: ghcr.io/home-assistant/aarch64-homeassistant-base:2024.11.0
armhf: ghcr.io/home-assistant/armhf-homeassistant-base:2024.11.0
armv7: ghcr.io/home-assistant/armv7-homeassistant-base:2024.11.0
amd64: ghcr.io/home-assistant/amd64-homeassistant-base:2024.11.0
i386: ghcr.io/home-assistant/i386-homeassistant-base:2024.11.0
aarch64: ghcr.io/home-assistant/aarch64-homeassistant-base:2024.12.0
armhf: ghcr.io/home-assistant/armhf-homeassistant-base:2024.12.0
armv7: ghcr.io/home-assistant/armv7-homeassistant-base:2024.12.0
amd64: ghcr.io/home-assistant/amd64-homeassistant-base:2024.12.0
i386: ghcr.io/home-assistant/i386-homeassistant-base:2024.12.0
codenotary:
signer: notary@home-assistant.io
base_image: notary@home-assistant.io
+1 -1
@@ -115,7 +115,7 @@ class AuthManagerFlowManager(
*,
context: AuthFlowContext | None = None,
data: dict[str, Any] | None = None,
) -> LoginFlow:
) -> LoginFlow[Any]:
"""Create a login flow."""
auth_provider = self.auth_manager.get_auth_provider(*handler_key)
if not auth_provider:
+14 -4
@@ -4,8 +4,9 @@ from __future__ import annotations
import logging
import types
from typing import Any
from typing import Any, Generic
from typing_extensions import TypeVar
import voluptuous as vol
from voluptuous.humanize import humanize_error
@@ -34,6 +35,12 @@ DATA_REQS: HassKey[set[str]] = HassKey("mfa_auth_module_reqs_processed")
_LOGGER = logging.getLogger(__name__)
_MultiFactorAuthModuleT = TypeVar(
"_MultiFactorAuthModuleT",
bound="MultiFactorAuthModule",
default="MultiFactorAuthModule",
)
class MultiFactorAuthModule:
"""Multi-factor Auth Module of validation function."""
@@ -71,7 +78,7 @@ class MultiFactorAuthModule:
"""Return a voluptuous schema to define mfa auth module's input."""
raise NotImplementedError
async def async_setup_flow(self, user_id: str) -> SetupFlow:
async def async_setup_flow(self, user_id: str) -> SetupFlow[Any]:
"""Return a data entry flow handler for setup module.
Mfa module should extend SetupFlow
@@ -95,11 +102,14 @@ class MultiFactorAuthModule:
raise NotImplementedError
class SetupFlow(data_entry_flow.FlowHandler):
class SetupFlow(data_entry_flow.FlowHandler, Generic[_MultiFactorAuthModuleT]):
"""Handler for the setup flow."""
def __init__(
self, auth_module: MultiFactorAuthModule, setup_schema: vol.Schema, user_id: str
self,
auth_module: _MultiFactorAuthModuleT,
setup_schema: vol.Schema,
user_id: str,
) -> None:
"""Initialize the setup flow."""
self._auth_module = auth_module
+2 -4
@@ -162,7 +162,7 @@ class NotifyAuthModule(MultiFactorAuthModule):
return sorted(unordered_services)
async def async_setup_flow(self, user_id: str) -> SetupFlow:
async def async_setup_flow(self, user_id: str) -> NotifySetupFlow:
"""Return a data entry flow handler for setup module.
Mfa module should extend SetupFlow
@@ -268,7 +268,7 @@ class NotifyAuthModule(MultiFactorAuthModule):
await self.hass.services.async_call("notify", notify_service, data)
class NotifySetupFlow(SetupFlow):
class NotifySetupFlow(SetupFlow[NotifyAuthModule]):
"""Handler for the setup flow."""
def __init__(
@@ -280,8 +280,6 @@ class NotifySetupFlow(SetupFlow):
) -> None:
"""Initialize the setup flow."""
super().__init__(auth_module, setup_schema, user_id)
# to fix typing complaint
self._auth_module: NotifyAuthModule = auth_module
self._available_notify_services = available_notify_services
self._secret: str | None = None
self._count: int | None = None
+2 -3
@@ -114,7 +114,7 @@ class TotpAuthModule(MultiFactorAuthModule):
self._users[user_id] = ota_secret # type: ignore[index]
return ota_secret
async def async_setup_flow(self, user_id: str) -> SetupFlow:
async def async_setup_flow(self, user_id: str) -> TotpSetupFlow:
"""Return a data entry flow handler for setup module.
Mfa module should extend SetupFlow
@@ -174,10 +174,9 @@ class TotpAuthModule(MultiFactorAuthModule):
return bool(pyotp.TOTP(ota_secret).verify(code, valid_window=1))
class TotpSetupFlow(SetupFlow):
class TotpSetupFlow(SetupFlow[TotpAuthModule]):
"""Handler for the setup flow."""
_auth_module: TotpAuthModule
_ota_secret: str
_url: str
_image: str
+10 -4
@@ -5,8 +5,9 @@ from __future__ import annotations
from collections.abc import Mapping
import logging
import types
from typing import Any
from typing import Any, Generic
from typing_extensions import TypeVar
import voluptuous as vol
from voluptuous.humanize import humanize_error
@@ -46,6 +47,8 @@ AUTH_PROVIDER_SCHEMA = vol.Schema(
extra=vol.ALLOW_EXTRA,
)
_AuthProviderT = TypeVar("_AuthProviderT", bound="AuthProvider", default="AuthProvider")
class AuthProvider:
"""Provider of user authentication."""
@@ -105,7 +108,7 @@ class AuthProvider:
# Implement by extending class
async def async_login_flow(self, context: AuthFlowContext | None) -> LoginFlow:
async def async_login_flow(self, context: AuthFlowContext | None) -> LoginFlow[Any]:
"""Return the data flow for logging in with auth provider.
Auth provider should extend LoginFlow and return an instance.
@@ -192,12 +195,15 @@ async def load_auth_provider_module(
return module
class LoginFlow(FlowHandler[AuthFlowContext, AuthFlowResult, tuple[str, str]]):
class LoginFlow(
FlowHandler[AuthFlowContext, AuthFlowResult, tuple[str, str]],
Generic[_AuthProviderT],
):
"""Handler for the login flow."""
_flow_result = AuthFlowResult
def __init__(self, auth_provider: AuthProvider) -> None:
def __init__(self, auth_provider: _AuthProviderT) -> None:
"""Initialize the login flow."""
self._auth_provider = auth_provider
self._auth_module_id: str | None = None
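The hunk above parameterizes `LoginFlow` with an `_AuthProviderT` type variable (using `typing_extensions.TypeVar` so a `default` can be given). This is what lets the concrete flows later in this diff drop their `cast(...)` calls: once a subclass binds the parameter, `self._auth_provider` is already narrowed to the concrete provider type. A minimal runtime sketch of the pattern — the class and method names here are illustrative, not the Home Assistant API:

```python
from typing import Generic, TypeVar


class AuthProvider:
    """Base provider (stand-in for homeassistant.auth.providers.AuthProvider)."""


class ExampleProvider(AuthProvider):
    """A concrete provider with a method the base class lacks."""

    def validate(self, username: str) -> bool:
        return username == "admin"


_AuthProviderT = TypeVar("_AuthProviderT", bound=AuthProvider)


class LoginFlow(Generic[_AuthProviderT]):
    """Generic flow: the provider attribute carries the bound type parameter."""

    def __init__(self, auth_provider: _AuthProviderT) -> None:
        self._auth_provider = auth_provider


class ExampleLoginFlow(LoginFlow[ExampleProvider]):
    def check(self, username: str) -> bool:
        # No cast needed: self._auth_provider is typed as ExampleProvider here.
        return self._auth_provider.validate(username)
```

The same design shows up twice in this changeset: `SetupFlow[...]` for MFA modules and `LoginFlow[...]` for auth providers.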
+8 -6
@@ -6,7 +6,7 @@ import asyncio
from collections.abc import Mapping
import logging
import os
from typing import Any, cast
from typing import Any
import voluptuous as vol
@@ -59,7 +59,9 @@ class CommandLineAuthProvider(AuthProvider):
super().__init__(*args, **kwargs)
self._user_meta: dict[str, dict[str, Any]] = {}
async def async_login_flow(self, context: AuthFlowContext | None) -> LoginFlow:
async def async_login_flow(
self, context: AuthFlowContext | None
) -> CommandLineLoginFlow:
"""Return a flow to login."""
return CommandLineLoginFlow(self)
@@ -133,7 +135,7 @@ class CommandLineAuthProvider(AuthProvider):
)
class CommandLineLoginFlow(LoginFlow):
class CommandLineLoginFlow(LoginFlow[CommandLineAuthProvider]):
"""Handler for the login flow."""
async def async_step_init(
@@ -145,9 +147,9 @@ class CommandLineLoginFlow(LoginFlow):
if user_input is not None:
user_input["username"] = user_input["username"].strip()
try:
await cast(
CommandLineAuthProvider, self._auth_provider
).async_validate_login(user_input["username"], user_input["password"])
await self._auth_provider.async_validate_login(
user_input["username"], user_input["password"]
)
except InvalidAuthError:
errors["base"] = "invalid_auth"
@@ -305,7 +305,7 @@ class HassAuthProvider(AuthProvider):
await data.async_load()
self.data = data
async def async_login_flow(self, context: AuthFlowContext | None) -> LoginFlow:
async def async_login_flow(self, context: AuthFlowContext | None) -> HassLoginFlow:
"""Return a flow to login."""
return HassLoginFlow(self)
@@ -400,7 +400,7 @@ class HassAuthProvider(AuthProvider):
pass
class HassLoginFlow(LoginFlow):
class HassLoginFlow(LoginFlow[HassAuthProvider]):
"""Handler for the login flow."""
async def async_step_init(
@@ -411,7 +411,7 @@ class HassLoginFlow(LoginFlow):
if user_input is not None:
try:
await cast(HassAuthProvider, self._auth_provider).async_validate_login(
await self._auth_provider.async_validate_login(
user_input["username"], user_input["password"]
)
except InvalidAuth:
@@ -4,7 +4,6 @@ from __future__ import annotations
from collections.abc import Mapping
import hmac
from typing import cast
import voluptuous as vol
@@ -36,7 +35,9 @@ class InvalidAuthError(HomeAssistantError):
class ExampleAuthProvider(AuthProvider):
"""Example auth provider based on hardcoded usernames and passwords."""
async def async_login_flow(self, context: AuthFlowContext | None) -> LoginFlow:
async def async_login_flow(
self, context: AuthFlowContext | None
) -> ExampleLoginFlow:
"""Return a flow to login."""
return ExampleLoginFlow(self)
@@ -93,7 +94,7 @@ class ExampleAuthProvider(AuthProvider):
return UserMeta(name=name, is_active=True)
class ExampleLoginFlow(LoginFlow):
class ExampleLoginFlow(LoginFlow[ExampleAuthProvider]):
"""Handler for the login flow."""
async def async_step_init(
@@ -104,7 +105,7 @@ class ExampleLoginFlow(LoginFlow):
if user_input is not None:
try:
cast(ExampleAuthProvider, self._auth_provider).async_validate_login(
self._auth_provider.async_validate_login(
user_input["username"], user_input["password"]
)
except InvalidAuthError:
@@ -104,7 +104,9 @@ class TrustedNetworksAuthProvider(AuthProvider):
"""Trusted Networks auth provider does not support MFA."""
return False
async def async_login_flow(self, context: AuthFlowContext | None) -> LoginFlow:
async def async_login_flow(
self, context: AuthFlowContext | None
) -> TrustedNetworksLoginFlow:
"""Return a flow to login."""
assert context is not None
ip_addr = cast(IPAddress, context.get("ip_address"))
@@ -214,7 +216,7 @@ class TrustedNetworksAuthProvider(AuthProvider):
self.async_validate_access(ip_address(remote_ip))
class TrustedNetworksLoginFlow(LoginFlow):
class TrustedNetworksLoginFlow(LoginFlow[TrustedNetworksAuthProvider]):
"""Handler for the login flow."""
def __init__(
@@ -235,9 +237,7 @@ class TrustedNetworksLoginFlow(LoginFlow):
) -> AuthFlowResult:
"""Handle the step of the form."""
try:
cast(
TrustedNetworksAuthProvider, self._auth_provider
).async_validate_access(self._ip_address)
self._auth_provider.async_validate_access(self._ip_address)
except InvalidAuthError:
return self.async_abort(reason="not_allowed")
+81 -23
@@ -1,6 +1,10 @@
"""Home Assistant module to handle restoring backups."""
from __future__ import annotations
from collections.abc import Iterable
from dataclasses import dataclass
import hashlib
import json
import logging
from pathlib import Path
@@ -14,7 +18,12 @@ import securetar
from .const import __version__ as HA_VERSION
RESTORE_BACKUP_FILE = ".HA_RESTORE"
KEEP_PATHS = ("backups",)
KEEP_BACKUPS = ("backups",)
KEEP_DATABASE = (
"home-assistant_v2.db",
"home-assistant_v2.db-wal",
)
_LOGGER = logging.getLogger(__name__)
@@ -24,6 +33,21 @@ class RestoreBackupFileContent:
"""Definition for restore backup file content."""
backup_file_path: Path
password: str | None
remove_after_restore: bool
restore_database: bool
restore_homeassistant: bool
def password_to_key(password: str) -> bytes:
"""Generate an AES key from a password.
Matches the implementation in supervisor.backups.utils.password_to_key.
"""
key: bytes = password.encode()
for _ in range(100):
key = hashlib.sha256(key).digest()
return key[:16]
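The key derivation added above mirrors `supervisor.backups.utils.password_to_key`: encode the password, apply SHA-256 one hundred times, and keep the first 16 bytes as the AES-128 key. As a self-contained sketch, identical to the function in the hunk:

```python
import hashlib


def password_to_key(password: str) -> bytes:
    """Derive a 16-byte AES key by hashing the password 100 times with SHA-256."""
    key: bytes = password.encode()
    for _ in range(100):
        key = hashlib.sha256(key).digest()
    return key[:16]
```

The derivation is deterministic, so the same password always yields the same key; this is why the restore path can decrypt an archive created by the Supervisor with only the password.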
def restore_backup_file_content(config_dir: Path) -> RestoreBackupFileContent | None:
@@ -32,20 +56,27 @@ def restore_backup_file_content(config_dir: Path) -> RestoreBackupFileContent |
try:
instruction_content = json.loads(instruction_path.read_text(encoding="utf-8"))
return RestoreBackupFileContent(
backup_file_path=Path(instruction_content["path"])
backup_file_path=Path(instruction_content["path"]),
password=instruction_content["password"],
remove_after_restore=instruction_content["remove_after_restore"],
restore_database=instruction_content["restore_database"],
restore_homeassistant=instruction_content["restore_homeassistant"],
)
except (FileNotFoundError, json.JSONDecodeError):
except (FileNotFoundError, KeyError, json.JSONDecodeError):
return None
finally:
# Always remove the backup instruction file to prevent a boot loop
instruction_path.unlink(missing_ok=True)
def _clear_configuration_directory(config_dir: Path) -> None:
"""Delete all files and directories in the config directory except for the backups directory."""
keep_paths = [config_dir.joinpath(path) for path in KEEP_PATHS]
config_contents = sorted(
[entry for entry in config_dir.iterdir() if entry not in keep_paths]
def _clear_configuration_directory(config_dir: Path, keep: Iterable[str]) -> None:
"""Delete all files and directories in the config directory except entries in the keep list."""
keep_paths = [config_dir.joinpath(path) for path in keep]
entries_to_remove = sorted(
entry for entry in config_dir.iterdir() if entry not in keep_paths
)
for entry in config_contents:
for entry in entries_to_remove:
entrypath = config_dir.joinpath(entry)
if entrypath.is_file():
@@ -54,12 +85,15 @@ def _clear_configuration_directory(config_dir: Path) -> None:
shutil.rmtree(entrypath)
def _extract_backup(config_dir: Path, backup_file_path: Path) -> None:
def _extract_backup(
config_dir: Path,
restore_content: RestoreBackupFileContent,
) -> None:
"""Extract the backup file to the config directory."""
with (
TemporaryDirectory() as tempdir,
securetar.SecureTarFile(
backup_file_path,
restore_content.backup_file_path,
gzip=False,
mode="r",
) as ostf,
@@ -88,22 +122,41 @@ def _extract_backup(config_dir: Path, backup_file_path: Path) -> None:
f"homeassistant.tar{'.gz' if backup_meta["compressed"] else ''}",
),
gzip=backup_meta["compressed"],
key=password_to_key(restore_content.password)
if restore_content.password is not None
else None,
mode="r",
) as istf:
for member in istf.getmembers():
if member.name == "data":
continue
member.name = member.name.replace("data/", "")
_clear_configuration_directory(config_dir)
istf.extractall(
path=config_dir,
members=[
member
for member in securetar.secure_path(istf)
if member.name != "data"
],
path=Path(tempdir, "homeassistant"),
members=securetar.secure_path(istf),
filter="fully_trusted",
)
if restore_content.restore_homeassistant:
keep = list(KEEP_BACKUPS)
if not restore_content.restore_database:
keep.extend(KEEP_DATABASE)
_clear_configuration_directory(config_dir, keep)
shutil.copytree(
Path(tempdir, "homeassistant", "data"),
config_dir,
dirs_exist_ok=True,
ignore=shutil.ignore_patterns(*(keep)),
)
elif restore_content.restore_database:
for entry in KEEP_DATABASE:
entrypath = config_dir / entry
if entrypath.is_file():
entrypath.unlink()
elif entrypath.is_dir():
shutil.rmtree(entrypath)
for entry in KEEP_DATABASE:
shutil.copy(
Path(tempdir, "homeassistant", "data", entry),
config_dir,
)
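The restore flow above first clears the configuration directory except for a keep list (always `backups`, plus the database files when `restore_database` is false), then copies the extracted data over it. A standalone sketch of the clearing step, following the `_clear_configuration_directory` signature in the hunk:

```python
import shutil
from pathlib import Path
from tempfile import TemporaryDirectory  # used in the demo assertions


def clear_configuration_directory(config_dir: Path, keep: list[str]) -> None:
    """Delete everything in config_dir except entries named in the keep list."""
    keep_paths = [config_dir / name for name in keep]
    for entry in sorted(config_dir.iterdir()):
        if entry in keep_paths:
            continue
        if entry.is_file():
            entry.unlink()
        else:
            shutil.rmtree(entry)
```

With `keep=["backups", "home-assistant_v2.db", "home-assistant_v2.db-wal"]` this reproduces the "restore Home Assistant but keep the database" branch.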
def restore_backup(config_dir_path: str) -> bool:
@@ -119,8 +172,13 @@ def restore_backup(config_dir_path: str) -> bool:
backup_file_path = restore_content.backup_file_path
_LOGGER.info("Restoring %s", backup_file_path)
try:
_extract_backup(config_dir, backup_file_path)
_extract_backup(
config_dir=config_dir,
restore_content=restore_content,
)
except FileNotFoundError as err:
raise ValueError(f"Backup file {backup_file_path} does not exist") from err
if restore_content.remove_after_restore:
backup_file_path.unlink(missing_ok=True)
_LOGGER.info("Restore complete, restarting")
return True
+7 -1
@@ -50,6 +50,12 @@ def _check_sleep_call_allowed(mapped_args: dict[str, Any]) -> bool:
return False
def _check_load_verify_locations_call_allowed(mapped_args: dict[str, Any]) -> bool:
# If only cadata is passed, we can ignore it
kwargs = mapped_args.get("kwargs")
return bool(kwargs and len(kwargs) == 1 and "cadata" in kwargs)
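The predicate above lets the blocking-call detector permit `SSLContext.load_verify_locations` when `cadata` is the only keyword argument, since loading in-memory PEM data does no disk I/O. Its behavior, restated as a self-contained function:

```python
from typing import Any


def check_load_verify_locations_call_allowed(mapped_args: dict[str, Any]) -> bool:
    """Allow the call only when cadata is the sole keyword argument;
    cafile/capath arguments still trigger the blocking-call warning."""
    kwargs = mapped_args.get("kwargs")
    return bool(kwargs and len(kwargs) == 1 and "cadata" in kwargs)
```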
@dataclass(slots=True, frozen=True)
class BlockingCall:
"""Class to hold information about a blocking call."""
@@ -158,7 +164,7 @@ _BLOCKING_CALLS: tuple[BlockingCall, ...] = (
original_func=SSLContext.load_verify_locations,
object=SSLContext,
function="load_verify_locations",
check_allowed=None,
check_allowed=_check_load_verify_locations_call_allowed,
strict=False,
strict_core=False,
skip_for_tests=True,
+5
@@ -0,0 +1,5 @@
{
"domain": "slide",
"name": "Slide",
"integrations": ["slide", "slide_local"]
}
+4
@@ -11,6 +11,8 @@ from homeassistant.components.light import (
ATTR_BRIGHTNESS,
ATTR_COLOR_TEMP_KELVIN,
ATTR_HS_COLOR,
DEFAULT_MAX_KELVIN,
DEFAULT_MIN_KELVIN,
ColorMode,
LightEntity,
)
@@ -40,6 +42,8 @@ class AbodeLight(AbodeDevice, LightEntity):
_device: Light
_attr_name = None
_attr_max_color_temp_kelvin = DEFAULT_MAX_KELVIN
_attr_min_color_temp_kelvin = DEFAULT_MIN_KELVIN
def turn_on(self, **kwargs: Any) -> None:
"""Turn on the light."""
@@ -25,5 +25,6 @@
"integration_type": "device",
"iot_class": "local_push",
"loggers": ["aioacaia"],
"quality_scale": "platinum",
"requirements": ["aioacaia==0.1.11"]
}
@@ -16,7 +16,7 @@ rules:
No custom actions are defined.
docs-high-level-description: done
docs-installation-instructions: done
docs-removal-instructions: todo
docs-removal-instructions: done
entity-event-setup:
status: exempt
comment: |
@@ -31,7 +31,9 @@ rules:
# Silver
action-exceptions: todo
config-entry-unloading: done
docs-configuration-parameters: todo
docs-configuration-parameters:
status: exempt
comment: No options to configure
docs-installation-parameters: todo
entity-unavailable: done
integration-owner: done
@@ -41,12 +43,16 @@ rules:
status: exempt
comment: |
This integration does not require authentication.
test-coverage: done
test-coverage: todo
# Gold
devices: done
diagnostics: done
discovery-update-info: done
discovery: done
discovery-update-info:
status: todo
comment: DHCP is still possible
discovery:
status: todo
comment: DHCP is still possible
docs-data-update: todo
docs-examples: todo
docs-known-limitations: todo
@@ -317,6 +317,7 @@ class Alexa(AlexaCapability):
"hi-IN",
"it-IT",
"ja-JP",
"nl-NL",
"pt-BR",
}
@@ -403,6 +404,7 @@ class AlexaPowerController(AlexaCapability):
"hi-IN",
"it-IT",
"ja-JP",
"nl-NL",
"pt-BR",
}
@@ -469,6 +471,7 @@ class AlexaLockController(AlexaCapability):
"hi-IN",
"it-IT",
"ja-JP",
"nl-NL",
"pt-BR",
}
@@ -523,6 +526,7 @@ class AlexaSceneController(AlexaCapability):
"hi-IN",
"it-IT",
"ja-JP",
"nl-NL",
"pt-BR",
}
@@ -562,6 +566,7 @@ class AlexaBrightnessController(AlexaCapability):
"hi-IN",
"it-IT",
"ja-JP",
"nl-NL",
"pt-BR",
}
@@ -611,6 +616,7 @@ class AlexaColorController(AlexaCapability):
"hi-IN",
"it-IT",
"ja-JP",
"nl-NL",
"pt-BR",
}
@@ -669,6 +675,7 @@ class AlexaColorTemperatureController(AlexaCapability):
"hi-IN",
"it-IT",
"ja-JP",
"nl-NL",
"pt-BR",
}
@@ -715,6 +722,7 @@ class AlexaSpeaker(AlexaCapability):
"fr-FR", # Not documented as of 2021-12-04, see PR #60489
"it-IT",
"ja-JP",
"nl-NL",
}
def name(self) -> str:
@@ -772,6 +780,7 @@ class AlexaStepSpeaker(AlexaCapability):
"es-ES",
"fr-FR", # Not documented as of 2021-12-04, see PR #60489
"it-IT",
"nl-NL",
}
def name(self) -> str:
@@ -801,6 +810,7 @@ class AlexaPlaybackController(AlexaCapability):
"hi-IN",
"it-IT",
"ja-JP",
"nl-NL",
"pt-BR",
}
@@ -859,6 +869,7 @@ class AlexaInputController(AlexaCapability):
"hi-IN",
"it-IT",
"ja-JP",
"nl-NL",
"pt-BR",
}
@@ -1104,6 +1115,7 @@ class AlexaThermostatController(AlexaCapability):
"hi-IN",
"it-IT",
"ja-JP",
"nl-NL",
"pt-BR",
}
@@ -1245,6 +1257,7 @@ class AlexaPowerLevelController(AlexaCapability):
"fr-CA",
"fr-FR",
"it-IT",
"nl-NL",
"ja-JP",
}
@@ -1723,6 +1736,7 @@ class AlexaRangeController(AlexaCapability):
"hi-IN",
"it-IT",
"ja-JP",
"nl-NL",
"pt-BR",
}
@@ -2066,6 +2080,7 @@ class AlexaToggleController(AlexaCapability):
"hi-IN",
"it-IT",
"ja-JP",
"nl-NL",
"pt-BR",
}
@@ -2212,6 +2227,7 @@ class AlexaPlaybackStateReporter(AlexaCapability):
"hi-IN",
"it-IT",
"ja-JP",
"nl-NL",
"pt-BR",
}
@@ -2267,6 +2283,7 @@ class AlexaSeekController(AlexaCapability):
"hi-IN",
"it-IT",
"ja-JP",
"nl-NL",
"pt-BR",
}
@@ -2360,6 +2377,7 @@ class AlexaEqualizerController(AlexaCapability):
"hi-IN",
"it-IT",
"ja-JP",
"nl-NL",
"pt-BR",
}
@@ -2470,6 +2488,7 @@ class AlexaCameraStreamController(AlexaCapability):
"hi-IN",
"it-IT",
"ja-JP",
"nl-NL",
"pt-BR",
}
+1
@@ -59,6 +59,7 @@ CONF_SUPPORTED_LOCALES = (
"hi-IN",
"it-IT",
"ja-JP",
"nl-NL",
"pt-BR",
)
+9 -8
@@ -376,14 +376,14 @@ async def async_api_decrease_color_temp(
) -> AlexaResponse:
"""Process a decrease color temperature request."""
entity = directive.entity
current = int(entity.attributes[light.ATTR_COLOR_TEMP])
max_mireds = int(entity.attributes[light.ATTR_MAX_MIREDS])
current = int(entity.attributes[light.ATTR_COLOR_TEMP_KELVIN])
min_kelvin = int(entity.attributes[light.ATTR_MIN_COLOR_TEMP_KELVIN])
value = min(max_mireds, current + 50)
value = max(min_kelvin, current - 500)
await hass.services.async_call(
entity.domain,
SERVICE_TURN_ON,
{ATTR_ENTITY_ID: entity.entity_id, light.ATTR_COLOR_TEMP: value},
{ATTR_ENTITY_ID: entity.entity_id, light.ATTR_COLOR_TEMP_KELVIN: value},
blocking=False,
context=context,
)
@@ -400,14 +400,14 @@ async def async_api_increase_color_temp(
) -> AlexaResponse:
"""Process an increase color temperature request."""
entity = directive.entity
current = int(entity.attributes[light.ATTR_COLOR_TEMP])
min_mireds = int(entity.attributes[light.ATTR_MIN_MIREDS])
current = int(entity.attributes[light.ATTR_COLOR_TEMP_KELVIN])
max_kelvin = int(entity.attributes[light.ATTR_MAX_COLOR_TEMP_KELVIN])
value = max(min_mireds, current - 50)
value = min(max_kelvin, current + 500)
await hass.services.async_call(
entity.domain,
SERVICE_TURN_ON,
{ATTR_ENTITY_ID: entity.entity_id, light.ATTR_COLOR_TEMP: value},
{ATTR_ENTITY_ID: entity.entity_id, light.ATTR_COLOR_TEMP_KELVIN: value},
blocking=False,
context=context,
)
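The rewritten handlers above switch from a 50-mired step to a fixed 500 K step, clamped against the light's advertised kelvin range (note the direction flips: decreasing color temperature in kelvin warms the light, where increasing mireds used to). The clamping reduces to two one-liners:

```python
def decrease_color_temp(current_kelvin: int, min_kelvin: int, step: int = 500) -> int:
    """Warm the light by one step, never going below the supported minimum."""
    return max(min_kelvin, current_kelvin - step)


def increase_color_temp(current_kelvin: int, max_kelvin: int, step: int = 500) -> int:
    """Cool the light by one step, never exceeding the supported maximum."""
    return min(max_kelvin, current_kelvin + step)
```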
@@ -527,6 +527,7 @@ async def async_api_unlock(
"hi-IN",
"it-IT",
"ja-JP",
"nl-NL",
"pt-BR",
}:
msg = (
@@ -41,7 +41,7 @@
}
},
"enable_motion_recording": {
"name": "Enables motion recording",
"name": "Enable motion recording",
"description": "Enables recording a clip to camera storage when motion is detected.",
"fields": {
"entity_id": {
@@ -51,8 +51,8 @@
}
},
"disable_motion_recording": {
"name": "Disables motion recording",
"description": "Disable recording a clip to camera storage when motion is detected.",
"name": "Disable motion recording",
"description": "Disables recording a clip to camera storage when motion is detected.",
"fields": {
"entity_id": {
"name": "[%key:component::amcrest::services::enable_recording::fields::entity_id::name%]",
+10 -9
@@ -135,15 +135,16 @@ async def async_connect_androidtv(
)
aftv = await async_androidtv_setup(
config[CONF_HOST],
config[CONF_PORT],
adbkey,
config.get(CONF_ADB_SERVER_IP),
config.get(CONF_ADB_SERVER_PORT, DEFAULT_ADB_SERVER_PORT),
state_detection_rules,
config[CONF_DEVICE_CLASS],
timeout,
signer,
host=config[CONF_HOST],
port=config[CONF_PORT],
adbkey=adbkey,
adb_server_ip=config.get(CONF_ADB_SERVER_IP),
adb_server_port=config.get(CONF_ADB_SERVER_PORT, DEFAULT_ADB_SERVER_PORT),
state_detection_rules=state_detection_rules,
device_class=config[CONF_DEVICE_CLASS],
auth_timeout_s=timeout,
signer=signer,
log_errors=False,
)
if not aftv.available:
@@ -5,5 +5,5 @@
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/aosmith",
"iot_class": "cloud_polling",
"requirements": ["py-aosmith==1.0.11"]
"requirements": ["py-aosmith==1.0.12"]
}
+1 -1
@@ -20,7 +20,7 @@ async def async_setup_entry(
) -> None:
"""Set up the sensor platform."""
add_entities([ApSystemsMaxOutputNumber(config_entry.runtime_data)])
add_entities([ApSystemsMaxOutputNumber(config_entry.runtime_data)], True)
class ApSystemsMaxOutputNumber(ApSystemsEntity, NumberEntity):
@@ -16,6 +16,7 @@ import time
from typing import Any, Literal, cast
import wave
import hass_nabucasa
import voluptuous as vol
from homeassistant.components import (
@@ -29,6 +30,7 @@ from homeassistant.components import (
from homeassistant.components.tts import (
generate_media_source_id as tts_generate_media_source_id,
)
from homeassistant.const import MATCH_ALL
from homeassistant.core import Context, HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import intent
@@ -917,6 +919,11 @@ class PipelineRun:
)
except (asyncio.CancelledError, TimeoutError):
raise # expected
except hass_nabucasa.auth.Unauthenticated as src_error:
raise SpeechToTextError(
code="cloud-auth-failed",
message="Home Assistant Cloud authentication failed",
) from src_error
except Exception as src_error:
_LOGGER.exception("Unexpected error during speech-to-text")
raise SpeechToTextError(
@@ -1009,12 +1016,19 @@ class PipelineRun:
if self.intent_agent is None:
raise RuntimeError("Recognize intent was not prepared")
if self.pipeline.conversation_language == MATCH_ALL:
# LLMs support all languages ('*') so use pipeline language for
# intent fallback.
input_language = self.pipeline.language
else:
input_language = self.pipeline.conversation_language
self.process_event(
PipelineEvent(
PipelineEventType.INTENT_START,
{
"engine": self.intent_agent,
"language": self.pipeline.conversation_language,
"language": input_language,
"intent_input": intent_input,
"conversation_id": conversation_id,
"device_id": device_id,
@@ -1029,7 +1043,7 @@ class PipelineRun:
context=self.context,
conversation_id=conversation_id,
device_id=device_id,
language=self.pipeline.language,
language=input_language,
agent_id=self.intent_agent,
)
processed_locally = self.intent_agent == conversation.HOME_ASSISTANT_AGENT
@@ -140,7 +140,7 @@ class VoiceCommandSegmenter:
self._timeout_seconds_left -= chunk_seconds
if self._timeout_seconds_left <= 0:
_LOGGER.warning(
_LOGGER.debug(
"VAD end of speech detection timed out after %s seconds",
self.timeout_seconds,
)
@@ -28,5 +28,5 @@
"documentation": "https://www.home-assistant.io/integrations/august",
"iot_class": "cloud_push",
"loggers": ["pubnub", "yalexs"],
"requirements": ["yalexs==8.10.0", "yalexs-ble==2.5.2"]
"requirements": ["yalexs==8.10.0", "yalexs-ble==2.5.6"]
}
@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/aussie_broadband",
"iot_class": "cloud_polling",
"loggers": ["aussiebb"],
"requirements": ["pyaussiebb==0.0.15"]
"requirements": ["pyaussiebb==0.1.4"]
}
+1 -1
@@ -29,7 +29,7 @@
"integration_type": "device",
"iot_class": "local_push",
"loggers": ["axis"],
"requirements": ["axis==63"],
"requirements": ["axis==64"],
"ssdp": [
{
"manufacturer": "AXIS"
+60 -15
@@ -5,36 +5,81 @@ from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.hassio import is_hassio
from homeassistant.helpers.typing import ConfigType
from .const import DATA_MANAGER, DOMAIN, LOGGER
from .agent import (
BackupAgent,
BackupAgentError,
BackupAgentPlatformProtocol,
LocalBackupAgent,
)
from .const import DATA_MANAGER, DOMAIN
from .http import async_register_http_views
from .manager import BackupManager
from .manager import (
BackupManager,
BackupPlatformProtocol,
BackupReaderWriter,
CoreBackupReaderWriter,
CreateBackupEvent,
ManagerBackup,
NewBackup,
WrittenBackup,
)
from .models import AddonInfo, AgentBackup, Folder
from .websocket import async_register_websocket_handlers
__all__ = [
"AddonInfo",
"AgentBackup",
"ManagerBackup",
"BackupAgent",
"BackupAgentError",
"BackupAgentPlatformProtocol",
"BackupPlatformProtocol",
"BackupReaderWriter",
"CreateBackupEvent",
"Folder",
"LocalBackupAgent",
"NewBackup",
"WrittenBackup",
]
CONFIG_SCHEMA = cv.empty_config_schema(DOMAIN)
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the Backup integration."""
backup_manager = BackupManager(hass)
hass.data[DATA_MANAGER] = backup_manager
with_hassio = is_hassio(hass)
reader_writer: BackupReaderWriter
if not with_hassio:
reader_writer = CoreBackupReaderWriter(hass)
else:
# pylint: disable-next=import-outside-toplevel, hass-component-root-import
from homeassistant.components.hassio.backup import SupervisorBackupReaderWriter
reader_writer = SupervisorBackupReaderWriter(hass)
backup_manager = BackupManager(hass, reader_writer)
hass.data[DATA_MANAGER] = backup_manager
await backup_manager.async_setup()
async_register_websocket_handlers(hass, with_hassio)
if with_hassio:
if DOMAIN in config:
LOGGER.error(
"The backup integration is not supported on this installation method, "
"please remove it from your configuration"
)
return True
async def async_handle_create_service(call: ServiceCall) -> None:
"""Service handler for creating backups."""
await backup_manager.async_create_backup()
agent_id = list(backup_manager.local_backup_agents)[0]
await backup_manager.async_create_backup(
agent_ids=[agent_id],
include_addons=None,
include_all_addons=False,
include_database=True,
include_folders=None,
include_homeassistant=True,
name=None,
password=None,
)
hass.services.async_register(DOMAIN, "create", async_handle_create_service)
if not with_hassio:
hass.services.async_register(DOMAIN, "create", async_handle_create_service)
async_register_http_views(hass)
+121
@@ -0,0 +1,121 @@
"""Backup agents for the Backup integration."""
from __future__ import annotations
import abc
from collections.abc import AsyncIterator, Callable, Coroutine
from pathlib import Path
from typing import Any, Protocol
from propcache import cached_property
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
from .models import AgentBackup
class BackupAgentError(HomeAssistantError):
"""Base class for backup agent errors."""
class BackupAgentUnreachableError(BackupAgentError):
"""Raised when the agent can't reach its API."""
_message = "The backup agent is unreachable."
class BackupAgent(abc.ABC):
"""Backup agent interface."""
domain: str
name: str
@cached_property
def agent_id(self) -> str:
"""Return the agent_id."""
return f"{self.domain}.{self.name}"
@abc.abstractmethod
async def async_download_backup(
self,
backup_id: str,
**kwargs: Any,
) -> AsyncIterator[bytes]:
"""Download a backup file.
:param backup_id: The ID of the backup that was returned in async_list_backups.
:return: An async iterator that yields bytes.
"""
@abc.abstractmethod
async def async_upload_backup(
self,
*,
open_stream: Callable[[], Coroutine[Any, Any, AsyncIterator[bytes]]],
backup: AgentBackup,
**kwargs: Any,
) -> None:
"""Upload a backup.
:param open_stream: A function returning an async iterator that yields bytes.
:param backup: Metadata about the backup that should be uploaded.
"""
@abc.abstractmethod
async def async_delete_backup(
self,
backup_id: str,
**kwargs: Any,
) -> None:
"""Delete a backup file.
:param backup_id: The ID of the backup that was returned in async_list_backups.
"""
@abc.abstractmethod
async def async_list_backups(self, **kwargs: Any) -> list[AgentBackup]:
"""List backups."""
@abc.abstractmethod
async def async_get_backup(
self,
backup_id: str,
**kwargs: Any,
) -> AgentBackup | None:
"""Return a backup."""
class LocalBackupAgent(BackupAgent):
"""Local backup agent."""
@abc.abstractmethod
def get_backup_path(self, backup_id: str) -> Path:
"""Return the local path to a backup.
The method should return the path to the backup file with the specified id.
"""
class BackupAgentPlatformProtocol(Protocol):
"""Define the format of backup platforms which implement backup agents."""
async def async_get_backup_agents(
self,
hass: HomeAssistant,
**kwargs: Any,
) -> list[BackupAgent]:
"""Return a list of backup agents."""
@callback
def async_register_backup_agents_listener(
self,
hass: HomeAssistant,
*,
listener: Callable[[], None],
**kwargs: Any,
) -> Callable[[], None]:
"""Register a listener to be called when agents are added or removed.
:return: A function to unregister the listener.
"""
+125
@@ -0,0 +1,125 @@
"""Local backup support for Core and Container installations."""
from __future__ import annotations
from collections.abc import AsyncIterator, Callable, Coroutine
import json
from pathlib import Path
from tarfile import TarError
from typing import Any
from homeassistant.core import HomeAssistant
from homeassistant.helpers.hassio import is_hassio
from .agent import BackupAgent, LocalBackupAgent
from .const import DOMAIN, LOGGER
from .models import AgentBackup
from .util import read_backup
async def async_get_backup_agents(
hass: HomeAssistant,
**kwargs: Any,
) -> list[BackupAgent]:
"""Return the local backup agent."""
if is_hassio(hass):
return []
return [CoreLocalBackupAgent(hass)]
class CoreLocalBackupAgent(LocalBackupAgent):
"""Local backup agent for Core and Container installations."""
domain = DOMAIN
name = "local"
def __init__(self, hass: HomeAssistant) -> None:
"""Initialize the backup agent."""
super().__init__()
self._hass = hass
self._backup_dir = Path(hass.config.path("backups"))
self._backups: dict[str, AgentBackup] = {}
self._loaded_backups = False
async def _load_backups(self) -> None:
"""Load data of stored backup files."""
backups = await self._hass.async_add_executor_job(self._read_backups)
LOGGER.debug("Loaded %s local backups", len(backups))
self._backups = backups
self._loaded_backups = True
def _read_backups(self) -> dict[str, AgentBackup]:
"""Read backups from disk."""
backups: dict[str, AgentBackup] = {}
for backup_path in self._backup_dir.glob("*.tar"):
try:
backup = read_backup(backup_path)
backups[backup.backup_id] = backup
except (OSError, TarError, json.JSONDecodeError, KeyError) as err:
LOGGER.warning("Unable to read backup %s: %s", backup_path, err)
return backups
async def async_download_backup(
self,
backup_id: str,
**kwargs: Any,
) -> AsyncIterator[bytes]:
"""Download a backup file."""
raise NotImplementedError
async def async_upload_backup(
self,
*,
open_stream: Callable[[], Coroutine[Any, Any, AsyncIterator[bytes]]],
backup: AgentBackup,
**kwargs: Any,
) -> None:
"""Upload a backup."""
self._backups[backup.backup_id] = backup
async def async_list_backups(self, **kwargs: Any) -> list[AgentBackup]:
"""List backups."""
if not self._loaded_backups:
await self._load_backups()
return list(self._backups.values())
async def async_get_backup(
self,
backup_id: str,
**kwargs: Any,
) -> AgentBackup | None:
"""Return a backup."""
if not self._loaded_backups:
await self._load_backups()
if not (backup := self._backups.get(backup_id)):
return None
backup_path = self.get_backup_path(backup_id)
if not await self._hass.async_add_executor_job(backup_path.exists):
LOGGER.debug(
(
"Removing tracked backup (%s) that does not exist at the expected"
" path %s"
),
backup.backup_id,
backup_path,
)
self._backups.pop(backup_id)
return None
return backup
def get_backup_path(self, backup_id: str) -> Path:
"""Return the local path to a backup."""
return self._backup_dir / f"{backup_id}.tar"
async def async_delete_backup(self, backup_id: str, **kwargs: Any) -> None:
"""Delete a backup file."""
if await self.async_get_backup(backup_id) is None:
return
backup_path = self.get_backup_path(backup_id)
await self._hass.async_add_executor_job(backup_path.unlink, True)
LOGGER.debug("Deleted backup located at %s", backup_path)
self._backups.pop(backup_id)
+473
@@ -0,0 +1,473 @@
"""Provide persistent configuration for the backup integration."""
from __future__ import annotations
import asyncio
from collections.abc import Callable
from dataclasses import dataclass, field, replace
from datetime import datetime, timedelta
from enum import StrEnum
from typing import TYPE_CHECKING, Self, TypedDict
from cronsim import CronSim
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.event import async_call_later, async_track_point_in_time
from homeassistant.helpers.typing import UNDEFINED, UndefinedType
from homeassistant.util import dt as dt_util
from .const import LOGGER
from .models import Folder
if TYPE_CHECKING:
from .manager import BackupManager, ManagerBackup
# The time of the automatic backup event should be compatible with
# the time of the recorder's nightly job which runs at 04:12.
# Run the backup at 04:45.
CRON_PATTERN_DAILY = "45 4 * * *"
CRON_PATTERN_WEEKLY = "45 4 * * {}"
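To make the cron patterns concrete: the integration delegates the general case to `cronsim`, but the daily pattern `45 4 * * *` can be sketched with the standard library alone. The helper below is an illustrative stand-in, not the integration's scheduling code.

```python
from datetime import datetime, timedelta

CRON_PATTERN_DAILY = "45 4 * * *"
CRON_PATTERN_WEEKLY = "45 4 * * {}"


def next_daily_run(now: datetime) -> datetime:
    """Return the next 04:45, the time encoded by CRON_PATTERN_DAILY."""
    candidate = now.replace(hour=4, minute=45, second=0, microsecond=0)
    if candidate <= now:
        # 04:45 already passed today; schedule tomorrow
        candidate += timedelta(days=1)
    return candidate
```

The weekly pattern is produced by formatting the day abbreviation into the template, e.g. `CRON_PATTERN_WEEKLY.format("mon")` yields `"45 4 * * mon"`.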
class StoredBackupConfig(TypedDict):
"""Represent the stored backup config."""
create_backup: StoredCreateBackupConfig
last_attempted_automatic_backup: str | None
last_completed_automatic_backup: str | None
retention: StoredRetentionConfig
schedule: StoredBackupSchedule
@dataclass(kw_only=True)
class BackupConfigData:
"""Represent loaded backup config data."""
create_backup: CreateBackupConfig
last_attempted_automatic_backup: datetime | None = None
last_completed_automatic_backup: datetime | None = None
retention: RetentionConfig
schedule: BackupSchedule
@classmethod
def from_dict(cls, data: StoredBackupConfig) -> Self:
"""Initialize backup config data from a dict."""
include_folders_data = data["create_backup"]["include_folders"]
if include_folders_data:
include_folders = [Folder(folder) for folder in include_folders_data]
else:
include_folders = None
retention = data["retention"]
if last_attempted_str := data["last_attempted_automatic_backup"]:
last_attempted = dt_util.parse_datetime(last_attempted_str)
else:
last_attempted = None
if last_attempted_str := data["last_completed_automatic_backup"]:
last_completed = dt_util.parse_datetime(last_attempted_str)
else:
last_completed = None
return cls(
create_backup=CreateBackupConfig(
agent_ids=data["create_backup"]["agent_ids"],
include_addons=data["create_backup"]["include_addons"],
include_all_addons=data["create_backup"]["include_all_addons"],
include_database=data["create_backup"]["include_database"],
include_folders=include_folders,
name=data["create_backup"]["name"],
password=data["create_backup"]["password"],
),
last_attempted_automatic_backup=last_attempted,
last_completed_automatic_backup=last_completed,
retention=RetentionConfig(
copies=retention["copies"],
days=retention["days"],
),
schedule=BackupSchedule(state=ScheduleState(data["schedule"]["state"])),
)
def to_dict(self) -> StoredBackupConfig:
"""Convert backup config data to a dict."""
if self.last_attempted_automatic_backup:
last_attempted = self.last_attempted_automatic_backup.isoformat()
else:
last_attempted = None
if self.last_completed_automatic_backup:
last_completed = self.last_completed_automatic_backup.isoformat()
else:
last_completed = None
return StoredBackupConfig(
create_backup=self.create_backup.to_dict(),
last_attempted_automatic_backup=last_attempted,
last_completed_automatic_backup=last_completed,
retention=self.retention.to_dict(),
schedule=self.schedule.to_dict(),
)
class BackupConfig:
"""Handle backup config."""
def __init__(self, hass: HomeAssistant, manager: BackupManager) -> None:
"""Initialize backup config."""
self.data = BackupConfigData(
create_backup=CreateBackupConfig(),
retention=RetentionConfig(),
schedule=BackupSchedule(),
)
self._manager = manager
def load(self, stored_config: StoredBackupConfig) -> None:
"""Load config."""
self.data = BackupConfigData.from_dict(stored_config)
self.data.schedule.apply(self._manager)
async def update(
self,
*,
create_backup: CreateBackupParametersDict | UndefinedType = UNDEFINED,
retention: RetentionParametersDict | UndefinedType = UNDEFINED,
schedule: ScheduleState | UndefinedType = UNDEFINED,
) -> None:
"""Update config."""
if create_backup is not UNDEFINED:
self.data.create_backup = replace(self.data.create_backup, **create_backup)
if retention is not UNDEFINED:
new_retention = RetentionConfig(**retention)
if new_retention != self.data.retention:
self.data.retention = new_retention
self.data.retention.apply(self._manager)
if schedule is not UNDEFINED:
new_schedule = BackupSchedule(state=schedule)
if new_schedule.to_dict() != self.data.schedule.to_dict():
self.data.schedule = new_schedule
self.data.schedule.apply(self._manager)
self._manager.store.save()
@dataclass(kw_only=True)
class RetentionConfig:
"""Represent the backup retention configuration."""
copies: int | None = None
days: int | None = None
def apply(self, manager: BackupManager) -> None:
"""Apply backup retention configuration."""
if self.days is not None:
self._schedule_next(manager)
else:
self._unschedule_next(manager)
def to_dict(self) -> StoredRetentionConfig:
"""Convert backup retention configuration to a dict."""
return StoredRetentionConfig(
copies=self.copies,
days=self.days,
)
@callback
def _schedule_next(
self,
manager: BackupManager,
) -> None:
"""Schedule the next cleanup of backups older than the configured number of days."""
self._unschedule_next(manager)
async def _delete_backups(now: datetime) -> None:
"""Delete backups older than the configured number of days."""
self._schedule_next(manager)
def _backups_filter(
backups: dict[str, ManagerBackup],
) -> dict[str, ManagerBackup]:
"""Return backups older than the configured number of days, for deletion."""
# we need to check here since we await before
# this filter is applied
if self.days is None:
return {}
now = dt_util.utcnow()
return {
backup_id: backup
for backup_id, backup in backups.items()
if dt_util.parse_datetime(backup.date, raise_on_error=True)
+ timedelta(days=self.days)
< now
}
await _delete_filtered_backups(manager, _backups_filter)
manager.remove_next_delete_event = async_call_later(
manager.hass, timedelta(days=1), _delete_backups
)
@callback
def _unschedule_next(self, manager: BackupManager) -> None:
"""Unschedule the next retention cleanup."""
if (remove_next_event := manager.remove_next_delete_event) is not None:
remove_next_event()
manager.remove_next_delete_event = None
class StoredRetentionConfig(TypedDict):
"""Represent the stored backup retention configuration."""
copies: int | None
days: int | None
class RetentionParametersDict(TypedDict, total=False):
"""Represent the parameters for retention."""
copies: int | None
days: int | None
class StoredBackupSchedule(TypedDict):
"""Represent the stored backup schedule configuration."""
state: ScheduleState
class ScheduleState(StrEnum):
"""Represent the schedule state."""
NEVER = "never"
DAILY = "daily"
MONDAY = "mon"
TUESDAY = "tue"
WEDNESDAY = "wed"
THURSDAY = "thu"
FRIDAY = "fri"
SATURDAY = "sat"
SUNDAY = "sun"
@dataclass(kw_only=True)
class BackupSchedule:
"""Represent the backup schedule."""
state: ScheduleState = ScheduleState.NEVER
cron_event: CronSim | None = field(init=False, default=None)
@callback
def apply(
self,
manager: BackupManager,
) -> None:
"""Apply a new schedule.
There are only three possible state types: never, daily, or weekly.
"""
if self.state is ScheduleState.NEVER:
self._unschedule_next(manager)
return
if self.state is ScheduleState.DAILY:
self._schedule_next(CRON_PATTERN_DAILY, manager)
else:
self._schedule_next(
CRON_PATTERN_WEEKLY.format(self.state.value),
manager,
)
@callback
def _schedule_next(
self,
cron_pattern: str,
manager: BackupManager,
) -> None:
"""Schedule the next backup."""
self._unschedule_next(manager)
now = dt_util.now()
if (cron_event := self.cron_event) is None:
seed_time = manager.config.data.last_completed_automatic_backup or now
cron_event = self.cron_event = CronSim(cron_pattern, seed_time)
next_time = next(cron_event)
if next_time < now:
# schedule a backup at next daily time once
# if we missed the last scheduled backup
cron_event = CronSim(CRON_PATTERN_DAILY, now)
next_time = next(cron_event)
# reseed the cron event attribute
# add a day to the next time to avoid scheduling at the same time again
self.cron_event = CronSim(cron_pattern, now + timedelta(days=1))
async def _create_backup(now: datetime) -> None:
"""Create backup."""
manager.remove_next_backup_event = None
config_data = manager.config.data
self._schedule_next(cron_pattern, manager)
# create the backup
try:
await manager.async_create_backup(
agent_ids=config_data.create_backup.agent_ids,
include_addons=config_data.create_backup.include_addons,
include_all_addons=config_data.create_backup.include_all_addons,
include_database=config_data.create_backup.include_database,
include_folders=config_data.create_backup.include_folders,
include_homeassistant=True, # always include HA
name=config_data.create_backup.name,
password=config_data.create_backup.password,
with_automatic_settings=True,
)
except Exception: # noqa: BLE001
# another more specific exception will be added
# and handled in the future
LOGGER.exception("Unexpected error creating automatic backup")
manager.remove_next_backup_event = async_track_point_in_time(
manager.hass, _create_backup, next_time
)
def to_dict(self) -> StoredBackupSchedule:
"""Convert backup schedule to a dict."""
return StoredBackupSchedule(state=self.state)
@callback
def _unschedule_next(self, manager: BackupManager) -> None:
"""Unschedule the next backup."""
if (remove_next_event := manager.remove_next_backup_event) is not None:
remove_next_event()
manager.remove_next_backup_event = None
@dataclass(kw_only=True)
class CreateBackupConfig:
"""Represent the config for async_create_backup."""
agent_ids: list[str] = field(default_factory=list)
include_addons: list[str] | None = None
include_all_addons: bool = False
include_database: bool = True
include_folders: list[Folder] | None = None
name: str | None = None
password: str | None = None
def to_dict(self) -> StoredCreateBackupConfig:
"""Convert create backup config to a dict."""
return {
"agent_ids": self.agent_ids,
"include_addons": self.include_addons,
"include_all_addons": self.include_all_addons,
"include_database": self.include_database,
"include_folders": self.include_folders,
"name": self.name,
"password": self.password,
}
class StoredCreateBackupConfig(TypedDict):
"""Represent the stored config for async_create_backup."""
agent_ids: list[str]
include_addons: list[str] | None
include_all_addons: bool
include_database: bool
include_folders: list[Folder] | None
name: str | None
password: str | None
class CreateBackupParametersDict(TypedDict, total=False):
"""Represent the parameters for async_create_backup."""
agent_ids: list[str]
include_addons: list[str] | None
include_all_addons: bool
include_database: bool
include_folders: list[Folder] | None
name: str | None
password: str | None
async def _delete_filtered_backups(
manager: BackupManager,
backup_filter: Callable[[dict[str, ManagerBackup]], dict[str, ManagerBackup]],
) -> None:
"""Delete backups selected by a filter.
:param manager: The backup manager.
:param backup_filter: A filter that should return the backups to delete.
"""
backups, get_agent_errors = await manager.async_get_backups()
if get_agent_errors:
LOGGER.debug(
"Error getting backups; continuing anyway: %s",
get_agent_errors,
)
# only delete backups that are created with the saved automatic settings
backups = {
backup_id: backup
for backup_id, backup in backups.items()
if backup.with_automatic_settings
}
LOGGER.debug("Total automatic backups: %s", backups)
filtered_backups = backup_filter(backups)
if not filtered_backups:
return
# always delete oldest backup first
filtered_backups = dict(
sorted(
filtered_backups.items(),
key=lambda backup_item: backup_item[1].date,
)
)
if len(filtered_backups) >= len(backups):
# Never delete the last backup.
last_backup = filtered_backups.popitem()
LOGGER.debug("Keeping the last backup: %s", last_backup)
LOGGER.debug("Backups to delete: %s", filtered_backups)
if not filtered_backups:
return
backup_ids = list(filtered_backups)
delete_results = await asyncio.gather(
*(manager.async_delete_backup(backup_id) for backup_id in filtered_backups)
)
agent_errors = {
backup_id: error
for backup_id, error in zip(backup_ids, delete_results, strict=True)
if error
}
if agent_errors:
LOGGER.error(
"Error deleting old copies: %s",
agent_errors,
)
async def delete_backups_exceeding_configured_count(manager: BackupManager) -> None:
"""Delete backups exceeding the configured retention count."""
def _backups_filter(
backups: dict[str, ManagerBackup],
) -> dict[str, ManagerBackup]:
"""Return the oldest backups exceeding the configured number of copies, for deletion."""
# we need to check here since we await before
# this filter is applied
if manager.config.data.retention.copies is None:
return {}
return dict(
sorted(
backups.items(),
key=lambda backup_item: backup_item[1].date,
)[: len(backups) - manager.config.data.retention.copies]
)
await _delete_filtered_backups(manager, _backups_filter)
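The copies-based filter above boils down to sorting by date and slicing off the excess. A standalone sketch, with backups reduced to `{backup_id: ISO date}` pairs for illustration:

```python
def backups_to_delete(backups: dict[str, str], copies: int) -> dict[str, str]:
    """Return the oldest backups beyond the configured copy count.

    Illustrative only: backups are (id -> ISO date string) here, whereas the
    integration operates on ManagerBackup objects.
    """
    excess = len(backups) - copies
    if excess <= 0:
        # At or under the limit; nothing to delete
        return {}
    # ISO-8601 dates sort chronologically as strings; oldest first
    return dict(sorted(backups.items(), key=lambda item: item[1])[:excess])
```

Guarding on `excess <= 0` keeps the slice from going negative when fewer backups exist than the configured count.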
+7
@@ -10,6 +10,7 @@ from homeassistant.util.hass_dict import HassKey
if TYPE_CHECKING:
from .manager import BackupManager
BUF_SIZE = 2**20 * 4 # 4MB
DOMAIN = "backup"
DATA_MANAGER: HassKey[BackupManager] = HassKey(DOMAIN)
LOGGER = getLogger(__package__)
@@ -22,6 +23,12 @@ EXCLUDE_FROM_BACKUP = [
"*.log.*",
"*.log",
"backups/*.tar",
"tmp_backups/*.tar",
"OZW_Log.txt",
"tts/*",
]
EXCLUDE_DATABASE_FROM_BACKUP = [
"home-assistant_v2.db",
"home-assistant_v2.db-wal",
]
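The exclusion lists hold glob-style patterns. As a hedged illustration (the real backup writer hands such patterns to `securetar`; `fnmatch` is used here purely to show how they match relative paths):

```python
from fnmatch import fnmatch

# Subset of EXCLUDE_FROM_BACKUP, for illustration
EXCLUDE = ["*.log", "*.log.*", "backups/*.tar", "tts/*"]


def is_excluded(rel_path: str) -> bool:
    """Return True if a relative path matches any exclusion pattern."""
    return any(fnmatch(rel_path, pattern) for pattern in EXCLUDE)
```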
+41 -14
@@ -8,10 +8,11 @@ from typing import cast
from aiohttp import BodyPartReader
from aiohttp.hdrs import CONTENT_DISPOSITION
from aiohttp.web import FileResponse, Request, Response
from aiohttp.web import FileResponse, Request, Response, StreamResponse
from homeassistant.components.http import KEY_HASS, HomeAssistantView, require_admin
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.util import slugify
from .const import DATA_MANAGER
@@ -27,30 +28,47 @@ def async_register_http_views(hass: HomeAssistant) -> None:
class DownloadBackupView(HomeAssistantView):
"""Generate backup view."""
url = "/api/backup/download/{slug}"
url = "/api/backup/download/{backup_id}"
name = "api:backup:download"
async def get(
self,
request: Request,
slug: str,
) -> FileResponse | Response:
backup_id: str,
) -> StreamResponse | FileResponse | Response:
"""Download a backup file."""
if not request["hass_user"].is_admin:
return Response(status=HTTPStatus.UNAUTHORIZED)
try:
agent_id = request.query.getone("agent_id")
except KeyError:
return Response(status=HTTPStatus.BAD_REQUEST)
manager = request.app[KEY_HASS].data[DATA_MANAGER]
backup = await manager.async_get_backup(slug=slug)
if agent_id not in manager.backup_agents:
return Response(status=HTTPStatus.BAD_REQUEST)
agent = manager.backup_agents[agent_id]
backup = await agent.async_get_backup(backup_id)
if backup is None or not backup.path.exists():
# We don't need to check if the path exists, aiohttp.FileResponse will handle
# that
if backup is None:
return Response(status=HTTPStatus.NOT_FOUND)
return FileResponse(
path=backup.path.as_posix(),
headers={
CONTENT_DISPOSITION: f"attachment; filename={slugify(backup.name)}.tar"
},
)
headers = {
CONTENT_DISPOSITION: f"attachment; filename={slugify(backup.name)}.tar"
}
if agent_id in manager.local_backup_agents:
local_agent = manager.local_backup_agents[agent_id]
path = local_agent.get_backup_path(backup_id)
return FileResponse(path=path.as_posix(), headers=headers)
stream = await agent.async_download_backup(backup_id)
response = StreamResponse(status=HTTPStatus.OK, headers=headers)
await response.prepare(request)
async for chunk in stream:
await response.write(chunk)
return response
class UploadBackupView(HomeAssistantView):
@@ -62,15 +80,24 @@ class UploadBackupView(HomeAssistantView):
@require_admin
async def post(self, request: Request) -> Response:
"""Upload a backup file."""
try:
agent_ids = request.query.getall("agent_id")
except KeyError:
return Response(status=HTTPStatus.BAD_REQUEST)
manager = request.app[KEY_HASS].data[DATA_MANAGER]
reader = await request.multipart()
contents = cast(BodyPartReader, await reader.next())
try:
await manager.async_receive_backup(contents=contents)
await manager.async_receive_backup(contents=contents, agent_ids=agent_ids)
except OSError as err:
return Response(
body=f"Can't write backup file {err}",
body=f"Can't write backup file: {err}",
status=HTTPStatus.INTERNAL_SERVER_ERROR,
)
except HomeAssistantError as err:
return Response(
body=f"Can't upload backup file: {err}",
status=HTTPStatus.INTERNAL_SERVER_ERROR,
)
except asyncio.CancelledError:
File diff suppressed because it is too large
@@ -1,11 +1,12 @@
{
"domain": "backup",
"name": "Backup",
"after_dependencies": ["hassio"],
"codeowners": ["@home-assistant/core"],
"dependencies": ["http", "websocket_api"],
"documentation": "https://www.home-assistant.io/integrations/backup",
"integration_type": "system",
"iot_class": "calculated",
"quality_scale": "internal",
"requirements": ["securetar==2024.11.0"]
"requirements": ["cronsim==2.6", "securetar==2024.11.0"]
}
+69
@@ -0,0 +1,69 @@
"""Models for the backup integration."""
from __future__ import annotations
from dataclasses import asdict, dataclass
from enum import StrEnum
from typing import Any, Self
@dataclass(frozen=True, kw_only=True)
class AddonInfo:
"""Addon information."""
name: str
slug: str
version: str
class Folder(StrEnum):
"""Folder type."""
SHARE = "share"
ADDONS = "addons/local"
SSL = "ssl"
MEDIA = "media"
@dataclass(frozen=True, kw_only=True)
class AgentBackup:
"""Base backup class."""
addons: list[AddonInfo]
backup_id: str
date: str
database_included: bool
extra_metadata: dict[str, bool | str]
folders: list[Folder]
homeassistant_included: bool
homeassistant_version: str | None # None if homeassistant_included is False
name: str
protected: bool
size: int
def as_dict(self) -> dict:
"""Return a dict representation of this backup."""
return asdict(self)
def as_frontend_json(self) -> dict:
"""Return a dict representation of this backup for sending to frontend."""
return {
key: val for key, val in asdict(self).items() if key != "extra_metadata"
}
@classmethod
def from_dict(cls, data: dict[str, Any]) -> Self:
"""Create an instance from a JSON serialization."""
return cls(
addons=[AddonInfo(**addon) for addon in data["addons"]],
backup_id=data["backup_id"],
date=data["date"],
database_included=data["database_included"],
extra_metadata=data["extra_metadata"],
folders=[Folder(folder) for folder in data["folders"]],
homeassistant_included=data["homeassistant_included"],
homeassistant_version=data["homeassistant_version"],
name=data["name"],
protected=data["protected"],
size=data["size"],
)
+52
@@ -0,0 +1,52 @@
"""Store backup configuration."""
from __future__ import annotations
from typing import TYPE_CHECKING, TypedDict
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.storage import Store
from .const import DOMAIN
if TYPE_CHECKING:
from .config import StoredBackupConfig
from .manager import BackupManager, StoredKnownBackup
STORE_DELAY_SAVE = 30
STORAGE_KEY = DOMAIN
STORAGE_VERSION = 1
class StoredBackupData(TypedDict):
"""Represent the stored backup config."""
backups: list[StoredKnownBackup]
config: StoredBackupConfig
class BackupStore:
"""Store backup config."""
def __init__(self, hass: HomeAssistant, manager: BackupManager) -> None:
"""Initialize the backup manager."""
self._hass = hass
self._manager = manager
self._store: Store[StoredBackupData] = Store(hass, STORAGE_VERSION, STORAGE_KEY)
async def load(self) -> StoredBackupData | None:
"""Load the store."""
return await self._store.async_load()
@callback
def save(self) -> None:
"""Save config."""
self._store.async_delay_save(self._data_to_save, STORE_DELAY_SAVE)
@callback
def _data_to_save(self) -> StoredBackupData:
"""Return data to save."""
return {
"backups": self._manager.known_backups.to_list(),
"config": self._manager.config.data.to_dict(),
}
@@ -1,4 +1,14 @@
{
"issues": {
"automatic_backup_failed_create": {
"title": "Automatic backup could not be created",
"description": "The automatic backup could not be created. Please check the logs for more information. Another attempt will be made at the next scheduled time if a backup schedule is configured."
},
"automatic_backup_failed_upload_agents": {
"title": "Automatic backup could not be uploaded to agents",
"description": "The automatic backup could not be uploaded to agents {failed_agents}. Please check the logs for more information. Another attempt will be made at the next scheduled time if a backup schedule is configured."
}
},
"services": {
"create": {
"name": "Create backup",
+112
@@ -0,0 +1,112 @@
"""Local backup support for Core and Container installations."""
from __future__ import annotations
import asyncio
from pathlib import Path
from queue import SimpleQueue
import tarfile
from typing import cast
import aiohttp
from homeassistant.core import HomeAssistant
from homeassistant.util.json import JsonObjectType, json_loads_object
from .const import BUF_SIZE
from .models import AddonInfo, AgentBackup, Folder
def make_backup_dir(path: Path) -> None:
"""Create a backup directory if it does not exist."""
path.mkdir(exist_ok=True)
def read_backup(backup_path: Path) -> AgentBackup:
"""Read a backup from disk."""
with tarfile.open(backup_path, "r:", bufsize=BUF_SIZE) as backup_file:
if not (data_file := backup_file.extractfile("./backup.json")):
raise KeyError("backup.json not found in tar file")
data = json_loads_object(data_file.read())
addons = [
AddonInfo(
name=cast(str, addon["name"]),
slug=cast(str, addon["slug"]),
version=cast(str, addon["version"]),
)
for addon in cast(list[JsonObjectType], data.get("addons", []))
]
folders = [
Folder(folder)
for folder in cast(list[str], data.get("folders", []))
if folder != "homeassistant"
]
homeassistant_included = False
homeassistant_version: str | None = None
database_included = False
if (
homeassistant := cast(JsonObjectType, data.get("homeassistant"))
) and "version" in homeassistant:
homeassistant_version = cast(str, homeassistant["version"])
database_included = not cast(
bool, homeassistant.get("exclude_database", False)
)
return AgentBackup(
addons=addons,
backup_id=cast(str, data["slug"]),
database_included=database_included,
date=cast(str, data["date"]),
extra_metadata=cast(dict[str, bool | str], data.get("metadata", {})),
folders=folders,
homeassistant_included=homeassistant_included,
homeassistant_version=homeassistant_version,
name=cast(str, data["name"]),
protected=cast(bool, data.get("protected", False)),
size=backup_path.stat().st_size,
)
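The metadata-extraction step in `read_backup` can be exercised in isolation: build a tar containing `./backup.json` in memory and read it back, mirroring the `extractfile("./backup.json")` call above. The helper names are illustrative, not part of the integration.

```python
import io
import json
import tarfile


def read_backup_metadata(buf: io.BytesIO) -> dict:
    """Extract and parse ./backup.json from an uncompressed tar stream."""
    with tarfile.open(fileobj=buf, mode="r:") as tar:
        member = tar.extractfile("./backup.json")
        if member is None:
            raise KeyError("backup.json not found in tar file")
        return json.loads(member.read())


def make_test_tar(metadata: dict) -> io.BytesIO:
    """Build an in-memory tar holding the given metadata as ./backup.json."""
    buf = io.BytesIO()
    payload = json.dumps(metadata).encode()
    with tarfile.open(fileobj=buf, mode="w:") as tar:
        info = tarfile.TarInfo("./backup.json")
        info.size = len(payload)
        tar.addfile(info, io.BytesIO(payload))
    buf.seek(0)
    return buf
```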
async def receive_file(
hass: HomeAssistant, contents: aiohttp.BodyPartReader, path: Path
) -> None:
"""Receive a file from a stream and write it to a file."""
queue: SimpleQueue[tuple[bytes, asyncio.Future[None] | None] | None] = SimpleQueue()
def _sync_queue_consumer() -> None:
with path.open("wb") as file_handle:
while True:
if (_chunk_future := queue.get()) is None:
break
_chunk, _future = _chunk_future
if _future is not None:
hass.loop.call_soon_threadsafe(_future.set_result, None)
file_handle.write(_chunk)
fut: asyncio.Future[None] | None = None
try:
fut = hass.async_add_executor_job(_sync_queue_consumer)
megabytes_sending = 0
while chunk := await contents.read_chunk(BUF_SIZE):
megabytes_sending += 1
if megabytes_sending % 5 != 0:
queue.put_nowait((chunk, None))
continue
chunk_future = hass.loop.create_future()
queue.put_nowait((chunk, chunk_future))
await asyncio.wait(
(fut, chunk_future),
return_when=asyncio.FIRST_COMPLETED,
)
if fut.done():
# The executor job failed
break
queue.put_nowait(None) # terminate queue consumer
finally:
if fut is not None:
await fut
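The core hand-off pattern in `receive_file` — an async producer feeding byte chunks through a `SimpleQueue` to a blocking consumer in an executor, with `None` as the termination sentinel — can be sketched in simplified form. The every-5-chunks future backpressure from the original is omitted for brevity, and the sink is a plain list rather than a file.

```python
import asyncio
from queue import SimpleQueue


async def stream_to_sink(chunks: list[bytes], sink: list[bytes]) -> None:
    """Feed chunks to a blocking consumer thread via a queue."""
    queue: SimpleQueue[bytes | None] = SimpleQueue()

    def _consume() -> None:
        # Blocking consumer: runs in the executor, stops on the None sentinel
        while (chunk := queue.get()) is not None:
            sink.append(chunk)

    loop = asyncio.get_running_loop()
    fut = loop.run_in_executor(None, _consume)
    for chunk in chunks:
        queue.put_nowait(chunk)
    queue.put_nowait(None)  # terminate the consumer
    await fut
```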
+203 -21
@@ -7,22 +7,31 @@ import voluptuous as vol
from homeassistant.components import websocket_api
from homeassistant.core import HomeAssistant, callback
from .config import ScheduleState
from .const import DATA_MANAGER, LOGGER
from .manager import ManagerStateEvent
from .models import Folder
@callback
def async_register_websocket_handlers(hass: HomeAssistant, with_hassio: bool) -> None:
"""Register websocket commands."""
websocket_api.async_register_command(hass, backup_agents_info)
if with_hassio:
websocket_api.async_register_command(hass, handle_backup_end)
websocket_api.async_register_command(hass, handle_backup_start)
return
websocket_api.async_register_command(hass, handle_details)
websocket_api.async_register_command(hass, handle_info)
websocket_api.async_register_command(hass, handle_create)
websocket_api.async_register_command(hass, handle_remove)
websocket_api.async_register_command(hass, handle_create_with_automatic_settings)
websocket_api.async_register_command(hass, handle_delete)
websocket_api.async_register_command(hass, handle_restore)
websocket_api.async_register_command(hass, handle_subscribe_events)
websocket_api.async_register_command(hass, handle_config_info)
websocket_api.async_register_command(hass, handle_config_update)
@websocket_api.require_admin
@@ -35,12 +44,16 @@ async def handle_info(
) -> None:
"""List all stored backups."""
manager = hass.data[DATA_MANAGER]
backups = await manager.async_get_backups()
backups, agent_errors = await manager.async_get_backups()
connection.send_result(
msg["id"],
{
"backups": list(backups.values()),
"backing_up": manager.backing_up,
"agent_errors": {
agent_id: str(err) for agent_id, err in agent_errors.items()
},
"backups": [backup.as_frontend_json() for backup in backups.values()],
"last_attempted_automatic_backup": manager.config.data.last_attempted_automatic_backup,
"last_completed_automatic_backup": manager.config.data.last_completed_automatic_backup,
},
)
@@ -49,7 +62,7 @@ async def handle_info(
@websocket_api.websocket_command(
{
vol.Required("type"): "backup/details",
vol.Required("slug"): str,
vol.Required("backup_id"): str,
}
)
@websocket_api.async_response
@@ -58,12 +71,17 @@ async def handle_details(
connection: websocket_api.ActiveConnection,
msg: dict[str, Any],
) -> None:
"""Get backup details for a specific slug."""
backup = await hass.data[DATA_MANAGER].async_get_backup(slug=msg["slug"])
"""Get backup details for a specific backup."""
backup, agent_errors = await hass.data[DATA_MANAGER].async_get_backup(
msg["backup_id"]
)
connection.send_result(
msg["id"],
{
"backup": backup,
"agent_errors": {
agent_id: str(err) for agent_id, err in agent_errors.items()
},
"backup": backup.as_frontend_json() if backup else None,
},
)
@@ -71,26 +89,39 @@ async def handle_details(
@websocket_api.require_admin
@websocket_api.websocket_command(
{
vol.Required("type"): "backup/remove",
vol.Required("slug"): str,
vol.Required("type"): "backup/delete",
vol.Required("backup_id"): str,
}
)
@websocket_api.async_response
async def handle_remove(
async def handle_delete(
hass: HomeAssistant,
connection: websocket_api.ActiveConnection,
msg: dict[str, Any],
) -> None:
"""Remove a backup."""
await hass.data[DATA_MANAGER].async_remove_backup(slug=msg["slug"])
connection.send_result(msg["id"])
"""Delete a backup."""
agent_errors = await hass.data[DATA_MANAGER].async_delete_backup(msg["backup_id"])
connection.send_result(
msg["id"],
{
"agent_errors": {
agent_id: str(err) for agent_id, err in agent_errors.items()
}
},
)
@websocket_api.require_admin
@websocket_api.websocket_command(
{
vol.Required("type"): "backup/restore",
vol.Required("slug"): str,
vol.Required("backup_id"): str,
vol.Required("agent_id"): str,
vol.Optional("password"): str,
vol.Optional("restore_addons"): [str],
vol.Optional("restore_database", default=True): bool,
vol.Optional("restore_folders"): [vol.Coerce(Folder)],
vol.Optional("restore_homeassistant", default=True): bool,
}
)
@websocket_api.async_response
@@ -100,12 +131,32 @@ async def handle_restore(
msg: dict[str, Any],
) -> None:
"""Restore a backup."""
await hass.data[DATA_MANAGER].async_restore_backup(msg["slug"])
await hass.data[DATA_MANAGER].async_restore_backup(
msg["backup_id"],
agent_id=msg["agent_id"],
password=msg.get("password"),
restore_addons=msg.get("restore_addons"),
restore_database=msg["restore_database"],
restore_folders=msg.get("restore_folders"),
restore_homeassistant=msg["restore_homeassistant"],
)
connection.send_result(msg["id"])
@websocket_api.require_admin
@websocket_api.websocket_command({vol.Required("type"): "backup/generate"})
@websocket_api.websocket_command(
{
vol.Required("type"): "backup/generate",
vol.Required("agent_ids"): [str],
vol.Optional("include_addons"): [str],
vol.Optional("include_all_addons", default=False): bool,
vol.Optional("include_database", default=True): bool,
vol.Optional("include_folders"): [vol.Coerce(Folder)],
vol.Optional("include_homeassistant", default=True): bool,
vol.Optional("name"): str,
vol.Optional("password"): str,
}
)
@websocket_api.async_response
async def handle_create(
hass: HomeAssistant,
@@ -113,7 +164,46 @@ async def handle_create(
msg: dict[str, Any],
) -> None:
"""Generate a backup."""
backup = await hass.data[DATA_MANAGER].async_create_backup()
backup = await hass.data[DATA_MANAGER].async_initiate_backup(
agent_ids=msg["agent_ids"],
include_addons=msg.get("include_addons"),
include_all_addons=msg["include_all_addons"],
include_database=msg["include_database"],
include_folders=msg.get("include_folders"),
include_homeassistant=msg["include_homeassistant"],
name=msg.get("name"),
password=msg.get("password"),
)
connection.send_result(msg["id"], backup)
@websocket_api.require_admin
@websocket_api.websocket_command(
{
vol.Required("type"): "backup/generate_with_automatic_settings",
}
)
@websocket_api.async_response
async def handle_create_with_automatic_settings(
hass: HomeAssistant,
connection: websocket_api.ActiveConnection,
msg: dict[str, Any],
) -> None:
"""Generate a backup with stored settings."""
config_data = hass.data[DATA_MANAGER].config.data
backup = await hass.data[DATA_MANAGER].async_initiate_backup(
agent_ids=config_data.create_backup.agent_ids,
include_addons=config_data.create_backup.include_addons,
include_all_addons=config_data.create_backup.include_all_addons,
include_database=config_data.create_backup.include_database,
include_folders=config_data.create_backup.include_folders,
include_homeassistant=True, # always include HA
name=config_data.create_backup.name,
password=config_data.create_backup.password,
with_automatic_settings=True,
)
connection.send_result(msg["id"], backup)
@@ -127,7 +217,6 @@ async def handle_backup_start(
) -> None:
"""Backup start notification."""
manager = hass.data[DATA_MANAGER]
manager.backing_up = True
LOGGER.debug("Backup start notification")
try:
@@ -149,7 +238,6 @@ async def handle_backup_end(
) -> None:
"""Backup end notification."""
manager = hass.data[DATA_MANAGER]
manager.backing_up = False
LOGGER.debug("Backup end notification")
try:
@@ -159,3 +247,97 @@ async def handle_backup_end(
return
connection.send_result(msg["id"])
@websocket_api.require_admin
@websocket_api.websocket_command({vol.Required("type"): "backup/agents/info"})
@websocket_api.async_response
async def backup_agents_info(
hass: HomeAssistant,
connection: websocket_api.ActiveConnection,
msg: dict[str, Any],
) -> None:
"""Return backup agents info."""
manager = hass.data[DATA_MANAGER]
connection.send_result(
msg["id"],
{
"agents": [{"agent_id": agent_id} for agent_id in manager.backup_agents],
},
)
@websocket_api.require_admin
@websocket_api.websocket_command({vol.Required("type"): "backup/config/info"})
@websocket_api.async_response
async def handle_config_info(
hass: HomeAssistant,
connection: websocket_api.ActiveConnection,
msg: dict[str, Any],
) -> None:
"""Send the stored backup config."""
manager = hass.data[DATA_MANAGER]
connection.send_result(
msg["id"],
{
"config": manager.config.data.to_dict(),
},
)
@websocket_api.require_admin
@websocket_api.websocket_command(
{
vol.Required("type"): "backup/config/update",
vol.Optional("create_backup"): vol.Schema(
{
vol.Optional("agent_ids"): vol.All(list[str]),
vol.Optional("include_addons"): vol.Any(list[str], None),
vol.Optional("include_all_addons"): bool,
vol.Optional("include_database"): bool,
vol.Optional("include_folders"): vol.Any([vol.Coerce(Folder)], None),
vol.Optional("name"): vol.Any(str, None),
vol.Optional("password"): vol.Any(str, None),
},
),
vol.Optional("retention"): vol.Schema(
{
vol.Optional("copies"): vol.Any(int, None),
vol.Optional("days"): vol.Any(int, None),
},
),
vol.Optional("schedule"): vol.All(str, vol.Coerce(ScheduleState)),
}
)
@websocket_api.async_response
async def handle_config_update(
hass: HomeAssistant,
connection: websocket_api.ActiveConnection,
msg: dict[str, Any],
) -> None:
"""Update the stored backup config."""
manager = hass.data[DATA_MANAGER]
changes = dict(msg)
changes.pop("id")
changes.pop("type")
await manager.config.update(**changes)
connection.send_result(msg["id"])
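The config-update handler above forwards every message field except the websocket bookkeeping keys (`id` and `type`) to `manager.config.update`. A minimal standalone sketch of that filtering (with a hypothetical message payload, not one from the source):

```python
# Hypothetical websocket message: "id" and "type" are transport metadata,
# everything else is a config change to forward.
msg = {"id": 5, "type": "backup/config/update", "schedule": "daily"}

changes = dict(msg)  # copy so the original message stays intact
changes.pop("id")
changes.pop("type")

# Only the actual config keys remain.
assert changes == {"schedule": "daily"}
```

This pattern avoids enumerating every optional schema key by hand: the voluptuous schema has already validated the message, so anything left after removing the transport keys is a legitimate keyword argument.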
@websocket_api.require_admin
@websocket_api.websocket_command({vol.Required("type"): "backup/subscribe_events"})
@websocket_api.async_response
async def handle_subscribe_events(
hass: HomeAssistant,
connection: websocket_api.ActiveConnection,
msg: dict[str, Any],
) -> None:
"""Subscribe to backup events."""
def on_event(event: ManagerStateEvent) -> None:
connection.send_message(websocket_api.event_message(msg["id"], event))
manager = hass.data[DATA_MANAGER]
on_event(manager.last_event)
connection.subscriptions[msg["id"]] = manager.async_subscribe_events(on_event)
connection.send_result(msg["id"])
@@ -8,6 +8,7 @@ from aiohttp.client_exceptions import (
ClientConnectorError,
ClientOSError,
ServerTimeoutError,
WSMessageTypeError,
)
from mozart_api.exceptions import ApiException
from mozart_api.mozart_client import MozartClient
@@ -62,6 +63,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: BangOlufsenConfigEntry)
ServerTimeoutError,
ApiException,
TimeoutError,
WSMessageTypeError,
) as error:
await client.close_api_client()
raise ConfigEntryNotReady(f"Unable to connect to {entry.title}") from error
@@ -210,3 +210,20 @@ BANG_OLUFSEN_WEBSOCKET_EVENT: Final[str] = f"{DOMAIN}_websocket_event"
CONNECTION_STATUS: Final[str] = "CONNECTION_STATUS"
# Beolink Converter NL/ML sources need to be transformed to upper case
BEOLINK_JOIN_SOURCES_TO_UPPER = (
"aux_a",
"cd",
"ph",
"radio",
"tp1",
"tp2",
)
BEOLINK_JOIN_SOURCES = (
*BEOLINK_JOIN_SOURCES_TO_UPPER,
"beoradio",
"deezer",
"spotify",
"tidal",
)
@@ -6,6 +6,6 @@
"documentation": "https://www.home-assistant.io/integrations/bang_olufsen",
"integration_type": "device",
"iot_class": "local_push",
"requirements": ["mozart-api==4.1.1.116.3"],
"requirements": ["mozart-api==4.1.1.116.4"],
"zeroconf": ["_bangolufsen._tcp.local."]
}
@@ -74,6 +74,8 @@ from .const import (
BANG_OLUFSEN_REPEAT_FROM_HA,
BANG_OLUFSEN_REPEAT_TO_HA,
BANG_OLUFSEN_STATES,
BEOLINK_JOIN_SOURCES,
BEOLINK_JOIN_SOURCES_TO_UPPER,
CONF_BEOLINK_JID,
CONNECTION_STATUS,
DOMAIN,
@@ -135,7 +137,10 @@ async def async_setup_entry(
platform.async_register_entity_service(
name="beolink_join",
schema={vol.Optional("beolink_jid"): jid_regex},
schema={
vol.Optional("beolink_jid"): jid_regex,
vol.Optional("source_id"): vol.In(BEOLINK_JOIN_SOURCES),
},
func="async_beolink_join",
)
@@ -985,12 +990,23 @@ class BangOlufsenMediaPlayer(BangOlufsenEntity, MediaPlayerEntity):
await self.async_beolink_leave()
# Custom actions:
async def async_beolink_join(self, beolink_jid: str | None = None) -> None:
async def async_beolink_join(
self, beolink_jid: str | None = None, source_id: str | None = None
) -> None:
"""Join a Beolink multi-room experience."""
# Touch to join
if beolink_jid is None:
await self._client.join_latest_beolink_experience()
else:
# Join a peer
elif beolink_jid and source_id is None:
await self._client.join_beolink_peer(jid=beolink_jid)
        # Join a peer and select a specific source
elif beolink_jid and source_id:
# Beolink Converter NL/ML sources need to be in upper case
if source_id in BEOLINK_JOIN_SOURCES_TO_UPPER:
source_id = source_id.upper()
await self._client.join_beolink_peer(jid=beolink_jid, source=source_id)
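The source normalization in `async_beolink_join` above can be sketched in isolation. This is a simplified standalone version (the tuple values are taken from `BEOLINK_JOIN_SOURCES_TO_UPPER` in const.py; the `normalize_source` helper is illustrative, not part of the integration):

```python
# Sources that target a Beolink Converter NL/ML must be sent in upper case;
# all other sources (Deezer, Spotify, Tidal, ...) are passed through as-is.
BEOLINK_JOIN_SOURCES_TO_UPPER = ("aux_a", "cd", "ph", "radio", "tp1", "tp2")

def normalize_source(source_id: str) -> str:
    """Upper-case converter sources, leave everything else untouched."""
    if source_id in BEOLINK_JOIN_SOURCES_TO_UPPER:
        return source_id.upper()
    return source_id

print(normalize_source("cd"))     # CD
print(normalize_source("tidal"))  # tidal
```

Keeping the lower-case names in the service schema and upper-casing only at the API boundary means the user-facing selector stays consistent across hardware platforms.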
async def async_beolink_expand(
self, beolink_jids: list[str] | None = None, all_discovered: bool = False
@@ -48,6 +48,23 @@ beolink_join:
example: 1111.2222222.33333333@products.bang-olufsen.com
selector:
text:
source_id:
required: false
example: tidal
selector:
select:
translation_key: "source_ids"
options:
- beoradio
- deezer
- spotify
- tidal
- radio
- tp1
- tp2
- cd
- aux_a
- ph
beolink_leave:
target:
@@ -29,6 +29,22 @@
}
}
},
"selector": {
"source_ids": {
"options": {
"beoradio": "ASE Beoradio",
"deezer": "ASE / Mozart Deezer",
"spotify": "ASE / Mozart Spotify",
"tidal": "Mozart Tidal",
"aux_a": "Beolink Converter NL/ML AUX_A",
"cd": "Beolink Converter NL/ML CD",
"ph": "Beolink Converter NL/ML PH",
"radio": "Beolink Converter NL/ML RADIO",
"tp1": "Beolink Converter NL/ML TP1",
"tp2": "Beolink Converter NL/ML TP2"
}
}
},
"services": {
"beolink_allstandby": {
"name": "Beolink all standby",
@@ -61,6 +77,10 @@
"beolink_jid": {
"name": "Beolink JID",
"description": "Manually specify Beolink JID to join."
},
"source_id": {
"name": "Source",
          "description": "Specify which source to join; behavior varies between hardware platforms. Source names prefaced by a platform name can only be used when connecting to that platform. For example \"ASE Beoradio\" can only be used when joining an ASE device, while \"ASE / Mozart Deezer\" can be used with ASE or Mozart devices. A defined Beolink JID is required."
}
},
"sections": {
@@ -5,7 +5,7 @@ from __future__ import annotations
import voluptuous as vol
from homeassistant.config_entries import ConfigEntryState
from homeassistant.const import ATTR_DEVICE_ID, CONF_PIN
from homeassistant.const import CONF_PIN
from homeassistant.core import HomeAssistant, ServiceCall
from homeassistant.exceptions import HomeAssistantError, ServiceValidationError
from homeassistant.helpers import config_validation as cv
@@ -13,11 +13,6 @@ from homeassistant.helpers import config_validation as cv
from .const import ATTR_CONFIG_ENTRY_ID, DOMAIN, SERVICE_SEND_PIN
from .coordinator import BlinkConfigEntry
SERVICE_UPDATE_SCHEMA = vol.Schema(
{
vol.Required(ATTR_DEVICE_ID): vol.All(cv.ensure_list, [cv.string]),
}
)
SERVICE_SEND_PIN_SCHEMA = vol.Schema(
{
vol.Required(ATTR_CONFIG_ENTRY_ID): vol.All(cv.ensure_list, [cv.string]),
@@ -14,7 +14,6 @@ from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.typing import ConfigType
from .const import DOMAIN
from .services import setup_services
CONFIG_SCHEMA = cv.config_entry_only_config_schema(DOMAIN)
@@ -36,7 +35,6 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the Bluesound."""
if DOMAIN not in hass.data:
hass.data[DOMAIN] = []
setup_services(hass)
return True
@@ -6,7 +6,7 @@
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/bluesound",
"iot_class": "local_polling",
"requirements": ["pyblu==1.0.4"],
"requirements": ["pyblu==2.0.0"],
"zeroconf": [
{
"type": "_musc._tcp.local."
@@ -28,18 +28,26 @@ from homeassistant.const import CONF_HOST, CONF_HOSTS, CONF_NAME, CONF_PORT
from homeassistant.core import DOMAIN as HOMEASSISTANT_DOMAIN, HomeAssistant
from homeassistant.data_entry_flow import FlowResultType
from homeassistant.exceptions import ServiceValidationError
from homeassistant.helpers import config_validation as cv, issue_registry as ir
from homeassistant.helpers import (
config_validation as cv,
entity_platform,
issue_registry as ir,
)
from homeassistant.helpers.device_registry import (
CONNECTION_NETWORK_MAC,
DeviceInfo,
format_mac,
)
from homeassistant.helpers.dispatcher import (
async_dispatcher_connect,
async_dispatcher_send,
)
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
import homeassistant.util.dt as dt_util
from .const import ATTR_BLUESOUND_GROUP, ATTR_MASTER, DOMAIN, INTEGRATION_TITLE
from .utils import format_unique_id
from .utils import dispatcher_join_signal, dispatcher_unjoin_signal, format_unique_id
if TYPE_CHECKING:
from . import BluesoundConfigEntry
@@ -51,6 +59,11 @@ SCAN_INTERVAL = timedelta(minutes=15)
DATA_BLUESOUND = DOMAIN
DEFAULT_PORT = 11000
SERVICE_CLEAR_TIMER = "clear_sleep_timer"
SERVICE_JOIN = "join"
SERVICE_SET_TIMER = "set_sleep_timer"
SERVICE_UNJOIN = "unjoin"
NODE_OFFLINE_CHECK_TIMEOUT = 180
NODE_RETRY_INITIATION = timedelta(minutes=3)
@@ -130,6 +143,18 @@ async def async_setup_entry(
config_entry.runtime_data.sync_status,
)
platform = entity_platform.async_get_current_platform()
platform.async_register_entity_service(
SERVICE_SET_TIMER, None, "async_increase_timer"
)
platform.async_register_entity_service(
SERVICE_CLEAR_TIMER, None, "async_clear_timer"
)
platform.async_register_entity_service(
SERVICE_JOIN, {vol.Required(ATTR_MASTER): cv.entity_id}, "async_join"
)
platform.async_register_entity_service(SERVICE_UNJOIN, None, "async_unjoin")
hass.data[DATA_BLUESOUND].append(bluesound_player)
async_add_entities([bluesound_player], update_before_add=True)
@@ -175,13 +200,12 @@ class BluesoundPlayer(MediaPlayerEntity):
self._status: Status | None = None
self._inputs: list[Input] = []
self._presets: list[Preset] = []
self._muted = False
self._master: BluesoundPlayer | None = None
self._is_master = False
self._group_name: str | None = None
self._group_list: list[str] = []
self._bluesound_device_name = sync_status.name
self._player = player
self._is_leader = False
self._leader: BluesoundPlayer | None = None
self._attr_unique_id = format_unique_id(sync_status.mac, port)
        # There should always be one player with the default port per MAC address
@@ -250,6 +274,22 @@ class BluesoundPlayer(MediaPlayerEntity):
name=f"bluesound.poll_sync_status_loop_{self.host}:{self.port}",
)
assert self._sync_status.id is not None
self.async_on_remove(
async_dispatcher_connect(
self.hass,
dispatcher_join_signal(self.entity_id),
self.async_add_follower,
)
)
self.async_on_remove(
async_dispatcher_connect(
self.hass,
dispatcher_unjoin_signal(self._sync_status.id),
self.async_remove_follower,
)
)
async def async_will_remove_from_hass(self) -> None:
"""Stop the polling task."""
await super().async_will_remove_from_hass()
@@ -317,25 +357,25 @@ class BluesoundPlayer(MediaPlayerEntity):
self._group_list = self.rebuild_bluesound_group()
if sync_status.master is not None:
self._is_master = False
master_id = f"{sync_status.master.ip}:{sync_status.master.port}"
master_device = [
if sync_status.leader is not None:
self._is_leader = False
leader_id = f"{sync_status.leader.ip}:{sync_status.leader.port}"
leader_device = [
device
for device in self.hass.data[DATA_BLUESOUND]
if device.id == master_id
if device.id == leader_id
]
if master_device and master_id != self.id:
self._master = master_device[0]
if leader_device and leader_id != self.id:
self._leader = leader_device[0]
else:
self._master = None
_LOGGER.error("Master not found %s", master_id)
self._leader = None
_LOGGER.error("Leader not found %s", leader_id)
else:
if self._master is not None:
self._master = None
slaves = self._sync_status.slaves
self._is_master = slaves is not None
if self._leader is not None:
self._leader = None
followers = self._sync_status.followers
self._is_leader = followers is not None
self.async_write_ha_state()
@@ -355,7 +395,7 @@ class BluesoundPlayer(MediaPlayerEntity):
if self._status is None:
return MediaPlayerState.OFF
if self.is_grouped and not self.is_master:
if self.is_grouped and not self.is_leader:
return MediaPlayerState.IDLE
match self._status.state:
@@ -369,7 +409,7 @@ class BluesoundPlayer(MediaPlayerEntity):
@property
def media_title(self) -> str | None:
"""Title of current playing media."""
if self._status is None or (self.is_grouped and not self.is_master):
if self._status is None or (self.is_grouped and not self.is_leader):
return None
return self._status.name
@@ -380,7 +420,7 @@ class BluesoundPlayer(MediaPlayerEntity):
if self._status is None:
return None
if self.is_grouped and not self.is_master:
if self.is_grouped and not self.is_leader:
return self._group_name
return self._status.artist
@@ -388,7 +428,7 @@ class BluesoundPlayer(MediaPlayerEntity):
@property
def media_album_name(self) -> str | None:
        """Album name of current playing media (Music track only)."""
if self._status is None or (self.is_grouped and not self.is_master):
if self._status is None or (self.is_grouped and not self.is_leader):
return None
return self._status.album
@@ -396,7 +436,7 @@ class BluesoundPlayer(MediaPlayerEntity):
@property
def media_image_url(self) -> str | None:
"""Image url of current playing media."""
if self._status is None or (self.is_grouped and not self.is_master):
if self._status is None or (self.is_grouped and not self.is_leader):
return None
url = self._status.image
@@ -411,7 +451,7 @@ class BluesoundPlayer(MediaPlayerEntity):
@property
def media_position(self) -> int | None:
"""Position of current playing media in seconds."""
if self._status is None or (self.is_grouped and not self.is_master):
if self._status is None or (self.is_grouped and not self.is_leader):
return None
mediastate = self.state
@@ -430,7 +470,7 @@ class BluesoundPlayer(MediaPlayerEntity):
@property
def media_duration(self) -> int | None:
"""Duration of current playing media in seconds."""
if self._status is None or (self.is_grouped and not self.is_master):
if self._status is None or (self.is_grouped and not self.is_leader):
return None
duration = self._status.total_seconds
@@ -489,7 +529,7 @@ class BluesoundPlayer(MediaPlayerEntity):
@property
def source_list(self) -> list[str] | None:
"""List of available input sources."""
if self._status is None or (self.is_grouped and not self.is_master):
if self._status is None or (self.is_grouped and not self.is_leader):
return None
sources = [x.text for x in self._inputs]
@@ -500,7 +540,7 @@ class BluesoundPlayer(MediaPlayerEntity):
@property
def source(self) -> str | None:
"""Name of the current input source."""
if self._status is None or (self.is_grouped and not self.is_master):
if self._status is None or (self.is_grouped and not self.is_leader):
return None
if self._status.input_id is not None:
@@ -520,7 +560,7 @@ class BluesoundPlayer(MediaPlayerEntity):
if self._status is None:
return MediaPlayerEntityFeature(0)
if self.is_grouped and not self.is_master:
if self.is_grouped and not self.is_leader:
return (
MediaPlayerEntityFeature.VOLUME_STEP
| MediaPlayerEntityFeature.VOLUME_SET
@@ -560,14 +600,17 @@ class BluesoundPlayer(MediaPlayerEntity):
return supported
@property
def is_master(self) -> bool:
"""Return true if player is a coordinator."""
return self._is_master
def is_leader(self) -> bool:
"""Return true if player is leader of a group."""
return self._sync_status.followers is not None
@property
def is_grouped(self) -> bool:
"""Return true if player is a coordinator."""
return self._master is not None or self._is_master
"""Return true if player is member or leader of a group."""
return (
self._sync_status.followers is not None
or self._sync_status.leader is not None
)
@property
def shuffle(self) -> bool:
@@ -580,25 +623,25 @@ class BluesoundPlayer(MediaPlayerEntity):
async def async_join(self, master: str) -> None:
"""Join the player to a group."""
master_device = [
device
for device in self.hass.data[DATA_BLUESOUND]
if device.entity_id == master
]
if master == self.entity_id:
raise ServiceValidationError("Cannot join player to itself")
if len(master_device) > 0:
if self.id == master_device[0].id:
raise ServiceValidationError("Cannot join player to itself")
_LOGGER.debug("Trying to join player: %s", self.id)
async_dispatcher_send(
self.hass, dispatcher_join_signal(master), self.host, self.port
)
_LOGGER.debug(
"Trying to join player: %s to master: %s",
self.id,
master_device[0].id,
)
async def async_unjoin(self) -> None:
"""Unjoin the player from a group."""
if self._sync_status.leader is None:
return
await master_device[0].async_add_slave(self)
else:
_LOGGER.error("Master not found %s", master_device)
leader_id = f"{self._sync_status.leader.ip}:{self._sync_status.leader.port}"
_LOGGER.debug("Trying to unjoin player: %s", self.id)
async_dispatcher_send(
self.hass, dispatcher_unjoin_signal(leader_id), self.host, self.port
)
@property
def extra_state_attributes(self) -> dict[str, Any] | None:
@@ -607,31 +650,31 @@ class BluesoundPlayer(MediaPlayerEntity):
if self._group_list:
attributes = {ATTR_BLUESOUND_GROUP: self._group_list}
attributes[ATTR_MASTER] = self._is_master
attributes[ATTR_MASTER] = self.is_leader
return attributes
def rebuild_bluesound_group(self) -> list[str]:
        """Rebuild the list of entities in the speaker group."""
if self.sync_status.master is None and self.sync_status.slaves is None:
if self.sync_status.leader is None and self.sync_status.followers is None:
return []
player_entities: list[BluesoundPlayer] = self.hass.data[DATA_BLUESOUND]
leader_sync_status: SyncStatus | None = None
if self.sync_status.master is None:
if self.sync_status.leader is None:
leader_sync_status = self.sync_status
else:
required_id = f"{self.sync_status.master.ip}:{self.sync_status.master.port}"
required_id = f"{self.sync_status.leader.ip}:{self.sync_status.leader.port}"
for x in player_entities:
if x.sync_status.id == required_id:
leader_sync_status = x.sync_status
break
if leader_sync_status is None or leader_sync_status.slaves is None:
if leader_sync_status is None or leader_sync_status.followers is None:
return []
follower_ids = [f"{x.ip}:{x.port}" for x in leader_sync_status.slaves]
follower_ids = [f"{x.ip}:{x.port}" for x in leader_sync_status.followers]
follower_names = [
x.sync_status.name
for x in player_entities
@@ -640,21 +683,13 @@ class BluesoundPlayer(MediaPlayerEntity):
follower_names.insert(0, leader_sync_status.name)
return follower_names
async def async_unjoin(self) -> None:
"""Unjoin the player from a group."""
if self._master is None:
return
async def async_add_follower(self, host: str, port: int) -> None:
        """Add a follower to the leader."""
await self._player.add_follower(host, port)
_LOGGER.debug("Trying to unjoin player: %s", self.id)
await self._master.async_remove_slave(self)
async def async_add_slave(self, slave_device: BluesoundPlayer) -> None:
"""Add slave to master."""
await self._player.add_slave(slave_device.host, slave_device.port)
async def async_remove_slave(self, slave_device: BluesoundPlayer) -> None:
"""Remove slave to master."""
await self._player.remove_slave(slave_device.host, slave_device.port)
async def async_remove_follower(self, host: str, port: int) -> None:
        """Remove a follower from the leader."""
await self._player.remove_follower(host, port)
async def async_increase_timer(self) -> int:
"""Increase sleep time on player."""
@@ -672,7 +707,7 @@ class BluesoundPlayer(MediaPlayerEntity):
async def async_select_source(self, source: str) -> None:
"""Select input source."""
if self.is_grouped and not self.is_master:
if self.is_grouped and not self.is_leader:
return
# presets and inputs might have the same name; presets have priority
@@ -691,49 +726,49 @@ class BluesoundPlayer(MediaPlayerEntity):
async def async_clear_playlist(self) -> None:
        """Clear the player's playlist."""
if self.is_grouped and not self.is_master:
if self.is_grouped and not self.is_leader:
return
await self._player.clear()
async def async_media_next_track(self) -> None:
"""Send media_next command to media player."""
if self.is_grouped and not self.is_master:
if self.is_grouped and not self.is_leader:
return
await self._player.skip()
async def async_media_previous_track(self) -> None:
"""Send media_previous command to media player."""
if self.is_grouped and not self.is_master:
if self.is_grouped and not self.is_leader:
return
await self._player.back()
async def async_media_play(self) -> None:
"""Send media_play command to media player."""
if self.is_grouped and not self.is_master:
if self.is_grouped and not self.is_leader:
return
await self._player.play()
async def async_media_pause(self) -> None:
"""Send media_pause command to media player."""
if self.is_grouped and not self.is_master:
if self.is_grouped and not self.is_leader:
return
await self._player.pause()
async def async_media_stop(self) -> None:
"""Send stop command."""
if self.is_grouped and not self.is_master:
if self.is_grouped and not self.is_leader:
return
await self._player.stop()
async def async_media_seek(self, position: float) -> None:
"""Send media_seek command to media player."""
if self.is_grouped and not self.is_master:
if self.is_grouped and not self.is_leader:
return
await self._player.play(seek=int(position))
@@ -742,7 +777,7 @@ class BluesoundPlayer(MediaPlayerEntity):
self, media_type: MediaType | str, media_id: str, **kwargs: Any
) -> None:
"""Send the play_media command to the media player."""
if self.is_grouped and not self.is_master:
if self.is_grouped and not self.is_leader:
return
if media_source.is_media_source_id(media_id):
@@ -1,68 +0,0 @@
"""Support for Bluesound devices."""
from __future__ import annotations
from typing import NamedTuple
import voluptuous as vol
from homeassistant.const import ATTR_ENTITY_ID
from homeassistant.core import HomeAssistant, ServiceCall
from homeassistant.helpers import config_validation as cv
from .const import ATTR_MASTER, DOMAIN
SERVICE_CLEAR_TIMER = "clear_sleep_timer"
SERVICE_JOIN = "join"
SERVICE_SET_TIMER = "set_sleep_timer"
SERVICE_UNJOIN = "unjoin"
BS_SCHEMA = vol.Schema({vol.Optional(ATTR_ENTITY_ID): cv.entity_ids})
BS_JOIN_SCHEMA = BS_SCHEMA.extend({vol.Required(ATTR_MASTER): cv.entity_id})
class ServiceMethodDetails(NamedTuple):
"""Details for SERVICE_TO_METHOD mapping."""
method: str
schema: vol.Schema
SERVICE_TO_METHOD = {
SERVICE_JOIN: ServiceMethodDetails(method="async_join", schema=BS_JOIN_SCHEMA),
SERVICE_UNJOIN: ServiceMethodDetails(method="async_unjoin", schema=BS_SCHEMA),
SERVICE_SET_TIMER: ServiceMethodDetails(
method="async_increase_timer", schema=BS_SCHEMA
),
SERVICE_CLEAR_TIMER: ServiceMethodDetails(
method="async_clear_timer", schema=BS_SCHEMA
),
}
def setup_services(hass: HomeAssistant) -> None:
"""Set up services for Bluesound component."""
async def async_service_handler(service: ServiceCall) -> None:
"""Map services to method of Bluesound devices."""
if not (method := SERVICE_TO_METHOD.get(service.service)):
return
params = {
key: value for key, value in service.data.items() if key != ATTR_ENTITY_ID
}
if entity_ids := service.data.get(ATTR_ENTITY_ID):
target_players = [
player for player in hass.data[DOMAIN] if player.entity_id in entity_ids
]
else:
target_players = hass.data[DOMAIN]
for player in target_players:
await getattr(player, method.method)(**params)
for service, method in SERVICE_TO_METHOD.items():
hass.services.async_register(
DOMAIN, service, async_service_handler, schema=method.schema
)
@@ -6,3 +6,16 @@ from homeassistant.helpers.device_registry import format_mac
def format_unique_id(mac: str, port: int) -> str:
"""Generate a unique ID based on the MAC address and port number."""
return f"{format_mac(mac)}-{port}"
def dispatcher_join_signal(entity_id: str) -> str:
    """Return the join dispatcher signal for an entity ID."""
return f"bluesound_join_{entity_id}"
def dispatcher_unjoin_signal(leader_id: str) -> str:
    """Return the unjoin dispatcher signal for a leader ID.
    The leader ID is ip_address:port, as obtained from sync_status.id.
    """
return f"bluesound_unjoin_{leader_id}"
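The two helpers above build per-player dispatcher signal names, so a join targets an entity ID while an unjoin targets a leader's `ip:port` ID. A short usage sketch (the entity ID and leader ID below are made-up examples):

```python
def dispatcher_join_signal(entity_id: str) -> str:
    """Signal a specific entity to accept a new follower."""
    return f"bluesound_join_{entity_id}"

def dispatcher_unjoin_signal(leader_id: str) -> str:
    """Signal a leader (identified by ip:port) to drop a follower."""
    return f"bluesound_unjoin_{leader_id}"

# Joins are addressed by entity ID, unjoins by the leader's sync_status.id.
print(dispatcher_join_signal("media_player.office"))
print(dispatcher_unjoin_signal("192.168.1.2:11000"))
```

Encoding the target in the signal name means each player only subscribes to its own join/unjoin signals, so `async_dispatcher_send` reaches exactly one entity without any central routing table.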
@@ -6,7 +6,6 @@ import logging
import voluptuous as vol
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_DEVICE_ID, CONF_ENTITY_ID, CONF_NAME, Platform
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import (
@@ -50,7 +49,7 @@ SERVICE_UPDATE_STATE = "update_state"
@callback
def _async_migrate_options_from_data_if_missing(
hass: HomeAssistant, entry: ConfigEntry
hass: HomeAssistant, entry: BMWConfigEntry
) -> None:
data = dict(entry.data)
options = dict(entry.options)
@@ -74,23 +73,29 @@ async def _async_migrate_entries(
@callback
def update_unique_id(entry: er.RegistryEntry) -> dict[str, str] | None:
replacements = {
"charging_level_hv": "fuel_and_battery.remaining_battery_percent",
"fuel_percent": "fuel_and_battery.remaining_fuel_percent",
"ac_current_limit": "charging_profile.ac_current_limit",
"charging_start_time": "fuel_and_battery.charging_start_time",
"charging_end_time": "fuel_and_battery.charging_end_time",
"charging_status": "fuel_and_battery.charging_status",
"charging_target": "fuel_and_battery.charging_target",
"remaining_battery_percent": "fuel_and_battery.remaining_battery_percent",
"remaining_range_total": "fuel_and_battery.remaining_range_total",
"remaining_range_electric": "fuel_and_battery.remaining_range_electric",
"remaining_range_fuel": "fuel_and_battery.remaining_range_fuel",
"remaining_fuel": "fuel_and_battery.remaining_fuel",
"remaining_fuel_percent": "fuel_and_battery.remaining_fuel_percent",
"activity": "climate.activity",
Platform.SENSOR.value: {
"charging_level_hv": "fuel_and_battery.remaining_battery_percent",
"fuel_percent": "fuel_and_battery.remaining_fuel_percent",
"ac_current_limit": "charging_profile.ac_current_limit",
"charging_start_time": "fuel_and_battery.charging_start_time",
"charging_end_time": "fuel_and_battery.charging_end_time",
"charging_status": "fuel_and_battery.charging_status",
"charging_target": "fuel_and_battery.charging_target",
"remaining_battery_percent": "fuel_and_battery.remaining_battery_percent",
"remaining_range_total": "fuel_and_battery.remaining_range_total",
"remaining_range_electric": "fuel_and_battery.remaining_range_electric",
"remaining_range_fuel": "fuel_and_battery.remaining_range_fuel",
"remaining_fuel": "fuel_and_battery.remaining_fuel",
"remaining_fuel_percent": "fuel_and_battery.remaining_fuel_percent",
"activity": "climate.activity",
}
}
if (key := entry.unique_id.split("-")[-1]) in replacements:
new_unique_id = entry.unique_id.replace(key, replacements[key])
if (key := entry.unique_id.split("-")[-1]) in replacements.get(
entry.domain, []
):
new_unique_id = entry.unique_id.replace(
key, replacements[entry.domain][key]
)
_LOGGER.debug(
"Migrating entity '%s' unique_id from '%s' to '%s'",
entry.entity_id,
@@ -116,7 +121,7 @@ async def _async_migrate_entries(
return True
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_setup_entry(hass: HomeAssistant, entry: BMWConfigEntry) -> bool:
"""Set up BMW Connected Drive from a config entry."""
_async_migrate_options_from_data_if_missing(hass, entry)
@@ -164,7 +169,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
return True
async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_unload_entry(hass: HomeAssistant, entry: BMWConfigEntry) -> bool:
"""Unload a config entry."""
return await hass.config_entries.async_unload_platforms(
@@ -16,7 +16,7 @@ from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from . import BMWConfigEntry
from . import DOMAIN as BMW_DOMAIN, BMWConfigEntry
from .entity import BMWBaseEntity
if TYPE_CHECKING:
@@ -55,7 +55,6 @@ BUTTON_TYPES: tuple[BMWButtonEntityDescription, ...] = (
BMWButtonEntityDescription(
key="deactivate_air_conditioning",
translation_key="deactivate_air_conditioning",
name="Deactivate air conditioning",
remote_function=lambda vehicle: vehicle.remote_services.trigger_remote_air_conditioning_stop(),
is_available=lambda vehicle: vehicle.is_remote_climate_stop_enabled,
),
@@ -111,6 +110,10 @@ class BMWButton(BMWBaseEntity, ButtonEntity):
try:
await self.entity_description.remote_function(self.vehicle)
except MyBMWAPIError as ex:
raise HomeAssistantError(ex) from ex
raise HomeAssistantError(
translation_domain=BMW_DOMAIN,
translation_key="remote_service_error",
translation_placeholders={"exception": str(ex)},
) from ex
self.coordinator.async_update_listeners()
@@ -18,7 +18,6 @@ import voluptuous as vol
from homeassistant.config_entries import (
SOURCE_REAUTH,
SOURCE_RECONFIGURE,
ConfigEntry,
ConfigFlow,
ConfigFlowResult,
OptionsFlow,
@@ -39,6 +38,7 @@ from .const import (
CONF_READ_ONLY,
CONF_REFRESH_TOKEN,
)
from .coordinator import BMWConfigEntry
DATA_SCHEMA = vol.Schema(
{
@@ -224,7 +224,7 @@ class BMWConfigFlow(ConfigFlow, domain=DOMAIN):
@staticmethod
@callback
def async_get_options_flow(
config_entry: ConfigEntry,
config_entry: BMWConfigEntry,
) -> BMWOptionsFlow:
"""Return a MyBMW option flow."""
return BMWOptionsFlow()
@@ -22,7 +22,13 @@ from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from homeassistant.util.ssl import get_default_context
from .const import CONF_GCID, CONF_READ_ONLY, CONF_REFRESH_TOKEN, DOMAIN, SCAN_INTERVALS
from .const import (
CONF_GCID,
CONF_READ_ONLY,
CONF_REFRESH_TOKEN,
DOMAIN as BMW_DOMAIN,
SCAN_INTERVALS,
)
_LOGGER = logging.getLogger(__name__)
@@ -36,7 +42,7 @@ class BMWDataUpdateCoordinator(DataUpdateCoordinator[None]):
account: MyBMWAccount
config_entry: BMWConfigEntry
def __init__(self, hass: HomeAssistant, *, config_entry: ConfigEntry) -> None:
def __init__(self, hass: HomeAssistant, *, config_entry: BMWConfigEntry) -> None:
"""Initialize account-wide BMW data updater."""
self.account = MyBMWAccount(
config_entry.data[CONF_USERNAME],
@@ -57,7 +63,7 @@ class BMWDataUpdateCoordinator(DataUpdateCoordinator[None]):
hass,
_LOGGER,
config_entry=config_entry,
name=f"{DOMAIN}-{config_entry.data[CONF_USERNAME]}",
name=f"{BMW_DOMAIN}-{config_entry.data[CONF_USERNAME]}",
update_interval=timedelta(
seconds=SCAN_INTERVALS[config_entry.data[CONF_REGION]]
),
@@ -75,18 +81,29 @@ class BMWDataUpdateCoordinator(DataUpdateCoordinator[None]):
except MyBMWCaptchaMissingError as err:
# If a captcha is required (user/password login flow), always trigger the reauth flow
raise ConfigEntryAuthFailed(
translation_domain=DOMAIN,
translation_domain=BMW_DOMAIN,
translation_key="missing_captcha",
) from err
except MyBMWAuthError as err:
# Allow one retry interval before raising AuthFailed to avoid flaky API issues
if self.last_update_success:
raise UpdateFailed(err) from err
raise UpdateFailed(
translation_domain=BMW_DOMAIN,
translation_key="update_failed",
translation_placeholders={"exception": str(err)},
) from err
# Clear refresh token and trigger reauth if previous update failed as well
self._update_config_entry_refresh_token(None)
raise ConfigEntryAuthFailed(err) from err
raise ConfigEntryAuthFailed(
translation_domain=BMW_DOMAIN,
translation_key="invalid_auth",
) from err
except (MyBMWAPIError, RequestError) as err:
raise UpdateFailed(err) from err
raise UpdateFailed(
translation_domain=BMW_DOMAIN,
translation_key="update_failed",
translation_placeholders={"exception": str(err)},
) from err
if self.account.refresh_token != old_refresh_token:
self._update_config_entry_refresh_token(self.account.refresh_token)
@@ -49,7 +49,7 @@ class BMWDeviceTracker(BMWBaseEntity, TrackerEntity):
_attr_force_update = False
_attr_translation_key = "car"
_attr_icon = "mdi:car"
_attr_name = None
def __init__(
self,
@@ -58,9 +58,7 @@ class BMWDeviceTracker(BMWBaseEntity, TrackerEntity):
) -> None:
"""Initialize the Tracker."""
super().__init__(coordinator, vehicle)
self._attr_unique_id = vehicle.vin
self._attr_name = None
@property
def extra_state_attributes(self) -> dict[str, Any]:
@@ -14,7 +14,7 @@ from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from . import BMWConfigEntry
from . import DOMAIN as BMW_DOMAIN, BMWConfigEntry
from .coordinator import BMWDataUpdateCoordinator
from .entity import BMWBaseEntity
@@ -70,7 +70,11 @@ class BMWLock(BMWBaseEntity, LockEntity):
# Set the state to unknown if the command fails
self._attr_is_locked = None
self.async_write_ha_state()
raise HomeAssistantError(ex) from ex
raise HomeAssistantError(
translation_domain=BMW_DOMAIN,
translation_key="remote_service_error",
translation_placeholders={"exception": str(ex)},
) from ex
finally:
# Always update the listeners to get the latest state
self.coordinator.async_update_listeners()
@@ -90,7 +94,11 @@ class BMWLock(BMWBaseEntity, LockEntity):
# Set the state to unknown if the command fails
self._attr_is_locked = None
self.async_write_ha_state()
raise HomeAssistantError(ex) from ex
raise HomeAssistantError(
translation_domain=BMW_DOMAIN,
translation_key="remote_service_error",
translation_placeholders={"exception": str(ex)},
) from ex
finally:
# Always update the listeners to get the latest state
self.coordinator.async_update_listeners()
@@ -20,7 +20,7 @@ from homeassistant.exceptions import HomeAssistantError, ServiceValidationError
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from . import DOMAIN, BMWConfigEntry
from . import DOMAIN as BMW_DOMAIN, BMWConfigEntry
PARALLEL_UPDATES = 1
@@ -92,7 +92,7 @@ class BMWNotificationService(BaseNotificationService):
except (vol.Invalid, TypeError, ValueError) as ex:
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_domain=BMW_DOMAIN,
translation_key="invalid_poi",
translation_placeholders={
"poi_exception": str(ex),
@@ -106,4 +106,8 @@ class BMWNotificationService(BaseNotificationService):
try:
await vehicle.remote_services.trigger_send_poi(poi)
except MyBMWAPIError as ex:
raise HomeAssistantError(ex) from ex
raise HomeAssistantError(
translation_domain=BMW_DOMAIN,
translation_key="remote_service_error",
translation_placeholders={"exception": str(ex)},
) from ex
@@ -18,7 +18,7 @@ from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from . import BMWConfigEntry
from . import DOMAIN as BMW_DOMAIN, BMWConfigEntry
from .coordinator import BMWDataUpdateCoordinator
from .entity import BMWBaseEntity
@@ -109,6 +109,10 @@ class BMWNumber(BMWBaseEntity, NumberEntity):
try:
await self.entity_description.remote_service(self.vehicle, value)
except MyBMWAPIError as ex:
raise HomeAssistantError(ex) from ex
raise HomeAssistantError(
translation_domain=BMW_DOMAIN,
translation_key="remote_service_error",
translation_placeholders={"exception": str(ex)},
) from ex
self.coordinator.async_update_listeners()
@@ -15,7 +15,7 @@ from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from . import BMWConfigEntry
from . import DOMAIN as BMW_DOMAIN, BMWConfigEntry
from .coordinator import BMWDataUpdateCoordinator
from .entity import BMWBaseEntity
@@ -123,6 +123,10 @@ class BMWSelect(BMWBaseEntity, SelectEntity):
try:
await self.entity_description.remote_service(self.vehicle, option)
except MyBMWAPIError as ex:
raise HomeAssistantError(ex) from ex
raise HomeAssistantError(
translation_domain=BMW_DOMAIN,
translation_key="remote_service_error",
translation_placeholders={"exception": str(ex)},
) from ex
self.coordinator.async_update_listeners()
@@ -2,11 +2,16 @@
"config": {
"step": {
"user": {
"description": "Enter your MyBMW/MINI Connected credentials.",
"description": "Connect to your MyBMW/MINI Connected account to retrieve vehicle data.",
"data": {
"username": "[%key:common::config_flow::data::username%]",
"password": "[%key:common::config_flow::data::password%]",
"region": "ConnectedDrive Region"
},
"data_description": {
"username": "The email address of your MyBMW/MINI Connected account.",
"password": "The password of your MyBMW/MINI Connected account.",
"region": "The region of your MyBMW/MINI Connected account."
}
},
"captcha": {
@@ -23,6 +28,9 @@
"description": "Update your MyBMW/MINI Connected password for account `{username}` in region `{region}`.",
"data": {
"password": "[%key:common::config_flow::data::password%]"
},
"data_description": {
"password": "[%key:component::bmw_connected_drive::config::step::user::data_description::password%]"
}
}
},
@@ -41,7 +49,10 @@
"step": {
"account_options": {
"data": {
"read_only": "Read-only (only sensors and notify, no execution of services, no lock)"
"read_only": "Read-only mode"
},
"data_description": {
"read_only": "Only retrieve values and send POI data, but don't offer any services that can change the vehicle state."
}
}
}
@@ -83,6 +94,9 @@
"activate_air_conditioning": {
"name": "Activate air conditioning"
},
"deactivate_air_conditioning": {
"name": "Deactivate air conditioning"
},
"find_vehicle": {
"name": "Find vehicle"
}
@@ -220,6 +234,15 @@
},
"missing_captcha": {
"message": "Login requires captcha validation"
},
"invalid_auth": {
"message": "[%key:common::config_flow::error::invalid_auth%]"
},
"remote_service_error": {
"message": "Error executing remote service on vehicle. {exception}"
},
"update_failed": {
"message": "Error updating vehicle data. {exception}"
}
}
}
@@ -14,7 +14,7 @@ from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from . import BMWConfigEntry
from . import DOMAIN as BMW_DOMAIN, BMWConfigEntry
from .coordinator import BMWDataUpdateCoordinator
from .entity import BMWBaseEntity
@@ -111,8 +111,11 @@ class BMWSwitch(BMWBaseEntity, SwitchEntity):
try:
await self.entity_description.remote_service_on(self.vehicle)
except MyBMWAPIError as ex:
raise HomeAssistantError(ex) from ex
raise HomeAssistantError(
translation_domain=BMW_DOMAIN,
translation_key="remote_service_error",
translation_placeholders={"exception": str(ex)},
) from ex
self.coordinator.async_update_listeners()
async def async_turn_off(self, **kwargs: Any) -> None:
@@ -120,6 +123,9 @@ class BMWSwitch(BMWBaseEntity, SwitchEntity):
try:
await self.entity_description.remote_service_off(self.vehicle)
except MyBMWAPIError as ex:
raise HomeAssistantError(ex) from ex
raise HomeAssistantError(
translation_domain=BMW_DOMAIN,
translation_key="remote_service_error",
translation_placeholders={"exception": str(ex)},
) from ex
self.coordinator.async_update_listeners()
@@ -7,7 +7,7 @@ rules:
brands: done
common-modules: done
config-flow-test-coverage: done
config-flow: done
config-flow: todo
dependency-transparency: done
docs-actions: done
docs-high-level-description: todo
@@ -0,0 +1,202 @@
"""Backup platform for the cloud integration."""
from __future__ import annotations
import base64
from collections.abc import AsyncIterator, Callable, Coroutine
import hashlib
from typing import Any, Self
from aiohttp import ClientError, ClientTimeout, StreamReader
from hass_nabucasa import Cloud, CloudError
from hass_nabucasa.cloud_api import (
async_files_delete_file,
async_files_download_details,
async_files_list,
async_files_upload_details,
)
from homeassistant.components.backup import AgentBackup, BackupAgent, BackupAgentError
from homeassistant.core import HomeAssistant, callback
from .client import CloudClient
from .const import DATA_CLOUD, DOMAIN
_STORAGE_BACKUP = "backup"
async def _b64md5(stream: AsyncIterator[bytes]) -> str:
"""Calculate the MD5 hash of a file."""
file_hash = hashlib.md5()
async for chunk in stream:
file_hash.update(chunk)
return base64.b64encode(file_hash.digest()).decode()
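The `_b64md5` helper above digests the backup stream chunk by chunk and base64-encodes the result. A runnable sketch showing the same logic agrees with hashing the whole payload at once (the `chunks` generator is a stand-in for the backup stream):

```python
import asyncio
import base64
import hashlib
from collections.abc import AsyncIterator


async def b64md5(stream: AsyncIterator[bytes]) -> str:
    """Base64-encoded MD5 of a chunked byte stream, as in _b64md5."""
    file_hash = hashlib.md5()
    async for chunk in stream:
        file_hash.update(chunk)
    return base64.b64encode(file_hash.digest()).decode()


async def chunks(data: bytes, size: int = 4) -> AsyncIterator[bytes]:
    """Yield data in fixed-size chunks, simulating a streamed backup file."""
    for i in range(0, len(data), size):
        yield data[i : i + size]


async def main() -> None:
    payload = b"home assistant backup payload"
    streamed = await b64md5(chunks(payload))
    # Incremental hashing must match hashing the whole payload in one go.
    whole = base64.b64encode(hashlib.md5(payload).digest()).decode()
    assert streamed == whole


asyncio.run(main())
```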
async def async_get_backup_agents(
hass: HomeAssistant,
**kwargs: Any,
) -> list[BackupAgent]:
"""Return the cloud backup agent."""
cloud = hass.data[DATA_CLOUD]
if not cloud.is_logged_in:
return []
return [CloudBackupAgent(hass=hass, cloud=cloud)]
class ChunkAsyncStreamIterator:
"""Async iterator for chunked streams.
Based on aiohttp.streams.ChunkTupleAsyncStreamIterator, but yields
bytes instead of tuple[bytes, bool].
"""
__slots__ = ("_stream",)
def __init__(self, stream: StreamReader) -> None:
"""Initialize."""
self._stream = stream
def __aiter__(self) -> Self:
"""Iterate."""
return self
async def __anext__(self) -> bytes:
"""Yield next chunk."""
rv = await self._stream.readchunk()
if rv == (b"", False):
raise StopAsyncIteration
return rv[0]
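The iterator above stops when `readchunk()` returns aiohttp's `(b"", False)` end-of-stream sentinel and otherwise yields only the bytes half of the tuple. A self-contained sketch with a fake reader (`FakeStreamReader` is a test stand-in, not part of aiohttp):

```python
import asyncio


class FakeStreamReader:
    """Test stand-in exposing aiohttp StreamReader's readchunk() shape."""

    def __init__(self, chunks: list[bytes]) -> None:
        self._chunks = chunks

    async def readchunk(self) -> tuple[bytes, bool]:
        if not self._chunks:
            return (b"", False)  # aiohttp's end-of-stream sentinel
        return (self._chunks.pop(0), True)


class ChunkIterator:
    """Yield raw bytes from a chunked reader, as ChunkAsyncStreamIterator does."""

    def __init__(self, stream: FakeStreamReader) -> None:
        self._stream = stream

    def __aiter__(self) -> "ChunkIterator":
        return self

    async def __anext__(self) -> bytes:
        rv = await self._stream.readchunk()
        if rv == (b"", False):
            raise StopAsyncIteration
        return rv[0]


async def collect(reader: FakeStreamReader) -> bytes:
    body = b""
    async for chunk in ChunkIterator(reader):
        body += chunk
    return body


assert asyncio.run(collect(FakeStreamReader([b"tar ", b"data"]))) == b"tar data"
```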
class CloudBackupAgent(BackupAgent):
"""Cloud backup agent."""
domain = DOMAIN
name = DOMAIN
def __init__(self, hass: HomeAssistant, cloud: Cloud[CloudClient]) -> None:
"""Initialize the cloud backup sync agent."""
super().__init__()
self._cloud = cloud
self._hass = hass
@callback
def _get_backup_filename(self) -> str:
"""Return the backup filename."""
return f"{self._cloud.client.prefs.instance_id}.tar"
async def async_download_backup(
self,
backup_id: str,
**kwargs: Any,
) -> AsyncIterator[bytes]:
"""Download a backup file.
:param backup_id: The ID of the backup that was returned in async_list_backups.
:return: An async iterator that yields bytes.
"""
if not await self.async_get_backup(backup_id):
raise BackupAgentError("Backup not found")
try:
details = await async_files_download_details(
self._cloud,
storage_type=_STORAGE_BACKUP,
filename=self._get_backup_filename(),
)
except (ClientError, CloudError) as err:
raise BackupAgentError("Failed to get download details") from err
try:
resp = await self._cloud.websession.get(details["url"])
resp.raise_for_status()
except ClientError as err:
raise BackupAgentError("Failed to download backup") from err
return ChunkAsyncStreamIterator(resp.content)
async def async_upload_backup(
self,
*,
open_stream: Callable[[], Coroutine[Any, Any, AsyncIterator[bytes]]],
backup: AgentBackup,
**kwargs: Any,
) -> None:
"""Upload a backup.
:param open_stream: A function returning an async iterator that yields bytes.
:param backup: Metadata about the backup that should be uploaded.
"""
if not backup.protected:
raise BackupAgentError("Cloud backups must be protected")
base64md5hash = await _b64md5(await open_stream())
try:
details = await async_files_upload_details(
self._cloud,
storage_type=_STORAGE_BACKUP,
filename=self._get_backup_filename(),
metadata=backup.as_dict(),
size=backup.size,
base64md5hash=base64md5hash,
)
except (ClientError, CloudError) as err:
raise BackupAgentError("Failed to get upload details") from err
try:
upload_status = await self._cloud.websession.put(
details["url"],
data=await open_stream(),
headers=details["headers"] | {"content-length": str(backup.size)},
timeout=ClientTimeout(connect=10.0, total=43200.0), # 43200s == 12h
)
upload_status.raise_for_status()
except (TimeoutError, ClientError) as err:
raise BackupAgentError("Failed to upload backup") from err
async def async_delete_backup(
self,
backup_id: str,
**kwargs: Any,
) -> None:
"""Delete a backup file.
:param backup_id: The ID of the backup that was returned in async_list_backups.
"""
if not await self.async_get_backup(backup_id):
return
try:
await async_files_delete_file(
self._cloud,
storage_type=_STORAGE_BACKUP,
filename=self._get_backup_filename(),
)
except (ClientError, CloudError) as err:
raise BackupAgentError("Failed to delete backup") from err
async def async_list_backups(self, **kwargs: Any) -> list[AgentBackup]:
"""List backups."""
try:
backups = await async_files_list(self._cloud, storage_type=_STORAGE_BACKUP)
except (ClientError, CloudError) as err:
raise BackupAgentError("Failed to list backups") from err
return [AgentBackup.from_dict(backup["Metadata"]) for backup in backups]
async def async_get_backup(
self,
backup_id: str,
**kwargs: Any,
) -> AgentBackup | None:
"""Return a backup."""
backups = await self.async_list_backups()
for backup in backups:
if backup.backup_id == backup_id:
return backup
return None
@@ -306,6 +306,7 @@ class CloudClient(Interface):
},
"version": HA_VERSION,
"instance_id": self.prefs.instance_id,
"name": self._hass.config.location_name,
}
async def async_alexa_message(self, payload: dict[Any, Any]) -> dict[Any, Any]:
@@ -88,3 +88,5 @@ DISPATCHER_REMOTE_UPDATE: SignalType[Any] = SignalType("cloud_remote_update")
STT_ENTITY_UNIQUE_ID = "cloud-speech-to-text"
TTS_ENTITY_UNIQUE_ID = "cloud-text-to-speech"
LOGIN_MFA_TIMEOUT = 60
@@ -9,6 +9,7 @@ import dataclasses
from functools import wraps
from http import HTTPStatus
import logging
import time
from typing import Any, Concatenate
import aiohttp
@@ -31,6 +32,7 @@ from homeassistant.components.http.data_validator import RequestDataValidator
from homeassistant.const import CLOUD_NEVER_EXPOSED_ENTITIES
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.util.location import async_detect_location_info
@@ -39,6 +41,7 @@ from .assist_pipeline import async_create_cloud_pipeline
from .client import CloudClient
from .const import (
DATA_CLOUD,
LOGIN_MFA_TIMEOUT,
PREF_ALEXA_REPORT_STATE,
PREF_DISABLE_2FA,
PREF_ENABLE_ALEXA,
@@ -69,6 +72,10 @@ _CLOUD_ERRORS: dict[type[Exception], tuple[HTTPStatus, str]] = {
}
class MFAExpiredOrNotStarted(auth.CloudError):
"""Multi-factor authentication expired, or not started."""
@callback
def async_setup(hass: HomeAssistant) -> None:
"""Initialize the HTTP API."""
@@ -101,6 +108,11 @@ def async_setup(hass: HomeAssistant) -> None:
_CLOUD_ERRORS.update(
{
auth.InvalidTotpCode: (HTTPStatus.BAD_REQUEST, "Invalid TOTP code."),
auth.MFARequired: (
HTTPStatus.UNAUTHORIZED,
"Multi-factor authentication required.",
),
auth.UserNotFound: (HTTPStatus.BAD_REQUEST, "User does not exist."),
auth.UserNotConfirmed: (HTTPStatus.BAD_REQUEST, "Email not confirmed."),
auth.UserExists: (
@@ -112,6 +124,10 @@ def async_setup(hass: HomeAssistant) -> None:
HTTPStatus.BAD_REQUEST,
"Password change required.",
),
MFAExpiredOrNotStarted: (
HTTPStatus.BAD_REQUEST,
"Multi-factor authentication expired, or not started. Please try again.",
),
}
)
@@ -206,19 +222,57 @@ class GoogleActionsSyncView(HomeAssistantView):
class CloudLoginView(HomeAssistantView):
"""Login to Home Assistant cloud."""
_mfa_tokens: dict[str, str] = {}
_mfa_tokens_set_time: float = 0
url = "/api/cloud/login"
name = "api:cloud:login"
@require_admin
@_handle_cloud_errors
@RequestDataValidator(
vol.Schema({vol.Required("email"): str, vol.Required("password"): str})
vol.Schema(
vol.All(
{
vol.Required("email"): str,
vol.Exclusive("password", "login"): str,
vol.Exclusive("code", "login"): str,
},
cv.has_at_least_one_key("password", "code"),
)
)
)
async def post(self, request: web.Request, data: dict[str, Any]) -> web.Response:
"""Handle login request."""
hass = request.app[KEY_HASS]
cloud = hass.data[DATA_CLOUD]
await cloud.login(data["email"], data["password"])
try:
email = data["email"]
password = data.get("password")
code = data.get("code")
if email and password:
await cloud.login(email, password)
else:
if (
not self._mfa_tokens
or time.time() - self._mfa_tokens_set_time > LOGIN_MFA_TIMEOUT
):
raise MFAExpiredOrNotStarted
# Voluptuous should ensure that code is not None, because password is not set
assert code is not None
await cloud.login_verify_totp(email, code, self._mfa_tokens)
self._mfa_tokens = {}
self._mfa_tokens_set_time = 0
except auth.MFARequired as mfa_err:
self._mfa_tokens = mfa_err.mfa_tokens
self._mfa_tokens_set_time = time.time()
raise
if "assist_pipeline" in hass.config.components:
new_cloud_pipeline_id = await async_create_cloud_pipeline(hass)
@@ -1,13 +1,18 @@
{
"domain": "cloud",
"name": "Home Assistant Cloud",
"after_dependencies": ["assist_pipeline", "google_assistant", "alexa"],
"after_dependencies": [
"alexa",
"assist_pipeline",
"backup",
"google_assistant"
],
"codeowners": ["@home-assistant/cloud"],
"dependencies": ["auth", "http", "repairs", "webhook"],
"documentation": "https://www.home-assistant.io/integrations/cloud",
"integration_type": "system",
"iot_class": "cloud_push",
"loggers": ["hass_nabucasa"],
"requirements": ["hass-nabucasa==0.86.0"],
"requirements": ["hass-nabucasa==0.87.0"],
"single_config_entry": true
}
@@ -5,5 +5,5 @@
"documentation": "https://www.home-assistant.io/integrations/compensation",
"iot_class": "calculated",
"quality_scale": "legacy",
"requirements": ["numpy==2.1.3"]
"requirements": ["numpy==2.2.0"]
}
@@ -24,7 +24,7 @@ from .agent_manager import (
get_agent_manager,
)
from .const import DATA_COMPONENT, DATA_DEFAULT_ENTITY
from .default_agent import METADATA_CUSTOM_FILE, METADATA_CUSTOM_SENTENCE, DefaultAgent
from .default_agent import METADATA_CUSTOM_FILE, METADATA_CUSTOM_SENTENCE
from .entity import ConversationEntity
from .models import ConversationInput
@@ -162,8 +162,7 @@ async def websocket_list_sentences(
hass: HomeAssistant, connection: websocket_api.ActiveConnection, msg: dict
) -> None:
"""List custom registered sentences."""
agent = hass.data.get(DATA_DEFAULT_ENTITY)
assert isinstance(agent, DefaultAgent)
agent = hass.data[DATA_DEFAULT_ENTITY]
sentences = []
for trigger_data in agent.trigger_sentences:
@@ -185,8 +184,7 @@ async def websocket_hass_agent_debug(
hass: HomeAssistant, connection: websocket_api.ActiveConnection, msg: dict
) -> None:
"""Return intents that would be matched by the default agent for a list of sentences."""
agent = hass.data.get(DATA_DEFAULT_ENTITY)
assert isinstance(agent, DefaultAgent)
agent = hass.data[DATA_DEFAULT_ENTITY]
# Return results for each sentence in the same order as the input.
result_dicts: list[dict[str, Any] | None] = []
@@ -0,0 +1,49 @@
"""The Cookidoo integration."""
from __future__ import annotations
from cookidoo_api import Cookidoo, CookidooConfig, CookidooLocalizationConfig
from homeassistant.const import (
CONF_COUNTRY,
CONF_EMAIL,
CONF_LANGUAGE,
CONF_PASSWORD,
Platform,
)
from homeassistant.core import HomeAssistant
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .coordinator import CookidooConfigEntry, CookidooDataUpdateCoordinator
PLATFORMS: list[Platform] = [Platform.TODO]
async def async_setup_entry(hass: HomeAssistant, entry: CookidooConfigEntry) -> bool:
"""Set up Cookidoo from a config entry."""
cookidoo = Cookidoo(
async_get_clientsession(hass),
CookidooConfig(
email=entry.data[CONF_EMAIL],
password=entry.data[CONF_PASSWORD],
localization=CookidooLocalizationConfig(
country_code=entry.data[CONF_COUNTRY].lower(),
language=entry.data[CONF_LANGUAGE],
),
),
)
coordinator = CookidooDataUpdateCoordinator(hass, cookidoo, entry)
await coordinator.async_config_entry_first_refresh()
entry.runtime_data = coordinator
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True
async def async_unload_entry(hass: HomeAssistant, entry: CookidooConfigEntry) -> bool:
"""Unload a config entry."""
return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
@@ -0,0 +1,247 @@
"""Config flow for Cookidoo integration."""
from __future__ import annotations
from collections.abc import Mapping
import logging
from typing import Any
from cookidoo_api import (
Cookidoo,
CookidooAuthException,
CookidooConfig,
CookidooLocalizationConfig,
CookidooRequestException,
get_country_options,
get_localization_options,
)
import voluptuous as vol
from homeassistant.config_entries import (
SOURCE_RECONFIGURE,
SOURCE_USER,
ConfigFlow,
ConfigFlowResult,
)
from homeassistant.const import CONF_COUNTRY, CONF_EMAIL, CONF_LANGUAGE, CONF_PASSWORD
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.selector import (
CountrySelector,
CountrySelectorConfig,
LanguageSelector,
LanguageSelectorConfig,
TextSelector,
TextSelectorConfig,
TextSelectorType,
)
from .const import DOMAIN
_LOGGER = logging.getLogger(__name__)
AUTH_DATA_SCHEMA = {
vol.Required(CONF_EMAIL): TextSelector(
TextSelectorConfig(
type=TextSelectorType.EMAIL,
autocomplete="email",
),
),
vol.Required(CONF_PASSWORD): TextSelector(
TextSelectorConfig(
type=TextSelectorType.PASSWORD,
autocomplete="current-password",
),
),
}
class CookidooConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle a config flow for Cookidoo."""
COUNTRY_DATA_SCHEMA: dict
LANGUAGE_DATA_SCHEMA: dict
user_input: dict[str, Any]
async def async_step_reconfigure(
self, user_input: dict[str, Any]
) -> ConfigFlowResult:
"""Perform reconfigure upon an user action."""
return await self.async_step_user(user_input)
async def async_step_user(
self,
user_input: dict[str, Any] | None = None,
) -> ConfigFlowResult:
"""Handle the user step as well as serve for reconfiguration."""
errors: dict[str, str] = {}
if user_input is not None and not (
errors := await self.validate_input(user_input)
):
if self.source == SOURCE_USER:
self._async_abort_entries_match({CONF_EMAIL: user_input[CONF_EMAIL]})
self.user_input = user_input
return await self.async_step_language()
await self.generate_country_schema()
suggested_values: dict = {}
if self.source == SOURCE_RECONFIGURE:
reconfigure_entry = self._get_reconfigure_entry()
suggested_values = {
**suggested_values,
**reconfigure_entry.data,
}
if user_input is not None:
suggested_values = {**suggested_values, **user_input}
return self.async_show_form(
step_id="user",
data_schema=self.add_suggested_values_to_schema(
data_schema=vol.Schema(
{**AUTH_DATA_SCHEMA, **self.COUNTRY_DATA_SCHEMA}
),
suggested_values=suggested_values,
),
description_placeholders={"cookidoo": "Cookidoo"},
errors=errors,
)
async def async_step_language(
self,
language_input: dict[str, Any] | None = None,
) -> ConfigFlowResult:
"""Async language step to set up the connection."""
errors: dict[str, str] = {}
if language_input is not None and not (
errors := await self.validate_input(self.user_input, language_input)
):
if self.source == SOURCE_USER:
return self.async_create_entry(
title="Cookidoo", data={**self.user_input, **language_input}
)
reconfigure_entry = self._get_reconfigure_entry()
return self.async_update_reload_and_abort(
reconfigure_entry,
data={
**reconfigure_entry.data,
**self.user_input,
**language_input,
},
)
await self.generate_language_schema()
return self.async_show_form(
step_id="language",
data_schema=vol.Schema(self.LANGUAGE_DATA_SCHEMA),
description_placeholders={"cookidoo": "Cookidoo"},
errors=errors,
)
async def async_step_reauth(
self, entry_data: Mapping[str, Any]
) -> ConfigFlowResult:
"""Perform reauth upon an API authentication error."""
return await self.async_step_reauth_confirm()
async def async_step_reauth_confirm(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Dialog that informs the user that reauth is required."""
errors: dict[str, str] = {}
reauth_entry = self._get_reauth_entry()
if user_input is not None:
if not (
errors := await self.validate_input({**reauth_entry.data, **user_input})
):
if user_input[CONF_EMAIL] != reauth_entry.data[CONF_EMAIL]:
self._async_abort_entries_match(
{CONF_EMAIL: user_input[CONF_EMAIL]}
)
return self.async_update_reload_and_abort(
reauth_entry, data_updates=user_input
)
return self.async_show_form(
step_id="reauth_confirm",
data_schema=self.add_suggested_values_to_schema(
data_schema=vol.Schema(AUTH_DATA_SCHEMA),
suggested_values={CONF_EMAIL: reauth_entry.data[CONF_EMAIL]},
),
description_placeholders={"cookidoo": "Cookidoo"},
errors=errors,
)
async def generate_country_schema(self) -> None:
"""Generate country schema."""
self.COUNTRY_DATA_SCHEMA = {
vol.Required(CONF_COUNTRY): CountrySelector(
CountrySelectorConfig(
countries=[
country.upper() for country in await get_country_options()
],
)
)
}
async def generate_language_schema(self) -> None:
"""Generate language schema."""
self.LANGUAGE_DATA_SCHEMA = {
vol.Required(CONF_LANGUAGE): LanguageSelector(
LanguageSelectorConfig(
languages=[
option.language
for option in await get_localization_options(
country=self.user_input[CONF_COUNTRY].lower()
)
],
native_name=True,
),
),
}
async def validate_input(
self,
user_input: dict[str, Any],
language_input: dict[str, Any] | None = None,
) -> dict[str, str]:
"""Input Helper."""
errors: dict[str, str] = {}
data_input: dict[str, Any] = {}
if self.source == SOURCE_RECONFIGURE:
reconfigure_entry = self._get_reconfigure_entry()
data_input = {**data_input, **reconfigure_entry.data}
data_input = {**data_input, **user_input}
if language_input:
data_input = {**data_input, **language_input}
else:
data_input[CONF_LANGUAGE] = (
await get_localization_options(country=data_input[CONF_COUNTRY].lower())
)[0] # Pick any language to test login
session = async_get_clientsession(self.hass)
cookidoo = Cookidoo(
session,
CookidooConfig(
email=data_input[CONF_EMAIL],
password=data_input[CONF_PASSWORD],
localization=CookidooLocalizationConfig(
country_code=data_input[CONF_COUNTRY].lower(),
language=data_input[CONF_LANGUAGE],
),
),
)
try:
await cookidoo.login()
if language_input:
await cookidoo.get_additional_items()
except CookidooRequestException:
errors["base"] = "cannot_connect"
except CookidooAuthException:
errors["base"] = "invalid_auth"
except Exception:
_LOGGER.exception("Unexpected exception")
errors["base"] = "unknown"
return errors
@@ -0,0 +1,3 @@
"""Constants for the Cookidoo integration."""
DOMAIN = "cookidoo"
@@ -0,0 +1,101 @@
"""DataUpdateCoordinator for the Cookidoo integration."""
from __future__ import annotations
from dataclasses import dataclass
from datetime import timedelta
import logging
from cookidoo_api import (
Cookidoo,
CookidooAdditionalItem,
CookidooAuthException,
CookidooException,
CookidooIngredientItem,
CookidooRequestException,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_EMAIL
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import DOMAIN
_LOGGER = logging.getLogger(__name__)
type CookidooConfigEntry = ConfigEntry[CookidooDataUpdateCoordinator]
@dataclass
class CookidooData:
"""Cookidoo data type."""
ingredient_items: list[CookidooIngredientItem]
additional_items: list[CookidooAdditionalItem]
class CookidooDataUpdateCoordinator(DataUpdateCoordinator[CookidooData]):
"""A Cookidoo Data Update Coordinator."""
config_entry: CookidooConfigEntry
def __init__(
self, hass: HomeAssistant, cookidoo: Cookidoo, entry: CookidooConfigEntry
) -> None:
"""Initialize the Cookidoo data coordinator."""
super().__init__(
hass,
_LOGGER,
name=DOMAIN,
update_interval=timedelta(seconds=90),
config_entry=entry,
)
self.cookidoo = cookidoo
async def _async_setup(self) -> None:
try:
await self.cookidoo.login()
except CookidooRequestException as e:
raise UpdateFailed(
translation_domain=DOMAIN,
translation_key="setup_request_exception",
) from e
except CookidooAuthException as e:
raise ConfigEntryAuthFailed(
translation_domain=DOMAIN,
translation_key="setup_authentication_exception",
translation_placeholders={
CONF_EMAIL: self.config_entry.data[CONF_EMAIL]
},
) from e
async def _async_update_data(self) -> CookidooData:
try:
ingredient_items = await self.cookidoo.get_ingredient_items()
additional_items = await self.cookidoo.get_additional_items()
except CookidooAuthException:
try:
await self.cookidoo.refresh_token()
except CookidooAuthException as exc:
raise ConfigEntryAuthFailed(
translation_domain=DOMAIN,
translation_key="setup_authentication_exception",
translation_placeholders={
CONF_EMAIL: self.config_entry.data[CONF_EMAIL]
},
) from exc
_LOGGER.debug(
"Authentication failed but re-authentication was successful, trying again later"
)
return self.data
except CookidooException as e:
raise UpdateFailed(
translation_domain=DOMAIN,
translation_key="update_exception",
) from e
return CookidooData(
ingredient_items=ingredient_items, additional_items=additional_items
)
@@ -0,0 +1,30 @@
"""Base entity for the Cookidoo integration."""
from __future__ import annotations
from homeassistant.helpers.device_registry import DeviceEntryType, DeviceInfo
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN
from .coordinator import CookidooDataUpdateCoordinator
class CookidooBaseEntity(CoordinatorEntity[CookidooDataUpdateCoordinator]):
"""Cookidoo base entity."""
_attr_has_entity_name = True
def __init__(
self,
coordinator: CookidooDataUpdateCoordinator,
) -> None:
"""Initialize the entity."""
super().__init__(coordinator)
self.device_info = DeviceInfo(
entry_type=DeviceEntryType.SERVICE,
name="Cookidoo",
identifiers={(DOMAIN, coordinator.config_entry.entry_id)},
manufacturer="Vorwerk International & Co. KmG",
model="Cookidoo - Thermomix® recipe portal",
)
@@ -0,0 +1,12 @@
{
"entity": {
"todo": {
"ingredient_list": {
"default": "mdi:cart-plus"
},
"additional_item_list": {
"default": "mdi:cart-plus"
}
}
}
}
@@ -0,0 +1,11 @@
{
"domain": "cookidoo",
"name": "Cookidoo",
"codeowners": ["@miaucl"],
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/cookidoo",
"integration_type": "service",
"iot_class": "cloud_polling",
"quality_scale": "silver",
"requirements": ["cookidoo-api==0.10.0"]
}
@@ -0,0 +1,90 @@
rules:
# Bronze
action-setup:
status: exempt
comment: No service actions implemented
appropriate-polling: done
brands: done
common-modules: done
config-flow-test-coverage: done
config-flow: done
dependency-transparency: done
docs-actions:
status: exempt
comment: No service actions implemented
docs-high-level-description: done
docs-installation-instructions: done
docs-removal-instructions:
status: exempt
comment: No special external action required
entity-event-setup:
status: exempt
comment: No callbacks are implemented
entity-unique-id: done
has-entity-name: done
runtime-data: done
test-before-configure: done
test-before-setup: done
unique-config-entry: done
# Silver
config-entry-unloading: done
log-when-unavailable:
status: done
comment: Offloaded to coordinator
entity-unavailable:
status: done
comment: Offloaded to coordinator
action-exceptions:
status: done
comment: Only providing todo actions
reauthentication-flow: done
parallel-updates: done
test-coverage: done
integration-owner: done
docs-installation-parameters: done
docs-configuration-parameters:
status: exempt
comment: No options flow
# Gold
entity-translations: done
entity-device-class:
status: exempt
comment: Currently no platform with device classes
devices: done
entity-category: done
entity-disabled-by-default:
status: exempt
comment: No disabled entities implemented
discovery:
status: exempt
comment: Nothing to discover
stale-devices:
status: exempt
comment: No stale entities possible
diagnostics: todo
exception-translations: done
icon-translations: done
reconfiguration-flow: done
dynamic-devices:
status: exempt
comment: No dynamic entities available
discovery-update-info:
status: exempt
comment: No discoverable entities implemented
repair-issues:
status: exempt
comment: No issues/repairs
docs-use-cases: todo
docs-supported-devices: todo
docs-supported-functions: todo
docs-data-update: done
docs-known-limitations: done
docs-troubleshooting: todo
docs-examples: todo
# Platinum
async-dependency: done
inject-websession: done
strict-typing: done
@@ -0,0 +1,80 @@
{
"config": {
"step": {
"user": {
"title": "Setup {cookidoo}",
"data": {
"email": "[%key:common::config_flow::data::email%]",
"password": "[%key:common::config_flow::data::password%]",
"country": "Country"
},
"data_description": {
"email": "Email used to access your {cookidoo} account.",
"password": "Password used to access your {cookidoo} account.",
"country": "Pick your country for the {cookidoo} content."
}
},
"language": {
"title": "Setup {cookidoo}",
"data": {
"language": "[%key:common::config_flow::data::language%]"
},
"data_description": {
"language": "Pick your language for the {cookidoo} content."
}
},
"reauth_confirm": {
"title": "Login again to {cookidoo}",
"description": "Please log in to {cookidoo} again to continue using this integration.",
"data": {
"email": "[%key:common::config_flow::data::email%]",
"password": "[%key:common::config_flow::data::password%]"
},
"data_description": {
"email": "[%key:component::cookidoo::config::step::user::data_description::email%]",
"password": "[%key:component::cookidoo::config::step::user::data_description::password%]"
}
}
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"invalid_auth": "[%key:common::config_flow::error::invalid_auth%]",
"unknown": "[%key:common::config_flow::error::unknown%]"
},
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]",
"reconfigure_successful": "[%key:common::config_flow::abort::reconfigure_successful%]"
}
},
"entity": {
"todo": {
"ingredient_list": {
"name": "Shopping list"
},
"additional_item_list": {
"name": "Additional purchases"
}
}
},
"exceptions": {
"todo_save_item_failed": {
"message": "Failed to save {name} to Cookidoo shopping list"
},
"todo_update_item_failed": {
"message": "Failed to update {name} in Cookidoo shopping list"
},
"todo_delete_item_failed": {
"message": "Failed to delete {count} item(s) from Cookidoo shopping list"
},
"setup_request_exception": {
"message": "Failed to connect to server, try again later"
},
"setup_authentication_exception": {
"message": "Authentication failed for {email}, check your email and password"
},
"update_exception": {
"message": "Unable to connect and retrieve data from cookidoo"
}
}
}
@@ -0,0 +1,185 @@
"""Todo platform for the Cookidoo integration."""
from __future__ import annotations
from typing import TYPE_CHECKING
from cookidoo_api import (
CookidooAdditionalItem,
CookidooException,
CookidooIngredientItem,
)
from homeassistant.components.todo import (
TodoItem,
TodoItemStatus,
TodoListEntity,
TodoListEntityFeature,
)
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from .const import DOMAIN
from .coordinator import CookidooConfigEntry, CookidooDataUpdateCoordinator
from .entity import CookidooBaseEntity
PARALLEL_UPDATES = 0
async def async_setup_entry(
hass: HomeAssistant,
config_entry: CookidooConfigEntry,
async_add_entities: AddEntitiesCallback,
) -> None:
"""Set up the todo list from a config entry created in the integrations UI."""
coordinator = config_entry.runtime_data
async_add_entities(
[
CookidooIngredientsTodoListEntity(coordinator),
CookidooAdditionalItemTodoListEntity(coordinator),
]
)
class CookidooIngredientsTodoListEntity(CookidooBaseEntity, TodoListEntity):
"""A To-do List representation of the ingredients in the Cookidoo Shopping List."""
_attr_translation_key = "ingredient_list"
_attr_supported_features = TodoListEntityFeature.UPDATE_TODO_ITEM
def __init__(self, coordinator: CookidooDataUpdateCoordinator) -> None:
"""Initialize the entity."""
super().__init__(coordinator)
self._attr_unique_id = f"{coordinator.config_entry.entry_id}_ingredients"
@property
def todo_items(self) -> list[TodoItem]:
"""Return the todo ingredients."""
return [
TodoItem(
uid=item.id,
summary=item.name,
description=item.description or "",
status=(
TodoItemStatus.COMPLETED
if item.is_owned
else TodoItemStatus.NEEDS_ACTION
),
)
for item in self.coordinator.data.ingredient_items
]
async def async_update_todo_item(self, item: TodoItem) -> None:
"""Update an ingredient to the To-do list.
Cookidoo ingredients can be changed in state, but not in summary or description. This is currently not possible to distinguish in home assistant and just fails silently.
"""
try:
if TYPE_CHECKING:
assert item.uid
await self.coordinator.cookidoo.edit_ingredient_items_ownership(
[
CookidooIngredientItem(
id=item.uid,
name="",
description="",
is_owned=item.status == TodoItemStatus.COMPLETED,
)
]
)
except CookidooException as e:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="todo_update_item_failed",
translation_placeholders={"name": item.summary or ""},
) from e
await self.coordinator.async_refresh()
class CookidooAdditionalItemTodoListEntity(CookidooBaseEntity, TodoListEntity):
"""A To-do List representation of the additional items in the Cookidoo Shopping List."""
_attr_translation_key = "additional_item_list"
_attr_supported_features = (
TodoListEntityFeature.CREATE_TODO_ITEM
| TodoListEntityFeature.UPDATE_TODO_ITEM
| TodoListEntityFeature.DELETE_TODO_ITEM
)
def __init__(self, coordinator: CookidooDataUpdateCoordinator) -> None:
"""Initialize the entity."""
super().__init__(coordinator)
self._attr_unique_id = f"{coordinator.config_entry.entry_id}_additional_items"
@property
def todo_items(self) -> list[TodoItem]:
"""Return the todo items."""
return [
TodoItem(
uid=item.id,
summary=item.name,
status=(
TodoItemStatus.COMPLETED
if item.is_owned
else TodoItemStatus.NEEDS_ACTION
),
)
for item in self.coordinator.data.additional_items
]
async def async_create_todo_item(self, item: TodoItem) -> None:
"""Add an item to the To-do list."""
try:
if TYPE_CHECKING:
assert item.summary
await self.coordinator.cookidoo.add_additional_items([item.summary])
except CookidooException as e:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="todo_save_item_failed",
translation_placeholders={"name": item.summary or ""},
) from e
await self.coordinator.async_refresh()
async def async_update_todo_item(self, item: TodoItem) -> None:
"""Update an item to the To-do list."""
try:
if TYPE_CHECKING:
assert item.uid
assert item.summary
new_item = CookidooAdditionalItem(
id=item.uid,
name=item.summary,
is_owned=item.status == TodoItemStatus.COMPLETED,
)
await self.coordinator.cookidoo.edit_additional_items_ownership([new_item])
await self.coordinator.cookidoo.edit_additional_items([new_item])
except CookidooException as e:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="todo_update_item_failed",
translation_placeholders={"name": item.summary or ""},
) from e
await self.coordinator.async_refresh()
async def async_delete_todo_items(self, uids: list[str]) -> None:
"""Delete an item from the To-do list."""
try:
await self.coordinator.cookidoo.remove_additional_items(uids)
except CookidooException as e:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="todo_delete_item_failed",
translation_placeholders={"count": str(len(uids))},
) from e
await self.coordinator.async_refresh()
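Both entities translate Cookidoo's `is_owned` flag into a todo status when reading, and back again when pushing edits. A standalone sketch of that two-way mapping (the enum values mirror `TodoItemStatus`; the helper names are illustrative, not part of the integration):

```python
from enum import Enum


class Status(Enum):
    """Mirror of the two todo states used by the platform above."""

    NEEDS_ACTION = "needs_action"
    COMPLETED = "completed"


def status_from_owned(is_owned: bool) -> Status:
    """Owned (already purchased) items show as completed."""
    return Status.COMPLETED if is_owned else Status.NEEDS_ACTION


def owned_from_status(status: Status) -> bool:
    """Inverse mapping applied before writing back to Cookidoo."""
    return status is Status.COMPLETED
```

Keeping the mapping symmetric means a round trip through Home Assistant never flips an item's purchased state on its own.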
@@ -49,7 +49,7 @@ class BaseCrownstoneFlowHandler(ConfigEntryBaseFlow):
cloud: CrownstoneCloud
def __init__(
self, flow_type: str, create_entry_cb: Callable[..., ConfigFlowResult]
self, flow_type: str, create_entry_cb: Callable[[], ConfigFlowResult]
) -> None:
"""Set up flow instance."""
self.flow_type = flow_type
@@ -6,6 +6,6 @@
"documentation": "https://www.home-assistant.io/integrations/daikin",
"iot_class": "local_polling",
"loggers": ["pydaikin"],
"requirements": ["pydaikin==2.13.7"],
"requirements": ["pydaikin==2.13.8"],
"zeroconf": ["_dkapi._tcp.local."]
}
@@ -6,5 +6,5 @@
"integration_type": "service",
"iot_class": "local_push",
"quality_scale": "internal",
"requirements": ["debugpy==1.8.8"]
"requirements": ["debugpy==1.8.11"]
}
@@ -18,6 +18,8 @@ from homeassistant.components.light import (
ATTR_HS_COLOR,
ATTR_TRANSITION,
ATTR_XY_COLOR,
DEFAULT_MAX_KELVIN,
DEFAULT_MIN_KELVIN,
DOMAIN as LIGHT_DOMAIN,
EFFECT_COLORLOOP,
FLASH_LONG,
@@ -191,6 +193,8 @@ class DeconzBaseLight[_LightDeviceT: Group | Light](
TYPE = LIGHT_DOMAIN
_attr_color_mode = ColorMode.UNKNOWN
_attr_min_color_temp_kelvin = DEFAULT_MIN_KELVIN
_attr_max_color_temp_kelvin = DEFAULT_MAX_KELVIN
def __init__(self, device: _LightDeviceT, hub: DeconzHub) -> None:
"""Set up light."""
@@ -7,12 +7,14 @@ from typing import Any
from homeassistant.components.light import (
ATTR_BRIGHTNESS,
ATTR_COLOR_TEMP,
ATTR_COLOR_TEMP_KELVIN,
ATTR_EFFECT,
ATTR_HS_COLOR,
ATTR_RGBW_COLOR,
ATTR_RGBWW_COLOR,
ATTR_WHITE,
DEFAULT_MAX_KELVIN,
DEFAULT_MIN_KELVIN,
ColorMode,
LightEntity,
LightEntityFeature,
@@ -28,7 +30,7 @@ LIGHT_COLORS = [(56, 86), (345, 75)]
LIGHT_EFFECT_LIST = ["rainbow", "none"]
LIGHT_TEMPS = [240, 380]
LIGHT_TEMPS = [4166, 2631]
SUPPORT_DEMO = {ColorMode.HS, ColorMode.COLOR_TEMP}
SUPPORT_DEMO_HS_WHITE = {ColorMode.HS, ColorMode.WHITE}
@@ -100,6 +102,9 @@ class DemoLight(LightEntity):
_attr_name = None
_attr_should_poll = False
_attr_max_color_temp_kelvin = DEFAULT_MAX_KELVIN
_attr_min_color_temp_kelvin = DEFAULT_MIN_KELVIN
def __init__(
self,
unique_id: str,
@@ -185,8 +190,8 @@ class DemoLight(LightEntity):
return self._rgbww_color
@property
def color_temp(self) -> int:
"""Return the CT color temperature."""
def color_temp_kelvin(self) -> int | None:
"""Return the color temperature value in Kelvin."""
return self._ct
@property
@@ -216,9 +221,9 @@ class DemoLight(LightEntity):
if ATTR_BRIGHTNESS in kwargs:
self._brightness = kwargs[ATTR_BRIGHTNESS]
if ATTR_COLOR_TEMP in kwargs:
if ATTR_COLOR_TEMP_KELVIN in kwargs:
self._color_mode = ColorMode.COLOR_TEMP
self._ct = kwargs[ATTR_COLOR_TEMP]
self._ct = kwargs[ATTR_COLOR_TEMP_KELVIN]
if ATTR_EFFECT in kwargs:
self._effect = kwargs[ATTR_EFFECT]
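The demo light's color temperatures migrate from mireds to kelvin. The two scales are reciprocal, kelvin = 1,000,000 / mired, which is how the old values (240, 380) become (4166, 2631) after integer truncation. A quick standalone check of that conversion (Home Assistant ships its own helper for this in `homeassistant.util.color`; this sketch just verifies the constants):

```python
def mired_to_kelvin(mired: float) -> int:
    """Convert mireds to kelvin; truncation matches the updated constants."""
    return int(1_000_000 / mired)
```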
@@ -81,14 +81,8 @@ class DevoloBinaryDeviceEntity(DevoloDeviceEntity, BinarySensorEntity):
or self._binary_sensor_property.sensor_type
)
if device_instance.binary_sensor_property[element_uid].sub_type != "":
self._attr_name = device_instance.binary_sensor_property[
element_uid
].sub_type.capitalize()
else:
self._attr_name = device_instance.binary_sensor_property[
element_uid
].sensor_type.capitalize()
if device_instance.binary_sensor_property[element_uid].sub_type == "overload":
self._attr_translation_key = "overload"
self._value = self._binary_sensor_property.state
@@ -129,7 +123,8 @@ class DevoloRemoteControl(DevoloDeviceEntity, BinarySensorEntity):
self._key = key
self._attr_is_on = False
self._attr_name = f"Button {key}"
self._attr_translation_key = "button"
self._attr_translation_placeholders = {"key": str(key)}
def _sync(self, message: tuple) -> None:
"""Update the binary sensor state."""
@@ -116,9 +116,11 @@ class DevoloGenericMultiLevelDeviceEntity(DevoloMultiLevelDeviceEntity):
self._multi_level_sensor_property.sensor_type
)
self._attr_native_unit_of_measurement = self._multi_level_sensor_property.unit
self._attr_name = self._multi_level_sensor_property.sensor_type.capitalize()
self._value = self._multi_level_sensor_property.value
if self._multi_level_sensor_property.sensor_type == "light":
self._attr_translation_key = "brightness"
if element_uid.startswith("devolo.VoltageMultiLevelSensor:"):
self._attr_entity_registry_enabled_default = False
@@ -128,7 +130,6 @@ class DevoloBatteryEntity(DevoloMultiLevelDeviceEntity):
_attr_entity_category = EntityCategory.DIAGNOSTIC
_attr_native_unit_of_measurement = PERCENTAGE
_attr_name = "Battery level"
_attr_device_class = SensorDeviceClass.BATTERY
_attr_state_class = SensorStateClass.MEASUREMENT
@@ -175,8 +176,6 @@ class DevoloConsumptionEntity(DevoloMultiLevelDeviceEntity):
device_instance.consumption_property[element_uid], consumption
)
self._attr_name = f"{consumption.capitalize()} consumption"
@property
def unique_id(self) -> str:
"""Return the unique ID of the entity.
