Compare commits


95 Commits

Author SHA1 Message Date
Paulus Schoutsen dc1c2f24e6 Bump version to 2025.2.0b6 2025-02-02 02:06:10 +00:00
Robert Resch 78dcf8b18e Bump deebot-client to 12.0.0b0 (#137137) 2025-02-02 02:06:07 +00:00
J. Nick Koston 613168fd62 Add missing brackets to ESPHome configuration URLs with IPv6 addresses (#137132)
fixes #137125
2025-02-02 02:06:06 +00:00
J. Nick Koston 5f28e95bdc Bump habluetooth to 3.21.0 (#137129) 2025-02-02 02:06:05 +00:00
Allen Porter 1db5da4037 Remove entity state from mcp-server prompt (#137126)
* Create a stateless assist API for MCP server

* Update stateless API

* Fix areas in exposed entity fields

* Add tests that verify areas are returned

* Revert the getstate intent

* Revert whitespace change

* Revert whitespace change

* Revert method name changes to avoid breaking openai and google tests
2025-02-02 02:06:05 +00:00
Alex Thompson 6bf5e95089 Allow ignored tilt_ble devices to be set up from user flow (#137123)
Co-authored-by: J. Nick Koston <nick@koston.org>
2025-02-02 02:06:04 +00:00
Shay Levy 1ea23fda10 Allow ignored Aranet devices to be set up from the user flow (#137121) 2025-02-02 02:06:03 +00:00
J. Nick Koston 21a85c014a Allow ignored xiaomi_ble devices to be set up from the user flow (#137115) 2025-02-02 02:06:03 +00:00
J. Nick Koston 4c8f716320 Allow ignored sensorpush devices to be set up from the user flow (#137113)
Every few days we get an issue report about a device a user ignored and forgot about, and then can no longer be set up. Sometimes it's a govee device, sometimes it's a switchbot device, but the pattern is consistent.

Allow ignored devices to be selected in the user step and replace the ignored entry.

Same as #137056 and #137052 but for sensorpush
2025-02-02 02:06:02 +00:00
J. Nick Koston 63bd67f6cd Allow ignored qingping devices to be set up from the user flow (#137111)
Every few days we get an issue report about a device a user ignored and forgot about, and then can no longer be set up. Sometimes it's a govee device, sometimes it's a switchbot device, but the pattern is consistent.

Allow ignored devices to be selected in the user step and replace the ignored entry.

Same as #137056 and #137052 but for qingping
2025-02-02 02:06:01 +00:00
Assaf Inbal 73b874c5e6 Fix Homekit camera profiles schema (#137110) 2025-02-02 02:06:00 +00:00
J. Nick Koston 3b67dc3651 Allow ignored oralb devices to be set up from the user flow (#137109)
Every few days we get an issue report about a device a user ignored and forgot about, and then can no longer be set up. Sometimes it's a govee device, sometimes it's a switchbot device, but the pattern is consistent.

Allow ignored devices to be selected in the user step and replace the ignored entry.

Same as #137056 and #137052 but for oralb
2025-02-02 02:06:00 +00:00
J. Nick Koston 434a4ebc9f Allow ignored mopeka devices to be set up from the user flow (#137107)
Every few days we get an issue report about a device a user ignored and forgot about, and then can no longer be set up. Sometimes it's a govee device, sometimes it's a switchbot device, but the pattern is consistent.

Allow ignored devices to be selected in the user step and replace the ignored entry.

Same as #137056 and #137052 but for mopeka
2025-02-02 02:05:59 +00:00
J. Nick Koston cb4b7e71af Allow ignored inkbird devices to be set up from the user flow (#137106)
Every few days we get an issue report about a device a user ignored and forgot about, and then can no longer be set up. Sometimes it's a govee device, sometimes it's a switchbot device, but the pattern is consistent.

Allow ignored devices to be selected in the user step and replace the ignored entry.

Same as #137056 and #137052 but for inkbird
2025-02-02 02:05:58 +00:00
J. Nick Koston 4c6fda2096 Allow ignored bthome devices to be set up from the user flow (#137105) 2025-02-02 02:05:58 +00:00
J. Nick Koston 9b5c21524c Allow ignored thermopro devices to be set up from the user flow (#137104)
Every few days we get an issue report about a device a user ignored and forgot about, and then can no longer be set up. Sometimes it's a govee device, sometimes it's a switchbot device, but the pattern is consistent.

Allow ignored devices to be selected in the user step and replace the ignored entry.

Same as #137056 and #137052 but for thermopro
2025-02-02 02:05:57 +00:00
J. Nick Koston 76937541f1 Allow ignored yale_ble devices to be set up from the user flow (#137103)
Every few days we get an issue report about a device a user ignored and forgot about, and then can no longer be set up. Sometimes it's a govee device, sometimes it's a switchbot device, but the pattern is consistent.

Allow ignored devices to be selected in the user step and replace the ignored entry.

Same as #137056 and #137052 but for yalexs_ble
2025-02-02 02:05:56 +00:00
J. Nick Koston bad966f3ab Allow ignored airthings_ble devices to be set up from the user flow (#137102)
Every few days we get an issue report about a device a user ignored and forgot about, and then can no longer be set up. Sometimes it's a govee device, sometimes it's a switchbot device, but the pattern is consistent.

Allow ignored devices to be selected in the user step and replace the ignored entry.

Same as #137056 and #137052 but for airthings
2025-02-02 02:05:55 +00:00
J. Nick Koston 2d1d9bbe5a Set via_device for remote Bluetooth adapters to link to the parent device (#137091) 2025-02-02 02:05:55 +00:00
Marc Mueller e76ff0a0de Update RestrictedPython to 8.0 (#137075) 2025-02-02 02:05:54 +00:00
IceBotYT fa8d1b4dc4 Bump lacrosse-view to 1.0.4 (#137058) 2025-02-02 02:05:53 +00:00
Paulus Schoutsen b3c44ca03a Bump version to 2025.2.0b5 2025-02-01 13:58:56 +00:00
Jan-Philipp Benecke 6efa6f9687 Load hassio before backup at frontend stage (#137067) 2025-02-01 13:58:53 +00:00
J. Nick Koston 3588b88cbb Bump habluetooth to 3.20.1 (#137063) 2025-02-01 13:58:52 +00:00
tronikos a51846a8cd For consistency use suggested_filename in Google Drive (#137061)
Use suggested_filename in Google Drive
2025-02-01 13:58:52 +00:00
J. Nick Koston ec22479733 Allow ignored switchbot devices to be set up from the user flow (#137056) 2025-02-01 13:58:51 +00:00
J. Nick Koston 3a11e8df6a Allow ignored govee-ble devices to be set up from the user flow (#137052)
* Allow ignored govee-ble devices to be set up from the user flow

Every few days we get an issue report about a device
a user ignored and forgot about, and then can no longer
be set up. Allow ignored devices to be selected in
the user step and replace the ignored entry.

* Add the ability to skip ignored config entries when calling _abort_if_unique_id_configured

see https://github.com/home-assistant/core/pull/137052

* coverage

* revert
2025-02-01 13:58:50 +00:00
Nathan Spencer a4eab35e01 Raise HomeAssistantError from camera snapshot service (#137051)
* Raise HomeAssistantError from camera snapshot service

* Improve error message

---------

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2025-02-01 13:58:50 +00:00
Paulus Schoutsen 829a6271af Bump version to 2025.2.0b4 2025-02-01 01:04:55 +00:00
Jan Bouwhuis 9935528dd3 Bump aioimaplib to version 2.0.1 (#137049) 2025-02-01 01:04:48 +00:00
J. Nick Koston df35d226d6 Bump habluetooth to 3.17.1 (#137045) 2025-02-01 01:04:47 +00:00
J. Nick Koston 2b510caa1c Bump aiohttp-asyncmdnsresolver to 0.0.3 (#137040) 2025-02-01 01:04:47 +00:00
J. Nick Koston 90c357c01f Bump bthome-ble to 3.12.3 (#137036) 2025-02-01 01:04:46 +00:00
J. Nick Koston 321ce698be Bump zeroconf to 0.143.0 (#137035) 2025-02-01 01:04:45 +00:00
Ernst Klamer ea519268b6 Bump bthome-ble to 3.11.0 (#137032)
bump bthome-ble to 3.11.0
2025-02-01 01:04:45 +00:00
Josef Zweck 4687b2e455 Use readable backup names for onedrive (#137031)
* Use readable names for onedrive

* ensure filename is fixed

* fix import
2025-02-01 01:04:44 +00:00
Joost Lekkerkerker bbb03d6731 Update Overseerr string to mention CSRF (#137001)
* Update Overseerr string to mention CSRF

* Update homeassistant/components/overseerr/strings.json

* Update homeassistant/components/overseerr/strings.json

---------

Co-authored-by: Paulus Schoutsen <paulus@home-assistant.io>
2025-02-01 01:04:43 +00:00
Jan Bouwhuis b9884f72c3 Shorten the integration name for incomfort (#136930) 2025-02-01 01:04:42 +00:00
Paulus Schoutsen e1105ef2fa Bump version to 2025.2.0b3 2025-01-31 19:25:16 +00:00
Robert Resch 5450ed8445 Bump deebot-client to 11.1.0b2 (#137030) 2025-01-31 19:24:42 +00:00
J. Nick Koston 7deb1715dd Bump SQLAlchemy to 2.0.37 (#137028)
changelog: https://docs.sqlalchemy.org/en/20/changelog/changelog_20.html#change-2.0.37

There is a bug fix that likely affects us and could lead to corrupted queries:
https://docs.sqlalchemy.org/en/20/changelog/changelog_20.html#change-e4d04d8eb1bccee16b74f5662aff8edd
2025-01-31 19:24:41 +00:00
J. Nick Koston ca2a555037 Bump bleak-esphome to 2.6.0 (#137025) 2025-01-31 19:24:41 +00:00
Bram Kragten ae79b09401 Update frontend to 20250131.0 (#137024) 2025-01-31 19:24:40 +00:00
J. Nick Koston e86a633c23 Bump habluetooth to 3.17.0 (#137022) 2025-01-31 19:24:39 +00:00
Erik Montnemery b412164440 Make supervisor backup file names more user friendly (#137020) 2025-01-31 19:24:39 +00:00
Norbert Rittel 4fe76ec78c Revert previous PR and remove URL from error message instead (#137018) 2025-01-31 19:24:38 +00:00
Erik Montnemery f4166c5390 Make sure we load the backup integration before frontend (#137010) 2025-01-31 19:24:37 +00:00
Cyrill Raccaud 3107b81333 Remove the unparsed config flow error from Swiss public transport (#136998) 2025-01-31 19:24:36 +00:00
Joost Lekkerkerker 07b85163d5 Use device name as entity name in Eheim digital climate (#136997) 2025-01-31 19:24:35 +00:00
Duco Sebel c28d465f3b Bump python-homewizard-energy to 8.3.2 (#136995) 2025-01-31 19:24:34 +00:00
Josef Zweck 00298db465 Call backup listener during setup in onedrive (#136990) 2025-01-31 19:24:34 +00:00
Cyrill Raccaud 6bab5b2c32 Fix missing duration translation for Swiss public transport integration (#136982) 2025-01-31 19:24:33 +00:00
Josef Zweck 0272d37e88 Retry backup uploads in onedrive (#136980)
* Retry backup uploads in onedrive

* no exponential backoff on timeout
2025-01-31 19:24:32 +00:00
Erik Montnemery 26ae498974 Delete old addon update backups when updating addon (#136977)
* Delete old addon update backups when updating addon

* Address review comments

* Add tests
2025-01-31 19:24:31 +00:00
J. Nick Koston c77bca1e44 Bump habluetooth to 3.15.0 (#136973) 2025-01-31 19:24:30 +00:00
Michael Hansen ad86f9efd5 Consume extra system prompt in first pipeline (#136958) 2025-01-31 19:24:30 +00:00
Matthias Alphart 71a40d9234 Update knx-frontend to 2025.1.30.194235 (#136954) 2025-01-31 19:24:29 +00:00
J. Nick Koston eb344ba335 Bump aiohttp-asyncmdnsresolver to 0.0.2 (#136942) 2025-01-31 19:22:11 +00:00
J. Nick Koston eca30717a9 Bump zeroconf to 0.142.0 (#136940)
changelog: https://github.com/python-zeroconf/python-zeroconf/compare/0.141.0...0.142.0
2025-01-31 19:22:10 +00:00
Erik Montnemery 6e55ba137a Make backup file names more user friendly (#136928)
* Make backup file names more user friendly

* Strip backup name

* Strip backup name

* Underscores
2025-01-31 19:22:10 +00:00
tronikos a391f0a7cc Bump opower to 0.8.9 (#136911)
* Bump opower to 0.8.9

* mypy
2025-01-31 19:22:09 +00:00
tronikos c9fd27555c Include the redirect URL in the Google Drive instructions (#136906)
* Include the redirect URL in the Google Drive instructions

* Apply suggestions from code review

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>

---------

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2025-01-31 19:22:08 +00:00
Allen Porter 9cd48dd452 Persist roborock maps to disk only on shutdown (#136889)
* Persist roborock maps to disk only on shutdown

* Rename on_unload to on_stop

* Spawn 1 executor thread and block writes to disk

* Update tests/components/roborock/test_image.py

Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>

* Use config entry setup instead of component setup

---------

Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2025-01-31 19:22:07 +00:00
Avi Miller a74328e600 Suppress color_temp warning if color_temp_kelvin is provided (#136884) 2025-01-31 19:22:06 +00:00
Jan Stienstra 5cec045cac Bump jellyfin-apiclient-python to 1.10.0 (#136872) 2025-01-31 19:22:06 +00:00
Norbert Rittel 04a7c6f15e Fixes to the user-facing strings of energenie_power_sockets (#136844) 2025-01-31 19:22:05 +00:00
Austin Mroczek 833b17a8ee Bump total-connect-client to 2025.1.4 (#136793) 2025-01-31 19:22:04 +00:00
Sid a955901d40 Refactor eheimdigital platform async_setup_entry (#136745) 2025-01-31 19:22:03 +00:00
starkillerOG 9a55b5e3f7 Ensure Reolink can start when privacy mode is enabled (#136514)
* Allow startup when privacy mode is enabled

* Add tests

* remove duplicate privacy_mode

* fix tests

* Apply suggestions from code review

Co-authored-by: Robert Resch <robert@resch.dev>

* Store in subfolder and cleanup when removed

* Add tests and fixes

* fix styling

* rename CONF_PRIVACY to CONF_SUPPORTS_PRIVACY_MODE

* use helper store

---------

Co-authored-by: Robert Resch <robert@resch.dev>
2025-01-31 19:22:03 +00:00
Bram Kragten 3847057444 Bump version to 2025.2.0b2 2025-01-30 19:28:55 +01:00
Bram Kragten 659a0df9ab Update frontend to 20250130.0 (#136937) 2025-01-30 19:21:55 +01:00
Maciej Bieniek 74f0af1ba1 Fix KeyError for Shelly virtual number component (#136932) 2025-01-30 19:21:54 +01:00
Michael ad6c3f9e10 Fix backup related translations in Synology DSM (#136931)
reference backup-related strings in options-flow strings
2025-01-30 19:21:53 +01:00
Josef Zweck 252b13e63a Pick onedrive owner from a more reliable source (#136929)
* Pick onedrive owner from a more reliable source

* fix
2025-01-30 19:21:52 +01:00
Joost Lekkerkerker 07acabdb36 Create Xbox signed session in executor (#136927) 2025-01-30 19:21:51 +01:00
Joost Lekkerkerker f479ed4ff0 Fix Sonos importing deprecating constant (#136926) 2025-01-30 19:21:51 +01:00
Joost Lekkerkerker b70598673b Show name of the backup agents in issue (#136925)
* Show name of the backup agents in issue

* Show name of the backup agents in issue

* Update homeassistant/components/backup/manager.py

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>

---------

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2025-01-30 19:21:50 +01:00
tronikos 08bb027eac Don't log errors when raising a backup exception in Google Drive (#136916) 2025-01-30 19:21:49 +01:00
Maciej Bieniek 613f0add76 Convert valve position to int for Shelly BLU TRV (#136912) 2025-01-30 19:21:48 +01:00
Josef Zweck 9e23ff9a4d Fix onedrive does not fail on delete not found (#136910)
* Fix onedrive does not fail on delete not found

* Fix onedrive does not fail on delete not found
2025-01-30 19:21:47 +01:00
Erik Montnemery fad3d5d293 Don't blow up when a backup doesn't exist on supervisor (#136907) 2025-01-30 19:21:46 +01:00
Erik Montnemery b300fb1fab Fix handling of renamed backup files in the core writer (#136898)
* Fix handling of renamed backup files in the core writer

* Adjust mocking

* Raise BackupAgentError instead of KeyError in get_backup_path

* Add specific error indicating backup not found

* Fix tests

* Ensure backups are loaded

* Fix tests
2025-01-30 19:21:46 +01:00
Erik Montnemery aed779172d Ignore dangling symlinks when restoring backup (#136893) 2025-01-30 19:21:45 +01:00
epenet 5e646a3cb6 Add missing discovery string from onewire (#136892) 2025-01-30 19:21:44 +01:00
Erik Montnemery 0764aca2f1 Poll supervisor job state when creating or restoring a backup (#136891)
* Poll supervisor job state when creating or restoring a backup

* Update tests

* Add tests for create and restore jobs finishing early
2025-01-30 19:21:43 +01:00
Allen Porter 8babdc0b71 Bump nest to 7.1.1 (#136888) 2025-01-30 19:21:43 +01:00
TheJulianJES ff64e5a312 Bump ZHA to 0.0.47 (#136883) 2025-01-30 19:21:42 +01:00
TimL 55ac0b0f37 Fix loading of SMLIGHT integration when no internet is available (#136497)
* Don't fail to load integration if internet unavailable

* Add test case for no internet

* Also test we recover after internet returns
2025-01-30 19:21:41 +01:00
Paulus Schoutsen f391438d0a Add start_conversation service to Assist Satellite (#134921)
* Add start_conversation service to Assist Satellite

* Fix tests

* Implement start_conversation in voip

* Update homeassistant/components/assist_satellite/entity.py

---------

Co-authored-by: Michael Hansen <mike@rhasspy.org>
2025-01-30 19:21:39 +01:00
Paulus Schoutsen 9c8d31a3d5 Bump version to 2025.2.0b1 2025-01-29 21:18:11 +00:00
Erik Montnemery 49b90fc140 Bump backup store to version 1.3 (#136870)
Co-authored-by: Paulus Schoutsen <balloob@gmail.com>
2025-01-29 21:17:58 +00:00
J. Nick Koston 9c0fa327a6 Fix incorrect Bluetooth source address when restoring data from D-Bus (#136862) 2025-01-29 21:17:58 +00:00
Abílio Costa 0f97747d27 Handle locked account error in Whirlpool (#136861) 2025-01-29 21:17:57 +00:00
Michael Hansen d338b0a2ff Cancel call if user does not pick up (#136858) 2025-01-29 21:17:56 +00:00
Erik Montnemery 6247a847bf Persist hassio backup restore status after core restart (#136857)
* Persist hassio backup restore status after core restart

* Remove useless condition
2025-01-29 21:17:56 +00:00
181 changed files with 3422 additions and 755 deletions
+1
@@ -146,6 +146,7 @@ def _extract_backup(
config_dir,
dirs_exist_ok=True,
ignore=shutil.ignore_patterns(*(keep)),
ignore_dangling_symlinks=True,
)
elif restore_content.restore_database:
for entry in KEEP_DATABASE:
+10
@@ -161,6 +161,16 @@ FRONTEND_INTEGRATIONS = {
# integrations can be removed and database migration status is
# visible in frontend
"frontend",
# Hassio is an after dependency of backup, after dependencies
# are not promoted from stage 2 to earlier stages, so we need to
# add it here. Hassio needs to be setup before backup, otherwise
# the backup integration will think we are a container/core install
# when using HAOS or Supervised install.
"hassio",
# Backup is an after dependency of frontend, after dependencies
# are not promoted from stage 2 to earlier stages, so we need to
# add it here.
"backup",
}
RECORDER_INTEGRATIONS = {
# Setup after frontend
@@ -144,7 +144,7 @@ class AirthingsConfigFlow(ConfigFlow, domain=DOMAIN):
return self.async_create_entry(title=discovery.name, data={})
current_addresses = self._async_current_ids()
current_addresses = self._async_current_ids(include_ignore=False)
for discovery_info in async_discovered_service_info(self.hass):
address = discovery_info.address
if address in current_addresses or address in self._discovered_devices:
@@ -92,7 +92,7 @@ class AranetConfigFlow(ConfigFlow, domain=DOMAIN):
title=self._discovered_devices[address][0], data={}
)
current_addresses = self._async_current_ids()
current_addresses = self._async_current_ids(include_ignore=False)
for discovery_info in async_discovered_service_info(self.hass, False):
address = discovery_info.address
if address in current_addresses or address in self._discovered_devices:
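The recurring change in these Bluetooth config flows is shown in the two hunks above: passing include_ignore=False to _async_current_ids() so that addresses belonging to ignored entries are offered again in the user step. A minimal sketch of the pattern, assuming the usual Home Assistant Bluetooth config-flow layout (the integration domain, stored fields, and form handling here are illustrative only, not taken from the diff):

```python
from typing import Any

from homeassistant.components.bluetooth import async_discovered_service_info
from homeassistant.config_entries import ConfigFlow


class ExampleBLEConfigFlow(ConfigFlow, domain="example_ble"):
    """Sketch: a user step that re-offers previously ignored devices."""

    def __init__(self) -> None:
        self._discovered_devices: dict[str, str] = {}

    async def async_step_user(self, user_input: dict[str, Any] | None = None):
        # include_ignore=False means ignored entries no longer mask the address,
        # so the device shows up in the pick list again.
        current_addresses = self._async_current_ids(include_ignore=False)
        for discovery_info in async_discovered_service_info(self.hass):
            address = discovery_info.address
            if address in current_addresses or address in self._discovered_devices:
                continue
            self._discovered_devices[address] = discovery_info.name
        # Creating the entry then replaces the ignored entry (see the
        # _abort_if_unique_id_configured change referenced in #137052).
        return self.async_show_form(step_id="user")
```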
@@ -1122,6 +1122,7 @@ class PipelineRun:
context=user_input.context,
language=user_input.language,
agent_id=user_input.agent_id,
extra_system_prompt=user_input.extra_system_prompt,
)
speech = conversation_result.response.speech.get("plain", {}).get(
"speech", ""
@@ -63,6 +63,21 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"async_internal_announce",
[AssistSatelliteEntityFeature.ANNOUNCE],
)
component.async_register_entity_service(
"start_conversation",
vol.All(
cv.make_entity_service_schema(
{
vol.Optional("start_message"): str,
vol.Optional("start_media_id"): str,
vol.Optional("extra_system_prompt"): str,
}
),
cv.has_at_least_one_key("start_message", "start_media_id"),
),
"async_internal_start_conversation",
[AssistSatelliteEntityFeature.START_CONVERSATION],
)
hass.data[CONNECTION_TEST_DATA] = {}
async_register_websocket_api(hass)
hass.http.register_view(ConnectionTestView())
@@ -26,3 +26,6 @@ class AssistSatelliteEntityFeature(IntFlag):
ANNOUNCE = 1
"""Device supports remotely triggered announcements."""
START_CONVERSATION = 2
"""Device supports starting conversations."""
@@ -10,7 +10,7 @@ import logging
import time
from typing import Any, Final, Literal, final
from homeassistant.components import media_source, stt, tts
from homeassistant.components import conversation, media_source, stt, tts
from homeassistant.components.assist_pipeline import (
OPTION_PREFERRED,
AudioSettings,
@@ -27,6 +27,7 @@ from homeassistant.components.tts import (
generate_media_source_id as tts_generate_media_source_id,
)
from homeassistant.core import Context, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import entity
from homeassistant.helpers.entity import EntityDescription
@@ -117,6 +118,7 @@ class AssistSatelliteEntity(entity.Entity):
_run_has_tts: bool = False
_is_announcing = False
_extra_system_prompt: str | None = None
_wake_word_intercept_future: asyncio.Future[str | None] | None = None
_attr_tts_options: dict[str, Any] | None = None
_pipeline_task: asyncio.Task | None = None
@@ -216,6 +218,59 @@ class AssistSatelliteEntity(entity.Entity):
"""
raise NotImplementedError
async def async_internal_start_conversation(
self,
start_message: str | None = None,
start_media_id: str | None = None,
extra_system_prompt: str | None = None,
) -> None:
"""Start a conversation from the satellite.
If start_media_id is not provided, message is synthesized to
audio with the selected pipeline.
If start_media_id is provided, it is played directly. It is possible
to omit the message and the satellite will not show any text.
Calls async_start_conversation.
"""
await self._cancel_running_pipeline()
# The Home Assistant built-in agent doesn't support conversations.
pipeline = async_get_pipeline(self.hass, self._resolve_pipeline())
if pipeline.conversation_engine == conversation.HOME_ASSISTANT_AGENT:
raise HomeAssistantError(
"Built-in conversation agent does not support starting conversations"
)
if start_message is None:
start_message = ""
announcement = await self._resolve_announcement_media_id(
start_message, start_media_id
)
if self._is_announcing:
raise SatelliteBusyError
self._is_announcing = True
# Provide our start info to the LLM so it understands context of incoming message
if extra_system_prompt is not None:
self._extra_system_prompt = extra_system_prompt
else:
self._extra_system_prompt = start_message or None
try:
await self.async_start_conversation(announcement)
finally:
self._is_announcing = False
async def async_start_conversation(
self, start_announcement: AssistSatelliteAnnouncement
) -> None:
"""Start a conversation from the satellite."""
raise NotImplementedError
async def async_accept_pipeline_from_satellite(
self,
audio_stream: AsyncIterable[bytes],
@@ -226,6 +281,10 @@ class AssistSatelliteEntity(entity.Entity):
"""Triggers an Assist pipeline in Home Assistant from a satellite."""
await self._cancel_running_pipeline()
# Consume system prompt in first pipeline
extra_system_prompt = self._extra_system_prompt
self._extra_system_prompt = None
if self._wake_word_intercept_future and start_stage in (
PipelineStage.WAKE_WORD,
PipelineStage.STT,
@@ -302,6 +361,7 @@ class AssistSatelliteEntity(entity.Entity):
),
start_stage=start_stage,
end_stage=end_stage,
conversation_extra_system_prompt=extra_system_prompt,
),
f"{self.entity_id}_pipeline",
)
@@ -7,6 +7,9 @@
"services": {
"announce": {
"service": "mdi:bullhorn"
},
"start_conversation": {
"service": "mdi:forum"
}
}
}
@@ -14,3 +14,23 @@ announce:
required: false
selector:
text:
start_conversation:
target:
entity:
domain: assist_satellite
supported_features:
- assist_satellite.AssistSatelliteEntityFeature.START_CONVERSATION
fields:
start_message:
required: false
example: "You left the lights on in the living room. Turn them off?"
selector:
text:
start_media_id:
required: false
selector:
text:
extra_system_prompt:
required: false
selector:
text:
@@ -25,6 +25,24 @@
"description": "The media ID to announce instead of using text-to-speech."
}
}
},
"start_conversation": {
"name": "Start Conversation",
"description": "Start a conversation from a satellite.",
"fields": {
"start_message": {
"name": "Message",
"description": "The message to start with."
},
"start_media_id": {
"name": "Media ID",
"description": "The media ID to start with instead of using text-to-speech."
},
"extra_system_prompt": {
"name": "Extra system prompt",
"description": "Provide background information to the AI about the request."
}
}
}
}
}
@@ -31,9 +31,11 @@ from .manager import (
ManagerBackup,
NewBackup,
RestoreBackupEvent,
RestoreBackupState,
WrittenBackup,
)
from .models import AddonInfo, AgentBackup, Folder
from .util import suggested_filename, suggested_filename_from_name_date
from .websocket import async_register_websocket_handlers
__all__ = [
@@ -54,8 +56,11 @@ __all__ = [
"ManagerBackup",
"NewBackup",
"RestoreBackupEvent",
"RestoreBackupState",
"WrittenBackup",
"async_get_manager",
"suggested_filename",
"suggested_filename_from_name_date",
]
CONFIG_SCHEMA = cv.empty_config_schema(DOMAIN)
+12 -1
@@ -27,6 +27,12 @@ class BackupAgentUnreachableError(BackupAgentError):
_message = "The backup agent is unreachable."
class BackupNotFound(BackupAgentError):
"""Raised when a backup is not found."""
error_code = "backup_not_found"
class BackupAgent(abc.ABC):
"""Backup agent interface."""
@@ -94,11 +100,16 @@ class LocalBackupAgent(BackupAgent):
@abc.abstractmethod
def get_backup_path(self, backup_id: str) -> Path:
"""Return the local path to a backup.
"""Return the local path to an existing backup.
The method should return the path to the backup file with the specified id.
Raises BackupAgentError if the backup does not exist.
"""
@abc.abstractmethod
def get_new_backup_path(self, backup: AgentBackup) -> Path:
"""Return the local path to a new backup."""
class BackupAgentPlatformProtocol(Protocol):
"""Define the format of backup platforms which implement backup agents."""
+28 -15
@@ -11,10 +11,10 @@ from typing import Any
from homeassistant.core import HomeAssistant
from homeassistant.helpers.hassio import is_hassio
from .agent import BackupAgent, LocalBackupAgent
from .agent import BackupAgent, BackupNotFound, LocalBackupAgent
from .const import DOMAIN, LOGGER
from .models import AgentBackup
from .util import read_backup
from .util import read_backup, suggested_filename
async def async_get_backup_agents(
@@ -39,7 +39,7 @@ class CoreLocalBackupAgent(LocalBackupAgent):
super().__init__()
self._hass = hass
self._backup_dir = Path(hass.config.path("backups"))
self._backups: dict[str, AgentBackup] = {}
self._backups: dict[str, tuple[AgentBackup, Path]] = {}
self._loaded_backups = False
async def _load_backups(self) -> None:
@@ -49,13 +49,13 @@ class CoreLocalBackupAgent(LocalBackupAgent):
self._backups = backups
self._loaded_backups = True
def _read_backups(self) -> dict[str, AgentBackup]:
def _read_backups(self) -> dict[str, tuple[AgentBackup, Path]]:
"""Read backups from disk."""
backups: dict[str, AgentBackup] = {}
backups: dict[str, tuple[AgentBackup, Path]] = {}
for backup_path in self._backup_dir.glob("*.tar"):
try:
backup = read_backup(backup_path)
backups[backup.backup_id] = backup
backups[backup.backup_id] = (backup, backup_path)
except (OSError, TarError, json.JSONDecodeError, KeyError) as err:
LOGGER.warning("Unable to read backup %s: %s", backup_path, err)
return backups
@@ -76,13 +76,13 @@ class CoreLocalBackupAgent(LocalBackupAgent):
**kwargs: Any,
) -> None:
"""Upload a backup."""
self._backups[backup.backup_id] = backup
self._backups[backup.backup_id] = (backup, self.get_new_backup_path(backup))
async def async_list_backups(self, **kwargs: Any) -> list[AgentBackup]:
"""List backups."""
if not self._loaded_backups:
await self._load_backups()
return list(self._backups.values())
return [backup for backup, _ in self._backups.values()]
async def async_get_backup(
self,
@@ -93,10 +93,10 @@ class CoreLocalBackupAgent(LocalBackupAgent):
if not self._loaded_backups:
await self._load_backups()
if not (backup := self._backups.get(backup_id)):
if backup_id not in self._backups:
return None
backup_path = self.get_backup_path(backup_id)
backup, backup_path = self._backups[backup_id]
if not await self._hass.async_add_executor_job(backup_path.exists):
LOGGER.debug(
(
@@ -112,15 +112,28 @@ class CoreLocalBackupAgent(LocalBackupAgent):
return backup
def get_backup_path(self, backup_id: str) -> Path:
"""Return the local path to a backup."""
return self._backup_dir / f"{backup_id}.tar"
"""Return the local path to an existing backup.
Raises BackupAgentError if the backup does not exist.
"""
try:
return self._backups[backup_id][1]
except KeyError as err:
raise BackupNotFound(f"Backup {backup_id} does not exist") from err
def get_new_backup_path(self, backup: AgentBackup) -> Path:
"""Return the local path to a new backup."""
return self._backup_dir / suggested_filename(backup)
async def async_delete_backup(self, backup_id: str, **kwargs: Any) -> None:
"""Delete a backup file."""
if await self.async_get_backup(backup_id) is None:
return
if not self._loaded_backups:
await self._load_backups()
backup_path = self.get_backup_path(backup_id)
try:
backup_path = self.get_backup_path(backup_id)
except BackupNotFound:
return
await self._hass.async_add_executor_job(backup_path.unlink, True)
LOGGER.debug("Deleted backup located at %s", backup_path)
self._backups.pop(backup_id)
+13 -64
@@ -2,8 +2,6 @@
from __future__ import annotations
import asyncio
from collections.abc import Callable
from dataclasses import dataclass, field, replace
import datetime as dt
from datetime import datetime, timedelta
@@ -252,7 +250,7 @@ class RetentionConfig:
"""Delete backups older than days."""
self._schedule_next(manager)
def _backups_filter(
def _delete_filter(
backups: dict[str, ManagerBackup],
) -> dict[str, ManagerBackup]:
"""Return backups older than days to delete."""
@@ -269,7 +267,9 @@ class RetentionConfig:
< now
}
await _delete_filtered_backups(manager, _backups_filter)
await manager.async_delete_filtered_backups(
include_filter=_automatic_backups_filter, delete_filter=_delete_filter
)
manager.remove_next_delete_event = async_call_later(
manager.hass, timedelta(days=1), _delete_backups
@@ -521,74 +521,21 @@ class CreateBackupParametersDict(TypedDict, total=False):
password: str | None
async def _delete_filtered_backups(
manager: BackupManager,
backup_filter: Callable[[dict[str, ManagerBackup]], dict[str, ManagerBackup]],
) -> None:
"""Delete backups parsed with a filter.
:param manager: The backup manager.
:param backup_filter: A filter that should return the backups to delete.
"""
backups, get_agent_errors = await manager.async_get_backups()
if get_agent_errors:
LOGGER.debug(
"Error getting backups; continuing anyway: %s",
get_agent_errors,
)
# only delete backups that are created with the saved automatic settings
backups = {
def _automatic_backups_filter(
backups: dict[str, ManagerBackup],
) -> dict[str, ManagerBackup]:
"""Return automatic backups."""
return {
backup_id: backup
for backup_id, backup in backups.items()
if backup.with_automatic_settings
}
LOGGER.debug("Total automatic backups: %s", backups)
filtered_backups = backup_filter(backups)
if not filtered_backups:
return
# always delete oldest backup first
filtered_backups = dict(
sorted(
filtered_backups.items(),
key=lambda backup_item: backup_item[1].date,
)
)
if len(filtered_backups) >= len(backups):
# Never delete the last backup.
last_backup = filtered_backups.popitem()
LOGGER.debug("Keeping the last backup: %s", last_backup)
LOGGER.debug("Backups to delete: %s", filtered_backups)
if not filtered_backups:
return
backup_ids = list(filtered_backups)
delete_results = await asyncio.gather(
*(manager.async_delete_backup(backup_id) for backup_id in filtered_backups)
)
agent_errors = {
backup_id: error
for backup_id, error in zip(backup_ids, delete_results, strict=True)
if error
}
if agent_errors:
LOGGER.error(
"Error deleting old copies: %s",
agent_errors,
)
async def delete_backups_exceeding_configured_count(manager: BackupManager) -> None:
"""Delete backups exceeding the configured retention count."""
def _backups_filter(
def _delete_filter(
backups: dict[str, ManagerBackup],
) -> dict[str, ManagerBackup]:
"""Return oldest backups more numerous than copies to delete."""
@@ -603,4 +550,6 @@ async def delete_backups_exceeding_configured_count(manager: BackupManager) -> N
)[: max(len(backups) - manager.config.data.retention.copies, 0)]
)
await _delete_filtered_backups(manager, _backups_filter)
await manager.async_delete_filtered_backups(
include_filter=_automatic_backups_filter, delete_filter=_delete_filter
)
+87 -17
@@ -685,6 +685,70 @@ class BackupManager:
return agent_errors
async def async_delete_filtered_backups(
self,
*,
include_filter: Callable[[dict[str, ManagerBackup]], dict[str, ManagerBackup]],
delete_filter: Callable[[dict[str, ManagerBackup]], dict[str, ManagerBackup]],
) -> None:
"""Delete backups parsed with a filter.
:param include_filter: A filter that should return the backups to consider for
deletion. Note: The newest of the backups returned by include_filter will
unconditionally be kept, even if delete_filter returns all backups.
:param delete_filter: A filter that should return the backups to delete.
"""
backups, get_agent_errors = await self.async_get_backups()
if get_agent_errors:
LOGGER.debug(
"Error getting backups; continuing anyway: %s",
get_agent_errors,
)
# Run the include filter first to ensure we only consider backups that
# should be included in the deletion process.
backups = include_filter(backups)
LOGGER.debug("Total automatic backups: %s", backups)
backups_to_delete = delete_filter(backups)
if not backups_to_delete:
return
# always delete oldest backup first
backups_to_delete = dict(
sorted(
backups_to_delete.items(),
key=lambda backup_item: backup_item[1].date,
)
)
if len(backups_to_delete) >= len(backups):
# Never delete the last backup.
last_backup = backups_to_delete.popitem()
LOGGER.debug("Keeping the last backup: %s", last_backup)
LOGGER.debug("Backups to delete: %s", backups_to_delete)
if not backups_to_delete:
return
backup_ids = list(backups_to_delete)
delete_results = await asyncio.gather(
*(self.async_delete_backup(backup_id) for backup_id in backups_to_delete)
)
agent_errors = {
backup_id: error
for backup_id, error in zip(backup_ids, delete_results, strict=True)
if error
}
if agent_errors:
LOGGER.error(
"Error deleting old copies: %s",
agent_errors,
)
async def async_receive_backup(
self,
*,
@@ -898,7 +962,7 @@ class BackupManager:
)
backup_name = (
name
(name if name is None else name.strip())
or f"{'Automatic' if with_automatic_settings else 'Custom'} backup {HAVERSION}"
)
extra_metadata = extra_metadata or {}
@@ -1166,7 +1230,11 @@ class BackupManager:
learn_more_url="homeassistant://config/backup",
severity=ir.IssueSeverity.WARNING,
translation_key="automatic_backup_failed_upload_agents",
translation_placeholders={"failed_agents": ", ".join(agent_errors)},
translation_placeholders={
"failed_agents": ", ".join(
self.backup_agents[agent_id].name for agent_id in agent_errors
)
},
)
async def async_can_decrypt_on_download(
@@ -1346,10 +1414,24 @@ class CoreBackupReaderWriter(BackupReaderWriter):
if agent_config and not agent_config.protected:
password = None
backup = AgentBackup(
addons=[],
backup_id=backup_id,
database_included=include_database,
date=date_str,
extra_metadata=extra_metadata,
folders=[],
homeassistant_included=True,
homeassistant_version=HAVERSION,
name=backup_name,
protected=password is not None,
size=0,
)
local_agent_tar_file_path = None
if self._local_agent_id in agent_ids:
local_agent = manager.local_backup_agents[self._local_agent_id]
local_agent_tar_file_path = local_agent.get_backup_path(backup_id)
local_agent_tar_file_path = local_agent.get_new_backup_path(backup)
on_progress(
CreateBackupEvent(
@@ -1391,19 +1473,7 @@ class CoreBackupReaderWriter(BackupReaderWriter):
# ValueError from json_bytes
raise BackupReaderWriterError(str(err)) from err
else:
backup = AgentBackup(
addons=[],
backup_id=backup_id,
database_included=include_database,
date=date_str,
extra_metadata=extra_metadata,
folders=[],
homeassistant_included=True,
homeassistant_version=HAVERSION,
name=backup_name,
protected=password is not None,
size=size_in_bytes,
)
backup = replace(backup, size=size_in_bytes)
async_add_executor_job = self._hass.async_add_executor_job
@@ -1517,7 +1587,7 @@ class CoreBackupReaderWriter(BackupReaderWriter):
manager = self._hass.data[DATA_MANAGER]
if self._local_agent_id in agent_ids:
local_agent = manager.local_backup_agents[self._local_agent_id]
tar_file_path = local_agent.get_backup_path(backup.backup_id)
tar_file_path = local_agent.get_new_backup_path(backup)
await async_add_executor_job(make_backup_dir, tar_file_path.parent)
await async_add_executor_job(shutil.move, temp_file, tar_file_path)
else:
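The manager hunk above now builds the AgentBackup up front with size=0 and patches the real size in afterwards with dataclasses.replace. A minimal illustration of that idiom on a trimmed-down stand-in dataclass (not the real AgentBackup):

```python
from dataclasses import dataclass, replace


@dataclass(frozen=True)
class Backup:
    backup_id: str
    name: str
    size: int = 0


backup = Backup(backup_id="abc123", name="Automatic backup 2025.2.0")
# Fill in the size once the tar file has actually been written.
backup = replace(backup, size=1_234_567)
print(backup)
```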
+5 -3
@@ -16,7 +16,7 @@ if TYPE_CHECKING:
STORE_DELAY_SAVE = 30
STORAGE_KEY = DOMAIN
STORAGE_VERSION = 1
STORAGE_VERSION_MINOR = 2
STORAGE_VERSION_MINOR = 3
class StoredBackupData(TypedDict):
@@ -47,8 +47,10 @@ class _BackupStore(Store[StoredBackupData]):
"""Migrate to the new version."""
data = old_data
if old_major_version == 1:
if old_minor_version < 2:
# Version 1.2 adds per agent settings, configurable backup time
if old_minor_version < 3:
# Version 1.2 bumped to 1.3 because 1.2 was changed several
# times during development.
# Version 1.3 adds per agent settings, configurable backup time
# and custom days
data["config"]["agents"] = {}
data["config"]["schedule"]["time"] = None
+12
@@ -20,6 +20,7 @@ from securetar import SecureTarError, SecureTarFile, SecureTarReadError
from homeassistant.backup_restore import password_to_key
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.util import dt as dt_util
from homeassistant.util.json import JsonObjectType, json_loads_object
from homeassistant.util.thread import ThreadWithException
@@ -117,6 +118,17 @@ def read_backup(backup_path: Path) -> AgentBackup:
)
def suggested_filename_from_name_date(name: str, date_str: str) -> str:
"""Suggest a filename for the backup."""
date = dt_util.parse_datetime(date_str, raise_on_error=True)
return "_".join(f"{name} - {date.strftime('%Y-%m-%d %H.%M %S%f')}.tar".split())
def suggested_filename(backup: AgentBackup) -> str:
"""Suggest a filename for the backup."""
return suggested_filename_from_name_date(backup.name, backup.date)
def validate_password(path: Path, password: str | None) -> bool:
"""Validate the password."""
with tarfile.open(path, "r:", bufsize=BUF_SIZE) as backup_file:
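For reference, the naming scheme added in the hunk above replaces whitespace with underscores and embeds the backup date. A standalone sketch of the same logic (using datetime.fromisoformat in place of dt_util.parse_datetime so it runs outside Home Assistant):

```python
from datetime import datetime


def suggested_filename_from_name_date(name: str, date_str: str) -> str:
    """Reproduce the filename scheme shown in the hunk above."""
    date = datetime.fromisoformat(date_str)
    return "_".join(f"{name} - {date.strftime('%Y-%m-%d %H.%M %S%f')}.tar".split())


# e.g. an automatic backup taken at 19:24 UTC on 2025-01-31:
print(suggested_filename_from_name_date(
    "Automatic backup 2025.2.0", "2025-01-31T19:24:00+00:00"
))
# -> Automatic_backup_2025.2.0_-_2025-01-31_19.24_00000000.tar
```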
+1 -1
@@ -199,7 +199,7 @@ async def handle_can_decrypt_on_download(
vol.Optional("include_database", default=True): bool,
vol.Optional("include_folders"): [vol.Coerce(Folder)],
vol.Optional("include_homeassistant", default=True): bool,
vol.Optional("name"): str,
vol.Optional("name"): vol.Any(str, None),
vol.Optional("password"): vol.Any(str, None),
}
)
+15 -2
@@ -80,6 +80,7 @@ from .const import (
CONF_DETAILS,
CONF_PASSIVE,
CONF_SOURCE_CONFIG_ENTRY_ID,
CONF_SOURCE_DEVICE_ID,
CONF_SOURCE_DOMAIN,
CONF_SOURCE_MODEL,
DOMAIN,
@@ -297,7 +298,12 @@ async def async_discover_adapters(
async def async_update_device(
hass: HomeAssistant, entry: ConfigEntry, adapter: str, details: AdapterDetails
hass: HomeAssistant,
entry: ConfigEntry,
adapter: str,
details: AdapterDetails,
via_device_domain: str | None = None,
via_device_id: str | None = None,
) -> None:
"""Update device registry entry.
@@ -306,7 +312,8 @@ async def async_update_device(
update the device with the new location so they can
figure out where the adapter is.
"""
dr.async_get(hass).async_get_or_create(
device_registry = dr.async_get(hass)
device_entry = device_registry.async_get_or_create(
config_entry_id=entry.entry_id,
name=adapter_human_name(adapter, details[ADAPTER_ADDRESS]),
connections={(dr.CONNECTION_BLUETOOTH, details[ADAPTER_ADDRESS])},
@@ -315,6 +322,10 @@ async def async_update_device(
sw_version=details.get(ADAPTER_SW_VERSION),
hw_version=details.get(ADAPTER_HW_VERSION),
)
if via_device_id:
device_registry.async_update_device(
device_entry.id, via_device_id=via_device_id
)
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
@@ -349,6 +360,8 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
entry,
source_entry.title,
details,
source_domain,
entry.data.get(CONF_SOURCE_DEVICE_ID),
)
return True
manager = _get_manager(hass)
+7 -1
@@ -181,10 +181,16 @@ def async_register_scanner(
source_domain: str | None = None,
source_model: str | None = None,
source_config_entry_id: str | None = None,
source_device_id: str | None = None,
) -> CALLBACK_TYPE:
"""Register a BleakScanner."""
return _get_manager(hass).async_register_hass_scanner(
scanner, connection_slots, source_domain, source_model, source_config_entry_id
scanner,
connection_slots,
source_domain,
source_model,
source_config_entry_id,
source_device_id,
)
@@ -37,6 +37,7 @@ from .const import (
CONF_PASSIVE,
CONF_SOURCE,
CONF_SOURCE_CONFIG_ENTRY_ID,
CONF_SOURCE_DEVICE_ID,
CONF_SOURCE_DOMAIN,
CONF_SOURCE_MODEL,
DOMAIN,
@@ -194,6 +195,7 @@ class BluetoothConfigFlow(ConfigFlow, domain=DOMAIN):
CONF_SOURCE_MODEL: user_input[CONF_SOURCE_MODEL],
CONF_SOURCE_DOMAIN: user_input[CONF_SOURCE_DOMAIN],
CONF_SOURCE_CONFIG_ENTRY_ID: user_input[CONF_SOURCE_CONFIG_ENTRY_ID],
CONF_SOURCE_DEVICE_ID: user_input[CONF_SOURCE_DEVICE_ID],
}
self._abort_if_unique_id_configured(updates=data)
manager = get_manager()
+1 -1
@@ -22,7 +22,7 @@ CONF_SOURCE: Final = "source"
CONF_SOURCE_DOMAIN: Final = "source_domain"
CONF_SOURCE_MODEL: Final = "source_model"
CONF_SOURCE_CONFIG_ENTRY_ID: Final = "source_config_entry_id"
CONF_SOURCE_DEVICE_ID: Final = "source_device_id"
SOURCE_LOCAL: Final = "local"
@@ -25,6 +25,7 @@ from homeassistant.helpers.dispatcher import async_dispatcher_connect
from .const import (
CONF_SOURCE,
CONF_SOURCE_CONFIG_ENTRY_ID,
CONF_SOURCE_DEVICE_ID,
CONF_SOURCE_DOMAIN,
CONF_SOURCE_MODEL,
DOMAIN,
@@ -254,6 +255,7 @@ class HomeAssistantBluetoothManager(BluetoothManager):
source_domain: str | None = None,
source_model: str | None = None,
source_config_entry_id: str | None = None,
source_device_id: str | None = None,
) -> CALLBACK_TYPE:
"""Register a scanner."""
cancel = self.async_register_scanner(scanner, connection_slots)
@@ -261,9 +263,6 @@ class HomeAssistantBluetoothManager(BluetoothManager):
isinstance(scanner, BaseHaRemoteScanner)
and source_domain
and source_config_entry_id
and not self.hass.config_entries.async_entry_for_domain_unique_id(
DOMAIN, scanner.source
)
):
self.hass.async_create_task(
self.hass.config_entries.flow.async_init(
@@ -274,6 +273,7 @@ class HomeAssistantBluetoothManager(BluetoothManager):
CONF_SOURCE_DOMAIN: source_domain,
CONF_SOURCE_MODEL: source_model,
CONF_SOURCE_CONFIG_ENTRY_ID: source_config_entry_id,
CONF_SOURCE_DEVICE_ID: source_device_id,
},
)
)
@@ -21,6 +21,6 @@
"bluetooth-auto-recovery==1.4.2",
"bluetooth-data-tools==1.22.0",
"dbus-fast==2.30.2",
"habluetooth==3.14.0"
"habluetooth==3.21.0"
]
}
+9 -1
@@ -39,6 +39,10 @@ def async_load_history_from_system(
now_monotonic = monotonic_time_coarse()
connectable_loaded_history: dict[str, BluetoothServiceInfoBleak] = {}
all_loaded_history: dict[str, BluetoothServiceInfoBleak] = {}
adapter_to_source_address = {
adapter: details[ADAPTER_ADDRESS]
for adapter, details in adapters.adapters.items()
}
# Restore local adapters
for address, history in adapters.history.items():
@@ -50,7 +54,11 @@ def async_load_history_from_system(
BluetoothServiceInfoBleak.from_device_and_advertisement_data(
history.device,
history.advertisement_data,
history.source,
# history.source is really the adapter name
# for historical compatibility since BlueZ
# does not know the MAC address of the adapter
# so we need to convert it to the source address (MAC)
adapter_to_source_address.get(history.source, history.source),
now_monotonic,
True,
)
@@ -132,7 +132,7 @@ class BTHomeConfigFlow(ConfigFlow, domain=DOMAIN):
return self._async_get_or_create_entry()
current_addresses = self._async_current_ids()
current_addresses = self._async_current_ids(include_ignore=False)
for discovery_info in async_discovered_service_info(self.hass, False):
address = discovery_info.address
if address in current_addresses or address in self._discovered_devices:
@@ -20,5 +20,5 @@
"dependencies": ["bluetooth_adapters"],
"documentation": "https://www.home-assistant.io/integrations/bthome",
"iot_class": "local_push",
"requirements": ["bthome-ble==3.9.1"]
"requirements": ["bthome-ble==3.12.3"]
}
+26 -10
@@ -67,6 +67,16 @@ SENSOR_DESCRIPTIONS = {
state_class=SensorStateClass.MEASUREMENT,
entity_category=EntityCategory.DIAGNOSTIC,
),
# Conductivity (µS/cm)
(
BTHomeSensorDeviceClass.CONDUCTIVITY,
Units.CONDUCTIVITY,
): SensorEntityDescription(
key=f"{BTHomeSensorDeviceClass.CONDUCTIVITY}_{Units.CONDUCTIVITY}",
device_class=SensorDeviceClass.CONDUCTIVITY,
native_unit_of_measurement=UnitOfConductivity.MICROSIEMENS_PER_CM,
state_class=SensorStateClass.MEASUREMENT,
),
# Count (-)
(BTHomeSensorDeviceClass.COUNT, None): SensorEntityDescription(
key=str(BTHomeSensorDeviceClass.COUNT),
@@ -99,6 +109,12 @@ SENSOR_DESCRIPTIONS = {
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
state_class=SensorStateClass.MEASUREMENT,
),
# Directions (°)
(BTHomeExtendedSensorDeviceClass.DIRECTION, Units.DEGREE): SensorEntityDescription(
key=f"{BTHomeExtendedSensorDeviceClass.DIRECTION}_{Units.DEGREE}",
native_unit_of_measurement=DEGREE,
state_class=SensorStateClass.MEASUREMENT,
),
# Distance (mm)
(
BTHomeSensorDeviceClass.DISTANCE,
@@ -221,6 +237,16 @@ SENSOR_DESCRIPTIONS = {
native_unit_of_measurement=UnitOfPower.WATT,
state_class=SensorStateClass.MEASUREMENT,
),
# Precipitation (mm)
(
BTHomeExtendedSensorDeviceClass.PRECIPITATION,
Units.LENGTH_MILLIMETERS,
): SensorEntityDescription(
key=f"{BTHomeExtendedSensorDeviceClass.PRECIPITATION}_{Units.LENGTH_MILLIMETERS}",
device_class=SensorDeviceClass.PRECIPITATION,
native_unit_of_measurement=UnitOfLength.MILLIMETERS,
state_class=SensorStateClass.MEASUREMENT,
),
# Pressure (mbar)
(BTHomeSensorDeviceClass.PRESSURE, Units.PRESSURE_MBAR): SensorEntityDescription(
key=f"{BTHomeSensorDeviceClass.PRESSURE}_{Units.PRESSURE_MBAR}",
@@ -357,16 +383,6 @@ SENSOR_DESCRIPTIONS = {
native_unit_of_measurement=UnitOfVolume.LITERS,
state_class=SensorStateClass.TOTAL,
),
# Conductivity (µS/cm)
(
BTHomeSensorDeviceClass.CONDUCTIVITY,
Units.CONDUCTIVITY,
): SensorEntityDescription(
key=f"{BTHomeSensorDeviceClass.CONDUCTIVITY}_{Units.CONDUCTIVITY}",
device_class=SensorDeviceClass.CONDUCTIVITY,
native_unit_of_measurement=UnitOfConductivity.MICROSIEMENS_PER_CM,
state_class=SensorStateClass.MEASUREMENT,
),
}
+12 -7
@@ -1175,12 +1175,17 @@ async def async_handle_snapshot_service(
f"Cannot write `{snapshot_file}`, no access to path; `allowlist_external_dirs` may need to be adjusted in `configuration.yaml`"
)
async with asyncio.timeout(CAMERA_IMAGE_TIMEOUT):
image = (
await _async_get_stream_image(camera, wait_for_next_keyframe=True)
if camera.use_stream_for_stills
else await camera.async_camera_image()
)
try:
async with asyncio.timeout(CAMERA_IMAGE_TIMEOUT):
image = (
await _async_get_stream_image(camera, wait_for_next_keyframe=True)
if camera.use_stream_for_stills
else await camera.async_camera_image()
)
except TimeoutError as err:
raise HomeAssistantError(
f"Unable to get snapshot: Timed out after {CAMERA_IMAGE_TIMEOUT} seconds"
) from err
if image is None:
return
@@ -1194,7 +1199,7 @@ async def async_handle_snapshot_service(
try:
await hass.async_add_executor_job(_write_image, snapshot_file, image)
except OSError as err:
_LOGGER.error("Can't write image to file: %s", err)
raise HomeAssistantError(f"Can't write image to file: {err}") from err
async def async_handle_play_stream_service(
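The camera hunk above (#137051) converts both the snapshot timeout and the file-write failure into HomeAssistantError, so service callers get a proper error instead of only a log entry. The essential shape of the timeout handling, distilled into a standalone sketch (the exception class and timeout value are stand-ins, not the real constants):

```python
import asyncio


class HomeAssistantError(Exception):
    """Stand-in for homeassistant.exceptions.HomeAssistantError."""


CAMERA_IMAGE_TIMEOUT = 10  # seconds; illustrative value


async def get_image_with_timeout(get_image) -> bytes:
    """Await a snapshot coroutine, converting a timeout into a user-facing error."""
    try:
        async with asyncio.timeout(CAMERA_IMAGE_TIMEOUT):
            return await get_image()
    except TimeoutError as err:
        raise HomeAssistantError(
            f"Unable to get snapshot: Timed out after {CAMERA_IMAGE_TIMEOUT} seconds"
        ) from err
```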
@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/ecovacs",
"iot_class": "cloud_push",
"loggers": ["sleekxmppfs", "sucks", "deebot_client"],
"requirements": ["py-sucks==0.9.10", "deebot-client==11.1.0b1"]
"requirements": ["py-sucks==0.9.10", "deebot-client==12.0.0b0"]
}
@@ -2,6 +2,7 @@
from typing import Any
from eheimdigital.device import EheimDigitalDevice
from eheimdigital.heater import EheimDigitalHeater
from eheimdigital.types import EheimDigitalClientError, HeaterMode, HeaterUnit
@@ -39,17 +40,23 @@ async def async_setup_entry(
"""Set up the callbacks for the coordinator so climate entities can be added as devices are found."""
coordinator = entry.runtime_data
async def async_setup_device_entities(device_address: str) -> None:
"""Set up the light entities for a device."""
device = coordinator.hub.devices[device_address]
def async_setup_device_entities(
device_address: str | dict[str, EheimDigitalDevice],
) -> None:
"""Set up the climate entities for one or multiple devices."""
entities: list[EheimDigitalHeaterClimate] = []
if isinstance(device_address, str):
device_address = {device_address: coordinator.hub.devices[device_address]}
for device in device_address.values():
if isinstance(device, EheimDigitalHeater):
entities.append(EheimDigitalHeaterClimate(coordinator, device))
coordinator.known_devices.add(device.mac_address)
if isinstance(device, EheimDigitalHeater):
async_add_entities([EheimDigitalHeaterClimate(coordinator, device)])
async_add_entities(entities)
coordinator.add_platform_callback(async_setup_device_entities)
for device_address in entry.runtime_data.hub.devices:
await async_setup_device_entities(device_address)
async_setup_device_entities(coordinator.hub.devices)
class EheimDigitalHeaterClimate(EheimDigitalEntity[EheimDigitalHeater], ClimateEntity):
@@ -69,6 +76,7 @@ class EheimDigitalHeaterClimate(EheimDigitalEntity[EheimDigitalHeater], ClimateE
_attr_temperature_unit = UnitOfTemperature.CELSIUS
_attr_preset_mode = PRESET_NONE
_attr_translation_key = "heater"
_attr_name = None
def __init__(
self, coordinator: EheimDigitalUpdateCoordinator, device: EheimDigitalHeater
@@ -2,8 +2,7 @@
from __future__ import annotations
from collections.abc import Callable, Coroutine
from typing import Any
from collections.abc import Callable
from aiohttp import ClientError
from eheimdigital.device import EheimDigitalDevice
@@ -19,7 +18,9 @@ from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, Upda
from .const import DOMAIN, LOGGER
type AsyncSetupDeviceEntitiesCallback = Callable[[str], Coroutine[Any, Any, None]]
type AsyncSetupDeviceEntitiesCallback = Callable[
[str | dict[str, EheimDigitalDevice]], None
]
class EheimDigitalUpdateCoordinator(
@@ -61,7 +62,7 @@ class EheimDigitalUpdateCoordinator(
if device_address not in self.known_devices:
for platform_callback in self.platform_callbacks:
await platform_callback(device_address)
platform_callback(device_address)
async def _async_receive_callback(self) -> None:
self.async_set_updated_data(self.hub.devices)
+18 -13
@@ -3,6 +3,7 @@
from typing import Any
from eheimdigital.classic_led_ctrl import EheimDigitalClassicLEDControl
from eheimdigital.device import EheimDigitalDevice
from eheimdigital.types import EheimDigitalClientError, LightMode
from homeassistant.components.light import (
@@ -37,24 +38,28 @@ async def async_setup_entry(
"""Set up the callbacks for the coordinator so lights can be added as devices are found."""
coordinator = entry.runtime_data
async def async_setup_device_entities(device_address: str) -> None:
"""Set up the light entities for a device."""
device = coordinator.hub.devices[device_address]
def async_setup_device_entities(
device_address: str | dict[str, EheimDigitalDevice],
) -> None:
"""Set up the light entities for one or multiple devices."""
entities: list[EheimDigitalClassicLEDControlLight] = []
if isinstance(device_address, str):
device_address = {device_address: coordinator.hub.devices[device_address]}
for device in device_address.values():
if isinstance(device, EheimDigitalClassicLEDControl):
for channel in range(2):
if len(device.tankconfig[channel]) > 0:
entities.append(
EheimDigitalClassicLEDControlLight(
coordinator, device, channel
)
)
coordinator.known_devices.add(device.mac_address)
if isinstance(device, EheimDigitalClassicLEDControl):
for channel in range(2):
if len(device.tankconfig[channel]) > 0:
entities.append(
EheimDigitalClassicLEDControlLight(coordinator, device, channel)
)
coordinator.known_devices.add(device.mac_address)
async_add_entities(entities)
coordinator.add_platform_callback(async_setup_device_entities)
for device_address in entry.runtime_data.hub.devices:
await async_setup_device_entities(device_address)
async_setup_device_entities(coordinator.hub.devices)
class EheimDigitalClassicLEDControlLight(
@@ -3,7 +3,7 @@
"config": {
"step": {
"user": {
"title": "Searching for Energenie-Power-Sockets Devices.",
"title": "Searching for Energenie Power Sockets devices",
"description": "Choose a discovered device.",
"data": {
"device": "[%key:common::config_flow::data::device%]"
@@ -13,7 +13,7 @@
"abort": {
"usb_error": "Couldn't access USB devices!",
"no_device": "Unable to discover any (new) supported device.",
"device_not_found": "No device was found for the given id.",
"device_not_found": "No device was found for the given ID.",
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]"
}
},
@@ -22,5 +22,5 @@
"integration_type": "device",
"iot_class": "local_polling",
"loggers": ["eq3btsmart"],
"requirements": ["eq3btsmart==1.4.1", "bleak-esphome==2.2.0"]
"requirements": ["eq3btsmart==1.4.1", "bleak-esphome==2.6.0"]
}
@@ -28,6 +28,7 @@ def async_connect_scanner(
entry_data: RuntimeEntryData,
cli: APIClient,
device_info: DeviceInfo,
device_id: str,
) -> CALLBACK_TYPE:
"""Connect scanner."""
client_data = connect_scanner(cli, device_info, entry_data.available)
@@ -45,6 +46,7 @@ def async_connect_scanner(
source_domain=DOMAIN,
source_model=device_info.model,
source_config_entry_id=entry_data.entry_id,
source_device_id=device_id,
),
scanner.async_setup(),
],
+6 -2
View File
@@ -425,7 +425,9 @@ class ESPHomeManager:
if device_info.bluetooth_proxy_feature_flags_compat(api_version):
entry_data.disconnect_callbacks.add(
async_connect_scanner(hass, entry_data, cli, device_info)
async_connect_scanner(
hass, entry_data, cli, device_info, self.device_id
)
)
else:
bluetooth.async_remove_scanner(hass, device_info.mac_address)
@@ -571,7 +573,9 @@ def _async_setup_device_registry(
configuration_url = None
if device_info.webserver_port > 0:
configuration_url = f"http://{entry.data['host']}:{device_info.webserver_port}"
entry_host = entry.data["host"]
host = f"[{entry_host}]" if ":" in entry_host else entry_host
configuration_url = f"http://{host}:{device_info.webserver_port}"
elif (
(dashboard := async_get_dashboard(hass))
and dashboard.data
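The ESPHome hunk above (#137132) wraps IPv6 hosts in brackets before building the device's configuration URL. A small self-contained illustration of the same check:

```python
def configuration_url(host: str, port: int) -> str:
    """Bracket IPv6 literals so the resulting URL is valid."""
    bracketed = f"[{host}]" if ":" in host else host
    return f"http://{bracketed}:{port}"


print(configuration_url("192.168.1.20", 80))  # http://192.168.1.20:80
print(configuration_url("fe80::1", 80))       # http://[fe80::1]:80
```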
@@ -18,7 +18,7 @@
"requirements": [
"aioesphomeapi==29.0.0",
"esphome-dashboard-api==1.2.3",
"bleak-esphome==2.2.0"
"bleak-esphome==2.6.0"
],
"zeroconf": ["_esphomelib._tcp.local."]
}
@@ -21,5 +21,5 @@
"documentation": "https://www.home-assistant.io/integrations/frontend",
"integration_type": "system",
"quality_scale": "internal",
"requirements": ["home-assistant-frontend==20250129.0"]
"requirements": ["home-assistant-frontend==20250131.0"]
}
+2 -2
@@ -11,7 +11,7 @@ from aiohttp import ClientSession, ClientTimeout, StreamReader
from aiohttp.client_exceptions import ClientError, ClientResponseError
from google_drive_api.api import AbstractAuth, GoogleDriveApi
from homeassistant.components.backup import AgentBackup
from homeassistant.components.backup import AgentBackup, suggested_filename
from homeassistant.config_entries import ConfigEntryState
from homeassistant.const import CONF_ACCESS_TOKEN
from homeassistant.exceptions import (
@@ -132,7 +132,7 @@ class DriveClient:
"""Upload a backup."""
folder_id, _ = await self.async_create_ha_root_folder_if_not_exists()
backup_metadata = {
"name": f"{backup.name} {backup.date}.tar",
"name": suggested_filename(backup),
"description": json.dumps(backup.as_dict()),
"parents": [folder_id],
"properties": {
@@ -2,6 +2,7 @@
from homeassistant.components.application_credentials import AuthorizationServer
from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_entry_oauth2_flow
async def async_get_authorization_server(hass: HomeAssistant) -> AuthorizationServer:
@@ -18,4 +19,5 @@ async def async_get_description_placeholders(hass: HomeAssistant) -> dict[str, s
"oauth_consent_url": "https://console.cloud.google.com/apis/credentials/consent",
"more_info_url": "https://www.home-assistant.io/integrations/google_drive/",
"oauth_creds_url": "https://console.cloud.google.com/apis/credentials",
"redirect_url": config_entry_oauth2_flow.async_get_redirect_uri(hass),
}
@@ -80,16 +80,14 @@ class GoogleDriveBackupAgent(BackupAgent):
try:
await self._client.async_upload_backup(open_stream, backup)
except (GoogleDriveApiError, HomeAssistantError, TimeoutError) as err:
_LOGGER.error("Upload backup error: %s", err)
raise BackupAgentError("Failed to upload backup") from err
raise BackupAgentError(f"Failed to upload backup: {err}") from err
async def async_list_backups(self, **kwargs: Any) -> list[AgentBackup]:
"""List backups."""
try:
return await self._client.async_list_backups()
except (GoogleDriveApiError, HomeAssistantError, TimeoutError) as err:
_LOGGER.error("List backups error: %s", err)
raise BackupAgentError("Failed to list backups") from err
raise BackupAgentError(f"Failed to list backups: {err}") from err
async def async_get_backup(
self,
@@ -121,9 +119,7 @@ class GoogleDriveBackupAgent(BackupAgent):
stream = await self._client.async_download(file_id)
return ChunkAsyncStreamIterator(stream)
except (GoogleDriveApiError, HomeAssistantError, TimeoutError) as err:
_LOGGER.error("Download backup error: %s", err)
raise BackupAgentError("Failed to download backup") from err
_LOGGER.error("Download backup_id: %s not found", backup_id)
raise BackupAgentError(f"Failed to download backup: {err}") from err
raise BackupAgentError("Backup not found")
async def async_delete_backup(
@@ -143,5 +139,4 @@ class GoogleDriveBackupAgent(BackupAgent):
await self._client.async_delete(file_id)
_LOGGER.debug("Deleted backup_id: %s", backup_id)
except (GoogleDriveApiError, HomeAssistantError, TimeoutError) as err:
_LOGGER.error("Delete backup error: %s", err)
raise BackupAgentError("Failed to delete backup") from err
raise BackupAgentError(f"Failed to delete backup: {err}") from err
@@ -35,6 +35,6 @@
}
},
"application_credentials": {
"description": "Follow the [instructions]({more_info_url}) for [OAuth consent screen]({oauth_consent_url}) to give Home Assistant access to your Google Drive. You also need to create Application Credentials linked to your account:\n1. Go to [Credentials]({oauth_creds_url}) and select **Create Credentials**.\n1. From the drop-down list select **OAuth client ID**.\n1. Select **Web application** for the Application Type."
"description": "Follow the [instructions]({more_info_url}) to configure the Cloud Console:\n\n1. Go to the [OAuth consent screen]({oauth_consent_url}) and configure\n1. Go to [Credentials]({oauth_creds_url}) and select **Create Credentials**.\n1. From the drop-down list select **OAuth client ID**.\n1. Select **Web application** for the Application Type.\n1. Add `{redirect_url}` under *Authorized redirect URI*."
}
}
@@ -78,7 +78,7 @@ class GoveeConfigFlow(ConfigFlow, domain=DOMAIN):
title=title, data={CONF_DEVICE_TYPE: device.device_type}
)
current_addresses = self._async_current_ids()
current_addresses = self._async_current_ids(include_ignore=False)
for discovery_info in async_discovered_service_info(self.hass, False):
address = discovery_info.address
if address in current_addresses or address in self._discovered_devices:
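Several Bluetooth config flows in this diff pass include_ignore=False to _async_current_ids so that addresses belonging to ignored entries can be offered again in the user step. A framework-free sketch of the resulting filtering, assuming a simplified entry model:

from dataclasses import dataclass

@dataclass
class Entry:
    address: str
    ignored: bool

def selectable_addresses(discovered: list[str], entries: list[Entry]) -> list[str]:
    # Addresses of non-ignored entries are excluded; ignored entries no longer
    # block a device from showing up in the user flow.
    configured = {entry.address for entry in entries if not entry.ignored}
    return [address for address in discovered if address not in configured]

# Example (hypothetical data): an ignored device "AA:BB" becomes selectable again.
print(selectable_addresses(
    ["AA:BB", "CC:DD"],
    [Entry("AA:BB", ignored=True), Entry("CC:DD", ignored=False)],
))  # -> ['AA:BB']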
+88 -12
View File
@@ -5,8 +5,10 @@ from __future__ import annotations
import asyncio
from collections.abc import AsyncIterator, Callable, Coroutine, Mapping
import logging
from pathlib import Path
import os
from pathlib import Path, PurePath
from typing import Any, cast
from uuid import UUID
from aiohasupervisor import SupervisorClient
from aiohasupervisor.exceptions import (
@@ -31,15 +33,20 @@ from homeassistant.components.backup import (
Folder,
IdleEvent,
IncorrectPasswordError,
ManagerBackup,
NewBackup,
RestoreBackupEvent,
RestoreBackupState,
WrittenBackup,
async_get_manager as async_get_backup_manager,
suggested_filename as suggested_backup_filename,
suggested_filename_from_name_date,
)
from homeassistant.const import __version__ as HAVERSION
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.dispatcher import async_dispatcher_connect
from homeassistant.util import dt as dt_util
from .const import DOMAIN, EVENT_SUPERVISOR_EVENT
from .handler import get_supervisor_client
@@ -47,6 +54,9 @@ from .handler import get_supervisor_client
LOCATION_CLOUD_BACKUP = ".cloud_backup"
LOCATION_LOCAL = ".local"
MOUNT_JOBS = ("mount_manager_create_mount", "mount_manager_remove_mount")
RESTORE_JOB_ID_ENV = "SUPERVISOR_RESTORE_JOB_ID"
# Set on backups automatically created when updating an addon
TAG_ADDON_UPDATE = "supervisor.addon_update"
_LOGGER = logging.getLogger(__name__)
@@ -109,12 +119,15 @@ def _backup_details_to_agent_backup(
AddonInfo(name=addon.name, slug=addon.slug, version=addon.version)
for addon in details.addons
]
extra_metadata = details.extra or {}
location = location or LOCATION_LOCAL
return AgentBackup(
addons=addons,
backup_id=details.slug,
database_included=database_included,
date=details.date.isoformat(),
date=extra_metadata.get(
"supervisor.backup_request_date", details.date.isoformat()
),
extra_metadata=details.extra or {},
folders=[Folder(folder) for folder in details.folders],
homeassistant_included=homeassistant_included,
@@ -170,7 +183,8 @@ class SupervisorBackupAgent(BackupAgent):
return
stream = await open_stream()
upload_options = supervisor_backups.UploadBackupOptions(
location={self.location}
location={self.location},
filename=PurePath(suggested_backup_filename(backup)),
)
await self._client.backups.upload_backup(
stream,
@@ -194,7 +208,10 @@ class SupervisorBackupAgent(BackupAgent):
**kwargs: Any,
) -> AgentBackup | None:
"""Return a backup."""
details = await self._client.backups.backup_info(backup_id)
try:
details = await self._client.backups.backup_info(backup_id)
except SupervisorNotFoundError:
return None
if self.location not in details.locations:
return None
return _backup_details_to_agent_backup(details, self.location)
@@ -208,10 +225,6 @@ class SupervisorBackupAgent(BackupAgent):
location={self.location}
),
)
except SupervisorBadRequestError as err:
if err.args[0] != "Backup does not exist":
raise
_LOGGER.debug("Backup %s does not exist", backup_id)
except SupervisorNotFoundError:
_LOGGER.debug("Backup %s does not exist", backup_id)
@@ -298,6 +311,9 @@ class SupervisorBackupReaderWriter(BackupReaderWriter):
locations = []
locations = locations or [LOCATION_CLOUD_BACKUP]
date = dt_util.now().isoformat()
extra_metadata = extra_metadata | {"supervisor.backup_request_date": date}
filename = suggested_filename_from_name_date(backup_name, date)
try:
backup = await self._client.backups.partial_backup(
supervisor_backups.PartialBackupOptions(
@@ -311,6 +327,7 @@ class SupervisorBackupReaderWriter(BackupReaderWriter):
homeassistant_exclude_database=not include_database,
background=True,
extra=extra_metadata,
filename=PurePath(filename),
)
)
except SupervisorError as err:
@@ -346,8 +363,9 @@ class SupervisorBackupReaderWriter(BackupReaderWriter):
backup_id = data.get("reference")
backup_complete.set()
unsub = self._async_listen_job_events(backup.job_id, on_job_progress)
try:
unsub = self._async_listen_job_events(backup.job_id, on_job_progress)
await self._get_job_state(backup.job_id, on_job_progress)
await backup_complete.wait()
finally:
unsub()
@@ -502,12 +520,13 @@ class SupervisorBackupReaderWriter(BackupReaderWriter):
@callback
def on_job_progress(data: Mapping[str, Any]) -> None:
"""Handle backup progress."""
"""Handle backup restore progress."""
if data.get("done") is True:
restore_complete.set()
unsub = self._async_listen_job_events(job.job_id, on_job_progress)
try:
unsub = self._async_listen_job_events(job.job_id, on_job_progress)
await self._get_job_state(job.job_id, on_job_progress)
await restore_complete.wait()
finally:
unsub()
@@ -518,6 +537,37 @@ class SupervisorBackupReaderWriter(BackupReaderWriter):
on_progress: Callable[[RestoreBackupEvent | IdleEvent], None],
) -> None:
"""Check restore status after core restart."""
if not (restore_job_id := os.environ.get(RESTORE_JOB_ID_ENV)):
_LOGGER.debug("No restore job ID found in environment")
return
_LOGGER.debug("Found restore job ID %s in environment", restore_job_id)
@callback
def on_job_progress(data: Mapping[str, Any]) -> None:
"""Handle backup restore progress."""
if data.get("done") is not True:
on_progress(
RestoreBackupEvent(
reason="", stage=None, state=RestoreBackupState.IN_PROGRESS
)
)
return
on_progress(
RestoreBackupEvent(
reason="", stage=None, state=RestoreBackupState.COMPLETED
)
)
on_progress(IdleEvent())
unsub()
unsub = self._async_listen_job_events(restore_job_id, on_job_progress)
try:
await self._get_job_state(restore_job_id, on_job_progress)
except SupervisorError as err:
_LOGGER.debug("Could not get restore job %s: %s", restore_job_id, err)
unsub()
@callback
def _async_listen_job_events(
@@ -546,6 +596,14 @@ class SupervisorBackupReaderWriter(BackupReaderWriter):
)
return unsub
async def _get_job_state(
self, job_id: str, on_event: Callable[[Mapping[str, Any]], None]
) -> None:
"""Poll a job for its state."""
job = await self._client.jobs.get_job(UUID(job_id))
_LOGGER.debug("Job state: %s", job)
on_event(job.to_dict())
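The backup and restore paths subscribe to job events first and then poll the job once via _get_job_state, so a job that completed before the listener was attached is still observed. A rough asyncio sketch of that subscribe-then-poll pattern; the listener registry and job store here are stand-ins, not Supervisor APIs:

import asyncio
from collections.abc import Callable, Mapping
from typing import Any

# Stand-ins for the job event bus and job store (illustrative only).
_listeners: dict[str, list[Callable[[Mapping[str, Any]], None]]] = {}
_job_states: dict[str, dict[str, Any]] = {}

async def wait_for_job(job_id: str) -> None:
    """Wait until a job reports done, without racing its completion event."""
    done = asyncio.Event()

    def on_progress(data: Mapping[str, Any]) -> None:
        if data.get("done") is True:
            done.set()

    # 1) Subscribe first so any completion event emitted from now on is seen.
    _listeners.setdefault(job_id, []).append(on_progress)
    try:
        # 2) Then poll the stored state once, covering jobs that finished
        #    before the listener was attached.
        on_progress(_job_states.get(job_id, {}))
        await done.wait()
    finally:
        _listeners[job_id].remove(on_progress)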
async def _default_agent(client: SupervisorClient) -> str:
"""Return the default agent for creating a backup."""
@@ -570,10 +628,20 @@ async def backup_addon_before_update(
else:
password = None
def addon_update_backup_filter(
backups: dict[str, ManagerBackup],
) -> dict[str, ManagerBackup]:
"""Return addon update backups."""
return {
backup_id: backup
for backup_id, backup in backups.items()
if backup.extra_metadata.get(TAG_ADDON_UPDATE) == addon
}
try:
await backup_manager.async_create_backup(
agent_ids=[await _default_agent(client)],
extra_metadata={"supervisor.addon_update": addon},
extra_metadata={TAG_ADDON_UPDATE: addon},
include_addons=[addon],
include_all_addons=False,
include_database=False,
@@ -584,6 +652,14 @@ async def backup_addon_before_update(
)
except BackupManagerError as err:
raise HomeAssistantError(f"Error creating backup: {err}") from err
else:
try:
await backup_manager.async_delete_filtered_backups(
include_filter=addon_update_backup_filter,
delete_filter=lambda backups: backups,
)
except BackupManagerError as err:
raise HomeAssistantError(f"Error deleting old backups: {err}") from err
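backup_addon_before_update now tags its automatic backup with TAG_ADDON_UPDATE and, once the new backup succeeds, deletes older backups carrying the same tag for that add-on. A standalone sketch of such a metadata filter, with a deliberately simplified backup model:

from dataclasses import dataclass, field

TAG_ADDON_UPDATE = "supervisor.addon_update"

@dataclass
class Backup:
    backup_id: str
    extra_metadata: dict = field(default_factory=dict)

def addon_update_backups(backups: dict[str, Backup], addon: str) -> dict[str, Backup]:
    # Keep only backups created automatically for updates of this add-on.
    return {
        backup_id: backup
        for backup_id, backup in backups.items()
        if backup.extra_metadata.get(TAG_ADDON_UPDATE) == addon
    }

# Example (hypothetical data): only the core_ssh update backup matches.
stored = {
    "a1": Backup("a1", {TAG_ADDON_UPDATE: "core_ssh"}),
    "b2": Backup("b2", {}),
}
print(list(addon_update_backups(stored, "core_ssh")))  # -> ['a1']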
async def backup_core_before_update(hass: HomeAssistant) -> None:
+5
View File
@@ -78,6 +78,7 @@ from .const import (
CONF_VIDEO_CODEC,
CONF_VIDEO_MAP,
CONF_VIDEO_PACKET_SIZE,
CONF_VIDEO_PROFILE_NAMES,
DEFAULT_AUDIO_CODEC,
DEFAULT_AUDIO_MAP,
DEFAULT_AUDIO_PACKET_SIZE,
@@ -90,6 +91,7 @@ from .const import (
DEFAULT_VIDEO_CODEC,
DEFAULT_VIDEO_MAP,
DEFAULT_VIDEO_PACKET_SIZE,
DEFAULT_VIDEO_PROFILE_NAMES,
DOMAIN,
FEATURE_ON_OFF,
FEATURE_PLAY_PAUSE,
@@ -163,6 +165,9 @@ CAMERA_SCHEMA = BASIC_INFO_SCHEMA.extend(
vol.Optional(CONF_VIDEO_CODEC, default=DEFAULT_VIDEO_CODEC): vol.In(
VALID_VIDEO_CODECS
),
vol.Optional(CONF_VIDEO_PROFILE_NAMES, default=DEFAULT_VIDEO_PROFILE_NAMES): [
cv.string
],
vol.Optional(
CONF_AUDIO_PACKET_SIZE, default=DEFAULT_AUDIO_PACKET_SIZE
): cv.positive_int,
@@ -12,6 +12,6 @@
"iot_class": "local_polling",
"loggers": ["homewizard_energy"],
"quality_scale": "platinum",
"requirements": ["python-homewizard-energy==v8.3.0"],
"requirements": ["python-homewizard-energy==v8.3.2"],
"zeroconf": ["_hwenergy._tcp.local.", "_homewizard._tcp.local."]
}
+1 -1
View File
@@ -7,5 +7,5 @@
"documentation": "https://www.home-assistant.io/integrations/imap",
"iot_class": "cloud_push",
"loggers": ["aioimaplib"],
"requirements": ["aioimaplib==2.0.0"]
"requirements": ["aioimaplib==2.0.1"]
}
@@ -1,6 +1,6 @@
{
"domain": "incomfort",
"name": "Intergas InComfort/Intouch Lan2RF gateway",
"name": "Intergas gateway",
"codeowners": ["@jbouwh"],
"config_flow": true,
"dhcp": [
+11 -11
View File
@@ -2,20 +2,20 @@
"config": {
"step": {
"user": {
"description": "Set up new Intergas InComfort Lan2RF Gateway, some older systems might not need credentials to be set up. For newer devices authentication is required.",
"description": "Set up new Intergas gateway, some older systems might not need credentials to be set up. For newer devices authentication is required.",
"data": {
"host": "[%key:common::config_flow::data::host%]",
"username": "[%key:common::config_flow::data::username%]",
"password": "[%key:common::config_flow::data::password%]"
},
"data_description": {
"host": "Hostname or IP-address of the Intergas InComfort Lan2RF Gateway.",
"host": "Hostname or IP-address of the Intergas gateway.",
"username": "The username to log into the gateway. This is `admin` in most cases.",
"password": "The password to log into the gateway, is printed at the bottom of the Lan2RF Gateway or is `intergas` for some older devices."
"password": "The password to log into the gateway, is printed at the bottom of the gateway or is `intergas` for some older devices."
}
},
"dhcp_auth": {
"title": "Set up Intergas InComfort Lan2RF Gateway",
"title": "Set up Intergas gateway",
"description": "Please enter authentication details for gateway {host}",
"data": {
"username": "[%key:common::config_flow::data::username%]",
@@ -23,12 +23,12 @@
},
"data_description": {
"username": "The username to log into the gateway. This is `admin` in most cases.",
"password": "The password to log into the gateway, is printed at the bottom of the Lan2RF Gateway or is `intergas` for some older devices."
"password": "The password to log into the gateway, is printed at the bottom of the Gateway or is `intergas` for some older devices."
}
},
"dhcp_confirm": {
"title": "Set up Intergas InComfort Lan2RF Gateway",
"description": "Do you want to set up the discovered Intergas InComfort Lan2RF Gateway ({host})?"
"title": "Set up Intergas gateway",
"description": "Do you want to set up the discovered Intergas gateway ({host})?"
},
"reauth_confirm": {
"data": {
@@ -48,9 +48,9 @@
"error": {
"auth_error": "Invalid credentials.",
"no_heaters": "No heaters found.",
"not_found": "No Lan2RF gateway found.",
"timeout_error": "Time out when connecting to Lan2RF gateway.",
"unknown": "Unknown error when connecting to Lan2RF gateway."
"not_found": "No gateway found.",
"timeout_error": "Time out when connecting to the gateway.",
"unknown": "Unknown error when connecting to the gateway."
}
},
"exceptions": {
@@ -70,7 +70,7 @@
"options": {
"step": {
"init": {
"title": "Intergas InComfort Lan2RF Gateway options",
"title": "Intergas gateway options",
"data": {
"legacy_setpoint_status": "Legacy setpoint handling"
},
@@ -72,7 +72,7 @@ class INKBIRDConfigFlow(ConfigFlow, domain=DOMAIN):
title=self._discovered_devices[address], data={}
)
current_addresses = self._async_current_ids()
current_addresses = self._async_current_ids(include_ignore=False)
for discovery_info in async_discovered_service_info(self.hass, False):
address = discovery_info.address
if address in current_addresses or address in self._discovered_devices:
@@ -7,6 +7,6 @@
"integration_type": "service",
"iot_class": "local_polling",
"loggers": ["jellyfin_apiclient_python"],
"requirements": ["jellyfin-apiclient-python==1.9.2"],
"requirements": ["jellyfin-apiclient-python==1.10.0"],
"single_config_entry": true
}
+1 -1
View File
@@ -12,7 +12,7 @@
"requirements": [
"xknx==3.5.0",
"xknxproject==3.8.1",
"knx-frontend==2025.1.28.225404"
"knx-frontend==2025.1.30.194235"
],
"single_config_entry": true
}
@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/lacrosse_view",
"iot_class": "cloud_polling",
"loggers": ["lacrosse_view"],
"requirements": ["lacrosse-view==1.0.3"]
"requirements": ["lacrosse-view==1.0.4"]
}
+1 -1
View File
@@ -113,7 +113,7 @@ def find_hsbk(hass: HomeAssistant, **kwargs: Any) -> list[float | int | None] |
saturation = int(saturation / 100 * 65535)
kelvin = 3500
if _ATTR_COLOR_TEMP in kwargs:
if ATTR_COLOR_TEMP_KELVIN not in kwargs and _ATTR_COLOR_TEMP in kwargs:
# added in 2025.1, can be removed in 2026.1
_LOGGER.warning(
"The 'color_temp' parameter is deprecated. Please use 'color_temp_kelvin' for"
@@ -6,7 +6,7 @@ from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.typing import ConfigType
from . import http
from . import http, llm_api
from .const import DOMAIN
from .session import SessionManager
from .types import MCPServerConfigEntry
@@ -25,6 +25,7 @@ CONFIG_SCHEMA = cv.config_entry_only_config_schema(DOMAIN)
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the Model Context Protocol component."""
http.async_register(hass)
llm_api.async_register_api(hass)
return True
@@ -16,7 +16,7 @@ from homeassistant.helpers.selector import (
SelectSelectorConfig,
)
from .const import DOMAIN
from .const import DOMAIN, LLM_API, LLM_API_NAME
_LOGGER = logging.getLogger(__name__)
@@ -33,6 +33,12 @@ class ModelContextServerProtocolConfigFlow(ConfigFlow, domain=DOMAIN):
) -> ConfigFlowResult:
"""Handle the initial step."""
llm_apis = {api.id: api.name for api in llm.async_get_apis(self.hass)}
if LLM_API not in llm_apis:
# MCP server component is not loaded yet, so make the LLM API a choice.
llm_apis = {
LLM_API: LLM_API_NAME,
**llm_apis,
}
if user_input is not None:
return self.async_create_entry(
@@ -2,3 +2,5 @@
DOMAIN = "mcp_server"
TITLE = "Model Context Protocol Server"
LLM_API = "stateless_assist"
LLM_API_NAME = "Stateless Assist"
@@ -0,0 +1,48 @@
"""LLM API for MCP Server."""
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import llm
from homeassistant.util import yaml as yaml_util
from .const import LLM_API, LLM_API_NAME
EXPOSED_ENTITY_FIELDS = {"name", "domain", "description", "areas", "names"}
def async_register_api(hass: HomeAssistant) -> None:
"""Register the LLM API."""
llm.async_register_api(hass, StatelessAssistAPI(hass))
class StatelessAssistAPI(llm.AssistAPI):
"""LLM API for MCP Server that provides the Assist API without state information in the prompt.
Syncing the state information is possible, but may put unnecessary load on
the system so we are instead providing the prompt without entity state. Since
actions don't care about the current state, there is little quality loss.
"""
def __init__(self, hass: HomeAssistant) -> None:
"""Initialize the StatelessAssistAPI."""
super().__init__(hass)
self.id = LLM_API
self.name = LLM_API_NAME
@callback
def _async_get_exposed_entities_prompt(
self, llm_context: llm.LLMContext, exposed_entities: dict | None
) -> list[str]:
"""Return the prompt for the exposed entities."""
prompt = []
if exposed_entities:
prompt.append(
"An overview of the areas and the devices in this smart home:"
)
entities = [
{k: v for k, v in entity_info.items() if k in EXPOSED_ENTITY_FIELDS}
for entity_info in exposed_entities.values()
]
prompt.append(yaml_util.dump(list(entities)))
return prompt
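StatelessAssistAPI keeps only a fixed set of fields per exposed entity and serializes them to YAML, so no entity state ends up in the prompt. A simplified sketch of that field filtering, using PyYAML in place of homeassistant.util.yaml:

import yaml  # PyYAML, standing in for homeassistant.util.yaml

EXPOSED_ENTITY_FIELDS = {"name", "domain", "description", "areas", "names"}

def exposed_entities_prompt(exposed_entities: dict[str, dict]) -> list[str]:
    """Build the prompt lines without any entity state."""
    prompt: list[str] = []
    if exposed_entities:
        prompt.append("An overview of the areas and the devices in this smart home:")
        entities = [
            {k: v for k, v in info.items() if k in EXPOSED_ENTITY_FIELDS}
            for info in exposed_entities.values()
        ]
        prompt.append(yaml.dump(entities, sort_keys=False))
    return prompt

# Example (hypothetical entity data): the "state" key is dropped from the prompt.
print(exposed_entities_prompt({
    "light.kitchen": {"name": "Kitchen", "domain": "light", "areas": "Kitchen", "state": "on"},
})[1])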
@@ -111,7 +111,7 @@ class MopekaConfigFlow(ConfigFlow, domain=DOMAIN):
data={CONF_MEDIUM_TYPE: user_input[CONF_MEDIUM_TYPE]},
)
current_addresses = self._async_current_ids()
current_addresses = self._async_current_ids(include_ignore=False)
for discovery_info in async_discovered_service_info(self.hass, False):
address = discovery_info.address
if address in current_addresses or address in self._discovered_devices:
+1 -1
View File
@@ -19,5 +19,5 @@
"documentation": "https://www.home-assistant.io/integrations/nest",
"iot_class": "cloud_push",
"loggers": ["google_nest_sdm"],
"requirements": ["google-nest-sdm==7.1.0"]
"requirements": ["google-nest-sdm==7.1.1"]
}
@@ -97,6 +97,8 @@ async def async_setup_entry(hass: HomeAssistant, entry: OneDriveConfigEntry) ->
backup_folder_id=backup_folder_id,
)
_async_notify_backup_listeners_soon(hass)
return True
+66 -29
View File
@@ -2,6 +2,7 @@
from __future__ import annotations
import asyncio
from collections.abc import AsyncIterator, Callable, Coroutine
from functools import wraps
import html
@@ -9,7 +10,7 @@ import json
import logging
from typing import Any, Concatenate, cast
from httpx import Response
from httpx import Response, TimeoutException
from kiota_abstractions.api_error import APIError
from kiota_abstractions.authentication import AnonymousAuthenticationProvider
from kiota_abstractions.headers_collection import HeadersCollection
@@ -33,7 +34,12 @@ from msgraph.generated.models.drive_item_uploadable_properties import (
)
from msgraph_core.models import LargeFileUploadSession
from homeassistant.components.backup import AgentBackup, BackupAgent, BackupAgentError
from homeassistant.components.backup import (
AgentBackup,
BackupAgent,
BackupAgentError,
suggested_filename,
)
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.httpx_client import get_async_client
@@ -42,6 +48,7 @@ from .const import DATA_BACKUP_AGENT_LISTENERS, DOMAIN
_LOGGER = logging.getLogger(__name__)
UPLOAD_CHUNK_SIZE = 16 * 320 * 1024 # 5.2MB
MAX_RETRIES = 5
async def async_get_backup_agents(
@@ -96,7 +103,7 @@ def handle_backup_errors[_R, **P](
)
_LOGGER.debug("Full error: %s", err, exc_info=True)
raise BackupAgentError("Backup operation failed") from err
except TimeoutError as err:
except TimeoutException as err:
_LOGGER.error(
"Error during backup in %s: Timeout",
func.__name__,
@@ -128,6 +135,10 @@ class OneDriveBackupAgent(BackupAgent):
) -> AsyncIterator[bytes]:
"""Download a backup file."""
# this forces the query to return a raw httpx response, but breaks typing
backup = await self._find_item_by_backup_id(backup_id)
if backup is None or backup.id is None:
raise BackupAgentError("Backup not found")
request_config = (
ContentRequestBuilder.ContentRequestBuilderGetRequestConfiguration(
options=[ResponseHandlerOption(NativeResponseHandler())],
@@ -135,7 +146,7 @@ class OneDriveBackupAgent(BackupAgent):
)
response = cast(
Response,
await self._get_backup_file_item(backup_id).content.get(
await self._items.by_drive_item_id(backup.id).content.get(
request_configuration=request_config
),
)
@@ -160,9 +171,10 @@ class OneDriveBackupAgent(BackupAgent):
},
)
)
upload_session = await self._get_backup_file_item(
backup.backup_id
).create_upload_session.post(upload_session_request_body)
file_item = self._get_backup_file_item(suggested_filename(backup))
upload_session = await file_item.create_upload_session.post(
upload_session_request_body
)
if upload_session is None or upload_session.upload_url is None:
raise BackupAgentError(
@@ -179,9 +191,7 @@ class OneDriveBackupAgent(BackupAgent):
description = json.dumps(backup_dict)
_LOGGER.debug("Creating metadata: %s", description)
await self._get_backup_file_item(backup.backup_id).patch(
DriveItem(description=description)
)
await file_item.patch(DriveItem(description=description))
@handle_backup_errors
async def async_delete_backup(
@@ -190,7 +200,10 @@ class OneDriveBackupAgent(BackupAgent):
**kwargs: Any,
) -> None:
"""Delete a backup file."""
await self._get_backup_file_item(backup_id).delete()
backup = await self._find_item_by_backup_id(backup_id)
if backup is None or backup.id is None:
return
await self._items.by_drive_item_id(backup.id).delete()
@handle_backup_errors
async def async_list_backups(self, **kwargs: Any) -> list[AgentBackup]:
@@ -210,18 +223,12 @@ class OneDriveBackupAgent(BackupAgent):
self, backup_id: str, **kwargs: Any
) -> AgentBackup | None:
"""Return a backup."""
try:
drive_item = await self._get_backup_file_item(backup_id).get()
except APIError as err:
if err.response_status_code == 404:
return None
raise
if (
drive_item is not None
and (description := drive_item.description) is not None
):
return self._backup_from_description(description)
return None
backup = await self._find_item_by_backup_id(backup_id)
if backup is None:
return None
assert backup.description # already checked in _find_item_by_backup_id
return self._backup_from_description(backup.description)
def _backup_from_description(self, description: str) -> AgentBackup:
"""Create a backup object from a description."""
@@ -230,8 +237,20 @@ class OneDriveBackupAgent(BackupAgent):
) # OneDrive encodes the description on save automatically
return AgentBackup.from_dict(json.loads(description))
async def _find_item_by_backup_id(self, backup_id: str) -> DriveItem | None:
"""Find a backup item by its backup ID."""
items = await self._items.by_drive_item_id(f"{self._folder_id}").children.get()
if items and (values := items.value):
for item in values:
if (description := item.description) is None:
continue
if backup_id in description:
return item
return None
def _get_backup_file_item(self, backup_id: str) -> DriveItemItemRequestBuilder:
return self._items.by_drive_item_id(f"{self._folder_id}:/{backup_id}.tar:")
return self._items.by_drive_item_id(f"{self._folder_id}:/{backup_id}:")
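Rather than addressing backups by a fixed <backup_id>.tar path, the agent now lists the children of the backup folder and matches the backup ID against each item's description, where the backup metadata is stored as JSON. A framework-free sketch of that lookup (the item shape is simplified and is not the msgraph SDK):

from dataclasses import dataclass

@dataclass
class Item:
    id: str
    description: str | None = None

def find_item_by_backup_id(children: list[Item], backup_id: str) -> Item | None:
    # The backup metadata, including its ID, is JSON-encoded in the description,
    # so a substring match on the description identifies the right drive item.
    for item in children:
        if item.description and backup_id in item.description:
            return item
    return None

# Example (hypothetical): matches the item whose metadata mentions "abc123".
items = [Item("1", '{"backup_id": "abc123"}'), Item("2", None)]
print(find_item_by_backup_id(items, "abc123").id)  # -> 1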
async def _upload_file(
self, upload_url: str, stream: AsyncIterator[bytes], total_size: int
@@ -262,6 +281,7 @@ class OneDriveBackupAgent(BackupAgent):
start = 0
buffer: list[bytes] = []
buffer_size = 0
retries = 0
async for chunk in stream:
buffer.append(chunk)
@@ -273,11 +293,28 @@ class OneDriveBackupAgent(BackupAgent):
buffer_size > UPLOAD_CHUNK_SIZE
): # Loop in case the buffer is >= UPLOAD_CHUNK_SIZE * 2
slice_start = uploaded_chunks * UPLOAD_CHUNK_SIZE
await async_upload(
start,
start + UPLOAD_CHUNK_SIZE - 1,
chunk_data[slice_start : slice_start + UPLOAD_CHUNK_SIZE],
)
try:
await async_upload(
start,
start + UPLOAD_CHUNK_SIZE - 1,
chunk_data[slice_start : slice_start + UPLOAD_CHUNK_SIZE],
)
except APIError as err:
if (
err.response_status_code and err.response_status_code < 500
): # no retry on 4xx errors
raise
if retries < MAX_RETRIES:
await asyncio.sleep(2**retries)
retries += 1
continue
raise
except TimeoutException:
if retries < MAX_RETRIES:
retries += 1
continue
raise
retries = 0
start += UPLOAD_CHUNK_SIZE
uploaded_chunks += 1
buffer_size -= UPLOAD_CHUNK_SIZE
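The upload loop above retries failed chunks: 5xx errors and timeouts are retried up to MAX_RETRIES times, the 5xx case with exponential backoff, while 4xx errors are re-raised immediately. A minimal sketch of that retry policy around a single chunk; the uploader callable and error type are placeholders, not the msgraph SDK:

import asyncio

MAX_RETRIES = 5

class UploadError(Exception):
    """Placeholder for the SDK's APIError, carrying an HTTP status."""

    def __init__(self, status: int | None = None) -> None:
        self.status = status

async def upload_chunk_with_retry(upload, data: bytes, start: int) -> None:
    """Retry 5xx/timeout failures; fail fast on 4xx client errors."""
    retries = 0
    while True:
        try:
            await upload(start, start + len(data) - 1, data)
            return
        except UploadError as err:
            if err.status and err.status < 500:
                raise  # client errors will not succeed on retry
            if retries >= MAX_RETRIES:
                raise
            await asyncio.sleep(2**retries)  # exponential backoff
            retries += 1
        except TimeoutError:
            if retries >= MAX_RETRIES:
                raise
            retries += 1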
@@ -78,7 +78,7 @@ class OneDriveConfigFlow(AbstractOAuth2FlowHandler, domain=DOMAIN):
self.logger.exception("Unknown error")
return self.async_abort(reason="unknown")
drive = response.json()
drive: dict = response.json()
await self.async_set_unique_id(drive["parentReference"]["driveId"])
@@ -94,7 +94,10 @@ class OneDriveConfigFlow(AbstractOAuth2FlowHandler, domain=DOMAIN):
self._abort_if_unique_id_configured()
title = f"{drive['shared']['owner']['user']['displayName']}'s OneDrive"
user = drive.get("createdBy", {}).get("user", {}).get("displayName")
title = f"{user}'s OneDrive" if user else "OneDrive"
return self.async_create_entry(title=title, data=data)
async def async_step_reauth(
@@ -147,6 +147,7 @@ class OneWireFlowHandler(ConfigFlow, domain=DOMAIN):
return self.async_show_form(
step_id="discovery_confirm",
description_placeholders={"host": self._discovery_data[CONF_HOST]},
errors=errors,
)
@@ -8,6 +8,9 @@
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]"
},
"step": {
"discovery_confirm": {
"description": "Do you want to set up OWServer from {host}?"
},
"reconfigure": {
"data": {
"host": "[%key:common::config_flow::data::host%]",
@@ -5,18 +5,16 @@ import logging
from types import MappingProxyType
from typing import Any, cast
import aiohttp
from opower import (
Account,
AggregateType,
CannotConnect,
CostRead,
Forecast,
InvalidAuth,
MeterType,
Opower,
ReadResolution,
)
from opower.exceptions import ApiException, CannotConnect, InvalidAuth
from homeassistant.components.recorder import get_instance
from homeassistant.components.recorder.models import StatisticData, StatisticMetaData
@@ -89,7 +87,7 @@ class OpowerCoordinator(DataUpdateCoordinator[dict[str, Forecast]]):
raise UpdateFailed(f"Error during login: {err}") from err
try:
forecasts: list[Forecast] = await self.api.async_get_forecast()
except aiohttp.ClientError as err:
except ApiException as err:
_LOGGER.error("Error getting forecasts: %s", err)
raise
_LOGGER.debug("Updating sensor data with: %s", forecasts)
@@ -102,7 +100,7 @@ class OpowerCoordinator(DataUpdateCoordinator[dict[str, Forecast]]):
"""Insert Opower statistics."""
try:
accounts = await self.api.async_get_accounts()
except aiohttp.ClientError as err:
except ApiException as err:
_LOGGER.error("Error getting accounts: %s", err)
raise
for account in accounts:
@@ -271,7 +269,7 @@ class OpowerCoordinator(DataUpdateCoordinator[dict[str, Forecast]]):
cost_reads = await self.api.async_get_cost_reads(
account, AggregateType.BILL, start, end
)
except aiohttp.ClientError as err:
except ApiException as err:
_LOGGER.error("Error getting monthly cost reads: %s", err)
raise
_LOGGER.debug("Got %s monthly cost reads", len(cost_reads))
@@ -290,7 +288,7 @@ class OpowerCoordinator(DataUpdateCoordinator[dict[str, Forecast]]):
daily_cost_reads = await self.api.async_get_cost_reads(
account, AggregateType.DAY, start, end
)
except aiohttp.ClientError as err:
except ApiException as err:
_LOGGER.error("Error getting daily cost reads: %s", err)
raise
_LOGGER.debug("Got %s daily cost reads", len(daily_cost_reads))
@@ -308,7 +306,7 @@ class OpowerCoordinator(DataUpdateCoordinator[dict[str, Forecast]]):
hourly_cost_reads = await self.api.async_get_cost_reads(
account, AggregateType.HOUR, start, end
)
except aiohttp.ClientError as err:
except ApiException as err:
_LOGGER.error("Error getting hourly cost reads: %s", err)
raise
_LOGGER.debug("Got %s hourly cost reads", len(hourly_cost_reads))
@@ -7,5 +7,5 @@
"documentation": "https://www.home-assistant.io/integrations/opower",
"iot_class": "cloud_polling",
"loggers": ["opower"],
"requirements": ["opower==0.8.8"]
"requirements": ["opower==0.8.9"]
}
+4 -4
View File
@@ -97,7 +97,7 @@ ELEC_SENSORS: tuple[OpowerEntityDescription, ...] = (
device_class=SensorDeviceClass.DATE,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
value_fn=lambda data: data.start_date,
value_fn=lambda data: str(data.start_date),
),
OpowerEntityDescription(
key="elec_end_date",
@@ -105,7 +105,7 @@ ELEC_SENSORS: tuple[OpowerEntityDescription, ...] = (
device_class=SensorDeviceClass.DATE,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
value_fn=lambda data: data.end_date,
value_fn=lambda data: str(data.end_date),
),
)
GAS_SENSORS: tuple[OpowerEntityDescription, ...] = (
@@ -169,7 +169,7 @@ GAS_SENSORS: tuple[OpowerEntityDescription, ...] = (
device_class=SensorDeviceClass.DATE,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
value_fn=lambda data: data.start_date,
value_fn=lambda data: str(data.start_date),
),
OpowerEntityDescription(
key="gas_end_date",
@@ -177,7 +177,7 @@ GAS_SENSORS: tuple[OpowerEntityDescription, ...] = (
device_class=SensorDeviceClass.DATE,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
value_fn=lambda data: data.end_date,
value_fn=lambda data: str(data.end_date),
),
)
@@ -72,7 +72,7 @@ class OralBConfigFlow(ConfigFlow, domain=DOMAIN):
title=self._discovered_devices[address], data={}
)
current_addresses = self._async_current_ids()
current_addresses = self._async_current_ids(include_ignore=False)
for discovery_info in async_discovered_service_info(self.hass, False):
address = discovery_info.address
if address in current_addresses or address in self._discovered_devices:
@@ -27,7 +27,7 @@
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"invalid_auth": "[%key:common::config_flow::error::invalid_auth%]",
"invalid_auth": "Authentication failed. Your API key is invalid or CSRF protection is turned on, preventing authentication.",
"invalid_host": "The provided URL is not a valid host."
}
},
@@ -5,5 +5,5 @@
"documentation": "https://www.home-assistant.io/integrations/python_script",
"loggers": ["RestrictedPython"],
"quality_scale": "internal",
"requirements": ["RestrictedPython==7.4"]
"requirements": ["RestrictedPython==8.0"]
}
@@ -98,7 +98,7 @@ class QingpingConfigFlow(ConfigFlow, domain=DOMAIN):
title=self._discovered_devices[address], data={}
)
current_addresses = self._async_current_ids()
current_addresses = self._async_current_ids(include_ignore=False)
for discovery_info in async_discovered_service_info(self.hass, False):
address = discovery_info.address
if address in current_addresses or address in self._discovered_devices:
@@ -7,7 +7,7 @@
"iot_class": "local_push",
"quality_scale": "internal",
"requirements": [
"SQLAlchemy==2.0.36",
"SQLAlchemy==2.0.37",
"fnv-hash-fast==1.2.2",
"psutil-home-assistant==0.0.1"
]
+24 -10
View File
@@ -28,11 +28,11 @@ from homeassistant.helpers.event import async_call_later
from homeassistant.helpers.typing import ConfigType
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import CONF_USE_HTTPS, DOMAIN
from .const import CONF_SUPPORTS_PRIVACY_MODE, CONF_USE_HTTPS, DOMAIN
from .exceptions import PasswordIncompatible, ReolinkException, UserNotAdmin
from .host import ReolinkHost
from .services import async_setup_services
from .util import ReolinkConfigEntry, ReolinkData, get_device_uid_and_ch
from .util import ReolinkConfigEntry, ReolinkData, get_device_uid_and_ch, get_store
from .views import PlaybackProxyView
_LOGGER = logging.getLogger(__name__)
@@ -67,7 +67,9 @@ async def async_setup_entry(
hass: HomeAssistant, config_entry: ReolinkConfigEntry
) -> bool:
"""Set up Reolink from a config entry."""
host = ReolinkHost(hass, config_entry.data, config_entry.options)
host = ReolinkHost(
hass, config_entry.data, config_entry.options, config_entry.entry_id
)
try:
await host.async_init()
@@ -92,21 +94,25 @@ async def async_setup_entry(
hass.bus.async_listen_once(EVENT_HOMEASSISTANT_STOP, host.stop)
)
# update the port info if needed for the next time
# update the config info if needed for the next time
if (
host.api.port != config_entry.data[CONF_PORT]
or host.api.use_https != config_entry.data[CONF_USE_HTTPS]
or host.api.supported(None, "privacy_mode")
!= config_entry.data.get(CONF_SUPPORTS_PRIVACY_MODE)
):
_LOGGER.warning(
"HTTP(s) port of Reolink %s, changed from %s to %s",
host.api.nvr_name,
config_entry.data[CONF_PORT],
host.api.port,
)
if host.api.port != config_entry.data[CONF_PORT]:
_LOGGER.warning(
"HTTP(s) port of Reolink %s, changed from %s to %s",
host.api.nvr_name,
config_entry.data[CONF_PORT],
host.api.port,
)
data = {
**config_entry.data,
CONF_PORT: host.api.port,
CONF_USE_HTTPS: host.api.use_https,
CONF_SUPPORTS_PRIVACY_MODE: host.api.supported(None, "privacy_mode"),
}
hass.config_entries.async_update_entry(config_entry, data=data)
@@ -248,6 +254,14 @@ async def async_unload_entry(
return await hass.config_entries.async_unload_platforms(config_entry, PLATFORMS)
async def async_remove_entry(
hass: HomeAssistant, config_entry: ReolinkConfigEntry
) -> None:
"""Handle removal of an entry."""
store = get_store(hass, config_entry.entry_id)
await store.async_remove()
async def async_remove_config_entry_device(
hass: HomeAssistant, config_entry: ReolinkConfigEntry, device: dr.DeviceEntry
) -> bool:
@@ -37,7 +37,7 @@ from homeassistant.helpers import config_validation as cv, selector
from homeassistant.helpers.device_registry import format_mac
from homeassistant.helpers.service_info.dhcp import DhcpServiceInfo
from .const import CONF_USE_HTTPS, DOMAIN
from .const import CONF_SUPPORTS_PRIVACY_MODE, CONF_USE_HTTPS, DOMAIN
from .exceptions import (
PasswordIncompatible,
ReolinkException,
@@ -287,6 +287,9 @@ class ReolinkFlowHandler(ConfigFlow, domain=DOMAIN):
if not errors:
user_input[CONF_PORT] = host.api.port
user_input[CONF_USE_HTTPS] = host.api.use_https
user_input[CONF_SUPPORTS_PRIVACY_MODE] = host.api.supported(
None, "privacy_mode"
)
mac_address = format_mac(host.api.mac_address)
await self.async_set_unique_id(mac_address, raise_on_progress=False)
@@ -3,3 +3,4 @@
DOMAIN = "reolink"
CONF_USE_HTTPS = "use_https"
CONF_SUPPORTS_PRIVACY_MODE = "privacy_mode_supported"
+27 -3
View File
@@ -30,15 +30,17 @@ from homeassistant.helpers.device_registry import format_mac
from homeassistant.helpers.dispatcher import async_dispatcher_send
from homeassistant.helpers.event import async_call_later
from homeassistant.helpers.network import NoURLAvailableError, get_url
from homeassistant.helpers.storage import Store
from homeassistant.util.ssl import SSLCipherList
from .const import CONF_USE_HTTPS, DOMAIN
from .const import CONF_SUPPORTS_PRIVACY_MODE, CONF_USE_HTTPS, DOMAIN
from .exceptions import (
PasswordIncompatible,
ReolinkSetupException,
ReolinkWebhookException,
UserNotAdmin,
)
from .util import get_store
DEFAULT_TIMEOUT = 30
FIRST_TCP_PUSH_TIMEOUT = 10
@@ -64,9 +66,12 @@ class ReolinkHost:
hass: HomeAssistant,
config: Mapping[str, Any],
options: Mapping[str, Any],
config_entry_id: str | None = None,
) -> None:
"""Initialize Reolink Host. Could be either NVR, or Camera."""
self._hass: HomeAssistant = hass
self._config_entry_id = config_entry_id
self._config = config
self._unique_id: str = ""
def get_aiohttp_session() -> aiohttp.ClientSession:
@@ -150,6 +155,14 @@ class ReolinkHost:
f"a-z, A-Z, 0-9 or {ALLOWED_SPECIAL_CHARS}"
)
store: Store[str] | None = None
if self._config_entry_id is not None:
store = get_store(self._hass, self._config_entry_id)
if self._config.get(CONF_SUPPORTS_PRIVACY_MODE):
data = await store.async_load()
if data:
self._api.set_raw_host_data(data)
await self._api.get_host_data()
if self._api.mac_address is None:
@@ -161,6 +174,19 @@ class ReolinkHost:
f"'{self._api.user_level}', only admin users can change camera settings"
)
self.privacy_mode = self._api.baichuan.privacy_mode()
if (
store
and self._api.supported(None, "privacy_mode")
and not self.privacy_mode
):
_LOGGER.debug(
"Saving raw host data for next reload in case privacy mode is enabled"
)
data = self._api.get_raw_host_data()
await store.async_save(data)
onvif_supported = self._api.supported(None, "ONVIF")
self._onvif_push_supported = onvif_supported
self._onvif_long_poll_supported = onvif_supported
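When a Reolink device supports privacy mode, the raw host data is saved to a store while the device is reachable and loaded back on a later setup, so setup can proceed even if privacy mode currently blocks the queries. A loose sketch of that cache-and-restore idea using a plain JSON file instead of Home Assistant's Store helper (path and names are illustrative):

import json
from pathlib import Path

CACHE = Path("/tmp/reolink_host_cache.json")  # illustrative path, not the real store

def save_host_data(raw: dict) -> None:
    # Persist the last known host data while the device is fully reachable.
    CACHE.write_text(json.dumps(raw))

def load_host_data() -> dict | None:
    # On a later setup, fall back to the cached data if present, so setup can
    # proceed even when privacy mode prevents querying the device.
    if CACHE.exists():
        return json.loads(CACHE.read_text())
    return None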
@@ -235,8 +261,6 @@ class ReolinkHost:
self._hass, FIRST_TCP_PUSH_TIMEOUT, self._async_check_tcp_push
)
self.privacy_mode = self._api.baichuan.privacy_mode()
ch_list: list[int | None] = [None]
if self._api.is_nvr:
ch_list.extend(self._api.channels)
+12 -2
View File
@@ -4,7 +4,7 @@ from __future__ import annotations
from collections.abc import Awaitable, Callable, Coroutine
from dataclasses import dataclass
from typing import Any
from typing import TYPE_CHECKING, Any
from reolink_aio.exceptions import (
ApiError,
@@ -26,10 +26,15 @@ from homeassistant.components.media_source import Unresolvable
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError, ServiceValidationError
from homeassistant.helpers import device_registry as dr
from homeassistant.helpers.storage import Store
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .const import DOMAIN
from .host import ReolinkHost
if TYPE_CHECKING:
from .host import ReolinkHost
STORAGE_VERSION = 1
type ReolinkConfigEntry = config_entries.ConfigEntry[ReolinkData]
@@ -64,6 +69,11 @@ def get_host(hass: HomeAssistant, config_entry_id: str) -> ReolinkHost:
return config_entry.runtime_data.host
def get_store(hass: HomeAssistant, config_entry_id: str) -> Store[str]:
"""Return the reolink store."""
return Store[str](hass, STORAGE_VERSION, f"{DOMAIN}.{config_entry_id}.json")
def get_device_uid_and_ch(
device: dr.DeviceEntry, host: ReolinkHost
) -> tuple[list[str], int | None, bool]:
+16 -8
View File
@@ -22,7 +22,7 @@ from roborock.version_a01_apis import RoborockMqttClientA01
from roborock.web_api import RoborockApiClient
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_USERNAME
from homeassistant.const import CONF_USERNAME, EVENT_HOMEASSISTANT_STOP
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
@@ -118,13 +118,21 @@ async def async_setup_entry(hass: HomeAssistant, entry: RoborockConfigEntry) ->
)
valid_coordinators = RoborockCoordinators(v1_coords, a01_coords)
async def on_unload() -> None:
release_tasks = set()
for coordinator in valid_coordinators.values():
release_tasks.add(coordinator.release())
await asyncio.gather(*release_tasks)
async def on_stop(_: Any) -> None:
_LOGGER.debug("Shutting down roborock")
await asyncio.gather(
*(
coordinator.async_shutdown()
for coordinator in valid_coordinators.values()
)
)
entry.async_on_unload(on_unload)
entry.async_on_unload(
hass.bus.async_listen_once(
EVENT_HOMEASSISTANT_STOP,
on_stop,
)
)
entry.runtime_data = valid_coordinators
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
@@ -209,7 +217,7 @@ async def setup_device_v1(
try:
await coordinator.async_config_entry_first_refresh()
except ConfigEntryNotReady as ex:
await coordinator.release()
await coordinator.async_shutdown()
if isinstance(coordinator.api, RoborockMqttClientV1):
_LOGGER.warning(
"Not setting up %s because the we failed to get data for the first time using the online client. "
@@ -2,6 +2,7 @@
from __future__ import annotations
import asyncio
from datetime import timedelta
import logging
@@ -116,10 +117,14 @@ class RoborockDataUpdateCoordinator(DataUpdateCoordinator[DeviceProp]):
# Right now this should never be called if the cloud api is the primary api,
# but in the future if it is, a new else should be added.
async def release(self) -> None:
"""Disconnect from API."""
await self.api.async_release()
await self.cloud_api.async_release()
async def async_shutdown(self) -> None:
"""Shutdown the coordinator."""
await super().async_shutdown()
await asyncio.gather(
self.map_storage.flush(),
self.api.async_release(),
self.cloud_api.async_release(),
)
async def _update_device_prop(self) -> None:
"""Update device properties."""
@@ -226,8 +231,9 @@ class RoborockDataUpdateCoordinatorA01(
) -> dict[RoborockDyadDataProtocol | RoborockZeoProtocol, StateType]:
return await self.api.update_values(self.request_protocols)
async def release(self) -> None:
"""Disconnect from API."""
async def async_shutdown(self) -> None:
"""Shutdown the coordinator on config entry unload."""
await super().async_shutdown()
await self.api.async_release()
@cached_property
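The coordinators now release the local and cloud API connections (and flush map storage) concurrently from async_shutdown, which also runs on Home Assistant stop. A standalone sketch of shutting down several awaitable cleanups at once; the client class is a stand-in:

import asyncio

class Client:
    def __init__(self, name: str) -> None:
        self.name = name

    async def async_release(self) -> None:
        print(f"released {self.name}")

async def shutdown(clients: list[Client]) -> None:
    # Release every connection concurrently instead of one after the other.
    await asyncio.gather(*(client.async_release() for client in clients))

asyncio.run(shutdown([Client("mqtt"), Client("cloud")]))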
+3 -7
View File
@@ -157,13 +157,9 @@ class RoborockMap(RoborockCoordinatedEntityV1, ImageEntity):
)
if self.cached_map != content:
self.cached_map = content
self.config_entry.async_create_task(
self.hass,
self.coordinator.map_storage.async_save_map(
self.map_flag,
content,
),
f"{self.unique_id} map",
await self.coordinator.map_storage.async_save_map(
self.map_flag,
content,
)
return self.cached_map
@@ -31,6 +31,7 @@ class RoborockMapStorage:
self._path_prefix = (
_storage_path_prefix(hass, entry_id) / MAPS_PATH / device_id_slug
)
self._write_queue: dict[int, bytes] = {}
async def async_load_map(self, map_flag: int) -> bytes | None:
"""Load maps from disk."""
@@ -48,9 +49,22 @@ class RoborockMapStorage:
return None
async def async_save_map(self, map_flag: int, content: bytes) -> None:
"""Write map if it should be updated."""
filename = self._path_prefix / f"{map_flag}{MAP_FILENAME_SUFFIX}"
await self._hass.async_add_executor_job(self._save_map, filename, content)
"""Save the map to a pending write queue."""
self._write_queue[map_flag] = content
async def flush(self) -> None:
"""Flush all maps to disk."""
_LOGGER.debug("Flushing %s maps to disk", len(self._write_queue))
queue = self._write_queue.copy()
def _flush_all() -> None:
for map_flag, content in queue.items():
filename = self._path_prefix / f"{map_flag}{MAP_FILENAME_SUFFIX}"
self._save_map(filename, content)
await self._hass.async_add_executor_job(_flush_all)
self._write_queue.clear()
def _save_map(self, filename: Path, content: bytes) -> None:
"""Write the map to disk."""
@@ -72,7 +72,7 @@ class SensorPushConfigFlow(ConfigFlow, domain=DOMAIN):
title=self._discovered_devices[address], data={}
)
current_addresses = self._async_current_ids()
current_addresses = self._async_current_ids(include_ignore=False)
for discovery_info in async_discovered_service_info(self.hass, False):
address = discovery_info.address
if address in current_addresses or address in self._discovered_devices:
@@ -21,6 +21,7 @@ async def async_connect_scanner(
hass: HomeAssistant,
coordinator: ShellyRpcCoordinator,
scanner_mode: BLEScannerMode,
device_id: str,
) -> CALLBACK_TYPE:
"""Connect scanner."""
device = coordinator.device
@@ -34,6 +35,7 @@ async def async_connect_scanner(
source_domain=entry.domain,
source_model=coordinator.model,
source_config_entry_id=entry.entry_id,
source_device_id=device_id,
),
scanner.async_setup(),
coordinator.async_subscribe_events(scanner.async_on_event),
@@ -704,8 +704,11 @@ class ShellyRpcCoordinator(ShellyCoordinatorBase[RpcDevice]):
# BLE enable required a reboot, don't bother connecting
# the scanner since it will be disconnected anyway
return
assert self.device_id is not None
self._disconnected_callbacks.append(
await async_connect_scanner(self.hass, self, ble_scanner_mode)
await async_connect_scanner(
self.hass, self, ble_scanner_mode, self.device_id
)
)
@callback
+2 -2
View File
@@ -186,7 +186,7 @@ RPC_NUMBERS: Final = {
mode_fn=lambda config: VIRTUAL_NUMBER_MODE_MAP.get(
config["meta"]["ui"]["view"], NumberMode.BOX
),
step_fn=lambda config: config["meta"]["ui"]["step"],
step_fn=lambda config: config["meta"]["ui"].get("step"),
# If the unit is not set, the device sends an empty string
unit=lambda config: config["meta"]["ui"]["unit"]
if config["meta"]["ui"]["unit"]
@@ -208,7 +208,7 @@ RPC_NUMBERS: Final = {
method_params_fn=lambda idx, value: {
"id": idx,
"method": "Trv.SetPosition",
"params": {"id": 0, "pos": value},
"params": {"id": 0, "pos": int(value)},
},
removal_condition=lambda config, _status, key: config[key].get("enable", True)
is True,
@@ -144,11 +144,15 @@ class SmFirmwareUpdateCoordinator(SmBaseDataUpdateCoordinator[SmFwData]):
async def _internal_update_data(self) -> SmFwData:
"""Fetch data from the SMLIGHT device."""
info = await self.client.get_info()
esp_firmware = None
zb_firmware = None
return SmFwData(
info=info,
esp_firmware=await self.client.get_firmware_version(info.fw_channel),
zb_firmware=await self.client.get_firmware_version(
try:
esp_firmware = await self.client.get_firmware_version(info.fw_channel)
zb_firmware = await self.client.get_firmware_version(
info.fw_channel, device=info.model, mode="zigbee"
),
)
)
except SmlightConnectionError as err:
self.async_set_update_error(err)
return SmFwData(info=info, esp_firmware=esp_firmware, zb_firmware=zb_firmware)
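The firmware coordinator now fetches device info first and treats failures of the firmware-version lookups as soft errors: the update error is recorded, but a result with whatever was fetched is still returned. A minimal sketch of that partial-result pattern; the fetcher callables are placeholders:

from dataclasses import dataclass

@dataclass
class FwData:
    info: dict
    esp_firmware: str | None
    zb_firmware: str | None

def fetch_firmware_data(get_info, get_version, report_error) -> FwData:
    info = get_info()  # a failure here is still fatal
    esp = zb = None
    try:
        esp = get_version("esp")
        zb = get_version("zigbee")
    except ConnectionError as err:
        # Record the error for the update entities, but keep the device info.
        report_error(err)
    return FwData(info=info, esp_firmware=esp, zb_firmware=zb)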
+7 -3
View File
@@ -34,7 +34,11 @@ from homeassistant.helpers import (
)
from homeassistant.helpers.dispatcher import async_dispatcher_send
from homeassistant.helpers.event import async_call_later, async_track_time_interval
from homeassistant.helpers.service_info.ssdp import SsdpServiceInfo
from homeassistant.helpers.service_info.ssdp import (
ATTR_UPNP_MODEL_NAME,
ATTR_UPNP_UDN,
SsdpServiceInfo,
)
from homeassistant.helpers.typing import ConfigType
from homeassistant.util.async_ import create_eager_task
@@ -503,7 +507,7 @@ class SonosDiscoveryManager:
def _async_ssdp_discovered_player(
self, info: SsdpServiceInfo, change: ssdp.SsdpChange
) -> None:
uid = info.upnp[ssdp.ATTR_UPNP_UDN]
uid = info.upnp[ATTR_UPNP_UDN]
if not uid.startswith("uuid:RINCON_"):
return
uid = uid[5:]
@@ -522,7 +526,7 @@ class SonosDiscoveryManager:
cast(str, urlparse(info.ssdp_location).hostname),
uid,
info.ssdp_headers.get("X-RINCON-BOOTSEQ"),
cast(str, info.upnp.get(ssdp.ATTR_UPNP_MODEL_NAME)),
cast(str, info.upnp.get(ATTR_UPNP_MODEL_NAME)),
None,
)
+1 -1
View File
@@ -5,5 +5,5 @@
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/sql",
"iot_class": "local_polling",
"requirements": ["SQLAlchemy==2.0.36", "sqlparse==0.5.0"]
"requirements": ["SQLAlchemy==2.0.37", "sqlparse==0.5.0"]
}
@@ -10,7 +10,7 @@
"departure2": {
"default": "mdi:bus-clock"
},
"duration": {
"trip_duration": {
"default": "mdi:timeline-clock"
},
"transfers": {
@@ -56,8 +56,10 @@ SENSORS: tuple[SwissPublicTransportSensorEntityDescription, ...] = (
],
SwissPublicTransportSensorEntityDescription(
key="duration",
translation_key="trip_duration",
device_class=SensorDeviceClass.DURATION,
native_unit_of_measurement=UnitOfTime.SECONDS,
suggested_unit_of_measurement=UnitOfTime.HOURS,
value_fn=lambda data_connection: data_connection["duration"],
),
SwissPublicTransportSensorEntityDescription(
@@ -2,7 +2,7 @@
"config": {
"error": {
"cannot_connect": "Cannot connect to server",
"bad_config": "Request failed due to bad config: Check at [stationboard]({stationboard_url}) if your station names are valid",
"bad_config": "Request failed due to bad config: Check the stationboard linked above if your station names are valid",
"too_many_via_stations": "Too many via stations, only up to 5 via stations are allowed per connection.",
"unknown": "An unknown error was raised by python-opendata-transport"
},
@@ -28,7 +28,7 @@
"time_station": "Usually the departure time of a connection when it leaves the start station is tracked. Alternatively, track the time when the connection arrives at its end station.",
"time_mode": "Time mode lets you change the departure timing and fix it to a specific time (e.g. 7:12:00 AM every morning) or add a moving offset (e.g. +00:05:00 taking into account the time to walk to the station)."
},
"description": "Provide start and end station for your connection,\nand optionally up to 5 via stations.\n\nCheck the [stationboard]({stationboard_url}) for valid stations.",
"description": "Provide start and end station for your connection, and optionally up to 5 via stations.\n\nCheck the [stationboard]({stationboard_url}) for valid stations.",
"title": "Swiss Public Transport"
},
"time_fixed": {
@@ -64,8 +64,8 @@
"departure2": {
"name": "Departure +2"
},
"duration": {
"name": "Duration"
"trip_duration": {
"name": "Trip duration"
},
"transfers": {
"name": "Transfers"
@@ -272,7 +272,7 @@ class SwitchbotConfigFlow(ConfigFlow, domain=DOMAIN):
@callback
def _async_discover_devices(self) -> None:
current_addresses = self._async_current_ids()
current_addresses = self._async_current_ids(include_ignore=False)
for connectable in (True, False):
for discovery_info in async_discovered_service_info(self.hass, connectable):
address = discovery_info.address
@@ -70,7 +70,13 @@
"data": {
"scan_interval": "Minutes between scans",
"timeout": "Timeout (seconds)",
"snap_profile_type": "Quality level of camera snapshots (0:high 1:medium 2:low)"
"snap_profile_type": "Quality level of camera snapshots (0:high 1:medium 2:low)",
"backup_share": "[%key:component::synology_dsm::config::step::backup_share::data::backup_share%]",
"backup_path": "[%key:component::synology_dsm::config::step::backup_share::data::backup_path%]"
},
"data_description": {
"backup_share": "[%key:component::synology_dsm::config::step::backup_share::data_description::backup_share%]",
"backup_path": "[%key:component::synology_dsm::config::step::backup_share::data_description::backup_path%]"
}
}
}
@@ -72,7 +72,7 @@ class ThermoProConfigFlow(ConfigFlow, domain=DOMAIN):
title=self._discovered_devices[address], data={}
)
current_addresses = self._async_current_ids()
current_addresses = self._async_current_ids(include_ignore=False)
for discovery_info in async_discovered_service_info(self.hass, False):
address = discovery_info.address
if address in current_addresses or address in self._discovered_devices:
@@ -72,7 +72,7 @@ class TiltConfigFlow(ConfigFlow, domain=DOMAIN):
title=self._discovered_devices[address], data={}
)
current_addresses = self._async_current_ids()
current_addresses = self._async_current_ids(include_ignore=False)
for discovery_info in async_discovered_service_info(self.hass, False):
address = discovery_info.address
if address in current_addresses or address in self._discovered_devices:
@@ -73,7 +73,7 @@ class TotalConnectAlarm(TotalConnectLocationEntity, AlarmControlPanelEntity):
) -> None:
"""Initialize the TotalConnect status."""
super().__init__(coordinator, location)
self._partition_id = partition_id
self._partition_id = int(partition_id)
self._partition = self._location.partitions[partition_id]
"""
@@ -81,7 +81,7 @@ class TotalConnectAlarm(TotalConnectLocationEntity, AlarmControlPanelEntity):
for most users with new support for partitions.
Add _# for partition 2 and beyond.
"""
if partition_id == 1:
if int(partition_id) == 1:
self._attr_name = None
self._attr_unique_id = str(location.location_id)
else:
@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/totalconnect",
"iot_class": "cloud_polling",
"loggers": ["total_connect_client"],
"requirements": ["total-connect-client==2024.12"]
"requirements": ["total-connect-client==2025.1.4"]
}

Some files were not shown because too many files have changed in this diff.