Compare commits

..

94 Commits

Author SHA1 Message Date
Franck Nijhof
3e8923f105 2026.2.0 (#162224) 2026-02-04 20:35:11 +01:00
Franck Nijhof
17cca3e69d Bump version to 2026.2.0 2026-02-04 18:53:49 +00:00
Franck Nijhof
12714c489f Bump version to 2026.2.0b5 2026-02-04 18:45:36 +00:00
Robert Resch
f788d61b4a Revert "Bump intents (#162205)" (#162226) 2026-02-04 18:36:12 +00:00
Simone Chemelli
5c726af00b Fix logic and tests for Alexa Devices utils module (#162223) 2026-02-04 18:36:10 +00:00
Joost Lekkerkerker
d1d207fbb2 Add guard for Apple TV text focus state (#162207) 2026-02-04 18:36:09 +00:00
David Bonnes
6c7f8df7f7 Fix evohome not updating scheduled setpoints in state attrs (#162043) 2026-02-04 18:36:07 +00:00
Kevin Stillhammer
6f8c9b1504 Bump fressnapftracker to 0.2.2 (#161913) 2026-02-04 18:36:06 +00:00
Kevin Stillhammer
4f9aedbc84 Filter out invalid trackers in fressnapf_tracker (#161670) 2026-02-04 18:36:04 +00:00
Franck Nijhof
52fb0343e4 Bump version to 2026.2.0b4 2026-02-04 16:14:23 +00:00
Bram Kragten
1050b4580a Update frontend to 20260128.6 (#162214) 2026-02-04 16:10:08 +00:00
Åke Strandberg
344c42172e Add missing codes for Miele coffe systems (#162206) 2026-02-04 16:10:06 +00:00
Michael Hansen
93cc0fd7f1 Bump intents (#162205) 2026-02-04 16:10:05 +00:00
andreimoraru
05fe636b55 Bump yt-dlp to 2026.02.04 (#162204)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-04 16:10:03 +00:00
Marc Mueller
f22467d099 Pin auth0-python to <5.0 (#162203) 2026-02-04 16:10:01 +00:00
TheJulianJES
4bc3899b32 Bump ZHA to 0.0.89 (#162195) 2026-02-04 16:10:00 +00:00
Oliver
fc4d6bf5f1 Bump denonavr to 1.3.1 (#162183) 2026-02-04 16:09:58 +00:00
johanzander
8ed0672a8f Bump growattServer to 1.9.0 (#162179)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-04 16:09:57 +00:00
Norbert Rittel
282e347a1b Clarify action descriptions in media_player (#162172) 2026-02-04 16:09:55 +00:00
Erik Montnemery
1bfb02b440 Bump python-otbr-api to 2.8.0 (#162167) 2026-02-04 16:09:54 +00:00
Przemko92
71b03bd9ae Bump compit-inext-api to 0.8.0 (#162166) 2026-02-04 16:09:52 +00:00
Przemko92
cbd69822eb Update compit-inext-api to 0.7.0 (#162020) 2026-02-04 16:09:51 +00:00
Denis Shulyaka
db900f4dd2 Anthropic repair deprecated models (#162162)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-04 16:00:14 +00:00
Jonathan Bangert
a707e695bc Bump bleak-esphome to 3.6.0 (#162028) 2026-02-04 16:00:12 +00:00
Liquidmasl
4feceac205 Jellyfin native client controls (#161982) 2026-02-04 16:00:11 +00:00
Petro31
10c20faaca Fix template weather humidity (#161945) 2026-02-04 16:00:09 +00:00
Robert Svensson
abcd512401 Add missing OUI to Axis integration, discovery would abort with unsup… (#161943) 2026-02-04 16:00:07 +00:00
Bram Kragten
fdf8edf474 Bump version to 2026.2.0b3 2026-02-03 18:03:54 +01:00
Bram Kragten
47e1a98bee Update frontend to 20260128.5 (#162156) 2026-02-03 18:03:04 +01:00
Joost Lekkerkerker
2d8572b943 Add Heatit virtual brand (#162155) 2026-02-03 18:03:02 +01:00
Joost Lekkerkerker
660cfdbd50 Add Heiman virtual brand (#162152) 2026-02-03 18:03:00 +01:00
Steven Travers
4208595da6 Modify Analytics text on feature labs (#162151) 2026-02-03 18:02:59 +01:00
Paul Bottein
b6b2d2fc6f Update title and description of YAML dashboard repair (#162138) 2026-02-03 18:02:58 +01:00
victorigualada
6c4c632848 Handle chat log attachments in Cloud integration (#162121) 2026-02-03 18:02:57 +01:00
Shay Levy
78cf62176f Fix Shelly xpercent sensor state_class (#162107) 2026-02-03 18:02:56 +01:00
Denis Shulyaka
df971c7a42 Anthropic: Switch default model to Haiku 4.5 (#162093) 2026-02-03 18:02:55 +01:00
mezz64
1fcabb7f2d Bump pyhik to 0.4.2 (#162092)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-03 18:02:53 +01:00
Åke Strandberg
9fb60c9ea2 Update Senz temperature sensor (#162016) 2026-02-03 18:02:52 +01:00
J. Diego Rodríguez Royo
9c11a4646f Remove coffee machine's hot water sensor's state class at Home Connect (#161246) 2026-02-03 17:58:47 +01:00
jameson_uk
b036a78776 Remove invalid notification sensors for Alexa devices (#160422)
Co-authored-by: Simone Chemelli <simone.chemelli@gmail.com>
2026-02-03 17:58:45 +01:00
Kamil Breguła
60bb3cb704 Handle missing battery stats in systemmonitor (#158287)
Co-authored-by: mik-laj <12058428+mik-laj@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-02-03 17:58:43 +01:00
Bram Kragten
0e770958ac Bump version to 2026.2.0b2 2026-02-02 19:12:33 +01:00
Bram Kragten
2a54c71b6c Update frontend to 20260128.4 (#162096) 2026-02-02 19:11:59 +01:00
Steven Travers
50463291ab Add learn more data for Analytics in labs (#162094) 2026-02-02 19:11:59 +01:00
Andrea Turri
43cc34042a Fix Miele dishwasher PowerDisk filling level sensor not showing up (#162048) 2026-02-02 19:11:58 +01:00
Jan Bouwhuis
a02244ccda Bump incomfort-client to 0.6.12 (#162037) 2026-02-02 19:11:57 +01:00
Adrián Moreno
a739619121 Bump pymeteoclimatic to 0.1.1 (#162029) 2026-02-02 19:11:56 +01:00
Åke Strandberg
5db97a5f1c Improved error checking during startup of SENZ (#162026) 2026-02-02 19:11:54 +01:00
Josef Zweck
804ba9c9cc Remove file description dependency in onedrive (#162012) 2026-02-02 19:11:53 +01:00
Filip Bårdsnes Tomren
5ecbcea946 Update ical requirement version to 12.1.3 (#162010) 2026-02-02 19:11:52 +01:00
hanwg
11be2b6289 Fix parse_mode for Telegram bot actions (#162006) 2026-02-02 19:11:51 +01:00
cdnninja
eefae0307b Add integration type of hub to vesync (#162004) 2026-02-02 19:11:50 +01:00
Matthias Alphart
d397ee28ea Fix KNX fan unique_id for switch-only fans (#162002) 2026-02-02 19:11:49 +01:00
starkillerOG
02c821128e Bump reolink-aio to 0.18.2 (#161998) 2026-02-02 19:11:48 +01:00
Shay Levy
71dc15d45f Fix Shelly CoIoT repair issue (#161973)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-02 19:11:47 +01:00
Raphael Hehl
1078387b22 Bump uiprotect to version 10.1.0 (#161967)
Co-authored-by: RaHehl <rahehl@users.noreply.github.com>
2026-02-02 19:11:46 +01:00
tronikos
35fab27d15 Bump opower to 0.17.0 (#161962) 2026-02-02 19:11:45 +01:00
Yuxin Wang
915dc7a908 Mark datetime sensors as unknown when parsing fails (#161952) 2026-02-02 19:11:44 +01:00
mvn23
e5a9738983 Fix OpenTherm Gateway button availability (#161933) 2026-02-02 19:11:43 +01:00
mvn23
2ff73219a2 Bump pyotgw to 2.2.3 (#161928) 2026-02-02 19:11:42 +01:00
epenet
5dc1270ed1 Fix mired warning in template light (#161923) 2026-02-02 19:11:41 +01:00
J. Diego Rodríguez Royo
9e95ad5a85 Restore the Home Connect program option entities (#156401)
Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2026-02-02 19:11:40 +01:00
Franck Nijhof
9a5d4610f7 Bump version to 2026.2.0b1 2026-01-30 11:45:08 +00:00
Paul Bottein
41c524fce4 Update frontend to 20260128.3 (#161918) 2026-01-30 11:44:54 +00:00
David Recordon
5f9fa95554 Fix Control4 HVAC state-to-action mapping (#161916) 2026-01-30 11:44:51 +00:00
Simone Chemelli
6950be8ea9 Handle hostname resolution for Shelly repair issue (#161914) 2026-01-30 11:44:47 +00:00
puddly
c5a8bf64d0 Bump ZHA to 0.0.88 (#161904) 2026-01-30 11:44:44 +00:00
hanwg
a2b9a6e9df Update translations for Telegram bot (#161903) 2026-01-30 11:44:43 +00:00
Marc Mueller
a0c567f0da Update fritzconnection to 1.15.1 (#161887) 2026-01-30 11:44:40 +00:00
Bram Kragten
c7feafdde6 Update frontend to 20260128.2 (#161881) 2026-01-30 11:44:38 +00:00
Björn Dalfors
e1e74b0aeb Bump nibe to 2.22.0 (#161873) 2026-01-30 11:44:36 +00:00
Sebastiaan Speck
673411ef97 Bump renault-api to 0.5.3 (#161857) 2026-01-30 11:44:34 +00:00
epenet
f7e5af7cb1 Fix incorrect entity_description class in radarr (#161856) 2026-01-30 11:44:32 +00:00
Norbert Rittel
0ee56ce708 Fix action descriptions of alarm_control_panel (#161852) 2026-01-30 11:44:30 +00:00
Manu
f93a176398 Fix string in Namecheap DynamicDNS integration (#161821) 2026-01-30 11:44:28 +00:00
Paul Bottein
cd2394bc12 Allow lovelace path for dashboard in yaml and fix yaml dashboard migration (#161816) 2026-01-30 11:44:26 +00:00
Michael Hansen
5c20b8eaff Bump intents to 2026.1.28 (#161813) 2026-01-30 11:44:25 +00:00
Aaron Godfrey
4bd499d3a6 Update todoist-api-python to 3.1.0 (#161811)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-01-30 11:44:23 +00:00
Jan Bouwhuis
8a53b94c5a Fix use of ambiguous units for reactive power and energy (#161810) 2026-01-30 11:44:20 +00:00
victorigualada
d5aff326e3 Use OpenAI schema dataclasses for cloud stream responses (#161663) 2026-01-30 11:44:18 +00:00
Gage Benne
22f66abbe7 Bump pydexcom to 0.5.1 (#161549) 2026-01-30 11:44:16 +00:00
Mattia Monga
f635228b1f Make viaggiatreno work by fixing some critical bugs (#160093)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-01-30 11:44:14 +00:00
Artur Pragacz
4c708c143d Fix validation of actions config in intent_script (#158266) 2026-01-30 11:44:12 +00:00
Franck Nijhof
3369459d41 Bump version to 2026.2.0b0 2026-01-28 20:00:19 +00:00
Jan Čermák
8536472fe9 Rename add-ons to apps in hassio integration (#161801)
Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2026-01-28 20:46:34 +01:00
Erwin Douna
ad4fda7bb4 Analytics refactor to apps (#161784)
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-01-28 20:13:04 +01:00
Brett Adams
36e1b86952 Add missing data description string in Tesla Fleet (#161201)
Co-authored-by: Josef Zweck <josef@zweck.dev>
Co-authored-by: Robert Resch <robert@resch.dev>
2026-01-28 20:12:01 +01:00
Raphael Hehl
0c9834e4ca Exclude AI Port from camera entities and RTSP issues (#161188)
Co-authored-by: RaHehl <rahehl@users.noreply.github.com>
2026-01-28 19:54:15 +01:00
epenet
360af74519 Improve min/max kelvin handling in hue_ble (#161782) 2026-01-28 19:53:57 +01:00
epenet
d099ac457d Improve use of SensorEntityDescription in solax (#161687) 2026-01-28 19:50:18 +01:00
Joakim Plate
fc330ce165 Let nibe library autodetect word swap on config (#161786) 2026-01-28 19:42:36 +01:00
Chris
b52dd5fc05 Add number platform to openevse (#161726)
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-01-28 19:40:36 +01:00
Tom Matheussen
b517ce132f Don't attempt to verify ignored Doorbird devices during discovery (#161776) 2026-01-28 19:40:02 +01:00
puddly
acec35846c Bump ZHA to 0.0.87 (#161733) 2026-01-28 19:39:13 +01:00
237 changed files with 5523 additions and 1516 deletions

View File

@@ -0,0 +1,5 @@
{
"domain": "heatit",
"name": "Heatit",
"iot_standards": ["zwave"]
}

View File

@@ -0,0 +1,5 @@
{
"domain": "heiman",
"name": "Heiman",
"iot_standards": ["matter", "zigbee"]
}

View File

@@ -166,7 +166,7 @@
},
"services": {
"alarm_arm_away": {
"description": "Arms the alarm in the away mode.",
"description": "Arms an alarm in the away mode.",
"fields": {
"code": {
"description": "[%key:component::alarm_control_panel::services::alarm_arm_custom_bypass::fields::code::description%]",
@@ -176,7 +176,7 @@
"name": "Arm away"
},
"alarm_arm_custom_bypass": {
"description": "Arms the alarm while allowing to bypass a custom area.",
"description": "Arms an alarm while allowing to bypass a custom area.",
"fields": {
"code": {
"description": "Code to arm the alarm.",
@@ -186,7 +186,7 @@
"name": "Arm with custom bypass"
},
"alarm_arm_home": {
"description": "Arms the alarm in the home mode.",
"description": "Arms an alarm in the home mode.",
"fields": {
"code": {
"description": "[%key:component::alarm_control_panel::services::alarm_arm_custom_bypass::fields::code::description%]",
@@ -196,7 +196,7 @@
"name": "Arm home"
},
"alarm_arm_night": {
"description": "Arms the alarm in the night mode.",
"description": "Arms an alarm in the night mode.",
"fields": {
"code": {
"description": "[%key:component::alarm_control_panel::services::alarm_arm_custom_bypass::fields::code::description%]",
@@ -206,7 +206,7 @@
"name": "Arm night"
},
"alarm_arm_vacation": {
"description": "Arms the alarm in the vacation mode.",
"description": "Arms an alarm in the vacation mode.",
"fields": {
"code": {
"description": "[%key:component::alarm_control_panel::services::alarm_arm_custom_bypass::fields::code::description%]",
@@ -216,7 +216,7 @@
"name": "Arm vacation"
},
"alarm_disarm": {
"description": "Disarms the alarm.",
"description": "Disarms an alarm.",
"fields": {
"code": {
"description": "Code to disarm the alarm.",
@@ -226,7 +226,7 @@
"name": "Disarm"
},
"alarm_trigger": {
"description": "Triggers the alarm manually.",
"description": "Triggers an alarm manually.",
"fields": {
"code": {
"description": "[%key:component::alarm_control_panel::services::alarm_arm_custom_bypass::fields::code::description%]",

View File

@@ -8,5 +8,5 @@
"iot_class": "cloud_polling",
"loggers": ["aioamazondevices"],
"quality_scale": "platinum",
"requirements": ["aioamazondevices==11.0.2"]
"requirements": ["aioamazondevices==11.1.1"]
}

View File

@@ -28,6 +28,7 @@ from homeassistant.helpers.typing import StateType
from .const import CATEGORY_NOTIFICATIONS, CATEGORY_SENSORS
from .coordinator import AmazonConfigEntry
from .entity import AmazonEntity
from .utils import async_remove_unsupported_notification_sensors
# Coordinator is used to centralize the data updates
PARALLEL_UPDATES = 0
@@ -105,6 +106,9 @@ async def async_setup_entry(
coordinator = entry.runtime_data
# Remove notification sensors from unsupported devices
await async_remove_unsupported_notification_sensors(hass, coordinator)
known_devices: set[str] = set()
def _check_device() -> None:
@@ -122,6 +126,7 @@ async def async_setup_entry(
AmazonSensorEntity(coordinator, serial_num, notification_desc)
for notification_desc in NOTIFICATIONS
for serial_num in new_devices
if coordinator.data[serial_num].notifications_supported
]
async_add_entities(sensors_list + notifications_list)

View File

@@ -59,13 +59,15 @@ async def async_setup_entry(
coordinator = entry.runtime_data
# Replace unique id for "DND" switch and remove from Speaker Group
await async_update_unique_id(
hass, coordinator, SWITCH_DOMAIN, "do_not_disturb", "dnd"
)
# DND keys
old_key = "do_not_disturb"
new_key = "dnd"
# Remove DND switch from virtual groups
await async_remove_dnd_from_virtual_group(hass, coordinator)
# Remove old DND switch from virtual groups
await async_remove_dnd_from_virtual_group(hass, coordinator, old_key)
# Replace unique id for DND switch
await async_update_unique_id(hass, coordinator, SWITCH_DOMAIN, old_key, new_key)
known_devices: set[str] = set()

View File

@@ -5,8 +5,14 @@ from functools import wraps
from typing import Any, Concatenate
from aioamazondevices.const.devices import SPEAKER_GROUP_FAMILY
from aioamazondevices.const.schedules import (
NOTIFICATION_ALARM,
NOTIFICATION_REMINDER,
NOTIFICATION_TIMER,
)
from aioamazondevices.exceptions import CannotConnect, CannotRetrieveData
from homeassistant.components.sensor import DOMAIN as SENSOR_DOMAIN
from homeassistant.components.switch import DOMAIN as SWITCH_DOMAIN
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
@@ -48,7 +54,7 @@ def alexa_api_call[_T: AmazonEntity, **_P](
async def async_update_unique_id(
hass: HomeAssistant,
coordinator: AmazonDevicesCoordinator,
domain: str,
platform: str,
old_key: str,
new_key: str,
) -> None:
@@ -57,7 +63,9 @@ async def async_update_unique_id(
for serial_num in coordinator.data:
unique_id = f"{serial_num}-{old_key}"
if entity_id := entity_registry.async_get_entity_id(domain, DOMAIN, unique_id):
if entity_id := entity_registry.async_get_entity_id(
DOMAIN, platform, unique_id
):
_LOGGER.debug("Updating unique_id for %s", entity_id)
new_unique_id = unique_id.replace(old_key, new_key)
@@ -68,12 +76,13 @@ async def async_update_unique_id(
async def async_remove_dnd_from_virtual_group(
hass: HomeAssistant,
coordinator: AmazonDevicesCoordinator,
key: str,
) -> None:
"""Remove entity DND from virtual group."""
entity_registry = er.async_get(hass)
for serial_num in coordinator.data:
unique_id = f"{serial_num}-do_not_disturb"
unique_id = f"{serial_num}-{key}"
entity_id = entity_registry.async_get_entity_id(
DOMAIN, SWITCH_DOMAIN, unique_id
)
@@ -81,3 +90,27 @@ async def async_remove_dnd_from_virtual_group(
if entity_id and is_group:
entity_registry.async_remove(entity_id)
_LOGGER.debug("Removed DND switch from virtual group %s", entity_id)
async def async_remove_unsupported_notification_sensors(
hass: HomeAssistant,
coordinator: AmazonDevicesCoordinator,
) -> None:
"""Remove notification sensors from unsupported devices."""
entity_registry = er.async_get(hass)
for serial_num in coordinator.data:
for notification_key in (
NOTIFICATION_ALARM,
NOTIFICATION_REMINDER,
NOTIFICATION_TIMER,
):
unique_id = f"{serial_num}-{notification_key}"
entity_id = entity_registry.async_get_entity_id(
DOMAIN, SENSOR_DOMAIN, unique_id=unique_id
)
is_unsupported = not coordinator.data[serial_num].notifications_supported
if entity_id and is_unsupported:
entity_registry.async_remove(entity_id)
_LOGGER.debug("Removed unsupported notification sensor %s", entity_id)

View File

@@ -10,6 +10,7 @@
"preview_features": {
"snapshots": {
"feedback_url": "https://forms.gle/GqvRmgmghSDco8M46",
"learn_more_url": "https://www.home-assistant.io/blog/2026/02/02/about-device-database/",
"report_issue_url": "https://github.com/OHF-Device-Database/device-database/issues/new"
}
},

View File

@@ -1,7 +1,7 @@
{
"preview_features": {
"snapshots": {
"description": "This free, open source device database of the Open Home Foundation helps users find useful information about smart home devices used in real installations.\n\nYou can help build it by anonymously sharing data about your devices. Only device-specific details (like model or manufacturer) are shared — never personally identifying information (like the names you assign).\n\nLearn more about the device database and how we process your data in our [Data Use Statement](https://www.openhomefoundation.org/device-database-data-use-statement), which you accept by opting in.",
"description": "We're creating the [Open Home Foundation Device Database](https://www.home-assistant.io/blog/2026/02/02/about-device-database/): a free, open source community-powered resource to help users find practical information about how smart home devices perform in real installations.\n\nYou can help us build it by opting in to share anonymized data about your devices. This data will only ever include device-specific details (like model or manufacturer) never personally identifying information (like the names you assign).\n\nFind out how we process your data (should you choose to contribute) in our [Data Use Statement](https://www.openhomefoundation.org/device-database-data-use-statement).",
"disable_confirmation": "Your data will no longer be shared with the Open Home Foundation's device database.",
"enable_confirmation": "This feature is still in development and may change. The device database is being refined based on user feedback and is not yet complete.",
"name": "Device database"

View File

@@ -13,9 +13,10 @@ from homeassistant.config_entries import ConfigEntry
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.helpers import entity_registry as er
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .const import CONF_TRACKED_INTEGRATIONS
from .const import CONF_TRACKED_APPS, CONF_TRACKED_INTEGRATIONS
from .coordinator import HomeassistantAnalyticsDataUpdateCoordinator
PLATFORMS: list[Platform] = [Platform.SENSOR]
@@ -59,6 +60,30 @@ async def async_setup_entry(
return True
async def async_migrate_entry(
hass: HomeAssistant, entry: AnalyticsInsightsConfigEntry
) -> bool:
"""Migrate to a new version."""
# Migration for switching add-ons to apps
if entry.version < 2:
ent_reg = er.async_get(hass)
for entity_entry in er.async_entries_for_config_entry(ent_reg, entry.entry_id):
if not entity_entry.unique_id.startswith("addon_"):
continue
ent_reg.async_update_entity(
entity_entry.entity_id,
new_unique_id=entity_entry.unique_id.replace("addon_", "app_"),
)
options = dict(entry.options)
options[CONF_TRACKED_APPS] = options.pop("tracked_addons", [])
hass.config_entries.async_update_entry(entry, version=2, options=options)
return True
async def async_unload_entry(
hass: HomeAssistant, entry: AnalyticsInsightsConfigEntry
) -> bool:

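Context for the analytics_insights migration above: async_migrate_entry runs during entry setup whenever the stored config entry version is below the config flow's new VERSION = 2 (bumped in the config_flow hunk further down). The unique_id rewrite it performs is a plain prefix swap; a tiny standalone illustration, using a made-up slug in the "addon_<slug>_active_installations" key format this integration's sensors use:

# Made-up example unique_id; only the "addon_" -> "app_" prefix swap is real.
old_unique_id = "addon_core_mosquitto_active_installations"
new_unique_id = old_unique_id.replace("addon_", "app_")
print(new_unique_id)  # app_core_mosquitto_active_installations
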
View File

@@ -26,7 +26,7 @@ from homeassistant.helpers.selector import (
from . import AnalyticsInsightsConfigEntry
from .const import (
CONF_TRACKED_ADDONS,
CONF_TRACKED_APPS,
CONF_TRACKED_CUSTOM_INTEGRATIONS,
CONF_TRACKED_INTEGRATIONS,
DOMAIN,
@@ -43,6 +43,8 @@ INTEGRATION_TYPES_WITHOUT_ANALYTICS = (
class HomeassistantAnalyticsConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle a config flow for Homeassistant Analytics."""
VERSION = 2
@staticmethod
@callback
def async_get_options_flow(
@@ -59,7 +61,7 @@ class HomeassistantAnalyticsConfigFlow(ConfigFlow, domain=DOMAIN):
if user_input is not None:
if all(
[
not user_input.get(CONF_TRACKED_ADDONS),
not user_input.get(CONF_TRACKED_APPS),
not user_input.get(CONF_TRACKED_INTEGRATIONS),
not user_input.get(CONF_TRACKED_CUSTOM_INTEGRATIONS),
]
@@ -70,7 +72,7 @@ class HomeassistantAnalyticsConfigFlow(ConfigFlow, domain=DOMAIN):
title="Home Assistant Analytics Insights",
data={},
options={
CONF_TRACKED_ADDONS: user_input.get(CONF_TRACKED_ADDONS, []),
CONF_TRACKED_APPS: user_input.get(CONF_TRACKED_APPS, []),
CONF_TRACKED_INTEGRATIONS: user_input.get(
CONF_TRACKED_INTEGRATIONS, []
),
@@ -84,7 +86,7 @@ class HomeassistantAnalyticsConfigFlow(ConfigFlow, domain=DOMAIN):
session=async_get_clientsession(self.hass)
)
try:
addons = await client.get_addons()
apps = await client.get_addons()
integrations = await client.get_integrations(Environment.NEXT)
custom_integrations = await client.get_custom_integrations()
except HomeassistantAnalyticsConnectionError:
@@ -107,9 +109,9 @@ class HomeassistantAnalyticsConfigFlow(ConfigFlow, domain=DOMAIN):
errors=errors,
data_schema=vol.Schema(
{
vol.Optional(CONF_TRACKED_ADDONS): SelectSelector(
vol.Optional(CONF_TRACKED_APPS): SelectSelector(
SelectSelectorConfig(
options=list(addons),
options=list(apps),
multiple=True,
sort=True,
)
@@ -144,7 +146,7 @@ class HomeassistantAnalyticsOptionsFlowHandler(OptionsFlowWithReload):
if user_input is not None:
if all(
[
not user_input.get(CONF_TRACKED_ADDONS),
not user_input.get(CONF_TRACKED_APPS),
not user_input.get(CONF_TRACKED_INTEGRATIONS),
not user_input.get(CONF_TRACKED_CUSTOM_INTEGRATIONS),
]
@@ -154,7 +156,7 @@ class HomeassistantAnalyticsOptionsFlowHandler(OptionsFlowWithReload):
return self.async_create_entry(
title="",
data={
CONF_TRACKED_ADDONS: user_input.get(CONF_TRACKED_ADDONS, []),
CONF_TRACKED_APPS: user_input.get(CONF_TRACKED_APPS, []),
CONF_TRACKED_INTEGRATIONS: user_input.get(
CONF_TRACKED_INTEGRATIONS, []
),
@@ -168,7 +170,7 @@ class HomeassistantAnalyticsOptionsFlowHandler(OptionsFlowWithReload):
session=async_get_clientsession(self.hass)
)
try:
addons = await client.get_addons()
apps = await client.get_addons()
integrations = await client.get_integrations(Environment.NEXT)
custom_integrations = await client.get_custom_integrations()
except HomeassistantAnalyticsConnectionError:
@@ -189,9 +191,9 @@ class HomeassistantAnalyticsOptionsFlowHandler(OptionsFlowWithReload):
data_schema=self.add_suggested_values_to_schema(
vol.Schema(
{
vol.Optional(CONF_TRACKED_ADDONS): SelectSelector(
vol.Optional(CONF_TRACKED_APPS): SelectSelector(
SelectSelectorConfig(
options=list(addons),
options=list(apps),
multiple=True,
sort=True,
)

View File

@@ -4,7 +4,7 @@ import logging
DOMAIN = "analytics_insights"
CONF_TRACKED_ADDONS = "tracked_addons"
CONF_TRACKED_APPS = "tracked_apps"
CONF_TRACKED_INTEGRATIONS = "tracked_integrations"
CONF_TRACKED_CUSTOM_INTEGRATIONS = "tracked_custom_integrations"

View File

@@ -18,7 +18,7 @@ from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import (
CONF_TRACKED_ADDONS,
CONF_TRACKED_APPS,
CONF_TRACKED_CUSTOM_INTEGRATIONS,
CONF_TRACKED_INTEGRATIONS,
DOMAIN,
@@ -35,7 +35,7 @@ class AnalyticsData:
active_installations: int
reports_integrations: int
addons: dict[str, int]
apps: dict[str, int]
core_integrations: dict[str, int]
custom_integrations: dict[str, int]
@@ -60,7 +60,7 @@ class HomeassistantAnalyticsDataUpdateCoordinator(DataUpdateCoordinator[Analytic
update_interval=timedelta(hours=12),
)
self._client = client
self._tracked_addons = self.config_entry.options.get(CONF_TRACKED_ADDONS, [])
self._tracked_apps = self.config_entry.options.get(CONF_TRACKED_APPS, [])
self._tracked_integrations = self.config_entry.options[
CONF_TRACKED_INTEGRATIONS
]
@@ -70,7 +70,9 @@ class HomeassistantAnalyticsDataUpdateCoordinator(DataUpdateCoordinator[Analytic
async def _async_update_data(self) -> AnalyticsData:
try:
addons_data = await self._client.get_addons()
apps_data = (
await self._client.get_addons()
) # Still add-on method name. Needs library update
data = await self._client.get_current_analytics()
custom_data = await self._client.get_custom_integrations()
except HomeassistantAnalyticsConnectionError as err:
@@ -79,9 +81,7 @@ class HomeassistantAnalyticsDataUpdateCoordinator(DataUpdateCoordinator[Analytic
) from err
except HomeassistantAnalyticsNotModifiedError:
return self.data
addons = {
addon: get_addon_value(addons_data, addon) for addon in self._tracked_addons
}
apps = {app: get_app_value(apps_data, app) for app in self._tracked_apps}
core_integrations = {
integration: data.integrations.get(integration, 0)
for integration in self._tracked_integrations
@@ -93,14 +93,14 @@ class HomeassistantAnalyticsDataUpdateCoordinator(DataUpdateCoordinator[Analytic
return AnalyticsData(
data.active_installations,
data.reports_integrations,
addons,
apps,
core_integrations,
custom_integrations,
)
def get_addon_value(data: dict[str, Addon], name_slug: str) -> int:
"""Get addon value."""
def get_app_value(data: dict[str, Addon], name_slug: str) -> int:
"""Get app value."""
if name_slug in data:
return data[name_slug].total
return 0

View File

@@ -29,17 +29,17 @@ class AnalyticsSensorEntityDescription(SensorEntityDescription):
value_fn: Callable[[AnalyticsData], StateType]
def get_addon_entity_description(
def get_app_entity_description(
name_slug: str,
) -> AnalyticsSensorEntityDescription:
"""Get addon entity description."""
"""Get app entity description."""
return AnalyticsSensorEntityDescription(
key=f"addon_{name_slug}_active_installations",
translation_key="addons",
key=f"app_{name_slug}_active_installations",
translation_key="apps",
name=name_slug,
state_class=SensorStateClass.TOTAL,
native_unit_of_measurement="active installations",
value_fn=lambda data: data.addons.get(name_slug),
value_fn=lambda data: data.apps.get(name_slug),
)
@@ -106,9 +106,9 @@ async def async_setup_entry(
entities.extend(
HomeassistantAnalyticsSensor(
coordinator,
get_addon_entity_description(addon_name_slug),
get_app_entity_description(app_name_slug),
)
for addon_name_slug in coordinator.data.addons
for app_name_slug in coordinator.data.apps
)
entities.extend(
HomeassistantAnalyticsSensor(

View File

@@ -10,12 +10,12 @@
"step": {
"user": {
"data": {
"tracked_addons": "Add-ons",
"tracked_apps": "Apps",
"tracked_custom_integrations": "Custom integrations",
"tracked_integrations": "Integrations"
},
"data_description": {
"tracked_addons": "Select the add-ons you want to track",
"tracked_apps": "Select the apps you want to track",
"tracked_custom_integrations": "Select the custom integrations you want to track",
"tracked_integrations": "Select the integrations you want to track"
}
@@ -45,12 +45,12 @@
"step": {
"init": {
"data": {
"tracked_addons": "[%key:component::analytics_insights::config::step::user::data::tracked_addons%]",
"tracked_apps": "[%key:component::analytics_insights::config::step::user::data::tracked_apps%]",
"tracked_custom_integrations": "[%key:component::analytics_insights::config::step::user::data::tracked_custom_integrations%]",
"tracked_integrations": "[%key:component::analytics_insights::config::step::user::data::tracked_integrations%]"
},
"data_description": {
"tracked_addons": "[%key:component::analytics_insights::config::step::user::data_description::tracked_addons%]",
"tracked_apps": "[%key:component::analytics_insights::config::step::user::data_description::tracked_apps%]",
"tracked_custom_integrations": "[%key:component::analytics_insights::config::step::user::data_description::tracked_custom_integrations%]",
"tracked_integrations": "[%key:component::analytics_insights::config::step::user::data_description::tracked_integrations%]"
}

View File

@@ -14,10 +14,18 @@ from homeassistant.helpers import (
config_validation as cv,
device_registry as dr,
entity_registry as er,
issue_registry as ir,
)
from homeassistant.helpers.typing import ConfigType
from .const import DEFAULT_CONVERSATION_NAME, DOMAIN, LOGGER
from .const import (
CONF_CHAT_MODEL,
DATA_REPAIR_DEFER_RELOAD,
DEFAULT_CONVERSATION_NAME,
DEPRECATED_MODELS,
DOMAIN,
LOGGER,
)
PLATFORMS = (Platform.AI_TASK, Platform.CONVERSATION)
CONFIG_SCHEMA = cv.config_entry_only_config_schema(DOMAIN)
@@ -27,6 +35,7 @@ type AnthropicConfigEntry = ConfigEntry[anthropic.AsyncClient]
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up Anthropic."""
hass.data.setdefault(DOMAIN, {}).setdefault(DATA_REPAIR_DEFER_RELOAD, set())
await async_migrate_integration(hass)
return True
@@ -50,6 +59,22 @@ async def async_setup_entry(hass: HomeAssistant, entry: AnthropicConfigEntry) ->
entry.async_on_unload(entry.add_update_listener(async_update_options))
for subentry in entry.subentries.values():
if (model := subentry.data.get(CONF_CHAT_MODEL)) and model.startswith(
tuple(DEPRECATED_MODELS)
):
ir.async_create_issue(
hass,
DOMAIN,
"model_deprecated",
is_fixable=True,
is_persistent=False,
learn_more_url="https://platform.claude.com/docs/en/about-claude/model-deprecations",
severity=ir.IssueSeverity.WARNING,
translation_key="model_deprecated",
)
break
return True
@@ -62,6 +87,11 @@ async def async_update_options(
hass: HomeAssistant, entry: AnthropicConfigEntry
) -> None:
"""Update options."""
defer_reload_entries: set[str] = hass.data.setdefault(DOMAIN, {}).setdefault(
DATA_REPAIR_DEFER_RELOAD, set()
)
if entry.entry_id in defer_reload_entries:
return
await hass.config_entries.async_reload(entry.entry_id)

View File

@@ -92,6 +92,40 @@ async def validate_input(hass: HomeAssistant, data: dict[str, Any]) -> None:
await client.models.list(timeout=10.0)
async def get_model_list(client: anthropic.AsyncAnthropic) -> list[SelectOptionDict]:
"""Get list of available models."""
try:
models = (await client.models.list()).data
except anthropic.AnthropicError:
models = []
_LOGGER.debug("Available models: %s", models)
model_options: list[SelectOptionDict] = []
short_form = re.compile(r"[^\d]-\d$")
for model_info in models:
# Resolve alias from versioned model name:
model_alias = (
model_info.id[:-9]
if model_info.id
not in (
"claude-3-haiku-20240307",
"claude-3-5-haiku-20241022",
"claude-3-opus-20240229",
)
else model_info.id
)
if short_form.search(model_alias):
model_alias += "-0"
if model_alias.endswith(("haiku", "opus", "sonnet")):
model_alias += "-latest"
model_options.append(
SelectOptionDict(
label=model_info.display_name,
value=model_alias,
)
)
return model_options
class AnthropicConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle a config flow for Anthropic."""
@@ -401,38 +435,13 @@ class ConversationSubentryFlowHandler(ConfigSubentryFlow):
async def _get_model_list(self) -> list[SelectOptionDict]:
"""Get list of available models."""
try:
client = await self.hass.async_add_executor_job(
partial(
anthropic.AsyncAnthropic,
api_key=self._get_entry().data[CONF_API_KEY],
)
client = await self.hass.async_add_executor_job(
partial(
anthropic.AsyncAnthropic,
api_key=self._get_entry().data[CONF_API_KEY],
)
models = (await client.models.list()).data
except anthropic.AnthropicError:
models = []
_LOGGER.debug("Available models: %s", models)
model_options: list[SelectOptionDict] = []
short_form = re.compile(r"[^\d]-\d$")
for model_info in models:
# Resolve alias from versioned model name:
model_alias = (
model_info.id[:-9]
if model_info.id
not in ("claude-3-haiku-20240307", "claude-3-opus-20240229")
else model_info.id
)
if short_form.search(model_alias):
model_alias += "-0"
if model_alias.endswith(("haiku", "opus", "sonnet")):
model_alias += "-latest"
model_options.append(
SelectOptionDict(
label=model_info.display_name,
value=model_alias,
)
)
return model_options
)
return await get_model_list(client)
async def _get_location_data(self) -> dict[str, str]:
"""Get approximate location data of the user."""

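For readers tracing the alias logic that the new shared get_model_list helper applies above: a standalone re-implementation of just that transformation. The pinned set and regex come straight from the diff; the model IDs in the prints are Anthropic-style examples chosen for illustration, not values taken from this change.

import re

PINNED = {
    "claude-3-haiku-20240307",
    "claude-3-5-haiku-20241022",
    "claude-3-opus-20240229",
}

def to_alias(model_id: str) -> str:
    """Mirror the alias resolution shown above (illustration only)."""
    # Drop the trailing "-YYYYMMDD" unless the ID is one of the pinned models.
    alias = model_id if model_id in PINNED else model_id[:-9]
    if re.search(r"[^\d]-\d$", alias):
        alias += "-0"  # e.g. "claude-sonnet-4" -> "claude-sonnet-4-0"
    if alias.endswith(("haiku", "opus", "sonnet")):
        alias += "-latest"  # e.g. "claude-3-7-sonnet" -> "claude-3-7-sonnet-latest"
    return alias

print(to_alias("claude-sonnet-4-20250514"))    # claude-sonnet-4-0
print(to_alias("claude-3-7-sonnet-20250219"))  # claude-3-7-sonnet-latest
print(to_alias("claude-3-opus-20240229"))      # pinned: kept as-is
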
View File

@@ -22,8 +22,10 @@ CONF_WEB_SEARCH_REGION = "region"
CONF_WEB_SEARCH_COUNTRY = "country"
CONF_WEB_SEARCH_TIMEZONE = "timezone"
DATA_REPAIR_DEFER_RELOAD = "repair_defer_reload"
DEFAULT = {
CONF_CHAT_MODEL: "claude-3-5-haiku-latest",
CONF_CHAT_MODEL: "claude-haiku-4-5",
CONF_MAX_TOKENS: 3000,
CONF_TEMPERATURE: 1.0,
CONF_THINKING_BUDGET: 0,
@@ -46,3 +48,10 @@ WEB_SEARCH_UNSUPPORTED_MODELS = [
"claude-3-5-sonnet-20240620",
"claude-3-5-sonnet-20241022",
]
DEPRECATED_MODELS = [
"claude-3-5-haiku",
"claude-3-7-sonnet",
"claude-3-5-sonnet",
"claude-3-opus",
]

View File

@@ -0,0 +1,275 @@
"""Issue repair flow for Anthropic."""
from __future__ import annotations
from collections.abc import Iterator
from typing import cast
import voluptuous as vol
from homeassistant import data_entry_flow
from homeassistant.components.repairs import RepairsFlow
from homeassistant.config_entries import ConfigEntry, ConfigEntryState, ConfigSubentry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.selector import SelectSelector, SelectSelectorConfig
from .config_flow import get_model_list
from .const import (
CONF_CHAT_MODEL,
DATA_REPAIR_DEFER_RELOAD,
DEFAULT,
DEPRECATED_MODELS,
DOMAIN,
)
class ModelDeprecatedRepairFlow(RepairsFlow):
"""Handler for an issue fixing flow."""
_subentry_iter: Iterator[tuple[str, str]] | None
_current_entry_id: str | None
_current_subentry_id: str | None
_reload_pending: set[str]
_pending_updates: dict[str, dict[str, str]]
def __init__(self) -> None:
"""Initialize the flow."""
super().__init__()
self._subentry_iter = None
self._current_entry_id = None
self._current_subentry_id = None
self._reload_pending = set()
self._pending_updates = {}
async def async_step_init(
self, user_input: dict[str, str] | None = None
) -> data_entry_flow.FlowResult:
"""Handle the first step of a fix flow."""
previous_entry_id: str | None = None
if user_input is not None:
previous_entry_id = self._async_update_current_subentry(user_input)
self._clear_current_target()
target = await self._async_next_target()
next_entry_id = target[0].entry_id if target else None
if previous_entry_id and previous_entry_id != next_entry_id:
await self._async_apply_pending_updates(previous_entry_id)
if target is None:
await self._async_apply_all_pending_updates()
return self.async_create_entry(data={})
entry, subentry, model = target
client = entry.runtime_data
model_list = [
model_option
for model_option in await get_model_list(client)
if not model_option["value"].startswith(tuple(DEPRECATED_MODELS))
]
if "opus" in model:
suggested_model = "claude-opus-4-5"
elif "haiku" in model:
suggested_model = "claude-haiku-4-5"
elif "sonnet" in model:
suggested_model = "claude-sonnet-4-5"
else:
suggested_model = cast(str, DEFAULT[CONF_CHAT_MODEL])
schema = vol.Schema(
{
vol.Required(
CONF_CHAT_MODEL,
default=suggested_model,
): SelectSelector(
SelectSelectorConfig(options=model_list, custom_value=True)
),
}
)
return self.async_show_form(
step_id="init",
data_schema=schema,
description_placeholders={
"entry_name": entry.title,
"model": model,
"subentry_name": subentry.title,
"subentry_type": self._format_subentry_type(subentry.subentry_type),
},
)
def _iter_deprecated_subentries(self) -> Iterator[tuple[str, str]]:
"""Yield entry/subentry pairs that use deprecated models."""
for entry in self.hass.config_entries.async_entries(DOMAIN):
if entry.state is not ConfigEntryState.LOADED:
continue
for subentry in entry.subentries.values():
model = subentry.data.get(CONF_CHAT_MODEL)
if model and model.startswith(tuple(DEPRECATED_MODELS)):
yield entry.entry_id, subentry.subentry_id
async def _async_next_target(
self,
) -> tuple[ConfigEntry, ConfigSubentry, str] | None:
"""Return the next deprecated subentry target."""
if self._subentry_iter is None:
self._subentry_iter = self._iter_deprecated_subentries()
while True:
try:
entry_id, subentry_id = next(self._subentry_iter)
except StopIteration:
return None
entry = self.hass.config_entries.async_get_entry(entry_id)
if entry is None:
continue
subentry = entry.subentries.get(subentry_id)
if subentry is None:
continue
model = self._pending_model(entry_id, subentry_id)
if model is None:
model = subentry.data.get(CONF_CHAT_MODEL)
if not model or not model.startswith(tuple(DEPRECATED_MODELS)):
continue
self._current_entry_id = entry_id
self._current_subentry_id = subentry_id
return entry, subentry, model
def _async_update_current_subentry(self, user_input: dict[str, str]) -> str | None:
"""Update the currently selected subentry."""
if not self._current_entry_id or not self._current_subentry_id:
return None
entry = self.hass.config_entries.async_get_entry(self._current_entry_id)
if entry is None:
return None
subentry = entry.subentries.get(self._current_subentry_id)
if subentry is None:
return None
updated_data = {
**subentry.data,
CONF_CHAT_MODEL: user_input[CONF_CHAT_MODEL],
}
if updated_data == subentry.data:
return entry.entry_id
self._queue_pending_update(
entry.entry_id,
subentry.subentry_id,
updated_data[CONF_CHAT_MODEL],
)
return entry.entry_id
def _clear_current_target(self) -> None:
"""Clear current target tracking."""
self._current_entry_id = None
self._current_subentry_id = None
def _format_subentry_type(self, subentry_type: str) -> str:
"""Return a user-friendly subentry type label."""
if subentry_type == "conversation":
return "Conversation agent"
if subentry_type in ("ai_task", "ai_task_data"):
return "AI task"
return subentry_type
def _queue_pending_update(
self, entry_id: str, subentry_id: str, model: str
) -> None:
"""Store a pending model update for a subentry."""
self._pending_updates.setdefault(entry_id, {})[subentry_id] = model
def _pending_model(self, entry_id: str, subentry_id: str) -> str | None:
"""Return a pending model update if one exists."""
return self._pending_updates.get(entry_id, {}).get(subentry_id)
def _mark_entry_for_reload(self, entry_id: str) -> None:
"""Prevent reload until repairs are complete for the entry."""
self._reload_pending.add(entry_id)
defer_reload_entries: set[str] = self.hass.data.setdefault(
DOMAIN, {}
).setdefault(DATA_REPAIR_DEFER_RELOAD, set())
defer_reload_entries.add(entry_id)
async def _async_reload_entry(self, entry_id: str) -> None:
"""Reload an entry once all repairs are completed."""
if entry_id not in self._reload_pending:
return
entry = self.hass.config_entries.async_get_entry(entry_id)
if entry is not None and entry.state is not ConfigEntryState.LOADED:
self._clear_defer_reload(entry_id)
self._reload_pending.discard(entry_id)
return
if entry is not None:
await self.hass.config_entries.async_reload(entry_id)
self._clear_defer_reload(entry_id)
self._reload_pending.discard(entry_id)
def _clear_defer_reload(self, entry_id: str) -> None:
"""Remove entry from the deferred reload set."""
defer_reload_entries: set[str] = self.hass.data.setdefault(
DOMAIN, {}
).setdefault(DATA_REPAIR_DEFER_RELOAD, set())
defer_reload_entries.discard(entry_id)
async def _async_apply_pending_updates(self, entry_id: str) -> None:
"""Apply pending subentry updates for a single entry."""
updates = self._pending_updates.pop(entry_id, None)
if not updates:
return
entry = self.hass.config_entries.async_get_entry(entry_id)
if entry is None or entry.state is not ConfigEntryState.LOADED:
return
changed = False
for subentry_id, model in updates.items():
subentry = entry.subentries.get(subentry_id)
if subentry is None:
continue
updated_data = {
**subentry.data,
CONF_CHAT_MODEL: model,
}
if updated_data == subentry.data:
continue
if not changed:
self._mark_entry_for_reload(entry_id)
changed = True
self.hass.config_entries.async_update_subentry(
entry,
subentry,
data=updated_data,
)
if not changed:
return
await self._async_reload_entry(entry_id)
async def _async_apply_all_pending_updates(self) -> None:
"""Apply all pending updates across entries."""
for entry_id in list(self._pending_updates):
await self._async_apply_pending_updates(entry_id)
async def async_create_fix_flow(
hass: HomeAssistant,
issue_id: str,
data: dict[str, str | int | float | None] | None,
) -> RepairsFlow:
"""Create flow."""
if issue_id == "model_deprecated":
return ModelDeprecatedRepairFlow()
raise HomeAssistantError("Unknown issue ID")

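The 275-line repair module above ultimately implements the standard Home Assistant repairs contract: a repairs.py exposing async_create_fix_flow, which returns a RepairsFlow whose async_step_init shows forms until it calls async_create_entry. A stripped-down skeleton of that contract, for orientation only (not the module above; the step logic here is a placeholder):

"""Skeleton of the repairs-flow contract (illustration, not the real module)."""
from __future__ import annotations

import voluptuous as vol

from homeassistant import data_entry_flow
from homeassistant.components.repairs import RepairsFlow
from homeassistant.core import HomeAssistant


class ExampleRepairFlow(RepairsFlow):
    """Ask the user to confirm, then apply a fix."""

    async def async_step_init(
        self, user_input: dict[str, str] | None = None
    ) -> data_entry_flow.FlowResult:
        if user_input is not None:
            # Apply the fix here; finishing the flow closes the issue.
            return self.async_create_entry(data={})
        return self.async_show_form(step_id="init", data_schema=vol.Schema({}))


async def async_create_fix_flow(
    hass: HomeAssistant,
    issue_id: str,
    data: dict[str, str | int | float | None] | None,
) -> RepairsFlow:
    """Called by Home Assistant when the user presses Fix on a fixable issue."""
    return ExampleRepairFlow()
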
View File

@@ -109,5 +109,21 @@
}
}
}
},
"issues": {
"model_deprecated": {
"fix_flow": {
"step": {
"init": {
"data": {
"chat_model": "[%key:common::generic::model%]"
},
"description": "You are updating {subentry_name} ({subentry_type}) in {entry_name}. The current model {model} is deprecated. Select a supported model to continue.",
"title": "Update model"
}
}
},
"title": "Model deprecated"
}
}
}

View File

@@ -540,7 +540,17 @@ class APCUPSdSensor(APCUPSdEntity, SensorEntity):
data = self.coordinator.data[key]
if self.entity_description.device_class == SensorDeviceClass.TIMESTAMP:
self._attr_native_value = dateutil.parser.parse(data)
# The date could be "N/A" for certain fields (e.g., XOFFBATT), indicating there is no value yet.
if data == "N/A":
self._attr_native_value = None
return
try:
self._attr_native_value = dateutil.parser.parse(data)
except (dateutil.parser.ParserError, OverflowError):
# If parsing fails we should mark it as unknown, with a log for further debugging.
_LOGGER.warning('Failed to parse date for %s: "%s"', key, data)
self._attr_native_value = None
return
self._attr_native_value, inferred_unit = infer_unit(data)

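The apcupsd hunk above guards timestamp parsing against "N/A" and otherwise unparsable values instead of letting dateutil raise. A self-contained sketch of the same guard, with made-up sample values:

from dateutil import parser

def parse_or_none(value: str):
    """Return a datetime, or None when apcupsd reports no usable value."""
    if value == "N/A":
        return None
    try:
        return parser.parse(value)
    except (parser.ParserError, OverflowError):
        return None

print(parse_or_none("2026-02-04 16:09:58 +0100"))  # datetime object
print(parse_or_none("N/A"))                        # None
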
View File

@@ -2,15 +2,16 @@
from __future__ import annotations
from pyatv.const import KeyboardFocusState
from pyatv.const import FeatureName, FeatureState, KeyboardFocusState
from pyatv.interface import AppleTV, KeyboardListener
from homeassistant.components.binary_sensor import BinarySensorEntity
from homeassistant.const import CONF_NAME
from homeassistant.core import HomeAssistant, callback
from homeassistant.core import CALLBACK_TYPE, HomeAssistant, callback
from homeassistant.helpers.dispatcher import async_dispatcher_connect
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import AppleTvConfigEntry
from . import SIGNAL_CONNECTED, AppleTvConfigEntry
from .entity import AppleTVEntity
@@ -21,10 +22,22 @@ async def async_setup_entry(
) -> None:
"""Load Apple TV binary sensor based on a config entry."""
# apple_tv config entries always have a unique id
assert config_entry.unique_id is not None
name: str = config_entry.data[CONF_NAME]
manager = config_entry.runtime_data
async_add_entities([AppleTVKeyboardFocused(name, config_entry.unique_id, manager)])
cb: CALLBACK_TYPE
def setup_entities(atv: AppleTV) -> None:
if atv.features.in_state(FeatureState.Available, FeatureName.TextFocusState):
assert config_entry.unique_id is not None
name: str = config_entry.data[CONF_NAME]
async_add_entities(
[AppleTVKeyboardFocused(name, config_entry.unique_id, manager)]
)
cb()
cb = async_dispatcher_connect(
hass, f"{SIGNAL_CONNECTED}_{config_entry.unique_id}", setup_entities
)
config_entry.async_on_unload(cb)
class AppleTVKeyboardFocused(AppleTVEntity, BinarySensorEntity, KeyboardListener):

View File

@@ -52,7 +52,7 @@ from .const import (
from .errors import AuthenticationRequired, CannotConnect
from .hub import AxisHub, get_axis_api
AXIS_OUI = {"00:40:8c", "ac:cc:8e", "b8:a4:4f"}
AXIS_OUI = {"00:40:8c", "ac:cc:8e", "b8:a4:4f", "e8:27:25"}
DEFAULT_PORT = 443
DEFAULT_PROTOCOL = "https"
PROTOCOL_CHOICES = ["https", "http"]

View File

@@ -12,14 +12,25 @@ from hass_nabucasa import Cloud, NabuCasaBaseError
from hass_nabucasa.llm import (
LLMAuthenticationError,
LLMRateLimitError,
LLMResponseCompletedEvent,
LLMResponseError,
LLMResponseErrorEvent,
LLMResponseFailedEvent,
LLMResponseFunctionCallArgumentsDeltaEvent,
LLMResponseFunctionCallArgumentsDoneEvent,
LLMResponseFunctionCallOutputItem,
LLMResponseImageOutputItem,
LLMResponseIncompleteEvent,
LLMResponseMessageOutputItem,
LLMResponseOutputItemAddedEvent,
LLMResponseOutputItemDoneEvent,
LLMResponseOutputTextDeltaEvent,
LLMResponseReasoningOutputItem,
LLMResponseReasoningSummaryTextDeltaEvent,
LLMResponseWebSearchCallOutputItem,
LLMResponseWebSearchCallSearchingEvent,
LLMServiceError,
)
from litellm import (
ResponseFunctionToolCall,
ResponseInputParam,
ResponsesAPIStreamEvents,
)
from openai.types.responses import (
FunctionToolParam,
ResponseInputItemParam,
@@ -60,9 +71,9 @@ class ResponseItemType(str, Enum):
def _convert_content_to_param(
chat_content: Iterable[conversation.Content],
) -> ResponseInputParam:
) -> list[ResponseInputItemParam]:
"""Convert any native chat message for this agent to the native format."""
messages: ResponseInputParam = []
messages: list[ResponseInputItemParam] = []
reasoning_summary: list[str] = []
web_search_calls: dict[str, dict[str, Any]] = {}
@@ -238,7 +249,7 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
"""Transform stream result into HA format."""
last_summary_index = None
last_role: Literal["assistant", "tool_result"] | None = None
current_tool_call: ResponseFunctionToolCall | None = None
current_tool_call: LLMResponseFunctionCallOutputItem | None = None
# Non-reasoning models don't follow our request to remove citations, so we remove
# them manually here. They always follow the same pattern: the citation is always
@@ -248,19 +259,10 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
citation_regexp = re.compile(r"\(\[([^\]]+)\]\((https?:\/\/[^\)]+)\)")
async for event in stream:
event_type = getattr(event, "type", None)
event_item = getattr(event, "item", None)
event_item_type = getattr(event_item, "type", None) if event_item else None
_LOGGER.debug("Event[%s]", getattr(event, "type", None))
_LOGGER.debug(
"Event[%s] | item: %s",
event_type,
event_item_type,
)
if event_type == ResponsesAPIStreamEvents.OUTPUT_ITEM_ADDED:
# Detect function_call even when it's a BaseLiteLLMOpenAIResponseObject
if event_item_type == ResponseItemType.FUNCTION_CALL:
if isinstance(event, LLMResponseOutputItemAddedEvent):
if isinstance(event.item, LLMResponseFunctionCallOutputItem):
# OpenAI has tool calls as individual events
# while HA puts tool calls inside the assistant message.
# We turn them into individual assistant content for HA
@@ -268,11 +270,11 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
yield {"role": "assistant"}
last_role = "assistant"
last_summary_index = None
current_tool_call = cast(ResponseFunctionToolCall, event.item)
current_tool_call = event.item
elif (
event_item_type == ResponseItemType.MESSAGE
isinstance(event.item, LLMResponseMessageOutputItem)
or (
event_item_type == ResponseItemType.REASONING
isinstance(event.item, LLMResponseReasoningOutputItem)
and last_summary_index is not None
) # Subsequent ResponseReasoningItem
or last_role != "assistant"
@@ -281,14 +283,14 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
last_role = "assistant"
last_summary_index = None
elif event_type == ResponsesAPIStreamEvents.OUTPUT_ITEM_DONE:
if event_item_type == ResponseItemType.REASONING:
encrypted_content = getattr(event.item, "encrypted_content", None)
summary = getattr(event.item, "summary", []) or []
elif isinstance(event, LLMResponseOutputItemDoneEvent):
if isinstance(event.item, LLMResponseReasoningOutputItem):
encrypted_content = event.item.encrypted_content
summary = event.item.summary
yield {
"native": ResponseReasoningItem(
type="reasoning",
"native": LLMResponseReasoningOutputItem(
type=event.item.type,
id=event.item.id,
summary=[],
encrypted_content=encrypted_content,
@@ -296,14 +298,8 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
}
last_summary_index = len(summary) - 1 if summary else None
elif event_item_type == ResponseItemType.WEB_SEARCH_CALL:
action = getattr(event.item, "action", None)
if isinstance(action, dict):
action_dict = action
elif action is not None:
action_dict = action.to_dict()
else:
action_dict = {}
elif isinstance(event.item, LLMResponseWebSearchCallOutputItem):
action_dict = event.item.action
yield {
"tool_calls": [
llm.ToolInput(
@@ -321,11 +317,11 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
"tool_result": {"status": event.item.status},
}
last_role = "tool_result"
elif event_item_type == ResponseItemType.IMAGE:
yield {"native": event.item}
elif isinstance(event.item, LLMResponseImageOutputItem):
yield {"native": event.item.raw}
last_summary_index = -1 # Trigger new assistant message on next turn
elif event_type == ResponsesAPIStreamEvents.OUTPUT_TEXT_DELTA:
elif isinstance(event, LLMResponseOutputTextDeltaEvent):
data = event.delta
if remove_parentheses:
data = data.removeprefix(")")
@@ -344,7 +340,7 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
if data:
yield {"content": data}
elif event_type == ResponsesAPIStreamEvents.REASONING_SUMMARY_TEXT_DELTA:
elif isinstance(event, LLMResponseReasoningSummaryTextDeltaEvent):
# OpenAI can output several reasoning summaries
# in a single ResponseReasoningItem. We split them as separate
# AssistantContent messages. Only last of them will have
@@ -358,14 +354,14 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
last_summary_index = event.summary_index
yield {"thinking_content": event.delta}
elif event_type == ResponsesAPIStreamEvents.FUNCTION_CALL_ARGUMENTS_DELTA:
elif isinstance(event, LLMResponseFunctionCallArgumentsDeltaEvent):
if current_tool_call is not None:
current_tool_call.arguments += event.delta
elif event_type == ResponsesAPIStreamEvents.WEB_SEARCH_CALL_SEARCHING:
elif isinstance(event, LLMResponseWebSearchCallSearchingEvent):
yield {"role": "assistant"}
elif event_type == ResponsesAPIStreamEvents.FUNCTION_CALL_ARGUMENTS_DONE:
elif isinstance(event, LLMResponseFunctionCallArgumentsDoneEvent):
if current_tool_call is not None:
current_tool_call.status = "completed"
@@ -385,35 +381,36 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
]
}
elif event_type == ResponsesAPIStreamEvents.RESPONSE_COMPLETED:
if event.response.usage is not None:
elif isinstance(event, LLMResponseCompletedEvent):
response = event.response
if response and "usage" in response:
usage = response["usage"]
chat_log.async_trace(
{
"stats": {
"input_tokens": event.response.usage.input_tokens,
"output_tokens": event.response.usage.output_tokens,
"input_tokens": usage.get("input_tokens"),
"output_tokens": usage.get("output_tokens"),
}
}
)
elif event_type == ResponsesAPIStreamEvents.RESPONSE_INCOMPLETE:
if event.response.usage is not None:
elif isinstance(event, LLMResponseIncompleteEvent):
response = event.response
if response and "usage" in response:
usage = response["usage"]
chat_log.async_trace(
{
"stats": {
"input_tokens": event.response.usage.input_tokens,
"output_tokens": event.response.usage.output_tokens,
"input_tokens": usage.get("input_tokens"),
"output_tokens": usage.get("output_tokens"),
}
}
)
if (
event.response.incomplete_details
and event.response.incomplete_details.reason
):
reason: str = event.response.incomplete_details.reason
else:
reason = "unknown reason"
incomplete_details = response.get("incomplete_details")
reason = "unknown reason"
if incomplete_details is not None and incomplete_details.get("reason"):
reason = incomplete_details["reason"]
if reason == "max_output_tokens":
reason = "max output tokens reached"
@@ -422,22 +419,24 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
raise HomeAssistantError(f"OpenAI response incomplete: {reason}")
elif event_type == ResponsesAPIStreamEvents.RESPONSE_FAILED:
if event.response.usage is not None:
elif isinstance(event, LLMResponseFailedEvent):
response = event.response
if response and "usage" in response:
usage = response["usage"]
chat_log.async_trace(
{
"stats": {
"input_tokens": event.response.usage.input_tokens,
"output_tokens": event.response.usage.output_tokens,
"input_tokens": usage.get("input_tokens"),
"output_tokens": usage.get("output_tokens"),
}
}
)
reason = "unknown reason"
if event.response.error is not None:
reason = event.response.error.message
if isinstance(error := response.get("error"), dict):
reason = error.get("message") or reason
raise HomeAssistantError(f"OpenAI response failed: {reason}")
elif event_type == ResponsesAPIStreamEvents.ERROR:
elif isinstance(event, LLMResponseErrorEvent):
raise HomeAssistantError(f"OpenAI response error: {event.message}")
@@ -452,7 +451,7 @@ class BaseCloudLLMEntity(Entity):
async def _prepare_chat_for_generation(
self,
chat_log: conversation.ChatLog,
messages: ResponseInputParam,
messages: list[ResponseInputItemParam],
response_format: dict[str, Any] | None = None,
) -> dict[str, Any]:
"""Prepare kwargs for Cloud LLM from the chat log."""
@@ -460,8 +459,17 @@ class BaseCloudLLMEntity(Entity):
last_content: Any = chat_log.content[-1]
if last_content.role == "user" and last_content.attachments:
files = await self._async_prepare_files_for_prompt(last_content.attachments)
current_content = last_content.content
last_content = [*(current_content or []), *files]
last_message = cast(dict[str, Any], messages[-1])
assert (
last_message["type"] == "message"
and last_message["role"] == "user"
and isinstance(last_message["content"], str)
)
last_message["content"] = [
{"type": "input_text", "text": last_message["content"]},
*files,
]
tools: list[ToolParam] = []
tool_choice: str | None = None

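One design note on the Cloud stream refactor above: string comparisons against event.type and getattr lookups are replaced with isinstance checks on typed event classes from hass_nabucasa.llm, so each branch accesses fields on a concrete type. A generic, self-contained illustration of that pattern (the classes here are invented for the example, not the library's):

from __future__ import annotations

from dataclasses import dataclass


@dataclass
class TextDeltaEvent:
    delta: str


@dataclass
class CompletedEvent:
    usage: dict[str, int]


def handle(event: TextDeltaEvent | CompletedEvent) -> str:
    # isinstance dispatch: each branch sees a concrete type, so attribute
    # access needs no getattr() fallbacks and type checkers can verify it.
    if isinstance(event, TextDeltaEvent):
        return event.delta
    return f"output_tokens={event.usage.get('output_tokens')}"


print(handle(TextDeltaEvent("hello")))
print(handle(CompletedEvent({"output_tokens": 42})))
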
View File

@@ -13,6 +13,6 @@
"integration_type": "system",
"iot_class": "cloud_push",
"loggers": ["acme", "hass_nabucasa", "snitun"],
"requirements": ["hass-nabucasa==1.11.0"],
"requirements": ["hass-nabucasa==1.12.0", "openai==2.15.0"],
"single_config_entry": true
}

View File

@@ -8,5 +8,5 @@
"iot_class": "cloud_polling",
"loggers": ["compit"],
"quality_scale": "bronze",
"requirements": ["compit-inext-api==0.6.0"]
"requirements": ["compit-inext-api==0.8.0"]
}

View File

@@ -58,12 +58,13 @@ C4_TO_HA_HVAC_MODE = {
HA_TO_C4_HVAC_MODE = {v: k for k, v in C4_TO_HA_HVAC_MODE.items()}
# Map Control4 HVAC state to Home Assistant HVAC action
# Map the five known Control4 HVAC states to Home Assistant HVAC actions
C4_TO_HA_HVAC_ACTION = {
"heating": HVACAction.HEATING,
"cooling": HVACAction.COOLING,
"idle": HVACAction.IDLE,
"off": HVACAction.OFF,
"heat": HVACAction.HEATING,
"cool": HVACAction.COOLING,
"dry": HVACAction.DRYING,
"fan": HVACAction.FAN,
}
@@ -236,7 +237,10 @@ class Control4Climate(Control4Entity, ClimateEntity):
if c4_state is None:
return None
# Convert state to lowercase for mapping
return C4_TO_HA_HVAC_ACTION.get(str(c4_state).lower())
action = C4_TO_HA_HVAC_ACTION.get(str(c4_state).lower())
if action is None:
_LOGGER.debug("Unknown HVAC state received from Control4: %s", c4_state)
return action
@property
def target_temperature(self) -> float | None:

View File

@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/conversation",
"integration_type": "entity",
"quality_scale": "internal",
"requirements": ["hassil==3.5.0", "home-assistant-intents==2026.1.6"]
"requirements": ["hassil==3.5.0", "home-assistant-intents==2026.1.28"]
}

View File

@@ -7,7 +7,7 @@
"integration_type": "device",
"iot_class": "local_push",
"loggers": ["denonavr"],
"requirements": ["denonavr==1.2.0"],
"requirements": ["denonavr==1.3.1"],
"ssdp": [
{
"deviceType": "urn:schemas-upnp-org:device:MediaRenderer:1",

View File

@@ -1,6 +1,7 @@
"""The Dexcom integration."""
from pydexcom import AccountError, Dexcom, SessionError
from pydexcom import Dexcom, Region
from pydexcom.errors import AccountError, SessionError
from homeassistant.const import CONF_PASSWORD, CONF_USERNAME
from homeassistant.core import HomeAssistant
@@ -14,10 +15,13 @@ async def async_setup_entry(hass: HomeAssistant, entry: DexcomConfigEntry) -> bo
"""Set up Dexcom from a config entry."""
try:
dexcom = await hass.async_add_executor_job(
Dexcom,
entry.data[CONF_USERNAME],
entry.data[CONF_PASSWORD],
entry.data[CONF_SERVER] == SERVER_OUS,
lambda: Dexcom(
username=entry.data[CONF_USERNAME],
password=entry.data[CONF_PASSWORD],
region=Region.OUS
if entry.data[CONF_SERVER] == SERVER_OUS
else Region.US,
)
)
except AccountError:
return False
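The updated pydexcom constructor takes keyword arguments, but `hass.async_add_executor_job` only forwards positional arguments, hence the zero-argument lambda. A small sketch of the same pattern, using asyncio's `run_in_executor` and a stand-in class so it runs outside Home Assistant:

```python
# Sketch of wrapping a keyword-argument constructor for an executor call.
import asyncio


class FakeDexcom:
    """Stand-in for pydexcom.Dexcom, accepting keyword-only arguments."""

    def __init__(self, *, username: str, password: str, region: str) -> None:
        self.username = username
        self.region = region


async def main() -> None:
    loop = asyncio.get_running_loop()
    # The lambda captures the keyword arguments, since the executor helper
    # can only pass positional arguments to the target callable.
    client = await loop.run_in_executor(
        None,
        lambda: FakeDexcom(username="demo", password="secret", region="ous"),
    )
    print(client.username, client.region)


asyncio.run(main())
```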

View File

@@ -5,7 +5,8 @@ from __future__ import annotations
import logging
from typing import Any
from pydexcom import AccountError, Dexcom, SessionError
from pydexcom import Dexcom, Region
from pydexcom.errors import AccountError, SessionError
import voluptuous as vol
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
@@ -37,10 +38,13 @@ class DexcomConfigFlow(ConfigFlow, domain=DOMAIN):
if user_input is not None:
try:
await self.hass.async_add_executor_job(
Dexcom,
user_input[CONF_USERNAME],
user_input[CONF_PASSWORD],
user_input[CONF_SERVER] == SERVER_OUS,
lambda: Dexcom(
username=user_input[CONF_USERNAME],
password=user_input[CONF_PASSWORD],
region=Region.OUS
if user_input[CONF_SERVER] == SERVER_OUS
else Region.US,
)
)
except SessionError:
errors["base"] = "cannot_connect"

View File

@@ -18,7 +18,7 @@ _SCAN_INTERVAL = timedelta(seconds=180)
type DexcomConfigEntry = ConfigEntry[DexcomCoordinator]
class DexcomCoordinator(DataUpdateCoordinator[GlucoseReading]):
class DexcomCoordinator(DataUpdateCoordinator[GlucoseReading | None]):
"""Dexcom Coordinator."""
def __init__(
@@ -37,7 +37,7 @@ class DexcomCoordinator(DataUpdateCoordinator[GlucoseReading]):
)
self.dexcom = dexcom
async def _async_update_data(self) -> GlucoseReading:
async def _async_update_data(self) -> GlucoseReading | None:
"""Fetch data from API endpoint."""
return await self.hass.async_add_executor_job(
self.dexcom.get_current_glucose_reading

View File

@@ -7,5 +7,5 @@
"integration_type": "service",
"iot_class": "cloud_polling",
"loggers": ["pydexcom"],
"requirements": ["pydexcom==0.2.3"]
"requirements": ["pydexcom==0.5.1"]
}

View File

@@ -12,6 +12,7 @@ from doorbirdpy import DoorBird
import voluptuous as vol
from homeassistant.config_entries import (
SOURCE_IGNORE,
ConfigEntry,
ConfigFlow,
ConfigFlowResult,
@@ -218,6 +219,9 @@ class DoorBirdConfigFlow(ConfigFlow, domain=DOMAIN):
)
if existing_entry:
if existing_entry.source == SOURCE_IGNORE:
return self.async_abort(reason="already_configured")
# Check if the host is actually changing
if existing_entry.data.get(CONF_HOST) != host:
await self._async_verify_existing_device_for_discovery(

View File

@@ -19,7 +19,7 @@
"requirements": [
"aioesphomeapi==43.14.0",
"esphome-dashboard-api==1.3.0",
"bleak-esphome==3.5.0"
"bleak-esphome==3.6.0"
],
"zeroconf": ["_esphomelib._tcp.local."]
}

View File

@@ -181,7 +181,7 @@ class EvoChild(EvoEntity):
self._device_state_attrs = {
"activeFaults": self._evo_device.active_faults,
"setpoints": self._setpoints,
"setpoints": self.setpoints,
}
super()._handle_coordinator_update()

View File

@@ -1,11 +1,22 @@
"""The Fressnapf Tracker integration."""
from fressnapftracker import AuthClient, FressnapfTrackerAuthenticationError
import logging
from fressnapftracker import (
ApiClient,
AuthClient,
Device,
FressnapfTrackerAuthenticationError,
FressnapfTrackerError,
FressnapfTrackerInvalidTrackerResponseError,
Tracker,
)
from homeassistant.const import CONF_ACCESS_TOKEN, Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers.httpx_client import get_async_client
from homeassistant.helpers.issue_registry import IssueSeverity, async_create_issue
from .const import CONF_USER_ID, DOMAIN
from .coordinator import (
@@ -21,6 +32,43 @@ PLATFORMS: list[Platform] = [
Platform.SWITCH,
]
_LOGGER = logging.getLogger(__name__)
async def _get_valid_tracker(hass: HomeAssistant, device: Device) -> Tracker | None:
"""Test if the tracker returns valid data and return it.
Malformed data might indicate the tracker is broken or hasn't been properly registered with the app.
"""
client = ApiClient(
serial_number=device.serialnumber,
device_token=device.token,
client=get_async_client(hass),
)
try:
return await client.get_tracker()
except FressnapfTrackerInvalidTrackerResponseError:
_LOGGER.warning(
"Tracker with serialnumber %s is invalid. Consider removing it via the App",
device.serialnumber,
)
async_create_issue(
hass,
DOMAIN,
f"invalid_fressnapf_tracker_{device.serialnumber}",
issue_domain=DOMAIN,
learn_more_url="https://www.home-assistant.io/integrations/fressnapf_tracker/",
is_fixable=False,
severity=IssueSeverity.WARNING,
translation_key="invalid_fressnapf_tracker",
translation_placeholders={
"tracker_id": device.serialnumber,
},
)
return None
except FressnapfTrackerError as err:
raise ConfigEntryNotReady(err) from err
async def async_setup_entry(
hass: HomeAssistant, entry: FressnapfTrackerConfigEntry
@@ -40,12 +88,15 @@ async def async_setup_entry(
coordinators: list[FressnapfTrackerDataUpdateCoordinator] = []
for device in devices:
tracker = await _get_valid_tracker(hass, device)
if tracker is None:
continue
coordinator = FressnapfTrackerDataUpdateCoordinator(
hass,
entry,
device,
initial_data=tracker,
)
await coordinator.async_config_entry_first_refresh()
coordinators.append(coordinator)
entry.runtime_data = coordinators
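Taken together, the new helper and the setup loop probe each tracker once: permanently invalid trackers are skipped (and surfaced as a repair issue), while transient API errors defer the whole setup. A condensed sketch of that pattern with stand-in exceptions and a fake client, so it runs outside Home Assistant:

```python
# Sketch: stand-in exceptions and a fake API response table replace the
# fressnapftracker client and Home Assistant's ConfigEntryNotReady.
import asyncio


class InvalidTrackerResponse(Exception):
    """Stand-in for FressnapfTrackerInvalidTrackerResponseError."""


class TrackerError(Exception):
    """Stand-in for FressnapfTrackerError (e.g. the API is temporarily unreachable)."""


async def probe(serial: str) -> dict | None:
    """Return tracker data, None for invalid trackers, raise on transient errors."""
    fake_api = {"A1": {"battery": 80}, "B2": InvalidTrackerResponse()}
    result = fake_api.get(serial, TrackerError("unreachable"))
    if isinstance(result, InvalidTrackerResponse):
        print(f"Tracker {serial} is invalid; skipping and raising a repair issue")
        return None
    if isinstance(result, TrackerError):
        raise result  # setup retries later (ConfigEntryNotReady in the integration)
    return result


async def main() -> None:
    usable = []
    for serial in ("A1", "B2"):
        tracker = await probe(serial)
        if tracker is None:
            continue  # no coordinator is created for invalid trackers
        usable.append(tracker)
    print(usable)  # only A1's data survives


asyncio.run(main())
```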

View File

@@ -34,6 +34,7 @@ class FressnapfTrackerDataUpdateCoordinator(DataUpdateCoordinator[Tracker]):
hass: HomeAssistant,
config_entry: FressnapfTrackerConfigEntry,
device: Device,
initial_data: Tracker,
) -> None:
"""Initialize."""
super().__init__(
@@ -49,6 +50,7 @@ class FressnapfTrackerDataUpdateCoordinator(DataUpdateCoordinator[Tracker]):
device_token=device.token,
client=get_async_client(hass),
)
self.data = initial_data
async def _async_update_data(self) -> Tracker:
try:

View File

@@ -7,5 +7,5 @@
"integration_type": "hub",
"iot_class": "cloud_polling",
"quality_scale": "bronze",
"requirements": ["fressnapftracker==0.2.1"]
"requirements": ["fressnapftracker==0.2.2"]
}

View File

@@ -92,5 +92,11 @@
"not_seen_recently": {
"message": "The flashlight cannot be activated when the tracker has not moved recently."
}
},
"issues": {
"invalid_fressnapf_tracker": {
"description": "The Fressnapf GPS tracker with the serial number `{tracker_id}` is sending invalid data. This most likely indicates the tracker is broken or hasn't been properly registered with the app. Please remove the tracker via the Fressnapf app. You can then close this issue.",
"title": "Invalid Fressnapf GPS tracker detected"
}
}
}

View File

@@ -9,7 +9,7 @@
"iot_class": "local_polling",
"loggers": ["fritzconnection"],
"quality_scale": "bronze",
"requirements": ["fritzconnection[qr]==1.15.0", "xmltodict==1.0.2"],
"requirements": ["fritzconnection[qr]==1.15.1", "xmltodict==1.0.2"],
"ssdp": [
{
"st": "urn:schemas-upnp-org:device:fritzbox:1"

View File

@@ -7,5 +7,5 @@
"integration_type": "device",
"iot_class": "local_polling",
"loggers": ["fritzconnection"],
"requirements": ["fritzconnection[qr]==1.15.0"]
"requirements": ["fritzconnection[qr]==1.15.1"]
}

View File

@@ -19,9 +19,7 @@
],
"documentation": "https://www.home-assistant.io/integrations/frontend",
"integration_type": "system",
"preview_features": {
"winter_mode": {}
},
"preview_features": { "winter_mode": {} },
"quality_scale": "internal",
"requirements": ["home-assistant-frontend==20260128.1"]
"requirements": ["home-assistant-frontend==20260128.6"]
}

View File

@@ -8,5 +8,5 @@
"integration_type": "service",
"iot_class": "cloud_polling",
"loggers": ["googleapiclient"],
"requirements": ["gcal-sync==8.0.0", "oauth2client==4.1.3", "ical==12.1.2"]
"requirements": ["gcal-sync==8.0.0", "oauth2client==4.1.3", "ical==12.1.3"]
}

View File

@@ -7,5 +7,5 @@
"integration_type": "hub",
"iot_class": "cloud_polling",
"loggers": ["growattServer"],
"requirements": ["growattServer==1.7.1"]
"requirements": ["growattServer==1.9.0"]
}

View File

@@ -125,7 +125,7 @@ class AddonManager:
)
@api_error(
"Failed to get the {addon_name} add-on discovery info",
"Failed to get the {addon_name} app discovery info",
expected_error_type=SupervisorError,
)
async def async_get_addon_discovery_info(self) -> dict:
@@ -140,12 +140,12 @@ class AddonManager:
)
if not discovery_info:
raise AddonError(f"Failed to get {self.addon_name} add-on discovery info")
raise AddonError(f"Failed to get {self.addon_name} app discovery info")
return discovery_info.config
@api_error(
"Failed to get the {addon_name} add-on info",
"Failed to get the {addon_name} app info",
expected_error_type=SupervisorError,
)
async def async_get_addon_info(self) -> AddonInfo:
@@ -153,7 +153,7 @@ class AddonManager:
addon_store_info = await self._supervisor_client.store.addon_info(
self.addon_slug
)
self._logger.debug("Add-on store info: %s", addon_store_info.to_dict())
self._logger.debug("App store info: %s", addon_store_info.to_dict())
if not addon_store_info.installed:
return AddonInfo(
available=addon_store_info.available,
@@ -190,7 +190,7 @@ class AddonManager:
return addon_state
@api_error(
"Failed to set the {addon_name} add-on options",
"Failed to set the {addon_name} app options",
expected_error_type=SupervisorError,
)
async def async_set_addon_options(self, config: dict) -> None:
@@ -202,10 +202,10 @@ class AddonManager:
def _check_addon_available(self, addon_info: AddonInfo) -> None:
"""Check if the managed add-on is available."""
if not addon_info.available:
raise AddonError(f"{self.addon_name} add-on is not available")
raise AddonError(f"{self.addon_name} app is not available")
@api_error(
"Failed to install the {addon_name} add-on", expected_error_type=SupervisorError
"Failed to install the {addon_name} app", expected_error_type=SupervisorError
)
async def async_install_addon(self) -> None:
"""Install the managed add-on."""
@@ -216,14 +216,14 @@ class AddonManager:
await self._supervisor_client.store.install_addon(self.addon_slug)
@api_error(
"Failed to uninstall the {addon_name} add-on",
"Failed to uninstall the {addon_name} app",
expected_error_type=SupervisorError,
)
async def async_uninstall_addon(self) -> None:
"""Uninstall the managed add-on."""
await self._supervisor_client.addons.uninstall_addon(self.addon_slug)
@api_error("Failed to update the {addon_name} add-on")
@api_error("Failed to update the {addon_name} app")
async def async_update_addon(self) -> None:
"""Update the managed add-on if needed."""
addon_info = await self.async_get_addon_info()
@@ -231,7 +231,7 @@ class AddonManager:
self._check_addon_available(addon_info)
if addon_info.state is AddonState.NOT_INSTALLED:
raise AddonError(f"{self.addon_name} add-on is not installed")
raise AddonError(f"{self.addon_name} app is not installed")
if not addon_info.update_available:
return
@@ -242,28 +242,28 @@ class AddonManager:
)
@api_error(
"Failed to start the {addon_name} add-on", expected_error_type=SupervisorError
"Failed to start the {addon_name} app", expected_error_type=SupervisorError
)
async def async_start_addon(self) -> None:
"""Start the managed add-on."""
await self._supervisor_client.addons.start_addon(self.addon_slug)
@api_error(
"Failed to restart the {addon_name} add-on", expected_error_type=SupervisorError
"Failed to restart the {addon_name} app", expected_error_type=SupervisorError
)
async def async_restart_addon(self) -> None:
"""Restart the managed add-on."""
await self._supervisor_client.addons.restart_addon(self.addon_slug)
@api_error(
"Failed to stop the {addon_name} add-on", expected_error_type=SupervisorError
"Failed to stop the {addon_name} app", expected_error_type=SupervisorError
)
async def async_stop_addon(self) -> None:
"""Stop the managed add-on."""
await self._supervisor_client.addons.stop_addon(self.addon_slug)
@api_error(
"Failed to create a backup of the {addon_name} add-on",
"Failed to create a backup of the {addon_name} app",
expected_error_type=SupervisorError,
)
async def async_create_backup(self) -> None:
@@ -284,7 +284,7 @@ class AddonManager:
addon_info = await self.async_get_addon_info()
if addon_info.state is AddonState.NOT_INSTALLED:
raise AddonError(f"{self.addon_name} add-on is not installed")
raise AddonError(f"{self.addon_name} app is not installed")
if addon_config != addon_info.options:
await self.async_set_addon_options(addon_config)
@@ -297,7 +297,7 @@ class AddonManager:
"""
if not self._install_task or self._install_task.done():
self._logger.info(
"%s add-on is not installed. Installing add-on", self.addon_name
"%s app is not installed. Installing app", self.addon_name
)
self._install_task = self._async_schedule_addon_operation(
self.async_install_addon, catch_error=catch_error
@@ -316,7 +316,7 @@ class AddonManager:
"""
if not self._install_task or self._install_task.done():
self._logger.info(
"%s add-on is not installed. Installing add-on", self.addon_name
"%s app is not installed. Installing app", self.addon_name
)
self._install_task = self._async_schedule_addon_operation(
self.async_install_addon,
@@ -336,7 +336,7 @@ class AddonManager:
Only schedule a new update task if there's no running task.
"""
if not self._update_task or self._update_task.done():
self._logger.info("Trying to update the %s add-on", self.addon_name)
self._logger.info("Trying to update the %s app", self.addon_name)
self._update_task = self._async_schedule_addon_operation(
self.async_update_addon,
catch_error=catch_error,
@@ -350,9 +350,7 @@ class AddonManager:
Only schedule a new start task if there's no running task.
"""
if not self._start_task or self._start_task.done():
self._logger.info(
"%s add-on is not running. Starting add-on", self.addon_name
)
self._logger.info("%s app is not running. Starting app", self.addon_name)
self._start_task = self._async_schedule_addon_operation(
self.async_start_addon, catch_error=catch_error
)
@@ -365,7 +363,7 @@ class AddonManager:
Only schedule a new restart task if there's no running task.
"""
if not self._restart_task or self._restart_task.done():
self._logger.info("Restarting %s add-on", self.addon_name)
self._logger.info("Restarting %s app", self.addon_name)
self._restart_task = self._async_schedule_addon_operation(
self.async_restart_addon, catch_error=catch_error
)
@@ -382,9 +380,7 @@ class AddonManager:
Only schedule a new setup task if there's no running task.
"""
if not self._start_task or self._start_task.done():
self._logger.info(
"%s add-on is not running. Starting add-on", self.addon_name
)
self._logger.info("%s app is not running. Starting app", self.addon_name)
self._start_task = self._async_schedule_addon_operation(
partial(
self.async_configure_addon,

View File

@@ -176,7 +176,7 @@ EXTRA_PLACEHOLDERS = {
class SupervisorEntityModel(StrEnum):
"""Supervisor entity model."""
ADDON = "Home Assistant Add-on"
ADDON = "Home Assistant App"
OS = "Home Assistant Operating System"
CORE = "Home Assistant Core"
SUPERVISOR = "Home Assistant Supervisor"

View File

@@ -117,7 +117,7 @@ class HassIODiscovery(HomeAssistantView):
try:
addon_info = await self._supervisor_client.addons.addon_info(data.addon)
except SupervisorError as err:
_LOGGER.error("Can't read add-on info: %s", err)
_LOGGER.error("Can't read app info: %s", err)
return
data.config[ATTR_ADDON] = addon_info.name

View File

@@ -51,7 +51,7 @@
},
"step": {
"fix_menu": {
"description": "Add-on {addon} is set to start at boot but failed to start. Usually this occurs when the configuration is incorrect or the same port is used in multiple add-ons. Check the configuration as well as logs for {addon} and Supervisor.\n\nUse Start to try again or Disable to turn off the start at boot option.",
"description": "App {addon} is set to start at boot but failed to start. Usually this occurs when the configuration is incorrect or the same port is used in multiple apps. Check the configuration as well as logs for {addon} and Supervisor.\n\nUse Start to try again or Disable to turn off the start at boot option.",
"menu_options": {
"addon_disable_boot": "[%key:common::action::disable%]",
"addon_execute_start": "[%key:common::action::start%]"
@@ -59,41 +59,41 @@
}
}
},
"title": "Add-on failed to start at boot"
"title": "App failed to start at boot"
},
"issue_addon_deprecated_addon": {
"fix_flow": {
"abort": {
"apply_suggestion_fail": "Could not uninstall the add-on. Check the Supervisor logs for more details."
"apply_suggestion_fail": "Could not uninstall the app. Check the Supervisor logs for more details."
},
"step": {
"addon_execute_remove": {
"description": "Add-on {addon} is marked deprecated by the developer. This means it is no longer being maintained and so may break or become a security issue over time.\n\nReview the [readme]({addon_info}) and [documentation]({addon_documentation}) of the add-on to see if the developer provided instructions.\n\nSelecting **Submit** will uninstall this deprecated add-on. Alternatively, you can check [Home Assistant help]({help_url}) and the [community forum]({community_url}) for alternatives to migrate to."
"description": "App {addon} is marked deprecated by the developer. This means it is no longer being maintained and so may break or become a security issue over time.\n\nReview the [readme]({addon_info}) and [documentation]({addon_documentation}) of the app to see if the developer provided instructions.\n\nSelecting **Submit** will uninstall this deprecated app. Alternatively, you can check [Home Assistant help]({help_url}) and the [community forum]({community_url}) for alternatives to migrate to."
}
}
},
"title": "Installed add-on is deprecated"
"title": "Installed app is deprecated"
},
"issue_addon_detached_addon_missing": {
"description": "Repository for add-on {addon} is missing. This means it will not get updates, and backups may not be restored correctly as the Home Assistant Supervisor may not be able to build/download the resources required.\n\nPlease check the [add-on's documentation]({addon_url}) for installation instructions and add the repository to the store.",
"title": "Missing repository for an installed add-on"
"description": "Repository for app {addon} is missing. This means it will not get updates, and backups may not be restored correctly as the Home Assistant Supervisor may not be able to build/download the resources required.\n\nPlease check the [app's documentation]({addon_url}) for installation instructions and add the repository to the store.",
"title": "Missing repository for an installed app"
},
"issue_addon_detached_addon_removed": {
"fix_flow": {
"abort": {
"apply_suggestion_fail": "Could not uninstall the add-on. Check the Supervisor logs for more details."
"apply_suggestion_fail": "Could not uninstall the app. Check the Supervisor logs for more details."
},
"step": {
"addon_execute_remove": {
"description": "Add-on {addon} has been removed from the repository it was installed from. This means it will not get updates, and backups may not be restored correctly as the Home Assistant Supervisor may not be able to build/download the resources required.\n\nSelecting **Submit** will uninstall this deprecated add-on. Alternatively, you can check [Home Assistant help]({help_url}) and the [community forum]({community_url}) for alternatives to migrate to."
"description": "App {addon} has been removed from the repository it was installed from. This means it will not get updates, and backups may not be restored correctly as the Home Assistant Supervisor may not be able to build/download the resources required.\n\nSelecting **Submit** will uninstall this deprecated app. Alternatively, you can check [Home Assistant help]({help_url}) and the [community forum]({community_url}) for alternatives to migrate to."
}
}
},
"title": "Installed add-on has been removed from repository"
"title": "Installed app has been removed from repository"
},
"issue_addon_pwned": {
"description": "Add-on {addon} uses secrets/passwords in its configuration which are detected as not secure. See [pwned passwords and secrets]({more_info_pwned}) for more information on this issue.",
"title": "Insecure secrets detected in add-on configuration"
"description": "App {addon} uses secrets/passwords in its configuration which are detected as not secure. See [pwned passwords and secrets]({more_info_pwned}) for more information on this issue.",
"title": "Insecure secrets detected in app configuration"
},
"issue_mount_mount_failed": {
"fix_flow": {
@@ -123,7 +123,7 @@
},
"step": {
"system_execute_rebuild": {
"description": "The default configuration for add-ons and Home Assistant has changed. To update the configuration with the new defaults, a restart is required for the following:\n\n- {components}"
"description": "The default configuration for apps and Home Assistant has changed. To update the configuration with the new defaults, a restart is required for the following:\n\n- {components}"
}
}
},
@@ -160,7 +160,7 @@
},
"step": {
"system_execute_reboot": {
"description": "Settings were changed which require a system reboot to take effect.\n\nThis fix will initiate a system reboot which will make Home Assistant and all the Add-ons inaccessible for a brief period."
"description": "Settings were changed which require a system reboot to take effect.\n\nThis fix will initiate a system reboot which will make Home Assistant and all the apps inaccessible for a brief period."
}
}
},
@@ -203,7 +203,7 @@
"title": "Unsupported system - {reason}"
},
"unsupported_apparmor": {
"description": "System is unsupported because AppArmor is working incorrectly and add-ons are running in an unprotected and insecure way. For troubleshooting information, select Learn more.",
"description": "System is unsupported because AppArmor is working incorrectly and apps are running in an unprotected and insecure way. For troubleshooting information, select Learn more.",
"title": "Unsupported system - AppArmor issues"
},
"unsupported_cgroup_version": {
@@ -510,7 +510,7 @@
"docker_version": "Docker version",
"healthy": "Healthy",
"host_os": "Host operating system",
"installed_addons": "Installed add-ons",
"installed_addons": "Installed apps",
"nameservers": "Nameservers",
"supervisor_api": "Supervisor API",
"supervisor_version": "Supervisor version",

View File

@@ -8,5 +8,5 @@
"iot_class": "local_push",
"loggers": ["pyhik"],
"quality_scale": "legacy",
"requirements": ["pyHik==0.4.1"]
"requirements": ["pyHik==0.4.2"]
}

View File

@@ -169,6 +169,7 @@ async def async_setup_entry(
) -> None:
"""Set up the Home Connect binary sensor."""
setup_home_connect_entry(
hass,
entry,
_get_entities_for_appliance,
async_add_entities,

View File

@@ -73,6 +73,7 @@ async def async_setup_entry(
) -> None:
"""Set up the Home Connect button entities."""
setup_home_connect_entry(
hass,
entry,
_get_entities_for_appliance,
async_add_entities,

View File

@@ -7,18 +7,44 @@ from typing import cast
from aiohomeconnect.model import EventKey
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from homeassistant.helpers import entity_registry as er
from homeassistant.helpers.entity import EntityDescription
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN
from .coordinator import HomeConnectApplianceData, HomeConnectConfigEntry
from .entity import HomeConnectEntity, HomeConnectOptionEntity
def should_add_option_entity(
description: EntityDescription,
appliance: HomeConnectApplianceData,
entity_registry: er.EntityRegistry,
platform: Platform,
) -> bool:
"""Check if the option entity should be added for the appliance.
This function returns `True` if the option is available in the appliance options
or if the entity was added in previous loads of this integration.
"""
description_key = description.key
return description_key in appliance.options or (
entity_registry.async_get_entity_id(
platform, DOMAIN, f"{appliance.info.ha_id}-{description_key}"
)
is not None
)
def _create_option_entities(
entity_registry: er.EntityRegistry,
entry: HomeConnectConfigEntry,
appliance: HomeConnectApplianceData,
known_entity_unique_ids: dict[str, str],
get_option_entities_for_appliance: Callable[
[HomeConnectConfigEntry, HomeConnectApplianceData],
[HomeConnectConfigEntry, HomeConnectApplianceData, er.EntityRegistry],
list[HomeConnectOptionEntity],
],
async_add_entities: AddConfigEntryEntitiesCallback,
@@ -26,7 +52,9 @@ def _create_option_entities(
"""Create the required option entities for the appliances."""
option_entities_to_add = [
entity
for entity in get_option_entities_for_appliance(entry, appliance)
for entity in get_option_entities_for_appliance(
entry, appliance, entity_registry
)
if entity.unique_id not in known_entity_unique_ids
]
known_entity_unique_ids.update(
@@ -39,13 +67,14 @@ def _create_option_entities(
def _handle_paired_or_connected_appliance(
hass: HomeAssistant,
entry: HomeConnectConfigEntry,
known_entity_unique_ids: dict[str, str],
get_entities_for_appliance: Callable[
[HomeConnectConfigEntry, HomeConnectApplianceData], list[HomeConnectEntity]
],
get_option_entities_for_appliance: Callable[
[HomeConnectConfigEntry, HomeConnectApplianceData],
[HomeConnectConfigEntry, HomeConnectApplianceData, er.EntityRegistry],
list[HomeConnectOptionEntity],
]
| None,
@@ -60,6 +89,7 @@ def _handle_paired_or_connected_appliance(
already or it is the first time we see them when the appliance is connected.
"""
entities: list[HomeConnectEntity] = []
entity_registry = er.async_get(hass)
for appliance in entry.runtime_data.data.values():
entities_to_add = [
entity
@@ -69,7 +99,9 @@ def _handle_paired_or_connected_appliance(
if get_option_entities_for_appliance:
entities_to_add.extend(
entity
for entity in get_option_entities_for_appliance(entry, appliance)
for entity in get_option_entities_for_appliance(
entry, appliance, entity_registry
)
if entity.unique_id not in known_entity_unique_ids
)
for event_key in (
@@ -80,6 +112,7 @@ def _handle_paired_or_connected_appliance(
entry.runtime_data.async_add_listener(
partial(
_create_option_entities,
entity_registry,
entry,
appliance,
known_entity_unique_ids,
@@ -120,13 +153,14 @@ def _handle_depaired_appliance(
def setup_home_connect_entry(
hass: HomeAssistant,
entry: HomeConnectConfigEntry,
get_entities_for_appliance: Callable[
[HomeConnectConfigEntry, HomeConnectApplianceData], list[HomeConnectEntity]
],
async_add_entities: AddConfigEntryEntitiesCallback,
get_option_entities_for_appliance: Callable[
[HomeConnectConfigEntry, HomeConnectApplianceData],
[HomeConnectConfigEntry, HomeConnectApplianceData, er.EntityRegistry],
list[HomeConnectOptionEntity],
]
| None = None,
@@ -141,6 +175,7 @@ def setup_home_connect_entry(
entry.runtime_data.async_add_special_listener(
partial(
_handle_paired_or_connected_appliance,
hass,
entry,
known_entity_unique_ids,
get_entities_for_appliance,
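`should_add_option_entity` keeps an option entity around once it has ever been created, even if the appliance does not currently expose the option. A small sketch of that rule, with a plain set standing in for the entity registry and illustrative option keys:

```python
# Sketch of the "currently available OR previously registered" rule.
from dataclasses import dataclass


@dataclass
class Appliance:
    ha_id: str
    options: set[str]


def should_add(option_key: str, appliance: Appliance, registered_unique_ids: set[str]) -> bool:
    unique_id = f"{appliance.ha_id}-{option_key}"  # same unique_id scheme as the diff
    return option_key in appliance.options or unique_id in registered_unique_ids


appliance = Appliance(ha_id="washer-1", options={"OptionA"})
known = {"washer-1-OptionB"}  # entity created in a previous load of the integration
print(should_add("OptionA", appliance, known))  # True: available right now
print(should_add("OptionB", appliance, known))  # True: seen before, keep the entity
print(should_add("OptionC", appliance, known))  # False
```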

View File

@@ -96,6 +96,7 @@ async def async_setup_entry(
) -> None:
"""Set up the Home Connect light."""
setup_home_connect_entry(
hass,
entry,
_get_entities_for_appliance,
async_add_entities,

View File

@@ -11,12 +11,13 @@ from homeassistant.components.number import (
NumberEntity,
NumberEntityDescription,
)
from homeassistant.const import PERCENTAGE
from homeassistant.const import PERCENTAGE, Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import entity_registry as er
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .common import setup_home_connect_entry
from .common import setup_home_connect_entry, should_add_option_entity
from .const import DOMAIN, UNIT_MAP
from .coordinator import HomeConnectApplianceData, HomeConnectConfigEntry
from .entity import HomeConnectEntity, HomeConnectOptionEntity, constraint_fetcher
@@ -136,12 +137,15 @@ def _get_entities_for_appliance(
def _get_option_entities_for_appliance(
entry: HomeConnectConfigEntry,
appliance: HomeConnectApplianceData,
entity_registry: er.EntityRegistry,
) -> list[HomeConnectOptionEntity]:
"""Get a list of currently available option entities."""
return [
HomeConnectOptionNumberEntity(entry.runtime_data, appliance, description)
for description in NUMBER_OPTIONS
if description.key in appliance.options
if should_add_option_entity(
description, appliance, entity_registry, Platform.NUMBER
)
]
@@ -152,6 +156,7 @@ async def async_setup_entry(
) -> None:
"""Set up the Home Connect number."""
setup_home_connect_entry(
hass,
entry,
_get_entities_for_appliance,
async_add_entities,

View File

@@ -11,11 +11,13 @@ from aiohomeconnect.model.error import HomeConnectError
from aiohomeconnect.model.program import Execution
from homeassistant.components.select import SelectEntity, SelectEntityDescription
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import entity_registry as er
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .common import setup_home_connect_entry
from .common import setup_home_connect_entry, should_add_option_entity
from .const import (
AVAILABLE_MAPS_ENUM,
BEAN_AMOUNT_OPTIONS,
@@ -358,12 +360,13 @@ def _get_entities_for_appliance(
def _get_option_entities_for_appliance(
entry: HomeConnectConfigEntry,
appliance: HomeConnectApplianceData,
entity_registry: er.EntityRegistry,
) -> list[HomeConnectOptionEntity]:
"""Get a list of entities."""
return [
HomeConnectSelectOptionEntity(entry.runtime_data, appliance, desc)
for desc in PROGRAM_SELECT_OPTION_ENTITY_DESCRIPTIONS
if desc.key in appliance.options
if should_add_option_entity(desc, appliance, entity_registry, Platform.SELECT)
]
@@ -374,6 +377,7 @@ async def async_setup_entry(
) -> None:
"""Set up the Home Connect select entities."""
setup_home_connect_entry(
hass,
entry,
_get_entities_for_appliance,
async_add_entities,

View File

@@ -115,7 +115,6 @@ SENSORS = (
entity_category=EntityCategory.DIAGNOSTIC,
native_unit_of_measurement=UnitOfVolume.MILLILITERS,
device_class=SensorDeviceClass.VOLUME,
state_class=SensorStateClass.TOTAL_INCREASING,
translation_key="hot_water_counter",
),
HomeConnectSensorEntityDescription(
@@ -540,6 +539,7 @@ async def async_setup_entry(
) -> None:
"""Set up the Home Connect sensor."""
setup_home_connect_entry(
hass,
entry,
_get_entities_for_appliance,
async_add_entities,

View File

@@ -7,12 +7,14 @@ from aiohomeconnect.model import OptionKey, SettingKey
from aiohomeconnect.model.error import HomeConnectError
from homeassistant.components.switch import SwitchEntity, SwitchEntityDescription
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import entity_registry as er
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import UNDEFINED, UndefinedType
from .common import setup_home_connect_entry
from .common import setup_home_connect_entry, should_add_option_entity
from .const import BSH_POWER_OFF, BSH_POWER_ON, BSH_POWER_STANDBY, DOMAIN
from .coordinator import HomeConnectApplianceData, HomeConnectConfigEntry
from .entity import HomeConnectEntity, HomeConnectOptionEntity
@@ -190,12 +192,15 @@ def _get_entities_for_appliance(
def _get_option_entities_for_appliance(
entry: HomeConnectConfigEntry,
appliance: HomeConnectApplianceData,
entity_registry: er.EntityRegistry,
) -> list[HomeConnectOptionEntity]:
"""Get a list of currently available option entities."""
return [
HomeConnectSwitchOptionEntity(entry.runtime_data, appliance, description)
for description in SWITCH_OPTIONS
if description.key in appliance.options
if should_add_option_entity(
description, appliance, entity_registry, Platform.SWITCH
)
]
@@ -206,6 +211,7 @@ async def async_setup_entry(
) -> None:
"""Set up the Home Connect switch."""
setup_home_connect_entry(
hass,
entry,
_get_entities_for_appliance,
async_add_entities,

View File

@@ -63,16 +63,14 @@ class HueBLELight(LightEntity):
self._api = light
self._attr_unique_id = light.address
self._attr_min_color_temp_kelvin = (
color_util.color_temperature_mired_to_kelvin(light.maximum_mireds)
if light.maximum_mireds
else None
)
self._attr_max_color_temp_kelvin = (
color_util.color_temperature_mired_to_kelvin(light.minimum_mireds)
if light.minimum_mireds
else None
)
if light.maximum_mireds:
self._attr_min_color_temp_kelvin = (
color_util.color_temperature_mired_to_kelvin(light.maximum_mireds)
)
if light.minimum_mireds:
self._attr_max_color_temp_kelvin = (
color_util.color_temperature_mired_to_kelvin(light.minimum_mireds)
)
self._attr_device_info = DeviceInfo(
name=light.name,
connections={(CONNECTION_BLUETOOTH, light.address)},

View File

@@ -12,5 +12,5 @@
"iot_class": "local_polling",
"loggers": ["incomfortclient"],
"quality_scale": "platinum",
"requirements": ["incomfort-client==0.6.11"]
"requirements": ["incomfort-client==0.6.12"]
}

View File

@@ -90,6 +90,7 @@
"boiler_int": "Boiler internal",
"buffer": "Buffer",
"central_heating": "Central heating",
"central_heating_low": "Central heating low",
"central_heating_rf": "Central heating rf",
"cv_temperature_too_high_e1": "Temperature too high",
"flame_detection_fault_e6": "Flame detection fault",

View File

@@ -10,6 +10,7 @@ import voluptuous as vol
from homeassistant.components.script import CONF_MODE
from homeassistant.const import CONF_DESCRIPTION, CONF_TYPE, SERVICE_RELOAD
from homeassistant.core import HomeAssistant, ServiceCall
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import (
config_validation as cv,
intent,
@@ -18,6 +19,7 @@ from homeassistant.helpers import (
template,
)
from homeassistant.helpers.reload import async_integration_yaml_config
from homeassistant.helpers.script import async_validate_actions_config
from homeassistant.helpers.typing import ConfigType
_LOGGER = logging.getLogger(__name__)
@@ -85,19 +87,29 @@ async def async_reload(hass: HomeAssistant, service_call: ServiceCall) -> None:
new_intents = new_config[DOMAIN]
async_load_intents(hass, new_intents)
await async_load_intents(hass, new_intents)
def async_load_intents(hass: HomeAssistant, intents: dict[str, ConfigType]) -> None:
async def async_load_intents(
hass: HomeAssistant, intents: dict[str, ConfigType]
) -> None:
"""Load YAML intents into the intent system."""
hass.data[DOMAIN] = intents
for intent_type, conf in intents.items():
if CONF_ACTION in conf:
try:
actions = await async_validate_actions_config(hass, conf[CONF_ACTION])
except (vol.Invalid, HomeAssistantError) as exc:
_LOGGER.error(
"Failed to validate actions for intent %s: %s", intent_type, exc
)
continue # Skip this intent
script_mode: str = conf.get(CONF_MODE, script.DEFAULT_SCRIPT_MODE)
conf[CONF_ACTION] = script.Script(
hass,
conf[CONF_ACTION],
actions,
f"Intent Script {intent_type}",
DOMAIN,
script_mode=script_mode,
@@ -109,7 +121,7 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the intent script component."""
intents = config[DOMAIN]
async_load_intents(hass, intents)
await async_load_intents(hass, intents)
async def _handle_reload(service_call: ServiceCall) -> None:
return await async_reload(hass, service_call)

View File

@@ -150,7 +150,9 @@ class JellyfinMediaPlayer(JellyfinClientEntity, MediaPlayerEntity):
self._attr_state = state
self._attr_is_volume_muted = volume_muted
self._attr_volume_level = volume_level
# Only update volume_level if the API provides it, otherwise preserve current value
if volume_level is not None:
self._attr_volume_level = volume_level
self._attr_media_content_type = media_content_type
self._attr_media_content_id = media_content_id
self._attr_media_title = media_title
@@ -190,7 +192,9 @@ class JellyfinMediaPlayer(JellyfinClientEntity, MediaPlayerEntity):
)
features = MediaPlayerEntityFeature(0)
if "PlayMediaSource" in commands:
if "PlayMediaSource" in commands or self.capabilities.get(
"SupportsMediaControl", False
):
features |= (
MediaPlayerEntityFeature.BROWSE_MEDIA
| MediaPlayerEntityFeature.PLAY_MEDIA
@@ -201,10 +205,10 @@ class JellyfinMediaPlayer(JellyfinClientEntity, MediaPlayerEntity):
| MediaPlayerEntityFeature.SEARCH_MEDIA
)
if "Mute" in commands:
if "Mute" in commands and "Unmute" in commands:
features |= MediaPlayerEntityFeature.VOLUME_MUTE
if "VolumeSet" in commands:
if "VolumeSet" in commands or "SetVolume" in commands:
features |= MediaPlayerEntityFeature.VOLUME_SET
return features
@@ -219,11 +223,13 @@ class JellyfinMediaPlayer(JellyfinClientEntity, MediaPlayerEntity):
"""Send pause command."""
self.coordinator.api_client.jellyfin.remote_pause(self.session_id)
self._attr_state = MediaPlayerState.PAUSED
self.schedule_update_ha_state()
def media_play(self) -> None:
"""Send play command."""
self.coordinator.api_client.jellyfin.remote_unpause(self.session_id)
self._attr_state = MediaPlayerState.PLAYING
self.schedule_update_ha_state()
def media_play_pause(self) -> None:
"""Send the PlayPause command to the session."""
@@ -233,6 +239,7 @@ class JellyfinMediaPlayer(JellyfinClientEntity, MediaPlayerEntity):
"""Send stop command."""
self.coordinator.api_client.jellyfin.remote_stop(self.session_id)
self._attr_state = MediaPlayerState.IDLE
self.schedule_update_ha_state()
def play_media(
self, media_type: MediaType | str, media_id: str, **kwargs: Any
@@ -247,6 +254,8 @@ class JellyfinMediaPlayer(JellyfinClientEntity, MediaPlayerEntity):
self.coordinator.api_client.jellyfin.remote_set_volume(
self.session_id, int(volume * 100)
)
self._attr_volume_level = volume
self.schedule_update_ha_state()
def mute_volume(self, mute: bool) -> None:
"""Mute the volume."""
@@ -254,6 +263,8 @@ class JellyfinMediaPlayer(JellyfinClientEntity, MediaPlayerEntity):
self.coordinator.api_client.jellyfin.remote_mute(self.session_id)
else:
self.coordinator.api_client.jellyfin.remote_unmute(self.session_id)
self._attr_is_volume_muted = mute
self.schedule_update_ha_state()
async def async_browse_media(
self,
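The feature detection above is broadened for Jellyfin's native clients, which report their abilities slightly differently than web sessions. A compact sketch of the same decision logic, with a small `IntFlag` standing in for `MediaPlayerEntityFeature` so it runs without Home Assistant:

```python
# Sketch: a local IntFlag replaces MediaPlayerEntityFeature.
from enum import IntFlag, auto


class Feature(IntFlag):
    NONE = 0
    PLAY_MEDIA = auto()
    VOLUME_MUTE = auto()
    VOLUME_SET = auto()


def supported_features(commands: set[str], capabilities: dict[str, bool]) -> Feature:
    features = Feature.NONE
    # Native clients may not advertise PlayMediaSource but still support media control.
    if "PlayMediaSource" in commands or capabilities.get("SupportsMediaControl", False):
        features |= Feature.PLAY_MEDIA
    # Mute is only useful if the session can also unmute again.
    if "Mute" in commands and "Unmute" in commands:
        features |= Feature.VOLUME_MUTE
    # Different client generations report the volume command under different names.
    if "VolumeSet" in commands or "SetVolume" in commands:
        features |= Feature.VOLUME_SET
    return features


print(supported_features({"Mute", "Unmute", "SetVolume"}, {"SupportsMediaControl": True}))
```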

View File

@@ -2,16 +2,19 @@
from __future__ import annotations
import logging
import math
from typing import Any
from propcache.api import cached_property
from xknx.devices import Fan as XknxFan
from xknx.telegram.address import parse_device_group_address
from homeassistant import config_entries
from homeassistant.components.fan import FanEntity, FanEntityFeature
from homeassistant.const import CONF_ENTITY_CATEGORY, CONF_NAME, Platform
from homeassistant.core import HomeAssistant
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import entity_registry as er
from homeassistant.helpers.entity_platform import (
AddConfigEntryEntitiesCallback,
async_get_current_platform,
@@ -37,6 +40,58 @@ from .storage.const import (
)
from .storage.util import ConfigExtractor
_LOGGER = logging.getLogger(__name__)
@callback
def async_migrate_yaml_uids(
hass: HomeAssistant, platform_config: list[ConfigType]
) -> None:
"""Migrate entities unique_id for YAML switch-only fan entities."""
# issue was introduced in 2026.1 - this migration in 2026.2
ent_reg = er.async_get(hass)
invalid_uid = str(None)
if (
none_entity_id := ent_reg.async_get_entity_id(Platform.FAN, DOMAIN, invalid_uid)
) is None:
return
for config in platform_config:
if not config.get(KNX_ADDRESS) and (
new_uid_base := config.get(FanSchema.CONF_SWITCH_ADDRESS)
):
break
else:
_LOGGER.info(
"No YAML entry found to migrate fan entity '%s' unique_id from '%s'. Removing entry",
none_entity_id,
invalid_uid,
)
ent_reg.async_remove(none_entity_id)
return
new_uid = str(
parse_device_group_address(
new_uid_base[0], # list of group addresses - first item is sending address
)
)
try:
ent_reg.async_update_entity(none_entity_id, new_unique_id=str(new_uid))
_LOGGER.info(
"Migrating fan entity '%s' unique_id from '%s' to %s",
none_entity_id,
invalid_uid,
new_uid,
)
except ValueError:
# New unique_id already exists - remove invalid entry. User might have changed YAML
_LOGGER.info(
"Failed to migrate fan entity '%s' unique_id from '%s' to '%s'. "
"Removing the invalid entry",
none_entity_id,
invalid_uid,
new_uid,
)
ent_reg.async_remove(none_entity_id)
async def async_setup_entry(
hass: HomeAssistant,
@@ -57,6 +112,7 @@ async def async_setup_entry(
entities: list[_KnxFan] = []
if yaml_platform_config := knx_module.config_yaml.get(Platform.FAN):
async_migrate_yaml_uids(hass, yaml_platform_config)
entities.extend(
KnxYamlFan(knx_module, entity_config)
for entity_config in yaml_platform_config
@@ -177,7 +233,10 @@ class KnxYamlFan(_KnxFan, KnxYamlEntity):
self._step_range: tuple[int, int] | None = (1, max_step) if max_step else None
self._attr_entity_category = config.get(CONF_ENTITY_CATEGORY)
self._attr_unique_id = str(self._device.speed.group_address)
if self._device.speed.group_address:
self._attr_unique_id = str(self._device.speed.group_address)
else:
self._attr_unique_id = str(self._device.switch.group_address)
class KnxUiFan(_KnxFan, KnxUiEntity):
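The migration above repairs fan entities that were registered under the literal unique_id "None" by the 2026.1 switch-only fan support. A condensed sketch of the update-or-remove decision, with a plain dict standing in for the entity registry:

```python
# Sketch: {entity_id: unique_id} dict stands in for Home Assistant's entity registry.
INVALID_UID = str(None)  # entities created by the 2026.1 issue got literally "None"


def migrate(registry: dict[str, str], entity_id: str, switch_address: str) -> None:
    """Move the stale entry to the switch group address, or drop it if that uid exists."""
    if registry.get(entity_id) != INVALID_UID:
        return  # nothing to migrate
    if switch_address in registry.values():
        # The correct unique_id already exists (e.g. the YAML changed): remove the stale entry.
        del registry[entity_id]
        return
    registry[entity_id] = switch_address


registry = {"fan.bathroom": "None", "fan.kitchen": "1/2/3"}
migrate(registry, "fan.bathroom", "1/2/5")
print(registry)  # {'fan.bathroom': '1/2/5', 'fan.kitchen': '1/2/3'}
```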

View File

@@ -7,5 +7,5 @@
"documentation": "https://www.home-assistant.io/integrations/local_calendar",
"iot_class": "local_polling",
"loggers": ["ical"],
"requirements": ["ical==12.1.2"]
"requirements": ["ical==12.1.3"]
}

View File

@@ -5,5 +5,5 @@
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/local_todo",
"iot_class": "local_polling",
"requirements": ["ical==12.1.2"]
"requirements": ["ical==12.1.3"]
}

View File

@@ -33,6 +33,7 @@ from .const import ( # noqa: F401
CONF_ALLOW_SINGLE_WORD,
CONF_ICON,
CONF_REQUIRE_ADMIN,
CONF_RESOURCE_MODE,
CONF_SHOW_IN_SIDEBAR,
CONF_TITLE,
CONF_URL_PATH,
@@ -61,7 +62,7 @@ def _validate_url_slug(value: Any) -> str:
"""Validate value is a valid url slug."""
if value is None:
raise vol.Invalid("Slug should not be None")
if "-" not in value:
if value != "lovelace" and "-" not in value:
raise vol.Invalid("Url path needs to contain a hyphen (-)")
str_value = str(value)
slg = slugify(str_value, separator="-")
@@ -84,9 +85,13 @@ CONFIG_SCHEMA = vol.Schema(
{
vol.Optional(DOMAIN, default={}): vol.Schema(
{
# Deprecated - Remove in 2026.8
vol.Optional(CONF_MODE, default=MODE_STORAGE): vol.All(
vol.Lower, vol.In([MODE_YAML, MODE_STORAGE])
),
vol.Optional(CONF_RESOURCE_MODE): vol.All(
vol.Lower, vol.In([MODE_YAML, MODE_STORAGE])
),
vol.Optional(CONF_DASHBOARDS): cv.schema_with_slug_keys(
YAML_DASHBOARD_SCHEMA,
slug_validator=_validate_url_slug,
@@ -103,7 +108,7 @@ CONFIG_SCHEMA = vol.Schema(
class LovelaceData:
"""Dataclass to store information in hass.data."""
mode: str
resource_mode: str # The mode used for resources (yaml or storage)
dashboards: dict[str | None, dashboard.LovelaceConfig]
resources: resources.ResourceYAMLCollection | resources.ResourceStorageCollection
yaml_dashboards: dict[str | None, ConfigType]
@@ -114,18 +119,9 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
mode = config[DOMAIN][CONF_MODE]
yaml_resources = config[DOMAIN].get(CONF_RESOURCES)
# Deprecated - Remove in 2026.8
# For YAML mode, register the default panel in yaml mode (temporary until user migrates)
if mode == MODE_YAML:
frontend.async_register_built_in_panel(
hass,
DOMAIN,
config={"mode": mode},
sidebar_title="overview",
sidebar_icon="mdi:view-dashboard",
sidebar_default_visible=False,
)
_async_create_yaml_mode_repair(hass)
# resource_mode controls how resources are loaded (yaml vs storage)
# Deprecated - Remove mode fallback in 2026.8
resource_mode = config[DOMAIN].get(CONF_RESOURCE_MODE, mode)
async def reload_resources_service_handler(service_call: ServiceCall) -> None:
"""Reload yaml resources."""
@@ -149,12 +145,13 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
)
hass.data[LOVELACE_DATA].resources = resource_collection
default_config: dashboard.LovelaceConfig
resource_collection: (
resources.ResourceYAMLCollection | resources.ResourceStorageCollection
)
if mode == MODE_YAML:
default_config = dashboard.LovelaceYAML(hass, None, None)
default_config = dashboard.LovelaceStorage(hass, None)
# Load resources based on resource_mode
if resource_mode == MODE_YAML:
resource_collection = await create_yaml_resource_col(hass, yaml_resources)
async_register_admin_service(
@@ -177,8 +174,6 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
)
else:
default_config = dashboard.LovelaceStorage(hass, None)
if yaml_resources is not None:
_LOGGER.warning(
"Lovelace is running in storage mode. Define resources via user"
@@ -195,18 +190,44 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
RESOURCE_UPDATE_FIELDS,
).async_setup(hass)
websocket_api.async_register_command(hass, websocket.websocket_lovelace_info)
websocket_api.async_register_command(hass, websocket.websocket_lovelace_config)
websocket_api.async_register_command(hass, websocket.websocket_lovelace_save_config)
websocket_api.async_register_command(
hass, websocket.websocket_lovelace_delete_config
)
yaml_dashboards = config[DOMAIN].get(CONF_DASHBOARDS, {})
# Deprecated - Remove in 2026.8
# For YAML mode, add the default "lovelace" dashboard if not already defined
# This migrates the legacy yaml mode to a proper yaml dashboard entry
if mode == MODE_YAML and DOMAIN not in yaml_dashboards:
translations = await async_get_translations(
hass, hass.config.language, "dashboard", {onboarding.DOMAIN}
)
title = translations.get(
"component.onboarding.dashboard.overview.title", "Overview"
)
yaml_dashboards = {
DOMAIN: {
CONF_TITLE: title,
CONF_ICON: DEFAULT_ICON,
CONF_SHOW_IN_SIDEBAR: True,
CONF_REQUIRE_ADMIN: False,
CONF_MODE: MODE_YAML,
CONF_FILENAME: LOVELACE_CONFIG_FILE,
},
**yaml_dashboards,
}
_async_create_yaml_mode_repair(hass)
hass.data[LOVELACE_DATA] = LovelaceData(
mode=mode,
resource_mode=resource_mode,
# We store a dictionary mapping url_path: config. None is the default.
dashboards={None: default_config},
resources=resource_collection,
yaml_dashboards=config[DOMAIN].get(CONF_DASHBOARDS, {}),
yaml_dashboards=yaml_dashboards,
)
if hass.config.recovery_mode:
@@ -450,7 +471,7 @@ async def _async_migrate_default_config(
# Deprecated - Remove in 2026.8
@callback
def _async_create_yaml_mode_repair(hass: HomeAssistant) -> None:
"""Create repair issue for YAML mode migration."""
"""Create repair issue for YAML mode deprecation."""
ir.async_create_issue(
hass,
DOMAIN,

View File

@@ -158,7 +158,15 @@ async def _get_dashboard_info(
"""Load a dashboard and return info on views."""
if url_path == DEFAULT_DASHBOARD:
url_path = None
dashboard = hass.data[LOVELACE_DATA].dashboards.get(url_path)
# When url_path is None, prefer "lovelace" dashboard if it exists (for YAML mode)
# Otherwise fall back to dashboards[None] (storage mode default)
if url_path is None:
dashboard = hass.data[LOVELACE_DATA].dashboards.get(DOMAIN) or hass.data[
LOVELACE_DATA
].dashboards.get(None)
else:
dashboard = hass.data[LOVELACE_DATA].dashboards.get(url_path)
if dashboard is None:
raise ValueError("Invalid dashboard specified")
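Both this helper and the websocket handler further below now resolve the default dashboard the same way: prefer a dashboard registered under the "lovelace" url path (the migrated YAML default), then fall back to the storage-mode default stored under `None`. A minimal sketch of that lookup rule:

```python
# Sketch of the default-dashboard fallback lookup.
from typing import Any

DOMAIN = "lovelace"


def resolve_dashboard(dashboards: dict[str | None, Any], url_path: str | None) -> Any:
    if url_path is None:
        # YAML-mode default first, then the storage-mode default.
        return dashboards.get(DOMAIN) or dashboards.get(None)
    return dashboards.get(url_path)


dashboards = {None: "storage-default", "lovelace": "yaml-overview", "energy": "energy-view"}
print(resolve_dashboard(dashboards, None))      # 'yaml-overview'
print(resolve_dashboard(dashboards, "energy"))  # 'energy-view'
```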

View File

@@ -57,6 +57,7 @@ RESOURCE_UPDATE_FIELDS: VolDictType = {
SERVICE_RELOAD_RESOURCES = "reload_resources"
RESOURCE_RELOAD_SERVICE_SCHEMA = vol.Schema({})
CONF_RESOURCE_MODE = "resource_mode"
CONF_TITLE = "title"
CONF_REQUIRE_ADMIN = "require_admin"
CONF_SHOW_IN_SIDEBAR = "show_in_sidebar"

View File

@@ -6,8 +6,8 @@
},
"issues": {
"yaml_mode_deprecated": {
"description": "Starting with Home Assistant 2026.8, the default Lovelace dashboard will no longer support YAML mode. To migrate:\n\n1. Remove `mode: yaml` from `lovelace:` in your `configuration.yaml`\n2. Rename `{config_file}` to a new filename (e.g., `my-dashboard.yaml`)\n3. Add a dashboard entry in your `configuration.yaml`:\n\n```yaml\nlovelace:\n dashboards:\n lovelace:\n mode: yaml\n filename: my-dashboard.yaml\n title: Overview\n icon: mdi:view-dashboard\n show_in_sidebar: true\n```\n\n4. Restart Home Assistant",
"title": "Lovelace YAML mode migration required"
"description": "Your YAML dashboard configuration uses the legacy `mode: yaml` option, which will be removed in Home Assistant 2026.8. Your YAML dashboards will continue to work, you just need to update how they are defined.\n\nTo update your configuration:\n\n1. Remove `mode: yaml` from `lovelace:` in your `configuration.yaml`\n2. Add a dashboard entry instead:\n\n ```yaml\n lovelace:\n resource_mode: yaml\n dashboards:\n lovelace:\n mode: yaml\n filename: {config_file}\n title: Overview\n icon: mdi:view-dashboard\n show_in_sidebar: true\n ```\n\n3. Restart Home Assistant\n\nNote: `resource_mode: yaml` keeps loading resources from YAML. If you want to manage resources through the UI instead, you can remove this line and move your resources to Settings > Dashboards > Resources.",
"title": "Lovelace YAML configuration needs update"
}
},
"services": {

View File

@@ -42,9 +42,7 @@ async def system_health_info(hass: HomeAssistant) -> dict[str, Any]:
else:
health_info[key] = dashboard[key]
if hass.data[LOVELACE_DATA].mode == MODE_YAML:
health_info[CONF_MODE] = MODE_YAML
elif MODE_STORAGE in modes:
if MODE_STORAGE in modes:
health_info[CONF_MODE] = MODE_STORAGE
elif MODE_YAML in modes:
health_info[CONF_MODE] = MODE_YAML

View File

@@ -14,7 +14,13 @@ from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.json import json_fragment
from .const import CONF_URL_PATH, LOVELACE_DATA, ConfigNotFound
from .const import (
CONF_RESOURCE_MODE,
CONF_URL_PATH,
DOMAIN,
LOVELACE_DATA,
ConfigNotFound,
)
from .dashboard import LovelaceConfig
if TYPE_CHECKING:
@@ -38,7 +44,15 @@ def _handle_errors[_R](
msg: dict[str, Any],
) -> None:
url_path = msg.get(CONF_URL_PATH)
config = hass.data[LOVELACE_DATA].dashboards.get(url_path)
# When url_path is None, prefer "lovelace" dashboard if it exists (for YAML mode)
# Otherwise fall back to dashboards[None] (storage mode default)
if url_path is None:
config = hass.data[LOVELACE_DATA].dashboards.get(DOMAIN) or hass.data[
LOVELACE_DATA
].dashboards.get(None)
else:
config = hass.data[LOVELACE_DATA].dashboards.get(url_path)
if config is None:
connection.send_error(
@@ -100,6 +114,20 @@ async def websocket_lovelace_resources_impl(
connection.send_result(msg["id"], resources.async_items())
@websocket_api.websocket_command({"type": "lovelace/info"})
@websocket_api.async_response
async def websocket_lovelace_info(
hass: HomeAssistant,
connection: websocket_api.ActiveConnection,
msg: dict[str, Any],
) -> None:
"""Send Lovelace UI info over WebSocket connection."""
connection.send_result(
msg["id"],
{CONF_RESOURCE_MODE: hass.data[LOVELACE_DATA].resource_mode},
)
@websocket_api.websocket_command(
{
"type": "lovelace/config",

View File

@@ -8,6 +8,6 @@
"iot_class": "calculated",
"loggers": ["yt_dlp"],
"quality_scale": "internal",
"requirements": ["yt-dlp[default]==2025.12.08"],
"requirements": ["yt-dlp[default]==2026.02.04"],
"single_config_entry": true
}

View File

@@ -266,7 +266,7 @@
"name": "Browse media"
},
"clear_playlist": {
"description": "Removes all items from the playlist.",
"description": "Removes all items from a media player's playlist.",
"name": "Clear playlist"
},
"join": {
@@ -284,15 +284,15 @@
"name": "Next"
},
"media_pause": {
"description": "Pauses.",
"description": "Pauses playback on a media player.",
"name": "[%key:common::action::pause%]"
},
"media_play": {
"description": "Starts playing.",
"description": "Starts playback on a media player.",
"name": "Play"
},
"media_play_pause": {
"description": "Toggles play/pause.",
"description": "Toggles play/pause on a media player.",
"name": "Play/Pause"
},
"media_previous_track": {
@@ -310,7 +310,7 @@
"name": "Seek"
},
"media_stop": {
"description": "Stops playing.",
"description": "Stops playback on a media player.",
"name": "[%key:common::action::stop%]"
},
"play_media": {
@@ -374,7 +374,7 @@
"name": "Select sound mode"
},
"select_source": {
"description": "Sends the media player the command to change input source.",
"description": "Sends a media player the command to change the input source.",
"fields": {
"source": {
"description": "Name of the source to switch to. Platform dependent.",
@@ -398,23 +398,23 @@
"name": "[%key:common::action::toggle%]"
},
"turn_off": {
"description": "Turns off the power of the media player.",
"description": "Turns off the power of a media player.",
"name": "[%key:common::action::turn_off%]"
},
"turn_on": {
"description": "Turns on the power of the media player.",
"description": "Turns on the power of a media player.",
"name": "[%key:common::action::turn_on%]"
},
"unjoin": {
"description": "Removes the player from a group. Only works on platforms which support player groups.",
"description": "Removes a media player from a group. Only works on platforms which support player groups.",
"name": "Unjoin"
},
"volume_down": {
"description": "Turns down the volume.",
"description": "Turns down the volume of a media player.",
"name": "Turn down volume"
},
"volume_mute": {
"description": "Mutes or unmutes the media player.",
"description": "Mutes or unmutes a media player.",
"fields": {
"is_volume_muted": {
"description": "Defines whether or not it is muted.",
@@ -424,7 +424,7 @@
"name": "Mute/unmute volume"
},
"volume_set": {
"description": "Sets the volume level.",
"description": "Sets the volume level of a media player.",
"fields": {
"volume_level": {
"description": "The volume. 0 is inaudible, 1 is the maximum volume.",
@@ -434,7 +434,7 @@
"name": "Set volume"
},
"volume_up": {
"description": "Turns up the volume.",
"description": "Turns up the volume of a media player.",
"name": "Turn up volume"
}
},

View File

@@ -7,5 +7,5 @@
"integration_type": "service",
"iot_class": "cloud_polling",
"loggers": ["meteoclimatic"],
"requirements": ["pymeteoclimatic==0.1.0"]
"requirements": ["pymeteoclimatic==0.1.1"]
}

View File

@@ -806,6 +806,8 @@ class CoffeeSystemProgramId(MieleEnum, missing_to_none=True):
24813, # add profile
)
appliance_rinse = 24750, 24759, 24773, 24787, 24788
intermediate_rinsing = 24758
automatic_maintenance = 24778
descaling = 24751
brewing_unit_degrease = 24753
milk_pipework_rinse = 24754

View File

@@ -722,7 +722,7 @@ POLLED_SENSOR_TYPES: Final[tuple[MieleSensorDefinition[MieleFillingLevel], ...]]
description=MieleSensorDescription[MieleFillingLevel](
key="power_disk_level",
translation_key="power_disk_level",
value_fn=lambda value: None,
value_fn=lambda value: value.power_disc_filling_level,
native_unit_of_measurement=PERCENTAGE,
entity_category=EntityCategory.DIAGNOSTIC,
),

View File

@@ -288,6 +288,7 @@
"auto": "[%key:common::state::auto%]",
"auto_roast": "Auto roast",
"automatic": "Automatic",
"automatic_maintenance": "Automatic maintenance",
"automatic_plus": "Automatic plus",
"baguettes": "Baguettes",
"barista_assistant": "BaristaAssistant",
@@ -551,6 +552,7 @@
"hygiene": "Hygiene",
"intensive": "Intensive",
"intensive_bake": "Intensive bake",
"intermediate_rinsing": "Intermediate rinsing",
"iridescent_shark_fillet": "Iridescent shark (fillet)",
"japanese_tea": "Japanese tea",
"jasmine_rice_rapid_steam_cooking": "Jasmine rice (rapid steam cooking)",

View File

@@ -34,7 +34,7 @@
},
"user": {
"data": {
"domain": "[%key:common::config_flow::data::username%]",
"domain": "Domain",
"host": "[%key:common::config_flow::data::host%]",
"password": "Dynamic DNS password"
},

View File

@@ -96,7 +96,6 @@ async def validate_nibegw_input(
"""Validate the user input allows us to connect."""
heatpump = HeatPump(Model[data[CONF_MODEL]])
heatpump.word_swap = True
await heatpump.initialize()
connection = NibeGW(
@@ -114,6 +113,9 @@ async def validate_nibegw_input(
"Address already in use", "listening_port", "address_in_use"
) from exception
if heatpump.word_swap is None:
heatpump.word_swap = True
try:
await connection.verify_connectivity()
except (ReadSendException, CoilWriteSendException) as exception:

View File

@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/nibe_heatpump",
"integration_type": "device",
"iot_class": "local_polling",
"requirements": ["nibe==2.21.0"]
"requirements": ["nibe==2.22.0"]
}

View File

@@ -594,7 +594,8 @@ UNIT_CONVERTERS: dict[NumberDeviceClass, type[BaseUnitConverter]] = {
}
# We translate units that were using the legacy coding of μ \u00b5
# to units using recommended coding of μ \u03bc
# to units using recommended coding of μ \u03bc and
# we convert alternative accepted units to the preferred unit.
AMBIGUOUS_UNITS: dict[str | None, str] = {
"\u00b5Sv/h": "μSv/h", # aranet: radiation rate
"\u00b5S/cm": UnitOfConductivity.MICROSIEMENS_PER_CM,
@@ -604,4 +605,9 @@ AMBIGUOUS_UNITS: dict[str | None, str] = {
"\u00b5mol/s⋅m²": "μmol/s⋅m²", # fyta: light
"\u00b5g": UnitOfMass.MICROGRAMS,
"\u00b5s": UnitOfTime.MICROSECONDS,
"mVAr": UnitOfReactivePower.MILLIVOLT_AMPERE_REACTIVE,
"VAr": UnitOfReactivePower.VOLT_AMPERE_REACTIVE,
"kVAr": UnitOfReactivePower.KILO_VOLT_AMPERE_REACTIVE,
"VArh": UnitOfReactiveEnergy.VOLT_AMPERE_REACTIVE_HOUR,
"kVArh": UnitOfReactiveEnergy.KILO_VOLT_AMPERE_REACTIVE_HOUR,
}
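The new entries extend the normalization table with alternative reactive power/energy spellings. A small sketch of how such a table is applied, with plain strings assumed for the Home Assistant unit enum values ("var", "kvarh"):

```python
# Sketch: normalize a device-reported unit string via the ambiguity table.
AMBIGUOUS_UNITS: dict[str | None, str] = {
    "\u00b5Sv/h": "μSv/h",  # legacy micro sign -> Greek mu
    "VAr": "var",            # alternative reactive power spelling (assumed enum value)
    "kVArh": "kvarh",        # alternative reactive energy spelling (assumed enum value)
}


def normalize_unit(unit: str | None) -> str | None:
    """Return the preferred spelling if known, otherwise pass the unit through."""
    return AMBIGUOUS_UNITS.get(unit, unit)


print(normalize_unit("VAr"))  # 'var'
print(normalize_unit("W"))    # 'W' (unknown units pass through unchanged)
```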

View File

@@ -14,7 +14,6 @@ from onedrive_personal_sdk.exceptions import (
NotFoundError,
OneDriveException,
)
from onedrive_personal_sdk.models.items import ItemUpdate
from homeassistant.const import CONF_ACCESS_TOKEN, Platform
from homeassistant.core import HomeAssistant
@@ -72,15 +71,6 @@ async def async_setup_entry(hass: HomeAssistant, entry: OneDriveConfigEntry) ->
entry, data={**entry.data, CONF_FOLDER_ID: backup_folder.id}
)
# write instance id to description
if backup_folder.description != (instance_id := await async_get_instance_id(hass)):
await _handle_item_operation(
lambda: client.update_drive_item(
backup_folder.id, ItemUpdate(description=instance_id)
),
folder_name,
)
# update in case folder was renamed manually inside OneDrive
if backup_folder.name != entry.data[CONF_FOLDER_NAME]:
hass.config_entries.async_update_entry(
@@ -122,7 +112,11 @@ async def async_unload_entry(hass: HomeAssistant, entry: OneDriveConfigEntry) ->
async def _migrate_backup_files(client: OneDriveClient, backup_folder_id: str) -> None:
"""Migrate backup files to metadata version 2."""
"""Migrate backup files from metadata version 1 to version 2.
Version 1: Backup metadata was stored in the backup file's description field.
Version 2: Backup metadata is stored in a separate .metadata.json file.
"""
files = await client.list_drive_items(backup_folder_id)
for file in files:
if file.description and '"metadata_version": 1' in (
@@ -131,24 +125,11 @@ async def _migrate_backup_files(client: OneDriveClient, backup_folder_id: str) -
metadata = loads(metadata_json)
del metadata["metadata_version"]
metadata_filename = file.name.rsplit(".", 1)[0] + ".metadata.json"
-metadata_file = await client.upload_file(
+await client.upload_file(
backup_folder_id,
metadata_filename,
dumps(metadata),
)
metadata_description = {
"metadata_version": 2,
"backup_id": metadata["backup_id"],
"backup_file_id": file.id,
}
await client.update_drive_item(
path_or_id=metadata_file.id,
data=ItemUpdate(description=dumps(metadata_description)),
)
await client.update_drive_item(
path_or_id=file.id,
data=ItemUpdate(description=""),
)
_LOGGER.debug("Migrated backup file %s", file.name)

View File

@@ -3,10 +3,7 @@
from __future__ import annotations
from collections.abc import AsyncIterator, Callable, Coroutine
from dataclasses import dataclass
from functools import wraps
from html import unescape
from json import dumps, loads
import logging
from time import time
from typing import Any, Concatenate
@@ -18,7 +15,6 @@ from onedrive_personal_sdk.exceptions import (
HashMismatchError,
OneDriveException,
)
from onedrive_personal_sdk.models.items import ItemUpdate
from onedrive_personal_sdk.models.upload import FileInfo
from homeassistant.components.backup import (
@@ -30,6 +26,8 @@ from homeassistant.components.backup import (
)
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.json import json_dumps
from homeassistant.util.json import json_loads_object
from .const import CONF_DELETE_PERMANENTLY, DATA_BACKUP_AGENT_LISTENERS, DOMAIN
from .coordinator import OneDriveConfigEntry
@@ -38,7 +36,6 @@ _LOGGER = logging.getLogger(__name__)
MAX_CHUNK_SIZE = 60 * 1024 * 1024 # largest chunk possible, must be <= 60 MiB
TARGET_CHUNKS = 20
TIMEOUT = ClientTimeout(connect=10, total=43200) # 12 hours
METADATA_VERSION = 2
CACHE_TTL = 300
@@ -104,13 +101,10 @@ def handle_backup_errors[_R, **P](
return wrapper
@dataclass(kw_only=True)
class OneDriveBackup:
"""Define a OneDrive backup."""
backup: AgentBackup
backup_file_id: str
metadata_file_id: str
def suggested_filenames(backup: AgentBackup) -> tuple[str, str]:
"""Return the suggested filenames for the backup and metadata."""
base_name = suggested_filename(backup).rsplit(".", 1)[0]
return f"{base_name}.tar", f"{base_name}.metadata.json"
class OneDriveBackupAgent(BackupAgent):
@@ -129,7 +123,7 @@ class OneDriveBackupAgent(BackupAgent):
self.name = entry.title
assert entry.unique_id
self.unique_id = entry.unique_id
self._backup_cache: dict[str, OneDriveBackup] = {}
self._cache_backup_metadata: dict[str, AgentBackup] = {}
self._cache_expiration = time()
@handle_backup_errors
@@ -137,12 +131,11 @@ class OneDriveBackupAgent(BackupAgent):
self, backup_id: str, **kwargs: Any
) -> AsyncIterator[bytes]:
"""Download a backup file."""
backups = await self._list_cached_backups()
if backup_id not in backups:
raise BackupNotFound(f"Backup {backup_id} not found")
backup = await self._find_backup_by_id(backup_id)
backup_filename, _ = suggested_filenames(backup)
stream = await self._client.download_drive_item(
backups[backup_id].backup_file_id, timeout=TIMEOUT
f"{self._folder_id}:/{backup_filename}:", timeout=TIMEOUT
)
return stream.iter_chunked(1024)
@@ -155,9 +148,9 @@ class OneDriveBackupAgent(BackupAgent):
**kwargs: Any,
) -> None:
"""Upload a backup."""
filename = suggested_filename(backup)
backup_filename, metadata_filename = suggested_filenames(backup)
file = FileInfo(
filename,
backup_filename,
backup.size,
self._folder_id,
await open_stream(),
@@ -173,7 +166,7 @@ class OneDriveBackupAgent(BackupAgent):
upload_chunk_size = max(upload_chunk_size, 320 * 1024)
try:
backup_file = await LargeFileUploadClient.upload(
await LargeFileUploadClient.upload(
self._token_function,
file,
upload_chunk_size=upload_chunk_size,
@@ -185,35 +178,27 @@ class OneDriveBackupAgent(BackupAgent):
"Hash validation failed, backup file might be corrupt"
) from err
# store metadata in metadata file
description = dumps(backup.as_dict())
_LOGGER.debug("Creating metadata: %s", description)
metadata_filename = filename.rsplit(".", 1)[0] + ".metadata.json"
_LOGGER.debug("Uploaded backup to %s", backup_filename)
# Store metadata in separate metadata file (just backup.as_dict(), no extra fields)
metadata_content = json_dumps(backup.as_dict())
try:
metadata_file = await self._client.upload_file(
await self._client.upload_file(
self._folder_id,
metadata_filename,
description,
metadata_content,
)
except OneDriveException:
await self._client.delete_drive_item(backup_file.id)
# Clean up the backup file if metadata upload fails
_LOGGER.debug(
"Uploading metadata failed, deleting backup file %s", backup_filename
)
await self._client.delete_drive_item(
f"{self._folder_id}:/{backup_filename}:"
)
raise
# add metadata to the metadata file
metadata_description = {
"metadata_version": METADATA_VERSION,
"backup_id": backup.backup_id,
"backup_file_id": backup_file.id,
}
try:
await self._client.update_drive_item(
path_or_id=metadata_file.id,
data=ItemUpdate(description=dumps(metadata_description)),
)
except OneDriveException:
await self._client.delete_drive_item(backup_file.id)
await self._client.delete_drive_item(metadata_file.id)
raise
_LOGGER.debug("Uploaded metadata file %s", metadata_filename)
self._cache_expiration = time()
@handle_backup_errors
@@ -223,66 +208,63 @@ class OneDriveBackupAgent(BackupAgent):
**kwargs: Any,
) -> None:
"""Delete a backup file."""
backups = await self._list_cached_backups()
if backup_id not in backups:
raise BackupNotFound(f"Backup {backup_id} not found")
backup = backups[backup_id]
backup = await self._find_backup_by_id(backup_id)
backup_filename, metadata_filename = suggested_filenames(backup)
delete_permanently = self._entry.options.get(CONF_DELETE_PERMANENTLY, False)
await self._client.delete_drive_item(backup.backup_file_id, delete_permanently)
await self._client.delete_drive_item(
backup.metadata_file_id, delete_permanently
f"{self._folder_id}:/{backup_filename}:", delete_permanently
)
await self._client.delete_drive_item(
f"{self._folder_id}:/{metadata_filename}:", delete_permanently
)
_LOGGER.debug("Deleted backup %s", backup_filename)
self._cache_expiration = time()
@handle_backup_errors
async def async_list_backups(self, **kwargs: Any) -> list[AgentBackup]:
"""List backups."""
return [
backup.backup for backup in (await self._list_cached_backups()).values()
]
return list((await self._list_cached_metadata_files()).values())
@handle_backup_errors
async def async_get_backup(self, backup_id: str, **kwargs: Any) -> AgentBackup:
"""Return a backup."""
backups = await self._list_cached_backups()
if backup_id not in backups:
raise BackupNotFound(f"Backup {backup_id} not found")
return backups[backup_id].backup
return await self._find_backup_by_id(backup_id)
async def _list_cached_backups(self) -> dict[str, OneDriveBackup]:
"""List backups with a cache."""
async def _list_cached_metadata_files(self) -> dict[str, AgentBackup]:
"""List metadata files with a cache."""
if time() <= self._cache_expiration:
return self._backup_cache
return self._cache_backup_metadata
items = await self._client.list_drive_items(self._folder_id)
async def download_backup_metadata(item_id: str) -> AgentBackup | None:
async def _download_metadata(item_id: str) -> AgentBackup | None:
"""Download metadata file."""
try:
metadata_stream = await self._client.download_drive_item(item_id)
except OneDriveException as err:
_LOGGER.warning("Error downloading metadata for %s: %s", item_id, err)
return None
metadata_json = loads(await metadata_stream.read())
return AgentBackup.from_dict(metadata_json)
backups: dict[str, OneDriveBackup] = {}
return AgentBackup.from_dict(
json_loads_object(await metadata_stream.read())
)
items = await self._client.list_drive_items(self._folder_id)
metadata_files: dict[str, AgentBackup] = {}
for item in items:
if item.description and f'"metadata_version": {METADATA_VERSION}' in (
metadata_description_json := unescape(item.description)
):
backup = await download_backup_metadata(item.id)
if backup is None:
continue
metadata_description = loads(metadata_description_json)
backups[backup.backup_id] = OneDriveBackup(
backup=backup,
backup_file_id=metadata_description["backup_file_id"],
metadata_file_id=item.id,
)
if item.name and item.name.endswith(".metadata.json"):
if metadata := await _download_metadata(item.id):
metadata_files[metadata.backup_id] = metadata
self._cache_backup_metadata = metadata_files
self._cache_expiration = time() + CACHE_TTL
self._backup_cache = backups
return backups
return self._cache_backup_metadata
async def _find_backup_by_id(self, backup_id: str) -> AgentBackup:
"""Find a backup by its backup ID on remote."""
metadata_files = await self._list_cached_metadata_files()
if backup := metadata_files.get(backup_id):
return backup
raise BackupNotFound(f"Backup {backup_id} not found")

View File

@@ -129,9 +129,6 @@ class OneDriveConfigFlow(AbstractOAuth2FlowHandler, domain=DOMAIN):
except OneDriveException:
self.logger.debug("Failed to create folder", exc_info=True)
errors["base"] = "folder_creation_error"
else:
if folder.description and folder.description != instance_id:
errors[CONF_FOLDER_NAME] = "folder_already_in_use"
if not errors:
title = (
f"{self.approot.created_by.user.display_name}'s OneDrive"

View File

@@ -22,7 +22,6 @@
"default": "[%key:common::config_flow::create_entry::authenticated%]"
},
"error": {
"folder_already_in_use": "Folder already used for backups from another Home Assistant instance",
"folder_creation_error": "Failed to create folder",
"folder_rename_error": "Failed to rename folder"
},

View File

@@ -10,7 +10,7 @@ from homeassistant.exceptions import ConfigEntryNotReady
from .coordinator import OpenEVSEConfigEntry, OpenEVSEDataUpdateCoordinator
-PLATFORMS = [Platform.SENSOR]
+PLATFORMS = [Platform.NUMBER, Platform.SENSOR]
async def async_setup_entry(hass: HomeAssistant, entry: OpenEVSEConfigEntry) -> bool:

View File

@@ -0,0 +1,114 @@
"""Support for OpenEVSE number entities."""
from collections.abc import Awaitable, Callable
from dataclasses import dataclass
from typing import Any
from openevsehttp.__main__ import OpenEVSE
from homeassistant.components.number import (
NumberDeviceClass,
NumberEntity,
NumberEntityDescription,
)
from homeassistant.const import (
ATTR_CONNECTIONS,
ATTR_SERIAL_NUMBER,
EntityCategory,
UnitOfElectricCurrent,
)
from homeassistant.core import HomeAssistant
from homeassistant.helpers.device_registry import CONNECTION_NETWORK_MAC, DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN
from .coordinator import OpenEVSEConfigEntry, OpenEVSEDataUpdateCoordinator
@dataclass(frozen=True, kw_only=True)
class OpenEVSENumberDescription(NumberEntityDescription):
"""Describes an OpenEVSE number entity."""
value_fn: Callable[[OpenEVSE], float]
min_value_fn: Callable[[OpenEVSE], float]
max_value_fn: Callable[[OpenEVSE], float]
set_value_fn: Callable[[OpenEVSE, float], Awaitable[Any]]
NUMBER_TYPES: tuple[OpenEVSENumberDescription, ...] = (
OpenEVSENumberDescription(
key="charge_rate",
translation_key="charge_rate",
value_fn=lambda ev: ev.max_current_soft,
min_value_fn=lambda ev: ev.min_amps,
max_value_fn=lambda ev: ev.max_amps,
set_value_fn=lambda ev, value: ev.set_current(value),
native_step=1.0,
entity_category=EntityCategory.CONFIG,
native_unit_of_measurement=UnitOfElectricCurrent.AMPERE,
device_class=NumberDeviceClass.CURRENT,
),
)
async def async_setup_entry(
hass: HomeAssistant,
entry: OpenEVSEConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up OpenEVSE sensors based on config entry."""
coordinator = entry.runtime_data
identifier = entry.unique_id or entry.entry_id
async_add_entities(
OpenEVSENumber(coordinator, description, identifier, entry.unique_id)
for description in NUMBER_TYPES
)
class OpenEVSENumber(CoordinatorEntity[OpenEVSEDataUpdateCoordinator], NumberEntity):
"""Implementation of an OpenEVSE sensor."""
_attr_has_entity_name = True
entity_description: OpenEVSENumberDescription
def __init__(
self,
coordinator: OpenEVSEDataUpdateCoordinator,
description: OpenEVSENumberDescription,
identifier: str,
unique_id: str | None,
) -> None:
"""Initialize the sensor."""
super().__init__(coordinator)
self.entity_description = description
self._attr_unique_id = f"{identifier}-{description.key}"
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, identifier)},
manufacturer="OpenEVSE",
)
if unique_id:
self._attr_device_info[ATTR_CONNECTIONS] = {
(CONNECTION_NETWORK_MAC, unique_id)
}
self._attr_device_info[ATTR_SERIAL_NUMBER] = unique_id
@property
def native_value(self) -> float:
"""Return the state of the number."""
return self.entity_description.value_fn(self.coordinator.charger)
@property
def native_min_value(self) -> float:
"""Return the minimum value."""
return self.entity_description.min_value_fn(self.coordinator.charger)
@property
def native_max_value(self) -> float:
"""Return the maximum value."""
return self.entity_description.max_value_fn(self.coordinator.charger)
async def async_set_native_value(self, value: float) -> None:
"""Set new value."""
await self.entity_description.set_value_fn(self.coordinator.charger, value)
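Everything in the new platform flows through the description's callables, which all receive the coordinator's charger object. A small sketch of that pattern exercised against a dummy charger instead of the real openevsehttp client (attribute names mirror the lambdas in NUMBER_TYPES above):

    # Sketch: the description carries value/min/max getters and an async
    # setter; the entity just forwards the coordinator's charger to them.
    import asyncio
    from collections.abc import Awaitable, Callable
    from dataclasses import dataclass

    @dataclass
    class DummyCharger:
        """Stand-in for the openevsehttp client held by the coordinator."""
        min_amps: float = 6.0
        max_amps: float = 32.0
        max_current_soft: float = 16.0

        async def set_current(self, value: float) -> None:
            self.max_current_soft = value

    @dataclass(frozen=True, kw_only=True)
    class Description:
        """Mirrors the callable fields of OpenEVSENumberDescription."""
        key: str
        value_fn: Callable[[DummyCharger], float]
        min_value_fn: Callable[[DummyCharger], float]
        max_value_fn: Callable[[DummyCharger], float]
        set_value_fn: Callable[[DummyCharger, float], Awaitable[None]]

    CHARGE_RATE = Description(
        key="charge_rate",
        value_fn=lambda ev: ev.max_current_soft,
        min_value_fn=lambda ev: ev.min_amps,
        max_value_fn=lambda ev: ev.max_amps,
        set_value_fn=lambda ev, value: ev.set_current(value),
    )

    async def demo() -> None:
        charger = DummyCharger()
        assert CHARGE_RATE.min_value_fn(charger) == 6.0
        await CHARGE_RATE.set_value_fn(charger, 20)
        assert CHARGE_RATE.value_fn(charger) == 20

    asyncio.run(demo())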

View File

@@ -30,6 +30,11 @@
}
},
"entity": {
"number": {
"charge_rate": {
"name": "Charge rate"
}
},
"sensor": {
"ambient_temp": {
"name": "Ambient temperature"

View File

@@ -31,7 +31,6 @@ class OpenThermEntity(Entity):
"""Represent an OpenTherm entity."""
_attr_has_entity_name = True
-_attr_should_poll = False
entity_description: OpenThermEntityDescription
def __init__(
@@ -61,6 +60,8 @@ class OpenThermEntity(Entity):
class OpenThermStatusEntity(OpenThermEntity):
"""Represent an OpenTherm entity that receives status updates."""
+_attr_should_poll = False
async def async_added_to_hass(self) -> None:
"""Subscribe to updates from the component."""
self.async_on_remove(
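With this move only status entities opt out of polling; other OpenTherm entities fall back to the default polling behaviour. A minimal sketch of the split using plain classes rather than Home Assistant's Entity:

    # Sketch: the base entity is polled, only status entities rely on
    # pushed updates and therefore disable polling.
    class BaseEntity:
        should_poll = True  # default: state is refreshed by periodic updates

        async def async_update(self) -> None:
            """Fetch fresh state when polled."""

    class StatusEntity(BaseEntity):
        should_poll = False  # state arrives via dispatcher signals instead

        def handle_pushed_status(self, status: dict) -> None:
            """Apply a pushed status update."""
            self.latest_status = status

    assert BaseEntity.should_poll and not StatusEntity.should_poll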

View File

@@ -7,5 +7,5 @@
"integration_type": "device",
"iot_class": "local_push",
"loggers": ["pyotgw"],
"requirements": ["pyotgw==2.2.2"]
"requirements": ["pyotgw==2.2.3"]
}

View File

@@ -9,5 +9,5 @@
"iot_class": "cloud_polling",
"loggers": ["opower"],
"quality_scale": "bronze",
"requirements": ["opower==0.16.5"]
"requirements": ["opower==0.17.0"]
}

View File

@@ -8,5 +8,5 @@
"documentation": "https://www.home-assistant.io/integrations/otbr",
"integration_type": "service",
"iot_class": "local_polling",
"requirements": ["python-otbr-api==2.7.1"]
"requirements": ["python-otbr-api==2.8.0"]
}

View File

@@ -4,15 +4,18 @@ from __future__ import annotations
from datetime import datetime
-from homeassistant.components.calendar import CalendarEntity, CalendarEvent
+from homeassistant.components.calendar import (
+CalendarEntity,
+CalendarEntityDescription,
+CalendarEvent,
+)
from homeassistant.core import HomeAssistant, callback
-from homeassistant.helpers.entity import EntityDescription
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import CalendarUpdateCoordinator, RadarrConfigEntry, RadarrEvent
from .entity import RadarrEntity
-CALENDAR_TYPE = EntityDescription(
+CALENDAR_TYPE = CalendarEntityDescription(
key="calendar",
name=None,
)

View File

@@ -8,5 +8,5 @@
"iot_class": "cloud_polling",
"loggers": ["ical"],
"quality_scale": "silver",
"requirements": ["ical==12.1.2"]
"requirements": ["ical==12.1.3"]
}

View File

@@ -8,5 +8,5 @@
"iot_class": "cloud_polling",
"loggers": ["renault_api"],
"quality_scale": "silver",
"requirements": ["renault-api==0.5.2"]
"requirements": ["renault-api==0.5.3"]
}

View File

@@ -20,5 +20,5 @@
"iot_class": "local_push",
"loggers": ["reolink_aio"],
"quality_scale": "platinum",
"requirements": ["reolink-aio==0.18.1"]
"requirements": ["reolink-aio==0.18.2"]
}

View File

@@ -840,7 +840,8 @@ STATE_CLASS_UNITS: dict[SensorStateClass | str, set[type[StrEnum] | str | None]]
}
# We translate units that were using the legacy coding of μ \u00b5
-# to units using recommended coding of μ \u03bc
+# to units using recommended coding of μ \u03bc and
+# we convert alternative accepted units to the preferred unit.
AMBIGUOUS_UNITS: dict[str | None, str] = {
"\u00b5Sv/h": "μSv/h", # aranet: radiation rate
"\u00b5S/cm": UnitOfConductivity.MICROSIEMENS_PER_CM,
@@ -850,4 +851,9 @@ AMBIGUOUS_UNITS: dict[str | None, str] = {
"\u00b5mol/s⋅m²": "μmol/s⋅m²", # fyta: light
"\u00b5g": UnitOfMass.MICROGRAMS,
"\u00b5s": UnitOfTime.MICROSECONDS,
"mVAr": UnitOfReactivePower.MILLIVOLT_AMPERE_REACTIVE,
"VAr": UnitOfReactivePower.VOLT_AMPERE_REACTIVE,
"kVAr": UnitOfReactivePower.KILO_VOLT_AMPERE_REACTIVE,
"VArh": UnitOfReactiveEnergy.VOLT_AMPERE_REACTIVE_HOUR,
"kVArh": UnitOfReactiveEnergy.KILO_VOLT_AMPERE_REACTIVE_HOUR,
}

View File

@@ -6,6 +6,7 @@ from datetime import timedelta
from http import HTTPStatus
import logging
from aiohttp import ClientResponseError
from httpx import HTTPStatusError, RequestError
import jwt
from pysenz import SENZAPI, Thermostat
@@ -70,11 +71,26 @@ async def async_setup_entry(hass: HomeAssistant, entry: SENZConfigEntry) -> bool
translation_domain=DOMAIN,
translation_key="config_entry_not_ready",
) from err
except ClientResponseError as err:
if err.status in (HTTPStatus.UNAUTHORIZED, HTTPStatus.BAD_REQUEST):
raise ConfigEntryAuthFailed(
translation_domain=DOMAIN,
translation_key="config_entry_auth_failed",
) from err
raise ConfigEntryNotReady(
translation_domain=DOMAIN,
translation_key="config_entry_not_ready",
) from err
except RequestError as err:
raise ConfigEntryNotReady(
translation_domain=DOMAIN,
translation_key="config_entry_not_ready",
) from err
except Exception as err:
raise ConfigEntryAuthFailed(
translation_domain=DOMAIN,
translation_key="config_entry_auth_failed",
) from err
coordinator: SENZDataUpdateCoordinator = DataUpdateCoordinator(
hass,
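The new handlers split failures into re-authentication (HTTP 401/400) and retry-later (other HTTP errors and network errors). A sketch of that status mapping as a plain function, with stand-ins for ConfigEntryAuthFailed and ConfigEntryNotReady:

    # Sketch: decide which setup exception a failed SENZ API call should raise.
    from http import HTTPStatus

    class AuthFailed(Exception):
        """Stand-in for ConfigEntryAuthFailed: start a reauth flow."""

    class NotReady(Exception):
        """Stand-in for ConfigEntryNotReady: retry setup later."""

    def classify_http_error(status: int) -> type[Exception]:
        """Map an HTTP status from a failed API call to a setup exception."""
        if status in (HTTPStatus.UNAUTHORIZED, HTTPStatus.BAD_REQUEST):
            return AuthFailed  # token invalid or revoked -> reauthenticate
        return NotReady        # server-side or transient failure -> retry

    assert classify_http_error(401) is AuthFailed
    assert classify_http_error(503) is NotReady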

View File

@@ -14,7 +14,7 @@ from homeassistant.components.sensor import (
SensorStateClass,
)
from homeassistant.const import UnitOfTemperature
-from homeassistant.core import HomeAssistant
+from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity
@@ -81,6 +81,12 @@ class SENZSensor(CoordinatorEntity[SENZDataUpdateCoordinator], SensorEntity):
serial_number=thermostat.serial_number,
)
@callback
def _handle_coordinator_update(self) -> None:
"""Handle updated data from the coordinator."""
self._thermostat = self.coordinator.data[self._thermostat.serial_number]
self.async_write_ha_state()
@property
def available(self) -> bool:
"""Return True if the thermostat is available."""

Some files were not shown because too many files have changed in this diff.