Compare commits

..

86 Commits

Author SHA1 Message Date
Kyle Johnson
3693bc5878 Make Google Assistant fan speed percent and step speeds mutually exclusive (#162770) 2026-02-23 22:26:09 +00:00
Denis Shulyaka
af9ea5ea7a Bump anthropic to 0.83.0 (#163899) 2026-02-23 21:43:07 +00:00
Robert Resch
977d29956b Add clean_area support for Ecovacs mqtt vacuums (#163580) 2026-02-23 22:42:25 +01:00
Jamie Magee
fc9bdb3cb1 Bring aladdin_connect to Bronze quality scale (#163221)
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-02-23 22:16:51 +01:00
Erwin Douna
bb1956c738 Portainer Platinum score (#163898) 2026-02-23 22:15:59 +01:00
J. Nick Koston
9212279c2c Bump aioesphomeapi 44.1.0 (#163894) 2026-02-23 22:14:40 +01:00
Denis Shulyaka
7e162cfda2 Update Anthropic models (#163897) 2026-02-23 22:13:31 +01:00
Tom Matheussen
5611b4564f Add debounce to Satel Integra alarm panel state (#163602) 2026-02-23 21:57:39 +01:00
Manu
1a16674f86 Update quality scale of Xbox integration to platinum 🏆️ (#155577)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-23 21:56:05 +01:00
Paul Tarjan
bae4de3753 Add Hikvision integration quality scale (#159252)
Co-authored-by: Joostlek <joostlek@outlook.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-23 21:53:22 +01:00
mettolen
8f2bfa1bb0 Add select entities to Liebherr integration (#163581) 2026-02-23 21:52:50 +01:00
Manu
fb118ed516 Add support for action buttons to ntfy integration (#152014) 2026-02-23 21:46:00 +01:00
Markus Adrario
bea84151b1 homee: add one-button-remote to event platform (#163690) 2026-02-23 21:42:08 +01:00
Markus Adrario
d581d65c8b Add handling of 2 IP addresses to homee (#162731)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-23 21:36:49 +01:00
Erwin Douna
bc1837d09d Portainer gold standard review (#155231)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-23 21:34:06 +01:00
Daniel Hjelseth Høyer
9cc3c850aa Homevolt switch platform (#163415)
Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>
2026-02-23 21:16:43 +01:00
Markus
8927960fca fix(snapcast): do not crash when stream is not found (#162439)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-23 21:09:14 +01:00
Erwin Douna
49b8232260 Add stale device removal to portainer (#160017)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-23 21:05:52 +01:00
Barry vd. Heuvel
1d5e8a9e5a Weheat energy logs update (#163621)
Co-authored-by: Jesper Raemaekers <jesper.raemaekers@wefabricate.com>
2026-02-23 21:00:35 +01:00
dvdinth
501e095578 Add IntelliClima Select platform (#163637)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Norbert Rittel <norbert@rittel.de>
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-02-23 20:41:41 +01:00
Jeef
dc5eab6810 Allow support of Graph QL 4.0 / Bump pytibber 0.36.0 (#163305)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-23 20:41:05 +01:00
Manu
25787d2b75 Add DeviceInfo to Google Translate (#163762) 2026-02-23 20:29:49 +01:00
Denis Shulyaka
e57613af65 Anthropic interleaved thinking (#163583)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-23 20:24:40 +01:00
Erwin Douna
89ff86a941 Add diagnostics to Proxmox (#163800)
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-02-23 20:17:49 +01:00
Brett Adams
c62ceee8fc Update Teslemetry quality scale to silver (#163611)
Co-authored-by: Claude <noreply@anthropic.com>
2026-02-23 20:12:38 +01:00
J. Nick Koston
d732e3d5ae Add climate platform to Trane Local integration (#163571)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 20:03:08 +01:00
Denis Shulyaka
dd78da929e Improve config flow tests for Anthropic (#163757) 2026-02-23 19:15:46 +01:00
Christopher Fenner
c2b74b7612 Correct EnOcean integration type (#163725) 2026-02-23 19:11:12 +01:00
Tom
6570b413d4 Add discovery for airOS devices (#154568)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-23 18:59:50 +01:00
Christian Lackas
ea7732e9ee Add heat pump sensors to ViCare integration (#161422) 2026-02-23 18:54:12 +01:00
TheJulianJES
4c885e7ce8 Fix ZHA number entity not using device class and mode (#163827) 2026-02-23 18:53:58 +01:00
Christian Lackas
67395f1cf5 Handle PyViCare device communication and server errors in ViCare integration (#162618) 2026-02-23 18:53:00 +01:00
Joost Lekkerkerker
a552266bfc Bump python-overseerr to 0.9.0 (#163883) 2026-02-23 18:52:56 +01:00
Bouwe Westerdijk
e6c2d54232 Improve Plugwise set_hvac_mode() logic (#163713) 2026-02-23 18:52:29 +01:00
Willem-Jan van Rootselaar
994eae8412 Bump python-bsblan to 5.0.1 (#163840) 2026-02-23 18:50:49 +01:00
Abílio Costa
b712207b75 Add refrigerator temperature level select to whirlpool (#162110)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-23 18:45:48 +01:00
wollew
fa38f25d4f Enable strict typing in Velux integration (#163798) 2026-02-23 18:05:50 +01:00
Karl Beecken
3a27fa782e Teltonika quality scale: mark test-coverage done (#163707) 2026-02-23 18:03:11 +01:00
Nathan Spencer
ffeb759aba Rename Litter-Robot integration to Whisker (#163826) 2026-02-23 17:46:15 +01:00
Denis Shulyaka
e96da42996 Fix notification service exceptions fot Telegram bot (#163882) 2026-02-23 17:40:22 +01:00
Tom
ce71e540ae Add airOS device reboot button (#163718)
Co-authored-by: Erwin Douna <e.douna@gmail.com>
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-23 17:37:24 +01:00
Steve Easley
9b2bcaed92 Bump Kaleidescape integration dependency to v1.1.3 (#163884) 2026-02-23 17:36:44 +01:00
Ludovic BOUÉ
f564ad3ebe Add Matter KNX bridge fixture (#163875) 2026-02-23 17:30:51 +01:00
Joost Lekkerkerker
bd1b060718 Add integration_type device to solarlog (#163628) 2026-02-23 17:26:26 +01:00
Willem-Jan van Rootselaar
f4cab72228 Minor type fixes (#163606) 2026-02-23 17:26:07 +01:00
Leo Periou
733d381a7c Add new MyNeomitis integration (#151377)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-23 17:14:30 +01:00
Ingo Fischer
6fba886edb Replace Matter python client (#163704) 2026-02-23 17:02:39 +01:00
epenet
2f95d1ef78 Mark lock entity type hints as mandatory (#163796) 2026-02-23 16:50:52 +01:00
jesperraemaekers
6d6727ed58 Change weheat codeowner (#163860) 2026-02-23 16:49:41 +01:00
epenet
9c0c9758f0 Mark light entity type hints as mandatory (#163794) 2026-02-23 16:48:30 +01:00
epenet
bfa2da32fc Mark geo_location entity type hints as mandatory (#163790) 2026-02-23 16:48:12 +01:00
Paul Bottein
dfb17c2187 Add configurable panel properties to frontend (#162742)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Petar Petrov <MindFreeze@users.noreply.github.com>
2026-02-23 16:15:44 +01:00
Klaas Schoute
ac65163ebb Bump forecast-solar to v5.0.0 (#163841) 2026-02-23 15:58:54 +01:00
Sab44
f3042741bf Deprecate Libre Hardware Monitor versions below v0.9.5 (#163838) 2026-02-23 15:57:17 +01:00
Joost Lekkerkerker
80936497ce Add Zinvolt integration (#163449) 2026-02-23 15:55:15 +01:00
Joost Lekkerkerker
5e3d2bec68 Add integration_type device to sia (#163393) 2026-02-23 15:18:54 +01:00
Michael
e1667bd5c6 Increase request timeout from 10 to 20s in FRITZ!SmartHome (#163818) 2026-02-23 15:02:10 +01:00
Ludovic BOUÉ
cdb92a54b0 Fix Matter speaker mute toggle (#161128)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-23 14:38:37 +01:00
Erik Montnemery
74a3f4bbb9 Bump securetar to 2026.2.0 (#163226) 2026-02-23 14:03:43 +01:00
Nic Eggert
6299e8cb77 Add support for current sensors to egauge integration (#163728) 2026-02-23 13:29:20 +01:00
Joost Lekkerkerker
0f6a3a8328 Add integration_type service to snapcast (#163401) 2026-02-23 13:17:31 +01:00
Joost Lekkerkerker
77a56a3e60 Add integration_type device to smart_meter_texas (#163398) 2026-02-23 13:17:02 +01:00
Joost Lekkerkerker
cf5733de97 Add integration_type device to tilt_pi (#163667) 2026-02-23 13:16:37 +01:00
Joost Lekkerkerker
fe377befa6 Add integration_type hub to wallbox (#163752) 2026-02-23 13:12:40 +01:00
Joost Lekkerkerker
9d54236f7d Add integration_type hub to waqi (#163754) 2026-02-23 13:12:11 +01:00
Karl Beecken
bd6b8a812c Teltonika integration: add reauth config flow (#163712) 2026-02-23 13:07:19 +01:00
kshypachov
85eeac6812 Fix Matter energy sensor discovery when value is null (#162044)
Co-authored-by: Ludovic BOUÉ <lboue@users.noreply.github.com>
2026-02-23 11:52:05 +01:00
Robert Resch
ea71c40b0a Bump deebot-client to 18.0.0 (#163835) 2026-02-23 11:45:55 +01:00
Ludovic BOUÉ
99bd66194d Add allow_none_value=True to MatterDiscoverySchema for electrical power attributes (#163195)
Co-authored-by: TheJulianJES <TheJulianJES@users.noreply.github.com>
2026-02-23 11:33:57 +01:00
Sab44
13737ff2e6 Bump librehardwaremonitor-api to version 1.10.1 (#163572) 2026-02-23 11:01:58 +01:00
Tom
55c1d52310 Bump airOS to 0.6.4 (#163716) 2026-02-23 09:12:45 +01:00
hanwg
d5ef379caf Refactoring for Telegram bot (#163767) 2026-02-23 08:35:39 +01:00
Ludovic BOUÉ
a5d59decef Ikea bilresa dual button fixture (#163781)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-23 08:33:52 +01:00
Nathan Spencer
c75c5c0773 Adjust buttons to support new Litter-Robot lineup (#163825) 2026-02-23 08:32:33 +01:00
Nathan Spencer
f1fc6d10ad Adjust selects to support new Litter-Robot lineup (#163824) 2026-02-23 08:31:59 +01:00
Nathan Spencer
c3376df227 Adjust sensors to support new Litter-Robot lineup (#163823) 2026-02-23 08:30:46 +01:00
epenet
463003fc33 Add test for Tuya event (#163812) 2026-02-23 08:28:23 +01:00
Michael
b9b45c9994 Bump pyfritzhome to 0.6.20 (#163817) 2026-02-23 08:25:35 +01:00
Sebastiaan Speck
eed3b9fb89 Bump renault-api to 0.5.5 (#163821) 2026-02-23 07:35:37 +01:00
Erwin Douna
88d7954d7c Typing fix for Proxmox coordinator (#163808) 2026-02-23 06:58:43 +01:00
andarotajo
ce2afd85d4 Remove myself as code owner from dwd_weather_warnings (#163810) 2026-02-23 06:54:59 +01:00
Raphael Hehl
be96606b2c Bump uiprotect to 10.2.1 (#163816) 2026-02-23 01:05:23 +01:00
Maciej Bieniek
5afad9cabc Use async_add_executor_job in Fitbit to prevent event loop blocking (#163815) 2026-02-22 22:35:12 +01:00
epenet
19b606841d Mark fan entity type hints as mandatory (#163789) 2026-02-22 21:44:53 +01:00
Maciej Bieniek
abdd51c266 Allow unit of measurement translation in Analytics Insights (#163811) 2026-02-22 20:56:27 +01:00
Norbert Rittel
959bafe78b Fix grammar of amcrest.ptz_control action description (#163802) 2026-02-22 19:47:13 +01:00
275 changed files with 14548 additions and 1334 deletions

View File

@@ -583,6 +583,7 @@ homeassistant.components.vacuum.*
homeassistant.components.vallox.*
homeassistant.components.valve.*
homeassistant.components.velbus.*
homeassistant.components.velux.*
homeassistant.components.vivotek.*
homeassistant.components.vlc_telnet.*
homeassistant.components.vodafone_station.*
@@ -612,6 +613,7 @@ homeassistant.components.yale_smart_alarm.*
homeassistant.components.yalexs_ble.*
homeassistant.components.youtube.*
homeassistant.components.zeroconf.*
homeassistant.components.zinvolt.*
homeassistant.components.zodiac.*
homeassistant.components.zone.*
homeassistant.components.zwave_js.*

CODEOWNERS generated
View File

@@ -403,8 +403,8 @@ build.json @home-assistant/supervisor
/tests/components/duke_energy/ @hunterjm
/homeassistant/components/duotecno/ @cereal2nd
/tests/components/duotecno/ @cereal2nd
/homeassistant/components/dwd_weather_warnings/ @runningman84 @stephan192 @andarotajo
/tests/components/dwd_weather_warnings/ @runningman84 @stephan192 @andarotajo
/homeassistant/components/dwd_weather_warnings/ @runningman84 @stephan192
/tests/components/dwd_weather_warnings/ @runningman84 @stephan192
/homeassistant/components/dynalite/ @ziv1234
/tests/components/dynalite/ @ziv1234
/homeassistant/components/eafm/ @Jc2k
@@ -1082,6 +1082,8 @@ build.json @home-assistant/supervisor
/tests/components/mutesync/ @currentoor
/homeassistant/components/my/ @home-assistant/core
/tests/components/my/ @home-assistant/core
/homeassistant/components/myneomitis/ @l-pr
/tests/components/myneomitis/ @l-pr
/homeassistant/components/mysensors/ @MartinHjelmare @functionpointer
/tests/components/mysensors/ @MartinHjelmare @functionpointer
/homeassistant/components/mystrom/ @fabaff
@@ -1880,8 +1882,8 @@ build.json @home-assistant/supervisor
/tests/components/webostv/ @thecode
/homeassistant/components/websocket_api/ @home-assistant/core
/tests/components/websocket_api/ @home-assistant/core
/homeassistant/components/weheat/ @jesperraemaekers
/tests/components/weheat/ @jesperraemaekers
/homeassistant/components/weheat/ @barryvdh
/tests/components/weheat/ @barryvdh
/homeassistant/components/wemo/ @esev
/tests/components/wemo/ @esev
/homeassistant/components/whirlpool/ @abmantis @mkmer
@@ -1959,6 +1961,8 @@ build.json @home-assistant/supervisor
/tests/components/zha/ @dmulcahey @adminiuga @puddly @TheJulianJES
/homeassistant/components/zimi/ @markhannon
/tests/components/zimi/ @markhannon
/homeassistant/components/zinvolt/ @joostlek
/tests/components/zinvolt/ @joostlek
/homeassistant/components/zodiac/ @JulienTant
/tests/components/zodiac/ @JulienTant
/homeassistant/components/zone/ @home-assistant/core

View File

@@ -4,7 +4,6 @@ from __future__ import annotations
from collections.abc import Iterable
from dataclasses import dataclass
import hashlib
import json
import logging
from pathlib import Path
@@ -40,17 +39,6 @@ class RestoreBackupFileContent:
restore_homeassistant: bool
def password_to_key(password: str) -> bytes:
"""Generate a AES Key from password.
Matches the implementation in supervisor.backups.utils.password_to_key.
"""
key: bytes = password.encode()
for _ in range(100):
key = hashlib.sha256(key).digest()
return key[:16]
def restore_backup_file_content(config_dir: Path) -> RestoreBackupFileContent | None:
"""Return the contents of the restore backup file."""
instruction_path = config_dir.joinpath(RESTORE_BACKUP_FILE)
@@ -96,15 +84,14 @@ def _extract_backup(
"""Extract the backup file to the config directory."""
with (
TemporaryDirectory() as tempdir,
securetar.SecureTarFile(
securetar.SecureTarArchive(
restore_content.backup_file_path,
gzip=False,
mode="r",
) as ostf,
):
ostf.extractall(
ostf.tar.extractall(
path=Path(tempdir, "extracted"),
members=securetar.secure_path(ostf),
members=securetar.secure_path(ostf.tar),
filter="fully_trusted",
)
backup_meta_file = Path(tempdir, "extracted", "backup.json")
@@ -126,10 +113,7 @@ def _extract_backup(
f"homeassistant.tar{'.gz' if backup_meta['compressed'] else ''}",
),
gzip=backup_meta["compressed"],
key=password_to_key(restore_content.password)
if restore_content.password is not None
else None,
mode="r",
password=restore_content.password,
) as istf:
istf.extractall(
path=Path(tempdir, "homeassistant"),
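The deleted `password_to_key` helper is reproduced below as a runnable sketch; after the securetar 2026.2.0 bump the diff passes `password=restore_content.password` straight through, so the library is assumed to perform an equivalent derivation internally:

```python
import hashlib


def password_to_key(password: str) -> bytes:
    """Derive a 16-byte AES key from a backup password.

    Reproduces the helper removed in this diff, which matched
    supervisor.backups.utils.password_to_key: 100 rounds of SHA-256
    over the encoded password, truncated to an AES-128 key.
    """
    key: bytes = password.encode()
    for _ in range(100):  # iterated hashing, per the removed docstring
        key = hashlib.sha256(key).digest()
    return key[:16]  # AES-128 key length
```

The derivation is deterministic, so the same password always unlocks the same archive.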

View File

@@ -23,6 +23,7 @@ from .coordinator import AirOSConfigEntry, AirOSDataUpdateCoordinator
_PLATFORMS: list[Platform] = [
Platform.BINARY_SENSOR,
Platform.BUTTON,
Platform.SENSOR,
]

View File

@@ -0,0 +1,73 @@
"""AirOS button component for Home Assistant."""
from __future__ import annotations
import logging
from airos.exceptions import AirOSException
from homeassistant.components.button import (
ButtonDeviceClass,
ButtonEntity,
ButtonEntityDescription,
)
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import DOMAIN, AirOSConfigEntry, AirOSDataUpdateCoordinator
from .entity import AirOSEntity
_LOGGER = logging.getLogger(__name__)
PARALLEL_UPDATES = 0
REBOOT_BUTTON = ButtonEntityDescription(
key="reboot",
device_class=ButtonDeviceClass.RESTART,
entity_registry_enabled_default=False,
)
async def async_setup_entry(
hass: HomeAssistant,
config_entry: AirOSConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the AirOS button from a config entry."""
async_add_entities([AirOSRebootButton(config_entry.runtime_data, REBOOT_BUTTON)])
class AirOSRebootButton(AirOSEntity, ButtonEntity):
"""Button to reboot device."""
entity_description: ButtonEntityDescription
def __init__(
self,
coordinator: AirOSDataUpdateCoordinator,
description: ButtonEntityDescription,
) -> None:
"""Initialize the AirOS client button."""
super().__init__(coordinator)
self.entity_description = description
self._attr_unique_id = f"{coordinator.data.derived.mac}_{description.key}"
async def async_press(self) -> None:
"""Handle the button press to reboot the device."""
try:
await self.coordinator.airos_device.login()
result = await self.coordinator.airos_device.reboot()
except AirOSException as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="cannot_connect",
) from err
if not result:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="reboot_failed",
) from None

View File

@@ -2,16 +2,20 @@
from __future__ import annotations
import asyncio
from collections.abc import Mapping
import logging
from typing import Any
from airos.discovery import airos_discover_devices
from airos.exceptions import (
AirOSConnectionAuthenticationError,
AirOSConnectionSetupError,
AirOSDataMissingError,
AirOSDeviceConnectionError,
AirOSEndpointError,
AirOSKeyDataMissingError,
AirOSListenerError,
)
import voluptuous as vol
@@ -36,15 +40,27 @@ from homeassistant.helpers.selector import (
TextSelectorType,
)
from .const import DEFAULT_SSL, DEFAULT_VERIFY_SSL, DOMAIN, SECTION_ADVANCED_SETTINGS
from .const import (
DEFAULT_SSL,
DEFAULT_USERNAME,
DEFAULT_VERIFY_SSL,
DEVICE_NAME,
DOMAIN,
HOSTNAME,
IP_ADDRESS,
MAC_ADDRESS,
SECTION_ADVANCED_SETTINGS,
)
from .coordinator import AirOS8
_LOGGER = logging.getLogger(__name__)
STEP_USER_DATA_SCHEMA = vol.Schema(
# Discovery duration in seconds, airOS announces every 20 seconds
DISCOVER_INTERVAL: int = 30
STEP_DISCOVERY_DATA_SCHEMA = vol.Schema(
{
vol.Required(CONF_HOST): str,
vol.Required(CONF_USERNAME, default="ubnt"): str,
vol.Required(CONF_USERNAME, default=DEFAULT_USERNAME): str,
vol.Required(CONF_PASSWORD): str,
vol.Required(SECTION_ADVANCED_SETTINGS): section(
vol.Schema(
@@ -58,6 +74,10 @@ STEP_USER_DATA_SCHEMA = vol.Schema(
}
)
STEP_MANUAL_DATA_SCHEMA = STEP_DISCOVERY_DATA_SCHEMA.extend(
{vol.Required(CONF_HOST): str}
)
class AirOSConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle a config flow for Ubiquiti airOS."""
@@ -65,14 +85,29 @@ class AirOSConfigFlow(ConfigFlow, domain=DOMAIN):
VERSION = 2
MINOR_VERSION = 1
_discovery_task: asyncio.Task | None = None
def __init__(self) -> None:
"""Initialize the config flow."""
super().__init__()
self.airos_device: AirOS8
self.errors: dict[str, str] = {}
self.discovered_devices: dict[str, dict[str, Any]] = {}
self.discovery_abort_reason: str | None = None
self.selected_device_info: dict[str, Any] = {}
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle the initial step."""
self.errors = {}
return self.async_show_menu(
step_id="user", menu_options=["discovery", "manual"]
)
async def async_step_manual(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle the manual input of host and credentials."""
self.errors = {}
@@ -84,7 +119,7 @@ class AirOSConfigFlow(ConfigFlow, domain=DOMAIN):
data=validated_info["data"],
)
return self.async_show_form(
step_id="user", data_schema=STEP_USER_DATA_SCHEMA, errors=self.errors
step_id="manual", data_schema=STEP_MANUAL_DATA_SCHEMA, errors=self.errors
)
async def _validate_and_get_device_info(
@@ -220,3 +255,163 @@ class AirOSConfigFlow(ConfigFlow, domain=DOMAIN):
),
errors=self.errors,
)
async def async_step_discovery(
self,
discovery_info: dict[str, Any] | None = None,
) -> ConfigFlowResult:
"""Start the discovery process."""
if self._discovery_task and self._discovery_task.done():
self._discovery_task = None
# Handle appropriate 'errors' as abort through progress_done
if self.discovery_abort_reason:
return self.async_show_progress_done(
next_step_id=self.discovery_abort_reason
)
# Abort through progress_done if no devices were found
if not self.discovered_devices:
_LOGGER.debug(
"No (new or unconfigured) airOS devices found during discovery"
)
return self.async_show_progress_done(
next_step_id="discovery_no_devices"
)
# Skip selecting a device if only one new/unconfigured device was found
if len(self.discovered_devices) == 1:
self.selected_device_info = list(self.discovered_devices.values())[0]
return self.async_show_progress_done(next_step_id="configure_device")
return self.async_show_progress_done(next_step_id="select_device")
if not self._discovery_task:
self.discovered_devices = {}
self._discovery_task = self.hass.async_create_task(
self._async_run_discovery_with_progress()
)
# Show the progress bar and wait for discovery to complete
return self.async_show_progress(
step_id="discovery",
progress_action="discovering",
progress_task=self._discovery_task,
description_placeholders={"seconds": str(DISCOVER_INTERVAL)},
)
async def async_step_select_device(
self,
discovery_info: dict[str, Any] | None = None,
) -> ConfigFlowResult:
"""Select a discovered device."""
if discovery_info is not None:
selected_mac = discovery_info[MAC_ADDRESS]
self.selected_device_info = self.discovered_devices[selected_mac]
return await self.async_step_configure_device()
list_options = {
mac: f"{device.get(HOSTNAME, mac)} ({device.get(IP_ADDRESS, DEVICE_NAME)})"
for mac, device in self.discovered_devices.items()
}
return self.async_show_form(
step_id="select_device",
data_schema=vol.Schema({vol.Required(MAC_ADDRESS): vol.In(list_options)}),
)
async def async_step_configure_device(
self,
user_input: dict[str, Any] | None = None,
) -> ConfigFlowResult:
"""Configure the selected device."""
self.errors = {}
if user_input is not None:
config_data = {
**user_input,
CONF_HOST: self.selected_device_info[IP_ADDRESS],
}
validated_info = await self._validate_and_get_device_info(config_data)
if validated_info:
return self.async_create_entry(
title=validated_info["title"],
data=validated_info["data"],
)
device_name = self.selected_device_info.get(
HOSTNAME, self.selected_device_info.get(IP_ADDRESS, DEVICE_NAME)
)
return self.async_show_form(
step_id="configure_device",
data_schema=STEP_DISCOVERY_DATA_SCHEMA,
errors=self.errors,
description_placeholders={"device_name": device_name},
)
async def _async_run_discovery_with_progress(self) -> None:
"""Run discovery with an embedded progress update loop."""
progress_bar = self.hass.async_create_task(self._async_update_progress_bar())
known_mac_addresses = {
entry.unique_id.lower()
for entry in self.hass.config_entries.async_entries(DOMAIN)
if entry.unique_id
}
try:
devices = await airos_discover_devices(DISCOVER_INTERVAL)
except AirOSEndpointError:
self.discovery_abort_reason = "discovery_detect_error"
except AirOSListenerError:
self.discovery_abort_reason = "discovery_listen_error"
except Exception:
self.discovery_abort_reason = "discovery_failed"
_LOGGER.exception("An error occurred during discovery")
else:
self.discovered_devices = {
mac_addr: info
for mac_addr, info in devices.items()
if mac_addr.lower() not in known_mac_addresses
}
_LOGGER.debug(
"Discovery task finished. Found %s new devices",
len(self.discovered_devices),
)
finally:
progress_bar.cancel()
async def _async_update_progress_bar(self) -> None:
"""Update progress bar every second."""
try:
for i in range(DISCOVER_INTERVAL):
progress = (i + 1) / DISCOVER_INTERVAL
self.async_update_progress(progress)
await asyncio.sleep(1)
except asyncio.CancelledError:
pass
async def async_step_discovery_no_devices(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Abort if discovery finds no (unconfigured) devices."""
return self.async_abort(reason="no_devices_found")
async def async_step_discovery_listen_error(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Abort if discovery is unable to listen on the port."""
return self.async_abort(reason="listen_error")
async def async_step_discovery_detect_error(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Abort if discovery receives incorrect broadcasts."""
return self.async_abort(reason="detect_error")
async def async_step_discovery_failed(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Abort if discovery fails for other reasons."""
return self.async_abort(reason="discovery_failed")

View File

@@ -12,3 +12,10 @@ DEFAULT_VERIFY_SSL = False
DEFAULT_SSL = True
SECTION_ADVANCED_SETTINGS = "advanced_settings"
# Discovery related
DEFAULT_USERNAME = "ubnt"
HOSTNAME = "hostname"
IP_ADDRESS = "ip_address"
MAC_ADDRESS = "mac_address"
DEVICE_NAME = "airOS device"

View File

@@ -7,5 +7,5 @@
"integration_type": "device",
"iot_class": "local_polling",
"quality_scale": "silver",
"requirements": ["airos==0.6.3"]
"requirements": ["airos==0.6.4"]
}

View File

@@ -2,6 +2,10 @@
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"detect_error": "Unable to process discovered devices data, check the documentation for supported devices",
"discovery_failed": "Unable to start discovery, check logs for details",
"listen_error": "Unable to start listening for devices",
"no_devices_found": "[%key:common::config_flow::abort::no_devices_found%]",
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]",
"reconfigure_successful": "[%key:common::config_flow::abort::reconfigure_successful%]",
"unique_id_mismatch": "Re-authentication should be used for the same device not a new one"
@@ -13,37 +17,36 @@
"unknown": "[%key:common::config_flow::error::unknown%]"
},
"flow_title": "Ubiquiti airOS device",
"progress": {
"connecting": "Connecting to the airOS device",
"discovering": "Listening for any airOS devices for {seconds} seconds"
},
"step": {
"reauth_confirm": {
"configure_device": {
"data": {
"password": "[%key:common::config_flow::data::password%]"
"password": "[%key:common::config_flow::data::password%]",
"username": "[%key:common::config_flow::data::username%]"
},
"data_description": {
"password": "[%key:component::airos::config::step::user::data_description::password%]"
}
},
"reconfigure": {
"data": {
"password": "[%key:common::config_flow::data::password%]"
},
"data_description": {
"password": "[%key:component::airos::config::step::user::data_description::password%]"
"password": "[%key:component::airos::config::step::manual::data_description::password%]",
"username": "[%key:component::airos::config::step::manual::data_description::username%]"
},
"description": "Enter the username and password for {device_name}",
"sections": {
"advanced_settings": {
"data": {
"ssl": "[%key:component::airos::config::step::user::sections::advanced_settings::data::ssl%]",
"ssl": "[%key:component::airos::config::step::manual::sections::advanced_settings::data::ssl%]",
"verify_ssl": "[%key:common::config_flow::data::verify_ssl%]"
},
"data_description": {
"ssl": "[%key:component::airos::config::step::user::sections::advanced_settings::data_description::ssl%]",
"verify_ssl": "[%key:component::airos::config::step::user::sections::advanced_settings::data_description::verify_ssl%]"
"ssl": "[%key:component::airos::config::step::manual::sections::advanced_settings::data_description::ssl%]",
"verify_ssl": "[%key:component::airos::config::step::manual::sections::advanced_settings::data_description::verify_ssl%]"
},
"name": "[%key:component::airos::config::step::user::sections::advanced_settings::name%]"
"name": "[%key:component::airos::config::step::manual::sections::advanced_settings::name%]"
}
}
},
"user": {
"manual": {
"data": {
"host": "[%key:common::config_flow::data::host%]",
"password": "[%key:common::config_flow::data::password%]",
@@ -67,6 +70,49 @@
"name": "Advanced settings"
}
}
},
"reauth_confirm": {
"data": {
"password": "[%key:common::config_flow::data::password%]"
},
"data_description": {
"password": "[%key:component::airos::config::step::manual::data_description::password%]"
}
},
"reconfigure": {
"data": {
"password": "[%key:common::config_flow::data::password%]"
},
"data_description": {
"password": "[%key:component::airos::config::step::manual::data_description::password%]"
},
"sections": {
"advanced_settings": {
"data": {
"ssl": "[%key:component::airos::config::step::manual::sections::advanced_settings::data::ssl%]",
"verify_ssl": "[%key:common::config_flow::data::verify_ssl%]"
},
"data_description": {
"ssl": "[%key:component::airos::config::step::manual::sections::advanced_settings::data_description::ssl%]",
"verify_ssl": "[%key:component::airos::config::step::manual::sections::advanced_settings::data_description::verify_ssl%]"
},
"name": "[%key:component::airos::config::step::manual::sections::advanced_settings::name%]"
}
}
},
"select_device": {
"data": {
"mac_address": "Select the device to configure"
},
"data_description": {
"mac_address": "Select the device MAC address"
}
},
"user": {
"menu_options": {
"discovery": "Listen for airOS devices on the network",
"manual": "Manually configure airOS device"
}
}
}
},
@@ -157,6 +203,9 @@
},
"key_data_missing": {
"message": "Key data not returned from device"
},
"reboot_failed": {
"message": "The device did not accept the reboot request. Try again, or check your device web interface for errors."
}
}
}

View File

@@ -2,10 +2,12 @@
from __future__ import annotations
import aiohttp
from genie_partner_sdk.client import AladdinConnectClient
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers import (
aiohttp_client,
config_entry_oauth2_flow,
@@ -31,11 +33,27 @@ async def async_setup_entry(
session = config_entry_oauth2_flow.OAuth2Session(hass, entry, implementation)
try:
await session.async_ensure_token_valid()
except aiohttp.ClientResponseError as err:
if 400 <= err.status < 500:
raise ConfigEntryAuthFailed(err) from err
raise ConfigEntryNotReady from err
except aiohttp.ClientError as err:
raise ConfigEntryNotReady from err
client = AladdinConnectClient(
api.AsyncConfigEntryAuth(aiohttp_client.async_get_clientsession(hass), session)
)
doors = await client.get_doors()
try:
doors = await client.get_doors()
except aiohttp.ClientResponseError as err:
if 400 <= err.status < 500:
raise ConfigEntryAuthFailed(err) from err
raise ConfigEntryNotReady from err
except aiohttp.ClientError as err:
raise ConfigEntryNotReady from err
entry.runtime_data = {
door.unique_id: AladdinConnectCoordinator(hass, entry, client, door)

View File

@@ -11,6 +11,18 @@ API_URL = "https://twdvzuefzh.execute-api.us-east-2.amazonaws.com/v1"
API_KEY = "k6QaiQmcTm2zfaNns5L1Z8duBtJmhDOW8JawlCC3"
class AsyncConfigFlowAuth(Auth):
"""Provide Aladdin Connect Genie authentication for config flow validation."""
def __init__(self, websession: ClientSession, access_token: str) -> None:
"""Initialize Aladdin Connect Genie auth."""
super().__init__(websession, API_URL, access_token, API_KEY)
async def async_get_access_token(self) -> str:
"""Return the access token."""
return self.access_token
class AsyncConfigEntryAuth(Auth):
"""Provide Aladdin Connect Genie authentication tied to an OAuth2 based config entry."""

View File

@@ -4,12 +4,14 @@ from collections.abc import Mapping
import logging
from typing import Any
from genie_partner_sdk.client import AladdinConnectClient
import jwt
import voluptuous as vol
from homeassistant.config_entries import SOURCE_REAUTH, ConfigFlowResult
from homeassistant.helpers import config_entry_oauth2_flow
from homeassistant.helpers import aiohttp_client, config_entry_oauth2_flow
from .api import AsyncConfigFlowAuth
from .const import CONFIG_FLOW_MINOR_VERSION, CONFIG_FLOW_VERSION, DOMAIN
@@ -52,11 +54,25 @@ class OAuth2FlowHandler(
async def async_oauth_create_entry(self, data: dict) -> ConfigFlowResult:
"""Create an oauth config entry or update existing entry for reauth."""
# Extract the user ID from the JWT token's 'sub' field
try:
token = jwt.decode(
data["token"]["access_token"], options={"verify_signature": False}
)
user_id = token["sub"]
except (jwt.DecodeError, KeyError):
return self.async_abort(reason="oauth_error")
client = AladdinConnectClient(
AsyncConfigFlowAuth(
aiohttp_client.async_get_clientsession(self.hass),
data["token"]["access_token"],
)
)
try:
await client.get_doors()
except Exception: # noqa: BLE001
return self.async_abort(reason="cannot_connect")
await self.async_set_unique_id(user_id)
if self.source == SOURCE_REAUTH:

View File

@@ -7,39 +7,31 @@ rules:
brands: done
common-modules: done
config-flow: done
config-flow-test-coverage: todo
config-flow-test-coverage: done
dependency-transparency: done
docs-actions:
status: exempt
comment: Integration does not register any service actions.
docs-high-level-description: done
docs-installation-instructions:
status: todo
comment: Documentation needs to be created.
docs-removal-instructions:
status: todo
comment: Documentation needs to be created.
docs-installation-instructions: done
docs-removal-instructions: done
entity-event-setup:
status: exempt
comment: Integration does not subscribe to external events.
entity-unique-id: done
has-entity-name: done
runtime-data: done
test-before-configure:
status: todo
comment: Config flow does not currently test connection during setup.
test-before-setup: todo
test-before-configure: done
test-before-setup: done
unique-config-entry: done
# Silver
action-exceptions: todo
config-entry-unloading: done
docs-configuration-parameters:
status: todo
comment: Documentation needs to be created.
docs-installation-parameters:
status: todo
comment: Documentation needs to be created.
status: exempt
comment: Integration does not have an options flow.
docs-installation-parameters: done
entity-unavailable: todo
integration-owner: done
log-when-unavailable: todo
@@ -52,29 +44,17 @@ rules:
# Gold
devices: done
diagnostics: todo
discovery: todo
discovery-update-info: todo
docs-data-update:
status: todo
comment: Documentation needs to be created.
docs-examples:
status: todo
comment: Documentation needs to be created.
docs-known-limitations:
status: todo
comment: Documentation needs to be created.
docs-supported-devices:
status: todo
comment: Documentation needs to be created.
docs-supported-functions:
status: todo
comment: Documentation needs to be created.
docs-troubleshooting:
status: todo
comment: Documentation needs to be created.
docs-use-cases:
status: todo
comment: Documentation needs to be created.
discovery: done
discovery-update-info:
status: exempt
comment: Integration connects via the cloud and not locally.
docs-data-update: done
docs-examples: done
docs-known-limitations: done
docs-supported-devices: done
docs-supported-functions: done
docs-troubleshooting: done
docs-use-cases: done
dynamic-devices: todo
entity-category: done
entity-device-class: done
@@ -86,7 +66,7 @@ rules:
repair-issues: todo
stale-devices:
status: todo
comment: Stale devices can be done dynamically
comment: We can automatically remove devices that are no longer reported
# Platinum
async-dependency: todo

View File

@@ -4,6 +4,7 @@
"already_configured": "[%key:common::config_flow::abort::already_configured_account%]",
"already_in_progress": "[%key:common::config_flow::abort::already_in_progress%]",
"authorize_url_timeout": "[%key:common::config_flow::abort::oauth2_authorize_url_timeout%]",
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"cloud_not_enabled": "Please make sure you run Home Assistant with `{default_config}` enabled in your configuration.yaml.",
"missing_configuration": "[%key:common::config_flow::abort::oauth2_missing_configuration%]",
"no_url_available": "[%key:common::config_flow::abort::oauth2_no_url_available%]",

View File

@@ -75,7 +75,7 @@
"name": "Go to preset"
},
"ptz_control": {
"description": "Moves (pan/tilt) and/or zoom a PTZ camera.",
"description": "Moves (pan/tilt) and/or zooms a PTZ camera.",
"fields": {
"entity_id": {
"description": "[%key:component::amcrest::services::enable_recording::fields::entity_id::description%]",

View File

@@ -38,7 +38,6 @@ def get_app_entity_description(
translation_key="apps",
name=name_slug,
state_class=SensorStateClass.TOTAL,
native_unit_of_measurement="active installations",
value_fn=lambda data: data.apps.get(name_slug),
)
@@ -52,7 +51,6 @@ def get_core_integration_entity_description(
translation_key="core_integrations",
name=name,
state_class=SensorStateClass.TOTAL,
native_unit_of_measurement="active installations",
value_fn=lambda data: data.core_integrations.get(domain),
)
@@ -66,7 +64,6 @@ def get_custom_integration_entity_description(
translation_key="custom_integrations",
translation_placeholders={"custom_integration_domain": domain},
state_class=SensorStateClass.TOTAL,
native_unit_of_measurement="active installations",
value_fn=lambda data: data.custom_integrations.get(domain),
)
@@ -77,7 +74,6 @@ GENERAL_SENSORS = [
translation_key="total_active_installations",
entity_registry_enabled_default=False,
state_class=SensorStateClass.TOTAL,
native_unit_of_measurement="active installations",
value_fn=lambda data: data.active_installations,
),
AnalyticsSensorEntityDescription(
@@ -85,7 +81,6 @@ GENERAL_SENSORS = [
translation_key="total_reports_integrations",
entity_registry_enabled_default=False,
state_class=SensorStateClass.TOTAL,
native_unit_of_measurement="active installations",
value_fn=lambda data: data.reports_integrations,
),
]

View File

@@ -24,14 +24,23 @@
},
"entity": {
"sensor": {
"apps": {
"unit_of_measurement": "active installations"
},
"core_integrations": {
"unit_of_measurement": "[%key:component::analytics_insights::entity::sensor::apps::unit_of_measurement%]"
},
"custom_integrations": {
"name": "{custom_integration_domain} (custom)"
"name": "{custom_integration_domain} (custom)",
"unit_of_measurement": "[%key:component::analytics_insights::entity::sensor::apps::unit_of_measurement%]"
},
"total_active_installations": {
"name": "Total active installations"
"name": "Total active installations",
"unit_of_measurement": "[%key:component::analytics_insights::entity::sensor::apps::unit_of_measurement%]"
},
"total_reports_integrations": {
"name": "Total reported integrations"
"name": "Total reported integrations",
"unit_of_measurement": "[%key:component::analytics_insights::entity::sensor::apps::unit_of_measurement%]"
}
}
},

View File

@@ -112,19 +112,12 @@ async def get_model_list(client: anthropic.AsyncAnthropic) -> list[SelectOptionD
# Resolve alias from versioned model name:
model_alias = (
model_info.id[:-9]
if model_info.id
not in (
"claude-3-haiku-20240307",
"claude-3-5-haiku-20241022",
"claude-3-opus-20240229",
)
if model_info.id != "claude-3-haiku-20240307"
and model_info.id[-2:-1] != "-"
else model_info.id
)
if short_form.search(model_alias):
model_alias += "-0"
if model_alias.endswith(("haiku", "opus", "sonnet")):
model_alias += "-latest"
model_options.append(
SelectOptionDict(
label=model_info.display_name,

View File
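The `get_model_list` change above collapses dated model IDs to their aliases: it strips the 9-character `-YYYYMMDD` suffix unless the ID is the pinned `claude-3-haiku-20240307` or already an alias with a single digit after the final dash, then appends `-latest` to bare family names. A standalone sketch of that logic (function name assumed for illustration):

```python
def model_alias(model_id: str) -> str:
    """Collapse a dated model id like 'claude-sonnet-4-20250514' to its alias.

    Assumes dated ids end with a 9-character '-YYYYMMDD' suffix and alias ids
    such as 'claude-sonnet-4-0' have a single digit after the final dash.
    """
    if model_id != "claude-3-haiku-20240307" and model_id[-2:-1] != "-":
        model_id = model_id[:-9]  # strip the '-YYYYMMDD' suffix
    if model_id.endswith(("haiku", "opus", "sonnet")):
        model_id += "-latest"  # bare family names resolve to the latest release
    return model_id


print(model_alias("claude-sonnet-4-20250514"))  # → claude-sonnet-4
print(model_alias("claude-3-opus-20240229"))  # → claude-3-opus-latest
```

Note the `[-2:-1]` check (rather than `[-2]`) avoids an `IndexError` on very short IDs, returning an empty string instead.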

@@ -37,8 +37,6 @@ DEFAULT = {
MIN_THINKING_BUDGET = 1024
NON_THINKING_MODELS = [
"claude-3-5", # Both sonnet and haiku
"claude-3-opus",
"claude-3-haiku",
]
@@ -51,7 +49,7 @@ NON_ADAPTIVE_THINKING_MODELS = [
"claude-opus-4-20250514",
"claude-sonnet-4-0",
"claude-sonnet-4-20250514",
"claude-3",
"claude-3-haiku",
]
UNSUPPORTED_STRUCTURED_OUTPUT_MODELS = [
@@ -60,19 +58,13 @@ UNSUPPORTED_STRUCTURED_OUTPUT_MODELS = [
"claude-opus-4-20250514",
"claude-sonnet-4-0",
"claude-sonnet-4-20250514",
"claude-3",
"claude-3-haiku",
]
WEB_SEARCH_UNSUPPORTED_MODELS = [
"claude-3-haiku",
"claude-3-opus",
"claude-3-5-sonnet-20240620",
"claude-3-5-sonnet-20241022",
]
DEPRECATED_MODELS = [
"claude-3-5-haiku",
"claude-3-7-sonnet",
"claude-3-5-sonnet",
"claude-3-opus",
"claude-3",
]

View File

@@ -132,11 +132,21 @@ class ContentDetails:
"""Native data for AssistantContent."""
citation_details: list[CitationDetails] = field(default_factory=list)
thinking_signature: str | None = None
redacted_thinking: str | None = None
def has_content(self) -> bool:
"""Check if there is any content."""
"""Check if there is any text content."""
return any(detail.length > 0 for detail in self.citation_details)
def __bool__(self) -> bool:
"""Check if there is any thinking content or citations."""
return (
self.thinking_signature is not None
or self.redacted_thinking is not None
or self.has_citations()
)
def has_citations(self) -> bool:
"""Check if there are any citations."""
return any(detail.citations for detail in self.citation_details)
@@ -246,29 +256,28 @@ def _convert_content(
content=[],
)
)
elif isinstance(messages[-1]["content"], str):
messages[-1]["content"] = [
TextBlockParam(type="text", text=messages[-1]["content"]),
]
if isinstance(content.native, ThinkingBlock):
messages[-1]["content"].append( # type: ignore[union-attr]
ThinkingBlockParam(
type="thinking",
thinking=content.thinking_content or "",
signature=content.native.signature,
if isinstance(content.native, ContentDetails):
if content.native.thinking_signature:
messages[-1]["content"].append( # type: ignore[union-attr]
ThinkingBlockParam(
type="thinking",
thinking=content.thinking_content or "",
signature=content.native.thinking_signature,
)
)
)
elif isinstance(content.native, RedactedThinkingBlock):
redacted_thinking_block = RedactedThinkingBlockParam(
type="redacted_thinking",
data=content.native.data,
)
if isinstance(messages[-1]["content"], str):
messages[-1]["content"] = [
TextBlockParam(type="text", text=messages[-1]["content"]),
redacted_thinking_block,
]
else:
messages[-1]["content"].append( # type: ignore[attr-defined]
redacted_thinking_block
if content.native.redacted_thinking:
messages[-1]["content"].append( # type: ignore[union-attr]
RedactedThinkingBlockParam(
type="redacted_thinking",
data=content.native.redacted_thinking,
)
)
if content.content:
current_index = 0
for detail in (
@@ -309,6 +318,7 @@ def _convert_content(
text=content.content[current_index:],
)
)
if content.tool_calls:
messages[-1]["content"].extend( # type: ignore[union-attr]
[
@@ -328,6 +338,14 @@ def _convert_content(
for tool_call in content.tool_calls
]
)
if (
isinstance(messages[-1]["content"], list)
and len(messages[-1]["content"]) == 1
and messages[-1]["content"][0]["type"] == "text"
):
# If there is only one text block, simplify the content to a string
messages[-1]["content"] = messages[-1]["content"][0]["text"]
else:
# Note: We don't pass SystemContent here as its passed to the API as the prompt
raise TypeError(f"Unexpected content type: {type(content)}")
@@ -379,8 +397,7 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
content_details = ContentDetails()
content_details.add_citation_detail()
input_usage: Usage | None = None
has_native = False
first_block: bool
first_block: bool = True
async for response in stream:
LOGGER.debug("Received response: %s", response)
@@ -401,13 +418,12 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
current_tool_args = ""
if response.content_block.name == output_tool:
if first_block or content_details.has_content():
if content_details.has_citations():
if content_details:
content_details.delete_empty()
yield {"native": content_details}
content_details = ContentDetails()
content_details.add_citation_detail()
yield {"role": "assistant"}
has_native = False
first_block = False
elif isinstance(response.content_block, TextBlock):
if ( # Do not start a new assistant content just for citations, concatenate consecutive blocks with citations instead.
@@ -418,12 +434,11 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
and content_details.has_content()
)
):
if content_details.has_citations():
if content_details:
content_details.delete_empty()
yield {"native": content_details}
content_details = ContentDetails()
yield {"role": "assistant"}
has_native = False
first_block = False
content_details.add_citation_detail()
if response.content_block.text:
@@ -432,14 +447,13 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
)
yield {"content": response.content_block.text}
elif isinstance(response.content_block, ThinkingBlock):
if first_block or has_native:
if content_details.has_citations():
if first_block or content_details.thinking_signature:
if content_details:
content_details.delete_empty()
yield {"native": content_details}
content_details = ContentDetails()
content_details.add_citation_detail()
yield {"role": "assistant"}
has_native = False
first_block = False
elif isinstance(response.content_block, RedactedThinkingBlock):
LOGGER.debug(
@@ -447,17 +461,15 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
"encrypted for safety reasons. This doesn't affect the quality of "
"responses"
)
if has_native:
if content_details.has_citations():
if first_block or content_details.redacted_thinking:
if content_details:
content_details.delete_empty()
yield {"native": content_details}
content_details = ContentDetails()
content_details.add_citation_detail()
yield {"role": "assistant"}
has_native = False
first_block = False
yield {"native": response.content_block}
has_native = True
content_details.redacted_thinking = response.content_block.data
elif isinstance(response.content_block, ServerToolUseBlock):
current_tool_block = ServerToolUseBlockParam(
type="server_tool_use",
@@ -467,7 +479,7 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
)
current_tool_args = ""
elif isinstance(response.content_block, WebSearchToolResultBlock):
if content_details.has_citations():
if content_details:
content_details.delete_empty()
yield {"native": content_details}
content_details = ContentDetails()
@@ -510,19 +522,16 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
else:
current_tool_args += response.delta.partial_json
elif isinstance(response.delta, TextDelta):
content_details.citation_details[-1].length += len(response.delta.text)
yield {"content": response.delta.text}
elif isinstance(response.delta, ThinkingDelta):
yield {"thinking_content": response.delta.thinking}
elif isinstance(response.delta, SignatureDelta):
yield {
"native": ThinkingBlock(
type="thinking",
thinking="",
signature=response.delta.signature,
if response.delta.text:
content_details.citation_details[-1].length += len(
response.delta.text
)
}
has_native = True
yield {"content": response.delta.text}
elif isinstance(response.delta, ThinkingDelta):
if response.delta.thinking:
yield {"thinking_content": response.delta.thinking}
elif isinstance(response.delta, SignatureDelta):
content_details.thinking_signature = response.delta.signature
elif isinstance(response.delta, CitationsDelta):
content_details.add_citation(response.delta.citation)
elif isinstance(response, RawContentBlockStopEvent):
@@ -549,7 +558,7 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
if response.delta.stop_reason == "refusal":
raise HomeAssistantError("Potential policy violation detected")
elif isinstance(response, RawMessageStopEvent):
if content_details.has_citations():
if content_details:
content_details.delete_empty()
yield {"native": content_details}
content_details = ContentDetails()

View File
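The `entity.py` diff above replaces the `has_native` flag with a `__bool__` on `ContentDetails`, so `if content_details:` answers "is there anything worth yielding as native data" in one place. A condensed sketch of that truthiness contract, reduced to the fields involved:

```python
from __future__ import annotations

from dataclasses import dataclass, field


@dataclass
class CitationDetail:
    """Minimal stand-in for the real CitationDetails entry."""

    length: int = 0
    citations: list = field(default_factory=list)


@dataclass
class ContentDetails:
    """Native data for AssistantContent (condensed sketch)."""

    citation_details: list = field(default_factory=list)
    thinking_signature: str | None = None
    redacted_thinking: str | None = None

    def has_citations(self) -> bool:
        """Check if there are any citations."""
        return any(detail.citations for detail in self.citation_details)

    def __bool__(self) -> bool:
        """Truthy when there is any thinking content or citations to persist."""
        return (
            self.thinking_signature is not None
            or self.redacted_thinking is not None
            or self.has_citations()
        )


print(bool(ContentDetails()))  # → False
print(bool(ContentDetails(thinking_signature="sig")))  # → True
```

Without the `__bool__`, a dataclass instance is always truthy, so every block boundary in `_transform_stream` would yield an empty native payload.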

@@ -8,5 +8,5 @@
"documentation": "https://www.home-assistant.io/integrations/anthropic",
"integration_type": "service",
"iot_class": "cloud_polling",
"requirements": ["anthropic==0.78.0"]
"requirements": ["anthropic==0.83.0"]
}

View File

@@ -10,15 +10,7 @@ rules:
Integration does not poll.
brands: done
common-modules: done
config-flow-test-coverage:
status: todo
comment: |
* Remove integration setup from the config flow init test
* Make `mock_setup_entry` a separate fixture
* Use the mock_config_entry fixture in `test_duplicate_entry`
* `test_duplicate_entry`: Patch `homeassistant.components.anthropic.config_flow.anthropic.resources.models.AsyncModels.list`
* Fix docstring and name for `test_form_invalid_auth` (does not only test auth)
* In `test_form_invalid_auth`, make sure the test run until CREATE_ENTRY to test that the flow is able to recover
config-flow-test-coverage: done
config-flow: done
dependency-transparency: done
docs-actions:

View File

@@ -3,7 +3,7 @@
from __future__ import annotations
from collections.abc import Iterator
from typing import TYPE_CHECKING, cast
from typing import TYPE_CHECKING
import voluptuous as vol
@@ -19,7 +19,7 @@ from homeassistant.helpers.selector import (
)
from .config_flow import get_model_list
from .const import CONF_CHAT_MODEL, DEFAULT, DEPRECATED_MODELS, DOMAIN
from .const import CONF_CHAT_MODEL, DEPRECATED_MODELS, DOMAIN
if TYPE_CHECKING:
from . import AnthropicConfigEntry
@@ -67,13 +67,23 @@ class ModelDeprecatedRepairFlow(RepairsFlow):
self._model_list_cache[entry.entry_id] = model_list
if "opus" in model:
suggested_model = "claude-opus-4-5"
elif "haiku" in model:
suggested_model = "claude-haiku-4-5"
family = "claude-opus"
elif "sonnet" in model:
suggested_model = "claude-sonnet-4-5"
family = "claude-sonnet"
else:
suggested_model = cast(str, DEFAULT[CONF_CHAT_MODEL])
family = "claude-haiku"
suggested_model = next(
(
model_option["value"]
for model_option in sorted(
(m for m in model_list if family in m["value"]),
key=lambda x: x["value"],
reverse=True,
)
),
vol.UNDEFINED,
)
schema = vol.Schema(
{

View File

@@ -33,3 +33,5 @@ EXCLUDE_DATABASE_FROM_BACKUP = [
"home-assistant_v2.db",
"home-assistant_v2.db-wal",
]
SECURETAR_CREATE_VERSION = 2

View File

@@ -20,13 +20,9 @@ import time
from typing import IO, TYPE_CHECKING, Any, Protocol, TypedDict, cast
import aiohttp
from securetar import SecureTarFile, atomic_contents_add
from securetar import SecureTarArchive, atomic_contents_add
from homeassistant.backup_restore import (
RESTORE_BACKUP_FILE,
RESTORE_BACKUP_RESULT_FILE,
password_to_key,
)
from homeassistant.backup_restore import RESTORE_BACKUP_FILE, RESTORE_BACKUP_RESULT_FILE
from homeassistant.const import __version__ as HAVERSION
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import (
@@ -60,6 +56,7 @@ from .const import (
EXCLUDE_DATABASE_FROM_BACKUP,
EXCLUDE_FROM_BACKUP,
LOGGER,
SECURETAR_CREATE_VERSION,
)
from .models import (
AddonInfo,
@@ -1858,20 +1855,22 @@ class CoreBackupReaderWriter(BackupReaderWriter):
return False
outer_secure_tarfile = SecureTarFile(
tar_file_path, "w", gzip=False, bufsize=BUF_SIZE
)
with outer_secure_tarfile as outer_secure_tarfile_tarfile:
with SecureTarArchive(
tar_file_path,
"w",
bufsize=BUF_SIZE,
create_version=SECURETAR_CREATE_VERSION,
password=password,
) as outer_secure_tarfile:
raw_bytes = json_bytes(backup_data)
fileobj = io.BytesIO(raw_bytes)
tar_info = tarfile.TarInfo(name="./backup.json")
tar_info.size = len(raw_bytes)
tar_info.mtime = int(time.time())
outer_secure_tarfile_tarfile.addfile(tar_info, fileobj=fileobj)
with outer_secure_tarfile.create_inner_tar(
outer_secure_tarfile.tar.addfile(tar_info, fileobj=fileobj)
with outer_secure_tarfile.create_tar(
"./homeassistant.tar.gz",
gzip=True,
key=password_to_key(password) if password is not None else None,
) as core_tar:
atomic_contents_add(
tar_file=core_tar,

View File

@@ -8,6 +8,6 @@
"integration_type": "service",
"iot_class": "calculated",
"quality_scale": "internal",
"requirements": ["cronsim==2.7", "securetar==2025.2.1"],
"requirements": ["cronsim==2.7", "securetar==2026.2.0"],
"single_config_entry": true
}

View File

@@ -8,7 +8,6 @@ import copy
from dataclasses import dataclass, replace
from io import BytesIO
import json
import os
from pathlib import Path, PurePath
from queue import SimpleQueue
import tarfile
@@ -16,9 +15,14 @@ import threading
from typing import IO, Any, cast
import aiohttp
from securetar import SecureTarError, SecureTarFile, SecureTarReadError
from securetar import (
SecureTarArchive,
SecureTarError,
SecureTarFile,
SecureTarReadError,
SecureTarRootKeyContext,
)
from homeassistant.backup_restore import password_to_key
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.util import dt as dt_util
@@ -29,7 +33,7 @@ from homeassistant.util.async_iterator import (
)
from homeassistant.util.json import JsonObjectType, json_loads_object
from .const import BUF_SIZE, LOGGER
from .const import BUF_SIZE, LOGGER, SECURETAR_CREATE_VERSION
from .models import AddonInfo, AgentBackup, Folder
@@ -132,17 +136,23 @@ def suggested_filename(backup: AgentBackup) -> str:
def validate_password(path: Path, password: str | None) -> bool:
"""Validate the password."""
with tarfile.open(path, "r:", bufsize=BUF_SIZE) as backup_file:
"""Validate the password.
This assumes every inner tar is encrypted with the same secure tar version and
same password.
"""
with SecureTarArchive(
path, "r", bufsize=BUF_SIZE, password=password
) as backup_file:
compressed = False
ha_tar_name = "homeassistant.tar"
try:
ha_tar = backup_file.extractfile(ha_tar_name)
ha_tar = backup_file.tar.extractfile(ha_tar_name)
except KeyError:
compressed = True
ha_tar_name = "homeassistant.tar.gz"
try:
ha_tar = backup_file.extractfile(ha_tar_name)
ha_tar = backup_file.tar.extractfile(ha_tar_name)
except KeyError:
LOGGER.error("No homeassistant.tar or homeassistant.tar.gz found")
return False
@@ -150,13 +160,12 @@ def validate_password(path: Path, password: str | None) -> bool:
with SecureTarFile(
path, # Not used
gzip=compressed,
key=password_to_key(password) if password is not None else None,
mode="r",
password=password,
fileobj=ha_tar,
):
# If we can read the tar file, the password is correct
return True
except tarfile.ReadError:
except (tarfile.ReadError, SecureTarReadError):
LOGGER.debug("Invalid password")
return False
except Exception: # noqa: BLE001
@@ -168,22 +177,23 @@ def validate_password_stream(
input_stream: IO[bytes],
password: str | None,
) -> None:
"""Decrypt a backup."""
with (
tarfile.open(fileobj=input_stream, mode="r|", bufsize=BUF_SIZE) as input_tar,
):
for obj in input_tar:
"""Validate the password.
This assumes every inner tar is encrypted with the same secure tar version and
same password.
"""
with SecureTarArchive(
fileobj=input_stream,
mode="r",
bufsize=BUF_SIZE,
streaming=True,
password=password,
) as input_archive:
for obj in input_archive.tar:
if not obj.name.endswith((".tar", ".tgz", ".tar.gz")):
continue
istf = SecureTarFile(
None, # Not used
gzip=False,
key=password_to_key(password) if password is not None else None,
mode="r",
fileobj=input_tar.extractfile(obj),
)
with istf.decrypt(obj) as decrypted:
if istf.securetar_header.plaintext_size is None:
with input_archive.extract_tar(obj) as decrypted:
if decrypted.plaintext_size is None:
raise UnsupportedSecureTarVersion
try:
decrypted.read(1) # Read a single byte to trigger the decryption
@@ -212,21 +222,25 @@ def decrypt_backup(
password: str | None,
on_done: Callable[[Exception | None], None],
minimum_size: int,
nonces: NonceGenerator,
key_context: SecureTarRootKeyContext,
) -> None:
"""Decrypt a backup."""
error: Exception | None = None
try:
try:
with (
tarfile.open(
fileobj=input_stream, mode="r|", bufsize=BUF_SIZE
) as input_tar,
SecureTarArchive(
fileobj=input_stream,
mode="r",
bufsize=BUF_SIZE,
streaming=True,
password=password,
) as input_archive,
tarfile.open(
fileobj=output_stream, mode="w|", bufsize=BUF_SIZE
) as output_tar,
):
_decrypt_backup(backup, input_tar, output_tar, password)
_decrypt_backup(backup, input_archive, output_tar)
except (DecryptError, SecureTarError, tarfile.TarError) as err:
LOGGER.warning("Error decrypting backup: %s", err)
error = err
@@ -248,19 +262,18 @@ def decrypt_backup(
def _decrypt_backup(
backup: AgentBackup,
input_tar: tarfile.TarFile,
input_archive: SecureTarArchive,
output_tar: tarfile.TarFile,
password: str | None,
) -> None:
"""Decrypt a backup."""
expected_archives = _get_expected_archives(backup)
for obj in input_tar:
for obj in input_archive.tar:
# We compare with PurePath to avoid issues with different path separators,
# for example when backup.json is added as "./backup.json"
object_path = PurePath(obj.name)
if object_path == PurePath("backup.json"):
# Rewrite the backup.json file to indicate that the backup is decrypted
if not (reader := input_tar.extractfile(obj)):
if not (reader := input_archive.tar.extractfile(obj)):
raise DecryptError
metadata = json_loads_object(reader.read())
metadata["protected"] = False
@@ -272,21 +285,15 @@ def _decrypt_backup(
prefix, _, suffix = object_path.name.partition(".")
if suffix not in ("tar", "tgz", "tar.gz"):
LOGGER.debug("Unknown file %s will not be decrypted", obj.name)
output_tar.addfile(obj, input_tar.extractfile(obj))
output_tar.addfile(obj, input_archive.tar.extractfile(obj))
continue
if prefix not in expected_archives:
LOGGER.debug("Unknown inner tar file %s will not be decrypted", obj.name)
output_tar.addfile(obj, input_tar.extractfile(obj))
output_tar.addfile(obj, input_archive.tar.extractfile(obj))
continue
istf = SecureTarFile(
None, # Not used
gzip=False,
key=password_to_key(password) if password is not None else None,
mode="r",
fileobj=input_tar.extractfile(obj),
)
with istf.decrypt(obj) as decrypted:
if (plaintext_size := istf.securetar_header.plaintext_size) is None:
with input_archive.extract_tar(obj) as decrypted:
# Guard against SecureTar v1 which doesn't store plaintext size
if (plaintext_size := decrypted.plaintext_size) is None:
raise UnsupportedSecureTarVersion
decrypted_obj = copy.deepcopy(obj)
decrypted_obj.size = plaintext_size
@@ -300,7 +307,7 @@ def encrypt_backup(
password: str | None,
on_done: Callable[[Exception | None], None],
minimum_size: int,
nonces: NonceGenerator,
key_context: SecureTarRootKeyContext,
) -> None:
"""Encrypt a backup."""
error: Exception | None = None
@@ -310,11 +317,16 @@ def encrypt_backup(
tarfile.open(
fileobj=input_stream, mode="r|", bufsize=BUF_SIZE
) as input_tar,
tarfile.open(
fileobj=output_stream, mode="w|", bufsize=BUF_SIZE
) as output_tar,
SecureTarArchive(
fileobj=output_stream,
mode="w",
bufsize=BUF_SIZE,
streaming=True,
root_key_context=key_context,
create_version=SECURETAR_CREATE_VERSION,
) as output_archive,
):
_encrypt_backup(backup, input_tar, output_tar, password, nonces)
_encrypt_backup(backup, input_tar, output_archive)
except (EncryptError, SecureTarError, tarfile.TarError) as err:
LOGGER.warning("Error encrypting backup: %s", err)
error = err
@@ -337,9 +349,7 @@ def encrypt_backup(
def _encrypt_backup(
backup: AgentBackup,
input_tar: tarfile.TarFile,
output_tar: tarfile.TarFile,
password: str | None,
nonces: NonceGenerator,
output_archive: SecureTarArchive,
) -> None:
"""Encrypt a backup."""
inner_tar_idx = 0
@@ -357,29 +367,20 @@ def _encrypt_backup(
updated_metadata_b = json.dumps(metadata).encode()
metadata_obj = copy.deepcopy(obj)
metadata_obj.size = len(updated_metadata_b)
output_tar.addfile(metadata_obj, BytesIO(updated_metadata_b))
output_archive.tar.addfile(metadata_obj, BytesIO(updated_metadata_b))
continue
prefix, _, suffix = object_path.name.partition(".")
if suffix not in ("tar", "tgz", "tar.gz"):
LOGGER.debug("Unknown file %s will not be encrypted", obj.name)
output_tar.addfile(obj, input_tar.extractfile(obj))
output_archive.tar.addfile(obj, input_tar.extractfile(obj))
continue
if prefix not in expected_archives:
LOGGER.debug("Unknown inner tar file %s will not be encrypted", obj.name)
continue
istf = SecureTarFile(
None, # Not used
gzip=False,
key=password_to_key(password) if password is not None else None,
mode="r",
fileobj=input_tar.extractfile(obj),
nonce=nonces.get(inner_tar_idx),
output_archive.import_tar(
input_tar.extractfile(obj), obj, derived_key_id=inner_tar_idx
)
inner_tar_idx += 1
with istf.encrypt(obj) as encrypted:
encrypted_obj = copy.deepcopy(obj)
encrypted_obj.size = encrypted.encrypted_size
output_tar.addfile(encrypted_obj, encrypted)
@dataclass(kw_only=True)
@@ -391,21 +392,6 @@ class _CipherWorkerStatus:
writer: AsyncIteratorWriter
class NonceGenerator:
"""Generate nonces for encryption."""
def __init__(self) -> None:
"""Initialize the generator."""
self._nonces: dict[int, bytes] = {}
def get(self, index: int) -> bytes:
"""Get a nonce for the given index."""
if index not in self._nonces:
# Generate a new nonce for the given index
self._nonces[index] = os.urandom(16)
return self._nonces[index]
class _CipherBackupStreamer:
"""Encrypt or decrypt a backup."""
@@ -417,7 +403,7 @@ class _CipherBackupStreamer:
str | None,
Callable[[Exception | None], None],
int,
NonceGenerator,
SecureTarRootKeyContext,
],
None,
]
@@ -435,7 +421,7 @@ class _CipherBackupStreamer:
self._hass = hass
self._open_stream = open_stream
self._password = password
self._nonces = NonceGenerator()
self._key_context = SecureTarRootKeyContext(password)
def size(self) -> int:
"""Return the maximum size of the decrypted or encrypted backup."""
@@ -466,7 +452,7 @@ class _CipherBackupStreamer:
self._password,
on_done,
self.size(),
self._nonces,
self._key_context,
],
)
worker_status = _CipherWorkerStatus(

View File
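The `util.py` diff above deletes `NonceGenerator` in favor of securetar's `SecureTarRootKeyContext`. The removed class cached one random 16-byte nonce per inner-tar index, so that restarting an encryption stream produced byte-identical output for each inner tar. A self-contained copy of that removed pattern, for reference:

```python
import os


class NonceGenerator:
    """Cache one random 16-byte nonce per inner-tar index (pattern removed above)."""

    def __init__(self) -> None:
        """Initialize the generator."""
        self._nonces: dict[int, bytes] = {}

    def get(self, index: int) -> bytes:
        """Get a nonce for the given index, generating it on first use."""
        if index not in self._nonces:
            self._nonces[index] = os.urandom(16)
        return self._nonces[index]


gen = NonceGenerator()
print(gen.get(0) == gen.get(0))  # → True: stable across repeated streams
print(gen.get(0) == gen.get(1))  # → False: distinct per inner tar
```

`SecureTarRootKeyContext(password)` plays the same role at a higher level: it derives a stable per-index key (`derived_key_id=inner_tar_idx` in `import_tar`) so the streamer no longer manages nonces itself.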

@@ -21,7 +21,6 @@ from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.device_registry import format_mac
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.util.enum import try_parse_enum
from . import BSBLanConfigEntry, BSBLanData
from .const import ATTR_TARGET_TEMPERATURE, DOMAIN
@@ -113,7 +112,7 @@ class BSBLANClimate(BSBLanEntity, ClimateEntity):
return target_temp.value
@property
def _hvac_mode_value(self) -> int | str | None:
def _hvac_mode_value(self) -> int | None:
"""Return the raw hvac_mode value from the coordinator."""
if (hvac_mode := self.coordinator.data.state.hvac_mode) is None:
return None
@@ -124,16 +123,14 @@ class BSBLANClimate(BSBLanEntity, ClimateEntity):
"""Return hvac operation ie. heat, cool mode."""
if (hvac_mode_value := self._hvac_mode_value) is None:
return None
# BSB-Lan returns integer values: 0=off, 1=auto, 2=eco, 3=heat
if isinstance(hvac_mode_value, int):
return BSBLAN_TO_HA_HVAC_MODE.get(hvac_mode_value)
return try_parse_enum(HVACMode, hvac_mode_value)
return BSBLAN_TO_HA_HVAC_MODE.get(hvac_mode_value)
@property
def hvac_action(self) -> HVACAction | None:
"""Return the current running hvac action."""
action = self.coordinator.data.state.hvac_action
if not action or not isinstance(action.value, int):
if (
action := self.coordinator.data.state.hvac_action
) is None or action.value is None:
return None
category = get_hvac_action_category(action.value)
return HVACAction(category.name.lower())

View File

@@ -17,24 +17,24 @@ async def async_get_config_entry_diagnostics(
# Build diagnostic data from both coordinators
diagnostics = {
"info": data.info.to_dict(),
"device": data.device.to_dict(),
"info": data.info.model_dump(),
"device": data.device.model_dump(),
"fast_coordinator_data": {
"state": data.fast_coordinator.data.state.to_dict(),
"sensor": data.fast_coordinator.data.sensor.to_dict(),
"dhw": data.fast_coordinator.data.dhw.to_dict(),
"state": data.fast_coordinator.data.state.model_dump(),
"sensor": data.fast_coordinator.data.sensor.model_dump(),
"dhw": data.fast_coordinator.data.dhw.model_dump(),
},
"static": data.static.to_dict(),
"static": data.static.model_dump(),
}
# Add DHW config and schedule from slow coordinator if available
if data.slow_coordinator.data:
slow_data = {}
if data.slow_coordinator.data.dhw_config:
slow_data["dhw_config"] = data.slow_coordinator.data.dhw_config.to_dict()
slow_data["dhw_config"] = data.slow_coordinator.data.dhw_config.model_dump()
if data.slow_coordinator.data.dhw_schedule:
slow_data["dhw_schedule"] = (
data.slow_coordinator.data.dhw_schedule.to_dict()
data.slow_coordinator.data.dhw_schedule.model_dump()
)
if slow_data:
diagnostics["slow_coordinator_data"] = slow_data

View File

@@ -7,7 +7,7 @@
"integration_type": "device",
"iot_class": "local_polling",
"loggers": ["bsblan"],
-"requirements": ["python-bsblan==4.2.1"],
+"requirements": ["python-bsblan==5.0.1"],
"zeroconf": [
{
"name": "bsb-lan*",


@@ -110,12 +110,11 @@ class BSBLANWaterHeater(BSBLanDualCoordinatorEntity, WaterHeaterEntity):
@property
def current_operation(self) -> str | None:
"""Return current operation."""
-if (operating_mode := self.coordinator.data.dhw.operating_mode) is None:
+if (
+operating_mode := self.coordinator.data.dhw.operating_mode
+) is None or operating_mode.value is None:
return None
-# The operating_mode.value is an integer (0=Off, 1=On, 2=Eco)
-if isinstance(operating_mode.value, int):
-return BSBLAN_TO_HA_OPERATION_MODE.get(operating_mode.value)
-return None
+return BSBLAN_TO_HA_OPERATION_MODE.get(operating_mode.value)
@property
def current_temperature(self) -> float | None:


@@ -1,7 +1,7 @@
{
"domain": "dwd_weather_warnings",
"name": "Deutscher Wetterdienst (DWD) Weather Warnings",
-"codeowners": ["@runningman84", "@stephan192", "@andarotajo"],
+"codeowners": ["@runningman84", "@stephan192"],
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/dwd_weather_warnings",
"integration_type": "service",


@@ -7,5 +7,5 @@
"integration_type": "hub",
"iot_class": "cloud_push",
"loggers": ["sleekxmppfs", "sucks", "deebot_client"],
-"requirements": ["py-sucks==0.9.11", "deebot-client==17.1.0"]
+"requirements": ["py-sucks==0.9.11", "deebot-client==18.0.0"]
}


@@ -8,17 +8,24 @@ from typing import TYPE_CHECKING, Any
from deebot_client.capabilities import Capabilities, DeviceType
from deebot_client.device import Device
-from deebot_client.events import FanSpeedEvent, RoomsEvent, StateEvent
-from deebot_client.models import CleanAction, CleanMode, Room, State
+from deebot_client.events import (
+CachedMapInfoEvent,
+FanSpeedEvent,
+RoomsEvent,
+StateEvent,
+)
+from deebot_client.events.map import Map
+from deebot_client.models import CleanAction, CleanMode, State
import sucks
from homeassistant.components.vacuum import (
+Segment,
StateVacuumEntity,
StateVacuumEntityDescription,
VacuumActivity,
VacuumEntityFeature,
)
-from homeassistant.core import HomeAssistant
+from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import ServiceValidationError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.util import slugify
@@ -29,6 +36,7 @@ from .entity import EcovacsEntity, EcovacsLegacyEntity
from .util import get_name_key
_LOGGER = logging.getLogger(__name__)
+_SEGMENTS_SEPARATOR = "_"
ATTR_ERROR = "error"
@@ -218,7 +226,8 @@ class EcovacsVacuum(
"""Initialize the vacuum."""
super().__init__(device, device.capabilities)
-self._rooms: list[Room] = []
+self._room_event: RoomsEvent | None = None
+self._maps: dict[str, Map] = {}
if fan_speed := self._capability.fan_speed:
self._attr_supported_features |= VacuumEntityFeature.FAN_SPEED
@@ -226,14 +235,13 @@ class EcovacsVacuum(
get_name_key(level) for level in fan_speed.types
]
+if self._capability.map and self._capability.clean.action.area:
+self._attr_supported_features |= VacuumEntityFeature.CLEAN_AREA
async def async_added_to_hass(self) -> None:
"""Set up the event listeners now that hass is ready."""
await super().async_added_to_hass()
-async def on_rooms(event: RoomsEvent) -> None:
-self._rooms = event.rooms
-self.async_write_ha_state()
async def on_status(event: StateEvent) -> None:
self._attr_activity = _STATE_TO_VACUUM_STATE[event.state]
self.async_write_ha_state()
@@ -249,8 +257,20 @@ class EcovacsVacuum(
self._subscribe(self._capability.fan_speed.event, on_fan_speed)
if map_caps := self._capability.map:
+async def on_rooms(event: RoomsEvent) -> None:
+self._room_event = event
+self._check_segments_changed()
+self.async_write_ha_state()
self._subscribe(map_caps.rooms.event, on_rooms)
+async def on_map_info(event: CachedMapInfoEvent) -> None:
+self._maps = {map_obj.id: map_obj for map_obj in event.maps}
+self._check_segments_changed()
+self._subscribe(map_caps.cached_info.event, on_map_info)
@property
def extra_state_attributes(self) -> Mapping[str, Any] | None:
"""Return entity specific state attributes.
@@ -259,7 +279,10 @@ class EcovacsVacuum(
is lowercase snake_case.
"""
rooms: dict[str, Any] = {}
-for room in self._rooms:
+if self._room_event is None:
+return rooms
+for room in self._room_event.rooms:
# convert room name to snake_case to meet the convention
room_name = slugify(room.name)
room_values = rooms.get(room_name)
@@ -338,11 +361,11 @@ class EcovacsVacuum(
translation_placeholders={"name": name},
)
-if command in "spot_area":
+if command == "spot_area":
await self._device.execute_command(
self._capability.clean.action.area(
CleanMode.SPOT_AREA,
-str(params["rooms"]),
+params["rooms"],
params.get("cleanings", 1),
)
)
@@ -350,7 +373,7 @@ class EcovacsVacuum(
await self._device.execute_command(
self._capability.clean.action.area(
CleanMode.CUSTOM_AREA,
-str(params["coordinates"]),
+params["coordinates"],
params.get("cleanings", 1),
)
)
@@ -374,3 +397,116 @@ class EcovacsVacuum(
)
return await self._device.execute_command(position_commands[0])
@callback
def _check_segments_changed(self) -> None:
"""Check if segments have changed and create repair issue."""
last_seen = self.last_seen_segments
if last_seen is None:
return
last_seen_ids = {seg.id for seg in last_seen}
current_ids = {seg.id for seg in self._get_segments()}
if current_ids != last_seen_ids:
self.async_create_segments_issue()
def _get_segments(self) -> list[Segment]:
"""Get the segments that can be cleaned."""
last_seen = self.last_seen_segments or []
if self._room_event is None or not self._maps:
# If we don't have the necessary information to determine segments, return the last
# seen segments to avoid temporarily losing all segments until we get the necessary
# information, which could cause unnecessary issues to be created
return last_seen
map_id = self._room_event.map_id
if (map_obj := self._maps.get(map_id)) is None:
_LOGGER.warning("Map ID %s not found in available maps", map_id)
return []
id_prefix = f"{map_id}{_SEGMENTS_SEPARATOR}"
other_map_ids = {
map_obj.id
for map_obj in self._maps.values()
if map_obj.id != self._room_event.map_id
}
# Include segments from the current map and any segments from other maps that were
# previously seen, as we want to continue showing segments from other maps for
# mapping purposes
segments = [
seg for seg in last_seen if _split_composite_id(seg.id)[0] in other_map_ids
]
segments.extend(
Segment(
id=f"{id_prefix}{room.id}",
name=room.name,
group=map_obj.name,
)
for room in self._room_event.rooms
)
return segments
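The merge above can be exercised on its own: segments from other maps survive from the last-seen list, while the current map's segments are rebuilt from its rooms. A sketch with a stand-in `Segment` dataclass (the real model lives in `homeassistant.components.vacuum`; field names are assumed from the code above):

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Segment:
    """Stand-in for the vacuum platform's Segment model."""

    id: str
    name: str
    group: str


def merge_segments(
    last_seen: list[Segment],
    map_id: str,
    map_name: str,
    rooms: dict[int, str],
    other_map_ids: set[str],
) -> list[Segment]:
    """Keep last-seen segments of other maps; rebuild the current map's."""
    segments = [
        seg for seg in last_seen if seg.id.partition("_")[0] in other_map_ids
    ]
    segments.extend(
        Segment(id=f"{map_id}_{room_id}", name=name, group=map_name)
        for room_id, name in rooms.items()
    )
    return segments
```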
async def async_get_segments(self) -> list[Segment]:
"""Get the segments that can be cleaned."""
return self._get_segments()
async def async_clean_segments(self, segment_ids: list[str], **kwargs: Any) -> None:
"""Perform an area clean.
Only cleans segments from the currently selected map.
"""
if not self._maps:
_LOGGER.warning("No map information available, cannot clean segments")
return
valid_room_ids: list[int | float] = []
for composite_id in segment_ids:
map_id, segment_id = _split_composite_id(composite_id)
if (map_obj := self._maps.get(map_id)) is None:
_LOGGER.warning("Map ID %s not found in available maps", map_id)
continue
if not map_obj.using:
room_name = next(
(
segment.name
for segment in self.last_seen_segments or []
if segment.id == composite_id
),
"",
)
_LOGGER.warning(
'Map "%s" is not currently selected, skipping segment "%s" (%s)',
map_obj.name,
room_name,
segment_id,
)
continue
valid_room_ids.append(int(segment_id))
if not valid_room_ids:
_LOGGER.warning(
"No valid segments to clean after validation, skipping clean segments command"
)
return
if TYPE_CHECKING:
# Supported feature is only added if clean.action.area is not None
assert self._capability.clean.action.area is not None
await self._device.execute_command(
self._capability.clean.action.area(
CleanMode.SPOT_AREA,
valid_room_ids,
1,
)
)
@callback
def _split_composite_id(composite_id: str) -> tuple[str, str]:
"""Split a composite ID into its components."""
map_id, _, segment_id = composite_id.partition(_SEGMENTS_SEPARATOR)
return map_id, segment_id
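`str.partition` splits on the first separator only, so a segment ID that itself contains the separator keeps its tail intact, and a missing separator yields an empty segment part. A quick standalone check:

```python
_SEGMENTS_SEPARATOR = "_"


def split_composite_id(composite_id: str) -> tuple[str, str]:
    """Split 'mapid_segmentid' at the first separator only."""
    map_id, _, segment_id = composite_id.partition(_SEGMENTS_SEPARATOR)
    return map_id, segment_id
```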


@@ -13,7 +13,12 @@ from homeassistant.components.sensor import (
SensorEntityDescription,
SensorStateClass,
)
-from homeassistant.const import UnitOfElectricPotential, UnitOfEnergy, UnitOfPower
+from homeassistant.const import (
+UnitOfElectricCurrent,
+UnitOfElectricPotential,
+UnitOfEnergy,
+UnitOfPower,
+)
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
@@ -59,6 +64,15 @@ SENSORS: tuple[EgaugeSensorEntityDescription, ...] = (
available_fn=lambda data, register: register in data.measurements,
supported_fn=lambda register_info: register_info.type == RegisterType.VOLTAGE,
),
EgaugeSensorEntityDescription(
key="current",
device_class=SensorDeviceClass.CURRENT,
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfElectricCurrent.AMPERE,
native_value_fn=lambda data, register: data.measurements[register],
available_fn=lambda data, register: register in data.measurements,
supported_fn=lambda register_info: register_info.type == RegisterType.CURRENT,
),
)


@@ -4,7 +4,7 @@
"codeowners": [],
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/enocean",
-"integration_type": "device",
+"integration_type": "hub",
"iot_class": "local_push",
"loggers": ["enocean"],
"requirements": ["enocean==0.50"],


@@ -17,7 +17,7 @@
"mqtt": ["esphome/discover/#"],
"quality_scale": "platinum",
"requirements": [
-"aioesphomeapi==44.0.0",
+"aioesphomeapi==44.1.0",
"esphome-dashboard-api==1.3.0",
"bleak-esphome==3.6.0"
],


@@ -72,7 +72,7 @@ class FitbitApi(ABC):
configuration = Configuration()
configuration.pool_manager = async_get_clientsession(self._hass)
configuration.access_token = token[CONF_ACCESS_TOKEN]
-return ApiClient(configuration)
+return await self._hass.async_add_executor_job(ApiClient, configuration)
async def async_get_user_profile(self) -> FitbitProfile:
"""Return the user profile from the API."""


@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/forecast_solar",
"integration_type": "service",
"iot_class": "cloud_polling",
-"requirements": ["forecast-solar==4.2.0"]
+"requirements": ["forecast-solar==5.0.0"]
}

View File

@@ -63,6 +63,7 @@ class FritzboxDataUpdateCoordinator(DataUpdateCoordinator[FritzboxCoordinatorDat
host=self.config_entry.data[CONF_HOST],
user=self.config_entry.data[CONF_USERNAME],
password=self.config_entry.data[CONF_PASSWORD],
timeout=20,
)
try:


@@ -7,7 +7,7 @@
"integration_type": "hub",
"iot_class": "local_polling",
"loggers": ["pyfritzhome"],
-"requirements": ["pyfritzhome==0.6.19"],
+"requirements": ["pyfritzhome==0.6.20"],
"ssdp": [
{
"st": "urn:schemas-upnp-org:device:fritzbox:1"


@@ -18,7 +18,7 @@ from yarl import URL
from homeassistant.components import onboarding, websocket_api
from homeassistant.components.http import KEY_HASS, HomeAssistantView, StaticPathConfig
-from homeassistant.components.websocket_api import ActiveConnection
+from homeassistant.components.websocket_api import ERR_NOT_FOUND, ActiveConnection
from homeassistant.config import async_hass_config_yaml
from homeassistant.const import (
CONF_MODE,
@@ -78,6 +78,16 @@ THEMES_STORAGE_VERSION = 1
THEMES_SAVE_DELAY = 60
DATA_THEMES_STORE: HassKey[Store] = HassKey("frontend_themes_store")
DATA_THEMES: HassKey[dict[str, Any]] = HassKey("frontend_themes")
PANELS_STORAGE_KEY = f"{DOMAIN}_panels"
PANELS_STORAGE_VERSION = 1
PANELS_SAVE_DELAY = 10
DATA_PANELS_STORE: HassKey[Store[dict[str, dict[str, Any]]]] = HassKey(
"frontend_panels_store"
)
DATA_PANELS_CONFIG: HassKey[dict[str, dict[str, Any]]] = HassKey(
"frontend_panels_config"
)
DATA_DEFAULT_THEME = "frontend_default_theme"
DATA_DEFAULT_DARK_THEME = "frontend_default_dark_theme"
DEFAULT_THEME = "default"
@@ -312,9 +322,11 @@ class Panel:
self.sidebar_default_visible = sidebar_default_visible
@callback
-def to_response(self) -> PanelResponse:
+def to_response(
+self, config_override: dict[str, Any] | None = None
+) -> PanelResponse:
"""Panel as dictionary."""
-return {
+response: PanelResponse = {
"component_name": self.component_name,
"icon": self.sidebar_icon,
"title": self.sidebar_title,
@@ -324,6 +336,18 @@ class Panel:
"require_admin": self.require_admin,
"config_panel_domain": self.config_panel_domain,
}
if config_override:
if "require_admin" in config_override:
response["require_admin"] = config_override["require_admin"]
if config_override.get("show_in_sidebar") is False:
response["title"] = None
response["icon"] = None
else:
if "icon" in config_override:
response["icon"] = config_override["icon"]
if "title" in config_override:
response["title"] = config_override["title"]
return response
@bind_hass
@@ -415,12 +439,24 @@ def _frontend_root(dev_repo_path: str | None) -> pathlib.Path:
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the serving of the frontend."""
await async_setup_frontend_storage(hass)
panels_store = hass.data[DATA_PANELS_STORE] = Store[dict[str, dict[str, Any]]](
hass, PANELS_STORAGE_VERSION, PANELS_STORAGE_KEY
)
loaded: Any = await panels_store.async_load()
if not isinstance(loaded, dict):
if loaded is not None:
_LOGGER.warning("Ignoring invalid panel storage data")
loaded = {}
hass.data[DATA_PANELS_CONFIG] = loaded
websocket_api.async_register_command(hass, websocket_get_icons)
websocket_api.async_register_command(hass, websocket_get_panels)
websocket_api.async_register_command(hass, websocket_get_themes)
websocket_api.async_register_command(hass, websocket_get_translations)
websocket_api.async_register_command(hass, websocket_get_version)
websocket_api.async_register_command(hass, websocket_subscribe_extra_js)
websocket_api.async_register_command(hass, websocket_update_panel)
hass.http.register_view(ManifestJSONView())
conf = config.get(DOMAIN, {})
@@ -559,6 +595,7 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
)
async_register_built_in_panel(hass, "profile")
async_register_built_in_panel(hass, "notfound")
@callback
def async_change_listener(
@@ -883,11 +920,18 @@ def websocket_get_panels(
) -> None:
"""Handle get panels command."""
user_is_admin = connection.user.is_admin
-panels = {
-panel_key: panel.to_response()
-for panel_key, panel in connection.hass.data[DATA_PANELS].items()
-if user_is_admin or not panel.require_admin
-}
+panels_config = hass.data[DATA_PANELS_CONFIG]
+panels: dict[str, PanelResponse] = {}
+for panel_key, panel in connection.hass.data[DATA_PANELS].items():
+config_override = panels_config.get(panel_key)
+require_admin = (
+config_override.get("require_admin", panel.require_admin)
+if config_override
+else panel.require_admin
+)
+if not user_is_admin and require_admin:
+continue
+panels[panel_key] = panel.to_response(config_override)
connection.send_message(websocket_api.result_message(msg["id"], panels))
@@ -986,6 +1030,50 @@ def websocket_subscribe_extra_js(
connection.send_message(websocket_api.result_message(msg["id"]))
@websocket_api.websocket_command(
{
vol.Required("type"): "frontend/update_panel",
vol.Required("url_path"): str,
vol.Optional("title"): vol.Any(cv.string, None),
vol.Optional("icon"): vol.Any(cv.icon, None),
vol.Optional("require_admin"): vol.Any(cv.boolean, None),
vol.Optional("show_in_sidebar"): vol.Any(cv.boolean, None),
}
)
@websocket_api.require_admin
@websocket_api.async_response
async def websocket_update_panel(
hass: HomeAssistant, connection: ActiveConnection, msg: dict[str, Any]
) -> None:
"""Handle update panel command."""
url_path: str = msg["url_path"]
if url_path not in hass.data.get(DATA_PANELS, {}):
connection.send_error(msg["id"], ERR_NOT_FOUND, "Panel not found")
return
panels_config = hass.data[DATA_PANELS_CONFIG]
panel_config = dict(panels_config.get(url_path, {}))
for key in ("title", "icon", "require_admin", "show_in_sidebar"):
if key in msg:
if (value := msg[key]) is None:
panel_config.pop(key, None)
else:
panel_config[key] = value
if panel_config:
panels_config[url_path] = panel_config
else:
panels_config.pop(url_path, None)
hass.data[DATA_PANELS_STORE].async_delay_save(
lambda: hass.data[DATA_PANELS_CONFIG], PANELS_SAVE_DELAY
)
hass.bus.async_fire(EVENT_PANELS_UPDATED)
connection.send_result(msg["id"])
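In the handler above an explicit `None` clears an override key, and an empty override removes the stored entry entirely. The merge step can be isolated as a pure function (a sketch, not the handler itself):

```python
from typing import Any

_OVERRIDE_KEYS = ("title", "icon", "require_admin", "show_in_sidebar")


def merge_panel_config(
    current: dict[str, Any], msg: dict[str, Any]
) -> dict[str, Any]:
    """Merge an update message into a stored override; None clears a key."""
    config = dict(current)
    for key in _OVERRIDE_KEYS:
        if key in msg:
            if (value := msg[key]) is None:
                config.pop(key, None)
            else:
                config[key] = value
    return config
```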
class PanelResponse(TypedDict):
"""Represent the panel response type."""


@@ -1752,15 +1752,15 @@ class FanSpeedTrait(_Trait):
"""Initialize a trait for a state."""
super().__init__(hass, state, config)
if state.domain == fan.DOMAIN:
-speed_count = min(
-FAN_SPEED_MAX_SPEED_COUNT,
-round(
-100 / (self.state.attributes.get(fan.ATTR_PERCENTAGE_STEP) or 1.0)
-),
+speed_count = round(
+100 / (self.state.attributes.get(fan.ATTR_PERCENTAGE_STEP) or 1.0)
)
-self._ordered_speed = [
-f"{speed}/{speed_count}" for speed in range(1, speed_count + 1)
-]
+if speed_count <= FAN_SPEED_MAX_SPEED_COUNT:
+self._ordered_speed = [
+f"{speed}/{speed_count}" for speed in range(1, speed_count + 1)
+]
+else:
+self._ordered_speed = []
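With the `min()` cap gone, a coarse percentage step still yields a discrete speed list, while a fine-grained step now produces an empty list and hands control to the percent path. A sketch of the new branch (the cap value of 5 is an assumption here, taken from the trait constants as I understand them):

```python
from __future__ import annotations

# Assumed value of the trait constant; treat as illustrative.
FAN_SPEED_MAX_SPEED_COUNT = 5


def ordered_speeds(percentage_step: float | None) -> list[str]:
    """Build the discrete speed list, or [] to fall back to percent mode."""
    speed_count = round(100 / (percentage_step or 1.0))
    if speed_count <= FAN_SPEED_MAX_SPEED_COUNT:
        return [f"{speed}/{speed_count}" for speed in range(1, speed_count + 1)]
    # Too many steps: report no discrete speeds so that
    # supportsFanSpeedPercent takes over.
    return []
```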
@staticmethod
def supported(domain, features, device_class, _):
@@ -1786,7 +1786,11 @@ class FanSpeedTrait(_Trait):
result.update(
{
"reversible": reversible,
-"supportsFanSpeedPercent": True,
+# supportsFanSpeedPercent is mutually exclusive with
+# availableFanSpeeds, where supportsFanSpeedPercent takes
+# precedence. Report it only when step speeds are not
+# supported so Google renders a percent slider (1-100%).
+"supportsFanSpeedPercent": not self._ordered_speed,
}
)
@@ -1832,10 +1836,12 @@ class FanSpeedTrait(_Trait):
if domain == fan.DOMAIN:
percent = attrs.get(fan.ATTR_PERCENTAGE) or 0
-response["currentFanSpeedPercent"] = percent
-response["currentFanSpeedSetting"] = percentage_to_ordered_list_item(
-self._ordered_speed, percent
-)
+if self._ordered_speed:
+response["currentFanSpeedSetting"] = percentage_to_ordered_list_item(
+self._ordered_speed, percent
+)
+else:
+response["currentFanSpeedPercent"] = percent
return response
@@ -1855,7 +1861,7 @@ class FanSpeedTrait(_Trait):
)
if domain == fan.DOMAIN:
-if fan_speed := params.get("fanSpeed"):
+if self._ordered_speed and (fan_speed := params.get("fanSpeed")):
fan_speed_percent = ordered_list_item_to_percentage(
self._ordered_speed, fan_speed
)
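The trait converts between the `"n/count"` speed names and percentages with helpers from `homeassistant.util.percentage`. Stand-in implementations that mirror their behavior (assumed equivalents for illustration, not the originals):

```python
import math


def ordered_list_item_to_percentage(ordered: list[str], item: str) -> int:
    """Map a list item to its percentage position (last item -> 100)."""
    return (ordered.index(item) + 1) * 100 // len(ordered)


def percentage_to_ordered_list_item(ordered: list[str], percentage: float) -> str:
    """Map a percentage back to the matching list item."""
    index = math.ceil(percentage / 100 * len(ordered))
    return ordered[max(index - 1, 0)]
```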


@@ -11,5 +11,10 @@
}
}
-}
+},
+"device": {
+"google_translate": {
+"name": "Google Translate {lang} {tld}"
+}
+}
}

View File

@@ -19,6 +19,7 @@ from homeassistant.components.tts import (
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.device_registry import DeviceEntryType, DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
@@ -26,6 +27,7 @@ from .const import (
CONF_TLD,
DEFAULT_LANG,
DEFAULT_TLD,
DOMAIN,
MAP_LANG_TLD,
SUPPORT_LANGUAGES,
SUPPORT_TLD,
@@ -66,6 +68,9 @@ async def async_setup_entry(
class GoogleTTSEntity(TextToSpeechEntity):
"""The Google speech API entity."""
_attr_supported_languages = SUPPORT_LANGUAGES
_attr_supported_options = SUPPORT_OPTIONS
def __init__(self, config_entry: ConfigEntry, lang: str, tld: str) -> None:
"""Init Google TTS service."""
if lang in MAP_LANG_TLD:
@@ -77,20 +82,15 @@ class GoogleTTSEntity(TextToSpeechEntity):
self._attr_name = f"Google Translate {self._lang} {self._tld}"
self._attr_unique_id = config_entry.entry_id
-@property
-def default_language(self) -> str:
-"""Return the default language."""
-return self._lang
-@property
-def supported_languages(self) -> list[str]:
-"""Return list of supported languages."""
-return SUPPORT_LANGUAGES
-@property
-def supported_options(self) -> list[str]:
-"""Return a list of supported options."""
-return SUPPORT_OPTIONS
+self._attr_device_info = DeviceInfo(
+entry_type=DeviceEntryType.SERVICE,
+identifiers={(DOMAIN, config_entry.entry_id)},
+manufacturer="Google",
+model="Google Translate TTS",
+translation_key="google_translate",
+translation_placeholders={"lang": self._lang, "tld": self._tld},
+)
+self._attr_default_language = self._lang
def get_tts_audio(
self, message: str, language: str, options: dict[str, Any] | None = None


@@ -7,6 +7,5 @@
"integration_type": "device",
"iot_class": "local_push",
"loggers": ["pyhik"],
-"quality_scale": "legacy",
"requirements": ["pyHik==0.4.2"]
}


@@ -0,0 +1,75 @@
rules:
# Bronze
action-setup:
status: exempt
comment: |
This integration does not provide additional actions.
appropriate-polling:
status: exempt
comment: |
This integration uses local_push and does not poll.
brands: done
common-modules: done
config-flow-test-coverage: done
config-flow: done
dependency-transparency: todo
docs-actions:
status: exempt
comment: |
This integration does not provide additional actions.
docs-high-level-description: done
docs-installation-instructions: done
docs-removal-instructions: done
entity-event-setup: done
entity-unique-id: done
has-entity-name: done
runtime-data: done
test-before-configure: done
test-before-setup: done
unique-config-entry: done
# Silver
action-exceptions:
status: exempt
comment: |
This integration does not provide additional actions.
config-entry-unloading: done
docs-configuration-parameters:
status: exempt
comment: |
This integration has no configuration parameters.
docs-installation-parameters: todo
entity-unavailable: todo
integration-owner: done
log-when-unavailable: todo
parallel-updates: done
reauthentication-flow: todo
test-coverage: todo
# Gold
devices: todo
diagnostics: todo
discovery: todo
discovery-update-info: todo
docs-data-update: done
docs-examples: done
docs-known-limitations: done
docs-supported-devices: done
docs-supported-functions: todo
docs-troubleshooting: todo
docs-use-cases: todo
dynamic-devices: todo
entity-category: todo
entity-device-class: todo
entity-disabled-by-default: todo
entity-translations: todo
exception-translations: todo
icon-translations: todo
reconfiguration-flow: todo
repair-issues: todo
stale-devices: todo
# Platinum
async-dependency: todo
inject-websession: todo
strict-typing: todo


@@ -11,7 +11,12 @@ from pyHomee import (
)
import voluptuous as vol
-from homeassistant.config_entries import SOURCE_USER, ConfigFlow, ConfigFlowResult
+from homeassistant.config_entries import (
+SOURCE_USER,
+ConfigEntryState,
+ConfigFlow,
+ConfigFlowResult,
+)
from homeassistant.const import CONF_HOST, CONF_PASSWORD, CONF_USERNAME
from homeassistant.helpers.service_info.zeroconf import ZeroconfServiceInfo
@@ -113,7 +118,22 @@ class HomeeConfigFlow(ConfigFlow, domain=DOMAIN):
if discovery_info.ip_address.version == 6:
return self.async_abort(reason="ipv6_address")
-await self.async_set_unique_id(self._name)
+# If an already configured homee reports with a second IP, abort.
+existing_entry = await self.async_set_unique_id(self._name)
+if (
+existing_entry
+and existing_entry.state == ConfigEntryState.LOADED
+and existing_entry.runtime_data.connected
+and existing_entry.data[CONF_HOST] != self._host
+):
+_LOGGER.debug(
+"Aborting config flow for discovered homee with IP %s "
+"since it is already configured at IP %s",
+self._host,
+existing_entry.data[CONF_HOST],
+)
+return self.async_abort(reason="2nd_ip_address")
self._abort_if_unique_id_configured(updates={CONF_HOST: self._host})
# Cause an auth-error to see if homee is reachable.


@@ -20,6 +20,7 @@ PARALLEL_UPDATES = 0
REMOTE_PROFILES = [
NodeProfile.REMOTE,
NodeProfile.ONE_BUTTON_REMOTE,
NodeProfile.TWO_BUTTON_REMOTE,
NodeProfile.THREE_BUTTON_REMOTE,
NodeProfile.FOUR_BUTTON_REMOTE,


@@ -1,6 +1,7 @@
{
"config": {
"abort": {
"2nd_ip_address": "Your homee is already connected using another IP address",
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]",
"reconfigure_successful": "[%key:common::config_flow::abort::reconfigure_successful%]",


@@ -10,7 +10,7 @@ from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .coordinator import HomevoltConfigEntry, HomevoltDataUpdateCoordinator
-PLATFORMS: list[Platform] = [Platform.SENSOR]
+PLATFORMS: list[Platform] = [Platform.SENSOR, Platform.SWITCH]
async def async_setup_entry(hass: HomeAssistant, entry: HomevoltConfigEntry) -> bool:


@@ -0,0 +1,67 @@
"""Shared entity helpers for Homevolt."""
from __future__ import annotations
from collections.abc import Callable, Coroutine
from typing import Any, Concatenate
from homevolt import HomevoltAuthenticationError, HomevoltConnectionError, HomevoltError
from homeassistant.exceptions import ConfigEntryAuthFailed, HomeAssistantError
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN, MANUFACTURER
from .coordinator import HomevoltDataUpdateCoordinator
class HomevoltEntity(CoordinatorEntity[HomevoltDataUpdateCoordinator]):
"""Base Homevolt entity."""
_attr_has_entity_name = True
def __init__(
self, coordinator: HomevoltDataUpdateCoordinator, device_identifier: str
) -> None:
"""Initialize the Homevolt entity."""
super().__init__(coordinator)
device_id = coordinator.data.unique_id
device_metadata = coordinator.data.device_metadata.get(device_identifier)
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, f"{device_id}_{device_identifier}")},
configuration_url=coordinator.client.base_url,
manufacturer=MANUFACTURER,
model=device_metadata.model if device_metadata else None,
name=device_metadata.name if device_metadata else None,
)
def homevolt_exception_handler[_HomevoltEntityT: HomevoltEntity, **_P](
func: Callable[Concatenate[_HomevoltEntityT, _P], Coroutine[Any, Any, Any]],
) -> Callable[Concatenate[_HomevoltEntityT, _P], Coroutine[Any, Any, None]]:
"""Decorate Homevolt calls to handle exceptions."""
async def handler(
self: _HomevoltEntityT, *args: _P.args, **kwargs: _P.kwargs
) -> None:
try:
await func(self, *args, **kwargs)
except HomevoltAuthenticationError as error:
raise ConfigEntryAuthFailed(
translation_domain=DOMAIN,
translation_key="auth_failed",
) from error
except HomevoltConnectionError as error:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="communication_error",
translation_placeholders={"error": str(error)},
) from error
except HomevoltError as error:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="unknown_error",
translation_placeholders={"error": str(error)},
) from error
return handler
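The decorator pattern above — catch library exceptions at the entity boundary and re-raise them as user-facing errors with the cause chain preserved — works standalone. A simplified sketch with stand-in exception types (not the Homevolt or Home Assistant classes):

```python
import asyncio
from functools import wraps


class DeviceError(Exception):
    """Stand-in for the client library's base exception."""


class FriendlyError(Exception):
    """Stand-in for the user-facing HomeAssistantError."""


def exception_handler(func):
    """Re-raise library errors as user-facing ones, keeping the cause."""

    @wraps(func)
    async def handler(self, *args, **kwargs):
        try:
            await func(self, *args, **kwargs)
        except DeviceError as error:
            raise FriendlyError(f"communication_error: {error}") from error

    return handler


class Switch:
    @exception_handler
    async def async_turn_on(self) -> None:
        raise DeviceError("timeout")
```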


@@ -7,7 +7,7 @@
"integration_type": "device",
"iot_class": "local_polling",
"quality_scale": "silver",
-"requirements": ["homevolt==0.4.4"],
+"requirements": ["homevolt==0.5.0"],
"zeroconf": [
{
"name": "homevolt*",


@@ -22,13 +22,11 @@ from homeassistant.const import (
UnitOfTemperature,
)
from homeassistant.core import HomeAssistant
-from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import StateType
-from homeassistant.helpers.update_coordinator import CoordinatorEntity
-from .const import DOMAIN, MANUFACTURER
from .coordinator import HomevoltConfigEntry, HomevoltDataUpdateCoordinator
+from .entity import HomevoltEntity
PARALLEL_UPDATES = 0 # Coordinator-based updates
@@ -309,11 +307,10 @@ async def async_setup_entry(
async_add_entities(entities)
-class HomevoltSensor(CoordinatorEntity[HomevoltDataUpdateCoordinator], SensorEntity):
+class HomevoltSensor(HomevoltEntity, SensorEntity):
"""Representation of a Homevolt sensor."""
entity_description: SensorEntityDescription
-_attr_has_entity_name = True
def __init__(
self,
@@ -322,24 +319,12 @@ class HomevoltSensor(CoordinatorEntity[HomevoltDataUpdateCoordinator], SensorEnt
sensor_key: str,
) -> None:
"""Initialize the sensor."""
-super().__init__(coordinator)
-self.entity_description = description
-unique_id = coordinator.data.unique_id
-self._attr_unique_id = f"{unique_id}_{sensor_key}"
sensor_data = coordinator.data.sensors[sensor_key]
+super().__init__(coordinator, sensor_data.device_identifier)
+self.entity_description = description
+self._attr_unique_id = f"{coordinator.data.unique_id}_{sensor_key}"
self._sensor_key = sensor_key
-device_metadata = coordinator.data.device_metadata.get(
-sensor_data.device_identifier
-)
-self._attr_device_info = DeviceInfo(
-identifiers={(DOMAIN, f"{unique_id}_{sensor_data.device_identifier}")},
-configuration_url=coordinator.client.base_url,
-manufacturer=MANUFACTURER,
-model=device_metadata.model if device_metadata else None,
-name=device_metadata.name if device_metadata else None,
-)
@property
def available(self) -> bool:
"""Return if entity is available."""


@@ -160,6 +160,22 @@
"tmin": {
"name": "Minimum temperature"
}
},
"switch": {
"local_mode": {
"name": "Local mode"
}
}
},
"exceptions": {
"auth_failed": {
"message": "[%key:common::config_flow::error::invalid_auth%]"
},
"communication_error": {
"message": "[%key:common::config_flow::error::cannot_connect%]"
},
"unknown_error": {
"message": "[%key:common::config_flow::error::unknown%]"
}
}
}


@@ -0,0 +1,55 @@
"""Support for Homevolt switch entities."""
from __future__ import annotations
from typing import Any
from homeassistant.components.switch import SwitchEntity
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import HomevoltConfigEntry, HomevoltDataUpdateCoordinator
from .entity import HomevoltEntity, homevolt_exception_handler
PARALLEL_UPDATES = 0 # Coordinator-based updates
async def async_setup_entry(
hass: HomeAssistant,
entry: HomevoltConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Homevolt switch entities."""
coordinator = entry.runtime_data
async_add_entities([HomevoltLocalModeSwitch(coordinator)])
class HomevoltLocalModeSwitch(HomevoltEntity, SwitchEntity):
"""Switch entity for Homevolt local mode."""
_attr_entity_category = EntityCategory.CONFIG
_attr_translation_key = "local_mode"
def __init__(self, coordinator: HomevoltDataUpdateCoordinator) -> None:
"""Initialize the switch entity."""
self._attr_unique_id = f"{coordinator.data.unique_id}_local_mode"
device_id = coordinator.data.unique_id
super().__init__(coordinator, f"ems_{device_id}")
@property
def is_on(self) -> bool:
"""Return true if local mode is enabled."""
return self.coordinator.client.local_mode_enabled
@homevolt_exception_handler
async def async_turn_on(self, **kwargs: Any) -> None:
"""Enable local mode."""
await self.coordinator.client.enable_local_mode()
await self.coordinator.async_request_refresh()
@homevolt_exception_handler
async def async_turn_off(self, **kwargs: Any) -> None:
"""Disable local mode."""
await self.coordinator.client.disable_local_mode()
await self.coordinator.async_request_refresh()


@@ -9,7 +9,7 @@ from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .const import LOGGER
from .coordinator import IntelliClimaConfigEntry, IntelliClimaCoordinator
-PLATFORMS = [Platform.FAN]
+PLATFORMS = [Platform.FAN, Platform.SELECT]
async def async_setup_entry(


@@ -27,8 +27,6 @@ class IntelliClimaEntity(CoordinatorEntity[IntelliClimaCoordinator]):
"""Class initializer."""
super().__init__(coordinator=coordinator)
self._attr_unique_id = device.id
-# Make this HA "device" use the IntelliClima device name.
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, device.id)},


@@ -62,6 +62,7 @@ class IntelliClimaVMCFan(IntelliClimaECOEntity, FanEntity):
super().__init__(coordinator, device)
self._speed_range = (int(FanSpeed.sleep), int(FanSpeed.high))
self._attr_unique_id = device.id
@property
def is_on(self) -> bool:


@@ -49,7 +49,7 @@ rules:
comment: |
Unclear if discovery is possible.
docs-data-update: done
docs-examples: todo
docs-examples: done
docs-known-limitations: done
docs-supported-devices: done
docs-supported-functions: done

@@ -0,0 +1,96 @@
"""Select platform for IntelliClima VMC."""
from pyintelliclima.const import FanMode, FanSpeed
from pyintelliclima.intelliclima_types import IntelliClimaECO
from homeassistant.components.select import SelectEntity
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import IntelliClimaConfigEntry, IntelliClimaCoordinator
from .entity import IntelliClimaECOEntity
# Coordinator is used to centralize the data updates
PARALLEL_UPDATES = 0
FAN_MODE_TO_INTELLICLIMA_MODE = {
"forward": FanMode.inward,
"reverse": FanMode.outward,
"alternate": FanMode.alternate,
"sensor": FanMode.sensor,
}
INTELLICLIMA_MODE_TO_FAN_MODE = {v: k for k, v in FAN_MODE_TO_INTELLICLIMA_MODE.items()}
async def async_setup_entry(
hass: HomeAssistant,
entry: IntelliClimaConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up IntelliClima VMC fan mode select."""
coordinator = entry.runtime_data
entities: list[IntelliClimaVMCFanModeSelect] = [
IntelliClimaVMCFanModeSelect(
coordinator=coordinator,
device=ecocomfort2,
)
for ecocomfort2 in coordinator.data.ecocomfort2_devices.values()
]
async_add_entities(entities)
class IntelliClimaVMCFanModeSelect(IntelliClimaECOEntity, SelectEntity):
"""Representation of an IntelliClima VMC fan mode selector."""
_attr_translation_key = "fan_mode"
_attr_options = ["forward", "reverse", "alternate", "sensor"]
def __init__(
self,
coordinator: IntelliClimaCoordinator,
device: IntelliClimaECO,
) -> None:
"""Class initializer."""
super().__init__(coordinator, device)
self._attr_unique_id = f"{device.id}_fan_mode"
@property
def current_option(self) -> str | None:
"""Return the current fan mode."""
device_data = self._device_data
if device_data.mode_set == FanMode.off:
return None
# If in auto mode (sensor mode with auto speed), return None (handled by fan entity preset mode)
if (
device_data.speed_set == FanSpeed.auto
and device_data.mode_set == FanMode.sensor
):
return None
return INTELLICLIMA_MODE_TO_FAN_MODE.get(FanMode(device_data.mode_set))
async def async_select_option(self, option: str) -> None:
"""Set the fan mode."""
device_data = self._device_data
mode = FAN_MODE_TO_INTELLICLIMA_MODE[option]
# Determine speed: keep current speed if available, otherwise default to sleep
if (
device_data.speed_set == FanSpeed.auto
or device_data.mode_set == FanMode.off
):
speed = FanSpeed.sleep
else:
speed = device_data.speed_set
await self.coordinator.api.ecocomfort.set_mode_speed(
self._device_sn, mode, speed
)
await self.coordinator.async_request_refresh()

@@ -22,5 +22,18 @@
"description": "Authenticate against IntelliClima cloud"
}
}
},
"entity": {
"select": {
"fan_mode": {
"name": "Fan direction mode",
"state": {
"alternate": "Alternating",
"forward": "Forward",
"reverse": "Reverse",
"sensor": "Sensor"
}
}
}
}
}

@@ -6,7 +6,7 @@
"documentation": "https://www.home-assistant.io/integrations/kaleidescape",
"integration_type": "device",
"iot_class": "local_push",
"requirements": ["pykaleidescape==1.1.1"],
"requirements": ["pykaleidescape==1.1.3"],
"ssdp": [
{
"deviceType": "schemas-upnp-org:device:Basic:1",

@@ -6,7 +6,11 @@ import logging
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from homeassistant.helpers import device_registry as dr, entity_registry as er
from homeassistant.helpers import (
device_registry as dr,
entity_registry as er,
issue_registry as ir,
)
from .const import DOMAIN
from .coordinator import (
@@ -80,6 +84,21 @@ async def async_setup_entry(
lhm_coordinator = LibreHardwareMonitorCoordinator(hass, config_entry)
await lhm_coordinator.async_config_entry_first_refresh()
if lhm_coordinator.data.is_deprecated_version:
issue_id = f"deprecated_api_{config_entry.entry_id}"
ir.async_create_issue(
hass,
DOMAIN,
issue_id,
breaks_in_ha_version="2026.9.0",
is_fixable=False,
severity=ir.IssueSeverity.WARNING,
translation_key="deprecated_api",
translation_placeholders={
"lhm_releases_url": "https://github.com/LibreHardwareMonitor/LibreHardwareMonitor/releases"
},
)
config_entry.runtime_data = lhm_coordinator
await hass.config_entries.async_forward_entry_setups(config_entry, PLATFORMS)

@@ -21,7 +21,7 @@ from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_HOST, CONF_PASSWORD, CONF_PORT, CONF_USERNAME
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.helpers import device_registry as dr
from homeassistant.helpers import device_registry as dr, issue_registry as ir
from homeassistant.helpers.aiohttp_client import async_create_clientsession
from homeassistant.helpers.device_registry import DeviceEntry
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
@@ -50,7 +50,7 @@ class LibreHardwareMonitorCoordinator(DataUpdateCoordinator[LibreHardwareMonitor
config_entry=config_entry,
update_interval=timedelta(seconds=DEFAULT_SCAN_INTERVAL),
)
self._entry_id = config_entry.entry_id
self._api = LibreHardwareMonitorClient(
host=config_entry.data[CONF_HOST],
port=config_entry.data[CONF_PORT],
@@ -59,13 +59,14 @@ class LibreHardwareMonitorCoordinator(DataUpdateCoordinator[LibreHardwareMonitor
session=async_create_clientsession(hass),
)
device_entries: list[DeviceEntry] = dr.async_entries_for_config_entry(
registry=dr.async_get(self.hass), config_entry_id=config_entry.entry_id
registry=dr.async_get(self.hass), config_entry_id=self._entry_id
)
self._previous_devices: dict[DeviceId, DeviceName] = {
DeviceId(next(iter(device.identifiers))[1]): DeviceName(device.name)
for device in device_entries
if device.identifiers and device.name
}
self._is_deprecated_version: bool | None = None
async def _async_update_data(self) -> LibreHardwareMonitorData:
try:
@@ -80,6 +81,12 @@ class LibreHardwareMonitorCoordinator(DataUpdateCoordinator[LibreHardwareMonitor
except LibreHardwareMonitorNoDevicesError as err:
raise UpdateFailed("No sensor data available, will retry") from err
# Check whether user has upgraded LHM from a deprecated version while the integration is running
if self._is_deprecated_version and not lhm_data.is_deprecated_version:
# Clear deprecation issue
ir.async_delete_issue(self.hass, DOMAIN, f"deprecated_api_{self._entry_id}")
self._is_deprecated_version = lhm_data.is_deprecated_version
await self._async_handle_changes_in_devices(
dict(lhm_data.main_device_ids_and_names)
)

@@ -7,5 +7,5 @@
"integration_type": "device",
"iot_class": "local_polling",
"quality_scale": "silver",
"requirements": ["librehardwaremonitor-api==1.9.1"]
"requirements": ["librehardwaremonitor-api==1.10.1"]
}

@@ -5,6 +5,7 @@ from __future__ import annotations
from typing import Any
from librehardwaremonitor_api.model import LibreHardwareMonitorSensorData
from librehardwaremonitor_api.sensor_type import SensorType
from homeassistant.components.sensor import SensorEntity, SensorStateClass
from homeassistant.core import HomeAssistant, callback
@@ -53,12 +54,8 @@ class LibreHardwareMonitorSensor(
super().__init__(coordinator)
self._attr_name: str = sensor_data.name
self._attr_native_value: str | None = sensor_data.value
self._attr_extra_state_attributes: dict[str, Any] = {
STATE_MIN_VALUE: sensor_data.min,
STATE_MAX_VALUE: sensor_data.max,
}
self._attr_native_unit_of_measurement = sensor_data.unit
self._set_state(coordinator.data.is_deprecated_version, sensor_data)
self._attr_unique_id: str = f"{entry_id}_{sensor_data.sensor_id}"
self._sensor_id: str = sensor_data.sensor_id
@@ -70,15 +67,36 @@ class LibreHardwareMonitorSensor(
model=sensor_data.device_type,
)
def _set_state(
self,
is_deprecated_lhm_version: bool,
sensor_data: LibreHardwareMonitorSensorData,
) -> None:
value = sensor_data.value
min_value = sensor_data.min
max_value = sensor_data.max
unit = sensor_data.unit
if not is_deprecated_lhm_version and sensor_data.type == SensorType.THROUGHPUT:
# Temporary fix: convert the B/s value to KB/s to not break existing entries
# This will be migrated properly once SensorDeviceClass is introduced
value = f"{(float(value) / 1024):.1f}" if value else None
min_value = f"{(float(min_value) / 1024):.1f}" if min_value else None
max_value = f"{(float(max_value) / 1024):.1f}" if max_value else None
unit = "KB/s"
self._attr_native_value: str | None = value
self._attr_extra_state_attributes: dict[str, Any] = {
STATE_MIN_VALUE: min_value,
STATE_MAX_VALUE: max_value,
}
self._attr_native_unit_of_measurement = unit
@callback
def _handle_coordinator_update(self) -> None:
"""Handle updated data from the coordinator."""
if sensor_data := self.coordinator.data.sensor_data.get(self._sensor_id):
self._attr_native_value = sensor_data.value
self._attr_extra_state_attributes = {
STATE_MIN_VALUE: sensor_data.min,
STATE_MAX_VALUE: sensor_data.max,
}
self._set_state(self.coordinator.data.is_deprecated_version, sensor_data)
else:
self._attr_native_value = None

@@ -33,5 +33,11 @@
}
}
}
},
"issues": {
"deprecated_api": {
"description": "Your version of Libre Hardware Monitor is deprecated and may not provide stable sensor data. To fix this issue:\n\n1. Download version 0.9.5 or later from {lhm_releases_url}\n2. Close Libre Hardware Monitor on your computer\n3. Install or extract the new version and start Libre Hardware Monitor again (you might have to re-enable the remote web server)\n4. Home Assistant will detect the new version and this issue will clear automatically",
"title": "Deprecated Libre Hardware Monitor version"
}
}
}

@@ -1,4 +1,4 @@
"""The liebherr integration."""
"""The Liebherr integration."""
from __future__ import annotations
@@ -17,7 +17,12 @@ from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .coordinator import LiebherrConfigEntry, LiebherrCoordinator
PLATFORMS: list[Platform] = [Platform.NUMBER, Platform.SENSOR, Platform.SWITCH]
PLATFORMS: list[Platform] = [
Platform.NUMBER,
Platform.SELECT,
Platform.SENSOR,
Platform.SWITCH,
]
async def async_setup_entry(hass: HomeAssistant, entry: LiebherrConfigEntry) -> bool:

@@ -1,5 +1,55 @@
{
"entity": {
"select": {
"bio_fresh_plus": {
"default": "mdi:leaf"
},
"bio_fresh_plus_bottom_zone": {
"default": "mdi:leaf"
},
"bio_fresh_plus_middle_zone": {
"default": "mdi:leaf"
},
"bio_fresh_plus_top_zone": {
"default": "mdi:leaf"
},
"hydro_breeze": {
"default": "mdi:weather-windy"
},
"hydro_breeze_bottom_zone": {
"default": "mdi:weather-windy"
},
"hydro_breeze_middle_zone": {
"default": "mdi:weather-windy"
},
"hydro_breeze_top_zone": {
"default": "mdi:weather-windy"
},
"ice_maker": {
"default": "mdi:cube-outline",
"state": {
"off": "mdi:cube-outline-off"
}
},
"ice_maker_bottom_zone": {
"default": "mdi:cube-outline",
"state": {
"off": "mdi:cube-outline-off"
}
},
"ice_maker_middle_zone": {
"default": "mdi:cube-outline",
"state": {
"off": "mdi:cube-outline-off"
}
},
"ice_maker_top_zone": {
"default": "mdi:cube-outline",
"state": {
"off": "mdi:cube-outline-off"
}
}
},
"switch": {
"night_mode": {
"default": "mdi:sleep",

@@ -0,0 +1,216 @@
"""Select platform for Liebherr integration."""
from __future__ import annotations
from collections.abc import Callable, Coroutine
from dataclasses import dataclass
from enum import StrEnum
from typing import TYPE_CHECKING, Any
from pyliebherrhomeapi import (
BioFreshPlusControl,
BioFreshPlusMode,
HydroBreezeControl,
HydroBreezeMode,
IceMakerControl,
IceMakerMode,
ZonePosition,
)
from homeassistant.components.select import SelectEntity, SelectEntityDescription
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import LiebherrConfigEntry, LiebherrCoordinator
from .entity import ZONE_POSITION_MAP, LiebherrEntity
PARALLEL_UPDATES = 1
type SelectControl = IceMakerControl | HydroBreezeControl | BioFreshPlusControl
@dataclass(frozen=True, kw_only=True)
class LiebherrSelectEntityDescription(SelectEntityDescription):
"""Describes a Liebherr select entity."""
control_type: type[SelectControl]
mode_enum: type[StrEnum]
current_mode_fn: Callable[[SelectControl], StrEnum | str | None]
options_fn: Callable[[SelectControl], list[str]]
set_fn: Callable[[LiebherrCoordinator, int, StrEnum], Coroutine[Any, Any, None]]
def _ice_maker_options(control: SelectControl) -> list[str]:
"""Return available ice maker options."""
if TYPE_CHECKING:
assert isinstance(control, IceMakerControl)
options = [IceMakerMode.OFF.value, IceMakerMode.ON.value]
if control.has_max_ice:
options.append(IceMakerMode.MAX_ICE.value)
return options
def _hydro_breeze_options(control: SelectControl) -> list[str]:
"""Return available HydroBreeze options."""
return [mode.value for mode in HydroBreezeMode]
def _bio_fresh_plus_options(control: SelectControl) -> list[str]:
"""Return available BioFresh-Plus options."""
if TYPE_CHECKING:
assert isinstance(control, BioFreshPlusControl)
return [
mode.value
for mode in control.supported_modes
if isinstance(mode, BioFreshPlusMode)
]
SELECT_TYPES: list[LiebherrSelectEntityDescription] = [
LiebherrSelectEntityDescription(
key="ice_maker",
translation_key="ice_maker",
control_type=IceMakerControl,
mode_enum=IceMakerMode,
current_mode_fn=lambda c: c.ice_maker_mode, # type: ignore[union-attr]
options_fn=_ice_maker_options,
set_fn=lambda coordinator, zone_id, mode: coordinator.client.set_ice_maker(
device_id=coordinator.device_id,
zone_id=zone_id,
mode=mode, # type: ignore[arg-type]
),
),
LiebherrSelectEntityDescription(
key="hydro_breeze",
translation_key="hydro_breeze",
control_type=HydroBreezeControl,
mode_enum=HydroBreezeMode,
current_mode_fn=lambda c: c.current_mode, # type: ignore[union-attr]
options_fn=_hydro_breeze_options,
set_fn=lambda coordinator, zone_id, mode: coordinator.client.set_hydro_breeze(
device_id=coordinator.device_id,
zone_id=zone_id,
mode=mode, # type: ignore[arg-type]
),
),
LiebherrSelectEntityDescription(
key="bio_fresh_plus",
translation_key="bio_fresh_plus",
control_type=BioFreshPlusControl,
mode_enum=BioFreshPlusMode,
current_mode_fn=lambda c: c.current_mode, # type: ignore[union-attr]
options_fn=_bio_fresh_plus_options,
set_fn=lambda coordinator, zone_id, mode: coordinator.client.set_bio_fresh_plus(
device_id=coordinator.device_id,
zone_id=zone_id,
mode=mode, # type: ignore[arg-type]
),
),
]
async def async_setup_entry(
hass: HomeAssistant,
entry: LiebherrConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Liebherr select entities."""
entities: list[LiebherrSelectEntity] = []
for coordinator in entry.runtime_data.values():
has_multiple_zones = len(coordinator.data.get_temperature_controls()) > 1
for control in coordinator.data.controls:
for description in SELECT_TYPES:
if isinstance(control, description.control_type):
if TYPE_CHECKING:
assert isinstance(
control,
IceMakerControl | HydroBreezeControl | BioFreshPlusControl,
)
entities.append(
LiebherrSelectEntity(
coordinator=coordinator,
description=description,
zone_id=control.zone_id,
has_multiple_zones=has_multiple_zones,
)
)
async_add_entities(entities)
class LiebherrSelectEntity(LiebherrEntity, SelectEntity):
"""Representation of a Liebherr select entity."""
entity_description: LiebherrSelectEntityDescription
def __init__(
self,
coordinator: LiebherrCoordinator,
description: LiebherrSelectEntityDescription,
zone_id: int,
has_multiple_zones: bool,
) -> None:
"""Initialize the select entity."""
super().__init__(coordinator)
self.entity_description = description
self._zone_id = zone_id
self._attr_unique_id = f"{coordinator.device_id}_{description.key}_{zone_id}"
# Set options from the control
control = self._select_control
if control is not None:
self._attr_options = description.options_fn(control)
# Add zone suffix only for multi-zone devices
if has_multiple_zones:
temp_controls = coordinator.data.get_temperature_controls()
if (
(tc := temp_controls.get(zone_id))
and isinstance(tc.zone_position, ZonePosition)
and (zone_key := ZONE_POSITION_MAP.get(tc.zone_position))
):
self._attr_translation_key = f"{description.translation_key}_{zone_key}"
@property
def _select_control(self) -> SelectControl | None:
"""Get the select control for this entity."""
for control in self.coordinator.data.controls:
if (
isinstance(control, self.entity_description.control_type)
and control.zone_id == self._zone_id
):
if TYPE_CHECKING:
assert isinstance(
control,
IceMakerControl | HydroBreezeControl | BioFreshPlusControl,
)
return control
return None
@property
def current_option(self) -> str | None:
"""Return the current selected option."""
control = self._select_control
if control is None:
return None
mode = self.entity_description.current_mode_fn(control)
if isinstance(mode, StrEnum):
return mode.value
return None
@property
def available(self) -> bool:
"""Return if entity is available."""
return super().available and self._select_control is not None
async def async_select_option(self, option: str) -> None:
"""Change the selected option."""
mode = self.entity_description.mode_enum(option)
await self._async_send_command(
self.entity_description.set_fn(self.coordinator, self._zone_id, mode),
)

@@ -47,6 +47,112 @@
"name": "Top zone setpoint"
}
},
"select": {
"bio_fresh_plus": {
"name": "BioFresh-Plus",
"state": {
"minus_two_minus_two": "-2°C | -2°C",
"minus_two_zero": "-2°C | 0°C",
"zero_minus_two": "0°C | -2°C",
"zero_zero": "0°C | 0°C"
}
},
"bio_fresh_plus_bottom_zone": {
"name": "Bottom zone BioFresh-Plus",
"state": {
"minus_two_minus_two": "[%key:component::liebherr::entity::select::bio_fresh_plus::state::minus_two_minus_two%]",
"minus_two_zero": "[%key:component::liebherr::entity::select::bio_fresh_plus::state::minus_two_zero%]",
"zero_minus_two": "[%key:component::liebherr::entity::select::bio_fresh_plus::state::zero_minus_two%]",
"zero_zero": "[%key:component::liebherr::entity::select::bio_fresh_plus::state::zero_zero%]"
}
},
"bio_fresh_plus_middle_zone": {
"name": "Middle zone BioFresh-Plus",
"state": {
"minus_two_minus_two": "[%key:component::liebherr::entity::select::bio_fresh_plus::state::minus_two_minus_two%]",
"minus_two_zero": "[%key:component::liebherr::entity::select::bio_fresh_plus::state::minus_two_zero%]",
"zero_minus_two": "[%key:component::liebherr::entity::select::bio_fresh_plus::state::zero_minus_two%]",
"zero_zero": "[%key:component::liebherr::entity::select::bio_fresh_plus::state::zero_zero%]"
}
},
"bio_fresh_plus_top_zone": {
"name": "Top zone BioFresh-Plus",
"state": {
"minus_two_minus_two": "[%key:component::liebherr::entity::select::bio_fresh_plus::state::minus_two_minus_two%]",
"minus_two_zero": "[%key:component::liebherr::entity::select::bio_fresh_plus::state::minus_two_zero%]",
"zero_minus_two": "[%key:component::liebherr::entity::select::bio_fresh_plus::state::zero_minus_two%]",
"zero_zero": "[%key:component::liebherr::entity::select::bio_fresh_plus::state::zero_zero%]"
}
},
"hydro_breeze": {
"name": "HydroBreeze",
"state": {
"high": "[%key:common::state::high%]",
"low": "[%key:common::state::low%]",
"medium": "[%key:common::state::medium%]",
"off": "[%key:common::state::off%]"
}
},
"hydro_breeze_bottom_zone": {
"name": "Bottom zone HydroBreeze",
"state": {
"high": "[%key:common::state::high%]",
"low": "[%key:common::state::low%]",
"medium": "[%key:common::state::medium%]",
"off": "[%key:common::state::off%]"
}
},
"hydro_breeze_middle_zone": {
"name": "Middle zone HydroBreeze",
"state": {
"high": "[%key:common::state::high%]",
"low": "[%key:common::state::low%]",
"medium": "[%key:common::state::medium%]",
"off": "[%key:common::state::off%]"
}
},
"hydro_breeze_top_zone": {
"name": "Top zone HydroBreeze",
"state": {
"high": "[%key:common::state::high%]",
"low": "[%key:common::state::low%]",
"medium": "[%key:common::state::medium%]",
"off": "[%key:common::state::off%]"
}
},
"ice_maker": {
"name": "IceMaker",
"state": {
"max_ice": "MaxIce",
"off": "[%key:common::state::off%]",
"on": "[%key:common::state::on%]"
}
},
"ice_maker_bottom_zone": {
"name": "Bottom zone IceMaker",
"state": {
"max_ice": "[%key:component::liebherr::entity::select::ice_maker::state::max_ice%]",
"off": "[%key:common::state::off%]",
"on": "[%key:common::state::on%]"
}
},
"ice_maker_middle_zone": {
"name": "Middle zone IceMaker",
"state": {
"max_ice": "[%key:component::liebherr::entity::select::ice_maker::state::max_ice%]",
"off": "[%key:common::state::off%]",
"on": "[%key:common::state::on%]"
}
},
"ice_maker_top_zone": {
"name": "Top zone IceMaker",
"state": {
"max_ice": "[%key:component::liebherr::entity::select::ice_maker::state::max_ice%]",
"off": "[%key:common::state::off%]",
"on": "[%key:common::state::on%]"
}
}
},
"sensor": {
"bottom_zone": {
"name": "Bottom zone"

@@ -6,7 +6,7 @@ from collections.abc import Callable, Coroutine
from dataclasses import dataclass
from typing import Any, Generic
from pylitterbot import FeederRobot, LitterRobot3, LitterRobot4, Robot
from pylitterbot import FeederRobot, LitterRobot3, LitterRobot4, LitterRobot5, Robot
from homeassistant.components.button import ButtonEntity, ButtonEntityDescription
from homeassistant.const import EntityCategory
@@ -24,20 +24,24 @@ class RobotButtonEntityDescription(ButtonEntityDescription, Generic[_WhiskerEnti
press_fn: Callable[[_WhiskerEntityT], Coroutine[Any, Any, bool]]
ROBOT_BUTTON_MAP: dict[type[Robot], RobotButtonEntityDescription] = {
LitterRobot3: RobotButtonEntityDescription[LitterRobot3](
ROBOT_BUTTON_MAP: dict[tuple[type[Robot], ...], RobotButtonEntityDescription] = {
(LitterRobot3, LitterRobot5): RobotButtonEntityDescription[
LitterRobot3 | LitterRobot5
](
key="reset_waste_drawer",
translation_key="reset_waste_drawer",
entity_category=EntityCategory.CONFIG,
press_fn=lambda robot: robot.reset_waste_drawer(),
),
LitterRobot4: RobotButtonEntityDescription[LitterRobot4](
(LitterRobot4, LitterRobot5): RobotButtonEntityDescription[
LitterRobot4 | LitterRobot5
](
key="reset",
translation_key="reset",
entity_category=EntityCategory.CONFIG,
press_fn=lambda robot: robot.reset(),
),
FeederRobot: RobotButtonEntityDescription[FeederRobot](
(FeederRobot,): RobotButtonEntityDescription[FeederRobot](
key="give_snack",
translation_key="give_snack",
press_fn=lambda robot: robot.give_snack(),

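Keying the map by tuples of robot classes lets one description serve several models, with `isinstance` doing the matching; a sketch with empty stand-in classes:

```python
class Robot: ...
class LitterRobot3(Robot): ...
class LitterRobot4(Robot): ...
class LitterRobot5(Robot): ...
class FeederRobot(Robot): ...

# Tuple keys: each description applies to every class in its key
BUTTON_MAP: dict[tuple[type, ...], str] = {
    (LitterRobot3, LitterRobot5): "reset_waste_drawer",
    (LitterRobot4, LitterRobot5): "reset",
    (FeederRobot,): "give_snack",
}

def buttons_for(robot: Robot) -> list[str]:
    """Return the button keys applicable to a robot instance."""
    return [key for types, key in BUTTON_MAP.items() if isinstance(robot, types)]
```

Note that a LitterRobot5 matches two tuple keys, so it picks up both buttons.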
@@ -65,7 +65,7 @@ class LitterRobotDataUpdateCoordinator(DataUpdateCoordinator[None]):
except LitterRobotLoginException as ex:
raise ConfigEntryAuthFailed("Invalid credentials") from ex
except LitterRobotException as ex:
raise UpdateFailed("Unable to connect to Litter-Robot API") from ex
raise UpdateFailed("Unable to connect to Whisker API") from ex
def litter_robots(self) -> Generator[LitterRobot]:
"""Get Litter-Robots from the account."""

@@ -1,6 +1,6 @@
{
"domain": "litterrobot",
"name": "Litter-Robot",
"name": "Whisker",
"codeowners": ["@natekspencer", "@tkdrob"],
"config_flow": true,
"dhcp": [

@@ -6,7 +6,7 @@ from collections.abc import Callable, Coroutine
from dataclasses import dataclass
from typing import Any, Generic, TypeVar
from pylitterbot import FeederRobot, LitterRobot, LitterRobot4, Robot
from pylitterbot import FeederRobot, LitterRobot, LitterRobot4, LitterRobot5, Robot
from pylitterbot.robot.litterrobot4 import BrightnessLevel, NightLightMode
from homeassistant.components.select import SelectEntity, SelectEntityDescription
@@ -32,9 +32,11 @@ class RobotSelectEntityDescription(
select_fn: Callable[[_WhiskerEntityT, str], Coroutine[Any, Any, bool]]
ROBOT_SELECT_MAP: dict[type[Robot], tuple[RobotSelectEntityDescription, ...]] = {
ROBOT_SELECT_MAP: dict[
type[Robot] | tuple[type[Robot], ...], tuple[RobotSelectEntityDescription, ...]
] = {
LitterRobot: (
RobotSelectEntityDescription[LitterRobot, int]( # type: ignore[type-abstract] # only used for isinstance check
RobotSelectEntityDescription[LitterRobot, int](
key="cycle_delay",
translation_key="cycle_delay",
unit_of_measurement=UnitOfTime.MINUTES,
@@ -43,8 +45,8 @@ ROBOT_SELECT_MAP: dict[type[Robot], tuple[RobotSelectEntityDescription, ...]] =
select_fn=lambda robot, opt: robot.set_wait_time(int(opt)),
),
),
LitterRobot4: (
RobotSelectEntityDescription[LitterRobot4, str](
(LitterRobot4, LitterRobot5): (
RobotSelectEntityDescription[LitterRobot4 | LitterRobot5, str](
key="globe_brightness",
translation_key="globe_brightness",
current_fn=(
@@ -61,7 +63,7 @@ ROBOT_SELECT_MAP: dict[type[Robot], tuple[RobotSelectEntityDescription, ...]] =
)
),
),
RobotSelectEntityDescription[LitterRobot4, str](
RobotSelectEntityDescription[LitterRobot4 | LitterRobot5, str](
key="globe_light",
translation_key="globe_light",
current_fn=(
@@ -78,7 +80,7 @@ ROBOT_SELECT_MAP: dict[type[Robot], tuple[RobotSelectEntityDescription, ...]] =
)
),
),
RobotSelectEntityDescription[LitterRobot4, str](
RobotSelectEntityDescription[LitterRobot4 | LitterRobot5, str](
key="panel_brightness",
translation_key="brightness_level",
current_fn=(

@@ -7,7 +7,7 @@ from dataclasses import dataclass
from datetime import datetime
from typing import Any, Generic
from pylitterbot import FeederRobot, LitterRobot, LitterRobot4, Pet, Robot
from pylitterbot import FeederRobot, LitterRobot, LitterRobot4, LitterRobot5, Pet, Robot
from homeassistant.components.sensor import (
SensorDeviceClass,
@@ -44,8 +44,10 @@ class RobotSensorEntityDescription(SensorEntityDescription, Generic[_WhiskerEnti
value_fn: Callable[[_WhiskerEntityT], float | datetime | str | None]
ROBOT_SENSOR_MAP: dict[type[Robot], list[RobotSensorEntityDescription]] = {
LitterRobot: [ # type: ignore[type-abstract] # only used for isinstance check
ROBOT_SENSOR_MAP: dict[
type[Robot] | tuple[type[Robot], ...], list[RobotSensorEntityDescription]
] = {
LitterRobot: [
RobotSensorEntityDescription[LitterRobot](
key="waste_drawer_level",
translation_key="waste_drawer",
@@ -145,7 +147,9 @@ ROBOT_SENSOR_MAP: dict[type[Robot], list[RobotSensorEntityDescription]] = {
)
),
),
RobotSensorEntityDescription[LitterRobot4](
],
(LitterRobot4, LitterRobot5): [
RobotSensorEntityDescription[LitterRobot4 | LitterRobot5](
key="litter_level",
translation_key="litter_level",
native_unit_of_measurement=PERCENTAGE,
@@ -153,7 +157,7 @@ ROBOT_SENSOR_MAP: dict[type[Robot], list[RobotSensorEntityDescription]] = {
state_class=SensorStateClass.MEASUREMENT,
value_fn=lambda robot: robot.litter_level,
),
RobotSensorEntityDescription[LitterRobot4](
RobotSensorEntityDescription[LitterRobot4 | LitterRobot5](
key="pet_weight",
translation_key="pet_weight",
native_unit_of_measurement=UnitOfMass.POUNDS,

@@ -8,6 +8,6 @@
"documentation": "https://www.home-assistant.io/integrations/matter",
"integration_type": "hub",
"iot_class": "local_push",
"requirements": ["python-matter-server==8.1.2"],
"requirements": ["matter-python-client==0.4.1"],
"zeroconf": ["_matter._tcp.local.", "_matterc._udp.local."]
}

@@ -908,6 +908,7 @@ DISCOVERY_SCHEMAS = [
required_attributes=(
clusters.ElectricalPowerMeasurement.Attributes.ApparentPower,
),
allow_none_value=True,
),
MatterDiscoverySchema(
platform=Platform.SENSOR,
@@ -924,6 +925,7 @@ DISCOVERY_SCHEMAS = [
required_attributes=(
clusters.ElectricalPowerMeasurement.Attributes.ReactivePower,
),
allow_none_value=True,
),
MatterDiscoverySchema(
platform=Platform.SENSOR,
@@ -939,6 +941,7 @@ DISCOVERY_SCHEMAS = [
),
entity_class=MatterSensor,
required_attributes=(clusters.ElectricalPowerMeasurement.Attributes.Voltage,),
allow_none_value=True,
),
MatterDiscoverySchema(
platform=Platform.SENSOR,
@@ -956,6 +959,7 @@ DISCOVERY_SCHEMAS = [
required_attributes=(
clusters.ElectricalPowerMeasurement.Attributes.RMSVoltage,
),
allow_none_value=True,
),
MatterDiscoverySchema(
platform=Platform.SENSOR,
@@ -973,6 +977,7 @@ DISCOVERY_SCHEMAS = [
required_attributes=(
clusters.ElectricalPowerMeasurement.Attributes.ApparentCurrent,
),
allow_none_value=True,
),
MatterDiscoverySchema(
platform=Platform.SENSOR,
@@ -990,6 +995,7 @@ DISCOVERY_SCHEMAS = [
required_attributes=(
clusters.ElectricalPowerMeasurement.Attributes.ActiveCurrent,
),
allow_none_value=True,
),
MatterDiscoverySchema(
platform=Platform.SENSOR,
@@ -1007,6 +1013,7 @@ DISCOVERY_SCHEMAS = [
required_attributes=(
clusters.ElectricalPowerMeasurement.Attributes.ReactiveCurrent,
),
allow_none_value=True,
),
MatterDiscoverySchema(
platform=Platform.SENSOR,
@@ -1024,6 +1031,7 @@ DISCOVERY_SCHEMAS = [
required_attributes=(
clusters.ElectricalPowerMeasurement.Attributes.RMSCurrent,
),
allow_none_value=True,
),
MatterDiscoverySchema(
platform=Platform.SENSOR,
@@ -1039,6 +1047,7 @@ DISCOVERY_SCHEMAS = [
device_to_ha=lambda x: x.energy,
),
entity_class=MatterSensor,
allow_none_value=True,
required_attributes=(
clusters.ElectricalEnergyMeasurement.Attributes.CumulativeEnergyImported,
),
@@ -1058,6 +1067,7 @@ DISCOVERY_SCHEMAS = [
device_to_ha=lambda x: x.energy,
),
entity_class=MatterSensor,
allow_none_value=True,
required_attributes=(
clusters.ElectricalEnergyMeasurement.Attributes.CumulativeEnergyExported,
),

@@ -46,30 +46,42 @@ async def async_setup_entry(
class MatterSwitchEntityDescription(SwitchEntityDescription, MatterEntityDescription):
"""Describe Matter Switch entities."""
inverted: bool = False
class MatterSwitch(MatterEntity, SwitchEntity):
"""Representation of a Matter switch."""
entity_description: MatterSwitchEntityDescription
_platform_translation_key = "switch"
def _get_command_for_value(self, value: bool) -> ClusterCommand:
"""Get the appropriate command for the desired value.
Applies inversion if needed (e.g., for inverted logic like mute).
"""
send_value = not value if self.entity_description.inverted else value
return (
clusters.OnOff.Commands.On()
if send_value
else clusters.OnOff.Commands.Off()
)
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn switch on."""
await self.send_device_command(
clusters.OnOff.Commands.On(),
)
await self.send_device_command(self._get_command_for_value(True))
async def async_turn_off(self, **kwargs: Any) -> None:
"""Turn switch off."""
await self.send_device_command(
clusters.OnOff.Commands.Off(),
)
await self.send_device_command(self._get_command_for_value(False))
@callback
def _update_from_device(self) -> None:
"""Update from device."""
self._attr_is_on = self.get_matter_attribute_value(
self._entity_info.primary_attribute
)
value = self.get_matter_attribute_value(self._entity_info.primary_attribute)
if self.entity_description.inverted:
value = not value
self._attr_is_on = value
class MatterGenericCommandSwitch(MatterSwitch):
@@ -121,9 +133,7 @@ class MatterGenericCommandSwitch(MatterSwitch):
@dataclass(frozen=True, kw_only=True)
class MatterGenericCommandSwitchEntityDescription(
SwitchEntityDescription, MatterEntityDescription
):
class MatterGenericCommandSwitchEntityDescription(MatterSwitchEntityDescription):
"""Describe Matter Generic command Switch entities."""
# command: a custom callback to create the command to send to the device
@@ -133,9 +143,7 @@ class MatterGenericCommandSwitchEntityDescription(
@dataclass(frozen=True, kw_only=True)
class MatterNumericSwitchEntityDescription(
SwitchEntityDescription, MatterEntityDescription
):
class MatterNumericSwitchEntityDescription(MatterSwitchEntityDescription):
"""Describe Matter Numeric Switch entities."""
@@ -146,11 +154,10 @@ class MatterNumericSwitch(MatterSwitch):
async def _async_set_native_value(self, value: bool) -> None:
"""Update the current value."""
send_value: Any = value
if value_convert := self.entity_description.ha_to_device:
send_value = value_convert(value)
await self.write_attribute(
value=send_value,
)
await self.write_attribute(value=send_value)
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn switch on."""
@@ -248,19 +255,12 @@ DISCOVERY_SCHEMAS = [
),
MatterDiscoverySchema(
platform=Platform.SWITCH,
entity_description=MatterNumericSwitchEntityDescription(
entity_description=MatterSwitchEntityDescription(
key="MatterMuteToggle",
translation_key="speaker_mute",
device_to_ha={
True: False, # True means volume is on, so HA should show mute as off
False: True, # False means volume is off (muted), so HA should show mute as on
}.get,
ha_to_device={
False: True, # HA showing mute as off means volume is on, so send True
True: False, # HA showing mute as on means volume is off (muted), so send False
}.get,
inverted=True,
),
entity_class=MatterNumericSwitch,
entity_class=MatterSwitch,
required_attributes=(clusters.OnOff.Attributes.OnOff,),
device_type=(device_types.Speaker,),
),
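The diff above replaces the paired `device_to_ha`/`ha_to_device` dict converters with a single `inverted=True` flag. A minimal stdlib sketch (names illustrative, not the real Matter entity classes) showing the two approaches agree in both directions for a boolean mute toggle:

```python
# Old style: explicit dict-based converters in each direction.
device_to_ha = {True: False, False: True}.get
ha_to_device = {False: True, True: False}.get


def apply_inverted(value: bool) -> bool:
    # What `inverted=True` effectively does in _update_from_device.
    return not value


# Both converters reduce to a plain boolean negation.
for raw in (True, False):
    assert device_to_ha(raw) == apply_inverted(raw)
    assert ha_to_device(raw) == apply_inverted(raw)
```

Since the mapping is a pure involution, a single flag on the shared description is enough and the dedicated `MatterNumericSwitch` class is no longer needed for this schema.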

@@ -0,0 +1,130 @@
"""Integration for MyNeomitis."""
from __future__ import annotations
from dataclasses import dataclass
import logging
from typing import Any
import aiohttp
import pyaxencoapi
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
CONF_EMAIL,
CONF_PASSWORD,
EVENT_HOMEASSISTANT_STOP,
Platform,
)
from homeassistant.core import Event, HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers.aiohttp_client import async_get_clientsession
_LOGGER = logging.getLogger(__name__)
PLATFORMS = [Platform.SELECT]
@dataclass
class MyNeomitisRuntimeData:
"""Runtime data for MyNeomitis integration."""
api: pyaxencoapi.PyAxencoAPI
devices: list[dict[str, Any]]
type MyNeomitisConfigEntry = ConfigEntry[MyNeomitisRuntimeData]
async def async_setup_entry(hass: HomeAssistant, entry: MyNeomitisConfigEntry) -> bool:
"""Set up MyNeomitis from a config entry."""
session = async_get_clientsession(hass)
email: str = entry.data[CONF_EMAIL]
password: str = entry.data[CONF_PASSWORD]
api = pyaxencoapi.PyAxencoAPI(session)
connected = False
try:
await api.login(email, password)
await api.connect_websocket()
connected = True
_LOGGER.debug("Successfully connected to Login/WebSocket")
# Retrieve the user's devices
devices: list[dict[str, Any]] = await api.get_devices()
except aiohttp.ClientResponseError as err:
if connected:
try:
await api.disconnect_websocket()
except (
TimeoutError,
ConnectionError,
aiohttp.ClientError,
) as disconnect_err:
_LOGGER.error(
"Error while disconnecting WebSocket for %s: %s",
entry.entry_id,
disconnect_err,
)
if err.status == 401:
raise ConfigEntryAuthFailed(
"Authentication failed, please update your credentials"
) from err
raise ConfigEntryNotReady(f"Error connecting to API: {err}") from err
except (TimeoutError, ConnectionError, aiohttp.ClientError) as err:
if connected:
try:
await api.disconnect_websocket()
except (
TimeoutError,
ConnectionError,
aiohttp.ClientError,
) as disconnect_err:
_LOGGER.error(
"Error while disconnecting WebSocket for %s: %s",
entry.entry_id,
disconnect_err,
)
raise ConfigEntryNotReady(f"Error connecting to API/WebSocket: {err}") from err
entry.runtime_data = MyNeomitisRuntimeData(api=api, devices=devices)
async def _async_disconnect_websocket(_event: Event) -> None:
"""Disconnect WebSocket on Home Assistant shutdown."""
try:
await api.disconnect_websocket()
except (TimeoutError, ConnectionError, aiohttp.ClientError) as err:
_LOGGER.error(
"Error while disconnecting WebSocket for %s: %s",
entry.entry_id,
err,
)
entry.async_on_unload(
hass.bus.async_listen_once(
EVENT_HOMEASSISTANT_STOP, _async_disconnect_websocket
)
)
# Load platforms
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True
async def async_unload_entry(hass: HomeAssistant, entry: MyNeomitisConfigEntry) -> bool:
"""Unload a config entry."""
unload_ok = await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
if unload_ok:
try:
await entry.runtime_data.api.disconnect_websocket()
except (TimeoutError, ConnectionError) as err:
_LOGGER.error(
"Error while disconnecting WebSocket for %s: %s",
entry.entry_id,
err,
)
return unload_ok

@@ -0,0 +1,78 @@
"""Config flow for MyNeomitis integration."""
import logging
from typing import Any
import aiohttp
from pyaxencoapi import PyAxencoAPI
import voluptuous as vol
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_EMAIL, CONF_PASSWORD
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .const import CONF_USER_ID, DOMAIN
_LOGGER = logging.getLogger(__name__)
class MyNeoConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle the configuration flow for the MyNeomitis integration."""
VERSION = 1
MINOR_VERSION = 1
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle the initial step of the configuration flow."""
errors: dict[str, str] = {}
if user_input is not None:
email: str = user_input[CONF_EMAIL]
password: str = user_input[CONF_PASSWORD]
session = async_get_clientsession(self.hass)
api = PyAxencoAPI(session)
try:
await api.login(email, password)
except aiohttp.ClientResponseError as e:
if e.status == 401:
errors["base"] = "invalid_auth"
elif e.status >= 500:
errors["base"] = "cannot_connect"
else:
errors["base"] = "unknown"
except aiohttp.ClientConnectionError:
errors["base"] = "cannot_connect"
except aiohttp.ClientError:
errors["base"] = "unknown"
except Exception:
_LOGGER.exception("Unexpected error during login")
errors["base"] = "unknown"
if not errors:
# Prevent duplicate configuration with the same user ID
await self.async_set_unique_id(api.user_id)
self._abort_if_unique_id_configured()
return self.async_create_entry(
title=f"MyNeomitis ({email})",
data={
CONF_EMAIL: email,
CONF_PASSWORD: password,
CONF_USER_ID: api.user_id,
},
)
return self.async_show_form(
step_id="user",
data_schema=vol.Schema(
{
vol.Required(CONF_EMAIL): str,
vol.Required(CONF_PASSWORD): str,
}
),
errors=errors,
)
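The config flow above maps HTTP failures onto the three standard error keys. A self-contained sketch of that status-to-key mapping (the helper name is hypothetical; the flow inlines this logic in its `except` blocks):

```python
def error_key(status: int) -> str:
    """Map an HTTP error status to a config-flow error key, as in async_step_user."""
    if status == 401:
        return "invalid_auth"  # bad credentials
    if status >= 500:
        return "cannot_connect"  # server-side failure
    return "unknown"  # anything else is unexpected


assert error_key(401) == "invalid_auth"
assert error_key(503) == "cannot_connect"
assert error_key(404) == "unknown"
```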

@@ -0,0 +1,4 @@
"""Constants for the MyNeomitis integration."""
DOMAIN = "myneomitis"
CONF_USER_ID = "user_id"

@@ -0,0 +1,31 @@
{
"entity": {
"select": {
"pilote": {
"state": {
"antifrost": "mdi:snowflake",
"auto": "mdi:refresh-auto",
"boost": "mdi:rocket-launch",
"comfort": "mdi:fire",
"eco": "mdi:leaf",
"eco_1": "mdi:leaf",
"eco_2": "mdi:leaf",
"standby": "mdi:toggle-switch-off-outline"
}
},
"relais": {
"state": {
"auto": "mdi:refresh-auto",
"off": "mdi:toggle-switch-off-outline",
"on": "mdi:toggle-switch"
}
},
"ufh": {
"state": {
"cooling": "mdi:snowflake",
"heating": "mdi:fire"
}
}
}
}
}

@@ -0,0 +1,11 @@
{
"domain": "myneomitis",
"name": "MyNeomitis",
"codeowners": ["@l-pr"],
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/myneomitis",
"integration_type": "hub",
"iot_class": "cloud_push",
"quality_scale": "bronze",
"requirements": ["pyaxencoapi==1.0.6"]
}

@@ -0,0 +1,76 @@
rules:
# Bronze tier rules
action-setup:
status: exempt
comment: Integration does not register service actions.
appropriate-polling:
status: exempt
comment: Integration uses WebSocket push updates, not polling.
brands: done
common-modules: done
config-flow-test-coverage: done
config-flow: done
dependency-transparency: done
docs-actions:
status: exempt
comment: Integration does not provide service actions.
docs-high-level-description: done
docs-installation-instructions: done
docs-removal-instructions: done
entity-event-setup: done
entity-unique-id: done
has-entity-name: done
runtime-data: done
test-before-configure: done
test-before-setup: done
unique-config-entry: done
# Silver tier rules
action-exceptions:
status: exempt
comment: Integration does not provide service actions.
config-entry-unloading: done
docs-configuration-parameters:
status: exempt
comment: Integration has no configuration parameters beyond initial setup.
docs-installation-parameters: done
entity-unavailable: done
integration-owner: done
log-when-unavailable: done
parallel-updates:
status: exempt
comment: Integration uses WebSocket callbacks to push updates directly to entities, not coordinator-based polling.
reauthentication-flow: todo
test-coverage: done
# Gold tier rules
devices: todo
diagnostics: todo
discovery-update-info:
status: exempt
comment: Integration is cloud-based and does not use local discovery.
discovery:
status: exempt
comment: Integration requires manual authentication via cloud service.
docs-data-update: todo
docs-examples: todo
docs-known-limitations: todo
docs-supported-devices: todo
docs-supported-functions: todo
docs-troubleshooting: todo
docs-use-cases: todo
dynamic-devices: todo
entity-category: todo
entity-device-class: todo
entity-disabled-by-default: todo
entity-translations: done
exception-translations: todo
icon-translations: todo
reconfiguration-flow: todo
repair-issues: todo
stale-devices: todo
# Platinum tier rules
async-dependency: done
inject-websession: done
strict-typing: todo

@@ -0,0 +1,208 @@
"""Select entities for MyNeomitis integration.
This module defines and sets up the select entities for the MyNeomitis integration.
"""
from __future__ import annotations
from dataclasses import dataclass
import logging
from typing import Any
from pyaxencoapi import PyAxencoAPI
from homeassistant.components.select import SelectEntity, SelectEntityDescription
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import device_registry as dr
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import MyNeomitisConfigEntry
from .const import DOMAIN
_LOGGER = logging.getLogger(__name__)
SUPPORTED_MODELS: frozenset[str] = frozenset({"EWS"})
SUPPORTED_SUB_MODELS: frozenset[str] = frozenset({"UFH"})
PRESET_MODE_MAP = {
"comfort": 1,
"eco": 2,
"antifrost": 3,
"standby": 4,
"boost": 6,
"setpoint": 8,
"comfort_plus": 20,
"eco_1": 40,
"eco_2": 41,
"auto": 60,
}
PRESET_MODE_MAP_RELAIS = {
"on": 1,
"off": 2,
"auto": 60,
}
PRESET_MODE_MAP_UFH = {
"heating": 0,
"cooling": 1,
}
REVERSE_PRESET_MODE_MAP = {v: k for k, v in PRESET_MODE_MAP.items()}
REVERSE_PRESET_MODE_MAP_RELAIS = {v: k for k, v in PRESET_MODE_MAP_RELAIS.items()}
REVERSE_PRESET_MODE_MAP_UFH = {v: k for k, v in PRESET_MODE_MAP_UFH.items()}
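Because every preset table maps distinct names to distinct codes, the comprehension-built reverse maps give a lossless round trip. A self-contained sketch over a subset of the table:

```python
# Subset of PRESET_MODE_MAP shown above; all codes are unique.
PRESET_MODE_MAP = {"comfort": 1, "eco": 2, "antifrost": 3, "standby": 4, "auto": 60}
REVERSE_PRESET_MODE_MAP = {v: k for k, v in PRESET_MODE_MAP.items()}

# Every mode name survives the name -> code -> name round trip.
for name, code in PRESET_MODE_MAP.items():
    assert REVERSE_PRESET_MODE_MAP[code] == name
```

The device pushes numeric codes over the WebSocket; the reverse map turns them back into the option strings the select entity exposes.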
@dataclass(frozen=True, kw_only=True)
class MyNeoSelectEntityDescription(SelectEntityDescription):
"""Describe MyNeomitis select entity."""
preset_mode_map: dict[str, int]
reverse_preset_mode_map: dict[int, str]
state_key: str
SELECT_TYPES: dict[str, MyNeoSelectEntityDescription] = {
"relais": MyNeoSelectEntityDescription(
key="relais",
translation_key="relais",
options=list(PRESET_MODE_MAP_RELAIS),
preset_mode_map=PRESET_MODE_MAP_RELAIS,
reverse_preset_mode_map=REVERSE_PRESET_MODE_MAP_RELAIS,
state_key="targetMode",
),
"pilote": MyNeoSelectEntityDescription(
key="pilote",
translation_key="pilote",
options=list(PRESET_MODE_MAP),
preset_mode_map=PRESET_MODE_MAP,
reverse_preset_mode_map=REVERSE_PRESET_MODE_MAP,
state_key="targetMode",
),
"ufh": MyNeoSelectEntityDescription(
key="ufh",
translation_key="ufh",
options=list(PRESET_MODE_MAP_UFH),
preset_mode_map=PRESET_MODE_MAP_UFH,
reverse_preset_mode_map=REVERSE_PRESET_MODE_MAP_UFH,
state_key="changeOverUser",
),
}
async def async_setup_entry(
hass: HomeAssistant,
config_entry: MyNeomitisConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Select entities from a config entry."""
api = config_entry.runtime_data.api
devices = config_entry.runtime_data.devices
def _create_entity(device: dict) -> MyNeoSelect:
"""Create a select entity for a device."""
if device["model"] == "EWS":
# According to the MyNeomitis API, EWS "relais" devices expose a "relayMode"
# field in their state, while "pilote" devices do not. We therefore use the
# presence of "relayMode" as an explicit heuristic to distinguish relais
# from pilote devices. If the upstream API changes this behavior, this
# detection logic must be revisited.
if "relayMode" in device.get("state", {}):
description = SELECT_TYPES["relais"]
else:
description = SELECT_TYPES["pilote"]
else: # UFH
description = SELECT_TYPES["ufh"]
return MyNeoSelect(api, device, description)
select_entities = [
_create_entity(device)
for device in devices
if device["model"] in SUPPORTED_MODELS | SUPPORTED_SUB_MODELS
]
async_add_entities(select_entities)
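The relais/pilote heuristic described in the comments above can be isolated as a small pure function (device dicts here are illustrative, not real API payloads):

```python
def classify(device: dict) -> str:
    """Pick the select description key for a device, mirroring _create_entity."""
    if device["model"] == "EWS":
        # Presence of "relayMode" in the state distinguishes relais from pilote.
        return "relais" if "relayMode" in device.get("state", {}) else "pilote"
    return "ufh"  # non-EWS supported models are underfloor heating


assert classify({"model": "EWS", "state": {"relayMode": 1}}) == "relais"
assert classify({"model": "EWS", "state": {}}) == "pilote"
assert classify({"model": "UFH"}) == "ufh"
```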
class MyNeoSelect(SelectEntity):
"""Select entity for MyNeomitis devices."""
entity_description: MyNeoSelectEntityDescription
_attr_has_entity_name = True
_attr_name = None # Entity represents the device itself
_attr_should_poll = False
def __init__(
self,
api: PyAxencoAPI,
device: dict[str, Any],
description: MyNeoSelectEntityDescription,
) -> None:
"""Initialize the MyNeoSelect entity."""
self.entity_description = description
self._api = api
self._device = device
self._attr_unique_id = device["_id"]
self._attr_available = device["connected"]
self._attr_device_info = dr.DeviceInfo(
identifiers={(DOMAIN, device["_id"])},
name=device["name"],
manufacturer="Axenco",
model=device["model"],
)
# Set current option based on device state
current_mode = device.get("state", {}).get(description.state_key)
self._attr_current_option = description.reverse_preset_mode_map.get(
current_mode
)
self._unavailable_logged: bool = False
async def async_added_to_hass(self) -> None:
"""Register listener when entity is added to hass."""
await super().async_added_to_hass()
if unsubscribe := self._api.register_listener(
self._device["_id"], self.handle_ws_update
):
self.async_on_remove(unsubscribe)
@callback
def handle_ws_update(self, new_state: dict[str, Any]) -> None:
"""Handle WebSocket updates for the device."""
if not new_state:
return
if "connected" in new_state:
self._attr_available = new_state["connected"]
if not self._attr_available:
if not self._unavailable_logged:
_LOGGER.info("The entity %s is unavailable", self.entity_id)
self._unavailable_logged = True
elif self._unavailable_logged:
_LOGGER.info("The entity %s is back online", self.entity_id)
self._unavailable_logged = False
# Check for state updates using the description's state_key
state_key = self.entity_description.state_key
if state_key in new_state:
mode = new_state.get(state_key)
if mode is not None:
self._attr_current_option = (
self.entity_description.reverse_preset_mode_map.get(mode)
)
self.async_write_ha_state()
async def async_select_option(self, option: str) -> None:
"""Send the new mode via the API."""
mode_code = self.entity_description.preset_mode_map.get(option)
if mode_code is None:
_LOGGER.warning("Unknown mode selected: %s", option)
return
await self._api.set_device_mode(self._device["_id"], mode_code)
self._attr_current_option = option
self.async_write_ha_state()

@@ -0,0 +1,57 @@
{
"config": {
"abort": {
"already_configured": "This integration is already configured."
},
"error": {
"cannot_connect": "Could not connect to the MyNeomitis service. Please try again later.",
"invalid_auth": "Authentication failed. Please check your email address and password.",
"unknown": "An unexpected error occurred. Please try again."
},
"step": {
"user": {
"data": {
"email": "[%key:common::config_flow::data::email%]",
"password": "[%key:common::config_flow::data::password%]"
},
"data_description": {
"email": "Your email address used for your MyNeomitis account",
"password": "Your MyNeomitis account password"
},
"description": "Enter your MyNeomitis account credentials.",
"title": "Connect to MyNeomitis"
}
}
},
"entity": {
"select": {
"pilote": {
"state": {
"antifrost": "Frost protection",
"auto": "[%key:common::state::auto%]",
"boost": "Boost",
"comfort": "Comfort",
"comfort_plus": "Comfort +",
"eco": "Eco",
"eco_1": "Eco -1",
"eco_2": "Eco -2",
"setpoint": "Setpoint",
"standby": "[%key:common::state::standby%]"
}
},
"relais": {
"state": {
"auto": "[%key:common::state::auto%]",
"off": "[%key:common::state::off%]",
"on": "[%key:common::state::on%]"
}
},
"ufh": {
"state": {
"cooling": "Cooling",
"heating": "Heating"
}
}
}
}
}

@@ -81,6 +81,9 @@
"service": "mdi:comment-remove"
},
"publish": {
"sections": {
"actions": "mdi:gesture-tap-button"
},
"service": "mdi:send"
}
}

@@ -27,7 +27,14 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN
from .coordinator import NtfyConfigEntry
from .entity import NtfyBaseEntity
from .services import ATTR_ATTACH_FILE, ATTR_FILENAME, ATTR_SEQUENCE_ID
from .services import (
ACTIONS_MAP,
ATTR_ACTION,
ATTR_ACTIONS,
ATTR_ATTACH_FILE,
ATTR_FILENAME,
ATTR_SEQUENCE_ID,
)
_LOGGER = logging.getLogger(__name__)
@@ -105,6 +112,15 @@ class NtfyNotifyEntity(NtfyBaseEntity, NotifyEntity):
params.setdefault(ATTR_FILENAME, media.path.name)
actions: list[dict[str, Any]] | None = params.get(ATTR_ACTIONS)
if actions:
params["actions"] = [
ACTIONS_MAP[action[ATTR_ACTION]](
**{k: v for k, v in action.items() if k != ATTR_ACTION}
)
for action in actions
]
msg = Message(topic=self.topic, **params)
try:
await self.ntfy.publish(msg, attachment)
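The `ACTIONS_MAP` comprehension above dispatches on the `action` key and feeds the remaining keys to the matching action class as constructor kwargs. A stdlib sketch of the pattern, with a stand-in dataclass instead of the real aiontfy `ViewAction`:

```python
from dataclasses import dataclass


@dataclass
class ViewAction:  # illustrative stand-in for aiontfy.ViewAction
    label: str
    url: str


ACTIONS_MAP = {"view": ViewAction}

raw = {"action": "view", "label": "Open", "url": "https://example.org"}
# The "action" key selects the class; everything else becomes kwargs.
built = ACTIONS_MAP[raw["action"]](**{k: v for k, v in raw.items() if k != "action"})
assert built == ViewAction(label="Open", url="https://example.org")
```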

@@ -3,6 +3,7 @@
from datetime import timedelta
from typing import Any
from aiontfy import BroadcastAction, CopyAction, HttpAction, ViewAction
import voluptuous as vol
from yarl import URL
@@ -34,6 +35,28 @@ ATTR_ATTACH_FILE = "attach_file"
ATTR_FILENAME = "filename"
GRP_ATTACHMENT = "attachment"
MSG_ATTACHMENT = "Only one attachment source is allowed: URL or local file"
ATTR_ACTIONS = "actions"
ATTR_ACTION = "action"
ATTR_VIEW = "view"
ATTR_BROADCAST = "broadcast"
ATTR_HTTP = "http"
ATTR_LABEL = "label"
ATTR_URL = "url"
ATTR_CLEAR = "clear"
ATTR_INTENT = "intent"
ATTR_EXTRAS = "extras"
ATTR_METHOD = "method"
ATTR_HEADERS = "headers"
ATTR_BODY = "body"
ATTR_VALUE = "value"
ATTR_COPY = "copy"
ACTIONS_MAP = {
ATTR_VIEW: ViewAction,
ATTR_BROADCAST: BroadcastAction,
ATTR_HTTP: HttpAction,
ATTR_COPY: CopyAction,
}
MAX_ACTIONS_ALLOWED = 3 # ntfy only supports up to 3 actions per notification
def validate_filename(params: dict[str, Any]) -> dict[str, Any]:
@@ -45,6 +68,40 @@ def validate_filename(params: dict[str, Any]) -> dict[str, Any]:
return params
ACTION_SCHEMA = vol.Schema(
{
vol.Required(ATTR_LABEL): cv.string,
vol.Optional(ATTR_CLEAR, default=False): cv.boolean,
}
)
VIEW_SCHEMA = ACTION_SCHEMA.extend(
{
vol.Required(ATTR_ACTION): vol.Equal("view"),
vol.Required(ATTR_URL): vol.All(vol.Url(), vol.Coerce(URL)),
}
)
BROADCAST_SCHEMA = ACTION_SCHEMA.extend(
{
vol.Required(ATTR_ACTION): vol.Equal("broadcast"),
vol.Optional(ATTR_INTENT): cv.string,
vol.Optional(ATTR_EXTRAS): dict[str, str],
}
)
HTTP_SCHEMA = VIEW_SCHEMA.extend(
{
vol.Required(ATTR_ACTION): vol.Equal("http"),
vol.Optional(ATTR_METHOD): cv.string,
vol.Optional(ATTR_HEADERS): dict[str, str],
vol.Optional(ATTR_BODY): cv.string,
}
)
COPY_SCHEMA = ACTION_SCHEMA.extend(
{
vol.Required(ATTR_ACTION): vol.Equal("copy"),
vol.Required(ATTR_VALUE): cv.string,
}
)
SERVICE_PUBLISH_SCHEMA = vol.All(
cv.make_entity_service_schema(
{
@@ -69,6 +126,14 @@ SERVICE_PUBLISH_SCHEMA = vol.All(
ATTR_ATTACH_FILE, GRP_ATTACHMENT, MSG_ATTACHMENT
): MediaSelector({"accept": ["*/*"]}),
vol.Optional(ATTR_FILENAME): cv.string,
vol.Optional(ATTR_ACTIONS): vol.All(
cv.ensure_list,
vol.Length(
max=MAX_ACTIONS_ALLOWED,
msg="Too many actions defined. A maximum of 3 is supported",
),
[vol.Any(VIEW_SCHEMA, BROADCAST_SCHEMA, HTTP_SCHEMA, COPY_SCHEMA)],
),
}
),
validate_filename,

@@ -99,6 +99,65 @@ publish:
type: url
autocomplete: url
example: https://example.org/logo.png
actions:
selector:
object:
label_field: "label"
description_field: "url"
multiple: true
translation_key: actions
fields:
action:
required: true
selector:
select:
options:
- value: view
label: Open website/app
- value: http
label: Send HTTP request
- value: broadcast
label: Send Android broadcast
- value: copy
label: Copy to clipboard
translation_key: action_type
mode: dropdown
label:
selector:
text:
required: true
clear:
selector:
boolean:
url:
selector:
text:
type: url
method:
selector:
select:
options:
- GET
- POST
- PUT
- DELETE
custom_value: true
headers:
selector:
object:
body:
selector:
text:
multiline: true
intent:
selector:
text:
extras:
selector:
object:
value:
selector:
text:
sequence_id:
required: false
selector:

@@ -318,6 +318,50 @@
}
},
"selector": {
"actions": {
"fields": {
"action": {
"description": "Select the type of action to add to the notification",
"name": "Action type"
},
"body": {
"description": "The body of the HTTP request for `http` actions.",
"name": "HTTP body"
},
"clear": {
"description": "Clear notification after action button is tapped",
"name": "Clear notification"
},
"extras": {
"description": "Extras to include in the intent as key-value pairs for 'broadcast' actions",
"name": "Intent extras"
},
"headers": {
"description": "Additional HTTP headers as key-value pairs for 'http' actions",
"name": "HTTP headers"
},
"intent": {
"description": "Android intent to send when the 'broadcast' action is triggered",
"name": "Intent"
},
"label": {
"description": "Label of the action button",
"name": "Label"
},
"method": {
"description": "HTTP method to use for the 'http' action",
"name": "HTTP method"
},
"url": {
"description": "URL to open for the 'view' action or to request for the 'http' action",
"name": "URL"
},
"value": {
"description": "Value to copy to clipboard when the 'copy' action is triggered",
"name": "Value"
}
}
},
"priority": {
"options": {
"1": "Minimum",
@@ -352,6 +396,10 @@
"publish": {
"description": "Publishes a notification message to a ntfy topic",
"fields": {
"actions": {
"description": "Up to three actions ('view', 'broadcast', or 'http') can be added as buttons below the notification. Actions are executed when the corresponding button is tapped or clicked.",
"name": "Action buttons"
},
"attach": {
"description": "Attach images or other files by URL.",
"name": "Attachment URL"

@@ -9,5 +9,5 @@
"integration_type": "service",
"iot_class": "local_push",
"quality_scale": "platinum",
"requirements": ["python-overseerr==0.8.0"]
"requirements": ["python-overseerr==0.9.0"]
}

@@ -79,6 +79,14 @@ async def _async_get_requests(call: ServiceCall) -> ServiceResponse:
req["media"] = await _get_media(
client, request.media.media_type, request.media.tmdb_id
)
for user in (req["modified_by"], req["requested_by"]):
del user["avatar_e_tag"]
del user["avatar_version"]
del user["permissions"]
del user["recovery_link_expiration_date"]
del user["settings"]
del user["user_type"]
del user["warnings"]
result.append(req)
return {"requests": cast(list[JsonValueType], result)}
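The field-stripping loop above trims internal user attributes from both user payloads before the response is returned. A self-contained sketch with illustrative data and a shortened field list:

```python
# Illustrative request payload; real responses carry many more fields.
req = {
    "modified_by": {"id": 1, "permissions": 2, "settings": {}},
    "requested_by": {"id": 2, "permissions": 4, "settings": {}},
}

# Same shape as the loop in _async_get_requests: delete internal fields
# from both user objects in place.
for user in (req["modified_by"], req["requested_by"]):
    for field in ("permissions", "settings"):
        del user[field]

assert req["modified_by"] == {"id": 1}
assert req["requested_by"] == {"id": 2}
```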

@@ -2,7 +2,7 @@
from __future__ import annotations
from dataclasses import dataclass
from dataclasses import asdict, dataclass
from typing import Any
from homeassistant.components.climate import (
@@ -38,10 +38,7 @@ class PlugwiseClimateExtraStoredData(ExtraStoredData):
def as_dict(self) -> dict[str, Any]:
"""Return a dict representation of the text data."""
return {
"last_active_schedule": self.last_active_schedule,
"previous_action_mode": self.previous_action_mode,
}
return asdict(self)
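Replacing the hand-written dict with `dataclasses.asdict` keeps `as_dict` in sync with the dataclass fields automatically. A sketch with a stand-in dataclass (field names match the stored data above):

```python
from dataclasses import asdict, dataclass
from typing import Optional


@dataclass
class ExtraData:  # illustrative stand-in for the extra-stored-data dataclass
    last_active_schedule: Optional[str]
    previous_action_mode: Optional[str]


data = ExtraData("weekday", "heating")
# asdict produces exactly the dict the manual version built by hand.
assert asdict(data) == {
    "last_active_schedule": "weekday",
    "previous_action_mode": "heating",
}
```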
@classmethod
def from_dict(cls, restored: dict[str, Any]) -> PlugwiseClimateExtraStoredData:
@@ -102,7 +99,9 @@ class PlugwiseClimateEntity(PlugwiseEntity, ClimateEntity, RestoreEntity):
extra_data.as_dict()
)
self._last_active_schedule = plugwise_extra_data.last_active_schedule
self._previous_action_mode = plugwise_extra_data.previous_action_mode
self._previous_action_mode = (
plugwise_extra_data.previous_action_mode or HVACAction.HEATING.value
)
def __init__(
self,
@@ -202,11 +201,10 @@ class PlugwiseClimateEntity(PlugwiseEntity, ClimateEntity, RestoreEntity):
if self.coordinator.api.cooling_present:
if "regulation_modes" in self._gateway_data:
selected = self._gateway_data.get("select_regulation_mode")
if selected == HVACAction.COOLING.value:
hvac_modes.append(HVACMode.COOL)
if selected == HVACAction.HEATING.value:
if "heating" in self._gateway_data["regulation_modes"]:
hvac_modes.append(HVACMode.HEAT)
if "cooling" in self._gateway_data["regulation_modes"]:
hvac_modes.append(HVACMode.COOL)
else:
hvac_modes.append(HVACMode.HEAT_COOL)
else:
@@ -253,40 +251,75 @@ class PlugwiseClimateEntity(PlugwiseEntity, ClimateEntity, RestoreEntity):
await self.coordinator.api.set_temperature(self._location, data)
def _regulation_mode_for_hvac(self, hvac_mode: HVACMode) -> str | None:
"""Return the API regulation value for a manual HVAC mode, or None."""
if hvac_mode == HVACMode.HEAT:
return HVACAction.HEATING.value
if hvac_mode == HVACMode.COOL:
return HVACAction.COOLING.value
return None
@plugwise_command
async def async_set_hvac_mode(self, hvac_mode: HVACMode) -> None:
"""Set the hvac mode."""
"""Set the HVAC mode (off, heat, cool, heat_cool, or auto/schedule)."""
if hvac_mode == self.hvac_mode:
return
api = self.coordinator.api
current_schedule = self.device.get("select_schedule")
# OFF: single API call
if hvac_mode == HVACMode.OFF:
await self.coordinator.api.set_regulation_mode(hvac_mode.value)
else:
current = self.device.get("select_schedule")
desired = current
await api.set_regulation_mode(hvac_mode.value)
return
# Capture the last valid schedule
if desired and desired != "off":
self._last_active_schedule = desired
elif desired == "off":
desired = self._last_active_schedule
# Enabling HVACMode.AUTO requires a previously set schedule for saving and restoring
if hvac_mode == HVACMode.AUTO and not desired:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key=ERROR_NO_SCHEDULE,
)
await self.coordinator.api.set_schedule_state(
self._location,
STATE_ON if hvac_mode == HVACMode.AUTO else STATE_OFF,
desired,
# Manual mode (heat/cool/heat_cool) without a schedule: set regulation only
if (
current_schedule is None
and hvac_mode != HVACMode.AUTO
and (
regulation := self._regulation_mode_for_hvac(hvac_mode)
or self._previous_action_mode
)
if self.hvac_mode == HVACMode.OFF and self._previous_action_mode:
await self.coordinator.api.set_regulation_mode(
self._previous_action_mode
):
await api.set_regulation_mode(regulation)
return
# Manual mode: ensure regulation and turn off schedule when needed
if hvac_mode in (HVACMode.HEAT, HVACMode.COOL, HVACMode.HEAT_COOL):
regulation = self._regulation_mode_for_hvac(hvac_mode) or (
self._previous_action_mode
if self.hvac_mode in (HVACMode.HEAT_COOL, HVACMode.OFF)
else None
)
if regulation:
await api.set_regulation_mode(regulation)
if (
self.hvac_mode == HVACMode.OFF and current_schedule not in (None, "off")
) or (self.hvac_mode == HVACMode.AUTO and current_schedule is not None):
await api.set_schedule_state(
self._location, STATE_OFF, current_schedule
)
return
# AUTO: restore schedule and regulation
desired_schedule = current_schedule
if desired_schedule and desired_schedule != "off":
self._last_active_schedule = desired_schedule
elif desired_schedule == "off":
desired_schedule = self._last_active_schedule
if not desired_schedule:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key=ERROR_NO_SCHEDULE,
)
if self._previous_action_mode:
if self.hvac_mode == HVACMode.OFF:
await api.set_regulation_mode(self._previous_action_mode)
await api.set_schedule_state(self._location, STATE_ON, desired_schedule)
@plugwise_command
async def async_set_preset_mode(self, preset_mode: str) -> None:

@@ -19,6 +19,7 @@ from homeassistant.core import HomeAssistant
from homeassistant.helpers.aiohttp_client import async_create_clientsession
import homeassistant.helpers.config_validation as cv
import homeassistant.helpers.device_registry as dr
from homeassistant.helpers.device_registry import DeviceEntry
import homeassistant.helpers.entity_registry as er
from homeassistant.helpers.typing import ConfigType
@@ -137,3 +138,26 @@ async def async_migrate_entry(hass: HomeAssistant, entry: PortainerConfigEntry)
hass.config_entries.async_update_entry(entry=entry, version=4)
return True
async def async_remove_config_entry_device(
hass: HomeAssistant,
entry: PortainerConfigEntry,
device: DeviceEntry,
) -> bool:
"""Remove a config entry from a device."""
coordinator = entry.runtime_data
valid_identifiers: set[tuple[str, str]] = set()
# The Portainer integration creates devices for both endpoints and containers, so both identifier sets must be collected
valid_identifiers.update(
(DOMAIN, f"{entry.entry_id}_{endpoint_id}") for endpoint_id in coordinator.data
)
valid_identifiers.update(
(DOMAIN, f"{entry.entry_id}_{container_name}")
for endpoint in coordinator.data.values()
for container_name in endpoint.containers
)
return not device.identifiers.intersection(valid_identifiers)
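The removal check above boils down to a set intersection: a device may be deleted only if none of its identifiers are still produced from coordinator data. A stdlib sketch with illustrative data:

```python
DOMAIN = "portainer"
entry_id = "entry1"  # illustrative config entry id
endpoints = {"ep1": ["container_a"]}  # stand-in for coordinator.data

# Build the valid set from both endpoints and their containers.
valid_identifiers = {(DOMAIN, f"{entry_id}_{ep}") for ep in endpoints}
valid_identifiers |= {
    (DOMAIN, f"{entry_id}_{name}")
    for containers in endpoints.values()
    for name in containers
}


def removable(identifiers: set) -> bool:
    """Mirror async_remove_config_entry_device: removable iff no overlap."""
    return not identifiers & valid_identifiers


assert removable({(DOMAIN, f"{entry_id}_gone")})  # stale container device
assert not removable({(DOMAIN, f"{entry_id}_container_a")})  # still present
```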

@@ -30,11 +30,8 @@ rules:
entity-unavailable: done
integration-owner: done
log-when-unavailable: done
parallel-updates: todo
reauthentication-flow:
status: todo
comment: |
No reauthentication flow is defined. It will be done in a next iteration.
parallel-updates: done
reauthentication-flow: done
test-coverage: done
# Gold
devices: done
@@ -47,25 +44,27 @@ rules:
status: exempt
comment: |
No discovery is implemented, since it's software based.
docs-data-update: todo
docs-examples: todo
docs-known-limitations: todo
docs-supported-devices: todo
docs-supported-functions: todo
docs-troubleshooting: todo
docs-use-cases: todo
dynamic-devices: todo
entity-category: todo
entity-device-class: todo
entity-disabled-by-default: todo
entity-translations: todo
exception-translations: todo
icon-translations: todo
docs-data-update: done
docs-examples: done
docs-known-limitations: done
docs-supported-devices: done
docs-supported-functions: done
docs-troubleshooting: done
docs-use-cases: done
dynamic-devices: done
entity-category: done
entity-device-class: done
entity-disabled-by-default: done
entity-translations: done
exception-translations: done
icon-translations: done
reconfiguration-flow: done
repair-issues: todo
stale-devices: todo
repair-issues:
status: exempt
comment: |
No repair issues are implemented, currently.
stale-devices: done
# Platinum
async-dependency: todo
async-dependency: done
inject-websession: done
strict-typing: done

@@ -176,7 +176,7 @@ class ProxmoxCoordinator(DataUpdateCoordinator[dict[str, ProxmoxNodeData]]):
list[dict[str, Any]], list[tuple[list[dict[str, Any]], list[dict[str, Any]]]]
]:
"""Fetch all nodes, and then proceed to the VMs and containers."""
nodes = self.proxmox.nodes.get()
nodes = self.proxmox.nodes.get() or []
vms_containers = [self._get_vms_containers(node) for node in nodes]
return nodes, vms_containers
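The `or []` guard above matters because the subsequent list comprehension iterates the result; if the client ever returns `None`, iteration would raise `TypeError`. A minimal sketch:

```python
def get_nodes(raw):
    """Normalize a possibly-None API result to a list, as in the coordinator."""
    return raw or []  # None (or any falsy result) becomes an empty list


# Iterating the guarded result is always safe.
assert [n for n in get_nodes(None)] == []
assert get_nodes([{"node": "pve"}]) == [{"node": "pve"}]
```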

@@ -0,0 +1,28 @@
"""Diagnostics support for Proxmox VE."""
from __future__ import annotations
from dataclasses import asdict
from typing import Any
from homeassistant.components.diagnostics import async_redact_data
from homeassistant.const import CONF_HOST, CONF_PASSWORD, CONF_USERNAME
from homeassistant.core import HomeAssistant
from . import ProxmoxConfigEntry
TO_REDACT = [CONF_USERNAME, CONF_PASSWORD, CONF_HOST]
async def async_get_config_entry_diagnostics(
hass: HomeAssistant, config_entry: ProxmoxConfigEntry
) -> dict[str, Any]:
"""Return diagnostics for a Proxmox VE config entry."""
return {
"config_entry": async_redact_data(config_entry.as_dict(), TO_REDACT),
"devices": {
node: asdict(node_data)
for node, node_data in config_entry.runtime_data.data.items()
},
}
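The diagnostics payload above redacts credentials before export. A stdlib sketch of the redaction step (simplified, non-recursive stand-in for `async_redact_data`):

```python
TO_REDACT = {"username", "password", "host"}


def redact(data: dict) -> dict:
    """Replace sensitive top-level keys, keeping everything else intact."""
    return {k: ("**REDACTED**" if k in TO_REDACT else v) for k, v in data.items()}


assert redact({"host": "1.2.3.4", "port": 8006}) == {
    "host": "**REDACTED**",
    "port": 8006,
}
```

The real helper also walks nested dicts and lists; this sketch only shows the top-level substitution.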

Some files were not shown because too many files have changed in this diff.