Compare commits


58 Commits

Author SHA1 Message Date
epenet
834227a762 Use constants in calendar test (#164021) 2026-02-25 10:51:58 +01:00
Ludovic BOUÉ
3426846361 Add CLEAN_AREA feature to Matter vacuum entity (#163570)
Co-authored-by: Artur Pragacz <49985303+arturpragacz@users.noreply.github.com>
Co-authored-by: Artur Pragacz <artur@pragacz.com>
2026-02-25 10:47:47 +01:00
Artur Pragacz
50f39621e9 Add vacuum area mapping not configured issue (#163965) 2026-02-25 10:45:44 +01:00
epenet
dc133bf7cc Move Tuya helpers to external library (#158791) 2026-02-25 10:35:12 +01:00
TheJulianJES
3219417a7d Bump ZHA to 1.0.0 (#164013) 2026-02-25 09:51:30 +01:00
Joost Lekkerkerker
9a23a518ed Add integration_type device to ws66i (#163987) 2026-02-25 09:50:16 +01:00
Joost Lekkerkerker
7e62852723 Add integration_type hub to watts (#163973) 2026-02-25 09:45:33 +01:00
Zhephyr
0a1027391f Add pet last seen flap device id and user id sensors to Sure Petcare (#160215) 2026-02-25 08:59:22 +01:00
Allen Porter
7644fc4325 Update MCP client integration to use new OAuth spec (#161611)
Co-authored-by: Robert Resch <robert@resch.dev>
2026-02-24 23:18:25 -08:00
Yangqian Yan
2f80720730 Add Full support for roborock Zeo washing/drying machines (#159575)
Co-authored-by: Norbert Rittel <norbert@rittel.de>
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-24 23:17:56 -08:00
Joost Lekkerkerker
644c74f311 Add integration_type hub to zwave_me (#164000) 2026-02-25 07:29:36 +01:00
Joost Lekkerkerker
29370add66 Add integration_type service to zamg (#163997) 2026-02-25 07:28:32 +01:00
Joost Lekkerkerker
fc4680ad86 Add integration_type device to youless (#163996) 2026-02-25 07:28:11 +01:00
Joost Lekkerkerker
174076ba76 Add integration_type hub to yolink (#163995) 2026-02-25 07:27:37 +01:00
Joost Lekkerkerker
f3590bd9cf Add integration_type device to yeelight (#163994) 2026-02-25 07:27:08 +01:00
Joost Lekkerkerker
ae7f71219f Add integration_type device to yardian (#163993) 2026-02-25 07:26:27 +01:00
Joost Lekkerkerker
e1529620db Add integration_type hub to yale (#163989) 2026-02-25 07:25:20 +01:00
Joost Lekkerkerker
9a56d30924 Add integration_type hub to yale_smart_alarm (#163990) 2026-02-25 07:24:54 +01:00
Joost Lekkerkerker
d6df2b3c4c Add integration_type device to yamaha_musiccast (#163992) 2026-02-25 07:24:28 +01:00
Joost Lekkerkerker
9740dc65aa Add integration_type device to yalexs_ble (#163991) 2026-02-25 07:23:47 +01:00
Joost Lekkerkerker
b914971531 Add integration_type device to wolflink (#163982) 2026-02-25 07:23:14 +01:00
Joost Lekkerkerker
9007c65b50 Add integration_type hub to wilight (#163979) 2026-02-25 07:22:35 +01:00
Joost Lekkerkerker
a4a2847b03 Add integration_type hub to weheat (#163977) 2026-02-25 07:21:04 +01:00
Joost Lekkerkerker
9a11db2ad5 Add integration_type service to weatherkit (#163976) 2026-02-25 07:20:32 +01:00
Joost Lekkerkerker
2d445f8f53 Add integration_type hub to weatherflow_cloud (#163975) 2026-02-25 07:20:06 +01:00
Joost Lekkerkerker
f07c386529 Add integration_type device to watergate (#163972) 2026-02-25 07:18:54 +01:00
Joost Lekkerkerker
3cd79581dc Add integration_type hub to zimi (#163999) 2026-02-25 07:07:50 +01:00
Joost Lekkerkerker
e82df86dda Add integration_type hub to xiaomi_aqara (#163988) 2026-02-25 07:07:25 +01:00
Joost Lekkerkerker
1629d2b204 Add integration_type service to worldclock (#163986) 2026-02-25 07:07:05 +01:00
Joost Lekkerkerker
a6e60d8b73 Add integration_type hub to withings (#163980) 2026-02-25 07:06:45 +01:00
Joost Lekkerkerker
ef6650548e Add integration_type service to waze_travel_time (#163974) 2026-02-25 07:06:14 +01:00
Manu
52a2e94fc4 Bump aiontfy to 0.8.1 (#164010) 2026-02-25 07:05:36 +01:00
Klaas Schoute
6bba7e7583 Bump powerfox to v2.1.1 (#164004) 2026-02-25 02:14:27 +01:00
MizterB
58e8a8d398 Ecobee username/password authentication (#161716)
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-02-25 01:41:36 +01:00
Klaas Schoute
6b0303a1ef Set quality scale to platinum for Powerfox Local integration (#164003) 2026-02-25 01:22:27 +01:00
Klaas Schoute
249e6c2f3d Add reconfiguration flow for Powerfox Local integration (#164002) 2026-02-25 01:05:30 +01:00
Simone Chemelli
7ae0380b33 Update IQS to gold for UptimeRobot (#162926) 2026-02-25 01:05:17 +01:00
Tom
889faa5a5c Add v6 firmware support to airOS (#163889)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-02-25 01:02:26 +01:00
Klaas Schoute
9b810c64d9 Add diagnostics support for Powerfox Local integration (#163985) 2026-02-25 00:24:33 +01:00
Joost Lekkerkerker
1e3bed9864 Add integration_type device to wiz (#163981) 2026-02-25 00:04:34 +01:00
Tom
eac3fb651e Update airOS quality_scale (#163895)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-24 23:47:18 +01:00
Karl Beecken
8b285239f0 Update Teltonika IQS to silver (#163943) 2026-02-24 23:37:46 +01:00
Andreas Jakl
d0a74ad539 Update quality scale to silver for nrgkick integration (#163964) 2026-02-24 23:35:06 +01:00
Andreas Jakl
0f071c1ae5 Fix accessing optional username and password for nrgkick integration (#163963) 2026-02-24 23:33:40 +01:00
mettolen
e671e4408b Implement dynamic devices for Liebherr integration (#163951)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-24 23:32:51 +01:00
Klaas Schoute
697441969b Add reauthentication flow for Powerfox Local integration (#163966) 2026-02-24 23:29:19 +01:00
nic
bc324a1a6e Add ZoneMinder integration test suite (#163115)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-02-24 23:27:13 +01:00
Robin Lintermann
e505ad9003 Update availability of entities when connection changes (#163252) 2026-02-24 23:25:57 +01:00
Jan Čermák
6a91771f04 Use native ARM runner for builder action, update to builder 2026.02.1 (#163942) 2026-02-24 23:14:50 +01:00
Christian Lackas
e7df4356f4 Fix HmIP-RGBW monochrome mode FEATURE_NOT_SUPPORTED error (#161917) 2026-02-24 22:38:47 +01:00
Luke Lashley
a41207d369 Implement changes for Clean area for Roborock. (#163956) 2026-02-24 22:34:56 +01:00
cdheiser
28e8d7c3eb Add tests to lutron (#162055)
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-02-24 22:30:31 +01:00
mettolen
e514faf0bc Fix Saunum session parameters to use timedelta (#163962) 2026-02-24 22:14:09 +01:00
Erwin Douna
7894a80728 Proxmox separate errors and patch tests (#163922) 2026-02-24 22:08:50 +01:00
Przemko92
6751f6f4a2 Add sensor for compit integration (#161527) 2026-02-24 21:49:47 +01:00
Erwin Douna
ce0dd0eb7b Fix small typo in Portainer containers (#163957) 2026-02-24 21:45:34 +01:00
Erwin Douna
7cb595f768 Add sensor platform to Proxmox (#163404) 2026-02-24 21:45:07 +01:00
Kamil Breguła
dfbd4ffb2d Add diagnostics to met (#157805)
Co-authored-by: mik-laj <12058428+mik-laj@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-24 21:34:54 +01:00
271 changed files with 15616 additions and 3888 deletions


@@ -272,7 +272,7 @@ jobs:
name: Build ${{ matrix.machine }} machine core image
if: github.repository_owner == 'home-assistant'
needs: ["init", "build_base"]
runs-on: ubuntu-latest
runs-on: ${{ matrix.runs-on }}
permissions:
contents: read # To check out the repository
packages: write # To push to GHCR
@@ -294,6 +294,21 @@ jobs:
- raspberrypi5-64
- yellow
- green
include:
# Default: aarch64 on native ARM runner
- arch: aarch64
runs-on: ubuntu-24.04-arm
# Overrides for amd64 machines
- machine: generic-x86-64
arch: amd64
runs-on: ubuntu-24.04
- machine: qemux86-64
arch: amd64
runs-on: ubuntu-24.04
# TODO: remove, intel-nuc is a legacy name for x86-64, renamed in 2021
- machine: intel-nuc
arch: amd64
runs-on: ubuntu-24.04
steps:
- name: Checkout the repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
@@ -321,8 +336,9 @@ jobs:
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build base image
uses: home-assistant/builder@2025.11.0 # zizmor: ignore[unpinned-uses]
uses: home-assistant/builder@6cb4fd3d1338b6e22d0958a4bcb53e0965ea63b4 # 2026.02.1
with:
image: ${{ matrix.arch }}
args: |
$BUILD_ARGS \
--target /data/machine \
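The workflow diff above moves aarch64 machine builds onto native ARM runners: the matrix keeps a single `machine` axis, a keyless `include` entry supplies the aarch64/ARM-runner default for every machine, and machine-keyed entries override it for the amd64 targets. A minimal sketch of how those `include` entries resolve per machine (the `resolve` helper is hypothetical; machine names are taken from the diff):

```python
# Illustrative only: emulates how the matrix `include` block above resolves.
# Entries without a `machine` key apply to all machines; machine-keyed
# entries override the default for that machine.
ARM_DEFAULT = {"arch": "aarch64", "runs-on": "ubuntu-24.04-arm"}

# Per-machine amd64 overrides, mirroring the `include` block in the diff.
OVERRIDES = {
    "generic-x86-64": {"arch": "amd64", "runs-on": "ubuntu-24.04"},
    "qemux86-64": {"arch": "amd64", "runs-on": "ubuntu-24.04"},
    "intel-nuc": {"arch": "amd64", "runs-on": "ubuntu-24.04"},  # legacy x86-64 name
}

def resolve(machine: str) -> dict:
    """Return the effective arch/runner pair for one matrix combination."""
    return {"machine": machine, **OVERRIDES.get(machine, ARM_DEFAULT)}
```

This mirrors the documented Actions behavior that later `include` entries may overwrite values added by earlier `include` entries but never the original matrix values.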

CODEOWNERS (generated)

@@ -242,8 +242,6 @@ build.json @home-assistant/supervisor
/tests/components/bosch_alarm/ @mag1024 @sanjay900
/homeassistant/components/bosch_shc/ @tschamm
/tests/components/bosch_shc/ @tschamm
/homeassistant/components/brands/ @home-assistant/core
/tests/components/brands/ @home-assistant/core
/homeassistant/components/braviatv/ @bieniu @Drafteed
/tests/components/braviatv/ @bieniu @Drafteed
/homeassistant/components/bring/ @miaucl @tr4nt0r
@@ -1968,6 +1966,7 @@ build.json @home-assistant/supervisor
/homeassistant/components/zone/ @home-assistant/core
/tests/components/zone/ @home-assistant/core
/homeassistant/components/zoneminder/ @rohankapoorcom @nabbi
/tests/components/zoneminder/ @rohankapoorcom @nabbi
/homeassistant/components/zwave_js/ @home-assistant/z-wave
/tests/components/zwave_js/ @home-assistant/z-wave
/homeassistant/components/zwave_me/ @lawfulchaos @Z-Wave-Me @PoltoS


@@ -210,7 +210,6 @@ DEFAULT_INTEGRATIONS = {
"analytics", # Needed for onboarding
"application_credentials",
"backup",
"brands",
"frontend",
"hardware",
"labs",


@@ -4,7 +4,16 @@ from __future__ import annotations
import logging
from airos.airos6 import AirOS6
from airos.airos8 import AirOS8
from airos.exceptions import (
AirOSConnectionAuthenticationError,
AirOSConnectionSetupError,
AirOSDataMissingError,
AirOSDeviceConnectionError,
AirOSKeyDataMissingError,
)
from airos.helpers import DetectDeviceData, async_get_firmware_data
from homeassistant.const import (
CONF_HOST,
@@ -15,6 +24,11 @@ from homeassistant.const import (
Platform,
)
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import (
ConfigEntryAuthFailed,
ConfigEntryError,
ConfigEntryNotReady,
)
from homeassistant.helpers import device_registry as dr, entity_registry as er
from homeassistant.helpers.aiohttp_client import async_get_clientsession
@@ -39,15 +53,40 @@ async def async_setup_entry(hass: HomeAssistant, entry: AirOSConfigEntry) -> boo
hass, verify_ssl=entry.data[SECTION_ADVANCED_SETTINGS][CONF_VERIFY_SSL]
)
airos_device = AirOS8(
host=entry.data[CONF_HOST],
username=entry.data[CONF_USERNAME],
password=entry.data[CONF_PASSWORD],
session=session,
use_ssl=entry.data[SECTION_ADVANCED_SETTINGS][CONF_SSL],
conn_data = {
CONF_HOST: entry.data[CONF_HOST],
CONF_USERNAME: entry.data[CONF_USERNAME],
CONF_PASSWORD: entry.data[CONF_PASSWORD],
"use_ssl": entry.data[SECTION_ADVANCED_SETTINGS][CONF_SSL],
"session": session,
}
# Determine firmware version before creating the device instance
try:
device_data: DetectDeviceData = await async_get_firmware_data(**conn_data)
except (
AirOSConnectionSetupError,
AirOSDeviceConnectionError,
TimeoutError,
) as err:
raise ConfigEntryNotReady from err
except (
AirOSConnectionAuthenticationError,
AirOSDataMissingError,
) as err:
raise ConfigEntryAuthFailed from err
except AirOSKeyDataMissingError as err:
raise ConfigEntryError("key_data_missing") from err
except Exception as err:
raise ConfigEntryError("unknown") from err
airos_class: type[AirOS8 | AirOS6] = (
AirOS8 if device_data["fw_major"] == 8 else AirOS6
)
coordinator = AirOSDataUpdateCoordinator(hass, entry, airos_device)
airos_device = airos_class(**conn_data)
coordinator = AirOSDataUpdateCoordinator(hass, entry, device_data, airos_device)
await coordinator.async_config_entry_first_refresh()
entry.runtime_data = coordinator
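The `async_setup_entry` diff above now probes the device's firmware before constructing a client, then picks `AirOS8` or `AirOS6` from the detected major version. A minimal sketch of that dispatch pattern, with stub classes standing in for the real `airos` library:

```python
# Sketch of the version-dispatch pattern from the diff: detect firmware
# first, then instantiate the matching client class. AirOS6/AirOS8 here
# are stand-ins for the airos library classes, not the real API.
class AirOS6:
    def __init__(self, **conn_data):
        self.conn = conn_data

class AirOS8:
    def __init__(self, **conn_data):
        self.conn = conn_data

def make_client(device_data: dict, **conn_data):
    """Pick the client class from the detected major firmware version."""
    airos_class = AirOS8 if device_data["fw_major"] == 8 else AirOS6
    return airos_class(**conn_data)
```

Detecting first also lets setup translate probe failures into the right config-entry exceptions (retry vs. reauth) before any class-specific code runs.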


@@ -4,7 +4,9 @@ from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass
import logging
from typing import Generic, TypeVar
from airos.data import AirOSDataBaseClass
from homeassistant.components.binary_sensor import (
BinarySensorDeviceClass,
@@ -18,25 +20,24 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import AirOS8Data, AirOSConfigEntry, AirOSDataUpdateCoordinator
from .entity import AirOSEntity
_LOGGER = logging.getLogger(__name__)
PARALLEL_UPDATES = 0
AirOSDataModel = TypeVar("AirOSDataModel", bound=AirOSDataBaseClass)
@dataclass(frozen=True, kw_only=True)
class AirOSBinarySensorEntityDescription(BinarySensorEntityDescription):
class AirOSBinarySensorEntityDescription(
BinarySensorEntityDescription,
Generic[AirOSDataModel],
):
"""Describe an AirOS binary sensor."""
value_fn: Callable[[AirOS8Data], bool]
value_fn: Callable[[AirOSDataModel], bool]
BINARY_SENSORS: tuple[AirOSBinarySensorEntityDescription, ...] = (
AirOSBinarySensorEntityDescription(
key="portfw",
translation_key="port_forwarding",
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=lambda data: data.portfw,
),
AirOS8BinarySensorEntityDescription = AirOSBinarySensorEntityDescription[AirOS8Data]
COMMON_BINARY_SENSORS: tuple[AirOSBinarySensorEntityDescription, ...] = (
AirOSBinarySensorEntityDescription(
key="dhcp_client",
translation_key="dhcp_client",
@@ -52,14 +53,6 @@ BINARY_SENSORS: tuple[AirOSBinarySensorEntityDescription, ...] = (
value_fn=lambda data: data.services.dhcpd,
entity_registry_enabled_default=False,
),
AirOSBinarySensorEntityDescription(
key="dhcp6_server",
translation_key="dhcp6_server",
device_class=BinarySensorDeviceClass.RUNNING,
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=lambda data: data.services.dhcp6d_stateful,
entity_registry_enabled_default=False,
),
AirOSBinarySensorEntityDescription(
key="pppoe",
translation_key="pppoe",
@@ -70,6 +63,23 @@ BINARY_SENSORS: tuple[AirOSBinarySensorEntityDescription, ...] = (
),
)
AIROS8_BINARY_SENSORS: tuple[AirOS8BinarySensorEntityDescription, ...] = (
AirOS8BinarySensorEntityDescription(
key="portfw",
translation_key="port_forwarding",
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=lambda data: data.portfw,
),
AirOS8BinarySensorEntityDescription(
key="dhcp6_server",
translation_key="dhcp6_server",
device_class=BinarySensorDeviceClass.RUNNING,
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=lambda data: data.services.dhcp6d_stateful,
entity_registry_enabled_default=False,
),
)
async def async_setup_entry(
hass: HomeAssistant,
@@ -79,10 +89,20 @@ async def async_setup_entry(
"""Set up the AirOS binary sensors from a config entry."""
coordinator = config_entry.runtime_data
async_add_entities(
AirOSBinarySensor(coordinator, description) for description in BINARY_SENSORS
entities: list[BinarySensorEntity] = []
entities.extend(
AirOSBinarySensor(coordinator, description)
for description in COMMON_BINARY_SENSORS
)
if coordinator.device_data["fw_major"] == 8:
entities.extend(
AirOSBinarySensor(coordinator, description)
for description in AIROS8_BINARY_SENSORS
)
async_add_entities(entities)
class AirOSBinarySensor(AirOSEntity, BinarySensorEntity):
"""Representation of a binary sensor."""


@@ -2,8 +2,6 @@
from __future__ import annotations
import logging
from airos.exceptions import AirOSException
from homeassistant.components.button import (
@@ -18,8 +16,6 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import DOMAIN, AirOSConfigEntry, AirOSDataUpdateCoordinator
from .entity import AirOSEntity
_LOGGER = logging.getLogger(__name__)
PARALLEL_UPDATES = 0
REBOOT_BUTTON = ButtonEntityDescription(


@@ -7,6 +7,8 @@ from collections.abc import Mapping
import logging
from typing import Any
from airos.airos6 import AirOS6
from airos.airos8 import AirOS8
from airos.discovery import airos_discover_devices
from airos.exceptions import (
AirOSConnectionAuthenticationError,
@@ -17,6 +19,7 @@ from airos.exceptions import (
AirOSKeyDataMissingError,
AirOSListenerError,
)
from airos.helpers import DetectDeviceData, async_get_firmware_data
import voluptuous as vol
from homeassistant.config_entries import (
@@ -53,10 +56,11 @@ from .const import (
MAC_ADDRESS,
SECTION_ADVANCED_SETTINGS,
)
from .coordinator import AirOS8
_LOGGER = logging.getLogger(__name__)
AirOSDeviceDetect = AirOS8 | AirOS6
# Discovery duration in seconds, airOS announces every 20 seconds
DISCOVER_INTERVAL: int = 30
@@ -92,7 +96,7 @@ class AirOSConfigFlow(ConfigFlow, domain=DOMAIN):
def __init__(self) -> None:
"""Initialize the config flow."""
super().__init__()
self.airos_device: AirOS8
self.airos_device: AirOSDeviceDetect
self.errors: dict[str, str] = {}
self.discovered_devices: dict[str, dict[str, Any]] = {}
self.discovery_abort_reason: str | None = None
@@ -135,16 +139,14 @@ class AirOSConfigFlow(ConfigFlow, domain=DOMAIN):
verify_ssl=config_data[SECTION_ADVANCED_SETTINGS][CONF_VERIFY_SSL],
)
airos_device = AirOS8(
host=config_data[CONF_HOST],
username=config_data[CONF_USERNAME],
password=config_data[CONF_PASSWORD],
session=session,
use_ssl=config_data[SECTION_ADVANCED_SETTINGS][CONF_SSL],
)
try:
await airos_device.login()
airos_data = await airos_device.status()
device_data: DetectDeviceData = await async_get_firmware_data(
host=config_data[CONF_HOST],
username=config_data[CONF_USERNAME],
password=config_data[CONF_PASSWORD],
session=session,
use_ssl=config_data[SECTION_ADVANCED_SETTINGS][CONF_SSL],
)
except (
AirOSConnectionSetupError,
@@ -159,14 +161,14 @@ class AirOSConfigFlow(ConfigFlow, domain=DOMAIN):
_LOGGER.exception("Unexpected exception during credential validation")
self.errors["base"] = "unknown"
else:
await self.async_set_unique_id(airos_data.derived.mac)
await self.async_set_unique_id(device_data["mac"])
if self.source in [SOURCE_REAUTH, SOURCE_RECONFIGURE]:
self._abort_if_unique_id_mismatch()
else:
self._abort_if_unique_id_configured()
return {"title": airos_data.host.hostname, "data": config_data}
return {"title": device_data["hostname"], "data": config_data}
return None


@@ -4,6 +4,7 @@ from __future__ import annotations
import logging
from airos.airos6 import AirOS6, AirOS6Data
from airos.airos8 import AirOS8, AirOS8Data
from airos.exceptions import (
AirOSConnectionAuthenticationError,
@@ -11,6 +12,7 @@ from airos.exceptions import (
AirOSDataMissingError,
AirOSDeviceConnectionError,
)
from airos.helpers import DetectDeviceData
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
@@ -21,19 +23,28 @@ from .const import DOMAIN, SCAN_INTERVAL
_LOGGER = logging.getLogger(__name__)
AirOSDeviceDetect = AirOS8 | AirOS6
AirOSDataDetect = AirOS8Data | AirOS6Data
type AirOSConfigEntry = ConfigEntry[AirOSDataUpdateCoordinator]
class AirOSDataUpdateCoordinator(DataUpdateCoordinator[AirOS8Data]):
class AirOSDataUpdateCoordinator(DataUpdateCoordinator[AirOSDataDetect]):
"""Class to manage fetching AirOS data from single endpoint."""
airos_device: AirOSDeviceDetect
config_entry: AirOSConfigEntry
def __init__(
self, hass: HomeAssistant, config_entry: AirOSConfigEntry, airos_device: AirOS8
self,
hass: HomeAssistant,
config_entry: AirOSConfigEntry,
device_data: DetectDeviceData,
airos_device: AirOSDeviceDetect,
) -> None:
"""Initialize the coordinator."""
self.airos_device = airos_device
self.device_data = device_data
super().__init__(
hass,
_LOGGER,
@@ -42,7 +53,7 @@ class AirOSDataUpdateCoordinator(DataUpdateCoordinator[AirOS8Data]):
update_interval=SCAN_INTERVAL,
)
async def _async_update_data(self) -> AirOS8Data:
async def _async_update_data(self) -> AirOSDataDetect:
"""Fetch data from AirOS."""
try:
await self.airos_device.login()
@@ -62,7 +73,7 @@ class AirOSDataUpdateCoordinator(DataUpdateCoordinator[AirOS8Data]):
translation_domain=DOMAIN,
translation_key="cannot_connect",
) from err
except (AirOSDataMissingError,) as err:
except AirOSDataMissingError as err:
_LOGGER.error("Expected data not returned by airOS device: %s", err)
raise UpdateFailed(
translation_domain=DOMAIN,


@@ -7,6 +7,6 @@
"documentation": "https://www.home-assistant.io/integrations/airos",
"integration_type": "device",
"iot_class": "local_polling",
"quality_scale": "silver",
"quality_scale": "platinum",
"requirements": ["airos==0.6.4"]
}


@@ -42,16 +42,20 @@ rules:
# Gold
devices: done
diagnostics: done
discovery-update-info: todo
discovery: todo
discovery-update-info: done
discovery:
status: exempt
comment: No way to detect device on the network
docs-data-update: done
docs-examples: todo
docs-examples: done
docs-known-limitations: done
docs-supported-devices: done
docs-supported-functions: done
docs-troubleshooting: done
docs-use-cases: done
dynamic-devices: todo
dynamic-devices:
status: exempt
comment: single airOS device per config entry; peer/remote endpoints are not modeled as child devices/entities at this time
entity-category: done
entity-device-class: done
entity-disabled-by-default: done
@@ -61,8 +65,10 @@ rules:
status: exempt
comment: no (custom) icons used or envisioned
reconfiguration-flow: done
repair-issues: todo
stale-devices: todo
repair-issues: done
stale-devices:
status: exempt
comment: single airOS device per config entry; peer/remote endpoints are not modeled as child devices/entities at this time
# Platinum
async-dependency: done


@@ -5,8 +5,14 @@ from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass
import logging
from typing import Generic, TypeVar
from airos.data import DerivedWirelessMode, DerivedWirelessRole, NetRole
from airos.data import (
AirOSDataBaseClass,
DerivedWirelessMode,
DerivedWirelessRole,
NetRole,
)
from homeassistant.components.sensor import (
SensorDeviceClass,
@@ -37,15 +43,19 @@ WIRELESS_ROLE_OPTIONS = [mode.value for mode in DerivedWirelessRole]
PARALLEL_UPDATES = 0
AirOSDataModel = TypeVar("AirOSDataModel", bound=AirOSDataBaseClass)
@dataclass(frozen=True, kw_only=True)
class AirOSSensorEntityDescription(SensorEntityDescription):
class AirOSSensorEntityDescription(SensorEntityDescription, Generic[AirOSDataModel]):
"""Describe an AirOS sensor."""
value_fn: Callable[[AirOS8Data], StateType]
value_fn: Callable[[AirOSDataModel], StateType]
SENSORS: tuple[AirOSSensorEntityDescription, ...] = (
AirOS8SensorEntityDescription = AirOSSensorEntityDescription[AirOS8Data]
COMMON_SENSORS: tuple[AirOSSensorEntityDescription, ...] = (
AirOSSensorEntityDescription(
key="host_cpuload",
translation_key="host_cpuload",
@@ -75,54 +85,6 @@ SENSORS: tuple[AirOSSensorEntityDescription, ...] = (
translation_key="wireless_essid",
value_fn=lambda data: data.wireless.essid,
),
AirOSSensorEntityDescription(
key="wireless_antenna_gain",
translation_key="wireless_antenna_gain",
native_unit_of_measurement=SIGNAL_STRENGTH_DECIBELS,
device_class=SensorDeviceClass.SIGNAL_STRENGTH,
state_class=SensorStateClass.MEASUREMENT,
value_fn=lambda data: data.wireless.antenna_gain,
),
AirOSSensorEntityDescription(
key="wireless_throughput_tx",
translation_key="wireless_throughput_tx",
native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
value_fn=lambda data: data.wireless.throughput.tx,
),
AirOSSensorEntityDescription(
key="wireless_throughput_rx",
translation_key="wireless_throughput_rx",
native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
value_fn=lambda data: data.wireless.throughput.rx,
),
AirOSSensorEntityDescription(
key="wireless_polling_dl_capacity",
translation_key="wireless_polling_dl_capacity",
native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
value_fn=lambda data: data.wireless.polling.dl_capacity,
),
AirOSSensorEntityDescription(
key="wireless_polling_ul_capacity",
translation_key="wireless_polling_ul_capacity",
native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
value_fn=lambda data: data.wireless.polling.ul_capacity,
),
AirOSSensorEntityDescription(
key="host_uptime",
translation_key="host_uptime",
@@ -158,6 +120,57 @@ SENSORS: tuple[AirOSSensorEntityDescription, ...] = (
options=WIRELESS_ROLE_OPTIONS,
entity_registry_enabled_default=False,
),
AirOSSensorEntityDescription(
key="wireless_antenna_gain",
translation_key="wireless_antenna_gain",
native_unit_of_measurement=SIGNAL_STRENGTH_DECIBELS,
device_class=SensorDeviceClass.SIGNAL_STRENGTH,
state_class=SensorStateClass.MEASUREMENT,
value_fn=lambda data: data.wireless.antenna_gain,
),
AirOSSensorEntityDescription(
key="wireless_polling_dl_capacity",
translation_key="wireless_polling_dl_capacity",
native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
value_fn=lambda data: data.wireless.polling.dl_capacity,
),
AirOSSensorEntityDescription(
key="wireless_polling_ul_capacity",
translation_key="wireless_polling_ul_capacity",
native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
value_fn=lambda data: data.wireless.polling.ul_capacity,
),
)
AIROS8_SENSORS: tuple[AirOS8SensorEntityDescription, ...] = (
AirOS8SensorEntityDescription(
key="wireless_throughput_tx",
translation_key="wireless_throughput_tx",
native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
value_fn=lambda data: data.wireless.throughput.tx,
),
AirOS8SensorEntityDescription(
key="wireless_throughput_rx",
translation_key="wireless_throughput_rx",
native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
value_fn=lambda data: data.wireless.throughput.rx,
),
)
@@ -169,7 +182,14 @@ async def async_setup_entry(
"""Set up the AirOS sensors from a config entry."""
coordinator = config_entry.runtime_data
async_add_entities(AirOSSensor(coordinator, description) for description in SENSORS)
async_add_entities(
AirOSSensor(coordinator, description) for description in COMMON_SENSORS
)
if coordinator.device_data["fw_major"] == 8:
async_add_entities(
AirOSSensor(coordinator, description) for description in AIROS8_SENSORS
)
class AirOSSensor(AirOSEntity, SensorEntity):


@@ -1,291 +0,0 @@
"""The Brands integration."""
from __future__ import annotations
from collections import deque
from http import HTTPStatus
import logging
from pathlib import Path
from random import SystemRandom
import time
from typing import Any, Final
from aiohttp import ClientError, hdrs, web
import voluptuous as vol
from homeassistant.components import websocket_api
from homeassistant.components.http import KEY_AUTHENTICATED, HomeAssistantView
from homeassistant.core import HomeAssistant, callback, valid_domain
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.event import async_track_time_interval
from homeassistant.helpers.typing import ConfigType
from homeassistant.loader import async_get_custom_components
from .const import (
ALLOWED_IMAGES,
BRANDS_CDN_URL,
CACHE_TTL,
CATEGORY_RE,
CDN_TIMEOUT,
DOMAIN,
HARDWARE_IMAGE_RE,
IMAGE_FALLBACKS,
PLACEHOLDER,
TOKEN_CHANGE_INTERVAL,
)
_LOGGER = logging.getLogger(__name__)
_RND: Final = SystemRandom()
CONFIG_SCHEMA = cv.empty_config_schema(DOMAIN)
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the Brands integration."""
access_tokens: deque[str] = deque([], 2)
access_tokens.append(hex(_RND.getrandbits(256))[2:])
hass.data[DOMAIN] = access_tokens
@callback
def _rotate_token(_now: Any) -> None:
"""Rotate the access token."""
access_tokens.append(hex(_RND.getrandbits(256))[2:])
async_track_time_interval(hass, _rotate_token, TOKEN_CHANGE_INTERVAL)
hass.http.register_view(BrandsIntegrationView(hass))
hass.http.register_view(BrandsHardwareView(hass))
websocket_api.async_register_command(hass, ws_access_token)
return True
@callback
@websocket_api.websocket_command({vol.Required("type"): "brands/access_token"})
def ws_access_token(
hass: HomeAssistant,
connection: websocket_api.ActiveConnection,
msg: dict[str, Any],
) -> None:
"""Return the current brands access token."""
access_tokens: deque[str] = hass.data[DOMAIN]
connection.send_result(msg["id"], {"token": access_tokens[-1]})
def _read_cached_file_with_marker(
cache_path: Path,
) -> tuple[bytes | None, float] | None:
"""Read a cached file, distinguishing between content and 404 markers.
Returns (content, mtime) where content is None for 404 markers (empty files).
Returns None if the file does not exist at all.
"""
if not cache_path.is_file():
return None
mtime = cache_path.stat().st_mtime
data = cache_path.read_bytes()
if not data:
# Empty file is a 404 marker
return (None, mtime)
return (data, mtime)
def _write_cache_file(cache_path: Path, data: bytes) -> None:
"""Write data to cache file, creating directories as needed."""
cache_path.parent.mkdir(parents=True, exist_ok=True)
cache_path.write_bytes(data)
def _read_brand_file(brand_dir: Path, image: str) -> bytes | None:
"""Read a brand image, trying fallbacks in a single I/O pass."""
for candidate in (image, *IMAGE_FALLBACKS.get(image, ())):
file_path = brand_dir / candidate
if file_path.is_file():
return file_path.read_bytes()
return None
class _BrandsBaseView(HomeAssistantView):
"""Base view for serving brand images."""
requires_auth = False
def __init__(self, hass: HomeAssistant) -> None:
"""Initialize the view."""
self._hass = hass
self._cache_dir = Path(hass.config.cache_path(DOMAIN))
def _authenticate(self, request: web.Request) -> None:
"""Authenticate the request using Bearer token or query token."""
access_tokens: deque[str] = self._hass.data[DOMAIN]
authenticated = (
request[KEY_AUTHENTICATED] or request.query.get("token") in access_tokens
)
if not authenticated:
if hdrs.AUTHORIZATION in request.headers:
raise web.HTTPUnauthorized
raise web.HTTPForbidden
async def _serve_from_custom_integration(
self,
domain: str,
image: str,
) -> web.Response | None:
"""Try to serve a brand image from a custom integration."""
custom_components = await async_get_custom_components(self._hass)
if (integration := custom_components.get(domain)) is None:
return None
if not integration.has_branding:
return None
brand_dir = Path(integration.file_path) / "brand"
data = await self._hass.async_add_executor_job(
_read_brand_file, brand_dir, image
)
if data is not None:
return self._build_response(data)
return None
async def _serve_from_cache_or_cdn(
self,
cdn_path: str,
cache_subpath: str,
*,
fallback_placeholder: bool = True,
) -> web.Response:
"""Serve from disk cache, fetching from CDN if needed."""
cache_path = self._cache_dir / cache_subpath
now = time.time()
# Try disk cache
result = await self._hass.async_add_executor_job(
_read_cached_file_with_marker, cache_path
)
if result is not None:
data, mtime = result
# Schedule background refresh if stale
if now - mtime > CACHE_TTL:
self._hass.async_create_background_task(
self._fetch_and_cache(cdn_path, cache_path),
f"brands_refresh_{cache_subpath}",
)
else:
# Cache miss - fetch from CDN
data = await self._fetch_and_cache(cdn_path, cache_path)
if data is None:
if fallback_placeholder:
return await self._serve_placeholder(
image=cache_subpath.rsplit("/", 1)[-1]
)
return web.Response(status=HTTPStatus.NOT_FOUND)
return self._build_response(data)
async def _fetch_and_cache(
self,
cdn_path: str,
cache_path: Path,
) -> bytes | None:
"""Fetch from CDN and write to cache. Returns data or None on 404."""
url = f"{BRANDS_CDN_URL}/{cdn_path}"
session = async_get_clientsession(self._hass)
try:
resp = await session.get(url, timeout=CDN_TIMEOUT)
except (ClientError, TimeoutError):
_LOGGER.debug("Failed to fetch brand from CDN: %s", cdn_path)
return None
if resp.status == HTTPStatus.NOT_FOUND:
# Cache the 404 as an empty file
await self._hass.async_add_executor_job(_write_cache_file, cache_path, b"")
return None
if resp.status != HTTPStatus.OK:
_LOGGER.debug("Unexpected CDN response %s for %s", resp.status, cdn_path)
return None
data = await resp.read()
await self._hass.async_add_executor_job(_write_cache_file, cache_path, data)
return data
async def _serve_placeholder(self, image: str) -> web.Response:
"""Serve a placeholder image."""
return await self._serve_from_cache_or_cdn(
cdn_path=f"_/{PLACEHOLDER}/{image}",
cache_subpath=f"integrations/{PLACEHOLDER}/{image}",
fallback_placeholder=False,
)
@staticmethod
def _build_response(data: bytes) -> web.Response:
"""Build a response with proper headers."""
return web.Response(
body=data,
content_type="image/png",
)
class BrandsIntegrationView(_BrandsBaseView):
"""Serve integration brand images."""
name = "api:brands:integration"
url = "/api/brands/integration/{domain}/{image}"
async def get(
self,
request: web.Request,
domain: str,
image: str,
) -> web.Response:
"""Handle GET request for an integration brand image."""
self._authenticate(request)
if not valid_domain(domain) or image not in ALLOWED_IMAGES:
return web.Response(status=HTTPStatus.NOT_FOUND)
use_placeholder = request.query.get("placeholder") != "no"
# 1. Try custom integration local files
if (
response := await self._serve_from_custom_integration(domain, image)
) is not None:
return response
# 2. Try cache / CDN (always use direct path for proper 404 caching)
return await self._serve_from_cache_or_cdn(
cdn_path=f"brands/{domain}/{image}",
cache_subpath=f"integrations/{domain}/{image}",
fallback_placeholder=use_placeholder,
)
class BrandsHardwareView(_BrandsBaseView):
"""Serve hardware brand images."""
name = "api:brands:hardware"
url = "/api/brands/hardware/{category}/{image:.+}"
async def get(
self,
request: web.Request,
category: str,
image: str,
) -> web.Response:
"""Handle GET request for a hardware brand image."""
self._authenticate(request)
if not CATEGORY_RE.match(category):
return web.Response(status=HTTPStatus.NOT_FOUND)
# Hardware images have dynamic names like "manufacturer_model.png"
# Validate it ends with .png and contains only safe characters
if not HARDWARE_IMAGE_RE.match(image):
return web.Response(status=HTTPStatus.NOT_FOUND)
cache_subpath = f"hardware/{category}/{image}"
return await self._serve_from_cache_or_cdn(
cdn_path=cache_subpath,
cache_subpath=cache_subpath,
)

View File

@@ -1,57 +0,0 @@
"""Constants for the Brands integration."""
from __future__ import annotations
from datetime import timedelta
import re
from typing import Final
from aiohttp import ClientTimeout
DOMAIN: Final = "brands"
# CDN
BRANDS_CDN_URL: Final = "https://brands.home-assistant.io"
CDN_TIMEOUT: Final = ClientTimeout(total=10)
PLACEHOLDER: Final = "_placeholder"
# Caching
CACHE_TTL: Final = 30 * 24 * 60 * 60 # 30 days in seconds
# Access token
TOKEN_CHANGE_INTERVAL: Final = timedelta(minutes=30)
# Validation
CATEGORY_RE: Final = re.compile(r"^[a-z0-9_]+$")
HARDWARE_IMAGE_RE: Final = re.compile(r"^[a-z0-9_-]+\.png$")
# Images and fallback chains
ALLOWED_IMAGES: Final = frozenset(
{
"icon.png",
"logo.png",
"icon@2x.png",
"logo@2x.png",
"dark_icon.png",
"dark_logo.png",
"dark_icon@2x.png",
"dark_logo@2x.png",
}
)
# Fallback chains for image resolution, mirroring the brands CDN build logic.
# When a requested image is not found, we try each fallback in order.
IMAGE_FALLBACKS: Final[dict[str, list[str]]] = {
"logo.png": ["icon.png"],
"icon@2x.png": ["icon.png"],
"logo@2x.png": ["logo.png", "icon.png"],
"dark_icon.png": ["icon.png"],
"dark_logo.png": ["dark_icon.png", "logo.png", "icon.png"],
"dark_icon@2x.png": ["icon@2x.png", "icon.png"],
"dark_logo@2x.png": [
"dark_icon@2x.png",
"logo@2x.png",
"logo.png",
"icon.png",
],
}

View File

@@ -1,10 +0,0 @@
{
"domain": "brands",
"name": "Brands",
"codeowners": ["@home-assistant/core"],
"config_flow": false,
"dependencies": ["http", "websocket_api"],
"documentation": "https://www.home-assistant.io/integrations/brands",
"integration_type": "system",
"quality_scale": "internal"
}

View File

@@ -38,7 +38,7 @@ async def _root_payload(
media_class=MediaClass.DIRECTORY,
media_content_id="",
media_content_type="presets",
thumbnail="/api/brands/integration/cambridge_audio/logo.png",
thumbnail="https://brands.home-assistant.io/_/cambridge_audio/logo.png",
can_play=False,
can_expand=True,
)

View File

@@ -14,6 +14,7 @@ PLATFORMS = [
Platform.CLIMATE,
Platform.NUMBER,
Platform.SELECT,
Platform.SENSOR,
Platform.WATER_HEATER,
]

View File

@@ -158,6 +158,119 @@
"winter": "mdi:snowflake"
}
}
},
"sensor": {
"alarm_code": {
"default": "mdi:alert-circle",
"state": {
"no_alarm": "mdi:check-circle"
}
},
"battery_level": {
"default": "mdi:battery"
},
"boiler_temperature": {
"default": "mdi:thermometer"
},
"calculated_heating_temperature": {
"default": "mdi:thermometer"
},
"calculated_target_temperature": {
"default": "mdi:thermometer"
},
"charging_power": {
"default": "mdi:flash"
},
"circuit_target_temperature": {
"default": "mdi:thermometer"
},
"co2_percent": {
"default": "mdi:molecule-co2"
},
"collector_power": {
"default": "mdi:solar-power"
},
"collector_temperature": {
"default": "mdi:thermometer"
},
"dhw_measured_temperature": {
"default": "mdi:thermometer"
},
"energy_consumption": {
"default": "mdi:lightning-bolt"
},
"energy_smart_grid_yesterday": {
"default": "mdi:lightning-bolt"
},
"energy_today": {
"default": "mdi:lightning-bolt"
},
"energy_total": {
"default": "mdi:lightning-bolt"
},
"energy_yesterday": {
"default": "mdi:lightning-bolt"
},
"fuel_level": {
"default": "mdi:gauge"
},
"humidity": {
"default": "mdi:water-percent"
},
"mixer_temperature": {
"default": "mdi:thermometer"
},
"outdoor_temperature": {
"default": "mdi:thermometer"
},
"pk1_function": {
"default": "mdi:cog",
"state": {
"cooling": "mdi:snowflake-thermometer",
"off": "mdi:cog-off",
"summer": "mdi:weather-sunny",
"winter": "mdi:snowflake"
}
},
"pm10_level": {
"default": "mdi:air-filter",
"state": {
"exceeded": "mdi:alert",
"no_sensor": "mdi:cancel",
"normal": "mdi:air-filter",
"warning": "mdi:alert-circle-outline"
}
},
"pm25_level": {
"default": "mdi:air-filter",
"state": {
"exceeded": "mdi:alert",
"no_sensor": "mdi:cancel",
"normal": "mdi:air-filter",
"warning": "mdi:alert-circle-outline"
}
},
"return_circuit_temperature": {
"default": "mdi:thermometer"
},
"tank_temperature_t2": {
"default": "mdi:thermometer"
},
"tank_temperature_t3": {
"default": "mdi:thermometer"
},
"tank_temperature_t4": {
"default": "mdi:thermometer"
},
"target_heating_temperature": {
"default": "mdi:thermometer"
},
"ventilation_alarm": {
"default": "mdi:alert",
"state": {
"no_alarm": "mdi:check-circle"
}
}
}
}
}

File diff suppressed because it is too large

View File

@@ -203,6 +203,219 @@
"winter": "Winter"
}
}
},
"sensor": {
"actual_buffer_temp": {
"name": "Actual buffer temperature"
},
"actual_dhw_temp": {
"name": "Actual DHW temperature"
},
"actual_hc_temperature_zone": {
"name": "Actual heating circuit {zone} temperature"
},
"actual_upper_source_temp": {
"name": "Actual upper source temperature"
},
"alarm_code": {
"name": "Alarm code",
"state": {
"battery_fault": "Battery fault",
"damaged_outdoor_temp": "Damaged outdoor temperature sensor",
"damaged_return_temp": "Damaged return temperature sensor",
"discharged_battery": "Discharged battery",
"internal_af": "Internal fault",
"low_battery_level": "Low battery level",
"no_alarm": "No alarm",
"no_battery": "No battery",
"no_power": "No power",
"no_pump": "No pump",
"pump_fault": "Pump fault"
}
},
"battery_level": {
"name": "Battery level"
},
"boiler_temperature": {
"name": "Boiler temperature"
},
"buffer_return_temperature": {
"name": "Buffer return temperature"
},
"buffer_set_temperature": {
"name": "Buffer set temperature"
},
"calculated_buffer_temp": {
"name": "Calculated buffer temperature"
},
"calculated_dhw_temp": {
"name": "Calculated DHW temperature"
},
"calculated_heating_temperature": {
"name": "Calculated heating temperature"
},
"calculated_target_temperature": {
"name": "Calculated target temperature"
},
"calculated_upper_source_temp": {
"name": "Calculated upper source temperature"
},
"charging_power": {
"name": "Charging power"
},
"circuit_target_temperature": {
"name": "Circuit target temperature"
},
"co2_percent": {
"name": "CO2 percent"
},
"collector_power": {
"name": "Collector power"
},
"collector_temperature": {
"name": "Collector temperature"
},
"dhw_measured_temperature": {
"name": "DHW measured temperature"
},
"dhw_temperature": {
"name": "DHW temperature"
},
"energy_consumption": {
"name": "Energy consumption"
},
"energy_smart_grid_yesterday": {
"name": "Energy smart grid yesterday"
},
"energy_today": {
"name": "Energy today"
},
"energy_total": {
"name": "Energy total"
},
"energy_yesterday": {
"name": "Energy yesterday"
},
"fuel_level": {
"name": "Fuel level"
},
"heating_target_temperature_zone": {
"name": "Heating circuit {zone} target temperature"
},
"lower_source_temperature": {
"name": "Lower source temperature"
},
"mixer_temperature": {
"name": "Mixer temperature"
},
"mixer_temperature_zone": {
"name": "Mixer {zone} temperature"
},
"outdoor_temperature": {
"name": "Outdoor temperature"
},
"pk1_function": {
"name": "PK1 function",
"state": {
"cooling": "Cooling",
"holiday": "Holiday",
"nano_nr_1": "Nano 1",
"nano_nr_2": "Nano 2",
"nano_nr_3": "Nano 3",
"nano_nr_4": "Nano 4",
"nano_nr_5": "Nano 5",
"off": "Off",
"on": "On",
"summer": "Summer",
"winter": "Winter"
}
},
"pm10_level": {
"name": "PM10 level",
"state": {
"exceeded": "Exceeded",
"no_sensor": "No sensor",
"normal": "Normal",
"warning": "Warning"
}
},
"pm1_level": {
"name": "PM1 level"
},
"pm25_level": {
"name": "PM2.5 level",
"state": {
"exceeded": "Exceeded",
"no_sensor": "No sensor",
"normal": "Normal",
"warning": "Warning"
}
},
"pm4_level": {
"name": "PM4 level"
},
"preset_mode": {
"name": "Preset mode"
},
"protection_temperature": {
"name": "Protection temperature"
},
"pump_status": {
"name": "Pump status",
"state": {
"off": "Off",
"on": "On"
}
},
"return_circuit_temperature": {
"name": "Return circuit temperature"
},
"set_target_temperature": {
"name": "Set target temperature"
},
"tank_temperature_t2": {
"name": "Tank T2 bottom temperature"
},
"tank_temperature_t3": {
"name": "Tank T3 top temperature"
},
"tank_temperature_t4": {
"name": "Tank T4 temperature"
},
"target_heating_temperature": {
"name": "Target heating temperature"
},
"target_temperature": {
"name": "Target temperature"
},
"temperature_alert": {
"name": "Temperature alert",
"state": {
"alert": "Alert",
"no_alert": "No alert"
}
},
"upper_source_temperature": {
"name": "Upper source temperature"
},
"ventilation_alarm": {
"name": "Ventilation alarm",
"state": {
"ahu_alarm": "AHU alarm",
"bot_alarm": "BOT alarm",
"damaged_exhaust_sensor": "Damaged exhaust sensor",
"damaged_preheater_sensor": "Damaged preheater sensor",
"damaged_supply_and_exhaust_sensors": "Damaged supply and exhaust sensors",
"damaged_supply_sensor": "Damaged supply sensor",
"no_alarm": "No alarm"
}
},
"ventilation_gear": {
"name": "Ventilation gear"
},
"weather_curve": {
"name": "Weather curve"
}
}
}
}

View File

@@ -2,10 +2,17 @@
from datetime import timedelta
from pyecobee import ECOBEE_API_KEY, ECOBEE_REFRESH_TOKEN, Ecobee, ExpiredTokenError
from pyecobee import (
ECOBEE_API_KEY,
ECOBEE_PASSWORD,
ECOBEE_REFRESH_TOKEN,
ECOBEE_USERNAME,
Ecobee,
ExpiredTokenError,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_API_KEY
from homeassistant.const import CONF_API_KEY, CONF_PASSWORD, CONF_USERNAME
from homeassistant.core import HomeAssistant
from homeassistant.util import Throttle
@@ -18,10 +25,19 @@ type EcobeeConfigEntry = ConfigEntry[EcobeeData]
async def async_setup_entry(hass: HomeAssistant, entry: EcobeeConfigEntry) -> bool:
"""Set up ecobee via a config entry."""
api_key = entry.data[CONF_API_KEY]
api_key = entry.data.get(CONF_API_KEY)
username = entry.data.get(CONF_USERNAME)
password = entry.data.get(CONF_PASSWORD)
refresh_token = entry.data[CONF_REFRESH_TOKEN]
runtime_data = EcobeeData(hass, entry, api_key=api_key, refresh_token=refresh_token)
runtime_data = EcobeeData(
hass,
entry,
api_key=api_key,
username=username,
password=password,
refresh_token=refresh_token,
)
if not await runtime_data.refresh():
return False
@@ -46,14 +62,32 @@ class EcobeeData:
"""
def __init__(
self, hass: HomeAssistant, entry: ConfigEntry, api_key: str, refresh_token: str
self,
hass: HomeAssistant,
entry: ConfigEntry,
api_key: str | None = None,
username: str | None = None,
password: str | None = None,
refresh_token: str | None = None,
) -> None:
"""Initialize the Ecobee data object."""
self._hass = hass
self.entry = entry
self.ecobee = Ecobee(
config={ECOBEE_API_KEY: api_key, ECOBEE_REFRESH_TOKEN: refresh_token}
)
if api_key:
self.ecobee = Ecobee(
config={ECOBEE_API_KEY: api_key, ECOBEE_REFRESH_TOKEN: refresh_token}
)
elif username and password:
self.ecobee = Ecobee(
config={
ECOBEE_USERNAME: username,
ECOBEE_PASSWORD: password,
ECOBEE_REFRESH_TOKEN: refresh_token,
}
)
else:
raise ValueError("No ecobee credentials provided")
@Throttle(MIN_TIME_BETWEEN_UPDATES)
async def update(self):
@@ -69,12 +103,23 @@ class EcobeeData:
"""Refresh ecobee tokens and update config entry."""
_LOGGER.debug("Refreshing ecobee tokens and updating config entry")
if await self._hass.async_add_executor_job(self.ecobee.refresh_tokens):
self._hass.config_entries.async_update_entry(
self.entry,
data={
data = {}
if self.ecobee.config.get(ECOBEE_API_KEY):
data = {
CONF_API_KEY: self.ecobee.config[ECOBEE_API_KEY],
CONF_REFRESH_TOKEN: self.ecobee.config[ECOBEE_REFRESH_TOKEN],
},
}
elif self.ecobee.config.get(ECOBEE_USERNAME) and self.ecobee.config.get(
ECOBEE_PASSWORD
):
data = {
CONF_USERNAME: self.ecobee.config[ECOBEE_USERNAME],
CONF_PASSWORD: self.ecobee.config[ECOBEE_PASSWORD],
CONF_REFRESH_TOKEN: self.ecobee.config[ECOBEE_REFRESH_TOKEN],
}
self._hass.config_entries.async_update_entry(
self.entry,
data=data,
)
return True
_LOGGER.error("Error refreshing ecobee tokens")

View File

@@ -2,15 +2,21 @@
from typing import Any
from pyecobee import ECOBEE_API_KEY, Ecobee
from pyecobee import ECOBEE_API_KEY, ECOBEE_PASSWORD, ECOBEE_USERNAME, Ecobee
import voluptuous as vol
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_API_KEY
from homeassistant.const import CONF_API_KEY, CONF_PASSWORD, CONF_USERNAME
from .const import CONF_REFRESH_TOKEN, DOMAIN
_USER_SCHEMA = vol.Schema({vol.Required(CONF_API_KEY): str})
_USER_SCHEMA = vol.Schema(
{
vol.Optional(CONF_API_KEY): str,
vol.Optional(CONF_USERNAME): str,
vol.Optional(CONF_PASSWORD): str,
}
)
class EcobeeFlowHandler(ConfigFlow, domain=DOMAIN):
@@ -27,13 +33,34 @@ class EcobeeFlowHandler(ConfigFlow, domain=DOMAIN):
errors = {}
if user_input is not None:
# Use the user-supplied API key to attempt to obtain a PIN from ecobee.
self._ecobee = Ecobee(config={ECOBEE_API_KEY: user_input[CONF_API_KEY]})
api_key = user_input.get(CONF_API_KEY)
username = user_input.get(CONF_USERNAME)
password = user_input.get(CONF_PASSWORD)
if await self.hass.async_add_executor_job(self._ecobee.request_pin):
# We have a PIN; move to the next step of the flow.
return await self.async_step_authorize()
errors["base"] = "pin_request_failed"
if api_key and not (username or password):
# Use the user-supplied API key to attempt to obtain a PIN from ecobee.
self._ecobee = Ecobee(config={ECOBEE_API_KEY: api_key})
if await self.hass.async_add_executor_job(self._ecobee.request_pin):
# We have a PIN; move to the next step of the flow.
return await self.async_step_authorize()
errors["base"] = "pin_request_failed"
elif username and password and not api_key:
self._ecobee = Ecobee(
config={
ECOBEE_USERNAME: username,
ECOBEE_PASSWORD: password,
}
)
if await self.hass.async_add_executor_job(self._ecobee.refresh_tokens):
config = {
CONF_USERNAME: username,
CONF_PASSWORD: password,
CONF_REFRESH_TOKEN: self._ecobee.refresh_token,
}
return self.async_create_entry(title=DOMAIN, data=config)
errors["base"] = "login_failed"
else:
errors["base"] = "invalid_auth"
return self.async_show_form(
step_id="user",

View File

@@ -4,6 +4,8 @@
"single_instance_allowed": "[%key:common::config_flow::abort::single_instance_allowed%]"
},
"error": {
"invalid_auth": "[%key:common::config_flow::error::invalid_auth%]",
"login_failed": "Error authenticating with ecobee; please verify your credentials are correct.",
"pin_request_failed": "Error requesting PIN from ecobee; please verify API key is correct.",
"token_request_failed": "Error requesting tokens from ecobee; please try again."
},

View File

@@ -304,7 +304,7 @@ def base_owntone_library() -> BrowseMedia:
can_play=False,
can_expand=True,
children=children,
thumbnail="/api/brands/integration/forked_daapd/logo.png",
thumbnail="https://brands.home-assistant.io/_/forked_daapd/logo.png",
)
@@ -321,7 +321,7 @@ def library(other: Sequence[BrowseMedia] | None) -> BrowseMedia:
media_content_type=MediaType.APP,
can_play=False,
can_expand=True,
thumbnail="/api/brands/integration/forked_daapd/logo.png",
thumbnail="https://brands.home-assistant.io/_/forked_daapd/logo.png",
)
]
if other:

View File

@@ -207,7 +207,7 @@ class SupervisorOSUpdateEntity(HassioOSEntity, UpdateEntity):
@property
def entity_picture(self) -> str | None:
"""Return the icon of the entity."""
return "/api/brands/integration/homeassistant/icon.png?placeholder=no"
return "https://brands.home-assistant.io/homeassistant/icon.png"
@property
def release_url(self) -> str | None:
@@ -258,7 +258,7 @@ class SupervisorSupervisorUpdateEntity(HassioSupervisorEntity, UpdateEntity):
@property
def entity_picture(self) -> str | None:
"""Return the icon of the entity."""
return "/api/brands/integration/hassio/icon.png?placeholder=no"
return "https://brands.home-assistant.io/hassio/icon.png"
async def async_install(
self, version: str | None, backup: bool, **kwargs: Any
@@ -296,7 +296,7 @@ class SupervisorCoreUpdateEntity(HassioCoreEntity, UpdateEntity):
@property
def entity_picture(self) -> str | None:
"""Return the icon of the entity."""
return "/api/brands/integration/homeassistant/icon.png?placeholder=no"
return "https://brands.home-assistant.io/homeassistant/icon.png"
@property
def release_url(self) -> str | None:

View File

@@ -55,7 +55,7 @@ async def async_setup_entry(
entities: list[HomematicipGenericEntity] = []
entities.extend(
HomematicipLightHS(hap, d, ch.index)
HomematicipColorLight(hap, d, ch.index)
for d in hap.home.devices
for ch in d.functionalChannels
if ch.functionalChannelType == FunctionalChannelType.UNIVERSAL_LIGHT_CHANNEL
@@ -136,16 +136,32 @@ class HomematicipLight(HomematicipGenericEntity, LightEntity):
await self._device.turn_off_async()
class HomematicipLightHS(HomematicipGenericEntity, LightEntity):
"""Representation of the HomematicIP light with HS color mode."""
_attr_color_mode = ColorMode.HS
_attr_supported_color_modes = {ColorMode.HS}
class HomematicipColorLight(HomematicipGenericEntity, LightEntity):
"""Representation of the HomematicIP color light."""
def __init__(self, hap: HomematicipHAP, device: Device, channel_index: int) -> None:
"""Initialize the light entity."""
super().__init__(hap, device, channel=channel_index, is_multi_channel=True)
def _supports_color(self) -> bool:
"""Return true if device supports hue/saturation color control."""
channel = self.get_channel_or_raise()
return channel.hue is not None and channel.saturationLevel is not None
@property
def color_mode(self) -> ColorMode:
"""Return the color mode of the light."""
if self._supports_color():
return ColorMode.HS
return ColorMode.BRIGHTNESS
@property
def supported_color_modes(self) -> set[ColorMode]:
"""Return the supported color modes."""
if self._supports_color():
return {ColorMode.HS}
return {ColorMode.BRIGHTNESS}
@property
def is_on(self) -> bool:
"""Return true if light is on."""
@@ -172,18 +188,26 @@ class HomematicipLightHS(HomematicipGenericEntity, LightEntity):
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn the light on."""
channel = self.get_channel_or_raise()
hs_color = kwargs.get(ATTR_HS_COLOR, (0.0, 0.0))
hue = hs_color[0] % 360.0
saturation = hs_color[1] / 100.0
dim_level = round(kwargs.get(ATTR_BRIGHTNESS, 255) / 255.0, 2)
if ATTR_HS_COLOR not in kwargs:
hue = channel.hue
saturation = channel.saturationLevel
if ATTR_BRIGHTNESS not in kwargs:
# If no brightness is set, use the current brightness
dim_level = channel.dimLevel or 1.0
# Use dim-only method for monochrome mode (hue/saturation not supported)
if not self._supports_color():
await channel.set_dim_level_async(dim_level=dim_level)
return
# Full color mode with hue/saturation
if ATTR_HS_COLOR in kwargs:
hs_color = kwargs[ATTR_HS_COLOR]
hue = hs_color[0] % 360.0
saturation = hs_color[1] / 100.0
else:
hue = channel.hue
saturation = channel.saturationLevel
await channel.set_hue_saturation_dim_level_async(
hue=hue, saturation_level=saturation, dim_level=dim_level
)

View File

@@ -219,7 +219,7 @@ async def library_payload(hass):
)
for child in library_info.children:
child.thumbnail = "/api/brands/integration/kodi/logo.png"
child.thumbnail = "https://brands.home-assistant.io/_/kodi/logo.png"
with contextlib.suppress(BrowseError):
item = await media_source.async_browse_media(

View File

@@ -3,6 +3,8 @@
from __future__ import annotations
import asyncio
from datetime import datetime
import logging
from pyliebherrhomeapi import LiebherrClient
from pyliebherrhomeapi.exceptions import (
@@ -14,8 +16,13 @@ from homeassistant.const import CONF_API_KEY, Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.dispatcher import async_dispatcher_send
from homeassistant.helpers.event import async_track_time_interval
from .coordinator import LiebherrConfigEntry, LiebherrCoordinator
from .const import DEVICE_SCAN_INTERVAL, DOMAIN
from .coordinator import LiebherrConfigEntry, LiebherrCoordinator, LiebherrData
_LOGGER = logging.getLogger(__name__)
PLATFORMS: list[Platform] = [
Platform.NUMBER,
@@ -42,7 +49,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: LiebherrConfigEntry) ->
raise ConfigEntryNotReady(f"Failed to connect to Liebherr API: {err}") from err
# Create a coordinator for each device (may be empty if no devices)
coordinators: dict[str, LiebherrCoordinator] = {}
data = LiebherrData(client=client)
for device in devices:
coordinator = LiebherrCoordinator(
hass=hass,
@@ -50,20 +57,61 @@ async def async_setup_entry(hass: HomeAssistant, entry: LiebherrConfigEntry) ->
client=client,
device_id=device.device_id,
)
coordinators[device.device_id] = coordinator
data.coordinators[device.device_id] = coordinator
await asyncio.gather(
*(
coordinator.async_config_entry_first_refresh()
for coordinator in coordinators.values()
for coordinator in data.coordinators.values()
)
)
# Store coordinators in runtime data
entry.runtime_data = coordinators
# Store runtime data
entry.runtime_data = data
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
# Schedule periodic scan for new devices
async def _async_scan_for_new_devices(_now: datetime) -> None:
"""Scan for new devices added to the account."""
try:
devices = await client.get_devices()
except (LiebherrAuthenticationError, LiebherrConnectionError):
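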
_LOGGER.debug("Failed to scan for new devices")
return
except Exception:
_LOGGER.exception("Unexpected error scanning for new devices")
return
new_coordinators: list[LiebherrCoordinator] = []
for device in devices:
if device.device_id not in data.coordinators:
coordinator = LiebherrCoordinator(
hass=hass,
config_entry=entry,
client=client,
device_id=device.device_id,
)
await coordinator.async_refresh()
if not coordinator.last_update_success:
_LOGGER.debug("Failed to set up new device %s", device.device_id)
continue
data.coordinators[device.device_id] = coordinator
new_coordinators.append(coordinator)
if new_coordinators:
async_dispatcher_send(
hass,
f"{DOMAIN}_new_device_{entry.entry_id}",
new_coordinators,
)
entry.async_on_unload(
async_track_time_interval(
hass, _async_scan_for_new_devices, DEVICE_SCAN_INTERVAL
)
)
return True
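The dynamic-device pattern above boils down to: poll the account, diff the returned device ids against the known coordinators, and fan the new ones out via the dispatcher so each platform can add entities. A minimal sketch with Home Assistant stripped out — plain callables stand in for `async_dispatcher_send`, and `scan_for_new_devices` is an illustrative name:

```python
# New-device detection from the Liebherr periodic scan: diff polled ids
# against known coordinators and notify listeners about the new ones.
def scan_for_new_devices(
    known: dict[str, object],
    polled_ids: list[str],
    listeners: list,
) -> list[str]:
    """Register unseen device ids and fan the new ids out to listeners."""
    new_ids = [dev_id for dev_id in polled_ids if dev_id not in known]
    for dev_id in new_ids:
        known[dev_id] = object()  # placeholder for a LiebherrCoordinator
    if new_ids:  # only dispatch when something actually changed
        for listener in listeners:
            listener(new_ids)
    return new_ids
```

Repeated scans with an unchanged device list dispatch nothing, which is why the real code only sends the `{DOMAIN}_new_device_{entry.entry_id}` signal when `new_coordinators` is non-empty.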

View File

@@ -6,4 +6,6 @@ from typing import Final
DOMAIN: Final = "liebherr"
MANUFACTURER: Final = "Liebherr"
SCAN_INTERVAL: Final = timedelta(seconds=60)
DEVICE_SCAN_INTERVAL: Final = timedelta(minutes=5)
REFRESH_DELAY: Final = timedelta(seconds=5)

View File

@@ -2,7 +2,7 @@
from __future__ import annotations
from datetime import timedelta
from dataclasses import dataclass, field
import logging
from pyliebherrhomeapi import (
@@ -18,13 +18,20 @@ from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import DOMAIN
type LiebherrConfigEntry = ConfigEntry[dict[str, LiebherrCoordinator]]
from .const import DOMAIN, SCAN_INTERVAL
_LOGGER = logging.getLogger(__name__)
SCAN_INTERVAL = timedelta(seconds=60)
@dataclass
class LiebherrData:
"""Runtime data for the Liebherr integration."""
client: LiebherrClient
coordinators: dict[str, LiebherrCoordinator] = field(default_factory=dict)
type LiebherrConfigEntry = ConfigEntry[LiebherrData]
class LiebherrCoordinator(DataUpdateCoordinator[DeviceState]):

View File

@@ -29,6 +29,6 @@ async def async_get_config_entry_diagnostics(
},
"data": asdict(coordinator.data),
}
for device_id, coordinator in entry.runtime_data.items()
for device_id, coordinator in entry.runtime_data.coordinators.items()
},
}

View File

@@ -16,9 +16,11 @@ from homeassistant.components.number import (
NumberEntityDescription,
)
from homeassistant.const import UnitOfTemperature
from homeassistant.core import HomeAssistant
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.dispatcher import async_dispatcher_connect
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN
from .coordinator import LiebherrConfigEntry, LiebherrCoordinator
from .entity import LiebherrZoneEntity
@@ -53,22 +55,41 @@ NUMBER_TYPES: tuple[LiebherrNumberEntityDescription, ...] = (
)
def _create_number_entities(
coordinators: list[LiebherrCoordinator],
) -> list[LiebherrNumber]:
"""Create number entities for the given coordinators."""
return [
LiebherrNumber(
coordinator=coordinator,
zone_id=temp_control.zone_id,
description=description,
)
for coordinator in coordinators
for temp_control in coordinator.data.get_temperature_controls().values()
for description in NUMBER_TYPES
]
async def async_setup_entry(
hass: HomeAssistant,
entry: LiebherrConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Liebherr number entities."""
coordinators = entry.runtime_data
async_add_entities(
LiebherrNumber(
coordinator=coordinator,
zone_id=temp_control.zone_id,
description=description,
_create_number_entities(list(entry.runtime_data.coordinators.values()))
)
@callback
def _async_new_device(coordinators: list[LiebherrCoordinator]) -> None:
"""Add number entities for new devices."""
async_add_entities(_create_number_entities(coordinators))
entry.async_on_unload(
async_dispatcher_connect(
hass, f"{DOMAIN}_new_device_{entry.entry_id}", _async_new_device
)
for coordinator in coordinators.values()
for temp_control in coordinator.data.get_temperature_controls().values()
for description in NUMBER_TYPES
)

View File

@@ -53,7 +53,7 @@ rules:
docs-supported-functions: done
docs-troubleshooting: done
docs-use-cases: done
dynamic-devices: todo
dynamic-devices: done
entity-category: done
entity-device-class: done
entity-disabled-by-default:

View File

@@ -18,9 +18,11 @@ from pyliebherrhomeapi import (
)
from homeassistant.components.select import SelectEntity, SelectEntityDescription
from homeassistant.core import HomeAssistant
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.dispatcher import async_dispatcher_connect
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN
from .coordinator import LiebherrConfigEntry, LiebherrCoordinator
from .entity import ZONE_POSITION_MAP, LiebherrEntity
@@ -109,15 +111,13 @@ SELECT_TYPES: list[LiebherrSelectEntityDescription] = [
]
async def async_setup_entry(
hass: HomeAssistant,
entry: LiebherrConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Liebherr select entities."""
def _create_select_entities(
coordinators: list[LiebherrCoordinator],
) -> list[LiebherrSelectEntity]:
"""Create select entities for the given coordinators."""
entities: list[LiebherrSelectEntity] = []
for coordinator in entry.runtime_data.values():
for coordinator in coordinators:
has_multiple_zones = len(coordinator.data.get_temperature_controls()) > 1
for control in coordinator.data.controls:
@@ -137,7 +137,29 @@ async def async_setup_entry(
)
)
async_add_entities(entities)
return entities
async def async_setup_entry(
hass: HomeAssistant,
entry: LiebherrConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Liebherr select entities."""
async_add_entities(
_create_select_entities(list(entry.runtime_data.coordinators.values()))
)
@callback
def _async_new_device(coordinators: list[LiebherrCoordinator]) -> None:
"""Add select entities for new devices."""
async_add_entities(_create_select_entities(coordinators))
entry.async_on_unload(
async_dispatcher_connect(
hass, f"{DOMAIN}_new_device_{entry.entry_id}", _async_new_device
)
)
class LiebherrSelectEntity(LiebherrEntity, SelectEntity):

View File

@@ -14,10 +14,12 @@ from homeassistant.components.sensor import (
SensorStateClass,
)
from homeassistant.const import UnitOfTemperature
from homeassistant.core import HomeAssistant
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.dispatcher import async_dispatcher_connect
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import StateType
from .const import DOMAIN
from .coordinator import LiebherrConfigEntry, LiebherrCoordinator
from .entity import LiebherrZoneEntity
@@ -48,22 +50,41 @@ SENSOR_TYPES: tuple[LiebherrSensorEntityDescription, ...] = (
)
def _create_sensor_entities(
coordinators: list[LiebherrCoordinator],
) -> list[LiebherrSensor]:
"""Create sensor entities for the given coordinators."""
return [
LiebherrSensor(
coordinator=coordinator,
zone_id=temp_control.zone_id,
description=description,
)
for coordinator in coordinators
for temp_control in coordinator.data.get_temperature_controls().values()
for description in SENSOR_TYPES
]
async def async_setup_entry(
hass: HomeAssistant,
entry: LiebherrConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Liebherr sensor entities."""
coordinators = entry.runtime_data
async_add_entities(
LiebherrSensor(
coordinator=coordinator,
zone_id=temp_control.zone_id,
description=description,
_create_sensor_entities(list(entry.runtime_data.coordinators.values()))
)
@callback
def _async_new_device(coordinators: list[LiebherrCoordinator]) -> None:
"""Add sensor entities for new devices."""
async_add_entities(_create_sensor_entities(coordinators))
entry.async_on_unload(
async_dispatcher_connect(
hass, f"{DOMAIN}_new_device_{entry.entry_id}", _async_new_device
)
for coordinator in coordinators.values()
for temp_control in coordinator.data.get_temperature_controls().values()
for description in SENSOR_TYPES
)

View File

@@ -15,9 +15,11 @@ from pyliebherrhomeapi.const import (
)
from homeassistant.components.switch import SwitchEntity, SwitchEntityDescription
from homeassistant.core import HomeAssistant
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.dispatcher import async_dispatcher_connect
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN
from .coordinator import LiebherrConfigEntry, LiebherrCoordinator
from .entity import ZONE_POSITION_MAP, LiebherrEntity
@@ -90,15 +92,13 @@ DEVICE_SWITCH_TYPES: dict[str, LiebherrDeviceSwitchEntityDescription] = {
}
async def async_setup_entry(
hass: HomeAssistant,
entry: LiebherrConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Liebherr switch entities."""
def _create_switch_entities(
coordinators: list[LiebherrCoordinator],
) -> list[LiebherrDeviceSwitch | LiebherrZoneSwitch]:
"""Create switch entities for the given coordinators."""
entities: list[LiebherrDeviceSwitch | LiebherrZoneSwitch] = []
for coordinator in entry.runtime_data.values():
for coordinator in coordinators:
has_multiple_zones = len(coordinator.data.get_temperature_controls()) > 1
for control in coordinator.data.controls:
@@ -127,7 +127,29 @@ async def async_setup_entry(
)
)
async_add_entities(entities)
return entities
async def async_setup_entry(
hass: HomeAssistant,
entry: LiebherrConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Liebherr switch entities."""
async_add_entities(
_create_switch_entities(list(entry.runtime_data.coordinators.values()))
)
@callback
def _async_new_device(coordinators: list[LiebherrCoordinator]) -> None:
"""Add switch entities for new devices."""
async_add_entities(_create_switch_entities(coordinators))
entry.async_on_unload(
async_dispatcher_connect(
hass, f"{DOMAIN}_new_device_{entry.entry_id}", _async_new_device
)
)
class LiebherrDeviceSwitch(LiebherrEntity, SwitchEntity):

View File

@@ -42,7 +42,7 @@ async def async_get_media_browser_root_object(
media_class=MediaClass.APP,
media_content_id="",
media_content_type=DOMAIN,
thumbnail="/api/brands/integration/lovelace/logo.png",
thumbnail="https://brands.home-assistant.io/_/lovelace/logo.png",
can_play=False,
can_expand=True,
)
@@ -72,7 +72,7 @@ async def async_browse_media(
media_class=MediaClass.APP,
media_content_id=DEFAULT_DASHBOARD,
media_content_type=DOMAIN,
thumbnail="/api/brands/integration/lovelace/logo.png",
thumbnail="https://brands.home-assistant.io/_/lovelace/logo.png",
can_play=True,
can_expand=False,
)
@@ -104,7 +104,7 @@ async def async_browse_media(
media_class=MediaClass.APP,
media_content_id=f"{info['url_path']}/{view['path']}",
media_content_type=DOMAIN,
thumbnail="/api/brands/integration/lovelace/logo.png",
thumbnail="https://brands.home-assistant.io/_/lovelace/logo.png",
can_play=True,
can_expand=False,
)
@@ -213,7 +213,7 @@ def _item_from_info(info: dict) -> BrowseMedia:
media_class=MediaClass.APP,
media_content_id=info["url_path"],
media_content_type=DOMAIN,
thumbnail="/api/brands/integration/lovelace/logo.png",
thumbnail="https://brands.home-assistant.io/_/lovelace/logo.png",
can_play=True,
can_expand=len(info["views"]) > 1,
)

View File

@@ -285,9 +285,9 @@ class MatterEntity(Entity):
self,
command: ClusterCommand,
**kwargs: Any,
) -> None:
) -> Any:
"""Send device command on the primary attribute's endpoint."""
await self.matter_client.send_device_command(
return await self.matter_client.send_device_command(
node_id=self._endpoint.node.node_id,
endpoint_id=self._endpoint.endpoint_id,
command=command,

View File

@@ -10,6 +10,7 @@ from chip.clusters import Objects as clusters
from matter_server.client.models import device_types
from homeassistant.components.vacuum import (
Segment,
StateVacuumEntity,
StateVacuumEntityDescription,
VacuumActivity,
@@ -70,6 +71,7 @@ class MatterVacuum(MatterEntity, StateVacuumEntity):
"""Representation of a Matter Vacuum cleaner entity."""
_last_accepted_commands: list[int] | None = None
_last_service_area_feature_map: int | None = None
_supported_run_modes: (
dict[int, clusters.RvcRunMode.Structs.ModeOptionStruct] | None
) = None
@@ -136,6 +138,16 @@ class MatterVacuum(MatterEntity, StateVacuumEntity):
"No supported run mode found to start the vacuum cleaner."
)
# Reset selected areas to an unconstrained selection to ensure start
# performs a full clean and does not reuse a previous area-targeted
# selection.
if VacuumEntityFeature.CLEAN_AREA in self.supported_features:
# Matter ServiceArea: an empty NewAreas list means unconstrained
# operation (full clean).
await self.send_device_command(
clusters.ServiceArea.Commands.SelectAreas(newAreas=[])
)
await self.send_device_command(
clusters.RvcRunMode.Commands.ChangeToMode(newMode=mode.mode)
)
@@ -144,6 +156,66 @@ class MatterVacuum(MatterEntity, StateVacuumEntity):
"""Pause the cleaning task."""
await self.send_device_command(clusters.RvcOperationalState.Commands.Pause())
@property
def _current_segments(self) -> dict[str, Segment]:
"""Return the current cleanable segments reported by the device."""
supported_areas: list[clusters.ServiceArea.Structs.AreaStruct] = (
self.get_matter_attribute_value(
clusters.ServiceArea.Attributes.SupportedAreas
)
)
segments: dict[str, Segment] = {}
for area in supported_areas:
area_name = None
if area.areaInfo and area.areaInfo.locationInfo:
area_name = area.areaInfo.locationInfo.locationName
if area_name:
segment_id = str(area.areaID)
segments[segment_id] = Segment(id=segment_id, name=area_name)
return segments
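The segment mapping above can be sketched standalone with simplified stand-ins for the Matter `ServiceArea` structs (the dataclass shapes below are hypothetical approximations of the real cluster objects): only areas that carry a human-readable location name become segments, keyed by their stringified area ID.

```python
from __future__ import annotations

from dataclasses import dataclass

# Simplified stand-ins for the Matter ServiceArea structs (hypothetical shapes).
@dataclass
class LocationInfo:
    locationName: str | None

@dataclass
class AreaInfo:
    locationInfo: LocationInfo | None

@dataclass
class AreaStruct:
    areaID: int
    areaInfo: AreaInfo | None

def current_segments(supported_areas: list[AreaStruct]) -> dict[str, str]:
    """Keep only areas that expose a human-readable location name."""
    segments: dict[str, str] = {}
    for area in supported_areas:
        name = None
        if area.areaInfo and area.areaInfo.locationInfo:
            name = area.areaInfo.locationInfo.locationName
        if name:
            segments[str(area.areaID)] = name
    return segments

areas = [
    AreaStruct(7, AreaInfo(LocationInfo("Kitchen"))),
    AreaStruct(8, AreaInfo(None)),  # unnamed area is skipped
]
print(current_segments(areas))  # {'7': 'Kitchen'}
```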
async def async_get_segments(self) -> list[Segment]:
"""Get the segments that can be cleaned.
Returns a list of segments with their IDs and names.
"""
return list(self._current_segments.values())
async def async_clean_segments(self, segment_ids: list[str], **kwargs: Any) -> None:
"""Clean the specified segments.
Args:
segment_ids: List of segment IDs to clean.
**kwargs: Additional arguments (unused).
"""
area_ids = [int(segment_id) for segment_id in segment_ids]
mode = self._get_run_mode_by_tag(ModeTag.CLEANING)
if mode is None:
raise HomeAssistantError(
"No supported run mode found to start the vacuum cleaner."
)
response = await self.send_device_command(
clusters.ServiceArea.Commands.SelectAreas(newAreas=area_ids)
)
if (
response
and response.status != clusters.ServiceArea.Enums.SelectAreasStatus.kSuccess
):
raise HomeAssistantError(
f"Failed to select areas: {response.statusText or response.status.name}"
)
await self.send_device_command(
clusters.RvcRunMode.Commands.ChangeToMode(newMode=mode.mode)
)
@callback
def _update_from_device(self) -> None:
"""Update from device."""
@@ -176,16 +248,34 @@ class MatterVacuum(MatterEntity, StateVacuumEntity):
state = VacuumActivity.CLEANING
self._attr_activity = state
if (
VacuumEntityFeature.CLEAN_AREA in self.supported_features
and self.registry_entry is not None
and (last_seen_segments := self.last_seen_segments) is not None
and self._current_segments != {s.id: s for s in last_seen_segments}
):
self.async_create_segments_issue()
@callback
def _calculate_features(self) -> None:
"""Calculate features for HA Vacuum platform."""
accepted_operational_commands: list[int] = self.get_matter_attribute_value(
clusters.RvcOperationalState.Attributes.AcceptedCommandList
)
# in principle the feature set should not change, except for the accepted commands
if self._last_accepted_commands == accepted_operational_commands:
service_area_feature_map: int | None = self.get_matter_attribute_value(
clusters.ServiceArea.Attributes.FeatureMap
)
# In principle the feature set should not change, except for accepted
# commands and service area feature map.
if (
self._last_accepted_commands == accepted_operational_commands
and self._last_service_area_feature_map == service_area_feature_map
):
return
self._last_accepted_commands = accepted_operational_commands
self._last_service_area_feature_map = service_area_feature_map
supported_features: VacuumEntityFeature = VacuumEntityFeature(0)
supported_features |= VacuumEntityFeature.START
supported_features |= VacuumEntityFeature.STATE
@@ -212,6 +302,12 @@ class MatterVacuum(MatterEntity, StateVacuumEntity):
in accepted_operational_commands
):
supported_features |= VacuumEntityFeature.RETURN_HOME
# Check if Map feature is enabled for clean area support
if (
service_area_feature_map is not None
and service_area_feature_map & clusters.ServiceArea.Bitmaps.Feature.kMaps
):
supported_features |= VacuumEntityFeature.CLEAN_AREA
self._attr_supported_features = supported_features
@@ -228,6 +324,10 @@ DISCOVERY_SCHEMAS = [
clusters.RvcRunMode.Attributes.CurrentMode,
clusters.RvcOperationalState.Attributes.OperationalState,
),
optional_attributes=(
clusters.ServiceArea.Attributes.FeatureMap,
clusters.ServiceArea.Attributes.SupportedAreas,
),
device_type=(device_types.RoboticVacuumCleaner,),
allow_none_value=True,
),

View File

@@ -2,9 +2,11 @@
from __future__ import annotations
from collections.abc import Mapping
import asyncio
from collections.abc import Iterable, Mapping
from dataclasses import dataclass
import logging
import re
from typing import Any, cast
import httpx
@@ -41,6 +43,48 @@ STEP_USER_DATA_SCHEMA = vol.Schema(
}
)
# Headers and regex for WWW-Authenticate parsing for rfc9728
WWW_AUTHENTICATE_HEADER = "WWW-Authenticate"
RESOURCE_METADATA_REGEXP = r'resource_metadata="([^"]+)"'
OAUTH_PROTECTED_RESOURCE_ENDPOINT = "/.well-known/oauth-protected-resource"
SCOPES_REGEXP = r'scope="([^"]+)"'
@dataclass
class AuthenticateHeader:
"""Class to hold info from the WWW-Authenticate header for supporting rfc9728."""
resource_metadata_url: str
scopes: list[str] | None = None
@classmethod
def from_header(
cls, url: str, error_response: httpx.Response
) -> AuthenticateHeader | None:
"""Create AuthenticateHeader from WWW-Authenticate header."""
if not (header := error_response.headers.get(WWW_AUTHENTICATE_HEADER)) or not (
match := re.search(RESOURCE_METADATA_REGEXP, header)
):
return None
resource_metadata_url = str(URL(url).join(URL(match.group(1))))
scope_match = re.search(SCOPES_REGEXP, header)
return cls(
resource_metadata_url=resource_metadata_url,
scopes=scope_match.group(1).split(" ") if scope_match else None,
)
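The header parsing in `from_header` boils down to two regex captures over the `WWW-Authenticate` value. A minimal standalone sketch, using the same patterns against a hypothetical header a server might return with a 401:

```python
import re

RESOURCE_METADATA_REGEXP = r'resource_metadata="([^"]+)"'
SCOPES_REGEXP = r'scope="([^"]+)"'

# Hypothetical WWW-Authenticate value from a 401 response.
header = (
    'Bearer resource_metadata='
    '"https://api.example.com/.well-known/oauth-protected-resource", '
    'scope="mcp.read mcp.write"'
)

match = re.search(RESOURCE_METADATA_REGEXP, header)
scope_match = re.search(SCOPES_REGEXP, header)

# The metadata URL is required; scopes are optional.
metadata_url = match.group(1) if match else None
scopes = scope_match.group(1).split(" ") if scope_match else None

print(metadata_url)  # https://api.example.com/.well-known/oauth-protected-resource
print(scopes)        # ['mcp.read', 'mcp.write']
```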
@dataclass
class ResourceMetadata:
"""Class to hold protected resource metadata defined in rfc9728."""
authorization_servers: list[str]
"""List of authorization server URLs."""
supported_scopes: list[str] | None = None
"""List of supported scopes."""
# OAuth server discovery endpoint for rfc8414
OAUTH_DISCOVERY_ENDPOINT = ".well-known/oauth-authorization-server"
MCP_DISCOVERY_HEADERS = {
@@ -58,40 +102,27 @@ class OAuthConfig:
scopes: list[str] | None = None
async def async_discover_oauth_config(
hass: HomeAssistant, mcp_server_url: str
async def async_discover_authorization_server(
hass: HomeAssistant, auth_server_url: str
) -> OAuthConfig:
"""Discover the OAuth configuration for the MCP server.
This implements the functionality in the MCP spec for discovery. If the MCP server URL
is https://api.example.com/v1/mcp, then:
- The authorization base URL is https://api.example.com
- The metadata endpoint MUST be at https://api.example.com/.well-known/oauth-authorization-server
- For servers that do not implement OAuth 2.0 Authorization Server Metadata, the client uses
default paths relative to the authorization base URL.
"""
parsed_url = URL(mcp_server_url)
discovery_endpoint = str(parsed_url.with_path(OAUTH_DISCOVERY_ENDPOINT))
"""Perform OAuth 2.0 Authorization Server Metadata discovery as per RFC8414."""
parsed_url = URL(auth_server_url)
urls_to_try = [
str(parsed_url.with_path(path))
for path in _authorization_server_discovery_paths(parsed_url)
]
# Pick the first successful response and propagate exceptions, except for
# 404, where we fall back to the default paths.
try:
async with httpx.AsyncClient(headers=MCP_DISCOVERY_HEADERS) as client:
response = await client.get(discovery_endpoint)
response.raise_for_status()
except httpx.TimeoutException as error:
_LOGGER.info("Timeout connecting to MCP server: %s", error)
raise TimeoutConnectError from error
except httpx.HTTPStatusError as error:
if error.response.status_code == 404:
_LOGGER.info("Authorization Server Metadata not found, using default paths")
return OAuthConfig(
authorization_server=AuthorizationServer(
authorize_url=str(parsed_url.with_path("/authorize")),
token_url=str(parsed_url.with_path("/token")),
)
response = await _async_fetch_any(hass, urls_to_try)
except NotFoundError:
_LOGGER.info("Authorization Server Metadata not found, using default paths")
return OAuthConfig(
authorization_server=AuthorizationServer(
authorize_url=str(parsed_url.with_path("/authorize")),
token_url=str(parsed_url.with_path("/token")),
)
raise CannotConnect from error
except httpx.HTTPError as error:
_LOGGER.info("Cannot discover OAuth configuration: %s", error)
raise CannotConnect from error
)
data = response.json()
authorize_url = data["authorization_endpoint"]
@@ -130,7 +161,8 @@ async def validate_input(
except httpx.HTTPStatusError as error:
_LOGGER.info("Cannot connect to MCP server: %s", error)
if error.response.status_code == 401:
raise InvalidAuth from error
auth_header = AuthenticateHeader.from_header(url, error.response)
raise InvalidAuth(auth_header) from error
raise CannotConnect from error
except httpx.HTTPError as error:
_LOGGER.info("Cannot connect to MCP server: %s", error)
@@ -156,6 +188,7 @@ class ModelContextProtocolConfigFlow(AbstractOAuth2FlowHandler, domain=DOMAIN):
super().__init__()
self.data: dict[str, Any] = {}
self.oauth_config: OAuthConfig | None = None
self.auth_header: AuthenticateHeader | None = None
async def async_step_user(
self, user_input: dict[str, Any] | None = None
@@ -171,7 +204,8 @@ class ModelContextProtocolConfigFlow(AbstractOAuth2FlowHandler, domain=DOMAIN):
errors["base"] = "timeout_connect"
except CannotConnect:
errors["base"] = "cannot_connect"
except InvalidAuth:
except InvalidAuth as err:
self.auth_header = err.metadata
self.data[CONF_URL] = user_input[CONF_URL]
return await self.async_step_auth_discovery()
except MissingCapabilities:
@@ -196,12 +230,34 @@ class ModelContextProtocolConfigFlow(AbstractOAuth2FlowHandler, domain=DOMAIN):
"""Handle the OAuth server discovery step.
Since this OAuth server requires authentication, this step will attempt
to find the OAuth medata then run the OAuth authentication flow.
to find the OAuth metadata then run the OAuth authentication flow.
"""
resource_metadata: ResourceMetadata | None = None
try:
oauth_config = await async_discover_oauth_config(
self.hass, self.data[CONF_URL]
)
if self.auth_header:
_LOGGER.debug(
"Resource metadata discovery from header: %s", self.auth_header
)
resource_metadata = await async_discover_protected_resource(
self.hass,
self.auth_header.resource_metadata_url,
self.data[CONF_URL],
)
_LOGGER.debug("Protected resource metadata: %s", resource_metadata)
oauth_config = await async_discover_authorization_server(
self.hass,
# Use the first authorization server from the resource metadata: most
# servers list only one, and the spec defines no selection strategy.
resource_metadata.authorization_servers[0],
)
else:
_LOGGER.debug(
"Discovering authorization server without protected resource metadata"
)
oauth_config = await async_discover_authorization_server(
self.hass,
self.data[CONF_URL],
)
except TimeoutConnectError:
return self.async_abort(reason="timeout_connect")
except CannotConnect:
@@ -216,7 +272,9 @@ class ModelContextProtocolConfigFlow(AbstractOAuth2FlowHandler, domain=DOMAIN):
{
CONF_AUTHORIZATION_URL: oauth_config.authorization_server.authorize_url,
CONF_TOKEN_URL: oauth_config.authorization_server.token_url,
CONF_SCOPE: oauth_config.scopes,
CONF_SCOPE: _select_scopes(
self.auth_header, oauth_config, resource_metadata
),
}
)
return await self.async_step_credentials_choice()
@@ -326,6 +384,143 @@ class ModelContextProtocolConfigFlow(AbstractOAuth2FlowHandler, domain=DOMAIN):
return await self.async_step_auth()
async def _async_fetch_any(
hass: HomeAssistant,
urls: Iterable[str],
) -> httpx.Response:
"""Fetch all URLs concurrently and return the first successful response."""
async def fetch(url: str) -> httpx.Response:
_LOGGER.debug("Fetching URL %s", url)
try:
async with httpx.AsyncClient() as client:
response = await client.get(url)
response.raise_for_status()
return response
except httpx.TimeoutException as error:
_LOGGER.debug("Timeout fetching URL %s: %s", url, error)
raise TimeoutConnectError from error
except httpx.HTTPStatusError as error:
_LOGGER.debug("Server error for URL %s: %s", url, error)
if error.response.status_code == 404:
raise NotFoundError from error
raise CannotConnect from error
except httpx.HTTPError as error:
_LOGGER.debug("Cannot fetch URL %s: %s", url, error)
raise CannotConnect from error
tasks = [asyncio.create_task(fetch(url)) for url in urls]
return_err: Exception | None = None
try:
for future in asyncio.as_completed(tasks):
try:
return await future
except Exception as err: # noqa: BLE001
_LOGGER.debug("Fetch failed: %s", err)
if return_err is None:
return_err = err
continue
finally:
for task in tasks:
task.cancel()
raise return_err or CannotConnect("No responses received from any URL")
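The first-success racing in `_async_fetch_any` can be sketched without httpx: launch all fetchers concurrently, return the first one that succeeds via `asyncio.as_completed`, remember the first failure for the error case, and cancel the remaining tasks either way (the simulated fetchers below are hypothetical):

```python
import asyncio

async def fetch_any(fetchers):
    """Run all fetchers concurrently; return the first successful result."""
    tasks = [asyncio.create_task(f()) for f in fetchers]
    first_err: Exception | None = None
    try:
        for future in asyncio.as_completed(tasks):
            try:
                return await future
            except Exception as err:
                # Keep the earliest failure in case nothing succeeds.
                first_err = first_err or err
    finally:
        # Cancel whatever is still in flight (no-op for finished tasks).
        for task in tasks:
            task.cancel()
    raise first_err or RuntimeError("no responses received")

async def main():
    async def fail():
        raise ValueError("404")

    async def ok():
        await asyncio.sleep(0.01)
        return "response"

    return await fetch_any([fail, ok])

result = asyncio.run(main())
print(result)  # response
```

Note that a fast failure does not abort the race; the slower successful fetch still wins.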
async def async_discover_protected_resource(
hass: HomeAssistant,
auth_url: str,
mcp_server_url: str,
) -> ResourceMetadata:
"""Discover the OAuth configuration for a protected resource for MCP spec version 2025-11-25+.
This implements the functionality in the MCP spec for discovery. We use the information
from the WWW-Authenticate header to fetch the resource metadata implementing
RFC9728.
For the URL https://example.com/public/mcp we attempt these URLs:
- https://example.com/.well-known/oauth-protected-resource/public/mcp
- https://example.com/.well-known/oauth-protected-resource
"""
parsed_url = URL(mcp_server_url)
urls_to_try = {
auth_url,
str(
parsed_url.with_path(
f"{OAUTH_PROTECTED_RESOURCE_ENDPOINT}{parsed_url.path}"
)
),
str(parsed_url.with_path(OAUTH_PROTECTED_RESOURCE_ENDPOINT)),
}
response = await _async_fetch_any(hass, list(urls_to_try))
# Parse the OAuth Authorization Protected Resource Metadata (rfc9728). We
# expect to find at least one authorization server in the response and
# a valid resource field that matches the MCP server URL.
data = response.json()
if (
not (authorization_servers := data.get("authorization_servers"))
or not (resource := data.get("resource"))
or (resource != mcp_server_url)
):
_LOGGER.error("Invalid OAuth resource metadata: %s", data)
raise CannotConnect("OAuth resource metadata is invalid")
return ResourceMetadata(
authorization_servers=authorization_servers,
supported_scopes=data.get("scopes_supported"),
)
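The candidate RFC 9728 metadata URLs built above (path-inserted form first, then the root well-known form) can be reproduced with only the standard library; this sketch uses `urllib.parse` instead of the `yarl.URL` objects in the integration:

```python
from urllib.parse import urlsplit, urlunsplit

OAUTH_PROTECTED_RESOURCE_ENDPOINT = "/.well-known/oauth-protected-resource"

def protected_resource_urls(mcp_server_url: str) -> list[str]:
    """Return candidate RFC 9728 metadata URLs for an MCP server URL."""
    scheme, netloc, path, _, _ = urlsplit(mcp_server_url)
    return [
        # Path insertion: /.well-known/oauth-protected-resource/public/mcp
        urlunsplit(
            (scheme, netloc, f"{OAUTH_PROTECTED_RESOURCE_ENDPOINT}{path}", "", "")
        ),
        # Root form: /.well-known/oauth-protected-resource
        urlunsplit((scheme, netloc, OAUTH_PROTECTED_RESOURCE_ENDPOINT, "", "")),
    ]

print(protected_resource_urls("https://example.com/public/mcp"))
```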
def _authorization_server_discovery_paths(auth_server_url: URL) -> list[str]:
"""Return the list of paths to try for OAuth server discovery.
For an auth server URL with path components, e.g., https://auth.example.com/tenant1,
clients try endpoints in the following priority order:
- OAuth 2.0 Authorization Server Metadata with path insertion:
https://auth.example.com/.well-known/oauth-authorization-server/tenant1
- OpenID Connect Discovery 1.0 with path insertion:
https://auth.example.com/.well-known/openid-configuration/tenant1
- OpenID Connect Discovery 1.0 path appending:
https://auth.example.com/tenant1/.well-known/openid-configuration
For an auth server URL without path components, e.g., https://auth.example.com,
clients try:
- OAuth 2.0 Authorization Server Metadata:
https://auth.example.com/.well-known/oauth-authorization-server
- OpenID Connect Discovery 1.0:
https://auth.example.com/.well-known/openid-configuration
"""
if auth_server_url.path and auth_server_url.path != "/":
return [
f"/.well-known/oauth-authorization-server{auth_server_url.path}",
f"/.well-known/openid-configuration{auth_server_url.path}",
f"{auth_server_url.path}/.well-known/openid-configuration",
]
return [
"/.well-known/oauth-authorization-server",
"/.well-known/openid-configuration",
]
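The priority order described in the docstring above can be exercised with a standalone equivalent of `_authorization_server_discovery_paths`; this sketch uses stdlib `urllib.parse` in place of `yarl`:

```python
from urllib.parse import urlsplit

def discovery_paths(auth_server_url: str) -> list[str]:
    """Mirror the RFC 8414 / OIDC discovery priority order."""
    path = urlsplit(auth_server_url).path
    if path and path != "/":
        return [
            f"/.well-known/oauth-authorization-server{path}",
            f"/.well-known/openid-configuration{path}",
            f"{path}/.well-known/openid-configuration",
        ]
    return [
        "/.well-known/oauth-authorization-server",
        "/.well-known/openid-configuration",
    ]

# With a tenant path, path-insertion forms come before path-appending.
print(discovery_paths("https://auth.example.com/tenant1"))
# Without a path, only the two well-known roots are tried.
print(discovery_paths("https://auth.example.com"))
```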
def _select_scopes(
auth_header: AuthenticateHeader | None,
oauth_config: OAuthConfig,
resource_metadata: ResourceMetadata | None,
) -> list[str] | None:
"""Select OAuth scopes based on the MCP spec scope selection strategy.
This follows the MCP spec strategy: prefer scopes from the WWW-Authenticate
header first, then the protected resource metadata, and finally the default
scopes from OAuth discovery.
"""
if auth_header and auth_header.scopes:
return auth_header.scopes
if resource_metadata and resource_metadata.supported_scopes:
return resource_metadata.supported_scopes
return oauth_config.scopes
class InvalidUrl(HomeAssistantError):
"""Error to indicate the URL format is invalid."""
@@ -338,9 +533,18 @@ class TimeoutConnectError(HomeAssistantError):
"""Error to indicate we cannot connect."""
class NotFoundError(CannotConnect):
"""Error to indicate the resource was not found."""
class InvalidAuth(HomeAssistantError):
"""Error to indicate there is invalid auth."""
def __init__(self, metadata: AuthenticateHeader | None = None) -> None:
"""Initialize the error."""
super().__init__()
self.metadata = metadata
class MissingCapabilities(HomeAssistantError):
"""Error to indicate that the MCP server is missing required capabilities."""

View File

@@ -83,7 +83,7 @@ class MediaSourceItem:
identifier=None,
media_class=MediaClass.APP,
media_content_type=MediaType.APP,
thumbnail=f"/api/brands/integration/{source.domain}/logo.png",
thumbnail=f"https://brands.home-assistant.io/_/{source.domain}/logo.png",
title=source.name,
can_play=False,
can_expand=True,

View File

@@ -0,0 +1,30 @@
"""Diagnostics support for Met.no integration."""
from typing import Any
from homeassistant.components.diagnostics import async_redact_data
from homeassistant.const import CONF_LATITUDE, CONF_LONGITUDE
from homeassistant.core import HomeAssistant
from .coordinator import MetWeatherConfigEntry
TO_REDACT = [
CONF_LATITUDE,
CONF_LONGITUDE,
]
async def async_get_config_entry_diagnostics(
hass: HomeAssistant, entry: MetWeatherConfigEntry
) -> dict[str, Any]:
"""Return diagnostics for a config entry."""
coordinator_data = entry.runtime_data.data
return {
"entry_data": async_redact_data(entry.data, TO_REDACT),
"data": {
"current_weather_data": coordinator_data.current_weather_data,
"daily_forecast": coordinator_data.daily_forecast,
"hourly_forecast": coordinator_data.hourly_forecast,
},
}

View File

@@ -213,8 +213,8 @@ class NRGkickConfigFlow(ConfigFlow, domain=DOMAIN):
if info := await self._async_validate_credentials(
self._pending_host,
errors,
username=user_input[CONF_USERNAME],
password=user_input[CONF_PASSWORD],
username=user_input.get(CONF_USERNAME),
password=user_input.get(CONF_PASSWORD),
):
await self.async_set_unique_id(info["serial"], raise_on_progress=False)
self._abort_if_unique_id_configured()
@@ -222,8 +222,8 @@ class NRGkickConfigFlow(ConfigFlow, domain=DOMAIN):
title=info["title"],
data={
CONF_HOST: self._pending_host,
CONF_USERNAME: user_input[CONF_USERNAME],
CONF_PASSWORD: user_input[CONF_PASSWORD],
CONF_USERNAME: user_input.get(CONF_USERNAME),
CONF_PASSWORD: user_input.get(CONF_PASSWORD),
},
)
@@ -253,8 +253,8 @@ class NRGkickConfigFlow(ConfigFlow, domain=DOMAIN):
if info := await self._async_validate_credentials(
reauth_entry.data[CONF_HOST],
errors,
username=user_input[CONF_USERNAME],
password=user_input[CONF_PASSWORD],
username=user_input.get(CONF_USERNAME),
password=user_input.get(CONF_PASSWORD),
):
await self.async_set_unique_id(info["serial"], raise_on_progress=False)
self._abort_if_unique_id_mismatch()
@@ -318,11 +318,13 @@ class NRGkickConfigFlow(ConfigFlow, domain=DOMAIN):
reconfigure_entry = self._get_reconfigure_entry()
if user_input is not None:
username = user_input.get(CONF_USERNAME)
password = user_input.get(CONF_PASSWORD)
if info := await self._async_validate_credentials(
self._pending_host,
errors,
username=user_input[CONF_USERNAME],
password=user_input[CONF_PASSWORD],
username=username,
password=password,
):
await self.async_set_unique_id(info["serial"], raise_on_progress=False)
self._abort_if_unique_id_mismatch()
@@ -330,8 +332,8 @@ class NRGkickConfigFlow(ConfigFlow, domain=DOMAIN):
reconfigure_entry,
data_updates={
CONF_HOST: self._pending_host,
CONF_USERNAME: user_input[CONF_USERNAME],
CONF_PASSWORD: user_input[CONF_PASSWORD],
CONF_USERNAME: username,
CONF_PASSWORD: password,
},
)

View File

@@ -6,7 +6,7 @@
"documentation": "https://www.home-assistant.io/integrations/nrgkick",
"integration_type": "device",
"iot_class": "local_polling",
"quality_scale": "bronze",
"quality_scale": "silver",
"requirements": ["nrgkick-api==1.7.1"],
"zeroconf": ["_nrgkick._tcp.local."]
}

View File

@@ -41,7 +41,7 @@ rules:
docs-installation-parameters: done
entity-unavailable: done
integration-owner: done
log-when-unavailable: todo
log-when-unavailable: done
parallel-updates: done
reauthentication-flow: done
test-coverage: done

View File

@@ -8,5 +8,5 @@
"iot_class": "cloud_push",
"loggers": ["aiontfy"],
"quality_scale": "platinum",
"requirements": ["aiontfy==0.8.0"]
"requirements": ["aiontfy==0.8.1"]
}

View File

@@ -23,7 +23,7 @@ async def async_get_media_browser_root_object(
media_class=MediaClass.APP,
media_content_id="",
media_content_type="plex",
thumbnail="/api/brands/integration/plex/logo.png",
thumbnail="https://brands.home-assistant.io/_/plex/logo.png",
can_play=False,
can_expand=True,
)

View File

@@ -94,7 +94,7 @@ def browse_media( # noqa: C901
can_expand=True,
children=[],
children_media_class=MediaClass.DIRECTORY,
thumbnail="/api/brands/integration/plex/logo.png",
thumbnail="https://brands.home-assistant.io/_/plex/logo.png",
)
if platform != "sonos":
server_info.children.append(

View File

@@ -148,7 +148,8 @@
"name": "Operating system version"
},
"stack_containers_count": {
"name": "Container count"
"name": "Containers",
"unit_of_measurement": "containers"
},
"stack_type": {
"name": "Type",

View File

@@ -7,7 +7,7 @@
"integration_type": "hub",
"iot_class": "cloud_polling",
"quality_scale": "silver",
"requirements": ["powerfox==2.1.0"],
"requirements": ["powerfox==2.1.1"],
"zeroconf": [
{
"name": "powerfox*",

View File

@@ -2,12 +2,18 @@
from __future__ import annotations
from collections.abc import Mapping
from typing import Any
from powerfox import PowerfoxAuthenticationError, PowerfoxConnectionError, PowerfoxLocal
import voluptuous as vol
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.config_entries import (
SOURCE_RECONFIGURE,
SOURCE_USER,
ConfigFlow,
ConfigFlowResult,
)
from homeassistant.const import CONF_API_KEY, CONF_HOST
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.service_info.zeroconf import ZeroconfServiceInfo
@@ -21,6 +27,12 @@ STEP_USER_DATA_SCHEMA = vol.Schema(
}
)
STEP_REAUTH_DATA_SCHEMA = vol.Schema(
{
vol.Required(CONF_API_KEY): str,
}
)
class PowerfoxLocalConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle a config flow for Powerfox Local."""
@@ -33,7 +45,7 @@ class PowerfoxLocalConfigFlow(ConfigFlow, domain=DOMAIN):
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle the user step."""
errors: dict[str, str] = {}
errors = {}
if user_input is not None:
self._host = user_input[CONF_HOST]
@@ -47,7 +59,15 @@ class PowerfoxLocalConfigFlow(ConfigFlow, domain=DOMAIN):
except PowerfoxConnectionError:
errors["base"] = "cannot_connect"
else:
return self._async_create_entry()
if self.source == SOURCE_USER:
return self._async_create_entry()
return self.async_update_reload_and_abort(
self._get_reconfigure_entry(),
data={
CONF_HOST: self._host,
CONF_API_KEY: self._api_key,
},
)
return self.async_show_form(
step_id="user",
@@ -84,6 +104,51 @@ class PowerfoxLocalConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle a confirmation flow for zeroconf discovery."""
return self._async_create_entry()
async def async_step_reauth(
self, entry_data: Mapping[str, Any]
) -> ConfigFlowResult:
"""Handle re-authentication flow."""
self._host = entry_data[CONF_HOST]
return await self.async_step_reauth_confirm()
async def async_step_reauth_confirm(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle re-authentication confirmation."""
errors = {}
if user_input is not None:
self._api_key = user_input[CONF_API_KEY]
reauth_entry = self._get_reauth_entry()
client = PowerfoxLocal(
host=reauth_entry.data[CONF_HOST],
api_key=user_input[CONF_API_KEY],
session=async_get_clientsession(self.hass),
)
try:
await client.value()
except PowerfoxAuthenticationError:
errors["base"] = "invalid_auth"
except PowerfoxConnectionError:
errors["base"] = "cannot_connect"
else:
return self.async_update_reload_and_abort(
reauth_entry,
data_updates=user_input,
)
return self.async_show_form(
step_id="reauth_confirm",
data_schema=STEP_REAUTH_DATA_SCHEMA,
errors=errors,
)
async def async_step_reconfigure(
self, user_input: Mapping[str, Any]
) -> ConfigFlowResult:
"""Handle reconfiguration."""
return await self.async_step_user()
def _async_create_entry(self) -> ConfigFlowResult:
"""Create a config entry."""
return self.async_create_entry(
@@ -103,5 +168,8 @@ class PowerfoxLocalConfigFlow(ConfigFlow, domain=DOMAIN):
)
await client.value()
await self.async_set_unique_id(self._device_id)
self._abort_if_unique_id_configured(updates={CONF_HOST: self._host})
await self.async_set_unique_id(self._device_id, raise_on_progress=False)
if self.source == SOURCE_RECONFIGURE:
self._abort_if_unique_id_mismatch()
else:
self._abort_if_unique_id_configured(updates={CONF_HOST: self._host})

View File

@@ -2,11 +2,17 @@
from __future__ import annotations
from powerfox import LocalResponse, PowerfoxConnectionError, PowerfoxLocal
from powerfox import (
LocalResponse,
PowerfoxAuthenticationError,
PowerfoxConnectionError,
PowerfoxLocal,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_API_KEY, CONF_HOST
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
@@ -40,6 +46,12 @@ class PowerfoxLocalDataUpdateCoordinator(DataUpdateCoordinator[LocalResponse]):
"""Fetch data from the local poweropti."""
try:
return await self.client.value()
except PowerfoxAuthenticationError as err:
raise ConfigEntryAuthFailed(
translation_domain=DOMAIN,
translation_key="invalid_auth",
translation_placeholders={"error": str(err)},
) from err
except PowerfoxConnectionError as err:
raise UpdateFailed(
translation_domain=DOMAIN,

View File

@@ -0,0 +1,24 @@
"""Support for Powerfox Local diagnostics."""
from __future__ import annotations
from typing import Any
from homeassistant.core import HomeAssistant
from .coordinator import PowerfoxLocalConfigEntry
async def async_get_config_entry_diagnostics(
hass: HomeAssistant, entry: PowerfoxLocalConfigEntry
) -> dict[str, Any]:
"""Return diagnostics for Powerfox Local config entry."""
coordinator = entry.runtime_data
return {
"power": coordinator.data.power,
"energy_usage": coordinator.data.energy_usage,
"energy_usage_high_tariff": coordinator.data.energy_usage_high_tariff,
"energy_usage_low_tariff": coordinator.data.energy_usage_low_tariff,
"energy_return": coordinator.data.energy_return,
}

View File

@@ -6,8 +6,8 @@
"documentation": "https://www.home-assistant.io/integrations/powerfox_local",
"integration_type": "device",
"iot_class": "local_polling",
"quality_scale": "bronze",
"requirements": ["powerfox==2.1.0"],
"quality_scale": "platinum",
"requirements": ["powerfox==2.1.1"],
"zeroconf": [
{
"name": "powerfox*",


@@ -43,12 +43,12 @@ rules:
integration-owner: done
log-when-unavailable: done
parallel-updates: done
-  reauthentication-flow: todo
+  reauthentication-flow: done
test-coverage: done
# Gold
devices: done
-  diagnostics: todo
+  diagnostics: done
discovery-update-info: done
discovery: done
docs-data-update: done
@@ -74,7 +74,7 @@ rules:
status: exempt
comment: |
There is no need for icon translations.
-  reconfiguration-flow: todo
+  reconfiguration-flow: done
repair-issues:
status: exempt
comment: |


@@ -2,13 +2,26 @@
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]"
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]",
"reconfigure_successful": "[%key:common::config_flow::abort::reconfigure_successful%]",
"unique_id_mismatch": "Please ensure you reconfigure against the same device."
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"invalid_auth": "[%key:common::config_flow::error::invalid_auth%]"
},
"step": {
"reauth_confirm": {
"data": {
"api_key": "[%key:common::config_flow::data::api_key%]"
},
"data_description": {
"api_key": "[%key:component::powerfox_local::config::step::user::data_description::api_key%]"
},
"description": "The API key for your Poweropti device is no longer valid.",
"title": "[%key:common::config_flow::title::reauth%]"
},
"user": {
"data": {
"api_key": "[%key:common::config_flow::data::api_key%]",
@@ -43,6 +56,9 @@
}
},
"exceptions": {
"invalid_auth": {
"message": "Error while authenticating with the device: {error}"
},
"update_failed": {
"message": "Error while updating the device: {error}"
}


@@ -40,6 +40,7 @@ from .coordinator import ProxmoxConfigEntry, ProxmoxCoordinator
PLATFORMS = [
Platform.BINARY_SENSOR,
Platform.BUTTON,
Platform.SENSOR,
]


@@ -74,16 +74,20 @@ def _get_nodes_data(data: dict[str, Any]) -> list[dict[str, Any]]:
raise ProxmoxSSLError from err
except ConnectTimeout as err:
raise ProxmoxConnectTimeout from err
-    except (ResourceException, requests.exceptions.ConnectionError) as err:
+    except ResourceException as err:
         raise ProxmoxNoNodesFound from err
+    except requests.exceptions.ConnectionError as err:
+        raise ProxmoxConnectionError from err
nodes_data: list[dict[str, Any]] = []
for node in nodes:
try:
vms = client.nodes(node["node"]).qemu.get()
containers = client.nodes(node["node"]).lxc.get()
-        except (ResourceException, requests.exceptions.ConnectionError) as err:
+        except ResourceException as err:
             raise ProxmoxNoNodesFound from err
+        except requests.exceptions.ConnectionError as err:
+            raise ProxmoxConnectionError from err
nodes_data.append(
{
@@ -197,18 +201,30 @@ class ProxmoxveConfigFlow(ConfigFlow, domain=DOMAIN):
"""Validate the user input. Return nodes data and/or errors."""
errors: dict[str, str] = {}
proxmox_nodes: list[dict[str, Any]] = []
+        err: ProxmoxError | None = None
         try:
             proxmox_nodes = await self.hass.async_add_executor_job(
                 _get_nodes_data, user_input
             )
-        except ProxmoxConnectTimeout:
+        except ProxmoxConnectTimeout as exc:
             errors["base"] = "connect_timeout"
-        except ProxmoxAuthenticationError:
+            err = exc
+        except ProxmoxAuthenticationError as exc:
             errors["base"] = "invalid_auth"
-        except ProxmoxSSLError:
+            err = exc
+        except ProxmoxSSLError as exc:
             errors["base"] = "ssl_error"
-        except ProxmoxNoNodesFound:
+            err = exc
+        except ProxmoxNoNodesFound as exc:
             errors["base"] = "no_nodes_found"
+            err = exc
+        except ProxmoxConnectionError as exc:
+            errors["base"] = "cannot_connect"
+            err = exc
+        if err is not None:
+            _LOGGER.debug("Error: %s: %s", errors["base"], err)
         return proxmox_nodes, errors
async def async_step_import(self, import_data: dict[str, Any]) -> ConfigFlowResult:
@@ -227,6 +243,8 @@ class ProxmoxveConfigFlow(ConfigFlow, domain=DOMAIN):
return self.async_abort(reason="ssl_error")
except ProxmoxNoNodesFound:
return self.async_abort(reason="no_nodes_found")
except ProxmoxConnectionError:
return self.async_abort(reason="cannot_connect")
return self.async_create_entry(
title=import_data[CONF_HOST],
@@ -234,17 +252,25 @@ class ProxmoxveConfigFlow(ConfigFlow, domain=DOMAIN):
)
-class ProxmoxNoNodesFound(HomeAssistantError):
+class ProxmoxError(HomeAssistantError):
+    """Base class for Proxmox VE errors."""
+
+class ProxmoxNoNodesFound(ProxmoxError):
     """Error to indicate no nodes found."""
-class ProxmoxConnectTimeout(HomeAssistantError):
+class ProxmoxConnectTimeout(ProxmoxError):
     """Error to indicate a connection timeout."""
-class ProxmoxSSLError(HomeAssistantError):
+class ProxmoxSSLError(ProxmoxError):
     """Error to indicate an SSL error."""
-class ProxmoxAuthenticationError(HomeAssistantError):
+class ProxmoxAuthenticationError(ProxmoxError):
     """Error to indicate an authentication error."""
+class ProxmoxConnectionError(ProxmoxError):
+    """Error to indicate a connection error."""


@@ -101,12 +101,18 @@ class ProxmoxCoordinator(DataUpdateCoordinator[dict[str, ProxmoxNodeData]]):
translation_key="timeout_connect",
translation_placeholders={"error": repr(err)},
) from err
-        except (ResourceException, requests.exceptions.ConnectionError) as err:
+        except ResourceException as err:
             raise ConfigEntryError(
                 translation_domain=DOMAIN,
                 translation_key="no_nodes_found",
                 translation_placeholders={"error": repr(err)},
             ) from err
+        except requests.exceptions.ConnectionError as err:
+            raise ConfigEntryError(
+                translation_domain=DOMAIN,
+                translation_key="cannot_connect",
+                translation_placeholders={"error": repr(err)},
+            ) from err
async def _async_update_data(self) -> dict[str, ProxmoxNodeData]:
"""Fetch data from Proxmox VE API."""
@@ -133,12 +139,18 @@ class ProxmoxCoordinator(DataUpdateCoordinator[dict[str, ProxmoxNodeData]]):
translation_key="timeout_connect",
translation_placeholders={"error": repr(err)},
) from err
-        except (ResourceException, requests.exceptions.ConnectionError) as err:
+        except ResourceException as err:
             raise UpdateFailed(
                 translation_domain=DOMAIN,
                 translation_key="no_nodes_found",
                 translation_placeholders={"error": repr(err)},
             ) from err
+        except requests.exceptions.ConnectionError as err:
+            raise UpdateFailed(
+                translation_domain=DOMAIN,
+                translation_key="cannot_connect",
+                translation_placeholders={"error": repr(err)},
+            ) from err
data: dict[str, ProxmoxNodeData] = {}
for node, (vms, containers) in zip(nodes, vms_containers, strict=True):


@@ -13,6 +13,71 @@
"stop": {
"default": "mdi:stop"
}
},
"sensor": {
"container_cpu": {
"default": "mdi:cpu-64-bit"
},
"container_disk": {
"default": "mdi:harddisk"
},
"container_max_cpu": {
"default": "mdi:cpu-64-bit"
},
"container_max_disk": {
"default": "mdi:harddisk"
},
"container_max_memory": {
"default": "mdi:memory"
},
"container_memory": {
"default": "mdi:memory"
},
"container_status": {
"default": "mdi:server"
},
"node_cpu": {
"default": "mdi:cpu-64-bit"
},
"node_disk": {
"default": "mdi:harddisk"
},
"node_max_cpu": {
"default": "mdi:cpu-64-bit"
},
"node_max_disk": {
"default": "mdi:harddisk"
},
"node_max_memory": {
"default": "mdi:memory"
},
"node_memory": {
"default": "mdi:memory"
},
"node_status": {
"default": "mdi:server"
},
"vm_cpu": {
"default": "mdi:cpu-64-bit"
},
"vm_disk": {
"default": "mdi:harddisk"
},
"vm_max_cpu": {
"default": "mdi:cpu-64-bit"
},
"vm_max_disk": {
"default": "mdi:harddisk"
},
"vm_max_memory": {
"default": "mdi:memory"
},
"vm_memory": {
"default": "mdi:memory"
},
"vm_status": {
"default": "mdi:server"
}
}
}
}


@@ -0,0 +1,386 @@
"""Sensor platform for Proxmox VE integration."""
from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass
from typing import Any
from homeassistant.components.sensor import (
EntityCategory,
SensorDeviceClass,
SensorEntity,
SensorEntityDescription,
SensorStateClass,
StateType,
)
from homeassistant.const import PERCENTAGE, UnitOfInformation
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import ProxmoxConfigEntry, ProxmoxCoordinator, ProxmoxNodeData
from .entity import ProxmoxContainerEntity, ProxmoxNodeEntity, ProxmoxVMEntity
@dataclass(frozen=True, kw_only=True)
class ProxmoxNodeSensorEntityDescription(SensorEntityDescription):
"""Class to hold Proxmox node sensor description."""
value_fn: Callable[[ProxmoxNodeData], StateType]
@dataclass(frozen=True, kw_only=True)
class ProxmoxVMSensorEntityDescription(SensorEntityDescription):
"""Class to hold Proxmox VM sensor description."""
value_fn: Callable[[dict[str, Any]], StateType]
@dataclass(frozen=True, kw_only=True)
class ProxmoxContainerSensorEntityDescription(SensorEntityDescription):
"""Class to hold Proxmox container sensor description."""
value_fn: Callable[[dict[str, Any]], StateType]
NODE_SENSORS: tuple[ProxmoxNodeSensorEntityDescription, ...] = (
ProxmoxNodeSensorEntityDescription(
key="node_cpu",
translation_key="node_cpu",
value_fn=lambda data: data.node["cpu"] * 100,
native_unit_of_measurement=PERCENTAGE,
entity_category=EntityCategory.DIAGNOSTIC,
suggested_display_precision=2,
state_class=SensorStateClass.MEASUREMENT,
),
ProxmoxNodeSensorEntityDescription(
key="node_max_cpu",
translation_key="node_max_cpu",
value_fn=lambda data: data.node["maxcpu"],
),
ProxmoxNodeSensorEntityDescription(
key="node_disk",
translation_key="node_disk",
value_fn=lambda data: data.node["disk"],
device_class=SensorDeviceClass.DATA_SIZE,
native_unit_of_measurement=UnitOfInformation.BYTES,
suggested_unit_of_measurement=UnitOfInformation.GIBIBYTES,
suggested_display_precision=1,
entity_category=EntityCategory.DIAGNOSTIC,
state_class=SensorStateClass.MEASUREMENT,
),
ProxmoxNodeSensorEntityDescription(
key="node_max_disk",
translation_key="node_max_disk",
value_fn=lambda data: data.node["maxdisk"],
device_class=SensorDeviceClass.DATA_SIZE,
native_unit_of_measurement=UnitOfInformation.BYTES,
suggested_unit_of_measurement=UnitOfInformation.GIBIBYTES,
suggested_display_precision=1,
entity_category=EntityCategory.DIAGNOSTIC,
state_class=SensorStateClass.MEASUREMENT,
),
ProxmoxNodeSensorEntityDescription(
key="node_memory",
translation_key="node_memory",
value_fn=lambda data: data.node["mem"],
device_class=SensorDeviceClass.DATA_SIZE,
native_unit_of_measurement=UnitOfInformation.BYTES,
suggested_unit_of_measurement=UnitOfInformation.GIBIBYTES,
suggested_display_precision=1,
entity_category=EntityCategory.DIAGNOSTIC,
state_class=SensorStateClass.MEASUREMENT,
),
ProxmoxNodeSensorEntityDescription(
key="node_max_memory",
translation_key="node_max_memory",
value_fn=lambda data: data.node["maxmem"],
device_class=SensorDeviceClass.DATA_SIZE,
native_unit_of_measurement=UnitOfInformation.BYTES,
suggested_unit_of_measurement=UnitOfInformation.GIBIBYTES,
suggested_display_precision=1,
entity_category=EntityCategory.DIAGNOSTIC,
state_class=SensorStateClass.MEASUREMENT,
),
ProxmoxNodeSensorEntityDescription(
key="node_status",
translation_key="node_status",
value_fn=lambda data: data.node["status"],
device_class=SensorDeviceClass.ENUM,
options=["online", "offline"],
),
)
VM_SENSORS: tuple[ProxmoxVMSensorEntityDescription, ...] = (
ProxmoxVMSensorEntityDescription(
key="vm_max_cpu",
translation_key="vm_max_cpu",
value_fn=lambda data: data["cpus"],
),
ProxmoxVMSensorEntityDescription(
key="vm_cpu",
translation_key="vm_cpu",
value_fn=lambda data: data["cpu"] * 100,
native_unit_of_measurement=PERCENTAGE,
entity_category=EntityCategory.DIAGNOSTIC,
suggested_display_precision=2,
state_class=SensorStateClass.MEASUREMENT,
),
ProxmoxVMSensorEntityDescription(
key="vm_memory",
translation_key="vm_memory",
value_fn=lambda data: data["mem"],
device_class=SensorDeviceClass.DATA_SIZE,
native_unit_of_measurement=UnitOfInformation.BYTES,
suggested_unit_of_measurement=UnitOfInformation.GIBIBYTES,
suggested_display_precision=1,
entity_category=EntityCategory.DIAGNOSTIC,
state_class=SensorStateClass.MEASUREMENT,
),
ProxmoxVMSensorEntityDescription(
key="vm_max_memory",
translation_key="vm_max_memory",
value_fn=lambda data: data["maxmem"],
device_class=SensorDeviceClass.DATA_SIZE,
native_unit_of_measurement=UnitOfInformation.BYTES,
suggested_unit_of_measurement=UnitOfInformation.GIBIBYTES,
suggested_display_precision=1,
entity_category=EntityCategory.DIAGNOSTIC,
state_class=SensorStateClass.MEASUREMENT,
),
ProxmoxVMSensorEntityDescription(
key="vm_disk",
translation_key="vm_disk",
value_fn=lambda data: data["disk"],
device_class=SensorDeviceClass.DATA_SIZE,
native_unit_of_measurement=UnitOfInformation.BYTES,
suggested_unit_of_measurement=UnitOfInformation.GIBIBYTES,
suggested_display_precision=1,
entity_category=EntityCategory.DIAGNOSTIC,
state_class=SensorStateClass.MEASUREMENT,
),
ProxmoxVMSensorEntityDescription(
key="vm_max_disk",
translation_key="vm_max_disk",
value_fn=lambda data: data["maxdisk"],
device_class=SensorDeviceClass.DATA_SIZE,
native_unit_of_measurement=UnitOfInformation.BYTES,
suggested_unit_of_measurement=UnitOfInformation.GIBIBYTES,
suggested_display_precision=1,
entity_category=EntityCategory.DIAGNOSTIC,
state_class=SensorStateClass.MEASUREMENT,
),
ProxmoxVMSensorEntityDescription(
key="vm_status",
translation_key="vm_status",
value_fn=lambda data: data["status"],
device_class=SensorDeviceClass.ENUM,
options=["running", "stopped", "suspended"],
),
)
CONTAINER_SENSORS: tuple[ProxmoxContainerSensorEntityDescription, ...] = (
ProxmoxContainerSensorEntityDescription(
key="container_max_cpu",
translation_key="container_max_cpu",
value_fn=lambda data: data["cpus"],
),
ProxmoxContainerSensorEntityDescription(
key="container_cpu",
translation_key="container_cpu",
value_fn=lambda data: data["cpu"] * 100,
native_unit_of_measurement=PERCENTAGE,
entity_category=EntityCategory.DIAGNOSTIC,
suggested_display_precision=2,
state_class=SensorStateClass.MEASUREMENT,
),
ProxmoxContainerSensorEntityDescription(
key="container_memory",
translation_key="container_memory",
value_fn=lambda data: data["mem"],
device_class=SensorDeviceClass.DATA_SIZE,
native_unit_of_measurement=UnitOfInformation.BYTES,
suggested_unit_of_measurement=UnitOfInformation.GIBIBYTES,
suggested_display_precision=1,
entity_category=EntityCategory.DIAGNOSTIC,
state_class=SensorStateClass.MEASUREMENT,
),
ProxmoxContainerSensorEntityDescription(
key="container_max_memory",
translation_key="container_max_memory",
value_fn=lambda data: data["maxmem"],
device_class=SensorDeviceClass.DATA_SIZE,
native_unit_of_measurement=UnitOfInformation.BYTES,
suggested_unit_of_measurement=UnitOfInformation.GIBIBYTES,
suggested_display_precision=1,
entity_category=EntityCategory.DIAGNOSTIC,
state_class=SensorStateClass.MEASUREMENT,
),
ProxmoxContainerSensorEntityDescription(
key="container_disk",
translation_key="container_disk",
value_fn=lambda data: data["disk"],
device_class=SensorDeviceClass.DATA_SIZE,
native_unit_of_measurement=UnitOfInformation.BYTES,
suggested_unit_of_measurement=UnitOfInformation.GIBIBYTES,
suggested_display_precision=1,
entity_category=EntityCategory.DIAGNOSTIC,
state_class=SensorStateClass.MEASUREMENT,
),
ProxmoxContainerSensorEntityDescription(
key="container_max_disk",
translation_key="container_max_disk",
value_fn=lambda data: data["maxdisk"],
device_class=SensorDeviceClass.DATA_SIZE,
native_unit_of_measurement=UnitOfInformation.BYTES,
suggested_unit_of_measurement=UnitOfInformation.GIBIBYTES,
suggested_display_precision=1,
entity_category=EntityCategory.DIAGNOSTIC,
state_class=SensorStateClass.MEASUREMENT,
),
ProxmoxContainerSensorEntityDescription(
key="container_status",
translation_key="container_status",
value_fn=lambda data: data["status"],
device_class=SensorDeviceClass.ENUM,
options=["running", "stopped", "suspended"],
),
)
async def async_setup_entry(
hass: HomeAssistant,
entry: ProxmoxConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Proxmox VE sensors."""
coordinator = entry.runtime_data
def _async_add_new_nodes(nodes: list[ProxmoxNodeData]) -> None:
"""Add new node sensors."""
async_add_entities(
ProxmoxNodeSensor(coordinator, entity_description, node)
for node in nodes
for entity_description in NODE_SENSORS
)
def _async_add_new_vms(
vms: list[tuple[ProxmoxNodeData, dict[str, Any]]],
) -> None:
"""Add new VM sensors."""
async_add_entities(
ProxmoxVMSensor(coordinator, entity_description, vm, node_data)
for (node_data, vm) in vms
for entity_description in VM_SENSORS
)
def _async_add_new_containers(
containers: list[tuple[ProxmoxNodeData, dict[str, Any]]],
) -> None:
"""Add new container sensors."""
async_add_entities(
ProxmoxContainerSensor(
coordinator, entity_description, container, node_data
)
for (node_data, container) in containers
for entity_description in CONTAINER_SENSORS
)
coordinator.new_nodes_callbacks.append(_async_add_new_nodes)
coordinator.new_vms_callbacks.append(_async_add_new_vms)
coordinator.new_containers_callbacks.append(_async_add_new_containers)
_async_add_new_nodes(
[
node_data
for node_data in coordinator.data.values()
if node_data.node["node"] in coordinator.known_nodes
]
)
_async_add_new_vms(
[
(node_data, vm_data)
for node_data in coordinator.data.values()
for vmid, vm_data in node_data.vms.items()
if (node_data.node["node"], vmid) in coordinator.known_vms
]
)
_async_add_new_containers(
[
(node_data, container_data)
for node_data in coordinator.data.values()
for vmid, container_data in node_data.containers.items()
if (node_data.node["node"], vmid) in coordinator.known_containers
]
)
class ProxmoxNodeSensor(ProxmoxNodeEntity, SensorEntity):
"""Representation of a Proxmox VE node sensor."""
entity_description: ProxmoxNodeSensorEntityDescription
def __init__(
self,
coordinator: ProxmoxCoordinator,
entity_description: ProxmoxNodeSensorEntityDescription,
node_data: ProxmoxNodeData,
) -> None:
"""Initialize the sensor."""
super().__init__(coordinator, node_data)
self.entity_description = entity_description
self._attr_unique_id = f"{coordinator.config_entry.entry_id}_{node_data.node['id']}_{entity_description.key}"
@property
def native_value(self) -> StateType:
"""Return the native value of the sensor."""
return self.entity_description.value_fn(self.coordinator.data[self.device_name])
class ProxmoxVMSensor(ProxmoxVMEntity, SensorEntity):
"""Represents a Proxmox VE VM sensor."""
entity_description: ProxmoxVMSensorEntityDescription
def __init__(
self,
coordinator: ProxmoxCoordinator,
entity_description: ProxmoxVMSensorEntityDescription,
vm_data: dict[str, Any],
node_data: ProxmoxNodeData,
) -> None:
"""Initialize the Proxmox VM sensor."""
self.entity_description = entity_description
super().__init__(coordinator, vm_data, node_data)
self._attr_unique_id = f"{coordinator.config_entry.entry_id}_{self.device_id}_{entity_description.key}"
@property
def native_value(self) -> StateType:
"""Return the native value of the sensor."""
return self.entity_description.value_fn(self.vm_data)
class ProxmoxContainerSensor(ProxmoxContainerEntity, SensorEntity):
"""Represents a Proxmox VE container sensor."""
entity_description: ProxmoxContainerSensorEntityDescription
def __init__(
self,
coordinator: ProxmoxCoordinator,
entity_description: ProxmoxContainerSensorEntityDescription,
container_data: dict[str, Any],
node_data: ProxmoxNodeData,
) -> None:
"""Initialize the Proxmox container sensor."""
self.entity_description = entity_description
super().__init__(coordinator, container_data, node_data)
self._attr_unique_id = f"{coordinator.config_entry.entry_id}_{self.device_id}_{entity_description.key}"
@property
def native_value(self) -> StateType:
"""Return the native value of the sensor."""
return self.entity_description.value_fn(self.container_data)


@@ -77,6 +77,85 @@
"stop_all": {
"name": "Stop all"
}
},
"sensor": {
"container_cpu": {
"name": "CPU usage"
},
"container_disk": {
"name": "Disk usage"
},
"container_max_cpu": {
"name": "Max CPU"
},
"container_max_disk": {
"name": "Max disk usage"
},
"container_max_memory": {
"name": "Max memory usage"
},
"container_memory": {
"name": "Memory usage"
},
"container_status": {
"name": "Status",
"state": {
"running": "Running",
"stopped": "Stopped",
"suspended": "Suspended"
}
},
"node_cpu": {
"name": "CPU usage"
},
"node_disk": {
"name": "Disk usage"
},
"node_max_cpu": {
"name": "Max CPU"
},
"node_max_disk": {
"name": "Max disk usage"
},
"node_max_memory": {
"name": "Max memory usage"
},
"node_memory": {
"name": "Memory usage"
},
"node_status": {
"name": "Status",
"state": {
"offline": "Offline",
"online": "Online"
}
},
"vm_cpu": {
"name": "CPU usage"
},
"vm_disk": {
"name": "Disk usage"
},
"vm_max_cpu": {
"name": "Max CPU"
},
"vm_max_disk": {
"name": "Max disk usage"
},
"vm_max_memory": {
"name": "Max memory usage"
},
"vm_memory": {
"name": "Memory usage"
},
"vm_status": {
"name": "Status",
"state": {
"running": "Running",
"stopped": "Stopped",
"suspended": "Suspended"
}
}
}
},
"exceptions": {
@@ -109,6 +188,10 @@
}
},
"issues": {
"deprecated_yaml_import_issue_cannot_connect": {
"description": "Configuring {integration_title} via YAML is deprecated and will be removed in a future release. While importing your configuration, a connection error occurred. Please correct your YAML configuration and restart Home Assistant, or remove the {domain} key from your configuration and configure the integration via the UI.",
"title": "[%key:component::proxmoxve::issues::deprecated_yaml_import_issue_connect_timeout::title%]"
},
"deprecated_yaml_import_issue_connect_timeout": {
"description": "Configuring {integration_title} via YAML is deprecated and will be removed in a future release. While importing your configuration, a connection timeout occurred. Please correct your YAML configuration and restart Home Assistant, or remove the {domain} key from your configuration and configure the integration via the UI.",
"title": "The {integration_title} YAML configuration is being removed"


@@ -6,6 +6,7 @@ from collections.abc import Callable
from dataclasses import dataclass
from roborock.data import CleanFluidStatus, RoborockStateCode
from roborock.roborock_message import RoborockZeoProtocol
from homeassistant.components.binary_sensor import (
BinarySensorDeviceClass,
@@ -15,9 +16,15 @@ from homeassistant.components.binary_sensor import (
from homeassistant.const import ATTR_BATTERY_CHARGING, EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import StateType
-from .coordinator import RoborockConfigEntry, RoborockDataUpdateCoordinator
-from .entity import RoborockCoordinatedEntityV1
+from .coordinator import (
+    RoborockConfigEntry,
+    RoborockDataUpdateCoordinator,
+    RoborockDataUpdateCoordinatorA01,
+    RoborockWashingMachineUpdateCoordinator,
+)
+from .entity import RoborockCoordinatedEntityA01, RoborockCoordinatedEntityV1
 from .models import DeviceState
PARALLEL_UPDATES = 0
@@ -34,6 +41,14 @@ class RoborockBinarySensorDescription(BinarySensorEntityDescription):
"""Whether this sensor is for the dock."""
@dataclass(frozen=True, kw_only=True)
class RoborockBinarySensorDescriptionA01(BinarySensorEntityDescription):
"""A class that describes Roborock A01 binary sensors."""
data_protocol: RoborockZeoProtocol
value_fn: Callable[[StateType], bool]
BINARY_SENSOR_DESCRIPTIONS = [
RoborockBinarySensorDescription(
key="dry_status",
@@ -111,13 +126,33 @@ BINARY_SENSOR_DESCRIPTIONS = [
]
ZEO_BINARY_SENSOR_DESCRIPTIONS: list[RoborockBinarySensorDescriptionA01] = [
RoborockBinarySensorDescriptionA01(
key="detergent_empty",
data_protocol=RoborockZeoProtocol.DETERGENT_EMPTY,
device_class=BinarySensorDeviceClass.PROBLEM,
translation_key="detergent_empty",
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=bool,
),
RoborockBinarySensorDescriptionA01(
key="softener_empty",
data_protocol=RoborockZeoProtocol.SOFTENER_EMPTY,
device_class=BinarySensorDeviceClass.PROBLEM,
translation_key="softener_empty",
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=bool,
),
]
async def async_setup_entry(
hass: HomeAssistant,
config_entry: RoborockConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the Roborock vacuum binary sensors."""
-    async_add_entities(
+    entities: list[BinarySensorEntity] = [
RoborockBinarySensorEntity(
coordinator,
description,
@@ -125,7 +160,18 @@ async def async_setup_entry(
for coordinator in config_entry.runtime_data.v1
for description in BINARY_SENSOR_DESCRIPTIONS
if description.value_fn(coordinator.data) is not None
]
entities.extend(
RoborockBinarySensorEntityA01(
coordinator,
description,
)
for coordinator in config_entry.runtime_data.a01
if isinstance(coordinator, RoborockWashingMachineUpdateCoordinator)
for description in ZEO_BINARY_SENSOR_DESCRIPTIONS
if description.data_protocol in coordinator.request_protocols
)
async_add_entities(entities)
class RoborockBinarySensorEntity(RoborockCoordinatedEntityV1, BinarySensorEntity):
@@ -150,3 +196,24 @@ class RoborockBinarySensorEntity(RoborockCoordinatedEntityV1, BinarySensorEntity
def is_on(self) -> bool:
"""Return the value reported by the sensor."""
return bool(self.entity_description.value_fn(self.coordinator.data))
class RoborockBinarySensorEntityA01(RoborockCoordinatedEntityA01, BinarySensorEntity):
"""Representation of a A01 Roborock binary sensor."""
entity_description: RoborockBinarySensorDescriptionA01
def __init__(
self,
coordinator: RoborockDataUpdateCoordinatorA01,
description: RoborockBinarySensorDescriptionA01,
) -> None:
"""Initialize the entity."""
self.entity_description = description
super().__init__(f"{description.key}_{coordinator.duid_slug}", coordinator)
@property
def is_on(self) -> bool:
"""Return the value reported by the sensor."""
value = self.coordinator.data[self.entity_description.data_protocol]
return self.entity_description.value_fn(value)
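The Zeo binary sensors above are only instantiated when their protocol appears in the coordinator's `request_protocols`, and each one reads its state from a protocol-keyed payload via `value_fn`. A self-contained sketch of that gating; `Protocol` and the helper are stand-ins, not the roborock API:

```python
from dataclasses import dataclass
from enum import Enum

class Protocol(Enum):
    """Stand-in for RoborockZeoProtocol message identifiers."""
    DETERGENT_EMPTY = 1
    SOFTENER_EMPTY = 2
    ERROR = 3

@dataclass(frozen=True)
class BinarySensorDesc:
    """Pairs an entity key with the protocol whose value backs it."""
    key: str
    protocol: Protocol

DESCRIPTIONS = [
    BinarySensorDesc("detergent_empty", Protocol.DETERGENT_EMPTY),
    BinarySensorDesc("softener_empty", Protocol.SOFTENER_EMPTY),
]

def build_sensors(supported: set[Protocol]) -> list[str]:
    """Create only the sensors whose protocol the device actually reports."""
    return [d.key for d in DESCRIPTIONS if d.protocol in supported]
```

Filtering at setup time avoids registering entities the device will never populate.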


@@ -10,6 +10,7 @@ from typing import Any
from roborock.devices.traits.v1.consumeable import ConsumableAttribute
from roborock.exceptions import RoborockException
from roborock.roborock_message import RoborockZeoProtocol
from homeassistant.components.button import ButtonEntity, ButtonEntityDescription
from homeassistant.const import EntityCategory
@@ -18,8 +19,13 @@ from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN
-from .coordinator import RoborockConfigEntry, RoborockDataUpdateCoordinator
-from .entity import RoborockEntity, RoborockEntityV1
+from .coordinator import (
+    RoborockConfigEntry,
+    RoborockDataUpdateCoordinator,
+    RoborockDataUpdateCoordinatorA01,
+    RoborockWashingMachineUpdateCoordinator,
+)
+from .entity import RoborockCoordinatedEntityA01, RoborockEntity, RoborockEntityV1
_LOGGER = logging.getLogger(__name__)
@@ -65,6 +71,32 @@ CONSUMABLE_BUTTON_DESCRIPTIONS = [
]
@dataclass(frozen=True, kw_only=True)
class RoborockButtonDescriptionA01(ButtonEntityDescription):
"""Describes a Roborock A01 button entity."""
data_protocol: RoborockZeoProtocol
ZEO_BUTTON_DESCRIPTIONS = [
RoborockButtonDescriptionA01(
key="start",
data_protocol=RoborockZeoProtocol.START,
translation_key="start",
),
RoborockButtonDescriptionA01(
key="pause",
data_protocol=RoborockZeoProtocol.PAUSE,
translation_key="pause",
),
RoborockButtonDescriptionA01(
key="shutdown",
data_protocol=RoborockZeoProtocol.SHUTDOWN,
translation_key="shutdown",
),
]
async def async_setup_entry(
hass: HomeAssistant,
config_entry: RoborockConfigEntry,
@@ -98,6 +130,15 @@ async def async_setup_entry(
)
for routine in routines
),
(
RoborockButtonEntityA01(
coordinator,
description,
)
for coordinator in config_entry.runtime_data.a01
if isinstance(coordinator, RoborockWashingMachineUpdateCoordinator)
for description in ZEO_BUTTON_DESCRIPTIONS
),
)
)
@@ -160,3 +201,35 @@ class RoborockRoutineButtonEntity(RoborockEntity, ButtonEntity):
async def async_press(self, **kwargs: Any) -> None:
"""Press the button."""
await self._coordinator.execute_routines(self._routine_id)
class RoborockButtonEntityA01(RoborockCoordinatedEntityA01, ButtonEntity):
"""A class to define Roborock A01 button entities."""
entity_description: RoborockButtonDescriptionA01
def __init__(
self,
coordinator: RoborockDataUpdateCoordinatorA01,
entity_description: RoborockButtonDescriptionA01,
) -> None:
"""Create an A01 button entity."""
self.entity_description = entity_description
super().__init__(
f"{entity_description.key}_{coordinator.duid_slug}", coordinator
)
async def async_press(self) -> None:
"""Press the button."""
try:
await self.coordinator.api.set_value( # type: ignore[attr-defined]
self.entity_description.data_protocol,
1,
)
except RoborockException as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="button_press_failed",
) from err
finally:
await self.coordinator.async_request_refresh()


@@ -432,6 +432,18 @@ class RoborockWashingMachineUpdateCoordinator(
RoborockZeoProtocol.COUNTDOWN,
RoborockZeoProtocol.WASHING_LEFT,
RoborockZeoProtocol.ERROR,
RoborockZeoProtocol.TIMES_AFTER_CLEAN,
RoborockZeoProtocol.DETERGENT_EMPTY,
RoborockZeoProtocol.SOFTENER_EMPTY,
RoborockZeoProtocol.DETERGENT_TYPE,
RoborockZeoProtocol.SOFTENER_TYPE,
RoborockZeoProtocol.MODE,
RoborockZeoProtocol.PROGRAM,
RoborockZeoProtocol.TEMP,
RoborockZeoProtocol.RINSE_TIMES,
RoborockZeoProtocol.SPIN_LEVEL,
RoborockZeoProtocol.DRYING_MODE,
RoborockZeoProtocol.SOUND_SET,
]
async def _async_update_data(


@@ -3,21 +3,35 @@
import asyncio
from collections.abc import Awaitable, Callable
from dataclasses import dataclass
import logging
from typing import Any
from roborock import B01Props, CleanTypeMapping
-from roborock.data import RoborockDockDustCollectionModeCode, WaterLevelMapping
+from roborock.data import (
+    RoborockDockDustCollectionModeCode,
+    RoborockEnum,
+    WaterLevelMapping,
+    ZeoDetergentType,
+    ZeoDryingMode,
+    ZeoMode,
+    ZeoProgram,
+    ZeoRinse,
+    ZeoSoftenerType,
+    ZeoSpin,
+    ZeoTemperature,
+)
from roborock.devices.traits.b01 import Q7PropertiesApi
from roborock.devices.traits.v1 import PropertiesApi
from roborock.devices.traits.v1.home import HomeTrait
from roborock.devices.traits.v1.maps import MapsTrait
from roborock.exceptions import RoborockException
from roborock.roborock_message import RoborockZeoProtocol
from roborock.roborock_typing import RoborockCommand
from homeassistant.components.select import SelectEntity, SelectEntityDescription
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
-from homeassistant.exceptions import HomeAssistantError
+from homeassistant.exceptions import HomeAssistantError, ServiceValidationError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN, MAP_SLEEP
@@ -25,11 +39,18 @@ from .coordinator import (
RoborockB01Q7UpdateCoordinator,
RoborockConfigEntry,
RoborockDataUpdateCoordinator,
RoborockDataUpdateCoordinatorA01,
)
-from .entity import RoborockCoordinatedEntityB01Q7, RoborockCoordinatedEntityV1
+from .entity import (
+    RoborockCoordinatedEntityA01,
+    RoborockCoordinatedEntityB01Q7,
+    RoborockCoordinatedEntityV1,
+)
PARALLEL_UPDATES = 0
_LOGGER = logging.getLogger(__name__)
@dataclass(frozen=True, kw_only=True)
class RoborockSelectDescription(SelectEntityDescription):
@@ -65,6 +86,16 @@ class RoborockB01SelectDescription(SelectEntityDescription):
"""Function to get all options of the select entity or returns None if not supported."""
@dataclass(frozen=True, kw_only=True)
class RoborockSelectDescriptionA01(SelectEntityDescription):
"""Class to describe a Roborock A01 select entity."""
# The protocol that the select entity will send to the api.
data_protocol: RoborockZeoProtocol
# Enum class for the select entity
enum_class: type[RoborockEnum]
B01_SELECT_DESCRIPTIONS: list[RoborockB01SelectDescription] = [
RoborockB01SelectDescription(
key="water_flow",
@@ -139,6 +170,66 @@ SELECT_DESCRIPTIONS: list[RoborockSelectDescription] = [
]
A01_SELECT_DESCRIPTIONS: list[RoborockSelectDescriptionA01] = [
RoborockSelectDescriptionA01(
key="program",
data_protocol=RoborockZeoProtocol.PROGRAM,
translation_key="program",
entity_category=EntityCategory.CONFIG,
enum_class=ZeoProgram,
),
RoborockSelectDescriptionA01(
key="mode",
data_protocol=RoborockZeoProtocol.MODE,
translation_key="mode",
entity_category=EntityCategory.CONFIG,
enum_class=ZeoMode,
),
RoborockSelectDescriptionA01(
key="temperature",
data_protocol=RoborockZeoProtocol.TEMP,
translation_key="temperature",
entity_category=EntityCategory.CONFIG,
enum_class=ZeoTemperature,
),
RoborockSelectDescriptionA01(
key="drying_mode",
data_protocol=RoborockZeoProtocol.DRYING_MODE,
translation_key="drying_mode",
entity_category=EntityCategory.CONFIG,
enum_class=ZeoDryingMode,
),
RoborockSelectDescriptionA01(
key="spin_level",
data_protocol=RoborockZeoProtocol.SPIN_LEVEL,
translation_key="spin_level",
entity_category=EntityCategory.CONFIG,
enum_class=ZeoSpin,
),
RoborockSelectDescriptionA01(
key="rinse_times",
data_protocol=RoborockZeoProtocol.RINSE_TIMES,
translation_key="rinse_times",
entity_category=EntityCategory.CONFIG,
enum_class=ZeoRinse,
),
RoborockSelectDescriptionA01(
key="detergent_type",
data_protocol=RoborockZeoProtocol.DETERGENT_TYPE,
translation_key="detergent_type",
entity_category=EntityCategory.CONFIG,
enum_class=ZeoDetergentType,
),
RoborockSelectDescriptionA01(
key="softener_type",
data_protocol=RoborockZeoProtocol.SOFTENER_TYPE,
translation_key="softener_type",
entity_category=EntityCategory.CONFIG,
enum_class=ZeoSoftenerType,
),
]
async def async_setup_entry(
hass: HomeAssistant,
config_entry: RoborockConfigEntry,
@@ -169,6 +260,12 @@ async def async_setup_entry(
for description in B01_SELECT_DESCRIPTIONS
if (options := description.options_lambda(coordinator.api)) is not None
)
async_add_entities(
RoborockSelectEntityA01(coordinator, description)
for coordinator in config_entry.runtime_data.a01
for description in A01_SELECT_DESCRIPTIONS
if description.data_protocol in coordinator.request_protocols
)
class RoborockB01SelectEntity(RoborockCoordinatedEntityB01Q7, SelectEntity):
@@ -308,3 +405,64 @@ class RoborockCurrentMapSelectEntity(RoborockCoordinatedEntityV1, SelectEntity):
if current_map_info := self._home_trait.current_map_data:
return current_map_info.name or f"Map {current_map_info.map_flag}"
return None
class RoborockSelectEntityA01(RoborockCoordinatedEntityA01, SelectEntity):
"""A class to let you set options on a Roborock A01 device."""
entity_description: RoborockSelectDescriptionA01
def __init__(
self,
coordinator: RoborockDataUpdateCoordinatorA01,
entity_description: RoborockSelectDescriptionA01,
) -> None:
"""Create an A01 select entity."""
self.entity_description = entity_description
super().__init__(
f"{entity_description.key}_{coordinator.duid_slug}",
coordinator,
)
self._attr_options = list(entity_description.enum_class.keys())
async def async_select_option(self, option: str) -> None:
"""Set the option."""
# Get the protocol value for the selected option
option_values = self.entity_description.enum_class.as_dict()
if option not in option_values:
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="select_option_failed",
)
value = option_values[option]
try:
await self.coordinator.api.set_value( # type: ignore[attr-defined]
self.entity_description.data_protocol,
value,
)
except RoborockException as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="command_failed",
translation_placeholders={
"command": self.entity_description.key,
},
) from err
await self.coordinator.async_request_refresh()
@property
def current_option(self) -> str | None:
"""Get the current status of the select entity from coordinator data."""
if self.entity_description.data_protocol not in self.coordinator.data:
return None
current_value = self.coordinator.data[self.entity_description.data_protocol]
if current_value is None:
return None
_LOGGER.debug(
"current_value: %s for %s",
current_value,
self.entity_description.key,
)
return str(current_value)
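The A01 select entity above round-trips between human-readable option labels and protocol values through the enum class: `keys()` supplies `_attr_options`, and `as_dict()` resolves the selected label before calling `set_value`. A minimal sketch of that mapping, using a hypothetical stand-in for the library's `RoborockEnum` helpers (the enum name and members here are illustrative, not the real `ZeoTemperature` values):

```python
from enum import Enum


class ZeoTemperatureSketch(Enum):
    """Hypothetical stand-in for a RoborockEnum-style option enum."""

    cold = 1
    thirty = 2

    @classmethod
    def keys(cls) -> list[str]:
        # Option labels exposed to the user via _attr_options.
        return [member.name for member in cls]

    @classmethod
    def as_dict(cls) -> dict[str, int]:
        # Label -> protocol value, used when sending the selection.
        return {member.name: member.value for member in cls}


def resolve_option(option: str) -> int:
    """Validate a selected option and return the protocol value to send."""
    values = ZeoTemperatureSketch.as_dict()
    if option not in values:
        # Mirrors the ServiceValidationError raised in async_select_option.
        raise ValueError(f"Unknown option: {option}")
    return values[option]
```

Because the lookup is by member name, an option string that never came from `keys()` fails validation before any command is sent to the device.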


@@ -37,6 +37,8 @@ from .coordinator import (
RoborockConfigEntry,
RoborockDataUpdateCoordinator,
RoborockDataUpdateCoordinatorA01,
RoborockWashingMachineUpdateCoordinator,
RoborockWetDryVacUpdateCoordinator,
)
from .entity import (
RoborockCoordinatedEntityA01,
@@ -252,7 +254,7 @@ SENSOR_DESCRIPTIONS = [
),
]
A01_SENSOR_DESCRIPTIONS: list[RoborockSensorDescriptionA01] = [
DYAD_SENSOR_DESCRIPTIONS: list[RoborockSensorDescriptionA01] = [
RoborockSensorDescriptionA01(
key="status",
data_protocol=RoborockDyadDataProtocol.STATUS,
@@ -303,6 +305,9 @@ A01_SENSOR_DESCRIPTIONS: list[RoborockSensorDescriptionA01] = [
translation_key="total_cleaning_time",
entity_category=EntityCategory.DIAGNOSTIC,
),
]
ZEO_SENSOR_DESCRIPTIONS: list[RoborockSensorDescriptionA01] = [
RoborockSensorDescriptionA01(
key="state",
data_protocol=RoborockZeoProtocol.STATE,
@@ -335,6 +340,12 @@ A01_SENSOR_DESCRIPTIONS: list[RoborockSensorDescriptionA01] = [
entity_category=EntityCategory.DIAGNOSTIC,
options=ZeoError.keys(),
),
RoborockSensorDescriptionA01(
key="times_after_clean",
data_protocol=RoborockZeoProtocol.TIMES_AFTER_CLEAN,
translation_key="times_after_clean",
entity_category=EntityCategory.DIAGNOSTIC,
),
]
Q7_B01_SENSOR_DESCRIPTIONS = [
@@ -418,7 +429,18 @@ async def async_setup_entry(
description,
)
for coordinator in coordinators.a01
for description in A01_SENSOR_DESCRIPTIONS
if isinstance(coordinator, RoborockWetDryVacUpdateCoordinator)
for description in DYAD_SENSOR_DESCRIPTIONS
if description.data_protocol in coordinator.request_protocols
)
entities.extend(
RoborockSensorEntityA01(
coordinator,
description,
)
for coordinator in coordinators.a01
if isinstance(coordinator, RoborockWashingMachineUpdateCoordinator)
for description in ZEO_SENSOR_DESCRIPTIONS
if description.data_protocol in coordinator.request_protocols
)
entities.extend(


@@ -50,6 +50,13 @@
"clean_fluid_empty": {
"name": "Cleaning fluid"
},
"detergent_empty": {
"name": "Detergent",
"state": {
"off": "Available",
"on": "[%key:common::state::empty%]"
}
},
"dirty_box_full": {
"name": "Dirty water box"
},
@@ -62,6 +69,13 @@
"mop_drying_status": {
"name": "Mop drying"
},
"softener_empty": {
"name": "Softener",
"state": {
"off": "Available",
"on": "[%key:common::state::empty%]"
}
},
"water_box_attached": {
"name": "Water box attached"
},
@@ -70,6 +84,9 @@
}
},
"button": {
"pause": {
"name": "Pause"
},
"reset_air_filter_consumable": {
"name": "Reset air filter consumable"
},
@@ -81,6 +98,12 @@
},
"reset_side_brush_consumable": {
"name": "Reset side brush consumable"
},
"shutdown": {
"name": "Shutdown"
},
"start": {
"name": "Start"
}
},
"number": {
@@ -97,6 +120,25 @@
"vacuum": "Vacuum only"
}
},
"detergent_type": {
"name": "Detergent type",
"state": {
"empty": "[%key:common::state::empty%]",
"high": "[%key:common::state::high%]",
"low": "[%key:common::state::low%]",
"medium": "[%key:common::state::medium%]"
}
},
"drying_mode": {
"name": "Drying mode",
"state": {
"iron": "Iron",
"none": "No drying",
"quick": "Quick",
"store": "Store",
"time_dry": "Time dry"
}
},
"dust_collection_mode": {
"name": "Empty mode",
"state": {
@@ -106,6 +148,19 @@
"smart": "Smart"
}
},
"mode": {
"name": "Operating mode",
"state": {
"drain": "Drain",
"dry": "Dry",
"heavy": "Heavy",
"pre_wash": "Pre-wash",
"rinse_spin": "Rinse & spin",
"spin": "Spin",
"wash": "Wash",
"wash_and_dry": "Wash and dry"
}
},
"mop_intensity": {
"name": "Mop intensity",
"state": {
@@ -138,9 +193,90 @@
"standard": "Standard"
}
},
"program": {
"name": "Wash program",
"state": {
"air_refresh": "Air refresh",
"anti_allergen": "Anti-allergen",
"anti_mites": "Anti-mites",
"baby_care": "Baby care",
"bedding": "Bedding",
"boiling_wash": "Boiling wash",
"bra": "Bra",
"cotton_linen": "Cotton/Linen",
"custom": "Custom",
"down": "Down",
"down_clean": "Down clean",
"exo_40_60": "Exo 40/60",
"gentle": "Gentle",
"intensive": "Intensive",
"new_clothes": "New clothes",
"night": "Night",
"panties": "Panties",
"quick": "Quick",
"rinse_and_spin": "Rinse and spin",
"sanitize": "Sanitize",
"season": "Season",
"shirts": "Shirts",
"silk": "Silk",
"socks": "Socks",
"sportswear": "Sportswear",
"stain_removal": "Stain removal",
"standard": "Standard",
"synthetics": "Synthetics",
"t_shirts": "T-shirts",
"towels": "Towels",
"twenty_c": "20°C",
"underwear": "Underwear",
"warming": "Warming",
"wool": "Wool"
}
},
"rinse_times": {
"name": "Rinse times",
"state": {
"high": "4",
"low": "2",
"max": "5",
"mid": "3",
"min": "1",
"none": "Default"
}
},
"selected_map": {
"name": "Selected map"
},
"softener_type": {
"name": "Softener type",
"state": {
"empty": "[%key:common::state::empty%]",
"high": "[%key:common::state::high%]",
"low": "[%key:common::state::low%]",
"medium": "[%key:common::state::medium%]"
}
},
"spin_level": {
"name": "Spin level",
"state": {
"high": "1000 RPM",
"max": "1400 RPM",
"mid": "800 RPM",
"none": "Default",
"very_high": "1200 RPM",
"very_low": "600 RPM"
}
},
"temperature": {
"name": "Water temperature",
"state": {
"30": "30°C",
"40": "40°C",
"60": "60°C",
"90": "90°C",
"auto": "[%key:common::state::auto%]",
"cold": "Cold"
}
},
"water_flow": {
"name": "Water flow",
"state": {
@@ -307,6 +443,9 @@
"strainer_time_left": {
"name": "Strainer time left"
},
"times_after_clean": {
"name": "Times after clean"
},
"total_cleaning_area": {
"name": "Total cleaning area"
},
@@ -375,14 +514,14 @@
"communication_error": "Communication error",
"door_lock_error": "Door lock error",
"drain_error": "Drain error",
"drying_error": "Drying error",
"drying_error_e_12": "Drying error E12",
"drying_error": "Drying error: check air inlet temperature sensor",
"drying_error_e_12": "Drying error: check air outlet temperature sensor",
"drying_error_e_13": "Drying error E13",
"drying_error_e_14": "Drying error E14",
"drying_error_e_15": "Drying error E15",
"drying_error_e_16": "Drying error E16",
"drying_error_restart": "Restart the washer",
"drying_error_water_flow": "Check water flow",
"drying_error_e_14": "Drying error: check inlet condenser temperature sensor",
"drying_error_e_15": "Drying error: check heating element or turntable",
"drying_error_e_16": "Drying error: check drying fan",
"drying_error_restart": "Drying error: restart the washer",
"drying_error_water_flow": "Drying error: check water flow",
"heating_error": "Heating error",
"inverter_error": "Inverter error",
"none": "[%key:component::roborock::entity::sensor::vacuum_error::state::none%]",
@@ -420,6 +559,9 @@
"off_peak_switch": {
"name": "Off-peak charging"
},
"sound_setting": {
"name": "Sound setting"
},
"status_indicator": {
"name": "Status indicator light"
}
@@ -464,6 +606,9 @@
}
},
"exceptions": {
"button_press_failed": {
"message": "Failed to press button"
},
"command_failed": {
"message": "Error while calling {command}"
},
@@ -482,9 +627,6 @@
"mqtt_unauthorized": {
"message": "Roborock MQTT servers rejected the connection due to rate limiting or invalid credentials. You may either attempt to reauthenticate or wait and reload the integration."
},
"multiple_maps_in_clean": {
"message": "All segments must belong to the same map. Got segments from maps: {map_flags}"
},
"no_coordinators": {
"message": "No devices were able to successfully setup"
},
@@ -497,6 +639,9 @@
"segment_id_parse_error": {
"message": "Invalid segment ID format: {segment_id}"
},
"select_option_failed": {
"message": "Failed to set selected option"
},
"update_data_fail": {
"message": "Failed to update data"
},
@@ -510,7 +655,6 @@
"title": "Cloud API used"
}
},
"options": {
"step": {
"drawables": {


@@ -10,6 +10,7 @@ from typing import Any
from roborock.devices.traits.v1 import PropertiesApi
from roborock.devices.traits.v1.common import RoborockSwitchBase
from roborock.exceptions import RoborockException
from roborock.roborock_message import RoborockDyadDataProtocol, RoborockZeoProtocol
from homeassistant.components.switch import SwitchEntity, SwitchEntityDescription
from homeassistant.const import EntityCategory
@@ -18,8 +19,12 @@ from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN
from .coordinator import RoborockConfigEntry, RoborockDataUpdateCoordinator
from .entity import RoborockEntityV1
from .coordinator import (
RoborockConfigEntry,
RoborockDataUpdateCoordinator,
RoborockDataUpdateCoordinatorA01,
)
from .entity import RoborockCoordinatedEntityA01, RoborockEntityV1
_LOGGER = logging.getLogger(__name__)
@@ -67,12 +72,30 @@ SWITCH_DESCRIPTIONS: list[RoborockSwitchDescription] = [
]
@dataclass(frozen=True, kw_only=True)
class RoborockSwitchDescriptionA01(SwitchEntityDescription):
"""Class to describe a Roborock A01 switch entity."""
data_protocol: RoborockDyadDataProtocol | RoborockZeoProtocol
A01_SWITCH_DESCRIPTIONS: list[RoborockSwitchDescriptionA01] = [
RoborockSwitchDescriptionA01(
key="sound_setting",
data_protocol=RoborockZeoProtocol.SOUND_SET,
translation_key="sound_setting",
entity_category=EntityCategory.CONFIG,
),
]
async def async_setup_entry(
hass: HomeAssistant,
config_entry: RoborockConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Roborock switch platform."""
# V1 switches - using trait pattern from HEAD
async_add_entities(
[
RoborockSwitch(
@@ -87,6 +110,17 @@ async def async_setup_entry(
]
)
# A01 switches
async_add_entities(
RoborockSwitchA01(
coordinator,
description,
)
for coordinator in config_entry.runtime_data.a01
for description in A01_SWITCH_DESCRIPTIONS
if description.data_protocol in coordinator.request_protocols
)
class RoborockSwitch(RoborockEntityV1, SwitchEntity):
"""A class to let you turn functionality on Roborock devices on and off that does need a coordinator."""
@@ -137,3 +171,52 @@ class RoborockSwitch(RoborockEntityV1, SwitchEntity):
def is_on(self) -> bool | None:
"""Return True if entity is on."""
return self._trait.is_on
class RoborockSwitchA01(RoborockCoordinatedEntityA01, SwitchEntity):
"""A class to let you turn functionality on Roborock A01 devices on and off."""
entity_description: RoborockSwitchDescriptionA01
def __init__(
self,
coordinator: RoborockDataUpdateCoordinatorA01,
description: RoborockSwitchDescriptionA01,
) -> None:
"""Initialize the entity."""
self.entity_description = description
super().__init__(f"{description.key}_{coordinator.duid_slug}", coordinator)
async def async_turn_off(self, **kwargs: Any) -> None:
"""Turn off the switch."""
try:
await self.coordinator.api.set_value( # type: ignore[attr-defined]
self.entity_description.data_protocol, 0
)
await self.coordinator.async_request_refresh()
except RoborockException as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="update_options_failed",
) from err
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn on the switch."""
try:
await self.coordinator.api.set_value( # type: ignore[attr-defined]
self.entity_description.data_protocol, 1
)
await self.coordinator.async_request_refresh()
except RoborockException as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="update_options_failed",
) from err
@property
def is_on(self) -> bool | None:
"""Return True if entity is on."""
status = self.coordinator.data.get(self.entity_description.data_protocol)
if status is None:
return None
return bool(status)


@@ -1,6 +1,5 @@
"""Support for Roborock vacuum class."""
import asyncio
import logging
from typing import Any
@@ -14,11 +13,11 @@ from homeassistant.components.vacuum import (
VacuumActivity,
VacuumEntityFeature,
)
from homeassistant.core import HomeAssistant, ServiceResponse
from homeassistant.exceptions import HomeAssistantError, ServiceValidationError
from homeassistant.core import HomeAssistant, ServiceResponse, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN, MAP_SLEEP
from .const import DOMAIN
from .coordinator import (
RoborockB01Q7UpdateCoordinator,
RoborockConfigEntry,
@@ -121,6 +120,26 @@ class RoborockVacuum(RoborockCoordinatedEntityV1, StateVacuumEntity):
self._home_trait = coordinator.properties_api.home
self._maps_trait = coordinator.properties_api.maps
@callback
def _handle_coordinator_update(self) -> None:
"""Handle updated data from the coordinator.
Creates a repair issue when the vacuum reports different segments than
what was available when the area mapping was last configured.
"""
super()._handle_coordinator_update()
last_seen = self.last_seen_segments
if last_seen is None:
# No area mapping has been configured yet; nothing to check.
return
current_ids = {
f"{map_flag}_{room.segment_id}"
for map_flag, map_info in (self._home_trait.home_map_info or {}).items()
for room in map_info.rooms
}
if current_ids != {seg.id for seg in last_seen}:
self.async_create_segments_issue()
@property
def fan_speed_list(self) -> list[str]:
"""Get the list of available fan speeds."""
@@ -192,7 +211,7 @@ class RoborockVacuum(RoborockCoordinatedEntityV1, StateVacuumEntity):
return []
return [
Segment(
id=f"{map_flag}:{room.segment_id}",
id=f"{map_flag}_{room.segment_id}",
name=room.name,
group=map_info.name,
)
@@ -204,51 +223,21 @@ class RoborockVacuum(RoborockCoordinatedEntityV1, StateVacuumEntity):
"""Clean the specified segments."""
parsed: list[tuple[int, int]] = []
for seg_id in segment_ids:
# Segment id is mapflag:segment_id
parts = seg_id.split(":")
if len(parts) != 2:
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="segment_id_parse_error",
translation_placeholders={"segment_id": seg_id},
)
try:
# We need to make sure both parts are ints.
parsed.append((int(parts[0]), int(parts[1])))
except ValueError as err:
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="segment_id_parse_error",
translation_placeholders={"segment_id": seg_id},
) from err
map_flag_str, room_id_str = seg_id.split("_", maxsplit=1)
parsed.append((int(map_flag_str), int(room_id_str)))
# Because segment_ids can overlap for each map,
# we need to make sure that only one map is passed in.
unique_map_flags = {map_flag for map_flag, _ in parsed}
if len(unique_map_flags) > 1:
map_flags_str = ", ".join(str(flag) for flag in sorted(unique_map_flags))
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="multiple_maps_in_clean",
translation_placeholders={"map_flags": map_flags_str},
)
target_map_flag = next(iter(unique_map_flags))
if self._maps_trait.current_map != target_map_flag:
# If the user is attempting to clean an area on a map that is not selected, we should try to change.
try:
await self._maps_trait.set_current_map(target_map_flag)
except RoborockException as err:
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="command_failed",
translation_placeholders={"command": "load_multi_map"},
) from err
await asyncio.sleep(MAP_SLEEP)
# Segments from other maps are silently ignored; only segments
# belonging to the currently active map are cleaned.
current_map = self._maps_trait.current_map
current_map_segments = [
seg_id for map_flag, seg_id in parsed if map_flag == current_map
]
if not current_map_segments:
return
# We can now confirm all segments are on our current map, so clean them all.
await self.send(
RoborockCommand.APP_SEGMENT_CLEAN,
[{"segments": [seg_id for _, seg_id in parsed]}],
[{"segments": current_map_segments}],
)
async def async_send_command(
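The rewritten segment cleaning above encodes each segment as `"{map_flag}_{segment_id}"` and, instead of raising when maps are mixed, keeps only the segments that belong to the active map. A standalone sketch of that parse-and-filter step (the helper name is ours, not the integration's):

```python
def segments_for_current_map(segment_ids: list[str], current_map: int) -> list[int]:
    """Parse "mapflag_segment" ids and keep only segments on the active map."""
    parsed: list[tuple[int, int]] = []
    for seg_id in segment_ids:
        # maxsplit=1 so only the first underscore separates map flag from segment.
        map_flag_str, room_id_str = seg_id.split("_", maxsplit=1)
        parsed.append((int(map_flag_str), int(room_id_str)))
    # Segments from other maps are silently dropped.
    return [seg for map_flag, seg in parsed if map_flag == current_map]
```

If the resulting list is empty, the entity returns without sending `APP_SEGMENT_CLEAN`, matching the early `return` in the code above.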


@@ -131,7 +131,7 @@ async def root_payload(
)
for child in children:
child.thumbnail = "/api/brands/integration/roku/logo.png"
child.thumbnail = "https://brands.home-assistant.io/_/roku/logo.png"
try:
browse_item = await media_source.async_browse_media(hass, None)


@@ -35,7 +35,7 @@ async def _root_payload(
media_class=MediaClass.DIRECTORY,
media_content_id="",
media_content_type="presets",
thumbnail="/api/brands/integration/russound_rio/logo.png",
thumbnail="https://brands.home-assistant.io/_/russound_rio/logo.png",
can_play=False,
can_expand=True,
)


@@ -3,6 +3,7 @@
from __future__ import annotations
import asyncio
from datetime import timedelta
from typing import Any
from pysaunum import MAX_TEMPERATURE, MIN_TEMPERATURE, SaunumException
@@ -241,9 +242,9 @@ class LeilSaunaClimate(LeilSaunaEntity, ClimateEntity):
async def async_start_session(
self,
duration: int = 120,
duration: timedelta = timedelta(minutes=120),
target_temperature: int = 80,
fan_duration: int = 10,
fan_duration: timedelta = timedelta(minutes=10),
) -> None:
"""Start a sauna session with custom parameters."""
if self.coordinator.data.door_open:
@@ -254,11 +255,15 @@ class LeilSaunaClimate(LeilSaunaEntity, ClimateEntity):
try:
# Set all parameters before starting the session
await self.coordinator.client.async_set_sauna_duration(duration)
await self.coordinator.client.async_set_sauna_duration(
int(duration.total_seconds() // 60)
)
await self.coordinator.client.async_set_target_temperature(
target_temperature
)
await self.coordinator.client.async_set_fan_duration(fan_duration)
await self.coordinator.client.async_set_fan_duration(
int(fan_duration.total_seconds() // 60)
)
await self.coordinator.client.async_start_session()
except SaunumException as err:
raise HomeAssistantError(


@@ -2,6 +2,8 @@
from __future__ import annotations
from datetime import timedelta
from pysaunum import MAX_DURATION, MAX_FAN_DURATION, MAX_TEMPERATURE, MIN_TEMPERATURE
import voluptuous as vol
@@ -27,14 +29,22 @@ def async_setup_services(hass: HomeAssistant) -> None:
SERVICE_START_SESSION,
entity_domain=CLIMATE_DOMAIN,
schema={
vol.Optional(ATTR_DURATION, default=120): vol.All(
cv.positive_int, vol.Range(min=1, max=MAX_DURATION)
vol.Optional(ATTR_DURATION, default=timedelta(minutes=120)): vol.All(
cv.time_period,
vol.Range(
min=timedelta(minutes=1),
max=timedelta(minutes=MAX_DURATION),
),
),
vol.Optional(ATTR_TARGET_TEMPERATURE, default=80): vol.All(
cv.positive_int, vol.Range(min=MIN_TEMPERATURE, max=MAX_TEMPERATURE)
),
vol.Optional(ATTR_FAN_DURATION, default=10): vol.All(
cv.positive_int, vol.Range(min=1, max=MAX_FAN_DURATION)
vol.Optional(ATTR_FAN_DURATION, default=timedelta(minutes=10)): vol.All(
cv.time_period,
vol.Range(
min=timedelta(minutes=1),
max=timedelta(minutes=MAX_FAN_DURATION),
),
),
},
func="async_start_session",


@@ -1,6 +1,7 @@
"""Common base for entities."""
from dataclasses import dataclass
import logging
from typing import Any
from pysmarlaapi import Federwiege
@@ -10,6 +11,8 @@ from homeassistant.helpers.entity import Entity, EntityDescription
from .const import DEVICE_MODEL_NAME, DOMAIN, MANUFACTURER_NAME
_LOGGER = logging.getLogger(__name__)
@dataclass(frozen=True, kw_only=True)
class SmarlaEntityDescription(EntityDescription):
@@ -30,6 +33,7 @@ class SmarlaBaseEntity(Entity):
def __init__(self, federwiege: Federwiege, desc: SmarlaEntityDescription) -> None:
"""Initialise the entity."""
self.entity_description = desc
self._federwiege = federwiege
self._property = federwiege.get_property(desc.service, desc.property)
self._attr_unique_id = f"{federwiege.serial_number}-{desc.key}"
self._attr_device_info = DeviceInfo(
@@ -39,15 +43,35 @@ class SmarlaBaseEntity(Entity):
manufacturer=MANUFACTURER_NAME,
serial_number=federwiege.serial_number,
)
self._unavailable_logged = False
async def on_change(self, value: Any):
@property
def available(self) -> bool:
"""Return True if entity is available."""
return self._federwiege.available
async def on_availability_change(self, available: bool) -> None:
"""Handle availability changes."""
if not self.available and not self._unavailable_logged:
_LOGGER.info("Entity %s is unavailable", self.entity_id)
self._unavailable_logged = True
elif self.available and self._unavailable_logged:
_LOGGER.info("Entity %s is back online", self.entity_id)
self._unavailable_logged = False
# Notify ha that state changed
self.async_write_ha_state()
async def on_change(self, value: Any) -> None:
"""Notify ha when state changes."""
self.async_write_ha_state()
async def async_added_to_hass(self) -> None:
"""Run when this Entity has been added to HA."""
await self._federwiege.add_listener(self.on_availability_change)
await self._property.add_listener(self.on_change)
async def async_will_remove_from_hass(self) -> None:
"""Entity being removed from hass."""
await self._property.remove_listener(self.on_change)
await self._federwiege.remove_listener(self.on_availability_change)


@@ -24,9 +24,9 @@ rules:
config-entry-unloading: done
docs-configuration-parameters: done
docs-installation-parameters: done
entity-unavailable: todo
entity-unavailable: done
integration-owner: done
log-when-unavailable: todo
log-when-unavailable: done
parallel-updates: done
reauthentication-flow: done
test-coverage: done


@@ -330,7 +330,7 @@ async def root_payload(
media_class=MediaClass.DIRECTORY,
media_content_id="",
media_content_type="favorites",
thumbnail="/api/brands/integration/sonos/logo.png",
thumbnail="https://brands.home-assistant.io/_/sonos/logo.png",
can_play=False,
can_expand=True,
)
@@ -345,7 +345,7 @@ async def root_payload(
media_class=MediaClass.DIRECTORY,
media_content_id="",
media_content_type="library",
thumbnail="/api/brands/integration/sonos/logo.png",
thumbnail="https://brands.home-assistant.io/_/sonos/logo.png",
can_play=False,
can_expand=True,
)
@@ -358,7 +358,7 @@ async def root_payload(
media_class=MediaClass.APP,
media_content_id="",
media_content_type="plex",
thumbnail="/api/brands/integration/plex/logo.png",
thumbnail="https://brands.home-assistant.io/_/plex/logo.png",
can_play=False,
can_expand=True,
)


@@ -212,7 +212,7 @@ async def async_browse_media(
media_class=MediaClass.APP,
media_content_id=f"{MEDIA_PLAYER_PREFIX}{config_entry.entry_id}",
media_content_type=f"{MEDIA_PLAYER_PREFIX}library",
thumbnail="/api/brands/integration/spotify/logo.png",
thumbnail="https://brands.home-assistant.io/_/spotify/logo.png",
can_play=False,
can_expand=True,
)
@@ -223,7 +223,7 @@ async def async_browse_media(
media_class=MediaClass.APP,
media_content_id=MEDIA_PLAYER_PREFIX,
media_content_type="spotify",
thumbnail="/api/brands/integration/spotify/logo.png",
thumbnail="https://brands.home-assistant.io/_/spotify/logo.png",
can_play=False,
can_expand=True,
children=children,


@@ -6,6 +6,7 @@ from typing import cast
from surepy.entities import SurepyEntity
from surepy.entities.devices import Felaqua as SurepyFelaqua
from surepy.entities.pet import Pet as SurepyPet
from surepy.enums import EntityType
from homeassistant.components.sensor import SensorDeviceClass, SensorEntity
@@ -41,6 +42,9 @@ async def async_setup_entry(
if surepy_entity.type == EntityType.FELAQUA:
entities.append(Felaqua(surepy_entity.id, coordinator))
if surepy_entity.type == EntityType.PET:
entities.append(PetLastSeenFlapDevice(surepy_entity.id, coordinator))
entities.append(PetLastSeenUser(surepy_entity.id, coordinator))
async_add_entities(entities)
@@ -108,3 +112,55 @@ class Felaqua(SurePetcareEntity, SensorEntity):
"""Update the state."""
surepy_entity = cast(SurepyFelaqua, surepy_entity)
self._attr_native_value = surepy_entity.water_remaining
class PetLastSeenFlapDevice(SurePetcareEntity, SensorEntity):
"""Sensor for the last flap device id used by the pet.
Note: Will be unknown if the last status is not from a flap update.
"""
_attr_entity_category = EntityCategory.DIAGNOSTIC
_attr_entity_registry_enabled_default = False
def __init__(
self, surepetcare_id: int, coordinator: SurePetcareDataCoordinator
) -> None:
"""Initialize last seen flap device id sensor."""
super().__init__(surepetcare_id, coordinator)
self._attr_name = f"{self._device_name} Last seen flap device id"
self._attr_unique_id = f"{self._device_id}-last_seen_flap_device"
@callback
def _update_attr(self, surepy_entity: SurepyEntity) -> None:
surepy_entity = cast(SurepyPet, surepy_entity)
position = surepy_entity._data.get("position", {}) # noqa: SLF001
device_id = position.get("device_id")
self._attr_native_value = str(device_id) if device_id is not None else None
class PetLastSeenUser(SurePetcareEntity, SensorEntity):
"""Sensor for the last user id that manually changed the pet location.
Note: Will be unknown if the last status is not from a manual update.
"""
_attr_entity_category = EntityCategory.DIAGNOSTIC
_attr_entity_registry_enabled_default = False
def __init__(
self, surepetcare_id: int, coordinator: SurePetcareDataCoordinator
) -> None:
"""Initialize last seen user id sensor."""
super().__init__(surepetcare_id, coordinator)
self._attr_name = f"{self._device_name} Last seen user id"
self._attr_unique_id = f"{self._device_id}-last_seen_user"
@callback
def _update_attr(self, surepy_entity: SurepyEntity) -> None:
surepy_entity = cast(SurepyPet, surepy_entity)
position = surepy_entity._data.get("position", {}) # noqa: SLF001
user_id = position.get("user_id")
self._attr_native_value = str(user_id) if user_id is not None else None


@@ -14,6 +14,6 @@
"documentation": "https://www.home-assistant.io/integrations/teltonika",
"integration_type": "device",
"iot_class": "local_polling",
"quality_scale": "bronze",
"quality_scale": "silver",
"requirements": ["teltasync==0.1.3"]
}


@@ -30,8 +30,10 @@ rules:
status: exempt
comment: No custom actions registered.
config-entry-unloading: done
docs-configuration-parameters: todo
docs-installation-parameters: todo
docs-configuration-parameters:
status: exempt
comment: No options flow
docs-installation-parameters: done
entity-unavailable: done
integration-owner: done
log-when-unavailable: done
@@ -44,12 +46,12 @@ rules:
diagnostics: todo
discovery-update-info: done
discovery: done
docs-data-update: todo
docs-examples: todo
docs-known-limitations: todo
docs-supported-devices: todo
docs-data-update: done
docs-examples: done
docs-known-limitations: done
docs-supported-devices: done
docs-supported-functions: todo
docs-troubleshooting: todo
docs-troubleshooting: done
docs-use-cases: todo
dynamic-devices: todo
entity-category: todo


@@ -266,7 +266,7 @@ class StateUpdateEntity(TemplateEntity, AbstractTemplateUpdate):
# The default picture for update entities would use `self.platform.platform_name` in
# place of `template`. This does not work when creating an entity preview because
# the platform does not exist for that entity, therefore this is hardcoded as `template`.
return "/api/brands/integration/template/icon.png"
return "https://brands.home-assistant.io/_/template/icon.png"
return self._attr_entity_picture


@@ -214,7 +214,7 @@ class TTSMediaSource(MediaSource):
media_class=MediaClass.APP,
media_content_type="provider",
title=engine_instance.name,
thumbnail=f"/api/brands/integration/{engine_domain}/logo.png",
thumbnail=f"https://brands.home-assistant.io/_/{engine_domain}/logo.png",
can_play=False,
can_expand=True,
)


@@ -5,6 +5,12 @@ from __future__ import annotations
from base64 import b64decode
from typing import Any
from tuya_device_handlers.device_wrapper.base import DeviceWrapper
from tuya_device_handlers.device_wrapper.common import (
DPCodeEnumWrapper,
DPCodeRawWrapper,
)
from tuya_device_handlers.type_information import EnumTypeInformation
from tuya_sharing import CustomerDevice, Manager
from homeassistant.components.alarm_control_panel import (
@@ -20,8 +26,6 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import TuyaConfigEntry
from .const import TUYA_DISCOVERY_NEW, DeviceCategory, DPCode
from .entity import TuyaEntity
from .models import DeviceWrapper, DPCodeEnumWrapper, DPCodeRawWrapper
from .type_information import EnumTypeInformation
ALARM: dict[DeviceCategory, tuple[AlarmControlPanelEntityDescription, ...]] = {
DeviceCategory.MAL: (
@@ -39,7 +43,7 @@ class _AlarmChangedByWrapper(DPCodeRawWrapper):
Decode base64 to utf-16be string, but only if alarm has been triggered.
"""
def read_device_status(self, device: CustomerDevice) -> str | None:
def read_device_status(self, device: CustomerDevice) -> str | None: # type: ignore[override]
"""Read the device status."""
if (
device.status.get(DPCode.MASTER_STATE) != "alarm"


@@ -4,6 +4,12 @@ from __future__ import annotations
from dataclasses import dataclass
from tuya_device_handlers.device_wrapper.base import DeviceWrapper
from tuya_device_handlers.device_wrapper.binary_sensor import DPCodeBitmapBitWrapper
from tuya_device_handlers.device_wrapper.common import (
DPCodeBooleanWrapper,
DPCodeWrapper,
)
from tuya_sharing import CustomerDevice, Manager
from homeassistant.components.binary_sensor import (
@@ -19,12 +25,6 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import TuyaConfigEntry
from .const import TUYA_DISCOVERY_NEW, DeviceCategory, DPCode
from .entity import TuyaEntity
from .models import (
DeviceWrapper,
DPCodeBitmapBitWrapper,
DPCodeBooleanWrapper,
DPCodeWrapper,
)
@dataclass(frozen=True)


@@ -2,6 +2,8 @@
from __future__ import annotations
from tuya_device_handlers.device_wrapper.base import DeviceWrapper
from tuya_device_handlers.device_wrapper.common import DPCodeBooleanWrapper
from tuya_sharing import CustomerDevice, Manager
from homeassistant.components.button import ButtonEntity, ButtonEntityDescription
@@ -13,7 +15,6 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import TuyaConfigEntry
from .const import TUYA_DISCOVERY_NEW, DeviceCategory, DPCode
from .entity import TuyaEntity
from .models import DeviceWrapper, DPCodeBooleanWrapper
BUTTONS: dict[DeviceCategory, tuple[ButtonEntityDescription, ...]] = {
DeviceCategory.HXD: (


@@ -2,6 +2,8 @@
from __future__ import annotations
from tuya_device_handlers.device_wrapper.base import DeviceWrapper
from tuya_device_handlers.device_wrapper.common import DPCodeBooleanWrapper
from tuya_sharing import CustomerDevice, Manager
from homeassistant.components import ffmpeg
@@ -13,7 +15,6 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import TuyaConfigEntry
from .const import TUYA_DISCOVERY_NEW, DeviceCategory, DPCode
from .entity import TuyaEntity
from .models import DeviceWrapper, DPCodeBooleanWrapper
CAMERAS: tuple[DeviceCategory, ...] = (
DeviceCategory.DGHSXJ,


@@ -6,6 +6,13 @@ import collections
from dataclasses import dataclass
from typing import Any, Self
from tuya_device_handlers.device_wrapper.base import DeviceWrapper
from tuya_device_handlers.device_wrapper.common import (
DPCodeBooleanWrapper,
DPCodeEnumWrapper,
DPCodeIntegerWrapper,
)
from tuya_device_handlers.type_information import EnumTypeInformation
from tuya_sharing import CustomerDevice, Manager
from homeassistant.components.climate import (
@@ -33,13 +40,6 @@ from .const import (
DPCode,
)
from .entity import TuyaEntity
from .models import (
DeviceWrapper,
DPCodeBooleanWrapper,
DPCodeEnumWrapper,
DPCodeIntegerWrapper,
)
from .type_information import EnumTypeInformation
TUYA_HVAC_TO_HA = {
"auto": HVACMode.HEAT_COOL,
@@ -177,8 +177,10 @@ class _HvacModeWrapper(DPCodeEnumWrapper):
return None
return TUYA_HVAC_TO_HA[raw]
def _convert_value_to_raw_value(
self, device: CustomerDevice, value: HVACMode
def _convert_value_to_raw_value( # type: ignore[override]
self,
device: CustomerDevice,
value: HVACMode,
) -> Any:
"""Convert value to raw value."""
return next(


@@ -82,18 +82,6 @@ class WorkMode(StrEnum):
WHITE = "white"
class DPType(StrEnum):
"""Data point types."""
BITMAP = "Bitmap"
BOOLEAN = "Boolean"
ENUM = "Enum"
INTEGER = "Integer"
JSON = "Json"
RAW = "Raw"
STRING = "String"
class DeviceCategory(StrEnum):
"""Tuya device categories.


@@ -5,6 +5,17 @@ from __future__ import annotations
from dataclasses import dataclass
from typing import Any
from tuya_device_handlers.device_wrapper.base import DeviceWrapper
from tuya_device_handlers.device_wrapper.common import (
DPCodeBooleanWrapper,
DPCodeEnumWrapper,
DPCodeIntegerWrapper,
)
from tuya_device_handlers.type_information import (
EnumTypeInformation,
IntegerTypeInformation,
)
from tuya_device_handlers.utils import RemapHelper
from tuya_sharing import CustomerDevice, Manager
from homeassistant.components.cover import (
@@ -22,14 +33,6 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import TuyaConfigEntry
from .const import TUYA_DISCOVERY_NEW, DeviceCategory, DPCode
from .entity import TuyaEntity
from .models import (
DeviceWrapper,
DPCodeBooleanWrapper,
DPCodeEnumWrapper,
DPCodeIntegerWrapper,
)
from .type_information import EnumTypeInformation, IntegerTypeInformation
from .util import RemapHelper
class _DPCodePercentageMappingWrapper(DPCodeIntegerWrapper):
@@ -84,7 +87,7 @@ class _InstructionBooleanWrapper(DPCodeBooleanWrapper):
options = ["open", "close"]
_ACTION_MAPPINGS = {"open": True, "close": False}
def _convert_value_to_raw_value(self, device: CustomerDevice, value: str) -> bool:
def _convert_value_to_raw_value(self, device: CustomerDevice, value: str) -> bool: # type: ignore[override]
return self._ACTION_MAPPINGS[value]
@@ -130,7 +133,7 @@ class _IsClosedEnumWrapper(DPCodeEnumWrapper):
"fully_open": False,
}
def read_device_status(self, device: CustomerDevice) -> bool | None:
def read_device_status(self, device: CustomerDevice) -> bool | None: # type: ignore[override]
if (value := super().read_device_status(device)) is None:
return None
return self._MAPPINGS.get(value)


@@ -4,6 +4,7 @@ from __future__ import annotations
from typing import Any
from tuya_device_handlers.device_wrapper import DEVICE_WARNINGS
from tuya_sharing import CustomerDevice
from homeassistant.components.diagnostics import REDACTED
@@ -14,7 +15,6 @@ from homeassistant.util import dt as dt_util
from . import TuyaConfigEntry
from .const import DOMAIN, DPCode
from .type_information import DEVICE_WARNINGS
_REDACTED_DPCODES = {
DPCode.ALARM_MESSAGE,


@@ -4,6 +4,7 @@ from __future__ import annotations
from typing import Any
from tuya_device_handlers.device_wrapper import DeviceWrapper
from tuya_sharing import CustomerDevice, Manager
from homeassistant.helpers.device_registry import DeviceInfo
@@ -11,7 +12,6 @@ from homeassistant.helpers.dispatcher import async_dispatcher_connect
from homeassistant.helpers.entity import Entity
from .const import DOMAIN, LOGGER, TUYA_HA_SIGNAL_UPDATE_ENTITY
from .models import DeviceWrapper
class TuyaEntity(Entity):


@@ -6,6 +6,13 @@ from base64 import b64decode
from dataclasses import dataclass
from typing import Any
from tuya_device_handlers.device_wrapper.base import DeviceWrapper
from tuya_device_handlers.device_wrapper.common import (
DPCodeEnumWrapper,
DPCodeRawWrapper,
DPCodeStringWrapper,
DPCodeTypeInformationWrapper,
)
from tuya_sharing import CustomerDevice, Manager
from homeassistant.components.event import (
@@ -20,19 +27,14 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import TuyaConfigEntry
from .const import TUYA_DISCOVERY_NEW, DeviceCategory, DPCode
from .entity import TuyaEntity
from .models import (
DeviceWrapper,
DPCodeEnumWrapper,
DPCodeRawWrapper,
DPCodeStringWrapper,
DPCodeTypeInformationWrapper,
)
class _EventEnumWrapper(DPCodeEnumWrapper):
"""Wrapper for event enum DP codes."""
def read_device_status(self, device: CustomerDevice) -> tuple[str, None] | None:
def read_device_status( # type: ignore[override]
self, device: CustomerDevice
) -> tuple[str, None] | None:
"""Return the event details."""
if (raw_value := super().read_device_status(device)) is None:
return None
@@ -67,7 +69,7 @@ class _DoorbellPicWrapper(DPCodeRawWrapper):
super().__init__(dpcode, type_information)
self.options = ["triggered"]
def read_device_status(
def read_device_status( # type: ignore[override]
self, device: CustomerDevice
) -> tuple[str, dict[str, Any]] | None:
"""Return the event attributes for the doorbell picture."""


@@ -4,6 +4,14 @@ from __future__ import annotations
from typing import Any
from tuya_device_handlers.device_wrapper.base import DeviceWrapper
from tuya_device_handlers.device_wrapper.common import (
DPCodeBooleanWrapper,
DPCodeEnumWrapper,
DPCodeIntegerWrapper,
)
from tuya_device_handlers.type_information import IntegerTypeInformation
from tuya_device_handlers.utils import RemapHelper
from tuya_sharing import CustomerDevice, Manager
from homeassistant.components.fan import (
@@ -23,14 +31,7 @@ from homeassistant.util.percentage import (
from . import TuyaConfigEntry
from .const import TUYA_DISCOVERY_NEW, DeviceCategory, DPCode
from .entity import TuyaEntity
from .models import (
DeviceWrapper,
DPCodeBooleanWrapper,
DPCodeEnumWrapper,
DPCodeIntegerWrapper,
)
from .type_information import IntegerTypeInformation
from .util import RemapHelper, get_dpcode
from .util import get_dpcode
_DIRECTION_DPCODES = (DPCode.FAN_DIRECTION,)
_MODE_DPCODES = (DPCode.FAN_MODE, DPCode.MODE)
@@ -82,7 +83,7 @@ def _has_a_valid_dpcode(device: CustomerDevice) -> bool:
class _FanSpeedEnumWrapper(DPCodeEnumWrapper):
"""Wrapper for fan speed DP code (from an enum)."""
def read_device_status(self, device: CustomerDevice) -> int | None:
def read_device_status(self, device: CustomerDevice) -> int | None: # type: ignore[override]
"""Get the current speed as a percentage."""
if (value := super().read_device_status(device)) is None:
return None
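`_FanSpeedEnumWrapper` turns an enum speed option into a percentage via Home Assistant's ordered-list percentage helpers (imported from `homeassistant.util.percentage` in this file). A self-contained sketch of the arithmetic those helpers perform, under the assumption that options are spaced evenly with the last option mapping to 100:

```python
def ordered_list_item_to_percentage(ordered: list[str], item: str) -> int:
    """Map an enum option to an evenly spaced percentage (last option -> 100)."""
    # 1-based position so the highest speed maps to exactly 100
    return (ordered.index(item) + 1) * 100 // len(ordered)

speeds = ["low", "mid", "high", "turbo"]
# ordered_list_item_to_percentage(speeds, "mid") == 50
```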


@@ -5,6 +5,12 @@ from __future__ import annotations
from dataclasses import dataclass
from typing import Any
from tuya_device_handlers.device_wrapper.base import DeviceWrapper
from tuya_device_handlers.device_wrapper.common import (
DPCodeBooleanWrapper,
DPCodeEnumWrapper,
DPCodeIntegerWrapper,
)
from tuya_sharing import CustomerDevice, Manager
from homeassistant.components.humidifier import (
@@ -20,12 +26,6 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import TuyaConfigEntry
from .const import TUYA_DISCOVERY_NEW, DeviceCategory, DPCode
from .entity import TuyaEntity
from .models import (
DeviceWrapper,
DPCodeBooleanWrapper,
DPCodeEnumWrapper,
DPCodeIntegerWrapper,
)
from .util import ActionDPCodeNotFoundError, get_dpcode


@@ -7,6 +7,15 @@ from enum import StrEnum
import json
from typing import Any, cast
from tuya_device_handlers.device_wrapper.base import DeviceWrapper
from tuya_device_handlers.device_wrapper.common import (
DPCodeBooleanWrapper,
DPCodeEnumWrapper,
DPCodeIntegerWrapper,
DPCodeJsonWrapper,
)
from tuya_device_handlers.type_information import IntegerTypeInformation
from tuya_device_handlers.utils import RemapHelper
from tuya_sharing import CustomerDevice, Manager
from homeassistant.components.light import (
@@ -30,15 +39,6 @@ from homeassistant.util.json import json_loads_object
from . import TuyaConfigEntry
from .const import TUYA_DISCOVERY_NEW, DeviceCategory, DPCode, WorkMode
from .entity import TuyaEntity
from .models import (
DeviceWrapper,
DPCodeBooleanWrapper,
DPCodeEnumWrapper,
DPCodeIntegerWrapper,
DPCodeJsonWrapper,
)
from .type_information import IntegerTypeInformation
from .util import RemapHelper
class _BrightnessWrapper(DPCodeIntegerWrapper):
@@ -174,7 +174,7 @@ class _ColorDataWrapper(DPCodeJsonWrapper):
s_type = DEFAULT_S_TYPE
v_type = DEFAULT_V_TYPE
def read_device_status(
def read_device_status( # type: ignore[override]
self, device: CustomerDevice
) -> tuple[float, float, float] | None:
"""Return a tuple (H, S, V) from this color data."""


@@ -43,5 +43,8 @@
"integration_type": "hub",
"iot_class": "cloud_push",
"loggers": ["tuya_sharing"],
"requirements": ["tuya-device-sharing-sdk==0.2.8"]
"requirements": [
"tuya-device-handlers==0.0.10",
"tuya-device-sharing-sdk==0.2.8"
]
}


@@ -1,329 +0,0 @@
"""Tuya Home Assistant Base Device Model."""
from __future__ import annotations
import logging
from typing import Any, Self
from tuya_sharing import CustomerDevice
from homeassistant.components.sensor import SensorStateClass
from .type_information import (
BitmapTypeInformation,
BooleanTypeInformation,
EnumTypeInformation,
IntegerTypeInformation,
JsonTypeInformation,
RawTypeInformation,
StringTypeInformation,
TypeInformation,
)
_LOGGER = logging.getLogger(__name__)
class DeviceWrapper[T]:
"""Base device wrapper."""
native_unit: str | None = None
suggested_unit: str | None = None
state_class: SensorStateClass | None = None
max_value: float
min_value: float
value_step: float
options: list[str]
def initialize(self, device: CustomerDevice) -> None:
"""Initialize the wrapper with device data.
Called when the entity is added to Home Assistant.
Override in subclasses to perform initialization logic.
"""
def skip_update(
self,
device: CustomerDevice,
updated_status_properties: list[str],
dp_timestamps: dict[str, int] | None,
) -> bool:
"""Determine if the wrapper should skip an update.
The default is to always skip if updated properties is given,
unless overridden in subclasses.
"""
# If updated_status_properties is None, we should not skip,
# as we don't have information on what was updated
# This happens for example on online/offline updates, where
# we still want to update the entity state
return updated_status_properties is not None
def read_device_status(self, device: CustomerDevice) -> T | None:
"""Read device status and convert to a Home Assistant value."""
raise NotImplementedError
def get_update_commands(
self, device: CustomerDevice, value: T
) -> list[dict[str, Any]]:
"""Generate update commands for a Home Assistant action."""
raise NotImplementedError
class DPCodeWrapper(DeviceWrapper):
"""Base device wrapper for a single DPCode.
Used as a common interface for referring to a DPCode, and
access read conversion routines.
"""
def __init__(self, dpcode: str) -> None:
"""Init DPCodeWrapper."""
self.dpcode = dpcode
def skip_update(
self,
device: CustomerDevice,
updated_status_properties: list[str],
dp_timestamps: dict[str, int] | None,
) -> bool:
"""Determine if the wrapper should skip an update.
By default, skip if updated_status_properties is given and
does not include this dpcode.
"""
# If updated_status_properties is None, we should not skip,
# as we don't have information on what was updated
# This happens for example on online/offline updates, where
# we still want to update the entity state
return (
updated_status_properties is not None
and self.dpcode not in updated_status_properties
)
def _convert_value_to_raw_value(self, device: CustomerDevice, value: Any) -> Any:
"""Convert a Home Assistant value back to a raw device value.
This is called by `get_update_commands` to prepare the value for sending
back to the device, and should be implemented in concrete classes if needed.
"""
raise NotImplementedError
def get_update_commands(
self, device: CustomerDevice, value: Any
) -> list[dict[str, Any]]:
"""Get the update commands for the dpcode.
The Home Assistant value is converted back to a raw device value.
"""
return [
{
"code": self.dpcode,
"value": self._convert_value_to_raw_value(device, value),
}
]
class DPCodeTypeInformationWrapper[T: TypeInformation](DPCodeWrapper):
"""Base DPCode wrapper with Type Information."""
_DPTYPE: type[T]
type_information: T
def __init__(self, dpcode: str, type_information: T) -> None:
"""Init DPCodeWrapper."""
super().__init__(dpcode)
self.type_information = type_information
def read_device_status(self, device: CustomerDevice) -> Any | None:
"""Read the device value for the dpcode."""
return self.type_information.process_raw_value(
device.status.get(self.dpcode), device
)
@classmethod
def find_dpcode(
cls,
device: CustomerDevice,
dpcodes: str | tuple[str, ...] | None,
*,
prefer_function: bool = False,
) -> Self | None:
"""Find and return a DPCodeTypeInformationWrapper for the given DP codes."""
if type_information := cls._DPTYPE.find_dpcode(
device, dpcodes, prefer_function=prefer_function
):
return cls(
dpcode=type_information.dpcode, type_information=type_information
)
return None
class DPCodeBooleanWrapper(DPCodeTypeInformationWrapper[BooleanTypeInformation]):
"""Simple wrapper for boolean values.
Supports True/False only.
"""
_DPTYPE = BooleanTypeInformation
def _convert_value_to_raw_value(
self, device: CustomerDevice, value: Any
) -> Any | None:
"""Convert a Home Assistant value back to a raw device value."""
if value in (True, False):
return value
# Currently only called with boolean values
# Safety net in case of future changes
raise ValueError(f"Invalid boolean value `{value}`")
class DPCodeJsonWrapper(DPCodeTypeInformationWrapper[JsonTypeInformation]):
"""Wrapper to extract information from a JSON value."""
_DPTYPE = JsonTypeInformation
class DPCodeEnumWrapper(DPCodeTypeInformationWrapper[EnumTypeInformation]):
"""Simple wrapper for EnumTypeInformation values."""
_DPTYPE = EnumTypeInformation
def __init__(self, dpcode: str, type_information: EnumTypeInformation) -> None:
"""Init DPCodeEnumWrapper."""
super().__init__(dpcode, type_information)
self.options = type_information.range
def _convert_value_to_raw_value(self, device: CustomerDevice, value: Any) -> Any:
"""Convert a Home Assistant value back to a raw device value."""
if value in self.type_information.range:
return value
# Guarded by select option validation
# Safety net in case of future changes
raise ValueError(
f"Enum value `{value}` out of range: {self.type_information.range}"
)
class DPCodeIntegerWrapper(DPCodeTypeInformationWrapper[IntegerTypeInformation]):
"""Simple wrapper for IntegerTypeInformation values."""
_DPTYPE = IntegerTypeInformation
def __init__(self, dpcode: str, type_information: IntegerTypeInformation) -> None:
"""Init DPCodeIntegerWrapper."""
super().__init__(dpcode, type_information)
self.native_unit = type_information.unit
self.min_value = self.type_information.scale_value(type_information.min)
self.max_value = self.type_information.scale_value(type_information.max)
self.value_step = self.type_information.scale_value(type_information.step)
def _convert_value_to_raw_value(self, device: CustomerDevice, value: Any) -> Any:
"""Convert a Home Assistant value back to a raw device value."""
new_value = round(value * (10**self.type_information.scale))
if self.type_information.min <= new_value <= self.type_information.max:
return new_value
# Guarded by number validation
# Safety net in case of future changes
raise ValueError(
f"Value `{new_value}` (converted from `{value}`) out of range:"
f" ({self.type_information.min}-{self.type_information.max})"
)
class DPCodeDeltaIntegerWrapper(DPCodeIntegerWrapper):
"""Wrapper for integer values with delta report accumulation.
This wrapper handles sensors that report incremental (delta) values
instead of cumulative totals. It accumulates the delta values locally
to provide a running total.
"""
_accumulated_value: float = 0
_last_dp_timestamp: int | None = None
def __init__(self, dpcode: str, type_information: IntegerTypeInformation) -> None:
"""Init DPCodeDeltaIntegerWrapper."""
super().__init__(dpcode, type_information)
# Delta reports use TOTAL_INCREASING state class
self.state_class = SensorStateClass.TOTAL_INCREASING
def skip_update(
self,
device: CustomerDevice,
updated_status_properties: list[str],
dp_timestamps: dict[str, int] | None,
) -> bool:
"""Override skip_update to process delta updates.
Processes delta accumulation before determining if update should be skipped.
"""
if (
super().skip_update(device, updated_status_properties, dp_timestamps)
or dp_timestamps is None
or (current_timestamp := dp_timestamps.get(self.dpcode)) is None
or current_timestamp == self._last_dp_timestamp
or (raw_value := super().read_device_status(device)) is None
):
return True
delta = float(raw_value)
self._accumulated_value += delta
_LOGGER.debug(
"Delta update for %s: +%s, total: %s",
self.dpcode,
delta,
self._accumulated_value,
)
self._last_dp_timestamp = current_timestamp
return False
def read_device_status(self, device: CustomerDevice) -> float | None:
"""Read device status, returning accumulated value for delta reports."""
return self._accumulated_value
class DPCodeRawWrapper(DPCodeTypeInformationWrapper[RawTypeInformation]):
"""Wrapper to extract information from a RAW/binary value."""
_DPTYPE = RawTypeInformation
class DPCodeStringWrapper(DPCodeTypeInformationWrapper[StringTypeInformation]):
"""Wrapper to extract information from a STRING value."""
_DPTYPE = StringTypeInformation
class DPCodeBitmapBitWrapper(DPCodeWrapper):
"""Simple wrapper for a specific bit in bitmap values."""
def __init__(self, dpcode: str, mask: int) -> None:
"""Init DPCodeBitmapWrapper."""
super().__init__(dpcode)
self._mask = mask
def read_device_status(self, device: CustomerDevice) -> bool | None:
"""Read the device value for the dpcode."""
if (raw_value := device.status.get(self.dpcode)) is None:
return None
return (raw_value & (1 << self._mask)) != 0
@classmethod
def find_dpcode(
cls,
device: CustomerDevice,
dpcodes: str | tuple[str, ...],
*,
bitmap_key: str,
) -> Self | None:
"""Find and return a DPCodeBitmapBitWrapper for the given DP codes."""
if (
type_information := BitmapTypeInformation.find_dpcode(device, dpcodes)
) and bitmap_key in type_information.label:
return cls(
type_information.dpcode, type_information.label.index(bitmap_key)
)
return None
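The deleted `models.py` above (now provided by `tuya-device-handlers`) centres on the integer scale conversion: `DPCodeIntegerWrapper` divides raw device values by `10**scale` on read and multiplies (with rounding and a range check) on write. A minimal sketch of that round trip; the free-function names are illustrative, the formula and the ValueError message follow the removed code:

```python
def scale_value(raw: int, scale: int) -> float:
    """Scale a raw integer DP value (e.g. tenths of a degree) to its real value."""
    return raw / (10 ** scale)

def to_raw_value(value: float, scale: int, min_raw: int, max_raw: int) -> int:
    """Inverse of scale_value, with the same range safety net as the removed wrapper."""
    raw = round(value * (10 ** scale))
    if min_raw <= raw <= max_raw:
        return raw
    raise ValueError(
        f"Value `{raw}` (converted from `{value}`) out of range: ({min_raw}-{max_raw})"
    )

# A temperature DP with scale=1 reporting 235 means 23.5 degrees:
# scale_value(235, 1) == 23.5 and to_raw_value(23.5, 1, -100, 400) == 235
```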


@@ -2,6 +2,8 @@
from __future__ import annotations
from tuya_device_handlers.device_wrapper.base import DeviceWrapper
from tuya_device_handlers.device_wrapper.common import DPCodeIntegerWrapper
from tuya_sharing import CustomerDevice, Manager
from homeassistant.components.number import (
@@ -25,7 +27,6 @@ from .const import (
DPCode,
)
from .entity import TuyaEntity
from .models import DeviceWrapper, DPCodeIntegerWrapper
NUMBERS: dict[DeviceCategory, tuple[NumberEntityDescription, ...]] = {
DeviceCategory.BH: (


@@ -1,60 +0,0 @@
"""Parsers for RAW (base64-encoded bytes) values."""
from dataclasses import dataclass
import struct
from typing import Self
@dataclass(kw_only=True)
class ElectricityData:
"""Electricity RAW value."""
current: float
power: float
voltage: float
@classmethod
def from_bytes(cls, raw: bytes) -> Self | None:
"""Parse bytes and return an ElectricityValue object."""
# Format:
# - legacy: 8 bytes
# - v01: [ver=0x01][len=0x0F][data(15 bytes)]
# - v02: [ver=0x02][len=0x0F][data(15 bytes)][sign_bitmap(1 byte)]
# Data layout (big-endian):
# - voltage: 2B, unit 0.1 V
# - current: 3B, unit 0.001 A (i.e., mA)
# - active power: 3B, unit 0.001 kW (i.e., W)
# - reactive power: 3B, unit 0.001 kVar
# - apparent power: 3B, unit 0.001 kVA
# - power factor: 1B, unit 0.01
# Sign bitmap (v02 only, 1 bit means negative):
# - bit0 current
# - bit1 active power
# - bit2 reactive
# - bit3 power factor
is_v1 = len(raw) == 17 and raw[0:2] == b"\x01\x0f"
is_v2 = len(raw) == 18 and raw[0:2] == b"\x02\x0f"
if is_v1 or is_v2:
data = raw[2:17]
voltage = struct.unpack(">H", data[0:2])[0] / 10.0
current = struct.unpack(">L", b"\x00" + data[2:5])[0]
power = struct.unpack(">L", b"\x00" + data[5:8])[0]
if is_v2:
sign_bitmap = raw[17]
if sign_bitmap & 0x01:
current = -current
if sign_bitmap & 0x02:
power = -power
return cls(current=current, power=power, voltage=voltage)
if len(raw) >= 8:
voltage = struct.unpack(">H", raw[0:2])[0] / 10.0
current = struct.unpack(">L", b"\x00" + raw[2:5])[0]
power = struct.unpack(">L", b"\x00" + raw[5:8])[0]
return cls(current=current, power=power, voltage=voltage)
return None


@@ -2,6 +2,8 @@
from __future__ import annotations
from tuya_device_handlers.device_wrapper.base import DeviceWrapper
from tuya_device_handlers.device_wrapper.common import DPCodeEnumWrapper
from tuya_sharing import CustomerDevice, Manager
from homeassistant.components.select import SelectEntity, SelectEntityDescription
@@ -13,7 +15,6 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import TuyaConfigEntry
from .const import TUYA_DISCOVERY_NEW, DeviceCategory, DPCode
from .entity import TuyaEntity
from .models import DeviceWrapper, DPCodeEnumWrapper
# All descriptions can be found here. Mostly the Enum data types in the
# default instructions set of each category end up being a select.

Some files were not shown because too many files have changed in this diff.