Compare commits


57 Commits

Author SHA1 Message Date
Erik Montnemery
261322b9de Merge branch 'dev' into add_cover_triggers_xxx_closed 2025-11-27 11:09:34 +01:00
Paulus Schoutsen
2ec5190243 Install requirements_test_all in dev (#157392) 2025-11-27 10:30:50 +01:00
Erik Montnemery
a706db8fdb Minor polish of cover trigger tests (#157397) 2025-11-27 09:57:03 +01:00
starkillerOG
a00923c48b Bump reolink-aio to 0.16.6 (#157399) 2025-11-27 09:53:25 +01:00
Sarah Seidman
7480d59f0f Normalize input for Droplet pairing code (#157361) 2025-11-27 08:36:30 +01:00
Erik Montnemery
4c8d9ed401 Adjust type hints in sensor group (#157373) 2025-11-27 08:34:16 +01:00
Lukas
eef10c59db Pooldose bump api 0.8.0 (new) (#157381) 2025-11-27 08:33:32 +01:00
Erik
b467186319 Add cover entity triggers xxx_closed 2025-11-27 08:15:06 +01:00
dependabot[bot]
a1a1f8dd77 Bump docker/metadata-action from 5.5.1 to 5.9.0 (#157395) 2025-11-27 07:26:58 +01:00
dependabot[bot]
c75a5c5151 Bump docker/setup-buildx-action from 3.5.0 to 3.11.1 (#157396) 2025-11-27 07:25:16 +01:00
Allen Porter
cdaaa2bd8f Update fitbit to use new asyncio client library for device list (#157308)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-11-27 00:23:49 -05:00
Allen Porter
bd84dac8fb Update roborock test typing (#157370) 2025-11-27 00:21:48 -05:00
Allen Porter
42cbeca5b0 Remove old roborock map storage (#157379) 2025-11-27 00:21:04 -05:00
Allen Porter
ad0a498d10 Bump python-roborock to 3.8.1 (#157376) 2025-11-26 16:12:19 -08:00
Jan Bouwhuis
973405822b Move translatable URL out of strings.json for knx integration (#155244) 2025-11-26 23:09:59 +01:00
Franck Nijhof
b883d2f519 Bump version to 2026.1.0dev0 2025-11-26 17:15:29 +00:00
Christopher Fenner
4654d6de87 Filter devices based on online status in ViCare integration (#157287) 2025-11-26 18:00:52 +01:00
Ludovic BOUÉ
990c8cd4e6 Add Matter Window covering operational status (#156066)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: TheJulianJES <TheJulianJES@users.noreply.github.com>
2025-11-26 18:00:13 +01:00
Raphael Hehl
f8c76f42e3 Add session clearing on config entry removal for UniFi Protect integration (#157360)
Co-authored-by: J. Nick Koston <nick@koston.org>
2025-11-26 17:59:49 +01:00
Erik Montnemery
21d914c8ca Disable experimental conditions according to labs flag setting (#157345) 2025-11-26 17:59:12 +01:00
Erik Montnemery
ec77add1a6 Reload scripts when labs flag automation.new_triggers_conditions is set (#157348) 2025-11-26 17:53:38 +01:00
Erik Montnemery
ef3b7dfd1d Reload automations when labs flag automation.new_triggers_conditions is set (#157347) 2025-11-26 17:45:25 +01:00
Robert Resch
51241d963d Bump deebot-client to 16.4.0 (#157358) 2025-11-26 17:28:41 +01:00
Joost Lekkerkerker
7c48e6e046 Delete leftover SmartThings smartapps (#157188) 2025-11-26 17:14:36 +01:00
Bram Kragten
38d8da4279 Update frontend to 20251126.0 (#157352) 2025-11-26 17:13:25 +01:00
Raphael Hehl
3396a72fa8 Bump uiprotect to version 7.29.0 (#157354) 2025-11-26 17:04:38 +01:00
Erik Montnemery
2d26ab390e Save device registry store in worker thread (#157351) 2025-11-26 17:02:10 +01:00
Thomas55555
1bf5bc9323 Bump google air quality api to 1.1.2 (#157337) 2025-11-26 16:04:01 +01:00
Erik Montnemery
87ea96a3e0 Save entity registry store in worker thread (#157274) 2025-11-26 16:03:14 +01:00
Jan Čermák
e3cf65510b Update Home Assistant base image to 2025.11.3 (#157346) 2025-11-26 15:15:08 +01:00
Robert Resch
f69fce68d6 Use buildx imagetools to copy base image to docker.io and enable provenance (#157341)
Co-authored-by: Stefan Agner <stefan@agner.ch>
2025-11-26 15:12:32 +01:00
Abílio Costa
f758cfa82f Add get_conditions_for_target websocket command (#157344)
Co-authored-by: Erik Montnemery <erik@montnemery.com>
2025-11-26 14:08:56 +00:00
Artur Pragacz
9c7a928b29 Add get encryption key websocket to esphome (#154058) 2025-11-26 14:41:19 +01:00
Petro31
405a9948a2 Deprecate legacy and undocumented template entity configurations (#155355) 2025-11-26 14:30:06 +01:00
Oscar
0e3bab3ce4 Energyid bugfix (#157343) 2025-11-26 14:29:28 +01:00
Erik Montnemery
4900d25ac8 Disable experimental triggers according to labs flag setting (#157320) 2025-11-26 14:27:05 +01:00
Shay Levy
ea10cdb4b0 Remove Shelly redundant device entry check for sleepy devices (#157333) 2025-11-26 14:54:51 +02:00
Oscar
6baf77d256 Energyid integration (#138206)
Co-authored-by: Jan Pecinovsky <jan.pecinovsky@energieid.be>
Co-authored-by: Jan Pecinovsky <janpecinovsky@gmail.com>
Co-authored-by: Norbert Rittel <norbert@rittel.de>
Co-authored-by: Erik Montnemery <erik@montnemery.com>
2025-11-26 13:38:57 +01:00
Artur Pragacz
13bc0ebed8 Remove incorrect after dependency in music assistant (#157339) 2025-11-26 13:38:18 +01:00
Marcel van der Veldt
611af9c832 Add support for authentication to the Music Assistant integration (#157257)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Artur Pragacz <49985303+arturpragacz@users.noreply.github.com>
Co-authored-by: Artur Pragacz <artur@pragacz.com>
2025-11-26 13:34:26 +01:00
Abílio Costa
c2b7a63dd9 Add get_services_for_target websocket command (#157334) 2025-11-26 12:30:51 +00:00
Robert Resch
550716a753 Optimize docker container publish job (#157076)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-11-26 13:24:45 +01:00
TheJulianJES
56a71e6798 Add missing ZHA button strings (#157335) 2025-11-26 13:21:17 +01:00
Simone Chemelli
80ec51c56b Bump aioamazondevices to 10.0.0 (#157331) 2025-11-26 13:01:40 +01:00
Allen Porter
ea651c4a22 Overhaul Roborock integration to use new devices based API (#154837) 2025-11-26 12:52:09 +01:00
Bram Kragten
ff40ce419e Add context support for triggers.yaml (#156531) 2025-11-26 12:50:17 +01:00
OzGav
d95308719c Qualify Music Assistant to Bronze Quality Level (#155260)
Co-authored-by: Artur Pragacz <49985303+arturpragacz@users.noreply.github.com>
2025-11-26 12:42:21 +01:00
Petro31
f4fb95ee43 Modernize template light (#156469) 2025-11-26 12:13:27 +01:00
Simone Chemelli
14d95cc86b Temporary raise scan interval for Alexa Devices (#157326) 2025-11-26 11:29:57 +01:00
Joost Lekkerkerker
4257435975 Add Matter info to SmartThings Device (#157321) 2025-11-26 11:28:49 +01:00
Abílio Costa
a6aab088fb Add get_triggers_for_target websocket command (#156778) 2025-11-26 11:05:03 +01:00
Aarni Koskela
655a63c104 Add clamp/wrap/remap to template math functions (#154537) 2025-11-26 11:00:12 +01:00
Robert Resch
a2ade413c2 Fix aarch64 image download by specifing the platform (#157316)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-11-26 10:02:35 +01:00
Jan Bouwhuis
10299b2ef4 Add description placeholders to service translation strings (#154984)
Co-authored-by: Erik Montnemery <erik@montnemery.com>
2025-11-26 09:54:22 +01:00
David Rapan
26444d8d34 Move Shelly sensor translation logic to base class (#157129)
Signed-off-by: David Rapan <david@rapan.cz>
2025-11-26 10:43:16 +02:00
Lukas
554c122a37 Add switch platform to PoolDose integration (#157296) 2025-11-26 09:09:35 +01:00
puddly
1c0dd02a7c Abort USB discovery flows on device unplug (#156303) 2025-11-26 09:00:41 +01:00
167 changed files with 12866 additions and 4445 deletions

View File

@@ -14,7 +14,9 @@ env:
PIP_TIMEOUT: 60
UV_HTTP_TIMEOUT: 60
UV_SYSTEM_PYTHON: "true"
BASE_IMAGE_VERSION: "2025.11.0"
# Base image version from https://github.com/home-assistant/docker
BASE_IMAGE_VERSION: "2025.11.3"
ARCHITECTURES: '["amd64", "aarch64"]'
jobs:
init:
@@ -25,6 +27,7 @@ jobs:
version: ${{ steps.version.outputs.version }}
channel: ${{ steps.version.outputs.channel }}
publish: ${{ steps.version.outputs.publish }}
architectures: ${{ env.ARCHITECTURES }}
steps:
- name: Checkout the repository
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
@@ -85,7 +88,7 @@ jobs:
strategy:
fail-fast: false
matrix:
arch: ["amd64", "aarch64"]
arch: ${{ fromJson(needs.init.outputs.architectures) }}
include:
- arch: amd64
os: ubuntu-latest
@@ -350,9 +353,6 @@ jobs:
matrix:
registry: ["ghcr.io/home-assistant", "docker.io/homeassistant"]
steps:
- name: Checkout the repository
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
- name: Install Cosign
uses: sigstore/cosign-installer@faadad0cce49287aee09b3a48701e75088a2c6ad # v4.0.0
with:
@@ -366,88 +366,94 @@ jobs:
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Login to GitHub Container Registry
if: matrix.registry == 'ghcr.io/home-assistant'
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build Meta Image
- name: Verify architecture image signatures
shell: bash
run: |
export DOCKER_CLI_EXPERIMENTAL=enabled
ARCHS=$(echo '${{ needs.init.outputs.architectures }}' | jq -r '.[]')
for arch in $ARCHS; do
echo "Verifying ${arch} image signature..."
cosign verify \
--certificate-oidc-issuer https://token.actions.githubusercontent.com \
--certificate-identity-regexp https://github.com/home-assistant/core/.* \
"ghcr.io/home-assistant/${arch}-homeassistant:${{ needs.init.outputs.version }}"
done
echo "✓ All images verified successfully"
function create_manifest() {
local tag_l=${1}
local tag_r=${2}
local registry=${{ matrix.registry }}
# Generate all Docker tags based on version string
# Version format: YYYY.MM.PATCH, YYYY.MM.PATCHbN (beta), or YYYY.MM.PATCH.devYYYYMMDDHHMM (dev)
# Examples:
# 2025.12.1 (stable) -> tags: 2025.12.1, 2025.12, stable, latest, beta, rc
# 2025.12.0b3 (beta) -> tags: 2025.12.0b3, beta, rc
# 2025.12.0.dev202511250240 -> tags: 2025.12.0.dev202511250240, dev
- name: Generate Docker metadata
id: meta
uses: docker/metadata-action@318604b99e75e41977312d83839a89be02ca4893 # v5.9.0
with:
images: ${{ matrix.registry }}/home-assistant
sep-tags: ","
tags: |
type=raw,value=${{ needs.init.outputs.version }},priority=9999
type=raw,value=dev,enable=${{ contains(needs.init.outputs.version, 'd') }}
type=raw,value=beta,enable=${{ !contains(needs.init.outputs.version, 'd') }}
type=raw,value=rc,enable=${{ !contains(needs.init.outputs.version, 'd') }}
type=raw,value=stable,enable=${{ !contains(needs.init.outputs.version, 'd') && !contains(needs.init.outputs.version, 'b') }}
type=raw,value=latest,enable=${{ !contains(needs.init.outputs.version, 'd') && !contains(needs.init.outputs.version, 'b') }}
type=semver,pattern={{major}}.{{minor}},value=${{ needs.init.outputs.version }},enable=${{ !contains(needs.init.outputs.version, 'd') && !contains(needs.init.outputs.version, 'b') }}
docker manifest create "${registry}/home-assistant:${tag_l}" \
"${registry}/amd64-homeassistant:${tag_r}" \
"${registry}/aarch64-homeassistant:${tag_r}"
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.7.1
docker manifest annotate "${registry}/home-assistant:${tag_l}" \
"${registry}/amd64-homeassistant:${tag_r}" \
--os linux --arch amd64
- name: Copy architecture images to DockerHub
if: matrix.registry == 'docker.io/homeassistant'
shell: bash
run: |
# Use imagetools to copy image blobs directly between registries
# This preserves provenance/attestations and seems to be much faster than pull/push
ARCHS=$(echo '${{ needs.init.outputs.architectures }}' | jq -r '.[]')
for arch in $ARCHS; do
echo "Copying ${arch} image to DockerHub..."
docker buildx imagetools create \
--tag "docker.io/homeassistant/${arch}-homeassistant:${{ needs.init.outputs.version }}" \
"ghcr.io/home-assistant/${arch}-homeassistant:${{ needs.init.outputs.version }}"
cosign sign --yes "docker.io/homeassistant/${arch}-homeassistant:${{ needs.init.outputs.version }}"
done
docker manifest annotate "${registry}/home-assistant:${tag_l}" \
"${registry}/aarch64-homeassistant:${tag_r}" \
--os linux --arch arm64 --variant=v8
- name: Create and push multi-arch manifests
shell: bash
run: |
# Build list of architecture images dynamically
ARCHS=$(echo '${{ needs.init.outputs.architectures }}' | jq -r '.[]')
ARCH_IMAGES=()
for arch in $ARCHS; do
ARCH_IMAGES+=("${{ matrix.registry }}/${arch}-homeassistant:${{ needs.init.outputs.version }}")
done
docker manifest push --purge "${registry}/home-assistant:${tag_l}"
cosign sign --yes "${registry}/home-assistant:${tag_l}"
}
# Build list of all tags for single manifest creation
# Note: Using sep-tags=',' in metadata-action for easier parsing
TAG_ARGS=()
IFS=',' read -ra TAGS <<< "${{ steps.meta.outputs.tags }}"
for tag in "${TAGS[@]}"; do
TAG_ARGS+=("--tag" "${tag}")
done
function validate_image() {
local image=${1}
if ! cosign verify --certificate-oidc-issuer https://token.actions.githubusercontent.com --certificate-identity-regexp https://github.com/home-assistant/core/.* "${image}"; then
echo "Invalid signature!"
exit 1
fi
}
# Create manifest with ALL tags in a single operation (much faster!)
echo "Creating multi-arch manifest with tags: ${TAGS[*]}"
docker buildx imagetools create "${TAG_ARGS[@]}" "${ARCH_IMAGES[@]}"
function push_dockerhub() {
local image=${1}
local tag=${2}
# Sign each tag separately (signing requires individual tag names)
echo "Signing all tags..."
for tag in "${TAGS[@]}"; do
echo "Signing ${tag}"
cosign sign --yes "${tag}"
done
docker tag "ghcr.io/home-assistant/${image}:${tag}" "docker.io/homeassistant/${image}:${tag}"
docker push "docker.io/homeassistant/${image}:${tag}"
cosign sign --yes "docker.io/homeassistant/${image}:${tag}"
}
# Pull images from github container registry and verify signature
docker pull "ghcr.io/home-assistant/amd64-homeassistant:${{ needs.init.outputs.version }}"
docker pull "ghcr.io/home-assistant/aarch64-homeassistant:${{ needs.init.outputs.version }}"
validate_image "ghcr.io/home-assistant/amd64-homeassistant:${{ needs.init.outputs.version }}"
validate_image "ghcr.io/home-assistant/aarch64-homeassistant:${{ needs.init.outputs.version }}"
if [[ "${{ matrix.registry }}" == "docker.io/homeassistant" ]]; then
# Upload images to dockerhub
push_dockerhub "amd64-homeassistant" "${{ needs.init.outputs.version }}"
push_dockerhub "aarch64-homeassistant" "${{ needs.init.outputs.version }}"
fi
# Create version tag
create_manifest "${{ needs.init.outputs.version }}" "${{ needs.init.outputs.version }}"
# Create general tags
if [[ "${{ needs.init.outputs.version }}" =~ d ]]; then
create_manifest "dev" "${{ needs.init.outputs.version }}"
elif [[ "${{ needs.init.outputs.version }}" =~ b ]]; then
create_manifest "beta" "${{ needs.init.outputs.version }}"
create_manifest "rc" "${{ needs.init.outputs.version }}"
else
create_manifest "stable" "${{ needs.init.outputs.version }}"
create_manifest "latest" "${{ needs.init.outputs.version }}"
create_manifest "beta" "${{ needs.init.outputs.version }}"
create_manifest "rc" "${{ needs.init.outputs.version }}"
# Create series version tag (e.g. 2021.6)
v="${{ needs.init.outputs.version }}"
create_manifest "${v%.*}" "${{ needs.init.outputs.version }}"
fi
echo "All manifests created and signed successfully"
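The tag rules encoded in the metadata-action step and the branch logic above can be condensed into a small standalone sketch. This is illustrative only — the workflow itself uses docker/metadata-action and bash — but it restates the same mapping: a version containing `d` is a dev build, one containing `b` is a beta, and everything else is stable (which also receives the `beta`/`rc` tags and a `YYYY.MM` series tag):

```python
def docker_tags(version: str) -> list[str]:
    """Sketch of the tagging rules in the workflow above (not the real metadata-action)."""
    tags = [version]  # the full version is always tagged
    if "d" in version:  # e.g. 2025.12.0.dev202511250240
        tags.append("dev")
    elif "b" in version:  # e.g. 2025.12.0b3
        tags += ["beta", "rc"]
    else:  # stable, e.g. 2025.12.1
        # stable releases also advance beta/rc and get a series tag like 2025.12
        tags += ["stable", "latest", "beta", "rc", version.rsplit(".", 1)[0]]
    return tags
```

The series tag mirrors the shell's `${v%.*}` (strip the last `.`-separated component), and the `enable=${{ !contains(...) }}` expressions in the metadata-action step encode the same `d`/`b` substring checks.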
build_python:
name: Build PyPi package

View File

@@ -40,7 +40,7 @@ env:
CACHE_VERSION: 2
UV_CACHE_VERSION: 1
MYPY_CACHE_VERSION: 1
HA_SHORT_VERSION: "2025.12"
HA_SHORT_VERSION: "2026.1"
DEFAULT_PYTHON: "3.13"
ALL_PYTHON_VERSIONS: "['3.13', '3.14']"
# 10.3 is the oldest supported version

View File

@@ -187,6 +187,7 @@ homeassistant.components.elkm1.*
homeassistant.components.emulated_hue.*
homeassistant.components.energenie_power_sockets.*
homeassistant.components.energy.*
homeassistant.components.energyid.*
homeassistant.components.energyzero.*
homeassistant.components.enigma2.*
homeassistant.components.enphase_envoy.*

CODEOWNERS generated
View File

@@ -452,6 +452,8 @@ build.json @home-assistant/supervisor
/tests/components/energenie_power_sockets/ @gnumpi
/homeassistant/components/energy/ @home-assistant/core
/tests/components/energy/ @home-assistant/core
/homeassistant/components/energyid/ @JrtPec @Molier
/tests/components/energyid/ @JrtPec @Molier
/homeassistant/components/energyzero/ @klaasnicolaas
/tests/components/energyzero/ @klaasnicolaas
/homeassistant/components/enigma2/ @autinerd

View File

@@ -21,7 +21,7 @@ from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, Upda
from .const import _LOGGER, CONF_LOGIN_DATA, DOMAIN
SCAN_INTERVAL = 30
SCAN_INTERVAL = 300
type AmazonConfigEntry = ConfigEntry[AmazonDevicesCoordinator]
@@ -45,7 +45,7 @@ class AmazonDevicesCoordinator(DataUpdateCoordinator[dict[str, AmazonDevice]]):
config_entry=entry,
update_interval=timedelta(seconds=SCAN_INTERVAL),
request_refresh_debouncer=Debouncer(
hass, _LOGGER, cooldown=30, immediate=False
hass, _LOGGER, cooldown=SCAN_INTERVAL, immediate=False
),
)
self.api = AmazonEchoApi(

View File

@@ -8,5 +8,5 @@
"iot_class": "cloud_polling",
"loggers": ["aioamazondevices"],
"quality_scale": "platinum",
"requirements": ["aioamazondevices==9.0.3"]
"requirements": ["aioamazondevices==10.0.0"]
}

View File

@@ -12,8 +12,9 @@ from typing import Any, Protocol, cast
from propcache.api import cached_property
import voluptuous as vol
from homeassistant.components import websocket_api
from homeassistant.components import labs, websocket_api
from homeassistant.components.blueprint import CONF_USE_BLUEPRINT
from homeassistant.components.labs import async_listen as async_labs_listen
from homeassistant.const import (
ATTR_ENTITY_ID,
ATTR_MODE,
@@ -114,6 +115,51 @@ ATTR_SOURCE = "source"
ATTR_VARIABLES = "variables"
SERVICE_TRIGGER = "trigger"
NEW_TRIGGERS_CONDITIONS_FEATURE_FLAG = "new_triggers_conditions"
_EXPERIMENTAL_CONDITION_PLATFORMS = {
"light",
}
_EXPERIMENTAL_TRIGGER_PLATFORMS = {
"alarm_control_panel",
"assist_satellite",
"climate",
"cover",
"fan",
"lawn_mower",
"light",
"media_player",
"text",
"vacuum",
}
@callback
def is_disabled_experimental_condition(hass: HomeAssistant, platform: str) -> bool:
"""Check if the platform is a disabled experimental condition platform."""
return (
platform in _EXPERIMENTAL_CONDITION_PLATFORMS
and not labs.async_is_preview_feature_enabled(
hass,
DOMAIN,
NEW_TRIGGERS_CONDITIONS_FEATURE_FLAG,
)
)
@callback
def is_disabled_experimental_trigger(hass: HomeAssistant, platform: str) -> bool:
"""Check if the platform is a disabled experimental trigger platform."""
return (
platform in _EXPERIMENTAL_TRIGGER_PLATFORMS
and not labs.async_is_preview_feature_enabled(
hass,
DOMAIN,
NEW_TRIGGERS_CONDITIONS_FEATURE_FLAG,
)
)
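The two helpers above apply the same rule: a platform is gated off when it is in the experimental set and the `new_triggers_conditions` labs flag is disabled. A self-contained restatement (the real helpers consult `labs.async_is_preview_feature_enabled(hass, ...)`; here the flag is passed in directly so the sketch needs no `hass`):

```python
# Experimental trigger platforms, as listed in the diff above.
EXPERIMENTAL_TRIGGER_PLATFORMS = {
    "alarm_control_panel", "assist_satellite", "climate", "cover", "fan",
    "lawn_mower", "light", "media_player", "text", "vacuum",
}


def is_disabled_experimental_trigger(platform: str, labs_flag_enabled: bool) -> bool:
    """A trigger platform is disabled iff it is experimental and the labs flag is off."""
    return platform in EXPERIMENTAL_TRIGGER_PLATFORMS and not labs_flag_enabled
```

Non-experimental platforms are never gated, so enabling the flag only ever widens what is available.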
class IfAction(Protocol):
"""Define the format of if_action."""
@@ -317,6 +363,20 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
schema=vol.Schema({vol.Optional(CONF_ID): str}),
)
@callback
def new_triggers_conditions_listener() -> None:
"""Handle new_triggers_conditions flag change."""
hass.async_create_task(
reload_helper.execute_service(ServiceCall(hass, DOMAIN, SERVICE_RELOAD))
)
async_labs_listen(
hass,
DOMAIN,
NEW_TRIGGERS_CONDITIONS_FEATURE_FLAG,
new_triggers_conditions_listener,
)
websocket_api.async_register_command(hass, websocket_config)
return True

View File

@@ -19,8 +19,17 @@ from homeassistant.helpers.typing import UNDEFINED, UndefinedType
from . import ATTR_CURRENT_POSITION, CoverDeviceClass, CoverState
from .const import DOMAIN
ATTR_FULLY_CLOSED: Final = "fully_closed"
ATTR_FULLY_OPENED: Final = "fully_opened"
COVER_CLOSED_TRIGGER_SCHEMA = ENTITY_STATE_TRIGGER_SCHEMA_FIRST_LAST.extend(
{
vol.Required(CONF_OPTIONS): {
vol.Required(ATTR_FULLY_CLOSED, default=False): bool,
},
}
)
COVER_OPENED_TRIGGER_SCHEMA = ENTITY_STATE_TRIGGER_SCHEMA_FIRST_LAST.extend(
{
vol.Required(CONF_OPTIONS): {
@@ -72,6 +81,19 @@ class CoverOpenedClosedTrigger(EntityTriggerBase):
}
class CoverClosedTrigger(CoverOpenedClosedTrigger):
"""Class for cover closed triggers."""
_schema = COVER_CLOSED_TRIGGER_SCHEMA
_to_states = {CoverState.CLOSED, CoverState.CLOSING}
def __init__(self, hass: HomeAssistant, config: TriggerConfig) -> None:
"""Initialize the state trigger."""
super().__init__(hass, config)
if self._options.get(ATTR_FULLY_CLOSED):
self._attribute_value = 0
class CoverOpenedTrigger(CoverOpenedClosedTrigger):
"""Class for cover opened triggers."""
@@ -85,6 +107,19 @@ class CoverOpenedTrigger(CoverOpenedClosedTrigger):
self._attribute_value = 100
def make_cover_closed_trigger(
device_class: CoverDeviceClass | None,
) -> type[CoverClosedTrigger]:
"""Create an entity state attribute trigger class."""
class CustomTrigger(CoverClosedTrigger):
"""Trigger for entity state changes."""
_device_class = device_class
return CustomTrigger
def make_cover_opened_trigger(
device_class: CoverDeviceClass | None,
) -> type[CoverOpenedTrigger]:
@@ -99,6 +134,15 @@ def make_cover_opened_trigger(
TRIGGERS: dict[str, type[Trigger]] = {
"awning_closed": make_cover_closed_trigger(CoverDeviceClass.AWNING),
"blind_closed": make_cover_closed_trigger(CoverDeviceClass.BLIND),
"curtain_closed": make_cover_closed_trigger(CoverDeviceClass.CURTAIN),
"door_closed": make_cover_closed_trigger(CoverDeviceClass.DOOR),
"garage_closed": make_cover_closed_trigger(CoverDeviceClass.GARAGE),
"gate_closed": make_cover_closed_trigger(CoverDeviceClass.GATE),
"shade_closed": make_cover_closed_trigger(CoverDeviceClass.SHADE),
"shutter_closed": make_cover_closed_trigger(CoverDeviceClass.SHUTTER),
"window_closed": make_cover_closed_trigger(CoverDeviceClass.WINDOW),
"awning_opened": make_cover_opened_trigger(CoverDeviceClass.AWNING),
"blind_opened": make_cover_opened_trigger(CoverDeviceClass.BLIND),
"curtain_opened": make_cover_opened_trigger(CoverDeviceClass.CURTAIN),

View File

@@ -15,6 +15,11 @@ from homeassistant.helpers.service_info.zeroconf import ZeroconfServiceInfo
from .const import DOMAIN
def normalize_pairing_code(code: str) -> str:
"""Normalize pairing code by removing spaces and capitalizing."""
return code.replace(" ", "").upper()
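The helper is pure string manipulation, so its behavior is easy to pin down; duplicated here so the example is self-contained and runnable:

```python
def normalize_pairing_code(code: str) -> str:
    """Normalize pairing code by removing spaces and capitalizing."""
    return code.replace(" ", "").upper()


# Spaces are stripped and letters uppercased, so user-typed variants
# of the same code all collapse to one canonical form.
print(normalize_pairing_code("ab cd 12"))  # ABCD12
```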
class DropletConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle Droplet config flow."""
@@ -52,14 +57,13 @@ class DropletConfigFlow(ConfigFlow, domain=DOMAIN):
if user_input is not None:
# Test if we can connect before returning
session = async_get_clientsession(self.hass)
if await self._droplet_discovery.try_connect(
session, user_input[CONF_CODE]
):
code = normalize_pairing_code(user_input[CONF_CODE])
if await self._droplet_discovery.try_connect(session, code):
device_data = {
CONF_IP_ADDRESS: self._droplet_discovery.host,
CONF_PORT: self._droplet_discovery.port,
CONF_DEVICE_ID: device_id,
CONF_CODE: user_input[CONF_CODE],
CONF_CODE: code,
}
return self.async_create_entry(
@@ -90,14 +94,15 @@ class DropletConfigFlow(ConfigFlow, domain=DOMAIN):
user_input[CONF_IP_ADDRESS], DropletConnection.DEFAULT_PORT, ""
)
session = async_get_clientsession(self.hass)
if await self._droplet_discovery.try_connect(
session, user_input[CONF_CODE]
) and (device_id := await self._droplet_discovery.get_device_id()):
code = normalize_pairing_code(user_input[CONF_CODE])
if await self._droplet_discovery.try_connect(session, code) and (
device_id := await self._droplet_discovery.get_device_id()
):
device_data = {
CONF_IP_ADDRESS: self._droplet_discovery.host,
CONF_PORT: self._droplet_discovery.port,
CONF_DEVICE_ID: device_id,
CONF_CODE: user_input[CONF_CODE],
CONF_CODE: code,
}
await self.async_set_unique_id(device_id, raise_on_progress=False)
self._abort_if_unique_id_configured(

View File

@@ -7,5 +7,5 @@
"integration_type": "hub",
"iot_class": "cloud_push",
"loggers": ["sleekxmppfs", "sucks", "deebot_client"],
"requirements": ["py-sucks==0.9.11", "deebot-client==16.3.0"]
"requirements": ["py-sucks==0.9.11", "deebot-client==16.4.0"]
}

View File

@@ -0,0 +1,401 @@
"""The EnergyID integration."""
from __future__ import annotations
from dataclasses import dataclass
import datetime as dt
from datetime import timedelta
import functools
import logging
from aiohttp import ClientError, ClientResponseError
from energyid_webhooks.client_v2 import WebhookClient
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import STATE_UNAVAILABLE, STATE_UNKNOWN
from homeassistant.core import (
CALLBACK_TYPE,
Event,
EventStateChangedData,
HomeAssistant,
callback,
)
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers import entity_registry as er
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.event import (
async_track_entity_registry_updated_event,
async_track_state_change_event,
async_track_time_interval,
)
from .const import (
CONF_DEVICE_ID,
CONF_DEVICE_NAME,
CONF_ENERGYID_KEY,
CONF_HA_ENTITY_UUID,
CONF_PROVISIONING_KEY,
CONF_PROVISIONING_SECRET,
DOMAIN,
)
_LOGGER = logging.getLogger(__name__)
type EnergyIDConfigEntry = ConfigEntry[EnergyIDRuntimeData]
DEFAULT_UPLOAD_INTERVAL_SECONDS = 60
@dataclass
class EnergyIDRuntimeData:
"""Runtime data for the EnergyID integration."""
client: WebhookClient
mappings: dict[str, str]
state_listener: CALLBACK_TYPE | None = None
registry_tracking_listener: CALLBACK_TYPE | None = None
unavailable_logged: bool = False
async def async_setup_entry(hass: HomeAssistant, entry: EnergyIDConfigEntry) -> bool:
"""Set up EnergyID from a config entry."""
session = async_get_clientsession(hass)
client = WebhookClient(
provisioning_key=entry.data[CONF_PROVISIONING_KEY],
provisioning_secret=entry.data[CONF_PROVISIONING_SECRET],
device_id=entry.data[CONF_DEVICE_ID],
device_name=entry.data[CONF_DEVICE_NAME],
session=session,
)
entry.runtime_data = EnergyIDRuntimeData(
client=client,
mappings={},
)
is_claimed = None
try:
is_claimed = await client.authenticate()
except TimeoutError as err:
raise ConfigEntryNotReady(
f"Timeout authenticating with EnergyID: {err}"
) from err
except ClientResponseError as err:
# 401/403 = invalid credentials, trigger reauth
if err.status in (401, 403):
raise ConfigEntryAuthFailed(f"Invalid credentials: {err}") from err
# Other HTTP errors are likely temporary
raise ConfigEntryNotReady(
f"HTTP error authenticating with EnergyID: {err}"
) from err
except ClientError as err:
# Network/connection errors are temporary
raise ConfigEntryNotReady(
f"Connection error authenticating with EnergyID: {err}"
) from err
except Exception as err:
# Unknown errors - log and retry (safer than forcing reauth)
_LOGGER.exception("Unexpected error during EnergyID authentication")
raise ConfigEntryNotReady(
f"Unexpected error authenticating with EnergyID: {err}"
) from err
if not is_claimed:
# Device exists but not claimed = user needs to claim it = auth issue
raise ConfigEntryAuthFailed("Device is not claimed. Please re-authenticate.")
_LOGGER.debug("EnergyID device '%s' authenticated successfully", client.device_name)
async def _async_synchronize_sensors(now: dt.datetime | None = None) -> None:
"""Callback for periodically synchronizing sensor data."""
try:
await client.synchronize_sensors()
if entry.runtime_data.unavailable_logged:
_LOGGER.debug("Connection to EnergyID re-established")
entry.runtime_data.unavailable_logged = False
except (OSError, RuntimeError) as err:
if not entry.runtime_data.unavailable_logged:
_LOGGER.debug("EnergyID is unavailable: %s", err)
entry.runtime_data.unavailable_logged = True
upload_interval = DEFAULT_UPLOAD_INTERVAL_SECONDS
if client.webhook_policy:
upload_interval = client.webhook_policy.get(
"uploadInterval", DEFAULT_UPLOAD_INTERVAL_SECONDS
)
# Schedule the callback and automatically unsubscribe when the entry is unloaded.
entry.async_on_unload(
async_track_time_interval(
hass, _async_synchronize_sensors, timedelta(seconds=upload_interval)
)
)
entry.async_on_unload(entry.add_update_listener(config_entry_update_listener))
update_listeners(hass, entry)
_LOGGER.debug(
"Starting EnergyID background sync for '%s'",
client.device_name,
)
return True
async def config_entry_update_listener(
hass: HomeAssistant, entry: EnergyIDConfigEntry
) -> None:
"""Handle config entry updates, including subentry changes."""
_LOGGER.debug("Config entry updated for %s, reloading listeners", entry.entry_id)
update_listeners(hass, entry)
@callback
def update_listeners(hass: HomeAssistant, entry: EnergyIDConfigEntry) -> None:
"""Set up or update state listeners and queue initial states."""
runtime_data = entry.runtime_data
client = runtime_data.client
# Clean up old state listener
if runtime_data.state_listener:
runtime_data.state_listener()
runtime_data.state_listener = None
mappings: dict[str, str] = {}
entities_to_track: list[str] = []
old_mappings = set(runtime_data.mappings.keys())
new_mappings = set()
ent_reg = er.async_get(hass)
subentries = list(entry.subentries.values())
_LOGGER.debug(
"Found %d subentries in entry.subentries: %s",
len(subentries),
[s.data for s in subentries],
)
# Build current entity mappings
tracked_entity_ids = []
for subentry in subentries:
entity_uuid = subentry.data.get(CONF_HA_ENTITY_UUID)
energyid_key = subentry.data.get(CONF_ENERGYID_KEY)
if not (entity_uuid and energyid_key):
continue
entity_entry = ent_reg.async_get(entity_uuid)
if not entity_entry:
_LOGGER.warning(
"Entity with UUID %s does not exist, skipping mapping to %s",
entity_uuid,
energyid_key,
)
continue
ha_entity_id = entity_entry.entity_id
tracked_entity_ids.append(ha_entity_id)
if not hass.states.get(ha_entity_id):
# Entity exists in registry but is not present in the state machine
_LOGGER.debug(
"Entity %s does not exist in state machine yet, will track when available (mapping to %s)",
ha_entity_id,
energyid_key,
)
# Still add to entities_to_track so we can handle it when state appears
entities_to_track.append(ha_entity_id)
continue
mappings[ha_entity_id] = energyid_key
entities_to_track.append(ha_entity_id)
new_mappings.add(ha_entity_id)
client.get_or_create_sensor(energyid_key)
if ha_entity_id not in old_mappings:
_LOGGER.debug(
"New mapping detected for %s, queuing initial state", ha_entity_id
)
if (
current_state := hass.states.get(ha_entity_id)
) and current_state.state not in (
STATE_UNKNOWN,
STATE_UNAVAILABLE,
):
try:
value = float(current_state.state)
timestamp = current_state.last_updated or dt.datetime.now(dt.UTC)
client.get_or_create_sensor(energyid_key).update(value, timestamp)
except (ValueError, TypeError):
_LOGGER.debug(
"Could not convert initial state of %s to float: %s",
ha_entity_id,
current_state.state,
)
# Clean up old entity registry listener
if runtime_data.registry_tracking_listener:
runtime_data.registry_tracking_listener()
runtime_data.registry_tracking_listener = None
# Set up listeners for entity registry changes
if tracked_entity_ids:
_LOGGER.debug("Setting up entity registry tracking for: %s", tracked_entity_ids)
def _handle_entity_registry_change(
event: Event[er.EventEntityRegistryUpdatedData],
) -> None:
"""Handle entity registry changes for our tracked entities."""
_LOGGER.debug("Registry event for tracked entity: %s", event.data)
if event.data["action"] == "update":
# Type is now narrowed to _EventEntityRegistryUpdatedData_Update
if "entity_id" in event.data["changes"]:
old_entity_id = event.data["changes"]["entity_id"]
new_entity_id = event.data["entity_id"]
_LOGGER.debug(
"Tracked entity ID changed: %s -> %s",
old_entity_id,
new_entity_id,
)
# Entity ID changed, need to reload listeners to track new ID
update_listeners(hass, entry)
elif event.data["action"] == "remove":
_LOGGER.debug("Tracked entity removed: %s", event.data["entity_id"])
# reminder: Create repair issue to notify user about removed entity
update_listeners(hass, entry)
# Track the specific entity IDs we care about
unsub_entity_registry = async_track_entity_registry_updated_event(
hass, tracked_entity_ids, _handle_entity_registry_change
)
runtime_data.registry_tracking_listener = unsub_entity_registry
if removed_mappings := old_mappings - new_mappings:
_LOGGER.debug("Removed mappings: %s", ", ".join(removed_mappings))
runtime_data.mappings = mappings
if not entities_to_track:
_LOGGER.debug(
"No valid sensor mappings configured for '%s'", client.device_name
)
return
unsub_state_change = async_track_state_change_event(
hass,
entities_to_track,
functools.partial(_async_handle_state_change, hass, entry.entry_id),
)
runtime_data.state_listener = unsub_state_change
_LOGGER.debug(
"Now tracking state changes for %d entities for '%s': %s",
len(entities_to_track),
client.device_name,
entities_to_track,
)
@callback
def _async_handle_state_change(
hass: HomeAssistant, entry_id: str, event: Event[EventStateChangedData]
) -> None:
"""Handle state changes for tracked entities."""
entity_id = event.data["entity_id"]
new_state = event.data["new_state"]
_LOGGER.debug(
"State change detected for entity: %s, new value: %s",
entity_id,
new_state.state if new_state else "None",
)
if not new_state or new_state.state in (STATE_UNKNOWN, STATE_UNAVAILABLE):
return
entry = hass.config_entries.async_get_entry(entry_id)
if not entry or not hasattr(entry, "runtime_data"):
# Entry is being unloaded or not yet fully initialized
return
runtime_data = entry.runtime_data
client = runtime_data.client
# Check if entity is already mapped
if energyid_key := runtime_data.mappings.get(entity_id):
# Entity already mapped, just update value
_LOGGER.debug(
"Updating EnergyID sensor %s with value %s", energyid_key, new_state.state
)
else:
# Entity not mapped yet - check if it should be (handles late-appearing entities)
ent_reg = er.async_get(hass)
for subentry in entry.subentries.values():
entity_uuid = subentry.data.get(CONF_HA_ENTITY_UUID)
energyid_key_candidate = subentry.data.get(CONF_ENERGYID_KEY)
if not (entity_uuid and energyid_key_candidate):
continue
entity_entry = ent_reg.async_get(entity_uuid)
if entity_entry and entity_entry.entity_id == entity_id:
# Found it! Add to mappings and send initial value
energyid_key = energyid_key_candidate
runtime_data.mappings[entity_id] = energyid_key
client.get_or_create_sensor(energyid_key)
_LOGGER.debug(
"Entity %s now available in state machine, adding to mappings (key: %s)",
entity_id,
energyid_key,
)
break
else:
# Not a tracked entity, ignore
return
try:
value = float(new_state.state)
except (ValueError, TypeError):
return
client.get_or_create_sensor(energyid_key).update(value, new_state.last_updated)
async def async_unload_entry(hass: HomeAssistant, entry: EnergyIDConfigEntry) -> bool:
"""Unload a config entry."""
_LOGGER.debug("Unloading EnergyID entry for %s", entry.title)
try:
# Unload subentries if present (guarded for test and reload scenarios)
if hasattr(hass.config_entries, "async_entries") and hasattr(entry, "entry_id"):
subentries = [
e.entry_id
for e in hass.config_entries.async_entries(DOMAIN)
if getattr(e, "parent_entry", None) == entry.entry_id
]
for subentry_id in subentries:
await hass.config_entries.async_unload(subentry_id)
# Only clean up listeners and client if runtime_data is present
if hasattr(entry, "runtime_data"):
runtime_data = entry.runtime_data
# Remove state listener
if runtime_data.state_listener:
runtime_data.state_listener()
# Remove registry tracking listener
if runtime_data.registry_tracking_listener:
runtime_data.registry_tracking_listener()
try:
await runtime_data.client.close()
except Exception:
_LOGGER.exception("Error closing EnergyID client for %s", entry.title)
del entry.runtime_data
except Exception:
_LOGGER.exception("Error during async_unload_entry for %s", entry.title)
return False
return True


@@ -0,0 +1,293 @@
"""Config flow for EnergyID integration."""
import asyncio
from collections.abc import Mapping
import logging
from typing import Any
from aiohttp import ClientError, ClientResponseError
from energyid_webhooks.client_v2 import WebhookClient
import voluptuous as vol
from homeassistant.config_entries import (
ConfigEntry,
ConfigFlow,
ConfigFlowResult,
ConfigSubentryFlow,
)
from homeassistant.core import callback
from homeassistant.helpers.aiohttp_client import async_get_clientsession
import homeassistant.helpers.config_validation as cv
from homeassistant.helpers.instance_id import async_get as async_get_instance_id
from .const import (
CONF_DEVICE_ID,
CONF_DEVICE_NAME,
CONF_PROVISIONING_KEY,
CONF_PROVISIONING_SECRET,
DOMAIN,
ENERGYID_DEVICE_ID_FOR_WEBHOOK_PREFIX,
MAX_POLLING_ATTEMPTS,
NAME,
POLLING_INTERVAL,
)
from .energyid_sensor_mapping_flow import EnergyIDSensorMappingFlowHandler
_LOGGER = logging.getLogger(__name__)
class EnergyIDConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle the configuration flow for the EnergyID integration."""
def __init__(self) -> None:
"""Initialize the config flow."""
self._flow_data: dict[str, Any] = {}
self._polling_task: asyncio.Task | None = None
async def _perform_auth_and_get_details(self) -> str | None:
"""Authenticate with EnergyID and retrieve device details."""
_LOGGER.debug("Starting authentication with EnergyID")
client = WebhookClient(
provisioning_key=self._flow_data[CONF_PROVISIONING_KEY],
provisioning_secret=self._flow_data[CONF_PROVISIONING_SECRET],
device_id=self._flow_data[CONF_DEVICE_ID],
device_name=self._flow_data[CONF_DEVICE_NAME],
session=async_get_clientsession(self.hass),
)
try:
is_claimed = await client.authenticate()
except ClientResponseError as err:
if err.status == 401:
_LOGGER.debug("Invalid provisioning key or secret")
return "invalid_auth"
_LOGGER.debug(
"Client response error during EnergyID authentication: %s", err
)
return "cannot_connect"
except ClientError as err:
_LOGGER.debug(
"Failed to connect to EnergyID during authentication: %s", err
)
return "cannot_connect"
except Exception:
_LOGGER.exception("Unexpected error during EnergyID authentication")
return "unknown_auth_error"
else:
_LOGGER.debug("Authentication successful, claimed: %s", is_claimed)
if is_claimed:
self._flow_data["record_number"] = client.recordNumber
self._flow_data["record_name"] = client.recordName
_LOGGER.debug(
"Device claimed with record number: %s, record name: %s",
client.recordNumber,
client.recordName,
)
return None
self._flow_data["claim_info"] = client.get_claim_info()
self._flow_data["claim_info"]["integration_name"] = NAME
_LOGGER.debug(
"Device needs claim, claim info: %s", self._flow_data["claim_info"]
)
return "needs_claim"
async def _async_poll_for_claim(self) -> None:
"""Poll EnergyID to check if device has been claimed."""
for _attempt in range(1, MAX_POLLING_ATTEMPTS + 1):
await asyncio.sleep(POLLING_INTERVAL)
auth_status = await self._perform_auth_and_get_details()
if auth_status is None:
# Device claimed - advance flow to async_step_create_entry
_LOGGER.debug("Device claimed, advancing to create entry")
self.hass.async_create_task(
self.hass.config_entries.flow.async_configure(self.flow_id)
)
return
if auth_status != "needs_claim":
# Stop polling on non-transient errors
# No user notification needed here as the error will be handled
# in the next flow step when the user continues the flow
_LOGGER.debug("Polling stopped due to error: %s", auth_status)
return
_LOGGER.debug("Polling timeout after %s attempts", MAX_POLLING_ATTEMPTS)
# No user notification here because:
# 1. User may still be completing the claim process in EnergyID portal
# 2. Immediate notification could interrupt their workflow or cause confusion
# 3. When user clicks "Submit" to continue, the flow validates claim status
# and will show appropriate error/success messages based on current state
# 4. Timeout allows graceful fallback: user can retry claim or see proper error
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle the initial step of the configuration flow."""
_LOGGER.debug("Starting user step with input: %s", user_input)
errors: dict[str, str] = {}
if user_input is not None:
instance_id = await async_get_instance_id(self.hass)
# Note: This device_id is for EnergyID's webhook system, not related to HA's device registry
device_suffix = f"{int(asyncio.get_event_loop().time() * 1000)}"
device_id = (
f"{ENERGYID_DEVICE_ID_FOR_WEBHOOK_PREFIX}{instance_id}_{device_suffix}"
)
self._flow_data = {
**user_input,
CONF_DEVICE_ID: device_id,
CONF_DEVICE_NAME: self.hass.config.location_name,
}
_LOGGER.debug("Flow data after user input: %s", self._flow_data)
auth_status = await self._perform_auth_and_get_details()
if auth_status is None:
await self.async_set_unique_id(device_id)
self._abort_if_unique_id_configured()
_LOGGER.debug(
"Creating entry with title: %s", self._flow_data["record_name"]
)
return self.async_create_entry(
title=self._flow_data["record_name"],
data=self._flow_data,
description="add_sensor_mapping_hint",
description_placeholders={"integration_name": NAME},
)
if auth_status == "needs_claim":
_LOGGER.debug("Redirecting to auth and claim step")
return await self.async_step_auth_and_claim()
errors["base"] = auth_status
_LOGGER.debug("Errors encountered during user step: %s", errors)
return self.async_show_form(
step_id="user",
data_schema=vol.Schema(
{
vol.Required(CONF_PROVISIONING_KEY): str,
vol.Required(CONF_PROVISIONING_SECRET): cv.string,
}
),
errors=errors,
description_placeholders={
"docs_url": "https://app.energyid.eu/integrations/home-assistant",
"integration_name": NAME,
},
)
async def async_step_auth_and_claim(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle the step for device claiming using external step with polling."""
_LOGGER.debug("Starting auth and claim step with input: %s", user_input)
claim_info = self._flow_data.get("claim_info", {})
# Start polling when we first enter this step
if self._polling_task is None:
self._polling_task = self.hass.async_create_task(
self._async_poll_for_claim()
)
# Show external step to open the EnergyID website
return self.async_external_step(
step_id="auth_and_claim",
url=claim_info.get("claim_url", ""),
description_placeholders=claim_info,
)
# Check if device has been claimed
auth_status = await self._perform_auth_and_get_details()
if auth_status is None:
# Device has been claimed
if self._polling_task and not self._polling_task.done():
self._polling_task.cancel()
self._polling_task = None
return self.async_external_step_done(next_step_id="create_entry")
# Device not claimed yet, show the external step again
if self._polling_task and not self._polling_task.done():
self._polling_task.cancel()
self._polling_task = None
return self.async_external_step(
step_id="auth_and_claim",
url=claim_info.get("claim_url", ""),
description_placeholders=claim_info,
)
async def async_step_create_entry(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Final step to create the entry after successful claim."""
_LOGGER.debug("Creating entry with title: %s", self._flow_data["record_name"])
return self.async_create_entry(
title=self._flow_data["record_name"],
data=self._flow_data,
description="add_sensor_mapping_hint",
description_placeholders={"integration_name": NAME},
)
async def async_step_reauth(
self, entry_data: Mapping[str, Any]
) -> ConfigFlowResult:
"""Perform reauthentication upon an API authentication error."""
# Note: This device_id is for EnergyID's webhook system, not related to HA's device registry
self._flow_data = {
CONF_DEVICE_ID: entry_data[CONF_DEVICE_ID],
CONF_DEVICE_NAME: entry_data[CONF_DEVICE_NAME],
}
return await self.async_step_reauth_confirm()
async def async_step_reauth_confirm(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Confirm reauthentication dialog."""
errors: dict[str, str] = {}
if user_input is not None:
self._flow_data.update(user_input)
auth_status = await self._perform_auth_and_get_details()
if auth_status is None:
# Authentication successful and claimed
await self.async_set_unique_id(self._flow_data["record_number"])
self._abort_if_unique_id_mismatch(reason="wrong_account")
return self.async_update_reload_and_abort(
self._get_reauth_entry(),
data_updates={
CONF_PROVISIONING_KEY: user_input[CONF_PROVISIONING_KEY],
CONF_PROVISIONING_SECRET: user_input[CONF_PROVISIONING_SECRET],
},
)
if auth_status == "needs_claim":
return await self.async_step_auth_and_claim()
errors["base"] = auth_status
return self.async_show_form(
step_id="reauth_confirm",
data_schema=vol.Schema(
{
vol.Required(CONF_PROVISIONING_KEY): str,
vol.Required(CONF_PROVISIONING_SECRET): cv.string,
}
),
errors=errors,
description_placeholders={
"docs_url": "https://app.energyid.eu/integrations/home-assistant",
"integration_name": NAME,
},
)
@classmethod
@callback
def async_get_supported_subentry_types(
cls, config_entry: ConfigEntry
) -> dict[str, type[ConfigSubentryFlow]]:
"""Return subentries supported by this integration."""
return {"sensor_mapping": EnergyIDSensorMappingFlowHandler}


@@ -0,0 +1,21 @@
"""Constants for the EnergyID integration."""
from typing import Final
DOMAIN: Final = "energyid"
NAME: Final = "EnergyID"
# --- Config Flow and Entry Data ---
CONF_PROVISIONING_KEY: Final = "provisioning_key"
CONF_PROVISIONING_SECRET: Final = "provisioning_secret"
CONF_DEVICE_ID: Final = "device_id"
CONF_DEVICE_NAME: Final = "device_name"
# --- Subentry (Mapping) Data ---
CONF_HA_ENTITY_UUID: Final = "ha_entity_uuid"
CONF_ENERGYID_KEY: Final = "energyid_key"
# --- Webhook and Polling Configuration ---
ENERGYID_DEVICE_ID_FOR_WEBHOOK_PREFIX: Final = "homeassistant_eid_"
POLLING_INTERVAL: Final = 2 # seconds
MAX_POLLING_ATTEMPTS: Final = 60 # 2 minutes total
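For reference, a minimal sketch (outside the integration) confirming the timeout arithmetic encoded in these constants — the claim-polling loop makes `MAX_POLLING_ATTEMPTS` polls spaced `POLLING_INTERVAL` seconds apart, which is the "2 minutes total" noted in the comment:

```python
# Sanity check of the polling budget; values mirror const.py above.
POLLING_INTERVAL = 2  # seconds between claim-status polls
MAX_POLLING_ATTEMPTS = 60  # number of polls before giving up

total_seconds = POLLING_INTERVAL * MAX_POLLING_ATTEMPTS
print(total_seconds)  # 120 seconds == 2 minutes
```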


@@ -0,0 +1,156 @@
"""Subentry flow for EnergyID integration, handling sensor mapping management."""
import logging
from typing import Any
import voluptuous as vol
from homeassistant.components.sensor import SensorDeviceClass, SensorStateClass
from homeassistant.config_entries import ConfigSubentryFlow, SubentryFlowResult
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import entity_registry as er
from homeassistant.helpers.selector import EntitySelector, EntitySelectorConfig
from .const import CONF_ENERGYID_KEY, CONF_HA_ENTITY_UUID, DOMAIN, NAME
_LOGGER = logging.getLogger(__name__)
@callback
def _get_suggested_entities(hass: HomeAssistant) -> list[str]:
"""Return a sorted list of suggested sensor entity IDs for mapping."""
ent_reg = er.async_get(hass)
suitable_entities = []
for entity_entry in ent_reg.entities.values():
if not (
entity_entry.domain == Platform.SENSOR and entity_entry.platform != DOMAIN
):
continue
if not hass.states.get(entity_entry.entity_id):
continue
state_class = (entity_entry.capabilities or {}).get("state_class")
has_numeric_indicators = (
state_class
in (
SensorStateClass.MEASUREMENT,
SensorStateClass.TOTAL,
SensorStateClass.TOTAL_INCREASING,
)
or entity_entry.device_class
in (
SensorDeviceClass.ENERGY,
SensorDeviceClass.GAS,
SensorDeviceClass.POWER,
SensorDeviceClass.TEMPERATURE,
SensorDeviceClass.VOLUME,
)
or entity_entry.original_device_class
in (
SensorDeviceClass.ENERGY,
SensorDeviceClass.GAS,
SensorDeviceClass.POWER,
SensorDeviceClass.TEMPERATURE,
SensorDeviceClass.VOLUME,
)
)
if has_numeric_indicators:
suitable_entities.append(entity_entry.entity_id)
return sorted(suitable_entities)
@callback
def _validate_mapping_input(
ha_entity_id: str | None,
current_mappings: set[str],
ent_reg: er.EntityRegistry,
) -> dict[str, str]:
"""Validate mapping input and return errors if any."""
errors: dict[str, str] = {}
if not ha_entity_id:
errors["base"] = "entity_required"
return errors
# Check if entity exists
entity_entry = ent_reg.async_get(ha_entity_id)
if not entity_entry:
errors["base"] = "entity_not_found"
return errors
# Check if entity is already mapped (by UUID)
entity_uuid = entity_entry.id
if entity_uuid in current_mappings:
errors["base"] = "entity_already_mapped"
return errors
class EnergyIDSensorMappingFlowHandler(ConfigSubentryFlow):
"""Handle EnergyID sensor mapping subentry flow for adding new mappings."""
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> SubentryFlowResult:
"""Handle the user step for adding a new sensor mapping."""
errors: dict[str, str] = {}
config_entry = self._get_entry()
ent_reg = er.async_get(self.hass)
if user_input is not None:
ha_entity_id = user_input.get("ha_entity_id")
# Get current mappings by UUID
current_mappings = {
uuid
for sub in config_entry.subentries.values()
if (uuid := sub.data.get(CONF_HA_ENTITY_UUID)) is not None
}
errors = _validate_mapping_input(ha_entity_id, current_mappings, ent_reg)
if not errors and ha_entity_id:
# Get entity registry entry
entity_entry = ent_reg.async_get(ha_entity_id)
if entity_entry:
energyid_key = ha_entity_id.split(".", 1)[-1]
subentry_data = {
CONF_HA_ENTITY_UUID: entity_entry.id, # Store UUID only
CONF_ENERGYID_KEY: energyid_key,
}
title = f"{ha_entity_id.split('.', 1)[-1]} connection to {NAME}"
_LOGGER.debug(
"Creating subentry with title='%s', data=%s",
title,
subentry_data,
)
_LOGGER.debug("Parent config entry ID: %s", config_entry.entry_id)
_LOGGER.debug(
"Creating subentry with parent: %s", self._get_entry().entry_id
)
return self.async_create_entry(title=title, data=subentry_data)
errors["base"] = "entity_not_found"
suggested_entities = _get_suggested_entities(self.hass)
data_schema = vol.Schema(
{
vol.Required("ha_entity_id"): EntitySelector(
EntitySelectorConfig(include_entities=suggested_entities)
),
}
)
return self.async_show_form(
step_id="user",
data_schema=data_schema,
errors=errors,
description_placeholders={"integration_name": NAME},
)


@@ -0,0 +1,12 @@
{
  "domain": "energyid",
  "name": "EnergyID",
  "codeowners": ["@JrtPec", "@Molier"],
  "config_flow": true,
  "documentation": "https://www.home-assistant.io/integrations/energyid",
  "integration_type": "service",
  "iot_class": "cloud_push",
  "loggers": ["energyid_webhooks"],
  "quality_scale": "silver",
  "requirements": ["energyid-webhooks==0.0.14"]
}


@@ -0,0 +1,137 @@
rules:
  # Bronze
  action-setup:
    status: exempt
    comment: The integration does not expose any custom service actions.
  appropriate-polling:
    status: exempt
    comment: The integration uses a push-based mechanism with a background sync task, not polling.
  brands:
    status: done
  common-modules:
    status: done
  config-flow-test-coverage:
    status: done
  config-flow:
    status: done
  dependency-transparency:
    status: done
  docs-actions:
    status: exempt
    comment: The integration does not expose any custom service actions.
  docs-high-level-description:
    status: done
  docs-installation-instructions:
    status: done
  docs-removal-instructions:
    status: done
  entity-event-setup:
    status: exempt
    comment: This integration does not create its own entities.
  entity-unique-id:
    status: exempt
    comment: This integration does not create its own entities.
  has-entity-name:
    status: exempt
    comment: This integration does not create its own entities.
  runtime-data:
    status: done
  test-before-configure:
    status: done
  test-before-setup:
    status: done
  unique-config-entry:
    status: done
  # Silver
  action-exceptions:
    status: exempt
    comment: The integration does not expose any custom service actions.
  config-entry-unloading:
    status: done
  docs-configuration-parameters:
    status: done
  docs-installation-parameters:
    status: done
  entity-unavailable:
    status: exempt
    comment: This integration does not create its own entities.
  integration-owner:
    status: done
  log-when-unavailable:
    status: done
    comment: The integration logs a single message when the EnergyID service is unavailable.
  parallel-updates:
    status: exempt
    comment: This integration does not create its own entities.
  reauthentication-flow:
    status: done
  test-coverage:
    status: done
  # Gold
  devices:
    status: exempt
    comment: The integration does not create any entities, nor does it create devices.
  diagnostics:
    status: todo
    comment: Diagnostics will be added in a follow-up PR to help with debugging.
  discovery:
    status: exempt
    comment: Configuration requires manual entry of provisioning credentials.
  discovery-update-info:
    status: exempt
    comment: No discovery mechanism is used.
  docs-data-update:
    status: done
  docs-examples:
    status: done
  docs-known-limitations:
    status: done
  docs-supported-devices:
    status: exempt
    comment: This is a service integration not tied to specific device models.
  docs-supported-functions:
    status: done
  docs-troubleshooting:
    status: done
  docs-use-cases:
    status: done
  dynamic-devices:
    status: exempt
    comment: The integration creates a single device entry for the service connection.
  entity-category:
    status: exempt
    comment: This integration does not create its own entities.
  entity-device-class:
    status: exempt
    comment: This integration does not create its own entities.
  entity-disabled-by-default:
    status: exempt
    comment: This integration does not create its own entities.
  entity-translations:
    status: exempt
    comment: This integration does not create its own entities.
  exception-translations:
    status: done
  icon-translations:
    status: exempt
    comment: This integration does not create its own entities.
  reconfiguration-flow:
    status: todo
    comment: Reconfiguration will be added in a follow-up PR to allow updating the device name.
  repair-issues:
    status: exempt
    comment: Authentication issues are handled via the reauthentication flow.
  stale-devices:
    status: exempt
    comment: Creates a single service device entry tied to the config entry.
  # Platinum
  async-dependency:
    status: done
  inject-websession:
    status: done
  strict-typing:
    status: todo
    comment: Full strict typing compliance will be addressed in a future update.


@@ -0,0 +1,71 @@
{
  "config": {
    "abort": {
      "already_configured": "This device is already configured.",
      "reauth_successful": "Reauthentication successful."
    },
    "create_entry": {
      "add_sensor_mapping_hint": "You can now add mappings from any sensor in Home Assistant to {integration_name} using the '+ add sensor mapping' button."
    },
    "error": {
      "cannot_connect": "Failed to connect to {integration_name} API.",
      "claim_failed_or_timed_out": "Claiming the device failed or the code expired.",
      "invalid_auth": "Invalid provisioning key or secret.",
      "unknown_auth_error": "An unexpected error occurred during authentication."
    },
    "step": {
      "auth_and_claim": {
        "description": "This Home Assistant connection needs to be claimed in your {integration_name} portal before it can send data.\n\n1. Go to: {claim_url}\n2. Enter code: **{claim_code}**\n3. (Code expires: {valid_until})\n\nAfter successfully claiming the device in {integration_name}, select **Submit** below to continue.",
        "title": "Claim device in {integration_name}"
      },
      "reauth_confirm": {
        "data": {
          "provisioning_key": "[%key:component::energyid::config::step::user::data::provisioning_key%]",
          "provisioning_secret": "[%key:component::energyid::config::step::user::data::provisioning_secret%]"
        },
        "data_description": {
          "provisioning_key": "[%key:component::energyid::config::step::user::data_description::provisioning_key%]",
          "provisioning_secret": "[%key:component::energyid::config::step::user::data_description::provisioning_secret%]"
        },
        "description": "Please re-enter your {integration_name} provisioning key and secret to restore the connection.\n\nMore info: {docs_url}",
        "title": "Reauthenticate {integration_name}"
      },
      "user": {
        "data": {
          "provisioning_key": "Provisioning key",
          "provisioning_secret": "Provisioning secret"
        },
        "data_description": {
          "provisioning_key": "Your unique key for provisioning.",
          "provisioning_secret": "Your secret associated with the provisioning key."
        },
        "description": "Enter your {integration_name} webhook provisioning key and secret. Find these in your {integration_name} integration setup under provisioning credentials.\n\nMore info: {docs_url}",
        "title": "Connect to {integration_name}"
      }
    }
  },
  "config_subentries": {
    "sensor_mapping": {
      "entry_type": "service",
      "error": {
        "entity_already_mapped": "This Home Assistant entity is already mapped.",
        "entity_not_found": "The selected entity was not found in the entity registry.",
        "entity_required": "You must select a sensor entity."
      },
      "initiate_flow": {
        "user": "Add sensor mapping"
      },
      "step": {
        "user": {
          "data": {
            "ha_entity_id": "Home Assistant sensor"
          },
          "data_description": {
            "ha_entity_id": "Select the sensor from Home Assistant to send to {integration_name}."
          },
          "description": "Select a Home Assistant sensor to send to {integration_name}. The sensor name will be used as the {integration_name} metric key.",
          "title": "Add sensor mapping"
        }
      }
    }
  }
}


@@ -25,6 +25,7 @@ from .domain_data import DomainData
from .encryption_key_storage import async_get_encryption_key_storage
from .entry_data import ESPHomeConfigEntry, RuntimeEntryData
from .manager import DEVICE_CONFLICT_ISSUE_FORMAT, ESPHomeManager, cleanup_instance
from .websocket_api import async_setup as async_setup_websocket_api
_LOGGER = logging.getLogger(__name__)
@@ -38,6 +39,7 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
ffmpeg_proxy.async_setup(hass)
await assist_satellite.async_setup(hass)
await dashboard.async_setup(hass)
async_setup_websocket_api(hass)
return True


@@ -0,0 +1,52 @@
"""ESPHome websocket API."""

import logging
from typing import Any

import voluptuous as vol

from homeassistant.components import websocket_api
from homeassistant.core import HomeAssistant, callback

from .const import CONF_NOISE_PSK

_LOGGER = logging.getLogger(__name__)

TYPE = "type"
ENTRY_ID = "entry_id"


@callback
def async_setup(hass: HomeAssistant) -> None:
    """Set up the websocket API."""
    websocket_api.async_register_command(hass, get_encryption_key)


@callback
@websocket_api.require_admin
@websocket_api.websocket_command(
    {
        vol.Required(TYPE): "esphome/get_encryption_key",
        vol.Required(ENTRY_ID): str,
    }
)
def get_encryption_key(
    hass: HomeAssistant,
    connection: websocket_api.connection.ActiveConnection,
    msg: dict[str, Any],
) -> None:
    """Get the encryption key for an ESPHome config entry."""
    entry = hass.config_entries.async_get_entry(msg[ENTRY_ID])
    if entry is None:
        connection.send_error(
            msg["id"], websocket_api.ERR_NOT_FOUND, "Config entry not found"
        )
        return
    connection.send_result(
        msg["id"],
        {
            "encryption_key": entry.data.get(CONF_NOISE_PSK),
        },
    )
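For illustration, a sketch of the websocket message a client would send to the command registered in this file — the message schema (`type`, `entry_id`) comes from the `websocket_command` definition above, while the `id` and `entry_id` values here are made up:

```python
import json

# Hypothetical client-side request for esphome/get_encryption_key.
msg = {
    "id": 1,  # message id chosen by the websocket client
    "type": "esphome/get_encryption_key",
    "entry_id": "1234abcd",  # config entry id of the ESPHome device (example value)
}
payload = json.dumps(msg)
# A successful reply carries {"encryption_key": <CONF_NOISE_PSK value>};
# an unknown entry_id yields an ERR_NOT_FOUND error.
print(payload)
```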


@@ -1,22 +1,30 @@
"""API for fitbit bound to Home Assistant OAuth."""
from abc import ABC, abstractmethod
from collections.abc import Callable
from collections.abc import Awaitable, Callable
import logging
from typing import Any, cast
from fitbit import Fitbit
from fitbit.exceptions import HTTPException, HTTPUnauthorized
from fitbit_web_api import ApiClient, Configuration, DevicesApi
from fitbit_web_api.exceptions import (
ApiException,
OpenApiException,
UnauthorizedException,
)
from fitbit_web_api.models.device import Device
from requests.exceptions import ConnectionError as RequestsConnectionError
from homeassistant.const import CONF_ACCESS_TOKEN
from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_entry_oauth2_flow
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.util.unit_system import METRIC_SYSTEM
from .const import FitbitUnitSystem
from .exceptions import FitbitApiException, FitbitAuthException
from .model import FitbitDevice, FitbitProfile
from .model import FitbitProfile
_LOGGER = logging.getLogger(__name__)
@@ -58,6 +66,14 @@ class FitbitApi(ABC):
expires_at=float(token[CONF_EXPIRES_AT]),
)
async def _async_get_fitbit_web_api(self) -> ApiClient:
"""Create and return an ApiClient configured with the current access token."""
token = await self.async_get_access_token()
configuration = Configuration()
configuration.pool_manager = async_get_clientsession(self._hass)
configuration.access_token = token[CONF_ACCESS_TOKEN]
return ApiClient(configuration)
async def async_get_user_profile(self) -> FitbitProfile:
"""Return the user profile from the API."""
if self._profile is None:
@@ -94,21 +110,13 @@ class FitbitApi(ABC):
return FitbitUnitSystem.METRIC
return FitbitUnitSystem.EN_US
async def async_get_devices(self) -> list[FitbitDevice]:
"""Return available devices."""
client = await self._async_get_client()
devices: list[dict[str, str]] = await self._run(client.get_devices)
async def async_get_devices(self) -> list[Device]:
"""Return available devices using fitbit-web-api."""
client = await self._async_get_fitbit_web_api()
devices_api = DevicesApi(client)
devices: list[Device] = await self._run_async(devices_api.get_devices)
_LOGGER.debug("get_devices=%s", devices)
return [
FitbitDevice(
id=device["id"],
device_version=device["deviceVersion"],
battery_level=int(device["batteryLevel"]),
battery=device["battery"],
type=device["type"],
)
for device in devices
]
return devices
async def async_get_latest_time_series(self, resource_type: str) -> dict[str, Any]:
"""Return the most recent value from the time series for the specified resource type."""
@@ -140,6 +148,20 @@ class FitbitApi(ABC):
_LOGGER.debug("Error from fitbit API: %s", err)
raise FitbitApiException("Error from fitbit API") from err
async def _run_async[_T](self, func: Callable[[], Awaitable[_T]]) -> _T:
"""Run client command."""
try:
return await func()
except UnauthorizedException as err:
_LOGGER.debug("Unauthorized error from fitbit API: %s", err)
raise FitbitAuthException("Authentication error from fitbit API") from err
except ApiException as err:
_LOGGER.debug("Error from fitbit API: %s", err)
raise FitbitApiException("Error from fitbit API") from err
except OpenApiException as err:
_LOGGER.debug("Error communicating with fitbit API: %s", err)
raise FitbitApiException("Communication error from fitbit API") from err
class OAuthFitbitApi(FitbitApi):
"""Provide fitbit authentication tied to an OAuth2 based config entry."""


@@ -6,6 +6,8 @@ import datetime
import logging
from typing import Final
from fitbit_web_api.models.device import Device
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed
@@ -13,7 +15,6 @@ from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, Upda
from .api import FitbitApi
from .exceptions import FitbitApiException, FitbitAuthException
from .model import FitbitDevice
_LOGGER = logging.getLogger(__name__)
@@ -23,7 +24,7 @@ TIMEOUT = 10
type FitbitConfigEntry = ConfigEntry[FitbitData]
class FitbitDeviceCoordinator(DataUpdateCoordinator[dict[str, FitbitDevice]]):
class FitbitDeviceCoordinator(DataUpdateCoordinator[dict[str, Device]]):
"""Coordinator for fetching fitbit devices from the API."""
config_entry: FitbitConfigEntry
@@ -41,7 +42,7 @@ class FitbitDeviceCoordinator(DataUpdateCoordinator[dict[str, FitbitDevice]]):
)
self._api = api
async def _async_update_data(self) -> dict[str, FitbitDevice]:
async def _async_update_data(self) -> dict[str, Device]:
"""Fetch data from API endpoint."""
async with asyncio.timeout(TIMEOUT):
try:
@@ -50,7 +51,7 @@ class FitbitDeviceCoordinator(DataUpdateCoordinator[dict[str, FitbitDevice]]):
raise ConfigEntryAuthFailed(err) from err
except FitbitApiException as err:
raise UpdateFailed(err) from err
return {device.id: device for device in devices}
return {device.id: device for device in devices if device.id is not None}
@dataclass


@@ -6,6 +6,6 @@
"dependencies": ["application_credentials", "http"],
"documentation": "https://www.home-assistant.io/integrations/fitbit",
"iot_class": "cloud_polling",
"loggers": ["fitbit"],
"requirements": ["fitbit==0.3.1"]
"loggers": ["fitbit", "fitbit_web_api"],
"requirements": ["fitbit==0.3.1", "fitbit-web-api==2.13.5"]
}


@@ -21,26 +21,6 @@ class FitbitProfile:
"""The locale defined in the user's Fitbit account settings."""
@dataclass
class FitbitDevice:
"""Device from the Fitbit API response."""
id: str
"""The device ID."""
device_version: str
"""The product name of the device."""
battery_level: int
"""The battery level as a percentage."""
battery: str
"""Returns the battery level of the device."""
type: str
"""The type of the device such as TRACKER or SCALE."""
@dataclass
class FitbitConfig:
"""Information from the fitbit ConfigEntry data."""


@@ -8,6 +8,8 @@ import datetime
import logging
from typing import Any, Final, cast
from fitbit_web_api.models.device import Device
from homeassistant.components.sensor import (
SensorDeviceClass,
SensorEntity,
@@ -32,7 +34,7 @@ from .api import FitbitApi
from .const import ATTRIBUTION, BATTERY_LEVELS, DOMAIN, FitbitScope, FitbitUnitSystem
from .coordinator import FitbitConfigEntry, FitbitDeviceCoordinator
from .exceptions import FitbitApiException, FitbitAuthException
from .model import FitbitDevice, config_from_entry_data
from .model import config_from_entry_data
_LOGGER: Final = logging.getLogger(__name__)
@@ -657,7 +659,7 @@ class FitbitBatterySensor(CoordinatorEntity[FitbitDeviceCoordinator], SensorEnti
coordinator: FitbitDeviceCoordinator,
user_profile_id: str,
description: FitbitSensorEntityDescription,
device: FitbitDevice,
device: Device,
enable_default_override: bool,
) -> None:
"""Initialize the Fitbit sensor."""
@@ -677,7 +679,9 @@ class FitbitBatterySensor(CoordinatorEntity[FitbitDeviceCoordinator], SensorEnti
@property
def icon(self) -> str | None:
"""Icon to use in the frontend, if any."""
if battery_level := BATTERY_LEVELS.get(self.device.battery):
if self.device.battery is not None and (
battery_level := BATTERY_LEVELS.get(self.device.battery)
):
return icon_for_battery_level(battery_level=battery_level)
return self.entity_description.icon
@@ -697,7 +701,7 @@ class FitbitBatterySensor(CoordinatorEntity[FitbitDeviceCoordinator], SensorEnti
@callback
def _handle_coordinator_update(self) -> None:
"""Handle updated data from the coordinator."""
self.device = self.coordinator.data[self.device.id]
self.device = self.coordinator.data[cast(str, self.device.id)]
self._attr_native_value = self.device.battery
self.async_write_ha_state()
@@ -715,7 +719,7 @@ class FitbitBatteryLevelSensor(
coordinator: FitbitDeviceCoordinator,
user_profile_id: str,
description: FitbitSensorEntityDescription,
device: FitbitDevice,
device: Device,
) -> None:
"""Initialize the Fitbit sensor."""
super().__init__(coordinator)
@@ -736,6 +740,6 @@ class FitbitBatteryLevelSensor(
@callback
def _handle_coordinator_update(self) -> None:
"""Handle updated data from the coordinator."""
self.device = self.coordinator.data[self.device.id]
self.device = self.coordinator.data[cast(str, self.device.id)]
self._attr_native_value = self.device.battery_level
self.async_write_ha_state()


@@ -23,5 +23,5 @@
"winter_mode": {}
},
"quality_scale": "internal",
"requirements": ["home-assistant-frontend==20251105.1"]
"requirements": ["home-assistant-frontend==20251126.0"]
}


@@ -8,5 +8,5 @@
"iot_class": "cloud_polling",
"loggers": ["google_air_quality_api"],
"quality_scale": "bronze",
"requirements": ["google_air_quality_api==1.1.1"]
"requirements": ["google_air_quality_api==1.1.2"]
}


@@ -53,7 +53,7 @@ from homeassistant.helpers.issue_registry import (
async_create_issue,
async_delete_issue,
)
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType, StateType
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from .const import CONF_IGNORE_NON_NUMERIC, DOMAIN
from .entity import GroupEntity
@@ -374,7 +374,7 @@ class SensorGroup(GroupEntity, SensorEntity):
def async_update_group_state(self) -> None:
"""Query all members and determine the sensor group state."""
self.calculate_state_attributes(self._get_valid_entities())
states: list[StateType] = []
states: list[str] = []
valid_units = self._valid_units
valid_states: list[bool] = []
sensor_values: list[tuple[str, float, State]] = []


@@ -31,7 +31,7 @@ from homeassistant.const import (
UnitOfTemperature,
UnitOfVolume,
)
from homeassistant.core import HomeAssistant, ServiceCall, callback
from homeassistant.core import HomeAssistant, ServiceCall, ServiceResponse, callback
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.device_registry import DeviceEntry
from homeassistant.helpers.issue_registry import (
@@ -81,11 +81,22 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
)
@callback
def service_handler(call: ServiceCall | None = None) -> None:
def service_handler(call: ServiceCall | None = None) -> ServiceResponse:
"""Do nothing."""
return None
hass.services.async_register(
DOMAIN, "test_service_1", service_handler, SCHEMA_SERVICE_TEST_SERVICE_1
DOMAIN,
"test_service_1",
service_handler,
SCHEMA_SERVICE_TEST_SERVICE_1,
description_placeholders={
"meep_1": "foo",
"meep_2": "bar",
"meep_3": "beer",
"meep_4": "milk",
"meep_5": "https://example.com",
},
)
return True

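The demo service above exercises `description_placeholders`: keys such as `meep_1` fill the `{…}` slots in the service's `strings.json` entries. The substitution is plain key-based formatting, which can be sketched as follows — this helper is illustrative, not Home Assistant's actual translation machinery:

```python
def render(template: str, placeholders: dict[str, str]) -> str:
    # Each {key} slot in the translated string is replaced by its placeholder value.
    return template.format(**placeholders)


description = "Fake action for testing {meep_2}"
placeholders = {"meep_1": "foo", "meep_2": "bar"}
```

Here `render(description, placeholders)` yields "Fake action for testing bar"; the KNX change later in this diff uses the same mechanism to move a documentation URL out of `strings.json`.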

@@ -117,14 +117,16 @@
},
"services": {
"test_service_1": {
"description": "Fake action for testing",
"description": "Fake action for testing {meep_2}",
"fields": {
"field_1": {
"description": "Number of seconds",
"name": "Field 1"
"description": "Number of seconds {meep_4}",
"example": "Example: {meep_5}",
"name": "Field 1 {meep_3}"
},
"field_2": {
"description": "Mode",
"example": "Field 2 example",
"name": "Field 2"
},
"field_3": {
@@ -136,7 +138,7 @@
"name": "Field 4"
}
},
"name": "Test action 1",
"name": "Test action {meep_1}",
"sections": {
"advanced_fields": {
"description": "Some very advanced things",


@@ -39,6 +39,10 @@ if TYPE_CHECKING:
_LOGGER = logging.getLogger(__name__)
_DESCRIPTION_PLACEHOLDERS = {
"sensor_value_types_url": "https://www.home-assistant.io/integrations/knx/#value-types"
}
@callback
def async_setup_services(hass: HomeAssistant) -> None:
@@ -48,6 +52,7 @@ def async_setup_services(hass: HomeAssistant) -> None:
SERVICE_KNX_SEND,
service_send_to_knx_bus,
schema=SERVICE_KNX_SEND_SCHEMA,
description_placeholders=_DESCRIPTION_PLACEHOLDERS,
)
hass.services.async_register(
@@ -63,6 +68,7 @@ def async_setup_services(hass: HomeAssistant) -> None:
SERVICE_KNX_EVENT_REGISTER,
service_event_register_modify,
schema=SERVICE_KNX_EVENT_REGISTER_SCHEMA,
description_placeholders=_DESCRIPTION_PLACEHOLDERS,
)
async_register_admin_service(
@@ -71,6 +77,7 @@ def async_setup_services(hass: HomeAssistant) -> None:
SERVICE_KNX_EXPOSURE_REGISTER,
service_exposure_register_modify,
schema=SERVICE_KNX_EXPOSURE_REGISTER_SCHEMA,
description_placeholders=_DESCRIPTION_PLACEHOLDERS,
)
async_register_admin_service(


@@ -674,7 +674,7 @@
"name": "Remove event registration"
},
"type": {
"description": "If set, the payload will be decoded as given DPT in the event data `value` key. KNX sensor types are valid values (see https://www.home-assistant.io/integrations/knx/#value-types).",
"description": "If set, the payload will be decoded as given DPT in the event data `value` key. KNX sensor types are valid values (see {sensor_value_types_url}).",
"name": "Value type"
}
},
@@ -704,7 +704,7 @@
"name": "Remove exposure"
},
"type": {
"description": "Telegrams will be encoded as given DPT. 'binary' and all KNX sensor types are valid values (see https://www.home-assistant.io/integrations/knx/#value-types).",
"description": "Telegrams will be encoded as given DPT. 'binary' and all KNX sensor types are valid values (see {sensor_value_types_url}).",
"name": "Value type"
}
},
@@ -740,7 +740,7 @@
"name": "Send as Response"
},
"type": {
"description": "If set, the payload will not be sent as raw bytes, but encoded as given DPT. KNX sensor types are valid values (see https://www.home-assistant.io/integrations/knx/#value-types).",
"description": "If set, the payload will not be sent as raw bytes, but encoded as given DPT. KNX sensor types are valid values (see {sensor_value_types_url}).",
"name": "Value type"
}
},


@@ -485,4 +485,18 @@ DISCOVERY_SCHEMAS = [
required_attributes=(clusters.RefrigeratorAlarm.Attributes.State,),
allow_multi=True,
),
MatterDiscoverySchema(
platform=Platform.BINARY_SENSOR,
entity_description=MatterBinarySensorEntityDescription(
key="WindowCoveringConfigStatusOperational",
device_class=BinarySensorDeviceClass.PROBLEM,
entity_category=EntityCategory.DIAGNOSTIC,
# An unset Operational bit in the ConfigStatus bitmap means a problem
device_to_ha=lambda x: not bool(
x & clusters.WindowCovering.Bitmaps.ConfigStatus.kOperational
),
),
entity_class=MatterBinarySensor,
required_attributes=(clusters.WindowCovering.Attributes.ConfigStatus,),
),
]

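The new Matter binary sensor inverts a bitmap test: with a PROBLEM device class, the entity must report `True` when the Operational bit of `ConfigStatus` is *unset*. A self-contained sketch of that check — the bit position here is illustrative; the authoritative layout is the Matter WindowCovering cluster's ConfigStatus bitmap:

```python
from enum import IntFlag


class ConfigStatus(IntFlag):
    # Illustrative bit value; see the Matter WindowCovering cluster spec.
    kOperational = 0b1


def is_problem(config_status: int) -> bool:
    # PROBLEM semantics: True when the covering is NOT operational.
    return not bool(config_status & ConfigStatus.kOperational)
```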

@@ -8,16 +8,25 @@ from dataclasses import dataclass, field
from typing import TYPE_CHECKING
from music_assistant_client import MusicAssistantClient
from music_assistant_client.exceptions import CannotConnect, InvalidServerVersion
from music_assistant_client.exceptions import (
CannotConnect,
InvalidServerVersion,
MusicAssistantClientException,
)
from music_assistant_models.config_entries import PlayerConfig
from music_assistant_models.enums import EventType
from music_assistant_models.errors import ActionUnavailable, MusicAssistantError
from music_assistant_models.errors import (
ActionUnavailable,
AuthenticationFailed,
InvalidToken,
MusicAssistantError,
)
from music_assistant_models.player import Player
from homeassistant.config_entries import ConfigEntry, ConfigEntryState
from homeassistant.const import CONF_URL, EVENT_HOMEASSISTANT_STOP, Platform
from homeassistant.core import Event, HomeAssistant
from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers import config_validation as cv, device_registry as dr
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.issue_registry import (
@@ -26,7 +35,7 @@ from homeassistant.helpers.issue_registry import (
async_delete_issue,
)
from .const import ATTR_CONF_EXPOSE_PLAYER_TO_HA, DOMAIN, LOGGER
from .const import ATTR_CONF_EXPOSE_PLAYER_TO_HA, CONF_TOKEN, DOMAIN, LOGGER
from .helpers import get_music_assistant_client
from .services import register_actions
@@ -59,6 +68,7 @@ class MusicAssistantEntryData:
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the Music Assistant component."""
register_actions(hass)
return True
@@ -68,7 +78,9 @@ async def async_setup_entry( # noqa: C901
"""Set up Music Assistant from a config entry."""
http_session = async_get_clientsession(hass, verify_ssl=False)
mass_url = entry.data[CONF_URL]
mass = MusicAssistantClient(mass_url, http_session)
# Get token from config entry (for schema >= AUTH_SCHEMA_VERSION)
token = entry.data.get(CONF_TOKEN)
mass = MusicAssistantClient(mass_url, http_session, token=token)
try:
async with asyncio.timeout(CONNECT_TIMEOUT):
@@ -87,6 +99,14 @@ async def async_setup_entry( # noqa: C901
translation_key="invalid_server_version",
)
raise ConfigEntryNotReady(f"Invalid server version: {err}") from err
except (AuthenticationFailed, InvalidToken) as err:
raise ConfigEntryAuthFailed(
f"Authentication failed for {mass_url}: {err}"
) from err
except MusicAssistantClientException as err:
raise ConfigEntryNotReady(
f"Failed to connect to music assistant server {mass_url}: {err}"
) from err
except MusicAssistantError as err:
LOGGER.exception("Failed to connect to music assistant server", exc_info=err)
raise ConfigEntryNotReady(

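The setup change above maps client-library errors onto Home Assistant's config-entry exceptions: authentication failures raise `ConfigEntryAuthFailed` (which starts a reauth flow), while other client errors raise `ConfigEntryNotReady` (which schedules a setup retry). A stripped-down sketch of that mapping, using stub exception classes in place of the real `homeassistant` and `music_assistant_client` ones:

```python
import asyncio


# Stubs standing in for the homeassistant and music_assistant_client exceptions.
class ConfigEntryAuthFailed(Exception): ...
class ConfigEntryNotReady(Exception): ...
class AuthenticationFailed(Exception): ...
class CannotConnect(Exception): ...


async def connect(client) -> None:
    try:
        await client.connect()
    except AuthenticationFailed as err:
        # Invalid or expired token: prompt the user to re-authenticate.
        raise ConfigEntryAuthFailed(f"Authentication failed: {err}") from err
    except CannotConnect as err:
        # Transient failure: Home Assistant retries setup later.
        raise ConfigEntryNotReady(f"Failed to connect: {err}") from err
```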

@@ -2,40 +2,79 @@
from __future__ import annotations
from collections.abc import Mapping
from typing import TYPE_CHECKING, Any
from urllib.parse import urlencode
from music_assistant_client import MusicAssistantClient
from music_assistant_client.auth_helpers import create_long_lived_token, get_server_info
from music_assistant_client.exceptions import (
CannotConnect,
InvalidServerVersion,
MusicAssistantClientException,
)
from music_assistant_models.api import ServerInfoMessage
from music_assistant_models.errors import AuthenticationFailed, InvalidToken
import voluptuous as vol
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.config_entries import SOURCE_REAUTH, ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_URL
from homeassistant.core import HomeAssistant
from homeassistant.helpers import aiohttp_client
from homeassistant.helpers.config_entry_oauth2_flow import (
_encode_jwt,
async_get_redirect_uri,
)
from homeassistant.helpers.service_info.hassio import HassioServiceInfo
from homeassistant.helpers.service_info.zeroconf import ZeroconfServiceInfo
from .const import DOMAIN, LOGGER
from .const import (
AUTH_SCHEMA_VERSION,
CONF_TOKEN,
DOMAIN,
HASSIO_DISCOVERY_SCHEMA_VERSION,
LOGGER,
)
DEFAULT_TITLE = "Music Assistant"
DEFAULT_URL = "http://mass.local:8095"
STEP_USER_SCHEMA = vol.Schema({vol.Required(CONF_URL): str})
STEP_AUTH_TOKEN_SCHEMA = vol.Schema({vol.Required(CONF_TOKEN): str})
def _parse_zeroconf_server_info(properties: dict[str, str]) -> ServerInfoMessage:
"""Parse zeroconf properties to ServerInfoMessage."""
return ServerInfoMessage(
server_id=properties["server_id"],
server_version=properties["server_version"],
schema_version=int(properties["schema_version"]),
min_supported_schema_version=int(properties["min_supported_schema_version"]),
base_url=properties["base_url"],
homeassistant_addon=properties["homeassistant_addon"].lower() == "true",
onboard_done=properties["onboard_done"].lower() == "true",
)
async def _get_server_info(hass: HomeAssistant, url: str) -> ServerInfoMessage:
"""Validate the user input allows us to connect."""
"""Get MA server info for the given URL."""
session = aiohttp_client.async_get_clientsession(hass)
return await get_server_info(server_url=url, aiohttp_session=session)
async def _test_connection(hass: HomeAssistant, url: str, token: str) -> None:
"""Test connection to MA server with given URL and token."""
session = aiohttp_client.async_get_clientsession(hass)
async with MusicAssistantClient(
url, aiohttp_client.async_get_clientsession(hass)
server_url=url,
aiohttp_session=session,
token=token,
) as client:
if TYPE_CHECKING:
assert client.server_info is not None
return client.server_info
# Just executing any command to test the connection.
# If auth is required and the token is invalid, this will raise.
await client.send_command("info")
class MusicAssistantConfigFlow(ConfigFlow, domain=DOMAIN):
@@ -46,16 +85,18 @@ class MusicAssistantConfigFlow(ConfigFlow, domain=DOMAIN):
def __init__(self) -> None:
"""Set up flow instance."""
self.url: str | None = None
self.token: str | None = None
self.server_info: ServerInfoMessage | None = None
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle a manual configuration."""
errors: dict[str, str] = {}
if user_input is not None:
self.url = user_input[CONF_URL]
try:
server_info = await _get_server_info(self.hass, user_input[CONF_URL])
server_info = await _get_server_info(self.hass, self.url)
except CannotConnect:
errors["base"] = "cannot_connect"
except InvalidServerVersion:
@@ -64,16 +105,21 @@ class MusicAssistantConfigFlow(ConfigFlow, domain=DOMAIN):
LOGGER.exception("Unexpected exception")
errors["base"] = "unknown"
else:
self.server_info = server_info
await self.async_set_unique_id(
server_info.server_id, raise_on_progress=False
)
self._abort_if_unique_id_configured(
updates={CONF_URL: user_input[CONF_URL]}
)
self._abort_if_unique_id_configured(updates={CONF_URL: self.url})
# Check if authentication is required for this server
if server_info.schema_version >= AUTH_SCHEMA_VERSION:
# Redirect to browser-based authentication
return await self.async_step_auth()
# Old server, no auth needed
return self.async_create_entry(
title=DEFAULT_TITLE,
data={CONF_URL: user_input[CONF_URL]},
data={CONF_URL: self.url},
)
suggested_values = user_input
@@ -88,16 +134,87 @@ class MusicAssistantConfigFlow(ConfigFlow, domain=DOMAIN):
errors=errors,
)
async def async_step_hassio(
self, discovery_info: HassioServiceInfo
) -> ConfigFlowResult:
"""Handle Home Assistant add-on discovery.
This flow is triggered by the Music Assistant add-on.
"""
# Build URL from add-on discovery info
# The add-on exposes the API on port 8095, but also hosts an internal-only
# webserver (default at port 8094) for the Home Assistant integration to connect to.
# The host and port where the internal API is exposed are passed via discovery_info
host = discovery_info.config["host"]
port = discovery_info.config["port"]
self.url = f"http://{host}:{port}"
try:
server_info = await _get_server_info(self.hass, self.url)
except CannotConnect:
return self.async_abort(reason="cannot_connect")
except InvalidServerVersion:
return self.async_abort(reason="invalid_server_version")
except MusicAssistantClientException:
LOGGER.exception("Unexpected exception during add-on discovery")
return self.async_abort(reason="unknown")
if not server_info.onboard_done:
return self.async_abort(reason="server_not_ready")
# We trust the token from hassio discovery and validate it during setup
self.token = discovery_info.config["auth_token"]
self.server_info = server_info
await self.async_set_unique_id(server_info.server_id)
self._abort_if_unique_id_configured(
updates={CONF_URL: self.url, CONF_TOKEN: self.token}
)
return await self.async_step_hassio_confirm()
async def async_step_hassio_confirm(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Confirm the add-on discovery."""
if TYPE_CHECKING:
assert self.url is not None
if user_input is not None:
data = {CONF_URL: self.url}
if self.token:
data[CONF_TOKEN] = self.token
return self.async_create_entry(
title=DEFAULT_TITLE,
data=data,
)
self._set_confirm_only()
return self.async_show_form(step_id="hassio_confirm")
async def async_step_zeroconf(
self, discovery_info: ZeroconfServiceInfo
) -> ConfigFlowResult:
"""Handle a zeroconf discovery for a Music Assistant server."""
try:
server_info = ServerInfoMessage.from_dict(discovery_info.properties)
except LookupError:
# Parse zeroconf properties (strings) to ServerInfoMessage
server_info = _parse_zeroconf_server_info(discovery_info.properties)
except (LookupError, KeyError, ValueError):
return self.async_abort(reason="invalid_discovery_info")
if server_info.schema_version >= HASSIO_DISCOVERY_SCHEMA_VERSION:
# Ignore servers running as Home Assistant add-on
# (they should be discovered through hassio discovery instead)
if server_info.homeassistant_addon:
LOGGER.debug("Ignoring add-on server in zeroconf discovery")
return self.async_abort(reason="already_discovered_addon")
# Ignore servers that have not completed onboarding yet
if not server_info.onboard_done:
LOGGER.debug("Ignoring server that hasn't completed onboarding")
return self.async_abort(reason="server_not_ready")
self.url = server_info.base_url
self.server_info = server_info
await self.async_set_unique_id(server_info.server_id)
self._abort_if_unique_id_configured(updates={CONF_URL: self.url})
@@ -115,8 +232,15 @@ class MusicAssistantConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle user-confirmation of discovered server."""
if TYPE_CHECKING:
assert self.url is not None
assert self.server_info is not None
if user_input is not None:
# Check if authentication is required for this server
if self.server_info.schema_version >= AUTH_SCHEMA_VERSION:
# Redirect to browser-based authentication
return await self.async_step_auth()
# Old server, no auth needed
return self.async_create_entry(
title=DEFAULT_TITLE,
data={CONF_URL: self.url},
@@ -127,3 +251,152 @@ class MusicAssistantConfigFlow(ConfigFlow, domain=DOMAIN):
step_id="discovery_confirm",
description_placeholders={"url": self.url},
)
async def async_step_auth(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle authentication via redirect to MA login."""
if TYPE_CHECKING:
assert self.url is not None
# Check if we're returning from the external auth step with a token
if user_input is not None:
if "error" in user_input:
return self.async_abort(reason="auth_error")
# OAuth2 callback sends token as "code" parameter
if "code" in user_input:
self.token = user_input["code"]
return self.async_external_step_done(next_step_id="finish_auth")
# Check if we can use external auth (redirect flow)
try:
redirect_uri = async_get_redirect_uri(self.hass)
except RuntimeError:
# No current request context or missing required headers
return await self.async_step_auth_manual()
# Use OAuth2 callback URL with JWT-encoded state
state = _encode_jwt(
self.hass, {"flow_id": self.flow_id, "redirect_uri": redirect_uri}
)
# Music Assistant server will redirect to: {redirect_uri}?state={state}&code={token}
params = urlencode(
{
"return_url": f"{redirect_uri}?state={state}",
"device_name": "Home Assistant",
}
)
login_url = f"{self.url}/login?{params}"
return self.async_external_step(step_id="auth", url=login_url)
async def async_step_finish_auth(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Finish authentication after receiving token."""
if TYPE_CHECKING:
assert self.url is not None
assert self.token is not None
# Exchange session token for long-lived token
# The login flow gives us a session token (short expiration)
session = aiohttp_client.async_get_clientsession(self.hass)
try:
LOGGER.debug("Creating long-lived token")
long_lived_token = await create_long_lived_token(
self.url,
self.token,
"Home Assistant",
aiohttp_session=session,
)
LOGGER.debug("Successfully created long-lived token")
except (TimeoutError, CannotConnect):
return self.async_abort(reason="cannot_connect")
except (AuthenticationFailed, InvalidToken) as err:
LOGGER.error("Authentication failed: %s", err)
return self.async_abort(reason="auth_failed")
except InvalidServerVersion as err:
LOGGER.error("Invalid server version: %s", err)
return self.async_abort(reason="invalid_server_version")
except MusicAssistantClientException:
LOGGER.exception("Unexpected exception during connection test")
return self.async_abort(reason="unknown")
if self.source == SOURCE_REAUTH:
reauth_entry = self._get_reauth_entry()
return self.async_update_reload_and_abort(
reauth_entry,
data={CONF_URL: self.url, CONF_TOKEN: long_lived_token},
)
# Connection has been validated by creating a long-lived token
return self.async_create_entry(
title=DEFAULT_TITLE,
data={CONF_URL: self.url, CONF_TOKEN: long_lived_token},
)
async def async_step_auth_manual(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle manual token entry as fallback."""
if TYPE_CHECKING:
assert self.url is not None
errors: dict[str, str] = {}
if user_input is not None:
self.token = user_input[CONF_TOKEN]
try:
# Test the connection with the provided token
await _test_connection(self.hass, self.url, self.token)
except CannotConnect:
return self.async_abort(reason="cannot_connect")
except InvalidServerVersion:
return self.async_abort(reason="invalid_server_version")
except (AuthenticationFailed, InvalidToken):
errors["base"] = "auth_failed"
except MusicAssistantClientException:
LOGGER.exception("Unexpected exception during manual auth")
return self.async_abort(reason="unknown")
else:
if self.source == SOURCE_REAUTH:
return self.async_update_reload_and_abort(
self._get_reauth_entry(),
data={CONF_URL: self.url, CONF_TOKEN: self.token},
)
return self.async_create_entry(
title=DEFAULT_TITLE,
data={CONF_URL: self.url, CONF_TOKEN: self.token},
)
return self.async_show_form(
step_id="auth_manual",
data_schema=vol.Schema({vol.Required(CONF_TOKEN): str}),
description_placeholders={"url": self.url},
errors=errors,
)
async def async_step_reauth(
self, entry_data: Mapping[str, Any]
) -> ConfigFlowResult:
"""Handle reauth when token is invalid or expired."""
self.url = entry_data[CONF_URL]
# Show confirmation before redirecting to auth
return await self.async_step_reauth_confirm()
async def async_step_reauth_confirm(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Confirm reauth dialog."""
if TYPE_CHECKING:
assert self.url is not None
if user_input is not None:
# Redirect to auth flow
return await self.async_step_auth()
return self.async_show_form(
step_id="reauth_confirm",
description_placeholders={"url": self.url},
)

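The auth step above builds the Music Assistant login URL by wrapping the OAuth2 callback URI (with its JWT-encoded state) inside a `return_url` query parameter. A minimal sketch of that URL construction — parameter names follow the diff; the server's exact contract is an assumption here:

```python
from urllib.parse import urlencode


def build_login_url(server_url: str, redirect_uri: str, state: str) -> str:
    # The MA server is expected to redirect back to return_url with the
    # session token appended as the "code" query parameter.
    params = urlencode(
        {
            "return_url": f"{redirect_uri}?state={state}",
            "device_name": "Home Assistant",
        }
    )
    return f"{server_url}/login?{params}"
```

`urlencode` percent-encodes the nested callback URI, so the `state` value survives the round trip through the server's redirect intact.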

@@ -7,6 +7,13 @@ DOMAIN_EVENT = f"{DOMAIN}_event"
DEFAULT_NAME = "Music Assistant"
# Schema version where mandatory authentication was added to the MA webserver
AUTH_SCHEMA_VERSION = 28
# Schema version where hassio discovery support was added
HASSIO_DISCOVERY_SCHEMA_VERSION = 28
CONF_TOKEN = "token"
ATTR_IS_GROUP = "is_group"
ATTR_GROUP_MEMBERS = "group_members"
ATTR_GROUP_PARENTS = "group_parents"


@@ -1,12 +1,14 @@
{
"domain": "music_assistant",
"name": "Music Assistant",
"after_dependencies": ["media_source", "media_player"],
"after_dependencies": ["media_source"],
"codeowners": ["@music-assistant", "@arturpragacz"],
"config_flow": true,
"dependencies": ["auth"],
"documentation": "https://www.home-assistant.io/integrations/music_assistant",
"iot_class": "local_push",
"loggers": ["music_assistant"],
"quality_scale": "bronze",
"requirements": ["music-assistant-client==1.3.2"],
"zeroconf": ["_mass._tcp.local."]
}


@@ -0,0 +1,64 @@
rules:
# Bronze
action-setup: done
appropriate-polling:
status: exempt
comment: Integration is local push
brands: done
common-modules: done
config-flow-test-coverage: done
config-flow: done
dependency-transparency: done
docs-actions: done
docs-high-level-description: done
docs-installation-instructions: done
docs-removal-instructions: done
entity-event-setup: done
entity-unique-id: done
has-entity-name: done
runtime-data: done
test-before-configure: done
test-before-setup: done
unique-config-entry: done
# Silver
action-exceptions: todo
config-entry-unloading: done
docs-configuration-parameters: done
docs-installation-parameters: done
entity-unavailable: done
integration-owner: done
log-when-unavailable: todo
parallel-updates: todo
reauthentication-flow:
status: exempt
comment: Devices don't require authentication
test-coverage: todo
# Gold
devices: done
diagnostics: todo
discovery-update-info: done
discovery: done
docs-data-update: todo
docs-examples: done
docs-known-limitations: done
docs-supported-devices: done
docs-supported-functions: done
docs-troubleshooting: done
docs-use-cases: done
dynamic-devices: done
entity-category: done
entity-device-class: done
entity-disabled-by-default: done
entity-translations: done
exception-translations: todo
icon-translations: done
reconfiguration-flow: todo
repair-issues: done
stale-devices: done
# Platinum
async-dependency: done
inject-websession: done
strict-typing: done


@@ -3,20 +3,41 @@
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"already_in_progress": "[%key:common::config_flow::abort::already_in_progress%]",
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"reconfiguration_successful": "Successfully reconfigured the Music Assistant integration.",
"reconfigure_successful": "[%key:common::config_flow::abort::reconfigure_successful%]"
},
"error": {
"auth_error": "Authentication error, please try again",
"auth_failed": "Authentication failed, please try again",
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"invalid_server_version": "The Music Assistant server is not the correct version",
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]",
"unknown": "[%key:common::config_flow::error::unknown%]"
},
"error": {
"auth_failed": "[%key:component::music_assistant::config::abort::auth_failed%]",
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"invalid_server_version": "[%key:component::music_assistant::config::abort::invalid_server_version%]",
"unknown": "[%key:common::config_flow::error::unknown%]"
},
"step": {
"auth_manual": {
"data": {
"token": "Long-lived access token"
},
"data_description": {
"token": "Create a long-lived access token in your Music Assistant server settings and paste it here"
},
"title": "Enter long-lived access token"
},
"discovery_confirm": {
"description": "Do you want to add the Music Assistant server `{url}` to Home Assistant?",
"title": "Discovered Music Assistant server"
},
"hassio_confirm": {
"description": "Do you want to add the Music Assistant server to Home Assistant?",
"title": "Discovered Music Assistant add-on"
},
"reauth_confirm": {
"description": "The authentication token for Music Assistant server `{url}` is no longer valid. Please re-authenticate to continue using the integration.",
"title": "Reauthentication required"
},
"user": {
"data": {
"url": "[%key:common::config_flow::data::url%]"


@@ -17,7 +17,7 @@ from .coordinator import PooldoseConfigEntry, PooldoseCoordinator
_LOGGER = logging.getLogger(__name__)
PLATFORMS: list[Platform] = [Platform.BINARY_SENSOR, Platform.SENSOR]
PLATFORMS: list[Platform] = [Platform.BINARY_SENSOR, Platform.SENSOR, Platform.SWITCH]
async def async_migrate_entry(hass: HomeAssistant, entry: PooldoseConfigEntry) -> bool:


@@ -120,6 +120,29 @@
"ph_type_dosing": {
"default": "mdi:beaker"
}
},
"switch": {
"frequency_input": {
"default": "mdi:sine-wave",
"state": {
"off": "mdi:pulse",
"on": "mdi:sine-wave"
}
},
"pause_dosing": {
"default": "mdi:pause",
"state": {
"off": "mdi:play",
"on": "mdi:pause"
}
},
"pump_monitoring": {
"default": "mdi:pump",
"state": {
"off": "mdi:pump-off",
"on": "mdi:pump"
}
}
}
}
}


@@ -11,5 +11,5 @@
"documentation": "https://www.home-assistant.io/integrations/pooldose",
"iot_class": "local_polling",
"quality_scale": "bronze",
"requirements": ["python-pooldose==0.7.8"]
"requirements": ["python-pooldose==0.8.0"]
}


@@ -161,6 +161,17 @@
"alcalyne": "pH+"
}
}
},
"switch": {
"frequency_input": {
"name": "Frequency input"
},
"pause_dosing": {
"name": "Pause dosing"
},
"pump_monitoring": {
"name": "Pump monitoring"
}
}
}
}


@@ -0,0 +1,95 @@
"""Switches for the Seko PoolDose integration."""
from __future__ import annotations
import logging
from typing import TYPE_CHECKING, Any, cast
from homeassistant.components.switch import SwitchEntity, SwitchEntityDescription
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import PooldoseConfigEntry
from .entity import PooldoseEntity
if TYPE_CHECKING:
from .coordinator import PooldoseCoordinator
_LOGGER = logging.getLogger(__name__)
SWITCH_DESCRIPTIONS: tuple[SwitchEntityDescription, ...] = (
SwitchEntityDescription(
key="pause_dosing",
translation_key="pause_dosing",
entity_category=EntityCategory.CONFIG,
),
SwitchEntityDescription(
key="pump_monitoring",
translation_key="pump_monitoring",
entity_category=EntityCategory.CONFIG,
),
SwitchEntityDescription(
key="frequency_input",
translation_key="frequency_input",
entity_category=EntityCategory.CONFIG,
),
)
async def async_setup_entry(
hass: HomeAssistant,
config_entry: PooldoseConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up PoolDose switch entities from a config entry."""
if TYPE_CHECKING:
assert config_entry.unique_id is not None
coordinator = config_entry.runtime_data
switch_data = coordinator.data["switch"]
serial_number = config_entry.unique_id
async_add_entities(
PooldoseSwitch(coordinator, serial_number, coordinator.device_info, description)
for description in SWITCH_DESCRIPTIONS
if description.key in switch_data
)
class PooldoseSwitch(PooldoseEntity, SwitchEntity):
"""Switch entity for the Seko PoolDose Python API."""
def __init__(
self,
coordinator: PooldoseCoordinator,
serial_number: str,
device_info: Any,
description: SwitchEntityDescription,
) -> None:
"""Initialize the switch."""
super().__init__(coordinator, serial_number, device_info, description, "switch")
self._async_update_attrs()
def _handle_coordinator_update(self) -> None:
"""Handle updated data from the coordinator."""
self._async_update_attrs()
super()._handle_coordinator_update()
def _async_update_attrs(self) -> None:
"""Update switch attributes."""
data = cast(dict, self.get_data())
self._attr_is_on = cast(bool, data["value"])
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn the switch on."""
await self.coordinator.client.set_switch(self.entity_description.key, True)
self._attr_is_on = True
self.async_write_ha_state()
async def async_turn_off(self, **kwargs: Any) -> None:
"""Turn the switch off."""
await self.coordinator.client.set_switch(self.entity_description.key, False)
self._attr_is_on = False
self.async_write_ha_state()

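The PoolDose switch entities above use optimistic state updates: once the client command succeeds, `_attr_is_on` is set locally instead of waiting for the next coordinator refresh. A minimal sketch of that pattern with a stub client — the real API is `python-pooldose`'s `set_switch`; this stub only records calls:

```python
import asyncio


class StubClient:
    """Records set_switch calls instead of talking to a device."""

    def __init__(self) -> None:
        self.calls: list[tuple[str, bool]] = []

    async def set_switch(self, key: str, value: bool) -> None:
        self.calls.append((key, value))


class OptimisticSwitch:
    def __init__(self, client: StubClient, key: str) -> None:
        self._client = client
        self.key = key
        self.is_on = False

    async def turn_on(self) -> None:
        await self._client.set_switch(self.key, True)
        # Optimistic update: reflect the new state immediately.
        self.is_on = True

    async def turn_off(self) -> None:
        await self._client.set_switch(self.key, False)
        self.is_on = False
```

The trade-off is responsiveness over certainty: the UI flips instantly, and the next coordinator poll corrects the state if the device disagrees.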

@@ -19,5 +19,5 @@
"iot_class": "local_push",
"loggers": ["reolink_aio"],
"quality_scale": "platinum",
"requirements": ["reolink-aio==0.16.5"]
"requirements": ["reolink-aio==0.16.6"]
}


@@ -9,16 +9,14 @@ import logging
from typing import Any
from roborock import (
HomeDataRoom,
RoborockException,
RoborockInvalidCredentials,
RoborockInvalidUserAgreement,
RoborockNoUserAgreement,
)
from roborock.data import DeviceData, HomeDataDevice, HomeDataProduct, UserData
from roborock.version_1_apis.roborock_mqtt_client_v1 import RoborockMqttClientV1
from roborock.version_a01_apis import RoborockMqttClientA01
from roborock.web_api import RoborockApiClient
from roborock.data import UserData
from roborock.devices.device import RoborockDevice
from roborock.devices.device_manager import UserParams, create_device_manager
from homeassistant.const import CONF_USERNAME, EVENT_HOMEASSISTANT_STOP
from homeassistant.core import HomeAssistant
@@ -32,8 +30,10 @@ from .coordinator import (
RoborockCoordinators,
RoborockDataUpdateCoordinator,
RoborockDataUpdateCoordinatorA01,
RoborockWashingMachineUpdateCoordinator,
RoborockWetDryVacUpdateCoordinator,
)
from .roborock_storage import async_remove_map_storage
from .roborock_storage import CacheStore, async_cleanup_map_storage
SCAN_INTERVAL = timedelta(seconds=30)
@@ -42,16 +42,21 @@ _LOGGER = logging.getLogger(__name__)
async def async_setup_entry(hass: HomeAssistant, entry: RoborockConfigEntry) -> bool:
"""Set up roborock from a config entry."""
await async_cleanup_map_storage(hass, entry.entry_id)
user_data = UserData.from_dict(entry.data[CONF_USER_DATA])
api_client = RoborockApiClient(
entry.data[CONF_USERNAME],
entry.data[CONF_BASE_URL],
session=async_get_clientsession(hass),
user_params = UserParams(
username=entry.data[CONF_USERNAME],
user_data=user_data,
base_url=entry.data[CONF_BASE_URL],
)
_LOGGER.debug("Getting home data")
cache = CacheStore(hass, entry.entry_id)
try:
home_data = await api_client.get_home_data_v3(user_data)
device_manager = await create_device_manager(
user_params,
cache=cache,
session=async_get_clientsession(hass),
)
except RoborockInvalidCredentials as err:
raise ConfigEntryAuthFailed(
"Invalid credentials",
@@ -75,29 +80,15 @@ async def async_setup_entry(hass: HomeAssistant, entry: RoborockConfigEntry) ->
translation_domain=DOMAIN,
translation_key="home_data_fail",
) from err
devices = await device_manager.get_devices()
_LOGGER.debug("Device manager found %d devices", len(devices))
for device in devices:
entry.async_on_unload(device.close)
_LOGGER.debug("Got home data %s", home_data)
all_devices: list[HomeDataDevice] = home_data.devices + home_data.received_devices
device_map: dict[str, HomeDataDevice] = {
device.duid: device for device in all_devices
}
product_info: dict[str, HomeDataProduct] = {
product.id: product for product in home_data.products
}
# Get a Coordinator if the device is available or if we have connected to the device before
coordinators = await asyncio.gather(
*build_setup_functions(
hass,
entry,
device_map,
user_data,
product_info,
home_data.rooms,
api_client,
),
*build_setup_functions(hass, entry, devices, user_data),
return_exceptions=True,
)
# Valid coordinators are those where we had networking cached or we could get networking
v1_coords = [
coord
for coord in coordinators
@@ -115,9 +106,6 @@ async def async_setup_entry(hass: HomeAssistant, entry: RoborockConfigEntry) ->
translation_key="no_coordinators",
)
valid_coordinators = RoborockCoordinators(v1_coords, a01_coords)
await asyncio.gather(
*(coord.refresh_coordinator_map() for coord in valid_coordinators.v1)
)
async def on_stop(_: Any) -> None:
_LOGGER.debug("Shutting down roborock")
@@ -125,7 +113,8 @@ async def async_setup_entry(hass: HomeAssistant, entry: RoborockConfigEntry) ->
*(
coordinator.async_shutdown()
for coordinator in valid_coordinators.values()
)
),
cache.flush(),
)
entry.async_on_unload(
@@ -138,6 +127,17 @@ async def async_setup_entry(hass: HomeAssistant, entry: RoborockConfigEntry) ->
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
_remove_stale_devices(hass, entry, devices)
return True
def _remove_stale_devices(
hass: HomeAssistant,
entry: RoborockConfigEntry,
devices: list[RoborockDevice],
) -> None:
device_map: dict[str, RoborockDevice] = {device.duid: device for device in devices}
device_registry = dr.async_get(hass)
device_entries = dr.async_entries_for_config_entry(
device_registry, config_entry_id=entry.entry_id
@@ -159,8 +159,6 @@ async def async_setup_entry(hass: HomeAssistant, entry: RoborockConfigEntry) ->
remove_config_entry_id=entry.entry_id,
)
return True
async def async_migrate_entry(hass: HomeAssistant, entry: RoborockConfigEntry) -> bool:
"""Migrate old configuration entries to the new format."""
@@ -190,11 +188,8 @@ async def async_migrate_entry(hass: HomeAssistant, entry: RoborockConfigEntry) -
def build_setup_functions(
hass: HomeAssistant,
entry: RoborockConfigEntry,
device_map: dict[str, HomeDataDevice],
devices: list[RoborockDevice],
user_data: UserData,
product_info: dict[str, HomeDataProduct],
home_data_rooms: list[HomeDataRoom],
api_client: RoborockApiClient,
) -> list[
Coroutine[
Any,
@@ -203,134 +198,45 @@ def build_setup_functions(
]
]:
"""Create a list of setup functions that can later be called asynchronously."""
return [
setup_device(
hass,
entry,
user_data,
device,
product_info[device.product_id],
home_data_rooms,
api_client,
)
for device in device_map.values()
]
coordinators: list[
RoborockDataUpdateCoordinator | RoborockDataUpdateCoordinatorA01
] = []
for device in devices:
_LOGGER.debug("Creating device %s: %s", device.name, device)
if device.v1_properties is not None:
coordinators.append(
RoborockDataUpdateCoordinator(hass, entry, device, device.v1_properties)
)
elif device.dyad is not None:
coordinators.append(
RoborockWetDryVacUpdateCoordinator(hass, entry, device, device.dyad)
)
elif device.zeo is not None:
coordinators.append(
RoborockWashingMachineUpdateCoordinator(hass, entry, device, device.zeo)
)
else:
_LOGGER.warning(
"Not adding device %s because its protocol version %s or category %s is not supported",
device.duid,
device.device_info.pv,
device.product.category.name,
)
return [setup_coordinator(coordinator) for coordinator in coordinators]
async def setup_device(
hass: HomeAssistant,
entry: RoborockConfigEntry,
user_data: UserData,
device: HomeDataDevice,
product_info: HomeDataProduct,
home_data_rooms: list[HomeDataRoom],
api_client: RoborockApiClient,
async def setup_coordinator(
coordinator: RoborockDataUpdateCoordinator | RoborockDataUpdateCoordinatorA01,
) -> RoborockDataUpdateCoordinator | RoborockDataUpdateCoordinatorA01 | None:
"""Set up a coordinator for a given device."""
if device.pv == "1.0":
return await setup_device_v1(
hass, entry, user_data, device, product_info, home_data_rooms, api_client
)
if device.pv == "A01":
return await setup_device_a01(hass, entry, user_data, device, product_info)
_LOGGER.warning(
"Not adding device %s because its protocol version %s or category %s is not supported",
device.duid,
device.pv,
product_info.category.name,
)
return None
async def setup_device_v1(
hass: HomeAssistant,
entry: RoborockConfigEntry,
user_data: UserData,
device: HomeDataDevice,
product_info: HomeDataProduct,
home_data_rooms: list[HomeDataRoom],
api_client: RoborockApiClient,
) -> RoborockDataUpdateCoordinator | None:
"""Set up a device Coordinator."""
mqtt_client = await hass.async_add_executor_job(
RoborockMqttClientV1, user_data, DeviceData(device, product_info.model)
)
try:
await mqtt_client.async_connect()
networking = await mqtt_client.get_networking()
if networking is None:
# If the api does not return an error but still returns None for
# get_networking, we need to go through cache checking.
raise RoborockException("Networking request returned None.") # noqa: TRY301
except RoborockException as err:
_LOGGER.warning(
"Not setting up %s because we could not get the network information of the device. "
"Please confirm it is online and the Roborock servers can communicate with it",
device.name,
)
_LOGGER.debug(err)
await mqtt_client.async_release()
raise
coordinator = RoborockDataUpdateCoordinator(
hass,
entry,
device,
networking,
product_info,
mqtt_client,
home_data_rooms,
api_client,
user_data,
)
"""Set up a single coordinator."""
try:
await coordinator.async_config_entry_first_refresh()
except ConfigEntryNotReady as ex:
except ConfigEntryNotReady:
await coordinator.async_shutdown()
if isinstance(coordinator.api, RoborockMqttClientV1):
_LOGGER.warning(
"Not setting up %s because we failed to get data for the first time using the online client. "
"Please ensure your Home Assistant instance can communicate with this device. "
"You may need to open firewall instances on your Home Assistant network and on your Vacuum's network",
device.name,
)
# Most of the time, a failure to connect with the mqtt client is due to a firewall,
# but in case it isn't, the error is included in the debug logs for the user to grab.
if coordinator.last_exception:
_LOGGER.debug(coordinator.last_exception)
raise coordinator.last_exception from ex
elif coordinator.last_exception:
# If this is reached, we have verified that we can communicate with the Vacuum locally,
# so if there is an error here, it is not a communication issue but some other problem
extra_error = f"Please create an issue with the following error included: {coordinator.last_exception}"
_LOGGER.warning(
"Not setting up %s because the coordinator failed to get data for the first time using the "
"offline client %s",
device.name,
extra_error,
)
raise coordinator.last_exception from ex
return coordinator
async def setup_device_a01(
hass: HomeAssistant,
entry: RoborockConfigEntry,
user_data: UserData,
device: HomeDataDevice,
product_info: HomeDataProduct,
) -> RoborockDataUpdateCoordinatorA01 | None:
"""Set up an A01 protocol device."""
mqtt_client = await hass.async_add_executor_job(
RoborockMqttClientA01,
user_data,
DeviceData(device, product_info.model),
product_info.category,
)
coord = RoborockDataUpdateCoordinatorA01(
hass, entry, device, product_info, mqtt_client
)
await coord.async_config_entry_first_refresh()
return coord
raise
else:
return coordinator
async def async_unload_entry(hass: HomeAssistant, entry: RoborockConfigEntry) -> bool:
@@ -340,4 +246,5 @@ async def async_unload_entry(hass: HomeAssistant, entry: RoborockConfigEntry) ->
async def async_remove_entry(hass: HomeAssistant, entry: RoborockConfigEntry) -> None:
"""Handle removal of an entry."""
await async_remove_map_storage(hass, entry.entry_id)
store = CacheStore(hass, entry.entry_id)
await store.async_remove()
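The setup path above fans out one setup coroutine per device with `asyncio.gather(..., return_exceptions=True)` and then keeps only the coordinators that came back successfully, so one unreachable vacuum does not abort setup for the rest. A stripped-down sketch of that fan-out-and-filter shape (`setup_one` and this `Coordinator` class are illustrative, not the integration's real API):

```python
import asyncio


class Coordinator:
    """Minimal stand-in for a per-device update coordinator."""

    def __init__(self, name: str) -> None:
        self.name = name


async def setup_one(name: str) -> Coordinator:
    """Pretend setup that fails for one device, like a vacuum that is offline."""
    if name == "offline":
        raise ConnectionError(f"{name} unreachable")
    return Coordinator(name)


async def setup_all(names: list[str]) -> list[Coordinator]:
    # return_exceptions=True makes failures come back as exception objects
    # in the results list instead of propagating and cancelling the siblings.
    results = await asyncio.gather(
        *(setup_one(n) for n in names), return_exceptions=True
    )
    return [r for r in results if isinstance(r, Coordinator)]


coords = asyncio.run(setup_all(["kitchen", "offline", "upstairs"]))
print([c.name for c in coords])  # the two reachable devices
```

The diff keeps the same idea but splits the surviving results into v1 and A01 coordinator lists before wrapping them in `RoborockCoordinators`.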

View File

@@ -6,7 +6,6 @@ from collections.abc import Callable
from dataclasses import dataclass
from roborock.data import RoborockStateCode
from roborock.roborock_typing import DeviceProp
from homeassistant.components.binary_sensor import (
BinarySensorDeviceClass,
@@ -19,6 +18,7 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import RoborockConfigEntry, RoborockDataUpdateCoordinator
from .entity import RoborockCoordinatedEntityV1
from .models import DeviceState
PARALLEL_UPDATES = 0
@@ -27,9 +27,11 @@ PARALLEL_UPDATES = 0
class RoborockBinarySensorDescription(BinarySensorEntityDescription):
"""A class that describes Roborock binary sensors."""
value_fn: Callable[[DeviceProp], bool | int | None]
# If it is a dock entity
value_fn: Callable[[DeviceState], bool | int | None]
"""A function that extracts the sensor value from DeviceState."""
is_dock_entity: bool = False
"""Whether this sensor is for the dock."""
BINARY_SENSOR_DESCRIPTIONS = [
@@ -92,7 +94,7 @@ async def async_setup_entry(
)
for coordinator in config_entry.runtime_data.v1
for description in BINARY_SENSOR_DESCRIPTIONS
if description.value_fn(coordinator.roborock_device_info.props) is not None
if description.value_fn(coordinator.data) is not None
)
@@ -117,8 +119,4 @@ class RoborockBinarySensorEntity(RoborockCoordinatedEntityV1, BinarySensorEntity
@property
def is_on(self) -> bool:
"""Return the value reported by the sensor."""
return bool(
self.entity_description.value_fn(
self.coordinator.roborock_device_info.props
)
)
return bool(self.entity_description.value_fn(self.coordinator.data))

View File

@@ -5,18 +5,24 @@ from __future__ import annotations
import asyncio
from dataclasses import dataclass
import itertools
import logging
from typing import Any
from roborock.roborock_typing import RoborockCommand
from roborock.devices.traits.v1.consumeable import ConsumableAttribute
from roborock.exceptions import RoborockException
from homeassistant.components.button import ButtonEntity, ButtonEntityDescription
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN
from .coordinator import RoborockConfigEntry, RoborockDataUpdateCoordinator
from .entity import RoborockEntity, RoborockEntityV1
_LOGGER = logging.getLogger(__name__)
PARALLEL_UPDATES = 0
@@ -24,40 +30,35 @@ PARALLEL_UPDATES = 0
class RoborockButtonDescription(ButtonEntityDescription):
"""Describes a Roborock button entity."""
command: RoborockCommand
param: list | dict | None
attribute: ConsumableAttribute
CONSUMABLE_BUTTON_DESCRIPTIONS = [
RoborockButtonDescription(
key="reset_sensor_consumable",
translation_key="reset_sensor_consumable",
command=RoborockCommand.RESET_CONSUMABLE,
param=["sensor_dirty_time"],
attribute=ConsumableAttribute.SENSOR_DIRTY_TIME,
entity_category=EntityCategory.CONFIG,
entity_registry_enabled_default=False,
),
RoborockButtonDescription(
key="reset_air_filter_consumable",
translation_key="reset_air_filter_consumable",
command=RoborockCommand.RESET_CONSUMABLE,
param=["filter_work_time"],
attribute=ConsumableAttribute.FILTER_WORK_TIME,
entity_category=EntityCategory.CONFIG,
entity_registry_enabled_default=False,
),
RoborockButtonDescription(
key="reset_side_brush_consumable",
translation_key="reset_side_brush_consumable",
command=RoborockCommand.RESET_CONSUMABLE,
param=["side_brush_work_time"],
attribute=ConsumableAttribute.SIDE_BRUSH_WORK_TIME,
entity_category=EntityCategory.CONFIG,
entity_registry_enabled_default=False,
),
RoborockButtonDescription(
key="reset_main_brush_consumable",
translation_key="reset_main_brush_consumable",
command=RoborockCommand.RESET_CONSUMABLE,
param=["main_brush_work_time"],
attribute=ConsumableAttribute.MAIN_BRUSH_WORK_TIME,
entity_category=EntityCategory.CONFIG,
entity_registry_enabled_default=False,
),
@@ -115,13 +116,26 @@ class RoborockButtonEntity(RoborockEntityV1, ButtonEntity):
super().__init__(
f"{entity_description.key}_{coordinator.duid_slug}",
coordinator.device_info,
coordinator.api,
api=coordinator.properties_api.command,
)
self.entity_description = entity_description
self._consumable = coordinator.properties_api.consumables
async def async_press(self) -> None:
"""Press the button."""
await self.send(self.entity_description.command, self.entity_description.param)
try:
await self._consumable.reset_consumable(self.entity_description.attribute)
except RoborockException as err:
# This error message could be improved since it is fairly low level
# and technical. We could add a more user-friendly message including the
# name of the attribute being reset.
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="command_failed",
translation_placeholders={
"command": "RESET_CONSUMABLE",
},
) from err
class RoborockRoutineButtonEntity(RoborockEntity, ButtonEntity):
@@ -138,7 +152,6 @@ class RoborockRoutineButtonEntity(RoborockEntity, ButtonEntity):
super().__init__(
f"{entity_description.key}_{coordinator.duid_slug}",
coordinator.device_info,
coordinator.api,
)
self._routine_id = int(entity_description.key)
self._coordinator = coordinator
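The new `async_press` above translates the library's `RoborockException` into a `HomeAssistantError` carrying a translation key and placeholders, so the frontend can show a localizable message instead of a raw library traceback. The general wrap-and-reraise shape, with hypothetical `LibraryError`/`UserFacingError` classes standing in for the real ones:

```python
import asyncio


class LibraryError(Exception):
    """Low-level error from the vendor library (stand-in)."""


class UserFacingError(Exception):
    """Error type the frontend knows how to render (stand-in)."""

    def __init__(self, key: str, placeholders: dict[str, str]) -> None:
        super().__init__(key)
        self.key = key
        self.placeholders = placeholders


async def reset_consumable(attribute: str) -> None:
    """Pretend library call that always fails, to exercise the wrapping."""
    raise LibraryError(f"device rejected reset of {attribute}")


async def press(attribute: str) -> None:
    try:
        await reset_consumable(attribute)
    except LibraryError as err:
        # Chain with `from err` so the original cause survives in debug logs
        # while the user only sees the translated message.
        raise UserFacingError(
            "command_failed", {"command": "RESET_CONSUMABLE"}
        ) from err


try:
    asyncio.run(press("filter_work_time"))
except UserFacingError as err:
    print(err.key, err.placeholders["command"])
```

Chaining preserves `__cause__`, which is what lets Home Assistant log the low-level detail while keeping the UI message clean.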

View File

@@ -2,35 +2,18 @@
from __future__ import annotations
import asyncio
from dataclasses import dataclass
from datetime import timedelta
import io
from datetime import datetime, timedelta
import logging
from typing import Any, TypeVar
from propcache.api import cached_property
from roborock import HomeDataRoom
from roborock.data import (
DeviceData,
HomeDataDevice,
HomeDataProduct,
HomeDataScene,
NetworkInfo,
RoborockCategory,
UserData,
)
from roborock.exceptions import RoborockException
from roborock.data import HomeDataScene
from roborock.devices.device import RoborockDevice
from roborock.devices.traits.a01 import DyadApi, ZeoApi
from roborock.devices.traits.v1 import PropertiesApi
from roborock.exceptions import RoborockDeviceBusy, RoborockException
from roborock.roborock_message import RoborockDyadDataProtocol, RoborockZeoProtocol
from roborock.roborock_typing import DeviceProp
from roborock.version_1_apis.roborock_local_client_v1 import RoborockLocalClientV1
from roborock.version_1_apis.roborock_mqtt_client_v1 import RoborockMqttClientV1
from roborock.version_a01_apis import RoborockClientA01
from roborock.web_api import RoborockApiClient
from vacuum_map_parser_base.config.color import ColorsPalette, SupportedColor
from vacuum_map_parser_base.config.image_config import ImageConfig
from vacuum_map_parser_base.config.size import Size, Sizes
from vacuum_map_parser_base.map_data import MapData
from vacuum_map_parser_roborock.map_data_parser import RoborockMapDataParser
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import ATTR_CONNECTIONS
@@ -49,21 +32,14 @@ from homeassistant.util import dt as dt_util, slugify
from .const import (
A01_UPDATE_INTERVAL,
CONF_SHOW_BACKGROUND,
DEFAULT_DRAWABLES,
DOMAIN,
DRAWABLES,
IMAGE_CACHE_INTERVAL,
MAP_FILE_FORMAT,
MAP_SCALE,
MAP_SLEEP,
V1_CLOUD_IN_CLEANING_INTERVAL,
V1_CLOUD_NOT_CLEANING_INTERVAL,
V1_LOCAL_IN_CLEANING_INTERVAL,
V1_LOCAL_NOT_CLEANING_INTERVAL,
)
from .models import RoborockA01HassDeviceInfo, RoborockHassDeviceInfo, RoborockMapInfo
from .roborock_storage import RoborockMapStorage
from .models import DeviceState
SCAN_INTERVAL = timedelta(seconds=30)
@@ -87,7 +63,7 @@ class RoborockCoordinators:
type RoborockConfigEntry = ConfigEntry[RoborockCoordinators]
class RoborockDataUpdateCoordinator(DataUpdateCoordinator[DeviceProp]):
class RoborockDataUpdateCoordinator(DataUpdateCoordinator[DeviceState]):
"""Class to manage fetching data from the API."""
config_entry: RoborockConfigEntry
@@ -96,13 +72,8 @@ class RoborockDataUpdateCoordinator(DataUpdateCoordinator[DeviceProp]):
self,
hass: HomeAssistant,
config_entry: RoborockConfigEntry,
device: HomeDataDevice,
device_networking: NetworkInfo,
product_info: HomeDataProduct,
cloud_api: RoborockMqttClientV1,
home_data_rooms: list[HomeDataRoom],
api_client: RoborockApiClient,
user_data: UserData,
device: RoborockDevice,
properties_api: PropertiesApi,
) -> None:
"""Initialize."""
super().__init__(
@@ -113,62 +84,24 @@ class RoborockDataUpdateCoordinator(DataUpdateCoordinator[DeviceProp]):
# Assume we can use the local api.
update_interval=V1_LOCAL_NOT_CLEANING_INTERVAL,
)
self.roborock_device_info = RoborockHassDeviceInfo(
device,
device_networking,
product_info,
DeviceProp(),
)
device_data = DeviceData(device, product_info.model, device_networking.ip)
self.api: RoborockLocalClientV1 | RoborockMqttClientV1 = RoborockLocalClientV1(
device_data, queue_timeout=5
)
self.cloud_api = cloud_api
self._device = device
self.properties_api = properties_api
self.device_info = DeviceInfo(
name=self.roborock_device_info.device.name,
name=self._device.device_info.name,
identifiers={(DOMAIN, self.duid)},
manufacturer="Roborock",
model=self.roborock_device_info.product.model,
model_id=self.roborock_device_info.product.model,
sw_version=self.roborock_device_info.device.fv,
model=self._device.product.model,
model_id=self._device.product.model,
sw_version=self._device.device_info.fv,
)
self.current_map: int | None = None
if mac := self.roborock_device_info.network_info.mac:
if mac := properties_api.network_info.mac:
self.device_info[ATTR_CONNECTIONS] = {
(dr.CONNECTION_NETWORK_MAC, dr.format_mac(mac))
}
# Maps from map flag to map name
self.maps: dict[int, RoborockMapInfo] = {}
self._home_data_rooms = {str(room.id): room.name for room in home_data_rooms}
self.map_storage = RoborockMapStorage(
hass, self.config_entry.entry_id, self.duid_slug
)
self._user_data = user_data
self._api_client = api_client
self._is_cloud_api = False
drawables = [
drawable
for drawable, default_value in DEFAULT_DRAWABLES.items()
if config_entry.options.get(DRAWABLES, {}).get(drawable, default_value)
]
colors = ColorsPalette()
if not config_entry.options.get(CONF_SHOW_BACKGROUND, False):
colors = ColorsPalette({SupportedColor.MAP_OUTSIDE: (0, 0, 0, 0)})
self.map_parser = RoborockMapDataParser(
colors,
Sizes(
{
k: v * MAP_SCALE
for k, v in Sizes.SIZES.items()
if k != Size.MOP_PATH_WIDTH
}
),
drawables,
ImageConfig(scale=MAP_SCALE),
[],
)
self.last_update_state: str | None = None
# Keep track of last attempt to refresh maps/rooms to know when to try again.
self._last_home_update_attempt: datetime
self.last_home_update: datetime | None = None
@cached_property
def dock_device_info(self) -> DeviceInfo:
@@ -177,39 +110,40 @@ class RoborockDataUpdateCoordinator(DataUpdateCoordinator[DeviceProp]):
This must happen after the coordinator does the first update.
Which will be the case when this is called.
"""
dock_type = self.roborock_device_info.props.status.dock_type
dock_type = self.properties_api.status.dock_type
return DeviceInfo(
name=f"{self.roborock_device_info.device.name} Dock",
name=f"{self._device.device_info.name} Dock",
identifiers={(DOMAIN, f"{self.duid}_dock")},
manufacturer="Roborock",
model=f"{self.roborock_device_info.product.model} Dock",
model=f"{self._device.product.model} Dock",
model_id=str(dock_type.value) if dock_type is not None else "Unknown",
sw_version=self.roborock_device_info.device.fv,
sw_version=self._device.device_info.fv,
)
def parse_map_data_v1(
self, map_bytes: bytes
) -> tuple[bytes | None, MapData | None]:
"""Parse map_bytes and return MapData and the image."""
try:
parsed_map = self.map_parser.parse(map_bytes)
except (IndexError, ValueError) as err:
_LOGGER.debug("Exception when parsing map contents: %s", err)
return None, None
if parsed_map.image is None:
return None, None
img_byte_arr = io.BytesIO()
parsed_map.image.data.save(img_byte_arr, format=MAP_FILE_FORMAT)
return img_byte_arr.getvalue(), parsed_map
async def _async_setup(self) -> None:
"""Set up the coordinator."""
# Verify we can communicate locally - if we can't, switch to cloud api
await self._verify_api()
self.api.is_available = True
try:
maps = await self.api.get_multi_maps_list()
await self.properties_api.status.refresh()
except RoborockException as err:
_LOGGER.debug("Failed to update data during setup: %s", err)
raise UpdateFailed(
translation_domain=DOMAIN,
translation_key="update_data_fail",
) from err
self._last_home_update_attempt = dt_util.utcnow()
# This populates a cache of maps/rooms so we have the information
# even for maps that are inactive, but it is a no-op if we already have
# the information. This will cycle through all the available maps and
# requires the device to be idle. If the device is busy cleaning, then
# we'll retry later in `update_map` and in the meantime we won't have
# all map/room information.
try:
await self.properties_api.home.discover_home()
except RoborockDeviceBusy:
_LOGGER.info("Home discovery skipped while device is busy/cleaning")
except RoborockException as err:
_LOGGER.debug("Failed to get maps: %s", err)
raise UpdateFailed(
@@ -217,81 +151,32 @@ class RoborockDataUpdateCoordinator(DataUpdateCoordinator[DeviceProp]):
translation_key="map_failure",
translation_placeholders={"error": str(err)},
) from err
# Rooms names populated later with calls to `set_current_map_rooms` for each map
roborock_maps = maps.map_info if (maps and maps.map_info) else ()
stored_images = await asyncio.gather(
*[
self.map_storage.async_load_map(roborock_map.mapFlag)
for roborock_map in roborock_maps
]
)
self.maps = {
roborock_map.mapFlag: RoborockMapInfo(
flag=roborock_map.mapFlag,
name=roborock_map.name or f"Map {roborock_map.mapFlag}",
rooms={},
image=image,
last_updated=dt_util.utcnow() - IMAGE_CACHE_INTERVAL,
map_data=None,
)
for image, roborock_map in zip(stored_images, roborock_maps, strict=False)
}
else:
# Force a map refresh on first setup
self.last_home_update = dt_util.utcnow() - IMAGE_CACHE_INTERVAL
async def update_map(self) -> None:
"""Update the currently selected map."""
# The current map was set in the props update, so these can be done without
# worry of applying them to the wrong map.
if self.current_map is None or self.current_map not in self.maps:
# This exists as a safeguard and to keep mypy happy.
return
try:
response = await self.cloud_api.get_map_v1()
await self.properties_api.home.discover_home()
await self.properties_api.home.refresh()
except RoborockException as ex:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="map_failure",
) from ex
if not isinstance(response, bytes):
_LOGGER.debug("Failed to parse map contents: %s", response)
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="map_failure",
)
parsed_image, parsed_map = self.parse_map_data_v1(response)
if parsed_image is None or parsed_map is None:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="map_failure",
)
current_roborock_map_info = self.maps[self.current_map]
if parsed_image != self.maps[self.current_map].image:
await self.map_storage.async_save_map(
self.current_map,
parsed_image,
)
current_roborock_map_info.image = parsed_image
current_roborock_map_info.last_updated = dt_util.utcnow()
current_roborock_map_info.map_data = parsed_map
else:
self.last_home_update = dt_util.utcnow()
async def _verify_api(self) -> None:
"""Verify that the api is reachable. If it is not, switch clients."""
if isinstance(self.api, RoborockLocalClientV1):
try:
await self.api.async_connect()
await self.api.ping()
if self._device.is_connected:
if self._device.is_local_connected:
async_delete_issue(
self.hass, DOMAIN, f"cloud_api_used_{self.duid_slug}"
)
except RoborockException:
_LOGGER.warning(
"Using the cloud API for device %s. This is not recommended as it can lead to rate limiting. We recommend making your vacuum accessible by your Home Assistant instance",
self.duid,
)
await self.api.async_disconnect()
# We use the cloud api if the local api fails to connect.
self.api = self.cloud_api
else:
self.update_interval = V1_CLOUD_NOT_CLEANING_INTERVAL
self._is_cloud_api = True
async_create_issue(
self.hass,
DOMAIN,
@@ -299,100 +184,81 @@ class RoborockDataUpdateCoordinator(DataUpdateCoordinator[DeviceProp]):
is_fixable=False,
severity=IssueSeverity.WARNING,
translation_key="cloud_api_used",
translation_placeholders={
"device_name": self.roborock_device_info.device.name
},
translation_placeholders={"device_name": self._device.name},
learn_more_url="https://www.home-assistant.io/integrations/roborock/#the-integration-tells-me-it-cannot-reach-my-vacuum-and-is-using-the-cloud-api-and-that-this-is-not-supported-or-i-am-having-any-networking-issues",
)
# Right now this should never be called if the cloud api is the primary api,
# but in the future if it is, a new else should be added.
async def async_shutdown(self) -> None:
"""Shutdown the coordinator."""
await super().async_shutdown()
await asyncio.gather(
self.map_storage.flush(),
self.api.async_release(),
self.cloud_api.async_release(),
)
async def _update_device_prop(self) -> None:
"""Update device properties."""
if (device_prop := await self.api.get_prop()) is not None:
self.roborock_device_info.props.update(device_prop)
await _refresh_traits(
[
trait
for trait in (
self.properties_api.status,
self.properties_api.consumables,
self.properties_api.clean_summary,
self.properties_api.dnd,
self.properties_api.dust_collection_mode,
self.properties_api.wash_towel_mode,
self.properties_api.smart_wash_params,
self.properties_api.sound_volume,
self.properties_api.child_lock,
self.properties_api.dust_collection_mode,
self.properties_api.flow_led_status,
self.properties_api.valley_electricity_timer,
)
if trait is not None
]
)
_LOGGER.debug("Updated device properties")
async def _async_update_data(self) -> DeviceProp:
async def _async_update_data(self) -> DeviceState:
"""Update data via library."""
try:
# Update device props and standard api information
await self._update_device_prop()
# Set the new map id from the updated device props
self._set_current_map()
# Get the rooms for that map id.
# If the vacuum is currently cleaning and it has been IMAGE_CACHE_INTERVAL
# since the last map update, we can update the map.
new_status = self.roborock_device_info.props.status
new_status = self.properties_api.status
if (
self.current_map is not None
and (current_map := self.maps.get(self.current_map))
and (
(
new_status.in_cleaning
and (dt_util.utcnow() - current_map.last_updated)
> IMAGE_CACHE_INTERVAL
)
or self.last_update_state != new_status.state_name
)
):
new_status.in_cleaning
and (dt_util.utcnow() - self._last_home_update_attempt)
> IMAGE_CACHE_INTERVAL
) or self.last_update_state != new_status.state_name:
self._last_home_update_attempt = dt_util.utcnow()
try:
await self.update_map()
except HomeAssistantError as err:
_LOGGER.debug("Failed to update map: %s", err)
await self.set_current_map_rooms()
except RoborockException as ex:
_LOGGER.debug("Failed to update data: %s", ex)
raise UpdateFailed(
translation_domain=DOMAIN,
translation_key="update_data_fail",
) from ex
if self.roborock_device_info.props.status.in_cleaning:
if self._is_cloud_api:
self.update_interval = V1_CLOUD_IN_CLEANING_INTERVAL
else:
if self.properties_api.status.in_cleaning:
if self._device.is_local_connected:
self.update_interval = V1_LOCAL_IN_CLEANING_INTERVAL
elif self._is_cloud_api:
self.update_interval = V1_CLOUD_NOT_CLEANING_INTERVAL
else:
else:
self.update_interval = V1_CLOUD_IN_CLEANING_INTERVAL
elif self._device.is_local_connected:
self.update_interval = V1_LOCAL_NOT_CLEANING_INTERVAL
self.last_update_state = self.roborock_device_info.props.status.state_name
return self.roborock_device_info.props
def _set_current_map(self) -> None:
if (
self.roborock_device_info.props.status is not None
and self.roborock_device_info.props.status.current_map is not None
):
self.current_map = self.roborock_device_info.props.status.current_map
async def set_current_map_rooms(self) -> None:
"""Fetch all of the rooms for the current map and set on RoborockMapInfo."""
# The api is only able to access rooms for the currently selected map
# So it is important this is only called when you have the map you care
# about selected.
if self.current_map is None or self.current_map not in self.maps:
return
room_mapping = await self.api.get_room_mapping()
self.maps[self.current_map].rooms = {
room.segment_id: self._home_data_rooms.get(room.iot_id, "Unknown")
for room in room_mapping or ()
}
else:
self.update_interval = V1_CLOUD_NOT_CLEANING_INTERVAL
self.last_update_state = self.properties_api.status.state_name
return DeviceState(
status=self.properties_api.status,
dnd_timer=self.properties_api.dnd,
consumable=self.properties_api.consumables,
clean_summary=self.properties_api.clean_summary,
)
async def get_routines(self) -> list[HomeDataScene]:
"""Get routines."""
try:
return await self._api_client.get_scenes(self._user_data, self.duid)
return await self.properties_api.routines.get_routines()
except RoborockException as err:
_LOGGER.error("Failed to get routines %s", err)
raise HomeAssistantError(
@@ -406,7 +272,7 @@ class RoborockDataUpdateCoordinator(DataUpdateCoordinator[DeviceProp]):
async def execute_routines(self, routine_id: int) -> None:
"""Execute routines."""
try:
await self._api_client.execute_scene(self._user_data, routine_id)
await self.properties_api.routines.execute_routine(routine_id)
except RoborockException as err:
_LOGGER.error("Failed to execute routines %s %s", routine_id, err)
raise HomeAssistantError(
@@ -420,85 +286,43 @@ class RoborockDataUpdateCoordinator(DataUpdateCoordinator[DeviceProp]):
@cached_property
def duid(self) -> str:
"""Get the unique id of the device as specified by Roborock."""
return self.roborock_device_info.device.duid
return self._device.duid
@cached_property
def duid_slug(self) -> str:
"""Get the slug of the duid."""
return slugify(self.duid)
async def refresh_coordinator_map(self) -> None:
"""Get the starting map information for all maps for this device.
@property
def device(self) -> RoborockDevice:
"""Get the RoborockDevice."""
return self._device
The following steps must be done synchronously.
Only one map can be loaded at a time per device.
"""
cur_map = self.current_map
# This won't be None at this point as the coordinator will have run first.
if cur_map is None:
# If we don't have a current map (shouldn't happen), just
# return as we can't do anything.
return
if self.data.status.in_cleaning:
# If the vacuum is cleaning, we cannot change maps
# as it will interrupt the cleaning.
_LOGGER.info(
"Vacuum is cleaning, not switching to other maps to fetch rooms"
async def _refresh_traits(traits: list[Any]) -> None:
"""Refresh a list of traits serially.
We refresh traits serially to avoid overloading the cloud servers or device
with requests. If any single trait fails to refresh, we stop the whole
update process and raise UpdateFailed.
"""
for trait in traits:
try:
await trait.refresh()
except RoborockException as ex:
_LOGGER.debug(
"Failed to update data (%s): %s", trait.__class__.__name__, ex
)
# Since this hits the cloud API, we want to be careful and will just
# stop here rather than retry in the future.
map_flags = [cur_map]
else:
map_flags = sorted(
self.maps, key=lambda data: data == cur_map, reverse=True
)
for map_flag in map_flags:
if map_flag != cur_map:
# Only change the map and sleep if we have multiple maps.
try:
await self.cloud_api.load_multi_map(map_flag)
except RoborockException as ex:
_LOGGER.debug(
"Failed to change to map %s when refreshing maps: %s",
map_flag,
ex,
)
continue
else:
self.current_map = map_flag
# We cannot get the map until the roborock servers fully process the
# map change. If the above command fails, we should still sleep, just
# in case it executes delayed.
await asyncio.sleep(MAP_SLEEP)
tasks = [self.set_current_map_rooms()]
# The image is set within async_setup, so if it exists, we have it here.
if self.maps[map_flag].image is None:
# If we don't have a cached map, let's update it here so that it can be
# cached in the future.
tasks.append(self.update_map())
# If either of these fail, we don't care, and we want to continue.
await asyncio.gather(*tasks, return_exceptions=True)
if len(self.maps) > 1 and not self.data.status.in_cleaning:
# Set the map back to the map the user previously had selected so that it
# does not change the end user's app.
# Only needs to happen when we changed maps above.
try:
await self.cloud_api.load_multi_map(cur_map)
except RoborockException as ex:
_LOGGER.warning(
"Failed to change back to map %s when refreshing maps: %s",
cur_map,
ex,
)
self.current_map = cur_map
raise UpdateFailed(
translation_domain=DOMAIN,
translation_key="update_data_fail",
) from ex
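The `_refresh_traits` helper above refreshes traits one at a time so the cloud servers and device are not flooded with concurrent requests, and aborts the whole update on the first failure. A minimal standalone sketch of that serial-refresh pattern (the trait and exception classes here are hypothetical stand-ins, not the real roborock types):

```python
import asyncio
import logging

_LOGGER = logging.getLogger(__name__)


class RefreshError(Exception):
    """Stand-in for RoborockException."""


class UpdateFailed(Exception):
    """Stand-in for homeassistant's UpdateFailed."""


class FakeTrait:
    """Hypothetical trait with an async refresh(), for illustration only."""

    def __init__(self, name: str, fail: bool = False) -> None:
        self.name = name
        self.fail = fail
        self.refreshed = False

    async def refresh(self) -> None:
        if self.fail:
            raise RefreshError(self.name)
        self.refreshed = True


async def refresh_traits(traits: list[FakeTrait]) -> None:
    """Refresh traits serially; abort the whole update on the first failure."""
    for trait in traits:
        try:
            await trait.refresh()
        except RefreshError as ex:
            _LOGGER.debug("Failed to update data (%s): %s", type(trait).__name__, ex)
            raise UpdateFailed(f"update_data_fail: {ex}") from ex


traits = [FakeTrait("status"), FakeTrait("dnd", fail=True), FakeTrait("consumables")]
try:
    asyncio.run(refresh_traits(traits))
except UpdateFailed:
    pass
# The first trait refreshed; the failure stopped the run before the third.
results = [t.refreshed for t in traits]
```

Stopping at the first failure (rather than gathering all refreshes concurrently) is the deliberate trade-off named in the docstring: fewer simultaneous cloud requests, at the cost of a single bad trait failing the whole cycle.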
class RoborockDataUpdateCoordinatorA01(
DataUpdateCoordinator[
dict[RoborockDyadDataProtocol | RoborockZeoProtocol, StateType]
]
):
_V = TypeVar("_V", bound=RoborockDyadDataProtocol | RoborockZeoProtocol)
class RoborockDataUpdateCoordinatorA01(DataUpdateCoordinator[dict[_V, StateType]]):
"""Class to manage fetching data from the API for A01 devices."""
config_entry: RoborockConfigEntry
@@ -507,9 +331,7 @@ class RoborockDataUpdateCoordinatorA01(
self,
hass: HomeAssistant,
config_entry: RoborockConfigEntry,
device: HomeDataDevice,
product_info: HomeDataProduct,
api: RoborockClientA01,
device: RoborockDevice,
) -> None:
"""Initialize."""
super().__init__(
@@ -519,53 +341,88 @@ class RoborockDataUpdateCoordinatorA01(
name=DOMAIN,
update_interval=A01_UPDATE_INTERVAL,
)
self.api = api
self._device = device
self.device_info = DeviceInfo(
name=device.name,
identifiers={(DOMAIN, device.duid)},
manufacturer="Roborock",
model=product_info.model,
sw_version=device.fv,
model=device.product.model,
sw_version=device.device_info.fv,
)
self.request_protocols: list[
RoborockDyadDataProtocol | RoborockZeoProtocol
] = []
if product_info.category == RoborockCategory.WET_DRY_VAC:
self.request_protocols = [
RoborockDyadDataProtocol.STATUS,
RoborockDyadDataProtocol.POWER,
RoborockDyadDataProtocol.MESH_LEFT,
RoborockDyadDataProtocol.BRUSH_LEFT,
RoborockDyadDataProtocol.ERROR,
RoborockDyadDataProtocol.TOTAL_RUN_TIME,
]
elif product_info.category == RoborockCategory.WASHING_MACHINE:
self.request_protocols = [
RoborockZeoProtocol.STATE,
RoborockZeoProtocol.COUNTDOWN,
RoborockZeoProtocol.WASHING_LEFT,
RoborockZeoProtocol.ERROR,
]
else:
_LOGGER.warning("The device you added is not yet supported")
self.roborock_device_info = RoborockA01HassDeviceInfo(device, product_info)
async def _async_update_data(
self,
) -> dict[RoborockDyadDataProtocol | RoborockZeoProtocol, StateType]:
return await self.api.update_values(self.request_protocols)
async def async_shutdown(self) -> None:
"""Shutdown the coordinator on config entry unload."""
await super().async_shutdown()
await self.api.async_release()
self.request_protocols: list[_V] = []
@cached_property
def duid(self) -> str:
"""Get the unique id of the device as specified by Roborock."""
return self.roborock_device_info.device.duid
return self._device.duid
@cached_property
def duid_slug(self) -> str:
"""Get the slug of the duid."""
return slugify(self.duid)
@property
def device(self) -> RoborockDevice:
"""Get the RoborockDevice."""
return self._device
class RoborockWashingMachineUpdateCoordinator(
RoborockDataUpdateCoordinatorA01[RoborockZeoProtocol]
):
"""Coordinator for Zeo devices."""
def __init__(
self,
hass: HomeAssistant,
config_entry: RoborockConfigEntry,
device: RoborockDevice,
api: ZeoApi,
) -> None:
"""Initialize."""
super().__init__(hass, config_entry, device)
self.api = api
self.request_protocols: list[RoborockZeoProtocol] = []
# This currently only supports the washing machine protocols
self.request_protocols = [
RoborockZeoProtocol.STATE,
RoborockZeoProtocol.COUNTDOWN,
RoborockZeoProtocol.WASHING_LEFT,
RoborockZeoProtocol.ERROR,
]
async def _async_update_data(
self,
) -> dict[RoborockZeoProtocol, StateType]:
return await self.api.query_values(self.request_protocols)
class RoborockWetDryVacUpdateCoordinator(
RoborockDataUpdateCoordinatorA01[RoborockDyadDataProtocol]
):
"""Coordinator for Dyad devices."""
def __init__(
self,
hass: HomeAssistant,
config_entry: RoborockConfigEntry,
device: RoborockDevice,
api: DyadApi,
) -> None:
"""Initialize."""
super().__init__(hass, config_entry, device)
self.api = api
# This currently only supports the WetDryVac protocols
self.request_protocols: list[RoborockDyadDataProtocol] = [
RoborockDyadDataProtocol.STATUS,
RoborockDyadDataProtocol.POWER,
RoborockDyadDataProtocol.MESH_LEFT,
RoborockDyadDataProtocol.BRUSH_LEFT,
RoborockDyadDataProtocol.ERROR,
RoborockDyadDataProtocol.TOTAL_RUN_TIME,
]
async def _async_update_data(
self,
) -> dict[RoborockDyadDataProtocol, StateType]:
return await self.api.query_values(self.request_protocols)
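The refactor above turns the single A01 coordinator into a `TypeVar`-parameterized base, with `RoborockWashingMachineUpdateCoordinator` and `RoborockWetDryVacUpdateCoordinator` pinning the protocol type and request list. A minimal sketch of that generic-coordinator pattern, with plain strings standing in for the protocol enums (all names here are illustrative, not the integration's actual API):

```python
import asyncio
from typing import Generic, TypeVar

_V = TypeVar("_V")


class BaseCoordinator(Generic[_V]):
    """Generic base: subclasses fix the protocol type and the request list."""

    def __init__(self) -> None:
        self.request_protocols: list[_V] = []

    async def query_values(self, protocols: list[_V]) -> dict[_V, str]:
        # Stand-in for the real API round trip; echoes each protocol key.
        return {p: f"value-for-{p}" for p in protocols}

    async def update(self) -> dict[_V, str]:
        return await self.query_values(self.request_protocols)


class ZeoCoordinator(BaseCoordinator[str]):
    """Concrete subclass pinning _V and declaring its supported protocols."""

    def __init__(self) -> None:
        super().__init__()
        self.request_protocols = ["state", "countdown"]


data = asyncio.run(ZeoCoordinator().update())
```

With this shape the type checker knows `data` is keyed by the subclass's protocol type, which is exactly what the `dict[_V, StateType]` bound in the diff buys over the old untyped union.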


@@ -2,6 +2,7 @@
from __future__ import annotations
import logging
from typing import Any
from homeassistant.components.diagnostics import async_redact_data
@@ -10,9 +11,9 @@ from homeassistant.core import HomeAssistant
from .coordinator import RoborockConfigEntry
TO_REDACT_CONFIG = ["token", "sn", "rruid", CONF_UNIQUE_ID, "username", "uid"]
_LOGGER = logging.getLogger(__name__)
TO_REDACT_COORD = ["duid", "localKey", "mac", "bssid"]
TO_REDACT_CONFIG = ["token", "sn", "rruid", CONF_UNIQUE_ID, "username", "uid"]
async def async_get_config_entry_diagnostics(
@@ -24,12 +25,7 @@ async def async_get_config_entry_diagnostics(
return {
"config_entry": async_redact_data(config_entry.data, TO_REDACT_CONFIG),
"coordinators": {
f"**REDACTED-{i}**": {
"roborock_device_info": async_redact_data(
coordinator.roborock_device_info.as_dict(), TO_REDACT_COORD
),
"api": coordinator.api.diagnostic_data,
}
f"**REDACTED-{i}**": coordinator.device.diagnostic_data()
for i, coordinator in enumerate(coordinators.values())
},
}


@@ -2,19 +2,10 @@
from typing import Any
from roborock.api import RoborockClient
from roborock.command_cache import CacheableAttribute
from roborock.data import Consumable, Status
from roborock.data import Status
from roborock.devices.traits.v1.command import CommandTrait
from roborock.exceptions import RoborockException
from roborock.roborock_message import RoborockDataProtocol
from roborock.roborock_typing import RoborockCommand
from roborock.version_1_apis.roborock_client_v1 import (
CLOUD_REQUIRED,
AttributeCache,
RoborockClientV1,
)
from roborock.version_1_apis.roborock_mqtt_client_v1 import RoborockMqttClientV1
from roborock.version_a01_apis import RoborockClientA01
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.device_registry import DeviceInfo
@@ -34,39 +25,30 @@ class RoborockEntity(Entity):
self,
unique_id: str,
device_info: DeviceInfo,
api: RoborockClient,
) -> None:
"""Initialize the Roborock Device."""
self._attr_unique_id = unique_id
self._attr_device_info = device_info
self._api = api
class RoborockEntityV1(RoborockEntity):
"""Representation of a base Roborock V1 Entity."""
_api: RoborockClientV1
def __init__(
self, unique_id: str, device_info: DeviceInfo, api: RoborockClientV1
self, unique_id: str, device_info: DeviceInfo, api: CommandTrait
) -> None:
"""Initialize the Roborock Device."""
super().__init__(unique_id, device_info, api)
super().__init__(unique_id, device_info)
self._api = api
def get_cache(self, attribute: CacheableAttribute) -> AttributeCache:
"""Get an item from the api cache."""
return self._api.cache[attribute]
@classmethod
async def _send_command(
cls,
async def send(
self,
command: RoborockCommand | str,
api: RoborockClientV1,
params: dict[str, Any] | list[Any] | int | None = None,
) -> dict:
"""Send a Roborock command with params to a given api."""
try:
response: dict = await api.send_command(command, params)
response: dict = await self._api.send(command, params=params)
except RoborockException as err:
if isinstance(command, RoborockCommand):
command_name = command.name
@@ -81,31 +63,6 @@ class RoborockEntityV1(RoborockEntity):
) from err
return response
async def send(
self,
command: RoborockCommand | str,
params: dict[str, Any] | list[Any] | int | None = None,
) -> dict:
"""Send a command to a vacuum cleaner."""
return await self._send_command(command, self._api, params)
@property
def api(self) -> RoborockClientV1:
"""Returns the api."""
return self._api
class RoborockEntityA01(RoborockEntity):
"""Representation of a base Roborock Entity for A01 devices."""
_api: RoborockClientA01
def __init__(
self, unique_id: str, device_info: DeviceInfo, api: RoborockClientA01
) -> None:
"""Initialize the Roborock Device."""
super().__init__(unique_id, device_info, api)
class RoborockCoordinatedEntityV1(
RoborockEntityV1, CoordinatorEntity[RoborockDataUpdateCoordinator]
@@ -118,9 +75,6 @@ class RoborockCoordinatedEntityV1(
self,
unique_id: str,
coordinator: RoborockDataUpdateCoordinator,
listener_request: list[RoborockDataProtocol]
| RoborockDataProtocol
| None = None,
is_dock_entity: bool = False,
) -> None:
"""Initialize the coordinated Roborock Device."""
@@ -130,27 +84,10 @@ class RoborockCoordinatedEntityV1(
device_info=coordinator.device_info
if not is_dock_entity
else coordinator.dock_device_info,
api=coordinator.api,
api=coordinator.properties_api.command,
)
CoordinatorEntity.__init__(self, coordinator=coordinator)
self._attr_unique_id = unique_id
if isinstance(listener_request, RoborockDataProtocol):
listener_request = [listener_request]
self.listener_requests = listener_request or []
async def async_added_to_hass(self) -> None:
"""Add listeners when the device is added to hass."""
await super().async_added_to_hass()
for listener_request in self.listener_requests:
self.api.add_listener(
listener_request, self._update_from_listener, cache=self.api.cache
)
async def async_will_remove_from_hass(self) -> None:
"""Remove listeners when the device is removed from hass."""
for listener_request in self.listener_requests:
self.api.remove_listener(listener_request, self._update_from_listener)
await super().async_will_remove_from_hass()
@property
def _device_status(self) -> Status:
@@ -158,36 +95,19 @@ class RoborockCoordinatedEntityV1(
data = self.coordinator.data
return data.status
@property
def cloud_api(self) -> RoborockMqttClientV1:
"""Return the cloud api."""
return self.coordinator.cloud_api
async def send(
self,
command: RoborockCommand | str,
params: dict[str, Any] | list[Any] | int | None = None,
) -> dict:
"""Overloads normal send command but refreshes coordinator."""
if command in CLOUD_REQUIRED:
res = await self._send_command(command, self.coordinator.cloud_api, params)
else:
res = await self._send_command(command, self._api, params)
res = await super().send(command, params)
await self.coordinator.async_refresh()
return res
def _update_from_listener(self, value: Status | Consumable) -> None:
"""Update the status or consumable data from a listener and then write the new entity state."""
if isinstance(value, Status):
self.coordinator.roborock_device_info.props.status = value
else:
self.coordinator.roborock_device_info.props.consumable = value
self.coordinator.data = self.coordinator.roborock_device_info.props
self.schedule_update_ha_state()
class RoborockCoordinatedEntityA01(
RoborockEntityA01, CoordinatorEntity[RoborockDataUpdateCoordinatorA01]
RoborockEntity, CoordinatorEntity[RoborockDataUpdateCoordinatorA01]
):
"""Representation of a base a coordinated Roborock Entity."""
@@ -197,11 +117,10 @@ class RoborockCoordinatedEntityA01(
coordinator: RoborockDataUpdateCoordinatorA01,
) -> None:
"""Initialize the coordinated Roborock Device."""
RoborockEntityA01.__init__(
RoborockEntity.__init__(
self,
unique_id=unique_id,
device_info=coordinator.device_info,
api=coordinator.api,
)
CoordinatorEntity.__init__(self, coordinator=coordinator)
self._attr_unique_id = unique_id


@@ -3,10 +3,14 @@
from datetime import datetime
import logging
from roborock.devices.traits.v1.home import HomeTrait
from roborock.devices.traits.v1.map_content import MapContent
from homeassistant.components.image import ImageEntity
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import RoborockConfigEntry, RoborockDataUpdateCoordinator
@@ -30,11 +34,13 @@ async def async_setup_entry(
config_entry,
f"{coord.duid_slug}_map_{map_info.name}",
coord,
map_info.flag,
coord.properties_api.home,
map_info.map_flag,
map_info.name,
)
for coord in config_entry.runtime_data.v1
for map_info in coord.maps.values()
if coord.properties_api.home is not None
for map_info in (coord.properties_api.home.home_map_info or {}).values()
),
)
@@ -51,6 +57,7 @@ class RoborockMap(RoborockCoordinatedEntityV1, ImageEntity):
config_entry: ConfigEntry,
unique_id: str,
coordinator: RoborockDataUpdateCoordinator,
home_trait: HomeTrait,
map_flag: int,
map_name: str,
) -> None:
@@ -59,31 +66,40 @@ class RoborockMap(RoborockCoordinatedEntityV1, ImageEntity):
ImageEntity.__init__(self, coordinator.hass)
self.config_entry = config_entry
self._attr_name = map_name
self._home_trait = home_trait
self.map_flag = map_flag
self.cached_map = b""
self.cached_map: bytes | None = None
self._attr_entity_category = EntityCategory.DIAGNOSTIC
@property
def is_selected(self) -> bool:
"""Return if this map is the currently selected map."""
return self.map_flag == self.coordinator.current_map
async def async_added_to_hass(self) -> None:
"""When entity is added to hass load any previously cached maps from disk."""
await super().async_added_to_hass()
self._attr_image_last_updated = self.coordinator.maps[
self.map_flag
].last_updated
self._attr_image_last_updated = self.coordinator.last_home_update
self.async_write_ha_state()
@property
def _map_content(self) -> MapContent | None:
if self._home_trait.home_map_content and (
map_content := self._home_trait.home_map_content.get(self.map_flag)
):
return map_content
return None
def _handle_coordinator_update(self) -> None:
# If the coordinator has updated the map, we can update the image.
self._attr_image_last_updated = self.coordinator.maps[
self.map_flag
].last_updated
"""Handle updated data from the coordinator.
If the coordinator has updated the map, we can update the image.
"""
if (map_content := self._map_content) is None:
return
if self.cached_map != map_content.image_content:
self.cached_map = map_content.image_content
self._attr_image_last_updated = self.coordinator.last_home_update
super()._handle_coordinator_update()
async def async_image(self) -> bytes | None:
"""Get the cached image."""
return self.coordinator.maps[self.map_flag].image
if (map_content := self._map_content) is None:
raise HomeAssistantError("Map flag not found in coordinator maps")
return map_content.image_content


@@ -19,7 +19,7 @@
"loggers": ["roborock"],
"quality_scale": "silver",
"requirements": [
"python-roborock==3.7.1",
"python-roborock==3.8.1",
"vacuum-map-parser-roborock==0.1.4"
]
}


@@ -2,12 +2,32 @@
from dataclasses import dataclass
from datetime import datetime
import logging
from typing import Any
from roborock.data import HomeDataDevice, HomeDataProduct, NetworkInfo
from roborock.roborock_typing import DeviceProp
from roborock.data import (
CleanSummaryWithDetail,
Consumable,
DnDTimer,
HomeDataDevice,
HomeDataProduct,
NetworkInfo,
Status,
)
from vacuum_map_parser_base.map_data import MapData
_LOGGER = logging.getLogger(__name__)
@dataclass
class DeviceState:
"""Data about the current state of a device."""
status: Status
dnd_timer: DnDTimer
consumable: Consumable
clean_summary: CleanSummaryWithDetail
@dataclass
class RoborockHassDeviceInfo:
@@ -16,7 +36,6 @@ class RoborockHassDeviceInfo:
device: HomeDataDevice
network_info: NetworkInfo
product: HomeDataProduct
props: DeviceProp
def as_dict(self) -> dict[str, dict[str, Any]]:
"""Turn RoborockHassDeviceInfo into a dictionary."""
@@ -24,7 +43,6 @@ class RoborockHassDeviceInfo:
"device": self.device.as_dict(),
"network_info": self.network_info.as_dict(),
"product": self.product.as_dict(),
"props": self.props.as_dict(),
}
@@ -49,14 +67,6 @@ class RoborockMapInfo:
flag: int
name: str
rooms: dict[int, str]
image: bytes | None
last_updated: datetime
map_data: MapData | None
@property
def current_room(self) -> str | None:
"""Get the currently active room for this map if any."""
if self.map_data is None or self.map_data.vacuum_room is None:
return None
return self.rooms.get(self.map_data.vacuum_room)


@@ -1,14 +1,12 @@
"""Support for Roborock number."""
import asyncio
from collections.abc import Callable, Coroutine
from dataclasses import dataclass
import logging
from typing import Any
from roborock.command_cache import CacheableAttribute
from roborock.devices.traits.v1 import PropertiesApi
from roborock.exceptions import RoborockException
from roborock.version_1_apis.roborock_client_v1 import AttributeCache
from homeassistant.components.number import NumberEntity, NumberEntityDescription
from homeassistant.const import PERCENTAGE, EntityCategory
@@ -29,10 +27,14 @@ PARALLEL_UPDATES = 0
class RoborockNumberDescription(NumberEntityDescription):
"""Class to describe a Roborock number entity."""
# Gets the status of the switch
cache_key: CacheableAttribute
# Sets the status of the switch
update_value: Callable[[AttributeCache, float], Coroutine[Any, Any, None]]
trait: Callable[[PropertiesApi], Any | None]
"""Function to determine if number entity is supported by the device."""
get_value: Callable[[Any], float]
"""Function to get the value from the trait."""
set_value: Callable[[Any, float], Coroutine[Any, Any, None]]
"""Function to set the value on the trait."""
NUMBER_DESCRIPTIONS: list[RoborockNumberDescription] = [
@@ -42,9 +44,10 @@ NUMBER_DESCRIPTIONS: list[RoborockNumberDescription] = [
native_min_value=0,
native_max_value=100,
native_unit_of_measurement=PERCENTAGE,
cache_key=CacheableAttribute.sound_volume,
entity_category=EntityCategory.CONFIG,
update_value=lambda cache, value: cache.update_value([int(value)]),
trait=lambda api: api.sound_volume,
get_value=lambda trait: float(trait.volume),
set_value=lambda trait, value: trait.set_volume(int(value)),
)
]
@@ -55,36 +58,19 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Roborock number platform."""
possible_entities: list[
tuple[RoborockDataUpdateCoordinator, RoborockNumberDescription]
] = [
(coordinator, description)
for coordinator in config_entry.runtime_data.v1
for description in NUMBER_DESCRIPTIONS
]
# We need to check if this function is supported by the device.
results = await asyncio.gather(
*(
coordinator.api.get_from_cache(description.cache_key)
for coordinator, description in possible_entities
),
return_exceptions=True,
)
valid_entities: list[RoborockNumberEntity] = []
for (coordinator, description), result in zip(
possible_entities, results, strict=False
):
if result is None or isinstance(result, RoborockException):
_LOGGER.debug("Not adding entity because of %s", result)
else:
valid_entities.append(
RoborockNumberEntity(
f"{description.key}_{coordinator.duid_slug}",
coordinator,
description,
)
async_add_entities(
[
RoborockNumberEntity(
f"{description.key}_{coordinator.duid_slug}",
coordinator=coordinator,
entity_description=description,
trait=trait,
)
async_add_entities(valid_entities)
for coordinator in config_entry.runtime_data.v1
for description in NUMBER_DESCRIPTIONS
if (trait := description.trait(coordinator.properties_api)) is not None
]
)
class RoborockNumberEntity(RoborockEntityV1, NumberEntity):
@@ -97,23 +83,24 @@ class RoborockNumberEntity(RoborockEntityV1, NumberEntity):
unique_id: str,
coordinator: RoborockDataUpdateCoordinator,
entity_description: RoborockNumberDescription,
trait: Any,
) -> None:
"""Create a number entity."""
self.entity_description = entity_description
super().__init__(unique_id, coordinator.device_info, coordinator.api)
super().__init__(
unique_id, coordinator.device_info, api=coordinator.properties_api.command
)
self._trait = trait
@property
def native_value(self) -> float | None:
"""Get native value."""
val: float = self.get_cache(self.entity_description.cache_key).value
return val
return self.entity_description.get_value(self._trait)
async def async_set_native_value(self, value: float) -> None:
"""Set number value."""
try:
await self.entity_description.update_value(
self.get_cache(self.entity_description.cache_key), value
)
await self.entity_description.set_value(self._trait, value)
except RoborockException as err:
raise HomeAssistantError(
translation_domain=DOMAIN,


@@ -1,95 +1,87 @@
"""Roborock storage."""
import dataclasses
import logging
from pathlib import Path
import shutil
from typing import Any
from roborock.devices.cache import Cache, CacheData
from homeassistant.core import HomeAssistant
from homeassistant.helpers.storage import Store
from .const import DOMAIN, MAP_FILENAME_SUFFIX
from .const import DOMAIN
_LOGGER = logging.getLogger(__name__)
STORAGE_PATH = f".storage/{DOMAIN}"
MAPS_PATH = "maps"
CACHE_VERSION = 1
def _storage_path_prefix(hass: HomeAssistant, entry_id: str) -> Path:
"""Storage path for the old map storage cache location."""
return Path(hass.config.path(STORAGE_PATH)) / entry_id
class RoborockMapStorage:
"""Store and retrieve maps for a Roborock device.
async def async_cleanup_map_storage(hass: HomeAssistant, entry_id: str) -> None:
"""Remove map storage in the old format, if any.
An instance of RoborockMapStorage is created for each device and manages
local storage of maps for that device.
This removes all on-disk map files for the given config entry. This is the
old format that was replaced by the `CacheStore` implementation.
"""
def __init__(self, hass: HomeAssistant, entry_id: str, device_id_slug: str) -> None:
"""Initialize RoborockMapStorage."""
self._hass = hass
self._path_prefix = (
_storage_path_prefix(hass, entry_id) / MAPS_PATH / device_id_slug
)
self._write_queue: dict[int, bytes] = {}
async def async_load_map(self, map_flag: int) -> bytes | None:
"""Load maps from disk."""
filename = self._path_prefix / f"{map_flag}{MAP_FILENAME_SUFFIX}"
return await self._hass.async_add_executor_job(self._load_map, filename)
def _load_map(self, filename: Path) -> bytes | None:
"""Load maps from disk."""
if not filename.exists():
return None
try:
return filename.read_bytes()
except OSError as err:
_LOGGER.debug("Unable to read map file: %s %s", filename, err)
return None
async def async_save_map(self, map_flag: int, content: bytes) -> None:
"""Save the map to a pending write queue."""
self._write_queue[map_flag] = content
async def flush(self) -> None:
"""Flush all maps to disk."""
_LOGGER.debug("Flushing %s maps to disk", len(self._write_queue))
queue = self._write_queue.copy()
def _flush_all() -> None:
for map_flag, content in queue.items():
filename = self._path_prefix / f"{map_flag}{MAP_FILENAME_SUFFIX}"
self._save_map(filename, content)
await self._hass.async_add_executor_job(_flush_all)
self._write_queue.clear()
def _save_map(self, filename: Path, content: bytes) -> None:
"""Write the map to disk."""
_LOGGER.debug("Saving map to disk: %s", filename)
try:
filename.parent.mkdir(parents=True, exist_ok=True)
except OSError as err:
_LOGGER.error("Unable to create map directory: %s %s", filename, err)
return
try:
filename.write_bytes(content)
except OSError as err:
_LOGGER.error("Unable to write map file: %s %s", filename, err)
async def async_remove_map_storage(hass: HomeAssistant, entry_id: str) -> None:
"""Remove all map storage associated with a config entry."""
def remove(path_prefix: Path) -> None:
try:
if path_prefix.exists():
if path_prefix.exists() and path_prefix.is_dir():
_LOGGER.debug("Removing maps from disk store: %s", path_prefix)
shutil.rmtree(path_prefix, ignore_errors=True)
except OSError as err:
_LOGGER.error("Unable to remove map files in %s: %s", path_prefix, err)
path_prefix = _storage_path_prefix(hass, entry_id)
_LOGGER.debug("Removing maps from disk store: %s", path_prefix)
await hass.async_add_executor_job(remove, path_prefix)
class CacheStore(Cache):
"""Store and retrieve cache for a Roborock device.
This implements the roborock Cache interface, backed by a Home Assistant
Store that can be flushed to disk. This also manages dispatching the
roborock map contents to separate on-disk files via RoborockMapStorage
since maps can be large.
"""
def __init__(self, hass: HomeAssistant, entry_id: str) -> None:
"""Initialize CacheStore."""
self._cache_store = Store[dict[str, Any]](
hass,
version=CACHE_VERSION,
key=f"{DOMAIN}/{entry_id}",
private=True,
)
self._cache_data: CacheData | None = None
async def get(self) -> CacheData:
"""Retrieve cached metadata."""
if self._cache_data is None:
if data := await self._cache_store.async_load():
self._cache_data = CacheData(**data)
else:
self._cache_data = CacheData()
return self._cache_data
async def set(self, value: CacheData) -> None:
"""Save cached metadata."""
self._cache_data = value
async def flush(self) -> None:
"""Flush cached metadata to disk."""
if self._cache_data is not None:
await self._cache_store.async_save(dataclasses.asdict(self._cache_data))
async def async_remove(self) -> None:
"""Remove cached metadata from disk."""
await self._cache_store.async_remove()


@@ -5,15 +5,19 @@ from collections.abc import Callable
from dataclasses import dataclass
from roborock.data import RoborockDockDustCollectionModeCode
from roborock.roborock_message import RoborockDataProtocol
from roborock.roborock_typing import DeviceProp, RoborockCommand
from roborock.devices.traits.v1 import PropertiesApi
from roborock.devices.traits.v1.home import HomeTrait
from roborock.devices.traits.v1.maps import MapsTrait
from roborock.exceptions import RoborockException
from roborock.roborock_typing import RoborockCommand
from homeassistant.components.select import SelectEntity, SelectEntityDescription
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import MAP_SLEEP
from .const import DOMAIN, MAP_SLEEP
from .coordinator import RoborockConfigEntry, RoborockDataUpdateCoordinator
from .entity import RoborockCoordinatedEntityV1
@@ -24,18 +28,20 @@ PARALLEL_UPDATES = 0
class RoborockSelectDescription(SelectEntityDescription):
"""Class to describe a Roborock select entity."""
# The command that the select entity will send to the api.
api_command: RoborockCommand
# Gets the current value of the select entity.
value_fn: Callable[[DeviceProp], str | None]
# Gets all options of the select entity.
options_lambda: Callable[[DeviceProp], list[str] | None]
# Takes the value from the select entity and converts it for the api.
parameter_lambda: Callable[[str, DeviceProp], list[int]]
"""The command that the select entity will send to the API."""
value_fn: Callable[[PropertiesApi], str | None]
"""Function to get the current value of the select entity."""
options_lambda: Callable[[PropertiesApi], list[str] | None]
"""Function to get all options of the select entity or returns None if not supported."""
parameter_lambda: Callable[[str, PropertiesApi], list[int]]
"""Function to get the parameters for the api command."""
protocol_listener: RoborockDataProtocol | None = None
# If it is a dock entity
is_dock_entity: bool = False
"""Whether this entity is for the dock."""
SELECT_DESCRIPTIONS: list[RoborockSelectDescription] = [
@@ -43,34 +49,37 @@ SELECT_DESCRIPTIONS: list[RoborockSelectDescription] = [
key="water_box_mode",
translation_key="mop_intensity",
api_command=RoborockCommand.SET_WATER_BOX_CUSTOM_MODE,
value_fn=lambda data: data.status.water_box_mode_name,
value_fn=lambda api: api.status.water_box_mode_name,
entity_category=EntityCategory.CONFIG,
options_lambda=lambda data: data.status.water_box_mode.keys()
if data.status.water_box_mode is not None
else None,
parameter_lambda=lambda key, prop: [prop.status.get_mop_intensity_code(key)],
protocol_listener=RoborockDataProtocol.WATER_BOX_MODE,
options_lambda=lambda api: (
api.status.water_box_mode.keys()
if api.status.water_box_mode is not None
else None
),
parameter_lambda=lambda key, api: [api.status.get_mop_intensity_code(key)],
),
RoborockSelectDescription(
key="mop_mode",
translation_key="mop_mode",
api_command=RoborockCommand.SET_MOP_MODE,
value_fn=lambda data: data.status.mop_mode_name,
value_fn=lambda api: api.status.mop_mode_name,
entity_category=EntityCategory.CONFIG,
options_lambda=lambda data: data.status.mop_mode.keys()
if data.status.mop_mode is not None
else None,
parameter_lambda=lambda key, prop: [prop.status.get_mop_mode_code(key)],
options_lambda=lambda api: (
api.status.mop_mode.keys() if api.status.mop_mode is not None else None
),
parameter_lambda=lambda key, api: [api.status.get_mop_mode_code(key)],
),
RoborockSelectDescription(
key="dust_collection_mode",
translation_key="dust_collection_mode",
api_command=RoborockCommand.SET_DUST_COLLECTION_MODE,
value_fn=lambda data: data.dust_collection_mode_name,
value_fn=lambda api: api.dust_collection_mode.mode.name, # type: ignore[union-attr]
entity_category=EntityCategory.CONFIG,
options_lambda=lambda data: RoborockDockDustCollectionModeCode.keys()
if data.dust_collection_mode_name is not None
else None,
options_lambda=lambda api: (
RoborockDockDustCollectionModeCode.keys()
if api.dust_collection_mode is not None
else None
),
parameter_lambda=lambda key, _: [
RoborockDockDustCollectionModeCode.as_dict().get(key)
],
@@ -91,17 +100,17 @@ async def async_setup_entry(
for coordinator in config_entry.runtime_data.v1
for description in SELECT_DESCRIPTIONS
if (
options := description.options_lambda(
coordinator.roborock_device_info.props
)
(options := description.options_lambda(coordinator.properties_api))
is not None
)
is not None
)
async_add_entities(
RoborockCurrentMapSelectEntity(
f"selected_map_{coordinator.duid_slug}", coordinator
f"selected_map_{coordinator.duid_slug}", coordinator, home_trait, map_trait
)
for coordinator in config_entry.runtime_data.v1
if (home_trait := coordinator.properties_api.home) is not None
if (map_trait := coordinator.properties_api.maps) is not None
)
@@ -121,7 +130,6 @@ class RoborockSelectEntity(RoborockCoordinatedEntityV1, SelectEntity):
super().__init__(
f"{entity_description.key}_{coordinator.duid_slug}",
coordinator,
entity_description.protocol_listener,
is_dock_entity=entity_description.is_dock_entity,
)
self._attr_options = options
@@ -130,13 +138,15 @@ class RoborockSelectEntity(RoborockCoordinatedEntityV1, SelectEntity):
"""Set the option."""
await self.send(
self.entity_description.api_command,
self.entity_description.parameter_lambda(option, self.coordinator.data),
self.entity_description.parameter_lambda(
option, self.coordinator.properties_api
),
)
@property
def current_option(self) -> str | None:
"""Get the current status of the select entity from device props."""
return self.entity_description.value_fn(self.coordinator.data)
return self.entity_description.value_fn(self.coordinator.properties_api)
class RoborockCurrentMapSelectEntity(RoborockCoordinatedEntityV1, SelectEntity):
@@ -145,35 +155,60 @@ class RoborockCurrentMapSelectEntity(RoborockCoordinatedEntityV1, SelectEntity):
_attr_entity_category = EntityCategory.CONFIG
_attr_translation_key = "selected_map"
def __init__(
self,
unique_id: str,
coordinator: RoborockDataUpdateCoordinator,
home_trait: HomeTrait,
maps_trait: MapsTrait,
) -> None:
"""Create a select entity to choose the current map."""
super().__init__(unique_id, coordinator)
self._home_trait = home_trait
self._maps_trait = maps_trait
@property
def _available_map_names(self) -> dict[int, str]:
"""Get the available maps by map id."""
return {
map_id: map_.name or f"Map {map_id}"
for map_id, map_ in (self._home_trait.home_map_info or {}).items()
}
async def async_select_option(self, option: str) -> None:
"""Set the option."""
for map_id, map_ in self.coordinator.maps.items():
if map_.name == option:
await self._send_command(
RoborockCommand.LOAD_MULTI_MAP,
self.cloud_api,
[map_id],
)
# Update the current map id manually so that nothing gets broken
# if another service hits the api.
self.coordinator.current_map = map_id
for map_id, map_name in self._available_map_names.items():
if map_name == option:
try:
await self._maps_trait.set_current_map(map_id)
except RoborockException as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="command_failed",
translation_placeholders={
"command": "load_multi_map",
},
) from err
# We need to wait after updating the map
# so that other commands will be executed correctly.
await asyncio.sleep(MAP_SLEEP)
await self.coordinator.async_refresh()
try:
await self._home_trait.refresh()
except RoborockException as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="update_data_fail",
) from err
break
@property
def options(self) -> list[str]:
"""Gets all of the names of rooms that we are currently aware of."""
return [roborock_map.name for roborock_map in self.coordinator.maps.values()]
return list(self._available_map_names.values())
@property
def current_option(self) -> str | None:
"""Get the current status of the select entity from device_status."""
if (
(current_map := self.coordinator.current_map) is not None
and current_map in self.coordinator.maps
): # 63 means it is searching for a map.
return self.coordinator.maps[current_map].name
if current_map_info := self._home_trait.current_map_data:
return current_map_info.name or f"Map {current_map_info.map_flag}"
return None


@@ -5,6 +5,7 @@ from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass
import datetime
import logging
from roborock.data import (
DyadError,
@@ -16,12 +17,7 @@ from roborock.data import (
ZeoError,
ZeoState,
)
from roborock.roborock_message import (
RoborockDataProtocol,
RoborockDyadDataProtocol,
RoborockZeoProtocol,
)
from roborock.roborock_typing import DeviceProp
from roborock.roborock_message import RoborockDyadDataProtocol, RoborockZeoProtocol
from homeassistant.components.sensor import (
SensorDeviceClass,
@@ -44,6 +40,9 @@ from .entity import (
RoborockCoordinatedEntityV1,
RoborockEntity,
)
from .models import DeviceState
_LOGGER = logging.getLogger(__name__)
PARALLEL_UPDATES = 0
@@ -52,9 +51,7 @@ PARALLEL_UPDATES = 0
class RoborockSensorDescription(SensorEntityDescription):
"""A class that describes Roborock sensors."""
value_fn: Callable[[DeviceProp], StateType | datetime.datetime]
protocol_listener: RoborockDataProtocol | None = None
value_fn: Callable[[DeviceState], StateType | datetime.datetime]
# If it is a dock entity
is_dock_entity: bool = False
@@ -67,10 +64,10 @@ class RoborockSensorDescriptionA01(SensorEntityDescription):
data_protocol: RoborockDyadDataProtocol | RoborockZeoProtocol
def _dock_error_value_fn(properties: DeviceProp) -> str | None:
def _dock_error_value_fn(state: DeviceState) -> str | None:
if (
status := properties.status.dock_error_status
) is not None and properties.status.dock_type != RoborockDockTypeCode.no_dock:
status := state.status.dock_error_status
) is not None and state.status.dock_type != RoborockDockTypeCode.no_dock:
return status.name
return None
@@ -85,7 +82,6 @@ SENSOR_DESCRIPTIONS = [
translation_key="main_brush_time_left",
value_fn=lambda data: data.consumable.main_brush_time_left,
entity_category=EntityCategory.DIAGNOSTIC,
protocol_listener=RoborockDataProtocol.MAIN_BRUSH_WORK_TIME,
),
RoborockSensorDescription(
native_unit_of_measurement=UnitOfTime.SECONDS,
@@ -95,7 +91,6 @@ SENSOR_DESCRIPTIONS = [
translation_key="side_brush_time_left",
value_fn=lambda data: data.consumable.side_brush_time_left,
entity_category=EntityCategory.DIAGNOSTIC,
protocol_listener=RoborockDataProtocol.SIDE_BRUSH_WORK_TIME,
),
RoborockSensorDescription(
native_unit_of_measurement=UnitOfTime.SECONDS,
@@ -105,7 +100,6 @@ SENSOR_DESCRIPTIONS = [
translation_key="filter_time_left",
value_fn=lambda data: data.consumable.filter_time_left,
entity_category=EntityCategory.DIAGNOSTIC,
protocol_listener=RoborockDataProtocol.FILTER_WORK_TIME,
),
RoborockSensorDescription(
native_unit_of_measurement=UnitOfTime.HOURS,
@@ -166,7 +160,6 @@ SENSOR_DESCRIPTIONS = [
value_fn=lambda data: data.status.state_name,
entity_category=EntityCategory.DIAGNOSTIC,
options=RoborockStateCode.keys(),
protocol_listener=RoborockDataProtocol.STATE,
),
RoborockSensorDescription(
key="cleaning_area",
@@ -189,7 +182,6 @@ SENSOR_DESCRIPTIONS = [
value_fn=lambda data: data.status.error_code_name,
entity_category=EntityCategory.DIAGNOSTIC,
options=RoborockErrorCode.keys(),
protocol_listener=RoborockDataProtocol.ERROR_CODE,
),
RoborockSensorDescription(
key="battery",
@@ -197,23 +189,26 @@ SENSOR_DESCRIPTIONS = [
entity_category=EntityCategory.DIAGNOSTIC,
native_unit_of_measurement=PERCENTAGE,
device_class=SensorDeviceClass.BATTERY,
protocol_listener=RoborockDataProtocol.BATTERY,
),
RoborockSensorDescription(
key="last_clean_start",
translation_key="last_clean_start",
value_fn=lambda data: data.last_clean_record.begin_datetime
if data.last_clean_record is not None
else None,
value_fn=lambda data: (
data.clean_summary.last_clean_record.begin_datetime
if data.clean_summary.last_clean_record is not None
else None
),
entity_category=EntityCategory.DIAGNOSTIC,
device_class=SensorDeviceClass.TIMESTAMP,
),
RoborockSensorDescription(
key="last_clean_end",
translation_key="last_clean_end",
value_fn=lambda data: data.last_clean_record.end_datetime
if data.last_clean_record is not None
else None,
value_fn=lambda data: (
data.clean_summary.last_clean_record.end_datetime
if data.clean_summary.last_clean_record is not None
else None
),
entity_category=EntityCategory.DIAGNOSTIC,
device_class=SensorDeviceClass.TIMESTAMP,
),
@@ -246,7 +241,6 @@ SENSOR_DESCRIPTIONS = [
),
]
A01_SENSOR_DESCRIPTIONS: list[RoborockSensorDescriptionA01] = [
RoborockSensorDescriptionA01(
key="status",
@@ -340,6 +334,7 @@ async def async_setup_entry(
) -> None:
"""Set up the Roborock vacuum sensors."""
coordinators = config_entry.runtime_data
entities: list[RoborockEntity] = [
RoborockSensorEntity(
coordinator,
@@ -347,7 +342,7 @@ async def async_setup_entry(
)
for coordinator in coordinators.v1
for description in SENSOR_DESCRIPTIONS
if description.value_fn(coordinator.roborock_device_info.props) is not None
if description.value_fn(coordinator.data) is not None
]
entities.extend(RoborockCurrentRoom(coordinator) for coordinator in coordinators.v1)
entities.extend(
@@ -357,7 +352,7 @@ async def async_setup_entry(
)
for coordinator in coordinators.a01
for description in A01_SENSOR_DESCRIPTIONS
if description.data_protocol in coordinator.data
if description.data_protocol in coordinator.request_protocols
)
async_add_entities(entities)
@@ -377,16 +372,13 @@ class RoborockSensorEntity(RoborockCoordinatedEntityV1, SensorEntity):
super().__init__(
f"{description.key}_{coordinator.duid_slug}",
coordinator,
description.protocol_listener,
is_dock_entity=description.is_dock_entity,
)
@property
def native_value(self) -> StateType | datetime.datetime:
"""Return the value reported by the sensor."""
return self.entity_description.value_fn(
self.coordinator.roborock_device_info.props
)
return self.entity_description.value_fn(self.coordinator.data)
class RoborockCurrentRoom(RoborockCoordinatedEntityV1, SensorEntity):
@@ -404,30 +396,29 @@ class RoborockCurrentRoom(RoborockCoordinatedEntityV1, SensorEntity):
super().__init__(
f"current_room_{coordinator.duid_slug}",
coordinator,
None,
is_dock_entity=False,
)
self._home_trait = coordinator.properties_api.home
self._map_content_trait = coordinator.properties_api.map_content
@property
def options(self) -> list[str]:
"""Return the currently valid rooms."""
if (
self.coordinator.current_map is not None
and self.coordinator.current_map in self.coordinator.maps
):
return list(
self.coordinator.maps[self.coordinator.current_map].rooms.values()
)
if self._home_trait.current_map_data is not None:
return [room.name for room in self._home_trait.current_map_data.rooms]
return []
@property
def native_value(self) -> str | None:
"""Return the value reported by the sensor."""
if (
self.coordinator.current_map is not None
and self.coordinator.current_map in self.coordinator.maps
self._home_trait.current_map_data is not None
and self._map_content_trait.map_data is not None
and self._map_content_trait.map_data.vacuum_room is not None
):
return self.coordinator.maps[self.coordinator.current_map].current_room
for room in self._home_trait.current_map_data.rooms:
if room.segment_id == self._map_content_trait.map_data.vacuum_room:
return room.name
return None
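The `native_value` lookup above joins the map content's reported segment id against the home trait's room list. A minimal sketch of that join, using hypothetical stand-in types (the real `Room` objects come from the Roborock trait API):

```python
from dataclasses import dataclass


@dataclass
class Room:  # stand-in for the home trait's room info
    segment_id: int
    name: str


rooms = [Room(16, "Kitchen"), Room(17, "Living room")]
vacuum_room = 17  # segment id reported by the parsed map content

# Match the map's current segment id against the home trait's room list;
# fall back to None when the segment id is unknown.
current_room = next(
    (room.name for room in rooms if room.segment_id == vacuum_room),
    None,
)
```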


@@ -2,15 +2,14 @@
from __future__ import annotations
import asyncio
from collections.abc import Callable, Coroutine
from collections.abc import Callable
from dataclasses import dataclass
import logging
from typing import Any
from roborock.command_cache import CacheableAttribute
from roborock.devices.traits.v1 import PropertiesApi
from roborock.devices.traits.v1.common import RoborockSwitchBase
from roborock.exceptions import RoborockException
from roborock.version_1_apis.roborock_client_v1 import AttributeCache
from homeassistant.components.switch import SwitchEntity, SwitchEntityDescription
from homeassistant.const import EntityCategory
@@ -31,69 +30,35 @@ PARALLEL_UPDATES = 0
class RoborockSwitchDescription(SwitchEntityDescription):
"""Class to describe a Roborock switch entity."""
# Gets the status of the switch
cache_key: CacheableAttribute
# Sets the status of the switch
update_value: Callable[[AttributeCache, bool], Coroutine[Any, Any, None]]
# Attribute from cache
attribute: str
trait: Callable[[PropertiesApi], RoborockSwitchBase | None]
# If it is a dock entity
is_dock_entity: bool = False
SWITCH_DESCRIPTIONS: list[RoborockSwitchDescription] = [
RoborockSwitchDescription(
cache_key=CacheableAttribute.child_lock_status,
update_value=lambda cache, value: cache.update_value(
{"lock_status": 1 if value else 0}
),
attribute="lock_status",
trait=lambda traits: traits.child_lock,
key="child_lock",
translation_key="child_lock",
entity_category=EntityCategory.CONFIG,
is_dock_entity=True,
),
RoborockSwitchDescription(
cache_key=CacheableAttribute.flow_led_status,
update_value=lambda cache, value: cache.update_value(
{"status": 1 if value else 0}
),
attribute="status",
trait=lambda traits: traits.flow_led_status,
key="status_indicator",
translation_key="status_indicator",
entity_category=EntityCategory.CONFIG,
is_dock_entity=True,
),
RoborockSwitchDescription(
cache_key=CacheableAttribute.dnd_timer,
update_value=lambda cache, value: cache.update_value(
[
cache.value.get("start_hour"),
cache.value.get("start_minute"),
cache.value.get("end_hour"),
cache.value.get("end_minute"),
]
)
if value
else cache.close_value(),
attribute="enabled",
trait=lambda traits: traits.dnd,
key="dnd_switch",
translation_key="dnd_switch",
entity_category=EntityCategory.CONFIG,
),
RoborockSwitchDescription(
cache_key=CacheableAttribute.valley_electricity_timer,
update_value=lambda cache, value: cache.update_value(
[
cache.value.get("start_hour"),
cache.value.get("start_minute"),
cache.value.get("end_hour"),
cache.value.get("end_minute"),
]
)
if value
else cache.close_value(),
attribute="enabled",
trait=lambda traits: traits.valley_electricity_timer,
key="off_peak_switch",
translation_key="off_peak_switch",
entity_category=EntityCategory.CONFIG,
@@ -108,36 +73,19 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Roborock switch platform."""
possible_entities: list[
tuple[RoborockDataUpdateCoordinator, RoborockSwitchDescription]
] = [
(coordinator, description)
for coordinator in config_entry.runtime_data.v1
for description in SWITCH_DESCRIPTIONS
]
# We need to check if this function is supported by the device.
results = await asyncio.gather(
*(
coordinator.api.get_from_cache(description.cache_key)
for coordinator, description in possible_entities
),
return_exceptions=True,
)
valid_entities: list[RoborockSwitch] = []
for (coordinator, description), result in zip(
possible_entities, results, strict=False
):
if result is None or isinstance(result, Exception):
_LOGGER.debug("Not adding entity because of %s", result)
else:
valid_entities.append(
RoborockSwitch(
f"{description.key}_{coordinator.duid_slug}",
coordinator,
description,
)
async_add_entities(
[
RoborockSwitch(
f"{description.key}_{coordinator.duid_slug}",
coordinator,
description,
trait,
)
async_add_entities(valid_entities)
for coordinator in config_entry.runtime_data.v1
for description in SWITCH_DESCRIPTIONS
if (trait := description.trait(coordinator.properties_api)) is not None
]
)
class RoborockSwitch(RoborockEntityV1, SwitchEntity):
@@ -150,23 +98,25 @@ class RoborockSwitch(RoborockEntityV1, SwitchEntity):
unique_id: str,
coordinator: RoborockDataUpdateCoordinator,
entity_description: RoborockSwitchDescription,
trait: RoborockSwitchBase,
) -> None:
"""Initialize the entity."""
self.entity_description = entity_description
super().__init__(
unique_id,
coordinator.device_info
if not entity_description.is_dock_entity
else coordinator.dock_device_info,
coordinator.api,
(
coordinator.device_info
if not entity_description.is_dock_entity
else coordinator.dock_device_info
),
coordinator.properties_api.command,
)
self._trait = trait
async def async_turn_off(self, **kwargs: Any) -> None:
"""Turn off the switch."""
try:
await self.entity_description.update_value(
self.get_cache(self.entity_description.cache_key), False
)
await self._trait.disable()
except RoborockException as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
@@ -176,9 +126,7 @@ class RoborockSwitch(RoborockEntityV1, SwitchEntity):
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn on the switch."""
try:
await self.entity_description.update_value(
self.get_cache(self.entity_description.cache_key), True
)
await self._trait.enable()
except RoborockException as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
@@ -188,9 +136,4 @@ class RoborockSwitch(RoborockEntityV1, SwitchEntity):
@property
def is_on(self) -> bool | None:
"""Return True if entity is on."""
status = self.get_cache(self.entity_description.cache_key).value.get(
self.entity_description.attribute
)
if status is None:
return status
return bool(status)
return self._trait.is_on
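The new `async_setup_entry` above replaces the old `asyncio.gather()` cache probing with a synchronous presence check: an assignment expression (`:=`) in the comprehension's `if` clause both filters unsupported traits and binds the trait for the entity constructor. A minimal sketch of that pattern, with hypothetical stand-in types:

```python
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class Trait:  # stand-in for a properties_api trait object
    name: str


@dataclass
class Description:  # stand-in for RoborockSwitchDescription
    key: str
    # Returns the trait if the device supports it, else None.
    trait: Callable[[dict], Optional[Trait]]


# A device exposing only some traits.
api = {"child_lock": Trait("child_lock"), "dnd": Trait("dnd")}

descriptions = [
    Description("child_lock", lambda a: a.get("child_lock")),
    Description("status_indicator", lambda a: a.get("flow_led_status")),
    Description("dnd_switch", lambda a: a.get("dnd")),
]

# The walrus operator filters out unsupported descriptions and makes
# the bound trait available inside the comprehension's expression.
entities = [
    (d.key, trait)
    for d in descriptions
    if (trait := d.trait(api)) is not None
]
```

Because the trait objects already exist on `properties_api`, support can be decided without awaiting anything, which is what allows dropping the `gather()` round trips.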


@@ -1,6 +1,5 @@
"""Support for Roborock time."""
import asyncio
from collections.abc import Callable, Coroutine
from dataclasses import dataclass
import datetime
@@ -8,9 +7,8 @@ from datetime import time
import logging
from typing import Any
from roborock.command_cache import CacheableAttribute
from roborock.data import DnDTimer
from roborock.exceptions import RoborockException
from roborock.version_1_apis.roborock_client_v1 import AttributeCache
from homeassistant.components.time import TimeEntity, TimeEntityDescription
from homeassistant.const import EntityCategory
@@ -31,63 +29,67 @@ PARALLEL_UPDATES = 0
class RoborockTimeDescription(TimeEntityDescription):
"""Class to describe a Roborock time entity."""
# Gets the status of the switch
cache_key: CacheableAttribute
# Sets the status of the switch
update_value: Callable[[AttributeCache, datetime.time], Coroutine[Any, Any, None]]
# Attribute from cache
get_value: Callable[[AttributeCache], datetime.time]
trait: Callable[[Any], Any | None]
"""Function to determine if time entity is supported by the device."""
get_value: Callable[[Any], datetime.time]
"""Function to get the value from the trait."""
update_value: Callable[[Any, datetime.time], Coroutine[Any, Any, None]]
"""Function to set the value on the trait."""
TIME_DESCRIPTIONS: list[RoborockTimeDescription] = [
RoborockTimeDescription(
key="dnd_start_time",
translation_key="dnd_start_time",
cache_key=CacheableAttribute.dnd_timer,
update_value=lambda cache, desired_time: cache.update_value(
[
desired_time.hour,
desired_time.minute,
cache.value.get("end_hour"),
cache.value.get("end_minute"),
]
trait=lambda api: api.dnd,
update_value=lambda trait, desired_time: trait.set_dnd_timer(
DnDTimer(
enabled=trait.enabled,
start_hour=desired_time.hour,
start_minute=desired_time.minute,
end_hour=trait.end_hour,
end_minute=trait.end_minute,
)
),
get_value=lambda cache: datetime.time(
hour=cache.value.get("start_hour"), minute=cache.value.get("start_minute")
get_value=lambda trait: datetime.time(
hour=trait.start_hour, minute=trait.start_minute
),
entity_category=EntityCategory.CONFIG,
),
RoborockTimeDescription(
key="dnd_end_time",
translation_key="dnd_end_time",
cache_key=CacheableAttribute.dnd_timer,
update_value=lambda cache, desired_time: cache.update_value(
[
cache.value.get("start_hour"),
cache.value.get("start_minute"),
desired_time.hour,
desired_time.minute,
]
trait=lambda api: api.dnd,
update_value=lambda trait, desired_time: trait.set_dnd_timer(
DnDTimer(
enabled=trait.enabled,
start_hour=trait.start_hour,
start_minute=trait.start_minute,
end_hour=desired_time.hour,
end_minute=desired_time.minute,
)
),
get_value=lambda cache: datetime.time(
hour=cache.value.get("end_hour"), minute=cache.value.get("end_minute")
get_value=lambda trait: datetime.time(
hour=trait.end_hour, minute=trait.end_minute
),
entity_category=EntityCategory.CONFIG,
),
RoborockTimeDescription(
key="off_peak_start",
translation_key="off_peak_start",
cache_key=CacheableAttribute.valley_electricity_timer,
update_value=lambda cache, desired_time: cache.update_value(
trait=lambda api: api.valley_electricity_timer,
update_value=lambda trait, desired_time: trait.update_value(
[
desired_time.hour,
desired_time.minute,
cache.value.get("end_hour"),
cache.value.get("end_minute"),
trait.end_hour,
trait.end_minute,
]
),
get_value=lambda cache: datetime.time(
hour=cache.value.get("start_hour"), minute=cache.value.get("start_minute")
get_value=lambda trait: datetime.time(
hour=trait.start_hour, minute=trait.start_minute
),
entity_category=EntityCategory.CONFIG,
entity_registry_enabled_default=False,
@@ -95,17 +97,17 @@ TIME_DESCRIPTIONS: list[RoborockTimeDescription] = [
RoborockTimeDescription(
key="off_peak_end",
translation_key="off_peak_end",
cache_key=CacheableAttribute.valley_electricity_timer,
update_value=lambda cache, desired_time: cache.update_value(
trait=lambda api: api.valley_electricity_timer,
update_value=lambda trait, desired_time: trait.update_value(
[
cache.value.get("start_hour"),
cache.value.get("start_minute"),
trait.start_hour,
trait.start_minute,
desired_time.hour,
desired_time.minute,
]
),
get_value=lambda cache: datetime.time(
hour=cache.value.get("end_hour"), minute=cache.value.get("end_minute")
get_value=lambda trait: datetime.time(
hour=trait.end_hour, minute=trait.end_minute
),
entity_category=EntityCategory.CONFIG,
entity_registry_enabled_default=False,
@@ -119,36 +121,19 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Roborock time platform."""
possible_entities: list[
tuple[RoborockDataUpdateCoordinator, RoborockTimeDescription]
] = [
(coordinator, description)
for coordinator in config_entry.runtime_data.v1
for description in TIME_DESCRIPTIONS
]
# We need to check if this function is supported by the device.
results = await asyncio.gather(
*(
coordinator.api.get_from_cache(description.cache_key)
for coordinator, description in possible_entities
),
return_exceptions=True,
)
valid_entities: list[RoborockTimeEntity] = []
for (coordinator, description), result in zip(
possible_entities, results, strict=False
):
if result is None or isinstance(result, RoborockException):
_LOGGER.debug("Not adding entity because of %s", result)
else:
valid_entities.append(
RoborockTimeEntity(
f"{description.key}_{coordinator.duid_slug}",
coordinator,
description,
)
async_add_entities(
[
RoborockTimeEntity(
f"{description.key}_{coordinator.duid_slug}",
coordinator,
description,
trait,
)
async_add_entities(valid_entities)
for coordinator in config_entry.runtime_data.v1
for description in TIME_DESCRIPTIONS
if (trait := description.trait(coordinator.properties_api)) is not None
]
)
class RoborockTimeEntity(RoborockEntityV1, TimeEntity):
@@ -161,24 +146,24 @@ class RoborockTimeEntity(RoborockEntityV1, TimeEntity):
unique_id: str,
coordinator: RoborockDataUpdateCoordinator,
entity_description: RoborockTimeDescription,
trait: Any,
) -> None:
"""Create a time entity."""
self.entity_description = entity_description
super().__init__(unique_id, coordinator.device_info, coordinator.api)
super().__init__(
unique_id, coordinator.device_info, api=coordinator.properties_api.command
)
self._trait = trait
@property
def native_value(self) -> time | None:
"""Return the value reported by the time."""
return self.entity_description.get_value(
self.get_cache(self.entity_description.cache_key)
)
return self.entity_description.get_value(self._trait)
async def async_set_value(self, value: time) -> None:
"""Set the time."""
try:
await self.entity_description.update_value(
self.get_cache(self.entity_description.cache_key), value
)
await self.entity_description.update_value(self._trait, value)
except RoborockException as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
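The DnD time entities above follow a read-modify-write pattern: copy every field of the trait's current timer and replace only the hour/minute being set. A sketch of that update, using a stand-in for `roborock.data.DnDTimer`:

```python
import datetime
from dataclasses import dataclass, replace


@dataclass(frozen=True)
class DnDTimer:  # stand-in for roborock.data.DnDTimer
    enabled: bool
    start_hour: int
    start_minute: int
    end_hour: int
    end_minute: int


# Current device state: DnD from 22:00 to 07:30.
current = DnDTimer(True, 22, 0, 7, 30)

# Setting dnd_start_time: keep every other field, swap in the new start.
desired = datetime.time(hour=23, minute=15)
updated = replace(current, start_hour=desired.hour, start_minute=desired.minute)
```

Building a whole new timer object (rather than mutating one field) matches the trait API's `set_dnd_timer`, which takes the complete timer as its payload.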


@@ -1,9 +1,10 @@
"""Support for Roborock vacuum class."""
import logging
from typing import Any
from roborock.data import RoborockStateCode
from roborock.roborock_message import RoborockDataProtocol
from roborock.exceptions import RoborockException
from roborock.roborock_typing import RoborockCommand
import voluptuous as vol
@@ -26,6 +27,8 @@ from .const import (
from .coordinator import RoborockConfigEntry, RoborockDataUpdateCoordinator
from .entity import RoborockCoordinatedEntityV1
_LOGGER = logging.getLogger(__name__)
STATE_CODE_TO_STATE = {
RoborockStateCode.starting: VacuumActivity.IDLE, # "Starting"
RoborockStateCode.charger_disconnected: VacuumActivity.IDLE, # "Charger disconnected"
@@ -62,11 +65,8 @@ async def async_setup_entry(
) -> None:
"""Set up the Roborock sensor."""
async_add_entities(
RoborockVacuum(coordinator)
for coordinator in config_entry.runtime_data.v1
if isinstance(coordinator, RoborockDataUpdateCoordinator)
RoborockVacuum(coordinator) for coordinator in config_entry.runtime_data.v1
)
platform = entity_platform.async_get_current_platform()
platform.async_register_entity_service(
@@ -124,12 +124,12 @@ class RoborockVacuum(RoborockCoordinatedEntityV1, StateVacuumEntity):
self,
coordinator.duid_slug,
coordinator,
listener_request=[
RoborockDataProtocol.FAN_POWER,
RoborockDataProtocol.STATE,
],
)
self._attr_fan_speed_list = self._device_status.fan_power_options
@property
def fan_speed_list(self) -> list[str]:
"""Get the list of available fan speeds."""
return self._device_status.fan_power_options
@property
def activity(self) -> VacuumActivity | None:
@@ -197,32 +197,40 @@ class RoborockVacuum(RoborockCoordinatedEntityV1, StateVacuumEntity):
async def get_maps(self) -> ServiceResponse:
"""Get map information such as map id and room ids."""
home_trait = self.coordinator.properties_api.home
return {
"maps": [
{
"flag": vacuum_map.flag,
"flag": vacuum_map.map_flag,
"name": vacuum_map.name,
# JsonValueType does not accept an int as a key - this was not an
# issue with the previous asdict() implementation.
"rooms": vacuum_map.rooms, # type: ignore[dict-item]
"rooms": {
# JsonValueType does not accept an int as a key - this was not an
# issue with the previous asdict() implementation.
room.segment_id: room.name # type: ignore[misc]
for room in vacuum_map.rooms
},
}
for vacuum_map in self.coordinator.maps.values()
for vacuum_map in (home_trait.home_map_info or {}).values()
]
}
async def get_vacuum_current_position(self) -> ServiceResponse:
"""Get the current position of the vacuum from the map."""
map_data = await self.coordinator.cloud_api.get_map_v1()
if not isinstance(map_data, bytes):
map_content_trait = self.coordinator.properties_api.map_content
try:
await map_content_trait.refresh()
except RoborockException as err:
_LOGGER.debug("Failed to refresh map content: %s", err)
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="map_failure",
) from err
if map_content_trait.map_data is None:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="map_failure",
)
parsed_map = self.coordinator.map_parser.parse(map_data)
robot_position = parsed_map.vacuum_position
if robot_position is None:
if (robot_position := map_content_trait.map_data.vacuum_position) is None:
raise HomeAssistantError(
translation_domain=DOMAIN, translation_key="position_not_found"
)


@@ -11,8 +11,9 @@ from typing import TYPE_CHECKING, Any, cast
from propcache.api import cached_property
import voluptuous as vol
from homeassistant.components import websocket_api
from homeassistant.components import automation, websocket_api
from homeassistant.components.blueprint import CONF_USE_BLUEPRINT
from homeassistant.components.labs import async_listen as async_labs_listen
from homeassistant.const import (
ATTR_ENTITY_ID,
ATTR_MODE,
@@ -280,6 +281,21 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
hass.services.async_register(
DOMAIN, SERVICE_TOGGLE, toggle_service, schema=SCRIPT_TURN_ONOFF_SCHEMA
)
@callback
def new_triggers_conditions_listener() -> None:
"""Handle new_triggers_conditions flag change."""
hass.async_create_task(
reload_service(ServiceCall(hass, DOMAIN, SERVICE_RELOAD))
)
async_labs_listen(
hass,
automation.DOMAIN,
automation.NEW_TRIGGERS_CONDITIONS_FEATURE_FLAG,
new_triggers_conditions_listener,
)
websocket_api.async_register_command(hass, websocket_config)
return True


@@ -170,9 +170,6 @@ async def _async_setup_block_entry(
device_entry = dev_reg.async_get_device(
connections={(CONNECTION_NETWORK_MAC, dr.format_mac(entry.unique_id))},
)
# https://github.com/home-assistant/core/pull/48076
if device_entry and entry.entry_id not in device_entry.config_entries:
device_entry = None
sleep_period = entry.data.get(CONF_SLEEP_PERIOD)
runtime_data = entry.runtime_data
@@ -283,9 +280,6 @@ async def _async_setup_rpc_entry(hass: HomeAssistant, entry: ShellyConfigEntry)
device_entry = dev_reg.async_get_device(
connections={(CONNECTION_NETWORK_MAC, dr.format_mac(entry.unique_id))},
)
# https://github.com/home-assistant/core/pull/48076
if device_entry and entry.entry_id not in device_entry.config_entries:
device_entry = None
sleep_period = entry.data.get(CONF_SLEEP_PERIOD)
runtime_data = entry.runtime_data


@@ -25,7 +25,6 @@ from .coordinator import ShellyBlockCoordinator, ShellyConfigEntry, ShellyRpcCoo
from .utils import (
async_remove_shelly_entity,
get_block_device_info,
get_entity_translation_attributes,
get_rpc_channel_name,
get_rpc_device_info,
get_rpc_key,
@@ -598,17 +597,14 @@ class ShellyRpcAttributeEntity(ShellyRpcEntity, Entity):
def configure_translation_attributes(self) -> None:
"""Configure translation attributes."""
translation_placeholders, translation_key = get_entity_translation_attributes(
get_rpc_channel_name(self.coordinator.device, self.key),
self.entity_description.translation_key,
self.entity_description.device_class,
self._default_to_device_class_name(),
)
if translation_placeholders:
self._attr_translation_placeholders = translation_placeholders
if translation_key:
self._attr_translation_key = translation_key
if (
channel_name := get_rpc_channel_name(self.coordinator.device, self.key)
) and (
translation_key := self.entity_description.translation_key
or (self.device_class if self._default_to_device_class_name() else None)
):
self._attr_translation_placeholders = {"channel_name": channel_name}
self._attr_translation_key = f"{translation_key}_with_channel_name"
class ShellySleepingBlockAttributeEntity(ShellyBlockAttributeEntity):


@@ -63,8 +63,6 @@ from .utils import (
get_blu_trv_device_info,
get_device_entry_gen,
get_device_uptime,
get_entity_translation_attributes,
get_rpc_channel_name,
get_shelly_air_lamp_life,
get_virtual_component_unit,
is_rpc_wifi_stations_disabled,
@@ -106,27 +104,15 @@ class RpcSensor(ShellyRpcAttributeEntity, SensorEntity):
"""Initialize select."""
super().__init__(coordinator, key, attribute, description)
if not description.role:
translation_placeholders, translation_key = (
get_entity_translation_attributes(
get_rpc_channel_name(coordinator.device, key),
description.translation_key,
description.device_class,
self._default_to_device_class_name(),
)
)
if translation_placeholders:
self._attr_translation_placeholders = translation_placeholders
if translation_key:
self._attr_translation_key = translation_key
if self.option_map:
if description.role == ROLE_GENERIC:
self._attr_options = list(self.option_map.values())
else:
self._attr_options = list(self.option_map)
if not description.role:
self.configure_translation_attributes()
@property
def native_value(self) -> StateType:
"""Return value of sensor."""
@@ -1898,19 +1884,7 @@ class RpcSleepingSensor(ShellySleepingRpcAttributeEntity, RestoreSensor):
self.restored_data: SensorExtraStoredData | None = None
if coordinator.device.initialized:
translation_placeholders, translation_key = (
get_entity_translation_attributes(
get_rpc_channel_name(coordinator.device, key),
description.translation_key,
description.device_class,
self._default_to_device_class_name(),
)
)
if translation_placeholders:
self._attr_translation_placeholders = translation_placeholders
if translation_key:
self._attr_translation_key = translation_key
self.configure_translation_attributes()
async def async_added_to_hass(self) -> None:
"""Handle entity which will be added."""


@@ -459,23 +459,6 @@ def get_rpc_sub_device_name(
return f"{device.name} {component.title()} {component_id}"
def get_entity_translation_attributes(
channel_name: str | None,
translation_key: str | None,
device_class: str | None,
default_to_device_class_name: bool,
) -> tuple[dict[str, str] | None, str | None]:
"""Translation attributes for entity with channel name."""
if channel_name is None:
return None, None
key = translation_key
if key is None and default_to_device_class_name:
key = device_class
return {"channel_name": channel_name}, f"{key}_with_channel_name" if key else None
def get_device_entry_gen(entry: ConfigEntry) -> int:
"""Return the device generation from config entry."""
return entry.data.get(CONF_GEN, 1) # type: ignore[no-any-return]


@@ -4,6 +4,7 @@ from __future__ import annotations
from collections.abc import Callable
import contextlib
from copy import deepcopy
from dataclasses import dataclass
from http import HTTPStatus
import logging
@@ -22,6 +23,7 @@ from pysmartthings import (
SmartThings,
SmartThingsAuthenticationFailedError,
SmartThingsConnectionError,
SmartThingsError,
SmartThingsSinkError,
Status,
)
@@ -34,6 +36,7 @@ from homeassistant.const import (
ATTR_MANUFACTURER,
ATTR_MODEL,
ATTR_MODEL_ID,
ATTR_SERIAL_NUMBER,
ATTR_SUGGESTED_AREA,
ATTR_SW_VERSION,
ATTR_VIA_DEVICE,
@@ -412,6 +415,33 @@ async def async_migrate_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
minor_version=2,
)
if entry.minor_version < 3:
data = deepcopy(dict(entry.data))
old_data: dict[str, Any] | None = data.pop(OLD_DATA, None)
if old_data is not None:
_LOGGER.info("Found old data during migration")
client = SmartThings(session=async_get_clientsession(hass))
access_token = old_data[CONF_ACCESS_TOKEN]
installed_app_id = old_data[CONF_INSTALLED_APP_ID]
try:
app = await client.get_installed_app(access_token, installed_app_id)
_LOGGER.info("Found old app %s, named %s", app.app_id, app.display_name)
await client.delete_installed_app(access_token, installed_app_id)
await client.delete_smart_app(access_token, app.app_id)
except SmartThingsError as err:
_LOGGER.warning(
"Could not clean up old smart app during migration: %s", err
)
else:
_LOGGER.info("Successfully cleaned up old smart app during migration")
if CONF_TOKEN not in data:
data[OLD_DATA] = {CONF_LOCATION_ID: old_data[CONF_LOCATION_ID]}
hass.config_entries.async_update_entry(
entry,
data=data,
minor_version=3,
)
return True
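The shape of the cleanup in the migration above — delete the legacy installed app, then the smart app, and downgrade any `SmartThingsError` to a warning so the config-entry migration itself still succeeds — can be sketched in isolation. The client below is a hypothetical stand-in, not the pysmartthings API:

```python
import asyncio
import logging

_LOGGER = logging.getLogger(__name__)


class SmartThingsError(Exception):
    """Stand-in for the library's base error."""


async def clean_up_old_app(client, access_token: str, installed_app_id: str) -> bool:
    """Best-effort removal of the legacy app; never raises on cloud errors."""
    try:
        app = await client.get_installed_app(access_token, installed_app_id)
        await client.delete_installed_app(access_token, installed_app_id)
        await client.delete_smart_app(access_token, app.app_id)
    except SmartThingsError as err:
        # Failure here must not abort the migration.
        _LOGGER.warning("Could not clean up old smart app during migration: %s", err)
        return False
    _LOGGER.info("Successfully cleaned up old smart app during migration")
    return True
```

The try/except/else in the actual diff serves the same purpose: cleanup is best-effort, and the entry is bumped to `minor_version=3` regardless.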
@@ -479,6 +509,14 @@ def create_devices(
ATTR_SW_VERSION: viper.software_version,
}
)
if (matter := device.device.matter) is not None:
kwargs.update(
{
ATTR_HW_VERSION: matter.hardware_version,
ATTR_SW_VERSION: matter.software_version,
ATTR_SERIAL_NUMBER: matter.serial_number,
}
)
if (
device_registry.async_get_device({(DOMAIN, device.device.device_id)})
is None


@@ -20,7 +20,7 @@ class SmartThingsConfigFlow(AbstractOAuth2FlowHandler, domain=DOMAIN):
"""Handle configuration of SmartThings integrations."""
VERSION = 3
MINOR_VERSION = 2
MINOR_VERSION = 3
DOMAIN = DOMAIN
@property


@@ -18,7 +18,7 @@ from homeassistant.const import (
)
from homeassistant.core import Event, HomeAssistant, ServiceCall
from homeassistant.exceptions import ConfigEntryError, HomeAssistantError
from homeassistant.helpers import discovery
from homeassistant.helpers import discovery, issue_registry as ir
from homeassistant.helpers.device import (
async_remove_stale_devices_links_keep_current_device,
)
@@ -30,12 +30,21 @@ from homeassistant.util.hass_dict import HassKey
from .const import CONF_MAX, CONF_MIN, CONF_STEP, DOMAIN, PLATFORMS
from .coordinator import TriggerUpdateCoordinator
from .helpers import async_get_blueprints
from .helpers import DATA_DEPRECATION, async_get_blueprints
_LOGGER = logging.getLogger(__name__)
DATA_COORDINATORS: HassKey[list[TriggerUpdateCoordinator]] = HassKey(DOMAIN)
def _clean_up_legacy_template_deprecations(hass: HomeAssistant) -> None:
if (found_issues := hass.data.pop(DATA_DEPRECATION, None)) is not None:
issue_registry = ir.async_get(hass)
for domain, issue_id in set(issue_registry.issues):
if domain != DOMAIN or issue_id in found_issues:
continue
ir.async_delete_issue(hass, DOMAIN, issue_id)
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the template integration."""
@@ -54,6 +63,8 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
async def _reload_config(call: Event | ServiceCall) -> None:
"""Reload top-level + platforms."""
hass.data.pop(DATA_DEPRECATION, None)
await async_get_blueprints(hass).async_reset_cache()
try:
unprocessed_conf = await conf_util.async_hass_config_yaml(hass)
@@ -74,6 +85,7 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
if DOMAIN in conf:
await _process_config(hass, conf)
_clean_up_legacy_template_deprecations(hass)
hass.bus.async_fire(f"event_{DOMAIN}_reloaded", context=call.context)
async_register_admin_service(hass, DOMAIN, SERVICE_RELOAD, _reload_config)


@@ -72,7 +72,11 @@ from . import (
weather as weather_platform,
)
from .const import CONF_DEFAULT_ENTITY_ID, DOMAIN, PLATFORMS, TemplateConfig
from .helpers import async_get_blueprints, rewrite_legacy_to_modern_configs
from .helpers import (
async_get_blueprints,
create_legacy_template_issue,
rewrite_legacy_to_modern_configs,
)
_LOGGER = logging.getLogger(__name__)
@@ -386,11 +390,11 @@ async def async_validate_config(hass: HomeAssistant, config: ConfigType) -> Conf
definitions = (
list(template_config[new_key]) if new_key in template_config else []
)
definitions.extend(
rewrite_legacy_to_modern_configs(
hass, new_key, template_config[old_key], legacy_fields
)
)
for definition in rewrite_legacy_to_modern_configs(
hass, new_key, template_config[old_key], legacy_fields
):
create_legacy_template_issue(hass, definition, new_key)
definitions.append(definition)
template_config = TemplateConfig({**template_config, new_key: definitions})
config_sections.append(template_config)


@@ -1,6 +1,8 @@
"""Helpers for template integration."""
from collections.abc import Callable
from enum import Enum
import hashlib
import itertools
import logging
from typing import Any
@@ -22,15 +24,18 @@ from homeassistant.const import (
)
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import PlatformNotReady
from homeassistant.helpers import template
from homeassistant.helpers import issue_registry as ir, template
from homeassistant.helpers.entity import Entity
from homeassistant.helpers.entity_platform import (
AddConfigEntryEntitiesCallback,
AddEntitiesCallback,
async_get_platforms,
)
from homeassistant.helpers.issue_registry import IssueSeverity
from homeassistant.helpers.singleton import singleton
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from homeassistant.util import yaml as yaml_util
from homeassistant.util.hass_dict import HassKey
from .const import (
CONF_ADVANCED_OPTIONS,
@@ -46,7 +51,10 @@ from .entity import AbstractTemplateEntity
from .template_entity import TemplateEntity
from .trigger_entity import TriggerEntity
LEGACY_TEMPLATE_DEPRECATION_KEY = "deprecate_legacy_templates"
DATA_BLUEPRINTS = "template_blueprints"
DATA_DEPRECATION: HassKey[list[str]] = HassKey(LEGACY_TEMPLATE_DEPRECATION_KEY)
LEGACY_FIELDS = {
CONF_ICON_TEMPLATE: CONF_ICON,
@@ -180,6 +188,95 @@ def async_create_template_tracking_entities(
async_add_entities(entities)
def _format_template(value: Any) -> Any:
if isinstance(value, template.Template):
return value.template
if isinstance(value, Enum):
return value.name
if isinstance(value, (int, float, str, bool)):
return value
return str(value)
def format_migration_config(
config: ConfigType | list[ConfigType], depth: int = 0
) -> ConfigType | list[ConfigType]:
"""Recursive method to format templates as strings from ConfigType."""
types = (dict, list)
if depth > 9:
raise RecursionError
if isinstance(config, list):
items = []
for item in config:
if isinstance(item, types):
if len(item) > 0:
items.append(format_migration_config(item, depth + 1))
else:
items.append(_format_template(item))
return items # type: ignore[return-value]
formatted_config = {}
for field, value in config.items():
if isinstance(value, types):
if len(value) > 0:
formatted_config[field] = format_migration_config(value, depth + 1)
else:
formatted_config[field] = _format_template(value)
return formatted_config
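`format_migration_config` above walks the legacy config to at most ten levels, passing nested non-empty dicts/lists through recursively and reducing everything else (templates, enums, odd objects) to YAML-safe scalars. A simplified standalone version, with a stand-in for `_format_template` since the real one needs `homeassistant.helpers.template`:

```python
from enum import Enum
from typing import Any


def _format_value(value: Any) -> Any:
    # Stand-in for _format_template: a Template object would be reduced
    # to its source string here.
    if isinstance(value, Enum):
        return value.name
    if isinstance(value, (int, float, str, bool)):
        return value
    return str(value)


def format_config(config, depth: int = 0):
    """Depth-limited recursive walk, mirroring format_migration_config."""
    if depth > 9:
        raise RecursionError
    if isinstance(config, list):
        return [
            format_config(item, depth + 1)
            if isinstance(item, (dict, list)) and item
            else _format_value(item)
            for item in config
        ]
    return {
        field: format_config(value, depth + 1)
        if isinstance(value, (dict, list)) and value
        else _format_value(value)
        for field, value in config.items()
    }
```

Note that, as in the original, an *empty* dict or list falls through to the scalar formatter and is stringified, and pathological nesting is cut off with `RecursionError`, which the caller turns into a `...` placeholder in the repair text.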
def create_legacy_template_issue(
hass: HomeAssistant, config: ConfigType, domain: str
) -> None:
"""Create a repair for legacy template entities."""
breadcrumb = "Template Entity"
# Default entity id should be in most legacy configuration because
# it's created from the legacy slug. Vacuum and Lock do not have a
# slug, therefore we need to use the name or unique_id.
if (default_entity_id := config.get(CONF_DEFAULT_ENTITY_ID)) is not None:
breadcrumb = default_entity_id.split(".")[-1]
elif (unique_id := config.get(CONF_UNIQUE_ID)) is not None:
breadcrumb = f"unique_id: {unique_id}"
elif (name := config.get(CONF_NAME)) and isinstance(name, template.Template):
breadcrumb = name.template
issue_id = f"{LEGACY_TEMPLATE_DEPRECATION_KEY}_{domain}_{breadcrumb}_{hashlib.md5(','.join(config.keys()).encode()).hexdigest()}"
if (deprecation_list := hass.data.get(DATA_DEPRECATION)) is None:
hass.data[DATA_DEPRECATION] = deprecation_list = []
deprecation_list.append(issue_id)
try:
modified_yaml = format_migration_config(config)
yaml_config = yaml_util.dump({DOMAIN: [{domain: [modified_yaml]}]})
# Format to show up properly in a numbered bullet on the repair.
yaml_config = " ```\n " + yaml_config.replace("\n", "\n ") + "```"
except RecursionError:
yaml_config = f"{DOMAIN}:\n - {domain}: - ..."
ir.async_create_issue(
hass,
DOMAIN,
issue_id,
breaks_in_ha_version="2026.6",
is_fixable=False,
severity=IssueSeverity.WARNING,
translation_key="deprecated_legacy_templates",
translation_placeholders={
"domain": domain,
"breadcrumb": breadcrumb,
"config": yaml_config,
},
)
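The repair's stable `issue_id` above is built from a fixed prefix, the target domain, a human-readable breadcrumb, and an md5 digest of the legacy config's top-level keys. A minimal sketch of just that derivation (the domain, breadcrumb, and keys below are illustrative):

```python
import hashlib

LEGACY_TEMPLATE_DEPRECATION_KEY = "deprecate_legacy_templates"


def build_issue_id(domain: str, breadcrumb: str, config_keys: list[str]) -> str:
    """Derive a deterministic repair id from the legacy config's key set."""
    digest = hashlib.md5(",".join(config_keys).encode()).hexdigest()
    return f"{LEGACY_TEMPLATE_DEPRECATION_KEY}_{domain}_{breadcrumb}_{digest}"
```

Because the id is deterministic for the same config shape, re-running validation on reload re-creates the same issue instead of stacking duplicates, and `_clean_up_legacy_template_deprecations` can delete any issue whose id is no longer in the freshly built list.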
async def async_setup_template_platform(
hass: HomeAssistant,
domain: str,
@@ -201,6 +298,10 @@ async def async_setup_template_platform(
)
else:
configs = [rewrite_legacy_to_modern_config(hass, config, legacy_fields)]
for definition in configs:
create_legacy_template_issue(hass, definition, domain)
async_create_template_tracking_entities(
state_entity_cls,
async_add_entities,


@@ -2,7 +2,6 @@
from __future__ import annotations
from collections.abc import Generator, Sequence
import logging
from typing import TYPE_CHECKING, Any
@@ -17,8 +16,6 @@ from homeassistant.components.light import (
ATTR_RGBW_COLOR,
ATTR_RGBWW_COLOR,
ATTR_TRANSITION,
DEFAULT_MAX_KELVIN,
DEFAULT_MIN_KELVIN,
DOMAIN as LIGHT_DOMAIN,
ENTITY_ID_FORMAT,
PLATFORM_SCHEMA as LIGHT_PLATFORM_SCHEMA,
@@ -72,6 +69,7 @@ _LOGGER = logging.getLogger(__name__)
_VALID_STATES = [STATE_ON, STATE_OFF, "true", "false"]
# Legacy
ATTR_COLOR_TEMP = "color_temp"
CONF_COLOR_ACTION = "set_color"
CONF_COLOR_TEMPLATE = "color_template"
@@ -289,24 +287,14 @@ class AbstractTemplateLight(AbstractTemplateEntity, LightEntity):
self._supports_transition_template = config.get(CONF_SUPPORTS_TRANSITION)
# Stored values for template attributes
self._state = initial_state
self._brightness = None
self._temperature: int | None = None
self._hs_color = None
self._rgb_color = None
self._rgbw_color = None
self._rgbww_color = None
self._effect = None
self._effect_list = None
self._max_mireds = None
self._min_mireds = None
self._attr_is_on = initial_state
self._supports_transition = False
self._color_mode: ColorMode | None = None
self._supported_color_modes: set[ColorMode] | None = None
self._attr_color_mode: ColorMode | None = None
def _iterate_scripts(
self, config: dict[str, Any]
) -> Generator[tuple[str, Sequence[dict[str, Any]], ColorMode | None]]:
def _setup_light_features(self, config: ConfigType, name: str) -> None:
"""Setup light scripts, supported color modes, and supported features."""
color_modes = {ColorMode.ONOFF}
for action_id, color_mode in (
(CONF_ON_ACTION, None),
(CONF_OFF_ACTION, None),
@@ -319,80 +307,21 @@ class AbstractTemplateLight(AbstractTemplateEntity, LightEntity):
(CONF_RGBWW_ACTION, ColorMode.RGBWW),
):
if (action_config := config.get(action_id)) is not None:
yield (action_id, action_config, color_mode)
self.add_script(action_id, action_config, name, DOMAIN)
if color_mode:
color_modes.add(color_mode)
@property
def brightness(self) -> int | None:
"""Return the brightness of the light."""
return self._brightness
self._attr_supported_color_modes = filter_supported_color_modes(color_modes)
if len(self._attr_supported_color_modes) > 1:
self._attr_color_mode = ColorMode.UNKNOWN
if len(self._attr_supported_color_modes) == 1:
self._attr_color_mode = next(iter(self._attr_supported_color_modes))
@property
def color_temp_kelvin(self) -> int | None:
"""Return the color temperature value in Kelvin."""
if self._temperature is None:
return None
return color_util.color_temperature_mired_to_kelvin(self._temperature)
@property
def min_color_temp_kelvin(self) -> int:
"""Return the warmest color_temp_kelvin that this light supports."""
if self._max_mireds is not None:
return color_util.color_temperature_mired_to_kelvin(self._max_mireds)
return DEFAULT_MIN_KELVIN
@property
def max_color_temp_kelvin(self) -> int:
"""Return the coldest color_temp_kelvin that this light supports."""
if self._min_mireds is not None:
return color_util.color_temperature_mired_to_kelvin(self._min_mireds)
return DEFAULT_MAX_KELVIN
@property
def hs_color(self) -> tuple[float, float] | None:
"""Return the hue and saturation color value [float, float]."""
return self._hs_color
@property
def rgb_color(self) -> tuple[int, int, int] | None:
"""Return the rgb color value."""
return self._rgb_color
@property
def rgbw_color(self) -> tuple[int, int, int, int] | None:
"""Return the rgbw color value."""
return self._rgbw_color
@property
def rgbww_color(self) -> tuple[int, int, int, int, int] | None:
"""Return the rgbww color value."""
return self._rgbww_color
@property
def effect(self) -> str | None:
"""Return the effect."""
return self._effect
@property
def effect_list(self) -> list[str] | None:
"""Return the effect list."""
return self._effect_list
@property
def color_mode(self) -> ColorMode | None:
"""Return current color mode."""
return self._color_mode
@property
def supported_color_modes(self) -> set[ColorMode] | None:
"""Flag supported color modes."""
return self._supported_color_modes
@property
def is_on(self) -> bool | None:
"""Return true if device is on."""
return self._state
self._attr_supported_features = LightEntityFeature(0)
if self._action_scripts.get(CONF_EFFECT_ACTION):
self._attr_supported_features |= LightEntityFeature.EFFECT
if self._supports_transition is True:
self._attr_supported_features |= LightEntityFeature.TRANSITION
def set_optimistic_attributes(self, **kwargs) -> bool: # noqa: C901
"""Update attributes which should be set optimistically.
@@ -401,34 +330,52 @@ class AbstractTemplateLight(AbstractTemplateEntity, LightEntity):
"""
optimistic_set = False
if self._attr_assumed_state:
self._state = True
self._attr_is_on = True
optimistic_set = True
if self._level_template is None and ATTR_BRIGHTNESS in kwargs:
_LOGGER.debug(
"Optimistically setting brightness to %s", kwargs[ATTR_BRIGHTNESS]
)
self._brightness = kwargs[ATTR_BRIGHTNESS]
self._attr_brightness = kwargs[ATTR_BRIGHTNESS]
optimistic_set = True
if self._temperature_template is None and ATTR_COLOR_TEMP_KELVIN in kwargs:
color_temp = color_util.color_temperature_kelvin_to_mired(
kwargs[ATTR_COLOR_TEMP_KELVIN]
)
color_temp = kwargs[ATTR_COLOR_TEMP_KELVIN]
_LOGGER.debug(
"Optimistically setting color temperature to %s",
color_temp,
)
self._color_mode = ColorMode.COLOR_TEMP
self._temperature = color_temp
self._attr_color_mode = ColorMode.COLOR_TEMP
self._attr_color_temp_kelvin = color_temp
if self._hs_template is None:
self._hs_color = None
self._attr_hs_color = None
if self._rgb_template is None:
self._rgb_color = None
self._attr_rgb_color = None
if self._rgbw_template is None:
self._rgbw_color = None
self._attr_rgbw_color = None
if self._rgbww_template is None:
self._rgbww_color = None
self._attr_rgbww_color = None
optimistic_set = True
if self._temperature_template is None and ATTR_COLOR_TEMP in kwargs:
color_temp = kwargs[ATTR_COLOR_TEMP]
_LOGGER.debug(
"Optimistically setting color temperature to %s",
color_temp,
)
self._attr_color_mode = ColorMode.COLOR_TEMP
self._attr_color_temp_kelvin = color_util.color_temperature_mired_to_kelvin(
color_temp
)
if self._hs_template is None:
self._attr_hs_color = None
if self._rgb_template is None:
self._attr_rgb_color = None
if self._rgbw_template is None:
self._attr_rgbw_color = None
if self._rgbww_template is None:
self._attr_rgbww_color = None
optimistic_set = True
if self._hs_template is None and ATTR_HS_COLOR in kwargs:
@@ -436,16 +383,16 @@ class AbstractTemplateLight(AbstractTemplateEntity, LightEntity):
"Optimistically setting hs color to %s",
kwargs[ATTR_HS_COLOR],
)
self._color_mode = ColorMode.HS
self._hs_color = kwargs[ATTR_HS_COLOR]
self._attr_color_mode = ColorMode.HS
self._attr_hs_color = kwargs[ATTR_HS_COLOR]
if self._temperature_template is None:
self._temperature = None
self._attr_color_temp_kelvin = None
if self._rgb_template is None:
self._rgb_color = None
self._attr_rgb_color = None
if self._rgbw_template is None:
self._rgbw_color = None
self._attr_rgbw_color = None
if self._rgbww_template is None:
self._rgbww_color = None
self._attr_rgbww_color = None
optimistic_set = True
if self._rgb_template is None and ATTR_RGB_COLOR in kwargs:
@@ -453,16 +400,16 @@ class AbstractTemplateLight(AbstractTemplateEntity, LightEntity):
"Optimistically setting rgb color to %s",
kwargs[ATTR_RGB_COLOR],
)
self._color_mode = ColorMode.RGB
self._rgb_color = kwargs[ATTR_RGB_COLOR]
self._attr_color_mode = ColorMode.RGB
self._attr_rgb_color = kwargs[ATTR_RGB_COLOR]
if self._temperature_template is None:
self._temperature = None
self._attr_color_temp_kelvin = None
if self._hs_template is None:
self._hs_color = None
self._attr_hs_color = None
if self._rgbw_template is None:
self._rgbw_color = None
self._attr_rgbw_color = None
if self._rgbww_template is None:
self._rgbww_color = None
self._attr_rgbww_color = None
optimistic_set = True
if self._rgbw_template is None and ATTR_RGBW_COLOR in kwargs:
@@ -470,16 +417,16 @@ class AbstractTemplateLight(AbstractTemplateEntity, LightEntity):
"Optimistically setting rgbw color to %s",
kwargs[ATTR_RGBW_COLOR],
)
self._color_mode = ColorMode.RGBW
self._rgbw_color = kwargs[ATTR_RGBW_COLOR]
self._attr_color_mode = ColorMode.RGBW
self._attr_rgbw_color = kwargs[ATTR_RGBW_COLOR]
if self._temperature_template is None:
self._temperature = None
self._attr_color_temp_kelvin = None
if self._hs_template is None:
self._hs_color = None
self._attr_hs_color = None
if self._rgb_template is None:
self._rgb_color = None
self._attr_rgb_color = None
if self._rgbww_template is None:
self._rgbww_color = None
self._attr_rgbww_color = None
optimistic_set = True
if self._rgbww_template is None and ATTR_RGBWW_COLOR in kwargs:
@@ -487,16 +434,16 @@ class AbstractTemplateLight(AbstractTemplateEntity, LightEntity):
"Optimistically setting rgbww color to %s",
kwargs[ATTR_RGBWW_COLOR],
)
self._color_mode = ColorMode.RGBWW
self._rgbww_color = kwargs[ATTR_RGBWW_COLOR]
self._attr_color_mode = ColorMode.RGBWW
self._attr_rgbww_color = kwargs[ATTR_RGBWW_COLOR]
if self._temperature_template is None:
self._temperature = None
self._attr_color_temp_kelvin = None
if self._hs_template is None:
self._hs_color = None
self._attr_hs_color = None
if self._rgb_template is None:
self._rgb_color = None
self._attr_rgb_color = None
if self._rgbw_template is None:
self._rgbw_color = None
self._attr_rgbw_color = None
optimistic_set = True
return optimistic_set
@@ -517,8 +464,8 @@ class AbstractTemplateLight(AbstractTemplateEntity, LightEntity):
):
kelvin = kwargs[ATTR_COLOR_TEMP_KELVIN]
common_params[ATTR_COLOR_TEMP_KELVIN] = kelvin
common_params["color_temp"] = color_util.color_temperature_kelvin_to_mired(
kelvin
common_params[ATTR_COLOR_TEMP] = (
color_util.color_temperature_kelvin_to_mired(kelvin)
)
return (script, common_params)
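The legacy `color_temp` parameter kept alongside `ATTR_COLOR_TEMP_KELVIN` above is in mireds, and the two scales are reciprocal (mired = 1,000,000 / K). A minimal sketch of the conversion — `homeassistant.util.color`'s helpers may round slightly differently:

```python
def kelvin_to_mired(kelvin: float) -> int:
    """Color temperature in Kelvin -> mireds (micro reciprocal degrees)."""
    return round(1_000_000 / kelvin)


def mired_to_kelvin(mired: float) -> int:
    """Color temperature in mireds -> Kelvin (same reciprocal relation)."""
    return round(1_000_000 / mired)
```

Note the reciprocal flips ordering, which is why the diff maps `_attr_max_mireds` to `_attr_min_color_temp_kelvin` and vice versa: 500 mireds is a warm 2000 K, while 153 mireds is a cool ~6500 K.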
@@ -527,14 +474,17 @@ class AbstractTemplateLight(AbstractTemplateEntity, LightEntity):
ATTR_EFFECT in kwargs
and (script := CONF_EFFECT_ACTION) in self._action_scripts
):
assert self._effect_list is not None
assert self._attr_effect_list is not None
effect = kwargs[ATTR_EFFECT]
if self._effect_list is not None and effect not in self._effect_list:
if (
self._attr_effect_list is not None
and effect not in self._attr_effect_list
):
_LOGGER.error(
"Received invalid effect: %s for entity %s. Expected one of: %s",
effect,
self.entity_id,
self._effect_list,
self._attr_effect_list,
)
common_params["effect"] = effect
@@ -614,28 +564,28 @@ class AbstractTemplateLight(AbstractTemplateEntity, LightEntity):
"""Update the brightness from the template."""
try:
if brightness in (None, "None", ""):
self._brightness = None
self._attr_brightness = None
return
if 0 <= int(brightness) <= 255:
self._brightness = int(brightness)
self._attr_brightness = int(brightness)
else:
_LOGGER.error(
"Received invalid brightness : %s for entity %s. Expected: 0-255",
brightness,
self.entity_id,
)
self._brightness = None
self._attr_brightness = None
except ValueError:
_LOGGER.exception(
"Template must supply an integer brightness from 0-255, or 'None'"
)
self._brightness = None
self._attr_brightness = None
@callback
def _update_effect_list(self, effect_list):
"""Update the effect list from the template."""
if effect_list in (None, "None", ""):
self._effect_list = None
self._attr_effect_list = None
return
if not isinstance(effect_list, list):
@@ -647,46 +597,56 @@ class AbstractTemplateLight(AbstractTemplateEntity, LightEntity):
effect_list,
self.entity_id,
)
self._effect_list = None
self._attr_effect_list = None
return
if len(effect_list) == 0:
self._effect_list = None
self._attr_effect_list = None
return
self._effect_list = effect_list
self._attr_effect_list = effect_list
@callback
def _update_effect(self, effect):
"""Update the effect from the template."""
if effect in (None, "None", ""):
self._effect = None
self._attr_effect = None
return
if effect not in self._effect_list:
if effect not in self._attr_effect_list:
_LOGGER.error(
"Received invalid effect: %s for entity %s. Expected one of: %s",
effect,
self.entity_id,
self._effect_list,
self._attr_effect_list,
)
self._effect = None
self._attr_effect = None
return
self._effect = effect
self._attr_effect = effect
@callback
def _update_temperature(self, render):
"""Update the temperature from the template."""
try:
if render in (None, "None", ""):
self._temperature = None
self._attr_color_temp_kelvin = None
return
# Support legacy mireds in template light.
temperature = int(render)
min_mireds = self._min_mireds or DEFAULT_MIN_MIREDS
max_mireds = self._max_mireds or DEFAULT_MAX_MIREDS
if (min_kelvin := self._attr_min_color_temp_kelvin) is not None:
min_mireds = color_util.color_temperature_kelvin_to_mired(min_kelvin)
else:
min_mireds = DEFAULT_MIN_MIREDS
if (max_kelvin := self._attr_max_color_temp_kelvin) is not None:
max_mireds = color_util.color_temperature_kelvin_to_mired(max_kelvin)
else:
max_mireds = DEFAULT_MAX_MIREDS
if min_mireds <= temperature <= max_mireds:
self._temperature = temperature
self._attr_color_temp_kelvin = (
color_util.color_temperature_mired_to_kelvin(temperature)
)
else:
_LOGGER.error(
(
@@ -698,26 +658,26 @@ class AbstractTemplateLight(AbstractTemplateEntity, LightEntity):
min_mireds,
max_mireds,
)
self._temperature = None
self._attr_color_temp_kelvin = None
except ValueError:
_LOGGER.exception(
"Template must supply an integer temperature within the range for"
" this light, or 'None'"
)
self._temperature = None
self._color_mode = ColorMode.COLOR_TEMP
self._attr_color_temp_kelvin = None
self._attr_color_mode = ColorMode.COLOR_TEMP
@callback
def _update_hs(self, render):
"""Update the color from the template."""
if render is None:
self._hs_color = None
self._attr_hs_color = None
return
h_str = s_str = None
if isinstance(render, str):
if render in ("None", ""):
self._hs_color = None
self._attr_hs_color = None
return
h_str, s_str = map(
float, render.replace("(", "").replace(")", "").split(",", 1)
@@ -733,7 +693,7 @@ class AbstractTemplateLight(AbstractTemplateEntity, LightEntity):
and 0 <= h_str <= 360
and 0 <= s_str <= 100
):
self._hs_color = (h_str, s_str)
self._attr_hs_color = (h_str, s_str)
elif h_str is not None and s_str is not None:
_LOGGER.error(
(
@@ -744,25 +704,25 @@ class AbstractTemplateLight(AbstractTemplateEntity, LightEntity):
s_str,
self.entity_id,
)
self._hs_color = None
self._attr_hs_color = None
else:
_LOGGER.error(
"Received invalid hs_color : (%s) for entity %s", render, self.entity_id
)
self._hs_color = None
self._color_mode = ColorMode.HS
self._attr_hs_color = None
self._attr_color_mode = ColorMode.HS
@callback
def _update_rgb(self, render):
"""Update the color from the template."""
if render is None:
self._rgb_color = None
self._attr_rgb_color = None
return
r_int = g_int = b_int = None
if isinstance(render, str):
if render in ("None", ""):
self._rgb_color = None
self._attr_rgb_color = None
return
cleanup_char = ["(", ")", "[", "]", " "]
for char in cleanup_char:
@@ -775,7 +735,7 @@ class AbstractTemplateLight(AbstractTemplateEntity, LightEntity):
value is not None and isinstance(value, (int, float)) and 0 <= value <= 255
for value in (r_int, g_int, b_int)
):
self._rgb_color = (r_int, g_int, b_int)
self._attr_rgb_color = (r_int, g_int, b_int)
elif any(
isinstance(value, (int, float)) and not 0 <= value <= 255
for value in (r_int, g_int, b_int)
@@ -787,27 +747,27 @@ class AbstractTemplateLight(AbstractTemplateEntity, LightEntity):
b_int,
self.entity_id,
)
self._rgb_color = None
self._attr_rgb_color = None
else:
_LOGGER.error(
"Received invalid rgb_color : (%s) for entity %s",
render,
self.entity_id,
)
self._rgb_color = None
self._color_mode = ColorMode.RGB
self._attr_rgb_color = None
self._attr_color_mode = ColorMode.RGB
@callback
def _update_rgbw(self, render):
"""Update the color from the template."""
if render is None:
self._rgbw_color = None
self._attr_rgbw_color = None
return
r_int = g_int = b_int = w_int = None
if isinstance(render, str):
if render in ("None", ""):
self._rgb_color = None
self._attr_rgb_color = None
return
cleanup_char = ["(", ")", "[", "]", " "]
for char in cleanup_char:
@@ -820,7 +780,7 @@ class AbstractTemplateLight(AbstractTemplateEntity, LightEntity):
value is not None and isinstance(value, (int, float)) and 0 <= value <= 255
for value in (r_int, g_int, b_int, w_int)
):
self._rgbw_color = (r_int, g_int, b_int, w_int)
self._attr_rgbw_color = (r_int, g_int, b_int, w_int)
elif any(
isinstance(value, (int, float)) and not 0 <= value <= 255
for value in (r_int, g_int, b_int, w_int)
@@ -833,27 +793,27 @@ class AbstractTemplateLight(AbstractTemplateEntity, LightEntity):
w_int,
self.entity_id,
)
self._rgbw_color = None
self._attr_rgbw_color = None
else:
_LOGGER.error(
"Received invalid rgb_color : (%s) for entity %s",
render,
self.entity_id,
)
self._rgbw_color = None
self._color_mode = ColorMode.RGBW
self._attr_rgbw_color = None
self._attr_color_mode = ColorMode.RGBW
@callback
def _update_rgbww(self, render):
"""Update the color from the template."""
if render is None:
self._rgbww_color = None
self._attr_rgbww_color = None
return
r_int = g_int = b_int = cw_int = ww_int = None
if isinstance(render, str):
if render in ("None", ""):
self._rgb_color = None
self._attr_rgb_color = None
return
cleanup_char = ["(", ")", "[", "]", " "]
for char in cleanup_char:
@@ -866,7 +826,7 @@ class AbstractTemplateLight(AbstractTemplateEntity, LightEntity):
value is not None and isinstance(value, (int, float)) and 0 <= value <= 255
for value in (r_int, g_int, b_int, cw_int, ww_int)
):
self._rgbww_color = (r_int, g_int, b_int, cw_int, ww_int)
self._attr_rgbww_color = (r_int, g_int, b_int, cw_int, ww_int)
elif any(
isinstance(value, (int, float)) and not 0 <= value <= 255
for value in (r_int, g_int, b_int, cw_int, ww_int)
@@ -880,15 +840,15 @@ class AbstractTemplateLight(AbstractTemplateEntity, LightEntity):
ww_int,
self.entity_id,
)
self._rgbww_color = None
self._attr_rgbww_color = None
else:
_LOGGER.error(
"Received invalid rgb_color : (%s) for entity %s",
render,
self.entity_id,
)
self._rgbww_color = None
self._color_mode = ColorMode.RGBWW
self._attr_rgbww_color = None
self._attr_color_mode = ColorMode.RGBWW
@callback
def _update_max_mireds(self, render):
@@ -896,30 +856,42 @@ class AbstractTemplateLight(AbstractTemplateEntity, LightEntity):
try:
if render in (None, "None", ""):
self._max_mireds = None
self._attr_max_mireds = DEFAULT_MAX_MIREDS
self._attr_max_color_temp_kelvin = None
return
self._max_mireds = int(render)
self._attr_max_mireds = max_mireds = int(render)
self._attr_max_color_temp_kelvin = (
color_util.color_temperature_mired_to_kelvin(max_mireds)
)
except ValueError:
_LOGGER.exception(
"Template must supply an integer temperature within the range for"
" this light, or 'None'"
)
self._max_mireds = None
self._attr_max_mireds = DEFAULT_MAX_MIREDS
self._attr_max_color_temp_kelvin = None
@callback
def _update_min_mireds(self, render):
"""Update the min mireds from the template."""
try:
if render in (None, "None", ""):
self._min_mireds = None
self._attr_min_mireds = DEFAULT_MIN_MIREDS
self._attr_min_color_temp_kelvin = None
return
self._min_mireds = int(render)
self._attr_min_mireds = min_mireds = int(render)
self._attr_min_color_temp_kelvin = (
color_util.color_temperature_mired_to_kelvin(min_mireds)
)
except ValueError:
_LOGGER.exception(
"Template must supply an integer temperature within the range for"
" this light, or 'None'"
)
self._min_mireds = None
self._attr_min_mireds = DEFAULT_MIN_MIREDS
self._attr_min_color_temp_kelvin = None
@callback
def _update_supports_transition(self, render):
@@ -951,34 +923,18 @@ class StateLightEntity(TemplateEntity, AbstractTemplateLight):
if TYPE_CHECKING:
assert name is not None
color_modes = {ColorMode.ONOFF}
for action_id, action_config, color_mode in self._iterate_scripts(config):
self.add_script(action_id, action_config, name, DOMAIN)
if color_mode:
color_modes.add(color_mode)
self._supported_color_modes = filter_supported_color_modes(color_modes)
if len(self._supported_color_modes) > 1:
self._color_mode = ColorMode.UNKNOWN
if len(self._supported_color_modes) == 1:
self._color_mode = next(iter(self._supported_color_modes))
self._attr_supported_features = LightEntityFeature(0)
if self._action_scripts.get(CONF_EFFECT_ACTION):
self._attr_supported_features |= LightEntityFeature.EFFECT
if self._supports_transition is True:
self._attr_supported_features |= LightEntityFeature.TRANSITION
self._setup_light_features(config, name)
@callback
def _async_setup_templates(self) -> None:
"""Set up templates."""
if self._template:
self.add_template_attribute(
"_state", self._template, None, self._update_state
"_attr_is_on", self._template, None, self._update_state
)
if self._level_template:
self.add_template_attribute(
"_brightness",
"_attr_brightness",
self._level_template,
None,
self._update_brightness,
@@ -986,7 +942,7 @@ class StateLightEntity(TemplateEntity, AbstractTemplateLight):
)
if self._max_mireds_template:
self.add_template_attribute(
"_max_mireds_template",
"_attr_max_color_temp_kelvin",
self._max_mireds_template,
None,
self._update_max_mireds,
@@ -994,7 +950,7 @@ class StateLightEntity(TemplateEntity, AbstractTemplateLight):
)
if self._min_mireds_template:
self.add_template_attribute(
"_min_mireds_template",
"_attr_min_color_temp_kelvin",
self._min_mireds_template,
None,
self._update_min_mireds,
@@ -1002,7 +958,7 @@ class StateLightEntity(TemplateEntity, AbstractTemplateLight):
)
if self._temperature_template:
self.add_template_attribute(
"_temperature",
"_attr_color_temp_kelvin",
self._temperature_template,
None,
self._update_temperature,
@@ -1010,7 +966,7 @@ class StateLightEntity(TemplateEntity, AbstractTemplateLight):
)
if self._hs_template:
self.add_template_attribute(
"_hs_color",
"_attr_hs_color",
self._hs_template,
None,
self._update_hs,
@@ -1018,7 +974,7 @@ class StateLightEntity(TemplateEntity, AbstractTemplateLight):
)
if self._rgb_template:
self.add_template_attribute(
"_rgb_color",
"_attr_rgb_color",
self._rgb_template,
None,
self._update_rgb,
@@ -1026,7 +982,7 @@ class StateLightEntity(TemplateEntity, AbstractTemplateLight):
)
if self._rgbw_template:
self.add_template_attribute(
"_rgbw_color",
"_attr_rgbw_color",
self._rgbw_template,
None,
self._update_rgbw,
@@ -1034,7 +990,7 @@ class StateLightEntity(TemplateEntity, AbstractTemplateLight):
)
if self._rgbww_template:
self.add_template_attribute(
"_rgbww_color",
"_attr_rgbww_color",
self._rgbww_template,
None,
self._update_rgbww,
@@ -1042,7 +998,7 @@ class StateLightEntity(TemplateEntity, AbstractTemplateLight):
)
if self._effect_list_template:
self.add_template_attribute(
"_effect_list",
"_attr_effect_list",
self._effect_list_template,
None,
self._update_effect_list,
@@ -1050,7 +1006,7 @@ class StateLightEntity(TemplateEntity, AbstractTemplateLight):
)
if self._effect_template:
self.add_template_attribute(
"_effect",
"_attr_effect",
self._effect_template,
None,
self._update_effect,
@@ -1071,18 +1027,18 @@ class StateLightEntity(TemplateEntity, AbstractTemplateLight):
"""Update the state from the template."""
if isinstance(result, TemplateError):
# This behavior is legacy
self._state = False
self._attr_is_on = False
if not self._availability_template:
self._attr_available = True
return
if isinstance(result, bool):
self._state = result
self._attr_is_on = result
return
state = str(result).lower()
if state in _VALID_STATES:
self._state = state in ("true", STATE_ON)
self._attr_is_on = state in ("true", STATE_ON)
return
_LOGGER.error(
@@ -1091,7 +1047,7 @@ class StateLightEntity(TemplateEntity, AbstractTemplateLight):
self.entity_id,
", ".join(_VALID_STATES),
)
self._state = None
self._attr_is_on = None
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn the light on."""
@@ -1118,7 +1074,7 @@ class StateLightEntity(TemplateEntity, AbstractTemplateLight):
else:
await self.async_run_script(off_script, context=self._context)
if self._attr_assumed_state:
self._state = False
self._attr_is_on = False
self.async_write_ha_state()
@@ -1165,23 +1121,7 @@ class TriggerLightEntity(TriggerEntity, AbstractTemplateLight):
self._to_render_complex.append(key)
self._parse_result.add(key)
color_modes = {ColorMode.ONOFF}
for action_id, action_config, color_mode in self._iterate_scripts(config):
self.add_script(action_id, action_config, name, DOMAIN)
if color_mode:
color_modes.add(color_mode)
self._supported_color_modes = filter_supported_color_modes(color_modes)
if len(self._supported_color_modes) > 1:
self._color_mode = ColorMode.UNKNOWN
if len(self._supported_color_modes) == 1:
self._color_mode = next(iter(self._supported_color_modes))
self._attr_supported_features = LightEntityFeature(0)
if self._action_scripts.get(CONF_EFFECT_ACTION):
self._attr_supported_features |= LightEntityFeature.EFFECT
if self._supports_transition is True:
self._attr_supported_features |= LightEntityFeature.TRANSITION
self._setup_light_features(config, name)
@callback
def _handle_coordinator_update(self) -> None:
@@ -1215,7 +1155,7 @@ class TriggerLightEntity(TriggerEntity, AbstractTemplateLight):
if not self._optimistic:
raw = self._rendered.get(CONF_STATE)
self._state = template.result_as_boolean(raw)
self._attr_is_on = template.result_as_boolean(raw)
write_ha_state = True
elif self._optimistic and len(self._rendered) > 0:
@@ -1229,12 +1169,12 @@ class TriggerLightEntity(TriggerEntity, AbstractTemplateLight):
"""Turn the light on."""
optimistic_set = self.set_optimistic_attributes(**kwargs)
script_id, script_params = self.get_registered_script(**kwargs)
if self._template and self._state is None:
if self._template and self._attr_is_on is None:
# Ensure an optimistic state is set on the entity when turn_on
# is called and the main state hasn't rendered. This will only
# occur when the state is unknown, the template hasn't triggered,
# and turn_on is called.
self._state = True
self._attr_is_on = True
await self.async_run_script(
self._action_scripts[script_id],
@@ -1257,5 +1197,5 @@ class TriggerLightEntity(TriggerEntity, AbstractTemplateLight):
else:
await self.async_run_script(off_script, context=self._context)
if self._attr_assumed_state:
self._state = False
self._attr_is_on = False
self.async_write_ha_state()
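The `_update_state` handler above maps a rendered template result onto the new `_attr_is_on` shorthand attribute. A minimal standalone sketch of that mapping (with the membership of `_VALID_STATES` assumed to be the on/off/true/false set):

```python
STATE_ON = "on"
STATE_OFF = "off"
_VALID_STATES = (STATE_ON, STATE_OFF, "true", "false")  # assumed membership

def parse_rendered_state(result):
    """Map a rendered template result to True/False/None, as _update_state does."""
    if isinstance(result, bool):
        return result
    state = str(result).lower()
    if state in _VALID_STATES:
        return state in ("true", STATE_ON)
    return None  # unrecognized states leave is_on unknown
```

The `TemplateError` branch in the real handler additionally forces `False` for legacy compatibility before this parsing is reached.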

View File

@@ -527,6 +527,10 @@
"deprecated_battery_level": {
"description": "The template vacuum options `battery_level` and `battery_level_template` are being removed in 2026.8.\n\nPlease remove the `battery_level` or `battery_level_template` option from the YAML configuration for {entity_id} ({entity_name}).",
"title": "Deprecated battery level option in {entity_name}"
},
"deprecated_legacy_templates": {
"description": "The legacy `platform: template` syntax for `{domain}` is being removed. Please migrate `{breadcrumb}` to the modern template syntax.\n\n1. Remove existing template definition.\n2. Add new template definition:\n{config}\n3. Restart Home Assistant or reload template entities.",
"title": "Legacy {domain} template deprecation"
}
},
"options": {

View File

@@ -15,7 +15,7 @@ from uiprotect.exceptions import BadRequest, ClientError, NotAuthorized
# diagnostics module will not be imported in the executor.
from uiprotect.test_util.anonymize import anonymize_data # noqa: F401
from homeassistant.config_entries import ConfigEntry
from homeassistant.config_entries import ConfigEntry, ConfigEntryState
from homeassistant.const import CONF_API_KEY, EVENT_HOMEASSISTANT_STOP
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import (
@@ -172,6 +172,24 @@ async def async_unload_entry(hass: HomeAssistant, entry: UFPConfigEntry) -> bool
return unload_ok
async def async_remove_entry(hass: HomeAssistant, entry: UFPConfigEntry) -> None:
"""Handle removal of a config entry."""
# Clear the stored session credentials when the integration is removed
if entry.state is ConfigEntryState.LOADED:
# Integration is loaded, use the existing API client
try:
await entry.runtime_data.api.clear_session()
except Exception as err: # noqa: BLE001
_LOGGER.warning("Failed to clear session credentials: %s", err)
else:
# Integration is not loaded, create temporary client to clear session
protect = async_create_api_client(hass, entry)
try:
await protect.clear_session()
except Exception as err: # noqa: BLE001
_LOGGER.warning("Failed to clear session credentials: %s", err)
async def async_remove_config_entry_device(
hass: HomeAssistant, config_entry: UFPConfigEntry, device_entry: dr.DeviceEntry
) -> bool:

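The new `async_remove_entry` hook follows a common cleanup pattern: reuse the live API client when the entry is loaded, otherwise create a temporary one, and never let a cleanup failure block removal. A hedged sketch with stand-in names (`FakeApi` and `remove_entry` are illustrative, not the uiprotect API):

```python
import asyncio
import logging

_LOGGER = logging.getLogger(__name__)

class FakeApi:
    """Illustrative stand-in for the protect API client."""

    def __init__(self, should_fail: bool = False) -> None:
        self.should_fail = should_fail
        self.cleared = False

    async def clear_session(self) -> None:
        if self.should_fail:
            raise RuntimeError("backend unreachable")
        self.cleared = True

async def remove_entry(loaded: bool, runtime_api: FakeApi, temp_api: FakeApi) -> None:
    """Clear stored session credentials, preferring the already-loaded client."""
    api = runtime_api if loaded else temp_api  # reuse live client when possible
    try:
        await api.clear_session()
    except Exception as err:  # cleanup is best-effort; removal must proceed
        _LOGGER.warning("Failed to clear session credentials: %s", err)
```

Swallowing the exception mirrors the `# noqa: BLE001` in the diff: a broken backend should not prevent the config entry from being removed.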
View File

@@ -40,7 +40,7 @@
"integration_type": "hub",
"iot_class": "local_push",
"loggers": ["uiprotect", "unifi_discovery"],
"requirements": ["uiprotect==7.28.0", "unifi-discovery==1.2.0"],
"requirements": ["uiprotect==7.29.0", "unifi-discovery==1.2.0"],
"ssdp": [
{
"manufacturer": "Ubiquiti Networks",

View File

@@ -4,7 +4,6 @@ from __future__ import annotations
import asyncio
from collections.abc import Callable, Coroutine, Sequence
import dataclasses
from datetime import datetime, timedelta
from functools import partial
import logging
@@ -45,7 +44,7 @@ from .utils import (
usb_device_from_path, # noqa: F401
usb_device_from_port, # noqa: F401
usb_device_matches_matcher,
usb_service_info_from_device, # noqa: F401
usb_service_info_from_device,
usb_unique_id_from_service_info, # noqa: F401
)
@@ -59,7 +58,6 @@ ADD_REMOVE_SCAN_COOLDOWN = 5 # 5 second cooldown to give devices a chance to re
__all__ = [
"USBCallbackMatcher",
"async_is_plugged_in",
"async_register_port_event_callback",
"async_register_scan_request_callback",
]
@@ -101,51 +99,6 @@ def async_register_port_event_callback(
return discovery.async_register_port_event_callback(callback)
@hass_callback
def async_is_plugged_in(hass: HomeAssistant, matcher: USBCallbackMatcher) -> bool:
"""Return True if a USB device is present."""
vid = matcher.get("vid", "")
pid = matcher.get("pid", "")
serial_number = matcher.get("serial_number", "")
manufacturer = matcher.get("manufacturer", "")
description = matcher.get("description", "")
if (
vid != vid.upper()
or pid != pid.upper()
or serial_number != serial_number.lower()
or manufacturer != manufacturer.lower()
or description != description.lower()
):
raise ValueError(
f"vid and pid must be uppercase, the rest lowercase in matcher {matcher!r}"
)
usb_discovery: USBDiscovery = hass.data[DOMAIN]
return any(
usb_device_matches_matcher(
USBDevice(
device=device,
vid=vid,
pid=pid,
serial_number=serial_number,
manufacturer=manufacturer,
description=description,
),
matcher,
)
for (
device,
vid,
pid,
serial_number,
manufacturer,
description,
) in usb_discovery.seen
)
@hass_callback
def async_get_usb_matchers_for_device(
hass: HomeAssistant, device: USBDevice
@@ -244,7 +197,6 @@ class USBDiscovery:
"""Init USB Discovery."""
self.hass = hass
self.usb = usb
self.seen: set[tuple[str, ...]] = set()
self.observer_active = False
self._request_debouncer: Debouncer[Coroutine[Any, Any, None]] | None = None
self._add_remove_debouncer: Debouncer[Coroutine[Any, Any, None]] | None = None
@@ -393,30 +345,13 @@ class USBDiscovery:
async def _async_process_discovered_usb_device(self, device: USBDevice) -> None:
"""Process a USB discovery."""
_LOGGER.debug("Discovered USB Device: %s", device)
device_tuple = dataclasses.astuple(device)
if device_tuple in self.seen:
return
self.seen.add(device_tuple)
matched = self.async_get_usb_matchers_for_device(device)
if not matched:
return
service_info: _UsbServiceInfo | None = None
service_info = usb_service_info_from_device(device)
for matcher in matched:
if service_info is None:
service_info = _UsbServiceInfo(
device=await self.hass.async_add_executor_job(
get_serial_by_id, device.device
),
vid=device.vid,
pid=device.pid,
serial_number=device.serial_number,
manufacturer=device.manufacturer,
description=device.description,
)
discovery_flow.async_create_flow(
self.hass,
matcher["domain"],
@@ -424,6 +359,26 @@ class USBDiscovery:
service_info,
)
async def _async_process_removed_usb_device(self, device: USBDevice) -> None:
"""Process a USB removal."""
_LOGGER.debug("Removed USB Device: %s", device)
matched = self.async_get_usb_matchers_for_device(device)
if not matched:
return
service_info = usb_service_info_from_device(device)
for matcher in matched:
for flow in self.hass.config_entries.flow.async_progress_by_init_data_type(
_UsbServiceInfo,
lambda flow_service_info: flow_service_info == service_info,
):
if matcher["domain"] != flow["handler"]:
continue
_LOGGER.debug("Aborting existing flow %s", flow["flow_id"])
self.hass.config_entries.flow.async_abort(flow["flow_id"])
async def _async_process_ports(self, usb_devices: Sequence[USBDevice]) -> None:
"""Process each discovered port."""
_LOGGER.debug("USB devices: %r", usb_devices)
@@ -464,7 +419,10 @@ class USBDiscovery:
except Exception:
_LOGGER.exception("Error in USB port event callback")
for usb_device in filtered_usb_devices:
for usb_device in removed_devices:
await self._async_process_removed_usb_device(usb_device)
for usb_device in added_devices:
await self._async_process_discovered_usb_device(usb_device)
@hass_callback

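The removed `async_is_plugged_in` helper leaned on `usb_device_matches_matcher`. Its matching semantics can be sketched roughly as exact vid/pid comparison plus case-insensitive glob patterns for the text fields; note this is an assumption-laden approximation for illustration, not the actual helper:

```python
from dataclasses import dataclass
from fnmatch import fnmatch

@dataclass
class Device:  # illustrative stand-in for USBDevice
    vid: str
    pid: str
    serial_number: str
    manufacturer: str
    description: str

def device_matches(matcher: dict, device: Device) -> bool:
    """Return True when the device satisfies every key present in the matcher."""
    if "vid" in matcher and device.vid != matcher["vid"]:
        return False
    if "pid" in matcher and device.pid != matcher["pid"]:
        return False
    for field in ("serial_number", "manufacturer", "description"):
        # free-text fields: lowercase glob match (assumed behavior)
        if field in matcher and not fnmatch(getattr(device, field).lower(), matcher[field]):
            return False
    return True
```

This also explains the deleted case-convention check: matchers keep vid/pid uppercase and the glob fields lowercase so comparisons stay consistent.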
View File

@@ -76,6 +76,7 @@ def setup_vicare_api(hass: HomeAssistant, entry: ViCareConfigEntry) -> PyViCare:
devices = [
ViCareDevice(config=device_config, api=get_device(entry, device_config))
for device_config in device_config_list
if bool(device_config.isOnline())
]
return ViCareData(client=client, devices=devices)

View File

@@ -0,0 +1,228 @@
"""Automation related helper methods for the Websocket API."""
from __future__ import annotations
from dataclasses import dataclass
import logging
from typing import Any, Self
from homeassistant.const import CONF_TARGET
from homeassistant.core import HomeAssistant
from homeassistant.helpers import target as target_helpers
from homeassistant.helpers.condition import (
async_get_all_descriptions as async_get_all_condition_descriptions,
)
from homeassistant.helpers.entity import (
entity_sources,
get_device_class,
get_supported_features,
)
from homeassistant.helpers.service import (
async_get_all_descriptions as async_get_all_service_descriptions,
)
from homeassistant.helpers.trigger import (
async_get_all_descriptions as async_get_all_trigger_descriptions,
)
from homeassistant.helpers.typing import ConfigType
_LOGGER = logging.getLogger(__name__)
@dataclass(slots=True, kw_only=True)
class _EntityFilter:
"""Single entity filter configuration."""
integration: str | None
domains: set[str]
device_classes: set[str]
supported_features: set[int]
def matches(
self, hass: HomeAssistant, entity_id: str, domain: str, integration: str
) -> bool:
"""Return if entity matches all criteria in this filter."""
if self.integration and integration != self.integration:
return False
if self.domains and domain not in self.domains:
return False
if self.device_classes:
if (
entity_device_class := get_device_class(hass, entity_id)
) is None or entity_device_class not in self.device_classes:
return False
if self.supported_features:
entity_supported_features = get_supported_features(hass, entity_id)
if not any(
feature & entity_supported_features == feature
for feature in self.supported_features
):
return False
return True
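The `supported_features` check above uses the standard bitmask-subset test: a required feature matches only when all of its bits are set on the entity. Illustrated with a hypothetical `IntFlag`:

```python
from enum import IntFlag

class Feature(IntFlag):  # hypothetical feature flags, for illustration only
    OPEN = 1
    CLOSE = 2
    SET_POSITION = 4

def has_feature(entity_features: int, required: int) -> bool:
    """True when every bit of `required` is present in `entity_features`."""
    return entity_features & required == required

entity = Feature.OPEN | Feature.CLOSE
assert has_feature(entity, Feature.OPEN)
assert has_feature(entity, Feature.OPEN | Feature.CLOSE)
assert not has_feature(entity, Feature.SET_POSITION)
```

In `_EntityFilter.matches` the test is wrapped in `any(...)`, so an entity passes when it satisfies at least one of the listed feature masks.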
@dataclass(slots=True, kw_only=True)
class _AutomationComponentLookupData:
"""Helper class for looking up automation components."""
component: str
filters: list[_EntityFilter]
@classmethod
def create(cls, component: str, target_description: dict[str, Any]) -> Self:
"""Build automation component lookup data from target description."""
filters: list[_EntityFilter] = []
entity_filters_config = target_description.get("entity", [])
for entity_filter_config in entity_filters_config:
entity_filter = _EntityFilter(
integration=entity_filter_config.get("integration"),
domains=set(entity_filter_config.get("domain", [])),
device_classes=set(entity_filter_config.get("device_class", [])),
supported_features=set(
entity_filter_config.get("supported_features", [])
),
)
filters.append(entity_filter)
return cls(component=component, filters=filters)
def matches(
self, hass: HomeAssistant, entity_id: str, domain: str, integration: str
) -> bool:
"""Return if entity matches ANY of the filters."""
if not self.filters:
return True
return any(
f.matches(hass, entity_id, domain, integration) for f in self.filters
)
def _get_automation_component_domains(
target_description: dict[str, Any],
) -> set[str | None]:
"""Get a list of domains (including integration domains) of an automation component.
The list of domains is extracted from each target's entity filters.
If a filter is missing both domain and integration keys, None is added to the
returned set.
"""
entity_filters_config = target_description.get("entity", [])
if not entity_filters_config:
return {None}
domains: set[str | None] = set()
for entity_filter_config in entity_filters_config:
filter_integration = entity_filter_config.get("integration")
filter_domains = entity_filter_config.get("domain", [])
if not filter_domains and not filter_integration:
domains.add(None)
continue
if filter_integration:
domains.add(filter_integration)
for domain in filter_domains:
domains.add(domain)
return domains
def _async_get_automation_components_for_target(
hass: HomeAssistant,
target_selection: ConfigType,
expand_group: bool,
component_descriptions: dict[str, dict[str, Any] | None],
) -> set[str]:
"""Get automation components (triggers/conditions/services) for a target.
Returns all components that can be used with any entity that is currently part of the target.
"""
extracted = target_helpers.async_extract_referenced_entity_ids(
hass,
target_helpers.TargetSelectorData(target_selection),
expand_group=expand_group,
)
_LOGGER.debug("Extracted entities for lookup: %s", extracted)
# Build lookup structure: domain -> list of trigger/condition/service lookup data
domain_components: dict[str | None, list[_AutomationComponentLookupData]] = {}
component_count = 0
for component, description in component_descriptions.items():
if description is None or CONF_TARGET not in description:
_LOGGER.debug("Skipping component %s without target description", component)
continue
domains = _get_automation_component_domains(description[CONF_TARGET])
lookup_data = _AutomationComponentLookupData.create(
component, description[CONF_TARGET]
)
for domain in domains:
domain_components.setdefault(domain, []).append(lookup_data)
component_count += 1
_LOGGER.debug("Automation components per domain: %s", domain_components)
entity_infos = entity_sources(hass)
matched_components: set[str] = set()
for entity_id in extracted.referenced | extracted.indirectly_referenced:
if component_count == len(matched_components):
# All automation components matched already, so we don't need to iterate further
break
entity_info = entity_infos.get(entity_id)
if entity_info is None:
_LOGGER.debug("No entity source found for %s", entity_id)
continue
entity_domain = entity_id.split(".")[0]
entity_integration = entity_info["domain"]
for domain in (entity_domain, entity_integration, None):
for component_data in domain_components.get(domain, []):
if component_data.component in matched_components:
continue
if component_data.matches(
hass, entity_id, entity_domain, entity_integration
):
matched_components.add(component_data.component)
return matched_components
async def async_get_triggers_for_target(
hass: HomeAssistant, target_selector: ConfigType, expand_group: bool
) -> set[str]:
"""Get triggers for a target."""
descriptions = await async_get_all_trigger_descriptions(hass)
return _async_get_automation_components_for_target(
hass, target_selector, expand_group, descriptions
)
async def async_get_conditions_for_target(
hass: HomeAssistant, target_selector: ConfigType, expand_group: bool
) -> set[str]:
"""Get conditions for a target."""
descriptions = await async_get_all_condition_descriptions(hass)
return _async_get_automation_components_for_target(
hass, target_selector, expand_group, descriptions
)
async def async_get_services_for_target(
hass: HomeAssistant, target_selector: ConfigType, expand_group: bool
) -> set[str]:
"""Get services for a target."""
descriptions = await async_get_all_service_descriptions(hass)
# Flatten dicts to be keyed by domain.name to match trigger/condition format
descriptions_flatten = {
f"{domain}.{service_name}": desc
for domain, services in descriptions.items()
for service_name, desc in services.items()
}
return _async_get_automation_components_for_target(
hass, target_selector, expand_group, descriptions_flatten
)
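The flattening step above is a plain nested-dict comprehension; extracted on its own it behaves like this:

```python
def flatten_service_descriptions(descriptions: dict) -> dict:
    """Flatten {domain: {service: desc}} into {"domain.service": desc}."""
    return {
        f"{domain}.{service_name}": desc
        for domain, services in descriptions.items()
        for service_name, desc in services.items()
    }

nested = {"light": {"turn_on": {"fields": {}}, "turn_off": {"fields": {}}}}
flat = flatten_service_descriptions(nested)
assert set(flat) == {"light.turn_on", "light.turn_off"}
```

Flattening lets services reuse `_async_get_automation_components_for_target`, which expects the same single-level keying as trigger and condition descriptions.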

View File

@@ -87,6 +87,11 @@ from homeassistant.setup import (
from homeassistant.util.json import format_unserializable_data
from . import const, decorators, messages
from .automation import (
async_get_conditions_for_target,
async_get_services_for_target,
async_get_triggers_for_target,
)
from .connection import ActiveConnection
from .messages import construct_event_message, construct_result_message
@@ -108,9 +113,12 @@ def async_register_commands(
async_reg(hass, handle_execute_script)
async_reg(hass, handle_extract_from_target)
async_reg(hass, handle_fire_event)
async_reg(hass, handle_get_conditions_for_target)
async_reg(hass, handle_get_config)
async_reg(hass, handle_get_services)
async_reg(hass, handle_get_services_for_target)
async_reg(hass, handle_get_states)
async_reg(hass, handle_get_triggers_for_target)
async_reg(hass, handle_manifest_get)
async_reg(hass, handle_integration_setup_info)
async_reg(hass, handle_manifest_list)
@@ -877,6 +885,75 @@ def handle_extract_from_target(
connection.send_result(msg["id"], extracted_dict)
@decorators.websocket_command(
{
vol.Required("type"): "get_triggers_for_target",
vol.Required("target"): cv.TARGET_FIELDS,
vol.Optional("expand_group", default=True): bool,
}
)
@decorators.async_response
async def handle_get_triggers_for_target(
hass: HomeAssistant, connection: ActiveConnection, msg: dict[str, Any]
) -> None:
"""Handle get triggers for target command.
This command returns all triggers that can be used with any entity that is currently
part of the target.
"""
triggers = await async_get_triggers_for_target(
hass, msg["target"], msg["expand_group"]
)
connection.send_result(msg["id"], triggers)
@decorators.websocket_command(
{
vol.Required("type"): "get_conditions_for_target",
vol.Required("target"): cv.TARGET_FIELDS,
vol.Optional("expand_group", default=True): bool,
}
)
@decorators.async_response
async def handle_get_conditions_for_target(
hass: HomeAssistant, connection: ActiveConnection, msg: dict[str, Any]
) -> None:
"""Handle get conditions for target command.
This command returns all conditions that can be used with any entity that is currently
part of the target.
"""
conditions = await async_get_conditions_for_target(
hass, msg["target"], msg["expand_group"]
)
connection.send_result(msg["id"], conditions)
@decorators.websocket_command(
{
vol.Required("type"): "get_services_for_target",
vol.Required("target"): cv.TARGET_FIELDS,
vol.Optional("expand_group", default=True): bool,
}
)
@decorators.async_response
async def handle_get_services_for_target(
hass: HomeAssistant, connection: ActiveConnection, msg: dict[str, Any]
) -> None:
"""Handle get services for target command.
This command returns all services that can be used with any entity that is currently
part of the target.
"""
services = await async_get_services_for_target(
hass, msg["target"], msg["expand_group"]
)
connection.send_result(msg["id"], services)
@decorators.websocket_command(
{
vol.Required("type"): "subscribe_trigger",

View File

@@ -189,6 +189,8 @@
"button_4": "Fourth button",
"button_5": "Fifth button",
"button_6": "Sixth button",
"button_7": "Seventh button",
"button_8": "Eighth button",
"close": "[%key:common::action::close%]",
"dim_down": "Dim down",
"dim_up": "Dim up",

View File

@@ -15,8 +15,8 @@ if TYPE_CHECKING:
from .helpers.typing import NoEventData
APPLICATION_NAME: Final = "HomeAssistant"
MAJOR_VERSION: Final = 2025
MINOR_VERSION: Final = 12
MAJOR_VERSION: Final = 2026
MINOR_VERSION: Final = 1
PATCH_VERSION: Final = "0.dev0"
__short_version__: Final = f"{MAJOR_VERSION}.{MINOR_VERSION}"
__version__: Final = f"{__short_version__}.{PATCH_VERSION}"

View File

@@ -2426,7 +2426,14 @@ class SupportsResponse(enum.StrEnum):
class Service:
"""Representation of a callable service."""
__slots__ = ["domain", "job", "schema", "service", "supports_response"]
__slots__ = [
"description_placeholders",
"domain",
"job",
"schema",
"service",
"supports_response",
]
def __init__(
self,
@@ -2443,11 +2450,13 @@ class Service:
context: Context | None = None,
supports_response: SupportsResponse = SupportsResponse.NONE,
job_type: HassJobType | None = None,
description_placeholders: Mapping[str, str] | None = None,
) -> None:
"""Initialize a service."""
self.job = HassJob(func, f"service {domain}.{service}", job_type=job_type)
self.schema = schema
self.supports_response = supports_response
self.description_placeholders = description_placeholders
class ServiceCall:
@@ -2590,6 +2599,8 @@ class ServiceRegistry:
schema: VolSchemaType | None = None,
supports_response: SupportsResponse = SupportsResponse.NONE,
job_type: HassJobType | None = None,
*,
description_placeholders: Mapping[str, str] | None = None,
) -> None:
"""Register a service.
@@ -2599,7 +2610,13 @@ class ServiceRegistry:
"""
self._hass.verify_event_loop_thread("hass.services.async_register")
self._async_register(
domain, service, service_func, schema, supports_response, job_type
domain,
service,
service_func,
schema,
supports_response,
job_type,
description_placeholders,
)
@callback
@@ -2617,6 +2634,7 @@ class ServiceRegistry:
schema: VolSchemaType | None = None,
supports_response: SupportsResponse = SupportsResponse.NONE,
job_type: HassJobType | None = None,
description_placeholders: Mapping[str, str] | None = None,
) -> None:
"""Register a service.
@@ -2633,6 +2651,7 @@ class ServiceRegistry:
service,
supports_response=supports_response,
job_type=job_type,
description_placeholders=description_placeholders,
)
if domain in self._services:

View File

@@ -186,6 +186,7 @@ FLOWS = {
"emonitor",
"emulated_roku",
"energenie_power_sockets",
"energyid",
"energyzero",
"enigma2",
"enocean",

View File

@@ -1730,6 +1730,12 @@
"integration_type": "virtual",
"supported_by": "energyzero"
},
"energyid": {
"name": "EnergyID",
"integration_type": "service",
"config_flow": true,
"iot_class": "cloud_push"
},
"energyzero": {
"name": "EnergyZero",
"integration_type": "service",

View File

@@ -110,6 +110,9 @@ INPUT_ENTITY_ID = re.compile(
CONDITION_DESCRIPTION_CACHE: HassKey[dict[str, dict[str, Any] | None]] = HassKey(
"condition_description_cache"
)
CONDITION_DISABLED_CONDITIONS: HassKey[set[str]] = HassKey(
"condition_disabled_conditions"
)
CONDITION_PLATFORM_SUBSCRIPTIONS: HassKey[
list[Callable[[set[str]], Coroutine[Any, Any, None]]]
] = HassKey("condition_platform_subscriptions")
@@ -151,9 +154,27 @@ _CONDITIONS_DESCRIPTION_SCHEMA = vol.Schema(
async def async_setup(hass: HomeAssistant) -> None:
"""Set up the condition helper."""
from homeassistant.components import automation, labs # noqa: PLC0415
hass.data[CONDITION_DESCRIPTION_CACHE] = {}
hass.data[CONDITION_DISABLED_CONDITIONS] = set()
hass.data[CONDITION_PLATFORM_SUBSCRIPTIONS] = []
hass.data[CONDITIONS] = {}
@callback
def new_triggers_conditions_listener() -> None:
"""Handle new_triggers_conditions flag change."""
# Invalidate the cache
hass.data[CONDITION_DESCRIPTION_CACHE] = {}
hass.data[CONDITION_DISABLED_CONDITIONS] = set()
labs.async_listen(
hass,
automation.DOMAIN,
automation.NEW_TRIGGERS_CONDITIONS_FEATURE_FLAG,
new_triggers_conditions_listener,
)
await async_process_integration_platforms(
hass, "condition", _register_condition_platform, wait_for_platforms=True
)
@@ -352,11 +373,21 @@ def trace_condition_function(condition: ConditionCheckerType) -> ConditionChecke
async def _async_get_condition_platform(
hass: HomeAssistant, condition_key: str
) -> tuple[str, ConditionProtocol | None]:
from homeassistant.components import automation # noqa: PLC0415
platform_and_sub_type = condition_key.split(".")
platform: str | None = platform_and_sub_type[0]
platform = _PLATFORM_ALIASES.get(platform, platform)
if platform is None:
return "", None
if automation.is_disabled_experimental_condition(hass, platform):
raise vol.Invalid(
f"Condition '{condition_key}' requires the experimental 'New triggers and "
"conditions' feature to be enabled in Home Assistant Labs settings "
f"(feature flag: '{automation.NEW_TRIGGERS_CONDITIONS_FEATURE_FLAG}')"
)
try:
integration = await async_get_integration(hass, platform)
except IntegrationNotFound:
@@ -1209,6 +1240,8 @@ async def async_get_all_descriptions(
hass: HomeAssistant,
) -> dict[str, dict[str, Any] | None]:
"""Return descriptions (i.e. user documentation) for all conditions."""
from homeassistant.components import automation # noqa: PLC0415
descriptions_cache = hass.data[CONDITION_DESCRIPTION_CACHE]
conditions = hass.data[CONDITIONS]
@@ -1217,7 +1250,12 @@ async def async_get_all_descriptions(
all_conditions = set(conditions)
previous_all_conditions = set(descriptions_cache)
# If the conditions are the same, we can return the cache
if previous_all_conditions == all_conditions:
# mypy complains: Invalid index type "HassKey[set[str]]" for "HassDict"
if (
previous_all_conditions | hass.data[CONDITION_DISABLED_CONDITIONS] # type: ignore[index]
== all_conditions
):
return descriptions_cache
# Files we loaded for missing descriptions
@@ -1257,6 +1295,9 @@ async def async_get_all_descriptions(
new_descriptions_cache = descriptions_cache.copy()
for missing_condition in missing_conditions:
domain = conditions[missing_condition]
if automation.is_disabled_experimental_condition(hass, domain):
hass.data[CONDITION_DISABLED_CONDITIONS].add(missing_condition)
continue
if (
yaml_description := new_conditions_descriptions.get(domain, {}).get(

View File

@@ -781,6 +781,7 @@ class DeviceRegistry(BaseRegistry[dict[str, list[dict[str, Any]]]]):
STORAGE_KEY,
atomic_writes=True,
minor_version=STORAGE_VERSION_MINOR,
serialize_in_event_loop=False,
)
@callback
@@ -1562,10 +1563,15 @@ class DeviceRegistry(BaseRegistry[dict[str, list[dict[str, Any]]]]):
@callback
def _data_to_save(self) -> dict[str, Any]:
"""Return data of device registry to store in a file."""
# Create intermediate lists to allow this method to be called from a thread
# other than the event loop.
return {
"devices": [entry.as_storage_fragment for entry in self.devices.values()],
"devices": [
entry.as_storage_fragment for entry in list(self.devices.values())
],
"deleted_devices": [
entry.as_storage_fragment for entry in self.deleted_devices.values()
entry.as_storage_fragment
for entry in list(self.deleted_devices.values())
],
}
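The `list(...)` wrappers exist because `_data_to_save` may now run off the event loop (see the new `serialize_in_event_loop=False` store option): iterating a dict that another thread mutates raises `RuntimeError: dictionary changed size during iteration`, while a `list()` snapshot is taken in one step. A minimal illustration:

```python
def data_to_save(devices: dict) -> dict:
    """Snapshot the registry values before building storage fragments."""
    return {"devices": [entry for entry in list(devices.values())]}

devices = {"a": {"id": "a"}, "b": {"id": "b"}}
snapshot = data_to_save(devices)
devices["c"] = {"id": "c"}  # simulated concurrent mutation
assert len(snapshot["devices"]) == 2
```

The snapshot is shallow: the individual entries are immutable storage fragments, so copying only the container is sufficient.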

View File

@@ -3,7 +3,7 @@
from __future__ import annotations
import asyncio
from collections.abc import Callable, Iterable
from collections.abc import Callable, Iterable, Mapping
from datetime import timedelta
import logging
from types import ModuleType
@@ -251,6 +251,8 @@ class EntityComponent[_EntityT: entity.Entity = entity.Entity]:
func: str | Callable[..., Any],
required_features: list[int] | None = None,
supports_response: SupportsResponse = SupportsResponse.NONE,
*,
description_placeholders: Mapping[str, str] | None = None,
) -> None:
"""Register an entity service."""
service.async_register_entity_service(
@@ -263,6 +265,7 @@ class EntityComponent[_EntityT: entity.Entity = entity.Entity]:
required_features=required_features,
schema=schema,
supports_response=supports_response,
description_placeholders=description_placeholders,
)
async def async_setup_platform(

View File

@@ -3,7 +3,7 @@
from __future__ import annotations
import asyncio
from collections.abc import Awaitable, Callable, Coroutine, Iterable
from collections.abc import Awaitable, Callable, Coroutine, Iterable, Mapping
from contextvars import ContextVar
from datetime import timedelta
from logging import Logger, getLogger
@@ -1081,6 +1081,7 @@ class EntityPlatform:
supports_response: SupportsResponse = SupportsResponse.NONE,
*,
entity_device_classes: Iterable[str | None] | None = None,
description_placeholders: Mapping[str, str] | None = None,
) -> None:
"""Register an entity service.
@@ -1100,6 +1101,7 @@ class EntityPlatform:
required_features=required_features,
schema=schema,
supports_response=supports_response,
description_placeholders=description_placeholders,
)
async def _async_update_entity_states(self) -> None:

View File

@@ -824,6 +824,7 @@ class EntityRegistry(BaseRegistry):
STORAGE_KEY,
atomic_writes=True,
minor_version=STORAGE_VERSION_MINOR,
serialize_in_event_loop=False,
)
self.hass.bus.async_listen(
EVENT_DEVICE_REGISTRY_UPDATED,
@@ -1630,13 +1631,17 @@ class EntityRegistry(BaseRegistry):
self.entities = entities
self._entities_data = entities.data
@callback
def _data_to_save(self) -> dict[str, Any]:
"""Return data of entity registry to store in a file."""
# Create intermediate lists to allow this method to be called from a thread
# other than the event loop.
return {
"entities": [entry.as_storage_fragment for entry in self.entities.values()],
"entities": [
entry.as_storage_fragment for entry in list(self.entities.values())
],
"deleted_entities": [
entry.as_storage_fragment for entry in self.deleted_entities.values()
entry.as_storage_fragment
for entry in list(self.deleted_entities.values())
],
}

View File

@@ -77,7 +77,6 @@ class BaseRegistry[_StoreDataT: Mapping[str, Any] | Sequence[Any]](ABC):
delay = SAVE_DELAY if self.hass.state is CoreState.running else SAVE_DELAY_LONG
self._store.async_delay_save(self._data_to_save, delay)
@callback
@abstractmethod
def _data_to_save(self) -> _StoreDataT:
"""Return data of registry to store in a file."""

View File

@@ -57,6 +57,10 @@ class Selector[_T: Mapping[str, Any]]:
CONFIG_SCHEMA: Callable
config: _T
selector_type: str
# Context keys that may be used in the selector, mapped to the set of selector types allowed for each key.
# Selectors can use the value of other fields in the same schema as context, for example for filtering.
# Each selector defines which context keys it supports and which selector types may supply each key.
allowed_context_keys: dict[str, set[str]] = {}
def __init__(self, config: Mapping[str, Any] | None = None) -> None:
"""Instantiate a selector."""
@@ -346,6 +350,11 @@ class AttributeSelector(Selector[AttributeSelectorConfig]):
selector_type = "attribute"
allowed_context_keys = {
# Filters the available attributes based on the selected entity
"filter_entity": {"entity"}
}
CONFIG_SCHEMA = make_selector_config_schema(
{
vol.Required("entity_id"): cv.entity_id,
@@ -1039,6 +1048,11 @@ class MediaSelector(Selector[MediaSelectorConfig]):
selector_type = "media"
allowed_context_keys = {
# Filters the available media based on the selected entity
"filter_entity": {EntitySelector.selector_type}
}
CONFIG_SCHEMA = make_selector_config_schema(
{
vol.Optional("accept"): [str],
@@ -1385,6 +1399,15 @@ class StateSelector(Selector[StateSelectorConfig]):
selector_type = "state"
allowed_context_keys = {
# Filters the available states based on the selected entity
"filter_entity": {EntitySelector.selector_type},
# Filters the available states based on the selected target
"filter_target": {"target"},
# Only show the attribute values of a specific attribute
"filter_attribute": {AttributeSelector.selector_type},
}
CONFIG_SCHEMA = make_selector_config_schema(
{
vol.Optional("entity_id"): cv.entity_id,

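The class-level `allowed_context_keys` mapping above pairs each context key with the selector types permitted to supply it. A hedged, standalone sketch of how such a mapping could be consulted — the `validate_context` helper and these stripped-down classes are illustrative assumptions, not the actual Home Assistant API:

```python
# Hypothetical sketch: a class-level allowed_context_keys mapping gates
# which selector types may provide context for each key.
class Selector:
    allowed_context_keys: dict[str, set[str]] = {}


class StateSelector(Selector):
    selector_type = "state"
    allowed_context_keys = {
        # entity selectors may filter the available states
        "filter_entity": {"entity"},
        "filter_target": {"target"},
    }


def validate_context(selector: Selector, key: str, other_type: str) -> bool:
    """Return True if selectors of other_type may supply context key."""
    return other_type in selector.allowed_context_keys.get(key, set())


assert validate_context(StateSelector(), "filter_entity", "entity")
assert not validate_context(StateSelector(), "filter_entity", "number")
assert not validate_context(Selector(), "filter_entity", "entity")
```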

@@ -3,7 +3,7 @@
from __future__ import annotations
import asyncio
from collections.abc import Callable, Coroutine, Iterable
from collections.abc import Callable, Coroutine, Iterable, Mapping
import dataclasses
from enum import Enum
from functools import cache, partial
@@ -612,6 +612,8 @@ async def async_get_all_descriptions(
# Don't warn for missing services, because it triggers false
# positives for things like scripts, that register as a service
description = {"fields": yaml_description.get("fields", {})}
if description_placeholders := service.description_placeholders:
description["description_placeholders"] = description_placeholders
for item in ("description", "name", "target"):
if item in yaml_description:
@@ -955,6 +957,8 @@ def async_register_admin_service(
],
schema: VolSchemaType = vol.Schema({}, extra=vol.PREVENT_EXTRA),
supports_response: SupportsResponse = SupportsResponse.NONE,
*,
description_placeholders: Mapping[str, str] | None = None,
) -> None:
"""Register a service that requires admin access."""
hass.services.async_register(
@@ -967,6 +971,7 @@ def async_register_admin_service(
),
schema,
supports_response,
description_placeholders=description_placeholders,
)
@@ -1112,6 +1117,7 @@ def async_register_entity_service(
domain: str,
name: str,
*,
description_placeholders: Mapping[str, str] | None = None,
entity_device_classes: Iterable[str | None] | None = None,
entities: dict[str, Entity],
func: str | Callable[..., Any],
@@ -1145,6 +1151,7 @@ def async_register_entity_service(
schema,
supports_response,
job_type=job_type,
description_placeholders=description_placeholders,
)
@@ -1154,6 +1161,7 @@ def async_register_platform_entity_service(
service_domain: str,
service_name: str,
*,
description_placeholders: Mapping[str, str] | None = None,
entity_device_classes: Iterable[str | None] | None = None,
entity_domain: str,
func: str | Callable[..., Any],
@@ -1191,4 +1199,5 @@ def async_register_platform_entity_service(
schema,
supports_response,
job_type=HassJobType.Coroutinefunction,
description_placeholders=description_placeholders,
)


@@ -6,7 +6,7 @@ from collections.abc import Iterable
from functools import wraps
import math
import statistics
from typing import TYPE_CHECKING, Any
from typing import TYPE_CHECKING, Any, Literal
import jinja2
from jinja2 import pass_environment
@@ -77,6 +77,10 @@ class MathExtension(BaseTemplateExtension):
TemplateFunction(
"bitwise_xor", self.bitwise_xor, as_global=True, as_filter=True
),
# Value constraint functions (as globals and filters)
TemplateFunction("clamp", self.clamp, as_global=True, as_filter=True),
TemplateFunction("wrap", self.wrap, as_global=True, as_filter=True),
TemplateFunction("remap", self.remap, as_global=True, as_filter=True),
],
)
@@ -327,3 +331,114 @@ class MathExtension(BaseTemplateExtension):
def bitwise_xor(first_value: Any, second_value: Any) -> Any:
"""Perform a bitwise xor operation."""
return first_value ^ second_value
@staticmethod
def clamp(value: Any, min_value: Any, max_value: Any) -> Any:
"""Filter and function to clamp a value between min and max bounds.
Constrains value to the range [min_value, max_value] (inclusive).
"""
try:
value_num = float(value)
min_value_num = float(min_value)
max_value_num = float(max_value)
except (ValueError, TypeError) as err:
raise ValueError(
f"function requires numeric arguments, "
f"got {value=}, {min_value=}, {max_value=}"
) from err
return max(min_value_num, min(max_value_num, value_num))
@staticmethod
def wrap(value: Any, min_value: Any, max_value: Any) -> Any:
"""Filter and function to wrap a value within a range.
Wraps value cyclically within [min_value, max_value) (inclusive min, exclusive max).
"""
try:
value_num = float(value)
min_value_num = float(min_value)
max_value_num = float(max_value)
except (ValueError, TypeError) as err:
raise ValueError(
f"function requires numeric arguments, "
f"got {value=}, {min_value=}, {max_value=}"
) from err
try:
range_size = max_value_num - min_value_num
return ((value_num - min_value_num) % range_size) + min_value_num
except ZeroDivisionError: # be lenient: if the range is empty, just clamp
return min_value_num
@staticmethod
def remap(
value: Any,
in_min: Any,
in_max: Any,
out_min: Any,
out_max: Any,
*,
steps: int = 0,
edges: Literal["none", "clamp", "wrap", "mirror"] = "none",
) -> Any:
"""Filter and function to remap a value from one range to another.
Maps value from input range [in_min, in_max] to output range [out_min, out_max].
The steps parameter, if greater than 0, quantizes the output into
the specified number of discrete steps.
The edges parameter controls how out-of-bounds input values are handled:
- "none": No special handling; values outside the input range are extrapolated into the output range.
- "clamp": Values outside the input range are clamped to the nearest boundary.
- "wrap": Values outside the input range are wrapped around cyclically.
- "mirror": Values outside the input range are mirrored back into the range.
"""
try:
value_num = float(value)
in_min_num = float(in_min)
in_max_num = float(in_max)
out_min_num = float(out_min)
out_max_num = float(out_max)
except (ValueError, TypeError) as err:
raise ValueError(
f"function requires numeric arguments, "
f"got {value=}, {in_min=}, {in_max=}, {out_min=}, {out_max=}"
) from err
# Apply edge behavior in original space for accuracy.
if edges == "clamp":
value_num = max(in_min_num, min(in_max_num, value_num))
elif edges == "wrap":
if in_min_num == in_max_num:
raise ValueError(f"{in_min=} must not equal {in_max=}")
range_size = in_max_num - in_min_num # Validated against div0 above.
value_num = ((value_num - in_min_num) % range_size) + in_min_num
elif edges == "mirror":
if in_min_num == in_max_num:
raise ValueError(f"{in_min=} must not equal {in_max=}")
range_size = in_max_num - in_min_num # Validated against div0 above.
# Determine which period we're in and whether it should be mirrored
offset = value_num - in_min_num
period = math.floor(offset / range_size)
position_in_period = offset - (period * range_size)
if (period < 0) or (period % 2 != 0):
position_in_period = range_size - position_in_period
value_num = in_min_num + position_in_period
# Unknown "edges" values are left as-is; no use throwing an error.
steps = max(steps, 0)
if not steps and (in_min_num == out_min_num and in_max_num == out_max_num):
return value_num # No remapping needed. Save some cycles and floating-point precision.
normalized = (value_num - in_min_num) / (in_max_num - in_min_num)
if steps:
normalized = round(normalized * steps) / steps
return out_min_num + (normalized * (out_max_num - out_min_num))
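Stripped of argument coercion and edge handling, the three helpers above reduce to a few lines of arithmetic. A standalone re-derivation with spot-checked values (independent of the extension class; simple positional floats only):

```python
def clamp(value: float, lo: float, hi: float) -> float:
    """Constrain value to the inclusive range [lo, hi]."""
    return max(lo, min(hi, value))


def wrap(value: float, lo: float, hi: float) -> float:
    """Wrap value cyclically into [lo, hi) via modular arithmetic."""
    return ((value - lo) % (hi - lo)) + lo


def remap(value: float, in_min: float, in_max: float,
          out_min: float, out_max: float) -> float:
    """Linearly map value from [in_min, in_max] to [out_min, out_max]."""
    normalized = (value - in_min) / (in_max - in_min)
    return out_min + normalized * (out_max - out_min)


assert clamp(15, 0, 10) == 10      # above the range -> upper bound
assert wrap(370, 0, 360) == 10     # one step past a full turn
assert remap(5, 0, 10, 0, 100) == 50.0
```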


@@ -83,6 +83,7 @@ DATA_PLUGGABLE_ACTIONS: HassKey[defaultdict[tuple, PluggableActionsEntry]] = Has
TRIGGER_DESCRIPTION_CACHE: HassKey[dict[str, dict[str, Any] | None]] = HassKey(
"trigger_description_cache"
)
TRIGGER_DISABLED_TRIGGERS: HassKey[set[str]] = HassKey("trigger_disabled_triggers")
TRIGGER_PLATFORM_SUBSCRIPTIONS: HassKey[
list[Callable[[set[str]], Coroutine[Any, Any, None]]]
] = HassKey("trigger_platform_subscriptions")
@@ -124,9 +125,27 @@ _TRIGGERS_DESCRIPTION_SCHEMA = vol.Schema(
async def async_setup(hass: HomeAssistant) -> None:
"""Set up the trigger helper."""
from homeassistant.components import automation, labs # noqa: PLC0415
hass.data[TRIGGER_DESCRIPTION_CACHE] = {}
hass.data[TRIGGER_DISABLED_TRIGGERS] = set()
hass.data[TRIGGER_PLATFORM_SUBSCRIPTIONS] = []
hass.data[TRIGGERS] = {}
@callback
def new_triggers_conditions_listener() -> None:
"""Handle new_triggers_conditions flag change."""
# Invalidate the cache
hass.data[TRIGGER_DESCRIPTION_CACHE] = {}
hass.data[TRIGGER_DISABLED_TRIGGERS] = set()
labs.async_listen(
hass,
automation.DOMAIN,
automation.NEW_TRIGGERS_CONDITIONS_FEATURE_FLAG,
new_triggers_conditions_listener,
)
await async_process_integration_platforms(
hass, "trigger", _register_trigger_platform, wait_for_platforms=True
)
@@ -694,9 +713,19 @@ class PluggableAction:
async def _async_get_trigger_platform(
hass: HomeAssistant, trigger_key: str
) -> tuple[str, TriggerProtocol]:
from homeassistant.components import automation # noqa: PLC0415
platform_and_sub_type = trigger_key.split(".")
platform = platform_and_sub_type[0]
platform = _PLATFORM_ALIASES.get(platform, platform)
if automation.is_disabled_experimental_trigger(hass, platform):
raise vol.Invalid(
f"Trigger '{trigger_key}' requires the experimental 'New triggers and "
"conditions' feature to be enabled in Home Assistant Labs settings "
f"(feature flag: '{automation.NEW_TRIGGERS_CONDITIONS_FEATURE_FLAG}')"
)
try:
integration = await async_get_integration(hass, platform)
except IntegrationNotFound:
@@ -976,6 +1005,8 @@ async def async_get_all_descriptions(
hass: HomeAssistant,
) -> dict[str, dict[str, Any] | None]:
"""Return descriptions (i.e. user documentation) for all triggers."""
from homeassistant.components import automation # noqa: PLC0415
descriptions_cache = hass.data[TRIGGER_DESCRIPTION_CACHE]
triggers = hass.data[TRIGGERS]
@@ -984,7 +1015,9 @@ async def async_get_all_descriptions(
all_triggers = set(triggers)
previous_all_triggers = set(descriptions_cache)
# If the triggers are the same, we can return the cache
if previous_all_triggers == all_triggers:
# mypy complains: Invalid index type "HassKey[set[str]]" for "HassDict"
if previous_all_triggers | hass.data[TRIGGER_DISABLED_TRIGGERS] == all_triggers: # type: ignore[index]
return descriptions_cache
# Files we loaded for missing descriptions
@@ -1022,6 +1055,9 @@ async def async_get_all_descriptions(
new_descriptions_cache = descriptions_cache.copy()
for missing_trigger in missing_triggers:
domain = triggers[missing_trigger]
if automation.is_disabled_experimental_trigger(hass, domain):
hass.data[TRIGGER_DISABLED_TRIGGERS].add(missing_trigger)
continue
if (
yaml_description := new_triggers_descriptions.get(domain, {}).get(

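The `new_triggers_conditions_listener` callback above invalidates both the description cache and the disabled-trigger set whenever the labs flag flips. The pattern in isolation — the flag store and listener registry here are stand-ins, not Home Assistant APIs:

```python
# Minimal sketch of the invalidate-cache-on-flag-change pattern.
class FeatureFlags:
    def __init__(self) -> None:
        self._listeners: list = []
        self.enabled: set[str] = set()

    def listen(self, callback) -> None:
        """Register a callback fired on every flag change."""
        self._listeners.append(callback)

    def toggle(self, flag: str) -> None:
        self.enabled.symmetric_difference_update({flag})
        for callback in self._listeners:
            callback()


description_cache: dict[str, dict] = {"state.changed": {"fields": {}}}
disabled_triggers: set[str] = {"cover.opened"}

flags = FeatureFlags()
# On any flag change, drop both caches so descriptions are rebuilt.
flags.listen(lambda: (description_cache.clear(), disabled_triggers.clear()))

flags.toggle("new_triggers_conditions")
assert not description_cache and not disabled_triggers
```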

@@ -39,7 +39,7 @@ habluetooth==5.7.0
hass-nabucasa==1.6.1
hassil==3.4.0
home-assistant-bluetooth==1.13.1
home-assistant-frontend==20251105.1
home-assistant-frontend==20251126.0
home-assistant-intents==2025.11.24
httpx==0.28.1
ifaddr==0.2.0

mypy.ini generated

@@ -1626,6 +1626,16 @@ disallow_untyped_defs = true
warn_return_any = true
warn_unreachable = true
[mypy-homeassistant.components.energyid.*]
check_untyped_defs = true
disallow_incomplete_defs = true
disallow_subclassing_any = true
disallow_untyped_calls = true
disallow_untyped_decorators = true
disallow_untyped_defs = true
warn_return_any = true
warn_unreachable = true
[mypy-homeassistant.components.energyzero.*]
check_untyped_defs = true
disallow_incomplete_defs = true


@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"
[project]
name = "homeassistant"
version = "2025.12.0.dev0"
version = "2026.1.0.dev0"
license = "Apache-2.0"
license-files = ["LICENSE*", "homeassistant/backports/LICENSE*"]
description = "Open-source home automation platform running on Python 3."
@@ -830,7 +830,7 @@ ignore = [
# Disabled because ruff does not understand type of __all__ generated by a function
"PLE0605",
"FURB116"
"FURB116",
]
[tool.ruff.lint.flake8-import-conventions.extend-aliases]

requirements_all.txt generated

@@ -190,7 +190,7 @@ aioairzone-cloud==0.7.2
aioairzone==1.0.2
# homeassistant.components.alexa_devices
aioamazondevices==9.0.3
aioamazondevices==10.0.0
# homeassistant.components.ambient_network
# homeassistant.components.ambient_station
@@ -782,7 +782,7 @@ debugpy==1.8.17
decora-wifi==1.4
# homeassistant.components.ecovacs
deebot-client==16.3.0
deebot-client==16.4.0
# homeassistant.components.ihc
# homeassistant.components.namecheapdns
@@ -889,6 +889,9 @@ emulated-roku==0.3.0
# homeassistant.components.huisbaasje
energyflip-client==0.2.2
# homeassistant.components.energyid
energyid-webhooks==0.0.14
# homeassistant.components.energyzero
energyzero==2.1.1
@@ -955,6 +958,9 @@ fing_agent_api==1.0.3
# homeassistant.components.fints
fints==3.1.0
# homeassistant.components.fitbit
fitbit-web-api==2.13.5
# homeassistant.components.fitbit
fitbit==0.3.1
@@ -1087,7 +1093,7 @@ google-nest-sdm==9.1.0
google-photos-library-api==0.12.1
# homeassistant.components.google_air_quality
google_air_quality_api==1.1.1
google_air_quality_api==1.1.2
# homeassistant.components.slide
# homeassistant.components.slide_local
@@ -1195,7 +1201,7 @@ hole==0.9.0
holidays==0.84
# homeassistant.components.frontend
home-assistant-frontend==20251105.1
home-assistant-frontend==20251126.0
# homeassistant.components.conversation
home-assistant-intents==2025.11.24
@@ -2545,7 +2551,7 @@ python-overseerr==0.7.1
python-picnic-api2==1.3.1
# homeassistant.components.pooldose
python-pooldose==0.7.8
python-pooldose==0.8.0
# homeassistant.components.rabbitair
python-rabbitair==0.0.8
@@ -2554,7 +2560,7 @@ python-rabbitair==0.0.8
python-ripple-api==0.0.3
# homeassistant.components.roborock
python-roborock==3.7.1
python-roborock==3.8.1
# homeassistant.components.smarttub
python-smarttub==0.0.45
@@ -2714,7 +2720,7 @@ renault-api==0.5.0
renson-endura-delta==1.7.2
# homeassistant.components.reolink
reolink-aio==0.16.5
reolink-aio==0.16.6
# homeassistant.components.idteck_prox
rfk101py==0.0.1
@@ -3047,7 +3053,7 @@ typedmonarchmoney==0.4.4
uasiren==0.0.1
# homeassistant.components.unifiprotect
uiprotect==7.28.0
uiprotect==7.29.0
# homeassistant.components.landisgyr_heat_meter
ultraheat-api==0.5.7


@@ -181,7 +181,7 @@ aioairzone-cloud==0.7.2
aioairzone==1.0.2
# homeassistant.components.alexa_devices
aioamazondevices==9.0.3
aioamazondevices==10.0.0
# homeassistant.components.ambient_network
# homeassistant.components.ambient_station
@@ -691,7 +691,7 @@ dbus-fast==3.1.2
debugpy==1.8.17
# homeassistant.components.ecovacs
deebot-client==16.3.0
deebot-client==16.4.0
# homeassistant.components.ihc
# homeassistant.components.namecheapdns
@@ -783,6 +783,9 @@ emulated-roku==0.3.0
# homeassistant.components.huisbaasje
energyflip-client==0.2.2
# homeassistant.components.energyid
energyid-webhooks==0.0.14
# homeassistant.components.energyzero
energyzero==2.1.1
@@ -843,6 +846,9 @@ fing_agent_api==1.0.3
# homeassistant.components.fints
fints==3.1.0
# homeassistant.components.fitbit
fitbit-web-api==2.13.5
# homeassistant.components.fitbit
fitbit==0.3.1
@@ -963,7 +969,7 @@ google-nest-sdm==9.1.0
google-photos-library-api==0.12.1
# homeassistant.components.google_air_quality
google_air_quality_api==1.1.1
google_air_quality_api==1.1.2
# homeassistant.components.slide
# homeassistant.components.slide_local
@@ -1053,7 +1059,7 @@ hole==0.9.0
holidays==0.84
# homeassistant.components.frontend
home-assistant-frontend==20251105.1
home-assistant-frontend==20251126.0
# homeassistant.components.conversation
home-assistant-intents==2025.11.24
@@ -2126,13 +2132,13 @@ python-overseerr==0.7.1
python-picnic-api2==1.3.1
# homeassistant.components.pooldose
python-pooldose==0.7.8
python-pooldose==0.8.0
# homeassistant.components.rabbitair
python-rabbitair==0.0.8
# homeassistant.components.roborock
python-roborock==3.7.1
python-roborock==3.8.1
# homeassistant.components.smarttub
python-smarttub==0.0.45
@@ -2268,7 +2274,7 @@ renault-api==0.5.0
renson-endura-delta==1.7.2
# homeassistant.components.reolink
reolink-aio==0.16.5
reolink-aio==0.16.6
# homeassistant.components.rflink
rflink==0.0.67
@@ -2532,7 +2538,7 @@ typedmonarchmoney==0.4.4
uasiren==0.0.1
# homeassistant.components.unifiprotect
uiprotect==7.28.0
uiprotect==7.29.0
# homeassistant.components.landisgyr_heat_meter
ultraheat-api==0.5.7


@@ -9,9 +9,8 @@ cd "$(realpath "$(dirname "$0")/..")"
echo "Installing development dependencies..."
uv pip install \
-e . \
-r requirements_test.txt \
-r requirements_test_all.txt \
colorlog \
--constraint homeassistant/package_constraints.txt \
--upgrade \
--config-settings editable_mode=compat


@@ -648,7 +648,6 @@ INTEGRATIONS_WITHOUT_QUALITY_SCALE_FILE = [
"mqtt_statestream",
"msteams",
"mullvad",
"music_assistant",
"mutesync",
"mvglive",
"mycroft",
@@ -1665,7 +1664,6 @@ INTEGRATIONS_WITHOUT_SCALE = [
"mqtt_statestream",
"msteams",
"mullvad",
"music_assistant",
"mutesync",
"mvglive",
"mycroft",

Some files were not shown because too many files have changed in this diff.