Compare commits

...

53 Commits

Author SHA1 Message Date
Ariel Ebersberger
e937b09652 Merge branch 'dev' into python-3.14.3 2026-04-15 19:04:03 +02:00
Ariel Ebersberger
533871babb Optimize add_job to skip double-deferral for @callback targets (#168198) 2026-04-15 18:50:33 +02:00
Erik Montnemery
1dc93a80c4 Improve type annotations and remove unused code in mobile_app (#168298) 2026-04-15 18:09:10 +02:00
Erik Montnemery
f8a94c6f22 Fix climate trigger labs flag test (#168299) 2026-04-15 17:53:26 +02:00
Erik Montnemery
b127d13587 Add additional media_player triggers (#156927)
Co-authored-by: Norbert Rittel <norbert@rittel.de>
2026-04-15 17:34:36 +02:00
renovate[bot]
1895f8ebce Update attrs to 26.1.0 (#168276)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Franck Nijhof <git@frenck.dev>
2026-04-15 17:22:33 +02:00
renovate[bot]
b6916954dc Update respx to 0.23.1 (#168272)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-04-15 17:10:28 +02:00
renovate[bot]
23181f5275 Update pytest-github-actions-annotate-failures to 0.4.0 (#168269)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-04-15 16:59:51 +02:00
Robert Resch
607a10d1e1 Use pip to install dynamically extracted version from requirements.txt (#168246) 2026-04-15 16:34:01 +02:00
Ariel Ebersberger
ecb814adb0 Add test coverage for add_job and fix docstring (#168291) 2026-04-15 16:17:01 +02:00
G Johansson
67df556e84 Add async_on_create_entry method to create config entries (#155016)
Co-authored-by: Erik Montnemery <erik@montnemery.com>
Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-04-15 15:57:32 +02:00
AlCalzone
4d472418c5 Ensure extra_fields in Z-Wave automation config are strings (#168281) 2026-04-15 15:12:18 +02:00
renovate[bot]
cf6441561c Update voluptuous-openapi to 0.3.0 (#168275)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-04-15 15:06:24 +02:00
Erik Montnemery
6d8d447355 Revert "Add last_non_buffering_state media_player state attribute (#166941)" (#168285) 2026-04-15 14:41:02 +02:00
Erik Montnemery
ab5ae33290 Exclude unavailable and unknown in trigger first and last checks (#168224)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-04-15 14:20:49 +02:00
renovate[bot]
c0bf9a2bd2 Update pytest-sugar to 1.1.1 (#168270)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-04-15 13:07:21 +02:00
Norbert Rittel
d862b999ae Capitalize "REST" abbreviation in scrape error messages (#168280) 2026-04-15 11:36:39 +02:00
Erik Montnemery
d6be6e8810 Improve timer tests (#168277) 2026-04-15 11:21:59 +02:00
Daniel Hjelseth Høyer
f397f4c908 Handle Tibber async_get_client failing (#168207)
Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>
2026-04-15 10:50:29 +02:00
G Johansson
d58e7862c0 Scrape sub config entry (#141389) 2026-04-15 09:59:12 +02:00
Erik Montnemery
84f57f9859 Deduplicate toggle entity condition tests (#168195) 2026-04-15 08:19:09 +02:00
Erik Montnemery
c6169ec8eb Add update conditions (#167751) 2026-04-15 08:03:51 +02:00
renovate[bot]
c47cecf350 Update SQLAlchemy to 2.0.49 (#168260) 2026-04-15 07:20:58 +02:00
renovate[bot]
e31f611901 Update pytest-cov to 7.1.0 (#168267) 2026-04-15 07:20:10 +02:00
renovate[bot]
bc36b1dda2 Update coverage to 7.13.5 (#168238) 2026-04-15 07:19:39 +02:00
renovate[bot]
b3967130f0 Update orjson to 3.11.8 (#168259) 2026-04-15 06:40:43 +02:00
renovate[bot]
2960db3d8e Update codespell (#168235) 2026-04-15 06:34:50 +02:00
Christopher Fenner
a74cf69607 Bump PyViCare to v2.59.0 (#168254) 2026-04-15 01:42:16 +02:00
Noah Husby
d3e346c728 Bump aiorussound to 5.0.1 (#168255) 2026-04-15 01:03:10 +02:00
renovate[bot]
efb93c928e Update pylint-per-file-ignores to 3.2.1 (#168243)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: edenhaus <26537646+edenhaus@users.noreply.github.com>
2026-04-15 00:41:54 +02:00
Paulus Schoutsen
b7894407d2 Git ignore Claude worktrees (#168247) 2026-04-15 00:17:38 +02:00
Ian Foster
0b7da89e6e Modernize reauth flow in ruckus_unleashed (#168013) 2026-04-15 00:07:20 +02:00
Leon Grave
c765077442 Fresh-r integration: Get Quality Scale to Platinum (#167148)
Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-15 00:07:01 +02:00
Franck Nijhof
efbfeb7c30 Set parallel updates for Rituals Perfume Genie platforms (#168042) 2026-04-14 23:55:19 +02:00
Leon Grave
5670a12805 Add FreshrEntity base class (#168023)
Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-14 23:55:03 +02:00
Joakim Plate
d88fe45393 Wait for complete set of product data before accepting gardena device (#166481)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-04-14 23:32:30 +02:00
Leon Grave
b231742049 Add test for LoginError reauth in FreshrReadingsCoordinator (#168022)
Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
Co-authored-by: Copilot Autofix powered by AI <175728472+Copilot@users.noreply.github.com>
2026-04-14 23:06:46 +02:00
Fredrik Mårtensson
99dc368c79 Add feeder meal plan actions to tuya (#161488)
Co-authored-by: Norbert Rittel <norbert@rittel.de>
Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-04-14 23:02:31 +02:00
cdheiser
a21a0a6577 Refactor Lutron setup logic (#167993) 2026-04-14 23:01:49 +02:00
Ian Foster
513fff12ac Improve setup exception handling in ruckus_unleashed (#168014) 2026-04-14 22:48:45 +02:00
David Bonnes
4474ad0450 Add native DHW service to Evohome (#167359)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-04-14 22:40:28 +02:00
Ronald van der Meer
a5d640acdb Add diagnostics to Duco integration (#168231) 2026-04-14 22:07:16 +02:00
renovate[bot]
da66632798 Update syrupy to 5.1.0 (#168241)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-04-14 21:49:20 +02:00
renovate[bot]
f5998856b4 Update yamllint (#168242)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-04-14 21:49:02 +02:00
renovate[bot]
d5441ff99e Update freezegun to 1.5.5 (#168236)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-04-14 21:41:48 +02:00
renovate[bot]
3848d4e8a6 Update Pillow to 12.2.0 (#168234)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-04-14 21:41:14 +02:00
Paulus Schoutsen
599c548264 Bump serialx to 1.2.2 (#168229) 2026-04-14 21:21:26 +02:00
Franck Nijhof
b18602cd18 Disable Renovate vulnerability alerts flow (#168233) 2026-04-14 21:11:07 +02:00
Stefan Agner
a45e2d74ec Split hassio data coordinator and add dedicated stats coordinator (#167080)
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2026-04-14 20:51:13 +02:00
Franck Nijhof
a952636c28 Refine Renovate config with built-in manager and review follow-ups (#168225) 2026-04-14 20:27:59 +02:00
Daniel Hjelseth Høyer
ccd1d9f8ea Bump pyTibber to 0.37.1 (#168208) 2026-04-14 19:41:05 +02:00
Franck Nijhof
a4d4fe3722 Add Renovate config for allow-listed Python dependency updates (#168192) 2026-04-14 18:56:51 +02:00
Jan Čermák
427faf4854 Bump base image to 2026.02.0 with Python 3.14.3, use 3.14.3 in CI
This also bumps libcec used in the base image to 7.1.1, full changelog:
* https://github.com/home-assistant/docker/releases/tag/2026.02.0

Python changelog:
* https://docs.python.org/release/3.14.3/whatsnew/changelog.html
2026-04-10 10:23:28 +02:00
171 changed files with 5217 additions and 2313 deletions

.github/renovate.json vendored (new file, 161 lines)

@@ -0,0 +1,161 @@
{
  "$schema": "https://docs.renovatebot.com/renovate-schema.json",
  "extends": ["config:recommended"],
  "enabledManagers": [
    "pep621",
    "pip_requirements",
    "pre-commit",
    "homeassistant-manifest"
  ],
  "pre-commit": {
    "enabled": true
  },
  "pip_requirements": {
    "managerFilePatterns": [
      "/(^|/)requirements[\\w_-]*\\.txt$/",
      "/(^|/)homeassistant/package_constraints\\.txt$/"
    ]
  },
  "homeassistant-manifest": {
    "managerFilePatterns": [
      "/^homeassistant/components/[^/]+/manifest\\.json$/"
    ]
  },
  "minimumReleaseAge": "7 days",
  "prConcurrentLimit": 10,
  "prHourlyLimit": 2,
  "schedule": ["before 6am"],
  "semanticCommits": "disabled",
  "commitMessageAction": "Update",
  "commitMessageTopic": "{{depName}}",
  "commitMessageExtra": "to {{newVersion}}",
  "automerge": false,
  "vulnerabilityAlerts": {
    "enabled": false
  },
  "packageRules": [
    {
      "description": "Deny all by default — allowlist below re-enables specific packages",
      "matchPackageNames": ["*"],
      "enabled": false
    },
    {
      "description": "Core runtime dependencies (allowlisted)",
      "matchPackageNames": [
        "aiohttp",
        "aiohttp-fast-zlib",
        "aiohttp_cors",
        "aiohttp-asyncmdnsresolver",
        "yarl",
        "httpx",
        "requests",
        "urllib3",
        "certifi",
        "orjson",
        "PyYAML",
        "Jinja2",
        "cryptography",
        "pyOpenSSL",
        "PyJWT",
        "SQLAlchemy",
        "Pillow",
        "attrs",
        "uv",
        "voluptuous",
        "voluptuous-serialize",
        "voluptuous-openapi",
        "zeroconf"
      ],
      "enabled": true,
      "labels": ["dependency", "core"]
    },
    {
      "description": "Test dependencies (allowlisted)",
      "matchPackageNames": [
        "pytest",
        "pytest-asyncio",
        "pytest-aiohttp",
        "pytest-cov",
        "pytest-freezer",
        "pytest-github-actions-annotate-failures",
        "pytest-socket",
        "pytest-sugar",
        "pytest-timeout",
        "pytest-unordered",
        "pytest-picked",
        "pytest-xdist",
        "pylint",
        "pylint-per-file-ignores",
        "astroid",
        "coverage",
        "freezegun",
        "syrupy",
        "respx",
        "requests-mock",
        "ruff",
        "codespell",
        "yamllint",
        "zizmor"
      ],
      "enabled": true,
      "labels": ["dependency"]
    },
    {
      "description": "For types-* stubs, only allow patch updates. Major/minor bumps track the upstream runtime package version and must be manually coordinated with the corresponding pin.",
      "matchPackageNames": ["/^types-/"],
      "matchUpdateTypes": ["patch"],
      "enabled": true,
      "labels": ["dependency"]
    },
    {
      "description": "Pre-commit hook repos (allowlisted, matched by owner/repo)",
      "matchPackageNames": [
        "astral-sh/ruff-pre-commit",
        "codespell-project/codespell",
        "adrienverge/yamllint",
        "zizmorcore/zizmor-pre-commit"
      ],
      "enabled": true,
      "labels": ["dependency"]
    },
    {
      "description": "Group ruff pre-commit hook with its PyPI twin into one PR",
      "matchPackageNames": ["astral-sh/ruff-pre-commit", "ruff"],
      "groupName": "ruff",
      "groupSlug": "ruff"
    },
    {
      "description": "Group codespell pre-commit hook with its PyPI twin into one PR",
      "matchPackageNames": ["codespell-project/codespell", "codespell"],
      "groupName": "codespell",
      "groupSlug": "codespell"
    },
    {
      "description": "Group yamllint pre-commit hook with its PyPI twin into one PR",
      "matchPackageNames": ["adrienverge/yamllint", "yamllint"],
      "groupName": "yamllint",
      "groupSlug": "yamllint"
    },
    {
      "description": "Group zizmor pre-commit hook with its PyPI twin into one PR",
      "matchPackageNames": ["zizmorcore/zizmor-pre-commit", "zizmor"],
      "groupName": "zizmor",
      "groupSlug": "zizmor"
    },
    {
      "description": "Group pylint with astroid (their versions are linked and must move together)",
      "matchPackageNames": ["pylint", "astroid"],
      "groupName": "pylint",
      "groupSlug": "pylint"
    }
  ]
}
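Renovate evaluates packageRules in order, with later matching rules overriding earlier ones; that is what lets the deny-all first rule coexist with the allowlists after it. A minimal sketch of that resolution order (ignoring glob patterns and the other match* fields, so a simplification, not Renovate's actual implementation):

```python
def is_enabled(package: str, rules: list[dict]) -> bool:
    """Last matching rule wins, mirroring Renovate's packageRules ordering."""
    enabled = True  # Renovate's default when no rule matches
    for rule in rules:
        names = rule.get("matchPackageNames", [])
        if "*" in names or package in names:
            enabled = rule.get("enabled", enabled)
    return enabled


RULES = [
    {"matchPackageNames": ["*"], "enabled": False},                # deny all
    {"matchPackageNames": ["orjson", "attrs"], "enabled": True},   # allowlist
]

print(is_enabled("orjson", RULES))  # True: re-enabled by the allowlist rule
print(is_enabled("numpy", RULES))   # False: only the deny-all rule matches
```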

@@ -14,7 +14,7 @@ env:
   UV_HTTP_TIMEOUT: 60
   UV_SYSTEM_PYTHON: "true"
   # Base image version from https://github.com/home-assistant/docker
-  BASE_IMAGE_VERSION: "2026.01.0"
+  BASE_IMAGE_VERSION: "2026.02.0"
   ARCHITECTURES: '["amd64", "aarch64"]'

 permissions: {}

.gitignore vendored

@@ -142,5 +142,6 @@ pytest_buckets.txt
 # AI tooling
 .claude/settings.local.json
+.claude/worktrees/
 .serena/

@@ -8,7 +8,7 @@ repos:
       - id: ruff-format
         files: ^((homeassistant|pylint|script|tests)/.+)?[^/]+\.(py|pyi)$
   - repo: https://github.com/codespell-project/codespell
-    rev: v2.4.1
+    rev: v2.4.2
    hooks:
      - id: codespell
        args:
@@ -36,7 +36,7 @@ repos:
          - --branch=master
          - --branch=rc
   - repo: https://github.com/adrienverge/yamllint.git
-    rev: v1.37.1
+    rev: v1.38.0
    hooks:
      - id: yamllint
   - repo: https://github.com/rbubley/mirrors-prettier

@@ -1 +1 @@
-3.14.2
+3.14.3

Dockerfile generated

@@ -19,25 +19,23 @@ ENV \
     UV_SYSTEM_PYTHON=true \
     UV_NO_CACHE=true

-WORKDIR /usr/src
-
 # Home Assistant S6-Overlay
 COPY rootfs /

 # Add go2rtc binary
 COPY --from=ghcr.io/alexxit/go2rtc@sha256:675c318b23c06fd862a61d262240c9a63436b4050d177ffc68a32710d9e05bae /usr/local/bin/go2rtc /bin/go2rtc

-RUN \
-    # Verify go2rtc can be executed
-    go2rtc --version \
-    # Install uv
-    && pip3 install uv==0.11.1
+WORKDIR /usr/src

 ## Setup Home Assistant Core dependencies
 COPY requirements.txt homeassistant/
 COPY homeassistant/package_constraints.txt homeassistant/homeassistant/

 RUN \
-    uv pip install \
+    # Verify go2rtc can be executed
+    go2rtc --version \
+    # Install uv at the version pinned in the requirements file
+    && pip3 install --no-cache-dir "uv==$(awk -F'==' '/^uv==/{print $2}' homeassistant/requirements.txt)" \
+    && uv pip install \
         --no-build \
         -r homeassistant/requirements.txt

@@ -152,6 +152,7 @@ _EXPERIMENTAL_CONDITION_PLATFORMS = {
     "text",
     "timer",
     "todo",
+    "update",
     "vacuum",
     "valve",
     "water_heater",

@@ -6,5 +6,5 @@
   "iot_class": "local_polling",
   "loggers": ["pydoods"],
   "quality_scale": "legacy",
-  "requirements": ["pydoods==1.0.2", "Pillow==12.1.1"]
+  "requirements": ["pydoods==1.0.2", "Pillow==12.2.0"]
 }

@@ -0,0 +1,53 @@
"""Diagnostics support for Duco."""

from __future__ import annotations

import asyncio
from dataclasses import asdict
from typing import Any

from homeassistant.components.diagnostics import async_redact_data
from homeassistant.const import CONF_HOST
from homeassistant.core import HomeAssistant

from .coordinator import DucoConfigEntry

TO_REDACT = {
    CONF_HOST,
    "mac",
    "host_name",
    "serial_board_box",
    "serial_board_comm",
    "serial_duco_box",
    "serial_duco_comm",
}


async def async_get_config_entry_diagnostics(
    hass: HomeAssistant, entry: DucoConfigEntry
) -> dict[str, Any]:
    """Return diagnostics for a config entry."""
    coordinator = entry.runtime_data

    board = asdict(coordinator.board_info)
    board.pop("time")

    lan_info, duco_diags, write_remaining = await asyncio.gather(
        coordinator.client.async_get_lan_info(),
        coordinator.client.async_get_diagnostics(),
        coordinator.client.async_get_write_req_remaining(),
    )

    return async_redact_data(
        {
            "entry_data": entry.data,
            "board_info": board,
            "lan_info": asdict(lan_info),
            "nodes": {
                str(node_id): asdict(node) for node_id, node in coordinator.data.items()
            },
            "duco_diagnostics": [asdict(d) for d in duco_diags],
            "write_requests_remaining": write_remaining,
        },
        TO_REDACT,
    )
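The diagnostics module relies on async_redact_data to scrub identifying values before the payload is exported. A simplified sketch of what that helper does (assumption: the real helper in homeassistant.components.diagnostics handles additional container types and edge cases; this mimics the dict and list cases only):

```python
from typing import Any

REDACTED = "**REDACTED**"


def redact_data(data: Any, to_redact: set[str]) -> Any:
    """Recursively replace values whose keys appear in to_redact."""
    if isinstance(data, dict):
        return {
            key: REDACTED if key in to_redact else redact_data(value, to_redact)
            for key, value in data.items()
        }
    if isinstance(data, list):
        return [redact_data(item, to_redact) for item in data]
    return data


sample = {"host": "192.168.1.10", "nodes": [{"mac": "aa:bb:cc", "zone": 1}]}
print(redact_data(sample, {"host", "mac"}))
# {'host': '**REDACTED**', 'nodes': [{'mac': '**REDACTED**', 'zone': 1}]}
```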

@@ -45,7 +45,7 @@ rules:
   # Gold
   devices: done
-  diagnostics: todo
+  diagnostics: done
   discovery-update-info:
     status: todo
     comment: >-
@@ -74,7 +74,7 @@ rules:
   entity-device-class: done
   entity-disabled-by-default: done
   entity-translations: done
-  exception-translations: todo
+  exception-translations: done
   icon-translations: done
   reconfiguration-flow: todo
   repair-issues: todo

@@ -22,9 +22,8 @@ CONF_LOCATION_IDX: Final = "location_idx"
 SCAN_INTERVAL_DEFAULT: Final = timedelta(seconds=300)
 SCAN_INTERVAL_MINIMUM: Final = timedelta(seconds=60)

-ATTR_PERIOD: Final = "period"  # number of days
 ATTR_DURATION: Final = "duration"  # number of minutes, <24h
+ATTR_PERIOD: Final = "period"  # number of days
 ATTR_SETPOINT: Final = "setpoint"
@@ -37,3 +36,4 @@ class EvoService(StrEnum):
     RESET_SYSTEM = "reset_system"
     SET_ZONE_OVERRIDE = "set_zone_override"
     CLEAR_ZONE_OVERRIDE = "clear_zone_override"
+    SET_DHW_OVERRIDE = "set_dhw_override"

@@ -22,6 +22,9 @@
     "reset_system": {
       "service": "mdi:refresh"
     },
+    "set_dhw_override": {
+      "service": "mdi:water-heater"
+    },
     "set_system_mode": {
       "service": "mdi:pencil"
     },

@@ -14,7 +14,8 @@ from evohomeasync2.schemas.const import (
 import voluptuous as vol

 from homeassistant.components.climate import DOMAIN as CLIMATE_DOMAIN
-from homeassistant.const import ATTR_MODE
+from homeassistant.components.water_heater import DOMAIN as WATER_HEATER_DOMAIN
+from homeassistant.const import ATTR_MODE, ATTR_STATE
 from homeassistant.core import HomeAssistant, ServiceCall, callback
 from homeassistant.exceptions import ServiceValidationError
 from homeassistant.helpers import config_validation as cv, service
@@ -49,6 +50,15 @@ SET_ZONE_OVERRIDE_SCHEMA: Final[dict[str | vol.Marker, Any]] = {
     ),
 }

+# DHW service schemas (registered as entity services)
+SET_DHW_OVERRIDE_SCHEMA: Final[dict[str | vol.Marker, Any]] = {
+    vol.Required(ATTR_STATE): cv.boolean,
+    vol.Optional(ATTR_DURATION): vol.All(
+        cv.time_period,
+        vol.Range(min=timedelta(days=0), max=timedelta(days=1)),
+    ),
+}
+

 def _register_zone_entity_services(hass: HomeAssistant) -> None:
     """Register entity-level services for zones."""
@@ -71,6 +81,19 @@ def _register_zone_entity_services(hass: HomeAssistant) -> None:
     )


+def _register_dhw_entity_services(hass: HomeAssistant) -> None:
+    """Register entity-level services for DHW zones."""
+    service.async_register_platform_entity_service(
+        hass,
+        DOMAIN,
+        EvoService.SET_DHW_OVERRIDE,
+        entity_domain=WATER_HEATER_DOMAIN,
+        schema=SET_DHW_OVERRIDE_SCHEMA,
+        func="async_set_dhw_override",
+    )
+
+
 def _validate_set_system_mode_params(tcs: ControlSystem, data: dict[str, Any]) -> None:
     """Validate that a set_system_mode service call is properly formed."""
@@ -156,3 +179,4 @@ def setup_service_functions(
     )

     _register_zone_entity_services(hass)
+    _register_dhw_entity_services(hass)

@@ -58,3 +58,19 @@ clear_zone_override:
       domain: climate
       supported_features:
         - climate.ClimateEntityFeature.TARGET_TEMPERATURE
+set_dhw_override:
+  target:
+    entity:
+      integration: evohome
+      domain: water_heater
+  fields:
+    state:
+      required: true
+      selector:
+        boolean:
+    duration:
+      example: "02:15"
+      selector:
+        duration:
+          enable_second: false

@@ -21,7 +21,7 @@
     },
     "services": {
       "clear_zone_override": {
-        "description": "Sets a zone to follow its schedule.",
+        "description": "Sets the zone to follow its schedule.",
         "name": "Clear zone override"
       },
       "refresh_system": {
@@ -29,11 +29,25 @@
         "name": "Refresh system"
       },
       "reset_system": {
-        "description": "Sets the system to `Auto` mode and resets all the zones to follow their schedules. Not all Evohome systems support this feature (i.e. `AutoWithReset` mode).",
+        "description": "Sets the system mode to `Auto` mode and resets all the zones to follow their schedules. Not all Evohome systems support this feature (i.e. `AutoWithReset` mode).",
         "name": "Reset system"
       },
+      "set_dhw_override": {
+        "description": "Overrides the DHW state, either indefinitely or for a specified duration, after which it will revert to following its schedule.",
+        "fields": {
+          "duration": {
+            "description": "The DHW will revert to its schedule after this time. If 0 the change is until the next scheduled setpoint.",
+            "name": "Duration"
+          },
+          "state": {
+            "description": "The DHW state: True (on: heat the water up to the setpoint) or False (off).",
+            "name": "State"
+          }
+        },
+        "name": "Set DHW override"
+      },
       "set_system_mode": {
-        "description": "Sets the system mode, either indefinitely, or for a specified period of time, after which it will revert to `Auto`. Not all systems support all modes.",
+        "description": "Sets the system mode, either indefinitely or until a specified end time, after which it will revert to `Auto`. Not all systems support all modes.",
         "fields": {
           "duration": {
             "description": "The duration in hours; used only with `AutoWithEco` mode (up to 24 hours).",
@@ -51,7 +65,7 @@
         "name": "Set system mode"
       },
       "set_zone_override": {
-        "description": "Overrides the zone's setpoint, either indefinitely, or for a specified period of time, after which it will revert to following its schedule.",
+        "description": "Overrides the zone setpoint, either indefinitely or for a specified duration, after which it will revert to following its schedule.",
         "fields": {
           "duration": {
             "description": "The zone will revert to its schedule after this time. If 0 the change is until the next scheduled setpoint.",

@@ -2,6 +2,7 @@
 from __future__ import annotations

+from datetime import timedelta
 import logging
 from typing import Any
@@ -97,6 +98,28 @@ class EvoDHW(EvoChild, WaterHeaterEntity):
             PRECISION_TENTHS if coordinator.client_v1 else PRECISION_WHOLE
         )

+    async def async_set_dhw_override(
+        self, state: bool, duration: timedelta | None = None
+    ) -> None:
+        """Override the DHW zone's on/off state, either permanently or for a duration."""
+
+        if duration is None:
+            until = None  # indefinitely, aka permanent override
+        elif duration.total_seconds() == 0:
+            await self._update_schedule()
+            until = self.setpoints.get("next_sp_from")
+        else:
+            until = dt_util.now() + duration
+
+        until = dt_util.as_utc(until) if until else None
+
+        if state:
+            await self.coordinator.call_client_api(self._evo_device.set_on(until=until))
+        else:
+            await self.coordinator.call_client_api(
+                self._evo_device.set_off(until=until)
+            )
+
     @property
     def current_operation(self) -> str | None:
         """Return the current operating mode (Auto, On, or Off)."""

@@ -6,7 +6,7 @@ from datetime import timedelta
 from aiohttp import ClientError
 from pyfreshr import FreshrClient
 from pyfreshr.exceptions import ApiResponseError, LoginError
-from pyfreshr.models import DeviceReadings, DeviceSummary
+from pyfreshr.models import DeviceReadings, DeviceSummary, DeviceType

 from homeassistant.config_entries import ConfigEntry
 from homeassistant.const import CONF_PASSWORD, CONF_USERNAME
@@ -18,6 +18,12 @@ from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, Upda
 from .const import DOMAIN, LOGGER

+_DEVICE_TYPE_NAMES: dict[DeviceType, str] = {
+    DeviceType.FRESH_R: "Fresh-r",
+    DeviceType.FORWARD: "Fresh-r Forward",
+    DeviceType.MONITOR: "Fresh-r Monitor",
+}
+
 DEVICES_SCAN_INTERVAL = timedelta(hours=1)
 READINGS_SCAN_INTERVAL = timedelta(minutes=10)
@@ -110,6 +116,12 @@ class FreshrReadingsCoordinator(DataUpdateCoordinator[DeviceReadings]):
         )
         self._device = device
         self._client = client
+        self.device_info = dr.DeviceInfo(
+            identifiers={(DOMAIN, device.id)},
+            name=_DEVICE_TYPE_NAMES.get(device.device_type, "Fresh-r"),
+            serial_number=device.id,
+            manufacturer="Fresh-r",
+        )

     @property
     def device_id(self) -> str:

@@ -0,0 +1,18 @@
"""Base entity for the Fresh-r integration."""

from __future__ import annotations

from homeassistant.helpers.update_coordinator import CoordinatorEntity

from .coordinator import FreshrReadingsCoordinator


class FreshrEntity(CoordinatorEntity[FreshrReadingsCoordinator]):
    """Base class for Fresh-r entities."""

    _attr_has_entity_name = True

    def __init__(self, coordinator: FreshrReadingsCoordinator) -> None:
        """Initialize the Fresh-r entity."""
        super().__init__(coordinator)
        self._attr_device_info = coordinator.device_info

@@ -6,6 +6,6 @@
   "documentation": "https://www.home-assistant.io/integrations/freshr",
   "integration_type": "hub",
   "iot_class": "cloud_polling",
-  "quality_scale": "silver",
+  "quality_scale": "platinum",
   "requirements": ["pyfreshr==1.2.0"]
 }

@@ -21,12 +21,10 @@ from homeassistant.const import (
     UnitOfVolumeFlowRate,
 )
 from homeassistant.core import HomeAssistant, callback
-from homeassistant.helpers.device_registry import DeviceInfo
 from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
-from homeassistant.helpers.update_coordinator import CoordinatorEntity

-from .const import DOMAIN
 from .coordinator import FreshrConfigEntry, FreshrReadingsCoordinator
+from .entity import FreshrEntity

 PARALLEL_UPDATES = 0
@@ -93,12 +91,6 @@ _TEMP = FreshrSensorEntityDescription(
     value_fn=lambda r: r.temp,
 )

-_DEVICE_TYPE_NAMES: dict[DeviceType, str] = {
-    DeviceType.FRESH_R: "Fresh-r",
-    DeviceType.FORWARD: "Fresh-r Forward",
-    DeviceType.MONITOR: "Fresh-r Monitor",
-}
-
 SENSOR_TYPES: dict[DeviceType, tuple[FreshrSensorEntityDescription, ...]] = {
     DeviceType.FRESH_R: (_T1, _T2, _CO2, _HUM, _FLOW, _DP),
     DeviceType.FORWARD: (_T1, _T2, _CO2, _HUM, _FLOW, _DP, _TEMP),
@@ -131,17 +123,10 @@ async def async_setup_entry(
         descriptions = SENSOR_TYPES.get(
             device.device_type, SENSOR_TYPES[DeviceType.FRESH_R]
         )
-        device_info = DeviceInfo(
-            identifiers={(DOMAIN, device_id)},
-            name=_DEVICE_TYPE_NAMES.get(device.device_type, "Fresh-r"),
-            serial_number=device_id,
-            manufacturer="Fresh-r",
-        )
         entities.extend(
             FreshrSensor(
                 config_entry.runtime_data.readings[device_id],
                 description,
-                device_info,
             )
             for description in descriptions
         )
@@ -151,22 +136,19 @@ async def async_setup_entry(
     config_entry.async_on_unload(coordinator.async_add_listener(_check_devices))


-class FreshrSensor(CoordinatorEntity[FreshrReadingsCoordinator], SensorEntity):
+class FreshrSensor(FreshrEntity, SensorEntity):
     """Representation of a Fresh-r sensor."""

-    _attr_has_entity_name = True
-
     entity_description: FreshrSensorEntityDescription

     def __init__(
         self,
         coordinator: FreshrReadingsCoordinator,
         description: FreshrSensorEntityDescription,
-        device_info: DeviceInfo,
     ) -> None:
         """Initialize the sensor."""
         super().__init__(coordinator)
         self.entity_description = description
-        self._attr_device_info = device_info
         self._attr_unique_id = f"{coordinator.device_id}_{description.key}"

     @property

@@ -2,7 +2,6 @@
 from __future__ import annotations

-import asyncio
 import logging

 from bleak.backends.device import BLEDevice
@@ -13,7 +12,8 @@ from gardena_bluetooth.exceptions import (
     CharacteristicNotFound,
     CommunicationFailure,
 )
-from gardena_bluetooth.parse import CharacteristicTime
+from gardena_bluetooth.parse import CharacteristicTime, ProductType
+from gardena_bluetooth.scan import async_get_manufacturer_data

 from homeassistant.components import bluetooth
 from homeassistant.const import CONF_ADDRESS, Platform
@@ -29,7 +29,6 @@ from .coordinator import (
     GardenaBluetoothConfigEntry,
     GardenaBluetoothCoordinator,
 )
-from .util import async_get_product_type

 PLATFORMS: list[Platform] = [
     Platform.BINARY_SENSOR,
@@ -76,11 +75,10 @@ async def async_setup_entry(
     address = entry.data[CONF_ADDRESS]

-    try:
-        async with asyncio.timeout(TIMEOUT):
-            product_type = await async_get_product_type(hass, address)
-    except TimeoutError as exception:
-        raise ConfigEntryNotReady("Unable to find product type") from exception
+    mfg_data = await async_get_manufacturer_data({address})
+    product_type = mfg_data[address].product_type
+    if product_type == ProductType.UNKNOWN:
+        raise ConfigEntryNotReady("Unable to find product type")

     client = Client(get_connection(hass, address), product_type)
     try:

View File

@@ -9,6 +9,7 @@ from gardena_bluetooth.client import Client
 from gardena_bluetooth.const import PRODUCT_NAMES, DeviceInformation, ScanService
 from gardena_bluetooth.exceptions import CharacteristicNotFound, CommunicationFailure
 from gardena_bluetooth.parse import ManufacturerData, ProductType
+from gardena_bluetooth.scan import async_get_manufacturer_data
 import voluptuous as vol

 from homeassistant.components.bluetooth import (

@@ -24,41 +25,27 @@ from .const import DOMAIN
 _LOGGER = logging.getLogger(__name__)

+_SUPPORTED_PRODUCT_TYPES = {
+    ProductType.PUMP,
+    ProductType.VALVE,
+    ProductType.WATER_COMPUTER,
+    ProductType.AUTOMATS,
+    ProductType.PRESSURE_TANKS,
+    ProductType.AQUA_CONTOURS,
+}


 def _is_supported(discovery_info: BluetoothServiceInfo):
     """Check if device is supported."""
     if ScanService not in discovery_info.service_uuids:
         return False

-    if not (data := discovery_info.manufacturer_data.get(ManufacturerData.company)):
+    if discovery_info.manufacturer_data.get(ManufacturerData.company) is None:
         _LOGGER.debug("Missing manufacturer data: %s", discovery_info)
         return False

-    manufacturer_data = ManufacturerData.decode(data)
-    product_type = ProductType.from_manufacturer_data(manufacturer_data)
-
-    if product_type not in (
-        ProductType.PUMP,
-        ProductType.VALVE,
-        ProductType.WATER_COMPUTER,
-        ProductType.AUTOMATS,
-        ProductType.PRESSURE_TANKS,
-        ProductType.AQUA_CONTOURS,
-    ):
-        _LOGGER.debug("Unsupported device: %s", manufacturer_data)
-        return False
-
     return True


-def _get_name(discovery_info: BluetoothServiceInfo):
-    data = discovery_info.manufacturer_data[ManufacturerData.company]
-    manufacturer_data = ManufacturerData.decode(data)
-    product_type = ProductType.from_manufacturer_data(manufacturer_data)
-    return PRODUCT_NAMES.get(product_type, "Gardena Device")


 class GardenaBluetoothConfigFlow(ConfigFlow, domain=DOMAIN):
     """Handle a config flow for Gardena Bluetooth."""

@@ -90,11 +77,13 @@ class GardenaBluetoothConfigFlow(ConfigFlow, domain=DOMAIN):
     ) -> ConfigFlowResult:
         """Handle the bluetooth discovery step."""
         _LOGGER.debug("Discovered device: %s", discovery_info)
-        if not _is_supported(discovery_info):
+        data = await async_get_manufacturer_data({discovery_info.address})
+        product_type = data[discovery_info.address].product_type
+        if product_type not in _SUPPORTED_PRODUCT_TYPES:
             return self.async_abort(reason="no_devices_found")

         self.address = discovery_info.address
-        self.devices = {discovery_info.address: _get_name(discovery_info)}
+        self.devices = {discovery_info.address: PRODUCT_NAMES[product_type]}
         await self.async_set_unique_id(self.address)
         self._abort_if_unique_id_configured()
         return await self.async_step_confirm()

@@ -131,12 +120,21 @@ class GardenaBluetoothConfigFlow(ConfigFlow, domain=DOMAIN):
             return await self.async_step_confirm()

         current_addresses = self._async_current_ids(include_ignore=False)
+        candidates = set()
         for discovery_info in async_discovered_service_info(self.hass):
             address = discovery_info.address
             if address in current_addresses or not _is_supported(discovery_info):
                 continue
-            self.devices[address] = _get_name(discovery_info)
+            candidates.add(address)
+
+        data = await async_get_manufacturer_data(candidates)
+        for address, mfg_data in data.items():
+            if mfg_data.product_type not in _SUPPORTED_PRODUCT_TYPES:
+                continue
+            self.devices[address] = PRODUCT_NAMES[mfg_data.product_type]
+
+        # Keep selection sorted by address to ensure stable tests
+        self.devices = dict(sorted(self.devices.items(), key=lambda x: x[0]))

         if not self.devices:
             return self.async_abort(reason="no_devices_found")

View File
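The config flow above filters discovered addresses against a supported-type set and sorts the resulting device map by address. A minimal, self-contained sketch of that selection logic (the `ProductType` enum and name table below are stand-ins for the real `gardena_bluetooth` types, reduced for illustration):

```python
from enum import Enum, auto


class ProductType(Enum):
    """Stand-in for gardena_bluetooth.parse.ProductType (reduced subset)."""

    PUMP = auto()
    VALVE = auto()
    UNKNOWN = auto()


SUPPORTED = {ProductType.PUMP, ProductType.VALVE}
PRODUCT_NAMES = {ProductType.PUMP: "Pump", ProductType.VALVE: "Valve"}


def build_device_map(candidates: dict[str, ProductType]) -> dict[str, str]:
    """Filter discovered addresses by supported type; sort by address."""
    devices = {
        address: PRODUCT_NAMES[product_type]
        for address, product_type in candidates.items()
        if product_type in SUPPORTED
    }
    # Sorting keeps the selection order stable across runs (and tests)
    return dict(sorted(devices.items()))


result = build_device_map(
    {
        "B0:02": ProductType.VALVE,
        "A0:01": ProductType.PUMP,
        "C0:03": ProductType.UNKNOWN,
    }
)
print(result)  # {'A0:01': 'Pump', 'B0:02': 'Valve'}
```

Unsupported product types simply drop out of the map; the flow then aborts with `no_devices_found` when the map ends up empty.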

@@ -15,5 +15,5 @@
"integration_type": "device", "integration_type": "device",
"iot_class": "local_polling", "iot_class": "local_polling",
"loggers": ["bleak", "bleak_esphome", "gardena_bluetooth"], "loggers": ["bleak", "bleak_esphome", "gardena_bluetooth"],
"requirements": ["gardena-bluetooth==2.3.0"] "requirements": ["gardena-bluetooth==2.4.0"]
} }

View File

@@ -1,51 +0,0 @@
"""Utility functions for Gardena Bluetooth integration."""
import asyncio
from collections.abc import AsyncIterator
from gardena_bluetooth.parse import ManufacturerData, ProductType
from homeassistant.components import bluetooth
async def _async_service_info(
hass, address
) -> AsyncIterator[bluetooth.BluetoothServiceInfoBleak]:
queue = asyncio.Queue[bluetooth.BluetoothServiceInfoBleak]()
def _callback(
service_info: bluetooth.BluetoothServiceInfoBleak,
change: bluetooth.BluetoothChange,
) -> None:
if change != bluetooth.BluetoothChange.ADVERTISEMENT:
return
queue.put_nowait(service_info)
service_info = bluetooth.async_last_service_info(hass, address, True)
if service_info:
yield service_info
cancel = bluetooth.async_register_callback(
hass,
_callback,
{bluetooth.match.ADDRESS: address},
bluetooth.BluetoothScanningMode.ACTIVE,
)
try:
while True:
yield await queue.get()
finally:
cancel()
async def async_get_product_type(hass, address: str) -> ProductType:
"""Wait for enough packets of manufacturer data to get the product type."""
data = ManufacturerData()
async for service_info in _async_service_info(hass, address):
data.update(service_info.manufacturer_data.get(ManufacturerData.company, b""))
product_type = ProductType.from_manufacturer_data(data)
if product_type is not ProductType.UNKNOWN:
return product_type
raise AssertionError("Iterator should have been infinite")

View File
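The deleted util.py bridged a callback-based Bluetooth scanner into an async iterator by pushing advertisements into an `asyncio.Queue` that an async generator drains. That bridge pattern can be sketched in isolation with just the standard library (`emit` below is a stand-in for the real scanner callback registration):

```python
import asyncio
from collections.abc import AsyncIterator, Callable


def make_bridge() -> tuple[Callable[[str], None], AsyncIterator[str]]:
    """Bridge a synchronous callback into an async iterator via a queue.

    Mirrors the removed util.py pattern: a scanner callback pushes items
    into an asyncio.Queue that an async generator consumes. `emit` stands
    in for the real Bluetooth advertisement callback.
    """
    queue: asyncio.Queue[str] = asyncio.Queue()

    def emit(item: str) -> None:
        # Safe to call from a callback context; never blocks
        queue.put_nowait(item)

    async def items() -> AsyncIterator[str]:
        while True:
            yield await queue.get()

    return emit, items()


async def main() -> list[str]:
    emit, stream = make_bridge()
    # In the real integration the scanner fires these asynchronously
    emit("adv-1")
    emit("adv-2")
    received: list[str] = []
    async for item in stream:
        received.append(item)
        if len(received) == 2:
            break
    return received


print(asyncio.run(main()))  # ['adv-1', 'adv-2']
```

With gardena-bluetooth 2.4.0 this accumulation loop moves into the library's `async_get_manufacturer_data`, which is why the helper could be dropped.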

@@ -7,5 +7,5 @@
"documentation": "https://www.home-assistant.io/integrations/generic", "documentation": "https://www.home-assistant.io/integrations/generic",
"integration_type": "device", "integration_type": "device",
"iot_class": "local_push", "iot_class": "local_push",
"requirements": ["av==16.0.1", "Pillow==12.1.1"] "requirements": ["av==16.0.1", "Pillow==12.2.0"]
} }

View File

@@ -91,10 +91,14 @@ from .const import (
     DATA_STORE,
     DATA_SUPERVISOR_INFO,
     DOMAIN,
-    HASSIO_UPDATE_INTERVAL,
+    HASSIO_MAIN_UPDATE_INTERVAL,
+    MAIN_COORDINATOR,
+    STATS_COORDINATOR,
 )
 from .coordinator import (
-    HassioDataUpdateCoordinator,
+    HassioAddOnDataUpdateCoordinator,
+    HassioMainDataUpdateCoordinator,
+    HassioStatsDataUpdateCoordinator,
     get_addons_info,
     get_addons_list,
     get_addons_stats,

@@ -384,12 +388,6 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:  # noqa:
         ]
         hass.data[DATA_SUPERVISOR_INFO]["addons"] = hass.data[DATA_ADDONS_LIST]

-        async_call_later(
-            hass,
-            HASSIO_UPDATE_INTERVAL,
-            HassJob(update_info_data, cancel_on_shutdown=True),
-        )
-
     # Fetch data
     update_info_task = hass.async_create_task(update_info_data(), eager_start=True)

@@ -436,7 +434,7 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:  # noqa:
             # os info not yet fetched from supervisor, retry later
             async_call_later(
                 hass,
-                HASSIO_UPDATE_INTERVAL,
+                HASSIO_MAIN_UPDATE_INTERVAL,
                 async_setup_hardware_integration_job,
             )
             return

@@ -462,9 +460,20 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:  # noqa:
 async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
     """Set up a config entry."""
     dev_reg = dr.async_get(hass)
-    coordinator = HassioDataUpdateCoordinator(hass, entry, dev_reg)
+
+    coordinator = HassioMainDataUpdateCoordinator(hass, entry, dev_reg)
     await coordinator.async_config_entry_first_refresh()
-    hass.data[ADDONS_COORDINATOR] = coordinator
+    hass.data[MAIN_COORDINATOR] = coordinator
+
+    addon_coordinator = HassioAddOnDataUpdateCoordinator(
+        hass, entry, dev_reg, coordinator.jobs
+    )
+    await addon_coordinator.async_config_entry_first_refresh()
+    hass.data[ADDONS_COORDINATOR] = addon_coordinator
+
+    stats_coordinator = HassioStatsDataUpdateCoordinator(hass, entry)
+    await stats_coordinator.async_config_entry_first_refresh()
+    hass.data[STATS_COORDINATOR] = stats_coordinator

     def deprecated_setup_issue() -> None:
         os_info = get_os_info(hass)

@@ -531,10 +540,12 @@ async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
     unload_ok = await hass.config_entries.async_unload_platforms(entry, PLATFORMS)

     # Unload coordinator
-    coordinator: HassioDataUpdateCoordinator = hass.data[ADDONS_COORDINATOR]
+    coordinator: HassioMainDataUpdateCoordinator = hass.data[MAIN_COORDINATOR]
     coordinator.unload()

-    # Pop coordinator
+    # Pop coordinators
+    hass.data.pop(MAIN_COORDINATOR, None)
     hass.data.pop(ADDONS_COORDINATOR, None)
+    hass.data.pop(STATS_COORDINATOR, None)

     return unload_ok

View File

@@ -22,6 +22,7 @@ from .const import (
     ATTR_STATE,
     DATA_KEY_ADDONS,
     DATA_KEY_MOUNTS,
+    MAIN_COORDINATOR,
 )
 from .entity import HassioAddonEntity, HassioMountEntity

@@ -60,17 +61,18 @@ async def async_setup_entry(
     async_add_entities: AddConfigEntryEntitiesCallback,
 ) -> None:
     """Binary sensor set up for Hass.io config entry."""
-    coordinator = hass.data[ADDONS_COORDINATOR]
+    addons_coordinator = hass.data[ADDONS_COORDINATOR]
+    coordinator = hass.data[MAIN_COORDINATOR]

     async_add_entities(
         itertools.chain(
             [
                 HassioAddonBinarySensor(
                     addon=addon,
-                    coordinator=coordinator,
+                    coordinator=addons_coordinator,
                     entity_description=entity_description,
                 )
-                for addon in coordinator.data[DATA_KEY_ADDONS].values()
+                for addon in addons_coordinator.data[DATA_KEY_ADDONS].values()
                 for entity_description in ADDON_ENTITY_DESCRIPTIONS
             ],
             [

View File

@@ -77,7 +77,9 @@ EVENT_JOB = "job"
 UPDATE_KEY_SUPERVISOR = "supervisor"
 STARTUP_COMPLETE = "complete"

+MAIN_COORDINATOR = "hassio_main_coordinator"
 ADDONS_COORDINATOR = "hassio_addons_coordinator"
+STATS_COORDINATOR = "hassio_stats_coordinator"

 DATA_COMPONENT: HassKey[HassIO] = HassKey(DOMAIN)

@@ -94,7 +96,9 @@ DATA_SUPERVISOR_STATS = "hassio_supervisor_stats"
 DATA_ADDONS_INFO = "hassio_addons_info"
 DATA_ADDONS_STATS = "hassio_addons_stats"
 DATA_ADDONS_LIST = "hassio_addons_list"
-HASSIO_UPDATE_INTERVAL = timedelta(minutes=5)
+HASSIO_MAIN_UPDATE_INTERVAL = timedelta(minutes=5)
+HASSIO_ADDON_UPDATE_INTERVAL = timedelta(minutes=15)
+HASSIO_STATS_UPDATE_INTERVAL = timedelta(seconds=60)

 ATTR_AUTO_UPDATE = "auto_update"
 ATTR_VERSION = "version"

View File

@@ -7,7 +7,7 @@ from collections import defaultdict
 from collections.abc import Awaitable
 from copy import deepcopy
 import logging
-from typing import TYPE_CHECKING, Any, cast
+from typing import TYPE_CHECKING, Any

 from aiohasupervisor import SupervisorError, SupervisorNotFoundError
 from aiohasupervisor.models import (

@@ -15,9 +15,9 @@ from aiohasupervisor.models import (
     CIFSMountResponse,
     InstalledAddon,
     NFSMountResponse,
-    ResponseData,
     StoreInfo,
 )
+from aiohasupervisor.models.base import ResponseData

 from homeassistant.config_entries import ConfigEntry
 from homeassistant.const import ATTR_MANUFACTURER, ATTR_NAME
@@ -35,13 +35,11 @@ from .const import (
     ATTR_SLUG,
     ATTR_URL,
     ATTR_VERSION,
-    CONTAINER_INFO,
     CONTAINER_STATS,
     CORE_CONTAINER,
     DATA_ADDONS_INFO,
     DATA_ADDONS_LIST,
     DATA_ADDONS_STATS,
-    DATA_COMPONENT,
     DATA_CORE_INFO,
     DATA_CORE_STATS,
     DATA_HOST_INFO,

@@ -59,7 +57,9 @@ from .const import (
     DATA_SUPERVISOR_INFO,
     DATA_SUPERVISOR_STATS,
     DOMAIN,
-    HASSIO_UPDATE_INTERVAL,
+    HASSIO_ADDON_UPDATE_INTERVAL,
+    HASSIO_MAIN_UPDATE_INTERVAL,
+    HASSIO_STATS_UPDATE_INTERVAL,
     REQUEST_REFRESH_DELAY,
     SUPERVISOR_CONTAINER,
     SupervisorEntityModel,
@@ -318,7 +318,314 @@ def async_remove_devices_from_dev_reg(
         dev_reg.async_remove_device(dev.id)


-class HassioDataUpdateCoordinator(DataUpdateCoordinator):
+class HassioStatsDataUpdateCoordinator(DataUpdateCoordinator[dict[str, Any]]):
"""Class to retrieve Hass.io container stats."""
config_entry: ConfigEntry
def __init__(self, hass: HomeAssistant, config_entry: ConfigEntry) -> None:
"""Initialize coordinator."""
super().__init__(
hass,
_LOGGER,
config_entry=config_entry,
name=DOMAIN,
update_interval=HASSIO_STATS_UPDATE_INTERVAL,
request_refresh_debouncer=Debouncer(
hass, _LOGGER, cooldown=REQUEST_REFRESH_DELAY, immediate=False
),
)
self.supervisor_client = get_supervisor_client(hass)
self._container_updates: defaultdict[str, dict[str, set[str]]] = defaultdict(
lambda: defaultdict(set)
)
async def _async_update_data(self) -> dict[str, Any]:
"""Update stats data via library."""
try:
await self._fetch_stats()
except SupervisorError as err:
raise UpdateFailed(f"Error on Supervisor API: {err}") from err
new_data: dict[str, Any] = {}
new_data[DATA_KEY_CORE] = get_core_stats(self.hass)
new_data[DATA_KEY_SUPERVISOR] = get_supervisor_stats(self.hass)
new_data[DATA_KEY_ADDONS] = get_addons_stats(self.hass)
return new_data
async def _fetch_stats(self) -> None:
"""Fetch container stats for subscribed entities."""
container_updates = self._container_updates
data = self.hass.data
client = self.supervisor_client
# Fetch core and supervisor stats
updates: dict[str, Awaitable] = {}
if container_updates.get(CORE_CONTAINER, {}).get(CONTAINER_STATS):
updates[DATA_CORE_STATS] = client.homeassistant.stats()
if container_updates.get(SUPERVISOR_CONTAINER, {}).get(CONTAINER_STATS):
updates[DATA_SUPERVISOR_STATS] = client.supervisor.stats()
if updates:
api_results: list[ResponseData] = await asyncio.gather(*updates.values())
for key, result in zip(updates, api_results, strict=True):
data[key] = result.to_dict()
# Fetch addon stats
addons_list = get_addons_list(self.hass) or []
started_addons = {
addon[ATTR_SLUG]
for addon in addons_list
if addon.get("state") in {AddonState.STARTED, AddonState.STARTUP}
}
addons_stats: dict[str, Any] = data.setdefault(DATA_ADDONS_STATS, {})
# Clean up cache for stopped/removed addons
for slug in addons_stats.keys() - started_addons:
del addons_stats[slug]
# Fetch stats for addons with subscribed entities
addon_stats_results = dict(
await asyncio.gather(
*[
self._update_addon_stats(slug)
for slug in started_addons
if container_updates.get(slug, {}).get(CONTAINER_STATS)
]
)
)
addons_stats.update(addon_stats_results)
async def _update_addon_stats(self, slug: str) -> tuple[str, dict[str, Any] | None]:
"""Update single addon stats."""
try:
stats = await self.supervisor_client.addons.addon_stats(slug)
except SupervisorError as err:
_LOGGER.warning("Could not fetch stats for %s: %s", slug, err)
return (slug, None)
return (slug, stats.to_dict())
@callback
def async_enable_container_updates(
self, slug: str, entity_id: str, types: set[str]
) -> CALLBACK_TYPE:
"""Enable stats updates for a container."""
enabled_updates = self._container_updates[slug]
for key in types:
enabled_updates[key].add(entity_id)
@callback
def _remove() -> None:
for key in types:
enabled_updates[key].discard(entity_id)
if not enabled_updates[key]:
del enabled_updates[key]
if not enabled_updates:
self._container_updates.pop(slug, None)
return _remove
+
+
+class HassioAddOnDataUpdateCoordinator(DataUpdateCoordinator[dict[str, Any]]):
+    """Class to retrieve Hass.io Add-on status."""
+
+    config_entry: ConfigEntry
+
+    def __init__(
+        self,
+        hass: HomeAssistant,
+        config_entry: ConfigEntry,
+        dev_reg: dr.DeviceRegistry,
+        jobs: SupervisorJobs,
+    ) -> None:
+        """Initialize coordinator."""
+        super().__init__(
+            hass,
+            _LOGGER,
+            config_entry=config_entry,
+            name=DOMAIN,
+            update_interval=HASSIO_ADDON_UPDATE_INTERVAL,
+            # We don't want an immediate refresh since we want to avoid
+            # hammering the Supervisor API on startup
+            request_refresh_debouncer=Debouncer(
+                hass, _LOGGER, cooldown=REQUEST_REFRESH_DELAY, immediate=False
+            ),
+        )
+        self.entry_id = config_entry.entry_id
+        self.dev_reg = dev_reg
+        self._addon_info_subscriptions: defaultdict[str, set[str]] = defaultdict(set)
+        self.supervisor_client = get_supervisor_client(hass)
+        self.jobs = jobs
+
+    async def _async_update_data(self) -> dict[str, Any]:
+        """Update data via library."""
+        is_first_update = not self.data
+        client = self.supervisor_client
+        try:
+            installed_addons: list[InstalledAddon] = await client.addons.list()
+            all_addons = {addon.slug for addon in installed_addons}
+            # Fetch addon info for all addons on first update, or only
+            # for addons with subscribed entities on subsequent updates.
+            addon_info_results = dict(
+                await asyncio.gather(
+                    *[
+                        self._update_addon_info(slug)
+                        for slug in all_addons
+                        if is_first_update or self._addon_info_subscriptions.get(slug)
+                    ]
+                )
+            )
+        except SupervisorError as err:
+            raise UpdateFailed(f"Error on Supervisor API: {err}") from err
+
+        # Update hass.data for legacy accessor functions
+        data = self.hass.data
+        addons_list_dicts = [addon.to_dict() for addon in installed_addons]
+        data[DATA_ADDONS_LIST] = addons_list_dicts
+
+        # Update addon info cache in hass.data
+        addon_info_cache: dict[str, Any] = data.setdefault(DATA_ADDONS_INFO, {})
+        for slug in addon_info_cache.keys() - all_addons:
+            del addon_info_cache[slug]
+        addon_info_cache.update(addon_info_results)
+
+        # Deprecated 2026.4.0: Folding addons.list results into supervisor_info
+        # for compatibility. Written to hass.data only, not coordinator data.
+        if DATA_SUPERVISOR_INFO in data:
+            data[DATA_SUPERVISOR_INFO]["addons"] = addons_list_dicts
+
+        # Build clean coordinator data
+        store_data = get_store(self.hass)
+        if store_data:
+            repositories = {
+                repo.slug: repo.name
+                for repo in StoreInfo.from_dict(store_data).repositories
+            }
+        else:
+            repositories = {}
+
+        new_data: dict[str, Any] = {}
+        new_data[DATA_KEY_ADDONS] = {
+            (slug := addon[ATTR_SLUG]): {
+                **addon,
+                ATTR_AUTO_UPDATE: (addon_info_cache.get(slug) or {}).get(
+                    ATTR_AUTO_UPDATE, False
+                ),
+                ATTR_REPOSITORY: repositories.get(
+                    repo_slug := addon.get(ATTR_REPOSITORY, ""), repo_slug
+                ),
+            }
+            for addon in addons_list_dicts
+        }
+
+        # If this is the initial refresh, register all addons
+        if is_first_update:
+            async_register_addons_in_dev_reg(
+                self.entry_id, self.dev_reg, new_data[DATA_KEY_ADDONS].values()
+            )
+
+        # Remove add-ons that are no longer installed from device registry
+        supervisor_addon_devices = {
+            list(device.identifiers)[0][1]
+            for device in self.dev_reg.devices.get_devices_for_config_entry_id(
+                self.entry_id
+            )
+            if device.model == SupervisorEntityModel.ADDON
+        }
+        if stale_addons := supervisor_addon_devices - set(new_data[DATA_KEY_ADDONS]):
+            async_remove_devices_from_dev_reg(self.dev_reg, stale_addons)
+
+        # If there are new add-ons, we should reload the config entry so we can
+        # create new devices and entities. We can return an empty dict because
+        # coordinator will be recreated.
+        if self.data and (
+            set(new_data[DATA_KEY_ADDONS]) - set(self.data[DATA_KEY_ADDONS])
+        ):
+            self.hass.async_create_task(
+                self.hass.config_entries.async_reload(self.entry_id)
+            )
+            return {}
+
+        return new_data
+
+    async def get_changelog(self, addon_slug: str) -> str | None:
+        """Get the changelog for an add-on."""
+        try:
+            return await self.supervisor_client.store.addon_changelog(addon_slug)
+        except SupervisorNotFoundError:
+            return None
+
+    async def _update_addon_info(self, slug: str) -> tuple[str, dict[str, Any] | None]:
+        """Return the info for an addon."""
+        try:
+            info = await self.supervisor_client.addons.addon_info(slug)
+        except SupervisorError as err:
+            _LOGGER.warning("Could not fetch info for %s: %s", slug, err)
+            return (slug, None)
+        # Translate to legacy hassio names for compatibility
+        info_dict = info.to_dict()
+        info_dict["hassio_api"] = info_dict.pop("supervisor_api")
+        info_dict["hassio_role"] = info_dict.pop("supervisor_role")
+        return (slug, info_dict)
+
+    @callback
+    def async_enable_addon_info_updates(
+        self, slug: str, entity_id: str
+    ) -> CALLBACK_TYPE:
+        """Enable info updates for an add-on."""
+        self._addon_info_subscriptions[slug].add(entity_id)
+
+        @callback
+        def _remove() -> None:
+            self._addon_info_subscriptions[slug].discard(entity_id)
+            if not self._addon_info_subscriptions[slug]:
+                del self._addon_info_subscriptions[slug]
+
+        return _remove
+
+    async def _async_refresh(
+        self,
+        log_failures: bool = True,
+        raise_on_auth_failed: bool = False,
+        scheduled: bool = False,
+        raise_on_entry_error: bool = False,
+    ) -> None:
+        """Refresh data."""
+        if not scheduled and not raise_on_auth_failed:
+            # Force reloading add-on updates for non-scheduled
+            # updates.
+            #
+            # If `raise_on_auth_failed` is set, it means this is
+            # the first refresh and we do not want to delay
+            # startup or cause a timeout so we only refresh the
+            # updates if this is not a scheduled refresh and
+            # we are not doing the first refresh.
+            try:
+                await self.supervisor_client.store.reload()
+            except SupervisorError as err:
+                _LOGGER.warning("Error on Supervisor API: %s", err)
+        await super()._async_refresh(
+            log_failures, raise_on_auth_failed, scheduled, raise_on_entry_error
+        )
+
+    async def force_addon_info_data_refresh(self, addon_slug: str) -> None:
+        """Force refresh of addon info data for a specific addon."""
+        try:
+            slug, info = await self._update_addon_info(addon_slug)
+            if info is not None and DATA_KEY_ADDONS in self.data:
+                if slug in self.data[DATA_KEY_ADDONS]:
+                    data = deepcopy(self.data)
+                    data[DATA_KEY_ADDONS][slug].update(info)
+                    self.async_set_updated_data(data)
+        except SupervisorError as err:
+            _LOGGER.warning("Could not refresh info for %s: %s", addon_slug, err)
+
+
+class HassioMainDataUpdateCoordinator(DataUpdateCoordinator[dict[str, Any]]):
"""Class to retrieve Hass.io status.""" """Class to retrieve Hass.io status."""
config_entry: ConfigEntry config_entry: ConfigEntry
@@ -332,82 +639,77 @@ class HassioDataUpdateCoordinator(DataUpdateCoordinator):
             _LOGGER,
             config_entry=config_entry,
             name=DOMAIN,
-            update_interval=HASSIO_UPDATE_INTERVAL,
+            update_interval=HASSIO_MAIN_UPDATE_INTERVAL,
             # We don't want an immediate refresh since we want to avoid
-            # fetching the container stats right away and avoid hammering
-            # the Supervisor API on startup
+            # hammering the Supervisor API on startup
             request_refresh_debouncer=Debouncer(
                 hass, _LOGGER, cooldown=REQUEST_REFRESH_DELAY, immediate=False
            ),
         )
-        self.hassio = hass.data[DATA_COMPONENT]
-        self.data = {}
         self.entry_id = config_entry.entry_id
         self.dev_reg = dev_reg
         self.is_hass_os = (get_info(self.hass) or {}).get("hassos") is not None
-        self._container_updates: defaultdict[str, dict[str, set[str]]] = defaultdict(
-            lambda: defaultdict(set)
-        )
         self.supervisor_client = get_supervisor_client(hass)
         self.jobs = SupervisorJobs(hass)
     async def _async_update_data(self) -> dict[str, Any]:
         """Update data via library."""
         is_first_update = not self.data
+        client = self.supervisor_client

         try:
-            await self.force_data_refresh(is_first_update)
+            (
+                info,
+                core_info,
+                supervisor_info,
+                os_info,
+                host_info,
+                store_info,
+                network_info,
+            ) = await asyncio.gather(
+                client.info(),
+                client.homeassistant.info(),
+                client.supervisor.info(),
+                client.os.info(),
+                client.host.info(),
+                client.store.info(),
+                client.network.info(),
+            )
+            mounts_info = await client.mounts.info()
+            await self.jobs.refresh_data(is_first_update)
         except SupervisorError as err:
             raise UpdateFailed(f"Error on Supervisor API: {err}") from err

+        # Build clean coordinator data
         new_data: dict[str, Any] = {}
-        supervisor_info = get_supervisor_info(self.hass) or {}
-        addons_info = get_addons_info(self.hass) or {}
-        addons_stats = get_addons_stats(self.hass)
-        store_data = get_store(self.hass)
-        mounts_info = await self.supervisor_client.mounts.info()
-        addons_list = get_addons_list(self.hass) or []
-
-        if store_data:
-            repositories = {
-                repo.slug: repo.name
-                for repo in StoreInfo.from_dict(store_data).repositories
-            }
-        else:
-            repositories = {}
-
-        new_data[DATA_KEY_ADDONS] = {
-            (slug := addon[ATTR_SLUG]): {
-                **addon,
-                **(addons_stats.get(slug) or {}),
-                ATTR_AUTO_UPDATE: (addons_info.get(slug) or {}).get(
-                    ATTR_AUTO_UPDATE, False
-                ),
-                ATTR_REPOSITORY: repositories.get(
-                    repo_slug := addon.get(ATTR_REPOSITORY, ""), repo_slug
-                ),
-            }
-            for addon in addons_list
-        }
-        if self.is_hass_os:
-            new_data[DATA_KEY_OS] = get_os_info(self.hass)
-        new_data[DATA_KEY_CORE] = {
-            **(get_core_info(self.hass) or {}),
-            **get_core_stats(self.hass),
-        }
-        new_data[DATA_KEY_SUPERVISOR] = {
-            **supervisor_info,
-            **get_supervisor_stats(self.hass),
-        }
-        new_data[DATA_KEY_HOST] = get_host_info(self.hass) or {}
+        new_data[DATA_KEY_CORE] = core_info.to_dict()
+        new_data[DATA_KEY_SUPERVISOR] = supervisor_info.to_dict()
+        new_data[DATA_KEY_HOST] = host_info.to_dict()
         new_data[DATA_KEY_MOUNTS] = {mount.name: mount for mount in mounts_info.mounts}
+        if self.is_hass_os:
+            new_data[DATA_KEY_OS] = os_info.to_dict()
-        # If this is the initial refresh, register all addons and return the dict
+        # Update hass.data for legacy accessor functions
+        data = self.hass.data
+        data[DATA_INFO] = info.to_dict()
+        data[DATA_CORE_INFO] = new_data[DATA_KEY_CORE]
+        data[DATA_OS_INFO] = new_data.get(DATA_KEY_OS, os_info.to_dict())
+        data[DATA_HOST_INFO] = new_data[DATA_KEY_HOST]
+        data[DATA_STORE] = store_info.to_dict()
+        data[DATA_NETWORK_INFO] = network_info.to_dict()
+
+        # Separate dict for hass.data supervisor info since we add deprecated
+        # compat keys that should not be in coordinator data
+        supervisor_info_dict = supervisor_info.to_dict()
+
+        # Deprecated 2026.4.0: Folding repositories and addons into
+        # supervisor_info for compatibility. Written to hass.data only, not
+        # coordinator data. Preserve the addons key from the addon coordinator.
+        supervisor_info_dict["repositories"] = data[DATA_STORE][ATTR_REPOSITORIES]
+        if (prev := data.get(DATA_SUPERVISOR_INFO)) and "addons" in prev:
+            supervisor_info_dict["addons"] = prev["addons"]
+        data[DATA_SUPERVISOR_INFO] = supervisor_info_dict
+
+        # If this is the initial refresh, register all main components
         if is_first_update:
-            async_register_addons_in_dev_reg(
-                self.entry_id, self.dev_reg, new_data[DATA_KEY_ADDONS].values()
-            )
             async_register_mounts_in_dev_reg(
                 self.entry_id, self.dev_reg, new_data[DATA_KEY_MOUNTS].values()
             )
@@ -423,17 +725,6 @@ class HassioDataUpdateCoordinator(DataUpdateCoordinator):
                 self.entry_id, self.dev_reg, new_data[DATA_KEY_OS]
             )

-        # Remove add-ons that are no longer installed from device registry
-        supervisor_addon_devices = {
-            list(device.identifiers)[0][1]
-            for device in self.dev_reg.devices.get_devices_for_config_entry_id(
-                self.entry_id
-            )
-            if device.model == SupervisorEntityModel.ADDON
-        }
-        if stale_addons := supervisor_addon_devices - set(new_data[DATA_KEY_ADDONS]):
-            async_remove_devices_from_dev_reg(self.dev_reg, stale_addons)
-
         # Remove mounts that no longer exists from device registry
         supervisor_mount_devices = {
             device.name
@@ -453,12 +744,11 @@ class HassioDataUpdateCoordinator(DataUpdateCoordinator):
             # Remove the OS device if it exists and the installation is not hassos
             self.dev_reg.async_remove_device(dev.id)

-        # If there are new add-ons or mounts, we should reload the config entry so we can
+        # If there are new mounts, we should reload the config entry so we can
         # create new devices and entities. We can return an empty dict because
         # coordinator will be recreated.
         if self.data and (
-            set(new_data[DATA_KEY_ADDONS]) - set(self.data[DATA_KEY_ADDONS])
-            or set(new_data[DATA_KEY_MOUNTS]) - set(self.data[DATA_KEY_MOUNTS])
+            set(new_data[DATA_KEY_MOUNTS]) - set(self.data.get(DATA_KEY_MOUNTS, {}))
         ):
             self.hass.async_create_task(
                 self.hass.config_entries.async_reload(self.entry_id)
@@ -467,146 +757,6 @@ class HassioDataUpdateCoordinator(DataUpdateCoordinator):
         return new_data
-
-    async def get_changelog(self, addon_slug: str) -> str | None:
-        """Get the changelog for an add-on."""
-        try:
-            return await self.supervisor_client.store.addon_changelog(addon_slug)
-        except SupervisorNotFoundError:
-            return None
-
-    async def force_data_refresh(self, first_update: bool) -> None:
-        """Force update of the addon info."""
-        container_updates = self._container_updates
-        data = self.hass.data
-        client = self.supervisor_client
-
-        updates: dict[str, Awaitable[ResponseData]] = {
-            DATA_INFO: client.info(),
-            DATA_CORE_INFO: client.homeassistant.info(),
-            DATA_SUPERVISOR_INFO: client.supervisor.info(),
-            DATA_OS_INFO: client.os.info(),
-            DATA_STORE: client.store.info(),
-        }
-        if CONTAINER_STATS in container_updates[CORE_CONTAINER]:
-            updates[DATA_CORE_STATS] = client.homeassistant.stats()
-        if CONTAINER_STATS in container_updates[SUPERVISOR_CONTAINER]:
-            updates[DATA_SUPERVISOR_STATS] = client.supervisor.stats()
-
-        # Pull off addons.list results for further processing before caching
-        addons_list, *results = await asyncio.gather(
-            client.addons.list(), *updates.values()
-        )
-        for key, result in zip(updates, cast(list[ResponseData], results), strict=True):
-            data[key] = result.to_dict()
-        installed_addons = cast(list[InstalledAddon], addons_list)
-        data[DATA_ADDONS_LIST] = [addon.to_dict() for addon in installed_addons]
-
-        # Deprecated 2026.4.0: Folding repositories and addons.list results into supervisor_info for compatibility
-        # Can drop this after removal period
-        data[DATA_SUPERVISOR_INFO].update(
-            {
-                "repositories": data[DATA_STORE][ATTR_REPOSITORIES],
-                "addons": [addon.to_dict() for addon in installed_addons],
-            }
-        )
-
-        all_addons = {addon.slug for addon in installed_addons}
-        started_addons = {
-            addon.slug
-            for addon in installed_addons
-            if addon.state in {AddonState.STARTED, AddonState.STARTUP}
-        }
-        #
-        # Update addon info if its the first update or
-        # there is at least one entity that needs the data.
-        #
-        # When entities are added they call async_enable_container_updates
-        # to enable updates for the endpoints they need via
-        # async_added_to_hass. This ensures that we only update
-        # the data for the endpoints that are needed to avoid unnecessary
-        # API calls since otherwise we would fetch stats for all containers
-        # and throw them away.
-        #
-        for data_key, update_func, enabled_key, wanted_addons, needs_first_update in (
-            (
-                DATA_ADDONS_STATS,
-                self._update_addon_stats,
-                CONTAINER_STATS,
-                started_addons,
-                False,
-            ),
-            (
-                DATA_ADDONS_INFO,
-                self._update_addon_info,
-                CONTAINER_INFO,
-                all_addons,
-                True,
-            ),
-        ):
-            container_data: dict[str, Any] = data.setdefault(data_key, {})
-            # Clean up cache
-            for slug in container_data.keys() - wanted_addons:
-                del container_data[slug]
-            # Update cache from API
-            container_data.update(
-                dict(
-                    await asyncio.gather(
-                        *[
-                            update_func(slug)
-                            for slug in wanted_addons
-                            if (first_update and needs_first_update)
or enabled_key in container_updates[slug]
]
)
)
)
# Refresh jobs data
await self.jobs.refresh_data(first_update)
async def _update_addon_stats(self, slug: str) -> tuple[str, dict[str, Any] | None]:
"""Update single addon stats."""
try:
stats = await self.supervisor_client.addons.addon_stats(slug)
except SupervisorError as err:
_LOGGER.warning("Could not fetch stats for %s: %s", slug, err)
return (slug, None)
return (slug, stats.to_dict())
async def _update_addon_info(self, slug: str) -> tuple[str, dict[str, Any] | None]:
"""Return the info for an addon."""
try:
info = await self.supervisor_client.addons.addon_info(slug)
except SupervisorError as err:
_LOGGER.warning("Could not fetch info for %s: %s", slug, err)
return (slug, None)
# Translate to legacy hassio names for compatibility
info_dict = info.to_dict()
info_dict["hassio_api"] = info_dict.pop("supervisor_api")
info_dict["hassio_role"] = info_dict.pop("supervisor_role")
return (slug, info_dict)
@callback
def async_enable_container_updates(
self, slug: str, entity_id: str, types: set[str]
) -> CALLBACK_TYPE:
"""Enable updates for an add-on."""
enabled_updates = self._container_updates[slug]
for key in types:
enabled_updates[key].add(entity_id)
@callback
def _remove() -> None:
for key in types:
enabled_updates[key].remove(entity_id)
return _remove
async def _async_refresh( async def _async_refresh(
self, self,
log_failures: bool = True, log_failures: bool = True,
@@ -616,14 +766,16 @@ class HassioDataUpdateCoordinator(DataUpdateCoordinator):
) -> None: ) -> None:
"""Refresh data.""" """Refresh data."""
if not scheduled and not raise_on_auth_failed: if not scheduled and not raise_on_auth_failed:
# Force refreshing updates for non-scheduled updates # Force reloading updates of main components for
# non-scheduled updates.
#
# If `raise_on_auth_failed` is set, it means this is # If `raise_on_auth_failed` is set, it means this is
# the first refresh and we do not want to delay # the first refresh and we do not want to delay
# startup or cause a timeout so we only refresh the # startup or cause a timeout so we only refresh the
# updates if this is not a scheduled refresh and # updates if this is not a scheduled refresh and
# we are not doing the first refresh. # we are not doing the first refresh.
try: try:
await self.supervisor_client.refresh_updates() await self.supervisor_client.reload_updates()
except SupervisorError as err: except SupervisorError as err:
_LOGGER.warning("Error on Supervisor API: %s", err) _LOGGER.warning("Error on Supervisor API: %s", err)
@@ -631,18 +783,6 @@ class HassioDataUpdateCoordinator(DataUpdateCoordinator):
log_failures, raise_on_auth_failed, scheduled, raise_on_entry_error log_failures, raise_on_auth_failed, scheduled, raise_on_entry_error
) )
async def force_addon_info_data_refresh(self, addon_slug: str) -> None:
"""Force refresh of addon info data for a specific addon."""
try:
slug, info = await self._update_addon_info(addon_slug)
if info is not None and DATA_KEY_ADDONS in self.data:
if slug in self.data[DATA_KEY_ADDONS]:
data = deepcopy(self.data)
data[DATA_KEY_ADDONS][slug].update(info)
self.async_set_updated_data(data)
except SupervisorError as err:
_LOGGER.warning("Could not refresh info for %s: %s", addon_slug, err)
@callback @callback
def unload(self) -> None: def unload(self) -> None:
"""Clean up when config entry unloaded.""" """Clean up when config entry unloaded."""

View File

@@ -11,8 +11,12 @@ from homeassistant.config_entries import ConfigEntry
 from homeassistant.core import HomeAssistant
 from homeassistant.helpers import device_registry as dr, entity_registry as er
-from .const import ADDONS_COORDINATOR
-from .coordinator import HassioDataUpdateCoordinator
+from .const import ADDONS_COORDINATOR, MAIN_COORDINATOR, STATS_COORDINATOR
+from .coordinator import (
+HassioAddOnDataUpdateCoordinator,
+HassioMainDataUpdateCoordinator,
+HassioStatsDataUpdateCoordinator,
+)
 async def async_get_config_entry_diagnostics(
@@ -20,7 +24,9 @@ async def async_get_config_entry_diagnostics(
 config_entry: ConfigEntry,
 ) -> dict[str, Any]:
 """Return diagnostics for a config entry."""
-coordinator: HassioDataUpdateCoordinator = hass.data[ADDONS_COORDINATOR]
+coordinator: HassioMainDataUpdateCoordinator = hass.data[MAIN_COORDINATOR]
+addons_coordinator: HassioAddOnDataUpdateCoordinator = hass.data[ADDONS_COORDINATOR]
+stats_coordinator: HassioStatsDataUpdateCoordinator = hass.data[STATS_COORDINATOR]
 device_registry = dr.async_get(hass)
 entity_registry = er.async_get(hass)
@@ -53,5 +59,7 @@ async def async_get_config_entry_diagnostics(
 return {
 "coordinator_data": coordinator.data,
+"addons_coordinator_data": addons_coordinator.data,
+"stats_coordinator_data": stats_coordinator.data,
 "devices": devices,
 }

View File

@@ -13,7 +13,6 @@ from homeassistant.helpers.update_coordinator import CoordinatorEntity
 from .const import (
 ATTR_SLUG,
 CONTAINER_STATS,
-CORE_CONTAINER,
 DATA_KEY_ADDONS,
 DATA_KEY_CORE,
 DATA_KEY_HOST,
@@ -21,20 +20,79 @@ from .const import (
 DATA_KEY_OS,
 DATA_KEY_SUPERVISOR,
 DOMAIN,
-KEY_TO_UPDATE_TYPES,
-SUPERVISOR_CONTAINER,
 )
-from .coordinator import HassioDataUpdateCoordinator
+from .coordinator import (
+HassioAddOnDataUpdateCoordinator,
+HassioMainDataUpdateCoordinator,
+HassioStatsDataUpdateCoordinator,
+)
-class HassioAddonEntity(CoordinatorEntity[HassioDataUpdateCoordinator]):
+class HassioStatsEntity(CoordinatorEntity[HassioStatsDataUpdateCoordinator]):
+"""Base entity for container stats (CPU, memory)."""
+_attr_has_entity_name = True
+def __init__(
+self,
+coordinator: HassioStatsDataUpdateCoordinator,
+entity_description: EntityDescription,
+*,
+container_id: str,
+data_key: str,
+device_id: str,
+unique_id_prefix: str,
+) -> None:
+"""Initialize base entity."""
+super().__init__(coordinator)
+self.entity_description = entity_description
+self._container_id = container_id
+self._data_key = data_key
+self._attr_unique_id = f"{unique_id_prefix}_{entity_description.key}"
+self._attr_device_info = DeviceInfo(identifiers={(DOMAIN, device_id)})
+@property
+def available(self) -> bool:
+"""Return True if entity is available."""
+if self._data_key == DATA_KEY_ADDONS:
+return (
+super().available
+and DATA_KEY_ADDONS in self.coordinator.data
+and self.entity_description.key
+in (
+self.coordinator.data[DATA_KEY_ADDONS].get(self._container_id) or {}
+)
+)
+return (
+super().available
+and self._data_key in self.coordinator.data
+and self.entity_description.key in self.coordinator.data[self._data_key]
+)
+async def async_added_to_hass(self) -> None:
+"""Subscribe to stats updates."""
+await super().async_added_to_hass()
+self.async_on_remove(
+self.coordinator.async_enable_container_updates(
+self._container_id, self.entity_id, {CONTAINER_STATS}
+)
+)
+# Stats are only fetched for containers with subscribed entities.
+# The first coordinator refresh (before entities exist) has no
+# subscribers, so no stats are fetched. Schedule a debounced
+# refresh so that all stats entities registering during platform
+# setup are batched into a single API call.
+await self.coordinator.async_request_refresh()
+class HassioAddonEntity(CoordinatorEntity[HassioAddOnDataUpdateCoordinator]):
 """Base entity for a Hass.io add-on."""
 _attr_has_entity_name = True
 def __init__(
 self,
-coordinator: HassioDataUpdateCoordinator,
+coordinator: HassioAddOnDataUpdateCoordinator,
 entity_description: EntityDescription,
 addon: dict[str, Any],
 ) -> None:
@@ -56,26 +114,23 @@ class HassioAddonEntity(CoordinatorEntity[HassioDataUpdateCoordinator]):
 )
 async def async_added_to_hass(self) -> None:
-"""Subscribe to updates."""
+"""Subscribe to addon info updates."""
 await super().async_added_to_hass()
-update_types = KEY_TO_UPDATE_TYPES[self.entity_description.key]
 self.async_on_remove(
-self.coordinator.async_enable_container_updates(
-self._addon_slug, self.entity_id, update_types
+self.coordinator.async_enable_addon_info_updates(
+self._addon_slug, self.entity_id
 )
 )
-if CONTAINER_STATS in update_types:
-await self.coordinator.async_request_refresh()
-class HassioOSEntity(CoordinatorEntity[HassioDataUpdateCoordinator]):
+class HassioOSEntity(CoordinatorEntity[HassioMainDataUpdateCoordinator]):
 """Base Entity for Hass.io OS."""
 _attr_has_entity_name = True
 def __init__(
 self,
-coordinator: HassioDataUpdateCoordinator,
+coordinator: HassioMainDataUpdateCoordinator,
 entity_description: EntityDescription,
 ) -> None:
 """Initialize base entity."""
@@ -94,14 +149,14 @@ class HassioOSEntity(CoordinatorEntity[HassioDataUpdateCoordinator]):
 )
-class HassioHostEntity(CoordinatorEntity[HassioDataUpdateCoordinator]):
+class HassioHostEntity(CoordinatorEntity[HassioMainDataUpdateCoordinator]):
 """Base Entity for Hass.io host."""
 _attr_has_entity_name = True
 def __init__(
 self,
-coordinator: HassioDataUpdateCoordinator,
+coordinator: HassioMainDataUpdateCoordinator,
 entity_description: EntityDescription,
 ) -> None:
 """Initialize base entity."""
@@ -120,14 +175,14 @@ class HassioHostEntity(CoordinatorEntity[HassioDataUpdateCoordinator]):
 )
-class HassioSupervisorEntity(CoordinatorEntity[HassioDataUpdateCoordinator]):
+class HassioSupervisorEntity(CoordinatorEntity[HassioMainDataUpdateCoordinator]):
 """Base Entity for Supervisor."""
 _attr_has_entity_name = True
 def __init__(
 self,
-coordinator: HassioDataUpdateCoordinator,
+coordinator: HassioMainDataUpdateCoordinator,
 entity_description: EntityDescription,
 ) -> None:
 """Initialize base entity."""
@@ -146,27 +201,15 @@ class HassioSupervisorEntity(CoordinatorEntity[HassioDataUpdateCoordinator]):
 in self.coordinator.data[DATA_KEY_SUPERVISOR]
 )
-async def async_added_to_hass(self) -> None:
-"""Subscribe to updates."""
-await super().async_added_to_hass()
-update_types = KEY_TO_UPDATE_TYPES[self.entity_description.key]
-self.async_on_remove(
-self.coordinator.async_enable_container_updates(
-SUPERVISOR_CONTAINER, self.entity_id, update_types
-)
-)
-if CONTAINER_STATS in update_types:
-await self.coordinator.async_request_refresh()
-class HassioCoreEntity(CoordinatorEntity[HassioDataUpdateCoordinator]):
+class HassioCoreEntity(CoordinatorEntity[HassioMainDataUpdateCoordinator]):
 """Base Entity for Core."""
 _attr_has_entity_name = True
 def __init__(
 self,
-coordinator: HassioDataUpdateCoordinator,
+coordinator: HassioMainDataUpdateCoordinator,
 entity_description: EntityDescription,
 ) -> None:
 """Initialize base entity."""
@@ -184,27 +227,15 @@ class HassioCoreEntity(CoordinatorEntity[HassioDataUpdateCoordinator]):
 and self.entity_description.key in self.coordinator.data[DATA_KEY_CORE]
 )
-async def async_added_to_hass(self) -> None:
-"""Subscribe to updates."""
-await super().async_added_to_hass()
-update_types = KEY_TO_UPDATE_TYPES[self.entity_description.key]
-self.async_on_remove(
-self.coordinator.async_enable_container_updates(
-CORE_CONTAINER, self.entity_id, update_types
-)
-)
-if CONTAINER_STATS in update_types:
-await self.coordinator.async_request_refresh()
-class HassioMountEntity(CoordinatorEntity[HassioDataUpdateCoordinator]):
+class HassioMountEntity(CoordinatorEntity[HassioMainDataUpdateCoordinator]):
 """Base Entity for Mount."""
 _attr_has_entity_name = True
 def __init__(
 self,
-coordinator: HassioDataUpdateCoordinator,
+coordinator: HassioMainDataUpdateCoordinator,
 entity_description: EntityDescription,
 mount: CIFSMountResponse | NFSMountResponse,
 ) -> None:

View File

@@ -28,7 +28,6 @@ from homeassistant.helpers.issue_registry import (
 )
 from .const import (
-ADDONS_COORDINATOR,
 ATTR_DATA,
 ATTR_HEALTHY,
 ATTR_SLUG,
@@ -54,6 +53,7 @@ from .const import (
 ISSUE_KEY_SYSTEM_DOCKER_CONFIG,
 ISSUE_KEY_SYSTEM_FREE_SPACE,
 ISSUE_MOUNT_MOUNT_FAILED,
+MAIN_COORDINATOR,
 PLACEHOLDER_KEY_ADDON,
 PLACEHOLDER_KEY_ADDON_URL,
 PLACEHOLDER_KEY_FREE_SPACE,
@@ -62,7 +62,7 @@ from .const import (
 STARTUP_COMPLETE,
 UPDATE_KEY_SUPERVISOR,
 )
-from .coordinator import HassioDataUpdateCoordinator, get_addons_list, get_host_info
+from .coordinator import HassioMainDataUpdateCoordinator, get_addons_list, get_host_info
 from .handler import get_supervisor_client
 ISSUE_KEY_UNHEALTHY = "unhealthy"
@@ -417,8 +417,8 @@ class SupervisorIssues:
 def _async_coordinator_refresh(self) -> None:
 """Refresh coordinator to update latest data in entities."""
-coordinator: HassioDataUpdateCoordinator | None
-if coordinator := self._hass.data.get(ADDONS_COORDINATOR):
+coordinator: HassioMainDataUpdateCoordinator | None
+if coordinator := self._hass.data.get(MAIN_COORDINATOR):
 coordinator.config_entry.async_create_task(
 self._hass, coordinator.async_refresh()
 )

View File

@@ -17,20 +17,24 @@ from .const import (
 ADDONS_COORDINATOR,
 ATTR_CPU_PERCENT,
 ATTR_MEMORY_PERCENT,
+ATTR_SLUG,
 ATTR_VERSION,
 ATTR_VERSION_LATEST,
+CORE_CONTAINER,
 DATA_KEY_ADDONS,
 DATA_KEY_CORE,
 DATA_KEY_HOST,
 DATA_KEY_OS,
 DATA_KEY_SUPERVISOR,
+MAIN_COORDINATOR,
+STATS_COORDINATOR,
+SUPERVISOR_CONTAINER,
 )
 from .entity import (
 HassioAddonEntity,
-HassioCoreEntity,
 HassioHostEntity,
 HassioOSEntity,
-HassioSupervisorEntity,
+HassioStatsEntity,
 )
 COMMON_ENTITY_DESCRIPTIONS = (
@@ -63,10 +67,7 @@ STATS_ENTITY_DESCRIPTIONS = (
 ),
 )
-ADDON_ENTITY_DESCRIPTIONS = COMMON_ENTITY_DESCRIPTIONS + STATS_ENTITY_DESCRIPTIONS
-CORE_ENTITY_DESCRIPTIONS = STATS_ENTITY_DESCRIPTIONS
 OS_ENTITY_DESCRIPTIONS = COMMON_ENTITY_DESCRIPTIONS
-SUPERVISOR_ENTITY_DESCRIPTIONS = STATS_ENTITY_DESCRIPTIONS
 HOST_ENTITY_DESCRIPTIONS = (
 SensorEntityDescription(
@@ -114,36 +115,64 @@ async def async_setup_entry(
 async_add_entities: AddConfigEntryEntitiesCallback,
 ) -> None:
 """Sensor set up for Hass.io config entry."""
-coordinator = hass.data[ADDONS_COORDINATOR]
+addons_coordinator = hass.data[ADDONS_COORDINATOR]
+coordinator = hass.data[MAIN_COORDINATOR]
+stats_coordinator = hass.data[STATS_COORDINATOR]
-entities: list[
-HassioOSSensor | HassioAddonSensor | CoreSensor | SupervisorSensor | HostSensor
-] = [
+entities: list[SensorEntity] = []
+# Add-on non-stats sensors (version, version_latest)
+entities.extend(
 HassioAddonSensor(
 addon=addon,
-coordinator=coordinator,
+coordinator=addons_coordinator,
 entity_description=entity_description,
 )
-for addon in coordinator.data[DATA_KEY_ADDONS].values()
-for entity_description in ADDON_ENTITY_DESCRIPTIONS
-]
-entities.extend(
-CoreSensor(
-coordinator=coordinator,
-entity_description=entity_description,
-)
-for entity_description in CORE_ENTITY_DESCRIPTIONS
+for addon in addons_coordinator.data[DATA_KEY_ADDONS].values()
+for entity_description in COMMON_ENTITY_DESCRIPTIONS
 )
+# Add-on stats sensors (cpu_percent, memory_percent)
 entities.extend(
-SupervisorSensor(
-coordinator=coordinator,
+HassioStatsSensor(
+coordinator=stats_coordinator,
 entity_description=entity_description,
+container_id=addon[ATTR_SLUG],
+data_key=DATA_KEY_ADDONS,
+device_id=addon[ATTR_SLUG],
+unique_id_prefix=addon[ATTR_SLUG],
 )
-for entity_description in SUPERVISOR_ENTITY_DESCRIPTIONS
+for addon in addons_coordinator.data[DATA_KEY_ADDONS].values()
+for entity_description in STATS_ENTITY_DESCRIPTIONS
 )
+# Core stats sensors
+entities.extend(
+HassioStatsSensor(
+coordinator=stats_coordinator,
+entity_description=entity_description,
+container_id=CORE_CONTAINER,
+data_key=DATA_KEY_CORE,
+device_id="core",
+unique_id_prefix="home_assistant_core",
+)
+for entity_description in STATS_ENTITY_DESCRIPTIONS
+)
+# Supervisor stats sensors
+entities.extend(
+HassioStatsSensor(
+coordinator=stats_coordinator,
+entity_description=entity_description,
+container_id=SUPERVISOR_CONTAINER,
+data_key=DATA_KEY_SUPERVISOR,
+device_id="supervisor",
+unique_id_prefix="home_assistant_supervisor",
+)
+for entity_description in STATS_ENTITY_DESCRIPTIONS
+)
+# Host sensors
 entities.extend(
 HostSensor(
 coordinator=coordinator,
@@ -152,6 +181,7 @@ async def async_setup_entry(
 for entity_description in HOST_ENTITY_DESCRIPTIONS
 )
+# OS sensors
 if coordinator.is_hass_os:
 entities.extend(
 HassioOSSensor(
@@ -175,8 +205,21 @@ class HassioAddonSensor(HassioAddonEntity, SensorEntity):
 ]
+class HassioStatsSensor(HassioStatsEntity, SensorEntity):
+"""Sensor to track container stats."""
+@property
+def native_value(self) -> str:
+"""Return native value of entity."""
+if self._data_key == DATA_KEY_ADDONS:
+return self.coordinator.data[DATA_KEY_ADDONS][self._container_id][
+self.entity_description.key
+]
+return self.coordinator.data[self._data_key][self.entity_description.key]
 class HassioOSSensor(HassioOSEntity, SensorEntity):
-"""Sensor to track a Hass.io add-on attribute."""
+"""Sensor to track a Hass.io OS attribute."""
 @property
 def native_value(self) -> str:
@@ -184,24 +227,6 @@ class HassioOSSensor(HassioOSEntity, SensorEntity):
 return self.coordinator.data[DATA_KEY_OS][self.entity_description.key]
-class CoreSensor(HassioCoreEntity, SensorEntity):
-"""Sensor to track a core attribute."""
-@property
-def native_value(self) -> str:
-"""Return native value of entity."""
-return self.coordinator.data[DATA_KEY_CORE][self.entity_description.key]
-class SupervisorSensor(HassioSupervisorEntity, SensorEntity):
-"""Sensor to track a supervisor attribute."""
-@property
-def native_value(self) -> str:
-"""Return native value of entity."""
-return self.coordinator.data[DATA_KEY_SUPERVISOR][self.entity_description.key]
 class HostSensor(HassioHostEntity, SensorEntity):
 """Sensor to track a host attribute."""

View File

@@ -32,7 +32,6 @@ from homeassistant.helpers import (
 from homeassistant.util.dt import now
 from .const import (
-ADDONS_COORDINATOR,
 ATTR_ADDON,
 ATTR_ADDONS,
 ATTR_APP,
@@ -46,9 +45,10 @@ from .const import (
 ATTR_PASSWORD,
 ATTR_SLUG,
 DOMAIN,
+MAIN_COORDINATOR,
 SupervisorEntityModel,
 )
-from .coordinator import HassioDataUpdateCoordinator, get_addons_info
+from .coordinator import HassioMainDataUpdateCoordinator, get_addons_info
 SERVICE_ADDON_START = "addon_start"
 SERVICE_ADDON_STOP = "addon_stop"
@@ -406,7 +406,7 @@ def async_register_network_storage_services(
 async def async_mount_reload(service: ServiceCall) -> None:
 """Handle service calls for Hass.io."""
-coordinator: HassioDataUpdateCoordinator | None = None
+coordinator: HassioMainDataUpdateCoordinator | None = None
 if (device := dev_reg.async_get(service.data[ATTR_DEVICE_ID])) is None:
 raise ServiceValidationError(
@@ -417,7 +417,7 @@ def async_register_network_storage_services(
 if (
 device.name is None
 or device.model != SupervisorEntityModel.MOUNT
-or (coordinator := hass.data.get(ADDONS_COORDINATOR)) is None
+or (coordinator := hass.data.get(MAIN_COORDINATOR)) is None
 or coordinator.entry_id not in device.config_entries
 ):
 raise ServiceValidationError(

View File

@@ -29,6 +29,7 @@ from .const import (
 DATA_KEY_CORE,
 DATA_KEY_OS,
 DATA_KEY_SUPERVISOR,
+MAIN_COORDINATOR,
 )
 from .entity import (
 HassioAddonEntity,
@@ -51,9 +52,9 @@ async def async_setup_entry(
 async_add_entities: AddConfigEntryEntitiesCallback,
 ) -> None:
 """Set up Supervisor update based on a config entry."""
-coordinator = hass.data[ADDONS_COORDINATOR]
-entities = [
+coordinator = hass.data[MAIN_COORDINATOR]
+entities: list[UpdateEntity] = [
 SupervisorSupervisorUpdateEntity(
 coordinator=coordinator,
 entity_description=ENTITY_DESCRIPTION,
@@ -64,15 +65,6 @@ async def async_setup_entry(
 ),
 ]
-entities.extend(
-SupervisorAddonUpdateEntity(
-addon=addon,
-coordinator=coordinator,
-entity_description=ENTITY_DESCRIPTION,
-)
-for addon in coordinator.data[DATA_KEY_ADDONS].values()
-)
 if coordinator.is_hass_os:
 entities.append(
 SupervisorOSUpdateEntity(
@@ -81,6 +73,16 @@ async def async_setup_entry(
 )
 )
+addons_coordinator = hass.data[ADDONS_COORDINATOR]
+entities.extend(
+SupervisorAddonUpdateEntity(
+addon=addon,
+coordinator=addons_coordinator,
+entity_description=ENTITY_DESCRIPTION,
+)
+for addon in addons_coordinator.data[DATA_KEY_ADDONS].values()
+)
 async_add_entities(entities)

View File

@@ -7,7 +7,7 @@
 "documentation": "https://www.home-assistant.io/integrations/homeassistant_hardware",
 "integration_type": "system",
 "requirements": [
-"serialx==1.1.1",
+"serialx==1.2.2",
 "universal-silabs-flasher==1.0.3",
 "ha-silabs-firmware-client==0.3.0"
 ]

View File

@@ -13,5 +13,5 @@
 "documentation": "https://www.home-assistant.io/integrations/husqvarna_automower_ble",
 "integration_type": "device",
 "iot_class": "local_polling",
-"requirements": ["automower-ble==0.2.8", "gardena-bluetooth==2.3.0"]
+"requirements": ["automower-ble==0.2.8", "gardena-bluetooth==2.4.0"]
 }

View File

@@ -7,5 +7,5 @@
 "documentation": "https://www.home-assistant.io/integrations/image_upload",
 "integration_type": "system",
 "quality_scale": "internal",
-"requirements": ["Pillow==12.1.1"]
+"requirements": ["Pillow==12.2.0"]
 }

View File

@@ -92,83 +92,15 @@ async def async_setup_entry(
 for area in lutron_client.areas:
 _LOGGER.debug("Working on area %s", area.name)
 for output in area.outputs:
-platform = None
-_LOGGER.debug("Working on output %s", output.type)
-if output.type == "SYSTEM_SHADE":
-entry_data.covers.append((area.name, output))
-platform = Platform.COVER
-elif output.type == "CEILING_FAN_TYPE":
-entry_data.fans.append((area.name, output))
-platform = Platform.FAN
-elif output.is_dimmable:
-entry_data.lights.append((area.name, output))
-platform = Platform.LIGHT
-else:
-entry_data.switches.append((area.name, output))
-platform = Platform.SWITCH
-_async_check_entity_unique_id(
-hass,
-entity_registry,
-platform,
-output.uuid,
-output.legacy_uuid,
-entry_data.client.guid,
-)
-_async_check_device_identifiers(
-hass,
-device_registry,
-output.uuid,
-output.legacy_uuid,
-entry_data.client.guid,
-)
+_setup_output(
+hass, entry_data, output, area.name, entity_registry, device_registry
+)
 for keypad in area.keypads:
-_async_check_keypad_identifiers(
-hass,
-device_registry,
-keypad.id,
-keypad.uuid,
-keypad.legacy_uuid,
-entry_data.client.guid,
-)
-for button in keypad.buttons:
-# If the button has a function assigned to it, add it as a scene
-if button.name != "Unknown Button" and button.button_type in (
-"SingleAction",
-"Toggle",
-"SingleSceneRaiseLower",
-"MasterRaiseLower",
-"AdvancedToggle",
-):
-# Associate an LED with a button if there is one
-led = next(
-(led for led in keypad.leds if led.number == button.number),
-None,
-)
-entry_data.scenes.append((area.name, keypad, button, led))
-platform = Platform.SCENE
-_async_check_entity_unique_id(
-hass,
-entity_registry,
-platform,
-button.uuid,
-button.legacy_uuid,
-entry_data.client.guid,
-)
-if led is not None:
-platform = Platform.SWITCH
-_async_check_entity_unique_id(
-hass,
-entity_registry,
-platform,
-led.uuid,
-led.legacy_uuid,
-entry_data.client.guid,
-)
-if button.button_type:
-entry_data.buttons.append((area.name, keypad, button))
+_setup_keypad(
+hass, entry_data, keypad, area.name, entity_registry, device_registry
+)
 if area.occupancy_group is not None:
 entry_data.binary_sensors.append((area.name, area.occupancy_group))
 platform = Platform.BINARY_SENSOR
@@ -202,6 +134,99 @@ async def async_setup_entry(
 return True
+def _setup_output(
+hass: HomeAssistant,
+entry_data: LutronData,
+output: Output,
+area_name: str,
+entity_registry: er.EntityRegistry,
+device_registry: dr.DeviceRegistry,
+) -> None:
+"""Set up a Lutron output."""
+_LOGGER.debug("Working on output %s", output.type)
+if output.type == "SYSTEM_SHADE":
+entry_data.covers.append((area_name, output))
+platform = Platform.COVER
+elif output.type == "CEILING_FAN_TYPE":
+entry_data.fans.append((area_name, output))
+platform = Platform.FAN
+elif output.is_dimmable:
+entry_data.lights.append((area_name, output))
+platform = Platform.LIGHT
+else:
+entry_data.switches.append((area_name, output))
+platform = Platform.SWITCH
+_async_check_entity_unique_id(
+hass,
+entity_registry,
+platform,
+output.uuid,
+output.legacy_uuid,
+entry_data.client.guid,
+)
+_async_check_device_identifiers(
+hass,
+device_registry,
+output.uuid,
+output.legacy_uuid,
+entry_data.client.guid,
+)
+def _setup_keypad(
+hass: HomeAssistant,
+entry_data: LutronData,
+keypad: Keypad,
+area_name: str,
+entity_registry: er.EntityRegistry,
+device_registry: dr.DeviceRegistry,
+) -> None:
+"""Set up a Lutron keypad."""
+_async_check_keypad_identifiers(
+hass,
+device_registry,
+keypad.id,
+keypad.uuid,
+keypad.legacy_uuid,
+entry_data.client.guid,
+)
+leds_by_number = {led.number: led for led in keypad.leds}
+for button in keypad.buttons:
+# If the button has a function assigned to it, add it as a scene
+if button.name != "Unknown Button" and button.button_type in (
+"SingleAction",
+"Toggle",
+"SingleSceneRaiseLower",
+"MasterRaiseLower",
+"AdvancedToggle",
+):
+# Associate an LED with a button if there is one
+led = leds_by_number.get(button.number)
+entry_data.scenes.append((area_name, keypad, button, led))
+_async_check_entity_unique_id(
+hass,
+entity_registry,
+Platform.SCENE,
+button.uuid,
+button.legacy_uuid,
+entry_data.client.guid,
+)
+if led is not None:
+_async_check_entity_unique_id(
+hass,
+entity_registry,
+Platform.SWITCH,
+led.uuid,
+led.legacy_uuid,
+entry_data.client.guid,
+)
+if button.button_type:
+entry_data.buttons.append((area_name, keypad, button))
 def _async_check_entity_unique_id(
 hass: HomeAssistant,
 entity_registry: er.EntityRegistry,

View File

@@ -6,5 +6,5 @@
 "iot_class": "cloud_push",
 "loggers": ["matrix_client"],
 "quality_scale": "legacy",
-"requirements": ["matrix-nio==0.25.2", "Pillow==12.1.1", "aiofiles==24.1.0"]
+"requirements": ["matrix-nio==0.25.2", "Pillow==12.2.0", "aiofiles==24.1.0"]
 }


@@ -75,7 +75,6 @@ from .const import ( # noqa: F401
     ATTR_GROUP_MEMBERS,
     ATTR_INPUT_SOURCE,
     ATTR_INPUT_SOURCE_LIST,
-    ATTR_LAST_NON_BUFFERING_STATE,
     ATTR_MEDIA_ALBUM_ARTIST,
     ATTR_MEDIA_ALBUM_NAME,
     ATTR_MEDIA_ANNOUNCE,
@@ -588,8 +587,6 @@ class MediaPlayerEntity(Entity, cached_properties=CACHED_PROPERTIES_WITH_ATTR_):
     _attr_volume_level: float | None = None
     _attr_volume_step: float
 
-    __last_non_buffering_state: MediaPlayerState | None = None
-
     # Implement these for your media player
     @cached_property
     def device_class(self) -> MediaPlayerDeviceClass | None:
@@ -1127,12 +1124,7 @@ class MediaPlayerEntity(Entity, cached_properties=CACHED_PROPERTIES_WITH_ATTR_):
     @property
     def state_attributes(self) -> dict[str, Any]:
         """Return the state attributes."""
-        if (state := self.state) != MediaPlayerState.BUFFERING:
-            self.__last_non_buffering_state = state
-        state_attr: dict[str, Any] = {
-            ATTR_LAST_NON_BUFFERING_STATE: self.__last_non_buffering_state
-        }
+        state_attr: dict[str, Any] = {}
         if self.support_grouping:
             state_attr[ATTR_GROUP_MEMBERS] = self.group_members


@@ -13,7 +13,6 @@ ATTR_ENTITY_PICTURE_LOCAL = "entity_picture_local"
 ATTR_GROUP_MEMBERS = "group_members"
 ATTR_INPUT_SOURCE = "source"
 ATTR_INPUT_SOURCE_LIST = "source_list"
-ATTR_LAST_NON_BUFFERING_STATE = "last_non_buffering_state"
 ATTR_MEDIA_ANNOUNCE = "announce"
 ATTR_MEDIA_ALBUM_ARTIST = "media_album_artist"
 ATTR_MEDIA_ALBUM_NAME = "media_album_name"


@@ -123,8 +123,20 @@
     }
   },
   "triggers": {
+    "paused_playing": {
+      "trigger": "mdi:pause"
+    },
+    "started_playing": {
+      "trigger": "mdi:play"
+    },
     "stopped_playing": {
       "trigger": "mdi:stop"
+    },
+    "turned_off": {
+      "trigger": "mdi:power"
+    },
+    "turned_on": {
+      "trigger": "mdi:power"
     }
   }
 }


@@ -433,14 +433,50 @@
     },
     "title": "Media player",
     "triggers": {
+      "paused_playing": {
+        "description": "Triggers after one or more media players pause playing.",
+        "fields": {
+          "behavior": {
+            "name": "[%key:component::media_player::common::trigger_behavior_name%]"
+          }
+        },
+        "name": "Media player paused playing"
+      },
+      "started_playing": {
+        "description": "Triggers after one or more media players start playing.",
+        "fields": {
+          "behavior": {
+            "name": "[%key:component::media_player::common::trigger_behavior_name%]"
+          }
+        },
+        "name": "Media player started playing"
+      },
       "stopped_playing": {
-        "description": "Triggers after one or more media players stop playing media.",
+        "description": "Triggers after one or more media players stop playing.",
         "fields": {
           "behavior": {
             "name": "[%key:component::media_player::common::trigger_behavior_name%]"
           }
         },
         "name": "Media player stopped playing"
+      },
+      "turned_off": {
+        "description": "Triggers after one or more media players turn off.",
+        "fields": {
+          "behavior": {
+            "name": "[%key:component::media_player::common::trigger_behavior_name%]"
+          }
+        },
+        "name": "Media player turned off"
+      },
+      "turned_on": {
+        "description": "Triggers after one or more media players turn on.",
+        "fields": {
+          "behavior": {
+            "name": "[%key:component::media_player::common::trigger_behavior_name%]"
+          }
+        },
+        "name": "Media player turned on"
       }
     }
   }


@@ -7,6 +7,29 @@ from . import MediaPlayerState
 from .const import DOMAIN
 
 TRIGGERS: dict[str, type[Trigger]] = {
+    "paused_playing": make_entity_transition_trigger(
+        DOMAIN,
+        from_states={
+            MediaPlayerState.BUFFERING,
+            MediaPlayerState.PLAYING,
+        },
+        to_states={
+            MediaPlayerState.PAUSED,
+        },
+    ),
+    "started_playing": make_entity_transition_trigger(
+        DOMAIN,
+        from_states={
+            MediaPlayerState.IDLE,
+            MediaPlayerState.OFF,
+            MediaPlayerState.ON,
+            MediaPlayerState.PAUSED,
+        },
+        to_states={
+            MediaPlayerState.BUFFERING,
+            MediaPlayerState.PLAYING,
+        },
+    ),
     "stopped_playing": make_entity_transition_trigger(
         DOMAIN,
         from_states={
@@ -20,6 +43,32 @@ TRIGGERS: dict[str, type[Trigger]] = {
             MediaPlayerState.ON,
         },
     ),
+    "turned_off": make_entity_transition_trigger(
+        DOMAIN,
+        from_states={
+            MediaPlayerState.BUFFERING,
+            MediaPlayerState.IDLE,
+            MediaPlayerState.ON,
+            MediaPlayerState.PAUSED,
+            MediaPlayerState.PLAYING,
+        },
+        to_states={
+            MediaPlayerState.OFF,
+        },
+    ),
+    "turned_on": make_entity_transition_trigger(
+        DOMAIN,
+        from_states={
+            MediaPlayerState.OFF,
+        },
+        to_states={
+            MediaPlayerState.BUFFERING,
+            MediaPlayerState.IDLE,
+            MediaPlayerState.ON,
+            MediaPlayerState.PAUSED,
+            MediaPlayerState.PLAYING,
+        },
+    ),
 }
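Each transition trigger above pairs a set of `from_states` with a set of `to_states`. As a rough, self-contained illustration of the matching rule (not the actual `make_entity_transition_trigger` implementation, which also handles entity targeting and the first/last/any behavior option):

```python
# Sketch of from/to state matching behind a transition trigger.
# State names mirror MediaPlayerState values; all names here are
# illustrative stand-ins, not Home Assistant APIs.

def should_fire(old_state: str, new_state: str,
                from_states: set[str], to_states: set[str]) -> bool:
    """Fire only when leaving one set and entering the other."""
    return old_state in from_states and new_state in to_states

# "started_playing": idle/off/on/paused -> buffering/playing
from_states = {"idle", "off", "on", "paused"}
to_states = {"buffering", "playing"}

print(should_fire("paused", "playing", from_states, to_states))    # True
print(should_fire("playing", "buffering", from_states, to_states))  # False: was already playing
```

This is why `buffering` appears on the playing side of `started_playing` but on the from side of `paused_playing`: buffering is treated as part of playback, not as a stop.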


@@ -1,4 +1,4 @@
stopped_playing: .trigger_common: &trigger_common
target: target:
entity: entity:
domain: media_player domain: media_player
@@ -13,3 +13,9 @@ stopped_playing:
- first - first
- last - last
- any - any
paused_playing: *trigger_common
started_playing: *trigger_common
stopped_playing: *trigger_common
turned_off: *trigger_common
turned_on: *trigger_common


@@ -1,5 +1,6 @@
"""Device tracker for Mobile app.""" """Device tracker for Mobile app."""
from collections.abc import Callable
from typing import Any from typing import Any
from homeassistant.components.device_tracker import ( from homeassistant.components.device_tracker import (
@@ -53,11 +54,11 @@ async def async_setup_entry(
class MobileAppEntity(TrackerEntity, RestoreEntity): class MobileAppEntity(TrackerEntity, RestoreEntity):
"""Represent a tracked device.""" """Represent a tracked device."""
def __init__(self, entry, data=None): def __init__(self, entry: ConfigEntry) -> None:
"""Set up Mobile app entity.""" """Set up Mobile app entity."""
self._entry = entry self._entry = entry
self._data = data self._data: dict[str, Any] = {}
self._dispatch_unsub = None self._dispatch_unsub: Callable[[], None] | None = None
@property @property
def unique_id(self) -> str: def unique_id(self) -> str:
@@ -132,12 +133,7 @@ class MobileAppEntity(TrackerEntity, RestoreEntity):
self.update_data, self.update_data,
) )
# Don't restore if we got set up with data.
if self._data is not None:
return
if (state := await self.async_get_last_state()) is None: if (state := await self.async_get_last_state()) is None:
self._data = {}
return return
attr = state.attributes attr = state.attributes
@@ -158,7 +154,7 @@ class MobileAppEntity(TrackerEntity, RestoreEntity):
self._dispatch_unsub = None self._dispatch_unsub = None
@callback @callback
def update_data(self, data): def update_data(self, data: dict[str, Any]) -> None:
"""Mark the device as seen.""" """Mark the device as seen."""
self._data = data self._data = data
self.async_write_ha_state() self.async_write_ha_state()


@@ -54,7 +54,7 @@ class MotionMountErrorStatusSensor(MotionMountEntity, SensorEntity):
     def __init__(
         self, mm: motionmount.MotionMount, config_entry: MotionMountConfigEntry
     ) -> None:
-        """Initialize sensor entiry."""
+        """Initialize sensor entity."""
         super().__init__(mm, config_entry)
         self._attr_unique_id = f"{self._base_unique_id}-error-status"


@@ -4,5 +4,5 @@
   "codeowners": [],
   "documentation": "https://www.home-assistant.io/integrations/proxy",
   "quality_scale": "legacy",
-  "requirements": ["Pillow==12.1.1"]
+  "requirements": ["Pillow==12.2.0"]
 }


@@ -6,5 +6,5 @@
   "iot_class": "calculated",
   "loggers": ["pyzbar"],
   "quality_scale": "legacy",
-  "requirements": ["Pillow==12.1.1", "pyzbar==0.1.7"]
+  "requirements": ["Pillow==12.2.0", "pyzbar==0.1.7"]
 }


@@ -192,7 +192,7 @@ ID_TYPE = BigInteger().with_variant(sqlite.INTEGER, "sqlite")
 # For MariaDB and MySQL we can use an unsigned integer type since it will fit 2**32
 # for sqlite and postgresql we use a bigint
 UINT_32_TYPE = BigInteger().with_variant(
-    mysql.INTEGER(unsigned=True),  # type: ignore[no-untyped-call]
+    mysql.INTEGER(unsigned=True),
     "mysql",
     "mariadb",
 )
@@ -206,12 +206,12 @@ JSONB_VARIANT_CAST = Text().with_variant(
 )
 DATETIME_TYPE = (
     DateTime(timezone=True)
-    .with_variant(mysql.DATETIME(timezone=True, fsp=6), "mysql", "mariadb")  # type: ignore[no-untyped-call]
+    .with_variant(mysql.DATETIME(timezone=True, fsp=6), "mysql", "mariadb")
     .with_variant(FAST_PYSQLITE_DATETIME(), "sqlite")  # type: ignore[no-untyped-call]
 )
 DOUBLE_TYPE = (
     Float()
-    .with_variant(mysql.DOUBLE(asdecimal=False), "mysql", "mariadb")  # type: ignore[no-untyped-call]
+    .with_variant(mysql.DOUBLE(asdecimal=False), "mysql", "mariadb")
     .with_variant(oracle.DOUBLE_PRECISION(), "oracle")
     .with_variant(postgresql.DOUBLE_PRECISION(), "postgresql")
 )


@@ -7,7 +7,7 @@
   "iot_class": "local_push",
   "quality_scale": "internal",
   "requirements": [
-    "SQLAlchemy==2.0.41",
+    "SQLAlchemy==2.0.49",
     "fnv-hash-fast==2.0.0",
     "psutil-home-assistant==0.0.1"
   ]


@@ -447,10 +447,10 @@ def setup_connection_for_dialect(
     slow_dependent_subquery = False
     if dialect_name == SupportedDialect.SQLITE:
         if first_connection:
-            old_isolation = dbapi_connection.isolation_level  # type: ignore[attr-defined]
-            dbapi_connection.isolation_level = None  # type: ignore[attr-defined]
+            old_isolation = dbapi_connection.isolation_level
+            dbapi_connection.isolation_level = None
             execute_on_connection(dbapi_connection, "PRAGMA journal_mode=WAL")
-            dbapi_connection.isolation_level = old_isolation  # type: ignore[attr-defined]
+            dbapi_connection.isolation_level = old_isolation
             # WAL mode only needs to be setup once
             # instead of every time we open the sqlite connection
             # as its persistent and isn't free to call every time.
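The save/clear/restore of `isolation_level` in the recorder exists because `PRAGMA journal_mode=WAL` must run outside a transaction, and Python's `sqlite3` driver implicitly opens transactions unless isolation is disabled. A small stdlib-only demonstration of the same sequence (paths and names here are illustrative):

```python
import os
import sqlite3
import tempfile

# WAL requires a file-backed database, not :memory:.
db_path = os.path.join(tempfile.mkdtemp(), "demo.db")
conn = sqlite3.connect(db_path)

# Temporarily switch to autocommit so the PRAGMA runs outside a
# transaction, mirroring the recorder's isolation_level dance.
old_isolation = conn.isolation_level
conn.isolation_level = None
conn.execute("PRAGMA journal_mode=WAL")
conn.isolation_level = old_isolation

mode = conn.execute("PRAGMA journal_mode").fetchone()[0]
print(mode)  # wal
conn.close()
```

As the comment in the diff notes, the mode is persistent per database file, so it only needs to be set on the first connection.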


@@ -21,6 +21,8 @@ from .const import DOMAIN
 from .coordinator import RitualsDataUpdateCoordinator
 from .entity import DiffuserEntity
 
+PARALLEL_UPDATES = 0
+
 
 @dataclass(frozen=True, kw_only=True)
 class RitualsBinarySensorEntityDescription(BinarySensorEntityDescription):


@@ -17,6 +17,8 @@ from .const import DOMAIN
 from .coordinator import RitualsDataUpdateCoordinator
 from .entity import DiffuserEntity
 
+PARALLEL_UPDATES = 1
+
 
 @dataclass(frozen=True, kw_only=True)
 class RitualsNumberEntityDescription(NumberEntityDescription):


@@ -17,6 +17,8 @@ from .const import DOMAIN
 from .coordinator import RitualsDataUpdateCoordinator
 from .entity import DiffuserEntity
 
+PARALLEL_UPDATES = 1
+
 
 @dataclass(frozen=True, kw_only=True)
 class RitualsSelectEntityDescription(SelectEntityDescription):


@@ -21,6 +21,8 @@ from .const import DOMAIN
 from .coordinator import RitualsDataUpdateCoordinator
 from .entity import DiffuserEntity
 
+PARALLEL_UPDATES = 0
+
 
 @dataclass(frozen=True, kw_only=True)
 class RitualsSensorEntityDescription(SensorEntityDescription):


@@ -17,6 +17,8 @@ from .const import DOMAIN
 from .coordinator import RitualsDataUpdateCoordinator
 from .entity import DiffuserEntity
 
+PARALLEL_UPDATES = 1
+
 
 @dataclass(frozen=True, kw_only=True)
 class RitualsSwitchEntityDescription(SwitchEntityDescription):


@@ -49,10 +49,14 @@ async def async_setup_entry(
     await coordinator.async_config_entry_first_refresh()
 
-    system_info = await ruckus.api.get_system_info()
+    try:
+        system_info = await ruckus.api.get_system_info()
+        aps = await ruckus.api.get_aps()
+    except (ConnectionError, SchemaError) as err:
+        await ruckus.close()
+        raise ConfigEntryNotReady from err
 
     registry = dr.async_get(hass)
-    aps = await ruckus.api.get_aps()
     for access_point in aps:
         _LOGGER.debug("AP [%s] %s", access_point[API_AP_MAC], entry.entry_id)
         registry.async_get_or_create(

@@ -86,12 +86,10 @@ class RuckusConfigFlow(ConfigFlow, domain=DOMAIN):
                     return self.async_create_entry(
                         title=info[KEY_SYS_TITLE], data=user_input
                     )
-                reauth_entry = self._get_reauth_entry()
-                if info[KEY_SYS_SERIAL] == reauth_entry.unique_id:
-                    return self.async_update_reload_and_abort(
-                        reauth_entry, data=user_input
-                    )
-                errors["base"] = "invalid_host"
+                self._abort_if_unique_id_mismatch(reason="invalid_host")
+                return self.async_update_reload_and_abort(
+                    self._get_reauth_entry(), data=user_input
+                )
 
         data_schema = DATA_SCHEMA
         if self.source == SOURCE_REAUTH:


@@ -2,12 +2,12 @@
   "config": {
     "abort": {
       "already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
+      "invalid_host": "[%key:common::config_flow::error::invalid_host%]",
       "reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]"
     },
     "error": {
       "cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
       "invalid_auth": "[%key:common::config_flow::error::invalid_auth%]",
-      "invalid_host": "[%key:common::config_flow::error::invalid_host%]",
       "unknown": "[%key:common::config_flow::error::unknown%]"
     },
     "step": {


@@ -8,6 +8,6 @@
   "iot_class": "local_push",
   "loggers": ["aiorussound"],
   "quality_scale": "silver",
-  "requirements": ["aiorussound==5.0.0"],
+  "requirements": ["aiorussound==5.0.1"],
   "zeroconf": ["_rio._tcp.local."]
 }


@@ -4,27 +4,40 @@ from __future__ import annotations
 
 import asyncio
 from collections.abc import Coroutine
+from copy import deepcopy
 from datetime import timedelta
+import logging
+from types import MappingProxyType
 from typing import Any
 
 import voluptuous as vol
 
 from homeassistant.components.rest import RESOURCE_SCHEMA, create_rest_data_from_config
-from homeassistant.components.sensor import DOMAIN as SENSOR_DOMAIN
-from homeassistant.config_entries import ConfigEntry
+from homeassistant.components.sensor import CONF_STATE_CLASS, DOMAIN as SENSOR_DOMAIN
+from homeassistant.config_entries import ConfigEntry, ConfigSubentry
 from homeassistant.const import (
     CONF_ATTRIBUTE,
+    CONF_AUTHENTICATION,
+    CONF_DEVICE_CLASS,
+    CONF_HEADERS,
+    CONF_NAME,
+    CONF_PASSWORD,
     CONF_SCAN_INTERVAL,
+    CONF_TIMEOUT,
+    CONF_UNIQUE_ID,
+    CONF_UNIT_OF_MEASUREMENT,
+    CONF_USERNAME,
     CONF_VALUE_TEMPLATE,
+    CONF_VERIFY_SSL,
     Platform,
 )
 from homeassistant.core import HomeAssistant
 from homeassistant.helpers import (
     config_validation as cv,
+    device_registry as dr,
     discovery,
     entity_registry as er,
 )
-from homeassistant.helpers.device_registry import DeviceEntry
 from homeassistant.helpers.trigger_template_entity import (
     CONF_AVAILABILITY,
     TEMPLATE_SENSOR_BASE_SCHEMA,
@@ -32,11 +45,22 @@ from homeassistant.helpers.trigger_template_entity import (
 )
 from homeassistant.helpers.typing import ConfigType
 
-from .const import CONF_INDEX, CONF_SELECT, DEFAULT_SCAN_INTERVAL, DOMAIN, PLATFORMS
+from .const import (
+    CONF_ADVANCED,
+    CONF_AUTH,
+    CONF_ENCODING,
+    CONF_INDEX,
+    CONF_SELECT,
+    DEFAULT_SCAN_INTERVAL,
+    DOMAIN,
+    PLATFORMS,
+)
 from .coordinator import ScrapeCoordinator
 
 type ScrapeConfigEntry = ConfigEntry[ScrapeCoordinator]
 
+_LOGGER = logging.getLogger(__name__)
+
 SENSOR_SCHEMA = vol.Schema(
     {
         **TEMPLATE_SENSOR_BASE_SCHEMA.schema,
@@ -103,7 +127,13 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
 
 async def async_setup_entry(hass: HomeAssistant, entry: ScrapeConfigEntry) -> bool:
     """Set up Scrape from a config entry."""
-    rest_config: dict[str, Any] = COMBINED_SCHEMA(dict(entry.options))
+    config: dict[str, Any] = dict(entry.options)
+    # Config flow uses sections but the COMBINED SCHEMA does not
+    # so we need to flatten the config here
+    config.update(config.pop(CONF_ADVANCED, {}))
+    config.update(config.pop(CONF_AUTH, {}))
+    rest_config: dict[str, Any] = COMBINED_SCHEMA(dict(config))
     rest = create_rest_data_from_config(hass, rest_config)
 
     coordinator = ScrapeCoordinator(
@@ -117,17 +147,159 @@ async def async_setup_entry(hass: HomeAssistant, entry: ScrapeConfigEntry) -> bo
     entry.runtime_data = coordinator
 
     await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
+    entry.async_on_unload(entry.add_update_listener(update_listener))
 
     return True
 
 
-async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
+async def async_migrate_entry(hass: HomeAssistant, entry: ScrapeConfigEntry) -> bool:
+    """Migrate old entry."""
+    if entry.version > 2:
+        # Don't migrate from future version
+        return False
+
+    if entry.version == 1:
+        old_to_new_sensor_id = {}
+        for sensor_config in entry.options[SENSOR_DOMAIN]:
+            # Create a new sub config entry per sensor
+            title = sensor_config[CONF_NAME]
+            old_unique_id = sensor_config[CONF_UNIQUE_ID]
+            subentry_config = {
+                CONF_INDEX: sensor_config[CONF_INDEX],
+                CONF_SELECT: sensor_config[CONF_SELECT],
+                CONF_ADVANCED: {},
+            }
+            for sensor_advanced_key in (
+                CONF_ATTRIBUTE,
+                CONF_VALUE_TEMPLATE,
+                CONF_AVAILABILITY,
+                CONF_DEVICE_CLASS,
+                CONF_STATE_CLASS,
+                CONF_UNIT_OF_MEASUREMENT,
+            ):
+                if sensor_advanced_key not in sensor_config:
+                    continue
+                subentry_config[CONF_ADVANCED][sensor_advanced_key] = sensor_config[
+                    sensor_advanced_key
+                ]
+            new_sub_entry = ConfigSubentry(
+                data=MappingProxyType(subentry_config),
+                subentry_type="entity",
+                title=title,
+                unique_id=None,
+            )
+            _LOGGER.debug(
+                "Migrating sensor %s with unique id %s to sub config entry id %s, old data %s, new data %s",
+                title,
+                old_unique_id,
+                new_sub_entry.subentry_id,
+                sensor_config,
+                subentry_config,
+            )
+            old_to_new_sensor_id[old_unique_id] = new_sub_entry.subentry_id
+            hass.config_entries.async_add_subentry(entry, new_sub_entry)
+
+        # Use the new sub config entry id as the unique id for the sensor entity
+        entity_reg = er.async_get(hass)
+        entities = er.async_entries_for_config_entry(entity_reg, entry.entry_id)
+        for entity in entities:
+            if (old_unique_id := entity.unique_id) in old_to_new_sensor_id:
+                new_unique_id = old_to_new_sensor_id[old_unique_id]
+                _LOGGER.debug(
+                    "Migrating entity %s with unique id %s to new unique id %s",
+                    entity.entity_id,
+                    entity.unique_id,
+                    new_unique_id,
+                )
+                entity_reg.async_update_entity(
+                    entity.entity_id,
+                    config_entry_id=entry.entry_id,
+                    config_subentry_id=new_unique_id,
+                    new_unique_id=new_unique_id,
+                )
+
+        # Use the new sub config entry id as the identifier for the sensor device
+        device_reg = dr.async_get(hass)
+        devices = dr.async_entries_for_config_entry(device_reg, entry.entry_id)
+        for device in devices:
+            for domain, identifier in device.identifiers:
+                if domain != DOMAIN or identifier not in old_to_new_sensor_id:
+                    continue
+                subentry_id = old_to_new_sensor_id[identifier]
+                new_identifiers = deepcopy(device.identifiers)
+                new_identifiers.remove((domain, identifier))
+                new_identifiers.add((domain, old_to_new_sensor_id[identifier]))
+                _LOGGER.debug(
+                    "Migrating device %s with identifiers %s to new identifiers %s",
+                    device.id,
+                    device.identifiers,
+                    new_identifiers,
+                )
+                device_reg.async_update_device(
+                    device.id,
+                    add_config_entry_id=entry.entry_id,
+                    add_config_subentry_id=subentry_id,
+                    new_identifiers=new_identifiers,
+                )
+                # Removing None from the list of subentries if existing
+                # as the device should only belong to the subentry
+                # and not the main config entry
+                device_reg.async_update_device(
+                    device.id,
+                    remove_config_entry_id=entry.entry_id,
+                    remove_config_subentry_id=None,
+                )
+
+        # Update the resource config
+        new_config_entry_data = dict(entry.options)
+        new_config_entry_data[CONF_AUTH] = {}
+        new_config_entry_data[CONF_ADVANCED] = {}
+        new_config_entry_data.pop(SENSOR_DOMAIN, None)
+        for resource_advanced_key in (
+            CONF_HEADERS,
+            CONF_VERIFY_SSL,
+            CONF_TIMEOUT,
+            CONF_ENCODING,
+        ):
+            if resource_advanced_key in new_config_entry_data:
+                new_config_entry_data[CONF_ADVANCED][resource_advanced_key] = (
+                    new_config_entry_data.pop(resource_advanced_key)
+                )
+        for resource_auth_key in (CONF_AUTHENTICATION, CONF_USERNAME, CONF_PASSWORD):
+            if resource_auth_key in new_config_entry_data:
+                new_config_entry_data[CONF_AUTH][resource_auth_key] = (
+                    new_config_entry_data.pop(resource_auth_key)
+                )
+        _LOGGER.debug(
+            "Migrating config entry %s from version 1 to version 2 with data %s",
+            entry.entry_id,
+            new_config_entry_data,
+        )
+        hass.config_entries.async_update_entry(
+            entry, version=2, options=new_config_entry_data
+        )
+
+    return True
+
+
+async def async_unload_entry(hass: HomeAssistant, entry: ScrapeConfigEntry) -> bool:
     """Unload Scrape config entry."""
     return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
 
 
+async def update_listener(hass: HomeAssistant, entry: ScrapeConfigEntry) -> None:
+    """Handle config entry update."""
+    hass.config_entries.async_schedule_reload(entry.entry_id)
+
+
 async def async_remove_config_entry_device(
-    hass: HomeAssistant, entry: ConfigEntry, device: DeviceEntry
+    hass: HomeAssistant, entry: ConfigEntry, device: dr.DeviceEntry
 ) -> bool:
     """Remove Scrape config entry from a device."""
     entity_registry = er.async_get(hass)
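The device part of the migration above boils down to rewriting `(domain, old_unique_id)` tuples into `(domain, new_subentry_id)` using the `old_to_new_sensor_id` map. A stripped-down sketch of that remapping on plain sets, with hypothetical data and no registry involved:

```python
from copy import deepcopy

def remap_identifiers(identifiers: set[tuple[str, str]],
                      old_to_new: dict[str, str],
                      domain: str = "scrape") -> set[tuple[str, str]]:
    """Swap old unique ids for new sub-entry ids, as in the migration loop."""
    new_identifiers = deepcopy(identifiers)
    for dom, ident in identifiers:
        if dom != domain or ident not in old_to_new:
            continue  # foreign domains and unmapped ids are left alone
        new_identifiers.remove((dom, ident))
        new_identifiers.add((dom, old_to_new[ident]))
    return new_identifiers

ids = {("scrape", "old-uuid-1"), ("other", "keep-me")}
mapping = {"old-uuid-1": "subentry-abc123"}
print(sorted(remap_identifiers(ids, mapping)))
# [('other', 'keep-me'), ('scrape', 'subentry-abc123')]
```

The real code iterates over a copy for the same reason this sketch does: mutating the set you are iterating over would raise at runtime.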


@@ -2,12 +2,13 @@
from __future__ import annotations from __future__ import annotations
from collections.abc import Mapping from copy import deepcopy
from typing import Any, cast import logging
import uuid from typing import Any
import voluptuous as vol import voluptuous as vol
from homeassistant import data_entry_flow
from homeassistant.components.rest import create_rest_data_from_config from homeassistant.components.rest import create_rest_data_from_config
from homeassistant.components.rest.data import ( # pylint: disable=hass-component-root-import from homeassistant.components.rest.data import ( # pylint: disable=hass-component-root-import
DEFAULT_TIMEOUT, DEFAULT_TIMEOUT,
@@ -18,10 +19,17 @@ from homeassistant.components.rest.schema import ( # pylint: disable=hass-compo
) )
from homeassistant.components.sensor import ( from homeassistant.components.sensor import (
CONF_STATE_CLASS, CONF_STATE_CLASS,
DOMAIN as SENSOR_DOMAIN,
SensorDeviceClass, SensorDeviceClass,
SensorStateClass, SensorStateClass,
) )
from homeassistant.config_entries import (
ConfigEntry,
ConfigFlow,
ConfigFlowResult,
ConfigSubentryFlow,
OptionsFlow,
SubentryFlowResult,
)
from homeassistant.const import ( from homeassistant.const import (
CONF_ATTRIBUTE, CONF_ATTRIBUTE,
CONF_AUTHENTICATION, CONF_AUTHENTICATION,
@@ -33,7 +41,6 @@ from homeassistant.const import (
CONF_PAYLOAD, CONF_PAYLOAD,
CONF_RESOURCE, CONF_RESOURCE,
CONF_TIMEOUT, CONF_TIMEOUT,
CONF_UNIQUE_ID,
CONF_UNIT_OF_MEASUREMENT, CONF_UNIT_OF_MEASUREMENT,
CONF_USERNAME, CONF_USERNAME,
CONF_VALUE_TEMPLATE, CONF_VALUE_TEMPLATE,
@@ -42,15 +49,7 @@ from homeassistant.const import (
HTTP_DIGEST_AUTHENTICATION, HTTP_DIGEST_AUTHENTICATION,
UnitOfTemperature, UnitOfTemperature,
) )
from homeassistant.core import async_get_hass from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import config_validation as cv, entity_registry as er
from homeassistant.helpers.schema_config_entry_flow import (
SchemaCommonFlowHandler,
SchemaConfigFlowHandler,
SchemaFlowError,
SchemaFlowFormStep,
SchemaFlowMenuStep,
)
from homeassistant.helpers.selector import ( from homeassistant.helpers.selector import (
BooleanSelector, BooleanSelector,
NumberSelector, NumberSelector,
@@ -69,6 +68,8 @@ from homeassistant.helpers.trigger_template_entity import CONF_AVAILABILITY
from . import COMBINED_SCHEMA from . import COMBINED_SCHEMA
from .const import ( from .const import (
CONF_ADVANCED,
CONF_AUTH,
CONF_ENCODING, CONF_ENCODING,
CONF_INDEX, CONF_INDEX,
CONF_SELECT, CONF_SELECT,
@@ -78,243 +79,212 @@ from .const import (
DOMAIN, DOMAIN,
) )
RESOURCE_SETUP = { _LOGGER = logging.getLogger(__name__)
vol.Required(CONF_RESOURCE): TextSelector(
TextSelectorConfig(type=TextSelectorType.URL)
),
vol.Optional(CONF_METHOD, default=DEFAULT_METHOD): SelectSelector(
SelectSelectorConfig(options=METHODS, mode=SelectSelectorMode.DROPDOWN)
),
vol.Optional(CONF_PAYLOAD): ObjectSelector(),
vol.Optional(CONF_AUTHENTICATION): SelectSelector(
SelectSelectorConfig(
options=[HTTP_BASIC_AUTHENTICATION, HTTP_DIGEST_AUTHENTICATION],
mode=SelectSelectorMode.DROPDOWN,
)
),
vol.Optional(CONF_USERNAME): TextSelector(),
vol.Optional(CONF_PASSWORD): TextSelector(
TextSelectorConfig(type=TextSelectorType.PASSWORD)
),
vol.Optional(CONF_HEADERS): ObjectSelector(),
vol.Optional(CONF_VERIFY_SSL, default=DEFAULT_VERIFY_SSL): BooleanSelector(),
vol.Optional(CONF_TIMEOUT, default=DEFAULT_TIMEOUT): NumberSelector(
NumberSelectorConfig(min=0, step=1, mode=NumberSelectorMode.BOX)
),
vol.Optional(CONF_ENCODING, default=DEFAULT_ENCODING): TextSelector(),
}
SENSOR_SETUP = { RESOURCE_SETUP = vol.Schema(
vol.Required(CONF_SELECT): TextSelector(),
vol.Optional(CONF_INDEX, default=0): NumberSelector(
NumberSelectorConfig(min=0, step=1, mode=NumberSelectorMode.BOX)
),
vol.Optional(CONF_ATTRIBUTE): TextSelector(),
vol.Optional(CONF_VALUE_TEMPLATE): TemplateSelector(),
vol.Optional(CONF_AVAILABILITY): TemplateSelector(),
vol.Optional(CONF_DEVICE_CLASS): SelectSelector(
SelectSelectorConfig(
options=[
cls.value for cls in SensorDeviceClass if cls != SensorDeviceClass.ENUM
],
mode=SelectSelectorMode.DROPDOWN,
translation_key="device_class",
sort=True,
)
),
vol.Optional(CONF_STATE_CLASS): SelectSelector(
SelectSelectorConfig(
options=[cls.value for cls in SensorStateClass],
mode=SelectSelectorMode.DROPDOWN,
translation_key="state_class",
sort=True,
)
),
vol.Optional(CONF_UNIT_OF_MEASUREMENT): SelectSelector(
SelectSelectorConfig(
options=[cls.value for cls in UnitOfTemperature],
custom_value=True,
mode=SelectSelectorMode.DROPDOWN,
translation_key="unit_of_measurement",
sort=True,
)
),
}
async def validate_rest_setup(
    handler: SchemaCommonFlowHandler, user_input: dict[str, Any]
) -> dict[str, Any]:
    """Validate rest setup."""
    hass = async_get_hass()
    rest_config: dict[str, Any] = COMBINED_SCHEMA(user_input)
    try:
        rest = create_rest_data_from_config(hass, rest_config)
        await rest.async_update()
    except Exception as err:
        raise SchemaFlowError("resource_error") from err
    if rest.data is None:
        raise SchemaFlowError("resource_error")
    return user_input


async def validate_sensor_setup(
    handler: SchemaCommonFlowHandler, user_input: dict[str, Any]
) -> dict[str, Any]:
    """Validate sensor input."""
    user_input[CONF_INDEX] = int(user_input[CONF_INDEX])
    user_input[CONF_UNIQUE_ID] = str(uuid.uuid1())
    # Standard behavior is to merge the result with the options.
    # In this case, we want to add a sub-item so we update the options directly.
    sensors: list[dict[str, Any]] = handler.options.setdefault(SENSOR_DOMAIN, [])
    sensors.append(user_input)
    return {}


async def validate_select_sensor(
    handler: SchemaCommonFlowHandler, user_input: dict[str, Any]
) -> dict[str, Any]:
    """Store sensor index in flow state."""
    handler.flow_state["_idx"] = int(user_input[CONF_INDEX])
    return {}


async def get_select_sensor_schema(handler: SchemaCommonFlowHandler) -> vol.Schema:
    """Return schema for selecting a sensor."""
    return vol.Schema(
        {
            vol.Required(CONF_INDEX): vol.In(
                {
                    str(index): config[CONF_NAME]
                    for index, config in enumerate(handler.options[SENSOR_DOMAIN])
                },
            )
        }
    )


async def get_edit_sensor_suggested_values(
    handler: SchemaCommonFlowHandler,
) -> dict[str, Any]:
    """Return suggested values for sensor editing."""
    idx: int = handler.flow_state["_idx"]
    return dict(handler.options[SENSOR_DOMAIN][idx])


async def validate_sensor_edit(
    handler: SchemaCommonFlowHandler, user_input: dict[str, Any]
) -> dict[str, Any]:
    """Update edited sensor."""
    user_input[CONF_INDEX] = int(user_input[CONF_INDEX])
    # Standard behavior is to merge the result with the options.
    # In this case, we want to add a sub-item so we update the options directly,
    # including popping omitted optional schema items.
    idx: int = handler.flow_state["_idx"]
    handler.options[SENSOR_DOMAIN][idx].update(user_input)
    for key in DATA_SCHEMA_EDIT_SENSOR.schema:
        if isinstance(key, vol.Optional) and key not in user_input:
            # Key not present, delete the key's old value (if present) too
            handler.options[SENSOR_DOMAIN][idx].pop(key, None)
    return {}
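The `validate_sensor_edit` loop works because voluptuous markers (`vol.Optional`, `vol.Required`) compare and hash like their underlying key string, so a marker can probe a plain-string dict directly. A minimal standalone sketch of the pop-omitted-optionals pattern (stdlib only; the `Optional`/`Required` classes below are simplified stand-ins for the voluptuous markers, not the real ones):

```python
class Required(str):
    """Stand-in for vol.Required: compares and hashes like its key string."""


class Optional(str):
    """Stand-in for vol.Optional."""


# Hypothetical edit schema: one required key, two optional keys.
EDIT_SCHEMA = {
    Required("select"): str,
    Optional("attribute"): str,
    Optional("value_template"): str,
}


def apply_edit(stored: dict, user_input: dict) -> dict:
    """Merge edited values, then drop optional keys omitted from the form."""
    stored.update(user_input)
    for key in EDIT_SCHEMA:
        # Optional("attribute") == "attribute", so a plain-string dict can be
        # probed (and popped) with the marker object directly.
        if isinstance(key, Optional) and key not in user_input:
            stored.pop(key, None)
    return stored


sensor = {"select": ".price", "attribute": "data-price"}
edited = apply_edit(sensor, {"select": ".total"})  # "attribute" was omitted
```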
async def get_remove_sensor_schema(handler: SchemaCommonFlowHandler) -> vol.Schema:
    """Return schema for sensor removal."""
    return vol.Schema(
        {
            vol.Required(CONF_INDEX): cv.multi_select(
                {
                    str(index): config[CONF_NAME]
                    for index, config in enumerate(handler.options[SENSOR_DOMAIN])
                },
            )
        }
    )


async def validate_remove_sensor(
    handler: SchemaCommonFlowHandler, user_input: dict[str, Any]
) -> dict[str, Any]:
    """Validate remove sensor."""
    removed_indexes: set[str] = set(user_input[CONF_INDEX])
    # Standard behavior is to merge the result with the options.
    # In this case, we want to remove sub-items so we update the options directly.
    entity_registry = er.async_get(handler.parent_handler.hass)
    sensors: list[dict[str, Any]] = []
    sensor: dict[str, Any]
    for index, sensor in enumerate(handler.options[SENSOR_DOMAIN]):
        if str(index) not in removed_indexes:
            sensors.append(sensor)
        elif entity_id := entity_registry.async_get_entity_id(
            SENSOR_DOMAIN, DOMAIN, sensor[CONF_UNIQUE_ID]
        ):
            entity_registry.async_remove(entity_id)
    handler.options[SENSOR_DOMAIN] = sensors
    return {}
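`validate_remove_sensor` keeps every sensor whose stringified position is absent from the multi-select result (`cv.multi_select` returns the chosen keys as strings). The filtering step in isolation, as a sketch without the entity-registry cleanup:

```python
def remove_by_index(sensors: list[dict], removed_indexes: set[str]) -> list[dict]:
    """Return the sensors whose position was not selected for removal."""
    # The multi-select form returns indexes as strings, hence str(index).
    return [
        sensor
        for index, sensor in enumerate(sensors)
        if str(index) not in removed_indexes
    ]


sensors = [{"name": "a"}, {"name": "b"}, {"name": "c"}]
kept = remove_by_index(sensors, {"1"})  # drops the second sensor
```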
-DATA_SCHEMA_RESOURCE = vol.Schema(RESOURCE_SETUP)
-DATA_SCHEMA_EDIT_SENSOR = vol.Schema(SENSOR_SETUP)
-DATA_SCHEMA_SENSOR = vol.Schema(
-    {
-        vol.Optional(CONF_NAME, default=DEFAULT_NAME): TextSelector(),
-        **SENSOR_SETUP,
-    }
-)
-CONFIG_FLOW = {
-    "user": SchemaFlowFormStep(
-        schema=DATA_SCHEMA_RESOURCE,
-        next_step="sensor",
-        validate_user_input=validate_rest_setup,
-    ),
-    "sensor": SchemaFlowFormStep(
-        schema=DATA_SCHEMA_SENSOR,
-        validate_user_input=validate_sensor_setup,
-    ),
-}
-OPTIONS_FLOW = {
-    "init": SchemaFlowMenuStep(
-        ["resource", "add_sensor", "select_edit_sensor", "remove_sensor"]
-    ),
-    "resource": SchemaFlowFormStep(
-        DATA_SCHEMA_RESOURCE,
-        validate_user_input=validate_rest_setup,
-    ),
-    "add_sensor": SchemaFlowFormStep(
-        DATA_SCHEMA_SENSOR,
-        suggested_values=None,
-        validate_user_input=validate_sensor_setup,
-    ),
-    "select_edit_sensor": SchemaFlowFormStep(
-        get_select_sensor_schema,
-        suggested_values=None,
-        validate_user_input=validate_select_sensor,
-        next_step="edit_sensor",
-    ),
-    "edit_sensor": SchemaFlowFormStep(
-        DATA_SCHEMA_EDIT_SENSOR,
-        suggested_values=get_edit_sensor_suggested_values,
-        validate_user_input=validate_sensor_edit,
-    ),
-    "remove_sensor": SchemaFlowFormStep(
-        get_remove_sensor_schema,
-        suggested_values=None,
-        validate_user_input=validate_remove_sensor,
-    ),
-}
+    {
+        vol.Required(CONF_RESOURCE): TextSelector(
+            TextSelectorConfig(type=TextSelectorType.URL)
+        ),
+        vol.Optional(CONF_METHOD, default=DEFAULT_METHOD): SelectSelector(
+            SelectSelectorConfig(options=METHODS, mode=SelectSelectorMode.DROPDOWN)
+        ),
+        vol.Optional(CONF_PAYLOAD): ObjectSelector(),
+        vol.Required(CONF_AUTH): data_entry_flow.section(
+            vol.Schema(
+                {
+                    vol.Optional(CONF_AUTHENTICATION): SelectSelector(
+                        SelectSelectorConfig(
+                            options=[
+                                HTTP_BASIC_AUTHENTICATION,
+                                HTTP_DIGEST_AUTHENTICATION,
+                            ],
+                            mode=SelectSelectorMode.DROPDOWN,
+                        )
+                    ),
+                    vol.Optional(CONF_USERNAME): TextSelector(
+                        TextSelectorConfig(
+                            type=TextSelectorType.TEXT, autocomplete="username"
+                        )
+                    ),
+                    vol.Optional(CONF_PASSWORD): TextSelector(
+                        TextSelectorConfig(
+                            type=TextSelectorType.PASSWORD,
+                            autocomplete="current-password",
+                        )
+                    ),
+                }
+            ),
+            data_entry_flow.SectionConfig(collapsed=True),
+        ),
+        vol.Required(CONF_ADVANCED): data_entry_flow.section(
+            vol.Schema(
+                {
+                    vol.Optional(CONF_HEADERS): ObjectSelector(),
+                    vol.Optional(
+                        CONF_VERIFY_SSL, default=DEFAULT_VERIFY_SSL
+                    ): BooleanSelector(),
+                    vol.Optional(CONF_TIMEOUT, default=DEFAULT_TIMEOUT): NumberSelector(
+                        NumberSelectorConfig(min=0, step=1, mode=NumberSelectorMode.BOX)
+                    ),
+                    vol.Optional(
+                        CONF_ENCODING, default=DEFAULT_ENCODING
+                    ): TextSelector(),
+                }
+            ),
+            data_entry_flow.SectionConfig(collapsed=True),
+        ),
+    }
+)
+SENSOR_SETUP = vol.Schema(
+    {
+        vol.Optional(CONF_NAME, default=DEFAULT_NAME): TextSelector(),
+        vol.Required(CONF_SELECT): TextSelector(),
+        vol.Optional(CONF_INDEX, default=0): vol.All(
+            NumberSelector(
+                NumberSelectorConfig(min=0, step=1, mode=NumberSelectorMode.BOX)
+            ),
+            vol.Coerce(int),
+        ),
+        vol.Required(CONF_ADVANCED): data_entry_flow.section(
+            vol.Schema(
+                {
+                    vol.Optional(CONF_ATTRIBUTE): TextSelector(),
+                    vol.Optional(CONF_VALUE_TEMPLATE): TemplateSelector(),
+                    vol.Optional(CONF_AVAILABILITY): TemplateSelector(),
+                    vol.Optional(CONF_DEVICE_CLASS): SelectSelector(
+                        SelectSelectorConfig(
+                            options=[
+                                cls.value
+                                for cls in SensorDeviceClass
+                                if cls != SensorDeviceClass.ENUM
+                            ],
+                            mode=SelectSelectorMode.DROPDOWN,
+                            translation_key="device_class",
+                            sort=True,
+                        )
+                    ),
+                    vol.Optional(CONF_STATE_CLASS): SelectSelector(
+                        SelectSelectorConfig(
+                            options=[cls.value for cls in SensorStateClass],
+                            mode=SelectSelectorMode.DROPDOWN,
+                            translation_key="state_class",
+                            sort=True,
+                        )
+                    ),
+                    vol.Optional(CONF_UNIT_OF_MEASUREMENT): SelectSelector(
+                        SelectSelectorConfig(
+                            options=[cls.value for cls in UnitOfTemperature],
+                            custom_value=True,
+                            mode=SelectSelectorMode.DROPDOWN,
+                            translation_key="unit_of_measurement",
+                            sort=True,
+                        )
+                    ),
+                }
+            ),
+            data_entry_flow.SectionConfig(collapsed=True),
+        ),
+    }
+)
-class ScrapeConfigFlowHandler(SchemaConfigFlowHandler, domain=DOMAIN):
-    """Handle a config flow for Scrape."""
-
-    config_flow = CONFIG_FLOW
-    options_flow = OPTIONS_FLOW
-    options_flow_reloads = True
-
-    def async_config_entry_title(self, options: Mapping[str, Any]) -> str:
-        """Return config entry title."""
-        return cast(str, options[CONF_RESOURCE])
+async def validate_rest_setup(
+    hass: HomeAssistant, user_input: dict[str, Any]
+) -> dict[str, Any]:
+    """Validate rest setup."""
+    config = deepcopy(user_input)
+    config.update(config.pop(CONF_ADVANCED, {}))
+    config.update(config.pop(CONF_AUTH, {}))
+    rest_config: dict[str, Any] = COMBINED_SCHEMA(config)
+    try:
+        rest = create_rest_data_from_config(hass, rest_config)
+        await rest.async_update()
+    except Exception:
+        _LOGGER.exception("Error when getting resource %s", config[CONF_RESOURCE])
+        return {"base": "resource_error"}
+    if rest.data is None:
+        return {"base": "no_data"}
+    return {}
+class ScrapeConfigFlow(ConfigFlow, domain=DOMAIN):
+    """Scrape configuration flow."""
+
+    VERSION = 2
    @staticmethod
    @callback
    def async_get_options_flow(config_entry: ConfigEntry) -> ScrapeOptionFlow:
        """Get the options flow for this handler."""
        return ScrapeOptionFlow()

    @classmethod
    @callback
    def async_get_supported_subentry_types(
        cls, config_entry: ConfigEntry
    ) -> dict[str, type[ConfigSubentryFlow]]:
        """Return subentries supported by this handler."""
        return {"entity": ScrapeSubentryFlowHandler}

    async def async_step_user(
        self, user_input: dict[str, Any] | None = None
    ) -> ConfigFlowResult:
        """User flow to create the main config entry."""
        errors: dict[str, str] = {}
        if user_input is not None:
            errors = await validate_rest_setup(self.hass, user_input)
            title = user_input[CONF_RESOURCE]
            if not errors:
                return self.async_create_entry(data={}, options=user_input, title=title)
        return self.async_show_form(
            step_id="user",
            data_schema=self.add_suggested_values_to_schema(
                RESOURCE_SETUP, user_input or {}
            ),
            errors=errors,
        )


class ScrapeOptionFlow(OptionsFlow):
    """Scrape Options flow."""

    async def async_step_init(
        self, user_input: dict[str, Any] | None = None
    ) -> ConfigFlowResult:
        """Manage Scrape options."""
        errors: dict[str, str] = {}
        if user_input is not None:
            errors = await validate_rest_setup(self.hass, user_input)
            if not errors:
                return self.async_create_entry(data=user_input)
        return self.async_show_form(
            step_id="init",
            data_schema=self.add_suggested_values_to_schema(
                RESOURCE_SETUP,
                user_input or self.config_entry.options,
            ),
            errors=errors,
        )


class ScrapeSubentryFlowHandler(ConfigSubentryFlow):
    """Handle subentry flow."""

    async def async_step_user(
        self, user_input: dict[str, Any] | None = None
    ) -> SubentryFlowResult:
        """User flow to create a sensor subentry."""
        if user_input is not None:
            title = user_input.pop("name")
            return self.async_create_entry(data=user_input, title=title)
        return self.async_show_form(
            step_id="user",
            data_schema=self.add_suggested_values_to_schema(
                SENSOR_SETUP, user_input or {}
            ),
        )
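The new `validate_rest_setup` flattens the collapsed `auth` and `advanced` form sections back into a single-level dict before running it through `COMBINED_SCHEMA`. The same flattening in isolation (stdlib only; the key names are illustrative, not the integration's exact constants):

```python
from copy import deepcopy


def flatten_sections(user_input: dict, sections: tuple[str, ...]) -> dict:
    """Merge nested section dicts into the top level without mutating the input."""
    config = deepcopy(user_input)
    for section in sections:
        # pop() with a default tolerates a section the user never filled in.
        config.update(config.pop(section, {}))
    return config


form = {
    "resource": "http://example.org",
    "advanced": {"timeout": 10, "verify_ssl": True},
    "auth": {"username": "me"},
}
flat = flatten_sections(form, ("advanced", "auth"))
```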

View File

@@ -14,6 +14,8 @@ DEFAULT_SCAN_INTERVAL = timedelta(minutes=10)
PLATFORMS = [Platform.SENSOR]
+CONF_ADVANCED = "advanced"
+CONF_AUTH = "auth"
CONF_ENCODING = "encoding"
CONF_SELECT = "select"
CONF_INDEX = "index"

View File

@@ -0,0 +1,21 @@
{
  "config": {
    "step": {
      "user": {
        "sections": {
          "advanced": "mdi:cog",
          "auth": "mdi:lock"
        }
      }
    }
  },
  "options": {
    "step": {
      "init": {
        "sections": {
          "advanced": "mdi:cog"
        }
      }
    }
  }
}

View File

@@ -46,9 +46,10 @@ TRIGGER_ENTITY_OPTIONS = (
    CONF_AVAILABILITY,
    CONF_DEVICE_CLASS,
    CONF_ICON,
+    CONF_NAME,
    CONF_PICTURE,
-    CONF_UNIQUE_ID,
    CONF_STATE_CLASS,
+    CONF_UNIQUE_ID,
    CONF_UNIT_OF_MEASUREMENT,
)
@@ -70,7 +71,7 @@ async def async_setup_platform(
    entities: list[ScrapeSensor] = []
    for sensor_config in sensors_config:
-        trigger_entity_config = {CONF_NAME: sensor_config[CONF_NAME]}
+        trigger_entity_config = {}
        for key in TRIGGER_ENTITY_OPTIONS:
            if key not in sensor_config:
                continue
@@ -98,23 +99,24 @@ async def async_setup_entry(
    async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
    """Set up the Scrape sensor entry."""
-    entities: list = []
    coordinator = entry.runtime_data
-    config = dict(entry.options)
-    for sensor in config["sensor"]:
+    for subentry in entry.subentries.values():
+        sensor = dict(subentry.data)
+        sensor.update(sensor.pop("advanced", {}))
+        sensor[CONF_UNIQUE_ID] = subentry.subentry_id
+        sensor[CONF_NAME] = subentry.title
        sensor_config: ConfigType = vol.Schema(
            TEMPLATE_SENSOR_BASE_SCHEMA.schema, extra=vol.ALLOW_EXTRA
        )(sensor)
-        name: str = sensor_config[CONF_NAME]
        value_string: str | None = sensor_config.get(CONF_VALUE_TEMPLATE)
        value_template: ValueTemplate | None = (
            ValueTemplate(value_string, hass) if value_string is not None else None
        )
-        trigger_entity_config: dict[str, str | Template | None] = {CONF_NAME: name}
+        trigger_entity_config: dict[str, str | Template | None] = {}
        for key in TRIGGER_ENTITY_OPTIONS:
            if key not in sensor_config:
                continue
@@ -123,21 +125,22 @@ async def async_setup_entry(
            continue
        trigger_entity_config[key] = sensor_config[key]
-        entities.append(
-            ScrapeSensor(
-                hass,
-                coordinator,
-                trigger_entity_config,
-                sensor_config[CONF_SELECT],
-                sensor_config.get(CONF_ATTRIBUTE),
-                sensor_config[CONF_INDEX],
-                value_template,
-                False,
-            )
-        )
-    async_add_entities(entities)
+        async_add_entities(
+            [
+                ScrapeSensor(
+                    hass,
+                    coordinator,
+                    trigger_entity_config,
+                    sensor_config[CONF_SELECT],
+                    sensor_config.get(CONF_ATTRIBUTE),
+                    sensor_config[CONF_INDEX],
+                    value_template,
+                    False,
+                )
+            ],
+            config_subentry_id=subentry.subentry_id,
+        )
class ScrapeSensor(CoordinatorEntity[ScrapeCoordinator], ManualTriggerSensorEntity):
    """Representation of a web scrape sensor."""

View File

@@ -4,134 +4,140 @@
      "already_configured": "[%key:common::config_flow::abort::already_configured_account%]"
    },
    "error": {
-      "resource_error": "Could not update rest data. Verify your configuration"
+      "no_data": "REST data is empty. Verify your configuration",
+      "resource_error": "Could not update REST data. Verify your configuration"
    },
    "step": {
-      "sensor": {
-        "data": {
-          "attribute": "Attribute",
-          "availability": "Availability template",
-          "device_class": "Device class",
-          "index": "Index",
-          "name": "[%key:common::config_flow::data::name%]",
-          "select": "Select",
-          "state_class": "State class",
-          "unit_of_measurement": "Unit of measurement",
-          "value_template": "Value template"
-        },
-        "data_description": {
-          "attribute": "Get value of an attribute on the selected tag.",
-          "availability": "Defines a template to get the availability of the sensor.",
-          "device_class": "The type/class of the sensor to set the icon in the frontend.",
-          "index": "Defines which of the elements returned by the CSS selector to use.",
-          "select": "Defines what tag to search for. Check Beautifulsoup CSS selectors for details.",
-          "state_class": "The state_class of the sensor.",
-          "unit_of_measurement": "Choose unit of measurement or create your own.",
-          "value_template": "Defines a template to get the state of the sensor."
-        }
-      },
      "user": {
        "data": {
-          "authentication": "Select authentication method",
-          "encoding": "Character encoding",
-          "headers": "Headers",
          "method": "Method",
-          "password": "[%key:common::config_flow::data::password%]",
          "payload": "Payload",
-          "resource": "Resource",
-          "timeout": "Timeout",
-          "username": "[%key:common::config_flow::data::username%]",
-          "verify_ssl": "[%key:common::config_flow::data::verify_ssl%]"
+          "resource": "Resource"
        },
        "data_description": {
-          "authentication": "Type of the HTTP authentication. Either basic or digest.",
-          "encoding": "Character encoding to use. Defaults to UTF-8.",
-          "headers": "Headers to use for the web request.",
          "payload": "Payload to use when method is POST.",
-          "resource": "The URL to the website that contains the value.",
-          "timeout": "Timeout for connection to website.",
-          "verify_ssl": "Enables/disables verification of SSL/TLS certificate, for example if it is self-signed."
+          "resource": "The URL to the website that contains the value."
        },
+        "sections": {
+          "advanced": {
+            "data": {
+              "encoding": "Character encoding",
+              "headers": "Headers",
+              "timeout": "Timeout",
+              "verify_ssl": "[%key:common::config_flow::data::verify_ssl%]"
+            },
+            "data_description": {
+              "encoding": "Character encoding to use. Defaults to UTF-8.",
+              "headers": "Headers to use for the web request.",
+              "timeout": "Timeout for connection to website.",
+              "verify_ssl": "Enables/disables verification of SSL/TLS certificate, for example if it is self-signed."
+            },
+            "description": "Provide additional advanced settings for the resource.",
+            "name": "Advanced settings"
+          },
+          "auth": {
+            "data": {
+              "authentication": "Select authentication method",
+              "password": "[%key:common::config_flow::data::password%]",
+              "username": "[%key:common::config_flow::data::username%]"
+            },
+            "data_description": {
+              "authentication": "Type of the HTTP authentication. Either basic or digest."
+            },
+            "description": "Provide authentication details to access the resource.",
+            "name": "Authentication settings"
+          }
+        }
      }
    }
  },
+  "config_subentries": {
+    "entity": {
+      "entry_type": "Sensor",
+      "initiate_flow": {
+        "user": "Add sensor"
+      },
+      "step": {
+        "user": {
+          "data": {
+            "index": "Index",
+            "select": "Select"
+          },
+          "data_description": {
+            "index": "Defines which of the elements returned by the CSS selector to use.",
+            "select": "Defines what tag to search for. Check Beautifulsoup CSS selectors for details."
+          },
+          "sections": {
+            "advanced": {
+              "data": {
+                "attribute": "Attribute",
+                "availability": "Availability template",
+                "device_class": "Device class",
+                "state_class": "State class",
+                "unit_of_measurement": "Unit of measurement",
+                "value_template": "Value template"
+              },
+              "data_description": {
+                "attribute": "Get value of an attribute on the selected tag.",
+                "availability": "Defines a template to get the availability of the sensor.",
+                "device_class": "The type/class of the sensor to set the icon in the frontend.",
+                "state_class": "The state_class of the sensor.",
+                "unit_of_measurement": "Choose unit of measurement or create your own.",
+                "value_template": "Defines a template to get the state of the sensor."
+              },
+              "description": "Provide additional advanced settings for the sensor.",
+              "name": "Advanced settings"
+            }
+          }
+        }
+      }
+    }
+  },
  "options": {
+    "error": {
+      "no_data": "[%key:component::scrape::config::error::no_data%]",
+      "resource_error": "[%key:component::scrape::config::error::resource_error%]"
+    },
    "step": {
-      "add_sensor": {
-        "data": {
-          "attribute": "[%key:component::scrape::config::step::sensor::data::attribute%]",
-          "availability": "[%key:component::scrape::config::step::sensor::data::availability%]",
-          "device_class": "[%key:component::scrape::config::step::sensor::data::device_class%]",
-          "index": "[%key:component::scrape::config::step::sensor::data::index%]",
-          "name": "[%key:common::config_flow::data::name%]",
-          "select": "[%key:component::scrape::config::step::sensor::data::select%]",
-          "state_class": "[%key:component::scrape::config::step::sensor::data::state_class%]",
-          "unit_of_measurement": "[%key:component::scrape::config::step::sensor::data::unit_of_measurement%]",
-          "value_template": "[%key:component::scrape::config::step::sensor::data::value_template%]"
-        },
-        "data_description": {
-          "attribute": "[%key:component::scrape::config::step::sensor::data_description::attribute%]",
-          "availability": "[%key:component::scrape::config::step::sensor::data_description::availability%]",
-          "device_class": "[%key:component::scrape::config::step::sensor::data_description::device_class%]",
-          "index": "[%key:component::scrape::config::step::sensor::data_description::index%]",
-          "select": "[%key:component::scrape::config::step::sensor::data_description::select%]",
-          "state_class": "[%key:component::scrape::config::step::sensor::data_description::state_class%]",
-          "unit_of_measurement": "[%key:component::scrape::config::step::sensor::data_description::unit_of_measurement%]",
-          "value_template": "[%key:component::scrape::config::step::sensor::data_description::value_template%]"
-        }
-      },
-      "edit_sensor": {
-        "data": {
-          "attribute": "[%key:component::scrape::config::step::sensor::data::attribute%]",
-          "availability": "[%key:component::scrape::config::step::sensor::data::availability%]",
-          "device_class": "[%key:component::scrape::config::step::sensor::data::device_class%]",
-          "index": "[%key:component::scrape::config::step::sensor::data::index%]",
-          "name": "[%key:common::config_flow::data::name%]",
-          "select": "[%key:component::scrape::config::step::sensor::data::select%]",
-          "state_class": "[%key:component::scrape::config::step::sensor::data::state_class%]",
-          "unit_of_measurement": "[%key:component::scrape::config::step::sensor::data::unit_of_measurement%]",
-          "value_template": "[%key:component::scrape::config::step::sensor::data::value_template%]"
-        },
-        "data_description": {
-          "attribute": "[%key:component::scrape::config::step::sensor::data_description::attribute%]",
-          "availability": "[%key:component::scrape::config::step::sensor::data_description::availability%]",
-          "device_class": "[%key:component::scrape::config::step::sensor::data_description::device_class%]",
-          "index": "[%key:component::scrape::config::step::sensor::data_description::index%]",
-          "select": "[%key:component::scrape::config::step::sensor::data_description::select%]",
-          "state_class": "[%key:component::scrape::config::step::sensor::data_description::state_class%]",
-          "unit_of_measurement": "[%key:component::scrape::config::step::sensor::data_description::unit_of_measurement%]",
-          "value_template": "[%key:component::scrape::config::step::sensor::data_description::value_template%]"
-        }
-      },
      "init": {
-        "menu_options": {
-          "add_sensor": "Add sensor",
-          "remove_sensor": "Remove sensor",
-          "resource": "Configure resource",
-          "select_edit_sensor": "Configure sensor"
-        }
-      },
-      "resource": {
        "data": {
-          "authentication": "[%key:component::scrape::config::step::user::data::authentication%]",
-          "encoding": "[%key:component::scrape::config::step::user::data::encoding%]",
-          "headers": "[%key:component::scrape::config::step::user::data::headers%]",
          "method": "[%key:component::scrape::config::step::user::data::method%]",
-          "password": "[%key:common::config_flow::data::password%]",
          "payload": "[%key:component::scrape::config::step::user::data::payload%]",
-          "resource": "[%key:component::scrape::config::step::user::data::resource%]",
-          "timeout": "[%key:component::scrape::config::step::user::data::timeout%]",
-          "username": "[%key:common::config_flow::data::username%]",
-          "verify_ssl": "[%key:common::config_flow::data::verify_ssl%]"
+          "resource": "[%key:component::scrape::config::step::user::data::resource%]"
        },
        "data_description": {
-          "authentication": "[%key:component::scrape::config::step::user::data_description::authentication%]",
-          "encoding": "[%key:component::scrape::config::step::user::data_description::encoding%]",
-          "headers": "[%key:component::scrape::config::step::user::data_description::headers%]",
          "payload": "[%key:component::scrape::config::step::user::data_description::payload%]",
-          "resource": "[%key:component::scrape::config::step::user::data_description::resource%]",
-          "timeout": "[%key:component::scrape::config::step::user::data_description::timeout%]",
-          "verify_ssl": "[%key:component::scrape::config::step::user::data_description::verify_ssl%]"
+          "resource": "[%key:component::scrape::config::step::user::data_description::resource%]"
+        },
+        "sections": {
+          "advanced": {
+            "data": {
+              "encoding": "[%key:component::scrape::config::step::user::sections::advanced::data::encoding%]",
+              "headers": "[%key:component::scrape::config::step::user::sections::advanced::data::headers%]",
+              "timeout": "[%key:component::scrape::config::step::user::sections::advanced::data::timeout%]",
+              "verify_ssl": "[%key:common::config_flow::data::verify_ssl%]"
+            },
+            "data_description": {
+              "encoding": "[%key:component::scrape::config::step::user::sections::advanced::data_description::encoding%]",
+              "headers": "[%key:component::scrape::config::step::user::sections::advanced::data_description::headers%]",
+              "timeout": "[%key:component::scrape::config::step::user::sections::advanced::data_description::timeout%]",
+              "verify_ssl": "[%key:component::scrape::config::step::user::sections::advanced::data_description::verify_ssl%]"
+            },
+            "description": "[%key:component::scrape::config::step::user::sections::advanced::description%]",
+            "name": "[%key:component::scrape::config::step::user::sections::advanced::name%]"
+          },
+          "auth": {
+            "data": {
+              "authentication": "[%key:component::scrape::config::step::user::sections::auth::data::authentication%]",
+              "password": "[%key:common::config_flow::data::password%]",
+              "username": "[%key:common::config_flow::data::username%]"
+            },
+            "data_description": {
+              "authentication": "[%key:component::scrape::config::step::user::sections::auth::data_description::authentication%]"
+            },
+            "description": "[%key:component::scrape::config::step::user::sections::auth::description%]",
+            "name": "[%key:component::scrape::config::step::user::sections::auth::name%]"
+          }
+        }
      }
    }
  }

View File

@@ -5,5 +5,5 @@
  "documentation": "https://www.home-assistant.io/integrations/seven_segments",
  "iot_class": "local_polling",
  "quality_scale": "legacy",
-  "requirements": ["Pillow==12.1.1"]
+  "requirements": ["Pillow==12.2.0"]
}

View File

@@ -807,7 +807,7 @@ class ShellyConfigFlow(ConfigFlow, domain=DOMAIN):
        )
        ssid_options = [network["ssid"] for network in sorted_networks]
-        # Pre-select SSID if returning from failed provisioning attempt
+        # Preselect SSID if returning from failed provisioning attempt
        suggested_values: dict[str, Any] = {}
        if self.selected_ssid:
            suggested_values[CONF_SSID] = self.selected_ssid
@@ -1086,7 +1086,7 @@ class ShellyConfigFlow(ConfigFlow, domain=DOMAIN):
    ) -> ConfigFlowResult:
        """Handle failed provisioning - allow retry."""
        if user_input is not None:
-            # User wants to retry - keep selected_ssid so it's pre-selected
+            # User wants to retry - keep selected_ssid so it's preselected
            self.wifi_networks = []
            return await self.async_step_wifi_scan()

View File

@@ -6,5 +6,5 @@
  "iot_class": "cloud_polling",
  "loggers": ["simplehound"],
  "quality_scale": "legacy",
-  "requirements": ["Pillow==12.1.1", "simplehound==0.3"]
+  "requirements": ["Pillow==12.2.0", "simplehound==0.3"]
}

View File

@@ -6,5 +6,5 @@
  "config_flow": true,
  "documentation": "https://www.home-assistant.io/integrations/sql",
  "iot_class": "local_polling",
-  "requirements": ["SQLAlchemy==2.0.41", "sqlparse==0.5.5"]
+  "requirements": ["SQLAlchemy==2.0.49", "sqlparse==0.5.5"]
}

View File

@@ -25,7 +25,6 @@ from homeassistant.components.recorder.statistics import (
)
from homeassistant.const import UnitOfEnergy
from homeassistant.core import HomeAssistant
-from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from homeassistant.util import dt as dt_util
from homeassistant.util.unit_conversion import EnergyConverter
@@ -92,11 +91,40 @@ def _build_home_data(home: tibber.TibberHome) -> TibberHomeData:
    return result
-class TibberDataCoordinator(DataUpdateCoordinator[None]):
-    """Handle Tibber data and insert statistics."""
+class TibberCoordinator[_DataT](DataUpdateCoordinator[_DataT]):
+    """Base Tibber coordinator."""
    config_entry: TibberConfigEntry
+    def __init__(
+        self,
+        hass: HomeAssistant,
+        config_entry: TibberConfigEntry,
+        *,
+        name: str,
+        update_interval: timedelta,
+    ) -> None:
+        """Initialize the coordinator."""
+        super().__init__(
+            hass,
+            _LOGGER,
+            config_entry=config_entry,
+            name=name,
+            update_interval=update_interval,
+        )
+        self._runtime_data = config_entry.runtime_data
+    async def _async_get_client(self) -> tibber.Tibber:
+        """Get the Tibber client with error handling."""
+        try:
+            return await self._runtime_data.async_get_client(self.hass)
+        except (ClientError, TimeoutError, tibber.exceptions.HttpExceptionError) as err:
+            raise UpdateFailed(f"Unable to create Tibber client: {err}") from err
+class TibberDataCoordinator(TibberCoordinator[None]):
+    """Handle Tibber data and insert statistics."""
    def __init__(
        self,
        hass: HomeAssistant,
@@ -106,17 +134,14 @@ class TibberDataCoordinator(DataUpdateCoordinator[None]):
        """Initialize the data handler."""
        super().__init__(
            hass,
-            _LOGGER,
-            config_entry=config_entry,
+            config_entry,
            name=f"Tibber {tibber_connection.name}",
            update_interval=timedelta(minutes=20),
        )
    async def _async_update_data(self) -> None:
        """Update data via API."""
-        tibber_connection = await self.config_entry.runtime_data.async_get_client(
-            self.hass
-        )
+        tibber_connection = await self._async_get_client()
        try:
            await tibber_connection.fetch_consumption_data_active_homes()
@@ -132,9 +157,7 @@ class TibberDataCoordinator(DataUpdateCoordinator[None]):
    async def _insert_statistics(self) -> None:
        """Insert Tibber statistics."""
-        tibber_connection = await self.config_entry.runtime_data.async_get_client(
-            self.hass
-        )
+        tibber_connection = await self._async_get_client()
        for home in tibber_connection.get_homes():
            sensors: list[tuple[str, bool, str | None, str]] = []
            if home.hourly_consumption_data:
@@ -254,11 +277,9 @@ class TibberDataCoordinator(DataUpdateCoordinator[None]):
        async_add_external_statistics(self.hass, metadata, statistics)
-class TibberPriceCoordinator(DataUpdateCoordinator[dict[str, TibberHomeData]]):
+class TibberPriceCoordinator(TibberCoordinator[dict[str, TibberHomeData]]):
    """Handle Tibber price data and insert statistics."""
-    config_entry: TibberConfigEntry
    def __init__(
        self,
        hass: HomeAssistant,
@@ -267,8 +288,7 @@ class TibberPriceCoordinator(DataUpdateCoordinator[dict[str, TibberHomeData]]):
        """Initialize the price coordinator."""
        super().__init__(
            hass,
-            _LOGGER,
-            config_entry=config_entry,
+            config_entry,
            name=f"{DOMAIN} price",
            update_interval=timedelta(minutes=1),
        )
@@ -290,9 +310,7 @@ class TibberPriceCoordinator(DataUpdateCoordinator[dict[str, TibberHomeData]]):
    async def _async_update_data(self) -> dict[str, TibberHomeData]:
        """Update data via API and return per-home data for sensors."""
-        tibber_connection = await self.config_entry.runtime_data.async_get_client(
-            self.hass
-        )
+        tibber_connection = await self._async_get_client()
        active_homes = tibber_connection.get_homes(only_active=True)
        now = dt_util.now()
@@ -347,11 +365,9 @@ class TibberPriceCoordinator(DataUpdateCoordinator[dict[str, TibberHomeData]]):
    return result
-class TibberDataAPICoordinator(DataUpdateCoordinator[dict[str, TibberDevice]]):
+class TibberDataAPICoordinator(TibberCoordinator[dict[str, TibberDevice]]):
    """Fetch and cache Tibber Data API device capabilities."""
-    config_entry: TibberConfigEntry
    def __init__(
        self,
        hass: HomeAssistant,
@@ -360,12 +376,10 @@ class TibberDataAPICoordinator(DataUpdateCoordinator[dict[str, TibberDevice]]):
        """Initialize the coordinator."""
        super().__init__(
            hass,
-            _LOGGER,
+            entry,
            name=f"{DOMAIN} Data API",
            update_interval=timedelta(minutes=1),
-            config_entry=entry,
        )
-        self._runtime_data = entry.runtime_data
        self.sensors_by_device: dict[str, dict[str, tibber.data_api.Sensor]] = {}
    def _build_sensor_lookup(self, devices: dict[str, TibberDevice]) -> None:
@@ -383,15 +397,6 @@ class TibberDataAPICoordinator(DataUpdateCoordinator[dict[str, TibberDevice]]):
                return device_sensors.get(sensor_id)
        return None

-    async def _async_get_client(self) -> tibber.Tibber:
-        """Get the Tibber client with error handling."""
-        try:
-            return await self._runtime_data.async_get_client(self.hass)
-        except ConfigEntryAuthFailed:
-            raise
-        except (ClientError, TimeoutError, tibber.UserAgentMissingError) as err:
-            raise UpdateFailed(f"Unable to create Tibber client: {err}") from err
-
    async def _async_setup(self) -> None:
        """Initial load of Tibber Data API devices."""
        client = await self._async_get_client()


@@ -8,5 +8,5 @@
  "integration_type": "hub",
  "iot_class": "cloud_polling",
  "loggers": ["tibber"],
-  "requirements": ["pyTibber==0.37.0"]
+  "requirements": ["pyTibber==0.37.1"]
}


@@ -17,8 +17,9 @@ from tuya_sharing import (
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import ConfigEntryAuthFailed
-from homeassistant.helpers import device_registry as dr
+from homeassistant.helpers import config_validation as cv, device_registry as dr
from homeassistant.helpers.dispatcher import dispatcher_send
from homeassistant.helpers.typing import ConfigType

from .const import (
    CONF_ENDPOINT,
@@ -32,6 +33,9 @@ from .const import (
    TUYA_DISCOVERY_NEW,
    TUYA_HA_SIGNAL_UPDATE_ENTITY,
)
from .services import async_setup_services

CONFIG_SCHEMA = cv.config_entry_only_config_schema(DOMAIN)

# Suppress logs from the library, it logs unneeded on error
logging.getLogger("tuya_sharing").setLevel(logging.CRITICAL)
@@ -58,6 +62,13 @@ def _create_manager(entry: TuyaConfigEntry, token_listener: TokenListener) -> Ma
    )


async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
    """Set up the Tuya Services."""
    await async_setup_services(hass)
    return True


async def async_setup_entry(hass: HomeAssistant, entry: TuyaConfigEntry) -> bool:
    """Async setup hass config entry."""
    await hass.async_add_executor_job(


@@ -381,5 +381,13 @@
        "default": "mdi:watermark"
      }
    }
  },
  "services": {
    "get_feeder_meal_plan": {
      "service": "mdi:database-eye"
    },
    "set_feeder_meal_plan": {
      "service": "mdi:database-edit"
    }
  }
}


@@ -0,0 +1,160 @@
"""Services for Tuya integration."""
from enum import StrEnum
from typing import Any
from tuya_device_handlers.device_wrapper.service_feeder_schedule import (
FeederSchedule,
get_feeder_schedule_wrapper,
)
from tuya_sharing import CustomerDevice, Manager
import voluptuous as vol
from homeassistant.const import ATTR_DEVICE_ID
from homeassistant.core import HomeAssistant, ServiceCall, SupportsResponse
from homeassistant.exceptions import HomeAssistantError, ServiceValidationError
from homeassistant.helpers import device_registry as dr
from .const import DOMAIN
DAYS = ["monday", "tuesday", "wednesday", "thursday", "friday", "saturday", "sunday"]
FEEDING_ENTRY_SCHEMA = vol.Schema(
{
vol.Optional("days"): [vol.In(DAYS)],
vol.Required("time"): str,
vol.Required("portion"): int,
vol.Required("enabled"): bool,
}
)
class Service(StrEnum):
"""Tuya services."""
GET_FEEDER_MEAL_PLAN = "get_feeder_meal_plan"
SET_FEEDER_MEAL_PLAN = "set_feeder_meal_plan"
def _get_tuya_device(
hass: HomeAssistant, device_id: str
) -> tuple[CustomerDevice, Manager]:
"""Get a Tuya device and manager from a Home Assistant device registry ID."""
device_registry = dr.async_get(hass)
device_entry = device_registry.async_get(device_id)
if device_entry is None:
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="device_not_found",
translation_placeholders={
"device_id": device_id,
},
)
# Find the Tuya device ID from identifiers
tuya_device_id = None
for identifier_domain, identifier_value in device_entry.identifiers:
if identifier_domain == DOMAIN:
tuya_device_id = identifier_value
break
if tuya_device_id is None:
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="device_not_tuya_device",
translation_placeholders={
"device_id": device_id,
},
)
# Find the device in Tuya config entry
for entry in hass.config_entries.async_loaded_entries(DOMAIN):
manager = entry.runtime_data.manager
if tuya_device_id in manager.device_map:
return manager.device_map[tuya_device_id], manager
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="device_not_found",
translation_placeholders={
"device_id": device_id,
},
)
async def async_get_feeder_meal_plan(
call: ServiceCall,
) -> dict[str, Any]:
"""Handle get_feeder_meal_plan service call."""
device, _ = _get_tuya_device(call.hass, call.data[ATTR_DEVICE_ID])
if not (wrapper := get_feeder_schedule_wrapper(device)):
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="device_not_support_meal_plan_status",
translation_placeholders={
"device_id": device.id,
},
)
meal_plan = wrapper.read_device_status(device)
if meal_plan is None:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="invalid_meal_plan_data",
)
return {"meal_plan": meal_plan}
async def async_set_feeder_meal_plan(call: ServiceCall) -> None:
"""Handle set_feeder_meal_plan service call."""
device, manager = _get_tuya_device(call.hass, call.data[ATTR_DEVICE_ID])
if not (wrapper := get_feeder_schedule_wrapper(device)):
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="device_not_support_meal_plan_function",
translation_placeholders={
"device_id": device.id,
},
)
meal_plan: list[FeederSchedule] = call.data["meal_plan"]
await call.hass.async_add_executor_job(
manager.send_commands,
device.id,
wrapper.get_update_commands(device, meal_plan),
)
async def async_setup_services(hass: HomeAssistant) -> None:
"""Set up Tuya services."""
hass.services.async_register(
DOMAIN,
Service.GET_FEEDER_MEAL_PLAN,
async_get_feeder_meal_plan,
schema=vol.Schema(
{
vol.Required(ATTR_DEVICE_ID): str,
}
),
supports_response=SupportsResponse.ONLY,
)
hass.services.async_register(
DOMAIN,
Service.SET_FEEDER_MEAL_PLAN,
async_set_feeder_meal_plan,
schema=vol.Schema(
{
vol.Required(ATTR_DEVICE_ID): str,
vol.Required("meal_plan"): vol.All(
list,
[FEEDING_ENTRY_SCHEMA],
),
}
),
)
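The `FEEDING_ENTRY_SCHEMA` above is a voluptuous schema (optional `days`, required `time`/`portion`/`enabled`). As a plain-stdlib illustration of the same checks — the `validate_entry` helper below is hypothetical and not part of the integration:

```python
# Hypothetical stdlib sketch of the checks FEEDING_ENTRY_SCHEMA performs;
# the real integration validates with voluptuous.
DAYS = ["monday", "tuesday", "wednesday", "thursday", "friday", "saturday", "sunday"]


def validate_entry(entry: dict) -> dict:
    """Validate one feeding entry: optional days, required time/portion/enabled."""
    if "days" in entry and any(day not in DAYS for day in entry["days"]):
        raise ValueError("days must be weekday names")
    for key, expected in (("time", str), ("portion", int), ("enabled", bool)):
        if not isinstance(entry.get(key), expected):
            raise ValueError(f"{key} is required and must be {expected.__name__}")
    return entry
```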


@@ -0,0 +1,51 @@
get_feeder_meal_plan:
fields:
device_id:
required: true
selector:
device:
integration: tuya
set_feeder_meal_plan:
fields:
device_id:
required: true
selector:
device:
integration: tuya
meal_plan:
required: true
selector:
object:
translation_key: set_feeder_meal_plan
description_field: portion
multiple: true
fields:
days:
selector:
select:
options:
- monday
- tuesday
- wednesday
- thursday
- friday
- saturday
- sunday
multiple: true
translation_key: days_of_week
time:
selector:
time:
portion:
selector:
number:
min: 0
max: 100
mode: box
unit_of_measurement: "g"
enabled:
selector:
boolean: {}


@@ -1099,6 +1099,80 @@
    "exceptions": {
      "action_dpcode_not_found": {
        "message": "Unable to process action as the device does not provide a corresponding function code (expected one of {expected} in {available})."
},
"device_not_found": {
"message": "Feeder with ID {device_id} could not be found."
},
"device_not_support_meal_plan_function": {
"message": "Feeder with ID {device_id} does not support meal plan functionality."
},
"device_not_support_meal_plan_status": {
"message": "Feeder with ID {device_id} does not support meal plan status."
},
"device_not_tuya_device": {
"message": "Device with ID {device_id} is not a Tuya feeder."
},
"invalid_meal_plan_data": {
"message": "Unable to parse meal plan data."
}
},
"selector": {
"days_of_week": {
"options": {
"friday": "[%key:common::time::friday%]",
"monday": "[%key:common::time::monday%]",
"saturday": "[%key:common::time::saturday%]",
"sunday": "[%key:common::time::sunday%]",
"thursday": "[%key:common::time::thursday%]",
"tuesday": "[%key:common::time::tuesday%]",
"wednesday": "[%key:common::time::wednesday%]"
}
},
"set_feeder_meal_plan": {
"fields": {
"days": {
"description": "Days of the week for the meal plan.",
"name": "Days"
},
"enabled": {
"description": "Whether the meal plan is enabled.",
"name": "Enabled"
},
"portion": {
"description": "Amount in grams",
"name": "Portion"
},
"time": {
"description": "Time of the meal.",
"name": "Time"
}
}
}
},
"services": {
"get_feeder_meal_plan": {
"description": "Retrieves a meal plan from a Tuya feeder.",
"fields": {
"device_id": {
"description": "The Tuya feeder.",
"name": "[%key:common::config_flow::data::device%]"
}
},
"name": "Get feeder meal plan data"
},
"set_feeder_meal_plan": {
"description": "Sets a meal plan on a Tuya feeder.",
"fields": {
"device_id": {
"description": "[%key:component::tuya::services::get_feeder_meal_plan::fields::device_id::description%]",
"name": "[%key:common::config_flow::data::device%]"
},
"meal_plan": {
"description": "The meal plan data to set.",
"name": "Meal plan"
}
},
"name": "Set feeder meal plan data"
      }
    }
  }


@@ -0,0 +1,17 @@
"""Provides conditions for updates."""
from homeassistant.const import STATE_OFF, STATE_ON
from homeassistant.core import HomeAssistant
from homeassistant.helpers.condition import Condition, make_entity_state_condition
from .const import DOMAIN
CONDITIONS: dict[str, type[Condition]] = {
"is_available": make_entity_state_condition(DOMAIN, STATE_ON),
"is_not_available": make_entity_state_condition(DOMAIN, STATE_OFF),
}
async def async_get_conditions(hass: HomeAssistant) -> dict[str, type[Condition]]:
"""Return the update conditions."""
return CONDITIONS


@@ -0,0 +1,17 @@
.condition_common: &condition_common
target:
entity:
domain: update
fields:
behavior:
required: true
default: any
selector:
select:
translation_key: condition_behavior
options:
- all
- any
is_available: *condition_common
is_not_available: *condition_common


@@ -1,4 +1,12 @@
{
  "conditions": {
    "is_available": {
      "condition": "mdi:package-up"
    },
    "is_not_available": {
      "condition": "mdi:package"
    }
  },
  "entity_component": {
    "_": {
      "default": "mdi:package-up",


@@ -1,7 +1,28 @@
{
  "common": {
    "condition_behavior_name": "Condition passes if",
    "trigger_behavior_name": "Trigger when"
  },
  "conditions": {
    "is_available": {
      "description": "Tests if one or more updates are available.",
      "fields": {
        "behavior": {
          "name": "[%key:component::update::common::condition_behavior_name%]"
        }
      },
      "name": "Update is available"
    },
    "is_not_available": {
      "description": "Tests if one or more updates are not available.",
      "fields": {
        "behavior": {
          "name": "[%key:component::update::common::condition_behavior_name%]"
        }
      },
      "name": "Update is not available"
    }
  },
  "device_automation": {
    "extra_fields": {
      "for": "[%key:common::device_automation::extra_fields::for%]"
@@ -59,6 +80,12 @@
      }
    },
    "selector": {
      "condition_behavior": {
        "options": {
          "all": "All",
          "any": "Any"
        }
      },
      "trigger_behavior": {
        "options": {
          "any": "Any",


@@ -12,5 +12,5 @@
  "integration_type": "hub",
  "iot_class": "cloud_polling",
  "loggers": ["PyViCare"],
-  "requirements": ["PyViCare==2.58.1"]
+  "requirements": ["PyViCare==2.59.0"]
}


@@ -315,7 +315,7 @@ class BaseZhaFlow(ConfigEntryBaseFlow):
            return await self.async_step_verify_radio()

-        # Pre-select the currently configured port
+        # Preselect the currently configured port
        default_port: vol.Undefined | str = vol.UNDEFINED
        if self._radio_mgr.device_path is not None:
@@ -345,7 +345,7 @@ class BaseZhaFlow(ConfigEntryBaseFlow):
            )
            return await self.async_step_manual_port_config()

-        # Pre-select the current radio type
+        # Preselect the current radio type
        default: vol.Undefined | str = vol.UNDEFINED
        if self._radio_mgr.radio_type is not None:


@@ -23,7 +23,7 @@
    "universal_silabs_flasher",
    "serialx"
  ],
-  "requirements": ["zha==1.1.2", "serialx==1.1.1"],
+  "requirements": ["zha==1.1.2", "serialx==1.2.2"],
  "usb": [
    {
      "description": "*2652*",


@@ -3,6 +3,7 @@
from typing import Any

import voluptuous as vol
from zwave_js_server.const import CommandClass

from homeassistant.helpers import config_validation as cv
@@ -18,6 +19,10 @@ BITMASK_SCHEMA = vol.All(
    lambda value: int(value, 16),
)

COMMAND_CLASS_SCHEMA = vol.All(
    vol.Coerce(int), vol.In([cc.value for cc in CommandClass])
)


def boolean(value: Any) -> bool:
    """Validate and coerce a boolean value."""


@@ -30,7 +30,7 @@ from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import config_validation as cv, entity_registry as er
from homeassistant.helpers.typing import ConfigType, TemplateVarsType

-from .config_validation import VALUE_SCHEMA
+from .config_validation import COMMAND_CLASS_SCHEMA, VALUE_SCHEMA
from .const import (
    ATTR_COMMAND_CLASS,
    ATTR_CONFIG_PARAMETER,
@@ -122,7 +122,7 @@ SET_LOCK_USERCODE_SCHEMA = cv.DEVICE_ACTION_BASE_SCHEMA.extend(
SET_VALUE_SCHEMA = cv.DEVICE_ACTION_BASE_SCHEMA.extend(
    {
        vol.Required(CONF_TYPE): SERVICE_SET_VALUE,
-        vol.Required(ATTR_COMMAND_CLASS): vol.In([cc.value for cc in CommandClass]),
+        vol.Required(ATTR_COMMAND_CLASS): COMMAND_CLASS_SCHEMA,
        vol.Required(ATTR_PROPERTY): vol.Any(int, str),
        vol.Optional(ATTR_PROPERTY_KEY): vol.Any(vol.Coerce(int), cv.string),
        vol.Optional(ATTR_ENDPOINT): vol.Coerce(int),
@@ -334,7 +334,7 @@ async def async_get_action_capabilities(
                {
                    vol.Required(ATTR_COMMAND_CLASS): vol.In(
                        {
-                            CommandClass(cc.id).value: cc.name
+                            str(CommandClass(cc.id).value): cc.name
                            for cc in sorted(
                                node.command_classes, key=lambda cc: cc.name
                            )


@@ -15,7 +15,7 @@ from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import condition, config_validation as cv
from homeassistant.helpers.typing import ConfigType, TemplateVarsType

-from .config_validation import VALUE_SCHEMA
+from .config_validation import COMMAND_CLASS_SCHEMA, VALUE_SCHEMA
from .const import (
    ATTR_COMMAND_CLASS,
    ATTR_ENDPOINT,
@@ -65,7 +65,7 @@ CONFIG_PARAMETER_CONDITION_SCHEMA = cv.DEVICE_CONDITION_BASE_SCHEMA.extend(
VALUE_CONDITION_SCHEMA = cv.DEVICE_CONDITION_BASE_SCHEMA.extend(
    {
        vol.Required(CONF_TYPE): VALUE_TYPE,
-        vol.Required(ATTR_COMMAND_CLASS): vol.In([cc.value for cc in CommandClass]),
+        vol.Required(ATTR_COMMAND_CLASS): COMMAND_CLASS_SCHEMA,
        vol.Required(ATTR_PROPERTY): vol.Any(vol.Coerce(int), cv.string),
        vol.Optional(ATTR_PROPERTY_KEY): vol.Any(vol.Coerce(int), cv.string),
        vol.Optional(ATTR_ENDPOINT): vol.Coerce(int),
@@ -221,7 +221,7 @@ async def async_get_condition_capabilities(
            {
                vol.Required(ATTR_COMMAND_CLASS): vol.In(
                    {
-                        CommandClass(cc.id).value: cc.name
+                        str(CommandClass(cc.id).value): cc.name
                        for cc in sorted(
                            node.command_classes, key=lambda cc: cc.name
                        )


@@ -31,7 +31,7 @@ from homeassistant.helpers import (
from homeassistant.helpers.trigger import TriggerActionType, TriggerInfo
from homeassistant.helpers.typing import ConfigType

-from .config_validation import VALUE_SCHEMA
+from .config_validation import COMMAND_CLASS_SCHEMA, VALUE_SCHEMA
from .const import (
    ATTR_COMMAND_CLASS,
    ATTR_DATA_TYPE,
@@ -91,7 +91,7 @@ NOTIFICATION_EVENT_CC_MAPPINGS = (
# Event based trigger schemas
BASE_EVENT_SCHEMA = DEVICE_TRIGGER_BASE_SCHEMA.extend(
    {
-        vol.Required(ATTR_COMMAND_CLASS): vol.In([cc.value for cc in CommandClass]),
+        vol.Required(ATTR_COMMAND_CLASS): COMMAND_CLASS_SCHEMA,
    }
)
@@ -162,7 +162,7 @@ NODE_STATUS_SCHEMA = BASE_STATE_SCHEMA.extend(
# zwave_js.value_updated based trigger schemas
BASE_VALUE_UPDATED_SCHEMA = DEVICE_TRIGGER_BASE_SCHEMA.extend(
    {
-        vol.Required(ATTR_COMMAND_CLASS): vol.In([cc.value for cc in CommandClass]),
+        vol.Required(ATTR_COMMAND_CLASS): COMMAND_CLASS_SCHEMA,
        vol.Required(ATTR_PROPERTY): vol.Any(int, str),
        vol.Optional(ATTR_PROPERTY_KEY): vol.Any(None, vol.Coerce(int), str),
        vol.Optional(ATTR_ENDPOINT, default=0): vol.Any(None, vol.Coerce(int)),
@@ -558,7 +558,7 @@ async def async_get_trigger_capabilities(
                {
                    vol.Required(ATTR_COMMAND_CLASS): vol.In(
                        {
-                            CommandClass(cc.id).value: cc.name
+                            str(CommandClass(cc.id).value): cc.name
                            for cc in sorted(
                                node.command_classes, key=lambda cc: cc.name
                            )


@@ -572,12 +572,12 @@ def get_value_state_schema(
            return vol.Coerce(bool)

        if value.configuration_value_type == ConfigurationValueType.ENUMERATED:
-            return vol.In({int(k): v for k, v in value.metadata.states.items()})
+            return vol.In({str(int(k)): v for k, v in value.metadata.states.items()})

        return None

    if value.metadata.states:
-        return vol.In({int(k): v for k, v in value.metadata.states.items()})
+        return vol.In({str(int(k)): v for k, v in value.metadata.states.items()})

    return vol.All(
        vol.Coerce(int),

@@ -51,8 +51,8 @@ ATTR_TO = "to"
_OPTIONS_SCHEMA_DICT = {
    vol.Optional(ATTR_DEVICE_ID): vol.All(cv.ensure_list, [cv.string]),
    vol.Optional(ATTR_ENTITY_ID): cv.entity_ids,
-    vol.Required(ATTR_COMMAND_CLASS): vol.In(
-        {cc.value: cc.name for cc in CommandClass}
+    vol.Required(ATTR_COMMAND_CLASS): vol.All(
+        vol.Coerce(int), vol.In({cc.value: cc.name for cc in CommandClass})
    ),
    vol.Required(ATTR_PROPERTY): vol.Any(vol.Coerce(int), cv.string),
    vol.Optional(ATTR_ENDPOINT): vol.Coerce(int),


@@ -316,11 +316,11 @@ class ConfigFlowResult(FlowResult[ConfigFlowContext, str], total=False):

class FlowType(StrEnum):
-    """Flow type."""
+    """Flow type supported in `next_flow` of ConfigFlowResult."""

    CONFIG_FLOW = "config_flow"
-    # Add other flow types here as needed in the future,
-    # if we want to support them in the `next_flow` parameter.
+    OPTIONS_FLOW = "options_flow"
+    CONFIG_SUBENTRIES_FLOW = "config_subentries_flow"


def _validate_item(*, disabled_by: ConfigEntryDisabler | Any | None = None) -> None:
@@ -1608,6 +1608,26 @@ class ConfigEntriesFlowManager(
            issue_id = f"config_entry_reauth_{flow.handler}_{entry_id}"
            ir.async_delete_issue(self.hass, HOMEASSISTANT_DOMAIN, issue_id)

    def _async_validate_next_flow(
        self,
        result: ConfigFlowResult,
    ) -> None:
        """Validate `next_flow` in result if provided."""
        if (next_flow := result.get("next_flow")) is None:
            return
        flow_type, flow_id = next_flow
        if flow_type not in FlowType:
            raise HomeAssistantError(f"Invalid flow type: {flow_type}")
        if flow_type == FlowType.CONFIG_FLOW:
            # Raises UnknownFlow if the flow does not exist.
            self.hass.config_entries.flow.async_get(flow_id)
        if flow_type == FlowType.OPTIONS_FLOW:
            # Raises UnknownFlow if the flow does not exist.
            self.hass.config_entries.options.async_get(flow_id)
        if flow_type == FlowType.CONFIG_SUBENTRIES_FLOW:
            # Raises UnknownFlow if the flow does not exist.
            self.hass.config_entries.subentries.async_get(flow_id)

    async def async_finish_flow(
        self,
        flow: data_entry_flow.FlowHandler[ConfigFlowContext, ConfigFlowResult],
@@ -1656,6 +1676,8 @@ class ConfigEntriesFlowManager(
                self.config_entries.async_update_entry(
                    entry, discovery_keys=new_discovery_keys
                )
            self._async_validate_next_flow(result)
            return result

        # Mark the step as done.
@@ -1770,6 +1792,10 @@ class ConfigEntriesFlowManager(
            self.config_entries._async_clean_up(existing_entry)  # noqa: SLF001

        result["result"] = entry
        if not existing_entry:
            result = await flow.async_on_create_entry(result)
        self._async_validate_next_flow(result)
        return result

    async def async_create_flow(
@@ -3291,7 +3317,10 @@ class ConfigFlow(ConfigEntryBaseFlow):
            return
        flow_type, flow_id = next_flow
        if flow_type != FlowType.CONFIG_FLOW:
-            raise HomeAssistantError("Invalid next_flow type")
+            raise HomeAssistantError(
+                "next_flow only supports FlowType.CONFIG_FLOW; "
+                "use async_on_create_entry for options or subentry flows"
+            )
        # Raises UnknownFlow if the flow does not exist.
        self.hass.config_entries.flow.async_get(flow_id)
        result["next_flow"] = next_flow
@@ -3312,6 +3341,15 @@ class ConfigFlow(ConfigEntryBaseFlow):
        self._async_set_next_flow_if_valid(result, next_flow)
        return result

    async def async_on_create_entry(self, result: ConfigFlowResult) -> ConfigFlowResult:
        """Run after a config flow has created a config entry.

        Can be overridden by integrations to add additional data to the result,
        for example to start a follow-up flow that needs the config entry to
        exist before it can begin.
        """
        return result

    @callback
    def async_create_entry(  # type: ignore[override]
        self,


@@ -544,8 +544,9 @@ class HomeAssistant:
    ) -> None:
        """Add a job to be executed by the event loop or by an executor.

-        If the job is either a coroutine or decorated with @callback, it will be
-        run by the event loop, if not it will be run by an executor.
+        If the job is a coroutine, coroutine function, or decorated with
+        @callback, it will be run by the event loop; if not, it will be run
+        by an executor.

        target: target to call.
        args: parameters for method to call.
@@ -557,6 +558,14 @@ class HomeAssistant:
                functools.partial(self.async_create_task, target, eager_start=True)
            )
            return
        # For @callback targets, schedule directly via call_soon_threadsafe
        # to avoid the extra deferral through _async_add_hass_job + call_soon.
        # Check iscoroutinefunction to gracefully handle incorrectly labeled
        # @callback functions.
        if is_callback_check_partial(target) and not inspect.iscoroutinefunction(
            target
        ):
            self.loop.call_soon_threadsafe(target, *args)
            return
        self.loop.call_soon_threadsafe(
            functools.partial(self._async_add_hass_job, HassJob(target), *args)
        )
@@ -598,8 +607,9 @@ class HomeAssistant:
    ) -> asyncio.Future[_R] | None:
        """Add a job to be executed by the event loop or by an executor.

-        If the job is either a coroutine or decorated with @callback, it will be
-        run by the event loop, if not it will be run by an executor.
+        If the job is a coroutine, coroutine function, or decorated with
+        @callback, it will be run by the event loop; if not, it will be run
+        by an executor.

        This method must be run in the event loop.
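The fast path above hands a plain `@callback`-style target straight to `loop.call_soon_threadsafe`, skipping the `HassJob` wrapper and the second `call_soon` hop. A minimal asyncio sketch of that thread-to-loop handoff (no Home Assistant code involved; names are illustrative):

```python
import asyncio
import threading

results: list[str] = []


def callback_style_job(tag: str) -> None:
    """Stand-in for an @callback target: cheap and safe to run on the loop."""
    results.append(tag)


async def main() -> None:
    loop = asyncio.get_running_loop()

    def worker() -> None:
        # The diff's fast path: a non-coroutine callback is handed directly
        # to call_soon_threadsafe, with no extra deferral step in between.
        loop.call_soon_threadsafe(callback_style_job, "fast-path")

    thread = threading.Thread(target=worker)
    thread.start()
    thread.join()
    await asyncio.sleep(0)  # let the loop run the scheduled callback


asyncio.run(main())
```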


@@ -349,6 +349,9 @@ class EntityTriggerBase(Trigger):
    """Trigger for entity state changes."""

    _domain_specs: Mapping[str, DomainSpec]
    _excluded_states: Final[frozenset[str]] = frozenset(
        {STATE_UNAVAILABLE, STATE_UNKNOWN}
    )
    _schema: vol.Schema = ENTITY_STATE_TRIGGER_SCHEMA_FIRST_LAST

    @override
@@ -392,6 +395,7 @@ class EntityTriggerBase(Trigger):
            self.is_valid_state(state)
            for entity_id in entity_ids
            if (state := self._hass.states.get(entity_id)) is not None
            and state.state not in self._excluded_states
        )

    def check_one_match(self, entity_ids: set[str]) -> bool:
@@ -401,6 +405,7 @@ class EntityTriggerBase(Trigger):
                self.is_valid_state(state)
                for entity_id in entity_ids
                if (state := self._hass.states.get(entity_id)) is not None
                and state.state not in self._excluded_states
            )
            == 1
        )
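The new `_excluded_states` filter drops `unavailable`/`unknown` entities before counting matches, so a flaky entity can no longer satisfy or break an all-entities check. A small sketch of the same filtering over a plain state mapping (the `check_all_match` helper here is hypothetical):

```python
# Sketch of the filtering added to EntityTriggerBase above; the STATE_*
# values mirror Home Assistant's constants.
STATE_UNAVAILABLE = "unavailable"
STATE_UNKNOWN = "unknown"
EXCLUDED_STATES = frozenset({STATE_UNAVAILABLE, STATE_UNKNOWN})


def check_all_match(states: dict[str, str], wanted: str) -> bool:
    """Return True when every usable entity is in the wanted state."""
    return all(
        state == wanted
        for state in states.values()
        if state not in EXCLUDED_STATES  # skip unavailable/unknown entities
    )
```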


@@ -15,7 +15,7 @@ astral==2.2
async-interrupt==1.2.2
async-upnp-client==0.46.2
atomicwrites-homeassistant==1.4.1
-attrs==25.4.0
+attrs==26.1.0
audioop-lts==0.2.1
av==16.0.1
awesomeversion==25.8.0
@@ -47,10 +47,10 @@ Jinja2==3.1.6
lru-dict==1.3.0
mutagen==1.47.0
openai==2.21.0
-orjson==3.11.7
+orjson==3.11.8
packaging>=23.1
paho-mqtt==2.1.0
-Pillow==12.1.1
+Pillow==12.2.0
propcache==0.4.1
psutil-home-assistant==0.0.1
PyJWT==2.10.1
@@ -64,14 +64,14 @@ PyTurboJPEG==1.8.0
PyYAML==6.0.3
requests==2.33.1
securetar==2026.4.1
-SQLAlchemy==2.0.41
+SQLAlchemy==2.0.49
standard-aifc==3.13.0
standard-telnetlib==3.13.0
typing-extensions>=4.15.0,<5.0
ulid-transform==2.2.0
urllib3>=2.0
uv==0.11.1
-voluptuous-openapi==0.2.0
+voluptuous-openapi==0.3.0
voluptuous-serialize==2.7.0
voluptuous==0.15.2
webrtc-models==0.3.0


@@ -36,7 +36,7 @@ dependencies = [
    "annotatedyaml==1.0.2",
    "astral==2.2",
    "async-interrupt==1.2.2",
-    "attrs==25.4.0",
+    "attrs==26.1.0",
    "atomicwrites-homeassistant==1.4.1",
    "audioop-lts==0.2.1",
    "awesomeversion==25.8.0",
@@ -58,17 +58,17 @@ dependencies = [
    "PyJWT==2.10.1",
    # PyJWT has loose dependency. We want the latest one.
    "cryptography==46.0.7",
-    "Pillow==12.1.1",
+    "Pillow==12.2.0",
    "propcache==0.4.1",
    "pyOpenSSL==26.0.0",
-    "orjson==3.11.7",
+    "orjson==3.11.8",
    "packaging>=23.1",
    "psutil-home-assistant==0.0.1",
    "python-slugify==8.0.4",
    "PyYAML==6.0.3",
    "requests==2.33.1",
    "securetar==2026.4.1",
-    "SQLAlchemy==2.0.41",
+    "SQLAlchemy==2.0.49",
    "standard-aifc==3.13.0",
    "standard-telnetlib==3.13.0",
    "typing-extensions>=4.15.0,<5.0",
@@ -77,7 +77,7 @@ dependencies = [
    "uv==0.11.1",
    "voluptuous==0.15.2",
    "voluptuous-serialize==2.7.0",
-    "voluptuous-openapi==0.2.0",
+    "voluptuous-openapi==0.3.0",
    "yarl==1.23.0",
    "webrtc-models==0.3.0",
    "zeroconf==0.148.0",
@@ -415,7 +415,7 @@ per-file-ignores = [
    # redefined-outer-name: Tests reference fixtures in the test function
    # use-implicit-booleaness-not-comparison: Tests need to validate that a list
    # or a dict is returned
-    "/tests/:redefined-outer-name,use-implicit-booleaness-not-comparison",
+    "tests/**:redefined-outer-name,use-implicit-booleaness-not-comparison",
]

[tool.pylint.REPORTS]

requirements.txt generated

@@ -14,7 +14,7 @@ annotatedyaml==1.0.2
 astral==2.2
 async-interrupt==1.2.2
 atomicwrites-homeassistant==1.4.1
-attrs==25.4.0
+attrs==26.1.0
 audioop-lts==0.2.1
 awesomeversion==25.8.0
 bcrypt==5.0.0
@@ -34,9 +34,9 @@ infrared-protocols==1.1.0
 Jinja2==3.1.6
 lru-dict==1.3.0
 mutagen==1.47.0
-orjson==3.11.7
+orjson==3.11.8
 packaging>=23.1
-Pillow==12.1.1
+Pillow==12.2.0
 propcache==0.4.1
 psutil-home-assistant==0.0.1
 PyJWT==2.10.1
@@ -48,14 +48,14 @@ PyTurboJPEG==1.8.0
 PyYAML==6.0.3
 requests==2.33.1
 securetar==2026.4.1
-SQLAlchemy==2.0.41
+SQLAlchemy==2.0.49
 standard-aifc==3.13.0
 standard-telnetlib==3.13.0
 typing-extensions>=4.15.0,<5.0
 ulid-transform==2.2.0
 urllib3>=2.0
 uv==0.11.1
-voluptuous-openapi==0.2.0
+voluptuous-openapi==0.3.0
 voluptuous-serialize==2.7.0
 voluptuous==0.15.2
 webrtc-models==0.3.0

requirements_all.txt (generated)

@@ -38,7 +38,7 @@ PSNAWP==3.0.3
 # homeassistant.components.qrcode
 # homeassistant.components.seven_segments
 # homeassistant.components.sighthound
-Pillow==12.1.1
+Pillow==12.2.0

 # homeassistant.components.plex
 PlexAPI==4.15.16
@@ -99,7 +99,7 @@ PyTransportNSW==0.1.1
 PyTurboJPEG==1.8.0

 # homeassistant.components.vicare
-PyViCare==2.58.1
+PyViCare==2.59.0

 # homeassistant.components.xiaomi_aqara
 PyXiaomiGateway==0.14.3
@@ -115,7 +115,7 @@ RtmAPI==0.7.2
 # homeassistant.components.recorder
 # homeassistant.components.sql
-SQLAlchemy==2.0.41
+SQLAlchemy==2.0.49

 # homeassistant.components.tami4
 Tami4EdgeAPI==3.0
@@ -392,7 +392,7 @@ aioridwell==2025.09.0
 aioruckus==0.42

 # homeassistant.components.russound_rio
-aiorussound==5.0.0
+aiorussound==5.0.1

 # homeassistant.components.ruuvi_gateway
 aioruuvigateway==0.1.0
@@ -1038,7 +1038,7 @@ gTTS==2.5.3
 # homeassistant.components.gardena_bluetooth
 # homeassistant.components.husqvarna_automower_ble
-gardena-bluetooth==2.3.0
+gardena-bluetooth==2.4.0

 # homeassistant.components.google_assistant_sdk
 gassist-text==0.0.14
@@ -1928,7 +1928,7 @@ pyRFXtrx==0.31.1
 pySDCP==1

 # homeassistant.components.tibber
-pyTibber==0.37.0
+pyTibber==0.37.1

 # homeassistant.components.dlink
 pyW215==0.8.0
@@ -2930,7 +2930,7 @@ sentry-sdk==2.48.0
 # homeassistant.components.homeassistant_hardware
 # homeassistant.components.zha
-serialx==1.1.1
+serialx==1.2.2

 # homeassistant.components.sfr_box
 sfrbox-api==0.1.1

requirements_test.txt

@@ -8,8 +8,8 @@
 -c homeassistant/package_constraints.txt
 -r requirements_test_pre_commit.txt
 astroid==4.0.4
-coverage==7.10.6
-freezegun==1.5.2
+coverage==7.13.5
+freezegun==1.5.5
 # librt is an internal mypy dependency
 librt==0.8.1
 license-expression==30.4.3
@@ -18,23 +18,23 @@ mypy==1.20.1
 prek==0.2.28
 pydantic==2.13.0
 pylint==4.0.5
-pylint-per-file-ignores==1.4.0
+pylint-per-file-ignores==3.2.1
 pipdeptree==2.26.1
 pytest-asyncio==1.3.0
 pytest-aiohttp==1.1.0
-pytest-cov==7.0.0
+pytest-cov==7.1.0
 pytest-freezer==0.4.9
-pytest-github-actions-annotate-failures==0.3.0
+pytest-github-actions-annotate-failures==0.4.0
 pytest-socket==0.7.0
-pytest-sugar==1.0.0
+pytest-sugar==1.1.1
 pytest-timeout==2.4.0
 pytest-unordered==0.7.0
 pytest-picked==0.5.1
 pytest-xdist==3.8.0
 pytest==9.0.3
 requests-mock==1.12.1
-respx==0.22.0
-syrupy==5.0.0
+respx==0.23.1
+syrupy==5.1.0
 tqdm==4.67.1
 types-aiofiles==24.1.0.20250822
 types-atomicwrites==1.4.5.1

Some files were not shown because too many files have changed in this diff.