Compare commits

..

90 Commits

Author SHA1 Message Date
Claude
f119b6dd2e Implement conditional area toggle behavior
When toggling an area, instead of forwarding individual toggle commands
to each entity, check if any entity is currently on. If any is on, turn
all entities off; otherwise turn all entities on.

This provides a more intuitive toggle behavior for areas, similar to how
light switches work in rooms with multiple lights.
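
A minimal, illustrative Python sketch of the behavior described above, using a plain dict of entity states and a hypothetical resolve_toggle helper (this is not the actual Home Assistant service handler; the real change appears in the homeassistant component diff further down this page):

# Illustrative sketch only: decide which service an area "toggle" should forward.
from collections.abc import Iterable, Mapping

STATE_ON = "on"

def resolve_toggle(states: Mapping[str, str], entity_ids: Iterable[str]) -> str:
    """Return "turn_off" if any referenced entity is currently on, else "turn_on"."""
    any_on = any(states.get(entity_id) == STATE_ON for entity_id in entity_ids)
    return "turn_off" if any_on else "turn_on"

# Example: one light in the area is on, so toggling the area turns everything off.
states = {"light.desk": "on", "light.shelf": "off", "switch.fan": "off"}
assert resolve_toggle(states, states) == "turn_off"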

https://claude.ai/code/session_014GHmTAHzUNCVQz6jbtfRSm
2026-01-31 15:22:47 +00:00
Shay Levy
eafeba792d Fix Shelly CoIoT repair issue (#161973)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-01-31 16:33:31 +02:00
Norbert Rittel
c9318b6fbf Clarify action description for input_button helper (#161963) 2026-01-31 15:16:36 +01:00
epenet
99be382abf Remove outdated device registry cleanup in generic_hygrostat (#161859) 2026-01-31 15:15:19 +01:00
epenet
7cfcfca210 Remove outdated device registry cleanup in generic_thermostat (#161861) 2026-01-31 15:14:57 +01:00
epenet
f29daccb19 Remove outdated device registry cleanup in history_stats (#161862) 2026-01-31 15:14:42 +01:00
epenet
be869fce6c Remove outdated device registry cleanup in mold_indicator (#161864) 2026-01-31 15:14:26 +01:00
epenet
7bb0414a39 Remove outdated device registry cleanup in statistics (#161865) 2026-01-31 15:14:09 +01:00
epenet
3f8807d063 Remove outdated device registry cleanup in threshold (#161866) 2026-01-31 15:13:54 +01:00
mettolen
67642e6246 Add reauthentication flow to Liebherr integration (#161902)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-01-31 15:12:52 +01:00
mvn23
0d215597f3 Fix OpenTherm Gateway button availability (#161933) 2026-01-31 15:06:21 +01:00
mvn23
f41bd2b582 Bump pyotgw to 2.2.3 (#161928) 2026-01-31 15:03:56 +01:00
Norbert Rittel
5c9ec1911b Clarify action descriptions for input_boolean (#161924) 2026-01-31 15:03:08 +01:00
J. Diego Rodríguez Royo
1a0b7fe984 Restore the Home Connect program option entities (#156401)
Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2026-01-31 12:32:18 +01:00
Erwin Douna
26ee25d7bb Pattern fix for Proxmox config flow (#161946) 2026-01-31 11:41:41 +01:00
Norbert Rittel
aabf52d3cf Rename "service" to "action", use common state for "High" (#161940) 2026-01-31 11:40:55 +01:00
Erwin Douna
99fcb46a7e Add parallel updates to Portainer (#161947) 2026-01-31 11:40:25 +01:00
Raphael Hehl
6580c5e5bf Bump uiprotect to version 10.1.0 (#161967)
Co-authored-by: RaHehl <rahehl@users.noreply.github.com>
2026-01-31 11:39:20 +01:00
tronikos
63e7d4dc08 Bump opower to 0.17.0 (#161962) 2026-01-31 11:38:43 +01:00
Sid
cc6900d846 Bump eheimdigital to 1.6.0 (#161961) 2026-01-31 11:38:14 +01:00
Brett Adams
ca2ad22884 Rename drive inverter unavailable state in Teslemetry (#161960)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-31 11:36:12 +01:00
Armin Ghofrani
40944f0f2d Enable prompt caching for Anthropic conversation integration (#158957)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-30 23:32:47 +03:00
uptimeZERO_
91a3e488b1 Bump media source upload limit from 10 MB to 20 MB (#161436) 2026-01-30 13:07:37 +01:00
Magnus Øverli
9a1f517e6e Convert flexit_bacnet fireplace mode to climate preset- Rename 'Boost… (#155760)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-01-30 12:59:10 +01:00
Simone Chemelli
c82c614bb9 Handle hostname resolution for Shelly repair issue (#161914) 2026-01-30 12:26:48 +01:00
Norbert Rittel
20914dce67 Improve action descriptions of camera (#161876) 2026-01-30 12:08:49 +01:00
Paul Bottein
5fc407d2f3 Update frontend to 20260128.3 (#161918) 2026-01-30 11:51:53 +01:00
Marc Mueller
c7444d38a1 Remove pydantic v1 mypy plugin (#161901) 2026-01-30 11:19:06 +01:00
puddly
81f6136bda Bump ZHA to 0.0.88 (#161904) 2026-01-30 11:18:38 +01:00
Steve Easley
862d0ea49e Bump JVC Projector dependency to 2.0.1 (#161898) 2026-01-30 11:17:14 +01:00
hanwg
f2fdfed241 Update translations for Telegram bot (#161903) 2026-01-30 11:13:46 +01:00
David Recordon
15640049cb Fix Control4 HVAC state-to-action mapping (#161916) 2026-01-30 10:59:39 +01:00
dependabot[bot]
5c163434f8 Bump actions/cache from 5.0.2 to 5.0.3 (#161906)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-30 10:47:02 +01:00
Sebastiaan Speck
e54c2ea55e Ensure Renault buttons are supported by the vehicle (#161893)
Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>
2026-01-30 09:58:50 +01:00
Kevin Stillhammer
1ec42693ab Bump fressnapftracker to 0.2.2 (#161913) 2026-01-30 09:32:13 +01:00
epenet
672864ae4f Remove outdated device registry cleanup in trend (#161867) 2026-01-30 08:07:53 +01:00
Artur Pragacz
e54d7e42cb Add subscription pattern for conversation intents (#158456) 2026-01-30 07:19:57 +01:00
Jan Bouwhuis
5d63fce015 Re-add Claude code to devcontainer via native install script (#161807) 2026-01-29 23:35:59 -05:00
Paul Bottein
190fe10eed Allow lovelace path for dashboard in yaml and fix yaml dashboard migration (#161816) 2026-01-29 17:19:37 -05:00
Bram Kragten
ef410c1e2a Update frontend to 20260128.2 (#161881) 2026-01-29 23:02:59 +01:00
Artur Pragacz
5a712398e7 Fix validation of actions config in intent_script (#158266) 2026-01-29 22:12:46 +01:00
Thomas55555
b1be3fe0da Introduce common string for data description of verify_ssl (#160703) 2026-01-29 20:27:37 +00:00
Brett Adams
97a7ab011b Add quality scale to Teslemetry (#159589) 2026-01-29 20:23:09 +00:00
SamareshSingh
694a3050b9 Add device_class inheritance to min_max sensor (#157602)
Signed-off-by: Samaresh Sahoo <ssamaresh01@gmail.com>
Co-authored-by: Samaresh Kumar Singh <ssam18@users.noreply.github.com>
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-01-29 21:15:41 +01:00
Erwin Douna
8164e65188 Fix small typo in Portainer strings (#161889) 2026-01-29 20:58:07 +01:00
Marc Mueller
9af0d1eed4 Update fritzconnection to 1.15.1 (#161887) 2026-01-29 20:57:52 +01:00
Jan Bouwhuis
72e6ca55ba Fix use of ambiguous units for reactive power and energy (#161810) 2026-01-29 20:34:09 +01:00
Jeremiah Paige
0fb62a7e97 Add wsdot code-owner (#160807) 2026-01-29 19:52:41 +01:00
Erwin Douna
930eb70a8b Add prune images service to Portainer (#161009)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-01-29 19:39:17 +01:00
Norbert Rittel
462104fa68 Clarify action descriptions for input numbers (#161847) 2026-01-29 18:43:26 +01:00
mettolen
d0c77d8a7e Delete unused Liebherr snapshot (#161879) 2026-01-29 17:38:56 +01:00
Björn Dalfors
606780b20f Bump nibe to 2.22.0 (#161873) 2026-01-29 17:06:38 +01:00
Tucker Kern
8f465cf2ca Remove deprecated Snapcast group entities and custom grouping services (#160945) 2026-01-29 16:44:50 +01:00
epenet
4e29476dd9 Cleanup deprecated YAML import from datadog (#161870) 2026-01-29 15:33:14 +01:00
epenet
b4328083be Fix incorrect entity_description class in radarr (#161856) 2026-01-29 15:09:06 +01:00
epenet
72ba59f559 Remove outdated device registry cleanup in utility_meter (#161868) 2026-01-29 15:01:41 +01:00
epenet
826168b601 Remove outdated device registry cleanup in integration (#161863) 2026-01-29 15:01:22 +01:00
Sebastiaan Speck
66f181992c Bump renault-api to 0.5.3 (#161857) 2026-01-29 14:02:22 +01:00
epenet
336ef4c37b Remove outdated device registry cleanup in derivative (#161858) 2026-01-29 13:55:49 +01:00
mettolen
72e7bf7f9c Add new Liebherr integration (#161197)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-01-29 13:49:09 +01:00
Gage Benne
acbdbc9be7 Bump pydexcom to 0.5.1 (#161549) 2026-01-29 12:47:05 +01:00
Steve Easley
3551382f8d Add additional JVC Projector entities (#161134) 2026-01-29 12:45:19 +01:00
Mattia Monga
95014d7e6d Make viaggiatreno work by fixing some critical bugs (#160093)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-01-29 12:41:47 +01:00
Retha Runolfsson
dfe1990484 Add service for switchbot keypad vision (#160659)
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-01-29 12:23:38 +01:00
epenet
15ff5d0f74 Modernize tasmota light tests (#161830) 2026-01-29 12:05:03 +01:00
epenet
1407f61a9c Modernize abode light tests (#161829) 2026-01-29 12:01:32 +01:00
epenet
6107b794d6 Modernize hue light tests (#161828) 2026-01-29 12:01:07 +01:00
epenet
7ab8ceab7e Modernize zha light tests (#161826) 2026-01-29 12:00:52 +01:00
epenet
a4db6a9ebc Modernize template light tests (#161833) 2026-01-29 11:59:55 +01:00
Colin
12a2650b6b Add quality scale to openevse (#161651) 2026-01-29 11:55:54 +01:00
Markus Jacobsen
23da7ecedd Bump mozart_api to 5.3.1.108.2 (#161846) 2026-01-29 11:54:11 +01:00
wollew
8d9e7b0b26 Do not use base class of pyvlx in velux light platform (#161837) 2026-01-29 11:52:22 +01:00
epenet
9664047345 Modernize homekit_controller light tests (#161844) 2026-01-29 11:51:59 +01:00
epenet
804fbf9cef Modernize govee_light_local light tests (#161845) 2026-01-29 11:51:22 +01:00
epenet
e10fe074c9 Cleanup deprecated color_temp support in lifx (#161848) 2026-01-29 11:50:53 +01:00
Norbert Rittel
7b0e21da74 Fix action descriptions of alarm_control_panel (#161852) 2026-01-29 11:50:22 +01:00
epenet
29e142cf1e Modernize matter light tests (#161850) 2026-01-29 11:49:51 +01:00
epenet
6b765ebabb Modernize tradfri light tests (#161849) 2026-01-29 11:49:18 +01:00
epenet
899aa62697 Modernize knx light tests (#161851) 2026-01-29 11:42:18 +01:00
dependabot[bot]
a11efba405 Bump docker/login-action from 3.6.0 to 3.7.0 (#161825) 2026-01-29 07:43:41 +01:00
Manu
78280dfc5a Fix string in Namecheap DynamicDNS integration (#161821) 2026-01-29 03:10:09 +01:00
Glenn de Haan
4220bab08a Improve quality scale to gold HDFury integration (#161800) 2026-01-29 00:25:00 +01:00
Marc Mueller
f7dcf8de15 Switch back to mypy 1.19.1 (#161817) 2026-01-29 00:12:46 +01:00
Aaron Godfrey
7e32b50fee Update todoist-api-python to 3.1.0 (#161811)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-01-29 00:00:53 +01:00
Robert Resch
c875b75272 Use Python 3.14 as default one (#161426)
Co-authored-by: Marc Mueller <30130371+cdce8p@users.noreply.github.com>
Co-authored-by: Franck Nijhof <git@frenck.dev>
2026-01-28 23:48:27 +01:00
John Hillery
7368b9ca1d Add sensor for energy remaining to tessie integration (#161796) 2026-01-28 23:41:29 +01:00
Michael Jones
493e8c1a22 Append ID to flood monitoring station name in EAFM (#161794) 2026-01-28 22:18:35 +00:00
Michael Hansen
1b16b24550 Bump intents to 2026.1.28 (#161813) 2026-01-28 23:14:36 +01:00
Franck Nijhof
7637300632 Bump version to 2026.3.0dev0 (#161809) 2026-01-28 23:12:34 +01:00
victorigualada
bdbce57217 Use OpenAI schema dataclasses for cloud stream responses (#161663) 2026-01-28 20:59:03 +01:00
230 changed files with 4438 additions and 3167 deletions

View File

@@ -10,12 +10,12 @@ on:
env:
BUILD_TYPE: core
DEFAULT_PYTHON: "3.13"
DEFAULT_PYTHON: "3.14.2"
PIP_TIMEOUT: 60
UV_HTTP_TIMEOUT: 60
UV_SYSTEM_PYTHON: "true"
# Base image version from https://github.com/home-assistant/docker
BASE_IMAGE_VERSION: "2025.12.0"
BASE_IMAGE_VERSION: "2026.01.0"
ARCHITECTURES: '["amd64", "aarch64"]'
jobs:
@@ -184,7 +184,7 @@ jobs:
echo "${{ github.sha }};${{ github.ref }};${{ github.event_name }};${{ github.actor }}" > rootfs/OFFICIAL_IMAGE
- name: Login to GitHub Container Registry
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
@@ -287,7 +287,7 @@ jobs:
fi
- name: Login to GitHub Container Registry
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
@@ -358,13 +358,13 @@ jobs:
- name: Login to DockerHub
if: matrix.registry == 'docker.io/homeassistant'
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Login to GitHub Container Registry
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
@@ -522,7 +522,7 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Login to GitHub Container Registry
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}

View File

@@ -40,9 +40,9 @@ env:
CACHE_VERSION: 2
UV_CACHE_VERSION: 1
MYPY_CACHE_VERSION: 1
HA_SHORT_VERSION: "2026.2"
DEFAULT_PYTHON: "3.13.11"
ALL_PYTHON_VERSIONS: "['3.13.11', '3.14.2']"
HA_SHORT_VERSION: "2026.3"
DEFAULT_PYTHON: "3.14.2"
ALL_PYTHON_VERSIONS: "['3.14.2']"
# 10.3 is the oldest supported version
# - 10.3.32 is the version currently shipped with Synology (as of 17 Feb 2022)
# 10.6 is the current long-term-support
@@ -310,7 +310,7 @@ jobs:
env.HA_SHORT_VERSION }}-$(date -u '+%Y-%m-%dT%H:%M:%s')" >> $GITHUB_OUTPUT
- name: Restore base Python virtual environment
id: cache-venv
uses: &actions-cache actions/cache@8b402f58fbc84540c8b491a91e594a4576fec3d7 # v5.0.2
uses: &actions-cache actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
path: venv
key: &key-python-venv >-
@@ -374,7 +374,7 @@ jobs:
fi
- name: Save apt cache
if: steps.cache-apt-check.outputs.cache-hit != 'true'
uses: &actions-cache-save actions/cache/save@8b402f58fbc84540c8b491a91e594a4576fec3d7 # v5.0.2
uses: &actions-cache-save actions/cache/save@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
path: *path-apt-cache
key: *key-apt-cache
@@ -425,7 +425,7 @@ jobs:
steps:
- &cache-restore-apt
name: Restore apt cache
uses: &actions-cache-restore actions/cache/restore@8b402f58fbc84540c8b491a91e594a4576fec3d7 # v5.0.2
uses: &actions-cache-restore actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
path: *path-apt-cache
fail-on-cache-miss: true

View File

@@ -10,7 +10,7 @@ on:
- "**strings.json"
env:
DEFAULT_PYTHON: "3.13"
DEFAULT_PYTHON: "3.14.2"
jobs:
upload:

View File

@@ -17,7 +17,7 @@ on:
- "script/gen_requirements_all.py"
env:
DEFAULT_PYTHON: "3.13"
DEFAULT_PYTHON: "3.14.2"
concurrency:
group: ${{ github.workflow }}-${{ github.ref_name}}

View File

@@ -1 +1 @@
3.13
3.14

CODEOWNERS generated
View File

@@ -921,6 +921,8 @@ build.json @home-assistant/supervisor
/tests/components/libre_hardware_monitor/ @Sab44
/homeassistant/components/lidarr/ @tkdrob
/tests/components/lidarr/ @tkdrob
/homeassistant/components/liebherr/ @mettolen
/tests/components/liebherr/ @mettolen
/homeassistant/components/lifx/ @Djelibeybi
/tests/components/lifx/ @Djelibeybi
/homeassistant/components/light/ @home-assistant/core
@@ -1878,6 +1880,8 @@ build.json @home-assistant/supervisor
/tests/components/worldclock/ @fabaff
/homeassistant/components/ws66i/ @ssaenger
/tests/components/ws66i/ @ssaenger
/homeassistant/components/wsdot/ @ucodery
/tests/components/wsdot/ @ucodery
/homeassistant/components/wyoming/ @synesthesiam
/tests/components/wyoming/ @synesthesiam
/homeassistant/components/xbox/ @hunterjm @tr4nt0r

View File

@@ -52,6 +52,9 @@ RUN --mount=type=bind,source=requirements.txt,target=requirements.txt \
--mount=type=bind,source=requirements_test_pre_commit.txt,target=requirements_test_pre_commit.txt \
uv pip install -r requirements.txt -r requirements_test.txt
# Claude Code native install
RUN curl -fsSL https://claude.ai/install.sh | bash
WORKDIR /workspaces
# Set the default shell to bash instead of sh

View File

@@ -10,7 +10,6 @@
"preview_features": {
"snapshots": {
"feedback_url": "https://forms.gle/GqvRmgmghSDco8M46",
"learn_more_url": "https://www.home-assistant.io/blog/2026/02/02/about-device-database/",
"report_issue_url": "https://github.com/OHF-Device-Database/device-database/issues/new"
}
},

View File

@@ -600,6 +600,16 @@ class AnthropicBaseLLMEntity(Entity):
system = chat_log.content[0]
if not isinstance(system, conversation.SystemContent):
raise TypeError("First message must be a system message")
# System prompt with caching enabled
system_prompt: list[TextBlockParam] = [
TextBlockParam(
type="text",
text=system.content,
cache_control={"type": "ephemeral"},
)
]
messages = _convert_content(chat_log.content[1:])
model = options.get(CONF_CHAT_MODEL, DEFAULT[CONF_CHAT_MODEL])
@@ -608,7 +618,7 @@ class AnthropicBaseLLMEntity(Entity):
model=model,
messages=messages,
max_tokens=options.get(CONF_MAX_TOKENS, DEFAULT[CONF_MAX_TOKENS]),
system=system.content,
system=system_prompt,
stream=True,
)
@@ -695,10 +705,6 @@ class AnthropicBaseLLMEntity(Entity):
type="auto",
)
if isinstance(model_args["system"], str):
model_args["system"] = [
TextBlockParam(type="text", text=model_args["system"])
]
model_args["system"].append( # type: ignore[union-attr]
TextBlockParam(
type="text",

View File

@@ -540,17 +540,7 @@ class APCUPSdSensor(APCUPSdEntity, SensorEntity):
data = self.coordinator.data[key]
if self.entity_description.device_class == SensorDeviceClass.TIMESTAMP:
# The date could be "N/A" for certain fields (e.g., XOFFBATT), indicating there is no value yet.
if data == "N/A":
self._attr_native_value = None
return
try:
self._attr_native_value = dateutil.parser.parse(data)
except (dateutil.parser.ParserError, OverflowError):
# If parsing fails we should mark it as unknown, with a log for further debugging.
_LOGGER.warning('Failed to parse date for %s: "%s"', key, data)
self._attr_native_value = None
self._attr_native_value = dateutil.parser.parse(data)
return
self._attr_native_value, inferred_unit = infer_unit(data)

View File

@@ -6,6 +6,6 @@
"documentation": "https://www.home-assistant.io/integrations/bang_olufsen",
"integration_type": "device",
"iot_class": "local_push",
"requirements": ["mozart-api==5.3.1.108.0"],
"requirements": ["mozart-api==5.3.1.108.2"],
"zeroconf": ["_bangolufsen._tcp.local."]
}

View File

@@ -8,6 +8,7 @@ from datetime import timedelta
import json
import logging
from typing import TYPE_CHECKING, Any, cast
from uuid import UUID
from aiohttp import ClientConnectorError
from mozart_api import __version__ as MOZART_API_VERSION
@@ -735,7 +736,7 @@ class BeoMediaPlayer(BeoEntity, MediaPlayerEntity):
await self._client.set_active_source(source_id=key)
else:
# Video
await self._client.post_remote_trigger(id=key)
await self._client.post_remote_trigger(id=UUID(key))
async def async_select_sound_mode(self, sound_mode: str) -> None:
"""Select a sound mode."""
@@ -894,7 +895,7 @@ class BeoMediaPlayer(BeoEntity, MediaPlayerEntity):
translation_key="play_media_error",
translation_placeholders={
"media_type": media_type,
"error_message": json.loads(error.body)["message"],
"error_message": json.loads(cast(str, error.body))["message"],
},
) from error

View File

@@ -50,11 +50,11 @@
"selector": {},
"services": {
"disable_motion_detection": {
"description": "Disables the motion detection.",
"description": "Disables the motion detection of a camera.",
"name": "Disable motion detection"
},
"enable_motion_detection": {
"description": "Enables the motion detection.",
"description": "Enables the motion detection of a camera.",
"name": "Enable motion detection"
},
"play_stream": {
@@ -100,11 +100,11 @@
"name": "Take snapshot"
},
"turn_off": {
"description": "Turns off the camera.",
"description": "Turns off a camera.",
"name": "[%key:common::action::turn_off%]"
},
"turn_on": {
"description": "Turns on the camera.",
"description": "Turns on a camera.",
"name": "[%key:common::action::turn_on%]"
}
},

View File

@@ -335,20 +335,18 @@ def _get_config_intents(config: ConfigType, hass_config_path: str) -> dict[str,
"""Return config intents."""
intents = config.get(DOMAIN, {}).get("intents", {})
return {
"intents": {
intent_name: {
"data": [
{
"sentences": sentences,
"metadata": {
METADATA_CUSTOM_SENTENCE: True,
METADATA_CUSTOM_FILE: hass_config_path,
},
}
]
}
for intent_name, sentences in intents.items()
intent_name: {
"data": [
{
"sentences": sentences,
"metadata": {
METADATA_CUSTOM_SENTENCE: True,
METADATA_CUSTOM_FILE: hass_config_path,
},
}
]
}
for intent_name, sentences in intents.items()
}

View File

@@ -2,6 +2,7 @@
from __future__ import annotations
from collections.abc import Callable
import dataclasses
import logging
from typing import TYPE_CHECKING, Any
@@ -18,7 +19,7 @@ from homeassistant.core import (
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import config_validation as cv, intent, singleton
from .const import DATA_COMPONENT, HOME_ASSISTANT_AGENT
from .const import DATA_COMPONENT, HOME_ASSISTANT_AGENT, IntentSource
from .entity import ConversationEntity
from .models import (
AbstractConversationAgent,
@@ -34,9 +35,11 @@ from .trace import (
_LOGGER = logging.getLogger(__name__)
TRIGGER_INTENT_NAME_PREFIX = "HassSentenceTrigger"
if TYPE_CHECKING:
from .default_agent import DefaultAgent
from .trigger import TriggerDetails
from .trigger import TRIGGER_CALLBACK_TYPE
@singleton.singleton("conversation_agent")
@@ -139,6 +142,10 @@ async def async_converse(
return result
type IntentSourceConfig = dict[str, dict[str, Any]]
type IntentsCallback = Callable[[dict[IntentSource, IntentSourceConfig]], None]
class AgentManager:
"""Class to manage conversation agents."""
@@ -147,8 +154,13 @@ class AgentManager:
self.hass = hass
self._agents: dict[str, AbstractConversationAgent] = {}
self.default_agent: DefaultAgent | None = None
self.config_intents: dict[str, Any] = {}
self.triggers_details: list[TriggerDetails] = []
self._intents: dict[IntentSource, IntentSourceConfig] = {
IntentSource.CONFIG: {"intents": {}},
IntentSource.TRIGGER: {"intents": {}},
}
self._intents_subscribers: list[IntentsCallback] = []
self._trigger_callbacks: dict[int, TRIGGER_CALLBACK_TYPE] = {}
self._trigger_callback_counter: int = 0
@callback
def async_get_agent(self, agent_id: str) -> AbstractConversationAgent | None:
@@ -200,27 +212,75 @@ class AgentManager:
async def async_setup_default_agent(self, agent: DefaultAgent) -> None:
"""Set up the default agent."""
agent.update_config_intents(self.config_intents)
agent.update_triggers(self.triggers_details)
self.default_agent = agent
@callback
def subscribe_intents(self, subscriber: IntentsCallback) -> CALLBACK_TYPE:
"""Subscribe to intents updates.
The subscriber callback is called immediately with all intent sources
and whenever intents are updated (only with the changed source).
"""
subscriber(self._intents)
self._intents_subscribers.append(subscriber)
@callback
def unsubscribe() -> None:
"""Unsubscribe from intents updates."""
self._intents_subscribers.remove(subscriber)
return unsubscribe
def _notify_intents_subscribers(self, source: IntentSource) -> None:
"""Notify all intents subscribers of a change to a specific source."""
update = {source: self._intents[source]}
for subscriber in self._intents_subscribers:
subscriber(update)
def update_config_intents(self, intents: dict[str, Any]) -> None:
"""Update config intents."""
self.config_intents = intents
if self.default_agent is not None:
self.default_agent.update_config_intents(intents)
self._intents[IntentSource.CONFIG]["intents"] = intents
self._notify_intents_subscribers(IntentSource.CONFIG)
def register_trigger(self, trigger_details: TriggerDetails) -> CALLBACK_TYPE:
def register_trigger(
self, sentences: list[str], trigger_callback: TRIGGER_CALLBACK_TYPE
) -> CALLBACK_TYPE:
"""Register a trigger."""
self.triggers_details.append(trigger_details)
if self.default_agent is not None:
self.default_agent.update_triggers(self.triggers_details)
trigger_id = self._trigger_callback_counter
self._trigger_callback_counter += 1
trigger_intent_name = f"{TRIGGER_INTENT_NAME_PREFIX}{trigger_id}"
trigger_intents = self._intents[IntentSource.TRIGGER]
trigger_intents["intents"][trigger_intent_name] = {
"data": [{"sentences": sentences}]
}
self._trigger_callbacks[trigger_id] = trigger_callback
self._notify_intents_subscribers(IntentSource.TRIGGER)
@callback
def unregister_trigger() -> None:
"""Unregister the trigger."""
self.triggers_details.remove(trigger_details)
if self.default_agent is not None:
self.default_agent.update_triggers(self.triggers_details)
del trigger_intents["intents"][trigger_intent_name]
del self._trigger_callbacks[trigger_id]
self._notify_intents_subscribers(IntentSource.TRIGGER)
return unregister_trigger
@property
def trigger_sentences(self) -> list[str]:
"""Get all trigger sentences."""
sentences: list[str] = []
trigger_intents = self._intents[IntentSource.TRIGGER]
for trigger_intent in trigger_intents.get("intents", {}).values():
for data in trigger_intent.get("data", []):
sentences.extend(data.get("sentences", []))
return sentences
def get_trigger_callback(
self, trigger_intent_name: str
) -> TRIGGER_CALLBACK_TYPE | None:
"""Get the callback for a trigger from its intent name."""
if not trigger_intent_name.startswith(TRIGGER_INTENT_NAME_PREFIX):
return None
trigger_id = int(trigger_intent_name[len(TRIGGER_INTENT_NAME_PREFIX) :])
return self._trigger_callbacks.get(trigger_id)

View File

@@ -36,6 +36,13 @@ METADATA_CUSTOM_SENTENCE = "hass_custom_sentence"
METADATA_CUSTOM_FILE = "hass_custom_file"
class IntentSource(StrEnum):
"""Source of intents."""
CONFIG = "config"
TRIGGER = "trigger"
class ChatLogEventType(StrEnum):
"""Chat log event type."""

View File

@@ -76,18 +76,18 @@ from homeassistant.helpers.event import async_track_state_added_domain
from homeassistant.util import language as language_util
from homeassistant.util.json import JsonObjectType, json_loads_object
from .agent_manager import get_agent_manager
from .agent_manager import IntentSourceConfig, get_agent_manager
from .chat_log import AssistantContent, ChatLog, ToolResultContent
from .const import (
DOMAIN,
METADATA_CUSTOM_FILE,
METADATA_CUSTOM_SENTENCE,
ConversationEntityFeature,
IntentSource,
)
from .entity import ConversationEntity
from .models import ConversationInput, ConversationResult
from .trace import ConversationTraceEventType, async_conversation_trace_append
from .trigger import TriggerDetails
_LOGGER = logging.getLogger(__name__)
@@ -126,7 +126,7 @@ class SentenceTriggerResult:
sentence: str
sentence_template: str | None
matched_triggers: dict[int, RecognizeResult]
matched_triggers: dict[str, RecognizeResult]
class IntentMatchingStage(Enum):
@@ -236,15 +236,19 @@ class DefaultAgent(ConversationEntity):
def __init__(self, hass: HomeAssistant) -> None:
"""Initialize the default agent."""
self.hass = hass
self._lang_intents: dict[str, LanguageIntents | object] = {}
self._load_intents_lock = asyncio.Lock()
# Intents from common conversation config
self._config_intents: dict[str, Any] = {}
self._config_intents_config: IntentSourceConfig = {}
# Sentences that will trigger a callback (skipping intent recognition)
self._triggers_details: list[TriggerDetails] = []
# Intents from conversation triggers
self._trigger_intents: Intents | None = None
self._trigger_intents_config: IntentSourceConfig = {}
# Subscription to intents updates
self._unsub_intents: Callable[[], None] | None = None
# Slot lists for entities, areas, etc.
self._slot_lists: dict[str, SlotList] | None = None
@@ -261,6 +265,33 @@ class DefaultAgent(ConversationEntity):
self.fuzzy_matching = True
self._fuzzy_config: FuzzyConfig | None = None
async def async_added_to_hass(self) -> None:
"""Subscribe to intents updates when added to hass."""
self._unsub_intents = get_agent_manager(self.hass).subscribe_intents(
self._update_intents
)
async def async_will_remove_from_hass(self) -> None:
"""Unsubscribe from intents updates when removed from hass."""
if self._unsub_intents is not None:
self._unsub_intents()
self._unsub_intents = None
@callback
def _update_intents(
self, intents_update: dict[IntentSource, IntentSourceConfig]
) -> None:
"""Handle intents update from agent_manager subscription."""
if IntentSource.CONFIG in intents_update:
self._config_intents_config = intents_update[IntentSource.CONFIG]
# Intents have changed, so we must clear the cache
self._intent_cache.clear()
if IntentSource.TRIGGER in intents_update:
self._trigger_intents_config = intents_update[IntentSource.TRIGGER]
# Force rebuild on next use
self._trigger_intents = None
@property
def supported_languages(self) -> list[str]:
"""Return a list of supported languages."""
@@ -1059,14 +1090,6 @@ class DefaultAgent(ConversationEntity):
# Intents have changed, so we must clear the cache
self._intent_cache.clear()
@callback
def update_config_intents(self, intents: dict[str, Any]) -> None:
"""Update config intents."""
self._config_intents = intents
# Intents have changed, so we must clear the cache
self._intent_cache.clear()
async def async_prepare(self, language: str | None = None) -> None:
"""Load intents for a language."""
if language is None:
@@ -1193,7 +1216,7 @@ class DefaultAgent(ConversationEntity):
merge_dict(
intents_dict,
self._config_intents,
self._config_intents_config,
)
if not intents_dict:
@@ -1461,27 +1484,12 @@ class DefaultAgent(ConversationEntity):
return response_template.async_render(response_args)
@callback
def update_triggers(self, triggers_details: list[TriggerDetails]) -> None:
"""Update triggers."""
self._triggers_details = triggers_details
# Force rebuild on next use
self._trigger_intents = None
def _rebuild_trigger_intents(self) -> None:
"""Rebuild the HassIL intents object from the current trigger sentences."""
"""Rebuild the HassIL intents object from the trigger intents dict."""
intents_dict = {
"language": self.hass.config.language,
"intents": {
# Use trigger data index as a virtual intent name for HassIL.
# This works because the intents are rebuilt on every
# register/unregister.
str(trigger_id): {"data": [{"sentences": trigger_details.sentences}]}
for trigger_id, trigger_details in enumerate(self._triggers_details)
},
**self._trigger_intents_config,
}
trigger_intents = Intents.from_dict(intents_dict)
# Assume slot list references are wildcards
@@ -1496,7 +1504,7 @@ class DefaultAgent(ConversationEntity):
self._trigger_intents = trigger_intents
_LOGGER.debug("Rebuilt trigger intents: %s", intents_dict)
_LOGGER.debug("Rebuilt trigger intents: %s", self._trigger_intents_config)
async def async_recognize_sentence_trigger(
self, user_input: ConversationInput
@@ -1506,7 +1514,7 @@ class DefaultAgent(ConversationEntity):
Calls the registered callbacks if there's a match and returns a sentence
trigger result.
"""
if not self._triggers_details:
if not self._trigger_intents_config.get("intents"):
# No triggers registered
return None
@@ -1516,18 +1524,18 @@ class DefaultAgent(ConversationEntity):
assert self._trigger_intents is not None
matched_triggers: dict[int, RecognizeResult] = {}
matched_triggers: dict[str, RecognizeResult] = {}
matched_template: str | None = None
for result in recognize_all(user_input.text, self._trigger_intents):
if result.intent_sentence is not None:
matched_template = result.intent_sentence.text
trigger_id = int(result.intent.name)
if trigger_id in matched_triggers:
trigger_intent_name = result.intent.name
if trigger_intent_name in matched_triggers:
# Already matched a sentence from this trigger
break
matched_triggers[trigger_id] = result
matched_triggers[trigger_intent_name] = result
if not matched_triggers:
# Sentence did not match any trigger sentences
@@ -1551,10 +1559,14 @@ class DefaultAgent(ConversationEntity):
chat_log: ChatLog,
) -> str:
"""Run sentence trigger callbacks and return response text."""
manager = get_agent_manager(self.hass)
# Gather callback responses in parallel
trigger_callbacks = [
self._triggers_details[trigger_id].callback(user_input, trigger_result)
for trigger_id, trigger_result in result.matched_triggers.items()
trigger_callback(user_input, trigger_result)
for trigger_intent_name, trigger_result in result.matched_triggers.items()
if (trigger_callback := manager.get_trigger_callback(trigger_intent_name))
is not None
]
tool_input = llm.ToolInput(

View File

@@ -165,11 +165,7 @@ async def websocket_list_sentences(
"""List custom registered sentences."""
manager = get_agent_manager(hass)
sentences = []
for trigger_details in manager.triggers_details:
sentences.extend(trigger_details.sentences)
connection.send_result(msg["id"], {"trigger_sentences": sentences})
connection.send_result(msg["id"], {"trigger_sentences": manager.trigger_sentences})
@websocket_api.websocket_command(

View File

@@ -3,7 +3,6 @@
from __future__ import annotations
from collections.abc import Awaitable, Callable
from dataclasses import dataclass
from typing import Any
from hassil.recognize import RecognizeResult
@@ -31,14 +30,6 @@ TRIGGER_CALLBACK_TYPE = Callable[
]
@dataclass(slots=True)
class TriggerDetails:
"""List of sentences and the callback for a trigger."""
sentences: list[str]
callback: TRIGGER_CALLBACK_TYPE
def has_no_punctuation(value: list[str]) -> list[str]:
"""Validate result does not contain punctuation."""
for sentence in value:
@@ -149,5 +140,5 @@ async def async_attach_trigger(
return None
return get_agent_manager(hass).register_trigger(
TriggerDetails(sentences=sentences, callback=call_action)
sentences=sentences, trigger_callback=call_action
)

View File

@@ -3,9 +3,8 @@
import logging
from datadog import DogStatsd, initialize
import voluptuous as vol
from homeassistant.config_entries import SOURCE_IMPORT, ConfigEntry
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
CONF_HOST,
CONF_PORT,
@@ -16,53 +15,15 @@ from homeassistant.const import (
)
from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_validation as cv, state as state_helper
from homeassistant.helpers.typing import ConfigType
from . import config_flow as config_flow
from .const import (
CONF_RATE,
DEFAULT_HOST,
DEFAULT_PORT,
DEFAULT_PREFIX,
DEFAULT_RATE,
DOMAIN,
)
from .const import CONF_RATE, DOMAIN
_LOGGER = logging.getLogger(__name__)
type DatadogConfigEntry = ConfigEntry[DogStatsd]
CONFIG_SCHEMA = vol.Schema(
{
DOMAIN: vol.Schema(
{
vol.Required(CONF_HOST, default=DEFAULT_HOST): cv.string,
vol.Optional(CONF_PORT, default=DEFAULT_PORT): cv.port,
vol.Optional(CONF_PREFIX, default=DEFAULT_PREFIX): cv.string,
vol.Optional(CONF_RATE, default=DEFAULT_RATE): vol.All(
vol.Coerce(int), vol.Range(min=1)
),
}
)
},
extra=vol.ALLOW_EXTRA,
)
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the Datadog integration from YAML, initiating config flow import."""
if DOMAIN not in config:
return True
hass.async_create_task(
hass.config_entries.flow.async_init(
DOMAIN,
context={"source": SOURCE_IMPORT},
data=config[DOMAIN],
)
)
return True
CONFIG_SCHEMA = cv.removed(DOMAIN, raise_if_present=False)
async def async_setup_entry(hass: HomeAssistant, entry: DatadogConfigEntry) -> bool:

View File

@@ -12,8 +12,7 @@ from homeassistant.config_entries import (
OptionsFlow,
)
from homeassistant.const import CONF_HOST, CONF_PORT, CONF_PREFIX
from homeassistant.core import DOMAIN as HOMEASSISTANT_DOMAIN, HomeAssistant, callback
from homeassistant.helpers.issue_registry import IssueSeverity, async_create_issue
from homeassistant.core import HomeAssistant, callback
from .const import (
CONF_RATE,
@@ -71,22 +70,6 @@ class DatadogConfigFlow(ConfigFlow, domain=DOMAIN):
errors=errors,
)
async def async_step_import(self, user_input: dict[str, Any]) -> ConfigFlowResult:
"""Handle import from configuration.yaml."""
# Check for duplicates
self._async_abort_entries_match(
{CONF_HOST: user_input[CONF_HOST], CONF_PORT: user_input[CONF_PORT]}
)
result = await self.async_step_user(user_input)
if errors := result.get("errors"):
await deprecate_yaml_issue(self.hass, False)
return self.async_abort(reason=errors["base"])
await deprecate_yaml_issue(self.hass, True)
return result
@staticmethod
@callback
def async_get_options_flow(config_entry: ConfigEntry) -> OptionsFlow:
@@ -163,41 +146,3 @@ async def validate_datadog_connection(
return False
else:
return True
async def deprecate_yaml_issue(
hass: HomeAssistant,
import_success: bool,
) -> None:
"""Create an issue to deprecate YAML config."""
if import_success:
async_create_issue(
hass,
HOMEASSISTANT_DOMAIN,
f"deprecated_yaml_{DOMAIN}",
is_fixable=False,
issue_domain=DOMAIN,
breaks_in_ha_version="2026.2.0",
severity=IssueSeverity.WARNING,
translation_key="deprecated_yaml",
translation_placeholders={
"domain": DOMAIN,
"integration_title": "Datadog",
},
)
else:
async_create_issue(
hass,
DOMAIN,
"deprecated_yaml_import_connection_error",
breaks_in_ha_version="2026.2.0",
is_fixable=False,
issue_domain=DOMAIN,
severity=IssueSeverity.WARNING,
translation_key="deprecated_yaml_import_connection_error",
translation_placeholders={
"domain": DOMAIN,
"integration_title": "Datadog",
"url": f"/config/integrations/dashboard/add?domain={DOMAIN}",
},
)

View File

@@ -25,12 +25,6 @@
}
}
},
"issues": {
"deprecated_yaml_import_connection_error": {
"description": "There was an error connecting to the Datadog Agent when trying to import the YAML configuration.\n\nEnsure the YAML configuration is correct and restart Home Assistant to try again or remove the {domain} configuration from your `configuration.yaml` file and continue to [set up the integration]({url}) manually.",
"title": "{domain} YAML configuration import failed"
}
},
"options": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_service%]",

View File

@@ -7,10 +7,7 @@ import logging
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_SOURCE, Platform
from homeassistant.core import HomeAssistant
from homeassistant.helpers.device import (
async_entity_id_to_device_id,
async_remove_stale_devices_links_keep_entity_device,
)
from homeassistant.helpers.device import async_entity_id_to_device_id
from homeassistant.helpers.helper_integration import (
async_handle_source_entity_changes,
async_remove_helper_config_entry_from_source_device,
@@ -22,11 +19,6 @@ _LOGGER = logging.getLogger(__name__)
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Set up Derivative from a config entry."""
# This can be removed in HA Core 2026.2
async_remove_stale_devices_links_keep_entity_device(
hass, entry.entry_id, entry.options[CONF_SOURCE]
)
def set_source_entity_id_or_uuid(source_entity_id: str) -> None:
hass.config_entries.async_update_entry(
entry,

View File

@@ -4,7 +4,7 @@ from __future__ import annotations
from datetime import datetime
import logging
import urllib
import urllib.error
from pyW215.pyW215 import SmartPlug

View File

@@ -41,13 +41,20 @@ class UKFloodsFlowHandler(ConfigFlow, domain=DOMAIN):
self.stations = {}
for station in stations:
label = station["label"]
rloId = station["RLOIid"]
# API annoyingly sometimes returns a list and sometimes returns a string
# E.g. L3121 has a label of ['Scurf Dyke', 'Scurf Dyke Dyke Level']
if isinstance(label, list):
label = label[-1]
self.stations[label] = station["stationReference"]
# Similar for RLOIid
# E.g. 0018 has an RLOIid of ['10427', '9154']
if isinstance(rloId, list):
rloId = rloId[-1]
fullName = label + " - " + rloId
self.stations[fullName] = station["stationReference"]
if not self.stations:
return self.async_abort(reason="no_stations")

View File

@@ -8,7 +8,7 @@
"iot_class": "local_polling",
"loggers": ["eheimdigital"],
"quality_scale": "platinum",
"requirements": ["eheimdigital==1.5.0"],
"requirements": ["eheimdigital==1.6.0"],
"zeroconf": [
{ "name": "eheimdigital._http._tcp.local.", "type": "_http._tcp.local." }
]

View File

@@ -4,6 +4,8 @@ import asyncio.exceptions
from typing import Any
from flexit_bacnet import (
OPERATION_MODE_FIREPLACE,
OPERATION_MODE_OFF,
VENTILATION_MODE_AWAY,
VENTILATION_MODE_HOME,
VENTILATION_MODE_STOP,
@@ -12,7 +14,6 @@ from flexit_bacnet.bacnet import DecodingError
from homeassistant.components.climate import (
PRESET_AWAY,
PRESET_BOOST,
PRESET_HOME,
ClimateEntity,
ClimateEntityFeature,
@@ -28,8 +29,10 @@ from .const import (
DOMAIN,
MAX_TEMP,
MIN_TEMP,
OPERATION_TO_PRESET_MODE_MAP,
PRESET_FIREPLACE,
PRESET_HIGH,
PRESET_TO_VENTILATION_MODE_MAP,
VENTILATION_TO_PRESET_MODE_MAP,
)
from .coordinator import FlexitConfigEntry, FlexitCoordinator
from .entity import FlexitEntity
@@ -51,6 +54,7 @@ class FlexitClimateEntity(FlexitEntity, ClimateEntity):
"""Flexit air handling unit."""
_attr_name = None
_attr_translation_key = "flexit_bacnet"
_attr_hvac_modes = [
HVACMode.OFF,
@@ -60,7 +64,8 @@ class FlexitClimateEntity(FlexitEntity, ClimateEntity):
_attr_preset_modes = [
PRESET_AWAY,
PRESET_HOME,
PRESET_BOOST,
PRESET_HIGH,
PRESET_FIREPLACE,
]
_attr_supported_features = (
@@ -127,20 +132,29 @@ class FlexitClimateEntity(FlexitEntity, ClimateEntity):
Requires ClimateEntityFeature.PRESET_MODE.
"""
return VENTILATION_TO_PRESET_MODE_MAP[self.device.ventilation_mode]
return OPERATION_TO_PRESET_MODE_MAP[self.device.operation_mode]
async def async_set_preset_mode(self, preset_mode: str) -> None:
"""Set new preset mode."""
ventilation_mode = PRESET_TO_VENTILATION_MODE_MAP[preset_mode]
try:
await self.device.set_ventilation_mode(ventilation_mode)
if preset_mode == PRESET_FIREPLACE:
# Use trigger method for fireplace mode
await self.device.trigger_fireplace_mode()
else:
# If currently in fireplace mode, toggle it off first
# trigger_fireplace_mode() acts as a toggle
if self.device.operation_mode == OPERATION_MODE_FIREPLACE:
await self.device.trigger_fireplace_mode()
# Set the desired ventilation mode
ventilation_mode = PRESET_TO_VENTILATION_MODE_MAP[preset_mode]
await self.device.set_ventilation_mode(ventilation_mode)
except (asyncio.exceptions.TimeoutError, ConnectionError, DecodingError) as exc:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="set_preset_mode",
translation_placeholders={
"preset": str(ventilation_mode),
"preset": preset_mode,
},
) from exc
finally:
@@ -149,7 +163,7 @@ class FlexitClimateEntity(FlexitEntity, ClimateEntity):
@property
def hvac_mode(self) -> HVACMode:
"""Return hvac operation ie. heat, cool mode."""
if self.device.ventilation_mode == VENTILATION_MODE_STOP:
if self.device.operation_mode == OPERATION_MODE_OFF:
return HVACMode.OFF
return HVACMode.FAN_ONLY

View File

@@ -1,34 +1,40 @@
"""Constants for the Flexit Nordic (BACnet) integration."""
from flexit_bacnet import (
OPERATION_MODE_AWAY,
OPERATION_MODE_FIREPLACE,
OPERATION_MODE_HIGH,
OPERATION_MODE_HOME,
OPERATION_MODE_OFF,
VENTILATION_MODE_AWAY,
VENTILATION_MODE_HIGH,
VENTILATION_MODE_HOME,
VENTILATION_MODE_STOP,
)
from homeassistant.components.climate import (
PRESET_AWAY,
PRESET_BOOST,
PRESET_HOME,
PRESET_NONE,
)
from homeassistant.components.climate import PRESET_AWAY, PRESET_HOME, PRESET_NONE
DOMAIN = "flexit_bacnet"
MAX_TEMP = 30
MIN_TEMP = 10
VENTILATION_TO_PRESET_MODE_MAP = {
VENTILATION_MODE_STOP: PRESET_NONE,
VENTILATION_MODE_AWAY: PRESET_AWAY,
VENTILATION_MODE_HOME: PRESET_HOME,
VENTILATION_MODE_HIGH: PRESET_BOOST,
PRESET_HIGH = "high"
PRESET_FIREPLACE = "fireplace"
# Map operation mode (what device reports) to Home Assistant preset
OPERATION_TO_PRESET_MODE_MAP = {
OPERATION_MODE_OFF: PRESET_NONE,
OPERATION_MODE_AWAY: PRESET_AWAY,
OPERATION_MODE_HOME: PRESET_HOME,
OPERATION_MODE_HIGH: PRESET_HIGH,
OPERATION_MODE_FIREPLACE: PRESET_FIREPLACE,
}
# Map preset to ventilation mode (for setting standard modes)
PRESET_TO_VENTILATION_MODE_MAP = {
PRESET_NONE: VENTILATION_MODE_STOP,
PRESET_AWAY: VENTILATION_MODE_AWAY,
PRESET_HOME: VENTILATION_MODE_HOME,
PRESET_BOOST: VENTILATION_MODE_HIGH,
PRESET_HIGH: VENTILATION_MODE_HIGH,
}

View File

@@ -1,5 +1,17 @@
{
"entity": {
"climate": {
"flexit_bacnet": {
"state_attributes": {
"preset_mode": {
"state": {
"fireplace": "mdi:fireplace",
"high": "mdi:fan-speed-3"
}
}
}
}
},
"number": {
"away_extract_fan_setpoint": {
"default": "mdi:fan-minus"

View File

@@ -26,6 +26,18 @@
"name": "Air filter polluted"
}
},
"climate": {
"flexit_bacnet": {
"state_attributes": {
"preset_mode": {
"state": {
"fireplace": "Fireplace",
"high": "[%key:common::state::high%]"
}
}
}
}
},
"number": {
"away_extract_fan_setpoint": {
"name": "Away extract fan setpoint"
@@ -139,5 +151,11 @@
"switch_turn": {
"message": "Failed to turn the switch {state}."
}
},
"issues": {
"deprecated_fireplace_switch": {
"description": "The fireplace mode switch entity `{entity_id}` is deprecated and will be removed in a future version.\n\nFireplace mode has been moved to a climate preset on the climate entity to better match the device interface.\n\nPlease update your automations to use the `climate.set_preset_mode` action with preset mode `fireplace` instead of using the switch entity.\n\nAfter updating your automations, you can safely disable this switch entity.",
"title": "Fireplace mode switch is deprecated"
}
}
}

View File

@@ -13,9 +13,12 @@ from homeassistant.components.switch import (
SwitchEntity,
SwitchEntityDescription,
)
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import entity_registry as er
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.issue_registry import IssueSeverity, async_create_issue
from .const import DOMAIN
from .coordinator import FlexitConfigEntry, FlexitCoordinator
@@ -39,13 +42,6 @@ SWITCHES: tuple[FlexitSwitchEntityDescription, ...] = (
turn_on_fn=lambda data: data.enable_electric_heater(),
turn_off_fn=lambda data: data.disable_electric_heater(),
),
FlexitSwitchEntityDescription(
key="fireplace_mode",
translation_key="fireplace_mode",
is_on_fn=lambda data: data.fireplace_ventilation_status,
turn_on_fn=lambda data: data.trigger_fireplace_mode(),
turn_off_fn=lambda data: data.trigger_fireplace_mode(),
),
FlexitSwitchEntityDescription(
key="cooker_hood_mode",
translation_key="cooker_hood_mode",
@@ -53,6 +49,13 @@ SWITCHES: tuple[FlexitSwitchEntityDescription, ...] = (
turn_on_fn=lambda data: data.activate_cooker_hood(),
turn_off_fn=lambda data: data.deactivate_cooker_hood(),
),
FlexitSwitchEntityDescription(
key="fireplace_mode",
translation_key="fireplace_mode",
is_on_fn=lambda data: data.fireplace_ventilation_status,
turn_on_fn=lambda data: data.trigger_fireplace_mode(),
turn_off_fn=lambda data: data.trigger_fireplace_mode(),
),
)
@@ -64,9 +67,42 @@ async def async_setup_entry(
"""Set up Flexit (bacnet) switch from a config entry."""
coordinator = config_entry.runtime_data
async_add_entities(
FlexitSwitch(coordinator, description) for description in SWITCHES
)
entities: list[FlexitSwitch] = []
for description in SWITCHES:
if description.key == "fireplace_mode":
# Check if deprecated fireplace switch is enabled and create repair issue
entity_reg = er.async_get(hass)
fireplace_switch_unique_id = (
f"{coordinator.device.serial_number}-fireplace_mode"
)
# Look up the fireplace switch entity by unique_id
fireplace_switch_entity_id = entity_reg.async_get_entity_id(
Platform.SWITCH, DOMAIN, fireplace_switch_unique_id
)
if not fireplace_switch_entity_id:
continue
entity_registry_entry = entity_reg.async_get(fireplace_switch_entity_id)
if entity_registry_entry:
if entity_registry_entry.disabled:
entity_reg.async_remove(fireplace_switch_entity_id)
else:
async_create_issue(
hass,
DOMAIN,
f"deprecated_switch_{fireplace_switch_unique_id}",
is_fixable=False,
issue_domain=DOMAIN,
severity=IssueSeverity.WARNING,
translation_key="deprecated_fireplace_switch",
translation_placeholders={
"entity_id": fireplace_switch_entity_id,
},
)
entities.append(FlexitSwitch(coordinator, description))
else:
entities.append(FlexitSwitch(coordinator, description))
async_add_entities(entities)
PARALLEL_UPDATES = 1

View File

@@ -7,5 +7,5 @@
"integration_type": "hub",
"iot_class": "cloud_polling",
"quality_scale": "bronze",
"requirements": ["fressnapftracker==0.2.1"]
"requirements": ["fressnapftracker==0.2.2"]
}

View File

@@ -21,5 +21,5 @@
"integration_type": "system",
"preview_features": { "winter_mode": {} },
"quality_scale": "internal",
"requirements": ["home-assistant-frontend==20260128.4"]
"requirements": ["home-assistant-frontend==20260128.3"]
}

View File

@@ -13,10 +13,7 @@ from homeassistant.helpers import (
discovery,
entity_registry as er,
)
from homeassistant.helpers.device import (
async_entity_id_to_device_id,
async_remove_stale_devices_links_keep_entity_device,
)
from homeassistant.helpers.device import async_entity_id_to_device_id
from homeassistant.helpers.event import async_track_entity_registry_updated_event
from homeassistant.helpers.helper_integration import (
async_handle_source_entity_changes,
@@ -96,13 +93,6 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Set up from a config entry."""
# This can be removed in HA Core 2026.2
async_remove_stale_devices_links_keep_entity_device(
hass,
entry.entry_id,
entry.options[CONF_HUMIDIFIER],
)
def set_humidifier_entity_id_or_uuid(source_entity_id: str) -> None:
hass.config_entries.async_update_entry(
entry,

View File

@@ -5,10 +5,7 @@ import logging
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import Event, HomeAssistant
from homeassistant.helpers import entity_registry as er
from homeassistant.helpers.device import (
async_entity_id_to_device_id,
async_remove_stale_devices_links_keep_entity_device,
)
from homeassistant.helpers.device import async_entity_id_to_device_id
from homeassistant.helpers.event import async_track_entity_registry_updated_event
from homeassistant.helpers.helper_integration import (
async_handle_source_entity_changes,
@@ -23,13 +20,6 @@ _LOGGER = logging.getLogger(__name__)
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Set up from a config entry."""
# This can be removed in HA Core 2026.2
async_remove_stale_devices_links_keep_entity_device(
hass,
entry.entry_id,
entry.options[CONF_HEATER],
)
def set_humidifier_entity_id_or_uuid(source_entity_id: str) -> None:
hass.config_entries.async_update_entry(
entry,

View File

@@ -8,5 +8,5 @@
"integration_type": "service",
"iot_class": "cloud_polling",
"loggers": ["googleapiclient"],
"requirements": ["gcal-sync==8.0.0", "oauth2client==4.1.3", "ical==12.1.3"]
"requirements": ["gcal-sync==8.0.0", "oauth2client==4.1.3", "ical==12.1.2"]
}

View File

@@ -6,7 +6,7 @@
"documentation": "https://www.home-assistant.io/integrations/hdfury",
"integration_type": "device",
"iot_class": "local_polling",
"quality_scale": "silver",
"quality_scale": "gold",
"requirements": ["hdfury==1.4.2"],
"zeroconf": [
{ "name": "diva-*", "type": "_http._tcp.local." },

View File

@@ -46,24 +46,26 @@ rules:
diagnostics: done
discovery-update-info: done
discovery: done
docs-data-update: todo
docs-examples: todo
docs-known-limitations: todo
docs-data-update: done
docs-examples: done
docs-known-limitations: done
docs-supported-devices: done
docs-supported-functions: done
docs-troubleshooting: todo
docs-troubleshooting: done
docs-use-cases: done
dynamic-devices:
status: exempt
comment: Device type integration.
entity-category: done
entity-device-class: done
entity-disabled-by-default: todo
entity-disabled-by-default: done
entity-translations: done
exception-translations: done
icon-translations: done
reconfiguration-flow: done
repair-issues: todo
repair-issues:
status: exempt
comment: The integration doesn't have any repair cases.
stale-devices:
status: exempt
comment: Device type integration.

View File

@@ -8,10 +8,7 @@ import logging
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_ENTITY_ID, CONF_STATE
from homeassistant.core import HomeAssistant
from homeassistant.helpers.device import (
async_entity_id_to_device_id,
async_remove_stale_devices_links_keep_entity_device,
)
from homeassistant.helpers.device import async_entity_id_to_device_id
from homeassistant.helpers.helper_integration import (
async_handle_source_entity_changes,
async_remove_helper_config_entry_from_source_device,
@@ -53,13 +50,6 @@ async def async_setup_entry(
await coordinator.async_config_entry_first_refresh()
entry.runtime_data = coordinator
# This can be removed in HA Core 2026.2
async_remove_stale_devices_links_keep_entity_device(
hass,
entry.entry_id,
entry.options[CONF_ENTITY_ID],
)
def set_source_entity_id_or_uuid(source_entity_id: str) -> None:
hass.config_entries.async_update_entry(
entry,

View File

@@ -24,6 +24,7 @@ from homeassistant.const import (
SERVICE_TOGGLE,
SERVICE_TURN_OFF,
SERVICE_TURN_ON,
STATE_ON,
)
from homeassistant.core import (
Event,
@@ -127,6 +128,19 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool: # noqa:
)
return
# Determine the actual service to call
actual_service = service.service
# For toggle, implement conditional behavior: if any entity is on,
# turn all off; otherwise turn all on
if service.service == SERVICE_TOGGLE:
any_on = any(
(state := hass.states.get(entity_id)) is not None
and state.state == STATE_ON
for entity_id in all_referenced
)
actual_service = SERVICE_TURN_OFF if any_on else SERVICE_TURN_ON
# Group entity_ids by domain. groupby requires sorted data.
by_domain = it.groupby(
sorted(all_referenced), lambda item: split_entity_id(item)[0]
@@ -145,7 +159,7 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool: # noqa:
)
continue
if not hass.services.has_service(domain, service.service):
if not hass.services.has_service(domain, actual_service):
unsupported_entities.update(set(ent_ids) & referenced.referenced)
continue
@@ -158,7 +172,7 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool: # noqa:
tasks.append(
hass.services.async_call(
domain,
service.service,
actual_service,
data,
blocking=True,
context=service.context,

View File

@@ -12,5 +12,5 @@
"iot_class": "local_polling",
"loggers": ["incomfortclient"],
"quality_scale": "platinum",
"requirements": ["incomfort-client==0.6.12"]
"requirements": ["incomfort-client==0.6.11"]
}

View File

@@ -90,7 +90,6 @@
"boiler_int": "Boiler internal",
"buffer": "Buffer",
"central_heating": "Central heating",
"central_heating_low": "Central heating low",
"central_heating_rf": "Central heating rf",
"cv_temperature_too_high_e1": "Temperature too high",
"flame_detection_fault_e6": "Flame detection fault",

View File

@@ -23,15 +23,15 @@
"name": "[%key:common::action::reload%]"
},
"toggle": {
"description": "Toggles the helper on/off.",
"description": "Toggles an input boolean on/off.",
"name": "[%key:common::action::toggle%]"
},
"turn_off": {
"description": "Turns off the helper.",
"description": "Turns off an input boolean.",
"name": "[%key:common::action::turn_off%]"
},
"turn_on": {
"description": "Turns on the helper.",
"description": "Turns on an input boolean.",
"name": "[%key:common::action::turn_on%]"
}
},

View File

@@ -15,7 +15,7 @@
},
"services": {
"press": {
"description": "Mimics the physical button press on the device.",
"description": "Mimics a physical button press on a device.",
"name": "Press"
},
"reload": {

View File

@@ -35,11 +35,11 @@
},
"services": {
"decrement": {
"description": "Decrements the current value by 1 step.",
"description": "Decrements the value of an input number by 1 step.",
"name": "Decrement"
},
"increment": {
"description": "Increments the current value by 1 step.",
"description": "Increments the value of an input number by 1 step.",
"name": "Increment"
},
"reload": {
@@ -47,7 +47,7 @@
"name": "[%key:common::action::reload%]"
},
"set_value": {
"description": "Sets the value.",
"description": "Sets the value of an input number.",
"fields": {
"value": {
"description": "The target value.",

View File

@@ -7,10 +7,7 @@ import logging
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from homeassistant.helpers.device import (
async_entity_id_to_device_id,
async_remove_stale_devices_links_keep_entity_device,
)
from homeassistant.helpers.device import async_entity_id_to_device_id
from homeassistant.helpers.helper_integration import (
async_handle_source_entity_changes,
async_remove_helper_config_entry_from_source_device,
@@ -24,13 +21,6 @@ _LOGGER = logging.getLogger(__name__)
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Set up Integration from a config entry."""
# This can be removed in HA Core 2026.2
async_remove_stale_devices_links_keep_entity_device(
hass,
entry.entry_id,
entry.options[CONF_SOURCE_SENSOR],
)
def set_source_entity_id_or_uuid(source_entity_id: str) -> None:
hass.config_entries.async_update_entry(
entry,

View File

@@ -76,7 +76,7 @@ async def async_migrate_entities(
def _update_entry(entry: RegistryEntry) -> dict[str, str] | None:
"""Fix unique_id of power binary_sensor entry."""
if entry.domain == Platform.BINARY_SENSOR and ":" not in entry.unique_id:
if "_power" in entry.unique_id:
if entry.unique_id.endswith("_power"):
return {"new_unique_id": f"{coordinator.unique_id}_power"}
return None

View File

@@ -8,7 +8,6 @@ from homeassistant.components.binary_sensor import BinarySensorEntity
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import POWER
from .coordinator import JVCConfigEntry, JvcProjectorDataUpdateCoordinator
from .entity import JvcProjectorEntity
@@ -41,4 +40,4 @@ class JvcBinarySensor(JvcProjectorEntity, BinarySensorEntity):
@property
def is_on(self) -> bool:
"""Return true if the JVC Projector is on."""
return self.coordinator.data[POWER] in ON_STATUS
return self.coordinator.data[cmd.Power.name] in ON_STATUS

View File

@@ -3,7 +3,3 @@
NAME = "JVC Projector"
DOMAIN = "jvc_projector"
MANUFACTURER = "JVC"
POWER = "power"
INPUT = "input"
SOURCE = "source"

View File

@@ -2,29 +2,40 @@
from __future__ import annotations
import asyncio
from datetime import timedelta
import logging
from typing import TYPE_CHECKING, Any
from jvcprojector import (
JvcProjector,
JvcProjectorAuthError,
JvcProjectorTimeoutError,
command as cmd,
)
from jvcprojector import JvcProjector, JvcProjectorTimeoutError, command as cmd
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import INPUT, NAME, POWER
from .const import NAME
if TYPE_CHECKING:
from jvcprojector import Command
_LOGGER = logging.getLogger(__name__)
INTERVAL_SLOW = timedelta(seconds=10)
INTERVAL_FAST = timedelta(seconds=5)
CORE_COMMANDS: tuple[type[Command], ...] = (
cmd.Power,
cmd.Signal,
cmd.Input,
cmd.LightTime,
)
TRANSLATIONS = str.maketrans({"+": "p", "%": "p", ":": "x"})
TIMEOUT_RETRIES = 12
TIMEOUT_SLEEP = 1
type JVCConfigEntry = ConfigEntry[JvcProjectorDataUpdateCoordinator]
@@ -51,27 +62,108 @@ class JvcProjectorDataUpdateCoordinator(DataUpdateCoordinator[dict[str, str]]):
assert config_entry.unique_id is not None
self.unique_id = config_entry.unique_id
self.capabilities = self.device.capabilities()
self.state: dict[type[Command], str] = {}
async def _async_update_data(self) -> dict[str, Any]:
"""Get the latest state data."""
state: dict[str, str | None] = {
POWER: None,
INPUT: None,
}
"""Update state with the current value of a command."""
commands: set[type[Command]] = set(self.async_contexts())
commands = commands.difference(CORE_COMMANDS)
try:
state[POWER] = await self.device.get(cmd.Power)
last_timeout: JvcProjectorTimeoutError | None = None
if state[POWER] == cmd.Power.ON:
state[INPUT] = await self.device.get(cmd.Input)
for _ in range(TIMEOUT_RETRIES):
try:
new_state = await self._get_device_state(commands)
break
except JvcProjectorTimeoutError as err:
# Timeouts are expected when the projector loses signal and ignores commands for a brief time.
last_timeout = err
await asyncio.sleep(TIMEOUT_SLEEP)
else:
raise UpdateFailed(str(last_timeout)) from last_timeout
except JvcProjectorTimeoutError as err:
raise UpdateFailed(f"Unable to connect to {self.device.host}") from err
except JvcProjectorAuthError as err:
raise ConfigEntryAuthFailed("Password authentication failed") from err
# Clear state on signal loss
if (
new_state.get(cmd.Signal) == cmd.Signal.NONE
and self.state.get(cmd.Signal) != cmd.Signal.NONE
):
self.state = {k: v for k, v in self.state.items() if k in CORE_COMMANDS}
if state[POWER] != cmd.Power.STANDBY:
# Update state with new values
for k, v in new_state.items():
self.state[k] = v
if self.state[cmd.Power] != cmd.Power.STANDBY:
self.update_interval = INTERVAL_FAST
else:
self.update_interval = INTERVAL_SLOW
return state
return {k.name: v for k, v in self.state.items()}
async def _get_device_state(
self, commands: set[type[Command]]
) -> dict[type[Command], str]:
"""Get the current state of the device."""
new_state: dict[type[Command], str] = {}
deferred_commands: list[type[Command]] = []
power = await self._update_command_state(cmd.Power, new_state)
if power == cmd.Power.ON:
signal = await self._update_command_state(cmd.Signal, new_state)
await self._update_command_state(cmd.Input, new_state)
await self._update_command_state(cmd.LightTime, new_state)
if signal == cmd.Signal.SIGNAL:
for command in commands:
if command.depends:
# Command has dependencies, so defer it until they are resolved below
deferred_commands.append(command)
else:
await self._update_command_state(command, new_state)
# Deferred commands should have had their dependencies resolved above
for command in deferred_commands:
depend_command, depend_values = next(iter(command.depends.items()))
value: str | None = None
if depend_command in new_state:
value = new_state[depend_command]
elif depend_command in self.state:
value = self.state[depend_command]
if value and value in depend_values:
await self._update_command_state(command, new_state)
elif self.state.get(cmd.Signal) != cmd.Signal.NONE:
new_state[cmd.Signal] = cmd.Signal.NONE
return new_state
async def _update_command_state(
self, command: type[Command], new_state: dict[type[Command], str]
) -> str | None:
"""Update state with the current value of a command."""
value = await self.device.get(command)
if value != self.state.get(command):
new_state[command] = value
return value
def get_options_map(self, command: str) -> dict[str, str]:
"""Get the available options for a command."""
capabilities = self.capabilities.get(command, {})
if TYPE_CHECKING:
assert isinstance(capabilities, dict)
assert isinstance(capabilities.get("parameter", {}), dict)
assert isinstance(capabilities.get("parameter", {}).get("read", {}), dict)
values = list(capabilities.get("parameter", {}).get("read", {}).values())
return {v: v.translate(TRANSLATIONS) for v in values}
def supports(self, command: type[Command]) -> bool:
"""Check if the device supports a command."""
return self.device.supports(command)

View File
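Aside: the retry flow in the coordinator above reduces to Python's for/else over a fixed number of attempts. A standalone sketch with a generic `fetch` callable and a placeholder exception class (not the jvcprojector API):

```python
import asyncio

TIMEOUT_RETRIES = 12
TIMEOUT_SLEEP = 1


class TransientTimeoutError(Exception):
    """Placeholder for a timeout the device is expected to recover from."""


async def fetch_with_retries(fetch) -> dict:
    """Retry a transient timeout a fixed number of times before giving up."""
    last_timeout: TransientTimeoutError | None = None
    for _ in range(TIMEOUT_RETRIES):
        try:
            new_state = await fetch()
            break
        except TransientTimeoutError as err:
            # Timeouts are expected while the device briefly ignores commands.
            last_timeout = err
            await asyncio.sleep(TIMEOUT_SLEEP)
    else:
        # The loop never hit `break`, i.e. every attempt timed out.
        raise RuntimeError(str(last_timeout)) from last_timeout
    return new_state


async def main() -> None:
    attempts = iter([TransientTimeoutError("busy"), {"power": "on"}])

    async def fetch() -> dict:
        result = next(attempts)
        if isinstance(result, Exception):
            raise result
        return result

    print(await fetch_with_retries(fetch))


asyncio.run(main())
```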

@@ -4,7 +4,7 @@ from __future__ import annotations
import logging
from jvcprojector import JvcProjector
from jvcprojector import Command, JvcProjector
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.update_coordinator import CoordinatorEntity
@@ -20,9 +20,13 @@ class JvcProjectorEntity(CoordinatorEntity[JvcProjectorDataUpdateCoordinator]):
_attr_has_entity_name = True
def __init__(self, coordinator: JvcProjectorDataUpdateCoordinator) -> None:
def __init__(
self,
coordinator: JvcProjectorDataUpdateCoordinator,
command: type[Command] | None = None,
) -> None:
"""Initialize the entity."""
super().__init__(coordinator)
super().__init__(coordinator, command)
self._attr_unique_id = coordinator.unique_id
self._attr_device_info = DeviceInfo(

View File

@@ -1,7 +1,7 @@
{
"entity": {
"binary_sensor": {
"jvc_power": {
"power": {
"default": "mdi:projector-off",
"state": {
"on": "mdi:projector"
@@ -9,17 +9,47 @@
}
},
"select": {
"anamorphic": {
"default": "mdi:fit-to-screen-outline"
},
"clear_motion_drive": {
"default": "mdi:blur"
},
"dynamic_control": {
"default": "mdi:lightbulb-on-outline"
},
"input": {
"default": "mdi:hdmi-port"
},
"installation_mode": {
"default": "mdi:aspect-ratio"
},
"light_power": {
"default": "mdi:lightbulb-on-outline"
}
},
"sensor": {
"jvc_power_status": {
"default": "mdi:power-plug-off",
"color_depth": {
"default": "mdi:palette-outline"
},
"color_space": {
"default": "mdi:palette-outline"
},
"hdr": {
"default": "mdi:image-filter-hdr-outline"
},
"hdr_processing": {
"default": "mdi:image-filter-hdr-outline"
},
"picture_mode": {
"default": "mdi:movie-roll"
},
"power": {
"default": "mdi:power",
"state": {
"cooling": "mdi:snowflake",
"error": "mdi:alert-circle",
"on": "mdi:power-plug",
"on": "mdi:power",
"warming": "mdi:heat-wave"
}
}

View File

@@ -7,5 +7,5 @@
"integration_type": "device",
"iot_class": "local_polling",
"loggers": ["jvcprojector"],
"requirements": ["pyjvcprojector==2.0.0"]
"requirements": ["pyjvcprojector==2.0.1"]
}

View File

@@ -14,7 +14,6 @@ from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import POWER
from .coordinator import JVCConfigEntry
from .entity import JvcProjectorEntity
@@ -65,6 +64,8 @@ RENAMED_COMMANDS: dict[str, str] = {
"hdmi2": cmd.Remote.HDMI2,
}
ON_STATUS = (cmd.Power.ON, cmd.Power.WARMING)
_LOGGER = logging.getLogger(__name__)
@@ -86,7 +87,7 @@ class JvcProjectorRemote(JvcProjectorEntity, RemoteEntity):
@property
def is_on(self) -> bool:
"""Return True if the entity is on."""
return self.coordinator.data[POWER] in (cmd.Power.ON, cmd.Power.WARMING)
return self.coordinator.data.get(cmd.Power.name) in ON_STATUS
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn the device on."""

View File

@@ -2,11 +2,10 @@
from __future__ import annotations
from collections.abc import Awaitable, Callable
from dataclasses import dataclass
from typing import Final
from jvcprojector import JvcProjector, command as cmd
from jvcprojector import Command, command as cmd
from homeassistant.components.select import SelectEntity, SelectEntityDescription
from homeassistant.core import HomeAssistant
@@ -20,17 +19,37 @@ from .entity import JvcProjectorEntity
class JvcProjectorSelectDescription(SelectEntityDescription):
"""Describes JVC Projector select entities."""
command: Callable[[JvcProjector, str], Awaitable[None]]
command: type[Command]
SELECTS: Final[list[JvcProjectorSelectDescription]] = [
SELECTS: Final[tuple[JvcProjectorSelectDescription, ...]] = (
JvcProjectorSelectDescription(key="input", command=cmd.Input),
JvcProjectorSelectDescription(
key="input",
translation_key="input",
options=[cmd.Input.HDMI1, cmd.Input.HDMI2],
command=lambda device, option: device.set(cmd.Input, option),
)
]
key="installation_mode",
command=cmd.InstallationMode,
entity_registry_enabled_default=False,
),
JvcProjectorSelectDescription(
key="light_power",
command=cmd.LightPower,
entity_registry_enabled_default=False,
),
JvcProjectorSelectDescription(
key="dynamic_control",
command=cmd.DynamicControl,
entity_registry_enabled_default=False,
),
JvcProjectorSelectDescription(
key="clear_motion_drive",
command=cmd.ClearMotionDrive,
entity_registry_enabled_default=False,
),
JvcProjectorSelectDescription(
key="anamorphic",
command=cmd.Anamorphic,
entity_registry_enabled_default=False,
),
)
async def async_setup_entry(
@@ -42,30 +61,45 @@ async def async_setup_entry(
coordinator = entry.runtime_data
async_add_entities(
JvcProjectorSelectEntity(coordinator, description) for description in SELECTS
JvcProjectorSelectEntity(coordinator, description)
for description in SELECTS
if coordinator.supports(description.command)
)
class JvcProjectorSelectEntity(JvcProjectorEntity, SelectEntity):
"""Representation of a JVC Projector select entity."""
entity_description: JvcProjectorSelectDescription
def __init__(
self,
coordinator: JvcProjectorDataUpdateCoordinator,
description: JvcProjectorSelectDescription,
) -> None:
"""Initialize the entity."""
super().__init__(coordinator)
super().__init__(coordinator, description.command)
self.command: type[Command] = description.command
self.entity_description = description
self._attr_unique_id = f"{coordinator.unique_id}_{description.key}"
self._attr_translation_key = description.key
self._attr_unique_id = f"{self._attr_unique_id}_{description.key}"
self._options_map: dict[str, str] = coordinator.get_options_map(
self.command.name
)
@property
def options(self) -> list[str]:
"""Return a list of selectable options."""
return list(self._options_map.values())
@property
def current_option(self) -> str | None:
"""Return the selected entity option to represent the entity state."""
return self.coordinator.data[self.entity_description.key]
if value := self.coordinator.data.get(self.command.name):
return self._options_map.get(value)
return None
async def async_select_option(self, option: str) -> None:
"""Change the selected option."""
await self.entity_description.command(self.coordinator.device, option)
value = next((k for k, v in self._options_map.items() if v == option), None)
await self.coordinator.device.set(self.command, value)

View File
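Aside: the `_options_map` round trip above is a `str.maketrans` slugging of raw device values plus a reverse lookup when the user picks an option. A standalone sketch with made-up sample values (not taken from the pyjvcprojector library):

```python
# Characters that are awkward in option keys are replaced with letters.
TRANSLATIONS = str.maketrans({"+": "p", "%": "p", ":": "x"})

raw_values = ["hdr10+", "ycbcr-4:2:2", "off"]

# Raw device value -> display option exposed by the select entity.
options_map = {value: value.translate(TRANSLATIONS) for value in raw_values}
assert options_map == {"hdr10+": "hdr10p", "ycbcr-4:2:2": "ycbcr-4x2x2", "off": "off"}
options = list(options_map.values())

# Display option -> raw value sent back to the device.
selected = "ycbcr-4x2x2"
raw = next((k for k, v in options_map.items() if v == selected), None)
assert raw == "ycbcr-4:2:2"
```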

@@ -2,33 +2,77 @@
from __future__ import annotations
from jvcprojector import command as cmd
from dataclasses import dataclass
from jvcprojector import Command, command as cmd
from homeassistant.components.sensor import (
SensorDeviceClass,
SensorEntity,
SensorEntityDescription,
)
from homeassistant.const import EntityCategory
from homeassistant.const import EntityCategory, UnitOfTime
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import JVCConfigEntry, JvcProjectorDataUpdateCoordinator
from .entity import JvcProjectorEntity
JVC_SENSORS = (
SensorEntityDescription(
@dataclass(frozen=True, kw_only=True)
class JvcProjectorSensorDescription(SensorEntityDescription):
"""Describes JVC Projector sensor entities."""
command: type[Command]
SENSORS: tuple[JvcProjectorSensorDescription, ...] = (
JvcProjectorSensorDescription(
key="power",
translation_key="jvc_power_status",
command=cmd.Power,
device_class=SensorDeviceClass.ENUM,
),
JvcProjectorSensorDescription(
key="light_time",
command=cmd.LightTime,
device_class=SensorDeviceClass.DURATION,
entity_category=EntityCategory.DIAGNOSTIC,
native_unit_of_measurement=UnitOfTime.HOURS,
),
JvcProjectorSensorDescription(
key="color_depth",
command=cmd.ColorDepth,
device_class=SensorDeviceClass.ENUM,
entity_category=EntityCategory.DIAGNOSTIC,
options=[
cmd.Power.STANDBY,
cmd.Power.ON,
cmd.Power.WARMING,
cmd.Power.COOLING,
cmd.Power.ERROR,
],
entity_registry_enabled_default=False,
),
JvcProjectorSensorDescription(
key="color_space",
command=cmd.ColorSpace,
device_class=SensorDeviceClass.ENUM,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
),
JvcProjectorSensorDescription(
key="hdr",
command=cmd.Hdr,
device_class=SensorDeviceClass.ENUM,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
),
JvcProjectorSensorDescription(
key="hdr_processing",
command=cmd.HdrProcessing,
device_class=SensorDeviceClass.ENUM,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
),
JvcProjectorSensorDescription(
key="picture_mode",
command=cmd.PictureMode,
device_class=SensorDeviceClass.ENUM,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
),
)
@@ -42,24 +86,48 @@ async def async_setup_entry(
coordinator = entry.runtime_data
async_add_entities(
JvcSensor(coordinator, description) for description in JVC_SENSORS
JvcProjectorSensorEntity(coordinator, description)
for description in SENSORS
if coordinator.supports(description.command)
)
class JvcSensor(JvcProjectorEntity, SensorEntity):
class JvcProjectorSensorEntity(JvcProjectorEntity, SensorEntity):
"""The entity class for JVC Projector integration."""
def __init__(
self,
coordinator: JvcProjectorDataUpdateCoordinator,
description: SensorEntityDescription,
description: JvcProjectorSensorDescription,
) -> None:
"""Initialize the JVC Projector sensor."""
super().__init__(coordinator)
super().__init__(coordinator, description.command)
self.command: type[Command] = description.command
self.entity_description = description
self._attr_unique_id = f"{coordinator.unique_id}_{description.key}"
self._attr_translation_key = description.key
self._attr_unique_id = f"{self._attr_unique_id}_{description.key}"
self._options_map: dict[str, str] = {}
if self.device_class == SensorDeviceClass.ENUM:
self._options_map = coordinator.get_options_map(self.command.name)
@property
def options(self) -> list[str] | None:
"""Return a set of possible options."""
if self.device_class == SensorDeviceClass.ENUM:
return list(self._options_map.values())
return None
@property
def native_value(self) -> str | None:
"""Return the native value."""
return self.coordinator.data[self.entity_description.key]
value = self.coordinator.data.get(self.command.name)
if value is None:
return None
if self.device_class == SensorDeviceClass.ENUM:
return self._options_map.get(value)
return value

View File

@@ -36,20 +36,134 @@
"entity": {
"binary_sensor": {
"power": {
"name": "[%key:component::binary_sensor::entity_component::power::name%]"
"name": "Power"
}
},
"select": {
"anamorphic": {
"name": "Anamorphic",
"state": {
"a": "A",
"b": "B",
"c": "C",
"d": "D",
"off": "[%key:common::state::off%]"
}
},
"clear_motion_drive": {
"name": "Clear Motion Drive",
"state": {
"high": "[%key:common::state::high%]",
"inverse-telecine": "Inverse Telecine",
"low": "[%key:common::state::low%]",
"off": "[%key:common::state::off%]"
}
},
"dynamic_control": {
"name": "Dynamic Control",
"state": {
"balanced": "Balanced",
"high": "[%key:common::state::high%]",
"low": "[%key:common::state::low%]",
"mode-1": "Mode 1",
"mode-2": "Mode 2",
"mode-3": "Mode 3",
"off": "[%key:common::state::off%]"
}
},
"input": {
"name": "Input",
"state": {
"hdmi1": "HDMI 1",
"hdmi2": "HDMI 2"
}
},
"installation_mode": {
"name": "Installation Mode",
"state": {
"memory-1": "Memory 1",
"memory-10": "Memory 10",
"memory-2": "Memory 2",
"memory-3": "Memory 3",
"memory-4": "Memory 4",
"memory-5": "Memory 5",
"memory-6": "Memory 6",
"memory-7": "Memory 7",
"memory-8": "Memory 8",
"memory-9": "Memory 9"
}
},
"light_power": {
"name": "Light Power",
"state": {
"high": "[%key:common::state::high%]",
"low": "[%key:common::state::low%]",
"mid": "[%key:common::state::medium%]",
"normal": "[%key:common::state::normal%]"
}
}
},
"sensor": {
"jvc_power_status": {
"color_depth": {
"name": "Color Depth",
"state": {
"8-bit": "8-bit",
"10-bit": "10-bit",
"12-bit": "12-bit"
}
},
"color_space": {
"name": "Color Space",
"state": {
"rgb": "RGB",
"xv-color": "XV Color",
"ycbcr-420": "YCbCr 4:2:0",
"ycbcr-422": "YCbCr 4:2:2",
"ycbcr-444": "YCbCr 4:4:4",
"yuv": "YUV"
}
},
"hdr": {
"name": "HDR",
"state": {
"hdr": "HDR",
"hdr10p": "HDR10+",
"hybrid-log": "Hybrid Log",
"none": "None",
"sdr": "SDR",
"smpte-st-2084": "SMPTE ST 2084"
}
},
"hdr_processing": {
"name": "HDR Processing",
"state": {
"frame-by-frame": "Frame-by-Frame",
"hdr10p": "HDR10+",
"scene-by-scene": "Scene-by-Scene",
"static": "Static"
}
},
"light_time": {
"name": "Light Time"
},
"picture_mode": {
"name": "Picture Mode",
"state": {
"frame-adapt-hdr": "Frame Adapt HDR",
"frame-adapt-hdr2": "Frame Adapt HDR2",
"frame-adapt-hdr3": "Frame Adapt HDR3",
"hdr1": "HDR1",
"hdr10": "HDR10",
"hdr10-ll": "HDR10 LL",
"hdr2": "HDR2",
"last-setting": "Last Setting",
"pana-pq": "Pana PQ",
"user-4": "User 4",
"user-5": "User 5",
"user-6": "User 6"
}
},
"power": {
"name": "Status",
"state": {
"cooling": "Cooling",

View File

@@ -2,19 +2,16 @@
from __future__ import annotations
import logging
import math
from typing import Any
from propcache.api import cached_property
from xknx.devices import Fan as XknxFan
from xknx.telegram.address import parse_device_group_address
from homeassistant import config_entries
from homeassistant.components.fan import FanEntity, FanEntityFeature
from homeassistant.const import CONF_ENTITY_CATEGORY, CONF_NAME, Platform
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import entity_registry as er
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import (
AddConfigEntryEntitiesCallback,
async_get_current_platform,
@@ -40,58 +37,6 @@ from .storage.const import (
)
from .storage.util import ConfigExtractor
_LOGGER = logging.getLogger(__name__)
@callback
def async_migrate_yaml_uids(
hass: HomeAssistant, platform_config: list[ConfigType]
) -> None:
"""Migrate entities unique_id for YAML switch-only fan entities."""
# issue was introduced in 2026.1 - this migration in 2026.2
ent_reg = er.async_get(hass)
invalid_uid = str(None)
if (
none_entity_id := ent_reg.async_get_entity_id(Platform.FAN, DOMAIN, invalid_uid)
) is None:
return
for config in platform_config:
if not config.get(KNX_ADDRESS) and (
new_uid_base := config.get(FanSchema.CONF_SWITCH_ADDRESS)
):
break
else:
_LOGGER.info(
"No YAML entry found to migrate fan entity '%s' unique_id from '%s'. Removing entry",
none_entity_id,
invalid_uid,
)
ent_reg.async_remove(none_entity_id)
return
new_uid = str(
parse_device_group_address(
new_uid_base[0], # list of group addresses - first item is sending address
)
)
try:
ent_reg.async_update_entity(none_entity_id, new_unique_id=str(new_uid))
_LOGGER.info(
"Migrating fan entity '%s' unique_id from '%s' to %s",
none_entity_id,
invalid_uid,
new_uid,
)
except ValueError:
# New unique_id already exists - remove invalid entry. User might have changed YAML
_LOGGER.info(
"Failed to migrate fan entity '%s' unique_id from '%s' to '%s'. "
"Removing the invalid entry",
none_entity_id,
invalid_uid,
new_uid,
)
ent_reg.async_remove(none_entity_id)
async def async_setup_entry(
hass: HomeAssistant,
@@ -112,7 +57,6 @@ async def async_setup_entry(
entities: list[_KnxFan] = []
if yaml_platform_config := knx_module.config_yaml.get(Platform.FAN):
async_migrate_yaml_uids(hass, yaml_platform_config)
entities.extend(
KnxYamlFan(knx_module, entity_config)
for entity_config in yaml_platform_config
@@ -233,10 +177,7 @@ class KnxYamlFan(_KnxFan, KnxYamlEntity):
self._step_range: tuple[int, int] | None = (1, max_step) if max_step else None
self._attr_entity_category = config.get(CONF_ENTITY_CATEGORY)
if self._device.speed.group_address:
self._attr_unique_id = str(self._device.speed.group_address)
else:
self._attr_unique_id = str(self._device.switch.group_address)
self._attr_unique_id = str(self._device.speed.group_address)
class KnxUiFan(_KnxFan, KnxUiEntity):

View File

@@ -0,0 +1,67 @@
"""The liebherr integration."""
from __future__ import annotations
import asyncio
from pyliebherrhomeapi import LiebherrClient
from pyliebherrhomeapi.exceptions import (
LiebherrAuthenticationError,
LiebherrConnectionError,
)
from homeassistant.const import CONF_API_KEY, Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .coordinator import LiebherrConfigEntry, LiebherrCoordinator
PLATFORMS: list[Platform] = [Platform.SENSOR]
async def async_setup_entry(hass: HomeAssistant, entry: LiebherrConfigEntry) -> bool:
"""Set up Liebherr from a config entry."""
# Create shared API client
client = LiebherrClient(
api_key=entry.data[CONF_API_KEY],
session=async_get_clientsession(hass),
)
# Fetch device list to create coordinators
try:
devices = await client.get_devices()
except LiebherrAuthenticationError as err:
raise ConfigEntryAuthFailed("Invalid API key") from err
except LiebherrConnectionError as err:
raise ConfigEntryNotReady(f"Failed to connect to Liebherr API: {err}") from err
# Create a coordinator for each device (may be empty if no devices)
coordinators: dict[str, LiebherrCoordinator] = {}
for device in devices:
coordinator = LiebherrCoordinator(
hass=hass,
config_entry=entry,
client=client,
device_id=device.device_id,
)
coordinators[device.device_id] = coordinator
await asyncio.gather(
*(
coordinator.async_config_entry_first_refresh()
for coordinator in coordinators.values()
)
)
# Store coordinators in runtime data
entry.runtime_data = coordinators
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True
async def async_unload_entry(hass: HomeAssistant, entry: LiebherrConfigEntry) -> bool:
"""Unload a config entry."""
return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)

View File
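Aside: the setup above creates one coordinator per appliance and runs the first refresh for all of them concurrently. A plain-asyncio sketch of that shape, using a stub class rather than Home Assistant's DataUpdateCoordinator:

```python
import asyncio


class StubCoordinator:
    """Stand-in for a per-device update coordinator."""

    def __init__(self, device_id: str) -> None:
        self.device_id = device_id
        self.data: dict | None = None

    async def async_config_entry_first_refresh(self) -> None:
        await asyncio.sleep(0)  # pretend to fetch the initial device state
        self.data = {"device_id": self.device_id, "online": True}


async def setup_coordinators(device_ids: list[str]) -> dict[str, StubCoordinator]:
    coordinators = {device_id: StubCoordinator(device_id) for device_id in device_ids}
    # First refresh runs for all devices in parallel instead of sequentially.
    await asyncio.gather(
        *(c.async_config_entry_first_refresh() for c in coordinators.values())
    )
    return coordinators


coordinators = asyncio.run(setup_coordinators(["fridge-1", "wine-cabinet-2"]))
print({device_id: c.data for device_id, c in coordinators.items()})
```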

@@ -0,0 +1,103 @@
"""Config flow for the liebherr integration."""
from __future__ import annotations
from collections.abc import Mapping
import logging
from typing import Any
from pyliebherrhomeapi import LiebherrClient
from pyliebherrhomeapi.exceptions import (
LiebherrAuthenticationError,
LiebherrConnectionError,
)
import voluptuous as vol
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_API_KEY
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .const import DOMAIN
_LOGGER = logging.getLogger(__name__)
STEP_USER_DATA_SCHEMA = vol.Schema(
{
vol.Required(CONF_API_KEY): str,
}
)
class LiebherrConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle a config flow for liebherr."""
async def _validate_api_key(self, api_key: str) -> tuple[list, dict[str, str]]:
"""Validate the API key and return devices and errors."""
errors: dict[str, str] = {}
devices: list = []
client = LiebherrClient(
api_key=api_key,
session=async_get_clientsession(self.hass),
)
try:
devices = await client.get_devices()
except LiebherrAuthenticationError:
errors["base"] = "invalid_auth"
except LiebherrConnectionError:
errors["base"] = "cannot_connect"
except Exception:
_LOGGER.exception("Unexpected exception")
errors["base"] = "unknown"
return devices, errors
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle the initial step."""
errors: dict[str, str] = {}
if user_input is not None:
user_input[CONF_API_KEY] = user_input[CONF_API_KEY].strip()
self._async_abort_entries_match({CONF_API_KEY: user_input[CONF_API_KEY]})
devices, errors = await self._validate_api_key(user_input[CONF_API_KEY])
if not errors:
if not devices:
return self.async_abort(reason="no_devices")
return self.async_create_entry(
title="Liebherr",
data=user_input,
)
return self.async_show_form(
step_id="user", data_schema=STEP_USER_DATA_SCHEMA, errors=errors
)
async def async_step_reauth(
self, entry_data: Mapping[str, Any]
) -> ConfigFlowResult:
"""Handle re-authentication."""
return await self.async_step_reauth_confirm()
async def async_step_reauth_confirm(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Confirm re-authentication."""
errors: dict[str, str] = {}
if user_input is not None:
api_key = user_input[CONF_API_KEY].strip()
_, errors = await self._validate_api_key(api_key)
if not errors:
return self.async_update_reload_and_abort(
self._get_reauth_entry(),
data_updates={CONF_API_KEY: api_key},
)
return self.async_show_form(
step_id="reauth_confirm",
data_schema=STEP_USER_DATA_SCHEMA,
errors=errors,
)

View File
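Aside: the config flow above funnels both the initial step and the re-auth step through one validation helper that returns the device list plus an error map. A framework-free sketch of that pattern, with PermissionError and ConnectionError standing in for the library's auth and connection errors:

```python
def validate_api_key(api_key: str, fetch_devices) -> tuple[list[str], dict[str, str]]:
    """Shared validation used by both the initial and the re-auth step."""
    errors: dict[str, str] = {}
    devices: list[str] = []
    try:
        devices = fetch_devices(api_key)
    except PermissionError:
        errors["base"] = "invalid_auth"
    except ConnectionError:
        errors["base"] = "cannot_connect"
    return devices, errors


def fake_fetch(api_key: str) -> list[str]:
    """Toy device lookup used only for this sketch."""
    if api_key != "good-key":
        raise PermissionError
    return ["fridge-1"]


assert validate_api_key("good-key", fake_fetch) == (["fridge-1"], {})
assert validate_api_key("bad-key", fake_fetch) == ([], {"base": "invalid_auth"})
```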

@@ -0,0 +1,6 @@
"""Constants for the liebherr integration."""
from typing import Final
DOMAIN: Final = "liebherr"
MANUFACTURER: Final = "Liebherr"

View File

@@ -0,0 +1,79 @@
"""DataUpdateCoordinator for Liebherr integration."""
from __future__ import annotations
from datetime import timedelta
import logging
from pyliebherrhomeapi import (
DeviceState,
LiebherrAuthenticationError,
LiebherrClient,
LiebherrConnectionError,
LiebherrTimeoutError,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import (
ConfigEntryAuthFailed,
ConfigEntryError,
ConfigEntryNotReady,
)
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import DOMAIN
type LiebherrConfigEntry = ConfigEntry[dict[str, LiebherrCoordinator]]
_LOGGER = logging.getLogger(__name__)
SCAN_INTERVAL = timedelta(seconds=60)
class LiebherrCoordinator(DataUpdateCoordinator[DeviceState]):
"""Class to manage fetching Liebherr data from the API for a single device."""
def __init__(
self,
hass: HomeAssistant,
config_entry: LiebherrConfigEntry,
client: LiebherrClient,
device_id: str,
) -> None:
"""Initialize coordinator."""
super().__init__(
hass,
logger=_LOGGER,
name=f"{DOMAIN}_{device_id}",
update_interval=SCAN_INTERVAL,
config_entry=config_entry,
)
self.client = client
self.device_id = device_id
async def _async_setup(self) -> None:
"""Set up the coordinator by validating device access."""
try:
await self.client.get_device(self.device_id)
except LiebherrAuthenticationError as err:
raise ConfigEntryError("Invalid API key") from err
except LiebherrConnectionError as err:
raise ConfigEntryNotReady(
f"Failed to connect to device {self.device_id}: {err}"
) from err
async def _async_update_data(self) -> DeviceState:
"""Fetch data from API for this device."""
try:
return await self.client.get_device_state(self.device_id)
except LiebherrAuthenticationError as err:
raise ConfigEntryAuthFailed("API key is no longer valid") from err
except LiebherrTimeoutError as err:
raise UpdateFailed(
f"Timeout communicating with device {self.device_id}"
) from err
except LiebherrConnectionError as err:
raise UpdateFailed(
f"Error communicating with device {self.device_id}"
) from err

View File

@@ -0,0 +1,75 @@
"""Base entity for Liebherr integration."""
from __future__ import annotations
from pyliebherrhomeapi import TemperatureControl, ZonePosition
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN, MANUFACTURER
from .coordinator import LiebherrCoordinator
# Zone position to translation key mapping
ZONE_POSITION_MAP = {
ZonePosition.TOP: "top_zone",
ZonePosition.MIDDLE: "middle_zone",
ZonePosition.BOTTOM: "bottom_zone",
}
class LiebherrEntity(CoordinatorEntity[LiebherrCoordinator]):
"""Base entity for Liebherr devices."""
_attr_has_entity_name = True
def __init__(
self,
coordinator: LiebherrCoordinator,
) -> None:
"""Initialize the Liebherr entity."""
super().__init__(coordinator)
device = coordinator.data.device
model = None
if device.device_type:
model = device.device_type.title()
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, coordinator.device_id)},
name=device.nickname or device.device_name,
manufacturer=MANUFACTURER,
model=model,
model_id=device.device_name,
)
class LiebherrZoneEntity(LiebherrEntity):
"""Base entity for zone-based Liebherr entities.
This class should be used for entities that are associated with a specific
temperature control zone (e.g., climate, zone sensors).
"""
def __init__(
self,
coordinator: LiebherrCoordinator,
zone_id: int,
) -> None:
"""Initialize the zone entity."""
super().__init__(coordinator)
self._zone_id = zone_id
@property
def temperature_control(self) -> TemperatureControl | None:
"""Get the temperature control for this zone."""
return self.coordinator.data.get_temperature_controls().get(self._zone_id)
def _get_zone_translation_key(self) -> str | None:
"""Get the translation key for this zone."""
control = self.temperature_control
if control and isinstance(control.zone_position, ZonePosition):
return ZONE_POSITION_MAP.get(control.zone_position)
# Fallback to None to use device model name
return None

View File

@@ -0,0 +1,18 @@
{
"domain": "liebherr",
"name": "Liebherr",
"codeowners": ["@mettolen"],
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/liebherr",
"integration_type": "hub",
"iot_class": "cloud_polling",
"loggers": ["pyliebherrhomeapi"],
"quality_scale": "bronze",
"requirements": ["pyliebherrhomeapi==0.2.1"],
"zeroconf": [
{
"name": "liebherr*",
"type": "_http._tcp.local."
}
]
}

View File

@@ -0,0 +1,74 @@
rules:
# Bronze
action-setup:
status: exempt
comment: Integration does not register custom actions.
appropriate-polling: done
brands: done
common-modules: done
config-flow-test-coverage: done
config-flow: done
dependency-transparency: done
docs-actions:
status: exempt
comment: Integration does not register custom actions.
docs-high-level-description: done
docs-installation-instructions: done
docs-removal-instructions: done
entity-event-setup: done
entity-unique-id: done
has-entity-name: done
runtime-data: done
test-before-configure: done
test-before-setup: done
unique-config-entry: done
# Silver
action-exceptions:
status: exempt
comment: Integration does not register custom actions.
config-entry-unloading: done
docs-configuration-parameters:
status: exempt
comment: Integration has no configurable parameters after initial setup.
docs-installation-parameters: done
entity-unavailable: done
integration-owner: done
log-when-unavailable: todo
parallel-updates: done
reauthentication-flow: done
test-coverage: done
# Gold
devices: done
diagnostics: todo
discovery-update-info:
status: exempt
comment: Cloud API does not require updating entry data from network discovery.
discovery: done
docs-data-update: done
docs-examples: todo
docs-known-limitations: done
docs-supported-devices: done
docs-supported-functions: done
docs-troubleshooting: done
docs-use-cases: done
dynamic-devices: todo
entity-category: done
entity-device-class: todo
entity-disabled-by-default: todo
entity-translations: done
exception-translations: todo
icon-translations: todo
reconfiguration-flow:
status: exempt
comment: The only configuration option is the API key, which is handled by the reauthentication flow.
repair-issues:
status: exempt
comment: No repair issues to implement at this time.
stale-devices: todo
# Platinum
async-dependency: done
inject-websession: done
strict-typing: todo

View File

@@ -0,0 +1,118 @@
"""Sensor platform for Liebherr integration."""
from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass
from pyliebherrhomeapi import TemperatureControl, TemperatureUnit
from homeassistant.components.sensor import (
SensorDeviceClass,
SensorEntity,
SensorEntityDescription,
SensorStateClass,
)
from homeassistant.const import UnitOfTemperature
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import StateType
from .coordinator import LiebherrConfigEntry, LiebherrCoordinator
from .entity import LiebherrZoneEntity
PARALLEL_UPDATES = 0
@dataclass(frozen=True, kw_only=True)
class LiebherrSensorEntityDescription(SensorEntityDescription):
"""Describes Liebherr sensor entity."""
value_fn: Callable[[TemperatureControl], StateType]
unit_fn: Callable[[TemperatureControl], str]
SENSOR_TYPES: tuple[LiebherrSensorEntityDescription, ...] = (
LiebherrSensorEntityDescription(
key="temperature",
device_class=SensorDeviceClass.TEMPERATURE,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
value_fn=lambda control: control.value,
unit_fn=lambda control: (
UnitOfTemperature.FAHRENHEIT
if control.unit == TemperatureUnit.FAHRENHEIT
else UnitOfTemperature.CELSIUS
),
),
)
async def async_setup_entry(
hass: HomeAssistant,
entry: LiebherrConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Liebherr sensor entities."""
coordinators = entry.runtime_data
entities: list[LiebherrSensor] = []
for coordinator in coordinators.values():
# Get all temperature controls for this device
temp_controls = coordinator.data.get_temperature_controls()
for temp_control in temp_controls.values():
entities.extend(
LiebherrSensor(
coordinator=coordinator,
zone_id=temp_control.zone_id,
description=description,
)
for description in SENSOR_TYPES
)
async_add_entities(entities)
class LiebherrSensor(LiebherrZoneEntity, SensorEntity):
"""Representation of a Liebherr sensor."""
entity_description: LiebherrSensorEntityDescription
def __init__(
self,
coordinator: LiebherrCoordinator,
zone_id: int,
description: LiebherrSensorEntityDescription,
) -> None:
"""Initialize the sensor entity."""
super().__init__(coordinator, zone_id)
self.entity_description = description
self._attr_unique_id = f"{coordinator.device_id}_{description.key}_{zone_id}"
# If device has only one zone, use model name instead of zone name
temp_controls = coordinator.data.get_temperature_controls()
if len(temp_controls) == 1:
self._attr_name = None
else:
# Set translation key based on zone position for multi-zone devices
self._attr_translation_key = self._get_zone_translation_key()
@property
def native_unit_of_measurement(self) -> str | None:
"""Return the unit of measurement."""
if (temp_control := self.temperature_control) is None:
return None
return self.entity_description.unit_fn(temp_control)
@property
def native_value(self) -> StateType:
"""Return the current value."""
if (temp_control := self.temperature_control) is None:
return None
return self.entity_description.value_fn(temp_control)
@property
def available(self) -> bool:
"""Return if entity is available."""
return super().available and self.temperature_control is not None

View File
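Aside: the sensor description above pairs a `value_fn` with a `unit_fn` so each zone reports in whatever unit the appliance is configured for. A minimal standalone sketch of that shape (simplified types, not the pyliebherrhomeapi models):

```python
from collections.abc import Callable
from dataclasses import dataclass


@dataclass(frozen=True)
class Control:
    """Simplified stand-in for a temperature control zone."""

    value: float
    unit: str  # "celsius" or "fahrenheit"


@dataclass(frozen=True, kw_only=True)
class SensorDescription:
    key: str
    value_fn: Callable[[Control], float | None]
    unit_fn: Callable[[Control], str]


TEMPERATURE = SensorDescription(
    key="temperature",
    value_fn=lambda control: control.value,
    unit_fn=lambda control: "°F" if control.unit == "fahrenheit" else "°C",
)

zone = Control(value=4.0, unit="celsius")
print(TEMPERATURE.key, TEMPERATURE.value_fn(zone), TEMPERATURE.unit_fn(zone))
```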

@@ -0,0 +1,48 @@
{
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"no_devices": "No devices found for this API key",
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]"
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"invalid_auth": "[%key:common::config_flow::error::invalid_auth%]",
"no_devices": "No devices found for this API key",
"unknown": "[%key:common::config_flow::error::unknown%]"
},
"step": {
"reauth_confirm": {
"data": {
"api_key": "[%key:common::config_flow::data::api_key%]"
},
"data_description": {
"api_key": "[%key:component::liebherr::config::step::user::data_description::api_key%]"
},
"description": "Your API key is no longer valid. Please enter a new API key to continue."
},
"user": {
"data": {
"api_key": "[%key:common::config_flow::data::api_key%]"
},
"data_description": {
"api_key": "The API key from the Liebherr SmartDevice app. Note: The API key can only be copied once from the app."
},
"description": "Enter your Liebherr HomeAPI key. You can find it in the Liebherr SmartDevice app under Settings → Become a beta tester."
}
}
},
"entity": {
"sensor": {
"bottom_zone": {
"name": "Bottom zone"
},
"middle_zone": {
"name": "Middle zone"
},
"top_zone": {
"name": "Top zone"
}
}
}
}

View File

@@ -73,6 +73,3 @@ LIFX_CEILING_PRODUCT_IDS = {176, 177, 201, 202}
LIFX_128ZONE_CEILING_PRODUCT_IDS = {201, 202}
_LOGGER = logging.getLogger(__package__)
# _ATTR_COLOR_TEMP deprecated - to be removed in 2026.1
_ATTR_COLOR_TEMP = "color_temp"

View File

@@ -33,7 +33,7 @@ from homeassistant.helpers.target import (
async_extract_referenced_entity_ids,
)
from .const import _ATTR_COLOR_TEMP, ATTR_THEME, DOMAIN
from .const import ATTR_THEME, DOMAIN
from .coordinator import LIFXUpdateCoordinator
from .util import convert_8_to_16, find_hsbk
@@ -135,8 +135,6 @@ LIFX_EFFECT_PULSE_SCHEMA = cv.make_entity_service_schema(
vol.Exclusive(ATTR_COLOR_TEMP_KELVIN, COLOR_GROUP): vol.All(
vol.Coerce(int), vol.Range(min=1500, max=9000)
),
# _ATTR_COLOR_TEMP deprecated - to be removed in 2026.1
vol.Exclusive(_ATTR_COLOR_TEMP, COLOR_GROUP): cv.positive_int,
ATTR_PERIOD: vol.All(vol.Coerce(float), vol.Range(min=0.05)),
ATTR_CYCLES: vol.All(vol.Coerce(float), vol.Range(min=1)),
ATTR_MODE: vol.In(PULSE_MODES),

View File

@@ -26,7 +26,6 @@ from homeassistant.helpers import device_registry as dr
from homeassistant.util import color as color_util
from .const import (
_ATTR_COLOR_TEMP,
_LOGGER,
DEFAULT_ATTEMPTS,
DOMAIN,
@@ -115,17 +114,6 @@ def find_hsbk(hass: HomeAssistant, **kwargs: Any) -> list[float | int | None] |
saturation = int(saturation / 100 * 65535)
kelvin = 3500
if ATTR_COLOR_TEMP_KELVIN not in kwargs and _ATTR_COLOR_TEMP in kwargs:
# added in 2025.1, can be removed in 2026.1
_LOGGER.warning(
"The 'color_temp' parameter is deprecated. Please use 'color_temp_kelvin' for"
" all service calls"
)
kelvin = color_util.color_temperature_mired_to_kelvin(
kwargs.pop(_ATTR_COLOR_TEMP)
)
saturation = 0
if ATTR_COLOR_TEMP_KELVIN in kwargs:
kelvin = kwargs.pop(ATTR_COLOR_TEMP_KELVIN)
saturation = 0

View File

@@ -7,5 +7,5 @@
"documentation": "https://www.home-assistant.io/integrations/local_calendar",
"iot_class": "local_polling",
"loggers": ["ical"],
"requirements": ["ical==12.1.3"]
"requirements": ["ical==12.1.2"]
}

View File

@@ -5,5 +5,5 @@
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/local_todo",
"iot_class": "local_polling",
"requirements": ["ical==12.1.3"]
"requirements": ["ical==12.1.2"]
}

View File

@@ -24,7 +24,7 @@ from .const import DOMAIN, MEDIA_CLASS_MAP, MEDIA_MIME_TYPES, MEDIA_SOURCE_DATA
from .error import Unresolvable
from .models import BrowseMediaSource, MediaSource, MediaSourceItem, PlayMedia
MAX_UPLOAD_SIZE = 1024 * 1024 * 10
MAX_UPLOAD_SIZE = 1024 * 1024 * 20
LOGGER = logging.getLogger(__name__)

View File

@@ -7,5 +7,5 @@
"integration_type": "service",
"iot_class": "cloud_polling",
"loggers": ["meteoclimatic"],
"requirements": ["pymeteoclimatic==0.1.1"]
"requirements": ["pymeteoclimatic==0.1.0"]
}

View File

@@ -722,7 +722,7 @@ POLLED_SENSOR_TYPES: Final[tuple[MieleSensorDefinition[MieleFillingLevel], ...]]
description=MieleSensorDescription[MieleFillingLevel](
key="power_disk_level",
translation_key="power_disk_level",
value_fn=lambda value: value.power_disc_filling_level,
value_fn=lambda value: None,
native_unit_of_measurement=PERCENTAGE,
entity_category=EntityCategory.DIAGNOSTIC,
),

View File

@@ -11,6 +11,7 @@ import voluptuous as vol
from homeassistant.components.sensor import (
PLATFORM_SCHEMA as SENSOR_PLATFORM_SCHEMA,
SensorDeviceClass,
SensorEntity,
SensorStateClass,
)
@@ -25,7 +26,9 @@ from homeassistant.const import (
STATE_UNKNOWN,
)
from homeassistant.core import Event, EventStateChangedData, HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import config_validation as cv, entity_registry as er
from homeassistant.helpers.entity import get_device_class
from homeassistant.helpers.entity_platform import (
AddConfigEntryEntitiesCallback,
AddEntitiesCallback,
@@ -259,6 +262,7 @@ class MinMaxSensor(SensorEntity):
)
self._async_min_max_sensor_state_listener(state_event, update_state=False)
self._update_device_class()
self._calc_values()
@property
@@ -345,6 +349,32 @@ class MinMaxSensor(SensorEntity):
self._calc_values()
self.async_write_ha_state()
@callback
def _update_device_class(self) -> None:
"""Update device_class based on source entities.
If all source entities have the same device_class, inherit it.
Otherwise, leave device_class as None.
"""
device_classes: list[SensorDeviceClass | None] = []
for entity_id in self._entity_ids:
try:
device_class = get_device_class(self.hass, entity_id)
if device_class:
device_classes.append(SensorDeviceClass(device_class))
else:
device_classes.append(None)
except (HomeAssistantError, ValueError):
# If we can't get device class for any entity, don't set it
device_classes.append(None)
# Only inherit device_class if all entities have the same non-None device_class
if device_classes and all(
dc is not None and dc == device_classes[0] for dc in device_classes
):
self._attr_device_class = device_classes[0]
@callback
def _calc_values(self) -> None:
"""Calculate the values."""

View File
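Aside: the inheritance rule above reduces to "all source entities agree on a non-None device class". A standalone check of just that predicate:

```python
def inherited_device_class(device_classes: list[str | None]) -> str | None:
    """Return the shared device class, or None if sources disagree or lack one."""
    if device_classes and all(
        dc is not None and dc == device_classes[0] for dc in device_classes
    ):
        return device_classes[0]
    return None


assert inherited_device_class(["temperature", "temperature"]) == "temperature"
assert inherited_device_class(["temperature", "humidity"]) is None
assert inherited_device_class(["temperature", None]) is None
assert inherited_device_class([]) is None
```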

@@ -9,10 +9,7 @@ from homeassistant.config_entries import ConfigEntry
from homeassistant.const import Platform
from homeassistant.core import Event, HomeAssistant, callback
from homeassistant.helpers import entity_registry as er
from homeassistant.helpers.device import (
async_entity_id_to_device_id,
async_remove_stale_devices_links_keep_entity_device,
)
from homeassistant.helpers.device import async_entity_id_to_device_id
from homeassistant.helpers.event import async_track_entity_registry_updated_event
from homeassistant.helpers.helper_integration import (
async_handle_source_entity_changes,
@@ -29,11 +26,6 @@ _LOGGER = logging.getLogger(__name__)
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Set up Mold indicator from a config entry."""
# This can be removed in HA Core 2026.2
async_remove_stale_devices_links_keep_entity_device(
hass, entry.entry_id, entry.options[CONF_INDOOR_HUMIDITY]
)
def set_source_entity_id_or_uuid(source_entity_id: str) -> None:
hass.config_entries.async_update_entry(
entry,

View File

@@ -14,6 +14,7 @@ from onedrive_personal_sdk.exceptions import (
NotFoundError,
OneDriveException,
)
from onedrive_personal_sdk.models.items import ItemUpdate
from homeassistant.const import CONF_ACCESS_TOKEN, Platform
from homeassistant.core import HomeAssistant
@@ -71,6 +72,15 @@ async def async_setup_entry(hass: HomeAssistant, entry: OneDriveConfigEntry) ->
entry, data={**entry.data, CONF_FOLDER_ID: backup_folder.id}
)
# write instance id to description
if backup_folder.description != (instance_id := await async_get_instance_id(hass)):
await _handle_item_operation(
lambda: client.update_drive_item(
backup_folder.id, ItemUpdate(description=instance_id)
),
folder_name,
)
# update in case folder was renamed manually inside OneDrive
if backup_folder.name != entry.data[CONF_FOLDER_NAME]:
hass.config_entries.async_update_entry(
@@ -112,11 +122,7 @@ async def async_unload_entry(hass: HomeAssistant, entry: OneDriveConfigEntry) ->
async def _migrate_backup_files(client: OneDriveClient, backup_folder_id: str) -> None:
"""Migrate backup files from metadata version 1 to version 2.
Version 1: Backup metadata was stored in the backup file's description field.
Version 2: Backup metadata is stored in a separate .metadata.json file.
"""
"""Migrate backup files to metadata version 2."""
files = await client.list_drive_items(backup_folder_id)
for file in files:
if file.description and '"metadata_version": 1' in (
@@ -125,11 +131,24 @@ async def _migrate_backup_files(client: OneDriveClient, backup_folder_id: str) -
metadata = loads(metadata_json)
del metadata["metadata_version"]
metadata_filename = file.name.rsplit(".", 1)[0] + ".metadata.json"
await client.upload_file(
metadata_file = await client.upload_file(
backup_folder_id,
metadata_filename,
dumps(metadata),
)
metadata_description = {
"metadata_version": 2,
"backup_id": metadata["backup_id"],
"backup_file_id": file.id,
}
await client.update_drive_item(
path_or_id=metadata_file.id,
data=ItemUpdate(description=dumps(metadata_description)),
)
await client.update_drive_item(
path_or_id=file.id,
data=ItemUpdate(description=""),
)
_LOGGER.debug("Migrated backup file %s", file.name)

View File

@@ -3,7 +3,10 @@
from __future__ import annotations
from collections.abc import AsyncIterator, Callable, Coroutine
from dataclasses import dataclass
from functools import wraps
from html import unescape
from json import dumps, loads
import logging
from time import time
from typing import Any, Concatenate
@@ -15,6 +18,7 @@ from onedrive_personal_sdk.exceptions import (
HashMismatchError,
OneDriveException,
)
from onedrive_personal_sdk.models.items import ItemUpdate
from onedrive_personal_sdk.models.upload import FileInfo
from homeassistant.components.backup import (
@@ -26,8 +30,6 @@ from homeassistant.components.backup import (
)
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.json import json_dumps
from homeassistant.util.json import json_loads_object
from .const import CONF_DELETE_PERMANENTLY, DATA_BACKUP_AGENT_LISTENERS, DOMAIN
from .coordinator import OneDriveConfigEntry
@@ -36,6 +38,7 @@ _LOGGER = logging.getLogger(__name__)
MAX_CHUNK_SIZE = 60 * 1024 * 1024 # largest chunk possible, must be <= 60 MiB
TARGET_CHUNKS = 20
TIMEOUT = ClientTimeout(connect=10, total=43200) # 12 hours
METADATA_VERSION = 2
CACHE_TTL = 300
@@ -101,10 +104,13 @@ def handle_backup_errors[_R, **P](
return wrapper
def suggested_filenames(backup: AgentBackup) -> tuple[str, str]:
"""Return the suggested filenames for the backup and metadata."""
base_name = suggested_filename(backup).rsplit(".", 1)[0]
return f"{base_name}.tar", f"{base_name}.metadata.json"
@dataclass(kw_only=True)
class OneDriveBackup:
"""Define a OneDrive backup."""
backup: AgentBackup
backup_file_id: str
metadata_file_id: str
class OneDriveBackupAgent(BackupAgent):
@@ -123,7 +129,7 @@ class OneDriveBackupAgent(BackupAgent):
self.name = entry.title
assert entry.unique_id
self.unique_id = entry.unique_id
self._cache_backup_metadata: dict[str, AgentBackup] = {}
self._backup_cache: dict[str, OneDriveBackup] = {}
self._cache_expiration = time()
@handle_backup_errors
@@ -131,11 +137,12 @@ class OneDriveBackupAgent(BackupAgent):
self, backup_id: str, **kwargs: Any
) -> AsyncIterator[bytes]:
"""Download a backup file."""
backup = await self._find_backup_by_id(backup_id)
backup_filename, _ = suggested_filenames(backup)
backups = await self._list_cached_backups()
if backup_id not in backups:
raise BackupNotFound(f"Backup {backup_id} not found")
stream = await self._client.download_drive_item(
f"{self._folder_id}:/{backup_filename}:", timeout=TIMEOUT
backups[backup_id].backup_file_id, timeout=TIMEOUT
)
return stream.iter_chunked(1024)
@@ -148,9 +155,9 @@ class OneDriveBackupAgent(BackupAgent):
**kwargs: Any,
) -> None:
"""Upload a backup."""
backup_filename, metadata_filename = suggested_filenames(backup)
filename = suggested_filename(backup)
file = FileInfo(
backup_filename,
filename,
backup.size,
self._folder_id,
await open_stream(),
@@ -166,7 +173,7 @@ class OneDriveBackupAgent(BackupAgent):
upload_chunk_size = max(upload_chunk_size, 320 * 1024)
try:
await LargeFileUploadClient.upload(
backup_file = await LargeFileUploadClient.upload(
self._token_function,
file,
upload_chunk_size=upload_chunk_size,
@@ -178,27 +185,35 @@ class OneDriveBackupAgent(BackupAgent):
"Hash validation failed, backup file might be corrupt"
) from err
_LOGGER.debug("Uploaded backup to %s", backup_filename)
# Store metadata in separate metadata file (just backup.as_dict(), no extra fields)
metadata_content = json_dumps(backup.as_dict())
# store metadata in metadata file
description = dumps(backup.as_dict())
_LOGGER.debug("Creating metadata: %s", description)
metadata_filename = filename.rsplit(".", 1)[0] + ".metadata.json"
try:
await self._client.upload_file(
metadata_file = await self._client.upload_file(
self._folder_id,
metadata_filename,
metadata_content,
description,
)
except OneDriveException:
# Clean up the backup file if metadata upload fails
_LOGGER.debug(
"Uploading metadata failed, deleting backup file %s", backup_filename
)
await self._client.delete_drive_item(
f"{self._folder_id}:/{backup_filename}:"
)
await self._client.delete_drive_item(backup_file.id)
raise
_LOGGER.debug("Uploaded metadata file %s", metadata_filename)
# add metadata to the metadata file
metadata_description = {
"metadata_version": METADATA_VERSION,
"backup_id": backup.backup_id,
"backup_file_id": backup_file.id,
}
try:
await self._client.update_drive_item(
path_or_id=metadata_file.id,
data=ItemUpdate(description=dumps(metadata_description)),
)
except OneDriveException:
await self._client.delete_drive_item(backup_file.id)
await self._client.delete_drive_item(metadata_file.id)
raise
self._cache_expiration = time()
@handle_backup_errors
@@ -208,63 +223,66 @@ class OneDriveBackupAgent(BackupAgent):
**kwargs: Any,
) -> None:
"""Delete a backup file."""
backup = await self._find_backup_by_id(backup_id)
backup_filename, metadata_filename = suggested_filenames(backup)
backups = await self._list_cached_backups()
if backup_id not in backups:
raise BackupNotFound(f"Backup {backup_id} not found")
backup = backups[backup_id]
delete_permanently = self._entry.options.get(CONF_DELETE_PERMANENTLY, False)
await self._client.delete_drive_item(backup.backup_file_id, delete_permanently)
await self._client.delete_drive_item(
f"{self._folder_id}:/{backup_filename}:", delete_permanently
backup.metadata_file_id, delete_permanently
)
await self._client.delete_drive_item(
f"{self._folder_id}:/{metadata_filename}:", delete_permanently
)
_LOGGER.debug("Deleted backup %s", backup_filename)
self._cache_expiration = time()
@handle_backup_errors
async def async_list_backups(self, **kwargs: Any) -> list[AgentBackup]:
"""List backups."""
return list((await self._list_cached_metadata_files()).values())
return [
backup.backup for backup in (await self._list_cached_backups()).values()
]
@handle_backup_errors
async def async_get_backup(self, backup_id: str, **kwargs: Any) -> AgentBackup:
"""Return a backup."""
return await self._find_backup_by_id(backup_id)
backups = await self._list_cached_backups()
if backup_id not in backups:
raise BackupNotFound(f"Backup {backup_id} not found")
return backups[backup_id].backup
async def _list_cached_metadata_files(self) -> dict[str, AgentBackup]:
"""List metadata files with a cache."""
async def _list_cached_backups(self) -> dict[str, OneDriveBackup]:
"""List backups with a cache."""
if time() <= self._cache_expiration:
return self._cache_backup_metadata
return self._backup_cache
async def _download_metadata(item_id: str) -> AgentBackup | None:
"""Download metadata file."""
items = await self._client.list_drive_items(self._folder_id)
async def download_backup_metadata(item_id: str) -> AgentBackup | None:
try:
metadata_stream = await self._client.download_drive_item(item_id)
except OneDriveException as err:
_LOGGER.warning("Error downloading metadata for %s: %s", item_id, err)
return None
metadata_json = loads(await metadata_stream.read())
return AgentBackup.from_dict(metadata_json)
return AgentBackup.from_dict(
json_loads_object(await metadata_stream.read())
)
items = await self._client.list_drive_items(self._folder_id)
metadata_files: dict[str, AgentBackup] = {}
backups: dict[str, OneDriveBackup] = {}
for item in items:
if item.name and item.name.endswith(".metadata.json"):
if metadata := await _download_metadata(item.id):
metadata_files[metadata.backup_id] = metadata
if item.description and f'"metadata_version": {METADATA_VERSION}' in (
metadata_description_json := unescape(item.description)
):
backup = await download_backup_metadata(item.id)
if backup is None:
continue
metadata_description = loads(metadata_description_json)
backups[backup.backup_id] = OneDriveBackup(
backup=backup,
backup_file_id=metadata_description["backup_file_id"],
metadata_file_id=item.id,
)
self._cache_backup_metadata = metadata_files
self._cache_expiration = time() + CACHE_TTL
return self._cache_backup_metadata
async def _find_backup_by_id(self, backup_id: str) -> AgentBackup:
"""Find a backup by its backup ID on remote."""
metadata_files = await self._list_cached_metadata_files()
if backup := metadata_files.get(backup_id):
return backup
raise BackupNotFound(f"Backup {backup_id} not found")
self._backup_cache = backups
return backups

View File
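Aside: the version-2 scheme above (also written by the migration earlier) tags the small `.metadata.json` drive item with a description that links it to the large backup file, so listing backups only downloads the tagged metadata items. A plain sketch of that payload and the cheap pre-filter, with no OneDrive calls:

```python
from json import dumps, loads

METADATA_VERSION = 2


def build_metadata_description(backup_id: str, backup_file_id: str) -> str:
    """Description stored on the .metadata.json drive item."""
    return dumps(
        {
            "metadata_version": METADATA_VERSION,
            "backup_id": backup_id,
            "backup_file_id": backup_file_id,
        }
    )


def is_current_metadata_item(description: str | None) -> bool:
    """Cheap filter before downloading: only items tagged with the current version."""
    return bool(description) and f'"metadata_version": {METADATA_VERSION}' in description


description = build_metadata_description("abc123", "drive-item-42")
assert is_current_metadata_item(description)
assert loads(description)["backup_file_id"] == "drive-item-42"
assert not is_current_metadata_item(None)
assert not is_current_metadata_item('{"metadata_version": 1}')
```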

@@ -129,6 +129,9 @@ class OneDriveConfigFlow(AbstractOAuth2FlowHandler, domain=DOMAIN):
except OneDriveException:
self.logger.debug("Failed to create folder", exc_info=True)
errors["base"] = "folder_creation_error"
else:
if folder.description and folder.description != instance_id:
errors[CONF_FOLDER_NAME] = "folder_already_in_use"
if not errors:
title = (
f"{self.approot.created_by.user.display_name}'s OneDrive"

View File

@@ -22,6 +22,7 @@
"default": "[%key:common::config_flow::create_entry::authenticated%]"
},
"error": {
"folder_already_in_use": "Folder already used for backups from another Home Assistant instance",
"folder_creation_error": "Failed to create folder",
"folder_rename_error": "Failed to rename folder"
},

View File

@@ -8,7 +8,7 @@
"integration_type": "device",
"iot_class": "local_push",
"loggers": ["openevsehttp"],
"quality_scale": "legacy",
"quality_scale": "bronze",
"requirements": ["python-openevse-http==0.2.1"],
"zeroconf": ["_openevse._tcp.local."]
}

View File

@@ -25,6 +25,8 @@ from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN
from .coordinator import OpenEVSEConfigEntry, OpenEVSEDataUpdateCoordinator
PARALLEL_UPDATES = 0
@dataclass(frozen=True, kw_only=True)
class OpenEVSENumberDescription(NumberEntityDescription):

View File

@@ -0,0 +1,74 @@
rules:
# Bronze
action-setup:
status: exempt
comment: Integration does not register custom actions.
appropriate-polling: done
brands: done
common-modules: done
config-flow-test-coverage: done
config-flow: done
dependency-transparency: done
docs-actions:
status: exempt
comment: Integration does not register custom actions.
docs-high-level-description: done
docs-installation-instructions: done
docs-removal-instructions: done
entity-event-setup:
status: exempt
comment: Integration does not subscribe to events.
entity-unique-id: done
has-entity-name: done
runtime-data: done
test-before-configure: done
test-before-setup: done
unique-config-entry: done
# Silver
action-exceptions: todo
config-entry-unloading: done
docs-configuration-parameters:
status: exempt
comment: Integration has no options flow.
docs-installation-parameters: todo
entity-unavailable: done
integration-owner: done
log-when-unavailable: done
parallel-updates: done
reauthentication-flow: todo
test-coverage: done
# Gold
devices: done
diagnostics: todo
discovery: done
discovery-update-info: done
docs-data-update: todo
docs-examples: todo
docs-known-limitations: todo
docs-supported-devices: todo
docs-supported-functions: todo
docs-troubleshooting: todo
docs-use-cases: todo
dynamic-devices:
status: exempt
comment: Integration supports a single device per config entry.
entity-category: todo
entity-device-class: done
entity-disabled-by-default: done
entity-translations: done
exception-translations: todo
icon-translations: todo
reconfiguration-flow: todo
repair-issues:
status: done
comment: Integration creates repair issues for YAML deprecation.
stale-devices:
status: exempt
comment: Integration supports a single device per config entry.
# Platinum
async-dependency: done
inject-websession: todo
strict-typing: todo

View File

@@ -15,8 +15,12 @@ from homeassistant.const import (
)
from homeassistant.core import HomeAssistant
from homeassistant.helpers.aiohttp_client import async_create_clientsession
import homeassistant.helpers.config_validation as cv
from homeassistant.helpers.typing import ConfigType
from .const import DOMAIN
from .coordinator import PortainerCoordinator
from .services import async_setup_services
_PLATFORMS: list[Platform] = [
Platform.BINARY_SENSOR,
@@ -25,6 +29,7 @@ _PLATFORMS: list[Platform] = [
Platform.BUTTON,
]
CONFIG_SCHEMA = cv.config_entry_only_config_schema(DOMAIN)
type PortainerConfigEntry = ConfigEntry[PortainerCoordinator]
@@ -49,6 +54,12 @@ async def async_setup_entry(hass: HomeAssistant, entry: PortainerConfigEntry) ->
return True
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the Portainer integration."""
await async_setup_services(hass)
return True
async def async_unload_entry(hass: HomeAssistant, entry: PortainerConfigEntry) -> bool:
"""Unload a config entry."""
return await hass.config_entries.async_unload_platforms(entry, _PLATFORMS)

View File

@@ -23,6 +23,8 @@ from .entity import (
PortainerEndpointEntity,
)
PARALLEL_UPDATES = 1
@dataclass(frozen=True, kw_only=True)
class PortainerContainerBinarySensorEntityDescription(BinarySensorEntityDescription):

View File

@@ -6,7 +6,6 @@ from abc import abstractmethod
from collections.abc import Callable, Coroutine
from dataclasses import dataclass
from datetime import timedelta
import logging
from typing import Any
from pyportainer import Portainer
@@ -35,7 +34,7 @@ from .coordinator import (
)
from .entity import PortainerContainerEntity, PortainerEndpointEntity
_LOGGER = logging.getLogger(__name__)
PARALLEL_UPDATES = 1
@dataclass(frozen=True, kw_only=True)

View File

@@ -3,6 +3,7 @@
DOMAIN = "portainer"
DEFAULT_NAME = "Portainer"
ENDPOINT_STATUS_DOWN = 2
CONTAINER_STATE_RUNNING = "running"

View File

@@ -67,5 +67,10 @@
}
}
}
},
"services": {
"prune_images": {
"service": "mdi:delete-sweep"
}
}
}

View File

@@ -7,10 +7,7 @@ rules:
config-flow-test-coverage: done
config-flow: done
dependency-transparency: done
docs-actions:
status: exempt
comment: |
No custom actions are defined.
docs-actions: done
docs-high-level-description: done
docs-installation-instructions: done
docs-removal-instructions: done
@@ -33,10 +30,7 @@ rules:
entity-unavailable: done
integration-owner: done
log-when-unavailable: done
parallel-updates:
status: exempt
comment: |
No explicit parallel updates are defined.
parallel-updates: todo
reauthentication-flow:
status: todo
comment: |

View File

@@ -28,6 +28,8 @@ from .entity import (
PortainerEndpointEntity,
)
PARALLEL_UPDATES = 1
@dataclass(frozen=True, kw_only=True)
class PortainerContainerSensorEntityDescription(SensorEntityDescription):

View File

@@ -0,0 +1,115 @@
"""Services for the Portainer integration."""
from datetime import timedelta
from pyportainer import (
PortainerAuthenticationError,
PortainerConnectionError,
PortainerTimeoutError,
)
import voluptuous as vol
from homeassistant.const import ATTR_DEVICE_ID
from homeassistant.core import HomeAssistant, ServiceCall
from homeassistant.exceptions import HomeAssistantError, ServiceValidationError
from homeassistant.helpers import config_validation as cv, device_registry as dr
from homeassistant.helpers.service import async_extract_config_entry_ids
from .const import DOMAIN
from .coordinator import PortainerConfigEntry
ATTR_DATE_UNTIL = "until"
ATTR_DANGLING = "dangling"
SERVICE_PRUNE_IMAGES = "prune_images"
SERVICE_PRUNE_IMAGES_SCHEMA = vol.Schema(
{
vol.Required(ATTR_DEVICE_ID): cv.string,
vol.Optional(ATTR_DATE_UNTIL): vol.All(
cv.time_period, vol.Range(min=timedelta(minutes=1))
),
vol.Optional(ATTR_DANGLING): cv.boolean,
},
)
async def _extract_config_entry(service_call: ServiceCall) -> PortainerConfigEntry:
"""Extract config entry from the service call."""
target_entry_ids = await async_extract_config_entry_ids(service_call)
target_entries: list[PortainerConfigEntry] = [
loaded_entry
for loaded_entry in service_call.hass.config_entries.async_loaded_entries(
DOMAIN
)
if loaded_entry.entry_id in target_entry_ids
]
if not target_entries:
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="invalid_target",
)
return target_entries[0]
async def _get_endpoint_id(
call: ServiceCall,
config_entry: PortainerConfigEntry,
) -> int:
"""Get endpoint data from device ID."""
device_reg = dr.async_get(call.hass)
device_id = call.data[ATTR_DEVICE_ID]
device = device_reg.async_get(device_id)
assert device
coordinator = config_entry.runtime_data
endpoint_data = None
for data in coordinator.data.values():
if (
DOMAIN,
f"{config_entry.entry_id}_{data.endpoint.id}",
) in device.identifiers:
endpoint_data = data
break
assert endpoint_data
return endpoint_data.endpoint.id
async def prune_images(call: ServiceCall) -> None:
"""Prune unused images in Portainer, with more controls."""
config_entry = await _extract_config_entry(call)
coordinator = config_entry.runtime_data
endpoint_id = await _get_endpoint_id(call, config_entry)
try:
await coordinator.portainer.images_prune(
endpoint_id=endpoint_id,
until=call.data.get(ATTR_DATE_UNTIL),
dangling=call.data.get(ATTR_DANGLING, False),
)
except PortainerAuthenticationError as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="invalid_auth_no_details",
) from err
except PortainerConnectionError as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="cannot_connect_no_details",
) from err
except PortainerTimeoutError as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="timeout_connect_no_details",
) from err
async def async_setup_services(hass: HomeAssistant) -> None:
"""Set up services."""
hass.services.async_register(
DOMAIN,
SERVICE_PRUNE_IMAGES,
prune_images,
SERVICE_PRUNE_IMAGES_SCHEMA,
)
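
For reference, here is one way the new prune_images action might be invoked from Python inside Home Assistant (for example from a test or a custom script). This is a hedged usage sketch, not part of the PR: the device ID is a placeholder, and the data keys simply mirror the schema defined above.

from datetime import timedelta

# Inside an async context with `hass` available; the device_id below is
# hypothetical and would normally come from the Portainer endpoint device.
await hass.services.async_call(
    "portainer",
    "prune_images",
    {
        "device_id": "1234567890abcdef",   # placeholder endpoint device ID
        "until": timedelta(hours=24),      # optional: only images unused for 24 h or more
        "dangling": True,                  # optional: restrict pruning to dangling images
    },
    blocking=True,
)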

View File

@@ -0,0 +1,18 @@
# Services for Portainer
prune_images:
fields:
device_id:
required: true
selector:
device:
integration: portainer
model: Endpoint
until:
required: false
selector:
duration:
dangling:
required: false
selector:
boolean: {}

View File

@@ -155,11 +155,34 @@
"invalid_auth_no_details": {
"message": "An error occurred while trying to authenticate."
},
"invalid_target": {
"message": "Invalid device targeted."
},
"timeout_connect": {
"message": "A timeout occurred while trying to connect to the Portainer instance: {error}"
},
"timeout_connect_no_details": {
"message": "A timeout occurred while trying to connect to the Portainer instance."
}
},
"services": {
"prune_images": {
"description": "Prunes unused images on a Portainer endpoint.",
"fields": {
"dangling": {
"description": "If true, only prune dangling images.",
"name": "Dangling"
},
"device_id": {
"description": "The endpoint to prune images on.",
"name": "Endpoint"
},
"until": {
"description": "Prune images unused for at least this time duration in the past. If not provided, all unused images will be pruned.",
"name": "Until"
}
},
"name": "Prune unused images"
}
}
}

View File

@@ -37,6 +37,9 @@ class PortainerSwitchEntityDescription(SwitchEntityDescription):
turn_off_fn: Callable[[str, Portainer, int, str], Coroutine[Any, Any, None]]
PARALLEL_UPDATES = 1
async def perform_action(
action: str, portainer: Portainer, endpoint_id: int, container_id: str
) -> None:

View File

@@ -384,11 +384,7 @@ class PrometheusMetrics:
if event.data["action"] != "update" or "area_id" not in event.data["changes"]:
return
device_id = event.data.get("device_id")
if device_id is None:
return
device_id = event.data["device_id"]
_LOGGER.debug("Handling device update for %s", device_id)
device = self.device_registry.async_get(device_id)
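
The removed None check is redundant because the handler only proceeds for device registry "update" events, whose payload is assumed here to always carry a device_id. A condensed, standalone sketch of the resulting guard (names and the dict shape are illustrative, not the integration's code):

def _updated_device_id(event_data: dict) -> str | None:
    """Return the device ID for area changes, mirroring the guard above."""
    # Only "update" events with an area change are interesting to the metrics handler.
    if event_data["action"] != "update" or "area_id" not in event_data.get("changes", {}):
        return None
    # For update events the payload is assumed to include "device_id", so index directly.
    return event_data["device_id"]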

View File

@@ -123,8 +123,7 @@ class ProxmoxveConfigFlow(ConfigFlow, domain=DOMAIN):
errors["base"] = "ssl_error"
except ProxmoxNoNodesFound:
errors["base"] = "no_nodes_found"
if not errors:
else:
return self.async_create_entry(
title=user_input[CONF_HOST],
data={**user_input, CONF_NODES: proxmox_nodes},

View File

@@ -4,7 +4,6 @@ from __future__ import annotations
from collections.abc import Callable
from concurrent.futures.thread import _threads_queues, _worker
import sys
import threading
from typing import Any
import weakref
@@ -54,17 +53,10 @@ class DBInterruptibleThreadPoolExecutor(InterruptibleThreadPoolExecutor):
) -> None:
q.put(None)
if sys.version_info >= (3, 14):
additional_args = (
self._create_worker_context(),
self._work_queue,
)
else:
additional_args = (
self._work_queue,
self._initializer,
self._initargs,
)
additional_args = (
self._create_worker_context(),
self._work_queue,
)
num_threads = len(self._threads)
if num_threads < self._max_workers:

View File

@@ -8,5 +8,5 @@
"iot_class": "cloud_polling",
"loggers": ["ical"],
"quality_scale": "silver",
"requirements": ["ical==12.1.3"]
"requirements": ["ical==12.1.2"]
}

View File

@@ -19,7 +19,7 @@
"data_description": {
"calendar_name": "The name of the calendar shown in the UI.",
"url": "The URL of the remote calendar.",
"verify_ssl": "Enable SSL certificate verification for secure connections."
"verify_ssl": "[%key:common::config_flow::description::verify_ssl%]"
},
"description": "Please choose a name for the calendar to be imported"
}

Some files were not shown because too many files have changed in this diff.