Compare commits


148 Commits

Author SHA1 Message Date
epenet
70ec51bcbf Drop ignore-missing-annotations from pylint 2026-02-23 16:51:00 +01:00
epenet
2f95d1ef78 Mark lock entity type hints as mandatory (#163796) 2026-02-23 16:50:52 +01:00
jesperraemaekers
6d6727ed58 Change weheat codeowner (#163860) 2026-02-23 16:49:41 +01:00
epenet
9c0c9758f0 Mark light entity type hints as mandatory (#163794) 2026-02-23 16:48:30 +01:00
epenet
bfa2da32fc Mark geo_location entity type hints as mandatory (#163790) 2026-02-23 16:48:12 +01:00
Paul Bottein
dfb17c2187 Add configurable panel properties to frontend (#162742)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Petar Petrov <MindFreeze@users.noreply.github.com>
2026-02-23 16:15:44 +01:00
Klaas Schoute
ac65163ebb Bump forecast-solar to v5.0.0 (#163841) 2026-02-23 15:58:54 +01:00
Sab44
f3042741bf Deprecate Libre Hardware Monitor versions below v0.9.5 (#163838) 2026-02-23 15:57:17 +01:00
Joost Lekkerkerker
80936497ce Add Zinvolt integration (#163449) 2026-02-23 15:55:15 +01:00
Joost Lekkerkerker
5e3d2bec68 Add integration_type device to sia (#163393) 2026-02-23 15:18:54 +01:00
Michael
e1667bd5c6 Increase request timeout from 10 to 20s in FRITZ!SmartHome (#163818) 2026-02-23 15:02:10 +01:00
Ludovic BOUÉ
cdb92a54b0 Fix Matter speaker mute toggle (#161128)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-23 14:38:37 +01:00
Erik Montnemery
74a3f4bbb9 Bump securetar to 2026.2.0 (#163226) 2026-02-23 14:03:43 +01:00
Nic Eggert
6299e8cb77 Add support for current sensors to egauge integration (#163728) 2026-02-23 13:29:20 +01:00
Joost Lekkerkerker
0f6a3a8328 Add integration_type service to snapcast (#163401) 2026-02-23 13:17:31 +01:00
Joost Lekkerkerker
77a56a3e60 Add integration_type device to smart_meter_texas (#163398) 2026-02-23 13:17:02 +01:00
Joost Lekkerkerker
cf5733de97 Add integration_type device to tilt_pi (#163667) 2026-02-23 13:16:37 +01:00
Joost Lekkerkerker
fe377befa6 Add integration_type hub to wallbox (#163752) 2026-02-23 13:12:40 +01:00
Joost Lekkerkerker
9d54236f7d Add integration_type hub to waqi (#163754) 2026-02-23 13:12:11 +01:00
Karl Beecken
bd6b8a812c Teltonika integration: add reauth config flow (#163712) 2026-02-23 13:07:19 +01:00
kshypachov
85eeac6812 Fix Matter energy sensor discovery when value is null (#162044)
Co-authored-by: Ludovic BOUÉ <lboue@users.noreply.github.com>
2026-02-23 11:52:05 +01:00
Robert Resch
ea71c40b0a Bump deebot-client to 18.0.0 (#163835) 2026-02-23 11:45:55 +01:00
Ludovic BOUÉ
99bd66194d Add allow_none_value=True to MatterDiscoverySchema for electrical power attributes (#163195)
Co-authored-by: TheJulianJES <TheJulianJES@users.noreply.github.com>
2026-02-23 11:33:57 +01:00
Sab44
13737ff2e6 Bump librehardwaremonitor-api to version 1.10.1 (#163572) 2026-02-23 11:01:58 +01:00
Tom
55c1d52310 Bump airOS to 0.6.4 (#163716) 2026-02-23 09:12:45 +01:00
hanwg
d5ef379caf Refactoring for Telegram bot (#163767) 2026-02-23 08:35:39 +01:00
Ludovic BOUÉ
a5d59decef Ikea bilresa dual button fixture (#163781)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-23 08:33:52 +01:00
Nathan Spencer
c75c5c0773 Adjust buttons to support new Litter-Robot lineup (#163825) 2026-02-23 08:32:33 +01:00
Nathan Spencer
f1fc6d10ad Adjust selects to support new Litter-Robot lineup (#163824) 2026-02-23 08:31:59 +01:00
Nathan Spencer
c3376df227 Adjust sensors to support new Litter-Robot lineup (#163823) 2026-02-23 08:30:46 +01:00
epenet
463003fc33 Add test for Tuya event (#163812) 2026-02-23 08:28:23 +01:00
Michael
b9b45c9994 Bump pyfritzhome to 0.6.20 (#163817) 2026-02-23 08:25:35 +01:00
Sebastiaan Speck
eed3b9fb89 Bump renault-api to 0.5.5 (#163821) 2026-02-23 07:35:37 +01:00
Erwin Douna
88d7954d7c Typing fix for Proxmox coordinator (#163808) 2026-02-23 06:58:43 +01:00
andarotajo
ce2afd85d4 Remove myself as code owner from dwd_weather_warnings (#163810) 2026-02-23 06:54:59 +01:00
Raphael Hehl
be96606b2c Bump uiprotect to 10.2.1 (#163816) 2026-02-23 01:05:23 +01:00
Maciej Bieniek
5afad9cabc Use async_add_executor_job in Fitbit to prevent event loop blocking (#163815) 2026-02-22 22:35:12 +01:00
epenet
19b606841d Mark fan entity type hints as mandatory (#163789) 2026-02-22 21:44:53 +01:00
Maciej Bieniek
abdd51c266 Allow unit of measurement translation in Analytics Insights (#163811) 2026-02-22 20:56:27 +01:00
Norbert Rittel
959bafe78b Fix grammar of amcrest.ptz_control action description (#163802) 2026-02-22 19:47:13 +01:00
Raphael Hehl
383f9c203d Unifiprotect ptz support (#161353)
Co-authored-by: RaHehl <rahehl@users.noreply.github.com>
Co-authored-by: J. Nick Koston <nick@koston.org>
2026-02-22 10:48:22 -06:00
Harry Heymann
b5d8c1e893 Require product_id for Inovelli LED intensity Matter Number entities (#163680) 2026-02-22 17:47:59 +01:00
epenet
11edd214a1 Improve type hints in igloohome lock (#163795) 2026-02-22 17:13:14 +01:00
Norbert Rittel
15d0241158 Replace "add-on" with "app" in zwave_me (user-facing strings only) (#163703) 2026-02-22 17:12:27 +01:00
Norbert Rittel
309b439744 Replace "add-on" with "app" in recorder (#163714) 2026-02-22 17:11:00 +01:00
Norbert Rittel
49f7c24601 Replace "add-on" with "app" in homeassistant_yellow (#163715) 2026-02-22 17:10:27 +01:00
Ludovic BOUÉ
9f25b4702d Remove CumulativeEnergyExported in fixtures where not needed (#163775) 2026-02-22 17:09:49 +01:00
epenet
a312f9f5bc Improve type hints in lights (#163792) 2026-02-22 17:08:42 +01:00
Marc Mueller
d767a1ca65 Update pillow to 12.1.1 (#163773) 2026-02-22 10:06:08 -06:00
Marc Mueller
d04fb59d56 Update sqlparse to 0.5.5 (#163774) 2026-02-22 10:05:45 -06:00
Marc Mueller
00e441b90d Update pylint to 4.0.5 (#163777) 2026-02-22 10:05:20 -06:00
Aidan Timson
e1fd60aa18 Bump systembridgeconnector to 5.4.3 (#163784) 2026-02-22 10:04:46 -06:00
Luke Lashley
8c41e21b7f Bump python-roborock to 4.17.1 (#163765)
Co-authored-by: Ludovic BOUÉ <lboue@users.noreply.github.com>
2026-02-22 07:44:29 -08:00
Ludovic BOUÉ
b7fd1276aa Roborock: Q7 Model Split and Refactor (#163769)
Co-authored-by: Luke Lashley <conway220@gmail.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-22 07:32:11 -08:00
David Bonnes
12d06e80ad Rename evohome's test_evo_services.py to test_services.py (#163731) 2026-02-22 14:10:59 +01:00
Simone Chemelli
a377907fd6 Bump aiovodafone to 3.1.2 (#163779) 2026-02-22 13:58:05 +01:00
Joost Lekkerkerker
16f4f5d54f Add integration_type device to volumio (#163751) 2026-02-22 10:52:43 +01:00
Joost Lekkerkerker
70585d1e23 Add integration_type device to vilfo (#163748) 2026-02-22 10:51:19 +01:00
Joost Lekkerkerker
2f82c3127d Add integration_type device to venstar (#163745) 2026-02-22 10:50:30 +01:00
Joost Lekkerkerker
af4d9cfac8 Add integration_type hub to vera (#163747) 2026-02-22 10:49:00 +01:00
Joost Lekkerkerker
a9abeb6ca5 Add integration_type device to v2c (#163742) 2026-02-22 10:48:24 +01:00
Joost Lekkerkerker
539ad6bf2b Add integration_type hub to uhoo (#163737) 2026-02-22 10:47:59 +01:00
Joost Lekkerkerker
f3e5cf0e56 Add integration_type device to twinkly (#163735) 2026-02-22 10:47:14 +01:00
Joost Lekkerkerker
d4e40b77cf Add integration_type hub to vegehub (#163744) 2026-02-22 10:46:02 +01:00
Joost Lekkerkerker
953391d9d9 Add integration_type service to uptimerobot (#163741) 2026-02-22 10:45:43 +01:00
Joost Lekkerkerker
4f7edb3c3c Add integration_type service to upcloud (#163740) 2026-02-22 10:45:16 +01:00
Joost Lekkerkerker
6aa4b9cefb Add integration_type service to ukraine_alarm (#163738) 2026-02-22 10:44:52 +01:00
Joost Lekkerkerker
ca01cf1150 Add integration_type service to twilio (#163734) 2026-02-22 10:44:29 +01:00
Joost Lekkerkerker
93ed79008b Add integration_type service to twitch (#163736) 2026-02-22 10:44:08 +01:00
Luke Lashley
429249f3f0 Add support for clean_area to Roborock V1 vacuums (#163760)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-21 19:41:20 -08:00
Joost Lekkerkerker
4fc627a7d8 Add integration_type service to vlc_telnet (#163750) 2026-02-22 00:04:43 +00:00
Joost Lekkerkerker
7c954e9997 Add integration_type device to vivotek (#163749) 2026-02-22 00:00:24 +00:00
Joost Lekkerkerker
95f89df6f4 Add integration_type device to vallox (#163743) 2026-02-21 23:59:53 +00:00
Tim Laing
5bffc14574 Bump pyicloud version to 2.4.1 in manifest and requirements files (#163722) 2026-02-21 23:55:45 +00:00
Ludovic BOUÉ
0e439583a6 Bump python-roborock to 4.15.0 in manifest and requirements files (#163719) 2026-02-21 09:43:06 -08:00
Christopher Fenner
666f6577e6 Bump PyViCare to 2.58.0 (#163686) 2026-02-21 18:09:28 +01:00
Andreas Jakl
ae9f2e6046 NRGkick integration: add reauth config flow (#163619) 2026-02-21 16:43:38 +01:00
Abílio Costa
9b4d209361 Add translated reasons to Govee Light Local setup failures (#163576) 2026-02-21 13:55:34 +01:00
Erwin Douna
99ca425ad0 Bump pyportainer 1.0.28 (#163700) 2026-02-21 11:53:03 +01:00
Josef Zweck
452b0775ee Revert "Replace "add-on" with "app" in zwave_me" (#163701) 2026-02-21 11:13:23 +01:00
Norbert Rittel
dc5caf307b Replace "add-on" with "app" in zwave_me (#163698) 2026-02-21 11:04:45 +01:00
Norbert Rittel
686bcb3199 Replace "add-on" with "app" in homeassistant_hardware (#163696) 2026-02-21 11:04:17 +01:00
hanwg
047d5735d8 Cleanup error handling for Telegram bot (#163689) 2026-02-21 09:19:06 +01:00
Nathan Spencer
c3b0f7ba55 Bump pylitterbot to 2025.1.0 (#163691) 2026-02-21 09:17:59 +01:00
Manu
11f0cd690e Bump aiontfy to 0.8.0 (#163693) 2026-02-21 09:16:14 +01:00
Norbert Rittel
048fbba36d Replace "add-on" with "app" in matter (#163695) 2026-02-21 08:54:49 +01:00
epenet
a791797a6f Mark entity icon type hints as mandatory (#163617) 2026-02-20 22:48:56 +01:00
Erwin Douna
aa2bb44f0e Bump pyportainer 1.0.27 (#163613)
Co-authored-by: Josef Zweck <josef@zweck.dev>
Co-authored-by: Jan-Philipp Benecke <jan-philipp@bnck.me>
2026-02-20 21:33:13 +01:00
Marc Mueller
6ecbaa979a Fix hassfest requirements check (#163681) 2026-02-20 20:36:07 +01:00
epenet
6115a4c1fb Use shorthand attributes in swiss_hydrological_data (#163607) 2026-02-20 19:56:39 +01:00
Joost Lekkerkerker
f6459453ed Add integration_type hub to surepetcare (#163646) 2026-02-20 19:51:02 +01:00
epenet
eeb7ce3725 Improve type hints in homematic hub (#163614) 2026-02-20 19:49:23 +01:00
Joost Lekkerkerker
f020948e2d Add integration_type hub to tradfri (#163673)
Co-authored-by: Josef Zweck <josef@zweck.dev>
2026-02-20 19:48:55 +01:00
Joost Lekkerkerker
0711176f9c Add integration_type device to tilt_ble (#163666) 2026-02-20 19:48:35 +01:00
Joost Lekkerkerker
cd26901386 Add integration_type service to todoist (#163668) 2026-02-20 19:46:51 +01:00
Joost Lekkerkerker
3c1b7ada9a Add integration_type device to tolo (#163670) 2026-02-20 19:46:36 +01:00
Joost Lekkerkerker
debf07e3fc Add integration_type device to toon (#163671) 2026-02-20 19:46:07 +01:00
Joost Lekkerkerker
541cc808b0 Add integration_type hub to totalconnect (#163672) 2026-02-20 19:45:21 +01:00
Joost Lekkerkerker
46b0eaecf6 Add integration_type service to trafikverket_camera (#163674) 2026-02-20 19:43:48 +01:00
Joost Lekkerkerker
35e770b998 Add integration_type service to trafikverket_ferry (#163675) 2026-02-20 19:43:14 +01:00
Joost Lekkerkerker
ed9ad950d9 Add integration_type service to trafikverket_train (#163676) 2026-02-20 19:42:55 +01:00
Joost Lekkerkerker
02058afb10 Add integration_type service to trafikverket_weatherstation (#163677) 2026-02-20 19:42:39 +01:00
Joost Lekkerkerker
3f6bfa96fc Add integration_type hub to tellduslive (#163661) 2026-02-20 19:41:34 +01:00
Joost Lekkerkerker
430f064243 Add integration_type device to tesla_wall_connector (#163662) 2026-02-20 19:40:20 +01:00
Joost Lekkerkerker
08adb88c6b Add integration_type device to thermobeacon (#163663) 2026-02-20 19:39:43 +01:00
Joost Lekkerkerker
14b6269dbf Add integration_type device to thermopro (#163664) 2026-02-20 19:39:05 +01:00
Joost Lekkerkerker
19b1fc6561 Add integration_type hub to tibber (#163665) 2026-02-20 19:37:34 +01:00
Joost Lekkerkerker
b6e83d22e3 Add integration_type device to syncthru (#163658) 2026-02-20 19:36:19 +01:00
Joost Lekkerkerker
7cd48ef079 Add integration_type device to tami4 (#163659) 2026-02-20 19:35:29 +01:00
Joost Lekkerkerker
2a03d95bcd Add integration_type service to telegram_bot (#163660) 2026-02-20 19:34:39 +01:00
Joost Lekkerkerker
e7e8c7a53a Add integration_type device to togrill (#163669) 2026-02-20 19:11:57 +01:00
Joost Lekkerkerker
6ce28987ab Add integration_type service to syncthing (#163651) 2026-02-20 16:27:23 +01:00
Joost Lekkerkerker
da537ddb8b Add integration_type device to steamist (#163640) 2026-02-20 16:26:59 +01:00
Joost Lekkerkerker
03f81e4a09 Add integration_type hub to starline (#163638) 2026-02-20 16:25:58 +01:00
Joost Lekkerkerker
88bc6165b5 Add integration_type device to starlink (#163639) 2026-02-20 16:25:33 +01:00
Joost Lekkerkerker
a1f35ed3c4 Add integration_type hub to switcher_kis (#163650) 2026-02-20 17:23:57 +02:00
Joost Lekkerkerker
c15a804ab4 Add integration_type service to srp_energy (#163636) 2026-02-20 16:23:39 +01:00
Joost Lekkerkerker
34f1c4cbe0 Add integration_type device to soundtouch (#163634) 2026-02-20 16:23:00 +01:00
Joost Lekkerkerker
bf950e4916 Add integration_type service to splunk (#163635) 2026-02-20 16:22:33 +01:00
Joost Lekkerkerker
47eba50b4a Add integration_type service to sonarr (#163632) 2026-02-20 16:22:07 +01:00
Joost Lekkerkerker
8ff06f3c72 Add integration_type hub to soma (#163630) 2026-02-20 16:21:35 +01:00
Joost Lekkerkerker
d2918586f9 Add integration_type device to solax (#163629) 2026-02-20 16:21:09 +01:00
Joost Lekkerkerker
8c3e72b53d Add integration_type device to snooz (#163627) 2026-02-20 16:20:31 +01:00
Joost Lekkerkerker
3143d9c4fd Add integration_type hub to snoo (#163626) 2026-02-20 16:20:01 +01:00
Joost Lekkerkerker
04621a2e58 Add integration_type hub to switchbee (#163648) 2026-02-20 16:19:28 +01:00
Joost Lekkerkerker
9b6e6a688d Add integration_type service to swiss_public_transport (#163647) 2026-02-20 16:18:57 +01:00
Joost Lekkerkerker
2bf5f67ecd Add integration_type service to suez_water (#163644) 2026-02-20 16:18:20 +01:00
Joost Lekkerkerker
522f63cdab Add integration_type hub to sunricher_dali (#163645) 2026-02-20 16:18:03 +01:00
Joost Lekkerkerker
03f5e6d6a3 Add integration_type device to songpal (#163633) 2026-02-20 16:17:47 +01:00
Joost Lekkerkerker
c2ba5d87d5 Add integration_type hub to subaru (#163643) 2026-02-20 16:17:03 +01:00
Joost Lekkerkerker
6a9fd67e05 Add integration_type hub to somfy_mylink (#163631) 2026-02-20 16:16:35 +01:00
Joost Lekkerkerker
69db5787ec Add integration_type device to stiebel_eltron (#163641) 2026-02-20 16:15:39 +01:00
Joost Lekkerkerker
8a38bace90 Add integration_type service to streamlabswater (#163642) 2026-02-20 16:15:05 +01:00
epenet
d6f3079518 Use shorthand attributes in london_air (#163601) 2026-02-20 11:49:48 +01:00
epenet
f80e1dd25b Use shorthand attributes in homematic (#163610) 2026-02-20 11:49:04 +01:00
epenet
4937c6521b Add type hint for icon property (#163609) 2026-02-20 11:43:44 +01:00
epenet
cff5a12d5f Use shorthand attributes in reddit (#163600) 2026-02-20 11:43:23 +01:00
epenet
63e4eaf79e Use shorthand attributes in netdata (#163605) 2026-02-20 11:41:56 +01:00
epenet
eccaac4e94 Use shorthand attributes in rmvtransport (#163599) 2026-02-20 11:38:11 +01:00
epenet
5d818cd2ba Use shorthand attributes in transport_nsw (#163598) 2026-02-20 11:37:40 +01:00
epenet
12591a95c6 Use shorthand attributes in torque (#163597) 2026-02-20 11:33:18 +01:00
epenet
1110ca5dc6 Use shorthand attributes in geonetnz_volcano (#163596) 2026-02-20 11:32:45 +01:00
Brett Adams
2a6f6ef684 Add reconfiguration flow to Splunk integration (#163577)
Co-authored-by: Claude Haiku 4.5 <noreply@anthropic.com>
2026-02-20 09:13:32 +01:00
Manu
c173505f76 Add state_class to PlayStation Network sensors (#163591) 2026-02-20 09:10:58 +01:00
Manu
201b31c18a Add state_class to Xbox sensors (#163590) 2026-02-20 08:31:26 +01:00
Manu
cb63c1d435 Improve oauth2 exception handling in Xbox (#163588) 2026-02-20 08:31:10 +01:00
Brett Adams
6abff84f23 Add exception translations for Splunk setup errors (#163579) 2026-02-20 00:19:03 +01:00
Patrick Vorgers
0996ad4d1d Add pagination support for IDrive e2 (#162960) 2026-02-19 22:42:04 +01:00
264 changed files with 6627 additions and 1125 deletions

View File

@@ -37,7 +37,7 @@ on:
type: boolean
env:
CACHE_VERSION: 2
CACHE_VERSION: 3
UV_CACHE_VERSION: 1
MYPY_CACHE_VERSION: 1
HA_SHORT_VERSION: "2026.3"
@@ -705,7 +705,7 @@ jobs:
run: |
. venv/bin/activate
python --version
pylint --ignore-missing-annotations=y homeassistant
pylint homeassistant
- name: Run pylint (partially)
if: needs.info.outputs.test_full_suite == 'false'
shell: bash
@@ -714,7 +714,7 @@ jobs:
run: |
. venv/bin/activate
python --version
pylint --ignore-missing-annotations=y $(printf "homeassistant/components/%s " ${INTEGRATIONS_GLOB})
pylint $(printf "homeassistant/components/%s " ${INTEGRATIONS_GLOB})
pylint-tests:
name: Check pylint on tests

View File

@@ -612,6 +612,7 @@ homeassistant.components.yale_smart_alarm.*
homeassistant.components.yalexs_ble.*
homeassistant.components.youtube.*
homeassistant.components.zeroconf.*
homeassistant.components.zinvolt.*
homeassistant.components.zodiac.*
homeassistant.components.zone.*
homeassistant.components.zwave_js.*

CODEOWNERS generated
View File

@@ -403,8 +403,8 @@ build.json @home-assistant/supervisor
/tests/components/duke_energy/ @hunterjm
/homeassistant/components/duotecno/ @cereal2nd
/tests/components/duotecno/ @cereal2nd
/homeassistant/components/dwd_weather_warnings/ @runningman84 @stephan192 @andarotajo
/tests/components/dwd_weather_warnings/ @runningman84 @stephan192 @andarotajo
/homeassistant/components/dwd_weather_warnings/ @runningman84 @stephan192
/tests/components/dwd_weather_warnings/ @runningman84 @stephan192
/homeassistant/components/dynalite/ @ziv1234
/tests/components/dynalite/ @ziv1234
/homeassistant/components/eafm/ @Jc2k
@@ -1880,8 +1880,8 @@ build.json @home-assistant/supervisor
/tests/components/webostv/ @thecode
/homeassistant/components/websocket_api/ @home-assistant/core
/tests/components/websocket_api/ @home-assistant/core
/homeassistant/components/weheat/ @jesperraemaekers
/tests/components/weheat/ @jesperraemaekers
/homeassistant/components/weheat/ @barryvdh
/tests/components/weheat/ @barryvdh
/homeassistant/components/wemo/ @esev
/tests/components/wemo/ @esev
/homeassistant/components/whirlpool/ @abmantis @mkmer
@@ -1959,6 +1959,8 @@ build.json @home-assistant/supervisor
/tests/components/zha/ @dmulcahey @adminiuga @puddly @TheJulianJES
/homeassistant/components/zimi/ @markhannon
/tests/components/zimi/ @markhannon
/homeassistant/components/zinvolt/ @joostlek
/tests/components/zinvolt/ @joostlek
/homeassistant/components/zodiac/ @JulienTant
/tests/components/zodiac/ @JulienTant
/homeassistant/components/zone/ @home-assistant/core

View File

@@ -4,7 +4,6 @@ from __future__ import annotations
from collections.abc import Iterable
from dataclasses import dataclass
import hashlib
import json
import logging
from pathlib import Path
@@ -40,17 +39,6 @@ class RestoreBackupFileContent:
restore_homeassistant: bool
def password_to_key(password: str) -> bytes:
"""Generate an AES key from password.
Matches the implementation in supervisor.backups.utils.password_to_key.
"""
key: bytes = password.encode()
for _ in range(100):
key = hashlib.sha256(key).digest()
return key[:16]
def restore_backup_file_content(config_dir: Path) -> RestoreBackupFileContent | None:
"""Return the contents of the restore backup file."""
instruction_path = config_dir.joinpath(RESTORE_BACKUP_FILE)
@@ -96,15 +84,14 @@ def _extract_backup(
"""Extract the backup file to the config directory."""
with (
TemporaryDirectory() as tempdir,
securetar.SecureTarFile(
securetar.SecureTarArchive(
restore_content.backup_file_path,
gzip=False,
mode="r",
) as ostf,
):
ostf.extractall(
ostf.tar.extractall(
path=Path(tempdir, "extracted"),
members=securetar.secure_path(ostf),
members=securetar.secure_path(ostf.tar),
filter="fully_trusted",
)
backup_meta_file = Path(tempdir, "extracted", "backup.json")
@@ -126,10 +113,7 @@ def _extract_backup(
f"homeassistant.tar{'.gz' if backup_meta['compressed'] else ''}",
),
gzip=backup_meta["compressed"],
key=password_to_key(restore_content.password)
if restore_content.password is not None
else None,
mode="r",
password=restore_content.password,
) as istf:
istf.extractall(
path=Path(tempdir, "homeassistant"),
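
For context on the hunk above: the `password_to_key` helper removed from this file derived a legacy 16-byte AES key by iterated hashing, and that responsibility now moves into securetar's own `password=` handling. A self-contained sketch of the removed implementation, as shown in the deleted lines:

```python
import hashlib


def password_to_key(password: str) -> bytes:
    """Derive a 16-byte AES key from a password.

    Matches the removed helper (and supervisor.backups.utils.password_to_key):
    100 rounds of SHA-256 over the encoded password, truncated to 16 bytes.
    """
    key: bytes = password.encode()
    for _ in range(100):
        key = hashlib.sha256(key).digest()
    return key[:16]
```

The derivation is deterministic, so the same password always yields the same key; securetar 2026.2.0 accepts the raw password directly and performs the equivalent derivation internally.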

View File

@@ -7,5 +7,5 @@
"integration_type": "device",
"iot_class": "local_polling",
"quality_scale": "silver",
"requirements": ["airos==0.6.3"]
"requirements": ["airos==0.6.4"]
}

View File

@@ -75,7 +75,7 @@
"name": "Go to preset"
},
"ptz_control": {
"description": "Moves (pan/tilt) and/or zoom a PTZ camera.",
"description": "Moves (pan/tilt) and/or zooms a PTZ camera.",
"fields": {
"entity_id": {
"description": "[%key:component::amcrest::services::enable_recording::fields::entity_id::description%]",

View File

@@ -38,7 +38,6 @@ def get_app_entity_description(
translation_key="apps",
name=name_slug,
state_class=SensorStateClass.TOTAL,
native_unit_of_measurement="active installations",
value_fn=lambda data: data.apps.get(name_slug),
)
@@ -52,7 +51,6 @@ def get_core_integration_entity_description(
translation_key="core_integrations",
name=name,
state_class=SensorStateClass.TOTAL,
native_unit_of_measurement="active installations",
value_fn=lambda data: data.core_integrations.get(domain),
)
@@ -66,7 +64,6 @@ def get_custom_integration_entity_description(
translation_key="custom_integrations",
translation_placeholders={"custom_integration_domain": domain},
state_class=SensorStateClass.TOTAL,
native_unit_of_measurement="active installations",
value_fn=lambda data: data.custom_integrations.get(domain),
)
@@ -77,7 +74,6 @@ GENERAL_SENSORS = [
translation_key="total_active_installations",
entity_registry_enabled_default=False,
state_class=SensorStateClass.TOTAL,
native_unit_of_measurement="active installations",
value_fn=lambda data: data.active_installations,
),
AnalyticsSensorEntityDescription(
@@ -85,7 +81,6 @@ GENERAL_SENSORS = [
translation_key="total_reports_integrations",
entity_registry_enabled_default=False,
state_class=SensorStateClass.TOTAL,
native_unit_of_measurement="active installations",
value_fn=lambda data: data.reports_integrations,
),
]

View File

@@ -24,14 +24,23 @@
},
"entity": {
"sensor": {
"apps": {
"unit_of_measurement": "active installations"
},
"core_integrations": {
"unit_of_measurement": "[%key:component::analytics_insights::entity::sensor::apps::unit_of_measurement%]"
},
"custom_integrations": {
"name": "{custom_integration_domain} (custom)"
"name": "{custom_integration_domain} (custom)",
"unit_of_measurement": "[%key:component::analytics_insights::entity::sensor::apps::unit_of_measurement%]"
},
"total_active_installations": {
"name": "Total active installations"
"name": "Total active installations",
"unit_of_measurement": "[%key:component::analytics_insights::entity::sensor::apps::unit_of_measurement%]"
},
"total_reports_integrations": {
"name": "Total reported integrations"
"name": "Total reported integrations",
"unit_of_measurement": "[%key:component::analytics_insights::entity::sensor::apps::unit_of_measurement%]"
}
}
},

View File

@@ -64,6 +64,6 @@ class AtagSensor(AtagEntity, SensorEntity):
return self.coordinator.atag.report[self._id].state
@property
def icon(self):
def icon(self) -> str:
"""Return icon."""
return self.coordinator.atag.report[self._id].icon

View File

@@ -33,3 +33,5 @@ EXCLUDE_DATABASE_FROM_BACKUP = [
"home-assistant_v2.db",
"home-assistant_v2.db-wal",
]
SECURETAR_CREATE_VERSION = 2

View File

@@ -20,13 +20,9 @@ import time
from typing import IO, TYPE_CHECKING, Any, Protocol, TypedDict, cast
import aiohttp
from securetar import SecureTarFile, atomic_contents_add
from securetar import SecureTarArchive, atomic_contents_add
from homeassistant.backup_restore import (
RESTORE_BACKUP_FILE,
RESTORE_BACKUP_RESULT_FILE,
password_to_key,
)
from homeassistant.backup_restore import RESTORE_BACKUP_FILE, RESTORE_BACKUP_RESULT_FILE
from homeassistant.const import __version__ as HAVERSION
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import (
@@ -60,6 +56,7 @@ from .const import (
EXCLUDE_DATABASE_FROM_BACKUP,
EXCLUDE_FROM_BACKUP,
LOGGER,
SECURETAR_CREATE_VERSION,
)
from .models import (
AddonInfo,
@@ -1858,20 +1855,22 @@ class CoreBackupReaderWriter(BackupReaderWriter):
return False
outer_secure_tarfile = SecureTarFile(
tar_file_path, "w", gzip=False, bufsize=BUF_SIZE
)
with outer_secure_tarfile as outer_secure_tarfile_tarfile:
with SecureTarArchive(
tar_file_path,
"w",
bufsize=BUF_SIZE,
create_version=SECURETAR_CREATE_VERSION,
password=password,
) as outer_secure_tarfile:
raw_bytes = json_bytes(backup_data)
fileobj = io.BytesIO(raw_bytes)
tar_info = tarfile.TarInfo(name="./backup.json")
tar_info.size = len(raw_bytes)
tar_info.mtime = int(time.time())
outer_secure_tarfile_tarfile.addfile(tar_info, fileobj=fileobj)
with outer_secure_tarfile.create_inner_tar(
outer_secure_tarfile.tar.addfile(tar_info, fileobj=fileobj)
with outer_secure_tarfile.create_tar(
"./homeassistant.tar.gz",
gzip=True,
key=password_to_key(password) if password is not None else None,
) as core_tar:
atomic_contents_add(
tar_file=core_tar,

View File

@@ -8,6 +8,6 @@
"integration_type": "service",
"iot_class": "calculated",
"quality_scale": "internal",
"requirements": ["cronsim==2.7", "securetar==2025.2.1"],
"requirements": ["cronsim==2.7", "securetar==2026.2.0"],
"single_config_entry": true
}

View File

@@ -8,7 +8,6 @@ import copy
from dataclasses import dataclass, replace
from io import BytesIO
import json
import os
from pathlib import Path, PurePath
from queue import SimpleQueue
import tarfile
@@ -16,9 +15,14 @@ import threading
from typing import IO, Any, cast
import aiohttp
from securetar import SecureTarError, SecureTarFile, SecureTarReadError
from securetar import (
SecureTarArchive,
SecureTarError,
SecureTarFile,
SecureTarReadError,
SecureTarRootKeyContext,
)
from homeassistant.backup_restore import password_to_key
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.util import dt as dt_util
@@ -29,7 +33,7 @@ from homeassistant.util.async_iterator import (
)
from homeassistant.util.json import JsonObjectType, json_loads_object
from .const import BUF_SIZE, LOGGER
from .const import BUF_SIZE, LOGGER, SECURETAR_CREATE_VERSION
from .models import AddonInfo, AgentBackup, Folder
@@ -132,17 +136,23 @@ def suggested_filename(backup: AgentBackup) -> str:
def validate_password(path: Path, password: str | None) -> bool:
"""Validate the password."""
with tarfile.open(path, "r:", bufsize=BUF_SIZE) as backup_file:
"""Validate the password.
This assumes every inner tar is encrypted with the same secure tar version and
same password.
"""
with SecureTarArchive(
path, "r", bufsize=BUF_SIZE, password=password
) as backup_file:
compressed = False
ha_tar_name = "homeassistant.tar"
try:
ha_tar = backup_file.extractfile(ha_tar_name)
ha_tar = backup_file.tar.extractfile(ha_tar_name)
except KeyError:
compressed = True
ha_tar_name = "homeassistant.tar.gz"
try:
ha_tar = backup_file.extractfile(ha_tar_name)
ha_tar = backup_file.tar.extractfile(ha_tar_name)
except KeyError:
LOGGER.error("No homeassistant.tar or homeassistant.tar.gz found")
return False
@@ -150,13 +160,12 @@ def validate_password(path: Path, password: str | None) -> bool:
with SecureTarFile(
path, # Not used
gzip=compressed,
key=password_to_key(password) if password is not None else None,
mode="r",
password=password,
fileobj=ha_tar,
):
# If we can read the tar file, the password is correct
return True
except tarfile.ReadError:
except (tarfile.ReadError, SecureTarReadError):
LOGGER.debug("Invalid password")
return False
except Exception: # noqa: BLE001
@@ -168,22 +177,23 @@ def validate_password_stream(
input_stream: IO[bytes],
password: str | None,
) -> None:
"""Decrypt a backup."""
with (
tarfile.open(fileobj=input_stream, mode="r|", bufsize=BUF_SIZE) as input_tar,
):
for obj in input_tar:
"""Validate the password.
This assumes every inner tar is encrypted with the same secure tar version and
same password.
"""
with SecureTarArchive(
fileobj=input_stream,
mode="r",
bufsize=BUF_SIZE,
streaming=True,
password=password,
) as input_archive:
for obj in input_archive.tar:
if not obj.name.endswith((".tar", ".tgz", ".tar.gz")):
continue
istf = SecureTarFile(
None, # Not used
gzip=False,
key=password_to_key(password) if password is not None else None,
mode="r",
fileobj=input_tar.extractfile(obj),
)
with istf.decrypt(obj) as decrypted:
if istf.securetar_header.plaintext_size is None:
with input_archive.extract_tar(obj) as decrypted:
if decrypted.plaintext_size is None:
raise UnsupportedSecureTarVersion
try:
decrypted.read(1) # Read a single byte to trigger the decryption
@@ -212,21 +222,25 @@ def decrypt_backup(
password: str | None,
on_done: Callable[[Exception | None], None],
minimum_size: int,
nonces: NonceGenerator,
key_context: SecureTarRootKeyContext,
) -> None:
"""Decrypt a backup."""
error: Exception | None = None
try:
try:
with (
tarfile.open(
fileobj=input_stream, mode="r|", bufsize=BUF_SIZE
) as input_tar,
SecureTarArchive(
fileobj=input_stream,
mode="r",
bufsize=BUF_SIZE,
streaming=True,
password=password,
) as input_archive,
tarfile.open(
fileobj=output_stream, mode="w|", bufsize=BUF_SIZE
) as output_tar,
):
_decrypt_backup(backup, input_tar, output_tar, password)
_decrypt_backup(backup, input_archive, output_tar)
except (DecryptError, SecureTarError, tarfile.TarError) as err:
LOGGER.warning("Error decrypting backup: %s", err)
error = err
@@ -248,19 +262,18 @@ def decrypt_backup(
def _decrypt_backup(
backup: AgentBackup,
input_tar: tarfile.TarFile,
input_archive: SecureTarArchive,
output_tar: tarfile.TarFile,
password: str | None,
) -> None:
"""Decrypt a backup."""
expected_archives = _get_expected_archives(backup)
for obj in input_tar:
for obj in input_archive.tar:
# We compare with PurePath to avoid issues with different path separators,
# for example when backup.json is added as "./backup.json"
object_path = PurePath(obj.name)
if object_path == PurePath("backup.json"):
# Rewrite the backup.json file to indicate that the backup is decrypted
if not (reader := input_tar.extractfile(obj)):
if not (reader := input_archive.tar.extractfile(obj)):
raise DecryptError
metadata = json_loads_object(reader.read())
metadata["protected"] = False
@@ -272,21 +285,15 @@ def _decrypt_backup(
prefix, _, suffix = object_path.name.partition(".")
if suffix not in ("tar", "tgz", "tar.gz"):
LOGGER.debug("Unknown file %s will not be decrypted", obj.name)
output_tar.addfile(obj, input_tar.extractfile(obj))
output_tar.addfile(obj, input_archive.tar.extractfile(obj))
continue
if prefix not in expected_archives:
LOGGER.debug("Unknown inner tar file %s will not be decrypted", obj.name)
output_tar.addfile(obj, input_tar.extractfile(obj))
output_tar.addfile(obj, input_archive.tar.extractfile(obj))
continue
istf = SecureTarFile(
None, # Not used
gzip=False,
key=password_to_key(password) if password is not None else None,
mode="r",
fileobj=input_tar.extractfile(obj),
)
with istf.decrypt(obj) as decrypted:
if (plaintext_size := istf.securetar_header.plaintext_size) is None:
with input_archive.extract_tar(obj) as decrypted:
# Guard against SecureTar v1 which doesn't store plaintext size
if (plaintext_size := decrypted.plaintext_size) is None:
raise UnsupportedSecureTarVersion
decrypted_obj = copy.deepcopy(obj)
decrypted_obj.size = plaintext_size
@@ -300,7 +307,7 @@ def encrypt_backup(
password: str | None,
on_done: Callable[[Exception | None], None],
minimum_size: int,
nonces: NonceGenerator,
key_context: SecureTarRootKeyContext,
) -> None:
"""Encrypt a backup."""
error: Exception | None = None
@@ -310,11 +317,16 @@ def encrypt_backup(
tarfile.open(
fileobj=input_stream, mode="r|", bufsize=BUF_SIZE
) as input_tar,
-tarfile.open(
-fileobj=output_stream, mode="w|", bufsize=BUF_SIZE
-) as output_tar,
+SecureTarArchive(
+fileobj=output_stream,
+mode="w",
+bufsize=BUF_SIZE,
+streaming=True,
+root_key_context=key_context,
+create_version=SECURETAR_CREATE_VERSION,
+) as output_archive,
):
-_encrypt_backup(backup, input_tar, output_tar, password, nonces)
+_encrypt_backup(backup, input_tar, output_archive)
except (EncryptError, SecureTarError, tarfile.TarError) as err:
LOGGER.warning("Error encrypting backup: %s", err)
error = err
@@ -337,9 +349,7 @@ def encrypt_backup(
def _encrypt_backup(
backup: AgentBackup,
input_tar: tarfile.TarFile,
-output_tar: tarfile.TarFile,
-password: str | None,
-nonces: NonceGenerator,
+output_archive: SecureTarArchive,
) -> None:
"""Encrypt a backup."""
inner_tar_idx = 0
@@ -357,29 +367,20 @@ def _encrypt_backup(
updated_metadata_b = json.dumps(metadata).encode()
metadata_obj = copy.deepcopy(obj)
metadata_obj.size = len(updated_metadata_b)
-output_tar.addfile(metadata_obj, BytesIO(updated_metadata_b))
+output_archive.tar.addfile(metadata_obj, BytesIO(updated_metadata_b))
continue
prefix, _, suffix = object_path.name.partition(".")
if suffix not in ("tar", "tgz", "tar.gz"):
LOGGER.debug("Unknown file %s will not be encrypted", obj.name)
-output_tar.addfile(obj, input_tar.extractfile(obj))
+output_archive.tar.addfile(obj, input_tar.extractfile(obj))
continue
if prefix not in expected_archives:
LOGGER.debug("Unknown inner tar file %s will not be encrypted", obj.name)
continue
-istf = SecureTarFile(
-None,  # Not used
-gzip=False,
-key=password_to_key(password) if password is not None else None,
-mode="r",
-fileobj=input_tar.extractfile(obj),
-nonce=nonces.get(inner_tar_idx),
+output_archive.import_tar(
+input_tar.extractfile(obj), obj, derived_key_id=inner_tar_idx
+)
inner_tar_idx += 1
-with istf.encrypt(obj) as encrypted:
-encrypted_obj = copy.deepcopy(obj)
-encrypted_obj.size = encrypted.encrypted_size
-output_tar.addfile(encrypted_obj, encrypted)
@dataclass(kw_only=True)
@@ -391,21 +392,6 @@ class _CipherWorkerStatus:
writer: AsyncIteratorWriter
-class NonceGenerator:
-"""Generate nonces for encryption."""
-def __init__(self) -> None:
-"""Initialize the generator."""
-self._nonces: dict[int, bytes] = {}
-def get(self, index: int) -> bytes:
-"""Get a nonce for the given index."""
-if index not in self._nonces:
-# Generate a new nonce for the given index
-self._nonces[index] = os.urandom(16)
-return self._nonces[index]
class _CipherBackupStreamer:
"""Encrypt or decrypt a backup."""
@@ -417,7 +403,7 @@ class _CipherBackupStreamer:
str | None,
Callable[[Exception | None], None],
int,
-NonceGenerator,
+SecureTarRootKeyContext,
],
None,
]
@@ -435,7 +421,7 @@ class _CipherBackupStreamer:
self._hass = hass
self._open_stream = open_stream
self._password = password
-self._nonces = NonceGenerator()
+self._key_context = SecureTarRootKeyContext(password)
def size(self) -> int:
"""Return the maximum size of the decrypted or encrypted backup."""
@@ -466,7 +452,7 @@ class _CipherBackupStreamer:
self._password,
on_done,
self.size(),
-self._nonces,
+self._key_context,
],
)
worker_status = _CipherWorkerStatus(
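The removed `NonceGenerator` class (deleted above in favor of `SecureTarRootKeyContext`) memoized one random nonce per inner-tar index, so that repeated streaming passes over the same backup encrypt each inner tar with the same nonce and produce identical output. A minimal standalone sketch of that memoization pattern (the class name here is ours, not Home Assistant's):

```python
import os


class IndexedNonceCache:
    """Memoize one random 16-byte nonce per inner-tar index.

    Mirrors the NonceGenerator removed in this diff: a backup may be
    streamed several times, and every pass must encrypt inner tar N
    with the same nonce so all passes yield byte-identical archives.
    """

    def __init__(self) -> None:
        self._nonces: dict[int, bytes] = {}

    def get(self, index: int) -> bytes:
        # First request for an index draws fresh randomness;
        # later requests return the cached value unchanged.
        if index not in self._nonces:
            self._nonces[index] = os.urandom(16)
        return self._nonces[index]
```

The replacement `SecureTarRootKeyContext` moves this determinism into securetar's key-derivation layer (`derived_key_id=inner_tar_idx` in the new `import_tar` call), so the integration no longer manages nonces directly.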

View File

@@ -74,7 +74,7 @@ class BleBoxLightEntity(BleBoxEntity[blebox_uniapi.light.Light], LightEntity):
return self._feature.is_on
@property
-def brightness(self):
+def brightness(self) -> int | None:
"""Return the name."""
return self._feature.brightness

View File

@@ -31,6 +31,7 @@ _LOGGER = logging.getLogger(__name__)
def _convert_image_for_editing(data: bytes) -> tuple[bytes, str]:
"""Ensure the image data is in a format accepted by OpenAI image edits."""
img: Image.Image
stream = io.BytesIO(data)
with Image.open(stream) as img:
mode = img.mode
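The hunk above keeps the `BytesIO` stream alive and opens it with `Image.open` as a context manager, so the decoder's file handle is closed deterministically. A minimal sketch of the same idiom (`image_mode` is a hypothetical helper, not the integration's function):

```python
import io

from PIL import Image  # assumes Pillow is installed


def image_mode(data: bytes) -> str:
    """Open raw image bytes the way the diff does: keep the stream
    referenced and close the decoder via the context manager."""
    stream = io.BytesIO(data)
    with Image.open(stream) as img:
        return img.mode
```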

View File

@@ -199,7 +199,7 @@ class Control4Light(Control4Entity, LightEntity):
return self.coordinator.data[self._idx][CONTROL4_NON_DIMMER_VAR] > 0
@property
-def brightness(self):
+def brightness(self) -> int | None:
"""Return the brightness of this light between 0..255."""
if self._is_dimmer:
for var in CONTROL4_DIMMER_VARS:

View File

@@ -132,7 +132,7 @@ class DecoraWifiLight(LightEntity):
return self._switch.serial
@property
-def brightness(self):
+def brightness(self) -> int:
"""Return the brightness of the dimmer switch."""
return int(self._switch.brightness * 255 / 100)

View File

@@ -6,5 +6,5 @@
"iot_class": "local_polling",
"loggers": ["pydoods"],
"quality_scale": "legacy",
-"requirements": ["pydoods==1.0.2", "Pillow==12.0.0"]
+"requirements": ["pydoods==1.0.2", "Pillow==12.1.1"]
}

View File

@@ -1,7 +1,7 @@
{
"domain": "dwd_weather_warnings",
"name": "Deutscher Wetterdienst (DWD) Weather Warnings",
-"codeowners": ["@runningman84", "@stephan192", "@andarotajo"],
+"codeowners": ["@runningman84", "@stephan192"],
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/dwd_weather_warnings",
"integration_type": "service",

View File

@@ -7,5 +7,5 @@
"integration_type": "hub",
"iot_class": "cloud_push",
"loggers": ["sleekxmppfs", "sucks", "deebot_client"],
-"requirements": ["py-sucks==0.9.11", "deebot-client==17.1.0"]
+"requirements": ["py-sucks==0.9.11", "deebot-client==18.0.0"]
}

View File

@@ -338,11 +338,11 @@ class EcovacsVacuum(
translation_placeholders={"name": name},
)
-if command in "spot_area":
+if command == "spot_area":
await self._device.execute_command(
self._capability.clean.action.area(
CleanMode.SPOT_AREA,
-str(params["rooms"]),
+params["rooms"],
params.get("cleanings", 1),
)
)
@@ -350,7 +350,7 @@ class EcovacsVacuum(
await self._device.execute_command(
self._capability.clean.action.area(
CleanMode.CUSTOM_AREA,
-str(params["coordinates"]),
+params["coordinates"],
params.get("cleanings", 1),
)
)
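The `in` → `==` fix above is worth spelling out: `command in "spot_area"` is a substring test on a string, so unrelated commands like `"spot"`, `"area"`, or even `""` also match. A minimal sketch of the buggy and fixed checks:

```python
def is_spot_area_buggy(command: str) -> bool:
    # Pre-fix check: `in` against a string matches any substring,
    # not just the exact command name.
    return command in "spot_area"


def is_spot_area_fixed(command: str) -> bool:
    # The fix in this diff: exact comparison.
    return command == "spot_area"
```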

View File

@@ -13,7 +13,12 @@ from homeassistant.components.sensor import (
SensorEntityDescription,
SensorStateClass,
)
-from homeassistant.const import UnitOfElectricPotential, UnitOfEnergy, UnitOfPower
+from homeassistant.const import (
+UnitOfElectricCurrent,
+UnitOfElectricPotential,
+UnitOfEnergy,
+UnitOfPower,
+)
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
@@ -59,6 +64,15 @@ SENSORS: tuple[EgaugeSensorEntityDescription, ...] = (
available_fn=lambda data, register: register in data.measurements,
supported_fn=lambda register_info: register_info.type == RegisterType.VOLTAGE,
),
+EgaugeSensorEntityDescription(
+key="current",
+device_class=SensorDeviceClass.CURRENT,
+state_class=SensorStateClass.MEASUREMENT,
+native_unit_of_measurement=UnitOfElectricCurrent.AMPERE,
+native_value_fn=lambda data, register: data.measurements[register],
+available_fn=lambda data, register: register in data.measurements,
+supported_fn=lambda register_info: register_info.type == RegisterType.CURRENT,
+),
)

View File

@@ -75,7 +75,7 @@ class EufyHomeLight(LightEntity):
self._attr_is_on = self._bulb.power
@property
-def brightness(self):
+def brightness(self) -> int:
"""Return the brightness of this light between 0..255."""
return int(self._brightness * 255 / 100)
@@ -88,7 +88,7 @@ class EufyHomeLight(LightEntity):
)
@property
-def hs_color(self):
+def hs_color(self) -> tuple[float, float] | None:
"""Return the color of this light."""
return self._hs

View File

@@ -72,7 +72,7 @@ class FitbitApi(ABC):
configuration = Configuration()
configuration.pool_manager = async_get_clientsession(self._hass)
configuration.access_token = token[CONF_ACCESS_TOKEN]
-return ApiClient(configuration)
+return await self._hass.async_add_executor_job(ApiClient, configuration)
async def async_get_user_profile(self) -> FitbitProfile:
"""Return the user profile from the API."""
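The Fitbit change above offloads a constructor that does blocking work into Home Assistant's executor pool rather than running it on the event loop. The same pattern in plain asyncio (with `SlowClient` as a hypothetical stand-in for `ApiClient`):

```python
import asyncio
import time


class SlowClient:
    """Stand-in for a client whose constructor does blocking I/O,
    like the Fitbit ApiClient moved off the event loop here."""

    def __init__(self) -> None:
        time.sleep(0.01)  # simulated blocking setup
        self.ready = True


async def build_client() -> SlowClient:
    # Equivalent of hass.async_add_executor_job(ApiClient, configuration):
    # run the blocking constructor in a thread-pool executor so the
    # event loop stays responsive.
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(None, SlowClient)
```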

View File

@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/forecast_solar",
"integration_type": "service",
"iot_class": "cloud_polling",
-"requirements": ["forecast-solar==4.2.0"]
+"requirements": ["forecast-solar==5.0.0"]
}

View File

@@ -63,6 +63,7 @@ class FritzboxDataUpdateCoordinator(DataUpdateCoordinator[FritzboxCoordinatorDat
host=self.config_entry.data[CONF_HOST],
user=self.config_entry.data[CONF_USERNAME],
password=self.config_entry.data[CONF_PASSWORD],
+timeout=20,
)
try:

View File

@@ -7,7 +7,7 @@
"integration_type": "hub",
"iot_class": "local_polling",
"loggers": ["pyfritzhome"],
-"requirements": ["pyfritzhome==0.6.19"],
+"requirements": ["pyfritzhome==0.6.20"],
"ssdp": [
{
"st": "urn:schemas-upnp-org:device:fritzbox:1"

View File

@@ -18,7 +18,7 @@ from yarl import URL
from homeassistant.components import onboarding, websocket_api
from homeassistant.components.http import KEY_HASS, HomeAssistantView, StaticPathConfig
-from homeassistant.components.websocket_api import ActiveConnection
+from homeassistant.components.websocket_api import ERR_NOT_FOUND, ActiveConnection
from homeassistant.config import async_hass_config_yaml
from homeassistant.const import (
CONF_MODE,
@@ -78,6 +78,16 @@ THEMES_STORAGE_VERSION = 1
THEMES_SAVE_DELAY = 60
DATA_THEMES_STORE: HassKey[Store] = HassKey("frontend_themes_store")
DATA_THEMES: HassKey[dict[str, Any]] = HassKey("frontend_themes")
+PANELS_STORAGE_KEY = f"{DOMAIN}_panels"
+PANELS_STORAGE_VERSION = 1
+PANELS_SAVE_DELAY = 10
+DATA_PANELS_STORE: HassKey[Store[dict[str, dict[str, Any]]]] = HassKey(
+"frontend_panels_store"
+)
+DATA_PANELS_CONFIG: HassKey[dict[str, dict[str, Any]]] = HassKey(
+"frontend_panels_config"
+)
DATA_DEFAULT_THEME = "frontend_default_theme"
DATA_DEFAULT_DARK_THEME = "frontend_default_dark_theme"
DEFAULT_THEME = "default"
@@ -312,9 +322,11 @@ class Panel:
self.sidebar_default_visible = sidebar_default_visible
@callback
-def to_response(self) -> PanelResponse:
+def to_response(
+self, config_override: dict[str, Any] | None = None
+) -> PanelResponse:
"""Panel as dictionary."""
-return {
+response: PanelResponse = {
"component_name": self.component_name,
"icon": self.sidebar_icon,
"title": self.sidebar_title,
@@ -324,6 +336,18 @@ class Panel:
"require_admin": self.require_admin,
"config_panel_domain": self.config_panel_domain,
}
+if config_override:
+if "require_admin" in config_override:
+response["require_admin"] = config_override["require_admin"]
+if config_override.get("show_in_sidebar") is False:
+response["title"] = None
+response["icon"] = None
+else:
+if "icon" in config_override:
+response["icon"] = config_override["icon"]
+if "title" in config_override:
+response["title"] = config_override["title"]
+return response
@bind_hass
@@ -415,12 +439,24 @@ def _frontend_root(dev_repo_path: str | None) -> pathlib.Path:
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the serving of the frontend."""
await async_setup_frontend_storage(hass)
+panels_store = hass.data[DATA_PANELS_STORE] = Store[dict[str, dict[str, Any]]](
+hass, PANELS_STORAGE_VERSION, PANELS_STORAGE_KEY
+)
+loaded: Any = await panels_store.async_load()
+if not isinstance(loaded, dict):
+if loaded is not None:
+_LOGGER.warning("Ignoring invalid panel storage data")
+loaded = {}
+hass.data[DATA_PANELS_CONFIG] = loaded
websocket_api.async_register_command(hass, websocket_get_icons)
websocket_api.async_register_command(hass, websocket_get_panels)
websocket_api.async_register_command(hass, websocket_get_themes)
websocket_api.async_register_command(hass, websocket_get_translations)
websocket_api.async_register_command(hass, websocket_get_version)
websocket_api.async_register_command(hass, websocket_subscribe_extra_js)
+websocket_api.async_register_command(hass, websocket_update_panel)
hass.http.register_view(ManifestJSONView())
conf = config.get(DOMAIN, {})
@@ -559,6 +595,7 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
)
async_register_built_in_panel(hass, "profile")
async_register_built_in_panel(hass, "notfound")
@callback
def async_change_listener(
@@ -883,11 +920,18 @@ def websocket_get_panels(
) -> None:
"""Handle get panels command."""
user_is_admin = connection.user.is_admin
-panels = {
-panel_key: panel.to_response()
-for panel_key, panel in connection.hass.data[DATA_PANELS].items()
-if user_is_admin or not panel.require_admin
-}
+panels_config = hass.data[DATA_PANELS_CONFIG]
+panels: dict[str, PanelResponse] = {}
+for panel_key, panel in connection.hass.data[DATA_PANELS].items():
+config_override = panels_config.get(panel_key)
+require_admin = (
+config_override.get("require_admin", panel.require_admin)
+if config_override
+else panel.require_admin
+)
+if not user_is_admin and require_admin:
+continue
+panels[panel_key] = panel.to_response(config_override)
connection.send_message(websocket_api.result_message(msg["id"], panels))
@@ -986,6 +1030,50 @@ def websocket_subscribe_extra_js(
connection.send_message(websocket_api.result_message(msg["id"]))
+@websocket_api.websocket_command(
+{
+vol.Required("type"): "frontend/update_panel",
+vol.Required("url_path"): str,
+vol.Optional("title"): vol.Any(cv.string, None),
+vol.Optional("icon"): vol.Any(cv.icon, None),
+vol.Optional("require_admin"): vol.Any(cv.boolean, None),
+vol.Optional("show_in_sidebar"): vol.Any(cv.boolean, None),
+}
+)
+@websocket_api.require_admin
+@websocket_api.async_response
+async def websocket_update_panel(
+hass: HomeAssistant, connection: ActiveConnection, msg: dict[str, Any]
+) -> None:
+"""Handle update panel command."""
+url_path: str = msg["url_path"]
+if url_path not in hass.data.get(DATA_PANELS, {}):
+connection.send_error(msg["id"], ERR_NOT_FOUND, "Panel not found")
+return
+panels_config = hass.data[DATA_PANELS_CONFIG]
+panel_config = dict(panels_config.get(url_path, {}))
+for key in ("title", "icon", "require_admin", "show_in_sidebar"):
+if key in msg:
+if (value := msg[key]) is None:
+panel_config.pop(key, None)
+else:
+panel_config[key] = value
+if panel_config:
+panels_config[url_path] = panel_config
+else:
+panels_config.pop(url_path, None)
+hass.data[DATA_PANELS_STORE].async_delay_save(
+lambda: hass.data[DATA_PANELS_CONFIG], PANELS_SAVE_DELAY
+)
+hass.bus.async_fire(EVENT_PANELS_UPDATED)
+connection.send_result(msg["id"])
class PanelResponse(TypedDict):
"""Represent the panel response type."""

View File

@@ -7,5 +7,5 @@
"documentation": "https://www.home-assistant.io/integrations/generic",
"integration_type": "device",
"iot_class": "local_push",
-"requirements": ["av==16.0.1", "Pillow==12.0.0"]
+"requirements": ["av==16.0.1", "Pillow==12.1.1"]
}

View File

@@ -58,11 +58,12 @@ async def async_setup_entry(
class GeonetnzVolcanoSensor(SensorEntity):
"""Represents an external event with GeoNet NZ Volcano feed data."""
_attr_icon = DEFAULT_ICON
+_attr_native_unit_of_measurement = "alert level"
_attr_should_poll = False
def __init__(self, config_entry_id, feed_manager, external_id, unit_system):
"""Initialize entity with data from feed entry."""
self._config_entry_id = config_entry_id
self._feed_manager = feed_manager
self._external_id = external_id
self._attr_unique_id = f"{config_entry_id}_{external_id}"
@@ -71,8 +72,6 @@ class GeonetnzVolcanoSensor(SensorEntity):
self._distance = None
self._latitude = None
self._longitude = None
-self._attribution = None
-self._alert_level = None
self._activity = None
self._hazards = None
self._feed_last_update = None
@@ -124,7 +123,7 @@ class GeonetnzVolcanoSensor(SensorEntity):
self._latitude = round(feed_entry.coordinates[0], 5)
self._longitude = round(feed_entry.coordinates[1], 5)
self._attr_attribution = feed_entry.attribution
-self._alert_level = feed_entry.alert_level
+self._attr_native_value = feed_entry.alert_level
self._activity = feed_entry.activity
self._hazards = feed_entry.hazards
self._feed_last_update = dt_util.as_utc(last_update) if last_update else None
@@ -133,25 +132,10 @@ class GeonetnzVolcanoSensor(SensorEntity):
)
-@property
-def native_value(self):
-"""Return the state of the sensor."""
-return self._alert_level
-@property
-def icon(self):
-"""Return the icon to use in the frontend, if any."""
-return DEFAULT_ICON
@property
-def name(self) -> str | None:
+def name(self) -> str:
"""Return the name of the entity."""
return f"Volcano {self._title}"
-@property
-def native_unit_of_measurement(self):
-"""Return the unit of measurement."""
-return "alert level"
@property
def extra_state_attributes(self) -> dict[str, Any]:
"""Return the device state attributes."""

View File

@@ -15,7 +15,7 @@ from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryNotReady
-from .const import DISCOVERY_TIMEOUT
+from .const import DISCOVERY_TIMEOUT, DOMAIN
from .coordinator import GoveeLocalApiCoordinator, GoveeLocalConfigEntry
PLATFORMS: list[Platform] = [Platform.LIGHT]
@@ -52,7 +52,11 @@ async def async_setup_entry(hass: HomeAssistant, entry: GoveeLocalConfigEntry) -
_LOGGER.error("Start failed, errno: %d", ex.errno)
return False
_LOGGER.error("Port %s already in use", LISTENING_PORT)
-raise ConfigEntryNotReady from ex
+raise ConfigEntryNotReady(
+translation_domain=DOMAIN,
+translation_key="port_in_use",
+translation_placeholders={"port": LISTENING_PORT},
+) from ex
await coordinator.async_config_entry_first_refresh()
@@ -61,7 +65,9 @@ async def async_setup_entry(hass: HomeAssistant, entry: GoveeLocalConfigEntry) -
while not coordinator.devices:
await asyncio.sleep(delay=1)
except TimeoutError as ex:
-raise ConfigEntryNotReady from ex
+raise ConfigEntryNotReady(
+translation_domain=DOMAIN, translation_key="no_devices_found"
+) from ex
entry.runtime_data = coordinator
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)

View File

@@ -33,5 +33,13 @@
}
}
}
},
+"exceptions": {
+"no_devices_found": {
+"message": "[%key:common::config_flow::abort::no_devices_found%]"
+},
+"port_in_use": {
+"message": "Port {port} is already in use"
+}
+}
}

View File

@@ -4,16 +4,16 @@
"abort": {
"fw_download_failed": "{firmware_name} firmware for your {model} failed to download. Make sure Home Assistant has internet access and try again.",
"fw_install_failed": "{firmware_name} firmware failed to install, check Home Assistant logs for more information.",
-"not_hassio_thread": "The OpenThread Border Router add-on can only be installed with Home Assistant OS. If you would like to use the {model} as a Thread border router, please manually set up OpenThread Border Router to communicate with it.",
-"otbr_addon_already_running": "The OpenThread Border Router add-on is already running, it cannot be installed again.",
-"otbr_still_using_stick": "This {model} is in use by the OpenThread Border Router add-on. If you use the Thread network, make sure you have alternative border routers. Uninstall the add-on and try again.",
-"unsupported_firmware": "The radio firmware on your {model} could not be determined. Make sure that no other integration or add-on is currently trying to communicate with the device. If you are running Home Assistant OS in a virtual machine or in Docker, please make sure that permissions are set correctly for the device.",
+"not_hassio_thread": "The OpenThread Border Router app can only be installed with Home Assistant OS. If you would like to use the {model} as a Thread border router, please manually set up OpenThread Border Router to communicate with it.",
+"otbr_addon_already_running": "The OpenThread Border Router app is already running, it cannot be installed again.",
+"otbr_still_using_stick": "This {model} is in use by the OpenThread Border Router app. If you use the Thread network, make sure you have alternative border routers. Uninstall the app and try again.",
+"unsupported_firmware": "The radio firmware on your {model} could not be determined. Make sure that no other integration or app is currently trying to communicate with the device. If you are running Home Assistant OS in a virtual machine or in Docker, please make sure that permissions are set correctly for the device.",
"zha_still_using_stick": "This {model} is in use by the Zigbee Home Automation integration. Please migrate your Zigbee network to another adapter or delete the integration and try again."
},
"progress": {
"install_firmware": "Installing {firmware_name} firmware.\n\nDo not make any changes to your hardware or software until this finishes.",
-"install_otbr_addon": "Installing add-on",
-"start_otbr_addon": "Starting add-on"
+"install_otbr_addon": "Installing app",
+"start_otbr_addon": "Starting app"
},
"step": {
"confirm_otbr": {
@@ -34,7 +34,7 @@
"title": "Updating adapter"
},
"otbr_failed": {
-"description": "The OpenThread Border Router add-on installation was unsuccessful. Ensure no other software is trying to communicate with the {model}, you have access to the Internet and can install other add-ons, and try again. Check the Supervisor logs if the problem persists.",
+"description": "The OpenThread Border Router app installation was unsuccessful. Ensure no other software is trying to communicate with the {model}, you have access to the Internet and can install other apps, and try again. Check the Supervisor logs if the problem persists.",
"title": "Failed to set up OpenThread Border Router"
},
"pick_firmware": {
@@ -89,11 +89,11 @@
"silabs_multiprotocol_hardware": {
"options": {
"abort": {
-"addon_already_running": "Failed to start the {addon_name} add-on because it is already running.",
-"addon_info_failed": "Failed to get {addon_name} add-on info.",
-"addon_install_failed": "Failed to install the {addon_name} add-on.",
+"addon_already_running": "Failed to start the {addon_name} app because it is already running.",
+"addon_info_failed": "Failed to get {addon_name} app info.",
+"addon_install_failed": "Failed to install the {addon_name} app.",
"addon_set_config_failed": "Failed to set {addon_name} configuration.",
-"addon_start_failed": "Failed to start the {addon_name} add-on.",
+"addon_start_failed": "Failed to start the {addon_name} app.",
"not_hassio": "The hardware options can only be configured on Home Assistant OS installations.",
"zha_migration_failed": "The ZHA migration did not succeed."
},
@@ -101,8 +101,8 @@
"unknown": "[%key:common::config_flow::error::unknown%]"
},
"progress": {
-"install_addon": "Please wait while the {addon_name} add-on installation finishes. This can take several minutes.",
-"start_addon": "Please wait while the {addon_name} add-on start completes. This may take some seconds."
+"install_addon": "Please wait while the {addon_name} app installation finishes. This can take several minutes.",
+"start_addon": "Please wait while the {addon_name} app start completes. This may take some seconds."
},
"step": {
"addon_installed_other_device": {
@@ -129,7 +129,7 @@
"title": "[%key:component::homeassistant_hardware::silabs_multiprotocol_hardware::options::step::reconfigure_addon::title%]"
},
"install_addon": {
-"title": "The Silicon Labs Multiprotocol add-on installation has started"
+"title": "The Silicon Labs Multiprotocol app installation has started"
},
"notify_channel_change": {
"description": "A Zigbee and Thread channel change has been initiated and will finish in {delay_minutes} minutes.",
@@ -143,7 +143,7 @@
"title": "Reconfigure IEEE 802.15.4 radio multiprotocol support"
},
"start_addon": {
-"title": "The Silicon Labs Multiprotocol add-on is starting."
+"title": "The Silicon Labs Multiprotocol app is starting."
},
"uninstall_addon": {
"data": {

View File

@@ -25,7 +25,7 @@
"otbr_addon_already_running": "[%key:component::homeassistant_hardware::firmware_picker::options::abort::otbr_addon_already_running%]",
"otbr_still_using_stick": "[%key:component::homeassistant_hardware::firmware_picker::options::abort::otbr_still_using_stick%]",
"read_hw_settings_error": "Failed to read hardware settings",
-"unsupported_firmware": "The radio firmware on your {model} could not be determined. Make sure that no other integration or add-on is currently trying to communicate with the device.",
+"unsupported_firmware": "The radio firmware on your {model} could not be determined. Make sure that no other integration or app is currently trying to communicate with the device.",
"write_hw_settings_error": "Failed to write hardware settings",
"zha_migration_failed": "[%key:component::homeassistant_hardware::silabs_multiprotocol_hardware::options::abort::zha_migration_failed%]",
"zha_still_using_stick": "[%key:component::homeassistant_hardware::firmware_picker::options::abort::zha_still_using_stick%]"

View File

@@ -3,6 +3,7 @@
from datetime import datetime
from functools import partial
import logging
+from typing import Any
from pyhomematic import HMConnection
import voluptuous as vol
@@ -215,8 +216,11 @@ def setup(hass: HomeAssistant, config: ConfigType) -> bool:
hass.data[DATA_CONF] = remotes = {}
hass.data[DATA_STORE] = set()
+interfaces: dict[str, dict[str, Any]] = conf[CONF_INTERFACES]
+hosts: dict[str, dict[str, Any]] = conf[CONF_HOSTS]
# Create hosts-dictionary for pyhomematic
-for rname, rconfig in conf[CONF_INTERFACES].items():
+for rname, rconfig in interfaces.items():
remotes[rname] = {
"ip": rconfig.get(CONF_HOST),
"port": rconfig.get(CONF_PORT),
@@ -232,7 +236,7 @@ def setup(hass: HomeAssistant, config: ConfigType) -> bool:
"connect": True,
}
-for sname, sconfig in conf[CONF_HOSTS].items():
+for sname, sconfig in hosts.items():
remotes[sname] = {
"ip": sconfig.get(CONF_HOST),
"port": sconfig[CONF_PORT],
@@ -258,7 +262,7 @@ def setup(hass: HomeAssistant, config: ConfigType) -> bool:
hass.bus.listen_once(EVENT_HOMEASSISTANT_STOP, hass.data[DATA_HOMEMATIC].stop)
# Init homematic hubs
-entity_hubs = [HMHub(hass, homematic, hub_name) for hub_name in conf[CONF_HOSTS]]
+entity_hubs = [HMHub(hass, homematic, hub_name) for hub_name in hosts]
def _hm_service_virtualkey(service: ServiceCall) -> None:
"""Service to handle virtualkey servicecalls."""
@@ -294,7 +298,7 @@ def setup(hass: HomeAssistant, config: ConfigType) -> bool:
def _service_handle_value(service: ServiceCall) -> None:
"""Service to call setValue method for HomeMatic system variable."""
-entity_ids = service.data.get(ATTR_ENTITY_ID)
+entity_ids: list[str] | None = service.data.get(ATTR_ENTITY_ID)
name = service.data[ATTR_NAME]
value = service.data[ATTR_VALUE]

View File

@@ -11,6 +11,7 @@ from pyhomematic import HMConnection
from pyhomematic.devicetypes.generic import HMGeneric
from homeassistant.const import ATTR_NAME
from homeassistant.core import HomeAssistant
+from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.entity import Entity, EntityDescription
from homeassistant.helpers.event import track_time_interval
@@ -45,15 +46,16 @@ class HMDevice(Entity):
entity_description: EntityDescription | None = None,
) -> None:
"""Initialize a generic HomeMatic device."""
-self._name = config.get(ATTR_NAME)
+self._attr_name = config.get(ATTR_NAME)
self._address = config.get(ATTR_ADDRESS)
self._interface = config.get(ATTR_INTERFACE)
self._channel = config.get(ATTR_CHANNEL)
self._state = config.get(ATTR_PARAM)
-self._unique_id = config.get(ATTR_UNIQUE_ID)
+if unique_id := config.get(ATTR_UNIQUE_ID):
+self._attr_unique_id = unique_id.replace(" ", "_")
self._data: dict[str, Any] = {}
self._connected = False
-self._available = False
+self._attr_available = False
self._channel_map: dict[str, str] = {}
if entity_description is not None:
@@ -67,21 +69,6 @@ class HMDevice(Entity):
"""Load data init callbacks."""
self._subscribe_homematic_events()
-@property
-def unique_id(self):
-"""Return unique ID. HomeMatic entity IDs are unique by default."""
-return self._unique_id.replace(" ", "_")
-@property
-def name(self):
-"""Return the name of the device."""
-return self._name
-@property
-def available(self) -> bool:
-"""Return true if device is available."""
-return self._available
@property
def extra_state_attributes(self) -> dict[str, Any]:
"""Return device specific state attributes."""
@@ -116,7 +103,7 @@ class HMDevice(Entity):
self._load_data_from_hm()
# Link events from pyhomematic
-self._available = not self._hmdevice.UNREACH
+self._attr_available = not self._hmdevice.UNREACH
except Exception as err: # noqa: BLE001
self._connected = False
_LOGGER.error("Exception while linking %s: %s", self._address, str(err))
@@ -132,7 +119,7 @@ class HMDevice(Entity):
# Availability has changed
if self.available != (not self._hmdevice.UNREACH):
-self._available = not self._hmdevice.UNREACH
+self._attr_available = not self._hmdevice.UNREACH
has_changed = True
# If it has changed data point, update Home Assistant
@@ -213,14 +200,14 @@ class HMHub(Entity):
_attr_should_poll = False
-def __init__(self, hass, homematic, name):
+def __init__(self, hass: HomeAssistant, homematic: HMConnection, name: str) -> None:
"""Initialize HomeMatic hub."""
self.hass = hass
self.entity_id = f"{DOMAIN}.{name.lower()}"
self._homematic = homematic
-self._variables = {}
+self._variables: dict[str, Any] = {}
self._name = name
-self._state = None
+self._state: int | None = None
# Load data
track_time_interval(self.hass, self._update_hub, SCAN_INTERVAL_HUB)
@@ -230,12 +217,12 @@ class HMHub(Entity):
self.hass.add_job(self._update_variables, None)
@property
-def name(self):
+def name(self) -> str:
"""Return the name of the device."""
return self._name
@property
-def state(self):
+def state(self) -> int | None:
"""Return the state of the entity."""
return self._state
@@ -245,7 +232,7 @@ class HMHub(Entity):
return self._variables.copy()
@property
-def icon(self):
+def icon(self) -> str:
"""Return the icon to use in the frontend, if any."""
return "mdi:gradient-vertical"

View File

@@ -344,4 +344,4 @@ class HMSensor(HMDevice, SensorEntity):
if self._state:
self._data.update({self._state: None})
else:
-_LOGGER.critical("Unable to initialize sensor: %s", self._name)
+_LOGGER.critical("Unable to initialize sensor: %s", self.name)

View File

@@ -7,5 +7,5 @@
"integration_type": "hub",
"iot_class": "cloud_polling",
"loggers": ["keyrings.alt", "pyicloud"],
-"requirements": ["pyicloud==2.3.0"]
+"requirements": ["pyicloud==2.4.1"]
}

View File

@@ -329,14 +329,14 @@ class IDriveE2BackupAgent(BackupAgent):
return self._backup_cache
backups = {}
-response = await cast(Any, self._client).list_objects_v2(Bucket=self._bucket)
-# Filter for metadata files only
-metadata_files = [
-obj
-for obj in response.get("Contents", [])
-if obj["Key"].endswith(".metadata.json")
-]
+paginator = self._client.get_paginator("list_objects_v2")
+metadata_files: list[dict[str, Any]] = []
+async for page in paginator.paginate(Bucket=self._bucket):
+metadata_files.extend(
+obj
+for obj in page.get("Contents", [])
+if obj["Key"].endswith(".metadata.json")
+)
for metadata_file in metadata_files:
try:
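The IDrive change above matters because a single `list_objects_v2` call returns at most 1000 keys; a paginator walks continuation tokens so every object is seen. A minimal sketch of the same accumulation pattern against a fake paginator (so no S3 client is needed; `FakePaginator` and `list_metadata` are ours for illustration):

```python
import asyncio
from typing import Any


class FakePaginator:
    """Stand-in for an aiobotocore list_objects_v2 paginator: yields
    pages of up to `page_size` keys, like S3's capped responses."""

    def __init__(self, keys: list[str], page_size: int = 2) -> None:
        self._keys = keys
        self._page_size = page_size

    async def paginate(self, **_: Any):
        for i in range(0, len(self._keys), self._page_size):
            chunk = self._keys[i : i + self._page_size]
            yield {"Contents": [{"Key": k} for k in chunk]}


async def list_metadata(paginator: FakePaginator) -> list[str]:
    # Same shape as the diff: walk every page and keep only the
    # *.metadata.json objects.
    found: list[str] = []
    async for page in paginator.paginate(Bucket="backups"):
        found.extend(
            obj["Key"]
            for obj in page.get("Contents", [])
            if obj["Key"].endswith(".metadata.json")
        )
    return found
```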

View File

@@ -68,7 +68,7 @@ class IGloLamp(LightEntity):
return self._name
@property
-def brightness(self):
+def brightness(self) -> int:
"""Return the brightness of this light between 0..255."""
return int((self._lamp.state()["brightness"] / 200.0) * 255)
@@ -97,17 +97,17 @@ class IGloLamp(LightEntity):
return self._lamp.min_kelvin
@property
-def hs_color(self):
+def hs_color(self) -> tuple[float, float]:
"""Return the hs value."""
return color_util.color_RGB_to_hs(*self._lamp.state()["rgb"])
@property
-def effect(self):
+def effect(self) -> str:
"""Return the current effect."""
return self._lamp.state()["effect"]
@property
-def effect_list(self):
+def effect_list(self) -> list[str]:
"""Return the list of supported effects."""
return self._lamp.effect_list()

View File

@@ -1,6 +1,7 @@
"""Implementation of the lock platform."""
from datetime import timedelta
+from typing import Any
from aiohttp import ClientError
from igloohome_api import (
@@ -63,7 +64,7 @@ class IgloohomeLockEntity(IgloohomeBaseEntity, LockEntity):
)
self.bridge_id = bridge_id
-async def async_lock(self, **kwargs):
+async def async_lock(self, **kwargs: Any) -> None:
"""Lock this lock."""
try:
await self.api.create_bridge_proxied_job(
@@ -72,7 +73,7 @@ class IgloohomeLockEntity(IgloohomeBaseEntity, LockEntity):
except (ApiException, ClientError) as err:
raise HomeAssistantError from err
-async def async_unlock(self, **kwargs):
+async def async_unlock(self, **kwargs: Any) -> None:
"""Unlock this lock."""
try:
await self.api.create_bridge_proxied_job(
@@ -81,7 +82,7 @@ class IgloohomeLockEntity(IgloohomeBaseEntity, LockEntity):
except (ApiException, ClientError) as err:
raise HomeAssistantError from err
-async def async_open(self, **kwargs):
+async def async_open(self, **kwargs: Any) -> None:
"""Open (unlatch) this lock."""
try:
await self.api.create_bridge_proxied_job(


@@ -7,5 +7,5 @@
   "documentation": "https://www.home-assistant.io/integrations/image_upload",
   "integration_type": "system",
   "quality_scale": "internal",
-  "requirements": ["Pillow==12.0.0"]
+  "requirements": ["Pillow==12.1.1"]
 }


@@ -310,7 +310,7 @@ class InputDatetime(collection.CollectionEntity, RestoreEntity):
         return self._config[CONF_HAS_TIME]

     @property
-    def icon(self):
+    def icon(self) -> str | None:
         """Return the icon to be used for this entity."""
         return self._config.get(CONF_ICON)


@@ -243,7 +243,7 @@ class InputNumber(collection.CollectionEntity, RestoreEntity):
         return self._config.get(CONF_NAME)

     @property
-    def icon(self):
+    def icon(self) -> str | None:
         """Return the icon to be used for this entity."""
         return self._config.get(CONF_ICON)


@@ -61,7 +61,7 @@ class InsteonDimmerEntity(InsteonEntity, LightEntity):
         self._attr_supported_color_modes = {ColorMode.ONOFF}

     @property
-    def brightness(self):
+    def brightness(self) -> int:
         """Return the brightness of this light between 0..255."""
         return self._insteon_device_group.value


@@ -451,7 +451,7 @@ class AirPlayDevice(MediaPlayerEntity):
         return self.device_name

     @property
-    def icon(self):
+    def icon(self) -> str:
         """Return the icon to use in the frontend, if any."""
         if self.selected is True:
             return "mdi:volume-high"

@@ -6,7 +6,11 @@ import logging

 from homeassistant.const import Platform
 from homeassistant.core import HomeAssistant
-from homeassistant.helpers import device_registry as dr, entity_registry as er
+from homeassistant.helpers import (
+    device_registry as dr,
+    entity_registry as er,
+    issue_registry as ir,
+)

 from .const import DOMAIN
 from .coordinator import (
@@ -80,6 +84,21 @@ async def async_setup_entry(
     lhm_coordinator = LibreHardwareMonitorCoordinator(hass, config_entry)
     await lhm_coordinator.async_config_entry_first_refresh()

+    if lhm_coordinator.data.is_deprecated_version:
+        issue_id = f"deprecated_api_{config_entry.entry_id}"
+        ir.async_create_issue(
+            hass,
+            DOMAIN,
+            issue_id,
+            breaks_in_ha_version="2026.9.0",
+            is_fixable=False,
+            severity=ir.IssueSeverity.WARNING,
+            translation_key="deprecated_api",
+            translation_placeholders={
+                "lhm_releases_url": "https://github.com/LibreHardwareMonitor/LibreHardwareMonitor/releases"
+            },
+        )
+
     config_entry.runtime_data = lhm_coordinator

     await hass.config_entries.async_forward_entry_setups(config_entry, PLATFORMS)


@@ -21,7 +21,7 @@ from homeassistant.config_entries import ConfigEntry
 from homeassistant.const import CONF_HOST, CONF_PASSWORD, CONF_PORT, CONF_USERNAME
 from homeassistant.core import HomeAssistant
 from homeassistant.exceptions import ConfigEntryAuthFailed
-from homeassistant.helpers import device_registry as dr
+from homeassistant.helpers import device_registry as dr, issue_registry as ir
 from homeassistant.helpers.aiohttp_client import async_create_clientsession
 from homeassistant.helpers.device_registry import DeviceEntry
 from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
@@ -50,7 +50,7 @@ class LibreHardwareMonitorCoordinator(DataUpdateCoordinator[LibreHardwareMonitor
             config_entry=config_entry,
             update_interval=timedelta(seconds=DEFAULT_SCAN_INTERVAL),
         )
+        self._entry_id = config_entry.entry_id
         self._api = LibreHardwareMonitorClient(
             host=config_entry.data[CONF_HOST],
             port=config_entry.data[CONF_PORT],
@@ -59,13 +59,14 @@ class LibreHardwareMonitorCoordinator(DataUpdateCoordinator[LibreHardwareMonitor
             session=async_create_clientsession(hass),
         )
         device_entries: list[DeviceEntry] = dr.async_entries_for_config_entry(
-            registry=dr.async_get(self.hass), config_entry_id=config_entry.entry_id
+            registry=dr.async_get(self.hass), config_entry_id=self._entry_id
         )
         self._previous_devices: dict[DeviceId, DeviceName] = {
             DeviceId(next(iter(device.identifiers))[1]): DeviceName(device.name)
             for device in device_entries
             if device.identifiers and device.name
         }
+        self._is_deprecated_version: bool | None = None

     async def _async_update_data(self) -> LibreHardwareMonitorData:
         try:
@@ -80,6 +81,12 @@ class LibreHardwareMonitorCoordinator(DataUpdateCoordinator[LibreHardwareMonitor
         except LibreHardwareMonitorNoDevicesError as err:
             raise UpdateFailed("No sensor data available, will retry") from err

+        # Check whether user has upgraded LHM from a deprecated version while the integration is running
+        if self._is_deprecated_version and not lhm_data.is_deprecated_version:
+            # Clear deprecation issue
+            ir.async_delete_issue(self.hass, DOMAIN, f"deprecated_api_{self._entry_id}")
+        self._is_deprecated_version = lhm_data.is_deprecated_version
+
         await self._async_handle_changes_in_devices(
             dict(lhm_data.main_device_ids_and_names)
         )
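The upgrade check in the coordinator diff above clears the repair issue only on a True-to-False transition of the deprecated flag, never on the first refresh. A minimal standalone sketch of that transition logic (the class and method names here are illustrative, not part of the integration):

```python
# Sketch: clear an issue only when the deprecated flag transitions True -> False,
# mirroring the _async_update_data logic in the diff above.
class DeprecationTracker:
    def __init__(self) -> None:
        # None means "no refresh seen yet", matching the coordinator's initial state.
        self._was_deprecated: bool | None = None

    def update(self, is_deprecated: bool) -> bool:
        """Record the latest flag; return True when the issue should be cleared."""
        should_clear = bool(self._was_deprecated) and not is_deprecated
        self._was_deprecated = is_deprecated
        return should_clear
```

On the very first update nothing is cleared even if the version is current, because the previous state is unknown.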


@@ -7,5 +7,5 @@
   "integration_type": "device",
   "iot_class": "local_polling",
   "quality_scale": "silver",
-  "requirements": ["librehardwaremonitor-api==1.9.1"]
+  "requirements": ["librehardwaremonitor-api==1.10.1"]
 }


@@ -5,6 +5,7 @@ from __future__ import annotations
 from typing import Any

 from librehardwaremonitor_api.model import LibreHardwareMonitorSensorData
+from librehardwaremonitor_api.sensor_type import SensorType

 from homeassistant.components.sensor import SensorEntity, SensorStateClass
 from homeassistant.core import HomeAssistant, callback
@@ -53,12 +54,8 @@ class LibreHardwareMonitorSensor(
         super().__init__(coordinator)

         self._attr_name: str = sensor_data.name
-        self._attr_native_value: str | None = sensor_data.value
-        self._attr_extra_state_attributes: dict[str, Any] = {
-            STATE_MIN_VALUE: sensor_data.min,
-            STATE_MAX_VALUE: sensor_data.max,
-        }
-        self._attr_native_unit_of_measurement = sensor_data.unit
+        self._set_state(coordinator.data.is_deprecated_version, sensor_data)
         self._attr_unique_id: str = f"{entry_id}_{sensor_data.sensor_id}"
         self._sensor_id: str = sensor_data.sensor_id
@@ -70,15 +67,36 @@ class LibreHardwareMonitorSensor(
             model=sensor_data.device_type,
         )

+    def _set_state(
+        self,
+        is_deprecated_lhm_version: bool,
+        sensor_data: LibreHardwareMonitorSensorData,
+    ) -> None:
+        value = sensor_data.value
+        min_value = sensor_data.min
+        max_value = sensor_data.max
+        unit = sensor_data.unit
+        if not is_deprecated_lhm_version and sensor_data.type == SensorType.THROUGHPUT:
+            # Temporary fix: convert the B/s value to KB/s to not break existing entries
+            # This will be migrated properly once SensorDeviceClass is introduced
+            value = f"{(float(value) / 1024):.1f}" if value else None
+            min_value = f"{(float(min_value) / 1024):.1f}" if min_value else None
+            max_value = f"{(float(max_value) / 1024):.1f}" if max_value else None
+            unit = "KB/s"
+        self._attr_native_value: str | None = value
+        self._attr_extra_state_attributes: dict[str, Any] = {
+            STATE_MIN_VALUE: min_value,
+            STATE_MAX_VALUE: max_value,
+        }
+        self._attr_native_unit_of_measurement = unit
+
     @callback
     def _handle_coordinator_update(self) -> None:
         """Handle updated data from the coordinator."""
         if sensor_data := self.coordinator.data.sensor_data.get(self._sensor_id):
-            self._attr_native_value = sensor_data.value
-            self._attr_extra_state_attributes = {
-                STATE_MIN_VALUE: sensor_data.min,
-                STATE_MAX_VALUE: sensor_data.max,
-            }
+            self._set_state(self.coordinator.data.is_deprecated_version, sensor_data)
         else:
            self._attr_native_value = None


@@ -33,5 +33,11 @@
       }
     }
-  }
+  },
+  "issues": {
+    "deprecated_api": {
+      "description": "Your version of Libre Hardware Monitor is deprecated and may not provide stable sensor data. To fix this issue:\n\n1. Download version 0.9.5 or later from {lhm_releases_url}\n2. Close Libre Hardware Monitor on your computer\n3. Install or extract the new version and start Libre Hardware Monitor again (you might have to re-enable the remote web server)\n4. Home Assistant will detect the new version and this issue will clear automatically",
+      "title": "Deprecated Libre Hardware Monitor version"
+    }
+  }
 }


@@ -6,7 +6,7 @@ from collections.abc import Callable, Coroutine
 from dataclasses import dataclass
 from typing import Any, Generic

-from pylitterbot import FeederRobot, LitterRobot3, LitterRobot4, Robot
+from pylitterbot import FeederRobot, LitterRobot3, LitterRobot4, LitterRobot5, Robot

 from homeassistant.components.button import ButtonEntity, ButtonEntityDescription
 from homeassistant.const import EntityCategory
@@ -24,20 +24,24 @@ class RobotButtonEntityDescription(ButtonEntityDescription, Generic[_WhiskerEnti
     press_fn: Callable[[_WhiskerEntityT], Coroutine[Any, Any, bool]]


-ROBOT_BUTTON_MAP: dict[type[Robot], RobotButtonEntityDescription] = {
-    LitterRobot3: RobotButtonEntityDescription[LitterRobot3](
+ROBOT_BUTTON_MAP: dict[tuple[type[Robot], ...], RobotButtonEntityDescription] = {
+    (LitterRobot3, LitterRobot5): RobotButtonEntityDescription[
+        LitterRobot3 | LitterRobot5
+    ](
         key="reset_waste_drawer",
         translation_key="reset_waste_drawer",
         entity_category=EntityCategory.CONFIG,
         press_fn=lambda robot: robot.reset_waste_drawer(),
     ),
-    LitterRobot4: RobotButtonEntityDescription[LitterRobot4](
+    (LitterRobot4, LitterRobot5): RobotButtonEntityDescription[
+        LitterRobot4 | LitterRobot5
+    ](
         key="reset",
         translation_key="reset",
         entity_category=EntityCategory.CONFIG,
         press_fn=lambda robot: robot.reset(),
     ),
-    FeederRobot: RobotButtonEntityDescription[FeederRobot](
+    (FeederRobot,): RobotButtonEntityDescription[FeederRobot](
         key="give_snack",
         translation_key="give_snack",
         press_fn=lambda robot: robot.give_snack(),
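Keying the map by tuples of robot classes (instead of a single class) lets one description apply to several models, since `isinstance` accepts a tuple of classes. A self-contained sketch of that dispatch pattern, with stand-in classes and a simplified description type rather than the real pylitterbot/Home Assistant APIs:

```python
# Sketch: dispatch descriptions keyed by tuples of robot classes.
# The classes and ButtonDescription below are illustrative stand-ins.
from dataclasses import dataclass


class Robot: ...
class LitterRobot3(Robot): ...
class LitterRobot4(Robot): ...
class LitterRobot5(Robot): ...
class FeederRobot(Robot): ...


@dataclass(frozen=True)
class ButtonDescription:
    key: str


BUTTON_MAP: dict[tuple[type[Robot], ...], ButtonDescription] = {
    (LitterRobot3, LitterRobot5): ButtonDescription(key="reset_waste_drawer"),
    (LitterRobot4, LitterRobot5): ButtonDescription(key="reset"),
    (FeederRobot,): ButtonDescription(key="give_snack"),
}


def descriptions_for(robot: Robot) -> list[ButtonDescription]:
    """Collect every description whose class tuple matches the robot."""
    # isinstance accepts a tuple of classes, so tuple keys need no special casing.
    return [desc for classes, desc in BUTTON_MAP.items() if isinstance(robot, classes)]
```

A LitterRobot5 instance matches both tuples, so it picks up both buttons; the single-class FeederRobot entry still works as a one-element tuple.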


@@ -13,5 +13,5 @@
   "iot_class": "cloud_push",
   "loggers": ["pylitterbot"],
   "quality_scale": "bronze",
-  "requirements": ["pylitterbot==2025.0.0"]
+  "requirements": ["pylitterbot==2025.1.0"]
 }


@@ -6,7 +6,7 @@ from collections.abc import Callable, Coroutine
 from dataclasses import dataclass
 from typing import Any, Generic, TypeVar

-from pylitterbot import FeederRobot, LitterRobot, LitterRobot4, Robot
+from pylitterbot import FeederRobot, LitterRobot, LitterRobot4, LitterRobot5, Robot
 from pylitterbot.robot.litterrobot4 import BrightnessLevel, NightLightMode

 from homeassistant.components.select import SelectEntity, SelectEntityDescription
@@ -32,9 +32,11 @@ class RobotSelectEntityDescription(
     select_fn: Callable[[_WhiskerEntityT, str], Coroutine[Any, Any, bool]]


-ROBOT_SELECT_MAP: dict[type[Robot], tuple[RobotSelectEntityDescription, ...]] = {
+ROBOT_SELECT_MAP: dict[
+    type[Robot] | tuple[type[Robot], ...], tuple[RobotSelectEntityDescription, ...]
+] = {
     LitterRobot: (
-        RobotSelectEntityDescription[LitterRobot, int](  # type: ignore[type-abstract] # only used for isinstance check
+        RobotSelectEntityDescription[LitterRobot, int](
             key="cycle_delay",
             translation_key="cycle_delay",
             unit_of_measurement=UnitOfTime.MINUTES,
@@ -43,8 +45,8 @@ ROBOT_SELECT_MAP: dict[type[Robot], tuple[RobotSelectEntityDescription, ...]] =
             select_fn=lambda robot, opt: robot.set_wait_time(int(opt)),
         ),
     ),
-    LitterRobot4: (
-        RobotSelectEntityDescription[LitterRobot4, str](
+    (LitterRobot4, LitterRobot5): (
+        RobotSelectEntityDescription[LitterRobot4 | LitterRobot5, str](
             key="globe_brightness",
             translation_key="globe_brightness",
             current_fn=(
@@ -61,7 +63,7 @@ ROBOT_SELECT_MAP: dict[type[Robot], tuple[RobotSelectEntityDescription, ...]] =
                 )
             ),
         ),
-        RobotSelectEntityDescription[LitterRobot4, str](
+        RobotSelectEntityDescription[LitterRobot4 | LitterRobot5, str](
             key="globe_light",
             translation_key="globe_light",
             current_fn=(
@@ -78,7 +80,7 @@ ROBOT_SELECT_MAP: dict[type[Robot], tuple[RobotSelectEntityDescription, ...]] =
                 )
             ),
         ),
-        RobotSelectEntityDescription[LitterRobot4, str](
+        RobotSelectEntityDescription[LitterRobot4 | LitterRobot5, str](
             key="panel_brightness",
             translation_key="brightness_level",
             current_fn=(


@@ -7,7 +7,7 @@ from dataclasses import dataclass
 from datetime import datetime
 from typing import Any, Generic

-from pylitterbot import FeederRobot, LitterRobot, LitterRobot4, Pet, Robot
+from pylitterbot import FeederRobot, LitterRobot, LitterRobot4, LitterRobot5, Pet, Robot

 from homeassistant.components.sensor import (
     SensorDeviceClass,
@@ -44,8 +44,10 @@ class RobotSensorEntityDescription(SensorEntityDescription, Generic[_WhiskerEnti
     value_fn: Callable[[_WhiskerEntityT], float | datetime | str | None]


-ROBOT_SENSOR_MAP: dict[type[Robot], list[RobotSensorEntityDescription]] = {
-    LitterRobot: [  # type: ignore[type-abstract] # only used for isinstance check
+ROBOT_SENSOR_MAP: dict[
+    type[Robot] | tuple[type[Robot], ...], list[RobotSensorEntityDescription]
+] = {
+    LitterRobot: [
         RobotSensorEntityDescription[LitterRobot](
             key="waste_drawer_level",
             translation_key="waste_drawer",
@@ -145,7 +147,9 @@ ROBOT_SENSOR_MAP: dict[type[Robot], list[RobotSensorEntityDescription]] = {
                 )
             ),
         ),
-        RobotSensorEntityDescription[LitterRobot4](
+    ],
+    (LitterRobot4, LitterRobot5): [
+        RobotSensorEntityDescription[LitterRobot4 | LitterRobot5](
             key="litter_level",
             translation_key="litter_level",
             native_unit_of_measurement=PERCENTAGE,
@@ -153,7 +157,7 @@ ROBOT_SENSOR_MAP: dict[type[Robot], list[RobotSensorEntityDescription]] = {
             state_class=SensorStateClass.MEASUREMENT,
             value_fn=lambda robot: robot.litter_level,
         ),
-        RobotSensorEntityDescription[LitterRobot4](
+        RobotSensorEntityDescription[LitterRobot4 | LitterRobot5](
             key="pet_weight",
             translation_key="pet_weight",
             native_unit_of_measurement=UnitOfMass.POUNDS,


@@ -107,36 +107,20 @@ class APIData:
 class AirSensor(SensorEntity):
     """Single authority air sensor."""

-    ICON = "mdi:cloud-outline"
+    _attr_icon = "mdi:cloud-outline"

     def __init__(self, name, api_data):
         """Initialize the sensor."""
-        self._name = name
+        self._attr_name = self._key = name
         self._api_data = api_data
         self._site_data = None
-        self._state = None
         self._updated = None

-    @property
-    def name(self):
-        """Return the name of the sensor."""
-        return self._name
-
-    @property
-    def native_value(self):
-        """Return the state of the sensor."""
-        return self._state
-
-    @property
-    def site_data(self):
-        """Return the dict of sites data."""
-        return self._site_data
-
-    @property
-    def icon(self):
-        """Icon to use in the frontend, if any."""
-        return self.ICON
-
     @property
     def extra_state_attributes(self) -> dict[str, Any]:
         """Return other details about the sensor state."""
@@ -151,7 +135,7 @@ class AirSensor(SensorEntity):
         sites_status: list = []
         self._api_data.update()
         if self._api_data.data:
-            self._site_data = self._api_data.data[self._name]
+            self._site_data = self._api_data.data[self._key]
             self._updated = self._site_data[0]["updated"]
             sites_status.extend(
                 site["pollutants_status"]
@@ -160,9 +144,9 @@
             )
         if sites_status:
-            self._state = max(set(sites_status), key=sites_status.count)
+            self._attr_native_value = max(set(sites_status), key=sites_status.count)
         else:
-            self._state = None
+            self._attr_native_value = None


 def parse_species(species_data):


@@ -6,5 +6,5 @@
   "iot_class": "cloud_push",
   "loggers": ["matrix_client"],
   "quality_scale": "legacy",
-  "requirements": ["matrix-nio==0.25.2", "Pillow==12.0.0", "aiofiles==24.1.0"]
+  "requirements": ["matrix-nio==0.25.2", "Pillow==12.1.1", "aiofiles==24.1.0"]
 }


@@ -498,6 +498,7 @@ DISCOVERY_SCHEMAS = [
         required_attributes=(
             custom_clusters.InovelliCluster.Attributes.LEDIndicatorIntensityOff,
         ),
+        product_id=(2, 16),
     ),
     MatterDiscoverySchema(
         platform=Platform.NUMBER,
@@ -514,6 +515,7 @@ DISCOVERY_SCHEMAS = [
         required_attributes=(
             custom_clusters.InovelliCluster.Attributes.LEDIndicatorIntensityOn,
         ),
+        product_id=(2, 16),
     ),
     MatterDiscoverySchema(
         platform=Platform.NUMBER,


@@ -908,6 +908,7 @@ DISCOVERY_SCHEMAS = [
         required_attributes=(
             clusters.ElectricalPowerMeasurement.Attributes.ApparentPower,
         ),
+        allow_none_value=True,
     ),
     MatterDiscoverySchema(
         platform=Platform.SENSOR,
@@ -924,6 +925,7 @@
         required_attributes=(
             clusters.ElectricalPowerMeasurement.Attributes.ReactivePower,
         ),
+        allow_none_value=True,
     ),
     MatterDiscoverySchema(
         platform=Platform.SENSOR,
@@ -939,6 +941,7 @@
         ),
         entity_class=MatterSensor,
         required_attributes=(clusters.ElectricalPowerMeasurement.Attributes.Voltage,),
+        allow_none_value=True,
     ),
     MatterDiscoverySchema(
         platform=Platform.SENSOR,
@@ -956,6 +959,7 @@
         required_attributes=(
             clusters.ElectricalPowerMeasurement.Attributes.RMSVoltage,
         ),
+        allow_none_value=True,
     ),
     MatterDiscoverySchema(
         platform=Platform.SENSOR,
@@ -973,6 +977,7 @@
         required_attributes=(
             clusters.ElectricalPowerMeasurement.Attributes.ApparentCurrent,
         ),
+        allow_none_value=True,
     ),
     MatterDiscoverySchema(
         platform=Platform.SENSOR,
@@ -990,6 +995,7 @@
         required_attributes=(
             clusters.ElectricalPowerMeasurement.Attributes.ActiveCurrent,
         ),
+        allow_none_value=True,
     ),
     MatterDiscoverySchema(
         platform=Platform.SENSOR,
@@ -1007,6 +1013,7 @@
         required_attributes=(
             clusters.ElectricalPowerMeasurement.Attributes.ReactiveCurrent,
         ),
+        allow_none_value=True,
     ),
     MatterDiscoverySchema(
         platform=Platform.SENSOR,
@@ -1024,6 +1031,7 @@
         required_attributes=(
             clusters.ElectricalPowerMeasurement.Attributes.RMSCurrent,
         ),
+        allow_none_value=True,
     ),
     MatterDiscoverySchema(
         platform=Platform.SENSOR,
@@ -1039,6 +1047,7 @@
             device_to_ha=lambda x: x.energy,
         ),
         entity_class=MatterSensor,
+        allow_none_value=True,
         required_attributes=(
             clusters.ElectricalEnergyMeasurement.Attributes.CumulativeEnergyImported,
         ),
@@ -1058,6 +1067,7 @@
             device_to_ha=lambda x: x.energy,
         ),
         entity_class=MatterSensor,
+        allow_none_value=True,
         required_attributes=(
             clusters.ElectricalEnergyMeasurement.Attributes.CumulativeEnergyExported,
         ),


@@ -1,14 +1,14 @@
 {
   "config": {
     "abort": {
-      "addon_get_discovery_info_failed": "Failed to get Matter Server add-on discovery info.",
-      "addon_info_failed": "Failed to get Matter Server add-on info.",
-      "addon_install_failed": "Failed to install the Matter Server add-on.",
-      "addon_start_failed": "Failed to start the Matter Server add-on.",
+      "addon_get_discovery_info_failed": "Failed to get Matter Server app discovery info.",
+      "addon_info_failed": "Failed to get Matter Server app info.",
+      "addon_install_failed": "Failed to install the Matter Server app.",
+      "addon_start_failed": "Failed to start the Matter Server app.",
       "already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
       "already_in_progress": "[%key:common::config_flow::abort::already_in_progress%]",
       "cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
-      "not_matter_addon": "Discovered add-on is not the official Matter Server add-on.",
+      "not_matter_addon": "Discovered app is not the official Matter Server app.",
       "reconfiguration_successful": "Successfully reconfigured the Matter integration."
     },
     "error": {
@@ -18,15 +18,15 @@
     },
     "flow_title": "{name}",
     "progress": {
-      "install_addon": "Please wait while the Matter Server add-on installation finishes. This can take several minutes.",
-      "start_addon": "Please wait while the Matter Server add-on starts. This add-on is what powers Matter in Home Assistant. This may take some seconds."
+      "install_addon": "Please wait while the Matter Server app installation finishes. This can take several minutes.",
+      "start_addon": "Please wait while the Matter Server app starts. This app is what powers Matter in Home Assistant. This may take some seconds."
     },
     "step": {
       "hassio_confirm": {
-        "title": "Set up the Matter integration with the Matter Server add-on"
+        "title": "Set up the Matter integration with the Matter Server app"
       },
       "install_addon": {
-        "title": "The add-on installation has started"
+        "title": "The app installation has started"
       },
       "manual": {
         "data": {
@@ -35,13 +35,13 @@
       },
       "on_supervisor": {
         "data": {
-          "use_addon": "Use the official Matter Server Supervisor add-on"
+          "use_addon": "Use the official Matter Server Supervisor app"
         },
-        "description": "Do you want to use the official Matter Server Supervisor add-on?\n\nIf you are already running the Matter Server in another add-on, in a custom container, natively etc., then do not select this option.",
+        "description": "Do you want to use the official Matter Server Supervisor app?\n\nIf you are already running the Matter Server in another app, in a custom container, natively etc., then do not select this option.",
         "title": "Select connection method"
       },
       "start_addon": {
-        "title": "Starting add-on."
+        "title": "Starting app."
       }
     }
   },


@@ -46,30 +46,42 @@ async def async_setup_entry(
 class MatterSwitchEntityDescription(SwitchEntityDescription, MatterEntityDescription):
     """Describe Matter Switch entities."""

+    inverted: bool = False
+

 class MatterSwitch(MatterEntity, SwitchEntity):
     """Representation of a Matter switch."""

     entity_description: MatterSwitchEntityDescription
     _platform_translation_key = "switch"

+    def _get_command_for_value(self, value: bool) -> ClusterCommand:
+        """Get the appropriate command for the desired value.
+
+        Applies inversion if needed (e.g., for inverted logic like mute).
+        """
+        send_value = not value if self.entity_description.inverted else value
+        return (
+            clusters.OnOff.Commands.On()
+            if send_value
+            else clusters.OnOff.Commands.Off()
+        )
+
     async def async_turn_on(self, **kwargs: Any) -> None:
         """Turn switch on."""
-        await self.send_device_command(
-            clusters.OnOff.Commands.On(),
-        )
+        await self.send_device_command(self._get_command_for_value(True))

     async def async_turn_off(self, **kwargs: Any) -> None:
         """Turn switch off."""
-        await self.send_device_command(
-            clusters.OnOff.Commands.Off(),
-        )
+        await self.send_device_command(self._get_command_for_value(False))

     @callback
     def _update_from_device(self) -> None:
         """Update from device."""
-        self._attr_is_on = self.get_matter_attribute_value(
-            self._entity_info.primary_attribute
-        )
+        value = self.get_matter_attribute_value(self._entity_info.primary_attribute)
+        if self.entity_description.inverted:
+            value = not value
+        self._attr_is_on = value


 class MatterGenericCommandSwitch(MatterSwitch):
@@ -121,9 +133,7 @@ class MatterGenericCommandSwitch(MatterSwitch):

 @dataclass(frozen=True, kw_only=True)
-class MatterGenericCommandSwitchEntityDescription(
-    SwitchEntityDescription, MatterEntityDescription
-):
+class MatterGenericCommandSwitchEntityDescription(MatterSwitchEntityDescription):
     """Describe Matter Generic command Switch entities."""

     # command: a custom callback to create the command to send to the device
@@ -133,9 +143,7 @@

 @dataclass(frozen=True, kw_only=True)
-class MatterNumericSwitchEntityDescription(
-    SwitchEntityDescription, MatterEntityDescription
-):
+class MatterNumericSwitchEntityDescription(MatterSwitchEntityDescription):
     """Describe Matter Numeric Switch entities."""
@@ -146,11 +154,10 @@ class MatterNumericSwitch(MatterSwitch):

     async def _async_set_native_value(self, value: bool) -> None:
         """Update the current value."""
         send_value: Any = value
         if value_convert := self.entity_description.ha_to_device:
             send_value = value_convert(value)
-        await self.write_attribute(
-            value=send_value,
-        )
+        await self.write_attribute(value=send_value)

     async def async_turn_on(self, **kwargs: Any) -> None:
         """Turn switch on."""
@@ -248,19 +255,12 @@ DISCOVERY_SCHEMAS = [
     ),
     MatterDiscoverySchema(
         platform=Platform.SWITCH,
-        entity_description=MatterNumericSwitchEntityDescription(
+        entity_description=MatterSwitchEntityDescription(
             key="MatterMuteToggle",
             translation_key="speaker_mute",
-            device_to_ha={
-                True: False,  # True means volume is on, so HA should show mute as off
-                False: True,  # False means volume is off (muted), so HA should show mute as on
-            }.get,
-            ha_to_device={
-                False: True,  # HA showing mute as off means volume is on, so send True
-                True: False,  # HA showing mute as on means volume is off (muted), so send False
-            }.get,
+            inverted=True,
         ),
-        entity_class=MatterNumericSwitch,
+        entity_class=MatterSwitch,
         required_attributes=(clusters.OnOff.Attributes.OnOff,),
         device_type=(device_types.Speaker,),
     ),


@@ -113,35 +113,15 @@ class NetdataSensor(SensorEntity):
     def __init__(self, netdata, name, sensor, sensor_name, element, icon, unit, invert):
         """Initialize the Netdata sensor."""
         self.netdata = netdata
-        self._state = None
         self._sensor = sensor
         self._element = element
-        self._sensor_name = self._sensor if sensor_name is None else sensor_name
-        self._name = name
-        self._icon = icon
-        self._unit_of_measurement = unit
+        if sensor_name is None:
+            sensor_name = self._sensor
+        self._attr_name = f"{name} {sensor_name}"
+        self._attr_icon = icon
+        self._attr_native_unit_of_measurement = unit
         self._invert = invert

-    @property
-    def name(self):
-        """Return the name of the sensor."""
-        return f"{self._name} {self._sensor_name}"
-
-    @property
-    def native_unit_of_measurement(self):
-        """Return the unit the value is expressed in."""
-        return self._unit_of_measurement
-
-    @property
-    def icon(self):
-        """Return the icon to use in the frontend, if any."""
-        return self._icon
-
-    @property
-    def native_value(self):
-        """Return the state of the resources."""
-        return self._state
-
     @property
     def available(self) -> bool:
         """Could the resource be accessed during the last update call."""
@@ -151,9 +131,9 @@ class NetdataSensor(SensorEntity):
         """Get the latest data from Netdata REST API."""
         await self.netdata.async_update()
         resource_data = self.netdata.api.metrics.get(self._sensor)
-        self._state = round(resource_data["dimensions"][self._element]["value"], 2) * (
-            -1 if self._invert else 1
-        )
+        self._attr_native_value = round(
+            resource_data["dimensions"][self._element]["value"], 2
+        ) * (-1 if self._invert else 1)


 class NetdataAlarms(SensorEntity):
@@ -162,29 +142,18 @@
     def __init__(self, netdata, name, host, port):
         """Initialize the Netdata alarm sensor."""
         self.netdata = netdata
-        self._state = None
-        self._name = name
+        self._attr_name = f"{name} Alarms"
         self._host = host
         self._port = port

-    @property
-    def name(self):
-        """Return the name of the sensor."""
-        return f"{self._name} Alarms"
-
-    @property
-    def native_value(self):
-        """Return the state of the resources."""
-        return self._state
-
     @property
-    def icon(self):
+    def icon(self) -> str:
         """Status symbol if type is symbol."""
-        if self._state == "ok":
+        if self._attr_native_value == "ok":
             return "mdi:check"
-        if self._state == "warning":
+        if self._attr_native_value == "warning":
             return "mdi:alert-outline"
-        if self._state == "critical":
+        if self._attr_native_value == "critical":
             return "mdi:alert"
         return "mdi:crosshairs-question"
@@ -197,7 +166,7 @@ class NetdataAlarms(SensorEntity):
         """Get the latest alarms from Netdata REST API."""
         await self.netdata.async_update()
         alarms = self.netdata.api.alarms["alarms"]
-        self._state = None
+        self._attr_native_value = None
         number_of_alarms = len(alarms)
         number_of_relevant_alarms = number_of_alarms
@@ -211,9 +180,9 @@
             ):
                 number_of_relevant_alarms = number_of_relevant_alarms - 1
             elif alarms[alarm]["status"] == "CRITICAL":
-                self._state = "critical"
+                self._attr_native_value = "critical"
                 return
-        self._state = "ok" if number_of_relevant_alarms == 0 else "warning"
+        self._attr_native_value = "ok" if number_of_relevant_alarms == 0 else "warning"


 class NetdataData:
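The alarm roll-up in `NetdataAlarms.async_update` above short-circuits to "critical" on the first CRITICAL alarm, discounts non-actionable alarms, and otherwise reports "warning" or "ok". A minimal sketch of that priority logic, where the set of ignored status names is an assumption for illustration rather than the exact statuses the integration filters:

```python
# Sketch of the alarm roll-up above: CRITICAL wins immediately, ignored
# statuses (assumed names, for illustration) don't count, anything else
# remaining yields "warning".
IGNORED_STATUSES = {"CLEAR", "UNDEFINED", "UNINITIALIZED"}  # assumption


def summarize_alarms(statuses: list[str]) -> str:
    """Reduce per-alarm statuses to a single sensor state."""
    relevant = [s for s in statuses if s not in IGNORED_STATUSES]
    if any(s == "CRITICAL" for s in relevant):
        return "critical"
    return "ok" if not relevant else "warning"
```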


@@ -2,6 +2,7 @@

 from __future__ import annotations

+from collections.abc import Mapping
 import logging
 from typing import TYPE_CHECKING, Any
@@ -119,6 +120,31 @@ class NRGkickConfigFlow(ConfigFlow, domain=DOMAIN):
         self._discovered_name: str | None = None
         self._pending_host: str | None = None

+    async def _async_validate_credentials(
+        self,
+        host: str,
+        errors: dict[str, str],
+        username: str | None = None,
+        password: str | None = None,
+    ) -> dict[str, Any] | None:
+        """Validate credentials and populate errors dict on failure."""
+        try:
+            return await validate_input(
+                self.hass, host, username=username, password=password
+            )
+        except NRGkickApiClientApiDisabledError:
+            errors["base"] = "json_api_disabled"
+        except NRGkickApiClientAuthenticationError:
+            errors["base"] = "invalid_auth"
+        except NRGkickApiClientInvalidResponseError:
+            errors["base"] = "invalid_response"
+        except NRGkickApiClientCommunicationError:
+            errors["base"] = "cannot_connect"
+        except NRGkickApiClientError:
+            _LOGGER.exception("Unexpected error")
+            errors["base"] = "unknown"
+        return None
+
     async def async_step_user(
         self, user_input: dict[str, Any] | None = None
     ) -> ConfigFlowResult:
@@ -169,36 +195,20 @@ class NRGkickConfigFlow(ConfigFlow, domain=DOMAIN):
         assert self._pending_host is not None

         if user_input is not None:
-            username = user_input.get(CONF_USERNAME)
-            password = user_input.get(CONF_PASSWORD)
-            try:
-                info = await validate_input(
-                    self.hass,
-                    self._pending_host,
-                    username=username,
-                    password=password,
-                )
-            except NRGkickApiClientApiDisabledError:
-                errors["base"] = "json_api_disabled"
-            except NRGkickApiClientAuthenticationError:
-                errors["base"] = "invalid_auth"
-            except NRGkickApiClientInvalidResponseError:
-                errors["base"] = "invalid_response"
-            except NRGkickApiClientCommunicationError:
-                errors["base"] = "cannot_connect"
-            except NRGkickApiClientError:
-                _LOGGER.exception("Unexpected error")
-                errors["base"] = "unknown"
-            else:
+            if info := await self._async_validate_credentials(
+                self._pending_host,
+                errors,
+                username=user_input[CONF_USERNAME],
+                password=user_input[CONF_PASSWORD],
+            ):
                 await self.async_set_unique_id(info["serial"], raise_on_progress=False)
                 self._abort_if_unique_id_configured()
                 return self.async_create_entry(
                     title=info["title"],
                     data={
                         CONF_HOST: self._pending_host,
-                        CONF_USERNAME: username,
-                        CONF_PASSWORD: password,
+                        CONF_USERNAME: user_input[CONF_USERNAME],
+                        CONF_PASSWORD: user_input[CONF_PASSWORD],
                     },
                 )
@@ -211,6 +221,42 @@ class NRGkickConfigFlow(ConfigFlow, domain=DOMAIN):
             },
         )

+    async def async_step_reauth(
+        self, entry_data: Mapping[str, Any]
+    ) -> ConfigFlowResult:
+        """Handle initiation of reauthentication."""
+        return await self.async_step_reauth_confirm()
+
+    async def async_step_reauth_confirm(
+        self, user_input: dict[str, Any] | None = None
+    ) -> ConfigFlowResult:
+        """Handle reauthentication."""
+        errors: dict[str, str] = {}
+
+        if user_input is not None:
+            reauth_entry = self._get_reauth_entry()
+            if info := await self._async_validate_credentials(
+                reauth_entry.data[CONF_HOST],
+                errors,
+                username=user_input[CONF_USERNAME],
+                password=user_input[CONF_PASSWORD],
+            ):
+                await self.async_set_unique_id(info["serial"], raise_on_progress=False)
+                self._abort_if_unique_id_mismatch()
+                return self.async_update_reload_and_abort(
+                    reauth_entry,
+                    data_updates=user_input,
+                )
+
+        return self.async_show_form(
+            step_id="reauth_confirm",
+            data_schema=self.add_suggested_values_to_schema(
+                STEP_AUTH_DATA_SCHEMA,
+                self._get_reauth_entry().data,
+            ),
+            errors=errors,
+        )
+
     async def async_step_zeroconf(
         self, discovery_info: ZeroconfServiceInfo
     ) -> ConfigFlowResult:
@@ -235,8 +281,9 @@ class NRGkickConfigFlow(ConfigFlow, domain=DOMAIN):
         # Store discovery info for the confirmation step.
         self._discovered_host = discovery_info.host
         # Fallback: device_name -> model_type -> "NRGkick".
-        self._discovered_name = device_name or model_type or "NRGkick"
-        self.context["title_placeholders"] = {"name": self._discovered_name}
+        discovered_name = device_name or model_type or "NRGkick"
+        self._discovered_name = discovered_name
+        self.context["title_placeholders"] = {"name": discovered_name}

         # If JSON API is disabled, guide the user through enabling it.
         if json_api_enabled != "1":


@@ -18,7 +18,7 @@ from nrgkick_api import (

 from homeassistant.config_entries import ConfigEntry
 from homeassistant.core import HomeAssistant
-from homeassistant.exceptions import ConfigEntryError
+from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryError
 from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed

 from .const import DEFAULT_SCAN_INTERVAL, DOMAIN
@@ -65,7 +65,7 @@ class NRGkickDataUpdateCoordinator(DataUpdateCoordinator[NRGkickData]):
             control = await self.api.get_control()
             values = await self.api.get_values(raw=True)
         except NRGkickAuthenticationError as error:
-            raise ConfigEntryError(
+            raise ConfigEntryAuthFailed(
                 translation_domain=DOMAIN,
                 translation_key="authentication_error",
             ) from error


@@ -43,7 +43,7 @@ rules:
   integration-owner: done
   log-when-unavailable: todo
   parallel-updates: done
-  reauthentication-flow: todo
+  reauthentication-flow: done
   test-coverage: done

   # Gold


@@ -4,7 +4,9 @@
       "already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
       "already_in_progress": "[%key:common::config_flow::abort::already_in_progress%]",
       "json_api_disabled": "JSON API is disabled on the device. Enable it in the NRGkick mobile app under Extended \u2192 Local API \u2192 API Variants.",
-      "no_serial_number": "Device does not provide a serial number"
+      "no_serial_number": "Device does not provide a serial number",
+      "reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]",
+      "unique_id_mismatch": "The device does not match the previous device"
     },
     "error": {
       "cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
@@ -15,6 +17,17 @@
       "unknown": "[%key:common::config_flow::error::unknown%]"
     },
     "step": {
+      "reauth_confirm": {
+        "data": {
+          "password": "[%key:common::config_flow::data::password%]",
+          "username": "[%key:common::config_flow::data::username%]"
+        },
+        "data_description": {
+          "password": "[%key:component::nrgkick::config::step::user_auth::data_description::password%]",
+          "username": "[%key:component::nrgkick::config::step::user_auth::data_description::username%]"
+        },
+        "description": "Reauthenticate with your NRGkick device.\n\nGet your username and password in the NRGkick mobile app:\n1. Open the NRGkick mobile app \u2192 Extended \u2192 Local API\n2. Under Authentication (JSON), check or set your username and password"
+      },
       "user": {
         "data": {
           "host": "[%key:common::config_flow::data::host%]"

View File

@@ -6,7 +6,7 @@
"documentation": "https://www.home-assistant.io/integrations/ntfy",
"integration_type": "service",
"iot_class": "cloud_push",
"loggers": ["aionfty"],
"loggers": ["aiontfy"],
"quality_scale": "platinum",
"requirements": ["aiontfy==0.7.0"]
"requirements": ["aiontfy==0.8.0"]
}

View File

@@ -244,7 +244,7 @@ class Luminary(LightEntity):
return self._luminary.name()
@property
def hs_color(self):
def hs_color(self) -> tuple[float, float]:
"""Return last hs color value set."""
return color_util.color_RGB_to_hs(*self._rgb_color)

View File

@@ -62,7 +62,7 @@ class PilightLight(PilightBaseDevice, LightEntity):
self._dimlevel_max = config.get(CONF_DIMLEVEL_MAX)
@property
def brightness(self):
def brightness(self) -> int | None:
"""Return the brightness."""
return self._brightness

View File

@@ -11,6 +11,7 @@ from homeassistant.components.sensor import (
SensorDeviceClass,
SensorEntity,
SensorEntityDescription,
SensorStateClass,
)
from homeassistant.const import PERCENTAGE
from homeassistant.core import HomeAssistant
@@ -61,6 +62,7 @@ SENSOR_DESCRIPTIONS: tuple[PlaystationNetworkSensorEntityDescription, ...] = (
value_fn=(
lambda psn: psn.trophy_summary.trophy_level if psn.trophy_summary else None
),
state_class=SensorStateClass.MEASUREMENT,
),
PlaystationNetworkSensorEntityDescription(
key=PlaystationNetworkSensor.TROPHY_LEVEL_PROGRESS,
@@ -69,6 +71,7 @@ SENSOR_DESCRIPTIONS: tuple[PlaystationNetworkSensorEntityDescription, ...] = (
lambda psn: psn.trophy_summary.progress if psn.trophy_summary else None
),
native_unit_of_measurement=PERCENTAGE,
state_class=SensorStateClass.MEASUREMENT,
),
PlaystationNetworkSensorEntityDescription(
key=PlaystationNetworkSensor.EARNED_TROPHIES_PLATINUM,
@@ -80,6 +83,7 @@ SENSOR_DESCRIPTIONS: tuple[PlaystationNetworkSensorEntityDescription, ...] = (
else None
)
),
state_class=SensorStateClass.MEASUREMENT,
),
PlaystationNetworkSensorEntityDescription(
key=PlaystationNetworkSensor.EARNED_TROPHIES_GOLD,
@@ -89,6 +93,7 @@ SENSOR_DESCRIPTIONS: tuple[PlaystationNetworkSensorEntityDescription, ...] = (
psn.trophy_summary.earned_trophies.gold if psn.trophy_summary else None
)
),
state_class=SensorStateClass.MEASUREMENT,
),
PlaystationNetworkSensorEntityDescription(
key=PlaystationNetworkSensor.EARNED_TROPHIES_SILVER,
@@ -100,6 +105,7 @@ SENSOR_DESCRIPTIONS: tuple[PlaystationNetworkSensorEntityDescription, ...] = (
else None
)
),
state_class=SensorStateClass.MEASUREMENT,
),
PlaystationNetworkSensorEntityDescription(
key=PlaystationNetworkSensor.EARNED_TROPHIES_BRONZE,
@@ -111,6 +117,7 @@ SENSOR_DESCRIPTIONS: tuple[PlaystationNetworkSensorEntityDescription, ...] = (
else None
)
),
state_class=SensorStateClass.MEASUREMENT,
),
PlaystationNetworkSensorEntityDescription(
key=PlaystationNetworkSensor.ONLINE_ID,

View File

@@ -7,5 +7,5 @@
"integration_type": "service",
"iot_class": "local_polling",
"quality_scale": "bronze",
"requirements": ["pyportainer==1.0.23"]
"requirements": ["pyportainer==1.0.28"]
}

View File

@@ -176,7 +176,7 @@ class ProxmoxCoordinator(DataUpdateCoordinator[dict[str, ProxmoxNodeData]]):
list[dict[str, Any]], list[tuple[list[dict[str, Any]], list[dict[str, Any]]]]
]:
"""Fetch all nodes, and then proceed to the VMs and containers."""
nodes = self.proxmox.nodes.get()
nodes = self.proxmox.nodes.get() or []
vms_containers = [self._get_vms_containers(node) for node in nodes]
return nodes, vms_containers

View File

@@ -4,5 +4,5 @@
"codeowners": [],
"documentation": "https://www.home-assistant.io/integrations/proxy",
"quality_scale": "legacy",
"requirements": ["Pillow==12.0.0"]
"requirements": ["Pillow==12.1.1"]
}

View File

@@ -6,5 +6,5 @@
"iot_class": "calculated",
"loggers": ["pyzbar"],
"quality_scale": "legacy",
"requirements": ["Pillow==12.0.0", "pyzbar==0.1.7"]
"requirements": ["Pillow==12.1.1", "pyzbar==0.1.7"]
}

View File

@@ -30,7 +30,7 @@ class QSLight(QSToggleEntity, LightEntity):
"""Light based on a Qwikswitch relay/dimmer module."""
@property
def brightness(self):
def brightness(self) -> int | None:
"""Return the brightness of this light (0-255)."""
return self.device.value if self.device.is_dimmer else None

View File

@@ -5,7 +5,7 @@
"title": "Database backup failed due to lack of resources"
},
"maria_db_range_index_regression": {
"description": "Older versions of MariaDB suffer from a significant performance regression when retrieving history data or purging the database. Update to MariaDB version {min_version} or later and restart Home Assistant. If you are using the MariaDB core add-on, make sure to update it to the latest version.",
"description": "Older versions of MariaDB suffer from a significant performance regression when retrieving history data or purging the database. Update to MariaDB version {min_version} or later and restart Home Assistant. If you are using the MariaDB Core app, make sure to update it to the latest version.",
"title": "Update MariaDB to {min_version} or later to resolve a significant performance issue"
}
},

View File

@@ -99,8 +99,12 @@ def setup_platform(
class RedditSensor(SensorEntity):
"""Representation of a Reddit sensor."""
_attr_icon = "mdi:reddit"
def __init__(self, reddit, subreddit: str, limit: int, sort_by: str) -> None:
"""Initialize the Reddit sensor."""
self._attr_name = f"reddit_{subreddit}"
self._attr_native_value = 0
self._reddit = reddit
self._subreddit = subreddit
self._limit = limit
@@ -108,16 +112,6 @@ class RedditSensor(SensorEntity):
self._subreddit_data: list = []
@property
def name(self):
"""Return the name of the sensor."""
return f"reddit_{self._subreddit}"
@property
def native_value(self):
"""Return the state of the sensor."""
return len(self._subreddit_data)
@property
def extra_state_attributes(self) -> dict[str, Any]:
"""Return the state attributes."""
@@ -127,11 +121,6 @@ class RedditSensor(SensorEntity):
CONF_SORT_BY: self._sort_by,
}
@property
def icon(self):
"""Return the icon to use in the frontend."""
return "mdi:reddit"
def update(self) -> None:
"""Update data from Reddit API."""
self._subreddit_data = []
@@ -156,3 +145,5 @@ class RedditSensor(SensorEntity):
except praw.exceptions.PRAWException as err:
_LOGGER.error("Reddit error %s", err)
self._attr_native_value = len(self._subreddit_data)

View File

@@ -8,5 +8,5 @@
"iot_class": "cloud_polling",
"loggers": ["renault_api"],
"quality_scale": "silver",
"requirements": ["renault-api==0.5.3"]
"requirements": ["renault-api==0.5.5"]
}

View File

@@ -226,7 +226,7 @@ class DimmableRflinkLight(SwitchableRflinkDevice, LightEntity):
self._state = True
@property
def brightness(self):
def brightness(self) -> int:
"""Return the brightness of this light between 0..255."""
return self._brightness

View File

@@ -122,6 +122,7 @@ class RMVDepartureSensor(SensorEntity):
"""Implementation of an RMV departure sensor."""
_attr_attribution = ATTRIBUTION
_attr_native_unit_of_measurement = UnitOfTime.MINUTES
def __init__(
self,
@@ -137,7 +138,7 @@ class RMVDepartureSensor(SensorEntity):
):
"""Initialize the sensor."""
self._station = station
self._name = name
self._attr_name = name
self._state = None
self.data = RMVDepartureData(
station,
@@ -149,12 +150,7 @@ class RMVDepartureSensor(SensorEntity):
max_journeys,
timeout,
)
self._icon = ICONS[None]
@property
def name(self):
"""Return the name of the sensor."""
return self._name
self._attr_icon = ICONS[None]
@property
def available(self) -> bool:
@@ -181,32 +177,22 @@ class RMVDepartureSensor(SensorEntity):
except IndexError:
return {}
@property
def icon(self):
"""Icon to use in the frontend, if any."""
return self._icon
@property
def native_unit_of_measurement(self):
"""Return the unit this state is expressed in."""
return UnitOfTime.MINUTES
async def async_update(self) -> None:
"""Get the latest data and update the state."""
await self.data.async_update()
if self._name == DEFAULT_NAME:
self._name = self.data.station
if self._attr_name == DEFAULT_NAME:
self._attr_name = self.data.station
self._station = self.data.station
if not self.data.departures:
self._state = None
self._icon = ICONS[None]
self._attr_icon = ICONS[None]
return
self._state = self.data.departures[0].get("minutes")
self._icon = ICONS[self.data.departures[0].get("product")]
self._attr_icon = ICONS[self.data.departures[0].get("product")]
class RMVDepartureData:

View File

@@ -144,18 +144,18 @@ async def async_setup_entry(hass: HomeAssistant, entry: RoborockConfigEntry) ->
for coord in coordinators
if isinstance(coord, RoborockDataUpdateCoordinatorA01)
]
b01_coords = [
b01_q7_coords = [
coord
for coord in coordinators
if isinstance(coord, RoborockDataUpdateCoordinatorB01)
if isinstance(coord, RoborockB01Q7UpdateCoordinator)
]
if len(v1_coords) + len(a01_coords) + len(b01_coords) == 0:
if len(v1_coords) + len(a01_coords) + len(b01_q7_coords) == 0:
raise ConfigEntryNotReady(
"No devices were able to successfully setup",
translation_domain=DOMAIN,
translation_key="no_coordinators",
)
entry.runtime_data = RoborockCoordinators(v1_coords, a01_coords, b01_coords)
entry.runtime_data = RoborockCoordinators(v1_coords, a01_coords, b01_q7_coords)
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)

View File

@@ -64,17 +64,17 @@ class RoborockCoordinators:
v1: list[RoborockDataUpdateCoordinator]
a01: list[RoborockDataUpdateCoordinatorA01]
b01: list[RoborockDataUpdateCoordinatorB01]
b01_q7: list[RoborockB01Q7UpdateCoordinator]
def values(
self,
) -> list[
RoborockDataUpdateCoordinator
| RoborockDataUpdateCoordinatorA01
| RoborockDataUpdateCoordinatorB01
| RoborockB01Q7UpdateCoordinator
]:
"""Return all coordinators."""
return self.v1 + self.a01 + self.b01
return self.v1 + self.a01 + self.b01_q7
type RoborockConfigEntry = ConfigEntry[RoborockCoordinators]

View File

@@ -2,8 +2,8 @@
from typing import Any
from roborock.data import Status
from roborock.devices.traits.v1.command import CommandTrait
from roborock.devices.traits.v1.status import StatusTrait
from roborock.exceptions import RoborockException
from roborock.roborock_typing import RoborockCommand
@@ -14,9 +14,9 @@ from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN
from .coordinator import (
RoborockB01Q7UpdateCoordinator,
RoborockDataUpdateCoordinator,
RoborockDataUpdateCoordinatorA01,
RoborockDataUpdateCoordinatorB01,
)
@@ -94,7 +94,7 @@ class RoborockCoordinatedEntityV1(
self._attr_unique_id = unique_id
@property
def _device_status(self) -> Status:
def _device_status(self) -> StatusTrait:
"""Return the status of the device."""
data = self.coordinator.data
return data.status
@@ -130,21 +130,21 @@ class RoborockCoordinatedEntityA01(
self._attr_unique_id = unique_id
class RoborockCoordinatedEntityB01(
RoborockEntity, CoordinatorEntity[RoborockDataUpdateCoordinatorB01]
class RoborockCoordinatedEntityB01Q7(
RoborockEntity, CoordinatorEntity[RoborockB01Q7UpdateCoordinator]
):
"""Representation of coordinated Roborock Entity."""
def __init__(
self,
unique_id: str,
coordinator: RoborockDataUpdateCoordinatorB01,
coordinator: RoborockB01Q7UpdateCoordinator,
) -> None:
"""Initialize the coordinated Roborock Device."""
CoordinatorEntity.__init__(self, coordinator=coordinator)
RoborockEntity.__init__(
self,
unique_id=unique_id,
device_info=coordinator.device_info,
)
CoordinatorEntity.__init__(self, coordinator=coordinator)
self._attr_unique_id = unique_id

View File

@@ -20,7 +20,7 @@
"loggers": ["roborock"],
"quality_scale": "silver",
"requirements": [
"python-roborock==4.14.0",
"python-roborock==4.17.1",
"vacuum-map-parser-roborock==0.1.4"
]
}

View File

@@ -12,8 +12,8 @@ from roborock.data import (
HomeDataDevice,
HomeDataProduct,
NetworkInfo,
Status,
)
from roborock.devices.traits.v1.status import StatusTrait
from vacuum_map_parser_base.map_data import MapData
_LOGGER = logging.getLogger(__name__)
@@ -23,7 +23,7 @@ _LOGGER = logging.getLogger(__name__)
class DeviceState:
"""Data about the current state of a device."""
status: Status
status: StatusTrait
dnd_timer: DnDTimer
consumable: Consumable
clean_summary: CleanSummaryWithDetail

View File

@@ -26,7 +26,7 @@ from .coordinator import (
RoborockConfigEntry,
RoborockDataUpdateCoordinator,
)
from .entity import RoborockCoordinatedEntityB01, RoborockCoordinatedEntityV1
from .entity import RoborockCoordinatedEntityB01Q7, RoborockCoordinatedEntityV1
PARALLEL_UPDATES = 0
@@ -92,25 +92,31 @@ SELECT_DESCRIPTIONS: list[RoborockSelectDescription] = [
key="water_box_mode",
translation_key="mop_intensity",
api_command=RoborockCommand.SET_WATER_BOX_CUSTOM_MODE,
value_fn=lambda api: api.status.water_box_mode_name,
value_fn=lambda api: api.status.water_mode_name,
entity_category=EntityCategory.CONFIG,
options_lambda=lambda api: (
api.status.water_box_mode.keys()
if api.status.water_box_mode is not None
[mode.value for mode in api.status.water_mode_options]
if api.status.water_mode_options
else None
),
parameter_lambda=lambda key, api: [api.status.get_mop_intensity_code(key)],
parameter_lambda=lambda key, api: [
{v: k for k, v in api.status.water_mode_mapping.items()}[key]
],
),
RoborockSelectDescription(
key="mop_mode",
translation_key="mop_mode",
api_command=RoborockCommand.SET_MOP_MODE,
value_fn=lambda api: api.status.mop_mode_name,
value_fn=lambda api: api.status.mop_route_name,
entity_category=EntityCategory.CONFIG,
options_lambda=lambda api: (
api.status.mop_mode.keys() if api.status.mop_mode is not None else None
[mode.value for mode in api.status.mop_route_options]
if api.status.mop_route_options
else None
),
parameter_lambda=lambda key, api: [api.status.get_mop_mode_code(key)],
parameter_lambda=lambda key, api: [
{v: k for k, v in api.status.mop_route_mapping.items()}[key]
],
),
RoborockSelectDescription(
key="dust_collection_mode",
@@ -159,14 +165,13 @@ async def async_setup_entry(
)
async_add_entities(
RoborockB01SelectEntity(coordinator, description, options)
for coordinator in config_entry.runtime_data.b01
for coordinator in config_entry.runtime_data.b01_q7
for description in B01_SELECT_DESCRIPTIONS
if isinstance(coordinator, RoborockB01Q7UpdateCoordinator)
if (options := description.options_lambda(coordinator.api)) is not None
)
class RoborockB01SelectEntity(RoborockCoordinatedEntityB01, SelectEntity):
class RoborockB01SelectEntity(RoborockCoordinatedEntityB01Q7, SelectEntity):
"""Select entity for Roborock B01 devices."""
entity_description: RoborockB01SelectDescription

View File

@@ -33,14 +33,14 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import StateType
from .coordinator import (
RoborockB01Q7UpdateCoordinator,
RoborockConfigEntry,
RoborockDataUpdateCoordinator,
RoborockDataUpdateCoordinatorA01,
RoborockDataUpdateCoordinatorB01,
)
from .entity import (
RoborockCoordinatedEntityA01,
RoborockCoordinatedEntityB01,
RoborockCoordinatedEntityB01Q7,
RoborockCoordinatedEntityV1,
RoborockEntity,
)
@@ -422,8 +422,8 @@ async def async_setup_entry(
if description.data_protocol in coordinator.request_protocols
)
entities.extend(
RoborockSensorEntityB01(coordinator, description)
for coordinator in coordinators.b01
RoborockSensorEntityB01Q7(coordinator, description)
for coordinator in coordinators.b01_q7
for description in Q7_B01_SENSOR_DESCRIPTIONS
if description.value_fn(coordinator.data) is not None
)
@@ -515,14 +515,14 @@ class RoborockSensorEntityA01(RoborockCoordinatedEntityA01, SensorEntity):
return self.coordinator.data[self.entity_description.data_protocol]
class RoborockSensorEntityB01(RoborockCoordinatedEntityB01, SensorEntity):
"""Representation of a B01 Roborock sensor."""
class RoborockSensorEntityB01Q7(RoborockCoordinatedEntityB01Q7, SensorEntity):
"""Representation of a B01 Q7 Roborock sensor."""
entity_description: RoborockSensorDescriptionB01
def __init__(
self,
coordinator: RoborockDataUpdateCoordinatorB01,
coordinator: RoborockB01Q7UpdateCoordinator,
description: RoborockSensorDescriptionB01,
) -> None:
"""Initialize the entity."""

View File

@@ -118,9 +118,12 @@
"max": "Max",
"medium": "[%key:common::state::medium%]",
"mild": "Mild",
"min": "Min",
"moderate": "Moderate",
"off": "[%key:common::state::off%]",
"slight": "Slight",
"smart_mode": "[%key:component::roborock::entity::select::mop_mode::state::smart_mode%]",
"standard": "[%key:component::roborock::entity::select::mop_mode::state::standard%]",
"vac_followed_by_mop": "Vacuum followed by mop"
}
},
@@ -448,6 +451,7 @@
"max_plus": "Max plus",
"medium": "[%key:common::state::medium%]",
"off": "[%key:common::state::off%]",
"off_raise_main_brush": "Off (raised brush)",
"quiet": "Quiet",
"silent": "Silent",
"smart_mode": "[%key:component::roborock::entity::select::mop_mode::state::smart_mode%]",
@@ -478,6 +482,9 @@
"mqtt_unauthorized": {
"message": "Roborock MQTT servers rejected the connection due to rate limiting or invalid credentials. You may either attempt to reauthenticate or wait and reload the integration."
},
"multiple_maps_in_clean": {
"message": "All segments must belong to the same map. Got segments from maps: {map_flags}"
},
"no_coordinators": {
"message": "No devices were able to successfully setup"
},
@@ -487,6 +494,9 @@
"position_not_found": {
"message": "Robot position not found"
},
"segment_id_parse_error": {
"message": "Invalid segment ID format: {segment_id}"
},
"update_data_fail": {
"message": "Failed to update data"
},

View File

@@ -1,5 +1,6 @@
"""Support for Roborock vacuum class."""
import asyncio
import logging
from typing import Any
@@ -8,21 +9,22 @@ from roborock.exceptions import RoborockException
from roborock.roborock_typing import RoborockCommand
from homeassistant.components.vacuum import (
Segment,
StateVacuumEntity,
VacuumActivity,
VacuumEntityFeature,
)
from homeassistant.core import HomeAssistant, ServiceResponse
from homeassistant.exceptions import HomeAssistantError
from homeassistant.exceptions import HomeAssistantError, ServiceValidationError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN
from .const import DOMAIN, MAP_SLEEP
from .coordinator import (
RoborockB01Q7UpdateCoordinator,
RoborockConfigEntry,
RoborockDataUpdateCoordinator,
)
from .entity import RoborockCoordinatedEntityB01, RoborockCoordinatedEntityV1
from .entity import RoborockCoordinatedEntityB01Q7, RoborockCoordinatedEntityV1
_LOGGER = logging.getLogger(__name__)
@@ -82,8 +84,7 @@ async def async_setup_entry(
)
async_add_entities(
RoborockQ7Vacuum(coordinator)
for coordinator in config_entry.runtime_data.b01
if isinstance(coordinator, RoborockB01Q7UpdateCoordinator)
for coordinator in config_entry.runtime_data.b01_q7
)
@@ -101,6 +102,7 @@ class RoborockVacuum(RoborockCoordinatedEntityV1, StateVacuumEntity):
| VacuumEntityFeature.CLEAN_SPOT
| VacuumEntityFeature.STATE
| VacuumEntityFeature.START
| VacuumEntityFeature.CLEAN_AREA
)
_attr_translation_key = DOMAIN
_attr_name = None
@@ -116,11 +118,13 @@ class RoborockVacuum(RoborockCoordinatedEntityV1, StateVacuumEntity):
coordinator.duid_slug,
coordinator,
)
self._home_trait = coordinator.properties_api.home
self._maps_trait = coordinator.properties_api.maps
@property
def fan_speed_list(self) -> list[str]:
"""Get the list of available fan speeds."""
return self._device_status.fan_power_options
return [mode.value for mode in self._device_status.fan_speed_options]
@property
def activity(self) -> VacuumActivity | None:
@@ -131,7 +135,7 @@ class RoborockVacuum(RoborockCoordinatedEntityV1, StateVacuumEntity):
@property
def fan_speed(self) -> str | None:
"""Return the fan speed of the vacuum cleaner."""
return self._device_status.fan_power_name
return self._device_status.fan_speed_name
async def async_start(self) -> None:
"""Start the vacuum."""
@@ -170,13 +174,83 @@ class RoborockVacuum(RoborockCoordinatedEntityV1, StateVacuumEntity):
"""Set vacuum fan speed."""
await self.send(
RoborockCommand.SET_CUSTOM_MODE,
[self._device_status.get_fan_speed_code(fan_speed)],
[
{v: k for k, v in self._device_status.fan_speed_mapping.items()}[
fan_speed
]
],
)
async def async_set_vacuum_goto_position(self, x: int, y: int) -> None:
"""Send vacuum to a specific target point."""
await self.send(RoborockCommand.APP_GOTO_TARGET, [x, y])
async def async_get_segments(self) -> list[Segment]:
"""Get the segments that can be cleaned."""
home_map_info = self._home_trait.home_map_info
if not home_map_info:
return []
return [
Segment(
id=f"{map_flag}:{room.segment_id}",
name=room.name,
group=map_info.name,
)
for map_flag, map_info in home_map_info.items()
for room in map_info.rooms
]
async def async_clean_segments(self, segment_ids: list[str], **kwargs: Any) -> None:
"""Clean the specified segments."""
parsed: list[tuple[int, int]] = []
for seg_id in segment_ids:
# Segment id is mapflag:segment_id
parts = seg_id.split(":")
if len(parts) != 2:
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="segment_id_parse_error",
translation_placeholders={"segment_id": seg_id},
)
try:
# We need to make sure both parts are ints.
parsed.append((int(parts[0]), int(parts[1])))
except ValueError as err:
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="segment_id_parse_error",
translation_placeholders={"segment_id": seg_id},
) from err
# Because segment_ids can overlap for each map,
# we need to make sure that only one map is passed in.
unique_map_flags = {map_flag for map_flag, _ in parsed}
if len(unique_map_flags) > 1:
map_flags_str = ", ".join(str(flag) for flag in sorted(unique_map_flags))
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="multiple_maps_in_clean",
translation_placeholders={"map_flags": map_flags_str},
)
target_map_flag = next(iter(unique_map_flags))
if self._maps_trait.current_map != target_map_flag:
# If the user is attempting to clean an area on a map that is not selected, we should try to change.
try:
await self._maps_trait.set_current_map(target_map_flag)
except RoborockException as err:
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="command_failed",
translation_placeholders={"command": "load_multi_map"},
) from err
await asyncio.sleep(MAP_SLEEP)
# We can now confirm all segments are on our current map, so clean them all.
await self.send(
RoborockCommand.APP_SEGMENT_CLEAN,
[{"segments": [seg_id for _, seg_id in parsed]}],
)
async def async_send_command(
self,
command: str,
@@ -232,7 +306,7 @@ class RoborockVacuum(RoborockCoordinatedEntityV1, StateVacuumEntity):
}
class RoborockQ7Vacuum(RoborockCoordinatedEntityB01, StateVacuumEntity):
class RoborockQ7Vacuum(RoborockCoordinatedEntityB01Q7, StateVacuumEntity):
"""General Representation of a Roborock vacuum."""
_attr_icon = "mdi:robot-vacuum"
@@ -256,7 +330,7 @@ class RoborockQ7Vacuum(RoborockCoordinatedEntityB01, StateVacuumEntity):
) -> None:
"""Initialize a vacuum."""
StateVacuumEntity.__init__(self)
RoborockCoordinatedEntityB01.__init__(
RoborockCoordinatedEntityB01Q7.__init__(
self,
coordinator.duid_slug,
coordinator,
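The segment-cleaning code added above encodes each segment as a `"map_flag:segment_id"` string, then validates two things before sending `APP_SEGMENT_CLEAN`: every id must parse as two integers, and all segments must come from the same map (segment ids can overlap between maps). A standalone sketch of that parse-and-validate step, with `ValueError` standing in for Home Assistant's `ServiceValidationError`:

```python
def parse_segments(segment_ids: list[str]) -> tuple[int, list[int]]:
    """Return (map_flag, segment ids) after validating the input."""
    parsed: list[tuple[int, int]] = []
    for seg_id in segment_ids:
        # Each id is "map_flag:segment_id".
        parts = seg_id.split(":")
        if len(parts) != 2:
            raise ValueError(f"Invalid segment ID format: {seg_id}")
        try:
            # Both halves must be integers.
            parsed.append((int(parts[0]), int(parts[1])))
        except ValueError as err:
            raise ValueError(f"Invalid segment ID format: {seg_id}") from err
    # Segment ids can repeat across maps, so only one map is allowed.
    map_flags = {flag for flag, _ in parsed}
    if len(map_flags) > 1:
        flags = ", ".join(str(f) for f in sorted(map_flags))
        raise ValueError(f"All segments must belong to the same map: {flags}")
    return next(iter(map_flags)), [seg for _, seg in parsed]
```

The real implementation additionally switches the active map (with a `MAP_SLEEP` pause) when the target map is not the currently loaded one.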

View File

@@ -5,5 +5,5 @@
"documentation": "https://www.home-assistant.io/integrations/seven_segments",
"iot_class": "local_polling",
"quality_scale": "legacy",
"requirements": ["Pillow==12.0.0"]
"requirements": ["Pillow==12.1.1"]
}

View File

@@ -4,6 +4,7 @@
"codeowners": ["@eavanvalkenburg"],
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/sia",
"integration_type": "hub",
"iot_class": "local_push",
"loggers": ["pysiaalarm"],
"requirements": ["pysiaalarm==3.2.2"]

View File

@@ -6,5 +6,5 @@
"iot_class": "cloud_polling",
"loggers": ["simplehound"],
"quality_scale": "legacy",
"requirements": ["Pillow==12.0.0", "simplehound==0.3"]
"requirements": ["Pillow==12.1.1", "simplehound==0.3"]
}

View File

@@ -78,7 +78,7 @@ class SisyphusLight(LightEntity):
return not self._table.is_sleeping
@property
def brightness(self):
def brightness(self) -> int:
"""Return the current brightness of the table's ring light."""
return self._table.brightness * 255

View File

@@ -4,6 +4,7 @@
"codeowners": ["@grahamwetzler"],
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/smart_meter_texas",
"integration_type": "hub",
"iot_class": "cloud_polling",
"loggers": ["smart_meter_texas"],
"requirements": ["smart-meter-texas==0.5.5"]

View File

@@ -65,7 +65,7 @@ class SmartTubLight(SmartTubEntity, LightEntity):
return self.coordinator.data[self.spa.id][ATTR_LIGHTS][self.light_zone]
@property
def brightness(self):
def brightness(self) -> int:
"""Return the brightness of this light between 0..255."""
# SmartTub intensity is 0..100
@@ -87,7 +87,7 @@ class SmartTubLight(SmartTubEntity, LightEntity):
return self.light.mode != SpaLight.LightMode.OFF
@property
def effect(self):
def effect(self) -> str | None:
"""Return the current effect."""
mode = self.light.mode.name.lower()
if mode in self.effect_list:
@@ -95,7 +95,7 @@ class SmartTubLight(SmartTubEntity, LightEntity):
return None
@property
def effect_list(self):
def effect_list(self) -> list[str]:
"""Return the list of supported effects."""
return [
effect

View File

@@ -4,6 +4,7 @@
"codeowners": ["@luar123"],
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/snapcast",
"integration_type": "hub",
"iot_class": "local_push",
"loggers": ["construct", "snapcast"],
"requirements": ["snapcast==2.3.7"]

View File

@@ -4,6 +4,7 @@
"codeowners": ["@Lash-L"],
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/snoo",
"integration_type": "hub",
"iot_class": "cloud_push",
"loggers": ["snoo"],
"quality_scale": "bronze",

View File

@@ -13,6 +13,7 @@
"config_flow": true,
"dependencies": ["bluetooth_adapters"],
"documentation": "https://www.home-assistant.io/integrations/snooz",
"integration_type": "device",
"iot_class": "local_push",
"requirements": ["pysnooz==0.8.6"]
}

View File

@@ -4,6 +4,7 @@
"codeowners": ["@squishykid", "@Darsstar"],
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/solax",
"integration_type": "device",
"iot_class": "local_polling",
"loggers": ["solax"],
"requirements": ["solax==3.2.3"]

View File

@@ -4,6 +4,7 @@
"codeowners": ["@ratsept"],
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/soma",
"integration_type": "hub",
"iot_class": "local_polling",
"loggers": ["api"],
"requirements": ["pysoma==0.0.12"]

Some files were not shown because too many files have changed in this diff.