Compare commits

...

198 Commits

Author SHA1 Message Date
Franck Nijhof
4709913a7f Merge branch 'dev' into infrared 2026-02-13 21:42:52 +01:00
Glenn de Haan
462d958b7e Bump hdfury to 1.5.0 (#162944) 2026-02-13 21:42:22 +01:00
Robin Lintermann
d888579cbd Bump pysmarlaapi to 1.0.1 and compatibility changes (#162911) 2026-02-13 21:41:05 +01:00
Erik Montnemery
e16a8ed20e Don't mock out filesystem operations in backup tests (#162877) 2026-02-13 21:39:34 +01:00
YogevBokobza
b11a75d438 Add Switcher heater support (#162588)
Co-authored-by: Shay Levy <levyshay1@gmail.com>
2026-02-13 22:32:55 +02:00
Glenn de Haan
95df5b9ec9 Fix incorrect type HDFury select platform (#162948) 2026-02-13 20:50:26 +01:00
epenet
a301a9c4b6 Always include homeassistant translations in tests (#162850) 2026-02-13 20:17:48 +01:00
Thomas55555
e80bb871e4 Bump ruff to 0.15.1 (#162903) 2026-02-13 19:43:37 +01:00
epenet
ff4ff98e54 Parametrize yeelight test_device_types test (#161838) 2026-02-13 19:43:07 +01:00
wollew
88c6cb3877 add OnOffLight without brightness control to velux integration (#162835) 2026-02-13 19:42:44 +01:00
Michael
6b3a7e4cd6 Fix handling when FRITZ!Box reboots in FRITZ!Smarthome (#162676) 2026-02-13 19:41:03 +01:00
Michael
36ff7506a0 Fix handling when FRITZ!Box reboots in FRITZ!Box Tools (#162679) 2026-02-13 19:40:51 +01:00
Allen Porter
a0af35f2dc Improve MCP SSE fallback error handling (#162655) 2026-02-13 19:39:34 +01:00
Josef Zweck
c15da19b84 Log remaining token duration in onedrive (#162933) 2026-02-13 19:38:44 +01:00
Damien Sorel
23e88a24f0 Add remove item intent for todo component (#152922) 2026-02-13 19:38:22 +01:00
Robert Resch
815c708d19 Block redirect to localhost (#162941) 2026-02-13 19:31:35 +01:00
Paulus Schoutsen
f9f2f39a3c OpenAI: Increase max iterations for AI Task (#162599) 2026-02-13 13:16:26 -05:00
Erik Montnemery
490514c274 Add fixture to give tests their own unique copy of testing_config (#162938) 2026-02-13 18:07:18 +01:00
Kamil Breguła
7da339b59c Add quality scale for GIOS (#155603)
Co-authored-by: mik-laj <12058428+mik-laj@users.noreply.github.com>
2026-02-13 18:01:44 +01:00
Josef Zweck
1bb31892c2 Add integration for onedrive for business (#155709)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-13 07:01:52 -08:00
epenet
267caf2365 Use APPLICATION_CREDENTIALS_DOMAIN constant in tests (#162932) 2026-02-13 15:47:38 +01:00
Petro31
4e71a38e31 Ensure numeric template sensors only use numbers in _attr_native_state (#162878)
Co-authored-by: Erik Montnemery <erik@montnemery.com>
2026-02-13 14:14:28 +01:00
Petro31
d3d916566a Make template lock code error consistent between state based and trigger based template entities (#162923) 2026-02-13 14:13:58 +01:00
epenet
fd3258a6d3 Use constants for update_entity calls in tests (#162920) 2026-02-13 13:54:40 +01:00
Sammy [Andrei Marinache]
d1aadb5842 Add Miele TQ1000WP tumble dryer programs and program phases (#162871)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
Co-authored-by: Åke Strandberg <ake@strandberg.eu>
2026-02-13 13:53:12 +01:00
epenet
d984411911 Raise on missing supported color modes (#162717) 2026-02-13 13:39:48 +01:00
Robin Lintermann
8ed0a4cf29 Specify number of parallel updates in Smarla (#162914) 2026-02-13 13:24:24 +01:00
Simone Chemelli
9a407b8668 Optimize coordinator data type for UptimeRobot (#162912) 2026-02-13 13:23:59 +01:00
Robin Lintermann
72aa9d8a6a Improve smarla typing in tests (#162163) 2026-02-13 13:19:27 +01:00
Kevin Stillhammer
dc1c52622e Fix google_travel_time get_travel_times config_entry_id description (#162910)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-13 13:18:55 +01:00
peteS-UK
44d5ecc926 Replace repeated patches in config_flow_test with fixtures for Squeezebox (#153032) 2026-02-13 12:00:24 +01:00
Simone Chemelli
54b0393ebe Cleanup code for UptimeRobot (#162905) 2026-02-13 11:53:04 +01:00
epenet
54141ffd3f Drop yardian custom translation overrides in tests (#162904) 2026-02-13 10:57:17 +01:00
David Bonnes
92b823068c Move evohome service registration to services.py (#162902) 2026-02-13 10:25:03 +01:00
Norbert Rittel
d4a6377ab3 Fix capitalization of "Immich" and "MIME type" (#162900) 2026-02-13 10:00:39 +01:00
epenet
80d07c42ac Move evohome hasskey to const module (#162899) 2026-02-13 08:25:43 +00:00
puddly
077eeafa69 Bump ZHA to 0.0.90 (#162894) 2026-02-13 08:40:26 +01:00
dependabot[bot]
b6ff8c94b1 Bump docker/build-push-action from 6.19.1 to 6.19.2 (#162896)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-13 08:39:53 +01:00
Michael
6a1581f2bf Immich reached platinum 🏆 (#162891) 2026-02-13 07:45:56 +01:00
johanzander
2dc0d32a29 Implement automatic migration for Growatt Server DEFAULT_PLANT_ID entries (#159972) 2026-02-13 01:56:50 +01:00
Niracler
036696f4cd Add energy sensor platform to sunricher_dali (#161415)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-13 01:55:16 +01:00
Michael
89f5b33a5e Cache get api calls in FRITZ!Box tools (#160246)
Co-authored-by: Simone Chemelli <simone.chemelli@gmail.com>
2026-02-13 01:54:33 +01:00
Matthias Alphart
fc52885c21 Support KNX time server configuration from UI (#161854) 2026-02-13 01:52:38 +01:00
Ville Skyttä
ffa8fc583d Recorder total_increasing warning clarifications (#157453) 2026-02-13 01:47:51 +01:00
Samuel Xiao
f18fa07019 Switchbot Cloud: Add new supported device Ai Art Frame (#160754)
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-02-13 01:47:03 +01:00
Alex Merkel
ce704dd5f7 Add play/pause ability & media info to LG soundbars integration (#161184)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-13 01:39:05 +01:00
Patrick Vorgers
d930755f92 IDrive e2 backup provider (#144910)
Co-authored-by: Josef Zweck <josef@zweck.dev>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-12 15:49:03 -08:00
epenet
196c6d9839 Do not unregister adguard services (#158308)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-13 00:40:30 +01:00
David Rapan
cce5358901 Re-implement Cloudflare using coordinator (#156817)
Signed-off-by: David Rapan <david@rapan.cz>
2026-02-13 00:33:48 +01:00
Heindrich Paul
df7c3d787d Only show trains for configured time if configured in nederlandse_spoorwegen (#159261) 2026-02-13 00:29:20 +01:00
Manu
a6287731f7 Increase test coverage in Xbox integration (#162876) 2026-02-12 15:14:07 -08:00
karwosts
1667b3f16b Add annual statistics aggregation (#160857) 2026-02-13 00:11:07 +01:00
Noah Husby
2aa9d22350 Add room correction setting to Cambridge Audio (#162743)
Co-authored-by: Abílio Costa <abmantis@users.noreply.github.com>
2026-02-12 23:10:13 +00:00
Matthias Alphart
3bcb303ef1 Support KNX number entity configuration from UI (#161269) 2026-02-13 00:06:53 +01:00
Manu
e6de37cc69 Use service helper to retrieve config entry in Duck DNS integration (#162879) 2026-02-12 23:00:23 +00:00
Jon
d10f5cc9ea Expose power and energy sensors for vera metered switches (#161028) 2026-02-12 23:56:35 +01:00
Erwin Douna
4921f05189 Disable mobile devices in tado (#160881)
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-02-12 23:56:01 +01:00
Brett Adams
877ad391f0 Add config flow to Splunk (#160478)
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Erwin Douna <e.douna@gmail.com>
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-12 23:27:49 +01:00
n-6
8a5594b9e4 Added Ambient Weather station sensors for AQIN monitor. (#161082)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-12 23:27:25 +01:00
Josef Zweck
a0623d1f97 Add IQS to openai_conversation (#161051)
Co-authored-by: Robert Resch <robert@resch.dev>
Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>
Co-authored-by: Matthias Alphart <farmio@alphart.net>
Co-authored-by: Erwin Douna <e.douna@gmail.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Erik Montnemery <erik@montnemery.com>
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-12 23:26:27 +01:00
Michael
c8f8ef887a Add reconfigure flow to immich (#162892) 2026-02-12 23:25:51 +01:00
Eduardo Tsen
40ec6d3793 Add switch controls for dishwashers in SmartThings (#160266)
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-02-12 22:57:54 +01:00
Matthias Alphart
0a79d84f9a KNX Expose: Add support for sending value periodically (#160883) 2026-02-12 22:51:40 +01:00
Przemko92
7a7e60ce75 Add number to Compit (#162165)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-12 22:49:34 +01:00
Kurt Chrisford
6bfaf6b188 Add action exception handling to Actron Air (#160579)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-12 22:46:53 +01:00
Kinachi249
34a445545c Cync - allow updating multiple attributes in one command (#159574)
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-02-12 22:46:30 +01:00
epenet
3c854a7679 Improve type hints in utility_meter (#160993) 2026-02-12 22:46:13 +01:00
Florent Fourcot
b7b6c1a72e Add more Melcloud sensors (#160770)
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-02-12 22:39:21 +01:00
Michael
fdf02cf657 Add missing exception translations in immich (#162889) 2026-02-12 22:32:22 +01:00
Kevin Stillhammer
acf739df81 add services to google_travel_time (#160740)
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-02-12 22:31:36 +01:00
Tom Matheussen
4801dcaded Add parent device for Satel Integra (#160933)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-12 22:29:59 +01:00
Daniel Hjelseth Høyer
11af0a2d04 Add reauthentication flow to Homevolt (#162868)
Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>
2026-02-12 22:27:30 +01:00
Michel Nederlof
40b30b94a2 Adjust discovery interval in govee-light-local (#160914)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-12 21:56:23 +01:00
Daniel Hjelseth Høyer
902d3f45a2 Add diagnostics to Homevolt (#162873)
Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>
2026-02-12 21:56:12 +01:00
ryanjones-gentex
bf887fbc71 Add reauth flow to HomeLink integration (#158454)
Co-authored-by: Nicholas Aelick <niaexa@syntronic.com>
2026-02-12 21:51:26 +01:00
Michael
e5ede7deea Categorize all immich sensor entities as diagnostic (#162887) 2026-02-12 21:36:45 +01:00
Erwin Douna
8b674a44a1 Melcloud move ConfigEntry declaration (#160890)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-12 21:32:54 +01:00
Joost Lekkerkerker
e145963d48 Remove unused snapshots for Homevolt (#162885) 2026-02-12 21:32:41 +01:00
Simone Chemelli
1bca0ba5f8 Update UptimeRobot to API v3 (#153508) 2026-02-12 21:28:11 +01:00
Anders Ödlund
38531033a1 Catch AccessoryDisconnectedError in homekit pairing (#162466)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-12 20:51:09 +01:00
abmantis
cf24011690 Cleanup 2026-02-12 18:48:06 +00:00
abmantis
f95f731a3f Add kitchen sink IR fan subentry 2026-02-12 18:44:19 +00:00
abmantis
775e5aca7b Merge branch 'dev' of github.com:home-assistant/core into infrared 2026-02-12 18:08:47 +00:00
Xitee
9f1b6a12a5 Filter out transient zero values from qBittorrent alltime stats (#162821)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-12 18:45:42 +01:00
Daniel Hjelseth Høyer
876589f0cd Fix keys for Homevolt (#162874)
Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>
2026-02-12 18:42:50 +01:00
Przemko92
bd09ac9030 Add water heater support for Compit (#162021)
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-02-12 18:39:02 +01:00
wollew
6d143c1ce2 add quality scale to manifest of velux integration (#162869) 2026-02-12 18:38:49 +01:00
abmantis
8b52e16b0a Add infrared to kitchen sink 2026-02-12 17:29:36 +00:00
Artur Pragacz
f4ceb22d73 Add analytics platform to mobile_app (#162736) 2026-02-12 17:09:40 +01:00
Manu
5839191c37 Move entity service registration to async_setup in ntfy integration (#162833) 2026-02-12 16:42:15 +01:00
Manu
29feccb190 Improve tests in Bring! integration (#162853)
Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>
2026-02-12 16:41:08 +01:00
epenet
a017417849 Use service helper to extract habitica config entry (#162795)
Co-authored-by: Manu <4445816+tr4nt0r@users.noreply.github.com>
2026-02-12 16:39:03 +01:00
Aron List
72a7d708b0 Expose ActuatorEnabled attr of matter DoorLock (#162598) 2026-02-12 16:03:30 +01:00
epenet
47be13e6bf Improve error validation in service tests (#162851) 2026-02-12 06:34:31 -08:00
ElCruncharino
7d583be8e1 Add timeout to B2 metadata downloads to prevent backup hang (#162562) 2026-02-12 06:26:50 -08:00
Manu
ccb3b35694 Use https for media player cover images in Xbox integration (#162859) 2026-02-12 05:59:28 -08:00
Steve Easley
48893d4daa Add JVC Projector switch platform (#161899)
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-02-12 14:21:34 +01:00
Yoshi Walsh
0388e5dd7f Bump pydaikin to 2.17.2 (#162846) 2026-02-12 14:18:46 +01:00
Marc Hörsken
7a68903318 Bump pywmspro to 0.3.3 (#162832) 2026-02-12 14:18:19 +01:00
Anders Ödlund
64766100fe Add get_lock_usercode service to zwave_js integration (#162057)
Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-12 13:44:33 +01:00
Abílio Costa
0576dd91b7 Validate core_files.yaml base_platforms completeness (#162826) 2026-02-12 11:59:19 +00:00
Jon Seager
f4440e992f Bump pytouchlinesl to 0.6.0 (#162856) 2026-02-12 12:42:36 +01:00
Daniel Hjelseth Høyer
ea83b5a892 Add Homevolt battery integration (#160416)
Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-12 12:39:58 +01:00
Vicx
d148952c99 Bump slixmpp to 1.13.2 (#162837) 2026-02-12 11:48:48 +01:00
epenet
ed9a810908 Fix unavailable status in Tuya (#162709) 2026-02-12 11:46:40 +01:00
Peter Kolbus
6960cd6853 Bump pyweatherflowudp to 1.5.0 (#162841)
Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>
2026-02-12 11:43:01 +01:00
dependabot[bot]
5bd86ba600 Bump docker/build-push-action from 6.18.0 to 6.19.1 (#162844)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-12 10:05:36 +01:00
LG-ThinQ-Integration
70bc49479d Add vacuum's activity table to LG ThinQ (#162616)
Co-authored-by: yunseon.park <yunseon.park@lge.com>
2026-02-12 10:02:29 +01:00
wh1t3f1r3
81e0c105d6 Fix Venstar integration crash when thermostat is unreachable (#162524) 2026-02-12 09:54:11 +01:00
Peter Kolbus
527e2aec1f Improve weatherflow type hints (#162843) 2026-02-12 09:07:30 +01:00
epenet
cd6661260c Use service helper to extract bring config entry (#162790) 2026-02-12 00:05:18 -05:00
epenet
efa522cc73 Use service helper to extract bosch alarm config entry (#162789) 2026-02-12 00:04:50 -05:00
Artur Pragacz
f9bd1b3d30 Rename registry imports in intent helper (#162765)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-12 00:04:29 -05:00
staticdev
4cfdb14714 Fix devcontainer defaultFormatter blocks (#162750) 2026-02-12 00:02:27 -05:00
epenet
6fb802e6b9 Use service helper to extract transmission config entry (#162814) 2026-02-11 23:58:35 -05:00
Christian Lackas
9b30fecb0c Fix absolute humidity sensor on HmIP-WGT glass thermostats (#162455) 2026-02-11 23:42:37 +01:00
Roman Lytvyn
e77acc1002 Add WATER_LEVEL sensor to homekit_controller (#161900) 2026-02-11 23:38:46 +01:00
cdnninja
07e8b780a2 Add DHCP Discovery to vesync (#162259)
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-02-11 23:27:35 +01:00
Denis Shulyaka
e060395786 Anthropic Structured Outputs support (#162515) 2026-02-11 23:25:46 +01:00
Denis Shulyaka
661b14dec5 Deprecate OpenAI actions (#162211) 2026-02-11 23:17:15 +01:00
Christian Lackas
b8e63b7ef6 Use direct DHW status for ViCare water heater state (#162591) 2026-02-11 23:07:56 +01:00
Abílio Costa
fd78e35a86 Align number unit converters with sensor (#162662) 2026-02-11 23:07:04 +01:00
Mick Vleeshouwer
db55dfe3c7 Improve device information in Overkiz (#162419) 2026-02-11 22:39:34 +01:00
Joost Lekkerkerker
bda3121f98 Add snapshot tests to waterfurnace sensors (#162594) 2026-02-11 22:28:41 +01:00
dontinelli
fd4981f3e2 Split up coordinators in solarlog (#161169)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-11 22:23:32 +01:00
Manu
ae1bedd94a Add uptime ratio and avg. response time sensors to Uptime Kuma (#162785) 2026-02-11 22:09:21 +01:00
hanwg
90b67f90fa Handle config entry not loaded for Telegram bot (#161951)
Co-authored-by: Franck Nijhof <frenck@frenck.nl>
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-11 22:05:21 +01:00
cdnninja
9c821fb5f5 Mark log unavailable as complete for vesync (#162464) 2026-02-11 22:03:12 +01:00
Graham Crockford
1f9691ace1 Add charge state to Victron BLE (#162593)
Co-authored-by: Graham Crockford <badgerwithagun@users.noreply.github.com>
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-02-11 21:59:56 +01:00
Paulus Schoutsen
5331cd99c6 Google Gen AI: Increase max iterations for AI Task (#162600)
Co-authored-by: Claude <noreply@anthropic.com>
2026-02-11 21:55:37 +01:00
Denis Shulyaka
1c3f24c78f Add TTS support for OpenAI (#162468)
Co-authored-by: Norbert Rittel <norbert@rittel.de>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-11 21:37:49 +01:00
Kamil Breguła
e179e74df3 Support dual cook oven in SmartThings (#156561)
Co-authored-by: mik-laj <12058428+mik-laj@users.noreply.github.com>
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-11 21:27:40 +01:00
wollew
98602bd311 Bump pyvlx to 0.2.29 (#162829) 2026-02-11 21:10:44 +01:00
Jeef
5f01124c74 Bump typedmonarchmoney to 0.7.0 (#162686) 2026-02-11 19:44:08 +00:00
Brett Adams
4b5368be8e Complete config-entry-unloading quality check in Teslemetry (#161956)
Co-authored-by: Claude <noreply@anthropic.com>
2026-02-11 20:31:30 +01:00
Brett Adams
6379014f13 Use chained comparison in Teslemetry update platform (#161950)
Co-authored-by: Claude <noreply@anthropic.com>
2026-02-11 20:07:04 +01:00
Joost Lekkerkerker
aa640020be Bump pySmartThings to 3.5.2 (#162809)
Co-authored-by: Josef Zweck <josef@zweck.dev>
2026-02-11 20:02:28 +01:00
Simone Chemelli
92f4e600d1 Fix alarm refresh warning for Comelit SimpleHome (#162710) 2026-02-11 19:36:57 +01:00
Andreas Jakl
25a6b6fa65 Add switch platform to nrgkick integration for enabling or pausing car charging (#162563)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-11 19:36:36 +01:00
Manu
3cbe1295f9 Bump pythonkuma to 0.4.1 (#162773) 2026-02-11 19:35:12 +01:00
Erwin Douna
72581fb2b1 Add endpoint system df information (#160134) 2026-02-11 19:28:49 +01:00
Erwin Douna
97c89590e0 Portainer fix multiple environments & containers (#153674) 2026-02-11 19:21:36 +01:00
epenet
b6ba86f3c1 Use service helper to extract onedrive config entry (#162803) 2026-02-11 10:16:55 -08:00
epenet
cedc291872 Use service helper to extract tado config entry (#162812) 2026-02-11 10:15:38 -08:00
epenet
1d30486f82 Use service helper to extract velbus config entry (#162813) 2026-02-11 10:15:19 -08:00
Christopher Fenner
9f1b4c9035 Improve EnOcean config flow (#162751) 2026-02-11 19:14:45 +01:00
Christian Lackas
80ebb34ad1 Add smoke detector extended properties to homematicip_cloud (#161629)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-11 18:56:39 +01:00
hanwg
e0e11fd99d Fix bug in edit_message_media action for Telegram bot (#162762) 2026-02-11 17:55:14 +01:00
Artur Pragacz
578a933f30 Move entity name helper to module-level function (#162766) 2026-02-11 17:54:53 +01:00
Christian Lackas
57493a1f69 Add ELV-SH-SB8 Status Board switch support to homematicip_cloud (#161668) 2026-02-11 17:43:51 +01:00
Simone Chemelli
3a4100fa94 Fix image platform state for Vodafone Station (#162747)
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-02-11 17:39:51 +01:00
theobld-ww
0c1af1d613 Add switch entities to Watts Vision + (#162699)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-11 17:39:17 +01:00
starkillerOG
4e46431798 Add additional Reolink PTZ buttons (#162793) 2026-02-11 17:29:24 +01:00
starkillerOG
bec66f49a2 Add Reolink PTZ patrol status (#162796) 2026-02-11 17:29:05 +01:00
epenet
4019768fa1 Use service helper to extract easyenergy config entry (#162791) 2026-02-11 17:24:57 +01:00
epenet
25d902fd3e Use service helper to extract google_photos config entry (#162792) 2026-02-11 17:24:44 +01:00
epenet
30f006538d Use service helper to extract google_sheets config entry (#162794) 2026-02-11 17:24:26 +01:00
epenet
15b1fee42d Use service helper to extract mastodon config entry (#162798) 2026-02-11 17:24:05 +01:00
epenet
d69b816459 Use service helper to extract mealie config entry (#162800) 2026-02-11 17:19:03 +01:00
epenet
bf79721e97 Use service helper to extract ohme config entry (#162801) 2026-02-11 17:18:25 +01:00
torben-iometer
66a0b44284 Fix missing values in battery_level in iometer (#162781) 2026-02-11 17:17:55 +01:00
Andy
8693294ea6 Add support for Nanoleaf Essentials / Replace aionanoleaf with aionanoleaf2 (#157295)
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-02-11 17:06:40 +01:00
epenet
14ac7927f1 Use service helper to extract seventeentrack config entry (#162807) 2026-02-11 16:08:40 +01:00
Willem-Jan van Rootselaar
b4674473d7 Fix BSBLAN water heater mapping and add on/off (#160256) 2026-02-11 16:06:26 +01:00
epenet
f01ece1d3d Move TadoConfigEntry declaration (#162811) 2026-02-11 16:01:31 +01:00
epenet
08160a41a6 Use service helper to extract swiss public transport config entry (#162810) 2026-02-11 15:59:50 +01:00
epenet
e617698770 Use service helper to extract stookwijzer config entry (#162808) 2026-02-11 15:59:36 +01:00
epenet
ee31bdf18b Use service helper to extract radarr config entry (#162805) 2026-02-11 15:41:02 +01:00
epenet
305b911c0d Use service helper to extract overseerr config entry (#162804) 2026-02-11 15:36:34 +01:00
epenet
842abf78d2 Use service helper to extract risco config entry (#162806) 2026-02-11 15:35:27 +01:00
Willem-Jan van Rootselaar
134e8d1c1b Bump python-bsblan to version 4.2.0 (#162786) 2026-02-11 15:31:06 +01:00
epenet
733e90f747 Use service helper to extract immich config entry (#162797) 2026-02-11 15:02:32 +01:00
Artur Pragacz
6c92f7a864 Add integration type to mobile_app (#157719) 2026-02-11 14:48:10 +01:00
abmantis
6a2fbecad3 Add infrared to .core_files.yaml 2026-02-11 12:51:45 +00:00
epenet
f69b5b6e8f Use service helper to extract amberelectric config entry (#162788) 2026-02-11 13:49:29 +01:00
Willem-Jan van Rootselaar
59e53ee7b7 Add HVAC action support for BSBLAN climate entity (#156828)
Co-authored-by: Erwin Douna <e.douna@gmail.com>
2026-02-11 13:27:18 +01:00
epenet
62e1b0118c Add service helper to get config entry (#162068) 2026-02-11 13:20:37 +01:00
Guido Schmitz
b7e9066b9d Add quality scale for devolo Home Control (#147483) 2026-02-11 12:17:02 +01:00
Erik Montnemery
2d6532b8ee Fix deadlock in ReloadServiceHelper (#162775) 2026-02-11 12:14:23 +01:00
Kamil Breguła
ebd1f1b00f Add pagination support for AWS S3 (#162578)
Co-authored-by: mik-laj <12058428+mik-laj@users.noreply.github.com>
Co-authored-by: Claude Haiku 4.5 <noreply@anthropic.com>
2026-02-11 11:27:33 +01:00
jameson_uk
95a1ceb080 feat: add info skills to alexa devices (#162097) 2026-02-11 11:23:07 +01:00
dvdinth
3f9e7d1dba Add IntelliClima integration and tests (#157363)
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-02-11 11:18:26 +01:00
epenet
eab80f78d9 Raise on missing color mode (#162715) 2026-02-11 11:12:52 +01:00
Robert Resch
aa9fdd56ec Bump cryptography to 46.0.5 (#162783) 2026-02-11 11:09:54 +01:00
epenet
c727261f67 Move matter fixture list to a constant (#162776) 2026-02-11 10:47:09 +01:00
jameson_uk
703c62aa74 Bump aioamazondevices to 12.0.0 (#162778) 2026-02-11 10:21:11 +01:00
Tomás Correia
6e1f90228b Fix Cloudflare R2 setup screen info (#162677) 2026-02-10 23:43:59 +01:00
LeoXie
3be089d2a5 Add Matter CO alarm state (#162627)
Co-authored-by: Ludovic BOUÉ <lboue@users.noreply.github.com>
2026-02-10 23:43:32 +01:00
Noah Husby
692d3d35cc Bump aiostreammagic to 2.12.1 (#162744) 2026-02-10 23:26:20 +01:00
starkillerOG
c52cb8362e Bump reolink-aio to 0.19.0 (#162672) 2026-02-10 23:24:55 +01:00
Boaz Cahlon
93ac215ab4 Add integration for Hegel Music Systems amplifiers (#153867)
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-02-10 22:56:48 +01:00
Michael
f9eb86b50a Improve recognizability of Wi-Fi qr code in FRITZ!Box Tools (#162752) 2026-02-10 21:55:20 +00:00
Christian Lackas
a7f9992a4e Bump homematicip to 2.6.0 (#162702)
Co-authored-by: Abílio Costa <abmantis@users.noreply.github.com>
2026-02-10 21:52:09 +00:00
Andreas Jakl
13fde0d135 Bump nrgkick-api to 1.7.1 (#162738) 2026-02-10 19:26:51 +01:00
tronikos
5105c6c50f Add last_changed and last_updated for the Opower statistics (#159101) 2026-02-10 17:08:58 +00:00
Josef Zweck
af152ebe50 Bump onedrive-personal-sdk to 0.1.2 (#162689) 2026-02-10 08:52:29 -08:00
Manu
dea4452e42 Set device entry type and integration type to service in Portainer integration (#162732) 2026-02-10 08:51:03 -08:00
abmantis
90bacbb98e Add infrared entity integration 2026-02-05 20:06:33 +00:00
810 changed files with 39066 additions and 5601 deletions

View File

@@ -22,6 +22,7 @@ base_platforms: &base_platforms
- homeassistant/components/calendar/**
- homeassistant/components/camera/**
- homeassistant/components/climate/**
- homeassistant/components/conversation/**
- homeassistant/components/cover/**
- homeassistant/components/date/**
- homeassistant/components/datetime/**
@@ -33,6 +34,7 @@ base_platforms: &base_platforms
- homeassistant/components/humidifier/**
- homeassistant/components/image/**
- homeassistant/components/image_processing/**
- homeassistant/components/infrared/**
- homeassistant/components/lawn_mower/**
- homeassistant/components/light/**
- homeassistant/components/lock/**
@@ -53,6 +55,7 @@ base_platforms: &base_platforms
- homeassistant/components/update/**
- homeassistant/components/vacuum/**
- homeassistant/components/valve/**
- homeassistant/components/wake_word/**
- homeassistant/components/water_heater/**
- homeassistant/components/weather/**
@@ -70,7 +73,6 @@ components: &components
- homeassistant/components/cloud/**
- homeassistant/components/config/**
- homeassistant/components/configurator/**
- homeassistant/components/conversation/**
- homeassistant/components/demo/**
- homeassistant/components/device_automation/**
- homeassistant/components/dhcp/**

View File

@@ -60,7 +60,13 @@
"[python]": {
"editor.defaultFormatter": "charliermarsh.ruff"
},
"[json][jsonc][yaml]": {
"[json]": {
"editor.defaultFormatter": "esbenp.prettier-vscode"
},
"[jsonc]": {
"editor.defaultFormatter": "esbenp.prettier-vscode"
},
"[yaml]": {
"editor.defaultFormatter": "esbenp.prettier-vscode"
},
"json.schemas": [

View File

@@ -225,7 +225,7 @@ jobs:
- name: Build base image
id: build
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
with:
context: .
file: ./Dockerfile
@@ -530,7 +530,7 @@ jobs:
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build Docker image
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
with:
context: . # So action will not pull the repository again
file: ./script/hassfest/docker/Dockerfile
@@ -543,7 +543,7 @@ jobs:
- name: Push Docker image
if: needs.init.outputs.channel != 'dev' && needs.init.outputs.publish == 'true'
id: push
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
with:
context: . # So action will not pull the repository again
file: ./script/hassfest/docker/Dockerfile

View File

@@ -1,6 +1,6 @@
repos:
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.15.0
rev: v0.15.1
hooks:
- id: ruff-check
args:

View File

@@ -283,10 +283,12 @@ homeassistant.components.imgw_pib.*
homeassistant.components.immich.*
homeassistant.components.incomfort.*
homeassistant.components.inels.*
homeassistant.components.infrared.*
homeassistant.components.input_button.*
homeassistant.components.input_select.*
homeassistant.components.input_text.*
homeassistant.components.integration.*
homeassistant.components.intelliclima.*
homeassistant.components.intent.*
homeassistant.components.intent_script.*
homeassistant.components.ios.*
@@ -363,7 +365,6 @@ homeassistant.components.my.*
homeassistant.components.mysensors.*
homeassistant.components.myuplink.*
homeassistant.components.nam.*
homeassistant.components.nanoleaf.*
homeassistant.components.nasweb.*
homeassistant.components.neato.*
homeassistant.components.nest.*
@@ -385,6 +386,7 @@ homeassistant.components.ohme.*
homeassistant.components.onboarding.*
homeassistant.components.oncue.*
homeassistant.components.onedrive.*
homeassistant.components.onedrive_for_business.*
homeassistant.components.onewire.*
homeassistant.components.onkyo.*
homeassistant.components.open_meteo.*

CODEOWNERS (generated)
View File

@@ -672,6 +672,8 @@ build.json @home-assistant/supervisor
/homeassistant/components/hdmi_cec/ @inytar
/tests/components/hdmi_cec/ @inytar
/homeassistant/components/heatmiser/ @andylockran
/homeassistant/components/hegel/ @boazca
/tests/components/hegel/ @boazca
/homeassistant/components/heos/ @andrewsayre
/tests/components/heos/ @andrewsayre
/homeassistant/components/here_travel_time/ @eifinger
@@ -717,6 +719,8 @@ build.json @home-assistant/supervisor
/tests/components/homematic/ @pvizeli
/homeassistant/components/homematicip_cloud/ @hahn-th @lackas
/tests/components/homematicip_cloud/ @hahn-th @lackas
/homeassistant/components/homevolt/ @danielhiversen
/tests/components/homevolt/ @danielhiversen
/homeassistant/components/homewizard/ @DCSBL
/tests/components/homewizard/ @DCSBL
/homeassistant/components/honeywell/ @rdfurman @mkmer
@@ -758,6 +762,8 @@ build.json @home-assistant/supervisor
/tests/components/icloud/ @Quentame @nzapponi
/homeassistant/components/idasen_desk/ @abmantis
/tests/components/idasen_desk/ @abmantis
/homeassistant/components/idrive_e2/ @patrickvorgers
/tests/components/idrive_e2/ @patrickvorgers
/homeassistant/components/igloohome/ @keithle888
/tests/components/igloohome/ @keithle888
/homeassistant/components/ign_sismologia/ @exxamalte
@@ -784,6 +790,8 @@ build.json @home-assistant/supervisor
/tests/components/inels/ @epdevlab
/homeassistant/components/influxdb/ @mdegat01
/tests/components/influxdb/ @mdegat01
/homeassistant/components/infrared/ @home-assistant/core
/tests/components/infrared/ @home-assistant/core
/homeassistant/components/inkbird/ @bdraco
/tests/components/inkbird/ @bdraco
/homeassistant/components/input_boolean/ @home-assistant/core
@@ -802,6 +810,8 @@ build.json @home-assistant/supervisor
/tests/components/insteon/ @teharris1
/homeassistant/components/integration/ @dgomes
/tests/components/integration/ @dgomes
/homeassistant/components/intelliclima/ @dvdinth
/tests/components/intelliclima/ @dvdinth
/homeassistant/components/intellifire/ @jeeftor
/tests/components/intellifire/ @jeeftor
/homeassistant/components/intent/ @home-assistant/core @synesthesiam @arturpragacz
@@ -1078,8 +1088,8 @@ build.json @home-assistant/supervisor
/tests/components/nam/ @bieniu
/homeassistant/components/namecheapdns/ @tr4nt0r
/tests/components/namecheapdns/ @tr4nt0r
/homeassistant/components/nanoleaf/ @milanmeu @joostlek
/tests/components/nanoleaf/ @milanmeu @joostlek
/homeassistant/components/nanoleaf/ @milanmeu @joostlek @loebi-ch @JaspervRijbroek @jonathanrobichaud4
/tests/components/nanoleaf/ @milanmeu @joostlek @loebi-ch @JaspervRijbroek @jonathanrobichaud4
/homeassistant/components/nasweb/ @nasWebio
/tests/components/nasweb/ @nasWebio
/homeassistant/components/nederlandse_spoorwegen/ @YarmoM @heindrichpaul
@@ -1172,6 +1182,8 @@ build.json @home-assistant/supervisor
/tests/components/ondilo_ico/ @JeromeHXP
/homeassistant/components/onedrive/ @zweckj
/tests/components/onedrive/ @zweckj
/homeassistant/components/onedrive_for_business/ @zweckj
/tests/components/onedrive_for_business/ @zweckj
/homeassistant/components/onewire/ @garbled1 @epenet
/tests/components/onewire/ @garbled1 @epenet
/homeassistant/components/onkyo/ @arturpragacz @eclair4151
@@ -1565,6 +1577,7 @@ build.json @home-assistant/supervisor
/homeassistant/components/speedtestdotnet/ @rohankapoorcom @engrbm87
/tests/components/speedtestdotnet/ @rohankapoorcom @engrbm87
/homeassistant/components/splunk/ @Bre77
/tests/components/splunk/ @Bre77
/homeassistant/components/spotify/ @frenck @joostlek
/tests/components/spotify/ @frenck @joostlek
/homeassistant/components/sql/ @gjohansson-ST @dougiteixeira

View File

@@ -13,6 +13,7 @@
"microsoft",
"msteams",
"onedrive",
"onedrive_for_business",
"xbox"
]
}

View File

@@ -18,7 +18,7 @@ from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import ActronAirConfigEntry, ActronAirSystemCoordinator
from .entity import ActronAirAcEntity, ActronAirZoneEntity
from .entity import ActronAirAcEntity, ActronAirZoneEntity, handle_actron_api_errors
PARALLEL_UPDATES = 0
@@ -136,16 +136,19 @@ class ActronSystemClimate(ActronAirAcEntity, ActronAirClimateEntity):
"""Return the target temperature."""
return self._status.user_aircon_settings.temperature_setpoint_cool_c
@handle_actron_api_errors
async def async_set_fan_mode(self, fan_mode: str) -> None:
"""Set a new fan mode."""
api_fan_mode = FAN_MODE_MAPPING_HA_TO_ACTRONAIR.get(fan_mode)
await self._status.user_aircon_settings.set_fan_mode(api_fan_mode)
@handle_actron_api_errors
async def async_set_hvac_mode(self, hvac_mode: HVACMode) -> None:
"""Set the HVAC mode."""
ac_mode = HVAC_MODE_MAPPING_HA_TO_ACTRONAIR.get(hvac_mode)
await self._status.ac_system.set_system_mode(ac_mode)
@handle_actron_api_errors
async def async_set_temperature(self, **kwargs: Any) -> None:
"""Set the temperature."""
temp = kwargs.get(ATTR_TEMPERATURE)
@@ -209,11 +212,13 @@ class ActronZoneClimate(ActronAirZoneEntity, ActronAirClimateEntity):
"""Return the target temperature."""
return self._zone.temperature_setpoint_cool_c
@handle_actron_api_errors
async def async_set_hvac_mode(self, hvac_mode: HVACMode) -> None:
"""Set the HVAC mode."""
is_enabled = hvac_mode != HVACMode.OFF
await self._zone.enable(is_enabled)
@handle_actron_api_errors
async def async_set_temperature(self, **kwargs: Any) -> None:
"""Set the temperature."""
await self._zone.set_temperature(temperature=kwargs.get(ATTR_TEMPERATURE))

View File

@@ -1,7 +1,12 @@
"""Base entity classes for Actron Air integration."""
from actron_neo_api import ActronAirZone
from collections.abc import Callable, Coroutine
from functools import wraps
from typing import Any, Concatenate
from actron_neo_api import ActronAirAPIError, ActronAirZone
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.update_coordinator import CoordinatorEntity
@@ -9,6 +14,26 @@ from .const import DOMAIN
from .coordinator import ActronAirSystemCoordinator
def handle_actron_api_errors[_EntityT: ActronAirEntity, **_P](
func: Callable[Concatenate[_EntityT, _P], Coroutine[Any, Any, Any]],
) -> Callable[Concatenate[_EntityT, _P], Coroutine[Any, Any, None]]:
"""Decorate Actron Air API calls to handle ActronAirAPIError exceptions."""
@wraps(func)
async def wrapper(self: _EntityT, *args: _P.args, **kwargs: _P.kwargs) -> None:
"""Wrap API calls with exception handling."""
try:
await func(self, *args, **kwargs)
except ActronAirAPIError as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="api_error",
translation_placeholders={"error": str(err)},
) from err
return wrapper
class ActronAirEntity(CoordinatorEntity[ActronAirSystemCoordinator]):
"""Base class for Actron Air entities."""

View File

@@ -26,7 +26,7 @@ rules:
unique-config-entry: done
# Silver
action-exceptions: todo
action-exceptions: done
config-entry-unloading: done
docs-configuration-parameters:
status: exempt

View File

@@ -49,6 +49,9 @@
}
},
"exceptions": {
"api_error": {
"message": "Failed to communicate with Actron Air device: {error}"
},
"auth_error": {
"message": "Authentication failed, please reauthenticate"
},

View File

@@ -10,7 +10,7 @@ from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import ActronAirConfigEntry, ActronAirSystemCoordinator
from .entity import ActronAirAcEntity
from .entity import ActronAirAcEntity, handle_actron_api_errors
PARALLEL_UPDATES = 0
@@ -105,10 +105,12 @@ class ActronAirSwitch(ActronAirAcEntity, SwitchEntity):
"""Return true if the switch is on."""
return self.entity_description.is_on_fn(self.coordinator)
@handle_actron_api_errors
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn the switch on."""
await self.entity_description.set_fn(self.coordinator, True)
@handle_actron_api_errors
async def async_turn_off(self, **kwargs: Any) -> None:
"""Turn the switch off."""
await self.entity_description.set_fn(self.coordinator, False)

View File

@@ -20,9 +20,10 @@ from homeassistant.const import (
Platform,
)
from homeassistant.core import HomeAssistant, ServiceCall
from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.exceptions import ConfigEntryNotReady, ServiceValidationError
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.typing import ConfigType
from .const import (
CONF_FORCE,
@@ -45,6 +46,7 @@ SERVICE_REFRESH_SCHEMA = vol.Schema(
{vol.Optional(CONF_FORCE, default=False): cv.boolean}
)
CONFIG_SCHEMA = cv.config_entry_only_config_schema(DOMAIN)
PLATFORMS = [Platform.SENSOR, Platform.SWITCH, Platform.UPDATE]
type AdGuardConfigEntry = ConfigEntry[AdGuardData]
@@ -57,6 +59,69 @@ class AdGuardData:
version: str
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the component."""
def _get_adguard_instances(hass: HomeAssistant) -> list[AdGuardHome]:
"""Get the AdGuardHome instances."""
entries: list[AdGuardConfigEntry] = hass.config_entries.async_loaded_entries(
DOMAIN
)
if not entries:
raise ServiceValidationError(
translation_domain=DOMAIN, translation_key="config_entry_not_loaded"
)
return [entry.runtime_data.client for entry in entries]
async def add_url(call: ServiceCall) -> None:
"""Service call to add a new filter subscription to AdGuard Home."""
for adguard in _get_adguard_instances(call.hass):
await adguard.filtering.add_url(
allowlist=False, name=call.data[CONF_NAME], url=call.data[CONF_URL]
)
async def remove_url(call: ServiceCall) -> None:
"""Service call to remove a filter subscription from AdGuard Home."""
for adguard in _get_adguard_instances(call.hass):
await adguard.filtering.remove_url(allowlist=False, url=call.data[CONF_URL])
async def enable_url(call: ServiceCall) -> None:
"""Service call to enable a filter subscription in AdGuard Home."""
for adguard in _get_adguard_instances(call.hass):
await adguard.filtering.enable_url(allowlist=False, url=call.data[CONF_URL])
async def disable_url(call: ServiceCall) -> None:
"""Service call to disable a filter subscription in AdGuard Home."""
for adguard in _get_adguard_instances(call.hass):
await adguard.filtering.disable_url(
allowlist=False, url=call.data[CONF_URL]
)
async def refresh(call: ServiceCall) -> None:
"""Service call to refresh the filter subscriptions in AdGuard Home."""
for adguard in _get_adguard_instances(call.hass):
await adguard.filtering.refresh(
allowlist=False, force=call.data[CONF_FORCE]
)
hass.services.async_register(
DOMAIN, SERVICE_ADD_URL, add_url, schema=SERVICE_ADD_URL_SCHEMA
)
hass.services.async_register(
DOMAIN, SERVICE_REMOVE_URL, remove_url, schema=SERVICE_URL_SCHEMA
)
hass.services.async_register(
DOMAIN, SERVICE_ENABLE_URL, enable_url, schema=SERVICE_URL_SCHEMA
)
hass.services.async_register(
DOMAIN, SERVICE_DISABLE_URL, disable_url, schema=SERVICE_URL_SCHEMA
)
hass.services.async_register(
DOMAIN, SERVICE_REFRESH, refresh, schema=SERVICE_REFRESH_SCHEMA
)
return True
async def async_setup_entry(hass: HomeAssistant, entry: AdGuardConfigEntry) -> bool:
"""Set up AdGuard Home from a config entry."""
session = async_get_clientsession(hass, entry.data[CONF_VERIFY_SSL])
@@ -79,56 +144,9 @@ async def async_setup_entry(hass: HomeAssistant, entry: AdGuardConfigEntry) -> b
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
async def add_url(call: ServiceCall) -> None:
"""Service call to add a new filter subscription to AdGuard Home."""
await adguard.filtering.add_url(
allowlist=False, name=call.data[CONF_NAME], url=call.data[CONF_URL]
)
async def remove_url(call: ServiceCall) -> None:
"""Service call to remove a filter subscription from AdGuard Home."""
await adguard.filtering.remove_url(allowlist=False, url=call.data[CONF_URL])
async def enable_url(call: ServiceCall) -> None:
"""Service call to enable a filter subscription in AdGuard Home."""
await adguard.filtering.enable_url(allowlist=False, url=call.data[CONF_URL])
async def disable_url(call: ServiceCall) -> None:
"""Service call to disable a filter subscription in AdGuard Home."""
await adguard.filtering.disable_url(allowlist=False, url=call.data[CONF_URL])
async def refresh(call: ServiceCall) -> None:
"""Service call to refresh the filter subscriptions in AdGuard Home."""
await adguard.filtering.refresh(allowlist=False, force=call.data[CONF_FORCE])
hass.services.async_register(
DOMAIN, SERVICE_ADD_URL, add_url, schema=SERVICE_ADD_URL_SCHEMA
)
hass.services.async_register(
DOMAIN, SERVICE_REMOVE_URL, remove_url, schema=SERVICE_URL_SCHEMA
)
hass.services.async_register(
DOMAIN, SERVICE_ENABLE_URL, enable_url, schema=SERVICE_URL_SCHEMA
)
hass.services.async_register(
DOMAIN, SERVICE_DISABLE_URL, disable_url, schema=SERVICE_URL_SCHEMA
)
hass.services.async_register(
DOMAIN, SERVICE_REFRESH, refresh, schema=SERVICE_REFRESH_SCHEMA
)
return True
async def async_unload_entry(hass: HomeAssistant, entry: AdGuardConfigEntry) -> bool:
"""Unload AdGuard Home config entry."""
unload_ok = await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
if not hass.config_entries.async_loaded_entries(DOMAIN):
# This is the last loaded instance of AdGuard, deregister any services
hass.services.async_remove(DOMAIN, SERVICE_ADD_URL)
hass.services.async_remove(DOMAIN, SERVICE_REMOVE_URL)
hass.services.async_remove(DOMAIN, SERVICE_ENABLE_URL)
hass.services.async_remove(DOMAIN, SERVICE_DISABLE_URL)
hass.services.async_remove(DOMAIN, SERVICE_REFRESH)
return unload_ok
return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
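
For reference, a minimal hedged sketch of invoking one of these always-registered AdGuard Home services from inside Home Assistant; the service and field names follow the schema and strings shown in this diff, the adguard domain is assumed from the commit title, and the call now raises ServiceValidationError when no config entry is loaded instead of the service being unregistered.

# Hedged sketch: calling the always-registered add_url service.
# Domain "adguard" is assumed from the commit title; service and field
# names follow the schema/strings in the diff above.
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ServiceValidationError


async def add_blocklist(hass: HomeAssistant) -> None:
    """Add a filter subscription via the adguard.add_url service."""
    try:
        await hass.services.async_call(
            "adguard",
            "add_url",
            {"name": "Example list", "url": "https://example.com/filter.txt"},
            blocking=True,
        )
    except ServiceValidationError:
        # Raised when no AdGuard Home config entry is currently loaded.
        pass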

View File

@@ -76,6 +76,11 @@
}
}
},
"exceptions": {
"config_entry_not_loaded": {
"message": "Config entry not loaded."
}
},
"services": {
"add_url": {
"description": "Adds a new filter subscription to AdGuard Home.",

View File

@@ -29,3 +29,24 @@ COUNTRY_DOMAINS = {
CATEGORY_SENSORS = "sensors"
CATEGORY_NOTIFICATIONS = "notifications"
# Map service translation keys to Alexa API
INFO_SKILLS_MAPPING = {
"calendar_today": "Alexa.Calendar.PlayToday",
"calendar_tomorrow": "Alexa.Calendar.PlayTomorrow",
"calendar_next": "Alexa.Calendar.PlayNext",
"date": "Alexa.Date.Play",
"time": "Alexa.Time.Play",
"national_news": "Alexa.News.NationalNews",
"flash_briefing": "Alexa.FlashBriefing.Play",
"traffic": "Alexa.Traffic.Play",
"weather": "Alexa.Weather.Play",
"cleanup": "Alexa.CleanUp.Play",
"good_morning": "Alexa.GoodMorning.Play",
"sing_song": "Alexa.SingASong.Play",
"fun_fact": "Alexa.FunFact.Play",
"tell_joke": "Alexa.Joke.Play",
"tell_story": "Alexa.TellStory.Play",
"im_home": "Alexa.ImHome.Play",
"goodnight": "Alexa.GoodNight.Play",
}

View File

@@ -1,5 +1,8 @@
{
"services": {
"send_info_skill": {
"service": "mdi:information"
},
"send_sound": {
"service": "mdi:cast-audio"
},

View File

@@ -8,5 +8,5 @@
"iot_class": "cloud_polling",
"loggers": ["aioamazondevices"],
"quality_scale": "platinum",
"requirements": ["aioamazondevices==11.1.3"]
"requirements": ["aioamazondevices==12.0.0"]
}

View File

@@ -1,5 +1,6 @@
"""Support for services."""
from aioamazondevices.const.metadata import ALEXA_INFO_SKILLS
from aioamazondevices.const.sounds import SOUNDS_LIST
import voluptuous as vol
@@ -9,13 +10,15 @@ from homeassistant.core import HomeAssistant, ServiceCall, callback
from homeassistant.exceptions import ServiceValidationError
from homeassistant.helpers import config_validation as cv, device_registry as dr
from .const import DOMAIN
from .const import DOMAIN, INFO_SKILLS_MAPPING
from .coordinator import AmazonConfigEntry
ATTR_TEXT_COMMAND = "text_command"
ATTR_SOUND = "sound"
ATTR_INFO_SKILL = "info_skill"
SERVICE_TEXT_COMMAND = "send_text_command"
SERVICE_SOUND_NOTIFICATION = "send_sound"
SERVICE_INFO_SKILL = "send_info_skill"
SCHEMA_SOUND_SERVICE = vol.Schema(
{
@@ -29,6 +32,12 @@ SCHEMA_CUSTOM_COMMAND = vol.Schema(
vol.Required(ATTR_DEVICE_ID): cv.string,
}
)
SCHEMA_INFO_SKILL = vol.Schema(
{
vol.Required(ATTR_INFO_SKILL): cv.string,
vol.Required(ATTR_DEVICE_ID): cv.string,
}
)
@callback
@@ -86,6 +95,17 @@ async def _async_execute_action(call: ServiceCall, attribute: str) -> None:
await coordinator.api.call_alexa_text_command(
coordinator.data[device.serial_number], value
)
elif attribute == ATTR_INFO_SKILL:
info_skill = INFO_SKILLS_MAPPING.get(value)
if info_skill not in ALEXA_INFO_SKILLS:
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="invalid_info_skill_value",
translation_placeholders={"info_skill": value},
)
await coordinator.api.call_alexa_info_skill(
coordinator.data[device.serial_number], value
)
async def async_send_sound_notification(call: ServiceCall) -> None:
@@ -98,6 +118,11 @@ async def async_send_text_command(call: ServiceCall) -> None:
await _async_execute_action(call, ATTR_TEXT_COMMAND)
async def async_send_info_skill(call: ServiceCall) -> None:
"""Send an info skill command to a AmazonDevice."""
await _async_execute_action(call, ATTR_INFO_SKILL)
@callback
def async_setup_services(hass: HomeAssistant) -> None:
"""Set up the services for the Amazon Devices integration."""
@@ -112,5 +137,10 @@ def async_setup_services(hass: HomeAssistant) -> None:
async_send_text_command,
SCHEMA_CUSTOM_COMMAND,
),
(
SERVICE_INFO_SKILL,
async_send_info_skill,
SCHEMA_INFO_SKILL,
),
):
hass.services.async_register(DOMAIN, service_name, method, schema=schema)
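
From the caller's side, a hedged sketch of triggering the new info-skill service; the domain, service name, fields, and the weather option come from the schema and services.yaml shown above, while the device ID is a placeholder for a real device registry entry.

# Hedged sketch: invoking the new send_info_skill service. Domain, service
# name, and fields match the schema above; device_id is a placeholder.
from homeassistant.core import HomeAssistant


async def play_weather_skill(hass: HomeAssistant, device_id: str) -> None:
    """Ask the selected Alexa device to play the weather info skill."""
    await hass.services.async_call(
        "alexa_devices",
        "send_info_skill",
        {"device_id": device_id, "info_skill": "weather"},
        blocking=True,
    )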

View File

@@ -67,3 +67,36 @@ send_sound:
- squeaky_12
- zap_01
translation_key: sound
send_info_skill:
fields:
device_id:
required: true
selector:
device:
integration: alexa_devices
info_skill:
required: true
example: date
default: date
selector:
select:
options:
- calendar_today
- calendar_tomorrow
- calendar_next
- date
- time
- national_news
- flash_briefing
- traffic
- weather
- cleanup
- good_morning
- sing_song
- fun_fact
- tell_joke
- tell_story
- im_home
- goodnight
translation_key: info_skill

View File

@@ -102,11 +102,35 @@
"invalid_device_id": {
"message": "Invalid device ID specified: {device_id}"
},
"invalid_info_skill_value": {
"message": "Invalid info skill {info_skill} specified"
},
"invalid_sound_value": {
"message": "Invalid sound {sound} specified"
}
},
"selector": {
"info_skill": {
"options": {
"calendar_next": "Calendar: Next event",
"calendar_today": "Calendar: Today's Calendar",
"calendar_tomorrow": "Calendar: Tomorrow's Calendar",
"cleanup": "Encourage me to clean up",
"date": "Date",
"flash_briefing": "Flash Briefing",
"fun_fact": "Tell me a fun fact",
"good_morning": "Good morning",
"goodnight": "Wish me a good night",
"im_home": "Welcome me home",
"national_news": "National News",
"sing_song": "Sing a song",
"tell_joke": "Tell me a joke",
"tell_story": "Tell me a story",
"time": "Time",
"traffic": "Traffic",
"weather": "Weather"
}
},
"sound": {
"options": {
"air_horn_03": "Air horn",
@@ -154,6 +178,20 @@
}
},
"services": {
"send_info_skill": {
"description": "Sends an info skill command to a device",
"fields": {
"device_id": {
"description": "[%key:component::alexa_devices::common::device_id_description%]",
"name": "Device"
},
"info_skill": {
"description": "The info skill command to send.",
"name": "Alexa info skill command"
}
},
"name": "Send info skill command"
},
"send_sound": {
"description": "Sends a sound to a device",
"fields": {

View File

@@ -3,7 +3,6 @@
from amberelectric.models.channel import ChannelType
import voluptuous as vol
from homeassistant.config_entries import ConfigEntryState
from homeassistant.const import ATTR_CONFIG_ENTRY_ID
from homeassistant.core import (
HomeAssistant,
@@ -13,6 +12,7 @@ from homeassistant.core import (
callback,
)
from homeassistant.exceptions import ServiceValidationError
from homeassistant.helpers import service
from homeassistant.helpers.selector import ConfigEntrySelector
from homeassistant.util.json import JsonValueType
@@ -37,23 +37,6 @@ GET_FORECASTS_SCHEMA = vol.Schema(
)
def async_get_entry(hass: HomeAssistant, config_entry_id: str) -> AmberConfigEntry:
"""Get the Amber config entry."""
if not (entry := hass.config_entries.async_get_entry(config_entry_id)):
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="integration_not_found",
translation_placeholders={"target": config_entry_id},
)
if entry.state is not ConfigEntryState.LOADED:
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="not_loaded",
translation_placeholders={"target": entry.title},
)
return entry
def get_forecasts(channel_type: str, data: dict) -> list[JsonValueType]:
"""Return an array of forecasts."""
results: list[JsonValueType] = []
@@ -109,7 +92,9 @@ def async_setup_services(hass: HomeAssistant) -> None:
async def handle_get_forecasts(call: ServiceCall) -> ServiceResponse:
channel_type = call.data[ATTR_CHANNEL_TYPE]
entry = async_get_entry(hass, call.data[ATTR_CONFIG_ENTRY_ID])
entry: AmberConfigEntry = service.async_get_config_entry(
hass, DOMAIN, call.data[ATTR_CONFIG_ENTRY_ID]
)
coordinator = entry.runtime_data
forecasts = get_forecasts(channel_type, coordinator.data)
return {"forecasts": forecasts}

View File

@@ -25,12 +25,6 @@
"exceptions": {
"channel_not_found": {
"message": "There is no {channel_type} channel at this site."
},
"integration_not_found": {
"message": "Config entry \"{target}\" not found in registry."
},
"not_loaded": {
"message": "{target} is not loaded."
}
},
"selector": {

View File

@@ -33,13 +33,19 @@ from .const import ATTR_LAST_DATA, TYPE_SOLARRADIATION, TYPE_SOLARRADIATION_LX
from .entity import AmbientWeatherEntity
TYPE_24HOURRAININ = "24hourrainin"
TYPE_AQI_PM10_24H_AQIN = "aqi_pm10_24h_aqin"
TYPE_AQI_PM10_AQIN = "aqi_pm10_aqin"
TYPE_AQI_PM25 = "aqi_pm25"
TYPE_AQI_PM25_24H = "aqi_pm25_24h"
TYPE_AQI_PM25_24H_AQIN = "aqi_pm25_24h_aqin"
TYPE_AQI_PM25_AQIN = "aqi_pm25_aqin"
TYPE_AQI_PM25_IN = "aqi_pm25_in"
TYPE_AQI_PM25_IN_24H = "aqi_pm25_in_24h"
TYPE_BAROMABSIN = "baromabsin"
TYPE_BAROMRELIN = "baromrelin"
TYPE_CO2 = "co2"
TYPE_CO2_IN_24H_AQIN = "co2_in_24h_aqin"
TYPE_CO2_IN_AQIN = "co2_in_aqin"
TYPE_DAILYRAININ = "dailyrainin"
TYPE_DEWPOINT = "dewPoint"
TYPE_EVENTRAININ = "eventrainin"
@@ -57,17 +63,23 @@ TYPE_HUMIDITY7 = "humidity7"
TYPE_HUMIDITY8 = "humidity8"
TYPE_HUMIDITY9 = "humidity9"
TYPE_HUMIDITYIN = "humidityin"
TYPE_LASTLIGHTNING = "lightning_time"
TYPE_LASTLIGHTNING_DISTANCE = "lightning_distance"
TYPE_LASTRAIN = "lastRain"
TYPE_LIGHTNING_PER_DAY = "lightning_day"
TYPE_LIGHTNING_PER_HOUR = "lightning_hour"
TYPE_LASTLIGHTNING_DISTANCE = "lightning_distance"
TYPE_LASTLIGHTNING = "lightning_time"
TYPE_MAXDAILYGUST = "maxdailygust"
TYPE_MONTHLYRAININ = "monthlyrainin"
TYPE_PM_IN_HUMIDITY_AQIN = "pm_in_humidity_aqin"
TYPE_PM_IN_TEMP_AQIN = "pm_in_temp_aqin"
TYPE_PM10_IN_24H_AQIN = "pm10_in_24h_aqin"
TYPE_PM10_IN_AQIN = "pm10_in_aqin"
TYPE_PM25 = "pm25"
TYPE_PM25_24H = "pm25_24h"
TYPE_PM25_IN = "pm25_in"
TYPE_PM25_IN_24H = "pm25_in_24h"
TYPE_PM25_IN_24H_AQIN = "pm25_in_24h_aqin"
TYPE_PM25_IN_AQIN = "pm25_in_aqin"
TYPE_SOILHUM1 = "soilhum1"
TYPE_SOILHUM10 = "soilhum10"
TYPE_SOILHUM2 = "soilhum2"
@@ -78,8 +90,8 @@ TYPE_SOILHUM6 = "soilhum6"
TYPE_SOILHUM7 = "soilhum7"
TYPE_SOILHUM8 = "soilhum8"
TYPE_SOILHUM9 = "soilhum9"
TYPE_SOILTEMP1F = "soiltemp1f"
TYPE_SOILTEMP10F = "soiltemp10f"
TYPE_SOILTEMP1F = "soiltemp1f"
TYPE_SOILTEMP2F = "soiltemp2f"
TYPE_SOILTEMP3F = "soiltemp3f"
TYPE_SOILTEMP4F = "soiltemp4f"
@@ -143,6 +155,86 @@ SENSOR_DESCRIPTIONS = (
translation_key="pm25_indoor_aqi_24h_average",
device_class=SensorDeviceClass.AQI,
),
SensorEntityDescription(
key=TYPE_PM25_IN_AQIN,
translation_key="pm25_indoor_aqin",
native_unit_of_measurement=CONCENTRATION_MICROGRAMS_PER_CUBIC_METER,
device_class=SensorDeviceClass.PM25,
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key=TYPE_PM25_IN_24H_AQIN,
native_unit_of_measurement=CONCENTRATION_MICROGRAMS_PER_CUBIC_METER,
translation_key="pm25_indoor_24h_aqin",
device_class=SensorDeviceClass.PM25,
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key=TYPE_PM10_IN_AQIN,
translation_key="pm10_indoor_aqin",
native_unit_of_measurement=CONCENTRATION_MICROGRAMS_PER_CUBIC_METER,
device_class=SensorDeviceClass.PM10,
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key=TYPE_PM10_IN_24H_AQIN,
translation_key="pm10_indoor_24h_aqin",
native_unit_of_measurement=CONCENTRATION_MICROGRAMS_PER_CUBIC_METER,
device_class=SensorDeviceClass.PM10,
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key=TYPE_CO2_IN_AQIN,
translation_key="co2_indoor_aqin",
native_unit_of_measurement=CONCENTRATION_PARTS_PER_MILLION,
device_class=SensorDeviceClass.CO2,
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key=TYPE_CO2_IN_24H_AQIN,
translation_key="co2_indoor_24h_aqin",
native_unit_of_measurement=CONCENTRATION_PARTS_PER_MILLION,
device_class=SensorDeviceClass.CO2,
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key=TYPE_PM_IN_TEMP_AQIN,
translation_key="pm_indoor_temp_aqin",
native_unit_of_measurement=UnitOfTemperature.FAHRENHEIT,
device_class=SensorDeviceClass.TEMPERATURE,
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key=TYPE_PM_IN_HUMIDITY_AQIN,
translation_key="pm_indoor_humidity_aqin",
native_unit_of_measurement=PERCENTAGE,
device_class=SensorDeviceClass.HUMIDITY,
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key=TYPE_AQI_PM25_AQIN,
translation_key="pm25_aqi_aqin",
device_class=SensorDeviceClass.AQI,
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key=TYPE_AQI_PM25_24H_AQIN,
translation_key="pm25_aqi_24h_aqin",
device_class=SensorDeviceClass.AQI,
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key=TYPE_AQI_PM10_AQIN,
translation_key="pm10_aqi_aqin",
device_class=SensorDeviceClass.AQI,
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key=TYPE_AQI_PM10_24H_AQIN,
translation_key="pm10_aqi_24h_aqin",
device_class=SensorDeviceClass.AQI,
state_class=SensorStateClass.MEASUREMENT,
),
SensorEntityDescription(
key=TYPE_BAROMABSIN,
translation_key="absolute_pressure",

View File

@@ -156,6 +156,12 @@
"absolute_pressure": {
"name": "Absolute pressure"
},
"co2_indoor_24h_aqin": {
"name": "CO2 Indoor 24h Average AQIN"
},
"co2_indoor_aqin": {
"name": "CO2 Indoor AQIN"
},
"daily_rain": {
"name": "Daily rain"
},
@@ -228,18 +234,39 @@
"monthly_rain": {
"name": "Monthly rain"
},
"pm10_aqi_24h_aqin": {
"name": "PM10 Indoor AQI 24h Average AQIN"
},
"pm10_aqi_aqin": {
"name": "PM10 Indoor AQI AQIN"
},
"pm10_indoor_24h_aqin": {
"name": "PM10 Indoor 24h Average AQIN"
},
"pm10_indoor_aqin": {
"name": "PM10 Indoor AQIN"
},
"pm25_24h_average": {
"name": "PM2.5 24 hour average"
},
"pm25_aqi": {
"name": "PM2.5 AQI"
},
"pm25_aqi_24h_aqin": {
"name": "PM2.5 Indoor AQI 24h Average AQIN"
},
"pm25_aqi_24h_average": {
"name": "PM2.5 AQI 24 hour average"
},
"pm25_aqi_aqin": {
"name": "PM2.5 Indoor AQI AQIN"
},
"pm25_indoor": {
"name": "PM2.5 indoor"
},
"pm25_indoor_24h_aqin": {
"name": "PM2.5 Indoor 24h AQIN"
},
"pm25_indoor_24h_average": {
"name": "PM2.5 indoor 24 hour average"
},
@@ -249,6 +276,15 @@
"pm25_indoor_aqi_24h_average": {
"name": "PM2.5 indoor AQI"
},
"pm25_indoor_aqin": {
"name": "PM2.5 Indoor AQIN"
},
"pm_indoor_humidity_aqin": {
"name": "Indoor Humidity AQIN"
},
"pm_indoor_temp_aqin": {
"name": "Indoor Temperature AQIN"
},
"relative_pressure": {
"name": "Relative pressure"
},

View File

@@ -491,22 +491,24 @@ class ConversationSubentryFlowHandler(ConfigSubentryFlow):
"role": "user",
"content": "Where are the following coordinates located: "
f"({zone_home.attributes[ATTR_LATITUDE]},"
f" {zone_home.attributes[ATTR_LONGITUDE]})? Please respond "
"only with a JSON object using the following schema:\n"
f"{convert(location_schema)}",
},
{
"role": "assistant",
"content": "{", # hints the model to skip any preamble
},
f" {zone_home.attributes[ATTR_LONGITUDE]})?",
}
],
max_tokens=cast(int, DEFAULT[CONF_MAX_TOKENS]),
output_config={
"format": {
"type": "json_schema",
"schema": {
**convert(location_schema),
"additionalProperties": False,
},
}
},
)
_LOGGER.debug("Model response: %s", response.content)
location_data = location_schema(
json.loads(
"{"
+ "".join(
"".join(
block.text
for block in response.content
if isinstance(block, anthropic.types.TextBlock)

View File

@@ -56,6 +56,15 @@ NON_ADAPTIVE_THINKING_MODELS = [
"claude-3",
]
UNSUPPORTED_STRUCTURED_OUTPUT_MODELS = [
"claude-opus-4-1",
"claude-opus-4-0",
"claude-opus-4-20250514",
"claude-sonnet-4-0",
"claude-sonnet-4-20250514",
"claude-3",
]
WEB_SEARCH_UNSUPPORTED_MODELS = [
"claude-3-haiku",
"claude-3-opus",

View File

@@ -20,6 +20,7 @@ from anthropic.types import (
DocumentBlockParam,
ImageBlockParam,
InputJSONDelta,
JSONOutputFormatParam,
MessageDeltaUsage,
MessageParam,
MessageStreamEvent,
@@ -94,6 +95,7 @@ from .const import (
MIN_THINKING_BUDGET,
NON_ADAPTIVE_THINKING_MODELS,
NON_THINKING_MODELS,
UNSUPPORTED_STRUCTURED_OUTPUT_MODELS,
)
# Max number of back and forth with the LLM to generate a response
@@ -697,8 +699,25 @@ class AnthropicBaseLLMEntity(Entity):
)
if structure and structure_name:
structure_name = slugify(structure_name)
if model_args["thinking"]["type"] == "disabled":
if not model.startswith(tuple(UNSUPPORTED_STRUCTURED_OUTPUT_MODELS)):
# Use native structured output for models that support it.
structure_name = None
model_args.setdefault("output_config", OutputConfigParam())[
"format"
] = JSONOutputFormatParam(
type="json_schema",
schema={
**convert(
structure,
custom_serializer=chat_log.llm_api.custom_serializer
if chat_log.llm_api
else llm.selector_serializer,
),
"additionalProperties": False,
},
)
elif model_args["thinking"]["type"] == "disabled":
structure_name = slugify(structure_name)
if not tools:
# Simplest case: no tools and no extended thinking
# Add a tool and force its use
@@ -718,6 +737,7 @@ class AnthropicBaseLLMEntity(Entity):
# force tool use or disable text responses, so we add a hint to the
# system prompt instead. With extended thinking, the model should be
# smart enough to use the tool.
structure_name = slugify(structure_name)
model_args["tool_choice"] = ToolChoiceAutoParam(
type="auto",
)
@@ -725,22 +745,24 @@ class AnthropicBaseLLMEntity(Entity):
model_args["system"].append( # type: ignore[union-attr]
TextBlockParam(
type="text",
text=f"Claude MUST use the '{structure_name}' tool to provide the final answer instead of plain text.",
text=f"Claude MUST use the '{structure_name}' tool to provide "
"the final answer instead of plain text.",
)
)
tools.append(
ToolParam(
name=structure_name,
description="Use this tool to reply to the user",
input_schema=convert(
structure,
custom_serializer=chat_log.llm_api.custom_serializer
if chat_log.llm_api
else llm.selector_serializer,
),
if structure_name:
tools.append(
ToolParam(
name=structure_name,
description="Use this tool to reply to the user",
input_schema=convert(
structure,
custom_serializer=chat_log.llm_api.custom_serializer
if chat_log.llm_api
else llm.selector_serializer,
),
)
)
)
if tools:
model_args["tools"] = tools
@@ -761,7 +783,7 @@ class AnthropicBaseLLMEntity(Entity):
_transform_stream(
chat_log,
stream,
output_tool=structure_name if structure else None,
output_tool=structure_name or None,
),
)
]
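Taken together with the UNSUPPORTED_STRUCTURED_OUTPUT_MODELS list above, the change routes models by prefix: native JSON-schema output where supported, a tool-forcing fallback otherwise. A simplified, standalone sketch of that dispatch follows; plain dicts stand in for the SDK's OutputConfigParam/JSONOutputFormatParam/ToolParam, and the helper name and example schema are illustrative, not the integration's code.

# Sketch only: routes a model name to native structured output or the tool fallback.
UNSUPPORTED_STRUCTURED_OUTPUT_MODELS = ["claude-opus-4-1", "claude-sonnet-4-0", "claude-3"]

def build_structured_reply_args(model: str, schema: dict) -> dict:
    """Return hypothetical request kwargs that make the model reply with structured data."""
    if not model.startswith(tuple(UNSUPPORTED_STRUCTURED_OUTPUT_MODELS)):
        # Native path: ask the API to emit a JSON object matching the schema.
        return {
            "output_config": {
                "format": {
                    "type": "json_schema",
                    "schema": {**schema, "additionalProperties": False},
                }
            }
        }
    # Fallback path: expose a tool carrying the schema and hint the model to call it.
    return {
        "tools": [
            {
                "name": "structured_reply",
                "description": "Use this tool to reply to the user",
                "input_schema": schema,
            }
        ]
    }

print(build_structured_reply_args("claude-haiku-4-5", {"type": "object", "properties": {}}))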

View File

@@ -297,14 +297,14 @@ class S3BackupAgent(BackupAgent):
return self._backup_cache
backups = {}
response = await self._client.list_objects_v2(Bucket=self._bucket)
# Filter for metadata files only
metadata_files = [
obj
for obj in response.get("Contents", [])
if obj["Key"].endswith(".metadata.json")
]
paginator = self._client.get_paginator("list_objects_v2")
metadata_files: list[dict[str, Any]] = []
async for page in paginator.paginate(Bucket=self._bucket):
metadata_files.extend(
obj
for obj in page.get("Contents", [])
if obj["Key"].endswith(".metadata.json")
)
for metadata_file in metadata_files:
try:
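The change above swaps a single list_objects_v2 call for a paginator so buckets with more than one page of objects are still fully enumerated. A minimal standalone sketch of that accumulation pattern, with a stubbed paginator standing in for the aiobotocore client's real one:

import asyncio

class _FakePaginator:
    """Stand-in that yields two pages, mimicking list_objects_v2 pagination."""

    async def paginate(self, Bucket: str):
        # The Bucket argument is accepted but unused in this stub.
        yield {"Contents": [{"Key": "a.tar"}, {"Key": "a.metadata.json"}]}
        yield {"Contents": [{"Key": "b.metadata.json"}]}

async def list_metadata_keys(paginator: _FakePaginator, bucket: str) -> list[str]:
    """Collect metadata object keys across all pages."""
    keys: list[str] = []
    async for page in paginator.paginate(Bucket=bucket):
        keys.extend(
            obj["Key"]
            for obj in page.get("Contents", [])
            if obj["Key"].endswith(".metadata.json")
        )
    return keys

print(asyncio.run(list_metadata_keys(_FakePaginator(), "backups")))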

View File

@@ -16,12 +16,18 @@ CONNECTION_TIMEOUT = 120 # 2 minutes
# Default TIMEOUT_FOR_UPLOAD is 128 seconds, which is too short for large backups
TIMEOUT_FOR_UPLOAD = 43200 # 12 hours
# Reduced retry count for download operations
# Default is 20 retries with exponential backoff, which can hang for 30+ minutes
# when there are persistent connection errors (e.g., SSL failures)
TRY_COUNT_DOWNLOAD = 3
class B2Http(BaseB2Http): # type: ignore[misc]
"""B2Http with extended timeouts for backup operations."""
CONNECTION_TIMEOUT = CONNECTION_TIMEOUT
TIMEOUT_FOR_UPLOAD = TIMEOUT_FOR_UPLOAD
TRY_COUNT_DOWNLOAD = TRY_COUNT_DOWNLOAD
class B2Session(BaseB2Session): # type: ignore[misc]

View File

@@ -40,6 +40,10 @@ CACHE_TTL = 300
# This prevents uploads from hanging indefinitely
UPLOAD_TIMEOUT = 43200 # 12 hours (matches B2 HTTP timeout)
# Timeout for metadata download operations (in seconds)
# This prevents the backup system from hanging when B2 connections fail
METADATA_DOWNLOAD_TIMEOUT = 60
def suggested_filenames(backup: AgentBackup) -> tuple[str, str]:
"""Return the suggested filenames for the backup and metadata files."""
@@ -413,12 +417,21 @@ class BackblazeBackupAgent(BackupAgent):
backups = {}
for file_name, file_version in all_files_in_prefix.items():
if file_name.endswith(METADATA_FILE_SUFFIX):
backup = await self._hass.async_add_executor_job(
self._process_metadata_file_sync,
file_name,
file_version,
all_files_in_prefix,
)
try:
backup = await asyncio.wait_for(
self._hass.async_add_executor_job(
self._process_metadata_file_sync,
file_name,
file_version,
all_files_in_prefix,
),
timeout=METADATA_DOWNLOAD_TIMEOUT,
)
except TimeoutError:
_LOGGER.warning(
"Timeout downloading metadata file %s", file_name
)
continue
if backup:
backups[backup.backup_id] = backup
self._backup_list_cache = backups
@@ -442,10 +455,18 @@ class BackblazeBackupAgent(BackupAgent):
if not file or not metadata_file_version:
raise BackupNotFound(f"Backup {backup_id} not found")
metadata_content = await self._hass.async_add_executor_job(
self._download_and_parse_metadata_sync,
metadata_file_version,
)
try:
metadata_content = await asyncio.wait_for(
self._hass.async_add_executor_job(
self._download_and_parse_metadata_sync,
metadata_file_version,
),
timeout=METADATA_DOWNLOAD_TIMEOUT,
)
except TimeoutError:
raise BackupAgentError(
f"Timeout downloading metadata for backup {backup_id}"
) from None
_LOGGER.debug(
"Successfully retrieved metadata for backup ID %s from file %s",
@@ -468,16 +489,27 @@ class BackblazeBackupAgent(BackupAgent):
# Process metadata files sequentially to avoid exhausting executor pool
for file_name, file_version in all_files_in_prefix.items():
if file_name.endswith(METADATA_FILE_SUFFIX):
(
result_backup_file,
result_metadata_file_version,
) = await self._hass.async_add_executor_job(
self._process_metadata_file_for_id_sync,
file_name,
file_version,
backup_id,
all_files_in_prefix,
)
try:
(
result_backup_file,
result_metadata_file_version,
) = await asyncio.wait_for(
self._hass.async_add_executor_job(
self._process_metadata_file_for_id_sync,
file_name,
file_version,
backup_id,
all_files_in_prefix,
),
timeout=METADATA_DOWNLOAD_TIMEOUT,
)
except TimeoutError:
_LOGGER.warning(
"Timeout downloading metadata file %s while searching for backup %s",
file_name,
backup_id,
)
continue
if result_backup_file and result_metadata_file_version:
return result_backup_file, result_metadata_file_version
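The pattern above bounds each blocking metadata download with asyncio.wait_for so one stalled B2 connection cannot hang the whole listing. A self-contained sketch of the same pattern with stand-in names and a deliberately short timeout for demonstration:

import asyncio
import time

METADATA_DOWNLOAD_TIMEOUT = 0.1  # seconds; the integration uses 60

def _download_metadata_sync(name: str) -> dict:
    """Pretend to download and parse a metadata file (blocking)."""
    time.sleep(1)  # simulate a stalled connection
    return {"name": name}

async def fetch_with_timeout(name: str) -> dict | None:
    """Run the blocking download in the executor, bounded by a timeout."""
    loop = asyncio.get_running_loop()
    try:
        return await asyncio.wait_for(
            loop.run_in_executor(None, _download_metadata_sync, name),
            timeout=METADATA_DOWNLOAD_TIMEOUT,
        )
    except TimeoutError:
        print(f"Timeout downloading metadata file {name}")
        return None

print(asyncio.run(fetch_with_timeout("backup-1.metadata.json")))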

View File

@@ -8,11 +8,10 @@ from typing import Any
import voluptuous as vol
from homeassistant.config_entries import ConfigEntryState
from homeassistant.const import ATTR_CONFIG_ENTRY_ID
from homeassistant.core import HomeAssistant, ServiceCall, callback
from homeassistant.exceptions import HomeAssistantError, ServiceValidationError
from homeassistant.helpers import config_validation as cv
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import config_validation as cv, service
from homeassistant.util import dt as dt_util
from .const import ATTR_DATETIME, DOMAIN, SERVICE_SET_DATE_TIME
@@ -41,21 +40,10 @@ SET_DATE_TIME_SCHEMA = vol.Schema(
async def async_set_panel_date(call: ServiceCall) -> None:
"""Set the date and time on a bosch alarm panel."""
config_entry: BoschAlarmConfigEntry | None
value: dt.datetime = call.data.get(ATTR_DATETIME, dt_util.now())
entry_id = call.data[ATTR_CONFIG_ENTRY_ID]
if not (config_entry := call.hass.config_entries.async_get_entry(entry_id)):
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="integration_not_found",
translation_placeholders={"target": entry_id},
)
if config_entry.state is not ConfigEntryState.LOADED:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="not_loaded",
translation_placeholders={"target": config_entry.title},
)
config_entry: BoschAlarmConfigEntry = service.async_get_config_entry(
call.hass, DOMAIN, call.data[ATTR_CONFIG_ENTRY_ID]
)
panel = config_entry.runtime_data
try:
await panel.set_panel_date(value)
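The removed lookup-and-loaded checks are replaced by the shared service.async_get_config_entry helper used above. A hedged, standalone sketch of the behaviour it appears to centralize, written with stand-in types and exceptions rather than Home Assistant's actual implementation:

from dataclasses import dataclass
from enum import Enum

class EntryState(Enum):
    LOADED = "loaded"
    NOT_LOADED = "not_loaded"

@dataclass
class Entry:
    """Stand-in for a config entry."""
    entry_id: str
    title: str
    state: EntryState

def get_loaded_entry(entries: dict[str, Entry], entry_id: str) -> Entry:
    """Return the entry, failing if it is missing or not loaded."""
    entry = entries.get(entry_id)
    if entry is None:
        # Stand-in for a ServiceValidationError with a translation key.
        raise ValueError(f'Integration "{entry_id}" not found in registry.')
    if entry.state is not EntryState.LOADED:
        # Stand-in for a HomeAssistantError with a translation key.
        raise RuntimeError(f"{entry.title} is not loaded.")
    return entry

entries = {"abc": Entry("abc", "Bosch Alarm", EntryState.LOADED)}
print(get_loaded_entry(entries, "abc").title)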

View File

@@ -155,12 +155,6 @@
"incorrect_door_state": {
"message": "Door cannot be manipulated while it is momentarily unlocked."
},
"integration_not_found": {
"message": "Integration \"{target}\" not found in registry."
},
"not_loaded": {
"message": "{target} is not loaded."
},
"unknown_error": {
"message": "An unknown error occurred while setting the date and time on \"{target}\"."
}

View File

@@ -1,7 +1,5 @@
"""Actions for Bring! integration."""
from typing import TYPE_CHECKING
from bring_api import (
ActivityType,
BringAuthException,
@@ -13,7 +11,6 @@ import voluptuous as vol
from homeassistant.components.event import ATTR_EVENT_TYPE
from homeassistant.components.todo import DOMAIN as TODO_DOMAIN
from homeassistant.config_entries import ConfigEntryState
from homeassistant.const import ATTR_ENTITY_ID
from homeassistant.core import HomeAssistant, ServiceCall, callback
from homeassistant.exceptions import HomeAssistantError, ServiceValidationError
@@ -46,19 +43,6 @@ SERVICE_ACTIVITY_STREAM_REACTION_SCHEMA = vol.Schema(
)
def get_config_entry(hass: HomeAssistant, entry_id: str) -> BringConfigEntry:
"""Return config entry or raise if not found or not loaded."""
entry = hass.config_entries.async_get_entry(entry_id)
if TYPE_CHECKING:
assert entry
if entry.state is not ConfigEntryState.LOADED:
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="entry_not_loaded",
)
return entry
@callback
def async_setup_services(hass: HomeAssistant) -> None:
"""Set up services for Bring! integration."""
@@ -78,7 +62,9 @@ def async_setup_services(hass: HomeAssistant) -> None:
ATTR_ENTITY_ID: call.data[ATTR_ENTITY_ID],
},
)
config_entry = get_config_entry(hass, entity.config_entry_id)
config_entry: BringConfigEntry = service.async_get_config_entry(
hass, DOMAIN, entity.config_entry_id
)
coordinator = config_entry.runtime_data.data

View File

@@ -124,10 +124,6 @@
"entity_not_found": {
"message": "Failed to send reaction for Bring! — Unknown entity {entity_id}"
},
"entry_not_loaded": {
"message": "The account associated with this Bring! list is either not loaded or disabled in Home Assistant."
},
"notify_missing_argument": {
"message": "This action requires field {field}, please enter a valid value for {field}"
},

View File

@@ -4,7 +4,7 @@ from __future__ import annotations
from typing import Any, Final
from bsblan import BSBLANError
from bsblan import BSBLANError, get_hvac_action_category
from homeassistant.components.climate import (
ATTR_HVAC_MODE,
@@ -13,6 +13,7 @@ from homeassistant.components.climate import (
PRESET_NONE,
ClimateEntity,
ClimateEntityFeature,
HVACAction,
HVACMode,
)
from homeassistant.const import ATTR_TEMPERATURE
@@ -128,6 +129,15 @@ class BSBLANClimate(BSBLanEntity, ClimateEntity):
return BSBLAN_TO_HA_HVAC_MODE.get(hvac_mode_value)
return try_parse_enum(HVACMode, hvac_mode_value)
@property
def hvac_action(self) -> HVACAction | None:
"""Return the current running hvac action."""
action = self.coordinator.data.state.hvac_action
if not action or not isinstance(action.value, int):
return None
category = get_hvac_action_category(action.value)
return HVACAction(category.name.lower())
@property
def preset_mode(self) -> str | None:
"""Return the current preset mode."""

View File

@@ -7,7 +7,7 @@
"integration_type": "device",
"iot_class": "local_polling",
"loggers": ["bsblan"],
"requirements": ["python-bsblan==4.1.0"],
"requirements": ["python-bsblan==4.2.0"],
"zeroconf": [
{
"name": "bsb-lan*",

View File

@@ -9,10 +9,11 @@ from bsblan import BSBLANError, SetHotWaterParam
from homeassistant.components.water_heater import (
STATE_ECO,
STATE_OFF,
STATE_PERFORMANCE,
WaterHeaterEntity,
WaterHeaterEntityFeature,
)
from homeassistant.const import ATTR_TEMPERATURE, STATE_ON
from homeassistant.const import ATTR_TEMPERATURE
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.device_registry import format_mac
@@ -24,14 +25,16 @@ from .entity import BSBLanDualCoordinatorEntity
PARALLEL_UPDATES = 1
# Mapping between BSBLan and HA operation modes
OPERATION_MODES = {
"Eco": STATE_ECO, # Energy saving mode
"Off": STATE_OFF, # Protection mode
"On": STATE_ON, # Continuous comfort mode
# Mapping between BSBLan operating mode values and HA operation modes
BSBLAN_TO_HA_OPERATION_MODE: dict[int, str] = {
0: STATE_OFF, # Protection mode
1: STATE_PERFORMANCE, # Continuous comfort mode
2: STATE_ECO, # Eco/automatic mode
}
OPERATION_MODES_REVERSE = {v: k for k, v in OPERATION_MODES.items()}
HA_TO_BSBLAN_OPERATION_MODE: dict[str, int] = {
v: k for k, v in BSBLAN_TO_HA_OPERATION_MODE.items()
}
async def async_setup_entry(
@@ -63,13 +66,14 @@ class BSBLANWaterHeater(BSBLanDualCoordinatorEntity, WaterHeaterEntity):
_attr_supported_features = (
WaterHeaterEntityFeature.TARGET_TEMPERATURE
| WaterHeaterEntityFeature.OPERATION_MODE
| WaterHeaterEntityFeature.ON_OFF
)
def __init__(self, data: BSBLanData) -> None:
"""Initialize BSBLAN water heater."""
super().__init__(data.fast_coordinator, data.slow_coordinator, data)
self._attr_unique_id = format_mac(data.device.MAC)
self._attr_operation_list = list(OPERATION_MODES_REVERSE.keys())
self._attr_operation_list = list(HA_TO_BSBLAN_OPERATION_MODE.keys())
# Set temperature unit
self._attr_temperature_unit = data.fast_coordinator.client.get_temperature_unit
@@ -110,8 +114,11 @@ class BSBLANWaterHeater(BSBLanDualCoordinatorEntity, WaterHeaterEntity):
"""Return current operation."""
if self.coordinator.data.dhw.operating_mode is None:
return None
current_mode = self.coordinator.data.dhw.operating_mode.desc
return OPERATION_MODES.get(current_mode)
# The operating_mode.value is an integer (0=Off, 1=On, 2=Eco)
current_mode_value = self.coordinator.data.dhw.operating_mode.value
if isinstance(current_mode_value, int):
return BSBLAN_TO_HA_OPERATION_MODE.get(current_mode_value)
return None
@property
def current_temperature(self) -> float | None:
@@ -144,10 +151,12 @@ class BSBLANWaterHeater(BSBLanDualCoordinatorEntity, WaterHeaterEntity):
async def async_set_operation_mode(self, operation_mode: str) -> None:
"""Set new operation mode."""
bsblan_mode = OPERATION_MODES_REVERSE.get(operation_mode)
# Base class validates operation_mode is in operation_list before calling
bsblan_mode = HA_TO_BSBLAN_OPERATION_MODE[operation_mode]
try:
# Send numeric value as string - BSB-LAN API expects numeric mode values
await self.coordinator.client.set_hot_water(
SetHotWaterParam(operating_mode=bsblan_mode)
SetHotWaterParam(operating_mode=str(bsblan_mode))
)
except BSBLANError as err:
raise HomeAssistantError(
@@ -156,3 +165,11 @@ class BSBLANWaterHeater(BSBLanDualCoordinatorEntity, WaterHeaterEntity):
) from err
await self.coordinator.async_request_refresh()
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn the water heater on."""
await self.async_set_operation_mode(STATE_PERFORMANCE)
async def async_turn_off(self, **kwargs: Any) -> None:
"""Turn the water heater off."""
await self.async_set_operation_mode(STATE_OFF)

View File

@@ -29,6 +29,9 @@
"state": {
"off": "mdi:volume-low"
}
},
"room_correction": {
"default": "mdi:arrow-oscillating"
}
}
}

View File

@@ -8,6 +8,6 @@
"iot_class": "local_push",
"loggers": ["aiostreammagic"],
"quality_scale": "platinum",
"requirements": ["aiostreammagic==2.12.0"],
"requirements": ["aiostreammagic==2.12.1"],
"zeroconf": ["_stream-magic._tcp.local.", "_smoip._tcp.local."]
}

View File

@@ -62,6 +62,9 @@
},
"pre_amp": {
"name": "Pre-Amp"
},
"room_correction": {
"name": "Room correction"
}
}
},

View File

@@ -1,8 +1,8 @@
"""Support for Cambridge Audio switch entities."""
from collections.abc import Awaitable, Callable
from dataclasses import dataclass
from typing import Any
from dataclasses import dataclass, field
from typing import TYPE_CHECKING, Any
from aiostreammagic import StreamMagicClient
@@ -21,10 +21,18 @@ PARALLEL_UPDATES = 0
class CambridgeAudioSwitchEntityDescription(SwitchEntityDescription):
"""Describes Cambridge Audio switch entity."""
load_fn: Callable[[StreamMagicClient], bool] = field(default=lambda _: True)
value_fn: Callable[[StreamMagicClient], bool]
set_value_fn: Callable[[StreamMagicClient, bool], Awaitable[None]]
def room_correction_enabled(client: StreamMagicClient) -> bool:
"""Check if room correction is enabled."""
if TYPE_CHECKING:
assert client.audio.tilt_eq is not None
return client.audio.tilt_eq.enabled
CONTROL_ENTITIES: tuple[CambridgeAudioSwitchEntityDescription, ...] = (
CambridgeAudioSwitchEntityDescription(
key="pre_amp",
@@ -40,6 +48,14 @@ CONTROL_ENTITIES: tuple[CambridgeAudioSwitchEntityDescription, ...] = (
value_fn=lambda client: client.update.early_update,
set_value_fn=lambda client, value: client.set_early_update(value),
),
CambridgeAudioSwitchEntityDescription(
key="room_correction",
translation_key="room_correction",
entity_category=EntityCategory.CONFIG,
load_fn=lambda client: client.audio.tilt_eq is not None,
value_fn=room_correction_enabled,
set_value_fn=lambda client, value: client.set_room_correction_mode(value),
),
)
@@ -49,9 +65,11 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Cambridge Audio switch entities based on a config entry."""
client: StreamMagicClient = entry.runtime_data
async_add_entities(
CambridgeAudioSwitch(entry.runtime_data, description)
for description in CONTROL_ENTITIES
if description.load_fn(client)
)
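The new load_fn field lets each switch description decide, per connected client, whether its entity should be created at all. A minimal standalone sketch of that gating pattern with stand-in Client and description types:

from collections.abc import Callable
from dataclasses import dataclass, field

@dataclass
class Client:
    """Stand-in for StreamMagicClient; tilt_eq_enabled is None when room correction is unsupported."""
    tilt_eq_enabled: bool | None = None

@dataclass(frozen=True, kw_only=True)
class SwitchDescription:
    key: str
    value_fn: Callable[[Client], bool]
    load_fn: Callable[[Client], bool] = field(default=lambda _: True)

DESCRIPTIONS = (
    SwitchDescription(key="pre_amp", value_fn=lambda c: True),
    SwitchDescription(
        key="room_correction",
        value_fn=lambda c: bool(c.tilt_eq_enabled),
        load_fn=lambda c: c.tilt_eq_enabled is not None,
    ),
)

client = Client(tilt_eq_enabled=None)
print([d.key for d in DESCRIPTIONS if d.load_fn(client)])  # ['pre_amp']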

View File

@@ -2,86 +2,23 @@
from __future__ import annotations
import asyncio
from dataclasses import dataclass
from datetime import datetime, timedelta
import logging
import socket
import pycfdns
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_API_TOKEN, CONF_ZONE
from homeassistant.core import HomeAssistant, ServiceCall
from homeassistant.exceptions import (
ConfigEntryAuthFailed,
ConfigEntryNotReady,
HomeAssistantError,
)
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.event import async_track_time_interval
from homeassistant.util.location import async_detect_location_info
from homeassistant.util.network import is_ipv4_address
from .const import CONF_RECORDS, DEFAULT_UPDATE_INTERVAL, DOMAIN, SERVICE_UPDATE_RECORDS
_LOGGER = logging.getLogger(__name__)
type CloudflareConfigEntry = ConfigEntry[CloudflareRuntimeData]
@dataclass
class CloudflareRuntimeData:
"""Runtime data for Cloudflare config entry."""
client: pycfdns.Client
dns_zone: pycfdns.ZoneModel
from .const import DOMAIN, SERVICE_UPDATE_RECORDS
from .coordinator import CloudflareConfigEntry, CloudflareCoordinator
async def async_setup_entry(hass: HomeAssistant, entry: CloudflareConfigEntry) -> bool:
"""Set up Cloudflare from a config entry."""
session = async_get_clientsession(hass)
client = pycfdns.Client(
api_token=entry.data[CONF_API_TOKEN],
client_session=session,
)
entry.runtime_data = CloudflareCoordinator(hass, entry)
await entry.runtime_data.async_config_entry_first_refresh()
try:
dns_zones = await client.list_zones()
dns_zone = next(
zone for zone in dns_zones if zone["name"] == entry.data[CONF_ZONE]
)
except pycfdns.AuthenticationException as error:
raise ConfigEntryAuthFailed from error
except pycfdns.ComunicationException as error:
raise ConfigEntryNotReady from error
# Since we are not using the coordinator for data reads, we need to add a dummy listener
entry.async_on_unload(entry.runtime_data.async_add_listener(lambda: None))
entry.runtime_data = CloudflareRuntimeData(client, dns_zone)
async def update_records(now: datetime) -> None:
"""Set up recurring update."""
try:
await _async_update_cloudflare(hass, entry)
except (
pycfdns.AuthenticationException,
pycfdns.ComunicationException,
) as error:
_LOGGER.error("Error updating zone %s: %s", entry.data[CONF_ZONE], error)
async def update_records_service(call: ServiceCall) -> None:
async def update_records_service(_: ServiceCall) -> None:
"""Set up service for manual trigger."""
try:
await _async_update_cloudflare(hass, entry)
except (
pycfdns.AuthenticationException,
pycfdns.ComunicationException,
) as error:
_LOGGER.error("Error updating zone %s: %s", entry.data[CONF_ZONE], error)
update_interval = timedelta(minutes=DEFAULT_UPDATE_INTERVAL)
entry.async_on_unload(
async_track_time_interval(hass, update_records, update_interval)
)
await entry.runtime_data.async_request_refresh()
hass.services.async_register(DOMAIN, SERVICE_UPDATE_RECORDS, update_records_service)
@@ -92,49 +29,3 @@ async def async_unload_entry(hass: HomeAssistant, entry: CloudflareConfigEntry)
"""Unload Cloudflare config entry."""
return True
async def _async_update_cloudflare(
hass: HomeAssistant,
entry: CloudflareConfigEntry,
) -> None:
client = entry.runtime_data.client
dns_zone = entry.runtime_data.dns_zone
target_records: list[str] = entry.data[CONF_RECORDS]
_LOGGER.debug("Starting update for zone %s", dns_zone["name"])
records = await client.list_dns_records(zone_id=dns_zone["id"], type="A")
_LOGGER.debug("Records: %s", records)
session = async_get_clientsession(hass, family=socket.AF_INET)
location_info = await async_detect_location_info(session)
if not location_info or not is_ipv4_address(location_info.ip):
raise HomeAssistantError("Could not get external IPv4 address")
filtered_records = [
record
for record in records
if record["name"] in target_records and record["content"] != location_info.ip
]
if len(filtered_records) == 0:
_LOGGER.debug("All target records are up to date")
return
await asyncio.gather(
*[
client.update_dns_record(
zone_id=dns_zone["id"],
record_id=record["id"],
record_content=location_info.ip,
record_name=record["name"],
record_type=record["type"],
record_proxied=record["proxied"],
)
for record in filtered_records
]
)
_LOGGER.debug("Update for zone %s is complete", dns_zone["name"])

View File

@@ -0,0 +1,116 @@
"""Contains the Coordinator for updating the IP addresses of your Cloudflare DNS records."""
from __future__ import annotations
import asyncio
from datetime import timedelta
from logging import getLogger
import socket
import pycfdns
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_API_TOKEN, CONF_ZONE
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from homeassistant.util.location import async_detect_location_info
from homeassistant.util.network import is_ipv4_address
from .const import CONF_RECORDS, DEFAULT_UPDATE_INTERVAL
_LOGGER = getLogger(__name__)
type CloudflareConfigEntry = ConfigEntry[CloudflareCoordinator]
class CloudflareCoordinator(DataUpdateCoordinator[None]):
"""Coordinates records updates."""
config_entry: CloudflareConfigEntry
client: pycfdns.Client
zone: pycfdns.ZoneModel
def __init__(
self, hass: HomeAssistant, config_entry: CloudflareConfigEntry
) -> None:
"""Initialize an coordinator."""
super().__init__(
hass,
_LOGGER,
config_entry=config_entry,
name=config_entry.title,
update_interval=timedelta(minutes=DEFAULT_UPDATE_INTERVAL),
)
async def _async_setup(self) -> None:
"""Set up the coordinator."""
self.client = pycfdns.Client(
api_token=self.config_entry.data[CONF_API_TOKEN],
client_session=async_get_clientsession(self.hass),
)
try:
self.zone = next(
zone
for zone in await self.client.list_zones()
if zone["name"] == self.config_entry.data[CONF_ZONE]
)
except pycfdns.AuthenticationException as e:
raise ConfigEntryAuthFailed from e
except pycfdns.ComunicationException as e:
raise UpdateFailed("Error communicating with API") from e
async def _async_update_data(self) -> None:
"""Update records."""
_LOGGER.debug("Starting update for zone %s", self.zone["name"])
try:
records = await self.client.list_dns_records(
zone_id=self.zone["id"], type="A"
)
_LOGGER.debug("Records: %s", records)
target_records: list[str] = self.config_entry.data[CONF_RECORDS]
location_info = await async_detect_location_info(
async_get_clientsession(self.hass, family=socket.AF_INET)
)
if not location_info or not is_ipv4_address(location_info.ip):
raise UpdateFailed("Could not get external IPv4 address")
filtered_records = [
record
for record in records
if record["name"] in target_records
and record["content"] != location_info.ip
]
if len(filtered_records) == 0:
_LOGGER.debug("All target records are up to date")
return
await asyncio.gather(
*[
self.client.update_dns_record(
zone_id=self.zone["id"],
record_id=record["id"],
record_content=location_info.ip,
record_name=record["name"],
record_type=record["type"],
record_proxied=record["proxied"],
)
for record in filtered_records
]
)
_LOGGER.debug("Update for zone %s is complete", self.zone["name"])
except (
pycfdns.AuthenticationException,
pycfdns.ComunicationException,
) as e:
raise UpdateFailed(
f"Error updating zone {self.config_entry.data[CONF_ZONE]}"
) from e

View File

@@ -19,11 +19,11 @@
"secret_access_key": "Secret access key"
},
"data_description": {
"access_key_id": "Access key ID to connect to Cloudflare R2 (this is your Account ID)",
"access_key_id": "Access key ID to connect to Cloudflare R2",
"bucket": "Bucket must already exist and be writable by the provided credentials.",
"endpoint_url": "Cloudflare R2 S3-compatible endpoint.",
"prefix": "Optional folder path inside the bucket. Example: backups/homeassistant",
"secret_access_key": "Secret access key to connect to Cloudflare R2. See [Docs]({auth_docs_url})"
"secret_access_key": "Secret access key to connect to Cloudflare R2. See [Cloudflare documentation]({auth_docs_url})"
},
"title": "Add Cloudflare R2 bucket"
}

View File

@@ -144,7 +144,7 @@ class ComelitAlarmEntity(
"""Update state after action."""
self._area.human_status = area_state
self._area.armed = armed
await self.async_update_ha_state()
self.async_write_ha_state()
async def async_alarm_disarm(self, code: str | None = None) -> None:
"""Send disarm command."""

View File

@@ -11,7 +11,9 @@ from .coordinator import CompitConfigEntry, CompitDataUpdateCoordinator
PLATFORMS = [
Platform.CLIMATE,
Platform.NUMBER,
Platform.SELECT,
Platform.WATER_HEATER,
]

View File

@@ -1,5 +1,43 @@
{
"entity": {
"number": {
"boiler_target_temperature": {
"default": "mdi:water-boiler"
},
"boiler_target_temperature_const": {
"default": "mdi:water-boiler"
},
"heating_target_temperature_const": {
"default": "mdi:radiator"
},
"mixer_target_temperature": {
"default": "mdi:valve"
},
"mixer_target_temperature_zone": {
"default": "mdi:valve"
},
"target_temperature_comfort": {
"default": "mdi:thermometer"
},
"target_temperature_const": {
"default": "mdi:thermometer-lines"
},
"target_temperature_eco": {
"default": "mdi:leaf"
},
"target_temperature_eco_cooling": {
"default": "mdi:snowflake-thermometer"
},
"target_temperature_eco_winter": {
"default": "mdi:thermometer"
},
"target_temperature_holiday": {
"default": "mdi:beach"
},
"target_temperature_out_of_home": {
"default": "mdi:thermometer-off"
}
},
"select": {
"aero_by_pass": {
"default": "mdi:valve",

View File

@@ -0,0 +1,339 @@
"""Number platform for Compit integration."""
from dataclasses import dataclass
from compit_inext_api.consts import CompitParameter
from homeassistant.components.number import (
NumberDeviceClass,
NumberEntity,
NumberEntityDescription,
NumberMode,
)
from homeassistant.const import EntityCategory, UnitOfTemperature
from homeassistant.core import HomeAssistant
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN, MANUFACTURER_NAME
from .coordinator import CompitConfigEntry, CompitDataUpdateCoordinator
PARALLEL_UPDATES = 0
@dataclass(frozen=True, kw_only=True)
class CompitDeviceDescription:
"""Class to describe a Compit device."""
name: str
"""Name of the device."""
parameters: list[NumberEntityDescription]
"""Parameters of the device."""
DESCRIPTIONS: dict[CompitParameter, NumberEntityDescription] = {
CompitParameter.TARGET_TEMPERATURE_COMFORT: NumberEntityDescription(
key=CompitParameter.TARGET_TEMPERATURE_COMFORT.value,
translation_key="target_temperature_comfort",
native_min_value=0,
native_max_value=40,
native_step=0.1,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
device_class=NumberDeviceClass.TEMPERATURE,
mode=NumberMode.SLIDER,
entity_category=EntityCategory.CONFIG,
),
CompitParameter.TARGET_TEMPERATURE_ECO_WINTER: NumberEntityDescription(
key=CompitParameter.TARGET_TEMPERATURE_ECO_WINTER.value,
translation_key="target_temperature_eco_winter",
native_min_value=0,
native_max_value=40,
native_step=0.1,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
device_class=NumberDeviceClass.TEMPERATURE,
mode=NumberMode.SLIDER,
entity_category=EntityCategory.CONFIG,
),
CompitParameter.TARGET_TEMPERATURE_ECO_COOLING: NumberEntityDescription(
key=CompitParameter.TARGET_TEMPERATURE_ECO_COOLING.value,
translation_key="target_temperature_eco_cooling",
native_min_value=0,
native_max_value=40,
native_step=0.1,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
device_class=NumberDeviceClass.TEMPERATURE,
mode=NumberMode.SLIDER,
entity_category=EntityCategory.CONFIG,
),
CompitParameter.TARGET_TEMPERATURE_OUT_OF_HOME: NumberEntityDescription(
key=CompitParameter.TARGET_TEMPERATURE_OUT_OF_HOME.value,
translation_key="target_temperature_out_of_home",
native_min_value=0,
native_max_value=40,
native_step=0.1,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
device_class=NumberDeviceClass.TEMPERATURE,
mode=NumberMode.SLIDER,
entity_category=EntityCategory.CONFIG,
),
CompitParameter.TARGET_TEMPERATURE_ECO: NumberEntityDescription(
key=CompitParameter.TARGET_TEMPERATURE_ECO.value,
translation_key="target_temperature_eco",
native_min_value=0,
native_max_value=40,
native_step=0.1,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
device_class=NumberDeviceClass.TEMPERATURE,
mode=NumberMode.SLIDER,
entity_category=EntityCategory.CONFIG,
),
CompitParameter.TARGET_TEMPERATURE_HOLIDAY: NumberEntityDescription(
key=CompitParameter.TARGET_TEMPERATURE_HOLIDAY.value,
translation_key="target_temperature_holiday",
native_min_value=0,
native_max_value=40,
native_step=0.1,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
device_class=NumberDeviceClass.TEMPERATURE,
mode=NumberMode.SLIDER,
entity_category=EntityCategory.CONFIG,
),
CompitParameter.TARGET_TEMPERATURE_CONST: NumberEntityDescription(
key=CompitParameter.TARGET_TEMPERATURE_CONST.value,
translation_key="target_temperature_const",
native_min_value=0,
native_max_value=95,
native_step=0.1,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
device_class=NumberDeviceClass.TEMPERATURE,
mode=NumberMode.SLIDER,
entity_category=EntityCategory.CONFIG,
),
CompitParameter.HEATING_TARGET_TEMPERATURE_CONST: NumberEntityDescription(
key=CompitParameter.HEATING_TARGET_TEMPERATURE_CONST.value,
translation_key="heating_target_temperature_const",
native_min_value=0,
native_max_value=95,
native_step=0.1,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
device_class=NumberDeviceClass.TEMPERATURE,
mode=NumberMode.SLIDER,
entity_category=EntityCategory.CONFIG,
),
CompitParameter.MIXER_TARGET_TEMPERATURE: NumberEntityDescription(
key=CompitParameter.MIXER_TARGET_TEMPERATURE.value,
translation_key="mixer_target_temperature",
native_min_value=0,
native_max_value=90,
native_step=0.1,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
device_class=NumberDeviceClass.TEMPERATURE,
mode=NumberMode.SLIDER,
entity_category=EntityCategory.CONFIG,
),
CompitParameter.MIXER1_TARGET_TEMPERATURE: NumberEntityDescription(
key=CompitParameter.MIXER1_TARGET_TEMPERATURE.value,
translation_key="mixer_target_temperature_zone",
native_min_value=0,
native_max_value=95,
native_step=0.1,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
device_class=NumberDeviceClass.TEMPERATURE,
mode=NumberMode.SLIDER,
entity_category=EntityCategory.CONFIG,
translation_placeholders={"zone": "1"},
),
CompitParameter.MIXER2_TARGET_TEMPERATURE: NumberEntityDescription(
key=CompitParameter.MIXER2_TARGET_TEMPERATURE.value,
translation_key="mixer_target_temperature_zone",
native_min_value=0,
native_max_value=95,
native_step=0.1,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
device_class=NumberDeviceClass.TEMPERATURE,
mode=NumberMode.SLIDER,
entity_category=EntityCategory.CONFIG,
translation_placeholders={"zone": "2"},
),
CompitParameter.BOILER_TARGET_TEMPERATURE: NumberEntityDescription(
key=CompitParameter.BOILER_TARGET_TEMPERATURE.value,
translation_key="boiler_target_temperature",
native_min_value=0,
native_max_value=95,
native_step=0.1,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
device_class=NumberDeviceClass.TEMPERATURE,
mode=NumberMode.SLIDER,
entity_category=EntityCategory.CONFIG,
),
CompitParameter.BOILER_TARGET_TEMPERATURE_CONST: NumberEntityDescription(
key=CompitParameter.BOILER_TARGET_TEMPERATURE_CONST.value,
translation_key="boiler_target_temperature_const",
native_min_value=0,
native_max_value=90,
native_step=0.1,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
device_class=NumberDeviceClass.TEMPERATURE,
mode=NumberMode.SLIDER,
entity_category=EntityCategory.CONFIG,
),
}
DEVICE_DEFINITIONS: dict[int, CompitDeviceDescription] = {
7: CompitDeviceDescription(
name="Nano One",
parameters=[
DESCRIPTIONS[CompitParameter.TARGET_TEMPERATURE_COMFORT],
DESCRIPTIONS[CompitParameter.TARGET_TEMPERATURE_ECO],
DESCRIPTIONS[CompitParameter.TARGET_TEMPERATURE_HOLIDAY],
],
),
12: CompitDeviceDescription(
name="Nano Color",
parameters=[
DESCRIPTIONS[CompitParameter.TARGET_TEMPERATURE_COMFORT],
DESCRIPTIONS[CompitParameter.TARGET_TEMPERATURE_ECO_WINTER],
DESCRIPTIONS[CompitParameter.TARGET_TEMPERATURE_ECO_COOLING],
DESCRIPTIONS[CompitParameter.TARGET_TEMPERATURE_OUT_OF_HOME],
],
),
223: CompitDeviceDescription(
name="Nano Color 2",
parameters=[
DESCRIPTIONS[CompitParameter.TARGET_TEMPERATURE_COMFORT],
DESCRIPTIONS[CompitParameter.TARGET_TEMPERATURE_ECO_WINTER],
DESCRIPTIONS[CompitParameter.TARGET_TEMPERATURE_ECO_COOLING],
DESCRIPTIONS[CompitParameter.TARGET_TEMPERATURE_OUT_OF_HOME],
],
),
3: CompitDeviceDescription(
name="R810",
parameters=[
DESCRIPTIONS[CompitParameter.TARGET_TEMPERATURE_CONST],
],
),
34: CompitDeviceDescription(
name="r470",
parameters=[
DESCRIPTIONS[CompitParameter.HEATING_TARGET_TEMPERATURE_CONST],
],
),
221: CompitDeviceDescription(
name="R350.M",
parameters=[
DESCRIPTIONS[CompitParameter.MIXER_TARGET_TEMPERATURE],
],
),
91: CompitDeviceDescription(
name="R770RS / R771RS",
parameters=[
DESCRIPTIONS[CompitParameter.MIXER1_TARGET_TEMPERATURE],
DESCRIPTIONS[CompitParameter.MIXER2_TARGET_TEMPERATURE],
],
),
212: CompitDeviceDescription(
name="BioMax742",
parameters=[
DESCRIPTIONS[CompitParameter.BOILER_TARGET_TEMPERATURE],
],
),
210: CompitDeviceDescription(
name="EL750",
parameters=[
DESCRIPTIONS[CompitParameter.BOILER_TARGET_TEMPERATURE],
],
),
36: CompitDeviceDescription(
name="BioMax742",
parameters=[
DESCRIPTIONS[CompitParameter.BOILER_TARGET_TEMPERATURE_CONST],
],
),
75: CompitDeviceDescription(
name="BioMax772",
parameters=[
DESCRIPTIONS[CompitParameter.BOILER_TARGET_TEMPERATURE_CONST],
],
),
201: CompitDeviceDescription(
name="BioMax775",
parameters=[
DESCRIPTIONS[CompitParameter.BOILER_TARGET_TEMPERATURE_CONST],
],
),
}
async def async_setup_entry(
hass: HomeAssistant,
entry: CompitConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Compit number entities from a config entry."""
coordinator = entry.runtime_data
async_add_entities(
CompitNumber(
coordinator,
device_id,
device_definition.name,
entity_description,
)
for device_id, device in coordinator.connector.all_devices.items()
if (device_definition := DEVICE_DEFINITIONS.get(device.definition.code))
for entity_description in device_definition.parameters
)
class CompitNumber(CoordinatorEntity[CompitDataUpdateCoordinator], NumberEntity):
"""Representation of a Compit number entity."""
_attr_has_entity_name = True
def __init__(
self,
coordinator: CompitDataUpdateCoordinator,
device_id: int,
device_name: str,
entity_description: NumberEntityDescription,
) -> None:
"""Initialize the number entity."""
super().__init__(coordinator)
self.device_id = device_id
self.entity_description = entity_description
self._attr_unique_id = f"{device_id}_{entity_description.key}"
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, str(device_id))},
name=device_name,
manufacturer=MANUFACTURER_NAME,
model=device_name,
)
@property
def available(self) -> bool:
"""Return if entity is available."""
return (
super().available
and self.coordinator.connector.get_device(self.device_id) is not None
)
@property
def native_value(self) -> float | None:
"""Return the current value."""
value = self.coordinator.connector.get_current_value(
self.device_id, CompitParameter(self.entity_description.key)
)
if value is None or isinstance(value, str):
return None
return value
async def async_set_native_value(self, value: float) -> None:
"""Set new value."""
await self.coordinator.connector.set_device_parameter(
self.device_id, CompitParameter(self.entity_description.key), value
)
self.async_write_ha_state()

View File

@@ -33,6 +33,44 @@
}
},
"entity": {
"number": {
"boiler_target_temperature": {
"name": "Boiler target temperature"
},
"boiler_target_temperature_const": {
"name": "Constant boiler target temperature"
},
"heating_target_temperature_const": {
"name": "Constant heating target temperature"
},
"mixer_target_temperature": {
"name": "Mixer target temperature"
},
"mixer_target_temperature_zone": {
"name": "Mixer {zone} target temperature"
},
"target_temperature_comfort": {
"name": "Target comfort temperature"
},
"target_temperature_const": {
"name": "Constant target temperature"
},
"target_temperature_eco": {
"name": "Target eco temperature"
},
"target_temperature_eco_cooling": {
"name": "Target eco cooling temperature"
},
"target_temperature_eco_winter": {
"name": "Target eco winter temperature"
},
"target_temperature_holiday": {
"name": "Target holiday temperature"
},
"target_temperature_out_of_home": {
"name": "Target out of home temperature"
}
},
"select": {
"aero_by_pass": {
"name": "Bypass",

View File

@@ -0,0 +1,315 @@
"""Water heater platform for Compit integration."""
from dataclasses import dataclass
from typing import Any
from compit_inext_api.consts import CompitParameter
from propcache.api import cached_property
from homeassistant.components.water_heater import (
STATE_ECO,
STATE_OFF,
STATE_ON,
STATE_PERFORMANCE,
WaterHeaterEntity,
WaterHeaterEntityDescription,
WaterHeaterEntityFeature,
)
from homeassistant.const import ATTR_TEMPERATURE, PRECISION_WHOLE, UnitOfTemperature
from homeassistant.core import HomeAssistant
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN, MANUFACTURER_NAME
from .coordinator import CompitConfigEntry, CompitDataUpdateCoordinator
PARALLEL_UPDATES = 0
STATE_SCHEDULE = "schedule"
COMPIT_STATE_TO_HA = {
STATE_OFF: STATE_OFF,
STATE_ON: STATE_PERFORMANCE,
STATE_SCHEDULE: STATE_ECO,
}
HA_STATE_TO_COMPIT = {value: key for key, value in COMPIT_STATE_TO_HA.items()}
@dataclass(frozen=True, kw_only=True)
class CompitWaterHeaterEntityDescription(WaterHeaterEntityDescription):
"""Class to describe a Compit water heater device."""
min_temp: float
max_temp: float
supported_features: WaterHeaterEntityFeature
supports_current_temperature: bool = True
DEVICE_DEFINITIONS: dict[int, CompitWaterHeaterEntityDescription] = {
34: CompitWaterHeaterEntityDescription(
key="r470",
min_temp=0.0,
max_temp=75.0,
supported_features=WaterHeaterEntityFeature.TARGET_TEMPERATURE
| WaterHeaterEntityFeature.ON_OFF
| WaterHeaterEntityFeature.OPERATION_MODE,
),
91: CompitWaterHeaterEntityDescription(
key="R770RS / R771RS",
min_temp=30.0,
max_temp=80.0,
supported_features=WaterHeaterEntityFeature.TARGET_TEMPERATURE
| WaterHeaterEntityFeature.ON_OFF
| WaterHeaterEntityFeature.OPERATION_MODE,
),
92: CompitWaterHeaterEntityDescription(
key="r490",
min_temp=30.0,
max_temp=80.0,
supported_features=WaterHeaterEntityFeature.TARGET_TEMPERATURE
| WaterHeaterEntityFeature.ON_OFF
| WaterHeaterEntityFeature.OPERATION_MODE,
),
215: CompitWaterHeaterEntityDescription(
key="R480",
min_temp=30.0,
max_temp=80.0,
supported_features=WaterHeaterEntityFeature.TARGET_TEMPERATURE
| WaterHeaterEntityFeature.ON_OFF
| WaterHeaterEntityFeature.OPERATION_MODE,
),
222: CompitWaterHeaterEntityDescription(
key="R377B",
min_temp=30.0,
max_temp=75.0,
supported_features=WaterHeaterEntityFeature.TARGET_TEMPERATURE
| WaterHeaterEntityFeature.ON_OFF
| WaterHeaterEntityFeature.OPERATION_MODE,
),
224: CompitWaterHeaterEntityDescription(
key="R 900",
min_temp=0.0,
max_temp=70.0,
supported_features=WaterHeaterEntityFeature.TARGET_TEMPERATURE
| WaterHeaterEntityFeature.ON_OFF
| WaterHeaterEntityFeature.OPERATION_MODE,
),
36: CompitWaterHeaterEntityDescription(
key="BioMax742",
min_temp=0.0,
max_temp=75.0,
supported_features=WaterHeaterEntityFeature.TARGET_TEMPERATURE
| WaterHeaterEntityFeature.ON_OFF
| WaterHeaterEntityFeature.OPERATION_MODE,
),
75: CompitWaterHeaterEntityDescription(
key="BioMax772",
min_temp=0.0,
max_temp=75.0,
supported_features=WaterHeaterEntityFeature.TARGET_TEMPERATURE
| WaterHeaterEntityFeature.ON_OFF
| WaterHeaterEntityFeature.OPERATION_MODE,
),
201: CompitWaterHeaterEntityDescription(
key="BioMax775",
min_temp=0.0,
max_temp=75.0,
supported_features=WaterHeaterEntityFeature.TARGET_TEMPERATURE
| WaterHeaterEntityFeature.ON_OFF
| WaterHeaterEntityFeature.OPERATION_MODE,
),
210: CompitWaterHeaterEntityDescription(
key="EL750",
min_temp=30.0,
max_temp=80.0,
supported_features=WaterHeaterEntityFeature.TARGET_TEMPERATURE
| WaterHeaterEntityFeature.ON_OFF
| WaterHeaterEntityFeature.OPERATION_MODE,
),
44: CompitWaterHeaterEntityDescription(
key="SolarComp 951",
min_temp=0.0,
max_temp=85.0,
supported_features=WaterHeaterEntityFeature.TARGET_TEMPERATURE,
supports_current_temperature=False,
),
45: CompitWaterHeaterEntityDescription(
key="SolarComp971",
min_temp=0.0,
max_temp=75.0,
supported_features=WaterHeaterEntityFeature.TARGET_TEMPERATURE,
supports_current_temperature=False,
),
99: CompitWaterHeaterEntityDescription(
key="SolarComp971C",
min_temp=0.0,
max_temp=75.0,
supported_features=WaterHeaterEntityFeature.TARGET_TEMPERATURE,
supports_current_temperature=False,
),
53: CompitWaterHeaterEntityDescription(
key="R350.CWU",
min_temp=0.0,
max_temp=80.0,
supported_features=WaterHeaterEntityFeature.TARGET_TEMPERATURE,
),
}
async def async_setup_entry(
hass: HomeAssistant,
entry: CompitConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Compit water heater entities from a config entry."""
coordinator = entry.runtime_data
async_add_entities(
CompitWaterHeater(coordinator, device_id, entity_description)
for device_id, device in coordinator.connector.all_devices.items()
if (entity_description := DEVICE_DEFINITIONS.get(device.definition.code))
)
class CompitWaterHeater(
CoordinatorEntity[CompitDataUpdateCoordinator], WaterHeaterEntity
):
"""Representation of a Compit Water Heater."""
_attr_target_temperature_step = PRECISION_WHOLE
_attr_temperature_unit = UnitOfTemperature.CELSIUS
_attr_has_entity_name = True
_attr_name = None
entity_description: CompitWaterHeaterEntityDescription
def __init__(
self,
coordinator: CompitDataUpdateCoordinator,
device_id: int,
entity_description: CompitWaterHeaterEntityDescription,
) -> None:
"""Initialize the water heater."""
super().__init__(coordinator)
self.device_id = device_id
self.entity_description = entity_description
self._attr_unique_id = f"{device_id}_{entity_description.key}"
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, str(device_id))},
name=entity_description.key,
manufacturer=MANUFACTURER_NAME,
model=entity_description.key,
)
@property
def available(self) -> bool:
"""Return if entity is available."""
return (
super().available
and self.coordinator.connector.get_device(self.device_id) is not None
)
@cached_property
def min_temp(self) -> float:
"""Return the minimum temperature."""
return self.entity_description.min_temp
@cached_property
def max_temp(self) -> float:
"""Return the maximum temperature."""
return self.entity_description.max_temp
@cached_property
def supported_features(self) -> WaterHeaterEntityFeature:
"""Return the supported features."""
return self.entity_description.supported_features
@cached_property
def operation_list(self) -> list[str] | None:
"""Return the list of available operation modes."""
if (
self.entity_description.supported_features
& WaterHeaterEntityFeature.OPERATION_MODE
):
return [STATE_OFF, STATE_PERFORMANCE, STATE_ECO]
return None
@property
def target_temperature(self) -> float | None:
"""Return the set target temperature."""
value = self.coordinator.connector.get_current_value(
self.device_id, CompitParameter.DHW_TARGET_TEMPERATURE
)
if isinstance(value, float):
return value
return None
@property
def current_temperature(self) -> float | None:
"""Return the current temperature."""
if self.entity_description.supports_current_temperature is False:
return None
value = self.coordinator.connector.get_current_value(
self.device_id, CompitParameter.DHW_CURRENT_TEMPERATURE
)
if isinstance(value, float):
return value
return None
async def async_set_temperature(self, **kwargs: Any) -> None:
"""Set new target temperature."""
temperature = kwargs.get(ATTR_TEMPERATURE)
if temperature is None:
return
self._attr_target_temperature = temperature
await self.coordinator.connector.set_device_parameter(
self.device_id,
CompitParameter.DHW_TARGET_TEMPERATURE,
float(temperature),
)
self.async_write_ha_state()
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn the water heater on."""
await self.coordinator.connector.select_device_option(
self.device_id,
CompitParameter.DHW_ON_OFF,
HA_STATE_TO_COMPIT[STATE_PERFORMANCE],
)
self.async_write_ha_state()
async def async_turn_off(self, **kwargs: Any) -> None:
"""Turn the water heater off."""
await self.coordinator.connector.select_device_option(
self.device_id,
CompitParameter.DHW_ON_OFF,
HA_STATE_TO_COMPIT[STATE_OFF],
)
self.async_write_ha_state()
async def async_set_operation_mode(self, operation_mode: str) -> None:
"""Set new operation mode."""
await self.coordinator.connector.select_device_option(
self.device_id,
CompitParameter.DHW_ON_OFF,
HA_STATE_TO_COMPIT[operation_mode],
)
self.async_write_ha_state()
@property
def current_operation(self) -> str | None:
"""Return the current operation mode."""
on_off = self.coordinator.connector.get_current_option(
self.device_id, CompitParameter.DHW_ON_OFF
)
if on_off is None:
return None
return COMPIT_STATE_TO_HA.get(on_off)

View File

@@ -131,23 +131,29 @@ class CyncLightEntity(CyncBaseEntity, LightEntity):
async def async_turn_on(self, **kwargs: Any) -> None:
"""Process an action on the light."""
if not kwargs:
await self._device.turn_on()
converted_brightness: int | None = None
converted_color_temp: int | None = None
rgb: tuple[int, int, int] | None = None
elif kwargs.get(ATTR_COLOR_TEMP_KELVIN) is not None:
if kwargs.get(ATTR_COLOR_TEMP_KELVIN) is not None:
color_temp = kwargs.get(ATTR_COLOR_TEMP_KELVIN)
converted_color_temp = self._normalize_color_temp(color_temp)
await self._device.set_color_temp(converted_color_temp)
elif kwargs.get(ATTR_RGB_COLOR) is not None:
rgb = kwargs.get(ATTR_RGB_COLOR)
elif self.color_mode == ColorMode.RGB:
rgb = self._device.rgb
elif self.color_mode == ColorMode.COLOR_TEMP:
converted_color_temp = self._device.color_temp
await self._device.set_rgb(rgb)
elif kwargs.get(ATTR_BRIGHTNESS) is not None:
if kwargs.get(ATTR_BRIGHTNESS) is not None:
brightness = kwargs.get(ATTR_BRIGHTNESS)
converted_brightness = self._normalize_brightness(brightness)
elif self.color_mode != ColorMode.ONOFF:
converted_brightness = self._device.brightness
await self._device.set_brightness(converted_brightness)
await self._device.set_combo(
True, converted_brightness, converted_color_temp, rgb
)
async def async_turn_off(self, **kwargs: Any) -> None:
"""Turn off the light."""

View File

@@ -7,6 +7,6 @@
"integration_type": "device",
"iot_class": "local_polling",
"loggers": ["pydaikin"],
"requirements": ["pydaikin==2.17.1"],
"requirements": ["pydaikin==2.17.2"],
"zeroconf": ["_dkapi._tcp.local."]
}

View File

@@ -8,6 +8,7 @@
"integration_type": "hub",
"iot_class": "local_push",
"loggers": ["HomeControl", "Mydevolo", "MprmRest", "MprmWebsocket", "Mprm"],
"quality_scale": "silver",
"requirements": ["devolo-home-control-api==0.19.0"],
"zeroconf": ["_dvl-deviceapi._tcp.local."]
}

View File

@@ -0,0 +1,92 @@
rules:
# Bronze
action-setup:
status: exempt
comment: |
This integration does not provide additional actions.
appropriate-polling:
status: exempt
comment: |
This integration does not poll.
brands: done
common-modules: done
config-flow-test-coverage: done
config-flow: done
dependency-transparency: done
docs-actions:
status: exempt
comment: |
This integration does not provide additional actions.
docs-high-level-description: done
docs-installation-instructions: done
docs-removal-instructions: done
entity-event-setup: done
entity-unique-id: done
has-entity-name: done
runtime-data: done
test-before-configure: done
test-before-setup: done
unique-config-entry: done
# Silver
action-exceptions:
status: exempt
comment: |
This integration does not provide additional actions.
config-entry-unloading: done
docs-configuration-parameters:
status: exempt
comment: |
This integration does not have an options flow.
docs-installation-parameters: done
entity-unavailable: done
integration-owner: done
log-when-unavailable: done
parallel-updates:
status: exempt
comment: |
This integration does not poll.
reauthentication-flow: done
test-coverage: done
# Gold
devices: done
diagnostics: done
discovery-update-info:
status: exempt
comment: |
The information provided by the discovery is not used for more than displaying the integration in the UI.
discovery: done
docs-data-update: todo
docs-examples: todo
docs-known-limitations: done
docs-supported-devices: done
docs-supported-functions: done
docs-troubleshooting: todo
docs-use-cases: todo
dynamic-devices: todo
entity-category: done
entity-device-class: done
entity-disabled-by-default: done
entity-translations: done
exception-translations: done
icon-translations:
status: exempt
comment: |
This integration does not define custom icons. All entities use device class icons.
reconfiguration-flow:
status: exempt
comment: |
No configuration besides credentials.
repair-issues:
status: exempt
comment: |
This integration doesn't have any cases where raising an issue is needed.
stale-devices: done
# Platinum
async-dependency: todo
inject-websession:
status: exempt
comment: |
Integration does not use a web session.
strict-typing: done

View File

@@ -8,7 +8,7 @@ import voluptuous as vol
from homeassistant.const import CONF_ACCESS_TOKEN, CONF_DOMAIN
from homeassistant.core import HomeAssistant, ServiceCall, callback
from homeassistant.exceptions import HomeAssistantError, ServiceValidationError
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers import config_validation as cv, service
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.selector import ConfigEntrySelector
@@ -47,13 +47,9 @@ def get_config_entry(
translation_domain=DOMAIN,
translation_key="entry_not_selected",
)
return entries[0]
if not (entry := hass.config_entries.async_get_entry(entry_id)):
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="entry_not_found",
)
return entry
entry_id = entries[0].entry_id
return service.async_get_config_entry(hass, DOMAIN, entry_id)
async def update_domain_service(call: ServiceCall) -> None:

View File

@@ -10,7 +10,6 @@ from typing import Final
from easyenergy import Electricity, Gas, VatOption
import voluptuous as vol
from homeassistant.config_entries import ConfigEntryState
from homeassistant.core import (
HomeAssistant,
ServiceCall,
@@ -19,7 +18,7 @@ from homeassistant.core import (
callback,
)
from homeassistant.exceptions import ServiceValidationError
from homeassistant.helpers import selector
from homeassistant.helpers import selector, service
from homeassistant.util import dt as dt_util
from .const import DOMAIN
@@ -88,28 +87,9 @@ def __serialize_prices(prices: list[dict[str, float | datetime]]) -> ServiceResp
def __get_coordinator(call: ServiceCall) -> EasyEnergyDataUpdateCoordinator:
"""Get the coordinator from the entry."""
entry_id: str = call.data[ATTR_CONFIG_ENTRY]
entry: EasyEnergyConfigEntry | None = call.hass.config_entries.async_get_entry(
entry_id
entry: EasyEnergyConfigEntry = service.async_get_config_entry(
call.hass, DOMAIN, call.data[ATTR_CONFIG_ENTRY]
)
if not entry:
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="invalid_config_entry",
translation_placeholders={
"config_entry": entry_id,
},
)
if entry.state != ConfigEntryState.LOADED:
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="unloaded_config_entry",
translation_placeholders={
"config_entry": entry.title,
},
)
return entry.runtime_data

View File

@@ -44,14 +44,8 @@
}
},
"exceptions": {
"invalid_config_entry": {
"message": "Invalid config entry provided. Got {config_entry}"
},
"invalid_date": {
"message": "Invalid date provided. Got {date}"
},
"unloaded_config_entry": {
"message": "Invalid config entry provided. {config_entry} is not loaded."
}
},
"services": {

View File

@@ -6,6 +6,7 @@ import voluptuous as vol
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_DEVICE
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.selector import (
SelectSelector,
SelectSelectorConfig,
@@ -15,6 +16,12 @@ from homeassistant.helpers.selector import (
from . import dongle
from .const import DOMAIN, ERROR_INVALID_DONGLE_PATH, LOGGER
MANUAL_SCHEMA = vol.Schema(
{
vol.Required(CONF_DEVICE): cv.string,
}
)
class EnOceanFlowHandler(ConfigFlow, domain=DOMAIN):
"""Handle the enOcean config flows."""
@@ -49,17 +56,14 @@ class EnOceanFlowHandler(ConfigFlow, domain=DOMAIN):
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Propose a list of detected dongles."""
errors = {}
if user_input is not None:
if user_input[CONF_DEVICE] == self.MANUAL_PATH_VALUE:
return await self.async_step_manual()
if await self.validate_enocean_conf(user_input):
return self.create_enocean_entry(user_input)
errors = {CONF_DEVICE: ERROR_INVALID_DONGLE_PATH}
return await self.async_step_manual(user_input)
devices = await self.hass.async_add_executor_job(dongle.detect)
if len(devices) == 0:
return await self.async_step_manual(user_input)
return await self.async_step_manual()
devices.append(self.MANUAL_PATH_VALUE)
return self.async_show_form(
@@ -75,26 +79,21 @@ class EnOceanFlowHandler(ConfigFlow, domain=DOMAIN):
)
}
),
errors=errors,
)
async def async_step_manual(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Request manual USB dongle path."""
default_value = None
errors = {}
if user_input is not None:
if await self.validate_enocean_conf(user_input):
return self.create_enocean_entry(user_input)
default_value = user_input[CONF_DEVICE]
errors = {CONF_DEVICE: ERROR_INVALID_DONGLE_PATH}
return self.async_show_form(
step_id="manual",
data_schema=vol.Schema(
{vol.Required(CONF_DEVICE, default=default_value): str}
),
data_schema=self.add_suggested_values_to_schema(MANUAL_SCHEMA, user_input),
errors=errors,
)
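
The manual step now re-displays the same static `MANUAL_SCHEMA` and pre-fills it with `add_suggested_values_to_schema`, rather than rebuilding a schema with the previous input as a default. A minimal sketch of that pattern (the `example` domain and step are illustrative, error handling omitted):

```python
# Minimal sketch of the suggested-values pattern; domain and step are illustrative.
from typing import Any

import voluptuous as vol

from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_DEVICE
from homeassistant.helpers import config_validation as cv

STEP_SCHEMA = vol.Schema({vol.Required(CONF_DEVICE): cv.string})


class ExampleFlow(ConfigFlow, domain="example"):
    """Hypothetical flow showing form re-display with suggested values."""

    async def async_step_manual(
        self, user_input: dict[str, Any] | None = None
    ) -> ConfigFlowResult:
        # On re-display (e.g. after a validation error) the previous input is
        # shown as a suggestion instead of being baked into the schema.
        return self.async_show_form(
            step_id="manual",
            data_schema=self.add_suggested_values_to_schema(STEP_SCHEMA, user_input),
        )
```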

View File

@@ -9,49 +9,34 @@ Note that the API used by this integration's client does not support cooling.
from __future__ import annotations
from dataclasses import dataclass
from datetime import timedelta
import logging
from typing import Final
import evohomeasync as ec1
import evohomeasync2 as ec2
from evohomeasync2.const import SZ_CAN_BE_TEMPORARY, SZ_SYSTEM_MODE, SZ_TIMING_MODE
from evohomeasync2.schemas.const import (
S2_DURATION as SZ_DURATION,
S2_PERIOD as SZ_PERIOD,
SystemMode as EvoSystemMode,
)
import voluptuous as vol
from homeassistant.const import (
ATTR_ENTITY_ID,
ATTR_MODE,
CONF_PASSWORD,
CONF_SCAN_INTERVAL,
CONF_USERNAME,
Platform,
)
from homeassistant.core import HomeAssistant, ServiceCall, callback
from homeassistant.helpers import config_validation as cv, entity_registry as er
from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.discovery import async_load_platform
from homeassistant.helpers.dispatcher import async_dispatcher_send
from homeassistant.helpers.service import verify_domain_control
from homeassistant.helpers.typing import ConfigType
from homeassistant.util.hass_dict import HassKey
from .const import (
ATTR_DURATION,
ATTR_DURATION_UNTIL,
ATTR_PERIOD,
ATTR_SETPOINT,
CONF_LOCATION_IDX,
DOMAIN,
EVOHOME_DATA,
SCAN_INTERVAL_DEFAULT,
SCAN_INTERVAL_MINIMUM,
EvoService,
)
from .coordinator import EvoDataUpdateCoordinator
from .services import setup_service_functions
from .storage import TokenManager
_LOGGER = logging.getLogger(__name__)
@@ -72,26 +57,6 @@ CONFIG_SCHEMA: Final = vol.Schema(
extra=vol.ALLOW_EXTRA,
)
# system mode schemas are built dynamically when the services are registered
# because supported modes can vary for edge-case systems
RESET_ZONE_OVERRIDE_SCHEMA: Final = vol.Schema(
{vol.Required(ATTR_ENTITY_ID): cv.entity_id}
)
SET_ZONE_OVERRIDE_SCHEMA: Final = vol.Schema(
{
vol.Required(ATTR_ENTITY_ID): cv.entity_id,
vol.Required(ATTR_SETPOINT): vol.All(
vol.Coerce(float), vol.Range(min=4.0, max=35.0)
),
vol.Optional(ATTR_DURATION_UNTIL): vol.All(
cv.time_period, vol.Range(min=timedelta(days=0), max=timedelta(days=1))
),
}
)
EVOHOME_KEY: HassKey[EvoData] = HassKey(DOMAIN)
@dataclass
class EvoData:
@@ -130,7 +95,7 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
assert coordinator.tcs is not None # mypy
hass.data[EVOHOME_KEY] = EvoData(
hass.data[EVOHOME_DATA] = EvoData(
coordinator=coordinator,
loc_idx=coordinator.loc_idx,
tcs=coordinator.tcs,
@@ -147,132 +112,3 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
setup_service_functions(hass, coordinator)
return True
@callback
def setup_service_functions(
hass: HomeAssistant, coordinator: EvoDataUpdateCoordinator
) -> None:
"""Set up the service handlers for the system/zone operating modes.
Not all Honeywell TCC-compatible systems support all operating modes. In addition,
each mode will require one of four distinct service schemas. This has to be
enumerated before registering the appropriate handlers.
It appears that all TCC-compatible systems support the same three zone modes.
"""
@verify_domain_control(DOMAIN)
async def force_refresh(call: ServiceCall) -> None:
"""Obtain the latest state data via the vendor's RESTful API."""
await coordinator.async_refresh()
@verify_domain_control(DOMAIN)
async def set_system_mode(call: ServiceCall) -> None:
"""Set the system mode."""
assert coordinator.tcs is not None # mypy
payload = {
"unique_id": coordinator.tcs.id,
"service": call.service,
"data": call.data,
}
async_dispatcher_send(hass, DOMAIN, payload)
@verify_domain_control(DOMAIN)
async def set_zone_override(call: ServiceCall) -> None:
"""Set the zone override (setpoint)."""
entity_id = call.data[ATTR_ENTITY_ID]
registry = er.async_get(hass)
registry_entry = registry.async_get(entity_id)
if registry_entry is None or registry_entry.platform != DOMAIN:
raise ValueError(f"'{entity_id}' is not a known {DOMAIN} entity")
if registry_entry.domain != "climate":
raise ValueError(f"'{entity_id}' is not an {DOMAIN} controller/zone")
payload = {
"unique_id": registry_entry.unique_id,
"service": call.service,
"data": call.data,
}
async_dispatcher_send(hass, DOMAIN, payload)
assert coordinator.tcs is not None # mypy
hass.services.async_register(DOMAIN, EvoService.REFRESH_SYSTEM, force_refresh)
# Enumerate which operating modes are supported by this system
modes = list(coordinator.tcs.allowed_system_modes)
# Not all systems support "AutoWithReset": register this handler only if required
if any(
m[SZ_SYSTEM_MODE]
for m in modes
if m[SZ_SYSTEM_MODE] == EvoSystemMode.AUTO_WITH_RESET
):
hass.services.async_register(DOMAIN, EvoService.RESET_SYSTEM, set_system_mode)
system_mode_schemas = []
modes = [m for m in modes if m[SZ_SYSTEM_MODE] != EvoSystemMode.AUTO_WITH_RESET]
# Permanent-only modes will use this schema
perm_modes = [m[SZ_SYSTEM_MODE] for m in modes if not m[SZ_CAN_BE_TEMPORARY]]
if perm_modes: # any of: "Auto", "HeatingOff": permanent only
schema = vol.Schema({vol.Required(ATTR_MODE): vol.In(perm_modes)})
system_mode_schemas.append(schema)
modes = [m for m in modes if m[SZ_CAN_BE_TEMPORARY]]
# These modes are set for a number of hours (or indefinitely): use this schema
temp_modes = [m[SZ_SYSTEM_MODE] for m in modes if m[SZ_TIMING_MODE] == SZ_DURATION]
if temp_modes: # any of: "AutoWithEco", permanent or for 0-24 hours
schema = vol.Schema(
{
vol.Required(ATTR_MODE): vol.In(temp_modes),
vol.Optional(ATTR_DURATION): vol.All(
cv.time_period,
vol.Range(min=timedelta(hours=0), max=timedelta(hours=24)),
),
}
)
system_mode_schemas.append(schema)
# These modes are set for a number of days (or indefinitely): use this schema
temp_modes = [m[SZ_SYSTEM_MODE] for m in modes if m[SZ_TIMING_MODE] == SZ_PERIOD]
if temp_modes: # any of: "Away", "Custom", "DayOff", permanent or for 1-99 days
schema = vol.Schema(
{
vol.Required(ATTR_MODE): vol.In(temp_modes),
vol.Optional(ATTR_PERIOD): vol.All(
cv.time_period,
vol.Range(min=timedelta(days=1), max=timedelta(days=99)),
),
}
)
system_mode_schemas.append(schema)
if system_mode_schemas:
hass.services.async_register(
DOMAIN,
EvoService.SET_SYSTEM_MODE,
set_system_mode,
schema=vol.Schema(vol.Any(*system_mode_schemas)),
)
# The zone modes are consistent across all systems and use the same schema
hass.services.async_register(
DOMAIN,
EvoService.RESET_ZONE_OVERRIDE,
set_zone_override,
schema=RESET_ZONE_OVERRIDE_SCHEMA,
)
hass.services.async_register(
DOMAIN,
EvoService.SET_ZONE_OVERRIDE,
set_zone_override,
schema=SET_ZONE_OVERRIDE_SCHEMA,
)

View File

@@ -41,12 +41,12 @@ from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from homeassistant.util import dt as dt_util
from . import EVOHOME_KEY
from .const import (
ATTR_DURATION,
ATTR_DURATION_UNTIL,
ATTR_PERIOD,
ATTR_SETPOINT,
EVOHOME_DATA,
EvoService,
)
from .coordinator import EvoDataUpdateCoordinator
@@ -85,9 +85,9 @@ async def async_setup_platform(
if discovery_info is None:
return
coordinator = hass.data[EVOHOME_KEY].coordinator
loc_idx = hass.data[EVOHOME_KEY].loc_idx
tcs = hass.data[EVOHOME_KEY].tcs
coordinator = hass.data[EVOHOME_DATA].coordinator
loc_idx = hass.data[EVOHOME_DATA].loc_idx
tcs = hass.data[EVOHOME_DATA].tcs
_LOGGER.debug(
"Found the Location/Controller (%s), id=%s, name=%s (location_idx=%s)",

View File

@@ -4,9 +4,15 @@ from __future__ import annotations
from datetime import timedelta
from enum import StrEnum, unique
from typing import Final
from typing import TYPE_CHECKING, Final
from homeassistant.util.hass_dict import HassKey
if TYPE_CHECKING:
from . import EvoData
DOMAIN: Final = "evohome"
EVOHOME_DATA: HassKey[EvoData] = HassKey(DOMAIN)
STORAGE_VER: Final = 1
STORAGE_KEY: Final = DOMAIN
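
`EVOHOME_DATA` follows the typed `HassKey` pattern, so `hass.data` lookups are type-checked without casts. A self-contained sketch of the pattern with hypothetical names:

```python
# Self-contained sketch of the typed HassKey pattern (hypothetical key and dataclass).
from dataclasses import dataclass

from homeassistant.core import HomeAssistant
from homeassistant.util.hass_dict import HassKey


@dataclass
class ExampleData:
    value: int


EXAMPLE_KEY: HassKey[ExampleData] = HassKey("example")


def store(hass: HomeAssistant) -> None:
    hass.data[EXAMPLE_KEY] = ExampleData(value=1)


def read(hass: HomeAssistant) -> int:
    # mypy infers ExampleData from the key; no cast is needed.
    return hass.data[EXAMPLE_KEY].value
```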

View File

@@ -0,0 +1,178 @@
"""Service handlers for the Evohome integration."""
from __future__ import annotations
from datetime import timedelta
from typing import Final
from evohomeasync2.const import SZ_CAN_BE_TEMPORARY, SZ_SYSTEM_MODE, SZ_TIMING_MODE
from evohomeasync2.schemas.const import (
S2_DURATION as SZ_DURATION,
S2_PERIOD as SZ_PERIOD,
SystemMode as EvoSystemMode,
)
import voluptuous as vol
from homeassistant.const import ATTR_ENTITY_ID, ATTR_MODE
from homeassistant.core import HomeAssistant, ServiceCall, callback
from homeassistant.helpers import config_validation as cv, entity_registry as er
from homeassistant.helpers.dispatcher import async_dispatcher_send
from homeassistant.helpers.service import verify_domain_control
from .const import (
ATTR_DURATION,
ATTR_DURATION_UNTIL,
ATTR_PERIOD,
ATTR_SETPOINT,
DOMAIN,
EvoService,
)
from .coordinator import EvoDataUpdateCoordinator
# system mode schemas are built dynamically when the services are registered
# because supported modes can vary for edge-case systems
RESET_ZONE_OVERRIDE_SCHEMA: Final = vol.Schema(
{vol.Required(ATTR_ENTITY_ID): cv.entity_id}
)
SET_ZONE_OVERRIDE_SCHEMA: Final = vol.Schema(
{
vol.Required(ATTR_ENTITY_ID): cv.entity_id,
vol.Required(ATTR_SETPOINT): vol.All(
vol.Coerce(float), vol.Range(min=4.0, max=35.0)
),
vol.Optional(ATTR_DURATION_UNTIL): vol.All(
cv.time_period,
vol.Range(min=timedelta(days=0), max=timedelta(days=1)),
),
}
)
@callback
def setup_service_functions(
hass: HomeAssistant, coordinator: EvoDataUpdateCoordinator
) -> None:
"""Set up the service handlers for the system/zone operating modes.
Not all Honeywell TCC-compatible systems support all operating modes. In addition,
each mode will require one of four distinct service schemas. This has to be
enumerated before registering the appropriate handlers.
It appears that all TCC-compatible systems support the same three zone modes.
"""
@verify_domain_control(DOMAIN)
async def force_refresh(call: ServiceCall) -> None:
"""Obtain the latest state data via the vendor's RESTful API."""
await coordinator.async_refresh()
@verify_domain_control(DOMAIN)
async def set_system_mode(call: ServiceCall) -> None:
"""Set the system mode."""
assert coordinator.tcs is not None # mypy
payload = {
"unique_id": coordinator.tcs.id,
"service": call.service,
"data": call.data,
}
async_dispatcher_send(hass, DOMAIN, payload)
@verify_domain_control(DOMAIN)
async def set_zone_override(call: ServiceCall) -> None:
"""Set the zone override (setpoint)."""
entity_id = call.data[ATTR_ENTITY_ID]
registry = er.async_get(hass)
registry_entry = registry.async_get(entity_id)
if registry_entry is None or registry_entry.platform != DOMAIN:
raise ValueError(f"'{entity_id}' is not a known {DOMAIN} entity")
if registry_entry.domain != "climate":
raise ValueError(f"'{entity_id}' is not an {DOMAIN} controller/zone")
payload = {
"unique_id": registry_entry.unique_id,
"service": call.service,
"data": call.data,
}
async_dispatcher_send(hass, DOMAIN, payload)
assert coordinator.tcs is not None # mypy
hass.services.async_register(DOMAIN, EvoService.REFRESH_SYSTEM, force_refresh)
# Enumerate which operating modes are supported by this system
modes = list(coordinator.tcs.allowed_system_modes)
# Not all systems support "AutoWithReset": register this handler only if required
if any(
m[SZ_SYSTEM_MODE]
for m in modes
if m[SZ_SYSTEM_MODE] == EvoSystemMode.AUTO_WITH_RESET
):
hass.services.async_register(DOMAIN, EvoService.RESET_SYSTEM, set_system_mode)
system_mode_schemas = []
modes = [m for m in modes if m[SZ_SYSTEM_MODE] != EvoSystemMode.AUTO_WITH_RESET]
# Permanent-only modes will use this schema
perm_modes = [m[SZ_SYSTEM_MODE] for m in modes if not m[SZ_CAN_BE_TEMPORARY]]
if perm_modes: # any of: "Auto", "HeatingOff": permanent only
schema = vol.Schema({vol.Required(ATTR_MODE): vol.In(perm_modes)})
system_mode_schemas.append(schema)
modes = [m for m in modes if m[SZ_CAN_BE_TEMPORARY]]
# These modes are set for a number of hours (or indefinitely): use this schema
temp_modes = [m[SZ_SYSTEM_MODE] for m in modes if m[SZ_TIMING_MODE] == SZ_DURATION]
if temp_modes: # any of: "AutoWithEco", permanent or for 0-24 hours
schema = vol.Schema(
{
vol.Required(ATTR_MODE): vol.In(temp_modes),
vol.Optional(ATTR_DURATION): vol.All(
cv.time_period,
vol.Range(min=timedelta(hours=0), max=timedelta(hours=24)),
),
}
)
system_mode_schemas.append(schema)
# These modes are set for a number of days (or indefinitely): use this schema
temp_modes = [m[SZ_SYSTEM_MODE] for m in modes if m[SZ_TIMING_MODE] == SZ_PERIOD]
if temp_modes: # any of: "Away", "Custom", "DayOff", permanent or for 1-99 days
schema = vol.Schema(
{
vol.Required(ATTR_MODE): vol.In(temp_modes),
vol.Optional(ATTR_PERIOD): vol.All(
cv.time_period,
vol.Range(min=timedelta(days=1), max=timedelta(days=99)),
),
}
)
system_mode_schemas.append(schema)
if system_mode_schemas:
hass.services.async_register(
DOMAIN,
EvoService.SET_SYSTEM_MODE,
set_system_mode,
schema=vol.Schema(vol.Any(*system_mode_schemas)),
)
# The zone modes are consistent across all systems and use the same schema
hass.services.async_register(
DOMAIN,
EvoService.RESET_ZONE_OVERRIDE,
set_zone_override,
schema=RESET_ZONE_OVERRIDE_SCHEMA,
)
hass.services.async_register(
DOMAIN,
EvoService.SET_ZONE_OVERRIDE,
set_zone_override,
schema=SET_ZONE_OVERRIDE_SCHEMA,
)
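
`set_system_mode` is registered with `vol.Schema(vol.Any(*system_mode_schemas))`, so a single service accepts whichever payload shapes the system's supported modes require. A standalone sketch of how that composition validates (mode names are illustrative, not tied to a particular TCC system):

```python
# Standalone sketch of composing alternative service schemas with vol.Any;
# mode names are illustrative, not tied to a specific TCC system.
import voluptuous as vol

PERMANENT = vol.Schema({vol.Required("mode"): vol.In(["Auto", "HeatingOff"])})
TIMED = vol.Schema(
    {
        vol.Required("mode"): vol.In(["AutoWithEco"]),
        vol.Optional("duration"): vol.All(vol.Coerce(int), vol.Range(min=0, max=24)),
    }
)

SET_SYSTEM_MODE = vol.Schema(vol.Any(PERMANENT, TIMED))

SET_SYSTEM_MODE({"mode": "Auto"})                        # matches the permanent schema
SET_SYSTEM_MODE({"mode": "AutoWithEco", "duration": 3})  # matches the timed schema
```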

View File

@@ -25,7 +25,7 @@ from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from homeassistant.util import dt as dt_util
from . import EVOHOME_KEY
from .const import EVOHOME_DATA
from .coordinator import EvoDataUpdateCoordinator
from .entity import EvoChild
@@ -47,8 +47,8 @@ async def async_setup_platform(
if discovery_info is None:
return
coordinator = hass.data[EVOHOME_KEY].coordinator
tcs = hass.data[EVOHOME_KEY].tcs
coordinator = hass.data[EVOHOME_DATA].coordinator
tcs = hass.data[EVOHOME_DATA].tcs
assert tcs.hotwater is not None # mypy check

View File

@@ -36,6 +36,7 @@ from homeassistant.helpers.device_registry import CONNECTION_NETWORK_MAC
from homeassistant.helpers.dispatcher import async_dispatcher_send
from homeassistant.helpers.typing import StateType
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from homeassistant.util import slugify
from homeassistant.util.hass_dict import HassKey
from .const import (
@@ -90,10 +91,56 @@ class UpdateCoordinatorDataType(TypedDict):
entity_states: dict[str, StateType | bool]
class FritzConnectionCached(FritzConnection): # type: ignore[misc]
"""FritzConnection with cached call action."""
_call_cache: dict[str, dict[str, Any]]
def clear_cache(self) -> None:
"""Clear cached calls."""
self._call_cache = {}
_LOGGER.debug("Cleared FritzConnection call action cache")
def call_action(
self,
service_name: str,
action_name: str,
*,
arguments: dict | None = None,
**kwargs: Any,
) -> dict[str, Any]:
"""Call action with cached services. Only get actions are cached."""
if not action_name.lower().startswith("get"):
return super().call_action( # type: ignore[no-any-return]
service_name, action_name, arguments=arguments, **kwargs
)
if not hasattr(self, "_call_cache"):
self._call_cache = {}
kwargs_key = ",".join(f"{k}={v!r}" for k, v in sorted(kwargs.items()))
cache_key = slugify(f"{service_name}:{action_name}:{arguments}:{kwargs_key}")
if (result := self._call_cache.get(cache_key)) is not None:
_LOGGER.debug("Using cached result for %s %s", service_name, action_name)
return result
result = super().call_action(
service_name, action_name, arguments=arguments, **kwargs
)
self._call_cache[cache_key] = result
return result # type: ignore[no-any-return]
class FritzBoxTools(DataUpdateCoordinator[UpdateCoordinatorDataType]):
"""FritzBoxTools class."""
config_entry: FritzConfigEntry
connection: FritzConnectionCached
fritz_guest_wifi: FritzGuestWLAN
fritz_hosts: FritzHosts
fritz_status: FritzStatus
fritz_call: FritzCall
def __init__(
self,
@@ -118,11 +165,6 @@ class FritzBoxTools(DataUpdateCoordinator[UpdateCoordinatorDataType]):
self._devices: dict[str, FritzDevice] = {}
self._options: Mapping[str, Any] | None = None
self._unique_id: str | None = None
self.connection: FritzConnection = None
self.fritz_guest_wifi: FritzGuestWLAN = None
self.fritz_hosts: FritzHosts = None
self.fritz_status: FritzStatus = None
self.fritz_call: FritzCall = None
self.host = host
self.mesh_role = MeshRoles.NONE
self.mesh_wifi_uplink = False
@@ -159,11 +201,12 @@ class FritzBoxTools(DataUpdateCoordinator[UpdateCoordinatorDataType]):
name=self.config_entry.title,
sw_version=self.current_firmware,
)
self.connection.clear_cache()
def setup(self) -> None:
"""Set up FritzboxTools class."""
self.connection = FritzConnection(
self.connection = FritzConnectionCached(
address=self.host,
port=self.port,
user=self.username,
@@ -263,6 +306,7 @@ class FritzBoxTools(DataUpdateCoordinator[UpdateCoordinatorDataType]):
"call_deflections": {},
"entity_states": {},
}
self.connection.clear_cache()
try:
await self.async_update_device_info()
@@ -278,6 +322,12 @@ class FritzBoxTools(DataUpdateCoordinator[UpdateCoordinatorDataType]):
"call_deflections"
] = await self.async_update_call_deflections()
except FRITZ_EXCEPTIONS as ex:
_LOGGER.debug(
"Reload %s due to error '%s' to ensure proper re-login",
self.config_entry.title,
ex,
)
self.hass.config_entries.async_schedule_reload(self.config_entry.entry_id)
raise UpdateFailed(
translation_domain=DOMAIN,
translation_key="update_failed",

View File

@@ -65,10 +65,10 @@ class FritzGuestWifiQRImage(FritzBoxBaseEntity, ImageEntity):
super().__init__(avm_wrapper, device_friendly_name)
ImageEntity.__init__(self, hass)
async def _fetch_image(self) -> bytes:
def _fetch_image(self) -> bytes:
"""Fetch the QR code from the Fritz!Box."""
qr_stream: BytesIO = await self.hass.async_add_executor_job(
self._avm_wrapper.fritz_guest_wifi.get_wifi_qr_code, "png"
qr_stream: BytesIO = self._avm_wrapper.fritz_guest_wifi.get_wifi_qr_code(
"png", border=2
)
qr_bytes = qr_stream.getvalue()
_LOGGER.debug("fetched %s bytes", len(qr_bytes))
@@ -77,13 +77,15 @@ class FritzGuestWifiQRImage(FritzBoxBaseEntity, ImageEntity):
async def async_added_to_hass(self) -> None:
"""Fetch and set initial data and state."""
self._current_qr_bytes = await self._fetch_image()
self._current_qr_bytes = await self.hass.async_add_executor_job(
self._fetch_image
)
self._attr_image_last_updated = dt_util.utcnow()
async def async_update(self) -> None:
"""Update the image entity data."""
try:
qr_bytes = await self._fetch_image()
qr_bytes = await self.hass.async_add_executor_job(self._fetch_image)
except RequestException:
self._current_qr_bytes = None
self._attr_image_last_updated = None

View File

@@ -2,6 +2,8 @@
from __future__ import annotations
from requests.exceptions import ConnectionError as RequestConnectionError, HTTPError
from homeassistant.components.binary_sensor import DOMAIN as BINARY_SENSOR_DOMAIN
from homeassistant.const import EVENT_HOMEASSISTANT_STOP, UnitOfTemperature
from homeassistant.core import Event, HomeAssistant
@@ -57,7 +59,10 @@ async def async_setup_entry(hass: HomeAssistant, entry: FritzboxConfigEntry) ->
async def async_unload_entry(hass: HomeAssistant, entry: FritzboxConfigEntry) -> bool:
"""Unloading the AVM FRITZ!SmartHome platforms."""
await hass.async_add_executor_job(entry.runtime_data.fritz.logout)
try:
await hass.async_add_executor_job(entry.runtime_data.fritz.logout)
except (RequestConnectionError, HTTPError) as ex:
LOGGER.debug("logout failed with '%s', anyway continue with unload", ex)
return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)

View File

@@ -121,26 +121,11 @@ class FritzboxDataUpdateCoordinator(DataUpdateCoordinator[FritzboxCoordinatorDat
def _update_fritz_devices(self) -> FritzboxCoordinatorData:
"""Update all fritzbox device data."""
try:
self.fritz.update_devices(ignore_removed=False)
if self.has_templates:
self.fritz.update_templates(ignore_removed=False)
if self.has_triggers:
self.fritz.update_triggers(ignore_removed=False)
except RequestConnectionError as ex:
raise UpdateFailed from ex
except HTTPError:
# If the device rebooted, login again
try:
self.fritz.login()
except LoginError as ex:
raise ConfigEntryAuthFailed from ex
self.fritz.update_devices(ignore_removed=False)
if self.has_templates:
self.fritz.update_templates(ignore_removed=False)
if self.has_triggers:
self.fritz.update_triggers(ignore_removed=False)
self.fritz.update_devices(ignore_removed=False)
if self.has_templates:
self.fritz.update_templates(ignore_removed=False)
if self.has_triggers:
self.fritz.update_triggers(ignore_removed=False)
devices = self.fritz.get_devices()
device_data = {}
@@ -193,7 +178,18 @@ class FritzboxDataUpdateCoordinator(DataUpdateCoordinator[FritzboxCoordinatorDat
async def _async_update_data(self) -> FritzboxCoordinatorData:
"""Fetch all device data."""
new_data = await self.hass.async_add_executor_job(self._update_fritz_devices)
try:
new_data = await self.hass.async_add_executor_job(
self._update_fritz_devices
)
except (RequestConnectionError, HTTPError) as ex:
LOGGER.debug(
"Reload %s due to error '%s' to ensure proper re-login",
self.config_entry.title,
ex,
)
self.hass.config_entries.async_schedule_reload(self.config_entry.entry_id)
raise UpdateFailed from ex
for device in new_data.devices.values():
# create device registry entry for new main devices

View File

@@ -2,10 +2,12 @@
from __future__ import annotations
from aiohttp import ClientResponseError
from homelink.mqtt_provider import MQTTProvider
from homeassistant.const import EVENT_HOMEASSISTANT_STOP, Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.helpers import aiohttp_client, config_entry_oauth2_flow
from . import oauth2
@@ -18,6 +20,10 @@ PLATFORMS: list[Platform] = [Platform.EVENT]
async def async_setup_entry(hass: HomeAssistant, entry: HomeLinkConfigEntry) -> bool:
"""Set up homelink from a config entry."""
auth_implementation = oauth2.SRPAuthImplementation(hass, DOMAIN)
try:
await auth_implementation.async_refresh_token(entry.data["token"])
except ClientResponseError as err:
raise ConfigEntryAuthFailed(err) from err
config_entry_oauth2_flow.async_register_implementation(
hass, DOMAIN, auth_implementation

View File

@@ -1,5 +1,6 @@
"""Config flow for homelink."""
from collections.abc import Mapping
import logging
from typing import Any
@@ -8,8 +9,8 @@ from homelink.auth.srp_auth import SRPAuth
import jwt
import voluptuous as vol
from homeassistant.config_entries import ConfigFlowResult
from homeassistant.const import CONF_EMAIL, CONF_PASSWORD
from homeassistant.config_entries import SOURCE_REAUTH, ConfigFlowResult
from homeassistant.const import CONF_EMAIL, CONF_PASSWORD, CONF_UNIQUE_ID
from homeassistant.helpers.config_entry_oauth2_flow import AbstractOAuth2FlowHandler
from .const import DOMAIN
@@ -56,9 +57,13 @@ class SRPFlowHandler(AbstractOAuth2FlowHandler, domain=DOMAIN):
tokens["AuthenticationResult"]["AccessToken"],
options={"verify_signature": False},
)
await self.async_set_unique_id(access_token["sub"])
self._abort_if_unique_id_configured()
self.external_data = {"tokens": tokens}
sub = access_token["sub"]
await self.async_set_unique_id(sub)
self.external_data = {
"tokens": tokens,
CONF_UNIQUE_ID: sub,
CONF_EMAIL: user_input[CONF_EMAIL].strip().lower(),
}
return await self.async_step_creation()
return self.async_show_form(
@@ -68,3 +73,36 @@ class SRPFlowHandler(AbstractOAuth2FlowHandler, domain=DOMAIN):
),
errors=errors,
)
async def async_step_reauth(
self, entry_data: Mapping[str, Any]
) -> ConfigFlowResult:
"""Perform reauth upon an API authentication error."""
return await self.async_step_reauth_confirm()
async def async_step_reauth_confirm(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Dialog that informs the user that reauth is required."""
if user_input is None:
return self.async_show_form(
step_id="reauth_confirm",
data_schema=vol.Schema(
{vol.Required(CONF_EMAIL): str, vol.Required(CONF_PASSWORD): str}
),
)
return await self.async_step_user(user_input)
async def async_oauth_create_entry(self, data: dict) -> ConfigFlowResult:
"""Create an oauth config entry or update existing entry for reauth."""
await self.async_set_unique_id(self.external_data[CONF_UNIQUE_ID])
entry_title = self.context.get("title_placeholders", {"name": "HomeLink"})[
"name"
]
if self.source == SOURCE_REAUTH:
self._abort_if_unique_id_mismatch()
return self.async_update_reload_and_abort(
self._get_reauth_entry(), data_updates=data, title=entry_title
)
self._abort_if_unique_id_configured()
return self.async_create_entry(data=data, title=entry_title)

View File

@@ -1,7 +1,5 @@
"""Constants for the homelink integration."""
DOMAIN = "gentex_homelink"
OAUTH2_TOKEN = "https://auth.homelinkcloud.com/oauth2/token"
POLLING_INTERVAL = 5
EVENT_PRESSED = "Pressed"
OAUTH2_TOKEN_URL = "https://auth.homelinkcloud.com/oauth2/token"

View File

@@ -13,7 +13,7 @@ from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_entry_oauth2_flow
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .const import OAUTH2_TOKEN
from .const import OAUTH2_TOKEN_URL
_LOGGER = logging.getLogger(__name__)
@@ -59,8 +59,8 @@ class SRPAuthImplementation(config_entry_oauth2_flow.AbstractOAuth2Implementatio
data["client_id"] = self.client_id
_LOGGER.debug("Sending token request to %s", OAUTH2_TOKEN)
resp = await session.post(OAUTH2_TOKEN, data=data)
_LOGGER.debug("Sending token request to %s", OAUTH2_TOKEN_URL)
resp = await session.post(OAUTH2_TOKEN_URL, data=data)
if resp.status >= 400:
try:
error_response = await resp.json()

View File

@@ -36,7 +36,7 @@ rules:
integration-owner: done
log-when-unavailable: todo
parallel-updates: done
reauthentication-flow: todo
reauthentication-flow: done
test-coverage: todo
# Gold

View File

@@ -11,6 +11,8 @@
"oauth_implementation_unavailable": "[%key:common::config_flow::abort::oauth2_implementation_unavailable%]",
"oauth_timeout": "[%key:common::config_flow::abort::oauth2_timeout%]",
"oauth_unauthorized": "[%key:common::config_flow::abort::oauth2_unauthorized%]",
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]",
"unique_id_mismatch": "Please log in using the same account, or create a new entry.",
"user_rejected_authorize": "[%key:common::config_flow::abort::oauth2_user_rejected_authorize%]"
},
"create_entry": {
@@ -18,12 +20,24 @@
},
"error": {
"srp_auth_failed": "Error authenticating HomeLink account",
"unknown": "An unknown error occurred. Please try again later"
"unknown": "An unknown error occurred. Please try again later."
},
"step": {
"pick_implementation": {
"title": "[%key:common::config_flow::title::oauth2_pick_implementation%]"
},
"reauth_confirm": {
"data": {
"email": "[%key:common::config_flow::data::email%]",
"password": "[%key:common::config_flow::data::password%]"
},
"data_description": {
"email": "[%key:component::gentex_homelink::config::step::user::data_description::email%]",
"password": "[%key:component::gentex_homelink::config::step::user::data_description::password%]"
},
"description": "The HomeLink integration needs to re-authenticate your account",
"title": "[%key:common::config_flow::title::reauth%]"
},
"user": {
"data": {
"email": "[%key:common::config_flow::data::email%]",

View File

@@ -0,0 +1,101 @@
rules:
# Other comments:
# - we could consider removing the air quality entity removal
# Bronze
action-setup:
status: exempt
comment: No custom actions are defined.
appropriate-polling: done
brands: done
common-modules: done
config-flow-test-coverage:
status: todo
comment:
We should have the happy flow as the first test, which can be merged with test_show_form.
The config flow tests are missing a test for adding a duplicate entry.
config-flow:
status: todo
comment: Limit the scope of the try block in the user step
dependency-transparency: done
docs-actions:
status: exempt
comment: No custom actions are defined.
docs-high-level-description: done
docs-installation-instructions: done
docs-removal-instructions: done
entity-event-setup: done
entity-unique-id: done
has-entity-name: done
runtime-data:
status: todo
comment: No direct need to wrap the coordinator in a dataclass to store in the config entry
test-before-configure: done
test-before-setup: done
unique-config-entry: done
# Silver
action-exceptions:
status: exempt
comment: No custom actions are defined.
config-entry-unloading: done
docs-configuration-parameters:
status: exempt
comment: No options flow
docs-installation-parameters: done
entity-unavailable: done
integration-owner: done
log-when-unavailable: done
parallel-updates: done
reauthentication-flow:
status: exempt
comment: This integration does not require authentication.
test-coverage:
status: todo
comment:
The `test_async_setup_entry` should test the state of the mock config entry, instead of an entity state
The `test_availability` doesn't really do what it says it does, and this is now already tested via the snapshot tests.
# Gold
devices: done
diagnostics: done
discovery-update-info:
status: exempt
comment: This integration is a cloud service and thus does not support discovery.
discovery:
status: exempt
comment: This integration is a cloud service and thus does not support discovery.
docs-data-update: done
docs-examples: done
docs-known-limitations: done
docs-supported-devices:
status: exempt
comment: This is a service, which doesn't integrate with any devices.
docs-supported-functions: done
docs-troubleshooting: done
docs-use-cases: done
dynamic-devices:
status: exempt
comment: This integration does not have devices.
entity-category: done
entity-device-class:
status: todo
comment: We can use the CO device class for the carbon monoxide sensor
entity-disabled-by-default: done
entity-translations:
status: todo
comment: We can remove the options state_attributes.
exception-translations: done
icon-translations: done
reconfiguration-flow:
status: exempt
comment: The only parameter that could be changed, station_id, would force a new config entry.
repair-issues: done
stale-devices:
status: exempt
comment: This integration does not have devices.
# Platinum
async-dependency: done
inject-websession: done
strict-typing: done

View File

@@ -80,7 +80,10 @@ class GoogleGenerativeAITaskEntity(
) -> ai_task.GenDataTaskResult:
"""Handle a generate data task."""
await self._async_handle_chat_log(
chat_log, task.structure, default_max_tokens=RECOMMENDED_AI_TASK_MAX_TOKENS
chat_log,
task.structure,
default_max_tokens=RECOMMENDED_AI_TASK_MAX_TOKENS,
max_iterations=1000,
)
if not isinstance(chat_log.content[-1], conversation.AssistantContent):

View File

@@ -486,6 +486,7 @@ class GoogleGenerativeAILLMBaseEntity(Entity):
chat_log: conversation.ChatLog,
structure: vol.Schema | None = None,
default_max_tokens: int | None = None,
max_iterations: int = MAX_TOOL_ITERATIONS,
) -> None:
"""Generate an answer for the chat log."""
options = self.subentry.data
@@ -602,7 +603,7 @@ class GoogleGenerativeAILLMBaseEntity(Entity):
)
# To prevent infinite loops, we limit the number of iterations
for _iteration in range(MAX_TOOL_ITERATIONS):
for _iteration in range(max_iterations):
try:
chat_response_generator = await chat.send_message_stream(
message=chat_request

View File

@@ -18,8 +18,8 @@ from homeassistant.core import (
SupportsResponse,
callback,
)
from homeassistant.exceptions import HomeAssistantError, ServiceValidationError
from homeassistant.helpers import config_validation as cv
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import config_validation as cv, service
from .const import DOMAIN, UPLOAD_SCOPE
from .coordinator import GooglePhotosConfigEntry
@@ -80,15 +80,10 @@ def _read_file_contents(
async def _async_handle_upload(call: ServiceCall) -> ServiceResponse:
"""Generate content from text and optionally images."""
config_entry: GooglePhotosConfigEntry | None = (
call.hass.config_entries.async_get_entry(call.data[CONF_CONFIG_ENTRY_ID])
config_entry: GooglePhotosConfigEntry = service.async_get_config_entry(
call.hass, DOMAIN, call.data[CONF_CONFIG_ENTRY_ID]
)
if not config_entry:
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="integration_not_found",
translation_placeholders={"target": DOMAIN},
)
scopes = config_entry.data["token"]["scope"].split(" ")
if UPLOAD_SCOPE not in scopes:
raise HomeAssistantError(

View File

@@ -62,18 +62,12 @@
"filename_is_not_image": {
"message": "`{filename}` is not an image"
},
"integration_not_found": {
"message": "Integration \"{target}\" not found in registry."
},
"missing_upload_permission": {
"message": "Home Assistant was not granted permission to upload to Google Photos"
},
"no_access_to_path": {
"message": "Cannot read {filename}, no access to path; `allowlist_external_dirs` may need to be adjusted in `configuration.yaml`"
},
"not_loaded": {
"message": "{target} is not loaded."
},
"upload_error": {
"message": "Failed to upload content: {message}"
}

View File

@@ -12,7 +12,6 @@ from gspread.exceptions import APIError
from gspread.utils import ValueInputOption
import voluptuous as vol
from homeassistant.config_entries import ConfigEntryState
from homeassistant.const import CONF_ACCESS_TOKEN, CONF_TOKEN
from homeassistant.core import (
HomeAssistant,
@@ -21,8 +20,8 @@ from homeassistant.core import (
SupportsResponse,
callback,
)
from homeassistant.exceptions import HomeAssistantError, ServiceValidationError
from homeassistant.helpers import config_validation as cv
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import config_validation as cv, service
from homeassistant.helpers.selector import ConfigEntrySelector
from homeassistant.util.json import JsonObjectType
@@ -60,9 +59,9 @@ get_SHEET_SERVICE_SCHEMA = vol.All(
def _append_to_sheet(call: ServiceCall, entry: GoogleSheetsConfigEntry) -> None:
"""Run append in the executor."""
service = Client(Credentials(entry.data[CONF_TOKEN][CONF_ACCESS_TOKEN])) # type: ignore[no-untyped-call]
client = Client(Credentials(entry.data[CONF_TOKEN][CONF_ACCESS_TOKEN])) # type: ignore[no-untyped-call]
try:
sheet = service.open_by_key(entry.unique_id)
sheet = client.open_by_key(entry.unique_id)
except RefreshError:
entry.async_start_reauth(call.hass)
raise
@@ -90,9 +89,9 @@ def _get_from_sheet(
call: ServiceCall, entry: GoogleSheetsConfigEntry
) -> JsonObjectType:
"""Run get in the executor."""
service = Client(Credentials(entry.data[CONF_TOKEN][CONF_ACCESS_TOKEN])) # type: ignore[no-untyped-call]
client = Client(Credentials(entry.data[CONF_TOKEN][CONF_ACCESS_TOKEN])) # type: ignore[no-untyped-call]
try:
sheet = service.open_by_key(entry.unique_id)
sheet = client.open_by_key(entry.unique_id)
except RefreshError:
entry.async_start_reauth(call.hass)
raise
@@ -106,27 +105,18 @@ def _get_from_sheet(
async def _async_append_to_sheet(call: ServiceCall) -> None:
"""Append new line of data to a Google Sheets document."""
entry: GoogleSheetsConfigEntry | None = call.hass.config_entries.async_get_entry(
call.data[DATA_CONFIG_ENTRY]
entry: GoogleSheetsConfigEntry = service.async_get_config_entry(
call.hass, DOMAIN, call.data[DATA_CONFIG_ENTRY]
)
if not entry or not hasattr(entry, "runtime_data"):
raise ValueError(f"Invalid config entry: {call.data[DATA_CONFIG_ENTRY]}")
await entry.runtime_data.async_ensure_token_valid()
await call.hass.async_add_executor_job(_append_to_sheet, call, entry)
async def _async_get_from_sheet(call: ServiceCall) -> ServiceResponse:
"""Get lines of data from a Google Sheets document."""
entry: GoogleSheetsConfigEntry | None = call.hass.config_entries.async_get_entry(
call.data[DATA_CONFIG_ENTRY]
entry: GoogleSheetsConfigEntry = service.async_get_config_entry(
call.hass, DOMAIN, call.data[DATA_CONFIG_ENTRY]
)
if entry is None:
raise ServiceValidationError(
f"Invalid config entry id: {call.data[DATA_CONFIG_ENTRY]}"
)
if entry.state is not ConfigEntryState.LOADED:
raise HomeAssistantError(f"Config entry {entry.entry_id} is not loaded")
await entry.runtime_data.async_ensure_token_valid()
return await call.hass.async_add_executor_job(_get_from_sheet, call, entry)
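
The local variable is renamed from `service` to `client`, presumably so it no longer shadows the newly imported `homeassistant.helpers.service` module within these functions. A minimal, self-contained illustration of that shadowing hazard:

```python
# Minimal illustration of the shadowing hazard the rename avoids: a local
# variable named after an imported module hides that module in the function body.
import json


def shadowed() -> str:
    json = {"value": 1}          # local name now hides the json module
    return json.dumps(json)      # AttributeError: 'dict' object has no attribute 'dumps'


def renamed() -> str:
    payload = {"value": 1}       # a distinct local name keeps the module reachable
    return json.dumps(payload)
```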

View File

@@ -5,14 +5,25 @@ import logging
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.typing import ConfigType
from homeassistant.util import dt as dt_util
from .const import CONF_TIME
from .const import CONF_TIME, DOMAIN
from .services import async_setup_services
PLATFORMS = [Platform.SENSOR]
_LOGGER = logging.getLogger(__name__)
CONFIG_SCHEMA = cv.config_entry_only_config_schema(DOMAIN)
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the Google Travel Time component."""
async_setup_services(hass)
return True
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Set up Google Maps Travel Time from a config entry."""

View File

@@ -24,9 +24,7 @@ from homeassistant.helpers.selector import (
from homeassistant.util.unit_system import US_CUSTOMARY_SYSTEM
from .const import (
ALL_LANGUAGES,
ARRIVAL_TIME,
AVOID_OPTIONS,
CONF_ARRIVAL_TIME,
CONF_AVOID,
CONF_DEPARTURE_TIME,
@@ -41,12 +39,7 @@ from .const import (
DEFAULT_NAME,
DEPARTURE_TIME,
DOMAIN,
TIME_TYPES,
TRAFFIC_MODELS,
TRANSIT_PREFS,
TRANSPORT_TYPES,
TRAVEL_MODES,
UNITS,
UNITS_IMPERIAL,
UNITS_METRIC,
)
@@ -56,6 +49,15 @@ from .helpers import (
UnknownException,
validate_config_entry,
)
from .schemas import (
AVOID_SELECTOR,
LANGUAGE_SELECTOR,
TIME_TYPE_SELECTOR,
TRAFFIC_MODEL_SELECTOR,
TRANSIT_MODE_SELECTOR,
TRANSIT_ROUTING_PREFERENCE_SELECTOR,
UNITS_SELECTOR,
)
RECONFIGURE_SCHEMA = vol.Schema(
{
@@ -73,6 +75,13 @@ CONFIG_SCHEMA = RECONFIGURE_SCHEMA.extend(
OPTIONS_SCHEMA = vol.Schema(
{
vol.Optional(CONF_LANGUAGE): LANGUAGE_SELECTOR,
vol.Optional(CONF_AVOID): AVOID_SELECTOR,
vol.Optional(CONF_TRAFFIC_MODEL): TRAFFIC_MODEL_SELECTOR,
vol.Optional(CONF_TRANSIT_MODE): TRANSIT_MODE_SELECTOR,
vol.Optional(
CONF_TRANSIT_ROUTING_PREFERENCE
): TRANSIT_ROUTING_PREFERENCE_SELECTOR,
vol.Required(CONF_MODE): SelectSelector(
SelectSelectorConfig(
options=TRAVEL_MODES,
@@ -81,62 +90,9 @@ OPTIONS_SCHEMA = vol.Schema(
translation_key=CONF_MODE,
)
),
vol.Optional(CONF_LANGUAGE): SelectSelector(
SelectSelectorConfig(
options=sorted(ALL_LANGUAGES),
mode=SelectSelectorMode.DROPDOWN,
translation_key=CONF_LANGUAGE,
)
),
vol.Optional(CONF_AVOID): SelectSelector(
SelectSelectorConfig(
options=AVOID_OPTIONS,
sort=True,
mode=SelectSelectorMode.DROPDOWN,
translation_key=CONF_AVOID,
)
),
vol.Required(CONF_UNITS): SelectSelector(
SelectSelectorConfig(
options=UNITS,
sort=True,
mode=SelectSelectorMode.DROPDOWN,
translation_key=CONF_UNITS,
)
),
vol.Required(CONF_TIME_TYPE): SelectSelector(
SelectSelectorConfig(
options=TIME_TYPES,
sort=True,
mode=SelectSelectorMode.DROPDOWN,
translation_key=CONF_TIME_TYPE,
)
),
vol.Required(CONF_UNITS): UNITS_SELECTOR,
vol.Required(CONF_TIME_TYPE): TIME_TYPE_SELECTOR,
vol.Optional(CONF_TIME): TimeSelector(),
vol.Optional(CONF_TRAFFIC_MODEL): SelectSelector(
SelectSelectorConfig(
options=TRAFFIC_MODELS,
sort=True,
mode=SelectSelectorMode.DROPDOWN,
translation_key=CONF_TRAFFIC_MODEL,
)
),
vol.Optional(CONF_TRANSIT_MODE): SelectSelector(
SelectSelectorConfig(
options=TRANSPORT_TYPES,
sort=True,
mode=SelectSelectorMode.DROPDOWN,
translation_key=CONF_TRANSIT_MODE,
)
),
vol.Optional(CONF_TRANSIT_ROUTING_PREFERENCE): SelectSelector(
SelectSelectorConfig(
options=TRANSIT_PREFS,
sort=True,
mode=SelectSelectorMode.DROPDOWN,
translation_key=CONF_TRANSIT_ROUTING_PREFERENCE,
)
),
}
)

View File

@@ -98,6 +98,7 @@ TRANSPORT_TYPES_TO_GOOGLE_SDK_ENUM = {
"rail": TransitPreferences.TransitTravelMode.RAIL,
}
TRAVEL_MODES = ["driving", "walking", "bicycling", "transit"]
TRAVEL_MODES_WITHOUT_TRANSIT = ["driving", "walking", "bicycling"]
TRAVEL_MODES_TO_GOOGLE_SDK_ENUM = {
"driving": RouteTravelMode.DRIVE,
"walking": RouteTravelMode.WALK,

View File

@@ -1,5 +1,6 @@
"""Helpers for Google Time Travel integration."""
import datetime
import logging
from google.api_core.client_options import ClientOptions
@@ -12,11 +13,16 @@ from google.api_core.exceptions import (
)
from google.maps.routing_v2 import (
ComputeRoutesRequest,
ComputeRoutesResponse,
Location,
RouteModifiers,
RoutesAsyncClient,
RouteTravelMode,
RoutingPreference,
TransitPreferences,
Waypoint,
)
from google.protobuf import timestamp_pb2
from google.type import latlng_pb2
import voluptuous as vol
@@ -29,12 +35,40 @@ from homeassistant.helpers.issue_registry import (
async_delete_issue,
)
from homeassistant.helpers.location import find_coordinates
from homeassistant.util import dt as dt_util
from .const import DOMAIN
from .const import (
DOMAIN,
TRAFFIC_MODELS_TO_GOOGLE_SDK_ENUM,
TRANSIT_PREFS_TO_GOOGLE_SDK_ENUM,
TRANSPORT_TYPES_TO_GOOGLE_SDK_ENUM,
UNITS_TO_GOOGLE_SDK_ENUM,
)
_LOGGER = logging.getLogger(__name__)
def convert_time(time_str: str) -> timestamp_pb2.Timestamp:
"""Convert a string like '08:00' to a google pb2 Timestamp.
If the time is in the past, it will be shifted to the next day.
"""
parsed_time = dt_util.parse_time(time_str)
if parsed_time is None:
raise ValueError(f"Invalid time format: {time_str}")
start_of_day = dt_util.start_of_local_day()
combined = datetime.datetime.combine(
start_of_day,
parsed_time,
start_of_day.tzinfo,
)
if combined < dt_util.now():
combined = combined + datetime.timedelta(days=1)
timestamp = timestamp_pb2.Timestamp()
timestamp.FromDatetime(dt=combined)
return timestamp
def convert_to_waypoint(hass: HomeAssistant, location: str) -> Waypoint | None:
"""Convert a location to a Waypoint.
@@ -123,3 +157,78 @@ def create_routes_api_disabled_issue(hass: HomeAssistant, entry: ConfigEntry) ->
def delete_routes_api_disabled_issue(hass: HomeAssistant, entry: ConfigEntry) -> None:
"""Delete the issue for the Routes API being disabled."""
async_delete_issue(hass, DOMAIN, f"routes_api_disabled_{entry.entry_id}")
async def async_compute_routes(
client: RoutesAsyncClient,
origin: str,
destination: str,
hass: HomeAssistant,
travel_mode: int,
units: str,
language: str | None = None,
avoid: str | None = None,
traffic_model: str | None = None,
transit_mode: str | None = None,
transit_routing_preference: str | None = None,
departure_time: str | None = None,
arrival_time: str | None = None,
field_mask: str = "routes.duration,routes.distanceMeters,routes.localized_values",
) -> ComputeRoutesResponse | None:
"""Compute routes using Google Routes API."""
origin_waypoint = convert_to_waypoint(hass, origin)
destination_waypoint = convert_to_waypoint(hass, destination)
if origin_waypoint is None or destination_waypoint is None:
return None
route_modifiers = None
routing_preference = None
if travel_mode == RouteTravelMode.DRIVE:
routing_preference = RoutingPreference.TRAFFIC_AWARE_OPTIMAL
route_modifiers = RouteModifiers(
avoid_tolls=avoid == "tolls",
avoid_ferries=avoid == "ferries",
avoid_highways=avoid == "highways",
avoid_indoor=avoid == "indoor",
)
transit_preferences = None
if travel_mode == RouteTravelMode.TRANSIT:
transit_routing_pref = None
transit_travel_mode = (
TransitPreferences.TransitTravelMode.TRANSIT_TRAVEL_MODE_UNSPECIFIED
)
if transit_routing_preference is not None:
transit_routing_pref = TRANSIT_PREFS_TO_GOOGLE_SDK_ENUM[
transit_routing_preference
]
if transit_mode is not None:
transit_travel_mode = TRANSPORT_TYPES_TO_GOOGLE_SDK_ENUM[transit_mode]
transit_preferences = TransitPreferences(
routing_preference=transit_routing_pref,
allowed_travel_modes=[transit_travel_mode],
)
departure_timestamp = convert_time(departure_time) if departure_time else None
arrival_timestamp = convert_time(arrival_time) if arrival_time else None
request = ComputeRoutesRequest(
origin=origin_waypoint,
destination=destination_waypoint,
travel_mode=travel_mode,
routing_preference=routing_preference,
departure_time=departure_timestamp,
arrival_time=arrival_timestamp,
route_modifiers=route_modifiers,
language_code=language,
units=UNITS_TO_GOOGLE_SDK_ENUM[units],
traffic_model=TRAFFIC_MODELS_TO_GOOGLE_SDK_ENUM[traffic_model]
if traffic_model
else None,
transit_preferences=transit_preferences,
)
return await client.compute_routes(
request, metadata=[("x-goog-fieldmask", field_mask)]
)
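
`convert_time` above parses a wall-clock string and shifts it to the following day when the time has already passed, so departure and arrival times always refer to a future point. A simplified, timezone-naive sketch of the same rule:

```python
# Simplified, timezone-naive sketch of the "shift to tomorrow if already past"
# rule implemented by convert_time above (no protobuf Timestamp involved).
from datetime import datetime, time, timedelta


def next_occurrence(hhmm: str, now: datetime) -> datetime:
    hour, minute = (int(part) for part in hhmm.split(":"))
    candidate = datetime.combine(now.date(), time(hour, minute))
    if candidate < now:
        candidate += timedelta(days=1)
    return candidate


now = datetime(2026, 2, 13, 12, 0)
assert next_occurrence("08:00", now) == datetime(2026, 2, 14, 8, 0)    # already past: tomorrow
assert next_occurrence("18:30", now) == datetime(2026, 2, 13, 18, 30)  # still ahead: today
```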

View File

@@ -0,0 +1,10 @@
{
"services": {
"get_transit_times": {
"service": "mdi:bus"
},
"get_travel_times": {
"service": "mdi:routes"
}
}
}

View File

@@ -0,0 +1,137 @@
"""Schemas for the Google Travel Time integration."""
import voluptuous as vol
from homeassistant.const import ATTR_CONFIG_ENTRY_ID, CONF_LANGUAGE, CONF_MODE
from homeassistant.helpers.selector import (
ConfigEntrySelector,
SelectSelector,
SelectSelectorConfig,
SelectSelectorMode,
TextSelector,
TimeSelector,
)
from .const import (
ALL_LANGUAGES,
AVOID_OPTIONS,
CONF_ARRIVAL_TIME,
CONF_AVOID,
CONF_DEPARTURE_TIME,
CONF_DESTINATION,
CONF_ORIGIN,
CONF_TIME_TYPE,
CONF_TRAFFIC_MODEL,
CONF_TRANSIT_MODE,
CONF_TRANSIT_ROUTING_PREFERENCE,
CONF_UNITS,
DOMAIN,
TIME_TYPES,
TRAFFIC_MODELS,
TRANSIT_PREFS,
TRANSPORT_TYPES,
TRAVEL_MODES_WITHOUT_TRANSIT,
UNITS,
UNITS_METRIC,
)
LANGUAGE_SELECTOR = SelectSelector(
SelectSelectorConfig(
options=sorted(ALL_LANGUAGES),
mode=SelectSelectorMode.DROPDOWN,
translation_key=CONF_LANGUAGE,
)
)
AVOID_SELECTOR = SelectSelector(
SelectSelectorConfig(
options=AVOID_OPTIONS,
sort=True,
mode=SelectSelectorMode.DROPDOWN,
translation_key=CONF_AVOID,
)
)
TRAFFIC_MODEL_SELECTOR = SelectSelector(
SelectSelectorConfig(
options=TRAFFIC_MODELS,
sort=True,
mode=SelectSelectorMode.DROPDOWN,
translation_key=CONF_TRAFFIC_MODEL,
)
)
TRANSIT_MODE_SELECTOR = SelectSelector(
SelectSelectorConfig(
options=TRANSPORT_TYPES,
sort=True,
mode=SelectSelectorMode.DROPDOWN,
translation_key=CONF_TRANSIT_MODE,
)
)
TRANSIT_ROUTING_PREFERENCE_SELECTOR = SelectSelector(
SelectSelectorConfig(
options=TRANSIT_PREFS,
sort=True,
mode=SelectSelectorMode.DROPDOWN,
translation_key=CONF_TRANSIT_ROUTING_PREFERENCE,
)
)
UNITS_SELECTOR = SelectSelector(
SelectSelectorConfig(
options=UNITS,
sort=True,
mode=SelectSelectorMode.DROPDOWN,
translation_key=CONF_UNITS,
)
)
TIME_TYPE_SELECTOR = SelectSelector(
SelectSelectorConfig(
options=TIME_TYPES,
sort=True,
mode=SelectSelectorMode.DROPDOWN,
translation_key=CONF_TIME_TYPE,
)
)
_SERVICE_BASE_SCHEMA = vol.Schema(
{
vol.Required(ATTR_CONFIG_ENTRY_ID): ConfigEntrySelector(
{"integration": DOMAIN}
),
vol.Required(CONF_ORIGIN): TextSelector(),
vol.Required(CONF_DESTINATION): TextSelector(),
vol.Optional(CONF_UNITS, default=UNITS_METRIC): UNITS_SELECTOR,
vol.Optional(CONF_LANGUAGE): LANGUAGE_SELECTOR,
}
)
SERVICE_GET_TRAVEL_TIMES_SCHEMA = _SERVICE_BASE_SCHEMA.extend(
{
vol.Optional(CONF_MODE, default="driving"): SelectSelector(
SelectSelectorConfig(
options=TRAVEL_MODES_WITHOUT_TRANSIT,
sort=True,
mode=SelectSelectorMode.DROPDOWN,
translation_key=CONF_MODE,
)
),
vol.Optional(CONF_AVOID): AVOID_SELECTOR,
vol.Optional(CONF_TRAFFIC_MODEL): TRAFFIC_MODEL_SELECTOR,
vol.Optional(CONF_DEPARTURE_TIME): TimeSelector(),
}
)
SERVICE_GET_TRANSIT_TIMES_SCHEMA = _SERVICE_BASE_SCHEMA.extend(
{
vol.Optional(CONF_TRANSIT_MODE): TRANSIT_MODE_SELECTOR,
vol.Optional(
CONF_TRANSIT_ROUTING_PREFERENCE
): TRANSIT_ROUTING_PREFERENCE_SELECTOR,
vol.Exclusive(CONF_DEPARTURE_TIME, "time"): TimeSelector(),
vol.Exclusive(CONF_ARRIVAL_TIME, "time"): TimeSelector(),
}
)
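
In `SERVICE_GET_TRANSIT_TIMES_SCHEMA`, `departure_time` and `arrival_time` share the `"time"` exclusion group, so callers may supply at most one of them. A standalone check of that behaviour:

```python
# Standalone illustration of vol.Exclusive as used for departure/arrival time above.
import voluptuous as vol

schema = vol.Schema(
    {
        vol.Exclusive("departure_time", "time"): str,
        vol.Exclusive("arrival_time", "time"): str,
    }
)

schema({"departure_time": "08:00"})   # ok
schema({"arrival_time": "17:30"})     # ok
try:
    schema({"departure_time": "08:00", "arrival_time": "17:30"})
except vol.MultipleInvalid:
    print("departure_time and arrival_time are mutually exclusive")
```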

View File

@@ -4,20 +4,11 @@ from __future__ import annotations
import datetime
import logging
from typing import TYPE_CHECKING, Any
from typing import Any
from google.api_core.client_options import ClientOptions
from google.api_core.exceptions import GoogleAPIError, PermissionDenied
from google.maps.routing_v2 import (
ComputeRoutesRequest,
Route,
RouteModifiers,
RoutesAsyncClient,
RouteTravelMode,
RoutingPreference,
TransitPreferences,
)
from google.protobuf import timestamp_pb2
from google.maps.routing_v2 import Route, RoutesAsyncClient
from homeassistant.components.sensor import (
SensorDeviceClass,
@@ -38,7 +29,6 @@ from homeassistant.core import CoreState, HomeAssistant
from homeassistant.helpers.device_registry import DeviceEntryType, DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.location import find_coordinates
from homeassistant.util import dt as dt_util
from .const import (
ATTRIBUTION,
@@ -53,14 +43,10 @@ from .const import (
CONF_UNITS,
DEFAULT_NAME,
DOMAIN,
TRAFFIC_MODELS_TO_GOOGLE_SDK_ENUM,
TRANSIT_PREFS_TO_GOOGLE_SDK_ENUM,
TRANSPORT_TYPES_TO_GOOGLE_SDK_ENUM,
TRAVEL_MODES_TO_GOOGLE_SDK_ENUM,
UNITS_TO_GOOGLE_SDK_ENUM,
)
from .helpers import (
convert_to_waypoint,
async_compute_routes,
create_routes_api_disabled_issue,
delete_routes_api_disabled_issue,
)
@@ -70,28 +56,6 @@ _LOGGER = logging.getLogger(__name__)
SCAN_INTERVAL = datetime.timedelta(minutes=10)
FIELD_MASK = "routes.duration,routes.localized_values"
def convert_time(time_str: str) -> timestamp_pb2.Timestamp | None:
"""Convert a string like '08:00' to a google pb2 Timestamp.
If the time is in the past, it will be shifted to the next day.
"""
parsed_time = dt_util.parse_time(time_str)
if TYPE_CHECKING:
assert parsed_time is not None
start_of_day = dt_util.start_of_local_day()
combined = datetime.datetime.combine(
start_of_day,
parsed_time,
start_of_day.tzinfo,
)
if combined < dt_util.now():
combined = combined + datetime.timedelta(days=1)
timestamp = timestamp_pb2.Timestamp()
timestamp.FromDatetime(dt=combined)
return timestamp
SENSOR_DESCRIPTIONS = [
SensorEntityDescription(
key="duration",
@@ -203,67 +167,6 @@ class GoogleTravelTimeSensor(SensorEntity):
self._config_entry.options[CONF_MODE]
]
if (
departure_time := self._config_entry.options.get(CONF_DEPARTURE_TIME)
) is not None:
departure_time = convert_time(departure_time)
if (
arrival_time := self._config_entry.options.get(CONF_ARRIVAL_TIME)
) is not None:
arrival_time = convert_time(arrival_time)
if travel_mode != RouteTravelMode.TRANSIT:
arrival_time = None
traffic_model = None
routing_preference = None
route_modifiers = None
if travel_mode == RouteTravelMode.DRIVE:
if (
options_traffic_model := self._config_entry.options.get(
CONF_TRAFFIC_MODEL
)
) is not None:
traffic_model = TRAFFIC_MODELS_TO_GOOGLE_SDK_ENUM[options_traffic_model]
routing_preference = RoutingPreference.TRAFFIC_AWARE_OPTIMAL
route_modifiers = RouteModifiers(
avoid_tolls=self._config_entry.options.get(CONF_AVOID) == "tolls",
avoid_ferries=self._config_entry.options.get(CONF_AVOID) == "ferries",
avoid_highways=self._config_entry.options.get(CONF_AVOID) == "highways",
avoid_indoor=self._config_entry.options.get(CONF_AVOID) == "indoor",
)
transit_preferences = None
if travel_mode == RouteTravelMode.TRANSIT:
transit_routing_preference = None
transit_travel_mode = (
TransitPreferences.TransitTravelMode.TRANSIT_TRAVEL_MODE_UNSPECIFIED
)
if (
option_transit_preferences := self._config_entry.options.get(
CONF_TRANSIT_ROUTING_PREFERENCE
)
) is not None:
transit_routing_preference = TRANSIT_PREFS_TO_GOOGLE_SDK_ENUM[
option_transit_preferences
]
if (
option_transit_mode := self._config_entry.options.get(CONF_TRANSIT_MODE)
) is not None:
transit_travel_mode = TRANSPORT_TYPES_TO_GOOGLE_SDK_ENUM[
option_transit_mode
]
transit_preferences = TransitPreferences(
routing_preference=transit_routing_preference,
allowed_travel_modes=[transit_travel_mode],
)
language = None
if (
options_language := self._config_entry.options.get(CONF_LANGUAGE)
) is not None:
language = options_language
self._resolved_origin = find_coordinates(self.hass, self._origin)
self._resolved_destination = find_coordinates(self.hass, self._destination)
_LOGGER.debug(
@@ -272,22 +175,24 @@ class GoogleTravelTimeSensor(SensorEntity):
self._resolved_destination,
)
if self._resolved_destination is not None and self._resolved_origin is not None:
request = ComputeRoutesRequest(
origin=convert_to_waypoint(self.hass, self._resolved_origin),
destination=convert_to_waypoint(self.hass, self._resolved_destination),
travel_mode=travel_mode,
routing_preference=routing_preference,
departure_time=departure_time,
arrival_time=arrival_time,
route_modifiers=route_modifiers,
language_code=language,
units=UNITS_TO_GOOGLE_SDK_ENUM[self._config_entry.options[CONF_UNITS]],
traffic_model=traffic_model,
transit_preferences=transit_preferences,
)
try:
response = await self._client.compute_routes(
request, metadata=[("x-goog-fieldmask", FIELD_MASK)]
response = await async_compute_routes(
client=self._client,
origin=self._resolved_origin,
destination=self._resolved_destination,
hass=self.hass,
travel_mode=travel_mode,
units=self._config_entry.options[CONF_UNITS],
language=self._config_entry.options.get(CONF_LANGUAGE),
avoid=self._config_entry.options.get(CONF_AVOID),
traffic_model=self._config_entry.options.get(CONF_TRAFFIC_MODEL),
transit_mode=self._config_entry.options.get(CONF_TRANSIT_MODE),
transit_routing_preference=self._config_entry.options.get(
CONF_TRANSIT_ROUTING_PREFERENCE
),
departure_time=self._config_entry.options.get(CONF_DEPARTURE_TIME),
arrival_time=self._config_entry.options.get(CONF_ARRIVAL_TIME),
field_mask=FIELD_MASK,
)
_LOGGER.debug("Received response: %s", response)
if response is not None and len(response.routes) > 0:

View File

@@ -0,0 +1,167 @@
"""Services for the Google Travel Time integration."""
from typing import cast
from google.api_core.client_options import ClientOptions
from google.api_core.exceptions import GoogleAPIError, PermissionDenied
from google.maps.routing_v2 import ComputeRoutesResponse, RoutesAsyncClient
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
ATTR_CONFIG_ENTRY_ID,
CONF_API_KEY,
CONF_LANGUAGE,
CONF_MODE,
)
from homeassistant.core import (
HomeAssistant,
ServiceCall,
ServiceResponse,
SupportsResponse,
callback,
)
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.service import async_get_config_entry
from .const import (
CONF_ARRIVAL_TIME,
CONF_AVOID,
CONF_DEPARTURE_TIME,
CONF_DESTINATION,
CONF_ORIGIN,
CONF_TRAFFIC_MODEL,
CONF_TRANSIT_MODE,
CONF_TRANSIT_ROUTING_PREFERENCE,
CONF_UNITS,
DOMAIN,
TRAVEL_MODES_TO_GOOGLE_SDK_ENUM,
)
from .helpers import (
async_compute_routes,
create_routes_api_disabled_issue,
delete_routes_api_disabled_issue,
)
from .schemas import SERVICE_GET_TRANSIT_TIMES_SCHEMA, SERVICE_GET_TRAVEL_TIMES_SCHEMA
SERVICE_GET_TRAVEL_TIMES = "get_travel_times"
SERVICE_GET_TRANSIT_TIMES = "get_transit_times"
def _build_routes_response(response: ComputeRoutesResponse | None) -> list[dict]:
"""Build the routes response from the API response."""
if response is None or not response.routes:
return []
return [
{
"duration": route.duration.seconds,
"duration_text": route.localized_values.duration.text,
"static_duration_text": route.localized_values.static_duration.text,
"distance_meters": route.distance_meters,
"distance_text": route.localized_values.distance.text,
}
for route in response.routes
]
def _raise_service_error(
hass: HomeAssistant, entry: ConfigEntry, exc: Exception
) -> None:
"""Raise a HomeAssistantError based on the exception."""
if isinstance(exc, PermissionDenied):
create_routes_api_disabled_issue(hass, entry)
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="permission_denied",
) from exc
if isinstance(exc, GoogleAPIError):
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="api_error",
translation_placeholders={"error": str(exc)},
) from exc
raise exc
@callback
def async_setup_services(hass: HomeAssistant) -> None:
"""Set up services for the Google Travel Time integration."""
async def async_get_travel_times_service(service: ServiceCall) -> ServiceResponse:
"""Handle the service call to get travel times (non-transit modes)."""
entry = async_get_config_entry(
service.hass, DOMAIN, service.data[ATTR_CONFIG_ENTRY_ID]
)
api_key = entry.data[CONF_API_KEY]
travel_mode = TRAVEL_MODES_TO_GOOGLE_SDK_ENUM[service.data[CONF_MODE]]
client_options = ClientOptions(api_key=api_key)
client = RoutesAsyncClient(client_options=client_options)
try:
response = await async_compute_routes(
client=client,
origin=service.data[CONF_ORIGIN],
destination=service.data[CONF_DESTINATION],
hass=hass,
travel_mode=travel_mode,
units=service.data[CONF_UNITS],
language=service.data.get(CONF_LANGUAGE),
avoid=service.data.get(CONF_AVOID),
traffic_model=service.data.get(CONF_TRAFFIC_MODEL),
departure_time=service.data.get(CONF_DEPARTURE_TIME),
)
except Exception as ex: # noqa: BLE001
_raise_service_error(hass, entry, ex)
delete_routes_api_disabled_issue(hass, entry)
return cast(ServiceResponse, {"routes": _build_routes_response(response)})
async def async_get_transit_times_service(service: ServiceCall) -> ServiceResponse:
"""Handle the service call to get transit times."""
entry = async_get_config_entry(
service.hass, DOMAIN, service.data[ATTR_CONFIG_ENTRY_ID]
)
api_key = entry.data[CONF_API_KEY]
client_options = ClientOptions(api_key=api_key)
client = RoutesAsyncClient(client_options=client_options)
try:
response = await async_compute_routes(
client=client,
origin=service.data[CONF_ORIGIN],
destination=service.data[CONF_DESTINATION],
hass=hass,
travel_mode=TRAVEL_MODES_TO_GOOGLE_SDK_ENUM["transit"],
units=service.data[CONF_UNITS],
language=service.data.get(CONF_LANGUAGE),
transit_mode=service.data.get(CONF_TRANSIT_MODE),
transit_routing_preference=service.data.get(
CONF_TRANSIT_ROUTING_PREFERENCE
),
departure_time=service.data.get(CONF_DEPARTURE_TIME),
arrival_time=service.data.get(CONF_ARRIVAL_TIME),
)
except Exception as ex: # noqa: BLE001
_raise_service_error(hass, entry, ex)
delete_routes_api_disabled_issue(hass, entry)
return cast(ServiceResponse, {"routes": _build_routes_response(response)})
hass.services.async_register(
DOMAIN,
SERVICE_GET_TRAVEL_TIMES,
async_get_travel_times_service,
SERVICE_GET_TRAVEL_TIMES_SCHEMA,
supports_response=SupportsResponse.ONLY,
)
hass.services.async_register(
DOMAIN,
SERVICE_GET_TRANSIT_TIMES,
async_get_transit_times_service,
SERVICE_GET_TRANSIT_TIMES_SCHEMA,
supports_response=SupportsResponse.ONLY,
)
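
For reference, a minimal sketch of calling the new action from Python; the config entry id is a placeholder and the response shape simply mirrors _build_routes_response() above (this is not an official example from the PR):

    response = await hass.services.async_call(
        "google_travel_time",
        "get_travel_times",
        {
            "config_entry_id": "<config_entry_id>",  # placeholder
            "origin": "1600 Amphitheatre Parkway, Mountain View, CA",
            "destination": "1 Infinite Loop, Cupertino, CA",
            "mode": "driving",
            "units": "metric",
        },
        blocking=True,
        return_response=True,  # required: registered with SupportsResponse.ONLY
    )
    # Expected shape, per _build_routes_response():
    # {"routes": [{"duration": 1800, "duration_text": "30 mins",
    #              "static_duration_text": "27 mins",
    #              "distance_meters": 16000, "distance_text": "16 km"}]}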

View File

@@ -0,0 +1,118 @@
get_travel_times:
fields:
config_entry_id:
required: true
selector:
config_entry:
integration: google_travel_time
origin:
required: true
example: "1600 Amphitheatre Parkway, Mountain View, CA"
selector:
text:
destination:
required: true
example: "1 Infinite Loop, Cupertino, CA"
selector:
text:
mode:
default: "driving"
selector:
select:
translation_key: mode
options:
- driving
- walking
- bicycling
units:
default: "metric"
selector:
select:
translation_key: units
options:
- metric
- imperial
language:
required: false
selector:
language:
avoid:
required: false
selector:
select:
translation_key: avoid
options:
- tolls
- highways
- ferries
- indoor
traffic_model:
required: false
selector:
select:
translation_key: traffic_model
options:
- best_guess
- pessimistic
- optimistic
departure_time:
required: false
selector:
time:
get_transit_times:
fields:
config_entry_id:
required: true
selector:
config_entry:
integration: google_travel_time
origin:
required: true
example: "1600 Amphitheatre Parkway, Mountain View, CA"
selector:
text:
destination:
required: true
example: "1 Infinite Loop, Cupertino, CA"
selector:
text:
units:
default: "metric"
selector:
select:
translation_key: units
options:
- metric
- imperial
language:
required: false
selector:
language:
transit_mode:
required: false
selector:
select:
translation_key: transit_mode
options:
- bus
- subway
- train
- tram
- rail
transit_routing_preference:
required: false
selector:
select:
translation_key: transit_routing_preference
options:
- less_walking
- fewer_transfers
departure_time:
required: false
selector:
time:
arrival_time:
required: false
selector:
time:
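
A similarly hedged sketch for the transit action, exercising the transit-only fields defined above; values are placeholders, and the time selectors typically yield "HH:MM:SS" strings:

    response = await hass.services.async_call(
        "google_travel_time",
        "get_transit_times",
        {
            "config_entry_id": "<config_entry_id>",  # placeholder
            "origin": "1600 Amphitheatre Parkway, Mountain View, CA",
            "destination": "1 Infinite Loop, Cupertino, CA",
            "units": "metric",
            "transit_mode": "train",
            "transit_routing_preference": "fewer_transfers",
            "arrival_time": "08:30:00",
        },
        blocking=True,
        return_response=True,
    )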

View File

@@ -30,6 +30,14 @@
}
}
},
"exceptions": {
"api_error": {
"message": "Google API error: {error}"
},
"permission_denied": {
"message": "[%key:component::google_travel_time::config::error::permission_denied%]"
}
},
"issues": {
"routes_api_disabled": {
"description": "Your Google Travel Time integration `{entry_title}` uses an API key which does not have the Routes API enabled.\n\n Please follow the instructions to [enable the API for your project]({enable_api_url}) and make sure your [API key restrictions]({api_key_restrictions_url}) allow access to the Routes API.\n\n After enabling the API this issue will be resolved automatically.",
@@ -107,5 +115,91 @@
}
}
},
"services": {
"get_transit_times": {
"description": "Retrieves route alternatives and travel times between two locations using public transit.",
"fields": {
"arrival_time": {
"description": "The desired arrival time.",
"name": "Arrival time"
},
"config_entry_id": {
"description": "[%key:component::google_travel_time::services::get_travel_times::fields::config_entry_id::description%]",
"name": "[%key:component::google_travel_time::services::get_travel_times::fields::config_entry_id::name%]"
},
"departure_time": {
"description": "[%key:component::google_travel_time::services::get_travel_times::fields::departure_time::description%]",
"name": "[%key:component::google_travel_time::services::get_travel_times::fields::departure_time::name%]"
},
"destination": {
"description": "[%key:component::google_travel_time::services::get_travel_times::fields::destination::description%]",
"name": "[%key:component::google_travel_time::config::step::user::data::destination%]"
},
"language": {
"description": "[%key:component::google_travel_time::services::get_travel_times::fields::language::description%]",
"name": "[%key:common::config_flow::data::language%]"
},
"origin": {
"description": "[%key:component::google_travel_time::services::get_travel_times::fields::origin::description%]",
"name": "[%key:component::google_travel_time::config::step::user::data::origin%]"
},
"transit_mode": {
"description": "The preferred transit mode.",
"name": "[%key:component::google_travel_time::options::step::init::data::transit_mode%]"
},
"transit_routing_preference": {
"description": "The transit routing preference.",
"name": "[%key:component::google_travel_time::options::step::init::data::transit_routing_preference%]"
},
"units": {
"description": "[%key:component::google_travel_time::services::get_travel_times::fields::units::description%]",
"name": "[%key:component::google_travel_time::options::step::init::data::units%]"
}
},
"name": "Get transit times"
},
"get_travel_times": {
"description": "Retrieves route alternatives and travel times between two locations.",
"fields": {
"avoid": {
"description": "Features to avoid when calculating the route.",
"name": "[%key:component::google_travel_time::options::step::init::data::avoid%]"
},
"config_entry_id": {
"description": "The config entry to use for this action.",
"name": "Config entry"
},
"departure_time": {
"description": "The desired departure time.",
"name": "Departure time"
},
"destination": {
"description": "The destination of the route.",
"name": "[%key:component::google_travel_time::config::step::user::data::destination%]"
},
"language": {
"description": "The language to use for the response.",
"name": "[%key:common::config_flow::data::language%]"
},
"mode": {
"description": "The mode of transportation.",
"name": "[%key:component::google_travel_time::options::step::init::data::mode%]"
},
"origin": {
"description": "The origin of the route.",
"name": "[%key:component::google_travel_time::config::step::user::data::origin%]"
},
"traffic_model": {
"description": "The traffic model to use when calculating driving routes.",
"name": "[%key:component::google_travel_time::options::step::init::data::traffic_model%]"
},
"units": {
"description": "Which unit system to use.",
"name": "[%key:component::google_travel_time::options::step::init::data::units%]"
}
},
"name": "Get travel times"
}
},
"title": "Google Maps Travel Time"
}

View File

@@ -11,6 +11,7 @@ from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .const import (
CONF_DISCOVERY_INTERVAL_DEFAULT,
CONF_LISTENING_PORT_DEFAULT,
CONF_MULTICAST_ADDRESS_DEFAULT,
CONF_TARGET_PORT_DEFAULT,
@@ -49,7 +50,7 @@ class GoveeLocalApiCoordinator(DataUpdateCoordinator[list[GoveeDevice]]):
broadcast_port=CONF_TARGET_PORT_DEFAULT,
listening_port=CONF_LISTENING_PORT_DEFAULT,
discovery_enabled=True,
discovery_interval=1,
discovery_interval=CONF_DISCOVERY_INTERVAL_DEFAULT,
update_enabled=False,
)
for source_ip in source_ips

View File

@@ -16,6 +16,7 @@ from homeassistant.helpers.typing import ConfigType
from .const import (
AUTH_API_TOKEN,
AUTH_PASSWORD,
CACHED_API_KEY,
CONF_AUTH_TYPE,
CONF_PLANT_ID,
DEFAULT_PLANT_ID,
@@ -41,15 +42,163 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
return True
def get_device_list_classic(
api: growattServer.GrowattApi, config: Mapping[str, str]
) -> tuple[list[dict[str, str]], str]:
"""Retrieve the device list for the selected plant."""
plant_id = config[CONF_PLANT_ID]
async def async_migrate_entry(
hass: HomeAssistant, config_entry: GrowattConfigEntry
) -> bool:
"""Migrate old config entries.
# Log in to api and fetch first plant if no plant id is defined.
Migration from version 1.0 to 1.1:
- Resolves DEFAULT_PLANT_ID (legacy value "0") to actual plant_id
- Only applies to Classic API (username/password authentication)
- Caches the logged-in API instance to avoid growatt server API rate limiting
Rate Limiting Workaround:
The Growatt Classic API rate-limits individual endpoints (login, plant_list,
device_list) with 5-minute windows. Without caching, the sequence would be:
Migration: login() → plant_list()
Setup: login() → device_list()
This results in 2 login() calls within seconds, triggering rate limits.
By caching the API instance (which contains the authenticated session), we
achieve:
Migration: login() → plant_list() → [cache API instance]
Setup: [reuse cached API] → device_list()
This reduces to just one login() call during the migration+setup cycle and prevents account lockout.
"""
_LOGGER.debug(
"Migrating config entry from version %s.%s",
config_entry.version,
config_entry.minor_version,
)
# Migrate from version 1.0 to 1.1
if config_entry.version == 1 and config_entry.minor_version < 1:
config = config_entry.data
# First, ensure auth_type field exists (legacy config entry migration)
# This handles config entries created before auth_type was introduced
if CONF_AUTH_TYPE not in config:
new_data = dict(config_entry.data)
# Detect auth type based on which fields are present
if CONF_TOKEN in config:
new_data[CONF_AUTH_TYPE] = AUTH_API_TOKEN
hass.config_entries.async_update_entry(config_entry, data=new_data)
config = config_entry.data
_LOGGER.debug("Added auth_type field to V1 API config entry")
elif CONF_USERNAME in config:
new_data[CONF_AUTH_TYPE] = AUTH_PASSWORD
hass.config_entries.async_update_entry(config_entry, data=new_data)
config = config_entry.data
_LOGGER.debug("Added auth_type field to Classic API config entry")
else:
# Config entry has no auth fields - this is invalid but migration
# should still succeed. Setup will fail later with a clearer error.
_LOGGER.warning(
"Config entry has no authentication fields. "
"Setup will fail until the integration is reconfigured"
)
# Handle DEFAULT_PLANT_ID resolution
if config.get(CONF_PLANT_ID) == DEFAULT_PLANT_ID:
# V1 API should never have DEFAULT_PLANT_ID (plant selection happens in config flow)
# If it does, this indicates a corrupted config entry
if config.get(CONF_AUTH_TYPE) == AUTH_API_TOKEN:
_LOGGER.error(
"V1 API config entry has DEFAULT_PLANT_ID, which indicates a "
"corrupted configuration. Please reconfigure the integration"
)
return False
# Classic API with DEFAULT_PLANT_ID - resolve to actual plant_id
if config.get(CONF_AUTH_TYPE) == AUTH_PASSWORD:
username = config.get(CONF_USERNAME)
password = config.get(CONF_PASSWORD)
url = config.get(CONF_URL, DEFAULT_URL)
if not username or not password:
# Credentials missing - cannot migrate
_LOGGER.error(
"Cannot migrate DEFAULT_PLANT_ID due to missing credentials"
)
return False
try:
# Create API instance and login
api, login_response = await _create_api_and_login(
hass, username, password, url
)
# Resolve DEFAULT_PLANT_ID to actual plant_id
plant_info = await hass.async_add_executor_job(
api.plant_list, login_response["user"]["id"]
)
except (ConfigEntryError, RequestException, JSONDecodeError) as ex:
# API failure during migration - return False to retry later
_LOGGER.error(
"Failed to resolve plant_id during migration: %s. "
"Migration will retry on next restart",
ex,
)
return False
if not plant_info or "data" not in plant_info or not plant_info["data"]:
_LOGGER.error(
"No plants found for this account. "
"Migration will retry on next restart"
)
return False
first_plant_id = plant_info["data"][0]["plantId"]
# Update config entry with resolved plant_id
new_data = dict(config_entry.data)
new_data[CONF_PLANT_ID] = first_plant_id
hass.config_entries.async_update_entry(
config_entry, data=new_data, minor_version=1
)
# Cache the logged-in API instance for reuse in async_setup_entry()
hass.data.setdefault(DOMAIN, {})
hass.data[DOMAIN][f"{CACHED_API_KEY}{config_entry.entry_id}"] = api
_LOGGER.info(
"Migrated config entry to use specific plant_id '%s'",
first_plant_id,
)
else:
# No DEFAULT_PLANT_ID to resolve, just bump version
hass.config_entries.async_update_entry(config_entry, minor_version=1)
_LOGGER.debug("Migration completed to version %s.%s", config_entry.version, 1)
return True
async def _create_api_and_login(
hass: HomeAssistant, username: str, password: str, url: str
) -> tuple[growattServer.GrowattApi, dict]:
"""Create API instance and perform login.
Returns both the API instance (with authenticated session) and the login
response (containing user_id needed for subsequent API calls).
"""
api = growattServer.GrowattApi(add_random_user_id=True, agent_identifier=username)
api.server_url = url
login_response = await hass.async_add_executor_job(
_login_classic_api, api, username, password
)
return api, login_response
def _login_classic_api(
api: growattServer.GrowattApi, username: str, password: str
) -> dict:
"""Log in to Classic API and return user info."""
try:
login_response = api.login(config[CONF_USERNAME], config[CONF_PASSWORD])
login_response = api.login(username, password)
except (RequestException, JSONDecodeError) as ex:
raise ConfigEntryError(
f"Error communicating with Growatt API during login: {ex}"
@@ -62,31 +211,7 @@ def get_device_list_classic(
raise ConfigEntryAuthFailed("Username, Password or URL may be incorrect!")
raise ConfigEntryError(f"Growatt login failed: {msg}")
user_id = login_response["user"]["id"]
# Legacy support: DEFAULT_PLANT_ID ("0") triggers auto-selection of first plant.
# Modern config flow always sets a specific plant_id, but old config entries
# from earlier versions may still have plant_id="0".
if plant_id == DEFAULT_PLANT_ID:
try:
plant_info = api.plant_list(user_id)
except (RequestException, JSONDecodeError) as ex:
raise ConfigEntryError(
f"Error communicating with Growatt API during plant list: {ex}"
) from ex
if not plant_info or "data" not in plant_info or not plant_info["data"]:
raise ConfigEntryError("No plants found for this account.")
plant_id = plant_info["data"][0]["plantId"]
# Get a list of devices for specified plant to add sensors for.
try:
devices = api.device_list(plant_id)
except (RequestException, JSONDecodeError) as ex:
raise ConfigEntryError(
f"Error communicating with Growatt API during device list: {ex}"
) from ex
return devices, plant_id
return login_response
def get_device_list_v1(
@@ -94,9 +219,9 @@ def get_device_list_v1(
) -> tuple[list[dict[str, str]], str]:
"""Device list logic for Open API V1.
Note: Plant selection (including auto-selection if only one plant exists)
is handled in the config flow before this function is called. This function
only fetches devices for the already-selected plant_id.
Plant selection is handled in the config flow before this function is called.
This function expects a specific plant_id and fetches devices for that plant.
"""
plant_id = config[CONF_PLANT_ID]
try:
@@ -126,19 +251,6 @@ def get_device_list_v1(
return supported_devices, plant_id
def get_device_list(
api, config: Mapping[str, str], api_version: str
) -> tuple[list[dict[str, str]], str]:
"""Dispatch to correct device list logic based on API version."""
if api_version == "v1":
return get_device_list_v1(api, config)
if api_version == "classic":
return get_device_list_classic(api, config)
# Defensive: api_version is hardcoded in async_setup_entry as "v1" or "classic"
# This line is unreachable through normal execution but kept as a safeguard
raise ConfigEntryError(f"Unknown API version: {api_version}") # pragma: no cover
async def async_setup_entry(
hass: HomeAssistant, config_entry: GrowattConfigEntry
) -> bool:
@@ -154,40 +266,47 @@ async def async_setup_entry(
new_data[CONF_URL] = url
hass.config_entries.async_update_entry(config_entry, data=new_data)
# Migrate legacy config entries without auth_type field
if CONF_AUTH_TYPE not in config:
new_data = dict(config_entry.data)
# Detect auth type based on which fields are present
if CONF_TOKEN in config:
new_data[CONF_AUTH_TYPE] = AUTH_API_TOKEN
elif CONF_USERNAME in config:
new_data[CONF_AUTH_TYPE] = AUTH_PASSWORD
else:
raise ConfigEntryError(
"Unable to determine authentication type from config entry."
)
hass.config_entries.async_update_entry(config_entry, data=new_data)
config = config_entry.data
# Determine API version
# Determine API version and get devices
# Note: auth_type field is guaranteed to exist after migration
if config.get(CONF_AUTH_TYPE) == AUTH_API_TOKEN:
api_version = "v1"
# V1 API (token-based, no login needed)
token = config[CONF_TOKEN]
api = growattServer.OpenApiV1(token=token)
elif config.get(CONF_AUTH_TYPE) == AUTH_PASSWORD:
api_version = "classic"
username = config[CONF_USERNAME]
api = growattServer.GrowattApi(
add_random_user_id=True, agent_identifier=username
devices, plant_id = await hass.async_add_executor_job(
get_device_list_v1, api, config
)
api.server_url = url
elif config.get(CONF_AUTH_TYPE) == AUTH_PASSWORD:
# Classic API (username/password with login)
username = config[CONF_USERNAME]
password = config[CONF_PASSWORD]
# Check if migration cached an authenticated API instance for us to reuse.
# This avoids calling login() twice (once in migration, once here) which
# would trigger rate limiting.
cached_api = hass.data.get(DOMAIN, {}).pop(
f"{CACHED_API_KEY}{config_entry.entry_id}", None
)
if cached_api:
# Reuse the logged-in API instance from migration (rate limit optimization)
api = cached_api
_LOGGER.debug("Reusing logged-in session from migration")
else:
# No cached API (normal setup or migration didn't run)
# Create new API instance and login
api, _ = await _create_api_and_login(hass, username, password, url)
# Get plant_id and devices using the authenticated session
plant_id = config[CONF_PLANT_ID]
try:
devices = await hass.async_add_executor_job(api.device_list, plant_id)
except (RequestException, JSONDecodeError) as ex:
raise ConfigEntryError(
f"Error communicating with Growatt API during device list: {ex}"
) from ex
else:
raise ConfigEntryError("Unknown authentication type in config entry.")
devices, plant_id = await hass.async_add_executor_job(
get_device_list, api, config, api_version
)
# Create a coordinator for the total sensors
total_coordinator = GrowattCoordinator(
hass, config_entry, plant_id, "total", plant_id
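
For clarity, the cache hand-off described in the async_migrate_entry docstring, condensed into one place (a sketch; names match the diff above):

    # Migration, after a successful login():
    hass.data.setdefault(DOMAIN, {})
    hass.data[DOMAIN][f"{CACHED_API_KEY}{config_entry.entry_id}"] = api

    # Setup, classic-auth path: pop the cached instance so it is used at most
    # once, and fall back to a fresh login when migration did not run.
    cached_api = hass.data.get(DOMAIN, {}).pop(
        f"{CACHED_API_KEY}{config_entry.entry_id}", None
    )
    if cached_api is None:
        api, _ = await _create_api_and_login(hass, username, password, url)
    else:
        api = cached_api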

View File

@@ -40,6 +40,7 @@ class GrowattServerConfigFlow(ConfigFlow, domain=DOMAIN):
"""Config flow class."""
VERSION = 1
MINOR_VERSION = 1
api: growattServer.GrowattApi

View File

@@ -53,3 +53,8 @@ ABORT_NO_PLANTS = "no_plants"
BATT_MODE_LOAD_FIRST = 0
BATT_MODE_BATTERY_FIRST = 1
BATT_MODE_GRID_FIRST = 2
# Internal key prefix for caching authenticated API instance
# Used to pass logged-in session from async_migrate_entry to async_setup_entry
# to avoid double login() calls that trigger API rate limiting
CACHED_API_KEY = "_cached_api_"

View File

@@ -28,7 +28,6 @@ from habiticalib import (
import voluptuous as vol
from homeassistant.components.todo import ATTR_RENAME
from homeassistant.config_entries import ConfigEntryState
from homeassistant.const import ATTR_DATE, ATTR_NAME
from homeassistant.core import (
HomeAssistant,
@@ -38,7 +37,7 @@ from homeassistant.core import (
callback,
)
from homeassistant.exceptions import HomeAssistantError, ServiceValidationError
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers import config_validation as cv, service
from homeassistant.helpers.selector import ConfigEntrySelector
from homeassistant.util import dt as dt_util
@@ -243,24 +242,11 @@ SERVICE_TASK_TYPE_MAP = {
}
def get_config_entry(hass: HomeAssistant, entry_id: str) -> HabiticaConfigEntry:
"""Return config entry or raise if not found or not loaded."""
if not (entry := hass.config_entries.async_get_entry(entry_id)):
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="entry_not_found",
)
if entry.state is not ConfigEntryState.LOADED:
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="entry_not_loaded",
)
return entry
async def _cast_skill(call: ServiceCall) -> ServiceResponse:
"""Skill action."""
entry = get_config_entry(call.hass, call.data[ATTR_CONFIG_ENTRY])
entry: HabiticaConfigEntry = service.async_get_config_entry(
call.hass, DOMAIN, call.data[ATTR_CONFIG_ENTRY]
)
coordinator = entry.runtime_data
skill = SKILL_MAP[call.data[ATTR_SKILL]]
@@ -324,7 +310,9 @@ async def _cast_skill(call: ServiceCall) -> ServiceResponse:
async def _manage_quests(call: ServiceCall) -> ServiceResponse:
"""Accept, reject, start, leave or cancel quests."""
entry = get_config_entry(call.hass, call.data[ATTR_CONFIG_ENTRY])
entry: HabiticaConfigEntry = service.async_get_config_entry(
call.hass, DOMAIN, call.data[ATTR_CONFIG_ENTRY]
)
coordinator = entry.runtime_data
FUNC_MAP = {
@@ -372,7 +360,9 @@ async def _manage_quests(call: ServiceCall) -> ServiceResponse:
async def _score_task(call: ServiceCall) -> ServiceResponse:
"""Score a task action."""
entry = get_config_entry(call.hass, call.data[ATTR_CONFIG_ENTRY])
entry: HabiticaConfigEntry = service.async_get_config_entry(
call.hass, DOMAIN, call.data[ATTR_CONFIG_ENTRY]
)
coordinator = entry.runtime_data
direction = (
@@ -436,7 +426,9 @@ async def _score_task(call: ServiceCall) -> ServiceResponse:
async def _transformation(call: ServiceCall) -> ServiceResponse:
"""User a transformation item on a player character."""
entry = get_config_entry(call.hass, call.data[ATTR_CONFIG_ENTRY])
entry: HabiticaConfigEntry = service.async_get_config_entry(
call.hass, DOMAIN, call.data[ATTR_CONFIG_ENTRY]
)
coordinator = entry.runtime_data
item = ITEMID_MAP[call.data[ATTR_ITEM]]
@@ -519,7 +511,9 @@ async def _transformation(call: ServiceCall) -> ServiceResponse:
async def _get_tasks(call: ServiceCall) -> ServiceResponse:
"""Get tasks action."""
entry = get_config_entry(call.hass, call.data[ATTR_CONFIG_ENTRY])
entry: HabiticaConfigEntry = service.async_get_config_entry(
call.hass, DOMAIN, call.data[ATTR_CONFIG_ENTRY]
)
coordinator = entry.runtime_data
response: list[TaskData] = coordinator.data.tasks
@@ -568,7 +562,9 @@ async def _get_tasks(call: ServiceCall) -> ServiceResponse:
async def _create_or_update_task(call: ServiceCall) -> ServiceResponse: # noqa: C901
"""Create or update task action."""
entry = get_config_entry(call.hass, call.data[ATTR_CONFIG_ENTRY])
entry: HabiticaConfigEntry = service.async_get_config_entry(
call.hass, DOMAIN, call.data[ATTR_CONFIG_ENTRY]
)
coordinator = entry.runtime_data
await coordinator.async_refresh()
is_update = call.service in (
@@ -852,7 +848,7 @@ async def _create_or_update_task(call: ServiceCall) -> ServiceResponse: # noqa:
def async_setup_services(hass: HomeAssistant) -> None:
"""Set up services for Habitica integration."""
for service in (
for service_name in (
SERVICE_ABORT_QUEST,
SERVICE_ACCEPT_QUEST,
SERVICE_CANCEL_QUEST,
@@ -862,13 +858,13 @@ def async_setup_services(hass: HomeAssistant) -> None:
):
hass.services.async_register(
DOMAIN,
service,
service_name,
_manage_quests,
schema=SERVICE_MANAGE_QUEST_SCHEMA,
supports_response=SupportsResponse.ONLY,
)
for service in (
for service_name in (
SERVICE_UPDATE_DAILY,
SERVICE_UPDATE_HABIT,
SERVICE_UPDATE_REWARD,
@@ -876,12 +872,12 @@ def async_setup_services(hass: HomeAssistant) -> None:
):
hass.services.async_register(
DOMAIN,
service,
service_name,
_create_or_update_task,
schema=SERVICE_UPDATE_TASK_SCHEMA,
supports_response=SupportsResponse.ONLY,
)
for service in (
for service_name in (
SERVICE_CREATE_DAILY,
SERVICE_CREATE_HABIT,
SERVICE_CREATE_REWARD,
@@ -889,7 +885,7 @@ def async_setup_services(hass: HomeAssistant) -> None:
):
hass.services.async_register(
DOMAIN,
service,
service_name,
_create_or_update_task,
schema=SERVICE_CREATE_TASK_SCHEMA,
supports_response=SupportsResponse.ONLY,

View File

@@ -550,12 +550,6 @@
"delete_todos_failed": {
"message": "Unable to delete item from Habitica to-do list, please try again"
},
"entry_not_found": {
"message": "The selected character is not configured in Home Assistant."
},
"entry_not_loaded": {
"message": "The selected character is currently not loaded or disabled in Home Assistant."
},
"frequency_not_monthly": {
"message": "Unable to update task, monthly repeat settings apply only to monthly recurring dailies."
},

View File

@@ -7,7 +7,7 @@
"integration_type": "device",
"iot_class": "local_polling",
"quality_scale": "gold",
"requirements": ["hdfury==1.4.2"],
"requirements": ["hdfury==1.5.0"],
"zeroconf": [
{ "name": "diva-*", "type": "_http._tcp.local." },
{ "name": "vertex2-*", "type": "_http._tcp.local." },

View File

@@ -3,13 +3,7 @@
from collections.abc import Awaitable, Callable
from dataclasses import dataclass
from hdfury import (
OPERATION_MODES,
TX0_INPUT_PORTS,
TX1_INPUT_PORTS,
HDFuryAPI,
HDFuryError,
)
from hdfury import OPERATION_MODES, TX0_INPUT_PORTS, TX1_INPUT_PORTS, HDFuryError
from homeassistant.components.select import SelectEntity, SelectEntityDescription
from homeassistant.core import HomeAssistant
@@ -27,7 +21,7 @@ PARALLEL_UPDATES = 1
class HDFurySelectEntityDescription(SelectEntityDescription):
"""Description for HDFury select entities."""
set_value_fn: Callable[[HDFuryAPI, str], Awaitable[None]]
set_value_fn: Callable[[HDFuryCoordinator, str], Awaitable[None]]
SELECT_PORTS: tuple[HDFurySelectEntityDescription, ...] = (

View File

@@ -0,0 +1,72 @@
"""The Hegel integration."""
from __future__ import annotations
import logging
from hegel_ip_client import HegelClient
from hegel_ip_client.exceptions import HegelConnectionError
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_HOST, EVENT_HOMEASSISTANT_STOP, Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryNotReady
from .const import DEFAULT_PORT
PLATFORMS: list[Platform] = [Platform.MEDIA_PLAYER]
_LOGGER = logging.getLogger(__name__)
type HegelConfigEntry = ConfigEntry[HegelClient]
async def async_setup_entry(hass: HomeAssistant, entry: HegelConfigEntry) -> bool:
"""Set up the Hegel integration."""
host = entry.data[CONF_HOST]
# Create and test client connection
client = HegelClient(host, DEFAULT_PORT)
try:
# Test connection before proceeding with setup
await client.start()
await client.ensure_connected(timeout=10.0)
_LOGGER.debug("Successfully connected to Hegel at %s:%s", host, DEFAULT_PORT)
except (HegelConnectionError, TimeoutError, OSError) as err:
_LOGGER.error(
"Failed to connect to Hegel at %s:%s: %s", host, DEFAULT_PORT, err
)
await client.stop() # Clean up
raise ConfigEntryNotReady(
f"Unable to connect to Hegel amplifier at {host}:{DEFAULT_PORT}"
) from err
# Store client in runtime_data
entry.runtime_data = client
async def _async_close_client(event):
await client.stop()
entry.async_on_unload(
hass.bus.async_listen_once(EVENT_HOMEASSISTANT_STOP, _async_close_client)
)
# Forward setup to supported platforms
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True
async def async_unload_entry(hass: HomeAssistant, entry: HegelConfigEntry) -> bool:
"""Unload a Hegel config entry and stop active client connection."""
unload_ok = await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
if unload_ok:
client = entry.runtime_data
_LOGGER.debug("Stopping Hegel client for %s", entry.title)
try:
await client.stop()
except (HegelConnectionError, OSError) as err:
_LOGGER.warning("Error while stopping Hegel client: %s", err)
return unload_ok
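
A hedged sketch of how a platform picks up the client stored in runtime_data; HegelMediaPlayer is a hypothetical entity class, not part of this diff:

    from homeassistant.core import HomeAssistant
    from homeassistant.helpers.entity_platform import AddEntitiesCallback

    async def async_setup_entry(
        hass: HomeAssistant,
        entry: HegelConfigEntry,
        async_add_entities: AddEntitiesCallback,
    ) -> None:
        """Set up Hegel media player from a config entry."""
        client = entry.runtime_data  # HegelClient, typed via ConfigEntry[HegelClient]
        async_add_entities([HegelMediaPlayer(client, entry)])  # hypothetical entity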

View File

@@ -0,0 +1,154 @@
"""Config flow for Hegel integration."""
from __future__ import annotations
import logging
from typing import Any
from hegel_ip_client import HegelClient
from hegel_ip_client.exceptions import HegelConnectionError
import voluptuous as vol
from yarl import URL
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_HOST
from homeassistant.helpers.service_info.ssdp import SsdpServiceInfo
from .const import CONF_MODEL, DEFAULT_PORT, DOMAIN, MODEL_INPUTS
_LOGGER = logging.getLogger(__name__)
class HegelConfigFlow(ConfigFlow, domain=DOMAIN):
"""Config flow for Hegel amplifiers."""
VERSION = 1
def __init__(self) -> None:
"""Initialize the config flow."""
self._host: str | None = None
self._name: str | None = None
self._model: str | None = None
async def _async_try_connect(self, host: str) -> bool:
"""Try to connect to the Hegel amplifier using the library."""
client = HegelClient(host, DEFAULT_PORT)
try:
await client.start()
await client.ensure_connected(timeout=5.0)
except (HegelConnectionError, TimeoutError, OSError):
return False
else:
return True
finally:
await client.stop()
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle manual setup by the user."""
errors: dict[str, str] = {}
if user_input is not None:
host = user_input[CONF_HOST]
# Prevent duplicate entries by host
self._async_abort_entries_match({CONF_HOST: host})
if not await self._async_try_connect(host):
errors["base"] = "cannot_connect"
else:
return self.async_create_entry(
title=f"Hegel {user_input[CONF_MODEL]}",
data=user_input,
)
return self.async_show_form(
step_id="user",
data_schema=vol.Schema(
{
vol.Required(CONF_HOST): str,
vol.Required(CONF_MODEL): vol.In(list(MODEL_INPUTS.keys())),
}
),
errors=errors,
)
async def async_step_ssdp(
self, discovery_info: SsdpServiceInfo
) -> ConfigFlowResult:
"""Handle SSDP discovery."""
upnp = discovery_info.upnp or {}
# Get host from presentationURL or ssdp_location
url = upnp.get("presentationURL") or discovery_info.ssdp_location
if not url:
return self.async_abort(reason="no_host_found")
host = URL(url).host
if not host:
return self.async_abort(reason="no_host_found")
# Use UDN as unique id (device UUID)
unique_id = discovery_info.ssdp_udn
if not unique_id:
return self.async_abort(reason="no_host_found")
await self.async_set_unique_id(unique_id)
self._abort_if_unique_id_configured(updates={CONF_HOST: host})
# Test connection before showing confirmation
if not await self._async_try_connect(host):
return self.async_abort(reason="cannot_connect")
# Get device info
friendly_name = upnp.get("friendlyName", f"Hegel {host}")
suggested_model = upnp.get("modelName") or ""
model_default = next(
(m for m in MODEL_INPUTS if suggested_model.upper().startswith(m.upper())),
None,
)
self._host = host
self._name = friendly_name
self._model = model_default
self.context.update(
{
"title_placeholders": {"name": friendly_name},
}
)
return await self.async_step_discovery_confirm()
async def async_step_discovery_confirm(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle discovery confirmation - user can change model if needed."""
assert self._host is not None
assert self._name is not None
if user_input is not None:
return self.async_create_entry(
title=self._name,
data={
CONF_HOST: self._host,
CONF_MODEL: user_input[CONF_MODEL],
},
)
return self.async_show_form(
step_id="discovery_confirm",
data_schema=vol.Schema(
{
vol.Required(
CONF_MODEL,
default=self._model or list(MODEL_INPUTS.keys())[0],
): vol.In(list(MODEL_INPUTS.keys())),
}
),
description_placeholders={
"host": self._host,
"name": self._name,
},
)

View File

@@ -0,0 +1,92 @@
"""Constants for the Hegel integration."""
DOMAIN = "hegel"
DEFAULT_PORT = 50001
CONF_MODEL = "model"
CONF_MAX_VOLUME = "max_volume" # 1.0 means amp's internal max
HEARTBEAT_TIMEOUT_MINUTES = 3
MODEL_INPUTS = {
"Röst": [
"Balanced",
"Analog 1",
"Analog 2",
"Coaxial",
"Optical 1",
"Optical 2",
"Optical 3",
"USB",
"Network",
],
"H95": [
"Analog 1",
"Analog 2",
"Coaxial",
"Optical 1",
"Optical 2",
"Optical 3",
"USB",
"Network",
],
"H120": [
"Balanced",
"Analog 1",
"Analog 2",
"Coaxial",
"Optical 1",
"Optical 2",
"Optical 3",
"USB",
"Network",
],
"H190": [
"Balanced",
"Analog 1",
"Analog 2",
"Coaxial",
"Optical 1",
"Optical 2",
"Optical 3",
"USB",
"Network",
],
"H190V": [
"XLR",
"Analog 1",
"Analog 2",
"Coaxial",
"Optical 1",
"Optical 2",
"Optical 3",
"USB",
"Network",
"Phono",
],
"H390": [
"XLR",
"Analog 1",
"Analog 2",
"BNC",
"Coaxial",
"Optical 1",
"Optical 2",
"Optical 3",
"USB",
"Network",
],
"H590": [
"XLR 1",
"XLR 2",
"Analog 1",
"Analog 2",
"BNC",
"Coaxial",
"Optical 1",
"Optical 2",
"Optical 3",
"USB",
"Network",
],
}
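
A short, hedged sketch of how these per-model lists would typically be consumed, for example as the media player's source list; the 1-based input index mapping is an assumption, not something this diff establishes:

    model = entry.data[CONF_MODEL]      # chosen in the config flow
    source_list = MODEL_INPUTS[model]   # e.g. the "H390" list above
    # Assumption: the amplifier addresses inputs by 1-based position in this list.
    input_number = source_list.index("USB") + 1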
