Compare commits

...

98 Commits

Author SHA1 Message Date
Paulus Schoutsen 435f278053 Merge pull request #59397 from home-assistant/rc 2021-11-08 21:44:29 -08:00
Paulus Schoutsen 4d62d41cc1 Bumped version to 2021.11.2 2021-11-08 20:48:00 -08:00
Bram Kragten a6d795fce1 Update frontend to 20211108.0 (#59364) 2021-11-08 20:47:52 -08:00
Maikel Punie 0f4a35dd28 Bump velbusaio to 2021.11.6 (#59353) 2021-11-08 20:47:51 -08:00
Erik Montnemery 6d3e380f64 Bump paho-mqtt to 1.6.1 (#59339) 2021-11-08 20:47:51 -08:00
Kevin Hellemun 0873c3e92b Support generic xiaomi_miio vacuums (#59317)
* Support generic xiaomi_miio vacuums

Signed-off-by: Kevin Hellemun <17928966+OGKevin@users.noreply.github.com>

* Fix lint

Signed-off-by: Kevin Hellemun <17928966+OGKevin@users.noreply.github.com>

* Remove warning log

Signed-off-by: Kevin Hellemun <17928966+OGKevin@users.noreply.github.com>
2021-11-08 20:47:50 -08:00
Shay Levy 250160f007 Revert "Use DeviceInfo in shelly (#58520)" (#59315)
This reverts commit df6351f86b.
2021-11-08 20:47:49 -08:00
Maciej Bieniek e1b8e2ded3 Remove illuminance sensor (#59305) 2021-11-08 20:47:48 -08:00
Simone Chemelli 8b7686f4f2 Fix condition for fritz integration (#59281) 2021-11-08 20:47:48 -08:00
Alexei Chetroi f9fc92c36b Add Battery sensor regardless of whether the battery_percent_remaining attribute is supported (#59264) 2021-11-08 20:47:47 -08:00
Michael a4253ff54e Increase timeout for fetching camera data on Synology DSM (#59237) 2021-11-08 20:47:46 -08:00
jan iversen dcada92cef Fix tradfri group reachable access (#59217) 2021-11-08 20:47:45 -08:00
J. Nick Koston a6ff89c3e6 Bump flux_led to 0.24.17 (#59211)
* Bump flux_led to 0.24.16

- Changes: https://github.com/Danielhiversen/flux_led/compare/0.24.15...0.24.16

- Fixes turning on/off when device is out of sync internally (seen on 0x33 firmware 8)

- Fixes #59190

* Bump to .17 to fix typing
2021-11-08 20:47:45 -08:00
Michael f5d04de523 bump aioshelly to 1.0.4 (#59209) 2021-11-08 20:47:44 -08:00
Aaron Bach 1cc8e688c3 Change ReCollect Waste device class to date (#59180) 2021-11-08 20:47:43 -08:00
Aaron Bach f47e64e218 Guard against missing data in ReCollect Waste (#59177) 2021-11-08 20:47:42 -08:00
Aaron Bach 3d8ca26c00 Guard against flaky SimpliSafe API calls (#59175) 2021-11-08 20:47:42 -08:00
Aaron Bach e233730494 Bump aioguardian to 2021.11.0 (#59161) 2021-11-08 20:47:41 -08:00
J. Nick Koston 2309dd48c9 Bump flux_led to 0.24.15 (#59159)
- Changes: https://github.com/Danielhiversen/flux_led/compare/0.24.14...0.24.15

- Fixes color reporting for addressable devices
2021-11-08 20:47:40 -08:00
uvjustin 96c08df883 Adjust frag_duration setting in stream (#59135) 2021-11-08 20:47:39 -08:00
Austin Mroczek c150a296d2 Bump total_connect_client to 2021.11.2 (#58818)
* update total_connect_client to 2021.10

* update for total_connect_client changes

* remove unused return value

* bump total_connect_client to 2021.11.1

* bump total_connect_client to 2021.11.2

* Move to public ResultCode

* load locations to prevent 'unknown error occurred'

* add test for zero locations

* Revert "load locations to prevent 'unknown error occurred'"

This reverts commit 28b8984be5b1c8839fc8077d8d59bdba97eacc38.

* Revert "add test for zero locations"

This reverts commit 77bf7908d508d539d6165fc986930b041b13ca97.
2021-11-08 20:47:39 -08:00
Paulus Schoutsen 2c21f0ad18 Merge pull request #59129 from home-assistant/rc 2021-11-04 21:57:54 -07:00
Paulus Schoutsen 189677c713 Bumped version to 2021.11.1 2021-11-04 20:14:07 -07:00
J. Nick Koston 039e361bff Bump flux_led to 0.24.14 (#59121) 2021-11-04 20:13:54 -07:00
Erik Montnemery 61918e0e44 Correct rescheduling of ExternalStatisticsTask (#59076) 2021-11-04 20:13:53 -07:00
Erik Montnemery d9d8b538b0 Change minimum supported SQLite version to 3.31.0 (#59073) 2021-11-04 20:13:52 -07:00
Teemu R c3882d0782 Remove use_time sensor from mjjsq humidifiers (#59066) 2021-11-04 20:13:51 -07:00
Franck Nijhof c6d651e283 Increase time to authorize OctoPrint (#59051) 2021-11-04 20:13:51 -07:00
Erik Montnemery 543381b6f2 Correct migration to recorder schema 22 (#59048) 2021-11-04 20:13:50 -07:00
Franck Nijhof 433743b0d1 Constrain urllib3 to >=1.26.5 (#59043) 2021-11-04 20:13:49 -07:00
Maikel Punie 58d88c8371 Bump velbus-aio to 2021.11.0 (#59040) 2021-11-04 20:13:49 -07:00
Glenn Waters 6e08cb815b Environment Canada config_flow fix (#59029) 2021-11-04 20:13:48 -07:00
ollo69 b125e2c425 Fix Nut resources option migration (#59020)
Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2021-11-04 20:13:47 -07:00
Teemu R c4aa6af953 Accept all roborock vacuum models for xiaomi_miio (#59018) 2021-11-04 20:13:46 -07:00
Eugenio Panadero dcf6004166 Bump aiopvpc to 2.2.1 (#59008)
The breakage was caused by a config change in the ESIOS API server;
it is fixed by a version patch in aiopvpc
(details in https://github.com/azogue/aiopvpc/pull/28)
2021-11-04 20:13:45 -07:00
Teemu R af28d927b4 Fix timedelta-based sensors for xiaomi_miio (#58995) 2021-11-04 20:13:44 -07:00
Kevin Hellemun 5e6cac3834 Fix mop attribute for unified mop and water box in Xiaomi Miio (#58990)
Co-authored-by: Teemu R. <tpr@iki.fi>
Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2021-11-04 20:13:43 -07:00
Thomas G 397f303d6d Swap sharkiq vacuum is_docked with is_charging (#58975) 2021-11-04 20:13:43 -07:00
Franck Nijhof 85a4ee68e3 Merge pull request #58994 from home-assistant/rc 2021-11-03 16:31:23 +01:00
Franck Nijhof e3c021a910 Bumped version to 2021.11.0 2021-11-03 15:03:43 +01:00
Bram Kragten 5568121251 Update frontend to 20211103.0 (#58988) 2021-11-03 15:02:39 +01:00
Sergio Gutierrez Alvarez 7afb38ff96 Fix battery_is_charging sensor on system bridge (#58980) 2021-11-03 15:02:35 +01:00
Daniel Hjelseth Høyer 1a08da7856 Bump pyMill to 0.7.4 (#58977) 2021-11-03 12:31:22 +01:00
Hans Oischinger ded0785700 Fix broken ViCare burner & compressor sensors (#58962) 2021-11-03 10:50:09 +01:00
Robert Hillis 4163ba5dbf Add missing ZMW currency (#58959) 2021-11-03 10:50:06 +01:00
Dave T dff98b024c Aurora abb defer unique_id assignment during yaml import (#58887)
* Defer unique_id assignment during yaml import if dark

* Back out variable name change to simplify.

* Allow config flow yaml setup deferral.

* Fix deferred yaml import

* Code review: only wrap necessary lines in try blk

* Code review: catch possible duplicate unique_id

* Simplify assignment.

* Code review: use timedelta to retry yaml import

* Code review: if a different error occurs, raise it

* Remove current config entry if duplicate unique_id

* Code review: remove unnecessary line.

* Code review: revert change, leave to other PR.

* Code review: remove unnecessary patch & min->sec

* Remove unnecessary else after raise.

* Increase test coverage.

* Check the number of config entries at each stage

* Raise ConfigEntryNotReady when connection fails.

* Log & return false for error on yaml import
2021-11-03 10:50:01 +01:00
Dave T 0a27b0f353 Aurora abb energy metering (#58454)
Co-authored-by: J. Nick Koston <nick@koston.org>
2021-11-03 10:49:58 +01:00
kodsnutten ae99b678dd Fix unique_id of derived sent-sensors (#58298) 2021-11-03 10:49:54 +01:00
Franck Nijhof e43cb82f29 Merge branch 'master' into rc 2021-11-03 10:22:39 +01:00
Paulus Schoutsen 608b89a6ad Bumped version to 2021.11.0b5 2021-11-02 11:28:43 -07:00
Ernst Klamer a897dfa5b7 Add device configuration URL to Solar-Log (#58954) 2021-11-02 11:28:39 -07:00
Franck Nijhof f8290ed026 Add support for IoT Switches (tdq) in Tuya (#58952) 2021-11-02 11:28:39 -07:00
Franck Nijhof 44334ea4da Extend Tuya Dimmer (tgq) support (#58951) 2021-11-02 11:28:38 -07:00
Erik Montnemery e4143142bf Revert "Add offset support to time trigger" (#58947) 2021-11-02 11:28:37 -07:00
uvjustin d4ba9a137c Add libav.mpegts to logging filter (#58937) 2021-11-02 11:28:36 -07:00
J. Nick Koston 6cd256f26b Fix recursive limit in find_next_time_expression_time (#58914)
* Fix recursive limit in find_next_time_expression_time

* Add test case

* Update test_event.py

Co-authored-by: Erik Montnemery <erik@montnemery.com>
2021-11-02 11:28:35 -07:00
Maciej Bieniek 53cc9f35b9 Add configuration_url to Airly integration (#58911) 2021-11-02 11:28:34 -07:00
Tom Harris 26e925d885 Bump pyinsteon to 1.0.13 (#58908) 2021-11-02 11:28:34 -07:00
Kevin Hellemun 5e09685700 Add ROCKROBO_S6_PURE to supported vacuums for xiaomi_miio (#58901) 2021-11-02 11:28:33 -07:00
Franck Nijhof c97160bf97 Fix incorrect entity category in Advantage Air (#58754) 2021-11-02 11:28:32 -07:00
Peter A. Bigot 34953c4c08 Fix color temp selection when brightness changed in Tuya light (#58341)
Co-authored-by: Franck Nijhof <frenck@frenck.nl>
Co-authored-by: Franck Nijhof <git@frenck.dev>
2021-11-02 11:28:31 -07:00
Paulus Schoutsen 632164f283 Bumped version to 2021.11.0b4 2021-11-01 10:56:45 -07:00
Otto Winter b4021de2b0 Fix find_next_time_expression_time (#58894)
* Better tests

* Fix find_next_time_expression_time

* Add tests for Nov 7th 2021, Chicago transition

* Update event tests

* Update test_event.py

* small performance improvement

Co-authored-by: J. Nick Koston <nick@koston.org>
Co-authored-by: Erik Montnemery <erik@montnemery.com>
2021-11-01 10:56:35 -07:00
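The find_next_time_expression_time fixes above deal with the US DST fall-back on Nov 7th 2021 in America/Chicago, where one wall-clock hour occurs twice. This is a standalone illustration using Python's stdlib zoneinfo and PEP 495 fold handling, not Home Assistant code:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

tz = ZoneInfo("America/Chicago")
# 2021-11-07 01:30 wall time occurs twice: first in CDT (fold=0),
# then again in CST (fold=1) after clocks fall back.
first = datetime(2021, 11, 7, 1, 30, tzinfo=tz)
second = datetime(2021, 11, 7, 1, 30, fold=1, tzinfo=tz)
assert first.utcoffset() == timedelta(hours=-5)   # CDT
assert second.utcoffset() == timedelta(hours=-6)  # CST
# The two identical wall-clock times are one real hour apart,
# which naive "next matching time" scheduling can get wrong.
assert second.timestamp() - first.timestamp() == 3600.0
```

A scheduler that compares only wall-clock components will pick the same instant for both, which is the class of bug the fix and its tests target.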
Maciej Bieniek cfa4f24395 Handle None values in Xiaomi Miio integration (#58880)
* Initial commit

* Improve _handle_coordinator_update()

* Fix entity_description define

* Improve sensor & binary_sensor platforms

* Log None value

* Use coordinator variable

* Improve log strings

* Filter attributes with None values

* Add hasattr condition

* Update homeassistant/components/xiaomi_miio/sensor.py

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2021-11-01 10:56:35 -07:00
purcell-lab 77c25aa141 Fix renamed solaredge sensor keys (#58875) 2021-11-01 10:56:34 -07:00
Kevin Hellemun 7a0443e2a6 Add ROCKROBO_S4_MAX to supported xiaomi vacuums (#58826) 2021-11-01 10:56:33 -07:00
Marc Hörsken 6e9d759798 Fix OpenWeatherMap options not being initialized the first time (#58736) 2021-11-01 10:56:33 -07:00
Paulus Schoutsen 82b6bbda76 Merge pull request #58905 from home-assistant/2021.10.7 2021-11-01 10:51:04 -07:00
Paulus Schoutsen ad55af4f67 Bumped version to 2021.10.7 2021-11-01 10:01:08 -07:00
Otto Winter 5295ffd6f1 Fix find_next_time_expression_time (#58894)
* Better tests

* Fix find_next_time_expression_time

* Add tests for Nov 7th 2021, Chicago transition

* Update event tests

* Update test_event.py

* small performance improvement

Co-authored-by: J. Nick Koston <nick@koston.org>
Co-authored-by: Erik Montnemery <erik@montnemery.com>
2021-11-01 10:01:01 -07:00
Simone Chemelli 96d1810019 Abort Fritz config flow for configured hostnames (#58140)
* Abort Fritz config flow for configured hostnames

* Fix tests + consider all combinations

* Fix async context
2021-11-01 09:59:18 -07:00
Brandon Rothweiler fe5b9c75b3 Bump pymazda to 0.2.2 (#58113) 2021-11-01 09:59:17 -07:00
Erik Montnemery 97ba368950 Fix template sensor when name template doesn't render (#58088) 2021-11-01 09:59:17 -07:00
Michael Chisholm 698ceda7c5 Sleep between device requests to detect socket closes (#58087) 2021-11-01 09:59:16 -07:00
micha91 a3c0f7b167 Fix Yamaha MusicCast media_stop (#58024) 2021-11-01 09:59:15 -07:00
Andrey Kupreychik ae463cb210 Abort keenetic SSDP discovery if the unique id is already setup or ignored (#58009) 2021-11-01 09:59:14 -07:00
starkillerOG 387413b5f5 Fix netgear NoneType and discovery (#57904) 2021-11-01 09:59:13 -07:00
Joakim Sørensen 2de74c86e3 Fix Tuya documentation URL (#57889) 2021-11-01 09:59:13 -07:00
Paulus Schoutsen 5ad1ec611d Bumped version to 2021.11.0b3 2021-10-31 20:24:09 -07:00
Maciej Bieniek 375e9fffd1 Add configuration_url to GIOS integration (#58840) 2021-10-31 20:24:03 -07:00
uvjustin 868fbc063d Improve part metadata in stream (#58822) 2021-10-31 20:24:02 -07:00
Robert Hillis 68b0413c98 Bump pyefergy to 0.1.3 (#58821) 2021-10-31 20:24:01 -07:00
Franck Nijhof 6908fa6127 Fix Plugwise not updating config entry with discovery information (#58819) 2021-10-31 20:24:00 -07:00
Kevin Hellemun a0fba15267 Add ROCKROBO_E2 to supported vacuums for xiaomi_miio (#58817)
https://github.com/rytilahti/python-miio/blob/e1adea55f3be237f6e6904210b6f7b52162bf154/miio/vacuum.py#L129
2021-10-31 20:24:00 -07:00
Michael Chisholm 7fae711e0c dlna_dmr: less eager discovery (#58780) 2021-10-31 20:23:59 -07:00
J. Nick Koston e031917a30 Workaround brightness transition delay from off in older yeelight models (#58774) 2021-10-31 20:23:58 -07:00
purcell-lab 184342804e Fix solaredge energy sensor names (#58773) 2021-10-31 20:23:58 -07:00
Michael 2cc3290794 Fix channel.send in Discord (#58756) 2021-10-31 20:23:57 -07:00
J. Nick Koston 0f367722ed Bump zeroconf 0.36.11 (#58755) 2021-10-31 20:23:56 -07:00
Franck Nijhof 9b715383c3 Add configuration_url to OctoPrint (#58753)
* Add configuration_url to Octoprint

* fix device_info() return

Co-authored-by: Michael <35783820+mib1185@users.noreply.github.com>
2021-10-31 20:23:56 -07:00
Kapernicus 8800ceba4d Bump nad_receiver to version 0.3.0 (#58751) 2021-10-31 20:23:55 -07:00
J. Nick Koston 2c509bfc06 Add additional test coverage for RYSE smartbridges with HK (#58746) 2021-10-31 20:23:55 -07:00
Anders Liljekvist aae8c2f5dd Fix bluesound player internally used id (#58732) 2021-10-31 20:23:54 -07:00
Tobias Sauerwein 73dfa2d205 Set Netatmo max default temperature (#58718) 2021-10-31 20:23:53 -07:00
Kevin Hellemun b6d2a7a562 Add ROCKROBO_S4 to xiaomi_miio vacuum models (#58682) 2021-10-31 20:23:53 -07:00
Erik Montnemery 8c2af76a51 Coerce to tuple before asserting the sequence (#58672) 2021-10-31 20:23:52 -07:00
Paulus Schoutsen 4086a40c05 Mobile app to update entity registry on re-register sensors (#58378)
Co-authored-by: J. Nick Koston <nick@koston.org>
2021-10-31 20:23:51 -07:00
Michael 2ea90b803c Add configuration url to AVM Fritz!Smarthome (#57711)
* add configuration url

* extend data update coordinator

* improve exception handling during data update

* store coordinator after first refresh

* fix light init
2021-10-31 20:23:51 -07:00
117 changed files with 3102 additions and 967 deletions
@@ -5,7 +5,7 @@ from homeassistant.components.binary_sensor import (
DEVICE_CLASS_PROBLEM,
BinarySensorEntity,
)
from homeassistant.const import ENTITY_CATEGORY_CONFIG, ENTITY_CATEGORY_DIAGNOSTIC
from homeassistant.const import ENTITY_CATEGORY_DIAGNOSTIC
from .const import DOMAIN as ADVANTAGE_AIR_DOMAIN
from .entity import AdvantageAirEntity
@@ -74,7 +74,7 @@ class AdvantageAirZoneMyZone(AdvantageAirEntity, BinarySensorEntity):
"""Advantage Air Zone MyZone."""
_attr_entity_registry_enabled_default = False
_attr_entity_category = ENTITY_CATEGORY_CONFIG
_attr_entity_category = ENTITY_CATEGORY_DIAGNOSTIC
def __init__(self, instance, ac_key, zone_key):
"""Initialize an Advantage Air Zone MyZone."""
@@ -6,12 +6,7 @@ from homeassistant.components.sensor import (
STATE_CLASS_MEASUREMENT,
SensorEntity,
)
from homeassistant.const import (
ENTITY_CATEGORY_CONFIG,
ENTITY_CATEGORY_DIAGNOSTIC,
PERCENTAGE,
TEMP_CELSIUS,
)
from homeassistant.const import ENTITY_CATEGORY_DIAGNOSTIC, PERCENTAGE, TEMP_CELSIUS
from homeassistant.helpers import config_validation as cv, entity_platform
from .const import ADVANTAGE_AIR_STATE_OPEN, DOMAIN as ADVANTAGE_AIR_DOMAIN
@@ -55,7 +50,7 @@ class AdvantageAirTimeTo(AdvantageAirEntity, SensorEntity):
"""Representation of Advantage Air timer control."""
_attr_native_unit_of_measurement = ADVANTAGE_AIR_SET_COUNTDOWN_UNIT
_attr_entity_category = ENTITY_CATEGORY_CONFIG
_attr_entity_category = ENTITY_CATEGORY_DIAGNOSTIC
def __init__(self, instance, ac_key, action):
"""Initialize the Advantage Air timer control."""
@@ -32,3 +32,4 @@ MANUFACTURER: Final = "Airly sp. z o.o."
MAX_UPDATE_INTERVAL: Final = 90
MIN_UPDATE_INTERVAL: Final = 5
NO_AIRLY_SENSORS: Final = "There are no Airly sensors in this area yet."
URL = "https://airly.org/map/#{latitude},{longitude}"
@@ -55,6 +55,7 @@ from .const import (
MANUFACTURER,
SUFFIX_LIMIT,
SUFFIX_PERCENT,
URL,
)
PARALLEL_UPDATES = 1
@@ -157,6 +158,9 @@ class AirlySensor(CoordinatorEntity, SensorEntity):
identifiers={(DOMAIN, f"{coordinator.latitude}-{coordinator.longitude}")},
manufacturer=MANUFACTURER,
name=DEFAULT_NAME,
configuration_url=URL.format(
latitude=coordinator.latitude, longitude=coordinator.longitude
),
)
self._attr_name = f"{name} {description.name}"
self._attr_unique_id = (
@@ -10,13 +10,15 @@
import logging
from aurorapy.client import AuroraSerialClient
from aurorapy.client import AuroraError, AuroraSerialClient
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_ADDRESS, CONF_PORT
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryNotReady
from .const import DOMAIN
from .config_flow import validate_and_connect
from .const import ATTR_SERIAL_NUMBER, DOMAIN
PLATFORMS = ["sensor"]
@@ -29,9 +31,43 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry):
comport = entry.data[CONF_PORT]
address = entry.data[CONF_ADDRESS]
serclient = AuroraSerialClient(address, comport, parity="N", timeout=1)
# To handle yaml import attempts in darkeness, (re)try connecting only if
# unique_id not yet assigned.
if entry.unique_id is None:
try:
res = await hass.async_add_executor_job(
validate_and_connect, hass, entry.data
)
except AuroraError as error:
if "No response after" in str(error):
raise ConfigEntryNotReady("No response (could be dark)") from error
_LOGGER.error("Failed to connect to inverter: %s", error)
return False
except OSError as error:
if error.errno == 19: # No such device.
_LOGGER.error("Failed to connect to inverter: no such COM port")
return False
_LOGGER.error("Failed to connect to inverter: %s", error)
return False
else:
# If we got here, the device is now communicating (maybe after
# being in darkness). But there's a small risk that the user has
# configured via the UI since we last attempted the yaml setup,
# which means we'd get a duplicate unique ID.
new_id = res[ATTR_SERIAL_NUMBER]
# Check if this unique_id has already been used
for existing_entry in hass.config_entries.async_entries(DOMAIN):
if existing_entry.unique_id == new_id:
_LOGGER.debug(
"Remove already configured config entry for id %s", new_id
)
hass.async_create_task(
hass.config_entries.async_remove(entry.entry_id)
)
return False
hass.config_entries.async_update_entry(entry, unique_id=new_id)
hass.data.setdefault(DOMAIN, {})[entry.unique_id] = serclient
hass.config_entries.async_setup_platforms(entry, PLATFORMS)
return True
@@ -1,4 +1,6 @@
"""Top level class for AuroraABBPowerOneSolarPV inverters and sensors."""
from __future__ import annotations
import logging
from aurorapy.client import AuroraSerialClient
@@ -29,10 +31,12 @@ class AuroraDevice(Entity):
self._available = True
@property
def unique_id(self) -> str:
def unique_id(self) -> str | None:
"""Return the unique id for this device."""
serial = self._data[ATTR_SERIAL_NUMBER]
return f"{serial}_{self.type}"
serial = self._data.get(ATTR_SERIAL_NUMBER)
if serial is None:
return None
return f"{serial}_{self.entity_description.key}"
@property
def available(self) -> bool:
@@ -81,16 +81,10 @@ class AuroraABBConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
return self.async_abort(reason="already_setup")
conf = {}
conf[ATTR_SERIAL_NUMBER] = "sn_unknown_yaml"
conf[ATTR_MODEL] = "model_unknown_yaml"
conf[ATTR_FIRMWARE] = "fw_unknown_yaml"
conf[CONF_PORT] = config["device"]
conf[CONF_ADDRESS] = config["address"]
# config["name"] from yaml is ignored.
await self.async_set_unique_id(self.flow_id)
self._abort_if_unique_id_configured()
return self.async_create_entry(title=DEFAULT_INTEGRATION_TITLE, data=conf)
async def async_step_user(self, user_input=None):
@@ -1,6 +1,9 @@
"""Support for Aurora ABB PowerOne Solar Photvoltaic (PV) inverter."""
from __future__ import annotations
from collections.abc import Mapping
import logging
from typing import Any
from aurorapy.client import AuroraError, AuroraSerialClient
import voluptuous as vol
@@ -8,19 +11,22 @@ import voluptuous as vol
from homeassistant.components.sensor import (
PLATFORM_SCHEMA,
STATE_CLASS_MEASUREMENT,
STATE_CLASS_TOTAL_INCREASING,
SensorEntity,
SensorEntityDescription,
)
from homeassistant.config_entries import SOURCE_IMPORT
from homeassistant.const import (
CONF_ADDRESS,
CONF_DEVICE,
CONF_NAME,
DEVICE_CLASS_ENERGY,
DEVICE_CLASS_POWER,
DEVICE_CLASS_TEMPERATURE,
ENERGY_KILO_WATT_HOUR,
POWER_WATT,
TEMP_CELSIUS,
)
from homeassistant.exceptions import InvalidStateError
import homeassistant.helpers.config_validation as cv
from .aurora_device import AuroraDevice
@@ -28,6 +34,29 @@ from .const import DEFAULT_ADDRESS, DOMAIN
_LOGGER = logging.getLogger(__name__)
SENSOR_TYPES = [
SensorEntityDescription(
key="instantaneouspower",
device_class=DEVICE_CLASS_POWER,
native_unit_of_measurement=POWER_WATT,
state_class=STATE_CLASS_MEASUREMENT,
name="Power Output",
),
SensorEntityDescription(
key="temp",
device_class=DEVICE_CLASS_TEMPERATURE,
native_unit_of_measurement=TEMP_CELSIUS,
state_class=STATE_CLASS_MEASUREMENT,
name="Temperature",
),
SensorEntityDescription(
key="totalenergy",
device_class=DEVICE_CLASS_ENERGY,
native_unit_of_measurement=ENERGY_KILO_WATT_HOUR,
state_class=STATE_CLASS_TOTAL_INCREASING,
name="Total Energy",
),
]
PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(
{
@@ -55,15 +84,11 @@ async def async_setup_entry(hass, config_entry, async_add_entities) -> None:
"""Set up aurora_abb_powerone sensor based on a config entry."""
entities = []
sensortypes = [
{"parameter": "instantaneouspower", "name": "Power Output"},
{"parameter": "temperature", "name": "Temperature"},
]
client = hass.data[DOMAIN][config_entry.unique_id]
data = config_entry.data
for sens in sensortypes:
entities.append(AuroraSensor(client, data, sens["name"], sens["parameter"]))
for sens in SENSOR_TYPES:
entities.append(AuroraSensor(client, data, sens))
_LOGGER.debug("async_setup_entry adding %d entities", len(entities))
async_add_entities(entities, True)
@@ -72,22 +97,15 @@ async def async_setup_entry(hass, config_entry, async_add_entities) -> None:
class AuroraSensor(AuroraDevice, SensorEntity):
"""Representation of a Sensor on a Aurora ABB PowerOne Solar inverter."""
_attr_state_class = STATE_CLASS_MEASUREMENT
def __init__(self, client: AuroraSerialClient, data, name, typename):
def __init__(
self,
client: AuroraSerialClient,
data: Mapping[str, Any],
entity_description: SensorEntityDescription,
) -> None:
"""Initialize the sensor."""
super().__init__(client, data)
if typename == "instantaneouspower":
self.type = typename
self._attr_native_unit_of_measurement = POWER_WATT
self._attr_device_class = DEVICE_CLASS_POWER
elif typename == "temperature":
self.type = typename
self._attr_native_unit_of_measurement = TEMP_CELSIUS
self._attr_device_class = DEVICE_CLASS_TEMPERATURE
else:
raise InvalidStateError(f"Unrecognised typename '{typename}'")
self._attr_name = f"{name}"
self.entity_description = entity_description
self.availableprev = True
def update(self):
@@ -98,13 +116,16 @@ class AuroraSensor(AuroraDevice, SensorEntity):
try:
self.availableprev = self._attr_available
self.client.connect()
if self.type == "instantaneouspower":
if self.entity_description.key == "instantaneouspower":
# read ADC channel 3 (grid power output)
power_watts = self.client.measure(3, True)
self._attr_native_value = round(power_watts, 1)
elif self.type == "temperature":
elif self.entity_description.key == "temp":
temperature_c = self.client.measure(21)
self._attr_native_value = round(temperature_c, 1)
elif self.entity_description.key == "totalenergy":
energy_wh = self.client.cumulated_energy(5)
self._attr_native_value = round(energy_wh / 1000, 2)
self._attr_available = True
except AuroraError as error:
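The Aurora hunk above migrates hard-coded sensor branches to the SensorEntityDescription pattern and dispatches on the description key, including the new Wh-to-kWh conversion. A minimal stand-alone sketch of that dispatch, with the serial client replaced by a plain dict of readings (the names here are illustrative, not the HA API):

```python
from dataclasses import dataclass

@dataclass
class SensorEntityDescription:
    key: str
    name: str
    native_unit_of_measurement: str

# Stub readings in place of AuroraSerialClient.measure()/cumulated_energy()
READINGS = {"instantaneouspower": 1234.56, "temp": 41.237, "totalenergy": 8123.0}

def read_value(description, readings):
    """Return the rounded native value for a given entity description."""
    if description.key == "instantaneouspower":
        return round(readings["instantaneouspower"], 1)
    if description.key == "temp":
        return round(readings["temp"], 1)
    if description.key == "totalenergy":
        return round(readings["totalenergy"] / 1000, 2)  # Wh -> kWh
    raise ValueError(f"unknown key {description.key}")

desc = SensorEntityDescription("totalenergy", "Total Energy", "kWh")
assert read_value(desc, READINGS) == 8.12
```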
@@ -106,8 +106,6 @@ SERVICE_TO_METHOD = {
def _add_player(hass, async_add_entities, host, port=None, name=None):
"""Add Bluesound players."""
if host in [x.host for x in hass.data[DATA_BLUESOUND]]:
return
@callback
def _init_player(event=None):
@@ -127,6 +125,11 @@ def _add_player(hass, async_add_entities, host, port=None, name=None):
@callback
def _add_player_cb():
"""Add player after first sync fetch."""
if player.id in [x.id for x in hass.data[DATA_BLUESOUND]]:
_LOGGER.warning("Player already added %s", player.id)
return
hass.data[DATA_BLUESOUND].append(player)
async_add_entities([player])
_LOGGER.info("Added device with name: %s", player.name)
@@ -138,7 +141,6 @@ def _add_player(hass, async_add_entities, host, port=None, name=None):
hass.bus.async_listen_once(EVENT_HOMEASSISTANT_STOP, _stop_polling)
player = BluesoundPlayer(hass, host, port, name, _add_player_cb)
hass.data[DATA_BLUESOUND].append(player)
if hass.is_running:
_init_player()
@@ -208,6 +210,7 @@ class BluesoundPlayer(MediaPlayerEntity):
self._polling_session = async_get_clientsession(hass)
self._polling_task = None # The actual polling task.
self._name = name
self._id = None
self._icon = None
self._capture_items = []
self._services_items = []
@@ -225,6 +228,7 @@ class BluesoundPlayer(MediaPlayerEntity):
self._bluesound_device_name = None
self._init_callback = init_callback
if self.port is None:
self.port = DEFAULT_PORT
@@ -251,6 +255,8 @@ class BluesoundPlayer(MediaPlayerEntity):
if not self._name:
self._name = self._sync_status.get("@name", self.host)
if not self._id:
self._id = self._sync_status.get("@id", None)
if not self._bluesound_device_name:
self._bluesound_device_name = self._sync_status.get("@name", self.host)
if not self._icon:
@@ -259,17 +265,19 @@ class BluesoundPlayer(MediaPlayerEntity):
if (master := self._sync_status.get("master")) is not None:
self._is_master = False
master_host = master.get("#text")
master_port = master.get("@port", "11000")
master_id = f"{master_host}:{master_port}"
master_device = [
device
for device in self._hass.data[DATA_BLUESOUND]
if device.host == master_host
if device.id == master_id
]
if master_device and master_host != self.host:
if master_device and master_id != self.id:
self._master = master_device[0]
else:
self._master = None
_LOGGER.error("Master not found %s", master_host)
_LOGGER.error("Master not found %s", master_id)
else:
if self._master is not None:
self._master = None
@@ -287,14 +295,14 @@ class BluesoundPlayer(MediaPlayerEntity):
await self.async_update_status()
except (asyncio.TimeoutError, ClientError, BluesoundPlayer._TimeoutException):
_LOGGER.info("Node %s is offline, retrying later", self._name)
_LOGGER.info("Node %s:%s is offline, retrying later", self.name, self.port)
await asyncio.sleep(NODE_OFFLINE_CHECK_TIMEOUT)
self.start_polling()
except CancelledError:
_LOGGER.debug("Stopping the polling of node %s", self._name)
_LOGGER.debug("Stopping the polling of node %s:%s", self.name, self.port)
except Exception:
_LOGGER.exception("Unexpected error in %s", self._name)
_LOGGER.exception("Unexpected error in %s:%s", self.name, self.port)
raise
def start_polling(self):
@@ -314,12 +322,14 @@ class BluesoundPlayer(MediaPlayerEntity):
await self.force_update_sync_status(self._init_callback, True)
except (asyncio.TimeoutError, ClientError):
_LOGGER.info("Node %s is offline, retrying later", self.host)
_LOGGER.info("Node %s:%s is offline, retrying later", self.host, self.port)
self._retry_remove = async_track_time_interval(
self._hass, self.async_init, NODE_RETRY_INITIATION
)
except Exception:
_LOGGER.exception("Unexpected when initiating error in %s", self.host)
_LOGGER.exception(
"Unexpected when initiating error in %s:%s", self.host, self.port
)
raise
async def async_update(self):
@@ -366,9 +376,9 @@ class BluesoundPlayer(MediaPlayerEntity):
except (asyncio.TimeoutError, aiohttp.ClientError):
if raise_timeout:
_LOGGER.info("Timeout: %s", self.host)
_LOGGER.info("Timeout: %s:%s", self.host, self.port)
raise
_LOGGER.debug("Failed communicating: %s", self.host)
_LOGGER.debug("Failed communicating: %s:%s", self.host, self.port)
return None
return data
@@ -403,7 +413,7 @@ class BluesoundPlayer(MediaPlayerEntity):
group_name = self._status.get("groupName")
if group_name != self._group_name:
_LOGGER.debug("Group name change detected on device: %s", self.host)
_LOGGER.debug("Group name change detected on device: %s", self.id)
self._group_name = group_name
# rebuild ordered list of entity_ids that are in the group, master is first
@@ -659,6 +669,11 @@ class BluesoundPlayer(MediaPlayerEntity):
mute = bool(int(mute))
return mute
@property
def id(self):
"""Get id of device."""
return self._id
@property
def name(self):
"""Return the name of the device."""
@@ -831,8 +846,8 @@ class BluesoundPlayer(MediaPlayerEntity):
if master_device:
_LOGGER.debug(
"Trying to join player: %s to master: %s",
self.host,
master_device[0].host,
self.id,
master_device[0].id,
)
await master_device[0].async_add_slave(self)
@@ -877,7 +892,7 @@ class BluesoundPlayer(MediaPlayerEntity):
if self._master is None:
return
_LOGGER.debug("Trying to unjoin player: %s", self.host)
_LOGGER.debug("Trying to unjoin player: %s", self.id)
await self._master.async_remove_slave(self)
async def async_add_slave(self, slave_device):
@@ -896,7 +911,7 @@ class BluesoundPlayer(MediaPlayerEntity):
"""Increase sleep time on player."""
sleep_time = await self.send_bluesound_command("/Sleep")
if sleep_time is None:
_LOGGER.error("Error while increasing sleep time on player: %s", self.host)
_LOGGER.error("Error while increasing sleep time on player: %s", self.id)
return 0
return int(sleep_time.get("sleep", "0"))
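The Bluesound hunk above (#58732) replaces host-only matching with a "host:port" id, so two players behind the same host are no longer conflated. The core idea in isolation, with an assumed default port of 11000 as in the diff:

```python
# Identify players by "host:port" rather than host alone.
def player_id(host, port="11000"):
    return f"{host}:{port}"

# Two players on one host (different ports) now get distinct ids.
players = {player_id("10.0.0.5", "11000"), player_id("10.0.0.5", "11010")}
assert len(players) == 2
```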
@@ -94,9 +94,9 @@ class DiscordNotificationService(BaseNotificationService):
for channelid in kwargs[ATTR_TARGET]:
channelid = int(channelid)
try:
channel = discord_bot.fetch_channel(
channel = await discord_bot.fetch_channel(
channelid
) or discord_bot.fetch_user(channelid)
) or await discord_bot.fetch_user(channelid)
except discord.NotFound:
_LOGGER.warning("Channel not found for ID: %s", channelid)
continue
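The Discord fix above adds missing `await`s. Without them, `fetch_channel(...) or fetch_user(...)` compares coroutine objects, which are always truthy, so the fallback never runs. A self-contained sketch of the bug class with stand-in coroutines (not the real discord.py API):

```python
import asyncio

async def fetch_channel(cid):
    return None          # pretend the channel lookup failed

async def fetch_user(cid):
    return f"user-{cid}"  # the user lookup succeeds

async def notify(cid):
    # Correct form: await each call so `or` sees real results and
    # can fall back from the failed channel lookup to the user.
    return await fetch_channel(cid) or await fetch_user(cid)

assert asyncio.run(notify(42)) == "user-42"

# The buggy form yields a truthy coroutine object instead, so the
# `or` fallback is never evaluated:
coro = fetch_channel(42)
assert bool(coro) is True
coro.close()  # avoid a "never awaited" warning
```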
@@ -471,4 +471,20 @@ def _is_ignored_device(discovery_info: Mapping[str, Any]) -> bool:
if discovery_info.get(ssdp.ATTR_UPNP_DEVICE_TYPE) not in DmrDevice.DEVICE_TYPES:
return True
# Special cases for devices with other discovery methods (e.g. mDNS), or
# that advertise multiple unrelated (sent in separate discovery packets)
# UPnP devices.
manufacturer = discovery_info.get(ssdp.ATTR_UPNP_MANUFACTURER, "").lower()
model = discovery_info.get(ssdp.ATTR_UPNP_MODEL_NAME, "").lower()
if manufacturer.startswith("xbmc") or model == "kodi":
# kodi
return True
if manufacturer.startswith("samsung") and "tv" in model:
# samsungtv
return True
if manufacturer.startswith("lg") and "tv" in model:
# webostv
return True
return False
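The dlna_dmr hunk above special-cases devices better handled by other integrations. The screening logic, lifted out as a standalone function (a sketch; the real code reads these fields from SSDP discovery info):

```python
def is_ignored_device(manufacturer: str, model: str) -> bool:
    """Return True for devices that other integrations already handle."""
    manufacturer = manufacturer.lower()
    model = model.lower()
    if manufacturer.startswith("xbmc") or model == "kodi":
        return True  # kodi integration
    if manufacturer.startswith("samsung") and "tv" in model:
        return True  # samsungtv integration
    if manufacturer.startswith("lg") and "tv" in model:
        return True  # webostv integration
    return False

assert is_ignored_device("Samsung Electronics", "Samsung TV UE40")
assert not is_ignored_device("Sony", "Bravia")
```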
@@ -17,18 +17,6 @@
{
"deviceType": "urn:schemas-upnp-org:device:MediaRenderer:3",
"st": "urn:schemas-upnp-org:device:MediaRenderer:3"
},
{
"deviceType": "urn:schemas-upnp-org:device:MediaRenderer:1",
"nt": "urn:schemas-upnp-org:device:MediaRenderer:1"
},
{
"deviceType": "urn:schemas-upnp-org:device:MediaRenderer:2",
"nt": "urn:schemas-upnp-org:device:MediaRenderer:2"
},
{
"deviceType": "urn:schemas-upnp-org:device:MediaRenderer:3",
"nt": "urn:schemas-upnp-org:device:MediaRenderer:3"
}
],
"codeowners": ["@StevenLooman", "@chishm"],
@@ -3,7 +3,7 @@
"name": "Efergy",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/efergy",
"requirements": ["pyefergy==0.1.2"],
"requirements": ["pyefergy==0.1.3"],
"codeowners": ["@tkdrob"],
"iot_class": "cloud_polling"
}
@@ -20,13 +20,12 @@ async def validate_input(data):
lat = data.get(CONF_LATITUDE)
lon = data.get(CONF_LONGITUDE)
station = data.get(CONF_STATION)
lang = data.get(CONF_LANGUAGE)
lang = data.get(CONF_LANGUAGE).lower()
weather_data = ECWeather(
station_id=station,
coordinates=(lat, lon),
language=lang.lower(),
)
if station:
weather_data = ECWeather(station_id=station, language=lang)
else:
weather_data = ECWeather(coordinates=(lat, lon), language=lang)
await weather_data.update()
if lat is None or lon is None:
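The Environment Canada hunk above stops passing both a station id and coordinates to ECWeather at once, constructing the client from whichever was configured. A minimal sketch of that branching with a stub in place of the real env_canada ECWeather class:

```python
# Stub standing in for env_canada.ECWeather; not the real API.
class ECWeather:
    def __init__(self, station_id=None, coordinates=None, language="english"):
        if station_id is None and coordinates is None:
            raise ValueError("need a station id or coordinates")
        self.station_id = station_id
        self.coordinates = coordinates
        self.language = language

def make_client(station, lat, lon, lang):
    # Lower-case the language once, then branch on what was configured.
    lang = lang.lower()
    if station:
        return ECWeather(station_id=station, language=lang)
    return ECWeather(coordinates=(lat, lon), language=lang)

assert make_client("XX/s0000458", None, None, "English").station_id == "XX/s0000458"
assert make_client(None, 45.0, -75.0, "French").coordinates == (45.0, -75.0)
```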
@@ -182,7 +182,7 @@ class FluxLedUpdateCoordinator(DataUpdateCoordinator):
hass,
_LOGGER,
name=self.device.ipaddr,
-update_interval=timedelta(seconds=5),
+update_interval=timedelta(seconds=10),
# We don't want an immediate refresh since the device
# takes a moment to reflect the state change
request_refresh_debouncer=Debouncer(
@@ -165,7 +165,7 @@ CUSTOM_EFFECT_DICT: Final = {
vol.Required(CONF_COLORS): vol.All(
cv.ensure_list,
vol.Length(min=1, max=16),
-[vol.All(vol.ExactSequence((cv.byte, cv.byte, cv.byte)), vol.Coerce(tuple))],
+[vol.All(vol.Coerce(tuple), vol.ExactSequence((cv.byte, cv.byte, cv.byte)))],
),
vol.Optional(CONF_SPEED_PCT, default=50): vol.All(
vol.Range(min=0, max=100), vol.Coerce(int)
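The reordering above (running `vol.Coerce(tuple)` before `vol.ExactSequence`) matters because `vol.All` applies its validators left to right: the input list is normalized to a tuple before per-position validation. A minimal dependency-free sketch of that chaining behavior — the helper names here are stand-ins, not the real voluptuous implementation:

```python
# Stand-in for vol.All: apply validators left to right, threading the value.
def all_chain(*validators):
    def run(value):
        for validate in validators:
            value = validate(value)
        return value
    return run

# Stand-in for cv.byte: an int in the range 0..255.
def byte(value):
    if not isinstance(value, int) or not 0 <= value <= 255:
        raise ValueError(value)
    return value

# Stand-in for vol.ExactSequence: fixed length, one validator per position,
# preserving the input's sequence type.
def exact_sequence(validators):
    def check(value):
        if not isinstance(value, (list, tuple)) or len(value) != len(validators):
            raise ValueError(value)
        return type(value)(v(x) for v, x in zip(validators, value))
    return check

# Coerce to tuple FIRST, then validate each element, as in the change above.
rgb = all_chain(tuple, exact_sequence([byte, byte, byte]))
print(rgb([255, 0, 10]))  # (255, 0, 10)
```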
@@ -3,7 +3,7 @@
"name": "Flux LED/MagicHome",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/flux_led",
"requirements": ["flux_led==0.24.13"],
"requirements": ["flux_led==0.24.17"],
"quality_scale": "platinum",
"codeowners": ["@icemanch"],
"iot_class": "local_push",
@@ -370,7 +370,7 @@ class FritzBoxTools:
device_reg = async_get(self.hass)
device_list = async_entries_for_config_entry(device_reg, config_entry.entry_id)
for device_entry in device_list:
-if async_entries_for_device(
+if not async_entries_for_device(
entity_reg,
device_entry.id,
include_disabled_entities=True,
@@ -1,10 +1,7 @@
"""Support for AVM FRITZ!SmartHome devices."""
from __future__ import annotations
from datetime import timedelta
from pyfritzhome import Fritzhome, FritzhomeDevice, LoginError
import requests
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
@@ -18,10 +15,7 @@ from homeassistant.core import Event, HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.helpers.entity import DeviceInfo, EntityDescription
from homeassistant.helpers.entity_registry import RegistryEntry, async_migrate_entries
-from homeassistant.helpers.update_coordinator import (
-CoordinatorEntity,
-DataUpdateCoordinator,
-)
+from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import (
ATTR_STATE_DEVICE_LOCKED,
@@ -32,6 +26,7 @@ from .const import (
LOGGER,
PLATFORMS,
)
from .coordinator import FritzboxDataUpdateCoordinator
from .model import FritzExtraAttributes
@@ -53,52 +48,12 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
CONF_CONNECTIONS: fritz,
}
def _update_fritz_devices() -> dict[str, FritzhomeDevice]:
"""Update all fritzbox device data."""
try:
devices = fritz.get_devices()
except requests.exceptions.HTTPError:
# If the device rebooted, login again
try:
fritz.login()
except requests.exceptions.HTTPError as ex:
raise ConfigEntryAuthFailed from ex
devices = fritz.get_devices()
data = {}
fritz.update_devices()
for device in devices:
# assume device as unavailable, see #55799
if (
device.has_powermeter
and device.present
and hasattr(device, "voltage")
and device.voltage <= 0
and device.power <= 0
and device.energy <= 0
):
LOGGER.debug("Assume device %s as unavailable", device.name)
device.present = False
data[device.ain] = device
return data
-async def async_update_coordinator() -> dict[str, FritzhomeDevice]:
-"""Fetch all device data."""
-return await hass.async_add_executor_job(_update_fritz_devices)
-hass.data[DOMAIN][entry.entry_id][
-CONF_COORDINATOR
-] = coordinator = DataUpdateCoordinator(
-hass,
-LOGGER,
-name=f"{entry.entry_id}",
-update_method=async_update_coordinator,
-update_interval=timedelta(seconds=30),
-)
+coordinator = FritzboxDataUpdateCoordinator(hass, entry)
await coordinator.async_config_entry_first_refresh()
+hass.data[DOMAIN][entry.entry_id][CONF_COORDINATOR] = coordinator
def _update_unique_id(entry: RegistryEntry) -> dict[str, str] | None:
"""Update unique ID of entity entry."""
if (
@@ -142,9 +97,11 @@ async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
class FritzBoxEntity(CoordinatorEntity):
"""Basis FritzBox entity."""
coordinator: FritzboxDataUpdateCoordinator
def __init__(
self,
-coordinator: DataUpdateCoordinator[dict[str, FritzhomeDevice]],
+coordinator: FritzboxDataUpdateCoordinator,
ain: str,
entity_description: EntityDescription | None = None,
) -> None:
@@ -174,11 +131,12 @@ class FritzBoxEntity(CoordinatorEntity):
def device_info(self) -> DeviceInfo:
"""Return device specific attributes."""
return DeviceInfo(
-name=self.device.name,
identifiers={(DOMAIN, self.ain)},
manufacturer=self.device.manufacturer,
model=self.device.productname,
+name=self.device.name,
sw_version=self.device.fw_version,
+configuration_url=self.coordinator.configuration_url,
)
@property
@@ -15,10 +15,10 @@ from homeassistant.components.binary_sensor import (
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from . import FritzBoxEntity
from .const import CONF_COORDINATOR, DOMAIN as FRITZBOX_DOMAIN
from .coordinator import FritzboxDataUpdateCoordinator
from .model import FritzEntityDescriptionMixinBase
@@ -70,7 +70,7 @@ class FritzboxBinarySensor(FritzBoxEntity, BinarySensorEntity):
def __init__(
self,
-coordinator: DataUpdateCoordinator[dict[str, FritzhomeDevice]],
+coordinator: FritzboxDataUpdateCoordinator,
ain: str,
entity_description: FritzBinarySensorEntityDescription,
) -> None:
@@ -0,0 +1,68 @@
"""Data update coordinator for AVM FRITZ!SmartHome devices."""
from __future__ import annotations
from datetime import timedelta
from pyfritzhome import Fritzhome, FritzhomeDevice, LoginError
import requests
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .const import CONF_CONNECTIONS, DOMAIN, LOGGER
class FritzboxDataUpdateCoordinator(DataUpdateCoordinator):
"""Fritzbox Smarthome device data update coordinator."""
configuration_url: str
def __init__(self, hass: HomeAssistant, entry: ConfigEntry) -> None:
"""Initialize the Fritzbox Smarthome device coordinator."""
self.entry = entry
self.fritz: Fritzhome = hass.data[DOMAIN][self.entry.entry_id][CONF_CONNECTIONS]
self.configuration_url = self.fritz.get_prefixed_host()
super().__init__(
hass,
LOGGER,
name=entry.entry_id,
update_interval=timedelta(seconds=30),
)
def _update_fritz_devices(self) -> dict[str, FritzhomeDevice]:
"""Update all fritzbox device data."""
try:
devices = self.fritz.get_devices()
except requests.exceptions.ConnectionError as ex:
raise ConfigEntryNotReady from ex
except requests.exceptions.HTTPError:
# If the device rebooted, login again
try:
self.fritz.login()
except LoginError as ex:
raise ConfigEntryAuthFailed from ex
devices = self.fritz.get_devices()
data = {}
self.fritz.update_devices()
for device in devices:
# assume device as unavailable, see #55799
if (
device.has_powermeter
and device.present
and hasattr(device, "voltage")
and device.voltage <= 0
and device.power <= 0
and device.energy <= 0
):
LOGGER.debug("Assume device %s as unavailable", device.name)
device.present = False
data[device.ain] = device
return data
async def _async_update_data(self) -> dict[str, FritzhomeDevice]:
"""Fetch all device data."""
return await self.hass.async_add_executor_job(self._update_fritz_devices)
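The new coordinator above follows the standard Home Assistant pattern of keeping the blocking `requests`-based update off the event loop by handing it to an executor thread. A dependency-free sketch of that pattern with plain asyncio (the function names and returned data are illustrative, not the pyfritzhome API):

```python
import asyncio

def blocking_fetch() -> dict:
    # Stand-in for the blocking fritz.update_devices()/get_devices() calls.
    return {"ain-1": {"present": True}}

async def async_update_data() -> dict:
    # Like DataUpdateCoordinator._async_update_data: never block the event
    # loop; run the synchronous work in the default thread pool instead.
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(None, blocking_fetch)

data = asyncio.run(async_update_data())
print(data)  # {'ain-1': {'present': True}}
```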
@@ -3,8 +3,6 @@ from __future__ import annotations
from typing import Any
from pyfritzhome.fritzhomedevice import FritzhomeDevice
from homeassistant.components.light import (
ATTR_BRIGHTNESS,
ATTR_COLOR_TEMP,
@@ -16,7 +14,6 @@ from homeassistant.components.light import (
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from homeassistant.util import color
from . import FritzBoxEntity
@@ -26,6 +23,7 @@ from .const import (
CONF_COORDINATOR,
DOMAIN as FRITZBOX_DOMAIN,
)
from .coordinator import FritzboxDataUpdateCoordinator
SUPPORTED_COLOR_MODES = {COLOR_MODE_COLOR_TEMP, COLOR_MODE_HS}
@@ -64,7 +62,7 @@ class FritzboxLight(FritzBoxEntity, LightEntity):
def __init__(
self,
-coordinator: DataUpdateCoordinator[dict[str, FritzhomeDevice]],
+coordinator: FritzboxDataUpdateCoordinator,
ain: str,
supported_colors: dict,
supported_color_temps: list[str],
@@ -3,7 +3,7 @@
"name": "Home Assistant Frontend",
"documentation": "https://www.home-assistant.io/integrations/frontend",
"requirements": [
"home-assistant-frontend==20211028.0"
"home-assistant-frontend==20211108.0"
],
"dependencies": [
"api",
@@ -27,6 +27,8 @@ SCAN_INTERVAL: Final = timedelta(minutes=30)
DOMAIN: Final = "gios"
MANUFACTURER: Final = "Główny Inspektorat Ochrony Środowiska"
URL = "http://powietrze.gios.gov.pl/pjp/current/station_details/info/{station_id}"
API_TIMEOUT: Final = 30
ATTR_INDEX: Final = "index"
@@ -25,6 +25,7 @@ from .const import (
DOMAIN,
MANUFACTURER,
SENSOR_TYPES,
URL,
)
from .model import GiosSensorEntityDescription
@@ -86,6 +87,7 @@ class GiosSensor(CoordinatorEntity, SensorEntity):
identifiers={(DOMAIN, str(coordinator.gios.station_id))},
manufacturer=MANUFACTURER,
name=DEFAULT_NAME,
configuration_url=URL.format(station_id=coordinator.gios.station_id),
)
self._attr_name = f"{name} {description.name}"
self._attr_unique_id = f"{coordinator.gios.station_id}-{description.key}"
@@ -3,7 +3,7 @@
"name": "Elexa Guardian",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/guardian",
"requirements": ["aioguardian==1.0.8"],
"requirements": ["aioguardian==2021.11.0"],
"zeroconf": ["_api._udp.local."],
"codeowners": ["@bachya"],
"iot_class": "local_polling",
@@ -1,5 +1,5 @@
"""Offer time listening automation rules."""
-from datetime import datetime, timedelta
+from datetime import datetime
from functools import partial
import voluptuous as vol
@@ -8,8 +8,6 @@ from homeassistant.components import sensor
from homeassistant.const import (
ATTR_DEVICE_CLASS,
CONF_AT,
CONF_ENTITY_ID,
CONF_OFFSET,
CONF_PLATFORM,
STATE_UNAVAILABLE,
STATE_UNKNOWN,
@@ -25,21 +23,9 @@ import homeassistant.util.dt as dt_util
# mypy: allow-untyped-defs, no-check-untyped-defs
_TIME_TRIGGER_ENTITY_REFERENCE = vol.All(
str, cv.entity_domain(["input_datetime", "sensor"])
)
_TIME_TRIGGER_WITH_OFFSET_SCHEMA = vol.Schema(
{
vol.Required(CONF_ENTITY_ID): _TIME_TRIGGER_ENTITY_REFERENCE,
vol.Required(CONF_OFFSET): cv.time_period,
}
)
_TIME_TRIGGER_SCHEMA = vol.Any(
cv.time,
-_TIME_TRIGGER_ENTITY_REFERENCE,
-_TIME_TRIGGER_WITH_OFFSET_SCHEMA,
+vol.All(str, cv.entity_domain(["input_datetime", "sensor"])),
msg="Expected HH:MM, HH:MM:SS or Entity ID with domain 'input_datetime' or 'sensor'",
)
@@ -57,7 +43,6 @@ async def async_attach_trigger(hass, config, action, automation_info):
entities = {}
removes = []
job = HassJob(action)
offsets = {}
@callback
def time_automation_listener(description, now, *, entity_id=None):
@@ -92,8 +77,6 @@ async def async_attach_trigger(hass, config, action, automation_info):
if not new_state:
return
offset = offsets[entity_id] if entity_id in offsets else timedelta(0)
# Check state of entity. If valid, set up a listener.
if new_state.domain == "input_datetime":
if has_date := new_state.attributes["has_date"]:
@@ -110,17 +93,14 @@ async def async_attach_trigger(hass, config, action, automation_info):
if has_date:
# If input_datetime has date, then track point in time.
-trigger_dt = (
-datetime(
-year,
-month,
-day,
-hour,
-minute,
-second,
-tzinfo=dt_util.DEFAULT_TIME_ZONE,
-)
-+ offset
+trigger_dt = datetime(
+year,
+month,
+day,
+hour,
+minute,
+second,
+tzinfo=dt_util.DEFAULT_TIME_ZONE,
+)
# Only set up listener if time is now or in the future.
if trigger_dt >= dt_util.now():
@@ -152,7 +132,7 @@ async def async_attach_trigger(hass, config, action, automation_info):
== sensor.DEVICE_CLASS_TIMESTAMP
and new_state.state not in (STATE_UNAVAILABLE, STATE_UNKNOWN)
):
-trigger_dt = dt_util.parse_datetime(new_state.state) + offset
+trigger_dt = dt_util.parse_datetime(new_state.state)
if trigger_dt is not None and trigger_dt > dt_util.utcnow():
remove = async_track_point_in_time(
@@ -176,15 +156,6 @@ async def async_attach_trigger(hass, config, action, automation_info):
# entity
to_track.append(at_time)
update_entity_trigger(at_time, new_state=hass.states.get(at_time))
elif isinstance(at_time, dict) and CONF_OFFSET in at_time:
# entity with offset
entity_id = at_time.get(CONF_ENTITY_ID)
to_track.append(entity_id)
offsets[entity_id] = at_time.get(CONF_OFFSET)
update_entity_trigger(
entity_id,
new_state=hass.states.get(entity_id),
)
else:
# datetime.time
removes.append(
@@ -3,7 +3,7 @@
"name": "Insteon",
"documentation": "https://www.home-assistant.io/integrations/insteon",
"requirements": [
"pyinsteon==1.0.12"
"pyinsteon==1.0.13"
],
"codeowners": [
"@teharris1"
@@ -119,19 +119,19 @@ LIFX_EFFECT_PULSE_SCHEMA = cv.make_entity_service_schema(
ATTR_BRIGHTNESS_PCT: VALID_BRIGHTNESS_PCT,
vol.Exclusive(ATTR_COLOR_NAME, COLOR_GROUP): cv.string,
vol.Exclusive(ATTR_RGB_COLOR, COLOR_GROUP): vol.All(
-vol.ExactSequence((cv.byte, cv.byte, cv.byte)), vol.Coerce(tuple)
+vol.Coerce(tuple), vol.ExactSequence((cv.byte, cv.byte, cv.byte))
),
vol.Exclusive(ATTR_XY_COLOR, COLOR_GROUP): vol.All(
-vol.ExactSequence((cv.small_float, cv.small_float)), vol.Coerce(tuple)
+vol.Coerce(tuple), vol.ExactSequence((cv.small_float, cv.small_float))
),
vol.Exclusive(ATTR_HS_COLOR, COLOR_GROUP): vol.All(
+vol.Coerce(tuple),
vol.ExactSequence(
(
vol.All(vol.Coerce(float), vol.Range(min=0, max=360)),
vol.All(vol.Coerce(float), vol.Range(min=0, max=100)),
)
),
-vol.Coerce(tuple),
),
vol.Exclusive(ATTR_COLOR_TEMP, COLOR_GROUP): vol.All(
vol.Coerce(int), vol.Range(min=1)
@@ -2,7 +2,7 @@
"domain": "mill",
"name": "Mill",
"documentation": "https://www.home-assistant.io/integrations/mill",
"requirements": ["millheater==0.7.3"],
"requirements": ["millheater==0.7.4"],
"codeowners": ["@danielhiversen"],
"config_flow": true,
"iot_class": "cloud_polling"
@@ -9,6 +9,7 @@ from .const import (
ATTR_DEVICE_NAME,
ATTR_SENSOR_ATTRIBUTES,
ATTR_SENSOR_DEVICE_CLASS,
ATTR_SENSOR_ENTITY_CATEGORY,
ATTR_SENSOR_ICON,
ATTR_SENSOR_NAME,
ATTR_SENSOR_STATE,
@@ -40,6 +41,7 @@ async def async_setup_entry(hass, config_entry, async_add_entities):
ATTR_SENSOR_STATE: None,
ATTR_SENSOR_TYPE: entry.domain,
ATTR_SENSOR_UNIQUE_ID: entry.unique_id,
ATTR_SENSOR_ENTITY_CATEGORY: entry.entity_category,
}
entities.append(MobileAppBinarySensor(config, entry.device_id, config_entry))
@@ -11,6 +11,7 @@ from .const import (
ATTR_DEVICE_NAME,
ATTR_SENSOR_ATTRIBUTES,
ATTR_SENSOR_DEVICE_CLASS,
ATTR_SENSOR_ENTITY_CATEGORY,
ATTR_SENSOR_ICON,
ATTR_SENSOR_NAME,
ATTR_SENSOR_STATE,
@@ -45,6 +46,7 @@ async def async_setup_entry(hass, config_entry, async_add_entities):
ATTR_SENSOR_TYPE: entry.domain,
ATTR_SENSOR_UNIQUE_ID: entry.unique_id,
ATTR_SENSOR_UOM: entry.unit_of_measurement,
ATTR_SENSOR_ENTITY_CATEGORY: entry.entity_category,
}
entities.append(MobileAppSensor(config, entry.device_id, config_entry))
@@ -446,6 +446,26 @@ async def webhook_register_sensor(hass, config_entry, data):
"Re-register for %s of existing sensor %s", device_name, unique_id
)
entry = entity_registry.async_get(existing_sensor)
changes = {}
if (
new_name := f"{device_name} {data[ATTR_SENSOR_NAME]}"
) != entry.original_name:
changes["original_name"] = new_name
for ent_reg_key, data_key in (
("device_class", ATTR_SENSOR_DEVICE_CLASS),
("unit_of_measurement", ATTR_SENSOR_UOM),
("entity_category", ATTR_SENSOR_ENTITY_CATEGORY),
("original_icon", ATTR_SENSOR_ICON),
):
if data_key in data and getattr(entry, ent_reg_key) != data[data_key]:
changes[ent_reg_key] = data[data_key]
if changes:
entity_registry.async_update_entity(existing_sensor, **changes)
async_dispatcher_send(hass, SIGNAL_SENSOR_UPDATE, data)
else:
register_signal = f"{DOMAIN}_{data[ATTR_SENSOR_TYPE]}_register"
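The re-registration branch above collects only the registry fields that actually differ into a `changes` dict, then issues a single `async_update_entity` call when it is non-empty. A dependency-free sketch of that "collect diffs, apply once" pattern — the `Entry` class and field values here are stand-ins, not Home Assistant's `RegistryEntry`:

```python
from dataclasses import dataclass, replace
from typing import Optional

@dataclass(frozen=True)
class Entry:
    # Stand-in for a registry entry record.
    original_name: str
    device_class: Optional[str] = None
    unit_of_measurement: Optional[str] = None

entry = Entry("Phone Battery Level", unit_of_measurement="%")
data = {"device_class": "battery", "unit_of_measurement": "%"}

# Collect only the fields whose incoming value differs from the entry...
changes = {
    key: data[key]
    for key in ("device_class", "unit_of_measurement")
    if key in data and getattr(entry, key) != data[key]
}

# ...and apply one update only if something actually changed.
if changes:
    entry = replace(entry, **changes)

print(changes)  # {'device_class': 'battery'}
```

Skipping the update when `changes` is empty avoids needless registry writes and dispatcher churn on every re-registration.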
@@ -3,7 +3,7 @@
"name": "MQTT",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/mqtt",
"requirements": ["paho-mqtt==1.5.1"],
"requirements": ["paho-mqtt==1.6.1"],
"dependencies": ["http"],
"codeowners": ["@emontnemery"],
"iot_class": "local_push"
@@ -2,7 +2,7 @@
"domain": "nad",
"name": "NAD",
"documentation": "https://www.home-assistant.io/integrations/nad",
"requirements": ["nad_receiver==0.2.0"],
"requirements": ["nad_receiver==0.3.0"],
"codeowners": [],
"iot_class": "local_polling"
}
@@ -232,6 +232,7 @@ class NetatmoThermostat(NetatmoBase, ClimateEntity):
if self._model == NA_THERM:
self._operation_list.append(HVAC_MODE_OFF)
self._attr_max_temp = DEFAULT_MAX_TEMP
self._attr_unique_id = f"{self._id}-{self._model}"
async def async_added_to_hass(self) -> None:
@@ -446,7 +447,7 @@ class NetatmoThermostat(NetatmoBase, ClimateEntity):
if (temp := kwargs.get(ATTR_TEMPERATURE)) is None:
return
await self._home_status.async_set_room_thermpoint(
-self._id, STATE_NETATMO_MANUAL, temp
+self._id, STATE_NETATMO_MANUAL, min(temp, DEFAULT_MAX_TEMP)
)
self.async_write_ha_state()
@@ -39,8 +39,11 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
# strip out the stale options CONF_RESOURCES
if CONF_RESOURCES in entry.options:
+new_data = {**entry.data, CONF_RESOURCES: entry.options[CONF_RESOURCES]}
new_options = {k: v for k, v in entry.options.items() if k != CONF_RESOURCES}
-hass.config_entries.async_update_entry(entry, options=new_options)
+hass.config_entries.async_update_entry(
+entry, data=new_data, options=new_options
+)
config = entry.data
host = config[CONF_HOST]
@@ -1,9 +1,11 @@
"""Support for monitoring OctoPrint 3D printers."""
from datetime import timedelta
import logging
from typing import cast
from pyoctoprintapi import ApiError, OctoprintClient, PrinterOffline
import voluptuous as vol
from yarl import URL
from homeassistant.config_entries import SOURCE_IMPORT, ConfigEntry
from homeassistant.const import (
@@ -20,6 +22,7 @@ from homeassistant.const import (
from homeassistant.core import HomeAssistant
from homeassistant.helpers.aiohttp_client import async_get_clientsession
import homeassistant.helpers.config_validation as cv
from homeassistant.helpers.entity import DeviceInfo
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from homeassistant.util import slugify as util_slugify
import homeassistant.util.dt as dt_util
@@ -160,7 +163,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry):
client.set_api_key(entry.data[CONF_API_KEY])
-coordinator = OctoprintDataUpdateCoordinator(hass, client, entry.entry_id, 30)
+coordinator = OctoprintDataUpdateCoordinator(hass, client, entry, 30)
await coordinator.async_config_entry_first_refresh()
@@ -184,20 +187,23 @@ async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry):
class OctoprintDataUpdateCoordinator(DataUpdateCoordinator):
"""Class to manage fetching Octoprint data."""
config_entry: ConfigEntry
def __init__(
self,
hass: HomeAssistant,
octoprint: OctoprintClient,
-config_entry_id: str,
+config_entry: ConfigEntry,
interval: int,
) -> None:
"""Initialize."""
super().__init__(
hass,
_LOGGER,
name=f"octoprint-{config_entry_id}",
name=f"octoprint-{config_entry.entry_id}",
update_interval=timedelta(seconds=interval),
)
self.config_entry = config_entry
self._octoprint = octoprint
self._printer_offline = False
self.data = {"printer": None, "job": None, "last_read_time": None}
@@ -225,3 +231,21 @@ class OctoprintDataUpdateCoordinator(DataUpdateCoordinator):
self._printer_offline = False
return {"job": job, "printer": printer, "last_read_time": dt_util.utcnow()}
@property
def device_info(self) -> DeviceInfo:
"""Device info."""
unique_id = cast(str, self.config_entry.unique_id)
configuration_url = URL.build(
scheme=self.config_entry.data[CONF_SSL] and "https" or "http",
host=self.config_entry.data[CONF_HOST],
port=self.config_entry.data[CONF_PORT],
path=self.config_entry.data[CONF_PATH],
)
return DeviceInfo(
identifiers={(DOMAIN, unique_id)},
manufacturer="OctoPrint",
name="OctoPrint",
configuration_url=str(configuration_url),
)
@@ -2,7 +2,6 @@
from __future__ import annotations
from abc import abstractmethod
import logging
from pyoctoprintapi import OctoprintPrinterInfo
@@ -10,14 +9,10 @@ from homeassistant.components.binary_sensor import BinarySensorEntity
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddEntitiesCallback
-from homeassistant.helpers.update_coordinator import (
-CoordinatorEntity,
-DataUpdateCoordinator,
-)
+from homeassistant.helpers.update_coordinator import CoordinatorEntity
-from .const import DOMAIN as COMPONENT_DOMAIN
-_LOGGER = logging.getLogger(__name__)
+from . import OctoprintDataUpdateCoordinator
+from .const import DOMAIN
async def async_setup_entry(
@@ -26,7 +21,7 @@ async def async_setup_entry(
async_add_entities: AddEntitiesCallback,
) -> None:
"""Set up the available OctoPrint binary sensors."""
-coordinator: DataUpdateCoordinator = hass.data[COMPONENT_DOMAIN][
+coordinator: OctoprintDataUpdateCoordinator = hass.data[DOMAIN][
config_entry.entry_id
]["coordinator"]
device_id = config_entry.unique_id
@@ -44,9 +39,11 @@ async def async_setup_entry(
class OctoPrintBinarySensorBase(CoordinatorEntity, BinarySensorEntity):
"""Representation an OctoPrint binary sensor."""
coordinator: OctoprintDataUpdateCoordinator
def __init__(
self,
-coordinator: DataUpdateCoordinator,
+coordinator: OctoprintDataUpdateCoordinator,
sensor_type: str,
device_id: str,
) -> None:
@@ -59,11 +56,7 @@ class OctoPrintBinarySensorBase(CoordinatorEntity, BinarySensorEntity):
@property
def device_info(self):
"""Device info."""
-return {
-"identifiers": {(COMPONENT_DOMAIN, self._device_id)},
-"manufacturer": "OctoPrint",
-"name": "OctoPrint",
-}
+return self.coordinator.device_info
@property
def is_on(self):
@@ -87,7 +80,9 @@ class OctoPrintBinarySensorBase(CoordinatorEntity, BinarySensorEntity):
class OctoPrintPrintingBinarySensor(OctoPrintBinarySensorBase):
"""Representation an OctoPrint binary sensor."""
-def __init__(self, coordinator: DataUpdateCoordinator, device_id: str) -> None:
+def __init__(
+self, coordinator: OctoprintDataUpdateCoordinator, device_id: str
+) -> None:
"""Initialize a new OctoPrint sensor."""
super().__init__(coordinator, "Printing", device_id)
@@ -98,7 +93,9 @@ class OctoPrintPrintingBinarySensor(OctoPrintBinarySensorBase):
class OctoPrintPrintingErrorBinarySensor(OctoPrintBinarySensorBase):
"""Representation an OctoPrint binary sensor."""
-def __init__(self, coordinator: DataUpdateCoordinator, device_id: str) -> None:
+def __init__(
+self, coordinator: OctoprintDataUpdateCoordinator, device_id: str
+) -> None:
"""Initialize a new OctoPrint sensor."""
super().__init__(coordinator, "Printing Error", device_id)
@@ -189,7 +189,7 @@ class ConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
try:
user_input[CONF_API_KEY] = await octoprint.request_app_key(
"Home Assistant", user_input[CONF_USERNAME], 30
"Home Assistant", user_input[CONF_USERNAME], 300
)
finally:
# Continue the flow after show progress when the task is done.
@@ -16,12 +16,10 @@ from homeassistant.const import (
)
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddEntitiesCallback
-from homeassistant.helpers.update_coordinator import (
-CoordinatorEntity,
-DataUpdateCoordinator,
-)
+from homeassistant.helpers.update_coordinator import CoordinatorEntity
-from . import DOMAIN as COMPONENT_DOMAIN
+from . import OctoprintDataUpdateCoordinator
+from .const import DOMAIN
_LOGGER = logging.getLogger(__name__)
@@ -32,7 +30,7 @@ async def async_setup_entry(
async_add_entities: AddEntitiesCallback,
) -> None:
"""Set up the available OctoPrint binary sensors."""
-coordinator: DataUpdateCoordinator = hass.data[COMPONENT_DOMAIN][
+coordinator: OctoprintDataUpdateCoordinator = hass.data[DOMAIN][
config_entry.entry_id
]["coordinator"]
device_id = config_entry.unique_id
@@ -67,9 +65,11 @@ async def async_setup_entry(
class OctoPrintSensorBase(CoordinatorEntity, SensorEntity):
"""Representation of an OctoPrint sensor."""
coordinator: OctoprintDataUpdateCoordinator
def __init__(
self,
-coordinator: DataUpdateCoordinator,
+coordinator: OctoprintDataUpdateCoordinator,
sensor_type: str,
device_id: str,
) -> None:
@@ -82,11 +82,7 @@ class OctoPrintSensorBase(CoordinatorEntity, SensorEntity):
@property
def device_info(self):
"""Device info."""
-return {
-"identifiers": {(COMPONENT_DOMAIN, self._device_id)},
-"manufacturer": "OctoPrint",
-"name": "OctoPrint",
-}
+return self.coordinator.device_info
class OctoPrintStatusSensor(OctoPrintSensorBase):
@@ -94,7 +90,9 @@ class OctoPrintStatusSensor(OctoPrintSensorBase):
_attr_icon = "mdi:printer-3d"
-def __init__(self, coordinator: DataUpdateCoordinator, device_id: str) -> None:
+def __init__(
+self, coordinator: OctoprintDataUpdateCoordinator, device_id: str
+) -> None:
"""Initialize a new OctoPrint sensor."""
super().__init__(coordinator, "Current State", device_id)
@@ -119,7 +117,9 @@ class OctoPrintJobPercentageSensor(OctoPrintSensorBase):
_attr_native_unit_of_measurement = PERCENTAGE
_attr_icon = "mdi:file-percent"
-def __init__(self, coordinator: DataUpdateCoordinator, device_id: str) -> None:
+def __init__(
+self, coordinator: OctoprintDataUpdateCoordinator, device_id: str
+) -> None:
"""Initialize a new OctoPrint sensor."""
super().__init__(coordinator, "Job Percentage", device_id)
@@ -142,7 +142,9 @@ class OctoPrintEstimatedFinishTimeSensor(OctoPrintSensorBase):
_attr_device_class = DEVICE_CLASS_TIMESTAMP
-def __init__(self, coordinator: DataUpdateCoordinator, device_id: str) -> None:
+def __init__(
+self, coordinator: OctoprintDataUpdateCoordinator, device_id: str
+) -> None:
"""Initialize a new OctoPrint sensor."""
super().__init__(coordinator, "Estimated Finish Time", device_id)
@@ -163,7 +165,9 @@ class OctoPrintStartTimeSensor(OctoPrintSensorBase):
_attr_device_class = DEVICE_CLASS_TIMESTAMP
-def __init__(self, coordinator: DataUpdateCoordinator, device_id: str) -> None:
+def __init__(
+self, coordinator: OctoprintDataUpdateCoordinator, device_id: str
+) -> None:
"""Initialize a new OctoPrint sensor."""
super().__init__(coordinator, "Start Time", device_id)
@@ -189,7 +193,7 @@ class OctoPrintTemperatureSensor(OctoPrintSensorBase):
def __init__(
self,
-coordinator: DataUpdateCoordinator,
+coordinator: OctoprintDataUpdateCoordinator,
tool: str,
temp_type: str,
device_id: str,
@@ -62,7 +62,7 @@ PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(
CONF_NEIGHBORS, DEFAULT_NEIGHBORS
): cv.positive_int,
vol.Optional(CONF_MIN_SIZE, DEFAULT_MIN_SIZE): vol.Schema(
-vol.All(vol.ExactSequence([int, int]), vol.Coerce(tuple))
+vol.All(vol.Coerce(tuple), vol.ExactSequence([int, int]))
),
}
),
@@ -109,13 +109,15 @@ class OpenWeatherMapOptionsFlow(config_entries.OptionsFlow):
vol.Optional(
CONF_MODE,
default=self.config_entry.options.get(
-CONF_MODE, DEFAULT_FORECAST_MODE
+CONF_MODE,
+self.config_entry.data.get(CONF_MODE, DEFAULT_FORECAST_MODE),
),
): vol.In(FORECAST_MODES),
vol.Optional(
CONF_LANGUAGE,
default=self.config_entry.options.get(
-CONF_LANGUAGE, DEFAULT_LANGUAGE
+CONF_LANGUAGE,
+self.config_entry.data.get(CONF_LANGUAGE, DEFAULT_LANGUAGE),
),
): vol.In(LANGUAGES),
}
@@ -109,7 +109,7 @@ class PlugwiseConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
# unique_id is needed here, to be able to determine whether the discovered device is known, or not.
unique_id = self.discovery_info.get("hostname").split(".")[0]
await self.async_set_unique_id(unique_id)
-self._abort_if_unique_id_configured()
+self._abort_if_unique_id_configured({CONF_HOST: self.discovery_info[CONF_HOST]})
if DEFAULT_USERNAME not in unique_id:
self.discovery_info[CONF_USERNAME] = STRETCH_USERNAME
@@ -3,7 +3,7 @@
"name": "Spain electricity hourly pricing (PVPC)",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/pvpc_hourly_pricing",
"requirements": ["aiopvpc==2.2.0"],
"requirements": ["aiopvpc==2.2.1"],
"codeowners": ["@azogue"],
"quality_scale": "platinum",
"iot_class": "cloud_polling"
@@ -1,17 +1,11 @@
"""Support for ReCollect Waste sensors."""
from __future__ import annotations
from datetime import date, datetime, time
from aiorecollect.client import PickupType
from homeassistant.components.sensor import SensorEntity
from homeassistant.config_entries import ConfigEntry
-from homeassistant.const import (
-ATTR_ATTRIBUTION,
-CONF_FRIENDLY_NAME,
-DEVICE_CLASS_TIMESTAMP,
-)
+from homeassistant.const import ATTR_ATTRIBUTION, CONF_FRIENDLY_NAME, DEVICE_CLASS_DATE
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.entity_platform import AddEntitiesCallback
@@ -19,7 +13,6 @@ from homeassistant.helpers.update_coordinator import (
CoordinatorEntity,
DataUpdateCoordinator,
)
from homeassistant.util.dt import as_utc
from .const import CONF_PLACE_ID, CONF_SERVICE_ID, DATA_COORDINATOR, DOMAIN
@@ -47,12 +40,6 @@ def async_get_pickup_type_names(
]
@callback
def async_get_utc_midnight(target_date: date) -> datetime:
"""Get UTC midnight for a given date."""
return as_utc(datetime.combine(target_date, time(0)))
async def async_setup_entry(
hass: HomeAssistant, entry: ConfigEntry, async_add_entities: AddEntitiesCallback
) -> None:
@@ -64,7 +51,7 @@ async def async_setup_entry(
class ReCollectWasteSensor(CoordinatorEntity, SensorEntity):
"""ReCollect Waste Sensor."""
-_attr_device_class = DEVICE_CLASS_TIMESTAMP
+_attr_device_class = DEVICE_CLASS_DATE
def __init__(self, coordinator: DataUpdateCoordinator, entry: ConfigEntry) -> None:
"""Initialize the sensor."""
@@ -91,8 +78,13 @@ class ReCollectWasteSensor(CoordinatorEntity, SensorEntity):
@callback
def update_from_latest_data(self) -> None:
"""Update the state."""
-pickup_event = self.coordinator.data[0]
-next_pickup_event = self.coordinator.data[1]
+try:
+pickup_event = self.coordinator.data[0]
+next_pickup_event = self.coordinator.data[1]
+except IndexError:
+self._attr_native_value = None
+self._attr_extra_state_attributes = {}
+return
self._attr_extra_state_attributes.update(
{
@@ -103,9 +95,7 @@ class ReCollectWasteSensor(CoordinatorEntity, SensorEntity):
ATTR_NEXT_PICKUP_TYPES: async_get_pickup_type_names(
self._entry, next_pickup_event.pickup_types
),
-ATTR_NEXT_PICKUP_DATE: async_get_utc_midnight(
-next_pickup_event.date
-).isoformat(),
+ATTR_NEXT_PICKUP_DATE: next_pickup_event.date.isoformat(),
}
)
-self._attr_native_value = async_get_utc_midnight(pickup_event.date).isoformat()
+self._attr_native_value = pickup_event.date.isoformat()
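The device-class change above means the ReCollect sensor now reports a plain ISO date rather than a UTC-midnight timestamp. A small stdlib-only illustration of the difference between the two representations (the pickup date is made up):

```python
from datetime import date, datetime, time, timezone

pickup = date(2021, 11, 9)

# Old behaviour (DEVICE_CLASS_TIMESTAMP): UTC midnight of the pickup day.
old_value = datetime.combine(pickup, time(0), tzinfo=timezone.utc).isoformat()

# New behaviour (DEVICE_CLASS_DATE): just the date itself.
new_value = pickup.isoformat()

print(old_value)  # 2021-11-09T00:00:00+00:00
print(new_value)  # 2021-11-09
```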
@@ -793,7 +793,7 @@ class Recorder(threading.Thread):
if statistics.add_external_statistics(self, metadata, stats):
return
# Schedule a new statistics task if this one didn't finish
-self.queue.put(StatisticsTask(metadata, stats))
+self.queue.put(ExternalStatisticsTask(metadata, stats))
def _process_one_event(self, event):
"""Process one event."""
@@ -12,6 +12,7 @@ from sqlalchemy.exc import (
SQLAlchemyError,
)
from sqlalchemy.schema import AddConstraint, DropConstraint
from sqlalchemy.sql.expression import true
from .models import (
SCHEMA_VERSION,
@@ -24,7 +25,7 @@ from .models import (
StatisticsShortTerm,
process_timestamp,
)
-from .statistics import get_metadata_with_session, get_start_time
+from .statistics import get_start_time
from .util import session_scope
_LOGGER = logging.getLogger(__name__)
@@ -558,21 +559,25 @@ def _apply_update(instance, session, new_version, old_version): # noqa: C901
session.add(StatisticsRuns(start=fake_start_time))
fake_start_time += timedelta(minutes=5)
# Copy last hourly statistic to the newly created 5-minute statistics table
-sum_statistics = get_metadata_with_session(
-instance.hass, session, statistic_type="sum"
-)
-for metadata_id, _ in sum_statistics.values():
+# When querying the database, be careful to only explicitly query for columns
+# which were present in schema version 21. If querying the table, SQLAlchemy
+# will refer to future columns.
+for sum_statistic in session.query(StatisticsMeta.id).filter_by(has_sum=true()):
last_statistic = (
-session.query(Statistics)
-.filter_by(metadata_id=metadata_id)
+session.query(
+Statistics.start,
+Statistics.last_reset,
+Statistics.state,
+Statistics.sum,
+)
+.filter_by(metadata_id=sum_statistic.id)
.order_by(Statistics.start.desc())
.first()
)
if last_statistic:
session.add(
StatisticsShortTerm(
metadata_id=last_statistic.metadata_id,
metadata_id=sum_statistic.id,
start=last_statistic.start,
last_reset=last_statistic.last_reset,
state=last_statistic.state,
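The comment in the hunk above is about SQLAlchemy emitting SELECTs for the full, current model when you query the mapped class. A stdlib-only analogy in raw SQL (a hypothetical trimmed-down schema, not the actual Home Assistant tables or the ORM code itself) shows why naming only the columns present in schema version 21 is safe:

```python
import sqlite3

# Simulate a database still on the old schema: this "statistics" table
# lacks a newer "last_reset" column that the current model would define.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE statistics (metadata_id INTEGER, start TEXT, state REAL, sum REAL)"
)
conn.execute("INSERT INTO statistics VALUES (1, '2021-11-08 20:00:00', 1.5, 10.0)")

# Querying only columns known to exist in the old schema works...
row = conn.execute(
    "SELECT start, state, sum FROM statistics WHERE metadata_id = 1"
).fetchone()

# ...while naming a column from a future schema (what selecting the full
# model would effectively do) fails on the old table.
try:
    conn.execute("SELECT start, last_reset FROM statistics")
    future_column_ok = True
except sqlite3.OperationalError:
    future_column_ok = False
```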
@@ -49,7 +49,7 @@ MIN_VERSION_MARIA_DB_ROWNUM = AwesomeVersion("10.2.0", AwesomeVersionStrategy.SI
MIN_VERSION_MYSQL = AwesomeVersion("8.0.0", AwesomeVersionStrategy.SIMPLEVER)
MIN_VERSION_MYSQL_ROWNUM = AwesomeVersion("5.8.0", AwesomeVersionStrategy.SIMPLEVER)
MIN_VERSION_PGSQL = AwesomeVersion("12.0", AwesomeVersionStrategy.SIMPLEVER)
MIN_VERSION_SQLITE = AwesomeVersion("3.32.1", AwesomeVersionStrategy.SIMPLEVER)
MIN_VERSION_SQLITE = AwesomeVersion("3.31.0", AwesomeVersionStrategy.SIMPLEVER)
MIN_VERSION_SQLITE_ROWNUM = AwesomeVersion("3.25.0", AwesomeVersionStrategy.SIMPLEVER)
# This is the maximum time after the recorder ends the session
@@ -295,7 +295,7 @@ def _warn_unsupported_dialect(dialect):
"Starting with Home Assistant 2022.2 this will prevent the recorder from "
"starting. Please migrate your database to a supported software before then",
dialect,
"MariaDB ≥ 10.3, MySQL ≥ 8.0, PostgreSQL ≥ 12, SQLite ≥ 3.32.1",
"MariaDB ≥ 10.3, MySQL ≥ 8.0, PostgreSQL ≥ 12, SQLite ≥ 3.31.0",
)
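The minimum-version gates above are enforced with AwesomeVersion comparisons; the underlying idea reduces to a numeric (not lexicographic) dotted-version comparison, sketched here without the library:

```python
def meets_min_version(version: str, minimum: str) -> bool:
    """Numeric dotted-version comparison, e.g. '3.31.0' >= '3.9.0'."""
    def parse(v: str) -> tuple[int, ...]:
        return tuple(int(part) for part in v.split("."))
    return parse(version) >= parse(minimum)

MIN_VERSION_SQLITE = "3.31.0"  # the lowered minimum from the diff above
```

Note that a plain string comparison would get `"3.9.0" >= "3.31.0"` wrong, which is why the components are compared as integers.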
@@ -138,11 +138,6 @@ class SharkVacuumEntity(CoordinatorEntity, StateVacuumEntity):
"""Flag vacuum cleaner robot features that are supported."""
return SUPPORT_SHARKIQ
@property
def is_docked(self) -> bool | None:
"""Is vacuum docked."""
return self.sharkiq.get_property_value(Properties.DOCKED_STATUS)
@property
def error_code(self) -> int | None:
"""Return the last observed error code (or None)."""
@@ -175,7 +170,7 @@ class SharkVacuumEntity(CoordinatorEntity, StateVacuumEntity):
In the app, these are (usually) handled by showing the robot as stopped and sending the
user a notification.
"""
if self.is_docked:
if self.sharkiq.get_property_value(Properties.CHARGING_STATUS):
return STATE_DOCKED
return self.operating_mode
@@ -282,9 +282,6 @@ class ShellyBlockEntity(entity.Entity):
self.wrapper = wrapper
self.block = block
self._name = get_block_entity_name(wrapper.device, block)
self._attr_device_info = DeviceInfo(
connections={(device_registry.CONNECTION_NETWORK_MAC, wrapper.mac)}
)
@property
def name(self) -> str:
@@ -296,6 +293,13 @@ class ShellyBlockEntity(entity.Entity):
"""If device should be polled."""
return False
@property
def device_info(self) -> DeviceInfo:
"""Device info."""
return {
"connections": {(device_registry.CONNECTION_NETWORK_MAC, self.wrapper.mac)}
}
@property
def available(self) -> bool:
"""Available."""
@@ -344,9 +348,9 @@ class ShellyRpcEntity(entity.Entity):
self.wrapper = wrapper
self.key = key
self._attr_should_poll = False
self._attr_device_info = DeviceInfo(
connections={(device_registry.CONNECTION_NETWORK_MAC, wrapper.mac)}
)
self._attr_device_info = {
"connections": {(device_registry.CONNECTION_NETWORK_MAC, wrapper.mac)}
}
self._attr_unique_id = f"{wrapper.mac}-{key}"
self._attr_name = get_rpc_entity_name(wrapper.device, key)
@@ -490,15 +494,19 @@ class ShellyRestAttributeEntity(update_coordinator.CoordinatorEntity):
self.description = description
self._name = get_block_entity_name(wrapper.device, None, self.description.name)
self._last_value = None
self._attr_device_info = DeviceInfo(
connections={(device_registry.CONNECTION_NETWORK_MAC, wrapper.mac)}
)
@property
def name(self) -> str:
"""Name of sensor."""
return self._name
@property
def device_info(self) -> DeviceInfo:
"""Device info."""
return {
"connections": {(device_registry.CONNECTION_NETWORK_MAC, self.wrapper.mac)}
}
@property
def entity_registry_enabled_default(self) -> bool:
"""Return if it should be enabled by default."""
@@ -3,7 +3,7 @@
"name": "Shelly",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/shelly",
"requirements": ["aioshelly==1.0.2"],
"requirements": ["aioshelly==1.0.4"],
"zeroconf": [
{
"type": "_http._tcp.local.",
@@ -2,7 +2,7 @@
"domain": "shiftr",
"name": "shiftr.io",
"documentation": "https://www.home-assistant.io/integrations/shiftr",
"requirements": ["paho-mqtt==1.5.1"],
"requirements": ["paho-mqtt==1.6.1"],
"codeowners": ["@fabaff"],
"iot_class": "cloud_push"
}
@@ -103,9 +103,11 @@ ATTR_TIMESTAMP = "timestamp"
DEFAULT_ENTITY_MODEL = "alarm_control_panel"
DEFAULT_ENTITY_NAME = "Alarm Control Panel"
DEFAULT_REST_API_ERROR_COUNT = 2
DEFAULT_SCAN_INTERVAL = timedelta(seconds=30)
DEFAULT_SOCKET_MIN_RETRY = 15
DISPATCHER_TOPIC_WEBSOCKET_EVENT = "simplisafe_websocket_event_{0}"
EVENT_SIMPLISAFE_EVENT = "SIMPLISAFE_EVENT"
@@ -556,6 +558,8 @@ class SimpliSafeEntity(CoordinatorEntity):
assert simplisafe.coordinator
super().__init__(simplisafe.coordinator)
self._rest_api_errors = 0
if device:
model = device.type.name
device_name = device.name
@@ -618,11 +622,24 @@ class SimpliSafeEntity(CoordinatorEntity):
else:
system_offline = False
return super().available and self._online and not system_offline
return (
self._rest_api_errors < DEFAULT_REST_API_ERROR_COUNT
and self._online
and not system_offline
)
@callback
def _handle_coordinator_update(self) -> None:
"""Update the entity with new REST API data."""
# SimpliSafe can incorrectly return an error state when there isn't any
# error. This can lead to the system having an unknown state frequently.
# To protect against that, we measure how many "error states" we receive
# and only alter the state if we detect a few in a row:
if self.coordinator.last_update_success:
self._rest_api_errors = 0
else:
self._rest_api_errors += 1
self.async_update_from_rest_api()
self.async_write_ha_state()
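The counter logic above can be exercised in isolation. A minimal sketch of the consecutive-error debounce (the class and names here are illustrative stand-ins, not the integration's actual entity):

```python
DEFAULT_REST_API_ERROR_COUNT = 2  # threshold from the diff above

class ErrorDebouncer:
    """One failed REST update does not mark the entity unavailable;
    only a streak of DEFAULT_REST_API_ERROR_COUNT failures does."""

    def __init__(self) -> None:
        self._rest_api_errors = 0

    def record_update(self, success: bool) -> None:
        # Any success resets the streak, mirroring _handle_coordinator_update.
        if success:
            self._rest_api_errors = 0
        else:
            self._rest_api_errors += 1

    @property
    def available(self) -> bool:
        return self._rest_api_errors < DEFAULT_REST_API_ERROR_COUNT

deb = ErrorDebouncer()
deb.record_update(False)          # one spurious error state
after_one_error = deb.available   # still available
deb.record_update(False)          # second error in a row
after_two_errors = deb.available  # now unavailable
deb.record_update(True)
after_recovery = deb.available
```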
@@ -72,8 +72,6 @@ ATTR_RF_JAMMING = "rf_jamming"
ATTR_WALL_POWER_LEVEL = "wall_power_level"
ATTR_WIFI_STRENGTH = "wifi_strength"
DEFAULT_ERRORS_TO_ACCOMMODATE = 2
VOLUME_STRING_MAP = {
VOLUME_HIGH: "high",
VOLUME_LOW: "low",
@@ -141,8 +139,6 @@ class SimpliSafeAlarm(SimpliSafeEntity, AlarmControlPanelEntity):
additional_websocket_events=WEBSOCKET_EVENTS_TO_LISTEN_FOR,
)
self._errors = 0
if code := self._simplisafe.entry.options.get(CONF_CODE):
if code.isdigit():
self._attr_code_format = FORMAT_NUMBER
@@ -249,19 +245,6 @@ class SimpliSafeAlarm(SimpliSafeEntity, AlarmControlPanelEntity):
}
)
# SimpliSafe can incorrectly return an error state when there isn't any
# error. This can lead to the system having an unknown state frequently.
# To protect against that, we measure how many "error states" we receive
# and only alter the state if we detect a few in a row:
if self._system.state == SystemStates.error:
if self._errors > DEFAULT_ERRORS_TO_ACCOMMODATE:
self._attr_state = None
else:
self._errors += 1
return
self._errors = 0
self._set_state_from_system_data()
@callback
@@ -147,45 +147,45 @@ SENSOR_TYPES = [
icon="mdi:car-battery",
),
SolarEdgeSensorEntityDescription(
key="purchased_power",
key="purchased_energy",
json_key="Purchased",
name="Imported Power",
name="Imported Energy",
entity_registry_enabled_default=False,
state_class=STATE_CLASS_TOTAL_INCREASING,
native_unit_of_measurement=ENERGY_WATT_HOUR,
device_class=DEVICE_CLASS_ENERGY,
),
SolarEdgeSensorEntityDescription(
key="production_power",
key="production_energy",
json_key="Production",
name="Production Power",
name="Production Energy",
entity_registry_enabled_default=False,
state_class=STATE_CLASS_TOTAL_INCREASING,
native_unit_of_measurement=ENERGY_WATT_HOUR,
device_class=DEVICE_CLASS_ENERGY,
),
SolarEdgeSensorEntityDescription(
key="consumption_power",
key="consumption_energy",
json_key="Consumption",
name="Consumption Power",
name="Consumption Energy",
entity_registry_enabled_default=False,
state_class=STATE_CLASS_TOTAL_INCREASING,
native_unit_of_measurement=ENERGY_WATT_HOUR,
device_class=DEVICE_CLASS_ENERGY,
),
SolarEdgeSensorEntityDescription(
key="selfconsumption_power",
key="selfconsumption_energy",
json_key="SelfConsumption",
name="SelfConsumption Power",
name="SelfConsumption Energy",
entity_registry_enabled_default=False,
state_class=STATE_CLASS_TOTAL_INCREASING,
native_unit_of_measurement=ENERGY_WATT_HOUR,
device_class=DEVICE_CLASS_ENERGY,
),
SolarEdgeSensorEntityDescription(
key="feedin_power",
key="feedin_energy",
json_key="FeedIn",
name="Exported Power",
name="Exported Energy",
entity_registry_enabled_default=False,
state_class=STATE_CLASS_TOTAL_INCREASING,
native_unit_of_measurement=ENERGY_WATT_HOUR,
@@ -92,11 +92,11 @@ class SolarEdgeSensorFactory:
self.services[key] = (SolarEdgeStorageLevelSensor, flow)
for key in (
"purchased_power",
"production_power",
"feedin_power",
"consumption_power",
"selfconsumption_power",
"purchased_energy",
"production_energy",
"feedin_energy",
"consumption_energy",
"selfconsumption_energy",
):
self.services[key] = (SolarEdgeEnergyDetailsSensor, energy)
@@ -34,6 +34,7 @@ class SolarlogSensor(update_coordinator.CoordinatorEntity, SensorEntity):
identifiers={(DOMAIN, coordinator.unique_id)},
manufacturer="Solar-Log",
name=coordinator.name,
configuration_url=coordinator.host,
)
@property
@@ -120,6 +120,7 @@ def filter_libav_logging() -> None:
"libav.rtsp",
"libav.tcp",
"libav.tls",
"libav.mpegts",
"libav.NULL",
):
logging.getLogger(logging_namespace).addFilter(libav_filter)
@@ -66,9 +66,15 @@ class SegmentBuffer:
memory_file: BytesIO,
sequence: int,
input_vstream: av.video.VideoStream,
) -> av.container.OutputContainer:
"""Make a new av OutputContainer."""
return av.open(
input_astream: av.audio.stream.AudioStream,
) -> tuple[
av.container.OutputContainer,
av.video.VideoStream,
av.audio.stream.AudioStream | None,
]:
"""Make a new av OutputContainer and add output streams."""
add_audio = input_astream and input_astream.name in AUDIO_CODECS
container = av.open(
memory_file,
mode="w",
format=SEGMENT_CONTAINER_FORMAT,
@@ -93,19 +99,21 @@ class SegmentBuffer:
# Create a fragment every TARGET_PART_DURATION. The data from each fragment is stored in
# a "Part" that can be combined with the data from all the other "Part"s, plus an init
# section, to reconstitute the data in a "Segment".
# frag_duration is the threshold for determining part boundaries, and the dts of the last
# packet in the part should correspond to a duration that is smaller than this value.
# However, as the part duration includes the duration of the last frame, the part duration
# will be equal to or greater than this value.
# We previously scaled this number down by .85 to account for this while keeping within
# the 15% variance allowed in part duration. However, this did not work when inputs had
# an audio stream - sometimes the fragment would get cut on the audio packet, causing
# the durations to actually be to short.
# The current approach is to use this frag_duration for creating the media while
# adjusting the metadata duration to keep the durations in the metadata below the
# part_target_duration threshold.
# The LL-HLS spec allows for a fragment's duration to be within the range [0.85x,1.0x]
# of the part target duration. We use the frag_duration option to tell ffmpeg to try to
# cut the fragments when they reach frag_duration. However, the resulting fragments can
# have variability in their durations and can end up being too short or too long. With a
# video track with no audio, the discrete nature of frames means that the frame at the
# end of a fragment will sometimes extend slightly beyond the desired frag_duration.
# If there are two tracks, as in the case of a video feed with audio, there is an added
# wrinkle as the fragment cut seems to be done on the first track that crosses the desired
# threshold, and cutting on the audio track may also result in a shorter video fragment
# than desired.
# Given this, our approach is to give ffmpeg a frag_duration somewhere in the middle
# of the range, hoping that the parts stay pretty well bounded, and we adjust the part
# durations a bit in the hls metadata so that everything "looks" ok.
"frag_duration": str(
self._stream_settings.part_target_duration * 1e6
self._stream_settings.part_target_duration * 9e5
),
}
if self._stream_settings.ll_hls
@@ -113,6 +121,12 @@ class SegmentBuffer:
),
},
)
output_vstream = container.add_stream(template=input_vstream)
# Check if audio is requested
output_astream = None
if add_audio:
output_astream = container.add_stream(template=input_astream)
return container, output_vstream, output_astream
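The `9e5` multiplier above encodes the middle of the LL-HLS tolerance band: ffmpeg's `frag_duration` option takes microseconds, so `1e6` per second of target is 1.0x and `9e5` is 0.9x. A quick check of the arithmetic, assuming a hypothetical 1 s part target (the real value is a StreamSettings field):

```python
part_target_duration = 1.0  # seconds; illustrative value

# frag_duration is given to ffmpeg in microseconds.
frag_duration_us = part_target_duration * 9e5  # 0.9x of the target

# The LL-HLS spec allows fragment durations in [0.85x, 1.0x] of the target.
lower_bound_us = 0.85 * part_target_duration * 1e6
upper_bound_us = part_target_duration * 1e6
```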
def set_streams(
self,
@@ -128,26 +142,22 @@ class SegmentBuffer:
"""Initialize a new stream segment."""
# Keep track of the number of segments we've processed
self._sequence += 1
self._segment_start_dts = video_dts
self._part_start_dts = self._segment_start_dts = video_dts
self._segment = None
self._memory_file = BytesIO()
self._memory_file_pos = 0
self._av_output = self.make_new_av(
(
self._av_output,
self._output_video_stream,
self._output_audio_stream,
) = self.make_new_av(
memory_file=self._memory_file,
sequence=self._sequence,
input_vstream=self._input_video_stream,
)
self._output_video_stream = self._av_output.add_stream(
template=self._input_video_stream
input_astream=self._input_audio_stream,
)
if self._output_video_stream.name == "hevc":
self._output_video_stream.codec_tag = "hvc1"
# Check if audio is requested
self._output_audio_stream = None
if self._input_audio_stream and self._input_audio_stream.name in AUDIO_CODECS:
self._output_audio_stream = self._av_output.add_stream(
template=self._input_audio_stream
)
def mux_packet(self, packet: av.Packet) -> None:
"""Mux a packet to the appropriate output stream."""
@@ -186,13 +196,9 @@ class SegmentBuffer:
# Fetch the latest StreamOutputs, which may have changed since the
# worker started.
stream_outputs=self._outputs_callback().values(),
start_time=self._start_time
+ datetime.timedelta(
seconds=float(self._segment_start_dts * packet.time_base)
),
start_time=self._start_time,
)
self._memory_file_pos = self._memory_file.tell()
self._part_start_dts = self._segment_start_dts
else: # These are the ends of the part segments
self.flush(packet, last_part=False)
@@ -201,17 +207,23 @@ class SegmentBuffer:
If last_part is True, also close the segment, give it a duration,
and clean up the av_output and memory_file.
There are two different ways to enter this function, and when
last_part is True, packet has not yet been muxed, while when
last_part is False, the packet has already been muxed. However,
in both cases, packet is the next packet and is not included in
the Part.
This function writes the duration metadata for the Part and
for the Segment. However, as the fragmentation done by ffmpeg
may result in fragment durations which fall outside the
[0.85x,1.0x] tolerance band allowed by LL-HLS, we need to fudge
some durations a bit by reporting them as being within that
range.
Note that repeated adjustments may cause drift between the part
durations in the metadata and those in the media and result in
playback issues in some clients.
"""
# In some cases using the current packet's dts (which is the start
# dts of the next part) to calculate the part duration will result in a
# value which exceeds the part_target_duration. This can muck up the
# duration of both this part and the next part. An easy fix is to just
# use the current packet dts and cap it by the part target duration.
# The adjustment may cause a drift between this adjusted duration
# (used in the metadata) and the media duration, but the drift should be
# automatically corrected when the part duration cleanly divides the
# framerate.
current_dts = min(
# Part durations should not exceed the part target duration
adjusted_dts = min(
packet.dts,
self._part_start_dts
+ self._stream_settings.part_target_duration / packet.time_base,
@@ -220,29 +232,44 @@ class SegmentBuffer:
# Closing the av_output will write the remaining buffered data to the
# memory_file as a new moof/mdat.
self._av_output.close()
elif not self._part_has_keyframe:
# Parts which are not the last part or an independent part should
# not have durations below 0.85 of the part target duration.
adjusted_dts = max(
adjusted_dts,
self._part_start_dts
+ 0.85 * self._stream_settings.part_target_duration / packet.time_base,
)
assert self._segment
self._memory_file.seek(self._memory_file_pos)
self._hass.loop.call_soon_threadsafe(
self._segment.async_add_part,
Part(
duration=float((current_dts - self._part_start_dts) * packet.time_base),
duration=float(
(adjusted_dts - self._part_start_dts) * packet.time_base
),
has_keyframe=self._part_has_keyframe,
data=self._memory_file.read(),
),
float((current_dts - self._segment_start_dts) * packet.time_base)
(
segment_duration := float(
(adjusted_dts - self._segment_start_dts) * packet.time_base
)
)
if last_part
else 0,
)
if last_part:
# If we've written the last part, we can close the memory_file.
self._memory_file.close() # We don't need the BytesIO object anymore
self._start_time += datetime.timedelta(seconds=segment_duration)
# Reinitialize
self.reset(current_dts)
self.reset(packet.dts)
else:
# For the last part, these will get set again elsewhere so we can skip
# setting them here.
self._memory_file_pos = self._memory_file.tell()
self._part_start_dts = current_dts
self._part_start_dts = adjusted_dts
self._part_has_keyframe = False
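The clamping described in the `flush` docstring above reduces to pure arithmetic on dts values. A sketch (tick values and the 90 kHz time base are illustrative, not taken from the stream code):

```python
def clamp_part_end_dts(
    packet_dts: int, part_start_dts: int, target_ticks: int, has_keyframe: bool
) -> int:
    """Cap the reported part end at the target duration; for parts without
    a keyframe, also floor it at 0.85x the target (the LL-HLS band)."""
    # Part durations should not exceed the part target duration.
    adjusted = min(packet_dts, part_start_dts + target_ticks)
    if not has_keyframe:
        # Non-independent parts should not fall below 0.85x the target.
        adjusted = max(adjusted, part_start_dts + (85 * target_ticks) // 100)
    return adjusted

# Hypothetical 90 kHz stream: a 1 s part target spans 90_000 ticks.
too_long = clamp_part_end_dts(100_000, 0, 90_000, has_keyframe=True)
too_short = clamp_part_end_dts(50_000, 0, 90_000, has_keyframe=False)
```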
def discontinuity(self) -> None:
@@ -233,7 +233,7 @@ async def async_setup_entry( # noqa: C901
surveillance_station = api.surveillance_station
try:
async with async_timeout.timeout(10):
async with async_timeout.timeout(30):
await hass.async_add_executor_job(surveillance_station.update)
except SynologyDSMAPIErrorException as err:
raise UpdateFailed(f"Error communicating with API: {err}") from err
@@ -41,7 +41,7 @@ BATTERY_BINARY_SENSOR_TYPES: tuple[SystemBridgeBinarySensorEntityDescription, ..
key="battery_is_charging",
name="Battery Is Charging",
device_class=DEVICE_CLASS_BATTERY_CHARGING,
value=lambda bridge: bridge.information.updates.available,
value=lambda bridge: bridge.battery.isCharging,
),
)
@@ -38,7 +38,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
TotalConnectClient, username, password, usercodes
)
if not client.is_valid_credentials():
if not client.is_logged_in():
raise ConfigEntryAuthFailed("TotalConnect authentication failed")
coordinator = TotalConnectDataUpdateCoordinator(hass, client)
@@ -88,5 +88,3 @@ class TotalConnectDataUpdateCoordinator(DataUpdateCoordinator):
raise UpdateFailed(exception) from exception
except ValueError as exception:
raise UpdateFailed("Unknown state from TotalConnect") from exception
return True
@@ -40,7 +40,7 @@ class TotalConnectConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
TotalConnectClient, username, password, None
)
if client.is_valid_credentials():
if client.is_logged_in():
# username/password valid so show user locations
self.username = username
self.password = password
@@ -136,7 +136,7 @@ class TotalConnectConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
self.usercodes,
)
if not client.is_valid_credentials():
if not client.is_logged_in():
errors["base"] = "invalid_auth"
return self.async_show_form(
step_id="reauth_confirm",
@@ -2,7 +2,7 @@
"domain": "totalconnect",
"name": "Total Connect",
"documentation": "https://www.home-assistant.io/integrations/totalconnect",
"requirements": ["total_connect_client==2021.8.3"],
"requirements": ["total_connect_client==2021.11.2"],
"dependencies": [],
"codeowners": ["@austinmroczek"],
"config_flow": true,
@@ -60,7 +60,6 @@ class TradfriBaseClass(Entity):
"""Initialize a device."""
self._api = handle_error(api)
self._attr_name = device.name
self._attr_available = device.reachable
self._device: Device = device
self._device_control: BlindControl | LightControl | SocketControl | SignalRepeaterControl | AirPurifierControl | None = (
None
@@ -105,7 +104,6 @@ class TradfriBaseClass(Entity):
"""Refresh the device data."""
self._device = device
self._attr_name = device.name
self._attr_available = device.reachable
if write_ha:
self.async_write_ha_state()
@@ -116,6 +114,16 @@ class TradfriBaseDevice(TradfriBaseClass):
All devices should inherit from this class.
"""
def __init__(
self,
device: Device,
api: Callable[[Command | list[Command]], Any],
gateway_id: str,
) -> None:
"""Initialize a device."""
self._attr_available = device.reachable
super().__init__(device, api, gateway_id)
@property
def device_info(self) -> DeviceInfo:
"""Return the device info."""
@@ -128,3 +136,11 @@ class TradfriBaseDevice(TradfriBaseClass):
sw_version=info.firmware_version,
via_device=(DOMAIN, self._gateway_id),
)
def _refresh(self, device: Device, write_ha: bool = True) -> None:
"""Refresh the device data."""
# The base class _refresh cannot be used, because
# there are devices (group) that do not have .reachable
# so set _attr_available here and let the base class do the rest.
self._attr_available = device.reachable
super()._refresh(device, write_ha)
@@ -177,6 +177,13 @@ LIGHTS: dict[str, tuple[TuyaLightEntityDescription, ...]] = {
# Dimmer
# https://developer.tuya.com/en/docs/iot/tgq?id=Kaof8ke9il4k4
"tgq": (
TuyaLightEntityDescription(
key=DPCode.SWITCH_LED,
name="Light",
brightness=(DPCode.BRIGHT_VALUE_V2, DPCode.BRIGHT_VALUE),
brightness_max=DPCode.BRIGHTNESS_MAX_1,
brightness_min=DPCode.BRIGHTNESS_MIN_1,
),
TuyaLightEntityDescription(
key=DPCode.SWITCH_LED_1,
name="Light",
@@ -443,7 +450,29 @@ class TuyaLightEntity(TuyaEntity, LightEntity):
"""Turn on or control the light."""
commands = [{"code": self.entity_description.key, "value": True}]
if self._color_data_type and (
if self._color_temp_type and ATTR_COLOR_TEMP in kwargs:
if color_mode_dpcode := self.entity_description.color_mode:
commands += [
{
"code": color_mode_dpcode,
"value": WorkMode.WHITE,
},
]
commands += [
{
"code": self._color_temp_dpcode,
"value": round(
self._color_temp_type.remap_value_from(
kwargs[ATTR_COLOR_TEMP],
self.min_mireds,
self.max_mireds,
reverse=True,
)
),
},
]
elif self._color_data_type and (
ATTR_HS_COLOR in kwargs
or (ATTR_BRIGHTNESS in kwargs and self.color_mode == COLOR_MODE_HS)
):
@@ -486,29 +515,6 @@ class TuyaLightEntity(TuyaEntity, LightEntity):
},
]
elif ATTR_COLOR_TEMP in kwargs and self._color_temp_type:
if color_mode_dpcode := self.entity_description.color_mode:
commands += [
{
"code": color_mode_dpcode,
"value": WorkMode.WHITE,
},
]
commands += [
{
"code": self._color_temp_dpcode,
"value": round(
self._color_temp_type.remap_value_from(
kwargs[ATTR_COLOR_TEMP],
self.min_mireds,
self.max_mireds,
reverse=True,
)
),
},
]
if (
ATTR_BRIGHTNESS in kwargs
and self.color_mode != COLOR_MODE_HS
@@ -143,6 +143,22 @@ SELECTS: dict[str, tuple[SelectEntityDescription, ...]] = {
entity_category=ENTITY_CATEGORY_CONFIG,
),
),
# IoT Switch?
# Note: Undocumented
"tdq": (
SelectEntityDescription(
key=DPCode.RELAY_STATUS,
name="Power on Behavior",
device_class=DEVICE_CLASS_TUYA_RELAY_STATUS,
entity_category=ENTITY_CATEGORY_CONFIG,
),
SelectEntityDescription(
key=DPCode.LIGHT_MODE,
name="Indicator Light Mode",
device_class=DEVICE_CLASS_TUYA_LIGHT_MODE,
entity_category=ENTITY_CATEGORY_CONFIG,
),
),
# Dimmer Switch
# https://developer.tuya.com/en/docs/iot/categorytgkg?id=Kaiuz0ktx7m0o
"tgkg": (
@@ -177,6 +193,22 @@ SELECTS: dict[str, tuple[SelectEntityDescription, ...]] = {
entity_category=ENTITY_CATEGORY_CONFIG,
),
),
# Dimmer
# https://developer.tuya.com/en/docs/iot/tgq?id=Kaof8ke9il4k4
"tgq": (
SelectEntityDescription(
key=DPCode.LED_TYPE_1,
name="Light Source Type",
device_class=DEVICE_CLASS_TUYA_LED_TYPE,
entity_category=ENTITY_CATEGORY_CONFIG,
),
SelectEntityDescription(
key=DPCode.LED_TYPE_2,
name="Light 2 Source Type",
device_class=DEVICE_CLASS_TUYA_LED_TYPE,
entity_category=ENTITY_CATEGORY_CONFIG,
),
),
}
@@ -369,6 +369,31 @@ SWITCHES: dict[str, tuple[SwitchEntityDescription, ...]] = {
entity_category=ENTITY_CATEGORY_CONFIG,
),
),
# IoT Switch?
# Note: Undocumented
"tdq": (
SwitchEntityDescription(
key=DPCode.SWITCH_1,
name="Switch 1",
device_class=DEVICE_CLASS_OUTLET,
),
SwitchEntityDescription(
key=DPCode.SWITCH_2,
name="Switch 2",
device_class=DEVICE_CLASS_OUTLET,
),
SwitchEntityDescription(
key=DPCode.SWITCH_3,
name="Switch 3",
device_class=DEVICE_CLASS_OUTLET,
),
SwitchEntityDescription(
key=DPCode.CHILD_LOCK,
name="Child Lock",
icon="mdi:account-lock",
entity_category=ENTITY_CATEGORY_CONFIG,
),
),
# Solar Light
# https://developer.tuya.com/en/docs/iot/tynd?id=Kaof8j02e1t98
"tyndj": (
@@ -84,7 +84,7 @@ DERIVED_SENSORS: tuple[UpnpSensorEntityDescription, ...] = (
),
UpnpSensorEntityDescription(
key=BYTES_SENT,
unique_id="KiB/sent",
unique_id="KiB/sec_sent",
name=f"{DATA_RATE_KIBIBYTES_PER_SECOND} sent",
icon="mdi:server-network",
native_unit_of_measurement=DATA_RATE_KIBIBYTES_PER_SECOND,
@@ -100,7 +100,7 @@ DERIVED_SENSORS: tuple[UpnpSensorEntityDescription, ...] = (
),
UpnpSensorEntityDescription(
key=PACKETS_SENT,
unique_id="packets/sent",
unique_id="packets/sec_sent",
name=f"{DATA_RATE_PACKETS_PER_SECOND} sent",
icon="mdi:server-network",
native_unit_of_measurement=DATA_RATE_PACKETS_PER_SECOND,
@@ -2,7 +2,7 @@
"domain": "velbus",
"name": "Velbus",
"documentation": "https://www.home-assistant.io/integrations/velbus",
"requirements": ["velbus-aio==2021.10.7"],
"requirements": ["velbus-aio==2021.11.6"],
"config_flow": true,
"codeowners": ["@Cereal2nd", "@brefra"],
"iot_class": "local_push"
@@ -129,14 +129,14 @@ async def async_setup_platform(hass, config, async_add_entities, discovery_info=
all_devices.append(entity)
try:
_entities_from_descriptions(
await _entities_from_descriptions(
hass, name, all_devices, BURNER_SENSORS, api.burners
)
except PyViCareNotSupportedFeatureError:
_LOGGER.info("No burners found")
try:
_entities_from_descriptions(
await _entities_from_descriptions(
hass, name, all_devices, COMPRESSOR_SENSORS, api.compressors
)
except PyViCareNotSupportedFeatureError:
@@ -393,14 +393,14 @@ async def async_setup_platform(hass, config, async_add_entities, discovery_info=
all_devices.append(entity)
try:
_entities_from_descriptions(
await _entities_from_descriptions(
hass, name, all_devices, BURNER_SENSORS, api.burners
)
except PyViCareNotSupportedFeatureError:
_LOGGER.info("No burners found")
try:
_entities_from_descriptions(
await _entities_from_descriptions(
hass, name, all_devices, COMPRESSOR_SENSORS, api.compressors
)
except PyViCareNotSupportedFeatureError:
@@ -66,6 +66,8 @@ from .const import (
MODELS_PURIFIER_MIOT,
MODELS_SWITCH,
MODELS_VACUUM,
ROBOROCK_GENERIC,
ROCKROBO_GENERIC,
AuthException,
SetupException,
)
@@ -267,7 +269,7 @@ async def async_create_miio_device_and_coordinator(
hass: core.HomeAssistant, entry: config_entries.ConfigEntry
):
"""Set up a data coordinator and one miio device to service multiple entities."""
model = entry.data[CONF_MODEL]
model: str = entry.data[CONF_MODEL]
host = entry.data[CONF_HOST]
token = entry.data[CONF_TOKEN]
name = entry.title
@@ -280,6 +282,8 @@ async def async_create_miio_device_and_coordinator(
model not in MODELS_HUMIDIFIER
and model not in MODELS_FAN
and model not in MODELS_VACUUM
and not model.startswith(ROBOROCK_GENERIC)
and not model.startswith(ROCKROBO_GENERIC)
):
return
@@ -304,7 +308,11 @@ async def async_create_miio_device_and_coordinator(
device = AirPurifier(host, token)
elif model.startswith("zhimi.airfresh."):
device = AirFresh(host, token)
elif model in MODELS_VACUUM:
elif (
model in MODELS_VACUUM
or model.startswith(ROBOROCK_GENERIC)
or model.startswith(ROCKROBO_GENERIC)
):
device = Vacuum(host, token)
update_method = _async_update_data_vacuum
coordinator_class = DataUpdateCoordinator[VacuumCoordinatorData]
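The widened vacuum check above reduces to an exact-match-or-prefix test. A sketch with a trimmed-down model list (the two generic prefixes are the constants from the diff; the model strings are examples):

```python
ROBOROCK_GENERIC = "roborock.vacuum"
ROCKROBO_GENERIC = "rockrobo.vacuum"
MODELS_VACUUM = ["roborock.vacuum.s4", "rockrobo.vacuum.v1"]  # trimmed list

def is_supported_vacuum(model: str) -> bool:
    """Exact known models still match, and any model under the two
    generic prefixes now matches as well."""
    return (
        model in MODELS_VACUUM
        or model.startswith(ROBOROCK_GENERIC)
        or model.startswith(ROCKROBO_GENERIC)
    )
```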
@@ -2,6 +2,7 @@
from __future__ import annotations
from dataclasses import dataclass
import logging
from typing import Callable
from homeassistant.components.binary_sensor import (
@@ -12,6 +13,7 @@ from homeassistant.components.binary_sensor import (
BinarySensorEntityDescription,
)
from homeassistant.const import ENTITY_CATEGORY_DIAGNOSTIC
from homeassistant.core import callback
from . import VacuumCoordinatorDataAttributes
from .const import (
@@ -27,9 +29,12 @@ from .const import (
MODELS_HUMIDIFIER_MJJSQ,
MODELS_VACUUM,
MODELS_VACUUM_WITH_MOP,
MODELS_VACUUM_WITH_SEPARATE_MOP,
)
from .device import XiaomiCoordinatedMiioEntity
_LOGGER = logging.getLogger(__name__)
ATTR_NO_WATER = "no_water"
ATTR_POWERSUPPLY_ATTACHED = "powersupply_attached"
ATTR_WATER_TANK_DETACHED = "water_tank_detached"
@@ -73,7 +78,7 @@ FAN_ZA5_BINARY_SENSORS = (ATTR_POWERSUPPLY_ATTACHED,)
VACUUM_SENSORS = {
ATTR_MOP_ATTACHED: XiaomiMiioBinarySensorDescription(
key=ATTR_MOP_ATTACHED,
key=ATTR_WATER_BOX_ATTACHED,
name="Mop Attached",
icon="mdi:square-rounded",
parent_key=VacuumCoordinatorDataAttributes.status,
@@ -101,6 +106,19 @@ VACUUM_SENSORS = {
),
}
VACUUM_SENSORS_SEPARATE_MOP = {
**VACUUM_SENSORS,
ATTR_MOP_ATTACHED: XiaomiMiioBinarySensorDescription(
key=ATTR_MOP_ATTACHED,
name="Mop Attached",
icon="mdi:square-rounded",
parent_key=VacuumCoordinatorDataAttributes.status,
entity_registry_enabled_default=True,
device_class=DEVICE_CLASS_CONNECTIVITY,
entity_category=ENTITY_CATEGORY_DIAGNOSTIC,
),
}
HUMIDIFIER_MIIO_BINARY_SENSORS = (ATTR_WATER_TANK_DETACHED,)
HUMIDIFIER_MIOT_BINARY_SENSORS = (ATTR_WATER_TANK_DETACHED,)
HUMIDIFIER_MJJSQ_BINARY_SENSORS = (ATTR_NO_WATER, ATTR_WATER_TANK_DETACHED)
@@ -108,21 +126,33 @@ HUMIDIFIER_MJJSQ_BINARY_SENSORS = (ATTR_NO_WATER, ATTR_WATER_TANK_DETACHED)
def _setup_vacuum_sensors(hass, config_entry, async_add_entities):
"""Only vacuums with mop should have binary sensor registered."""
if config_entry.data[CONF_MODEL] not in MODELS_VACUUM_WITH_MOP:
return
device = hass.data[DOMAIN][config_entry.entry_id].get(KEY_DEVICE)
coordinator = hass.data[DOMAIN][config_entry.entry_id][KEY_COORDINATOR]
entities = []
sensors = VACUUM_SENSORS
for sensor, description in VACUUM_SENSORS.items():
if config_entry.data[CONF_MODEL] in MODELS_VACUUM_WITH_SEPARATE_MOP:
sensors = VACUUM_SENSORS_SEPARATE_MOP
for sensor, description in sensors.items():
parent_key_data = getattr(coordinator.data, description.parent_key)
if getattr(parent_key_data, description.key, None) is None:
_LOGGER.debug(
"It seems the %s does not support the %s as the initial value is None",
config_entry.data[CONF_MODEL],
description.key,
)
continue
entities.append(
XiaomiGenericBinarySensor(
f"{config_entry.title} {description.name}",
device,
config_entry,
f"{sensor}_{config_entry.unique_id}",
hass.data[DOMAIN][config_entry.entry_id][KEY_COORDINATOR],
coordinator,
description,
)
)
@@ -168,18 +198,26 @@ async def async_setup_entry(hass, config_entry, async_add_entities):
class XiaomiGenericBinarySensor(XiaomiCoordinatedMiioEntity, BinarySensorEntity):
"""Representation of a Xiaomi Humidifier binary sensor."""
entity_description: XiaomiMiioBinarySensorDescription
def __init__(self, name, device, entry, unique_id, coordinator, description):
"""Initialize the entity."""
super().__init__(name, device, entry, unique_id, coordinator)
self.entity_description: XiaomiMiioBinarySensorDescription = description
self.entity_description = description
self._attr_entity_registry_enabled_default = (
description.entity_registry_enabled_default
)
self._attr_is_on = self._determine_native_value()
@property
def is_on(self):
"""Return true if the binary sensor is on."""
@callback
def _handle_coordinator_update(self) -> None:
self._attr_is_on = self._determine_native_value()
super()._handle_coordinator_update()
def _determine_native_value(self):
"""Determine native value."""
if self.entity_description.parent_key is not None:
return self._extract_value_from_attribute(
getattr(self.coordinator.data, self.entity_description.parent_key),
@@ -197,20 +197,37 @@ MODELS_LIGHT = (
)
# TODO: use const from pythonmiio once new release with the constant has been published. # pylint: disable=fixme
ROCKROBO_S4 = "roborock.vacuum.s4"
ROCKROBO_S4_MAX = "roborock.vacuum.a19"
ROCKROBO_S5_MAX = "roborock.vacuum.s5e"
ROCKROBO_S6_PURE = "roborock.vacuum.a08"
ROCKROBO_E2 = "roborock.vacuum.e2"
ROBOROCK_GENERIC = "roborock.vacuum"
ROCKROBO_GENERIC = "rockrobo.vacuum"
MODELS_VACUUM = [
ROCKROBO_V1,
ROCKROBO_E2,
ROCKROBO_S4,
ROCKROBO_S4_MAX,
ROCKROBO_S5,
ROCKROBO_S5_MAX,
ROCKROBO_S6,
ROCKROBO_S6_MAXV,
ROCKROBO_S6_PURE,
ROCKROBO_S7,
ROBOROCK_GENERIC,
ROCKROBO_GENERIC,
]
MODELS_VACUUM_WITH_MOP = [
ROCKROBO_E2,
ROCKROBO_S5,
ROCKROBO_S5_MAX,
ROCKROBO_S6,
ROCKROBO_S6_MAXV,
ROCKROBO_S6_PURE,
ROCKROBO_S7,
]
MODELS_VACUUM_WITH_SEPARATE_MOP = [
ROCKROBO_S7,
]
@@ -166,25 +166,15 @@ class XiaomiCoordinatedMiioEntity(CoordinatorEntity):
return cls._parse_datetime_time(value)
if isinstance(value, datetime.datetime):
return cls._parse_datetime_datetime(value)
if isinstance(value, datetime.timedelta):
return cls._parse_time_delta(value)
if isinstance(value, float):
return value
if isinstance(value, int):
return value
_LOGGER.warning(
"Could not determine how to parse state value of type %s for state %s and attribute %s",
type(value),
type(state),
attribute,
)
if value is None:
_LOGGER.debug("Attribute %s is None, this is unexpected", attribute)
return value
@staticmethod
def _parse_time_delta(timedelta: datetime.timedelta) -> int:
return timedelta.seconds
return int(timedelta.total_seconds())
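The `_parse_time_delta` fix above matters because `timedelta.seconds` only returns the leftover seconds within the last partial day, silently dropping whole days. A minimal standalone illustration (not the Home Assistant code itself):

```python
import datetime

uptime = datetime.timedelta(days=2, hours=1, seconds=30)

# .seconds is only the sub-day remainder: 1 h 30 s -> 3630; the 2 days are lost
print(uptime.seconds)               # 3630

# total_seconds() covers the whole span; int() keeps the old return type
print(int(uptime.total_seconds()))  # 176430
```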
@staticmethod
def _parse_datetime_time(time: datetime.time) -> str:
@@ -200,7 +190,3 @@ class XiaomiCoordinatedMiioEntity(CoordinatorEntity):
@staticmethod
def _parse_datetime_datetime(time: datetime.datetime) -> str:
return time.isoformat()
@staticmethod
def _parse_datetime_timedelta(time: datetime.timedelta) -> int:
return time.seconds
@@ -1,7 +1,6 @@
"""Support for Xiaomi Mi Air Purifier and Xiaomi Mi Air Humidifier."""
from abc import abstractmethod
import asyncio
from enum import Enum
import logging
import math
@@ -363,14 +362,6 @@ class XiaomiGenericAirPurifier(XiaomiGenericDevice):
return None
@staticmethod
def _extract_value_from_attribute(state, attribute):
value = getattr(state, attribute)
if isinstance(value, Enum):
return value.value
return value
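The `_extract_value_from_attribute` helper deleted here (and again in the humidifier, number, select, and switch platforms below) unwraps `Enum` members to their plain `.value` so the state machine never stores enum objects; it now lives only on the shared base entity. A self-contained sketch of the behavior, with illustrative class names:

```python
from enum import Enum

class LedBrightness(Enum):
    BRIGHT = 0
    DIM = 1
    OFF = 2

class State:
    """Stand-in for a miio device status object."""
    led_brightness = LedBrightness.DIM
    mode = "auto"

def extract_value_from_attribute(state, attribute):
    """Return the raw attribute value, unwrapping Enum members."""
    value = getattr(state, attribute)
    if isinstance(value, Enum):
        return value.value
    return value

print(extract_value_from_attribute(State, "led_brightness"))  # 1
print(extract_value_from_attribute(State, "mode"))            # auto
```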
@callback
def _handle_coordinator_update(self):
"""Fetch state from the device."""
@@ -1,5 +1,4 @@
"""Support for Xiaomi Mi Air Purifier and Xiaomi Mi Air Humidifier with humidifier entity."""
from enum import Enum
import logging
import math
@@ -124,14 +123,6 @@ class XiaomiGenericHumidifier(XiaomiCoordinatedMiioEntity, HumidifierEntity):
"""Return true if device is on."""
return self._state
@staticmethod
def _extract_value_from_attribute(state, attribute):
value = getattr(state, attribute)
if isinstance(value, Enum):
return value.value
return value
@property
def mode(self):
"""Get the current mode."""
@@ -2,7 +2,6 @@
from __future__ import annotations
from dataclasses import dataclass
from enum import Enum
from homeassistant.components.number import NumberEntity, NumberEntityDescription
from homeassistant.const import DEGREE, ENTITY_CATEGORY_CONFIG, TIME_MINUTES
@@ -285,14 +284,6 @@ class XiaomiNumberEntity(XiaomiCoordinatedMiioEntity, NumberEntity):
return False
return super().available
@staticmethod
def _extract_value_from_attribute(state, attribute):
value = getattr(state, attribute)
if isinstance(value, Enum):
return value.value
return value
async def async_set_value(self, value):
"""Set an option of the miio device."""
method = getattr(self, self.entity_description.method)
@@ -2,7 +2,6 @@
from __future__ import annotations
from dataclasses import dataclass
from enum import Enum
from miio.airfresh import LedBrightness as AirfreshLedBrightness
from miio.airhumidifier import LedBrightness as AirhumidifierLedBrightness
@@ -126,14 +125,6 @@ class XiaomiSelector(XiaomiCoordinatedMiioEntity, SelectEntity):
self._attr_options = list(description.options)
self.entity_description = description
@staticmethod
def _extract_value_from_attribute(state, attribute):
value = getattr(state, attribute)
if isinstance(value, Enum):
return value.value
return value
class XiaomiAirHumidifierSelector(XiaomiSelector):
"""Representation of a Xiaomi Air Humidifier selector."""
@@ -153,7 +144,7 @@ class XiaomiAirHumidifierSelector(XiaomiSelector):
)
# Sometimes (quite rarely) the device returns None as the LED brightness so we
# check that the value is not None before updating the state.
if led_brightness:
if led_brightness is not None:
self._current_led_brightness = led_brightness
self.async_write_ha_state()
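The `is not None` change above fixes a classic truthiness pitfall: a legitimate LED brightness of `0` is falsy, so `if led_brightness:` would silently drop updates to the lowest level, while the intent was only to skip the rare `None` returned by the device. Sketched in isolation:

```python
def update_led(led_brightness, current):
    # Buggy guard: a valid value of 0 is falsy and gets skipped
    if led_brightness:
        current = led_brightness
    return current

def update_led_fixed(led_brightness, current):
    # Correct guard: only ignores the device returning None
    if led_brightness is not None:
        current = led_brightness
    return current

print(update_led(0, 2))          # 2 -- update lost
print(update_led_fixed(0, 2))    # 0 -- update applied
print(update_led_fixed(None, 2)) # 2 -- None still ignored
```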
@@ -48,6 +48,7 @@ from homeassistant.const import (
TIME_SECONDS,
VOLUME_CUBIC_METERS,
)
from homeassistant.core import callback
from . import VacuumCoordinatorDataAttributes
from .const import (
@@ -80,6 +81,8 @@ from .const import (
MODELS_PURIFIER_MIIO,
MODELS_PURIFIER_MIOT,
MODELS_VACUUM,
ROBOROCK_GENERIC,
ROCKROBO_GENERIC,
)
from .device import XiaomiCoordinatedMiioEntity, XiaomiMiioEntity
from .gateway import XiaomiGatewayDevice
@@ -298,7 +301,7 @@ HUMIDIFIER_MIOT_SENSORS = (
ATTR_USE_TIME,
ATTR_WATER_LEVEL,
)
HUMIDIFIER_MJJSQ_SENSORS = (ATTR_HUMIDITY, ATTR_TEMPERATURE, ATTR_USE_TIME)
HUMIDIFIER_MJJSQ_SENSORS = (ATTR_HUMIDITY, ATTR_TEMPERATURE)
PURIFIER_MIIO_SENSORS = (
ATTR_FILTER_LIFE_REMAINING,
@@ -373,7 +376,6 @@ AIRFRESH_SENSORS = (
ATTR_FILTER_LIFE_REMAINING,
ATTR_FILTER_USE,
ATTR_HUMIDITY,
ATTR_ILLUMINANCE_LUX,
ATTR_PM25,
ATTR_TEMPERATURE,
ATTR_USE_TIME,
@@ -529,17 +531,27 @@ VACUUM_SENSORS = {
def _setup_vacuum_sensors(hass, config_entry, async_add_entities):
"""Set up the Xiaomi vacuum sensors."""
device = hass.data[DOMAIN][config_entry.entry_id].get(KEY_DEVICE)
coordinator = hass.data[DOMAIN][config_entry.entry_id][KEY_COORDINATOR]
entities = []
for sensor, description in VACUUM_SENSORS.items():
parent_key_data = getattr(coordinator.data, description.parent_key)
if getattr(parent_key_data, description.key, None) is None:
_LOGGER.debug(
"It seems the %s does not support the %s as the initial value is None",
config_entry.data[CONF_MODEL],
description.key,
)
continue
entities.append(
XiaomiGenericSensor(
f"{config_entry.title} {description.name}",
device,
config_entry,
f"{sensor}_{config_entry.unique_id}",
hass.data[DOMAIN][config_entry.entry_id][KEY_COORDINATOR],
coordinator,
description,
)
)
@@ -582,7 +594,7 @@ async def async_setup_entry(hass, config_entry, async_add_entities):
elif config_entry.data[CONF_FLOW_TYPE] == CONF_DEVICE:
host = config_entry.data[CONF_HOST]
token = config_entry.data[CONF_TOKEN]
model = config_entry.data[CONF_MODEL]
model: str = config_entry.data[CONF_MODEL]
if model in (MODEL_FAN_ZA1, MODEL_FAN_ZA3, MODEL_FAN_ZA4, MODEL_FAN_P5):
return
@@ -614,7 +626,11 @@ async def async_setup_entry(hass, config_entry, async_add_entities):
sensors = PURIFIER_MIIO_SENSORS
elif model in MODELS_PURIFIER_MIOT:
sensors = PURIFIER_MIOT_SENSORS
elif model in MODELS_VACUUM:
elif (
model in MODELS_VACUUM
or model.startswith(ROBOROCK_GENERIC)
or model.startswith(ROCKROBO_GENERIC)
):
return _setup_vacuum_sensors(hass, config_entry, async_add_entities)
for sensor, description in SENSOR_TYPES.items():
@@ -637,23 +653,41 @@ async def async_setup_entry(hass, config_entry, async_add_entities):
class XiaomiGenericSensor(XiaomiCoordinatedMiioEntity, SensorEntity):
"""Representation of a Xiaomi generic sensor."""
def __init__(
self,
name,
device,
entry,
unique_id,
coordinator,
description: XiaomiMiioSensorDescription,
):
entity_description: XiaomiMiioSensorDescription
def __init__(self, name, device, entry, unique_id, coordinator, description):
"""Initialize the entity."""
super().__init__(name, device, entry, unique_id, coordinator)
self.entity_description = description
self._attr_unique_id = unique_id
self.entity_description: XiaomiMiioSensorDescription = description
self._attr_native_value = self._determine_native_value()
self._attr_extra_state_attributes = self._extract_attributes(coordinator.data)
@property
def native_value(self):
"""Return the state of the device."""
@callback
def _extract_attributes(self, data):
"""Return state attributes with valid values."""
return {
attr: value
for attr in self.entity_description.attributes
if hasattr(data, attr)
and (value := self._extract_value_from_attribute(data, attr)) is not None
}
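The new `_extract_attributes` uses an assignment expression (`:=`, Python 3.8+) so each attribute is extracted once and filtered out in the same comprehension when it is `None`. The same pattern in isolation, with a stand-in data object:

```python
class Data:
    """Stand-in for coordinator.data."""
    filter_life = 85
    water_level = None
    humidity = 45

attributes = ("filter_life", "water_level", "humidity", "missing")

extracted = {
    attr: value
    for attr in attributes
    if hasattr(Data, attr)
    # walrus: bind the value once, skip the pair when it is None
    and (value := getattr(Data, attr)) is not None
}
print(extracted)  # {'filter_life': 85, 'humidity': 45}
```

`water_level` is dropped because it is `None`, and `missing` because `hasattr` fails; both checks happen in a single pass.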
@callback
def _handle_coordinator_update(self):
"""Fetch state from the device."""
native_value = self._determine_native_value()
# Sometimes (quite rarely) the device returns None as the sensor value so we
# check that the value is not None before updating the state.
if native_value is not None:
self._attr_native_value = native_value
self._attr_extra_state_attributes = self._extract_attributes(
self.coordinator.data
)
self.async_write_ha_state()
def _determine_native_value(self):
"""Determine native value."""
if self.entity_description.parent_key is not None:
return self._extract_value_from_attribute(
getattr(self.coordinator.data, self.entity_description.parent_key),
@@ -664,15 +698,6 @@ class XiaomiGenericSensor(XiaomiCoordinatedMiioEntity, SensorEntity):
self.coordinator.data, self.entity_description.key
)
@property
def extra_state_attributes(self):
"""Return the state attributes."""
return {
attr: self._extract_value_from_attribute(self.coordinator.data, attr)
for attr in self.entity_description.attributes
if hasattr(self.coordinator.data, attr)
}
class XiaomiAirQualityMonitor(XiaomiMiioEntity, SensorEntity):
"""Representation of a Xiaomi Air Quality Monitor."""
@@ -3,7 +3,6 @@ from __future__ import annotations
import asyncio
from dataclasses import dataclass
from enum import Enum
from functools import partial
import logging
@@ -474,14 +473,6 @@ class XiaomiGenericCoordinatedSwitch(XiaomiCoordinatedMiioEntity, SwitchEntity):
return False
return super().available
@staticmethod
def _extract_value_from_attribute(state, attribute):
value = getattr(state, attribute)
if isinstance(value, Enum):
return value.value
return value
async def async_turn_on(self, **kwargs) -> None:
"""Turn on an option of the miio device."""
method = getattr(self, self.entity_description.method_on)
@@ -35,6 +35,20 @@ _LOGGER = logging.getLogger(__name__)
STATE_CHANGE_TIME = 0.40 # seconds
POWER_STATE_CHANGE_TIME = 1 # seconds
#
# These models do not transition correctly when turning on, and
# yeelight is no longer updating the firmware on older devices
#
# https://github.com/home-assistant/core/issues/58315
#
# The problem can be worked around by always setting the brightness
# even when the bulb is reporting the brightness is already at the
# desired level.
#
MODELS_WITH_DELAYED_ON_TRANSITION = {
"color", # YLDP02YL
}
DOMAIN = "yeelight"
DATA_YEELIGHT = DOMAIN
DATA_UPDATED = "yeelight_{}_data_updated"
@@ -63,6 +63,7 @@ from . import (
DATA_DEVICE,
DATA_UPDATED,
DOMAIN,
MODELS_WITH_DELAYED_ON_TRANSITION,
POWER_STATE_CHANGE_TIME,
YEELIGHT_FLOW_TRANSITION_SCHEMA,
YeelightEntity,
@@ -180,20 +181,20 @@ SERVICE_SCHEMA_START_FLOW = YEELIGHT_FLOW_TRANSITION_SCHEMA
SERVICE_SCHEMA_SET_COLOR_SCENE = {
vol.Required(ATTR_RGB_COLOR): vol.All(
vol.ExactSequence((cv.byte, cv.byte, cv.byte)), vol.Coerce(tuple)
vol.Coerce(tuple), vol.ExactSequence((cv.byte, cv.byte, cv.byte))
),
vol.Required(ATTR_BRIGHTNESS): VALID_BRIGHTNESS,
}
SERVICE_SCHEMA_SET_HSV_SCENE = {
vol.Required(ATTR_HS_COLOR): vol.All(
vol.Coerce(tuple),
vol.ExactSequence(
(
vol.All(vol.Coerce(float), vol.Range(min=0, max=359)),
vol.All(vol.Coerce(float), vol.Range(min=0, max=100)),
)
),
vol.Coerce(tuple),
),
vol.Required(ATTR_BRIGHTNESS): VALID_BRIGHTNESS,
}
@@ -614,7 +615,10 @@ class YeelightGenericLight(YeelightEntity, LightEntity):
"""Set bulb brightness."""
if not brightness:
return
if math.floor(self.brightness) == math.floor(brightness):
if (
math.floor(self.brightness) == math.floor(brightness)
and self._bulb.model not in MODELS_WITH_DELAYED_ON_TRANSITION
):
_LOGGER.debug("brightness already set to: %s", brightness)
# Already set, and since we get pushed updates
# we avoid setting it again to ensure we do not
@@ -2,7 +2,7 @@
"domain": "zeroconf",
"name": "Zero-configuration networking (zeroconf)",
"documentation": "https://www.home-assistant.io/integrations/zeroconf",
"requirements": ["zeroconf==0.36.9"],
"requirements": ["zeroconf==0.36.11"],
"dependencies": ["network", "api"],
"codeowners": ["@bdraco"],
"quality_scale": "internal",
@@ -226,6 +226,22 @@ class Battery(Sensor):
_unit = PERCENTAGE
_attr_entity_category = ENTITY_CATEGORY_DIAGNOSTIC
@classmethod
def create_entity(
cls,
unique_id: str,
zha_device: ZhaDeviceType,
channels: list[ChannelType],
**kwargs,
) -> ZhaEntity | None:
"""Entity Factory.
Unlike any other entity, PowerConfiguration cluster may not support
battery_percent_remaining attribute, but zha-device-handlers takes care of it
so create the entity regardless
"""
return cls(unique_id, zha_device, channels, **kwargs)
@staticmethod
def formatter(value: int) -> int:
"""Return the state of the entity."""
@@ -5,7 +5,7 @@ from typing import Final
MAJOR_VERSION: Final = 2021
MINOR_VERSION: Final = 11
PATCH_VERSION: Final = "0b2"
PATCH_VERSION: Final = "2"
__short_version__: Final = f"{MAJOR_VERSION}.{MINOR_VERSION}"
__version__: Final = f"{__short_version__}.{PATCH_VERSION}"
REQUIRED_PYTHON_VER: Final[tuple[int, int, int]] = (3, 8, 0)
@@ -95,18 +95,6 @@ SSDP = {
{
"deviceType": "urn:schemas-upnp-org:device:MediaRenderer:3",
"st": "urn:schemas-upnp-org:device:MediaRenderer:3"
},
{
"deviceType": "urn:schemas-upnp-org:device:MediaRenderer:1",
"nt": "urn:schemas-upnp-org:device:MediaRenderer:1"
},
{
"deviceType": "urn:schemas-upnp-org:device:MediaRenderer:2",
"nt": "urn:schemas-upnp-org:device:MediaRenderer:2"
},
{
"deviceType": "urn:schemas-upnp-org:device:MediaRenderer:3",
"nt": "urn:schemas-upnp-org:device:MediaRenderer:3"
}
],
"fritz": [
@@ -1444,6 +1444,7 @@ currency = vol.In(
"YER",
"ZAR",
"ZMK",
"ZMW",
"ZWL",
},
msg="invalid ISO 4217 formatted currency",
@@ -39,7 +39,6 @@ from homeassistant.core import CALLBACK_TYPE, Context, HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError, NoEntitySpecifiedError
from homeassistant.helpers import entity_registry as er
from homeassistant.helpers.entity_platform import EntityPlatform
from homeassistant.helpers.entity_registry import RegistryEntry
from homeassistant.helpers.event import Event, async_track_entity_registry_updated_event
from homeassistant.helpers.typing import StateType
from homeassistant.loader import bind_hass
@@ -233,7 +232,7 @@ class Entity(ABC):
parallel_updates: asyncio.Semaphore | None = None
# Entry in the entity registry
registry_entry: RegistryEntry | None = None
registry_entry: er.RegistryEntry | None = None
# Hold list for functions to call on remove.
_on_remove: list[CALLBACK_TYPE] | None = None
@@ -812,7 +811,7 @@ class Entity(ABC):
if data["action"] != "update":
return
ent_reg = await self.hass.helpers.entity_registry.async_get_registry()
ent_reg = er.async_get(self.hass)
old = self.registry_entry
self.registry_entry = ent_reg.async_get(data["entity_id"])
assert self.registry_entry is not None
@@ -243,21 +243,21 @@ class EntityRegistry:
unique_id: str,
*,
# To influence entity ID generation
suggested_object_id: str | None = None,
known_object_ids: Iterable[str] | None = None,
suggested_object_id: str | None = None,
# To disable an entity if it gets created
disabled_by: str | None = None,
# Data that we want entry to have
config_entry: ConfigEntry | None = None,
device_id: str | None = None,
area_id: str | None = None,
capabilities: Mapping[str, Any] | None = None,
supported_features: int | None = None,
config_entry: ConfigEntry | None = None,
device_class: str | None = None,
unit_of_measurement: str | None = None,
original_name: str | None = None,
original_icon: str | None = None,
device_id: str | None = None,
entity_category: str | None = None,
original_icon: str | None = None,
original_name: str | None = None,
supported_features: int | None = None,
unit_of_measurement: str | None = None,
) -> RegistryEntry:
"""Get entity. Create if it doesn't exist."""
config_entry_id = None
@@ -300,20 +300,20 @@ class EntityRegistry:
disabled_by = DISABLED_INTEGRATION
entity = RegistryEntry(
entity_id=entity_id,
config_entry_id=config_entry_id,
device_id=device_id,
area_id=area_id,
unique_id=unique_id,
platform=platform,
disabled_by=disabled_by,
capabilities=capabilities,
supported_features=supported_features or 0,
config_entry_id=config_entry_id,
device_class=device_class,
unit_of_measurement=unit_of_measurement,
original_name=original_name,
original_icon=original_icon,
device_id=device_id,
disabled_by=disabled_by,
entity_category=entity_category,
entity_id=entity_id,
original_icon=original_icon,
original_name=original_name,
platform=platform,
supported_features=supported_features or 0,
unique_id=unique_id,
unit_of_measurement=unit_of_measurement,
)
self._register_entry(entity)
_LOGGER.info("Registered new %s.%s entity: %s", domain, platform, entity_id)
@@ -383,24 +383,34 @@ class EntityRegistry:
self,
entity_id: str,
*,
name: str | None | UndefinedType = UNDEFINED,
icon: str | None | UndefinedType = UNDEFINED,
config_entry_id: str | None | UndefinedType = UNDEFINED,
area_id: str | None | UndefinedType = UNDEFINED,
config_entry_id: str | None | UndefinedType = UNDEFINED,
device_class: str | None | UndefinedType = UNDEFINED,
disabled_by: str | None | UndefinedType = UNDEFINED,
entity_category: str | None | UndefinedType = UNDEFINED,
icon: str | None | UndefinedType = UNDEFINED,
name: str | None | UndefinedType = UNDEFINED,
new_entity_id: str | UndefinedType = UNDEFINED,
new_unique_id: str | UndefinedType = UNDEFINED,
disabled_by: str | None | UndefinedType = UNDEFINED,
original_icon: str | None | UndefinedType = UNDEFINED,
original_name: str | None | UndefinedType = UNDEFINED,
unit_of_measurement: str | None | UndefinedType = UNDEFINED,
) -> RegistryEntry:
"""Update properties of an entity."""
return self._async_update_entity(
entity_id,
name=name,
icon=icon,
config_entry_id=config_entry_id,
area_id=area_id,
config_entry_id=config_entry_id,
device_class=device_class,
disabled_by=disabled_by,
entity_category=entity_category,
icon=icon,
name=name,
new_entity_id=new_entity_id,
new_unique_id=new_unique_id,
disabled_by=disabled_by,
original_icon=original_icon,
original_name=original_name,
unit_of_measurement=unit_of_measurement,
)
@callback
@@ -408,21 +418,21 @@ class EntityRegistry:
self,
entity_id: str,
*,
name: str | None | UndefinedType = UNDEFINED,
icon: str | None | UndefinedType = UNDEFINED,
config_entry_id: str | None | UndefinedType = UNDEFINED,
new_entity_id: str | UndefinedType = UNDEFINED,
device_id: str | None | UndefinedType = UNDEFINED,
area_id: str | None | UndefinedType = UNDEFINED,
new_unique_id: str | UndefinedType = UNDEFINED,
disabled_by: str | None | UndefinedType = UNDEFINED,
capabilities: Mapping[str, Any] | None | UndefinedType = UNDEFINED,
supported_features: int | UndefinedType = UNDEFINED,
config_entry_id: str | None | UndefinedType = UNDEFINED,
device_class: str | None | UndefinedType = UNDEFINED,
unit_of_measurement: str | None | UndefinedType = UNDEFINED,
original_name: str | None | UndefinedType = UNDEFINED,
original_icon: str | None | UndefinedType = UNDEFINED,
device_id: str | None | UndefinedType = UNDEFINED,
disabled_by: str | None | UndefinedType = UNDEFINED,
entity_category: str | None | UndefinedType = UNDEFINED,
icon: str | None | UndefinedType = UNDEFINED,
name: str | None | UndefinedType = UNDEFINED,
new_entity_id: str | UndefinedType = UNDEFINED,
new_unique_id: str | UndefinedType = UNDEFINED,
original_icon: str | None | UndefinedType = UNDEFINED,
original_name: str | None | UndefinedType = UNDEFINED,
supported_features: int | UndefinedType = UNDEFINED,
unit_of_measurement: str | None | UndefinedType = UNDEFINED,
) -> RegistryEntry:
"""Private facing update properties method."""
old = self.entities[entity_id]
@@ -15,11 +15,11 @@ ciso8601==2.2.0
cryptography==3.4.8
emoji==1.5.0
hass-nabucasa==0.50.0
home-assistant-frontend==20211028.0
home-assistant-frontend==20211108.0
httpx==0.19.0
ifaddr==0.1.7
jinja2==3.0.2
paho-mqtt==1.5.1
paho-mqtt==1.6.1
pillow==8.2.0
pip>=8.0.3,<20.3
pyserial==3.5
@@ -32,12 +32,12 @@ sqlalchemy==1.4.23
voluptuous-serialize==2.4.0
voluptuous==0.12.2
yarl==1.6.3
zeroconf==0.36.9
zeroconf==0.36.11
pycryptodome>=3.6.6
# Constrain urllib3 to ensure we deal with CVE-2019-11236 & CVE-2019-11324
urllib3>=1.24.3
# Constrain urllib3 to ensure we deal with CVE-2020-26137 and CVE-2021-33503
urllib3>=1.26.5
# Constrain H11 to ensure we get a new enough version to support non-rfc line endings
h11>=0.12.0
@@ -245,6 +245,16 @@ def _dst_offset_diff(dattim: dt.datetime) -> dt.timedelta:
return (dattim + delta).utcoffset() - (dattim - delta).utcoffset() # type: ignore[operator]
def _lower_bound(arr: list[int], cmp: int) -> int | None:
"""Return the first value in arr greater or equal to cmp.
Return None if no such value exists.
"""
if (left := bisect.bisect_left(arr, cmp)) == len(arr):
return None
return arr[left]
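`_lower_bound`, hoisted to module level in this change, is a thin wrapper over `bisect.bisect_left`: it returns the first element of a sorted list greater than or equal to the comparison value, or `None` when the value is past the end, which is what drives the roll-over logic in the matcher below. Standalone:

```python
import bisect

def lower_bound(arr, cmp):
    """First value in sorted arr >= cmp, else None."""
    if (left := bisect.bisect_left(arr, cmp)) == len(arr):
        return None
    return arr[left]

minutes = [0, 15, 30, 45]
print(lower_bound(minutes, 20))  # 30   (next matching minute)
print(lower_bound(minutes, 30))  # 30   (exact match counts)
print(lower_bound(minutes, 50))  # None (roll over to the next hour)
```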
def find_next_time_expression_time(
now: dt.datetime, # pylint: disable=redefined-outer-name
seconds: list[int],
@@ -263,89 +273,99 @@ def find_next_time_expression_time(
if not seconds or not minutes or not hours:
raise ValueError("Cannot find a next time: Time expression never matches!")
def _lower_bound(arr: list[int], cmp: int) -> int | None:
"""Return the first value in arr greater or equal to cmp.
while True:
# Reset microseconds and fold; fold (for ambiguous DST times) will be handled later
result = now.replace(microsecond=0, fold=0)
Return None if no such value exists.
"""
if (left := bisect.bisect_left(arr, cmp)) == len(arr):
return None
return arr[left]
# Match next second
if (next_second := _lower_bound(seconds, result.second)) is None:
# No second to match in this minute. Roll-over to next minute.
next_second = seconds[0]
result += dt.timedelta(minutes=1)
result = now.replace(microsecond=0)
result = result.replace(second=next_second)
# Match next second
if (next_second := _lower_bound(seconds, result.second)) is None:
# No second to match in this minute. Roll-over to next minute.
next_second = seconds[0]
result += dt.timedelta(minutes=1)
# Match next minute
next_minute = _lower_bound(minutes, result.minute)
if next_minute != result.minute:
# We're in the next minute. Seconds needs to be reset.
result = result.replace(second=seconds[0])
result = result.replace(second=next_second)
if next_minute is None:
# No minute to match in this hour. Roll-over to next hour.
next_minute = minutes[0]
result += dt.timedelta(hours=1)
# Match next minute
next_minute = _lower_bound(minutes, result.minute)
if next_minute != result.minute:
# We're in the next minute. Seconds needs to be reset.
result = result.replace(second=seconds[0])
result = result.replace(minute=next_minute)
if next_minute is None:
# No minute to match in this hour. Roll-over to next hour.
next_minute = minutes[0]
result += dt.timedelta(hours=1)
# Match next hour
next_hour = _lower_bound(hours, result.hour)
if next_hour != result.hour:
# We're in the next hour. Seconds+minutes needs to be reset.
result = result.replace(second=seconds[0], minute=minutes[0])
result = result.replace(minute=next_minute)
if next_hour is None:
# No minute to match in this day. Roll-over to next day.
next_hour = hours[0]
result += dt.timedelta(days=1)
# Match next hour
next_hour = _lower_bound(hours, result.hour)
if next_hour != result.hour:
# We're in the next hour. Seconds+minutes needs to be reset.
result = result.replace(second=seconds[0], minute=minutes[0])
result = result.replace(hour=next_hour)
if next_hour is None:
# No minute to match in this day. Roll-over to next day.
next_hour = hours[0]
result += dt.timedelta(days=1)
if result.tzinfo in (None, UTC):
# Using UTC, no DST checking needed
return result
result = result.replace(hour=next_hour)
if not _datetime_exists(result):
# When entering DST and clocks are turned forward.
# There are wall clock times that don't "exist" (an hour is skipped).
# -> trigger on the next time that 1. matches the pattern and 2. does exist
# for example:
# on 2021.03.28 02:00:00 in CET timezone clocks are turned forward an hour
# with pattern "02:30", don't run on 28 mar (such a wall time does not exist on this day)
# instead run at 02:30 the next day
# We solve this edge case by just iterating one second until the result exists
# (max. 3600 operations, which should be fine for an edge case that happens once a year)
now += dt.timedelta(seconds=1)
continue
now_is_ambiguous = _datetime_ambiguous(now)
result_is_ambiguous = _datetime_ambiguous(result)
# When leaving DST and clocks are turned backward.
# Then there are wall clock times that are ambiguous i.e. exist with DST and without DST
# The logic above does not take into account if a given pattern matches _twice_
# in a day.
# Example: on 2021.10.31 02:00:00 in CET timezone clocks are turned backward an hour
if now_is_ambiguous and result_is_ambiguous:
# `now` and `result` are both ambiguous, so the next match happens
# _within_ the current fold.
# Examples:
# 1. 2021.10.31 02:00:00+02:00 with pattern 02:30 -> 2021.10.31 02:30:00+02:00
# 2. 2021.10.31 02:00:00+01:00 with pattern 02:30 -> 2021.10.31 02:30:00+01:00
return result.replace(fold=now.fold)
if now_is_ambiguous and now.fold == 0 and not result_is_ambiguous:
# `now` is in the first fold, but result is not ambiguous (meaning it no longer matches
# within the fold).
# -> Check if result matches in the next fold. If so, emit that match
# Turn back the time by the DST offset, effectively run the algorithm on the first fold
# If it matches on the first fold, that means it will also match on the second one.
# Example: 2021.10.31 02:45:00+02:00 with pattern 02:30 -> 2021.10.31 02:30:00+01:00
check_result = find_next_time_expression_time(
now + _dst_offset_diff(now), seconds, minutes, hours
)
if _datetime_ambiguous(check_result):
return check_result.replace(fold=1)
if result.tzinfo in (None, UTC):
return result
if _datetime_ambiguous(result):
# This happens when we're leaving daylight saving time and local
# clocks are rolled back. In this case, we want to trigger
# on both the DST and non-DST time. So when "now" is in the DST
# use the DST-on time, and if not, use the DST-off time.
fold = 1 if now.dst() else 0
if result.fold != fold:
result = result.replace(fold=fold)
if not _datetime_exists(result):
# This happens when we're entering daylight saving time and local
# clocks are rolled forward, thus there are local times that do
# not exist. In this case, we want to trigger on the next time
# that *does* exist.
# In the worst case, this will run through all the seconds in the
# time shift, but that's max 3600 operations for once per year
return find_next_time_expression_time(
result + dt.timedelta(seconds=1), seconds, minutes, hours
)
# Another edge-case when leaving DST:
# When now is in DST and ambiguous *and* the next trigger time we *should*
# trigger is ambiguous and outside DST, the checks above won't catch it.
# For example: if triggering on 2:30 and now is 28.10.2018 2:30 (in DST)
# we should trigger next on 28.10.2018 2:30 (out of DST), but our
# algorithm above would produce 29.10.2018 2:30 (out of DST)
if _datetime_ambiguous(now):
check_result = find_next_time_expression_time(
now + _dst_offset_diff(now), seconds, minutes, hours
)
if _datetime_ambiguous(check_result):
return check_result
return result
def _datetime_exists(dattim: dt.datetime) -> bool:
"""Check if a datetime exists."""
@@ -173,7 +173,7 @@ aioftp==0.12.0
aiogithubapi==21.8.0
# homeassistant.components.guardian
aioguardian==1.0.8
aioguardian==2021.11.0
# homeassistant.components.harmony
aioharmony==0.2.8
@@ -234,7 +234,7 @@ aiopulse==0.4.2
aiopvapi==1.6.14
# homeassistant.components.pvpc_hourly_pricing
aiopvpc==2.2.0
aiopvpc==2.2.1
# homeassistant.components.webostv
aiopylgtv==0.4.0
@@ -243,7 +243,7 @@ aiopylgtv==0.4.0
aiorecollect==1.0.8
# homeassistant.components.shelly
aioshelly==1.0.2
aioshelly==1.0.4
# homeassistant.components.switcher_kis
aioswitcher==2.0.6
@@ -652,7 +652,7 @@ fjaraskupan==1.0.2
flipr-api==1.4.1
# homeassistant.components.flux_led
flux_led==0.24.13
flux_led==0.24.17
# homeassistant.components.homekit
fnvhash==0.1.0
@@ -813,7 +813,7 @@ hole==0.5.1
holidays==0.11.3.1
# homeassistant.components.frontend
home-assistant-frontend==20211028.0
home-assistant-frontend==20211108.0
# homeassistant.components.zwave
homeassistant-pyozw==0.1.10
@@ -1005,7 +1005,7 @@ micloud==0.4
miflora==0.7.0
# homeassistant.components.mill
millheater==0.7.3
millheater==0.7.4
# homeassistant.components.minio
minio==4.0.9
@@ -1035,7 +1035,7 @@ mychevy==2.1.1
mycroftapi==2.0
# homeassistant.components.nad
nad_receiver==0.2.0
nad_receiver==0.3.0
# homeassistant.components.keenetic_ndms2
ndms2_client==0.1.1
@@ -1163,7 +1163,7 @@ p1monitor==1.0.0
# homeassistant.components.mqtt
# homeassistant.components.shiftr
paho-mqtt==1.5.1
paho-mqtt==1.6.1
# homeassistant.components.panasonic_bluray
panacotta==0.1
@@ -1447,7 +1447,7 @@ pyeconet==0.1.14
pyedimax==0.2.1
# homeassistant.components.efergy
pyefergy==0.1.2
pyefergy==0.1.3
# homeassistant.components.eight_sleep
pyeight==0.1.9
@@ -1535,7 +1535,7 @@ pyialarm==1.9.0
pyicloud==0.10.2
# homeassistant.components.insteon
pyinsteon==1.0.12
pyinsteon==1.0.13
# homeassistant.components.intesishome
pyintesishome==1.7.6
@@ -2314,7 +2314,7 @@ todoist-python==8.0.0
toonapi==0.2.1
# homeassistant.components.totalconnect
total_connect_client==2021.8.3
total_connect_client==2021.11.2
# homeassistant.components.tplink_lte
tp-connected==0.0.4
@@ -2360,7 +2360,7 @@ uvcclient==0.11.0
vallox-websocket-api==2.8.1
# homeassistant.components.velbus
velbus-aio==2021.10.7
velbus-aio==2021.11.6
# homeassistant.components.venstar
venstarcolortouch==0.14
@@ -2465,7 +2465,7 @@ youtube_dl==2021.06.06
zengge==0.2
# homeassistant.components.zeroconf
zeroconf==0.36.9
zeroconf==0.36.11
# homeassistant.components.zha
zha-quirks==0.0.63
@@ -115,7 +115,7 @@ aioesphomeapi==10.2.0
aioflo==0.4.1
# homeassistant.components.guardian
aioguardian==1.0.8
aioguardian==2021.11.0
# homeassistant.components.harmony
aioharmony==0.2.8
@@ -161,7 +161,7 @@ aiopulse==0.4.2
aiopvapi==1.6.14
# homeassistant.components.pvpc_hourly_pricing
aiopvpc==2.2.0
aiopvpc==2.2.1
# homeassistant.components.webostv
aiopylgtv==0.4.0
@@ -170,7 +170,7 @@ aiopylgtv==0.4.0
aiorecollect==1.0.8
# homeassistant.components.shelly
aioshelly==1.0.2
aioshelly==1.0.4
# homeassistant.components.switcher_kis
aioswitcher==2.0.6
@@ -387,7 +387,7 @@ fjaraskupan==1.0.2
flipr-api==1.4.1
# homeassistant.components.flux_led
flux_led==0.24.13
flux_led==0.24.17
# homeassistant.components.homekit
fnvhash==0.1.0
@@ -500,7 +500,7 @@ hole==0.5.1
holidays==0.11.3.1
# homeassistant.components.frontend
home-assistant-frontend==20211028.0
home-assistant-frontend==20211108.0
# homeassistant.components.zwave
homeassistant-pyozw==0.1.10
@@ -600,7 +600,7 @@ mficlient==0.3.0
micloud==0.4
# homeassistant.components.mill
millheater==0.7.3
millheater==0.7.4
# homeassistant.components.minio
minio==4.0.9
@@ -689,7 +689,7 @@ p1monitor==1.0.0
# homeassistant.components.mqtt
# homeassistant.components.shiftr
paho-mqtt==1.5.1
paho-mqtt==1.6.1
# homeassistant.components.panasonic_viera
panasonic_viera==0.3.6
@@ -856,7 +856,7 @@ pydispatcher==2.0.5
pyeconet==0.1.14
# homeassistant.components.efergy
pyefergy==0.1.2
pyefergy==0.1.3
# homeassistant.components.everlights
pyeverlights==0.1.0
@@ -914,7 +914,7 @@ pyialarm==1.9.0
pyicloud==0.10.2
# homeassistant.components.insteon
pyinsteon==1.0.12
pyinsteon==1.0.13
# homeassistant.components.ipma
pyipma==2.0.5
@@ -1330,7 +1330,7 @@ tesla-powerwall==0.3.12
toonapi==0.2.1
# homeassistant.components.totalconnect
-total_connect_client==2021.8.3
+total_connect_client==2021.11.2
# homeassistant.components.transmission
transmissionrpc==0.11
@@ -1364,7 +1364,7 @@ url-normalize==1.4.1
uvcclient==0.11.0
# homeassistant.components.velbus
-velbus-aio==2021.10.7
+velbus-aio==2021.11.6
# homeassistant.components.venstar
venstarcolortouch==0.14
@@ -1430,7 +1430,7 @@ yeelight==0.7.8
youless-api==0.15
# homeassistant.components.zeroconf
-zeroconf==0.36.9
+zeroconf==0.36.11
# homeassistant.components.zha
zha-quirks==0.0.63
@@ -63,8 +63,8 @@ CONSTRAINT_PATH = os.path.join(
CONSTRAINT_BASE = """
pycryptodome>=3.6.6
-# Constrain urllib3 to ensure we deal with CVE-2019-11236 & CVE-2019-11324
-urllib3>=1.24.3
+# Constrain urllib3 to ensure we deal with CVE-2020-26137 and CVE-2021-33503
+urllib3>=1.26.5
# Constrain H11 to ensure we get a new enough version to support non-rfc line endings
h11>=0.12.0
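The `>=` pins in the constraint block above set a floor rather than an exact version. As a minimal standalone sketch (a hypothetical helper, not part of Home Assistant's tooling) of what such a floor constraint enforces:

```python
def satisfies_min(installed: str, minimum: str) -> bool:
    """Return True if a dotted numeric version meets a `>=` floor constraint."""
    to_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return to_tuple(installed) >= to_tuple(minimum)

# urllib3>=1.26.5 rejects the old 1.24.3 floor's versions but accepts newer ones.
print(satisfies_min("1.26.5", "1.26.5"))  # True
print(satisfies_min("1.24.3", "1.26.5"))  # False
```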
@@ -1,4 +1,5 @@
"""Test the Aurora ABB PowerOne Solar PV config flow."""
from datetime import timedelta
from logging import INFO
from unittest.mock import patch
@@ -12,7 +13,20 @@ from homeassistant.components.aurora_abb_powerone.const import (
ATTR_SERIAL_NUMBER,
DOMAIN,
)
from homeassistant.config_entries import ConfigEntryState
from homeassistant.const import CONF_ADDRESS, CONF_PORT
from homeassistant.util.dt import utcnow
from tests.common import async_fire_time_changed
def _simulated_returns(index, global_measure=None):
returns = {
3: 45.678, # power
21: 9.876, # temperature
5: 12345, # energy
}
return returns[index]
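Passing a plain function as `side_effect`, the way `_simulated_returns` is used throughout these tests, makes a mock compute its return value per call instead of returning a fixed value. A minimal standalone sketch of the pattern:

```python
from unittest.mock import MagicMock

def _simulated_returns(index, global_measure=None):
    # Map a measurement index to a fake reading, as in the test helper above.
    return {3: 45.678, 21: 9.876, 5: 12345}[index]

# The mock delegates each call to the function rather than a fixed return_value.
measure = MagicMock(side_effect=_simulated_returns)
print(measure(3))   # 45.678
print(measure(21))  # 9.876
```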
async def test_form(hass):
@@ -150,16 +164,161 @@ async def test_form_invalid_com_ports(hass):
# Tests below can be deleted after deprecation period is finished.
async def test_import(hass):
"""Test configuration.yaml import used during migration."""
TESTDATA = {"device": "/dev/ttyUSB7", "address": 3, "name": "MyAuroraPV"}
with patch(
"homeassistant.components.generic.camera.GenericCamera.async_camera_image",
return_value=None,
):
async def test_import_day(hass):
"""Test .yaml import when the inverter is able to communicate."""
TEST_DATA = {"device": "/dev/ttyUSB7", "address": 3, "name": "MyAuroraPV"}
with patch("aurorapy.client.AuroraSerialClient.connect", return_value=None,), patch(
"aurorapy.client.AuroraSerialClient.serial_number",
return_value="9876543",
), patch(
"aurorapy.client.AuroraSerialClient.version",
return_value="9.8.7.6",
), patch(
"aurorapy.client.AuroraSerialClient.pn",
return_value="A.B.C",
), patch(
"aurorapy.client.AuroraSerialClient.firmware",
return_value="1.234",
) as mock_setup_entry:
result = await hass.config_entries.flow.async_init(
-            DOMAIN, context={"source": config_entries.SOURCE_IMPORT}, data=TESTDATA
+            DOMAIN, context={"source": config_entries.SOURCE_IMPORT}, data=TEST_DATA
)
assert result["type"] == data_entry_flow.RESULT_TYPE_CREATE_ENTRY
assert result["data"][CONF_PORT] == "/dev/ttyUSB7"
assert result["data"][CONF_ADDRESS] == 3
assert len(hass.config_entries.async_entries(DOMAIN)) == 1
assert len(mock_setup_entry.mock_calls) == 1
async def test_import_night(hass):
"""Test .yaml import when the inverter is inaccessible (e.g. darkness)."""
TEST_DATA = {"device": "/dev/ttyUSB7", "address": 3, "name": "MyAuroraPV"}
# First time round, no response.
with patch(
"aurorapy.client.AuroraSerialClient.connect",
side_effect=AuroraError("No response after"),
) as mock_connect:
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_IMPORT}, data=TEST_DATA
)
configs = hass.config_entries.async_entries(DOMAIN)
assert len(configs) == 1
entry = configs[0]
assert not entry.unique_id
assert entry.state == ConfigEntryState.SETUP_RETRY
assert len(mock_connect.mock_calls) == 1
assert result["type"] == data_entry_flow.RESULT_TYPE_CREATE_ENTRY
assert result["data"][CONF_PORT] == "/dev/ttyUSB7"
assert result["data"][CONF_ADDRESS] == 3
# Second time round, talking this time.
with patch("aurorapy.client.AuroraSerialClient.connect", return_value=None,), patch(
"aurorapy.client.AuroraSerialClient.serial_number",
return_value="9876543",
), patch(
"aurorapy.client.AuroraSerialClient.version",
return_value="9.8.7.6",
), patch(
"aurorapy.client.AuroraSerialClient.pn",
return_value="A.B.C",
), patch(
"aurorapy.client.AuroraSerialClient.firmware",
return_value="1.234",
), patch(
"aurorapy.client.AuroraSerialClient.measure",
side_effect=_simulated_returns,
):
# Wait >5 seconds for the config to auto retry.
async_fire_time_changed(hass, utcnow() + timedelta(seconds=6))
await hass.async_block_till_done()
assert entry.state == ConfigEntryState.LOADED
assert entry.unique_id
assert len(mock_connect.mock_calls) == 1
assert hass.states.get("sensor.power_output").state == "45.7"
assert len(hass.config_entries.async_entries(DOMAIN)) == 1
async def test_import_night_then_user(hass):
"""Attempt yaml import and fail (dark), but user sets up manually before auto retry."""
TEST_DATA = {"device": "/dev/ttyUSB7", "address": 3, "name": "MyAuroraPV"}
# First time round, no response.
with patch(
"aurorapy.client.AuroraSerialClient.connect",
side_effect=AuroraError("No response after"),
) as mock_connect:
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_IMPORT}, data=TEST_DATA
)
configs = hass.config_entries.async_entries(DOMAIN)
assert len(configs) == 1
entry = configs[0]
assert not entry.unique_id
assert entry.state == ConfigEntryState.SETUP_RETRY
assert len(mock_connect.mock_calls) == 1
assert result["type"] == data_entry_flow.RESULT_TYPE_CREATE_ENTRY
assert result["data"][CONF_PORT] == "/dev/ttyUSB7"
assert result["data"][CONF_ADDRESS] == 3
# Failed once, now simulate the user initiating config flow with valid settings.
fakecomports = []
fakecomports.append(list_ports_common.ListPortInfo("/dev/ttyUSB7"))
with patch(
"serial.tools.list_ports.comports",
return_value=fakecomports,
):
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)
assert result["type"] == "form"
assert result["errors"] == {}
with patch("aurorapy.client.AuroraSerialClient.connect", return_value=None,), patch(
"aurorapy.client.AuroraSerialClient.serial_number",
return_value="9876543",
), patch(
"aurorapy.client.AuroraSerialClient.version",
return_value="9.8.7.6",
), patch(
"aurorapy.client.AuroraSerialClient.pn",
return_value="A.B.C",
), patch(
"aurorapy.client.AuroraSerialClient.firmware",
return_value="1.234",
):
result2 = await hass.config_entries.flow.async_configure(
result["flow_id"],
{CONF_PORT: "/dev/ttyUSB7", CONF_ADDRESS: 7},
)
assert result2["type"] == data_entry_flow.RESULT_TYPE_CREATE_ENTRY
assert len(hass.config_entries.async_entries(DOMAIN)) == 2
# Now retry yaml - it should fail with duplicate
with patch("aurorapy.client.AuroraSerialClient.connect", return_value=None,), patch(
"aurorapy.client.AuroraSerialClient.serial_number",
return_value="9876543",
), patch(
"aurorapy.client.AuroraSerialClient.version",
return_value="9.8.7.6",
), patch(
"aurorapy.client.AuroraSerialClient.pn",
return_value="A.B.C",
), patch(
"aurorapy.client.AuroraSerialClient.firmware",
return_value="1.234",
):
# Wait >5 seconds for the config to auto retry.
async_fire_time_changed(hass, utcnow() + timedelta(seconds=6))
await hass.async_block_till_done()
assert entry.state == ConfigEntryState.NOT_LOADED
assert len(hass.config_entries.async_entries(DOMAIN)) == 1
@@ -19,6 +19,18 @@ async def test_unload_entry(hass):
with patch("aurorapy.client.AuroraSerialClient.connect", return_value=None), patch(
"homeassistant.components.aurora_abb_powerone.sensor.AuroraSensor.update",
return_value=None,
), patch(
"aurorapy.client.AuroraSerialClient.serial_number",
return_value="9876543",
), patch(
"aurorapy.client.AuroraSerialClient.version",
return_value="9.8.7.6",
), patch(
"aurorapy.client.AuroraSerialClient.pn",
return_value="A.B.C",
), patch(
"aurorapy.client.AuroraSerialClient.firmware",
return_value="1.234",
):
mock_entry = MockConfigEntry(
domain=DOMAIN,
@@ -3,7 +3,6 @@ from datetime import timedelta
from unittest.mock import patch
from aurorapy.client import AuroraError
import pytest
from homeassistant.components.aurora_abb_powerone.const import (
ATTR_DEVICE_NAME,
@@ -13,10 +12,8 @@ from homeassistant.components.aurora_abb_powerone.const import (
DEFAULT_INTEGRATION_TITLE,
DOMAIN,
)
from homeassistant.components.aurora_abb_powerone.sensor import AuroraSensor
from homeassistant.config_entries import SOURCE_IMPORT
from homeassistant.const import CONF_ADDRESS, CONF_PORT
from homeassistant.exceptions import InvalidStateError
from homeassistant.setup import async_setup_component
import homeassistant.util.dt as dt_util
@@ -39,6 +36,7 @@ def _simulated_returns(index, global_measure=None):
returns = {
3: 45.678, # power
21: 9.876, # temperature
5: 12345, # energy
}
return returns[index]
@@ -66,7 +64,24 @@ async def test_setup_platform_valid_config(hass):
with patch("aurorapy.client.AuroraSerialClient.connect", return_value=None), patch(
"aurorapy.client.AuroraSerialClient.measure",
side_effect=_simulated_returns,
), assert_setup_component(1, "sensor"):
), patch(
"aurorapy.client.AuroraSerialClient.serial_number",
return_value="9876543",
), patch(
"aurorapy.client.AuroraSerialClient.version",
return_value="9.8.7.6",
), patch(
"aurorapy.client.AuroraSerialClient.pn",
return_value="A.B.C",
), patch(
"aurorapy.client.AuroraSerialClient.firmware",
return_value="1.234",
), patch(
"aurorapy.client.AuroraSerialClient.cumulated_energy",
side_effect=_simulated_returns,
), assert_setup_component(
1, "sensor"
):
assert await async_setup_component(hass, "sensor", TEST_CONFIG)
await hass.async_block_till_done()
power = hass.states.get("sensor.power_output")
@@ -91,6 +106,21 @@ async def test_sensors(hass):
with patch("aurorapy.client.AuroraSerialClient.connect", return_value=None), patch(
"aurorapy.client.AuroraSerialClient.measure",
side_effect=_simulated_returns,
), patch(
"aurorapy.client.AuroraSerialClient.serial_number",
return_value="9876543",
), patch(
"aurorapy.client.AuroraSerialClient.version",
return_value="9.8.7.6",
), patch(
"aurorapy.client.AuroraSerialClient.pn",
return_value="A.B.C",
), patch(
"aurorapy.client.AuroraSerialClient.firmware",
return_value="1.234",
), patch(
"aurorapy.client.AuroraSerialClient.cumulated_energy",
side_effect=_simulated_returns,
):
mock_entry.add_to_hass(hass)
await hass.config_entries.async_setup(mock_entry.entry_id)
@@ -104,24 +134,9 @@ async def test_sensors(hass):
assert temperature
assert temperature.state == "9.9"
async def test_sensor_invalid_type(hass):
"""Test invalid sensor type during setup."""
entities = []
mock_entry = _mock_config_entry()
with patch("aurorapy.client.AuroraSerialClient.connect", return_value=None), patch(
"aurorapy.client.AuroraSerialClient.measure",
side_effect=_simulated_returns,
):
mock_entry.add_to_hass(hass)
await hass.config_entries.async_setup(mock_entry.entry_id)
await hass.async_block_till_done()
client = hass.data[DOMAIN][mock_entry.unique_id]
data = mock_entry.data
with pytest.raises(InvalidStateError):
entities.append(AuroraSensor(client, data, "WrongSensor", "wrongparameter"))
energy = hass.states.get("sensor.total_energy")
assert energy
assert energy.state == "12.35"
async def test_sensor_dark(hass):
@@ -132,6 +147,18 @@ async def test_sensor_dark(hass):
# sun is up
with patch("aurorapy.client.AuroraSerialClient.connect", return_value=None), patch(
"aurorapy.client.AuroraSerialClient.measure", side_effect=_simulated_returns
), patch(
"aurorapy.client.AuroraSerialClient.serial_number",
return_value="9876543",
), patch(
"aurorapy.client.AuroraSerialClient.version",
return_value="9.8.7.6",
), patch(
"aurorapy.client.AuroraSerialClient.pn",
return_value="A.B.C",
), patch(
"aurorapy.client.AuroraSerialClient.firmware",
return_value="1.234",
):
mock_entry.add_to_hass(hass)
await hass.config_entries.async_setup(mock_entry.entry_id)
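The tests above stack many `patch(...)` context managers in a single `with` statement so that every `AuroraSerialClient` method is mocked at once and all patches are reverted together on exit. A minimal self-contained sketch of the same pattern, using a toy class rather than the real aurorapy client:

```python
from unittest.mock import patch

class Client:
    """Toy stand-in for a serial client whose methods would touch hardware."""

    def serial_number(self):
        return "unknown"

    def firmware(self):
        return "0.0"

# Several patches stacked in one `with`, as the tests above do for
# aurorapy.client.AuroraSerialClient; each patch is undone on exit.
with patch.object(Client, "serial_number", return_value="9876543"), patch.object(
    Client, "firmware", return_value="1.234"
):
    c = Client()
    print(c.serial_number(), c.firmware())  # 9876543 1.234

print(Client().serial_number())  # unknown (patches reverted)
```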
