Compare commits

...

85 Commits

Author SHA1 Message Date
Paulus Schoutsen 8ec6afb85a 2023.4.3 (#91316) 2023-04-12 21:50:11 -04:00
Franck Nijhof bbf2d0e6ad Remove codecov from Python test requirements (#91295) 2023-04-12 20:51:59 -04:00
tronikos c073cee049 Google Assistant SDK: Fix broadcast command for Portuguese (#91293)
Fix broadcast command for pt
2023-04-12 20:51:58 -04:00
Paulus Schoutsen e9f1148c0a Bumped version to 2023.4.3 2023-04-12 20:35:59 -04:00
J. Nick Koston a420007e80 Restore use of local timezone for MariaDB/MySQL in SQL integration (#91313)
* Use local timezone for recorder connection

The fix in #90335 had an unexpected side effect of
using UTC for the timezone, since all recorder operations
use UTC. Since only SQLite must use the database executor,
we can use a separate connection pool which uses local time.

This also ensures that the engines are disposed of
when Home Assistant is shut down, as previously we
did not cleanly disconnect.

* coverage

* fix unclean shutdown in config flow

* tweaks
2023-04-12 20:35:50 -04:00
puddly 64a9bfcc22 Bump ZHA dependencies (#91291) 2023-04-12 20:35:49 -04:00
codyhackw fd53eda5c6 Update Inovelli Blue Series switch support in ZHA (#91254)
Co-authored-by: David F. Mulcahey <david.mulcahey@icloud.com>
2023-04-12 20:35:49 -04:00
Erik Montnemery d6574b4a2e Fix switch_as_x name (#91232) 2023-04-12 20:35:48 -04:00
Bram Kragten 8eb75beb96 Update frontend to 20230411.0 (#91219) 2023-04-12 20:35:47 -04:00
Erik Montnemery 68920a12aa Flush conversation name cache when an entity is renamed (#91214) 2023-04-12 20:35:46 -04:00
Aaron Bach a806e070a2 Bump pytile to 2023.04.0 (#91191) 2023-04-12 20:35:45 -04:00
David F. Mulcahey a87c78ca20 Cleanup ZHA from Zigpy deprecated property removal (#91180) 2023-04-12 20:35:44 -04:00
Aidan Timson 48df638f5d Reduce startup time for System Bridge integration (#91171) 2023-04-12 20:35:43 -04:00
Allen Porter c601266f9c Fix all day event coercion logic (#91169) 2023-04-12 20:35:42 -04:00
starkillerOG 30d615f206 Reolink config flow fix custom port when USE_HTTPS not selected (#91137)
give USE_HTTPS a default
2023-04-12 20:35:41 -04:00
J. Nick Koston 2db8d70c2f Fix false positive in SQL sensor full table scan check (#91134) 2023-04-12 20:35:40 -04:00
J. Nick Koston 3efffe7688 Bump ulid-transform to 0.6.3 (#91133)
* Bump ulid-transform to 0.6.2

changelog: https://github.com/bdraco/ulid-transform/compare/v0.6.0...v0.6.2

32bit fixes

fixes #91092

* 0.6.3
2023-04-12 20:35:39 -04:00
Allen Porter dc777f78b8 Relax calendar event validation to allow existing zero duration events (#91129)
Relax event validation to allow existing zero duration events
2023-04-12 20:35:38 -04:00
Michael Davie 4cd00da319 Bump env_canada to 0.5.32 (#91126) 2023-04-12 20:35:37 -04:00
Robert Hillis 3f6486db3e Bump aiopyarr to 23.4.0 (#91110) 2023-04-12 20:35:36 -04:00
Diogo Gomes 2d41fe837c Track availability of source sensor in utility meter (#91035)
* track availability of source sensor

* address review comments
2023-04-12 20:35:35 -04:00
Pascal Reeb 34394d90c0 Fall back to polling if webhook cannot be registered on Nuki (#91013)
fix(nuki): throw warning if webhook cannot be created
2023-04-12 20:35:34 -04:00
Anthony Mattas fa29aea68e Fix configuring Flo instances (#90990)
* Update config_flow.py

Used constant string for consistency

* Update config_flow.py

Removed code for location ID and name the integration using the username

* Update manifest.json

Updated codeowners

* Update config_flow.py

* Update config_flow.py

Formatted with black

* Update manifest.json

Updated codeowners

* Update test_config_flow.py

Updated test
2023-04-12 20:35:33 -04:00
Paulus Schoutsen 7928b31087 2023.4.2 (#91111) 2023-04-08 23:41:48 -04:00
J. Nick Koston e792350be6 Fix fnvhash import in schema 32 test backport (#91112) 2023-04-08 23:41:19 -04:00
Paulus Schoutsen 5f0553dd22 Bumped version to 2023.4.2 2023-04-08 22:58:28 -04:00
J. Nick Koston 8f6b77235e Make the device_tracker more forgiving when passed an empty ip address string (#91101)
This has come up over and over and over again

fixes #87165 fixes #51980
2023-04-08 22:56:49 -04:00
J. Nick Koston 8ababc75d4 Bump flux_led to 0.28.37 (#91099)
changes: https://github.com/Danielhiversen/flux_led/releases/tag/0.28.37
2023-04-08 22:56:48 -04:00
J. Nick Koston 0a8f399655 Fix context_user_id round trip when calling to_native (#91098)
We do not actually use this in the history or logbook
APIs, so nothing broke, but there was a bug here for anyone
calling this directly.

fixes #91090
2023-04-08 22:56:47 -04:00
Michael Davie 19567e7fee Bump env_canada to v0.5.31 (#91094) 2023-04-08 22:56:46 -04:00
Garrett 3a137cb24c Bump subarulink to 0.7.6 (#91064) 2023-04-08 22:56:45 -04:00
Allen Porter 935af6904d Bump gcal_sync to 4.1.4 (#91062) 2023-04-08 22:56:44 -04:00
Allen Porter 4fed5ad21c Make location optional in google calendar create service (#91061) 2023-04-08 22:56:44 -04:00
J. Nick Koston 9dc15687b5 Bump zeroconf to 0.56.0 (#91060) 2023-04-08 22:56:43 -04:00
J. Nick Koston 38a0eca223 Bump zeroconf to 0.55.0 (#90987) 2023-04-08 22:56:42 -04:00
David F. Mulcahey 6836e0b511 Fix Smartthings acceleration sensor in ZHA (#91056) 2023-04-08 22:55:52 -04:00
David F. Mulcahey cab88b72b8 Bump ZHA quirks lib (#91054) 2023-04-08 22:55:51 -04:00
Steven Looman 07421927ec Make sure upnp-router is also initialized when first seen through an advertisement (#91037) 2023-04-08 22:55:50 -04:00
Diogo Gomes 828a2779a0 Delay utility_meter until HA has started (#91017)
* increase information for end user

* only warn after home assistant has started

* delay utility_meter until HA has started
2023-04-08 22:55:49 -04:00
Joost Lekkerkerker 7392a5780c Bump roombapy to 1.6.8 (#91012)
* Update roombapy to 1.6.7

* Update roombapy to 1.6.8
2023-04-08 22:55:48 -04:00
Aaron Bach 804270a797 Bump aioambient to 2023.04.0 (#90991) 2023-04-08 22:55:47 -04:00
J. Nick Koston 7f5f286648 Bump vallox-websocket-api to 3.2.1 (#90980)
unblocks https://github.com/home-assistant/core/pull/90901
which will finally fix the races in websockets
2023-04-08 22:55:46 -04:00
J. Nick Koston 0a70a29e92 Resume entity id post migration after a restart (#90973)
* Resume entity id post migration after a restart

If the entity migration finished and Home Assistant was
restarted during the post migration, it would never be resumed,
which means the old index and space would never be recovered.

* add migration resume test
2023-04-08 22:55:46 -04:00
J. Nick Koston dc2f2e8d3f Raise an issue for legacy SQL queries that will cause full table scans (#90971)
* Raise an issue for SQL queries that will cause full table scans


* Update homeassistant/components/sql/sensor.py

Co-authored-by: Paulus Schoutsen <balloob@gmail.com>

* coverage

---------

Co-authored-by: Paulus Schoutsen <balloob@gmail.com>
2023-04-08 22:55:45 -04:00
J. Nick Koston 6522a3ad1b Bump websockets constraint to 11.0.1+ (#90901) 2023-04-08 22:55:44 -04:00
PatrickGlesner be65d4f33e Fix NMBS AttributeError (#90525)
* Fix NMBS AttributeError (Issue #90505)

* Set and use API_FAILURE

* Configure the logger to track API failures

* Remove broad exceptions and rewrite logging
2023-04-08 22:55:43 -04:00
Paulus Schoutsen 0c15c75781 2023.4.1 (#90956) 2023-04-06 17:52:14 -04:00
Heikki Partanen 2bf51a033b Fix verisure autolock (#90960)
Fix verisure autolock #90959
2023-04-06 20:54:40 +00:00
Steven Rollason cfd8695aaa Fix command_template sensor value_template not being used if json_attributes set (#90603)
* Allow value_template to be used if json_attributes set

* Set state to None if no value_template and json_attributes used

* Refactor check for no value_template when json_attributes used

* Updated and additional unit test

* Updated to set _attr_native_value and return if value_template is None

* Update unit test docstring

* Updated test docstring based on feedback
2023-04-06 20:49:32 +00:00
Jan Bouwhuis e8a6a2e105 Fix error after losing an imap connection (#90966)
Cleanup first after losing an imap connection
2023-04-06 20:46:54 +00:00
Allen Porter 73a960af34 Bump gcal_sync to 4.1.3 (#90968) 2023-04-06 20:44:52 +00:00
Allen Porter bbb571fdf8 Coerce previously persisted local calendars to have valid durations (#90970) 2023-04-06 20:42:00 +00:00
J. Nick Koston c944be8215 Fix state being cleared on disconnect with deep sleep esphome devices (#90925)
* Fix state being cleared on disconnect with deep sleep esphome devices

fixes #90923

* fix logic
2023-04-06 20:39:04 +00:00
J. Nick Koston 5e903e04cf Avoid writing state to all esphome entities at shutdown (#90555) 2023-04-06 20:39:00 +00:00
starkillerOG 6884b0a421 Bump reolink-aio to 0.5.10 (#90963)
* use is_doorbell instead of is_doorbell_enabled

* Bump reolink-aio to 0.5.10
2023-04-06 14:35:39 -04:00
Aaron Bach a1c7159304 Bump aioambient to 2022.10.0 (#90940)
Co-authored-by: J. Nick Koston <nick@koston.org>
2023-04-06 14:34:25 -04:00
epenet d65791027f Fix flaky test in vesync (#90921)
* Fix flaky test in vesync

* Move sorting to the test
2023-04-06 14:34:24 -04:00
Paulus Schoutsen 5ffa0cba39 Bumped version to 2023.4.1 2023-04-06 13:21:13 -04:00
Bram Kragten f5be600383 Update frontend to 20230406.1 (#90951) 2023-04-06 13:21:07 -04:00
Pascal Reeb 9b2e26c270 Handle NoURLAvailableError in Nuki component (#90927)
* fix(nuki): handle NoURLAvailableError

* only try internal URLs
2023-04-06 13:21:06 -04:00
stickpin e25edea815 Return empty available programs list if an appliance is off during initial configuration (#90905) 2023-04-06 13:21:05 -04:00
J. Nick Koston 849000d5ac Bump aiodiscover to 1.4.16 (#90903) 2023-04-06 13:21:04 -04:00
Aaron Bach cb06541fda Bump simplisafe-python to 2023.04.0 (#90896)
Co-authored-by: J. Nick Koston <nick@koston.org>
2023-04-06 13:21:03 -04:00
J. Nick Koston 70d1e733f6 Fix entity_id migration query failing with MySQL 8.0.30 (#90895) 2023-04-06 13:21:02 -04:00
J. Nick Koston 0b3012071e Guard against invalid ULIDs in contexts while recording events (#90889) 2023-04-06 13:21:01 -04:00
J. Nick Koston 42b7ed115f Bump ulid-transform 0.6.0 (#90888)
* Bump ulid-transform 0.6.0

changelog: https://github.com/bdraco/ulid-transform/compare/v0.5.1...v0.6.0

to find the source of the invalid ulids in https://github.com/home-assistant/core/issues/90887
2023-04-06 13:21:00 -04:00
J. Nick Koston 513a13f369 Fix missing bluetooth client wrapper in bleak_retry_connector (#90885) 2023-04-06 13:20:59 -04:00
Michael f341d0787e Migrate entity unique ids in PI-Hole (#90883)
* migrate entity unique ids

* Update homeassistant/components/pi_hole/__init__.py

---------

Co-authored-by: Paulus Schoutsen <paulus@home-assistant.io>
2023-04-06 13:20:58 -04:00
J. Nick Koston c8ee45b53c Add MariaDB deadlock retry wrapper to database timestamp column migrations (#90880)
Add deadlock retry wrapper to timestamp column migrations

fixes #90819
2023-04-06 13:20:57 -04:00
J. Nick Koston b4e2dd4e06 Add constraint for websockets to <11.0 (#90868) 2023-04-06 13:20:56 -04:00
J. Nick Koston c663d8754b Generate a seperate log message per dumped object for profiler.dump_log_objects (#90867)
Since some objects are very large, we can generate overly large log messages:
```
Event data for system_log_event exceed maximum size of 32768 bytes. This can cause database performance issues; Event data will not be stored
```

Reported in https://ptb.discord.com/channels/330944238910963714/427516175237382144/1093069996101472306
2023-04-06 13:20:55 -04:00
Tom Harris 968a4e4818 Fix issue with Insteon All-Link Database loading (#90858)
Bump to 1.4.1
2023-04-06 13:20:54 -04:00
saschaabraham 833b95722e Bump fritzconnection to 1.12.0 (#90799) 2023-04-06 13:20:53 -04:00
mkmer 096e814929 Handle Uncaught exceptions in async_update Honeywell (#90746) 2023-04-06 13:20:52 -04:00
Franck Nijhof cff493fb98 2023.4.0 (#90855) 2023-04-05 19:57:42 +02:00
Franck Nijhof d67265bb66 Bumped version to 2023.4.0 2023-04-05 17:37:57 +02:00
Erik Montnemery 6e51f0d6f5 Adjust OTBR channel conflict URL (#90847) 2023-04-05 17:37:06 +02:00
Bram Kragten 82977f33ed Bump frontend to 20230405.0 (#90841) 2023-04-05 17:37:03 +02:00
epenet fb2d432d32 Adjust async_track_time_interval name argument (#90838)
Adjust async_track_time_interval naming
2023-04-05 17:36:59 +02:00
Tom Puttemans 0d019a3c4c Support entity name translation in DSMR Reader component (#90836)
* Use translation_key instead of name for the entity names and enum values

This change allows for the translation of entity names and their values based on a key, instead of having the English text in the code

* Adjusted tariff options order

Not really wrong, but this way it is consistent with all other entities
2023-04-05 17:36:55 +02:00
Paul Bottein 65b877bb77 Add entity name translations to prusalink entities (#90833) 2023-04-05 17:36:52 +02:00
Jan Bouwhuis 2a23583d67 Suppress imap logging on reconnect and presume state (#90826) 2023-04-05 17:36:48 +02:00
Penny Wood 80fe5051b3 Master RAS zone (#90825)
Fixes issue in some systems with different numbering systems
2023-04-05 17:36:44 +02:00
J. Nick Koston 2dfe33d177 Bump aioesphomeapi to 10.6.1 (#90816) 2023-04-05 17:36:41 +02:00
J. Nick Koston 617037a92d Fix BLEDevice not getting updated when details change for remote scanners (#90815) 2023-04-05 17:36:36 +02:00
127 changed files with 3104 additions and 507 deletions
@@ -7,5 +7,5 @@
"integration_type": "hub",
"iot_class": "cloud_push",
"loggers": ["aioambient"],
- "requirements": ["aioambient==2021.11.0"]
+ "requirements": ["aioambient==2023.04.0"]
}
@@ -28,7 +28,7 @@ async def async_setup(hass: HomeAssistant, _: ConfigType) -> bool:
# Send every day
async_track_time_interval(
- hass, analytics.send_analytics, INTERVAL, "analytics daily"
+ hass, analytics.send_analytics, INTERVAL, name="analytics daily"
)
hass.bus.async_listen_once(EVENT_HOMEASSISTANT_STARTED, start_schedule)
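The hunks in this release repeatedly move the task label of `async_track_time_interval` from a positional argument to a keyword argument. A minimal stand-alone sketch of that signature pattern (the `track_time_interval` helper below is hypothetical, not the Home Assistant API):

```python
from typing import Callable

# Hypothetical stand-in for the signature change shown in these hunks:
# the task label becomes a keyword-only ``name=`` parameter.
def track_time_interval(action: Callable[[], None], seconds: int, *, name: str = "") -> str:
    """Register a periodic action; ``name`` labels the created task."""
    label = name or getattr(action, "__name__", "unnamed")
    return f"tracking {label!r} every {seconds}s"

def send_analytics() -> None:
    pass

# Callers now pass the label as a keyword, as in the diff above:
print(track_time_interval(send_analytics, 86400, name="analytics daily"))
```

Making the label keyword-only means existing callers that never passed a label keep working unchanged, while new call sites are self-documenting.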
@@ -38,7 +38,10 @@ class AugustSubscriberMixin:
def _async_setup_listeners(self):
"""Create interval and stop listeners."""
self._unsub_interval = async_track_time_interval(
- self._hass, self._async_refresh, self._update_interval, "august refresh"
+ self._hass,
+ self._async_refresh,
+ self._update_interval,
+ name="august refresh",
)
@callback
@@ -101,7 +101,7 @@ class BaseHaScanner(ABC):
self.hass,
self._async_scanner_watchdog,
SCANNER_WATCHDOG_INTERVAL,
- f"{self.name} Bluetooth scanner watchdog",
+ name=f"{self.name} Bluetooth scanner watchdog",
)
@hass_callback
@@ -230,7 +230,7 @@ class BaseHaRemoteScanner(BaseHaScanner):
self.hass,
self._async_expire_devices,
timedelta(seconds=30),
- f"{self.name} Bluetooth scanner device expire",
+ name=f"{self.name} Bluetooth scanner device expire",
)
cancel_stop = self.hass.bus.async_listen(
EVENT_HOMEASSISTANT_STOP, self._async_save_history
@@ -345,12 +345,27 @@ class BaseHaRemoteScanner(BaseHaScanner):
tx_power=NO_RSSI_VALUE if tx_power is None else tx_power,
platform_data=(),
)
- device = BLEDevice(
-     address=address,
-     name=local_name,
-     details=self._details | details,
-     rssi=rssi, # deprecated, will be removed in newer bleak
- )
+ if prev_discovery:
+     #
+     # Bleak updates the BLEDevice via create_or_update_device.
+     # We need to do the same to ensure integrations that already
+     # have the BLEDevice object get the updated details when they
+     # change.
+     #
+     # https://github.com/hbldh/bleak/blob/222618b7747f0467dbb32bd3679f8cfaa19b1668/bleak/backends/scanner.py#L203
+     #
+     device = prev_device
+     device.name = local_name
+     device.details = self._details | details
+     # pylint: disable-next=protected-access
+     device._rssi = rssi # deprecated, will be removed in newer bleak
+ else:
+     device = BLEDevice(
+         address=address,
+         name=local_name,
+         details=self._details | details,
+         rssi=rssi, # deprecated, will be removed in newer bleak
+     )
self._discovered_device_advertisement_datas[address] = (
device,
advertisement_data,
@@ -276,7 +276,7 @@ class BluetoothManager:
self.hass,
self._async_check_unavailable,
timedelta(seconds=UNAVAILABLE_TRACK_SECONDS),
- "Bluetooth manager unavailable tracking",
+ name="Bluetooth manager unavailable tracking",
)
@hass_callback
@@ -10,9 +10,10 @@ from .wrappers import HaBleakClientWrapper, HaBleakScannerWrapper
ORIGINAL_BLEAK_SCANNER = bleak.BleakScanner
ORIGINAL_BLEAK_CLIENT = bleak.BleakClient
- ORIGINAL_BLEAK_RETRY_CONNECTOR_CLIENT = (
+ ORIGINAL_BLEAK_RETRY_CONNECTOR_CLIENT_WITH_SERVICE_CACHE = (
bleak_retry_connector.BleakClientWithServiceCache
)
+ ORIGINAL_BLEAK_RETRY_CONNECTOR_CLIENT = bleak_retry_connector.BleakClient
def install_multiple_bleak_catcher() -> None:
@@ -23,6 +24,7 @@ def install_multiple_bleak_catcher() -> None:
bleak.BleakScanner = HaBleakScannerWrapper # type: ignore[misc, assignment]
bleak.BleakClient = HaBleakClientWrapper # type: ignore[misc]
bleak_retry_connector.BleakClientWithServiceCache = HaBleakClientWithServiceCache # type: ignore[misc,assignment] # noqa: E501
+ bleak_retry_connector.BleakClient = HaBleakClientWrapper # type: ignore[misc] # noqa: E501
def uninstall_multiple_bleak_catcher() -> None:
@@ -30,6 +32,9 @@ def uninstall_multiple_bleak_catcher() -> None:
bleak.BleakScanner = ORIGINAL_BLEAK_SCANNER # type: ignore[misc]
bleak.BleakClient = ORIGINAL_BLEAK_CLIENT # type: ignore[misc]
bleak_retry_connector.BleakClientWithServiceCache = ( # type: ignore[misc]
+ ORIGINAL_BLEAK_RETRY_CONNECTOR_CLIENT_WITH_SERVICE_CACHE
+ )
+ bleak_retry_connector.BleakClient = ( # type: ignore[misc]
ORIGINAL_BLEAK_RETRY_CONNECTOR_CLIENT
)
@@ -177,7 +177,7 @@ class BondEntity(Entity):
self.hass,
self._async_update_if_bpup_not_alive,
_FALLBACK_SCAN_INTERVAL,
- f"Bond {self.entity_id} fallback polling",
+ name=f"Bond {self.entity_id} fallback polling",
)
)
@@ -67,6 +67,13 @@ SCAN_INTERVAL = datetime.timedelta(seconds=60)
# Don't support rrules more often than daily
VALID_FREQS = {"DAILY", "WEEKLY", "MONTHLY", "YEARLY"}
+ # Ensure events created in Home Assistant have a positive duration
+ MIN_NEW_EVENT_DURATION = datetime.timedelta(seconds=1)
+ # Events must have a non-negative duration e.g. Google Calendar can create zero
+ # duration events in the UI.
+ MIN_EVENT_DURATION = datetime.timedelta(seconds=0)
def _has_timezone(*keys: Any) -> Callable[[dict[str, Any]], dict[str, Any]]:
"""Assert that all datetime values have a timezone."""
@@ -116,17 +123,18 @@ def _as_local_timezone(*keys: Any) -> Callable[[dict[str, Any]], dict[str, Any]]
return validate
- def _has_duration(
-     start_key: str, end_key: str
+ def _has_min_duration(
+     start_key: str, end_key: str, min_duration: datetime.timedelta
) -> Callable[[dict[str, Any]], dict[str, Any]]:
- """Verify that the time span between start and end is positive."""
+ """Verify that the time span between start and end has a minimum duration."""
def validate(obj: dict[str, Any]) -> dict[str, Any]:
"""Test that all keys in the dict are in order."""
if (start := obj.get(start_key)) and (end := obj.get(end_key)):
duration = end - start
- if duration.total_seconds() <= 0:
-     raise vol.Invalid(f"Expected positive event duration ({start}, {end})")
+ if duration < min_duration:
+     raise vol.Invalid(
+         f"Expected minimum event duration of {min_duration} ({start}, {end})"
+     )
return obj
return validate
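The validator above can be sketched stand-alone with the stdlib only (a plain `ValueError` stands in for the `vol.Invalid` that the real code raises from voluptuous):

```python
import datetime
from typing import Any, Callable

def _has_min_duration(
    start_key: str, end_key: str, min_duration: datetime.timedelta
) -> Callable[[dict[str, Any]], dict[str, Any]]:
    """Verify that the span between start and end has a minimum duration."""
    def validate(obj: dict[str, Any]) -> dict[str, Any]:
        if (start := obj.get(start_key)) and (end := obj.get(end_key)):
            if end - start < min_duration:
                raise ValueError(
                    f"Expected minimum event duration of {min_duration} ({start}, {end})"
                )
        return obj
    return validate

# A zero minimum accepts existing zero-duration events, while a one-second
# minimum still rejects them for newly created events.
allow_existing = _has_min_duration("start", "end", datetime.timedelta(seconds=0))
new_events_only = _has_min_duration("start", "end", datetime.timedelta(seconds=1))

t = datetime.datetime(2023, 4, 12, 12, 0)
event = {"start": t, "end": t}
allow_existing(event)  # zero duration meets the zero minimum
try:
    new_events_only(event)
except ValueError as err:
    print("rejected:", err)
```

Parameterizing the minimum is what lets the stricter threshold apply only to event creation while previously persisted zero-duration events keep loading.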
@@ -204,8 +212,8 @@ CREATE_EVENT_SCHEMA = vol.All(
),
_has_consistent_timezone(EVENT_START_DATETIME, EVENT_END_DATETIME),
_as_local_timezone(EVENT_START_DATETIME, EVENT_END_DATETIME),
- _has_duration(EVENT_START_DATE, EVENT_END_DATE),
- _has_duration(EVENT_START_DATETIME, EVENT_END_DATETIME),
+ _has_min_duration(EVENT_START_DATE, EVENT_END_DATE, MIN_NEW_EVENT_DURATION),
+ _has_min_duration(EVENT_START_DATETIME, EVENT_END_DATETIME, MIN_NEW_EVENT_DURATION),
)
WEBSOCKET_EVENT_SCHEMA = vol.Schema(
@@ -221,7 +229,7 @@ WEBSOCKET_EVENT_SCHEMA = vol.Schema(
_has_same_type(EVENT_START, EVENT_END),
_has_consistent_timezone(EVENT_START, EVENT_END),
_as_local_timezone(EVENT_START, EVENT_END),
- _has_duration(EVENT_START, EVENT_END),
+ _has_min_duration(EVENT_START, EVENT_END, MIN_NEW_EVENT_DURATION),
)
)
@@ -238,7 +246,7 @@ CALENDAR_EVENT_SCHEMA = vol.Schema(
_has_timezone("start", "end"),
_has_consistent_timezone("start", "end"),
_as_local_timezone("start", "end"),
- _has_duration("start", "end"),
+ _has_min_duration("start", "end", MIN_EVENT_DURATION),
),
extra=vol.ALLOW_EXTRA,
)
@@ -346,6 +354,16 @@ class CalendarEvent:
f"Failed to validate CalendarEvent: {err}"
) from err
+ # It is common to set a start an end date to be the same thing for
+ # an all day event, but that is not a valid duration. Fix to have a
+ # duration of one day.
+ if (
+     not isinstance(self.start, datetime.datetime)
+     and not isinstance(self.end, datetime.datetime)
+     and self.start == self.end
+ ):
+     self.end = self.start + datetime.timedelta(days=1)
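The coercion added above can be shown with plain values instead of the `CalendarEvent` dataclass (the `coerce_all_day` helper below is illustrative only):

```python
import datetime

# Sketch of the coercion in the hunk above: an all-day event (a date, not a
# datetime) whose start equals its end is given a one-day duration. Timed
# events (datetimes) are left untouched.
def coerce_all_day(
    start: datetime.date, end: datetime.date
) -> tuple[datetime.date, datetime.date]:
    if (
        not isinstance(start, datetime.datetime)
        and not isinstance(end, datetime.datetime)
        and start == end
    ):
        end = start + datetime.timedelta(days=1)
    return start, end

day = datetime.date(2023, 4, 12)
print(coerce_all_day(day, day))  # end becomes 2023-04-13
```

Note that `datetime.datetime` is a subclass of `datetime.date`, so the `isinstance` checks distinguish all-day events (dates) from timed events (datetimes).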
def _event_dict_factory(obj: Iterable[tuple[str, Any]]) -> dict[str, str]:
"""Convert CalendarEvent dataclass items to dictionary of attributes."""
@@ -380,7 +380,7 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
entity.async_write_ha_state()
unsub = async_track_time_interval(
- hass, update_tokens, TOKEN_CHANGE_INTERVAL, "Camera update tokens"
+ hass, update_tokens, TOKEN_CHANGE_INTERVAL, name="Camera update tokens"
)
@callback
@@ -137,8 +137,11 @@ class CommandSensor(SensorEntity):
_LOGGER.warning("Unable to parse output as JSON: %s", value)
else:
_LOGGER.warning("Empty reply found when expecting JSON data")
+ if self._value_template is None:
+     self._attr_native_value = None
+     return
- elif self._value_template is not None:
+ if self._value_template is not None:
self._attr_native_value = (
self._value_template.async_render_with_possible_json_value(
value,
@@ -32,6 +32,7 @@ from .const import DEFAULT_EXPOSED_ATTRIBUTES, DEFAULT_EXPOSED_DOMAINS, DOMAIN
_LOGGER = logging.getLogger(__name__)
_DEFAULT_ERROR_TEXT = "Sorry, I couldn't understand that"
+ _ENTITY_REGISTRY_UPDATE_FIELDS = ["aliases", "name", "original_name"]
REGEX_TYPE = type(re.compile(""))
@@ -450,8 +451,10 @@ class DefaultAgent(AbstractConversationAgent):
@core.callback
def _async_handle_entity_registry_changed(self, event: core.Event) -> None:
- """Clear names list cache when an entity changes aliases."""
- if event.data["action"] == "update" and "aliases" not in event.data["changes"]:
+ """Clear names list cache when an entity registry entry has changed."""
+ if event.data["action"] == "update" and not any(
+     field in event.data["changes"] for field in _ENTITY_REGISTRY_UPDATE_FIELDS
+ ):
return
self._slot_lists = None
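The cache-invalidation test above can be isolated as a small predicate (the `should_flush_cache` name below is hypothetical; the real code clears `self._slot_lists` inline):

```python
# Sketch of the check in the hunk above: only registry updates that touch a
# name-related field should flush the conversation name cache; other actions
# (create/remove) always flush it.
_ENTITY_REGISTRY_UPDATE_FIELDS = ["aliases", "name", "original_name"]

def should_flush_cache(event_data: dict) -> bool:
    if event_data["action"] == "update" and not any(
        field in event_data["changes"] for field in _ENTITY_REGISTRY_UPDATE_FIELDS
    ):
        return False
    return True

print(should_flush_cache({"action": "update", "changes": {"name": "Lamp"}}))
print(should_flush_cache({"action": "update", "changes": {"icon": "mdi:bulb"}}))
```

Before this fix only `aliases` changes invalidated the cache, so renaming an entity left stale names in the conversation agent.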
@@ -348,7 +348,7 @@ class ScannerEntity(BaseTrackerEntity):
self.mac_address,
self.unique_id,
)
- if self.is_connected:
+ if self.is_connected and self.ip_address:
_async_connected_device_registered(
hass,
self.mac_address,
@@ -405,7 +405,7 @@ class ScannerEntity(BaseTrackerEntity):
"""Return the device state attributes."""
attr: dict[str, StateType] = {}
attr.update(super().state_attributes)
- if self.ip_address is not None:
+ if self.ip_address:
attr[ATTR_IP] = self.ip_address
if self.mac_address is not None:
attr[ATTR_MAC] = self.mac_address
@@ -427,7 +427,7 @@ def async_setup_scanner_platform(
hass,
async_device_tracker_scan,
interval,
- f"device_tracker {platform} legacy scan",
+ name=f"device_tracker {platform} legacy scan",
)
hass.async_create_task(async_device_tracker_scan(None))
@@ -260,7 +260,10 @@ class NetworkWatcher(WatcherBase):
"""Start scanning for new devices on the network."""
self._discover_hosts = DiscoverHosts()
self._unsub = async_track_time_interval(
- self.hass, self.async_start_discover, SCAN_INTERVAL, "DHCP network watcher"
+ self.hass,
+ self.async_start_discover,
+ SCAN_INTERVAL,
+ name="DHCP network watcher",
)
self.async_start_discover()
@@ -7,5 +7,5 @@
"iot_class": "local_push",
"loggers": ["aiodiscover", "dnspython", "pyroute2", "scapy"],
"quality_scale": "internal",
- "requirements": ["scapy==2.5.0", "aiodiscover==1.4.15"]
+ "requirements": ["scapy==2.5.0", "aiodiscover==1.4.16"]
}
@@ -48,49 +48,49 @@ class DSMRReaderSensorEntityDescription(SensorEntityDescription):
SENSORS: tuple[DSMRReaderSensorEntityDescription, ...] = (
DSMRReaderSensorEntityDescription(
key="dsmr/reading/electricity_delivered_1",
- name="Low tariff usage",
+ translation_key="low_tariff_usage",
device_class=SensorDeviceClass.ENERGY,
native_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
state_class=SensorStateClass.TOTAL_INCREASING,
),
DSMRReaderSensorEntityDescription(
key="dsmr/reading/electricity_returned_1",
- name="Low tariff returned",
+ translation_key="low_tariff_returned",
device_class=SensorDeviceClass.ENERGY,
native_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
state_class=SensorStateClass.TOTAL_INCREASING,
),
DSMRReaderSensorEntityDescription(
key="dsmr/reading/electricity_delivered_2",
- name="High tariff usage",
+ translation_key="high_tariff_usage",
device_class=SensorDeviceClass.ENERGY,
native_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
state_class=SensorStateClass.TOTAL_INCREASING,
),
DSMRReaderSensorEntityDescription(
key="dsmr/reading/electricity_returned_2",
- name="High tariff returned",
+ translation_key="high_tariff_returned",
device_class=SensorDeviceClass.ENERGY,
native_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
state_class=SensorStateClass.TOTAL_INCREASING,
),
DSMRReaderSensorEntityDescription(
key="dsmr/reading/electricity_currently_delivered",
- name="Current power usage",
+ translation_key="current_power_usage",
device_class=SensorDeviceClass.POWER,
native_unit_of_measurement=UnitOfPower.KILO_WATT,
state_class=SensorStateClass.MEASUREMENT,
),
DSMRReaderSensorEntityDescription(
key="dsmr/reading/electricity_currently_returned",
- name="Current power return",
+ translation_key="current_power_return",
device_class=SensorDeviceClass.POWER,
native_unit_of_measurement=UnitOfPower.KILO_WATT,
state_class=SensorStateClass.MEASUREMENT,
),
DSMRReaderSensorEntityDescription(
key="dsmr/reading/phase_currently_delivered_l1",
- name="Current power usage L1",
+ translation_key="current_power_usage_l1",
entity_registry_enabled_default=False,
device_class=SensorDeviceClass.POWER,
native_unit_of_measurement=UnitOfPower.KILO_WATT,
@@ -98,7 +98,7 @@ SENSORS: tuple[DSMRReaderSensorEntityDescription, ...] = (
),
DSMRReaderSensorEntityDescription(
key="dsmr/reading/phase_currently_delivered_l2",
- name="Current power usage L2",
+ translation_key="current_power_usage_l2",
entity_registry_enabled_default=False,
device_class=SensorDeviceClass.POWER,
native_unit_of_measurement=UnitOfPower.KILO_WATT,
@@ -106,7 +106,7 @@ SENSORS: tuple[DSMRReaderSensorEntityDescription, ...] = (
),
DSMRReaderSensorEntityDescription(
key="dsmr/reading/phase_currently_delivered_l3",
- name="Current power usage L3",
+ translation_key="current_power_usage_l3",
entity_registry_enabled_default=False,
device_class=SensorDeviceClass.POWER,
native_unit_of_measurement=UnitOfPower.KILO_WATT,
@@ -114,7 +114,7 @@ SENSORS: tuple[DSMRReaderSensorEntityDescription, ...] = (
),
DSMRReaderSensorEntityDescription(
key="dsmr/reading/phase_currently_returned_l1",
- name="Current power return L1",
+ translation_key="current_power_return_l1",
entity_registry_enabled_default=False,
device_class=SensorDeviceClass.POWER,
native_unit_of_measurement=UnitOfPower.KILO_WATT,
@@ -122,7 +122,7 @@ SENSORS: tuple[DSMRReaderSensorEntityDescription, ...] = (
),
DSMRReaderSensorEntityDescription(
key="dsmr/reading/phase_currently_returned_l2",
- name="Current power return L2",
+ translation_key="current_power_return_l2",
entity_registry_enabled_default=False,
device_class=SensorDeviceClass.POWER,
native_unit_of_measurement=UnitOfPower.KILO_WATT,
@@ -130,7 +130,7 @@ SENSORS: tuple[DSMRReaderSensorEntityDescription, ...] = (
),
DSMRReaderSensorEntityDescription(
key="dsmr/reading/phase_currently_returned_l3",
- name="Current power return L3",
+ translation_key="current_power_return_l3",
entity_registry_enabled_default=False,
device_class=SensorDeviceClass.POWER,
native_unit_of_measurement=UnitOfPower.KILO_WATT,
@@ -138,7 +138,7 @@ SENSORS: tuple[DSMRReaderSensorEntityDescription, ...] = (
),
DSMRReaderSensorEntityDescription(
key="dsmr/reading/extra_device_delivered",
- name="Gas meter usage",
+ translation_key="gas_meter_usage",
entity_registry_enabled_default=False,
icon="mdi:fire",
native_unit_of_measurement=UnitOfVolume.CUBIC_METERS,
@@ -146,7 +146,7 @@ SENSORS: tuple[DSMRReaderSensorEntityDescription, ...] = (
),
DSMRReaderSensorEntityDescription(
key="dsmr/reading/phase_voltage_l1",
- name="Current voltage L1",
+ translation_key="current_voltage_l1",
entity_registry_enabled_default=False,
device_class=SensorDeviceClass.VOLTAGE,
native_unit_of_measurement=UnitOfElectricPotential.VOLT,
@@ -154,7 +154,7 @@ SENSORS: tuple[DSMRReaderSensorEntityDescription, ...] = (
),
DSMRReaderSensorEntityDescription(
key="dsmr/reading/phase_voltage_l2",
- name="Current voltage L2",
+ translation_key="current_voltage_l2",
entity_registry_enabled_default=False,
device_class=SensorDeviceClass.VOLTAGE,
native_unit_of_measurement=UnitOfElectricPotential.VOLT,
@@ -162,7 +162,7 @@ SENSORS: tuple[DSMRReaderSensorEntityDescription, ...] = (
),
DSMRReaderSensorEntityDescription(
key="dsmr/reading/phase_voltage_l3",
- name="Current voltage L3",
+ translation_key="current_voltage_l3",
entity_registry_enabled_default=False,
device_class=SensorDeviceClass.VOLTAGE,
native_unit_of_measurement=UnitOfElectricPotential.VOLT,
@@ -170,7 +170,7 @@ SENSORS: tuple[DSMRReaderSensorEntityDescription, ...] = (
),
DSMRReaderSensorEntityDescription(
key="dsmr/reading/phase_power_current_l1",
- name="Phase power current L1",
+ translation_key="phase_power_current_l1",
entity_registry_enabled_default=False,
device_class=SensorDeviceClass.CURRENT,
native_unit_of_measurement=UnitOfElectricCurrent.AMPERE,
@@ -178,7 +178,7 @@ SENSORS: tuple[DSMRReaderSensorEntityDescription, ...] = (
),
DSMRReaderSensorEntityDescription(
key="dsmr/reading/phase_power_current_l2",
- name="Phase power current L2",
+ translation_key="phase_power_current_l2",
entity_registry_enabled_default=False,
device_class=SensorDeviceClass.CURRENT,
native_unit_of_measurement=UnitOfElectricCurrent.AMPERE,
@@ -186,7 +186,7 @@ SENSORS: tuple[DSMRReaderSensorEntityDescription, ...] = (
),
DSMRReaderSensorEntityDescription(
key="dsmr/reading/phase_power_current_l3",
- name="Phase power current L3",
+ translation_key="phase_power_current_l3",
entity_registry_enabled_default=False,
device_class=SensorDeviceClass.CURRENT,
native_unit_of_measurement=UnitOfElectricCurrent.AMPERE,
@@ -194,384 +194,386 @@ SENSORS: tuple[DSMRReaderSensorEntityDescription, ...] = (
),
DSMRReaderSensorEntityDescription(
key="dsmr/reading/timestamp",
- name="Telegram timestamp",
+ translation_key="telegram_timestamp",
entity_registry_enabled_default=False,
device_class=SensorDeviceClass.TIMESTAMP,
state=dt_util.parse_datetime,
),
DSMRReaderSensorEntityDescription(
key="dsmr/consumption/gas/delivered",
- name="Gas usage",
+ translation_key="gas_usage",
device_class=SensorDeviceClass.GAS,
native_unit_of_measurement=UnitOfVolume.CUBIC_METERS,
state_class=SensorStateClass.TOTAL_INCREASING,
),
DSMRReaderSensorEntityDescription(
key="dsmr/consumption/gas/currently_delivered",
- name="Current gas usage",
+ translation_key="current_gas_usage",
native_unit_of_measurement=UnitOfVolume.CUBIC_METERS,
state_class=SensorStateClass.MEASUREMENT,
),
DSMRReaderSensorEntityDescription(
key="dsmr/consumption/gas/read_at",
- name="Gas meter read",
+ translation_key="gas_meter_read",
entity_registry_enabled_default=False,
device_class=SensorDeviceClass.TIMESTAMP,
state=dt_util.parse_datetime,
),
DSMRReaderSensorEntityDescription(
key="dsmr/day-consumption/electricity1",
- name="Low tariff usage (daily)",
+ translation_key="daily_low_tariff_usage",
device_class=SensorDeviceClass.ENERGY,
native_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
state_class=SensorStateClass.TOTAL_INCREASING,
),
DSMRReaderSensorEntityDescription(
key="dsmr/day-consumption/electricity2",
- name="High tariff usage (daily)",
+ translation_key="daily_high_tariff_usage",
device_class=SensorDeviceClass.ENERGY,
native_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
state_class=SensorStateClass.TOTAL_INCREASING,
),
DSMRReaderSensorEntityDescription(
key="dsmr/day-consumption/electricity1_returned",
name="Low tariff return (daily)",
translation_key="daily_low_tariff_return",
device_class=SensorDeviceClass.ENERGY,
native_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
state_class=SensorStateClass.TOTAL_INCREASING,
),
DSMRReaderSensorEntityDescription(
key="dsmr/day-consumption/electricity2_returned",
name="High tariff return (daily)",
translation_key="daily_high_tariff_return",
device_class=SensorDeviceClass.ENERGY,
native_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
state_class=SensorStateClass.TOTAL_INCREASING,
),
DSMRReaderSensorEntityDescription(
key="dsmr/day-consumption/electricity_merged",
name="Power usage total (daily)",
translation_key="daily_power_usage_total",
device_class=SensorDeviceClass.ENERGY,
native_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
state_class=SensorStateClass.TOTAL_INCREASING,
),
DSMRReaderSensorEntityDescription(
key="dsmr/day-consumption/electricity_returned_merged",
name="Power return total (daily)",
translation_key="daily_power_return_total",
device_class=SensorDeviceClass.ENERGY,
native_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
state_class=SensorStateClass.TOTAL_INCREASING,
),
DSMRReaderSensorEntityDescription(
key="dsmr/day-consumption/electricity1_cost",
name="Low tariff cost (daily)",
translation_key="daily_low_tariff_cost",
icon="mdi:currency-eur",
native_unit_of_measurement=CURRENCY_EURO,
),
DSMRReaderSensorEntityDescription(
key="dsmr/day-consumption/electricity2_cost",
name="High tariff cost (daily)",
translation_key="daily_high_tariff_cost",
icon="mdi:currency-eur",
native_unit_of_measurement=CURRENCY_EURO,
),
DSMRReaderSensorEntityDescription(
key="dsmr/day-consumption/electricity_cost_merged",
name="Power total cost (daily)",
translation_key="daily_power_total_cost",
icon="mdi:currency-eur",
native_unit_of_measurement=CURRENCY_EURO,
),
DSMRReaderSensorEntityDescription(
key="dsmr/day-consumption/gas",
name="Gas usage (daily)",
translation_key="daily_gas_usage",
icon="mdi:counter",
native_unit_of_measurement=UnitOfVolume.CUBIC_METERS,
),
DSMRReaderSensorEntityDescription(
key="dsmr/day-consumption/gas_cost",
name="Gas cost",
translation_key="gas_cost",
icon="mdi:currency-eur",
native_unit_of_measurement=CURRENCY_EURO,
),
DSMRReaderSensorEntityDescription(
key="dsmr/day-consumption/total_cost",
name="Total cost",
translation_key="total_cost",
icon="mdi:currency-eur",
native_unit_of_measurement=CURRENCY_EURO,
),
DSMRReaderSensorEntityDescription(
key="dsmr/day-consumption/energy_supplier_price_electricity_delivered_1",
name="Low tariff delivered price",
translation_key="low_tariff_delivered_price",
icon="mdi:currency-eur",
native_unit_of_measurement=PRICE_EUR_KWH,
),
DSMRReaderSensorEntityDescription(
key="dsmr/day-consumption/energy_supplier_price_electricity_delivered_2",
name="High tariff delivered price",
translation_key="high_tariff_delivered_price",
icon="mdi:currency-eur",
native_unit_of_measurement=PRICE_EUR_KWH,
),
DSMRReaderSensorEntityDescription(
key="dsmr/day-consumption/energy_supplier_price_electricity_returned_1",
name="Low tariff returned price",
translation_key="low_tariff_returned_price",
icon="mdi:currency-eur",
native_unit_of_measurement=PRICE_EUR_KWH,
),
DSMRReaderSensorEntityDescription(
key="dsmr/day-consumption/energy_supplier_price_electricity_returned_2",
name="High tariff returned price",
translation_key="high_tariff_returned_price",
icon="mdi:currency-eur",
native_unit_of_measurement=PRICE_EUR_KWH,
),
DSMRReaderSensorEntityDescription(
key="dsmr/day-consumption/energy_supplier_price_gas",
name="Gas price",
translation_key="gas_price",
icon="mdi:currency-eur",
native_unit_of_measurement=PRICE_EUR_M3,
),
DSMRReaderSensorEntityDescription(
key="dsmr/day-consumption/fixed_cost",
name="Current day fixed cost",
translation_key="current_day_fixed_cost",
icon="mdi:currency-eur",
native_unit_of_measurement=CURRENCY_EURO,
),
DSMRReaderSensorEntityDescription(
key="dsmr/meter-stats/dsmr_version",
name="DSMR version",
translation_key="dsmr_version",
entity_registry_enabled_default=False,
icon="mdi:alert-circle",
state=dsmr_transform,
),
DSMRReaderSensorEntityDescription(
key="dsmr/meter-stats/electricity_tariff",
name="Electricity tariff",
translation_key="electricity_tariff",
device_class=SensorDeviceClass.ENUM,
options=["low", "high"],
icon="mdi:flash",
state=tariff_transform,
),
DSMRReaderSensorEntityDescription(
key="dsmr/meter-stats/power_failure_count",
name="Power failure count",
translation_key="power_failure_count",
entity_registry_enabled_default=False,
icon="mdi:flash",
),
DSMRReaderSensorEntityDescription(
key="dsmr/meter-stats/long_power_failure_count",
name="Long power failure count",
translation_key="long_power_failure_count",
entity_registry_enabled_default=False,
icon="mdi:flash",
),
DSMRReaderSensorEntityDescription(
key="dsmr/meter-stats/voltage_sag_count_l1",
name="Voltage sag L1",
translation_key="voltage_sag_l1",
entity_registry_enabled_default=False,
icon="mdi:flash",
),
DSMRReaderSensorEntityDescription(
key="dsmr/meter-stats/voltage_sag_count_l2",
name="Voltage sag L2",
translation_key="voltage_sag_l2",
entity_registry_enabled_default=False,
icon="mdi:flash",
),
DSMRReaderSensorEntityDescription(
key="dsmr/meter-stats/voltage_sag_count_l3",
name="Voltage sag L3",
translation_key="voltage_sag_l3",
entity_registry_enabled_default=False,
icon="mdi:flash",
),
DSMRReaderSensorEntityDescription(
key="dsmr/meter-stats/voltage_swell_count_l1",
name="Voltage swell L1",
translation_key="voltage_swell_l1",
entity_registry_enabled_default=False,
icon="mdi:flash",
),
DSMRReaderSensorEntityDescription(
key="dsmr/meter-stats/voltage_swell_count_l2",
name="Voltage swell L2",
translation_key="voltage_swell_l2",
entity_registry_enabled_default=False,
icon="mdi:flash",
),
DSMRReaderSensorEntityDescription(
key="dsmr/meter-stats/voltage_swell_count_l3",
name="Voltage swell L3",
translation_key="voltage_swell_l3",
entity_registry_enabled_default=False,
icon="mdi:flash",
),
DSMRReaderSensorEntityDescription(
key="dsmr/meter-stats/rejected_telegrams",
name="Rejected telegrams",
translation_key="rejected_telegrams",
entity_registry_enabled_default=False,
icon="mdi:flash",
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-month/electricity1",
name="Current month low tariff usage",
translation_key="current_month_low_tariff_usage",
device_class=SensorDeviceClass.ENERGY,
native_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-month/electricity2",
name="Current month high tariff usage",
translation_key="current_month_high_tariff_usage",
device_class=SensorDeviceClass.ENERGY,
native_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-month/electricity1_returned",
name="Current month low tariff returned",
translation_key="current_month_low_tariff_returned",
device_class=SensorDeviceClass.ENERGY,
native_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-month/electricity2_returned",
name="Current month high tariff returned",
translation_key="current_month_high_tariff_returned",
device_class=SensorDeviceClass.ENERGY,
native_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-month/electricity_merged",
name="Current month power usage total",
translation_key="current_month_power_usage_total",
device_class=SensorDeviceClass.ENERGY,
native_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-month/electricity_returned_merged",
name="Current month power return total",
translation_key="current_month_power_return_total",
device_class=SensorDeviceClass.ENERGY,
native_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-month/electricity1_cost",
name="Current month low tariff cost",
translation_key="current_month_low_tariff_cost",
icon="mdi:currency-eur",
native_unit_of_measurement=CURRENCY_EURO,
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-month/electricity2_cost",
name="Current month high tariff cost",
translation_key="current_month_high_tariff_cost",
icon="mdi:currency-eur",
native_unit_of_measurement=CURRENCY_EURO,
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-month/electricity_cost_merged",
name="Current month power total cost",
translation_key="current_month_power_total_cost",
icon="mdi:currency-eur",
native_unit_of_measurement=CURRENCY_EURO,
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-month/gas",
name="Current month gas usage",
translation_key="current_month_gas_usage",
icon="mdi:counter",
native_unit_of_measurement=UnitOfVolume.CUBIC_METERS,
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-month/gas_cost",
name="Current month gas cost",
translation_key="current_month_gas_cost",
icon="mdi:currency-eur",
native_unit_of_measurement=CURRENCY_EURO,
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-month/fixed_cost",
name="Current month fixed cost",
translation_key="current_month_fixed_cost",
icon="mdi:currency-eur",
native_unit_of_measurement=CURRENCY_EURO,
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-month/total_cost",
name="Current month total cost",
translation_key="current_month_total_cost",
icon="mdi:currency-eur",
native_unit_of_measurement=CURRENCY_EURO,
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-year/electricity1",
name="Current year low tariff usage",
translation_key="current_year_low_tariff_usage",
device_class=SensorDeviceClass.ENERGY,
native_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-year/electricity2",
name="Current year high tariff usage",
translation_key="current_year_high_tariff_usage",
device_class=SensorDeviceClass.ENERGY,
native_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-year/electricity1_returned",
name="Current year low tariff returned",
translation_key="current_year_low_tariff_returned",
device_class=SensorDeviceClass.ENERGY,
native_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-year/electricity2_returned",
name="Current year high tariff returned",
translation_key="current_year_high_tariff_returned",
device_class=SensorDeviceClass.ENERGY,
native_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-year/electricity_merged",
name="Current year power usage total",
translation_key="current_year_power_usage_total",
device_class=SensorDeviceClass.ENERGY,
native_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-year/electricity_returned_merged",
name="Current year power returned total",
translation_key="current_year_power_returned_total",
device_class=SensorDeviceClass.ENERGY,
native_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-year/electricity1_cost",
name="Current year low tariff cost",
translation_key="current_year_low_tariff_cost",
icon="mdi:currency-eur",
native_unit_of_measurement=CURRENCY_EURO,
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-year/electricity2_cost",
name="Current year high tariff cost",
translation_key="current_year_high_tariff_cost",
icon="mdi:currency-eur",
native_unit_of_measurement=CURRENCY_EURO,
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-year/electricity_cost_merged",
name="Current year power total cost",
translation_key="current_year_power_total_cost",
icon="mdi:currency-eur",
native_unit_of_measurement=CURRENCY_EURO,
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-year/gas",
name="Current year gas usage",
translation_key="current_year_gas_usage",
icon="mdi:counter",
native_unit_of_measurement=UnitOfVolume.CUBIC_METERS,
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-year/gas_cost",
name="Current year gas cost",
translation_key="current_year_gas_cost",
icon="mdi:currency-eur",
native_unit_of_measurement=CURRENCY_EURO,
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-year/fixed_cost",
name="Current year fixed cost",
translation_key="current_year_fixed_cost",
icon="mdi:currency-eur",
native_unit_of_measurement=CURRENCY_EURO,
),
DSMRReaderSensorEntityDescription(
key="dsmr/current-year/total_cost",
name="Current year total cost",
translation_key="current_year_total_cost",
icon="mdi:currency-eur",
native_unit_of_measurement=CURRENCY_EURO,
),
DSMRReaderSensorEntityDescription(
key="dsmr/consumption/quarter-hour-peak-electricity/average_delivered",
name="Previous quarter-hour peak usage",
translation_key="previous_quarter_hour_peak_usage",
device_class=SensorDeviceClass.POWER,
native_unit_of_measurement=UnitOfPower.KILO_WATT,
),
DSMRReaderSensorEntityDescription(
key="dsmr/consumption/quarter-hour-peak-electricity/read_at_start",
name="Quarter-hour peak start time",
translation_key="quarter_hour_peak_start_time",
entity_registry_enabled_default=False,
device_class=SensorDeviceClass.TIMESTAMP,
state=dt_util.parse_datetime,
),
DSMRReaderSensorEntityDescription(
key="dsmr/consumption/quarter-hour-peak-electricity/read_at_end",
name="Quarter-hour peak end time",
translation_key="quarter_hour_peak_end_time",
entity_registry_enabled_default=False,
device_class=SensorDeviceClass.TIMESTAMP,
state=dt_util.parse_datetime,
@@ -23,6 +23,7 @@ async def async_setup_entry(
class DSMRSensor(SensorEntity):
"""Representation of a DSMR sensor that is updated via MQTT."""
_attr_has_entity_name = True
entity_description: DSMRReaderSensorEntityDescription
def __init__(
@@ -8,5 +8,256 @@
"description": "Make sure to configure the 'split topic' data sources in DSMR Reader."
}
}
},
"entity": {
"sensor": {
"low_tariff_usage": {
"name": "Low tariff usage"
},
"low_tariff_returned": {
"name": "Low tariff returned"
},
"high_tariff_usage": {
"name": "High tariff usage"
},
"high_tariff_returned": {
"name": "High tariff returned"
},
"current_power_usage": {
"name": "Current power usage"
},
"current_power_return": {
"name": "Current power return"
},
"current_power_usage_l1": {
"name": "Current power usage L1"
},
"current_power_usage_l2": {
"name": "Current power usage L2"
},
"current_power_usage_l3": {
"name": "Current power usage L3"
},
"current_power_return_l1": {
"name": "Current power return L1"
},
"current_power_return_l2": {
"name": "Current power return L2"
},
"current_power_return_l3": {
"name": "Current power return L3"
},
"gas_meter_usage": {
"name": "Gas meter usage"
},
"current_voltage_l1": {
"name": "Current voltage L1"
},
"current_voltage_l2": {
"name": "Current voltage L2"
},
"current_voltage_l3": {
"name": "Current voltage L3"
},
"phase_power_current_l1": {
"name": "Phase power current L1"
},
"phase_power_current_l2": {
"name": "Phase power current L2"
},
"phase_power_current_l3": {
"name": "Phase power current L3"
},
"telegram_timestamp": {
"name": "Telegram timestamp"
},
"gas_usage": {
"name": "Gas usage"
},
"current_gas_usage": {
"name": "Current gas usage"
},
"gas_meter_read": {
"name": "Gas meter read"
},
"daily_low_tariff_usage": {
"name": "Low tariff usage (daily)"
},
"daily_high_tariff_usage": {
"name": "High tariff usage (daily)"
},
"daily_low_tariff_return": {
"name": "Low tariff return (daily)"
},
"daily_high_tariff_return": {
"name": "High tariff return (daily)"
},
"daily_power_usage_total": {
"name": "Power usage total (daily)"
},
"daily_power_return_total": {
"name": "Power return total (daily)"
},
"daily_low_tariff_cost": {
"name": "Low tariff cost (daily)"
},
"daily_high_tariff_cost": {
"name": "High tariff cost (daily)"
},
"daily_power_total_cost": {
"name": "Power total cost (daily)"
},
"daily_gas_usage": {
"name": "Gas usage (daily)"
},
"gas_cost": {
"name": "Gas cost"
},
"total_cost": {
"name": "Total cost"
},
"low_tariff_delivered_price": {
"name": "Low tariff delivered price"
},
"high_tariff_delivered_price": {
"name": "High tariff delivered price"
},
"low_tariff_returned_price": {
"name": "Low tariff returned price"
},
"high_tariff_returned_price": {
"name": "High tariff returned price"
},
"gas_price": {
        "name": "Gas price"
},
"current_day_fixed_cost": {
"name": "Current day fixed cost"
},
"dsmr_version": {
"name": "DSMR version"
},
"electricity_tariff": {
"name": "Electricity tariff",
"state": {
"low": "Low",
"high": "High"
}
},
"power_failure_count": {
"name": "Power failure count"
},
"long_power_failure_count": {
"name": "Long power failure count"
},
"voltage_sag_l1": {
"name": "Voltage sag L1"
},
"voltage_sag_l2": {
"name": "Voltage sag L2"
},
"voltage_sag_l3": {
"name": "Voltage sag L3"
},
"voltage_swell_l1": {
"name": "Voltage swell L1"
},
"voltage_swell_l2": {
"name": "Voltage swell L2"
},
"voltage_swell_l3": {
"name": "Voltage swell L3"
},
"rejected_telegrams": {
"name": "Rejected telegrams"
},
"current_month_low_tariff_usage": {
"name": "Current month low tariff usage"
},
"current_month_high_tariff_usage": {
"name": "Current month high tariff usage"
},
"current_month_low_tariff_returned": {
"name": "Current month low tariff returned"
},
"current_month_high_tariff_returned": {
"name": "Current month high tariff returned"
},
"current_month_power_usage_total": {
"name": "Current month power usage total"
},
"current_month_power_return_total": {
"name": "Current month power return total"
},
"current_month_low_tariff_cost": {
"name": "Current month low tariff cost"
},
"current_month_high_tariff_cost": {
"name": "Current month high tariff cost"
},
"current_month_power_total_cost": {
"name": "Current month power total cost"
},
"current_month_gas_usage": {
"name": "Current month gas usage"
},
"current_month_gas_cost": {
"name": "Current month gas cost"
},
"current_month_fixed_cost": {
"name": "Current month fixed cost"
},
"current_month_total_cost": {
"name": "Current month total cost"
},
"current_year_low_tariff_usage": {
"name": "Current year low tariff usage"
},
"current_year_high_tariff_usage": {
"name": "Current year high tariff usage"
},
"current_year_low_tariff_returned": {
"name": "Current year low tariff returned"
},
"current_year_high_tariff_returned": {
"name": "Current year high tariff returned"
},
"current_year_power_usage_total": {
"name": "Current year power usage total"
},
"current_year_power_returned_total": {
"name": "Current year power returned total"
},
"current_year_low_tariff_cost": {
"name": "Current year low tariff cost"
},
"current_year_high_tariff_cost": {
"name": "Current year high tariff cost"
},
"current_year_power_total_cost": {
"name": "Current year power total cost"
},
"current_year_gas_usage": {
"name": "Current year gas usage"
},
"current_year_gas_cost": {
"name": "Current year gas cost"
},
"current_year_fixed_cost": {
"name": "Current year fixed cost"
},
"current_year_total_cost": {
"name": "Current year total cost"
},
"previous_quarter_hour_peak_usage": {
"name": "Previous quarter-hour peak usage"
},
"quarter_hour_peak_start_time": {
"name": "Quarter-hour peak start time"
},
"quarter_hour_peak_end_time": {
"name": "Quarter-hour peak end time"
}
}
}
}
@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/environment_canada",
"iot_class": "cloud_polling",
"loggers": ["env_canada"],
"requirements": ["env_canada==0.5.30"]
"requirements": ["env_canada==0.5.32"]
}
@@ -345,11 +345,19 @@ async def async_setup_entry( # noqa: C901
disconnect_cb()
entry_data.disconnect_callbacks = []
entry_data.available = False
# Clear out the states so that we will always dispatch
# Mark state as stale so that we will always dispatch
# the next state update of that type when the device reconnects
for state_keys in entry_data.state.values():
state_keys.clear()
entry_data.async_update_device_state(hass)
entry_data.stale_state = {
(type(entity_state), key)
for state_dict in entry_data.state.values()
for key, entity_state in state_dict.items()
}
if not hass.is_stopping:
# Avoid marking every esphome entity as unavailable on shutdown
# since it generates a lot of state changed events and database
# writes when we already know we're shutting down and the state
# will be cleared anyway.
entry_data.async_update_device_state(hass)
async def on_connect_error(err: Exception) -> None:
"""Start reauth flow if appropriate connect error type."""
@@ -70,6 +70,10 @@ class RuntimeEntryData:
client: APIClient
store: Store
state: dict[type[EntityState], dict[int, EntityState]] = field(default_factory=dict)
# When the disconnect callback is called, we mark all states
# as stale so we will always dispatch a state update when the
# device reconnects. This is the same format as state_subscriptions.
stale_state: set[tuple[type[EntityState], int]] = field(default_factory=set)
info: dict[str, dict[int, EntityInfo]] = field(default_factory=dict)
# A second list of EntityInfo objects
@@ -206,9 +210,11 @@ class RuntimeEntryData:
"""Distribute an update of state information to the target."""
key = state.key
state_type = type(state)
stale_state = self.stale_state
current_state_by_type = self.state[state_type]
current_state = current_state_by_type.get(key, _SENTINEL)
if current_state == state:
subscription_key = (state_type, key)
if current_state == state and subscription_key not in stale_state:
_LOGGER.debug(
"%s: ignoring duplicate update with key %s: %s",
self.name,
@@ -222,8 +228,8 @@ class RuntimeEntryData:
key,
state,
)
stale_state.discard(subscription_key)
current_state_by_type[key] = state
subscription_key = (state_type, key)
if subscription_key in self.state_subscriptions:
self.state_subscriptions[subscription_key]()
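The hunk above replaces wholesale clearing of the cached state with a `stale_state` set: duplicate updates are still suppressed, except for keys marked stale after a disconnect, which are always re-dispatched on reconnect. A minimal sketch of that pattern (simplified names, not the actual esphome `RuntimeEntryData` class):

```python
# Sketch of the stale-state dedup pattern from the hunk above
# (hypothetical simplified class, not the real RuntimeEntryData).
_SENTINEL = object()

class EntryData:
    def __init__(self):
        self.state = {}           # {state_type: {key: state}}
        self.stale_state = set()  # {(state_type, key)} marked on disconnect
        self.dispatched = []      # stands in for state_subscriptions callbacks

    def mark_disconnected(self):
        # Instead of clearing cached state, remember every (type, key)
        # so the next update of each is always dispatched after reconnect.
        self.stale_state = {
            (t, k) for t, d in self.state.items() for k in d
        }

    def update(self, state_type, key, value):
        current = self.state.setdefault(state_type, {})
        sub_key = (state_type, key)
        # Duplicates are ignored unless the key was marked stale.
        if current.get(key, _SENTINEL) == value and sub_key not in self.stale_state:
            return False
        self.stale_state.discard(sub_key)
        current[key] = value
        self.dispatched.append(sub_key)
        return True

ed = EntryData()
ed.update("sensor", 1, 20.5)   # new value -> dispatched
ed.update("sensor", 1, 20.5)   # duplicate -> ignored
ed.mark_disconnected()
ed.update("sensor", 1, 20.5)   # same value, but stale -> dispatched again
```

This keeps the cached values intact during the outage, which is what lets the shutdown path skip the "mark everything unavailable" churn described in the comment.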
@@ -14,6 +14,6 @@
"integration_type": "device",
"iot_class": "local_push",
"loggers": ["aioesphomeapi", "noiseprotocol"],
"requirements": ["aioesphomeapi==13.6.0", "esphome-dashboard-api==1.2.3"],
"requirements": ["aioesphomeapi==13.6.1", "esphome-dashboard-api==1.2.3"],
"zeroconf": ["_esphomelib._tcp.local."]
}
@@ -9,7 +9,9 @@ from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .const import DOMAIN, LOGGER
DATA_SCHEMA = vol.Schema({vol.Required("username"): str, vol.Required("password"): str})
DATA_SCHEMA = vol.Schema(
{vol.Required(CONF_USERNAME): str, vol.Required(CONF_PASSWORD): str}
)
async def validate_input(hass: core.HomeAssistant, data):
@@ -20,18 +22,11 @@ async def validate_input(hass: core.HomeAssistant, data):
session = async_get_clientsession(hass)
try:
api = await async_get_api(
data[CONF_USERNAME], data[CONF_PASSWORD], session=session
)
await async_get_api(data[CONF_USERNAME], data[CONF_PASSWORD], session=session)
except RequestError as request_error:
LOGGER.error("Error connecting to the Flo API: %s", request_error)
raise CannotConnect from request_error
user_info = await api.user.get_info()
a_location_id = user_info["locations"][0]["id"]
location_info = await api.location.get_info(a_location_id)
return {"title": location_info["nickname"]}
class ConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
"""Handle a config flow for flo."""
@@ -45,8 +40,10 @@ class ConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
await self.async_set_unique_id(user_input[CONF_USERNAME])
self._abort_if_unique_id_configured()
try:
info = await validate_input(self.hass, user_input)
return self.async_create_entry(title=info["title"], data=user_input)
await validate_input(self.hass, user_input)
return self.async_create_entry(
title=user_input[CONF_USERNAME], data=user_input
)
except CannotConnect:
errors["base"] = "cannot_connect"
@@ -51,5 +51,5 @@
"iot_class": "local_push",
"loggers": ["flux_led"],
"quality_scale": "platinum",
"requirements": ["flux_led==0.28.36"]
"requirements": ["flux_led==0.28.37"]
}
@@ -7,7 +7,7 @@
"documentation": "https://www.home-assistant.io/integrations/fritz",
"iot_class": "local_polling",
"loggers": ["fritzconnection"],
"requirements": ["fritzconnection==1.11.0", "xmltodict==0.13.0"],
"requirements": ["fritzconnection==1.12.0", "xmltodict==0.13.0"],
"ssdp": [
{
"st": "urn:schemas-upnp-org:device:fritzbox:1"
@@ -7,5 +7,5 @@
"integration_type": "device",
"iot_class": "local_polling",
"loggers": ["fritzconnection"],
"requirements": ["fritzconnection==1.11.0"]
"requirements": ["fritzconnection==1.12.0"]
}
@@ -20,5 +20,5 @@
"documentation": "https://www.home-assistant.io/integrations/frontend",
"integration_type": "system",
"quality_scale": "internal",
"requirements": ["home-assistant-frontend==20230403.0"]
"requirements": ["home-assistant-frontend==20230411.0"]
}
@@ -285,17 +285,18 @@ async def async_setup_add_event_service(
raise ValueError(
"Missing required fields to set start or end date/datetime"
)
event = Event(
summary=call.data[EVENT_SUMMARY],
description=call.data[EVENT_DESCRIPTION],
start=start,
end=end,
)
if location := call.data.get(EVENT_LOCATION):
event.location = location
try:
await calendar_service.async_create_event(
call.data[EVENT_CALENDAR_ID],
Event(
summary=call.data[EVENT_SUMMARY],
description=call.data[EVENT_DESCRIPTION],
location=call.data[EVENT_LOCATION],
start=start,
end=end,
),
event,
)
except ApiException as err:
raise HomeAssistantError(str(err)) from err
@@ -508,9 +508,10 @@ class GoogleCalendarEntity(
"start": start,
"end": end,
EVENT_DESCRIPTION: kwargs.get(EVENT_DESCRIPTION),
EVENT_LOCATION: kwargs.get(EVENT_LOCATION),
}
)
if location := kwargs.get(EVENT_LOCATION):
event.location = location
if rrule := kwargs.get(EVENT_RRULE):
event.recurrence = [f"{RRULE_PREFIX}{rrule}"]
@@ -597,18 +598,20 @@ async def async_create_event(entity: GoogleCalendarEntity, call: ServiceCall) ->
if start is None or end is None:
raise ValueError("Missing required fields to set start or end date/datetime")
event = Event(
summary=call.data[EVENT_SUMMARY],
description=call.data[EVENT_DESCRIPTION],
start=start,
end=end,
)
if location := call.data.get(EVENT_LOCATION):
event.location = location
try:
await cast(
CalendarSyncUpdateCoordinator, entity.coordinator
).sync.api.async_create_event(
entity.calendar_id,
Event(
summary=call.data[EVENT_SUMMARY],
description=call.data[EVENT_DESCRIPTION],
location=call.data[EVENT_LOCATION],
start=start,
end=end,
),
event,
)
except ApiException as err:
raise HomeAssistantError(str(err)) from err
@@ -7,5 +7,5 @@
"documentation": "https://www.home-assistant.io/integrations/calendar.google/",
"iot_class": "cloud_polling",
"loggers": ["googleapiclient"],
"requirements": ["gcal-sync==4.1.2", "oauth2client==4.1.3"]
"requirements": ["gcal-sync==4.1.4", "oauth2client==4.1.3"]
}
@@ -20,7 +20,7 @@ LANG_TO_BROADCAST_COMMAND = {
"it": ("Trasmetti {0}", "Trasmetti in {1} {0}"),
"ja": ("{0}とブロードキャストして", "{0}{1}にブロードキャストして"),
"ko": ("{0} 라고 방송해 줘", "{0} 라고 {1}에 방송해 줘"),
"pt": ("Transmite {0}", "Transmite para {1} {0}"),
"pt": ("Transmitir {0}", "Transmitir {0} para {1}"),
}
@@ -151,7 +151,7 @@ class DeviceWithPrograms(HomeConnectDevice):
programs_available = self.appliance.get_programs_available()
except (HomeConnectError, ValueError):
_LOGGER.debug("Unable to fetch available programs. Probably offline")
programs_available = None
programs_available = []
return programs_available
def get_program_switches(self):
@@ -272,7 +272,8 @@ class HKDevice:
self.hass,
self.async_update_available_state,
timedelta(seconds=BLE_AVAILABILITY_CHECK_INTERVAL),
f"HomeKit Controller {self.unique_id} BLE availability check poll",
name=f"HomeKit Controller {self.unique_id} BLE availability "
"check poll",
)
)
# BLE devices always get an RSSI sensor as well
@@ -290,7 +291,7 @@ class HKDevice:
self.hass,
self.async_request_update,
self.pairing.poll_interval,
f"HomeKit Controller {self.unique_id} availability check poll",
name=f"HomeKit Controller {self.unique_id} availability check poll",
)
)
@@ -1,9 +1,11 @@
"""Support for Honeywell (US) Total Connect Comfort climate systems."""
from __future__ import annotations
import asyncio
import datetime
from typing import Any
from aiohttp import ClientConnectionError
import aiosomecomfort
from homeassistant.components.climate import (
@@ -421,10 +423,7 @@ class HoneywellUSThermostat(ClimateEntity):
try:
await self._device.refresh()
self._attr_available = True
except (
aiosomecomfort.SomeComfortError,
OSError,
):
except aiosomecomfort.SomeComfortError:
try:
await self._data.client.login()
@@ -433,5 +432,12 @@ class HoneywellUSThermostat(ClimateEntity):
await self.hass.async_create_task(
self.hass.config_entries.async_reload(self._data.entry_id)
)
except aiosomecomfort.SomeComfortError:
except (
aiosomecomfort.SomeComfortError,
ClientConnectionError,
asyncio.TimeoutError,
):
self._attr_available = False
except (ClientConnectionError, asyncio.TimeoutError):
self._attr_available = False
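The Honeywell hunk above narrows the refresh error handling into tiers: an API-level `SomeComfortError` triggers a re-login attempt, while transport errors (`ClientConnectionError`, `asyncio.TimeoutError`) — now also caught around the login itself — simply mark the entity unavailable. A hedged sketch of that tiering (the exception names come from the diff; the `device`/`client` objects and return value are stand-ins, not the real integration API):

```python
import asyncio

class SomeComfortError(Exception):
    """Stand-in for aiosomecomfort.SomeComfortError."""

class ClientConnectionError(Exception):
    """Stand-in for aiohttp.ClientConnectionError."""

async def refresh(device, client):
    """Return availability using the tiered handling from the hunk above."""
    try:
        await device.refresh()
        return True  # refresh succeeded -> available
    except SomeComfortError:
        # API-level failure: the session may have expired, so retry login.
        try:
            await client.login()
        except (SomeComfortError, ClientConnectionError, asyncio.TimeoutError):
            return False  # re-auth also failed -> unavailable
    except (ClientConnectionError, asyncio.TimeoutError):
        return False  # transport failure -> unavailable
    return True
```

In the real diff a successful re-login goes on to reload the config entry; the sketch only models the availability outcome.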
@@ -194,7 +194,11 @@ class ImapDataUpdateCoordinator(DataUpdateCoordinator[int | None]):
if count
else None
)
if count and last_message_id is not None:
if (
count
and last_message_id is not None
and self._last_message_id != last_message_id
):
self._last_message_id = last_message_id
await self._async_process_event(last_message_id)
@@ -209,10 +213,9 @@ class ImapDataUpdateCoordinator(DataUpdateCoordinator[int | None]):
await self.imap_client.stop_wait_server_push()
await self.imap_client.close()
await self.imap_client.logout()
except (AioImapException, asyncio.TimeoutError) as ex:
except (AioImapException, asyncio.TimeoutError):
if log_error:
self.async_set_update_error(ex)
_LOGGER.warning("Error while cleaning up imap connection")
_LOGGER.debug("Error while cleaning up imap connection")
self.imap_client = None
async def shutdown(self, *_) -> None:
@@ -236,18 +239,18 @@ class ImapPollingDataUpdateCoordinator(ImapDataUpdateCoordinator):
UpdateFailed,
asyncio.TimeoutError,
) as ex:
self.async_set_update_error(ex)
await self._cleanup()
self.async_set_update_error(ex)
raise UpdateFailed() from ex
except InvalidFolder as ex:
_LOGGER.warning("Selected mailbox folder is invalid")
self.async_set_update_error(ex)
await self._cleanup()
self.async_set_update_error(ex)
raise ConfigEntryError("Selected mailbox folder is invalid.") from ex
except InvalidAuth as ex:
_LOGGER.warning("Username or password incorrect, starting reauthentication")
self.async_set_update_error(ex)
await self._cleanup()
self.async_set_update_error(ex)
raise ConfigEntryAuthFailed() from ex
@@ -276,30 +279,30 @@ class ImapPushDataUpdateCoordinator(ImapDataUpdateCoordinator):
try:
number_of_messages = await self._async_fetch_number_of_messages()
except InvalidAuth as ex:
await self._cleanup()
_LOGGER.warning(
"Username or password incorrect, starting reauthentication"
)
self.config_entry.async_start_reauth(self.hass)
self.async_set_update_error(ex)
await self._cleanup()
await asyncio.sleep(BACKOFF_TIME)
except InvalidFolder as ex:
_LOGGER.warning("Selected mailbox folder is invalid")
await self._cleanup()
self.config_entry.async_set_state(
self.hass,
ConfigEntryState.SETUP_ERROR,
"Selected mailbox folder is invalid.",
)
self.async_set_update_error(ex)
await self._cleanup()
await asyncio.sleep(BACKOFF_TIME)
except (
UpdateFailed,
AioImapException,
asyncio.TimeoutError,
) as ex:
self.async_set_update_error(ex)
await self._cleanup()
self.async_set_update_error(ex)
await asyncio.sleep(BACKOFF_TIME)
continue
else:
@@ -312,12 +315,11 @@ class ImapPushDataUpdateCoordinator(ImapDataUpdateCoordinator):
await idle
except (AioImapException, asyncio.TimeoutError):
_LOGGER.warning(
_LOGGER.debug(
"Lost %s (will attempt to reconnect after %s s)",
self.config_entry.data[CONF_SERVER],
BACKOFF_TIME,
)
self.async_set_update_error(UpdateFailed("Lost connection"))
await self._cleanup()
await asyncio.sleep(BACKOFF_TIME)
@@ -17,7 +17,7 @@
"iot_class": "local_push",
"loggers": ["pyinsteon", "pypubsub"],
"requirements": [
"pyinsteon==1.4.0",
"pyinsteon==1.4.1",
"insteon-frontend-home-assistant==0.3.4"
],
"usb": [
@@ -142,8 +142,11 @@ class ControllerDevice(ClimateEntity):
# If mode RAS, or mode master with CtrlZone 13 then can set master temperature,
# otherwise the unit determines which zone to use as target. See interface manual p. 8
+# It appears some systems may have a different numbering system, so will trigger
+# this if the control zone is > total zones.
if (
-controller.ras_mode == "master" and controller.zone_ctrl == 13
+controller.ras_mode == "master"
+and controller.zone_ctrl > controller.zones_total
) or controller.ras_mode == "RAS":
self._attr_supported_features |= ClimateEntityFeature.TARGET_TEMPERATURE
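The changed condition above reduces to a small predicate; a standalone sketch (the helper name is ours, not part of the diff) shows the old magic value 13 replaced by a comparison against the total zone count:

```python
def supports_master_target_temp(ras_mode: str, zone_ctrl: int, zones_total: int) -> bool:
    # Hypothetical helper mirroring the updated condition: in "master" mode,
    # a control zone numbered above the total zone count means the unit's
    # master temperature can be set; "RAS" mode always supports it.
    return ras_mode == "RAS" or (
        ras_mode == "master" and zone_ctrl > zones_total
    )
```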
@@ -7,5 +7,5 @@
"integration_type": "service",
"iot_class": "local_polling",
"loggers": ["aiopyarr"],
-"requirements": ["aiopyarr==22.11.0"]
+"requirements": ["aiopyarr==23.4.0"]
}
@@ -2,7 +2,7 @@
from __future__ import annotations
-from datetime import datetime
+from datetime import date, datetime, timedelta
import logging
from typing import Any
@@ -186,14 +186,23 @@ def _parse_event(event: dict[str, Any]) -> Event:
def _get_calendar_event(event: Event) -> CalendarEvent:
"""Return a CalendarEvent from an API event."""
+start: datetime | date
+end: datetime | date
+if isinstance(event.start, datetime) and isinstance(event.end, datetime):
+start = dt_util.as_local(event.start)
+end = dt_util.as_local(event.end)
+if (end - start) <= timedelta(seconds=0):
+end = start + timedelta(minutes=30)
+else:
+start = event.start
+end = event.end
+if (end - start) < timedelta(days=0):
+end = start + timedelta(days=1)
return CalendarEvent(
summary=event.summary,
-start=dt_util.as_local(event.start)
-if isinstance(event.start, datetime)
-else event.start,
-end=dt_util.as_local(event.end)
-if isinstance(event.end, datetime)
-else event.end,
+start=start,
+end=end,
description=event.description,
uid=event.uid,
rrule=event.rrule.as_rrule_str() if event.rrule else None,
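The duration clamping added in this hunk can be sketched in isolation (the function name is ours; the `dt_util.as_local` localization step is omitted): zero or negative-length timed events get a 30-minute duration, and all-day events with a negative length get one full day.

```python
from datetime import date, datetime, timedelta


def clamp_event(start, end):
    """Sketch of the all-day/timed event coercion shown in the diff above."""
    if isinstance(start, datetime) and isinstance(end, datetime):
        # Timed event: never shorter than or equal to zero length
        if (end - start) <= timedelta(seconds=0):
            end = start + timedelta(minutes=30)
    elif (end - start) < timedelta(days=0):
        # All-day event: a negative span becomes a single day
        end = start + timedelta(days=1)
    return start, end
```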
@@ -22,6 +22,8 @@ import homeassistant.util.dt as dt_util
_LOGGER = logging.getLogger(__name__)
API_FAILURE = -1
DEFAULT_NAME = "NMBS"
DEFAULT_ICON = "mdi:train"
@@ -162,16 +164,19 @@ class NMBSLiveBoard(SensorEntity):
"""Set the state equal to the next departure."""
liveboard = self._api_client.get_liveboard(self._station)
-if (
-liveboard is None
-or liveboard.get("departures") is None
-or liveboard.get("departures").get("number") is None
-or liveboard.get("departures").get("number") == "0"
-or liveboard.get("departures").get("departure") is None
-):
+if liveboard == API_FAILURE:
+_LOGGER.warning("API failed in NMBSLiveBoard")
return
-next_departure = liveboard["departures"]["departure"][0]
+if not (departures := liveboard.get("departures")):
+_LOGGER.warning("API returned invalid departures: %r", liveboard)
+return
+_LOGGER.debug("API returned departures: %r", departures)
+if departures["number"] == "0":
+# No trains are scheduled
+return
+next_departure = departures["departure"][0]
self._attrs = next_departure
self._state = (
@@ -290,13 +295,19 @@ class NMBSSensor(SensorEntity):
self._station_from, self._station_to
)
-if connections is None or not connections.get("connection"):
+if connections == API_FAILURE:
+_LOGGER.warning("API failed in NMBSSensor")
return
-if int(connections["connection"][0]["departure"]["left"]) > 0:
-next_connection = connections["connection"][1]
+if not (connection := connections.get("connection")):
+_LOGGER.warning("API returned invalid connection: %r", connections)
+return
+_LOGGER.debug("API returned connection: %r", connection)
+if int(connection[0]["departure"]["left"]) > 0:
+next_connection = connection[1]
else:
-next_connection = connections["connection"][0]
+next_connection = connection[0]
self._attrs = next_connection
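The guard chain this hunk introduces can be sketched as one small function (the function name is ours; `API_FAILURE` and the payload shape come from the hunks above): fail fast on the sentinel, then on a missing `"connection"` list, then pick the second connection when the first one's `"left"` counter is positive.

```python
API_FAILURE = -1  # sentinel used by the integration for a failed API call


def pick_next_connection(connections):
    """Sketch of the new validation/selection flow shown in the diff."""
    if connections == API_FAILURE:
        return None  # API call failed
    if not (connection := connections.get("connection")):
        return None  # payload is missing the "connection" list
    # Per the hunk: fall back to the second connection when the first
    # one's "left" counter is positive.
    if int(connection[0]["departure"]["left"]) > 0:
        return connection[1]
    return connection[0]
```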
@@ -25,13 +25,12 @@ from homeassistant.const import (
Platform,
)
from homeassistant.core import Event, HomeAssistant
-from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.helpers import (
device_registry as dr,
entity_registry as er,
issue_registry as ir,
)
-from homeassistant.helpers.network import get_url
+from homeassistant.helpers.network import NoURLAvailableError, get_url
from homeassistant.helpers.update_coordinator import (
CoordinatorEntity,
DataUpdateCoordinator,
@@ -47,7 +46,7 @@ from .const import (
DOMAIN,
ERROR_STATES,
)
-from .helpers import parse_id
+from .helpers import NukiWebhookException, parse_id
_NukiDeviceT = TypeVar("_NukiDeviceT", bound=NukiDevice)
@@ -61,6 +60,87 @@ def _get_bridge_devices(bridge: NukiBridge) -> tuple[list[NukiLock], list[NukiOp
return bridge.locks, bridge.openers
async def _create_webhook(
hass: HomeAssistant, entry: ConfigEntry, bridge: NukiBridge
) -> None:
# Create HomeAssistant webhook
async def handle_webhook(
hass: HomeAssistant, webhook_id: str, request: web.Request
) -> web.Response:
"""Handle webhook callback."""
try:
data = await request.json()
except ValueError:
return web.Response(status=HTTPStatus.BAD_REQUEST)
locks = hass.data[DOMAIN][entry.entry_id][DATA_LOCKS]
openers = hass.data[DOMAIN][entry.entry_id][DATA_OPENERS]
devices = [x for x in locks + openers if x.nuki_id == data["nukiId"]]
if len(devices) == 1:
devices[0].update_from_callback(data)
coordinator = hass.data[DOMAIN][entry.entry_id][DATA_COORDINATOR]
coordinator.async_set_updated_data(None)
return web.Response(status=HTTPStatus.OK)
webhook.async_register(
hass, DOMAIN, entry.title, entry.entry_id, handle_webhook, local_only=True
)
webhook_url = webhook.async_generate_path(entry.entry_id)
try:
hass_url = get_url(
hass,
allow_cloud=False,
allow_external=False,
allow_ip=True,
require_ssl=False,
)
except NoURLAvailableError:
webhook.async_unregister(hass, entry.entry_id)
raise NukiWebhookException(
f"Error registering URL for webhook {entry.entry_id}: "
"HomeAssistant URL is not available"
) from None
url = f"{hass_url}{webhook_url}"
if hass_url.startswith("https"):
ir.async_create_issue(
hass,
DOMAIN,
"https_webhook",
is_fixable=False,
severity=ir.IssueSeverity.WARNING,
translation_key="https_webhook",
translation_placeholders={
"base_url": hass_url,
"network_link": "https://my.home-assistant.io/redirect/network/",
},
)
else:
ir.async_delete_issue(hass, DOMAIN, "https_webhook")
try:
async with async_timeout.timeout(10):
await hass.async_add_executor_job(
_register_webhook, bridge, entry.entry_id, url
)
except InvalidCredentialsException as err:
webhook.async_unregister(hass, entry.entry_id)
raise NukiWebhookException(
f"Invalid credentials for Bridge: {err}"
) from err
except RequestException as err:
webhook.async_unregister(hass, entry.entry_id)
raise NukiWebhookException(
f"Error communicating with Bridge: {err}"
) from err
def _register_webhook(bridge: NukiBridge, entry_id: str, url: str) -> bool:
# Register HA URL as webhook if not already
callbacks = bridge.callback_list()
@@ -126,66 +206,10 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
sw_version=info["versions"]["firmwareVersion"],
)
async def handle_webhook(
hass: HomeAssistant, webhook_id: str, request: web.Request
) -> web.Response:
"""Handle webhook callback."""
try:
data = await request.json()
except ValueError:
return web.Response(status=HTTPStatus.BAD_REQUEST)
locks = hass.data[DOMAIN][entry.entry_id][DATA_LOCKS]
openers = hass.data[DOMAIN][entry.entry_id][DATA_OPENERS]
devices = [x for x in locks + openers if x.nuki_id == data["nukiId"]]
if len(devices) == 1:
devices[0].update_from_callback(data)
coordinator = hass.data[DOMAIN][entry.entry_id][DATA_COORDINATOR]
coordinator.async_set_updated_data(None)
return web.Response(status=HTTPStatus.OK)
webhook.async_register(
hass, DOMAIN, entry.title, entry.entry_id, handle_webhook, local_only=True
)
webhook_url = webhook.async_generate_path(entry.entry_id)
hass_url = get_url(
hass, allow_cloud=False, allow_external=False, allow_ip=True, require_ssl=False
)
url = f"{hass_url}{webhook_url}"
if hass_url.startswith("https"):
ir.async_create_issue(
hass,
DOMAIN,
"https_webhook",
is_fixable=False,
severity=ir.IssueSeverity.WARNING,
translation_key="https_webhook",
translation_placeholders={
"base_url": hass_url,
"network_link": "https://my.home-assistant.io/redirect/network/",
},
)
else:
ir.async_delete_issue(hass, DOMAIN, "https_webhook")
try:
async with async_timeout.timeout(10):
await hass.async_add_executor_job(
_register_webhook, bridge, entry.entry_id, url
)
except InvalidCredentialsException as err:
webhook.async_unregister(hass, entry.entry_id)
raise ConfigEntryNotReady(f"Invalid credentials for Bridge: {err}") from err
except RequestException as err:
webhook.async_unregister(hass, entry.entry_id)
raise ConfigEntryNotReady(
f"Error communicating with Bridge: {err}"
) from err
try:
await _create_webhook(hass, entry, bridge)
except NukiWebhookException as err:
_LOGGER.warning("Error registering HomeAssistant webhook: %s", err)
async def _stop_nuki(_: Event):
"""Stop and remove the Nuki webhook."""
@@ -13,3 +13,7 @@ class CannotConnect(exceptions.HomeAssistantError):
class InvalidAuth(exceptions.HomeAssistantError):
"""Error to indicate there is invalid auth."""
class NukiWebhookException(exceptions.HomeAssistantError):
"""Error to indicate there was an issue with the webhook."""
@@ -27,11 +27,9 @@ _R = TypeVar("_R")
_P = ParamSpec("_P")
INFO_URL_SKY_CONNECT = (
-"https://skyconnect.home-assistant.io/procedures/enable-multiprotocol/#limitations"
+"https://skyconnect.home-assistant.io/multiprotocol-channel-missmatch"
)
-INFO_URL_YELLOW = (
-"https://yellow.home-assistant.io/guides/enable-multiprotocol/#limitations"
-)
+INFO_URL_YELLOW = "https://yellow.home-assistant.io/multiprotocol-channel-missmatch"
INSECURE_NETWORK_KEYS = (
# Thread web UI default
@@ -16,9 +16,9 @@ from homeassistant.const import (
CONF_VERIFY_SSL,
Platform,
)
-from homeassistant.core import HomeAssistant
+from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import ConfigEntryAuthFailed
-from homeassistant.helpers import config_validation as cv
+from homeassistant.helpers import config_validation as cv, entity_registry as er
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.entity import DeviceInfo
from homeassistant.helpers.update_coordinator import (
@@ -64,6 +64,38 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
_LOGGER.debug("Setting up %s integration with host %s", DOMAIN, host)
name_to_key = {
"Core Update Available": "core_update_available",
"Web Update Available": "web_update_available",
"FTL Update Available": "ftl_update_available",
"Status": "status",
"Ads Blocked Today": "ads_blocked_today",
"Ads Percentage Blocked Today": "ads_percentage_today",
"Seen Clients": "clients_ever_seen",
"DNS Queries Today": "dns_queries_today",
"Domains Blocked": "domains_being_blocked",
"DNS Queries Cached": "queries_cached",
"DNS Queries Forwarded": "queries_forwarded",
"DNS Unique Clients": "unique_clients",
"DNS Unique Domains": "unique_domains",
}
@callback
def update_unique_id(
entity_entry: er.RegistryEntry,
) -> dict[str, str] | None:
"""Update unique ID of entity entry."""
unique_id_parts = entity_entry.unique_id.split("/")
if len(unique_id_parts) == 2 and unique_id_parts[1] in name_to_key:
name = unique_id_parts[1]
new_unique_id = entity_entry.unique_id.replace(name, name_to_key[name])
_LOGGER.debug("Migrate %s to %s", entity_entry.unique_id, new_unique_id)
return {"new_unique_id": new_unique_id}
return None
await er.async_migrate_entries(hass, entry.entry_id, update_unique_id)
session = async_get_clientsession(hass, verify_tls)
api = Hole(
host,
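The `update_unique_id` callback above is a pure mapping over the unique ID's second segment; a standalone sketch (the function name is ours, and it returns the new ID directly instead of the registry's `{"new_unique_id": ...}` dict):

```python
def migrate_unique_id(unique_id: str, name_to_key: dict[str, str]):
    """Sketch of the Pi-hole unique-ID migration shown in the diff: map the
    legacy display-name segment of a two-part unique ID to its stable key."""
    parts = unique_id.split("/")
    if len(parts) == 2 and parts[1] in name_to_key:
        return unique_id.replace(parts[1], name_to_key[parts[1]])
    return None  # no migration needed
```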
@@ -164,11 +164,12 @@ async def async_setup_entry( # noqa: C901
obj_type = call.data[CONF_TYPE]
-_LOGGER.critical(
-"%s objects in memory: %s",
-obj_type,
-[_safe_repr(obj) for obj in objgraph.by_type(obj_type)],
-)
+for obj in objgraph.by_type(obj_type):
+_LOGGER.critical(
+"%s object in memory: %s",
+obj_type,
+_safe_repr(obj),
+)
persistent_notification.create(
hass,
@@ -38,7 +38,7 @@ BUTTONS: dict[str, tuple[PrusaLinkButtonEntityDescription, ...]] = {
"printer": (
PrusaLinkButtonEntityDescription[PrinterInfo](
key="printer.cancel_job",
-name="Cancel Job",
+translation_key="cancel_job",
icon="mdi:cancel",
press_fn=lambda api: cast(Coroutine, api.cancel_job()),
available_fn=lambda data: any(
@@ -48,7 +48,7 @@ BUTTONS: dict[str, tuple[PrusaLinkButtonEntityDescription, ...]] = {
),
PrusaLinkButtonEntityDescription[PrinterInfo](
key="job.pause_job",
-name="Pause Job",
+translation_key="pause_job",
icon="mdi:pause",
press_fn=lambda api: cast(Coroutine, api.pause_job()),
available_fn=lambda data: (
@@ -58,7 +58,7 @@ BUTTONS: dict[str, tuple[PrusaLinkButtonEntityDescription, ...]] = {
),
PrusaLinkButtonEntityDescription[PrinterInfo](
key="job.resume_job",
-name="Resume Job",
+translation_key="resume_job",
icon="mdi:play",
press_fn=lambda api: cast(Coroutine, api.resume_job()),
available_fn=lambda data: cast(bool, data["state"]["flags"]["paused"]),
@@ -24,7 +24,7 @@ class PrusaLinkJobPreviewEntity(PrusaLinkEntity, Camera):
last_path = ""
last_image: bytes
-_attr_name = "Job Preview"
+_attr_translation_key = "job_preview"
def __init__(self, coordinator: JobUpdateCoordinator) -> None:
"""Initialize a PrusaLink camera entity."""
@@ -65,7 +65,7 @@ SENSORS: dict[str, tuple[PrusaLinkSensorEntityDescription, ...]] = {
),
PrusaLinkSensorEntityDescription[PrinterInfo](
key="printer.telemetry.temp-bed",
-name="Heatbed",
+translation_key="heatbed_temperature",
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
device_class=SensorDeviceClass.TEMPERATURE,
state_class=SensorStateClass.MEASUREMENT,
@@ -74,7 +74,7 @@ SENSORS: dict[str, tuple[PrusaLinkSensorEntityDescription, ...]] = {
),
PrusaLinkSensorEntityDescription[PrinterInfo](
key="printer.telemetry.temp-nozzle",
-name="Nozzle Temperature",
+translation_key="nozzle_temperature",
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
device_class=SensorDeviceClass.TEMPERATURE,
state_class=SensorStateClass.MEASUREMENT,
@@ -85,7 +85,7 @@ SENSORS: dict[str, tuple[PrusaLinkSensorEntityDescription, ...]] = {
"job": (
PrusaLinkSensorEntityDescription[JobInfo](
key="job.progress",
-name="Progress",
+translation_key="progress",
icon="mdi:progress-clock",
native_unit_of_measurement=PERCENTAGE,
value_fn=lambda data: cast(float, data["progress"]["completion"]) * 100,
@@ -93,14 +93,14 @@ SENSORS: dict[str, tuple[PrusaLinkSensorEntityDescription, ...]] = {
),
PrusaLinkSensorEntityDescription[JobInfo](
key="job.filename",
-name="Filename",
+translation_key="filename",
icon="mdi:file-image-outline",
value_fn=lambda data: cast(str, data["job"]["file"]["display"]),
available_fn=lambda data: data.get("job") is not None,
),
PrusaLinkSensorEntityDescription[JobInfo](
key="job.start",
-name="Print Start",
+translation_key="print_start",
device_class=SensorDeviceClass.TIMESTAMP,
icon="mdi:clock-start",
value_fn=ignore_variance(
@@ -113,7 +113,7 @@ SENSORS: dict[str, tuple[PrusaLinkSensorEntityDescription, ...]] = {
),
PrusaLinkSensorEntityDescription[JobInfo](
key="job.finish",
-name="Print Finish",
+translation_key="print_finish",
icon="mdi:clock-end",
device_class=SensorDeviceClass.TIMESTAMP,
value_fn=ignore_variance(
@@ -25,6 +25,40 @@
"pausing": "Pausing",
"printing": "Printing"
}
},
"heatbed_temperature": {
"name": "Heatbed temperature"
},
"nozzle_temperature": {
"name": "Nozzle temperature"
},
"progress": {
"name": "Progress"
},
"filename": {
"name": "Filename"
},
"print_start": {
"name": "Print start"
},
"print_finish": {
"name": "Print finish"
}
},
"button": {
"cancel_job": {
"name": "Cancel job"
},
"pause_job": {
"name": "Pause job"
},
"resume_job": {
"name": "Resume job"
}
},
"camera": {
"job_preview": {
"name": "Preview"
}
}
}
@@ -7,5 +7,5 @@
"integration_type": "service",
"iot_class": "local_polling",
"loggers": ["aiopyarr"],
-"requirements": ["aiopyarr==22.11.0"]
+"requirements": ["aiopyarr==23.4.0"]
}
@@ -32,6 +32,7 @@ from .const import ( # noqa: F401
INTEGRATION_PLATFORM_EXCLUDE_ATTRIBUTES,
INTEGRATION_PLATFORMS_LOAD_IN_RECORDER_THREAD,
SQLITE_URL_PREFIX,
SupportedDialect,
)
from .core import Recorder
from .services import async_register_services
@@ -58,6 +58,7 @@ from .const import (
SupportedDialect,
)
from .db_schema import (
LEGACY_STATES_ENTITY_ID_LAST_UPDATED_INDEX,
LEGACY_STATES_EVENT_ID_INDEX,
SCHEMA_VERSION,
TABLE_STATES,
@@ -96,6 +97,7 @@ from .tasks import (
CompileMissingStatisticsTask,
DatabaseLockTask,
EntityIDMigrationTask,
EntityIDPostMigrationTask,
EventIdMigrationTask,
EventsContextIDMigrationTask,
EventTask,
@@ -299,7 +301,7 @@ class Recorder(threading.Thread):
self.hass,
self._async_check_queue,
timedelta(minutes=10),
-"Recorder queue watcher",
+name="Recorder queue watcher",
)
@callback
@@ -602,7 +604,7 @@ class Recorder(threading.Thread):
self.hass,
self._async_keep_alive,
timedelta(seconds=KEEPALIVE_TIME),
-"Recorder keep alive",
+name="Recorder keep alive",
)
# If the commit interval is not 0, we need to commit periodically
@@ -611,7 +613,7 @@ class Recorder(threading.Thread):
self.hass,
self._async_commit,
timedelta(seconds=self.commit_interval),
-"Recorder commit",
+name="Recorder commit",
)
# Run nightly tasks at 4:12am
@@ -757,6 +759,18 @@ class Recorder(threading.Thread):
else:
_LOGGER.debug("Activating states_meta manager as all data is migrated")
self.states_meta_manager.active = True
with contextlib.suppress(SQLAlchemyError):
# If ix_states_entity_id_last_updated_ts still exists
# on the states table it means the entity id migration
# finished by the EntityIDPostMigrationTask did not
# because they restarted in the middle of it. We need
# to pick back up where we left off.
if get_index_by_name(
session,
TABLE_STATES,
LEGACY_STATES_ENTITY_ID_LAST_UPDATED_INDEX,
):
self.queue_task(EntityIDPostMigrationTask())
if self.schema_version > LEGACY_STATES_EVENT_ID_INDEX_SCHEMA_VERSION:
with contextlib.suppress(SQLAlchemyError):
@@ -119,6 +119,7 @@ METADATA_ID_LAST_UPDATED_INDEX_TS = "ix_states_metadata_id_last_updated_ts"
EVENTS_CONTEXT_ID_BIN_INDEX = "ix_events_context_id_bin"
STATES_CONTEXT_ID_BIN_INDEX = "ix_states_context_id_bin"
LEGACY_STATES_EVENT_ID_INDEX = "ix_states_event_id"
LEGACY_STATES_ENTITY_ID_LAST_UPDATED_INDEX = "ix_states_entity_id_last_updated_ts"
CONTEXT_ID_BIN_MAX_LENGTH = 16
MYSQL_COLLATE = "utf8mb4_unicode_ci"
@@ -284,7 +285,7 @@ class Events(Base):
"""Convert to a native HA Event."""
context = Context(
id=bytes_to_ulid_or_none(self.context_id_bin),
-user_id=bytes_to_uuid_hex_or_none(self.context_user_id),
+user_id=bytes_to_uuid_hex_or_none(self.context_user_id_bin),
parent_id=bytes_to_ulid_or_none(self.context_parent_id_bin),
)
try:
@@ -508,7 +509,7 @@ class States(Base):
"""Convert to an HA state object."""
context = Context(
id=bytes_to_ulid_or_none(self.context_id_bin),
-user_id=bytes_to_uuid_hex_or_none(self.context_user_id),
+user_id=bytes_to_uuid_hex_or_none(self.context_user_id_bin),
parent_id=bytes_to_ulid_or_none(self.context_parent_id_bin),
)
try:
@@ -48,6 +48,7 @@ from .const import SupportedDialect
from .db_schema import (
CONTEXT_ID_BIN_MAX_LENGTH,
DOUBLE_PRECISION_TYPE_SQL,
LEGACY_STATES_ENTITY_ID_LAST_UPDATED_INDEX,
LEGACY_STATES_EVENT_ID_INDEX,
MYSQL_COLLATE,
MYSQL_DEFAULT_CHARSET,
@@ -913,7 +914,7 @@ def _apply_update( # noqa: C901
_create_index(session_maker, "events", "ix_events_event_type_time_fired_ts")
_create_index(session_maker, "states", "ix_states_entity_id_last_updated_ts")
_create_index(session_maker, "states", "ix_states_last_updated_ts")
-_migrate_columns_to_timestamp(session_maker, engine)
+_migrate_columns_to_timestamp(instance, session_maker, engine)
elif new_version == 32:
# Migration is done in two steps to ensure we can start using
# the new columns before we wipe the old ones.
@@ -966,7 +967,7 @@ def _apply_update( # noqa: C901
"ix_statistics_short_term_statistic_id_start_ts",
)
try:
-_migrate_statistics_columns_to_timestamp(session_maker, engine)
+_migrate_statistics_columns_to_timestamp(instance, session_maker, engine)
except IntegrityError as ex:
_LOGGER.error(
"Statistics table contains duplicate entries: %s; "
@@ -979,7 +980,7 @@ def _apply_update( # noqa: C901
# and try again
with session_scope(session=session_maker()) as session:
delete_statistics_duplicates(instance, hass, session)
-_migrate_statistics_columns_to_timestamp(session_maker, engine)
+_migrate_statistics_columns_to_timestamp(instance, session_maker, engine)
# Log at error level to ensure the user sees this message in the log
# since we logged the error above.
_LOGGER.error(
@@ -1195,8 +1196,9 @@ def _wipe_old_string_time_columns(
session.commit()
@database_job_retry_wrapper("Migrate columns to timestamp", 3)
def _migrate_columns_to_timestamp(
-session_maker: Callable[[], Session], engine: Engine
+instance: Recorder, session_maker: Callable[[], Session], engine: Engine
) -> None:
"""Migrate columns to use timestamp."""
# Migrate all data in Events.time_fired to Events.time_fired_ts
@@ -1283,8 +1285,9 @@ def _migrate_columns_to_timestamp(
)
@database_job_retry_wrapper("Migrate statistics columns to timestamp", 3)
def _migrate_statistics_columns_to_timestamp(
-session_maker: Callable[[], Session], engine: Engine
+instance: Recorder, session_maker: Callable[[], Session], engine: Engine
) -> None:
"""Migrate statistics columns to use timestamp."""
# Migrate all data in statistics.start to statistics.start_ts
@@ -1584,7 +1587,7 @@ def post_migrate_entity_ids(instance: Recorder) -> bool:
if is_done:
# Drop the old indexes since they are no longer needed
-_drop_index(session_maker, "states", "ix_states_entity_id_last_updated_ts")
+_drop_index(session_maker, "states", LEGACY_STATES_ENTITY_ID_LAST_UPDATED_INDEX)
_LOGGER.debug("Cleanup legacy entity_ids done=%s", is_done)
return is_done
@@ -3,23 +3,36 @@ from __future__ import annotations
from contextlib import suppress
from functools import lru_cache
import logging
from uuid import UUID
from homeassistant.util.ulid import bytes_to_ulid, ulid_to_bytes
_LOGGER = logging.getLogger(__name__)
def ulid_to_bytes_or_none(ulid: str | None) -> bytes | None:
"""Convert an ulid to bytes."""
if ulid is None:
return None
-return ulid_to_bytes(ulid)
+try:
+return ulid_to_bytes(ulid)
+except ValueError as ex:
+_LOGGER.error("Error converting ulid %s to bytes: %s", ulid, ex, exc_info=True)
+return None
def bytes_to_ulid_or_none(_bytes: bytes | None) -> str | None:
"""Convert bytes to a ulid."""
if _bytes is None:
return None
-return bytes_to_ulid(_bytes)
+try:
+return bytes_to_ulid(_bytes)
+except ValueError as ex:
+_LOGGER.error(
+"Error converting bytes %s to ulid: %s", _bytes, ex, exc_info=True
+)
+return None
@lru_cache(maxsize=16)
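Both hunks above apply the same pattern: catch `ValueError` from the converter, log it, and return `None` instead of letting the exception propagate. A generic sketch of that pattern (the helper name is ours):

```python
import logging

_LOGGER = logging.getLogger(__name__)


def convert_or_none(value, converter):
    """Sketch of the recorder-util pattern: log and return None on
    malformed input rather than raising ValueError to the caller."""
    if value is None:
        return None
    try:
        return converter(value)
    except ValueError as ex:
        _LOGGER.error("Error converting %s: %s", value, ex, exc_info=True)
        return None
```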
@@ -730,7 +730,8 @@ def batch_cleanup_entity_ids() -> StatementLambdaElement:
lambda: update(States)
.where(
States.state_id.in_(
-select(States.state_id).join(
+select(States.state_id)
+.join(
states_with_entity_ids := select(
States.state_id.label("state_id_with_entity_id")
)
@@ -739,6 +740,8 @@ def batch_cleanup_entity_ids() -> StatementLambdaElement:
.subquery(),
States.state_id == states_with_entity_ids.c.state_id_with_entity_id,
)
+.alias("states_with_entity_ids")
+.select()
)
)
.values(entity_id=None)
@@ -87,7 +87,7 @@ BINARY_SENSORS = (
icon="mdi:bell-ring-outline",
icon_off="mdi:doorbell",
value=lambda api, ch: api.visitor_detected(ch),
-supported=lambda api, ch: api.is_doorbell_enabled(ch),
+supported=lambda api, ch: api.is_doorbell(ch),
),
)
@@ -178,7 +178,7 @@ class ReolinkFlowHandler(config_entries.ConfigFlow, domain=DOMAIN):
data_schema = data_schema.extend(
{
vol.Optional(CONF_PORT): cv.positive_int,
-vol.Optional(CONF_USE_HTTPS): bool,
+vol.Required(CONF_USE_HTTPS, default=False): bool,
}
)
@@ -18,5 +18,5 @@
"documentation": "https://www.home-assistant.io/integrations/reolink",
"iot_class": "local_push",
"loggers": ["reolink_aio"],
-"requirements": ["reolink-aio==0.5.9"]
+"requirements": ["reolink-aio==0.5.10"]
}
@@ -24,5 +24,5 @@
"documentation": "https://www.home-assistant.io/integrations/roomba",
"iot_class": "local_push",
"loggers": ["paho_mqtt", "roombapy"],
-"requirements": ["roombapy==1.6.6"]
+"requirements": ["roombapy==1.6.8"]
}
@@ -13,5 +13,5 @@
"integration_type": "hub",
"iot_class": "cloud_polling",
"loggers": ["simplipy"],
-"requirements": ["simplisafe-python==2022.12.0"]
+"requirements": ["simplisafe-python==2023.04.0"]
}
@@ -7,5 +7,5 @@
"iot_class": "local_polling",
"loggers": ["aiopyarr"],
"quality_scale": "silver",
-"requirements": ["aiopyarr==22.11.0"]
+"requirements": ["aiopyarr==23.4.0"]
}
@@ -64,6 +64,7 @@ def validate_query(db_url: str, query: str, column: str) -> bool:
if sess:
sess.close()
engine.dispose()
+return True
@@ -0,0 +1,16 @@
"""The sql integration models."""
from __future__ import annotations
from dataclasses import dataclass
from sqlalchemy.orm import scoped_session
from homeassistant.core import CALLBACK_TYPE
@dataclass(slots=True)
class SQLData:
"""Data for the sql integration."""
shutdown_event_cancel: CALLBACK_TYPE
session_makers_by_db_url: dict[str, scoped_session]
@@ -13,7 +13,11 @@ from sqlalchemy.orm import Session, scoped_session, sessionmaker
from sqlalchemy.sql.lambdas import StatementLambdaElement
from sqlalchemy.util import LRUCache
-from homeassistant.components.recorder import CONF_DB_URL, get_instance
+from homeassistant.components.recorder import (
+CONF_DB_URL,
+SupportedDialect,
+get_instance,
+)
from homeassistant.components.sensor import (
CONF_STATE_CLASS,
SensorDeviceClass,
@@ -27,9 +31,11 @@ from homeassistant.const import (
CONF_UNIQUE_ID,
CONF_UNIT_OF_MEASUREMENT,
CONF_VALUE_TEMPLATE,
EVENT_HOMEASSISTANT_STOP,
)
-from homeassistant.core import HomeAssistant
+from homeassistant.core import Event, HomeAssistant, callback
from homeassistant.exceptions import TemplateError
from homeassistant.helpers import issue_registry as ir
from homeassistant.helpers.device_registry import DeviceEntryType
from homeassistant.helpers.entity import DeviceInfo
from homeassistant.helpers.entity_platform import AddEntitiesCallback
@@ -37,6 +43,7 @@ from homeassistant.helpers.template import Template
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from .const import CONF_COLUMN_NAME, CONF_QUERY, DB_URL_RE, DOMAIN
from .models import SQLData
from .util import resolve_db_url
_LOGGER = logging.getLogger(__name__)
@@ -126,6 +133,36 @@ async def async_setup_entry(
)
@callback
def _async_get_or_init_domain_data(hass: HomeAssistant) -> SQLData:
"""Get or initialize domain data."""
if DOMAIN in hass.data:
sql_data: SQLData = hass.data[DOMAIN]
return sql_data
session_makers_by_db_url: dict[str, scoped_session] = {}
#
# Ensure we dispose of all engines at shutdown
# to avoid unclean disconnects
#
# Shutdown all sessions in the executor since they will
# do blocking I/O
#
def _shutdown_db_engines(event: Event) -> None:
"""Shutdown all database engines."""
for sessmaker in session_makers_by_db_url.values():
sessmaker.connection().engine.dispose()
cancel_shutdown = hass.bus.async_listen_once(
EVENT_HOMEASSISTANT_STOP, _shutdown_db_engines
)
sql_data = SQLData(cancel_shutdown, session_makers_by_db_url)
hass.data[DOMAIN] = sql_data
return sql_data
async def async_setup_sensor(
hass: HomeAssistant,
name: str,
@@ -143,20 +180,68 @@ async def async_setup_sensor(
"""Set up the SQL sensor."""
instance = get_instance(hass)
sessmaker: scoped_session | None
-if use_database_executor := (db_url == instance.db_url):
+sql_data = _async_get_or_init_domain_data(hass)
+uses_recorder_db = db_url == instance.db_url
+use_database_executor = False
+if uses_recorder_db and instance.dialect_name == SupportedDialect.SQLITE:
+use_database_executor = True
assert instance.engine is not None
sessmaker = scoped_session(sessionmaker(bind=instance.engine, future=True))
-elif not (
-sessmaker := await hass.async_add_executor_job(
-_validate_and_get_session_maker_for_db_url, db_url
-)
+# For other databases we need to create a new engine since
+# we want the connection to use the default timezone and these
+# database engines will use QueuePool as its only sqlite that
+# needs our custom pool. If there is already a session maker
+# for this db_url we can use that so we do not create a new engine
+# for every sensor.
+elif db_url in sql_data.session_makers_by_db_url:
+sessmaker = sql_data.session_makers_by_db_url[db_url]
+elif sessmaker := await hass.async_add_executor_job(
+_validate_and_get_session_maker_for_db_url, db_url
):
+sql_data.session_makers_by_db_url[db_url] = sessmaker
else:
return
upper_query = query_str.upper()
if uses_recorder_db:
redacted_query = redact_credentials(query_str)
issue_key = unique_id if unique_id else redacted_query
# If the query has a unique id and they fix it we can dismiss the issue
# but if it doesn't have a unique id they have to ignore it instead
if (
"ENTITY_ID," in upper_query or "ENTITY_ID " in upper_query
) and "STATES_META" not in upper_query:
_LOGGER.error(
"The query `%s` contains the keyword `entity_id` but does not "
"reference the `states_meta` table. This will cause a full table "
"scan and database instability. Please check the documentation and use "
"`states_meta.entity_id` instead",
redacted_query,
)
ir.async_create_issue(
hass,
DOMAIN,
f"entity_id_query_does_full_table_scan_{issue_key}",
translation_key="entity_id_query_does_full_table_scan",
translation_placeholders={"query": redacted_query},
is_fixable=False,
severity=ir.IssueSeverity.ERROR,
)
raise ValueError(
"Query contains entity_id but does not reference states_meta"
)
ir.async_delete_issue(
hass, DOMAIN, f"entity_id_query_does_full_table_scan_{issue_key}"
)
# MSSQL uses TOP and not LIMIT
-if not ("LIMIT" in query_str.upper() or "SELECT TOP" in query_str.upper()):
+if not ("LIMIT" in upper_query or "SELECT TOP" in upper_query):
if "mssql" in db_url:
-query_str = query_str.upper().replace("SELECT", "SELECT TOP 1")
+query_str = upper_query.replace("SELECT", "SELECT TOP 1")
else:
query_str = query_str.replace(";", "") + " LIMIT 1;"
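The LIMIT/TOP handling above can be sketched as one function (the function name is ours; note the hunk uppercases the entire MSSQL query, a quirk preserved here):

```python
def ensure_single_row(query_str: str, db_url: str) -> str:
    """Sketch of the single-row guard: MSSQL uses SELECT TOP, others LIMIT."""
    upper_query = query_str.upper()
    if "LIMIT" in upper_query or "SELECT TOP" in upper_query:
        return query_str  # already bounded
    if "mssql" in db_url:
        # Mirrors the diff: the whole query is uppercased as a side effect
        return upper_query.replace("SELECT", "SELECT TOP 1")
    return query_str.replace(";", "") + " LIMIT 1;"
```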
@@ -53,5 +53,11 @@
"db_url_invalid": "[%key:component::sql::config::error::db_url_invalid%]",
"query_invalid": "[%key:component::sql::config::error::query_invalid%]"
}
},
"issues": {
"entity_id_query_does_full_table_scan": {
"title": "SQL query does full table scan",
"description": "The query `{query}` contains the keyword `entity_id` but does not reference the `states_meta` table. This will cause a full table scan and database instability. Please check the documentation and use `states_meta.entity_id` instead."
}
}
}
@@ -401,7 +401,7 @@ class Scanner:
self.hass.bus.async_listen_once(EVENT_HOMEASSISTANT_STOP, self.async_stop)
self._cancel_scan = async_track_time_interval(
-self.hass, self.async_scan, SCAN_INTERVAL, "SSDP scanner"
+self.hass, self.async_scan, SCAN_INTERVAL, name="SSDP scanner"
)
# Trigger the initial-scan.
@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/subaru",
"iot_class": "cloud_polling",
"loggers": ["stdiomask", "subarulink"],
-"requirements": ["subarulink==0.7.5"]
+"requirements": ["subarulink==0.7.6"]
}
@@ -3,7 +3,11 @@ from __future__ import annotations
from typing import Any
-from homeassistant.components.cover import CoverEntity, CoverEntityFeature
+from homeassistant.components.cover import (
+DOMAIN as COVER_DOMAIN,
+CoverEntity,
+CoverEntityFeature,
+)
from homeassistant.components.switch import DOMAIN as SWITCH_DOMAIN
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
@@ -36,6 +40,7 @@ async def async_setup_entry(
CoverSwitch(
hass,
config_entry.title,
+COVER_DOMAIN,
entity_id,
config_entry.entry_id,
)
@@ -23,13 +23,15 @@ class BaseEntity(Entity):
"""Represents a Switch as an X."""
_attr_should_poll = False
_is_new_entity: bool
def __init__(
self,
hass: HomeAssistant,
config_entry_title: str,
domain: str,
switch_entity_id: str,
-unique_id: str | None,
+unique_id: str,
) -> None:
"""Initialize Switch as an X."""
registry = er.async_get(hass)
@@ -41,7 +43,7 @@ class BaseEntity(Entity):
name: str | None = config_entry_title
if wrapped_switch:
name = wrapped_switch.name or wrapped_switch.original_name
name = wrapped_switch.original_name
self._device_id = device_id
if device_id and (device := device_registry.async_get(device_id)):
@@ -55,6 +57,10 @@ class BaseEntity(Entity):
self._attr_unique_id = unique_id
self._switch_entity_id = switch_entity_id
self._is_new_entity = (
registry.async_get_entity_id(domain, SWITCH_AS_X_DOMAIN, unique_id) is None
)
@callback
def async_state_changed_listener(self, event: Event | None = None) -> None:
"""Handle child updates."""
@@ -67,7 +73,7 @@ class BaseEntity(Entity):
self._attr_available = True
async def async_added_to_hass(self) -> None:
"""Register callbacks."""
"""Register callbacks and copy the wrapped entity's custom name if set."""
@callback
def _async_state_changed_listener(event: Event | None = None) -> None:
@@ -93,6 +99,15 @@ class BaseEntity(Entity):
{"entity_id": self._switch_entity_id},
)
if not self._is_new_entity:
return
wrapped_switch = registry.async_get(self._switch_entity_id)
if not wrapped_switch or wrapped_switch.name is None:
return
registry.async_update_entity(self.entity_id, name=wrapped_switch.name)
class BaseToggleEntity(BaseEntity, ToggleEntity):
"""Represents a Switch as a ToggleEntity."""
+2 -1
@@ -3,7 +3,7 @@ from __future__ import annotations
from typing import Any
from homeassistant.components.fan import FanEntity
from homeassistant.components.fan import DOMAIN as FAN_DOMAIN, FanEntity
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_ENTITY_ID
from homeassistant.core import HomeAssistant
@@ -29,6 +29,7 @@ async def async_setup_entry(
FanSwitch(
hass,
config_entry.title,
FAN_DOMAIN,
entity_id,
config_entry.entry_id,
)
@@ -1,7 +1,11 @@
"""Light support for switch entities."""
from __future__ import annotations
from homeassistant.components.light import ColorMode, LightEntity
from homeassistant.components.light import (
DOMAIN as LIGHT_DOMAIN,
ColorMode,
LightEntity,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_ENTITY_ID
from homeassistant.core import HomeAssistant
@@ -27,6 +31,7 @@ async def async_setup_entry(
LightSwitch(
hass,
config_entry.title,
LIGHT_DOMAIN,
entity_id,
config_entry.entry_id,
)
+2 -1
@@ -3,7 +3,7 @@ from __future__ import annotations
from typing import Any
from homeassistant.components.lock import LockEntity
from homeassistant.components.lock import DOMAIN as LOCK_DOMAIN, LockEntity
from homeassistant.components.switch import DOMAIN as SWITCH_DOMAIN
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
@@ -36,6 +36,7 @@ async def async_setup_entry(
LockSwitch(
hass,
config_entry.title,
LOCK_DOMAIN,
entity_id,
config_entry.entry_id,
)
@@ -1,7 +1,11 @@
"""Siren support for switch entities."""
from __future__ import annotations
from homeassistant.components.siren import SirenEntity, SirenEntityFeature
from homeassistant.components.siren import (
DOMAIN as SIREN_DOMAIN,
SirenEntity,
SirenEntityFeature,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_ENTITY_ID
from homeassistant.core import HomeAssistant
@@ -27,6 +31,7 @@ async def async_setup_entry(
SirenSwitch(
hass,
config_entry.title,
SIREN_DOMAIN,
entity_id,
config_entry.entry_id,
)
@@ -53,7 +53,10 @@ SERVICE_SEND_KEYPRESS = "send_keypress"
SERVICE_SEND_TEXT = "send_text"
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
) -> bool:
"""Set up System Bridge from a config entry."""
# Check version before initialising
@@ -64,11 +67,12 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
session=async_get_clientsession(hass),
)
try:
if not await version.check_supported():
raise ConfigEntryNotReady(
"You are not running a supported version of System Bridge. Please"
f" update to {SUPPORTED_VERSION} or higher."
)
async with async_timeout.timeout(10):
if not await version.check_supported():
raise ConfigEntryNotReady(
"You are not running a supported version of System Bridge. Please"
f" update to {SUPPORTED_VERSION} or higher."
)
except AuthenticationException as exception:
_LOGGER.error("Authentication failed for %s: %s", entry.title, exception)
raise ConfigEntryAuthFailed from exception
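The hunk above wraps the System Bridge version check in a 10-second `async_timeout.timeout` block so a hung instance cannot stall setup indefinitely. A minimal stdlib sketch of the same bounded-await pattern, using `asyncio.wait_for` in place of the `async_timeout` context manager (the helper names here are stand-ins, not the integration's real API):

```python
import asyncio

async def check_supported() -> bool:
    # Stand-in for the version check over the network.
    await asyncio.sleep(0)
    return True

async def setup() -> bool:
    # Bound the await so a stalled connection fails fast instead of
    # blocking setup forever; raises asyncio.TimeoutError on expiry.
    return await asyncio.wait_for(check_supported(), timeout=10)

supported = asyncio.run(setup())
```

On timeout the caller gets `asyncio.TimeoutError`, which the integration can translate into `ConfigEntryNotReady` so Home Assistant retries setup later.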
@@ -87,7 +91,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
entry=entry,
)
try:
async with async_timeout.timeout(30):
async with async_timeout.timeout(10):
await coordinator.async_get_data(MODULES)
except AuthenticationException as exception:
_LOGGER.error("Authentication failed for %s: %s", entry.title, exception)
@@ -105,8 +109,8 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
try:
# Wait for initial data
async with async_timeout.timeout(30):
while not coordinator.is_ready():
async with async_timeout.timeout(10):
while not coordinator.is_ready:
_LOGGER.debug(
"Waiting for initial data from %s (%s)",
entry.title,
@@ -55,7 +55,7 @@ async def _validate_input(
data[CONF_API_KEY],
)
try:
async with async_timeout.timeout(30):
async with async_timeout.timeout(15):
await websocket_client.connect(session=async_get_clientsession(hass))
hass.async_create_task(websocket_client.listen())
response = await websocket_client.get_data(GetData(modules=["system"]))
@@ -82,6 +82,7 @@ class SystemBridgeDataUpdateCoordinator(
hass, LOGGER, name=DOMAIN, update_interval=timedelta(seconds=30)
)
@property
def is_ready(self) -> bool:
"""Return if the data is ready."""
if self.data is None:
@@ -157,7 +158,7 @@ class SystemBridgeDataUpdateCoordinator(
self.last_update_success = False
self.async_update_listeners()
except (ConnectionClosedException, ConnectionResetError) as exception:
self.logger.info(
self.logger.debug(
"Websocket connection closed for %s. Will retry: %s",
self.title,
exception,
@@ -168,7 +169,7 @@ class SystemBridgeDataUpdateCoordinator(
self.last_update_success = False
self.async_update_listeners()
except ConnectionErrorException as exception:
self.logger.warning(
self.logger.debug(
"Connection error occurred for %s. Will retry: %s",
self.title,
exception,
+1 -1
@@ -7,5 +7,5 @@
"integration_type": "hub",
"iot_class": "cloud_polling",
"loggers": ["pytile"],
"requirements": ["pytile==2022.02.0"]
"requirements": ["pytile==2023.04.0"]
}
@@ -15,6 +15,12 @@
},
{
"st": "urn:schemas-upnp-org:device:InternetGatewayDevice:2"
},
{
"nt": "urn:schemas-upnp-org:device:InternetGatewayDevice:1"
},
{
"nt": "urn:schemas-upnp-org:device:InternetGatewayDevice:2"
}
]
}
@@ -35,7 +35,7 @@ from homeassistant.helpers.event import (
async_track_point_in_time,
async_track_state_change_event,
)
from homeassistant.helpers.start import async_at_start
from homeassistant.helpers.start import async_at_started
from homeassistant.helpers.template import is_number
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from homeassistant.util import slugify
@@ -410,8 +410,11 @@ class UtilityMeterSensor(RestoreSensor):
if (old_state_val := self._validate_state(old_state)) is not None:
return new_state_val - old_state_val
_LOGGER.warning(
"Invalid state (%s > %s)",
"%s received an invalid state change coming from %s (%s > %s)",
self.name,
self._sensor_source_id,
old_state.state if old_state else None,
new_state_val,
)
@@ -420,11 +423,26 @@ class UtilityMeterSensor(RestoreSensor):
@callback
def async_reading(self, event: Event):
"""Handle the sensor state changes."""
if (
source_state := self.hass.states.get(self._sensor_source_id)
) is None or source_state.state == STATE_UNAVAILABLE:
self._attr_available = False
self.async_write_ha_state()
return
self._attr_available = True
old_state: State | None = event.data.get("old_state")
new_state: State = event.data.get("new_state") # type: ignore[assignment] # a state change event always has a new state
# First check if the new_state is valid (see discussion in PR #88446)
if (new_state_val := self._validate_state(new_state)) is None:
_LOGGER.warning("Invalid state %s", new_state.state)
_LOGGER.warning(
"%s received an invalid new state from %s : %s",
self.name,
self._sensor_source_id,
new_state.state,
)
return
if self._state is None:
@@ -597,7 +615,7 @@ class UtilityMeterSensor(RestoreSensor):
self.hass, [self._sensor_source_id], self.async_reading
)
self.async_on_remove(async_at_start(self.hass, async_source_tracking))
self.async_on_remove(async_at_started(self.hass, async_source_tracking))
async def async_will_remove_from_hass(self) -> None:
"""Run when entity will be removed from hass."""
@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/vallox",
"iot_class": "local_polling",
"loggers": ["vallox_websocket_api"],
"requirements": ["vallox-websocket-api==3.0.0"]
"requirements": ["vallox-websocket-api==3.2.1"]
}
+4 -2
@@ -188,9 +188,10 @@ class VerisureDoorlock(CoordinatorEntity[VerisureDataUpdateCoordinator], LockEnt
def disable_autolock(self) -> None:
"""Disable autolock on a doorlock."""
try:
self.coordinator.verisure.set_lock_config(
command = self.coordinator.verisure.set_autolock_enabled(
self.serial_number, auto_lock_enabled=False
)
self.coordinator.verisure.request(command)
LOGGER.debug("Disabling autolock on %s", self.serial_number)
except VerisureError as ex:
LOGGER.error("Could not disable autolock, %s", ex)
@@ -198,9 +199,10 @@ class VerisureDoorlock(CoordinatorEntity[VerisureDataUpdateCoordinator], LockEnt
def enable_autolock(self) -> None:
"""Enable autolock on a doorlock."""
try:
self.coordinator.verisure.set_lock_config(
command = self.coordinator.verisure.set_autolock_enabled(
self.serial_number, auto_lock_enabled=True
)
self.coordinator.verisure.request(command)
LOGGER.debug("Enabling autolock on %s", self.serial_number)
except VerisureError as ex:
LOGGER.error("Could not enable autolock, %s", ex)
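The Verisure hunks above replace a one-shot `set_lock_config` call with a two-step flow: build a command via `set_autolock_enabled`, then execute it with `request`. A hedged sketch of that command/request split, with every class and method name here a hypothetical stand-in for the verisure library:

```python
class SetAutolock:
    """Hypothetical command object describing one autolock change."""
    def __init__(self, serial: str, enabled: bool) -> None:
        self.serial = serial
        self.enabled = enabled

class Client:
    """Hypothetical client: commands are built first, then executed."""
    def __init__(self) -> None:
        self.sent: list[SetAutolock] = []

    def set_autolock_enabled(self, serial: str, auto_lock_enabled: bool) -> SetAutolock:
        # Only builds the command; nothing is sent yet.
        return SetAutolock(serial, auto_lock_enabled)

    def request(self, command: SetAutolock) -> None:
        # Generic executor for any command object.
        self.sent.append(command)

client = Client()
client.request(client.set_autolock_enabled("abc123", auto_lock_enabled=True))
```

Splitting command construction from execution lets one `request` path handle batching, authentication, and retries for every command type.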
@@ -8,5 +8,5 @@
"iot_class": "local_push",
"loggers": ["zeroconf"],
"quality_scale": "internal",
"requirements": ["zeroconf==0.54.0"]
"requirements": ["zeroconf==0.56.0"]
}
@@ -25,7 +25,7 @@ class LightLink(ZigbeeChannel):
application = self._ch_pool.endpoint.device.application
try:
coordinator = application.get_device(application.ieee)
coordinator = application.get_device(application.state.node_info.ieee)
except KeyError:
self.warning("Aborting - unable to locate required coordinator device.")
return
@@ -187,11 +187,16 @@ class SmartThingsAcceleration(ZigbeeChannel):
@callback
def attribute_updated(self, attrid, value):
"""Handle attribute updates on this cluster."""
try:
attr_name = self._cluster.attributes[attrid].name
except KeyError:
attr_name = UNKNOWN
if attrid == self.value_attribute:
self.async_send_signal(
f"{self.unique_id}_{SIGNAL_ATTR_UPDATED}",
attrid,
self._cluster.attributes.get(attrid, [UNKNOWN])[0],
attr_name,
value,
)
return
@@ -200,7 +205,7 @@ class SmartThingsAcceleration(ZigbeeChannel):
SIGNAL_ATTR_UPDATED,
{
ATTR_ATTRIBUTE_ID: attrid,
ATTR_ATTRIBUTE_NAME: self._cluster.attributes.get(attrid, [UNKNOWN])[0],
ATTR_ATTRIBUTE_NAME: attr_name,
ATTR_VALUE: value,
},
)
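The `SmartThingsAcceleration` hunk above resolves the attribute name once with a `try`/`except KeyError` and a fallback, instead of indexing the cluster table at every use site. A self-contained sketch of that lookup pattern, assuming a toy attribute table (the names here are hypothetical, not zigpy's real definitions):

```python
UNKNOWN = "Unknown"

class AttrDef:
    """Hypothetical stand-in for a cluster attribute definition."""
    def __init__(self, name: str) -> None:
        self.name = name

# Hypothetical attribute table keyed by attribute id.
ATTRIBUTES = {0x0012: AttrDef("acceleration")}

def attribute_name(attrid: int) -> str:
    # Resolve the readable name once, falling back for undefined ids.
    try:
        return ATTRIBUTES[attrid].name
    except KeyError:
        return UNKNOWN
```

Known ids resolve to their defined name; any undefined id yields the `UNKNOWN` sentinel rather than raising.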
@@ -246,14 +251,20 @@ class InovelliConfigEntityChannel(ZigbeeChannel):
"active_energy_reports": True,
"power_type": False,
"switch_type": False,
"increased_non_neutral_output": True,
"button_delay": False,
"smart_bulb_mode": False,
"double_tap_up_for_max_brightness": True,
"double_tap_down_for_min_brightness": True,
"double_tap_up_enabled": True,
"double_tap_down_enabled": True,
"double_tap_up_level": True,
"double_tap_down_level": True,
"led_color_when_on": True,
"led_color_when_off": True,
"led_intensity_when_on": True,
"led_intensity_when_off": True,
"led_scaling_mode": True,
"aux_switch_scenes": True,
"binding_off_to_on_sync_level": True,
"local_protection": False,
"output_mode": False,
"on_off_led_mode": True,
@@ -363,7 +363,7 @@ class IASZoneChannel(ZigbeeChannel):
self.debug("started IASZoneChannel configuration")
await self.bind()
ieee = self.cluster.endpoint.device.application.ieee
ieee = self.cluster.endpoint.device.application.state.node_info.ieee
try:
res = await self._cluster.write_attributes({"cie_addr": ieee})
+3 -3
@@ -20,12 +20,12 @@
"zigpy_znp"
],
"requirements": [
"bellows==0.35.0",
"bellows==0.35.1",
"pyserial==3.5",
"pyserial-asyncio==0.6",
"zha-quirks==0.0.95",
"zha-quirks==0.0.97",
"zigpy-deconz==0.20.0",
"zigpy==0.54.0",
"zigpy==0.54.1",
"zigpy-xbee==0.17.0",
"zigpy-zigate==0.10.3",
"zigpy-znp==0.10.0"
+28
@@ -835,6 +835,34 @@ class InovelliDefaultAllLEDOffIntensity(
_attr_name: str = "Default all LED off intensity"
@CONFIG_DIAGNOSTIC_MATCH(channel_names=CHANNEL_INOVELLI)
class InovelliDoubleTapUpLevel(
ZHANumberConfigurationEntity, id_suffix="double_tap_up_level"
):
"""Inovelli double tap up level configuration entity."""
_attr_entity_category = EntityCategory.CONFIG
_attr_icon: str = ICONS[16]
_attr_native_min_value: float = 2
_attr_native_max_value: float = 254
_zcl_attribute: str = "double_tap_up_level"
_attr_name: str = "Double tap up level"
@CONFIG_DIAGNOSTIC_MATCH(channel_names=CHANNEL_INOVELLI)
class InovelliDoubleTapDownLevel(
ZHANumberConfigurationEntity, id_suffix="double_tap_down_level"
):
"""Inovelli double tap down level configuration entity."""
_attr_entity_category = EntityCategory.CONFIG
_attr_icon: str = ICONS[16]
_attr_native_min_value: float = 0
_attr_native_max_value: float = 254
_zcl_attribute: str = "double_tap_down_level"
_attr_name: str = "Double tap down level"
@CONFIG_DIAGNOSTIC_MATCH(channel_names="opple_cluster", models={"aqara.feeder.acn001"})
class AqaraPetFeederServingSize(ZHANumberConfigurationEntity, id_suffix="serving_size"):
"""Aqara pet feeder serving size configuration entity."""
+40 -1
@@ -472,9 +472,10 @@ class InovelliOutputModeEntity(ZCLEnumSelectEntity, id_suffix="output_mode"):
class InovelliSwitchType(types.enum8):
"""Inovelli output mode."""
Load_Only = 0x00
Single_Pole = 0x00
Three_Way_Dumb = 0x01
Three_Way_AUX = 0x02
Single_Pole_Full_Sine = 0x03
@CONFIG_DIAGNOSTIC_MATCH(
@@ -488,6 +489,44 @@ class InovelliSwitchTypeEntity(ZCLEnumSelectEntity, id_suffix="switch_type"):
_attr_name: str = "Switch type"
class InovelliLedScalingMode(types.enum1):
"""Inovelli led mode."""
VZM31SN = 0x00
LZW31SN = 0x01
@CONFIG_DIAGNOSTIC_MATCH(
channel_names=CHANNEL_INOVELLI,
)
class InovelliLedScalingModeEntity(ZCLEnumSelectEntity, id_suffix="led_scaling_mode"):
"""Inovelli led mode control."""
_select_attr = "led_scaling_mode"
_enum = InovelliLedScalingMode
_attr_name: str = "Led scaling mode"
class InovelliNonNeutralOutput(types.enum1):
"""Inovelli non neutral output selection."""
Low = 0x00
High = 0x01
@CONFIG_DIAGNOSTIC_MATCH(
channel_names=CHANNEL_INOVELLI,
)
class InovelliNonNeutralOutputEntity(
ZCLEnumSelectEntity, id_suffix="increased_non_neutral_output"
):
"""Inovelli non neutral output control."""
_select_attr = "increased_non_neutral_output"
_enum = InovelliNonNeutralOutput
_attr_name: str = "Non neutral output"
class AqaraFeedingMode(types.enum8):
"""Feeding mode."""
+34 -10
@@ -367,25 +367,49 @@ class InovelliSmartBulbMode(ZHASwitchConfigurationEntity, id_suffix="smart_bulb_
@CONFIG_DIAGNOSTIC_MATCH(
channel_names=CHANNEL_INOVELLI,
)
class InovelliDoubleTapForFullBrightness(
ZHASwitchConfigurationEntity, id_suffix="double_tap_up_for_max_brightness"
class InovelliDoubleTapUpEnabled(
ZHASwitchConfigurationEntity, id_suffix="double_tap_up_enabled"
):
"""Inovelli double tap for full brightness control."""
"""Inovelli double tap up enabled."""
_zcl_attribute: str = "double_tap_up_for_max_brightness"
_attr_name: str = "Double tap full brightness"
_zcl_attribute: str = "double_tap_up_enabled"
_attr_name: str = "Double tap up enabled"
@CONFIG_DIAGNOSTIC_MATCH(
channel_names=CHANNEL_INOVELLI,
)
class InovelliDoubleTapForMinBrightness(
ZHASwitchConfigurationEntity, id_suffix="double_tap_down_for_min_brightness"
class InovelliDoubleTapDownEnabled(
ZHASwitchConfigurationEntity, id_suffix="double_tap_down_enabled"
):
"""Inovelli double tap down for minimum brightness control."""
"""Inovelli double tap down enabled."""
_zcl_attribute: str = "double_tap_down_for_min_brightness"
_attr_name: str = "Double tap minimum brightness"
_zcl_attribute: str = "double_tap_down_enabled"
_attr_name: str = "Double tap down enabled"
@CONFIG_DIAGNOSTIC_MATCH(
channel_names=CHANNEL_INOVELLI,
)
class InovelliAuxSwitchScenes(
ZHASwitchConfigurationEntity, id_suffix="aux_switch_scenes"
):
"""Inovelli unique aux switch scenes."""
_zcl_attribute: str = "aux_switch_scenes"
_attr_name: str = "Aux switch scenes"
@CONFIG_DIAGNOSTIC_MATCH(
channel_names=CHANNEL_INOVELLI,
)
class InovelliBindingOffToOnSyncLevel(
ZHASwitchConfigurationEntity, id_suffix="binding_off_to_on_sync_level"
):
"""Inovelli send move to level with on/off to bound devices."""
_zcl_attribute: str = "binding_off_to_on_sync_level"
_attr_name: str = "Binding off to on sync level"
@CONFIG_DIAGNOSTIC_MATCH(
+1 -1
@@ -8,7 +8,7 @@ from .backports.enum import StrEnum
APPLICATION_NAME: Final = "HomeAssistant"
MAJOR_VERSION: Final = 2023
MINOR_VERSION: Final = 4
PATCH_VERSION: Final = "0b7"
PATCH_VERSION: Final = "3"
__short_version__: Final = f"{MAJOR_VERSION}.{MINOR_VERSION}"
__version__: Final = f"{__short_version__}.{PATCH_VERSION}"
REQUIRED_PYTHON_VER: Final[tuple[int, int, int]] = (3, 10, 0)
+6
@@ -305,6 +305,12 @@ SSDP = {
{
"st": "urn:schemas-upnp-org:device:InternetGatewayDevice:2",
},
{
"nt": "urn:schemas-upnp-org:device:InternetGatewayDevice:1",
},
{
"nt": "urn:schemas-upnp-org:device:InternetGatewayDevice:2",
},
],
"webostv": [
{
+1 -1
@@ -479,7 +479,7 @@ class EntityPlatform:
self.hass,
self._update_entity_states,
self.scan_interval,
f"EntityPlatform poll {self.domain}.{self.platform_name}",
name=f"EntityPlatform poll {self.domain}.{self.platform_name}",
)
def _entity_id_already_exists(self, entity_id: str) -> tuple[bool, bool]:
+1
@@ -1397,6 +1397,7 @@ def async_track_time_interval(
hass: HomeAssistant,
action: Callable[[datetime], Coroutine[Any, Any, None] | None],
interval: timedelta,
*,
name: str | None = None,
) -> CALLBACK_TYPE:
"""Add a listener that fires repetitively at every timedelta interval."""
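The hunk above inserts a bare `*` before `name`, making it keyword-only; this is why the SSDP scanner, `EntityPlatform`, and `RestoreStateData` callers elsewhere in this diff now pass `name=...`. A minimal sketch of the keyword-only marker with a toy function (not the real helper):

```python
def track_interval(action, interval, *, name=None):
    # The bare `*` makes every parameter after it keyword-only:
    # callers must write name=..., never pass it positionally.
    return f"{name or 'unnamed'} every {interval}s"

label = track_interval(print, 30, name="SSDP scanner")
```

Calling `track_interval(print, 30, "SSDP scanner")` now raises `TypeError`, which is the point: the extra positional argument can no longer be silently bound to `name`.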
+1 -1
@@ -219,7 +219,7 @@ class RestoreStateData:
self.hass,
_async_dump_states,
STATE_DUMP_INTERVAL,
"RestoreStateData dump states",
name="RestoreStateData dump states",
)
async def _async_dump_states_at_stop(*_: Any) -> None:
+9 -4
@@ -1,7 +1,7 @@
PyJWT==2.6.0
PyNaCl==1.5.0
PyTurboJPEG==1.6.7
aiodiscover==1.4.15
aiodiscover==1.4.16
aiohttp==3.8.4
aiohttp_cors==0.7.0
astral==2.2
@@ -25,7 +25,7 @@ ha-av==10.0.0
hass-nabucasa==0.63.1
hassil==1.0.6
home-assistant-bluetooth==1.9.3
home-assistant-frontend==20230403.0
home-assistant-frontend==20230411.0
home-assistant-intents==2023.3.29
httpx==0.23.3
ifaddr==0.1.7
@@ -46,11 +46,11 @@ requests==2.28.2
scapy==2.5.0
sqlalchemy==2.0.7
typing-extensions>=4.5.0,<5.0
ulid-transform==0.5.1
ulid-transform==0.6.3
voluptuous-serialize==2.6.0
voluptuous==0.13.1
yarl==1.8.1
zeroconf==0.54.0
zeroconf==0.56.0
# Constrain pycryptodome to avoid vulnerability
# see https://github.com/home-assistant/core/pull/16238
@@ -157,3 +157,8 @@ uamqp==1.6.0;python_version<'3.11'
# faust-cchardet: Ensure we have a version we can build wheels
# 2.1.18 is the first version that works with our wheel builder
faust-cchardet>=2.1.18
# websockets 11.0 is missing files in the source distribution
# which break wheel builds so we need at least 11.0.1
# https://github.com/aaugustin/websockets/issues/1329
websockets>=11.0.1
+2 -2
@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"
[project]
name = "homeassistant"
version = "2023.4.0b7"
version = "2023.4.3"
license = {text = "Apache-2.0"}
description = "Open-source home automation platform running on Python 3."
readme = "README.rst"
@@ -50,7 +50,7 @@ dependencies = [
"pyyaml==6.0",
"requests==2.28.2",
"typing-extensions>=4.5.0,<5.0",
"ulid-transform==0.5.1",
"ulid-transform==0.6.3",
"voluptuous==0.13.1",
"voluptuous-serialize==2.6.0",
"yarl==1.8.1",
+1 -1
@@ -24,7 +24,7 @@ python-slugify==4.0.1
pyyaml==6.0
requests==2.28.2
typing-extensions>=4.5.0,<5.0
ulid-transform==0.5.1
ulid-transform==0.6.3
voluptuous==0.13.1
voluptuous-serialize==2.6.0
yarl==1.8.1
+20 -20
@@ -119,7 +119,7 @@ aioairq==0.2.4
aioairzone==0.5.2
# homeassistant.components.ambient_station
aioambient==2021.11.0
aioambient==2023.04.0
# homeassistant.components.aseko_pool_live
aioaseko==0.0.2
@@ -137,7 +137,7 @@ aiobafi6==0.8.0
aiobotocore==2.1.0
# homeassistant.components.dhcp
aiodiscover==1.4.15
aiodiscover==1.4.16
# homeassistant.components.dnsip
# homeassistant.components.minecraft_server
@@ -156,7 +156,7 @@ aioecowitt==2023.01.0
aioemonitor==1.0.5
# homeassistant.components.esphome
aioesphomeapi==13.6.0
aioesphomeapi==13.6.1
# homeassistant.components.flo
aioflo==2021.11.0
@@ -246,7 +246,7 @@ aiopvpc==4.1.0
# homeassistant.components.lidarr
# homeassistant.components.radarr
# homeassistant.components.sonarr
aiopyarr==22.11.0
aiopyarr==23.4.0
# homeassistant.components.qnap_qsw
aioqsw==0.3.2
@@ -422,7 +422,7 @@ beautifulsoup4==4.11.1
# beewi_smartclim==0.0.10
# homeassistant.components.zha
bellows==0.35.0
bellows==0.35.1
# homeassistant.components.bmw_connected_drive
bimmer_connected==0.13.0
@@ -661,7 +661,7 @@ enocean==0.50
enturclient==0.2.4
# homeassistant.components.environment_canada
env_canada==0.5.30
env_canada==0.5.32
# homeassistant.components.enphase_envoy
envoy_reader==0.20.1
@@ -725,7 +725,7 @@ fjaraskupan==2.2.0
flipr-api==1.5.0
# homeassistant.components.flux_led
flux_led==0.28.36
flux_led==0.28.37
# homeassistant.components.homekit
# homeassistant.components.recorder
@@ -748,7 +748,7 @@ freesms==0.2.0
# homeassistant.components.fritz
# homeassistant.components.fritzbox_callmonitor
fritzconnection==1.11.0
fritzconnection==1.12.0
# homeassistant.components.google_translate
gTTS==2.2.4
@@ -757,7 +757,7 @@ gTTS==2.2.4
gassist-text==0.0.10
# homeassistant.components.google
gcal-sync==4.1.2
gcal-sync==4.1.4
# homeassistant.components.geniushub
geniushub-client==0.7.0
@@ -907,7 +907,7 @@ hole==0.8.0
holidays==0.21.13
# homeassistant.components.frontend
home-assistant-frontend==20230403.0
home-assistant-frontend==20230411.0
# homeassistant.components.conversation
home-assistant-intents==2023.3.29
@@ -1684,7 +1684,7 @@ pyialarm==2.2.0
pyicloud==1.0.0
# homeassistant.components.insteon
pyinsteon==1.4.0
pyinsteon==1.4.1
# homeassistant.components.intesishome
pyintesishome==1.8.0
@@ -2127,7 +2127,7 @@ python_opendata_transport==0.3.0
pythonegardia==1.0.40
# homeassistant.components.tile
pytile==2022.02.0
pytile==2023.04.0
# homeassistant.components.tomorrowio
pytomorrowio==0.3.5
@@ -2231,7 +2231,7 @@ regenmaschine==2022.11.0
renault-api==0.1.12
# homeassistant.components.reolink
reolink-aio==0.5.9
reolink-aio==0.5.10
# homeassistant.components.python_script
restrictedpython==6.0
@@ -2258,7 +2258,7 @@ rocketchat-API==0.6.1
rokuecp==0.17.1
# homeassistant.components.roomba
roombapy==1.6.6
roombapy==1.6.8
# homeassistant.components.roon
roonapi==0.1.4
@@ -2343,7 +2343,7 @@ simplehound==0.3
simplepush==2.1.1
# homeassistant.components.simplisafe
simplisafe-python==2022.12.0
simplisafe-python==2023.04.0
# homeassistant.components.sisyphus
sisyphus-control==3.1.2
@@ -2428,7 +2428,7 @@ streamlabswater==1.0.1
stringcase==1.2.0
# homeassistant.components.subaru
subarulink==0.7.5
subarulink==0.7.6
# homeassistant.components.solarlog
sunwatcher==0.2.1
@@ -2565,7 +2565,7 @@ url-normalize==1.4.3
uvcclient==0.11.0
# homeassistant.components.vallox
vallox-websocket-api==3.0.0
vallox-websocket-api==3.2.1
# homeassistant.components.rdw
vehicle==1.0.0
@@ -2692,13 +2692,13 @@ zamg==0.2.2
zengge==0.2
# homeassistant.components.zeroconf
zeroconf==0.54.0
zeroconf==0.56.0
# homeassistant.components.zeversolar
zeversolar==0.3.1
# homeassistant.components.zha
zha-quirks==0.0.95
zha-quirks==0.0.97
# homeassistant.components.zhong_hong
zhong_hong_hvac==1.0.9
@@ -2719,7 +2719,7 @@ zigpy-zigate==0.10.3
zigpy-znp==0.10.0
# homeassistant.components.zha
zigpy==0.54.0
zigpy==0.54.1
# homeassistant.components.zoneminder
zm-py==0.5.2
-1
@@ -8,7 +8,6 @@
-c homeassistant/package_constraints.txt
-r requirements_test_pre_commit.txt
astroid==2.15.0
codecov==2.1.12
coverage==7.2.1
freezegun==1.2.2
mock-open==1.4.0
