Compare commits

...

50 Commits

Author SHA1 Message Date
Franck Nijhof
60be2af8ac 2024.4.4 (#116045) 2024-04-23 21:50:08 +02:00
Franck Nijhof
4d551d68c6 Bump version to 2024.4.4 2024-04-23 20:12:21 +02:00
Raj Laud
b521acb724 Use start helper in squeezebox for server discovery (#115978) 2024-04-23 20:12:06 +02:00
Michael
036b6fca25 Fix geo location attributes of Tankerkoenig sensors (#115914)
* geo location attributes needs to be float

* make mypy happy
2024-04-23 20:12:04 +02:00
Allen Porter
c9c7c7803e Bump ical to 8.0.0 (#115907)
Co-authored-by: J. Nick Koston <nick@koston.org>
2024-04-23 20:12:01 +02:00
jjlawren
b8b2f6427a Bump plexapi to 4.15.12 (#115872) 2024-04-23 20:11:58 +02:00
Ståle Storø Hauknes
13ed2d2919 Fix KeyError error when fetching sensors (Airthings) (#115844) 2024-04-23 20:11:55 +02:00
J. Nick Koston
32f82d480f Ensure scripts with timeouts of zero timeout immediately (#115830) 2024-04-23 20:11:52 +02:00
Robert Svensson
6464218e59 Bump aiounifi to v75 (#115819) 2024-04-23 20:11:49 +02:00
Simone Chemelli
4088447303 Add missing media_player features to Samsung TV (#115788)
* add missing features

* fix snapshot
2024-04-23 20:11:46 +02:00
Erik Montnemery
851a5497b4 Allow [##:##:##] type keypad address in homeworks (#115762)
Allow [##:##:##] type keypad address
2024-04-23 20:11:43 +02:00
Erik Montnemery
c4b504ce39 Fix homeworks import flow (#115761) 2024-04-23 20:11:40 +02:00
epenet
db31a526e5 Bump renault-api to 0.2.2 (#115738) 2024-04-23 20:11:36 +02:00
J. Nick Koston
8207fc29d2 Bump aiohttp to 3.9.5 (#115727)
changelog: https://github.com/aio-libs/aiohttp/compare/v3.9.4...v3.9.5
2024-04-23 20:08:08 +02:00
J. Nick Koston
630763ad9e Bump sqlparse to 0.5.0 (#115681)
fixes https://github.com/home-assistant/core/security/dependabot/54
fixes https://github.com/home-assistant/core/security/dependabot/55
2024-04-23 20:07:08 +02:00
J. Nick Koston
09ed0aa399 Bump httpcore to 1.0.5 (#115672)
Fixes missing handling of EndOfStream errors
2024-04-23 20:07:05 +02:00
Brett Adams
66918d1686 Fix sensor entity description in Teslemetry (#115614)
Add description back to sensor entity
2024-04-23 20:07:02 +02:00
jan iversen
3d68ee99a4 Modbus: Bump pymodbus v3.6.8 (#115574) 2024-04-23 20:06:59 +02:00
Brett Adams
37a82c8785 Fix Teslemetry sensor values (#115571) 2024-04-23 20:06:56 +02:00
J. Nick Koston
038040db5e Fix race in TimestampDataUpdateCoordinator (#115542)
* Fix race in TimestampDataUpdateCoordinator

The last_update_success_time value was being set after the listeners
were fired which could lead to a loop because the listener may
re-trigger an update because it thinks the data is stale

* coverage

* docstring
2024-04-23 20:06:53 +02:00
Marc Mueller
b770edc16e Update pillow to 10.3.0 (#115524) 2024-04-23 20:06:50 +02:00
J. Nick Koston
e1a2416076 Bump zeroconf to 0.132.2 (#115505) 2024-04-23 20:06:47 +02:00
J. Nick Koston
6247624514 Bump zeroconf to 0.132.1 (#115501) 2024-04-23 20:06:44 +02:00
slyoldfox
42c13eb57f Add scheduled mode to renault charge mode (#115427)
Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>
2024-04-23 20:06:41 +02:00
avee87
7eb6b2ca33 Fix Hyperion light not updating state (#115389) 2024-04-23 20:03:23 +02:00
Joost Lekkerkerker
5194faa8fd Make Withings recoverable after internet outage (#115124) 2024-04-23 20:03:20 +02:00
Jonny Rimkus
5826f9a4f6 Bump slixmpp version to 1.8.5 (#114448)
* Update slixmpp to 1.8.5, hopefully fixes #113990

* Bump slixmpp version to 1.8.5 #114448
2024-04-23 20:03:15 +02:00
Franck Nijhof
efe91815fb 2024.4.3 (#115463) 2024-04-12 13:20:48 +02:00
Franck Nijhof
62eee52aed Bump version to 2024.4.3 2024-04-12 12:00:16 +02:00
Bram Kragten
7f6514b03c Update frontend to 20240404.2 (#115460) 2024-04-12 11:59:53 +02:00
Santobert
2ed1cfd68d Bump pybotvac to 0.0.25 (#115435)
Bump pybotvac
2024-04-12 11:59:50 +02:00
Allen Porter
a455e142ac Fix bug in rainbird switch when turning off a switch that is already off (#115421)
Fix bug in rainbird switch when turning off a switch that is already off

Co-authored-by: J. Nick Koston <nick@koston.org>
2024-04-12 11:59:45 +02:00
Jessica Smith
5fa06e5a9c Bump whirlpool-sixth-sense to 0.18.8 (#115393)
bump whirlpool to 0.18.8
2024-04-12 11:59:42 +02:00
J. Nick Koston
4aca39b49e Fix deadlock in holidays dynamic loading (#115385) 2024-04-12 11:59:39 +02:00
jan iversen
d055f98736 Solve modbus test problem (#115376)
Fix test.
2024-04-12 11:59:35 +02:00
jan iversen
98bc7c0ed2 Secure against resetting a non active modbus (#115364) 2024-04-12 11:59:32 +02:00
J. Nick Koston
0d62e2e92a Bump bleak-retry-connector 3.5.0 (#115328) 2024-04-12 11:59:28 +02:00
J. Nick Koston
db5343164f Ensure automations do not execute from a trigger if they are disabled (#115305)
* Ensure automations are stopped as soon as the stop future is set

* revert script changes and move them to #115325
2024-04-12 11:59:24 +02:00
TheJulianJES
5c2e9142fa Bump zha-quirks to 0.0.114 (#115299) 2024-04-12 11:59:21 +02:00
Shay Levy
f941e5d5bb Fix Aranet failure when the Bluetooth proxy is not providing a device name (#115298)
Co-authored-by: J. Nick Koston <nick@koston.org>
2024-04-12 11:59:17 +02:00
Joost Lekkerkerker
150145c9b1 Bump yt-dlp to 2024.04.09 (#115295) 2024-04-12 11:59:14 +02:00
jan iversen
db2005d4ec Bump pymodbus v3.6.7 (#115279)
Bump pymodbus v3.6.7.
2024-04-12 11:59:11 +02:00
Stefan Agner
08bd269696 Support backup of add-ons with hyphens (#115274)
Co-authored-by: J. Nick Koston <nick@koston.org>
2024-04-12 11:59:07 +02:00
Klaas Schoute
5723ed28d3 Bump forecast-solar lib to v3.1.0 (#115272) 2024-04-12 11:59:04 +02:00
Allen Porter
14da34cd4d Fix Google Tasks parsing of remove responses (#115258) 2024-04-12 11:59:01 +02:00
J. Nick Koston
f284273ef6 Fix missing timeout in caldav (#115247) 2024-04-12 11:58:57 +02:00
On Freund
fc60426213 Improve Risco exception logging (#115232) 2024-04-12 11:58:54 +02:00
On Freund
922cc81a62 Configurable maximum concurrency in Risco local (#115226)
* Configurable maximum concurrency in Risco local

* Show advanced Risco options in advanced mode
2024-04-12 11:58:50 +02:00
Mike Degatano
4c6fad8dc3 Add support for adopt data disk repair (#114891) 2024-04-12 11:58:46 +02:00
J. Nick Koston
733e2ec57a Bump aiohttp to 3.9.4 (#110730)
* Bump aiohttp to 3.9.4

This is rc0 for now but will be updated when the full release is out

* cleanup cruft

* regen

* fix tests (these changes are fine)

* chunk size is too small to read since boundary is now enforced

* chunk size is too small to read since boundary is now enforced
2024-04-12 11:55:52 +02:00
83 changed files with 1414 additions and 380 deletions

View File

@@ -157,3 +157,11 @@ class AirthingsHeaterEnergySensor(
def native_value(self) -> StateType:
"""Return the value reported by the sensor."""
return self.coordinator.data[self._id].sensors[self.entity_description.key] # type: ignore[no-any-return]
@property
def available(self) -> bool:
"""Check if device and sensor is available in data."""
return (
super().available
and self.entity_description.key in self.coordinator.data[self._id].sensors
)
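
The new available property above pairs with native_value, which indexes straight into the sensor mapping. A tiny standalone illustration of why guarding the key avoids the KeyError this commit fixes, using a plain dict as a stand-in for coordinator.data[...].sensors:

sensors = {"co2": 604}     # stand-in for coordinator.data[serial].sensors
key = "radon"              # a sensor the device did not report this cycle

# native_value indexes directly, so a missing key used to raise KeyError:
try:
    value = sensors[key]
except KeyError:
    value = None

# The new available property guards that lookup instead, so the entity is simply
# marked unavailable rather than raising during the state update:
available = key in sensors
assert available is False and value is None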

View File

@@ -2,10 +2,10 @@
from __future__ import annotations
import logging
from typing import Any
from aranet4.client import Aranet4Advertisement, Version as AranetVersion
from bluetooth_data_tools import human_readable_name
import voluptuous as vol
from homeassistant.components.bluetooth import (
@@ -18,11 +18,15 @@ from homeassistant.data_entry_flow import AbortFlow
from .const import DOMAIN
_LOGGER = logging.getLogger(__name__)
MIN_VERSION = AranetVersion(1, 2, 0)
def _title(discovery_info: BluetoothServiceInfoBleak) -> str:
return discovery_info.device.name or human_readable_name(
None, "Aranet", discovery_info.address
)
class AranetConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle a config flow for Aranet."""
@@ -61,11 +65,8 @@ class AranetConfigFlow(ConfigFlow, domain=DOMAIN):
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Confirm discovery."""
assert self._discovered_device is not None
adv = self._discovered_device
assert self._discovery_info is not None
discovery_info = self._discovery_info
title = adv.readings.name if adv.readings else discovery_info.name
title = _title(self._discovery_info)
if user_input is not None:
return self.async_create_entry(title=title, data={})
@@ -101,10 +102,7 @@ class AranetConfigFlow(ConfigFlow, domain=DOMAIN):
discovery_info.device, discovery_info.advertisement
)
if adv.manufacturer_data:
self._discovered_devices[address] = (
adv.readings.name if adv.readings else discovery_info.name,
adv,
)
self._discovered_devices[address] = (_title(discovery_info), adv)
if not self._discovered_devices:
return self.async_abort(reason="no_devices_found")
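
The diff above centralizes the title logic in a _title() helper that prefers the advertised device name and otherwise falls back to human_readable_name(None, "Aranet", address). A minimal sketch of that fallback; the simplified stand-in for human_readable_name below assumes it appends the last two octets of the address, which matches the new test's expected title "Aranet (EEFF)":

# Simplified model, not the real bluetooth_data_tools implementation.
def human_readable_name(name: str | None, fallback: str, address: str) -> str:
    # Assumption: append the last two octets of the MAC when no name is known.
    short = address.replace(":", "")[-4:].upper()
    return name or f"{fallback} ({short})"

def title(device_name: str | None, address: str) -> str:
    # Mirrors _title(): prefer the advertised device name, otherwise a readable fallback.
    return device_name or human_readable_name(None, "Aranet", address)

assert title(None, "aa:bb:cc:dd:ee:ff") == "Aranet (EEFF)"      # matches the new config-flow test
assert title("Aranet2 12345", "aa:bb:cc:dd:ee:ff") == "Aranet2 12345"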

View File

@@ -19,5 +19,5 @@
"documentation": "https://www.home-assistant.io/integrations/aranet",
"integration_type": "device",
"iot_class": "local_push",
"requirements": ["aranet4==2.2.2"]
"requirements": ["aranet4==2.3.3"]
}

View File

@@ -812,6 +812,22 @@ class AutomationEntity(BaseAutomationEntity, RestoreEntity):
"""Log helper callback."""
self._logger.log(level, "%s %s", msg, self.name, **kwargs)
async def _async_trigger_if_enabled(
self,
run_variables: dict[str, Any],
context: Context | None = None,
skip_condition: bool = False,
) -> ScriptRunResult | None:
"""Trigger automation if enabled.
If the trigger starts but has a delay, the automation will be triggered
when the delay has passed, so we need to make sure it's still enabled before
executing the action.
"""
if not self._is_enabled:
return None
return await self.async_trigger(run_variables, context, skip_condition)
async def _async_attach_triggers(
self, home_assistant_start: bool
) -> Callable[[], None] | None:
@@ -835,7 +851,7 @@ class AutomationEntity(BaseAutomationEntity, RestoreEntity):
return await async_initialize_triggers(
self.hass,
self._trigger_config,
self.async_trigger,
self._async_trigger_if_enabled,
DOMAIN,
str(self.name),
self._log_callback,

View File

@@ -15,7 +15,7 @@
"quality_scale": "internal",
"requirements": [
"bleak==0.21.1",
"bleak-retry-connector==3.4.0",
"bleak-retry-connector==3.5.0",
"bluetooth-adapters==0.18.0",
"bluetooth-auto-recovery==1.4.0",
"bluetooth-data-tools==1.19.0",

View File

@@ -34,6 +34,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
username=entry.data[CONF_USERNAME],
password=entry.data[CONF_PASSWORD],
ssl_verify_cert=entry.data[CONF_VERIFY_SSL],
timeout=10,
)
try:
await hass.async_add_executor_job(client.principal)

View File

@@ -5,5 +5,5 @@
"documentation": "https://www.home-assistant.io/integrations/doods",
"iot_class": "local_polling",
"loggers": ["pydoods"],
"requirements": ["pydoods==1.0.2", "Pillow==10.2.0"]
"requirements": ["pydoods==1.0.2", "Pillow==10.3.0"]
}

View File

@@ -7,5 +7,5 @@
"integration_type": "service",
"iot_class": "cloud_polling",
"quality_scale": "platinum",
"requirements": ["forecast-solar==3.0.0"]
"requirements": ["forecast-solar==3.1.0"]
}

View File

@@ -20,5 +20,5 @@
"documentation": "https://www.home-assistant.io/integrations/frontend",
"integration_type": "system",
"quality_scale": "internal",
"requirements": ["home-assistant-frontend==20240404.1"]
"requirements": ["home-assistant-frontend==20240404.2"]
}

View File

@@ -6,5 +6,5 @@
"dependencies": ["http"],
"documentation": "https://www.home-assistant.io/integrations/generic",
"iot_class": "local_push",
"requirements": ["ha-av==10.1.1", "Pillow==10.2.0"]
"requirements": ["ha-av==10.1.1", "Pillow==10.3.0"]
}

View File

@@ -7,5 +7,5 @@
"documentation": "https://www.home-assistant.io/integrations/calendar.google",
"iot_class": "cloud_polling",
"loggers": ["googleapiclient"],
"requirements": ["gcal-sync==6.0.4", "oauth2client==4.1.3", "ical==7.0.3"]
"requirements": ["gcal-sync==6.0.4", "oauth2client==4.1.3", "ical==8.0.0"]
}

View File

@@ -112,8 +112,9 @@ class AsyncConfigEntryAuth:
raise GoogleTasksApiError(
f"Google Tasks API responded with error ({exception.status_code})"
) from exception
data = json.loads(response)
_raise_if_error(data)
if response:
data = json.loads(response)
_raise_if_error(data)
for task_id in task_ids:
batch.add(
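
The if response: guard above reflects that the remove operations in a batch can come back with an empty body, which json.loads cannot parse; the reworked test fixtures model those delete responses as None for the same reason. A minimal sketch of the failure mode and the guard, with hypothetical response values:

import json

def parse_batch_item(response: str) -> dict | list | None:
    # Old behaviour: always decode -> json.JSONDecodeError on an empty delete response.
    # New behaviour: only decode (and check for errors) when there is a body.
    if response:
        return json.loads(response)
    return None

assert parse_batch_item("") is None                                # empty body from a delete in a batch
assert parse_batch_item('{"kind": "tasks#task"}') == {"kind": "tasks#task"}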

View File

@@ -196,7 +196,7 @@ SCHEMA_BACKUP_PARTIAL = SCHEMA_BACKUP_FULL.extend(
{
vol.Optional(ATTR_HOMEASSISTANT): cv.boolean,
vol.Optional(ATTR_FOLDERS): vol.All(cv.ensure_list, [cv.string]),
vol.Optional(ATTR_ADDONS): vol.All(cv.ensure_list, [cv.slug]),
vol.Optional(ATTR_ADDONS): vol.All(cv.ensure_list, [VALID_ADDON_SLUG]),
}
)
@@ -211,7 +211,7 @@ SCHEMA_RESTORE_PARTIAL = SCHEMA_RESTORE_FULL.extend(
{
vol.Optional(ATTR_HOMEASSISTANT): cv.boolean,
vol.Optional(ATTR_FOLDERS): vol.All(cv.ensure_list, [cv.string]),
vol.Optional(ATTR_ADDONS): vol.All(cv.ensure_list, [cv.slug]),
vol.Optional(ATTR_ADDONS): vol.All(cv.ensure_list, [VALID_ADDON_SLUG]),
}
)

View File

@@ -22,7 +22,7 @@ from .const import (
from .handler import async_apply_suggestion
from .issues import Issue, Suggestion
SUGGESTION_CONFIRMATION_REQUIRED = {"system_execute_reboot"}
SUGGESTION_CONFIRMATION_REQUIRED = {"system_adopt_data_disk", "system_execute_reboot"}
EXTRA_PLACEHOLDERS = {
"issue_mount_mount_failed": {

View File

@@ -51,8 +51,15 @@
"title": "Multiple data disks detected",
"fix_flow": {
"step": {
"system_rename_data_disk": {
"description": "`{reference}` is a filesystem with the name hassos-data and is not the active data disk. This can cause Home Assistant to choose the wrong data disk at system reboot.\n\nUse the fix option to rename the filesystem to prevent this. Alternatively you can move the data disk to the drive (overwriting its contents) or remove the drive from the system."
"fix_menu": {
"description": "`{reference}` is a filesystem with the name hassos-data and is not the active data disk. This can cause Home Assistant to choose the wrong data disk at system reboot.\n\nUse the 'Rename' option to rename the filesystem to prevent this. Use the 'Adopt' option to make that your data disk and rename the existing one. Alternatively you can move the data disk to the drive (overwriting its contents) or remove the drive from the system.",
"menu_options": {
"system_rename_data_disk": "Rename",
"system_adopt_data_disk": "Adopt"
}
},
"system_adopt_data_disk": {
"description": "This fix will initiate a system reboot which will make Home Assistant and all the Add-ons inaccessible for a brief period. After the reboot `{reference}` will be the data disk of Home Assistant and your existing data disk will be renamed and ignored."
}
},
"abort": {

View File

@@ -2,15 +2,36 @@
from __future__ import annotations
from functools import partial
from holidays import country_holidays
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import Platform
from homeassistant.const import CONF_COUNTRY, Platform
from homeassistant.core import HomeAssistant
from homeassistant.setup import SetupPhases, async_pause_setup
from .const import CONF_PROVINCE
PLATFORMS: list[Platform] = [Platform.CALENDAR]
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Set up Holiday from a config entry."""
country: str = entry.data[CONF_COUNTRY]
province: str | None = entry.data.get(CONF_PROVINCE)
# We only import here to ensure that it's not imported later
# in the event loop since the platforms will call country_holidays
# which loads python code from disk.
with async_pause_setup(hass, SetupPhases.WAIT_IMPORT_PACKAGES):
# import executor job is used here because multiple integrations use
# the holidays library and it is not thread safe to import it in parallel
# https://github.com/python/cpython/issues/83065
await hass.async_add_import_executor_job(
partial(country_holidays, country, subdiv=province)
)
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True
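
The comments in this diff carry the reasoning: the holidays package is imported through a dedicated import executor because importing it in parallel threads is not safe (CPython issue 83065), and async_pause_setup keeps the import time out of the setup-phase accounting. A rough standalone sketch of the same pattern, serializing slow imports through a single-worker executor; none of this is Home Assistant code, and json stands in for the holidays package:

import asyncio
import importlib
from concurrent.futures import ThreadPoolExecutor

# One worker means imports triggered by different integrations run one at a time,
# avoiding the deadlocks that concurrent imports of the same package can hit.
_IMPORT_EXECUTOR = ThreadPoolExecutor(max_workers=1, thread_name_prefix="import")

async def import_module_off_loop(name: str):
    """Import a (potentially slow) module without blocking the event loop."""
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(_IMPORT_EXECUTOR, importlib.import_module, name)

async def main() -> None:
    json_mod = await import_module_off_loop("json")  # stand-in for the holidays package
    print(json_mod.dumps({"imported": True}))

asyncio.run(main())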

View File

@@ -93,7 +93,7 @@ BUTTON_EDIT = {
}
validate_addr = cv.matches_regex(r"\[\d\d:\d\d:\d\d:\d\d\]")
validate_addr = cv.matches_regex(r"\[(?:\d\d:)?\d\d:\d\d:\d\d\]")
async def validate_add_controller(
@@ -565,15 +565,7 @@ class HomeworksConfigFlowHandler(ConfigFlow, domain=DOMAIN):
CONF_KEYPADS: [
{
CONF_ADDR: keypad[CONF_ADDR],
CONF_BUTTONS: [
{
CONF_LED: button[CONF_LED],
CONF_NAME: button[CONF_NAME],
CONF_NUMBER: button[CONF_NUMBER],
CONF_RELEASE_DELAY: button[CONF_RELEASE_DELAY],
}
for button in keypad[CONF_BUTTONS]
],
CONF_BUTTONS: [],
CONF_NAME: keypad[CONF_NAME],
}
for keypad in config[CONF_KEYPADS]
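
The relaxed pattern in this file (from "Allow [##:##:##] type keypad address in homeworks") makes the leading octet group optional, so both four-part controller-style and three-part keypad-style addresses validate; the options-flow test is parametrized over "[02:08:03:01]" and "[02:08:03]" for exactly this case. A quick check of the two patterns in isolation:

import re

# Old pattern required four groups; the new one makes the leading group optional.
OLD = re.compile(r"\[\d\d:\d\d:\d\d:\d\d\]")
NEW = re.compile(r"\[(?:\d\d:)?\d\d:\d\d:\d\d\]")

assert NEW.match("[02:08:02:01]")        # still accepted
assert NEW.match("[02:08:03]")           # newly accepted keypad-style address
assert not OLD.match("[02:08:03]")       # rejected by the previous pattern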

View File

@@ -191,13 +191,13 @@ class HyperionVisiblePrioritySensor(HyperionSensor):
if priority[KEY_COMPONENTID] == "COLOR":
state_value = priority[KEY_VALUE][KEY_RGB]
else:
state_value = priority[KEY_OWNER]
state_value = priority.get(KEY_OWNER)
attrs = {
"component_id": priority[KEY_COMPONENTID],
"origin": priority[KEY_ORIGIN],
"priority": priority[KEY_PRIORITY],
"owner": priority[KEY_OWNER],
"owner": priority.get(KEY_OWNER),
}
if priority[KEY_COMPONENTID] == "COLOR":

View File

@@ -7,5 +7,5 @@
"documentation": "https://www.home-assistant.io/integrations/image_upload",
"integration_type": "system",
"quality_scale": "internal",
"requirements": ["Pillow==10.2.0"]
"requirements": ["Pillow==10.3.0"]
}

View File

@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/local_calendar",
"iot_class": "local_polling",
"loggers": ["ical"],
"requirements": ["ical==7.0.3"]
"requirements": ["ical==8.0.0"]
}

View File

@@ -5,5 +5,5 @@
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/local_todo",
"iot_class": "local_polling",
"requirements": ["ical==7.0.3"]
"requirements": ["ical==8.0.0"]
}

View File

@@ -5,5 +5,5 @@
"documentation": "https://www.home-assistant.io/integrations/matrix",
"iot_class": "cloud_push",
"loggers": ["matrix_client"],
"requirements": ["matrix-nio==0.24.0", "Pillow==10.2.0"]
"requirements": ["matrix-nio==0.24.0", "Pillow==10.3.0"]
}

View File

@@ -7,5 +7,5 @@
"iot_class": "calculated",
"loggers": ["yt_dlp"],
"quality_scale": "internal",
"requirements": ["yt-dlp==2024.03.10"]
"requirements": ["yt-dlp==2024.04.09"]
}

View File

@@ -440,6 +440,9 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
async def async_reset_platform(hass: HomeAssistant, integration_name: str) -> None:
"""Release modbus resources."""
if DOMAIN not in hass.data:
_LOGGER.error("Modbus cannot reload, because it was never loaded")
return
_LOGGER.info("Modbus reloading")
hubs = hass.data[DOMAIN]
for name in hubs:

View File

@@ -6,5 +6,5 @@
"iot_class": "local_polling",
"loggers": ["pymodbus"],
"quality_scale": "platinum",
"requirements": ["pymodbus==3.6.6"]
"requirements": ["pymodbus==3.6.8"]
}

View File

@@ -7,5 +7,5 @@
"documentation": "https://www.home-assistant.io/integrations/neato",
"iot_class": "cloud_polling",
"loggers": ["pybotvac"],
"requirements": ["pybotvac==0.0.24"]
"requirements": ["pybotvac==0.0.25"]
}

View File

@@ -8,7 +8,7 @@
"iot_class": "local_push",
"loggers": ["plexapi", "plexwebsocket"],
"requirements": [
"PlexAPI==4.15.11",
"PlexAPI==4.15.12",
"plexauth==0.0.6",
"plexwebsocket==0.0.14"
],

View File

@@ -3,5 +3,5 @@
"name": "Camera Proxy",
"codeowners": [],
"documentation": "https://www.home-assistant.io/integrations/proxy",
"requirements": ["Pillow==10.2.0"]
"requirements": ["Pillow==10.3.0"]
}

View File

@@ -5,5 +5,5 @@
"documentation": "https://www.home-assistant.io/integrations/qrcode",
"iot_class": "calculated",
"loggers": ["pyzbar"],
"requirements": ["Pillow==10.2.0", "pyzbar==0.1.7"]
"requirements": ["Pillow==10.3.0", "pyzbar==0.1.7"]
}

View File

@@ -123,7 +123,8 @@ class RainBirdSwitch(CoordinatorEntity[RainbirdUpdateCoordinator], SwitchEntity)
# The device reflects the old state for a few moments. Update the
# state manually and trigger a refresh after a short debounced delay.
self.coordinator.data.active_zones.remove(self._zone)
if self.is_on:
self.coordinator.data.active_zones.remove(self._zone)
self.async_write_ha_state()
await self.coordinator.async_request_refresh()
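
The added is_on guard matters because removing a zone that is not in active_zones raises, which is precisely the situation when the switch is already off. A tiny illustration using a plain set as a stand-in for the coordinator's active zones (the real container type may differ, but list.remove and set.remove both raise on a missing element):

active_zones = {1, 2}          # stand-in for coordinator.data.active_zones
zone = 3                       # a zone that is already off

# Unguarded removal: raises because the zone is not active.
try:
    active_zones.remove(zone)
except KeyError:
    print("turning off an already-off switch used to blow up here")

# Guarded removal, mirroring the fix: only touch the data when the switch is on.
if zone in active_zones:
    active_zones.remove(zone)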

View File

@@ -8,5 +8,5 @@
"iot_class": "cloud_polling",
"loggers": ["renault_api"],
"quality_scale": "platinum",
"requirements": ["renault-api==0.2.1"]
"requirements": ["renault-api==0.2.2"]
}

View File

@@ -71,6 +71,6 @@ SENSOR_TYPES: tuple[RenaultSelectEntityDescription, ...] = (
coordinator="charge_mode",
data_key="chargeMode",
translation_key="charge_mode",
options=["always", "always_charging", "schedule_mode"],
options=["always", "always_charging", "schedule_mode", "scheduled"],
),
)

View File

@@ -38,7 +38,9 @@ from homeassistant.helpers.storage import Store
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import (
CONF_CONCURRENCY,
DATA_COORDINATOR,
DEFAULT_CONCURRENCY,
DEFAULT_SCAN_INTERVAL,
DOMAIN,
EVENTS_COORDINATOR,
@@ -85,7 +87,10 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def _async_setup_local_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
data = entry.data
risco = RiscoLocal(data[CONF_HOST], data[CONF_PORT], data[CONF_PIN])
concurrency = entry.options.get(CONF_CONCURRENCY, DEFAULT_CONCURRENCY)
risco = RiscoLocal(
data[CONF_HOST], data[CONF_PORT], data[CONF_PIN], concurrency=concurrency
)
try:
await risco.connect()
@@ -96,7 +101,7 @@ async def _async_setup_local_entry(hass: HomeAssistant, entry: ConfigEntry) -> b
return False
async def _error(error: Exception) -> None:
_LOGGER.error("Error in Risco library: %s", error)
_LOGGER.error("Error in Risco library", exc_info=error)
entry.async_on_unload(risco.add_error_handler(_error))
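
The logging change above swaps string interpolation of the exception for exc_info=error, so the logging module records the full traceback rather than only the exception text. A minimal comparison outside Home Assistant:

import logging

logging.basicConfig(level=logging.ERROR)
logger = logging.getLogger("risco.example")

try:
    raise ValueError("panel rejected command")
except ValueError as err:
    logger.error("Error in Risco library: %s", err)       # old style: message only
    logger.error("Error in Risco library", exc_info=err)  # new style: message plus traceback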

View File

@@ -35,8 +35,10 @@ from .const import (
CONF_CODE_ARM_REQUIRED,
CONF_CODE_DISARM_REQUIRED,
CONF_COMMUNICATION_DELAY,
CONF_CONCURRENCY,
CONF_HA_STATES_TO_RISCO,
CONF_RISCO_STATES_TO_HA,
DEFAULT_ADVANCED_OPTIONS,
DEFAULT_OPTIONS,
DOMAIN,
MAX_COMMUNICATION_DELAY,
@@ -225,11 +227,8 @@ class RiscoOptionsFlowHandler(OptionsFlow):
self._data = {**DEFAULT_OPTIONS, **config_entry.options}
def _options_schema(self) -> vol.Schema:
return vol.Schema(
schema = vol.Schema(
{
vol.Required(
CONF_SCAN_INTERVAL, default=self._data[CONF_SCAN_INTERVAL]
): int,
vol.Required(
CONF_CODE_ARM_REQUIRED, default=self._data[CONF_CODE_ARM_REQUIRED]
): bool,
@@ -239,6 +238,19 @@ class RiscoOptionsFlowHandler(OptionsFlow):
): bool,
}
)
if self.show_advanced_options:
self._data = {**DEFAULT_ADVANCED_OPTIONS, **self._data}
schema = schema.extend(
{
vol.Required(
CONF_SCAN_INTERVAL, default=self._data[CONF_SCAN_INTERVAL]
): int,
vol.Required(
CONF_CONCURRENCY, default=self._data[CONF_CONCURRENCY]
): int,
}
)
return schema
async def async_step_init(
self, user_input: dict[str, Any] | None = None

View File

@@ -14,6 +14,7 @@ DATA_COORDINATOR = "risco"
EVENTS_COORDINATOR = "risco_events"
DEFAULT_SCAN_INTERVAL = 30
DEFAULT_CONCURRENCY = 4
TYPE_LOCAL = "local"
@@ -25,6 +26,7 @@ CONF_CODE_DISARM_REQUIRED = "code_disarm_required"
CONF_RISCO_STATES_TO_HA = "risco_states_to_ha"
CONF_HA_STATES_TO_RISCO = "ha_states_to_risco"
CONF_COMMUNICATION_DELAY = "communication_delay"
CONF_CONCURRENCY = "concurrency"
RISCO_GROUPS = ["A", "B", "C", "D"]
RISCO_ARM = "arm"
@@ -44,9 +46,13 @@ DEFAULT_HA_STATES_TO_RISCO = {
}
DEFAULT_OPTIONS = {
CONF_SCAN_INTERVAL: DEFAULT_SCAN_INTERVAL,
CONF_CODE_ARM_REQUIRED: False,
CONF_CODE_DISARM_REQUIRED: False,
CONF_RISCO_STATES_TO_HA: DEFAULT_RISCO_STATES_TO_HA,
CONF_HA_STATES_TO_RISCO: DEFAULT_HA_STATES_TO_RISCO,
}
DEFAULT_ADVANCED_OPTIONS = {
CONF_SCAN_INTERVAL: DEFAULT_SCAN_INTERVAL,
CONF_CONCURRENCY: DEFAULT_CONCURRENCY,
}

View File

@@ -36,7 +36,8 @@
"init": {
"title": "Configure options",
"data": {
"scan_interval": "How often to poll Risco (in seconds)",
"scan_interval": "How often to poll Risco Cloud (in seconds)",
"concurrency": "Maximum concurrent requests in Risco local",
"code_arm_required": "Require PIN to arm",
"code_disarm_required": "Require PIN to disarm"
}

View File

@@ -46,15 +46,17 @@ from .triggers.turn_on import async_get_turn_on_trigger
SOURCES = {"TV": "KEY_TV", "HDMI": "KEY_HDMI"}
SUPPORT_SAMSUNGTV = (
MediaPlayerEntityFeature.PAUSE
| MediaPlayerEntityFeature.VOLUME_STEP
| MediaPlayerEntityFeature.VOLUME_MUTE
| MediaPlayerEntityFeature.PREVIOUS_TRACK
| MediaPlayerEntityFeature.SELECT_SOURCE
| MediaPlayerEntityFeature.NEXT_TRACK
| MediaPlayerEntityFeature.TURN_OFF
MediaPlayerEntityFeature.NEXT_TRACK
| MediaPlayerEntityFeature.PAUSE
| MediaPlayerEntityFeature.PLAY
| MediaPlayerEntityFeature.PLAY_MEDIA
| MediaPlayerEntityFeature.PREVIOUS_TRACK
| MediaPlayerEntityFeature.SELECT_SOURCE
| MediaPlayerEntityFeature.STOP
| MediaPlayerEntityFeature.TURN_OFF
| MediaPlayerEntityFeature.VOLUME_MUTE
| MediaPlayerEntityFeature.VOLUME_SET
| MediaPlayerEntityFeature.VOLUME_STEP
)

View File

@@ -4,5 +4,5 @@
"codeowners": ["@fabaff"],
"documentation": "https://www.home-assistant.io/integrations/seven_segments",
"iot_class": "local_polling",
"requirements": ["Pillow==10.2.0"]
"requirements": ["Pillow==10.3.0"]
}

View File

@@ -5,5 +5,5 @@
"documentation": "https://www.home-assistant.io/integrations/sighthound",
"iot_class": "cloud_polling",
"loggers": ["simplehound"],
"requirements": ["Pillow==10.2.0", "simplehound==0.3"]
"requirements": ["Pillow==10.3.0", "simplehound==0.3"]
}

View File

@@ -5,5 +5,5 @@
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/sql",
"iot_class": "local_polling",
"requirements": ["SQLAlchemy==2.0.29", "sqlparse==0.4.4"]
"requirements": ["SQLAlchemy==2.0.29", "sqlparse==0.5.0"]
}

View File

@@ -28,7 +28,6 @@ from homeassistant.const import (
CONF_PASSWORD,
CONF_PORT,
CONF_USERNAME,
EVENT_HOMEASSISTANT_START,
)
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import (
@@ -44,6 +43,7 @@ from homeassistant.helpers.dispatcher import (
)
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.event import async_call_later
from homeassistant.helpers.start import async_at_start
from homeassistant.util.dt import utcnow
from .browse_media import (
@@ -207,12 +207,7 @@ async def async_setup_entry(
platform.async_register_entity_service(SERVICE_UNSYNC, None, "async_unsync")
# Start server discovery task if not already running
if hass.is_running:
hass.async_create_task(start_server_discovery(hass))
else:
hass.bus.async_listen_once(
EVENT_HOMEASSISTANT_START, start_server_discovery(hass)
)
config_entry.async_on_unload(async_at_start(hass, start_server_discovery))
class SqueezeBoxEntity(MediaPlayerEntity):
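
The squeezebox change replaces the manual is_running / EVENT_HOMEASSISTANT_START branch with the async_at_start helper, which runs the callback immediately when Home Assistant is already running and otherwise defers it to startup, with the unsubscribe handed to async_on_unload. A simplified, non-Home-Assistant model of that behaviour (FakeHass and at_start are made-up names for illustration):

class FakeHass:
    def __init__(self, is_running: bool) -> None:
        self.is_running = is_running
        self._start_callbacks = []

    def listen_start(self, cb) -> None:
        self._start_callbacks.append(cb)

    def fire_start(self) -> None:
        for cb in self._start_callbacks:
            cb(self)

def at_start(hass: FakeHass, cb) -> None:
    """Mimic async_at_start: immediate call when running, else run at startup."""
    if hass.is_running:
        cb(hass)
    else:
        hass.listen_start(cb)

started = []
hass = FakeHass(is_running=False)
at_start(hass, lambda h: started.append("discovery"))
assert not started                  # deferred: system not running yet
hass.fire_start()
assert started == ["discovery"]     # discovery kicked off at startup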

View File

@@ -91,7 +91,7 @@ class FuelPriceSensor(TankerkoenigCoordinatorEntity, SensorEntity):
self._fuel_type = fuel_type
self._attr_translation_key = fuel_type
self._attr_unique_id = f"{station.id}_{fuel_type}"
attrs = {
attrs: dict[str, int | str | float | None] = {
ATTR_BRAND: station.brand,
ATTR_FUEL_TYPE: fuel_type,
ATTR_STATION_NAME: station.name,
@@ -102,8 +102,8 @@ class FuelPriceSensor(TankerkoenigCoordinatorEntity, SensorEntity):
}
if coordinator.show_on_map:
attrs[ATTR_LATITUDE] = str(station.lat)
attrs[ATTR_LONGITUDE] = str(station.lng)
attrs[ATTR_LATITUDE] = station.lat
attrs[ATTR_LONGITUDE] = station.lng
self._attr_extra_state_attributes = attrs
@property

View File

@@ -10,6 +10,6 @@
"tf-models-official==2.5.0",
"pycocotools==2.0.6",
"numpy==1.26.0",
"Pillow==10.2.0"
"Pillow==10.3.0"
]
}

View File

@@ -58,7 +58,7 @@ SHIFT_STATES = {"P": "p", "D": "d", "R": "r", "N": "n"}
class TeslemetrySensorEntityDescription(SensorEntityDescription):
"""Describes Teslemetry Sensor entity."""
value_fn: Callable[[StateType], StateType | datetime] = lambda x: x
value_fn: Callable[[StateType], StateType] = lambda x: x
VEHICLE_DESCRIPTIONS: tuple[TeslemetrySensorEntityDescription, ...] = (
@@ -447,8 +447,14 @@ class TeslemetryVehicleSensorEntity(TeslemetryVehicleEntity, SensorEntity):
description: TeslemetrySensorEntityDescription,
) -> None:
"""Initialize the sensor."""
self.entity_description = description
super().__init__(vehicle, description.key)
@property
def native_value(self) -> StateType:
"""Return the state of the sensor."""
return self.entity_description.value_fn(self._value)
class TeslemetryVehicleTimeSensorEntity(TeslemetryVehicleEntity, SensorEntity):
"""Base class for Teslemetry vehicle metric sensors."""

View File

@@ -8,7 +8,7 @@
"iot_class": "local_push",
"loggers": ["aiounifi"],
"quality_scale": "platinum",
"requirements": ["aiounifi==74"],
"requirements": ["aiounifi==75"],
"ssdp": [
{
"manufacturer": "Ubiquiti Networks",

View File

@@ -7,5 +7,5 @@
"integration_type": "hub",
"iot_class": "cloud_push",
"loggers": ["whirlpool"],
"requirements": ["whirlpool-sixth-sense==0.18.7"]
"requirements": ["whirlpool-sixth-sense==0.18.8"]
}

View File

@@ -12,6 +12,7 @@ from dataclasses import dataclass, field
from datetime import timedelta
from typing import TYPE_CHECKING, Any, cast
from aiohttp import ClientError
from aiohttp.hdrs import METH_POST
from aiohttp.web import Request, Response
from aiowithings import NotificationCategory, WithingsClient
@@ -340,7 +341,11 @@ class WithingsWebhookManager:
async def async_unsubscribe_webhooks(client: WithingsClient) -> None:
"""Unsubscribe to all Withings webhooks."""
current_webhooks = await client.list_notification_configurations()
try:
current_webhooks = await client.list_notification_configurations()
except ClientError:
LOGGER.exception("Error when unsubscribing webhooks")
return
for webhook_configuration in current_webhooks:
LOGGER.debug(

View File

@@ -11,6 +11,7 @@ from homeassistant.const import CONF_COUNTRY, CONF_LANGUAGE
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryError
from homeassistant.helpers.issue_registry import IssueSeverity, async_create_issue
from homeassistant.setup import SetupPhases, async_pause_setup
from .const import CONF_PROVINCE, DOMAIN, PLATFORMS
@@ -23,7 +24,11 @@ async def _async_validate_country_and_province(
if not country:
return
try:
await hass.async_add_executor_job(country_holidays, country)
with async_pause_setup(hass, SetupPhases.WAIT_IMPORT_PACKAGES):
# import executor job is used here because multiple integrations use
# the holidays library and it is not thread safe to import it in parallel
# https://github.com/python/cpython/issues/83065
await hass.async_add_import_executor_job(country_holidays, country)
except NotImplementedError as ex:
async_create_issue(
hass,
@@ -41,9 +46,13 @@ async def _async_validate_country_and_province(
if not province:
return
try:
await hass.async_add_executor_job(
partial(country_holidays, country, subdiv=province)
)
with async_pause_setup(hass, SetupPhases.WAIT_IMPORT_PACKAGES):
# import executor job is used here because multiple integrations use
# the holidays library and it is not thread safe to import it in parallel
# https://github.com/python/cpython/issues/83065
await hass.async_add_import_executor_job(
partial(country_holidays, country, subdiv=province)
)
except NotImplementedError as ex:
async_create_issue(
hass,
@@ -73,9 +82,13 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
await _async_validate_country_and_province(hass, entry, country, province)
if country and CONF_LANGUAGE not in entry.options:
cls: HolidayBase = await hass.async_add_executor_job(
partial(country_holidays, country, subdiv=province)
)
with async_pause_setup(hass, SetupPhases.WAIT_IMPORT_PACKAGES):
# import executor job is used here because multiple integrations use
# the holidays library and it is not thread safe to import it in parallel
# https://github.com/python/cpython/issues/83065
cls: HolidayBase = await hass.async_add_import_executor_job(
partial(country_holidays, country, subdiv=province)
)
default_language = cls.default_language
new_options = entry.options.copy()
new_options[CONF_LANGUAGE] = default_language

View File

@@ -5,5 +5,5 @@
"documentation": "https://www.home-assistant.io/integrations/xmpp",
"iot_class": "cloud_push",
"loggers": ["pyasn1", "slixmpp"],
"requirements": ["slixmpp==1.8.4", "emoji==2.8.0"]
"requirements": ["slixmpp==1.8.5", "emoji==2.8.0"]
}

View File

@@ -8,5 +8,5 @@
"iot_class": "local_push",
"loggers": ["zeroconf"],
"quality_scale": "internal",
"requirements": ["zeroconf==0.132.0"]
"requirements": ["zeroconf==0.132.2"]
}

View File

@@ -24,7 +24,7 @@
"bellows==0.38.1",
"pyserial==3.5",
"pyserial-asyncio==0.6",
"zha-quirks==0.0.113",
"zha-quirks==0.0.114",
"zigpy-deconz==0.23.1",
"zigpy==0.63.5",
"zigpy-xbee==0.20.1",

View File

@@ -18,7 +18,7 @@ from .util.signal_type import SignalType
APPLICATION_NAME: Final = "HomeAssistant"
MAJOR_VERSION: Final = 2024
MINOR_VERSION: Final = 4
PATCH_VERSION: Final = "2"
PATCH_VERSION: Final = "4"
__short_version__: Final = f"{MAJOR_VERSION}.{MINOR_VERSION}"
__version__: Final = f"{__short_version__}.{PATCH_VERSION}"
REQUIRED_PYTHON_VER: Final[tuple[int, int, int]] = (3, 12, 0)

View File

@@ -656,6 +656,12 @@ class _ScriptRun:
# check if condition already okay
if condition.async_template(self._hass, wait_template, self._variables, False):
self._variables["wait"]["completed"] = True
self._changed()
return
if timeout == 0:
self._changed()
self._async_handle_timeout()
return
futures, timeout_handle, timeout_future = self._async_futures_with_timeout(
@@ -1085,6 +1091,11 @@ class _ScriptRun:
self._variables["wait"] = {"remaining": timeout, "trigger": None}
trace_set_result(wait=self._variables["wait"])
if timeout == 0:
self._changed()
self._async_handle_timeout()
return
futures, timeout_handle, timeout_future = self._async_futures_with_timeout(
timeout
)
@@ -1115,6 +1126,14 @@ class _ScriptRun:
futures, timeout_handle, timeout_future, remove_triggers
)
def _async_handle_timeout(self) -> None:
"""Handle timeout."""
self._variables["wait"]["remaining"] = 0.0
if not self._action.get(CONF_CONTINUE_ON_TIMEOUT, True):
self._log(_TIMEOUT_MSG)
trace_set_result(wait=self._variables["wait"], timeout=True)
raise _AbortScript from TimeoutError()
async def _async_wait_with_optional_timeout(
self,
futures: list[asyncio.Future[None]],
@@ -1125,11 +1144,7 @@ class _ScriptRun:
try:
await asyncio.wait(futures, return_when=asyncio.FIRST_COMPLETED)
if timeout_future and timeout_future.done():
self._variables["wait"]["remaining"] = 0.0
if not self._action.get(CONF_CONTINUE_ON_TIMEOUT, True):
self._log(_TIMEOUT_MSG)
trace_set_result(wait=self._variables["wait"], timeout=True)
raise _AbortScript from TimeoutError()
self._async_handle_timeout()
finally:
if timeout_future and not timeout_future.done() and timeout_handle:
timeout_handle.cancel()
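
The new timeout == 0 branches short-circuit the wait instead of handing a zero delay to the scheduler, so a wait step configured with a zero timeout times out immediately (still honoring continue_on_timeout via the shared _async_handle_timeout helper). A small asyncio sketch of the same idea under those assumptions; this is not the Home Assistant implementation:

import asyncio

async def wait_for_event(event: asyncio.Event, timeout: float) -> bool:
    """Return True if the condition is met, False on timeout.

    Treat a timeout of exactly 0 as "time out right away" instead of handing
    0 to the scheduler, mirroring the intent of the script.py change above.
    """
    if timeout == 0:
        return event.is_set()            # immediate check, no futures created
    try:
        await asyncio.wait_for(event.wait(), timeout)
    except asyncio.TimeoutError:
        return False
    return True

async def main() -> None:
    ev = asyncio.Event()
    print(await wait_for_event(ev, 0))                   # False: times out immediately
    ev.set()
    print(await wait_for_event(ev, 0))                   # True: condition already satisfied
    print(await wait_for_event(asyncio.Event(), 0.01))   # False after ~10 ms

asyncio.run(main())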

View File

@@ -403,6 +403,8 @@ class DataUpdateCoordinator(BaseDataUpdateCoordinatorProtocol, Generic[_DataT]):
if not auth_failed and self._listeners and not self.hass.is_stopping:
self._schedule_refresh()
self._async_refresh_finished()
if not self.last_update_success and not previous_update_success:
return
@@ -413,6 +415,15 @@ class DataUpdateCoordinator(BaseDataUpdateCoordinatorProtocol, Generic[_DataT]):
):
self.async_update_listeners()
@callback
def _async_refresh_finished(self) -> None:
"""Handle when a refresh has finished.
Called when refresh is finished before listeners are updated.
To be overridden by subclasses.
"""
@callback
def async_set_update_error(self, err: Exception) -> None:
"""Manually set an error, log the message and notify listeners."""
@@ -446,20 +457,9 @@ class TimestampDataUpdateCoordinator(DataUpdateCoordinator[_DataT]):
last_update_success_time: datetime | None = None
async def _async_refresh(
self,
log_failures: bool = True,
raise_on_auth_failed: bool = False,
scheduled: bool = False,
raise_on_entry_error: bool = False,
) -> None:
"""Refresh data."""
await super()._async_refresh(
log_failures,
raise_on_auth_failed,
scheduled,
raise_on_entry_error,
)
@callback
def _async_refresh_finished(self) -> None:
"""Handle when a refresh has finished."""
if self.last_update_success:
self.last_update_success_time = utcnow()
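
The commit message earlier in this compare explains the race: last_update_success_time was written after listeners fired, so a listener that interprets an old timestamp as stale data could immediately request another refresh and loop. The _async_refresh_finished hook runs before async_update_listeners, letting TimestampDataUpdateCoordinator record the timestamp first. A stripped-down model of that ordering, with made-up names and no Home Assistant imports:

from datetime import datetime, timedelta, timezone

class TinyCoordinator:
    """Minimal model: refresh data, then notify listeners."""

    def __init__(self) -> None:
        self.last_update_success_time: datetime | None = None
        self._listeners = []

    def add_listener(self, cb) -> None:
        self._listeners.append(cb)

    def _refresh_finished(self) -> None:
        # Hook runs *before* listeners, mirroring _async_refresh_finished().
        self.last_update_success_time = datetime.now(timezone.utc)

    def refresh(self) -> None:
        # ... fetch data here ...
        self._refresh_finished()          # record success timestamp first
        for cb in self._listeners:        # then notify
            cb()

coordinator = TinyCoordinator()

def listener() -> None:
    # A listener that re-requests a refresh whenever the data looks stale would
    # loop forever if the timestamp were still None/old at this point.
    age = datetime.now(timezone.utc) - coordinator.last_update_success_time
    assert age < timedelta(seconds=1), "data must already be marked fresh"

coordinator.add_listener(listener)
coordinator.refresh()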

View File

@@ -4,7 +4,7 @@ aiodhcpwatcher==1.0.0
aiodiscover==2.0.0
aiohttp-fast-url-dispatcher==0.3.0
aiohttp-zlib-ng==0.3.1
aiohttp==3.9.3
aiohttp==3.9.5
aiohttp_cors==0.7.0
astral==2.2
async-interrupt==1.1.1
@@ -13,7 +13,7 @@ atomicwrites-homeassistant==1.4.1
attrs==23.2.0
awesomeversion==24.2.0
bcrypt==4.1.2
bleak-retry-connector==3.4.0
bleak-retry-connector==3.5.0
bleak==0.21.1
bluetooth-adapters==0.18.0
bluetooth-auto-recovery==1.4.0
@@ -30,7 +30,7 @@ habluetooth==2.4.2
hass-nabucasa==0.78.0
hassil==1.6.1
home-assistant-bluetooth==1.12.0
home-assistant-frontend==20240404.1
home-assistant-frontend==20240404.2
home-assistant-intents==2024.4.3
httpx==0.27.0
ifaddr==0.2.0
@@ -40,7 +40,7 @@ mutagen==1.47.0
orjson==3.9.15
packaging>=23.1
paho-mqtt==1.6.1
Pillow==10.2.0
Pillow==10.3.0
pip>=21.3.1
psutil-home-assistant==0.0.1
PyJWT==2.8.0
@@ -60,7 +60,7 @@ voluptuous-serialize==2.6.0
voluptuous==0.13.1
webrtc-noise-gain==1.2.3
yarl==1.9.4
zeroconf==0.132.0
zeroconf==0.132.2
# Constrain pycryptodome to avoid vulnerability
# see https://github.com/home-assistant/core/pull/16238
@@ -107,7 +107,7 @@ regex==2021.8.28
# requirements so we can directly link HA versions to these library versions.
anyio==4.3.0
h11==0.14.0
httpcore==1.0.4
httpcore==1.0.5
# Ensure we have a hyperframe version that works in Python 3.10
# 5.2.0 fixed a collections abc deprecation

View File

@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"
[project]
name = "homeassistant"
version = "2024.4.2"
version = "2024.4.4"
license = {text = "Apache-2.0"}
description = "Open-source home automation platform running on Python 3."
readme = "README.rst"
@@ -23,7 +23,7 @@ classifiers = [
]
requires-python = ">=3.12.0"
dependencies = [
"aiohttp==3.9.3",
"aiohttp==3.9.5",
"aiohttp_cors==0.7.0",
"aiohttp-fast-url-dispatcher==0.3.0",
"aiohttp-zlib-ng==0.3.1",
@@ -49,7 +49,7 @@ dependencies = [
"PyJWT==2.8.0",
# PyJWT has loose dependency. We want the latest one.
"cryptography==42.0.5",
"Pillow==10.2.0",
"Pillow==10.3.0",
"pyOpenSSL==24.1.0",
"orjson==3.9.15",
"packaging>=23.1",

View File

@@ -3,7 +3,7 @@
-c homeassistant/package_constraints.txt
# Home Assistant Core
aiohttp==3.9.3
aiohttp==3.9.5
aiohttp_cors==0.7.0
aiohttp-fast-url-dispatcher==0.3.0
aiohttp-zlib-ng==0.3.1
@@ -24,7 +24,7 @@ Jinja2==3.1.3
lru-dict==1.3.0
PyJWT==2.8.0
cryptography==42.0.5
Pillow==10.2.0
Pillow==10.3.0
pyOpenSSL==24.1.0
orjson==3.9.15
packaging>=23.1

View File

@@ -42,10 +42,10 @@ Mastodon.py==1.8.1
# homeassistant.components.seven_segments
# homeassistant.components.sighthound
# homeassistant.components.tensorflow
Pillow==10.2.0
Pillow==10.3.0
# homeassistant.components.plex
PlexAPI==4.15.11
PlexAPI==4.15.12
# homeassistant.components.progettihwsw
ProgettiHWSW==0.1.3
@@ -392,7 +392,7 @@ aiotankerkoenig==0.4.1
aiotractive==0.5.6
# homeassistant.components.unifi
aiounifi==74
aiounifi==75
# homeassistant.components.vlc_telnet
aiovlc==0.1.0
@@ -467,7 +467,7 @@ aprslib==0.7.0
aqualogic==2.6
# homeassistant.components.aranet
aranet4==2.2.2
aranet4==2.3.3
# homeassistant.components.arcam_fmj
arcam-fmj==1.4.0
@@ -556,7 +556,7 @@ bizkaibus==0.1.1
bleak-esphome==1.0.0
# homeassistant.components.bluetooth
bleak-retry-connector==3.4.0
bleak-retry-connector==3.5.0
# homeassistant.components.bluetooth
bleak==0.21.1
@@ -883,7 +883,7 @@ fnv-hash-fast==0.5.0
foobot_async==1.0.0
# homeassistant.components.forecast_solar
forecast-solar==3.0.0
forecast-solar==3.1.0
# homeassistant.components.fortios
fortiosapi==1.0.5
@@ -1077,7 +1077,7 @@ hole==0.8.0
holidays==0.46
# homeassistant.components.frontend
home-assistant-frontend==20240404.1
home-assistant-frontend==20240404.2
# homeassistant.components.conversation
home-assistant-intents==2024.4.3
@@ -1118,7 +1118,7 @@ ibmiotf==0.3.4
# homeassistant.components.google
# homeassistant.components.local_calendar
# homeassistant.components.local_todo
ical==7.0.3
ical==8.0.0
# homeassistant.components.ping
icmplib==3.0
@@ -1718,7 +1718,7 @@ pybbox==0.0.5-alpha
pyblackbird==0.6
# homeassistant.components.neato
pybotvac==0.0.24
pybotvac==0.0.25
# homeassistant.components.braviatv
pybravia==0.3.3
@@ -1973,7 +1973,7 @@ pymitv==1.4.3
pymochad==0.2.0
# homeassistant.components.modbus
pymodbus==3.6.6
pymodbus==3.6.8
# homeassistant.components.monoprice
pymonoprice==0.4
@@ -2429,7 +2429,7 @@ refoss-ha==1.2.0
regenmaschine==2024.03.0
# homeassistant.components.renault
renault-api==0.2.1
renault-api==0.2.2
# homeassistant.components.renson
renson-endura-delta==1.7.1
@@ -2553,7 +2553,7 @@ sisyphus-control==3.1.3
slackclient==2.5.0
# homeassistant.components.xmpp
slixmpp==1.8.4
slixmpp==1.8.5
# homeassistant.components.smart_meter_texas
smart-meter-texas==0.4.7
@@ -2595,7 +2595,7 @@ spiderpy==1.6.1
spotipy==2.23.0
# homeassistant.components.sql
sqlparse==0.4.4
sqlparse==0.5.0
# homeassistant.components.srp_energy
srpenergy==1.3.6
@@ -2850,7 +2850,7 @@ webmin-xmlrpc==0.0.2
webrtc-noise-gain==1.2.3
# homeassistant.components.whirlpool
whirlpool-sixth-sense==0.18.7
whirlpool-sixth-sense==0.18.8
# homeassistant.components.whois
whois==0.9.27
@@ -2919,7 +2919,7 @@ youless-api==1.0.1
youtubeaio==1.1.5
# homeassistant.components.media_extractor
yt-dlp==2024.03.10
yt-dlp==2024.04.09
# homeassistant.components.zamg
zamg==0.3.6
@@ -2928,13 +2928,13 @@ zamg==0.3.6
zengge==0.2
# homeassistant.components.zeroconf
zeroconf==0.132.0
zeroconf==0.132.2
# homeassistant.components.zeversolar
zeversolar==0.3.1
# homeassistant.components.zha
zha-quirks==0.0.113
zha-quirks==0.0.114
# homeassistant.components.zhong_hong
zhong-hong-hvac==1.0.12

View File

@@ -36,10 +36,10 @@ HATasmota==0.8.0
# homeassistant.components.seven_segments
# homeassistant.components.sighthound
# homeassistant.components.tensorflow
Pillow==10.2.0
Pillow==10.3.0
# homeassistant.components.plex
PlexAPI==4.15.11
PlexAPI==4.15.12
# homeassistant.components.progettihwsw
ProgettiHWSW==0.1.3
@@ -365,7 +365,7 @@ aiotankerkoenig==0.4.1
aiotractive==0.5.6
# homeassistant.components.unifi
aiounifi==74
aiounifi==75
# homeassistant.components.vlc_telnet
aiovlc==0.1.0
@@ -428,7 +428,7 @@ apprise==1.7.4
aprslib==0.7.0
# homeassistant.components.aranet
aranet4==2.2.2
aranet4==2.3.3
# homeassistant.components.arcam_fmj
arcam-fmj==1.4.0
@@ -478,7 +478,7 @@ bimmer-connected[china]==0.14.6
bleak-esphome==1.0.0
# homeassistant.components.bluetooth
bleak-retry-connector==3.4.0
bleak-retry-connector==3.5.0
# homeassistant.components.bluetooth
bleak==0.21.1
@@ -721,7 +721,7 @@ fnv-hash-fast==0.5.0
foobot_async==1.0.0
# homeassistant.components.forecast_solar
forecast-solar==3.0.0
forecast-solar==3.1.0
# homeassistant.components.freebox
freebox-api==1.1.0
@@ -876,7 +876,7 @@ hole==0.8.0
holidays==0.46
# homeassistant.components.frontend
home-assistant-frontend==20240404.1
home-assistant-frontend==20240404.2
# homeassistant.components.conversation
home-assistant-intents==2024.4.3
@@ -908,7 +908,7 @@ ibeacon-ble==1.2.0
# homeassistant.components.google
# homeassistant.components.local_calendar
# homeassistant.components.local_todo
ical==7.0.3
ical==8.0.0
# homeassistant.components.ping
icmplib==3.0
@@ -1350,7 +1350,7 @@ pybalboa==1.0.1
pyblackbird==0.6
# homeassistant.components.neato
pybotvac==0.0.24
pybotvac==0.0.25
# homeassistant.components.braviatv
pybravia==0.3.3
@@ -1533,7 +1533,7 @@ pymeteoclimatic==0.1.0
pymochad==0.2.0
# homeassistant.components.modbus
pymodbus==3.6.6
pymodbus==3.6.8
# homeassistant.components.monoprice
pymonoprice==0.4
@@ -1875,7 +1875,7 @@ refoss-ha==1.2.0
regenmaschine==2024.03.0
# homeassistant.components.renault
renault-api==0.2.1
renault-api==0.2.2
# homeassistant.components.renson
renson-endura-delta==1.7.1
@@ -1999,7 +1999,7 @@ spiderpy==1.6.1
spotipy==2.23.0
# homeassistant.components.sql
sqlparse==0.4.4
sqlparse==0.5.0
# homeassistant.components.srp_energy
srpenergy==1.3.6
@@ -2197,7 +2197,7 @@ webmin-xmlrpc==0.0.2
webrtc-noise-gain==1.2.3
# homeassistant.components.whirlpool
whirlpool-sixth-sense==0.18.7
whirlpool-sixth-sense==0.18.8
# homeassistant.components.whois
whois==0.9.27
@@ -2257,19 +2257,19 @@ youless-api==1.0.1
youtubeaio==1.1.5
# homeassistant.components.media_extractor
yt-dlp==2024.03.10
yt-dlp==2024.04.09
# homeassistant.components.zamg
zamg==0.3.6
# homeassistant.components.zeroconf
zeroconf==0.132.0
zeroconf==0.132.2
# homeassistant.components.zeversolar
zeversolar==0.3.1
# homeassistant.components.zha
zha-quirks==0.0.113
zha-quirks==0.0.114
# homeassistant.components.zha
zigpy-deconz==0.23.1

View File

@@ -100,7 +100,7 @@ regex==2021.8.28
# requirements so we can directly link HA versions to these library versions.
anyio==4.3.0
h11==0.14.0
httpcore==1.0.4
httpcore==1.0.5
# Ensure we have a hyperframe version that works in Python 3.10
# 5.2.0 fixed a collections abc deprecation

View File

@@ -58,6 +58,14 @@ VALID_DATA_SERVICE_INFO = fake_service_info(
},
)
VALID_DATA_SERVICE_INFO_WITH_NO_NAME = fake_service_info(
None,
"0000fce0-0000-1000-8000-00805f9b34fb",
{
1794: b'\x21\x00\x02\x01\x00\x00\x00\x01\x8a\x02\xa5\x01\xb1&"Y\x01,\x01\xe8\x00\x88'
},
)
VALID_ARANET2_DATA_SERVICE_INFO = fake_service_info(
"Aranet2 12345",
"0000fce0-0000-1000-8000-00805f9b34fb",

View File

@@ -12,6 +12,7 @@ from . import (
NOT_ARANET4_SERVICE_INFO,
OLD_FIRMWARE_SERVICE_INFO,
VALID_DATA_SERVICE_INFO,
VALID_DATA_SERVICE_INFO_WITH_NO_NAME,
)
from tests.common import MockConfigEntry
@@ -36,6 +37,25 @@ async def test_async_step_bluetooth_valid_device(hass: HomeAssistant) -> None:
assert result2["result"].unique_id == "aa:bb:cc:dd:ee:ff"
async def test_async_step_bluetooth_device_without_name(hass: HomeAssistant) -> None:
"""Test discovery via bluetooth with a valid device that has no name."""
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": config_entries.SOURCE_BLUETOOTH},
data=VALID_DATA_SERVICE_INFO_WITH_NO_NAME,
)
assert result["type"] is FlowResultType.FORM
assert result["step_id"] == "bluetooth_confirm"
with patch("homeassistant.components.aranet.async_setup_entry", return_value=True):
result2 = await hass.config_entries.flow.async_configure(
result["flow_id"], user_input={}
)
assert result2["type"] is FlowResultType.CREATE_ENTRY
assert result2["title"] == "Aranet (EEFF)"
assert result2["data"] == {}
assert result2["result"].unique_id == "aa:bb:cc:dd:ee:ff"
async def test_async_step_bluetooth_not_aranet4(hass: HomeAssistant) -> None:
"""Test that we reject discovery via Bluetooth for an unrelated device."""
result = await hass.config_entries.flow.async_init(

View File

@@ -2650,3 +2650,83 @@ def test_deprecated_constants(
import_and_test_deprecated_constant(
caplog, automation, constant_name, replacement.__name__, replacement, "2025.1"
)
async def test_automation_turns_off_other_automation(
hass: HomeAssistant, caplog: pytest.LogCaptureFixture
) -> None:
"""Test an automation that turns off another automation."""
hass.set_state(CoreState.not_running)
calls = async_mock_service(hass, "persistent_notification", "create")
hass.states.async_set("binary_sensor.presence", "on")
await hass.async_block_till_done()
assert await async_setup_component(
hass,
automation.DOMAIN,
{
automation.DOMAIN: [
{
"trigger": {
"platform": "state",
"entity_id": "binary_sensor.presence",
"from": "on",
},
"action": {
"service": "automation.turn_off",
"target": {
"entity_id": "automation.automation_1",
},
"data": {
"stop_actions": True,
},
},
"id": "automation_0",
"mode": "single",
},
{
"trigger": {
"platform": "state",
"entity_id": "binary_sensor.presence",
"from": "on",
"for": {
"hours": 0,
"minutes": 0,
"seconds": 5,
},
},
"action": {
"service": "persistent_notification.create",
"metadata": {},
"data": {
"message": "Test race",
},
},
"id": "automation_1",
"mode": "single",
},
]
},
)
await hass.async_start()
await hass.async_block_till_done()
hass.states.async_set("binary_sensor.presence", "off")
await hass.async_block_till_done()
assert len(calls) == 0
async_fire_time_changed(hass, dt_util.utcnow() + timedelta(seconds=5))
await hass.async_block_till_done()
assert len(calls) == 0
await hass.services.async_call(
"automation",
"turn_on",
{"entity_id": "automation.automation_1"},
blocking=True,
)
hass.states.async_set("binary_sensor.presence", "off")
await hass.async_block_till_done()
assert len(calls) == 0
async_fire_time_changed(hass, dt_util.utcnow() + timedelta(seconds=5))
await hass.async_block_till_done()
assert len(calls) == 0

View File

@@ -90,9 +90,9 @@ async def test_upload_large_file(
file_upload.TEMP_DIR_NAME + f"-{getrandbits(10):03x}",
),
patch(
# Patch one megabyte to 8 bytes to prevent having to use big files in tests
# Patch one megabyte to 50 bytes to prevent having to use big files in tests
"homeassistant.components.file_upload.ONE_MEGABYTE",
8,
50,
),
):
res = await client.post("/api/file_upload", data={"file": large_file_io})
@@ -152,9 +152,9 @@ async def test_upload_large_file_fails(
file_upload.TEMP_DIR_NAME + f"-{getrandbits(10):03x}",
),
patch(
# Patch one megabyte to 8 bytes to prevent having to use big files in tests
# Patch one megabyte to 50 bytes to prevent having to use big files in tests
"homeassistant.components.file_upload.ONE_MEGABYTE",
8,
50,
),
patch(
"homeassistant.components.file_upload.Path.open", return_value=_mock_open()

View File

@@ -156,7 +156,7 @@ def create_response_object(api_response: dict | list) -> tuple[Response, bytes]:
def create_batch_response_object(
content_ids: list[str], api_responses: list[dict | list | Response]
content_ids: list[str], api_responses: list[dict | list | Response | None]
) -> tuple[Response, bytes]:
"""Create a batch response in the multipart/mixed format."""
assert len(api_responses) == len(content_ids)
@@ -166,7 +166,7 @@ def create_batch_response_object(
body = ""
if isinstance(api_response, Response):
status = api_response.status
else:
elif api_response is not None:
body = json.dumps(api_response)
content.extend(
[
@@ -194,7 +194,7 @@ def create_batch_response_object(
def create_batch_response_handler(
api_responses: list[dict | list | Response],
api_responses: list[dict | list | Response | None],
) -> Callable[[Any], tuple[Response, bytes]]:
"""Create a fake http2lib response handler that supports generating batch responses.
@@ -598,11 +598,11 @@ async def test_partial_update_status(
[
LIST_TASK_LIST_RESPONSE,
LIST_TASKS_RESPONSE_MULTIPLE,
[EMPTY_RESPONSE, EMPTY_RESPONSE, EMPTY_RESPONSE], # Delete batch
[None, None, None], # Delete batch empty responses
LIST_TASKS_RESPONSE, # refresh after delete
]
)
)
),
],
)
async def test_delete_todo_list_item(

View File

@@ -674,3 +674,116 @@ async def test_supervisor_issue_docker_config_repair_flow(
str(aioclient_mock.mock_calls[-1][1])
== "http://127.0.0.1/resolution/suggestion/1235"
)
async def test_supervisor_issue_repair_flow_multiple_data_disks(
hass: HomeAssistant,
aioclient_mock: AiohttpClientMocker,
hass_client: ClientSessionGenerator,
issue_registry: ir.IssueRegistry,
all_setup_requests,
) -> None:
"""Test fix flow for multiple data disks supervisor issue."""
mock_resolution_info(
aioclient_mock,
issues=[
{
"uuid": "1234",
"type": "multiple_data_disks",
"context": "system",
"reference": "/dev/sda1",
"suggestions": [
{
"uuid": "1235",
"type": "rename_data_disk",
"context": "system",
"reference": "/dev/sda1",
},
{
"uuid": "1236",
"type": "adopt_data_disk",
"context": "system",
"reference": "/dev/sda1",
},
],
},
],
)
assert await async_setup_component(hass, "hassio", {})
repair_issue = issue_registry.async_get_issue(domain="hassio", issue_id="1234")
assert repair_issue
client = await hass_client()
resp = await client.post(
"/api/repairs/issues/fix",
json={"handler": "hassio", "issue_id": repair_issue.issue_id},
)
assert resp.status == HTTPStatus.OK
data = await resp.json()
flow_id = data["flow_id"]
assert data == {
"type": "menu",
"flow_id": flow_id,
"handler": "hassio",
"step_id": "fix_menu",
"data_schema": [
{
"type": "select",
"options": [
["system_rename_data_disk", "system_rename_data_disk"],
["system_adopt_data_disk", "system_adopt_data_disk"],
],
"name": "next_step_id",
}
],
"menu_options": ["system_rename_data_disk", "system_adopt_data_disk"],
"description_placeholders": {"reference": "/dev/sda1"},
}
resp = await client.post(
f"/api/repairs/issues/fix/{flow_id}",
json={"next_step_id": "system_adopt_data_disk"},
)
assert resp.status == HTTPStatus.OK
data = await resp.json()
flow_id = data["flow_id"]
assert data == {
"type": "form",
"flow_id": flow_id,
"handler": "hassio",
"step_id": "system_adopt_data_disk",
"data_schema": [],
"errors": None,
"description_placeholders": {"reference": "/dev/sda1"},
"last_step": True,
"preview": None,
}
resp = await client.post(f"/api/repairs/issues/fix/{flow_id}")
assert resp.status == HTTPStatus.OK
data = await resp.json()
flow_id = data["flow_id"]
assert data == {
"type": "create_entry",
"flow_id": flow_id,
"handler": "hassio",
"description": None,
"description_placeholders": None,
}
assert not issue_registry.async_get_issue(domain="hassio", issue_id="1234")
assert aioclient_mock.mock_calls[-1][0] == "post"
assert (
str(aioclient_mock.mock_calls[-1][1])
== "http://127.0.0.1/resolution/suggestion/1236"
)

View File

@@ -9,7 +9,6 @@ from homeassistant.components.binary_sensor import DOMAIN as BINARY_SENSOR_DOMAI
from homeassistant.components.button import DOMAIN as BUTTON_DOMAIN
from homeassistant.components.homeworks.const import (
CONF_ADDR,
CONF_BUTTONS,
CONF_DIMMERS,
CONF_INDEX,
CONF_KEYPADS,
@@ -161,26 +160,6 @@ async def test_import_flow(
{
CONF_ADDR: "[02:08:02:01]",
CONF_NAME: "Foyer Keypad",
CONF_BUTTONS: [
{
CONF_NAME: "Morning",
CONF_NUMBER: 1,
CONF_LED: True,
CONF_RELEASE_DELAY: None,
},
{
CONF_NAME: "Relax",
CONF_NUMBER: 2,
CONF_LED: True,
CONF_RELEASE_DELAY: None,
},
{
CONF_NAME: "Dim up",
CONF_NUMBER: 3,
CONF_LED: False,
CONF_RELEASE_DELAY: 0.2,
},
],
}
],
},
@@ -207,16 +186,7 @@ async def test_import_flow(
"keypads": [
{
"addr": "[02:08:02:01]",
"buttons": [
{
"led": True,
"name": "Morning",
"number": 1,
"release_delay": None,
},
{"led": True, "name": "Relax", "number": 2, "release_delay": None},
{"led": False, "name": "Dim up", "number": 3, "release_delay": 0.2},
],
"buttons": [],
"name": "Foyer Keypad",
}
],
@@ -574,8 +544,12 @@ async def test_options_add_remove_light_flow(
)
@pytest.mark.parametrize("keypad_address", ["[02:08:03:01]", "[02:08:03]"])
async def test_options_add_remove_keypad_flow(
hass: HomeAssistant, mock_config_entry: MockConfigEntry, mock_homeworks: MagicMock
hass: HomeAssistant,
mock_config_entry: MockConfigEntry,
mock_homeworks: MagicMock,
keypad_address: str,
) -> None:
"""Test options flow to add and remove a keypad."""
mock_config_entry.add_to_hass(hass)
@@ -596,7 +570,7 @@ async def test_options_add_remove_keypad_flow(
result = await hass.config_entries.options.async_configure(
result["flow_id"],
user_input={
CONF_ADDR: "[02:08:03:01]",
CONF_ADDR: keypad_address,
CONF_NAME: "Hall Keypad",
},
)
@@ -622,7 +596,7 @@ async def test_options_add_remove_keypad_flow(
],
"name": "Foyer Keypad",
},
{"addr": "[02:08:03:01]", "buttons": [], "name": "Hall Keypad"},
{"addr": keypad_address, "buttons": [], "name": "Hall Keypad"},
],
"port": 1234,
}
@@ -642,7 +616,7 @@ async def test_options_add_remove_keypad_flow(
assert result["step_id"] == "remove_keypad"
assert result["data_schema"].schema["index"].options == {
"0": "Foyer Keypad ([02:08:02:01])",
"1": "Hall Keypad ([02:08:03:01])",
"1": f"Hall Keypad ({keypad_address})",
}
result = await hass.config_entries.options.async_configure(
@@ -655,7 +629,7 @@ async def test_options_add_remove_keypad_flow(
{"addr": "[02:08:01:01]", "name": "Foyer Sconces", "rate": 1.0},
],
"host": "192.168.0.1",
"keypads": [{"addr": "[02:08:03:01]", "buttons": [], "name": "Hall Keypad"}],
"keypads": [{"addr": keypad_address, "buttons": [], "name": "Hall Keypad"}],
"port": 1234,
}
await hass.async_block_till_done()

View File

@@ -159,7 +159,6 @@ async def test_visible_effect_state_changes(hass: HomeAssistant) -> None:
KEY_ACTIVE: True,
KEY_COMPONENTID: "COLOR",
KEY_ORIGIN: "System",
KEY_OWNER: "System",
KEY_PRIORITY: 250,
KEY_VALUE: {KEY_RGB: [0, 0, 0]},
KEY_VISIBLE: True,

View File

@@ -52,6 +52,15 @@ def mock_pymodbus_fixture():
"""Mock pymodbus."""
mock_pb = mock.AsyncMock()
mock_pb.close = mock.MagicMock()
read_result = ReadResult([])
mock_pb.read_coils.return_value = read_result
mock_pb.read_discrete_inputs.return_value = read_result
mock_pb.read_input_registers.return_value = read_result
mock_pb.read_holding_registers.return_value = read_result
mock_pb.write_register.return_value = read_result
mock_pb.write_registers.return_value = read_result
mock_pb.write_coil.return_value = read_result
mock_pb.write_coils.return_value = read_result
with (
mock.patch(
"homeassistant.components.modbus.modbus.AsyncModbusTcpClient",
@@ -156,7 +165,7 @@ async def mock_pymodbus_exception_fixture(hass, do_exception, mock_modbus):
@pytest.fixture(name="mock_pymodbus_return")
async def mock_pymodbus_return_fixture(hass, register_words, mock_modbus):
"""Trigger update call with time_changed event."""
read_result = ReadResult(register_words) if register_words else None
read_result = ReadResult(register_words if register_words else [])
mock_modbus.read_coils.return_value = read_result
mock_modbus.read_discrete_inputs.return_value = read_result
mock_modbus.read_input_registers.return_value = read_result
@@ -165,6 +174,7 @@ async def mock_pymodbus_return_fixture(hass, register_words, mock_modbus):
mock_modbus.write_registers.return_value = read_result
mock_modbus.write_coil.return_value = read_result
mock_modbus.write_coils.return_value = read_result
return mock_modbus
@pytest.fixture(name="mock_do_cycle")
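
Note: the fixture now seeds every read/write mock with a shared ReadResult([]) and replaces the old None fallback for empty register_words, so consumers always receive an object with the expected attributes. A minimal sketch of the difference — this ReadResult is a stand-in, not the actual test helper:

class ReadResult:
    """Stand-in for the test helper: wraps a register/bit payload."""

    def __init__(self, register_words):
        self.registers = register_words
        self.bits = register_words

def handle(result):
    """Mimic code that consumes a read result."""
    if result is None:
        # The old fallback: every consumer has to special-case None first.
        return None
    return list(result.registers)

assert handle(ReadResult([])) == []
assert handle(None) is None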


@@ -3,3 +3,7 @@ modbus:
host: "testHost"
port: 5001
name: "testModbus"
sensors:
- name: "dummy"
address: 117
slave: 0


@@ -25,6 +25,7 @@ import voluptuous as vol
from homeassistant import config as hass_config
from homeassistant.components.binary_sensor import DOMAIN as BINARY_SENSOR_DOMAIN
from homeassistant.components.modbus import async_reset_platform
from homeassistant.components.modbus.const import (
ATTR_ADDRESS,
ATTR_HUB,
@@ -1560,7 +1561,7 @@ async def test_shutdown(
],
)
async def test_stop_restart(
hass: HomeAssistant, caplog: pytest.LogCaptureFixture, mock_modbus
hass: HomeAssistant, caplog: pytest.LogCaptureFixture, mock_pymodbus_return
) -> None:
"""Run test for service stop."""
@@ -1571,7 +1572,7 @@ async def test_stop_restart(
await hass.async_block_till_done()
assert hass.states.get(entity_id).state == "17"
mock_modbus.reset_mock()
mock_pymodbus_return.reset_mock()
caplog.clear()
data = {
ATTR_HUB: TEST_MODBUS_NAME,
@@ -1579,23 +1580,23 @@ async def test_stop_restart(
await hass.services.async_call(DOMAIN, SERVICE_STOP, data, blocking=True)
await hass.async_block_till_done()
assert hass.states.get(entity_id).state == STATE_UNAVAILABLE
assert mock_modbus.close.called
assert mock_pymodbus_return.close.called
assert f"modbus {TEST_MODBUS_NAME} communication closed" in caplog.text
mock_modbus.reset_mock()
mock_pymodbus_return.reset_mock()
caplog.clear()
await hass.services.async_call(DOMAIN, SERVICE_RESTART, data, blocking=True)
await hass.async_block_till_done()
assert not mock_modbus.close.called
assert mock_modbus.connect.called
assert not mock_pymodbus_return.close.called
assert mock_pymodbus_return.connect.called
assert f"modbus {TEST_MODBUS_NAME} communication open" in caplog.text
mock_modbus.reset_mock()
mock_pymodbus_return.reset_mock()
caplog.clear()
await hass.services.async_call(DOMAIN, SERVICE_RESTART, data, blocking=True)
await hass.async_block_till_done()
assert mock_modbus.close.called
assert mock_modbus.connect.called
assert mock_pymodbus_return.close.called
assert mock_pymodbus_return.connect.called
assert f"modbus {TEST_MODBUS_NAME} communication closed" in caplog.text
assert f"modbus {TEST_MODBUS_NAME} communication open" in caplog.text
@@ -1625,7 +1626,7 @@ async def test_write_no_client(hass: HomeAssistant, mock_modbus) -> None:
async def test_integration_reload(
hass: HomeAssistant,
caplog: pytest.LogCaptureFixture,
mock_modbus,
mock_pymodbus_return,
freezer: FrozenDateTimeFactory,
) -> None:
"""Run test for integration reload."""
@@ -1646,7 +1647,7 @@ async def test_integration_reload(
@pytest.mark.parametrize("do_config", [{}])
async def test_integration_reload_failed(
hass: HomeAssistant, caplog: pytest.LogCaptureFixture, mock_modbus
hass: HomeAssistant, caplog: pytest.LogCaptureFixture, mock_pymodbus_return
) -> None:
"""Run test for integration connect failure on reload."""
caplog.set_level(logging.INFO)
@@ -1655,7 +1656,9 @@ async def test_integration_reload_failed(
yaml_path = get_fixture_path("configuration.yaml", "modbus")
with (
mock.patch.object(hass_config, "YAML_CONFIG_FILE", yaml_path),
mock.patch.object(mock_modbus, "connect", side_effect=ModbusException("error")),
mock.patch.object(
mock_pymodbus_return, "connect", side_effect=ModbusException("error")
),
):
await hass.services.async_call(DOMAIN, SERVICE_RELOAD, blocking=True)
await hass.async_block_till_done()
@@ -1666,7 +1669,7 @@ async def test_integration_reload_failed(
@pytest.mark.parametrize("do_config", [{}])
async def test_integration_setup_failed(
hass: HomeAssistant, caplog: pytest.LogCaptureFixture, mock_modbus
hass: HomeAssistant, caplog: pytest.LogCaptureFixture, mock_pymodbus_return
) -> None:
"""Run test for integration setup on reload."""
with mock.patch.object(
@@ -1694,3 +1697,9 @@ async def test_no_entities(hass: HomeAssistant) -> None:
]
}
assert await async_setup_component(hass, DOMAIN, config) is False
async def test_reset_platform(hass: HomeAssistant) -> None:
"""Run test for async_reset_platform."""
await async_reset_platform(hass, "modbus")
assert DOMAIN not in hass.data


@@ -146,20 +146,24 @@ async def test_switch_on(
@pytest.mark.parametrize(
"zone_state_response",
[ZONE_3_ON_RESPONSE],
("zone_state_response", "start_state"),
[
(ZONE_3_ON_RESPONSE, "on"),
(ZONE_OFF_RESPONSE, "off"), # Already off
],
)
async def test_switch_off(
hass: HomeAssistant,
aioclient_mock: AiohttpClientMocker,
responses: list[AiohttpClientMockResponse],
start_state: str,
) -> None:
"""Test turning off irrigation switch."""
# Initially the test zone is on
zone = hass.states.get("switch.rain_bird_sprinkler_3")
assert zone is not None
assert zone.state == "on"
assert zone.state == start_state
aioclient_mock.mock_calls.clear()
responses.extend(


@@ -127,7 +127,12 @@ MOCK_VEHICLES = {
{
ATTR_ENTITY_ID: "select.reg_number_charge_mode",
ATTR_ICON: "mdi:calendar-remove",
ATTR_OPTIONS: ["always", "always_charging", "schedule_mode"],
ATTR_OPTIONS: [
"always",
"always_charging",
"schedule_mode",
"scheduled",
],
ATTR_STATE: "always",
ATTR_UNIQUE_ID: "vf1aaaaa555777999_charge_mode",
},
@@ -363,7 +368,12 @@ MOCK_VEHICLES = {
{
ATTR_ENTITY_ID: "select.reg_number_charge_mode",
ATTR_ICON: "mdi:calendar-clock",
ATTR_OPTIONS: ["always", "always_charging", "schedule_mode"],
ATTR_OPTIONS: [
"always",
"always_charging",
"schedule_mode",
"scheduled",
],
ATTR_STATE: "schedule_mode",
ATTR_UNIQUE_ID: "vf1aaaaa555777999_charge_mode",
},
@@ -599,7 +609,12 @@ MOCK_VEHICLES = {
{
ATTR_ENTITY_ID: "select.reg_number_charge_mode",
ATTR_ICON: "mdi:calendar-remove",
ATTR_OPTIONS: ["always", "always_charging", "schedule_mode"],
ATTR_OPTIONS: [
"always",
"always_charging",
"schedule_mode",
"scheduled",
],
ATTR_STATE: "always",
ATTR_UNIQUE_ID: "vf1aaaaa555777123_charge_mode",
},


@@ -82,6 +82,7 @@
'always',
'always_charging',
'schedule_mode',
'scheduled',
]),
}),
'config_entry_id': <ANY>,
@@ -121,6 +122,7 @@
'always',
'always_charging',
'schedule_mode',
'scheduled',
]),
}),
'context': <ANY>,
@@ -175,6 +177,7 @@
'always',
'always_charging',
'schedule_mode',
'scheduled',
]),
}),
'config_entry_id': <ANY>,
@@ -214,6 +217,7 @@
'always',
'always_charging',
'schedule_mode',
'scheduled',
]),
}),
'context': <ANY>,
@@ -268,6 +272,7 @@
'always',
'always_charging',
'schedule_mode',
'scheduled',
]),
}),
'config_entry_id': <ANY>,
@@ -307,6 +312,7 @@
'always',
'always_charging',
'schedule_mode',
'scheduled',
]),
}),
'context': <ANY>,
@@ -401,6 +407,7 @@
'always',
'always_charging',
'schedule_mode',
'scheduled',
]),
}),
'config_entry_id': <ANY>,
@@ -440,6 +447,7 @@
'always',
'always_charging',
'schedule_mode',
'scheduled',
]),
}),
'context': <ANY>,
@@ -494,6 +502,7 @@
'always',
'always_charging',
'schedule_mode',
'scheduled',
]),
}),
'config_entry_id': <ANY>,
@@ -533,6 +542,7 @@
'always',
'always_charging',
'schedule_mode',
'scheduled',
]),
}),
'context': <ANY>,
@@ -587,6 +597,7 @@
'always',
'always_charging',
'schedule_mode',
'scheduled',
]),
}),
'config_entry_id': <ANY>,
@@ -626,6 +637,7 @@
'always',
'always_charging',
'schedule_mode',
'scheduled',
]),
}),
'context': <ANY>,


@@ -46,11 +46,15 @@ TEST_HA_TO_RISCO = {
}
TEST_OPTIONS = {
"scan_interval": 10,
"code_arm_required": True,
"code_disarm_required": True,
}
TEST_ADVANCED_OPTIONS = {
"scan_interval": 10,
"concurrency": 3,
}
async def test_cloud_form(hass: HomeAssistant) -> None:
"""Test we get the cloud form."""
@@ -387,6 +391,53 @@ async def test_options_flow(hass: HomeAssistant) -> None:
}
async def test_advanced_options_flow(hass: HomeAssistant) -> None:
"""Test options flow."""
entry = MockConfigEntry(
domain=DOMAIN,
unique_id=TEST_CLOUD_DATA["username"],
data=TEST_CLOUD_DATA,
)
entry.add_to_hass(hass)
result = await hass.config_entries.options.async_init(
entry.entry_id, context={"show_advanced_options": True}
)
assert result["type"] is FlowResultType.FORM
assert result["step_id"] == "init"
assert "concurrency" in result["data_schema"].schema
assert "scan_interval" in result["data_schema"].schema
result = await hass.config_entries.options.async_configure(
result["flow_id"], user_input={**TEST_OPTIONS, **TEST_ADVANCED_OPTIONS}
)
assert result["type"] is FlowResultType.FORM
assert result["step_id"] == "risco_to_ha"
result = await hass.config_entries.options.async_configure(
result["flow_id"],
user_input=TEST_RISCO_TO_HA,
)
assert result["type"] is FlowResultType.FORM
assert result["step_id"] == "ha_to_risco"
with patch("homeassistant.components.risco.async_setup_entry", return_value=True):
result = await hass.config_entries.options.async_configure(
result["flow_id"],
user_input=TEST_HA_TO_RISCO,
)
assert result["type"] is FlowResultType.CREATE_ENTRY
assert entry.options == {
**TEST_OPTIONS,
**TEST_ADVANCED_OPTIONS,
"risco_states_to_ha": TEST_RISCO_TO_HA,
"ha_states_to_risco": TEST_HA_TO_RISCO,
}
async def test_ha_to_risco_schema(hass: HomeAssistant) -> None:
"""Test that the schema for the ha-to-risco mapping step is generated properly."""
entry = MockConfigEntry(


@@ -9,7 +9,7 @@
'TV',
'HDMI',
]),
'supported_features': <MediaPlayerEntityFeature: 20413>,
'supported_features': <MediaPlayerEntityFeature: 24509>,
}),
'context': <ANY>,
'entity_id': 'media_player.any',
@@ -51,7 +51,7 @@
'original_name': None,
'platform': 'samsungtv',
'previous_unique_id': None,
'supported_features': <MediaPlayerEntityFeature: 20413>,
'supported_features': <MediaPlayerEntityFeature: 24509>,
'translation_key': None,
'unique_id': 'sample-entry-id',
'unit_of_measurement': None,
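
Note: the supported_features snapshot moved from 20413 to 24509, i.e. one additional feature bit. Plain bit arithmetic is enough to inspect such a change; mapping the bit back to a named MediaPlayerEntityFeature member is deliberately left out to avoid guessing at flag values:

old, new = 20413, 24509

added = new & ~old      # feature bits present only in the new snapshot
removed = old & ~new    # feature bits present only in the old snapshot

print(f"added:   {added:#x}")    # 0x1000 -> a single newly supported feature bit
print(f"removed: {removed:#x}")  # 0x0 -> nothing was dropped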

File diff suppressed because it is too large


@@ -221,7 +221,7 @@ async def test_auth_close_after_revoke(
hass.auth.async_remove_refresh_token(refresh_token)
msg = await websocket_client.receive()
assert msg.type == aiohttp.WSMsgType.CLOSED
assert msg.type is aiohttp.WSMsgType.CLOSE
assert websocket_client.closed


@@ -43,7 +43,7 @@ async def test_pending_msg_overflow(
for idx in range(10):
await websocket_client.send_json({"id": idx + 1, "type": "ping"})
msg = await websocket_client.receive()
assert msg.type == WSMsgType.CLOSED
assert msg.type is WSMsgType.CLOSE
async def test_cleanup_on_cancellation(
@@ -249,7 +249,7 @@ async def test_pending_msg_peak(
)
msg = await websocket_client.receive()
assert msg.type == WSMsgType.CLOSED
assert msg.type is WSMsgType.CLOSE
assert "Client unable to keep up with pending messages" in caplog.text
assert "Stayed over 5 for 5 seconds" in caplog.text
assert "overload" in caplog.text
@@ -297,7 +297,7 @@ async def test_pending_msg_peak_recovery(
msg = await websocket_client.receive()
assert msg.type == WSMsgType.TEXT
msg = await websocket_client.receive()
assert msg.type == WSMsgType.CLOSED
assert msg.type is WSMsgType.CLOSE
assert "Client unable to keep up with pending messages" not in caplog.text


@@ -41,7 +41,7 @@ async def test_quiting_hass(hass: HomeAssistant, websocket_client) -> None:
msg = await websocket_client.receive()
assert msg.type == WSMsgType.CLOSED
assert msg.type is WSMsgType.CLOSE
async def test_unknown_command(websocket_client) -> None:
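
Note: these assertions now expect WSMsgType.CLOSE and switch from == to is. Because enum members are singletons, identity comparison is valid and slightly stricter than equality. A standalone illustration with a stand-in enum (plain Python semantics, nothing aiohttp-specific):

from enum import IntEnum

class WSMsgType(IntEnum):
    """Tiny stand-in with aiohttp-like member names."""
    TEXT = 0x1
    CLOSE = 0x8
    CLOSED = 0x101

msg_type = WSMsgType.CLOSE
raw = 0x8

# Enum members are singletons, so `is` reads as "this exact member"...
assert msg_type is WSMsgType.CLOSE
# ...while `==` would also accept a bare int of the same value.
assert msg_type == raw
assert msg_type is not raw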


@@ -5,6 +5,7 @@ from typing import Any
from unittest.mock import AsyncMock, MagicMock, patch
from urllib.parse import urlparse
from aiohttp import ClientConnectionError
from aiohttp.hdrs import METH_HEAD
from aiowithings import (
NotificationCategory,
@@ -508,6 +509,110 @@ async def test_cloud_disconnect(
assert withings.subscribe_notification.call_count == 12
async def test_internet_disconnect(
hass: HomeAssistant,
withings: AsyncMock,
webhook_config_entry: MockConfigEntry,
hass_client_no_auth: ClientSessionGenerator,
freezer: FrozenDateTimeFactory,
) -> None:
"""Test we can recover from internet disconnects."""
await mock_cloud(hass)
await hass.async_block_till_done()
with (
patch("homeassistant.components.cloud.async_is_logged_in", return_value=True),
patch.object(cloud, "async_is_connected", return_value=True),
patch.object(cloud, "async_active_subscription", return_value=True),
patch(
"homeassistant.components.cloud.async_create_cloudhook",
return_value="https://hooks.nabu.casa/ABCD",
),
patch(
"homeassistant.components.withings.async_get_config_entry_implementation",
),
patch(
"homeassistant.components.cloud.async_delete_cloudhook",
),
patch(
"homeassistant.components.withings.webhook_generate_url",
),
):
await setup_integration(hass, webhook_config_entry)
await prepare_webhook_setup(hass, freezer)
assert cloud.async_active_subscription(hass) is True
assert cloud.async_is_connected(hass) is True
assert withings.revoke_notification_configurations.call_count == 3
assert withings.subscribe_notification.call_count == 6
await hass.async_block_till_done()
withings.list_notification_configurations.side_effect = ClientConnectionError
async_mock_cloud_connection_status(hass, False)
await hass.async_block_till_done()
assert withings.revoke_notification_configurations.call_count == 3
async_mock_cloud_connection_status(hass, True)
await hass.async_block_till_done()
assert withings.subscribe_notification.call_count == 12
async def test_cloud_disconnect_retry(
hass: HomeAssistant,
withings: AsyncMock,
webhook_config_entry: MockConfigEntry,
hass_client_no_auth: ClientSessionGenerator,
freezer: FrozenDateTimeFactory,
) -> None:
"""Test we retry to create webhook connection again after cloud disconnects."""
await mock_cloud(hass)
await hass.async_block_till_done()
with (
patch("homeassistant.components.cloud.async_is_logged_in", return_value=True),
patch.object(cloud, "async_is_connected", return_value=True),
patch.object(
cloud, "async_active_subscription", return_value=True
) as mock_async_active_subscription,
patch(
"homeassistant.components.cloud.async_create_cloudhook",
return_value="https://hooks.nabu.casa/ABCD",
),
patch(
"homeassistant.components.withings.async_get_config_entry_implementation",
),
patch(
"homeassistant.components.cloud.async_delete_cloudhook",
),
patch(
"homeassistant.components.withings.webhook_generate_url",
),
):
await setup_integration(hass, webhook_config_entry)
await prepare_webhook_setup(hass, freezer)
assert cloud.async_active_subscription(hass) is True
assert cloud.async_is_connected(hass) is True
assert mock_async_active_subscription.call_count == 3
await hass.async_block_till_done()
async_mock_cloud_connection_status(hass, False)
await hass.async_block_till_done()
assert mock_async_active_subscription.call_count == 3
freezer.tick(timedelta(seconds=30))
async_fire_time_changed(hass)
await hass.async_block_till_done()
assert mock_async_active_subscription.call_count == 4
@pytest.mark.parametrize(
("body", "expected_code"),
[


@@ -1311,6 +1311,184 @@ async def test_wait_timeout(
assert_action_trace(expected_trace)
@pytest.mark.parametrize(
"timeout_param", [0, "{{ 0 }}", {"minutes": 0}, {"minutes": "{{ 0 }}"}]
)
async def test_wait_trigger_with_zero_timeout(
hass: HomeAssistant, caplog: pytest.LogCaptureFixture, timeout_param: int | str
) -> None:
"""Test the wait trigger with zero timeout option."""
event = "test_event"
events = async_capture_events(hass, event)
action = {
"wait_for_trigger": {
"platform": "state",
"entity_id": "switch.test",
"to": "off",
}
}
action["timeout"] = timeout_param
action["continue_on_timeout"] = True
sequence = cv.SCRIPT_SCHEMA([action, {"event": event}])
sequence = await script.async_validate_actions_config(hass, sequence)
script_obj = script.Script(hass, sequence, "Test Name", "test_domain")
wait_started_flag = async_watch_for_action(script_obj, "wait")
hass.states.async_set("switch.test", "on")
hass.async_create_task(script_obj.async_run(context=Context()))
try:
await asyncio.wait_for(wait_started_flag.wait(), 1)
except (AssertionError, TimeoutError):
await script_obj.async_stop()
raise
assert not script_obj.is_running
assert len(events) == 1
assert "(timeout: 0:00:00)" in caplog.text
variable_wait = {"wait": {"trigger": None, "remaining": 0.0}}
expected_trace = {
"0": [
{
"result": variable_wait,
"variables": variable_wait,
}
],
"1": [{"result": {"event": "test_event", "event_data": {}}}],
}
assert_action_trace(expected_trace)
@pytest.mark.parametrize(
"timeout_param", [0, "{{ 0 }}", {"minutes": 0}, {"minutes": "{{ 0 }}"}]
)
async def test_wait_trigger_matches_with_zero_timeout(
hass: HomeAssistant, caplog: pytest.LogCaptureFixture, timeout_param: int | str
) -> None:
"""Test the wait trigger that matches with zero timeout option."""
event = "test_event"
events = async_capture_events(hass, event)
action = {
"wait_for_trigger": {
"platform": "state",
"entity_id": "switch.test",
"to": "off",
}
}
action["timeout"] = timeout_param
action["continue_on_timeout"] = True
sequence = cv.SCRIPT_SCHEMA([action, {"event": event}])
sequence = await script.async_validate_actions_config(hass, sequence)
script_obj = script.Script(hass, sequence, "Test Name", "test_domain")
wait_started_flag = async_watch_for_action(script_obj, "wait")
hass.states.async_set("switch.test", "off")
hass.async_create_task(script_obj.async_run(context=Context()))
try:
await asyncio.wait_for(wait_started_flag.wait(), 1)
except (AssertionError, TimeoutError):
await script_obj.async_stop()
raise
assert not script_obj.is_running
assert len(events) == 1
assert "(timeout: 0:00:00)" in caplog.text
variable_wait = {"wait": {"trigger": None, "remaining": 0.0}}
expected_trace = {
"0": [
{
"result": variable_wait,
"variables": variable_wait,
}
],
"1": [{"result": {"event": "test_event", "event_data": {}}}],
}
assert_action_trace(expected_trace)
@pytest.mark.parametrize(
"timeout_param", [0, "{{ 0 }}", {"minutes": 0}, {"minutes": "{{ 0 }}"}]
)
async def test_wait_template_with_zero_timeout(
hass: HomeAssistant, caplog: pytest.LogCaptureFixture, timeout_param: int | str
) -> None:
"""Test the wait template with zero timeout option."""
event = "test_event"
events = async_capture_events(hass, event)
action = {"wait_template": "{{ states.switch.test.state == 'off' }}"}
action["timeout"] = timeout_param
action["continue_on_timeout"] = True
sequence = cv.SCRIPT_SCHEMA([action, {"event": event}])
sequence = await script.async_validate_actions_config(hass, sequence)
script_obj = script.Script(hass, sequence, "Test Name", "test_domain")
wait_started_flag = async_watch_for_action(script_obj, "wait")
hass.states.async_set("switch.test", "on")
hass.async_create_task(script_obj.async_run(context=Context()))
try:
await asyncio.wait_for(wait_started_flag.wait(), 1)
except (AssertionError, TimeoutError):
await script_obj.async_stop()
raise
assert not script_obj.is_running
assert len(events) == 1
assert "(timeout: 0:00:00)" in caplog.text
variable_wait = {"wait": {"completed": False, "remaining": 0.0}}
expected_trace = {
"0": [
{
"result": variable_wait,
"variables": variable_wait,
}
],
"1": [{"result": {"event": "test_event", "event_data": {}}}],
}
assert_action_trace(expected_trace)
@pytest.mark.parametrize(
"timeout_param", [0, "{{ 0 }}", {"minutes": 0}, {"minutes": "{{ 0 }}"}]
)
async def test_wait_template_matches_with_zero_timeout(
hass: HomeAssistant, caplog: pytest.LogCaptureFixture, timeout_param: int | str
) -> None:
"""Test the wait template that matches with zero timeout option."""
event = "test_event"
events = async_capture_events(hass, event)
action = {"wait_template": "{{ states.switch.test.state == 'off' }}"}
action["timeout"] = timeout_param
action["continue_on_timeout"] = True
sequence = cv.SCRIPT_SCHEMA([action, {"event": event}])
sequence = await script.async_validate_actions_config(hass, sequence)
script_obj = script.Script(hass, sequence, "Test Name", "test_domain")
wait_started_flag = async_watch_for_action(script_obj, "wait")
hass.states.async_set("switch.test", "off")
hass.async_create_task(script_obj.async_run(context=Context()))
try:
await asyncio.wait_for(wait_started_flag.wait(), 1)
except (AssertionError, TimeoutError):
await script_obj.async_stop()
raise
assert not script_obj.is_running
assert len(events) == 1
assert "(timeout: 0:00:00)" in caplog.text
variable_wait = {"wait": {"completed": True, "remaining": 0.0}}
expected_trace = {
"0": [
{
"result": variable_wait,
"variables": variable_wait,
}
],
"1": [{"result": {"event": "test_event", "event_data": {}}}],
}
assert_action_trace(expected_trace)
@pytest.mark.parametrize(
("continue_on_timeout", "n_events"), [(False, 0), (True, 1), (None, 1)]
)
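
Note: all four parametrized timeout forms above express a zero timeout, and the new tests assert that the wait action finishes immediately (with continue_on_timeout the follow-up event still fires) instead of the script hanging. The behavior mirrors plain asyncio, where a zero timeout gives the awaitable no chance to complete unless it is already done — a generic asyncio illustration, not Home Assistant's script engine:

import asyncio

async def main() -> None:
    never_set = asyncio.Event()
    try:
        # A zero timeout expires immediately when the condition is not already met,
        # which is what the wait_for_trigger/wait_template tests above expect.
        await asyncio.wait_for(never_set.wait(), timeout=0)
    except asyncio.TimeoutError:
        print("timed out immediately")

asyncio.run(main())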


@@ -1,6 +1,6 @@
"""Tests for the update coordinator."""
from datetime import timedelta
from datetime import datetime, timedelta
import logging
from unittest.mock import AsyncMock, Mock, patch
import urllib.error
@@ -12,7 +12,7 @@ import requests
from homeassistant import config_entries
from homeassistant.const import EVENT_HOMEASSISTANT_STOP
from homeassistant.core import CoreState, HomeAssistant
from homeassistant.core import CoreState, HomeAssistant, callback
from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.helpers import update_coordinator
from homeassistant.util.dt import utcnow
@@ -716,3 +716,35 @@ async def test_always_callback_when_always_update_is_true(
update_callback.reset_mock()
remove_callbacks()
async def test_timestamp_date_update_coordinator(hass: HomeAssistant) -> None:
"""Test last_update_success_time is set before calling listeners."""
last_update_success_times: list[datetime | None] = []
async def refresh() -> int:
return 1
crd = update_coordinator.TimestampDataUpdateCoordinator[int](
hass,
_LOGGER,
name="test",
update_method=refresh,
update_interval=timedelta(seconds=10),
)
@callback
def listener():
last_update_success_times.append(crd.last_update_success_time)
unsub = crd.async_add_listener(listener)
await crd.async_refresh()
assert len(last_update_success_times) == 1
# Ensure the time is set before the listener is called
assert last_update_success_times != [None]
unsub()
await crd.async_refresh()
assert len(last_update_success_times) == 1
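
Note: the new test pins down listener ordering: by the time listeners run, last_update_success_time must already be populated. A toy sketch of why that ordering matters — a hypothetical staleness check, not the coordinator's real code — where a listener inspecting the timestamp would otherwise see stale or missing data for the refresh it was just notified about:

from datetime import datetime, timedelta, timezone

class ToyCoordinator:
    """Hypothetical, minimal stand-in to show the ordering requirement."""

    def __init__(self) -> None:
        self.last_update_success_time: datetime | None = None
        self._listeners = []

    def add_listener(self, cb) -> None:
        self._listeners.append(cb)

    def refresh(self) -> None:
        # Correct order: record the success time first, then notify listeners.
        self.last_update_success_time = datetime.now(timezone.utc)
        for cb in self._listeners:
            cb()

coordinator = ToyCoordinator()

def listener() -> None:
    stale = (
        coordinator.last_update_success_time is None
        or datetime.now(timezone.utc) - coordinator.last_update_success_time
        > timedelta(seconds=10)
    )
    # If the timestamp were still unset here, this listener would wrongly
    # treat the fresh data as stale and could request yet another refresh.
    assert not stale

coordinator.add_listener(listener)
coordinator.refresh()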