Compare commits


43 Commits

Author SHA1 Message Date
Franck Nijhof
79ff85f517 2025.2.1 (#137688)
* Fix hassio test using wrong fixture (#137516)

* Change Electric Kiwi authentication (#135231)

Co-authored-by: Joostlek <joostlek@outlook.com>

* Update govee-ble to 0.42.1 (#137371)

* Bump holidays to 0.66 (#137449)

* Bump aiohttp-asyncmdnsresolver to 0.1.0 (#137492)

changelog: https://github.com/aio-libs/aiohttp-asyncmdnsresolver/compare/v0.0.3...v0.1.0

Switches to the new AsyncDualMDNSResolver class, which tries both mDNS and DNS for .local domains, since there are so many different user DNS configurations to support.

fixes #137479
fixes #136922

* Bump aiohttp to 3.11.12 (#137494)

changelog: https://github.com/aio-libs/aiohttp/compare/v3.11.11...v3.11.12

* Bump govee-ble to 0.43.0 to fix compat with new H5179 firmware (#137508)

changelog: https://github.com/Bluetooth-Devices/govee-ble/compare/v0.42.1...v0.43.0

fixes #136969

* Bump habiticalib to v0.3.5 (#137510)

* Fix Mill issue, where no sensors were shown (#137521)

Fix mill issue #137477

Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>

* Don't overwrite setup state in async_set_domains_to_be_loaded (#137547)

* Use separate metadata files for onedrive (#137549)

* Fix sending polls to Telegram threads (#137553)

Fix sending poll to Telegram thread

* Skip building wheels for electrickiwi-api (#137556)

* Add excluded domains to broadcast intent (#137566)

* Revert "Add `PaddleSwitchPico` (Pico Paddle Remote) device trigger to Lutron Caseta" (#137571)

* Fix Overseerr webhook configuration JSON (#137572)

Co-authored-by: Lars Jouon <schm.lars@googlemail.com>

* Do not rely on pyserial for port scanning with the CM5 + ZHA (#137585)

Do not rely on pyserial for port scanning with the CM5

* Bump eheimdigital to 1.0.6 (#137587)

* Bump pyfireservicerota to 0.0.46 (#137589)

* Bump reolink-aio to 0.11.10 (#137591)

* Allow to omit the payload attribute to MQTT publish action to allow an empty payload to be sent by default (#137595)

Allow to omit the payload attribute to MQTT publish action to allow an empty payload to be sent by default

* Handle previously migrated HEOS device identifier (#137596)

* Bump `aioshelly` to version `12.4.1` (#137598)

* Bump aioshelly to 12.4.0

* Bump to 12.4.1

* Bump electrickiwi-api  to 0.9.13 (#137601)

* bump ek api version to fix deps

* Revert "Skip building wheels for electrickiwi-api (#137556)"

This reverts commit 5f6068eea4.

---------

Co-authored-by: Marc Mueller <30130371+cdce8p@users.noreply.github.com>

* Bump ZHA to 0.0.48 (#137610)

* Bump Electrickiwi-api to 0.9.14 (#137614)

* bump library to fix bug with post

* rebuild

* Update google-nest-sdm to 7.1.3 (#137625)

* Update google-nest-sdm to 7.1.2

* Bump nest to 7.1.3

* Don't use the current temperature from Shelly BLU TRV as a state for External Temperature number entity (#137658)

Introduce RpcBluTrvExtTempNumber for External Temperature entity

* Fix LG webOS TV turn off when device is already off (#137675)

* Bump version to 2025.2.1

---------

Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>
Co-authored-by: Erik Montnemery <erik@montnemery.com>
Co-authored-by: Michael Arthur <mikey0000@users.noreply.github.com>
Co-authored-by: Joostlek <joostlek@outlook.com>
Co-authored-by: Marc Mueller <30130371+cdce8p@users.noreply.github.com>
Co-authored-by: G Johansson <goran.johansson@shiftit.se>
Co-authored-by: J. Nick Koston <nick@koston.org>
Co-authored-by: Manu <4445816+tr4nt0r@users.noreply.github.com>
Co-authored-by: Daniel Hjelseth Høyer <github@dahoiv.net>
Co-authored-by: Josef Zweck <josef@zweck.dev>
Co-authored-by: Jasper Wiegratz <656460+jwhb@users.noreply.github.com>
Co-authored-by: Michael Hansen <mike@rhasspy.org>
Co-authored-by: Dennis Effing <dennis.effing@outlook.com>
Co-authored-by: Lars Jouon <schm.lars@googlemail.com>
Co-authored-by: puddly <32534428+puddly@users.noreply.github.com>
Co-authored-by: Sid <27780930+autinerd@users.noreply.github.com>
Co-authored-by: Ron <ron@cyberjunky.nl>
Co-authored-by: starkillerOG <starkiller.og@gmail.com>
Co-authored-by: Jan Bouwhuis <jbouwh@users.noreply.github.com>
Co-authored-by: Andrew Sayre <6730289+andrewsayre@users.noreply.github.com>
Co-authored-by: Maciej Bieniek <bieniu@users.noreply.github.com>
Co-authored-by: TheJulianJES <TheJulianJES@users.noreply.github.com>
Co-authored-by: Allen Porter <allen@thebends.org>
Co-authored-by: Shay Levy <levyshay1@gmail.com>
2025-02-07 19:34:32 +01:00
Franck Nijhof
73ad4caf94 Bump version to 2025.2.1 2025-02-07 16:39:53 +00:00
Shay Levy
e3d649d349 Fix LG webOS TV turn off when device is already off (#137675) 2025-02-07 16:37:52 +00:00
Maciej Bieniek
657e3488ba Don't use the current temperature from Shelly BLU TRV as a state for External Temperature number entity (#137658)
Introduce RpcBluTrvExtTempNumber for External Temperature entity
2025-02-07 16:37:49 +00:00
Allen Porter
7508c14a53 Update google-nest-sdm to 7.1.3 (#137625)
* Update google-nest-sdm to 7.1.2

* Bump nest to 7.1.3
2025-02-07 16:37:43 +00:00
Michael Arthur
ac84970da8 Bump Electrickiwi-api to 0.9.14 (#137614)
* bump library to fix bug with post

* rebuild
2025-02-07 16:37:40 +00:00
TheJulianJES
30073f3493 Bump ZHA to 0.0.48 (#137610) 2025-02-07 16:37:36 +00:00
Michael Arthur
3abd7b8ba3 Bump electrickiwi-api to 0.9.13 (#137601)
* bump ek api version to fix deps

* Revert "Skip building wheels for electrickiwi-api (#137556)"

This reverts commit 5f6068eea4.

---------

Co-authored-by: Marc Mueller <30130371+cdce8p@users.noreply.github.com>
2025-02-07 16:37:33 +00:00
Maciej Bieniek
62bc6e4bf6 Bump aioshelly to version 12.4.1 (#137598)
* Bump aioshelly to 12.4.0

* Bump to 12.4.1
2025-02-07 16:37:30 +00:00
Andrew Sayre
5faa189fef Handle previously migrated HEOS device identifier (#137596) 2025-02-07 16:37:26 +00:00
Jan Bouwhuis
e09ae1c83d Allow to omit the payload attribute to MQTT publish action to allow an empty payload to be sent by default (#137595)
Allow to omit the payload attribute to MQTT publish action to allow an empty payload to be sent by default
2025-02-07 16:37:23 +00:00
starkillerOG
7b20299de7 Bump reolink-aio to 0.11.10 (#137591) 2025-02-07 16:37:19 +00:00
Ron
81e501aba1 Bump pyfireservicerota to 0.0.46 (#137589) 2025-02-07 16:37:16 +00:00
Sid
568ac22ce8 Bump eheimdigital to 1.0.6 (#137587) 2025-02-07 16:37:12 +00:00
puddly
c71ab054f1 Do not rely on pyserial for port scanning with the CM5 + ZHA (#137585)
Do not rely on pyserial for port scanning with the CM5
2025-02-07 16:37:09 +00:00
Dennis Effing
bea201f9f6 Fix Overseerr webhook configuration JSON (#137572)
Co-authored-by: Lars Jouon <schm.lars@googlemail.com>
2025-02-07 16:37:05 +00:00
J. Nick Koston
dda90bc04c Revert "Add PaddleSwitchPico (Pico Paddle Remote) device trigger to Lutron Caseta" (#137571) 2025-02-07 16:37:02 +00:00
Michael Hansen
a033e4c88d Add excluded domains to broadcast intent (#137566) 2025-02-07 16:36:59 +00:00
Marc Mueller
42b6f83e7c Skip building wheels for electrickiwi-api (#137556) 2025-02-07 16:36:55 +00:00
Jasper Wiegratz
cb937bc115 Fix sending polls to Telegram threads (#137553)
Fix sending poll to Telegram thread
2025-02-07 16:36:51 +00:00
Josef Zweck
bec569caf9 Use separate metadata files for onedrive (#137549) 2025-02-07 16:36:47 +00:00
Erik Montnemery
3390fb32a8 Don't overwrite setup state in async_set_domains_to_be_loaded (#137547) 2025-02-07 16:36:43 +00:00
Daniel Hjelseth Høyer
3ebb58f780 Fix Mill issue, where no sensors were shown (#137521)
Fix mill issue #137477

Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>
2025-02-07 16:36:40 +00:00
Manu
30b131d3b9 Bump habiticalib to v0.3.5 (#137510) 2025-02-07 16:36:36 +00:00
J. Nick Koston
cd40232beb Bump govee-ble to 0.43.0 to fix compat with new H5179 firmware (#137508)
changelog: https://github.com/Bluetooth-Devices/govee-ble/compare/v0.42.1...v0.43.0

fixes #136969
2025-02-07 16:36:29 +00:00
J. Nick Koston
f27fe365c5 Bump aiohttp to 3.11.12 (#137494)
changelog: https://github.com/aio-libs/aiohttp/compare/v3.11.11...v3.11.12
2025-02-07 16:34:31 +00:00
J. Nick Koston
1c769418fb Bump aiohttp-asyncmdnsresolver to 0.1.0 (#137492)
changelog: https://github.com/aio-libs/aiohttp-asyncmdnsresolver/compare/v0.0.3...v0.1.0

Switches to the new AsyncDualMDNSResolver class, which tries both mDNS and DNS for .local domains, since there are so many different user DNS configurations to support.

fixes #137479
fixes #136922
2025-02-07 16:32:21 +00:00
G Johansson
db7c2dab52 Bump holidays to 0.66 (#137449) 2025-02-07 16:28:43 +00:00
Marc Mueller
627377872b Update govee-ble to 0.42.1 (#137371) 2025-02-07 16:28:37 +00:00
Michael Arthur
8504162539 Change Electric Kiwi authentication (#135231)
Co-authored-by: Joostlek <joostlek@outlook.com>
2025-02-07 16:28:31 +00:00
Erik Montnemery
67c6a1d436 Fix hassio test using wrong fixture (#137516) 2025-02-06 09:04:49 +01:00
Franck Nijhof
5c383f3d88 2025.2.0 (#137448) 2025-02-05 20:11:04 +01:00
Franck Nijhof
3a88c9d6f4 Bump version to 2025.2.0 2025-02-05 17:35:07 +00:00
Franck Nijhof
5c7cabed1e Bump version to 2025.2.0b12 2025-02-05 17:30:55 +00:00
J. Nick Koston
65fde6042f Bump dbus-fast to 2.33.0 (#137446)
changelog: https://github.com/Bluetooth-Devices/dbus-fast/compare/v2.32.0...v2.33.0
2025-02-05 17:30:19 +00:00
Michael Hansen
d5dd0f6ec1 Bump hassil and intents (#137440) 2025-02-05 17:28:33 +00:00
Marc Mueller
95410586b1 Update bluetooth-data-tools to 1.23.4 (#137374)
Co-authored-by: J. Nick Koston <nick@koston.org>
2025-02-05 17:24:18 +00:00
Marc Mueller
d5ad91fce3 Update bluetooth dependencies (#137353) 2025-02-05 17:21:28 +00:00
Franck Nijhof
04b0d587c5 Bump version to 2025.2.0b11 2025-02-05 16:18:01 +00:00
Bram Kragten
72a3c5296c Update frontend to 20250205.0 (#137441) 2025-02-05 16:16:12 +00:00
Erik Montnemery
d6414b9849 Bump aiohasupervisor to version 0.3.0 (#137437) 2025-02-05 16:14:42 +00:00
starkillerOG
c4e2ddd28b Bump reolink_aio to 0.11.9 (#137430)
* Add push callbacks

* Bump reolink_aio to 0.11.9
2025-02-05 16:14:39 +00:00
Josef Zweck
5687a4d718 Bump onedrive to 0.0.8 (#137423)
* Bump onedrive to 0.0.6

* bump to 0.0.7

* bump to 0.0.8

* Improve coverage
2025-02-05 16:14:06 +00:00
91 changed files with 2369 additions and 1584 deletions


@@ -1,5 +1,7 @@
"""Assist Satellite intents."""
from typing import Final
import voluptuous as vol
from homeassistant.core import HomeAssistant
@@ -7,6 +9,8 @@ from homeassistant.helpers import entity_registry as er, intent
from .const import DOMAIN, AssistSatelliteEntityFeature
EXCLUDED_DOMAINS: Final[set[str]] = {"voip"}
async def async_setup_intents(hass: HomeAssistant) -> None:
"""Set up the intents."""
@@ -30,19 +34,36 @@ class BroadcastIntentHandler(intent.IntentHandler):
ent_reg = er.async_get(hass)
# Find all assist satellite entities that are not the one invoking the intent
entities = {
entity: entry
for entity in hass.states.async_entity_ids(DOMAIN)
if (entry := ent_reg.async_get(entity))
and entry.supported_features & AssistSatelliteEntityFeature.ANNOUNCE
}
entities: dict[str, er.RegistryEntry] = {}
for entity in hass.states.async_entity_ids(DOMAIN):
entry = ent_reg.async_get(entity)
if (
(entry is None)
or (
# Supports announce
not (
entry.supported_features & AssistSatelliteEntityFeature.ANNOUNCE
)
)
# Not the invoking device
or (intent_obj.device_id and (entry.device_id == intent_obj.device_id))
):
# Skip satellite
continue
if intent_obj.device_id:
entities = {
entity: entry
for entity, entry in entities.items()
if entry.device_id != intent_obj.device_id
}
# Check domain of config entry against excluded domains
if (
entry.config_entry_id
and (
config_entry := hass.config_entries.async_get_entry(
entry.config_entry_id
)
)
and (config_entry.domain in EXCLUDED_DOMAINS)
):
continue
entities[entity] = entry
await hass.services.async_call(
DOMAIN,
@@ -54,7 +75,6 @@ class BroadcastIntentHandler(intent.IntentHandler):
)
response = intent_obj.create_response()
response.async_set_speech("Done")
response.response_type = intent.IntentResponseType.ACTION_DONE
response.async_set_results(
success_results=[


@@ -16,11 +16,11 @@
   "quality_scale": "internal",
   "requirements": [
     "bleak==0.22.3",
-    "bleak-retry-connector==3.8.0",
-    "bluetooth-adapters==0.21.1",
+    "bleak-retry-connector==3.8.1",
+    "bluetooth-adapters==0.21.4",
     "bluetooth-auto-recovery==1.4.2",
-    "bluetooth-data-tools==1.23.3",
-    "dbus-fast==2.32.0",
-    "habluetooth==3.21.0"
+    "bluetooth-data-tools==1.23.4",
+    "dbus-fast==2.33.0",
+    "habluetooth==3.21.1"
   ]
 }


@@ -6,5 +6,5 @@
   "documentation": "https://www.home-assistant.io/integrations/conversation",
   "integration_type": "system",
   "quality_scale": "internal",
-  "requirements": ["hassil==2.2.0", "home-assistant-intents==2025.1.28"]
+  "requirements": ["hassil==2.2.3", "home-assistant-intents==2025.2.5"]
 }


@@ -8,7 +8,7 @@
   "iot_class": "local_polling",
   "loggers": ["eheimdigital"],
   "quality_scale": "bronze",
-  "requirements": ["eheimdigital==1.0.5"],
+  "requirements": ["eheimdigital==1.0.6"],
   "zeroconf": [
     { "type": "_http._tcp.local.", "name": "eheimdigital._http._tcp.local." }
   ]


@@ -4,12 +4,16 @@ from __future__ import annotations
import aiohttp
from electrickiwi_api import ElectricKiwiApi
from electrickiwi_api.exceptions import ApiException
from electrickiwi_api.exceptions import ApiException, AuthException
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers import aiohttp_client, config_entry_oauth2_flow
from homeassistant.helpers import (
aiohttp_client,
config_entry_oauth2_flow,
entity_registry as er,
)
from . import api
from .coordinator import (
@@ -44,7 +48,9 @@ async def async_setup_entry(
raise ConfigEntryNotReady from err
ek_api = ElectricKiwiApi(
api.AsyncConfigEntryAuth(aiohttp_client.async_get_clientsession(hass), session)
api.ConfigEntryElectricKiwiAuth(
aiohttp_client.async_get_clientsession(hass), session
)
)
hop_coordinator = ElectricKiwiHOPDataCoordinator(hass, entry, ek_api)
account_coordinator = ElectricKiwiAccountDataCoordinator(hass, entry, ek_api)
@@ -53,6 +59,8 @@ async def async_setup_entry(
await ek_api.set_active_session()
await hop_coordinator.async_config_entry_first_refresh()
await account_coordinator.async_config_entry_first_refresh()
except AuthException as err:
raise ConfigEntryAuthFailed from err
except ApiException as err:
raise ConfigEntryNotReady from err
@@ -70,3 +78,53 @@ async def async_unload_entry(
) -> bool:
"""Unload a config entry."""
return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
async def async_migrate_entry(
hass: HomeAssistant, config_entry: ElectricKiwiConfigEntry
) -> bool:
"""Migrate old entry."""
if config_entry.version == 1 and config_entry.minor_version == 1:
implementation = (
await config_entry_oauth2_flow.async_get_config_entry_implementation(
hass, config_entry
)
)
session = config_entry_oauth2_flow.OAuth2Session(
hass, config_entry, implementation
)
ek_api = ElectricKiwiApi(
api.ConfigEntryElectricKiwiAuth(
aiohttp_client.async_get_clientsession(hass), session
)
)
try:
await ek_api.set_active_session()
connection_details = await ek_api.get_connection_details()
except AuthException:
config_entry.async_start_reauth(hass)
return False
except ApiException:
return False
unique_id = str(ek_api.customer_number)
identifier = ek_api.electricity.identifier
hass.config_entries.async_update_entry(
config_entry, unique_id=unique_id, minor_version=2
)
entity_registry = er.async_get(hass)
entity_entries = er.async_entries_for_config_entry(
entity_registry, config_entry_id=config_entry.entry_id
)
for entity in entity_entries:
assert entity.config_entry_id
entity_registry.async_update_entity(
entity.entity_id,
new_unique_id=entity.unique_id.replace(
f"{unique_id}_{connection_details.id}", f"{unique_id}_{identifier}"
),
)
return True
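The async_migrate_entry code above rewrites each entity's unique_id, swapping the old connection id for the new electricity identifier while keeping the customer-number prefix. The core string rewrite is a simple substitution; this sketch uses hypothetical argument names, not the integration's API:

```python
def migrate_unique_id(
    unique_id: str, customer: str, old_connection_id: str, new_identifier: str
) -> str:
    """Rewrite a '<customer>_<old connection id>' prefix to use the new identifier.

    Unique ids that do not contain the old prefix are returned unchanged,
    which makes the migration safe to run over already-migrated entries.
    """
    return unique_id.replace(
        f"{customer}_{old_connection_id}", f"{customer}_{new_identifier}"
    )
```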


@@ -2,17 +2,16 @@
from __future__ import annotations
from typing import cast
from aiohttp import ClientSession
from electrickiwi_api import AbstractAuth
from homeassistant.helpers import config_entry_oauth2_flow
from homeassistant.core import HomeAssistant
from homeassistant.helpers import aiohttp_client, config_entry_oauth2_flow
from .const import API_BASE_URL
class AsyncConfigEntryAuth(AbstractAuth):
class ConfigEntryElectricKiwiAuth(AbstractAuth):
"""Provide Electric Kiwi authentication tied to an OAuth2 based config entry."""
def __init__(
@@ -29,4 +28,21 @@ class AsyncConfigEntryAuth(AbstractAuth):
"""Return a valid access token."""
await self._oauth_session.async_ensure_token_valid()
return cast(str, self._oauth_session.token["access_token"])
return str(self._oauth_session.token["access_token"])
class ConfigFlowElectricKiwiAuth(AbstractAuth):
"""Provide Electric Kiwi authentication tied to an OAuth2 based config flow."""
def __init__(
self,
hass: HomeAssistant,
token: str,
) -> None:
"""Initialize ConfigFlowFitbitApi."""
super().__init__(aiohttp_client.async_get_clientsession(hass), API_BASE_URL)
self._token = token
async def async_get_access_token(self) -> str:
"""Return the token for the Electric Kiwi API."""
return self._token


@@ -6,9 +6,14 @@ from collections.abc import Mapping
import logging
from typing import Any
from homeassistant.config_entries import ConfigFlowResult
from electrickiwi_api import ElectricKiwiApi
from electrickiwi_api.exceptions import ApiException
from homeassistant.config_entries import SOURCE_REAUTH, ConfigFlowResult
from homeassistant.const import CONF_NAME
from homeassistant.helpers import config_entry_oauth2_flow
from . import api
from .const import DOMAIN, SCOPE_VALUES
@@ -17,6 +22,8 @@ class ElectricKiwiOauth2FlowHandler(
):
"""Config flow to handle Electric Kiwi OAuth2 authentication."""
VERSION = 1
MINOR_VERSION = 2
DOMAIN = DOMAIN
@property
@@ -40,12 +47,30 @@ class ElectricKiwiOauth2FlowHandler(
) -> ConfigFlowResult:
"""Dialog that informs the user that reauth is required."""
if user_input is None:
return self.async_show_form(step_id="reauth_confirm")
return self.async_show_form(
step_id="reauth_confirm",
description_placeholders={CONF_NAME: self._get_reauth_entry().title},
)
return await self.async_step_user()
async def async_oauth_create_entry(self, data: dict) -> ConfigFlowResult:
"""Create an entry for Electric Kiwi."""
existing_entry = await self.async_set_unique_id(DOMAIN)
if existing_entry:
return self.async_update_reload_and_abort(existing_entry, data=data)
return await super().async_oauth_create_entry(data)
ek_api = ElectricKiwiApi(
api.ConfigFlowElectricKiwiAuth(self.hass, data["token"]["access_token"])
)
try:
session = await ek_api.get_active_session()
except ApiException:
return self.async_abort(reason="connection_error")
unique_id = str(session.data.customer_number)
await self.async_set_unique_id(unique_id)
if self.source == SOURCE_REAUTH:
self._abort_if_unique_id_mismatch(reason="wrong_account")
return self.async_update_reload_and_abort(
self._get_reauth_entry(), data=data
)
self._abort_if_unique_id_configured()
return self.async_create_entry(title=unique_id, data=data)


@@ -8,4 +8,4 @@ OAUTH2_AUTHORIZE = "https://welcome.electrickiwi.co.nz/oauth/authorize"
 OAUTH2_TOKEN = "https://welcome.electrickiwi.co.nz/oauth/token"
 API_BASE_URL = "https://api.electrickiwi.co.nz"
-SCOPE_VALUES = "read_connection_detail read_billing_frequency read_account_running_balance read_consumption_summary read_consumption_averages read_hop_intervals_config read_hop_connection save_hop_connection read_session"
+SCOPE_VALUES = "read_customer_details read_connection_detail read_connection read_billing_address get_bill_address read_billing_frequency read_billing_details read_billing_bills read_billing_bill read_billing_bill_id read_billing_bill_file read_account_running_balance read_customer_account_summary read_consumption_summary download_consumption_file read_consumption_averages get_consumption_averages read_hop_intervals_config read_hop_intervals read_hop_connection read_hop_specific_connection save_hop_connection save_hop_specific_connection read_outage_contact get_outage_contact_info_for_icp read_session read_session_data_login"


@@ -10,7 +10,7 @@ import logging
from electrickiwi_api import ElectricKiwiApi
from electrickiwi_api.exceptions import ApiException, AuthException
from electrickiwi_api.model import AccountBalance, Hop, HopIntervals
from electrickiwi_api.model import AccountSummary, Hop, HopIntervals
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
@@ -34,7 +34,7 @@ class ElectricKiwiRuntimeData:
type ElectricKiwiConfigEntry = ConfigEntry[ElectricKiwiRuntimeData]
class ElectricKiwiAccountDataCoordinator(DataUpdateCoordinator[AccountBalance]):
class ElectricKiwiAccountDataCoordinator(DataUpdateCoordinator[AccountSummary]):
"""ElectricKiwi Account Data object."""
def __init__(
@@ -51,13 +51,13 @@ class ElectricKiwiAccountDataCoordinator(DataUpdateCoordinator[AccountBalance]):
name="Electric Kiwi Account Data",
update_interval=ACCOUNT_SCAN_INTERVAL,
)
self._ek_api = ek_api
self.ek_api = ek_api
async def _async_update_data(self) -> AccountBalance:
async def _async_update_data(self) -> AccountSummary:
"""Fetch data from Account balance API endpoint."""
try:
async with asyncio.timeout(60):
return await self._ek_api.get_account_balance()
return await self.ek_api.get_account_summary()
except AuthException as auth_err:
raise ConfigEntryAuthFailed from auth_err
except ApiException as api_err:
@@ -85,7 +85,7 @@ class ElectricKiwiHOPDataCoordinator(DataUpdateCoordinator[Hop]):
# Polling interval. Will only be polled if there are subscribers.
update_interval=HOP_SCAN_INTERVAL,
)
self._ek_api = ek_api
self.ek_api = ek_api
self.hop_intervals: HopIntervals | None = None
def get_hop_options(self) -> dict[str, int]:
@@ -100,7 +100,7 @@ class ElectricKiwiHOPDataCoordinator(DataUpdateCoordinator[Hop]):
async def async_update_hop(self, hop_interval: int) -> Hop:
"""Update selected hop and data."""
try:
self.async_set_updated_data(await self._ek_api.post_hop(hop_interval))
self.async_set_updated_data(await self.ek_api.post_hop(hop_interval))
except AuthException as auth_err:
raise ConfigEntryAuthFailed from auth_err
except ApiException as api_err:
@@ -118,7 +118,7 @@ class ElectricKiwiHOPDataCoordinator(DataUpdateCoordinator[Hop]):
try:
async with asyncio.timeout(60):
if self.hop_intervals is None:
hop_intervals: HopIntervals = await self._ek_api.get_hop_intervals()
hop_intervals: HopIntervals = await self.ek_api.get_hop_intervals()
hop_intervals.intervals = OrderedDict(
filter(
lambda pair: pair[1].active == 1,
@@ -127,7 +127,7 @@ class ElectricKiwiHOPDataCoordinator(DataUpdateCoordinator[Hop]):
)
self.hop_intervals = hop_intervals
return await self._ek_api.get_hop()
return await self.ek_api.get_hop()
except AuthException as auth_err:
raise ConfigEntryAuthFailed from auth_err
except ApiException as api_err:


@@ -7,5 +7,5 @@
   "documentation": "https://www.home-assistant.io/integrations/electric_kiwi",
   "integration_type": "hub",
   "iot_class": "cloud_polling",
-  "requirements": ["electrickiwi-api==0.8.5"]
+  "requirements": ["electrickiwi-api==0.9.14"]
 }


@@ -53,8 +53,8 @@ class ElectricKiwiSelectHOPEntity(
"""Initialise the HOP selection entity."""
super().__init__(coordinator)
self._attr_unique_id = (
f"{coordinator._ek_api.customer_number}" # noqa: SLF001
f"_{coordinator._ek_api.connection_id}_{description.key}" # noqa: SLF001
f"{coordinator.ek_api.customer_number}"
f"_{coordinator.ek_api.electricity.identifier}_{description.key}"
)
self.entity_description = description
self.values_dict = coordinator.get_hop_options()


@@ -6,7 +6,7 @@ from collections.abc import Callable
from dataclasses import dataclass
from datetime import datetime, timedelta
from electrickiwi_api.model import AccountBalance, Hop
from electrickiwi_api.model import AccountSummary, Hop
from homeassistant.components.sensor import (
SensorDeviceClass,
@@ -39,7 +39,15 @@ ATTR_HOP_PERCENTAGE = "hop_percentage"
class ElectricKiwiAccountSensorEntityDescription(SensorEntityDescription):
"""Describes Electric Kiwi sensor entity."""
value_func: Callable[[AccountBalance], float | datetime]
value_func: Callable[[AccountSummary], float | datetime]
def _get_hop_percentage(account_balance: AccountSummary) -> float:
"""Return the hop percentage from account summary."""
if power := account_balance.services.get("power"):
if connection := power.connections[0]:
return float(connection.hop_percentage)
return 0.0
ACCOUNT_SENSOR_TYPES: tuple[ElectricKiwiAccountSensorEntityDescription, ...] = (
@@ -72,9 +80,7 @@ ACCOUNT_SENSOR_TYPES: tuple[ElectricKiwiAccountSensorEntityDescription, ...] = (
translation_key="hop_power_savings",
native_unit_of_measurement=PERCENTAGE,
state_class=SensorStateClass.MEASUREMENT,
value_func=lambda account_balance: float(
account_balance.connections[0].hop_percentage
),
value_func=_get_hop_percentage,
),
)
@@ -165,8 +171,8 @@ class ElectricKiwiAccountEntity(
super().__init__(coordinator)
self._attr_unique_id = (
f"{coordinator._ek_api.customer_number}" # noqa: SLF001
f"_{coordinator._ek_api.connection_id}_{description.key}" # noqa: SLF001
f"{coordinator.ek_api.customer_number}"
f"_{coordinator.ek_api.electricity.identifier}_{description.key}"
)
self.entity_description = description
@@ -194,8 +200,8 @@ class ElectricKiwiHOPEntity(
super().__init__(coordinator)
self._attr_unique_id = (
f"{coordinator._ek_api.customer_number}" # noqa: SLF001
f"_{coordinator._ek_api.connection_id}_{description.key}" # noqa: SLF001
f"{coordinator.ek_api.customer_number}"
f"_{coordinator.ek_api.electricity.identifier}_{description.key}"
)
self.entity_description = description
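The new _get_hop_percentage helper above replaces a direct `connections[0].hop_percentage` access with a guarded lookup that falls back to 0.0 when the account summary has no power service. A minimal sketch of that guard, using plain dicts instead of the electrickiwi-api model objects:

```python
def get_hop_percentage(services: dict) -> float:
    """Safely pull hop_percentage from a nested account-summary dict.

    Returns 0.0 when the "power" service or its connections are missing,
    instead of raising KeyError/IndexError on a sparse summary.
    """
    if (power := services.get("power")) and (connections := power.get("connections")):
        return float(connections[0]["hop_percentage"])
    return 0.0
```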


@@ -21,7 +21,8 @@
   "reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]",
   "oauth_timeout": "[%key:common::config_flow::abort::oauth2_timeout%]",
   "oauth_unauthorized": "[%key:common::config_flow::abort::oauth2_unauthorized%]",
-  "oauth_failed": "[%key:common::config_flow::abort::oauth2_failed%]"
+  "oauth_failed": "[%key:common::config_flow::abort::oauth2_failed%]",
+  "connection_error": "[%key:common::config_flow::error::cannot_connect%]"
 },
 "create_entry": {
   "default": "[%key:common::config_flow::create_entry::authenticated%]"


@@ -6,5 +6,5 @@
   "documentation": "https://www.home-assistant.io/integrations/fireservicerota",
   "iot_class": "cloud_polling",
   "loggers": ["pyfireservicerota"],
-  "requirements": ["pyfireservicerota==0.0.43"]
+  "requirements": ["pyfireservicerota==0.0.46"]
 }


@@ -21,5 +21,5 @@
   "documentation": "https://www.home-assistant.io/integrations/frontend",
   "integration_type": "system",
   "quality_scale": "internal",
-  "requirements": ["home-assistant-frontend==20250204.0"]
+  "requirements": ["home-assistant-frontend==20250205.0"]
 }


@@ -38,6 +38,10 @@
     "local_name": "GV5126*",
     "connectable": false
   },
+  {
+    "local_name": "GV5179*",
+    "connectable": false
+  },
   {
     "local_name": "GVH5127*",
     "connectable": false
@@ -131,5 +135,5 @@
   "dependencies": ["bluetooth_adapters"],
   "documentation": "https://www.home-assistant.io/integrations/govee_ble",
   "iot_class": "local_push",
-  "requirements": ["govee-ble==0.42.0"]
+  "requirements": ["govee-ble==0.43.0"]
 }


@@ -6,5 +6,5 @@
   "documentation": "https://www.home-assistant.io/integrations/habitica",
   "iot_class": "cloud_polling",
   "loggers": ["habiticalib"],
-  "requirements": ["habiticalib==0.3.4"]
+  "requirements": ["habiticalib==0.3.5"]
 }


@@ -510,7 +510,8 @@ def async_setup_services(hass: HomeAssistant) -> None:  # noqa: C901
     or (task.notes and keyword in task.notes.lower())
     or any(keyword in item.text.lower() for item in task.checklist)
 ]
-result: dict[str, Any] = {"tasks": response}
+result: dict[str, Any] = {"tasks": [task.to_dict() for task in response]}
 return result
hass.services.async_register(
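The habitica change above serializes each task via to_dict() because a service response must contain plain JSON-serializable data, not library model instances. A sketch of the same pattern, with a hypothetical dataclass standing in for habiticalib's task model:

```python
from dataclasses import asdict, dataclass
from typing import Any


@dataclass
class Task:
    """Hypothetical stand-in for a habiticalib task model."""

    text: str
    notes: str


def build_response(tasks: list[Task]) -> dict[str, Any]:
    """Convert model objects to plain dicts before returning a service response."""
    return {"tasks": [asdict(task) for task in tasks]}
```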


@@ -20,6 +20,7 @@ from aiohasupervisor.models import (
backups as supervisor_backups,
mounts as supervisor_mounts,
)
from aiohasupervisor.models.backups import LOCATION_CLOUD_BACKUP, LOCATION_LOCAL_STORAGE
from homeassistant.components.backup import (
DATA_MANAGER,
@@ -56,8 +57,6 @@ from homeassistant.util.enum import try_parse_enum
from .const import DOMAIN, EVENT_SUPERVISOR_EVENT
from .handler import get_supervisor_client
LOCATION_CLOUD_BACKUP = ".cloud_backup"
LOCATION_LOCAL = ".local"
MOUNT_JOBS = ("mount_manager_create_mount", "mount_manager_remove_mount")
RESTORE_JOB_ID_ENV = "SUPERVISOR_RESTORE_JOB_ID"
# Set on backups automatically created when updating an addon
@@ -72,7 +71,9 @@ async def async_get_backup_agents(
"""Return the hassio backup agents."""
client = get_supervisor_client(hass)
mounts = await client.mounts.info()
agents: list[BackupAgent] = [SupervisorBackupAgent(hass, "local", None)]
agents: list[BackupAgent] = [
SupervisorBackupAgent(hass, "local", LOCATION_LOCAL_STORAGE)
]
for mount in mounts.mounts:
if mount.usage is not supervisor_mounts.MountUsage.BACKUP:
continue
@@ -112,7 +113,7 @@ def async_register_backup_agents_listener(
def _backup_details_to_agent_backup(
details: supervisor_backups.BackupComplete, location: str | None
details: supervisor_backups.BackupComplete, location: str
) -> AgentBackup:
"""Convert a supervisor backup details object to an agent backup."""
homeassistant_included = details.homeassistant is not None
@@ -125,7 +126,6 @@ def _backup_details_to_agent_backup(
for addon in details.addons
]
extra_metadata = details.extra or {}
location = location or LOCATION_LOCAL
return AgentBackup(
addons=addons,
backup_id=details.slug,
@@ -148,7 +148,7 @@ class SupervisorBackupAgent(BackupAgent):
domain = DOMAIN
def __init__(self, hass: HomeAssistant, name: str, location: str | None) -> None:
def __init__(self, hass: HomeAssistant, name: str, location: str) -> None:
"""Initialize the backup agent."""
super().__init__()
self._hass = hass
@@ -206,7 +206,7 @@ class SupervisorBackupAgent(BackupAgent):
backup_list = await self._client.backups.list()
result = []
for backup in backup_list:
if not backup.locations or self.location not in backup.locations:
if self.location not in backup.location_attributes:
continue
details = await self._client.backups.backup_info(backup.slug)
result.append(_backup_details_to_agent_backup(details, self.location))
@@ -222,7 +222,7 @@ class SupervisorBackupAgent(BackupAgent):
details = await self._client.backups.backup_info(backup_id)
except SupervisorNotFoundError:
return None
if self.location not in details.locations:
if self.location not in details.location_attributes:
return None
return _backup_details_to_agent_backup(details, self.location)
@@ -295,8 +295,8 @@ class SupervisorBackupReaderWriter(BackupReaderWriter):
# will be handled by async_upload_backup.
# If the lists are the same length, it does not matter which one we send,
# we send the encrypted list to have a well defined behavior.
encrypted_locations: list[str | None] = []
decrypted_locations: list[str | None] = []
encrypted_locations: list[str] = []
decrypted_locations: list[str] = []
agents_settings = manager.config.data.agents
for hassio_agent in hassio_agents:
if password is not None:
@@ -353,12 +353,12 @@ class SupervisorBackupReaderWriter(BackupReaderWriter):
eager_start=False, # To ensure the task is not started before we return
)
return (NewBackup(backup_job_id=backup.job_id), backup_task)
return (NewBackup(backup_job_id=backup.job_id.hex), backup_task)
async def _async_wait_for_backup(
self,
backup: supervisor_backups.NewBackup,
locations: list[str | None],
locations: list[str],
*,
on_progress: Callable[[CreateBackupEvent], None],
remove_after_upload: bool,
@@ -508,7 +508,7 @@ class SupervisorBackupReaderWriter(BackupReaderWriter):
else None
)
restore_location: str | None
restore_location: str
if manager.backup_agents[agent_id].domain != DOMAIN:
# Download the backup to the supervisor. Supervisor will clean up the backup
# two days after the restore is done.
@@ -577,10 +577,11 @@ class SupervisorBackupReaderWriter(BackupReaderWriter):
on_progress: Callable[[RestoreBackupEvent | IdleEvent], None],
) -> None:
"""Check restore status after core restart."""
if not (restore_job_id := os.environ.get(RESTORE_JOB_ID_ENV)):
if not (restore_job_str := os.environ.get(RESTORE_JOB_ID_ENV)):
_LOGGER.debug("No restore job ID found in environment")
return
restore_job_id = UUID(restore_job_str)
_LOGGER.debug("Found restore job ID %s in environment", restore_job_id)
sent_event = False
@@ -634,7 +635,7 @@ class SupervisorBackupReaderWriter(BackupReaderWriter):
@callback
def _async_listen_job_events(
self, job_id: str, on_event: Callable[[Mapping[str, Any]], None]
self, job_id: UUID, on_event: Callable[[Mapping[str, Any]], None]
) -> Callable[[], None]:
"""Listen for job events."""
@@ -649,7 +650,7 @@ class SupervisorBackupReaderWriter(BackupReaderWriter):
if (
data.get("event") != "job"
or not (event_data := data.get("data"))
or event_data.get("uuid") != job_id
or event_data.get("uuid") != job_id.hex
):
return
on_event(event_data)
@@ -660,10 +661,10 @@ class SupervisorBackupReaderWriter(BackupReaderWriter):
return unsub
async def _get_job_state(
self, job_id: str, on_event: Callable[[Mapping[str, Any]], None]
self, job_id: UUID, on_event: Callable[[Mapping[str, Any]], None]
) -> None:
"""Poll a job for its state."""
job = await self._client.jobs.get_job(UUID(job_id))
job = await self._client.jobs.get_job(job_id)
_LOGGER.debug("Job state: %s", job)
on_event(job.to_dict())


@@ -6,6 +6,6 @@
"documentation": "https://www.home-assistant.io/integrations/hassio",
"iot_class": "local_polling",
"quality_scale": "internal",
"requirements": ["aiohasupervisor==0.2.2b6"],
"requirements": ["aiohasupervisor==0.3.0"],
"single_config_entry": true
}


@@ -39,9 +39,22 @@ async def async_setup_entry(hass: HomeAssistant, entry: HeosConfigEntry) -> bool
):
for domain, player_id in device.identifiers:
if domain == DOMAIN and not isinstance(player_id, str):
device_registry.async_update_device( # type: ignore[unreachable]
device.id, new_identifiers={(DOMAIN, str(player_id))}
)
# Create set of identifiers excluding this integration
identifiers = { # type: ignore[unreachable]
(domain, identifier)
for domain, identifier in device.identifiers
if domain != DOMAIN
}
migrated_identifiers = {(DOMAIN, str(player_id))}
# Add migrated if not already present in another device, which occurs if the user downgraded and then upgraded
if not device_registry.async_get_device(migrated_identifiers):
identifiers.update(migrated_identifiers)
if len(identifiers) > 0:
device_registry.async_update_device(
device.id, new_identifiers=identifiers
)
else:
device_registry.async_remove_device(device.id)
break
coordinator = HeosCoordinator(hass, entry)
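The HEOS migration above can be sketched with plain sets standing in for the device registry: identifiers from other integrations are preserved, and the stringified player ID is added only when no other device already claims it.

```python
# Illustrative stand-ins for the device registry entries; DOMAIN and the
# player ID mirror the shape used by the integration.
DOMAIN = "heos"
existing = {(DOMAIN, 12345), ("other_domain", "abc")}
player_id = 12345

# Keep identifiers that belong to other integrations.
identifiers = {(d, i) for d, i in existing if d != DOMAIN}
migrated_identifiers = {(DOMAIN, str(player_id))}
claimed_elsewhere = False  # stands in for device_registry.async_get_device(...)
if not claimed_elsewhere:
    identifiers.update(migrated_identifiers)

print(identifiers)
```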


@@ -5,5 +5,5 @@
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/holiday",
"iot_class": "local_polling",
"requirements": ["holidays==0.65", "babel==2.15.0"]
"requirements": ["holidays==0.66", "babel==2.15.0"]
}


@@ -20,5 +20,5 @@
"documentation": "https://www.home-assistant.io/integrations/ld2410_ble",
"integration_type": "device",
"iot_class": "local_push",
"requirements": ["bluetooth-data-tools==1.23.3", "ld2410-ble==0.1.1"]
"requirements": ["bluetooth-data-tools==1.23.4", "ld2410-ble==0.1.1"]
}


@@ -35,5 +35,5 @@
"dependencies": ["bluetooth_adapters"],
"documentation": "https://www.home-assistant.io/integrations/led_ble",
"iot_class": "local_polling",
"requirements": ["bluetooth-data-tools==1.23.3", "led-ble==1.1.6"]
"requirements": ["bluetooth-data-tools==1.23.4", "led-ble==1.1.6"]
}


@@ -277,20 +277,6 @@ FOUR_GROUP_REMOTE_TRIGGER_SCHEMA = LUTRON_BUTTON_TRIGGER_SCHEMA.extend(
}
)
PADDLE_SWITCH_PICO_BUTTON_TYPES_TO_LIP = {
"button_0": 2,
"button_2": 4,
}
PADDLE_SWITCH_PICO_BUTTON_TYPES_TO_LEAP = {
"button_0": 0,
"button_2": 2,
}
PADDLE_SWITCH_PICO_TRIGGER_SCHEMA = LUTRON_BUTTON_TRIGGER_SCHEMA.extend(
{
vol.Required(CONF_SUBTYPE): vol.In(PADDLE_SWITCH_PICO_BUTTON_TYPES_TO_LIP),
}
)
DEVICE_TYPE_SCHEMA_MAP = {
"Pico2Button": PICO_2_BUTTON_TRIGGER_SCHEMA,
@@ -302,7 +288,6 @@ DEVICE_TYPE_SCHEMA_MAP = {
"Pico4ButtonZone": PICO_4_BUTTON_ZONE_TRIGGER_SCHEMA,
"Pico4Button2Group": PICO_4_BUTTON_2_GROUP_TRIGGER_SCHEMA,
"FourGroupRemote": FOUR_GROUP_REMOTE_TRIGGER_SCHEMA,
"PaddleSwitchPico": PADDLE_SWITCH_PICO_TRIGGER_SCHEMA,
}
DEVICE_TYPE_SUBTYPE_MAP_TO_LIP = {
@@ -315,7 +300,6 @@ DEVICE_TYPE_SUBTYPE_MAP_TO_LIP = {
"Pico4ButtonZone": PICO_4_BUTTON_ZONE_BUTTON_TYPES_TO_LIP,
"Pico4Button2Group": PICO_4_BUTTON_2_GROUP_BUTTON_TYPES_TO_LIP,
"FourGroupRemote": FOUR_GROUP_REMOTE_BUTTON_TYPES_TO_LIP,
"PaddleSwitchPico": PADDLE_SWITCH_PICO_BUTTON_TYPES_TO_LIP,
}
DEVICE_TYPE_SUBTYPE_MAP_TO_LEAP = {
@@ -328,7 +312,6 @@ DEVICE_TYPE_SUBTYPE_MAP_TO_LEAP = {
"Pico4ButtonZone": PICO_4_BUTTON_ZONE_BUTTON_TYPES_TO_LEAP,
"Pico4Button2Group": PICO_4_BUTTON_2_GROUP_BUTTON_TYPES_TO_LEAP,
"FourGroupRemote": FOUR_GROUP_REMOTE_BUTTON_TYPES_TO_LEAP,
"PaddleSwitchPico": PADDLE_SWITCH_PICO_BUTTON_TYPES_TO_LEAP,
}
LEAP_TO_DEVICE_TYPE_SUBTYPE_MAP: dict[str, dict[int, str]] = {
@@ -343,7 +326,6 @@ TRIGGER_SCHEMA = vol.Any(
PICO_4_BUTTON_ZONE_TRIGGER_SCHEMA,
PICO_4_BUTTON_2_GROUP_TRIGGER_SCHEMA,
FOUR_GROUP_REMOTE_TRIGGER_SCHEMA,
PADDLE_SWITCH_PICO_TRIGGER_SCHEMA,
)


@@ -105,10 +105,8 @@ class MillHeater(MillBaseEntity, ClimateEntity):
self, coordinator: MillDataUpdateCoordinator, device: mill.Heater
) -> None:
"""Initialize the thermostat."""
super().__init__(coordinator, device)
self._attr_unique_id = device.device_id
self._update_attr(device)
super().__init__(coordinator, device)
async def async_set_temperature(self, **kwargs: Any) -> None:
"""Set new target temperature."""


@@ -4,7 +4,7 @@ from __future__ import annotations
from abc import abstractmethod
from mill import Heater, MillDevice
from mill import MillDevice
from homeassistant.core import callback
from homeassistant.helpers.device_registry import DeviceInfo
@@ -45,7 +45,7 @@ class MillBaseEntity(CoordinatorEntity[MillDataUpdateCoordinator]):
@abstractmethod
@callback
def _update_attr(self, device: MillDevice | Heater) -> None:
def _update_attr(self, device: MillDevice) -> None:
"""Update the attribute of the entity."""
@property


@@ -2,7 +2,7 @@
from __future__ import annotations
from mill import MillDevice
from mill import Heater, MillDevice
from homeassistant.components.number import NumberDeviceClass, NumberEntity
from homeassistant.config_entries import ConfigEntry
@@ -27,6 +27,7 @@ async def async_setup_entry(
async_add_entities(
MillNumber(mill_data_coordinator, mill_device)
for mill_device in mill_data_coordinator.data.values()
if isinstance(mill_device, Heater)
)
@@ -45,9 +46,8 @@ class MillNumber(MillBaseEntity, NumberEntity):
mill_device: MillDevice,
) -> None:
"""Initialize the number."""
super().__init__(coordinator, mill_device)
self._attr_unique_id = f"{mill_device.device_id}_max_heating_power"
self._update_attr(mill_device)
super().__init__(coordinator, mill_device)
@callback
def _update_attr(self, device: MillDevice) -> None:


@@ -192,9 +192,9 @@ class MillSensor(MillBaseEntity, SensorEntity):
mill_device: mill.Socket | mill.Heater,
) -> None:
"""Initialize the sensor."""
super().__init__(coordinator, mill_device)
self.entity_description = entity_description
self._attr_unique_id = f"{mill_device.device_id}_{entity_description.key}"
super().__init__(coordinator, mill_device)
@callback
def _update_attr(self, device):


@@ -236,7 +236,7 @@ CONFIG_SCHEMA = vol.Schema(
MQTT_PUBLISH_SCHEMA = vol.Schema(
{
vol.Required(ATTR_TOPIC): valid_publish_topic,
vol.Required(ATTR_PAYLOAD): cv.string,
vol.Required(ATTR_PAYLOAD, default=None): vol.Any(cv.string, None),
vol.Optional(ATTR_EVALUATE_PAYLOAD): cv.boolean,
vol.Optional(ATTR_QOS, default=DEFAULT_QOS): valid_qos_schema,
vol.Optional(ATTR_RETAIN, default=DEFAULT_RETAIN): cv.boolean,
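The schema change above makes the payload optional with a `None` default, so omitting it publishes an empty message instead of failing validation. A stdlib-only sketch of the resulting behavior (the function and plain-dict validation stand in for the voluptuous schema):

```python
def validate_publish(data: dict) -> dict:
    """Hypothetical stand-in for the MQTT publish schema."""
    out = dict(data)
    # payload now defaults to None, which is published as an empty message.
    payload = out.setdefault("payload", None)
    if payload is not None and not isinstance(payload, str):
        raise ValueError("payload must be a string or omitted")
    return out

print(validate_publish({"topic": "t"}))
print(validate_publish({"topic": "t", "payload": "on"}))
```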


@@ -8,7 +8,6 @@ publish:
selector:
text:
payload:
required: true
example: "The temperature is {{ states('sensor.temperature') }}"
selector:
template:


@@ -246,11 +246,7 @@
},
"payload": {
"name": "Payload",
"description": "The payload to publish."
},
"payload_template": {
"name": "Payload template",
"description": "Template to render as a payload value. If a payload is provided, the template is ignored."
"description": "The payload to publish. Publishes an empty message if not provided."
},
"qos": {
"name": "QoS",


@@ -19,5 +19,5 @@
"documentation": "https://www.home-assistant.io/integrations/nest",
"iot_class": "cloud_push",
"loggers": ["google_nest_sdm"],
"requirements": ["google-nest-sdm==7.1.1"]
"requirements": ["google-nest-sdm==7.1.3"]
}


@@ -2,8 +2,12 @@
from __future__ import annotations
from collections.abc import Awaitable, Callable
from dataclasses import dataclass
from html import unescape
from json import dumps, loads
import logging
from typing import cast
from onedrive_personal_sdk import OneDriveClient
from onedrive_personal_sdk.exceptions import (
@@ -11,8 +15,10 @@ from onedrive_personal_sdk.exceptions import (
HttpRequestException,
OneDriveException,
)
from onedrive_personal_sdk.models.items import ItemUpdate
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_ACCESS_TOKEN
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers.aiohttp_client import async_get_clientsession
@@ -22,7 +28,6 @@ from homeassistant.helpers.config_entry_oauth2_flow import (
)
from homeassistant.helpers.instance_id import async_get as async_get_instance_id
from .api import OneDriveConfigEntryAccessTokenProvider
from .const import DATA_BACKUP_AGENT_LISTENERS, DOMAIN
@@ -31,7 +36,7 @@ class OneDriveRuntimeData:
"""Runtime data for the OneDrive integration."""
client: OneDriveClient
token_provider: OneDriveConfigEntryAccessTokenProvider
token_function: Callable[[], Awaitable[str]]
backup_folder_id: str
@@ -43,12 +48,13 @@ _LOGGER = logging.getLogger(__name__)
async def async_setup_entry(hass: HomeAssistant, entry: OneDriveConfigEntry) -> bool:
"""Set up OneDrive from a config entry."""
implementation = await async_get_config_entry_implementation(hass, entry)
session = OAuth2Session(hass, entry, implementation)
token_provider = OneDriveConfigEntryAccessTokenProvider(session)
async def get_access_token() -> str:
await session.async_ensure_token_valid()
return cast(str, session.token[CONF_ACCESS_TOKEN])
client = OneDriveClient(token_provider, async_get_clientsession(hass))
client = OneDriveClient(get_access_token, async_get_clientsession(hass))
# get approot, will be created automatically if it does not exist
try:
@@ -81,10 +87,18 @@ async def async_setup_entry(hass: HomeAssistant, entry: OneDriveConfigEntry) ->
entry.runtime_data = OneDriveRuntimeData(
client=client,
token_provider=token_provider,
token_function=get_access_token,
backup_folder_id=backup_folder.id,
)
try:
await _migrate_backup_files(client, backup_folder.id)
except OneDriveException as err:
raise ConfigEntryNotReady(
translation_domain=DOMAIN,
translation_key="failed_to_migrate_files",
) from err
_async_notify_backup_listeners_soon(hass)
return True
@@ -104,3 +118,34 @@ def _async_notify_backup_listeners(hass: HomeAssistant) -> None:
@callback
def _async_notify_backup_listeners_soon(hass: HomeAssistant) -> None:
hass.loop.call_soon(_async_notify_backup_listeners, hass)
async def _migrate_backup_files(client: OneDriveClient, backup_folder_id: str) -> None:
"""Migrate backup files to metadata version 2."""
files = await client.list_drive_items(backup_folder_id)
for file in files:
if file.description and '"metadata_version": 1' in (
metadata_json := unescape(file.description)
):
metadata = loads(metadata_json)
del metadata["metadata_version"]
metadata_filename = file.name.rsplit(".", 1)[0] + ".metadata.json"
metadata_file = await client.upload_file(
backup_folder_id,
metadata_filename,
dumps(metadata), # type: ignore[arg-type]
)
metadata_description = {
"metadata_version": 2,
"backup_id": metadata["backup_id"],
"backup_file_id": file.id,
}
await client.update_drive_item(
path_or_id=metadata_file.id,
data=ItemUpdate(description=dumps(metadata_description)),
)
await client.update_drive_item(
path_or_id=file.id,
data=ItemUpdate(description=""),
)
_LOGGER.debug("Migrated backup file %s", file.name)


@@ -1,34 +0,0 @@
"""API for OneDrive bound to Home Assistant OAuth."""
from typing import cast
from onedrive_personal_sdk import TokenProvider
from homeassistant.const import CONF_ACCESS_TOKEN
from homeassistant.helpers import config_entry_oauth2_flow
class OneDriveConfigFlowAccessTokenProvider(TokenProvider):
"""Provide OneDrive authentication tied to an OAuth2 based config entry."""
def __init__(self, token: str) -> None:
"""Initialize OneDrive auth."""
super().__init__()
self._token = token
def async_get_access_token(self) -> str:
"""Return a valid access token."""
return self._token
class OneDriveConfigEntryAccessTokenProvider(TokenProvider):
"""Provide OneDrive authentication tied to an OAuth2 based config entry."""
def __init__(self, oauth_session: config_entry_oauth2_flow.OAuth2Session) -> None:
"""Initialize OneDrive auth."""
super().__init__()
self._oauth_session = oauth_session
def async_get_access_token(self) -> str:
"""Return a valid access token."""
return cast(str, self._oauth_session.token[CONF_ACCESS_TOKEN])


@@ -4,8 +4,8 @@ from __future__ import annotations
from collections.abc import AsyncIterator, Callable, Coroutine
from functools import wraps
import html
import json
from html import unescape
from json import dumps, loads
import logging
from typing import Any, Concatenate
@@ -34,6 +34,7 @@ from .const import DATA_BACKUP_AGENT_LISTENERS, DOMAIN
_LOGGER = logging.getLogger(__name__)
UPLOAD_CHUNK_SIZE = 16 * 320 * 1024 # 5.2MB
TIMEOUT = ClientTimeout(connect=10, total=43200) # 12 hours
METADATA_VERSION = 2
async def async_get_backup_agents(
@@ -109,7 +110,7 @@ class OneDriveBackupAgent(BackupAgent):
self._hass = hass
self._entry = entry
self._client = entry.runtime_data.client
self._token_provider = entry.runtime_data.token_provider
self._token_function = entry.runtime_data.token_function
self._folder_id = entry.runtime_data.backup_folder_id
self.name = entry.title
assert entry.unique_id
@@ -120,11 +121,19 @@ class OneDriveBackupAgent(BackupAgent):
self, backup_id: str, **kwargs: Any
) -> AsyncIterator[bytes]:
"""Download a backup file."""
item = await self._find_item_by_backup_id(backup_id)
if item is None:
metadata_item = await self._find_item_by_backup_id(backup_id)
if (
metadata_item is None
or metadata_item.description is None
or "backup_file_id" not in metadata_item.description
):
raise BackupAgentError("Backup not found")
stream = await self._client.download_drive_item(item.id, timeout=TIMEOUT)
metadata_info = loads(unescape(metadata_item.description))
stream = await self._client.download_drive_item(
metadata_info["backup_file_id"], timeout=TIMEOUT
)
return stream.iter_chunked(1024)
@handle_backup_errors
@@ -136,31 +145,41 @@ class OneDriveBackupAgent(BackupAgent):
**kwargs: Any,
) -> None:
"""Upload a backup."""
filename = suggested_filename(backup)
file = FileInfo(
suggested_filename(backup),
filename,
backup.size,
self._folder_id,
await open_stream(),
)
try:
item = await LargeFileUploadClient.upload(
self._token_provider, file, session=async_get_clientsession(self._hass)
backup_file = await LargeFileUploadClient.upload(
self._token_function, file, session=async_get_clientsession(self._hass)
)
except HashMismatchError as err:
raise BackupAgentError(
"Hash validation failed, backup file might be corrupt"
) from err
# store metadata in description
backup_dict = backup.as_dict()
backup_dict["metadata_version"] = 1 # version of the backup metadata
description = json.dumps(backup_dict)
# store metadata in metadata file
description = dumps(backup.as_dict())
_LOGGER.debug("Creating metadata: %s", description)
metadata_filename = filename.rsplit(".", 1)[0] + ".metadata.json"
metadata_file = await self._client.upload_file(
self._folder_id,
metadata_filename,
description, # type: ignore[arg-type]
)
# add metadata to the metadata file
metadata_description = {
"metadata_version": METADATA_VERSION,
"backup_id": backup.backup_id,
"backup_file_id": backup_file.id,
}
await self._client.update_drive_item(
path_or_id=item.id,
data=ItemUpdate(description=description),
path_or_id=metadata_file.id,
data=ItemUpdate(description=dumps(metadata_description)),
)
@handle_backup_errors
@@ -170,18 +189,28 @@ class OneDriveBackupAgent(BackupAgent):
**kwargs: Any,
) -> None:
"""Delete a backup file."""
item = await self._find_item_by_backup_id(backup_id)
if item is None:
metadata_item = await self._find_item_by_backup_id(backup_id)
if (
metadata_item is None
or metadata_item.description is None
or "backup_file_id" not in metadata_item.description
):
return
await self._client.delete_drive_item(item.id)
metadata_info = loads(unescape(metadata_item.description))
await self._client.delete_drive_item(metadata_info["backup_file_id"])
await self._client.delete_drive_item(metadata_item.id)
@handle_backup_errors
async def async_list_backups(self, **kwargs: Any) -> list[AgentBackup]:
"""List backups."""
items = await self._client.list_drive_items(self._folder_id)
return [
self._backup_from_description(item.description)
for item in await self._client.list_drive_items(self._folder_id)
if item.description and "homeassistant_version" in item.description
await self._download_backup_metadata(item.id)
for item in items
if item.description
and "backup_id" in item.description
and f'"metadata_version": {METADATA_VERSION}' in unescape(item.description)
]
@handle_backup_errors
@@ -189,19 +218,11 @@ class OneDriveBackupAgent(BackupAgent):
self, backup_id: str, **kwargs: Any
) -> AgentBackup | None:
"""Return a backup."""
item = await self._find_item_by_backup_id(backup_id)
return (
self._backup_from_description(item.description)
if item and item.description
else None
)
metadata_file = await self._find_item_by_backup_id(backup_id)
if metadata_file is None or metadata_file.description is None:
return None
def _backup_from_description(self, description: str) -> AgentBackup:
"""Create a backup object from a description."""
description = html.unescape(
description
) # OneDrive encodes the description on save automatically
return AgentBackup.from_dict(json.loads(description))
return await self._download_backup_metadata(metadata_file.id)
async def _find_item_by_backup_id(self, backup_id: str) -> File | Folder | None:
"""Find an item by backup ID."""
@@ -209,7 +230,15 @@ class OneDriveBackupAgent(BackupAgent):
(
item
for item in await self._client.list_drive_items(self._folder_id)
if item.description and backup_id in item.description
if item.description
and backup_id in item.description
and f'"metadata_version": {METADATA_VERSION}'
in unescape(item.description)
),
None,
)
async def _download_backup_metadata(self, item_id: str) -> AgentBackup:
metadata_stream = await self._client.download_drive_item(item_id)
metadata_json = loads(await metadata_stream.read())
return AgentBackup.from_dict(metadata_json)
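Both the migration and upload paths above derive the sidecar name with the same `rsplit`, replacing only the final extension with `.metadata.json` so dots elsewhere in the filename survive:

```python
# Illustrative filename; only the last extension is swapped out.
filename = "backup_2025.2.1.tar"
metadata_filename = filename.rsplit(".", 1)[0] + ".metadata.json"
print(metadata_filename)
```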


@@ -12,7 +12,6 @@ from homeassistant.const import CONF_ACCESS_TOKEN, CONF_TOKEN
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.config_entry_oauth2_flow import AbstractOAuth2FlowHandler
from .api import OneDriveConfigFlowAccessTokenProvider
from .const import DOMAIN, OAUTH_SCOPES
@@ -36,12 +35,12 @@ class OneDriveConfigFlow(AbstractOAuth2FlowHandler, domain=DOMAIN):
data: dict[str, Any],
) -> ConfigFlowResult:
"""Handle the initial step."""
token_provider = OneDriveConfigFlowAccessTokenProvider(
cast(str, data[CONF_TOKEN][CONF_ACCESS_TOKEN])
)
async def get_access_token() -> str:
return cast(str, data[CONF_TOKEN][CONF_ACCESS_TOKEN])
graph_client = OneDriveClient(
token_provider, async_get_clientsession(self.hass)
get_access_token, async_get_clientsession(self.hass)
)
try:


@@ -9,5 +9,5 @@
"iot_class": "cloud_polling",
"loggers": ["onedrive_personal_sdk"],
"quality_scale": "bronze",
"requirements": ["onedrive-personal-sdk==0.0.4"]
"requirements": ["onedrive-personal-sdk==0.0.8"]
}


@@ -35,6 +35,9 @@
},
"failed_to_get_folder": {
"message": "Failed to get {folder} folder"
},
"failed_to_migrate_files": {
"message": "Failed to migrate metadata to separate files"
}
}
}


@@ -27,7 +27,7 @@ REGISTERED_NOTIFICATIONS = (
JSON_PAYLOAD = (
'"{\\"notification_type\\":\\"{{notification_type}}\\",\\"subject\\":\\"{{subject}'
'}\\",\\"message\\":\\"{{message}}\\",\\"image\\":\\"{{image}}\\",\\"{{media}}\\":'
'{\\"media_type\\":\\"{{media_type}}\\",\\"tmdb_idd\\":\\"{{media_tmdbid}}\\",\\"t'
'{\\"media_type\\":\\"{{media_type}}\\",\\"tmdb_id\\":\\"{{media_tmdbid}}\\",\\"t'
'vdb_id\\":\\"{{media_tvdbid}}\\",\\"status\\":\\"{{media_status}}\\",\\"status4k'
'\\":\\"{{media_status4k}}\\"},\\"{{request}}\\":{\\"request_id\\":\\"{{request_id'
'}}\\",\\"requested_by_email\\":\\"{{requestedBy_email}}\\",\\"requested_by_userna'


@@ -6,5 +6,5 @@
"dependencies": ["bluetooth_adapters"],
"documentation": "https://www.home-assistant.io/integrations/private_ble_device",
"iot_class": "local_push",
"requirements": ["bluetooth-data-tools==1.23.3"]
"requirements": ["bluetooth-data-tools==1.23.4"]
}


@@ -19,5 +19,5 @@
"iot_class": "local_push",
"loggers": ["reolink_aio"],
"quality_scale": "platinum",
"requirements": ["reolink-aio==0.11.8"]
"requirements": ["reolink-aio==0.11.10"]
}


@@ -424,6 +424,7 @@ NUMBER_ENTITIES = (
ReolinkNumberEntityDescription(
key="image_brightness",
cmd_key="GetImage",
cmd_id=26,
translation_key="image_brightness",
entity_category=EntityCategory.CONFIG,
entity_registry_enabled_default=False,
@@ -437,6 +438,7 @@ NUMBER_ENTITIES = (
ReolinkNumberEntityDescription(
key="image_contrast",
cmd_key="GetImage",
cmd_id=26,
translation_key="image_contrast",
entity_category=EntityCategory.CONFIG,
entity_registry_enabled_default=False,
@@ -450,6 +452,7 @@ NUMBER_ENTITIES = (
ReolinkNumberEntityDescription(
key="image_saturation",
cmd_key="GetImage",
cmd_id=26,
translation_key="image_saturation",
entity_category=EntityCategory.CONFIG,
entity_registry_enabled_default=False,
@@ -463,6 +466,7 @@ NUMBER_ENTITIES = (
ReolinkNumberEntityDescription(
key="image_sharpness",
cmd_key="GetImage",
cmd_id=26,
translation_key="image_sharpness",
entity_category=EntityCategory.CONFIG,
entity_registry_enabled_default=False,
@@ -476,6 +480,7 @@ NUMBER_ENTITIES = (
ReolinkNumberEntityDescription(
key="image_hue",
cmd_key="GetImage",
cmd_id=26,
translation_key="image_hue",
entity_category=EntityCategory.CONFIG,
entity_registry_enabled_default=False,


@@ -80,6 +80,7 @@ SELECT_ENTITIES = (
ReolinkSelectEntityDescription(
key="day_night_mode",
cmd_key="GetIsp",
cmd_id=26,
translation_key="day_night_mode",
entity_category=EntityCategory.CONFIG,
get_options=[mode.name for mode in DayNightEnum],


@@ -8,7 +8,7 @@
"integration_type": "device",
"iot_class": "local_push",
"loggers": ["aioshelly"],
"requirements": ["aioshelly==12.3.2"],
"requirements": ["aioshelly==12.4.1"],
"zeroconf": [
{
"type": "_http._tcp.local.",


@@ -139,6 +139,24 @@ class RpcBluTrvNumber(RpcNumber):
)
class RpcBluTrvExtTempNumber(RpcBluTrvNumber):
"""Represent a RPC BluTrv External Temperature number."""
_reported_value: float | None = None
@property
def native_value(self) -> float | None:
"""Return value of number."""
return self._reported_value
async def async_set_native_value(self, value: float) -> None:
"""Change the value."""
await super().async_set_native_value(value)
self._reported_value = value
self.async_write_ha_state()
NUMBERS: dict[tuple[str, str], BlockNumberDescription] = {
("device", "valvePos"): BlockNumberDescription(
key="device|valvepos",
@@ -175,7 +193,7 @@ RPC_NUMBERS: Final = {
"method": "Trv.SetExternalTemperature",
"params": {"id": 0, "t_C": value},
},
entity_class=RpcBluTrvNumber,
entity_class=RpcBluTrvExtTempNumber,
),
"number": RpcNumberDescription(
key="number",


@@ -175,6 +175,7 @@ BASE_SERVICE_SCHEMA = vol.Schema(
vol.Optional(ATTR_KEYBOARD_INLINE): cv.ensure_list,
vol.Optional(ATTR_TIMEOUT): cv.positive_int,
vol.Optional(ATTR_MESSAGE_TAG): cv.string,
vol.Optional(ATTR_MESSAGE_THREAD_ID): vol.Coerce(int),
},
extra=vol.ALLOW_EXTRA,
)
@@ -216,6 +217,7 @@ SERVICE_SCHEMA_SEND_POLL = vol.Schema(
vol.Optional(ATTR_ALLOWS_MULTIPLE_ANSWERS, default=False): cv.boolean,
vol.Optional(ATTR_DISABLE_NOTIF): cv.boolean,
vol.Optional(ATTR_TIMEOUT): cv.positive_int,
vol.Optional(ATTR_MESSAGE_THREAD_ID): vol.Coerce(int),
}
)


@@ -125,7 +125,7 @@ def cmd[_R, **_P](
self: LgWebOSMediaPlayerEntity, *args: _P.args, **kwargs: _P.kwargs
) -> _R:
"""Wrap all command methods."""
if self.state is MediaPlayerState.OFF:
if self.state is MediaPlayerState.OFF and func.__name__ != "async_turn_off":
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="device_off",


@@ -7,5 +7,5 @@
"iot_class": "local_polling",
"loggers": ["holidays"],
"quality_scale": "internal",
"requirements": ["holidays==0.65"]
"requirements": ["holidays==0.66"]
}


@@ -113,9 +113,14 @@ async def list_serial_ports(hass: HomeAssistant) -> list[ListPortInfo]:
except HomeAssistantError:
pass
else:
yellow_radio = next(p for p in ports if p.device == "/dev/ttyAMA1")
yellow_radio.description = "Yellow Zigbee module"
yellow_radio.manufacturer = "Nabu Casa"
# PySerial does not properly handle the Yellow's serial port with the CM5
# so we manually include it
port = ListPortInfo(device="/dev/ttyAMA1", skip_link_detection=True)
port.description = "Yellow Zigbee module"
port.manufacturer = "Nabu Casa"
ports = [p for p in ports if not p.device.startswith("/dev/ttyAMA")]
ports.insert(0, port)
if is_hassio(hass):
# Present the multi-PAN addon as a setup option, if it's available
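The ZHA fix above drops any auto-detected `/dev/ttyAMA*` entries and inserts the known Yellow radio port manually at the front of the list. A sketch of that filtering step, with `SimpleNamespace` standing in for pyserial's `ListPortInfo` (device paths illustrative):

```python
from types import SimpleNamespace

ports = [SimpleNamespace(device="/dev/ttyAMA10"),
         SimpleNamespace(device="/dev/ttyUSB0")]

# Manually described Yellow radio, since pyserial misreports it on the CM5.
yellow = SimpleNamespace(device="/dev/ttyAMA1",
                         description="Yellow Zigbee module",
                         manufacturer="Nabu Casa")

ports = [p for p in ports if not p.device.startswith("/dev/ttyAMA")]
ports.insert(0, yellow)
print([p.device for p in ports])
```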


@@ -21,7 +21,7 @@
"zha",
"universal_silabs_flasher"
],
"requirements": ["zha==0.0.47"],
"requirements": ["zha==0.0.48"],
"usb": [
{
"vid": "10C4",


@@ -25,7 +25,7 @@ if TYPE_CHECKING:
APPLICATION_NAME: Final = "HomeAssistant"
MAJOR_VERSION: Final = 2025
MINOR_VERSION: Final = 2
PATCH_VERSION: Final = "0b10"
PATCH_VERSION: Final = "1"
__short_version__: Final = f"{MAJOR_VERSION}.{MINOR_VERSION}"
__version__: Final = f"{__short_version__}.{PATCH_VERSION}"
REQUIRED_PYTHON_VER: Final[tuple[int, int, int]] = (3, 13, 0)


@@ -187,6 +187,11 @@ BLUETOOTH: Final[list[dict[str, bool | str | int | list[int]]]] = [
"domain": "govee_ble",
"local_name": "GV5126*",
},
{
"connectable": False,
"domain": "govee_ble",
"local_name": "GV5179*",
},
{
"connectable": False,
"domain": "govee_ble",


@@ -15,7 +15,7 @@ import aiohttp
from aiohttp import web
from aiohttp.hdrs import CONTENT_TYPE, USER_AGENT
from aiohttp.web_exceptions import HTTPBadGateway, HTTPGatewayTimeout
from aiohttp_asyncmdnsresolver.api import AsyncMDNSResolver
from aiohttp_asyncmdnsresolver.api import AsyncDualMDNSResolver
from homeassistant import config_entries
from homeassistant.components import zeroconf
@@ -377,5 +377,5 @@ def _async_get_connector(
@callback
def _async_make_resolver(hass: HomeAssistant) -> AsyncMDNSResolver:
return AsyncMDNSResolver(async_zeroconf=zeroconf.async_get_async_zeroconf(hass))
def _async_make_resolver(hass: HomeAssistant) -> AsyncDualMDNSResolver:
return AsyncDualMDNSResolver(async_zeroconf=zeroconf.async_get_async_zeroconf(hass))


@@ -3,10 +3,10 @@
aiodhcpwatcher==1.0.3
aiodiscover==2.1.0
aiodns==3.2.0
aiohasupervisor==0.2.2b6
aiohttp-asyncmdnsresolver==0.0.3
aiohasupervisor==0.3.0
aiohttp-asyncmdnsresolver==0.1.0
aiohttp-fast-zlib==0.2.0
aiohttp==3.11.11
aiohttp==3.11.12
aiohttp_cors==0.7.0
aiousbwatcher==1.1.1
aiozoneinfo==0.2.1
@@ -19,26 +19,26 @@ audioop-lts==0.2.1;python_version>='3.13'
av==13.1.0
awesomeversion==24.6.0
bcrypt==4.2.0
bleak-retry-connector==3.8.0
bleak-retry-connector==3.8.1
bleak==0.22.3
bluetooth-adapters==0.21.1
bluetooth-adapters==0.21.4
bluetooth-auto-recovery==1.4.2
bluetooth-data-tools==1.23.3
bluetooth-data-tools==1.23.4
cached-ipaddress==0.8.0
certifi>=2021.5.30
ciso8601==2.3.2
cronsim==2.6
cryptography==44.0.0
dbus-fast==2.32.0
dbus-fast==2.33.0
fnv-hash-fast==1.2.2
go2rtc-client==0.1.2
ha-ffmpeg==3.2.2
habluetooth==3.21.0
habluetooth==3.21.1
hass-nabucasa==0.88.1
hassil==2.2.0
hassil==2.2.3
home-assistant-bluetooth==1.13.0
home-assistant-frontend==20250204.0
home-assistant-intents==2025.1.28
home-assistant-frontend==20250205.0
home-assistant-intents==2025.2.5
httpx==0.28.1
ifaddr==0.2.0
Jinja2==3.1.5


@@ -132,7 +132,13 @@ def async_set_domains_to_be_loaded(hass: core.HomeAssistant, domains: set[str])
- Keep track of domains which will load but have not yet finished loading
"""
setup_done_futures = hass.data.setdefault(DATA_SETUP_DONE, {})
setup_done_futures.update({domain: hass.loop.create_future() for domain in domains})
setup_futures = hass.data.setdefault(DATA_SETUP, {})
old_domains = set(setup_futures) | set(setup_done_futures) | hass.config.components
if overlap := old_domains & domains:
_LOGGER.debug("Domains to be loaded %s already loaded or pending", overlap)
setup_done_futures.update(
{domain: hass.loop.create_future() for domain in domains - old_domains}
)
def setup_component(hass: core.HomeAssistant, domain: str, config: ConfigType) -> bool:
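The guard above computes the set of already-loaded or pending domains and creates futures only for the remainder, so an in-flight future is never overwritten. With plain dicts and sets standing in for the hass internals:

```python
# Stand-ins for hass.data setup futures and hass.config.components.
setup_done_futures = {"light": "pending-future"}
components = {"http"}
domains = {"light", "mqtt"}

old_domains = set(setup_done_futures) | components
overlap = old_domains & domains  # {'light'}: previously clobbered, now skipped
setup_done_futures.update({d: f"future-{d}" for d in domains - old_domains})

print(sorted(setup_done_futures))
```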


@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"
[project]
name = "homeassistant"
version = "2025.2.0b10"
version = "2025.2.1"
license = {text = "Apache-2.0"}
description = "Open-source home automation platform running on Python 3."
readme = "README.rst"
@@ -27,11 +27,11 @@ dependencies = [
# Integrations may depend on hassio integration without listing it to
# change behavior based on presence of supervisor. Deprecated with #127228
# Lib can be removed with 2025.11
"aiohasupervisor==0.2.2b6",
"aiohttp==3.11.11",
"aiohasupervisor==0.3.0",
"aiohttp==3.11.12",
"aiohttp_cors==0.7.0",
"aiohttp-fast-zlib==0.2.0",
"aiohttp-asyncmdnsresolver==0.0.3",
"aiohttp-asyncmdnsresolver==0.1.0",
"aiozoneinfo==0.2.1",
"astral==2.2",
"async-interrupt==1.2.0",

requirements.txt generated

@@ -4,11 +4,11 @@
# Home Assistant Core
aiodns==3.2.0
aiohasupervisor==0.2.2b6
aiohttp==3.11.11
aiohasupervisor==0.3.0
aiohttp==3.11.12
aiohttp_cors==0.7.0
aiohttp-fast-zlib==0.2.0
aiohttp-asyncmdnsresolver==0.0.3
aiohttp-asyncmdnsresolver==0.1.0
aiozoneinfo==0.2.1
astral==2.2
async-interrupt==1.2.0

requirements_all.txt generated

@@ -261,7 +261,7 @@ aioguardian==2022.07.0
 aioharmony==0.4.1
 # homeassistant.components.hassio
-aiohasupervisor==0.2.2b6
+aiohasupervisor==0.3.0
 # homeassistant.components.homekit_controller
 aiohomekit==3.2.7
@@ -368,7 +368,7 @@ aioruuvigateway==0.1.0
 aiosenz==1.0.0
 # homeassistant.components.shelly
-aioshelly==12.3.2
+aioshelly==12.4.1
 # homeassistant.components.skybell
 aioskybell==22.7.0
@@ -597,7 +597,7 @@ bizkaibus==0.1.1
 bleak-esphome==2.7.0
 # homeassistant.components.bluetooth
-bleak-retry-connector==3.8.0
+bleak-retry-connector==3.8.1
 # homeassistant.components.bluetooth
 bleak==0.22.3
@@ -622,7 +622,7 @@ bluemaestro-ble==0.2.3
 # bluepy==1.3.0
 # homeassistant.components.bluetooth
-bluetooth-adapters==0.21.1
+bluetooth-adapters==0.21.4
 # homeassistant.components.bluetooth
 bluetooth-auto-recovery==1.4.2
@@ -631,7 +631,7 @@ bluetooth-auto-recovery==1.4.2
 # homeassistant.components.ld2410_ble
 # homeassistant.components.led_ble
 # homeassistant.components.private_ble_device
-bluetooth-data-tools==1.23.3
+bluetooth-data-tools==1.23.4
 # homeassistant.components.bond
 bond-async==0.2.1
@@ -735,7 +735,7 @@ datadog==0.15.0
 datapoint==0.9.9
 # homeassistant.components.bluetooth
-dbus-fast==2.32.0
+dbus-fast==2.33.0
 # homeassistant.components.debugpy
 debugpy==1.8.11
@@ -818,10 +818,10 @@ ebusdpy==0.0.17
 ecoaliface==0.4.0
 # homeassistant.components.eheimdigital
-eheimdigital==1.0.5
+eheimdigital==1.0.6
 # homeassistant.components.electric_kiwi
-electrickiwi-api==0.8.5
+electrickiwi-api==0.9.14
 # homeassistant.components.elevenlabs
 elevenlabs==1.9.0
@@ -1033,7 +1033,7 @@ google-cloud-texttospeech==2.17.2
 google-generativeai==0.8.2
 # homeassistant.components.nest
-google-nest-sdm==7.1.1
+google-nest-sdm==7.1.3
 # homeassistant.components.google_photos
 google-photos-library-api==0.12.1
@@ -1049,7 +1049,7 @@ goslide-api==0.7.0
 gotailwind==0.3.0
 # homeassistant.components.govee_ble
-govee-ble==0.42.0
+govee-ble==0.43.0
 # homeassistant.components.govee_light_local
 govee-local-api==1.5.3
@@ -1097,10 +1097,10 @@ ha-iotawattpy==0.1.2
 ha-philipsjs==3.2.2
 # homeassistant.components.habitica
-habiticalib==0.3.4
+habiticalib==0.3.5
 # homeassistant.components.bluetooth
-habluetooth==3.21.0
+habluetooth==3.21.1
 # homeassistant.components.cloud
 hass-nabucasa==0.88.1
@@ -1109,7 +1109,7 @@ hass-nabucasa==0.88.1
 hass-splunk==0.1.1
 # homeassistant.components.conversation
-hassil==2.2.0
+hassil==2.2.3
 # homeassistant.components.jewish_calendar
 hdate==0.11.1
@@ -1140,13 +1140,13 @@ hole==0.8.0
 # homeassistant.components.holiday
 # homeassistant.components.workday
-holidays==0.65
+holidays==0.66
 # homeassistant.components.frontend
-home-assistant-frontend==20250204.0
+home-assistant-frontend==20250205.0
 # homeassistant.components.conversation
-home-assistant-intents==2025.1.28
+home-assistant-intents==2025.2.5
 # homeassistant.components.home_connect
 homeconnect==0.8.0
@@ -1556,7 +1556,7 @@ omnilogic==0.4.5
 ondilo==0.5.0
 # homeassistant.components.onedrive
-onedrive-personal-sdk==0.0.4
+onedrive-personal-sdk==0.0.8
 # homeassistant.components.onvif
 onvif-zeep-async==3.2.5
@@ -1954,7 +1954,7 @@ pyfibaro==0.8.0
 pyfido==2.1.2
 # homeassistant.components.fireservicerota
-pyfireservicerota==0.0.43
+pyfireservicerota==0.0.46
 # homeassistant.components.flic
 pyflic==2.0.4
@@ -2603,7 +2603,7 @@ renault-api==0.2.9
 renson-endura-delta==1.7.2
 # homeassistant.components.reolink
-reolink-aio==0.11.8
+reolink-aio==0.11.10
 # homeassistant.components.idteck_prox
 rfk101py==0.0.1
@@ -3131,7 +3131,7 @@ zeroconf==0.143.0
 zeversolar==0.3.2
 # homeassistant.components.zha
-zha==0.0.47
+zha==0.0.48
 # homeassistant.components.zhong_hong
 zhong-hong-hvac==1.0.13


@@ -246,7 +246,7 @@ aioguardian==2022.07.0
 aioharmony==0.4.1
 # homeassistant.components.hassio
-aiohasupervisor==0.2.2b6
+aiohasupervisor==0.3.0
 # homeassistant.components.homekit_controller
 aiohomekit==3.2.7
@@ -350,7 +350,7 @@ aioruuvigateway==0.1.0
 aiosenz==1.0.0
 # homeassistant.components.shelly
-aioshelly==12.3.2
+aioshelly==12.4.1
 # homeassistant.components.skybell
 aioskybell==22.7.0
@@ -528,7 +528,7 @@ bimmer-connected[china]==0.17.2
 bleak-esphome==2.7.0
 # homeassistant.components.bluetooth
-bleak-retry-connector==3.8.0
+bleak-retry-connector==3.8.1
 # homeassistant.components.bluetooth
 bleak==0.22.3
@@ -546,7 +546,7 @@ bluecurrent-api==1.2.3
 bluemaestro-ble==0.2.3
 # homeassistant.components.bluetooth
-bluetooth-adapters==0.21.1
+bluetooth-adapters==0.21.4
 # homeassistant.components.bluetooth
 bluetooth-auto-recovery==1.4.2
@@ -555,7 +555,7 @@ bluetooth-auto-recovery==1.4.2
 # homeassistant.components.ld2410_ble
 # homeassistant.components.led_ble
 # homeassistant.components.private_ble_device
-bluetooth-data-tools==1.23.3
+bluetooth-data-tools==1.23.4
 # homeassistant.components.bond
 bond-async==0.2.1
@@ -631,7 +631,7 @@ datadog==0.15.0
 datapoint==0.9.9
 # homeassistant.components.bluetooth
-dbus-fast==2.32.0
+dbus-fast==2.33.0
 # homeassistant.components.debugpy
 debugpy==1.8.11
@@ -696,10 +696,10 @@ eagle100==0.1.1
 easyenergy==2.1.2
 # homeassistant.components.eheimdigital
-eheimdigital==1.0.5
+eheimdigital==1.0.6
 # homeassistant.components.electric_kiwi
-electrickiwi-api==0.8.5
+electrickiwi-api==0.9.14
 # homeassistant.components.elevenlabs
 elevenlabs==1.9.0
@@ -883,7 +883,7 @@ google-cloud-texttospeech==2.17.2
 google-generativeai==0.8.2
 # homeassistant.components.nest
-google-nest-sdm==7.1.1
+google-nest-sdm==7.1.3
 # homeassistant.components.google_photos
 google-photos-library-api==0.12.1
@@ -899,7 +899,7 @@ goslide-api==0.7.0
 gotailwind==0.3.0
 # homeassistant.components.govee_ble
-govee-ble==0.42.0
+govee-ble==0.43.0
 # homeassistant.components.govee_light_local
 govee-local-api==1.5.3
@@ -938,16 +938,16 @@ ha-iotawattpy==0.1.2
 ha-philipsjs==3.2.2
 # homeassistant.components.habitica
-habiticalib==0.3.4
+habiticalib==0.3.5
 # homeassistant.components.bluetooth
-habluetooth==3.21.0
+habluetooth==3.21.1
 # homeassistant.components.cloud
 hass-nabucasa==0.88.1
 # homeassistant.components.conversation
-hassil==2.2.0
+hassil==2.2.3
 # homeassistant.components.jewish_calendar
 hdate==0.11.1
@@ -969,13 +969,13 @@ hole==0.8.0
 # homeassistant.components.holiday
 # homeassistant.components.workday
-holidays==0.65
+holidays==0.66
 # homeassistant.components.frontend
-home-assistant-frontend==20250204.0
+home-assistant-frontend==20250205.0
 # homeassistant.components.conversation
-home-assistant-intents==2025.1.28
+home-assistant-intents==2025.2.5
 # homeassistant.components.home_connect
 homeconnect==0.8.0
@@ -1304,7 +1304,7 @@ omnilogic==0.4.5
 ondilo==0.5.0
 # homeassistant.components.onedrive
-onedrive-personal-sdk==0.0.4
+onedrive-personal-sdk==0.0.8
 # homeassistant.components.onvif
 onvif-zeep-async==3.2.5
@@ -1592,7 +1592,7 @@ pyfibaro==0.8.0
 pyfido==2.1.2
 # homeassistant.components.fireservicerota
-pyfireservicerota==0.0.43
+pyfireservicerota==0.0.46
 # homeassistant.components.flic
 pyflic==2.0.4
@@ -2106,7 +2106,7 @@ renault-api==0.2.9
 renson-endura-delta==1.7.2
 # homeassistant.components.reolink
-reolink-aio==0.11.8
+reolink-aio==0.11.10
 # homeassistant.components.rflink
 rflink==0.0.66
@@ -2520,7 +2520,7 @@ zeroconf==0.143.0
 zeversolar==0.3.2
 # homeassistant.components.zha
-zha==0.0.47
+zha==0.0.48
 # homeassistant.components.zwave_js
 zwave-js-server-python==0.60.0


@@ -25,7 +25,7 @@ RUN --mount=from=ghcr.io/astral-sh/uv:0.5.21,source=/uv,target=/bin/uv \
     -c /usr/src/homeassistant/homeassistant/package_constraints.txt \
     -r /usr/src/homeassistant/requirements.txt \
     stdlib-list==0.10.0 pipdeptree==2.23.4 tqdm==4.66.5 ruff==0.9.1 \
-    PyTurboJPEG==1.7.5 go2rtc-client==0.1.2 ha-ffmpeg==3.2.2 hassil==2.2.0 home-assistant-intents==2025.1.28 mutagen==1.47.0 pymicro-vad==1.0.1 pyspeex-noise==1.0.2
+    PyTurboJPEG==1.7.5 go2rtc-client==0.1.2 ha-ffmpeg==3.2.2 hassil==2.2.3 home-assistant-intents==2025.2.5 mutagen==1.47.0 pymicro-vad==1.0.1 pyspeex-noise==1.0.2
 LABEL "name"="hassfest"
 LABEL "maintainer"="Home Assistant <hello@home-assistant.io>"


@@ -9,7 +9,7 @@ from homeassistant.config_entries import ConfigEntry
 from homeassistant.core import HomeAssistant
 from homeassistant.helpers import intent
-from .conftest import MockAssistSatellite
+from .conftest import TEST_DOMAIN, MockAssistSatellite
 @pytest.fixture
@@ -65,12 +65,7 @@ async def test_broadcast_intent(
         },
         "language": "en",
         "response_type": "action_done",
-        "speech": {
-            "plain": {
-                "extra_data": None,
-                "speech": "Done",
-            }
-        },
+        "speech": {},  # response comes from intents
     }
     assert len(entity.announcements) == 1
     assert len(entity2.announcements) == 1
@@ -99,12 +94,37 @@ async def test_broadcast_intent(
         },
         "language": "en",
         "response_type": "action_done",
-        "speech": {
-            "plain": {
-                "extra_data": None,
-                "speech": "Done",
-            }
-        },
+        "speech": {},  # response comes from intents
     }
     assert len(entity.announcements) == 1
    assert len(entity2.announcements) == 2
+async def test_broadcast_intent_excluded_domains(
+    hass: HomeAssistant,
+    init_components: ConfigEntry,
+    entity: MockAssistSatellite,
+    entity2: MockAssistSatellite,
+    mock_tts: None,
+) -> None:
+    """Test that the broadcast intent filters out entities in excluded domains."""
+    # Exclude the "test" domain
+    with patch(
+        "homeassistant.components.assist_satellite.intent.EXCLUDED_DOMAINS",
+        new={TEST_DOMAIN},
+    ):
+        result = await intent.async_handle(
+            hass, "test", intent.INTENT_BROADCAST, {"message": {"value": "Hello"}}
+        )
+        assert result.as_dict() == {
+            "card": {},
+            "data": {
+                "failed": [],
+                "success": [],  # no satellites
+                "targets": [],
+            },
+            "language": "en",
+            "response_type": "action_done",
+            "speech": {},
+        }
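The `EXCLUDED_DOMAINS` override above works because `unittest.mock.patch` temporarily replaces a module-level attribute for the duration of the `with` block, then restores it. A minimal standalone sketch of that pattern (the constant and helper below are illustrative, not Home Assistant code):

```python
from unittest.mock import patch

EXCLUDED_DOMAINS = {"script"}  # stand-in for the real module constant


def broadcast_targets(entity_ids):
    # Keep only entities whose domain is not excluded.
    return [e for e in entity_ids if e.split(".")[0] not in EXCLUDED_DOMAINS]


# Patch the module attribute: the helper reads the global at call time,
# so it sees {"test"} inside the block and the original set afterwards.
with patch(f"{__name__}.EXCLUDED_DOMAINS", new={"test"}):
    patched = broadcast_targets(["test.sat1", "media_player.kitchen"])

restored = broadcast_targets(["test.sat1", "script.morning"])
```

Because the code under test looks the constant up at call time, no re-import is needed and the patch is undone automatically even if the block raises.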


@@ -1 +1,13 @@
"""Tests for the Electric Kiwi integration."""
from homeassistant.core import HomeAssistant

from tests.common import MockConfigEntry


async def init_integration(hass: HomeAssistant, entry: MockConfigEntry) -> None:
    """Fixture for setting up the integration with args."""
    entry.add_to_hass(hass)
    await hass.config_entries.async_setup(entry.entry_id)
    await hass.async_block_till_done()


@@ -2,11 +2,18 @@
 from __future__ import annotations
-from collections.abc import Awaitable, Callable, Generator
+from collections.abc import Generator
 from time import time
 from unittest.mock import AsyncMock, patch
-from electrickiwi_api.model import AccountBalance, Hop, HopIntervals
+from electrickiwi_api.model import (
+    AccountSummary,
+    CustomerConnection,
+    Hop,
+    HopIntervals,
+    Service,
+    Session,
+)
import pytest
from homeassistant.components.application_credentials import (
@@ -23,37 +30,55 @@ CLIENT_ID = "1234"
 CLIENT_SECRET = "5678"
 REDIRECT_URI = "https://example.com/auth/external/callback"
-type YieldFixture = Generator[AsyncMock]
-type ComponentSetup = Callable[[], Awaitable[bool]]
 @pytest.fixture(autouse=True)
 async def setup_credentials(hass: HomeAssistant) -> None:
     """Fixture to setup application credentials component."""
     await async_setup_component(hass, "application_credentials", {})
     await async_import_client_credential(
         hass,
         DOMAIN,
         ClientCredential(CLIENT_ID, CLIENT_SECRET),
     )
 @pytest.fixture(autouse=True)
 async def request_setup(current_request_with_host: None) -> None:
     """Request setup."""
 @pytest.fixture
-def component_setup(
-    hass: HomeAssistant, config_entry: MockConfigEntry
-) -> ComponentSetup:
-    """Fixture for setting up the integration."""
-    async def _setup_func() -> bool:
-        assert await async_setup_component(hass, "application_credentials", {})
-        await hass.async_block_till_done()
-        await async_import_client_credential(
-            hass,
-            DOMAIN,
-            ClientCredential(CLIENT_ID, CLIENT_SECRET),
-            DOMAIN,
-        )
-        await hass.async_block_till_done()
-        config_entry.add_to_hass(hass)
-        result = await hass.config_entries.async_setup(config_entry.entry_id)
-        await hass.async_block_till_done()
-        return result
-    return _setup_func
+def electrickiwi_api() -> Generator[AsyncMock]:
+    """Mock ek api and return values."""
+    with (
+        patch(
+            "homeassistant.components.electric_kiwi.ElectricKiwiApi",
+            autospec=True,
+        ) as mock_client,
+        patch(
+            "homeassistant.components.electric_kiwi.config_flow.ElectricKiwiApi",
+            new=mock_client,
+        ),
+    ):
+        client = mock_client.return_value
+        client.customer_number = 123456
+        client.electricity = Service(
+            identifier="00000000DDA",
+            service="electricity",
+            service_status="Y",
+            is_primary_service=True,
+        )
+        client.get_active_session.return_value = Session.from_dict(
+            load_json_value_fixture("session.json", DOMAIN)
+        )
+        client.get_hop_intervals.return_value = HopIntervals.from_dict(
+            load_json_value_fixture("hop_intervals.json", DOMAIN)
+        )
+        client.get_hop.return_value = Hop.from_dict(
+            load_json_value_fixture("get_hop.json", DOMAIN)
+        )
+        client.get_account_summary.return_value = AccountSummary.from_dict(
+            load_json_value_fixture("account_summary.json", DOMAIN)
+        )
+        client.get_connection_details.return_value = CustomerConnection.from_dict(
+            load_json_value_fixture("connection_details.json", DOMAIN)
+        )
+        yield client
@pytest.fixture(name="config_entry")
@@ -63,7 +88,7 @@ def mock_config_entry(hass: HomeAssistant) -> MockConfigEntry:
         title="Electric Kiwi",
         domain=DOMAIN,
         data={
-            "id": "12345",
+            "id": "123456",
             "auth_implementation": DOMAIN,
             "token": {
                 "refresh_token": "mock-refresh-token",
@@ -74,6 +99,54 @@ def mock_config_entry(hass: HomeAssistant) -> MockConfigEntry:
             },
         },
         unique_id=DOMAIN,
+        version=1,
+        minor_version=1,
     )
+@pytest.fixture(name="config_entry2")
+def mock_config_entry2(hass: HomeAssistant) -> MockConfigEntry:
+    """Create mocked config entry."""
+    return MockConfigEntry(
+        title="Electric Kiwi",
+        domain=DOMAIN,
+        data={
+            "id": "123457",
+            "auth_implementation": DOMAIN,
+            "token": {
+                "refresh_token": "mock-refresh-token",
+                "access_token": "mock-access-token",
+                "type": "Bearer",
+                "expires_in": 60,
+                "expires_at": time() + 60,
+            },
+        },
+        unique_id="1234567",
+        version=1,
+        minor_version=1,
+    )
+@pytest.fixture(name="migrated_config_entry")
+def mock_migrated_config_entry(hass: HomeAssistant) -> MockConfigEntry:
+    """Create mocked config entry."""
+    return MockConfigEntry(
+        title="Electric Kiwi",
+        domain=DOMAIN,
+        data={
+            "id": "123456",
+            "auth_implementation": DOMAIN,
+            "token": {
+                "refresh_token": "mock-refresh-token",
+                "access_token": "mock-access-token",
+                "type": "Bearer",
+                "expires_in": 60,
+                "expires_at": time() + 60,
+            },
+        },
+        unique_id="123456",
+        version=1,
+        minor_version=2,
+    )
@@ -87,35 +160,10 @@ def mock_setup_entry() -> Generator[AsyncMock]:
 @pytest.fixture(name="ek_auth")
-def electric_kiwi_auth() -> YieldFixture:
+def electric_kiwi_auth() -> Generator[AsyncMock]:
     """Patch access to electric kiwi access token."""
     with patch(
-        "homeassistant.components.electric_kiwi.api.AsyncConfigEntryAuth"
+        "homeassistant.components.electric_kiwi.api.ConfigEntryElectricKiwiAuth"
     ) as mock_auth:
         mock_auth.return_value.async_get_access_token = AsyncMock("auth_token")
         yield mock_auth
-@pytest.fixture(name="ek_api")
-def ek_api() -> YieldFixture:
-    """Mock ek api and return values."""
-    with patch(
-        "homeassistant.components.electric_kiwi.ElectricKiwiApi", autospec=True
-    ) as mock_ek_api:
-        mock_ek_api.return_value.customer_number = 123456
-        mock_ek_api.return_value.connection_id = 123456
-        mock_ek_api.return_value.set_active_session.return_value = None
-        mock_ek_api.return_value.get_hop_intervals.return_value = (
-            HopIntervals.from_dict(
-                load_json_value_fixture("hop_intervals.json", DOMAIN)
-            )
-        )
-        mock_ek_api.return_value.get_hop.return_value = Hop.from_dict(
-            load_json_value_fixture("get_hop.json", DOMAIN)
-        )
-        mock_ek_api.return_value.get_account_balance.return_value = (
-            AccountBalance.from_dict(
-                load_json_value_fixture("account_balance.json", DOMAIN)
-            )
-        )
-        yield mock_ek_api
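The conftest fixture above patches the same class at two import sites with `autospec=True` so the integration and its config flow share one mock. A minimal standalone sketch of that patching technique (the class and helper below are illustrative stand-ins, not the real `electrickiwi_api` client):

```python
from unittest.mock import patch


class ElectricKiwiApi:  # stand-in for the real API client class
    def get_hop(self):
        raise RuntimeError("would hit the network")


def setup_entry():
    # Code under test builds the client through the module-level name,
    # so patching that name by dotted path swaps in the mock.
    return ElectricKiwiApi().get_hop()


# autospec=True makes the mock reject calls that don't match the real
# class's signatures, catching drift between the tests and the library.
with patch(f"{__name__}.ElectricKiwiApi", autospec=True) as mock_client:
    client = mock_client.return_value
    client.get_hop.return_value = {"interval": "33"}
    hop = setup_entry()
```

Passing the same mock to a second `patch(..., new=mock_client)` (as the fixture does for the config-flow module) keeps every import site pointed at one shared instance, so return values configured once apply everywhere.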


@@ -1,28 +0,0 @@
{
"data": {
"connections": [
{
"hop_percentage": "3.5",
"id": 3,
"running_balance": "184.09",
"start_date": "2020-10-04",
"unbilled_days": 15
}
],
"last_billed_amount": "-66.31",
"last_billed_date": "2020-10-03",
"next_billing_date": "2020-11-03",
"is_prepay": "N",
"summary": {
"credits": "0.0",
"electricity_used": "184.09",
"other_charges": "0.00",
"payments": "-220.0"
},
"total_account_balance": "-102.22",
"total_billing_days": 30,
"total_running_balance": "184.09",
"type": "account_running_balance"
},
"status": 1
}


@@ -0,0 +1,43 @@
{
"data": {
"type": "account_summary",
"total_running_balance": "184.09",
"total_account_balance": "-102.22",
"total_billing_days": 31,
"next_billing_date": "2025-02-19",
"service_names": ["power"],
"services": {
"power": {
"connections": [
{
"id": 515363,
"running_balance": "12.98",
"unbilled_days": 5,
"hop_percentage": "11.2",
"start_date": "2025-01-19",
"service_label": "Power"
}
]
}
},
"date_to_pay": "",
"invoice_id": "",
"total_invoiced_charges": "",
"default_to_pay": "",
"invoice_exists": 1,
"display_date": "2025-01-19",
"last_billed_date": "2025-01-18",
"last_billed_amount": "-21.02",
"summary": {
"electricity_used": "12.98",
"other_charges": "0.00",
"payments": "0.00",
"credits": "0.00",
"mobile_charges": "0.00",
"broadband_charges": "0.00",
"addon_unbilled_charges": {}
},
"is_prepay": "N"
},
"status": 1
}


@@ -0,0 +1,73 @@
{
"data": {
"type": "connection",
"id": 515363,
"customer_id": 273941,
"customer_number": 34030646,
"icp_identifier": "00000000DDA",
"address": "",
"short_address": "",
"physical_address_unit": "",
"physical_address_number": "555",
"physical_address_street": "RACECOURSE ROAD",
"physical_address_suburb": "",
"physical_address_town": "Blah",
"physical_address_region": "Blah",
"physical_address_postcode": "0000",
"is_active": "Y",
"pricing_plan": {
"id": 51423,
"usage": "0.0000",
"fixed": "0.6000",
"usage_rate_inc_gst": "0.0000",
"supply_rate_inc_gst": "0.6900",
"plan_description": "MoveMaster Anytime Residential (Low User)",
"plan_type": "movemaster_tou",
"signup_price_plan_blurb": "Better rates every day during off-peak, and all day on weekends. Plus half price nights (11pm-7am) and our best solar buyback.",
"signup_price_plan_label": "MoveMaster",
"app_price_plan_label": "Your MoveMaster rates are...",
"solar_rate_excl_gst": "0.1250",
"solar_rate_incl_gst": "0.1438",
"pricing_type": "tou_plus",
"tou_plus": {
"fixed_rate_excl_gst": "0.6000",
"fixed_rate_incl_gst": "0.6900",
"interval_types": ["peak", "off_peak_shoulder", "off_peak_night"],
"peak": {
"price_excl_gst": "0.5390",
"price_incl_gst": "0.6199",
"display_text": {
"Weekdays": "7am-9am, 5pm-9pm"
},
"tou_plus_label": "Peak"
},
"off_peak_shoulder": {
"price_excl_gst": "0.3234",
"price_incl_gst": "0.3719",
"display_text": {
"Weekdays": "9am-5pm, 9pm-11pm",
"Weekends": "7am-11pm"
},
"tou_plus_label": "Off-peak shoulder"
},
"off_peak_night": {
"price_excl_gst": "0.2695",
"price_incl_gst": "0.3099",
"display_text": {
"Every day": "11pm-7am"
},
"tou_plus_label": "Off-peak night"
}
}
},
"hop": {
"start_time": "9:00 PM",
"end_time": "10:00 PM",
"interval_start": "43",
"interval_end": "44"
},
"start_date": "2022-03-03",
"end_date": "",
"property_type": "residential"
}
}


@@ -1,16 +1,18 @@
 {
   "data": {
-    "connection_id": "3",
-    "customer_number": 1000001,
-    "end": {
-      "end_time": "5:00 PM",
-      "interval": "34"
-    },
+    "type": "hop_customer",
+    "customer_id": 123456,
+    "service_type": "electricity",
+    "connection_id": 515363,
+    "billing_id": 1247975,
     "start": {
-      "start_time": "4:00 PM",
-      "interval": "33"
+      "interval": "33",
+      "start_time": "4:00 PM"
     },
-    "type": "hop_customer"
+    "end": {
+      "interval": "34",
+      "end_time": "5:00 PM"
+    }
   },
   "status": 1
 }


@@ -1,249 +1,250 @@
{
  "data": {
    "type": "hop_intervals",
    "hop_duration": "60",
    "intervals": {
      "1": {
        "start_time": "12:00 AM",
        "end_time": "1:00 AM",
        "active": 1
      },
      "2": {
        "start_time": "12:30 AM",
        "end_time": "1:30 AM",
        "active": 1
      },
      "3": {
        "start_time": "1:00 AM",
        "end_time": "2:00 AM",
        "active": 1
      },
      "4": {
        "start_time": "1:30 AM",
        "end_time": "2:30 AM",
        "active": 1
      },
      "5": {
        "start_time": "2:00 AM",
        "end_time": "3:00 AM",
        "active": 1
      },
      "6": {
        "start_time": "2:30 AM",
        "end_time": "3:30 AM",
        "active": 1
      },
      "7": {
        "start_time": "3:00 AM",
        "end_time": "4:00 AM",
        "active": 1
      },
      "8": {
        "start_time": "3:30 AM",
        "end_time": "4:30 AM",
        "active": 1
      },
      "9": {
        "start_time": "4:00 AM",
        "end_time": "5:00 AM",
        "active": 1
      },
      "10": {
        "start_time": "4:30 AM",
        "end_time": "5:30 AM",
        "active": 1
      },
      "11": {
        "start_time": "5:00 AM",
        "end_time": "6:00 AM",
        "active": 1
      },
      "12": {
        "start_time": "5:30 AM",
        "end_time": "6:30 AM",
        "active": 1
      },
      "13": {
        "start_time": "6:00 AM",
        "end_time": "7:00 AM",
        "active": 1
      },
      "14": {
        "start_time": "6:30 AM",
        "end_time": "7:30 AM",
        "active": 0
      },
      "15": {
        "start_time": "7:00 AM",
        "end_time": "8:00 AM",
        "active": 0
      },
      "16": {
        "start_time": "7:30 AM",
        "end_time": "8:30 AM",
        "active": 0
      },
      "17": {
        "start_time": "8:00 AM",
        "end_time": "9:00 AM",
        "active": 0
      },
      "18": {
        "start_time": "8:30 AM",
        "end_time": "9:30 AM",
        "active": 0
      },
      "19": {
        "start_time": "9:00 AM",
        "end_time": "10:00 AM",
        "active": 1
      },
      "20": {
        "start_time": "9:30 AM",
        "end_time": "10:30 AM",
        "active": 1
      },
      "21": {
        "start_time": "10:00 AM",
        "end_time": "11:00 AM",
        "active": 1
      },
      "22": {
        "start_time": "10:30 AM",
        "end_time": "11:30 AM",
        "active": 1
      },
      "23": {
        "start_time": "11:00 AM",
        "end_time": "12:00 PM",
        "active": 1
      },
      "24": {
        "start_time": "11:30 AM",
        "end_time": "12:30 PM",
        "active": 1
      },
      "25": {
        "start_time": "12:00 PM",
        "end_time": "1:00 PM",
        "active": 1
      },
      "26": {
        "start_time": "12:30 PM",
        "end_time": "1:30 PM",
        "active": 1
      },
      "27": {
        "start_time": "1:00 PM",
        "end_time": "2:00 PM",
        "active": 1
      },
      "28": {
        "start_time": "1:30 PM",
        "end_time": "2:30 PM",
        "active": 1
      },
      "29": {
        "start_time": "2:00 PM",
        "end_time": "3:00 PM",
        "active": 1
      },
      "30": {
        "start_time": "2:30 PM",
        "end_time": "3:30 PM",
        "active": 1
      },
      "31": {
        "start_time": "3:00 PM",
        "end_time": "4:00 PM",
        "active": 1
      },
      "32": {
        "start_time": "3:30 PM",
        "end_time": "4:30 PM",
        "active": 1
      },
      "33": {
        "start_time": "4:00 PM",
        "end_time": "5:00 PM",
        "active": 1
      },
      "34": {
        "start_time": "4:30 PM",
        "end_time": "5:30 PM",
        "active": 0
      },
      "35": {
        "start_time": "5:00 PM",
        "end_time": "6:00 PM",
        "active": 0
      },
      "36": {
        "start_time": "5:30 PM",
        "end_time": "6:30 PM",
        "active": 0
      },
      "37": {
        "start_time": "6:00 PM",
        "end_time": "7:00 PM",
        "active": 0
      },
      "38": {
        "start_time": "6:30 PM",
        "end_time": "7:30 PM",
        "active": 0
      },
      "39": {
        "start_time": "7:00 PM",
        "end_time": "8:00 PM",
        "active": 0
      },
      "40": {
        "start_time": "7:30 PM",
        "end_time": "8:30 PM",
        "active": 0
      },
      "41": {
        "start_time": "8:00 PM",
        "end_time": "9:00 PM",
        "active": 0
      },
      "42": {
        "start_time": "8:30 PM",
        "end_time": "9:30 PM",
        "active": 0
      },
      "43": {
        "start_time": "9:00 PM",
        "end_time": "10:00 PM",
        "active": 1
      },
      "44": {
        "start_time": "9:30 PM",
        "end_time": "10:30 PM",
        "active": 1
      },
      "45": {
        "start_time": "10:00 PM",
        "end_time": "11:00 PM",
        "active": 1
      },
      "46": {
        "start_time": "10:30 PM",
        "end_time": "11:30 PM",
        "active": 1
      },
      "47": {
        "start_time": "11:00 PM",
        "end_time": "12:00 AM",
        "active": 1
      },
      "48": {
        "start_time": "11:30 PM",
        "end_time": "12:30 AM",
        "active": 0
      }
    },
    "service_type": "electricity"
  },
  "status": 1
}


@@ -0,0 +1,23 @@
{
"data": {
"data": {
"type": "session",
"avatar": [],
"customer_number": 123456,
"customer_name": "Joe Dirt",
"email": "joe@dirt.kiwi",
"customer_status": "Y",
"services": [
{
"service": "Electricity",
"identifier": "00000000DDA",
"is_primary_service": true,
"service_status": "Y"
}
],
"res_partner_id": 285554,
"nuid": "EK_GUID"
}
},
"status": 1
}


@@ -0,0 +1,16 @@
{
"data": {
"data": {
"type": "session",
"avatar": [],
"customer_number": 123456,
"customer_name": "Joe Dirt",
"email": "joe@dirt.kiwi",
"customer_status": "Y",
"services": [],
"res_partner_id": 285554,
"nuid": "EK_GUID"
}
},
"status": 1
}


@@ -3,70 +3,40 @@
 from __future__ import annotations
 from http import HTTPStatus
-from unittest.mock import AsyncMock, MagicMock
+from unittest.mock import AsyncMock
+from electrickiwi_api.exceptions import ApiException
 import pytest
 from homeassistant import config_entries
-from homeassistant.components.application_credentials import (
-    ClientCredential,
-    async_import_client_credential,
-)
 from homeassistant.components.electric_kiwi.const import (
     DOMAIN,
     OAUTH2_AUTHORIZE,
     OAUTH2_TOKEN,
     SCOPE_VALUES,
 )
+from homeassistant.config_entries import SOURCE_USER
 from homeassistant.core import HomeAssistant
 from homeassistant.data_entry_flow import FlowResultType
 from homeassistant.helpers import config_entry_oauth2_flow
-from homeassistant.setup import async_setup_component
-from .conftest import CLIENT_ID, CLIENT_SECRET, REDIRECT_URI
+from .conftest import CLIENT_ID, REDIRECT_URI
 from tests.common import MockConfigEntry
 from tests.test_util.aiohttp import AiohttpClientMocker
 from tests.typing import ClientSessionGenerator
 pytestmark = pytest.mark.usefixtures("mock_setup_entry")
-@pytest.fixture
-async def setup_credentials(hass: HomeAssistant) -> None:
-    """Fixture to setup application credentials component."""
-    await async_setup_component(hass, "application_credentials", {})
-    await async_import_client_credential(
-        hass,
-        DOMAIN,
-        ClientCredential(CLIENT_ID, CLIENT_SECRET),
-    )
 async def test_config_flow_no_credentials(hass: HomeAssistant) -> None:
     """Test config flow base case with no credentials registered."""
     result = await hass.config_entries.flow.async_init(
         DOMAIN, context={"source": config_entries.SOURCE_USER}
     )
     assert result.get("type") is FlowResultType.ABORT
     assert result.get("reason") == "missing_credentials"
-@pytest.mark.usefixtures("current_request_with_host")
+@pytest.mark.usefixtures("current_request_with_host", "electrickiwi_api")
 async def test_full_flow(
     hass: HomeAssistant,
     hass_client_no_auth: ClientSessionGenerator,
     aioclient_mock: AiohttpClientMocker,
-    setup_credentials: None,
     mock_setup_entry: AsyncMock,
 ) -> None:
     """Check full flow."""
-    await async_import_client_credential(
-        hass, DOMAIN, ClientCredential(CLIENT_ID, CLIENT_SECRET)
-    )
     result = await hass.config_entries.flow.async_init(
-        DOMAIN, context={"source": config_entries.SOURCE_USER, "entry_id": DOMAIN}
+        DOMAIN, context={"source": SOURCE_USER}
     )
     state = config_entry_oauth2_flow._encode_jwt(
         hass,
@@ -76,13 +46,13 @@ async def test_full_flow(
         },
     )
-    URL_SCOPE = SCOPE_VALUES.replace(" ", "+")
+    url_scope = SCOPE_VALUES.replace(" ", "+")
     assert result["url"] == (
         f"{OAUTH2_AUTHORIZE}?response_type=code&client_id={CLIENT_ID}"
         f"&redirect_uri={REDIRECT_URI}"
         f"&state={state}"
-        f"&scope={URL_SCOPE}"
+        f"&scope={url_scope}"
     )
     client = await hass_client_no_auth()
@@ -90,6 +60,7 @@ async def test_full_flow(
     assert resp.status == HTTPStatus.OK
     assert resp.headers["content-type"] == "text/html; charset=utf-8"
+    aioclient_mock.clear_requests()
     aioclient_mock.post(
         OAUTH2_TOKEN,
         json={
@@ -106,20 +77,73 @@ async def test_full_flow(
     assert len(mock_setup_entry.mock_calls) == 1
+@pytest.mark.usefixtures("current_request_with_host")
+async def test_flow_failure(
+    hass: HomeAssistant,
+    hass_client_no_auth: ClientSessionGenerator,
+    aioclient_mock: AiohttpClientMocker,
+    electrickiwi_api: AsyncMock,
+) -> None:
+    """Check failure on creation of entry."""
+    result = await hass.config_entries.flow.async_init(
+        DOMAIN, context={"source": SOURCE_USER}
+    )
+    state = config_entry_oauth2_flow._encode_jwt(
+        hass,
+        {
+            "flow_id": result["flow_id"],
+            "redirect_uri": REDIRECT_URI,
+        },
+    )
+    url_scope = SCOPE_VALUES.replace(" ", "+")
+    assert result["url"] == (
+        f"{OAUTH2_AUTHORIZE}?response_type=code&client_id={CLIENT_ID}"
+        f"&redirect_uri={REDIRECT_URI}"
+        f"&state={state}"
+        f"&scope={url_scope}"
+    )
+    client = await hass_client_no_auth()
+    resp = await client.get(f"/auth/external/callback?code=abcd&state={state}")
+    assert resp.status == HTTPStatus.OK
+    assert resp.headers["content-type"] == "text/html; charset=utf-8"
+    aioclient_mock.clear_requests()
+    aioclient_mock.post(
+        OAUTH2_TOKEN,
+        json={
+            "refresh_token": "mock-refresh-token",
+            "access_token": "mock-access-token",
+            "type": "Bearer",
+            "expires_in": 60,
+        },
+    )
+    electrickiwi_api.get_active_session.side_effect = ApiException()
+    result = await hass.config_entries.flow.async_configure(result["flow_id"])
+    assert len(hass.config_entries.async_entries(DOMAIN)) == 0
+    assert result.get("type") is FlowResultType.ABORT
+    assert result.get("reason") == "connection_error"
 @pytest.mark.usefixtures("current_request_with_host")
 async def test_existing_entry(
     hass: HomeAssistant,
     hass_client_no_auth: ClientSessionGenerator,
     aioclient_mock: AiohttpClientMocker,
-    setup_credentials: None,
-    config_entry: MockConfigEntry,
+    migrated_config_entry: MockConfigEntry,
 ) -> None:
     """Check existing entry."""
-    config_entry.add_to_hass(hass)
+    migrated_config_entry.add_to_hass(hass)
     assert len(hass.config_entries.async_entries(DOMAIN)) == 1
     result = await hass.config_entries.flow.async_init(
-        DOMAIN, context={"source": config_entries.SOURCE_USER, "entry_id": DOMAIN}
+        DOMAIN, context={"source": SOURCE_USER, "entry_id": DOMAIN}
     )
     state = config_entry_oauth2_flow._encode_jwt(
@@ -145,7 +169,9 @@ async def test_existing_entry(
         },
     )
-    await hass.config_entries.flow.async_configure(result["flow_id"])
+    result = await hass.config_entries.flow.async_configure(result["flow_id"])
+    assert result.get("type") is FlowResultType.ABORT
+    assert result.get("reason") == "already_configured"
     assert len(hass.config_entries.async_entries(DOMAIN)) == 1
@@ -154,13 +180,13 @@ async def test_reauthentication(
     hass: HomeAssistant,
     hass_client_no_auth: ClientSessionGenerator,
     aioclient_mock: AiohttpClientMocker,
-    mock_setup_entry: MagicMock,
-    config_entry: MockConfigEntry,
-    setup_credentials: None,
+    mock_setup_entry: AsyncMock,
+    migrated_config_entry: MockConfigEntry,
 ) -> None:
     """Test Electric Kiwi reauthentication."""
-    config_entry.add_to_hass(hass)
-    result = await config_entry.start_reauth_flow(hass)
+    migrated_config_entry.add_to_hass(hass)
+    result = await migrated_config_entry.start_reauth_flow(hass)
     assert result["type"] is FlowResultType.FORM
     assert result["step_id"] == "reauth_confirm"
@@ -189,8 +215,11 @@ async def test_reauthentication(
         },
     )
-    await hass.config_entries.flow.async_configure(result["flow_id"])
+    result = await hass.config_entries.flow.async_configure(result["flow_id"])
     await hass.async_block_till_done()
     assert len(hass.config_entries.async_entries(DOMAIN)) == 1
     assert len(mock_setup_entry.mock_calls) == 1
+    assert result.get("type") is FlowResultType.ABORT
+    assert result.get("reason") == "reauth_successful"

@@ -0,0 +1,135 @@
"""Test the Electric Kiwi init."""
import http
from unittest.mock import AsyncMock, patch

from aiohttp import RequestInfo
from aiohttp.client_exceptions import ClientResponseError
from electrickiwi_api.exceptions import ApiException, AuthException
import pytest

from homeassistant.components.electric_kiwi.const import DOMAIN
from homeassistant.components.sensor import DOMAIN as SENSOR_DOMAIN
from homeassistant.config_entries import ConfigEntryState
from homeassistant.core import HomeAssistant
from homeassistant.helpers import entity_registry as er

from . import init_integration

from tests.common import MockConfigEntry


async def test_async_setup_entry(
    hass: HomeAssistant, config_entry: MockConfigEntry
) -> None:
    """Test a successful setup entry and unload of entry."""
    await init_integration(hass, config_entry)
    assert len(hass.config_entries.async_entries(DOMAIN)) == 1
    assert config_entry.state is ConfigEntryState.LOADED

    assert await hass.config_entries.async_unload(config_entry.entry_id)
    await hass.async_block_till_done()
    assert config_entry.state is ConfigEntryState.NOT_LOADED


async def test_async_setup_multiple_entries(
    hass: HomeAssistant,
    config_entry: MockConfigEntry,
    config_entry2: MockConfigEntry,
) -> None:
    """Test a successful setup and unload of multiple entries."""
    for entry in (config_entry, config_entry2):
        await init_integration(hass, entry)
    assert len(hass.config_entries.async_entries(DOMAIN)) == 2

    for entry in (config_entry, config_entry2):
        assert await hass.config_entries.async_unload(entry.entry_id)
        await hass.async_block_till_done()
        assert entry.state is ConfigEntryState.NOT_LOADED


@pytest.mark.parametrize(
    ("status", "expected_state"),
    [
        (
            http.HTTPStatus.UNAUTHORIZED,
            ConfigEntryState.SETUP_ERROR,
        ),
        (
            http.HTTPStatus.INTERNAL_SERVER_ERROR,
            ConfigEntryState.SETUP_RETRY,
        ),
    ],
    ids=["failure_requires_reauth", "transient_failure"],
)
async def test_refresh_token_validity_failures(
    hass: HomeAssistant,
    config_entry: MockConfigEntry,
    status: http.HTTPStatus,
    expected_state: ConfigEntryState,
) -> None:
    """Test token refresh failure status."""
    with patch(
        "homeassistant.helpers.config_entry_oauth2_flow.OAuth2Session.async_ensure_token_valid",
        side_effect=ClientResponseError(
            RequestInfo("", "POST", {}, ""), None, status=status
        ),
    ) as mock_async_ensure_token_valid:
        await init_integration(hass, config_entry)
        mock_async_ensure_token_valid.assert_called_once()

        assert len(hass.config_entries.async_entries(DOMAIN)) == 1
        entries = hass.config_entries.async_entries(DOMAIN)
        assert entries[0].state is expected_state


async def test_unique_id_migration(
    hass: HomeAssistant,
    config_entry: MockConfigEntry,
    entity_registry: er.EntityRegistry,
) -> None:
    """Test that the unique ID is migrated to the customer number."""
    config_entry.add_to_hass(hass)
    entity_registry.async_get_or_create(
        SENSOR_DOMAIN, DOMAIN, "123456_515363_sensor", config_entry=config_entry
    )
    await hass.config_entries.async_setup(config_entry.entry_id)
    await hass.async_block_till_done()
    new_entry = hass.config_entries.async_get_entry(config_entry.entry_id)
    assert new_entry.minor_version == 2
    assert new_entry.unique_id == "123456"
    entity_entry = entity_registry.async_get(
        "sensor.electric_kiwi_123456_515363_sensor"
    )
    assert entity_entry.unique_id == "123456_00000000DDA_sensor"


async def test_unique_id_migration_failure(
    hass: HomeAssistant, config_entry: MockConfigEntry, electrickiwi_api: AsyncMock
) -> None:
    """Test that the unique ID is migrated to the customer number."""
    electrickiwi_api.set_active_session.side_effect = ApiException()
    await init_integration(hass, config_entry)
    assert config_entry.minor_version == 1
assert config_entry.unique_id == DOMAIN
assert config_entry.state is ConfigEntryState.MIGRATION_ERROR
async def test_unique_id_migration_auth_failure(
hass: HomeAssistant, config_entry: MockConfigEntry, electrickiwi_api: AsyncMock
) -> None:
"""Test that the unique ID is migrated to the customer number."""
electrickiwi_api.set_active_session.side_effect = AuthException()
await init_integration(hass, config_entry)
assert config_entry.minor_version == 1
assert config_entry.unique_id == DOMAIN
assert config_entry.state is ConfigEntryState.MIGRATION_ERROR
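The two failure tests above check that an API or auth error during migration leaves the entry on its old minor version, which Home Assistant then reports as `MIGRATION_ERROR`. A minimal sketch of that pattern (hypothetical and simplified — the real flow lives in the integration's `async_migrate_entry`; the `fetch_customer_number` callable and dict-based entry are assumptions for illustration):

```python
class ApiException(Exception):
    """Stand-in for electrickiwi_api.exceptions.ApiException."""


def migrate_unique_id(entry: dict, fetch_customer_number) -> bool:
    """Return True and update the entry if migration succeeds.

    Returning False leaves the entry untouched, which the config-entry
    machinery surfaces as a MIGRATION_ERROR state.
    """
    if entry["minor_version"] >= 2:
        return True  # already migrated, nothing to do
    try:
        customer_number = fetch_customer_number()
    except ApiException:
        return False  # entry stays on minor_version 1
    entry["unique_id"] = customer_number
    entry["minor_version"] = 2
    return True


entry = {"minor_version": 1, "unique_id": "electric_kiwi"}
assert migrate_unique_id(entry, lambda: "123456")
print(entry)  # {'minor_version': 2, 'unique_id': '123456'}
```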


@@ -20,7 +20,7 @@ from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_registry import EntityRegistry
from homeassistant.util import dt as dt_util
from .conftest import ComponentSetup, YieldFixture
from . import init_integration
from tests.common import MockConfigEntry
@@ -47,10 +47,9 @@ def restore_timezone():
async def test_hop_sensors(
hass: HomeAssistant,
config_entry: MockConfigEntry,
ek_api: YieldFixture,
ek_auth: YieldFixture,
electrickiwi_api: Mock,
ek_auth: AsyncMock,
entity_registry: EntityRegistry,
component_setup: ComponentSetup,
sensor: str,
sensor_state: str,
) -> None:
@@ -61,7 +60,7 @@ async def test_hop_sensors(
sensor state should be set to today at 4pm or if now is past 4pm,
then tomorrow at 4pm.
"""
assert await component_setup()
await init_integration(hass, config_entry)
assert config_entry.state is ConfigEntryState.LOADED
entity = entity_registry.async_get(sensor)
@@ -70,8 +69,7 @@ async def test_hop_sensors(
state = hass.states.get(sensor)
assert state
api = ek_api(Mock())
hop_data = await api.get_hop()
hop_data = await electrickiwi_api.get_hop()
value = _check_and_move_time(hop_data, sensor_state)
@@ -98,20 +96,19 @@ async def test_hop_sensors(
),
(
"sensor.next_billing_date",
"2020-11-03T00:00:00",
"2025-02-19T00:00:00",
SensorDeviceClass.DATE,
None,
),
("sensor.hour_of_power_savings", "3.5", None, SensorStateClass.MEASUREMENT),
("sensor.hour_of_power_savings", "11.2", None, SensorStateClass.MEASUREMENT),
],
)
async def test_account_sensors(
hass: HomeAssistant,
config_entry: MockConfigEntry,
ek_api: YieldFixture,
ek_auth: YieldFixture,
electrickiwi_api: AsyncMock,
ek_auth: AsyncMock,
entity_registry: EntityRegistry,
component_setup: ComponentSetup,
sensor: str,
sensor_state: str,
device_class: str,
@@ -119,7 +116,7 @@ async def test_account_sensors(
) -> None:
"""Test Account sensors for the Electric Kiwi integration."""
assert await component_setup()
await init_integration(hass, config_entry)
assert config_entry.state is ConfigEntryState.LOADED
entity = entity_registry.async_get(sensor)
@@ -133,9 +130,9 @@ async def test_account_sensors(
assert state.attributes.get(ATTR_STATE_CLASS) == state_class
async def test_check_and_move_time(ek_api: AsyncMock) -> None:
async def test_check_and_move_time(electrickiwi_api: AsyncMock) -> None:
"""Test correct time is returned depending on time of day."""
hop = await ek_api(Mock()).get_hop()
hop = await electrickiwi_api.get_hop()
test_time = datetime(2023, 6, 21, 18, 0, 0, tzinfo=TEST_TIMEZONE)
dt_util.set_default_time_zone(TEST_TIMEZONE)


@@ -8,7 +8,6 @@
'habitica_data': dict({
'tasks': list([
dict({
'Type': 'habit',
'alias': None,
'attribute': 'str',
'byHabitica': False,
@@ -71,6 +70,7 @@
'tags': list([
]),
'text': 'task text',
'type': 'habit',
'up': True,
'updatedAt': '2024-10-10T15:57:14.287000+00:00',
'userId': 'ffce870c-3ff3-4fa4-bad1-87612e52b8e7',
@@ -80,7 +80,6 @@
'yesterDaily': None,
}),
dict({
'Type': 'todo',
'alias': None,
'attribute': 'str',
'byHabitica': True,
@@ -143,6 +142,7 @@
'tags': list([
]),
'text': 'task text',
'type': 'todo',
'up': None,
'updatedAt': '2024-11-27T19:34:29.001000+00:00',
'userId': 'ffce870c-3ff3-4fa4-bad1-87612e52b8e7',
@@ -152,7 +152,6 @@
'yesterDaily': None,
}),
dict({
'Type': 'reward',
'alias': None,
'attribute': 'str',
'byHabitica': False,
@@ -215,6 +214,7 @@
'tags': list([
]),
'text': 'task text',
'type': 'reward',
'up': None,
'updatedAt': '2024-10-10T15:57:14.290000+00:00',
'userId': 'ffce870c-3ff3-4fa4-bad1-87612e52b8e7',
@@ -224,7 +224,6 @@
'yesterDaily': None,
}),
dict({
'Type': 'daily',
'alias': None,
'attribute': 'str',
'byHabitica': False,
@@ -341,6 +340,7 @@
'tags': list([
]),
'text': 'task text',
'type': 'daily',
'up': None,
'updatedAt': '2024-11-27T19:34:29.001000+00:00',
'userId': 'ffce870c-3ff3-4fa4-bad1-87612e52b8e7',

File diff suppressed because it is too large


@@ -26,6 +26,7 @@ from aiohasupervisor.models import (
jobs as supervisor_jobs,
mounts as supervisor_mounts,
)
from aiohasupervisor.models.backups import LOCATION_CLOUD_BACKUP, LOCATION_LOCAL_STORAGE
from aiohasupervisor.models.mounts import MountsInfo
from freezegun.api import FrozenDateTimeFactory
import pytest
@@ -39,11 +40,7 @@ from homeassistant.components.backup import (
Folder,
)
from homeassistant.components.hassio import DOMAIN
from homeassistant.components.hassio.backup import (
LOCATION_CLOUD_BACKUP,
LOCATION_LOCAL,
RESTORE_JOB_ID_ENV,
)
from homeassistant.components.hassio.backup import RESTORE_JOB_ID_ENV
from homeassistant.core import HomeAssistant
from homeassistant.setup import async_setup_component
@@ -60,17 +57,12 @@ TEST_BACKUP = supervisor_backups.Backup(
homeassistant=True,
),
date=datetime.fromisoformat("1970-01-01T00:00:00Z"),
location=None,
location_attributes={
LOCATION_LOCAL: supervisor_backups.BackupLocationAttributes(
LOCATION_LOCAL_STORAGE: supervisor_backups.BackupLocationAttributes(
protected=False, size_bytes=1048576
)
},
locations={None},
name="Test",
protected=False,
size=1.0,
size_bytes=1048576,
slug="abc123",
type=supervisor_backups.BackupType.PARTIAL,
)
@@ -89,14 +81,9 @@ TEST_BACKUP_DETAILS = supervisor_backups.BackupComplete(
folders=[supervisor_backups.Folder.SHARE],
homeassistant_exclude_database=False,
homeassistant="2024.12.0",
location=TEST_BACKUP.location,
location_attributes=TEST_BACKUP.location_attributes,
locations=TEST_BACKUP.locations,
name=TEST_BACKUP.name,
protected=TEST_BACKUP.protected,
repositories=[],
size=TEST_BACKUP.size,
size_bytes=TEST_BACKUP.size_bytes,
slug=TEST_BACKUP.slug,
supervisor_version="2024.11.2",
type=TEST_BACKUP.type,
@@ -110,17 +97,12 @@ TEST_BACKUP_2 = supervisor_backups.Backup(
homeassistant=False,
),
date=datetime.fromisoformat("1970-01-01T00:00:00Z"),
location=None,
location_attributes={
LOCATION_LOCAL: supervisor_backups.BackupLocationAttributes(
LOCATION_LOCAL_STORAGE: supervisor_backups.BackupLocationAttributes(
protected=False, size_bytes=1048576
)
},
locations={None},
name="Test",
protected=False,
size=1.0,
size_bytes=1048576,
slug="abc123",
type=supervisor_backups.BackupType.PARTIAL,
)
@@ -139,14 +121,9 @@ TEST_BACKUP_DETAILS_2 = supervisor_backups.BackupComplete(
folders=[supervisor_backups.Folder.SHARE],
homeassistant_exclude_database=False,
homeassistant=None,
location=TEST_BACKUP_2.location,
location_attributes=TEST_BACKUP_2.location_attributes,
locations=TEST_BACKUP_2.locations,
name=TEST_BACKUP_2.name,
protected=TEST_BACKUP_2.protected,
repositories=[],
size=TEST_BACKUP_2.size,
size_bytes=TEST_BACKUP_2.size_bytes,
slug=TEST_BACKUP_2.slug,
supervisor_version="2024.11.2",
type=TEST_BACKUP_2.type,
@@ -160,17 +137,12 @@ TEST_BACKUP_3 = supervisor_backups.Backup(
homeassistant=True,
),
date=datetime.fromisoformat("1970-01-01T00:00:00Z"),
location="share",
location_attributes={
LOCATION_LOCAL: supervisor_backups.BackupLocationAttributes(
LOCATION_LOCAL_STORAGE: supervisor_backups.BackupLocationAttributes(
protected=False, size_bytes=1048576
)
},
locations={"share"},
name="Test",
protected=False,
size=1.0,
size_bytes=1048576,
slug="abc123",
type=supervisor_backups.BackupType.PARTIAL,
)
@@ -189,14 +161,9 @@ TEST_BACKUP_DETAILS_3 = supervisor_backups.BackupComplete(
folders=[supervisor_backups.Folder.SHARE],
homeassistant_exclude_database=False,
homeassistant=None,
location=TEST_BACKUP_3.location,
location_attributes=TEST_BACKUP_3.location_attributes,
locations=TEST_BACKUP_3.locations,
name=TEST_BACKUP_3.name,
protected=TEST_BACKUP_3.protected,
repositories=[],
size=TEST_BACKUP_3.size,
size_bytes=TEST_BACKUP_3.size_bytes,
slug=TEST_BACKUP_3.slug,
supervisor_version="2024.11.2",
type=TEST_BACKUP_3.type,
@@ -211,17 +178,12 @@ TEST_BACKUP_4 = supervisor_backups.Backup(
homeassistant=True,
),
date=datetime.fromisoformat("1970-01-01T00:00:00Z"),
location=None,
location_attributes={
LOCATION_LOCAL: supervisor_backups.BackupLocationAttributes(
LOCATION_LOCAL_STORAGE: supervisor_backups.BackupLocationAttributes(
protected=False, size_bytes=1048576
)
},
locations={None},
name="Test",
protected=False,
size=1.0,
size_bytes=1048576,
slug="abc123",
type=supervisor_backups.BackupType.PARTIAL,
)
@@ -240,14 +202,9 @@ TEST_BACKUP_DETAILS_4 = supervisor_backups.BackupComplete(
folders=[supervisor_backups.Folder.SHARE],
homeassistant_exclude_database=True,
homeassistant="2024.12.0",
location=TEST_BACKUP_4.location,
location_attributes=TEST_BACKUP_4.location_attributes,
locations=TEST_BACKUP_4.locations,
name=TEST_BACKUP_4.name,
protected=TEST_BACKUP_4.protected,
repositories=[],
size=TEST_BACKUP_4.size,
size_bytes=TEST_BACKUP_4.size_bytes,
slug=TEST_BACKUP_4.slug,
supervisor_version="2024.11.2",
type=TEST_BACKUP_4.type,
@@ -261,17 +218,12 @@ TEST_BACKUP_5 = supervisor_backups.Backup(
homeassistant=True,
),
date=datetime.fromisoformat("1970-01-01T00:00:00Z"),
location=LOCATION_CLOUD_BACKUP,
location_attributes={
LOCATION_CLOUD_BACKUP: supervisor_backups.BackupLocationAttributes(
protected=False, size_bytes=1048576
)
},
locations={LOCATION_CLOUD_BACKUP},
name="Test",
protected=False,
size=1.0,
size_bytes=1048576,
slug="abc123",
type=supervisor_backups.BackupType.PARTIAL,
)
@@ -290,14 +242,9 @@ TEST_BACKUP_DETAILS_5 = supervisor_backups.BackupComplete(
folders=[supervisor_backups.Folder.SHARE],
homeassistant_exclude_database=False,
homeassistant="2024.12.0",
location=TEST_BACKUP_5.location,
location_attributes=TEST_BACKUP_5.location_attributes,
locations=TEST_BACKUP_5.locations,
name=TEST_BACKUP_5.name,
protected=TEST_BACKUP_5.protected,
repositories=[],
size=TEST_BACKUP_5.size,
size_bytes=TEST_BACKUP_5.size_bytes,
slug=TEST_BACKUP_5.slug,
supervisor_version="2024.11.2",
type=TEST_BACKUP_5.type,
@@ -312,6 +259,7 @@ TEST_JOB_NOT_DONE = supervisor_jobs.Job(
stage="copy_additional_locations",
done=False,
errors=[],
created=datetime.fromisoformat("1970-01-01T00:00:00Z"),
child_jobs=[],
)
TEST_JOB_DONE = supervisor_jobs.Job(
@@ -322,6 +270,7 @@ TEST_JOB_DONE = supervisor_jobs.Job(
stage="copy_additional_locations",
done=True,
errors=[],
created=datetime.fromisoformat("1970-01-01T00:00:00Z"),
child_jobs=[],
)
TEST_RESTORE_JOB_DONE_WITH_ERROR = supervisor_jobs.Job(
@@ -340,6 +289,7 @@ TEST_RESTORE_JOB_DONE_WITH_ERROR = supervisor_jobs.Job(
),
)
],
created=datetime.fromisoformat("1970-01-01T00:00:00Z"),
child_jobs=[],
)
@@ -580,7 +530,10 @@ async def test_agent_download(
assert await resp.content.read() == b"backup data"
supervisor_client.backups.download_backup.assert_called_once_with(
"abc123", options=supervisor_backups.DownloadBackupOptions(location=None)
"abc123",
options=supervisor_backups.DownloadBackupOptions(
location=LOCATION_LOCAL_STORAGE
),
)
@@ -766,7 +719,10 @@ async def test_agent_delete_backup(
assert response["success"]
assert response["result"] == {"agent_errors": {}}
supervisor_client.backups.remove_backup.assert_called_once_with(
backup_id, options=supervisor_backups.RemoveBackupOptions(location={None})
backup_id,
options=supervisor_backups.RemoveBackupOptions(
location={LOCATION_LOCAL_STORAGE}
),
)
@@ -812,7 +768,10 @@ async def test_agent_delete_with_error(
assert response == {"id": 1, "type": "result"} | expected_response
supervisor_client.backups.remove_backup.assert_called_once_with(
backup_id, options=supervisor_backups.RemoveBackupOptions(location={None})
backup_id,
options=supervisor_backups.RemoveBackupOptions(
location={LOCATION_LOCAL_STORAGE}
),
)
@@ -891,7 +850,7 @@ DEFAULT_BACKUP_OPTIONS = supervisor_backups.PartialBackupOptions(
folders={"ssl"},
homeassistant_exclude_database=False,
homeassistant=True,
location=[None],
location=[LOCATION_LOCAL_STORAGE],
name="Test",
password=None,
)
@@ -947,7 +906,7 @@ async def test_reader_writer_create(
"""Test generating a backup."""
client = await hass_ws_client(hass)
freezer.move_to("2025-01-30 13:42:12.345678")
supervisor_client.backups.partial_backup.return_value.job_id = TEST_JOB_ID
supervisor_client.backups.partial_backup.return_value.job_id = UUID(TEST_JOB_ID)
supervisor_client.backups.backup_info.return_value = TEST_BACKUP_DETAILS
supervisor_client.jobs.get_job.return_value = TEST_JOB_NOT_DONE
@@ -1022,7 +981,7 @@ async def test_reader_writer_create_report_progress(
"""Test generating a backup."""
client = await hass_ws_client(hass)
freezer.move_to("2025-01-30 13:42:12.345678")
supervisor_client.backups.partial_backup.return_value.job_id = TEST_JOB_ID
supervisor_client.backups.partial_backup.return_value.job_id = UUID(TEST_JOB_ID)
supervisor_client.backups.backup_info.return_value = TEST_BACKUP_DETAILS
supervisor_client.jobs.get_job.return_value = TEST_JOB_NOT_DONE
@@ -1129,7 +1088,7 @@ async def test_reader_writer_create_job_done(
"""Test generating a backup, and backup job finishes early."""
client = await hass_ws_client(hass)
freezer.move_to("2025-01-30 13:42:12.345678")
supervisor_client.backups.partial_backup.return_value.job_id = TEST_JOB_ID
supervisor_client.backups.partial_backup.return_value.job_id = UUID(TEST_JOB_ID)
supervisor_client.backups.backup_info.return_value = TEST_BACKUP_DETAILS
supervisor_client.jobs.get_job.return_value = TEST_JOB_DONE
@@ -1198,7 +1157,7 @@ async def test_reader_writer_create_job_done(
None,
["hassio.local", "hassio.share1", "hassio.share2", "hassio.share3"],
None,
[None, "share1", "share2", "share3"],
[LOCATION_LOCAL_STORAGE, "share1", "share2", "share3"],
False,
[],
),
@@ -1207,7 +1166,7 @@ async def test_reader_writer_create_job_done(
"hunter2",
["hassio.local", "hassio.share1", "hassio.share2", "hassio.share3"],
"hunter2",
[None, "share1", "share2", "share3"],
[LOCATION_LOCAL_STORAGE, "share1", "share2", "share3"],
True,
[],
),
@@ -1225,7 +1184,7 @@ async def test_reader_writer_create_job_done(
"hunter2",
["share1", "share2", "share3"],
True,
[None],
[LOCATION_LOCAL_STORAGE],
),
(
[
@@ -1242,7 +1201,7 @@ async def test_reader_writer_create_job_done(
"hunter2",
["share2", "share3"],
True,
[None, "share1"],
[LOCATION_LOCAL_STORAGE, "share1"],
),
(
[
@@ -1258,7 +1217,7 @@ async def test_reader_writer_create_job_done(
"hunter2",
["hassio.local", "hassio.share1", "hassio.share2", "hassio.share3"],
None,
[None, "share1", "share2"],
[LOCATION_LOCAL_STORAGE, "share1", "share2"],
True,
["share3"],
),
@@ -1274,7 +1233,7 @@ async def test_reader_writer_create_job_done(
"hunter2",
["hassio.local"],
None,
[None],
[LOCATION_LOCAL_STORAGE],
False,
[],
),
@@ -1312,15 +1271,14 @@ async def test_reader_writer_create_per_agent_encryption(
for i in range(1, 4)
],
)
supervisor_client.backups.partial_backup.return_value.job_id = TEST_JOB_ID
supervisor_client.backups.partial_backup.return_value.job_id = UUID(TEST_JOB_ID)
supervisor_client.backups.backup_info.return_value = replace(
TEST_BACKUP_DETAILS,
extra=DEFAULT_BACKUP_OPTIONS.extra,
locations=create_locations,
location_attributes={
location or LOCATION_LOCAL: supervisor_backups.BackupLocationAttributes(
location: supervisor_backups.BackupLocationAttributes(
protected=create_protected,
size_bytes=TEST_BACKUP_DETAILS.size_bytes,
size_bytes=1048576,
)
for location in create_locations
},
@@ -1514,7 +1472,7 @@ async def test_reader_writer_create_missing_reference_error(
) -> None:
"""Test missing reference error when generating a backup."""
client = await hass_ws_client(hass)
supervisor_client.backups.partial_backup.return_value.job_id = TEST_JOB_ID
supervisor_client.backups.partial_backup.return_value.job_id = UUID(TEST_JOB_ID)
supervisor_client.jobs.get_job.return_value = TEST_JOB_NOT_DONE
await client.send_json_auto_id({"type": "backup/subscribe_events"})
@@ -1581,7 +1539,7 @@ async def test_reader_writer_create_download_remove_error(
) -> None:
"""Test download and remove error when generating a backup."""
client = await hass_ws_client(hass)
supervisor_client.backups.partial_backup.return_value.job_id = TEST_JOB_ID
supervisor_client.backups.partial_backup.return_value.job_id = UUID(TEST_JOB_ID)
supervisor_client.backups.backup_info.return_value = TEST_BACKUP_DETAILS_5
supervisor_client.jobs.get_job.return_value = TEST_JOB_NOT_DONE
method_mock = getattr(supervisor_client.backups, method)
@@ -1668,7 +1626,7 @@ async def test_reader_writer_create_info_error(
) -> None:
"""Test backup info error when generating a backup."""
client = await hass_ws_client(hass)
supervisor_client.backups.partial_backup.return_value.job_id = TEST_JOB_ID
supervisor_client.backups.partial_backup.return_value.job_id = UUID(TEST_JOB_ID)
supervisor_client.backups.backup_info.side_effect = exception
supervisor_client.jobs.get_job.return_value = TEST_JOB_NOT_DONE
@@ -1745,7 +1703,7 @@ async def test_reader_writer_create_remote_backup(
"""Test generating a backup which will be uploaded to a remote agent."""
client = await hass_ws_client(hass)
freezer.move_to("2025-01-30 13:42:12.345678")
supervisor_client.backups.partial_backup.return_value.job_id = TEST_JOB_ID
supervisor_client.backups.partial_backup.return_value.job_id = UUID(TEST_JOB_ID)
supervisor_client.backups.backup_info.return_value = TEST_BACKUP_DETAILS_5
supervisor_client.jobs.get_job.return_value = TEST_JOB_NOT_DONE
@@ -1848,7 +1806,7 @@ async def test_reader_writer_create_wrong_parameters(
) -> None:
"""Test generating a backup."""
client = await hass_ws_client(hass)
supervisor_client.backups.partial_backup.return_value.job_id = TEST_JOB_ID
supervisor_client.backups.partial_backup.return_value.job_id = UUID(TEST_JOB_ID)
supervisor_client.backups.backup_info.return_value = TEST_BACKUP_DETAILS
await client.send_json_auto_id({"type": "backup/subscribe_events"})
@@ -1975,7 +1933,7 @@ async def test_reader_writer_restore(
) -> None:
"""Test restoring a backup."""
client = await hass_ws_client(hass)
supervisor_client.backups.partial_restore.return_value.job_id = TEST_JOB_ID
supervisor_client.backups.partial_restore.return_value.job_id = UUID(TEST_JOB_ID)
supervisor_client.backups.list.return_value = [TEST_BACKUP]
supervisor_client.backups.backup_info.return_value = TEST_BACKUP_DETAILS
supervisor_client.jobs.get_job.return_value = get_job_result
@@ -2006,7 +1964,7 @@ async def test_reader_writer_restore(
background=True,
folders=None,
homeassistant=True,
location=None,
location=LOCATION_LOCAL_STORAGE,
password=None,
),
)
@@ -2032,7 +1990,7 @@ async def test_reader_writer_restore(
assert response["result"] is None
@pytest.mark.usefixtures("hassio_client", "setup_integration")
@pytest.mark.usefixtures("hassio_client", "setup_backup_integration")
async def test_reader_writer_restore_report_progress(
hass: HomeAssistant,
hass_ws_client: WebSocketGenerator,
@@ -2040,7 +1998,7 @@ async def test_reader_writer_restore_report_progress(
) -> None:
"""Test restoring a backup."""
client = await hass_ws_client(hass)
supervisor_client.backups.partial_restore.return_value.job_id = TEST_JOB_ID
supervisor_client.backups.partial_restore.return_value.job_id = UUID(TEST_JOB_ID)
supervisor_client.backups.list.return_value = [TEST_BACKUP]
supervisor_client.backups.backup_info.return_value = TEST_BACKUP_DETAILS
supervisor_client.jobs.get_job.return_value = TEST_JOB_NOT_DONE
@@ -2071,7 +2029,7 @@ async def test_reader_writer_restore_report_progress(
background=True,
folders=None,
homeassistant=True,
location=None,
location=LOCATION_LOCAL_STORAGE,
password=None,
),
)
@@ -2193,7 +2151,7 @@ async def test_reader_writer_restore_error(
background=True,
folders=None,
homeassistant=True,
location=None,
location=LOCATION_LOCAL_STORAGE,
password=None,
),
)
@@ -2221,7 +2179,7 @@ async def test_reader_writer_restore_late_error(
) -> None:
"""Test restoring a backup with error."""
client = await hass_ws_client(hass)
supervisor_client.backups.partial_restore.return_value.job_id = TEST_JOB_ID
supervisor_client.backups.partial_restore.return_value.job_id = UUID(TEST_JOB_ID)
supervisor_client.backups.list.return_value = [TEST_BACKUP]
supervisor_client.backups.backup_info.return_value = TEST_BACKUP_DETAILS
supervisor_client.jobs.get_job.return_value = TEST_JOB_NOT_DONE
@@ -2250,7 +2208,7 @@ async def test_reader_writer_restore_late_error(
background=True,
folders=None,
homeassistant=True,
location=None,
location=LOCATION_LOCAL_STORAGE,
password=None,
),
)


@@ -193,13 +193,36 @@ async def test_device_id_migration(
# Create a device with a legacy identifier
device_registry.async_get_or_create(
config_entry_id=config_entry.entry_id,
identifiers={(DOMAIN, 1)}, # type: ignore[arg-type]
identifiers={(DOMAIN, 1), ("Other", "1")}, # type: ignore[arg-type]
)
device_registry.async_get_or_create(
config_entry_id=config_entry.entry_id,
identifiers={("Other", 1)}, # type: ignore[arg-type]
)
assert await hass.config_entries.async_setup(config_entry.entry_id)
await hass.async_block_till_done(wait_background_tasks=True)
assert device_registry.async_get_device({("Other", 1)}) is not None # type: ignore[arg-type]
assert device_registry.async_get_device({(DOMAIN, 1)}) is None # type: ignore[arg-type]
assert device_registry.async_get_device({(DOMAIN, "1")}) is not None
assert device_registry.async_get_device({("Other", "1")}) is not None
async def test_device_id_migration_both_present(
hass: HomeAssistant,
device_registry: dr.DeviceRegistry,
config_entry: MockConfigEntry,
) -> None:
"""Test that legacy non-string devices are removed when both devices present."""
config_entry.add_to_hass(hass)
# Create a device with a legacy identifier AND a new identifier
device_registry.async_get_or_create(
config_entry_id=config_entry.entry_id,
identifiers={(DOMAIN, 1)}, # type: ignore[arg-type]
)
device_registry.async_get_or_create(
config_entry_id=config_entry.entry_id, identifiers={(DOMAIN, "1")}
)
assert await hass.config_entries.async_setup(config_entry.entry_id)
await hass.async_block_till_done(wait_background_tasks=True)
assert device_registry.async_get_device({(DOMAIN, 1)}) is None # type: ignore[arg-type]
assert device_registry.async_get_device({(DOMAIN, "1")}) is not None
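The migration these tests exercise rewrites legacy integer identifiers as strings and drops a legacy device when a string twin already exists. A hypothetical, registry-free sketch of that logic (the list-of-identifier-sets model is an assumption for illustration, not the real `DeviceRegistry` API):

```python
def migrate_identifiers(devices, domain):
    """Coerce this domain's identifier values to str, dropping duplicates.

    devices: list of identifier sets, each a set of (domain, value) tuples.
    Returns the surviving devices; a legacy device whose migrated form
    matches an already-seen device is removed entirely.
    """
    migrated = []
    seen = set()
    for idents in devices:
        new = {(d, str(v)) if d == domain else (d, v) for d, v in idents}
        key = frozenset(new)
        if key in seen:
            continue  # legacy duplicate of a device that already exists
        seen.add(key)
        migrated.append(new)
    return migrated


# Legacy (DOMAIN, 1) becomes (DOMAIN, "1"); the pre-existing string
# twin is then a duplicate and is dropped.
print(migrate_identifiers([{("dom", 1)}, {("dom", "1")}], "dom"))
```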


@@ -391,6 +391,25 @@ async def test_service_call_with_ascii_qos_retain_flags(
blocking=True,
)
assert mqtt_mock.async_publish.called
assert mqtt_mock.async_publish.call_args[0][1] == ""
assert mqtt_mock.async_publish.call_args[0][2] == 2
assert not mqtt_mock.async_publish.call_args[0][3]
mqtt_mock.reset_mock()
# Test service call without payload
await hass.services.async_call(
mqtt.DOMAIN,
mqtt.SERVICE_PUBLISH,
{
mqtt.ATTR_TOPIC: "test/topic",
mqtt.ATTR_QOS: "2",
mqtt.ATTR_RETAIN: "no",
},
blocking=True,
)
assert mqtt_mock.async_publish.called
assert mqtt_mock.async_publish.call_args[0][1] is None
assert mqtt_mock.async_publish.call_args[0][2] == 2
assert not mqtt_mock.async_publish.call_args[0][3]
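The new assertions verify that omitting `payload` from the publish action yields a `None` payload (sent as an empty payload) rather than an error. A minimal sketch of how such a handler might normalize the service data — an assumption for illustration, not the actual Home Assistant MQTT implementation (`build_publish_args` is a hypothetical helper):

```python
from typing import Any


def build_publish_args(data: dict[str, Any]) -> tuple[str, Any, int, bool]:
    """Extract (topic, payload, qos, retain) from service call data."""
    topic = data["topic"]
    payload = data.get("payload")  # None when omitted -> empty payload
    qos = int(data.get("qos", 0))
    retain = str(data.get("retain", "false")).lower() in ("1", "true", "yes")
    return topic, payload, qos, retain


# Mirrors the service call in the test: no payload key at all.
args = build_publish_args({"topic": "test/topic", "qos": "2", "retain": "no"})
print(args)  # ('test/topic', None, 2, False)
```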


@@ -1,6 +1,7 @@
"""Fixtures for OneDrive tests."""
from collections.abc import AsyncIterator, Generator
from json import dumps
import time
from unittest.mock import AsyncMock, MagicMock, patch
@@ -15,11 +16,13 @@ from homeassistant.core import HomeAssistant
from homeassistant.setup import async_setup_component
from .const import (
BACKUP_METADATA,
CLIENT_ID,
CLIENT_SECRET,
MOCK_APPROOT,
MOCK_BACKUP_FILE,
MOCK_BACKUP_FOLDER,
MOCK_METADATA_FILE,
)
from tests.common import MockConfigEntry
@@ -67,8 +70,8 @@ def mock_config_entry(expires_at: int, scopes: list[str]) -> MockConfigEntry:
)
@pytest.fixture(autouse=True)
def mock_onedrive_client() -> Generator[MagicMock]:
@pytest.fixture
def mock_onedrive_client_init() -> Generator[MagicMock]:
"""Return a mocked GraphServiceClient."""
with (
patch(
@@ -80,19 +83,29 @@ def mock_onedrive_client() -> Generator[MagicMock]:
new=onedrive_client,
),
):
client = onedrive_client.return_value
client.get_approot.return_value = MOCK_APPROOT
client.create_folder.return_value = MOCK_BACKUP_FOLDER
client.list_drive_items.return_value = [MOCK_BACKUP_FILE]
client.get_drive_item.return_value = MOCK_BACKUP_FILE
yield onedrive_client
class MockStreamReader:
async def iter_chunked(self, chunk_size: int) -> AsyncIterator[bytes]:
yield b"backup data"
client.download_drive_item.return_value = MockStreamReader()
@pytest.fixture(autouse=True)
def mock_onedrive_client(mock_onedrive_client_init: MagicMock) -> Generator[MagicMock]:
"""Return a mocked GraphServiceClient."""
client = mock_onedrive_client_init.return_value
client.get_approot.return_value = MOCK_APPROOT
client.create_folder.return_value = MOCK_BACKUP_FOLDER
client.list_drive_items.return_value = [MOCK_BACKUP_FILE, MOCK_METADATA_FILE]
client.get_drive_item.return_value = MOCK_BACKUP_FILE
client.upload_file.return_value = MOCK_METADATA_FILE
yield client
class MockStreamReader:
async def iter_chunked(self, chunk_size: int) -> AsyncIterator[bytes]:
yield b"backup data"
async def read(self) -> bytes:
return dumps(BACKUP_METADATA).encode()
client.download_drive_item.return_value = MockStreamReader()
return client
@pytest.fixture
@@ -101,6 +114,7 @@ def mock_large_file_upload_client() -> Generator[AsyncMock]:
with patch(
"homeassistant.components.onedrive.backup.LargeFileUploadClient.upload"
) as mock_upload:
mock_upload.return_value = MOCK_BACKUP_FILE
yield mock_upload


@@ -72,6 +72,29 @@ MOCK_BACKUP_FILE = File(
quick_xor_hash="hash",
),
mime_type="application/x-tar",
description=escape(dumps(BACKUP_METADATA)),
description="",
created_by=CONTRIBUTOR,
)
MOCK_METADATA_FILE = File(
id="id",
name="23e64aec.tar",
size=34519040,
parent_reference=ItemParentReference(
drive_id="mock_drive_id", id="id", path="path"
),
hashes=Hashes(
quick_xor_hash="hash",
),
mime_type="application/x-tar",
description=escape(
dumps(
{
"metadata_version": 2,
"backup_id": "23e64aec",
"backup_file_id": "id",
}
)
),
created_by=CONTRIBUTOR,
)


@@ -152,7 +152,7 @@ async def test_agents_delete(
assert response["success"]
assert response["result"] == {"agent_errors": {}}
mock_onedrive_client.delete_drive_item.assert_called_once()
assert mock_onedrive_client.delete_drive_item.call_count == 2
async def test_agents_upload(


@@ -70,6 +70,7 @@ async def test_full_flow(
hass_client_no_auth: ClientSessionGenerator,
aioclient_mock: AiohttpClientMocker,
mock_setup_entry: AsyncMock,
mock_onedrive_client_init: MagicMock,
) -> None:
"""Check full flow."""
@@ -79,6 +80,10 @@ async def test_full_flow(
await _do_get_token(hass, result, hass_client_no_auth, aioclient_mock)
result = await hass.config_entries.flow.async_configure(result["flow_id"])
# Ensure the token callback is set up correctly
token_callback = mock_onedrive_client_init.call_args[0][0]
assert await token_callback() == "mock-access-token"
assert result["type"] is FlowResultType.CREATE_ENTRY
assert len(hass.config_entries.async_entries(DOMAIN)) == 1
assert len(mock_setup_entry.mock_calls) == 1


@@ -1,5 +1,7 @@
"""Test the OneDrive setup."""
from html import escape
from json import dumps
from unittest.mock import MagicMock
from onedrive_personal_sdk.exceptions import AuthenticationError, OneDriveException
@@ -9,6 +11,7 @@ from homeassistant.config_entries import ConfigEntryState
from homeassistant.core import HomeAssistant
from . import setup_integration
from .const import BACKUP_METADATA, MOCK_BACKUP_FILE
from tests.common import MockConfigEntry
@@ -16,10 +19,20 @@ from tests.common import MockConfigEntry
async def test_load_unload_config_entry(
hass: HomeAssistant,
mock_config_entry: MockConfigEntry,
mock_onedrive_client_init: MagicMock,
mock_onedrive_client: MagicMock,
) -> None:
"""Test loading and unloading the integration."""
await setup_integration(hass, mock_config_entry)
# Ensure the token callback is set up correctly
token_callback = mock_onedrive_client_init.call_args[0][0]
assert await token_callback() == "mock-access-token"
# make sure metadata migration is not called
assert mock_onedrive_client.upload_file.call_count == 0
assert mock_onedrive_client.update_drive_item.call_count == 0
assert mock_config_entry.state is ConfigEntryState.LOADED
await hass.config_entries.async_unload(mock_config_entry.entry_id)
@@ -59,3 +72,32 @@ async def test_get_integration_folder_error(
await setup_integration(hass, mock_config_entry)
assert mock_config_entry.state is ConfigEntryState.SETUP_RETRY
assert "Failed to get backups_9f86d081 folder" in caplog.text
async def test_migrate_metadata_files(
hass: HomeAssistant,
mock_config_entry: MockConfigEntry,
mock_onedrive_client: MagicMock,
) -> None:
"""Test migration of metadata files."""
MOCK_BACKUP_FILE.description = escape(
dumps({**BACKUP_METADATA, "metadata_version": 1})
)
await setup_integration(hass, mock_config_entry)
await hass.async_block_till_done()
mock_onedrive_client.upload_file.assert_called_once()
assert mock_onedrive_client.update_drive_item.call_count == 2
assert mock_onedrive_client.update_drive_item.call_args[1]["data"].description == ""
async def test_migrate_metadata_files_errors(
hass: HomeAssistant,
mock_config_entry: MockConfigEntry,
mock_onedrive_client: MagicMock,
) -> None:
"""Test migration of metadata files errors."""
mock_onedrive_client.list_drive_items.side_effect = OneDriveException()
await setup_integration(hass, mock_config_entry)
assert mock_config_entry.state is ConfigEntryState.SETUP_RETRY


@@ -2,7 +2,7 @@
   "enabled": true,
   "types": 222,
   "options": {
-    "jsonPayload": "{\"notification_type\":\"{{notification_type}}\",\"subject\":\"{{subject}}\",\"message\":\"{{message}}\",\"image\":\"{{image}}\",\"{{media}}\":{\"media_type\":\"{{media_type}}\",\"tmdb_idd\":\"{{media_tmdbid}}\",\"tvdb_id\":\"{{media_tvdbid}}\",\"status\":\"{{media_status}}\",\"status4k\":\"{{media_status4k}}\"},\"{{request}}\":{\"request_id\":\"{{request_id}}\",\"requested_by_email\":\"{{requestedBy_email}}\",\"requested_by_username\":\"{{requestedBy_username}}\",\"requested_by_avatar\":\"{{requestedBy_avatar}}\",\"requested_by_settings_discord_id\":\"{{requestedBy_settings_discordId}}\",\"requested_by_settings_telegram_chat_id\":\"{{requestedBy_settings_telegramChatId}}\"},\"{{issue}}\":{\"issue_id\":\"{{issue_id}}\",\"issue_type\":\"{{issue_type}}\",\"issue_status\":\"{{issue_status}}\",\"reported_by_email\":\"{{reportedBy_email}}\",\"reported_by_username\":\"{{reportedBy_username}}\",\"reported_by_avatar\":\"{{reportedBy_avatar}}\",\"reported_by_settings_discord_id\":\"{{reportedBy_settings_discordId}}\",\"reported_by_settings_telegram_chat_id\":\"{{reportedBy_settings_telegramChatId}}\"},\"{{comment}}\":{\"comment_message\":\"{{comment_message}}\",\"commented_by_email\":\"{{commentedBy_email}}\",\"commented_by_username\":\"{{commentedBy_username}}\",\"commented_by_avatar\":\"{{commentedBy_avatar}}\",\"commented_by_settings_discord_id\":\"{{commentedBy_settings_discordId}}\",\"commented_by_settings_telegram_chat_id\":\"{{commentedBy_settings_telegramChatId}}\"}}",
+    "jsonPayload": "{\"notification_type\":\"{{notification_type}}\",\"subject\":\"{{subject}}\",\"message\":\"{{message}}\",\"image\":\"{{image}}\",\"{{media}}\":{\"media_type\":\"{{media_type}}\",\"tmdb_id\":\"{{media_tmdbid}}\",\"tvdb_id\":\"{{media_tvdbid}}\",\"status\":\"{{media_status}}\",\"status4k\":\"{{media_status4k}}\"},\"{{request}}\":{\"request_id\":\"{{request_id}}\",\"requested_by_email\":\"{{requestedBy_email}}\",\"requested_by_username\":\"{{requestedBy_username}}\",\"requested_by_avatar\":\"{{requestedBy_avatar}}\",\"requested_by_settings_discord_id\":\"{{requestedBy_settings_discordId}}\",\"requested_by_settings_telegram_chat_id\":\"{{requestedBy_settings_telegramChatId}}\"},\"{{issue}}\":{\"issue_id\":\"{{issue_id}}\",\"issue_type\":\"{{issue_type}}\",\"issue_status\":\"{{issue_status}}\",\"reported_by_email\":\"{{reportedBy_email}}\",\"reported_by_username\":\"{{reportedBy_username}}\",\"reported_by_avatar\":\"{{reportedBy_avatar}}\",\"reported_by_settings_discord_id\":\"{{reportedBy_settings_discordId}}\",\"reported_by_settings_telegram_chat_id\":\"{{reportedBy_settings_telegramChatId}}\"},\"{{comment}}\":{\"comment_message\":\"{{comment_message}}\",\"commented_by_email\":\"{{commentedBy_email}}\",\"commented_by_username\":\"{{commentedBy_username}}\",\"commented_by_avatar\":\"{{commentedBy_avatar}}\",\"commented_by_settings_discord_id\":\"{{commentedBy_settings_discordId}}\",\"commented_by_settings_telegram_chat_id\":\"{{commentedBy_settings_telegramChatId}}\"}}",
     "webhookUrl": "http://10.10.10.10:8123/api/webhook/test-webhook-id"
   }
 }
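The fix above renames the misspelled `tmdb_idd` key to `tmdb_id` inside the `jsonPayload` template. Since the template only becomes valid JSON after Overseerr substitutes its `{{placeholder}}` variables, one way to sanity-check such a template offline is to substitute dummy values first. A sketch, using an abridged version of the corrected template:

```python
import json
import re

# Abridged version of the corrected jsonPayload template; keys and
# placeholders follow the full template above.
template = (
    '{"notification_type":"{{notification_type}}",'
    '"{{media}}":{"media_type":"{{media_type}}","tmdb_id":"{{media_tmdbid}}"}}'
)

# Replace every {{name}} placeholder with a dummy value so the template
# can be parsed as JSON without a live Overseerr instance.
substituted = re.sub(r"\{\{\w+\}\}", "x", template)
payload = json.loads(substituted)

assert "tmdb_id" in payload["x"]
```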


@@ -52,7 +52,7 @@
     'last_changed': <ANY>,
     'last_reported': <ANY>,
     'last_updated': <ANY>,
-    'state': '15.2',
+    'state': 'unknown',
   })
 # ---
 # name: test_blu_trv_number_entity[number.trv_name_valve_position-entry]
# name: test_blu_trv_number_entity[number.trv_name_valve_position-entry]


@@ -417,24 +417,23 @@ async def test_blu_trv_number_entity(
     assert entry == snapshot(name=f"{entity_id}-entry")


-async def test_blu_trv_set_value(
-    hass: HomeAssistant,
-    mock_blu_trv: Mock,
-    monkeypatch: pytest.MonkeyPatch,
+async def test_blu_trv_ext_temp_set_value(
+    hass: HomeAssistant, mock_blu_trv: Mock
 ) -> None:
-    """Test the set value action for BLU TRV number entity."""
+    """Test the set value action for BLU TRV External Temperature number entity."""
     await init_integration(hass, 3, model=MODEL_BLU_GATEWAY_GEN3)

     entity_id = f"{NUMBER_DOMAIN}.trv_name_external_temperature"

-    assert hass.states.get(entity_id).state == "15.2"
+    # After HA start the state should be unknown because there was no previous external
+    # temperature report
+    assert hass.states.get(entity_id).state is STATE_UNKNOWN

-    monkeypatch.setitem(mock_blu_trv.status["blutrv:200"], "current_C", 22.2)
     await hass.services.async_call(
         NUMBER_DOMAIN,
         SERVICE_SET_VALUE,
         {
-            ATTR_ENTITY_ID: f"{NUMBER_DOMAIN}.trv_name_external_temperature",
+            ATTR_ENTITY_ID: entity_id,
             ATTR_VALUE: 22.2,
         },
         blocking=True,
@@ -451,3 +450,44 @@
     )

     assert hass.states.get(entity_id).state == "22.2"
+
+
+async def test_blu_trv_valve_pos_set_value(
+    hass: HomeAssistant,
+    mock_blu_trv: Mock,
+    monkeypatch: pytest.MonkeyPatch,
+) -> None:
+    """Test the set value action for BLU TRV Valve Position number entity."""
+    # disable automatic temperature control to enable valve position entity
+    monkeypatch.setitem(mock_blu_trv.config["blutrv:200"], "enable", False)
+
+    await init_integration(hass, 3, model=MODEL_BLU_GATEWAY_GEN3)
+
+    entity_id = f"{NUMBER_DOMAIN}.trv_name_valve_position"
+
+    assert hass.states.get(entity_id).state == "0"
+
+    monkeypatch.setitem(mock_blu_trv.status["blutrv:200"], "pos", 20)
+    await hass.services.async_call(
+        NUMBER_DOMAIN,
+        SERVICE_SET_VALUE,
+        {
+            ATTR_ENTITY_ID: entity_id,
+            ATTR_VALUE: 20.0,
+        },
+        blocking=True,
+    )
+    mock_blu_trv.mock_update()
+
+    mock_blu_trv.call_rpc.assert_called_once_with(
+        "BluTRV.Call",
+        {
+            "id": 200,
+            "method": "Trv.SetPosition",
+            "params": {"id": 0, "pos": 20},
+        },
+        BLU_TRV_TIMEOUT,
+    )
+    # device only accepts int for 'pos' value
+    assert isinstance(mock_blu_trv.call_rpc.call_args[0][1]["params"]["pos"], int)
+
+    assert hass.states.get(entity_id).state == "20"
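The final `isinstance` assertion above pins down a device quirk: the TRV firmware rejects a float for `pos`, so the `20.0` delivered by the number entity must arrive at the RPC layer as an int. A minimal sketch of that coercion (the helper name is illustrative, not the integration's actual code):

```python
def build_set_position_params(trv_id: int, value: float) -> dict:
    # The device only accepts an integer for "pos", so coerce the float
    # delivered by the number entity before building the RPC params.
    return {"id": trv_id, "pos": int(value)}


params = build_set_position_params(0, 20.0)
assert params == {"id": 0, "pos": 20}
assert isinstance(params["pos"], int)
```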


@@ -184,7 +184,7 @@ async def test_send_message_thread(hass: HomeAssistant, webhook_platform) -> None:
     assert len(events) == 1
     assert events[0].context == context
-    assert events[0].data[ATTR_MESSAGE_THREAD_ID] == "123"
+    assert events[0].data[ATTR_MESSAGE_THREAD_ID] == 123


 async def test_webhook_endpoint_generates_telegram_text_event(


@@ -553,6 +553,17 @@ async def test_control_error_handling(
     assert client.play.call_count == int(is_on)


+async def test_turn_off_when_device_is_off(hass: HomeAssistant, client) -> None:
+    """Test no error when turning off device that is already off."""
+    await setup_webostv(hass)
+    client.is_on = False
+    await client.mock_state_update()
+
+    data = {ATTR_ENTITY_ID: ENTITY_ID}
+    await hass.services.async_call(MP_DOMAIN, SERVICE_TURN_OFF, data, True)
+
+    assert client.power_off.call_count == 1
+
+
 async def test_supported_features(hass: HomeAssistant, client) -> None:
     """Test supported features."""
     client.sound_output = "lineout"


@@ -1914,9 +1914,18 @@ async def test_options_flow_migration_reset_old_adapter(
     assert result4["step_id"] == "choose_serial_port"


-async def test_config_flow_port_yellow_port_name(hass: HomeAssistant) -> None:
+@pytest.mark.parametrize(
+    "device",
+    [
+        "/dev/ttyAMA1",  # CM4
+        "/dev/ttyAMA10",  # CM5, erroneously detected by pyserial
+    ],
+)
+async def test_config_flow_port_yellow_port_name(
+    hass: HomeAssistant, device: str
+) -> None:
     """Test config flow serial port name for Yellow Zigbee radio."""
-    port = com_port(device="/dev/ttyAMA1")
+    port = com_port(device=device)
     port.serial_number = None
     port.manufacturer = None
     port.description = None
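The new `/dev/ttyAMA10` case above exists because substring matching on device paths is a trap: `/dev/ttyAMA1` is a prefix of `/dev/ttyAMA10`, so a naive check for the CM4's UART would also accept the extra port that pyserial reports erroneously on the CM5. A small sketch of the pitfall:

```python
candidates = ["/dev/ttyAMA1", "/dev/ttyAMA10"]

# Prefix matching conflates the CM4 UART with the CM5's extra port.
naive = [p for p in candidates if p.startswith("/dev/ttyAMA1")]
assert naive == ["/dev/ttyAMA1", "/dev/ttyAMA10"]

# Exact comparison selects only the intended radio port.
exact = [p for p in candidates if p == "/dev/ttyAMA1"]
assert exact == ["/dev/ttyAMA1"]
```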


@@ -363,20 +363,24 @@ async def test_component_failing_setup(hass: HomeAssistant) -> None:

 async def test_component_exception_setup(hass: HomeAssistant) -> None:
     """Test component that raises exception during setup."""
-    setup.async_set_domains_to_be_loaded(hass, {"comp"})
+    domain = "comp"
+    setup.async_set_domains_to_be_loaded(hass, {domain})

     def exception_setup(hass: HomeAssistant, config: ConfigType) -> bool:
         """Raise exception."""
         raise Exception("fail!")  # noqa: TRY002

-    mock_integration(hass, MockModule("comp", setup=exception_setup))
+    mock_integration(hass, MockModule(domain, setup=exception_setup))

-    assert not await setup.async_setup_component(hass, "comp", {})
-    assert "comp" not in hass.config.components
+    assert not await setup.async_setup_component(hass, domain, {})
+    assert domain in hass.data[setup.DATA_SETUP]
+    assert domain not in hass.data[setup.DATA_SETUP_DONE]
+    assert domain not in hass.config.components


 async def test_component_base_exception_setup(hass: HomeAssistant) -> None:
     """Test component that raises exception during setup."""
+    domain = "comp"
     setup.async_set_domains_to_be_loaded(hass, {"comp"})

     def exception_setup(hass: HomeAssistant, config: ConfigType) -> bool:
@@ -389,7 +393,69 @@ async def test_component_base_exception_setup(hass: HomeAssistant) -> None:
         await setup.async_setup_component(hass, "comp", {})

     assert str(exc_info.value) == "fail!"

-    assert "comp" not in hass.config.components
+    assert domain in hass.data[setup.DATA_SETUP]
+    assert domain not in hass.data[setup.DATA_SETUP_DONE]
+    assert domain not in hass.config.components
+
+
+async def test_set_domains_to_be_loaded(hass: HomeAssistant) -> None:
+    """Test async_set_domains_to_be_loaded."""
+    domain_good = "comp_good"
+    domain_bad = "comp_bad"
+    domain_base_exception = "comp_base_exception"
+    domain_exception = "comp_exception"
+    domains = {domain_good, domain_bad, domain_exception, domain_base_exception}
+
+    setup.async_set_domains_to_be_loaded(hass, domains)
+    assert set(hass.data[setup.DATA_SETUP_DONE]) == domains
+    setup_done = dict(hass.data[setup.DATA_SETUP_DONE])
+
+    # Calling async_set_domains_to_be_loaded again should not create new futures
+    setup.async_set_domains_to_be_loaded(hass, domains)
+    assert setup_done == hass.data[setup.DATA_SETUP_DONE]
+
+    def good_setup(hass: HomeAssistant, config: ConfigType) -> bool:
+        """Success."""
+        return True
+
+    def bad_setup(hass: HomeAssistant, config: ConfigType) -> bool:
+        """Fail."""
+        return False
+
+    def base_exception_setup(hass: HomeAssistant, config: ConfigType) -> bool:
+        """Raise exception."""
+        raise BaseException("fail!")  # noqa: TRY002
+
+    def exception_setup(hass: HomeAssistant, config: ConfigType) -> bool:
+        """Raise exception."""
+        raise Exception("fail!")  # noqa: TRY002
+
+    mock_integration(hass, MockModule(domain_good, setup=good_setup))
+    mock_integration(hass, MockModule(domain_bad, setup=bad_setup))
+    mock_integration(
+        hass, MockModule(domain_base_exception, setup=base_exception_setup)
+    )
+    mock_integration(hass, MockModule(domain_exception, setup=exception_setup))
+
+    # Set up the four components
+    assert await setup.async_setup_component(hass, domain_good, {})
+    assert not await setup.async_setup_component(hass, domain_bad, {})
+    assert not await setup.async_setup_component(hass, domain_exception, {})
+    with pytest.raises(BaseException, match="fail!"):
+        await setup.async_setup_component(hass, domain_base_exception, {})
+
+    # Check the result of the setup
+    assert not hass.data[setup.DATA_SETUP_DONE]
+    assert set(hass.data[setup.DATA_SETUP]) == {
+        domain_bad,
+        domain_exception,
+        domain_base_exception,
+    }
+    assert set(hass.config.components) == {domain_good}
+
+    # Calling async_set_domains_to_be_loaded again should not create any new futures
+    setup.async_set_domains_to_be_loaded(hass, domains)
+    assert not hass.data[setup.DATA_SETUP_DONE]
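The repeated "should not create new futures" checks above guard the behavior fixed in "Don't overwrite setup state in async_set_domains_to_be_loaded": registering a domain a second time must not replace a future already created for it. A minimal model of that idempotency, using `dict.setdefault` (a hypothetical sketch, not Home Assistant's actual implementation):

```python
import asyncio

setup_done: dict[str, asyncio.Future] = {}


def set_domains_to_be_loaded(
    loop: asyncio.AbstractEventLoop, domains: set[str]
) -> None:
    for domain in domains:
        # setdefault leaves an existing future untouched, so a repeated
        # call cannot overwrite state recorded for an in-flight setup.
        setup_done.setdefault(domain, loop.create_future())


async def main() -> bool:
    loop = asyncio.get_running_loop()
    set_domains_to_be_loaded(loop, {"comp"})
    first = setup_done["comp"]
    set_domains_to_be_loaded(loop, {"comp"})
    # True when the second registration kept the original future.
    return setup_done["comp"] is first


unchanged = asyncio.run(main())
assert unchanged
```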
async def test_component_setup_with_validation_and_dependency(