Compare commits

..

63 Commits

Author SHA1 Message Date
Paulus Schoutsen 0cb0136b2f Bumped version to 2021.12.0b4 2021-12-08 11:02:14 -08:00
Paulus Schoutsen 36eca38be2 don't convert GTFS timestamp to UTC in timestamp sensor (#61221) 2021-12-08 11:02:05 -08:00
Paulus Schoutsen 0b470bb8fb Fix follow-up review comment for bbox (#61219) 2021-12-08 11:02:04 -08:00
Paulus Schoutsen 030ac3d762 Fix yandex_transport timestamp sensor (#61217) 2021-12-08 11:02:04 -08:00
Paulus Schoutsen b5b2c3cc0d Fix vallox timestamp sensor (#61216)
* Fix vallox timestamp sensor

* Change old state type
2021-12-08 11:02:03 -08:00
Paulus Schoutsen 7940aab4c5 Fix repetier timestamp sensors (#61214) 2021-12-08 11:02:02 -08:00
Paulus Schoutsen 2513347e27 Fix oasa_telematics timestamp sensor (#61213) 2021-12-08 11:02:01 -08:00
Paulus Schoutsen e6b784e4f2 Fix nextbus timestamp sensor (#61212) 2021-12-08 11:02:00 -08:00
Paulus Schoutsen d080c31583 Fix modern_forms timestmap sensors (#61211) 2021-12-08 11:01:59 -08:00
Paulus Schoutsen e68dcff3f3 Fix meteo_france timestamp sensor (#61210) 2021-12-08 11:01:58 -08:00
Paulus Schoutsen 66fa6dff93 Fix lyric timestamp sensor (#61209)
* Fix lyric timestamp sensor

* Update type
2021-12-08 11:01:57 -08:00
Paulus Schoutsen d533aba4f9 Fix litterrobot timestamp sensor (#61208)
* Fix litterrobot timestamp sensor

* Update type
2021-12-08 11:01:56 -08:00
Paulus Schoutsen 700eaf8794 Fix islamic prayer times timestamp sensor (#61207) 2021-12-08 11:01:56 -08:00
Paulus Schoutsen 7583d9a409 Fix hydrawise timestamp sensor (#61206) 2021-12-08 11:01:55 -08:00
Paulus Schoutsen dc3ece447b Fix hvv_departures timestamp sensor (#61205) 2021-12-08 11:01:54 -08:00
Paulus Schoutsen 2c0e406c1b Fix gtfs timestamp sensor (#61204) 2021-12-08 11:01:53 -08:00
Paulus Schoutsen 67c808bde9 Fix flipr timestamp sensor (#61203) 2021-12-08 11:01:52 -08:00
Paulus Schoutsen 2fa2a2e6d4 Fix bbox timestamp (#61202) 2021-12-08 11:01:52 -08:00
Paulus Schoutsen 8735395144 Fix Rova using strings as timestamp (#61201) 2021-12-08 11:01:51 -08:00
J. Nick Koston 428129cad7 Fix log spam from flux_led 0x08 devices when in music mode (#61196) 2021-12-08 11:01:50 -08:00
puddly 64c52aecef Bump ZHA dependency zigpy-znp from 0.6.3 to 0.6.4 (#61194) 2021-12-08 11:01:49 -08:00
J. Nick Koston 04a2e1fd7b Fix uncaught exception in bond config flow (#61184) 2021-12-08 11:01:49 -08:00
Robert Blomqvist bdc37e9353 Rephrase upgrade notification message to avoid installing Python 3.10 (#61181) 2021-12-08 11:01:48 -08:00
Jan Bouwhuis a581095bd0 Fix pvoutput template use and REST integer parsing (#61171)
* Fix pvoutput template use and REST integer parsing

* revert accepting templates as input
2021-12-08 11:01:47 -08:00
Erik Montnemery 707e501511 Skip duplicated data when calculating fossil energy consumption (#60599) 2021-12-08 11:01:46 -08:00
Paulus Schoutsen 9f1701f557 Bumped version to 2021.12.0b3 2021-12-07 12:54:28 -08:00
Charles Garwood 61545edd96 Remove loopenergy integration (#61175)
* Remove loopenergy integration

* Fix requirements_all.txt

* Fix requirements_test_all.txt
2021-12-07 12:54:22 -08:00
Allen Porter e09c85c591 Bump nest to 0.4.5 to fix media player event expiration (#61174) 2021-12-07 12:54:21 -08:00
einarhauks fecfbba442 Display energy in wh instead of kWh (#61169) 2021-12-07 12:54:21 -08:00
Aaron Bach 13ce6edc68 Bump py17track to 2021.12.2 (#61166) 2021-12-07 12:54:20 -08:00
Tobias Sauerwein 816b5af883 Fix Netatmo climate issue (#61154)
Signed-off-by: cgtobi <cgtobi@gmail.com>
2021-12-07 12:54:19 -08:00
Erik Montnemery 78ada630c0 Guard against missing states in Alexa state updates (#61152) 2021-12-07 12:54:18 -08:00
Marcel van der Veldt 4ad904f3b7 Change check for existence of options flow (#61147)
Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
Co-authored-by: Paulus Schoutsen <balloob@gmail.com>
2021-12-07 12:54:16 -08:00
Fredrik Erlandsson fa447332c6 Fix point availability (#61144) 2021-12-07 12:54:15 -08:00
Erik Montnemery 8da3756602 Bump hatasmota to 0.3.1 (#61120) 2021-12-07 12:54:15 -08:00
G Johansson 01adc6a042 Improve code quality trafikverket_weatherstation (#61044)
* Code quality trafikverket_weatherstation

* Updates from review

* Fix extra attributes settings

* Fix for additional review comments
2021-12-07 12:54:14 -08:00
Paulus Schoutsen d105e9f99e Bumped version to 2021.12.0b2 2021-12-06 15:54:09 -08:00
Paulus Schoutsen 348079f069 Bump frontend to 20211206.0 (#61133) 2021-12-06 15:54:01 -08:00
Aaron Bach 86f5165e4c Deprecate entity_id parameter in Guardian service calls (#61129) 2021-12-06 15:54:00 -08:00
jjlawren b6d012222a Improve Sonos activity debug logging (#61122) 2021-12-06 15:53:59 -08:00
Aaron Bach 0532c22069 Bump simplisafe-python to 2021.12.0 (#61121) 2021-12-06 15:53:58 -08:00
Marcel van der Veldt d1672a1e9a Remove colon from default entity name in Hue integration (#61118) 2021-12-06 15:53:58 -08:00
Paulus Schoutsen 725e3046db Return native timestamps for home connect (#61116) 2021-12-06 15:53:57 -08:00
Paulus Schoutsen 325aa66b8c Bump aiohue to 3.0.2 (#61115) 2021-12-06 15:53:56 -08:00
Erik Montnemery 3ba07ce395 Fix CO2 calculation when data is missing (#61106) 2021-12-06 15:53:56 -08:00
Martin Hjelmare 21463121a7 Improve zwave_js add-on config flow description (#61099) 2021-12-06 15:53:55 -08:00
Marcel van der Veldt ef0f3f7ce9 Fix migration of entities of Hue integration (#61095)
* fix device name in log

* Fix Hue migration for all id versions

* fix tests

* typo

* change to bit more universal approach

* fix test again

* formatting
2021-12-06 15:53:54 -08:00
epenet cb371ef27c Prevent log flooding in frame helper (#61085)
Co-authored-by: epenet <epenet@users.noreply.github.com>
2021-12-06 15:53:54 -08:00
J. Nick Koston 878700e26f Provide a hint on which username to use for enphase_envoy (#61084) 2021-12-06 15:53:53 -08:00
J. Nick Koston e09245eb14 Fix missing unique id in enphase_envoy (#61083) 2021-12-06 15:53:52 -08:00
J. Nick Koston 20fb06484c Bump enphase_envoy to 0.20.1 (#61082) 2021-12-06 15:53:51 -08:00
Allen Porter f4a38c0190 Coalesce nest media source preview clips by session and bump google-nest-sdm (#61081) 2021-12-06 15:53:50 -08:00
Allen Porter fa33464217 Remove unnecessary explicit use of OrderedDict in nest media source (#61054)
Address follow up PR comments from #60073
2021-12-06 15:53:49 -08:00
Alexander Pitkin bd239bcbed Fix yandex transport for Belarus (#61080) 2021-12-06 15:52:00 -08:00
Aaron Bach d5f3e2a761 Deprecate system_id parameter in SimpliSafe service calls (#61076) 2021-12-06 15:51:59 -08:00
J. Nick Koston ec88a42948 Abort flux_led discovery if another device gets the ip (#61074)
- If the dhcp reservation expired for the device that
  was at the ip and a new flux_led device appears we
  would discover it because the unique_id did not match
2021-12-06 15:51:59 -08:00
Alexei Chetroi a3ede8f895 Add 3157100-E model to Centralite thermostat (#61073) 2021-12-06 15:51:58 -08:00
J. Nick Koston 23ebde58cd Bump flux_led to 0.25.17 to fix missing push messages on 0xA3 models (#61070) 2021-12-06 15:51:57 -08:00
Allen Porter 0c87885f41 Fix regression in nest event media player with multiple devices (#61064) 2021-12-06 15:51:56 -08:00
Aaron Bach c159790caf Fix mispelling in SimpliSafe service description (#61058) 2021-12-06 15:51:56 -08:00
Allen Porter 056575f491 Add debug logging for pip install command (#61057) 2021-12-06 15:51:55 -08:00
Jérôme W e4d9d0d83e Add media player volume control in fr-FR with Alexa (#60489)
* media player volume control in `fr-FR` with Alexa

* Apply suggestions from code review

Co-authored-by: Erik Montnemery <erik@montnemery.com>
2021-12-06 15:51:54 -08:00
schreyack 34f728e5d2 Fix previous setting briefly appearing on newer flux_led devices when turning on (#60004)
Co-authored-by: J. Nick Koston <nick@koston.org>
2021-12-06 15:51:54 -08:00
94 changed files with 1544 additions and 557 deletions
-1
@@ -597,7 +597,6 @@ omit =
homeassistant/components/lookin/models.py
homeassistant/components/lookin/sensor.py
homeassistant/components/lookin/climate.py
homeassistant/components/loopenergy/sensor.py
homeassistant/components/luci/device_tracker.py
homeassistant/components/luftdaten/__init__.py
homeassistant/components/luftdaten/sensor.py
-1
@@ -293,7 +293,6 @@ homeassistant/components/local_ip/* @issacg
homeassistant/components/logger/* @home-assistant/core
homeassistant/components/logi_circle/* @evanjd
homeassistant/components/lookin/* @ANMalko
homeassistant/components/loopenergy/* @pavoni
homeassistant/components/lovelace/* @home-assistant/frontend
homeassistant/components/luci/* @mzdrale
homeassistant/components/luftdaten/* @fabaff
+1 -2
@@ -252,8 +252,7 @@ async def async_from_config_dict(
f"{'.'.join(str(x) for x in sys.version_info[:3])} is deprecated and will "
f"be removed in Home Assistant {REQUIRED_NEXT_PYTHON_HA_RELEASE}. "
"Please upgrade Python to "
f"{'.'.join(str(x) for x in REQUIRED_NEXT_PYTHON_VER)} or "
"higher."
f"{'.'.join(str(x) for x in REQUIRED_NEXT_PYTHON_VER[:2])}."
)
_LOGGER.warning(msg)
hass.components.persistent_notification.async_create(
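The hunk above (PR #61181) rewrites the deprecation message to recommend only the next required major.minor release, via `REQUIRED_NEXT_PYTHON_VER[:2]`, instead of "X.Y.Z or higher", so the notification can no longer be read as an invitation to install Python 3.10. A stand-alone sketch of the string change; the helper name and the version tuple value are illustrative, not from the source:

```python
# Hypothetical stand-in for Home Assistant's version constant.
REQUIRED_NEXT_PYTHON_VER = (3, 9, 0)

def upgrade_hint(ver: tuple = REQUIRED_NEXT_PYTHON_VER) -> str:
    # Recommend only major.minor, dropping the old "or higher" suffix.
    return "Please upgrade Python to " + ".".join(str(x) for x in ver[:2]) + "."
```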
@@ -695,6 +695,7 @@ class AlexaSpeaker(AlexaCapability):
"en-US",
"es-ES",
"es-MX",
"fr-FR", # Not documented as of 2021-12-04, see PR #60489
"it-IT",
"ja-JP",
}
@@ -752,6 +753,7 @@ class AlexaStepSpeaker(AlexaCapability):
"en-IN",
"en-US",
"es-ES",
"fr-FR", # Not documented as of 2021-12-04, see PR #60489
"it-IT",
}
@@ -182,12 +182,13 @@ async def async_send_add_or_update_message(hass, config, entity_ids):
endpoints = []
for entity_id in entity_ids:
domain = entity_id.split(".", 1)[0]
if domain not in ENTITY_ADAPTERS:
if (domain := entity_id.split(".", 1)[0]) not in ENTITY_ADAPTERS:
continue
alexa_entity = ENTITY_ADAPTERS[domain](hass, config, hass.states.get(entity_id))
if (state := hass.states.get(entity_id)) is None:
continue
alexa_entity = ENTITY_ADAPTERS[domain](hass, config, state)
endpoints.append(alexa_entity.serialize_discovery())
payload = {"endpoints": endpoints, "scope": {"type": "BearerToken", "token": token}}
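The Alexa hunk above folds the domain lookup into an assignment expression and adds a guard for entities whose state is missing before serializing. A simplified stand-alone sketch of that guard pattern; the adapter dict and function names are hypothetical stand-ins, not Home Assistant APIs:

```python
# Hypothetical adapters keyed by entity domain.
ADAPTERS = {"light": str.upper, "switch": str.lower}

def serialize_endpoints(entity_ids, states):
    """Skip unknown domains and entities without a state, as in the hunk above."""
    endpoints = []
    for entity_id in entity_ids:
        # The walrus operator combines the split and the membership test.
        if (domain := entity_id.split(".", 1)[0]) not in ADAPTERS:
            continue
        # Guard against a missing state instead of passing None to the adapter.
        if (state := states.get(entity_id)) is None:
            continue
        endpoints.append(ADAPTERS[domain](state))
    return endpoints
```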
+1 -2
@@ -131,10 +131,9 @@ class BboxUptimeSensor(SensorEntity):
def update(self):
"""Get the latest data from Bbox and update the state."""
self.bbox_data.update()
uptime = utcnow() - timedelta(
self._attr_native_value = utcnow() - timedelta(
seconds=self.bbox_data.router_infos["device"]["uptime"]
)
self._attr_native_value = uptime.replace(microsecond=0).isoformat()
class BboxSensor(SensorEntity):
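The bbox change above, like the other timestamp-sensor fixes in this batch, stores a timezone-aware `datetime` as the sensor's native value instead of an `.isoformat()` string, truncating microseconds. A minimal stdlib sketch of that computation (function name illustrative):

```python
from __future__ import annotations

from datetime import datetime, timedelta, timezone

def boot_time(uptime_seconds: float, now: datetime | None = None) -> datetime:
    """Return the device boot time as an aware datetime, microseconds stripped."""
    now = now or datetime.now(timezone.utc)
    return (now - timedelta(seconds=uptime_seconds)).replace(microsecond=0)
```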
+4 -1
@@ -87,7 +87,10 @@ class ConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
return
self._discovered[CONF_ACCESS_TOKEN] = token
_, hub_name = await _validate_input(self.hass, self._discovered)
try:
_, hub_name = await _validate_input(self.hass, self._discovered)
except InputValidationError:
return
self._discovered[CONF_NAME] = hub_name
async def async_step_zeroconf(
@@ -366,13 +366,11 @@ async def ignore_config_flow(hass, connection, msg):
def entry_json(entry: config_entries.ConfigEntry) -> dict:
"""Return JSON value of a config entry."""
handler = config_entries.HANDLERS.get(entry.domain)
supports_options = (
# Guard in case handler is no longer registered (custom component etc)
handler is not None
# pylint: disable=comparison-with-callable
and handler.async_get_options_flow
!= config_entries.ConfigFlow.async_get_options_flow
# work out if handler has support for options flow
supports_options = handler is not None and handler.async_supports_options_flow(
entry
)
return {
"entry_id": entry.entry_id,
"domain": entry.domain,
@@ -274,14 +274,16 @@ async def ws_get_fossil_energy_consumption(
) -> dict[datetime, float]:
"""Combine multiple statistics, returns a dict indexed by start time."""
result: defaultdict[datetime, float] = defaultdict(float)
seen: defaultdict[datetime, set[str]] = defaultdict(set)
for statistics_id, stat in stats.items():
if statistics_id not in statistic_ids:
continue
for period in stat:
if period["sum"] is None:
if period["sum"] is None or statistics_id in seen[period["start"]]:
continue
result[period["start"]] += period["sum"]
seen[period["start"]].add(statistics_id)
return {key: result[key] for key in sorted(result)}
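The hunk above (PR #60599) adds a `seen` set per period so a statistics_id contributes at most once to each period's sum. A simplified stand-alone version of that dedup-while-accumulating pattern, using plain ints as period keys instead of datetimes:

```python
from collections import defaultdict

def combine_statistics(stats, statistic_ids):
    """Sum per-period values, counting each statistics_id once per period."""
    result = defaultdict(float)
    seen = defaultdict(set)  # period start -> ids already counted
    for statistics_id, periods in stats.items():
        if statistics_id not in statistic_ids:
            continue
        for period in periods:
            # Skip missing sums and duplicated rows for the same period.
            if period["sum"] is None or statistics_id in seen[period["start"]]:
                continue
            result[period["start"]] += period["sum"]
            seen[period["start"]].add(statistics_id)
    return {start: result[start] for start in sorted(result)}
```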
@@ -303,6 +305,8 @@ async def ws_get_fossil_energy_consumption(
"""Reduce hourly deltas to daily or monthly deltas."""
result: list[dict[str, Any]] = []
deltas: list[float] = []
if not stat_list:
return result
prev_stat: dict[str, Any] = stat_list[0]
# Loop over the hourly deltas + a fake entry to end the period
@@ -75,6 +75,14 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
envoy_reader.get_inverters = False
await coordinator.async_config_entry_first_refresh()
if not entry.unique_id:
try:
serial = await envoy_reader.get_full_serial_number()
except httpx.HTTPError:
pass
else:
hass.config_entries.async_update_entry(entry, unique_id=serial)
hass.data.setdefault(DOMAIN, {})[entry.entry_id] = {
COORDINATOR: coordinator,
NAME: name,
@@ -1,6 +1,7 @@
"""Config flow for Enphase Envoy integration."""
from __future__ import annotations
import contextlib
import logging
from typing import Any
@@ -31,7 +32,7 @@ ENVOY = "Envoy"
CONF_SERIAL = "serial"
async def validate_input(hass: HomeAssistant, data: dict[str, Any]) -> dict[str, Any]:
async def validate_input(hass: HomeAssistant, data: dict[str, Any]) -> EnvoyReader:
"""Validate the user input allows us to connect."""
envoy_reader = EnvoyReader(
data[CONF_HOST],
@@ -48,6 +49,8 @@ async def validate_input(hass: HomeAssistant, data: dict[str, Any]) -> dict[str,
except (RuntimeError, httpx.HTTPError) as err:
raise CannotConnect from err
return envoy_reader
class ConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
"""Handle a config flow for Enphase Envoy."""
@@ -59,7 +62,6 @@ class ConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
self.ip_address = None
self.name = None
self.username = None
self.serial = None
self._reauth_entry = None
@callback
@@ -104,8 +106,8 @@ class ConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
self, discovery_info: zeroconf.ZeroconfServiceInfo
) -> FlowResult:
"""Handle a flow initialized by zeroconf discovery."""
self.serial = discovery_info.properties["serialnum"]
await self.async_set_unique_id(self.serial)
serial = discovery_info.properties["serialnum"]
await self.async_set_unique_id(serial)
self.ip_address = discovery_info.host
self._abort_if_unique_id_configured({CONF_HOST: self.ip_address})
for entry in self._async_current_entries(include_ignore=False):
@@ -114,9 +116,9 @@ class ConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
and CONF_HOST in entry.data
and entry.data[CONF_HOST] == self.ip_address
):
title = f"{ENVOY} {self.serial}" if entry.title == ENVOY else ENVOY
title = f"{ENVOY} {serial}" if entry.title == ENVOY else ENVOY
self.hass.config_entries.async_update_entry(
entry, title=title, unique_id=self.serial
entry, title=title, unique_id=serial
)
self.hass.async_create_task(
self.hass.config_entries.async_reload(entry.entry_id)
@@ -132,6 +134,24 @@ class ConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
)
return await self.async_step_user()
def _async_envoy_name(self) -> str:
"""Return the name of the envoy."""
if self.name:
return self.name
if self.unique_id:
return f"{ENVOY} {self.unique_id}"
return ENVOY
async def _async_set_unique_id_from_envoy(self, envoy_reader: EnvoyReader) -> bool:
"""Set the unique id by fetching it from the envoy."""
serial = None
with contextlib.suppress(httpx.HTTPError):
serial = await envoy_reader.get_full_serial_number()
if serial:
await self.async_set_unique_id(serial)
return True
return False
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> FlowResult:
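The `_async_set_unique_id_from_envoy` helper above uses `contextlib.suppress(httpx.HTTPError)` so a transient fetch failure leaves the serial unset instead of aborting the flow. A stdlib sketch of that pattern, with a stand-in exception class since `httpx` is not assumed here:

```python
from __future__ import annotations

import contextlib

class TransientNetworkError(Exception):
    """Stand-in for httpx.HTTPError in the flow above."""

def read_serial(fetch) -> str | None:
    """Return the device serial, or None if the fetch fails transiently."""
    serial = None
    with contextlib.suppress(TransientNetworkError):
        serial = fetch()
    return serial
```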
@@ -145,7 +165,7 @@ class ConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
):
return self.async_abort(reason="already_configured")
try:
await validate_input(self.hass, user_input)
envoy_reader = await validate_input(self.hass, user_input)
except CannotConnect:
errors["base"] = "cannot_connect"
except InvalidAuth:
@@ -155,21 +175,28 @@ class ConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
errors["base"] = "unknown"
else:
data = user_input.copy()
if self.serial:
data[CONF_NAME] = f"{ENVOY} {self.serial}"
else:
data[CONF_NAME] = self.name or ENVOY
data[CONF_NAME] = self._async_envoy_name()
if self._reauth_entry:
self.hass.config_entries.async_update_entry(
self._reauth_entry,
data=data,
)
return self.async_abort(reason="reauth_successful")
if not self.unique_id and await self._async_set_unique_id_from_envoy(
envoy_reader
):
data[CONF_NAME] = self._async_envoy_name()
if self.unique_id:
self._abort_if_unique_id_configured({CONF_HOST: data[CONF_HOST]})
return self.async_create_entry(title=data[CONF_NAME], data=data)
if self.serial:
if self.unique_id:
self.context["title_placeholders"] = {
CONF_SERIAL: self.serial,
CONF_SERIAL: self.unique_id,
CONF_HOST: self.ip_address,
}
return self.async_show_form(
@@ -3,7 +3,7 @@
"name": "Enphase Envoy",
"documentation": "https://www.home-assistant.io/integrations/enphase_envoy",
"requirements": [
"envoy_reader==0.20.0"
"envoy_reader==0.20.1"
],
"codeowners": [
"@gtdiehl"
@@ -15,4 +15,4 @@
}
],
"iot_class": "local_polling"
}
}
@@ -3,6 +3,7 @@
"flow_title": "{serial} ({host})",
"step": {
"user": {
"description": "For newer models, enter username `envoy` without a password. For older models, enter username `installer` without a password. For all other models, enter a valid username and password.",
"data": {
"host": "[%key:common::config_flow::data::host%]",
"username": "[%key:common::config_flow::data::username%]",
@@ -16,7 +16,8 @@
"host": "Host",
"password": "Password",
"username": "Username"
}
},
"description": "For newer models, enter username `envoy` without a password. For older models, enter username `installer` without a password. For all other models, enter a valid username and password."
}
}
}
+1 -6
@@ -1,8 +1,6 @@
"""Sensor platform for the Flipr's pool_sensor."""
from __future__ import annotations
from datetime import datetime
from homeassistant.components.sensor import SensorEntity, SensorEntityDescription
from homeassistant.const import (
DEVICE_CLASS_TEMPERATURE,
@@ -60,7 +58,4 @@ class FliprSensor(FliprEntity, SensorEntity):
@property
def native_value(self):
"""State of the sensor."""
state = self.coordinator.data[self.entity_description.key]
if isinstance(state, datetime):
return state.isoformat()
return state
return self.coordinator.data[self.entity_description.key]
@@ -115,8 +115,9 @@ class ConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
await self.async_set_unique_id(mac)
self._abort_if_unique_id_configured(updates={CONF_HOST: host})
for entry in self._async_current_entries(include_ignore=False):
if entry.data[CONF_HOST] == host and not entry.unique_id:
async_update_entry_from_discovery(self.hass, entry, device)
if entry.data[CONF_HOST] == host:
if not entry.unique_id:
async_update_entry_from_discovery(self.hass, entry, device)
return self.async_abort(reason="already_configured")
self.context[CONF_HOST] = host
for progress in self._async_in_progress():
+5 -4
@@ -280,10 +280,11 @@ class FluxLight(FluxOnOffEntity, CoordinatorEntity, LightEntity):
async def _async_turn_on(self, **kwargs: Any) -> None:
"""Turn the specified or all lights on."""
if not self.is_on:
await self._device.async_turn_on()
if not kwargs:
return
if self._device.requires_turn_on or not kwargs:
if not self.is_on:
await self._device.async_turn_on()
if not kwargs:
return
if MODE_ATTRS.intersection(kwargs):
await self._async_set_mode(**kwargs)
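The flux_led hunk above gates the explicit power-on command behind `requires_turn_on`, so newer devices can apply settings directly without briefly showing the previous setting (see the "Fix previous setting briefly appearing" commit). A hypothetical stand-alone sketch of that control flow, returning the implied command sequence instead of awaiting device calls:

```python
def plan_turn_on(is_on: bool, requires_turn_on: bool, has_kwargs: bool) -> list:
    """Return the command order the hunk above implies (illustrative only)."""
    commands = []
    if requires_turn_on or not has_kwargs:
        if not is_on:
            commands.append("turn_on")
        if not has_kwargs:
            return commands
    commands.append("apply_settings")
    return commands
```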
@@ -3,7 +3,7 @@
"name": "Flux LED/MagicHome",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/flux_led",
"requirements": ["flux_led==0.25.16"],
"requirements": ["flux_led==0.26.2"],
"quality_scale": "platinum",
"codeowners": ["@icemanch"],
"iot_class": "local_push",
@@ -3,7 +3,7 @@
"name": "Home Assistant Frontend",
"documentation": "https://www.home-assistant.io/integrations/frontend",
"requirements": [
"home-assistant-frontend==20211203.0"
"home-assistant-frontend==20211206.0"
],
"dependencies": [
"api",
+3 -5
@@ -544,7 +544,7 @@ class GTFSDepartureSensor(SensorEntity):
self._available = False
self._icon = ICON
self._name = ""
self._state: str | None = None
self._state: datetime.datetime | None = None
self._attributes: dict[str, Any] = {}
self._agency = None
@@ -563,7 +563,7 @@ class GTFSDepartureSensor(SensorEntity):
return self._name
@property
def native_value(self) -> str | None:
def native_value(self) -> datetime.datetime | None:
"""Return the state of the sensor."""
return self._state
@@ -619,9 +619,7 @@ class GTFSDepartureSensor(SensorEntity):
if not self._departure:
self._state = None
else:
self._state = dt_util.as_utc(
self._departure["departure_time"]
).isoformat()
self._state = self._departure["departure_time"]
# Fetch trip and route details once, unless updated
if not self._departure:
+57 -18
@@ -3,7 +3,7 @@ from __future__ import annotations
import asyncio
from collections.abc import Awaitable, Callable
from typing import cast
from typing import TYPE_CHECKING, cast
from aioguardian import Client
from aioguardian.errors import GuardianError
@@ -11,6 +11,8 @@ import voluptuous as vol
from homeassistant.config_entries import ConfigEntry, ConfigEntryState
from homeassistant.const import (
ATTR_DEVICE_ID,
ATTR_ENTITY_ID,
CONF_DEVICE_ID,
CONF_FILENAME,
CONF_IP_ADDRESS,
@@ -18,7 +20,11 @@ from homeassistant.const import (
CONF_URL,
)
from homeassistant.core import HomeAssistant, ServiceCall, callback
from homeassistant.helpers import config_validation as cv, device_registry as dr
from homeassistant.helpers import (
config_validation as cv,
device_registry as dr,
entity_registry as er,
)
from homeassistant.helpers.dispatcher import async_dispatcher_send
from homeassistant.helpers.entity import DeviceInfo, EntityDescription
from homeassistant.helpers.update_coordinator import (
@@ -63,20 +69,41 @@ SERVICES = (
SERVICE_NAME_UPGRADE_FIRMWARE,
)
SERVICE_PAIR_UNPAIR_SENSOR_SCHEMA = vol.Schema(
{
vol.Required(CONF_DEVICE_ID): cv.string,
vol.Required(CONF_UID): cv.string,
}
SERVICE_BASE_SCHEMA = vol.All(
cv.deprecated(ATTR_ENTITY_ID),
vol.Schema(
{
vol.Optional(ATTR_DEVICE_ID): cv.string,
vol.Optional(ATTR_ENTITY_ID): cv.entity_id,
}
),
cv.has_at_least_one_key(ATTR_DEVICE_ID, ATTR_ENTITY_ID),
)
SERVICE_UPGRADE_FIRMWARE_SCHEMA = vol.Schema(
{
vol.Required(CONF_DEVICE_ID): cv.string,
vol.Optional(CONF_URL): cv.url,
vol.Optional(CONF_PORT): cv.port,
vol.Optional(CONF_FILENAME): cv.string,
},
SERVICE_PAIR_UNPAIR_SENSOR_SCHEMA = vol.All(
cv.deprecated(ATTR_ENTITY_ID),
vol.Schema(
{
vol.Optional(ATTR_DEVICE_ID): cv.string,
vol.Optional(ATTR_ENTITY_ID): cv.entity_id,
vol.Required(CONF_UID): cv.string,
}
),
cv.has_at_least_one_key(ATTR_DEVICE_ID, ATTR_ENTITY_ID),
)
SERVICE_UPGRADE_FIRMWARE_SCHEMA = vol.All(
cv.deprecated(ATTR_ENTITY_ID),
vol.Schema(
{
vol.Optional(ATTR_DEVICE_ID): cv.string,
vol.Optional(ATTR_ENTITY_ID): cv.entity_id,
vol.Optional(CONF_URL): cv.url,
vol.Optional(CONF_PORT): cv.port,
vol.Optional(CONF_FILENAME): cv.string,
},
),
cv.has_at_least_one_key(ATTR_DEVICE_ID, ATTR_ENTITY_ID),
)
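The Guardian schemas above wrap each `vol.Schema` in `vol.All` with `cv.deprecated(ATTR_ENTITY_ID)` and `cv.has_at_least_one_key`, accepting a device ID and/or a now-deprecated entity ID. A rough stdlib equivalent of that validation chain, without voluptuous (names and message text illustrative):

```python
import warnings

def validate_service_call(data: dict) -> dict:
    """Accept device_id and/or entity_id; warn that entity_id is deprecated."""
    if "device_id" not in data and "entity_id" not in data:
        raise ValueError("must contain at least one of: device_id, entity_id")
    if "entity_id" in data:
        warnings.warn("'entity_id' is deprecated, use 'device_id'",
                      DeprecationWarning, stacklevel=2)
    return data
```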
@@ -86,6 +113,14 @@ PLATFORMS = ["binary_sensor", "sensor", "switch"]
@callback
def async_get_entry_id_for_service_call(hass: HomeAssistant, call: ServiceCall) -> str:
"""Get the entry ID related to a service call (by device ID)."""
if ATTR_ENTITY_ID in call.data:
entity_registry = er.async_get(hass)
entity_registry_entry = entity_registry.async_get(call.data[ATTR_ENTITY_ID])
if TYPE_CHECKING:
assert entity_registry_entry
assert entity_registry_entry.config_entry_id
return entity_registry_entry.config_entry_id
device_id = call.data[CONF_DEVICE_ID]
device_registry = dr.async_get(hass)
@@ -221,15 +256,19 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
)
for service_name, schema, method in (
(SERVICE_NAME_DISABLE_AP, None, async_disable_ap),
(SERVICE_NAME_ENABLE_AP, None, async_enable_ap),
(SERVICE_NAME_DISABLE_AP, SERVICE_BASE_SCHEMA, async_disable_ap),
(SERVICE_NAME_ENABLE_AP, SERVICE_BASE_SCHEMA, async_enable_ap),
(
SERVICE_NAME_PAIR_SENSOR,
SERVICE_PAIR_UNPAIR_SENSOR_SCHEMA,
async_pair_sensor,
),
(SERVICE_NAME_REBOOT, None, async_reboot),
(SERVICE_NAME_RESET_VALVE_DIAGNOSTICS, None, async_reset_valve_diagnostics),
(SERVICE_NAME_REBOOT, SERVICE_BASE_SCHEMA, async_reboot),
(
SERVICE_NAME_RESET_VALVE_DIAGNOSTICS,
SERVICE_BASE_SCHEMA,
async_reset_valve_diagnostics,
),
(
SERVICE_NAME_UNPAIR_SENSOR,
SERVICE_PAIR_UNPAIR_SENSOR_SCHEMA,
@@ -63,16 +63,14 @@ class HomeConnectSensor(HomeConnectEntity, SensorEntity):
elif (
self._state is not None
and self._sign == 1
and dt_util.parse_datetime(self._state) < dt_util.utcnow()
and self._state < dt_util.utcnow()
):
# if the date is supposed to be in the future but we're
# already past it, set state to None.
self._state = None
else:
seconds = self._sign * float(status[self._key][ATTR_VALUE])
self._state = (
dt_util.utcnow() + timedelta(seconds=seconds)
).isoformat()
self._state = dt_util.utcnow() + timedelta(seconds=seconds)
else:
self._state = status[self._key].get(ATTR_VALUE)
if self._key == BSH_OPERATION_STATE:
+10 -5
@@ -12,7 +12,7 @@ import async_timeout
import slugify as unicode_slug
import voluptuous as vol
from homeassistant import config_entries, data_entry_flow
from homeassistant import config_entries
from homeassistant.components import ssdp, zeroconf
from homeassistant.const import CONF_API_KEY, CONF_HOST
from homeassistant.core import callback
@@ -48,10 +48,15 @@ class HueFlowHandler(config_entries.ConfigFlow, domain=DOMAIN):
config_entry: config_entries.ConfigEntry,
) -> HueOptionsFlowHandler:
"""Get the options flow for this handler."""
if config_entry.data.get(CONF_API_VERSION, 1) == 1:
# Options for Hue are only applicable to V1 bridges.
return HueOptionsFlowHandler(config_entry)
raise data_entry_flow.UnknownHandler
return HueOptionsFlowHandler(config_entry)
@classmethod
@callback
def async_supports_options_flow(
cls, config_entry: config_entries.ConfigEntry
) -> bool:
"""Return options flow support for this handler."""
return config_entry.data.get(CONF_API_VERSION, 1) == 1
def __init__(self) -> None:
"""Initialize the Hue flow."""
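The Hue hunk above (PR #61147) replaces raising `data_entry_flow.UnknownHandler` with a new `async_supports_options_flow` classmethod, so callers can check for options support before instantiating a handler. A minimal synchronous sketch of that pattern with hypothetical class and method names:

```python
class BaseFlow:
    @classmethod
    def supports_options_flow(cls, entry: dict) -> bool:
        # Default: a handler defining an options flow supports it everywhere.
        return True

class HueFlow(BaseFlow):
    @classmethod
    def supports_options_flow(cls, entry: dict) -> bool:
        # Options only apply to V1 bridges, mirroring the hunk above.
        return entry.get("api_version", 1) == 1
```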
+1 -1
@@ -3,7 +3,7 @@
"name": "Philips Hue",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/hue",
"requirements": ["aiohue==3.0.1"],
"requirements": ["aiohue==3.0.2"],
"ssdp": [
{
"manufacturer": "Royal Philips Electronics",
+82 -56
@@ -4,6 +4,7 @@ import logging
from aiohue import HueBridgeV2
from aiohue.discovery import is_v2_bridge
from aiohue.v2.models.device import DeviceArchetypes
from aiohue.v2.models.resource import ResourceTypes
from homeassistant import core
@@ -18,7 +19,10 @@ from homeassistant.const import (
DEVICE_CLASS_TEMPERATURE,
)
from homeassistant.helpers import aiohttp_client
from homeassistant.helpers.device_registry import async_get as async_get_device_registry
from homeassistant.helpers.device_registry import (
async_entries_for_config_entry as devices_for_config_entries,
async_get as async_get_device_registry,
)
from homeassistant.helpers.entity_registry import (
async_entries_for_config_entry as entities_for_config_entry,
async_entries_for_device,
@@ -82,6 +86,18 @@ async def handle_v2_migration(hass: core.HomeAssistant, entry: ConfigEntry) -> N
dev_reg = async_get_device_registry(hass)
ent_reg = async_get_entity_registry(hass)
LOGGER.info("Start of migration of devices and entities to support API schema 2")
# Create mapping of mac address to HA device id's.
# Identifier in dev reg should be mac-address,
# but in some cases it has a postfix like `-0b` or `-01`.
dev_ids = {}
for hass_dev in devices_for_config_entries(dev_reg, entry.entry_id):
for domain, mac in hass_dev.identifiers:
if domain != DOMAIN:
continue
normalized_mac = mac.split("-")[0]
dev_ids[normalized_mac] = hass_dev.id
# initialize bridge connection just for the migration
async with HueBridgeV2(host, api_key, websession) as api:
@@ -92,83 +108,93 @@ async def handle_v2_migration(hass: core.HomeAssistant, entry: ConfigEntry) -> N
DEVICE_CLASS_TEMPERATURE: ResourceTypes.TEMPERATURE,
}
# handle entities attached to device
# migrate entities attached to a device
for hue_dev in api.devices:
zigbee = api.devices.get_zigbee_connectivity(hue_dev.id)
if not zigbee or not zigbee.mac_address:
# not a zigbee device or invalid mac
continue
# get/update existing device by V1 identifier (mac address)
# the device will now have both the old and the new identifier
identifiers = {(DOMAIN, hue_dev.id), (DOMAIN, zigbee.mac_address)}
hass_dev = dev_reg.async_get_or_create(
config_entry_id=entry.entry_id, identifiers=identifiers
)
LOGGER.info("Migrated device %s (%s)", hass_dev.name, hass_dev.id)
# loop through al entities for device and find match
for ent in async_entries_for_device(ent_reg, hass_dev.id, True):
# migrate light
if ent.entity_id.startswith("light"):
# should always return one lightid here
new_unique_id = next(iter(hue_dev.lights))
if ent.unique_id == new_unique_id:
continue # just in case
LOGGER.info(
"Migrating %s from unique id %s to %s",
ent.entity_id,
ent.unique_id,
new_unique_id,
)
ent_reg.async_update_entity(
ent.entity_id, new_unique_id=new_unique_id
)
continue
# migrate sensors
matched_dev_class = sensor_class_mapping.get(
ent.original_device_class or "unknown"
# get existing device by V1 identifier (mac address)
if hue_dev.product_data.product_archetype == DeviceArchetypes.BRIDGE_V2:
hass_dev_id = dev_ids.get(api.config.bridge_id.upper())
else:
hass_dev_id = dev_ids.get(zigbee.mac_address)
if hass_dev_id is None:
# can be safely ignored, this device does not exist in current config
LOGGER.debug(
"Ignoring device %s (%s) as it does not (yet) exist in the device registry",
hue_dev.metadata.name,
hue_dev.id,
)
if matched_dev_class is None:
continue
dev_reg.async_update_device(
hass_dev_id, new_identifiers={(DOMAIN, hue_dev.id)}
)
LOGGER.info("Migrated device %s (%s)", hue_dev.metadata.name, hass_dev_id)
# loop through all entities for device and find match
for ent in async_entries_for_device(ent_reg, hass_dev_id, True):
if ent.entity_id.startswith("light"):
# migrate light
# should always return one lightid here
new_unique_id = next(iter(hue_dev.lights), None)
else:
# migrate sensors
matched_dev_class = sensor_class_mapping.get(
ent.original_device_class or "unknown"
)
new_unique_id = next(
(
sensor.id
for sensor in api.devices.get_sensors(hue_dev.id)
if sensor.type == matched_dev_class
),
None,
)
if new_unique_id is None:
# this may happen if we're looking at orphaned or unsupported entity
LOGGER.warning(
"Skip migration of %s because it no longer exists on the bridge",
ent.entity_id,
)
continue
for sensor in api.devices.get_sensors(hue_dev.id):
if sensor.type != matched_dev_class:
continue
new_unique_id = sensor.id
if ent.unique_id == new_unique_id:
break # just in case
try:
ent_reg.async_update_entity(
ent.entity_id, new_unique_id=new_unique_id
)
except ValueError:
# assume edge case where the entity was already migrated in a previous run
# which got aborted somehow and we do not want
# to crash the entire integration init
LOGGER.warning(
"Skip migration of %s because it already exists",
ent.entity_id,
)
else:
LOGGER.info(
"Migrating %s from unique id %s to %s",
"Migrated entity %s from unique id %s to %s",
ent.entity_id,
ent.unique_id,
new_unique_id,
)
try:
ent_reg.async_update_entity(
ent.entity_id, new_unique_id=sensor.id
)
except ValueError:
# assume edge case where the entity was already migrated in a previous run
# which got aborted somehow and we do not want
# to crash the entire integration init
LOGGER.warning(
"Skip migration of %s because it already exists",
ent.entity_id,
)
break
# migrate entities that are not connected to a device (groups)
for ent in entities_for_config_entry(ent_reg, entry.entry_id):
if ent.device_id is not None:
continue
v1_id = f"/groups/{ent.unique_id}"
hue_group = api.groups.room.get_by_v1_id(v1_id)
if hue_group is None or hue_group.grouped_light is None:
# try again with zone
hue_group = api.groups.zone.get_by_v1_id(v1_id)
if "-" in ent.unique_id:
# handle case where unique id is v2-id of group/zone
hue_group = api.groups.get(ent.unique_id)
else:
# handle case where the unique id is just the v1 id
v1_id = f"/groups/{ent.unique_id}"
hue_group = api.groups.room.get_by_v1_id(
v1_id
) or api.groups.zone.get_by_v1_id(v1_id)
if hue_group is None or hue_group.grouped_light is None:
# this may happen if we're looking at some orphaned entity
LOGGER.warning(
@@ -64,7 +64,7 @@ class HueBaseEntity(Entity):
type_title = RESOURCE_TYPE_NAMES.get(
self.resource.type, self.resource.type.value.replace("_", " ").title()
)
return f"{dev_name}: {type_title}"
return f"{dev_name} {type_title}"
async def async_added_to_hass(self) -> None:
"""Call when entity is added."""
@@ -116,7 +116,7 @@ class HVVDepartureSensor(SensorEntity):
departure_time
+ timedelta(minutes=departure["timeOffset"])
+ timedelta(seconds=delay)
)
self.attr.update(
{
@@ -83,4 +83,4 @@ class HydrawiseSensor(HydrawiseEntity, SensorEntity):
_LOGGER.debug("New cycle time: %s", next_cycle)
self._attr_native_value = dt.utc_from_timestamp(
dt.as_timestamp(dt.now()) + next_cycle
)
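The hydrawise hunk follows the same pattern as the other timestamp fixes in this set: a sensor with `DEVICE_CLASS_TIMESTAMP` now reports a timezone-aware `datetime` and lets core handle serialization, instead of calling `.isoformat()` itself. A minimal standalone before/after sketch (the function names are illustrative, not the integration's real API):

```python
from datetime import datetime, timedelta, timezone

def native_value_old(next_cycle_secs):
    # pre-2021.12 style: the sensor serialized the timestamp itself
    return (
        datetime.now(timezone.utc) + timedelta(seconds=next_cycle_secs)
    ).isoformat()

def native_value_new(next_cycle_secs):
    # 2021.12 style: return an aware datetime; core formats the state
    return datetime.now(timezone.utc) + timedelta(seconds=next_cycle_secs)
```

The same one-line change (drop `.isoformat()`) recurs in the repetier, nextbus, rova, vallox, and meteo_france hunks below.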
@@ -45,10 +45,8 @@ class IslamicPrayerTimeSensor(SensorEntity):
@property
def native_value(self):
"""Return the state of the sensor."""
return self.client.prayer_times_info.get(self.sensor_type).astimezone(
dt_util.UTC
)
async def async_added_to_hass(self):
@@ -1,9 +1,11 @@
"""Support for Litter-Robot sensors."""
from __future__ import annotations
from datetime import datetime
from pylitterbot.robot import Robot
from homeassistant.components.sensor import SensorEntity, StateType
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import DEVICE_CLASS_TIMESTAMP, PERCENTAGE
from homeassistant.core import HomeAssistant
@@ -36,7 +38,7 @@ class LitterRobotPropertySensor(LitterRobotEntity, SensorEntity):
self.sensor_attribute = sensor_attribute
@property
def native_value(self) -> StateType | datetime:
"""Return the state."""
return getattr(self.robot, self.sensor_attribute)
@@ -59,10 +61,10 @@ class LitterRobotSleepTimeSensor(LitterRobotPropertySensor):
"""Litter-Robot sleep time sensor."""
@property
def native_value(self) -> StateType | datetime:
"""Return the state."""
if self.robot.sleep_mode_enabled:
return super().native_value
return None
@property
@@ -1 +0,0 @@
"""The loopenergy component."""
@@ -1,8 +0,0 @@
{
"domain": "loopenergy",
"name": "Loop Energy",
"documentation": "https://www.home-assistant.io/integrations/loopenergy",
"requirements": ["pyloopenergy==0.2.1"],
"codeowners": ["@pavoni"],
"iot_class": "cloud_push"
}
@@ -1,149 +0,0 @@
"""Support for Loop Energy sensors."""
import logging
import pyloopenergy
import voluptuous as vol
from homeassistant.components.sensor import PLATFORM_SCHEMA, SensorEntity
from homeassistant.const import (
CONF_UNIT_SYSTEM_IMPERIAL,
CONF_UNIT_SYSTEM_METRIC,
EVENT_HOMEASSISTANT_STOP,
)
import homeassistant.helpers.config_validation as cv
_LOGGER = logging.getLogger(__name__)
CONF_ELEC = "electricity"
CONF_GAS = "gas"
CONF_ELEC_SERIAL = "electricity_serial"
CONF_ELEC_SECRET = "electricity_secret"
CONF_GAS_SERIAL = "gas_serial"
CONF_GAS_SECRET = "gas_secret"
CONF_GAS_CALORIFIC = "gas_calorific"
CONF_GAS_TYPE = "gas_type"
DEFAULT_CALORIFIC = 39.11
DEFAULT_UNIT = "kW"
ELEC_SCHEMA = vol.Schema(
{
vol.Required(CONF_ELEC_SERIAL): cv.string,
vol.Required(CONF_ELEC_SECRET): cv.string,
}
)
GAS_TYPE_SCHEMA = vol.In([CONF_UNIT_SYSTEM_METRIC, CONF_UNIT_SYSTEM_IMPERIAL])
GAS_SCHEMA = vol.Schema(
{
vol.Required(CONF_GAS_SERIAL): cv.string,
vol.Required(CONF_GAS_SECRET): cv.string,
vol.Optional(CONF_GAS_TYPE, default=CONF_UNIT_SYSTEM_METRIC): GAS_TYPE_SCHEMA,
vol.Optional(CONF_GAS_CALORIFIC, default=DEFAULT_CALORIFIC): vol.Coerce(float),
}
)
PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(
{vol.Required(CONF_ELEC): ELEC_SCHEMA, vol.Optional(CONF_GAS): GAS_SCHEMA}
)
def setup_platform(hass, config, add_entities, discovery_info=None):
"""Set up the Loop Energy sensors."""
elec_config = config.get(CONF_ELEC)
gas_config = config.get(CONF_GAS, {})
controller = pyloopenergy.LoopEnergy(
elec_config.get(CONF_ELEC_SERIAL),
elec_config.get(CONF_ELEC_SECRET),
gas_config.get(CONF_GAS_SERIAL),
gas_config.get(CONF_GAS_SECRET),
gas_config.get(CONF_GAS_TYPE),
gas_config.get(CONF_GAS_CALORIFIC),
)
def stop_loopenergy(event):
"""Shutdown loopenergy thread on exit."""
_LOGGER.info("Shutting down loopenergy")
controller.terminate()
hass.bus.listen_once(EVENT_HOMEASSISTANT_STOP, stop_loopenergy)
sensors = [LoopEnergyElec(controller)]
if gas_config.get(CONF_GAS_SERIAL):
sensors.append(LoopEnergyGas(controller))
add_entities(sensors)
class LoopEnergySensor(SensorEntity):
"""Implementation of an Loop Energy base sensor."""
def __init__(self, controller):
"""Initialize the sensor."""
self._state = None
self._unit_of_measurement = DEFAULT_UNIT
self._controller = controller
self._name = None
@property
def name(self):
"""Return the name of the sensor."""
return self._name
@property
def native_value(self):
"""Return the state of the sensor."""
return self._state
@property
def should_poll(self):
"""No polling needed."""
return False
@property
def native_unit_of_measurement(self):
"""Return the unit of measurement of this entity, if any."""
return self._unit_of_measurement
def _callback(self):
self.schedule_update_ha_state(True)
class LoopEnergyElec(LoopEnergySensor):
"""Implementation of an Loop Energy Electricity sensor."""
def __init__(self, controller):
"""Initialize the sensor."""
super().__init__(controller)
self._name = "Power Usage"
async def async_added_to_hass(self):
"""Subscribe to updates."""
self._controller.subscribe_elecricity(self._callback)
def update(self):
"""Get the cached Loop energy reading."""
self._state = round(self._controller.electricity_useage, 2)
class LoopEnergyGas(LoopEnergySensor):
"""Implementation of an Loop Energy Gas sensor."""
def __init__(self, controller):
"""Initialize the sensor."""
super().__init__(controller)
self._name = "Gas Usage"
async def async_added_to_hass(self):
"""Subscribe to updates."""
self._controller.subscribe_gas(self._callback)
def update(self):
"""Get the cached Loop gas reading."""
self._state = round(self._controller.gas_useage, 2)
@@ -47,7 +47,7 @@ LYRIC_SETPOINT_STATUS_NAMES = {
class LyricSensorEntityDescription(SensorEntityDescription):
"""Class describing Honeywell Lyric sensor entities."""
value: Callable[[LyricDevice], StateType | datetime] = round
def get_datetime_from_future_time(time: str) -> datetime:
@@ -133,7 +133,7 @@ async def async_setup_entry(
device_class=DEVICE_CLASS_TIMESTAMP,
value=lambda device: get_datetime_from_future_time(
device.changeableValues.nextPeriodTime
),
),
location,
device,
@@ -142,11 +142,7 @@ class MeteoFranceRainSensor(MeteoFranceSensor):
(cadran for cadran in self.coordinator.data.forecast if cadran["rain"] > 1),
None,
)
return dt_util.utc_from_timestamp(next_rain["dt"]) if next_rain else None
@property
def extra_state_attributes(self):
@@ -73,7 +73,7 @@ class ModernFormsLightTimerRemainingTimeSensor(ModernFormsSensor):
self._attr_device_class = DEVICE_CLASS_TIMESTAMP
@property
def native_value(self) -> StateType | datetime:
"""Return the state of the sensor."""
sleep_time: datetime = dt_util.utc_from_timestamp(
self.coordinator.data.state.light_sleep_timer
@@ -83,7 +83,7 @@ class ModernFormsLightTimerRemainingTimeSensor(ModernFormsSensor):
or (sleep_time - dt_util.utcnow()).total_seconds() < 0
):
return None
return sleep_time
class ModernFormsFanTimerRemainingTimeSensor(ModernFormsSensor):
@@ -103,7 +103,7 @@ class ModernFormsFanTimerRemainingTimeSensor(ModernFormsSensor):
self._attr_device_class = DEVICE_CLASS_TIMESTAMP
@property
def native_value(self) -> StateType | datetime:
"""Return the state of the sensor."""
sleep_time: datetime = dt_util.utc_from_timestamp(
self.coordinator.data.state.fan_sleep_timer
@@ -115,4 +115,4 @@ class ModernFormsFanTimerRemainingTimeSensor(ModernFormsSensor):
):
return None
return sleep_time
@@ -197,7 +197,7 @@ class SignalUpdateCallback:
"device_id": device_entry.id,
"type": event_type,
"timestamp": event_message.timestamp,
"nest_event_id": image_event.event_id,
"nest_event_id": image_event.event_session_id,
}
self._hass.bus.async_fire(NEST_EVENT, message)
@@ -4,7 +4,7 @@
"config_flow": true,
"dependencies": ["ffmpeg", "http", "media_source"],
"documentation": "https://www.home-assistant.io/integrations/nest",
"requirements": ["python-nest==4.1.0", "google-nest-sdm==0.4.2"],
"requirements": ["python-nest==4.1.0", "google-nest-sdm==0.4.5"],
"codeowners": ["@allenporter"],
"quality_scale": "platinum",
"dhcp": [
@@ -18,7 +18,6 @@ https://developers.google.com/nest/device-access/api/camera#handle_camera_events
from __future__ import annotations
from collections.abc import Mapping
from dataclasses import dataclass
import logging
@@ -182,7 +181,7 @@ class NestMediaSource(MediaSource):
browse_device.children = []
events = await _get_events(device)
for child_event in events.values():
event_id = MediaId(media_id.device_id, child_event.event_session_id)
browse_device.children.append(
_browse_event(event_id, device, child_event)
)
@@ -204,7 +203,7 @@ class NestMediaSource(MediaSource):
async def _get_events(device: Device) -> Mapping[str, ImageEventBase]:
"""Return relevant events for the specified device."""
events = await device.event_media_manager.async_events()
return {e.event_session_id: e for e in events}
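Dropping `OrderedDict` here is safe because plain `dict` preserves insertion order as a language guarantee since Python 3.7, so rekeying the events needs only a comprehension. A small sketch with illustrative data:

```python
# Events rekeyed by session id; the insertion order of the source
# sequence is preserved by a plain dict (guaranteed since Python 3.7).
events = [("session-1", "motion"), ("session-2", "person"), ("session-3", "sound")]
by_session = {session_id: kind for session_id, kind in events}
```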
def _browse_root() -> BrowseMediaSource:
@@ -135,9 +135,14 @@ async def async_setup_entry(
entities = []
for home_id in climate_topology.home_ids:
signal_name = f"{CLIMATE_STATE_CLASS_NAME}-{home_id}"
try:
await data_handler.register_data_class(
CLIMATE_STATE_CLASS_NAME, signal_name, None, home_id=home_id
)
except KeyError:
continue
climate_state = data_handler.data[signal_name]
climate_topology.register_handler(home_id, climate_state.process_topology)
@@ -194,7 +194,11 @@ class NetatmoDataHandler:
self._auth, **kwargs
)
try:
await self.async_fetch_data(data_class_entry)
except KeyError:
self.data_classes.pop(data_class_entry)
raise
self._queue.append(self.data_classes[data_class_entry])
_LOGGER.debug("Data class %s added", data_class_entry)
@@ -48,6 +48,14 @@ async def async_setup_entry(
entities = []
for home_id in climate_topology.home_ids:
signal_name = f"{CLIMATE_STATE_CLASS_NAME}-{home_id}"
try:
await data_handler.register_data_class(
CLIMATE_STATE_CLASS_NAME, signal_name, None, home_id=home_id
)
except KeyError:
continue
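The Netatmo hunks above wrap registration in `try`/`except KeyError` so one home without climate data does not abort setup for the remaining homes. A generic, dependency-free sketch of that register-or-skip pattern (the dict-backed `source` stands in for the backend; names are illustrative):

```python
def register_all(source, keys):
    # Register each home's data class, skipping homes the backend has
    # no data for (KeyError) instead of aborting the whole setup.
    store = {}
    for key in keys:
        try:
            store[key] = source[key]
        except KeyError:
            continue
    return store
```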
@@ -218,6 +218,4 @@ class NextBusDepartureSensor(SensorEntity):
)
latest_prediction = maybe_first(predictions)
self._state = utc_from_timestamp(int(latest_prediction["epochTime"]) / 1000)
@@ -120,7 +120,7 @@ class OASATelematicsSensor(SensorEntity):
self._name_data = self.data.name_data
next_arrival_data = self._times[0]
if ATTR_NEXT_ARRIVAL in next_arrival_data:
self._state = next_arrival_data[ATTR_NEXT_ARRIVAL]
class OASATelematicsData:
@@ -185,7 +185,7 @@ class MinutPointClient:
async def _sync(self):
"""Update local list of devices."""
if not await self._client.update():
self._is_available = False
_LOGGER.warning("Device is unavailable")
async_dispatcher_send(self._hass, SIGNAL_UPDATE_ENTITY)
@@ -25,9 +25,10 @@ from homeassistant.const import (
)
from homeassistant.core import callback
import homeassistant.helpers.config_validation as cv
from homeassistant.helpers.template import Template
_LOGGER = logging.getLogger(__name__)
_ENDPOINT = "http://pvoutput.org/service/r2/getstatus.jsp"
_ENDPOINT = "https://pvoutput.org/service/r2/getstatus.jsp"
ATTR_ENERGY_GENERATION = "energy_generation"
ATTR_POWER_GENERATION = "power_generation"
@@ -59,7 +60,10 @@ async def async_setup_platform(hass, config, async_add_entities, discovery_info=
method = "GET"
payload = auth = None
verify_ssl = DEFAULT_VERIFY_SSL
headers = {"X-Pvoutput-Apikey": api_key, "X-Pvoutput-SystemId": system_id}
headers = {
"X-Pvoutput-Apikey": Template(api_key, hass),
"X-Pvoutput-SystemId": Template(system_id, hass),
}
rest = RestData(hass, method, _ENDPOINT, auth, headers, None, payload, verify_ssl)
await rest.async_update()
@@ -160,7 +160,7 @@ class RepetierJobEndSensor(RepetierSensor):
print_time = data["print_time"]
from_start = data["from_start"]
time_end = start + round(print_time, 0)
self._state = datetime.utcfromtimestamp(time_end)
remaining = print_time - from_start
remaining_secs = int(round(remaining, 0))
_LOGGER.debug(
@@ -182,7 +182,7 @@ class RepetierJobStartSensor(RepetierSensor):
job_name = data["job_name"]
start = data["start"]
from_start = data["from_start"]
self._state = datetime.utcfromtimestamp(start)
elapsed_secs = int(round(from_start, 0))
_LOGGER.debug(
"Job %s elapsed %s",
@@ -23,5 +23,5 @@ def render_templates(tpl_dict: dict[str, Template] | None):
rendered_items = {}
for item_name, template_header in tpl_dict.items():
if (value := template_header.async_render()) is not None:
rendered_items[item_name] = str(value)
return rendered_items
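The `str(value)` coercion above matters because a rendered template can yield a non-string (for example a numeric system id), while HTTP header values must be strings. A minimal sketch of the same coercion, independent of the template helper:

```python
def render_headers(raw):
    # A rendered template may produce a non-string (e.g. a numeric
    # system id); HTTP header values must be str, hence the coercion.
    rendered = {}
    for name, value in raw.items():
        if value is not None:
            rendered[name] = str(value)
    return rendered
```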
@@ -116,7 +116,7 @@ class RovaSensor(SensorEntity):
self.data_service.update()
pickup_date = self.data_service.data.get(self.entity_description.key)
if pickup_date is not None:
self._attr_native_value = pickup_date
class RovaData:
@@ -2,7 +2,7 @@
"domain": "seventeentrack",
"name": "17TRACK",
"documentation": "https://www.home-assistant.io/integrations/seventeentrack",
"requirements": ["py17track==2021.12.1"],
"requirements": ["py17track==2021.12.2"],
"codeowners": [],
"iot_class": "cloud_polling"
}
@@ -144,57 +144,86 @@ SERVICES = (
SERVICE_NAME_SET_SYSTEM_PROPERTIES,
)
SERVICE_CLEAR_NOTIFICATIONS_SCHEMA = vol.All(
cv.deprecated(ATTR_SYSTEM_ID),
vol.Schema(
{
vol.Optional(ATTR_DEVICE_ID): cv.string,
vol.Optional(ATTR_SYSTEM_ID): cv.string,
}
),
cv.has_at_least_one_key(ATTR_DEVICE_ID, ATTR_SYSTEM_ID),
)
SERVICE_REMOVE_PIN_SCHEMA = vol.All(
cv.deprecated(ATTR_SYSTEM_ID),
vol.Schema(
{
vol.Optional(ATTR_DEVICE_ID): cv.string,
vol.Optional(ATTR_SYSTEM_ID): cv.string,
vol.Required(ATTR_PIN_LABEL_OR_VALUE): cv.string,
}
),
cv.has_at_least_one_key(ATTR_DEVICE_ID, ATTR_SYSTEM_ID),
)
SERVICE_SET_PIN_SCHEMA = vol.All(
cv.deprecated(ATTR_SYSTEM_ID),
vol.Schema(
{
vol.Optional(ATTR_DEVICE_ID): cv.string,
vol.Optional(ATTR_SYSTEM_ID): cv.string,
vol.Required(ATTR_PIN_LABEL): cv.string,
vol.Required(ATTR_PIN_VALUE): cv.string,
},
),
cv.has_at_least_one_key(ATTR_DEVICE_ID, ATTR_SYSTEM_ID),
)
SERVICE_SET_SYSTEM_PROPERTIES_SCHEMA = vol.All(
cv.deprecated(ATTR_SYSTEM_ID),
vol.Schema(
{
vol.Optional(ATTR_DEVICE_ID): cv.string,
vol.Optional(ATTR_SYSTEM_ID): cv.string,
vol.Optional(ATTR_ALARM_DURATION): vol.All(
cv.time_period,
lambda value: value.total_seconds(),
vol.Range(min=MIN_ALARM_DURATION, max=MAX_ALARM_DURATION),
),
vol.Optional(ATTR_ALARM_VOLUME): vol.All(
vol.In(VOLUME_MAP), VOLUME_MAP.get
),
vol.Optional(ATTR_CHIME_VOLUME): vol.All(
vol.In(VOLUME_MAP), VOLUME_MAP.get
),
vol.Optional(ATTR_ENTRY_DELAY_AWAY): vol.All(
cv.time_period,
lambda value: value.total_seconds(),
vol.Range(min=MIN_ENTRY_DELAY_AWAY, max=MAX_ENTRY_DELAY_AWAY),
),
vol.Optional(ATTR_ENTRY_DELAY_HOME): vol.All(
cv.time_period,
lambda value: value.total_seconds(),
vol.Range(max=MAX_ENTRY_DELAY_HOME),
),
vol.Optional(ATTR_EXIT_DELAY_AWAY): vol.All(
cv.time_period,
lambda value: value.total_seconds(),
vol.Range(min=MIN_EXIT_DELAY_AWAY, max=MAX_EXIT_DELAY_AWAY),
),
vol.Optional(ATTR_EXIT_DELAY_HOME): vol.All(
cv.time_period,
lambda value: value.total_seconds(),
vol.Range(max=MAX_EXIT_DELAY_HOME),
),
vol.Optional(ATTR_LIGHT): cv.boolean,
vol.Optional(ATTR_VOICE_PROMPT_VOLUME): vol.All(
vol.In(VOLUME_MAP), VOLUME_MAP.get
),
}
),
cv.has_at_least_one_key(ATTR_DEVICE_ID, ATTR_SYSTEM_ID),
)
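The reworked SimpliSafe schemas accept either `device_id` or the now-deprecated `system_id`, with `cv.has_at_least_one_key` enforcing that at least one is supplied. A dependency-free sketch of that rule (the helper name is illustrative; the real validator lives in `homeassistant.helpers.config_validation`):

```python
def require_at_least_one(data, *keys):
    # Mirrors cv.has_at_least_one_key: a service call must supply at
    # least one of the listed keys (here, device_id or system_id).
    if not any(key in data for key in keys):
        raise ValueError(f"must contain at least one of {', '.join(keys)}")
    return data
```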
WEBSOCKET_EVENTS_REQUIRING_SERIAL = [EVENT_LOCK_LOCKED, EVENT_LOCK_UNLOCKED]
@@ -216,6 +245,15 @@ def _async_get_system_for_service_call(
hass: HomeAssistant, call: ServiceCall
) -> SystemType:
"""Get the SimpliSafe system related to a service call (by device ID)."""
if ATTR_SYSTEM_ID in call.data:
for entry in hass.config_entries.async_entries(DOMAIN):
simplisafe = hass.data[DOMAIN][entry.entry_id]
if (
system := simplisafe.systems.get(int(call.data[ATTR_SYSTEM_ID]))
) is None:
continue
return cast(SystemType, system)
device_id = call.data[ATTR_DEVICE_ID]
device_registry = dr.async_get(hass)
@@ -365,7 +403,11 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
)
for service, method, schema in (
(
SERVICE_NAME_CLEAR_NOTIFICATIONS,
async_clear_notifications,
SERVICE_CLEAR_NOTIFICATIONS_SCHEMA,
),
(SERVICE_NAME_REMOVE_PIN, async_remove_pin, SERVICE_REMOVE_PIN_SCHEMA),
(SERVICE_NAME_SET_PIN, async_set_pin, SERVICE_SET_PIN_SCHEMA),
(
@@ -3,7 +3,7 @@
"name": "SimpliSafe",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/simplisafe",
"requirements": ["simplisafe-python==2021.11.2"],
"requirements": ["simplisafe-python==2021.12.0"],
"codeowners": ["@bachya"],
"iot_class": "cloud_polling",
"dhcp": [
@@ -1,7 +1,7 @@
# Describes the format for available SimpliSafe services
clear_notifications:
name: Clear notifications
description: Clear any active SimpliSafe notifications
fields:
device_id:
name: System
@@ -40,7 +40,7 @@ def soco_error(
return None
except (OSError, SoCoException, SoCoUPnPException) as err:
error_code = getattr(err, "error_code", None)
function = funct.__qualname__
if errorcodes and error_code in errorcodes:
_LOGGER.debug(
"Error code %s ignored in call to %s", error_code, function
@@ -59,7 +59,9 @@ def soco_error(
return None
dispatcher_send(
self.hass, f"{SONOS_SPEAKER_ACTIVITY}-{self.soco.uid}", funct.__name__
self.hass,
f"{SONOS_SPEAKER_ACTIVITY}-{self.soco.uid}",
funct.__qualname__,
)
return result
@@ -3,7 +3,7 @@
"name": "Tasmota",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/tasmota",
"requirements": ["hatasmota==0.3.0"],
"requirements": ["hatasmota==0.3.1"],
"dependencies": ["mqtt"],
"mqtt": ["tasmota/discovery/#"],
"codeowners": ["@emontnemery"],
@@ -14,7 +14,7 @@ from homeassistant.const import (
DEVICE_CLASS_VOLTAGE,
ELECTRIC_CURRENT_AMPERE,
ELECTRIC_POTENTIAL_VOLT,
ENERGY_WATT_HOUR,
ENTITY_CATEGORY_DIAGNOSTIC,
FREQUENCY_HERTZ,
TEMP_CELSIUS,
@@ -120,10 +120,10 @@ WALL_CONNECTOR_SENSORS = [
entity_category=ENTITY_CATEGORY_DIAGNOSTIC,
),
WallConnectorSensorDescription(
key="total_energy_kWh",
name=prefix_entity_name("Total Energy"),
native_unit_of_measurement=ENERGY_KILO_WATT_HOUR,
value_fn=lambda data: data[WALLCONNECTOR_DATA_LIFETIME].energy_wh / 1000.0,
key="energy_kWh",
name=prefix_entity_name("Energy"),
native_unit_of_measurement=ENERGY_WATT_HOUR,
value_fn=lambda data: data[WALLCONNECTOR_DATA_LIFETIME].energy_wh,
state_class=STATE_CLASS_TOTAL_INCREASING,
device_class=DEVICE_CLASS_ENERGY,
),
@@ -5,13 +5,15 @@ import asyncio
from dataclasses import dataclass
from datetime import timedelta
import logging
from typing import Any
import aiohttp
from pytrafikverket.trafikverket_weather import TrafikverketWeather, WeatherStationInfo
import voluptuous as vol
from homeassistant.components.sensor import (
PLATFORM_SCHEMA as PARENT_PLATFORM_SCHEMA,
STATE_CLASS_MEASUREMENT,
SensorEntity,
SensorEntityDescription,
)
@@ -70,6 +72,7 @@ SENSOR_TYPES: tuple[TrafikverketSensorEntityDescription, ...] = (
native_unit_of_measurement=TEMP_CELSIUS,
icon="mdi:thermometer",
device_class=DEVICE_CLASS_TEMPERATURE,
state_class=STATE_CLASS_MEASUREMENT,
),
TrafikverketSensorEntityDescription(
key="road_temp",
@@ -78,12 +81,14 @@ SENSOR_TYPES: tuple[TrafikverketSensorEntityDescription, ...] = (
native_unit_of_measurement=TEMP_CELSIUS,
icon="mdi:thermometer",
device_class=DEVICE_CLASS_TEMPERATURE,
state_class=STATE_CLASS_MEASUREMENT,
),
TrafikverketSensorEntityDescription(
key="precipitation",
api_key="precipitationtype",
name="Precipitation type",
icon="mdi:weather-snowy-rainy",
entity_registry_enabled_default=False,
),
TrafikverketSensorEntityDescription(
key="wind_direction",
@@ -91,6 +96,7 @@ SENSOR_TYPES: tuple[TrafikverketSensorEntityDescription, ...] = (
name="Wind direction",
native_unit_of_measurement=DEGREE,
icon="mdi:flag-triangle",
state_class=STATE_CLASS_MEASUREMENT,
),
TrafikverketSensorEntityDescription(
key="wind_direction_text",
@@ -104,6 +110,7 @@ SENSOR_TYPES: tuple[TrafikverketSensorEntityDescription, ...] = (
name="Wind speed",
native_unit_of_measurement=SPEED_METERS_PER_SECOND,
icon="mdi:weather-windy",
state_class=STATE_CLASS_MEASUREMENT,
),
TrafikverketSensorEntityDescription(
key="wind_speed_max",
@@ -111,6 +118,8 @@ SENSOR_TYPES: tuple[TrafikverketSensorEntityDescription, ...] = (
name="Wind speed max",
native_unit_of_measurement=SPEED_METERS_PER_SECOND,
icon="mdi:weather-windy-variant",
entity_registry_enabled_default=False,
state_class=STATE_CLASS_MEASUREMENT,
),
TrafikverketSensorEntityDescription(
key="humidity",
@@ -119,6 +128,8 @@ SENSOR_TYPES: tuple[TrafikverketSensorEntityDescription, ...] = (
native_unit_of_measurement=PERCENTAGE,
icon="mdi:water-percent",
device_class=DEVICE_CLASS_HUMIDITY,
entity_registry_enabled_default=False,
state_class=STATE_CLASS_MEASUREMENT,
),
TrafikverketSensorEntityDescription(
key="precipitation_amount",
@@ -126,18 +137,20 @@ SENSOR_TYPES: tuple[TrafikverketSensorEntityDescription, ...] = (
name="Precipitation amount",
native_unit_of_measurement=LENGTH_MILLIMETERS,
icon="mdi:cup-water",
state_class=STATE_CLASS_MEASUREMENT,
),
TrafikverketSensorEntityDescription(
key="precipitation_amountname",
api_key="precipitation_amountname",
name="Precipitation name",
icon="mdi:weather-pouring",
entity_registry_enabled_default=False,
),
)
SENSOR_KEYS = [desc.key for desc in SENSOR_TYPES]
PLATFORM_SCHEMA = PARENT_PLATFORM_SCHEMA.extend(
{
vol.Required(CONF_NAME): cv.string,
vol.Required(CONF_API_KEY): cv.string,
@@ -172,17 +185,12 @@ async def async_setup_entry(
) -> None:
"""Set up the Trafikverket sensor entry."""
web_session = async_get_clientsession(hass)
weather_api = TrafikverketWeather(web_session, entry.data[CONF_API_KEY])
entities = [
TrafikverketWeatherStation(
weather_api, entry.entry_id, entry.data[CONF_STATION], description
)
for description in SENSOR_TYPES
]
@@ -197,29 +205,36 @@ class TrafikverketWeatherStation(SensorEntity):
def __init__(
self,
weather_api: TrafikverketWeather,
entry_id: str,
sensor_station: str,
description: TrafikverketSensorEntityDescription,
) -> None:
"""Initialize the sensor."""
self.entity_description = description
self._attr_name = f"{name} {description.name}"
self._attr_name = f"{sensor_station} {description.name}"
self._attr_unique_id = f"{entry_id}_{description.key}"
self._station = sensor_station
self._weather_api = weather_api
self._weather: WeatherStationInfo | None = None
self._active: bool | None = None
self._measure_time: str | None = None
@property
def extra_state_attributes(self) -> dict[str, Any]:
"""Return the state attributes of Trafikverket Weatherstation."""
_additional_attributes: dict[str, Any] = {
ATTR_ATTRIBUTION: ATTRIBUTION,
}
if self._active:
_additional_attributes[ATTR_ACTIVE] = self._active
if self._measure_time:
_additional_attributes[ATTR_MEASURE_TIME] = self._measure_time
return _additional_attributes
@Throttle(MIN_TIME_BETWEEN_UPDATES)
async def async_update(self) -> None:
"""Get the latest data from Trafikverket and updates the states."""
try:
self._weather = await self._weather_api.async_get_weather(self._station)
@@ -228,3 +243,6 @@ class TrafikverketWeatherStation(SensorEntity):
)
except (asyncio.TimeoutError, aiohttp.ClientError, ValueError) as error:
_LOGGER.error("Could not fetch weather data: %s", error)
return
self._active = self._weather.active
self._measure_time = self._weather.measure_time
@@ -56,7 +56,7 @@ class ValloxSensor(CoordinatorEntity, SensorEntity):
self._attr_unique_id = f"{uuid}-{description.key}"
@property
def native_value(self) -> StateType | datetime:
"""Return the value reported by the sensor."""
if (metric_key := self.entity_description.metric_key) is None:
return None
@@ -84,7 +84,7 @@ class ValloxFanSpeedSensor(ValloxSensor):
"""Child class for fan speed reporting."""
@property
def native_value(self) -> StateType | datetime:
"""Return the value reported by the sensor."""
fan_is_on = self.coordinator.data.get_metric(METRIC_KEY_MODE) == MODE_ON
return super().native_value if fan_is_on else 0
@@ -94,7 +94,7 @@ class ValloxFilterRemainingSensor(ValloxSensor):
"""Child class for filter remaining time reporting."""
@property
def native_value(self) -> StateType | datetime:
"""Return the value reported by the sensor."""
super_native_value = super().native_value
@@ -107,7 +107,7 @@ class ValloxFilterRemainingSensor(ValloxSensor):
days_remaining_delta = timedelta(days=days_remaining)
now = datetime.utcnow().replace(hour=13, minute=0, second=0, microsecond=0)
return now + days_remaining_delta
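The Vallox filter sensor converts a "days remaining" count into an absolute timestamp, pinning the time of day so the reported state does not drift with every poll. A standalone sketch of that computation (signature and helper name are illustrative):

```python
from datetime import datetime, timedelta

def filter_end_date(days_remaining, now):
    # Pin the time of day so the reported timestamp stays stable
    # across polls; 13:00 matches the sensor's convention above.
    anchor = now.replace(hour=13, minute=0, second=0, microsecond=0)
    return anchor + timedelta(days=days_remaining)
```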
class ValloxCellStateSensor(ValloxSensor):
@@ -2,7 +2,7 @@
"domain": "yandex_transport",
"name": "Yandex Transport",
"documentation": "https://www.home-assistant.io/integrations/yandex_transport",
"requirements": ["aioymaps==1.2.1"],
"requirements": ["aioymaps==1.2.2"],
"codeowners": ["@rishatik92", "@devbis"],
"iot_class": "cloud_polling"
}
@@ -129,9 +129,7 @@ class DiscoverYandexTransport(SensorEntity):
if closer_time is None:
self._state = None
else:
self._state = dt_util.utc_from_timestamp(closer_time).replace(microsecond=0)
self._attrs = attrs
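Because the yandex_transport sensor now returns a `datetime` instead of a string, it can no longer pass `timespec="seconds"` to `isoformat()`; truncating microseconds with `replace(microsecond=0)` keeps the same second-level precision on the object itself. A minimal sketch:

```python
from datetime import datetime, timezone

def to_second_precision(ts):
    # replace(microsecond=0) on the datetime object is the equivalent
    # of the old string state's isoformat(timespec="seconds")
    return datetime.fromtimestamp(ts, tz=timezone.utc).replace(microsecond=0)
```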
@property
@@ -599,7 +599,7 @@ class ZenWithinThermostat(Thermostat):
channel_names=CHANNEL_THERMOSTAT,
aux_channels=CHANNEL_FAN,
manufacturers="Centralite",
models="3157100",
models={"3157100", "3157100-E"},
stop_on_match=True,
)
class CentralitePearl(ZenWithinThermostat):
@@ -12,7 +12,7 @@
"zigpy==0.42.0",
"zigpy-xbee==0.14.0",
"zigpy-zigate==0.7.3",
"zigpy-znp==0.6.3"
"zigpy-znp==0.6.4"
],
"usb": [
{"vid":"10C4","pid":"EA60","description":"*2652*","known_devices":["slae.sh cc2652rb stick"]},
@@ -61,6 +61,7 @@ from .core import discovery
from .core.const import (
CHANNEL_ANALOG_INPUT,
CHANNEL_ELECTRICAL_MEASUREMENT,
CHANNEL_FAN,
CHANNEL_HUMIDITY,
CHANNEL_ILLUMINANCE,
CHANNEL_LEAF_WETNESS,
@@ -636,6 +637,13 @@ class ThermostatHVACAction(Sensor, id_suffix="hvac_action"):
self.async_write_ha_state()
@MULTI_MATCH(
channel_names=CHANNEL_THERMOSTAT,
aux_channels=CHANNEL_FAN,
manufacturers="Centralite",
models={"3157100", "3157100-E"},
stop_on_match=True,
)
@MULTI_MATCH(
channel_names=CHANNEL_THERMOSTAT,
manufacturers="Zen Within",
@@ -22,6 +22,7 @@
},
"configure_addon": {
"title": "Enter the Z-Wave JS add-on configuration",
"description": "The add-on will generate security keys if those fields are left empty.",
"data": {
"usb_path": "[%key:common::config_flow::data::usb_path%]",
"s0_legacy_key": "S0 Key (Legacy)",
@@ -79,6 +80,7 @@
},
"configure_addon": {
"title": "Enter the Z-Wave JS add-on configuration",
"description": "The add-on will generate security keys if those fields are left empty.",
"data": {
"usb_path": "[%key:common::config_flow::data::usb_path%]",
"s0_legacy_key": "S0 Key (Legacy)",
@@ -1163,6 +1163,12 @@ class ConfigFlow(data_entry_flow.FlowHandler):
"""Get the options flow for this handler."""
raise data_entry_flow.UnknownHandler
@classmethod
@callback
def async_supports_options_flow(cls, config_entry: ConfigEntry) -> bool:
"""Return options flow support for this handler."""
return cls.async_get_options_flow is not ConfigFlow.async_get_options_flow
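The new `async_supports_options_flow` helper detects whether a subclass overrides `async_get_options_flow` by comparing it against the base class implementation by identity. A standalone sketch of that override-detection technique (class and method names are illustrative; the identity check works here because staticmethod access yields the underlying function object):

```python
class Base:
    @staticmethod
    def get_options_flow():
        raise NotImplementedError

    @classmethod
    def supports_options_flow(cls):
        # True only when a subclass replaced the base method
        return cls.get_options_flow is not Base.get_options_flow

class Plain(Base):
    pass

class WithOptions(Base):
    @staticmethod
    def get_options_flow():
        return "options"
```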
@callback
def _async_abort_entries_match(
self, match_dict: dict[str, Any] | None = None
@@ -7,7 +7,7 @@ from homeassistant.backports.enum import StrEnum
MAJOR_VERSION: Final = 2021
MINOR_VERSION: Final = 12
PATCH_VERSION: Final = "0b1"
PATCH_VERSION: Final = "0b4"
__short_version__: Final = f"{MAJOR_VERSION}.{MINOR_VERSION}"
__version__: Final = f"{__short_version__}.{PATCH_VERSION}"
REQUIRED_PYTHON_VER: Final[tuple[int, int, int]] = (3, 8, 0)
@@ -12,6 +12,9 @@ from homeassistant.exceptions import HomeAssistantError
_LOGGER = logging.getLogger(__name__)
# Keep track of integrations already reported to prevent flooding
_REPORTED_INTEGRATIONS: set[str] = set()
CALLABLE_T = TypeVar("CALLABLE_T", bound=Callable) # pylint: disable=invalid-name
@@ -85,6 +88,12 @@ def report_integration(
"""
found_frame, integration, path = integration_frame
# Keep track of integrations already reported to prevent flooding
key = f"{found_frame.filename}:{found_frame.lineno}"
if key in _REPORTED_INTEGRATIONS:
return
_REPORTED_INTEGRATIONS.add(key)
index = found_frame.filename.index(path)
if path == "custom_components/":
extra = " to the custom component author"
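The flood-prevention change keys each warning site by `filename:lineno` and records it in a module-level set, so a given call site is reported at most once. A simplified sketch of that report-once pattern (the helper name is illustrative, not the frame helper's real API):

```python
# Keep track of sites already reported to prevent flooding
_REPORTED: set[str] = set()


def report_once(filename: str, lineno: int) -> bool:
    """Return True the first time a frame is seen, False on repeats."""
    key = f"{filename}:{lineno}"
    if key in _REPORTED:
        return False
    _REPORTED.add(key)
    return True


print(report_once("custom_components/foo/light.py", 42))  # True
print(report_once("custom_components/foo/light.py", 42))  # False
print(report_once("custom_components/foo/light.py", 99))  # True
```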
@@ -16,7 +16,7 @@ ciso8601==2.2.0
cryptography==35.0.0
emoji==1.5.0
hass-nabucasa==0.50.0
-home-assistant-frontend==20211203.0
+home-assistant-frontend==20211206.0
httpx==0.21.0
ifaddr==0.1.7
jinja2==3.0.3
@@ -93,6 +93,7 @@ def install_package(
# Workaround for incompatible prefix setting
# See http://stackoverflow.com/a/4495175
args += ["--prefix="]
_LOGGER.debug("Running pip command: args=%s", args)
with Popen(args, stdin=PIPE, stdout=PIPE, stderr=PIPE, env=env) as process:
_, stderr = process.communicate()
if process.returncode != 0:
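The added debug line records the exact argument list before `Popen` spawns pip, so a failed install can be reproduced by hand. A hedged sketch of that log-then-run pattern using `subprocess.run` (the helper name is an assumption for illustration, not Home Assistant's API):

```python
import logging
import subprocess
import sys

_LOGGER = logging.getLogger(__name__)


def run_logged(args: list[str]) -> bool:
    """Run a command, logging its args first; return True on success."""
    _LOGGER.debug("Running pip command: args=%s", args)
    result = subprocess.run(args, capture_output=True)
    if result.returncode != 0:
        _LOGGER.error("Command failed: %s", result.stderr.decode().strip())
        return False
    return True


# The interpreter exits 0, so this reports success
print(run_logged([sys.executable, "-c", "pass"]))  # True
```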
@@ -186,7 +186,7 @@ aiohomekit==0.6.4
aiohttp_cors==0.7.0
# homeassistant.components.hue
-aiohue==3.0.1
+aiohue==3.0.2
# homeassistant.components.imap
aioimaplib==0.9.0
@@ -267,7 +267,7 @@ aiovlc==0.1.0
aiowatttime==0.1.1
# homeassistant.components.yandex_transport
-aioymaps==1.2.1
+aioymaps==1.2.2
# homeassistant.components.airly
airly==1.1.0
@@ -606,7 +606,7 @@ env_canada==0.5.18
# envirophat==0.0.6
# homeassistant.components.enphase_envoy
-envoy_reader==0.20.0
+envoy_reader==0.20.1
# homeassistant.components.season
ephem==3.7.7.0
@@ -658,7 +658,7 @@ fjaraskupan==1.0.2
flipr-api==1.4.1
# homeassistant.components.flux_led
-flux_led==0.25.16
+flux_led==0.26.2
# homeassistant.components.homekit
fnvhash==0.1.0
@@ -738,7 +738,7 @@ google-cloud-pubsub==2.1.0
google-cloud-texttospeech==0.4.0
# homeassistant.components.nest
-google-nest-sdm==0.4.2
+google-nest-sdm==0.4.5
# homeassistant.components.google_travel_time
googlemaps==2.5.1
@@ -792,7 +792,7 @@ hass-nabucasa==0.50.0
hass_splunk==0.1.1
# homeassistant.components.tasmota
-hatasmota==0.3.0
+hatasmota==0.3.1
# homeassistant.components.jewish_calendar
hdate==0.10.4
@@ -819,7 +819,7 @@ hole==0.7.0
holidays==0.11.3.1
# homeassistant.components.frontend
-home-assistant-frontend==20211203.0
+home-assistant-frontend==20211206.0
# homeassistant.components.zwave
homeassistant-pyozw==0.1.10
@@ -1305,7 +1305,7 @@ py-synologydsm-api==1.0.4
py-zabbix==1.1.7
# homeassistant.components.seventeentrack
-py17track==2021.12.1
+py17track==2021.12.2
# homeassistant.components.hdmi_cec
pyCEC==0.5.1
@@ -1612,9 +1612,6 @@ pylitejet==0.3.0
# homeassistant.components.litterrobot
pylitterbot==2021.11.0
# homeassistant.components.loopenergy
pyloopenergy==0.2.1
# homeassistant.components.lutron_caseta
pylutron-caseta==0.11.0
@@ -2149,7 +2146,7 @@ simplehound==0.3
simplepush==1.1.4
# homeassistant.components.simplisafe
-simplisafe-python==2021.11.2
+simplisafe-python==2021.12.0
# homeassistant.components.sisyphus
sisyphus-control==3.0
@@ -2507,7 +2504,7 @@ zigpy-xbee==0.14.0
zigpy-zigate==0.7.3
# homeassistant.components.zha
-zigpy-znp==0.6.3
+zigpy-znp==0.6.4
# homeassistant.components.zha
zigpy==0.42.0
@@ -131,7 +131,7 @@ aiohomekit==0.6.4
aiohttp_cors==0.7.0
# homeassistant.components.hue
-aiohue==3.0.1
+aiohue==3.0.2
# homeassistant.components.apache_kafka
aiokafka==0.6.0
@@ -197,7 +197,7 @@ aiovlc==0.1.0
aiowatttime==0.1.1
# homeassistant.components.yandex_transport
-aioymaps==1.2.1
+aioymaps==1.2.2
# homeassistant.components.airly
airly==1.1.0
@@ -378,7 +378,7 @@ enocean==0.50
env_canada==0.5.18
# homeassistant.components.enphase_envoy
-envoy_reader==0.20.0
+envoy_reader==0.20.1
# homeassistant.components.season
ephem==3.7.7.0
@@ -399,7 +399,7 @@ fjaraskupan==1.0.2
flipr-api==1.4.1
# homeassistant.components.flux_led
-flux_led==0.25.16
+flux_led==0.26.2
# homeassistant.components.homekit
fnvhash==0.1.0
@@ -461,7 +461,7 @@ google-api-python-client==1.6.4
google-cloud-pubsub==2.1.0
# homeassistant.components.nest
-google-nest-sdm==0.4.2
+google-nest-sdm==0.4.5
# homeassistant.components.google_travel_time
googlemaps==2.5.1
@@ -497,7 +497,7 @@ hangups==0.4.14
hass-nabucasa==0.50.0
# homeassistant.components.tasmota
-hatasmota==0.3.0
+hatasmota==0.3.1
# homeassistant.components.jewish_calendar
hdate==0.10.4
@@ -515,7 +515,7 @@ hole==0.7.0
holidays==0.11.3.1
# homeassistant.components.frontend
-home-assistant-frontend==20211203.0
+home-assistant-frontend==20211206.0
# homeassistant.components.zwave
homeassistant-pyozw==0.1.10
@@ -792,7 +792,7 @@ py-nightscout==1.2.2
py-synologydsm-api==1.0.4
# homeassistant.components.seventeentrack
-py17track==2021.12.1
+py17track==2021.12.2
# homeassistant.components.control4
pyControl4==0.0.6
@@ -1273,7 +1273,7 @@ sharkiqpy==0.1.8
simplehound==0.3
# homeassistant.components.simplisafe
-simplisafe-python==2021.11.2
+simplisafe-python==2021.12.0
# homeassistant.components.slack
slackclient==2.5.0
@@ -1488,7 +1488,7 @@ zigpy-xbee==0.14.0
zigpy-zigate==0.7.3
# homeassistant.components.zha
-zigpy-znp==0.6.3
+zigpy-znp==0.6.4
# homeassistant.components.zha
zigpy==0.42.0
@@ -117,10 +117,18 @@ async def test_send_add_or_update_message(hass, aioclient_mock):
{"friendly_name": "Test Contact Sensor", "device_class": "door"},
)
-await state_report.async_send_add_or_update_message(
-hass, DEFAULT_CONFIG, ["binary_sensor.test_contact", "zwave.bla"]
hass.states.async_set(
"zwave.bla",
"wow_such_unsupported",
)
entities = [
"binary_sensor.test_contact",
"binary_sensor.non_existing", # Supported, but does not exist
"zwave.bla", # Unsupported
]
await state_report.async_send_add_or_update_message(hass, DEFAULT_CONFIG, entities)
assert len(aioclient_mock.mock_calls) == 1
call = aioclient_mock.mock_calls
@@ -1,6 +1,7 @@
"""Test the Bond config flow."""
from __future__ import annotations
from http import HTTPStatus
from typing import Any
from unittest.mock import MagicMock, Mock, patch
@@ -304,6 +305,46 @@ async def test_zeroconf_form_with_token_available(hass: core.HomeAssistant):
assert len(mock_setup_entry.mock_calls) == 1
async def test_zeroconf_form_with_token_available_name_unavailable(
hass: core.HomeAssistant,
):
"""Test we get the discovery form when we can get the token but the name is unavailable."""
with patch_bond_version(
side_effect=ClientResponseError(Mock(), (), status=HTTPStatus.BAD_REQUEST)
), patch_bond_token(return_value={"token": "discovered-token"}):
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": config_entries.SOURCE_ZEROCONF},
data=zeroconf.ZeroconfServiceInfo(
host="test-host",
hostname="mock_hostname",
name="test-bond-id.some-other-tail-info",
port=None,
properties={},
type="mock_type",
),
)
await hass.async_block_till_done()
assert result["type"] == "form"
assert result["errors"] == {}
with _patch_async_setup_entry() as mock_setup_entry:
result2 = await hass.config_entries.flow.async_configure(
result["flow_id"],
{},
)
await hass.async_block_till_done()
assert result2["type"] == "create_entry"
assert result2["title"] == "test-bond-id"
assert result2["data"] == {
CONF_HOST: "test-host",
CONF_ACCESS_TOKEN: "discovered-token",
}
assert len(mock_setup_entry.mock_calls) == 1
async def test_zeroconf_already_configured(hass: core.HomeAssistant):
"""Test starting a flow from discovery when already configured."""
@@ -47,10 +47,16 @@ async def test_get_entries(hass, client):
@staticmethod
@callback
-def async_get_options_flow(config, options):
+def async_get_options_flow(config_entry):
"""Get options flow."""
pass
@classmethod
@callback
def async_supports_options_flow(cls, config_entry):
"""Return options flow support for this handler."""
return True
hass.helpers.config_entry_flow.register_discovery_flow(
"comp2", "Comp 2", lambda: None
)
@@ -1,9 +1,10 @@
"""Test the Energy websocket API."""
from unittest.mock import AsyncMock, Mock
from unittest.mock import AsyncMock, Mock, patch
import pytest
from homeassistant.components.energy import data, is_configured
from homeassistant.components.recorder import statistics
from homeassistant.components.recorder.statistics import async_add_external_statistics
from homeassistant.setup import async_setup_component
from homeassistant.util import dt as dt_util
@@ -472,8 +473,10 @@ async def test_fossil_energy_consumption_hole(hass, hass_ws_client):
period1 = dt_util.as_utc(dt_util.parse_datetime("2021-09-01 00:00:00"))
period2 = dt_util.as_utc(dt_util.parse_datetime("2021-09-30 23:00:00"))
period2_day_start = dt_util.as_utc(dt_util.parse_datetime("2021-09-30 00:00:00"))
period3 = dt_util.as_utc(dt_util.parse_datetime("2021-10-01 00:00:00"))
period4 = dt_util.as_utc(dt_util.parse_datetime("2021-10-31 23:00:00"))
period4_day_start = dt_util.as_utc(dt_util.parse_datetime("2021-10-31 00:00:00"))
external_energy_statistics_1 = (
{
@@ -575,6 +578,197 @@ async def test_fossil_energy_consumption_hole(hass, hass_ws_client):
period4.isoformat(): pytest.approx(88.0 - 55.0),
}
await client.send_json(
{
"id": 2,
"type": "energy/fossil_energy_consumption",
"start_time": now.isoformat(),
"end_time": later.isoformat(),
"energy_statistic_ids": [
"test:total_energy_import_tariff_1",
"test:total_energy_import_tariff_2",
],
"co2_statistic_id": "test:co2_ratio_missing",
"period": "day",
}
)
response = await client.receive_json()
assert response["success"]
assert response["result"] == {
period2_day_start.isoformat(): pytest.approx(3.0 - 20.0),
period3.isoformat(): pytest.approx(55.0 - 3.0),
period4_day_start.isoformat(): pytest.approx(88.0 - 55.0),
}
await client.send_json(
{
"id": 3,
"type": "energy/fossil_energy_consumption",
"start_time": now.isoformat(),
"end_time": later.isoformat(),
"energy_statistic_ids": [
"test:total_energy_import_tariff_1",
"test:total_energy_import_tariff_2",
],
"co2_statistic_id": "test:co2_ratio_missing",
"period": "month",
}
)
response = await client.receive_json()
assert response["success"]
assert response["result"] == {
period1.isoformat(): pytest.approx(3.0 - 20.0),
period3.isoformat(): pytest.approx((55.0 - 3.0) + (88.0 - 55.0)),
}
@pytest.mark.freeze_time("2021-08-01 00:00:00+00:00")
async def test_fossil_energy_consumption_no_data(hass, hass_ws_client):
"""Test fossil_energy_consumption when there is no data."""
now = dt_util.utcnow()
later = dt_util.as_utc(dt_util.parse_datetime("2022-09-01 00:00:00"))
await hass.async_add_executor_job(init_recorder_component, hass)
await async_setup_component(hass, "history", {})
await async_setup_component(hass, "sensor", {})
period1 = dt_util.as_utc(dt_util.parse_datetime("2021-09-01 00:00:00"))
period2 = dt_util.as_utc(dt_util.parse_datetime("2021-09-30 23:00:00"))
period3 = dt_util.as_utc(dt_util.parse_datetime("2021-10-01 00:00:00"))
period4 = dt_util.as_utc(dt_util.parse_datetime("2021-10-31 23:00:00"))
external_energy_statistics_1 = (
{
"start": period1,
"last_reset": None,
"state": 0,
"sum": None,
},
{
"start": period2,
"last_reset": None,
"state": 1,
"sum": 3,
},
{
"start": period3,
"last_reset": None,
"state": 2,
"sum": 5,
},
{
"start": period4,
"last_reset": None,
"state": 3,
"sum": 8,
},
)
external_energy_metadata_1 = {
"has_mean": False,
"has_sum": True,
"name": "Total imported energy",
"source": "test",
"statistic_id": "test:total_energy_import_tariff_1",
"unit_of_measurement": "kWh",
}
external_energy_statistics_2 = (
{
"start": period1,
"last_reset": None,
"state": 0,
"sum": 20,
},
{
"start": period2,
"last_reset": None,
"state": 1,
"sum": None,
},
{
"start": period3,
"last_reset": None,
"state": 2,
"sum": 50,
},
{
"start": period4,
"last_reset": None,
"state": 3,
"sum": 80,
},
)
external_energy_metadata_2 = {
"has_mean": False,
"has_sum": True,
"name": "Total imported energy",
"source": "test",
"statistic_id": "test:total_energy_import_tariff_2",
"unit_of_measurement": "kWh",
}
async_add_external_statistics(
hass, external_energy_metadata_1, external_energy_statistics_1
)
async_add_external_statistics(
hass, external_energy_metadata_2, external_energy_statistics_2
)
await async_wait_recording_done_without_instance(hass)
client = await hass_ws_client()
await client.send_json(
{
"id": 1,
"type": "energy/fossil_energy_consumption",
"start_time": now.isoformat(),
"end_time": later.isoformat(),
"energy_statistic_ids": [
"test:total_energy_import_tariff_1_missing",
"test:total_energy_import_tariff_2_missing",
],
"co2_statistic_id": "test:co2_ratio_missing",
"period": "hour",
}
)
response = await client.receive_json()
assert response["success"]
assert response["result"] == {}
await client.send_json(
{
"id": 2,
"type": "energy/fossil_energy_consumption",
"start_time": now.isoformat(),
"end_time": later.isoformat(),
"energy_statistic_ids": [
"test:total_energy_import_tariff_1_missing",
"test:total_energy_import_tariff_2_missing",
],
"co2_statistic_id": "test:co2_ratio_missing",
"period": "day",
}
)
response = await client.receive_json()
assert response["success"]
assert response["result"] == {}
await client.send_json(
{
"id": 3,
"type": "energy/fossil_energy_consumption",
"start_time": now.isoformat(),
"end_time": later.isoformat(),
"energy_statistic_ids": [
"test:total_energy_import_tariff_1_missing",
"test:total_energy_import_tariff_2_missing",
],
"co2_statistic_id": "test:co2_ratio_missing",
"period": "month",
}
)
response = await client.receive_json()
assert response["success"]
assert response["result"] == {}
@pytest.mark.freeze_time("2021-08-01 00:00:00+00:00")
async def test_fossil_energy_consumption(hass, hass_ws_client):
@@ -770,6 +964,220 @@ async def test_fossil_energy_consumption(hass, hass_ws_client):
}
@pytest.mark.freeze_time("2021-08-01 00:00:00+00:00")
async def test_fossil_energy_consumption_duplicate(hass, hass_ws_client):
"""Test fossil_energy_consumption with duplicated energy statistics."""
now = dt_util.utcnow()
later = dt_util.as_utc(dt_util.parse_datetime("2022-09-01 00:00:00"))
await hass.async_add_executor_job(init_recorder_component, hass)
await async_setup_component(hass, "history", {})
await async_setup_component(hass, "sensor", {})
period1 = dt_util.as_utc(dt_util.parse_datetime("2021-09-01 00:00:00"))
period2 = dt_util.as_utc(dt_util.parse_datetime("2021-09-30 23:00:00"))
period2_day_start = dt_util.as_utc(dt_util.parse_datetime("2021-09-30 00:00:00"))
period3 = dt_util.as_utc(dt_util.parse_datetime("2021-10-01 00:00:00"))
period4 = dt_util.as_utc(dt_util.parse_datetime("2021-10-31 23:00:00"))
period4_day_start = dt_util.as_utc(dt_util.parse_datetime("2021-10-31 00:00:00"))
external_energy_statistics_1 = (
{
"start": period1,
"last_reset": None,
"state": 0,
"sum": 2,
},
{
"start": period2,
"last_reset": None,
"state": 1,
"sum": 3,
},
{
"start": period3,
"last_reset": None,
"state": 2,
"sum": 4,
},
{
"start": period4,
"last_reset": None,
"state": 3,
"sum": 5,
},
{
"start": period4,
"last_reset": None,
"state": 3,
"sum": 5,
},
)
external_energy_metadata_1 = {
"has_mean": False,
"has_sum": True,
"name": "Total imported energy",
"source": "test",
"statistic_id": "test:total_energy_import_tariff_1",
"unit_of_measurement": "kWh",
}
external_energy_statistics_2 = (
{
"start": period1,
"last_reset": None,
"state": 0,
"sum": 20,
},
{
"start": period2,
"last_reset": None,
"state": 1,
"sum": 30,
},
{
"start": period3,
"last_reset": None,
"state": 2,
"sum": 40,
},
{
"start": period4,
"last_reset": None,
"state": 3,
"sum": 50,
},
{
"start": period4,
"last_reset": None,
"state": 3,
"sum": 50,
},
)
external_energy_metadata_2 = {
"has_mean": False,
"has_sum": True,
"name": "Total imported energy",
"source": "test",
"statistic_id": "test:total_energy_import_tariff_2",
"unit_of_measurement": "kWh",
}
external_co2_statistics = (
{
"start": period1,
"last_reset": None,
"mean": 10,
},
{
"start": period2,
"last_reset": None,
"mean": 30,
},
{
"start": period3,
"last_reset": None,
"mean": 60,
},
{
"start": period4,
"last_reset": None,
"mean": 90,
},
)
external_co2_metadata = {
"has_mean": True,
"has_sum": False,
"name": "Fossil percentage",
"source": "test",
"statistic_id": "test:fossil_percentage",
"unit_of_measurement": "%",
}
with patch.object(
statistics, "_statistics_exists", return_value=False
), patch.object(
statistics, "_insert_statistics", wraps=statistics._insert_statistics
) as insert_statistics_mock:
async_add_external_statistics(
hass, external_energy_metadata_1, external_energy_statistics_1
)
async_add_external_statistics(
hass, external_energy_metadata_2, external_energy_statistics_2
)
async_add_external_statistics(
hass, external_co2_metadata, external_co2_statistics
)
await async_wait_recording_done_without_instance(hass)
assert insert_statistics_mock.call_count == 14
client = await hass_ws_client()
await client.send_json(
{
"id": 1,
"type": "energy/fossil_energy_consumption",
"start_time": now.isoformat(),
"end_time": later.isoformat(),
"energy_statistic_ids": [
"test:total_energy_import_tariff_1",
"test:total_energy_import_tariff_2",
],
"co2_statistic_id": "test:fossil_percentage",
"period": "hour",
}
)
response = await client.receive_json()
assert response["success"]
assert response["result"] == {
period2.isoformat(): pytest.approx((33.0 - 22.0) * 0.3),
period3.isoformat(): pytest.approx((44.0 - 33.0) * 0.6),
period4.isoformat(): pytest.approx((55.0 - 44.0) * 0.9),
}
await client.send_json(
{
"id": 2,
"type": "energy/fossil_energy_consumption",
"start_time": now.isoformat(),
"end_time": later.isoformat(),
"energy_statistic_ids": [
"test:total_energy_import_tariff_1",
"test:total_energy_import_tariff_2",
],
"co2_statistic_id": "test:fossil_percentage",
"period": "day",
}
)
response = await client.receive_json()
assert response["success"]
assert response["result"] == {
period2_day_start.isoformat(): pytest.approx((33.0 - 22.0) * 0.3),
period3.isoformat(): pytest.approx((44.0 - 33.0) * 0.6),
period4_day_start.isoformat(): pytest.approx((55.0 - 44.0) * 0.9),
}
await client.send_json(
{
"id": 3,
"type": "energy/fossil_energy_consumption",
"start_time": now.isoformat(),
"end_time": later.isoformat(),
"energy_statistic_ids": [
"test:total_energy_import_tariff_1",
"test:total_energy_import_tariff_2",
],
"co2_statistic_id": "test:fossil_percentage",
"period": "month",
}
)
response = await client.receive_json()
assert response["success"]
assert response["result"] == {
period1.isoformat(): pytest.approx((33.0 - 22.0) * 0.3),
period3.isoformat(): pytest.approx(
((44.0 - 33.0) * 0.6) + ((55.0 - 44.0) * 0.9)
),
}
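The expected values asserted above are differences of the cumulative kWh sums (tariff 1 + tariff 2), weighted by the mean fossil percentage of each period; redoing that arithmetic outside the test makes the formula explicit (the numbers below are the combined sums from the fixtures above):

```python
# Combined cumulative sums and mean fossil percentage per period
sums = [22.0, 33.0, 44.0, 55.0]
fossil_pct = [10, 30, 60, 90]

# Consumption per period is the delta of the cumulative sum,
# multiplied by the fossil fraction for that period
deltas = [later - earlier for earlier, later in zip(sums, sums[1:])]
fossil_kwh = [d * pct / 100 for d, pct in zip(deltas, fossil_pct[1:])]
print(fossil_kwh)  # approximately [3.3, 6.6, 9.9]
```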
async def test_fossil_energy_consumption_checks(hass, hass_ws_client):
"""Test fossil_energy_consumption parameter validation."""
client = await hass_ws_client(hass)
@@ -23,6 +23,91 @@ async def test_form(hass: HomeAssistant) -> None:
with patch(
"homeassistant.components.enphase_envoy.config_flow.EnvoyReader.getData",
return_value=True,
), patch(
"homeassistant.components.enphase_envoy.config_flow.EnvoyReader.get_full_serial_number",
return_value="1234",
), patch(
"homeassistant.components.enphase_envoy.async_setup_entry",
return_value=True,
) as mock_setup_entry:
result2 = await hass.config_entries.flow.async_configure(
result["flow_id"],
{
"host": "1.1.1.1",
"username": "test-username",
"password": "test-password",
},
)
await hass.async_block_till_done()
assert result2["type"] == "create_entry"
assert result2["title"] == "Envoy 1234"
assert result2["data"] == {
"host": "1.1.1.1",
"name": "Envoy 1234",
"username": "test-username",
"password": "test-password",
}
assert len(mock_setup_entry.mock_calls) == 1
async def test_user_no_serial_number(hass: HomeAssistant) -> None:
"""Test user setup without a serial number."""
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)
assert result["type"] == "form"
assert result["errors"] == {}
with patch(
"homeassistant.components.enphase_envoy.config_flow.EnvoyReader.getData",
return_value=True,
), patch(
"homeassistant.components.enphase_envoy.config_flow.EnvoyReader.get_full_serial_number",
return_value=None,
), patch(
"homeassistant.components.enphase_envoy.async_setup_entry",
return_value=True,
) as mock_setup_entry:
result2 = await hass.config_entries.flow.async_configure(
result["flow_id"],
{
"host": "1.1.1.1",
"username": "test-username",
"password": "test-password",
},
)
await hass.async_block_till_done()
assert result2["type"] == "create_entry"
assert result2["title"] == "Envoy"
assert result2["data"] == {
"host": "1.1.1.1",
"name": "Envoy",
"username": "test-username",
"password": "test-password",
}
assert len(mock_setup_entry.mock_calls) == 1
async def test_user_fetching_serial_fails(hass: HomeAssistant) -> None:
"""Test user setup when fetching the serial number fails."""
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)
assert result["type"] == "form"
assert result["errors"] == {}
with patch(
"homeassistant.components.enphase_envoy.config_flow.EnvoyReader.getData",
return_value=True,
), patch(
"homeassistant.components.enphase_envoy.config_flow.EnvoyReader.get_full_serial_number",
side_effect=httpx.HTTPStatusError(
"any", request=MagicMock(), response=MagicMock()
),
), patch(
"homeassistant.components.enphase_envoy.async_setup_entry",
return_value=True,
@@ -125,6 +210,9 @@ async def test_import(hass: HomeAssistant) -> None:
with patch(
"homeassistant.components.enphase_envoy.config_flow.EnvoyReader.getData",
return_value=True,
), patch(
"homeassistant.components.enphase_envoy.config_flow.EnvoyReader.get_full_serial_number",
return_value="1234",
), patch(
"homeassistant.components.enphase_envoy.async_setup_entry",
return_value=True,
@@ -66,6 +66,7 @@ def _mocked_bulb() -> AIOWifiLedBulb:
bulb.data_receive_callback = callback
bulb.device_type = DeviceType.Bulb
bulb.requires_turn_on = True
bulb.async_setup = AsyncMock(side_effect=_save_setup_callback)
bulb.effect_list = ["some_effect"]
bulb.async_set_custom_pattern = AsyncMock()
@@ -115,6 +116,7 @@ def _mocked_switch() -> AIOWifiLedBulb:
switch.data_receive_callback = callback
switch.device_type = DeviceType.Switch
switch.requires_turn_on = True
switch.async_setup = AsyncMock(side_effect=_save_setup_callback)
switch.async_stop = AsyncMock()
switch.async_update = AsyncMock()
@@ -40,6 +40,8 @@ from . import (
from tests.common import MockConfigEntry
MAC_ADDRESS_DIFFERENT = "ff:bb:ff:dd:ee:ff"
async def test_discovery(hass: HomeAssistant):
"""Test setting up discovery."""
@@ -472,6 +474,34 @@ async def test_discovered_by_dhcp_or_discovery_adds_missing_unique_id(
assert config_entry.unique_id == MAC_ADDRESS
@pytest.mark.parametrize(
"source, data",
[
(config_entries.SOURCE_DHCP, DHCP_DISCOVERY),
(config_entries.SOURCE_DISCOVERY, FLUX_DISCOVERY),
],
)
async def test_discovered_by_dhcp_or_discovery_mac_address_mismatch_host_already_configured(
hass, source, data
):
"""Test we abort if the host is already configured but the mac does not match."""
config_entry = MockConfigEntry(
domain=DOMAIN, data={CONF_HOST: IP_ADDRESS}, unique_id=MAC_ADDRESS_DIFFERENT
)
config_entry.add_to_hass(hass)
with _patch_discovery(), _patch_wifibulb():
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": source}, data=data
)
await hass.async_block_till_done()
assert result["type"] == RESULT_TYPE_ABORT
assert result["reason"] == "already_configured"
assert config_entry.unique_id == MAC_ADDRESS_DIFFERENT
async def test_options(hass: HomeAssistant):
"""Test options flow."""
config_entry = MockConfigEntry(
@@ -226,22 +226,19 @@ async def test_rgb_light(hass: HomeAssistant) -> None:
bulb.async_set_levels.reset_mock()
bulb.async_turn_on.reset_mock()
await hass.services.async_call(
LIGHT_DOMAIN, "turn_on", {ATTR_ENTITY_ID: entity_id}, blocking=True
)
bulb.async_turn_on.assert_called_once()
bulb.async_turn_on.reset_mock()
await async_mock_device_turn_on(hass, bulb)
assert hass.states.get(entity_id).state == STATE_ON
await hass.services.async_call(
LIGHT_DOMAIN,
"turn_on",
{ATTR_ENTITY_ID: entity_id, ATTR_BRIGHTNESS: 100},
blocking=True,
)
# If it's off and the device requires the turn on
# command before setting brightness, we need to make sure it's called
bulb.async_turn_on.assert_called_once()
bulb.async_set_brightness.assert_called_with(100)
bulb.async_set_brightness.reset_mock()
await async_mock_device_turn_on(hass, bulb)
assert hass.states.get(entity_id).state == STATE_ON
await hass.services.async_call(
LIGHT_DOMAIN,
@@ -284,6 +281,120 @@ async def test_rgb_light(hass: HomeAssistant) -> None:
bulb.async_set_effect.reset_mock()
async def test_rgb_light_auto_on(hass: HomeAssistant) -> None:
"""Test an rgb light that does not need the turn on command sent."""
config_entry = MockConfigEntry(
domain=DOMAIN,
data={CONF_HOST: IP_ADDRESS, CONF_NAME: DEFAULT_ENTRY_TITLE},
unique_id=MAC_ADDRESS,
)
config_entry.add_to_hass(hass)
bulb = _mocked_bulb()
bulb.requires_turn_on = False
bulb.raw_state = bulb.raw_state._replace(model_num=0x33) # RGB only model
bulb.color_modes = {FLUX_COLOR_MODE_RGB}
bulb.color_mode = FLUX_COLOR_MODE_RGB
with _patch_discovery(device=bulb), _patch_wifibulb(device=bulb):
await async_setup_component(hass, flux_led.DOMAIN, {flux_led.DOMAIN: {}})
await hass.async_block_till_done()
entity_id = "light.bulb_rgbcw_ddeeff"
state = hass.states.get(entity_id)
assert state.state == STATE_ON
attributes = state.attributes
assert attributes[ATTR_BRIGHTNESS] == 128
assert attributes[ATTR_COLOR_MODE] == "rgb"
assert attributes[ATTR_EFFECT_LIST] == bulb.effect_list
assert attributes[ATTR_SUPPORTED_COLOR_MODES] == ["rgb"]
assert attributes[ATTR_HS_COLOR] == (0, 100)
await hass.services.async_call(
LIGHT_DOMAIN, "turn_off", {ATTR_ENTITY_ID: entity_id}, blocking=True
)
bulb.async_turn_off.assert_called_once()
await async_mock_device_turn_off(hass, bulb)
assert hass.states.get(entity_id).state == STATE_OFF
bulb.brightness = 0
await hass.services.async_call(
LIGHT_DOMAIN,
"turn_on",
{ATTR_ENTITY_ID: entity_id, ATTR_RGB_COLOR: (10, 10, 30)},
blocking=True,
)
# If the bulb is off and we are using existing brightness
# it has to be at least 1 or the bulb won't turn on
bulb.async_turn_on.assert_not_called()
bulb.async_set_levels.assert_called_with(10, 10, 30, brightness=1)
bulb.async_set_levels.reset_mock()
bulb.async_turn_on.reset_mock()
# Should still be called with no kwargs
await hass.services.async_call(
LIGHT_DOMAIN, "turn_on", {ATTR_ENTITY_ID: entity_id}, blocking=True
)
bulb.async_turn_on.assert_called_once()
await async_mock_device_turn_on(hass, bulb)
assert hass.states.get(entity_id).state == STATE_ON
bulb.async_turn_on.reset_mock()
await hass.services.async_call(
LIGHT_DOMAIN,
"turn_on",
{ATTR_ENTITY_ID: entity_id, ATTR_BRIGHTNESS: 100},
blocking=True,
)
bulb.async_turn_on.assert_not_called()
bulb.async_set_brightness.assert_called_with(100)
bulb.async_set_brightness.reset_mock()
await hass.services.async_call(
LIGHT_DOMAIN,
"turn_on",
{ATTR_ENTITY_ID: entity_id, ATTR_RGB_COLOR: (10, 10, 30)},
blocking=True,
)
# If the bulb is on and we are using the existing brightness,
# a value of 0 means we could not read it because an effect
# is in progress, so we use 255
bulb.async_turn_on.assert_not_called()
bulb.async_set_levels.assert_called_with(10, 10, 30, brightness=255)
bulb.async_set_levels.reset_mock()
bulb.brightness = 128
await hass.services.async_call(
LIGHT_DOMAIN,
"turn_on",
{ATTR_ENTITY_ID: entity_id, ATTR_HS_COLOR: (10, 30)},
blocking=True,
)
bulb.async_turn_on.assert_not_called()
bulb.async_set_levels.assert_called_with(255, 191, 178, brightness=128)
bulb.async_set_levels.reset_mock()
await hass.services.async_call(
LIGHT_DOMAIN,
"turn_on",
{ATTR_ENTITY_ID: entity_id, ATTR_EFFECT: "random"},
blocking=True,
)
bulb.async_turn_on.assert_not_called()
bulb.async_set_effect.assert_called_once()
bulb.async_set_effect.reset_mock()
await hass.services.async_call(
LIGHT_DOMAIN,
"turn_on",
{ATTR_ENTITY_ID: entity_id, ATTR_EFFECT: "purple_fade"},
blocking=True,
)
bulb.async_turn_on.assert_not_called()
bulb.async_set_effect.assert_called_with("purple_fade", 50, 50)
bulb.async_set_effect.reset_mock()
async def test_rgb_cct_light(hass: HomeAssistant) -> None:
"""Test an rgb cct light."""
config_entry = MockConfigEntry(
@@ -19,7 +19,7 @@ async def test_binary_sensors(hass, mock_bridge_v2, v2_resources_test_data):
sensor = hass.states.get("binary_sensor.hue_motion_sensor_motion")
assert sensor is not None
assert sensor.state == "off"
-assert sensor.name == "Hue motion sensor: Motion"
+assert sensor.name == "Hue motion sensor Motion"
assert sensor.attributes["device_class"] == "motion"
assert sensor.attributes["motion_valid"] is True
@@ -7,7 +7,7 @@ from aiohue.errors import LinkButtonNotPressed
import pytest
import voluptuous as vol
-from homeassistant import config_entries, data_entry_flow
+from homeassistant import config_entries
from homeassistant.components import ssdp, zeroconf
from homeassistant.components.hue import config_flow, const
from homeassistant.components.hue.errors import CannotConnect
@@ -706,8 +706,7 @@ async def test_options_flow_v2(hass):
)
entry.add_to_hass(hass)
-with pytest.raises(data_entry_flow.UnknownHandler):
-await hass.config_entries.options.async_init(entry.entry_id)
+assert config_flow.HueFlowHandler.async_supports_options_flow(entry) is False
async def test_bridge_zeroconf(hass, aioclient_mock):
@@ -54,12 +54,12 @@ async def test_light_entity_migration(
# create device/entity with V1 schema in registry
device = dev_reg.async_get_or_create(
config_entry_id=config_entry.entry_id,
-identifiers={(hue.DOMAIN, "00:17:88:01:09:aa:bb:65")},
+identifiers={(hue.DOMAIN, "00:17:88:01:09:aa:bb:65-0b")},
)
ent_reg.async_get_or_create(
"light",
hue.DOMAIN,
-"00:17:88:01:09:aa:bb:65",
+"00:17:88:01:09:aa:bb:65-0b",
suggested_object_id="migrated_light_1",
device_id=device.id,
)
@@ -74,14 +74,13 @@ async def test_light_entity_migration(
):
await hue.migration.handle_v2_migration(hass, config_entry)
-# migrated device should have new identifier (guid) and old style (mac)
+# migrated device should now have the new identifier (guid) instead of old style (mac)
migrated_device = dev_reg.async_get(device.id)
assert migrated_device is not None
assert migrated_device.identifiers == {
-(hue.DOMAIN, "0b216218-d811-4c95-8c55-bbcda50f9d50"),
-(hue.DOMAIN, "00:17:88:01:09:aa:bb:65"),
+(hue.DOMAIN, "0b216218-d811-4c95-8c55-bbcda50f9d50")
}
-# the entity should have the new identifier (guid)
+# the entity should have the new unique_id (guid)
migrated_entity = ent_reg.async_get("light.migrated_light_1")
assert migrated_entity is not None
assert migrated_entity.unique_id == "02cba059-9c2c-4d45-97e4-4f79b1bfbaa1"
@@ -131,14 +130,13 @@ async def test_sensor_entity_migration(
):
await hue.migration.handle_v2_migration(hass, config_entry)
-# migrated device should have new identifier (guid) and old style (mac)
+# migrated device should now have the new identifier (guid) instead of old style (mac)
migrated_device = dev_reg.async_get(device.id)
assert migrated_device is not None
assert migrated_device.identifiers == {
-(hue.DOMAIN, "2330b45d-6079-4c6e-bba6-1b68afb1a0d6"),
-(hue.DOMAIN, device_mac),
+(hue.DOMAIN, "2330b45d-6079-4c6e-bba6-1b68afb1a0d6")
}
-# the entities should have the correct V2 identifier (guid)
+# the entities should have the correct V2 unique_id (guid)
for dev_class, platform, new_id in sensor_mappings:
migrated_entity = ent_reg.async_get(
f"{platform}.hue_migrated_{dev_class}_sensor"
@@ -147,7 +145,7 @@ async def test_sensor_entity_migration(
assert migrated_entity.unique_id == new_id
-async def test_group_entity_migration(
+async def test_group_entity_migration_with_v1_id(
hass, mock_bridge_v2, mock_config_entry_v2, v2_resources_test_data
):
"""Test if entity schema for grouped_lights migrates from v1 to v2."""
@@ -156,6 +154,7 @@ async def test_group_entity_migration(
ent_reg = er.async_get(hass)
# create (deviceless) entity with V1 schema in registry
# using the legacy style group id as unique id
ent_reg.async_get_or_create(
"light",
hue.DOMAIN,
@@ -177,3 +176,36 @@ async def test_group_entity_migration(
migrated_entity = ent_reg.async_get("light.hue_migrated_grouped_light")
assert migrated_entity is not None
assert migrated_entity.unique_id == "e937f8db-2f0e-49a0-936e-027e60e15b34"
async def test_group_entity_migration_with_v2_group_id(
hass, mock_bridge_v2, mock_config_entry_v2, v2_resources_test_data
):
"""Test if entity schema for grouped_lights migrates from v1 to v2."""
config_entry = mock_bridge_v2.config_entry = mock_config_entry_v2
ent_reg = er.async_get(hass)
# create (deviceless) entity with V1 schema in registry
# using the V2 group id as unique id
ent_reg.async_get_or_create(
"light",
hue.DOMAIN,
"6ddc9066-7e7d-4a03-a773-c73937968296",
suggested_object_id="hue_migrated_grouped_light",
config_entry=config_entry,
)
# now run the migration and check results
await mock_bridge_v2.api.load_test_data(v2_resources_test_data)
await hass.async_block_till_done()
with patch(
"homeassistant.components.hue.migration.HueBridgeV2",
return_value=mock_bridge_v2.api,
):
await hue.migration.handle_v2_migration(hass, config_entry)
# the entity should have the new identifier (guid)
migrated_entity = ent_reg.async_get("light.hue_migrated_grouped_light")
assert migrated_entity is not None
assert migrated_entity.unique_id == "e937f8db-2f0e-49a0-936e-027e60e15b34"
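The migration these tests exercise boils down to remapping legacy V1 unique_ids onto their V2 guids while leaving ids that are already in V2 form untouched (the `with_v2_group_id` case above). A minimal, library-free sketch of that remapping — the function and parameter names here are illustrative, not the actual Home Assistant implementation:

```python
def migrate_unique_ids(
    entities: dict[str, str], v1_to_v2: dict[str, str]
) -> dict[str, str]:
    """Return an entity_id -> unique_id mapping with V1 ids replaced.

    Ids not present as keys in the v1_to_v2 mapping (e.g. ids that are
    already V2 guids) are passed through unchanged.
    """
    return {
        entity_id: v1_to_v2.get(unique_id, unique_id)
        for entity_id, unique_id in entities.items()
    }
```

Running both test scenarios through this sketch: a legacy id is rewritten, while an entity created with the V2 guid keeps its unique_id.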
@@ -22,7 +22,7 @@ async def test_sensors(hass, mock_bridge_v2, v2_resources_test_data):
sensor = hass.states.get("sensor.hue_motion_sensor_temperature")
assert sensor is not None
assert sensor.state == "18.1"
assert sensor.attributes["friendly_name"] == "Hue motion sensor: Temperature"
assert sensor.attributes["friendly_name"] == "Hue motion sensor Temperature"
assert sensor.attributes["device_class"] == "temperature"
assert sensor.attributes["state_class"] == "measurement"
assert sensor.attributes["unit_of_measurement"] == "°C"
@@ -32,7 +32,7 @@ async def test_sensors(hass, mock_bridge_v2, v2_resources_test_data):
sensor = hass.states.get("sensor.hue_motion_sensor_illuminance")
assert sensor is not None
assert sensor.state == "63"
assert sensor.attributes["friendly_name"] == "Hue motion sensor: Illuminance"
assert sensor.attributes["friendly_name"] == "Hue motion sensor Illuminance"
assert sensor.attributes["device_class"] == "illuminance"
assert sensor.attributes["state_class"] == "measurement"
assert sensor.attributes["unit_of_measurement"] == "lx"
@@ -43,7 +43,7 @@ async def test_sensors(hass, mock_bridge_v2, v2_resources_test_data):
sensor = hass.states.get("sensor.wall_switch_with_2_controls_battery")
assert sensor is not None
assert sensor.state == "100"
assert sensor.attributes["friendly_name"] == "Wall switch with 2 controls: Battery"
assert sensor.attributes["friendly_name"] == "Wall switch with 2 controls Battery"
assert sensor.attributes["device_class"] == "battery"
assert sensor.attributes["state_class"] == "measurement"
assert sensor.attributes["unit_of_measurement"] == "%"
@@ -17,7 +17,7 @@ async def test_switch(hass, mock_bridge_v2, v2_resources_test_data):
# test config switch to enable/disable motion sensor
test_entity = hass.states.get("switch.hue_motion_sensor_motion")
assert test_entity is not None
assert test_entity.name == "Hue motion sensor: Motion"
assert test_entity.name == "Hue motion sensor Motion"
assert test_entity.state == "on"
assert test_entity.attributes["device_class"] == "switch"
@@ -5,6 +5,7 @@ import pytest
from homeassistant import config_entries, data_entry_flow
from homeassistant.components import islamic_prayer_times
from homeassistant.components.islamic_prayer_times import config_flow # noqa: F401
from homeassistant.components.islamic_prayer_times.const import CONF_CALC_METHOD, DOMAIN
from tests.common import MockConfigEntry
@@ -53,31 +53,12 @@ def create_config_entry(hass, token_expiration_time=None) -> MockConfigEntry:
return config_entry
class FakeDeviceManager(DeviceManager):
"""Fake DeviceManager that can supply a list of devices and structures."""
def __init__(self, devices: dict, structures: dict):
"""Initialize FakeDeviceManager."""
super().__init__()
self._devices = devices
@property
def structures(self) -> dict:
"""Override structures with fake result."""
return self._structures
@property
def devices(self) -> dict:
"""Override devices with fake result."""
return self._devices
class FakeSubscriber(GoogleNestSubscriber):
"""Fake subscriber that supplies a FakeDeviceManager."""
def __init__(self, device_manager: FakeDeviceManager):
def __init__(self):
"""Initialize Fake Subscriber."""
self._device_manager = device_manager
self._device_manager = DeviceManager()
def set_update_callback(self, callback: Callable[[EventMessage], Awaitable[None]]):
"""Capture the callback set by Home Assistant."""
@@ -121,8 +102,14 @@ async def async_setup_sdm_platform(
"""Set up the platform and prerequisites."""
if with_config:
create_config_entry(hass)
device_manager = FakeDeviceManager(devices=devices, structures=structures)
subscriber = FakeSubscriber(device_manager)
subscriber = FakeSubscriber()
device_manager = await subscriber.async_get_device_manager()
if devices:
for device in devices.values():
device_manager.add_device(device)
if structures:
for structure in structures.values():
device_manager.add_structure(structure)
with patch(
"homeassistant.helpers.config_entry_oauth2_flow.async_get_config_entry_implementation"
), patch("homeassistant.components.nest.PLATFORMS", [platform]), patch(
@@ -131,4 +118,7 @@ async def async_setup_sdm_platform(
):
assert await async_setup_component(hass, DOMAIN, CONFIG)
await hass.async_block_till_done()
# Disabled to reduce setup burden, and enabled manually by tests that
# need to exercise this
subscriber.cache_policy.fetch = False
return subscriber
@@ -17,7 +17,7 @@ from homeassistant.const import CONF_CLIENT_ID, CONF_CLIENT_SECRET
from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_entry_oauth2_flow
from .common import FakeDeviceManager, FakeSubscriber, MockConfigEntry
from .common import FakeSubscriber, MockConfigEntry
CLIENT_ID = "1234"
CLIENT_SECRET = "5678"
@@ -43,15 +43,9 @@ APP_REDIRECT_URL = "urn:ietf:wg:oauth:2.0:oob"
@pytest.fixture
def device_manager() -> FakeDeviceManager:
"""Create FakeDeviceManager."""
return FakeDeviceManager(devices={}, structures={})
@pytest.fixture
def subscriber(device_manager: FakeDeviceManager) -> FakeSubscriber:
def subscriber() -> FakeSubscriber:
"""Create FakeSubscriber."""
return FakeSubscriber(device_manager)
return FakeSubscriber()
def get_config_entry(hass):
@@ -117,7 +117,7 @@ async def test_doorbell_chime_event(hass):
"device_id": entry.device_id,
"type": "doorbell_chime",
"timestamp": event_time,
"nest_event_id": EVENT_ID,
"nest_event_id": EVENT_SESSION_ID,
}
@@ -145,7 +145,7 @@ async def test_camera_motion_event(hass):
"device_id": entry.device_id,
"type": "camera_motion",
"timestamp": event_time,
"nest_event_id": EVENT_ID,
"nest_event_id": EVENT_SESSION_ID,
}
@@ -173,7 +173,7 @@ async def test_camera_sound_event(hass):
"device_id": entry.device_id,
"type": "camera_sound",
"timestamp": event_time,
"nest_event_id": EVENT_ID,
"nest_event_id": EVENT_SESSION_ID,
}
@@ -201,7 +201,7 @@ async def test_camera_person_event(hass):
"device_id": entry.device_id,
"type": "camera_person",
"timestamp": event_time,
"nest_event_id": EVENT_ID,
"nest_event_id": EVENT_SESSION_ID,
}
@@ -238,13 +238,13 @@ async def test_camera_multiple_event(hass):
"device_id": entry.device_id,
"type": "camera_motion",
"timestamp": event_time,
"nest_event_id": EVENT_ID,
"nest_event_id": EVENT_SESSION_ID,
}
assert events[1].data == {
"device_id": entry.device_id,
"type": "camera_person",
"timestamp": event_time,
"nest_event_id": EVENT_ID,
"nest_event_id": EVENT_SESSION_ID,
}
@@ -27,6 +27,7 @@ DEVICE_ID = "example/api/device/id"
DEVICE_NAME = "Front"
PLATFORM = "camera"
NEST_EVENT = "nest_event"
EVENT_ID = "1aXEvi9ajKVTdDsXdJda8fzfCa..."
EVENT_SESSION_ID = "CjY5Y3VKaTZwR3o4Y19YbTVfMF..."
CAMERA_DEVICE_TYPE = "sdm.devices.types.CAMERA"
CAMERA_TRAITS = {
@@ -81,27 +82,31 @@ async def async_setup_devices(hass, auth, device_type, traits={}, events=[]):
return subscriber
def create_event(event_id, event_type, timestamp=None):
def create_event(
event_session_id, event_id, event_type, timestamp=None, device_id=None
):
"""Create an EventMessage for a single event type."""
if not timestamp:
timestamp = dt_util.now()
event_data = {
event_type: {
"eventSessionId": EVENT_SESSION_ID,
"eventSessionId": event_session_id,
"eventId": event_id,
},
}
return create_event_message(event_id, event_data, timestamp)
return create_event_message(event_data, timestamp, device_id=device_id)
def create_event_message(event_id, event_data, timestamp):
def create_event_message(event_data, timestamp, device_id=None):
"""Create an EventMessage for a single event type."""
if device_id is None:
device_id = DEVICE_ID
return EventMessage(
{
"eventId": f"{event_id}-{timestamp}",
"eventId": f"{EVENT_ID}-{timestamp}",
"timestamp": timestamp.isoformat(timespec="seconds"),
"resourceUpdate": {
"name": DEVICE_ID,
"name": device_id,
"events": event_data,
},
},
@@ -161,7 +166,6 @@ async def test_supported_device(hass, auth):
async def test_camera_event(hass, auth, hass_client):
"""Test a media source and image created for an event."""
event_id = "FWWVQVUdGNUlTU2V4MGV2aTNXV..."
event_timestamp = dt_util.now()
await async_setup_devices(
hass,
@@ -170,7 +174,8 @@ async def test_camera_event(hass, auth, hass_client):
CAMERA_TRAITS,
events=[
create_event(
event_id,
EVENT_SESSION_ID,
EVENT_ID,
PERSON_EVENT,
timestamp=event_timestamp,
),
@@ -211,7 +216,7 @@ async def test_camera_event(hass, auth, hass_client):
# The device expands recent events
assert len(browse.children) == 1
assert browse.children[0].domain == DOMAIN
assert browse.children[0].identifier == f"{device.id}/{event_id}"
assert browse.children[0].identifier == f"{device.id}/{EVENT_SESSION_ID}"
event_timestamp_string = event_timestamp.strftime(DATE_STR_FORMAT)
assert browse.children[0].title == f"Person @ {event_timestamp_string}"
assert not browse.children[0].can_expand
@@ -219,19 +224,19 @@ async def test_camera_event(hass, auth, hass_client):
# Browse to the event
browse = await media_source.async_browse_media(
hass, f"{const.URI_SCHEME}{DOMAIN}/{device.id}/{event_id}"
hass, f"{const.URI_SCHEME}{DOMAIN}/{device.id}/{EVENT_SESSION_ID}"
)
assert browse.domain == DOMAIN
assert browse.identifier == f"{device.id}/{event_id}"
assert browse.identifier == f"{device.id}/{EVENT_SESSION_ID}"
assert "Person" in browse.title
assert not browse.can_expand
assert not browse.children
# Resolving the event links to the media
media = await media_source.async_resolve_media(
hass, f"{const.URI_SCHEME}{DOMAIN}/{device.id}/{event_id}"
hass, f"{const.URI_SCHEME}{DOMAIN}/{device.id}/{EVENT_SESSION_ID}"
)
assert media.url == f"/api/nest/event_media/{device.id}/{event_id}"
assert media.url == f"/api/nest/event_media/{device.id}/{EVENT_SESSION_ID}"
assert media.mime_type == "image/jpeg"
auth.responses = [
@@ -248,9 +253,9 @@ async def test_camera_event(hass, auth, hass_client):
async def test_event_order(hass, auth):
"""Test multiple events are in descending timestamp order."""
event_id1 = "FWWVQVUdGNUlTU2V4MGV2aTNXV..."
event_session_id1 = "FWWVQVUdGNUlTU2V4MGV2aTNXV..."
event_timestamp1 = dt_util.now()
event_id2 = "GXXWRWVeHNUlUU3V3MGV3bUOYW..."
event_session_id2 = "GXXWRWVeHNUlUU3V3MGV3bUOYW..."
event_timestamp2 = event_timestamp1 + datetime.timedelta(seconds=5)
await async_setup_devices(
hass,
@@ -259,12 +264,14 @@ async def test_event_order(hass, auth):
CAMERA_TRAITS,
events=[
create_event(
event_id1,
event_session_id1,
EVENT_ID + "1",
PERSON_EVENT,
timestamp=event_timestamp1,
),
create_event(
event_id2,
event_session_id2,
EVENT_ID + "2",
MOTION_EVENT,
timestamp=event_timestamp2,
),
@@ -291,7 +298,7 @@ async def test_event_order(hass, auth):
# Motion event is most recent
assert len(browse.children) == 2
assert browse.children[0].domain == DOMAIN
assert browse.children[0].identifier == f"{device.id}/{event_id2}"
assert browse.children[0].identifier == f"{device.id}/{event_session_id2}"
event_timestamp_string = event_timestamp2.strftime(DATE_STR_FORMAT)
assert browse.children[0].title == f"Motion @ {event_timestamp_string}"
assert not browse.children[0].can_expand
@@ -299,7 +306,7 @@ async def test_event_order(hass, auth):
# Person event is next
assert browse.children[1].domain == DOMAIN
assert browse.children[1].identifier == f"{device.id}/{event_id1}"
assert browse.children[1].identifier == f"{device.id}/{event_session_id1}"
event_timestamp_string = event_timestamp1.strftime(DATE_STR_FORMAT)
assert browse.children[1].title == f"Person @ {event_timestamp_string}"
assert not browse.children[1].can_expand
@@ -393,9 +400,12 @@ async def test_resolve_invalid_event_id(hass, auth):
async def test_camera_event_clip_preview(hass, auth, hass_client):
"""Test an event for a battery camera video clip."""
event_id = "FWWVQVUdGNUlTU2V4MGV2aTNXV..."
event_timestamp = dt_util.now()
event_data = {
"sdm.devices.events.CameraMotion.Motion": {
"eventSessionId": EVENT_SESSION_ID,
"eventId": "n:2",
},
"sdm.devices.events.CameraClipPreview.ClipPreview": {
"eventSessionId": EVENT_SESSION_ID,
"previewUrl": "https://127.0.0.1/example",
@@ -408,7 +418,6 @@ async def test_camera_event_clip_preview(hass, auth, hass_client):
BATTERY_CAMERA_TRAITS,
events=[
create_event_message(
event_id,
event_data,
timestamp=event_timestamp,
),
@@ -437,7 +446,7 @@ async def test_camera_event_clip_preview(hass, auth, hass_client):
assert browse.children[0].domain == DOMAIN
actual_event_id = browse.children[0].identifier
event_timestamp_string = event_timestamp.strftime(DATE_STR_FORMAT)
assert browse.children[0].title == f"Event @ {event_timestamp_string}"
assert browse.children[0].title == f"Motion @ {event_timestamp_string}"
assert not browse.children[0].can_expand
assert len(browse.children[0].children) == 0
@@ -488,7 +497,6 @@ async def test_event_media_render_invalid_event_id(hass, auth, hass_client):
async def test_event_media_failure(hass, auth, hass_client):
"""Test event media fetch sees a failure from the server."""
event_id = "FWWVQVUdGNUlTU2V4MGV2aTNXV..."
event_timestamp = dt_util.now()
await async_setup_devices(
hass,
@@ -497,7 +505,8 @@ async def test_event_media_failure(hass, auth, hass_client):
CAMERA_TRAITS,
events=[
create_event(
event_id,
EVENT_SESSION_ID,
EVENT_ID,
PERSON_EVENT,
timestamp=event_timestamp,
),
@@ -515,9 +524,9 @@ async def test_event_media_failure(hass, auth, hass_client):
# Resolving the event links to the media
media = await media_source.async_resolve_media(
hass, f"{const.URI_SCHEME}{DOMAIN}/{device.id}/{event_id}"
hass, f"{const.URI_SCHEME}{DOMAIN}/{device.id}/{EVENT_SESSION_ID}"
)
assert media.url == f"/api/nest/event_media/{device.id}/{event_id}"
assert media.url == f"/api/nest/event_media/{device.id}/{EVENT_SESSION_ID}"
assert media.mime_type == "image/jpeg"
auth.responses = [
@@ -533,7 +542,6 @@ async def test_event_media_failure(hass, auth, hass_client):
async def test_media_permission_unauthorized(hass, auth, hass_client, hass_admin_user):
"""Test case where user does not have permissions to view media."""
event_id = "FWWVQVUdGNUlTU2V4MGV2aTNXV..."
event_timestamp = dt_util.now()
await async_setup_devices(
hass,
@@ -542,7 +550,8 @@ async def test_media_permission_unauthorized(hass, auth, hass_client, hass_admin
CAMERA_TRAITS,
events=[
create_event(
event_id,
EVENT_SESSION_ID,
EVENT_ID,
PERSON_EVENT,
timestamp=event_timestamp,
),
@@ -558,7 +567,7 @@ async def test_media_permission_unauthorized(hass, auth, hass_client, hass_admin
assert device
assert device.name == DEVICE_NAME
media_url = f"/api/nest/event_media/{device.id}/{event_id}"
media_url = f"/api/nest/event_media/{device.id}/{EVENT_SESSION_ID}"
# Empty policy with no access to the entity
hass_admin_user.mock_policy({})
@@ -568,3 +577,82 @@ async def test_media_permission_unauthorized(hass, auth, hass_client, hass_admin
assert response.status == HTTPStatus.UNAUTHORIZED, (
"Response not matched: %s" % response
)
async def test_multiple_devices(hass, auth, hass_client):
"""Test events received for multiple devices."""
device_id1 = f"{DEVICE_ID}-1"
device_id2 = f"{DEVICE_ID}-2"
devices = {
device_id1: Device.MakeDevice(
{
"name": device_id1,
"type": CAMERA_DEVICE_TYPE,
"traits": CAMERA_TRAITS,
},
auth=auth,
),
device_id2: Device.MakeDevice(
{
"name": device_id2,
"type": CAMERA_DEVICE_TYPE,
"traits": CAMERA_TRAITS,
},
auth=auth,
),
}
subscriber = await async_setup_sdm_platform(hass, PLATFORM, devices=devices)
device_registry = dr.async_get(hass)
device1 = device_registry.async_get_device({(DOMAIN, device_id1)})
assert device1
device2 = device_registry.async_get_device({(DOMAIN, device_id2)})
assert device2
# Verify no events have been received yet
browse = await media_source.async_browse_media(
hass, f"{const.URI_SCHEME}{DOMAIN}/{device1.id}"
)
assert len(browse.children) == 0
browse = await media_source.async_browse_media(
hass, f"{const.URI_SCHEME}{DOMAIN}/{device2.id}"
)
assert len(browse.children) == 0
# Send events for device #1
for i in range(0, 5):
await subscriber.async_receive_event(
create_event(
f"event-session-id-{i}",
f"event-id-{i}",
PERSON_EVENT,
device_id=device_id1,
)
)
browse = await media_source.async_browse_media(
hass, f"{const.URI_SCHEME}{DOMAIN}/{device1.id}"
)
assert len(browse.children) == 5
browse = await media_source.async_browse_media(
hass, f"{const.URI_SCHEME}{DOMAIN}/{device2.id}"
)
assert len(browse.children) == 0
# Send events for device #2
for i in range(0, 3):
await subscriber.async_receive_event(
create_event(
f"other-id-{i}", f"event-id{i}", PERSON_EVENT, device_id=device_id2
)
)
browse = await media_source.async_browse_media(
hass, f"{const.URI_SCHEME}{DOMAIN}/{device1.id}"
)
assert len(browse.children) == 5
browse = await media_source.async_browse_media(
hass, f"{const.URI_SCHEME}{DOMAIN}/{device2.id}"
)
assert len(browse.children) == 3
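`test_multiple_devices` asserts that received events are bucketed per device so one camera's media never leaks into another's browse results. A hypothetical, self-contained sketch of that bucketing (the class and method names are assumptions for illustration, not the nest integration's real API):

```python
from collections import defaultdict


class EventStore:
    """Keep received events grouped by the device that produced them."""

    def __init__(self) -> None:
        self._events: dict[str, list] = defaultdict(list)

    def receive(self, device_id: str, event) -> None:
        """Record an event against its originating device."""
        self._events[device_id].append(event)

    def browse(self, device_id: str) -> list:
        """Return the events for one device only; unknown devices are empty."""
        return list(self._events[device_id])
```

With five events for one device and three for another, each `browse` call sees only its own device's events — mirroring the 5/3/0 counts asserted above.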
@@ -5,7 +5,10 @@
"id": "91763b24c43d3e344f424e8b",
"name": "MYHOME",
"altitude": 112,
"coordinates": [52.516263, 13.377726],
"coordinates": [
52.516263,
13.377726
],
"country": "DE",
"timezone": "Europe/Berlin",
"rooms": [
@@ -13,25 +16,33 @@
"id": "2746182631",
"name": "Livingroom",
"type": "livingroom",
"module_ids": ["12:34:56:00:01:ae"]
"module_ids": [
"12:34:56:00:01:ae"
]
},
{
"id": "3688132631",
"name": "Hall",
"type": "custom",
"module_ids": ["12:34:56:00:f1:62"]
"module_ids": [
"12:34:56:00:f1:62"
]
},
{
"id": "2833524037",
"name": "Entrada",
"type": "lobby",
"module_ids": ["12:34:56:03:a5:54"]
"module_ids": [
"12:34:56:03:a5:54"
]
},
{
"id": "2940411577",
"name": "Cocina",
"type": "kitchen",
"module_ids": ["12:34:56:03:a0:ac"]
"module_ids": [
"12:34:56:03:a0:ac"
]
}
],
"modules": [
@@ -388,6 +399,85 @@
}
],
"therm_mode": "schedule"
},
{
"id": "111111111111111111111401",
"name": "Home with no modules",
"altitude": 9,
"coordinates": [
1.23456789,
50.0987654
],
"country": "BE",
"timezone": "Europe/Brussels",
"rooms": [
{
"id": "1111111401",
"name": "Livingroom",
"type": "livingroom"
}
],
"temperature_control_mode": "heating",
"therm_mode": "away",
"therm_setpoint_default_duration": 120,
"cooling_mode": "schedule",
"schedules": [
{
"away_temp": 14,
"hg_temp": 7,
"name": "Week",
"timetable": [
{
"zone_id": 1,
"m_offset": 0
},
{
"zone_id": 6,
"m_offset": 420
}
],
"zones": [
{
"type": 0,
"name": "Comfort",
"rooms_temp": [],
"id": 0,
"rooms": []
},
{
"type": 1,
"name": "Nacht",
"rooms_temp": [],
"id": 1,
"rooms": []
},
{
"type": 5,
"name": "Eco",
"rooms_temp": [],
"id": 4,
"rooms": []
},
{
"type": 4,
"name": "Tussenin",
"rooms_temp": [],
"id": 5,
"rooms": []
},
{
"type": 4,
"name": "Ochtend",
"rooms_temp": [],
"id": 6,
"rooms": []
}
],
"id": "700000000000000000000401",
"selected": true,
"type": "therm"
}
]
}
],
"user": {
@@ -404,4 +494,4 @@
"status": "ok",
"time_exec": 0.056135892868042,
"time_server": 1559171003
}
}
@@ -0,0 +1,4 @@
{
"status": "ok",
"time_server": 1638873670
}
@@ -9,6 +9,7 @@ from hatasmota.utils import (
get_topic_tele_sensor,
get_topic_tele_will,
)
import pytest
from homeassistant.components import cover
from homeassistant.components.tasmota.const import DEFAULT_PREFIX
@@ -34,6 +35,35 @@ async def test_missing_relay(hass, mqtt_mock, setup_tasmota):
"""Test no cover is discovered if relays are missing."""
@pytest.mark.parametrize(
"relay_config, num_covers",
[
([3, 3, 3, 3, 3, 3, 1, 1, 3, 3], 4),
([3, 3, 3, 3, 0, 0, 0, 0], 2),
([3, 3, 1, 1, 0, 0, 0, 0], 1),
([3, 3, 3, 1, 0, 0, 0, 0], 0),
],
)
async def test_multiple_covers(
hass, mqtt_mock, setup_tasmota, relay_config, num_covers
):
"""Test discovery of multiple covers."""
config = copy.deepcopy(DEFAULT_CONFIG)
config["rl"] = relay_config
mac = config["mac"]
assert len(hass.states.async_all("cover")) == 0
async_fire_mqtt_message(
hass,
f"{DEFAULT_PREFIX}/{mac}/config",
json.dumps(config),
)
await hass.async_block_till_done()
assert len(hass.states.async_all("cover")) == num_covers
async def test_controlling_state_via_mqtt(hass, mqtt_mock, setup_tasmota):
"""Test state update via MQTT."""
config = copy.deepcopy(DEFAULT_CONFIG)
@@ -24,7 +24,7 @@ async def test_sensors(hass: HomeAssistant) -> None:
"sensor.tesla_wall_connector_grid_frequency", "50.021", "49.981"
),
EntityAndExpectedValues(
"sensor.tesla_wall_connector_total_energy", "988.022", "989.0"
"sensor.tesla_wall_connector_energy", "988022", "989000"
),
EntityAndExpectedValues(
"sensor.tesla_wall_connector_phase_a_current", "10", "7"
@@ -1,4 +1,5 @@
"""Test the frame helper."""
# pylint: disable=protected-access
from unittest.mock import Mock, patch
import pytest
@@ -70,3 +71,24 @@ async def test_extract_frame_no_integration(caplog):
],
), pytest.raises(frame.MissingIntegrationFrame):
frame.get_integration_frame()
@pytest.mark.usefixtures("mock_integration_frame")
@patch.object(frame, "_REPORTED_INTEGRATIONS", set())
async def test_prevent_flooding(caplog):
"""Test to ensure a report is only written once to the log."""
what = "accessed hi instead of hello"
key = "/home/paulus/homeassistant/components/hue/light.py:23"
frame.report(what, error_if_core=False)
assert what in caplog.text
assert key in frame._REPORTED_INTEGRATIONS
assert len(frame._REPORTED_INTEGRATIONS) == 1
caplog.clear()
frame.report(what, error_if_core=False)
assert what not in caplog.text
assert key in frame._REPORTED_INTEGRATIONS
assert len(frame._REPORTED_INTEGRATIONS) == 1