Compare commits

76 Commits

Author SHA1 Message Date
Paulus Schoutsen
ca5a88342b 2023.3.6 (#90150) 2023-03-22 23:58:20 -04:00
Paulus Schoutsen
117113cdfc Bumped version to 2023.3.6 2023-03-22 22:59:47 -04:00
Paulus Schoutsen
174342860b Always enforce URL param ordering for signed URLs (#90148)
Always enforce URL param ordering
2023-03-22 22:59:42 -04:00
J. Nick Koston
a7b5a0297e Bump PySwitchbot to 0.37.4 (#90146)
fixes #90090 fixes #89061

changelog: https://github.com/Danielhiversen/pySwitchbot/compare/0.37.3...0.37.4
2023-03-22 22:59:41 -04:00
Luke
406e92511b Bump to oralb-ble 0.17.6 (#90081) 2023-03-22 22:59:40 -04:00
Klaas Schoute
146347e31a Bump easyEnergy to v0.2.2 (#90080) 2023-03-22 22:59:39 -04:00
Klaas Schoute
3747fd5dcb Bump easyEnergy to v0.2.1 (#89630) 2023-03-22 22:59:38 -04:00
J. Nick Koston
53d400ca96 Bump yalexs-ble to 2.1.1 (#90015)
* Bump yalexs-ble to 2.1.1

There was another task that could be prematurely GCed

changelog: https://github.com/bdraco/yalexs-ble/compare/v2.1.0...v2.1.1

* fixes
2023-03-22 22:57:41 -04:00
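The "prematurely GCed" task mentioned here is a common asyncio pitfall: the event loop keeps only weak references to tasks, so a task created without a saved reference can be garbage-collected before it finishes. A minimal sketch of the standard fix (not the library's actual code):

```python
import asyncio

# Hold strong references until each task completes, then drop them.
background_tasks: set[asyncio.Task] = set()

async def do_work() -> None:
    await asyncio.sleep(1)

async def main() -> None:
    task = asyncio.create_task(do_work())
    background_tasks.add(task)
    task.add_done_callback(background_tasks.discard)
    await task

asyncio.run(main())
```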
J. Nick Koston
2a18261efb Bump yalexs_ble to 2.1.0 (#89772)
switches to using cryptography to reduce the number of deps

changelog: https://github.com/bdraco/yalexs-ble/compare/v2.0.4...v2.1.0
2023-03-22 22:57:40 -04:00
J. Nick Koston
1f71068740 Handle cancellation of wait_for_ble_connections_free in esphome bluetooth (#90014)
Handle cancellation in wait_for_ble_connections_free

If `wait_for_ble_connections_free` was cancelled due to a timeout or
the ESP disconnecting from Home Assistant, the future would get
cancelled. When we reconnect and get the next callback, we need
to handle the future already being done.

fixes
```
2023-03-21 02:34:36.876 ERROR (MainThread) [homeassistant] Error doing job: Fatal error: protocol.data_received() call failed.
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/asyncio/selector_events.py", line 868, in _read_ready__data_received
    self._protocol.data_received(data)
  File "/usr/local/lib/python3.10/site-packages/aioesphomeapi/_frame_helper.py", line 195, in data_received
    self._callback_packet(msg_type_int, bytes(packet_data))
  File "/usr/local/lib/python3.10/site-packages/aioesphomeapi/_frame_helper.py", line 110, in _callback_packet
    self._on_pkt(Packet(type_, data))
  File "/usr/local/lib/python3.10/site-packages/aioesphomeapi/connection.py", line 688, in _process_packet
    handler(msg)
  File "/usr/local/lib/python3.10/site-packages/aioesphomeapi/client.py", line 482, in on_msg
    on_bluetooth_connections_free_update(resp.free, resp.limit)
  File "/usr/src/homeassistant/homeassistant/components/esphome/entry_data.py", line 136, in async_update_ble_connection_limits
    fut.set_result(free)
asyncio.exceptions.InvalidStateError: invalid state
```
2023-03-22 22:54:42 -04:00
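The guard behind this fix is easy to reproduce in isolation: a cancelled future counts as done, and calling `set_result()` on it raises the `InvalidStateError` seen in the traceback above. A minimal sketch:

```python
import asyncio

async def main() -> None:
    fut = asyncio.get_running_loop().create_future()
    fut.cancel()  # e.g. a timeout or the ESP disconnecting cancels the waiter

    # A cancelled future counts as done; set_result() on it would raise
    # InvalidStateError, so check fut.done() first.
    if not fut.done():
        fut.set_result(1)

asyncio.run(main())
```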
micha91
92fb978a03 Bump aiomusiccast to 0.14.8 (#89978) 2023-03-22 22:54:42 -04:00
J. Nick Koston
127f2289a1 Remove async_block_till_done in freebox (#89928)
async_block_till_done() is not meant to be called in integrations
2023-03-22 22:54:41 -04:00
Jan Bouwhuis
de6f55dcfb Fix blocking MQTT entry unload (#89922)
* Remove unneeded async_block_till_done

* use await asyncio.sleep(0) instead
2023-03-22 22:54:40 -04:00
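The replacement pattern here is a single cooperative yield: `await asyncio.sleep(0)` hands control back to the event loop for one iteration so already-scheduled callbacks can run, whereas `async_block_till_done()` waits for the entire job queue and is meant for tests only. A small illustration with a hypothetical cleanup callback:

```python
import asyncio

async def unload() -> None:
    # A hypothetical cleanup callback already queued on the loop.
    asyncio.get_running_loop().call_soon(print, "cleanup ran")
    # One zero-sleep yields to the event loop for a single iteration so
    # queued callbacks can run, without draining the whole job queue the
    # way async_block_till_done() would.
    await asyncio.sleep(0)

asyncio.run(unload())
```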
Joakim Plate
713d3025f2 Correct missing wordswap for S series nibe (#89866)
Correct missing wordswap for nibe
2023-03-22 22:54:39 -04:00
J. Nick Koston
1e03ff68a2 Bump aioharmony to 0.2.10 (#89831)
fixes #89823
2023-03-22 22:54:38 -04:00
Jan Bouwhuis
a5aa5c0c01 Fix imap_email_content unknown status and replaying stale states (#89563) 2023-03-22 22:54:37 -04:00
Franck Nijhof
b6d001bfe6 2023.3.5 (#89814) 2023-03-16 20:45:14 +01:00
Franck Nijhof
7e18e15cac Bumped version to 2023.3.5 2023-03-16 18:48:17 +01:00
Bram Kragten
e651ca747b Update frontend to 20230309.1 (#89802) 2023-03-16 18:47:51 +01:00
J. Nick Koston
9fa73fe3a9 Bump aioesphomeapi to 13.5.1 (#89777) 2023-03-16 18:47:47 +01:00
Jan Bouwhuis
abda7b8a5b Fix imap server push holding HA startup (#89750) 2023-03-16 18:47:44 +01:00
jan iversen
90a4afb6fa Correct modbus serial method parameter (#89738) 2023-03-16 18:47:40 +01:00
Marcio Granzotto Rodrigues
52981699cf Bump bond-async to 0.1.23 (#89697) 2023-03-16 18:47:37 +01:00
Joakim Plate
c3d7696c2e Update to nibe 2.1.4 (#89686) 2023-03-16 18:47:33 +01:00
jan iversen
f120bac17f Ensure modbus hub_collect remains valid (#89684)
Ensure hub_collect remains valid.
2023-03-16 18:47:28 +01:00
Joakim Plate
02738fb9d4 Handle int or mapping for off case in nibe cooling (#89680)
Handle int or mapping for off case in nibe
2023-03-16 18:47:25 +01:00
J. Nick Koston
a9a6ff50cc Bump aioesphomeapi to 13.5.0 (#89262) 2023-03-16 18:47:21 +01:00
zhangshengdong29
fdd9c5383f ArestData does not have `available` (#88631) 2023-03-16 18:47:17 +01:00
Paulus Schoutsen
d084e70aff 2023.3.4 (#89647) 2023-03-14 00:10:23 -04:00
puddly
69582b7ecb Bump ZHA dependencies (#89667)
* Bump `zha-quirks` library and account for `setup_quirks` signature

* Bump other ZHA dependencies

* Revert zigpy bump
2023-03-13 22:06:05 -04:00
Paulus Schoutsen
160518350f Bump SQLAlchemy to 2.0.6 (#89650) 2023-03-13 14:54:27 -04:00
Paulus Schoutsen
daa5718a80 Bumped version to 2023.3.4 2023-03-13 13:26:50 -04:00
tomrennen
f5562e93ac Improved "ON" state check for "Use room sensor for cooling" (#89634) 2023-03-13 13:26:44 -04:00
Erik Montnemery
d2f90236d1 Rename modules named repairs.py which are not repairs platforms (#89618) 2023-03-13 13:26:43 -04:00
J. Nick Koston
65c614421a Increase maximum aiohttp connections to 4096 (#89611)
fixes #89408
2023-03-13 13:26:41 -04:00
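In aiohttp the per-session connection cap is set via `TCPConnector(limit=...)`; raising it keeps concurrent requests from queuing behind a saturated pool. An illustrative sketch (not the actual Home Assistant helper):

```python
import asyncio
import aiohttp

async def main() -> None:
    # The connector's limit bounds the session's connection pool size.
    async with aiohttp.ClientSession(
        connector=aiohttp.TCPConnector(limit=4096)
    ) as session:
        async with session.get("https://example.com") as resp:
            print(resp.status)

asyncio.run(main())
```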
Eugenio Panadero
22922da607 Bump aiopvpc to 4.1.0 (#89593) 2023-03-13 13:26:40 -04:00
J. Nick Koston
ca0304ffc4 Fix get_significant_states_with_session query looking at legacy columns (#89558) 2023-03-13 13:26:39 -04:00
Robert Svensson
950a1f6e9e Bump pydeconz to v110 (#89527)
* Bump pydeconz to v109

* Bump pydeconz to v110 for additional color modes
2023-03-13 13:26:38 -04:00
rappenze
1e7f58d859 Fix bug in fibaro cover (#89502) 2023-03-13 13:26:37 -04:00
J. Nick Koston
7cb4620671 Fix data migration never finishing when database has invalid datetimes (#89474)
* Fix data migration never finishing when database has invalid datetimes

If there were impossible datetime values in the database (likely
from a manual SQLite to MySQL conversion), the conversion would
never complete

* Update homeassistant/components/recorder/migration.py
2023-03-13 13:26:36 -04:00
Kevin Worrel
8c2569d2ce Reconnect on any ScreenLogic exception (#89269)
Co-authored-by: J. Nick Koston <nick@koston.org>
2023-03-13 13:26:34 -04:00
Arjan
6ebd493c4d Fix gtfs with 2023.3 (sqlachemy update) (#89175) 2023-03-13 13:26:33 -04:00
Jan Stienstra
990ecbba72 Recode Home Assistant instance name to ascii for Jellyfin (#87368)
Recode instance name to ascii
2023-03-13 13:26:32 -04:00
Paulus Schoutsen
ddde17606d 2023.3.3 (#89459) 2023-03-09 14:40:06 -05:00
Paulus Schoutsen
3fba181e7b Bumped version to 2023.3.3 2023-03-09 13:30:46 -05:00
Erik Montnemery
da79bf8534 Fix Dormakaba dKey deadbolt binary sensor (#89447)
* Fix Dormakaba dKey deadbolt binary sensor

* Spelling
2023-03-09 13:18:23 -05:00
Paul Bottein
83e2cc32b7 Update frontend to 20230309.0 (#89446) 2023-03-09 13:18:22 -05:00
Joakim Sørensen
c7fb404a17 Add paths for add-on changelog and documentation (#89411) 2023-03-09 13:18:21 -05:00
Jan Bouwhuis
f1e114380a Allow enum as MQTT sensor device_class (#89391) 2023-03-09 13:18:20 -05:00
Brandon Rothweiler
04e4a644cb Bump pymazda to 0.3.8 (#89387) 2023-03-09 13:18:19 -05:00
Dillon Fearns
e606c2e227 Bump roombapy to 1.6.6 (#89366)
Co-authored-by: J. Nick Koston <nick@koston.org>
2023-03-09 13:18:17 -05:00
Jan Bouwhuis
ebf95feff3 Fix MQTT rgb light brightness scaling (#89264)
* Normalize received RGB colors to 100% brightness

* Assert on rgb_color attribute

* Use max for RGB to get brightness

* Avoid division and add clamp

* remove clamp

Co-authored-by: Erik Montnemery <erik@montnemery.com>

---------

Co-authored-by: Erik Montnemery <erik@montnemery.com>
2023-03-09 13:18:15 -05:00
Franck Nijhof
3dca4c2f23 2023.3.2 (#89381) 2023-03-08 18:35:50 +01:00
Franck Nijhof
3f8f38f2df Bumped version to 2023.3.2 2023-03-08 16:24:08 +01:00
epenet
0844a0b269 Fix invalid state class in litterrobot (#89380) 2023-03-08 16:23:30 +01:00
Franck Nijhof
b65180d20a Improve Supervisor API handling (#89379) 2023-03-08 16:23:26 +01:00
starkillerOG
7f8a9697f0 Fix setting Reolink focus (#89374)
fix setting focus
2023-03-08 16:23:22 +01:00
J. Nick Koston
563bd4a0dd Fix bluetooth history and device expire running in the executor (#89342) 2023-03-08 16:23:18 +01:00
Florent Thoumie
29b5ef31c1 Recreate iaqualink httpx client upon service exception (#89341) 2023-03-08 16:23:13 +01:00
Renat Sibgatulin
863f8b727d Remove invalid device class in air-Q integration (#89329)
Remove device_class from sensors using inconsistent units
2023-03-08 16:23:09 +01:00
J. Nick Koston
83ed8cf689 Fix thread diagnostics loading blocking the event loop (#89307)
* Fix thread diagnostics loading blocking the event loop

* patch target
2023-03-08 16:23:06 +01:00
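Loading files synchronously inside a coroutine stalls the whole event loop; the standard Home Assistant remedy is to push the blocking call into the executor. A sketch of that pattern, with a hypothetical file-reading helper rather than the actual thread integration code:

```python
def _read_diagnostics_file(path: str) -> str:
    # Blocking disk I/O must not run on the event loop thread.
    with open(path, encoding="utf-8") as fp:
        return fp.read()

async def async_get_diagnostics(hass, path: str) -> str:
    # async_add_executor_job runs the blocking call in a thread pool
    # and awaits the result, keeping the loop responsive.
    return await hass.async_add_executor_job(_read_diagnostics_file, path)
```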
Tom Harris
52cd2f9429 Fix Insteon open issues with adding devices by address and missing events (#89305)
* Add missing events

* Bump dependencies

* Update for code review
2023-03-08 16:23:02 +01:00
puddly
74d3b2374b Clean ZHA radio path with trailing whitespace (#89299)
* Clean config flow entries with trailing whitespace

* Rewrite the config entry at runtime, without upgrading

* Skip intermediate `data = config_entry.data` variable

* Perform a deepcopy to ensure the config entry will actually be updated
2023-03-08 16:22:58 +01:00
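The deepcopy step matters because config entry updates are detected by comparing the new mapping against the old one; mutating the existing nested dict in place would make the update look like a no-op. A sketch of the pattern with a hypothetical helper (the `device`/`path` keys and `async_update_entry` callable are illustrative):

```python
import copy

def clean_radio_path(entry, async_update_entry) -> None:
    # Deepcopy first so the comparison against the old data sees a change.
    data = copy.deepcopy(dict(entry.data))
    data["device"]["path"] = data["device"]["path"].rstrip()
    async_update_entry(entry, data=data)
```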
epenet
f982af2412 Ignore DSL entities if SFR box is not adsl (#89291) 2023-03-08 16:22:53 +01:00
luar123
0b5ddd9cbf Bump python-snapcast to 2.3.2 (#89259) 2023-03-08 16:22:49 +01:00
J. Nick Koston
8d1aa0132e Make sql subqueries threadsafe (#89254)
* Make sql subqueries threadsafe

fixes #89224

* fix join outside of lambda

* move statement generation into a separate function to make it easier to test

* add cache key tests

* no need to mock hass
2023-03-08 16:22:45 +01:00
J. Nick Koston
d737b97c91 Bump sqlalchemy to 2.0.5post1 (#89253)
changelog: https://docs.sqlalchemy.org/en/20/changelog/changelog_20.html#change-2.0.5

mostly bugfixes for 2.x regressions
2023-03-08 16:22:41 +01:00
Marc Mueller
0fac12866d Fix conditional check (#89231) 2023-03-08 16:22:38 +01:00
Bram Kragten
e3fe71f76e Update frontend to 20230306.0 (#89227) 2023-03-08 16:22:34 +01:00
J. Nick Koston
eba1bfad51 Bump aioesphomeapi to 13.4.2 (#89210) 2023-03-08 16:22:30 +01:00
Franck Nijhof
1a0a385e03 Fix Tuya Python 3.11 compatibility issue (#89189) 2023-03-08 16:22:26 +01:00
MarkGodwin
c9999cd08c Fix host IP and scheme entry issues in TP-Link Omada (#89130)
Fixing host IP and scheme entry issues
2023-03-08 16:22:22 +01:00
rappenze
8252aeead2 Bump pyfibaro version to 0.6.9 (#89120) 2023-03-08 16:22:18 +01:00
J. Nick Koston
c27a69ef85 Handle InnoDB deadlocks during migration (#89073)
* Handle slow InnoDB rollback when encountering duplicates during migration

fixes #89069

* adjust

* fix mock

* tests

* return on success
2023-03-08 16:22:15 +01:00
J. Nick Koston
d4c28a1f4a Cache transient templates compiles provided via api (#89065)
* Cache transient templates compiles provided via api

partially fixes #89047 (there is more going on here)

* add a bit more coverage just to be sure

* switch method

* Revert "switch method"

This reverts commit 0e9e1c8cbe8753159f4fd6775cdc9cf217d66f0e.

* tweak

* hold hass

* empty for github flakey
2023-03-08 16:22:10 +01:00
Andrew Westrope
322eb4bd83 Check type key of zone exists in geniushub (#86798)
Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>
2023-03-08 16:22:05 +01:00
99 changed files with 1648 additions and 724 deletions

View File

@@ -1100,6 +1100,7 @@ build.json @home-assistant/supervisor
/homeassistant/components/smhi/ @gjohansson-ST
/tests/components/smhi/ @gjohansson-ST
/homeassistant/components/sms/ @ocalvo
/homeassistant/components/snapcast/ @luar123
/homeassistant/components/snooz/ @AustinBrunkhorst
/tests/components/snooz/ @AustinBrunkhorst
/homeassistant/components/solaredge/ @frenck

View File

@@ -68,7 +68,6 @@ SENSOR_TYPES: list[AirQEntityDescription] = [
AirQEntityDescription(
key="co",
name="CO",
device_class=SensorDeviceClass.CO,
native_unit_of_measurement=CONCENTRATION_MILLIGRAMS_PER_CUBIC_METER,
state_class=SensorStateClass.MEASUREMENT,
value=lambda data: data.get("co"),
@@ -289,7 +288,6 @@ SENSOR_TYPES: list[AirQEntityDescription] = [
AirQEntityDescription(
key="tvoc",
name="VOC",
device_class=SensorDeviceClass.VOLATILE_ORGANIC_COMPOUNDS,
native_unit_of_measurement=CONCENTRATION_PARTS_PER_BILLION,
state_class=SensorStateClass.MEASUREMENT,
value=lambda data: data.get("tvoc"),
@@ -297,7 +295,6 @@ SENSOR_TYPES: list[AirQEntityDescription] = [
AirQEntityDescription(
key="tvoc_ionsc",
name="VOC (Industrial)",
device_class=SensorDeviceClass.VOLATILE_ORGANIC_COMPOUNDS,
native_unit_of_measurement=CONCENTRATION_PARTS_PER_BILLION,
state_class=SensorStateClass.MEASUREMENT,
value=lambda data: data.get("tvoc_ionsc"),

View File

@@ -1,5 +1,6 @@
"""Rest API for Home Assistant."""
import asyncio
from functools import lru_cache
from http import HTTPStatus
import logging
@@ -350,6 +351,12 @@ class APIComponentsView(HomeAssistantView):
return self.json(request.app["hass"].config.components)
@lru_cache
def _cached_template(template_str: str, hass: ha.HomeAssistant) -> template.Template:
"""Return a cached template."""
return template.Template(template_str, hass)
class APITemplateView(HomeAssistantView):
"""View to handle Template requests."""
@@ -362,7 +369,7 @@ class APITemplateView(HomeAssistantView):
raise Unauthorized()
try:
data = await request.json()
tpl = template.Template(data["template"], request.app["hass"])
tpl = _cached_template(data["template"], request.app["hass"])
return tpl.async_render(variables=data.get("variables"), parse_result=False)
except (ValueError, TemplateError) as ex:
return self.json_message(
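The fix wraps template construction in `functools.lru_cache`, so repeated API calls with the same template string reuse one compiled object instead of re-parsing on every request. A self-contained sketch of the idea, with `compile()` standing in for `template.Template`:

```python
from functools import lru_cache

@lru_cache
def _cached_compile(template_str: str):
    # Stand-in for template.Template(template_str, hass): the expensive
    # parse happens only once per distinct template string.
    print(f"compiling {template_str!r}")
    return compile(template_str, "<template>", "eval")

_cached_compile("1 + 1")  # compiles
_cached_compile("1 + 1")  # cache hit, no recompile
```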

View File

@@ -180,7 +180,7 @@ class ArestData:
self._resource = resource
self._pin = pin
self.data = {}
self._attr_available = True
self.available = True
@Throttle(MIN_TIME_BETWEEN_UPDATES)
def update(self):
@@ -201,7 +201,7 @@ class ArestData:
f"{self._resource}/digital/{self._pin}", timeout=10
)
self.data = {"value": response.json()["return_value"]}
self._attr_available = True
self.available = True
except requests.exceptions.ConnectionError:
_LOGGER.error("No route to device %s", self._resource)
self._attr_available = False
self.available = False

View File

@@ -28,5 +28,5 @@
"documentation": "https://www.home-assistant.io/integrations/august",
"iot_class": "cloud_push",
"loggers": ["pubnub", "yalexs"],
"requirements": ["yalexs==1.2.7", "yalexs_ble==2.0.4"]
"requirements": ["yalexs==1.2.7", "yalexs-ble==2.1.1"]
}

View File

@@ -60,7 +60,7 @@ from .const import (
DEFAULT_PROBABILITY_THRESHOLD,
)
from .helpers import Observation
from .repairs import raise_mirrored_entries, raise_no_prob_given_false
from .issues import raise_mirrored_entries, raise_no_prob_given_false
_LOGGER = logging.getLogger(__name__)

View File

@@ -1,4 +1,4 @@
"""Helpers for generating repairs."""
"""Helpers for generating issues."""
from __future__ import annotations
from homeassistant.core import HomeAssistant

View File

@@ -227,20 +227,21 @@ class BaseHaRemoteScanner(BaseHaScanner):
self.hass, self._async_expire_devices, timedelta(seconds=30)
)
cancel_stop = self.hass.bus.async_listen(
EVENT_HOMEASSISTANT_STOP, self._save_history
EVENT_HOMEASSISTANT_STOP, self._async_save_history
)
self._async_setup_scanner_watchdog()
@hass_callback
def _cancel() -> None:
self._save_history()
self._async_save_history()
self._async_stop_scanner_watchdog()
cancel_track()
cancel_stop()
return _cancel
def _save_history(self, event: Event | None = None) -> None:
@hass_callback
def _async_save_history(self, event: Event | None = None) -> None:
"""Save the history."""
self._storage.async_set_advertisement_history(
self.source,
@@ -252,6 +253,7 @@ class BaseHaRemoteScanner(BaseHaScanner):
),
)
@hass_callback
def _async_expire_devices(self, _datetime: datetime.datetime) -> None:
"""Expire old devices."""
now = MONOTONIC_TIME()

View File

@@ -7,6 +7,6 @@
"iot_class": "local_push",
"loggers": ["bond_async"],
"quality_scale": "platinum",
"requirements": ["bond-async==0.1.22"],
"requirements": ["bond-async==0.1.23"],
"zeroconf": ["_bond._tcp.local."]
}

View File

@@ -8,7 +8,7 @@
"iot_class": "local_push",
"loggers": ["pydeconz"],
"quality_scale": "platinum",
"requirements": ["pydeconz==108"],
"requirements": ["pydeconz==110"],
"ssdp": [
{
"manufacturer": "Royal Philips Electronics",

View File

@@ -45,9 +45,10 @@ BINARY_SENSOR_DESCRIPTIONS = (
),
DormakabaDkeyBinarySensorDescription(
key="security_locked",
name="Dead bolt",
name="Deadbolt",
device_class=BinarySensorDeviceClass.LOCK,
is_on=lambda state: state.unlock_status != UnlockStatus.SECURITY_LOCKED,
is_on=lambda state: state.unlock_status
not in (UnlockStatus.SECURITY_LOCKED, UnlockStatus.UNLOCKED_SECURITY_LOCKED),
),
)

View File

@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/easyenergy",
"iot_class": "cloud_polling",
"quality_scale": "platinum",
"requirements": ["easyenergy==0.1.2"]
"requirements": ["easyenergy==0.2.2"]
}

View File

@@ -130,10 +130,15 @@ class RuntimeEntryData:
)
self.ble_connections_free = free
self.ble_connections_limit = limit
if free:
for fut in self._ble_connection_free_futures:
if not free:
return
for fut in self._ble_connection_free_futures:
# If wait_for_ble_connections_free gets cancelled, it will
# leave a future in the list. We need to check if it's done
# before setting the result.
if not fut.done():
fut.set_result(free)
self._ble_connection_free_futures.clear()
self._ble_connection_free_futures.clear()
async def wait_for_ble_connections_free(self) -> int:
"""Wait until there are free BLE connections."""

View File

@@ -14,6 +14,6 @@
"integration_type": "device",
"iot_class": "local_push",
"loggers": ["aioesphomeapi", "noiseprotocol"],
"requirements": ["aioesphomeapi==13.4.1", "esphome-dashboard-api==1.2.3"],
"requirements": ["aioesphomeapi==13.5.1", "esphome-dashboard-api==1.2.3"],
"zeroconf": ["_esphomelib._tcp.local."]
}

View File

@@ -94,9 +94,9 @@ class FibaroCover(FibaroDevice, CoverEntity):
"""Return if the cover is closed."""
if self._is_open_close_only():
state = self.fibaro_device.state
if not state.has_value or state.str_value.lower() == "unknown":
if not state.has_value or state.str_value().lower() == "unknown":
return None
return state.str_value.lower() == "closed"
return state.str_value().lower() == "closed"
if self.current_cover_position is None:
return None

View File

@@ -7,5 +7,5 @@
"integration_type": "hub",
"iot_class": "local_push",
"loggers": ["pyfibaro"],
"requirements": ["pyfibaro==0.6.8"]
"requirements": ["pyfibaro==0.6.9"]
}

View File

@@ -77,7 +77,6 @@ class FreeboxFlowHandler(config_entries.ConfigFlow, domain=DOMAIN):
# Check permissions
await fbx.system.get_config()
await fbx.lan.get_hosts_list()
await self.hass.async_block_till_done()
# Close connection
await fbx.close()

View File

@@ -20,5 +20,5 @@
"documentation": "https://www.home-assistant.io/integrations/frontend",
"integration_type": "system",
"quality_scale": "internal",
"requirements": ["home-assistant-frontend==20230302.0"]
"requirements": ["home-assistant-frontend==20230309.1"]
}

View File

@@ -41,7 +41,7 @@ async def async_setup_platform(
[
GeniusClimateZone(broker, z)
for z in broker.client.zone_objs
if z.data["type"] in GH_ZONES
if z.data.get("type") in GH_ZONES
]
)

View File

@@ -42,7 +42,7 @@ async def async_setup_platform(
[
GeniusSwitch(broker, z)
for z in broker.client.zone_objs
if z.data["type"] == GH_ON_OFF_ZONE
if z.data.get("type") == GH_ON_OFF_ZONE
]
)

View File

@@ -48,7 +48,7 @@ async def async_setup_platform(
[
GeniusWaterHeater(broker, z)
for z in broker.client.zone_objs
if z.data["type"] in GH_HEATERS
if z.data.get("type") in GH_HEATERS
]
)

View File

@@ -342,12 +342,14 @@ def get_next_departure(
origin_stop_time.departure_time
LIMIT :limit
"""
result = schedule.engine.execute(
result = schedule.engine.connect().execute(
text(sql_query),
origin_station_id=start_station_id,
end_station_id=end_station_id,
today=now_date,
limit=limit,
{
"origin_station_id": start_station_id,
"end_station_id": end_station_id,
"today": now_date,
"limit": limit,
},
)
# Create lookup timetable for today and possibly tomorrow, taking into
@@ -357,7 +359,8 @@ def get_next_departure(
yesterday_start = today_start = tomorrow_start = None
yesterday_last = today_last = ""
for row in result:
for row_cursor in result:
row = row_cursor._asdict()
if row["yesterday"] == 1 and yesterday_date >= row["start_date"]:
extras = {"day": "yesterday", "first": None, "last": False}
if yesterday_start is None:
@@ -800,7 +803,10 @@ class GTFSDepartureSensor(SensorEntity):
@staticmethod
def dict_for_table(resource: Any) -> dict:
"""Return a dictionary for the SQLAlchemy resource given."""
return {col: getattr(resource, col) for col in resource.__table__.columns}
_dict = {}
for column in resource.__table__.columns:
_dict[column.name] = str(getattr(resource, column.name))
return _dict
def append_keys(self, resource: dict, prefix: str | None = None) -> None:
"""Properly format key val pairs to append to attributes."""

View File

@@ -13,7 +13,7 @@
"documentation": "https://www.home-assistant.io/integrations/harmony",
"iot_class": "local_push",
"loggers": ["aioharmony", "slixmpp"],
"requirements": ["aioharmony==0.2.9"],
"requirements": ["aioharmony==0.2.10"],
"ssdp": [
{
"manufacturer": "Logitech",

View File

@@ -96,7 +96,7 @@ from .handler import ( # noqa: F401
)
from .http import HassIOView
from .ingress import async_setup_ingress_view
from .repairs import SupervisorRepairs
from .issues import SupervisorIssues
from .websocket_api import async_load_websocket_api
_LOGGER = logging.getLogger(__name__)
@@ -123,7 +123,7 @@ DATA_SUPERVISOR_INFO = "hassio_supervisor_info"
DATA_ADDONS_CHANGELOGS = "hassio_addons_changelogs"
DATA_ADDONS_INFO = "hassio_addons_info"
DATA_ADDONS_STATS = "hassio_addons_stats"
DATA_SUPERVISOR_REPAIRS = "supervisor_repairs"
DATA_SUPERVISOR_ISSUES = "supervisor_issues"
HASSIO_UPDATE_INTERVAL = timedelta(minutes=5)
ADDONS_COORDINATOR = "hassio_addons_coordinator"
@@ -581,9 +581,9 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool: # noqa:
hass.config_entries.flow.async_init(DOMAIN, context={"source": "system"})
)
# Start listening for problems with supervisor and making repairs
hass.data[DATA_SUPERVISOR_REPAIRS] = repairs = SupervisorRepairs(hass, hassio)
await repairs.setup()
# Start listening for problems with supervisor and making issues
hass.data[DATA_SUPERVISOR_ISSUES] = issues = SupervisorIssues(hass, hassio)
await issues.setup()
return True

View File

@@ -36,6 +36,7 @@ X_AUTH_TOKEN = "X-Supervisor-Token"
X_INGRESS_PATH = "X-Ingress-Path"
X_HASS_USER_ID = "X-Hass-User-ID"
X_HASS_IS_ADMIN = "X-Hass-Is-Admin"
X_HASS_SOURCE = "X-Hass-Source"
WS_TYPE = "type"
WS_ID = "id"

View File

@@ -17,7 +17,7 @@ from homeassistant.const import SERVER_PORT
from homeassistant.core import HomeAssistant
from homeassistant.loader import bind_hass
from .const import ATTR_DISCOVERY, DOMAIN
from .const import ATTR_DISCOVERY, DOMAIN, X_HASS_SOURCE
_LOGGER = logging.getLogger(__name__)
@@ -445,6 +445,8 @@ class HassIO:
payload=None,
timeout=10,
return_text=False,
*,
source="core.handler",
):
"""Send API command to Hass.io.
@@ -458,7 +460,8 @@ class HassIO:
headers={
aiohttp.hdrs.AUTHORIZATION: (
f"Bearer {os.environ.get('SUPERVISOR_TOKEN', '')}"
)
),
X_HASS_SOURCE: source,
},
timeout=aiohttp.ClientTimeout(total=timeout),
)

View File

@@ -6,6 +6,7 @@ from http import HTTPStatus
import logging
import os
import re
from urllib.parse import quote, unquote
import aiohttp
from aiohttp import web
@@ -19,13 +20,16 @@ from aiohttp.hdrs import (
TRANSFER_ENCODING,
)
from aiohttp.web_exceptions import HTTPBadGateway
from multidict import istr
from homeassistant.components.http import KEY_AUTHENTICATED, HomeAssistantView
from homeassistant.components.http import (
KEY_AUTHENTICATED,
KEY_HASS_USER,
HomeAssistantView,
)
from homeassistant.components.onboarding import async_is_onboarded
from homeassistant.core import HomeAssistant
from .const import X_HASS_IS_ADMIN, X_HASS_USER_ID
from .const import X_HASS_SOURCE
_LOGGER = logging.getLogger(__name__)
@@ -34,23 +38,53 @@ MAX_UPLOAD_SIZE = 1024 * 1024 * 1024
# pylint: disable=implicit-str-concat
NO_TIMEOUT = re.compile(
r"^(?:"
r"|homeassistant/update"
r"|hassos/update"
r"|hassos/update/cli"
r"|supervisor/update"
r"|addons/[^/]+/(?:update|install|rebuild)"
r"|backups/.+/full"
r"|backups/.+/partial"
r"|backups/[^/]+/(?:upload|download)"
r")$"
)
NO_AUTH_ONBOARDING = re.compile(r"^(?:" r"|supervisor/logs" r"|backups/[^/]+/.+" r")$")
# fmt: off
# Onboarding can upload backups and restore it
PATHS_NOT_ONBOARDED = re.compile(
r"^(?:"
r"|backups/[a-f0-9]{8}(/info|/new/upload|/download|/restore/full|/restore/partial)?"
r"|backups/new/upload"
r")$"
)
NO_AUTH = re.compile(r"^(?:" r"|app/.*" r"|[store\/]*addons/[^/]+/(logo|icon)" r")$")
# Authenticated users manage backups + download logs, changelog and documentation
PATHS_ADMIN = re.compile(
r"^(?:"
r"|backups/[a-f0-9]{8}(/info|/download|/restore/full|/restore/partial)?"
r"|backups/new/upload"
r"|audio/logs"
r"|cli/logs"
r"|core/logs"
r"|dns/logs"
r"|host/logs"
r"|multicast/logs"
r"|observer/logs"
r"|supervisor/logs"
r"|addons/[^/]+/(changelog|documentation|logs)"
r")$"
)
NO_STORE = re.compile(r"^(?:" r"|app/entrypoint.js" r")$")
# Unauthenticated requests come in for Supervisor panel + add-on images
PATHS_NO_AUTH = re.compile(
r"^(?:"
r"|app/.*"
r"|(store/)?addons/[^/]+/(logo|icon)"
r")$"
)
NO_STORE = re.compile(
r"^(?:"
r"|app/entrypoint.js"
r")$"
)
# pylint: enable=implicit-str-concat
# fmt: on
class HassIOView(HomeAssistantView):
@@ -65,38 +99,66 @@ class HassIOView(HomeAssistantView):
self._host = host
self._websession = websession
async def _handle(
self, request: web.Request, path: str
) -> web.Response | web.StreamResponse:
"""Route data to Hass.io."""
hass = request.app["hass"]
if _need_auth(hass, path) and not request[KEY_AUTHENTICATED]:
return web.Response(status=HTTPStatus.UNAUTHORIZED)
return await self._command_proxy(path, request)
delete = _handle
get = _handle
post = _handle
async def _command_proxy(
self, path: str, request: web.Request
) -> web.StreamResponse:
async def _handle(self, request: web.Request, path: str) -> web.StreamResponse:
"""Return a client request with proxy origin for Hass.io supervisor.
This method is a coroutine.
Use cases:
- Onboarding allows restoring backups
- Load Supervisor panel and add-on logo unauthenticated
- User upload/restore backups
"""
headers = _init_header(request)
if path == "backups/new/upload":
# We need to reuse the full content type that includes the boundary
headers[
CONTENT_TYPE
] = request._stored_content_type # pylint: disable=protected-access
# No bullshit
if path != unquote(path):
return web.Response(status=HTTPStatus.BAD_REQUEST)
hass: HomeAssistant = request.app["hass"]
is_admin = request[KEY_AUTHENTICATED] and request[KEY_HASS_USER].is_admin
authorized = is_admin
if is_admin:
allowed_paths = PATHS_ADMIN
elif not async_is_onboarded(hass):
allowed_paths = PATHS_NOT_ONBOARDED
# During onboarding we need the user to manage backups
authorized = True
else:
# Either unauthenticated or not an admin
allowed_paths = PATHS_NO_AUTH
no_auth_path = PATHS_NO_AUTH.match(path)
headers = {
X_HASS_SOURCE: "core.http",
}
if no_auth_path:
if request.method != "GET":
return web.Response(status=HTTPStatus.METHOD_NOT_ALLOWED)
else:
if not allowed_paths.match(path):
return web.Response(status=HTTPStatus.UNAUTHORIZED)
if authorized:
headers[
AUTHORIZATION
] = f"Bearer {os.environ.get('SUPERVISOR_TOKEN', '')}"
if request.method == "POST":
headers[CONTENT_TYPE] = request.content_type
# _stored_content_type is only computed once `content_type` is accessed
if path == "backups/new/upload":
# We need to reuse the full content type that includes the boundary
headers[
CONTENT_TYPE
] = request._stored_content_type # pylint: disable=protected-access
try:
client = await self._websession.request(
method=request.method,
url=f"http://{self._host}/{path}",
url=f"http://{self._host}/{quote(path)}",
params=request.query,
data=request.content,
headers=headers,
@@ -123,20 +185,8 @@ class HassIOView(HomeAssistantView):
raise HTTPBadGateway()
def _init_header(request: web.Request) -> dict[istr, str]:
"""Create initial header."""
headers = {
AUTHORIZATION: f"Bearer {os.environ.get('SUPERVISOR_TOKEN', '')}",
CONTENT_TYPE: request.content_type,
}
# Add user data
if request.get("hass_user") is not None:
headers[istr(X_HASS_USER_ID)] = request["hass_user"].id
headers[istr(X_HASS_IS_ADMIN)] = str(int(request["hass_user"].is_admin))
return headers
get = _handle
post = _handle
def _response_header(response: aiohttp.ClientResponse, path: str) -> dict[str, str]:
@@ -164,12 +214,3 @@ def _get_timeout(path: str) -> ClientTimeout:
if NO_TIMEOUT.match(path):
return ClientTimeout(connect=10, total=None)
return ClientTimeout(connect=10, total=300)
def _need_auth(hass: HomeAssistant, path: str) -> bool:
"""Return if a path need authentication."""
if not async_is_onboarded(hass) and NO_AUTH_ONBOARDING.match(path):
return False
if NO_AUTH.match(path):
return False
return True

View File

@@ -3,20 +3,22 @@ from __future__ import annotations
import asyncio
from collections.abc import Iterable
from functools import lru_cache
from ipaddress import ip_address
import logging
import os
from urllib.parse import quote
import aiohttp
from aiohttp import ClientTimeout, hdrs, web
from aiohttp.web_exceptions import HTTPBadGateway, HTTPBadRequest
from multidict import CIMultiDict
from yarl import URL
from homeassistant.components.http import HomeAssistantView
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .const import X_AUTH_TOKEN, X_INGRESS_PATH
from .const import X_HASS_SOURCE, X_INGRESS_PATH
_LOGGER = logging.getLogger(__name__)
@@ -42,9 +44,19 @@ class HassIOIngress(HomeAssistantView):
self._host = host
self._websession = websession
@lru_cache
def _create_url(self, token: str, path: str) -> str:
"""Create URL to service."""
return f"http://{self._host}/ingress/{token}/{path}"
base_path = f"/ingress/{token}/"
url = f"http://{self._host}{base_path}{quote(path)}"
try:
if not URL(url).path.startswith(base_path):
raise HTTPBadRequest()
except ValueError as err:
raise HTTPBadRequest() from err
return url
async def _handle(
self, request: web.Request, token: str, path: str
@@ -185,10 +197,8 @@ def _init_header(request: web.Request, token: str) -> CIMultiDict | dict[str, st
continue
headers[name] = value
# Inject token / cleanup later on Supervisor
headers[X_AUTH_TOKEN] = os.environ.get("SUPERVISOR_TOKEN", "")
# Ingress information
headers[X_HASS_SOURCE] = "core.ingress"
headers[X_INGRESS_PATH] = f"/api/hassio_ingress/{token}"
# Set X-Forwarded-For
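The `_create_url` change above is a path-traversal guard: the supplied path is percent-quoted, and the result is re-parsed to confirm it still resolves under the ingress base path (yarl normalizes `..` segments on parse). A standalone sketch of that check, with illustrative host/token values:

```python
from urllib.parse import quote

from yarl import URL

def create_ingress_url(host: str, token: str, path: str) -> str:
    base_path = f"/ingress/{token}/"
    url = f"http://{host}{base_path}{quote(path)}"
    # Re-parse and verify the normalized path stays under the base,
    # rejecting ".."-style traversal out of the ingress prefix.
    if not URL(url).path.startswith(base_path):
        raise ValueError("path escapes ingress base")
    return url

print(create_ingress_url("supervisor", "tok", "app/index.html"))
```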

View File

@@ -70,11 +70,11 @@ UNHEALTHY_REASONS = {
}
class SupervisorRepairs:
"""Create repairs from supervisor events."""
class SupervisorIssues:
"""Create issues from supervisor events."""
def __init__(self, hass: HomeAssistant, client: HassIO) -> None:
"""Initialize supervisor repairs."""
"""Initialize supervisor issues."""
self._hass = hass
self._client = client
self._unsupported_reasons: set[str] = set()
@@ -87,7 +87,7 @@ class SupervisorRepairs:
@unhealthy_reasons.setter
def unhealthy_reasons(self, reasons: set[str]) -> None:
"""Set unhealthy reasons. Create or delete repairs as necessary."""
"""Set unhealthy reasons. Create or delete issues as necessary."""
for unhealthy in reasons - self.unhealthy_reasons:
if unhealthy in UNHEALTHY_REASONS:
translation_key = f"unhealthy_{unhealthy}"
@@ -119,7 +119,7 @@ class SupervisorRepairs:
@unsupported_reasons.setter
def unsupported_reasons(self, reasons: set[str]) -> None:
"""Set unsupported reasons. Create or delete repairs as necessary."""
"""Set unsupported reasons. Create or delete issues as necessary."""
for unsupported in reasons - UNSUPPORTED_SKIP_REPAIR - self.unsupported_reasons:
if unsupported in UNSUPPORTED_REASONS:
translation_key = f"unsupported_{unsupported}"
@@ -149,18 +149,18 @@ class SupervisorRepairs:
await self.update()
async_dispatcher_connect(
self._hass, EVENT_SUPERVISOR_EVENT, self._supervisor_events_to_repairs
self._hass, EVENT_SUPERVISOR_EVENT, self._supervisor_events_to_issues
)
async def update(self) -> None:
"""Update repairs from Supervisor resolution center."""
"""Update issuess from Supervisor resolution center."""
data = await self._client.get_resolution_info()
self.unhealthy_reasons = set(data[ATTR_UNHEALTHY])
self.unsupported_reasons = set(data[ATTR_UNSUPPORTED])
@callback
def _supervisor_events_to_repairs(self, event: dict[str, Any]) -> None:
"""Create repairs from supervisor events."""
def _supervisor_events_to_issues(self, event: dict[str, Any]) -> None:
"""Create issues from supervisor events."""
if ATTR_WS_EVENT not in event:
return

View File

@@ -116,6 +116,7 @@ async def websocket_supervisor_api(
method=msg[ATTR_METHOD],
timeout=msg.get(ATTR_TIMEOUT, 10),
payload=msg.get(ATTR_DATA, {}),
source="core.websocket_api",
)
if result.get(ATTR_RESULT) == "error":

View File

@@ -60,9 +60,7 @@ def async_sign_path(
url = URL(path)
now = dt_util.utcnow()
params = dict(sorted(url.query.items()))
for param in SAFE_QUERY_PARAMS:
params.pop(param, None)
params = [itm for itm in url.query.items() if itm[0] not in SAFE_QUERY_PARAMS]
encoded = jwt.encode(
{
"iss": refresh_token_id,
@@ -75,7 +73,7 @@ def async_sign_path(
algorithm="HS256",
)
params[SIGN_QUERY_PARAM] = encoded
params.append((SIGN_QUERY_PARAM, encoded))
url = url.with_query(params)
return f"{url.path}?{url.query_string}"
@@ -184,10 +182,11 @@ async def async_setup_auth(hass: HomeAssistant, app: Application) -> None:
if claims["path"] != request.path:
return False
params = dict(sorted(request.query.items()))
del params[SIGN_QUERY_PARAM]
for param in SAFE_QUERY_PARAMS:
params.pop(param, None)
params = [
list(itm) # claims stores tuples as lists
for itm in request.query.items()
if itm[0] not in SAFE_QUERY_PARAMS and itm[0] != SIGN_QUERY_PARAM
]
if claims["params"] != params:
return False
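The reason ordering matters: the signature in the signed path covers the serialized parameter sequence, so signer and verifier must agree on one canonical order. A toy HMAC sketch (not the actual JWT scheme above) makes the order-dependence visible:

```python
import hashlib
import hmac

SECRET = b"example-secret"  # hypothetical key, for illustration only

def sign(params: list[tuple[str, str]]) -> str:
    # The digest covers the exact parameter sequence, so verification
    # only succeeds when both sides serialize params in the same order.
    payload = "&".join(f"{k}={v}" for k, v in params)
    return hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()

assert sign([("a", "1"), ("b", "2")]) != sign([("b", "2"), ("a", "1")])
```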

View File

@@ -153,6 +153,7 @@ async def async_setup_entry( # noqa: C901
system.serial,
svc_exception,
)
await system.aqualink.close()
else:
cur = system.online
if cur and not prev:

View File

@@ -3,6 +3,7 @@ from __future__ import annotations
from collections.abc import Awaitable
import httpx
from iaqualink.exception import AqualinkServiceException
from homeassistant.exceptions import HomeAssistantError
@@ -12,5 +13,5 @@ async def await_or_reraise(awaitable: Awaitable) -> None:
"""Execute API call while catching service exceptions."""
try:
await awaitable
except AqualinkServiceException as svc_exception:
except (AqualinkServiceException, httpx.HTTPError) as svc_exception:
raise HomeAssistantError(f"Aqualink error: {svc_exception}") from svc_exception

View File

@@ -77,7 +77,9 @@ class ImapDataUpdateCoordinator(DataUpdateCoordinator[int]):
f"Invalid response for search '{self.config_entry.data[CONF_SEARCH]}': {result} / {lines[0]}"
)
if self.support_push:
self.hass.async_create_task(self.async_wait_server_push())
self.hass.async_create_background_task(
self.async_wait_server_push(), "Wait for IMAP data push"
)
return len(lines[0].split())
async def async_wait_server_push(self) -> None:
@@ -100,5 +102,7 @@ class ImapDataUpdateCoordinator(DataUpdateCoordinator[int]):
async def shutdown(self, *_) -> None:
"""Close resources."""
if self.imap_client:
if self.imap_client.has_pending_idle():
self.imap_client.idle_done()
await self.imap_client.stop_wait_server_push()
await self.imap_client.logout()

View File

@@ -95,9 +95,25 @@ class EmailReader:
self._folder = folder
self._verify_ssl = verify_ssl
self._last_id = None
self._last_message = None
self._unread_ids = deque([])
self.connection = None
@property
def last_id(self) -> int | None:
"""Return last email uid that was processed."""
return self._last_id
@property
def last_unread_id(self) -> int | None:
"""Return last email uid received."""
# We assume the last id in the list is the last unread id
# We cannot know if that is the newest one, because it could arrive later
# https://stackoverflow.com/questions/12409862/python-imap-the-order-of-uids
if self._unread_ids:
return int(self._unread_ids[-1])
return self._last_id
def connect(self):
"""Login and setup the connection."""
ssl_context = client_context() if self._verify_ssl else None
@@ -128,21 +144,21 @@ class EmailReader:
try:
self.connection.select(self._folder, readonly=True)
if not self._unread_ids:
search = f"SINCE {datetime.date.today():%d-%b-%Y}"
if self._last_id is not None:
search = f"UID {self._last_id}:*"
_, data = self.connection.uid("search", None, search)
self._unread_ids = deque(data[0].split())
if self._last_id is None:
# search for today and yesterday
time_from = datetime.datetime.now() - datetime.timedelta(days=1)
search = f"SINCE {time_from:%d-%b-%Y}"
else:
search = f"UID {self._last_id}:*"
_, data = self.connection.uid("search", None, search)
self._unread_ids = deque(data[0].split())
while self._unread_ids:
message_uid = self._unread_ids.popleft()
if self._last_id is None or int(message_uid) > self._last_id:
self._last_id = int(message_uid)
return self._fetch_message(message_uid)
return self._fetch_message(str(self._last_id))
self._last_message = self._fetch_message(message_uid)
return self._last_message
except imaplib.IMAP4.error:
_LOGGER.info("Connection to %s lost, attempting to reconnect", self._server)
@@ -254,22 +270,30 @@ class EmailContentSensor(SensorEntity):
def update(self) -> None:
"""Read emails and publish state change."""
email_message = self._email_reader.read_next()
while (
self._last_id is None or self._last_id != self._email_reader.last_unread_id
):
if email_message is None:
self._message = None
self._state_attributes = {}
return
if email_message is None:
self._message = None
self._state_attributes = {}
return
self._last_id = self._email_reader.last_id
if self.sender_allowed(email_message):
message = EmailContentSensor.get_msg_subject(email_message)
if self.sender_allowed(email_message):
message = EmailContentSensor.get_msg_subject(email_message)
if self._value_template is not None:
message = self.render_template(email_message)
if self._value_template is not None:
message = self.render_template(email_message)
self._message = message
self._state_attributes = {
ATTR_FROM: EmailContentSensor.get_msg_sender(email_message),
ATTR_SUBJECT: EmailContentSensor.get_msg_subject(email_message),
ATTR_DATE: email_message["Date"],
ATTR_BODY: EmailContentSensor.get_msg_text(email_message),
}
self._message = message
self._state_attributes = {
ATTR_FROM: EmailContentSensor.get_msg_sender(email_message),
ATTR_SUBJECT: EmailContentSensor.get_msg_subject(email_message),
ATTR_DATE: email_message["Date"],
ATTR_BODY: EmailContentSensor.get_msg_text(email_message),
}
if self._last_id == self._email_reader.last_unread_id:
break
email_message = self._email_reader.read_next()

View File

@@ -17,8 +17,8 @@
"iot_class": "local_push",
"loggers": ["pyinsteon", "pypubsub"],
"requirements": [
"pyinsteon==1.3.3",
"insteon-frontend-home-assistant==0.3.2"
"pyinsteon==1.3.4",
"insteon-frontend-home-assistant==0.3.3"
],
"usb": [
{

View File

@@ -1,11 +1,13 @@
"""Utilities used by insteon component."""
import asyncio
from collections.abc import Callable
import logging
from pyinsteon import devices
from pyinsteon.address import Address
from pyinsteon.constants import ALDBStatus, DeviceAction
from pyinsteon.events import OFF_EVENT, OFF_FAST_EVENT, ON_EVENT, ON_FAST_EVENT
from pyinsteon.device_types.device_base import Device
from pyinsteon.events import OFF_EVENT, OFF_FAST_EVENT, ON_EVENT, ON_FAST_EVENT, Event
from pyinsteon.managers.link_manager import (
async_enter_linking_mode,
async_enter_unlinking_mode,
@@ -27,7 +29,7 @@ from homeassistant.const import (
CONF_PLATFORM,
ENTITY_MATCH_ALL,
)
from homeassistant.core import ServiceCall, callback
from homeassistant.core import HomeAssistant, ServiceCall, callback
from homeassistant.helpers import device_registry as dr
from homeassistant.helpers.dispatcher import (
async_dispatcher_connect,
@@ -89,49 +91,52 @@ from .schemas import (
_LOGGER = logging.getLogger(__name__)
def add_on_off_event_device(hass, device):
def _register_event(event: Event, listener: Callable) -> None:
"""Register the events raised by a device."""
_LOGGER.debug(
"Registering on/off event for %s %d %s",
str(event.address),
event.group,
event.name,
)
event.subscribe(listener, force_strong_ref=True)
def add_on_off_event_device(hass: HomeAssistant, device: Device) -> None:
"""Register an Insteon device as an on/off event device."""
@callback
def async_fire_group_on_off_event(name, address, group, button):
def async_fire_group_on_off_event(
name: str, address: Address, group: int, button: str
):
# Firing an event when a button is pressed.
if button and button[-2] == "_":
button_id = button[-1].lower()
else:
button_id = None
schema = {CONF_ADDRESS: address}
schema = {CONF_ADDRESS: address, "group": group}
if button_id:
schema[EVENT_CONF_BUTTON] = button_id
if name == ON_EVENT:
event = EVENT_GROUP_ON
if name == OFF_EVENT:
elif name == OFF_EVENT:
event = EVENT_GROUP_OFF
if name == ON_FAST_EVENT:
elif name == ON_FAST_EVENT:
event = EVENT_GROUP_ON_FAST
if name == OFF_FAST_EVENT:
elif name == OFF_FAST_EVENT:
event = EVENT_GROUP_OFF_FAST
else:
event = f"insteon.{name}"
_LOGGER.debug("Firing event %s with %s", event, schema)
hass.bus.async_fire(event, schema)
for group in device.events:
if isinstance(group, int):
for event in device.events[group]:
if event in [
OFF_EVENT,
ON_EVENT,
OFF_FAST_EVENT,
ON_FAST_EVENT,
]:
_LOGGER.debug(
"Registering on/off event for %s %d %s",
str(device.address),
group,
event,
)
device.events[group][event].subscribe(
async_fire_group_on_off_event, force_strong_ref=True
)
for name_or_group, event in device.events.items():
if isinstance(name_or_group, int):
for _, event in device.events[name_or_group].items():
_register_event(event, async_fire_group_on_off_event)
else:
_register_event(event, async_fire_group_on_off_event)
def register_new_device_callback(hass):

View File

@@ -20,10 +20,10 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
entry_data[CONF_CLIENT_DEVICE_ID] = entry.entry_id
hass.config_entries.async_update_entry(entry, data=entry_data)
client = create_client(
device_id=entry.data[CONF_CLIENT_DEVICE_ID],
device_name=hass.config.location_name,
)
device_id = entry.data[CONF_CLIENT_DEVICE_ID]
device_name = ascii(hass.config.location_name)
client = create_client(device_id=device_id, device_name=device_name)
try:
user_id, connect_result = await validate_input(hass, dict(entry.data), client)

View File

@@ -84,7 +84,7 @@ def ensure_zone(value):
if value is None:
raise vol.Invalid("zone value is None")
if str(value) not in ZONES is None:
if str(value) not in ZONES:
raise vol.Invalid("zone not valid")
return str(value)

View File

@@ -140,7 +140,7 @@ ROBOT_SENSOR_MAP: dict[type[Robot], list[RobotSensorEntityDescription]] = {
name="Pet weight",
native_unit_of_measurement=UnitOfMass.POUNDS,
device_class=SensorDeviceClass.WEIGHT,
state_class=SensorStateClass.TOTAL,
state_class=SensorStateClass.MEASUREMENT,
),
],
FeederRobot: [

View File

@@ -7,5 +7,5 @@
"iot_class": "cloud_polling",
"loggers": ["pymazda"],
"quality_scale": "platinum",
"requirements": ["pymazda==0.3.7"]
"requirements": ["pymazda==0.3.8"]
}

View File

@@ -4,7 +4,7 @@ from __future__ import annotations
import asyncio
from collections.abc import Callable, Coroutine
from contextlib import suppress
from functools import wraps
from functools import lru_cache, wraps
from http import HTTPStatus
import logging
import secrets
@@ -365,6 +365,12 @@ async def webhook_stream_camera(
return webhook_response(resp, registration=config_entry.data)
@lru_cache
def _cached_template(template_str: str, hass: HomeAssistant) -> template.Template:
"""Return a cached template."""
return template.Template(template_str, hass)
@WEBHOOK_COMMANDS.register("render_template")
@validate_schema(
{
@@ -381,7 +387,7 @@ async def webhook_render_template(
resp = {}
for key, item in data.items():
try:
tpl = template.Template(item[ATTR_TEMPLATE], hass)
tpl = _cached_template(item[ATTR_TEMPLATE], hass)
resp[key] = tpl.async_render(item.get(ATTR_TEMPLATE_VARIABLES))
except TemplateError as ex:
resp[key] = {"error": str(ex)}

View File

@@ -16,7 +16,7 @@ from pymodbus.client import (
from pymodbus.constants import Defaults
from pymodbus.exceptions import ModbusException
from pymodbus.pdu import ModbusResponse
from pymodbus.transaction import ModbusRtuFramer
from pymodbus.transaction import ModbusAsciiFramer, ModbusRtuFramer, ModbusSocketFramer
import voluptuous as vol
from homeassistant.const import (
@@ -137,8 +137,10 @@ async def async_modbus_setup(
for name in hubs:
if not await hubs[name].async_setup():
return False
hub_collect = hass.data[DOMAIN]
else:
hass.data[DOMAIN] = hub_collect = {}
hass.data[DOMAIN] = hub_collect = {}
for conf_hub in config[DOMAIN]:
my_hub = ModbusHub(hass, conf_hub)
hub_collect[conf_hub[CONF_NAME]] = my_hub
@@ -279,9 +281,12 @@ class ModbusHub:
}
if self._config_type == SERIAL:
# serial configuration
if client_config[CONF_METHOD] == "ascii":
self._pb_params["framer"] = ModbusAsciiFramer
else:
self._pb_params["framer"] = ModbusRtuFramer
self._pb_params.update(
{
"method": client_config[CONF_METHOD],
"baudrate": client_config[CONF_BAUDRATE],
"stopbits": client_config[CONF_STOPBITS],
"bytesize": client_config[CONF_BYTESIZE],
@@ -293,6 +298,8 @@ class ModbusHub:
self._pb_params["host"] = client_config[CONF_HOST]
if self._config_type == RTUOVERTCP:
self._pb_params["framer"] = ModbusRtuFramer
else:
self._pb_params["framer"] = ModbusSocketFramer
Defaults.Timeout = client_config[CONF_TIMEOUT]
if CONF_MSG_WAIT in client_config:

View File

@@ -706,7 +706,7 @@ async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
for component in PLATFORMS
)
)
await hass.async_block_till_done()
await asyncio.sleep(0)
# Unsubscribe reload dispatchers
while reload_dispatchers := mqtt_data.reload_dispatchers:
reload_dispatchers.pop()()

View File

@@ -495,8 +495,12 @@ class MqttLight(MqttEntity, LightEntity, RestoreEntity):
self._attr_color_mode = color_mode
if self._topic[CONF_BRIGHTNESS_STATE_TOPIC] is None:
rgb = convert_color(*color)
percent_bright = float(color_util.color_RGB_to_hsv(*rgb)[2]) / 100.0
self._attr_brightness = min(round(percent_bright * 255), 255)
brightness = max(rgb)
self._attr_brightness = brightness
# Normalize the color to 100% brightness
color = tuple(
min(round(channel / brightness * 255), 255) for channel in color
)
return color
@callback
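The math in this hunk splits a received RGB payload into a brightness level and a full-brightness color: brightness is `max(r, g, b)`, and each channel is rescaled so the largest becomes 255. A worked sketch:

```python
def split_rgb(received: tuple[int, int, int]):
    # Brightness is the largest channel; the color is rescaled so that
    # channel becomes 255 (full brightness), mirroring the diff above.
    brightness = max(received)
    if brightness == 0:  # all-black payload: nothing to normalize
        return 0, received
    color = tuple(min(round(c / brightness * 255), 255) for c in received)
    return brightness, color

# A dim red (127, 0, 0) splits into brightness 127 and pure red (255, 0, 0).
print(split_rgb((127, 0, 0)))
```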

View File

@@ -281,7 +281,7 @@ class MqttSensor(MqttEntity, RestoreSensor):
else:
self._attr_native_value = new_value
return
if self.device_class is None:
if self.device_class in {None, SensorDeviceClass.ENUM}:
self._attr_native_value = new_value
return
if (payload_datetime := dt_util.parse_datetime(new_value)) is None:

View File

@@ -62,13 +62,13 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Set up Nibe Heat Pump from a config entry."""
heatpump = HeatPump(Model[entry.data[CONF_MODEL]])
heatpump.word_swap = entry.data.get(CONF_WORD_SWAP, True)
await heatpump.initialize()
connection: Connection
connection_type = entry.data[CONF_CONNECTION_TYPE]
if connection_type == CONF_CONNECTION_TYPE_NIBEGW:
heatpump.word_swap = entry.data[CONF_WORD_SWAP]
connection = NibeGW(
heatpump,
entry.data[CONF_IP_ADDRESS],

View File

@@ -31,6 +31,7 @@ from . import Coordinator
from .const import (
DOMAIN,
LOGGER,
VALUES_COOL_WITH_ROOM_SENSOR_OFF,
VALUES_MIXING_VALVE_CLOSED_STATE,
VALUES_PRIORITY_COOLING,
VALUES_PRIORITY_HEATING,
@@ -139,10 +140,13 @@ class NibeClimateEntity(CoordinatorEntity[Coordinator], ClimateEntity):
mode = HVACMode.OFF
if _get_value(self._coil_use_room_sensor) == "ON":
if _get_value(self._coil_cooling_with_room_sensor) == "ON":
mode = HVACMode.HEAT_COOL
else:
if (
_get_value(self._coil_cooling_with_room_sensor)
in VALUES_COOL_WITH_ROOM_SENSOR_OFF
):
mode = HVACMode.HEAT
else:
mode = HVACMode.HEAT_COOL
self._attr_hvac_mode = mode
setpoint_heat = _get_float(self._coil_setpoint_heat)

View File

@@ -89,6 +89,7 @@ async def validate_nibegw_input(
"""Validate the user input allows us to connect."""
heatpump = HeatPump(Model[data[CONF_MODEL]])
heatpump.word_swap = True
await heatpump.initialize()
connection = NibeGW(

View File

@@ -17,3 +17,4 @@ CONF_MODBUS_UNIT = "modbus_unit"
VALUES_MIXING_VALVE_CLOSED_STATE = (30, "CLOSED", "SHUNT CLOSED")
VALUES_PRIORITY_HEATING = (30, "HEAT")
VALUES_PRIORITY_COOLING = (60, "COOLING")
VALUES_COOL_WITH_ROOM_SENSOR_OFF = (0, "OFF")

View File

@@ -5,5 +5,5 @@
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/nibe_heatpump",
"iot_class": "local_polling",
"requirements": ["nibe==2.0.0"]
"requirements": ["nibe==2.1.4"]
}

View File

@@ -11,5 +11,6 @@
"dependencies": ["bluetooth_adapters"],
"documentation": "https://www.home-assistant.io/integrations/oralb",
"iot_class": "local_push",
"requirements": ["oralb-ble==0.17.5"]
"loggers": ["oralb-ble"],
"requirements": ["oralb-ble==0.17.6"]
}

View File

@@ -7,5 +7,5 @@
"iot_class": "cloud_polling",
"loggers": ["aiopvpc"],
"quality_scale": "platinum",
"requirements": ["aiopvpc==4.0.1"]
"requirements": ["aiopvpc==4.1.0"]
}

View File

@@ -17,7 +17,6 @@ from sqlalchemy.orm.query import Query
from sqlalchemy.orm.session import Session
from sqlalchemy.sql.expression import literal
from sqlalchemy.sql.lambdas import StatementLambdaElement
from sqlalchemy.sql.selectable import Subquery
from homeassistant.const import COMPRESSED_STATE_LAST_UPDATED, COMPRESSED_STATE_STATE
from homeassistant.core import HomeAssistant, State, split_entity_id
@@ -283,9 +282,11 @@ def _significant_states_stmt(
(States.last_changed_ts == States.last_updated_ts)
| States.last_changed_ts.is_(None)
)
stmt += lambda q: q.filter(
(States.last_changed == States.last_updated) | States.last_changed.is_(None)
)
else:
stmt += lambda q: q.filter(
(States.last_changed == States.last_updated)
| States.last_changed.is_(None)
)
elif significant_changes_only:
if schema_version >= 31:
stmt += lambda q: q.filter(
@@ -592,48 +593,6 @@ def get_last_state_changes(
)
def _generate_most_recent_states_for_entities_by_date(
schema_version: int,
run_start: datetime,
utc_point_in_time: datetime,
entity_ids: list[str],
) -> Subquery:
"""Generate the sub query for the most recent states for specific entities by date."""
if schema_version >= 31:
run_start_ts = process_timestamp(run_start).timestamp()
utc_point_in_time_ts = dt_util.utc_to_timestamp(utc_point_in_time)
return (
select(
States.entity_id.label("max_entity_id"),
# https://github.com/sqlalchemy/sqlalchemy/issues/9189
# pylint: disable-next=not-callable
func.max(States.last_updated_ts).label("max_last_updated"),
)
.filter(
(States.last_updated_ts >= run_start_ts)
& (States.last_updated_ts < utc_point_in_time_ts)
)
.filter(States.entity_id.in_(entity_ids))
.group_by(States.entity_id)
.subquery()
)
return (
select(
States.entity_id.label("max_entity_id"),
# https://github.com/sqlalchemy/sqlalchemy/issues/9189
# pylint: disable-next=not-callable
func.max(States.last_updated).label("max_last_updated"),
)
.filter(
(States.last_updated >= run_start)
& (States.last_updated < utc_point_in_time)
)
.filter(States.entity_id.in_(entity_ids))
.group_by(States.entity_id)
.subquery()
)
def _get_states_for_entities_stmt(
schema_version: int,
run_start: datetime,
@@ -645,16 +604,29 @@ def _get_states_for_entities_stmt(
stmt, join_attributes = lambda_stmt_and_join_attributes(
schema_version, no_attributes, include_last_changed=True
)
most_recent_states_for_entities_by_date = (
_generate_most_recent_states_for_entities_by_date(
schema_version, run_start, utc_point_in_time, entity_ids
)
)
# We got an include-list of entities, accelerate the query by filtering already
# in the inner query.
if schema_version >= 31:
run_start_ts = process_timestamp(run_start).timestamp()
utc_point_in_time_ts = dt_util.utc_to_timestamp(utc_point_in_time)
stmt += lambda q: q.join(
most_recent_states_for_entities_by_date,
(
most_recent_states_for_entities_by_date := (
select(
States.entity_id.label("max_entity_id"),
# https://github.com/sqlalchemy/sqlalchemy/issues/9189
# pylint: disable-next=not-callable
func.max(States.last_updated_ts).label("max_last_updated"),
)
.filter(
(States.last_updated_ts >= run_start_ts)
& (States.last_updated_ts < utc_point_in_time_ts)
)
.filter(States.entity_id.in_(entity_ids))
.group_by(States.entity_id)
.subquery()
)
),
and_(
States.entity_id
== most_recent_states_for_entities_by_date.c.max_entity_id,
@@ -664,7 +636,21 @@ def _get_states_for_entities_stmt(
)
else:
stmt += lambda q: q.join(
most_recent_states_for_entities_by_date,
(
most_recent_states_for_entities_by_date := select(
States.entity_id.label("max_entity_id"),
# https://github.com/sqlalchemy/sqlalchemy/issues/9189
# pylint: disable-next=not-callable
func.max(States.last_updated).label("max_last_updated"),
)
.filter(
(States.last_updated >= run_start)
& (States.last_updated < utc_point_in_time)
)
.filter(States.entity_id.in_(entity_ids))
.group_by(States.entity_id)
.subquery()
),
and_(
States.entity_id
== most_recent_states_for_entities_by_date.c.max_entity_id,
@@ -679,45 +665,6 @@ def _get_states_for_entities_stmt(
return stmt
def _generate_most_recent_states_by_date(
schema_version: int,
run_start: datetime,
utc_point_in_time: datetime,
) -> Subquery:
"""Generate the sub query for the most recent states by date."""
if schema_version >= 31:
run_start_ts = process_timestamp(run_start).timestamp()
utc_point_in_time_ts = dt_util.utc_to_timestamp(utc_point_in_time)
return (
select(
States.entity_id.label("max_entity_id"),
# https://github.com/sqlalchemy/sqlalchemy/issues/9189
# pylint: disable-next=not-callable
func.max(States.last_updated_ts).label("max_last_updated"),
)
.filter(
(States.last_updated_ts >= run_start_ts)
& (States.last_updated_ts < utc_point_in_time_ts)
)
.group_by(States.entity_id)
.subquery()
)
return (
select(
States.entity_id.label("max_entity_id"),
# https://github.com/sqlalchemy/sqlalchemy/issues/9189
# pylint: disable-next=not-callable
func.max(States.last_updated).label("max_last_updated"),
)
.filter(
(States.last_updated >= run_start)
& (States.last_updated < utc_point_in_time)
)
.group_by(States.entity_id)
.subquery()
)
def _get_states_for_all_stmt(
schema_version: int,
run_start: datetime,
@@ -733,12 +680,26 @@ def _get_states_for_all_stmt(
# query, then filter out unwanted domains as well as applying the custom filter.
# This filtering can't be done in the inner query because the domain column is
# not indexed and we can't control what's in the custom filter.
most_recent_states_by_date = _generate_most_recent_states_by_date(
schema_version, run_start, utc_point_in_time
)
if schema_version >= 31:
run_start_ts = process_timestamp(run_start).timestamp()
utc_point_in_time_ts = dt_util.utc_to_timestamp(utc_point_in_time)
stmt += lambda q: q.join(
most_recent_states_by_date,
(
most_recent_states_by_date := (
select(
States.entity_id.label("max_entity_id"),
# https://github.com/sqlalchemy/sqlalchemy/issues/9189
# pylint: disable-next=not-callable
func.max(States.last_updated_ts).label("max_last_updated"),
)
.filter(
(States.last_updated_ts >= run_start_ts)
& (States.last_updated_ts < utc_point_in_time_ts)
)
.group_by(States.entity_id)
.subquery()
)
),
and_(
States.entity_id == most_recent_states_by_date.c.max_entity_id,
States.last_updated_ts == most_recent_states_by_date.c.max_last_updated,
@@ -746,7 +707,22 @@ def _get_states_for_all_stmt(
)
else:
stmt += lambda q: q.join(
most_recent_states_by_date,
(
most_recent_states_by_date := (
select(
States.entity_id.label("max_entity_id"),
# https://github.com/sqlalchemy/sqlalchemy/issues/9189
# pylint: disable-next=not-callable
func.max(States.last_updated).label("max_last_updated"),
)
.filter(
(States.last_updated >= run_start)
& (States.last_updated < utc_point_in_time)
)
.group_by(States.entity_id)
.subquery()
)
),
and_(
States.entity_id == most_recent_states_by_date.c.max_entity_id,
States.last_updated == most_recent_states_by_date.c.max_last_updated,

View File

@@ -6,5 +6,5 @@
"integration_type": "system",
"iot_class": "local_push",
"quality_scale": "internal",
"requirements": ["sqlalchemy==2.0.4", "fnvhash==0.1.0"]
"requirements": ["sqlalchemy==2.0.6", "fnvhash==0.1.0"]
}

View File

@@ -50,7 +50,7 @@ from .tasks import (
PostSchemaMigrationTask,
StatisticsTimestampMigrationCleanupTask,
)
from .util import session_scope
from .util import database_job_retry_wrapper, session_scope
if TYPE_CHECKING:
from . import Recorder
@@ -158,7 +158,9 @@ def migrate_schema(
hass.add_job(instance.async_set_db_ready)
new_version = version + 1
_LOGGER.info("Upgrading recorder db schema to version %s", new_version)
_apply_update(hass, engine, session_maker, new_version, current_version)
_apply_update(
instance, hass, engine, session_maker, new_version, current_version
)
with session_scope(session=session_maker()) as session:
session.add(SchemaChanges(schema_version=new_version))
@@ -508,7 +510,9 @@ def _drop_foreign_key_constraints(
)
@database_job_retry_wrapper("Apply migration update", 10)
def _apply_update( # noqa: C901
instance: Recorder,
hass: HomeAssistant,
engine: Engine,
session_maker: Callable[[], Session],
@@ -922,7 +926,7 @@ def _apply_update( # noqa: C901
# There may be duplicated statistics entries, delete duplicates
# and try again
with session_scope(session=session_maker()) as session:
delete_statistics_duplicates(hass, session)
delete_statistics_duplicates(instance, hass, session)
_migrate_statistics_columns_to_timestamp(session_maker, engine)
# Log at error level to ensure the user sees this message in the log
# since we logged the error above.
@@ -965,7 +969,7 @@ def post_schema_migration(
# since they are no longer used and take up a significant amount of space.
assert instance.event_session is not None
assert instance.engine is not None
_wipe_old_string_time_columns(instance.engine, instance.event_session)
_wipe_old_string_time_columns(instance, instance.engine, instance.event_session)
if old_version < 35 <= new_version:
# In version 34 we migrated all the created, start, and last_reset
# columns to be timestamps. In version 34 we need to wipe the old columns
@@ -978,7 +982,10 @@ def _wipe_old_string_statistics_columns(instance: Recorder) -> None:
instance.queue_task(StatisticsTimestampMigrationCleanupTask())
def _wipe_old_string_time_columns(engine: Engine, session: Session) -> None:
@database_job_retry_wrapper("Wipe old string time columns", 3)
def _wipe_old_string_time_columns(
instance: Recorder, engine: Engine, session: Session
) -> None:
"""Wipe old string time columns to save space."""
# Wipe Events.time_fired since it's been replaced by Events.time_fired_ts
# Wipe States.last_updated since it's been replaced by States.last_updated_ts
@@ -1065,7 +1072,7 @@ def _migrate_columns_to_timestamp(
result = session.connection().execute(
text(
"UPDATE events set time_fired_ts="
"IF(time_fired is NULL,0,"
"IF(time_fired is NULL or UNIX_TIMESTAMP(time_fired) is NULL,0,"
"UNIX_TIMESTAMP(time_fired)"
") "
"where time_fired_ts is NULL "
@@ -1078,7 +1085,7 @@ def _migrate_columns_to_timestamp(
result = session.connection().execute(
text(
"UPDATE states set last_updated_ts="
"IF(last_updated is NULL,0,"
"IF(last_updated is NULL or UNIX_TIMESTAMP(last_updated) is NULL,0,"
"UNIX_TIMESTAMP(last_updated) "
"), "
"last_changed_ts="
@@ -1154,7 +1161,7 @@ def _migrate_statistics_columns_to_timestamp(
result = session.connection().execute(
text(
f"UPDATE {table} set start_ts="
"IF(start is NULL,0,"
"IF(start is NULL or UNIX_TIMESTAMP(start) is NULL,0,"
"UNIX_TIMESTAMP(start) "
"), "
"created_ts="
@@ -1162,7 +1169,7 @@ def _migrate_statistics_columns_to_timestamp(
"last_reset_ts="
"UNIX_TIMESTAMP(last_reset) "
"where start_ts is NULL "
"LIMIT 250000;"
"LIMIT 100000;"
)
)
elif engine.dialect.name == SupportedDialect.POSTGRESQL:
@@ -1180,7 +1187,7 @@ def _migrate_statistics_columns_to_timestamp(
"created_ts=EXTRACT(EPOCH FROM created), "
"last_reset_ts=EXTRACT(EPOCH FROM last_reset) "
"where id IN ( "
f"SELECT id FROM {table} where start_ts is NULL LIMIT 250000 "
f"SELECT id FROM {table} where start_ts is NULL LIMIT 100000 "
" );"
)
)
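
The MySQL branches of the migration gain a second check because UNIX_TIMESTAMP() returns NULL for datetimes it cannot convert (NULL-adjacent or out-of-range values), and the batch size drops from 250000 to 100000 rows. A hedged sketch of the guarded statement, not the literal migration; the table and column names come from the diff, the failure-mode explanation is an inference from the guard itself:

GUARDED_EVENTS_UPDATE = (
    "UPDATE events SET time_fired_ts="
    # Without the extra check, a row whose datetime UNIX_TIMESTAMP() cannot
    # convert would be written back as NULL, keep matching the WHERE clause,
    # and the batched migration could never finish for that row.
    "IF(time_fired IS NULL OR UNIX_TIMESTAMP(time_fired) IS NULL, 0, "
    "UNIX_TIMESTAMP(time_fired)) "
    "WHERE time_fired_ts IS NULL "
    "LIMIT 100000;"  # smaller batches keep each statement's lock time down
)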

View File

@@ -16,14 +16,13 @@ import re
from statistics import mean
from typing import TYPE_CHECKING, Any, Literal, cast
from sqlalchemy import and_, bindparam, func, lambda_stmt, select, text
from sqlalchemy import Select, and_, bindparam, func, lambda_stmt, select, text
from sqlalchemy.engine import Engine
from sqlalchemy.engine.row import Row
from sqlalchemy.exc import OperationalError, SQLAlchemyError, StatementError
from sqlalchemy.orm.session import Session
from sqlalchemy.sql.expression import literal_column, true
from sqlalchemy.sql.lambdas import StatementLambdaElement
from sqlalchemy.sql.selectable import Subquery
import voluptuous as vol
from homeassistant.const import ATTR_UNIT_OF_MEASUREMENT
@@ -75,6 +74,7 @@ from .models import (
datetime_to_timestamp_or_none,
)
from .util import (
database_job_retry_wrapper,
execute,
execute_stmt_lambda_element,
get_instance,
@@ -515,7 +515,10 @@ def _delete_duplicates_from_table(
return (total_deleted_rows, all_non_identical_duplicates)
def delete_statistics_duplicates(hass: HomeAssistant, session: Session) -> None:
@database_job_retry_wrapper("delete statistics duplicates", 3)
def delete_statistics_duplicates(
instance: Recorder, hass: HomeAssistant, session: Session
) -> None:
"""Identify and delete duplicated statistics.
A backup will be made of duplicated statistics before they are deleted.
@@ -646,27 +649,19 @@ def _compile_hourly_statistics_summary_mean_stmt(
)
def _compile_hourly_statistics_last_sum_stmt_subquery(
start_time_ts: float, end_time_ts: float
) -> Subquery:
"""Generate the summary mean statement for hourly statistics."""
return (
select(*QUERY_STATISTICS_SUMMARY_SUM)
.filter(StatisticsShortTerm.start_ts >= start_time_ts)
.filter(StatisticsShortTerm.start_ts < end_time_ts)
.subquery()
)
def _compile_hourly_statistics_last_sum_stmt(
start_time_ts: float, end_time_ts: float
) -> StatementLambdaElement:
"""Generate the summary mean statement for hourly statistics."""
subquery = _compile_hourly_statistics_last_sum_stmt_subquery(
start_time_ts, end_time_ts
)
return lambda_stmt(
lambda: select(subquery)
lambda: select(
subquery := (
select(*QUERY_STATISTICS_SUMMARY_SUM)
.filter(StatisticsShortTerm.start_ts >= start_time_ts)
.filter(StatisticsShortTerm.start_ts < end_time_ts)
.subquery()
)
)
.filter(subquery.c.rownum == 1)
.order_by(subquery.c.metadata_id)
)
@@ -1263,7 +1258,8 @@ def _reduce_statistics_per_month(
)
def _statistics_during_period_stmt(
def _generate_statistics_during_period_stmt(
columns: Select,
start_time: datetime,
end_time: datetime | None,
metadata_ids: list[int] | None,
@@ -1275,21 +1271,6 @@ def _statistics_during_period_stmt(
This prepares a lambda_stmt query, so we don't insert the parameters yet.
"""
start_time_ts = start_time.timestamp()
columns = select(table.metadata_id, table.start_ts)
if "last_reset" in types:
columns = columns.add_columns(table.last_reset_ts)
if "max" in types:
columns = columns.add_columns(table.max)
if "mean" in types:
columns = columns.add_columns(table.mean)
if "min" in types:
columns = columns.add_columns(table.min)
if "state" in types:
columns = columns.add_columns(table.state)
if "sum" in types:
columns = columns.add_columns(table.sum)
stmt = lambda_stmt(lambda: columns.filter(table.start_ts >= start_time_ts))
if end_time is not None:
end_time_ts = end_time.timestamp()
@@ -1303,6 +1284,23 @@ def _statistics_during_period_stmt(
return stmt
def _generate_max_mean_min_statistic_in_sub_period_stmt(
columns: Select,
start_time: datetime | None,
end_time: datetime | None,
table: type[StatisticsBase],
metadata_id: int,
) -> StatementLambdaElement:
stmt = lambda_stmt(lambda: columns.filter(table.metadata_id == metadata_id))
if start_time is not None:
start_time_ts = start_time.timestamp()
stmt += lambda q: q.filter(table.start_ts >= start_time_ts)
if end_time is not None:
end_time_ts = end_time.timestamp()
stmt += lambda q: q.filter(table.start_ts < end_time_ts)
return stmt
def _get_max_mean_min_statistic_in_sub_period(
session: Session,
result: dict[str, float],
@@ -1328,13 +1326,9 @@ def _get_max_mean_min_statistic_in_sub_period(
# https://github.com/sqlalchemy/sqlalchemy/issues/9189
# pylint: disable-next=not-callable
columns = columns.add_columns(func.min(table.min))
stmt = lambda_stmt(lambda: columns.filter(table.metadata_id == metadata_id))
if start_time is not None:
start_time_ts = start_time.timestamp()
stmt += lambda q: q.filter(table.start_ts >= start_time_ts)
if end_time is not None:
end_time_ts = end_time.timestamp()
stmt += lambda q: q.filter(table.start_ts < end_time_ts)
stmt = _generate_max_mean_min_statistic_in_sub_period_stmt(
columns, start_time, end_time, table, metadata_id
)
stats = cast(Sequence[Row[Any]], execute_stmt_lambda_element(session, stmt))
if not stats:
return
@@ -1749,8 +1743,21 @@ def _statistics_during_period_with_session(
table: type[Statistics | StatisticsShortTerm] = (
Statistics if period != "5minute" else StatisticsShortTerm
)
stmt = _statistics_during_period_stmt(
start_time, end_time, metadata_ids, table, types
columns = select(table.metadata_id, table.start_ts) # type: ignore[call-overload]
if "last_reset" in types:
columns = columns.add_columns(table.last_reset_ts)
if "max" in types:
columns = columns.add_columns(table.max)
if "mean" in types:
columns = columns.add_columns(table.mean)
if "min" in types:
columns = columns.add_columns(table.min)
if "state" in types:
columns = columns.add_columns(table.state)
if "sum" in types:
columns = columns.add_columns(table.sum)
stmt = _generate_statistics_during_period_stmt(
columns, start_time, end_time, metadata_ids, table, types
)
stats = cast(Sequence[Row], execute_stmt_lambda_element(session, stmt))
@@ -1915,28 +1922,24 @@ def get_last_short_term_statistics(
)
def _generate_most_recent_statistic_row(metadata_ids: list[int]) -> Subquery:
"""Generate the subquery to find the most recent statistic row."""
return (
select(
StatisticsShortTerm.metadata_id,
# https://github.com/sqlalchemy/sqlalchemy/issues/9189
# pylint: disable-next=not-callable
func.max(StatisticsShortTerm.start_ts).label("start_max"),
)
.where(StatisticsShortTerm.metadata_id.in_(metadata_ids))
.group_by(StatisticsShortTerm.metadata_id)
).subquery()
def _latest_short_term_statistics_stmt(
metadata_ids: list[int],
) -> StatementLambdaElement:
"""Create the statement for finding the latest short term stat rows."""
stmt = lambda_stmt(lambda: select(*QUERY_STATISTICS_SHORT_TERM))
most_recent_statistic_row = _generate_most_recent_statistic_row(metadata_ids)
stmt += lambda s: s.join(
most_recent_statistic_row,
(
most_recent_statistic_row := (
select(
StatisticsShortTerm.metadata_id,
# https://github.com/sqlalchemy/sqlalchemy/issues/9189
# pylint: disable-next=not-callable
func.max(StatisticsShortTerm.start_ts).label("start_max"),
)
.where(StatisticsShortTerm.metadata_id.in_(metadata_ids))
.group_by(StatisticsShortTerm.metadata_id)
).subquery()
),
(
StatisticsShortTerm.metadata_id # pylint: disable=comparison-with-callable
== most_recent_statistic_row.c.metadata_id
@@ -1984,21 +1987,34 @@ def get_latest_short_term_statistics(
)
def _get_most_recent_statistics_subquery(
metadata_ids: set[int], table: type[StatisticsBase], start_time_ts: float
) -> Subquery:
"""Generate the subquery to find the most recent statistic row."""
return (
select(
# https://github.com/sqlalchemy/sqlalchemy/issues/9189
# pylint: disable-next=not-callable
func.max(table.start_ts).label("max_start_ts"),
table.metadata_id.label("max_metadata_id"),
def _generate_statistics_at_time_stmt(
columns: Select,
table: type[StatisticsBase],
metadata_ids: set[int],
start_time_ts: float,
) -> StatementLambdaElement:
"""Create the statement for finding the statistics for a given time."""
return lambda_stmt(
lambda: columns.join(
(
most_recent_statistic_ids := (
select(
# https://github.com/sqlalchemy/sqlalchemy/issues/9189
# pylint: disable-next=not-callable
func.max(table.start_ts).label("max_start_ts"),
table.metadata_id.label("max_metadata_id"),
)
.filter(table.start_ts < start_time_ts)
.filter(table.metadata_id.in_(metadata_ids))
.group_by(table.metadata_id)
.subquery()
)
),
and_(
table.start_ts == most_recent_statistic_ids.c.max_start_ts,
table.metadata_id == most_recent_statistic_ids.c.max_metadata_id,
),
)
.filter(table.start_ts < start_time_ts)
.filter(table.metadata_id.in_(metadata_ids))
.group_by(table.metadata_id)
.subquery()
)
@@ -2023,19 +2039,10 @@ def _statistics_at_time(
columns = columns.add_columns(table.state)
if "sum" in types:
columns = columns.add_columns(table.sum)
start_time_ts = start_time.timestamp()
most_recent_statistic_ids = _get_most_recent_statistics_subquery(
metadata_ids, table, start_time_ts
stmt = _generate_statistics_at_time_stmt(
columns, table, metadata_ids, start_time_ts
)
stmt = lambda_stmt(lambda: columns).join(
most_recent_statistic_ids,
and_(
table.start_ts == most_recent_statistic_ids.c.max_start_ts,
table.metadata_id == most_recent_statistic_ids.c.max_metadata_id,
),
)
return cast(Sequence[Row], execute_stmt_lambda_element(session, stmt))
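
The statistics refactors apply the same caching idea from the other direction: the column list varies with the requested statistic types, so the caller assembles it and hands it to a generator whose lambda body stays identical across calls. A minimal sketch mirroring that split, with a simplified stand-in for the Statistics/StatisticsShortTerm tables:

from sqlalchemy import Select, lambda_stmt, select
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column
from sqlalchemy.sql.lambdas import StatementLambdaElement

class Base(DeclarativeBase):
    pass

class Stats(Base):  # stand-in for Statistics / StatisticsShortTerm
    __tablename__ = "stats"
    id: Mapped[int] = mapped_column(primary_key=True)
    metadata_id: Mapped[int]
    start_ts: Mapped[float]
    mean: Mapped[float]

def generate_period_stmt(
    columns: Select, start_time_ts: float
) -> StatementLambdaElement:
    # The lambda only appends filters to the caller-built Select, so the
    # cached statement is reused whenever the same columns are requested.
    return lambda_stmt(lambda: columns.filter(Stats.start_ts >= start_time_ts))

columns = select(Stats.metadata_id, Stats.start_ts)
columns = columns.add_columns(Stats.mean)  # e.g. when "mean" is requested
stmt = generate_period_stmt(columns, 1_700_000_000.0)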

View File

@@ -568,6 +568,17 @@ def end_incomplete_runs(session: Session, start_time: datetime) -> None:
session.add(run)
def _is_retryable_error(instance: Recorder, err: OperationalError) -> bool:
"""Return True if the error is retryable."""
assert instance.engine is not None
return bool(
instance.engine.dialect.name == SupportedDialect.MYSQL
and isinstance(err.orig, BaseException)
and err.orig.args
and err.orig.args[0] in RETRYABLE_MYSQL_ERRORS
)
_FuncType = Callable[Concatenate[_RecorderT, _P], bool]
@@ -585,12 +596,8 @@ def retryable_database_job(
try:
return job(instance, *args, **kwargs)
except OperationalError as err:
assert instance.engine is not None
if (
instance.engine.dialect.name == SupportedDialect.MYSQL
and err.orig
and err.orig.args[0] in RETRYABLE_MYSQL_ERRORS
):
if _is_retryable_error(instance, err):
assert isinstance(err.orig, BaseException)
_LOGGER.info(
"%s; %s not completed, retrying", err.orig.args[1], description
)
@@ -608,6 +615,46 @@ def retryable_database_job(
return decorator
_WrappedFuncType = Callable[Concatenate[_RecorderT, _P], None]
def database_job_retry_wrapper(
description: str, attempts: int = 5
) -> Callable[[_WrappedFuncType[_RecorderT, _P]], _WrappedFuncType[_RecorderT, _P]]:
"""Try to execute a database job multiple times.
This wrapper handles InnoDB deadlocks and lock timeouts.
This is different from retryable_database_job in that it will retry the job
up to `attempts` times instead of returning False if the job fails.
"""
def decorator(
job: _WrappedFuncType[_RecorderT, _P]
) -> _WrappedFuncType[_RecorderT, _P]:
@functools.wraps(job)
def wrapper(instance: _RecorderT, *args: _P.args, **kwargs: _P.kwargs) -> None:
for attempt in range(attempts):
try:
job(instance, *args, **kwargs)
return
except OperationalError as err:
if attempt == attempts - 1 or not _is_retryable_error(
instance, err
):
raise
assert isinstance(err.orig, BaseException)
_LOGGER.info(
"%s; %s failed, retrying", err.orig.args[1], description
)
time.sleep(instance.db_retry_wait)
# Failed with retryable error
return wrapper
return decorator
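
For reference, a self-contained simplification of the new decorator; it omits the MySQL-specific _is_retryable_error check and the Recorder instance plumbing, and uses a fixed sleep in place of instance.db_retry_wait:

import functools
import time
from collections.abc import Callable

from sqlalchemy.exc import OperationalError

def job_retry_wrapper(description: str, attempts: int = 5) -> Callable:
    """Retry a database job up to `attempts` times, re-raising on the last."""

    def decorator(job: Callable) -> Callable:
        @functools.wraps(job)
        def wrapper(*args, **kwargs) -> None:
            for attempt in range(attempts):
                try:
                    job(*args, **kwargs)
                    return
                except OperationalError:
                    # Unlike retryable_database_job, which reports failure by
                    # returning False, this wrapper keeps retrying and only
                    # re-raises once the attempts are exhausted.
                    if attempt == attempts - 1:
                        raise
                    time.sleep(1.0)

        return wrapper

    return decorator
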
def periodic_db_cleanups(instance: Recorder) -> None:
"""Run any database cleanups that need to happen periodically.

View File

@@ -64,7 +64,7 @@ NUMBER_ENTITIES = (
get_max_value=lambda api, ch: api.zoom_range(ch)["focus"]["pos"]["max"],
supported=lambda api, ch: api.zoom_supported(ch),
value=lambda api, ch: api.get_focus(ch),
method=lambda api, ch, value: api.set_zoom(ch, int(value)),
method=lambda api, ch, value: api.set_focus(ch, int(value)),
),
)

View File

@@ -24,5 +24,5 @@
"documentation": "https://www.home-assistant.io/integrations/roomba",
"iot_class": "local_push",
"loggers": ["paho_mqtt", "roombapy"],
"requirements": ["roombapy==1.6.5"]
"requirements": ["roombapy==1.6.6"]
}

View File

@@ -159,11 +159,9 @@ class ScreenlogicDataUpdateCoordinator(DataUpdateCoordinator):
"""Fetch data from the Screenlogic gateway."""
try:
await self._async_update_configured_data()
except ScreenLogicError as error:
_LOGGER.warning("Update error - attempting reconnect: %s", error)
except (ScreenLogicError, ScreenLogicWarning) as ex:
_LOGGER.warning("Update error - attempting reconnect: %s", ex)
await self._async_reconnect_update_data()
except ScreenLogicWarning as warn:
raise UpdateFailed(f"Incomplete update: {warn}") from warn
return None

View File

@@ -1,13 +1,11 @@
"""SFR Box."""
from __future__ import annotations
import asyncio
from sfrbox_api.bridge import SFRBox
from sfrbox_api.exceptions import SFRBoxAuthenticationError, SFRBoxError
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_HOST, CONF_PASSWORD, CONF_USERNAME
from homeassistant.const import CONF_HOST, CONF_PASSWORD, CONF_USERNAME, Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers import device_registry as dr
@@ -40,15 +38,17 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
hass, box, "system", lambda b: b.system_get_info()
),
)
tasks = [
data.dsl.async_config_entry_first_refresh(),
data.system.async_config_entry_first_refresh(),
]
await asyncio.gather(*tasks)
await data.system.async_config_entry_first_refresh()
system_info = data.system.data
if system_info.net_infra == "adsl":
await data.dsl.async_config_entry_first_refresh()
else:
platforms = list(platforms)
platforms.remove(Platform.BINARY_SENSOR)
hass.data.setdefault(DOMAIN, {})[entry.entry_id] = data
system_info = data.system.data
device_registry = dr.async_get(hass)
device_registry.async_get_or_create(
config_entry_id=entry.entry_id,

View File

@@ -1,7 +1,6 @@
"""SFR Box sensor platform."""
from collections.abc import Callable, Iterable
from collections.abc import Callable
from dataclasses import dataclass
from itertools import chain
from typing import Generic, TypeVar
from sfrbox_api.models import DslInfo, SystemInfo
@@ -204,16 +203,15 @@ async def async_setup_entry(
"""Set up the sensors."""
data: DomainData = hass.data[DOMAIN][entry.entry_id]
entities: Iterable[SFRBoxSensor] = chain(
(
entities: list[SFRBoxSensor] = [
SFRBoxSensor(data.system, description, data.system.data)
for description in SYSTEM_SENSOR_TYPES
]
if data.system.data.net_infra == "adsl":
entities.extend(
SFRBoxSensor(data.dsl, description, data.system.data)
for description in DSL_SENSOR_TYPES
),
(
SFRBoxSensor(data.system, description, data.system.data)
for description in SYSTEM_SENSOR_TYPES
),
)
)
async_add_entities(entities)

View File

@@ -1,9 +1,9 @@
{
"domain": "snapcast",
"name": "Snapcast",
"codeowners": [],
"codeowners": ["@luar123"],
"documentation": "https://www.home-assistant.io/integrations/snapcast",
"iot_class": "local_polling",
"loggers": ["construct", "snapcast"],
"requirements": ["snapcast==2.3.0"]
"requirements": ["snapcast==2.3.2"]
}

View File

@@ -5,5 +5,5 @@
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/sql",
"iot_class": "local_polling",
"requirements": ["sqlalchemy==2.0.4"]
"requirements": ["sqlalchemy==2.0.6"]
}

View File

@@ -40,5 +40,5 @@
"documentation": "https://www.home-assistant.io/integrations/switchbot",
"iot_class": "local_push",
"loggers": ["switchbot"],
"requirements": ["PySwitchbot==0.37.3"]
"requirements": ["PySwitchbot==0.37.4"]
}

View File

@@ -17,9 +17,8 @@ some of their thread accessories can't be pinged, but it's still a thread proble
from __future__ import annotations
from typing import Any, TypedDict
from typing import TYPE_CHECKING, Any, TypedDict
from pyroute2 import NDB # pylint: disable=no-name-in-module
from python_otbr_api.tlv_parser import MeshcopTLVType
from homeassistant.components import zeroconf
@@ -29,6 +28,9 @@ from homeassistant.core import HomeAssistant
from .dataset_store import async_get_store
from .discovery import async_read_zeroconf_cache
if TYPE_CHECKING:
from pyroute2 import NDB # pylint: disable=no-name-in-module
class Neighbour(TypedDict):
"""A neighbour cache entry (ip neigh)."""
@@ -67,58 +69,69 @@ class Network(TypedDict):
unexpected_routers: set[str]
def _get_possible_thread_routes() -> (
tuple[dict[str, dict[str, Route]], dict[str, set[str]]]
):
def _get_possible_thread_routes(
ndb: NDB,
) -> tuple[dict[str, dict[str, Route]], dict[str, set[str]]]:
# Build a list of possible thread routes
# Right now, this is ipv6 /64's that have a gateway
# We cross reference with zeroconf data to confirm which via's are known border routers
routes: dict[str, dict[str, Route]] = {}
reverse_routes: dict[str, set[str]] = {}
with NDB() as ndb:
for record in ndb.routes:
# Limit to IPV6 routes
if record.family != 10:
continue
# Limit to /64 prefixes
if record.dst_len != 64:
continue
# Limit to routes with a via
if not record.gateway and not record.nh_gateway:
continue
gateway = record.gateway or record.nh_gateway
route = routes.setdefault(gateway, {})
route[record.dst] = {
"metrics": record.metrics,
"priority": record.priority,
# NM creates "nexthop" routes - a single route with many via's
# Kernel creates many routes with a single via
"is_nexthop": record.nh_gateway is not None,
}
reverse_routes.setdefault(record.dst, set()).add(gateway)
for record in ndb.routes:
# Limit to IPV6 routes
if record.family != 10:
continue
# Limit to /64 prefixes
if record.dst_len != 64:
continue
# Limit to routes with a via
if not record.gateway and not record.nh_gateway:
continue
gateway = record.gateway or record.nh_gateway
route = routes.setdefault(gateway, {})
route[record.dst] = {
"metrics": record.metrics,
"priority": record.priority,
# NM creates "nexthop" routes - a single route with many via's
# Kernel creates many routes with a single via
"is_nexthop": record.nh_gateway is not None,
}
reverse_routes.setdefault(record.dst, set()).add(gateway)
return routes, reverse_routes
def _get_neighbours() -> dict[str, Neighbour]:
neighbours: dict[str, Neighbour] = {}
with NDB() as ndb:
for record in ndb.neighbours:
neighbours[record.dst] = {
"lladdr": record.lladdr,
"state": record.state,
"probes": record.probes,
}
def _get_neighbours(ndb: NDB) -> dict[str, Neighbour]:
# Build a list of neighbours
neighbours: dict[str, Neighbour] = {
record.dst: {
"lladdr": record.lladdr,
"state": record.state,
"probes": record.probes,
}
for record in ndb.neighbours
}
return neighbours
def _get_routes_and_neighbors():
"""Get the routes and neighbours from pyroute2."""
# Import in the executor since import NDB can take a while
from pyroute2 import ( # pylint: disable=no-name-in-module, import-outside-toplevel
NDB,
)
with NDB() as ndb: # pylint: disable=not-callable
routes, reverse_routes = _get_possible_thread_routes(ndb)
neighbours = _get_neighbours(ndb)
return routes, reverse_routes, neighbours
async def async_get_config_entry_diagnostics(
hass: HomeAssistant, entry: ConfigEntry
) -> dict[str, Any]:
"""Return diagnostics for all known thread networks."""
networks: dict[str, Network] = {}
# Start with all networks that HA knows about
@@ -140,13 +153,12 @@ async def async_get_config_entry_diagnostics(
# Find all routes currently act that might be thread related, so we can match them to
# border routers as we process the zeroconf data.
routes, reverse_routes = await hass.async_add_executor_job(
_get_possible_thread_routes
#
# Also find all neighbours
routes, reverse_routes, neighbours = await hass.async_add_executor_job(
_get_routes_and_neighbors
)
# Find all neighbours
neighbours = await hass.async_add_executor_job(_get_neighbours)
aiozc = await zeroconf.async_get_async_instance(hass)
for data in async_read_zeroconf_cache(aiozc):
if not data.extended_pan_id:

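The diagnostics change folds two executor jobs into one so a single NDB context serves both readers, and defers the pyroute2 import to the worker thread. A sketch of the same pattern (Linux-only, requires pyroute2; the `.dst` field name comes from the diff):

import asyncio

def _load_neighbour_ips() -> list[str]:
    # Deferred import: importing pyroute2's NDB is slow, so it happens in the
    # executor thread instead of at module import time on the event loop.
    from pyroute2 import NDB  # pylint: disable=import-outside-toplevel

    with NDB() as ndb:
        # One NDB context for all reads, instead of the separate netlink
        # socket per helper that the previous code opened.
        return [record.dst for record in ndb.neighbours]

async def main() -> None:
    neighbours = await asyncio.get_running_loop().run_in_executor(
        None, _load_neighbour_ips
    )
    print(neighbours)

asyncio.run(main())
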
View File

@@ -3,9 +3,12 @@ from __future__ import annotations
from collections.abc import Mapping
import logging
import re
from types import MappingProxyType
from typing import Any, NamedTuple
from urllib.parse import urlsplit
from aiohttp import CookieJar
from tplink_omada_client.exceptions import (
ConnectionFailed,
LoginFailed,
@@ -20,7 +23,10 @@ from homeassistant.const import CONF_HOST, CONF_PASSWORD, CONF_USERNAME, CONF_VE
from homeassistant.core import HomeAssistant
from homeassistant.data_entry_flow import FlowResult
from homeassistant.helpers import selector
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.aiohttp_client import (
async_create_clientsession,
async_get_clientsession,
)
from .const import DOMAIN
@@ -42,11 +48,26 @@ async def create_omada_client(
hass: HomeAssistant, data: MappingProxyType[str, Any]
) -> OmadaClient:
"""Create a TP-Link Omada client API for the given config entry."""
host = data[CONF_HOST]
host: str = data[CONF_HOST]
verify_ssl = bool(data[CONF_VERIFY_SSL])
if not host.lower().startswith(("http://", "https://")):
host = "https://" + host
host_parts = urlsplit(host)
if (
host_parts.hostname
and re.fullmatch(r"\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}", host_parts.hostname)
is not None
):
# TP-Link API uses cookies for login session, so an unsafe cookie jar is required for IP addresses
websession = async_create_clientsession(hass, cookie_jar=CookieJar(unsafe=True))
else:
websession = async_get_clientsession(hass, verify_ssl=verify_ssl)
username = data[CONF_USERNAME]
password = data[CONF_PASSWORD]
websession = async_get_clientsession(hass, verify_ssl=verify_ssl)
return OmadaClient(host, username, password, websession=websession)
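
aiohttp's default CookieJar refuses to store cookies set by bare-IP hosts, which silently breaks the controller's cookie-based login session when Omada is configured by IP address. A small demonstration of the difference; the cookie name and the TEST-NET address are made up:

import asyncio
from http.cookies import SimpleCookie

from aiohttp import CookieJar
from yarl import URL

async def main() -> None:
    cookie = SimpleCookie("SESSIONID=abc123")  # hypothetical session cookie
    strict = CookieJar()
    strict.update_cookies(cookie, URL("https://192.0.2.10/"))
    unsafe = CookieJar(unsafe=True)
    unsafe.update_cookies(cookie, URL("https://192.0.2.10/"))
    # The default jar drops cookies from IP-address hosts; the unsafe jar
    # keeps them, which is what the Omada fix relies on.
    print(len(strict), len(unsafe))  # -> 0 1

asyncio.run(main())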

View File

@@ -1,7 +1,7 @@
"""Support for the Tuya lights."""
from __future__ import annotations
from dataclasses import dataclass
from dataclasses import dataclass, field
import json
from typing import Any, cast
@@ -59,7 +59,9 @@ class TuyaLightEntityDescription(LightEntityDescription):
color_data: DPCode | tuple[DPCode, ...] | None = None
color_mode: DPCode | None = None
color_temp: DPCode | tuple[DPCode, ...] | None = None
default_color_type: ColorTypeData = DEFAULT_COLOR_TYPE_DATA
default_color_type: ColorTypeData = field(
default_factory=lambda: DEFAULT_COLOR_TYPE_DATA
)
LIGHTS: dict[str, tuple[TuyaLightEntityDescription, ...]] = {

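This is the Python 3.11 dataclass rule at work: a plain class-instance default is rejected when the instance is unhashable (a regular dataclass defines __hash__ = None), so the shared default has to move behind default_factory. A minimal reproduction of the fixed form, with simplified stand-in classes:

from dataclasses import dataclass, field

@dataclass
class ColorTypeData:  # simplified stand-in
    color_mode: str = "hsv"

DEFAULT_COLOR_TYPE_DATA = ColorTypeData()

@dataclass
class LightDescription:  # simplified stand-in
    # `default_color_type: ColorTypeData = DEFAULT_COLOR_TYPE_DATA` raises
    # "ValueError: mutable default ... use default_factory" on Python 3.11,
    # because the dataclass instance is unhashable; the factory defers the
    # lookup to instantiation time instead.
    default_color_type: ColorTypeData = field(
        default_factory=lambda: DEFAULT_COLOR_TYPE_DATA
    )

assert LightDescription().default_color_type is DEFAULT_COLOR_TYPE_DATA
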
View File

@@ -4,6 +4,7 @@ from __future__ import annotations
from collections.abc import Callable
from contextlib import suppress
import datetime as dt
from functools import lru_cache
import json
from typing import Any, cast
@@ -424,6 +425,12 @@ def handle_ping(
connection.send_message(pong_message(msg["id"]))
@lru_cache
def _cached_template(template_str: str, hass: HomeAssistant) -> template.Template:
"""Return a cached template."""
return template.Template(template_str, hass)
@decorators.websocket_command(
{
vol.Required("type"): "render_template",
@@ -440,7 +447,7 @@ async def handle_render_template(
) -> None:
"""Handle render_template command."""
template_str = msg["template"]
template_obj = template.Template(template_str, hass)
template_obj = _cached_template(template_str, hass)
variables = msg.get("variables")
timeout = msg.get("timeout")
info = None
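
Since render_template is frequently called with the same template string, caching the parsed Template object avoids re-compiling it on every request. A stand-in showing the mechanism; in the real change the cache key also includes the hass instance, because both values are arguments to the cached function:

from functools import lru_cache

@lru_cache
def _cached_parse(template_str: str) -> tuple[str, int]:
    # Stand-in for template.Template(template_str, hass): parsing is the
    # expensive step, so identical strings should share one parsed object.
    print("parsing", template_str)
    return (template_str, len(template_str))

first = _cached_parse("{{ states('sensor.a') }}")
second = _cached_parse("{{ states('sensor.a') }}")
assert first is second  # parsed once, reused afterwards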

View File

@@ -12,5 +12,5 @@
"dependencies": ["bluetooth_adapters"],
"documentation": "https://www.home-assistant.io/integrations/yalexs_ble",
"iot_class": "local_push",
"requirements": ["yalexs-ble==2.0.4"]
"requirements": ["yalexs-ble==2.1.1"]
}

View File

@@ -7,7 +7,7 @@
"documentation": "https://www.home-assistant.io/integrations/yamaha_musiccast",
"iot_class": "local_push",
"loggers": ["aiomusiccast"],
"requirements": ["aiomusiccast==0.14.7"],
"requirements": ["aiomusiccast==0.14.8"],
"ssdp": [
{
"manufacturer": "Yamaha Corporation"

View File

@@ -1,5 +1,6 @@
"""Support for Zigbee Home Automation devices."""
import asyncio
import copy
import logging
import os
@@ -90,6 +91,15 @@ async def async_setup_entry(hass: HomeAssistant, config_entry: ConfigEntry) -> b
Will automatically load components to support devices found on the network.
"""
# Strip whitespace around `socket://` URIs, as surrounding whitespace is no longer accepted by zigpy
# This will be removed in 2023.7.0
path = config_entry.data[CONF_DEVICE][CONF_DEVICE_PATH]
data = copy.deepcopy(dict(config_entry.data))
if path.startswith("socket://") and path != path.strip():
data[CONF_DEVICE][CONF_DEVICE_PATH] = path.strip()
hass.config_entries.async_update_entry(config_entry, data=data)
zha_data = hass.data.setdefault(DATA_ZHA, {})
config = zha_data.get(DATA_ZHA_CONFIG, {})
@@ -97,7 +107,7 @@ async def async_setup_entry(hass: HomeAssistant, config_entry: ConfigEntry) -> b
zha_data.setdefault(platform, [])
if config.get(CONF_ENABLE_QUIRKS, True):
setup_quirks(config)
setup_quirks(custom_quirks_path=config.get(CONF_CUSTOM_QUIRKS_PATH))
# temporary code to remove the ZHA storage file from disk.
# this will be removed in 2022.10.0
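
Config entry data is treated as immutable, so the normalization deep-copies it before patching and writes it back through async_update_entry. A standalone sketch of just the string check, assuming plain "device"/"path" keys in place of the CONF_DEVICE/CONF_DEVICE_PATH constants:

import copy

def normalize_device_path(entry_data: dict) -> dict:
    path: str = entry_data["device"]["path"]
    # Only socket:// URIs are affected, and only when stripping actually
    # changes the value; otherwise the entry is left untouched.
    if path.startswith("socket://") and path != path.strip():
        entry_data = copy.deepcopy(entry_data)
        entry_data["device"]["path"] = path.strip()
    return entry_data

data = normalize_device_path({"device": {"path": "socket://192.0.2.1:6638 "}})
assert data["device"]["path"] == "socket://192.0.2.1:6638"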

View File

@@ -20,10 +20,10 @@
"zigpy_znp"
],
"requirements": [
"bellows==0.34.9",
"bellows==0.34.10",
"pyserial==3.5",
"pyserial-asyncio==0.6",
"zha-quirks==0.0.93",
"zha-quirks==0.0.94",
"zigpy-deconz==0.19.2",
"zigpy==0.53.2",
"zigpy-xbee==0.16.2",

View File

@@ -8,7 +8,7 @@ from .backports.enum import StrEnum
APPLICATION_NAME: Final = "HomeAssistant"
MAJOR_VERSION: Final = 2023
MINOR_VERSION: Final = 3
PATCH_VERSION: Final = "1"
PATCH_VERSION: Final = "6"
__short_version__: Final = f"{MAJOR_VERSION}.{MINOR_VERSION}"
__version__: Final = f"{__short_version__}.{PATCH_VERSION}"
REQUIRED_PYTHON_VER: Final[tuple[int, int, int]] = (3, 10, 0)

View File

@@ -39,6 +39,20 @@ SERVER_SOFTWARE = "{0}/{1} aiohttp/{2} Python/{3[0]}.{3[1]}".format(
WARN_CLOSE_MSG = "closes the Home Assistant aiohttp session"
#
# The default connection limit of 100 meant that you could only have
# 100 concurrent connections.
#
# This was effectively a limit of 100 devices, and then
# the supervisor API would fail as soon as it was hit.
#
# We now apply the 100 limit per host instead, so a single host can still
# open up to 100 connections while the total pool is capped at 4096,
# preventing one host from using all available connections.
#
MAXIMUM_CONNECTIONS = 4096
MAXIMUM_CONNECTIONS_PER_HOST = 100
class HassClientResponse(aiohttp.ClientResponse):
"""aiohttp.ClientResponse with a json method that uses json_loads by default."""
@@ -261,7 +275,12 @@ def _async_get_connector(
else:
ssl_context = False
connector = aiohttp.TCPConnector(enable_cleanup_closed=True, ssl=ssl_context)
connector = aiohttp.TCPConnector(
enable_cleanup_closed=True,
ssl=ssl_context,
limit=MAXIMUM_CONNECTIONS,
limit_per_host=MAXIMUM_CONNECTIONS_PER_HOST,
)
hass.data[key] = connector
async def _async_close_connector(event: Event) -> None:

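A sketch of a session built with the new limits; the values are copied from the diff, the URL is a placeholder and the ssl argument is omitted:

import asyncio

import aiohttp

async def main() -> None:
    connector = aiohttp.TCPConnector(
        enable_cleanup_closed=True,
        limit=4096,          # MAXIMUM_CONNECTIONS: overall pool ceiling
        limit_per_host=100,  # MAXIMUM_CONNECTIONS_PER_HOST: per-host cap
    )
    async with aiohttp.ClientSession(connector=connector) as session:
        async with session.get("https://example.com/") as resp:
            print(resp.status)

asyncio.run(main())
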
View File

@@ -23,7 +23,7 @@ fnvhash==0.1.0
hass-nabucasa==0.61.0
hassil==1.0.6
home-assistant-bluetooth==1.9.3
home-assistant-frontend==20230302.0
home-assistant-frontend==20230309.1
home-assistant-intents==2023.2.28
httpx==0.23.3
ifaddr==0.1.7
@@ -42,7 +42,7 @@ pyudev==0.23.2
pyyaml==6.0
requests==2.28.2
scapy==2.5.0
sqlalchemy==2.0.4
sqlalchemy==2.0.6
typing-extensions>=4.5.0,<5.0
voluptuous-serialize==2.6.0
voluptuous==0.13.1

View File

@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"
[project]
name = "homeassistant"
version = "2023.3.1"
version = "2023.3.6"
license = {text = "Apache-2.0"}
description = "Open-source home automation platform running on Python 3."
readme = "README.rst"

View File

@@ -40,7 +40,7 @@ PyRMVtransport==0.3.3
PySocks==1.7.1
# homeassistant.components.switchbot
PySwitchbot==0.37.3
PySwitchbot==0.37.4
# homeassistant.components.transport_nsw
PyTransportNSW==0.1.1
@@ -156,7 +156,7 @@ aioecowitt==2023.01.0
aioemonitor==1.0.5
# homeassistant.components.esphome
aioesphomeapi==13.4.1
aioesphomeapi==13.5.1
# homeassistant.components.flo
aioflo==2021.11.0
@@ -171,7 +171,7 @@ aiogithubapi==22.10.1
aioguardian==2022.07.0
# homeassistant.components.harmony
aioharmony==0.2.9
aioharmony==0.2.10
# homeassistant.components.homekit_controller
aiohomekit==2.6.1
@@ -214,7 +214,7 @@ aiolyric==1.0.9
aiomodernforms==0.1.8
# homeassistant.components.yamaha_musiccast
aiomusiccast==0.14.7
aiomusiccast==0.14.8
# homeassistant.components.nanoleaf
aionanoleaf==0.2.1
@@ -241,7 +241,7 @@ aiopurpleair==2022.12.1
aiopvapi==2.0.4
# homeassistant.components.pvpc_hourly_pricing
aiopvpc==4.0.1
aiopvpc==4.1.0
# homeassistant.components.lidarr
# homeassistant.components.radarr
@@ -422,7 +422,7 @@ beautifulsoup4==4.11.1
# beewi_smartclim==0.0.10
# homeassistant.components.zha
bellows==0.34.9
bellows==0.34.10
# homeassistant.components.bmw_connected_drive
bimmer_connected==0.12.1
@@ -467,7 +467,7 @@ bluetooth-auto-recovery==1.0.3
bluetooth-data-tools==0.3.1
# homeassistant.components.bond
bond-async==0.1.22
bond-async==0.1.23
# homeassistant.components.bosch_shc
boschshcpy==0.2.35
@@ -625,7 +625,7 @@ dynalite_devices==0.1.47
eagle100==0.1.1
# homeassistant.components.easyenergy
easyenergy==0.1.2
easyenergy==0.2.2
# homeassistant.components.ebusd
ebusdpy==0.0.17
@@ -907,7 +907,7 @@ hole==0.8.0
holidays==0.18.0
# homeassistant.components.frontend
home-assistant-frontend==20230302.0
home-assistant-frontend==20230309.1
# homeassistant.components.conversation
home-assistant-intents==2023.2.28
@@ -979,7 +979,7 @@ influxdb==5.3.1
inkbird-ble==0.5.6
# homeassistant.components.insteon
insteon-frontend-home-assistant==0.3.2
insteon-frontend-home-assistant==0.3.3
# homeassistant.components.intellifire
intellifire4py==2.2.2
@@ -1201,7 +1201,7 @@ nextcord==2.0.0a8
nextdns==1.3.0
# homeassistant.components.nibe_heatpump
nibe==2.0.0
nibe==2.1.4
# homeassistant.components.niko_home_control
niko-home-control==0.2.1
@@ -1299,7 +1299,7 @@ openwrt-luci-rpc==1.1.11
openwrt-ubus-rpc==0.0.2
# homeassistant.components.oralb
oralb-ble==0.17.5
oralb-ble==0.17.6
# homeassistant.components.oru
oru==0.1.11
@@ -1573,7 +1573,7 @@ pydaikin==2.9.0
pydanfossair==0.1.0
# homeassistant.components.deconz
pydeconz==108
pydeconz==110
# homeassistant.components.delijn
pydelijn==1.0.0
@@ -1621,7 +1621,7 @@ pyevilgenius==2.0.0
pyezviz==0.2.0.9
# homeassistant.components.fibaro
pyfibaro==0.6.8
pyfibaro==0.6.9
# homeassistant.components.fido
pyfido==2.1.2
@@ -1687,7 +1687,7 @@ pyialarm==2.2.0
pyicloud==1.0.0
# homeassistant.components.insteon
pyinsteon==1.3.3
pyinsteon==1.3.4
# homeassistant.components.intesishome
pyintesishome==1.8.0
@@ -1771,7 +1771,7 @@ pymailgunner==1.4
pymata-express==1.19
# homeassistant.components.mazda
pymazda==0.3.7
pymazda==0.3.8
# homeassistant.components.mediaroom
pymediaroom==0.6.5.4
@@ -2264,7 +2264,7 @@ rocketchat-API==0.6.1
rokuecp==0.17.1
# homeassistant.components.roomba
roombapy==1.6.5
roombapy==1.6.6
# homeassistant.components.roon
roonapi==0.1.3
@@ -2367,7 +2367,7 @@ smart-meter-texas==0.4.7
smhi-pkg==1.0.16
# homeassistant.components.snapcast
snapcast==2.3.0
snapcast==2.3.2
# homeassistant.components.sonos
soco==0.29.1
@@ -2398,7 +2398,7 @@ spotipy==2.22.1
# homeassistant.components.recorder
# homeassistant.components.sql
sqlalchemy==2.0.4
sqlalchemy==2.0.6
# homeassistant.components.srp_energy
srpenergy==1.3.6
@@ -2669,15 +2669,13 @@ xs1-api-client==3.0.0
# homeassistant.components.yale_smart_alarm
yalesmartalarmclient==0.3.9
# homeassistant.components.august
# homeassistant.components.yalexs_ble
yalexs-ble==2.0.4
yalexs-ble==2.1.1
# homeassistant.components.august
yalexs==1.2.7
# homeassistant.components.august
yalexs_ble==2.0.4
# homeassistant.components.yeelight
yeelight==0.7.10
@@ -2706,7 +2704,7 @@ zeroconf==0.47.3
zeversolar==0.3.1
# homeassistant.components.zha
zha-quirks==0.0.93
zha-quirks==0.0.94
# homeassistant.components.zhong_hong
zhong_hong_hvac==1.0.9

View File

@@ -36,7 +36,7 @@ PyRMVtransport==0.3.3
PySocks==1.7.1
# homeassistant.components.switchbot
PySwitchbot==0.37.3
PySwitchbot==0.37.4
# homeassistant.components.transport_nsw
PyTransportNSW==0.1.1
@@ -143,7 +143,7 @@ aioecowitt==2023.01.0
aioemonitor==1.0.5
# homeassistant.components.esphome
aioesphomeapi==13.4.1
aioesphomeapi==13.5.1
# homeassistant.components.flo
aioflo==2021.11.0
@@ -155,7 +155,7 @@ aiogithubapi==22.10.1
aioguardian==2022.07.0
# homeassistant.components.harmony
aioharmony==0.2.9
aioharmony==0.2.10
# homeassistant.components.homekit_controller
aiohomekit==2.6.1
@@ -195,7 +195,7 @@ aiolyric==1.0.9
aiomodernforms==0.1.8
# homeassistant.components.yamaha_musiccast
aiomusiccast==0.14.7
aiomusiccast==0.14.8
# homeassistant.components.nanoleaf
aionanoleaf==0.2.1
@@ -219,7 +219,7 @@ aiopurpleair==2022.12.1
aiopvapi==2.0.4
# homeassistant.components.pvpc_hourly_pricing
aiopvpc==4.0.1
aiopvpc==4.1.0
# homeassistant.components.lidarr
# homeassistant.components.radarr
@@ -352,7 +352,7 @@ base36==0.1.1
beautifulsoup4==4.11.1
# homeassistant.components.zha
bellows==0.34.9
bellows==0.34.10
# homeassistant.components.bmw_connected_drive
bimmer_connected==0.12.1
@@ -384,7 +384,7 @@ bluetooth-auto-recovery==1.0.3
bluetooth-data-tools==0.3.1
# homeassistant.components.bond
bond-async==0.1.22
bond-async==0.1.23
# homeassistant.components.bosch_shc
boschshcpy==0.2.35
@@ -490,7 +490,7 @@ dynalite_devices==0.1.47
eagle100==0.1.1
# homeassistant.components.easyenergy
easyenergy==0.1.2
easyenergy==0.2.2
# homeassistant.components.elgato
elgato==4.0.1
@@ -690,7 +690,7 @@ hole==0.8.0
holidays==0.18.0
# homeassistant.components.frontend
home-assistant-frontend==20230302.0
home-assistant-frontend==20230309.1
# homeassistant.components.conversation
home-assistant-intents==2023.2.28
@@ -738,7 +738,7 @@ influxdb==5.3.1
inkbird-ble==0.5.6
# homeassistant.components.insteon
insteon-frontend-home-assistant==0.3.2
insteon-frontend-home-assistant==0.3.3
# homeassistant.components.intellifire
intellifire4py==2.2.2
@@ -891,7 +891,7 @@ nextcord==2.0.0a8
nextdns==1.3.0
# homeassistant.components.nibe_heatpump
nibe==2.0.0
nibe==2.1.4
# homeassistant.components.nfandroidtv
notifications-android-tv==0.1.5
@@ -947,7 +947,7 @@ openai==0.26.2
openerz-api==0.2.0
# homeassistant.components.oralb
oralb-ble==0.17.5
oralb-ble==0.17.6
# homeassistant.components.ovo_energy
ovoenergy==1.2.0
@@ -1134,7 +1134,7 @@ pycoolmasternet-async==0.1.5
pydaikin==2.9.0
# homeassistant.components.deconz
pydeconz==108
pydeconz==110
# homeassistant.components.dexcom
pydexcom==0.2.3
@@ -1161,7 +1161,7 @@ pyevilgenius==2.0.0
pyezviz==0.2.0.9
# homeassistant.components.fibaro
pyfibaro==0.6.8
pyfibaro==0.6.9
# homeassistant.components.fido
pyfido==2.1.2
@@ -1212,7 +1212,7 @@ pyialarm==2.2.0
pyicloud==1.0.0
# homeassistant.components.insteon
pyinsteon==1.3.3
pyinsteon==1.3.4
# homeassistant.components.ipma
pyipma==3.0.6
@@ -1275,7 +1275,7 @@ pymailgunner==1.4
pymata-express==1.19
# homeassistant.components.mazda
pymazda==0.3.7
pymazda==0.3.8
# homeassistant.components.melcloud
pymelcloud==2.5.8
@@ -1600,7 +1600,7 @@ ring_doorbell==0.7.2
rokuecp==0.17.1
# homeassistant.components.roomba
roombapy==1.6.5
roombapy==1.6.6
# homeassistant.components.roon
roonapi==0.1.3
@@ -1698,7 +1698,7 @@ spotipy==2.22.1
# homeassistant.components.recorder
# homeassistant.components.sql
sqlalchemy==2.0.4
sqlalchemy==2.0.6
# homeassistant.components.srp_energy
srpenergy==1.3.6
@@ -1894,15 +1894,13 @@ xmltodict==0.13.0
# homeassistant.components.yale_smart_alarm
yalesmartalarmclient==0.3.9
# homeassistant.components.august
# homeassistant.components.yalexs_ble
yalexs-ble==2.0.4
yalexs-ble==2.1.1
# homeassistant.components.august
yalexs==1.2.7
# homeassistant.components.august
yalexs_ble==2.0.4
# homeassistant.components.yeelight
yeelight==0.7.10
@@ -1922,7 +1920,7 @@ zeroconf==0.47.3
zeversolar==0.3.1
# homeassistant.components.zha
zha-quirks==0.0.93
zha-quirks==0.0.94
# homeassistant.components.zha
zigpy-deconz==0.19.2

View File

@@ -349,6 +349,52 @@ async def test_api_template(hass: HomeAssistant, mock_api_client: TestClient) ->
assert body == "10"
hass.states.async_set("sensor.temperature", 20)
resp = await mock_api_client.post(
const.URL_API_TEMPLATE,
json={"template": "{{ states.sensor.temperature.state }}"},
)
body = await resp.text()
assert body == "20"
hass.states.async_remove("sensor.temperature")
resp = await mock_api_client.post(
const.URL_API_TEMPLATE,
json={"template": "{{ states.sensor.temperature.state }}"},
)
body = await resp.text()
assert body == ""
async def test_api_template_cached(
hass: HomeAssistant, mock_api_client: TestClient
) -> None:
"""Test the template API uses the cache."""
hass.states.async_set("sensor.temperature", 30)
resp = await mock_api_client.post(
const.URL_API_TEMPLATE,
json={"template": "{{ states.sensor.temperature.state }}"},
)
body = await resp.text()
assert body == "30"
hass.states.async_set("sensor.temperature", 40)
resp = await mock_api_client.post(
const.URL_API_TEMPLATE,
json={"template": "{{ states.sensor.temperature.state }}"},
)
body = await resp.text()
assert body == "40"
async def test_api_template_error(
hass: HomeAssistant, mock_api_client: TestClient

View File

@@ -1,5 +1,6 @@
"""Fixtures for Hass.io."""
import os
import re
from unittest.mock import Mock, patch
import pytest
@@ -12,6 +13,16 @@ from homeassistant.setup import async_setup_component
from . import SUPERVISOR_TOKEN
@pytest.fixture(autouse=True)
def disable_security_filter():
"""Disable the security filter to ensure the integration is secure."""
with patch(
"homeassistant.components.http.security_filter.FILTERS",
re.compile("not-matching-anything"),
):
yield
@pytest.fixture
def hassio_env():
"""Fixture to inject hassio env."""
@@ -37,6 +48,13 @@ def hassio_stubs(hassio_env, hass, hass_client, aioclient_mock):
), patch(
"homeassistant.components.hassio.HassIO.get_info",
side_effect=HassioAPIError(),
), patch(
"homeassistant.components.hassio.HassIO.get_ingress_panels",
return_value={"panels": []},
), patch(
"homeassistant.components.hassio.issues.SupervisorIssues.setup"
), patch(
"homeassistant.components.hassio.HassIO.refresh_updates"
):
hass.state = CoreState.starting
hass.loop.run_until_complete(async_setup_component(hass, "hassio", {}))
@@ -67,13 +85,7 @@ async def hassio_client_supervisor(hass, aiohttp_client, hassio_stubs):
@pytest.fixture
def hassio_handler(hass, aioclient_mock):
async def hassio_handler(hass, aioclient_mock):
"""Create mock hassio handler."""
async def get_client_session():
return async_get_clientsession(hass)
websession = hass.loop.run_until_complete(get_client_session())
with patch.dict(os.environ, {"SUPERVISOR_TOKEN": SUPERVISOR_TOKEN}):
yield HassIO(hass.loop, websession, "127.0.0.1")
yield HassIO(hass.loop, async_get_clientsession(hass), "127.0.0.1")

View File

@@ -1,13 +1,21 @@
"""The tests for the hassio component."""
from __future__ import annotations
from typing import Any, Literal
import aiohttp
from aiohttp import hdrs, web
import pytest
from homeassistant.components.hassio.handler import HassioAPIError
from homeassistant.components.hassio.handler import HassIO, HassioAPIError
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from tests.test_util.aiohttp import AiohttpClientMocker
async def test_api_ping(hassio_handler, aioclient_mock: AiohttpClientMocker) -> None:
async def test_api_ping(
hassio_handler: HassIO, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test setup with API ping."""
aioclient_mock.get("http://127.0.0.1/supervisor/ping", json={"result": "ok"})
@@ -16,7 +24,7 @@ async def test_api_ping(hassio_handler, aioclient_mock: AiohttpClientMocker) ->
async def test_api_ping_error(
hassio_handler, aioclient_mock: AiohttpClientMocker
hassio_handler: HassIO, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test setup with API ping error."""
aioclient_mock.get("http://127.0.0.1/supervisor/ping", json={"result": "error"})
@@ -26,7 +34,7 @@ async def test_api_ping_error(
async def test_api_ping_exeption(
hassio_handler, aioclient_mock: AiohttpClientMocker
hassio_handler: HassIO, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test setup with API ping exception."""
aioclient_mock.get("http://127.0.0.1/supervisor/ping", exc=aiohttp.ClientError())
@@ -35,7 +43,9 @@ async def test_api_ping_exeption(
assert aioclient_mock.call_count == 1
async def test_api_info(hassio_handler, aioclient_mock: AiohttpClientMocker) -> None:
async def test_api_info(
hassio_handler: HassIO, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test setup with API generic info."""
aioclient_mock.get(
"http://127.0.0.1/info",
@@ -53,7 +63,7 @@ async def test_api_info(hassio_handler, aioclient_mock: AiohttpClientMocker) ->
async def test_api_info_error(
hassio_handler, aioclient_mock: AiohttpClientMocker
hassio_handler: HassIO, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test setup with API Home Assistant info error."""
aioclient_mock.get(
@@ -67,7 +77,7 @@ async def test_api_info_error(
async def test_api_host_info(
hassio_handler, aioclient_mock: AiohttpClientMocker
hassio_handler: HassIO, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test setup with API Host info."""
aioclient_mock.get(
@@ -90,7 +100,7 @@ async def test_api_host_info(
async def test_api_supervisor_info(
hassio_handler, aioclient_mock: AiohttpClientMocker
hassio_handler: HassIO, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test setup with API Supervisor info."""
aioclient_mock.get(
@@ -108,7 +118,9 @@ async def test_api_supervisor_info(
assert data["channel"] == "stable"
async def test_api_os_info(hassio_handler, aioclient_mock: AiohttpClientMocker) -> None:
async def test_api_os_info(
hassio_handler: HassIO, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test setup with API OS info."""
aioclient_mock.get(
"http://127.0.0.1/os/info",
@@ -125,7 +137,7 @@ async def test_api_os_info(hassio_handler, aioclient_mock: AiohttpClientMocker)
async def test_api_host_info_error(
hassio_handler, aioclient_mock: AiohttpClientMocker
hassio_handler: HassIO, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test setup with API Home Assistant info error."""
aioclient_mock.get(
@@ -139,7 +151,7 @@ async def test_api_host_info_error(
async def test_api_core_info(
hassio_handler, aioclient_mock: AiohttpClientMocker
hassio_handler: HassIO, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test setup with API Home Assistant Core info."""
aioclient_mock.get(
@@ -153,7 +165,7 @@ async def test_api_core_info(
async def test_api_core_info_error(
hassio_handler, aioclient_mock: AiohttpClientMocker
hassio_handler: HassIO, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test setup with API Home Assistant Core info error."""
aioclient_mock.get(
@@ -167,7 +179,7 @@ async def test_api_core_info_error(
async def test_api_homeassistant_stop(
hassio_handler, aioclient_mock: AiohttpClientMocker
hassio_handler: HassIO, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test setup with API Home Assistant stop."""
aioclient_mock.post("http://127.0.0.1/homeassistant/stop", json={"result": "ok"})
@@ -177,7 +189,7 @@ async def test_api_homeassistant_stop(
async def test_api_homeassistant_restart(
hassio_handler, aioclient_mock: AiohttpClientMocker
hassio_handler: HassIO, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test setup with API Home Assistant restart."""
aioclient_mock.post("http://127.0.0.1/homeassistant/restart", json={"result": "ok"})
@@ -187,7 +199,7 @@ async def test_api_homeassistant_restart(
async def test_api_addon_info(
hassio_handler, aioclient_mock: AiohttpClientMocker
hassio_handler: HassIO, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test setup with API Add-on info."""
aioclient_mock.get(
@@ -201,7 +213,7 @@ async def test_api_addon_info(
async def test_api_addon_stats(
hassio_handler, aioclient_mock: AiohttpClientMocker
hassio_handler: HassIO, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test setup with API Add-on stats."""
aioclient_mock.get(
@@ -215,7 +227,7 @@ async def test_api_addon_stats(
async def test_api_discovery_message(
hassio_handler, aioclient_mock: AiohttpClientMocker
hassio_handler: HassIO, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test setup with API discovery message."""
aioclient_mock.get(
@@ -229,7 +241,7 @@ async def test_api_discovery_message(
async def test_api_retrieve_discovery(
hassio_handler, aioclient_mock: AiohttpClientMocker
hassio_handler: HassIO, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test setup with API discovery message."""
aioclient_mock.get(
@@ -243,7 +255,7 @@ async def test_api_retrieve_discovery(
async def test_api_ingress_panels(
hassio_handler, aioclient_mock: AiohttpClientMocker
hassio_handler: HassIO, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test setup with API Ingress panels."""
aioclient_mock.get(
@@ -267,3 +279,56 @@ async def test_api_ingress_panels(
assert aioclient_mock.call_count == 1
assert data["panels"]
assert "slug" in data["panels"]
@pytest.mark.parametrize(
("api_call", "method", "payload"),
[
["retrieve_discovery_messages", "GET", None],
["refresh_updates", "POST", None],
["update_diagnostics", "POST", True],
],
)
async def test_api_headers(
hass,
aiohttp_raw_server,
socket_enabled,
api_call: str,
method: Literal["GET", "POST"],
payload: Any,
) -> None:
"""Test headers are forwarded correctly."""
received_request = None
async def mock_handler(request):
"""Return OK."""
nonlocal received_request
received_request = request
return web.json_response({"result": "ok", "data": None})
server = await aiohttp_raw_server(mock_handler)
hassio_handler = HassIO(
hass.loop,
async_get_clientsession(hass),
f"{server.host}:{server.port}",
)
api_func = getattr(hassio_handler, api_call)
if payload:
await api_func(payload)
else:
await api_func()
assert received_request is not None
assert received_request.method == method
assert received_request.headers.get("X-Hass-Source") == "core.handler"
if method == "GET":
assert hdrs.CONTENT_TYPE not in received_request.headers
return
assert hdrs.CONTENT_TYPE in received_request.headers
if payload:
assert received_request.headers[hdrs.CONTENT_TYPE] == "application/json"
else:
assert received_request.headers[hdrs.CONTENT_TYPE] == "application/octet-stream"

View File

@@ -1,63 +1,45 @@
"""The tests for the hassio component."""
import asyncio
from http import HTTPStatus
from unittest.mock import patch
from aiohttp import StreamReader
import pytest
from homeassistant.components.hassio.http import _need_auth
from homeassistant.core import HomeAssistant
from tests.common import MockUser
from tests.test_util.aiohttp import AiohttpClientMocker
async def test_forward_request(
hassio_client, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test fetching normal path."""
aioclient_mock.post("http://127.0.0.1/beer", text="response")
@pytest.fixture
def mock_not_onboarded():
"""Mock that we're not onboarded."""
with patch(
"homeassistant.components.hassio.http.async_is_onboarded", return_value=False
):
yield
resp = await hassio_client.post("/api/hassio/beer")
# Check we got right response
assert resp.status == HTTPStatus.OK
body = await resp.text()
assert body == "response"
# Check we forwarded command
assert len(aioclient_mock.mock_calls) == 1
@pytest.fixture
def hassio_user_client(hassio_client, hass_admin_user):
"""Return a Hass.io HTTP client tied to a non-admin user."""
hass_admin_user.groups = []
return hassio_client
@pytest.mark.parametrize(
"build_type", ["supervisor/info", "homeassistant/update", "host/info"]
)
async def test_auth_required_forward_request(hassio_noauth_client, build_type) -> None:
"""Test auth required for normal request."""
resp = await hassio_noauth_client.post(f"/api/hassio/{build_type}")
# Check we got right response
assert resp.status == HTTPStatus.UNAUTHORIZED
@pytest.mark.parametrize(
"build_type",
"path",
[
"app/index.html",
"app/hassio-app.html",
"app/index.html",
"app/hassio-app.html",
"app/some-chunk.js",
"app/app.js",
"app/entrypoint.js",
"addons/bl_b392/logo",
"addons/bl_b392/icon",
],
)
async def test_forward_request_no_auth_for_panel(
hassio_client, build_type, aioclient_mock: AiohttpClientMocker
async def test_forward_request_onboarded_user_get(
hassio_user_client, aioclient_mock: AiohttpClientMocker, path: str
) -> None:
"""Test no auth needed for ."""
aioclient_mock.get(f"http://127.0.0.1/{build_type}", text="response")
"""Test fetching normal path."""
aioclient_mock.get(f"http://127.0.0.1/{path}", text="response")
resp = await hassio_client.get(f"/api/hassio/{build_type}")
resp = await hassio_user_client.get(f"/api/hassio/{path}")
# Check we got right response
assert resp.status == HTTPStatus.OK
@@ -66,15 +48,68 @@ async def test_forward_request_no_auth_for_panel(
# Check we forwarded command
assert len(aioclient_mock.mock_calls) == 1
# We only expect a single header.
assert aioclient_mock.mock_calls[0][3] == {"X-Hass-Source": "core.http"}
async def test_forward_request_no_auth_for_logo(
hassio_client, aioclient_mock: AiohttpClientMocker
@pytest.mark.parametrize("method", ["POST", "PUT", "DELETE", "RANDOM"])
async def test_forward_request_onboarded_user_unallowed_methods(
hassio_user_client, aioclient_mock: AiohttpClientMocker, method: str
) -> None:
"""Test no auth needed for logo."""
aioclient_mock.get("http://127.0.0.1/addons/bl_b392/logo", text="response")
"""Test fetching normal path."""
resp = await hassio_user_client.post("/api/hassio/app/entrypoint.js")
resp = await hassio_client.get("/api/hassio/addons/bl_b392/logo")
# Check we got right response
assert resp.status == HTTPStatus.METHOD_NOT_ALLOWED
# Check we did not forward command
assert len(aioclient_mock.mock_calls) == 0
@pytest.mark.parametrize(
("bad_path", "expected_status"),
[
# Caught by bullshit filter
("app/%252E./entrypoint.js", HTTPStatus.BAD_REQUEST),
# The .. is processed, making it an unauthenticated path
("app/../entrypoint.js", HTTPStatus.UNAUTHORIZED),
("app/%2E%2E/entrypoint.js", HTTPStatus.UNAUTHORIZED),
# Unauthenticated path
("supervisor/info", HTTPStatus.UNAUTHORIZED),
("supervisor/logs", HTTPStatus.UNAUTHORIZED),
("addons/bl_b392/logs", HTTPStatus.UNAUTHORIZED),
],
)
async def test_forward_request_onboarded_user_unallowed_paths(
hassio_user_client,
aioclient_mock: AiohttpClientMocker,
bad_path: str,
expected_status: int,
) -> None:
"""Test fetching normal path."""
resp = await hassio_user_client.get(f"/api/hassio/{bad_path}")
# Check we got right response
assert resp.status == expected_status
# Check we didn't forward command
assert len(aioclient_mock.mock_calls) == 0
@pytest.mark.parametrize(
"path",
[
"app/entrypoint.js",
"addons/bl_b392/logo",
"addons/bl_b392/icon",
],
)
async def test_forward_request_onboarded_noauth_get(
hassio_noauth_client, aioclient_mock: AiohttpClientMocker, path: str
) -> None:
"""Test fetching normal path."""
aioclient_mock.get(f"http://127.0.0.1/{path}", text="response")
resp = await hassio_noauth_client.get(f"/api/hassio/{path}")
# Check we got right response
assert resp.status == HTTPStatus.OK
@@ -83,15 +118,73 @@ async def test_forward_request_no_auth_for_logo(
# Check we forwarded command
assert len(aioclient_mock.mock_calls) == 1
# We only expect a single header.
assert aioclient_mock.mock_calls[0][3] == {"X-Hass-Source": "core.http"}
async def test_forward_request_no_auth_for_icon(
hassio_client, aioclient_mock: AiohttpClientMocker
@pytest.mark.parametrize("method", ["POST", "PUT", "DELETE", "RANDOM"])
async def test_forward_request_onboarded_noauth_unallowed_methods(
hassio_noauth_client, aioclient_mock: AiohttpClientMocker, method: str
) -> None:
"""Test no auth needed for icon."""
aioclient_mock.get("http://127.0.0.1/addons/bl_b392/icon", text="response")
"""Test fetching normal path."""
resp = await hassio_noauth_client.post("/api/hassio/app/entrypoint.js")
resp = await hassio_client.get("/api/hassio/addons/bl_b392/icon")
# Check we got right response
assert resp.status == HTTPStatus.METHOD_NOT_ALLOWED
# Check we did not forward command
assert len(aioclient_mock.mock_calls) == 0
@pytest.mark.parametrize(
("bad_path", "expected_status"),
[
# Caught by bullshit filter
("app/%252E./entrypoint.js", HTTPStatus.BAD_REQUEST),
# The .. is processed, making it an unauthenticated path
("app/../entrypoint.js", HTTPStatus.UNAUTHORIZED),
("app/%2E%2E/entrypoint.js", HTTPStatus.UNAUTHORIZED),
# Unauthenticated path
("supervisor/info", HTTPStatus.UNAUTHORIZED),
("supervisor/logs", HTTPStatus.UNAUTHORIZED),
("addons/bl_b392/logs", HTTPStatus.UNAUTHORIZED),
],
)
async def test_forward_request_onboarded_noauth_unallowed_paths(
hassio_noauth_client,
aioclient_mock: AiohttpClientMocker,
bad_path: str,
expected_status: int,
) -> None:
"""Test fetching normal path."""
resp = await hassio_noauth_client.get(f"/api/hassio/{bad_path}")
# Check we got right response
assert resp.status == expected_status
# Check we didn't forward command
assert len(aioclient_mock.mock_calls) == 0
@pytest.mark.parametrize(
("path", "authenticated"),
[
("app/entrypoint.js", False),
("addons/bl_b392/logo", False),
("addons/bl_b392/icon", False),
("backups/1234abcd/info", True),
],
)
async def test_forward_request_not_onboarded_get(
hassio_noauth_client,
aioclient_mock: AiohttpClientMocker,
path: str,
authenticated: bool,
mock_not_onboarded,
) -> None:
"""Test fetching normal path."""
aioclient_mock.get(f"http://127.0.0.1/{path}", text="response")
resp = await hassio_noauth_client.get(f"/api/hassio/{path}")
# Check we got right response
assert resp.status == HTTPStatus.OK
@@ -100,61 +193,226 @@ async def test_forward_request_no_auth_for_icon(
# Check we forwarded command
assert len(aioclient_mock.mock_calls) == 1
expected_headers = {
"X-Hass-Source": "core.http",
}
if authenticated:
expected_headers["Authorization"] = "Bearer 123456"
assert aioclient_mock.mock_calls[0][3] == expected_headers
-async def test_forward_log_request(
-    hassio_client, aioclient_mock: AiohttpClientMocker
@pytest.mark.parametrize(
"path",
[
"backups/new/upload",
"backups/1234abcd/restore/full",
"backups/1234abcd/restore/partial",
],
)
async def test_forward_request_not_onboarded_post(
hassio_noauth_client,
aioclient_mock: AiohttpClientMocker,
path: str,
mock_not_onboarded,
) -> None:
"""Test fetching normal log path doesn't remove ANSI color escape codes."""
aioclient_mock.get("http://127.0.0.1/beer/logs", text="\033[32mresponse\033[0m")
"""Test fetching normal path."""
aioclient_mock.get(f"http://127.0.0.1/{path}", text="response")
resp = await hassio_client.get("/api/hassio/beer/logs")
resp = await hassio_noauth_client.get(f"/api/hassio/{path}")
# Check we got right response
assert resp.status == HTTPStatus.OK
body = await resp.text()
assert body == "\033[32mresponse\033[0m"
assert body == "response"
# Check we forwarded command
assert len(aioclient_mock.mock_calls) == 1
# We only expect a single header.
assert aioclient_mock.mock_calls[0][3] == {
"X-Hass-Source": "core.http",
"Authorization": "Bearer 123456",
}
@pytest.mark.parametrize("method", ["POST", "PUT", "DELETE", "RANDOM"])
async def test_forward_request_not_onboarded_unallowed_methods(
hassio_noauth_client, aioclient_mock: AiohttpClientMocker, method: str
) -> None:
"""Test fetching normal path."""
resp = await hassio_noauth_client.post("/api/hassio/app/entrypoint.js")
# Check we got right response
assert resp.status == HTTPStatus.METHOD_NOT_ALLOWED
# Check we did not forward command
assert len(aioclient_mock.mock_calls) == 0
@pytest.mark.parametrize(
("bad_path", "expected_status"),
[
        # Caught by the suspicious-path filter
("app/%252E./entrypoint.js", HTTPStatus.BAD_REQUEST),
# The .. is processed, making it an unauthenticated path
("app/../entrypoint.js", HTTPStatus.UNAUTHORIZED),
("app/%2E%2E/entrypoint.js", HTTPStatus.UNAUTHORIZED),
# Unauthenticated path
("supervisor/info", HTTPStatus.UNAUTHORIZED),
("supervisor/logs", HTTPStatus.UNAUTHORIZED),
("addons/bl_b392/logs", HTTPStatus.UNAUTHORIZED),
],
)
async def test_forward_request_not_onboarded_unallowed_paths(
hassio_noauth_client,
aioclient_mock: AiohttpClientMocker,
bad_path: str,
expected_status: int,
mock_not_onboarded,
) -> None:
"""Test fetching normal path."""
resp = await hassio_noauth_client.get(f"/api/hassio/{bad_path}")
# Check we got right response
assert resp.status == expected_status
# Check we didn't forward command
assert len(aioclient_mock.mock_calls) == 0
@pytest.mark.parametrize(
("path", "authenticated"),
[
("app/entrypoint.js", False),
("addons/bl_b392/logo", False),
("addons/bl_b392/icon", False),
("backups/1234abcd/info", True),
("supervisor/logs", True),
("addons/bl_b392/logs", True),
("addons/bl_b392/changelog", True),
("addons/bl_b392/documentation", True),
],
)
async def test_forward_request_admin_get(
hassio_client,
aioclient_mock: AiohttpClientMocker,
path: str,
authenticated: bool,
) -> None:
"""Test fetching normal path."""
aioclient_mock.get(f"http://127.0.0.1/{path}", text="response")
resp = await hassio_client.get(f"/api/hassio/{path}")
# Check we got right response
assert resp.status == HTTPStatus.OK
body = await resp.text()
assert body == "response"
# Check we forwarded command
assert len(aioclient_mock.mock_calls) == 1
expected_headers = {
"X-Hass-Source": "core.http",
}
if authenticated:
expected_headers["Authorization"] = "Bearer 123456"
assert aioclient_mock.mock_calls[0][3] == expected_headers
@pytest.mark.parametrize(
"path",
[
"backups/new/upload",
"backups/1234abcd/restore/full",
"backups/1234abcd/restore/partial",
],
)
async def test_forward_request_admin_post(
hassio_client,
aioclient_mock: AiohttpClientMocker,
path: str,
) -> None:
"""Test fetching normal path."""
aioclient_mock.get(f"http://127.0.0.1/{path}", text="response")
resp = await hassio_client.get(f"/api/hassio/{path}")
# Check we got right response
assert resp.status == HTTPStatus.OK
body = await resp.text()
assert body == "response"
# Check we forwarded command
assert len(aioclient_mock.mock_calls) == 1
# We only expect a single header.
assert aioclient_mock.mock_calls[0][3] == {
"X-Hass-Source": "core.http",
"Authorization": "Bearer 123456",
}
@pytest.mark.parametrize("method", ["POST", "PUT", "DELETE", "RANDOM"])
async def test_forward_request_admin_unallowed_methods(
hassio_client, aioclient_mock: AiohttpClientMocker, method: str
) -> None:
"""Test fetching normal path."""
resp = await hassio_client.post("/api/hassio/app/entrypoint.js")
# Check we got right response
assert resp.status == HTTPStatus.METHOD_NOT_ALLOWED
# Check we did not forward command
assert len(aioclient_mock.mock_calls) == 0
@pytest.mark.parametrize(
("bad_path", "expected_status"),
[
        # Caught by the suspicious-path filter
("app/%252E./entrypoint.js", HTTPStatus.BAD_REQUEST),
# The .. is processed, making it an unauthenticated path
("app/../entrypoint.js", HTTPStatus.UNAUTHORIZED),
("app/%2E%2E/entrypoint.js", HTTPStatus.UNAUTHORIZED),
# Unauthenticated path
("supervisor/info", HTTPStatus.UNAUTHORIZED),
],
)
async def test_forward_request_admin_unallowed_paths(
hassio_client,
aioclient_mock: AiohttpClientMocker,
bad_path: str,
expected_status: int,
) -> None:
"""Test fetching normal path."""
resp = await hassio_client.get(f"/api/hassio/{bad_path}")
# Check we got right response
assert resp.status == expected_status
# Check we didn't forward command
assert len(aioclient_mock.mock_calls) == 0
async def test_bad_gateway_when_cannot_find_supervisor(
hassio_client, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test we get a bad gateway error if we can't find supervisor."""
aioclient_mock.get("http://127.0.0.1/addons/test/info", exc=asyncio.TimeoutError)
aioclient_mock.get("http://127.0.0.1/app/entrypoint.js", exc=asyncio.TimeoutError)
resp = await hassio_client.get("/api/hassio/addons/test/info")
resp = await hassio_client.get("/api/hassio/app/entrypoint.js")
assert resp.status == HTTPStatus.BAD_GATEWAY
async def test_forwarding_user_info(
hassio_client, hass_admin_user: MockUser, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test that we forward user info correctly."""
aioclient_mock.get("http://127.0.0.1/hello")
resp = await hassio_client.get("/api/hassio/hello")
# Check we got right response
assert resp.status == HTTPStatus.OK
assert len(aioclient_mock.mock_calls) == 1
req_headers = aioclient_mock.mock_calls[0][-1]
assert req_headers["X-Hass-User-ID"] == hass_admin_user.id
assert req_headers["X-Hass-Is-Admin"] == "1"
async def test_backup_upload_headers(
-    hassio_client, aioclient_mock: AiohttpClientMocker, caplog: pytest.LogCaptureFixture
+    hassio_client,
+    aioclient_mock: AiohttpClientMocker,
+    caplog: pytest.LogCaptureFixture,
+    mock_not_onboarded,
) -> None:
"""Test that we forward the full header for backup upload."""
content_type = "multipart/form-data; boundary='--webkit'"
aioclient_mock.get("http://127.0.0.1/backups/new/upload")
aioclient_mock.post("http://127.0.0.1/backups/new/upload")
resp = await hassio_client.get(
resp = await hassio_client.post(
"/api/hassio/backups/new/upload", headers={"Content-Type": content_type}
)
@@ -168,19 +426,19 @@ async def test_backup_upload_headers(
async def test_backup_download_headers(
-    hassio_client, aioclient_mock: AiohttpClientMocker
+    hassio_client, aioclient_mock: AiohttpClientMocker, mock_not_onboarded
) -> None:
"""Test that we forward the full header for backup download."""
content_disposition = "attachment; filename=test.tar"
aioclient_mock.get(
"http://127.0.0.1/backups/slug/download",
"http://127.0.0.1/backups/1234abcd/download",
headers={
"Content-Length": "50000000",
"Content-Disposition": content_disposition,
},
)
resp = await hassio_client.get("/api/hassio/backups/slug/download")
resp = await hassio_client.get("/api/hassio/backups/1234abcd/download")
# Check we got right response
assert resp.status == HTTPStatus.OK
@@ -190,21 +448,10 @@ async def test_backup_download_headers(
assert resp.headers["Content-Disposition"] == content_disposition
-def test_need_auth(hass: HomeAssistant) -> None:
-    """Test if the requested path needs authentication."""
-    assert not _need_auth(hass, "addons/test/logo")
-    assert _need_auth(hass, "backups/new/upload")
-    assert _need_auth(hass, "supervisor/logs")
-    hass.data["onboarding"] = False
-    assert not _need_auth(hass, "backups/new/upload")
-    assert not _need_auth(hass, "supervisor/logs")
async def test_stream(hassio_client, aioclient_mock: AiohttpClientMocker) -> None:
"""Verify that the request is a stream."""
aioclient_mock.get("http://127.0.0.1/test")
await hassio_client.get("/api/hassio/test", data="test")
aioclient_mock.get("http://127.0.0.1/app/entrypoint.js")
await hassio_client.get("/api/hassio/app/entrypoint.js", data="test")
assert isinstance(aioclient_mock.mock_calls[-1][2], StreamReader)

View File

@@ -21,7 +21,7 @@ from tests.test_util.aiohttp import AiohttpClientMocker
],
)
async def test_ingress_request_get(
-    hassio_client, build_type, aioclient_mock: AiohttpClientMocker
+    hassio_noauth_client, build_type, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test no auth needed for ."""
aioclient_mock.get(
@@ -29,7 +29,7 @@ async def test_ingress_request_get(
text="test",
)
-    resp = await hassio_client.get(
+    resp = await hassio_noauth_client.get(
f"/api/hassio_ingress/{build_type[0]}/{build_type[1]}",
headers={"X-Test-Header": "beer"},
)
@@ -41,7 +41,8 @@ async def test_ingress_request_get(
# Check we forwarded command
assert len(aioclient_mock.mock_calls) == 1
-    assert aioclient_mock.mock_calls[-1][3][X_AUTH_TOKEN] == "123456"
+    assert X_AUTH_TOKEN not in aioclient_mock.mock_calls[-1][3]
+    assert aioclient_mock.mock_calls[-1][3]["X-Hass-Source"] == "core.ingress"
assert (
aioclient_mock.mock_calls[-1][3]["X-Ingress-Path"]
== f"/api/hassio_ingress/{build_type[0]}"
@@ -63,7 +64,7 @@ async def test_ingress_request_get(
],
)
async def test_ingress_request_post(
-    hassio_client, build_type, aioclient_mock: AiohttpClientMocker
+    hassio_noauth_client, build_type, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test no auth needed for ."""
aioclient_mock.post(
@@ -71,7 +72,7 @@ async def test_ingress_request_post(
text="test",
)
-    resp = await hassio_client.post(
+    resp = await hassio_noauth_client.post(
f"/api/hassio_ingress/{build_type[0]}/{build_type[1]}",
headers={"X-Test-Header": "beer"},
)
@@ -83,7 +84,8 @@ async def test_ingress_request_post(
# Check we forwarded command
assert len(aioclient_mock.mock_calls) == 1
-    assert aioclient_mock.mock_calls[-1][3][X_AUTH_TOKEN] == "123456"
+    assert X_AUTH_TOKEN not in aioclient_mock.mock_calls[-1][3]
+    assert aioclient_mock.mock_calls[-1][3]["X-Hass-Source"] == "core.ingress"
assert (
aioclient_mock.mock_calls[-1][3]["X-Ingress-Path"]
== f"/api/hassio_ingress/{build_type[0]}"
@@ -105,7 +107,7 @@ async def test_ingress_request_post(
],
)
async def test_ingress_request_put(
-    hassio_client, build_type, aioclient_mock: AiohttpClientMocker
+    hassio_noauth_client, build_type, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test no auth needed for ."""
aioclient_mock.put(
@@ -113,7 +115,7 @@ async def test_ingress_request_put(
text="test",
)
-    resp = await hassio_client.put(
+    resp = await hassio_noauth_client.put(
f"/api/hassio_ingress/{build_type[0]}/{build_type[1]}",
headers={"X-Test-Header": "beer"},
)
@@ -125,7 +127,8 @@ async def test_ingress_request_put(
# Check we forwarded command
assert len(aioclient_mock.mock_calls) == 1
-    assert aioclient_mock.mock_calls[-1][3][X_AUTH_TOKEN] == "123456"
+    assert X_AUTH_TOKEN not in aioclient_mock.mock_calls[-1][3]
+    assert aioclient_mock.mock_calls[-1][3]["X-Hass-Source"] == "core.ingress"
assert (
aioclient_mock.mock_calls[-1][3]["X-Ingress-Path"]
== f"/api/hassio_ingress/{build_type[0]}"
@@ -147,7 +150,7 @@ async def test_ingress_request_put(
],
)
async def test_ingress_request_delete(
-    hassio_client, build_type, aioclient_mock: AiohttpClientMocker
+    hassio_noauth_client, build_type, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test no auth needed for ."""
aioclient_mock.delete(
@@ -155,7 +158,7 @@ async def test_ingress_request_delete(
text="test",
)
-    resp = await hassio_client.delete(
+    resp = await hassio_noauth_client.delete(
f"/api/hassio_ingress/{build_type[0]}/{build_type[1]}",
headers={"X-Test-Header": "beer"},
)
@@ -167,7 +170,8 @@ async def test_ingress_request_delete(
# Check we forwarded command
assert len(aioclient_mock.mock_calls) == 1
-    assert aioclient_mock.mock_calls[-1][3][X_AUTH_TOKEN] == "123456"
+    assert X_AUTH_TOKEN not in aioclient_mock.mock_calls[-1][3]
+    assert aioclient_mock.mock_calls[-1][3]["X-Hass-Source"] == "core.ingress"
assert (
aioclient_mock.mock_calls[-1][3]["X-Ingress-Path"]
== f"/api/hassio_ingress/{build_type[0]}"
@@ -189,7 +193,7 @@ async def test_ingress_request_delete(
],
)
async def test_ingress_request_patch(
-    hassio_client, build_type, aioclient_mock: AiohttpClientMocker
+    hassio_noauth_client, build_type, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test no auth needed for ."""
aioclient_mock.patch(
@@ -197,7 +201,7 @@ async def test_ingress_request_patch(
text="test",
)
-    resp = await hassio_client.patch(
+    resp = await hassio_noauth_client.patch(
f"/api/hassio_ingress/{build_type[0]}/{build_type[1]}",
headers={"X-Test-Header": "beer"},
)
@@ -209,7 +213,8 @@ async def test_ingress_request_patch(
# Check we forwarded command
assert len(aioclient_mock.mock_calls) == 1
-    assert aioclient_mock.mock_calls[-1][3][X_AUTH_TOKEN] == "123456"
+    assert X_AUTH_TOKEN not in aioclient_mock.mock_calls[-1][3]
+    assert aioclient_mock.mock_calls[-1][3]["X-Hass-Source"] == "core.ingress"
assert (
aioclient_mock.mock_calls[-1][3]["X-Ingress-Path"]
== f"/api/hassio_ingress/{build_type[0]}"
@@ -231,7 +236,7 @@ async def test_ingress_request_patch(
],
)
async def test_ingress_request_options(
-    hassio_client, build_type, aioclient_mock: AiohttpClientMocker
+    hassio_noauth_client, build_type, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test no auth needed for ."""
aioclient_mock.options(
@@ -239,7 +244,7 @@ async def test_ingress_request_options(
text="test",
)
-    resp = await hassio_client.options(
+    resp = await hassio_noauth_client.options(
f"/api/hassio_ingress/{build_type[0]}/{build_type[1]}",
headers={"X-Test-Header": "beer"},
)
@@ -251,7 +256,8 @@ async def test_ingress_request_options(
# Check we forwarded command
assert len(aioclient_mock.mock_calls) == 1
-    assert aioclient_mock.mock_calls[-1][3][X_AUTH_TOKEN] == "123456"
+    assert X_AUTH_TOKEN not in aioclient_mock.mock_calls[-1][3]
+    assert aioclient_mock.mock_calls[-1][3]["X-Hass-Source"] == "core.ingress"
assert (
aioclient_mock.mock_calls[-1][3]["X-Ingress-Path"]
== f"/api/hassio_ingress/{build_type[0]}"
@@ -273,20 +279,21 @@ async def test_ingress_request_options(
],
)
async def test_ingress_websocket(
-    hassio_client, build_type, aioclient_mock: AiohttpClientMocker
+    hassio_noauth_client, build_type, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test no auth needed for ."""
aioclient_mock.get(f"http://127.0.0.1/ingress/{build_type[0]}/{build_type[1]}")
    # Ignore error because we cannot set up a full IO infrastructure
-    await hassio_client.ws_connect(
+    await hassio_noauth_client.ws_connect(
f"/api/hassio_ingress/{build_type[0]}/{build_type[1]}",
headers={"X-Test-Header": "beer"},
)
# Check we forwarded command
assert len(aioclient_mock.mock_calls) == 1
-    assert aioclient_mock.mock_calls[-1][3][X_AUTH_TOKEN] == "123456"
+    assert X_AUTH_TOKEN not in aioclient_mock.mock_calls[-1][3]
+    assert aioclient_mock.mock_calls[-1][3]["X-Hass-Source"] == "core.ingress"
assert (
aioclient_mock.mock_calls[-1][3]["X-Ingress-Path"]
== f"/api/hassio_ingress/{build_type[0]}"
@@ -298,7 +305,9 @@ async def test_ingress_websocket(
async def test_ingress_missing_peername(
-    hassio_client, aioclient_mock: AiohttpClientMocker, caplog: pytest.LogCaptureFixture
+    hassio_noauth_client,
+    aioclient_mock: AiohttpClientMocker,
+    caplog: pytest.LogCaptureFixture,
) -> None:
"""Test hadnling of missing peername."""
aioclient_mock.get(
@@ -314,7 +323,7 @@ async def test_ingress_missing_peername(
return_value=MagicMock(),
) as transport_mock:
transport_mock.get_extra_info = get_extra_info
-        resp = await hassio_client.get(
+        resp = await hassio_noauth_client.get(
"/api/hassio_ingress/lorem/ipsum",
headers={"X-Test-Header": "beer"},
)
@@ -323,3 +332,19 @@ async def test_ingress_missing_peername(
# Check we got right response
assert resp.status == HTTPStatus.BAD_REQUEST
async def test_forwarding_paths_as_requested(
    hassio_noauth_client, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test incomnig URLs with double encoding go out as dobule encoded."""
# This double encoded string should be forwarded double-encoded too.
aioclient_mock.get(
"http://127.0.0.1/ingress/mock-token/hello/%252e./world",
text="test",
)
resp = await hassio_noauth_client.get(
"/api/hassio_ingress/mock-token/hello/%252e./world",
)
assert await resp.text() == "test"

View File

@@ -1,4 +1,4 @@
"""Test repairs from supervisor issues."""
"""Test issues from supervisor issues."""
from __future__ import annotations
import os
@@ -145,12 +145,12 @@ def assert_repair_in_list(issues: list[dict[str, Any]], unhealthy: bool, reason:
} in issues
-async def test_unhealthy_repairs(
+async def test_unhealthy_issues(
hass: HomeAssistant,
aioclient_mock: AiohttpClientMocker,
hass_ws_client: WebSocketGenerator,
) -> None:
"""Test repairs added for unhealthy systems."""
"""Test issues added for unhealthy systems."""
mock_resolution_info(aioclient_mock, unhealthy=["docker", "setup"])
result = await async_setup_component(hass, "hassio", {})
@@ -166,12 +166,12 @@ async def test_unhealthy_repairs(
assert_repair_in_list(msg["result"]["issues"], unhealthy=True, reason="setup")
-async def test_unsupported_repairs(
+async def test_unsupported_issues(
hass: HomeAssistant,
aioclient_mock: AiohttpClientMocker,
hass_ws_client: WebSocketGenerator,
) -> None:
"""Test repairs added for unsupported systems."""
"""Test issues added for unsupported systems."""
mock_resolution_info(aioclient_mock, unsupported=["content_trust", "os"])
result = await async_setup_component(hass, "hassio", {})
@@ -189,12 +189,12 @@ async def test_unsupported_repairs(
assert_repair_in_list(msg["result"]["issues"], unhealthy=False, reason="os")
-async def test_unhealthy_repairs_add_remove(
+async def test_unhealthy_issues_add_remove(
hass: HomeAssistant,
aioclient_mock: AiohttpClientMocker,
hass_ws_client: WebSocketGenerator,
) -> None:
"""Test unhealthy repairs added and removed from dispatches."""
"""Test unhealthy issues added and removed from dispatches."""
mock_resolution_info(aioclient_mock)
result = await async_setup_component(hass, "hassio", {})
@@ -245,12 +245,12 @@ async def test_unhealthy_repairs_add_remove(
assert msg["result"] == {"issues": []}
-async def test_unsupported_repairs_add_remove(
+async def test_unsupported_issues_add_remove(
hass: HomeAssistant,
aioclient_mock: AiohttpClientMocker,
hass_ws_client: WebSocketGenerator,
) -> None:
"""Test unsupported repairs added and removed from dispatches."""
"""Test unsupported issues added and removed from dispatches."""
mock_resolution_info(aioclient_mock)
result = await async_setup_component(hass, "hassio", {})
@@ -301,12 +301,12 @@ async def test_unsupported_repairs_add_remove(
assert msg["result"] == {"issues": []}
-async def test_reset_repairs_supervisor_restart(
+async def test_reset_issues_supervisor_restart(
hass: HomeAssistant,
aioclient_mock: AiohttpClientMocker,
hass_ws_client: WebSocketGenerator,
) -> None:
"""Unsupported/unhealthy repairs reset on supervisor restart."""
"""Unsupported/unhealthy issues reset on supervisor restart."""
mock_resolution_info(aioclient_mock, unsupported=["os"], unhealthy=["docker"])
result = await async_setup_component(hass, "hassio", {})

View File

@@ -153,6 +153,11 @@ async def test_websocket_supervisor_api(
msg = await websocket_client.receive_json()
assert msg["result"]["version_latest"] == "1.0.0"
assert aioclient_mock.mock_calls[-1][3] == {
"X-Hass-Source": "core.websocket_api",
"Authorization": "Bearer 123456",
}
async def test_websocket_supervisor_api_error(
hassio_env,

View File

@@ -352,6 +352,12 @@ async def test_auth_access_signed_path_with_query_param(
data = await req.json()
assert data["user_id"] == refresh_token.user.id
# Without query params not allowed
url = yarl.URL(signed_path)
signed_path = f"{url.path}?{SIGN_QUERY_PARAM}={url.query.get(SIGN_QUERY_PARAM)}"
req = await client.get(signed_path)
assert req.status == HTTPStatus.UNAUTHORIZED
async def test_auth_access_signed_path_with_query_param_order(
hass: HomeAssistant,
@@ -374,12 +380,24 @@ async def test_auth_access_signed_path_with_query_param_order(
refresh_token_id=refresh_token.id,
)
url = yarl.URL(signed_path)
signed_path = f"{url.path}?{SIGN_QUERY_PARAM}={url.query.get(SIGN_QUERY_PARAM)}&foo=bar&test=test"
req = await client.get(signed_path)
assert req.status == HTTPStatus.OK
data = await req.json()
assert data["user_id"] == refresh_token.user.id
# Change order
req = await client.get(
f"{url.path}?{SIGN_QUERY_PARAM}={url.query.get(SIGN_QUERY_PARAM)}&foo=bar&test=test"
)
assert req.status == HTTPStatus.UNAUTHORIZED
# Duplicate a param
req = await client.get(
f"{url.path}?{SIGN_QUERY_PARAM}={url.query.get(SIGN_QUERY_PARAM)}&test=test&foo=aaa&foo=bar"
)
assert req.status == HTTPStatus.UNAUTHORIZED
# Remove a param
req = await client.get(
f"{url.path}?{SIGN_QUERY_PARAM}={url.query.get(SIGN_QUERY_PARAM)}&test=test"
)
assert req.status == HTTPStatus.UNAUTHORIZED
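The three rejected requests above (reordered, duplicated, and removed parameters) all follow from signing the query string exactly as ordered, rather than as a parsed set of parameters. A minimal sketch of the idea, with an illustrative secret and helper names; this is not the actual Home Assistant signing code, which carries the signature in its own query parameter:

import hashlib
import hmac

SECRET = b"illustrative-signing-secret"

def sign(path: str, query: str) -> str:
    """Sign the path plus the literal, ordered query string."""
    return hmac.new(SECRET, f"{path}?{query}".encode(), hashlib.sha256).hexdigest()

def verify(path: str, query: str, signature: str) -> bool:
    return hmac.compare_digest(sign(path, query), signature)

sig = sign("/api/states", "foo=bar&test=test")
assert verify("/api/states", "foo=bar&test=test", sig)
assert not verify("/api/states", "test=test&foo=bar", sig)  # reordered
assert not verify("/api/states", "test=test&foo=aaa&foo=bar", sig)  # duplicated
assert not verify("/api/states", "test=test", sig)  # removed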
async def test_auth_access_signed_path_with_query_param_safe_param(

View File

@@ -14,9 +14,16 @@ from homeassistant.helpers.template import Template
class FakeEMailReader:
"""A test class for sending test emails."""
-    def __init__(self, messages):
+    def __init__(self, messages) -> None:
"""Set up the fake email reader."""
self._messages = messages
self.last_id = 0
self.last_unread_id = len(messages)
def add_test_message(self, message):
"""Add a new message."""
self.last_unread_id += 1
self._messages.append(message)
def connect(self):
"""Stay always Connected."""
@@ -26,6 +33,7 @@ class FakeEMailReader:
"""Get the next email."""
if len(self._messages) == 0:
return None
self.last_id += 1
return self._messages.popleft()
@@ -146,7 +154,7 @@ async def test_multi_part_only_other_text(hass: HomeAssistant) -> None:
async def test_multiple_emails(hass: HomeAssistant) -> None:
"""Test multiple emails."""
"""Test multiple emails, discarding stale states."""
states = []
test_message1 = email.message.Message()
@@ -158,9 +166,15 @@ async def test_multiple_emails(hass: HomeAssistant) -> None:
test_message2 = email.message.Message()
test_message2["From"] = "sender@test.com"
test_message2["Subject"] = "Test 2"
test_message2["Date"] = datetime.datetime(2016, 1, 1, 12, 44, 57)
test_message2["Date"] = datetime.datetime(2016, 1, 1, 12, 44, 58)
test_message2.set_payload("Test Message 2")
test_message3 = email.message.Message()
test_message3["From"] = "sender@test.com"
test_message3["Subject"] = "Test 3"
test_message3["Date"] = datetime.datetime(2016, 1, 1, 12, 50, 1)
test_message3.set_payload("Test Message 2")
def state_changed_listener(entity_id, from_s, to_s):
states.append(to_s)
@@ -178,11 +192,13 @@ async def test_multiple_emails(hass: HomeAssistant) -> None:
sensor.async_schedule_update_ha_state(True)
await hass.async_block_till_done()
# Fake a new received message
sensor._email_reader.add_test_message(test_message3)
sensor.async_schedule_update_ha_state(True)
await hass.async_block_till_done()
assert states[0].state == "Test"
assert states[1].state == "Test 2"
assert states[0].state == "Test 2"
assert states[1].state == "Test 3"
assert sensor.extra_state_attributes["body"] == "Test Message 2"

View File

@@ -378,7 +378,7 @@ async def test_duplicate_entity_validator(do_config) -> None:
CONF_TYPE: SERIAL,
CONF_BAUDRATE: 9600,
CONF_BYTESIZE: 8,
CONF_METHOD: "rtu",
CONF_METHOD: "ascii",
CONF_PORT: TEST_PORT_SERIAL,
CONF_PARITY: "E",
CONF_STOPBITS: 1,

View File

@@ -636,8 +636,8 @@ async def test_brightness_from_rgb_controlling_scale(
}
},
)
    mqtt_mock = await mqtt_mock_entry_with_yaml_config()
-    await hass.async_block_till_done()
state = hass.states.get("light.test")
assert state.state == STATE_UNKNOWN
@@ -650,10 +650,29 @@ async def test_brightness_from_rgb_controlling_scale(
state = hass.states.get("light.test")
assert state.attributes.get("brightness") == 255
async_fire_mqtt_message(hass, "test_scale_rgb/rgb/status", "127,0,0")
async_fire_mqtt_message(hass, "test_scale_rgb/rgb/status", "128,64,32")
state = hass.states.get("light.test")
assert state.attributes.get("brightness") == 127
assert state.attributes.get("brightness") == 128
assert state.attributes.get("rgb_color") == (255, 128, 64)
mqtt_mock.async_publish.reset_mock()
await common.async_turn_on(hass, "light.test", brightness=191)
await hass.async_block_till_done()
mqtt_mock.async_publish.assert_has_calls(
[
call("test_scale_rgb/set", "on", 0, False),
call("test_scale_rgb/rgb/set", "191,95,47", 0, False),
],
any_order=True,
)
async_fire_mqtt_message(hass, "test_scale_rgb/rgb/status", "191,95,47")
await hass.async_block_till_done()
state = hass.states.get("light.test")
assert state.attributes.get("brightness") == 191
assert state.attributes.get("rgb_color") == (255, 127, 63)
async def test_controlling_state_via_topic_with_templates(

View File

@@ -135,6 +135,8 @@ async def test_setting_sensor_value_via_mqtt_message(
False,
),
(sensor.SensorDeviceClass.TIMESTAMP, "invalid", STATE_UNKNOWN, True),
(sensor.SensorDeviceClass.ENUM, "some_value", "some_value", False),
(None, "some_value", "some_value", False),
],
)
async def test_setting_sensor_native_value_handling_via_mqtt_message(

View File

@@ -209,6 +209,27 @@ def test_significant_states_with_session_entity_minimal_response_no_matches(
)
def test_significant_states_with_session_single_entity(
hass_recorder: Callable[..., HomeAssistant],
) -> None:
"""Test get_significant_states_with_session with a single entity."""
hass = hass_recorder()
hass.states.set("demo.id", "any", {"attr": True})
hass.states.set("demo.id", "any2", {"attr": True})
wait_recording_done(hass)
now = dt_util.utcnow()
with session_scope(hass=hass) as session:
states = history.get_significant_states_with_session(
hass,
session,
now - timedelta(days=1),
now,
entity_ids=["demo.id"],
minimal_response=False,
)
assert len(states["demo.id"]) == 2
@pytest.mark.parametrize(
("attributes", "no_attributes", "limit"),
[

View File

@@ -69,7 +69,7 @@ async def test_schema_update_calls(recorder_db_url: str, hass: HomeAssistant) ->
session_maker = instance.get_session
update.assert_has_calls(
[
-            call(hass, engine, session_maker, version + 1, 0)
+            call(instance, hass, engine, session_maker, version + 1, 0)
for version in range(0, db_schema.SCHEMA_VERSION)
]
)
@@ -304,6 +304,8 @@ async def test_schema_migrate(
migration_version = None
real_migrate_schema = recorder.migration.migrate_schema
real_apply_update = recorder.migration._apply_update
real_create_index = recorder.migration._create_index
create_calls = 0
def _create_engine_test(*args, **kwargs):
"""Test version of create_engine that initializes with old schema.
@@ -355,6 +357,17 @@ async def test_schema_migrate(
migration_stall.wait()
real_apply_update(*args)
def _sometimes_failing_create_index(*args):
"""Make the first index create raise a retryable error to ensure we retry."""
if recorder_db_url.startswith("mysql://"):
nonlocal create_calls
if create_calls < 1:
create_calls += 1
mysql_exception = OperationalError("statement", {}, [])
mysql_exception.orig = Exception(1205, "retryable")
raise mysql_exception
real_create_index(*args)
with patch("homeassistant.components.recorder.ALLOW_IN_MEMORY_DB", True), patch(
"homeassistant.components.recorder.core.create_engine",
new=_create_engine_test,
@@ -368,6 +381,11 @@ async def test_schema_migrate(
), patch(
"homeassistant.components.recorder.migration._apply_update",
wraps=_instrument_apply_update,
) as apply_update_mock, patch(
"homeassistant.components.recorder.util.time.sleep"
), patch(
"homeassistant.components.recorder.migration._create_index",
wraps=_sometimes_failing_create_index,
), patch(
"homeassistant.components.recorder.Recorder._schedule_compile_missing_statistics",
), patch(
@@ -394,12 +412,13 @@ async def test_schema_migrate(
assert migration_version == db_schema.SCHEMA_VERSION
assert setup_run.called
assert recorder.util.async_migration_in_progress(hass) is not True
assert apply_update_mock.called
def test_invalid_update(hass: HomeAssistant) -> None:
"""Test that an invalid new version raises an exception."""
with pytest.raises(ValueError):
-        migration._apply_update(hass, Mock(), Mock(), -1, 0)
+        migration._apply_update(Mock(), hass, Mock(), Mock(), -1, 0)
@pytest.mark.parametrize(

View File

@@ -2,7 +2,7 @@
from datetime import datetime, timedelta
import json
import sqlite3
-from unittest.mock import MagicMock, patch
+from unittest.mock import patch
import pytest
from sqlalchemy.exc import DatabaseError, OperationalError
@@ -192,7 +192,7 @@ async def test_purge_old_states_encounters_temporary_mysql_error(
await async_wait_recording_done(hass)
mysql_exception = OperationalError("statement", {}, [])
-    mysql_exception.orig = MagicMock(args=(1205, "retryable"))
+    mysql_exception.orig = Exception(1205, "retryable")
with patch(
"homeassistant.components.recorder.util.time.sleep"

View File

@@ -8,7 +8,7 @@ import sys
from unittest.mock import ANY, DEFAULT, MagicMock, patch, sentinel
import pytest
-from sqlalchemy import create_engine
+from sqlalchemy import create_engine, select
from sqlalchemy.exc import OperationalError
from sqlalchemy.orm import Session
@@ -22,6 +22,10 @@ from homeassistant.components.recorder.models import (
)
from homeassistant.components.recorder.statistics import (
STATISTIC_UNIT_TO_UNIT_CONVERTER,
_generate_get_metadata_stmt,
_generate_max_mean_min_statistic_in_sub_period_stmt,
_generate_statistics_at_time_stmt,
_generate_statistics_during_period_stmt,
_statistics_during_period_with_session,
_update_or_add_metadata,
async_add_external_statistics,
@@ -1231,8 +1235,9 @@ def test_delete_duplicates_no_duplicates(
"""Test removal of duplicated statistics."""
hass = hass_recorder()
wait_recording_done(hass)
instance = recorder.get_instance(hass)
with session_scope(hass=hass) as session:
-        delete_statistics_duplicates(hass, session)
+        delete_statistics_duplicates(instance, hass, session)
assert "duplicated statistics rows" not in caplog.text
assert "Found non identical" not in caplog.text
assert "Found duplicated" not in caplog.text
@@ -1798,3 +1803,100 @@ def record_states(hass):
states[sns4].append(set_state(sns4, "20", attributes=sns4_attr))
return zero, four, states
def test_cache_key_for_generate_statistics_during_period_stmt():
"""Test cache key for _generate_statistics_during_period_stmt."""
columns = select(StatisticsShortTerm.metadata_id, StatisticsShortTerm.start_ts)
stmt = _generate_statistics_during_period_stmt(
columns, dt_util.utcnow(), dt_util.utcnow(), [0], StatisticsShortTerm, {}
)
cache_key_1 = stmt._generate_cache_key()
stmt2 = _generate_statistics_during_period_stmt(
columns, dt_util.utcnow(), dt_util.utcnow(), [0], StatisticsShortTerm, {}
)
cache_key_2 = stmt2._generate_cache_key()
assert cache_key_1 == cache_key_2
columns2 = select(
StatisticsShortTerm.metadata_id,
StatisticsShortTerm.start_ts,
StatisticsShortTerm.sum,
StatisticsShortTerm.mean,
)
stmt3 = _generate_statistics_during_period_stmt(
columns2,
dt_util.utcnow(),
dt_util.utcnow(),
[0],
StatisticsShortTerm,
{"max", "mean"},
)
cache_key_3 = stmt3._generate_cache_key()
assert cache_key_1 != cache_key_3
def test_cache_key_for_generate_get_metadata_stmt():
"""Test cache key for _generate_get_metadata_stmt."""
stmt_mean = _generate_get_metadata_stmt([0], "mean")
stmt_mean2 = _generate_get_metadata_stmt([1], "mean")
stmt_sum = _generate_get_metadata_stmt([0], "sum")
stmt_none = _generate_get_metadata_stmt()
assert stmt_mean._generate_cache_key() == stmt_mean2._generate_cache_key()
assert stmt_mean._generate_cache_key() != stmt_sum._generate_cache_key()
assert stmt_mean._generate_cache_key() != stmt_none._generate_cache_key()
def test_cache_key_for_generate_max_mean_min_statistic_in_sub_period_stmt():
"""Test cache key for _generate_max_mean_min_statistic_in_sub_period_stmt."""
columns = select(StatisticsShortTerm.metadata_id, StatisticsShortTerm.start_ts)
stmt = _generate_max_mean_min_statistic_in_sub_period_stmt(
columns,
dt_util.utcnow(),
dt_util.utcnow(),
StatisticsShortTerm,
[0],
)
cache_key_1 = stmt._generate_cache_key()
stmt2 = _generate_max_mean_min_statistic_in_sub_period_stmt(
columns,
dt_util.utcnow(),
dt_util.utcnow(),
StatisticsShortTerm,
[0],
)
cache_key_2 = stmt2._generate_cache_key()
assert cache_key_1 == cache_key_2
columns2 = select(
StatisticsShortTerm.metadata_id,
StatisticsShortTerm.start_ts,
StatisticsShortTerm.sum,
StatisticsShortTerm.mean,
)
stmt3 = _generate_max_mean_min_statistic_in_sub_period_stmt(
columns2,
dt_util.utcnow(),
dt_util.utcnow(),
StatisticsShortTerm,
[0],
)
cache_key_3 = stmt3._generate_cache_key()
assert cache_key_1 != cache_key_3
def test_cache_key_for_generate_statistics_at_time_stmt():
"""Test cache key for _generate_statistics_at_time_stmt."""
columns = select(StatisticsShortTerm.metadata_id, StatisticsShortTerm.start_ts)
stmt = _generate_statistics_at_time_stmt(columns, StatisticsShortTerm, {0}, 0.0)
cache_key_1 = stmt._generate_cache_key()
stmt2 = _generate_statistics_at_time_stmt(columns, StatisticsShortTerm, {0}, 0.0)
cache_key_2 = stmt2._generate_cache_key()
assert cache_key_1 == cache_key_2
columns2 = select(
StatisticsShortTerm.metadata_id,
StatisticsShortTerm.start_ts,
StatisticsShortTerm.sum,
StatisticsShortTerm.mean,
)
stmt3 = _generate_statistics_at_time_stmt(columns2, StatisticsShortTerm, {0}, 0.0)
cache_key_3 = stmt3._generate_cache_key()
assert cache_key_1 != cache_key_3

View File

@@ -133,9 +133,7 @@ class MockNeighbour:
@pytest.fixture
def ndb() -> Mock:
"""Prevent NDB poking the OS route tables."""
-    with patch(
-        "homeassistant.components.thread.diagnostics.NDB"
-    ) as ndb, ndb() as instance:
+    with patch("pyroute2.NDB") as ndb, ndb() as instance:
instance.neighbours = []
instance.routes = []
yield instance

View File

@@ -22,14 +22,14 @@ from homeassistant.data_entry_flow import FlowResultType
from tests.common import MockConfigEntry
MOCK_USER_DATA = {
"host": "1.1.1.1",
"host": "https://fake.omada.host",
"verify_ssl": True,
"username": "test-username",
"password": "test-password",
}
MOCK_ENTRY_DATA = {
"host": "1.1.1.1",
"host": "https://fake.omada.host",
"verify_ssl": True,
"site": "SiteId",
"username": "test-username",
@@ -111,7 +111,7 @@ async def test_form_multiple_sites(hass: HomeAssistant) -> None:
assert result3["type"] == FlowResultType.CREATE_ENTRY
assert result3["title"] == "OC200 (Site 2)"
assert result3["data"] == {
"host": "1.1.1.1",
"host": "https://fake.omada.host",
"verify_ssl": True,
"site": "second",
"username": "test-username",
@@ -272,7 +272,7 @@ async def test_async_step_reauth_success(hass: HomeAssistant) -> None:
mocked_validate.assert_called_once_with(
hass,
{
"host": "1.1.1.1",
"host": "https://fake.omada.host",
"verify_ssl": True,
"site": "SiteId",
"username": "new_uname",
@@ -353,6 +353,64 @@ async def test_create_omada_client_parses_args(hass: HomeAssistant) -> None:
assert result is not None
mock_client.assert_called_once_with(
"1.1.1.1", "test-username", "test-password", "ws"
"https://fake.omada.host", "test-username", "test-password", "ws"
)
mock_clientsession.assert_called_once_with(hass, verify_ssl=True)
async def test_create_omada_client_adds_missing_scheme(hass: HomeAssistant) -> None:
"""Test config arguments are passed to Omada client."""
with patch(
"homeassistant.components.tplink_omada.config_flow.OmadaClient", autospec=True
) as mock_client, patch(
"homeassistant.components.tplink_omada.config_flow.async_get_clientsession",
return_value="ws",
) as mock_clientsession:
result = await create_omada_client(
hass,
{
"host": "fake.omada.host",
"verify_ssl": True,
"username": "test-username",
"password": "test-password",
},
)
assert result is not None
mock_client.assert_called_once_with(
"https://fake.omada.host", "test-username", "test-password", "ws"
)
mock_clientsession.assert_called_once_with(hass, verify_ssl=True)
async def test_create_omada_client_with_ip_creates_clientsession(
hass: HomeAssistant,
) -> None:
"""Test config arguments are passed to Omada client."""
with patch(
"homeassistant.components.tplink_omada.config_flow.OmadaClient", autospec=True
) as mock_client, patch(
"homeassistant.components.tplink_omada.config_flow.CookieJar", autospec=True
) as mock_jar, patch(
"homeassistant.components.tplink_omada.config_flow.async_create_clientsession",
return_value="ws",
) as mock_create_clientsession:
result = await create_omada_client(
hass,
{
"host": "10.10.10.10",
"verify_ssl": True, # Verify is meaningless for IP
"username": "test-username",
"password": "test-password",
},
)
assert result is not None
mock_client.assert_called_once_with(
"https://10.10.10.10", "test-username", "test-password", "ws"
)
mock_create_clientsession.assert_called_once_with(
hass, cookie_jar=mock_jar.return_value
)

View File

@@ -1,9 +1,10 @@
"""Tests for ZHA integration init."""
-from unittest.mock import AsyncMock, patch
+from unittest.mock import AsyncMock, Mock, patch
import pytest
from zigpy.config import CONF_DEVICE, CONF_DEVICE_PATH
from homeassistant.components.zha import async_setup_entry
from homeassistant.components.zha.core.const import (
CONF_BAUDRATE,
CONF_RADIO_TYPE,
@@ -108,3 +109,41 @@ async def test_config_depreciation(hass: HomeAssistant, zha_config) -> None:
) as setup_mock:
assert await async_setup_component(hass, DOMAIN, {DOMAIN: zha_config})
assert setup_mock.call_count == 1
@pytest.mark.parametrize(
("path", "cleaned_path"),
[
("/dev/path1", "/dev/path1"),
("/dev/path1 ", "/dev/path1 "),
("socket://dev/path1 ", "socket://dev/path1"),
],
)
@patch("homeassistant.components.zha.setup_quirks", Mock(return_value=True))
@patch("homeassistant.components.zha.api.async_load_api", Mock(return_value=True))
async def test_setup_with_v3_spaces_in_uri(
hass: HomeAssistant, path: str, cleaned_path: str
) -> None:
"""Test migration of config entry from v3 with spaces after `socket://` URI."""
config_entry_v3 = MockConfigEntry(
domain=DOMAIN,
data={
CONF_RADIO_TYPE: DATA_RADIO_TYPE,
CONF_DEVICE: {CONF_DEVICE_PATH: path, CONF_BAUDRATE: 115200},
},
version=3,
)
config_entry_v3.add_to_hass(hass)
with patch(
"homeassistant.components.zha.ZHAGateway", return_value=AsyncMock()
) as mock_gateway:
mock_gateway.return_value.coordinator_ieee = "mock_ieee"
mock_gateway.return_value.radio_description = "mock_radio"
assert await async_setup_entry(hass, config_entry_v3)
hass.data[DOMAIN]["zha_gateway"] = mock_gateway.return_value
assert config_entry_v3.data[CONF_RADIO_TYPE] == DATA_RADIO_TYPE
assert config_entry_v3.data[CONF_DEVICE][CONF_DEVICE_PATH] == cleaned_path
assert config_entry_v3.version == 3
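The parametrization shows the cleanup rule being exercised: trailing whitespace is stripped from socket:// URIs, where it breaks the connection, but preserved verbatim for raw device paths. A sketch of that rule (hypothetical helper name; the real logic runs while setting up a v3 config entry):

def clean_device_path(path: str) -> str:
    """Strip surrounding whitespace only from socket:// URIs."""
    return path.strip() if path.startswith("socket://") else path

assert clean_device_path("/dev/path1 ") == "/dev/path1 "
assert clean_device_path("socket://dev/path1 ") == "socket://dev/path1"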