Compare commits


77 Commits

Author SHA1 Message Date
Franck Nijhof 7766649304 Bump version to 2025.4.0b9 2025-03-29 17:50:46 +00:00
Simone Chemelli 07e9020dfa Fix immediate state update for Comelit (#141735) 2025-03-29 17:50:36 +00:00
J. Diego Rodríguez Royo f504a759e0 Set Home Connect program action field as not required (#141729)
* Set Home Connect program action field as not required

* Remove required field

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>

---------

Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2025-03-29 17:50:32 +00:00
Joost Lekkerkerker 9927de4801 Only trigger events on button updates in SmartThings (#141720)
Only trigger events on button updates
2025-03-29 17:50:29 +00:00
Joost Lekkerkerker 1244fc4682 Only link the parent device if known in SmartThings (#141719)
Only link the parent device if we know the parent device
2025-03-29 17:50:26 +00:00
Norbert Rittel e77a1b12f7 Sentence-case "Medium type" in mopeka (#141718) 2025-03-29 17:50:22 +00:00
J. Nick Koston 5459daaa10 Fix ESPHome entities not being removed when the ESPHome config removes an entire platform (#141708)
* Fix old ESPHome entities not being removed when configuration changes

fixes #140756

* make sure all callbacks fire

* make sure all callbacks fire

* make sure all callbacks fire

* make sure all callbacks fire

* revert

* cover
2025-03-29 17:50:18 +00:00
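The stale-entity fix above (visible in the entry_data diff further down) hinges on invoking every registered platform callback even when a type has no remaining entities, so platforms can detect removals. A minimal sketch of that dispatch pattern, with hypothetical names:

```python
from collections import defaultdict


def dispatch_static_infos(infos, callbacks_by_type):
    """Group entity infos by type and notify every registered callback.

    Iterating over the registered callbacks (not just the received types)
    means a type whose entities were all removed still gets called with
    an empty list, letting the platform clean up stale entities.
    """
    infos_by_type = defaultdict(list)
    for info in infos:
        infos_by_type[type(info)].append(info)
    for info_type, callbacks in callbacks_by_type.items():
        entity_infos = infos_by_type.get(info_type, [])
        for callback in callbacks:
            callback(entity_infos)
```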
J. Nick Koston 400131df78 Fix ESPHome update entities being loaded before device_info is available (#141704)
* Fix ESPHome update entities being loaded before device_info is available

Since we load platforms when restoring config, the update
platform could be loaded before the connection to the
device was finished, which meant device_info could still
be empty. Wait until device_info is available to
load the update platform.

fixes #135906

* Apply suggestions from code review

* move comment

* Update entry_data.py

Co-authored-by: TheJulianJES <TheJulianJES@users.noreply.github.com>

---------

Co-authored-by: TheJulianJES <TheJulianJES@users.noreply.github.com>
2025-03-29 17:50:15 +00:00
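The core of the fix is deferring the update platform until device_info has actually been received. A simplified sketch of that gating logic (names are illustrative, not the integration's actual API):

```python
from enum import Enum


class Platform(str, Enum):
    UPDATE = "update"
    BINARY_SENSOR = "binary_sensor"


def platforms_to_load(device_info, dashboard_available):
    """Return the platforms that can safely be set up right now.

    The update platform needs a complete device_info, which is not yet
    available when entities are restored before the device connects, so
    it is only added once device_info is populated.
    """
    needed = set()
    if device_info and dashboard_available:
        needed.add(Platform.UPDATE)
    return needed
```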
Franck Nijhof 28e1843ff9 Fix Tuya tdq category to pick up temp & humid (#141698) 2025-03-29 17:50:12 +00:00
Franck Nijhof df777318d1 Handle invalid JSON errors in AirNow (#141695) 2025-03-29 17:50:08 +00:00
Jan Bouwhuis 6ad5e9e89c Improve MQTT translation strings (#141691)
* Improve MQTT options translation string

* more improvements
2025-03-29 17:50:05 +00:00
Norbert Rittel a0bd8deee9 Replace "country" with common string in holiday (#141687) 2025-03-29 17:50:01 +00:00
Marcel van der Veldt 405cbd6a00 Always set pause feature on Music Assistant mediaplayers (#141686) 2025-03-29 17:49:58 +00:00
Marcel van der Veldt 3e0eb5ab2c Bump music assistant client to 1.2.0 (#141668)
* Bump music assistant client to 1.2.0

* Update test fixtures
2025-03-29 17:49:55 +00:00
Norbert Rittel fad75a70b6 Add a common string for "country" (#141653) 2025-03-29 17:49:52 +00:00
Josef Zweck d9720283df Add unknown to uncalibrated state for tedee (#141262) 2025-03-29 17:49:46 +00:00
Franck Nijhof 14eed1778b Bump version to 2025.4.0b8 2025-03-28 20:46:26 +00:00
Norbert Rittel 049aaa7e8b Fix grammar / sentence-casing in workday (#141682)
* Fix grammar / sentence-casing in `workday`

Also replace "country" with common string.

* Add two more references

* Fix second data description reference

* Add "given" to action description for better translations
2025-03-28 20:46:17 +00:00
J. Nick Koston 35717e8216 Increase websocket_api allowed peak time to 10s (#141680)
* Increase websocket_api allowed peak time to 10s

fixes #141624

During integration reload or startup, we can end up sending a message for
each entity being created for integrations that create them from an external
source (i.e. MQTT) because the messages come in one at a time. This can overload
the loop and/or client for more than 5s. While we have done significant work
to optimize this path, we are at the limit of what we can expect clients
to process in the time window, so increase the time window.

* adjust test
2025-03-28 20:46:13 +00:00
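The guard being tuned here works roughly as follows: if a client's send queue stays above a peak size for longer than an allowed window, the connection is dropped. A simplified sketch of the idea (the constants and class are assumptions for illustration, not the websocket_api module's actual code):

```python
import time

PENDING_MSG_PEAK = 1024   # assumed queue-size threshold
ALLOWED_PEAK_TIME = 10.0  # window raised from 5s in this change


class PeakWatcher:
    """Track how long a send queue has stayed above the peak threshold."""

    def __init__(self):
        self._peak_start = None

    def should_disconnect(self, queue_size, now=None):
        """Return True once the queue has been peaked past the window."""
        now = time.monotonic() if now is None else now
        if queue_size < PENDING_MSG_PEAK:
            self._peak_start = None  # queue drained; reset the timer
            return False
        if self._peak_start is None:
            self._peak_start = now
        return (now - self._peak_start) > ALLOWED_PEAK_TIME
```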
Franck Nijhof 2a081abc18 Fix camera proxy with sole image quality settings (#141676) 2025-03-28 20:46:10 +00:00
puddly b7f29c7358 Handle all firmware types for ZBT-1 and Yellow update entities (#141674)
Handle other firmware types
2025-03-28 20:46:06 +00:00
Jason Hunter 3bb6373df5 Update Duke Energy package to fix integration (#141669)
* Update Duke Energy package to fix integration

* fix tests
2025-03-28 20:46:03 +00:00
Michael Hansen e1b4edec50 Bump intents and always prefer more literal text (#141663) 2025-03-28 20:46:00 +00:00
puddly 147bee57e1 Include ZBT-1 and Yellow in device registry (#141623)
* Add the Yellow and ZBT-1 to the device registry

* Unload platforms

* Fix unit tests

* Rename the Yellow update entity to `Radio firmware`

* Rename `EmberZNet` to `EmberZNet Zigbee`

* Prefix the `sw_version` with the firmware type and clean up

* Fix unit tests

* Remove unnecessary `always_update=False` from data update coordinator
2025-03-28 20:45:56 +00:00
Erwin Douna fcdaea64da Tado add proper off state (#135480)
* Add proper off state

* Remove current temp

* Add default frost temp
2025-03-28 20:45:53 +00:00
Franck Nijhof d1512d46be Bump version to 2025.4.0b7 2025-03-28 16:00:45 +00:00
Bram Kragten 0be7db6270 Update frontend to 20250328.0 (#141659) 2025-03-28 15:09:56 +00:00
Paulus Schoutsen 2af0282725 Enable the message box by default for satellite announcement actions (#141654) 2025-03-28 15:09:51 +00:00
Franck Nijhof ff458c8417 Bump version to 2025.4.0b6 2025-03-28 15:04:34 +00:00
Franck Nijhof cc93152ff0 Fix ESPHome event entity staying unavailable (#141650) 2025-03-28 14:05:40 +00:00
Paulus Schoutsen 9965f01609 Ensure connection test sound has no preannouncement (#141647) 2025-03-28 14:05:37 +00:00
Jan Bouwhuis e9c76ce694 Fix duplicate 'device' term in MQTT translation strings (#141646)
* Fix duplicate 'device' from MQTT translation strings

* Update homeassistant/components/mqtt/strings.json
2025-03-28 14:05:34 +00:00
Norbert Rittel 58ab7d350d Fix sentence-casing in airvisual user strings (#141632) 2025-03-28 14:05:30 +00:00
Nick Pesce e4d6e20ebd Use correct default value for multi press buttons in the Matter integration (#141630)
* Respect the min 2 constraint for the switch MultiPressMax attribute

* Update test_event.py

* Update generic_switch_multi.json

* Fix issue and update tests
2025-03-28 14:05:27 +00:00
Tsvi Mostovicz 45e273897a Jewish calendar match omer service variables requirement to documentation (#141620)
The documentation and the omer schema require a Nusach to be specified, but the YAML schema omits that requirement
2025-03-28 14:05:23 +00:00
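In schema terms, the fix makes the field required so the YAML validation matches the documentation. A stdlib-only sketch of the effect (the field name comes from the commit message; the optional keys are illustrative, and the real integration uses a voluptuous schema):

```python
REQUIRED = {"nusach"}
OPTIONAL = {"date", "language"}  # illustrative only


def validate_omer_call(data: dict) -> dict:
    """Mimic a required-field schema check: missing required keys and
    unknown keys both fail, matching the documented service contract."""
    missing = REQUIRED - data.keys()
    if missing:
        raise ValueError(f"required key(s) not provided: {sorted(missing)}")
    unknown = data.keys() - REQUIRED - OPTIONAL
    if unknown:
        raise ValueError(f"unknown key(s): {sorted(unknown)}")
    return data
```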
Jan Bouwhuis d9ec7142d7 Fix volatile_organic_compounds_parts translation string to be referenced for MQTT subentries device class selector (#141618)
* Fix `volatile_organic_compounds_parts` translation string to be referenced for MQTT subentries device class selector

* Fix tests
2025-03-28 14:05:20 +00:00
Petro31 e162499267 Fix an issue with the switch preview in beta (#141617)
Fix an issue with the switch preview
2025-03-28 14:05:16 +00:00
Jan-Philipp Benecke 67f21429e3 Bump aiowebdav2 to 0.4.4 (#141615) 2025-03-28 14:05:12 +00:00
J. Nick Koston a0563f06c9 Fix zeroconf logging level not being respected (#141601)
Removes an old logging workaround that is no longer needed

fixes #141558
2025-03-28 14:05:05 +00:00
Luke Lashley e7c4fdc8bb Bump Python-Snoo to 0.6.5 (#141599)
* Bump Python-Snoo to 0.6.5

* add to event_types
2025-03-28 14:05:00 +00:00
Norbert Rittel c490e350bc Make names of switch entities in gree consistent with docs (#141580) 2025-03-28 14:04:56 +00:00
Robert Resch e11409ef99 Reverts #141363 "Deprecate SmartThings machine state sensors" (#141573)
Reverts #141363
2025-03-28 14:04:52 +00:00
Joost Lekkerkerker 5c8e415a76 Add default string and icon for light effect off (#141567) 2025-03-28 14:04:49 +00:00
alorente e795fb9497 Fix missing response for queued mode scripts (#141460) 2025-03-28 14:04:45 +00:00
Norbert Rittel d0afabb85c Fix misleading friendly names of pvoutput sensors (#141312)
* Fix misleading friendly names of `pvoutput` sensors

* Update test_sensor.py

* Update test_sensor.py - prettier
2025-03-28 14:04:41 +00:00
Franck Nijhof 4f3e8e9b94 Bump version to 2025.4.0b5 2025-03-27 20:03:14 +00:00
Paul Bottein 46c1cbbc9c Update frontend to 20250327.1 (#141596) 2025-03-27 20:03:01 +00:00
Simon Lamon 8d9a4ea278 Fix typing error in NMBS (#141589)
Fix typing error
2025-03-27 20:02:58 +00:00
Jan-Philipp Benecke 22c83e2393 Bump aiowebdav2 to 0.4.3 (#141586) 2025-03-27 20:02:55 +00:00
Joost Lekkerkerker c83a75f6f9 Add brand for Bosch (#141561) 2025-03-27 20:02:51 +00:00
Franck Nijhof 841c727112 Bump version to 2025.4.0b4 2025-03-27 16:59:36 +00:00
Bram Kragten d8c9655bfd Update frontend to 20250327.0 (#141585) 2025-03-27 16:59:29 +00:00
Erik Montnemery 942ed89cc4 Revert "Promote after dependencies in bootstrap" (#141584)
Revert "Promote after dependencies in bootstrap (#140352)"

This reverts commit 3766040960.
2025-03-27 16:59:25 +00:00
Franck Nijhof a1fe6b9cf3 Bump version to 2025.4.0b3 2025-03-27 15:38:31 +00:00
Luke Lashley 2567181cc2 Better handle Roborock discovery (#141575) 2025-03-27 15:38:24 +00:00
Joost Lekkerkerker 028e4f6029 Also migrate completion time entities in SmartThings (#141572) 2025-03-27 15:38:21 +00:00
Martin Hjelmare b82e1a9bef Handle cloud subscription expired for backup upload (#141564)
Handle cloud backup subscription expired for upload
2025-03-27 15:38:18 +00:00
Joost Lekkerkerker 438f226c31 Add icons to hue effects (#141559) 2025-03-27 15:38:15 +00:00
Erwin Douna 2f139e3cb1 Tado fix HomeKit flow (#141525)
* Initial commit

* Fix

* Fix

---------

Co-authored-by: Joostlek <joostlek@outlook.com>
2025-03-27 15:38:07 +00:00
Franck Nijhof 5d75e96fbf Bump version to 2025.4.0b2 2025-03-27 10:19:35 +00:00
Norbert Rittel dcf2ec5c37 Fix sentence-casing in konnected strings, replace "override" with "custom" (#141553)
Fix sentence-casing in `konnected` strings, replace "Override" with "Custom"

Make string consistent with HA standards.

As "Override" can be misunderstood as the verb, replace it with "Custom".
2025-03-27 10:19:22 +00:00
Simon Lamon 2431e1ba98 Bump linkplay to v0.2.2 (#141542)
Bump linkplay
2025-03-27 10:19:18 +00:00
Thomas55555 4ead108c15 Handle webcal prefix in remote calendar (#141541)
Handle webcal prefix in remote calendar
2025-03-27 10:19:14 +00:00
Michael Hansen ec8363fa49 Add default preannounce sound to Assist satellites (#141522)
* Add default preannounce sound

* Allow None to disable sound

* Register static path instead of HTTP view

* Fix path

---------

Co-authored-by: Paulus Schoutsen <balloob@gmail.com>
2025-03-27 10:19:09 +00:00
J. Diego Rodríguez Royo e7ff0a3f8b Improve some Home Connect deprecations (#141508) 2025-03-27 10:19:06 +00:00
Ivan Lopez Hernandez f4c0eb4189 Initialize google.genai.Client in the executor (#141432)
* Initialize the client on an executor thread

* Fix MyPy error

* MyPy error

* Exception error

* Fix ruff

* Update __init__.py

---------

Co-authored-by: tronikos <tronikos@users.noreply.github.com>
2025-03-27 10:19:02 +00:00
Manu b1ee5a76e1 Support for upcoming pyLoad-ng release in pyLoad integration (#141297)
Fix extra key `proxy` in pyLoad
2025-03-27 10:18:58 +00:00
Norbert Rittel 6b9e8c301b Fix wrong friendly name for storage_power in solaredge (#141269)
* Fix wrong friendly name for `storage_power` in `solaredge`

"Stored power" is a contradiction in itself.
You can only store energy.

* Two additional spelling fixes

* Sentence-case "site"
2025-03-27 10:18:53 +00:00
Franck Nijhof 89c3266c7e Bump version to 2025.4.0b1 2025-03-26 23:21:26 +00:00
Jan Bouwhuis cff0a632e8 Fix QoS schema issue in MQTT subentries (#141531) 2025-03-26 23:21:17 +00:00
Jan Bouwhuis e04d8557ae Fix MQTT options flow QoS selector can not serialize (#141528) 2025-03-26 23:21:14 +00:00
Thomas55555 ca6286f241 Fix work area sensor for Husqvarna Automower (#141527)
* Fix work area sensor for Husqvarna Automower

* simplify
2025-03-26 23:21:10 +00:00
Robert Resch 35bcc9d5af Show box for Smartthings rise number entity (#141526) 2025-03-26 23:21:07 +00:00
Joost Lekkerkerker 25b45ce867 Sort SmartThings devices to be created by parent device id (#141515) 2025-03-26 23:21:03 +00:00
Robert Resch d568209bd5 Bump deebot-client to 12.4.0 (#141501) 2025-03-26 23:21:00 +00:00
Simone Chemelli 8a43e8af9e Fix refresh state for Comelit alarm (#141370) 2025-03-26 23:20:56 +00:00
Franck Nijhof 785e5b2c16 Bump version to 2025.4.0b0 2025-03-26 17:41:03 +00:00
114 changed files with 1910 additions and 1659 deletions
+14 -69
@@ -40,7 +40,7 @@ env:
CACHE_VERSION: 12
UV_CACHE_VERSION: 1
MYPY_CACHE_VERSION: 9
HA_SHORT_VERSION: "2025.5"
HA_SHORT_VERSION: "2025.4"
DEFAULT_PYTHON: "3.13"
ALL_PYTHON_VERSIONS: "['3.13']"
# 10.3 is the oldest supported version
@@ -876,6 +876,15 @@ jobs:
- mypy
name: Split tests for full run
steps:
- name: Install additional OS dependencies
run: |
sudo rm /etc/apt/sources.list.d/microsoft-prod.list
sudo apt-get update
sudo apt-get -y install \
bluez \
ffmpeg \
libturbojpeg \
libgammu-dev
- name: Check out code from GitHub
uses: actions/checkout@v4.2.2
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
@@ -884,18 +893,6 @@ jobs:
with:
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Generate partial pytest execution time restore key
id: generate-pytest-execution-time-report-key
run: |
echo "key=pytest-execution-time-report-$(date -u '+%Y-%m-%dT%H:%M:%s')" >> $GITHUB_OUTPUT
- name: Restore pytest execution time cache
uses: actions/cache/restore@v4.2.3
with:
path: pytest-execution-time-report.json
key: >-
${{ runner.os }}-${{ steps.generate-pytest-execution-time-report-key.outputs.key }}
restore-keys: |
${{ runner.os }}-pytest-execution-time-report-
- name: Restore base Python virtual environment
id: cache-venv
uses: actions/cache/restore@v4.2.3
@@ -908,8 +905,7 @@ jobs:
- name: Run split_tests.py
run: |
. venv/bin/activate
python -m script.split_tests ${{ needs.info.outputs.test_group_count }} \
tests pytest-execution-time-report.json
python -m script.split_tests ${{ needs.info.outputs.test_group_count }} tests
- name: Upload pytest_buckets
uses: actions/upload-artifact@v4.6.2
with:
@@ -1006,7 +1002,6 @@ jobs:
${cov_params[@]} \
-o console_output_style=count \
-p no:sugar \
--execution-time-report-name pytest-execution-time-report-${{ matrix.python-version }}-${{ matrix.group }}.json \
--exclude-warning-annotations \
$(sed -n "${{ matrix.group }},1p" pytest_buckets.txt) \
2>&1 | tee pytest-${{ matrix.python-version }}-${{ matrix.group }}.txt
@@ -1015,9 +1010,7 @@ jobs:
uses: actions/upload-artifact@v4.6.2
with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{ matrix.group }}
path: |
pytest-*.txt
pytest-*.json
path: pytest-*.txt
overwrite: true
- name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true'
@@ -1032,60 +1025,12 @@ jobs:
with:
name: test-results-full-${{ matrix.python-version }}-${{ matrix.group }}
path: junit.xml
- name: Remove pytest_buckets
run: rm pytest_buckets.txt
- name: Check dirty
run: |
./script/check_dirty
pytest-combine-test-execution-time:
runs-on: ubuntu-24.04
needs:
- info
- pytest-full
name: Combine test execution times
steps:
- name: Check out code from GitHub
uses: actions/checkout@v4.2.2
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@v5.5.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore base Python virtual environment
id: cache-venv
uses: actions/cache/restore@v4.2.3
with:
path: venv
fail-on-cache-miss: true
key: >-
${{ runner.os }}-${{ steps.python.outputs.python-version }}-${{
needs.info.outputs.python_cache_key }}
- name: Generate partial pytest execution time restore key
id: generate-pytest-execution-time-report-key
run: |
echo "key=pytest-execution-time-report-$(date -u '+%Y-%m-%dT%H:%M:%s')" >> $GITHUB_OUTPUT
- name: Download pytest execution time artifacts
uses: actions/download-artifact@v4.2.1
with:
pattern: pytest-${{ github.run_number }}-${{ env.DEFAULT_PYTHON }}-*
merge-multiple: true
- name: Combine files into one
run: |
. venv/bin/activate
python -m script.merge_pytest_execution_time_reports "pytest-execution-time-report-${{ env.DEFAULT_PYTHON }}-*.json"
- name: Upload combined pytest execution time artifact
uses: actions/upload-artifact@v4.6.2
with:
name: pytest-execution-time-report-${{ github.run_number }}
path: pytest-execution-time-report.json
- name: Save pytest execution time cache
uses: actions/cache/save@v4.2.3
with:
path: pytest-execution-time-report.json
key: >-
${{ runner.os }}-${{
steps.generate-pytest-execution-time-report-key.outputs.key }}
pytest-mariadb:
runs-on: ubuntu-24.04
services:
+1 -4
@@ -137,7 +137,4 @@ tmp_cache
.ropeproject
# Will be created from script/split_tests.py
pytest_buckets.txt
# Contains test execution times used for splitting tests
pytest-execution-time-report*.json
pytest_buckets.txt
+17 -11
@@ -859,14 +859,8 @@ async def _async_set_up_integrations(
integrations, all_integrations = await _async_resolve_domains_and_preload(
hass, config
)
# Detect all cycles
integrations_after_dependencies = (
await loader.resolve_integrations_after_dependencies(
hass, all_integrations.values(), set(all_integrations)
)
)
all_domains = set(integrations_after_dependencies)
domains = set(integrations) & all_domains
all_domains = set(all_integrations)
domains = set(integrations)
_LOGGER.info(
"Domains to be set up: %s | %s",
@@ -874,8 +868,6 @@ async def _async_set_up_integrations(
all_domains - domains,
)
async_set_domains_to_be_loaded(hass, all_domains)
# Initialize recorder
if "recorder" in all_domains:
recorder.async_initialize_recorder(hass)
@@ -908,12 +900,24 @@ async def _async_set_up_integrations(
stage_dep_domains_unfiltered = {
dep
for domain in stage_domains
for dep in integrations_after_dependencies[domain]
for dep in all_integrations[domain].all_dependencies
if dep not in stage_domains
}
stage_dep_domains = stage_dep_domains_unfiltered - hass.config.components
stage_all_domains = stage_domains | stage_dep_domains
stage_all_integrations = {
domain: all_integrations[domain] for domain in stage_all_domains
}
# Detect all cycles
stage_integrations_after_dependencies = (
await loader.resolve_integrations_after_dependencies(
hass, stage_all_integrations.values(), stage_all_domains
)
)
stage_all_domains = set(stage_integrations_after_dependencies)
stage_domains &= stage_all_domains
stage_dep_domains &= stage_all_domains
_LOGGER.info(
"Setting up stage %s: %s | %s\nDependencies: %s | %s",
@@ -924,6 +928,8 @@ async def _async_set_up_integrations(
stage_dep_domains_unfiltered - stage_dep_domains,
)
async_set_domains_to_be_loaded(hass, stage_all_domains)
if timeout is None:
await _async_setup_multi_components(hass, stage_all_domains, config)
continue
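The reverted bootstrap logic above resolves each stage's full domain set by pulling in every not-yet-loaded dependency of the stage's domains. A simplified sketch of that expansion, assuming a plain dependency mapping instead of the loader's Integration objects:

```python
def stage_all_domains(stage_domains, all_dependencies, already_set_up):
    """Expand a bootstrap stage with dependencies that are neither
    already loaded nor already part of the stage itself."""
    deps = {
        dep
        for domain in stage_domains
        for dep in all_dependencies.get(domain, set())
        if dep not in stage_domains
    }
    return stage_domains | (deps - already_set_up)
```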
+5
@@ -0,0 +1,5 @@
{
"domain": "bosch",
"name": "Bosch",
"integrations": ["bosch_alarm", "bosch_shc", "home_connect"]
}
@@ -8,7 +8,7 @@ from aiohttp import ClientSession
from aiohttp.client_exceptions import ClientConnectorError
from pyairnow import WebServiceAPI
from pyairnow.conv import aqi_to_concentration
from pyairnow.errors import AirNowError
from pyairnow.errors import AirNowError, InvalidJsonError
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
@@ -79,7 +79,7 @@ class AirNowDataUpdateCoordinator(DataUpdateCoordinator[dict[str, Any]]):
distance=self.distance,
)
except (AirNowError, ClientConnectorError) as error:
except (AirNowError, ClientConnectorError, InvalidJsonError) as error:
raise UpdateFailed(error) from error
if not obs:
@@ -2,7 +2,7 @@
"config": {
"step": {
"geography_by_coords": {
"title": "Configure a Geography",
"title": "Configure a geography",
"description": "Use the AirVisual cloud API to monitor a latitude/longitude.",
"data": {
"api_key": "[%key:common::config_flow::data::api_key%]",
@@ -56,12 +56,12 @@
"sensor": {
"pollutant_label": {
"state": {
"co": "Carbon Monoxide",
"n2": "Nitrogen Dioxide",
"co": "Carbon monoxide",
"n2": "Nitrogen dioxide",
"o3": "Ozone",
"p1": "PM10",
"p2": "PM2.5",
"s2": "Sulfur Dioxide"
"s2": "Sulfur dioxide"
}
},
"pollutant_level": {
@@ -6,5 +6,5 @@
"iot_class": "cloud_push",
"loggers": ["boto3", "botocore", "s3transfer"],
"quality_scale": "legacy",
"requirements": ["boto3==1.37.1"]
"requirements": ["boto3==1.34.131"]
}
@@ -8,6 +8,7 @@ announce:
message:
required: false
example: "Time to wake up!"
default: ""
selector:
text:
media_id:
@@ -28,6 +29,7 @@ start_conversation:
start_message:
required: false
example: "You left the lights on in the living room. Turn them off?"
default: ""
selector:
text:
start_media_id:
@@ -198,7 +198,8 @@ async def websocket_test_connection(
hass.async_create_background_task(
satellite.async_internal_announce(
media_id=f"{CONNECTION_TEST_URL_BASE}/{connection_id}"
media_id=f"{CONNECTION_TEST_URL_BASE}/{connection_id}",
preannounce_media_id=None,
),
f"assist_satellite_connection_test_{msg['entity_id']}",
)
+1 -1
@@ -6,5 +6,5 @@
"iot_class": "cloud_push",
"loggers": ["aiobotocore", "botocore"],
"quality_scale": "legacy",
"requirements": ["aiobotocore==2.21.1", "botocore==1.37.1"]
"requirements": ["aiobotocore==2.13.1", "botocore==1.34.131"]
}
+12 -2
@@ -4,13 +4,14 @@ from __future__ import annotations
import asyncio
from collections.abc import AsyncIterator, Callable, Coroutine, Mapping
from http import HTTPStatus
import logging
import random
from typing import Any
from aiohttp import ClientError
from aiohttp import ClientError, ClientResponseError
from hass_nabucasa import Cloud, CloudError
from hass_nabucasa.api import CloudApiNonRetryableError
from hass_nabucasa.api import CloudApiError, CloudApiNonRetryableError
from hass_nabucasa.cloud_api import (
FilesHandlerListEntry,
async_files_delete_file,
@@ -120,6 +121,8 @@ class CloudBackupAgent(BackupAgent):
"""
if not backup.protected:
raise BackupAgentError("Cloud backups must be protected")
if self._cloud.subscription_expired:
raise BackupAgentError("Cloud subscription has expired")
size = backup.size
try:
@@ -152,6 +155,13 @@ class CloudBackupAgent(BackupAgent):
) from err
raise BackupAgentError(f"Failed to upload backup {err}") from err
except CloudError as err:
if (
isinstance(err, CloudApiError)
and isinstance(err.orig_exc, ClientResponseError)
and err.orig_exc.status == HTTPStatus.FORBIDDEN
and self._cloud.subscription_expired
):
raise BackupAgentError("Cloud subscription has expired") from err
if tries == _RETRY_LIMIT:
raise BackupAgentError(f"Failed to upload backup {err}") from err
tries += 1
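The upload path above retries transient cloud errors up to a limit, but treats a 403 coinciding with an expired subscription as immediately fatal, since retrying cannot succeed. A stripped-down sketch of that control flow, with hypothetical names standing in for the hass_nabucasa types:

```python
class BackupAgentError(Exception):
    pass


class TransientCloudError(Exception):
    def __init__(self, status=None):
        super().__init__(status)
        self.status = status


RETRY_LIMIT = 5


def upload_with_retries(do_upload, subscription_expired):
    """Retry transient failures, but fail fast when a 403 coincides
    with an expired subscription."""
    tries = 1
    while True:
        try:
            return do_upload()
        except TransientCloudError as err:
            if err.status == 403 and subscription_expired():
                raise BackupAgentError("Cloud subscription has expired") from err
            if tries == RETRY_LIMIT:
                raise BackupAgentError(f"Failed to upload backup {err}") from err
            tries += 1
```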
@@ -9,6 +9,7 @@ from typing import Any
import pycfdns
import voluptuous as vol
from homeassistant.components import persistent_notification
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_API_TOKEN, CONF_ZONE
from homeassistant.core import HomeAssistant
@@ -117,6 +118,8 @@ class CloudflareConfigFlow(ConfigFlow, domain=DOMAIN):
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle a flow initiated by the user."""
persistent_notification.async_dismiss(self.hass, "cloudflare_setup")
errors: dict[str, str] = {}
if user_input is not None:
+11 -10
@@ -8,7 +8,7 @@ from aiocomelit import ComelitSerialBridgeObject
from aiocomelit.const import COVER, STATE_COVER, STATE_OFF, STATE_ON
from homeassistant.components.cover import CoverDeviceClass, CoverEntity, CoverState
from homeassistant.core import HomeAssistant, callback
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.restore_state import RestoreEntity
from homeassistant.helpers.update_coordinator import CoordinatorEntity
@@ -98,13 +98,20 @@ class ComelitCoverEntity(
"""Return if the cover is opening."""
return self._current_action("opening")
async def _cover_set_state(self, action: int, state: int) -> None:
"""Set desired cover state."""
self._last_state = self.state
await self._api.set_device_status(COVER, self._device.index, action)
self.coordinator.data[COVER][self._device.index].status = state
self.async_write_ha_state()
async def async_close_cover(self, **kwargs: Any) -> None:
"""Close cover."""
await self._api.set_device_status(COVER, self._device.index, STATE_OFF)
await self._cover_set_state(STATE_OFF, 2)
async def async_open_cover(self, **kwargs: Any) -> None:
"""Open cover."""
await self._api.set_device_status(COVER, self._device.index, STATE_ON)
await self._cover_set_state(STATE_ON, 1)
async def async_stop_cover(self, **_kwargs: Any) -> None:
"""Stop the cover."""
@@ -112,13 +119,7 @@ class ComelitCoverEntity(
return
action = STATE_ON if self.is_closing else STATE_OFF
await self._api.set_device_status(COVER, self._device.index, action)
@callback
def _handle_coordinator_update(self) -> None:
"""Handle device update."""
self._last_state = self.state
self.async_write_ha_state()
await self._cover_set_state(action, 0)
async def async_added_to_hass(self) -> None:
"""Handle entity which will be added."""
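All three Comelit platforms in this batch (cover, light, switch) now apply the same optimistic pattern: send the command, write the expected status into the coordinator data, and push state immediately instead of waiting for the next refresh. A generic sketch of that idea, with simplified names:

```python
import asyncio


class OptimisticEntity:
    """Push the expected device status locally right after a command,
    rather than waiting for a coordinator refresh to confirm it."""

    def __init__(self, api, data, index):
        self._api = api
        self._data = data
        self._index = index
        self.state_written = False

    async def set_state(self, action, status):
        await self._api.set_device_status(self._index, action)
        self._data[self._index] = status  # optimistic local update
        self.write_ha_state()

    def write_ha_state(self):
        self.state_written = True
```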
+2 -1
@@ -59,7 +59,8 @@ class ComelitLightEntity(CoordinatorEntity[ComelitSerialBridge], LightEntity):
async def _light_set_state(self, state: int) -> None:
"""Set desired light state."""
await self.coordinator.api.set_device_status(LIGHT, self._device.index, state)
await self.coordinator.async_request_refresh()
self.coordinator.data[LIGHT][self._device.index].status = state
self.async_write_ha_state()
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn the light on."""
+2 -1
@@ -67,7 +67,8 @@ class ComelitSwitchEntity(CoordinatorEntity[ComelitSerialBridge], SwitchEntity):
await self.coordinator.api.set_device_status(
self._device.type, self._device.index, state
)
await self.coordinator.async_request_refresh()
self.coordinator.data[self._device.type][self._device.index].status = state
self.async_write_ha_state()
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn the switch on."""
@@ -650,7 +650,14 @@ class DefaultAgent(ConversationEntity):
if (
(maybe_result is None) # first result
or (num_matched_entities > best_num_matched_entities)
or (
# More literal text matched
result.text_chunks_matched > maybe_result.text_chunks_matched
)
or (
# More entities matched
num_matched_entities > best_num_matched_entities
)
or (
# Fewer unmatched entities
(num_matched_entities == best_num_matched_entities)
@@ -662,16 +669,6 @@ class DefaultAgent(ConversationEntity):
and (num_unmatched_entities == best_num_unmatched_entities)
and (num_unmatched_ranges > best_num_unmatched_ranges)
)
or (
# More literal text matched
(num_matched_entities == best_num_matched_entities)
and (num_unmatched_entities == best_num_unmatched_entities)
and (num_unmatched_ranges == best_num_unmatched_ranges)
and (
result.text_chunks_matched
> maybe_result.text_chunks_matched
)
)
or (
# Prefer match failures with entities
(result.text_chunks_matched == maybe_result.text_chunks_matched)
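The chain of `or` clauses above is effectively a lexicographic preference that this change reorders to rank literal text matches first. That kind of ordering can be expressed compactly as a sort key; a sketch of the general pattern, not the agent's actual code:

```python
def match_key(result):
    """Higher tuples win: prefer more literal text matched, then more
    matched entities, then fewer unmatched entities."""
    return (
        result["text_chunks_matched"],
        result["matched_entities"],
        -result["unmatched_entities"],
    )


def best_match(results):
    """Pick the preferred result under the lexicographic ordering."""
    return max(results, key=match_key)
```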
@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/conversation",
"integration_type": "system",
"quality_scale": "internal",
"requirements": ["hassil==2.2.3", "home-assistant-intents==2025.3.24"]
"requirements": ["hassil==2.2.3", "home-assistant-intents==2025.3.28"]
}
@@ -50,10 +50,10 @@ class DukeEnergyConfigFlow(ConfigFlow, domain=DOMAIN):
_LOGGER.exception("Unexpected exception")
errors["base"] = "unknown"
else:
username = auth["cdp_internal_user_id"].lower()
username = auth["internalUserID"].lower()
await self.async_set_unique_id(username)
self._abort_if_unique_id_configured()
email = auth["email"].lower()
email = auth["loginEmailAddress"].lower()
data = {
CONF_EMAIL: email,
CONF_USERNAME: username,
@@ -6,5 +6,5 @@
"dependencies": ["recorder"],
"documentation": "https://www.home-assistant.io/integrations/duke_energy",
"iot_class": "cloud_polling",
"requirements": ["aiodukeenergy==0.2.2"]
"requirements": ["aiodukeenergy==0.3.0"]
}
+22 -18
@@ -282,15 +282,18 @@ class RuntimeEntryData:
) -> None:
"""Distribute an update of static infos to all platforms."""
# First, load all platforms
needed_platforms = set()
if async_get_dashboard(hass):
needed_platforms.add(Platform.UPDATE)
needed_platforms: set[Platform] = set()
if self.device_info and self.device_info.voice_assistant_feature_flags_compat(
self.api_version
):
needed_platforms.add(Platform.BINARY_SENSOR)
needed_platforms.add(Platform.SELECT)
if self.device_info:
if async_get_dashboard(hass):
# Only load the update platform if the device_info is set
# When we restore the entry, the device_info may not be set yet
# and we don't want to load the update platform since it needs
# a complete device_info.
needed_platforms.add(Platform.UPDATE)
if self.device_info.voice_assistant_feature_flags_compat(self.api_version):
needed_platforms.add(Platform.BINARY_SENSOR)
needed_platforms.add(Platform.SELECT)
ent_reg = er.async_get(hass)
registry_get_entity = ent_reg.async_get_entity_id
@@ -312,18 +315,19 @@ class RuntimeEntryData:
# Make a dict of the EntityInfo by type and send
# them to the listeners for each specific EntityInfo type
infos_by_type: dict[type[EntityInfo], list[EntityInfo]] = {}
infos_by_type: defaultdict[type[EntityInfo], list[EntityInfo]] = defaultdict(
list
)
for info in infos:
info_type = type(info)
if info_type not in infos_by_type:
infos_by_type[info_type] = []
infos_by_type[info_type].append(info)
infos_by_type[type(info)].append(info)
callbacks_by_type = self.entity_info_callbacks
for type_, entity_infos in infos_by_type.items():
if callbacks_ := callbacks_by_type.get(type_):
for callback_ in callbacks_:
callback_(entity_infos)
for type_, callbacks in self.entity_info_callbacks.items():
# If all entities for a type are removed, we
# still need to call the callbacks with an empty list
# to make sure the entities are removed.
entity_infos = infos_by_type.get(type_, [])
for callback_ in callbacks:
callback_(entity_infos)
# Finally update static info subscriptions
for callback_ in self.static_info_update_subscriptions:
+10
@@ -33,6 +33,16 @@ class EsphomeEvent(EsphomeEntity[EventInfo, Event], EventEntity):
self._trigger_event(self._state.event_type)
self.async_write_ha_state()
@callback
def _on_device_update(self) -> None:
"""Call when device updates or entry data changes."""
super()._on_device_update()
if self._entry_data.available:
# Event entities should go available directly
# when the device comes online and not wait
# for the next data push.
self.async_write_ha_state()
async_setup_entry = partial(
platform_async_setup_entry,
@@ -20,5 +20,5 @@
"documentation": "https://www.home-assistant.io/integrations/frontend",
"integration_type": "system",
"quality_scale": "internal",
"requirements": ["home-assistant-frontend==20250326.0"]
"requirements": ["home-assistant-frontend==20250328.0"]
}
@@ -7,7 +7,7 @@ import logging
from types import MappingProxyType
from typing import Any
from google import genai
from google import genai # type: ignore[attr-defined]
from google.genai.errors import APIError, ClientError
from requests.exceptions import Timeout
import voluptuous as vol
+2 -2
@@ -16,13 +16,13 @@
"name": "Panel light"
},
"quiet": {
"name": "Quiet"
"name": "Quiet mode"
},
"fresh_air": {
"name": "Fresh air"
},
"xfan": {
"name": "XFan"
"name": "Xtra fan"
},
"health_mode": {
"name": "Health mode"
@@ -387,15 +387,6 @@ class HeosMediaPlayer(CoordinatorEntity[HeosCoordinator], MediaPlayerEntity):
await self._player.play_preset_station(index)
return
if media_type == "queue":
# media_id must be an int
try:
queue_id = int(media_id)
except ValueError:
raise ValueError(f"Invalid queue id '{media_id}'") from None
await self._player.play_queue(queue_id)
return
raise ValueError(f"Unsupported media type '{media_type}'")
@catch_action_error("select source")
@@ -8,7 +8,7 @@
"step": {
"user": {
"data": {
"country": "Country"
"country": "[%key:common::config_flow::data::country%]"
}
},
"options": {
@@ -64,7 +64,6 @@ set_program_and_options:
- selected_program
program:
example: dishcare_dishwasher_program_auto2
required: true
selector:
select:
mode: dropdown
@@ -31,7 +31,6 @@ class FirmwareUpdateCoordinator(DataUpdateCoordinator[FirmwareManifest]):
_LOGGER,
name="firmware update coordinator",
update_interval=FIRMWARE_REFRESH_INTERVAL,
always_update=False,
)
self.hass = hass
self.session = session
@@ -199,7 +199,7 @@ class BaseFirmwareUpdateEntity(
# This entity is not currently associated with a device so we must manually
# give it a name
self._attr_name = f"{self._config_entry.title} Update"
self._attr_title = self.entity_description.firmware_name or "unknown"
self._attr_title = self.entity_description.firmware_name or "Unknown"
if (
self._current_firmware_info is None
@@ -15,14 +15,13 @@ _LOGGER = logging.getLogger(__name__)
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Set up a Home Assistant SkyConnect config entry."""
await hass.config_entries.async_forward_entry_setups(entry, ["update"])
return True
async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Unload a config entry."""
await hass.config_entries.async_unload_platforms(entry, ["update"])
return True
@@ -21,11 +21,20 @@ from homeassistant.components.update import UpdateDeviceClass
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import entity_registry as er
from homeassistant.helpers import device_registry as dr, entity_registry as er
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import FIRMWARE, FIRMWARE_VERSION, NABU_CASA_FIRMWARE_RELEASES_URL
from .const import (
DOMAIN,
FIRMWARE,
FIRMWARE_VERSION,
NABU_CASA_FIRMWARE_RELEASES_URL,
PRODUCT,
SERIAL_NUMBER,
HardwareVariant,
)
_LOGGER = logging.getLogger(__name__)
@@ -42,7 +51,7 @@ FIRMWARE_ENTITY_DESCRIPTIONS: dict[
fw_type="skyconnect_zigbee_ncp",
version_key="ezsp_version",
expected_firmware_type=ApplicationType.EZSP,
firmware_name="EmberZNet",
firmware_name="EmberZNet Zigbee",
),
ApplicationType.SPINEL: FirmwareUpdateEntityDescription(
key="firmware",
@@ -55,6 +64,28 @@ FIRMWARE_ENTITY_DESCRIPTIONS: dict[
expected_firmware_type=ApplicationType.SPINEL,
firmware_name="OpenThread RCP",
),
ApplicationType.CPC: FirmwareUpdateEntityDescription(
key="firmware",
display_precision=0,
device_class=UpdateDeviceClass.FIRMWARE,
entity_category=EntityCategory.CONFIG,
version_parser=lambda fw: fw,
fw_type="skyconnect_multipan",
version_key="cpc_version",
expected_firmware_type=ApplicationType.CPC,
firmware_name="Multiprotocol",
),
ApplicationType.GECKO_BOOTLOADER: FirmwareUpdateEntityDescription(
key="firmware",
display_precision=0,
device_class=UpdateDeviceClass.FIRMWARE,
entity_category=EntityCategory.CONFIG,
version_parser=lambda fw: fw,
fw_type=None, # We don't want to update the bootloader
version_key="gecko_bootloader_version",
expected_firmware_type=ApplicationType.GECKO_BOOTLOADER,
firmware_name="Gecko Bootloader",
),
None: FirmwareUpdateEntityDescription(
key="firmware",
display_precision=0,
@@ -77,9 +108,16 @@ def _async_create_update_entity(
) -> FirmwareUpdateEntity:
"""Create an update entity that handles firmware type changes."""
firmware_type = config_entry.data[FIRMWARE]
entity_description = FIRMWARE_ENTITY_DESCRIPTIONS[
ApplicationType(firmware_type) if firmware_type is not None else None
]
try:
entity_description = FIRMWARE_ENTITY_DESCRIPTIONS[
ApplicationType(firmware_type)
]
except (KeyError, ValueError):
_LOGGER.debug(
"Unknown firmware type %r, using default entity description", firmware_type
)
entity_description = FIRMWARE_ENTITY_DESCRIPTIONS[None]
entity = FirmwareUpdateEntity(
device=config_entry.data["device"],
@@ -130,6 +168,7 @@ class FirmwareUpdateEntity(BaseFirmwareUpdateEntity):
"""SkyConnect firmware update entity."""
bootloader_reset_type = None
_attr_has_entity_name = True
def __init__(
self,
@@ -141,8 +180,18 @@ class FirmwareUpdateEntity(BaseFirmwareUpdateEntity):
"""Initialize the SkyConnect firmware update entity."""
super().__init__(device, config_entry, update_coordinator, entity_description)
self._attr_unique_id = (
f"{self._config_entry.data['serial_number']}_{self.entity_description.key}"
variant = HardwareVariant.from_usb_product_name(
self._config_entry.data[PRODUCT]
)
serial_number = self._config_entry.data[SERIAL_NUMBER]
self._attr_unique_id = f"{serial_number}_{self.entity_description.key}"
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, serial_number)},
name=f"{variant.full_name} ({serial_number[:8]})",
model=variant.full_name,
manufacturer="Nabu Casa",
serial_number=serial_number,
)
# Use the cached firmware info if it exists
@@ -155,6 +204,17 @@ class FirmwareUpdateEntity(BaseFirmwareUpdateEntity):
source="homeassistant_sky_connect",
)
def _update_attributes(self) -> None:
"""Recompute the attributes of the entity."""
super()._update_attributes()
assert self.device_entry is not None
device_registry = dr.async_get(self.hass)
device_registry.async_update_device(
device_id=self.device_entry.id,
sw_version=f"{self.entity_description.firmware_name} {self._attr_installed_version}",
)
@callback
def _firmware_info_callback(self, firmware_info: FirmwareInfo) -> None:
"""Handle updated firmware info being pushed by an integration."""
@@ -62,6 +62,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Unload a config entry."""
await hass.config_entries.async_unload_platforms(entry, ["update"])
return True
@@ -2,8 +2,9 @@
DOMAIN = "homeassistant_yellow"
RADIO_MODEL = "Home Assistant Yellow"
RADIO_MANUFACTURER = "Nabu Casa"
MODEL = "Home Assistant Yellow"
MANUFACTURER = "Nabu Casa"
RADIO_DEVICE = "/dev/ttyAMA1"
ZHA_HW_DISCOVERY_DATA = {
@@ -149,5 +149,12 @@
"run_zigbee_flasher_addon": "[%key:component::homeassistant_hardware::firmware_picker::options::progress::run_zigbee_flasher_addon%]",
"uninstall_zigbee_flasher_addon": "[%key:component::homeassistant_hardware::firmware_picker::options::progress::uninstall_zigbee_flasher_addon%]"
}
},
"entity": {
"update": {
"firmware": {
"name": "Radio firmware"
}
}
}
}
@@ -21,13 +21,17 @@ from homeassistant.components.update import UpdateDeviceClass
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import entity_registry as er
from homeassistant.helpers import device_registry as dr, entity_registry as er
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import (
DOMAIN,
FIRMWARE,
FIRMWARE_VERSION,
MANUFACTURER,
MODEL,
NABU_CASA_FIRMWARE_RELEASES_URL,
RADIO_DEVICE,
)
@@ -39,7 +43,7 @@ FIRMWARE_ENTITY_DESCRIPTIONS: dict[
ApplicationType | None, FirmwareUpdateEntityDescription
] = {
ApplicationType.EZSP: FirmwareUpdateEntityDescription(
key="firmware",
key="radio_firmware",
display_precision=0,
device_class=UpdateDeviceClass.FIRMWARE,
entity_category=EntityCategory.CONFIG,
@@ -47,10 +51,10 @@ FIRMWARE_ENTITY_DESCRIPTIONS: dict[
fw_type="yellow_zigbee_ncp",
version_key="ezsp_version",
expected_firmware_type=ApplicationType.EZSP,
firmware_name="EmberZNet",
firmware_name="EmberZNet Zigbee",
),
ApplicationType.SPINEL: FirmwareUpdateEntityDescription(
key="firmware",
key="radio_firmware",
display_precision=0,
device_class=UpdateDeviceClass.FIRMWARE,
entity_category=EntityCategory.CONFIG,
@@ -60,12 +64,34 @@ FIRMWARE_ENTITY_DESCRIPTIONS: dict[
expected_firmware_type=ApplicationType.SPINEL,
firmware_name="OpenThread RCP",
),
None: FirmwareUpdateEntityDescription(
ApplicationType.CPC: FirmwareUpdateEntityDescription(
key="firmware",
display_precision=0,
device_class=UpdateDeviceClass.FIRMWARE,
entity_category=EntityCategory.CONFIG,
version_parser=lambda fw: fw,
fw_type="yellow_multipan",
version_key="cpc_version",
expected_firmware_type=ApplicationType.CPC,
firmware_name="Multiprotocol",
),
ApplicationType.GECKO_BOOTLOADER: FirmwareUpdateEntityDescription(
key="firmware",
display_precision=0,
device_class=UpdateDeviceClass.FIRMWARE,
entity_category=EntityCategory.CONFIG,
version_parser=lambda fw: fw,
fw_type=None, # We don't want to update the bootloader
version_key="gecko_bootloader_version",
expected_firmware_type=ApplicationType.GECKO_BOOTLOADER,
firmware_name="Gecko Bootloader",
),
None: FirmwareUpdateEntityDescription(
key="radio_firmware",
display_precision=0,
device_class=UpdateDeviceClass.FIRMWARE,
entity_category=EntityCategory.CONFIG,
version_parser=lambda fw: fw,
fw_type=None,
version_key=None,
expected_firmware_type=None,
@@ -82,9 +108,16 @@ def _async_create_update_entity(
) -> FirmwareUpdateEntity:
"""Create an update entity that handles firmware type changes."""
firmware_type = config_entry.data[FIRMWARE]
entity_description = FIRMWARE_ENTITY_DESCRIPTIONS[
ApplicationType(firmware_type) if firmware_type is not None else None
]
try:
entity_description = FIRMWARE_ENTITY_DESCRIPTIONS[
ApplicationType(firmware_type)
]
except (KeyError, ValueError):
_LOGGER.debug(
"Unknown firmware type %r, using default entity description", firmware_type
)
entity_description = FIRMWARE_ENTITY_DESCRIPTIONS[None]
entity = FirmwareUpdateEntity(
device=RADIO_DEVICE,
@@ -135,6 +168,7 @@ class FirmwareUpdateEntity(BaseFirmwareUpdateEntity):
"""Yellow firmware update entity."""
bootloader_reset_type = "yellow" # Triggers a GPIO reset
_attr_has_entity_name = True
def __init__(
self,
@@ -145,8 +179,13 @@ class FirmwareUpdateEntity(BaseFirmwareUpdateEntity):
) -> None:
"""Initialize the Yellow firmware update entity."""
super().__init__(device, config_entry, update_coordinator, entity_description)
self._attr_unique_id = self.entity_description.key
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, "yellow")},
name=MODEL,
model=MODEL,
manufacturer=MANUFACTURER,
)
# Use the cached firmware info if it exists
if self._config_entry.data[FIRMWARE] is not None:
@@ -158,6 +197,17 @@ class FirmwareUpdateEntity(BaseFirmwareUpdateEntity):
source="homeassistant_yellow",
)
def _update_attributes(self) -> None:
"""Recompute the attributes of the entity."""
super()._update_attributes()
assert self.device_entry is not None
device_registry = dr.async_get(self.hass)
device_registry.async_update_device(
device_id=self.device_entry.id,
sw_version=f"{self.entity_description.firmware_name} {self._attr_installed_version}",
)
@callback
def _firmware_info_callback(self, firmware_info: FirmwareInfo) -> None:
"""Handle updated firmware info being pushed by an integration."""
@@ -57,7 +57,7 @@
},
"exceptions": {
"invalid_controller_id": {
"message": "Invalid controller ID \"{controller_id}\", expected one of \"{controller_ids}\""
"message": "Invalid controller_id \"{controller_id}\", expected one of \"{controller_ids}\""
}
},
"options": {
@@ -6,6 +6,7 @@ count_omer:
selector:
date:
nusach:
required: true
example: "sfarad"
default: "sfarad"
selector:
@@ -1,7 +1,15 @@
{
"entity_component": {
"_": {
"default": "mdi:lightbulb"
"default": "mdi:lightbulb",
"state_attributes": {
"effect": {
"default": "mdi:circle-medium",
"state": {
"off": "mdi:star-off"
}
}
}
}
},
"services": {
@@ -93,7 +93,10 @@
"name": "Color temperature (Kelvin)"
},
"effect": {
"name": "Effect"
"name": "Effect",
"state": {
"off": "[%key:common::state::off%]"
}
},
"effect_list": {
"name": "Available effects"
@@ -69,7 +69,7 @@ class MatterEventEntity(MatterEntity, EventEntity):
max_presses_supported = self.get_matter_attribute_value(
clusters.Switch.Attributes.MultiPressMax
)
max_presses_supported = min(max_presses_supported or 1, 8)
max_presses_supported = min(max_presses_supported or 2, 8)
for i in range(max_presses_supported):
event_types.append(f"multi_press_{i + 1}") # noqa: PERF401
elif feature_map & SwitchFeature.kMomentarySwitch:
@@ -6,7 +6,7 @@
"description": "[%key:component::bluetooth::config::step::user::description%]",
"data": {
"address": "[%key:common::config_flow::data::device%]",
"medium_type": "Medium Type"
"medium_type": "Medium type"
}
},
"bluetooth_confirm": {
@@ -337,7 +337,7 @@ def validate_sensor_platform_config(
return errors
@dataclass(frozen=True, kw_only=True)
@dataclass(frozen=True)
class PlatformField:
"""Stores a platform config field schema, required flag and validator."""
@@ -372,132 +372,80 @@ def unit_of_measurement_selector(user_data: dict[str, Any | None]) -> Selector:
COMMON_ENTITY_FIELDS = {
CONF_PLATFORM: PlatformField(
selector=SUBENTRY_PLATFORM_SELECTOR,
required=True,
validator=str,
exclude_from_reconfig=True,
),
CONF_NAME: PlatformField(
selector=TEXT_SELECTOR,
required=False,
validator=str,
exclude_from_reconfig=True,
),
CONF_ENTITY_PICTURE: PlatformField(
selector=TEXT_SELECTOR, required=False, validator=cv.url, error="invalid_url"
SUBENTRY_PLATFORM_SELECTOR, True, str, exclude_from_reconfig=True
),
CONF_NAME: PlatformField(TEXT_SELECTOR, False, str, exclude_from_reconfig=True),
CONF_ENTITY_PICTURE: PlatformField(TEXT_SELECTOR, False, cv.url, "invalid_url"),
}
PLATFORM_ENTITY_FIELDS = {
Platform.NOTIFY.value: {},
Platform.SENSOR.value: {
CONF_DEVICE_CLASS: PlatformField(
selector=SENSOR_DEVICE_CLASS_SELECTOR, required=False, validator=str
),
CONF_STATE_CLASS: PlatformField(
selector=SENSOR_STATE_CLASS_SELECTOR, required=False, validator=str
),
CONF_DEVICE_CLASS: PlatformField(SENSOR_DEVICE_CLASS_SELECTOR, False, str),
CONF_STATE_CLASS: PlatformField(SENSOR_STATE_CLASS_SELECTOR, False, str),
CONF_UNIT_OF_MEASUREMENT: PlatformField(
selector=unit_of_measurement_selector,
required=False,
validator=str,
custom_filtering=True,
unit_of_measurement_selector, False, str, custom_filtering=True
),
CONF_SUGGESTED_DISPLAY_PRECISION: PlatformField(
selector=SUGGESTED_DISPLAY_PRECISION_SELECTOR,
required=False,
validator=cv.positive_int,
SUGGESTED_DISPLAY_PRECISION_SELECTOR,
False,
cv.positive_int,
section="advanced_settings",
),
CONF_OPTIONS: PlatformField(
selector=OPTIONS_SELECTOR,
required=False,
validator=cv.ensure_list,
OPTIONS_SELECTOR,
False,
cv.ensure_list,
conditions=({"device_class": "enum"},),
),
},
Platform.SWITCH.value: {
CONF_DEVICE_CLASS: PlatformField(
selector=SWITCH_DEVICE_CLASS_SELECTOR, required=False, validator=str
),
CONF_DEVICE_CLASS: PlatformField(SWITCH_DEVICE_CLASS_SELECTOR, False, str),
},
}
PLATFORM_MQTT_FIELDS = {
Platform.NOTIFY.value: {
CONF_COMMAND_TOPIC: PlatformField(
selector=TEXT_SELECTOR,
required=True,
validator=valid_publish_topic,
error="invalid_publish_topic",
TEXT_SELECTOR, True, valid_publish_topic, "invalid_publish_topic"
),
CONF_COMMAND_TEMPLATE: PlatformField(
selector=TEMPLATE_SELECTOR,
required=False,
validator=cv.template,
error="invalid_template",
),
CONF_RETAIN: PlatformField(
selector=BOOLEAN_SELECTOR, required=False, validator=bool
TEMPLATE_SELECTOR, False, cv.template, "invalid_template"
),
CONF_RETAIN: PlatformField(BOOLEAN_SELECTOR, False, bool),
},
Platform.SENSOR.value: {
CONF_STATE_TOPIC: PlatformField(
selector=TEXT_SELECTOR,
required=True,
validator=valid_subscribe_topic,
error="invalid_subscribe_topic",
TEXT_SELECTOR, True, valid_subscribe_topic, "invalid_subscribe_topic"
),
CONF_VALUE_TEMPLATE: PlatformField(
selector=TEMPLATE_SELECTOR,
required=False,
validator=cv.template,
error="invalid_template",
TEMPLATE_SELECTOR, False, cv.template, "invalid_template"
),
CONF_LAST_RESET_VALUE_TEMPLATE: PlatformField(
selector=TEMPLATE_SELECTOR,
required=False,
validator=cv.template,
error="invalid_template",
TEMPLATE_SELECTOR,
False,
cv.template,
"invalid_template",
conditions=({CONF_STATE_CLASS: "total"},),
),
CONF_EXPIRE_AFTER: PlatformField(
selector=EXPIRE_AFTER_SELECTOR,
required=False,
validator=cv.positive_int,
section="advanced_settings",
EXPIRE_AFTER_SELECTOR, False, cv.positive_int, section="advanced_settings"
),
},
Platform.SWITCH.value: {
CONF_COMMAND_TOPIC: PlatformField(
selector=TEXT_SELECTOR,
required=True,
validator=valid_publish_topic,
error="invalid_publish_topic",
TEXT_SELECTOR, True, valid_publish_topic, "invalid_publish_topic"
),
CONF_COMMAND_TEMPLATE: PlatformField(
selector=TEMPLATE_SELECTOR,
required=False,
validator=cv.template,
error="invalid_template",
TEMPLATE_SELECTOR, False, cv.template, "invalid_template"
),
CONF_STATE_TOPIC: PlatformField(
selector=TEXT_SELECTOR,
required=False,
validator=valid_subscribe_topic,
error="invalid_subscribe_topic",
TEXT_SELECTOR, False, valid_subscribe_topic, "invalid_subscribe_topic"
),
CONF_VALUE_TEMPLATE: PlatformField(
selector=TEMPLATE_SELECTOR,
required=False,
validator=cv.template,
error="invalid_template",
),
CONF_RETAIN: PlatformField(
selector=BOOLEAN_SELECTOR, required=False, validator=bool
),
CONF_OPTIMISTIC: PlatformField(
selector=BOOLEAN_SELECTOR, required=False, validator=bool
TEMPLATE_SELECTOR, False, cv.template, "invalid_template"
),
CONF_RETAIN: PlatformField(BOOLEAN_SELECTOR, False, bool),
CONF_OPTIMISTIC: PlatformField(BOOLEAN_SELECTOR, False, bool),
},
}
ENTITY_CONFIG_VALIDATOR: dict[
@@ -510,24 +458,14 @@ ENTITY_CONFIG_VALIDATOR: dict[
}
MQTT_DEVICE_PLATFORM_FIELDS = {
ATTR_NAME: PlatformField(selector=TEXT_SELECTOR, required=False, validator=str),
ATTR_SW_VERSION: PlatformField(
selector=TEXT_SELECTOR, required=False, validator=str
),
ATTR_HW_VERSION: PlatformField(
selector=TEXT_SELECTOR, required=False, validator=str
),
ATTR_MODEL: PlatformField(selector=TEXT_SELECTOR, required=False, validator=str),
ATTR_MODEL_ID: PlatformField(selector=TEXT_SELECTOR, required=False, validator=str),
ATTR_CONFIGURATION_URL: PlatformField(
selector=TEXT_SELECTOR, required=False, validator=cv.url, error="invalid_url"
),
ATTR_NAME: PlatformField(TEXT_SELECTOR, False, str),
ATTR_SW_VERSION: PlatformField(TEXT_SELECTOR, False, str),
ATTR_HW_VERSION: PlatformField(TEXT_SELECTOR, False, str),
ATTR_MODEL: PlatformField(TEXT_SELECTOR, False, str),
ATTR_MODEL_ID: PlatformField(TEXT_SELECTOR, False, str),
ATTR_CONFIGURATION_URL: PlatformField(TEXT_SELECTOR, False, cv.url, "invalid_url"),
CONF_QOS: PlatformField(
selector=QOS_SELECTOR,
required=False,
validator=int,
default=DEFAULT_QOS,
section="mqtt_settings",
QOS_SELECTOR, False, int, default=DEFAULT_QOS, section="mqtt_settings"
),
}
@@ -219,10 +219,10 @@
"options": "Add option"
},
"data_description": {
"device_class": "The device class of the {platform} entity. [Learn more.]({url}#device_class)",
"state_class": "The [state_class](https://developers.home-assistant.io/docs/core/entity/sensor/#available-state-classes) of the sensor. [Learn more.]({url}#state_class)",
"device_class": "The Device class of the {platform} entity. [Learn more.]({url}#device_class)",
"state_class": "The [State class](https://developers.home-assistant.io/docs/core/entity/sensor/#available-state-classes) of the sensor. [Learn more.]({url}#state_class)",
"unit_of_measurement": "Defines the unit of measurement of the sensor, if any.",
"options": "Options for allowed sensor state values. The sensors device_class must be set to Enumeration. The options option cannot be used together with State Class or Unit of measurement."
"options": "Options for allowed sensor state values. The sensors Device class must be set to Enumeration. The 'Options' setting cannot be used together with State class or Unit of measurement."
},
"sections": {
"advanced_settings": {
@@ -285,9 +285,9 @@
"invalid_uom": "The unit of measurement \"{unit_of_measurement}\" is not supported by the selected device class, please either remove the device class, select a device class which supports \"{unit_of_measurement}\", or pick a supported unit of measurement from the list",
"invalid_url": "Invalid URL",
"options_not_allowed_with_state_class_or_uom": "The 'Options' setting is not allowed when state class or unit of measurement are used",
"options_device_class_enum": "The 'Options' setting must be used with the Enumeration device class'. If you continue, the existing options will be reset",
"options_device_class_enum": "The 'Options' setting must be used with the Enumeration device class. If you continue, the existing options will be reset",
"options_with_enum_device_class": "Configure options for the enumeration sensor",
"uom_required_for_device_class": "The selected device device class requires a unit"
"uom_required_for_device_class": "The selected device class requires a unit"
}
}
},
@@ -453,7 +453,7 @@
"temperature": "[%key:component::sensor::entity_component::temperature::name%]",
"timestamp": "[%key:component::sensor::entity_component::timestamp::name%]",
"volatile_organic_compounds": "[%key:component::sensor::entity_component::volatile_organic_compounds::name%]",
"volatile_organic_compounds_parts": "[%key:component::sensor::entity_component::volatile_organic_compounds::name%]",
"volatile_organic_compounds_parts": "[%key:component::sensor::entity_component::volatile_organic_compounds_parts::name%]",
"voltage": "[%key:component::sensor::entity_component::voltage::name%]",
"volume": "[%key:component::sensor::entity_component::volume::name%]",
"volume_flow_rate": "[%key:component::sensor::entity_component::volume_flow_rate::name%]",
@@ -7,6 +7,6 @@
"documentation": "https://www.home-assistant.io/integrations/music_assistant",
"iot_class": "local_push",
"loggers": ["music_assistant"],
"requirements": ["music-assistant-client==1.1.1"],
"requirements": ["music-assistant-client==1.2.0"],
"zeroconf": ["_mass._tcp.local."]
}
@@ -94,6 +94,12 @@ SUPPORTED_FEATURES_BASE = (
| MediaPlayerEntityFeature.MEDIA_ENQUEUE
| MediaPlayerEntityFeature.MEDIA_ANNOUNCE
| MediaPlayerEntityFeature.SEEK
# we always add pause support,
# regardless if the underlying player actually natively supports pause
# because the MA behavior is to internally handle pause with stop
# (and a resume position) and we'd like to keep the UX consistent
# background info: https://github.com/home-assistant/core/issues/140118
| MediaPlayerEntityFeature.PAUSE
)
QUEUE_OPTION_MAP = {
@@ -697,8 +703,6 @@ class MusicAssistantPlayer(MusicAssistantEntity, MediaPlayerEntity):
supported_features = SUPPORTED_FEATURES_BASE
if PlayerFeature.SET_MEMBERS in self.player.supported_features:
supported_features |= MediaPlayerEntityFeature.GROUPING
if PlayerFeature.PAUSE in self.player.supported_features:
supported_features |= MediaPlayerEntityFeature.PAUSE
if self.player.mute_control != PLAYER_CONTROL_NONE:
supported_features |= MediaPlayerEntityFeature.VOLUME_MUTE
if self.player.volume_control != PLAYER_CONTROL_NONE:
@@ -360,7 +360,7 @@ class NMBSSensor(SensorEntity):
attrs[ATTR_LONGITUDE] = self.station_coordinates[1]
if self.is_via_connection and not self._excl_vias:
via = self._attrs.vias.via[0]
via = self._attrs.vias[0]
attrs["via"] = via.station
attrs["via_arrival_platform"] = via.arrival.platform
@@ -104,6 +104,15 @@ def _resize_image(image, opts):
new_width = opts.max_width
(old_width, old_height) = img.size
old_size = len(image)
# If no max_width specified, only apply quality changes if requested
if new_width is None:
if opts.quality is None:
return image
imgbuf = io.BytesIO()
img.save(imgbuf, "JPEG", optimize=True, quality=quality)
return imgbuf.getvalue()
if old_width <= new_width:
if opts.quality is None:
_LOGGER.debug("Image is smaller-than/equal-to requested width")
@@ -27,19 +27,19 @@
"entity": {
"sensor": {
"energy_consumption": {
"name": "Energy consumed"
"name": "Energy consumption"
},
"energy_generation": {
"name": "Energy generated"
"name": "Energy generation"
},
"efficiency": {
"name": "Efficiency"
},
"power_consumption": {
"name": "Power consumed"
"name": "Power consumption"
},
"power_generation": {
"name": "Power generated"
"name": "Power generation"
}
}
}
@@ -143,6 +143,7 @@ class RoborockFlowHandler(ConfigFlow, domain=DOMAIN):
self, discovery_info: DhcpServiceInfo
) -> ConfigFlowResult:
"""Handle a flow started by a dhcp discovery."""
await self._async_handle_discovery_without_unique_id()
device_registry = dr.async_get(self.hass)
device = device_registry.async_get_device(
connections={
@@ -47,7 +47,7 @@
"name": "Supports AirPlay"
},
"supports_ethernet": {
"name": "Supports Ethernet"
"name": "Supports ethernet"
},
"supports_find_remote": {
"name": "Supports find remote"
@@ -6,5 +6,5 @@
"iot_class": "cloud_push",
"loggers": ["boto3", "botocore", "s3transfer"],
"quality_scale": "legacy",
"requirements": ["boto3==1.37.1"]
"requirements": ["boto3==1.34.131"]
}
@@ -278,10 +278,10 @@
"name": "Timestamp"
},
"volatile_organic_compounds": {
"name": "VOCs"
"name": "Volatile organic compounds"
},
"volatile_organic_compounds_parts": {
"name": "[%key:component::sensor::entity_component::volatile_organic_compounds::name%]"
"name": "Volatile organic compounds parts"
},
"voltage": {
"name": "Voltage"
@@ -4,5 +4,5 @@
"codeowners": ["@fabaff"],
"documentation": "https://www.home-assistant.io/integrations/serial",
"iot_class": "local_polling",
"requirements": ["pyserial-asyncio-fast==0.16"]
"requirements": ["pyserial-asyncio-fast==0.14"]
}
@@ -352,7 +352,10 @@ async def async_migrate_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
return {
"new_unique_id": f"{device_id}_{MAIN}_{Capability.THREE_AXIS}_{Attribute.THREE_AXIS}_{new_attribute}",
}
if attribute == Attribute.MACHINE_STATE:
if attribute in {
Attribute.MACHINE_STATE,
Attribute.COMPLETION_TIME,
}:
capability = determine_machine_type(
hass, entry.entry_id, device_id
)
@@ -423,7 +426,7 @@ def create_devices(
kwargs[ATTR_CONNECTIONS] = {
(dr.CONNECTION_NETWORK_MAC, device.device.hub.mac_address)
}
if device.device.parent_device_id:
if device.device.parent_device_id and device.device.parent_device_id in devices:
kwargs[ATTR_VIA_DEVICE] = (DOMAIN, device.device.parent_device_id)
if (ocf := device.device.ocf) is not None:
kwargs.update(
@@ -58,5 +58,6 @@ class SmartThingsButtonEvent(SmartThingsEntity, EventEntity):
)
def _update_handler(self, event: DeviceEvent) -> None:
self._trigger_event(cast(str, event.value))
self.async_write_ha_state()
if event.attribute is Attribute.BUTTON:
self._trigger_event(cast(str, event.value))
super()._update_handler(event)
@@ -331,7 +331,6 @@ CAPABILITY_TO_SENSORS: dict[
translation_key="dryer_machine_state",
options=WASHER_OPTIONS,
device_class=SensorDeviceClass.ENUM,
deprecated=lambda _: "machine_state",
)
],
Attribute.DRYER_JOB_STATE: [
@@ -966,7 +965,6 @@ CAPABILITY_TO_SENSORS: dict[
translation_key="washer_machine_state",
options=WASHER_OPTIONS,
device_class=SensorDeviceClass.ENUM,
deprecated=lambda _: "machine_state",
)
],
Attribute.WASHER_JOB_STATE: [
@@ -487,10 +487,6 @@
"title": "Deprecated refrigerator door binary sensor detected in some automations or scripts",
"description": "The refrigerator door binary sensor `{entity}` is deprecated and is used in the following automations or scripts:\n{items}\n\nSeparate entities for cooler and freezer door are available and should be used going forward. Please use them in the above automations or scripts to fix this issue."
},
"deprecated_machine_state": {
"title": "Deprecated machine state sensor detected in some automations or scripts",
"description": "The machine state sensor `{entity}` is deprecated and is used in the following automations or scripts:\n{items}\n\nA select entity is now available for the machine state and should be used going forward. Please use the new select entity in the above automations or scripts to fix this issue."
},
"deprecated_switch_appliance": {
"title": "Deprecated switch detected in some automations or scripts",
"description": "The switch `{entity}` is deprecated because the actions did not work, so it has been replaced with a binary sensor instead.\n\nThe switch was used in the following automations or scripts:\n{items}\n\nPlease use the new binary sensor in the above automations or scripts to fix this issue."
@@ -31,6 +31,7 @@ async def async_setup_entry(
"power",
"status_requested",
"sticky_white_noise_updated",
"config_change",
],
),
)
@@ -7,5 +7,5 @@
"iot_class": "cloud_push",
"loggers": ["snoo"],
"quality_scale": "bronze",
"requirements": ["python-snoo==0.6.4"]
"requirements": ["python-snoo==0.6.5"]
}
@@ -55,7 +55,8 @@
"activity": "Activity press",
"power": "Power button pressed",
"status_requested": "Status requested",
"sticky_white_noise_updated": "Sleepytime sounds updated"
"sticky_white_noise_updated": "Sleepytime sounds updated",
"config_change": "Config changed"
}
}
}
@@ -477,11 +477,9 @@ class TadoClimate(TadoZoneEntity, ClimateEntity):
@property
def target_temperature(self) -> float | None:
"""Return the temperature we try to reach."""
# If the target temperature will be None
# if the device is performing an action
# that does not affect the temperature or
# the device is switching states
return self._tado_zone_data.target_temp or self._tado_zone_data.current_temp
if self._current_tado_hvac_mode == CONST_MODE_OFF:
return TADO_DEFAULT_MIN_TEMP
return self._tado_zone_data.target_temp
async def set_timer(
self,
@@ -22,10 +22,7 @@ from homeassistant.config_entries import (
)
from homeassistant.core import callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.service_info.zeroconf import (
ATTR_PROPERTIES_ID,
ZeroconfServiceInfo,
)
from homeassistant.helpers.service_info.zeroconf import ZeroconfServiceInfo
from .const import (
CONF_FALLBACK,
@@ -164,12 +161,16 @@ class TadoConfigFlow(ConfigFlow, domain=DOMAIN):
self, discovery_info: ZeroconfServiceInfo
) -> ConfigFlowResult:
"""Handle HomeKit discovery."""
self._async_abort_entries_match()
properties = {
key.lower(): value for key, value in discovery_info.properties.items()
}
await self.async_set_unique_id(properties[ATTR_PROPERTIES_ID])
self._abort_if_unique_id_configured()
await self._async_handle_discovery_without_unique_id()
return await self.async_step_homekit_confirm()
async def async_step_homekit_confirm(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Prepare for Homekit."""
if user_input is None:
return self.async_show_form(step_id="homekit_confirm")
return await self.async_step_user()
@staticmethod
@@ -16,6 +16,10 @@
"title": "Authenticate with Tado",
"description": "You need to reauthenticate with Tado. Press `Submit` to start the authentication process."
},
"homekit": {
"title": "Authenticate with Tado",
"description": "Your device has been discovered and needs to authenticate with Tado. Press `Submit` to start the authentication process."
},
"timeout": {
"description": "The authentication process timed out. Please try again."
}
@@ -41,7 +41,7 @@ ENTITIES: tuple[TedeeBinarySensorEntityDescription, ...] = (
TedeeBinarySensorEntityDescription(
key="semi_locked",
translation_key="semi_locked",
is_on_fn=lambda lock: lock.state == TedeeLockState.HALF_OPEN,
is_on_fn=lambda lock: lock.state is TedeeLockState.HALF_OPEN,
entity_category=EntityCategory.DIAGNOSTIC,
),
TedeeBinarySensorEntityDescription(
@@ -53,7 +53,10 @@ ENTITIES: tuple[TedeeBinarySensorEntityDescription, ...] = (
TedeeBinarySensorEntityDescription(
key="uncalibrated",
translation_key="uncalibrated",
is_on_fn=lambda lock: lock.state == TedeeLockState.UNCALIBRATED,
is_on_fn=(
lambda lock: lock.state is TedeeLockState.UNCALIBRATED
or lock.state is TedeeLockState.UNKNOWN
),
device_class=BinarySensorDeviceClass.PROBLEM,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
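The Tedee change above swaps `==` for `is` when comparing lock states. This works because Python enum members are singletons, so identity comparison is safe and strictly avoids matching an equal-valued plain string. A minimal sketch with a hypothetical `LockState` enum (not the actual Tedee library type):

```python
from enum import Enum

class LockState(Enum):
    LOCKED = "locked"
    HALF_OPEN = "half_open"
    UNKNOWN = "unknown"

def is_semi_locked(state: LockState) -> bool:
    # Enum members are singletons, so identity comparison works and
    # cannot accidentally match a plain string with the same value.
    return state is LockState.HALF_OPEN

assert is_semi_locked(LockState.HALF_OPEN)
assert not is_semi_locked(LockState.LOCKED)

raw = "half_open"
assert raw is not LockState.HALF_OPEN  # a bare string never passes
```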
@@ -120,7 +120,7 @@ def rewrite_legacy_to_modern_conf(
return switches
def rewrite_options_to_moder_conf(option_config: dict[str, dict]) -> dict[str, dict]:
def rewrite_options_to_modern_conf(option_config: dict[str, dict]) -> dict[str, dict]:
"""Rewrite option configuration to modern configuration."""
option_config = {**option_config}
@@ -189,7 +189,7 @@ async def async_setup_entry(
"""Initialize config entry."""
_options = dict(config_entry.options)
_options.pop("template_type")
_options = rewrite_options_to_moder_conf(_options)
_options = rewrite_options_to_modern_conf(_options)
validated_config = SWITCH_CONFIG_SCHEMA(_options)
async_add_entities([SwitchTemplate(hass, validated_config, config_entry.entry_id)])
@@ -199,7 +199,8 @@ def async_create_preview_switch(
hass: HomeAssistant, name: str, config: dict[str, Any]
) -> SwitchTemplate:
"""Create a preview switch."""
validated_config = SWITCH_CONFIG_SCHEMA(config | {CONF_NAME: name})
updated_config = rewrite_options_to_modern_conf(config)
validated_config = SWITCH_CONFIG_SCHEMA(updated_config | {CONF_NAME: name})
return SwitchTemplate(hass, validated_config, None)
@@ -7,7 +7,7 @@ from dataclasses import dataclass
from itertools import chain
from typing import Any
from tesla_fleet_api.const import AutoSeat, Scope, Seat
from tesla_fleet_api.const import Scope, Seat
from homeassistant.components.switch import (
SwitchDeviceClass,
@@ -46,9 +46,7 @@ VEHICLE_DESCRIPTIONS: tuple[TeslaFleetSwitchEntityDescription, ...] = (
),
TeslaFleetSwitchEntityDescription(
key="climate_state_auto_seat_climate_left",
on_func=lambda api: api.remote_auto_seat_climate_request(
AutoSeat.FRONT_LEFT, True
),
on_func=lambda api: api.remote_auto_seat_climate_request(Seat.FRONT_LEFT, True),
off_func=lambda api: api.remote_auto_seat_climate_request(
Seat.FRONT_LEFT, False
),
@@ -57,10 +55,10 @@ VEHICLE_DESCRIPTIONS: tuple[TeslaFleetSwitchEntityDescription, ...] = (
TeslaFleetSwitchEntityDescription(
key="climate_state_auto_seat_climate_right",
on_func=lambda api: api.remote_auto_seat_climate_request(
AutoSeat.FRONT_RIGHT, True
Seat.FRONT_RIGHT, True
),
off_func=lambda api: api.remote_auto_seat_climate_request(
AutoSeat.FRONT_RIGHT, False
Seat.FRONT_RIGHT, False
),
scopes=[Scope.VEHICLE_CMDS],
),
@@ -7,7 +7,7 @@ from dataclasses import dataclass
from itertools import chain
from typing import Any
from tesla_fleet_api.const import AutoSeat, Scope
from tesla_fleet_api.const import Scope
from teslemetry_stream import TeslemetryStreamVehicle
from homeassistant.components.switch import (
@@ -62,23 +62,15 @@ VEHICLE_DESCRIPTIONS: tuple[TeslemetrySwitchEntityDescription, ...] = (
TeslemetrySwitchEntityDescription(
key="climate_state_auto_seat_climate_left",
streaming_listener=lambda x, y: x.listen_AutoSeatClimateLeft(y),
on_func=lambda api: api.remote_auto_seat_climate_request(
AutoSeat.FRONT_LEFT, True
),
off_func=lambda api: api.remote_auto_seat_climate_request(
AutoSeat.FRONT_LEFT, False
),
on_func=lambda api: api.remote_auto_seat_climate_request(1, True),
off_func=lambda api: api.remote_auto_seat_climate_request(1, False),
scopes=[Scope.VEHICLE_CMDS],
),
TeslemetrySwitchEntityDescription(
key="climate_state_auto_seat_climate_right",
streaming_listener=lambda x, y: x.listen_AutoSeatClimateRight(y),
on_func=lambda api: api.remote_auto_seat_climate_request(
AutoSeat.FRONT_RIGHT, True
),
off_func=lambda api: api.remote_auto_seat_climate_request(
AutoSeat.FRONT_RIGHT, False
),
on_func=lambda api: api.remote_auto_seat_climate_request(2, True),
off_func=lambda api: api.remote_auto_seat_climate_request(2, False),
scopes=[Scope.VEHICLE_CMDS],
),
TeslemetrySwitchEntityDescription(
@@ -454,6 +454,37 @@ SENSORS: dict[str, tuple[TuyaSensorEntityDescription, ...]] = {
state_class=SensorStateClass.MEASUREMENT,
entity_registry_enabled_default=False,
),
TuyaSensorEntityDescription(
key=DPCode.VA_TEMPERATURE,
translation_key="temperature",
device_class=SensorDeviceClass.TEMPERATURE,
state_class=SensorStateClass.MEASUREMENT,
),
TuyaSensorEntityDescription(
key=DPCode.TEMP_CURRENT,
translation_key="temperature",
device_class=SensorDeviceClass.TEMPERATURE,
state_class=SensorStateClass.MEASUREMENT,
),
TuyaSensorEntityDescription(
key=DPCode.VA_HUMIDITY,
translation_key="humidity",
device_class=SensorDeviceClass.HUMIDITY,
state_class=SensorStateClass.MEASUREMENT,
),
TuyaSensorEntityDescription(
key=DPCode.HUMIDITY_VALUE,
translation_key="humidity",
device_class=SensorDeviceClass.HUMIDITY,
state_class=SensorStateClass.MEASUREMENT,
),
TuyaSensorEntityDescription(
key=DPCode.BRIGHT_VALUE,
translation_key="illuminance",
device_class=SensorDeviceClass.ILLUMINANCE,
state_class=SensorStateClass.MEASUREMENT,
),
*BATTERY_SENSORS,
),
# Luminance Sensor
# https://developer.tuya.com/en/docs/iot/categoryldcg?id=Kaiuz3n7u69l8
@@ -8,5 +8,5 @@
"iot_class": "cloud_polling",
"loggers": ["aiowebdav2"],
"quality_scale": "bronze",
"requirements": ["aiowebdav2==0.4.2"]
"requirements": ["aiowebdav2==0.4.4"]
}
@@ -21,7 +21,7 @@ type AsyncWebSocketCommandHandler = Callable[
DOMAIN: Final = "websocket_api"
URL: Final = "/api/websocket"
PENDING_MSG_PEAK: Final = 1024
PENDING_MSG_PEAK_TIME: Final = 5
PENDING_MSG_PEAK_TIME: Final = 10
# Maximum number of messages that can be pending at any given time.
# This is effectively the upper limit of the number of entities
# that can fire state changes within ~1 second.
@@ -2,13 +2,13 @@
"title": "Workday",
"config": {
"abort": {
"already_configured": "Workday has already been setup with chosen configuration"
"already_configured": "Workday has already been set up with chosen configuration"
},
"step": {
"user": {
"data": {
"name": "[%key:common::config_flow::data::name%]",
"country": "Country"
"country": "[%key:common::config_flow::data::country%]"
}
},
"options": {
@@ -18,7 +18,7 @@
"days_offset": "Offset",
"workdays": "Days to include",
"add_holidays": "Add holidays",
"remove_holidays": "Remove Holidays",
"remove_holidays": "Remove holidays",
"province": "Subdivision of country",
"language": "Language for named holidays",
"category": "Additional category as holiday"
@@ -116,14 +116,14 @@
},
"issues": {
"bad_country": {
"title": "Configured Country for {title} does not exist",
"title": "Configured country for {title} does not exist",
"fix_flow": {
"step": {
"country": {
"title": "Select country for {title}",
"description": "Select a country to use for your Workday sensor.",
"data": {
"country": "[%key:component::workday::config::step::user::data::country%]"
"country": "[%key:common::config_flow::data::country%]"
}
},
"province": {
@@ -133,7 +133,7 @@
"province": "[%key:component::workday::config::step::options::data::province%]"
},
"data_description": {
"province": "State, Territory, Province, Region of Country"
"province": "[%key:component::workday::config::step::options::data_description::province%]"
}
}
}
@@ -150,7 +150,7 @@
"province": "[%key:component::workday::config::step::options::data::province%]"
},
"data_description": {
"province": "[%key:component::workday::issues::bad_country::fix_flow::step::province::data_description::province%]"
"province": "[%key:component::workday::config::step::options::data_description::province%]"
}
}
}
@@ -217,7 +217,7 @@
"services": {
"check_date": {
"name": "Check date",
"description": "Check if date is workday.",
"description": "Checks if a given date is a workday.",
"fields": {
"check_date": {
"name": "Date",
@@ -145,8 +145,6 @@ def _async_get_instance(hass: HomeAssistant) -> HaAsyncZeroconf:
if DOMAIN in hass.data:
return cast(HaAsyncZeroconf, hass.data[DOMAIN])
logging.getLogger("zeroconf").setLevel(logging.NOTSET)
zeroconf = HaZeroconf(**_async_get_zc_args(hass))
aio_zc = HaAsyncZeroconf(zc=zeroconf)
@@ -24,8 +24,8 @@ if TYPE_CHECKING:
APPLICATION_NAME: Final = "HomeAssistant"
MAJOR_VERSION: Final = 2025
MINOR_VERSION: Final = 5
PATCH_VERSION: Final = "0.dev0"
MINOR_VERSION: Final = 4
PATCH_VERSION: Final = "0b9"
__short_version__: Final = f"{MAJOR_VERSION}.{MINOR_VERSION}"
__version__: Final = f"{__short_version__}.{PATCH_VERSION}"
REQUIRED_PYTHON_VER: Final[tuple[int, int, int]] = (3, 13, 0)
@@ -759,17 +759,28 @@
"config_flow": true,
"iot_class": "local_push"
},
"bosch_alarm": {
"name": "Bosch Alarm",
"integration_type": "device",
"config_flow": true,
"iot_class": "local_push"
},
"bosch_shc": {
"name": "Bosch SHC",
"integration_type": "hub",
"config_flow": true,
"iot_class": "local_push"
"bosch": {
"name": "Bosch",
"integrations": {
"bosch_alarm": {
"integration_type": "device",
"config_flow": true,
"iot_class": "local_push",
"name": "Bosch Alarm"
},
"bosch_shc": {
"integration_type": "hub",
"config_flow": true,
"iot_class": "local_push",
"name": "Bosch SHC"
},
"home_connect": {
"integration_type": "hub",
"config_flow": true,
"iot_class": "cloud_push",
"name": "Home Connect"
}
}
},
"brandt": {
"name": "Brandt Smart Control",
@@ -2639,13 +2650,6 @@
"config_flow": true,
"iot_class": "local_polling"
},
"home_connect": {
"name": "Home Connect",
"integration_type": "hub",
"config_flow": true,
"iot_class": "cloud_push",
"single_config_entry": true
},
"home_plus_control": {
"name": "Legrand Home+ Control",
"integration_type": "virtual",
@@ -1311,7 +1311,7 @@ class _QueuedScriptRun(_ScriptRun):
lock_acquired = False
async def async_run(self) -> None:
async def async_run(self) -> ScriptRunResult | None:
"""Run script."""
# Wait for previous run, if any, to finish by attempting to acquire the script's
# shared lock. At the same time monitor if we've been told to stop.
@@ -1325,7 +1325,7 @@ class _QueuedScriptRun(_ScriptRun):
self.lock_acquired = True
# We've acquired the lock so we can go ahead and start the run.
await super().async_run()
return await super().async_run()
def _finish(self) -> None:
if self.lock_acquired:
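The `_QueuedScriptRun` hunk above changes `await super().async_run()` to `return await super().async_run()`, so a queued run now propagates the `ScriptRunResult` instead of discarding it. A minimal sketch of the bug shape, with stand-in classes rather than the real Home Assistant script machinery:

```python
import asyncio


class ScriptRun:
    """Stand-in for the base class, whose async_run returns a result."""

    async def async_run(self):
        return "script-result"  # stand-in for a ScriptRunResult


class QueuedScriptRun(ScriptRun):
    """Stand-in for the queued subclass."""

    lock_acquired = False

    async def async_run(self):
        # Before the fix, the override awaited super().async_run() but
        # dropped its return value, so callers always received None.
        self.lock_acquired = True
        return await super().async_run()


result = asyncio.run(QueuedScriptRun().async_run())
assert result == "script-result"
```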
@@ -38,8 +38,8 @@ habluetooth==3.37.0
hass-nabucasa==0.94.0
hassil==2.2.3
home-assistant-bluetooth==1.13.1
home-assistant-frontend==20250326.0
home-assistant-intents==2025.3.24
home-assistant-frontend==20250328.0
home-assistant-intents==2025.3.28
httpx==0.28.1
ifaddr==0.2.0
Jinja2==3.1.6
@@ -47,6 +47,7 @@
"access_token": "Access token",
"api_key": "API key",
"api_token": "API token",
"country": "Country",
"device": "Device",
"elevation": "Elevation",
"email": "Email",
File diff suppressed because it is too large.
@@ -210,7 +210,7 @@ aioazuredevops==2.2.1
aiobafi6==0.9.0
# homeassistant.components.aws
aiobotocore==2.21.1
aiobotocore==2.13.1
# homeassistant.components.comelit
aiocomelit==0.11.3
@@ -225,7 +225,7 @@ aiodiscover==2.6.1
aiodns==3.2.0
# homeassistant.components.duke_energy
aiodukeenergy==0.2.2
aiodukeenergy==0.3.0
# homeassistant.components.eafm
aioeafm==0.1.2
@@ -422,7 +422,7 @@ aiowaqi==3.1.0
aiowatttime==0.1.1
# homeassistant.components.webdav
aiowebdav2==0.4.2
aiowebdav2==0.4.4
# homeassistant.components.webostv
aiowebostv==0.7.3
@@ -652,10 +652,10 @@ boschshcpy==0.2.91
# homeassistant.components.amazon_polly
# homeassistant.components.route53
boto3==1.37.1
boto3==1.34.131
# homeassistant.components.aws
botocore==1.37.1
botocore==1.34.131
# homeassistant.components.bring
bring-api==1.1.0
@@ -1157,10 +1157,10 @@ hole==0.8.0
holidays==0.69
# homeassistant.components.frontend
home-assistant-frontend==20250326.0
home-assistant-frontend==20250328.0
# homeassistant.components.conversation
home-assistant-intents==2025.3.24
home-assistant-intents==2025.3.28
# homeassistant.components.homematicip_cloud
homematicip==1.1.7
@@ -1453,7 +1453,7 @@ mozart-api==4.1.1.116.4
mullvad-api==1.0.0
# homeassistant.components.music_assistant
music-assistant-client==1.1.1
music-assistant-client==1.2.0
# homeassistant.components.tts
mutagen==1.47.0
@@ -2289,7 +2289,7 @@ pyschlage==2024.11.0
pysensibo==1.1.0
# homeassistant.components.serial
pyserial-asyncio-fast==0.16
pyserial-asyncio-fast==0.14
# homeassistant.components.acer_projector
# homeassistant.components.crownstone
@@ -2476,7 +2476,7 @@ python-roborock==2.16.1
python-smarttub==0.0.39
# homeassistant.components.snoo
python-snoo==0.6.4
python-snoo==0.6.5
# homeassistant.components.songpal
python-songpal==0.16.2
@@ -198,7 +198,7 @@ aioazuredevops==2.2.1
aiobafi6==0.9.0
# homeassistant.components.aws
aiobotocore==2.21.1
aiobotocore==2.13.1
# homeassistant.components.comelit
aiocomelit==0.11.3
@@ -213,7 +213,7 @@ aiodiscover==2.6.1
aiodns==3.2.0
# homeassistant.components.duke_energy
aiodukeenergy==0.2.2
aiodukeenergy==0.3.0
# homeassistant.components.eafm
aioeafm==0.1.2
@@ -404,7 +404,7 @@ aiowaqi==3.1.0
aiowatttime==0.1.1
# homeassistant.components.webdav
aiowebdav2==0.4.2
aiowebdav2==0.4.4
# homeassistant.components.webostv
aiowebostv==0.7.3
@@ -576,7 +576,7 @@ bosch-alarm-mode2==0.4.3
boschshcpy==0.2.91
# homeassistant.components.aws
botocore==1.37.1
botocore==1.34.131
# homeassistant.components.bring
bring-api==1.1.0
@@ -984,10 +984,10 @@ hole==0.8.0
holidays==0.69
# homeassistant.components.frontend
home-assistant-frontend==20250326.0
home-assistant-frontend==20250328.0
# homeassistant.components.conversation
home-assistant-intents==2025.3.24
home-assistant-intents==2025.3.28
# homeassistant.components.homematicip_cloud
homematicip==1.1.7
@@ -1223,7 +1223,7 @@ mozart-api==4.1.1.116.4
mullvad-api==1.0.0
# homeassistant.components.music_assistant
music-assistant-client==1.1.1
music-assistant-client==1.2.0
# homeassistant.components.tts
mutagen==1.47.0
@@ -2007,7 +2007,7 @@ python-roborock==2.16.1
python-smarttub==0.0.39
# homeassistant.components.snoo
python-snoo==0.6.4
python-snoo==0.6.5
# homeassistant.components.songpal
python-songpal==0.16.2
@@ -25,7 +25,7 @@ RUN --mount=from=ghcr.io/astral-sh/uv:0.6.10,source=/uv,target=/bin/uv \
-c /usr/src/homeassistant/homeassistant/package_constraints.txt \
-r /usr/src/homeassistant/requirements.txt \
stdlib-list==0.10.0 pipdeptree==2.25.1 tqdm==4.67.1 ruff==0.11.0 \
PyTurboJPEG==1.7.5 go2rtc-client==0.1.2 ha-ffmpeg==3.2.2 hassil==2.2.3 home-assistant-intents==2025.3.24 mutagen==1.47.0 pymicro-vad==1.0.1 pyspeex-noise==1.0.2
PyTurboJPEG==1.7.5 go2rtc-client==0.1.2 ha-ffmpeg==3.2.2 hassil==2.2.3 home-assistant-intents==2025.3.28 mutagen==1.47.0 pymicro-vad==1.0.1 pyspeex-noise==1.0.2
LABEL "name"="hassfest"
LABEL "maintainer"="Home Assistant <hello@home-assistant.io>"
@@ -1,61 +0,0 @@
#!/usr/bin/env python3
"""Helper script to merge all pytest execution time reports into one file."""
from __future__ import annotations
import argparse
import pathlib
from homeassistant.helpers.json import save_json
from homeassistant.util.json import load_json_object
def merge_json_files(pattern: str, output_file: str) -> None:
"""Merge JSON files matching the pattern into a single JSON file."""
# Needs to be in sync with PytestExecutionTimeReport in conftest.py
result: dict[str, float] = {}
for file in pathlib.Path().glob(pattern):
print(f"Processing {file}")
data = load_json_object(file)
if not isinstance(data, dict):
print(f"Skipping {file} due to invalid data format.")
continue
for key, value in data.items():
if not isinstance(value, (int, float)):
print(
f"Skipping {key} in {file} due to invalid value type: {type(value)}."
)
continue
if key in result:
result[key] += value
else:
result[key] = value
# Write the merged data to the output file
save_json(output_file, result)
def main() -> None:
"""Execute script."""
parser = argparse.ArgumentParser(
description="Merge all pytest execution time reports into one file."
)
parser.add_argument(
"pattern",
help="Glob pattern to match JSON pytest execution time report files",
type=str,
)
parser.add_argument(
"output_file",
help="Path to the output file",
type=str,
nargs="?",
default="pytest-execution-time-report.json",
)
arguments = parser.parse_args()
merge_json_files(arguments.pattern, arguments.output_file)
if __name__ == "__main__":
main()
@@ -5,11 +5,11 @@ from __future__ import annotations
import argparse
from dataclasses import dataclass, field
from datetime import timedelta
from math import ceil
from pathlib import Path
from typing import Final, cast
from homeassistant.util.json import load_json_object
import subprocess
import sys
from typing import Final
class Bucket:
@@ -19,15 +19,13 @@ class Bucket:
self,
):
"""Initialize bucket."""
self.approx_execution_time = timedelta(seconds=0)
self.not_measured_files = 0
self.total_tests = 0
self._paths: list[str] = []
def add(self, part: TestFolder | TestFile) -> None:
"""Add tests to bucket."""
part.add_to_bucket()
self.approx_execution_time += part.approx_execution_time
self.not_measured_files += part.not_measured_files
self.total_tests += part.total_tests
self._paths.append(str(part.path))
def get_paths_line(self) -> str:
@@ -35,132 +33,64 @@ class Bucket:
return " ".join(self._paths) + "\n"
def add_not_measured_files(
test: TestFolder | TestFile, not_measured_files: set[TestFile]
) -> None:
"""Add not measured files to test folder."""
if test.not_measured_files > 0:
if isinstance(test, TestFolder):
for child in test.children.values():
add_not_measured_files(child, not_measured_files)
else:
not_measured_files.add(test)
def sort_by_not_measured(bucket: Bucket) -> tuple[int, float]:
"""Sort by not measured files."""
return (bucket.not_measured_files, bucket.approx_execution_time.total_seconds())
def sort_by_execution_time(bucket: Bucket) -> tuple[float, int]:
"""Sort by execution time."""
return (bucket.approx_execution_time.total_seconds(), bucket.not_measured_files)
class BucketHolder:
"""Class to hold buckets."""
def __init__(self, bucket_count: int) -> None:
def __init__(self, tests_per_bucket: int, bucket_count: int) -> None:
"""Initialize bucket holder."""
self._tests_per_bucket = tests_per_bucket
self._bucket_count = bucket_count
self._buckets: list[Bucket] = [Bucket() for _ in range(bucket_count)]
def split_tests(self, test_folder: TestFolder) -> None:
"""Split tests into buckets."""
avg_execution_time = test_folder.approx_execution_time / self._bucket_count
avg_not_measured_files = test_folder.not_measured_files / self._bucket_count
digits = len(str(test_folder.total_tests))
sorted_tests = sorted(
test_folder.get_all_flatten(),
key=lambda x: (
-x.approx_execution_time,
-x.count_children() if isinstance(x, TestFolder) else 0,
x.not_measured_files,
),
test_folder.get_all_flatten(), reverse=True, key=lambda x: x.total_tests
)
not_measured_tests = set()
for tests in sorted_tests:
if tests.added_to_bucket:
# Already added to bucket
continue
print(f"~{tests.approx_execution_time} execution time for {tests.path}")
print(f"{tests.total_tests:>{digits}} tests in {tests.path}")
smallest_bucket = min(self._buckets, key=lambda x: x.total_tests)
is_file = isinstance(tests, TestFile)
sort_key = sort_by_execution_time
if tests.not_measured_files and tests.approx_execution_time == 0:
# If tests are not measured, sort by not measured files
sort_key = sort_by_not_measured
smallest_bucket = min(self._buckets, key=sort_key)
if (
(smallest_bucket.approx_execution_time + tests.approx_execution_time)
< avg_execution_time
and (smallest_bucket.not_measured_files + tests.not_measured_files)
< avg_not_measured_files
smallest_bucket.total_tests + tests.total_tests < self._tests_per_bucket
) or is_file:
smallest_bucket.add(tests)
add_not_measured_files(
tests,
not_measured_tests,
)
# Ensure all files from the same folder are in the same bucket
# to ensure that syrupy correctly identifies unused snapshots
if is_file:
added_tests = []
for other_test in tests.parent.children.values():
if other_test is tests or isinstance(other_test, TestFolder):
continue
print(
f"{other_test.total_tests:>{digits}} tests in {other_test.path} (same bucket)"
)
smallest_bucket.add(other_test)
added_tests.append(other_test)
add_not_measured_files(
other_test,
not_measured_tests,
)
if added_tests:
print(
f"Added {len(added_tests)} tests to the same bucket so syrupy can identify unused snapshots"
)
print(
" - "
+ "\n - ".join(
str(test.path) for test in sorted(added_tests)
)
)
# verify that all tests are added to a bucket
if not test_folder.added_to_bucket:
raise ValueError("Not all tests are added to a bucket")
if not_measured_tests:
print(f"Found {len(not_measured_tests)} not measured test files: ")
for test in sorted(not_measured_tests, key=lambda x: x.path):
print(f" - {test.path}")
def create_ouput_file(self) -> None:
"""Create output file."""
with Path("pytest_buckets.txt").open("w") as file:
for idx, bucket in enumerate(self._buckets):
print(
f"Bucket {idx + 1} execution time should be ~{str_without_milliseconds(bucket.approx_execution_time)}"
f" with {bucket.not_measured_files} not measured files"
)
print(f"Bucket {idx + 1} has {bucket.total_tests} tests")
file.write(bucket.get_paths_line())
def str_without_milliseconds(td: timedelta) -> str:
"""Return str without milliseconds."""
return str(td).split(".")[0]
@dataclass
class TestFile:
"""Class represents a single test file and the number of tests it has."""
total_tests: int
path: Path
parent: TestFolder
# 0 means not measured
approx_execution_time: timedelta
added_to_bucket: bool = field(default=False, init=False)
parent: TestFolder | None = field(default=None, init=False)
def add_to_bucket(self) -> None:
"""Add test file to bucket."""
@@ -168,18 +98,9 @@ class TestFile:
raise ValueError("Already added to bucket")
self.added_to_bucket = True
@property
def not_measured_files(self) -> int:
"""Return files not measured."""
return 1 if self.approx_execution_time.total_seconds() == 0 else 0
def __gt__(self, other: TestFile) -> bool:
"""Return if greater than."""
return self.approx_execution_time > other.approx_execution_time
def __hash__(self) -> int:
"""Return hash."""
return hash(self.path)
return self.total_tests > other.total_tests
class TestFolder:
@@ -191,31 +112,15 @@ class TestFolder:
self.children: dict[Path, TestFolder | TestFile] = {}
@property
def approx_execution_time(self) -> timedelta:
"""Return approximate execution time."""
time = timedelta(seconds=0)
for test in self.children.values():
time += test.approx_execution_time
return time
@property
def not_measured_files(self) -> int:
"""Return files not measured."""
return sum([test.not_measured_files for test in self.children.values()])
def total_tests(self) -> int:
"""Return total tests."""
return sum([test.total_tests for test in self.children.values()])
@property
def added_to_bucket(self) -> bool:
"""Return if added to bucket."""
return all(test.added_to_bucket for test in self.children.values())
def count_children(self) -> int:
"""Return the number of children."""
return len(self.children) + sum(
child.count_children()
for child in self.children.values()
if isinstance(child, TestFolder)
)
def add_to_bucket(self) -> None:
"""Add test file to bucket."""
if self.added_to_bucket:
@@ -225,18 +130,11 @@ class TestFolder:
def __repr__(self) -> str:
"""Return representation."""
return f"TestFolder(approx_execution_time={self.approx_execution_time}, children={len(self.children)})"
def add_test_file(
self, path: Path, execution_time: float, skip_file_if_present: bool
) -> None:
"""Add test file to folder."""
self._add_test_file(
TestFile(path, self, timedelta(seconds=execution_time)),
skip_file_if_present,
return (
f"TestFolder(total_tests={self.total_tests}, children={len(self.children)})"
)
def _add_test_file(self, file: TestFile, skip_file_if_present: bool) -> None:
def add_test_file(self, file: TestFile) -> None:
"""Add test file to folder."""
path = file.path
file.parent = self
@@ -245,10 +143,6 @@ class TestFolder:
raise ValueError("Path is not a child of this folder")
if len(relative_path.parts) == 1:
if path in self.children:
if skip_file_if_present:
return
raise ValueError(f"File already exists: {path}")
self.children[path] = file
return
@@ -257,7 +151,7 @@ class TestFolder:
self.children[child_path] = child = TestFolder(child_path)
elif not isinstance(child, TestFolder):
raise ValueError("Child is not a folder")
child._add_test_file(file, skip_file_if_present)
child.add_test_file(file)
def get_all_flatten(self) -> list[TestFolder | TestFile]:
"""Return self and all children as flatten list."""
@@ -270,21 +164,35 @@ class TestFolder:
return result
def process_execution_time_file(
execution_time_file: Path, test_folder: TestFolder
) -> None:
"""Process the execution time file."""
for file, execution_time in load_json_object(execution_time_file).items():
test_folder.add_test_file(Path(file), cast(float, execution_time), False)
def collect_tests(path: Path) -> TestFolder:
"""Collect all tests."""
result = subprocess.run(
["pytest", "--collect-only", "-qq", "-p", "no:warnings", path],
check=False,
capture_output=True,
text=True,
)
if result.returncode != 0:
print("Failed to collect tests:")
print(result.stderr)
print(result.stdout)
sys.exit(1)
def add_missing_test_files(folder: Path, test_folder: TestFolder) -> None:
"""Scan test folder for missing files."""
for path in folder.iterdir():
if path.is_dir():
add_missing_test_files(path, test_folder)
elif path.name.startswith("test_") and path.suffix == ".py":
test_folder.add_test_file(path, 0.0, True)
folder = TestFolder(path)
for line in result.stdout.splitlines():
if not line.strip():
continue
file_path, _, total_tests = line.partition(": ")
if not path or not total_tests:
print(f"Unexpected line: {line}")
sys.exit(1)
file = TestFile(int(total_tests), Path(file_path))
folder.add_test_file(file)
return folder
def main() -> None:
@@ -305,31 +213,24 @@ def main() -> None:
type=check_greater_0,
)
parser.add_argument(
"test_folder",
"path",
help="Path to the test files to split into buckets",
type=Path,
)
parser.add_argument(
"execution_time_file",
help="Path to the file containing the execution time of each test",
type=Path,
)
arguments = parser.parse_args()
tests = TestFolder(arguments.test_folder)
print("Collecting tests...")
tests = collect_tests(arguments.path)
tests_per_bucket = ceil(tests.total_tests / arguments.bucket_count)
if arguments.execution_time_file.exists():
print(f"Using execution time file: {arguments.execution_time_file}")
process_execution_time_file(arguments.execution_time_file, tests)
print("Scanning test files...")
add_missing_test_files(arguments.test_folder, tests)
bucket_holder = BucketHolder(arguments.bucket_count)
bucket_holder = BucketHolder(tests_per_bucket, arguments.bucket_count)
print("Splitting tests...")
bucket_holder.split_tests(tests)
print(f"Total tests: {tests.total_tests}")
print(f"Estimated tests per bucket: {tests_per_bucket}")
bucket_holder.create_ouput_file()
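The rewritten test-splitting script above drops the execution-time heuristics in favor of collecting tests with `pytest --collect-only` and greedily assigning the largest files to the currently smallest bucket. A rough sketch of that strategy, using hypothetical paths and test counts rather than real collection output:

```python
def split_into_buckets(test_counts: dict[str, int], bucket_count: int) -> list[dict]:
    """Greedy split: largest files first, each into the smallest bucket."""
    buckets = [{"total": 0, "paths": []} for _ in range(bucket_count)]
    for path, count in sorted(test_counts.items(), key=lambda kv: -kv[1]):
        smallest = min(buckets, key=lambda b: b["total"])
        smallest["total"] += count
        smallest["paths"].append(path)
    return buckets


# Hypothetical test files and counts, for illustration only.
buckets = split_into_buckets(
    {"tests/a.py": 50, "tests/b.py": 30, "tests/c.py": 20, "tests/d.py": 10},
    bucket_count=2,
)
# All tests are assigned, and no bucket is left empty.
assert sum(b["total"] for b in buckets) == 110
assert all(b["paths"] for b in buckets)
```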
@@ -445,6 +445,7 @@ async def test_connection_test(
assert len(entity.announcements) == 1
assert entity.announcements[0].message == ""
assert entity.announcements[0].preannounce_media_id is None
announcement_media_id = entity.announcements[0].media_id
hass_url = "http://10.10.10.10:8123"
assert announcement_media_id.startswith(
@@ -127,7 +127,7 @@ async def test_awair_gen1_sensors(
assert_expected_properties(
hass,
entity_registry,
"sensor.living_room_vocs",
"sensor.living_room_volatile_organic_compounds_parts",
f"{AWAIR_UUID}_{SENSOR_TYPES_MAP[API_VOC].unique_id_tag}",
"366",
{
@@ -5,9 +5,9 @@ from io import StringIO
from typing import Any
from unittest.mock import ANY, Mock, PropertyMock, patch
from aiohttp import ClientError
from aiohttp import ClientError, ClientResponseError
from hass_nabucasa import CloudError
from hass_nabucasa.api import CloudApiNonRetryableError
from hass_nabucasa.api import CloudApiError, CloudApiNonRetryableError
from hass_nabucasa.files import FilesError, StorageType
import pytest
@@ -547,6 +547,120 @@ async def test_agents_upload_not_protected(
assert stored_backup["failed_agent_ids"] == ["cloud.cloud"]
@pytest.mark.usefixtures("cloud_logged_in", "mock_list_files")
async def test_agents_upload_not_subscribed(
hass: HomeAssistant,
hass_client: ClientSessionGenerator,
hass_storage: dict[str, Any],
cloud: Mock,
) -> None:
"""Test upload backup when cloud user is not subscribed."""
cloud.subscription_expired = True
client = await hass_client()
backup_data = "test"
backup_id = "test-backup"
test_backup = AgentBackup(
addons=[AddonInfo(name="Test", slug="test", version="1.0.0")],
backup_id=backup_id,
database_included=True,
date="1970-01-01T00:00:00.000Z",
extra_metadata={},
folders=[Folder.MEDIA, Folder.SHARE],
homeassistant_included=True,
homeassistant_version="2024.12.0",
name="Test",
protected=True,
size=len(backup_data),
)
with (
patch(
"homeassistant.components.backup.manager.BackupManager.async_get_backup",
) as fetch_backup,
patch(
"homeassistant.components.backup.manager.read_backup",
return_value=test_backup,
),
patch("pathlib.Path.open") as mocked_open,
):
mocked_open.return_value.read = Mock(side_effect=[backup_data.encode(), b""])
fetch_backup.return_value = test_backup
resp = await client.post(
"/api/backup/upload?agent_id=cloud.cloud",
data={"file": StringIO(backup_data)},
)
await hass.async_block_till_done()
assert resp.status == 201
assert cloud.files.upload.call_count == 0
store_backups = hass_storage[BACKUP_DOMAIN]["data"]["backups"]
assert len(store_backups) == 1
stored_backup = store_backups[0]
assert stored_backup["backup_id"] == backup_id
assert stored_backup["failed_agent_ids"] == ["cloud.cloud"]
@pytest.mark.usefixtures("cloud_logged_in", "mock_list_files")
async def test_agents_upload_not_subscribed_midway(
hass: HomeAssistant,
hass_client: ClientSessionGenerator,
hass_storage: dict[str, Any],
cloud: Mock,
) -> None:
"""Test upload backup when cloud subscription expires during the call."""
client = await hass_client()
backup_data = "test"
backup_id = "test-backup"
test_backup = AgentBackup(
addons=[AddonInfo(name="Test", slug="test", version="1.0.0")],
backup_id=backup_id,
database_included=True,
date="1970-01-01T00:00:00.000Z",
extra_metadata={},
folders=[Folder.MEDIA, Folder.SHARE],
homeassistant_included=True,
homeassistant_version="2024.12.0",
name="Test",
protected=True,
size=len(backup_data),
)
async def mock_upload(*args: Any, **kwargs: Any) -> None:
"""Mock file upload."""
cloud.subscription_expired = True
raise CloudApiError(
"Boom!", orig_exc=ClientResponseError(Mock(), Mock(), status=403)
)
cloud.files.upload.side_effect = mock_upload
with (
patch(
"homeassistant.components.backup.manager.BackupManager.async_get_backup",
) as fetch_backup,
patch(
"homeassistant.components.backup.manager.read_backup",
return_value=test_backup,
),
patch("pathlib.Path.open") as mocked_open,
):
mocked_open.return_value.read = Mock(side_effect=[backup_data.encode(), b""])
fetch_backup.return_value = test_backup
resp = await client.post(
"/api/backup/upload?agent_id=cloud.cloud",
data={"file": StringIO(backup_data)},
)
await hass.async_block_till_done()
assert resp.status == 201
assert cloud.files.upload.call_count == 1
store_backups = hass_storage[BACKUP_DOMAIN]["data"]["backups"]
assert len(store_backups) == 1
stored_backup = store_backups[0]
assert stored_backup["backup_id"] == backup_id
assert stored_backup["failed_agent_ids"] == ["cloud.cloud"]
@pytest.mark.usefixtures("cloud_logged_in", "mock_list_files")
async def test_agents_upload_wrong_size(
hass: HomeAssistant,
@@ -61,8 +61,8 @@ def mock_api() -> Generator[AsyncMock]:
):
api = mock_api.return_value
api.authenticate.return_value = {
"email": "TEST@EXAMPLE.COM",
"cdp_internal_user_id": "test-username",
"loginEmailAddress": "TEST@EXAMPLE.COM",
"internalUserID": "test-username",
}
api.get_meters.return_value = {}
yield api
@@ -260,6 +260,76 @@ async def test_entities_removed_after_reload(
assert len(hass_storage[storage_key]["data"]["binary_sensor"]) == 1
async def test_entities_for_entire_platform_removed(
hass: HomeAssistant,
entity_registry: er.EntityRegistry,
mock_client: APIClient,
hass_storage: dict[str, Any],
mock_esphome_device: Callable[
[APIClient, list[EntityInfo], list[UserService], list[EntityState]],
Awaitable[MockESPHomeDevice],
],
) -> None:
"""Test removing all entities for a specific platform when static info changes."""
entity_info = [
BinarySensorInfo(
object_id="mybinary_sensor_to_be_removed",
key=1,
name="my binary_sensor to be removed",
unique_id="mybinary_sensor_to_be_removed",
),
]
states = [
BinarySensorState(key=1, state=True, missing_state=False),
]
user_service = []
mock_device = await mock_esphome_device(
mock_client=mock_client,
entity_info=entity_info,
user_service=user_service,
states=states,
)
entry = mock_device.entry
entry_id = entry.entry_id
storage_key = f"esphome.{entry_id}"
state = hass.states.get("binary_sensor.test_mybinary_sensor_to_be_removed")
assert state is not None
assert state.state == STATE_ON
await hass.config_entries.async_unload(entry.entry_id)
await hass.async_block_till_done()
assert len(hass_storage[storage_key]["data"]["binary_sensor"]) == 1
state = hass.states.get("binary_sensor.test_mybinary_sensor_to_be_removed")
assert state is not None
reg_entry = entity_registry.async_get(
"binary_sensor.test_mybinary_sensor_to_be_removed"
)
assert reg_entry is not None
assert state.attributes[ATTR_RESTORED] is True
entity_info = []
states = []
mock_device = await mock_esphome_device(
mock_client=mock_client,
entity_info=entity_info,
user_service=user_service,
states=states,
entry=entry,
)
assert mock_device.entry.entry_id == entry_id
state = hass.states.get("binary_sensor.test_mybinary_sensor_to_be_removed")
assert state is None
reg_entry = entity_registry.async_get(
"binary_sensor.test_mybinary_sensor_to_be_removed"
)
assert reg_entry is None
await hass.config_entries.async_unload(entry.entry_id)
await hass.async_block_till_done()
assert len(hass_storage[storage_key]["data"]["binary_sensor"]) == 0
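The reload behavior exercised by this test can be sketched as follows — a minimal illustration, not the actual ESPHome integration API (`prune_stale`, the dict shapes, and the key types are all hypothetical):

```python
# Hypothetical sketch: after a reload, keep only stored entity keys whose
# (platform, key) still appears in the device's new static entity info.
# A platform missing from the new info loses all of its stored entities.
def prune_stale(
    stored: dict[str, set[int]], new_info: dict[str, set[int]]
) -> dict[str, set[int]]:
    return {
        platform: stored_keys & new_info[platform]
        for platform, stored_keys in stored.items()
        if platform in new_info
    }

# An empty static info (the scenario tested above) removes the platform entirely.
stored = {"binary_sensor": {1}}
print(prune_stale(stored, {}))  # {}
```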
async def test_entity_info_object_ids(
hass: HomeAssistant,
mock_client: APIClient,
+22 -3
@@ -4,6 +4,7 @@ from aioesphomeapi import APIClient, Event, EventInfo
import pytest
from homeassistant.components.event import EventDeviceClass
+ from homeassistant.const import STATE_UNAVAILABLE
from homeassistant.core import HomeAssistant
@@ -11,9 +12,9 @@ from homeassistant.core import HomeAssistant
async def test_generic_event_entity(
hass: HomeAssistant,
mock_client: APIClient,
- mock_generic_device_entry,
+ mock_esphome_device,
) -> None:
"""Test a generic event entity."""
"""Test a generic event entity and its availability behavior."""
entity_info = [
EventInfo(
object_id="myevent",
@@ -26,13 +27,31 @@ async def test_generic_event_entity(
]
states = [Event(key=1, event_type="type1")]
user_service = []
- await mock_generic_device_entry(
+ device = await mock_esphome_device(
mock_client=mock_client,
entity_info=entity_info,
user_service=user_service,
states=states,
)
await hass.async_block_till_done()
+ # Test initial state
state = hass.states.get("event.test_myevent")
assert state is not None
assert state.state == "2024-04-24T00:00:00.000+00:00"
assert state.attributes["event_type"] == "type1"
+ # Test device becomes unavailable
+ await device.mock_disconnect(True)
+ await hass.async_block_till_done()
+ state = hass.states.get("event.test_myevent")
+ assert state.state == STATE_UNAVAILABLE
+ # Test device becomes available again
+ await device.mock_connect()
+ await hass.async_block_till_done()
+ # Event entity should be available immediately without waiting for data
+ state = hass.states.get("event.test_myevent")
+ assert state.state == "2024-04-24T00:00:00.000+00:00"
+ assert state.attributes["event_type"] == "type1"
+31 -27
@@ -86,26 +86,28 @@ def stub_reconnect():
)
async def test_update_entity(
hass: HomeAssistant,
- stub_reconnect,
- mock_config_entry,
- mock_device_info,
mock_dashboard: dict[str, Any],
- devices_payload,
- expected_state,
- expected_attributes,
+ devices_payload: list[dict[str, Any]],
+ expected_state: str,
+ expected_attributes: dict[str, Any],
+ mock_client: APIClient,
+ mock_esphome_device: Callable[
+ [APIClient, list[EntityInfo], list[UserService], list[EntityState]],
+ Awaitable[MockESPHomeDevice],
+ ],
) -> None:
"""Test ESPHome update entity."""
mock_dashboard["configured"] = devices_payload
await async_get_dashboard(hass).async_refresh()
- with patch(
- "homeassistant.components.esphome.update.DomainData.get_entry_data",
- return_value=Mock(available=True, device_info=mock_device_info, info={}),
- ):
- assert await hass.config_entries.async_setup(mock_config_entry.entry_id)
- await hass.async_block_till_done()
+ await mock_esphome_device(
+ mock_client=mock_client,
+ entity_info=[],
+ user_service=[],
+ states=[],
+ )
state = hass.states.get("update.none_firmware")
state = hass.states.get("update.test_firmware")
assert state is not None
assert state.state == expected_state
for key, expected_value in expected_attributes.items():
@@ -130,7 +132,7 @@ async def test_update_entity(
await hass.services.async_call(
"update",
"install",
{"entity_id": "update.none_firmware"},
{"entity_id": "update.test_firmware"},
blocking=True,
)
@@ -155,7 +157,7 @@ async def test_update_entity(
await hass.services.async_call(
"update",
"install",
{"entity_id": "update.none_firmware"},
{"entity_id": "update.test_firmware"},
blocking=True,
)
@@ -177,7 +179,7 @@ async def test_update_entity(
await hass.services.async_call(
"update",
"install",
{"entity_id": "update.none_firmware"},
{"entity_id": "update.test_firmware"},
blocking=True,
)
@@ -274,28 +276,30 @@ async def test_update_device_state_for_availability(
async def test_update_entity_dashboard_not_available_startup(
hass: HomeAssistant,
- stub_reconnect,
- mock_config_entry,
- mock_device_info,
+ mock_client: APIClient,
+ mock_esphome_device: Callable[
+ [APIClient, list[EntityInfo], list[UserService], list[EntityState]],
+ Awaitable[MockESPHomeDevice],
+ ],
mock_dashboard: dict[str, Any],
) -> None:
"""Test ESPHome update entity when dashboard is not available at startup."""
with (
patch(
"homeassistant.components.esphome.update.DomainData.get_entry_data",
return_value=Mock(available=True, device_info=mock_device_info, info={}),
),
patch(
"esphome_dashboard_api.ESPHomeDashboardAPI.get_devices",
side_effect=TimeoutError,
),
):
await async_get_dashboard(hass).async_refresh()
assert await hass.config_entries.async_setup(mock_config_entry.entry_id)
await hass.async_block_till_done()
+ await mock_esphome_device(
+ mock_client=mock_client,
+ entity_info=[],
+ user_service=[],
+ states=[],
+ )
# We have a dashboard but it is not available
state = hass.states.get("update.none_firmware")
state = hass.states.get("update.test_firmware")
assert state is None
mock_dashboard["configured"] = [
@@ -308,7 +312,7 @@ async def test_update_entity_dashboard_not_available_startup(
await async_get_dashboard(hass).async_refresh()
await hass.async_block_till_done()
state = hass.states.get("update.none_firmware")
state = hass.states.get("update.test_firmware")
assert state.state == STATE_ON
expected_attributes = {
"latest_version": "2023.2.0-dev",
@@ -16,10 +16,10 @@
StateSnapshot({
'attributes': ReadOnlyDict({
'device_class': 'switch',
- 'friendly_name': 'fake-device-1 Quiet',
+ 'friendly_name': 'fake-device-1 Quiet mode',
}),
'context': <ANY>,
- 'entity_id': 'switch.fake_device_1_quiet',
+ 'entity_id': 'switch.fake_device_1_quiet_mode',
'last_changed': <ANY>,
'last_reported': <ANY>,
'last_updated': <ANY>,
@@ -40,10 +40,10 @@
StateSnapshot({
'attributes': ReadOnlyDict({
'device_class': 'switch',
- 'friendly_name': 'fake-device-1 XFan',
+ 'friendly_name': 'fake-device-1 Xtra fan',
}),
'context': <ANY>,
- 'entity_id': 'switch.fake_device_1_xfan',
+ 'entity_id': 'switch.fake_device_1_xtra_fan',
'last_changed': <ANY>,
'last_reported': <ANY>,
'last_updated': <ANY>,
@@ -109,7 +109,7 @@
'disabled_by': None,
'domain': 'switch',
'entity_category': None,
- 'entity_id': 'switch.fake_device_1_quiet',
+ 'entity_id': 'switch.fake_device_1_quiet_mode',
'has_entity_name': True,
'hidden_by': None,
'icon': None,
@@ -121,7 +121,7 @@
}),
'original_device_class': <SwitchDeviceClass.SWITCH: 'switch'>,
'original_icon': None,
- 'original_name': 'Quiet',
+ 'original_name': 'Quiet mode',
'platform': 'gree',
'previous_unique_id': None,
'supported_features': 0,
@@ -173,7 +173,7 @@
'disabled_by': None,
'domain': 'switch',
'entity_category': None,
- 'entity_id': 'switch.fake_device_1_xfan',
+ 'entity_id': 'switch.fake_device_1_xtra_fan',
'has_entity_name': True,
'hidden_by': None,
'icon': None,
@@ -185,7 +185,7 @@
}),
'original_device_class': <SwitchDeviceClass.SWITCH: 'switch'>,
'original_icon': None,
- 'original_name': 'XFan',
+ 'original_name': 'Xtra fan',
'platform': 'gree',
'previous_unique_id': None,
'supported_features': 0,
+15 -15
@@ -22,11 +22,11 @@ from homeassistant.setup import async_setup_component
from tests.common import MockConfigEntry
- ENTITY_ID_LIGHT_PANEL = f"{SWITCH_DOMAIN}.fake_device_1_panel_light"
+ ENTITY_ID_PANEL_LIGHT = f"{SWITCH_DOMAIN}.fake_device_1_panel_light"
ENTITY_ID_HEALTH_MODE = f"{SWITCH_DOMAIN}.fake_device_1_health_mode"
- ENTITY_ID_QUIET = f"{SWITCH_DOMAIN}.fake_device_1_quiet"
+ ENTITY_ID_QUIET_MODE = f"{SWITCH_DOMAIN}.fake_device_1_quiet_mode"
ENTITY_ID_FRESH_AIR = f"{SWITCH_DOMAIN}.fake_device_1_fresh_air"
- ENTITY_ID_XFAN = f"{SWITCH_DOMAIN}.fake_device_1_xfan"
+ ENTITY_ID_XTRA_FAN = f"{SWITCH_DOMAIN}.fake_device_1_xtra_fan"
async def async_setup_gree(hass: HomeAssistant) -> MockConfigEntry:
@@ -54,11 +54,11 @@ async def test_registry_settings(
@pytest.mark.parametrize(
"entity",
[
- ENTITY_ID_LIGHT_PANEL,
+ ENTITY_ID_PANEL_LIGHT,
ENTITY_ID_HEALTH_MODE,
- ENTITY_ID_QUIET,
+ ENTITY_ID_QUIET_MODE,
ENTITY_ID_FRESH_AIR,
- ENTITY_ID_XFAN,
+ ENTITY_ID_XTRA_FAN,
],
)
@pytest.mark.usefixtures("entity_registry_enabled_by_default")
@@ -81,11 +81,11 @@ async def test_send_switch_on(hass: HomeAssistant, entity: str) -> None:
@pytest.mark.parametrize(
"entity",
[
- ENTITY_ID_LIGHT_PANEL,
+ ENTITY_ID_PANEL_LIGHT,
ENTITY_ID_HEALTH_MODE,
- ENTITY_ID_QUIET,
+ ENTITY_ID_QUIET_MODE,
ENTITY_ID_FRESH_AIR,
- ENTITY_ID_XFAN,
+ ENTITY_ID_XTRA_FAN,
],
)
@pytest.mark.usefixtures("entity_registry_enabled_by_default")
@@ -112,11 +112,11 @@ async def test_send_switch_on_device_timeout(
@pytest.mark.parametrize(
"entity",
[
- ENTITY_ID_LIGHT_PANEL,
+ ENTITY_ID_PANEL_LIGHT,
ENTITY_ID_HEALTH_MODE,
- ENTITY_ID_QUIET,
+ ENTITY_ID_QUIET_MODE,
ENTITY_ID_FRESH_AIR,
- ENTITY_ID_XFAN,
+ ENTITY_ID_XTRA_FAN,
],
)
@pytest.mark.usefixtures("entity_registry_enabled_by_default")
@@ -139,11 +139,11 @@ async def test_send_switch_off(hass: HomeAssistant, entity: str) -> None:
@pytest.mark.parametrize(
"entity",
[
- ENTITY_ID_LIGHT_PANEL,
+ ENTITY_ID_PANEL_LIGHT,
ENTITY_ID_HEALTH_MODE,
- ENTITY_ID_QUIET,
+ ENTITY_ID_QUIET_MODE,
ENTITY_ID_FRESH_AIR,
- ENTITY_ID_XFAN,
+ ENTITY_ID_XTRA_FAN,
],
)
@pytest.mark.usefixtures("entity_registry_enabled_by_default")
-1
@@ -41,7 +41,6 @@ class MockHeos(Heos):
self.player_get_quick_selects: AsyncMock = AsyncMock()
self.player_play_next: AsyncMock = AsyncMock()
self.player_play_previous: AsyncMock = AsyncMock()
- self.player_play_queue: AsyncMock = AsyncMock()
self.player_play_quick_select: AsyncMock = AsyncMock()
self.player_set_mute: AsyncMock = AsyncMock()
self.player_set_play_mode: AsyncMock = AsyncMock()
@@ -1321,51 +1321,6 @@ async def test_play_media_music_source_url(
controller.play_url.assert_called_once()
- async def test_play_media_queue(
- hass: HomeAssistant,
- config_entry: MockConfigEntry,
- controller: MockHeos,
- ) -> None:
- """Test the play media service with type queue."""
- config_entry.add_to_hass(hass)
- await hass.config_entries.async_setup(config_entry.entry_id)
- await hass.services.async_call(
- MEDIA_PLAYER_DOMAIN,
- SERVICE_PLAY_MEDIA,
- {
- ATTR_ENTITY_ID: "media_player.test_player",
- ATTR_MEDIA_CONTENT_TYPE: "queue",
- ATTR_MEDIA_CONTENT_ID: "2",
- },
- blocking=True,
- )
- controller.player_play_queue.assert_called_once_with(1, 2)
- async def test_play_media_queue_invalid(
- hass: HomeAssistant, config_entry: MockConfigEntry, controller: MockHeos
- ) -> None:
- """Test the play media service with an invalid queue id."""
- config_entry.add_to_hass(hass)
- await hass.config_entries.async_setup(config_entry.entry_id)
- with pytest.raises(
- HomeAssistantError,
- match=re.escape("Unable to play media: Invalid queue id 'Invalid'"),
- ):
- await hass.services.async_call(
- MEDIA_PLAYER_DOMAIN,
- SERVICE_PLAY_MEDIA,
- {
- ATTR_ENTITY_ID: "media_player.test_player",
- ATTR_MEDIA_CONTENT_TYPE: "queue",
- ATTR_MEDIA_CONTENT_ID: "Invalid",
- },
- blocking=True,
- )
- assert controller.player_play_queue.call_count == 0
async def test_browse_media_root(
hass: HomeAssistant,
config_entry: MockConfigEntry,
@@ -1,5 +1,7 @@
"""Test SkyConnect firmware update entity."""
import pytest
from homeassistant.components.homeassistant_hardware.helpers import (
async_notify_firmware_info,
)
@@ -14,9 +16,7 @@ from .common import USB_DATA_ZBT1
from tests.common import MockConfigEntry
- UPDATE_ENTITY_ID = (
- "update.homeassistant_sky_connect_9e2adbd75b8beb119fe564a0f320645d_firmware"
- )
+ UPDATE_ENTITY_ID = "update.home_assistant_connect_zbt_1_9e2adbd7_firmware"
async def test_zbt1_update_entity(hass: HomeAssistant) -> None:
@@ -59,8 +59,9 @@ async def test_zbt1_update_entity(hass: HomeAssistant) -> None:
await hass.async_block_till_done()
state_ezsp = hass.states.get(UPDATE_ENTITY_ID)
assert state_ezsp is not None
+ assert state_ezsp.state == "unknown"
- assert state_ezsp.attributes["title"] == "EmberZNet"
+ assert state_ezsp.attributes["title"] == "EmberZNet Zigbee"
assert state_ezsp.attributes["installed_version"] == "7.3.1.0"
assert state_ezsp.attributes["latest_version"] is None
@@ -80,7 +81,52 @@ async def test_zbt1_update_entity(hass: HomeAssistant) -> None:
# After the firmware update, the entity has the new version and the correct state
state_spinel = hass.states.get(UPDATE_ENTITY_ID)
assert state_spinel is not None
assert state_spinel.state == "unknown"
assert state_spinel.attributes["title"] == "OpenThread RCP"
assert state_spinel.attributes["installed_version"] == "2.4.4.0"
assert state_spinel.attributes["latest_version"] is None
@pytest.mark.parametrize(
("firmware", "version", "expected"),
[
("ezsp", "7.3.1.0 build 0", "EmberZNet Zigbee 7.3.1.0"),
("spinel", "SL-OPENTHREAD/2.4.4.0_GitHub-7074a43e4", "OpenThread RCP 2.4.4.0"),
("bootloader", "2.4.2", "Gecko Bootloader 2.4.2"),
("cpc", "4.3.2", "Multiprotocol 4.3.2"),
("router", "1.2.3.4", "Unknown 1.2.3.4"), # Not supported but still shown
],
)
async def test_zbt1_update_entity_state(
hass: HomeAssistant, firmware: str, version: str, expected: str
) -> None:
"""Test the ZBT-1 firmware update entity with different firmware types."""
await async_setup_component(hass, "homeassistant", {})
zbt1_config_entry = MockConfigEntry(
domain="homeassistant_sky_connect",
data={
"firmware": firmware,
"firmware_version": version,
"device": USB_DATA_ZBT1.device,
"manufacturer": USB_DATA_ZBT1.manufacturer,
"pid": USB_DATA_ZBT1.pid,
"product": USB_DATA_ZBT1.description,
"serial_number": USB_DATA_ZBT1.serial_number,
"vid": USB_DATA_ZBT1.vid,
},
version=1,
minor_version=3,
)
zbt1_config_entry.add_to_hass(hass)
assert await hass.config_entries.async_setup(zbt1_config_entry.entry_id)
await hass.async_block_till_done()
state = hass.states.get(UPDATE_ENTITY_ID)
assert state is not None
assert (
f"{state.attributes['title']} {state.attributes['installed_version']}"
== expected
)
@@ -2,6 +2,8 @@
from unittest.mock import patch
import pytest
from homeassistant.components.homeassistant_hardware.helpers import (
async_notify_firmware_info,
)
@@ -15,7 +17,7 @@ from homeassistant.setup import async_setup_component
from tests.common import MockConfigEntry
UPDATE_ENTITY_ID = "update.homeassistant_yellow_firmware"
UPDATE_ENTITY_ID = "update.home_assistant_yellow_firmware"
async def test_yellow_update_entity(hass: HomeAssistant) -> None:
@@ -24,6 +26,7 @@ async def test_yellow_update_entity(hass: HomeAssistant) -> None:
# Set up the Yellow integration
yellow_config_entry = MockConfigEntry(
title="Home Assistant Yellow",
domain="homeassistant_yellow",
data={
"firmware": "ezsp",
@@ -62,8 +65,9 @@ async def test_yellow_update_entity(hass: HomeAssistant) -> None:
await hass.async_block_till_done()
state_ezsp = hass.states.get(UPDATE_ENTITY_ID)
assert state_ezsp is not None
+ assert state_ezsp.state == "unknown"
- assert state_ezsp.attributes["title"] == "EmberZNet"
+ assert state_ezsp.attributes["title"] == "EmberZNet Zigbee"
assert state_ezsp.attributes["installed_version"] == "7.3.1.0"
assert state_ezsp.attributes["latest_version"] is None
@@ -83,7 +87,58 @@ async def test_yellow_update_entity(hass: HomeAssistant) -> None:
# After the firmware update, the entity has the new version and the correct state
state_spinel = hass.states.get(UPDATE_ENTITY_ID)
assert state_spinel is not None
assert state_spinel.state == "unknown"
assert state_spinel.attributes["title"] == "OpenThread RCP"
assert state_spinel.attributes["installed_version"] == "2.4.4.0"
assert state_spinel.attributes["latest_version"] is None
@pytest.mark.parametrize(
("firmware", "version", "expected"),
[
("ezsp", "7.3.1.0 build 0", "EmberZNet Zigbee 7.3.1.0"),
("spinel", "SL-OPENTHREAD/2.4.4.0_GitHub-7074a43e4", "OpenThread RCP 2.4.4.0"),
("bootloader", "2.4.2", "Gecko Bootloader 2.4.2"),
("cpc", "4.3.2", "Multiprotocol 4.3.2"),
("router", "1.2.3.4", "Unknown 1.2.3.4"), # Not supported but still shown
],
)
async def test_yellow_update_entity_state(
hass: HomeAssistant, firmware: str, version: str, expected: str
) -> None:
"""Test the Yellow firmware update entity with different firmware types."""
await async_setup_component(hass, "homeassistant", {})
# Set up the Yellow integration
yellow_config_entry = MockConfigEntry(
title="Home Assistant Yellow",
domain="homeassistant_yellow",
data={
"firmware": firmware,
"firmware_version": version,
"device": RADIO_DEVICE,
},
version=1,
minor_version=3,
)
yellow_config_entry.add_to_hass(hass)
with (
patch(
"homeassistant.components.homeassistant_yellow.is_hassio", return_value=True
),
patch(
"homeassistant.components.homeassistant_yellow.get_os_info",
return_value={"board": "yellow"},
),
):
assert await hass.config_entries.async_setup(yellow_config_entry.entry_id)
await hass.async_block_till_done()
state = hass.states.get(UPDATE_ENTITY_ID)
assert state is not None
assert (
f"{state.attributes['title']} {state.attributes['installed_version']}"
== expected
)
@@ -72,7 +72,6 @@
"1/59/0": 2,
"1/59/65533": 1,
"1/59/1": 0,
"1/59/2": 2,
"1/59/65531": [0, 1, 65528, 65529, 65531, 65532, 65533],
"1/59/65532": 30,
"1/59/65528": [],
@@ -102,7 +101,7 @@
"2/59/0": 2,
"2/59/65533": 1,
"2/59/1": 0,
"2/59/2": 2,
"2/59/2": 4,
"2/59/65531": [0, 1, 65528, 65529, 65531, 65532, 65533],
"2/59/65532": 30,
"2/59/65528": [],
@@ -132,6 +132,8 @@
'event_types': list([
'multi_press_1',
'multi_press_2',
+ 'multi_press_3',
+ 'multi_press_4',
'long_press',
'long_release',
]),
@@ -172,6 +174,8 @@
'event_types': list([
'multi_press_1',
'multi_press_2',
+ 'multi_press_3',
+ 'multi_press_4',
'long_press',
'long_release',
]),
@@ -686,7 +686,7 @@
'state': '20.0',
})
# ---
- # name: test_sensors[air_purifier][sensor.air_purifier_vocs-entry]
+ # name: test_sensors[air_purifier][sensor.air_purifier_volatile_organic_compounds_parts-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
@@ -701,7 +701,7 @@
'disabled_by': None,
'domain': 'sensor',
'entity_category': None,
- 'entity_id': 'sensor.air_purifier_vocs',
+ 'entity_id': 'sensor.air_purifier_volatile_organic_compounds_parts',
'has_entity_name': True,
'hidden_by': None,
'icon': None,
@@ -713,7 +713,7 @@
}),
'original_device_class': <SensorDeviceClass.VOLATILE_ORGANIC_COMPOUNDS_PARTS: 'volatile_organic_compounds_parts'>,
'original_icon': None,
- 'original_name': 'VOCs',
+ 'original_name': 'Volatile organic compounds parts',
'platform': 'matter',
'previous_unique_id': None,
'supported_features': 0,
@@ -722,16 +722,16 @@
'unit_of_measurement': 'ppm',
})
# ---
- # name: test_sensors[air_purifier][sensor.air_purifier_vocs-state]
+ # name: test_sensors[air_purifier][sensor.air_purifier_volatile_organic_compounds_parts-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'device_class': 'volatile_organic_compounds_parts',
- 'friendly_name': 'Air Purifier VOCs',
+ 'friendly_name': 'Air Purifier Volatile organic compounds parts',
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
'unit_of_measurement': 'ppm',
}),
'context': <ANY>,
- 'entity_id': 'sensor.air_purifier_vocs',
+ 'entity_id': 'sensor.air_purifier_volatile_organic_compounds_parts',
'last_changed': <ANY>,
'last_reported': <ANY>,
'last_updated': <ANY>,
@@ -1167,7 +1167,7 @@
'state': '20.08',
})
# ---
- # name: test_sensors[air_quality_sensor][sensor.lightfi_aq1_air_quality_sensor_vocs-entry]
+ # name: test_sensors[air_quality_sensor][sensor.lightfi_aq1_air_quality_sensor_volatile_organic_compounds_parts-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
@@ -1182,7 +1182,7 @@
'disabled_by': None,
'domain': 'sensor',
'entity_category': None,
- 'entity_id': 'sensor.lightfi_aq1_air_quality_sensor_vocs',
+ 'entity_id': 'sensor.lightfi_aq1_air_quality_sensor_volatile_organic_compounds_parts',
'has_entity_name': True,
'hidden_by': None,
'icon': None,
@@ -1194,7 +1194,7 @@
}),
'original_device_class': <SensorDeviceClass.VOLATILE_ORGANIC_COMPOUNDS_PARTS: 'volatile_organic_compounds_parts'>,
'original_icon': None,
- 'original_name': 'VOCs',
+ 'original_name': 'Volatile organic compounds parts',
'platform': 'matter',
'previous_unique_id': None,
'supported_features': 0,
@@ -1203,16 +1203,16 @@
'unit_of_measurement': 'ppm',
})
# ---
- # name: test_sensors[air_quality_sensor][sensor.lightfi_aq1_air_quality_sensor_vocs-state]
+ # name: test_sensors[air_quality_sensor][sensor.lightfi_aq1_air_quality_sensor_volatile_organic_compounds_parts-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'device_class': 'volatile_organic_compounds_parts',
- 'friendly_name': 'lightfi-aq1-air-quality-sensor VOCs',
+ 'friendly_name': 'lightfi-aq1-air-quality-sensor Volatile organic compounds parts',
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
'unit_of_measurement': 'ppm',
}),
'context': <ANY>,
- 'entity_id': 'sensor.lightfi_aq1_air_quality_sensor_vocs',
+ 'entity_id': 'sensor.lightfi_aq1_air_quality_sensor_volatile_organic_compounds_parts',
'last_changed': <ANY>,
'last_reported': <ANY>,
'last_updated': <ANY>,
+15 -6
@@ -36,7 +36,7 @@ async def test_generic_switch_node(
assert state
assert state.state == "unknown"
assert state.name == "Mock Generic Switch Button"
- # check event_types from featuremap 30
+ # check event_types from featuremap 14 (0b1110)
assert state.attributes[ATTR_EVENT_TYPES] == [
"initial_press",
"short_release",
@@ -76,7 +76,7 @@ async def test_generic_switch_multi_node(
assert state_button_1.state == "unknown"
# name should be 'DeviceName Button (1)' due to the label set to just '1'
assert state_button_1.name == "Mock Generic Switch Button (1)"
- # check event_types from featuremap 14
+ # check event_types from featuremap 30 (0b11110) and MultiPressMax unset (default 2)
assert state_button_1.attributes[ATTR_EVENT_TYPES] == [
"multi_press_1",
"multi_press_2",
@@ -84,11 +84,20 @@ async def test_generic_switch_multi_node(
"long_release",
]
# check button 2
- state_button_1 = hass.states.get("event.mock_generic_switch_fancy_button")
- assert state_button_1
- assert state_button_1.state == "unknown"
+ state_button_2 = hass.states.get("event.mock_generic_switch_fancy_button")
+ assert state_button_2
+ assert state_button_2.state == "unknown"
# name should be 'DeviceName Fancy Button' due to the label set to 'Fancy Button'
- assert state_button_1.name == "Mock Generic Switch Fancy Button"
+ assert state_button_2.name == "Mock Generic Switch Fancy Button"
+ # check event_types from featuremap 30 (0b11110) and MultiPressMax 4
+ assert state_button_2.attributes[ATTR_EVENT_TYPES] == [
+ "multi_press_1",
+ "multi_press_2",
+ "multi_press_3",
+ "multi_press_4",
+ "long_press",
+ "long_release",
+ ]
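The event-type assertions in this file follow from the Matter Switch cluster (0x3B) FeatureMap bits (0 = LatchingSwitch, 1 = MomentarySwitch, 2 = MomentarySwitchRelease, 3 = MomentarySwitchLongPress, 4 = MomentarySwitchMultiPress) plus the MultiPressMax attribute (attribute 2, default 2). A minimal sketch of the decoding the tests imply — `event_types_from_featuremap` is an illustrative name, not the integration's actual helper:

```python
def event_types_from_featuremap(
    feature_map: int, multi_press_max: int = 2
) -> list[str]:
    """Derive event types from Switch cluster (0x3B) feature bits."""
    types: list[str] = []
    if feature_map & 0b10000:  # MomentarySwitchMultiPress
        # Multi-press replaces the plain press/release events.
        types += [f"multi_press_{i}" for i in range(1, multi_press_max + 1)]
    else:
        if feature_map & 0b00010:  # MomentarySwitch
            types.append("initial_press")
        if feature_map & 0b00100:  # MomentarySwitchRelease
            types.append("short_release")
    if feature_map & 0b01000:  # MomentarySwitchLongPress
        types += ["long_press", "long_release"]
    return types
```

With featuremap 14 (0b01110) this yields the press/release/long events; with featuremap 30 (0b11110) and MultiPressMax 4 it yields `multi_press_1` through `multi_press_4` plus the long-press events, matching the assertions above.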
# trigger firing a multi press event
await trigger_subscription_callback(

Some files were not shown because too many files have changed in this diff.