Compare commits

..

45 Commits

Author SHA1 Message Date
Jan Čermák
36cb3e21fe Merge remote-tracking branch 'origin/dev' into gha-builder 2026-03-05 12:17:11 +01:00
Jan Čermák
f645b232f9 Fix container-(username|password) -> container-registry-(username|password) 2026-03-05 12:14:36 +01:00
Jan Čermák
e8454d9b2c Use updated build-image action inputs, sort alphabetically 2026-03-05 12:10:43 +01:00
Andreas Jakl
5fe2ab93ff Add device tracker to NRGkick integration (#164804)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-03-05 12:00:30 +01:00
Glenn de Haan
0e4698eb99 Add device class to active_liter_lpm sensor (#164809) 2026-03-05 11:50:37 +01:00
epenet
698c5eca00 Migrate remaining netgear coordinators to separate module (#164826) 2026-03-05 11:49:28 +01:00
Raphael Hehl
c7776057b7 Enforce SSRF redirect protection only for connector allowed_protocol_schema_set (#164769)
Co-authored-by: RaHehl <rahehl@users.noreply.github.com>
Co-authored-by: J. Nick Koston <nick@home-assistant.io>
2026-03-05 11:45:05 +01:00
Jan Čermák
02ae9b2f71 Generate machine dockerfiles using hassfest script 2026-03-05 11:22:12 +01:00
Erik Montnemery
e87c677cc4 Improve homee tests (#164820) 2026-03-05 11:15:50 +01:00
Erik Montnemery
c3858a0841 Improve tuya diagnostic tests (#164819) 2026-03-05 11:13:01 +01:00
Michael
42bc5c3a5f Add remote.turned_on and remote.turned_off triggers (#164535)
Co-authored-by: Martin Hjelmare <marhje52@gmail.com>
2026-03-05 10:52:29 +01:00
epenet
76bc58da2c Add base NetgearDataCoordinator to netgear (#164816) 2026-03-05 10:52:12 +01:00
epenet
fc8719ce35 Remove caio from licenses exception list (#164806) 2026-03-05 10:18:08 +01:00
dependabot[bot]
60a4a97d9c Bump dawidd6/action-download-artifact from 14 to 16 (#164790)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-03-05 10:16:23 +01:00
Erwin Douna
284721e1df Bump pyportainer 1.0.32 (#164803) 2026-03-05 09:06:46 +01:00
Norbert Rittel
bfa707d79e Use common string for "host" in devialet config flow (#164798) 2026-03-05 08:32:46 +01:00
Norbert Rittel
633e2e7469 Use common state for "medium" in smartthings (#164799) 2026-03-05 08:32:35 +01:00
dependabot[bot]
ad1c6846e7 Bump actions/upload-artifact from 6.0.0 to 7.0.0 (#164791) 2026-03-05 07:29:59 +01:00
Erwin Douna
f75140b626 Add const to Portainer for endpoint up (#164746) 2026-03-05 00:38:59 +01:00
rappenze
f83757da7c Use unique fibaro_id in test fixtures (#164763) 2026-03-04 22:04:38 +00:00
Norbert Rittel
ca338c98f3 Clarify description of vacuum.clean_area action (#164764) 2026-03-04 21:57:59 +00:00
Ian Foster
18a8afb017 Update keyboard_remote dependencies (#164755) 2026-03-04 19:47:17 +01:00
Jan Čermák
f6f7390063 Restore build context also in build_python 2026-03-04 18:26:21 +01:00
Jan Čermák
bfa1fd7f1b Use new home-assistant/builder actions for image builds
This PR completely drops usage of the builder action in favor of the new actions
introduced in home-assistant/builder#273. This results in faster builds, better
caching options, and simple local builds using Docker BuildKit.

The image dependency chain currently still uses per-arch builds, but once the
docker-base and docker repositories start publishing multi-arch images, we can
simplify the action a bit further.

The idea of using composite actions comes from #162245, although this PR predates
it. One minor difference is that the files previously generated twice in per-arch
builds are now generated and archived once by the init job.
2026-03-04 18:05:12 +01:00
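
The commit above mentions simple local builds with Docker BuildKit. A minimal sketch of such a local build, assuming the generated Dockerfile in this change set (which defaults BUILD_FROM to the amd64 base image) and a hypothetical local tag; the exact base image version pinned by CI comes from the init job and may differ:

# Hedged example: build the core image locally with BuildKit.
# "local/amd64-homeassistant:dev" is a hypothetical tag, not one used by CI.
docker buildx build \
  --build-arg BUILD_FROM=ghcr.io/home-assistant/amd64-homeassistant-base:latest \
  --tag local/amd64-homeassistant:dev \
  --load \
  .
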
Italo Lombardi
0136e9c7eb ISS integration: better entity handling (#159050)
Co-authored-by: Ariel Ebersberger <ariel@ebersberger.io>
2026-03-04 17:46:48 +01:00
Erik Montnemery
d88c736016 Add is_closed state attribute to cover (#164739) 2026-03-04 16:54:06 +01:00
Robert Resch
780dc178a1 Use Python version file in CI for setting the default python version (#164751) 2026-03-04 16:53:31 +01:00
Petro31
b7ba945dfc Fix this variable preview issue with template entities from the UI (#164740) 2026-03-04 16:01:41 +01:00
Magnus Øverli
01de7052af Add deprecation timeline to flexit_bacnet fireplace switch (#164450) 2026-03-04 15:47:40 +01:00
Allen Porter
3fe6a31ee9 Improve Roborock device info creation and enhance device registration for disabled or failed devices. (#164553) 2026-03-04 15:45:51 +01:00
rappenze
95570643ec Fix handling of several thermostat QuickApp's in fibaro (#164344)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-03-04 15:40:49 +01:00
starkillerOG
e3210b0ab9 Fix Reolink entity unique_id migration when unique_id already exists (#164667) 2026-03-04 15:12:26 +01:00
Artur Pragacz
2edabf903a Add backup integration to recovery mode (#164734) 2026-03-04 14:33:28 +01:00
Stefan Agner
0e4e703b64 Ignore transient empty segments in Matter vacuum (#164737)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-03-04 14:24:28 +01:00
tobiaswaldvogel
88624f5179 Use jog up/down in motionblinds if no tilt position is available (#164694)
Signed-off-by: Tobias Waldvogel <tobias.waldvogel@gmail.com>
Co-authored-by: starkillerOG <starkiller.og@gmail.com>
2026-03-04 13:27:47 +01:00
Erwin Douna
4a5fdfc0ec Bump pyportainer 1.0.31 (#164733) 2026-03-04 13:26:10 +01:00
Bram Kragten
c6e91afae4 Update frontend to 20260304.0 (#164736) 2026-03-04 13:25:57 +01:00
Kamil Breguła
db5e7e4521 Refactor AWS S3 tests (#164098)
Co-authored-by: mik-laj <12058428+mik-laj@users.noreply.github.com>
Co-authored-by: Claude Haiku 4.5 <noreply@anthropic.com>
2026-03-04 13:13:43 +01:00
Joakim Plate
25489c224b Restore handling of is active input for chromecast (#164735) 2026-03-04 13:10:10 +01:00
Tom
c4f64598a0 Add informative errors to Proxmox VE buttons (#164417) 2026-03-04 12:48:17 +01:00
starkillerOG
59e579cf5a Bump reolink-aio to 0.19.1 (#164732) 2026-03-04 12:46:38 +01:00
epenet
831c28cf2c Migrate netgear to use runtime_data (#164718) 2026-03-04 11:37:05 +01:00
Erik Montnemery
be1affc6ba Pin exact Python version in .python-version (#164722) 2026-03-04 11:21:44 +01:00
J. Diego Rodríguez Royo
94a25b5688 Improve mobile_app notify.notify with not connected targets (#161855) 2026-03-04 11:11:02 +01:00
AlCalzone
382940d661 Support Z-Wave Hoppe eHandle tilt sensor (#164689) 2026-03-04 11:00:24 +01:00
170 changed files with 3799 additions and 1166 deletions

View File

@@ -10,7 +10,6 @@ on:
env:
BUILD_TYPE: core
DEFAULT_PYTHON: "3.14.2"
PIP_TIMEOUT: 60
UV_HTTP_TIMEOUT: 60
UV_SYSTEM_PYTHON: "true"
@@ -36,16 +35,17 @@ jobs:
channel: ${{ steps.version.outputs.channel }}
publish: ${{ steps.version.outputs.publish }}
architectures: ${{ env.ARCHITECTURES }}
base_image_version: ${{ env.BASE_IMAGE_VERSION }}
steps:
- name: Checkout the repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
- name: Set up Python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
python-version-file: ".python-version"
- name: Get information
id: info
@@ -75,44 +75,9 @@ jobs:
env:
LOKALISE_TOKEN: ${{ secrets.LOKALISE_TOKEN }}
- name: Archive translations
shell: bash
run: find ./homeassistant/components/*/translations -name "*.json" | tar zcvf translations.tar.gz -T -
- name: Upload translations
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: translations
path: translations.tar.gz
if-no-files-found: error
build_base:
name: Build ${{ matrix.arch }} base core image
if: github.repository_owner == 'home-assistant'
needs: init
runs-on: ${{ matrix.os }}
permissions:
contents: read # To check out the repository
packages: write # To push to GHCR
id-token: write # For cosign signing
strategy:
fail-fast: false
matrix:
arch: ${{ fromJson(needs.init.outputs.architectures) }}
include:
- arch: amd64
os: ubuntu-latest
- arch: aarch64
os: ubuntu-24.04-arm
steps:
- name: Checkout the repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Download nightly wheels of frontend
if: needs.init.outputs.channel == 'dev'
uses: dawidd6/action-download-artifact@5c98f0b039f36ef966fdb7dfa9779262785ecb05 # v14
if: steps.version.outputs.channel == 'dev'
uses: dawidd6/action-download-artifact@2536c51d3d126276eb39f74d6bc9c72ac6ef30d3 # v16
with:
github_token: ${{secrets.GITHUB_TOKEN}}
repo: home-assistant/frontend
@@ -122,8 +87,8 @@ jobs:
name: wheels
- name: Download nightly wheels of intents
if: needs.init.outputs.channel == 'dev'
uses: dawidd6/action-download-artifact@5c98f0b039f36ef966fdb7dfa9779262785ecb05 # v14
if: steps.version.outputs.channel == 'dev'
uses: dawidd6/action-download-artifact@2536c51d3d126276eb39f74d6bc9c72ac6ef30d3 # v16
with:
github_token: ${{secrets.GITHUB_TOKEN}}
repo: OHF-Voice/intents-package
@@ -132,18 +97,12 @@ jobs:
workflow_conclusion: success
name: package
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
if: needs.init.outputs.channel == 'dev'
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
- name: Adjust nightly version
if: needs.init.outputs.channel == 'dev'
if: steps.version.outputs.channel == 'dev'
shell: bash
env:
UV_PRERELEASE: allow
VERSION: ${{ needs.init.outputs.version }}
VERSION: ${{ steps.version.outputs.version }}
run: |
python3 -m pip install "$(grep '^uv' < requirements.txt)"
uv pip install packaging tomli
@@ -181,92 +140,72 @@ jobs:
sed -i "s|home-assistant-intents==.*||" requirements_all.txt requirements.txt
fi
- name: Download translations
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
with:
name: translations
- name: Extract translations
run: |
tar xvf translations.tar.gz
rm translations.tar.gz
- name: Write meta info file
shell: bash
run: |
echo "${GITHUB_SHA};${GITHUB_REF};${GITHUB_EVENT_NAME};${GITHUB_ACTOR}" > rootfs/OFFICIAL_IMAGE
- name: Login to GitHub Container Registry
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
- name: Upload build context overlay
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
name: build-context
if-no-files-found: ignore
path: |
homeassistant/components/*/translations/
rootfs/OFFICIAL_IMAGE
home_assistant_frontend-*.whl
home_assistant_intents-*.whl
homeassistant/const.py
homeassistant/components/frontend/manifest.json
homeassistant/components/conversation/manifest.json
homeassistant/package_constraints.txt
requirements_all.txt
requirements.txt
pyproject.toml
- name: Install Cosign
uses: sigstore/cosign-installer@faadad0cce49287aee09b3a48701e75088a2c6ad # v4.0.0
build_base:
name: Build ${{ matrix.arch }} base core image
if: github.repository_owner == 'home-assistant'
needs: init
runs-on: ${{ matrix.os }}
permissions:
contents: read # To check out the repository
packages: write # To push to GHCR
id-token: write # For cosign signing
strategy:
fail-fast: false
matrix:
include:
- arch: amd64
os: ubuntu-24.04
- arch: aarch64
os: ubuntu-24.04-arm
steps:
- name: Checkout the repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
cosign-release: "v2.5.3"
persist-credentials: false
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Build variables
id: vars
shell: bash
env:
ARCH: ${{ matrix.arch }}
run: |
echo "base_image=ghcr.io/home-assistant/${ARCH}-homeassistant-base:${BASE_IMAGE_VERSION}" >> "$GITHUB_OUTPUT"
echo "cache_image=ghcr.io/home-assistant/${ARCH}-homeassistant:latest" >> "$GITHUB_OUTPUT"
echo "created=$(date --rfc-3339=seconds --utc)" >> "$GITHUB_OUTPUT"
- name: Verify base image signature
env:
BASE_IMAGE: ${{ steps.vars.outputs.base_image }}
run: |
cosign verify \
--certificate-oidc-issuer https://token.actions.githubusercontent.com \
--certificate-identity-regexp "https://github.com/home-assistant/docker/.*" \
"${BASE_IMAGE}"
- name: Verify cache image signature
id: cache
continue-on-error: true
env:
CACHE_IMAGE: ${{ steps.vars.outputs.cache_image }}
run: |
cosign verify \
--certificate-oidc-issuer https://token.actions.githubusercontent.com \
--certificate-identity-regexp "https://github.com/home-assistant/core/.*" \
"${CACHE_IMAGE}"
- name: Download build context overlay
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
with:
name: build-context
- name: Build base image
id: build
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
uses: home-assistant/builder/actions/build-image@gha-builder # zizmor: ignore[unpinned-uses]
with:
context: .
file: ./Dockerfile
platforms: ${{ steps.vars.outputs.platform }}
push: true
cache-from: ${{ steps.cache.outcome == 'success' && steps.vars.outputs.cache_image || '' }}
arch: ${{ matrix.arch }}
build-args: |
BUILD_FROM=${{ steps.vars.outputs.base_image }}
tags: ghcr.io/home-assistant/${{ matrix.arch }}-homeassistant:${{ needs.init.outputs.version }}
outputs: type=image,push=true,compression=zstd,compression-level=9,force-compression=true,oci-mediatypes=true
labels: |
io.hass.arch=${{ matrix.arch }}
io.hass.version=${{ needs.init.outputs.version }}
org.opencontainers.image.created=${{ steps.vars.outputs.created }}
org.opencontainers.image.version=${{ needs.init.outputs.version }}
- name: Sign image
env:
ARCH: ${{ matrix.arch }}
VERSION: ${{ needs.init.outputs.version }}
DIGEST: ${{ steps.build.outputs.digest }}
run: |
cosign sign --yes "ghcr.io/home-assistant/${ARCH}-homeassistant:${VERSION}@${DIGEST}"
BUILD_FROM=ghcr.io/home-assistant/${{ matrix.arch }}-homeassistant-base:${{ needs.init.outputs.base_image_version }}
cache-gha: false
container-registry-password: ${{ secrets.GITHUB_TOKEN }}
context: .
cosign-base-identity: "https://github.com/home-assistant/docker/.*"
cosign-base-verify: ghcr.io/home-assistant/${{ matrix.arch }}-homeassistant-base:${{ needs.init.outputs.base_image_version }}
image: ghcr.io/home-assistant/${{ matrix.arch }}-homeassistant
image-tags: ${{ needs.init.outputs.version }}
push: true
version: ${{ needs.init.outputs.version }}
build_machine:
name: Build ${{ matrix.machine }} machine core image
@@ -315,35 +254,38 @@ jobs:
with:
persist-credentials: false
- name: Set build additional args
- name: Compute extra tags
id: tags
shell: bash
env:
VERSION: ${{ needs.init.outputs.version }}
run: |
# Create general tags
if [[ "${VERSION}" =~ d ]]; then
echo "BUILD_ARGS=--additional-tag dev" >> $GITHUB_ENV
echo "extra_tags=dev" >> "$GITHUB_OUTPUT"
elif [[ "${VERSION}" =~ b ]]; then
echo "BUILD_ARGS=--additional-tag beta" >> $GITHUB_ENV
echo "extra_tags=beta" >> "$GITHUB_OUTPUT"
else
echo "BUILD_ARGS=--additional-tag stable" >> $GITHUB_ENV
echo "extra_tags=stable" >> "$GITHUB_OUTPUT"
fi
- name: Login to GitHub Container Registry
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
- name: Build machine image
uses: home-assistant/builder/actions/build-image@gha-builder # zizmor: ignore[unpinned-uses]
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build base image
uses: home-assistant/builder@6cb4fd3d1338b6e22d0958a4bcb53e0965ea63b4 # 2026.02.1
with:
image: ${{ matrix.arch }}
args: |
$BUILD_ARGS \
--target /data/machine \
--cosign \
--machine "${{ needs.init.outputs.version }}=${{ matrix.machine }}"
arch: ${{ matrix.arch }}
build-args: |
BUILD_FROM=ghcr.io/home-assistant/${{ matrix.arch }}-homeassistant:${{ needs.init.outputs.version }}
cache-gha: false
container-registry-password: ${{ secrets.GITHUB_TOKEN }}
context: machine/
cosign-base-identity: "https://github.com/home-assistant/core/.*"
cosign-base-verify: ghcr.io/home-assistant/${{ matrix.arch }}-homeassistant:${{ needs.init.outputs.version }}
file: machine/${{ matrix.machine }}
image: ghcr.io/home-assistant/${{ matrix.machine }}-homeassistant
image-tags: |
${{ needs.init.outputs.version }}
${{ steps.tags.outputs.extra_tags }}
push: true
version: ${{ needs.init.outputs.version }}
publish_ha:
name: Publish version files
@@ -538,20 +480,15 @@ jobs:
with:
persist-credentials: false
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
- name: Set up Python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
python-version-file: ".python-version"
- name: Download translations
- name: Download build context overlay
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
with:
name: translations
- name: Extract translations
run: |
tar xvf translations.tar.gz
rm translations.tar.gz
name: build-context
- name: Build package
shell: bash

View File

@@ -41,8 +41,7 @@ env:
UV_CACHE_VERSION: 1
MYPY_CACHE_VERSION: 1
HA_SHORT_VERSION: "2026.4"
DEFAULT_PYTHON: "3.14.2"
ALL_PYTHON_VERSIONS: "['3.14.2']"
ADDITIONAL_PYTHON_VERSIONS: "[]"
# 10.3 is the oldest supported version
# - 10.3.32 is the version currently shipped with Synology (as of 17 Feb 2022)
# 10.6 is the current long-term-support
@@ -166,6 +165,11 @@ jobs:
tests_glob=""
lint_only=""
skip_coverage=""
default_python=$(cat .python-version)
all_python_versions=$(jq -cn \
--arg default_python "${default_python}" \
--argjson additional_python_versions "${ADDITIONAL_PYTHON_VERSIONS}" \
'[$default_python] + $additional_python_versions')
if [[ "${INTEGRATION_CHANGES}" != "[]" ]];
then
@@ -235,8 +239,8 @@ jobs:
echo "mariadb_groups=${mariadb_groups}" >> $GITHUB_OUTPUT
echo "postgresql_groups: ${postgresql_groups}"
echo "postgresql_groups=${postgresql_groups}" >> $GITHUB_OUTPUT
echo "python_versions: ${ALL_PYTHON_VERSIONS}"
echo "python_versions=${ALL_PYTHON_VERSIONS}" >> $GITHUB_OUTPUT
echo "python_versions: ${all_python_versions}"
echo "python_versions=${all_python_versions}" >> $GITHUB_OUTPUT
echo "test_full_suite: ${test_full_suite}"
echo "test_full_suite=${test_full_suite}" >> $GITHUB_OUTPUT
echo "integrations_glob: ${integrations_glob}"
@@ -452,7 +456,7 @@ jobs:
python --version
uv pip freeze >> pip_freeze.txt
- name: Upload pip_freeze artifact
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: pip-freeze-${{ matrix.python-version }}
path: pip_freeze.txt
@@ -503,13 +507,13 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
- name: Set up Python
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
python-version-file: ".python-version"
check-latest: true
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
- name: Restore full Python virtual environment
id: cache-venv
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
@@ -540,13 +544,13 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
- name: Set up Python
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
python-version-file: ".python-version"
check-latest: true
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
- name: Restore full Python virtual environment
id: cache-venv
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
@@ -576,11 +580,11 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
- name: Set up Python
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
python-version-file: ".python-version"
check-latest: true
- name: Run gen_copilot_instructions.py
run: |
@@ -653,7 +657,7 @@ jobs:
. venv/bin/activate
python -m script.licenses extract --output-file=licenses-${PYTHON_VERSION}.json
- name: Upload licenses
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: licenses-${{ github.run_number }}-${{ matrix.python-version }}
path: licenses-${{ matrix.python-version }}.json
@@ -682,13 +686,13 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
- name: Set up Python
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
python-version-file: ".python-version"
check-latest: true
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
- name: Restore full Python virtual environment
id: cache-venv
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
@@ -735,13 +739,13 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
- name: Set up Python
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
python-version-file: ".python-version"
check-latest: true
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
- name: Restore full Python virtual environment
id: cache-venv
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
@@ -786,11 +790,11 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
- name: Set up Python
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
python-version-file: ".python-version"
check-latest: true
- name: Generate partial mypy restore key
id: generate-mypy-key
@@ -798,7 +802,7 @@ jobs:
mypy_version=$(cat requirements_test.txt | grep 'mypy.*=' | cut -d '=' -f 3)
echo "version=${mypy_version}" >> $GITHUB_OUTPUT
echo "key=mypy-${MYPY_CACHE_VERSION}-${mypy_version}-${HA_SHORT_VERSION}-$(date -u '+%Y-%m-%dT%H:%M:%s')" >> $GITHUB_OUTPUT
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
- name: Restore full Python virtual environment
id: cache-venv
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
@@ -879,13 +883,13 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
- name: Set up Python
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
python-version-file: ".python-version"
check-latest: true
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
- name: Restore full Python virtual environment
id: cache-venv
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
@@ -901,7 +905,7 @@ jobs:
. venv/bin/activate
python -m script.split_tests ${TEST_GROUP_COUNT} tests
- name: Upload pytest_buckets
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: pytest_buckets
path: pytest_buckets.txt
@@ -1020,14 +1024,14 @@ jobs:
2>&1 | tee pytest-${PYTHON_VERSION}-${TEST_GROUP}.txt
- name: Upload pytest output
if: success() || failure() && steps.pytest-full.conclusion == 'failure'
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{ matrix.group }}
path: pytest-*.txt
overwrite: true
- name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true'
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: coverage-${{ matrix.python-version }}-${{ matrix.group }}
path: coverage.xml
@@ -1040,7 +1044,7 @@ jobs:
mv "junit.xml-tmp" "junit.xml"
- name: Upload test results artifact
if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: test-results-full-${{ matrix.python-version }}-${{ matrix.group }}
path: junit.xml
@@ -1177,7 +1181,7 @@ jobs:
2>&1 | tee pytest-${PYTHON_VERSION}-${mariadb}.txt
- name: Upload pytest output
if: success() || failure() && steps.pytest-partial.conclusion == 'failure'
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.mariadb }}
@@ -1185,7 +1189,7 @@ jobs:
overwrite: true
- name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true'
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: coverage-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.mariadb }}
@@ -1199,7 +1203,7 @@ jobs:
mv "junit.xml-tmp" "junit.xml"
- name: Upload test results artifact
if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: test-results-mariadb-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.mariadb }}
@@ -1338,7 +1342,7 @@ jobs:
2>&1 | tee pytest-${PYTHON_VERSION}-${postgresql}.txt
- name: Upload pytest output
if: success() || failure() && steps.pytest-partial.conclusion == 'failure'
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.postgresql }}
@@ -1346,7 +1350,7 @@ jobs:
overwrite: true
- name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true'
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: coverage-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.postgresql }}
@@ -1360,7 +1364,7 @@ jobs:
mv "junit.xml-tmp" "junit.xml"
- name: Upload test results artifact
if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: test-results-postgres-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.postgresql }}
@@ -1514,14 +1518,14 @@ jobs:
2>&1 | tee pytest-${PYTHON_VERSION}-${TEST_GROUP}.txt
- name: Upload pytest output
if: success() || failure() && steps.pytest-partial.conclusion == 'failure'
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{ matrix.group }}
path: pytest-*.txt
overwrite: true
- name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true'
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: coverage-${{ matrix.python-version }}-${{ matrix.group }}
path: coverage.xml
@@ -1534,7 +1538,7 @@ jobs:
mv "junit.xml-tmp" "junit.xml"
- name: Upload test results artifact
if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: test-results-partial-${{ matrix.python-version }}-${{ matrix.group }}
path: junit.xml

View File

@@ -15,9 +15,6 @@ concurrency:
group: ${{ github.workflow }}
cancel-in-progress: true
env:
DEFAULT_PYTHON: "3.14.2"
jobs:
upload:
name: Upload
@@ -29,10 +26,10 @@ jobs:
with:
persist-credentials: false
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
- name: Set up Python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
python-version-file: ".python-version"
- name: Upload Translations
env:

View File

@@ -16,9 +16,6 @@ on:
- "requirements.txt"
- "script/gen_requirements_all.py"
env:
DEFAULT_PYTHON: "3.14.2"
permissions: {}
concurrency:
@@ -36,11 +33,11 @@ jobs:
with:
persist-credentials: false
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
- name: Set up Python
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
python-version-file: ".python-version"
check-latest: true
- name: Create Python virtual environment
@@ -77,7 +74,7 @@ jobs:
) > .env_file
- name: Upload env_file
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: env_file
path: ./.env_file
@@ -85,7 +82,7 @@ jobs:
overwrite: true
- name: Upload requirements_diff
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: requirements_diff
path: ./requirements_diff.txt
@@ -97,7 +94,7 @@ jobs:
python -m script.gen_requirements_all ci
- name: Upload requirements_all_wheels
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: requirements_all_wheels
path: ./requirements_all_wheels_*.txt

View File

@@ -1 +1 @@
3.14
3.14.2

Dockerfile (generated)
View File

@@ -1,19 +1,9 @@
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM
ARG BUILD_FROM=ghcr.io/home-assistant/amd64-homeassistant-base:latest
FROM ${BUILD_FROM}
LABEL \
io.hass.type="core" \
org.opencontainers.image.authors="The Home Assistant Authors" \
org.opencontainers.image.description="Open-source home automation platform running on Python 3" \
org.opencontainers.image.documentation="https://www.home-assistant.io/docs/" \
org.opencontainers.image.licenses="Apache-2.0" \
org.opencontainers.image.source="https://github.com/home-assistant/core" \
org.opencontainers.image.title="Home Assistant" \
org.opencontainers.image.url="https://www.home-assistant.io/"
# Synchronize with homeassistant/core.py:async_stop
ENV \
S6_SERVICES_GRACETIME=240000 \
@@ -60,3 +50,22 @@ RUN \
homeassistant/homeassistant
WORKDIR /config
ARG BUILD_ARCH=amd64
ARG BUILD_DATE="1970-01-01 00:00:00+00:00"
ARG BUILD_REPOSITORY
ARG BUILD_VERSION=0.0.0-local
LABEL \
io.hass.type="core" \
io.hass.arch="${BUILD_ARCH}" \
io.hass.version="${BUILD_VERSION}" \
org.opencontainers.image.created="${BUILD_DATE}" \
org.opencontainers.image.version="${BUILD_VERSION}" \
org.opencontainers.image.source="${BUILD_REPOSITORY}" \
org.opencontainers.image.authors="The Home Assistant Authors" \
org.opencontainers.image.description="Open-source home automation platform running on Python 3" \
org.opencontainers.image.documentation="https://www.home-assistant.io/docs/" \
org.opencontainers.image.licenses="Apache-2.0" \
org.opencontainers.image.title="Home Assistant" \
org.opencontainers.image.url="https://www.home-assistant.io/"

View File

@@ -239,6 +239,8 @@ DEFAULT_INTEGRATIONS = {
}
DEFAULT_INTEGRATIONS_RECOVERY_MODE = {
# These integrations are set up if recovery mode is activated.
"backup",
"cloud",
"frontend",
}
DEFAULT_INTEGRATIONS_SUPERVISOR = {

View File

@@ -149,6 +149,7 @@ _EXPERIMENTAL_TRIGGER_PLATFORMS = {
"lock",
"media_player",
"person",
"remote",
"scene",
"siren",
"switch",

View File

@@ -804,8 +804,22 @@ class CastMediaPlayerEntity(CastDevice, MediaPlayerEntity):
@property
def state(self) -> MediaPlayerState | None:
"""Return the state of the player."""
# The lovelace app loops media to prevent timing out, don't show that
if (chromecast := self._chromecast) is None or (
cast_status := self.cast_status
) is None:
# Not connected to any chromecast, or not yet got any status
return None
if (
chromecast.cast_type == pychromecast.const.CAST_TYPE_CHROMECAST
and not chromecast.ignore_cec
and cast_status.is_active_input is False
):
# The display interface for the device has been turned off or switched away
return MediaPlayerState.OFF
if self.app_id == CAST_APP_ID_HOMEASSISTANT_LOVELACE:
# The lovelace app loops media to prevent timing out, don't show that
return MediaPlayerState.PLAYING
if (media_status := self._media_status()[0]) is not None:
@@ -822,16 +836,12 @@ class CastMediaPlayerEntity(CastDevice, MediaPlayerEntity):
# Some apps don't report media status, show the player as playing
return MediaPlayerState.PLAYING
if self.app_id is not None and self.app_id != pychromecast.config.APP_BACKDROP:
# We have an active app
return MediaPlayerState.IDLE
if self._chromecast is not None and self._chromecast.is_idle:
# If library consider us idle, that is our off state
# it takes HDMI status into account for cast devices.
if self.app_id in (pychromecast.IDLE_APP_ID, None):
# We have no active app or the home screen app. This is
# same app as APP_BACKDROP.
return MediaPlayerState.OFF
return None
return MediaPlayerState.IDLE
@property
def media_content_id(self) -> str | None:

View File

@@ -91,6 +91,7 @@ class CoverEntityFeature(IntFlag):
ATTR_CURRENT_POSITION = "current_position"
ATTR_CURRENT_TILT_POSITION = "current_tilt_position"
ATTR_IS_CLOSED = "is_closed"
ATTR_POSITION = "position"
ATTR_TILT_POSITION = "tilt_position"
@@ -267,7 +268,9 @@ class CoverEntity(Entity, cached_properties=CACHED_PROPERTIES_WITH_ATTR_):
@property
def state_attributes(self) -> dict[str, Any]:
"""Return the state attributes."""
data = {}
data: dict[str, Any] = {}
data[ATTR_IS_CLOSED] = self.is_closed
if (current := self.current_cover_position) is not None:
data[ATTR_CURRENT_POSITION] = current

View File

@@ -13,7 +13,7 @@
},
"user": {
"data": {
"host": "Host"
"host": "[%key:common::config_flow::data::host%]"
},
"description": "Please enter the host name or IP address of the Devialet device."
}

View File

@@ -275,8 +275,11 @@ class FibaroController:
# otherwise add the first visible device in the group
# which is a hack, but solves a problem with FGT having
# hidden compatibility devices before the real device
if last_climate_parent != device.parent_fibaro_id or (
device.has_endpoint_id and last_endpoint != device.endpoint_id
# Second hack is for quickapps which have parent id 0 and no children
if (
last_climate_parent != device.parent_fibaro_id
or (device.has_endpoint_id and last_endpoint != device.endpoint_id)
or device.parent_fibaro_id == 0
):
_LOGGER.debug("Handle separately")
self.fibaro_devices[platform].append(device)

View File

@@ -154,7 +154,7 @@
},
"issues": {
"deprecated_fireplace_switch": {
"description": "The fireplace mode switch entity `{entity_id}` is deprecated and will be removed in a future version.\n\nFireplace mode has been moved to a climate preset on the climate entity to better match the device interface.\n\nPlease update your automations to use the `climate.set_preset_mode` action with preset mode `fireplace` instead of using the switch entity.\n\nAfter updating your automations, you can safely disable this switch entity.",
"description": "The fireplace mode switch entity `{entity_id}` is deprecated and will be removed in Home Assistant 2026.9.\n\nFireplace mode has been moved to a climate preset on the climate entity to better match the device interface.\n\nPlease update your automations to use the `climate.set_preset_mode` action with preset mode `fireplace` instead of using the switch entity.\n\nAfter updating your automations, you can safely disable this switch entity.",
"title": "Fireplace mode switch is deprecated"
}
}

View File

@@ -91,6 +91,7 @@ async def async_setup_entry(
hass,
DOMAIN,
f"deprecated_switch_{fireplace_switch_unique_id}",
breaks_in_ha_version="2026.9.0",
is_fixable=False,
issue_domain=DOMAIN,
severity=IssueSeverity.WARNING,
@@ -102,7 +103,7 @@ async def async_setup_entry(
entities.append(FlexitSwitch(coordinator, description))
else:
entities.append(FlexitSwitch(coordinator, description))
async_add_entities(entities)
async_add_entities(entities)
PARALLEL_UPDATES = 1

View File

@@ -21,5 +21,5 @@
"integration_type": "system",
"preview_features": { "winter_mode": {} },
"quality_scale": "internal",
"requirements": ["home-assistant-frontend==20260302.0"]
"requirements": ["home-assistant-frontend==20260304.0"]
}

View File

@@ -610,6 +610,7 @@ SENSORS: Final[tuple[HomeWizardSensorEntityDescription, ...]] = (
key="active_liter_lpm",
translation_key="active_liter_lpm",
native_unit_of_measurement=UnitOfVolumeFlowRate.LITERS_PER_MINUTE,
device_class=SensorDeviceClass.VOLUME_FLOW_RATE,
state_class=SensorStateClass.MEASUREMENT,
has_fn=lambda data: data.measurement.active_liter_lpm is not None,
value_fn=lambda data: data.measurement.active_liter_lpm,

View File

@@ -40,49 +40,6 @@ class HydrawiseData:
)
class HydrawiseWaterUseData:
"""Container for data fetched from the Hydrawise water use data.
Proxies the main data through the main coordinator to make sure water use
sensors have access to updated main data without needing to be updated at
the same frequency as the main data.
"""
def __init__(
self,
main_coordinator: HydrawiseMainDataUpdateCoordinator,
daily_water_summary: dict[int, ControllerWaterUseSummary],
) -> None:
"""Initialize the HydrawiseWaterUseData."""
self._main_coordinator = main_coordinator
self.daily_water_summary = daily_water_summary
@property
def user(self) -> User:
"""Return the Hydrawise user."""
return self._main_coordinator.data.user
@property
def controllers(self) -> dict[int, Controller]:
"""Return the Hydrawise controllers."""
return self._main_coordinator.data.controllers
@property
def zones(self) -> dict[int, Zone]:
"""Return the Hydrawise zones."""
return self._main_coordinator.data.zones
@property
def zone_id_to_controller(self) -> dict[int, Controller]:
"""Return a mapping of zone ID to controller."""
return self._main_coordinator.data.zone_id_to_controller
@property
def sensors(self) -> dict[int, Sensor]:
"""Return the Hydrawise sensors."""
return self._main_coordinator.data.sensors
@dataclass
class HydrawiseUpdateCoordinators:
"""Container for all Hydrawise DataUpdateCoordinator instances."""
@@ -91,7 +48,14 @@ class HydrawiseUpdateCoordinators:
water_use: HydrawiseWaterUseDataUpdateCoordinator
class HydrawiseMainDataUpdateCoordinator(DataUpdateCoordinator[HydrawiseData]):
class HydrawiseDataUpdateCoordinator(DataUpdateCoordinator[HydrawiseData]):
"""Base class for Hydrawise Data Update Coordinators."""
api: HydrawiseBase
config_entry: HydrawiseConfigEntry
class HydrawiseMainDataUpdateCoordinator(HydrawiseDataUpdateCoordinator):
"""The main Hydrawise Data Update Coordinator.
This fetches the primary state data for Hydrawise controllers and zones
@@ -99,9 +63,6 @@ class HydrawiseMainDataUpdateCoordinator(DataUpdateCoordinator[HydrawiseData]):
integration are updated in a timely manner.
"""
api: HydrawiseBase
config_entry: HydrawiseConfigEntry
def __init__(
self,
hass: HomeAssistant,
@@ -212,18 +173,13 @@ class HydrawiseMainDataUpdateCoordinator(DataUpdateCoordinator[HydrawiseData]):
new_zone_callback(new_zones)
class HydrawiseWaterUseDataUpdateCoordinator(
DataUpdateCoordinator[HydrawiseWaterUseData]
):
class HydrawiseWaterUseDataUpdateCoordinator(HydrawiseDataUpdateCoordinator):
"""Data Update Coordinator for Hydrawise Water Use.
This fetches data that is more expensive for the Hydrawise API to compute
at a less frequent interval as to not overload the Hydrawise servers.
"""
api: HydrawiseBase
config_entry: HydrawiseConfigEntry
_main_coordinator: HydrawiseMainDataUpdateCoordinator
def __init__(
@@ -244,7 +200,7 @@ class HydrawiseWaterUseDataUpdateCoordinator(
self.api = api
self._main_coordinator = main_coordinator
async def _async_update_data(self) -> HydrawiseWaterUseData:
async def _async_update_data(self) -> HydrawiseData:
"""Fetch the latest data from Hydrawise."""
daily_water_summary: dict[int, ControllerWaterUseSummary] = {}
for controller in self._main_coordinator.data.controllers.values():
@@ -253,7 +209,11 @@ class HydrawiseWaterUseDataUpdateCoordinator(
now().replace(hour=0, minute=0, second=0, microsecond=0),
now(),
)
return HydrawiseWaterUseData(
main_coordinator=self._main_coordinator,
main_data = self._main_coordinator.data
return HydrawiseData(
user=main_data.user,
controllers=main_data.controllers,
zones=main_data.zones,
sensors=main_data.sensors,
daily_water_summary=daily_water_summary,
)

View File

@@ -10,17 +10,10 @@ from homeassistant.helpers.entity import EntityDescription
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN, MANUFACTURER, MODEL_ZONE
from .coordinator import (
HydrawiseMainDataUpdateCoordinator,
HydrawiseWaterUseDataUpdateCoordinator,
)
from .coordinator import HydrawiseDataUpdateCoordinator
class HydrawiseEntity(
CoordinatorEntity[
HydrawiseMainDataUpdateCoordinator | HydrawiseWaterUseDataUpdateCoordinator
]
):
class HydrawiseEntity(CoordinatorEntity[HydrawiseDataUpdateCoordinator]):
"""Entity class for Hydrawise devices."""
_attr_attribution = "Data provided by hydrawise.com"
@@ -28,8 +21,7 @@ class HydrawiseEntity(
def __init__(
self,
coordinator: HydrawiseMainDataUpdateCoordinator
| HydrawiseWaterUseDataUpdateCoordinator,
coordinator: HydrawiseDataUpdateCoordinator,
description: EntityDescription,
controller: Controller,
*,

View File

@@ -2,66 +2,21 @@
from __future__ import annotations
from dataclasses import dataclass
from datetime import timedelta
import logging
import pyiss
import requests
from requests.exceptions import HTTPError
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import DOMAIN
_LOGGER = logging.getLogger(__name__)
from .coordinator import IssConfigEntry, IssDataUpdateCoordinator
PLATFORMS = [Platform.SENSOR]
@dataclass
class IssData:
"""Dataclass representation of data returned from pyiss."""
number_of_people_in_space: int
current_location: dict[str, str]
def update(iss: pyiss.ISS) -> IssData:
"""Retrieve data from the pyiss API."""
return IssData(
number_of_people_in_space=iss.number_of_people_in_space(),
current_location=iss.current_location(),
)
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_setup_entry(hass: HomeAssistant, entry: IssConfigEntry) -> bool:
"""Set up this integration using UI."""
hass.data.setdefault(DOMAIN, {})
iss = pyiss.ISS()
async def async_update() -> IssData:
try:
return await hass.async_add_executor_job(update, iss)
except (HTTPError, requests.exceptions.ConnectionError) as ex:
raise UpdateFailed("Unable to retrieve data") from ex
coordinator = DataUpdateCoordinator(
hass,
_LOGGER,
config_entry=entry,
name=DOMAIN,
update_method=async_update,
update_interval=timedelta(seconds=60),
)
coordinator = IssDataUpdateCoordinator(hass, entry)
await coordinator.async_config_entry_first_refresh()
hass.data[DOMAIN] = coordinator
entry.runtime_data = coordinator
entry.async_on_unload(entry.add_update_listener(update_listener))
@@ -70,13 +25,11 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
return True
async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_unload_entry(hass: HomeAssistant, entry: IssConfigEntry) -> bool:
"""Handle removal of an entry."""
if unload_ok := await hass.config_entries.async_unload_platforms(entry, PLATFORMS):
del hass.data[DOMAIN]
return unload_ok
return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
async def update_listener(hass: HomeAssistant, entry: ConfigEntry) -> None:
async def update_listener(hass: HomeAssistant, entry: IssConfigEntry) -> None:
"""Handle options update."""
await hass.config_entries.async_reload(entry.entry_id)

View File

@@ -4,16 +4,12 @@ from __future__ import annotations
import voluptuous as vol
from homeassistant.config_entries import (
ConfigEntry,
ConfigFlow,
ConfigFlowResult,
OptionsFlow,
)
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult, OptionsFlow
from homeassistant.const import CONF_SHOW_ON_MAP
from homeassistant.core import callback
from .const import DEFAULT_NAME, DOMAIN
from .coordinator import IssConfigEntry
class ISSConfigFlow(ConfigFlow, domain=DOMAIN):
@@ -24,7 +20,7 @@ class ISSConfigFlow(ConfigFlow, domain=DOMAIN):
@staticmethod
@callback
def async_get_options_flow(
config_entry: ConfigEntry,
config_entry: IssConfigEntry,
) -> OptionsFlowHandler:
"""Get the options flow for this handler."""
return OptionsFlowHandler()

View File

@@ -3,3 +3,5 @@
DOMAIN = "iss"
DEFAULT_NAME = "ISS"
MAX_CONSECUTIVE_FAILURES = 5

View File

@@ -0,0 +1,76 @@
"""DataUpdateCoordinator for the ISS integration."""
from __future__ import annotations
from dataclasses import dataclass
from datetime import timedelta
import logging
import pyiss
import requests
from requests.exceptions import HTTPError
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import DOMAIN, MAX_CONSECUTIVE_FAILURES
type IssConfigEntry = ConfigEntry[IssDataUpdateCoordinator]
_LOGGER = logging.getLogger(__name__)
@dataclass
class IssData:
"""Dataclass representation of data returned from pyiss."""
number_of_people_in_space: int
current_location: dict[str, str]
class IssDataUpdateCoordinator(DataUpdateCoordinator[IssData]):
"""ISS coordinator that tolerates transient API failures."""
config_entry: IssConfigEntry
def __init__(self, hass: HomeAssistant, entry: IssConfigEntry) -> None:
"""Initialize the ISS coordinator."""
super().__init__(
hass,
_LOGGER,
config_entry=entry,
name=DOMAIN,
update_interval=timedelta(seconds=60),
)
self._consecutive_failures = 0
self.iss = pyiss.ISS()
def _fetch_iss_data(self) -> IssData:
"""Fetch data from ISS API (blocking)."""
return IssData(
number_of_people_in_space=self.iss.number_of_people_in_space(),
current_location=self.iss.current_location(),
)
async def _async_update_data(self) -> IssData:
"""Fetch data from the ISS API, tolerating transient failures."""
try:
data = await self.hass.async_add_executor_job(self._fetch_iss_data)
except (HTTPError, requests.exceptions.ConnectionError) as err:
self._consecutive_failures += 1
if self.data is None:
raise UpdateFailed("Unable to retrieve data") from err
if self._consecutive_failures >= MAX_CONSECUTIVE_FAILURES:
raise UpdateFailed(
f"Unable to retrieve data after {self._consecutive_failures} consecutive update failures"
) from err
_LOGGER.debug(
"Transient API error (%s/%s), using cached data: %s",
self._consecutive_failures,
MAX_CONSECUTIVE_FAILURES,
err,
)
return self.data
self._consecutive_failures = 0
return data

View File

@@ -6,36 +6,32 @@ import logging
from typing import Any
from homeassistant.components.sensor import SensorEntity
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import ATTR_LATITUDE, ATTR_LONGITUDE, CONF_SHOW_ON_MAP
from homeassistant.core import HomeAssistant
from homeassistant.helpers.device_registry import DeviceEntryType, DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import (
CoordinatorEntity,
DataUpdateCoordinator,
)
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from . import IssData
from .const import DEFAULT_NAME, DOMAIN
from .coordinator import IssConfigEntry, IssDataUpdateCoordinator
_LOGGER = logging.getLogger(__name__)
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: IssConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the sensor platform."""
coordinator: DataUpdateCoordinator[IssData] = hass.data[DOMAIN]
coordinator = entry.runtime_data
show_on_map = entry.options.get(CONF_SHOW_ON_MAP, False)
async_add_entities([IssSensor(coordinator, entry, show_on_map)])
class IssSensor(CoordinatorEntity[DataUpdateCoordinator[IssData]], SensorEntity):
class IssSensor(CoordinatorEntity[IssDataUpdateCoordinator], SensorEntity):
"""Implementation of the ISS sensor."""
_attr_has_entity_name = True
@@ -43,8 +39,8 @@ class IssSensor(CoordinatorEntity[DataUpdateCoordinator[IssData]], SensorEntity)
def __init__(
self,
coordinator: DataUpdateCoordinator[IssData],
entry: ConfigEntry,
coordinator: IssDataUpdateCoordinator,
entry: IssConfigEntry,
show: bool,
) -> None:
"""Initialize the sensor."""

View File

@@ -7,5 +7,5 @@
"iot_class": "local_push",
"loggers": ["aionotify", "evdev"],
"quality_scale": "legacy",
"requirements": ["evdev==1.6.1", "asyncinotify==4.2.0"]
"requirements": ["evdev==1.9.3", "asyncinotify==4.4.0"]
}

View File

@@ -4,6 +4,7 @@ from __future__ import annotations
from dataclasses import dataclass
from enum import IntEnum
import logging
from typing import TYPE_CHECKING, Any
from chip.clusters import Objects as clusters
@@ -26,6 +27,8 @@ from .entity import MatterEntity, MatterEntityDescription
from .helpers import get_matter
from .models import MatterDiscoverySchema
_LOGGER = logging.getLogger(__name__)
class OperationalState(IntEnum):
"""Operational State of the vacuum cleaner.
@@ -254,9 +257,18 @@ class MatterVacuum(MatterEntity, StateVacuumEntity):
VacuumEntityFeature.CLEAN_AREA in self.supported_features
and self.registry_entry is not None
and (last_seen_segments := self.last_seen_segments) is not None
and self._current_segments != {s.id: s for s in last_seen_segments}
# Ignore empty segments; some devices transiently
# report an empty list before sending the real one.
and (current_segments := self._current_segments)
):
self.async_create_segments_issue()
last_seen_by_id = {s.id: s for s in last_seen_segments}
if current_segments != last_seen_by_id:
_LOGGER.debug(
"Vacuum segments changed: last_seen=%s, current=%s",
last_seen_by_id,
current_segments,
)
self.async_create_segments_issue()
@callback
def _calculate_features(self) -> None:

View File

@@ -120,6 +120,7 @@ class MobileAppNotificationService(BaseNotificationService):
local_push_channels = self.hass.data[DOMAIN][DATA_PUSH_CHANNEL]
failed_targets = []
for target in targets:
registration = self.hass.data[DOMAIN][DATA_CONFIG_ENTRIES][target].data
@@ -134,12 +135,16 @@ class MobileAppNotificationService(BaseNotificationService):
# Test if local push only.
if ATTR_PUSH_URL not in registration[ATTR_APP_DATA]:
raise HomeAssistantError(
"Device not connected to local push notifications"
)
failed_targets.append(target)
continue
await self._async_send_remote_message_target(target, registration, data)
if failed_targets:
raise HomeAssistantError(
f"Device(s) with webhook id(s) {', '.join(failed_targets)} not connected to local push notifications"
)
async def _async_send_remote_message_target(self, target, registration, data):
"""Send a message to a target."""
app_data = registration[ATTR_APP_DATA]

View File

@@ -307,17 +307,25 @@ class MotionTiltDevice(MotionPositionDevice):
async def async_open_cover_tilt(self, **kwargs: Any) -> None:
"""Open the cover tilt."""
async with self._api_lock:
await self.hass.async_add_executor_job(self._blind.Set_angle, 0)
if self.current_cover_tilt_position is not None:
async with self._api_lock:
await self.hass.async_add_executor_job(self._blind.Set_angle, 0)
await self.async_request_position_till_stop()
await self.async_request_position_till_stop()
else:
async with self._api_lock:
await self.hass.async_add_executor_job(self._blind.Jog_up)
async def async_close_cover_tilt(self, **kwargs: Any) -> None:
"""Close the cover tilt."""
async with self._api_lock:
await self.hass.async_add_executor_job(self._blind.Set_angle, 180)
if self.current_cover_tilt_position is not None:
async with self._api_lock:
await self.hass.async_add_executor_job(self._blind.Set_angle, 180)
await self.async_request_position_till_stop()
await self.async_request_position_till_stop()
else:
async with self._api_lock:
await self.hass.async_add_executor_job(self._blind.Jog_down)
async def async_set_cover_tilt_position(self, **kwargs: Any) -> None:
"""Move the cover tilt to a specific position."""

View File

@@ -2,39 +2,31 @@
from __future__ import annotations
from datetime import timedelta
import logging
from typing import Any
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_PORT, CONF_SSL
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.helpers import device_registry as dr, entity_registry as er
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .const import (
DOMAIN,
KEY_COORDINATOR,
KEY_COORDINATOR_FIRMWARE,
KEY_COORDINATOR_LINK,
KEY_COORDINATOR_SPEED,
KEY_COORDINATOR_TRAFFIC,
KEY_COORDINATOR_UTIL,
KEY_ROUTER,
PLATFORMS,
from .const import PLATFORMS
from .coordinator import (
NetgearConfigEntry,
NetgearFirmwareCoordinator,
NetgearLinkCoordinator,
NetgearRuntimeData,
NetgearSpeedTestCoordinator,
NetgearTrackerCoordinator,
NetgearTrafficMeterCoordinator,
NetgearUtilizationCoordinator,
)
from .errors import CannotLoginException
from .router import NetgearRouter
_LOGGER = logging.getLogger(__name__)
SCAN_INTERVAL = timedelta(seconds=30)
SPEED_TEST_INTERVAL = timedelta(hours=2)
SCAN_INTERVAL_FIRMWARE = timedelta(hours=5)
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_setup_entry(hass: HomeAssistant, entry: NetgearConfigEntry) -> bool:
"""Set up Netgear component."""
router = NetgearRouter(hass, entry)
try:
@@ -59,116 +51,41 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
router.ssl,
)
hass.data.setdefault(DOMAIN, {})
async def async_update_devices() -> bool:
"""Fetch data from the router."""
if router.track_devices:
return await router.async_update_device_trackers()
return False
async def async_update_traffic_meter() -> dict[str, Any] | None:
"""Fetch data from the router."""
return await router.async_get_traffic_meter()
async def async_update_speed_test() -> dict[str, Any] | None:
"""Fetch data from the router."""
return await router.async_get_speed_test()
async def async_check_firmware() -> dict[str, Any] | None:
"""Check for new firmware of the router."""
return await router.async_check_new_firmware()
async def async_update_utilization() -> dict[str, Any] | None:
"""Fetch data from the router."""
return await router.async_get_utilization()
async def async_check_link_status() -> dict[str, Any] | None:
"""Fetch data from the router."""
return await router.async_get_link_status()
# Create update coordinators
coordinator = DataUpdateCoordinator(
hass,
_LOGGER,
config_entry=entry,
name=f"{router.device_name} Devices",
update_method=async_update_devices,
update_interval=SCAN_INTERVAL,
)
coordinator_traffic_meter = DataUpdateCoordinator(
hass,
_LOGGER,
config_entry=entry,
name=f"{router.device_name} Traffic meter",
update_method=async_update_traffic_meter,
update_interval=SCAN_INTERVAL,
)
coordinator_speed_test = DataUpdateCoordinator(
hass,
_LOGGER,
config_entry=entry,
name=f"{router.device_name} Speed test",
update_method=async_update_speed_test,
update_interval=SPEED_TEST_INTERVAL,
)
coordinator_firmware = DataUpdateCoordinator(
hass,
_LOGGER,
config_entry=entry,
name=f"{router.device_name} Firmware",
update_method=async_check_firmware,
update_interval=SCAN_INTERVAL_FIRMWARE,
)
coordinator_utilization = DataUpdateCoordinator(
hass,
_LOGGER,
config_entry=entry,
name=f"{router.device_name} Utilization",
update_method=async_update_utilization,
update_interval=SCAN_INTERVAL,
)
coordinator_link = DataUpdateCoordinator(
hass,
_LOGGER,
config_entry=entry,
name=f"{router.device_name} Ethernet Link Status",
update_method=async_check_link_status,
update_interval=SCAN_INTERVAL,
)
coordinator_tracker = NetgearTrackerCoordinator(hass, router, entry)
coordinator_traffic_meter = NetgearTrafficMeterCoordinator(hass, router, entry)
coordinator_speed_test = NetgearSpeedTestCoordinator(hass, router, entry)
coordinator_firmware = NetgearFirmwareCoordinator(hass, router, entry)
coordinator_utilization = NetgearUtilizationCoordinator(hass, router, entry)
coordinator_link = NetgearLinkCoordinator(hass, router, entry)
if router.track_devices:
await coordinator.async_config_entry_first_refresh()
await coordinator_tracker.async_config_entry_first_refresh()
await coordinator_traffic_meter.async_config_entry_first_refresh()
await coordinator_firmware.async_config_entry_first_refresh()
await coordinator_utilization.async_config_entry_first_refresh()
await coordinator_link.async_config_entry_first_refresh()
hass.data[DOMAIN][entry.entry_id] = {
KEY_ROUTER: router,
KEY_COORDINATOR: coordinator,
KEY_COORDINATOR_TRAFFIC: coordinator_traffic_meter,
KEY_COORDINATOR_SPEED: coordinator_speed_test,
KEY_COORDINATOR_FIRMWARE: coordinator_firmware,
KEY_COORDINATOR_UTIL: coordinator_utilization,
KEY_COORDINATOR_LINK: coordinator_link,
}
entry.runtime_data = NetgearRuntimeData(
router=router,
coordinator_tracker=coordinator_tracker,
coordinator_traffic=coordinator_traffic_meter,
coordinator_speed=coordinator_speed_test,
coordinator_firmware=coordinator_firmware,
coordinator_utilization=coordinator_utilization,
coordinator_link=coordinator_link,
)
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True
async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_unload_entry(hass: HomeAssistant, entry: NetgearConfigEntry) -> bool:
"""Unload a config entry."""
unload_ok = await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
router = hass.data[DOMAIN][entry.entry_id][KEY_ROUTER]
if unload_ok:
hass.data[DOMAIN].pop(entry.entry_id)
if not hass.data[DOMAIN]:
hass.data.pop(DOMAIN)
router = entry.runtime_data.router
if not router.track_devices:
router_id = None
@@ -193,10 +110,10 @@ async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_remove_config_entry_device(
hass: HomeAssistant, config_entry: ConfigEntry, device_entry: dr.DeviceEntry
hass: HomeAssistant, config_entry: NetgearConfigEntry, device_entry: dr.DeviceEntry
) -> bool:
"""Remove a device from a config entry."""
router = hass.data[DOMAIN][config_entry.entry_id][KEY_ROUTER]
router = config_entry.runtime_data.router
device_mac = None
for connection in device_entry.connections:

View File

@@ -9,13 +9,11 @@ from homeassistant.components.button import (
ButtonEntity,
ButtonEntityDescription,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .const import DOMAIN, KEY_COORDINATOR, KEY_ROUTER
from .coordinator import NetgearConfigEntry, NetgearTrackerCoordinator
from .entity import NetgearRouterCoordinatorEntity
from .router import NetgearRouter
@@ -39,14 +37,14 @@ BUTTONS = [
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: NetgearConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up button for Netgear component."""
router = hass.data[DOMAIN][entry.entry_id][KEY_ROUTER]
coordinator = hass.data[DOMAIN][entry.entry_id][KEY_COORDINATOR]
router = entry.runtime_data.router
coordinator_tracker = entry.runtime_data.coordinator_tracker
async_add_entities(
NetgearRouterButtonEntity(coordinator, router, entity_description)
NetgearRouterButtonEntity(coordinator_tracker, router, entity_description)
for entity_description in BUTTONS
)
@@ -58,7 +56,7 @@ class NetgearRouterButtonEntity(NetgearRouterCoordinatorEntity, ButtonEntity):
def __init__(
self,
coordinator: DataUpdateCoordinator,
coordinator: NetgearTrackerCoordinator,
router: NetgearRouter,
entity_description: NetgearButtonEntityDescription,
) -> None:

View File

@@ -16,14 +16,6 @@ PLATFORMS = [
CONF_CONSIDER_HOME = "consider_home"
KEY_ROUTER = "router"
KEY_COORDINATOR = "coordinator"
KEY_COORDINATOR_TRAFFIC = "coordinator_traffic"
KEY_COORDINATOR_SPEED = "coordinator_speed"
KEY_COORDINATOR_FIRMWARE = "coordinator_firmware"
KEY_COORDINATOR_UTIL = "coordinator_utilization"
KEY_COORDINATOR_LINK = "coordinator_link"
DEFAULT_CONSIDER_HOME = timedelta(seconds=180)
DEFAULT_NAME = "Netgear router"

View File

@@ -0,0 +1,163 @@
"""Models for the Netgear integration."""
from __future__ import annotations
from dataclasses import dataclass
from datetime import timedelta
import logging
from typing import Any
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .router import NetgearRouter
_LOGGER = logging.getLogger(__name__)
SCAN_INTERVAL = timedelta(seconds=30)
SCAN_INTERVAL_FIRMWARE = timedelta(hours=5)
SPEED_TEST_INTERVAL = timedelta(hours=2)
@dataclass
class NetgearRuntimeData:
"""Runtime data for the Netgear integration."""
router: NetgearRouter
coordinator_tracker: NetgearTrackerCoordinator
coordinator_traffic: NetgearTrafficMeterCoordinator
coordinator_speed: NetgearSpeedTestCoordinator
coordinator_firmware: NetgearFirmwareCoordinator
coordinator_utilization: NetgearUtilizationCoordinator
coordinator_link: NetgearLinkCoordinator
type NetgearConfigEntry = ConfigEntry[NetgearRuntimeData]
class NetgearDataCoordinator[T](DataUpdateCoordinator[T]):
"""Base coordinator for Netgear."""
config_entry: NetgearConfigEntry
def __init__(
self,
hass: HomeAssistant,
router: NetgearRouter,
entry: NetgearConfigEntry,
*,
name: str,
update_interval: timedelta,
) -> None:
"""Initialize the coordinator."""
super().__init__(
hass,
_LOGGER,
config_entry=entry,
name=f"{router.device_name} {name}",
update_interval=update_interval,
)
self.router = router
class NetgearTrackerCoordinator(NetgearDataCoordinator[bool]):
"""Coordinator for Netgear device tracking."""
def __init__(
self, hass: HomeAssistant, router: NetgearRouter, entry: NetgearConfigEntry
) -> None:
"""Initialize the coordinator."""
super().__init__(
hass, router, entry, name="Devices", update_interval=SCAN_INTERVAL
)
async def _async_update_data(self) -> bool:
"""Fetch data from the router."""
if self.router.track_devices:
return await self.router.async_update_device_trackers()
return False
class NetgearTrafficMeterCoordinator(NetgearDataCoordinator[dict[str, Any] | None]):
"""Coordinator for Netgear traffic meter data."""
def __init__(
self, hass: HomeAssistant, router: NetgearRouter, entry: NetgearConfigEntry
) -> None:
"""Initialize the coordinator."""
super().__init__(
hass, router, entry, name="Traffic meter", update_interval=SCAN_INTERVAL
)
async def _async_update_data(self) -> dict[str, Any] | None:
"""Fetch data from the router."""
return await self.router.async_get_traffic_meter()
class NetgearSpeedTestCoordinator(NetgearDataCoordinator[dict[str, Any] | None]):
"""Coordinator for Netgear speed test data."""
def __init__(
self, hass: HomeAssistant, router: NetgearRouter, entry: NetgearConfigEntry
) -> None:
"""Initialize the coordinator."""
super().__init__(
hass, router, entry, name="Speed test", update_interval=SPEED_TEST_INTERVAL
)
async def _async_update_data(self) -> dict[str, Any] | None:
"""Fetch data from the router."""
return await self.router.async_get_speed_test()
class NetgearFirmwareCoordinator(NetgearDataCoordinator[dict[str, Any] | None]):
"""Coordinator for Netgear firmware updates."""
def __init__(
self, hass: HomeAssistant, router: NetgearRouter, entry: NetgearConfigEntry
) -> None:
"""Initialize the coordinator."""
super().__init__(
hass, router, entry, name="Firmware", update_interval=SCAN_INTERVAL_FIRMWARE
)
async def _async_update_data(self) -> dict[str, Any] | None:
"""Check for new firmware of the router."""
return await self.router.async_check_new_firmware()
class NetgearUtilizationCoordinator(NetgearDataCoordinator[dict[str, Any] | None]):
"""Coordinator for Netgear utilization data."""
def __init__(
self, hass: HomeAssistant, router: NetgearRouter, entry: NetgearConfigEntry
) -> None:
"""Initialize the coordinator."""
super().__init__(
hass, router, entry, name="Utilization", update_interval=SCAN_INTERVAL
)
async def _async_update_data(self) -> dict[str, Any] | None:
"""Fetch data from the router."""
return await self.router.async_get_utilization()
class NetgearLinkCoordinator(NetgearDataCoordinator[dict[str, Any] | None]):
"""Coordinator for Netgear Ethernet link status."""
def __init__(
self, hass: HomeAssistant, router: NetgearRouter, entry: NetgearConfigEntry
) -> None:
"""Initialize the coordinator."""
super().__init__(
hass,
router,
entry,
name="Ethernet Link Status",
update_interval=SCAN_INTERVAL,
)
async def _async_update_data(self) -> dict[str, Any] | None:
"""Fetch data from the router."""
return await self.router.async_get_link_status()
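
The generic NetgearDataCoordinator[T] base above centralizes the logger, config entry, display name, and update interval, so another poll target only needs an _async_update_data override. A hedged sketch (not part of this diff) of how a hypothetical additional coordinator would plug into the same base; the class name and router.async_get_parental_control() call are invented for illustration only:

# Sketch only: a hypothetical extra coordinator built on NetgearDataCoordinator.
# The name and the router method are illustrative, not part of the integration.
class NetgearParentalControlCoordinator(NetgearDataCoordinator[dict[str, Any] | None]):
    """Hypothetical coordinator for an additional Netgear data source."""
    def __init__(
        self, hass: HomeAssistant, router: NetgearRouter, entry: NetgearConfigEntry
    ) -> None:
        """Initialize the coordinator."""
        super().__init__(
            hass, router, entry, name="Parental control", update_interval=SCAN_INTERVAL
        )
    async def _async_update_data(self) -> dict[str, Any] | None:
        """Fetch data from the router (hypothetical method)."""
        return await self.router.async_get_parental_control()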

View File

@@ -5,12 +5,11 @@ from __future__ import annotations
import logging
from homeassistant.components.device_tracker import ScannerEntity
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .const import DEVICE_ICONS, DOMAIN, KEY_COORDINATOR, KEY_ROUTER
from .const import DEVICE_ICONS
from .coordinator import NetgearConfigEntry, NetgearTrackerCoordinator
from .entity import NetgearDeviceEntity
from .router import NetgearRouter
@@ -19,18 +18,18 @@ _LOGGER = logging.getLogger(__name__)
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: NetgearConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up device tracker for Netgear component."""
router = hass.data[DOMAIN][entry.entry_id][KEY_ROUTER]
coordinator = hass.data[DOMAIN][entry.entry_id][KEY_COORDINATOR]
router = entry.runtime_data.router
coordinator_tracker = entry.runtime_data.coordinator_tracker
tracked = set()
@callback
def new_device_callback() -> None:
"""Add new devices if needed."""
if not coordinator.data:
if not coordinator_tracker.data:
return
new_entities = []
@@ -39,14 +38,16 @@ async def async_setup_entry(
if mac in tracked:
continue
new_entities.append(NetgearScannerEntity(coordinator, router, device))
new_entities.append(
NetgearScannerEntity(coordinator_tracker, router, device)
)
tracked.add(mac)
async_add_entities(new_entities)
entry.async_on_unload(coordinator.async_add_listener(new_device_callback))
entry.async_on_unload(coordinator_tracker.async_add_listener(new_device_callback))
coordinator.data = True
coordinator_tracker.data = True
new_device_callback()
@@ -56,7 +57,10 @@ class NetgearScannerEntity(NetgearDeviceEntity, ScannerEntity):
_attr_has_entity_name = False
def __init__(
self, coordinator: DataUpdateCoordinator, router: NetgearRouter, device: dict
self,
coordinator: NetgearTrackerCoordinator,
router: NetgearRouter,
device: dict,
) -> None:
"""Initialize a Netgear device."""
super().__init__(coordinator, router, device)

View File

@@ -3,28 +3,30 @@
from __future__ import annotations
from abc import abstractmethod
from typing import Any
from homeassistant.const import CONF_HOST
from homeassistant.core import callback
from homeassistant.helpers import device_registry as dr
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity import Entity
from homeassistant.helpers.update_coordinator import (
CoordinatorEntity,
DataUpdateCoordinator,
)
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN
from .coordinator import NetgearDataCoordinator, NetgearTrackerCoordinator
from .router import NetgearRouter
class NetgearDeviceEntity(CoordinatorEntity):
class NetgearDeviceEntity(CoordinatorEntity[NetgearTrackerCoordinator]):
"""Base class for a device connected to a Netgear router."""
_attr_has_entity_name = True
def __init__(
self, coordinator: DataUpdateCoordinator, router: NetgearRouter, device: dict
self,
coordinator: NetgearTrackerCoordinator,
router: NetgearRouter,
device: dict,
) -> None:
"""Initialize a Netgear device."""
super().__init__(coordinator)
@@ -86,12 +88,12 @@ class NetgearRouterEntity(Entity):
)
class NetgearRouterCoordinatorEntity(NetgearRouterEntity, CoordinatorEntity):
class NetgearRouterCoordinatorEntity[T: NetgearDataCoordinator[Any]](
NetgearRouterEntity, CoordinatorEntity[T]
):
"""Base class for a Netgear router entity."""
def __init__(
self, coordinator: DataUpdateCoordinator, router: NetgearRouter
) -> None:
def __init__(self, coordinator: T, router: NetgearRouter) -> None:
"""Initialize a Netgear device."""
CoordinatorEntity.__init__(self, coordinator)
NetgearRouterEntity.__init__(self, router)

View File

@@ -7,6 +7,7 @@ from dataclasses import dataclass
from datetime import date, datetime
from decimal import Decimal
import logging
from typing import Any
from homeassistant.components.sensor import (
RestoreSensor,
@@ -15,7 +16,6 @@ from homeassistant.components.sensor import (
SensorEntityDescription,
SensorStateClass,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
PERCENTAGE,
EntityCategory,
@@ -26,16 +26,11 @@ from homeassistant.const import (
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import StateType
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .const import (
DOMAIN,
KEY_COORDINATOR,
KEY_COORDINATOR_LINK,
KEY_COORDINATOR_SPEED,
KEY_COORDINATOR_TRAFFIC,
KEY_COORDINATOR_UTIL,
KEY_ROUTER,
from .coordinator import (
NetgearConfigEntry,
NetgearDataCoordinator,
NetgearTrackerCoordinator,
)
from .entity import NetgearDeviceEntity, NetgearRouterCoordinatorEntity
from .router import NetgearRouter
@@ -275,16 +270,16 @@ SENSOR_LINK_TYPES = [
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: NetgearConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up device tracker for Netgear component."""
router = hass.data[DOMAIN][entry.entry_id][KEY_ROUTER]
coordinator = hass.data[DOMAIN][entry.entry_id][KEY_COORDINATOR]
coordinator_traffic = hass.data[DOMAIN][entry.entry_id][KEY_COORDINATOR_TRAFFIC]
coordinator_speed = hass.data[DOMAIN][entry.entry_id][KEY_COORDINATOR_SPEED]
coordinator_utilization = hass.data[DOMAIN][entry.entry_id][KEY_COORDINATOR_UTIL]
coordinator_link = hass.data[DOMAIN][entry.entry_id][KEY_COORDINATOR_LINK]
"""Set up Netgear sensors from a config entry."""
router = entry.runtime_data.router
coordinator_tracker = entry.runtime_data.coordinator_tracker
coordinator_traffic = entry.runtime_data.coordinator_traffic
coordinator_speed = entry.runtime_data.coordinator_speed
coordinator_utilization = entry.runtime_data.coordinator_utilization
coordinator_link = entry.runtime_data.coordinator_link
async_add_entities(
NetgearRouterSensorEntity(coordinator, router, description)
@@ -306,7 +301,7 @@ async def async_setup_entry(
@callback
def new_device_callback() -> None:
"""Add new devices if needed."""
if not coordinator.data:
if not coordinator_tracker.data:
return
new_entities: list[NetgearSensorEntity] = []
@@ -316,16 +311,16 @@ async def async_setup_entry(
continue
new_entities.extend(
NetgearSensorEntity(coordinator, router, device, attribute)
NetgearSensorEntity(coordinator_tracker, router, device, attribute)
for attribute in sensors
)
tracked.add(mac)
async_add_entities(new_entities)
entry.async_on_unload(coordinator.async_add_listener(new_device_callback))
entry.async_on_unload(coordinator_tracker.async_add_listener(new_device_callback))
coordinator.data = True
coordinator_tracker.data = True
new_device_callback()
@@ -334,7 +329,7 @@ class NetgearSensorEntity(NetgearDeviceEntity, SensorEntity):
def __init__(
self,
coordinator: DataUpdateCoordinator,
coordinator: NetgearTrackerCoordinator,
router: NetgearRouter,
device: dict,
attribute: str,
@@ -373,7 +368,7 @@ class NetgearRouterSensorEntity(NetgearRouterCoordinatorEntity, RestoreSensor):
def __init__(
self,
coordinator: DataUpdateCoordinator,
coordinator: NetgearDataCoordinator[dict[str, Any] | None],
router: NetgearRouter,
entity_description: NetgearSensorEntityDescription,
) -> None:

View File

@@ -9,13 +9,11 @@ from typing import Any
from pynetgear import ALLOW, BLOCK
from homeassistant.components.switch import SwitchEntity, SwitchEntityDescription
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .const import DOMAIN, KEY_COORDINATOR, KEY_ROUTER
from .coordinator import NetgearConfigEntry, NetgearTrackerCoordinator
from .entity import NetgearDeviceEntity, NetgearRouterEntity
from .router import NetgearRouter
@@ -100,11 +98,11 @@ ROUTER_SWITCH_TYPES = [
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: NetgearConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up switches for Netgear component."""
router = hass.data[DOMAIN][entry.entry_id][KEY_ROUTER]
router = entry.runtime_data.router
async_add_entities(
NetgearRouterSwitchEntity(router, description)
@@ -112,14 +110,14 @@ async def async_setup_entry(
)
# Entities per network device
coordinator = hass.data[DOMAIN][entry.entry_id][KEY_COORDINATOR]
coordinator_tracker = entry.runtime_data.coordinator_tracker
tracked = set()
@callback
def new_device_callback() -> None:
"""Add new devices if needed."""
new_entities = []
if not coordinator.data:
if not coordinator_tracker.data:
return
for mac, device in router.devices.items():
@@ -128,7 +126,9 @@ async def async_setup_entry(
new_entities.extend(
[
NetgearAllowBlock(coordinator, router, device, entity_description)
NetgearAllowBlock(
coordinator_tracker, router, device, entity_description
)
for entity_description in SWITCH_TYPES
]
)
@@ -136,9 +136,9 @@ async def async_setup_entry(
async_add_entities(new_entities)
entry.async_on_unload(coordinator.async_add_listener(new_device_callback))
entry.async_on_unload(coordinator_tracker.async_add_listener(new_device_callback))
coordinator.data = True
coordinator_tracker.data = True
new_device_callback()
@@ -149,7 +149,7 @@ class NetgearAllowBlock(NetgearDeviceEntity, SwitchEntity):
def __init__(
self,
coordinator: DataUpdateCoordinator,
coordinator: NetgearTrackerCoordinator,
router: NetgearRouter,
device: dict,
entity_description: SwitchEntityDescription,

View File

@@ -10,12 +10,10 @@ from homeassistant.components.update import (
UpdateEntity,
UpdateEntityFeature,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .const import DOMAIN, KEY_COORDINATOR_FIRMWARE, KEY_ROUTER
from .coordinator import NetgearConfigEntry, NetgearFirmwareCoordinator
from .entity import NetgearRouterCoordinatorEntity
from .router import NetgearRouter
@@ -24,18 +22,20 @@ LOGGER = logging.getLogger(__name__)
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: NetgearConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up update entities for Netgear component."""
router = hass.data[DOMAIN][entry.entry_id][KEY_ROUTER]
coordinator = hass.data[DOMAIN][entry.entry_id][KEY_COORDINATOR_FIRMWARE]
router = entry.runtime_data.router
coordinator = entry.runtime_data.coordinator_firmware
entities = [NetgearUpdateEntity(coordinator, router)]
async_add_entities(entities)
class NetgearUpdateEntity(NetgearRouterCoordinatorEntity, UpdateEntity):
class NetgearUpdateEntity(
NetgearRouterCoordinatorEntity[NetgearFirmwareCoordinator], UpdateEntity
):
"""Update entity for a Netgear device."""
_attr_device_class = UpdateDeviceClass.FIRMWARE
@@ -43,7 +43,7 @@ class NetgearUpdateEntity(NetgearRouterCoordinatorEntity, UpdateEntity):
def __init__(
self,
coordinator: DataUpdateCoordinator,
coordinator: NetgearFirmwareCoordinator,
router: NetgearRouter,
) -> None:
"""Initialize a Netgear device."""

View File

@@ -12,6 +12,7 @@ from .coordinator import NRGkickConfigEntry, NRGkickDataUpdateCoordinator
PLATFORMS: list[Platform] = [
Platform.BINARY_SENSOR,
Platform.DEVICE_TRACKER,
Platform.NUMBER,
Platform.SENSOR,
Platform.SWITCH,

View File

@@ -0,0 +1,74 @@
"""Device tracker platform for NRGkick."""
from __future__ import annotations
from typing import Any, Final
from homeassistant.components.device_tracker import SourceType
from homeassistant.components.device_tracker.config_entry import TrackerEntity
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import NRGkickConfigEntry, NRGkickDataUpdateCoordinator
from .entity import NRGkickEntity, get_nested_dict_value
PARALLEL_UPDATES = 0
TRACKER_KEY: Final = "gps_tracker"
async def async_setup_entry(
_hass: HomeAssistant,
entry: NRGkickConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up NRGkick device tracker based on a config entry."""
coordinator = entry.runtime_data
data = coordinator.data
assert data is not None
info_data: dict[str, Any] = data.info
general_info: dict[str, Any] = info_data.get("general", {})
model_type = general_info.get("model_type")
# GPS module is only available on SIM-capable models (same check as cellular
# sensors). SIM-capable models include "SIM" in their model type string.
has_sim_module = isinstance(model_type, str) and "SIM" in model_type.upper()
if has_sim_module:
async_add_entities([NRGkickDeviceTracker(coordinator)])
class NRGkickDeviceTracker(NRGkickEntity, TrackerEntity):
"""Representation of a NRGkick GPS device tracker."""
_attr_translation_key = TRACKER_KEY
_attr_source_type = SourceType.GPS
def __init__(
self,
coordinator: NRGkickDataUpdateCoordinator,
) -> None:
"""Initialize the device tracker."""
super().__init__(coordinator, TRACKER_KEY)
def _gps_float(self, key: str) -> float | None:
"""Return a GPS value as float, or None if GPS data is unavailable."""
value = get_nested_dict_value(self.coordinator.data.info, "gps", key)
return float(value) if value is not None else None
@property
def latitude(self) -> float | None:
"""Return latitude value of the device."""
return self._gps_float("latitude")
@property
def longitude(self) -> float | None:
"""Return longitude value of the device."""
return self._gps_float("longitude")
@property
def location_accuracy(self) -> float:
"""Return the location accuracy of the device."""
return self._gps_float("accuracy") or 0.0
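
The tracker reads coordinates via get_nested_dict_value() on the coordinator's info payload. A rough sketch of the payload shape implied by _gps_float(), for illustration only; apart from the "gps" section and the latitude/longitude/accuracy keys used above, the values and surrounding keys are assumptions:

# Illustrative payload shape; values are made up.
info = {
    "general": {"model_type": "NRGkick 32A SIM"},
    "gps": {"latitude": 48.2082, "longitude": 16.3738, "accuracy": 12},
}
# get_nested_dict_value(info, "gps", "latitude") is expected to return 48.2082;
# if the "gps" section is missing, the lookup yields None and the entity
# reports no location (and location_accuracy falls back to 0.0).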

View File

@@ -6,12 +6,20 @@ from dataclasses import asdict
from typing import Any
from homeassistant.components.diagnostics import async_redact_data
from homeassistant.const import CONF_PASSWORD, CONF_USERNAME
from homeassistant.const import (
ATTR_LATITUDE,
ATTR_LONGITUDE,
CONF_PASSWORD,
CONF_USERNAME,
)
from homeassistant.core import HomeAssistant
from .coordinator import NRGkickConfigEntry
TO_REDACT = {
ATTR_LATITUDE,
ATTR_LONGITUDE,
"altitude",
CONF_PASSWORD,
CONF_USERNAME,
}

View File

@@ -5,6 +5,11 @@
"default": "mdi:ev-station"
}
},
"device_tracker": {
"gps_tracker": {
"default": "mdi:map-marker"
}
},
"number": {
"current_set": {
"default": "mdi:current-ac"

View File

@@ -83,6 +83,11 @@
"name": "Charge permitted"
}
},
"device_tracker": {
"gps_tracker": {
"name": "GPS tracker"
}
},
"number": {
"current_set": {
"name": "Charging current"

View File

@@ -15,7 +15,7 @@ from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import PortainerConfigEntry
from .const import CONTAINER_STATE_RUNNING, STACK_STATUS_ACTIVE
from .const import ContainerState, EndpointStatus, StackStatus
from .coordinator import PortainerContainerData
from .entity import (
PortainerContainerEntity,
@@ -53,7 +53,7 @@ CONTAINER_SENSORS: tuple[PortainerContainerBinarySensorEntityDescription, ...] =
PortainerContainerBinarySensorEntityDescription(
key="status",
translation_key="status",
state_fn=lambda data: data.container.state == CONTAINER_STATE_RUNNING,
state_fn=lambda data: data.container.state == ContainerState.RUNNING,
device_class=BinarySensorDeviceClass.RUNNING,
entity_category=EntityCategory.DIAGNOSTIC,
),
@@ -63,7 +63,7 @@ ENDPOINT_SENSORS: tuple[PortainerEndpointBinarySensorEntityDescription, ...] = (
PortainerEndpointBinarySensorEntityDescription(
key="status",
translation_key="status",
state_fn=lambda data: data.endpoint.status == 1, # 1 = Running | 2 = Stopped
state_fn=lambda data: data.endpoint.status == EndpointStatus.UP,
device_class=BinarySensorDeviceClass.RUNNING,
entity_category=EntityCategory.DIAGNOSTIC,
),
@@ -73,9 +73,7 @@ STACK_SENSORS: tuple[PortainerStackBinarySensorEntityDescription, ...] = (
PortainerStackBinarySensorEntityDescription(
key="stack_status",
translation_key="status",
state_fn=lambda data: (
data.stack.status == STACK_STATUS_ACTIVE
), # 1 = Active | 2 = Inactive
state_fn=lambda data: data.stack.status == StackStatus.ACTIVE,
device_class=BinarySensorDeviceClass.RUNNING,
entity_category=EntityCategory.DIAGNOSTIC,
),

View File

@@ -1,17 +1,34 @@
"""Constants for the Portainer integration."""
from enum import IntEnum, StrEnum
DOMAIN = "portainer"
DEFAULT_NAME = "Portainer"
ENDPOINT_STATUS_DOWN = 2
class EndpointStatus(IntEnum):
"""Portainer endpoint status."""
CONTAINER_STATE_RUNNING = "running"
STACK_STATUS_ACTIVE = 1
STACK_STATUS_INACTIVE = 2
UP = 1
DOWN = 2
STACK_TYPE_SWARM = 1
STACK_TYPE_COMPOSE = 2
STACK_TYPE_KUBERNETES = 3
class ContainerState(StrEnum):
"""Portainer container state."""
RUNNING = "running"
class StackStatus(IntEnum):
"""Portainer stack status."""
ACTIVE = 1
INACTIVE = 2
class StackType(IntEnum):
"""Portainer stack type."""
SWARM = 1
COMPOSE = 2
KUBERNETES = 3
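
Because EndpointStatus and StackStatus are IntEnum and ContainerState is StrEnum, members still compare equal to the raw values the Portainer API returns, so the comparisons in the coordinator and sensor modules behave exactly like the old magic-number checks. A quick illustrative check (not from the PR):

# Illustration only: enum members compare equal to the raw API values,
# so `endpoint.status == EndpointStatus.UP` behaves like the old `== 1` check.
assert EndpointStatus.UP == 1
assert StackStatus.ACTIVE == 1
assert ContainerState.RUNNING == "running"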

View File

@@ -29,7 +29,7 @@ from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import CONTAINER_STATE_RUNNING, DOMAIN, ENDPOINT_STATUS_DOWN
from .const import DOMAIN, ContainerState, EndpointStatus
type PortainerConfigEntry = ConfigEntry[PortainerCoordinator]
@@ -154,7 +154,7 @@ class PortainerCoordinator(DataUpdateCoordinator[dict[int, PortainerCoordinatorD
mapped_endpoints: dict[int, PortainerCoordinatorData] = {}
for endpoint in endpoints:
if endpoint.status == ENDPOINT_STATUS_DOWN:
if endpoint.status == EndpointStatus.DOWN:
_LOGGER.debug(
"Skipping offline endpoint: %s (ID: %d)",
endpoint.name,
@@ -215,7 +215,7 @@ class PortainerCoordinator(DataUpdateCoordinator[dict[int, PortainerCoordinatorD
running_containers = [
container
for container in containers
if container.state == CONTAINER_STATE_RUNNING
if container.state == ContainerState.RUNNING
]
if running_containers:
container_stats = dict(

View File

@@ -7,5 +7,5 @@
"integration_type": "service",
"iot_class": "local_polling",
"quality_scale": "bronze",
"requirements": ["pyportainer==1.0.28"]
"requirements": ["pyportainer==1.0.32"]
}

View File

@@ -17,7 +17,7 @@ from homeassistant.const import PERCENTAGE, UnitOfInformation
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import STACK_TYPE_COMPOSE, STACK_TYPE_KUBERNETES, STACK_TYPE_SWARM
from .const import StackType
from .coordinator import (
PortainerConfigEntry,
PortainerContainerData,
@@ -293,11 +293,11 @@ STACK_SENSORS: tuple[PortainerStackSensorEntityDescription, ...] = (
translation_key="stack_type",
value_fn=lambda data: (
"swarm"
if data.stack.type == STACK_TYPE_SWARM
if data.stack.type == StackType.SWARM
else "compose"
if data.stack.type == STACK_TYPE_COMPOSE
if data.stack.type == StackType.COMPOSE
else "kubernetes"
if data.stack.type == STACK_TYPE_KUBERNETES
if data.stack.type == StackType.KUBERNETES
else None
),
device_class=SensorDeviceClass.ENUM,

View File

@@ -23,7 +23,7 @@ from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import PortainerConfigEntry
from .const import DOMAIN, STACK_STATUS_ACTIVE
from .const import DOMAIN, StackStatus
from .coordinator import (
PortainerContainerData,
PortainerCoordinator,
@@ -99,7 +99,7 @@ STACK_SWITCHES: tuple[PortainerStackSwitchEntityDescription, ...] = (
key="stack",
translation_key="stack",
device_class=SwitchDeviceClass.SWITCH,
is_on_fn=lambda data: data.stack.status == STACK_STATUS_ACTIVE,
is_on_fn=lambda data: data.stack.status == StackStatus.ACTIVE,
turn_on_fn=lambda portainer: portainer.start_stack,
turn_off_fn=lambda portainer: portainer.stop_stack,
),

View File

@@ -19,12 +19,13 @@ from homeassistant.components.button import (
)
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.exceptions import HomeAssistantError, ServiceValidationError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN
from .coordinator import ProxmoxConfigEntry, ProxmoxCoordinator, ProxmoxNodeData
from .entity import ProxmoxContainerEntity, ProxmoxNodeEntity, ProxmoxVMEntity
from .helpers import is_granted
@dataclass(frozen=True, kw_only=True)
@@ -264,6 +265,11 @@ class ProxmoxNodeButtonEntity(ProxmoxNodeEntity, ProxmoxBaseButton):
async def _async_press_call(self) -> None:
"""Execute the node button action via executor."""
if not is_granted(self.coordinator.permissions, p_type="nodes"):
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="no_permission_node_power",
)
await self.hass.async_add_executor_job(
self.entity_description.press_action,
self.coordinator,
@@ -278,6 +284,11 @@ class ProxmoxVMButtonEntity(ProxmoxVMEntity, ProxmoxBaseButton):
async def _async_press_call(self) -> None:
"""Execute the VM button action via executor."""
if not is_granted(self.coordinator.permissions, p_type="vms"):
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="no_permission_vm_lxc_power",
)
await self.hass.async_add_executor_job(
self.entity_description.press_action,
self.coordinator,
@@ -293,6 +304,12 @@ class ProxmoxContainerButtonEntity(ProxmoxContainerEntity, ProxmoxBaseButton):
async def _async_press_call(self) -> None:
"""Execute the container button action via executor."""
# Container power actions fall under vms
if not is_granted(self.coordinator.permissions, p_type="vms"):
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="no_permission_vm_lxc_power",
)
await self.hass.async_add_executor_job(
self.entity_description.press_action,
self.coordinator,

View File

@@ -17,3 +17,5 @@ DEFAULT_VERIFY_SSL = True
TYPE_VM = 0
TYPE_CONTAINER = 1
UPDATE_INTERVAL = 60
PERM_POWER = "VM.PowerMgmt"

View File

@@ -70,6 +70,7 @@ class ProxmoxCoordinator(DataUpdateCoordinator[dict[str, ProxmoxNodeData]]):
self.known_nodes: set[str] = set()
self.known_vms: set[tuple[str, int]] = set()
self.known_containers: set[tuple[str, int]] = set()
self.permissions: dict[str, dict[str, int]] = {}
self.new_nodes_callbacks: list[Callable[[list[ProxmoxNodeData]], None]] = []
self.new_vms_callbacks: list[
@@ -101,11 +102,21 @@ class ProxmoxCoordinator(DataUpdateCoordinator[dict[str, ProxmoxNodeData]]):
translation_key="timeout_connect",
translation_placeholders={"error": repr(err)},
) from err
except ResourceException as err:
except ProxmoxServerError as err:
raise ConfigEntryNotReady(
translation_domain=DOMAIN,
translation_key="api_error_details",
translation_placeholders={"error": repr(err)},
) from err
except ProxmoxPermissionsError as err:
raise ConfigEntryAuthFailed(
translation_domain=DOMAIN,
translation_key="permissions_error",
) from err
except ProxmoxNodesNotFoundError as err:
raise ConfigEntryError(
translation_domain=DOMAIN,
translation_key="no_nodes_found",
translation_placeholders={"error": repr(err)},
) from err
except requests.exceptions.ConnectionError as err:
raise ConfigEntryError(
@@ -143,7 +154,6 @@ class ProxmoxCoordinator(DataUpdateCoordinator[dict[str, ProxmoxNodeData]]):
raise UpdateFailed(
translation_domain=DOMAIN,
translation_key="no_nodes_found",
translation_placeholders={"error": repr(err)},
) from err
except requests.exceptions.ConnectionError as err:
raise UpdateFailed(
@@ -180,7 +190,19 @@ class ProxmoxCoordinator(DataUpdateCoordinator[dict[str, ProxmoxNodeData]]):
password=self.config_entry.data[CONF_PASSWORD],
verify_ssl=self.config_entry.data.get(CONF_VERIFY_SSL, DEFAULT_VERIFY_SSL),
)
self.proxmox.nodes.get()
try:
self.permissions = self.proxmox.access.permissions.get()
except ResourceException as err:
if 400 <= err.status_code < 500:
raise ProxmoxPermissionsError from err
raise ProxmoxServerError from err
try:
self.proxmox.nodes.get()
except ResourceException as err:
if 400 <= err.status_code < 500:
raise ProxmoxNodesNotFoundError from err
raise ProxmoxServerError from err
def _fetch_all_nodes(
self,
@@ -230,3 +252,19 @@ class ProxmoxCoordinator(DataUpdateCoordinator[dict[str, ProxmoxNodeData]]):
if new_containers:
_LOGGER.debug("New containers found: %s", new_containers)
self.known_containers.update(new_containers)
class ProxmoxSetupError(Exception):
"""Base exception for Proxmox setup issues."""
class ProxmoxNodesNotFoundError(ProxmoxSetupError):
"""Raised when the API works but no nodes are visible."""
class ProxmoxPermissionsError(ProxmoxSetupError):
"""Raised when failing to retrieve permissions."""
class ProxmoxServerError(ProxmoxSetupError):
"""Raised when the Proxmox server returns an error."""

View File

@@ -0,0 +1,13 @@
"""Helpers for Proxmox VE."""
from .const import PERM_POWER
def is_granted(
permissions: dict[str, dict[str, int]],
p_type: str = "vms",
permission: str = PERM_POWER,
) -> bool:
"""Validate user permissions for the given type and permission."""
path = f"/{p_type}"
return permissions.get(path, {}).get(permission) == 1
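
is_granted() expects the mapping the coordinator stores from proxmox.access.permissions.get(), keyed by ACL path with per-privilege integer flags. A hedged example of the assumed shape; the exact paths and extra privileges Proxmox returns may differ, only the f"/{p_type}" lookup and the "VM.PowerMgmt" (PERM_POWER) flag matter here:

# Assumed shape of the permissions payload; surrounding keys are illustrative.
permissions = {
    "/vms": {"VM.PowerMgmt": 1, "VM.Audit": 1},
    "/nodes": {"VM.PowerMgmt": 0},
}
assert is_granted(permissions, p_type="vms") is True
assert is_granted(permissions, p_type="nodes") is False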

View File

@@ -175,6 +175,9 @@
}
},
"exceptions": {
"api_error_details": {
"message": "An error occurred while communicating with the Proxmox VE instance: {error}"
},
"api_error_no_details": {
"message": "An error occurred while communicating with the Proxmox VE instance."
},
@@ -193,6 +196,15 @@
"no_nodes_found": {
"message": "No active nodes were found on the Proxmox VE server."
},
"no_permission_node_power": {
"message": "The configured Proxmox VE user does not have permission to manage the power state of nodes. Please grant the user the 'VM.PowerMgmt' permission and try again."
},
"no_permission_vm_lxc_power": {
"message": "The configured Proxmox VE user does not have permission to manage the power state of VMs and containers. Please grant the user the 'VM.PowerMgmt' permission and try again."
},
"permissions_error": {
"message": "Failed to retrieve Proxmox VE permissions. Please check your credentials and try again."
},
"ssl_error": {
"message": "An SSL error occurred: {error}"
},

View File

@@ -3,7 +3,7 @@
"name": "Recovery Mode",
"codeowners": ["@home-assistant/core"],
"config_flow": false,
"dependencies": ["frontend", "persistent_notification", "cloud"],
"dependencies": ["persistent_notification"],
"documentation": "https://www.home-assistant.io/integrations/recovery_mode",
"integration_type": "system",
"quality_scale": "internal"

View File

@@ -26,5 +26,13 @@
"turn_on": {
"service": "mdi:remote"
}
},
"triggers": {
"turned_off": {
"trigger": "mdi:remote-off"
},
"turned_on": {
"trigger": "mdi:remote"
}
}
}

View File

@@ -1,4 +1,8 @@
{
"common": {
"trigger_behavior_description": "The behavior of the targeted remotes to trigger on.",
"trigger_behavior_name": "Behavior"
},
"device_automation": {
"action_type": {
"toggle": "[%key:common::device_automation::action_type::toggle%]",
@@ -27,6 +31,15 @@
}
}
},
"selector": {
"trigger_behavior": {
"options": {
"any": "Any",
"first": "First",
"last": "Last"
}
}
},
"services": {
"delete_command": {
"description": "Deletes a command or a list of commands from the database.",
@@ -113,5 +126,27 @@
"name": "[%key:common::action::turn_on%]"
}
},
"title": "Remote"
"title": "Remote",
"triggers": {
"turned_off": {
"description": "Triggers when one or more remotes turn off.",
"fields": {
"behavior": {
"description": "[%key:component::remote::common::trigger_behavior_description%]",
"name": "[%key:component::remote::common::trigger_behavior_name%]"
}
},
"name": "Remote turned off"
},
"turned_on": {
"description": "Triggers when one or more remotes turn on.",
"fields": {
"behavior": {
"description": "[%key:component::remote::common::trigger_behavior_description%]",
"name": "[%key:component::remote::common::trigger_behavior_name%]"
}
},
"name": "Remote turned on"
}
}
}

View File

@@ -0,0 +1,17 @@
"""Provides triggers for remotes."""
from homeassistant.const import STATE_OFF, STATE_ON
from homeassistant.core import HomeAssistant
from homeassistant.helpers.trigger import Trigger, make_entity_target_state_trigger
from . import DOMAIN
TRIGGERS: dict[str, type[Trigger]] = {
"turned_on": make_entity_target_state_trigger(DOMAIN, STATE_ON),
"turned_off": make_entity_target_state_trigger(DOMAIN, STATE_OFF),
}
async def async_get_triggers(hass: HomeAssistant) -> dict[str, type[Trigger]]:
"""Return the triggers for remotes."""
return TRIGGERS

View File

@@ -0,0 +1,18 @@
.trigger_common: &trigger_common
target:
entity:
domain: remote
fields:
behavior:
required: true
default: any
selector:
select:
options:
- first
- last
- any
translation_key: trigger_behavior
turned_off: *trigger_common
turned_on: *trigger_common

View File

@@ -565,7 +565,20 @@ def migrate_entity_ids(
entity.unique_id,
new_id,
)
entity_reg.async_update_entity(entity.entity_id, new_unique_id=new_id)
existing_entity = entity_reg.async_get_entity_id(
entity.domain, entity.platform, new_id
)
if existing_entity is None:
entity_reg.async_update_entity(entity.entity_id, new_unique_id=new_id)
else:
_LOGGER.warning(
"Reolink entity with unique_id %s already exists, "
"removing entity with unique_id %s",
new_id,
entity.unique_id,
)
entity_reg.async_remove(entity.entity_id)
continue
if entity.device_id in ch_device_ids:
ch = ch_device_ids[entity.device_id]
@@ -595,7 +608,7 @@ def migrate_entity_ids(
else:
_LOGGER.warning(
"Reolink entity with unique_id %s already exists, "
"removing device with unique_id %s",
"removing entity with unique_id %s",
new_id,
entity.unique_id,
)

View File

@@ -20,5 +20,5 @@
"iot_class": "local_push",
"loggers": ["reolink_aio"],
"quality_scale": "platinum",
"requirements": ["reolink-aio==0.19.0"]
"requirements": ["reolink-aio==0.19.1"]
}

View File

@@ -47,6 +47,7 @@ from .coordinator import (
RoborockWashingMachineUpdateCoordinator,
RoborockWetDryVacUpdateCoordinator,
)
from .models import get_device_info
from .roborock_storage import CacheStore, async_cleanup_map_storage
from .services import async_setup_services
@@ -130,8 +131,22 @@ async def async_setup_entry(hass: HomeAssistant, entry: RoborockConfigEntry) ->
devices = await device_manager.get_devices()
_LOGGER.debug("Device manager found %d devices", len(devices))
# Register all discovered devices in the device registry so we can
# check the disabled state before creating coordinators.
device_registry = dr.async_get(hass)
for device in devices:
device_registry.async_get_or_create(
config_entry_id=entry.entry_id,
**get_device_info(device),
)
enabled_devices = [
device for device in devices if not _is_device_disabled(device_registry, device)
]
_LOGGER.debug("%d of %d devices are enabled", len(enabled_devices), len(devices))
coordinators = await asyncio.gather(
*build_setup_functions(hass, entry, devices, user_data),
*build_setup_functions(hass, entry, enabled_devices, user_data),
return_exceptions=True,
)
v1_coords = [
@@ -149,7 +164,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: RoborockConfigEntry) ->
for coord in coordinators
if isinstance(coord, RoborockB01Q7UpdateCoordinator)
]
if len(v1_coords) + len(a01_coords) + len(b01_q7_coords) == 0:
if len(v1_coords) + len(a01_coords) + len(b01_q7_coords) == 0 and enabled_devices:
raise ConfigEntryNotReady(
"No devices were able to successfully setup",
translation_domain=DOMAIN,
@@ -164,6 +179,15 @@ async def async_setup_entry(hass: HomeAssistant, entry: RoborockConfigEntry) ->
return True
def _is_device_disabled(
device_registry: dr.DeviceRegistry,
device: RoborockDevice,
) -> bool:
"""Check if a device is disabled in the device registry."""
device_entry = device_registry.async_get_device(identifiers={(DOMAIN, device.duid)})
return device_entry is not None and device_entry.disabled
def _remove_stale_devices(
hass: HomeAssistant,
entry: RoborockConfigEntry,

View File

@@ -45,7 +45,7 @@ from .const import (
V1_LOCAL_IN_CLEANING_INTERVAL,
V1_LOCAL_NOT_CLEANING_INTERVAL,
)
from .models import DeviceState
from .models import DeviceState, get_device_info
SCAN_INTERVAL = timedelta(seconds=30)
@@ -103,14 +103,7 @@ class RoborockDataUpdateCoordinator(DataUpdateCoordinator[DeviceState]):
)
self._device = device
self.properties_api = properties_api
self.device_info = DeviceInfo(
name=self._device.device_info.name,
identifiers={(DOMAIN, self.duid)},
manufacturer="Roborock",
model=self._device.product.model,
model_id=self._device.product.model,
sw_version=self._device.device_info.fv,
)
self.device_info = get_device_info(device)
if mac := properties_api.network_info.mac:
self.device_info[ATTR_CONNECTIONS] = {
(dr.CONNECTION_NETWORK_MAC, dr.format_mac(mac))
@@ -385,13 +378,7 @@ class RoborockDataUpdateCoordinatorA01(DataUpdateCoordinator[dict[_V, StateType]
update_interval=A01_UPDATE_INTERVAL,
)
self._device = device
self.device_info = DeviceInfo(
name=device.name,
identifiers={(DOMAIN, device.duid)},
manufacturer="Roborock",
model=device.product.model,
sw_version=device.device_info.fv,
)
self.device_info = get_device_info(device)
self.request_protocols: list[_V] = []
@cached_property
@@ -517,13 +504,7 @@ class RoborockDataUpdateCoordinatorB01(DataUpdateCoordinator[B01Props]):
update_interval=A01_UPDATE_INTERVAL,
)
self._device = device
self.device_info = DeviceInfo(
name=device.name,
identifiers={(DOMAIN, device.duid)},
manufacturer="Roborock",
model=device.product.model,
sw_version=device.device_info.fv,
)
self.device_info = get_device_info(device)
@cached_property
def duid(self) -> str:

View File

@@ -13,12 +13,29 @@ from roborock.data import (
HomeDataProduct,
NetworkInfo,
)
from roborock.devices.device import RoborockDevice
from roborock.devices.traits.v1.status import StatusTrait
from vacuum_map_parser_base.map_data import MapData
from homeassistant.helpers.device_registry import DeviceInfo
from .const import DOMAIN
_LOGGER = logging.getLogger(__name__)
def get_device_info(device: RoborockDevice) -> DeviceInfo:
"""Create a DeviceInfo for a Roborock device."""
return DeviceInfo(
name=device.name,
identifiers={(DOMAIN, device.duid)},
manufacturer="Roborock",
model=device.product.model,
model_id=device.product.model,
sw_version=device.device_info.fv,
)
@dataclass
class DeviceState:
"""Data about the current state of a device."""

View File

@@ -256,7 +256,7 @@
"state": {
"high": "[%key:common::state::high%]",
"low": "[%key:common::state::low%]",
"medium": "Medium",
"medium": "[%key:common::state::medium%]",
"moderate_high": "Moderate high",
"moderate_low": "Moderate low"
}

View File

@@ -29,7 +29,7 @@ from homeassistant.core import (
)
from homeassistant.exceptions import TemplateError
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.entity import Entity
from homeassistant.helpers.entity import Entity, async_generate_entity_id
from homeassistant.helpers.event import (
TrackTemplate,
TrackTemplateResult,
@@ -264,16 +264,23 @@ class TemplateEntity(AbstractTemplateEntity):
return None
return cast(str, self._blueprint_inputs[CONF_USE_BLUEPRINT][CONF_PATH])
def _get_this_variable(self) -> TemplateStateFromEntityId:
"""Create a this variable for the entity."""
if self._preview_callback:
preview_entity_id = async_generate_entity_id(
self._entity_id_format, self._attr_name or "preview", hass=self.hass
)
return TemplateStateFromEntityId(self.hass, preview_entity_id)
return TemplateStateFromEntityId(self.hass, self.entity_id)
def _render_script_variables(self) -> dict[str, Any]:
"""Render configured variables."""
if isinstance(self._run_variables, dict):
return self._run_variables
return self._run_variables.async_render(
self.hass,
{
"this": TemplateStateFromEntityId(self.hass, self.entity_id),
},
self.hass, {"this": self._get_this_variable()}
)
def setup_state_template(
@@ -451,7 +458,7 @@ class TemplateEntity(AbstractTemplateEntity):
has_availability_template = False
variables = {
"this": TemplateStateFromEntityId(self.hass, self.entity_id),
"this": self._get_this_variable(),
**self._render_script_variables(),
}

View File

@@ -117,7 +117,7 @@
},
"services": {
"clean_area": {
"description": "Tells a vacuum cleaner to clean an area.",
"description": "Tells a vacuum cleaner to clean one or more areas.",
"fields": {
"cleaning_area_id": {
"description": "Areas to clean.",

View File

@@ -401,6 +401,11 @@ async def async_setup_entry(
or int(state_key) in info.entity_description.states
)
)
elif (
isinstance(info, NewZwaveDiscoveryInfo)
and info.entity_class is ZWaveBooleanBinarySensor
):
entities.append(ZWaveBooleanBinarySensor(config_entry, driver, info))
elif isinstance(info, NewZwaveDiscoveryInfo):
pass # other entity classes are not migrated yet
elif info.platform_hint == "notification":
@@ -481,12 +486,16 @@ class ZWaveBooleanBinarySensor(ZWaveBaseEntity, BinarySensorEntity):
self,
config_entry: ZwaveJSConfigEntry,
driver: Driver,
info: ZwaveDiscoveryInfo,
info: ZwaveDiscoveryInfo | NewZwaveDiscoveryInfo,
) -> None:
"""Initialize a ZWaveBooleanBinarySensor entity."""
super().__init__(config_entry, driver, info)
# Entity class attributes
if isinstance(info, NewZwaveDiscoveryInfo):
# Entity name and description are set from the discovery schema.
return
# Entity class attributes for old-style discovery.
self._attr_name = self.generate_name(include_value_name=True)
primary_value = self.info.primary_value
if description := BOOLEAN_SENSOR_MAPPINGS.get(
@@ -578,6 +587,27 @@ class ZWaveConfigParameterBinarySensor(ZWaveBooleanBinarySensor):
DISCOVERY_SCHEMAS: list[NewZWaveDiscoverySchema] = [
NewZWaveDiscoverySchema(
# Hoppe eHandle ConnectSense (0x0313:0x0701:0x0002) - window tilt sensor.
# The window tilt state is exposed as a binary sensor that is disabled by default
# instead of a notification sensor. We enable that sensor and give it a name
# that is more consistent with the other window related entities.
platform=Platform.BINARY_SENSOR,
manufacturer_id={0x0313},
product_id={0x0002},
product_type={0x0701},
primary_value=ZWaveValueDiscoverySchema(
command_class={CommandClass.SENSOR_BINARY},
property={"Tilt"},
type={ValueType.BOOLEAN},
),
entity_description=BinarySensorEntityDescription(
key="window_door_is_tilted",
name="Window/door is tilted",
device_class=BinarySensorDeviceClass.WINDOW,
),
entity_class=ZWaveBooleanBinarySensor,
),
NewZWaveDiscoverySchema(
platform=Platform.BINARY_SENSOR,
primary_value=ZWaveValueDiscoverySchema(

View File

@@ -87,6 +87,12 @@ async def _ssrf_redirect_middleware(
# Relative redirects stay on the same host - always safe
return resp
# Only schemes that aiohttp can open a network connection for need
# SSRF protection. Custom app URI schemes (e.g. weconnect://) are inert
# from a networking perspective and must not be blocked.
if connector and redirect_url.scheme not in connector.allowed_protocol_schema_set:
return resp
host = redirect_url.host
if await _async_is_blocked_host(host, connector):
resp.close()
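
The added guard only applies SSRF blocking to schemes aiohttp would actually open a connection for; anything else, such as app deep links like weconnect://, passes through untouched. A rough illustration of the scheme filtering, assuming the connector's allowed_protocol_schema_set covers the usual http/https/ws/wss family (the exact set is an assumption, not taken from this diff):

# Illustration only: scheme filtering with yarl, mirroring the check above.
from yarl import URL
allowed = {"http", "https", "ws", "wss"}  # assumed connector schema set
assert URL("weconnect://callback?code=abc").scheme not in allowed  # skipped by the SSRF check
assert URL("http://169.254.169.254/latest").scheme in allowed      # still validated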

View File

@@ -40,7 +40,7 @@ habluetooth==5.8.0
hass-nabucasa==1.15.0
hassil==3.5.0
home-assistant-bluetooth==1.13.1
home-assistant-frontend==20260302.0
home-assistant-frontend==20260304.0
home-assistant-intents==2026.3.3
httpx==0.28.1
ifaddr==0.2.0

View File

@@ -1,10 +0,0 @@
image: ghcr.io/home-assistant/{machine}-homeassistant
build_from:
aarch64: "ghcr.io/home-assistant/aarch64-homeassistant:"
amd64: "ghcr.io/home-assistant/amd64-homeassistant:"
cosign:
base_identity: https://github.com/home-assistant/core/.*
identity: https://github.com/home-assistant/core/.*
labels:
io.hass.type: core
org.opencontainers.image.source: https://github.com/home-assistant/core

View File

@@ -1,7 +1,21 @@
ARG \
BUILD_FROM
FROM $BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/amd64-homeassistant:latest
FROM ${BUILD_FROM}
RUN apk --no-cache add \
libva-intel-driver
ARG BUILD_ARCH=amd64
ARG BUILD_DATE="1970-01-01 00:00:00+00:00"
ARG BUILD_REPOSITORY
ARG BUILD_VERSION=0.0.0-local
LABEL \
io.hass.type="core" \
io.hass.arch="${BUILD_ARCH}" \
io.hass.version="${BUILD_VERSION}" \
org.opencontainers.image.created="${BUILD_DATE}" \
org.opencontainers.image.version="${BUILD_VERSION}" \
org.opencontainers.image.source="${BUILD_REPOSITORY}"

View File

@@ -1,4 +1,18 @@
ARG \
BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/aarch64-homeassistant:latest
FROM ${BUILD_FROM}
FROM $BUILD_FROM
ARG BUILD_ARCH=aarch64
ARG BUILD_DATE="1970-01-01 00:00:00+00:00"
ARG BUILD_REPOSITORY
ARG BUILD_VERSION=0.0.0-local
LABEL \
io.hass.type="core" \
io.hass.arch="${BUILD_ARCH}" \
io.hass.version="${BUILD_VERSION}" \
org.opencontainers.image.created="${BUILD_DATE}" \
org.opencontainers.image.version="${BUILD_VERSION}" \
org.opencontainers.image.source="${BUILD_REPOSITORY}"

View File

@@ -1,10 +1,21 @@
ARG \
BUILD_FROM
FROM $BUILD_FROM
# NOTE: intel-nuc will be replaced by generic-x86-64. Make sure to apply
# changes in generic-x86-64 as well.
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/amd64-homeassistant:latest
FROM ${BUILD_FROM}
RUN apk --no-cache add \
libva-intel-driver
ARG BUILD_ARCH=amd64
ARG BUILD_DATE="1970-01-01 00:00:00+00:00"
ARG BUILD_REPOSITORY
ARG BUILD_VERSION=0.0.0-local
LABEL \
io.hass.type="core" \
io.hass.arch="${BUILD_ARCH}" \
io.hass.version="${BUILD_VERSION}" \
org.opencontainers.image.created="${BUILD_DATE}" \
org.opencontainers.image.version="${BUILD_VERSION}" \
org.opencontainers.image.source="${BUILD_REPOSITORY}"

View File

@@ -1,4 +1,18 @@
ARG \
BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/aarch64-homeassistant:latest
FROM ${BUILD_FROM}
FROM $BUILD_FROM
ARG BUILD_ARCH=aarch64
ARG BUILD_DATE="1970-01-01 00:00:00+00:00"
ARG BUILD_REPOSITORY
ARG BUILD_VERSION=0.0.0-local
LABEL \
io.hass.type="core" \
io.hass.arch="${BUILD_ARCH}" \
io.hass.version="${BUILD_VERSION}" \
org.opencontainers.image.created="${BUILD_DATE}" \
org.opencontainers.image.version="${BUILD_VERSION}" \
org.opencontainers.image.source="${BUILD_REPOSITORY}"

View File

@@ -1,4 +1,18 @@
ARG \
BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/aarch64-homeassistant:latest
FROM ${BUILD_FROM}
FROM $BUILD_FROM
ARG BUILD_ARCH=aarch64
ARG BUILD_DATE="1970-01-01 00:00:00+00:00"
ARG BUILD_REPOSITORY
ARG BUILD_VERSION=0.0.0-local
LABEL \
io.hass.type="core" \
io.hass.arch="${BUILD_ARCH}" \
io.hass.version="${BUILD_VERSION}" \
org.opencontainers.image.created="${BUILD_DATE}" \
org.opencontainers.image.version="${BUILD_VERSION}" \
org.opencontainers.image.source="${BUILD_REPOSITORY}"

View File

@@ -1,4 +1,18 @@
ARG \
BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/aarch64-homeassistant:latest
FROM ${BUILD_FROM}
FROM $BUILD_FROM
ARG BUILD_ARCH=aarch64
ARG BUILD_DATE="1970-01-01 00:00:00+00:00"
ARG BUILD_REPOSITORY
ARG BUILD_VERSION=0.0.0-local
LABEL \
io.hass.type="core" \
io.hass.arch="${BUILD_ARCH}" \
io.hass.version="${BUILD_VERSION}" \
org.opencontainers.image.created="${BUILD_DATE}" \
org.opencontainers.image.version="${BUILD_VERSION}" \
org.opencontainers.image.source="${BUILD_REPOSITORY}"

View File

@@ -1,4 +1,18 @@
ARG \
BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/aarch64-homeassistant:latest
FROM ${BUILD_FROM}
FROM $BUILD_FROM
ARG BUILD_ARCH=aarch64
ARG BUILD_DATE="1970-01-01 00:00:00+00:00"
ARG BUILD_REPOSITORY
ARG BUILD_VERSION=0.0.0-local
LABEL \
io.hass.type="core" \
io.hass.arch="${BUILD_ARCH}" \
io.hass.version="${BUILD_VERSION}" \
org.opencontainers.image.created="${BUILD_DATE}" \
org.opencontainers.image.version="${BUILD_VERSION}" \
org.opencontainers.image.source="${BUILD_REPOSITORY}"

View File

@@ -1,4 +1,18 @@
ARG \
BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/aarch64-homeassistant:latest
FROM ${BUILD_FROM}
FROM $BUILD_FROM
ARG BUILD_ARCH=aarch64
ARG BUILD_DATE="1970-01-01 00:00:00+00:00"
ARG BUILD_REPOSITORY
ARG BUILD_VERSION=0.0.0-local
LABEL \
io.hass.type="core" \
io.hass.arch="${BUILD_ARCH}" \
io.hass.version="${BUILD_VERSION}" \
org.opencontainers.image.created="${BUILD_DATE}" \
org.opencontainers.image.version="${BUILD_VERSION}" \
org.opencontainers.image.source="${BUILD_REPOSITORY}"

View File

@@ -1,4 +1,18 @@
ARG \
BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/aarch64-homeassistant:latest
FROM ${BUILD_FROM}
FROM $BUILD_FROM
ARG BUILD_ARCH=aarch64
ARG BUILD_DATE="1970-01-01 00:00:00+00:00"
ARG BUILD_REPOSITORY
ARG BUILD_VERSION=0.0.0-local
LABEL \
io.hass.type="core" \
io.hass.arch="${BUILD_ARCH}" \
io.hass.version="${BUILD_VERSION}" \
org.opencontainers.image.created="${BUILD_DATE}" \
org.opencontainers.image.version="${BUILD_VERSION}" \
org.opencontainers.image.source="${BUILD_REPOSITORY}"

View File

@@ -1,4 +1,18 @@
ARG \
BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/amd64-homeassistant:latest
FROM ${BUILD_FROM}
FROM $BUILD_FROM
ARG BUILD_ARCH=amd64
ARG BUILD_DATE="1970-01-01 00:00:00+00:00"
ARG BUILD_REPOSITORY
ARG BUILD_VERSION=0.0.0-local
LABEL \
io.hass.type="core" \
io.hass.arch="${BUILD_ARCH}" \
io.hass.version="${BUILD_VERSION}" \
org.opencontainers.image.created="${BUILD_DATE}" \
org.opencontainers.image.version="${BUILD_VERSION}" \
org.opencontainers.image.source="${BUILD_REPOSITORY}"

View File

@@ -1,7 +1,21 @@
ARG \
BUILD_FROM
FROM $BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/aarch64-homeassistant:latest
FROM ${BUILD_FROM}
RUN apk --no-cache add \
raspberrypi-utils
raspberrypi-utils
ARG BUILD_ARCH=aarch64
ARG BUILD_DATE="1970-01-01 00:00:00+00:00"
ARG BUILD_REPOSITORY
ARG BUILD_VERSION=0.0.0-local
LABEL \
io.hass.type="core" \
io.hass.arch="${BUILD_ARCH}" \
io.hass.version="${BUILD_VERSION}" \
org.opencontainers.image.created="${BUILD_DATE}" \
org.opencontainers.image.version="${BUILD_VERSION}" \
org.opencontainers.image.source="${BUILD_REPOSITORY}"

View File

@@ -1,7 +1,21 @@
ARG \
BUILD_FROM
FROM $BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/aarch64-homeassistant:latest
FROM ${BUILD_FROM}
RUN apk --no-cache add \
raspberrypi-utils
raspberrypi-utils
ARG BUILD_ARCH=aarch64
ARG BUILD_DATE="1970-01-01 00:00:00+00:00"
ARG BUILD_REPOSITORY
ARG BUILD_VERSION=0.0.0-local
LABEL \
io.hass.type="core" \
io.hass.arch="${BUILD_ARCH}" \
io.hass.version="${BUILD_VERSION}" \
org.opencontainers.image.created="${BUILD_DATE}" \
org.opencontainers.image.version="${BUILD_VERSION}" \
org.opencontainers.image.source="${BUILD_REPOSITORY}"

View File

@@ -1,7 +1,21 @@
ARG \
BUILD_FROM
FROM $BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/aarch64-homeassistant:latest
FROM ${BUILD_FROM}
RUN apk --no-cache add \
raspberrypi-utils
raspberrypi-utils
ARG BUILD_ARCH=aarch64
ARG BUILD_DATE="1970-01-01 00:00:00+00:00"
ARG BUILD_REPOSITORY
ARG BUILD_VERSION=0.0.0-local
LABEL \
io.hass.type="core" \
io.hass.arch="${BUILD_ARCH}" \
io.hass.version="${BUILD_VERSION}" \
org.opencontainers.image.created="${BUILD_DATE}" \
org.opencontainers.image.version="${BUILD_VERSION}" \
org.opencontainers.image.source="${BUILD_REPOSITORY}"

View File

@@ -1,7 +1,21 @@
ARG \
BUILD_FROM
FROM $BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/aarch64-homeassistant:latest
FROM ${BUILD_FROM}
RUN apk --no-cache add \
raspberrypi-utils
raspberrypi-utils
ARG BUILD_ARCH=aarch64
ARG BUILD_DATE="1970-01-01 00:00:00+00:00"
ARG BUILD_REPOSITORY
ARG BUILD_VERSION=0.0.0-local
LABEL \
io.hass.type="core" \
io.hass.arch="${BUILD_ARCH}" \
io.hass.version="${BUILD_VERSION}" \
org.opencontainers.image.created="${BUILD_DATE}" \
org.opencontainers.image.version="${BUILD_VERSION}" \
org.opencontainers.image.source="${BUILD_REPOSITORY}"

requirements_all.txt (generated)
View File

@@ -553,7 +553,7 @@ async-upnp-client==0.46.2
asyncarve==0.1.1
# homeassistant.components.keyboard_remote
asyncinotify==4.2.0
asyncinotify==4.4.0
# homeassistant.components.supla
asyncpysupla==0.0.5
@@ -936,7 +936,7 @@ eternalegypt==0.0.18
eufylife-ble-client==0.1.8
# homeassistant.components.keyboard_remote
# evdev==1.6.1
# evdev==1.9.3
# homeassistant.components.evohome
evohome-async==1.1.3
@@ -1223,7 +1223,7 @@ hole==0.9.0
holidays==0.84
# homeassistant.components.frontend
home-assistant-frontend==20260302.0
home-assistant-frontend==20260304.0
# homeassistant.components.conversation
home-assistant-intents==2026.3.3
@@ -2373,7 +2373,7 @@ pyplaato==0.0.19
pypoint==3.0.0
# homeassistant.components.portainer
pyportainer==1.0.28
pyportainer==1.0.32
# homeassistant.components.probe_plus
pyprobeplus==1.1.2
@@ -2799,7 +2799,7 @@ renault-api==0.5.6
renson-endura-delta==1.7.2
# homeassistant.components.reolink
reolink-aio==0.19.0
reolink-aio==0.19.1
# homeassistant.components.idteck_prox
rfk101py==0.0.1

View File

@@ -1084,7 +1084,7 @@ hole==0.9.0
holidays==0.84
# homeassistant.components.frontend
home-assistant-frontend==20260302.0
home-assistant-frontend==20260304.0
# homeassistant.components.conversation
home-assistant-intents==2026.3.3
@@ -2026,7 +2026,7 @@ pyplaato==0.0.19
pypoint==3.0.0
# homeassistant.components.portainer
pyportainer==1.0.28
pyportainer==1.0.32
# homeassistant.components.probe_plus
pyprobeplus==1.1.2
@@ -2371,7 +2371,7 @@ renault-api==0.5.6
renson-endura-delta==1.7.2
# homeassistant.components.reolink
reolink-aio==0.19.0
reolink-aio==0.19.1
# homeassistant.components.rflink
rflink==0.0.67

View File

@@ -16,19 +16,9 @@ _GO2RTC_SHA = (
DOCKERFILE_TEMPLATE = r"""# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM
ARG BUILD_FROM=ghcr.io/home-assistant/amd64-homeassistant-base:latest
FROM ${{BUILD_FROM}}
LABEL \
io.hass.type="core" \
org.opencontainers.image.authors="The Home Assistant Authors" \
org.opencontainers.image.description="Open-source home automation platform running on Python 3" \
org.opencontainers.image.documentation="https://www.home-assistant.io/docs/" \
org.opencontainers.image.licenses="Apache-2.0" \
org.opencontainers.image.source="https://github.com/home-assistant/core" \
org.opencontainers.image.title="Home Assistant" \
org.opencontainers.image.url="https://www.home-assistant.io/"
# Synchronize with homeassistant/core.py:async_stop
ENV \
S6_SERVICES_GRACETIME={timeout} \
@@ -75,8 +65,88 @@ RUN \
homeassistant/homeassistant
WORKDIR /config
ARG BUILD_ARCH=amd64
ARG BUILD_DATE="1970-01-01 00:00:00+00:00"
ARG BUILD_REPOSITORY
ARG BUILD_VERSION=0.0.0-local
LABEL \
io.hass.type="core" \
io.hass.arch="${{BUILD_ARCH}}" \
io.hass.version="${{BUILD_VERSION}}" \
org.opencontainers.image.created="${{BUILD_DATE}}" \
org.opencontainers.image.version="${{BUILD_VERSION}}" \
org.opencontainers.image.source="${{BUILD_REPOSITORY}}" \
org.opencontainers.image.authors="The Home Assistant Authors" \
org.opencontainers.image.description="Open-source home automation platform running on Python 3" \
org.opencontainers.image.documentation="https://www.home-assistant.io/docs/" \
org.opencontainers.image.licenses="Apache-2.0" \
org.opencontainers.image.title="Home Assistant" \
org.opencontainers.image.url="https://www.home-assistant.io/"
"""
@dataclass(frozen=True)
class _MachineConfig:
"""Machine-specific Dockerfile configuration."""
arch: str
packages: tuple[str, ...] = ()
_MACHINES = {
"generic-x86-64": _MachineConfig(arch="amd64", packages=("libva-intel-driver",)),
"green": _MachineConfig(arch="aarch64"),
"intel-nuc": _MachineConfig(arch="amd64", packages=("libva-intel-driver",)),
"khadas-vim3": _MachineConfig(arch="aarch64"),
"odroid-c2": _MachineConfig(arch="aarch64"),
"odroid-c4": _MachineConfig(arch="aarch64"),
"odroid-m1": _MachineConfig(arch="aarch64"),
"odroid-n2": _MachineConfig(arch="aarch64"),
"qemuarm-64": _MachineConfig(arch="aarch64"),
"qemux86-64": _MachineConfig(arch="amd64"),
"raspberrypi3-64": _MachineConfig(arch="aarch64", packages=("raspberrypi-utils",)),
"raspberrypi4-64": _MachineConfig(arch="aarch64", packages=("raspberrypi-utils",)),
"raspberrypi5-64": _MachineConfig(arch="aarch64", packages=("raspberrypi-utils",)),
"yellow": _MachineConfig(arch="aarch64", packages=("raspberrypi-utils",)),
}
_MACHINE_DOCKERFILE_TEMPLATE = r"""# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/{arch}-homeassistant:latest
FROM ${{BUILD_FROM}}
{extra_packages}
ARG BUILD_ARCH={arch}
ARG BUILD_DATE="1970-01-01 00:00:00+00:00"
ARG BUILD_REPOSITORY
ARG BUILD_VERSION=0.0.0-local
LABEL \
io.hass.type="core" \
io.hass.arch="${{BUILD_ARCH}}" \
io.hass.version="${{BUILD_VERSION}}" \
org.opencontainers.image.created="${{BUILD_DATE}}" \
org.opencontainers.image.version="${{BUILD_VERSION}}" \
org.opencontainers.image.source="${{BUILD_REPOSITORY}}"
"""
def _generate_machine_dockerfile(machine_config: _MachineConfig) -> str:
"""Generate a machine Dockerfile from configuration."""
if machine_config.packages:
pkg_lines = " \\\n ".join(machine_config.packages)
extra_packages = f"\nRUN apk --no-cache add \\\n {pkg_lines}\n"
else:
extra_packages = ""
return _MACHINE_DOCKERFILE_TEMPLATE.format(
arch=machine_config.arch,
extra_packages=extra_packages,
)
_HASSFEST_TEMPLATE = r"""# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
@@ -174,7 +244,7 @@ def _generate_files(config: Config) -> list[File]:
config.root / "requirements_test_pre_commit.txt", {"ruff"}
)
return [
files = [
File(
DOCKERFILE_TEMPLATE.format(
timeout=timeout,
@@ -192,6 +262,16 @@ def _generate_files(config: Config) -> list[File]:
),
]
for machine_name, machine_config in sorted(_MACHINES.items()):
files.append(
File(
_generate_machine_dockerfile(machine_config),
config.root / "machine" / machine_name,
)
)
return files
def validate(integrations: dict[str, Integration], config: Config) -> None:
"""Validate dockerfile."""

View File

@@ -181,7 +181,6 @@ EXCEPTIONS = {
"PySwitchmate", # https://github.com/Danielhiversen/pySwitchmate/pull/16
"PyXiaomiGateway", # https://github.com/Danielhiversen/PyXiaomiGateway/pull/201
"chacha20poly1305", # LGPL
"caio", # Apache 2 https://github.com/mosquito/caio/?tab=Apache-2.0-1-ov-file#readme
"commentjson", # https://github.com/vaidik/commentjson/pull/55
"crownstone-cloud", # https://github.com/crownstone/crownstone-lib-python-cloud/pull/5
"crownstone-core", # https://github.com/crownstone/crownstone-lib-python-core/pull/6

View File

@@ -41,6 +41,7 @@
'current_position': 90,
'device_class': 'damper',
'friendly_name': 'Zone 1 Damper',
'is_closed': False,
'supported_features': <CoverEntityFeature: 7>,
}),
'context': <ANY>,
@@ -93,6 +94,7 @@
'current_position': 100,
'device_class': 'damper',
'friendly_name': 'Zone 2 Damper',
'is_closed': False,
'supported_features': <CoverEntityFeature: 7>,
}),
'context': <ANY>,

View File

@@ -40,6 +40,7 @@
'attributes': ReadOnlyDict({
'device_class': 'garage',
'friendly_name': 'Test Door',
'is_closed': True,
'supported_features': <CoverEntityFeature: 3>,
}),
'context': <ANY>,

View File

@@ -6,10 +6,7 @@ from unittest.mock import AsyncMock, MagicMock, patch
import pytest
from homeassistant.components.aws_s3.backup import (
MULTIPART_MIN_PART_SIZE_BYTES,
suggested_filenames,
)
from homeassistant.components.aws_s3.backup import suggested_filenames
from homeassistant.components.aws_s3.const import DOMAIN
from homeassistant.components.backup import AgentBackup
@@ -18,11 +15,14 @@ from .const import CONFIG_ENTRY_DATA
from tests.common import MockConfigEntry
@pytest.fixture(
params=[2**20, MULTIPART_MIN_PART_SIZE_BYTES],
ids=["small", "large"],
)
def test_backup(request: pytest.FixtureRequest) -> None:
@pytest.fixture
def backup_size() -> int:
"""Backup size, override in tests to change defaults."""
return 2**20
@pytest.fixture
def mock_agent_backup(backup_size: int) -> AgentBackup:
"""Test backup fixture."""
return AgentBackup(
addons=[],
@@ -35,12 +35,12 @@ def test_backup(request: pytest.FixtureRequest) -> None:
homeassistant_version="2024.12.0.dev0",
name="Core 2024.12.0.dev0",
protected=False,
size=request.param,
size=backup_size,
)
@pytest.fixture(autouse=True)
def mock_client(test_backup: AgentBackup) -> Generator[AsyncMock]:
def mock_client(mock_agent_backup: AgentBackup) -> Generator[AsyncMock]:
"""Mock the S3 client."""
with patch(
"aiobotocore.session.AioSession.create_client",
@@ -49,7 +49,7 @@ def mock_client(test_backup: AgentBackup) -> Generator[AsyncMock]:
) as create_client:
client = create_client.return_value
tar_file, metadata_file = suggested_filenames(test_backup)
tar_file, metadata_file = suggested_filenames(mock_agent_backup)
# Mock the paginator for list_objects_v2
client.get_paginator = MagicMock()
@@ -66,7 +66,7 @@ def mock_client(test_backup: AgentBackup) -> Generator[AsyncMock]:
yield b"backup data"
async def read(self) -> bytes:
return json.dumps(test_backup.as_dict()).encode()
return json.dumps(mock_agent_backup.as_dict()).encode()
client.get_object.return_value = {"Body": MockStream()}
client.head_bucket.return_value = {}
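
The conftest refactor above replaces the parametrized `test_backup` fixture with a plain `backup_size` fixture plus `mock_agent_backup`, so only the tests that care about the multipart threshold ask for a large backup. A hedged sketch of that override pattern follows; the test name is hypothetical, while the fixture names, `MULTIPART_MIN_PART_SIZE_BYTES`, and `AgentBackup` come from the aws_s3 test code in this diff.

```python
# Hypothetical test module showing how the new backup_size fixture is
# overridden per test via parametrize (the same pattern the large-upload
# test later in this diff uses). backup_size is in mock_agent_backup's
# fixture closure, so parametrizing it changes the generated backup size.
import pytest

from homeassistant.components.aws_s3.backup import MULTIPART_MIN_PART_SIZE_BYTES
from homeassistant.components.backup import AgentBackup


@pytest.mark.parametrize(
    "backup_size", [MULTIPART_MIN_PART_SIZE_BYTES], ids=["large"]
)
def test_example_multipart_sized_backup(mock_agent_backup: AgentBackup) -> None:
    """Only this test pays for the multipart-sized backup; others keep 2**20."""
    assert mock_agent_backup.size >= MULTIPART_MIN_PART_SIZE_BYTES
```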

View File

@@ -1,41 +1,5 @@
# serializer version: 1
# name: test_entry_diagnostics[large]
dict({
'backup': list([
dict({
'addons': list([
]),
'backup_id': '23e64aec',
'database_included': True,
'date': '2024-11-22T11:48:48.727189+01:00',
'extra_metadata': dict({
}),
'folders': list([
]),
'homeassistant_included': True,
'homeassistant_version': '2024.12.0.dev0',
'name': 'Core 2024.12.0.dev0',
'protected': False,
'size': 20971520,
}),
]),
'backup_agents': list([
dict({
'name': 'test',
}),
]),
'config': dict({
'access_key_id': '**REDACTED**',
'bucket': 'test',
'endpoint_url': 'https://s3.eu-south-1.amazonaws.com',
'secret_access_key': '**REDACTED**',
}),
'coordinator_data': dict({
'all_backups_size': 20971520,
}),
})
# ---
# name: test_entry_diagnostics[small]
# name: test_entry_diagnostics
dict({
'backup': list([
dict({

View File

@@ -1,5 +1,5 @@
# serializer version: 1
# name: test_sensor[large].2
# name: test_sensor.2
DeviceRegistryEntrySnapshot({
'area_id': None,
'config_entries': <ANY>,
@@ -30,7 +30,7 @@
'via_device_id': None,
})
# ---
# name: test_sensor[large][sensor.bucket_test_total_size_of_backups-entry]
# name: test_sensor[sensor.bucket_test_total_size_of_backups-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
@@ -72,95 +72,7 @@
'unit_of_measurement': <UnitOfInformation.MEBIBYTES: 'MiB'>,
})
# ---
# name: test_sensor[large][sensor.bucket_test_total_size_of_backups-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'device_class': 'data_size',
'friendly_name': 'Bucket test Total size of backups',
'unit_of_measurement': <UnitOfInformation.MEBIBYTES: 'MiB'>,
}),
'context': <ANY>,
'entity_id': 'sensor.bucket_test_total_size_of_backups',
'last_changed': <ANY>,
'last_reported': <ANY>,
'last_updated': <ANY>,
'state': '20.0',
})
# ---
# name: test_sensor[small].2
DeviceRegistryEntrySnapshot({
'area_id': None,
'config_entries': <ANY>,
'config_entries_subentries': <ANY>,
'configuration_url': None,
'connections': set({
}),
'disabled_by': None,
'entry_type': <DeviceEntryType.SERVICE: 'service'>,
'hw_version': None,
'id': <ANY>,
'identifiers': set({
tuple(
'aws_s3',
'test',
),
}),
'labels': set({
}),
'manufacturer': 'AWS',
'model': 'AWS S3',
'model_id': None,
'name': 'Bucket test',
'name_by_user': None,
'primary_config_entry': <ANY>,
'serial_number': None,
'sw_version': None,
'via_device_id': None,
})
# ---
# name: test_sensor[small][sensor.bucket_test_total_size_of_backups-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
'config_subentry_id': <ANY>,
'device_class': None,
'device_id': <ANY>,
'disabled_by': None,
'domain': 'sensor',
'entity_category': <EntityCategory.DIAGNOSTIC: 'diagnostic'>,
'entity_id': 'sensor.bucket_test_total_size_of_backups',
'has_entity_name': True,
'hidden_by': None,
'icon': None,
'id': <ANY>,
'labels': set({
}),
'name': None,
'object_id_base': 'Total size of backups',
'options': dict({
'sensor': dict({
'suggested_display_precision': 0,
}),
'sensor.private': dict({
'suggested_unit_of_measurement': <UnitOfInformation.MEBIBYTES: 'MiB'>,
}),
}),
'original_device_class': <SensorDeviceClass.DATA_SIZE: 'data_size'>,
'original_icon': None,
'original_name': 'Total size of backups',
'platform': 'aws_s3',
'previous_unique_id': None,
'suggested_object_id': None,
'supported_features': 0,
'translation_key': 'backups_size',
'unique_id': 'test_backups_size',
'unit_of_measurement': <UnitOfInformation.MEBIBYTES: 'MiB'>,
})
# ---
# name: test_sensor[small][sensor.bucket_test_total_size_of_backups-state]
# name: test_sensor[sensor.bucket_test_total_size_of_backups-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'device_class': 'data_size',

View File

@@ -4,7 +4,7 @@ from collections.abc import AsyncGenerator
from io import StringIO
import json
from time import time
from unittest.mock import ANY, AsyncMock, Mock, patch
from unittest.mock import ANY, AsyncMock, Mock, call, patch
from botocore.exceptions import ConnectTimeoutError
import pytest
@@ -99,7 +99,7 @@ async def test_agents_list_backups(
hass: HomeAssistant,
hass_ws_client: WebSocketGenerator,
mock_config_entry: MockConfigEntry,
test_backup: AgentBackup,
mock_agent_backup: AgentBackup,
) -> None:
"""Test agent list backups."""
@@ -111,24 +111,24 @@ async def test_agents_list_backups(
assert response["result"]["agent_errors"] == {}
assert response["result"]["backups"] == [
{
"addons": test_backup.addons,
"addons": mock_agent_backup.addons,
"agents": {
f"{DOMAIN}.{mock_config_entry.entry_id}": {
"protected": test_backup.protected,
"size": test_backup.size,
"protected": mock_agent_backup.protected,
"size": mock_agent_backup.size,
}
},
"backup_id": test_backup.backup_id,
"database_included": test_backup.database_included,
"date": test_backup.date,
"extra_metadata": test_backup.extra_metadata,
"backup_id": mock_agent_backup.backup_id,
"database_included": mock_agent_backup.database_included,
"date": mock_agent_backup.date,
"extra_metadata": mock_agent_backup.extra_metadata,
"failed_addons": [],
"failed_agent_ids": [],
"failed_folders": [],
"folders": test_backup.folders,
"homeassistant_included": test_backup.homeassistant_included,
"homeassistant_version": test_backup.homeassistant_version,
"name": test_backup.name,
"folders": mock_agent_backup.folders,
"homeassistant_included": mock_agent_backup.homeassistant_included,
"homeassistant_version": mock_agent_backup.homeassistant_version,
"name": mock_agent_backup.name,
"with_automatic_settings": None,
}
]
@@ -138,37 +138,37 @@ async def test_agents_get_backup(
hass: HomeAssistant,
hass_ws_client: WebSocketGenerator,
mock_config_entry: MockConfigEntry,
test_backup: AgentBackup,
mock_agent_backup: AgentBackup,
) -> None:
"""Test agent get backup."""
client = await hass_ws_client(hass)
await client.send_json_auto_id(
{"type": "backup/details", "backup_id": test_backup.backup_id}
{"type": "backup/details", "backup_id": mock_agent_backup.backup_id}
)
response = await client.receive_json()
assert response["success"]
assert response["result"]["agent_errors"] == {}
assert response["result"]["backup"] == {
"addons": test_backup.addons,
"addons": mock_agent_backup.addons,
"agents": {
f"{DOMAIN}.{mock_config_entry.entry_id}": {
"protected": test_backup.protected,
"size": test_backup.size,
"protected": mock_agent_backup.protected,
"size": mock_agent_backup.size,
}
},
"backup_id": test_backup.backup_id,
"database_included": test_backup.database_included,
"date": test_backup.date,
"extra_metadata": test_backup.extra_metadata,
"backup_id": mock_agent_backup.backup_id,
"database_included": mock_agent_backup.database_included,
"date": mock_agent_backup.date,
"extra_metadata": mock_agent_backup.extra_metadata,
"failed_addons": [],
"failed_agent_ids": [],
"failed_folders": [],
"folders": test_backup.folders,
"homeassistant_included": test_backup.homeassistant_included,
"homeassistant_version": test_backup.homeassistant_version,
"name": test_backup.name,
"folders": mock_agent_backup.folders,
"homeassistant_included": mock_agent_backup.homeassistant_included,
"homeassistant_version": mock_agent_backup.homeassistant_version,
"name": mock_agent_backup.name,
"with_automatic_settings": None,
}
@@ -197,7 +197,7 @@ async def test_agents_list_backups_with_corrupted_metadata(
mock_client: MagicMock,
mock_config_entry: MockConfigEntry,
caplog: pytest.LogCaptureFixture,
test_backup: AgentBackup,
mock_agent_backup: AgentBackup,
) -> None:
"""Test listing backups when one metadata file is corrupted."""
# Create agent
@@ -220,7 +220,7 @@ async def test_agents_list_backups_with_corrupted_metadata(
]
# Mock responses for get_object calls
valid_metadata = json.dumps(test_backup.as_dict())
valid_metadata = json.dumps(mock_agent_backup.as_dict())
corrupted_metadata = "{invalid json content"
async def mock_get_object(**kwargs):
@@ -239,7 +239,7 @@ async def test_agents_list_backups_with_corrupted_metadata(
backups = await agent.async_list_backups()
assert len(backups) == 1
assert backups[0].backup_id == test_backup.backup_id
assert backups[0].backup_id == mock_agent_backup.backup_id
assert "Failed to process metadata file" in caplog.text
@@ -290,72 +290,31 @@ async def test_agents_delete_not_throwing_on_not_found(
assert mock_client.delete_object.call_count == 0
async def test_agents_upload(
hass_client: ClientSessionGenerator,
caplog: pytest.LogCaptureFixture,
mock_client: MagicMock,
mock_config_entry: MockConfigEntry,
test_backup: AgentBackup,
) -> None:
"""Test agent upload backup."""
client = await hass_client()
with (
patch(
"homeassistant.components.backup.manager.BackupManager.async_get_backup",
return_value=test_backup,
),
patch(
"homeassistant.components.backup.manager.read_backup",
return_value=test_backup,
),
patch("pathlib.Path.open") as mocked_open,
):
# we must emit at least two chunks
# the "appendix" chunk triggers the upload of the final buffer part
mocked_open.return_value.read = Mock(
side_effect=[
b"a" * test_backup.size,
b"appendix",
b"",
]
)
resp = await client.post(
f"/api/backup/upload?agent_id={DOMAIN}.{mock_config_entry.entry_id}",
data={"file": StringIO("test")},
)
assert resp.status == 201
assert f"Uploading backup {test_backup.backup_id}" in caplog.text
if test_backup.size < MULTIPART_MIN_PART_SIZE_BYTES:
# single part + metadata both as regular upload (no multiparts)
assert mock_client.create_multipart_upload.await_count == 0
assert mock_client.put_object.await_count == 2
else:
assert "Uploading final part" in caplog.text
# 2 parts as multipart + metadata as regular upload
assert mock_client.create_multipart_upload.await_count == 1
assert mock_client.upload_part.await_count == 2
assert mock_client.complete_multipart_upload.await_count == 1
assert mock_client.put_object.await_count == 1
@pytest.mark.parametrize(
"backup_size",
[
2**20,
MULTIPART_MIN_PART_SIZE_BYTES,
],
ids=["small", "large"],
)
async def test_agents_upload_network_failure(
hass_client: ClientSessionGenerator,
caplog: pytest.LogCaptureFixture,
mock_client: MagicMock,
mock_config_entry: MockConfigEntry,
test_backup: AgentBackup,
mock_agent_backup: AgentBackup,
) -> None:
"""Test agent upload backup with network failure."""
client = await hass_client()
with (
patch(
"homeassistant.components.backup.manager.BackupManager.async_get_backup",
return_value=test_backup,
return_value=mock_agent_backup,
),
patch(
"homeassistant.components.backup.manager.read_backup",
return_value=test_backup,
return_value=mock_agent_backup,
),
patch("pathlib.Path.open") as mocked_open,
):
@@ -396,7 +355,7 @@ async def test_error_during_delete(
hass_ws_client: WebSocketGenerator,
mock_client: MagicMock,
mock_config_entry: MockConfigEntry,
test_backup: AgentBackup,
mock_agent_backup: AgentBackup,
) -> None:
"""Test the error wrapper."""
mock_client.delete_object.side_effect = BotoCoreError
@@ -406,7 +365,7 @@ async def test_error_during_delete(
await client.send_json_auto_id(
{
"type": "backup/delete",
"backup_id": test_backup.backup_id,
"backup_id": mock_agent_backup.backup_id,
}
)
response = await client.receive_json()
@@ -422,7 +381,7 @@ async def test_error_during_delete(
async def test_cache_expiration(
hass: HomeAssistant,
mock_client: MagicMock,
test_backup: AgentBackup,
mock_agent_backup: AgentBackup,
) -> None:
"""Test that the cache expires correctly."""
# Mock the entry
@@ -441,7 +400,7 @@ async def test_cache_expiration(
mock_client.reset_mock()
# Mock metadata response
metadata_content = json.dumps(test_backup.as_dict())
metadata_content = json.dumps(mock_agent_backup.as_dict())
mock_body = AsyncMock()
mock_body.read.return_value = metadata_content.encode()
mock_client.get_paginator.return_value.paginate.return_value.__aiter__.return_value = [
@@ -587,7 +546,7 @@ async def test_agent_list_backups_parametrized(
hass_ws_client: WebSocketGenerator,
mock_config_entry: MockConfigEntry,
mock_client: MagicMock,
test_backup: AgentBackup,
mock_agent_backup: AgentBackup,
config_entry_extra_data: dict,
expected_paginate_extra_kwargs: dict,
) -> None:
@@ -618,7 +577,7 @@ async def test_agent_delete_backup_parametrized(
hass_ws_client: WebSocketGenerator,
mock_client: MagicMock,
mock_config_entry: MockConfigEntry,
test_backup: AgentBackup,
mock_agent_backup: AgentBackup,
expected_key_prefix: str,
) -> None:
"""Test agent delete backup with and without prefix."""
@@ -635,7 +594,7 @@ async def test_agent_delete_backup_parametrized(
assert response["success"]
assert response["result"] == {"agent_errors": {}}
tar_filename, metadata_filename = suggested_filenames(test_backup)
tar_filename, metadata_filename = suggested_filenames(mock_agent_backup)
expected_tar_key = f"{expected_key_prefix}{tar_filename}"
expected_metadata_key = f"{expected_key_prefix}{metadata_filename}"
@@ -644,6 +603,40 @@ async def test_agent_delete_backup_parametrized(
mock_client.delete_object.assert_any_call(Bucket="test", Key=expected_metadata_key)
async def _upload_backup(
hass_client: ClientSessionGenerator,
agent_id: str,
mock_agent_backup: AgentBackup,
) -> None:
"""Perform a backup upload with the necessary mocks set up."""
client = await hass_client()
with (
patch(
"homeassistant.components.backup.manager.BackupManager.async_get_backup",
return_value=mock_agent_backup,
),
patch(
"homeassistant.components.backup.manager.read_backup",
return_value=mock_agent_backup,
),
patch("pathlib.Path.open") as mocked_open,
):
# we must emit at least two chunks
# the "appendix" chunk triggers the upload of the final buffer part
mocked_open.return_value.read = Mock(
side_effect=[
b"a" * mock_agent_backup.size,
b"appendix",
b"",
]
)
resp = await client.post(
f"/api/backup/upload?agent_id={agent_id}",
data={"file": StringIO("test")},
)
assert resp.status == 201
@pytest.mark.parametrize(
("config_entry_extra_data", "expected_key_prefix"),
[
@@ -652,75 +645,95 @@ async def test_agent_delete_backup_parametrized(
],
ids=["with_prefix", "no_prefix"],
)
async def test_agent_upload_backup_parametrized(
hass: HomeAssistant,
async def test_agent_upload_small_backup_parametrized(
hass_client: ClientSessionGenerator,
caplog: pytest.LogCaptureFixture,
mock_client: MagicMock,
mock_config_entry: MockConfigEntry,
test_backup: AgentBackup,
mock_agent_backup: AgentBackup,
expected_key_prefix: str,
) -> None:
"""Test agent upload backup with and without prefix."""
client = await hass_client()
with (
patch(
"homeassistant.components.backup.manager.BackupManager.async_get_backup",
return_value=test_backup,
),
patch(
"homeassistant.components.backup.manager.read_backup",
return_value=test_backup,
),
patch("pathlib.Path.open") as mocked_open,
):
# we must emit at least two chunks
# the "appendix" chunk triggers the upload of the final buffer part
mocked_open.return_value.read = Mock(
side_effect=[
b"a" * test_backup.size,
b"appendix",
b"",
]
)
resp = await client.post(
f"/api/backup/upload?agent_id={DOMAIN}.{mock_config_entry.entry_id}",
data={"file": StringIO("test")},
)
"""Test agent upload small backup with and without prefix."""
await _upload_backup(
hass_client, f"{DOMAIN}.{mock_config_entry.entry_id}", mock_agent_backup
)
assert resp.status == 201
tar_filename, metadata_filename = suggested_filenames(test_backup)
expected_tar_key = f"{expected_key_prefix}{tar_filename}"
expected_metadata_key = f"{expected_key_prefix}{metadata_filename}"
if test_backup.size < MULTIPART_MIN_PART_SIZE_BYTES:
mock_client.put_object.assert_any_call(
Bucket="test", Key=expected_tar_key, Body=ANY
)
mock_client.put_object.assert_any_call(
Bucket="test", Key=expected_metadata_key, Body=ANY
)
else:
mock_client.create_multipart_upload.assert_called_with(
Bucket="test", Key=expected_tar_key
)
mock_client.upload_part.assert_any_call(
assert f"Uploading backup {mock_agent_backup.backup_id}" in caplog.text
assert mock_client.create_multipart_upload.await_count == 0
assert mock_client.upload_part.await_count == 0
assert mock_client.complete_multipart_upload.await_count == 0
assert mock_client.put_object.await_count == 2
tar_filename, metadata_filename = suggested_filenames(mock_agent_backup)
mock_client.put_object.assert_has_calls(
[
call(Bucket="test", Key=f"{expected_key_prefix}{tar_filename}", Body=ANY),
call(
Bucket="test",
Key=expected_tar_key,
Key=f"{expected_key_prefix}{metadata_filename}",
Body=ANY,
),
]
)
@pytest.mark.parametrize("backup_size", [MULTIPART_MIN_PART_SIZE_BYTES], ids=["large"])
@pytest.mark.parametrize(
("config_entry_extra_data", "expected_key_prefix"),
[
({"prefix": "backups/home"}, "backups/home/"),
({}, ""),
],
ids=["with_prefix", "no_prefix"],
)
async def test_agent_upload_large_backup_parametrized(
hass_client: ClientSessionGenerator,
caplog: pytest.LogCaptureFixture,
mock_client: MagicMock,
mock_config_entry: MockConfigEntry,
mock_agent_backup: AgentBackup,
expected_key_prefix: str,
) -> None:
"""Test agent upload large (multipart) backup with and without prefix."""
await _upload_backup(
hass_client, f"{DOMAIN}.{mock_config_entry.entry_id}", mock_agent_backup
)
tar_filename, metadata_filename = suggested_filenames(mock_agent_backup)
tar_key = f"{expected_key_prefix}{tar_filename}"
metadata_key = f"{expected_key_prefix}{metadata_filename}"
assert f"Uploading backup {mock_agent_backup.backup_id}" in caplog.text
assert mock_client.create_multipart_upload.await_count == 1
assert mock_client.upload_part.await_count == 2
assert mock_client.complete_multipart_upload.await_count == 1
assert mock_client.put_object.await_count == 1
mock_client.create_multipart_upload.assert_called_with(Bucket="test", Key=tar_key)
mock_client.upload_part.assert_has_calls(
[
call(
Bucket="test",
Key=tar_key,
PartNumber=1,
UploadId="upload_id",
Body=ANY,
)
mock_client.complete_multipart_upload.assert_called_with(
),
call(
Bucket="test",
Key=expected_tar_key,
Key=tar_key,
PartNumber=2,
UploadId="upload_id",
MultipartUpload=ANY,
)
mock_client.put_object.assert_called_with(
Bucket="test", Key=expected_metadata_key, Body=ANY
)
Body=ANY,
),
]
)
mock_client.complete_multipart_upload.assert_called_with(
Bucket="test",
Key=tar_key,
UploadId="upload_id",
MultipartUpload=ANY,
)
mock_client.put_object.assert_called_with(Bucket="test", Key=metadata_key, Body=ANY)
@pytest.mark.parametrize(
@@ -736,7 +749,7 @@ async def test_agent_download_backup_parametrized(
hass_client: ClientSessionGenerator,
mock_client: MagicMock,
mock_config_entry: MockConfigEntry,
test_backup: AgentBackup,
mock_agent_backup: AgentBackup,
expected_key_prefix: str,
) -> None:
"""Test agent download backup with and without prefix."""
@@ -749,7 +762,7 @@ async def test_agent_download_backup_parametrized(
assert resp.status == 200
assert await resp.content.read() == b"backup data"
tar_filename, _ = suggested_filenames(test_backup)
tar_filename, _ = suggested_filenames(mock_agent_backup)
expected_tar_key = f"{expected_key_prefix}{tar_filename}"
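
One small but load-bearing change in this file is the added `call` import: the upload assertions move from individual `assert_any_call` checks to ordered `assert_has_calls` lists, which lets the multipart test pin part 1 before part 2. A stdlib-only illustration of that behaviour (the bucket, key, and bodies are made up for the example):

```python
# With any_order left at its default (False), assert_has_calls requires the
# listed call(...) specs to appear consecutively and in this order among the
# mock's recorded calls; ANY matches the streamed body.
from unittest.mock import ANY, MagicMock, call

client = MagicMock()
client.upload_part(Bucket="test", Key="backup.tar", PartNumber=1, UploadId="upload_id", Body=b"a")
client.upload_part(Bucket="test", Key="backup.tar", PartNumber=2, UploadId="upload_id", Body=b"b")

client.upload_part.assert_has_calls(
    [
        call(Bucket="test", Key="backup.tar", PartNumber=1, UploadId="upload_id", Body=ANY),
        call(Bucket="test", Key="backup.tar", PartNumber=2, UploadId="upload_id", Body=ANY),
    ]
)
```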

View File

@@ -90,7 +90,7 @@ async def test_calculate_backups_size(
mock_client: AsyncMock,
mock_config_entry: MockConfigEntry,
freezer: FrozenDateTimeFactory,
test_backup: AgentBackup,
mock_agent_backup: AgentBackup,
config_entry_extra_data: dict,
expected_pagination_call: dict,
) -> None:
@@ -104,7 +104,7 @@ async def test_calculate_backups_size(
assert state.state == "0.0"
# Add a backup
metadata_content = json.dumps(test_backup.as_dict())
metadata_content = json.dumps(mock_agent_backup.as_dict())
mock_body = AsyncMock()
mock_body.read.return_value = metadata_content.encode()
mock_client.get_object.return_value = {"Body": mock_body}

View File

@@ -68,7 +68,7 @@ def get_fake_chromecast(info: ChromecastInfo):
mock = MagicMock(uuid=info.uuid)
mock.app_id = None
mock.media_controller.status = None
mock.is_idle = True
mock.ignore_cec = False
return mock
@@ -888,7 +888,6 @@ async def test_entity_cast_status(
assert not state.attributes.get("is_volume_muted")
chromecast.app_id = "1234"
chromecast.is_idle = False
cast_status = MagicMock()
cast_status.volume_level = 0.5
cast_status.volume_muted = False
@@ -1601,7 +1600,6 @@ async def test_entity_media_states(
# App id updated, but no media status
chromecast.app_id = app_id
chromecast.is_idle = False
cast_status = MagicMock()
cast_status_cb(cast_status)
await hass.async_block_till_done()
@@ -1644,7 +1642,6 @@ async def test_entity_media_states(
# App no longer running
chromecast.app_id = pychromecast.IDLE_APP_ID
chromecast.is_idle = True
cast_status = MagicMock()
cast_status_cb(cast_status)
await hass.async_block_till_done()
@@ -1653,7 +1650,6 @@ async def test_entity_media_states(
# No cast status
chromecast.app_id = None
chromecast.is_idle = False
cast_status_cb(None)
await hass.async_block_till_done()
state = hass.states.get(entity_id)
@@ -1721,20 +1717,70 @@ async def test_entity_media_states_lovelace_app(
chromecast.app_id = pychromecast.IDLE_APP_ID
media_status.player_is_idle = False
chromecast.is_idle = True
media_status_cb(media_status)
await hass.async_block_till_done()
state = hass.states.get(entity_id)
assert state.state == "off"
chromecast.app_id = None
chromecast.is_idle = False
cast_status_cb(None)
media_status_cb(media_status)
await hass.async_block_till_done()
state = hass.states.get(entity_id)
assert state.state == "unknown"
async def test_entity_media_states_active_input(
hass: HomeAssistant, entity_registry: er.EntityRegistry
) -> None:
"""Test various entity media states when the lovelace app is active."""
entity_id = "media_player.speaker"
info = get_fake_chromecast_info()
chromecast, _ = await async_setup_media_player_cast(hass, info)
chromecast.cast_type = pychromecast.const.CAST_TYPE_CHROMECAST
cast_status_cb, conn_status_cb, _ = get_status_callbacks(chromecast)
chromecast.app_id = "84912283"
cast_status = MagicMock()
connection_status = MagicMock()
connection_status.status = "CONNECTED"
conn_status_cb(connection_status)
await hass.async_block_till_done()
# Unknown input status
cast_status.is_active_input = None
cast_status_cb(cast_status)
state = hass.states.get(entity_id)
assert state is not None
assert state.state == "idle"
# Active input status
cast_status.is_active_input = True
cast_status_cb(cast_status)
await hass.async_block_till_done()
state = hass.states.get(entity_id)
assert state.state == "idle"
# Inactive input status
cast_status.is_active_input = False
cast_status_cb(cast_status)
await hass.async_block_till_done()
state = hass.states.get(entity_id)
assert state is not None
assert state.state == "off"
# Inactive input status, but ignored
chromecast.ignore_cec = True
cast_status_cb(cast_status)
await hass.async_block_till_done()
state = hass.states.get(entity_id)
assert state is not None
assert state.state == "idle"
async def test_group_media_states(
hass: HomeAssistant, entity_registry: er.EntityRegistry, mz_mock
) -> None:
@@ -2404,7 +2450,6 @@ async def test_entity_media_states_active_app_reported_idle(
# Scenario: Custom App is running (e.g. DashCast), but device reports is_idle=True
chromecast.app_id = "84912283" # Example Custom App ID
chromecast.is_idle = True # Device thinks it's idle/standby
# Trigger a status update
cast_status = MagicMock()
@@ -2417,7 +2462,6 @@ async def test_entity_media_states_active_app_reported_idle(
# Scenario: Backdrop (Screensaver) is running. Should still be OFF.
chromecast.app_id = pychromecast.config.APP_BACKDROP
chromecast.is_idle = True
cast_status_cb(cast_status)
await hass.async_block_till_done()

View File

@@ -41,6 +41,7 @@
'current_position': 75,
'device_class': 'shutter',
'friendly_name': 'Shutter mock 1',
'is_closed': False,
'supported_features': <CoverEntityFeature: 15>,
}),
'context': <ANY>,

View File

@@ -40,6 +40,7 @@
'attributes': ReadOnlyDict({
'device_class': 'shutter',
'friendly_name': 'Cover0',
'is_closed': None,
'supported_features': <CoverEntityFeature: 11>,
}),
'context': <ANY>,

View File

@@ -34,10 +34,10 @@ async def test_services(
# Test init all covers should be open
assert is_open(hass, ent1)
assert is_open(hass, ent2)
assert is_open(hass, ent2, 50)
assert is_open(hass, ent3)
assert is_open(hass, ent4)
assert is_open(hass, ent5)
assert is_open(hass, ent5, 50)
assert is_open(hass, ent6)
# call basic toggle services
@@ -50,10 +50,10 @@ async def test_services(
# entities should be either closed or closing, depending on if they report transitional states
assert is_closed(hass, ent1)
assert is_closing(hass, ent2)
assert is_closing(hass, ent2, 50)
assert is_closed(hass, ent3)
assert is_closed(hass, ent4)
assert is_closing(hass, ent5)
assert is_closing(hass, ent5, 50)
assert is_closing(hass, ent6)
# call basic toggle services and set different cover position states
@@ -68,10 +68,10 @@ async def test_services(
# entities should be in correct state depending on the SUPPORT_STOP feature and cover position
assert is_open(hass, ent1)
assert is_closed(hass, ent2)
assert is_closed(hass, ent2, 0)
assert is_open(hass, ent3)
assert is_open(hass, ent4)
assert is_open(hass, ent5)
assert is_open(hass, ent5, 15)
assert is_opening(hass, ent6)
# call basic toggle services
@@ -84,10 +84,10 @@ async def test_services(
# entities should be in correct state depending on the SUPPORT_STOP feature and cover position
assert is_closed(hass, ent1)
assert is_opening(hass, ent2)
assert is_opening(hass, ent2, 0, closed=True)
assert is_closed(hass, ent3)
assert is_closed(hass, ent4)
assert is_opening(hass, ent5)
assert is_opening(hass, ent5, 15)
assert is_closing(hass, ent6)
# Without STOP but still reports opening/closing has a 4th possible toggle state
@@ -98,13 +98,13 @@ async def test_services(
# After the unusual state transition: closing -> fully open, toggle should close
set_state(ent5, CoverState.OPEN)
await call_service(hass, SERVICE_TOGGLE, ent5) # Start closing
assert is_closing(hass, ent5)
assert is_closing(hass, ent5, 15)
set_state(
ent5, CoverState.OPEN
) # Unusual state transition from closing -> fully open
set_cover_position(ent5, 100)
await call_service(hass, SERVICE_TOGGLE, ent5) # Should close, not open
assert is_closing(hass, ent5)
assert is_closing(hass, ent5, 100)
def call_service(hass: HomeAssistant, service: str, ent: Entity) -> ServiceResponse:
@@ -124,21 +124,67 @@ def set_state(ent, state) -> None:
ent._values["state"] = state
def is_open(hass: HomeAssistant, ent: Entity) -> bool:
def _check_state(
hass: HomeAssistant,
ent: Entity,
*,
expected_state: str,
expected_position: int | None,
expected_is_closed: bool,
) -> bool:
"""Check if the state of a cover is as expected."""
state = hass.states.get(ent.entity_id)
correct_state = state.state == expected_state
correct_is_closed = state.attributes.get("is_closed") == expected_is_closed
correct_position = state.attributes.get("current_position") == expected_position
return all([correct_state, correct_is_closed, correct_position])
def is_open(hass: HomeAssistant, ent: Entity, position: int | None = None) -> bool:
"""Return if the cover is open based on the statemachine."""
return _check_state(
hass,
ent,
expected_state=CoverState.OPEN,
expected_position=position,
expected_is_closed=False,
)
def is_opening(
hass: HomeAssistant,
ent: Entity,
position: int | None = None,
*,
closed: bool = False,
) -> bool:
"""Return if the cover is opening based on the statemachine."""
return _check_state(
hass,
ent,
expected_state=CoverState.OPENING,
expected_position=position,
expected_is_closed=closed,
)
def is_closed(hass: HomeAssistant, ent: Entity, position: int | None = None) -> bool:
"""Return if the cover is closed based on the statemachine."""
return hass.states.is_state(ent.entity_id, CoverState.OPEN)
return _check_state(
hass,
ent,
expected_state=CoverState.CLOSED,
expected_position=position,
expected_is_closed=True,
)
def is_opening(hass: HomeAssistant, ent: Entity) -> bool:
"""Return if the cover is closed based on the statemachine."""
return hass.states.is_state(ent.entity_id, CoverState.OPENING)
def is_closed(hass: HomeAssistant, ent: Entity) -> bool:
"""Return if the cover is closed based on the statemachine."""
return hass.states.is_state(ent.entity_id, CoverState.CLOSED)
def is_closing(hass: HomeAssistant, ent: Entity) -> bool:
"""Return if the cover is closed based on the statemachine."""
return hass.states.is_state(ent.entity_id, CoverState.CLOSING)
def is_closing(hass: HomeAssistant, ent: Entity, position: int | None = None) -> bool:
"""Return if the cover is closing based on the statemachine."""
return _check_state(
hass,
ent,
expected_state=CoverState.CLOSING,
expected_position=position,
expected_is_closed=False,
)
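
With the helper rewrite above, each `is_open`/`is_opening`/`is_closed`/`is_closing` assertion now checks the reported position and the new `is_closed` attribute as well as the state string. Roughly, `assert is_closing(hass, ent5, 15)` expands to the checks below; this is an illustrative standalone helper, not the `_check_state` implementation from the diff.

```python
# Rough expansion of `assert is_closing(hass, ent, 15)` under the new helpers.
from homeassistant.components.cover import CoverState
from homeassistant.core import HomeAssistant


def assert_closing_at(hass: HomeAssistant, entity_id: str, position: int) -> None:
    """Assert a cover is closing at the given position and not reported closed."""
    state = hass.states.get(entity_id)
    assert state is not None
    assert state.state == CoverState.CLOSING
    assert state.attributes.get("current_position") == position
    assert state.attributes.get("is_closed") is False
```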

View File

@@ -41,6 +41,7 @@
'current_position': 0,
'device_class': 'shade',
'friendly_name': 'Window covering device',
'is_closed': True,
'supported_features': <CoverEntityFeature: 15>,
}),
'context': <ANY>,
@@ -94,6 +95,7 @@
'current_tilt_position': 97,
'device_class': 'damper',
'friendly_name': 'Vent',
'is_closed': False,
'supported_features': <CoverEntityFeature: 255>,
}),
'context': <ANY>,
@@ -147,6 +149,7 @@
'current_tilt_position': 100,
'device_class': 'shade',
'friendly_name': 'Covering device',
'is_closed': False,
'supported_features': <CoverEntityFeature: 255>,
}),
'context': <ANY>,

View File

@@ -5,6 +5,7 @@
'current_position': 20,
'device_class': 'blind',
'friendly_name': 'Test',
'is_closed': False,
'supported_features': <CoverEntityFeature: 7>,
}),
'context': <ANY>,

Some files were not shown because too many files have changed in this diff.