Compare commits


41 Commits

Author SHA1 Message Date
Jan Čermák
dfa8bf679d Disable version check and publish 2026-03-18 12:22:28 +01:00
Jan Čermák
d0cd84b35e Revert changes to the build_base job 2026-03-18 12:15:49 +01:00
Jan Čermák
cebc35901f Remove redundant default "context: ." 2026-03-17 17:45:01 +01:00
Jan Čermák
def8dc202d Fix indent for continued packages 2026-03-17 17:23:06 +01:00
Jan Čermák
9e4fcac98a Pin builder actions to release hash 2026-03-17 17:22:17 +01:00
Jan Čermák
c0b581c924 Merge remote-tracking branch 'origin/dev' into gha-builder 2026-03-17 16:16:55 +01:00
Jan Čermák
cf3ad71c8f Add io.hass.machine labels to machine images 2026-03-17 16:09:16 +01:00
Joost Lekkerkerker
447d616097 Add select for SmartThings RVC sound mode (#164519) 2026-03-17 15:57:59 +01:00
Norbert Rittel
d3102e718d Consistenly sentence-case "API token" in habitica (#165369)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-03-17 14:30:39 +00:00
Josef Zweck
69ee49735a Remove support for homeassistant.update_entity from mold_indicator (#165797) 2026-03-17 15:26:22 +01:00
Daniel Hjelseth Høyer
35a99dd4a4 Fix Tibber update token (#164295)
Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>
2026-03-17 15:11:51 +01:00
Ariel Ebersberger
51c3397be8 Refactor wemo integration to use async service action handlers (#165794)
Co-authored-by: Copilot Autofix powered by AI <175728472+Copilot@users.noreply.github.com>
2026-03-17 15:07:00 +01:00
Brett Adams
57f0fd2ed2 Tesla Fleet: fix malformed energy live response handling (#165101) 2026-03-17 15:04:35 +01:00
Erik Montnemery
fa7a216afe Use return value from target_entities directly in condition tests (#165791) 2026-03-17 14:55:17 +01:00
Josef Zweck
20f4426e1d Fix mold_indicator sensor update (#158996) 2026-03-17 14:28:50 +01:00
Erik Montnemery
ba30563772 Deduplicate tests testing triggers in mode last (#165789) 2026-03-17 14:28:10 +01:00
A. Gideonse
b807c104a3 Add button platform to Indevolt integration (#165283) 2026-03-17 13:59:18 +01:00
epenet
9e6abb719a Add fixture for Kerui/Tuya video doorbell (#165786) 2026-03-17 13:57:28 +01:00
jvmahon
ed2083a60d Limit color temperature to maximum Matter MIREDs value (#163892)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-03-17 12:49:18 +00:00
Kornel
94db0d5eab Handle timeout in HKDevice.async_update (#162071)
Co-authored-by: J. Nick Koston <nick@koston.org>
2026-03-17 12:42:43 +00:00
dckiller51
06eed998b9 Add platform attribute to Xbox sensors (#161661) 2026-03-17 12:40:42 +00:00
Andrej Walilko
fb5c2f2566 Add shuffle service and enqueue support to jellyfin media player (#161632)
Co-authored-by: Andrej Walilko <awalilko@liquidweb.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-03-17 12:17:55 +00:00
Dominik
4f7d065230 Fix fritz target selector for dial and set_guest_wifi_password (#165396) 2026-03-17 13:15:51 +01:00
Erwin Douna
d034df9b93 Add Portainer request timeout (#165785) 2026-03-17 12:58:55 +01:00
Erik Montnemery
6c9fc7c7a1 Deduplicate tests testing triggers in mode first (#165779)
Co-authored-by: Copilot Autofix powered by AI <175728472+Copilot@users.noreply.github.com>
2026-03-17 12:44:21 +01:00
Leon Grave
ba58ef23d8 Add reauthentication-flow to freshr (#165545)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-03-17 12:22:54 +01:00
johanzander
0a0fa96ac1 Add silver quality scale for growatt_server (#165500) 2026-03-17 12:18:11 +01:00
Erik Montnemery
9cc7ef75b0 Move cover.trigger.CoverDomainSpec to cover.models (#165774) 2026-03-17 12:11:11 +01:00
Carlos Sánchez López
2e0d6d2bbf Add fixture for Tuya wg2 alarm panel (Duosmart C30) (#165701)
Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>
2026-03-17 11:34:53 +01:00
Artur Pragacz
bafef2065f Rework user-given entity name logic (#162763) 2026-03-17 11:09:20 +01:00
Erik Montnemery
fdfe87de4c Move condition/trigger test helpers to test.components.common (#165777) 2026-03-17 11:08:38 +01:00
Jan Čermák
6febd78e00 Move labels back to top 2026-03-10 09:44:05 +01:00
Jan Čermák
8ec6e36d3f Remove default BUILD_FROM in main Dockerfile for now 2026-03-10 09:17:56 +01:00
Jan Čermák
a04fc6b260 Mark autogenerated files 2026-03-10 09:17:08 +01:00
Jan Čermák
5c307fbb23 Remove dynamic labels from dockerfiles 2026-03-10 08:55:56 +01:00
Jan Čermák
36cb3e21fe Merge remote-tracking branch 'origin/dev' into gha-builder 2026-03-05 12:17:11 +01:00
Jan Čermák
f645b232f9 Fix container-(username|password) -> container-registry-(username|password) 2026-03-05 12:14:36 +01:00
Jan Čermák
e8454d9b2c Use updated build-image action inputs, sort alphabetically 2026-03-05 12:10:43 +01:00
Jan Čermák
02ae9b2f71 Generate machine dockerfiles using hassfest script 2026-03-05 11:22:12 +01:00
Jan Čermák
f6f7390063 Restore build context also in build_python 2026-03-04 18:26:21 +01:00
Jan Čermák
bfa1fd7f1b Use new home-assistant/builder actions for image builds
This PR completely drops usage of the builder action in favor of new actions
introduced in home-assistant/builder#273. This results in faster builds with
better caching options and simple local builds using Docker BuildKit.

The image dependency chain currently still uses per-arch builds but once
docker-base and docker repositories start publishing multi-arch images, we can
simplify the action a bit further.

The idea to use composite actions comes from #162245, although this PR predates it. One minor difference: the files that were previously generated twice in per-arch builds are now generated once and archived by the init job.
2026-03-04 18:05:12 +01:00
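The "simple local builds using Docker BuildKit" mentioned in the commit message above can be sketched as a single buildx invocation. The image names, tags, and base image below are illustrative placeholders, and the command is printed rather than executed so the sketch can be inspected without a Docker daemon:

```shell
#!/bin/sh
# Sketch of an equivalent local single-arch build with Docker BuildKit.
# ARCH, VERSION, and the base image tag are placeholders, not values
# taken from the workflow.
ARCH="amd64"
VERSION="local-dev"
BASE_IMAGE="ghcr.io/home-assistant/${ARCH}-homeassistant-base:latest"

# Assemble the buildx invocation; printed instead of executed so the
# sketch runs without Docker installed.
CMD="docker buildx build . \
  --file Dockerfile \
  --build-arg BUILD_FROM=${BASE_IMAGE} \
  --tag ${ARCH}-homeassistant:${VERSION} \
  --load"
printf '%s\n' "$CMD"
```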
1091 changed files with 44027 additions and 29931 deletions

.gitattributes vendored

@@ -16,6 +16,7 @@ Dockerfile.dev linguist-language=Dockerfile
CODEOWNERS linguist-generated=true
Dockerfile linguist-generated=true
homeassistant/generated/*.py linguist-generated=true
machine/* linguist-generated=true
mypy.ini linguist-generated=true
requirements.txt linguist-generated=true
requirements_all.txt linguist-generated=true

View File

@@ -35,6 +35,7 @@ jobs:
channel: ${{ steps.version.outputs.channel }}
publish: ${{ steps.version.outputs.publish }}
architectures: ${{ env.ARCHITECTURES }}
base_image_version: ${{ env.BASE_IMAGE_VERSION }}
steps:
- name: Checkout the repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
@@ -56,10 +57,10 @@ jobs:
with:
type: ${{ env.BUILD_TYPE }}
- name: Verify version
uses: home-assistant/actions/helpers/verify-version@master # zizmor: ignore[unpinned-uses]
with:
ignore-dev: true
# - name: Verify version
# uses: home-assistant/actions/helpers/verify-version@master # zizmor: ignore[unpinned-uses]
# with:
# ignore-dev: true
- name: Fail if translations files are checked in
run: |
@@ -100,7 +101,7 @@ jobs:
arch: ${{ fromJson(needs.init.outputs.architectures) }}
include:
- arch: amd64
os: ubuntu-latest
os: ubuntu-24.04
- arch: aarch64
os: ubuntu-24.04-arm
steps:
@@ -195,77 +196,20 @@ jobs:
run: |
echo "${GITHUB_SHA};${GITHUB_REF};${GITHUB_EVENT_NAME};${GITHUB_ACTOR}" > rootfs/OFFICIAL_IMAGE
- name: Login to GitHub Container Registry
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Install Cosign
uses: sigstore/cosign-installer@ba7bc0a3fef59531c69a25acd34668d6d3fe6f22 # v4.1.0
with:
cosign-release: "v2.5.3"
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
- name: Build variables
id: vars
shell: bash
env:
ARCH: ${{ matrix.arch }}
run: |
echo "base_image=ghcr.io/home-assistant/${ARCH}-homeassistant-base:${BASE_IMAGE_VERSION}" >> "$GITHUB_OUTPUT"
echo "cache_image=ghcr.io/home-assistant/${ARCH}-homeassistant:latest" >> "$GITHUB_OUTPUT"
echo "created=$(date --rfc-3339=seconds --utc)" >> "$GITHUB_OUTPUT"
- name: Verify base image signature
env:
BASE_IMAGE: ${{ steps.vars.outputs.base_image }}
run: |
cosign verify \
--certificate-oidc-issuer https://token.actions.githubusercontent.com \
--certificate-identity-regexp "https://github.com/home-assistant/docker/.*" \
"${BASE_IMAGE}"
- name: Verify cache image signature
id: cache
continue-on-error: true
env:
CACHE_IMAGE: ${{ steps.vars.outputs.cache_image }}
run: |
cosign verify \
--certificate-oidc-issuer https://token.actions.githubusercontent.com \
--certificate-identity-regexp "https://github.com/home-assistant/core/.*" \
"${CACHE_IMAGE}"
- name: Build base image
id: build
uses: docker/build-push-action@d08e5c354a6adb9ed34480a06d141179aa583294 # v7.0.0
uses: home-assistant/builder/actions/build-image@62a1597b84b3461abad9816d9cd92862a2b542c3 # 2026.03.2
with:
context: .
file: ./Dockerfile
platforms: ${{ steps.vars.outputs.platform }}
push: true
cache-from: ${{ steps.cache.outcome == 'success' && steps.vars.outputs.cache_image || '' }}
arch: ${{ matrix.arch }}
build-args: |
BUILD_FROM=${{ steps.vars.outputs.base_image }}
tags: ghcr.io/home-assistant/${{ matrix.arch }}-homeassistant:${{ needs.init.outputs.version }}
outputs: type=image,push=true,compression=zstd,compression-level=9,force-compression=true,oci-mediatypes=true
labels: |
io.hass.arch=${{ matrix.arch }}
io.hass.version=${{ needs.init.outputs.version }}
org.opencontainers.image.created=${{ steps.vars.outputs.created }}
org.opencontainers.image.version=${{ needs.init.outputs.version }}
- name: Sign image
env:
ARCH: ${{ matrix.arch }}
VERSION: ${{ needs.init.outputs.version }}
DIGEST: ${{ steps.build.outputs.digest }}
run: |
cosign sign --yes "ghcr.io/home-assistant/${ARCH}-homeassistant:${VERSION}@${DIGEST}"
BUILD_FROM=ghcr.io/home-assistant/${{ matrix.arch }}-homeassistant-base:${{ needs.init.outputs.base_image_version }}
cache-gha: false
container-registry-password: ${{ secrets.GITHUB_TOKEN }}
cosign-base-identity: "https://github.com/home-assistant/docker/.*"
cosign-base-verify: ghcr.io/home-assistant/${{ matrix.arch }}-homeassistant-base:${{ needs.init.outputs.base_image_version }}
image: ghcr.io/home-assistant/${{ matrix.arch }}-homeassistant
image-tags: ${{ needs.init.outputs.version }}
push: true
version: ${{ needs.init.outputs.version }}
build_machine:
name: Build ${{ matrix.machine }} machine core image
@@ -314,308 +258,310 @@ jobs:
with:
persist-credentials: false
- name: Set build additional args
- name: Compute extra tags
id: tags
shell: bash
env:
VERSION: ${{ needs.init.outputs.version }}
run: |
# Create general tags
if [[ "${VERSION}" =~ d ]]; then
echo "BUILD_ARGS=--additional-tag dev" >> $GITHUB_ENV
echo "extra_tags=dev" >> "$GITHUB_OUTPUT"
elif [[ "${VERSION}" =~ b ]]; then
echo "BUILD_ARGS=--additional-tag beta" >> $GITHUB_ENV
echo "extra_tags=beta" >> "$GITHUB_OUTPUT"
else
echo "BUILD_ARGS=--additional-tag stable" >> $GITHUB_ENV
echo "extra_tags=stable" >> "$GITHUB_OUTPUT"
fi
- name: Login to GitHub Container Registry
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
- name: Build machine image
uses: home-assistant/builder/actions/build-image@62a1597b84b3461abad9816d9cd92862a2b542c3 # 2026.03.2
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build base image
uses: home-assistant/builder@6cb4fd3d1338b6e22d0958a4bcb53e0965ea63b4 # 2026.02.1
with:
image: ${{ matrix.arch }}
args: |
$BUILD_ARGS \
--target /data/machine \
--cosign \
--machine "${{ needs.init.outputs.version }}=${{ matrix.machine }}"
publish_ha:
name: Publish version files
environment: ${{ needs.init.outputs.channel }}
if: github.repository_owner == 'home-assistant'
needs: ["init", "build_machine"]
runs-on: ubuntu-latest
permissions:
contents: read
steps:
- name: Checkout the repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Initialize git
uses: home-assistant/actions/helpers/git-init@master # zizmor: ignore[unpinned-uses]
with:
name: ${{ secrets.GIT_NAME }}
email: ${{ secrets.GIT_EMAIL }}
token: ${{ secrets.GIT_TOKEN }}
- name: Update version file
uses: home-assistant/actions/helpers/version-push@master # zizmor: ignore[unpinned-uses]
with:
key: "homeassistant[]"
key-description: "Home Assistant Core"
version: ${{ needs.init.outputs.version }}
channel: ${{ needs.init.outputs.channel }}
exclude-list: '["odroid-xu","qemuarm","qemux86","raspberrypi","raspberrypi2","raspberrypi3","raspberrypi4","tinker"]'
- name: Update version file (stable -> beta)
if: needs.init.outputs.channel == 'stable'
uses: home-assistant/actions/helpers/version-push@master # zizmor: ignore[unpinned-uses]
with:
key: "homeassistant[]"
key-description: "Home Assistant Core"
version: ${{ needs.init.outputs.version }}
channel: beta
exclude-list: '["odroid-xu","qemuarm","qemux86","raspberrypi","raspberrypi2","raspberrypi3","raspberrypi4","tinker"]'
publish_container:
name: Publish meta container for ${{ matrix.registry }}
environment: ${{ needs.init.outputs.channel }}
if: github.repository_owner == 'home-assistant'
needs: ["init", "build_base"]
runs-on: ubuntu-latest
permissions:
contents: read # To check out the repository
packages: write # To push to GHCR
id-token: write # For cosign signing
strategy:
fail-fast: false
matrix:
registry: ["ghcr.io/home-assistant", "docker.io/homeassistant"]
steps:
- name: Install Cosign
uses: sigstore/cosign-installer@ba7bc0a3fef59531c69a25acd34668d6d3fe6f22 # v4.1.0
with:
cosign-release: "v2.5.3"
- name: Login to DockerHub
if: matrix.registry == 'docker.io/homeassistant'
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Login to GitHub Container Registry
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Verify architecture image signatures
shell: bash
env:
ARCHITECTURES: ${{ needs.init.outputs.architectures }}
VERSION: ${{ needs.init.outputs.version }}
run: |
ARCHS=$(echo "${ARCHITECTURES}" | jq -r '.[]')
for arch in $ARCHS; do
echo "Verifying ${arch} image signature..."
cosign verify \
--certificate-oidc-issuer https://token.actions.githubusercontent.com \
--certificate-identity-regexp https://github.com/home-assistant/core/.* \
"ghcr.io/home-assistant/${arch}-homeassistant:${VERSION}"
done
echo "✓ All images verified successfully"
# Generate all Docker tags based on version string
# Version format: YYYY.MM.PATCH, YYYY.MM.PATCHbN (beta), or YYYY.MM.PATCH.devYYYYMMDDHHMM (dev)
# Examples:
# 2025.12.1 (stable) -> tags: 2025.12.1, 2025.12, stable, latest, beta, rc
# 2025.12.0b3 (beta) -> tags: 2025.12.0b3, beta, rc
# 2025.12.0.dev202511250240 -> tags: 2025.12.0.dev202511250240, dev
- name: Generate Docker metadata
id: meta
uses: docker/metadata-action@030e881283bb7a6894de51c315a6bfe6a94e05cf # v6.0.0
with:
images: ${{ matrix.registry }}/home-assistant
sep-tags: ","
tags: |
type=raw,value=${{ needs.init.outputs.version }},priority=9999
type=raw,value=dev,enable=${{ contains(needs.init.outputs.version, 'd') }}
type=raw,value=beta,enable=${{ !contains(needs.init.outputs.version, 'd') }}
type=raw,value=rc,enable=${{ !contains(needs.init.outputs.version, 'd') }}
type=raw,value=stable,enable=${{ !contains(needs.init.outputs.version, 'd') && !contains(needs.init.outputs.version, 'b') }}
type=raw,value=latest,enable=${{ !contains(needs.init.outputs.version, 'd') && !contains(needs.init.outputs.version, 'b') }}
type=semver,pattern={{major}}.{{minor}},value=${{ needs.init.outputs.version }},enable=${{ !contains(needs.init.outputs.version, 'd') && !contains(needs.init.outputs.version, 'b') }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v3.7.1
- name: Copy architecture images to DockerHub
if: matrix.registry == 'docker.io/homeassistant'
shell: bash
env:
ARCHITECTURES: ${{ needs.init.outputs.architectures }}
VERSION: ${{ needs.init.outputs.version }}
run: |
# Use imagetools to copy image blobs directly between registries
# This preserves provenance/attestations and seems to be much faster than pull/push
ARCHS=$(echo "${ARCHITECTURES}" | jq -r '.[]')
for arch in $ARCHS; do
echo "Copying ${arch} image to DockerHub..."
for attempt in 1 2 3; do
if docker buildx imagetools create \
--tag "docker.io/homeassistant/${arch}-homeassistant:${VERSION}" \
"ghcr.io/home-assistant/${arch}-homeassistant:${VERSION}"; then
break
fi
echo "Attempt ${attempt} failed, retrying in 10 seconds..."
sleep 10
if [ "${attempt}" -eq 3 ]; then
echo "Failed after 3 attempts"
exit 1
fi
done
cosign sign --yes "docker.io/homeassistant/${arch}-homeassistant:${VERSION}"
done
- name: Create and push multi-arch manifests
shell: bash
env:
ARCHITECTURES: ${{ needs.init.outputs.architectures }}
REGISTRY: ${{ matrix.registry }}
VERSION: ${{ needs.init.outputs.version }}
META_TAGS: ${{ steps.meta.outputs.tags }}
run: |
# Build list of architecture images dynamically
ARCHS=$(echo "${ARCHITECTURES}" | jq -r '.[]')
ARCH_IMAGES=()
for arch in $ARCHS; do
ARCH_IMAGES+=("${REGISTRY}/${arch}-homeassistant:${VERSION}")
done
# Build list of all tags for single manifest creation
# Note: Using sep-tags=',' in metadata-action for easier parsing
TAG_ARGS=()
IFS=',' read -ra TAGS <<< "${META_TAGS}"
for tag in "${TAGS[@]}"; do
TAG_ARGS+=("--tag" "${tag}")
done
# Create manifest with ALL tags in a single operation (much faster!)
echo "Creating multi-arch manifest with tags: ${TAGS[*]}"
docker buildx imagetools create "${TAG_ARGS[@]}" "${ARCH_IMAGES[@]}"
# Sign each tag separately (signing requires individual tag names)
echo "Signing all tags..."
for tag in "${TAGS[@]}"; do
echo "Signing ${tag}"
cosign sign --yes "${tag}"
done
echo "All manifests created and signed successfully"
build_python:
name: Build PyPi package
environment: ${{ needs.init.outputs.channel }}
needs: ["init", "build_base"]
runs-on: ubuntu-latest
permissions:
contents: read # To check out the repository
id-token: write # For PyPI trusted publishing
if: github.repository_owner == 'home-assistant' && needs.init.outputs.publish == 'true'
steps:
- name: Checkout the repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
- name: Download translations
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
with:
name: translations
- name: Extract translations
run: |
tar xvf translations.tar.gz
rm translations.tar.gz
- name: Build package
shell: bash
run: |
# Remove dist, build, and homeassistant.egg-info
# when building locally for testing!
pip install build
python -m build
- name: Upload package to PyPI
uses: pypa/gh-action-pypi-publish@ed0c53931b1dc9bd32cbe73a98c7f6766f8a527e # v1.13.0
with:
skip-existing: true
hassfest-image:
name: Build and test hassfest image
runs-on: ubuntu-latest
permissions:
contents: read # To check out the repository
packages: write # To push to GHCR
attestations: write # For build provenance attestation
id-token: write # For build provenance attestation
needs: ["init"]
if: github.repository_owner == 'home-assistant'
env:
HASSFEST_IMAGE_NAME: ghcr.io/home-assistant/hassfest
HASSFEST_IMAGE_TAG: ghcr.io/home-assistant/hassfest:${{ needs.init.outputs.version }}
steps:
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Login to GitHub Container Registry
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build Docker image
uses: docker/build-push-action@d08e5c354a6adb9ed34480a06d141179aa583294 # v7.0.0
with:
context: . # So action will not pull the repository again
file: ./script/hassfest/docker/Dockerfile
load: true
tags: ${{ env.HASSFEST_IMAGE_TAG }}
- name: Run hassfest against core
run: docker run --rm -v "${GITHUB_WORKSPACE}":/github/workspace "${HASSFEST_IMAGE_TAG}" --core-path=/github/workspace
- name: Push Docker image
if: needs.init.outputs.channel != 'dev' && needs.init.outputs.publish == 'true'
id: push
uses: docker/build-push-action@d08e5c354a6adb9ed34480a06d141179aa583294 # v7.0.0
with:
context: . # So action will not pull the repository again
file: ./script/hassfest/docker/Dockerfile
arch: ${{ matrix.arch }}
build-args: |
BUILD_FROM=ghcr.io/home-assistant/${{ matrix.arch }}-homeassistant:${{ needs.init.outputs.version }}
cache-gha: false
container-registry-password: ${{ secrets.GITHUB_TOKEN }}
context: machine/
cosign-base-identity: "https://github.com/home-assistant/core/.*"
cosign-base-verify: ghcr.io/home-assistant/${{ matrix.arch }}-homeassistant:${{ needs.init.outputs.version }}
file: machine/${{ matrix.machine }}
image: ghcr.io/home-assistant/${{ matrix.machine }}-homeassistant
image-tags: |
${{ needs.init.outputs.version }}
${{ steps.tags.outputs.extra_tags }}
push: true
tags: ${{ env.HASSFEST_IMAGE_TAG }},${{ env.HASSFEST_IMAGE_NAME }}:latest
- name: Generate artifact attestation
if: needs.init.outputs.channel != 'dev' && needs.init.outputs.publish == 'true'
uses: actions/attest@59d89421af93a897026c735860bf21b6eb4f7b26 # v4.1.0
with:
subject-name: ${{ env.HASSFEST_IMAGE_NAME }}
subject-digest: ${{ steps.push.outputs.digest }}
push-to-registry: true
version: ${{ needs.init.outputs.version }}
# publish_ha:
# name: Publish version files
# environment: ${{ needs.init.outputs.channel }}
# if: github.repository_owner == 'home-assistant'
# needs: ["init", "build_machine"]
# runs-on: ubuntu-latest
# permissions:
# contents: read
# steps:
# - name: Checkout the repository
# uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
# with:
# persist-credentials: false
#
# - name: Initialize git
# uses: home-assistant/actions/helpers/git-init@master # zizmor: ignore[unpinned-uses]
# with:
# name: ${{ secrets.GIT_NAME }}
# email: ${{ secrets.GIT_EMAIL }}
# token: ${{ secrets.GIT_TOKEN }}
#
# - name: Update version file
# uses: home-assistant/actions/helpers/version-push@master # zizmor: ignore[unpinned-uses]
# with:
# key: "homeassistant[]"
# key-description: "Home Assistant Core"
# version: ${{ needs.init.outputs.version }}
# channel: ${{ needs.init.outputs.channel }}
# exclude-list: '["odroid-xu","qemuarm","qemux86","raspberrypi","raspberrypi2","raspberrypi3","raspberrypi4","tinker"]'
#
# - name: Update version file (stable -> beta)
# if: needs.init.outputs.channel == 'stable'
# uses: home-assistant/actions/helpers/version-push@master # zizmor: ignore[unpinned-uses]
# with:
# key: "homeassistant[]"
# key-description: "Home Assistant Core"
# version: ${{ needs.init.outputs.version }}
# channel: beta
# exclude-list: '["odroid-xu","qemuarm","qemux86","raspberrypi","raspberrypi2","raspberrypi3","raspberrypi4","tinker"]'
#
# publish_container:
# name: Publish meta container for ${{ matrix.registry }}
# environment: ${{ needs.init.outputs.channel }}
# if: github.repository_owner == 'home-assistant'
# needs: ["init", "build_base"]
# runs-on: ubuntu-latest
# permissions:
# contents: read # To check out the repository
# packages: write # To push to GHCR
# id-token: write # For cosign signing
# strategy:
# fail-fast: false
# matrix:
# registry: ["ghcr.io/home-assistant", "docker.io/homeassistant"]
# steps:
# - name: Install Cosign
# uses: sigstore/cosign-installer@ba7bc0a3fef59531c69a25acd34668d6d3fe6f22 # v4.1.0
# with:
# cosign-release: "v2.5.3"
#
# - name: Login to DockerHub
# if: matrix.registry == 'docker.io/homeassistant'
# uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
# with:
# username: ${{ secrets.DOCKERHUB_USERNAME }}
# password: ${{ secrets.DOCKERHUB_TOKEN }}
#
# - name: Login to GitHub Container Registry
# uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
# with:
# registry: ghcr.io
# username: ${{ github.repository_owner }}
# password: ${{ secrets.GITHUB_TOKEN }}
#
# - name: Verify architecture image signatures
# shell: bash
# env:
# ARCHITECTURES: ${{ needs.init.outputs.architectures }}
# VERSION: ${{ needs.init.outputs.version }}
# run: |
# ARCHS=$(echo "${ARCHITECTURES}" | jq -r '.[]')
# for arch in $ARCHS; do
# echo "Verifying ${arch} image signature..."
# cosign verify \
# --certificate-oidc-issuer https://token.actions.githubusercontent.com \
# --certificate-identity-regexp https://github.com/home-assistant/core/.* \
# "ghcr.io/home-assistant/${arch}-homeassistant:${VERSION}"
# done
# echo "✓ All images verified successfully"
#
# # Generate all Docker tags based on version string
# # Version format: YYYY.MM.PATCH, YYYY.MM.PATCHbN (beta), or YYYY.MM.PATCH.devYYYYMMDDHHMM (dev)
# # Examples:
# # 2025.12.1 (stable) -> tags: 2025.12.1, 2025.12, stable, latest, beta, rc
# # 2025.12.0b3 (beta) -> tags: 2025.12.0b3, beta, rc
# # 2025.12.0.dev202511250240 -> tags: 2025.12.0.dev202511250240, dev
# - name: Generate Docker metadata
# id: meta
# uses: docker/metadata-action@030e881283bb7a6894de51c315a6bfe6a94e05cf # v6.0.0
# with:
# images: ${{ matrix.registry }}/home-assistant
# sep-tags: ","
# tags: |
# type=raw,value=${{ needs.init.outputs.version }},priority=9999
# type=raw,value=dev,enable=${{ contains(needs.init.outputs.version, 'd') }}
# type=raw,value=beta,enable=${{ !contains(needs.init.outputs.version, 'd') }}
# type=raw,value=rc,enable=${{ !contains(needs.init.outputs.version, 'd') }}
# type=raw,value=stable,enable=${{ !contains(needs.init.outputs.version, 'd') && !contains(needs.init.outputs.version, 'b') }}
# type=raw,value=latest,enable=${{ !contains(needs.init.outputs.version, 'd') && !contains(needs.init.outputs.version, 'b') }}
# type=semver,pattern={{major}}.{{minor}},value=${{ needs.init.outputs.version }},enable=${{ !contains(needs.init.outputs.version, 'd') && !contains(needs.init.outputs.version, 'b') }}
#
# - name: Set up Docker Buildx
# uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v3.7.1
#
# - name: Copy architecture images to DockerHub
# if: matrix.registry == 'docker.io/homeassistant'
# shell: bash
# env:
# ARCHITECTURES: ${{ needs.init.outputs.architectures }}
# VERSION: ${{ needs.init.outputs.version }}
# run: |
# # Use imagetools to copy image blobs directly between registries
# # This preserves provenance/attestations and seems to be much faster than pull/push
# ARCHS=$(echo "${ARCHITECTURES}" | jq -r '.[]')
# for arch in $ARCHS; do
# echo "Copying ${arch} image to DockerHub..."
# for attempt in 1 2 3; do
# if docker buildx imagetools create \
# --tag "docker.io/homeassistant/${arch}-homeassistant:${VERSION}" \
# "ghcr.io/home-assistant/${arch}-homeassistant:${VERSION}"; then
# break
# fi
# echo "Attempt ${attempt} failed, retrying in 10 seconds..."
# sleep 10
# if [ "${attempt}" -eq 3 ]; then
# echo "Failed after 3 attempts"
# exit 1
# fi
# done
# cosign sign --yes "docker.io/homeassistant/${arch}-homeassistant:${VERSION}"
# done
#
# - name: Create and push multi-arch manifests
# shell: bash
# env:
# ARCHITECTURES: ${{ needs.init.outputs.architectures }}
# REGISTRY: ${{ matrix.registry }}
# VERSION: ${{ needs.init.outputs.version }}
# META_TAGS: ${{ steps.meta.outputs.tags }}
# run: |
# # Build list of architecture images dynamically
# ARCHS=$(echo "${ARCHITECTURES}" | jq -r '.[]')
# ARCH_IMAGES=()
# for arch in $ARCHS; do
# ARCH_IMAGES+=("${REGISTRY}/${arch}-homeassistant:${VERSION}")
# done
#
# # Build list of all tags for single manifest creation
# # Note: Using sep-tags=',' in metadata-action for easier parsing
# TAG_ARGS=()
# IFS=',' read -ra TAGS <<< "${META_TAGS}"
# for tag in "${TAGS[@]}"; do
# TAG_ARGS+=("--tag" "${tag}")
# done
#
# # Create manifest with ALL tags in a single operation (much faster!)
# echo "Creating multi-arch manifest with tags: ${TAGS[*]}"
# docker buildx imagetools create "${TAG_ARGS[@]}" "${ARCH_IMAGES[@]}"
#
# # Sign each tag separately (signing requires individual tag names)
# echo "Signing all tags..."
# for tag in "${TAGS[@]}"; do
# echo "Signing ${tag}"
# cosign sign --yes "${tag}"
# done
#
# echo "All manifests created and signed successfully"
#
# build_python:
# name: Build PyPi package
# environment: ${{ needs.init.outputs.channel }}
# needs: ["init", "build_base"]
# runs-on: ubuntu-latest
# permissions:
# contents: read # To check out the repository
# id-token: write # For PyPI trusted publishing
# if: github.repository_owner == 'home-assistant' && needs.init.outputs.publish == 'true'
# steps:
# - name: Checkout the repository
# uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
# with:
# persist-credentials: false
#
# - name: Set up Python
# uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
# with:
# python-version-file: ".python-version"
#
# - name: Download translations
# uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
# with:
# name: translations
#
# - name: Extract translations
# run: |
# tar xvf translations.tar.gz
# rm translations.tar.gz
#
# - name: Build package
# shell: bash
# run: |
# # Remove dist, build, and homeassistant.egg-info
# # when building locally for testing!
# pip install build
# python -m build
#
# - name: Upload package to PyPI
# uses: pypa/gh-action-pypi-publish@ed0c53931b1dc9bd32cbe73a98c7f6766f8a527e # v1.13.0
# with:
# skip-existing: true
#
# hassfest-image:
# name: Build and test hassfest image
# runs-on: ubuntu-latest
# permissions:
# contents: read # To check out the repository
# packages: write # To push to GHCR
# attestations: write # For build provenance attestation
# id-token: write # For build provenance attestation
# needs: ["init"]
# if: github.repository_owner == 'home-assistant'
# env:
# HASSFEST_IMAGE_NAME: ghcr.io/home-assistant/hassfest
# HASSFEST_IMAGE_TAG: ghcr.io/home-assistant/hassfest:${{ needs.init.outputs.version }}
# steps:
# - name: Checkout repository
# uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
# with:
# persist-credentials: false
#
# - name: Login to GitHub Container Registry
# uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
# with:
# registry: ghcr.io
# username: ${{ github.repository_owner }}
# password: ${{ secrets.GITHUB_TOKEN }}
#
# - name: Build Docker image
# uses: docker/build-push-action@d08e5c354a6adb9ed34480a06d141179aa583294 # v7.0.0
# with:
# context: . # So action will not pull the repository again
# file: ./script/hassfest/docker/Dockerfile
# load: true
# tags: ${{ env.HASSFEST_IMAGE_TAG }}
#
# - name: Run hassfest against core
# run: docker run --rm -v "${GITHUB_WORKSPACE}":/github/workspace "${HASSFEST_IMAGE_TAG}" --core-path=/github/workspace
#
# - name: Push Docker image
# if: needs.init.outputs.channel != 'dev' && needs.init.outputs.publish == 'true'
# id: push
# uses: docker/build-push-action@d08e5c354a6adb9ed34480a06d141179aa583294 # v7.0.0
# with:
# context: . # So action will not pull the repository again
# file: ./script/hassfest/docker/Dockerfile
# push: true
# tags: ${{ env.HASSFEST_IMAGE_TAG }},${{ env.HASSFEST_IMAGE_NAME }}:latest
#
# - name: Generate artifact attestation
# if: needs.init.outputs.channel != 'dev' && needs.init.outputs.publish == 'true'
# uses: actions/attest@59d89421af93a897026c735860bf21b6eb4f7b26 # v4.1.0
# with:
# subject-name: ${{ env.HASSFEST_IMAGE_NAME }}
# subject-digest: ${{ steps.push.outputs.digest }}
# push-to-registry: true
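The version-to-tag mapping documented in the publish_container comments above (stable → version, major.minor, stable, latest, beta, rc; beta → version, beta, rc; dev → version, dev) can be sketched in Python. This is an illustrative helper mirroring the `contains('d')` / `contains('b')` checks in the workflow, not code from the repository:

```python
def derive_tags(version: str) -> list[str]:
    """Sketch of the Docker tag set derived from a version string.

    Mirrors the workflow's checks: dev versions contain 'd',
    beta versions contain 'b', stable versions contain neither.
    """
    if "d" in version:  # e.g. 2025.12.0.dev202511250240
        return [version, "dev"]
    if "b" in version:  # e.g. 2025.12.0b3
        return [version, "beta", "rc"]
    major, minor, _patch = version.split(".")  # e.g. 2025.12.1
    return [version, f"{major}.{minor}", "stable", "latest", "beta", "rc"]

print(derive_tags("2025.12.1"))
# → ['2025.12.1', '2025.12', 'stable', 'latest', 'beta', 'rc']
```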

Dockerfile generated

@@ -10,7 +10,6 @@ LABEL \
org.opencontainers.image.description="Open-source home automation platform running on Python 3" \
org.opencontainers.image.documentation="https://www.home-assistant.io/docs/" \
org.opencontainers.image.licenses="Apache-2.0" \
org.opencontainers.image.source="https://github.com/home-assistant/core" \
org.opencontainers.image.title="Home Assistant" \
org.opencontainers.image.url="https://www.home-assistant.io/"

View File

@@ -153,8 +153,8 @@ def websocket_get_entities(
{
vol.Required("type"): "config/entity_registry/update",
vol.Required("entity_id"): cv.entity_id,
vol.Optional("aliases"): [vol.Any(str, None)],
# If passed in, we update value. Passing None will remove old value.
vol.Optional("aliases"): list,
vol.Optional("area_id"): vol.Any(str, None),
# Categories is a mapping of key/value (scope/category_id) pairs.
# If passed in, we update/adjust only the provided scope(s).
@@ -225,10 +225,15 @@ def websocket_update_entity(
changes[key] = msg[key]
if "aliases" in msg:
# Create a set for the aliases without:
# - Empty strings
# Sanitize aliases by removing:
# - Trailing and leading whitespace characters in the individual aliases
changes["aliases"] = {s_strip for s in msg["aliases"] if (s_strip := s.strip())}
# - Empty strings
changes["aliases"] = aliases = []
for alias in msg["aliases"]:
if alias is None:
aliases.append(er.COMPUTED_NAME)
elif alias := alias.strip():
aliases.append(alias)
if "labels" in msg:
# Convert labels to a set

View File

@@ -992,18 +992,11 @@ class DefaultAgent(ConversationEntity):
continue
context[attr] = state.attributes[attr]
if (
entity := entity_registry.async_get(state.entity_id)
) and entity.aliases:
for alias in entity.aliases:
alias = alias.strip()
if not alias:
continue
yield (alias, alias, context)
# Default name
yield (state.name, state.name, context)
entity_entry = entity_registry.async_get(state.entity_id)
for name in intent.async_get_entity_aliases(
self.hass, entity_entry, state=state
):
yield (name, name, context)
def _recognize_strict(
self,

View File

@@ -5,7 +5,7 @@ from homeassistant.core import HomeAssistant, State, split_entity_id
from homeassistant.helpers.condition import Condition, EntityConditionBase
from .const import ATTR_IS_CLOSED, DOMAIN, CoverDeviceClass
from .trigger import CoverDomainSpec
from .models import CoverDomainSpec
class CoverConditionBase(EntityConditionBase[CoverDomainSpec]):

View File

@@ -0,0 +1,12 @@
"""Data models for the cover integration."""
from dataclasses import dataclass
from homeassistant.helpers.automation import DomainSpec
@dataclass(frozen=True, slots=True)
class CoverDomainSpec(DomainSpec):
"""DomainSpec with a target value for comparison."""
target_value: str | bool | None = None

View File

@@ -1,20 +1,11 @@
"""Provides triggers for covers."""
from dataclasses import dataclass
from homeassistant.const import STATE_OFF, STATE_ON, STATE_UNAVAILABLE, STATE_UNKNOWN
from homeassistant.core import HomeAssistant, State, split_entity_id
from homeassistant.helpers.automation import DomainSpec
from homeassistant.helpers.trigger import EntityTriggerBase, Trigger
from .const import ATTR_IS_CLOSED, DOMAIN, CoverDeviceClass
@dataclass(frozen=True, slots=True)
class CoverDomainSpec(DomainSpec):
"""DomainSpec with a target value for comparison."""
target_value: str | bool | None = None
from .models import CoverDomainSpec
class CoverTriggerBase(EntityTriggerBase[CoverDomainSpec]):

View File

@@ -48,7 +48,7 @@ def async_redact_data[_T](data: _T, to_redact: Iterable[Any]) -> _T:
def _entity_entry_filter(a: attr.Attribute, _: Any) -> bool:
return a.name != "_cache"
return a.name not in ("_cache", "compat_aliases", "compat_name")
@callback

View File

@@ -2,6 +2,7 @@
from __future__ import annotations
from collections.abc import Mapping
from typing import Any
from aiohttp import ClientError
@@ -56,3 +57,42 @@ class FreshrFlowHandler(ConfigFlow, domain=DOMAIN):
return self.async_show_form(
step_id="user", data_schema=STEP_USER_DATA_SCHEMA, errors=errors
)
async def async_step_reauth(
self, _user_input: Mapping[str, Any]
) -> ConfigFlowResult:
"""Handle reauthentication."""
return await self.async_step_reauth_confirm()
async def async_step_reauth_confirm(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle reauthentication confirmation."""
errors: dict[str, str] = {}
reauth_entry = self._get_reauth_entry()
if user_input is not None:
client = FreshrClient(session=async_get_clientsession(self.hass))
try:
await client.login(
reauth_entry.data[CONF_USERNAME], user_input[CONF_PASSWORD]
)
except LoginError:
errors["base"] = "invalid_auth"
except ClientError:
errors["base"] = "cannot_connect"
except Exception: # noqa: BLE001
LOGGER.exception("Unexpected exception")
errors["base"] = "unknown"
else:
return self.async_update_reload_and_abort(
reauth_entry,
data_updates={CONF_PASSWORD: user_input[CONF_PASSWORD]},
)
return self.async_show_form(
step_id="reauth_confirm",
data_schema=vol.Schema({vol.Required(CONF_PASSWORD): str}),
description_placeholders={CONF_USERNAME: reauth_entry.data[CONF_USERNAME]},
errors=errors,
)
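The reauth step above maps each exception type to a translation error key. A minimal sketch of that mapping, with `LoginError` and `ClientError` as local stand-ins for the pyfreshr and aiohttp exceptions (not defined in this hunk):

```python
# Stand-ins for the library exceptions used in the reauth step above.
class LoginError(Exception):
    """Stand-in for the pyfreshr login failure."""


class ClientError(Exception):
    """Stand-in for aiohttp.ClientError."""


def classify_auth_error(exc: Exception) -> str:
    """Return the config-flow error key for a failed login attempt."""
    if isinstance(exc, LoginError):
        return "invalid_auth"
    if isinstance(exc, ClientError):
        return "cannot_connect"
    return "unknown"
```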

View File

@@ -6,6 +6,6 @@
"documentation": "https://www.home-assistant.io/integrations/freshr",
"integration_type": "hub",
"iot_class": "cloud_polling",
"quality_scale": "bronze",
"quality_scale": "silver",
"requirements": ["pyfreshr==1.2.0"]
}

View File

@@ -36,7 +36,7 @@ rules:
integration-owner: done
log-when-unavailable: done
parallel-updates: done
reauthentication-flow: todo
reauthentication-flow: done
test-coverage: done
# Gold

View File

@@ -2,9 +2,7 @@
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]",
"reconfigure_successful": "[%key:common::config_flow::abort::reconfigure_successful%]",
"wrong_account": "Cannot change the account username."
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]"
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
@@ -12,6 +10,15 @@
"unknown": "[%key:common::config_flow::error::unknown%]"
},
"step": {
"reauth_confirm": {
"data": {
"password": "[%key:common::config_flow::data::password%]"
},
"data_description": {
"password": "[%key:component::freshr::config::step::user::data_description::password%]"
},
"description": "Re-enter the password for your Fresh-r account `{username}`."
},
"user": {
"data": {
"password": "[%key:common::config_flow::data::password%]",

View File

@@ -4,9 +4,9 @@ set_guest_wifi_password:
required: true
selector:
device:
integration: fritz
entity:
device_class: connectivity
integration: fritz
domain: update
password:
required: false
selector:
@@ -23,9 +23,9 @@ dial:
required: true
selector:
device:
integration: fritz
entity:
device_class: connectivity
integration: fritz
domain: update
number:
required: true
selector:

View File

@@ -29,6 +29,7 @@ from homeassistant.helpers import (
area_registry as ar,
device_registry as dr,
entity_registry as er,
intent,
start,
)
from homeassistant.helpers.event import async_call_later
@@ -597,7 +598,6 @@ class GoogleEntity:
state = self.state
traits = self.traits()
entity_config = self.config.entity_config.get(state.entity_id, {})
name = (entity_config.get(CONF_NAME) or state.name).strip()
# Find entity/device/area registry entries
entity_entry, device_entry, area_entry = _get_registry_entries(
@@ -607,7 +607,6 @@ class GoogleEntity:
# Build the device info
device = {
"id": state.entity_id,
"name": {"name": name},
"attributes": {},
"traits": [trait.name for trait in traits],
"willReportState": self.config.should_report_state,
@@ -615,13 +614,18 @@ class GoogleEntity:
state.domain, state.attributes.get(ATTR_DEVICE_CLASS)
),
}
# Add aliases
if (config_aliases := entity_config.get(CONF_ALIASES, [])) or (
entity_entry and entity_entry.aliases
):
device["name"]["nicknames"] = [name, *config_aliases]
if entity_entry:
device["name"]["nicknames"].extend(entity_entry.aliases)
# Add name and aliases.
# The entity's alias list is ordered: the first slot naturally serves
# as the primary name (set to the auto-generated full entity name by
# default), while the rest serve as alternative names (nicknames).
aliases = intent.async_get_entity_aliases(
self.hass, entity_entry, state=state, allow_empty=False
)
name, *aliases = aliases
name = entity_config.get(CONF_NAME) or name
device["name"] = {"name": name}
if (config_aliases := entity_config.get(CONF_ALIASES, [])) or aliases:
device["name"]["nicknames"] = [name, *config_aliases, *aliases]
# Add local SDK info if enabled
if self.config.is_local_sdk_active and self.should_expose_local():

View File

@@ -7,5 +7,6 @@
"integration_type": "hub",
"iot_class": "cloud_polling",
"loggers": ["growattServer"],
"quality_scale": "silver",
"requirements": ["growattServer==1.9.0"]
}

View File

@@ -89,18 +89,18 @@
"step": {
"advanced": {
"data": {
"api_key": "API Token",
"api_key": "API token",
"api_user": "User ID",
"url": "[%key:common::config_flow::data::url%]",
"verify_ssl": "[%key:common::config_flow::data::verify_ssl%]"
},
"data_description": {
"api_key": "API Token of the Habitica account",
"api_key": "API token of the Habitica account",
"api_user": "User ID of your Habitica account",
"url": "URL of the Habitica installation to connect to. Defaults to `{default_url}`",
"verify_ssl": "Enable SSL certificate verification for secure connections. Disable only if connecting to a Habitica instance using a self-signed certificate"
},
"description": "You can retrieve your `User ID` and `API Token` from [**Settings -> Site Data**]({site_data}) on Habitica or the instance you want to connect to",
"description": "You can retrieve your 'User ID' and 'API token' from [**Settings -> Site Data**]({site_data}) on Habitica or the instance you want to connect to",
"title": "[%key:component::habitica::config::step::user::menu_options::advanced%]"
},
"login": {
@@ -126,7 +126,7 @@
"api_key": "[%key:component::habitica::config::step::advanced::data_description::api_key%]"
},
"description": "Enter your new API token below. You can find it in Habitica under 'Settings -> Site Data'",
"name": "Re-authorize via API Token"
"name": "Re-authorize via API token"
},
"reauth_login": {
"data": {

View File

@@ -965,7 +965,7 @@ class HKDevice:
# visible on the network.
self.async_set_available_state(False)
return
            except (AccessoryDisconnectedError, EncryptionError):
            except (AccessoryDisconnectedError, EncryptionError, TimeoutError):
                # Temporary connection failure. Device may still be available but our
# connection was dropped or we are reconnecting
self._poll_failures += 1

View File

@@ -10,7 +10,6 @@ from functools import partial
from ipaddress import IPv4Network, IPv6Network, ip_network
import logging
import os
from pathlib import Path
import socket
import ssl
from tempfile import NamedTemporaryFile
@@ -34,7 +33,6 @@ from homeassistant.components.network import async_get_source_ip
from homeassistant.const import (
EVENT_HOMEASSISTANT_START,
EVENT_HOMEASSISTANT_STOP,
HASSIO_USER_NAME,
SERVER_PORT,
)
from homeassistant.core import Event, HomeAssistant, callback
@@ -71,7 +69,7 @@ from .headers import setup_headers
from .request_context import setup_request_context
from .security_filter import setup_security_filter
from .static import CACHE_HEADERS, CachingStaticResource
from .web_runner import HomeAssistantTCPSite, HomeAssistantUnixSite
from .web_runner import HomeAssistantTCPSite
CONF_SERVER_HOST: Final = "server_host"
CONF_SERVER_PORT: Final = "server_port"
@@ -237,16 +235,6 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
source_ip_task = create_eager_task(async_get_source_ip(hass))
unix_socket_path: Path | None = None
if socket_env := os.environ.get("SUPERVISOR_CORE_API_SOCKET"):
socket_path = Path(socket_env)
if socket_path.is_absolute():
unix_socket_path = socket_path
else:
_LOGGER.error(
"Invalid unix socket path %s: path must be absolute", socket_env
)
server = HomeAssistantHTTP(
hass,
server_host=server_host,
@@ -256,7 +244,6 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
ssl_key=ssl_key,
trusted_proxies=trusted_proxies,
ssl_profile=ssl_profile,
unix_socket_path=unix_socket_path,
)
await server.async_initialize(
cors_origins=cors_origins,
@@ -280,21 +267,6 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
async_when_setup_or_start(hass, "frontend", start_server)
if server.unix_socket_path is not None:
async def start_unix_socket(*_: Any) -> None:
"""Start the Unix socket after the Supervisor user is available."""
if any(
user
for user in await hass.auth.async_get_users()
if user.system_generated and user.name == HASSIO_USER_NAME
):
await server.async_start_unix_socket()
else:
_LOGGER.error("Supervisor user not found; not starting Unix socket")
async_when_setup_or_start(hass, "hassio", start_unix_socket)
hass.http = server
local_ip = await source_ip_task
@@ -394,7 +366,6 @@ class HomeAssistantHTTP:
server_port: int,
trusted_proxies: list[IPv4Network | IPv6Network],
ssl_profile: str,
unix_socket_path: Path | None = None,
) -> None:
"""Initialize the HTTP Home Assistant server."""
self.app = HomeAssistantApplication(
@@ -413,10 +384,8 @@ class HomeAssistantHTTP:
self.server_port = server_port
self.trusted_proxies = trusted_proxies
self.ssl_profile = ssl_profile
self.unix_socket_path = unix_socket_path
self.runner: web.AppRunner | None = None
self.site: HomeAssistantTCPSite | None = None
self.unix_site: HomeAssistantUnixSite | None = None
self.context: ssl.SSLContext | None = None
async def async_initialize(
@@ -641,29 +610,6 @@ class HomeAssistantHTTP:
context.load_cert_chain(cert_pem.name, key_pem.name)
return context
async def async_start_unix_socket(self) -> None:
"""Start listening on the Unix socket.
This is called separately from start() to delay serving the Unix
socket until the Supervisor user exists (created by the hassio
integration). Without this delay, Supervisor could connect before
its user is available and receive 401 responses it won't retry.
"""
if self.unix_socket_path is None or self.runner is None:
return
self.unix_site = HomeAssistantUnixSite(self.runner, self.unix_socket_path)
try:
await self.unix_site.start()
except OSError as error:
_LOGGER.error(
"Failed to create HTTP server on unix socket %s: %s",
self.unix_socket_path,
error,
)
self.unix_site = None
else:
_LOGGER.info("Now listening on unix socket %s", self.unix_socket_path)
async def start(self) -> None:
"""Start the aiohttp server."""
# Aiohttp freezes apps after start so that no changes can be made.
@@ -691,19 +637,6 @@ class HomeAssistantHTTP:
async def stop(self) -> None:
"""Stop the aiohttp server."""
if self.unix_site is not None:
await self.unix_site.stop()
if self.unix_socket_path is not None:
try:
await self.hass.async_add_executor_job(
self.unix_socket_path.unlink, True
)
except OSError as err:
_LOGGER.warning(
"Could not remove unix socket %s: %s",
self.unix_socket_path,
err,
)
if self.site is not None:
await self.site.stop()
if self.runner is not None:

View File

@@ -11,13 +11,7 @@ import time
from typing import Any, Final
from aiohttp import hdrs
from aiohttp.web import (
Application,
HTTPInternalServerError,
Request,
StreamResponse,
middleware,
)
from aiohttp.web import Application, Request, StreamResponse, middleware
import jwt
from jwt import api_jws
from yarl import URL
@@ -26,7 +20,6 @@ from homeassistant.auth import jwt_wrapper
from homeassistant.auth.const import GROUP_ID_READ_ONLY
from homeassistant.auth.models import User
from homeassistant.components import websocket_api
from homeassistant.const import HASSIO_USER_NAME
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.http import current_request
from homeassistant.helpers.json import json_bytes
@@ -34,12 +27,7 @@ from homeassistant.helpers.network import is_cloud_connection
from homeassistant.helpers.storage import Store
from homeassistant.util.network import is_local
from .const import (
KEY_AUTHENTICATED,
KEY_HASS_REFRESH_TOKEN_ID,
KEY_HASS_USER,
is_unix_socket_request,
)
from .const import KEY_AUTHENTICATED, KEY_HASS_REFRESH_TOKEN_ID, KEY_HASS_USER
_LOGGER = logging.getLogger(__name__)
@@ -129,7 +117,7 @@ def async_user_not_allowed_do_auth(
return "User cannot authenticate remotely"
async def async_setup_auth( # noqa: C901
async def async_setup_auth(
hass: HomeAssistant,
app: Application,
) -> None:
@@ -219,41 +207,6 @@ async def async_setup_auth( # noqa: C901
request[KEY_HASS_REFRESH_TOKEN_ID] = refresh_token.id
return True
supervisor_user_id: str | None = None
async def async_authenticate_unix_socket(request: Request) -> bool:
"""Authenticate a request from a Unix socket as the Supervisor user.
The Unix Socket is dedicated and only available to Supervisor. To
avoid the extra overhead and round trips for the authentication and
refresh tokens, we directly authenticate requests from the socket as
the Supervisor user.
"""
nonlocal supervisor_user_id
# Fast path: use cached user ID
if supervisor_user_id is not None:
if user := await hass.auth.async_get_user(supervisor_user_id):
request[KEY_HASS_USER] = user
return True
supervisor_user_id = None
# Slow path: find the Supervisor user by name
for user in await hass.auth.async_get_users():
if user.system_generated and user.name == HASSIO_USER_NAME:
supervisor_user_id = user.id
# Not setting KEY_HASS_REFRESH_TOKEN_ID since Supervisor user
# doesn't use refresh tokens.
request[KEY_HASS_USER] = user
return True
# The Unix socket should not be serving before the hassio integration
# has created the Supervisor user. If we get here, something is wrong.
_LOGGER.error(
"Supervisor user not found; cannot authenticate Unix socket request"
)
raise HTTPInternalServerError
@middleware
async def auth_middleware(
request: Request, handler: Callable[[Request], Awaitable[StreamResponse]]
@@ -261,11 +214,7 @@ async def async_setup_auth( # noqa: C901
"""Authenticate as middleware."""
authenticated = False
if is_unix_socket_request(request):
authenticated = await async_authenticate_unix_socket(request)
auth_type = "unix socket"
elif hdrs.AUTHORIZATION in request.headers and async_validate_auth_header(
if hdrs.AUTHORIZATION in request.headers and async_validate_auth_header(
request
):
authenticated = True
@@ -284,7 +233,7 @@ async def async_setup_auth( # noqa: C901
if authenticated and _LOGGER.isEnabledFor(logging.DEBUG):
_LOGGER.debug(
"Authenticated %s for %s using %s",
request.remote or "unknown",
request.remote,
request.path,
auth_type,
)

View File

@@ -30,7 +30,7 @@ from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.hassio import get_supervisor_ip, is_hassio
from homeassistant.util import dt as dt_util, yaml as yaml_util
from .const import KEY_HASS, is_unix_socket_request
from .const import KEY_HASS
from .view import HomeAssistantView
_LOGGER: Final = logging.getLogger(__name__)
@@ -72,10 +72,6 @@ async def ban_middleware(
request: Request, handler: Callable[[Request], Awaitable[StreamResponse]]
) -> StreamResponse:
"""IP Ban middleware."""
# Unix socket connections are trusted, skip ban checks
if is_unix_socket_request(request):
return await handler(request)
if (ban_manager := request.app.get(KEY_BAN_MANAGER)) is None:
_LOGGER.error("IP Ban middleware loaded but banned IPs not loaded")
return await handler(request)

View File

@@ -1,22 +1,10 @@
"""HTTP specific constants."""
import socket
from typing import Final
from aiohttp.web import Request
from homeassistant.helpers.http import KEY_AUTHENTICATED, KEY_HASS # noqa: F401
DOMAIN: Final = "http"
KEY_HASS_USER: Final = "hass_user"
KEY_HASS_REFRESH_TOKEN_ID: Final = "hass_refresh_token_id"
def is_unix_socket_request(request: Request) -> bool:
"""Check if request arrived over a Unix socket."""
if (transport := request.transport) is None:
return False
if (sock := transport.get_extra_info("socket")) is None:
return False
return bool(sock.family == socket.AF_UNIX)

View File

@@ -3,8 +3,6 @@
from __future__ import annotations
import asyncio
from pathlib import Path
import socket
from ssl import SSLContext
from aiohttp import web
@@ -70,62 +68,3 @@ class HomeAssistantTCPSite(web.BaseSite):
reuse_address=self._reuse_address,
reuse_port=self._reuse_port,
)
class HomeAssistantUnixSite(web.BaseSite):
"""HomeAssistant specific aiohttp UnixSite.
Listens on a Unix socket for local inter-process communication,
used for Supervisor to Core communication.
"""
__slots__ = ("_path",)
def __init__(
self,
runner: web.BaseRunner,
path: Path,
*,
backlog: int = 128,
) -> None:
"""Initialize HomeAssistantUnixSite."""
super().__init__(
runner,
backlog=backlog,
)
self._path = path
@property
def name(self) -> str:
"""Return server URL."""
return f"http://unix:{self._path}:"
def _create_unix_socket(self) -> socket.socket:
"""Create and bind a Unix domain socket.
Performs blocking filesystem I/O (mkdir, unlink, chmod) and is
intended to be run in an executor. Permissions are set after bind
but before the socket is handed to the event loop, so no
connections can arrive on an unrestricted socket.
"""
self._path.parent.mkdir(parents=True, exist_ok=True)
self._path.unlink(missing_ok=True)
sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
try:
sock.bind(str(self._path))
except OSError:
sock.close()
raise
self._path.chmod(0o600)
return sock
async def start(self) -> None:
"""Start server."""
await super().start()
loop = asyncio.get_running_loop()
sock = await loop.run_in_executor(None, self._create_unix_socket)
server = self._runner.server
assert server is not None
self._server = await loop.create_unix_server(
server, sock=sock, backlog=self._backlog
)
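The removed `_create_unix_socket` docstring above stresses ordering: permissions are set after bind but before the socket is handed to the event loop, so no connection can arrive on an unrestricted socket. A synchronous sketch of that bind-then-chmod pattern:

```python
import socket
from pathlib import Path


def create_unix_socket(path: Path) -> socket.socket:
    """Create and bind a Unix domain socket, restricting it to 0o600
    before it is served, mirroring the ordering removed above."""
    path.parent.mkdir(parents=True, exist_ok=True)
    path.unlink(missing_ok=True)
    sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    try:
        sock.bind(str(path))
    except OSError:
        sock.close()
        raise
    # Restrict permissions before any listener is attached.
    path.chmod(0o600)
    return sock
```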

View File

@@ -8,6 +8,7 @@ from homeassistant.core import HomeAssistant
from .coordinator import IndevoltConfigEntry, IndevoltCoordinator
PLATFORMS: list[Platform] = [
Platform.BUTTON,
Platform.NUMBER,
Platform.SELECT,
Platform.SENSOR,

View File

@@ -0,0 +1,70 @@
"""Button platform for Indevolt integration."""
from __future__ import annotations
from dataclasses import dataclass, field
from typing import Final
from homeassistant.components.button import ButtonEntity, ButtonEntityDescription
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import IndevoltConfigEntry
from .coordinator import IndevoltCoordinator
from .entity import IndevoltEntity
PARALLEL_UPDATES = 0
@dataclass(frozen=True, kw_only=True)
class IndevoltButtonEntityDescription(ButtonEntityDescription):
"""Custom entity description class for Indevolt button entities."""
generation: list[int] = field(default_factory=lambda: [1, 2])
BUTTONS: Final = (
IndevoltButtonEntityDescription(
key="stop",
translation_key="stop",
),
)
async def async_setup_entry(
hass: HomeAssistant,
entry: IndevoltConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the button platform for Indevolt."""
coordinator = entry.runtime_data
device_gen = coordinator.generation
# Button initialization
async_add_entities(
IndevoltButtonEntity(coordinator=coordinator, description=description)
for description in BUTTONS
if device_gen in description.generation
)
class IndevoltButtonEntity(IndevoltEntity, ButtonEntity):
"""Represents a button entity for Indevolt devices."""
entity_description: IndevoltButtonEntityDescription
def __init__(
self,
coordinator: IndevoltCoordinator,
description: IndevoltButtonEntityDescription,
) -> None:
"""Initialize the Indevolt button entity."""
super().__init__(coordinator)
self.entity_description = description
self._attr_unique_id = f"{self.serial_number}_{description.key}"
async def async_press(self) -> None:
"""Handle the button press."""
await self.coordinator.async_execute_realtime_action([0, 0, 0])

View File

@@ -1,16 +1,27 @@
"""Constants for the Indevolt integration."""
DOMAIN = "indevolt"
from typing import Final
DOMAIN: Final = "indevolt"
# Default configurations
DEFAULT_PORT: Final = 8080
# Config entry fields
CONF_SERIAL_NUMBER = "serial_number"
CONF_GENERATION = "generation"
CONF_SERIAL_NUMBER: Final = "serial_number"
CONF_GENERATION: Final = "generation"
# Default values
DEFAULT_PORT = 8080
# API write/read keys for energy and value for outdoor/portable mode
ENERGY_MODE_READ_KEY: Final = "7101"
ENERGY_MODE_WRITE_KEY: Final = "47005"
PORTABLE_MODE: Final = 0
# API write key and value for real-time control mode
REALTIME_ACTION_KEY: Final = "47015"
REALTIME_ACTION_MODE: Final = 4
# API key fields
SENSOR_KEYS = {
SENSOR_KEYS: Final[dict[int, list[str]]] = {
1: [
"606",
"7101",

View File

@@ -4,7 +4,7 @@ from __future__ import annotations
from datetime import timedelta
import logging
from typing import Any
from typing import Any, Final
from aiohttp import ClientError
from indevolt_api import IndevoltAPI, TimeOutException
@@ -21,20 +21,37 @@ from .const import (
CONF_SERIAL_NUMBER,
DEFAULT_PORT,
DOMAIN,
ENERGY_MODE_READ_KEY,
ENERGY_MODE_WRITE_KEY,
PORTABLE_MODE,
REALTIME_ACTION_KEY,
REALTIME_ACTION_MODE,
SENSOR_KEYS,
)
_LOGGER = logging.getLogger(__name__)
SCAN_INTERVAL = 30
SCAN_INTERVAL: Final = 30
type IndevoltConfigEntry = ConfigEntry[IndevoltCoordinator]
class DeviceTimeoutError(HomeAssistantError):
"""Raised when device push times out."""
class DeviceConnectionError(HomeAssistantError):
"""Raised when device push fails due to connection issues."""
class IndevoltCoordinator(DataUpdateCoordinator[dict[str, Any]]):
"""Coordinator for fetching and pushing data to indevolt devices."""
friendly_name: str
config_entry: IndevoltConfigEntry
firmware_version: str | None
serial_number: str
device_model: str
generation: int
def __init__(self, hass: HomeAssistant, entry: IndevoltConfigEntry) -> None:
"""Initialize the indevolt coordinator."""
@@ -53,6 +70,7 @@ class IndevoltCoordinator(DataUpdateCoordinator[dict[str, Any]]):
session=async_get_clientsession(hass),
)
self.friendly_name = entry.title
self.serial_number = entry.data[CONF_SERIAL_NUMBER]
self.device_model = entry.data[CONF_MODEL]
self.generation = entry.data[CONF_GENERATION]
@@ -85,6 +103,67 @@ class IndevoltCoordinator(DataUpdateCoordinator[dict[str, Any]]):
try:
return await self.api.set_data(sensor_key, value)
except TimeOutException as err:
raise HomeAssistantError(f"Device push timed out: {err}") from err
raise DeviceTimeoutError(f"Device push timed out: {err}") from err
except (ClientError, ConnectionError, OSError) as err:
raise HomeAssistantError(f"Device push failed: {err}") from err
raise DeviceConnectionError(f"Device push failed: {err}") from err
async def async_switch_energy_mode(
self, target_mode: int, refresh: bool = True
) -> None:
"""Attempt to switch device to given energy mode."""
current_mode = self.data.get(ENERGY_MODE_READ_KEY)
# Ensure current energy mode is known
if current_mode is None:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="failed_to_retrieve_current_energy_mode",
)
# Ensure device is not in "Outdoor/Portable mode"
if current_mode == PORTABLE_MODE:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="energy_mode_change_unavailable_outdoor_portable",
)
# Switch energy mode if required
if current_mode != target_mode:
try:
success = await self.async_push_data(ENERGY_MODE_WRITE_KEY, target_mode)
except (DeviceTimeoutError, DeviceConnectionError) as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="failed_to_switch_energy_mode",
) from err
if not success:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="failed_to_switch_energy_mode",
)
if refresh:
await self.async_request_refresh()
async def async_execute_realtime_action(self, action: list[int]) -> None:
"""Switch mode, execute action, and refresh for real-time control."""
await self.async_switch_energy_mode(REALTIME_ACTION_MODE, refresh=False)
try:
success = await self.async_push_data(REALTIME_ACTION_KEY, action)
except (DeviceTimeoutError, DeviceConnectionError) as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="failed_to_execute_realtime_action",
) from err
if not success:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="failed_to_execute_realtime_action",
)
await self.async_request_refresh()

View File

@@ -35,6 +35,11 @@
}
},
"entity": {
"button": {
"stop": {
"name": "Enable standby mode"
}
},
"number": {
"discharge_limit": {
"name": "Discharge limit"
@@ -289,5 +294,19 @@
"name": "LED indicator"
}
}
},
"exceptions": {
"energy_mode_change_unavailable_outdoor_portable": {
"message": "Energy mode cannot be changed when the device is in outdoor/portable mode"
},
"failed_to_execute_realtime_action": {
"message": "Failed to execute real-time action"
},
"failed_to_retrieve_current_energy_mode": {
"message": "Failed to retrieve current energy mode"
},
"failed_to_switch_energy_mode": {
"message": "Failed to switch to requested energy mode"
}
}
}

View File

@@ -4,11 +4,21 @@ from typing import Any
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers import device_registry as dr
from homeassistant.helpers import config_validation as cv, device_registry as dr
from homeassistant.helpers.typing import ConfigType
from .client_wrapper import CannotConnect, InvalidAuth, create_client, validate_input
from .const import CONF_CLIENT_DEVICE_ID, DEFAULT_NAME, DOMAIN, PLATFORMS
from .coordinator import JellyfinConfigEntry, JellyfinDataUpdateCoordinator
from .services import async_setup_services
CONFIG_SCHEMA = cv.config_entry_only_config_schema(DOMAIN)
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the Jellyfin component."""
await async_setup_services(hass)
return True
async def async_setup_entry(hass: HomeAssistant, entry: JellyfinConfigEntry) -> bool:

View File

@@ -38,6 +38,8 @@ PLAYABLE_MEDIA_TYPES = [
MediaType.EPISODE,
MediaType.MOVIE,
MediaType.MUSIC,
MediaType.SEASON,
MediaType.TVSHOW,
]
@@ -98,8 +100,8 @@ async def build_item_response(
media_content_id: str,
) -> BrowseMedia:
"""Create response payload for the provided media query."""
title, media, thumbnail = await get_media_info(
hass, client, user_id, media_content_type, media_content_id
title, media, thumbnail, media_type = await get_media_info(
hass, client, user_id, media_content_id
)
if title is None or media is None:
@@ -111,12 +113,12 @@ async def build_item_response(
response = BrowseMedia(
media_class=CONTAINER_TYPES_SPECIFIC_MEDIA_CLASS.get(
str(media_content_type), MediaClass.DIRECTORY
str(media_type), MediaClass.DIRECTORY
),
media_content_id=media_content_id,
media_content_type=str(media_content_type),
media_content_type=str(media_type),
title=title,
can_play=bool(media_content_type in PLAYABLE_MEDIA_TYPES and media_content_id),
can_play=bool(media_type in PLAYABLE_MEDIA_TYPES and media_content_id),
can_expand=True,
children=children,
thumbnail=thumbnail,
@@ -207,18 +209,18 @@ async def get_media_info(
hass: HomeAssistant,
client: JellyfinClient,
user_id: str,
media_content_type: str | None,
media_content_id: str,
) -> tuple[str | None, list[dict[str, Any]] | None, str | None]:
) -> tuple[str | None, list[dict[str, Any]] | None, str | None, str | None]:
"""Fetch media info."""
thumbnail: str | None = None
title: str | None = None
media: list[dict[str, Any]] | None = None
media_type: str | None = None
item = await hass.async_add_executor_job(fetch_item, client, media_content_id)
if item is None:
return None, None, None
return None, None, None, None
title = item["Name"]
thumbnail = get_artwork_url(client, item)
@@ -231,4 +233,6 @@ async def get_media_info(
if not media or len(media) == 0:
media = None
return title, media, thumbnail
media_type = CONTENT_TYPE_MAP.get(item["Type"], MEDIA_TYPE_NONE)
return title, media, thumbnail, media_type

View File

@@ -74,9 +74,10 @@ MEDIA_CLASS_MAP = {
"MusicAlbum": MediaClass.ALBUM,
"MusicArtist": MediaClass.ARTIST,
"Audio": MediaClass.MUSIC,
"Series": MediaClass.DIRECTORY,
"Series": MediaClass.TV_SHOW,
"Movie": MediaClass.MOVIE,
"CollectionFolder": MediaClass.DIRECTORY,
"AggregateFolder": MediaClass.DIRECTORY,
"Folder": MediaClass.DIRECTORY,
"BoxSet": MediaClass.DIRECTORY,
"Episode": MediaClass.EPISODE,

View File

@@ -5,5 +5,10 @@
"default": "mdi:television-play"
}
}
},
"services": {
"play_media_shuffle": {
"service": "mdi:shuffle-variant"
}
}
}


@@ -6,7 +6,9 @@ import logging
from typing import Any
from homeassistant.components.media_player import (
ATTR_MEDIA_ENQUEUE,
BrowseMedia,
MediaPlayerEnqueue,
MediaPlayerEntity,
MediaPlayerEntityFeature,
MediaPlayerState,
@@ -203,6 +205,7 @@ class JellyfinMediaPlayer(JellyfinClientEntity, MediaPlayerEntity):
| MediaPlayerEntityFeature.STOP
| MediaPlayerEntityFeature.SEEK
| MediaPlayerEntityFeature.SEARCH_MEDIA
| MediaPlayerEntityFeature.MEDIA_ENQUEUE
)
if "Mute" in commands and "Unmute" in commands:
@@ -245,8 +248,20 @@ class JellyfinMediaPlayer(JellyfinClientEntity, MediaPlayerEntity):
self, media_type: MediaType | str, media_id: str, **kwargs: Any
) -> None:
"""Play a piece of media."""
command = "PlayNow"
enqueue = kwargs.get(ATTR_MEDIA_ENQUEUE)
if enqueue == MediaPlayerEnqueue.NEXT:
command = "PlayNext"
elif enqueue == MediaPlayerEnqueue.ADD:
command = "PlayLast"
self.coordinator.api_client.jellyfin.remote_play_media(
self.session_id, [media_id]
self.session_id, [media_id], command
)
def play_media_shuffle(self, media_content_id: str) -> None:
"""Play a piece of media on shuffle."""
self.coordinator.api_client.jellyfin.remote_play_media(
self.session_id, [media_content_id], "PlayShuffle"
)
def set_volume_level(self, volume: float) -> None:


@@ -0,0 +1,55 @@
"""Services for the Jellyfin integration."""
from __future__ import annotations
from typing import Any
import voluptuous as vol
from homeassistant.components.media_player import (
ATTR_MEDIA,
ATTR_MEDIA_CONTENT_ID,
DOMAIN as MP_DOMAIN,
MediaPlayerEntityFeature,
)
from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_validation as cv, service
from .const import DOMAIN
JELLYFIN_PLAY_MEDIA_SHUFFLE_SCHEMA = {
vol.Required(ATTR_MEDIA_CONTENT_ID): cv.string,
}
def _promote_media_fields(data: dict[str, Any]) -> dict[str, Any]:
"""If 'media' key exists, promote its fields to the top level."""
if ATTR_MEDIA in data and isinstance(data[ATTR_MEDIA], dict):
if ATTR_MEDIA_CONTENT_ID in data:
raise vol.Invalid(
f"Play media cannot contain both '{ATTR_MEDIA}' and '{ATTR_MEDIA_CONTENT_ID}'"
)
media_data = data[ATTR_MEDIA]
if ATTR_MEDIA_CONTENT_ID in media_data:
data[ATTR_MEDIA_CONTENT_ID] = media_data[ATTR_MEDIA_CONTENT_ID]
del data[ATTR_MEDIA]
return data
async def async_setup_services(hass: HomeAssistant) -> None:
"""Set up services for the Jellyfin component."""
service.async_register_platform_entity_service(
hass,
DOMAIN,
"play_media_shuffle",
entity_domain=MP_DOMAIN,
schema=vol.All(
_promote_media_fields,
cv.make_entity_service_schema(JELLYFIN_PLAY_MEDIA_SHUFFLE_SCHEMA),
),
func="play_media_shuffle",
required_features=MediaPlayerEntityFeature.PLAY_MEDIA,
)
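The `_promote_media_fields` validator above lifts `media_content_id` out of a nested `media` dict before the entity-service schema runs. A self-contained sketch of that behavior (raising ValueError where the real code raises voluptuous' `vol.Invalid`, to stay dependency-free):

```python
def promote_media_fields(data: dict) -> dict:
    """Promote fields of a nested 'media' dict to the top level.

    Sketch of the validator above; ValueError stands in for vol.Invalid.
    """
    if isinstance(data.get("media"), dict):
        if "media_content_id" in data:
            raise ValueError("cannot contain both 'media' and 'media_content_id'")
        media = data.pop("media")
        if "media_content_id" in media:
            data["media_content_id"] = media["media_content_id"]
    return data


assert promote_media_fields({"media": {"media_content_id": "abc"}}) == {
    "media_content_id": "abc"
}
```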


@@ -0,0 +1,11 @@
play_media_shuffle:
target:
entity:
integration: jellyfin
domain: media_player
fields:
media:
required: true
selector:
media:
example: '{"media_content_id": "a656b907eb3a73532e40e44b968d0225"}'


@@ -42,5 +42,17 @@
}
}
}
},
"services": {
"play_media_shuffle": {
"description": "Starts playing the specified media shuffled. Overwrites the current play queue.",
"fields": {
"media": {
"description": "The media selected to play.",
"name": "Media"
}
},
"name": "Play media shuffled"
}
}
}


@@ -47,6 +47,14 @@ COLOR_MODE_MAP = {
clusters.ColorControl.Enums.ColorModeEnum.kColorTemperatureMireds: ColorMode.COLOR_TEMP,
}
# Maximum Mireds value per the Matter spec is 65279
# Conversion between Kelvin and Mireds is 1,000,000 / Kelvin, so this corresponds to a minimum color temperature of ~15.3 K,
# which is shown in the UI as 15 Kelvin due to rounding.
# But converting 15 Kelvin back to Mireds gives 66666, which is above the maximum
# and causes an Invoke error, so cap values over the maximum when sending
MATTER_MAX_MIREDS = 65279
# there's a bug in (at least) Espressif's implementation of light transitions
# on devices based on Matter 1.0. Mark potential devices with this issue.
# https://github.com/home-assistant/core/issues/113775
@@ -152,7 +160,7 @@ class MatterLight(MatterEntity, LightEntity):
)
await self.send_device_command(
clusters.ColorControl.Commands.MoveToColorTemperature(
colorTemperatureMireds=color_temp_mired,
colorTemperatureMireds=min(color_temp_mired, MATTER_MAX_MIREDS),
# transition in matter is measured in tenths of a second
transitionTime=int(transition * 10),
# allow setting the color while the light is off,
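The capping arithmetic above can be checked in a few lines. A hedged sketch (the helper name is made up; the constant comes from the diff):

```python
MATTER_MAX_MIREDS = 65279  # Maximum Mireds value per the Matter spec


def kelvin_to_mireds_capped(kelvin: float) -> int:
    """Convert Kelvin to Mireds, capping at the Matter spec maximum."""
    return min(int(1_000_000 / kelvin), MATTER_MAX_MIREDS)


# 1,000,000 / 65279 ≈ 15.3 K, shown as 15 K after rounding; converting
# 15 K back gives 66666 Mireds, which the cap clamps to the maximum.
assert int(1_000_000 / 15) == 66666
assert kelvin_to_mireds_capped(15) == MATTER_MAX_MIREDS
```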


@@ -203,105 +203,80 @@ class MoldIndicator(SensorEntity):
def _async_setup_sensor(self) -> None:
"""Set up the sensor and start tracking state changes."""
@callback
def mold_indicator_sensors_state_listener(
event: Event[EventStateChangedData],
) -> None:
"""Handle for state changes for dependent sensors."""
new_state = event.data["new_state"]
old_state = event.data["old_state"]
entity = event.data["entity_id"]
_LOGGER.debug(
"Sensor state change for %s that had old state %s and new state %s",
entity,
old_state,
new_state,
)
if self._update_sensor(entity, old_state, new_state):
if self._preview_callback:
calculated_state = self._async_calculate_state()
self._preview_callback(
calculated_state.state, calculated_state.attributes
)
# only write state to the state machine if we are not in preview mode
else:
self.async_schedule_update_ha_state(True)
@callback
def mold_indicator_startup() -> None:
"""Add listeners and get 1st state."""
_LOGGER.debug("Startup for %s", self.entity_id)
self.async_on_remove(
async_track_state_change_event(
self.hass,
list(self._entities.values()),
mold_indicator_sensors_state_listener,
self._entities.values(),
self._async_mold_indicator_sensor_state_listener,
)
)
# Replay current state of source entities
for entity_id in self._entities.values():
state = self.hass.states.get(entity_id)
state_event: Event[EventStateChangedData] = Event(
"", {"entity_id": entity_id, "new_state": state, "old_state": None}
)
self._async_mold_indicator_sensor_state_listener(
state_event, update_state=False
)
# Read initial state
indoor_temp = self.hass.states.get(self._entities[CONF_INDOOR_TEMP])
outdoor_temp = self.hass.states.get(self._entities[CONF_OUTDOOR_TEMP])
indoor_hum = self.hass.states.get(self._entities[CONF_INDOOR_HUMIDITY])
self._recalculate()
schedule_update = self._update_sensor(
self._entities[CONF_INDOOR_TEMP], None, indoor_temp
)
if self._preview_callback:
calculated_state = self._async_calculate_state()
self._preview_callback(calculated_state.state, calculated_state.attributes)
schedule_update = (
False
if not self._update_sensor(
self._entities[CONF_OUTDOOR_TEMP], None, outdoor_temp
)
else schedule_update
)
@callback
def _async_mold_indicator_sensor_state_listener(
self, event: Event[EventStateChangedData], update_state: bool = True
) -> None:
"""Handle state changes for dependent sensors."""
entity_id = event.data["entity_id"]
new_state = event.data["new_state"]
schedule_update = (
False
if not self._update_sensor(
self._entities[CONF_INDOOR_HUMIDITY], None, indoor_hum
)
else schedule_update
)
_LOGGER.debug(
"Sensor state change for %s that had old state %s and new state %s",
entity_id,
event.data["old_state"],
new_state,
)
if schedule_update and not self._preview_callback:
self.async_schedule_update_ha_state(True)
if self._preview_callback:
# re-calculate dewpoint and mold indicator
self._calc_dewpoint()
self._calc_moldindicator()
if self._attr_native_value is None:
self._attr_available = False
else:
self._attr_available = True
calculated_state = self._async_calculate_state()
self._preview_callback(
calculated_state.state, calculated_state.attributes
)
mold_indicator_startup()
def _update_sensor(
self, entity: str, old_state: State | None, new_state: State | None
) -> bool:
"""Update information based on new sensor states."""
_LOGGER.debug("Sensor update for %s", entity)
if new_state is None:
return False
# If old_state is not set and new state is unknown then it means
# that the sensor just started up
if old_state is None and new_state.state == STATE_UNKNOWN:
return False
if entity == self._entities[CONF_INDOOR_TEMP]:
# update state depending on which sensor changed
if entity_id == self._entities[CONF_INDOOR_TEMP]:
self._indoor_temp = self._get_temperature_from_state(new_state)
elif entity == self._entities[CONF_OUTDOOR_TEMP]:
elif entity_id == self._entities[CONF_OUTDOOR_TEMP]:
self._outdoor_temp = self._get_temperature_from_state(new_state)
elif entity == self._entities[CONF_INDOOR_HUMIDITY]:
elif entity_id == self._entities[CONF_INDOOR_HUMIDITY]:
self._indoor_hum = self._get_humidity_from_state(new_state)
return True
if not update_state:
return
self._recalculate()
if self._preview_callback:
calculated_state = self._async_calculate_state()
self._preview_callback(calculated_state.state, calculated_state.attributes)
# only write state to the state machine if we are not in preview mode
else:
self.async_write_ha_state()
@callback
def _recalculate(self) -> None:
"""Recalculate mold indicator from cached sensor values."""
# Check if all sensors are available
if None in (self._indoor_temp, self._indoor_hum, self._outdoor_temp):
self._attr_available = False
self._attr_native_value = None
self._dewpoint = None
self._crit_temp = None
return
# Calculate dewpoint and mold indicator
self._calc_dewpoint()
self._calc_moldindicator()
self._attr_available = self._attr_native_value is not None
def _get_value_from_state(
self,
@@ -376,26 +351,6 @@ class MoldIndicator(SensorEntity):
return self._get_value_from_state(state, validate_humidity)
async def async_update(self) -> None:
"""Calculate latest state."""
_LOGGER.debug("Update state for %s", self.entity_id)
# check all sensors
if None in (self._indoor_temp, self._indoor_hum, self._outdoor_temp):
self._attr_available = False
self._dewpoint = None
self._crit_temp = None
return
# re-calculate dewpoint and mold indicator
self._calc_dewpoint()
self._calc_moldindicator()
if self._attr_native_value is None:
self._attr_available = False
self._dewpoint = None
self._crit_temp = None
else:
self._attr_available = True
def _calc_dewpoint(self) -> None:
"""Calculate the dewpoint for the indoor air."""
# Use magnus approximation to calculate the dew point
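The Magnus approximation named in the docstring can be sketched independently. The constants below (17.62, 243.12) are one common parameter choice and may differ from the ones the integration actually uses:

```python
import math


def dewpoint_magnus(temp_c: float, rel_humidity: float) -> float:
    """Dew point (°C) via the Magnus approximation.

    Constants are one common parameterization, valid roughly
    for -45 °C to 60 °C; an assumption, not the integration's exact values.
    """
    a, b = 17.62, 243.12
    alpha = math.log(rel_humidity / 100.0) + (a * temp_c) / (b + temp_c)
    return (b * alpha) / (a - alpha)


# 20 °C indoor air at 50 % relative humidity gives a dew point near 9.3 °C
assert abs(dewpoint_magnus(20.0, 50.0) - 9.26) < 0.1
```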


@@ -283,10 +283,7 @@ class IntegrationOnboardingView(_BaseOnboardingStepView):
async def post(self, request: web.Request, data: dict[str, Any]) -> web.Response:
"""Handle token creation."""
hass = request.app[KEY_HASS]
if not (refresh_token_id := request.get(KEY_HASS_REFRESH_TOKEN_ID)):
return self.json_message(
"Refresh token not available", HTTPStatus.FORBIDDEN
)
refresh_token_id = request[KEY_HASS_REFRESH_TOKEN_ID]
async with self._lock:
if self._async_is_done():

View File

@@ -51,6 +51,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: PortainerConfigEntry) ->
session=async_create_clientsession(
hass=hass, verify_ssl=entry.data[CONF_VERIFY_SSL]
),
request_timeout=30,
max_retries=API_MAX_RETRIES,
)


@@ -105,6 +105,9 @@
"robot_cleaner_driving_mode": {
"default": "mdi:car-cog"
},
"robot_cleaner_sound_mode": {
"default": "mdi:bell-cog"
},
"robot_cleaner_water_spray_level": {
"default": "mdi:spray-bottle"
},


@@ -26,6 +26,12 @@ LAMP_TO_HA = {
"off": "off",
}
SOUND_MODE_TO_HA = {
"voice": "voice",
"beep": "tone",
"mute": "mute",
}
DRIVING_MODE_TO_HA = {
"areaThenWalls": "area_then_walls",
"wallFirst": "walls_first",
@@ -244,6 +250,16 @@ CAPABILITIES_TO_SELECT: dict[Capability | str, SmartThingsSelectDescription] = {
entity_category=EntityCategory.CONFIG,
value_is_integer=True,
),
Capability.SAMSUNG_CE_ROBOT_CLEANER_SYSTEM_SOUND_MODE: SmartThingsSelectDescription(
key=Capability.SAMSUNG_CE_ROBOT_CLEANER_SYSTEM_SOUND_MODE,
translation_key="robot_cleaner_sound_mode",
options_attribute=Attribute.SUPPORTED_SOUND_MODES,
status_attribute=Attribute.SOUND_MODE,
command=Command.SET_SOUND_MODE,
options_map=SOUND_MODE_TO_HA,
entity_category=EntityCategory.CONFIG,
entity_registry_enabled_default=False,
),
Capability.SAMSUNG_CE_ROBOT_CLEANER_CLEANING_TYPE: SmartThingsSelectDescription(
key=Capability.SAMSUNG_CE_ROBOT_CLEANER_CLEANING_TYPE,
translation_key="robot_cleaner_cleaning_type",


@@ -254,6 +254,14 @@
"walls_first": "Walls first"
}
},
"robot_cleaner_sound_mode": {
"name": "Sound mode",
"state": {
"mute": "Mute",
"tone": "Tone",
"voice": "Voice"
}
},
"robot_cleaner_water_spray_level": {
"name": "Water level",
"state": {


@@ -195,9 +195,22 @@ class TeslaFleetEnergySiteLiveCoordinator(DataUpdateCoordinator[dict[str, Any]])
except TeslaFleetError as e:
raise UpdateFailed(e.message) from e
if not isinstance(data, dict):
LOGGER.debug(
"%s got unexpected live status response type: %s",
self.name,
type(data).__name__,
)
return self.data
# Convert Wall Connectors from array to dict
wall_connectors = data.get("wall_connectors")
if not isinstance(wall_connectors, list):
wall_connectors = []
data["wall_connectors"] = {
wc["din"]: wc for wc in (data.get("wall_connectors") or [])
wc["din"]: wc
for wc in wall_connectors
if isinstance(wc, dict) and "din" in wc
}
self.updated_once = True
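The defensive conversion above turns the `wall_connectors` array into a dict keyed by DIN while dropping malformed entries instead of raising. A standalone sketch of that normalization (function name is illustrative):

```python
def normalize_wall_connectors(data: dict) -> dict:
    """Convert the wall_connectors array to a dict keyed by DIN.

    Mirrors the handling above: a non-list payload becomes an empty
    dict, and entries that are not dicts or lack a "din" key are dropped.
    """
    wall_connectors = data.get("wall_connectors")
    if not isinstance(wall_connectors, list):
        wall_connectors = []
    data["wall_connectors"] = {
        wc["din"]: wc
        for wc in wall_connectors
        if isinstance(wc, dict) and "din" in wc
    }
    return data


payload = {"wall_connectors": [{"din": "A1", "power": 0}, "garbage", {"no_din": 1}]}
assert normalize_wall_connectors(payload)["wall_connectors"] == {
    "A1": {"din": "A1", "power": 0}
}
```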


@@ -2,9 +2,10 @@
from __future__ import annotations
from datetime import timedelta
import asyncio
from datetime import datetime, timedelta
import logging
from typing import TYPE_CHECKING, cast
from typing import TYPE_CHECKING, TypedDict, cast
from aiohttp.client_exceptions import ClientError
import tibber
@@ -38,6 +39,58 @@ FIVE_YEARS = 5 * 365 * 24
_LOGGER = logging.getLogger(__name__)
class TibberHomeData(TypedDict):
"""Data for a Tibber home used by the price sensor."""
currency: str
price_unit: str
current_price: float | None
current_price_time: datetime | None
intraday_price_ranking: float | None
max_price: float
avg_price: float
min_price: float
off_peak_1: float
peak: float
off_peak_2: float
month_cost: float | None
peak_hour: float | None
peak_hour_time: datetime | None
month_cons: float | None
app_nickname: str | None
grid_company: str | None
estimated_annual_consumption: int | None
def _build_home_data(home: tibber.TibberHome) -> TibberHomeData:
"""Build TibberHomeData from a TibberHome for the price sensor."""
current_price, last_updated, price_rank = home.current_price_data()
attributes = home.current_attributes()
result: TibberHomeData = {
"currency": home.currency,
"price_unit": home.price_unit,
"current_price": current_price,
"current_price_time": last_updated,
"intraday_price_ranking": price_rank,
"max_price": attributes["max_price"],
"avg_price": attributes["avg_price"],
"min_price": attributes["min_price"],
"off_peak_1": attributes["off_peak_1"],
"peak": attributes["peak"],
"off_peak_2": attributes["off_peak_2"],
"month_cost": home.month_cost,
"peak_hour": home.peak_hour,
"peak_hour_time": home.peak_hour_time,
"month_cons": home.month_cons,
"app_nickname": home.info["viewer"]["home"].get("appNickname"),
"grid_company": home.info["viewer"]["home"]["meteringPointData"]["gridCompany"],
"estimated_annual_consumption": home.info["viewer"]["home"][
"meteringPointData"
]["estimatedAnnualConsumption"],
}
return result
class TibberDataCoordinator(DataUpdateCoordinator[None]):
"""Handle Tibber data and insert statistics."""
@@ -57,13 +110,16 @@ class TibberDataCoordinator(DataUpdateCoordinator[None]):
name=f"Tibber {tibber_connection.name}",
update_interval=timedelta(minutes=20),
)
self._tibber_connection = tibber_connection
async def _async_update_data(self) -> None:
"""Update data via API."""
tibber_connection = await self.config_entry.runtime_data.async_get_client(
self.hass
)
try:
await self._tibber_connection.fetch_consumption_data_active_homes()
await self._tibber_connection.fetch_production_data_active_homes()
await tibber_connection.fetch_consumption_data_active_homes()
await tibber_connection.fetch_production_data_active_homes()
await self._insert_statistics()
except tibber.RetryableHttpExceptionError as err:
raise UpdateFailed(f"Error communicating with API ({err.status})") from err
@@ -75,7 +131,10 @@ class TibberDataCoordinator(DataUpdateCoordinator[None]):
async def _insert_statistics(self) -> None:
"""Insert Tibber statistics."""
for home in self._tibber_connection.get_homes():
tibber_connection = await self.config_entry.runtime_data.async_get_client(
self.hass
)
for home in tibber_connection.get_homes():
sensors: list[tuple[str, bool, str | None, str]] = []
if home.hourly_consumption_data:
sensors.append(
@@ -194,6 +253,76 @@ class TibberDataCoordinator(DataUpdateCoordinator[None]):
async_add_external_statistics(self.hass, metadata, statistics)
class TibberPriceCoordinator(DataUpdateCoordinator[dict[str, TibberHomeData]]):
"""Handle Tibber price data and insert statistics."""
config_entry: TibberConfigEntry
def __init__(
self,
hass: HomeAssistant,
config_entry: TibberConfigEntry,
) -> None:
"""Initialize the price coordinator."""
super().__init__(
hass,
_LOGGER,
config_entry=config_entry,
name=f"{DOMAIN} price",
update_interval=timedelta(minutes=1),
)
def _seconds_until_next_15_minute(self) -> float:
"""Return seconds until the next 15-minute boundary (0, 15, 30, 45) in UTC."""
now = dt_util.utcnow()
next_minute = ((now.minute // 15) + 1) * 15
if next_minute >= 60:
next_run = now.replace(minute=0, second=0, microsecond=0) + timedelta(
hours=1
)
else:
next_run = now.replace(
minute=next_minute, second=0, microsecond=0, tzinfo=dt_util.UTC
)
return (next_run - now).total_seconds()
async def _async_update_data(self) -> dict[str, TibberHomeData]:
"""Update data via API and return per-home data for sensors."""
tibber_connection = await self.config_entry.runtime_data.async_get_client(
self.hass
)
active_homes = tibber_connection.get_homes(only_active=True)
try:
await asyncio.gather(
tibber_connection.fetch_consumption_data_active_homes(),
tibber_connection.fetch_production_data_active_homes(),
)
now = dt_util.now()
homes_to_update = [
home
for home in active_homes
if (
(last_data_timestamp := home.last_data_timestamp) is None
or (last_data_timestamp - now).total_seconds() < 11 * 3600
)
]
if homes_to_update:
await asyncio.gather(
*(home.update_info_and_price_info() for home in homes_to_update)
)
except tibber.RetryableHttpExceptionError as err:
raise UpdateFailed(f"Error communicating with API ({err.status})") from err
except tibber.FatalHttpExceptionError as err:
raise UpdateFailed(f"Error communicating with API ({err.status})") from err
result = {home.home_id: _build_home_data(home) for home in active_homes}
self.update_interval = timedelta(seconds=self._seconds_until_next_15_minute())
return result
class TibberDataAPICoordinator(DataUpdateCoordinator[dict[str, TibberDevice]]):
"""Fetch and cache Tibber Data API device capabilities."""


@@ -3,10 +3,8 @@
from __future__ import annotations
from collections.abc import Callable
import datetime
from datetime import timedelta
import logging
from random import randrange
from typing import Any
import aiohttp
@@ -42,18 +40,20 @@ from homeassistant.helpers.update_coordinator import (
CoordinatorEntity,
DataUpdateCoordinator,
)
from homeassistant.util import Throttle, dt as dt_util
from homeassistant.util import dt as dt_util
from .const import DOMAIN, MANUFACTURER, TibberConfigEntry
from .coordinator import TibberDataAPICoordinator, TibberDataCoordinator
from .coordinator import (
TibberDataAPICoordinator,
TibberDataCoordinator,
TibberPriceCoordinator,
)
_LOGGER = logging.getLogger(__name__)
ICON = "mdi:currency-usd"
SCAN_INTERVAL = timedelta(minutes=1)
MIN_TIME_BETWEEN_UPDATES = timedelta(minutes=5)
PARALLEL_UPDATES = 0
TWENTY_MINUTES = 20 * 60
RT_SENSORS_UNIQUE_ID_MIGRATION = {
"accumulated_consumption_last_hour": "accumulated consumption current hour",
@@ -610,6 +610,7 @@ async def _async_setup_graphql_sensors(
entity_registry = er.async_get(hass)
coordinator: TibberDataCoordinator | None = None
price_coordinator: TibberPriceCoordinator | None = None
entities: list[TibberSensor] = []
for home in tibber_connection.get_homes(only_active=False):
try:
@@ -626,7 +627,9 @@ async def _async_setup_graphql_sensors(
raise PlatformNotReady from err
if home.has_active_subscription:
entities.append(TibberSensorElPrice(home))
if price_coordinator is None:
price_coordinator = TibberPriceCoordinator(hass, entry)
entities.append(TibberSensorElPrice(price_coordinator, home))
if coordinator is None:
coordinator = TibberDataCoordinator(hass, entry, tibber_connection)
entities.extend(
@@ -737,19 +740,21 @@ class TibberSensor(SensorEntity):
return device_info
class TibberSensorElPrice(TibberSensor):
class TibberSensorElPrice(TibberSensor, CoordinatorEntity[TibberPriceCoordinator]):
"""Representation of a Tibber sensor for el price."""
_attr_state_class = SensorStateClass.MEASUREMENT
_attr_translation_key = "electricity_price"
def __init__(self, tibber_home: TibberHome) -> None:
def __init__(
self,
coordinator: TibberPriceCoordinator,
tibber_home: TibberHome,
) -> None:
"""Initialize the sensor."""
super().__init__(tibber_home=tibber_home)
self._last_updated: datetime.datetime | None = None
self._spread_load_constant = randrange(TWENTY_MINUTES)
super().__init__(coordinator=coordinator, tibber_home=tibber_home)
self._attr_available = False
self._attr_native_unit_of_measurement = tibber_home.price_unit
self._attr_extra_state_attributes = {
"app_nickname": None,
"grid_company": None,
@@ -768,51 +773,38 @@ class TibberSensorElPrice(TibberSensor):
self._device_name = self._home_name
async def async_update(self) -> None:
"""Get the latest data and updates the states."""
now = dt_util.now()
if (
not self._tibber_home.last_data_timestamp
or (self._tibber_home.last_data_timestamp - now).total_seconds()
< 10 * 3600 - self._spread_load_constant
or not self.available
):
_LOGGER.debug("Asking for new data")
await self._fetch_data()
elif (
self._tibber_home.price_total
and self._last_updated
and self._last_updated.hour == now.hour
and now - self._last_updated < timedelta(minutes=15)
and self._tibber_home.last_data_timestamp
@callback
def _handle_coordinator_update(self) -> None:
"""Handle updated data from the coordinator."""
data = self.coordinator.data
if not data or (
(home_data := data.get(self._tibber_home.home_id)) is None
or (current_price := home_data.get("current_price")) is None
):
self._attr_available = False
self.async_write_ha_state()
return
res = self._tibber_home.current_price_data()
self._attr_native_value, self._last_updated, price_rank = res
self._attr_extra_state_attributes["intraday_price_ranking"] = price_rank
attrs = self._tibber_home.current_attributes()
self._attr_extra_state_attributes.update(attrs)
self._attr_available = self._attr_native_value is not None
self._attr_native_unit_of_measurement = self._tibber_home.price_unit
@Throttle(MIN_TIME_BETWEEN_UPDATES)
async def _fetch_data(self) -> None:
_LOGGER.debug("Fetching data")
try:
await self._tibber_home.update_info_and_price_info()
except (TimeoutError, aiohttp.ClientError):
return
data = self._tibber_home.info["viewer"]["home"]
self._attr_extra_state_attributes["app_nickname"] = data["appNickname"]
self._attr_extra_state_attributes["grid_company"] = data["meteringPointData"][
"gridCompany"
self._attr_native_unit_of_measurement = home_data.get(
"price_unit", self._tibber_home.price_unit
)
self._attr_native_value = current_price
self._attr_extra_state_attributes["intraday_price_ranking"] = home_data.get(
"intraday_price_ranking"
)
self._attr_extra_state_attributes["max_price"] = home_data["max_price"]
self._attr_extra_state_attributes["avg_price"] = home_data["avg_price"]
self._attr_extra_state_attributes["min_price"] = home_data["min_price"]
self._attr_extra_state_attributes["off_peak_1"] = home_data["off_peak_1"]
self._attr_extra_state_attributes["peak"] = home_data["peak"]
self._attr_extra_state_attributes["off_peak_2"] = home_data["off_peak_2"]
self._attr_extra_state_attributes["app_nickname"] = home_data["app_nickname"]
self._attr_extra_state_attributes["grid_company"] = home_data["grid_company"]
self._attr_extra_state_attributes["estimated_annual_consumption"] = home_data[
"estimated_annual_consumption"
]
self._attr_extra_state_attributes["estimated_annual_consumption"] = data[
"meteringPointData"
]["estimatedAnnualConsumption"]
self._attr_available = True
self.async_write_ha_state()
class TibberDataSensor(TibberSensor, CoordinatorEntity[TibberDataCoordinator]):


@@ -10,7 +10,6 @@ import voluptuous as vol
from voluptuous.humanize import humanize_error
from homeassistant.components.http.ban import process_success_login, process_wrong_login
from homeassistant.components.http.const import KEY_HASS_USER
from homeassistant.const import __version__
from homeassistant.core import CALLBACK_TYPE, HomeAssistant
from homeassistant.helpers.json import json_bytes
@@ -69,19 +68,6 @@ class AuthPhase:
# send_bytes_text will directly send a message to the client.
self._send_bytes_text = send_bytes_text
async def async_handle_unix_socket(self) -> ActiveConnection:
"""Handle a pre-authenticated Unix socket connection."""
conn = ActiveConnection(
self._logger,
self._hass,
self._send_message,
self._request[KEY_HASS_USER],
refresh_token=None,
)
await self._send_bytes_text(AUTH_OK_MESSAGE)
self._logger.debug("Auth OK (unix socket)")
return conn
async def async_handle(self, msg: JsonValueType) -> ActiveConnection:
"""Handle authentication."""
try:


@@ -59,14 +59,14 @@ class ActiveConnection:
hass: HomeAssistant,
send_message: Callable[[bytes | str | dict[str, Any]], None],
user: User,
refresh_token: RefreshToken | None,
refresh_token: RefreshToken,
) -> None:
"""Initialize an active connection."""
self.logger = logger
self.hass = hass
self.send_message = send_message
self.user = user
self.refresh_token_id = refresh_token.id if refresh_token else None
self.refresh_token_id = refresh_token.id
self.subscriptions: dict[Hashable, Callable[[], Any]] = {}
self.last_id = 0
self.can_coalesce = False


@@ -14,7 +14,6 @@ from aiohttp import WSMsgType, web
from aiohttp.http_websocket import WebSocketWriter
from homeassistant.components.http import KEY_HASS, HomeAssistantView
from homeassistant.components.http.const import is_unix_socket_request
from homeassistant.const import EVENT_HOMEASSISTANT_STOP, EVENT_LOGGING_CHANGED
from homeassistant.core import Event, HomeAssistant, callback
from homeassistant.helpers.dispatcher import async_dispatcher_send
@@ -37,12 +36,12 @@ from .error import Disconnect
from .messages import message_to_json_bytes
from .util import describe_request
if TYPE_CHECKING:
from .connection import ActiveConnection
CLOSE_MSG_TYPES = {WSMsgType.CLOSE, WSMsgType.CLOSED, WSMsgType.CLOSING}
AUTH_MESSAGE_TIMEOUT = 10 # seconds
if TYPE_CHECKING:
from .connection import ActiveConnection
_WS_LOGGER: Final = logging.getLogger(f"{__name__}.connection")
@@ -387,45 +386,37 @@ class WebSocketHandler:
send_bytes_text: Callable[[bytes], Coroutine[Any, Any, None]],
) -> ActiveConnection:
"""Handle the auth phase of the websocket connection."""
request = self._request
await send_bytes_text(AUTH_REQUIRED_MESSAGE)
if is_unix_socket_request(request):
# Unix socket requests are pre-authenticated by the HTTP
# auth middleware — skip the token exchange.
connection = await auth.async_handle_unix_socket()
else:
await send_bytes_text(AUTH_REQUIRED_MESSAGE)
# Auth Phase
try:
msg = await self._wsock.receive(AUTH_MESSAGE_TIMEOUT)
except TimeoutError as err:
raise Disconnect(
f"Did not receive auth message within {AUTH_MESSAGE_TIMEOUT} seconds"
) from err
# Auth Phase
try:
msg = await self._wsock.receive(AUTH_MESSAGE_TIMEOUT)
except TimeoutError as err:
if msg.type in (WSMsgType.CLOSE, WSMsgType.CLOSED, WSMsgType.CLOSING):
raise Disconnect("Received close message during auth phase")
if msg.type is not WSMsgType.TEXT:
if msg.type is WSMsgType.ERROR:
# msg.data is the exception
raise Disconnect(
f"Did not receive auth message within {AUTH_MESSAGE_TIMEOUT} seconds"
) from err
if msg.type in (WSMsgType.CLOSE, WSMsgType.CLOSED, WSMsgType.CLOSING):
raise Disconnect("Received close message during auth phase")
if msg.type is not WSMsgType.TEXT:
if msg.type is WSMsgType.ERROR:
# msg.data is the exception
raise Disconnect(
f"Received error message during auth phase: {msg.data}"
)
raise Disconnect(
f"Received non-Text message of type {msg.type} during auth phase"
f"Received error message during auth phase: {msg.data}"
)
raise Disconnect(
f"Received non-Text message of type {msg.type} during auth phase"
)
try:
auth_msg_data = json_loads(msg.data)
except ValueError as err:
raise Disconnect("Received invalid JSON during auth phase") from err
if self._debug:
self._logger.debug("%s: Received %s", self.description, auth_msg_data)
connection = await auth.async_handle(auth_msg_data)
try:
auth_msg_data = json_loads(msg.data)
except ValueError as err:
raise Disconnect("Received invalid JSON during auth phase") from err
if self._debug:
self._logger.debug("%s: Received %s", self.description, auth_msg_data)
connection = await auth.async_handle(auth_msg_data)
# As the webserver is now started before the start
# event we do not want to block for websocket responses
#

View File

@@ -2,9 +2,9 @@
from __future__ import annotations
from collections.abc import Generator
import contextlib
from collections.abc import Callable
import logging
from typing import Any
from pywemo.exceptions import ActionException
@@ -64,23 +64,20 @@ class WemoEntity(CoordinatorEntity[DeviceCoordinator]):
"""Return the device info."""
return self._device_info
@contextlib.contextmanager
def _wemo_call_wrapper(self, message: str) -> Generator[None]:
"""Wrap calls to the device that change its state.
async def _async_wemo_call(self, message: str, action: Callable[[], Any]) -> None:
"""Run a WeMo device action in the executor and update listeners.
1. Takes care of making available=False when communications with the
device fails.
2. Ensures all entities sharing the same coordinator are aware of
updates to the device state.
Handles errors from the device and ensures all entities sharing the
same coordinator are aware of updates to the device state.
"""
try:
yield
await self.hass.async_add_executor_job(action)
except ActionException as err:
_LOGGER.warning("Could not %s for %s (%s)", message, self.name, err)
self.coordinator.last_exception = err
self.coordinator.last_update_success = False # Used for self.available.
self.coordinator.last_update_success = False
finally:
self.hass.add_job(self.coordinator.async_update_listeners)
self.coordinator.async_update_listeners()
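The `_async_wemo_call` pattern above — a blocking library call pushed to the executor with error handling wrapped around it — can be sketched generically. Names and the caught exception type here are illustrative (OSError stands in for pywemo's ActionException):

```python
import asyncio


async def run_device_action(action, mark_failed) -> None:
    """Run a blocking device call in the executor; flag failures.

    Generic sketch of the wrapper above, not the integration's code.
    """
    try:
        # Keep the event loop responsive while the device call blocks
        await asyncio.get_running_loop().run_in_executor(None, action)
    except OSError:
        mark_failed()


calls: list[str] = []
asyncio.run(
    run_device_action(lambda: calls.append("set_state"), lambda: calls.append("failed"))
)
assert calls == ["set_state"]
```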
class WemoBinaryStateEntity(WemoEntity):


@@ -3,6 +3,7 @@
from __future__ import annotations
from datetime import timedelta
import functools as ft
import math
from typing import Any
@@ -60,14 +61,16 @@ async def async_setup_entry(
platform = entity_platform.async_get_current_platform()
# This will call WemoHumidifier.set_humidity(target_humidity=VALUE)
# This will call WemoHumidifier.async_set_humidity(target_humidity=VALUE)
platform.async_register_entity_service(
SERVICE_SET_HUMIDITY, SET_HUMIDITY_SCHEMA, WemoHumidifier.set_humidity.__name__
SERVICE_SET_HUMIDITY,
SET_HUMIDITY_SCHEMA,
WemoHumidifier.async_set_humidity.__name__,
)
# This will call WemoHumidifier.reset_filter_life()
# This will call WemoHumidifier.async_reset_filter_life()
platform.async_register_entity_service(
SERVICE_RESET_FILTER_LIFE, None, WemoHumidifier.reset_filter_life.__name__
SERVICE_RESET_FILTER_LIFE, None, WemoHumidifier.async_reset_filter_life.__name__
)
@@ -124,25 +127,26 @@ class WemoHumidifier(WemoBinaryStateEntity, FanEntity):
self._last_fan_on_mode = self.wemo.fan_mode
super()._handle_coordinator_update()
def turn_on(
async def async_turn_on(
self,
percentage: int | None = None,
preset_mode: str | None = None,
**kwargs: Any,
) -> None:
"""Turn the fan on."""
self._set_percentage(percentage)
await self._async_set_percentage(percentage)
def turn_off(self, **kwargs: Any) -> None:
"""Turn the switch off."""
with self._wemo_call_wrapper("turn off"):
self.wemo.set_state(FanMode.Off)
async def async_turn_off(self, **kwargs: Any) -> None:
"""Turn the fan off."""
await self._async_wemo_call(
"turn off", ft.partial(self.wemo.set_state, FanMode.Off)
)
def set_percentage(self, percentage: int) -> None:
async def async_set_percentage(self, percentage: int) -> None:
"""Set the fan_mode of the Humidifier."""
self._set_percentage(percentage)
await self._async_set_percentage(percentage)
def _set_percentage(self, percentage: int | None) -> None:
async def _async_set_percentage(self, percentage: int | None) -> None:
if percentage is None:
named_speed = self._last_fan_on_mode
elif percentage == 0:
@@ -152,10 +156,11 @@ class WemoHumidifier(WemoBinaryStateEntity, FanEntity):
math.ceil(percentage_to_ranged_value(SPEED_RANGE, percentage))
)
with self._wemo_call_wrapper("set speed"):
self.wemo.set_state(named_speed)
await self._async_wemo_call(
"set speed", ft.partial(self.wemo.set_state, named_speed)
)
def set_humidity(self, target_humidity: float) -> None:
async def async_set_humidity(self, target_humidity: float) -> None:
"""Set the target humidity level for the Humidifier."""
if target_humidity < 50:
pywemo_humidity = DesiredHumidity.FortyFivePercent
@@ -168,10 +173,10 @@ class WemoHumidifier(WemoBinaryStateEntity, FanEntity):
elif target_humidity >= 100:
pywemo_humidity = DesiredHumidity.OneHundredPercent
with self._wemo_call_wrapper("set humidity"):
self.wemo.set_humidity(pywemo_humidity)
await self._async_wemo_call(
"set humidity", ft.partial(self.wemo.set_humidity, pywemo_humidity)
)
def reset_filter_life(self) -> None:
async def async_reset_filter_life(self) -> None:
"""Reset the filter life to 100%."""
with self._wemo_call_wrapper("reset filter life"):
self.wemo.reset_filter_life()
await self._async_wemo_call("reset filter life", self.wemo.reset_filter_life)
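The pattern repeated throughout this diff replaces the synchronous `with self._wemo_call_wrapper(...)` context manager with `await self._async_wemo_call(message, callable)`, binding arguments via `ft.partial`. A minimal standalone sketch of that wrapper, with a hypothetical `FakeWemo` stand-in; the real wrapper catches pywemo's `ActionException` and updates the coordinator, while this sketch only logs:

```python
import asyncio
import functools as ft
import logging

_LOGGER = logging.getLogger(__name__)


class FakeWemo:
    """Stand-in for a blocking pywemo device (hypothetical)."""

    def __init__(self) -> None:
        self.state = None

    def set_state(self, value: str) -> None:
        self.state = value


class Entity:
    """Sketch of the async call wrapper used by the wemo entities."""

    def __init__(self) -> None:
        self.wemo = FakeWemo()

    async def _async_wemo_call(self, message: str, func) -> None:
        # Offload the blocking device call to a worker thread; log
        # failures instead of raising, like the old _wemo_call_wrapper.
        try:
            await asyncio.to_thread(func)
        except Exception as err:  # the real code catches ActionException
            _LOGGER.warning("Could not %s (%s)", message, err)


entity = Entity()
# ft.partial binds the argument so the wrapper can call func() with no args.
asyncio.run(
    entity._async_wemo_call("turn off", ft.partial(entity.wemo.set_state, "Off"))
)
print(entity.wemo.state)  # → Off
```

`ft.partial` is what lets one wrapper serve every entity method: the argument-free callable it produces can be handed to an executor without the wrapper knowing anything about the underlying pywemo signature.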


@@ -2,6 +2,7 @@
from __future__ import annotations
import functools as ft
from typing import Any, cast
from pywemo import Bridge, BridgeLight, Dimmer
@@ -166,7 +167,7 @@ class WemoLight(WemoEntity, LightEntity):
"""Return true if device is on."""
return self.light.state.get("onoff", WEMO_OFF) != WEMO_OFF
def turn_on(self, **kwargs: Any) -> None:
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn the light on."""
xy_color = None
@@ -184,7 +185,7 @@ class WemoLight(WemoEntity, LightEntity):
"force_update": False,
}
with self._wemo_call_wrapper("turn on"):
def _turn_on() -> None:
if xy_color is not None:
self.light.set_color(xy_color, transition=transition_time)
@@ -195,12 +196,14 @@ class WemoLight(WemoEntity, LightEntity):
self.light.turn_on(**turn_on_kwargs)
def turn_off(self, **kwargs: Any) -> None:
await self._async_wemo_call("turn on", _turn_on)
async def async_turn_off(self, **kwargs: Any) -> None:
"""Turn the light off."""
transition_time = int(kwargs.get(ATTR_TRANSITION, 0))
with self._wemo_call_wrapper("turn off"):
self.light.turn_off(transition=transition_time)
await self._async_wemo_call(
"turn off", ft.partial(self.light.turn_off, transition=transition_time)
)
class WemoDimmer(WemoBinaryStateEntity, LightEntity):
@@ -216,20 +219,19 @@ class WemoDimmer(WemoBinaryStateEntity, LightEntity):
wemo_brightness: int = self.wemo.get_brightness()
return int((wemo_brightness * 255) / 100)
def turn_on(self, **kwargs: Any) -> None:
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn the dimmer on."""
# Wemo dimmer switches use a range of [0, 100] to control
# brightness. Level 255 might mean to set it to previous value
if ATTR_BRIGHTNESS in kwargs:
brightness = kwargs[ATTR_BRIGHTNESS]
brightness = int((brightness / 255) * 100)
with self._wemo_call_wrapper("set brightness"):
self.wemo.set_brightness(brightness)
await self._async_wemo_call(
"set brightness", ft.partial(self.wemo.set_brightness, brightness)
)
else:
with self._wemo_call_wrapper("turn on"):
self.wemo.on()
await self._async_wemo_call("turn on", self.wemo.on)
def turn_off(self, **kwargs: Any) -> None:
async def async_turn_off(self, **kwargs: Any) -> None:
"""Turn the dimmer off."""
with self._wemo_call_wrapper("turn off"):
self.wemo.off()
await self._async_wemo_call("turn off", self.wemo.off)


@@ -119,12 +119,10 @@ class WemoSwitch(WemoBinaryStateEntity, SwitchEntity):
return "mdi:coffee"
return None
def turn_on(self, **kwargs: Any) -> None:
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn the switch on."""
with self._wemo_call_wrapper("turn on"):
self.wemo.on()
await self._async_wemo_call("turn on", self.wemo.on)
def turn_off(self, **kwargs: Any) -> None:
async def async_turn_off(self, **kwargs: Any) -> None:
"""Turn the switch off."""
with self._wemo_call_wrapper("turn off"):
self.wemo.off()
await self._async_wemo_call("turn off", self.wemo.off)


@@ -44,6 +44,16 @@ MAP_JOIN_RESTRICTIONS = {
"followed": "joinable",
}
MAP_PLATFORM_NAME = {
"Android": "Android",
"iOS": "iOS",
"Nintendo": "Nintendo Switch",
"Scarlett": "Xbox Series X|S",
"WindowsOneCore": "Windows",
"Xbox360": "Xbox 360",
"XboxOne": "Xbox One",
}
class XboxSensor(StrEnum):
"""Xbox sensor."""
@@ -63,6 +73,9 @@ class XboxSensor(StrEnum):
FREE_STORAGE = "free_storage"
PRESENCE_ACTIVE = "Active"
@dataclass(kw_only=True, frozen=True)
class XboxSensorEntityDescription(XboxBaseEntityDescription, SensorEntityDescription):
"""Xbox sensor description."""
@@ -79,7 +92,7 @@ class XboxStorageDeviceSensorEntityDescription(
value_fn: Callable[[StorageDevice], StateType]
def now_playing_attributes(_: Person, title: Title | None) -> dict[str, Any]:
def now_playing_attributes(person: Person, title: Title | None) -> dict[str, Any]:
"""Attributes of the currently played title."""
attributes: dict[str, Any] = {
"short_description": None,
@@ -91,9 +104,35 @@ def now_playing_attributes(_: Person, title: Title | None) -> dict[str, Any]:
"achievements": None,
"gamerscore": None,
"progress": None,
"platform": None,
}
if person.presence_details:
active_entry = next(
(
d
for d in person.presence_details
if d.state == PRESENCE_ACTIVE and d.is_game
),
None,
) or next(
(d for d in person.presence_details if d.state == PRESENCE_ACTIVE),
None,
)
if active_entry:
platform = active_entry.device
if platform == "Scarlett" and title and title.devices:
if "Xbox360" in title.devices:
platform = "Xbox360"
elif "XboxOne" in title.devices:
platform = "XboxOne"
attributes["platform"] = MAP_PLATFORM_NAME.get(platform, platform)
if not title:
return attributes
if title.detail is not None:
attributes.update(
{

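The platform-resolution fallback added to `now_playing_attributes` can be exercised in isolation. A condensed sketch using plain dicts in place of the real `Person`/`Title` models (field names hypothetical; the platform map is trimmed to the consoles involved):

```python
MAP_PLATFORM_NAME = {
    "Scarlett": "Xbox Series X|S",
    "Xbox360": "Xbox 360",
    "XboxOne": "Xbox One",
}


def resolve_platform(presence_details, title_devices):
    """Pick the active session's device, preferring game sessions."""
    active = next(
        (d for d in presence_details if d["state"] == "Active" and d["is_game"]),
        None,
    ) or next((d for d in presence_details if d["state"] == "Active"), None)
    if active is None:
        return None
    platform = active["device"]
    # Backwards-compatible titles report the console they were made for;
    # prefer that over the physical "Scarlett" (Series X|S) device.
    if platform == "Scarlett" and title_devices:
        if "Xbox360" in title_devices:
            platform = "Xbox360"
        elif "XboxOne" in title_devices:
            platform = "XboxOne"
    return MAP_PLATFORM_NAME.get(platform, platform)


print(resolve_platform(
    [{"state": "Active", "is_game": True, "device": "Scarlett"}],
    ["XboxOne"],
))  # → Xbox One
```

Devices not in the map fall through unchanged via `dict.get(platform, platform)`, so an unexpected platform string is surfaced as-is rather than dropped.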

@@ -141,6 +141,7 @@
},
"genres": { "name": "Genres" },
"min_age": { "name": "Minimum age" },
"platform": { "name": "Platform" },
"progress": { "name": "Progress" },
"publisher": { "name": "Publisher" },
"release_date": { "name": "Release date" },


@@ -13,7 +13,7 @@ from __future__ import annotations
from collections import defaultdict
from collections.abc import Callable, Hashable, KeysView, Mapping
from datetime import datetime, timedelta
from enum import StrEnum
from enum import Enum, StrEnum
import logging
import time
from typing import TYPE_CHECKING, Any, Literal, NotRequired, TypedDict
@@ -80,7 +80,7 @@ EVENT_ENTITY_REGISTRY_UPDATED: EventType[EventEntityRegistryUpdatedData] = Event
_LOGGER = logging.getLogger(__name__)
STORAGE_VERSION_MAJOR = 1
STORAGE_VERSION_MINOR = 20
STORAGE_VERSION_MINOR = 21
STORAGE_KEY = "core.entity_registry"
CLEANUP_INTERVAL = 3600 * 24
@@ -91,6 +91,28 @@ ENTITY_CATEGORY_VALUE_TO_INDEX: dict[EntityCategory | None, int] = {
}
ENTITY_CATEGORY_INDEX_TO_VALUE = dict(enumerate(EntityCategory))
class ComputedNameType(Enum):
"""Singleton representing the computed full entity name in aliases."""
_singleton = 0
COMPUTED_NAME = ComputedNameType._singleton # noqa: SLF001
type AliasEntry = str | ComputedNameType
def _serialize_aliases(aliases: list[AliasEntry]) -> list[str | None]:
"""Convert aliases to a JSON-serializable list."""
return [None if a is COMPUTED_NAME else a for a in aliases]
def _deserialize_aliases(aliases: list[str | None]) -> list[AliasEntry]:
"""Convert aliases from JSON to internal representation."""
return [COMPUTED_NAME if a is None else a for a in aliases]
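The sentinel round-trip can be checked in isolation: `COMPUTED_NAME` entries are stored as JSON `null` and restored on load, while plain string aliases pass through untouched. A self-contained sketch mirroring the helpers above:

```python
from enum import Enum


class ComputedNameType(Enum):
    """Singleton sentinel for the computed full entity name."""

    _singleton = 0


COMPUTED_NAME = ComputedNameType._singleton  # noqa: SLF001


def serialize_aliases(aliases):
    """Convert aliases to a JSON-serializable list (sentinel -> None)."""
    return [None if a is COMPUTED_NAME else a for a in aliases]


def deserialize_aliases(aliases):
    """Restore the sentinel from JSON (None -> COMPUTED_NAME)."""
    return [COMPUTED_NAME if a is None else a for a in aliases]


aliases = [COMPUTED_NAME, "Reading lamp"]
stored = serialize_aliases(aliases)
print(stored)  # → [None, 'Reading lamp']
assert deserialize_aliases(stored) == aliases
```

Using an `Enum` singleton (the same trick as Home Assistant's `UndefinedType`) rather than `None` itself means identity checks (`a is COMPUTED_NAME`) stay unambiguous, and the list representation preserves the user-defined alias order that a `set` previously discarded.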
# Attributes relevant to describing entity
# to external services.
ENTITY_DESCRIBING_ATTRIBUTES = {
@@ -184,7 +206,7 @@ class RegistryEntry:
unique_id: str = attr.ib()
platform: str = attr.ib()
previous_unique_id: str | None = attr.ib(default=None)
aliases: set[str] = attr.ib(factory=set)
aliases: list[AliasEntry] = attr.ib(factory=list)
area_id: str | None = attr.ib(default=None)
categories: dict[str, str] = attr.ib(factory=dict)
capabilities: Mapping[str, Any] | None = attr.ib()
@@ -215,6 +237,11 @@ class RegistryEntry:
supported_features: int = attr.ib()
translation_key: str | None = attr.ib()
unit_of_measurement: str | None = attr.ib()
# For backwards compatibility, should be removed in the future
compat_aliases: list[str] = attr.ib(factory=list, eq=False)
compat_name: str | None = attr.ib(default=None, eq=False)
_cache: dict[str, Any] = attr.ib(factory=dict, eq=False, init=False)
@domain.default
@@ -252,7 +279,7 @@ class RegistryEntry:
display_dict["hb"] = True
if self.has_entity_name:
display_dict["hn"] = True
name = self.name or self.original_name
name = self.name if self.name is not None else self.original_name
if name is not None:
display_dict["en"] = name
if self.domain == "sensor" and (sensor_options := self.options.get("sensor")):
@@ -320,7 +347,7 @@ class RegistryEntry:
# it every time
return {
**self.as_partial_dict,
"aliases": list(self.aliases),
"aliases": _serialize_aliases(self.aliases),
"capabilities": self.capabilities,
"device_class": self.device_class,
"original_device_class": self.original_device_class,
@@ -349,7 +376,8 @@ class RegistryEntry:
return json_fragment(
json_bytes(
{
"aliases": list(self.aliases),
"aliases": self.compat_aliases,
"aliases_v2": _serialize_aliases(self.aliases),
"area_id": self.area_id,
"categories": self.categories,
"capabilities": self.capabilities,
@@ -367,7 +395,8 @@ class RegistryEntry:
"has_entity_name": self.has_entity_name,
"labels": list(self.labels),
"modified_at": self.modified_at,
"name": self.name,
"name": self.compat_name,
"name_v2": self.name,
"object_id_base": self.object_id_base,
"options": self.options,
"original_device_class": self.original_device_class,
@@ -414,7 +443,7 @@ class RegistryEntry:
@callback
def _async_get_full_entity_name_generic(
def _async_get_full_entity_name(
hass: HomeAssistant,
*,
device_id: str | None,
@@ -430,13 +459,14 @@ def _async_get_full_entity_name_generic(
Used for both full entity name and entity ID.
"""
use_device = False
if name is None:
if overridden_name is not None:
name = overridden_name
else:
name = original_name
if has_entity_name:
use_device = True
if name is not None:
use_device = True
elif overridden_name is not None:
name = overridden_name
else:
name = original_name
if has_entity_name:
use_device = True
device = (
dr.async_get(hass).async_get(device_id)
@@ -467,7 +497,7 @@ def async_get_full_entity_name(
original_name = (
original_name if original_name is not UNDEFINED else entry.original_name
)
return _async_get_full_entity_name_generic(
return _async_get_full_entity_name(
hass,
device_id=entry.device_id,
fallback="",
@@ -477,6 +507,82 @@ def async_get_full_entity_name(
)
@callback
def async_get_entity_aliases(
hass: HomeAssistant,
entry: RegistryEntry,
*,
allow_empty: bool = True,
) -> list[str]:
"""Get all names/aliases for an entity.
Processes entry aliases where COMPUTED_NAME entries are replaced with the
computed full entity name. String entries are used as-is.
The returned list preserves the order set by the user.
"""
entry_aliases = entry.aliases
if not entry_aliases:
if allow_empty:
return []
entry_aliases = [COMPUTED_NAME]
aliases = []
for alias in entry_aliases:
if alias is COMPUTED_NAME:
alias = async_get_full_entity_name(hass, entry)
aliases.append(alias.strip())
return aliases
@callback
def _async_strip_prefix_from_entity_name(
entity_name: str | None, prefix: str | None
) -> str | None:
"""Strip prefix from entity name.
Returns None if the prefix does not meaningfully match.
"""
if not entity_name or not prefix:
return None
prefix_lower = prefix.casefold()
prefix_len = len(prefix_lower)
candidate = entity_name[:prefix_len]
true_prefix_len = len(candidate)
candidate = candidate.casefold()
if not candidate.startswith(prefix_lower):
return None
# Casefolded string can differ in length
prefix_diff = len(candidate) - prefix_len
while prefix_diff > 0:
true_prefix_len -= 1
prefix_diff -= len(entity_name[true_prefix_len].casefold())
# Casefolded string matched in the middle of a character, not a valid prefix
if prefix_diff < 0:
return None
new_name = entity_name[true_prefix_len:].lstrip(" -:")
if not new_name:
return ""
# Must have at least one separator character
if len(new_name) == len(entity_name) - true_prefix_len:
return None
first_word = new_name.partition(" ")[0]
# Preserve a mixed-case word, capitalize lowercase
if not first_word.islower():
return new_name
return new_name[0].upper() + new_name[1:]
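The prefix-stripping helper is self-contained and easy to probe. Repeating it here (verbatim from the diff, minus the `@callback` decorator and type hints) with a few illustrative inputs:

```python
def strip_prefix_from_entity_name(entity_name, prefix):
    """Strip prefix from entity name; None if it does not meaningfully match."""
    if not entity_name or not prefix:
        return None
    prefix_lower = prefix.casefold()
    prefix_len = len(prefix_lower)
    candidate = entity_name[:prefix_len]
    true_prefix_len = len(candidate)
    candidate = candidate.casefold()
    if not candidate.startswith(prefix_lower):
        return None
    # Casefolding can change the string length (e.g. "ß" -> "ss")
    prefix_diff = len(candidate) - prefix_len
    while prefix_diff > 0:
        true_prefix_len -= 1
        prefix_diff -= len(entity_name[true_prefix_len].casefold())
    if prefix_diff < 0:
        return None
    new_name = entity_name[true_prefix_len:].lstrip(" -:")
    if not new_name:
        return ""
    # Must have at least one separator character after the prefix
    if len(new_name) == len(entity_name) - true_prefix_len:
        return None
    first_word = new_name.partition(" ")[0]
    # Preserve a mixed-case word, capitalize lowercase
    if not first_word.islower():
        return new_name
    return new_name[0].upper() + new_name[1:]


print(strip_prefix_from_entity_name("Kitchen Light", "Kitchen"))  # → Light
print(strip_prefix_from_entity_name("kitchen light", "Kitchen"))  # → Light
print(strip_prefix_from_entity_name("KitchenLight", "Kitchen"))   # → None
```

Note the three distinct outcomes: a stripped (and recapitalized) remainder, an empty string when the name is exactly the prefix, and `None` when the match is rejected for lacking a separator.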
@attr.s(frozen=True, slots=True)
class DeletedRegistryEntry:
"""Deleted Entity Registry Entry."""
@@ -485,7 +591,7 @@ class DeletedRegistryEntry:
unique_id: str = attr.ib()
platform: str = attr.ib()
aliases: set[str] = attr.ib()
aliases: list[AliasEntry] = attr.ib()
area_id: str | None = attr.ib()
categories: dict[str, str] = attr.ib()
config_entry_id: str | None = attr.ib()
@@ -505,6 +611,10 @@ class DeletedRegistryEntry:
)
orphaned_timestamp: float | None = attr.ib()
# For backwards compatibility, should be removed in the future
compat_aliases: list[str] = attr.ib(factory=list, eq=False)
compat_name: str | None = attr.ib(default=None, eq=False)
_cache: dict[str, Any] = attr.ib(factory=dict, eq=False, init=False)
@domain.default
@@ -518,7 +628,8 @@ class DeletedRegistryEntry:
return json_fragment(
json_bytes(
{
"aliases": list(self.aliases),
"aliases": self.compat_aliases,
"aliases_v2": _serialize_aliases(self.aliases),
"area_id": self.area_id,
"categories": self.categories,
"config_entry_id": self.config_entry_id,
@@ -538,7 +649,8 @@ class DeletedRegistryEntry:
"id": self.id,
"labels": list(self.labels),
"modified_at": self.modified_at,
"name": self.name,
"name": self.compat_name,
"name_v2": self.name,
"options": self.options if self.options is not UNDEFINED else {},
"options_undefined": self.options is UNDEFINED,
"orphaned_timestamp": self.orphaned_timestamp,
@@ -691,6 +803,48 @@ class EntityRegistryStore(storage.Store[dict[str, list[dict[str, Any]]]]):
for entity in data["entities"]:
entity["object_id_base"] = entity["original_name"]
if old_minor_version < 21:
# Version 1.21 migrates the full name to include device name,
# even if entity name is overwritten by user.
# It also adds support for COMPUTED_NAME in aliases and starts preserving their order.
# To avoid a major version bump, we keep the old name and aliases as-is
# and use new name_v2 and aliases_v2 fields instead.
device_registry = dr.async_get(self.hass)
for entity in data["entities"]:
alias_to_add: str | None = None
if (
(name := entity["name"])
and (device_id := entity["device_id"]) is not None
and (device := device_registry.async_get(device_id)) is not None
and (device_name := device.name_by_user or device.name)
):
# Strip the device name prefix from the entity name if present,
# and add the full generated name as an alias.
# If the name doesn't have the device name prefix and the
# entity is exposed to a voice assistant, add the previous
# name as an alias instead to preserve backwards compatibility.
if (
new_name := _async_strip_prefix_from_entity_name(
name, device_name
)
) is not None:
name = new_name
elif any(
entity.get("options", {}).get(key, {}).get("should_expose")
for key in ("conversation", "cloud.google_assistant")
):
alias_to_add = name
entity["name_v2"] = name
entity["aliases_v2"] = [alias_to_add, *entity["aliases"]]
for entity in data["deleted_entities"]:
# We don't know what the device name was, so the only thing we can do
# is to clear the overwritten name to not mislead users.
entity["name_v2"] = None
entity["aliases_v2"] = [None, *entity["aliases"]]
if old_major_version > 1:
raise NotImplementedError
return data
@@ -1029,13 +1183,15 @@ class EntityRegistry(BaseRegistry):
`name` is the name set by the user, not the original name from the integration.
`name` has priority over `suggested_object_id`, which has priority
over `object_id_base`.
`name` and `suggested_object_id` will never be prefixed with the device name,
`object_id_base` will be if `has_entity_name` is True.
`name` will always be prefixed with the device name.
`suggested_object_id` will not be prefixed with the device name.
`object_id_base` will be prefixed with the device name if
`has_entity_name` is True.
Entity ID conflicts are checked against registered and currently
existing entities, as well as provided `reserved_entity_ids`.
"""
object_id = _async_get_full_entity_name_generic(
object_id = _async_get_full_entity_name(
self.hass,
device_id=device_id,
fallback=f"{platform}_{unique_id}",
@@ -1159,6 +1315,8 @@ class EntityRegistry(BaseRegistry):
aliases = deleted_entity.aliases
area_id = deleted_entity.area_id
categories = deleted_entity.categories
compat_aliases = deleted_entity.compat_aliases
compat_name = deleted_entity.compat_name
created_at = deleted_entity.created_at
device_class = deleted_entity.device_class
if deleted_entity.disabled_by is not UNDEFINED:
@@ -1186,9 +1344,11 @@ class EntityRegistry(BaseRegistry):
else:
options = get_initial_options() if get_initial_options else None
else:
aliases = set()
aliases = [COMPUTED_NAME]
area_id = None
categories = {}
compat_aliases = []
compat_name = None
device_class = None
icon = None
labels = set()
@@ -1230,6 +1390,8 @@ class EntityRegistry(BaseRegistry):
area_id=area_id,
categories=categories,
capabilities=none_if_undefined(capabilities),
compat_aliases=compat_aliases,
compat_name=compat_name,
config_entry_id=none_if_undefined(config_entry_id),
config_subentry_id=none_if_undefined(config_subentry_id),
created_at=created_at,
@@ -1290,6 +1452,8 @@ class EntityRegistry(BaseRegistry):
aliases=entity.aliases,
area_id=entity.area_id,
categories=entity.categories,
compat_aliases=entity.compat_aliases,
compat_name=entity.compat_name,
config_entry_id=config_entry_id,
config_subentry_id=entity.config_subentry_id,
created_at=entity.created_at,
@@ -1422,7 +1586,7 @@ class EntityRegistry(BaseRegistry):
self,
entity_id: str,
*,
aliases: set[str] | UndefinedType = UNDEFINED,
aliases: list[AliasEntry] | UndefinedType = UNDEFINED,
area_id: str | None | UndefinedType = UNDEFINED,
categories: dict[str, str] | UndefinedType = UNDEFINED,
capabilities: Mapping[str, Any] | None | UndefinedType = UNDEFINED,
@@ -1573,7 +1737,7 @@ class EntityRegistry(BaseRegistry):
self,
entity_id: str,
*,
aliases: set[str] | UndefinedType = UNDEFINED,
aliases: list[AliasEntry] | UndefinedType = UNDEFINED,
area_id: str | None | UndefinedType = UNDEFINED,
categories: dict[str, str] | UndefinedType = UNDEFINED,
capabilities: Mapping[str, Any] | None | UndefinedType = UNDEFINED,
@@ -1715,10 +1879,12 @@ class EntityRegistry(BaseRegistry):
continue
entities[entity["entity_id"]] = RegistryEntry(
aliases=set(entity["aliases"]),
aliases=_deserialize_aliases(entity["aliases_v2"]),
area_id=entity["area_id"],
categories=entity["categories"],
capabilities=entity["capabilities"],
compat_aliases=entity["aliases"],
compat_name=entity["name"],
config_entry_id=entity["config_entry_id"],
config_subentry_id=entity["config_subentry_id"],
created_at=datetime.fromisoformat(entity["created_at"]),
@@ -1739,7 +1905,7 @@ class EntityRegistry(BaseRegistry):
has_entity_name=entity["has_entity_name"],
labels=set(entity["labels"]),
modified_at=datetime.fromisoformat(entity["modified_at"]),
name=entity["name"],
name=entity["name_v2"],
object_id_base=entity.get("object_id_base"),
options=entity["options"],
original_device_class=entity["original_device_class"],
@@ -1785,9 +1951,11 @@ class EntityRegistry(BaseRegistry):
entity["unique_id"],
)
deleted_entities[key] = DeletedRegistryEntry(
aliases=set(entity["aliases"]),
aliases=_deserialize_aliases(entity["aliases_v2"]),
area_id=entity["area_id"],
categories=entity["categories"],
compat_aliases=entity["aliases"],
compat_name=entity["name"],
config_entry_id=entity["config_entry_id"],
config_subentry_id=entity["config_subentry_id"],
created_at=datetime.fromisoformat(entity["created_at"]),
@@ -1807,7 +1975,7 @@ class EntityRegistry(BaseRegistry):
id=entity["id"],
labels=set(entity["labels"]),
modified_at=datetime.fromisoformat(entity["modified_at"]),
name=entity["name"],
name=entity["name_v2"],
options=entity["options"]
if not entity["options_undefined"]
else UNDEFINED,


@@ -415,6 +415,7 @@ def _normalize_name(name: str) -> str:
def _filter_by_name(
hass: HomeAssistant,
name: str,
candidates: Iterable[MatchTargetsCandidate],
) -> Iterable[MatchTargetsCandidate]:
@@ -422,31 +423,19 @@ def _filter_by_name(
name_norm = _normalize_name(name)
for candidate in candidates:
# Accept name or entity id
if (candidate.state.entity_id == name) or _normalize_name(
candidate.state.name
) == name_norm:
# Accept entity id
if candidate.state.entity_id == name:
candidate.matched_name = name
yield candidate
continue
if candidate.entity is None:
continue
if candidate.entity.name and (
_normalize_name(candidate.entity.name) == name_norm
for candidate_name in async_get_entity_aliases(
hass, candidate.entity, state=candidate.state
):
candidate.matched_name = name
yield candidate
continue
# Check aliases
if candidate.entity.aliases:
for alias in candidate.entity.aliases:
if _normalize_name(alias) == name_norm:
candidate.matched_name = name
yield candidate
break
if _normalize_name(candidate_name) == name_norm:
candidate.matched_name = name
yield candidate
break
def _filter_by_features(
@@ -583,7 +572,7 @@ def async_match_targets( # noqa: C901
if constraints.name:
# Filter by entity name or alias
candidates = list(_filter_by_name(constraints.name, candidates))
candidates = list(_filter_by_name(hass, constraints.name, candidates))
if not candidates:
return MatchTargetsResult(False, MatchFailedReason.NAME)
@@ -1501,3 +1490,25 @@ class IntentResponse:
response_dict["data"] = response_data
return response_dict
@callback
def async_get_entity_aliases(
hass: HomeAssistant,
entity_entry: er.RegistryEntry | None,
*,
state: State,
allow_empty: bool = True,
) -> list[str]:
"""Get all names/aliases for an entity.
If no entity registry entry is provided, returns a list with just the
state name. Otherwise, delegates to the entity registry to resolve aliases,
where COMPUTED_NAME aliases are replaced with the computed full entity name.
The returned list preserves the order set by the user.
"""
if entity_entry is None:
return [state.name.strip()]
return er.async_get_entity_aliases(hass, entity_entry, allow_empty=allow_empty)


@@ -659,26 +659,34 @@ def _get_exposed_entities(
continue
entity_entry = entity_registry.async_get(state.entity_id)
names = [state.name]
device_entry = (
device_registry.async_get(entity_entry.device_id)
if entity_entry is not None and entity_entry.device_id is not None
else None
)
names = intent.async_get_entity_aliases(hass, entity_entry, state=state)
area_names = []
if entity_entry is not None:
names.extend(entity_entry.aliases)
if entity_entry.area_id and (
area := area_registry.async_get_area(entity_entry.area_id)
if (
entity_entry.area_id is not None
and (area_entry := area_registry.async_get_area(entity_entry.area_id))
is not None
):
# Entity is in area
area_names.append(area.name)
area_names.extend(area.aliases)
elif entity_entry.device_id and (
device := device_registry.async_get(entity_entry.device_id)
):
area_names.append(area_entry.name)
area_names.extend(area_entry.aliases)
elif device_entry is not None:
# Check device area
if device.area_id and (
area := area_registry.async_get_area(device.area_id)
if (
device_entry.area_id is not None
and (
area_entry := area_registry.async_get_area(device_entry.area_id)
)
is not None
):
area_names.append(area.name)
area_names.extend(area.aliases)
area_names.append(area_entry.name)
area_names.extend(area_entry.aliases)
info: dict[str, Any] = {
"names": ", ".join(names),
@@ -919,12 +927,10 @@ def _get_cached_action_parameters(
entity_registry = er.async_get(hass)
if (
entity_id := entity_registry.async_get_entity_id(domain, domain, action)
) and (entity_entry := entity_registry.async_get(entity_id)):
aliases: list[str] = []
if entity_entry.name:
aliases.append(entity_entry.name)
if entity_entry.aliases:
aliases.extend(entity_entry.aliases)
) is not None and (
entity_entry := entity_registry.async_get(entity_id)
) is not None:
aliases = er.async_get_entity_aliases(hass, entity_entry)
if aliases:
if description:
description = description + ". Aliases: " + str(list(aliases))

machine/build.yaml generated

@@ -1,10 +0,0 @@
image: ghcr.io/home-assistant/{machine}-homeassistant
build_from:
aarch64: "ghcr.io/home-assistant/aarch64-homeassistant:"
amd64: "ghcr.io/home-assistant/amd64-homeassistant:"
cosign:
base_identity: https://github.com/home-assistant/core/.*
identity: https://github.com/home-assistant/core/.*
labels:
io.hass.type: core
org.opencontainers.image.source: https://github.com/home-assistant/core

machine/generic-x86-64 generated

@@ -1,7 +1,10 @@
ARG \
BUILD_FROM
FROM $BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/amd64-homeassistant:latest
FROM ${BUILD_FROM}
RUN apk --no-cache add \
libva-intel-driver
LABEL io.hass.machine="generic-x86-64"

machine/green generated

@@ -1,4 +1,7 @@
ARG \
BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/aarch64-homeassistant:latest
FROM ${BUILD_FROM}
FROM $BUILD_FROM
LABEL io.hass.machine="green"

machine/intel-nuc generated

@@ -1,10 +1,10 @@
ARG \
BUILD_FROM
FROM $BUILD_FROM
# NOTE: intel-nuc will be replaced by generic-x86-64. Make sure to apply
# changes in generic-x86-64 as well.
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/amd64-homeassistant:latest
FROM ${BUILD_FROM}
RUN apk --no-cache add \
libva-intel-driver
LABEL io.hass.machine="intel-nuc"

machine/khadas-vim3 generated

@@ -1,4 +1,7 @@
ARG \
BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/aarch64-homeassistant:latest
FROM ${BUILD_FROM}
FROM $BUILD_FROM
LABEL io.hass.machine="khadas-vim3"

machine/odroid-c2 generated

@@ -1,4 +1,7 @@
ARG \
BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/aarch64-homeassistant:latest
FROM ${BUILD_FROM}
FROM $BUILD_FROM
LABEL io.hass.machine="odroid-c2"

machine/odroid-c4 generated

@@ -1,4 +1,7 @@
ARG \
BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/aarch64-homeassistant:latest
FROM ${BUILD_FROM}
FROM $BUILD_FROM
LABEL io.hass.machine="odroid-c4"

machine/odroid-m1 generated

@@ -1,4 +1,7 @@
ARG \
BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/aarch64-homeassistant:latest
FROM ${BUILD_FROM}
FROM $BUILD_FROM
LABEL io.hass.machine="odroid-m1"

machine/odroid-n2 generated

@@ -1,4 +1,7 @@
ARG \
BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/aarch64-homeassistant:latest
FROM ${BUILD_FROM}
FROM $BUILD_FROM
LABEL io.hass.machine="odroid-n2"

machine/qemuarm-64 generated

@@ -1,4 +1,7 @@
ARG \
BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/aarch64-homeassistant:latest
FROM ${BUILD_FROM}
FROM $BUILD_FROM
LABEL io.hass.machine="qemuarm-64"

machine/qemux86-64 generated

@@ -1,4 +1,7 @@
ARG \
BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/amd64-homeassistant:latest
FROM ${BUILD_FROM}
FROM $BUILD_FROM
LABEL io.hass.machine="qemux86-64"


@@ -1,7 +1,10 @@
ARG \
BUILD_FROM
FROM $BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/aarch64-homeassistant:latest
FROM ${BUILD_FROM}
RUN apk --no-cache add \
raspberrypi-utils
raspberrypi-utils
LABEL io.hass.machine="raspberrypi3-64"


@@ -1,7 +1,10 @@
ARG \
BUILD_FROM
FROM $BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/aarch64-homeassistant:latest
FROM ${BUILD_FROM}
RUN apk --no-cache add \
raspberrypi-utils
raspberrypi-utils
LABEL io.hass.machine="raspberrypi4-64"


@@ -1,7 +1,10 @@
ARG \
BUILD_FROM
FROM $BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/aarch64-homeassistant:latest
FROM ${BUILD_FROM}
RUN apk --no-cache add \
raspberrypi-utils
raspberrypi-utils
LABEL io.hass.machine="raspberrypi5-64"

machine/yellow generated

@@ -1,7 +1,10 @@
ARG \
BUILD_FROM
FROM $BUILD_FROM
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/aarch64-homeassistant:latest
FROM ${BUILD_FROM}
RUN apk --no-cache add \
raspberrypi-utils
raspberrypi-utils
LABEL io.hass.machine="yellow"


@@ -25,7 +25,6 @@ LABEL \
org.opencontainers.image.description="Open-source home automation platform running on Python 3" \
org.opencontainers.image.documentation="https://www.home-assistant.io/docs/" \
org.opencontainers.image.licenses="Apache-2.0" \
org.opencontainers.image.source="https://github.com/home-assistant/core" \
org.opencontainers.image.title="Home Assistant" \
org.opencontainers.image.url="https://www.home-assistant.io/"
@@ -77,6 +76,59 @@ RUN \
WORKDIR /config
"""
@dataclass(frozen=True)
class _MachineConfig:
"""Machine-specific Dockerfile configuration."""
arch: str
packages: tuple[str, ...] = ()
_MACHINES = {
"generic-x86-64": _MachineConfig(arch="amd64", packages=("libva-intel-driver",)),
"green": _MachineConfig(arch="aarch64"),
"intel-nuc": _MachineConfig(arch="amd64", packages=("libva-intel-driver",)),
"khadas-vim3": _MachineConfig(arch="aarch64"),
"odroid-c2": _MachineConfig(arch="aarch64"),
"odroid-c4": _MachineConfig(arch="aarch64"),
"odroid-m1": _MachineConfig(arch="aarch64"),
"odroid-n2": _MachineConfig(arch="aarch64"),
"qemuarm-64": _MachineConfig(arch="aarch64"),
"qemux86-64": _MachineConfig(arch="amd64"),
"raspberrypi3-64": _MachineConfig(arch="aarch64", packages=("raspberrypi-utils",)),
"raspberrypi4-64": _MachineConfig(arch="aarch64", packages=("raspberrypi-utils",)),
"raspberrypi5-64": _MachineConfig(arch="aarch64", packages=("raspberrypi-utils",)),
"yellow": _MachineConfig(arch="aarch64", packages=("raspberrypi-utils",)),
}
_MACHINE_DOCKERFILE_TEMPLATE = r"""# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/{arch}-homeassistant:latest
FROM ${{BUILD_FROM}}
{extra_packages}
LABEL io.hass.machine="{machine}"
"""
def _generate_machine_dockerfile(
machine_name: str, machine_config: _MachineConfig
) -> str:
"""Generate a machine Dockerfile from configuration."""
if machine_config.packages:
pkg_lines = " \\\n ".join(machine_config.packages)
extra_packages = f"\nRUN apk --no-cache add \\\n {pkg_lines}\n"
else:
extra_packages = ""
return _MACHINE_DOCKERFILE_TEMPLATE.format(
arch=machine_config.arch,
extra_packages=extra_packages,
machine=machine_name,
)
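The template-plus-config approach above can be sketched end to end. A condensed version, with blank-line layout simplified relative to the generated files shown earlier in this diff:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class MachineConfig:
    """Machine-specific Dockerfile configuration."""

    arch: str
    packages: tuple = ()


TEMPLATE = """\
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/{arch}-homeassistant:latest
FROM ${{BUILD_FROM}}
{extra_packages}LABEL io.hass.machine="{machine}"
"""


def generate_machine_dockerfile(machine, config):
    """Render a machine Dockerfile from its configuration."""
    if config.packages:
        # Continuation lines are indented to match the generated-file style.
        pkg_lines = " \\\n    ".join(config.packages)
        extra_packages = f"RUN apk --no-cache add \\\n    {pkg_lines}\n"
    else:
        extra_packages = ""
    return TEMPLATE.format(
        arch=config.arch, extra_packages=extra_packages, machine=machine
    )


print(generate_machine_dockerfile(
    "yellow", MachineConfig(arch="aarch64", packages=("raspberrypi-utils",))
))
```

Doubling the braces in `${{BUILD_FROM}}` is what lets a literal Docker `${BUILD_FROM}` survive `str.format`, while single-brace fields like `{arch}` are substituted; the `ARG` default means each generated Dockerfile also builds standalone without a `--build-arg`.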
_HASSFEST_TEMPLATE = r"""# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
@@ -174,7 +226,7 @@ def _generate_files(config: Config) -> list[File]:
config.root / "requirements_test_pre_commit.txt", {"ruff"}
)
return [
files = [
File(
DOCKERFILE_TEMPLATE.format(
timeout=timeout,
@@ -192,6 +244,16 @@ def _generate_files(config: Config) -> list[File]:
),
]
for machine_name, machine_config in sorted(_MACHINES.items()):
files.append(
File(
_generate_machine_dockerfile(machine_name, machine_config),
config.root / "machine" / machine_name,
)
)
return files
def validate(integrations: dict[str, Integration], config: Config) -> None:
"""Validate dockerfile."""

@@ -1419,7 +1419,6 @@ INTEGRATIONS_WITHOUT_SCALE = [
"greenwave",
"group",
"gtfs",
"growatt_server",
"guardian",
"harman_kardon_avr",
"harmony",

@@ -1,909 +1 @@
"""The tests for components."""
from collections.abc import Iterable
from enum import StrEnum
import itertools
from typing import Any, TypedDict
import pytest
from homeassistant.const import (
ATTR_AREA_ID,
ATTR_DEVICE_ID,
ATTR_FLOOR_ID,
ATTR_LABEL_ID,
CONF_ABOVE,
CONF_BELOW,
CONF_CONDITION,
CONF_ENTITY_ID,
CONF_OPTIONS,
CONF_PLATFORM,
CONF_TARGET,
STATE_UNAVAILABLE,
STATE_UNKNOWN,
)
from homeassistant.core import HomeAssistant, ServiceCall
from homeassistant.helpers import (
area_registry as ar,
device_registry as dr,
entity_registry as er,
floor_registry as fr,
label_registry as lr,
)
from homeassistant.helpers.condition import (
ConditionCheckerTypeOptional,
async_from_config as async_condition_from_config,
)
from homeassistant.helpers.trigger import (
CONF_LOWER_LIMIT,
CONF_THRESHOLD_TYPE,
CONF_UPPER_LIMIT,
ThresholdType,
)
from homeassistant.setup import async_setup_component
from tests.common import MockConfigEntry, mock_device_registry
async def target_entities(hass: HomeAssistant, domain: str) -> dict[str, list[str]]:
"""Create multiple entities associated with different targets.
Returns a dict with the following keys:
- included: List of entity_ids meant to be targeted.
- excluded: List of entity_ids not meant to be targeted.
"""
config_entry = MockConfigEntry(domain="test")
config_entry.add_to_hass(hass)
floor_reg = fr.async_get(hass)
floor = floor_reg.async_get_floor_by_name("Test Floor") or floor_reg.async_create(
"Test Floor"
)
area_reg = ar.async_get(hass)
area = area_reg.async_get_area_by_name("Test Area") or area_reg.async_create(
"Test Area", floor_id=floor.floor_id
)
label_reg = lr.async_get(hass)
label = label_reg.async_get_label_by_name("Test Label") or label_reg.async_create(
"Test Label"
)
device = dr.DeviceEntry(id="test_device", area_id=area.id, labels={label.label_id})
mock_device_registry(hass, {device.id: device})
entity_reg = er.async_get(hass)
# Entities associated with area
entity_area = entity_reg.async_get_or_create(
domain=domain,
platform="test",
unique_id=f"{domain}_area",
suggested_object_id=f"area_{domain}",
)
entity_reg.async_update_entity(entity_area.entity_id, area_id=area.id)
entity_area_excluded = entity_reg.async_get_or_create(
domain=domain,
platform="test",
unique_id=f"{domain}_area_excluded",
suggested_object_id=f"area_{domain}_excluded",
)
entity_reg.async_update_entity(entity_area_excluded.entity_id, area_id=area.id)
# Entities associated with device
entity_reg.async_get_or_create(
domain=domain,
platform="test",
unique_id=f"{domain}_device",
suggested_object_id=f"device_{domain}",
device_id=device.id,
)
entity_reg.async_get_or_create(
domain=domain,
platform="test",
unique_id=f"{domain}_device2",
suggested_object_id=f"device2_{domain}",
device_id=device.id,
)
entity_reg.async_get_or_create(
domain=domain,
platform="test",
unique_id=f"{domain}_device_excluded",
suggested_object_id=f"device_{domain}_excluded",
device_id=device.id,
)
# Entities associated with label
entity_label = entity_reg.async_get_or_create(
domain=domain,
platform="test",
unique_id=f"{domain}_label",
suggested_object_id=f"label_{domain}",
)
entity_reg.async_update_entity(entity_label.entity_id, labels={label.label_id})
entity_label_excluded = entity_reg.async_get_or_create(
domain=domain,
platform="test",
unique_id=f"{domain}_label_excluded",
suggested_object_id=f"label_{domain}_excluded",
)
entity_reg.async_update_entity(
entity_label_excluded.entity_id, labels={label.label_id}
)
# Return all available entities
return {
"included": [
f"{domain}.standalone_{domain}",
f"{domain}.standalone2_{domain}",
f"{domain}.label_{domain}",
f"{domain}.area_{domain}",
f"{domain}.device_{domain}",
f"{domain}.device2_{domain}",
],
"excluded": [
f"{domain}.standalone_{domain}_excluded",
f"{domain}.label_{domain}_excluded",
f"{domain}.area_{domain}_excluded",
f"{domain}.device_{domain}_excluded",
],
}
def parametrize_target_entities(domain: str) -> list[tuple[dict, str, int]]:
"""Parametrize target entities for different target types.
Meant to be used with target_entities.
"""
return [
(
{
CONF_ENTITY_ID: [
f"{domain}.standalone_{domain}",
f"{domain}.standalone2_{domain}",
]
},
f"{domain}.standalone_{domain}",
2,
),
({ATTR_LABEL_ID: "test_label"}, f"{domain}.label_{domain}", 3),
({ATTR_AREA_ID: "test_area"}, f"{domain}.area_{domain}", 3),
({ATTR_FLOOR_ID: "test_floor"}, f"{domain}.area_{domain}", 3),
({ATTR_LABEL_ID: "test_label"}, f"{domain}.device_{domain}", 3),
({ATTR_AREA_ID: "test_area"}, f"{domain}.device_{domain}", 3),
({ATTR_FLOOR_ID: "test_floor"}, f"{domain}.device_{domain}", 3),
({ATTR_DEVICE_ID: "test_device"}, f"{domain}.device_{domain}", 2),
]
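Each entry produced by `parametrize_target_entities` is a `(target config, expected entity_id, expected match count)` tuple. A short sketch of that shape (the domain "light" is illustrative, and the two constants are stand-ins for the `homeassistant.const` imports above):

```python
from typing import Any

ATTR_AREA_ID = "area_id"  # stand-in for homeassistant.const.ATTR_AREA_ID
CONF_ENTITY_ID = "entity_id"  # stand-in for homeassistant.const.CONF_ENTITY_ID


def parametrize_target_entities(domain: str) -> list[tuple[dict[str, Any], str, int]]:
    """Abbreviated mirror of the helper above: target config, one entity
    expected to match, and how many entities the target resolves to."""
    return [
        (
            {
                CONF_ENTITY_ID: [
                    f"{domain}.standalone_{domain}",
                    f"{domain}.standalone2_{domain}",
                ]
            },
            f"{domain}.standalone_{domain}",
            2,
        ),
        ({ATTR_AREA_ID: "test_area"}, f"{domain}.area_{domain}", 3),
    ]


params = parametrize_target_entities("light")
```

The area/floor/label targets resolve to 3 entities each because the device entities created by `target_entities` inherit the device's area and labels.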
class _StateDescription(TypedDict):
"""Test state with attributes."""
state: str | None
attributes: dict
class TriggerStateDescription(TypedDict):
"""Test state and expected service call count."""
included: _StateDescription # State for entities meant to be targeted
excluded: _StateDescription # State for entities not meant to be targeted
count: int # Expected service call count
class ConditionStateDescription(TypedDict):
"""Test state and expected condition evaluation."""
included: _StateDescription # State for entities meant to be targeted
excluded: _StateDescription # State for entities not meant to be targeted
condition_true: bool # If the condition is expected to evaluate to true
condition_true_first_entity: bool # If the condition is expected to evaluate to true for the first targeted entity
def _parametrize_condition_states(
*,
condition: str,
condition_options: dict[str, Any] | None = None,
target_states: list[str | None | tuple[str | None, dict]],
other_states: list[str | None | tuple[str | None, dict]],
additional_attributes: dict | None,
condition_true_if_invalid: bool,
) -> list[tuple[str, dict[str, Any], list[ConditionStateDescription]]]:
"""Parametrize states and expected condition evaluations.
The target_states and other_states iterables are either iterables of
states or iterables of (state, attributes) tuples.
Returns a list of tuples with (condition, condition options, list of states),
where states is a list of ConditionStateDescription dicts.
"""
additional_attributes = additional_attributes or {}
condition_options = condition_options or {}
def state_with_attributes(
state: str | None | tuple[str | None, dict],
condition_true: bool,
condition_true_first_entity: bool,
) -> ConditionStateDescription:
"""Return ConditionStateDescription dict."""
if isinstance(state, str) or state is None:
return {
"included": {
"state": state,
"attributes": additional_attributes,
},
"excluded": {
"state": state,
"attributes": {},
},
"condition_true": condition_true,
"condition_true_first_entity": condition_true_first_entity,
}
return {
"included": {
"state": state[0],
"attributes": state[1] | additional_attributes,
},
"excluded": {
"state": state[0],
"attributes": state[1],
},
"condition_true": condition_true,
"condition_true_first_entity": condition_true_first_entity,
}
return [
(
condition,
condition_options,
list(
itertools.chain(
(state_with_attributes(None, condition_true_if_invalid, True),),
(
state_with_attributes(
STATE_UNAVAILABLE, condition_true_if_invalid, True
),
),
(
state_with_attributes(
STATE_UNKNOWN, condition_true_if_invalid, True
),
),
(
state_with_attributes(other_state, False, False)
for other_state in other_states
),
),
),
),
# Test each target state individually to isolate condition_true expectations
*(
(
condition,
condition_options,
[
state_with_attributes(other_states[0], False, False),
state_with_attributes(target_state, True, False),
],
)
for target_state in target_states
),
]
def parametrize_condition_states_any(
*,
condition: str,
condition_options: dict[str, Any] | None = None,
target_states: list[str | None | tuple[str | None, dict]],
other_states: list[str | None | tuple[str | None, dict]],
additional_attributes: dict | None = None,
) -> list[tuple[str, dict[str, Any], list[ConditionStateDescription]]]:
"""Parametrize states and expected condition evaluations.
The target_states and other_states iterables are either iterables of
states or iterables of (state, attributes) tuples.
Returns a list of tuples with (condition, condition options, list of states),
where states is a list of ConditionStateDescription dicts.
"""
return _parametrize_condition_states(
condition=condition,
condition_options=condition_options,
target_states=target_states,
other_states=other_states,
additional_attributes=additional_attributes,
condition_true_if_invalid=False,
)
def parametrize_condition_states_all(
*,
condition: str,
condition_options: dict[str, Any] | None = None,
target_states: list[str | None | tuple[str | None, dict]],
other_states: list[str | None | tuple[str | None, dict]],
additional_attributes: dict | None = None,
) -> list[tuple[str, dict[str, Any], list[ConditionStateDescription]]]:
"""Parametrize states and expected condition evaluations.
The target_states and other_states iterables are either iterables of
states or iterables of (state, attributes) tuples.
Returns a list of tuples with (condition, condition options, list of states),
where states is a list of ConditionStateDescription dicts.
"""
return _parametrize_condition_states(
condition=condition,
condition_options=condition_options,
target_states=target_states,
other_states=other_states,
additional_attributes=additional_attributes,
condition_true_if_invalid=True,
)
def parametrize_trigger_states(
*,
trigger: str,
trigger_options: dict[str, Any] | None = None,
target_states: list[str | None | tuple[str | None, dict]],
other_states: list[str | None | tuple[str | None, dict]],
extra_invalid_states: list[str | None | tuple[str | None, dict]] | None = None,
additional_attributes: dict | None = None,
trigger_from_none: bool = True,
retrigger_on_target_state: bool = False,
) -> list[tuple[str, dict[str, Any], list[TriggerStateDescription]]]:
"""Parametrize states and expected service call counts.
The target_states, other_states, and extra_invalid_states iterables are
either iterables of states or iterables of (state, attributes) tuples.
Set `trigger_from_none` to False if the trigger is not expected to fire
when the initial state is None; this is relevant for triggers that limit
entities to a certain device class because the device class can't be
determined when the state is None.
Set `retrigger_on_target_state` to True if the trigger is expected to fire
when the state changes to another target state.
Returns a list of tuples with (trigger, trigger options, list of states),
where states is a list of TriggerStateDescription dicts.
"""
extra_invalid_states = extra_invalid_states or []
invalid_states = [STATE_UNAVAILABLE, STATE_UNKNOWN, *extra_invalid_states]
additional_attributes = additional_attributes or {}
trigger_options = trigger_options or {}
def state_with_attributes(
state: str | None | tuple[str | None, dict], count: int
) -> TriggerStateDescription:
"""Return TriggerStateDescription dict."""
if isinstance(state, str) or state is None:
return {
"included": {
"state": state,
"attributes": additional_attributes,
},
"excluded": {
"state": state if additional_attributes else None,
"attributes": {},
},
"count": count,
}
return {
"included": {
"state": state[0],
"attributes": state[1] | additional_attributes,
},
"excluded": {
"state": state[0] if additional_attributes else None,
"attributes": state[1],
},
"count": count,
}
tests = [
# Initial state None
(
trigger,
trigger_options,
list(
itertools.chain.from_iterable(
(
state_with_attributes(None, 0),
state_with_attributes(target_state, 0),
state_with_attributes(other_state, 0),
state_with_attributes(
target_state, 1 if trigger_from_none else 0
),
)
for target_state in target_states
for other_state in other_states
)
),
),
# Initial state different from target state
(
trigger,
trigger_options,
list(
itertools.chain.from_iterable(
(
state_with_attributes(other_state, 0),
state_with_attributes(target_state, 1),
state_with_attributes(other_state, 0),
state_with_attributes(target_state, 1),
)
for target_state in target_states
for other_state in other_states
)
),
),
# Initial state same as target state
(
trigger,
trigger_options,
list(
itertools.chain.from_iterable(
(
state_with_attributes(target_state, 0),
state_with_attributes(target_state, 0),
state_with_attributes(other_state, 0),
state_with_attributes(target_state, 1),
# Repeat target state to test retriggering
state_with_attributes(target_state, 0),
state_with_attributes(STATE_UNAVAILABLE, 0),
)
for target_state in target_states
for other_state in other_states
)
),
),
# Initial state unavailable / unknown + extra invalid states
(
trigger,
trigger_options,
list(
itertools.chain.from_iterable(
(
state_with_attributes(invalid_state, 0),
state_with_attributes(target_state, 0),
state_with_attributes(other_state, 0),
state_with_attributes(target_state, 1),
)
for invalid_state in invalid_states
for target_state in target_states
for other_state in other_states
)
),
),
]
if len(target_states) > 1:
# If more than one target state, test state change between target states
tests.append(
(
trigger,
trigger_options,
list(
itertools.chain.from_iterable(
(
state_with_attributes(target_states[idx - 1], 0),
state_with_attributes(
target_state, 1 if retrigger_on_target_state else 0
),
state_with_attributes(other_state, 0),
state_with_attributes(target_states[idx - 1], 1),
state_with_attributes(
target_state, 1 if retrigger_on_target_state else 0
),
state_with_attributes(STATE_UNAVAILABLE, 0),
)
for idx, target_state in enumerate(target_states[1:], start=1)
for other_state in other_states
)
),
),
)
return tests
def parametrize_numerical_attribute_changed_trigger_states(
trigger: str, state: str, attribute: str
) -> list[tuple[str, dict[str, Any], list[TriggerStateDescription]]]:
"""Parametrize states and expected service call counts for numerical changed triggers."""
return [
*parametrize_trigger_states(
trigger=trigger,
trigger_options={},
target_states=[
(state, {attribute: 0}),
(state, {attribute: 50}),
(state, {attribute: 100}),
],
other_states=[(state, {attribute: None})],
retrigger_on_target_state=True,
),
*parametrize_trigger_states(
trigger=trigger,
trigger_options={CONF_ABOVE: 10},
target_states=[
(state, {attribute: 50}),
(state, {attribute: 100}),
],
other_states=[
(state, {attribute: None}),
(state, {attribute: 0}),
],
retrigger_on_target_state=True,
),
*parametrize_trigger_states(
trigger=trigger,
trigger_options={CONF_BELOW: 90},
target_states=[
(state, {attribute: 0}),
(state, {attribute: 50}),
],
other_states=[
(state, {attribute: None}),
(state, {attribute: 100}),
],
retrigger_on_target_state=True,
),
]
def parametrize_numerical_attribute_crossed_threshold_trigger_states(
trigger: str, state: str, attribute: str
) -> list[tuple[str, dict[str, Any], list[TriggerStateDescription]]]:
"""Parametrize states and expected service call counts for numerical crossed threshold triggers."""
return [
*parametrize_trigger_states(
trigger=trigger,
trigger_options={
CONF_THRESHOLD_TYPE: ThresholdType.BETWEEN,
CONF_LOWER_LIMIT: 10,
CONF_UPPER_LIMIT: 90,
},
target_states=[
(state, {attribute: 50}),
(state, {attribute: 60}),
],
other_states=[
(state, {attribute: None}),
(state, {attribute: 0}),
(state, {attribute: 100}),
],
),
*parametrize_trigger_states(
trigger=trigger,
trigger_options={
CONF_THRESHOLD_TYPE: ThresholdType.OUTSIDE,
CONF_LOWER_LIMIT: 10,
CONF_UPPER_LIMIT: 90,
},
target_states=[
(state, {attribute: 0}),
(state, {attribute: 100}),
],
other_states=[
(state, {attribute: None}),
(state, {attribute: 50}),
(state, {attribute: 60}),
],
),
*parametrize_trigger_states(
trigger=trigger,
trigger_options={
CONF_THRESHOLD_TYPE: ThresholdType.ABOVE,
CONF_LOWER_LIMIT: 10,
},
target_states=[
(state, {attribute: 50}),
(state, {attribute: 100}),
],
other_states=[
(state, {attribute: None}),
(state, {attribute: 0}),
],
),
*parametrize_trigger_states(
trigger=trigger,
trigger_options={
CONF_THRESHOLD_TYPE: ThresholdType.BELOW,
CONF_UPPER_LIMIT: 90,
},
target_states=[
(state, {attribute: 0}),
(state, {attribute: 50}),
],
other_states=[
(state, {attribute: None}),
(state, {attribute: 100}),
],
),
]
def parametrize_numerical_state_value_changed_trigger_states(
trigger: str, device_class: str
) -> list[tuple[str, dict[str, Any], list[TriggerStateDescription]]]:
"""Parametrize states and expected service call counts for numerical state-value changed triggers.
Unlike parametrize_numerical_attribute_changed_trigger_states, this is for
entities where the tracked numerical value is in state.state (e.g. sensor
entities), not in an attribute.
"""
from homeassistant.const import ATTR_DEVICE_CLASS # noqa: PLC0415
additional_attributes = {ATTR_DEVICE_CLASS: device_class}
return [
*parametrize_trigger_states(
trigger=trigger,
trigger_options={},
target_states=["0", "50", "100"],
other_states=["none"],
additional_attributes=additional_attributes,
retrigger_on_target_state=True,
trigger_from_none=False,
),
*parametrize_trigger_states(
trigger=trigger,
trigger_options={CONF_ABOVE: 10},
target_states=["50", "100"],
other_states=["none", "0"],
additional_attributes=additional_attributes,
retrigger_on_target_state=True,
trigger_from_none=False,
),
*parametrize_trigger_states(
trigger=trigger,
trigger_options={CONF_BELOW: 90},
target_states=["0", "50"],
other_states=["none", "100"],
additional_attributes=additional_attributes,
retrigger_on_target_state=True,
trigger_from_none=False,
),
]
def parametrize_numerical_state_value_crossed_threshold_trigger_states(
trigger: str, device_class: str
) -> list[tuple[str, dict[str, Any], list[TriggerStateDescription]]]:
"""Parametrize states and expected service call counts for numerical state-value crossed threshold triggers.
Unlike parametrize_numerical_attribute_crossed_threshold_trigger_states,
this is for entities where the tracked numerical value is in state.state
(e.g. sensor entities), not in an attribute.
"""
from homeassistant.const import ATTR_DEVICE_CLASS # noqa: PLC0415
additional_attributes = {ATTR_DEVICE_CLASS: device_class}
return [
*parametrize_trigger_states(
trigger=trigger,
trigger_options={
CONF_THRESHOLD_TYPE: ThresholdType.BETWEEN,
CONF_LOWER_LIMIT: 10,
CONF_UPPER_LIMIT: 90,
},
target_states=["50", "60"],
other_states=["none", "0", "100"],
additional_attributes=additional_attributes,
trigger_from_none=False,
),
*parametrize_trigger_states(
trigger=trigger,
trigger_options={
CONF_THRESHOLD_TYPE: ThresholdType.OUTSIDE,
CONF_LOWER_LIMIT: 10,
CONF_UPPER_LIMIT: 90,
},
target_states=["0", "100"],
other_states=["none", "50", "60"],
additional_attributes=additional_attributes,
trigger_from_none=False,
),
*parametrize_trigger_states(
trigger=trigger,
trigger_options={
CONF_THRESHOLD_TYPE: ThresholdType.ABOVE,
CONF_LOWER_LIMIT: 10,
},
target_states=["50", "100"],
other_states=["none", "0"],
additional_attributes=additional_attributes,
trigger_from_none=False,
),
*parametrize_trigger_states(
trigger=trigger,
trigger_options={
CONF_THRESHOLD_TYPE: ThresholdType.BELOW,
CONF_UPPER_LIMIT: 90,
},
target_states=["0", "50"],
other_states=["none", "100"],
additional_attributes=additional_attributes,
trigger_from_none=False,
),
]
async def arm_trigger(
hass: HomeAssistant,
trigger: str,
trigger_options: dict[str, Any] | None,
trigger_target: dict,
) -> None:
"""Arm the specified trigger, call service test.automation when it triggers."""
# Local include to avoid importing the automation component unnecessarily
from homeassistant.components import automation # noqa: PLC0415
options = {CONF_OPTIONS: {**trigger_options}} if trigger_options is not None else {}
await async_setup_component(
hass,
automation.DOMAIN,
{
automation.DOMAIN: {
"trigger": {
CONF_PLATFORM: trigger,
CONF_TARGET: {**trigger_target},
}
| options,
"action": {
"service": "test.automation",
"data_template": {CONF_ENTITY_ID: "{{ trigger.entity_id }}"},
},
}
},
)
async def create_target_condition(
hass: HomeAssistant,
*,
condition: str,
target: dict,
behavior: str,
) -> ConditionCheckerTypeOptional:
"""Create a target condition."""
return await async_condition_from_config(
hass,
{
CONF_CONDITION: condition,
CONF_TARGET: target,
CONF_OPTIONS: {"behavior": behavior},
},
)
def set_or_remove_state(
hass: HomeAssistant,
entity_id: str,
state: TriggerStateDescription,
) -> None:
"""Set or remove the state of an entity."""
if state["state"] is None:
hass.states.async_remove(entity_id)
else:
hass.states.async_set(
entity_id, state["state"], state["attributes"], force_update=True
)
def other_states(state: StrEnum | Iterable[StrEnum]) -> list[str]:
    """Return a sorted list with all states except the specified one(s)."""
    if isinstance(state, StrEnum):
        excluded_values = {state.value}
        enum_class = state.__class__
    else:
        # Materialize the iterable once: len() is not defined for arbitrary
        # iterables, and a generator could not be consumed twice.
        states = list(state)
        if not states:
            raise ValueError("state iterable must not be empty")
        excluded_values = {s.value for s in states}
        enum_class = states[0].__class__
    return sorted({s.value for s in enum_class} - excluded_values)
async def assert_condition_gated_by_labs_flag(
hass: HomeAssistant, caplog: pytest.LogCaptureFixture, condition: str
) -> None:
"""Helper to check that a condition is gated by the labs flag."""
# Local include to avoid importing the automation component unnecessarily
from homeassistant.components import automation # noqa: PLC0415
await async_setup_component(
hass,
automation.DOMAIN,
{
automation.DOMAIN: {
"trigger": {"platform": "event", "event_type": "test_event"},
"condition": {
CONF_CONDITION: condition,
CONF_TARGET: {ATTR_LABEL_ID: "test_label"},
CONF_OPTIONS: {"behavior": "any"},
},
"action": {
"service": "test.automation",
},
}
},
)
assert (
"Unnamed automation failed to setup conditions and has been disabled: "
f"Condition '{condition}' requires the experimental 'New triggers and "
"conditions' feature to be enabled in Home Assistant Labs settings "
"(feature flag: 'new_triggers_conditions')"
) in caplog.text
async def assert_trigger_gated_by_labs_flag(
hass: HomeAssistant, caplog: pytest.LogCaptureFixture, trigger: str
) -> None:
"""Helper to check that a trigger is gated by the labs flag."""
await arm_trigger(hass, trigger, None, {ATTR_LABEL_ID: "test_label"})
assert (
"Unnamed automation failed to setup triggers and has been disabled: Trigger "
f"'{trigger}' requires the experimental 'New triggers and conditions' "
"feature to be enabled in Home Assistant Labs settings (feature flag: "
"'new_triggers_conditions')"
) in caplog.text
async def assert_trigger_behavior_any(
hass: HomeAssistant,
*,
service_calls: list[ServiceCall],
target_entities: dict[str, list[str]],
trigger_target_config: dict,
entity_id: str,
entities_in_target: int,
trigger: str,
trigger_options: dict[str, Any],
states: list[TriggerStateDescription],
) -> None:
"""Test trigger fires in mode any."""
other_entity_ids = set(target_entities["included"]) - {entity_id}
excluded_entity_ids = set(target_entities["excluded"]) - {entity_id}
for eid in target_entities["included"]:
set_or_remove_state(hass, eid, states[0]["included"])
await hass.async_block_till_done()
for eid in excluded_entity_ids:
set_or_remove_state(hass, eid, states[0]["excluded"])
await hass.async_block_till_done()
await arm_trigger(hass, trigger, trigger_options, trigger_target_config)
for state in states[1:]:
excluded_state = state["excluded"]
included_state = state["included"]
set_or_remove_state(hass, entity_id, included_state)
await hass.async_block_till_done()
assert len(service_calls) == state["count"]
for service_call in service_calls:
assert service_call.data[CONF_ENTITY_ID] == entity_id
service_calls.clear()
for other_entity_id in other_entity_ids:
set_or_remove_state(hass, other_entity_id, included_state)
await hass.async_block_till_done()
for excluded_entity_id in excluded_entity_ids:
set_or_remove_state(hass, excluded_entity_id, excluded_state)
await hass.async_block_till_done()
assert len(service_calls) == (entities_in_target - 1) * state["count"]
service_calls.clear()

@@ -1,8 +1,9 @@
# serializer version: 1
# name: test_binary_sensors[binary_sensor.lunar_ddeeff_timer_running-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,

@@ -1,8 +1,9 @@
# serializer version: 1
# name: test_buttons[button.lunar_ddeeff_reset_timer-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -50,8 +51,9 @@
# ---
# name: test_buttons[button.lunar_ddeeff_start_stop_timer-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -99,8 +101,9 @@
# ---
# name: test_buttons[button.lunar_ddeeff_tare-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,

@@ -1,8 +1,9 @@
# serializer version: 1
# name: test_sensors[sensor.lunar_ddeeff_battery-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -55,8 +56,9 @@
# ---
# name: test_sensors[sensor.lunar_ddeeff_volume_flow_rate-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -112,8 +114,9 @@
# ---
# name: test_sensors[sensor.lunar_ddeeff_weight-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,

File diff suppressed because it is too large
@@ -418,8 +418,9 @@
# ---
# name: test_weather[weather.home-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,

@@ -1,8 +1,9 @@
# serializer version: 1
# name: test_climate_entities[climate.living_room-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': dict({
'hvac_modes': list([
@@ -73,8 +74,9 @@
# ---
# name: test_climate_entities[climate.test_system-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': dict({
'fan_modes': list([

@@ -1,8 +1,9 @@
# serializer version: 1
# name: test_switch_entities[switch.test_system_away_mode-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -50,8 +51,9 @@
# ---
# name: test_switch_entities[switch.test_system_continuous_fan-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -99,8 +101,9 @@
# ---
# name: test_switch_entities[switch.test_system_quiet_mode-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -148,8 +151,9 @@
# ---
# name: test_switch_entities[switch.test_system_turbo_mode-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,

@@ -1,8 +1,9 @@
# serializer version: 1
# name: test_fallback_to_get_rooms[sensor.room_1_energy-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.TOTAL_INCREASING: 'total_increasing'>,
@@ -61,8 +62,9 @@
# ---
# name: test_fallback_to_get_rooms[sensor.room_1_temperature-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -118,8 +120,9 @@
# ---
# name: test_multiple_devices_create_individual_sensors[sensor.room_1_energy-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.TOTAL_INCREASING: 'total_increasing'>,
@@ -178,8 +181,9 @@
# ---
# name: test_multiple_devices_create_individual_sensors[sensor.room_1_temperature-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -235,8 +239,9 @@
# ---
# name: test_multiple_devices_create_individual_sensors[sensor.room_2_energy-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.TOTAL_INCREASING: 'total_increasing'>,
@@ -295,8 +300,9 @@
# ---
# name: test_multiple_devices_create_individual_sensors[sensor.room_2_temperature-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -352,8 +358,9 @@
# ---
# name: test_sensor_cloud[sensor.room_1_energy-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.TOTAL_INCREASING: 'total_increasing'>,
@@ -412,8 +419,9 @@
# ---
# name: test_sensor_cloud[sensor.room_1_temperature-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,

@@ -1,8 +1,9 @@
# serializer version: 1
# name: test_sensors[sensor.adguard_home_average_processing_speed-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -51,8 +52,9 @@
# ---
# name: test_sensors[sensor.adguard_home_dns_queries-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -101,8 +103,9 @@
# ---
# name: test_sensors[sensor.adguard_home_dns_queries_blocked-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -151,8 +154,9 @@
# ---
# name: test_sensors[sensor.adguard_home_dns_queries_blocked_ratio-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -201,8 +205,9 @@
# ---
# name: test_sensors[sensor.adguard_home_parental_control_blocked-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -251,8 +256,9 @@
# ---
# name: test_sensors[sensor.adguard_home_rules_count-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'aliases': list([
None,
]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -301,8 +307,9 @@
# ---
# name: test_sensors[sensor.adguard_home_safe_browsing_blocked-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -351,8 +358,9 @@
# ---
# name: test_sensors[sensor.adguard_home_safe_searches_enforced-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,


@@ -1,8 +1,9 @@
# serializer version: 1
# name: test_switch[switch.adguard_home_filtering-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -50,8 +51,9 @@
# ---
# name: test_switch[switch.adguard_home_parental_control-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -99,8 +101,9 @@
# ---
# name: test_switch[switch.adguard_home_protection-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -148,8 +151,9 @@
# ---
# name: test_switch[switch.adguard_home_query_log-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -197,8 +201,9 @@
# ---
# name: test_switch[switch.adguard_home_safe_browsing-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -246,8 +251,9 @@
# ---
# name: test_switch[switch.adguard_home_safe_search-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,


@@ -1,8 +1,9 @@
# serializer version: 1
# name: test_update[update.adguard_home-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,


@@ -1,8 +1,9 @@
# serializer version: 1
# name: test_all_entities[indoor][button.airgradient_calibrate_co2_sensor-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -50,8 +51,9 @@
# ---
# name: test_all_entities[indoor][button.airgradient_test_led_bar-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -99,8 +101,9 @@
# ---
# name: test_all_entities[outdoor][button.airgradient_calibrate_co2_sensor-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,


@@ -1,8 +1,9 @@
# serializer version: 1
# name: test_all_entities[number.airgradient_display_brightness-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'max': 100,
@@ -60,8 +61,9 @@
# ---
# name: test_all_entities[number.airgradient_led_bar_brightness-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'max': 100,


@@ -1,8 +1,9 @@
# serializer version: 1
# name: test_all_entities[indoor][select.airgradient_co2_automatic_baseline_duration-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'options': list([
@@ -67,8 +68,9 @@
# ---
# name: test_all_entities[indoor][select.airgradient_configuration_source-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'options': list([
@@ -125,8 +127,9 @@
# ---
# name: test_all_entities[indoor][select.airgradient_display_pm_standard-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'options': list([
@@ -183,8 +186,9 @@
# ---
# name: test_all_entities[indoor][select.airgradient_display_temperature_unit-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'options': list([
@@ -241,8 +245,9 @@
# ---
# name: test_all_entities[indoor][select.airgradient_led_bar_mode-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'options': list([
@@ -301,8 +306,9 @@
# ---
# name: test_all_entities[indoor][select.airgradient_nox_index_learning_offset-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'options': list([
@@ -365,8 +371,9 @@
# ---
# name: test_all_entities[indoor][select.airgradient_voc_index_learning_offset-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'options': list([
@@ -429,8 +436,9 @@
# ---
# name: test_all_entities[outdoor][select.airgradient_co2_automatic_baseline_duration-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'options': list([
@@ -495,8 +503,9 @@
# ---
# name: test_all_entities[outdoor][select.airgradient_configuration_source-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'options': list([
@@ -553,8 +562,9 @@
# ---
# name: test_all_entities[outdoor][select.airgradient_nox_index_learning_offset-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'options': list([
@@ -617,8 +627,9 @@
# ---
# name: test_all_entities[outdoor][select.airgradient_voc_index_learning_offset-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'options': list([


@@ -1,8 +1,9 @@
# serializer version: 1
# name: test_all_entities[indoor][sensor.airgradient_carbon_dioxide-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -55,8 +56,9 @@
# ---
# name: test_all_entities[indoor][sensor.airgradient_carbon_dioxide_automatic_baseline_calibration-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -109,8 +111,9 @@
# ---
# name: test_all_entities[indoor][sensor.airgradient_display_brightness-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -159,8 +162,9 @@
# ---
# name: test_all_entities[indoor][sensor.airgradient_display_pm_standard-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'options': list([
@@ -218,8 +222,9 @@
# ---
# name: test_all_entities[indoor][sensor.airgradient_display_temperature_unit-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'options': list([
@@ -277,8 +282,9 @@
# ---
# name: test_all_entities[indoor][sensor.airgradient_humidity-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -331,8 +337,9 @@
# ---
# name: test_all_entities[indoor][sensor.airgradient_led_bar_brightness-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -381,8 +388,9 @@
# ---
# name: test_all_entities[indoor][sensor.airgradient_led_bar_mode-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'options': list([
@@ -442,8 +450,9 @@
# ---
# name: test_all_entities[indoor][sensor.airgradient_nox_index-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -494,8 +503,9 @@
# ---
# name: test_all_entities[indoor][sensor.airgradient_nox_index_learning_offset-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -548,8 +558,9 @@
# ---
# name: test_all_entities[indoor][sensor.airgradient_pm0_3-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -601,8 +612,9 @@
# ---
# name: test_all_entities[indoor][sensor.airgradient_pm1-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -655,8 +667,9 @@
# ---
# name: test_all_entities[indoor][sensor.airgradient_pm10-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -709,8 +722,9 @@
# ---
# name: test_all_entities[indoor][sensor.airgradient_pm2_5-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -763,8 +777,9 @@
# ---
# name: test_all_entities[indoor][sensor.airgradient_raw_nox-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -816,8 +831,9 @@
# ---
# name: test_all_entities[indoor][sensor.airgradient_raw_pm2_5-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -870,8 +886,9 @@
# ---
# name: test_all_entities[indoor][sensor.airgradient_raw_voc-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -923,8 +940,9 @@
# ---
# name: test_all_entities[indoor][sensor.airgradient_signal_strength-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -977,8 +995,9 @@
# ---
# name: test_all_entities[indoor][sensor.airgradient_temperature-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -1034,8 +1053,9 @@
# ---
# name: test_all_entities[indoor][sensor.airgradient_voc_index-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -1086,8 +1106,9 @@
# ---
# name: test_all_entities[indoor][sensor.airgradient_voc_index_learning_offset-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -1140,8 +1161,9 @@
# ---
# name: test_all_entities[outdoor][sensor.airgradient_carbon_dioxide_automatic_baseline_calibration-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -1194,8 +1216,9 @@
# ---
# name: test_all_entities[outdoor][sensor.airgradient_nox_index-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -1246,8 +1269,9 @@
# ---
# name: test_all_entities[outdoor][sensor.airgradient_nox_index_learning_offset-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -1300,8 +1324,9 @@
# ---
# name: test_all_entities[outdoor][sensor.airgradient_raw_nox-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -1353,8 +1378,9 @@
# ---
# name: test_all_entities[outdoor][sensor.airgradient_raw_voc-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -1406,8 +1432,9 @@
# ---
# name: test_all_entities[outdoor][sensor.airgradient_signal_strength-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -1460,8 +1487,9 @@
# ---
# name: test_all_entities[outdoor][sensor.airgradient_voc_index-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -1512,8 +1540,9 @@
# ---
# name: test_all_entities[outdoor][sensor.airgradient_voc_index_learning_offset-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,


@@ -1,8 +1,9 @@
# serializer version: 1
# name: test_all_entities[switch.airgradient_post_data_to_airgradient-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,


@@ -1,8 +1,9 @@
# serializer version: 1
# name: test_all_entities[update.airgradient_firmware-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,


@@ -1,8 +1,9 @@
# serializer version: 1
# name: test_sensor[sensor.home_carbon_monoxide-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -60,8 +61,9 @@
# ---
# name: test_sensor[sensor.home_common_air_quality_index-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -117,8 +119,9 @@
# ---
# name: test_sensor[sensor.home_humidity-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -175,8 +178,9 @@
# ---
# name: test_sensor[sensor.home_nitrogen_dioxide-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -235,8 +239,9 @@
# ---
# name: test_sensor[sensor.home_ozone-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -295,8 +300,9 @@
# ---
# name: test_sensor[sensor.home_pm1-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -353,8 +359,9 @@
# ---
# name: test_sensor[sensor.home_pm10-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -413,8 +420,9 @@
# ---
# name: test_sensor[sensor.home_pm2_5-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -473,8 +481,9 @@
# ---
# name: test_sensor[sensor.home_pressure-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -531,8 +540,9 @@
# ---
# name: test_sensor[sensor.home_sulphur_dioxide-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -591,8 +601,9 @@
# ---
# name: test_sensor[sensor.home_temperature-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,


@@ -1,8 +1,9 @@
# serializer version: 1
# name: test_buttons[button.test_thermostat_recalibrate_co2_sensor-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -50,8 +51,9 @@
# ---
# name: test_buttons[button.test_thermostat_restart-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,


@@ -1,8 +1,9 @@
# serializer version: 1
# name: test_climate_entities[climate.test_thermostat-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'hvac_modes': list([


@@ -1,8 +1,9 @@
# serializer version: 1
# name: test_number_entities[number.test_thermostat_hysteresis_band-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'max': 0.5,


@@ -1,8 +1,9 @@
# serializer version: 1
# name: test_sensors[sensor.test_thermostat_air_temperature-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -58,8 +59,9 @@
# ---
# name: test_sensors[sensor.test_thermostat_device_uptime-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -108,8 +110,9 @@
# ---
# name: test_sensors[sensor.test_thermostat_error_count-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
@@ -160,8 +163,9 @@
# ---
# name: test_sensors[sensor.test_thermostat_heating_uptime-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.TOTAL_INCREASING: 'total_increasing'>,
@@ -220,8 +224,9 @@
# ---
# name: test_sensors[sensor.test_thermostat_humidity-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,


@@ -1,8 +1,9 @@
# serializer version: 1
# name: test_switches[switch.test_thermostat_actuator_exercise_disabled-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -50,8 +51,9 @@
# ---
# name: test_switches[switch.test_thermostat_child_lock-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,


@@ -1,8 +1,9 @@
# serializer version: 1
# name: test_all_entities[airos_NanoStation_M5_sta_v6.3.16.json][binary_sensor.nanostation_m5_dhcp_client-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -51,8 +52,9 @@
# ---
# name: test_all_entities[airos_NanoStation_M5_sta_v6.3.16.json][binary_sensor.nanostation_m5_dhcp_server-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -101,8 +103,9 @@
# ---
# name: test_all_entities[airos_NanoStation_M5_sta_v6.3.16.json][binary_sensor.nanostation_m5_pppoe_link-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -151,8 +154,9 @@
# ---
# name: test_all_entities[airos_NanoStation_loco_M5_v6.3.16_XM_sta.json][binary_sensor.nanostation_loco_m5_client_dhcp_client-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -201,8 +205,9 @@
# ---
# name: test_all_entities[airos_NanoStation_loco_M5_v6.3.16_XM_sta.json][binary_sensor.nanostation_loco_m5_client_dhcp_server-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -251,8 +256,9 @@
# ---
# name: test_all_entities[airos_NanoStation_loco_M5_v6.3.16_XM_sta.json][binary_sensor.nanostation_loco_m5_client_pppoe_link-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -301,8 +307,9 @@
# ---
# name: test_all_entities[airos_liteapgps_ap_ptmp_40mhz.json][binary_sensor.house_bridge_dhcp_client-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -351,8 +358,9 @@
# ---
# name: test_all_entities[airos_liteapgps_ap_ptmp_40mhz.json][binary_sensor.house_bridge_dhcp_server-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -401,8 +409,9 @@
# ---
# name: test_all_entities[airos_liteapgps_ap_ptmp_40mhz.json][binary_sensor.house_bridge_dhcpv6_server-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -451,8 +460,9 @@
# ---
# name: test_all_entities[airos_liteapgps_ap_ptmp_40mhz.json][binary_sensor.house_bridge_port_forwarding-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -500,8 +510,9 @@
# ---
# name: test_all_entities[airos_liteapgps_ap_ptmp_40mhz.json][binary_sensor.house_bridge_pppoe_link-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -550,8 +561,9 @@
# ---
# name: test_all_entities[airos_loco5ac_ap-ptp.json][binary_sensor.nanostation_5ac_ap_name_dhcp_client-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -600,8 +612,9 @@
# ---
# name: test_all_entities[airos_loco5ac_ap-ptp.json][binary_sensor.nanostation_5ac_ap_name_dhcp_server-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -650,8 +663,9 @@
# ---
# name: test_all_entities[airos_loco5ac_ap-ptp.json][binary_sensor.nanostation_5ac_ap_name_dhcpv6_server-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -700,8 +714,9 @@
# ---
# name: test_all_entities[airos_loco5ac_ap-ptp.json][binary_sensor.nanostation_5ac_ap_name_port_forwarding-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
@@ -749,8 +764,9 @@
# ---
# name: test_all_entities[airos_loco5ac_ap-ptp.json][binary_sensor.nanostation_5ac_ap_name_pppoe_link-entry]
EntityRegistryEntrySnapshot({
- 'aliases': set({
- }),
+ 'aliases': list([
+ None,
+ ]),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,

Some files were not shown because too many files have changed in this diff Show More