Compare commits

2 Commits

8df4152d4e  Jan Čermák  2026-02-24 17:18:39 +01:00

    Disable unnecessary parts to test image build

e6ed0b5d14  Jan Čermák  2026-02-24 16:15:30 +01:00

    Use native ARM runner for builder action, update to builder 2026.02.1

    Since 2026.02.0, the builder has SHA pinning fixed, so we can also get rid
    of the Zizmor error suppression.

    Builder changes:
    * https://github.com/home-assistant/builder/releases/tag/2026.02.0
    * https://github.com/home-assistant/builder/releases/tag/2026.02.1

1568 changed files with 25458 additions and 93784 deletions
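(For context, the "SHA pinning" mentioned in the commit message means referencing a GitHub Action by a full commit SHA rather than a mutable tag, which is what zizmor's `unpinned-uses` check flags. An illustrative pair, using references that appear in the diffs below:)

```yaml
# Unpinned (mutable tag), needs a zizmor suppression:
- uses: home-assistant/actions/helpers/verify-version@master # zizmor: ignore[unpinned-uses]
# SHA-pinned (immutable), with the release kept as a trailing comment:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
```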

View File

@@ -1,46 +0,0 @@
---
name: github-pr-reviewer
description: Review a GitHub pull request and provide feedback comments. Use when the user says "review the current PR" or asks to review a specific PR.
---
# Review GitHub Pull Request
## Preparation:
- Check if the local commit matches the last one in the PR. If not, checkout the PR locally using 'gh pr checkout'.
- CRITICAL: If 'gh pr checkout' fails for ANY reason, you MUST immediately STOP.
- Do NOT attempt any workarounds.
- Do NOT proceed with the review.
- ALERT about the failure and WAIT for instructions.
- This is a hard requirement - no exceptions.
## Follow these steps:
1. Use 'gh pr view' to get the PR details and description.
2. Use 'gh pr diff' to see all the changes in the PR.
3. Analyze the code changes for:
- Code quality and style consistency
- Potential bugs or issues
- Performance implications
- Security concerns
- Test coverage
- Documentation updates if needed
4. Ensure any existing review comments have been addressed.
5. Generate constructive review comments in the CONSOLE. DO NOT POST TO GITHUB YOURSELF.
## IMPORTANT:
- Just review. DO NOT make any changes
- Be constructive and specific in your comments
- Suggest improvements where appropriate
- Only provide review feedback in the CONSOLE. DO NOT ACT ON GITHUB.
- No need to run tests or linters, just review the code changes.
- No need to highlight things that are already good.
## Output format:
- List specific comments for each file/line that needs attention
- In the end, summarize with an overall assessment (approve, request changes, or comment) and bullet point list of changes suggested, if any.
- Example output:
```
Overall assessment: request changes.
- [CRITICAL] Memory leak in homeassistant/components/sensor/my_sensor.py:143
- [PROBLEM] Inefficient algorithm in homeassistant/helpers/data_processing.py:87
- [SUGGESTION] Improve variable naming in homeassistant/helpers/config_validation.py:45
```

View File

@@ -34,7 +34,6 @@ base_platforms: &base_platforms
- homeassistant/components/humidifier/**
- homeassistant/components/image/**
- homeassistant/components/image_processing/**
- homeassistant/components/infrared/**
- homeassistant/components/lawn_mower/**
- homeassistant/components/light/**
- homeassistant/components/lock/**

File diff suppressed because it is too large.

View File

@@ -10,6 +10,7 @@ on:
env:
BUILD_TYPE: core
DEFAULT_PYTHON: "3.14.2"
PIP_TIMEOUT: 60
UV_HTTP_TIMEOUT: 60
UV_SYSTEM_PYTHON: "true"
@@ -41,10 +42,10 @@ jobs:
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
- name: Get information
id: info
@@ -56,10 +57,10 @@ jobs:
with:
type: ${{ env.BUILD_TYPE }}
- name: Verify version
uses: home-assistant/actions/helpers/verify-version@master # zizmor: ignore[unpinned-uses]
with:
ignore-dev: true
# - name: Verify version
# uses: home-assistant/actions/helpers/verify-version@master # zizmor: ignore[unpinned-uses]
# with:
# ignore-dev: true
- name: Fail if translations files are checked in
run: |
@@ -79,7 +80,7 @@ jobs:
run: find ./homeassistant/components/*/translations -name "*.json" | tar zcvf translations.tar.gz -T -
- name: Upload translations
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: translations
path: translations.tar.gz
@@ -111,7 +112,7 @@ jobs:
- name: Download nightly wheels of frontend
if: needs.init.outputs.channel == 'dev'
uses: dawidd6/action-download-artifact@2536c51d3d126276eb39f74d6bc9c72ac6ef30d3 # v16
uses: dawidd6/action-download-artifact@5c98f0b039f36ef966fdb7dfa9779262785ecb05 # v14
with:
github_token: ${{secrets.GITHUB_TOKEN}}
repo: home-assistant/frontend
@@ -122,7 +123,7 @@ jobs:
- name: Download nightly wheels of intents
if: needs.init.outputs.channel == 'dev'
uses: dawidd6/action-download-artifact@2536c51d3d126276eb39f74d6bc9c72ac6ef30d3 # v16
uses: dawidd6/action-download-artifact@5c98f0b039f36ef966fdb7dfa9779262785ecb05 # v14
with:
github_token: ${{secrets.GITHUB_TOKEN}}
repo: OHF-Voice/intents-package
@@ -131,11 +132,11 @@ jobs:
workflow_conclusion: success
name: package
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
if: needs.init.outputs.channel == 'dev'
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
- name: Adjust nightly version
if: needs.init.outputs.channel == 'dev'
@@ -181,7 +182,7 @@ jobs:
fi
- name: Download translations
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
name: translations
@@ -340,282 +341,283 @@ jobs:
image: ${{ matrix.arch }}
args: |
$BUILD_ARGS \
--test \
--target /data/machine \
--cosign \
--machine "${{ needs.init.outputs.version }}=${{ matrix.machine }}"
publish_ha:
name: Publish version files
environment: ${{ needs.init.outputs.channel }}
if: github.repository_owner == 'home-assistant'
needs: ["init", "build_machine"]
runs-on: ubuntu-latest
permissions:
contents: read
steps:
- name: Checkout the repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Initialize git
uses: home-assistant/actions/helpers/git-init@master # zizmor: ignore[unpinned-uses]
with:
name: ${{ secrets.GIT_NAME }}
email: ${{ secrets.GIT_EMAIL }}
token: ${{ secrets.GIT_TOKEN }}
- name: Update version file
uses: home-assistant/actions/helpers/version-push@master # zizmor: ignore[unpinned-uses]
with:
key: "homeassistant[]"
key-description: "Home Assistant Core"
version: ${{ needs.init.outputs.version }}
channel: ${{ needs.init.outputs.channel }}
exclude-list: '["odroid-xu","qemuarm","qemux86","raspberrypi","raspberrypi2","raspberrypi3","raspberrypi4","tinker"]'
- name: Update version file (stable -> beta)
if: needs.init.outputs.channel == 'stable'
uses: home-assistant/actions/helpers/version-push@master # zizmor: ignore[unpinned-uses]
with:
key: "homeassistant[]"
key-description: "Home Assistant Core"
version: ${{ needs.init.outputs.version }}
channel: beta
exclude-list: '["odroid-xu","qemuarm","qemux86","raspberrypi","raspberrypi2","raspberrypi3","raspberrypi4","tinker"]'
publish_container:
name: Publish meta container for ${{ matrix.registry }}
environment: ${{ needs.init.outputs.channel }}
if: github.repository_owner == 'home-assistant'
needs: ["init", "build_base"]
runs-on: ubuntu-latest
permissions:
contents: read # To check out the repository
packages: write # To push to GHCR
id-token: write # For cosign signing
strategy:
fail-fast: false
matrix:
registry: ["ghcr.io/home-assistant", "docker.io/homeassistant"]
steps:
- name: Install Cosign
uses: sigstore/cosign-installer@faadad0cce49287aee09b3a48701e75088a2c6ad # v4.0.0
with:
cosign-release: "v2.5.3"
- name: Login to DockerHub
if: matrix.registry == 'docker.io/homeassistant'
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Login to GitHub Container Registry
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Verify architecture image signatures
shell: bash
env:
ARCHITECTURES: ${{ needs.init.outputs.architectures }}
VERSION: ${{ needs.init.outputs.version }}
run: |
ARCHS=$(echo "${ARCHITECTURES}" | jq -r '.[]')
for arch in $ARCHS; do
echo "Verifying ${arch} image signature..."
cosign verify \
--certificate-oidc-issuer https://token.actions.githubusercontent.com \
--certificate-identity-regexp https://github.com/home-assistant/core/.* \
"ghcr.io/home-assistant/${arch}-homeassistant:${VERSION}"
done
echo "✓ All images verified successfully"
# Generate all Docker tags based on version string
# Version format: YYYY.MM.PATCH, YYYY.MM.PATCHbN (beta), or YYYY.MM.PATCH.devYYYYMMDDHHMM (dev)
# Examples:
# 2025.12.1 (stable) -> tags: 2025.12.1, 2025.12, stable, latest, beta, rc
# 2025.12.0b3 (beta) -> tags: 2025.12.0b3, beta, rc
# 2025.12.0.dev202511250240 -> tags: 2025.12.0.dev202511250240, dev
- name: Generate Docker metadata
id: meta
uses: docker/metadata-action@c299e40c65443455700f0fdfc63efafe5b349051 # v5.10.0
with:
images: ${{ matrix.registry }}/home-assistant
sep-tags: ","
tags: |
type=raw,value=${{ needs.init.outputs.version }},priority=9999
type=raw,value=dev,enable=${{ contains(needs.init.outputs.version, 'd') }}
type=raw,value=beta,enable=${{ !contains(needs.init.outputs.version, 'd') }}
type=raw,value=rc,enable=${{ !contains(needs.init.outputs.version, 'd') }}
type=raw,value=stable,enable=${{ !contains(needs.init.outputs.version, 'd') && !contains(needs.init.outputs.version, 'b') }}
type=raw,value=latest,enable=${{ !contains(needs.init.outputs.version, 'd') && !contains(needs.init.outputs.version, 'b') }}
type=semver,pattern={{major}}.{{minor}},value=${{ needs.init.outputs.version }},enable=${{ !contains(needs.init.outputs.version, 'd') && !contains(needs.init.outputs.version, 'b') }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.7.1
- name: Copy architecture images to DockerHub
if: matrix.registry == 'docker.io/homeassistant'
shell: bash
env:
ARCHITECTURES: ${{ needs.init.outputs.architectures }}
VERSION: ${{ needs.init.outputs.version }}
run: |
# Use imagetools to copy image blobs directly between registries
# This preserves provenance/attestations and seems to be much faster than pull/push
ARCHS=$(echo "${ARCHITECTURES}" | jq -r '.[]')
for arch in $ARCHS; do
echo "Copying ${arch} image to DockerHub..."
for attempt in 1 2 3; do
if docker buildx imagetools create \
--tag "docker.io/homeassistant/${arch}-homeassistant:${VERSION}" \
"ghcr.io/home-assistant/${arch}-homeassistant:${VERSION}"; then
break
fi
echo "Attempt ${attempt} failed, retrying in 10 seconds..."
sleep 10
if [ "${attempt}" -eq 3 ]; then
echo "Failed after 3 attempts"
exit 1
fi
done
cosign sign --yes "docker.io/homeassistant/${arch}-homeassistant:${VERSION}"
done
- name: Create and push multi-arch manifests
shell: bash
env:
ARCHITECTURES: ${{ needs.init.outputs.architectures }}
REGISTRY: ${{ matrix.registry }}
VERSION: ${{ needs.init.outputs.version }}
META_TAGS: ${{ steps.meta.outputs.tags }}
run: |
# Build list of architecture images dynamically
ARCHS=$(echo "${ARCHITECTURES}" | jq -r '.[]')
ARCH_IMAGES=()
for arch in $ARCHS; do
ARCH_IMAGES+=("${REGISTRY}/${arch}-homeassistant:${VERSION}")
done
# Build list of all tags for single manifest creation
# Note: Using sep-tags=',' in metadata-action for easier parsing
TAG_ARGS=()
IFS=',' read -ra TAGS <<< "${META_TAGS}"
for tag in "${TAGS[@]}"; do
TAG_ARGS+=("--tag" "${tag}")
done
# Create manifest with ALL tags in a single operation (much faster!)
echo "Creating multi-arch manifest with tags: ${TAGS[*]}"
docker buildx imagetools create "${TAG_ARGS[@]}" "${ARCH_IMAGES[@]}"
# Sign each tag separately (signing requires individual tag names)
echo "Signing all tags..."
for tag in "${TAGS[@]}"; do
echo "Signing ${tag}"
cosign sign --yes "${tag}"
done
echo "All manifests created and signed successfully"
build_python:
name: Build PyPi package
environment: ${{ needs.init.outputs.channel }}
needs: ["init", "build_base"]
runs-on: ubuntu-latest
permissions:
contents: read # To check out the repository
id-token: write # For PyPI trusted publishing
if: github.repository_owner == 'home-assistant' && needs.init.outputs.publish == 'true'
steps:
- name: Checkout the repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
- name: Download translations
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
with:
name: translations
- name: Extract translations
run: |
tar xvf translations.tar.gz
rm translations.tar.gz
- name: Build package
shell: bash
run: |
# Remove dist, build, and homeassistant.egg-info
# when build locally for testing!
pip install build
python -m build
- name: Upload package to PyPI
uses: pypa/gh-action-pypi-publish@ed0c53931b1dc9bd32cbe73a98c7f6766f8a527e # v1.13.0
with:
skip-existing: true
hassfest-image:
name: Build and test hassfest image
runs-on: ubuntu-latest
permissions:
contents: read # To check out the repository
packages: write # To push to GHCR
attestations: write # For build provenance attestation
id-token: write # For build provenance attestation
needs: ["init"]
if: github.repository_owner == 'home-assistant'
env:
HASSFEST_IMAGE_NAME: ghcr.io/home-assistant/hassfest
HASSFEST_IMAGE_TAG: ghcr.io/home-assistant/hassfest:${{ needs.init.outputs.version }}
steps:
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Login to GitHub Container Registry
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build Docker image
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
with:
context: . # So action will not pull the repository again
file: ./script/hassfest/docker/Dockerfile
load: true
tags: ${{ env.HASSFEST_IMAGE_TAG }}
- name: Run hassfest against core
run: docker run --rm -v "${GITHUB_WORKSPACE}":/github/workspace "${HASSFEST_IMAGE_TAG}" --core-path=/github/workspace
- name: Push Docker image
if: needs.init.outputs.channel != 'dev' && needs.init.outputs.publish == 'true'
id: push
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
with:
context: . # So action will not pull the repository again
file: ./script/hassfest/docker/Dockerfile
push: true
tags: ${{ env.HASSFEST_IMAGE_TAG }},${{ env.HASSFEST_IMAGE_NAME }}:latest
- name: Generate artifact attestation
if: needs.init.outputs.channel != 'dev' && needs.init.outputs.publish == 'true'
uses: actions/attest-build-provenance@a2bbfa25375fe432b6a289bc6b6cd05ecd0c4c32 # v4.1.0
with:
subject-name: ${{ env.HASSFEST_IMAGE_NAME }}
subject-digest: ${{ steps.push.outputs.digest }}
push-to-registry: true
# publish_ha:
# name: Publish version files
# environment: ${{ needs.init.outputs.channel }}
# if: github.repository_owner == 'home-assistant'
# needs: ["init", "build_machine"]
# runs-on: ubuntu-latest
# permissions:
# contents: read
# steps:
# - name: Checkout the repository
# uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
# with:
# persist-credentials: false
#
# - name: Initialize git
# uses: home-assistant/actions/helpers/git-init@master # zizmor: ignore[unpinned-uses]
# with:
# name: ${{ secrets.GIT_NAME }}
# email: ${{ secrets.GIT_EMAIL }}
# token: ${{ secrets.GIT_TOKEN }}
#
# - name: Update version file
# uses: home-assistant/actions/helpers/version-push@master # zizmor: ignore[unpinned-uses]
# with:
# key: "homeassistant[]"
# key-description: "Home Assistant Core"
# version: ${{ needs.init.outputs.version }}
# channel: ${{ needs.init.outputs.channel }}
# exclude-list: '["odroid-xu","qemuarm","qemux86","raspberrypi","raspberrypi2","raspberrypi3","raspberrypi4","tinker"]'
#
# - name: Update version file (stable -> beta)
# if: needs.init.outputs.channel == 'stable'
# uses: home-assistant/actions/helpers/version-push@master # zizmor: ignore[unpinned-uses]
# with:
# key: "homeassistant[]"
# key-description: "Home Assistant Core"
# version: ${{ needs.init.outputs.version }}
# channel: beta
# exclude-list: '["odroid-xu","qemuarm","qemux86","raspberrypi","raspberrypi2","raspberrypi3","raspberrypi4","tinker"]'
#
# publish_container:
# name: Publish meta container for ${{ matrix.registry }}
# environment: ${{ needs.init.outputs.channel }}
# if: github.repository_owner == 'home-assistant'
# needs: ["init", "build_base"]
# runs-on: ubuntu-latest
# permissions:
# contents: read # To check out the repository
# packages: write # To push to GHCR
# id-token: write # For cosign signing
# strategy:
# fail-fast: false
# matrix:
# registry: ["ghcr.io/home-assistant", "docker.io/homeassistant"]
# steps:
# - name: Install Cosign
# uses: sigstore/cosign-installer@faadad0cce49287aee09b3a48701e75088a2c6ad # v4.0.0
# with:
# cosign-release: "v2.5.3"
#
# - name: Login to DockerHub
# if: matrix.registry == 'docker.io/homeassistant'
# uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
# with:
# username: ${{ secrets.DOCKERHUB_USERNAME }}
# password: ${{ secrets.DOCKERHUB_TOKEN }}
#
# - name: Login to GitHub Container Registry
# uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
# with:
# registry: ghcr.io
# username: ${{ github.repository_owner }}
# password: ${{ secrets.GITHUB_TOKEN }}
#
# - name: Verify architecture image signatures
# shell: bash
# env:
# ARCHITECTURES: ${{ needs.init.outputs.architectures }}
# VERSION: ${{ needs.init.outputs.version }}
# run: |
# ARCHS=$(echo "${ARCHITECTURES}" | jq -r '.[]')
# for arch in $ARCHS; do
# echo "Verifying ${arch} image signature..."
# cosign verify \
# --certificate-oidc-issuer https://token.actions.githubusercontent.com \
# --certificate-identity-regexp https://github.com/home-assistant/core/.* \
# "ghcr.io/home-assistant/${arch}-homeassistant:${VERSION}"
# done
# echo "✓ All images verified successfully"
#
# # Generate all Docker tags based on version string
# # Version format: YYYY.MM.PATCH, YYYY.MM.PATCHbN (beta), or YYYY.MM.PATCH.devYYYYMMDDHHMM (dev)
# # Examples:
# # 2025.12.1 (stable) -> tags: 2025.12.1, 2025.12, stable, latest, beta, rc
# # 2025.12.0b3 (beta) -> tags: 2025.12.0b3, beta, rc
# # 2025.12.0.dev202511250240 -> tags: 2025.12.0.dev202511250240, dev
# - name: Generate Docker metadata
# id: meta
# uses: docker/metadata-action@c299e40c65443455700f0fdfc63efafe5b349051 # v5.10.0
# with:
# images: ${{ matrix.registry }}/home-assistant
# sep-tags: ","
# tags: |
# type=raw,value=${{ needs.init.outputs.version }},priority=9999
# type=raw,value=dev,enable=${{ contains(needs.init.outputs.version, 'd') }}
# type=raw,value=beta,enable=${{ !contains(needs.init.outputs.version, 'd') }}
# type=raw,value=rc,enable=${{ !contains(needs.init.outputs.version, 'd') }}
# type=raw,value=stable,enable=${{ !contains(needs.init.outputs.version, 'd') && !contains(needs.init.outputs.version, 'b') }}
# type=raw,value=latest,enable=${{ !contains(needs.init.outputs.version, 'd') && !contains(needs.init.outputs.version, 'b') }}
# type=semver,pattern={{major}}.{{minor}},value=${{ needs.init.outputs.version }},enable=${{ !contains(needs.init.outputs.version, 'd') && !contains(needs.init.outputs.version, 'b') }}
#
# - name: Set up Docker Buildx
# uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.7.1
#
# - name: Copy architecture images to DockerHub
# if: matrix.registry == 'docker.io/homeassistant'
# shell: bash
# env:
# ARCHITECTURES: ${{ needs.init.outputs.architectures }}
# VERSION: ${{ needs.init.outputs.version }}
# run: |
# # Use imagetools to copy image blobs directly between registries
# # This preserves provenance/attestations and seems to be much faster than pull/push
# ARCHS=$(echo "${ARCHITECTURES}" | jq -r '.[]')
# for arch in $ARCHS; do
# echo "Copying ${arch} image to DockerHub..."
# for attempt in 1 2 3; do
# if docker buildx imagetools create \
# --tag "docker.io/homeassistant/${arch}-homeassistant:${VERSION}" \
# "ghcr.io/home-assistant/${arch}-homeassistant:${VERSION}"; then
# break
# fi
# echo "Attempt ${attempt} failed, retrying in 10 seconds..."
# sleep 10
# if [ "${attempt}" -eq 3 ]; then
# echo "Failed after 3 attempts"
# exit 1
# fi
# done
# cosign sign --yes "docker.io/homeassistant/${arch}-homeassistant:${VERSION}"
# done
#
# - name: Create and push multi-arch manifests
# shell: bash
# env:
# ARCHITECTURES: ${{ needs.init.outputs.architectures }}
# REGISTRY: ${{ matrix.registry }}
# VERSION: ${{ needs.init.outputs.version }}
# META_TAGS: ${{ steps.meta.outputs.tags }}
# run: |
# # Build list of architecture images dynamically
# ARCHS=$(echo "${ARCHITECTURES}" | jq -r '.[]')
# ARCH_IMAGES=()
# for arch in $ARCHS; do
# ARCH_IMAGES+=("${REGISTRY}/${arch}-homeassistant:${VERSION}")
# done
#
# # Build list of all tags for single manifest creation
# # Note: Using sep-tags=',' in metadata-action for easier parsing
# TAG_ARGS=()
# IFS=',' read -ra TAGS <<< "${META_TAGS}"
# for tag in "${TAGS[@]}"; do
# TAG_ARGS+=("--tag" "${tag}")
# done
#
# # Create manifest with ALL tags in a single operation (much faster!)
# echo "Creating multi-arch manifest with tags: ${TAGS[*]}"
# docker buildx imagetools create "${TAG_ARGS[@]}" "${ARCH_IMAGES[@]}"
#
# # Sign each tag separately (signing requires individual tag names)
# echo "Signing all tags..."
# for tag in "${TAGS[@]}"; do
# echo "Signing ${tag}"
# cosign sign --yes "${tag}"
# done
#
# echo "All manifests created and signed successfully"
#
# build_python:
# name: Build PyPi package
# environment: ${{ needs.init.outputs.channel }}
# needs: ["init", "build_base"]
# runs-on: ubuntu-latest
# permissions:
# contents: read # To check out the repository
# id-token: write # For PyPI trusted publishing
# if: github.repository_owner == 'home-assistant' && needs.init.outputs.publish == 'true'
# steps:
# - name: Checkout the repository
# uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
# with:
# persist-credentials: false
#
# - name: Set up Python ${{ env.DEFAULT_PYTHON }}
# uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
# with:
# python-version: ${{ env.DEFAULT_PYTHON }}
#
# - name: Download translations
# uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
# with:
# name: translations
#
# - name: Extract translations
# run: |
# tar xvf translations.tar.gz
# rm translations.tar.gz
#
# - name: Build package
# shell: bash
# run: |
# # Remove dist, build, and homeassistant.egg-info
# # when build locally for testing!
# pip install build
# python -m build
#
# - name: Upload package to PyPI
# uses: pypa/gh-action-pypi-publish@ed0c53931b1dc9bd32cbe73a98c7f6766f8a527e # v1.13.0
# with:
# skip-existing: true
#
# hassfest-image:
# name: Build and test hassfest image
# runs-on: ubuntu-latest
# permissions:
# contents: read # To check out the repository
# packages: write # To push to GHCR
# attestations: write # For build provenance attestation
# id-token: write # For build provenance attestation
# needs: ["init"]
# if: github.repository_owner == 'home-assistant'
# env:
# HASSFEST_IMAGE_NAME: ghcr.io/home-assistant/hassfest
# HASSFEST_IMAGE_TAG: ghcr.io/home-assistant/hassfest:${{ needs.init.outputs.version }}
# steps:
# - name: Checkout repository
# uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
# with:
# persist-credentials: false
#
# - name: Login to GitHub Container Registry
# uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
# with:
# registry: ghcr.io
# username: ${{ github.repository_owner }}
# password: ${{ secrets.GITHUB_TOKEN }}
#
# - name: Build Docker image
# uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
# with:
# context: . # So action will not pull the repository again
# file: ./script/hassfest/docker/Dockerfile
# load: true
# tags: ${{ env.HASSFEST_IMAGE_TAG }}
#
# - name: Run hassfest against core
# run: docker run --rm -v "${GITHUB_WORKSPACE}":/github/workspace "${HASSFEST_IMAGE_TAG}" --core-path=/github/workspace
#
# - name: Push Docker image
# if: needs.init.outputs.channel != 'dev' && needs.init.outputs.publish == 'true'
# id: push
# uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
# with:
# context: . # So action will not pull the repository again
# file: ./script/hassfest/docker/Dockerfile
# push: true
# tags: ${{ env.HASSFEST_IMAGE_TAG }},${{ env.HASSFEST_IMAGE_NAME }}:latest
#
# - name: Generate artifact attestation
# if: needs.init.outputs.channel != 'dev' && needs.init.outputs.publish == 'true'
# uses: actions/attest-build-provenance@96278af6caaf10aea03fd8d33a09a777ca52d62f # v3.2.0
# with:
# subject-name: ${{ env.HASSFEST_IMAGE_NAME }}
# subject-digest: ${{ steps.push.outputs.digest }}
# push-to-registry: true

View File

@@ -40,8 +40,9 @@ env:
CACHE_VERSION: 3
UV_CACHE_VERSION: 1
MYPY_CACHE_VERSION: 1
HA_SHORT_VERSION: "2026.4"
ADDITIONAL_PYTHON_VERSIONS: "[]"
HA_SHORT_VERSION: "2026.3"
DEFAULT_PYTHON: "3.14.2"
ALL_PYTHON_VERSIONS: "['3.14.2']"
# 10.3 is the oldest supported version
# - 10.3.32 is the version currently shipped with Synology (as of 17 Feb 2022)
# 10.6 is the current long-term-support
@@ -165,11 +166,6 @@ jobs:
tests_glob=""
lint_only=""
skip_coverage=""
default_python=$(cat .python-version)
all_python_versions=$(jq -cn \
--arg default_python "${default_python}" \
--argjson additional_python_versions "${ADDITIONAL_PYTHON_VERSIONS}" \
'[$default_python] + $additional_python_versions')
if [[ "${INTEGRATION_CHANGES}" != "[]" ]];
then
@@ -239,8 +235,8 @@ jobs:
echo "mariadb_groups=${mariadb_groups}" >> $GITHUB_OUTPUT
echo "postgresql_groups: ${postgresql_groups}"
echo "postgresql_groups=${postgresql_groups}" >> $GITHUB_OUTPUT
echo "python_versions: ${all_python_versions}"
echo "python_versions=${all_python_versions}" >> $GITHUB_OUTPUT
echo "python_versions: ${ALL_PYTHON_VERSIONS}"
echo "python_versions=${ALL_PYTHON_VERSIONS}" >> $GITHUB_OUTPUT
echo "test_full_suite: ${test_full_suite}"
echo "test_full_suite=${test_full_suite}" >> $GITHUB_OUTPUT
echo "integrations_glob: ${integrations_glob}"
@@ -456,7 +452,7 @@ jobs:
python --version
uv pip freeze >> pip_freeze.txt
- name: Upload pip_freeze artifact
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: pip-freeze-${{ matrix.python-version }}
path: pip_freeze.txt
@@ -507,13 +503,13 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore full Python virtual environment
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
@@ -544,13 +540,13 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore full Python virtual environment
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
@@ -580,11 +576,11 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Run gen_copilot_instructions.py
run: |
@@ -609,7 +605,7 @@ jobs:
with:
persist-credentials: false
- name: Dependency review
uses: actions/dependency-review-action@05fe4576374b728f0c523d6a13d64c25081e0803 # v4.8.3
uses: actions/dependency-review-action@3c4e3dcb1aa7874d2c16be7d79418e9b7efd6261 # v4.8.2
with:
license-check: false # We use our own license audit checks
@@ -657,7 +653,7 @@ jobs:
. venv/bin/activate
python -m script.licenses extract --output-file=licenses-${PYTHON_VERSION}.json
- name: Upload licenses
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: licenses-${{ github.run_number }}-${{ matrix.python-version }}
path: licenses-${{ matrix.python-version }}.json
@@ -686,13 +682,13 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore full Python virtual environment
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
@@ -739,13 +735,13 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore full Python virtual environment
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
@@ -790,11 +786,11 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Generate partial mypy restore key
id: generate-mypy-key
@@ -802,7 +798,7 @@ jobs:
mypy_version=$(cat requirements_test.txt | grep 'mypy.*=' | cut -d '=' -f 3)
echo "version=${mypy_version}" >> $GITHUB_OUTPUT
echo "key=mypy-${MYPY_CACHE_VERSION}-${mypy_version}-${HA_SHORT_VERSION}-$(date -u '+%Y-%m-%dT%H:%M:%s')" >> $GITHUB_OUTPUT
- name: Restore full Python virtual environment
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
@@ -883,13 +879,13 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore full Python virtual environment
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
@@ -905,7 +901,7 @@ jobs:
. venv/bin/activate
python -m script.split_tests ${TEST_GROUP_COUNT} tests
- name: Upload pytest_buckets
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: pytest_buckets
path: pytest_buckets.txt
@@ -982,7 +978,7 @@ jobs:
run: |
echo "::add-matcher::.github/workflows/matchers/pytest-slow.json"
- name: Download pytest_buckets
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
name: pytest_buckets
- name: Compile English translations
@@ -1024,14 +1020,14 @@ jobs:
2>&1 | tee pytest-${PYTHON_VERSION}-${TEST_GROUP}.txt
- name: Upload pytest output
if: success() || failure() && steps.pytest-full.conclusion == 'failure'
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{ matrix.group }}
path: pytest-*.txt
overwrite: true
- name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true'
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: coverage-${{ matrix.python-version }}-${{ matrix.group }}
path: coverage.xml
@@ -1044,7 +1040,7 @@ jobs:
mv "junit.xml-tmp" "junit.xml"
- name: Upload test results artifact
if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: test-results-full-${{ matrix.python-version }}-${{ matrix.group }}
path: junit.xml
@@ -1181,7 +1177,7 @@ jobs:
2>&1 | tee pytest-${PYTHON_VERSION}-${mariadb}.txt
- name: Upload pytest output
if: success() || failure() && steps.pytest-partial.conclusion == 'failure'
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.mariadb }}
@@ -1189,7 +1185,7 @@ jobs:
overwrite: true
- name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true'
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: coverage-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.mariadb }}
@@ -1203,7 +1199,7 @@ jobs:
mv "junit.xml-tmp" "junit.xml"
- name: Upload test results artifact
if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: test-results-mariadb-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.mariadb }}
@@ -1342,7 +1338,7 @@ jobs:
2>&1 | tee pytest-${PYTHON_VERSION}-${postgresql}.txt
- name: Upload pytest output
if: success() || failure() && steps.pytest-partial.conclusion == 'failure'
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.postgresql }}
@@ -1350,7 +1346,7 @@ jobs:
overwrite: true
- name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true'
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: coverage-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.postgresql }}
@@ -1364,7 +1360,7 @@ jobs:
mv "junit.xml-tmp" "junit.xml"
- name: Upload test results artifact
if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: test-results-postgres-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.postgresql }}
@@ -1391,7 +1387,7 @@ jobs:
with:
persist-credentials: false
- name: Download all coverage artifacts
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
pattern: coverage-*
- name: Upload coverage to Codecov
@@ -1518,14 +1514,14 @@ jobs:
2>&1 | tee pytest-${PYTHON_VERSION}-${TEST_GROUP}.txt
- name: Upload pytest output
if: success() || failure() && steps.pytest-partial.conclusion == 'failure'
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{ matrix.group }}
path: pytest-*.txt
overwrite: true
- name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true'
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: coverage-${{ matrix.python-version }}-${{ matrix.group }}
path: coverage.xml
@@ -1538,7 +1534,7 @@ jobs:
mv "junit.xml-tmp" "junit.xml"
- name: Upload test results artifact
if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: test-results-partial-${{ matrix.python-version }}-${{ matrix.group }}
path: junit.xml
@@ -1562,7 +1558,7 @@ jobs:
with:
persist-credentials: false
- name: Download all coverage artifacts
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
pattern: coverage-*
- name: Upload coverage to Codecov
@@ -1591,7 +1587,7 @@ jobs:
&& needs.info.outputs.skip_coverage != 'true' && !cancelled()
steps:
- name: Download all coverage artifacts
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
pattern: test-results-*
- name: Upload test results to Codecov

View File

@@ -28,11 +28,11 @@ jobs:
persist-credentials: false
- name: Initialize CodeQL
uses: github/codeql-action/init@89a39a4e59826350b863aa6b6252a07ad50cf83e # v4.32.4
uses: github/codeql-action/init@9e907b5e64f6b83e7804b09294d44122997950d6 # v4.32.3
with:
languages: python
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@89a39a4e59826350b863aa6b6252a07ad50cf83e # v4.32.4
uses: github/codeql-action/analyze@9e907b5e64f6b83e7804b09294d44122997950d6 # v4.32.3
with:
category: "/language:python"

View File

@@ -236,7 +236,7 @@ jobs:
- name: Detect duplicates using AI
id: ai_detection
if: steps.extract.outputs.should_continue == 'true' && steps.fetch_similar.outputs.has_similar == 'true'
uses: actions/ai-inference@e09e65981758de8b2fdab13c2bfb7c7d5493b0b6 # v2.0.7
uses: actions/ai-inference@a380166897b5408b8fb7dddd148142794cb5624a # v2.0.6
with:
model: openai/gpt-4o
system-prompt: |

View File

@@ -62,7 +62,7 @@ jobs:
- name: Detect language using AI
id: ai_language_detection
if: steps.detect_language.outputs.should_continue == 'true'
uses: actions/ai-inference@e09e65981758de8b2fdab13c2bfb7c7d5493b0b6 # v2.0.7
uses: actions/ai-inference@a380166897b5408b8fb7dddd148142794cb5624a # v2.0.6
with:
model: openai/gpt-4o-mini
system-prompt: |

View File

@@ -15,6 +15,9 @@ concurrency:
group: ${{ github.workflow }}
cancel-in-progress: true
env:
DEFAULT_PYTHON: "3.14.2"
jobs:
upload:
name: Upload
@@ -26,10 +29,10 @@ jobs:
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
- name: Upload Translations
env:

View File

@@ -16,6 +16,9 @@ on:
- "requirements.txt"
- "script/gen_requirements_all.py"
env:
DEFAULT_PYTHON: "3.14.2"
permissions: {}
concurrency:
@@ -33,11 +36,11 @@ jobs:
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Create Python virtual environment
@@ -74,7 +77,7 @@ jobs:
) > .env_file
- name: Upload env_file
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: env_file
path: ./.env_file
@@ -82,7 +85,7 @@ jobs:
overwrite: true
- name: Upload requirements_diff
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: requirements_diff
path: ./requirements_diff.txt
@@ -94,7 +97,7 @@ jobs:
python -m script.gen_requirements_all ci
- name: Upload requirements_all_wheels
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: requirements_all_wheels
path: ./requirements_all_wheels_*.txt
@@ -107,7 +110,7 @@ jobs:
strategy:
fail-fast: false
matrix:
abi: ["cp314"]
abi: ["cp313", "cp314"]
arch: ["amd64", "aarch64"]
include:
- arch: amd64
@@ -121,12 +124,12 @@ jobs:
persist-credentials: false
- name: Download env_file
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
name: env_file
- name: Download requirements_diff
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
name: requirements_diff
@@ -158,7 +161,7 @@ jobs:
strategy:
fail-fast: false
matrix:
abi: ["cp314"]
abi: ["cp313", "cp314"]
arch: ["amd64", "aarch64"]
include:
- arch: amd64
@@ -172,17 +175,17 @@ jobs:
persist-credentials: false
- name: Download env_file
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
name: env_file
- name: Download requirements_diff
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
name: requirements_diff
- name: Download requirements_all_wheels
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
name: requirements_all_wheels
@@ -206,4 +209,4 @@ jobs:
skip-binary: aiohttp;charset-normalizer;grpcio;multidict;SQLAlchemy;propcache;protobuf;pymicro-vad;yarl
constraints: "homeassistant/package_constraints.txt"
requirements-diff: "requirements_diff.txt"
requirements: "requirements_all_wheels_${{ matrix.arch }}.txt"
requirements: "requirements_all.txt"

View File

@@ -1 +1 @@
3.14.2
3.14

View File

@@ -289,7 +289,6 @@ homeassistant.components.imgw_pib.*
homeassistant.components.immich.*
homeassistant.components.incomfort.*
homeassistant.components.inels.*
homeassistant.components.infrared.*
homeassistant.components.input_button.*
homeassistant.components.input_select.*
homeassistant.components.input_text.*
@@ -545,7 +544,6 @@ homeassistant.components.tcp.*
homeassistant.components.technove.*
homeassistant.components.tedee.*
homeassistant.components.telegram_bot.*
homeassistant.components.teslemetry.*
homeassistant.components.text.*
homeassistant.components.thethingsnetwork.*
homeassistant.components.threshold.*

AGENTS.md (318 changed lines)

View File

@@ -4,17 +4,325 @@ This repository contains the core of Home Assistant, a Python 3 based home autom
## Code Review Guidelines
**When reviewing code, do NOT comment on:**
- **Missing imports** - We use static analysis tooling to catch that
- **Code formatting** - We have ruff as a formatting tool that will catch those if needed (unless specifically instructed otherwise in these instructions)
**Git commit practices during review:**
- **Do NOT amend, squash, or rebase commits after review has started** - Reviewers need to see what changed since their last review
## Python Requirements
- **Compatibility**: Python 3.13+
- **Language Features**: Use the newest features when possible (a combined sketch follows this list):
  - Pattern matching
  - Type hints
  - f-strings (preferred over `%` or `.format()`)
  - Dataclasses
  - Walrus operator
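A small, self-contained sketch (not from the codebase) combining several of these features:
```python
from dataclasses import dataclass


@dataclass
class Reading:
    """A single, hypothetical sensor reading."""

    kind: str
    value: float


def describe(reading: Reading) -> str:
    """Describe a reading using pattern matching, a walrus, and f-strings."""
    match reading:
        # Guard with a walrus: bind and test in one expression
        case Reading(kind="temperature", value=v) if (celsius := v) > 30:
            return f"Hot: {celsius:.1f} °C"
        case Reading(kind="temperature", value=v):
            return f"Temperature: {v:.1f} °C"
        case Reading(kind=kind):
            return f"Unsupported reading kind: {kind}"
```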
### Strict Typing (Platinum)
- **Comprehensive Type Hints**: Add type hints to all functions, methods, and variables
- **Custom Config Entry Types**: When using runtime_data (a usage sketch follows this list):
```python
type MyIntegrationConfigEntry = ConfigEntry[MyClient]
```
- **Library Requirements**: Include `py.typed` file for PEP-561 compliance
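A minimal usage sketch for such an alias (the `MyClient` wrapper and its `.api` module are hypothetical):
```python
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_HOST
from homeassistant.core import HomeAssistant

from .api import MyClient  # hypothetical client wrapper

type MyIntegrationConfigEntry = ConfigEntry[MyClient]


async def async_setup_entry(
    hass: HomeAssistant, entry: MyIntegrationConfigEntry
) -> bool:
    """Set up the integration from a config entry."""
    client = MyClient(entry.data[CONF_HOST])
    entry.runtime_data = client  # typed as MyClient via the alias above
    return True
```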
## Code Quality Standards
- **Formatting**: Ruff
- **Linting**: PyLint and Ruff
- **Type Checking**: MyPy
- **Lint/Type/Format Fixes**: Always prefer addressing the underlying issue (e.g., import the typed source, update shared stubs, align with Ruff expectations, or correct formatting at the source) before disabling a rule, adding `# type: ignore`, or skipping a formatter. Treat suppressions and `noqa` comments as a last resort once no compliant fix exists
- **Testing**: pytest with plain functions and fixtures
- **Language**: American English for all code, comments, and documentation (use sentence case, including titles)
### Writing Style Guidelines
- **Tone**: Friendly and informative
- **Perspective**: Use second-person ("you" and "your") for user-facing messages
- **Inclusivity**: Use objective, non-discriminatory language
- **Clarity**: Write for non-native English speakers
- **Formatting in Messages**:
- Use backticks for: file paths, filenames, variable names, field entries
- Use sentence case for titles and messages (capitalize only the first word and proper nouns)
- Avoid abbreviations when possible
### Documentation Standards
- **File Headers**: Short and concise
```python
"""Integration for Peblar EV chargers."""
```
- **Method/Function Docstrings**: Required for all
```python
async def async_setup_entry(hass: HomeAssistant, entry: PeblarConfigEntry) -> bool:
"""Set up Peblar from a config entry."""
```
- **Comment Style**:
- Use clear, descriptive comments
- Explain the "why" not just the "what"
- Keep code block lines under 80 characters when possible
- Use progressive disclosure (simple explanation first, complex details later)
## Async Programming
- All external I/O operations must be async
- **Best Practices**:
- Avoid sleeping in loops
- Avoid awaiting in loops - use `gather` instead (see the sketch after this list)
- No blocking calls
- Group executor jobs when possible - switching between event loop and executor is expensive
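A small sketch of the `gather` guidance (the `fetch_sensor` coroutine is hypothetical):
```python
import asyncio


async def fetch_sensor(sensor_id: str) -> float:
    """Hypothetical coroutine fetching one sensor value."""
    await asyncio.sleep(0.1)  # stand-in for real I/O
    return 0.0


async def refresh_all(sensor_ids: list[str]) -> list[float]:
    """Fetch all sensors concurrently instead of awaiting in a loop."""
    # ❌ One round-trip at a time: for s in sensor_ids: await fetch_sensor(s)
    # ✅ All round-trips in flight at once:
    return await asyncio.gather(*(fetch_sensor(s) for s in sensor_ids))
```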
### Blocking Operations
- **Use Executor**: For blocking I/O operations
```python
result = await hass.async_add_executor_job(blocking_function, args)
```
- **Never Block Event Loop**: Avoid file operations, `time.sleep()`, blocking HTTP calls
- **Replace with Async**: Use `asyncio.sleep()` instead of `time.sleep()`
### Thread Safety
- **@callback Decorator**: For event loop safe functions
```python
@callback
def async_update_callback(self, event):
"""Safe to run in event loop."""
self.async_write_ha_state()
```
- **Sync APIs from Threads**: Use sync versions when calling from non-event loop threads (see the sketch after this list)
- **Registry Changes**: Must be done in event loop thread
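A sketch contrasting the two sides of this rule: `async_write_ha_state()` is the in-loop API, and `schedule_update_ha_state()` is its thread-safe sync counterpart (the sensor and its callback wiring are hypothetical):
```python
from homeassistant.components.sensor import SensorEntity
from homeassistant.core import callback


class MySensor(SensorEntity):
    """Hypothetical sensor fed by a library with a worker thread."""

    @callback
    def _async_handle_event(self, value: float) -> None:
        """Runs inside the event loop, so the async API is safe."""
        self._attr_native_value = value
        self.async_write_ha_state()

    def _handle_event_from_thread(self, value: float) -> None:
        """Runs in the library's worker thread: use the sync counterpart."""
        self._attr_native_value = value
        self.schedule_update_ha_state()
```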
### Error Handling
- **Exception Types**: Choose most specific exception available
- `ServiceValidationError`: User input errors (preferred over `ValueError`; a sketch appears at the end of this section)
- `HomeAssistantError`: Device communication failures
- `ConfigEntryNotReady`: Temporary setup issues (device offline)
- `ConfigEntryAuthFailed`: Authentication problems
- `ConfigEntryError`: Permanent setup issues
- **Try/Catch Best Practices**:
- Only wrap code that can throw exceptions
- Keep try blocks minimal - process data after the try/catch
- **Avoid bare exceptions** except in specific cases:
- ❌ Generally not allowed: `except:` or `except Exception:`
- ✅ Allowed in config flows to ensure robustness
- ✅ Allowed in functions/methods that run in background tasks
- Bad pattern:
```python
try:
    data = await device.get_data()  # Can throw
    # ❌ Don't process data inside try block
    processed = data.get("value", 0) * 100
    self._attr_native_value = processed
except DeviceError:
    _LOGGER.error("Failed to get data")
```
- Good pattern:
```python
try:
    data = await device.get_data()  # Can throw
except DeviceError:
    _LOGGER.error("Failed to get data")
    return

# ✅ Process data outside try block
processed = data.get("value", 0) * 100
self._attr_native_value = processed
```
- **Bare Exception Usage**:
```python
# ❌ Not allowed in regular code
try:
    data = await device.get_data()
except Exception:  # Too broad
    _LOGGER.error("Failed")

# ✅ Allowed in config flow for robustness
async def async_step_user(self, user_input=None):
    try:
        await self._test_connection(user_input)
    except Exception:  # Allowed here
        errors["base"] = "unknown"

# ✅ Allowed in background tasks
async def _background_refresh():
    try:
        await coordinator.async_refresh()
    except Exception:  # Allowed in task
        _LOGGER.exception("Unexpected error in background task")
```
- **Setup Failure Patterns**:
```python
try:
    await device.async_setup()
except (asyncio.TimeoutError, TimeoutException) as ex:
    raise ConfigEntryNotReady(f"Timeout connecting to {device.host}") from ex
except AuthFailed as ex:
    raise ConfigEntryAuthFailed(f"Credentials expired for {device.name}") from ex
```
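A sketch of the `ServiceValidationError` vs `HomeAssistantError` distinction from the list above (`device` and `DeviceError` are hypothetical):
```python
from homeassistant.core import ServiceCall
from homeassistant.exceptions import HomeAssistantError, ServiceValidationError


async def async_set_speed(call: ServiceCall) -> None:
    """Handle a hypothetical set_speed service call."""
    speed: int = call.data["speed"]
    if not 0 <= speed <= 100:
        # Bad user input: ServiceValidationError, not ValueError
        raise ServiceValidationError(f"Speed {speed} is out of range 0-100")
    try:
        await device.async_set_speed(speed)  # hypothetical device client
    except DeviceError as err:  # hypothetical library exception
        # Device communication failure: HomeAssistantError
        raise HomeAssistantError(f"Could not set speed: {err}") from err
```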
### Logging
- **Format Guidelines**:
- No periods at end of messages
- No integration names/domains (added automatically)
- No sensitive data (keys, tokens, passwords)
- Use debug level for non-user-facing messages
- **Use Lazy Logging**:
```python
_LOGGER.debug("This is a log message with %s", variable)
```
### Unavailability Logging
- **Log Once**: When device/service becomes unavailable (info level)
- **Log Recovery**: When device/service comes back online
- **Implementation Pattern**:
```python
_unavailable_logged: bool = False

if not self._unavailable_logged:
    _LOGGER.info("The sensor is unavailable: %s", ex)
    self._unavailable_logged = True

# On recovery:
if self._unavailable_logged:
    _LOGGER.info("The sensor is back online")
    self._unavailable_logged = False
```
## Development Commands
`.vscode/tasks.json` contains useful commands for development.
### Environment
- **Local development (non-container)**: Activate the project venv before running commands: `source .venv/bin/activate`
- **Dev container**: No activation needed, the environment is pre-configured
### Code Quality & Linting
- **Run all linters on all files**: `prek run --all-files`
- **Run linters on staged files only**: `prek run`
- **PyLint on everything** (slow): `pylint homeassistant`
- **PyLint on specific folder**: `pylint homeassistant/components/my_integration`
- **MyPy type checking (whole project)**: `mypy homeassistant/`
- **MyPy on specific integration**: `mypy homeassistant/components/my_integration`
### Python Syntax Notes
- Python 3.14 explicitly allows `except TypeA, TypeB:` without parentheses.
### Testing
- **Quick test of changed files**: `pytest --timeout=10 --picked`
- **Update test snapshots**: Add `--snapshot-update` to pytest command
- ⚠️ Omit test results after using `--snapshot-update`
- Always run tests again without the flag to verify snapshots (see the example after this list)
- **Full test suite** (AVOID - very slow): `pytest ./tests`
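A minimal snapshot workflow (the `my_integration` path is a placeholder):
```bash
# Regenerate snapshots for one integration, then re-run to verify them
pytest --timeout=10 tests/components/my_integration --snapshot-update
pytest --timeout=10 tests/components/my_integration
```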
## Good practices
### Dependencies & Requirements
- **Update generated files after dependency changes**: `python -m script.gen_requirements_all`
- **Install all Python requirements**:
```bash
uv pip install -r requirements_all.txt -r requirements.txt -r requirements_test.txt
```
- **Install test requirements only**:
```bash
uv pip install -r requirements_test_all.txt -r requirements.txt
```
Integrations with Platinum or Gold level in the Integration Quality Scale reflect a high standard of code quality and maintainability. When looking for examples of something, these are good places to start. The level is indicated in the manifest.json of the integration.
### Translations
- **Update translations after strings.json changes**:
```bash
python -m script.translations develop --all
```
### Project Validation
- **Run hassfest** (checks project structure and updates generated files):
```bash
python -m script.hassfest
```
## Common Anti-Patterns & Best Practices
### ❌ **Avoid These Patterns**
```python
# Blocking operations in event loop
data = requests.get(url)  # ❌ Blocks event loop
time.sleep(5)  # ❌ Blocks event loop

# Reusing BleakClient instances
self.client = BleakClient(address)
await self.client.connect()
# Later...
await self.client.connect()  # ❌ Don't reuse

# Hardcoded strings in code
self._attr_name = "Temperature Sensor"  # ❌ Not translatable

# Missing error handling
data = await self.api.get_data()  # ❌ No exception handling

# Storing sensitive data in diagnostics
return {"api_key": entry.data[CONF_API_KEY]}  # ❌ Exposes secrets

# Accessing hass.data directly in tests
coordinator = hass.data[DOMAIN][entry.entry_id]  # ❌ Don't access hass.data

# User-configurable polling intervals
# In config flow
vol.Optional("scan_interval", default=60): cv.positive_int  # ❌ Not allowed
# In coordinator
update_interval = timedelta(minutes=entry.data.get("scan_interval", 1))  # ❌ Not allowed

# User-configurable config entry names (non-helper integrations)
vol.Optional("name", default="My Device"): cv.string  # ❌ Not allowed in regular integrations

# Too much code in try block
try:
    response = await client.get_data()  # Can throw
    # ❌ Data processing should be outside try block
    temperature = response["temperature"] / 10
    humidity = response["humidity"]
    self._attr_native_value = temperature
except ClientError:
    _LOGGER.error("Failed to fetch data")

# Bare exceptions in regular code
try:
    value = await sensor.read_value()
except Exception:  # ❌ Too broad - catch specific exceptions
    _LOGGER.error("Failed to read sensor")
```
### ✅ **Use These Patterns Instead**
```python
# Async operations with executor
data = await hass.async_add_executor_job(requests.get, url)
await asyncio.sleep(5)  # ✅ Non-blocking

# Fresh BleakClient instances
client = BleakClient(address)  # ✅ New instance each time
await client.connect()

# Translatable entity names
_attr_translation_key = "temperature_sensor"  # ✅ Translatable

# Proper error handling
try:
    data = await self.api.get_data()
except ApiException as err:
    raise UpdateFailed(f"API error: {err}") from err

# Redacted diagnostics data
return async_redact_data(data, {"api_key", "password"})  # ✅ Safe

# Test through proper integration setup and fixtures
@pytest.fixture
async def init_integration(hass, mock_config_entry, mock_api):
    mock_config_entry.add_to_hass(hass)
    await hass.config_entries.async_setup(mock_config_entry.entry_id)  # ✅ Proper setup

# Integration-determined polling intervals (not user-configurable)
SCAN_INTERVAL = timedelta(minutes=5)  # ✅ Common pattern: constant in const.py

class MyCoordinator(DataUpdateCoordinator[MyData]):
    def __init__(self, hass: HomeAssistant, client: MyClient, config_entry: ConfigEntry) -> None:
        # ✅ Integration determines interval based on device capabilities, connection type, etc.
        interval = timedelta(minutes=1) if client.is_local else SCAN_INTERVAL
        super().__init__(
            hass,
            logger=LOGGER,
            name=DOMAIN,
            update_interval=interval,
            config_entry=config_entry,  # ✅ Pass config_entry - it's accepted and recommended
        )
```

CODEOWNERS (generated, 28 changed lines)

View File

@@ -242,8 +242,6 @@ build.json @home-assistant/supervisor
/tests/components/bosch_alarm/ @mag1024 @sanjay900
/homeassistant/components/bosch_shc/ @tschamm
/tests/components/bosch_shc/ @tschamm
/homeassistant/components/brands/ @home-assistant/core
/tests/components/brands/ @home-assistant/core
/homeassistant/components/braviatv/ @bieniu @Drafteed
/tests/components/braviatv/ @bieniu @Drafteed
/homeassistant/components/bring/ @miaucl @tr4nt0r
@@ -281,8 +279,6 @@ build.json @home-assistant/supervisor
/tests/components/cert_expiry/ @jjlawren
/homeassistant/components/chacon_dio/ @cnico
/tests/components/chacon_dio/ @cnico
/homeassistant/components/chess_com/ @joostlek
/tests/components/chess_com/ @joostlek
/homeassistant/components/cisco_ios/ @fbradyirl
/homeassistant/components/cisco_mobility_express/ @fbradyirl
/homeassistant/components/cisco_webex_teams/ @fbradyirl
@@ -403,6 +399,8 @@ build.json @home-assistant/supervisor
/tests/components/dsmr_reader/ @sorted-bits @glodenox @erwindouna
/homeassistant/components/duckdns/ @tr4nt0r
/tests/components/duckdns/ @tr4nt0r
/homeassistant/components/duke_energy/ @hunterjm
/tests/components/duke_energy/ @hunterjm
/homeassistant/components/duotecno/ @cereal2nd
/tests/components/duotecno/ @cereal2nd
/homeassistant/components/dwd_weather_warnings/ @runningman84 @stephan192
@@ -719,8 +717,8 @@ build.json @home-assistant/supervisor
/tests/components/homematic/ @pvizeli
/homeassistant/components/homematicip_cloud/ @hahn-th @lackas
/tests/components/homematicip_cloud/ @hahn-th @lackas
/homeassistant/components/homevolt/ @danielhiversen @liudger
/tests/components/homevolt/ @danielhiversen @liudger
/homeassistant/components/homevolt/ @danielhiversen
/tests/components/homevolt/ @danielhiversen
/homeassistant/components/homewizard/ @DCSBL
/tests/components/homewizard/ @DCSBL
/homeassistant/components/honeywell/ @rdfurman @mkmer
@@ -794,8 +792,6 @@ build.json @home-assistant/supervisor
/tests/components/inels/ @epdevlab
/homeassistant/components/influxdb/ @mdegat01 @Robbie1221
/tests/components/influxdb/ @mdegat01 @Robbie1221
/homeassistant/components/infrared/ @home-assistant/core
/tests/components/infrared/ @home-assistant/core
/homeassistant/components/inkbird/ @bdraco
/tests/components/inkbird/ @bdraco
/homeassistant/components/input_boolean/ @home-assistant/core
@@ -1202,8 +1198,6 @@ build.json @home-assistant/supervisor
/tests/components/open_meteo/ @frenck
/homeassistant/components/open_router/ @joostlek
/tests/components/open_router/ @joostlek
/homeassistant/components/opendisplay/ @g4bri3lDev
/tests/components/opendisplay/ @g4bri3lDev
/homeassistant/components/openerz/ @misialq
/tests/components/openerz/ @misialq
/homeassistant/components/openevse/ @c00w @firstof9
@@ -1309,8 +1303,8 @@ build.json @home-assistant/supervisor
/tests/components/prosegur/ @dgomes
/homeassistant/components/proximity/ @mib1185
/tests/components/proximity/ @mib1185
/homeassistant/components/proxmoxve/ @Corbeno @erwindouna @CoMPaTech
/tests/components/proxmoxve/ @Corbeno @erwindouna @CoMPaTech
/homeassistant/components/proxmoxve/ @jhollowe @Corbeno @erwindouna
/tests/components/proxmoxve/ @jhollowe @Corbeno @erwindouna
/homeassistant/components/ps4/ @ktnrg45
/tests/components/ps4/ @ktnrg45
/homeassistant/components/pterodactyl/ @elmurato
@@ -1654,8 +1648,8 @@ build.json @home-assistant/supervisor
/tests/components/system_bridge/ @timmo001
/homeassistant/components/systemmonitor/ @gjohansson-ST
/tests/components/systemmonitor/ @gjohansson-ST
/homeassistant/components/systemnexa2/ @konsulten
/tests/components/systemnexa2/ @konsulten
/homeassistant/components/systemnexa2/ @konsulten @slangstrom
/tests/components/systemnexa2/ @konsulten @slangstrom
/homeassistant/components/tado/ @erwindouna
/tests/components/tado/ @erwindouna
/homeassistant/components/tag/ @home-assistant/core
@@ -1695,6 +1689,7 @@ build.json @home-assistant/supervisor
/tests/components/tessie/ @Bre77
/homeassistant/components/text/ @home-assistant/core
/tests/components/text/ @home-assistant/core
/homeassistant/components/tfiac/ @fredrike @mellado
/homeassistant/components/thermobeacon/ @bdraco
/tests/components/thermobeacon/ @bdraco
/homeassistant/components/thermopro/ @bdraco @h3ss
@@ -1902,8 +1897,8 @@ build.json @home-assistant/supervisor
/tests/components/withings/ @joostlek
/homeassistant/components/wiz/ @sbidy @arturpragacz
/tests/components/wiz/ @sbidy @arturpragacz
/homeassistant/components/wled/ @frenck @mik-laj
/tests/components/wled/ @frenck @mik-laj
/homeassistant/components/wled/ @frenck
/tests/components/wled/ @frenck
/homeassistant/components/wmspro/ @mback2k
/tests/components/wmspro/ @mback2k
/homeassistant/components/wolflink/ @adamkrol93 @mtielen
@@ -1971,7 +1966,6 @@ build.json @home-assistant/supervisor
/homeassistant/components/zone/ @home-assistant/core
/tests/components/zone/ @home-assistant/core
/homeassistant/components/zoneminder/ @rohankapoorcom @nabbi
/tests/components/zoneminder/ @rohankapoorcom @nabbi
/homeassistant/components/zwave_js/ @home-assistant/z-wave
/tests/components/zwave_js/ @home-assistant/z-wave
/homeassistant/components/zwave_me/ @lawfulchaos @Z-Wave-Me @PoltoS

Dockerfile generated
View File

@@ -30,7 +30,7 @@ RUN \
# Verify go2rtc can be executed
go2rtc --version \
# Install uv
&& pip3 install uv==0.10.6
&& pip3 install uv==0.9.26
WORKDIR /usr/src

View File

@@ -10,7 +10,6 @@ coverage:
target: auto
threshold: 1
paths:
- homeassistant/components/*/backup.py
- homeassistant/components/*/config_flow.py
- homeassistant/components/*/device_action.py
- homeassistant/components/*/device_condition.py
@@ -29,7 +28,6 @@ coverage:
target: 100
threshold: 0
paths:
- homeassistant/components/*/backup.py
- homeassistant/components/*/config_flow.py
- homeassistant/components/*/device_action.py
- homeassistant/components/*/device_condition.py

View File

@@ -70,7 +70,7 @@ from .const import (
SIGNAL_BOOTSTRAP_INTEGRATIONS,
)
from .core_config import async_process_ha_core_config
from .exceptions import HomeAssistantError, UnsupportedStorageVersionError
from .exceptions import HomeAssistantError
from .helpers import (
area_registry,
category_registry,
@@ -210,7 +210,6 @@ DEFAULT_INTEGRATIONS = {
"analytics", # Needed for onboarding
"application_credentials",
"backup",
"brands",
"frontend",
"hardware",
"labs",
@@ -236,14 +235,9 @@ DEFAULT_INTEGRATIONS = {
"input_text",
"schedule",
"timer",
#
# Base platforms:
*BASE_PLATFORMS,
}
DEFAULT_INTEGRATIONS_RECOVERY_MODE = {
# These integrations are set up if recovery mode is activated.
"backup",
"cloud",
"frontend",
}
DEFAULT_INTEGRATIONS_SUPERVISOR = {
@@ -438,56 +432,32 @@ def _init_blocking_io_modules_in_executor() -> None:
is_docker_env()
async def async_load_base_functionality(hass: core.HomeAssistant) -> bool:
"""Load the registries and modules that will do blocking I/O.
Return whether loading succeeded.
"""
async def async_load_base_functionality(hass: core.HomeAssistant) -> None:
"""Load the registries and modules that will do blocking I/O."""
if DATA_REGISTRIES_LOADED in hass.data:
return True
return
hass.data[DATA_REGISTRIES_LOADED] = None
entity.async_setup(hass)
frame.async_setup(hass)
template.async_setup(hass)
translation.async_setup(hass)
recovery = hass.config.recovery_mode
try:
await asyncio.gather(
create_eager_task(get_internal_store_manager(hass).async_initialize()),
create_eager_task(area_registry.async_load(hass, load_empty=recovery)),
create_eager_task(category_registry.async_load(hass, load_empty=recovery)),
create_eager_task(device_registry.async_load(hass, load_empty=recovery)),
create_eager_task(entity_registry.async_load(hass, load_empty=recovery)),
create_eager_task(floor_registry.async_load(hass, load_empty=recovery)),
create_eager_task(issue_registry.async_load(hass, load_empty=recovery)),
create_eager_task(label_registry.async_load(hass, load_empty=recovery)),
hass.async_add_executor_job(_init_blocking_io_modules_in_executor),
create_eager_task(template.async_load_custom_templates(hass)),
create_eager_task(restore_state.async_load(hass, load_empty=recovery)),
create_eager_task(hass.config_entries.async_initialize()),
create_eager_task(async_get_system_info(hass)),
create_eager_task(condition.async_setup(hass)),
create_eager_task(trigger.async_setup(hass)),
)
except UnsupportedStorageVersionError as err:
# If we're already in recovery mode, we don't want to handle the exception
# and activate recovery mode again, as that would lead to an infinite loop.
if recovery:
raise
_LOGGER.error(
"Storage file %s was created by a newer version of Home Assistant"
" (storage version %s > %s); activating recovery mode; on-disk data"
" is preserved; upgrade Home Assistant or restore from a backup",
err.storage_key,
err.found_version,
err.max_supported_version,
)
return False
return True
await asyncio.gather(
create_eager_task(get_internal_store_manager(hass).async_initialize()),
create_eager_task(area_registry.async_load(hass)),
create_eager_task(category_registry.async_load(hass)),
create_eager_task(device_registry.async_load(hass)),
create_eager_task(entity_registry.async_load(hass)),
create_eager_task(floor_registry.async_load(hass)),
create_eager_task(issue_registry.async_load(hass)),
create_eager_task(label_registry.async_load(hass)),
hass.async_add_executor_job(_init_blocking_io_modules_in_executor),
create_eager_task(template.async_load_custom_templates(hass)),
create_eager_task(restore_state.async_load(hass)),
create_eager_task(hass.config_entries.async_initialize()),
create_eager_task(async_get_system_info(hass)),
create_eager_task(condition.async_setup(hass)),
create_eager_task(trigger.async_setup(hass)),
)
async def async_from_config_dict(
@@ -504,9 +474,7 @@ async def async_from_config_dict(
# Prime custom component cache early so we know if registry entries are tied
# to a custom integration
await loader.async_get_custom_components(hass)
if not await async_load_base_functionality(hass):
return None
await async_load_base_functionality(hass)
# Set up core.
_LOGGER.debug("Setting up %s", CORE_INTEGRATIONS)

View File

@@ -1,5 +0,0 @@
{
"domain": "ubisys",
"name": "Ubisys",
"iot_standards": ["zigbee"]
}

View File

@@ -12,6 +12,10 @@ from homeassistant.helpers.dispatcher import dispatcher_send
from .const import DOMAIN, DOMAIN_DATA, LOGGER
SERVICE_SETTINGS = "change_setting"
SERVICE_CAPTURE_IMAGE = "capture_image"
SERVICE_TRIGGER_AUTOMATION = "trigger_automation"
ATTR_SETTING = "setting"
ATTR_VALUE = "value"
@@ -71,13 +75,16 @@ def async_setup_services(hass: HomeAssistant) -> None:
"""Home Assistant services."""
hass.services.async_register(
DOMAIN, "change_setting", _change_setting, schema=CHANGE_SETTING_SCHEMA
DOMAIN, SERVICE_SETTINGS, _change_setting, schema=CHANGE_SETTING_SCHEMA
)
hass.services.async_register(
DOMAIN, "capture_image", _capture_image, schema=CAPTURE_IMAGE_SCHEMA
DOMAIN, SERVICE_CAPTURE_IMAGE, _capture_image, schema=CAPTURE_IMAGE_SCHEMA
)
hass.services.async_register(
DOMAIN, "trigger_automation", _trigger_automation, schema=AUTOMATION_SCHEMA
DOMAIN,
SERVICE_TRIGGER_AUTOMATION,
_trigger_automation,
schema=AUTOMATION_SCHEMA,
)

View File

@@ -7,5 +7,5 @@
"integration_type": "service",
"iot_class": "cloud_polling",
"loggers": ["accuweather"],
"requirements": ["accuweather==5.1.0"]
"requirements": ["accuweather==5.0.0"]
}

View File

@@ -30,8 +30,6 @@ async def system_health_info(hass: HomeAssistant) -> dict[str, Any]:
)
return {
"can_reach_server": system_health.async_check_can_reach_url(
hass, str(ENDPOINT)
),
"can_reach_server": system_health.async_check_can_reach_url(hass, ENDPOINT),
"remaining_requests": remaining_requests,
}

View File

@@ -191,7 +191,7 @@ class AccuWeatherEntity(
{
ATTR_FORECAST_TIME: utc_from_timestamp(item["EpochDate"]).isoformat(),
ATTR_FORECAST_CLOUD_COVERAGE: item["CloudCoverDay"],
ATTR_FORECAST_HUMIDITY: item["RelativeHumidityDay"].get("Average"),
ATTR_FORECAST_HUMIDITY: item["RelativeHumidityDay"]["Average"],
ATTR_FORECAST_NATIVE_TEMP: item["TemperatureMax"][ATTR_VALUE],
ATTR_FORECAST_NATIVE_TEMP_LOW: item["TemperatureMin"][ATTR_VALUE],
ATTR_FORECAST_NATIVE_APPARENT_TEMP: item["RealFeelTemperatureMax"][

View File

@@ -10,6 +10,8 @@ from homeassistant.helpers import config_validation as cv, service
from .const import DOMAIN
ADVANTAGE_AIR_SERVICE_SET_TIME_TO = "set_time_to"
@callback
def async_setup_services(hass: HomeAssistant) -> None:
@@ -18,7 +20,7 @@ def async_setup_services(hass: HomeAssistant) -> None:
service.async_register_platform_entity_service(
hass,
DOMAIN,
"set_time_to",
ADVANTAGE_AIR_SERVICE_SET_TIME_TO,
entity_domain=SENSOR_DOMAIN,
schema={vol.Required("minutes"): cv.positive_int},
func="set_time_to",

View File

@@ -8,12 +8,18 @@ from homeassistant.helpers import service
from .const import DOMAIN
_DEV_EN_ALT = "enable_alerts"
_DEV_DS_ALT = "disable_alerts"
_DEV_EN_REC = "start_recording"
_DEV_DS_REC = "stop_recording"
_DEV_SNAP = "snapshot"
CAMERA_SERVICES = {
"enable_alerts": "async_enable_alerts",
"disable_alerts": "async_disable_alerts",
"start_recording": "async_start_recording",
"stop_recording": "async_stop_recording",
"snapshot": "async_snapshot",
_DEV_EN_ALT: "async_enable_alerts",
_DEV_DS_ALT: "async_disable_alerts",
_DEV_EN_REC: "async_start_recording",
_DEV_DS_REC: "async_stop_recording",
_DEV_SNAP: "async_snapshot",
}

View File

@@ -93,6 +93,7 @@ class AirobotNumber(AirobotEntity, NumberEntity):
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="set_value_failed",
translation_placeholders={"error": str(err)},
) from err
else:
await self.coordinator.async_request_refresh()

View File

@@ -112,7 +112,7 @@
"message": "Failed to set temperature to {temperature}."
},
"set_value_failed": {
"message": "Failed to set value."
"message": "Failed to set value: {error}"
},
"switch_turn_off_failed": {
"message": "Failed to turn off {switch}."

View File

@@ -4,16 +4,7 @@ from __future__ import annotations
import logging
from airos.airos6 import AirOS6
from airos.airos8 import AirOS8
from airos.exceptions import (
AirOSConnectionAuthenticationError,
AirOSConnectionSetupError,
AirOSDataMissingError,
AirOSDeviceConnectionError,
AirOSKeyDataMissingError,
)
from airos.helpers import DetectDeviceData, async_get_firmware_data
from homeassistant.const import (
CONF_HOST,
@@ -24,11 +15,6 @@ from homeassistant.const import (
Platform,
)
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import (
ConfigEntryAuthFailed,
ConfigEntryError,
ConfigEntryNotReady,
)
from homeassistant.helpers import device_registry as dr, entity_registry as er
from homeassistant.helpers.aiohttp_client import async_get_clientsession
@@ -53,40 +39,15 @@ async def async_setup_entry(hass: HomeAssistant, entry: AirOSConfigEntry) -> boo
hass, verify_ssl=entry.data[SECTION_ADVANCED_SETTINGS][CONF_VERIFY_SSL]
)
conn_data = {
CONF_HOST: entry.data[CONF_HOST],
CONF_USERNAME: entry.data[CONF_USERNAME],
CONF_PASSWORD: entry.data[CONF_PASSWORD],
"use_ssl": entry.data[SECTION_ADVANCED_SETTINGS][CONF_SSL],
"session": session,
}
# Determine firmware version before creating the device instance
try:
device_data: DetectDeviceData = await async_get_firmware_data(**conn_data)
except (
AirOSConnectionSetupError,
AirOSDeviceConnectionError,
TimeoutError,
) as err:
raise ConfigEntryNotReady from err
except (
AirOSConnectionAuthenticationError,
AirOSDataMissingError,
) as err:
raise ConfigEntryAuthFailed from err
except AirOSKeyDataMissingError as err:
raise ConfigEntryError("key_data_missing") from err
except Exception as err:
raise ConfigEntryError("unknown") from err
airos_class: type[AirOS8 | AirOS6] = (
AirOS8 if device_data["fw_major"] == 8 else AirOS6
airos_device = AirOS8(
host=entry.data[CONF_HOST],
username=entry.data[CONF_USERNAME],
password=entry.data[CONF_PASSWORD],
session=session,
use_ssl=entry.data[SECTION_ADVANCED_SETTINGS][CONF_SSL],
)
airos_device = airos_class(**conn_data)
coordinator = AirOSDataUpdateCoordinator(hass, entry, device_data, airos_device)
coordinator = AirOSDataUpdateCoordinator(hass, entry, airos_device)
await coordinator.async_config_entry_first_refresh()
entry.runtime_data = coordinator

View File

@@ -4,9 +4,7 @@ from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass
from typing import Generic, TypeVar
from airos.data import AirOSDataBaseClass
import logging
from homeassistant.components.binary_sensor import (
BinarySensorDeviceClass,
@@ -20,24 +18,25 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import AirOS8Data, AirOSConfigEntry, AirOSDataUpdateCoordinator
from .entity import AirOSEntity
PARALLEL_UPDATES = 0
_LOGGER = logging.getLogger(__name__)
AirOSDataModel = TypeVar("AirOSDataModel", bound=AirOSDataBaseClass)
PARALLEL_UPDATES = 0
@dataclass(frozen=True, kw_only=True)
class AirOSBinarySensorEntityDescription(
BinarySensorEntityDescription,
Generic[AirOSDataModel],
):
class AirOSBinarySensorEntityDescription(BinarySensorEntityDescription):
"""Describe an AirOS binary sensor."""
value_fn: Callable[[AirOSDataModel], bool]
value_fn: Callable[[AirOS8Data], bool]
AirOS8BinarySensorEntityDescription = AirOSBinarySensorEntityDescription[AirOS8Data]
COMMON_BINARY_SENSORS: tuple[AirOSBinarySensorEntityDescription, ...] = (
BINARY_SENSORS: tuple[AirOSBinarySensorEntityDescription, ...] = (
AirOSBinarySensorEntityDescription(
key="portfw",
translation_key="port_forwarding",
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=lambda data: data.portfw,
),
AirOSBinarySensorEntityDescription(
key="dhcp_client",
translation_key="dhcp_client",
@@ -54,23 +53,6 @@ COMMON_BINARY_SENSORS: tuple[AirOSBinarySensorEntityDescription, ...] = (
entity_registry_enabled_default=False,
),
AirOSBinarySensorEntityDescription(
key="pppoe",
translation_key="pppoe",
device_class=BinarySensorDeviceClass.CONNECTIVITY,
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=lambda data: data.services.pppoe,
entity_registry_enabled_default=False,
),
)
AIROS8_BINARY_SENSORS: tuple[AirOS8BinarySensorEntityDescription, ...] = (
AirOS8BinarySensorEntityDescription(
key="portfw",
translation_key="port_forwarding",
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=lambda data: data.portfw,
),
AirOS8BinarySensorEntityDescription(
key="dhcp6_server",
translation_key="dhcp6_server",
device_class=BinarySensorDeviceClass.RUNNING,
@@ -78,6 +60,14 @@ AIROS8_BINARY_SENSORS: tuple[AirOS8BinarySensorEntityDescription, ...] = (
value_fn=lambda data: data.services.dhcp6d_stateful,
entity_registry_enabled_default=False,
),
AirOSBinarySensorEntityDescription(
key="pppoe",
translation_key="pppoe",
device_class=BinarySensorDeviceClass.CONNECTIVITY,
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=lambda data: data.services.pppoe,
entity_registry_enabled_default=False,
),
)
@@ -89,18 +79,9 @@ async def async_setup_entry(
"""Set up the AirOS binary sensors from a config entry."""
coordinator = config_entry.runtime_data
entities = [
AirOSBinarySensor(coordinator, description)
for description in COMMON_BINARY_SENSORS
]
if coordinator.device_data["fw_major"] == 8:
entities.extend(
AirOSBinarySensor(coordinator, description)
for description in AIROS8_BINARY_SENSORS
)
async_add_entities(entities)
async_add_entities(
AirOSBinarySensor(coordinator, description) for description in BINARY_SENSORS
)
class AirOSBinarySensor(AirOSEntity, BinarySensorEntity):

View File

@@ -2,6 +2,8 @@
from __future__ import annotations
import logging
from airos.exceptions import AirOSException
from homeassistant.components.button import (
@@ -16,6 +18,8 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import DOMAIN, AirOSConfigEntry, AirOSDataUpdateCoordinator
from .entity import AirOSEntity
_LOGGER = logging.getLogger(__name__)
PARALLEL_UPDATES = 0
REBOOT_BUTTON = ButtonEntityDescription(

View File

@@ -7,8 +7,6 @@ from collections.abc import Mapping
import logging
from typing import Any
from airos.airos6 import AirOS6
from airos.airos8 import AirOS8
from airos.discovery import airos_discover_devices
from airos.exceptions import (
AirOSConnectionAuthenticationError,
@@ -19,7 +17,6 @@ from airos.exceptions import (
AirOSKeyDataMissingError,
AirOSListenerError,
)
from airos.helpers import DetectDeviceData, async_get_firmware_data
import voluptuous as vol
from homeassistant.config_entries import (
@@ -56,11 +53,10 @@ from .const import (
MAC_ADDRESS,
SECTION_ADVANCED_SETTINGS,
)
from .coordinator import AirOS8
_LOGGER = logging.getLogger(__name__)
AirOSDeviceDetect = AirOS8 | AirOS6
# Discovery duration in seconds, airOS announces every 20 seconds
DISCOVER_INTERVAL: int = 30
@@ -96,7 +92,7 @@ class AirOSConfigFlow(ConfigFlow, domain=DOMAIN):
def __init__(self) -> None:
"""Initialize the config flow."""
super().__init__()
self.airos_device: AirOSDeviceDetect
self.airos_device: AirOS8
self.errors: dict[str, str] = {}
self.discovered_devices: dict[str, dict[str, Any]] = {}
self.discovery_abort_reason: str | None = None
@@ -139,14 +135,16 @@ class AirOSConfigFlow(ConfigFlow, domain=DOMAIN):
verify_ssl=config_data[SECTION_ADVANCED_SETTINGS][CONF_VERIFY_SSL],
)
airos_device = AirOS8(
host=config_data[CONF_HOST],
username=config_data[CONF_USERNAME],
password=config_data[CONF_PASSWORD],
session=session,
use_ssl=config_data[SECTION_ADVANCED_SETTINGS][CONF_SSL],
)
try:
device_data: DetectDeviceData = await async_get_firmware_data(
host=config_data[CONF_HOST],
username=config_data[CONF_USERNAME],
password=config_data[CONF_PASSWORD],
session=session,
use_ssl=config_data[SECTION_ADVANCED_SETTINGS][CONF_SSL],
)
await airos_device.login()
airos_data = await airos_device.status()
except (
AirOSConnectionSetupError,
@@ -161,14 +159,14 @@ class AirOSConfigFlow(ConfigFlow, domain=DOMAIN):
_LOGGER.exception("Unexpected exception during credential validation")
self.errors["base"] = "unknown"
else:
await self.async_set_unique_id(device_data["mac"])
await self.async_set_unique_id(airos_data.derived.mac)
if self.source in [SOURCE_REAUTH, SOURCE_RECONFIGURE]:
self._abort_if_unique_id_mismatch()
else:
self._abort_if_unique_id_configured()
return {"title": device_data["hostname"], "data": config_data}
return {"title": airos_data.host.hostname, "data": config_data}
return None

View File

@@ -4,7 +4,6 @@ from __future__ import annotations
import logging
from airos.airos6 import AirOS6, AirOS6Data
from airos.airos8 import AirOS8, AirOS8Data
from airos.exceptions import (
AirOSConnectionAuthenticationError,
@@ -12,7 +11,6 @@ from airos.exceptions import (
AirOSDataMissingError,
AirOSDeviceConnectionError,
)
from airos.helpers import DetectDeviceData
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
@@ -23,28 +21,19 @@ from .const import DOMAIN, SCAN_INTERVAL
_LOGGER = logging.getLogger(__name__)
AirOSDeviceDetect = AirOS8 | AirOS6
AirOSDataDetect = AirOS8Data | AirOS6Data
type AirOSConfigEntry = ConfigEntry[AirOSDataUpdateCoordinator]
class AirOSDataUpdateCoordinator(DataUpdateCoordinator[AirOSDataDetect]):
class AirOSDataUpdateCoordinator(DataUpdateCoordinator[AirOS8Data]):
"""Class to manage fetching AirOS data from single endpoint."""
airos_device: AirOSDeviceDetect
config_entry: AirOSConfigEntry
def __init__(
self,
hass: HomeAssistant,
config_entry: AirOSConfigEntry,
device_data: DetectDeviceData,
airos_device: AirOSDeviceDetect,
self, hass: HomeAssistant, config_entry: AirOSConfigEntry, airos_device: AirOS8
) -> None:
"""Initialize the coordinator."""
self.airos_device = airos_device
self.device_data = device_data
super().__init__(
hass,
_LOGGER,
@@ -53,7 +42,7 @@ class AirOSDataUpdateCoordinator(DataUpdateCoordinator[AirOSDataDetect]):
update_interval=SCAN_INTERVAL,
)
async def _async_update_data(self) -> AirOSDataDetect:
async def _async_update_data(self) -> AirOS8Data:
"""Fetch data from AirOS."""
try:
await self.airos_device.login()
@@ -73,7 +62,7 @@ class AirOSDataUpdateCoordinator(DataUpdateCoordinator[AirOSDataDetect]):
translation_domain=DOMAIN,
translation_key="cannot_connect",
) from err
except AirOSDataMissingError as err:
except (AirOSDataMissingError,) as err:
_LOGGER.error("Expected data not returned by airOS device: %s", err)
raise UpdateFailed(
translation_domain=DOMAIN,

View File

@@ -7,6 +7,6 @@
"documentation": "https://www.home-assistant.io/integrations/airos",
"integration_type": "device",
"iot_class": "local_polling",
"quality_scale": "platinum",
"quality_scale": "silver",
"requirements": ["airos==0.6.4"]
}

View File

@@ -42,20 +42,16 @@ rules:
# Gold
devices: done
diagnostics: done
discovery-update-info: done
discovery:
status: exempt
comment: No way to detect device on the network
discovery-update-info: todo
discovery: todo
docs-data-update: done
docs-examples: done
docs-examples: todo
docs-known-limitations: done
docs-supported-devices: done
docs-supported-functions: done
docs-troubleshooting: done
docs-use-cases: done
dynamic-devices:
status: exempt
comment: single airOS device per config entry; peer/remote endpoints are not modeled as child devices/entities at this time
dynamic-devices: todo
entity-category: done
entity-device-class: done
entity-disabled-by-default: done
@@ -65,10 +61,8 @@ rules:
status: exempt
comment: no (custom) icons used or envisioned
reconfiguration-flow: done
repair-issues: done
stale-devices:
status: exempt
comment: single airOS device per config entry; peer/remote endpoints are not modeled as child devices/entities at this time
repair-issues: todo
stale-devices: todo
# Platinum
async-dependency: done

View File

@@ -5,14 +5,8 @@ from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass
import logging
from typing import Generic, TypeVar
from airos.data import (
AirOSDataBaseClass,
DerivedWirelessMode,
DerivedWirelessRole,
NetRole,
)
from airos.data import DerivedWirelessMode, DerivedWirelessRole, NetRole
from homeassistant.components.sensor import (
SensorDeviceClass,
@@ -43,19 +37,15 @@ WIRELESS_ROLE_OPTIONS = [mode.value for mode in DerivedWirelessRole]
PARALLEL_UPDATES = 0
AirOSDataModel = TypeVar("AirOSDataModel", bound=AirOSDataBaseClass)
@dataclass(frozen=True, kw_only=True)
class AirOSSensorEntityDescription(SensorEntityDescription, Generic[AirOSDataModel]):
class AirOSSensorEntityDescription(SensorEntityDescription):
"""Describe an AirOS sensor."""
value_fn: Callable[[AirOSDataModel], StateType]
value_fn: Callable[[AirOS8Data], StateType]
AirOS8SensorEntityDescription = AirOSSensorEntityDescription[AirOS8Data]
COMMON_SENSORS: tuple[AirOSSensorEntityDescription, ...] = (
SENSORS: tuple[AirOSSensorEntityDescription, ...] = (
AirOSSensorEntityDescription(
key="host_cpuload",
translation_key="host_cpuload",
@@ -85,6 +75,54 @@ COMMON_SENSORS: tuple[AirOSSensorEntityDescription, ...] = (
translation_key="wireless_essid",
value_fn=lambda data: data.wireless.essid,
),
AirOSSensorEntityDescription(
key="wireless_antenna_gain",
translation_key="wireless_antenna_gain",
native_unit_of_measurement=SIGNAL_STRENGTH_DECIBELS,
device_class=SensorDeviceClass.SIGNAL_STRENGTH,
state_class=SensorStateClass.MEASUREMENT,
value_fn=lambda data: data.wireless.antenna_gain,
),
AirOSSensorEntityDescription(
key="wireless_throughput_tx",
translation_key="wireless_throughput_tx",
native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
value_fn=lambda data: data.wireless.throughput.tx,
),
AirOSSensorEntityDescription(
key="wireless_throughput_rx",
translation_key="wireless_throughput_rx",
native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
value_fn=lambda data: data.wireless.throughput.rx,
),
AirOSSensorEntityDescription(
key="wireless_polling_dl_capacity",
translation_key="wireless_polling_dl_capacity",
native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
value_fn=lambda data: data.wireless.polling.dl_capacity,
),
AirOSSensorEntityDescription(
key="wireless_polling_ul_capacity",
translation_key="wireless_polling_ul_capacity",
native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
value_fn=lambda data: data.wireless.polling.ul_capacity,
),
AirOSSensorEntityDescription(
key="host_uptime",
translation_key="host_uptime",
@@ -120,57 +158,6 @@ COMMON_SENSORS: tuple[AirOSSensorEntityDescription, ...] = (
options=WIRELESS_ROLE_OPTIONS,
entity_registry_enabled_default=False,
),
AirOSSensorEntityDescription(
key="wireless_antenna_gain",
translation_key="wireless_antenna_gain",
native_unit_of_measurement=SIGNAL_STRENGTH_DECIBELS,
device_class=SensorDeviceClass.SIGNAL_STRENGTH,
state_class=SensorStateClass.MEASUREMENT,
value_fn=lambda data: data.wireless.antenna_gain,
),
AirOSSensorEntityDescription(
key="wireless_polling_dl_capacity",
translation_key="wireless_polling_dl_capacity",
native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
value_fn=lambda data: data.wireless.polling.dl_capacity,
),
AirOSSensorEntityDescription(
key="wireless_polling_ul_capacity",
translation_key="wireless_polling_ul_capacity",
native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
value_fn=lambda data: data.wireless.polling.ul_capacity,
),
)
AIROS8_SENSORS: tuple[AirOS8SensorEntityDescription, ...] = (
AirOS8SensorEntityDescription(
key="wireless_throughput_tx",
translation_key="wireless_throughput_tx",
native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
value_fn=lambda data: data.wireless.throughput.tx,
),
AirOS8SensorEntityDescription(
key="wireless_throughput_rx",
translation_key="wireless_throughput_rx",
native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
value_fn=lambda data: data.wireless.throughput.rx,
),
)
@@ -182,14 +169,7 @@ async def async_setup_entry(
"""Set up the AirOS sensors from a config entry."""
coordinator = config_entry.runtime_data
entities = [AirOSSensor(coordinator, description) for description in COMMON_SENSORS]
if coordinator.device_data["fw_major"] == 8:
entities.extend(
AirOSSensor(coordinator, description) for description in AIROS8_SENSORS
)
async_add_entities(entities)
async_add_entities(AirOSSensor(coordinator, description) for description in SENSORS)
class AirOSSensor(AirOSEntity, SensorEntity):

View File

@@ -18,10 +18,6 @@ from homeassistant.helpers.schema_config_entry_flow import (
SchemaOptionsFlowHandler,
)
from homeassistant.helpers.selector import BooleanSelector
from homeassistant.helpers.service_info.zeroconf import (
ATTR_PROPERTIES_ID,
ZeroconfServiceInfo,
)
from .const import CONF_CLIP_NEGATIVE, CONF_RETURN_AVERAGE, DOMAIN
@@ -50,9 +46,6 @@ class AirQConfigFlow(ConfigFlow, domain=DOMAIN):
VERSION = 1
_discovered_host: str
_discovered_name: str
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
@@ -97,58 +90,6 @@ class AirQConfigFlow(ConfigFlow, domain=DOMAIN):
step_id="user", data_schema=STEP_USER_DATA_SCHEMA, errors=errors
)
async def async_step_zeroconf(
self, discovery_info: ZeroconfServiceInfo
) -> ConfigFlowResult:
"""Handle zeroconf discovery of an air-Q device."""
self._discovered_host = discovery_info.host
self._discovered_name = discovery_info.properties.get("devicename", "air-Q")
device_id = discovery_info.properties.get(ATTR_PROPERTIES_ID)
if not device_id:
return self.async_abort(reason="incomplete_discovery")
await self.async_set_unique_id(device_id)
self._abort_if_unique_id_configured(
updates={CONF_IP_ADDRESS: self._discovered_host},
reload_on_update=True,
)
self.context["title_placeholders"] = {"name": self._discovered_name}
return await self.async_step_discovery_confirm()
async def async_step_discovery_confirm(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle user confirmation of a discovered air-Q device."""
errors: dict[str, str] = {}
if user_input is not None:
session = async_get_clientsession(self.hass)
airq = AirQ(self._discovered_host, user_input[CONF_PASSWORD], session)
try:
await airq.validate()
except ClientConnectionError:
errors["base"] = "cannot_connect"
except InvalidAuth:
errors["base"] = "invalid_auth"
else:
return self.async_create_entry(
title=self._discovered_name,
data={
CONF_IP_ADDRESS: self._discovered_host,
CONF_PASSWORD: user_input[CONF_PASSWORD],
},
)
return self.async_show_form(
step_id="discovery_confirm",
data_schema=vol.Schema({vol.Required(CONF_PASSWORD): str}),
description_placeholders={"name": self._discovered_name},
errors=errors,
)
@staticmethod
@callback
def async_get_options_flow(

View File

@@ -7,13 +7,5 @@
"integration_type": "hub",
"iot_class": "local_polling",
"loggers": ["aioairq"],
"requirements": ["aioairq==0.4.7"],
"zeroconf": [
{
"properties": {
"device": "air-q"
},
"type": "_http._tcp.local."
}
]
"requirements": ["aioairq==0.4.7"]
}

View File

@@ -1,23 +1,14 @@
{
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"incomplete_discovery": "The discovered air-Q device did not provide a device ID. Ensure the firmware is up to date."
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]"
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"invalid_auth": "[%key:common::config_flow::error::invalid_auth%]",
"invalid_input": "[%key:common::config_flow::error::invalid_host%]"
},
"flow_title": "{name}",
"step": {
"discovery_confirm": {
"data": {
"password": "[%key:common::config_flow::data::password%]"
},
"description": "Do you want to set up **{name}**?",
"title": "Set up air-Q"
},
"user": {
"data": {
"ip_address": "[%key:common::config_flow::data::ip%]",

View File

@@ -117,23 +117,23 @@ class AirtouchAC(CoordinatorEntity, ClimateEntity):
return super()._handle_coordinator_update()
@property
def current_temperature(self) -> int:
def current_temperature(self):
"""Return the current temperature."""
return self._unit.Temperature
@property
def fan_mode(self) -> str:
def fan_mode(self):
"""Return fan mode of the AC this group belongs to."""
return AT_TO_HA_FAN_SPEED[self._airtouch.acs[self._ac_number].AcFanSpeed]
@property
def fan_modes(self) -> list[str]:
def fan_modes(self):
"""Return the list of available fan modes."""
airtouch_fan_speeds = self._airtouch.GetSupportedFanSpeedsForAc(self._ac_number)
return [AT_TO_HA_FAN_SPEED[speed] for speed in airtouch_fan_speeds]
@property
def hvac_mode(self) -> HVACMode:
def hvac_mode(self):
"""Return hvac target hvac state."""
is_off = self._unit.PowerState == "Off"
if is_off:
@@ -236,17 +236,17 @@ class AirtouchGroup(CoordinatorEntity, ClimateEntity):
return self._airtouch.acs[self._unit.BelongsToAc].MaxSetpoint
@property
def current_temperature(self) -> int:
def current_temperature(self):
"""Return the current temperature."""
return self._unit.Temperature
@property
def target_temperature(self) -> int:
def target_temperature(self):
"""Return the temperature we are trying to reach."""
return self._unit.TargetSetpoint
@property
def hvac_mode(self) -> HVACMode:
def hvac_mode(self):
"""Return hvac target hvac state."""
# there are other power states that aren't 'on' but still count as on (eg. 'Turbo')
is_off = self._unit.PowerState == "Off"
@@ -272,12 +272,12 @@ class AirtouchGroup(CoordinatorEntity, ClimateEntity):
self.async_write_ha_state()
@property
def fan_mode(self) -> str:
def fan_mode(self):
"""Return fan mode of the AC this group belongs to."""
return AT_TO_HA_FAN_SPEED[self._airtouch.acs[self._unit.BelongsToAc].AcFanSpeed]
@property
def fan_modes(self) -> list[str]:
def fan_modes(self):
"""Return the list of available fan modes."""
airtouch_fan_speeds = self._airtouch.GetSupportedFanSpeedsByGroup(
self._group_number

View File

@@ -7,7 +7,13 @@ from datetime import timedelta
from math import ceil
from typing import Any
from pyairvisual.cloud_api import CloudAPI
from pyairvisual.cloud_api import (
CloudAPI,
InvalidKeyError,
KeyExpiredError,
UnauthorizedError,
)
from pyairvisual.errors import AirVisualError
from homeassistant.components import automation
from homeassistant.config_entries import SOURCE_IMPORT, ConfigEntry
@@ -22,12 +28,14 @@ from homeassistant.const import (
Platform,
)
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.helpers import (
aiohttp_client,
device_registry as dr,
entity_registry as er,
)
from homeassistant.helpers.issue_registry import IssueSeverity, async_create_issue
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import (
CONF_CITY,
@@ -39,7 +47,8 @@ from .const import (
INTEGRATION_TYPE_NODE_PRO,
LOGGER,
)
from .coordinator import AirVisualConfigEntry, AirVisualDataUpdateCoordinator
type AirVisualConfigEntry = ConfigEntry[DataUpdateCoordinator]
# We use a raw string for the airvisual_pro domain (instead of importing the actual
# constant) so that we can avoid listing it as a dependency:
@@ -76,8 +85,8 @@ def async_get_cloud_api_update_interval(
@callback
def async_get_cloud_coordinators_by_api_key(
hass: HomeAssistant, api_key: str
) -> list[AirVisualDataUpdateCoordinator]:
"""Get all AirVisualDataUpdateCoordinator objects related to a particular API key."""
) -> list[DataUpdateCoordinator]:
"""Get all DataUpdateCoordinator objects related to a particular API key."""
return [
entry.runtime_data
for entry in hass.config_entries.async_entries(DOMAIN)
@@ -171,11 +180,38 @@ async def async_setup_entry(hass: HomeAssistant, entry: AirVisualConfigEntry) ->
websession = aiohttp_client.async_get_clientsession(hass)
cloud_api = CloudAPI(entry.data[CONF_API_KEY], session=websession)
coordinator = AirVisualDataUpdateCoordinator(
async def async_update_data() -> dict[str, Any]:
"""Get new data from the API."""
if CONF_CITY in entry.data:
api_coro = cloud_api.air_quality.city(
entry.data[CONF_CITY],
entry.data[CONF_STATE],
entry.data[CONF_COUNTRY],
)
else:
api_coro = cloud_api.air_quality.nearest_city(
entry.data[CONF_LATITUDE],
entry.data[CONF_LONGITUDE],
)
try:
return await api_coro
except (InvalidKeyError, KeyExpiredError, UnauthorizedError) as ex:
raise ConfigEntryAuthFailed from ex
except AirVisualError as err:
raise UpdateFailed(f"Error while retrieving data: {err}") from err
coordinator = DataUpdateCoordinator(
hass,
entry,
cloud_api,
LOGGER,
config_entry=entry,
name=async_get_geography_id(entry.data),
# We give a placeholder update interval in order to create the coordinator;
# then, below, we use the coordinator's presence (along with any other
# coordinators using the same API key) to calculate an actual, leveled
# update interval:
update_interval=timedelta(minutes=5),
update_method=async_update_data,
)
entry.async_on_unload(entry.add_update_listener(async_reload_entry))

View File

@@ -1,72 +0,0 @@
"""Define an AirVisual data coordinator."""
from __future__ import annotations
from datetime import timedelta
from typing import Any
from pyairvisual.cloud_api import (
CloudAPI,
InvalidKeyError,
KeyExpiredError,
UnauthorizedError,
)
from pyairvisual.errors import AirVisualError
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_COUNTRY, CONF_LATITUDE, CONF_LONGITUDE, CONF_STATE
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import CONF_CITY, LOGGER
type AirVisualConfigEntry = ConfigEntry[AirVisualDataUpdateCoordinator]
class AirVisualDataUpdateCoordinator(DataUpdateCoordinator[dict[str, Any]]):
"""Class to manage fetching AirVisual data."""
config_entry: AirVisualConfigEntry
def __init__(
self,
hass: HomeAssistant,
entry: AirVisualConfigEntry,
cloud_api: CloudAPI,
name: str,
) -> None:
"""Initialize the coordinator."""
self._cloud_api = cloud_api
super().__init__(
hass,
LOGGER,
config_entry=entry,
name=name,
# We give a placeholder update interval in order to create the coordinator;
# then, in async_setup_entry, we use the coordinator's presence (along with
# any other coordinators using the same API key) to calculate an actual,
# leveled update interval:
update_interval=timedelta(minutes=5),
)
async def _async_update_data(self) -> dict[str, Any]:
"""Get new data from the API."""
if CONF_CITY in self.config_entry.data:
api_coro = self._cloud_api.air_quality.city(
self.config_entry.data[CONF_CITY],
self.config_entry.data[CONF_STATE],
self.config_entry.data[CONF_COUNTRY],
)
else:
api_coro = self._cloud_api.air_quality.nearest_city(
self.config_entry.data[CONF_LATITUDE],
self.config_entry.data[CONF_LONGITUDE],
)
try:
return await api_coro
except (InvalidKeyError, KeyExpiredError, UnauthorizedError) as ex:
raise ConfigEntryAuthFailed from ex
except AirVisualError as err:
raise UpdateFailed(f"Error while retrieving data: {err}") from err

View File

@@ -15,8 +15,8 @@ from homeassistant.const import (
)
from homeassistant.core import HomeAssistant
from . import AirVisualConfigEntry
from .const import CONF_CITY
from .coordinator import AirVisualConfigEntry
CONF_COORDINATES = "coordinates"
CONF_TITLE = "title"

View File

@@ -2,25 +2,29 @@
from __future__ import annotations
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import callback
from homeassistant.helpers.entity import EntityDescription
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .coordinator import AirVisualDataUpdateCoordinator
from homeassistant.helpers.update_coordinator import (
CoordinatorEntity,
DataUpdateCoordinator,
)
class AirVisualEntity(CoordinatorEntity[AirVisualDataUpdateCoordinator]):
class AirVisualEntity(CoordinatorEntity):
"""Define a generic AirVisual entity."""
def __init__(
self,
coordinator: AirVisualDataUpdateCoordinator,
coordinator: DataUpdateCoordinator,
entry: ConfigEntry,
description: EntityDescription,
) -> None:
"""Initialize."""
super().__init__(coordinator)
self._attr_extra_state_attributes = {}
self._entry = entry
self.entity_description = description
async def async_added_to_hass(self) -> None:

View File

@@ -8,6 +8,7 @@ from homeassistant.components.sensor import (
SensorEntityDescription,
SensorStateClass,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
ATTR_LATITUDE,
ATTR_LONGITUDE,
@@ -23,9 +24,10 @@ from homeassistant.const import (
)
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from . import AirVisualConfigEntry
from .const import CONF_CITY
from .coordinator import AirVisualConfigEntry, AirVisualDataUpdateCoordinator
from .entity import AirVisualEntity
ATTR_CITY = "city"
@@ -111,7 +113,7 @@ async def async_setup_entry(
"""Set up AirVisual sensors based on a config entry."""
coordinator = entry.runtime_data
async_add_entities(
AirVisualGeographySensor(coordinator, description, locale)
AirVisualGeographySensor(coordinator, entry, description, locale)
for locale in GEOGRAPHY_SENSOR_LOCALES
for description in GEOGRAPHY_SENSOR_DESCRIPTIONS
)
@@ -122,14 +124,14 @@ class AirVisualGeographySensor(AirVisualEntity, SensorEntity):
def __init__(
self,
coordinator: AirVisualDataUpdateCoordinator,
coordinator: DataUpdateCoordinator,
entry: ConfigEntry,
description: SensorEntityDescription,
locale: str,
) -> None:
"""Initialize."""
super().__init__(coordinator, description)
super().__init__(coordinator, entry, description)
entry = coordinator.config_entry
self._attr_extra_state_attributes.update(
{
ATTR_CITY: entry.data.get(CONF_CITY),
@@ -180,16 +182,16 @@ class AirVisualGeographySensor(AirVisualEntity, SensorEntity):
#
# We use any coordinates in the config entry and, in the case of a geography by
# name, we fall back to the latitude longitude provided in the coordinator data:
latitude = self.coordinator.config_entry.data.get(
latitude = self._entry.data.get(
CONF_LATITUDE,
self.coordinator.data["location"]["coordinates"][1],
)
longitude = self.coordinator.config_entry.data.get(
longitude = self._entry.data.get(
CONF_LONGITUDE,
self.coordinator.data["location"]["coordinates"][0],
)
if self.coordinator.config_entry.options[CONF_SHOW_ON_MAP]:
if self._entry.options[CONF_SHOW_ON_MAP]:
self._attr_extra_state_attributes[ATTR_LATITUDE] = latitude
self._attr_extra_state_attributes[ATTR_LONGITUDE] = longitude
self._attr_extra_state_attributes.pop("lati", None)

View File

@@ -4,9 +4,18 @@ from __future__ import annotations
import asyncio
from contextlib import suppress
from dataclasses import dataclass
from datetime import timedelta
from typing import Any
from pyairvisual.node import NodeProError, NodeSamba
from pyairvisual.node import (
InvalidAuthenticationError,
NodeConnectionError,
NodeProError,
NodeSamba,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
CONF_IP_ADDRESS,
CONF_PASSWORD,
@@ -14,16 +23,25 @@ from homeassistant.const import (
Platform,
)
from homeassistant.core import Event, HomeAssistant
from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .coordinator import (
AirVisualProConfigEntry,
AirVisualProCoordinator,
AirVisualProData,
)
from .const import LOGGER
PLATFORMS = [Platform.SENSOR]
UPDATE_INTERVAL = timedelta(minutes=1)
type AirVisualProConfigEntry = ConfigEntry[AirVisualProData]
@dataclass
class AirVisualProData:
"""Define a data class."""
coordinator: DataUpdateCoordinator
node: NodeSamba
async def async_setup_entry(
hass: HomeAssistant, entry: AirVisualProConfigEntry
@@ -36,15 +54,48 @@ async def async_setup_entry(
except NodeProError as err:
raise ConfigEntryNotReady from err
coordinator = AirVisualProCoordinator(hass, entry, node)
reload_task: asyncio.Task | None = None
async def async_get_data() -> dict[str, Any]:
"""Get data from the device."""
try:
data = await node.async_get_latest_measurements()
data["history"] = {}
if data["settings"].get("follow_mode") == "device":
history = await node.async_get_history(include_trends=False)
data["history"] = history.get("measurements", [])[-1]
except InvalidAuthenticationError as err:
raise ConfigEntryAuthFailed("Invalid Samba password") from err
except NodeConnectionError as err:
nonlocal reload_task
if not reload_task:
reload_task = hass.async_create_task(
hass.config_entries.async_reload(entry.entry_id)
)
raise UpdateFailed(f"Connection to Pro unit lost: {err}") from err
except NodeProError as err:
raise UpdateFailed(f"Error while retrieving data: {err}") from err
return data
coordinator = DataUpdateCoordinator(
hass,
LOGGER,
config_entry=entry,
name="Node/Pro data",
update_interval=UPDATE_INTERVAL,
update_method=async_get_data,
)
await coordinator.async_config_entry_first_refresh()
entry.runtime_data = AirVisualProData(coordinator=coordinator, node=node)
async def async_shutdown(_: Event) -> None:
"""Define an event handler to disconnect from the websocket."""
if coordinator.reload_task:
nonlocal reload_task
if reload_task:
with suppress(asyncio.CancelledError):
coordinator.reload_task.cancel()
reload_task.cancel()
await node.async_disconnect()
entry.async_on_unload(

View File

@@ -1,79 +0,0 @@
"""DataUpdateCoordinator for the AirVisual Pro integration."""
from __future__ import annotations
import asyncio
from dataclasses import dataclass
from datetime import timedelta
from typing import Any
from pyairvisual.node import (
InvalidAuthenticationError,
NodeConnectionError,
NodeProError,
NodeSamba,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import LOGGER
UPDATE_INTERVAL = timedelta(minutes=1)
@dataclass
class AirVisualProData:
"""Define a data class."""
coordinator: AirVisualProCoordinator
node: NodeSamba
type AirVisualProConfigEntry = ConfigEntry[AirVisualProData]
class AirVisualProCoordinator(DataUpdateCoordinator[dict[str, Any]]):
"""Coordinator for AirVisual Pro data."""
config_entry: AirVisualProConfigEntry
def __init__(
self,
hass: HomeAssistant,
config_entry: AirVisualProConfigEntry,
node: NodeSamba,
) -> None:
"""Initialize."""
super().__init__(
hass,
LOGGER,
config_entry=config_entry,
name="Node/Pro data",
update_interval=UPDATE_INTERVAL,
)
self._node = node
self.reload_task: asyncio.Task[bool] | None = None
async def _async_update_data(self) -> dict[str, Any]:
"""Get data from the device."""
try:
data = await self._node.async_get_latest_measurements()
data["history"] = {}
if data["settings"].get("follow_mode") == "device":
history = await self._node.async_get_history(include_trends=False)
data["history"] = history.get("measurements", [])[-1]
except InvalidAuthenticationError as err:
raise ConfigEntryAuthFailed("Invalid Samba password") from err
except NodeConnectionError as err:
if self.reload_task is None:
self.reload_task = self.hass.async_create_task(
self.hass.config_entries.async_reload(self.config_entry.entry_id)
)
raise UpdateFailed(f"Connection to Pro unit lost: {err}") from err
except NodeProError as err:
raise UpdateFailed(f"Error while retrieving data: {err}") from err
return data

View File

@@ -8,7 +8,7 @@ from homeassistant.components.diagnostics import async_redact_data
from homeassistant.const import CONF_PASSWORD
from homeassistant.core import HomeAssistant
from .coordinator import AirVisualProConfigEntry
from . import AirVisualProConfigEntry
CONF_MAC_ADDRESS = "mac_address"
CONF_SERIAL_NUMBER = "serial_number"

View File

@@ -4,17 +4,19 @@ from __future__ import annotations
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity import EntityDescription
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from homeassistant.helpers.update_coordinator import (
CoordinatorEntity,
DataUpdateCoordinator,
)
from .const import DOMAIN
from .coordinator import AirVisualProCoordinator
class AirVisualProEntity(CoordinatorEntity[AirVisualProCoordinator]):
class AirVisualProEntity(CoordinatorEntity):
"""Define a generic AirVisual Pro entity."""
def __init__(
self, coordinator: AirVisualProCoordinator, description: EntityDescription
self, coordinator: DataUpdateCoordinator, description: EntityDescription
) -> None:
"""Initialize."""
super().__init__(coordinator)

View File

@@ -22,7 +22,7 @@ from homeassistant.const import (
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import AirVisualProConfigEntry
from . import AirVisualProConfigEntry
from .entity import AirVisualProEntity

View File

@@ -5,13 +5,12 @@ from __future__ import annotations
from datetime import timedelta
import logging
import aiohttp
from genie_partner_sdk.client import AladdinConnectClient
from genie_partner_sdk.model import GarageDoor
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
_LOGGER = logging.getLogger(__name__)
type AladdinConnectConfigEntry = ConfigEntry[dict[str, AladdinConnectCoordinator]]
@@ -41,10 +40,7 @@ class AladdinConnectCoordinator(DataUpdateCoordinator[GarageDoor]):
async def _async_update_data(self) -> GarageDoor:
"""Fetch data from the Aladdin Connect API."""
try:
await self.client.update_door(self.data.device_id, self.data.door_number)
except aiohttp.ClientError as err:
raise UpdateFailed(f"Error communicating with API: {err}") from err
await self.client.update_door(self.data.device_id, self.data.door_number)
self.data.status = self.client.get_door_status(
self.data.device_id, self.data.door_number
)

View File

@@ -4,19 +4,14 @@ from __future__ import annotations
from typing import Any
import aiohttp
from homeassistant.components.cover import CoverDeviceClass, CoverEntity
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN, SUPPORTED_FEATURES
from .const import SUPPORTED_FEATURES
from .coordinator import AladdinConnectConfigEntry, AladdinConnectCoordinator
from .entity import AladdinConnectEntity
PARALLEL_UPDATES = 1
async def async_setup_entry(
hass: HomeAssistant,
@@ -45,23 +40,11 @@ class AladdinCoverEntity(AladdinConnectEntity, CoverEntity):
async def async_open_cover(self, **kwargs: Any) -> None:
"""Issue open command to cover."""
try:
await self.client.open_door(self._device_id, self._number)
except aiohttp.ClientError as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="open_door_failed",
) from err
await self.client.open_door(self._device_id, self._number)
async def async_close_cover(self, **kwargs: Any) -> None:
"""Issue close command to cover."""
try:
await self.client.close_door(self._device_id, self._number)
except aiohttp.ClientError as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="close_door_failed",
) from err
await self.client.close_door(self._device_id, self._number)
@property
def is_closed(self) -> bool | None:

View File

@@ -1,32 +0,0 @@
"""Diagnostics support for Aladdin Connect."""
from __future__ import annotations
from typing import Any
from homeassistant.components.diagnostics import async_redact_data
from homeassistant.core import HomeAssistant
from .coordinator import AladdinConnectConfigEntry
TO_REDACT = {"access_token", "refresh_token"}
async def async_get_config_entry_diagnostics(
hass: HomeAssistant, config_entry: AladdinConnectConfigEntry
) -> dict[str, Any]:
"""Return diagnostics for a config entry."""
return {
"config_entry": async_redact_data(config_entry.as_dict(), TO_REDACT),
"doors": {
uid: {
"device_id": coordinator.data.device_id,
"door_number": coordinator.data.door_number,
"name": coordinator.data.name,
"status": coordinator.data.status,
"link_status": coordinator.data.link_status,
"battery_level": coordinator.data.battery_level,
}
for uid, coordinator in config_entry.runtime_data.items()
},
}

View File

@@ -26,26 +26,24 @@ rules:
unique-config-entry: done
# Silver
action-exceptions: done
action-exceptions: todo
config-entry-unloading: done
docs-configuration-parameters:
status: exempt
comment: Integration does not have an options flow.
docs-installation-parameters: done
entity-unavailable:
status: done
comment: Handled by the coordinator.
entity-unavailable: todo
integration-owner: done
log-when-unavailable:
status: done
comment: Handled by the coordinator.
parallel-updates: done
log-when-unavailable: todo
parallel-updates: todo
reauthentication-flow: done
test-coverage: done
test-coverage:
status: todo
comment: Platform tests for cover and sensor need to be implemented to reach 95% coverage.
# Gold
devices: done
diagnostics: done
diagnostics: todo
discovery: done
discovery-update-info:
status: exempt
@@ -66,7 +64,9 @@ rules:
icon-translations: todo
reconfiguration-flow: todo
repair-issues: todo
stale-devices: done
stale-devices:
status: todo
comment: We can automatically remove removed devices
# Platinum
async-dependency: todo

View File

@@ -20,8 +20,6 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import AladdinConnectConfigEntry, AladdinConnectCoordinator
from .entity import AladdinConnectEntity
PARALLEL_UPDATES = 0
@dataclass(frozen=True, kw_only=True)
class AladdinConnectSensorEntityDescription(SensorEntityDescription):

View File

@@ -32,13 +32,5 @@
"title": "[%key:common::config_flow::title::reauth%]"
}
}
},
"exceptions": {
"close_door_failed": {
"message": "Failed to close the garage door"
},
"open_door_failed": {
"message": "Failed to open the garage door"
}
}
}

View File

@@ -44,7 +44,7 @@ def make_entity_state_trigger_required_features(
class CustomTrigger(EntityStateTriggerRequiredFeatures):
"""Trigger for entity state changes."""
_domains = {domain}
_domain = domain
_to_states = {to_state}
_required_features = required_features

View File

@@ -13,6 +13,9 @@ from homeassistant.helpers import config_validation as cv, service
from .const import DOMAIN
SERVICE_ALARM_TOGGLE_CHIME = "alarm_toggle_chime"
SERVICE_ALARM_KEYPRESS = "alarm_keypress"
ATTR_KEYPRESS = "keypress"
@@ -23,7 +26,7 @@ def async_setup_services(hass: HomeAssistant) -> None:
service.async_register_platform_entity_service(
hass,
DOMAIN,
"alarm_toggle_chime",
SERVICE_ALARM_TOGGLE_CHIME,
entity_domain=ALARM_CONTROL_PANEL_DOMAIN,
schema={
vol.Required(ATTR_CODE): cv.string,
@@ -34,7 +37,7 @@ def async_setup_services(hass: HomeAssistant) -> None:
service.async_register_platform_entity_service(
hass,
DOMAIN,
"alarm_keypress",
SERVICE_ALARM_KEYPRESS,
entity_domain=ALARM_CONTROL_PANEL_DOMAIN,
schema={
vol.Required(ATTR_KEYPRESS): cv.string,
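
This hunk and the similar service hunks below all apply one refactor: hoist service-name string literals into module-level SERVICE_* constants so that registration and every later reference share a single definition. A minimal standalone sketch of why this helps (illustrative only, not tied to any one integration):

SERVICE_ALARM_KEYPRESS = "alarm_keypress"  # single source of truth for the name

def call_service(services: dict, name: str, payload: dict) -> None:
    """Dispatch to a registered handler; a typo'd literal raises KeyError here."""
    services[name](payload)

services = {SERVICE_ALARM_KEYPRESS: lambda data: None}
call_service(services, SERVICE_ALARM_KEYPRESS, {"keypress": "1"})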

View File

@@ -1,6 +1,6 @@
"""Defines a base Alexa Devices entity."""
from aioamazondevices.const.devices import SPEAKER_GROUP_DEVICE_TYPE
from aioamazondevices.const.devices import SPEAKER_GROUP_MODEL
from aioamazondevices.structures import AmazonDevice
from homeassistant.helpers.device_registry import DeviceInfo
@@ -25,20 +25,19 @@ class AmazonEntity(CoordinatorEntity[AmazonDevicesCoordinator]):
"""Initialize the entity."""
super().__init__(coordinator)
self._serial_num = serial_num
model = self.device.model
model_details = coordinator.api.get_model_details(self.device) or {}
model = model_details.get("model")
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, serial_num)},
name=self.device.account_name,
model=model,
model_id=self.device.device_type,
manufacturer=self.device.manufacturer or "Amazon",
hw_version=self.device.hardware_version,
manufacturer=model_details.get("manufacturer", "Amazon"),
hw_version=model_details.get("hw_version"),
sw_version=(
self.device.software_version
if model != SPEAKER_GROUP_DEVICE_TYPE
else None
self.device.software_version if model != SPEAKER_GROUP_MODEL else None
),
serial_number=serial_num if model != SPEAKER_GROUP_DEVICE_TYPE else None,
serial_number=serial_num if model != SPEAKER_GROUP_MODEL else None,
)
self.entity_description = description
self._attr_unique_id = f"{serial_num}-{description.key}"

View File

@@ -8,5 +8,5 @@
"iot_class": "cloud_polling",
"loggers": ["aioamazondevices"],
"quality_scale": "platinum",
"requirements": ["aioamazondevices==13.0.0"]
"requirements": ["aioamazondevices==12.0.0"]
}

View File

@@ -16,6 +16,9 @@ from .coordinator import AmazonConfigEntry
ATTR_TEXT_COMMAND = "text_command"
ATTR_SOUND = "sound"
ATTR_INFO_SKILL = "info_skill"
SERVICE_TEXT_COMMAND = "send_text_command"
SERVICE_SOUND_NOTIFICATION = "send_sound"
SERVICE_INFO_SKILL = "send_info_skill"
SCHEMA_SOUND_SERVICE = vol.Schema(
{
@@ -125,17 +128,17 @@ def async_setup_services(hass: HomeAssistant) -> None:
"""Set up the services for the Amazon Devices integration."""
for service_name, method, schema in (
(
"send_sound",
SERVICE_SOUND_NOTIFICATION,
async_send_sound_notification,
SCHEMA_SOUND_SERVICE,
),
(
"send_text_command",
SERVICE_TEXT_COMMAND,
async_send_text_command,
SCHEMA_CUSTOM_COMMAND,
),
(
"send_info_skill",
SERVICE_INFO_SKILL,
async_send_info_skill,
SCHEMA_INFO_SKILL,
),

View File

@@ -16,6 +16,8 @@ ATTRIBUTION = "Data provided by Amber Electric"
LOGGER = logging.getLogger(__package__)
PLATFORMS = [Platform.BINARY_SENSOR, Platform.SENSOR]
SERVICE_GET_FORECASTS = "get_forecasts"
GENERAL_CHANNEL = "general"
CONTROLLED_LOAD_CHANNEL = "controlled_load"
FEED_IN_CHANNEL = "feed_in"

View File

@@ -22,6 +22,7 @@ from .const import (
DOMAIN,
FEED_IN_CHANNEL,
GENERAL_CHANNEL,
SERVICE_GET_FORECASTS,
)
from .coordinator import AmberConfigEntry
from .helpers import format_cents_to_dollars, normalize_descriptor
@@ -100,7 +101,7 @@ def async_setup_services(hass: HomeAssistant) -> None:
hass.services.async_register(
DOMAIN,
"get_forecasts",
SERVICE_GET_FORECASTS,
handle_get_forecasts,
GET_FORECASTS_SCHEMA,
supports_response=SupportsResponse.ONLY,

View File

@@ -49,6 +49,18 @@ SCAN_INTERVAL = timedelta(seconds=15)
STREAM_SOURCE_LIST = ["snapshot", "mjpeg", "rtsp"]
_SRV_EN_REC = "enable_recording"
_SRV_DS_REC = "disable_recording"
_SRV_EN_AUD = "enable_audio"
_SRV_DS_AUD = "disable_audio"
_SRV_EN_MOT_REC = "enable_motion_recording"
_SRV_DS_MOT_REC = "disable_motion_recording"
_SRV_GOTO = "goto_preset"
_SRV_CBW = "set_color_bw"
_SRV_TOUR_ON = "start_tour"
_SRV_TOUR_OFF = "stop_tour"
_SRV_PTZ_CTRL = "ptz_control"
_ATTR_PTZ_TT = "travel_time"
_ATTR_PTZ_MOV = "movement"
_MOV = [
@@ -91,17 +103,17 @@ _SRV_PTZ_SCHEMA = _SRV_SCHEMA.extend(
)
CAMERA_SERVICES = {
"enable_recording": (_SRV_SCHEMA, "async_enable_recording", ()),
"disable_recording": (_SRV_SCHEMA, "async_disable_recording", ()),
"enable_audio": (_SRV_SCHEMA, "async_enable_audio", ()),
"disable_audio": (_SRV_SCHEMA, "async_disable_audio", ()),
"enable_motion_recording": (_SRV_SCHEMA, "async_enable_motion_recording", ()),
"disable_motion_recording": (_SRV_SCHEMA, "async_disable_motion_recording", ()),
"goto_preset": (_SRV_GOTO_SCHEMA, "async_goto_preset", (_ATTR_PRESET,)),
"set_color_bw": (_SRV_CBW_SCHEMA, "async_set_color_bw", (_ATTR_COLOR_BW,)),
"start_tour": (_SRV_SCHEMA, "async_start_tour", ()),
"stop_tour": (_SRV_SCHEMA, "async_stop_tour", ()),
"ptz_control": (
_SRV_EN_REC: (_SRV_SCHEMA, "async_enable_recording", ()),
_SRV_DS_REC: (_SRV_SCHEMA, "async_disable_recording", ()),
_SRV_EN_AUD: (_SRV_SCHEMA, "async_enable_audio", ()),
_SRV_DS_AUD: (_SRV_SCHEMA, "async_disable_audio", ()),
_SRV_EN_MOT_REC: (_SRV_SCHEMA, "async_enable_motion_recording", ()),
_SRV_DS_MOT_REC: (_SRV_SCHEMA, "async_disable_motion_recording", ()),
_SRV_GOTO: (_SRV_GOTO_SCHEMA, "async_goto_preset", (_ATTR_PRESET,)),
_SRV_CBW: (_SRV_CBW_SCHEMA, "async_set_color_bw", (_ATTR_COLOR_BW,)),
_SRV_TOUR_ON: (_SRV_SCHEMA, "async_start_tour", ()),
_SRV_TOUR_OFF: (_SRV_SCHEMA, "async_stop_tour", ()),
_SRV_PTZ_CTRL: (
_SRV_PTZ_SCHEMA,
"async_ptz_control",
(_ATTR_PTZ_MOV, _ATTR_PTZ_TT),

View File

@@ -36,7 +36,7 @@ from .const import (
SIGNAL_CONFIG_ENTITY,
)
from .entity import AndroidTVEntity, adb_decorator
from .services import ATTR_ADB_RESPONSE, ATTR_HDMI_INPUT
from .services import ATTR_ADB_RESPONSE, ATTR_HDMI_INPUT, SERVICE_LEARN_SENDEVENT
_LOGGER = logging.getLogger(__name__)
@@ -271,7 +271,7 @@ class ADBDevice(AndroidTVEntity, MediaPlayerEntity):
self.async_write_ha_state()
msg = (
f"Output from service 'learn_sendevent' from"
f"Output from service '{SERVICE_LEARN_SENDEVENT}' from"
f" {self.entity_id}: '{output}'"
)
persistent_notification.async_create(

View File

@@ -16,6 +16,11 @@ ATTR_DEVICE_PATH = "device_path"
ATTR_HDMI_INPUT = "hdmi_input"
ATTR_LOCAL_PATH = "local_path"
SERVICE_ADB_COMMAND = "adb_command"
SERVICE_DOWNLOAD = "download"
SERVICE_LEARN_SENDEVENT = "learn_sendevent"
SERVICE_UPLOAD = "upload"
@callback
def async_setup_services(hass: HomeAssistant) -> None:
@@ -24,7 +29,7 @@ def async_setup_services(hass: HomeAssistant) -> None:
service.async_register_platform_entity_service(
hass,
DOMAIN,
"adb_command",
SERVICE_ADB_COMMAND,
entity_domain=MEDIA_PLAYER_DOMAIN,
schema={vol.Required(ATTR_COMMAND): cv.string},
func="adb_command",
@@ -32,7 +37,7 @@ def async_setup_services(hass: HomeAssistant) -> None:
service.async_register_platform_entity_service(
hass,
DOMAIN,
"learn_sendevent",
SERVICE_LEARN_SENDEVENT,
entity_domain=MEDIA_PLAYER_DOMAIN,
schema=None,
func="learn_sendevent",
@@ -40,7 +45,7 @@ def async_setup_services(hass: HomeAssistant) -> None:
service.async_register_platform_entity_service(
hass,
DOMAIN,
"download",
SERVICE_DOWNLOAD,
entity_domain=MEDIA_PLAYER_DOMAIN,
schema={
vol.Required(ATTR_DEVICE_PATH): cv.string,
@@ -51,7 +56,7 @@ def async_setup_services(hass: HomeAssistant) -> None:
service.async_register_platform_entity_service(
hass,
DOMAIN,
"upload",
SERVICE_UPLOAD,
entity_domain=MEDIA_PLAYER_DOMAIN,
schema={
vol.Required(ATTR_DEVICE_PATH): cv.string,

View File

@@ -46,7 +46,6 @@ class AnthropicTaskEntity(
ai_task.AITaskEntityFeature.GENERATE_DATA
| ai_task.AITaskEntityFeature.SUPPORT_ATTACHMENTS
)
_attr_translation_key = "ai_task_data"
async def _async_generate_data(
self,

View File

@@ -43,9 +43,7 @@ from homeassistant.helpers.selector import (
from homeassistant.helpers.typing import VolDictType
from .const import (
CODE_EXECUTION_UNSUPPORTED_MODELS,
CONF_CHAT_MODEL,
CONF_CODE_EXECUTION,
CONF_MAX_TOKENS,
CONF_PROMPT,
CONF_RECOMMENDED,
@@ -417,16 +415,6 @@ class ConversationSubentryFlowHandler(ConfigSubentryFlow):
else:
self.options.pop(CONF_THINKING_EFFORT, None)
if not model.startswith(tuple(CODE_EXECUTION_UNSUPPORTED_MODELS)):
step_schema[
vol.Optional(
CONF_CODE_EXECUTION,
default=DEFAULT[CONF_CODE_EXECUTION],
)
] = bool
else:
self.options.pop(CONF_CODE_EXECUTION, None)
if not model.startswith(tuple(WEB_SEARCH_UNSUPPORTED_MODELS)):
step_schema.update(
{

View File

@@ -11,7 +11,6 @@ DEFAULT_AI_TASK_NAME = "Claude AI Task"
CONF_RECOMMENDED = "recommended"
CONF_PROMPT = "prompt"
CONF_CHAT_MODEL = "chat_model"
CONF_CODE_EXECUTION = "code_execution"
CONF_MAX_TOKENS = "max_tokens"
CONF_TEMPERATURE = "temperature"
CONF_THINKING_BUDGET = "thinking_budget"
@@ -26,7 +25,6 @@ CONF_WEB_SEARCH_TIMEZONE = "timezone"
DEFAULT = {
CONF_CHAT_MODEL: "claude-haiku-4-5",
CONF_CODE_EXECUTION: False,
CONF_MAX_TOKENS: 3000,
CONF_TEMPERATURE: 1.0,
CONF_THINKING_BUDGET: 0,
@@ -67,10 +65,6 @@ WEB_SEARCH_UNSUPPORTED_MODELS = [
"claude-3-haiku",
]
CODE_EXECUTION_UNSUPPORTED_MODELS = [
"claude-3-haiku",
]
DEPRECATED_MODELS = [
"claude-3",
]

View File

@@ -37,7 +37,6 @@ class AnthropicConversationEntity(
"""Anthropic conversation agent."""
_attr_supports_streaming = True
_attr_translation_key = "conversation"
def __init__(self, entry: AnthropicConfigEntry, subentry: ConfigSubentry) -> None:
"""Initialize the agent."""

View File

@@ -3,23 +3,19 @@
import base64
from collections.abc import AsyncGenerator, Callable, Iterable
from dataclasses import dataclass, field
from datetime import UTC, datetime
import json
from mimetypes import guess_file_type
from pathlib import Path
from typing import Any, Literal, cast
from typing import Any
import anthropic
from anthropic import AsyncStream
from anthropic.types import (
Base64ImageSourceParam,
Base64PDFSourceParam,
BashCodeExecutionToolResultBlock,
CitationsDelta,
CitationsWebSearchResultLocation,
CitationWebSearchResultLocationParam,
CodeExecutionTool20250825Param,
Container,
ContentBlockParam,
DocumentBlockParam,
ImageBlockParam,
@@ -45,7 +41,6 @@ from anthropic.types import (
TextCitation,
TextCitationParam,
TextDelta,
TextEditorCodeExecutionToolResultBlock,
ThinkingBlock,
ThinkingBlockParam,
ThinkingConfigAdaptiveParam,
@@ -56,21 +51,18 @@ from anthropic.types import (
ToolChoiceAutoParam,
ToolChoiceToolParam,
ToolParam,
ToolResultBlockParam,
ToolUnionParam,
ToolUseBlock,
ToolUseBlockParam,
Usage,
WebSearchTool20250305Param,
WebSearchToolRequestErrorParam,
WebSearchToolResultBlock,
WebSearchToolResultBlockParamContentParam,
)
from anthropic.types.bash_code_execution_tool_result_block_param import (
Content as BashCodeExecutionToolResultContentParam,
WebSearchToolResultBlockParam,
WebSearchToolResultError,
)
from anthropic.types.message_create_params import MessageCreateParamsStreaming
from anthropic.types.text_editor_code_execution_tool_result_block_param import (
Content as TextEditorCodeExecutionToolResultContentParam,
)
import voluptuous as vol
from voluptuous_openapi import convert
@@ -82,12 +74,10 @@ from homeassistant.helpers import device_registry as dr, llm
from homeassistant.helpers.entity import Entity
from homeassistant.helpers.json import json_dumps
from homeassistant.util import slugify
from homeassistant.util.json import JsonObjectType
from . import AnthropicConfigEntry
from .const import (
CONF_CHAT_MODEL,
CONF_CODE_EXECUTION,
CONF_MAX_TOKENS,
CONF_TEMPERATURE,
CONF_THINKING_BUDGET,
@@ -144,7 +134,6 @@ class ContentDetails:
citation_details: list[CitationDetails] = field(default_factory=list)
thinking_signature: str | None = None
redacted_thinking: str | None = None
container: Container | None = None
def has_content(self) -> bool:
"""Check if there is any text content."""
@@ -155,7 +144,6 @@ class ContentDetails:
return (
self.thinking_signature is not None
or self.redacted_thinking is not None
or self.container is not None
or self.has_citations()
)
@@ -200,53 +188,30 @@ class ContentDetails:
def _convert_content(
chat_content: Iterable[conversation.Content],
) -> tuple[list[MessageParam], str | None]:
) -> list[MessageParam]:
"""Transform HA chat_log content into Anthropic API format."""
messages: list[MessageParam] = []
container_id: str | None = None
for content in chat_content:
if isinstance(content, conversation.ToolResultContent):
external_tool = True
if content.tool_name == "web_search":
tool_result_block: ContentBlockParam = {
"type": "web_search_tool_result",
"tool_use_id": content.tool_call_id,
"content": cast(
WebSearchToolResultBlockParamContentParam,
content.tool_result["content"]
if "content" in content.tool_result
else {
"type": "web_search_tool_result_error",
"error_code": content.tool_result.get(
"error_code", "unavailable"
),
},
tool_result_block: ContentBlockParam = WebSearchToolResultBlockParam(
type="web_search_tool_result",
tool_use_id=content.tool_call_id,
content=content.tool_result["content"]
if "content" in content.tool_result
else WebSearchToolRequestErrorParam(
type="web_search_tool_result_error",
error_code=content.tool_result.get("error_code", "unavailable"), # type: ignore[typeddict-item]
),
}
elif content.tool_name == "bash_code_execution":
tool_result_block = {
"type": "bash_code_execution_tool_result",
"tool_use_id": content.tool_call_id,
"content": cast(
BashCodeExecutionToolResultContentParam, content.tool_result
),
}
elif content.tool_name == "text_editor_code_execution":
tool_result_block = {
"type": "text_editor_code_execution_tool_result",
"tool_use_id": content.tool_call_id,
"content": cast(
TextEditorCodeExecutionToolResultContentParam,
content.tool_result,
),
}
)
external_tool = True
else:
tool_result_block = {
"type": "tool_result",
"tool_use_id": content.tool_call_id,
"content": json_dumps(content.tool_result),
}
tool_result_block = ToolResultBlockParam(
type="tool_result",
tool_use_id=content.tool_call_id,
content=json_dumps(content.tool_result),
)
external_tool = False
if not messages or messages[-1]["role"] != (
"assistant" if external_tool else "user"
@@ -312,11 +277,6 @@ def _convert_content(
data=content.native.redacted_thinking,
)
)
if (
content.native.container is not None
and content.native.container.expires_at > datetime.now(UTC)
):
container_id = content.native.container.id
if content.content:
current_index = 0
@@ -365,23 +325,10 @@ def _convert_content(
ServerToolUseBlockParam(
type="server_tool_use",
id=tool_call.id,
name=cast(
Literal[
"web_search",
"bash_code_execution",
"text_editor_code_execution",
],
tool_call.tool_name,
),
name="web_search",
input=tool_call.tool_args,
)
if tool_call.external
and tool_call.tool_name
in [
"web_search",
"bash_code_execution",
"text_editor_code_execution",
]
if tool_call.external and tool_call.tool_name == "web_search"
else ToolUseBlockParam(
type="tool_use",
id=tool_call.id,
@@ -400,10 +347,10 @@ def _convert_content(
# If there is only one text block, simplify the content to a string
messages[-1]["content"] = messages[-1]["content"][0]["text"]
else:
# Note: We don't pass SystemContent here as it's passed to the API as the prompt
raise HomeAssistantError("Unexpected content type in chat log")
# Note: We don't pass SystemContent here as it's passed to the API as the prompt
raise TypeError(f"Unexpected content type: {type(content)}")
return messages, container_id
return messages
async def _transform_stream( # noqa: C901 - This is complex, but better to have it in one place
@@ -442,8 +389,8 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
Each message could contain multiple blocks of the same type.
"""
if stream is None or not hasattr(stream, "__aiter__"):
raise HomeAssistantError("Expected a stream of messages")
if stream is None:
raise TypeError("Expected a stream of messages")
current_tool_block: ToolUseBlockParam | ServerToolUseBlockParam | None = None
current_tool_args: str
@@ -456,6 +403,8 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
LOGGER.debug("Received response: %s", response)
if isinstance(response, RawMessageStartEvent):
if response.message.role != "assistant":
raise ValueError("Unexpected message role")
input_usage = response.message.usage
first_block = True
elif isinstance(response, RawContentBlockStartEvent):
@@ -529,14 +478,7 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
input={},
)
current_tool_args = ""
elif isinstance(
response.content_block,
(
WebSearchToolResultBlock,
BashCodeExecutionToolResultBlock,
TextEditorCodeExecutionToolResultBlock,
),
):
elif isinstance(response.content_block, WebSearchToolResultBlock):
if content_details:
content_details.delete_empty()
yield {"native": content_details}
@@ -545,16 +487,26 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
yield {
"role": "tool_result",
"tool_call_id": response.content_block.tool_use_id,
"tool_name": response.content_block.type.removesuffix(
"_tool_result"
),
"tool_name": "web_search",
"tool_result": {
"content": cast(
JsonObjectType, response.content_block.to_dict()["content"]
)
"type": "web_search_tool_result_error",
"error_code": response.content_block.content.error_code,
}
if isinstance(response.content_block.content, list)
else cast(JsonObjectType, response.content_block.content.to_dict()),
if isinstance(
response.content_block.content, WebSearchToolResultError
)
else {
"content": [
{
"type": "web_search_result",
"encrypted_content": block.encrypted_content,
"page_age": block.page_age,
"title": block.title,
"url": block.url,
}
for block in response.content_block.content
]
},
}
first_block = True
elif isinstance(response, RawContentBlockDeltaEvent):
@@ -603,7 +555,6 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
elif isinstance(response, RawMessageDeltaEvent):
if (usage := response.usage) is not None:
chat_log.async_trace(_create_token_stats(input_usage, usage))
content_details.container = response.delta.container
if response.delta.stop_reason == "refusal":
raise HomeAssistantError("Potential policy violation detected")
elif isinstance(response, RawMessageStopEvent):
@@ -664,7 +615,7 @@ class AnthropicBaseLLMEntity(Entity):
system = chat_log.content[0]
if not isinstance(system, conversation.SystemContent):
raise HomeAssistantError("First message must be a system message")
raise TypeError("First message must be a system message")
# System prompt with caching enabled
system_prompt: list[TextBlockParam] = [
@@ -675,7 +626,7 @@ class AnthropicBaseLLMEntity(Entity):
)
]
messages, container_id = _convert_content(chat_log.content[1:])
messages = _convert_content(chat_log.content[1:])
model = options.get(CONF_CHAT_MODEL, DEFAULT[CONF_CHAT_MODEL])
@@ -685,7 +636,6 @@ class AnthropicBaseLLMEntity(Entity):
max_tokens=options.get(CONF_MAX_TOKENS, DEFAULT[CONF_MAX_TOKENS]),
system=system_prompt,
stream=True,
container=container_id,
)
if not model.startswith(tuple(NON_ADAPTIVE_THINKING_MODELS)):
@@ -724,14 +674,6 @@ class AnthropicBaseLLMEntity(Entity):
for tool in chat_log.llm_api.tools
]
if options.get(CONF_CODE_EXECUTION):
tools.append(
CodeExecutionTool20250825Param(
name="code_execution",
type="code_execution_20250825",
),
)
if options.get(CONF_WEB_SEARCH):
web_search = WebSearchTool20250305Param(
name="web_search",
@@ -842,25 +784,21 @@ class AnthropicBaseLLMEntity(Entity):
try:
stream = await client.messages.create(**model_args)
new_messages, model_args["container"] = _convert_content(
[
content
async for content in chat_log.async_add_delta_content_stream(
self.entity_id,
_transform_stream(
chat_log,
stream,
output_tool=structure_name or None,
),
)
]
messages.extend(
_convert_content(
[
content
async for content in chat_log.async_add_delta_content_stream(
self.entity_id,
_transform_stream(
chat_log,
stream,
output_tool=structure_name or None,
),
)
]
)
)
messages.extend(new_messages)
except anthropic.AuthenticationError as err:
self.entry.async_start_reauth(self.hass)
raise HomeAssistantError(
"Authentication error with Anthropic API, reauthentication required"
) from err
except anthropic.AnthropicError as err:
raise HomeAssistantError(
f"Sorry, I had a problem talking to Anthropic: {err}"

View File

@@ -1,14 +0,0 @@
{
"entity": {
"ai_task": {
"ai_task_data": {
"default": "mdi:asterisk"
}
},
"conversation": {
"conversation": {
"default": "mdi:asterisk"
}
}
}
}

View File

@@ -1,6 +1,6 @@
{
"domain": "anthropic",
"name": "Anthropic",
"name": "Anthropic Conversation",
"after_dependencies": ["assist_pipeline", "intent"],
"codeowners": ["@Shulyaka"],
"config_flow": true,
@@ -8,6 +8,5 @@
"documentation": "https://www.home-assistant.io/integrations/anthropic",
"integration_type": "service",
"iot_class": "cloud_polling",
"quality_scale": "bronze",
"requirements": ["anthropic==0.83.0"]
}

View File

@@ -31,7 +31,10 @@ rules:
test-before-setup: done
unique-config-entry: done
# Silver
action-exceptions: done
action-exceptions:
status: todo
comment: |
Reevaluate exceptions for entity services.
config-entry-unloading: done
docs-configuration-parameters: done
docs-installation-parameters: done
@@ -89,7 +92,7 @@ rules:
No entities disabled by default.
entity-translations: todo
exception-translations: todo
icon-translations: done
icon-translations: todo
reconfiguration-flow: done
repair-issues: done
stale-devices:

View File

@@ -69,7 +69,6 @@
},
"model": {
"data": {
"code_execution": "[%key:component::anthropic::config_subentries::conversation::step::model::data::code_execution%]",
"thinking_budget": "[%key:component::anthropic::config_subentries::conversation::step::model::data::thinking_budget%]",
"thinking_effort": "[%key:component::anthropic::config_subentries::conversation::step::model::data::thinking_effort%]",
"user_location": "[%key:component::anthropic::config_subentries::conversation::step::model::data::user_location%]",
@@ -77,7 +76,6 @@
"web_search_max_uses": "[%key:component::anthropic::config_subentries::conversation::step::model::data::web_search_max_uses%]"
},
"data_description": {
"code_execution": "[%key:component::anthropic::config_subentries::conversation::step::model::data_description::code_execution%]",
"thinking_budget": "[%key:component::anthropic::config_subentries::conversation::step::model::data_description::thinking_budget%]",
"thinking_effort": "[%key:component::anthropic::config_subentries::conversation::step::model::data_description::thinking_effort%]",
"user_location": "[%key:component::anthropic::config_subentries::conversation::step::model::data_description::user_location%]",
@@ -129,7 +127,6 @@
},
"model": {
"data": {
"code_execution": "Code execution",
"thinking_budget": "Thinking budget",
"thinking_effort": "Thinking effort",
"user_location": "Include home location",
@@ -137,7 +134,6 @@
"web_search_max_uses": "Maximum web searches"
},
"data_description": {
"code_execution": "Allow the model to execute code in a secure sandbox environment, enabling it to analyze data and perform complex calculations.",
"thinking_budget": "The number of tokens the model can use to think about the response out of the total maximum number of tokens. Set to 1024 or greater to enable extended thinking.",
"thinking_effort": "Control how many tokens Claude uses when responding, trading off between response thoroughness and token efficiency",
"user_location": "Localize search results based on home location",

View File

@@ -117,7 +117,6 @@ class SharpAquosTVDevice(MediaPlayerEntity):
| MediaPlayerEntityFeature.VOLUME_SET
| MediaPlayerEntityFeature.PLAY
)
_attr_volume_step = 2 / 60
def __init__(
self, name: str, remote: sharp_aquos_rc.TV, power_on_enabled: bool = False
@@ -162,6 +161,22 @@ class SharpAquosTVDevice(MediaPlayerEntity):
"""Turn off tvplayer."""
self._remote.power(0)
@_retry
def volume_up(self) -> None:
"""Volume up the media player."""
if self.volume_level is None:
_LOGGER.debug("Unknown volume in volume_up")
return
self._remote.volume(int(self.volume_level * 60) + 2)
@_retry
def volume_down(self) -> None:
"""Volume down media player."""
if self.volume_level is None:
_LOGGER.debug("Unknown volume in volume_down")
return
self._remote.volume(int(self.volume_level * 60) - 2)
@_retry
def set_volume_level(self, volume: float) -> None:
"""Set Volume media player."""

View File

@@ -30,5 +30,5 @@
"integration_type": "hub",
"iot_class": "cloud_push",
"loggers": ["pubnub", "yalexs"],
"requirements": ["yalexs==9.2.0", "yalexs-ble==3.2.7"]
"requirements": ["yalexs==9.2.0", "yalexs-ble==3.2.4"]
}

View File

@@ -61,13 +61,7 @@ class AuroraAbbDataUpdateCoordinator(DataUpdateCoordinator[dict[str, float]]):
frequency = self.client.measure(4)
i_leak_dcdc = self.client.measure(6)
i_leak_inverter = self.client.measure(7)
power_in_1 = self.client.measure(8)
power_in_2 = self.client.measure(9)
temperature_c = self.client.measure(21)
voltage_in_1 = self.client.measure(23)
current_in_1 = self.client.measure(25)
voltage_in_2 = self.client.measure(26)
current_in_2 = self.client.measure(27)
r_iso = self.client.measure(30)
energy_wh = self.client.cumulated_energy(5)
[alarm, *_] = self.client.alarms()
@@ -93,13 +87,7 @@ class AuroraAbbDataUpdateCoordinator(DataUpdateCoordinator[dict[str, float]]):
data["grid_frequency"] = round(frequency, 1)
data["i_leak_dcdc"] = i_leak_dcdc
data["i_leak_inverter"] = i_leak_inverter
data["power_in_1"] = round(power_in_1, 1)
data["power_in_2"] = round(power_in_2, 1)
data["temp"] = round(temperature_c, 1)
data["voltage_in_1"] = round(voltage_in_1, 1)
data["current_in_1"] = round(current_in_1, 1)
data["voltage_in_2"] = round(voltage_in_2, 1)
data["current_in_2"] = round(current_in_2, 1)
data["r_iso"] = r_iso
data["totalenergy"] = round(energy_wh / 1000, 2)
data["alarm"] = alarm

View File

@@ -68,7 +68,6 @@ SENSOR_TYPES = [
entity_category=EntityCategory.DIAGNOSTIC,
native_unit_of_measurement=UnitOfFrequency.HERTZ,
state_class=SensorStateClass.MEASUREMENT,
translation_key="grid_frequency",
entity_registry_enabled_default=False,
),
SensorEntityDescription(
@@ -89,60 +88,6 @@ SENSOR_TYPES = [
translation_key="i_leak_inverter",
entity_registry_enabled_default=False,
),
SensorEntityDescription(
key="power_in_1",
device_class=SensorDeviceClass.POWER,
entity_category=EntityCategory.DIAGNOSTIC,
native_unit_of_measurement=UnitOfPower.WATT,
state_class=SensorStateClass.MEASUREMENT,
translation_key="power_in_1",
entity_registry_enabled_default=False,
),
SensorEntityDescription(
key="power_in_2",
device_class=SensorDeviceClass.POWER,
entity_category=EntityCategory.DIAGNOSTIC,
native_unit_of_measurement=UnitOfPower.WATT,
state_class=SensorStateClass.MEASUREMENT,
translation_key="power_in_2",
entity_registry_enabled_default=False,
),
SensorEntityDescription(
key="voltage_in_1",
device_class=SensorDeviceClass.VOLTAGE,
entity_category=EntityCategory.DIAGNOSTIC,
native_unit_of_measurement=UnitOfElectricPotential.VOLT,
state_class=SensorStateClass.MEASUREMENT,
translation_key="voltage_in_1",
entity_registry_enabled_default=False,
),
SensorEntityDescription(
key="current_in_1",
device_class=SensorDeviceClass.CURRENT,
entity_category=EntityCategory.DIAGNOSTIC,
native_unit_of_measurement=UnitOfElectricCurrent.AMPERE,
state_class=SensorStateClass.MEASUREMENT,
translation_key="current_in_1",
entity_registry_enabled_default=False,
),
SensorEntityDescription(
key="voltage_in_2",
device_class=SensorDeviceClass.VOLTAGE,
entity_category=EntityCategory.DIAGNOSTIC,
native_unit_of_measurement=UnitOfElectricPotential.VOLT,
state_class=SensorStateClass.MEASUREMENT,
translation_key="voltage_in_2",
entity_registry_enabled_default=False,
),
SensorEntityDescription(
key="current_in_2",
device_class=SensorDeviceClass.CURRENT,
entity_category=EntityCategory.DIAGNOSTIC,
native_unit_of_measurement=UnitOfElectricCurrent.AMPERE,
state_class=SensorStateClass.MEASUREMENT,
translation_key="current_in_2",
entity_registry_enabled_default=False,
),
SensorEntityDescription(
key="alarm",
device_class=SensorDeviceClass.ENUM,

View File

@@ -24,18 +24,9 @@
"alarm": {
"name": "Alarm status"
},
"current_in_1": {
"name": "String 1 current"
},
"current_in_2": {
"name": "String 2 current"
},
"grid_current": {
"name": "Grid current"
},
"grid_frequency": {
"name": "Grid frequency"
},
"grid_voltage": {
"name": "Grid voltage"
},
@@ -45,12 +36,6 @@
"i_leak_inverter": {
"name": "Inverter leak current"
},
"power_in_1": {
"name": "String 1 power"
},
"power_in_2": {
"name": "String 2 power"
},
"power_output": {
"name": "Power output"
},
@@ -59,12 +44,6 @@
},
"total_energy": {
"name": "Total energy"
},
"voltage_in_1": {
"name": "String 1 voltage"
},
"voltage_in_2": {
"name": "String 2 voltage"
}
}
}

View File

@@ -148,11 +148,8 @@ _EXPERIMENTAL_TRIGGER_PLATFORMS = {
"light",
"lock",
"media_player",
"number",
"person",
"remote",
"scene",
"schedule",
"siren",
"switch",
"text",

View File

@@ -19,7 +19,7 @@ from homeassistant.components.backup import (
from homeassistant.core import HomeAssistant, callback
from . import S3ConfigEntry
from .const import CONF_BUCKET, CONF_PREFIX, DATA_BACKUP_AGENT_LISTENERS, DOMAIN
from .const import CONF_BUCKET, DATA_BACKUP_AGENT_LISTENERS, DOMAIN
from .helpers import async_list_backups_from_s3
_LOGGER = logging.getLogger(__name__)
@@ -100,13 +100,6 @@ class S3BackupAgent(BackupAgent):
self.unique_id = entry.entry_id
self._backup_cache: dict[str, AgentBackup] = {}
self._cache_expiration = time()
self._prefix: str = entry.data.get(CONF_PREFIX, "")
def _with_prefix(self, key: str) -> str:
"""Add prefix to a key if configured."""
if not self._prefix:
return key
return f"{self._prefix}/{key}"
@handle_boto_errors
async def async_download_backup(
@@ -122,9 +115,7 @@ class S3BackupAgent(BackupAgent):
backup = await self._find_backup_by_id(backup_id)
tar_filename, _ = suggested_filenames(backup)
response = await self._client.get_object(
Bucket=self._bucket, Key=self._with_prefix(tar_filename)
)
response = await self._client.get_object(Bucket=self._bucket, Key=tar_filename)
return response["Body"].iter_chunks()
async def async_upload_backup(
@@ -151,7 +142,7 @@ class S3BackupAgent(BackupAgent):
metadata_content = json.dumps(backup.as_dict())
await self._client.put_object(
Bucket=self._bucket,
Key=self._with_prefix(metadata_filename),
Key=metadata_filename,
Body=metadata_content,
)
except BotoCoreError as err:
@@ -178,7 +169,7 @@ class S3BackupAgent(BackupAgent):
await self._client.put_object(
Bucket=self._bucket,
Key=self._with_prefix(tar_filename),
Key=tar_filename,
Body=bytes(file_data),
)
@@ -195,7 +186,7 @@ class S3BackupAgent(BackupAgent):
_LOGGER.debug("Starting multipart upload for %s", tar_filename)
multipart_upload = await self._client.create_multipart_upload(
Bucket=self._bucket,
Key=self._with_prefix(tar_filename),
Key=tar_filename,
)
upload_id = multipart_upload["UploadId"]
try:
@@ -225,7 +216,7 @@ class S3BackupAgent(BackupAgent):
)
part = await cast(Any, self._client).upload_part(
Bucket=self._bucket,
Key=self._with_prefix(tar_filename),
Key=tar_filename,
PartNumber=part_number,
UploadId=upload_id,
Body=part_data.tobytes(),
@@ -253,7 +244,7 @@ class S3BackupAgent(BackupAgent):
)
part = await cast(Any, self._client).upload_part(
Bucket=self._bucket,
Key=self._with_prefix(tar_filename),
Key=tar_filename,
PartNumber=part_number,
UploadId=upload_id,
Body=remaining_data.tobytes(),
@@ -262,7 +253,7 @@ class S3BackupAgent(BackupAgent):
await cast(Any, self._client).complete_multipart_upload(
Bucket=self._bucket,
Key=self._with_prefix(tar_filename),
Key=tar_filename,
UploadId=upload_id,
MultipartUpload={"Parts": parts},
)
@@ -271,7 +262,7 @@ class S3BackupAgent(BackupAgent):
try:
await self._client.abort_multipart_upload(
Bucket=self._bucket,
Key=self._with_prefix(tar_filename),
Key=tar_filename,
UploadId=upload_id,
)
except BotoCoreError:
@@ -292,12 +283,8 @@ class S3BackupAgent(BackupAgent):
tar_filename, metadata_filename = suggested_filenames(backup)
# Delete both the backup file and its metadata file
await self._client.delete_object(
Bucket=self._bucket, Key=self._with_prefix(tar_filename)
)
await self._client.delete_object(
Bucket=self._bucket, Key=self._with_prefix(metadata_filename)
)
await self._client.delete_object(Bucket=self._bucket, Key=tar_filename)
await self._client.delete_object(Bucket=self._bucket, Key=metadata_filename)
# Reset cache after successful deletion
self._cache_expiration = time()
@@ -330,9 +317,7 @@ class S3BackupAgent(BackupAgent):
if time() <= self._cache_expiration:
return self._backup_cache
backups_list = await async_list_backups_from_s3(
self._client, self._bucket, self._prefix
)
backups_list = await async_list_backups_from_s3(self._client, self._bucket)
self._backup_cache = {b.backup_id: b for b in backups_list}
self._cache_expiration = time() + CACHE_TTL

View File

@@ -22,7 +22,6 @@ from .const import (
CONF_ACCESS_KEY_ID,
CONF_BUCKET,
CONF_ENDPOINT_URL,
CONF_PREFIX,
CONF_SECRET_ACCESS_KEY,
DEFAULT_ENDPOINT_URL,
DESCRIPTION_AWS_S3_DOCS_URL,
@@ -40,7 +39,6 @@ STEP_USER_DATA_SCHEMA = vol.Schema(
vol.Required(CONF_ENDPOINT_URL, default=DEFAULT_ENDPOINT_URL): TextSelector(
config=TextSelectorConfig(type=TextSelectorType.URL)
),
vol.Optional(CONF_PREFIX, default=""): cv.string,
}
)
@@ -55,20 +53,16 @@ class S3ConfigFlow(ConfigFlow, domain=DOMAIN):
errors: dict[str, str] = {}
if user_input is not None:
normalized_prefix = user_input.get(CONF_PREFIX, "").strip("/")
# Check for existing entries, treating missing prefix as empty
for entry in self._async_current_entries(include_ignore=False):
entry_prefix = (entry.data.get(CONF_PREFIX) or "").strip("/")
if (
entry.data.get(CONF_BUCKET) == user_input[CONF_BUCKET]
and entry.data.get(CONF_ENDPOINT_URL)
== user_input[CONF_ENDPOINT_URL]
and entry_prefix == normalized_prefix
):
return self.async_abort(reason="already_configured")
self._async_abort_entries_match(
{
CONF_BUCKET: user_input[CONF_BUCKET],
CONF_ENDPOINT_URL: user_input[CONF_ENDPOINT_URL],
}
)
hostname = urlparse(user_input[CONF_ENDPOINT_URL]).hostname
if not hostname or not hostname.endswith(AWS_DOMAIN):
if not urlparse(user_input[CONF_ENDPOINT_URL]).hostname.endswith(
AWS_DOMAIN
):
errors[CONF_ENDPOINT_URL] = "invalid_endpoint_url"
else:
try:
@@ -90,18 +84,9 @@ class S3ConfigFlow(ConfigFlow, domain=DOMAIN):
except ConnectionError:
errors[CONF_ENDPOINT_URL] = "cannot_connect"
else:
data = dict(user_input)
if not normalized_prefix:
# Do not persist empty optional values
data.pop(CONF_PREFIX, None)
else:
data[CONF_PREFIX] = normalized_prefix
title = user_input[CONF_BUCKET]
if normalized_prefix:
title = f"{title} - {normalized_prefix}"
return self.async_create_entry(title=title, data=data)
return self.async_create_entry(
title=user_input[CONF_BUCKET], data=user_input
)
return self.async_show_form(
step_id="user",

View File

@@ -11,7 +11,6 @@ CONF_ACCESS_KEY_ID = "access_key_id"
CONF_SECRET_ACCESS_KEY = "secret_access_key"
CONF_ENDPOINT_URL = "endpoint_url"
CONF_BUCKET = "bucket"
CONF_PREFIX = "prefix"
AWS_DOMAIN = "amazonaws.com"
DEFAULT_ENDPOINT_URL = f"https://s3.eu-central-1.{AWS_DOMAIN}/"

View File

@@ -13,7 +13,7 @@ from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import CONF_BUCKET, CONF_PREFIX, DOMAIN
from .const import CONF_BUCKET, DOMAIN
from .helpers import async_list_backups_from_s3
SCAN_INTERVAL = timedelta(hours=6)
@@ -53,14 +53,11 @@ class S3DataUpdateCoordinator(DataUpdateCoordinator[SensorData]):
)
self.client = client
self._bucket: str = entry.data[CONF_BUCKET]
self._prefix: str = entry.data.get(CONF_PREFIX, "")
async def _async_update_data(self) -> SensorData:
"""Fetch data from AWS S3."""
try:
backups = await async_list_backups_from_s3(
self.client, self._bucket, self._prefix
)
backups = await async_list_backups_from_s3(self.client, self._bucket)
except BotoCoreError as error:
raise UpdateFailed(
translation_domain=DOMAIN,

View File

@@ -1,55 +0,0 @@
"""Diagnostics support for AWS S3."""
from __future__ import annotations
import dataclasses
from typing import Any
from homeassistant.components.backup import (
DATA_MANAGER as BACKUP_DATA_MANAGER,
BackupManager,
)
from homeassistant.components.diagnostics import async_redact_data
from homeassistant.core import HomeAssistant
from .const import (
CONF_ACCESS_KEY_ID,
CONF_BUCKET,
CONF_PREFIX,
CONF_SECRET_ACCESS_KEY,
DOMAIN,
)
from .coordinator import S3ConfigEntry
from .helpers import async_list_backups_from_s3
TO_REDACT = (CONF_ACCESS_KEY_ID, CONF_SECRET_ACCESS_KEY)
async def async_get_config_entry_diagnostics(
hass: HomeAssistant,
entry: S3ConfigEntry,
) -> dict[str, Any]:
"""Return diagnostics for a config entry."""
coordinator = entry.runtime_data
backup_manager: BackupManager = hass.data[BACKUP_DATA_MANAGER]
backups = await async_list_backups_from_s3(
coordinator.client,
bucket=entry.data[CONF_BUCKET],
prefix=entry.data.get(CONF_PREFIX, ""),
)
data = {
"coordinator_data": dataclasses.asdict(coordinator.data),
"config": {
**entry.data,
**entry.options,
},
"backup_agents": [
{"name": agent.name}
for agent in backup_manager.backup_agents.values()
if agent.domain == DOMAIN
],
"backup": [backup.as_dict() for backup in backups],
}
return async_redact_data(data, TO_REDACT)

View File

@@ -17,17 +17,11 @@ _LOGGER = logging.getLogger(__name__)
async def async_list_backups_from_s3(
client: S3Client,
bucket: str,
prefix: str,
) -> list[AgentBackup]:
"""List backups from an S3 bucket by reading metadata files."""
paginator = client.get_paginator("list_objects_v2")
metadata_files: list[dict[str, Any]] = []
list_kwargs: dict[str, Any] = {"Bucket": bucket}
if prefix:
list_kwargs["Prefix"] = prefix + "/"
async for page in paginator.paginate(**list_kwargs):
async for page in paginator.paginate(Bucket=bucket):
metadata_files.extend(
obj
for obj in page.get("Contents", [])

View File

@@ -23,9 +23,7 @@ rules:
runtime-data: done
test-before-configure: done
test-before-setup: done
unique-config-entry:
status: exempt
comment: Hassfest does not recognize the duplicate prevention logic. Duplicate entries are prevented by checking bucket, endpoint URL, and prefix in the config flow.
unique-config-entry: done
# Silver
action-exceptions:
@@ -38,14 +36,14 @@ rules:
docs-installation-parameters: done
entity-unavailable: done
integration-owner: done
log-when-unavailable: done
log-when-unavailable: todo
parallel-updates: done
reauthentication-flow: todo
test-coverage: done
# Gold
devices: done
diagnostics: done
diagnostics: todo
discovery-update-info:
status: exempt
comment: S3 is a cloud service that is not discovered on the network.

View File

@@ -15,14 +15,12 @@
"access_key_id": "Access key ID",
"bucket": "Bucket name",
"endpoint_url": "Endpoint URL",
"prefix": "Prefix",
"secret_access_key": "Secret access key"
},
"data_description": {
"access_key_id": "Access key ID to connect to AWS S3 API",
"bucket": "Bucket must already exist and be writable by the provided credentials.",
"endpoint_url": "Endpoint URL provided to [Boto3 Session]({boto3_docs_url}). Region-specific [AWS S3 endpoints]({aws_s3_docs_url}) are available in their docs.",
"prefix": "Folder or prefix to store backups in, for example `backups`",
"secret_access_key": "Secret access key to connect to AWS S3 API"
},
"title": "Add AWS S3 bucket"

View File

@@ -29,17 +29,12 @@ class StoredBackupData(TypedDict):
class _BackupStore(Store[StoredBackupData]):
"""Class to help storing backup data."""
# Maximum version we support reading for forward compatibility.
# This allows reading data written by a newer HA version after downgrade.
_MAX_READABLE_VERSION = 2
def __init__(self, hass: HomeAssistant) -> None:
"""Initialize storage class."""
super().__init__(
hass,
STORAGE_VERSION,
STORAGE_KEY,
max_readable_version=self._MAX_READABLE_VERSION,
minor_version=STORAGE_VERSION_MINOR,
)
@@ -91,8 +86,8 @@ class _BackupStore(Store[StoredBackupData]):
# data["config"]["schedule"]["state"] will be removed. The bump to 2 is
# planned to happen after a 6 month quiet period with no minor version
# changes.
# Reject if major version is higher than _MAX_READABLE_VERSION.
if old_major_version > self._MAX_READABLE_VERSION:
# Reject if major version is higher than 2.
if old_major_version > 2:
raise NotImplementedError
return data
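
Both versions of this gate refuse data written by a newer major storage version; the removed constant merely named the limit. The check in isolation (illustrative):

MAX_READABLE_VERSION = 2  # highest major storage version this build can read

def ensure_readable(old_major_version: int) -> None:
    """Raise if the stored data was written by a newer, unreadable major version."""
    if old_major_version > MAX_READABLE_VERSION:
        raise NotImplementedError(
            f"Storage major version {old_major_version} > {MAX_READABLE_VERSION}"
        )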

View File

@@ -43,11 +43,11 @@
"title": "The backup location {agent_id} is unavailable"
},
"automatic_backup_failed_addons": {
"description": "Apps {failed_addons} could not be included in automatic backup. Please check the Supervisor logs for more information. Another attempt will be made at the next scheduled time if a backup schedule is configured.",
"title": "Not all apps could be included in automatic backup"
"description": "Add-ons {failed_addons} could not be included in automatic backup. Please check the Supervisor logs for more information. Another attempt will be made at the next scheduled time if a backup schedule is configured.",
"title": "Not all add-ons could be included in automatic backup"
},
"automatic_backup_failed_agents_addons_folders": {
"description": "The automatic backup was created with errors:\n* Locations which the backup could not be uploaded to: {failed_agents}\n* Apps which could not be backed up: {failed_addons}\n* Folders which could not be backed up: {failed_folders}\n\nPlease check the Core and Supervisor logs for more information. Another attempt will be made at the next scheduled time if a backup schedule is configured.",
"description": "The automatic backup was created with errors:\n* Locations which the backup could not be uploaded to: {failed_agents}\n* Add-ons which could not be backed up: {failed_addons}\n* Folders which could not be backed up: {failed_folders}\n\nPlease check the Core and Supervisor logs for more information. Another attempt will be made at the next scheduled time if a backup schedule is configured.",
"title": "Automatic backup was created with errors"
},
"automatic_backup_failed_create": {

View File

@@ -24,7 +24,7 @@ class BinarySensorOnOffTrigger(EntityTargetStateTriggerBase):
"""Class for binary sensor on/off triggers."""
_device_class: BinarySensorDeviceClass | None
_domains = {DOMAIN}
_domain: str = DOMAIN
def entity_filter(self, entities: set[str]) -> set[str]:
"""Filter entities of this domain."""

View File

@@ -190,7 +190,7 @@ class BitcoinSensor(SensorEntity):
elif sensor_type == "miners_revenue_usd":
self._attr_native_value = f"{stats.miners_revenue_usd:.0f}"
elif sensor_type == "btc_mined":
self._attr_native_value = str(stats.btc_mined * 1e-8)
self._attr_native_value = str(stats.btc_mined * 0.00000001)
elif sensor_type == "trade_volume_usd":
self._attr_native_value = f"{stats.trade_volume_usd:.1f}"
elif sensor_type == "difficulty":
@@ -208,13 +208,13 @@ class BitcoinSensor(SensorEntity):
elif sensor_type == "blocks_size":
self._attr_native_value = f"{stats.blocks_size:.1f}"
elif sensor_type == "total_fees_btc":
self._attr_native_value = f"{stats.total_fees_btc * 1e-8:.2f}"
self._attr_native_value = f"{stats.total_fees_btc * 0.00000001:.2f}"
elif sensor_type == "total_btc_sent":
self._attr_native_value = f"{stats.total_btc_sent * 1e-8:.2f}"
self._attr_native_value = f"{stats.total_btc_sent * 0.00000001:.2f}"
elif sensor_type == "estimated_btc_sent":
self._attr_native_value = f"{stats.estimated_btc_sent * 1e-8:.2f}"
self._attr_native_value = f"{stats.estimated_btc_sent * 0.00000001:.2f}"
elif sensor_type == "total_btc":
self._attr_native_value = f"{stats.total_btc * 1e-8:.2f}"
self._attr_native_value = f"{stats.total_btc * 0.00000001:.2f}"
elif sensor_type == "total_blocks":
self._attr_native_value = f"{stats.total_blocks:.0f}"
elif sensor_type == "next_retarget":
@@ -222,7 +222,7 @@ class BitcoinSensor(SensorEntity):
elif sensor_type == "estimated_transaction_volume_usd":
self._attr_native_value = f"{stats.estimated_transaction_volume_usd:.2f}"
elif sensor_type == "miners_revenue_btc":
self._attr_native_value = f"{stats.miners_revenue_btc * 1e-8:.1f}"
self._attr_native_value = f"{stats.miners_revenue_btc * 0.00000001:.1f}"
elif sensor_type == "market_price_usd":
self._attr_native_value = f"{stats.market_price_usd:.2f}"

View File

@@ -85,7 +85,6 @@ class BluesoundPlayer(CoordinatorEntity[BluesoundCoordinator], MediaPlayerEntity
_attr_media_content_type = MediaType.MUSIC
_attr_has_entity_name = True
_attr_name = None
_attr_volume_step = 0.01
def __init__(
self,
@@ -689,6 +688,24 @@ class BluesoundPlayer(CoordinatorEntity[BluesoundCoordinator], MediaPlayerEntity
await self._player.play_url(url)
async def async_volume_up(self) -> None:
"""Volume up the media player."""
if self.volume_level is None:
return
new_volume = self.volume_level + 0.01
new_volume = min(1, new_volume)
await self.async_set_volume_level(new_volume)
async def async_volume_down(self) -> None:
"""Volume down the media player."""
if self.volume_level is None:
return
new_volume = self.volume_level - 0.01
new_volume = max(0, new_volume)
await self.async_set_volume_level(new_volume)
async def async_set_volume_level(self, volume: float) -> None:
"""Send volume_up command to media player."""
volume = int(round(volume * 100))

View File

@@ -1,291 +0,0 @@
"""The Brands integration."""
from __future__ import annotations
from collections import deque
from http import HTTPStatus
import logging
from pathlib import Path
from random import SystemRandom
import time
from typing import Any, Final
from aiohttp import ClientError, hdrs, web
import voluptuous as vol
from homeassistant.components import websocket_api
from homeassistant.components.http import KEY_AUTHENTICATED, HomeAssistantView
from homeassistant.core import HomeAssistant, callback, valid_domain
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.event import async_track_time_interval
from homeassistant.helpers.typing import ConfigType
from homeassistant.loader import async_get_custom_components
from .const import (
ALLOWED_IMAGES,
BRANDS_CDN_URL,
CACHE_TTL,
CATEGORY_RE,
CDN_TIMEOUT,
DOMAIN,
HARDWARE_IMAGE_RE,
IMAGE_FALLBACKS,
PLACEHOLDER,
TOKEN_CHANGE_INTERVAL,
)
_LOGGER = logging.getLogger(__name__)
_RND: Final = SystemRandom()
CONFIG_SCHEMA = cv.empty_config_schema(DOMAIN)
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the Brands integration."""
access_tokens: deque[str] = deque([], 2)
access_tokens.append(hex(_RND.getrandbits(256))[2:])
hass.data[DOMAIN] = access_tokens
@callback
def _rotate_token(_now: Any) -> None:
"""Rotate the access token."""
access_tokens.append(hex(_RND.getrandbits(256))[2:])
async_track_time_interval(hass, _rotate_token, TOKEN_CHANGE_INTERVAL)
hass.http.register_view(BrandsIntegrationView(hass))
hass.http.register_view(BrandsHardwareView(hass))
websocket_api.async_register_command(hass, ws_access_token)
return True
@callback
@websocket_api.websocket_command({vol.Required("type"): "brands/access_token"})
def ws_access_token(
hass: HomeAssistant,
connection: websocket_api.ActiveConnection,
msg: dict[str, Any],
) -> None:
"""Return the current brands access token."""
access_tokens: deque[str] = hass.data[DOMAIN]
connection.send_result(msg["id"], {"token": access_tokens[-1]})
def _read_cached_file_with_marker(
cache_path: Path,
) -> tuple[bytes | None, float] | None:
"""Read a cached file, distinguishing between content and 404 markers.
Returns (content, mtime) where content is None for 404 markers (empty files).
Returns None if the file does not exist at all.
"""
if not cache_path.is_file():
return None
mtime = cache_path.stat().st_mtime
data = cache_path.read_bytes()
if not data:
# Empty file is a 404 marker
return (None, mtime)
return (data, mtime)
def _write_cache_file(cache_path: Path, data: bytes) -> None:
"""Write data to cache file, creating directories as needed."""
cache_path.parent.mkdir(parents=True, exist_ok=True)
cache_path.write_bytes(data)
def _read_brand_file(brand_dir: Path, image: str) -> bytes | None:
"""Read a brand image, trying fallbacks in a single I/O pass."""
for candidate in (image, *IMAGE_FALLBACKS.get(image, ())):
file_path = brand_dir / candidate
if file_path.is_file():
return file_path.read_bytes()
return None
class _BrandsBaseView(HomeAssistantView):
"""Base view for serving brand images."""
requires_auth = False
def __init__(self, hass: HomeAssistant) -> None:
"""Initialize the view."""
self._hass = hass
self._cache_dir = Path(hass.config.cache_path(DOMAIN))
def _authenticate(self, request: web.Request) -> None:
"""Authenticate the request using Bearer token or query token."""
access_tokens: deque[str] = self._hass.data[DOMAIN]
authenticated = (
request[KEY_AUTHENTICATED] or request.query.get("token") in access_tokens
)
if not authenticated:
if hdrs.AUTHORIZATION in request.headers:
raise web.HTTPUnauthorized
raise web.HTTPForbidden
async def _serve_from_custom_integration(
self,
domain: str,
image: str,
) -> web.Response | None:
"""Try to serve a brand image from a custom integration."""
custom_components = await async_get_custom_components(self._hass)
if (integration := custom_components.get(domain)) is None:
return None
if not integration.has_branding:
return None
brand_dir = Path(integration.file_path) / "brand"
data = await self._hass.async_add_executor_job(
_read_brand_file, brand_dir, image
)
if data is not None:
return self._build_response(data)
return None
async def _serve_from_cache_or_cdn(
self,
cdn_path: str,
cache_subpath: str,
*,
fallback_placeholder: bool = True,
) -> web.Response:
"""Serve from disk cache, fetching from CDN if needed."""
cache_path = self._cache_dir / cache_subpath
now = time.time()
# Try disk cache
result = await self._hass.async_add_executor_job(
_read_cached_file_with_marker, cache_path
)
if result is not None:
data, mtime = result
# Schedule background refresh if stale
if now - mtime > CACHE_TTL:
self._hass.async_create_background_task(
self._fetch_and_cache(cdn_path, cache_path),
f"brands_refresh_{cache_subpath}",
)
else:
# Cache miss - fetch from CDN
data = await self._fetch_and_cache(cdn_path, cache_path)
if data is None:
if fallback_placeholder:
return await self._serve_placeholder(
image=cache_subpath.rsplit("/", 1)[-1]
)
return web.Response(status=HTTPStatus.NOT_FOUND)
return self._build_response(data)
async def _fetch_and_cache(
self,
cdn_path: str,
cache_path: Path,
) -> bytes | None:
"""Fetch from CDN and write to cache. Returns data or None on 404."""
url = f"{BRANDS_CDN_URL}/{cdn_path}"
session = async_get_clientsession(self._hass)
try:
resp = await session.get(url, timeout=CDN_TIMEOUT)
except (ClientError, TimeoutError):
_LOGGER.debug("Failed to fetch brand from CDN: %s", cdn_path)
return None
if resp.status == HTTPStatus.NOT_FOUND:
# Cache the 404 as empty file
await self._hass.async_add_executor_job(_write_cache_file, cache_path, b"")
return None
if resp.status != HTTPStatus.OK:
_LOGGER.debug("Unexpected CDN response %s for %s", resp.status, cdn_path)
return None
data = await resp.read()
await self._hass.async_add_executor_job(_write_cache_file, cache_path, data)
return data
async def _serve_placeholder(self, image: str) -> web.Response:
"""Serve a placeholder image."""
return await self._serve_from_cache_or_cdn(
cdn_path=f"_/{PLACEHOLDER}/{image}",
cache_subpath=f"integrations/{PLACEHOLDER}/{image}",
fallback_placeholder=False,
)
@staticmethod
def _build_response(data: bytes) -> web.Response:
"""Build a response with proper headers."""
return web.Response(
body=data,
content_type="image/png",
)
class BrandsIntegrationView(_BrandsBaseView):
"""Serve integration brand images."""
name = "api:brands:integration"
url = "/api/brands/integration/{domain}/{image}"
async def get(
self,
request: web.Request,
domain: str,
image: str,
) -> web.Response:
"""Handle GET request for an integration brand image."""
self._authenticate(request)
if not valid_domain(domain) or image not in ALLOWED_IMAGES:
return web.Response(status=HTTPStatus.NOT_FOUND)
use_placeholder = request.query.get("placeholder") != "no"
# 1. Try custom integration local files
if (
response := await self._serve_from_custom_integration(domain, image)
) is not None:
return response
# 2. Try cache / CDN (always use direct path for proper 404 caching)
return await self._serve_from_cache_or_cdn(
cdn_path=f"brands/{domain}/{image}",
cache_subpath=f"integrations/{domain}/{image}",
fallback_placeholder=use_placeholder,
)
class BrandsHardwareView(_BrandsBaseView):
"""Serve hardware brand images."""
name = "api:brands:hardware"
url = "/api/brands/hardware/{category}/{image:.+}"
async def get(
self,
request: web.Request,
category: str,
image: str,
) -> web.Response:
"""Handle GET request for a hardware brand image."""
self._authenticate(request)
if not CATEGORY_RE.match(category):
return web.Response(status=HTTPStatus.NOT_FOUND)
# Hardware images have dynamic names like "manufacturer_model.png"
# Validate it ends with .png and contains only safe characters
if not HARDWARE_IMAGE_RE.match(image):
return web.Response(status=HTTPStatus.NOT_FOUND)
cache_subpath = f"hardware/{category}/{image}"
return await self._serve_from_cache_or_cdn(
cdn_path=cache_subpath,
cache_subpath=cache_subpath,
)
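
Worth noting before this file ends: the deleted module cached CDN 404s as empty files (negative caching) and served stale entries while refreshing in the background. A reduced sketch of that classification (illustrative; the constant comes from the deleted const.py):

from pathlib import Path
import time

CACHE_TTL = 30 * 24 * 60 * 60  # 30 days

def classify(path: Path, now: float | None = None) -> str:
    """Classify a cache entry as miss, hit, cached-404, or a stale variant."""
    if not path.is_file():
        return "miss"  # fetch from the CDN synchronously
    kind = "hit" if path.read_bytes() else "cached-404"  # empty file == remembered 404
    age = (now if now is not None else time.time()) - path.stat().st_mtime
    return kind if age <= CACHE_TTL else f"stale-{kind}"  # stale -> background refresh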

View File

@@ -1,57 +0,0 @@
"""Constants for the Brands integration."""
from __future__ import annotations
from datetime import timedelta
import re
from typing import Final
from aiohttp import ClientTimeout
DOMAIN: Final = "brands"
# CDN
BRANDS_CDN_URL: Final = "https://brands.home-assistant.io"
CDN_TIMEOUT: Final = ClientTimeout(total=10)
PLACEHOLDER: Final = "_placeholder"
# Caching
CACHE_TTL: Final = 30 * 24 * 60 * 60 # 30 days in seconds
# Access token
TOKEN_CHANGE_INTERVAL: Final = timedelta(minutes=30)
# Validation
CATEGORY_RE: Final = re.compile(r"^[a-z0-9_]+$")
HARDWARE_IMAGE_RE: Final = re.compile(r"^[a-z0-9_-]+\.png$")
# Images and fallback chains
ALLOWED_IMAGES: Final = frozenset(
{
"icon.png",
"logo.png",
"icon@2x.png",
"logo@2x.png",
"dark_icon.png",
"dark_logo.png",
"dark_icon@2x.png",
"dark_logo@2x.png",
}
)
# Fallback chains for image resolution, mirroring the brands CDN build logic.
# When a requested image is not found, we try each fallback in order.
IMAGE_FALLBACKS: Final[dict[str, list[str]]] = {
"logo.png": ["icon.png"],
"icon@2x.png": ["icon.png"],
"logo@2x.png": ["logo.png", "icon.png"],
"dark_icon.png": ["icon.png"],
"dark_logo.png": ["dark_icon.png", "logo.png", "icon.png"],
"dark_icon@2x.png": ["icon@2x.png", "icon.png"],
"dark_logo@2x.png": [
"dark_icon@2x.png",
"logo@2x.png",
"logo.png",
"icon.png",
],
}
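
These chains combine with _read_brand_file above into a first-match lookup; isolated for clarity (standalone sketch using an excerpt of the table):

IMAGE_FALLBACKS = {"logo@2x.png": ["logo.png", "icon.png"]}

def candidates(image: str) -> tuple[str, ...]:
    """Requested image first, then its fallbacks in order; first existing file wins."""
    return (image, *IMAGE_FALLBACKS.get(image, ()))

assert candidates("logo@2x.png") == ("logo@2x.png", "logo.png", "icon.png")
assert candidates("icon.png") == ("icon.png",)  # icon.png has no fallback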

View File

@@ -1,10 +0,0 @@
{
"domain": "brands",
"name": "Brands",
"codeowners": ["@home-assistant/core"],
"config_flow": false,
"dependencies": ["http", "websocket_api"],
"documentation": "https://www.home-assistant.io/integrations/brands",
"integration_type": "system",
"quality_scale": "internal"
}

View File

@@ -1,4 +1,4 @@
"""The BSB-LAN integration."""
"""The BSB-Lan integration."""
import asyncio
import dataclasses
@@ -36,7 +36,7 @@ from .const import CONF_PASSKEY, DOMAIN
from .coordinator import BSBLanFastCoordinator, BSBLanSlowCoordinator
from .services import async_setup_services
PLATFORMS = [Platform.BUTTON, Platform.CLIMATE, Platform.SENSOR, Platform.WATER_HEATER]
PLATFORMS = [Platform.CLIMATE, Platform.SENSOR, Platform.WATER_HEATER]
CONFIG_SCHEMA = cv.config_entry_only_config_schema(DOMAIN)
@@ -56,13 +56,13 @@ class BSBLanData:
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the BSB-LAN integration."""
"""Set up the BSB-Lan integration."""
async_setup_services(hass)
return True
async def async_setup_entry(hass: HomeAssistant, entry: BSBLanConfigEntry) -> bool:
"""Set up BSB-LAN from a config entry."""
"""Set up BSB-Lan from a config entry."""
# create config using BSBLANConfig
config = BSBLANConfig(

View File

@@ -1,59 +0,0 @@
"""Button platform for BSB-Lan integration."""
from __future__ import annotations
from homeassistant.components.button import ButtonEntity, ButtonEntityDescription
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import BSBLanConfigEntry, BSBLanData
from .coordinator import BSBLanFastCoordinator
from .entity import BSBLanEntity
from .helpers import async_sync_device_time
PARALLEL_UPDATES = 1
BUTTON_DESCRIPTIONS: tuple[ButtonEntityDescription, ...] = (
ButtonEntityDescription(
key="sync_time",
translation_key="sync_time",
entity_category=EntityCategory.CONFIG,
),
)
async def async_setup_entry(
hass: HomeAssistant,
entry: BSBLanConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up BSB-Lan button entities from a config entry."""
data = entry.runtime_data
async_add_entities(
BSBLanButtonEntity(data.fast_coordinator, data, description)
for description in BUTTON_DESCRIPTIONS
)
class BSBLanButtonEntity(BSBLanEntity, ButtonEntity):
"""Defines a BSB-Lan button entity."""
entity_description: ButtonEntityDescription
def __init__(
self,
coordinator: BSBLanFastCoordinator,
data: BSBLanData,
description: ButtonEntityDescription,
) -> None:
"""Initialize BSB-Lan button entity."""
super().__init__(coordinator, data)
self.entity_description = description
self._attr_unique_id = f"{data.device.MAC}-{description.key}"
self._data = data
async def async_press(self) -> None:
"""Handle the button press."""
await async_sync_device_time(self._data.client, self._data.device.name)

View File

@@ -39,15 +39,15 @@ PRESET_MODES = [
PRESET_NONE,
]
# Mapping from Home Assistant HVACMode to BSB-LAN integer values
# BSB-LAN uses: 0=off, 1=auto, 2=eco/reduced, 3=heat/comfort
# Mapping from Home Assistant HVACMode to BSB-Lan integer values
# BSB-Lan uses: 0=off, 1=auto, 2=eco/reduced, 3=heat/comfort
HA_TO_BSBLAN_HVAC_MODE: Final[dict[HVACMode, int]] = {
HVACMode.OFF: 0,
HVACMode.AUTO: 1,
HVACMode.HEAT: 3,
}
# Mapping from BSB-LAN integer values to Home Assistant HVACMode
# Mapping from BSB-Lan integer values to Home Assistant HVACMode
BSBLAN_TO_HA_HVAC_MODE: Final[dict[int, HVACMode]] = {
0: HVACMode.OFF,
1: HVACMode.AUTO,
@@ -69,6 +69,7 @@ async def async_setup_entry(
class BSBLANClimate(BSBLanEntity, ClimateEntity):
"""Defines a BSBLAN climate device."""
_attr_has_entity_name = True
_attr_name = None
# Determine preset modes
_attr_supported_features = (
@@ -137,7 +138,7 @@ class BSBLANClimate(BSBLanEntity, ClimateEntity):
@property
def preset_mode(self) -> str | None:
"""Return the current preset mode."""
# BSB-LAN mode 2 is eco/reduced mode
# BSB-Lan mode 2 is eco/reduced mode
if self._hvac_mode_value == 2:
return PRESET_ECO
return PRESET_NONE
@@ -162,7 +163,7 @@ class BSBLANClimate(BSBLanEntity, ClimateEntity):
if ATTR_HVAC_MODE in kwargs:
data[ATTR_HVAC_MODE] = HA_TO_BSBLAN_HVAC_MODE[kwargs[ATTR_HVAC_MODE]]
if ATTR_PRESET_MODE in kwargs:
# eco preset uses BSB-LAN mode 2, none preset uses mode 1 (auto)
# eco preset uses BSB-Lan mode 2, none preset uses mode 1 (auto)
if kwargs[ATTR_PRESET_MODE] == PRESET_ECO:
data[ATTR_HVAC_MODE] = 2
elif kwargs[ATTR_PRESET_MODE] == PRESET_NONE:

View File

@@ -1,4 +1,4 @@
"""Config flow for BSB-LAN integration."""
"""Config flow for BSB-Lan integration."""
from __future__ import annotations

View File

@@ -1,4 +1,4 @@
"""Constants for the BSB-LAN integration."""
"""Constants for the BSB-Lan integration."""
from __future__ import annotations

Some files were not shown because too many files have changed in this diff.