Compare commits


113 Commits

Author SHA1 Message Date
Jan Čermák
8df4152d4e Disable unnecessary parts to test image build 2026-02-24 17:18:39 +01:00
Jan Čermák
e6ed0b5d14 Use native ARM runner for builder action, update to builder 2026.02.1
The builder has SHA-pinning fixed since 2026.02.0, so we can also drop the
Zizmor error suppression.

Builder changes:
* https://github.com/home-assistant/builder/releases/tag/2026.02.0
* https://github.com/home-assistant/builder/releases/tag/2026.02.1
2026-02-24 16:15:30 +01:00
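SHA-pinning means referencing an action by the full commit SHA rather than a mutable tag, so the code a workflow resolves cannot change underneath it; that is what audits like Zizmor's `unpinned-uses` check flag. A minimal sketch of the two styles (the action name, SHA, and version comment below are illustrative, not taken from this diff):

```yaml
steps:
  # Tag reference: "v1" is mutable and can later be re-pointed
  # at different code, which Zizmor reports as unpinned-uses.
  - uses: example-org/example-action@v1

  # SHA-pinned reference: the full 40-character commit SHA is
  # immutable; the trailing comment records the readable version.
  - uses: example-org/example-action@0123456789abcdef0123456789abcdef01234567 # v1.2.3
```

The same convention is visible later in this diff, e.g. `home-assistant/builder@6cb4fd3d...` followed by a `# 2026.02.1` comment.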
On Freund
7adfb0a40b Add bus support to MTA integration (#163220)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-02-24 16:11:13 +01:00
Zoltán Farkasdi
b4705e4a45 Fix flaky netatmo test (#163941) 2026-02-24 16:02:00 +01:00
Tom
a0176d18cf Add DHCP ip_addresses update to airOS (#163936) 2026-02-24 15:36:52 +01:00
Kevin Stillhammer
5543107f6c Allow to disable seconds in DurationSelector (#163803) 2026-02-24 15:11:26 +01:00
Klaas Schoute
6dc8840932 Rename Powerfox integration to Powerfox Cloud (#163723) 2026-02-24 14:42:43 +01:00
Stefan Agner
76902aa7fa Avoid adding Content-Type to non-body responses (#163885) 2026-02-24 14:31:04 +01:00
Erwin Douna
07b9877f64 Add button platform to Proxmox (#163791)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-02-24 14:24:20 +01:00
Erik Montnemery
40e2f79e60 Add support for reading backups using securetar v3 (#163920) 2026-02-24 14:23:00 +01:00
Christopher Fenner
aa707fcf41 Add gateway discovery via USB for EnOcean integration (#162756)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-24 11:58:01 +01:00
Willem-Jan van Rootselaar
4b53bc243d Add energy sensor to bsblan (#163879) 2026-02-24 11:56:27 +01:00
Robert Resch
220e94d029 Fix nightlies by reverting the builder to a version instead of a sha (#163935) 2026-02-24 11:48:19 +01:00
Erik Montnemery
b1f943ccda Replace discovery with user flow in Philips Hue BLE (#163924) 2026-02-24 11:06:31 +01:00
Brett Adams
e37d84049a Update Splunk integration to bronze quality scale (#163616)
Co-authored-by: Claude <noreply@anthropic.com>
2026-02-24 10:56:05 +01:00
Marc Mueller
209473e376 Remove myself as codeowner for fritzbox_callmonitor (#163927) 2026-02-24 10:45:58 +01:00
MoonDevLT
334c3af448 Bump lunatone-rest-api-client to 0.7.0 (#163594) 2026-02-24 10:10:04 +01:00
hanwg
5560139d24 Clean up duplicated code in Telegram bot (#163917) 2026-02-24 10:04:21 +01:00
Erik Montnemery
d4dec5d1d3 Improve backup_restore tests (#163921) 2026-02-24 10:03:42 +01:00
J. Nick Koston
6cb63a60bc Skip unknown entity types in ESPHome integration (#163887) 2026-02-24 08:48:27 +01:00
Franck Nijhof
991301e79e Merge branch 'master' into dev 2026-02-24 07:07:39 +00:00
andreimoraru
06e2b4633a Bump yt-dlp to 2026.2.21 (#163916) 2026-02-24 07:30:54 +01:00
Manu
048d8d217c Update strings in ntfy integration (#163912) 2026-02-24 06:24:18 +01:00
Kyle Johnson
3693bc5878 Make Google Assistant fan speed percent and step speeds mutually exclusive (#162770) 2026-02-23 22:26:09 +00:00
Franck Nijhof
9c640fe0fa 2026.2.3 (#163683) 2026-02-20 21:43:32 +01:00
Sid
62145e5f9e Bump eheimdigital to 1.6.0 (#161961) 2026-02-20 19:51:10 +00:00
Franck Nijhof
c0fc414bb9 Fix nrgkick tests for rc 2026-02-20 19:49:27 +00:00
Franck Nijhof
69411a05ff Bump version to 2026.2.3 2026-02-20 19:39:05 +00:00
Marc Mueller
06c9ec861d Fix hassfest requirements check (#163681) 2026-02-20 19:38:58 +00:00
Joost Lekkerkerker
946df1755f Bump pySmartThings to 3.5.3 (#163375)
Co-authored-by: Josef Zweck <josef@zweck.dev>
2026-02-20 19:38:56 +00:00
Thomas Sejr Madsen
d0678e0641 Fix touchline_sl zone availability when alarm state is set (#163338) 2026-02-20 19:38:55 +00:00
Allen Porter
ec56f183da Bump pyrainbird to 6.0.5 (#163333) 2026-02-20 19:38:53 +00:00
Åke Strandberg
033005e0de Add Miele dishwasher program code (#163308) 2026-02-20 19:38:52 +00:00
Andreas Jakl
91f9f5a826 NRGkick: do not update vehicle connected timestamp when vehicle is not connected (#163292) 2026-02-20 19:38:51 +00:00
David Recordon
ac4fcab827 Fix Control4 HVAC action mapping for multi-stage and idle states (#163222) 2026-02-20 19:38:49 +00:00
Allen Porter
d0eea77178 Fix remote calendar event handling of events within the same update period (#163186)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-20 19:38:48 +00:00
Markus Adrario
fb38fa3844 Add Lux to homee units (#163180) 2026-02-20 19:38:47 +00:00
Allen Porter
440efb953e Bump ical to 13.2.0 (#163123) 2026-02-20 19:38:45 +00:00
Manu
7ce47cca0d Fix blocking call in Xbox config flow (#163122) 2026-02-20 19:38:44 +00:00
Andre Lengwenus
a5f607bb91 Bump pypck to 0.9.11 (#163043) 2026-02-20 19:38:42 +00:00
Andre Lengwenus
b03043aa6f Bump pypck to 0.9.10 (#162333) 2026-02-20 19:38:41 +00:00
Robert Resch
0f3c7ca277 Block redirect to localhost (#162941) 2026-02-20 19:37:03 +00:00
Martin Hjelmare
3abf7c22f3 Fix Z-Wave climate set preset (#162728) 2026-02-20 19:37:01 +00:00
hbludworth
292e1de126 Show progress indicator during backup stage of Core/App update (#162683) 2026-02-20 19:37:00 +00:00
Christian Lackas
2d776a8193 Fix HomematicIP entity recovery after access point cloud reconnect (#162575) 2026-02-20 19:36:58 +00:00
Sid
039bbbb48c Fix dynamic entity creation in eheimdigital (#161155) 2026-02-20 19:36:56 +00:00
Luke Lashley
ad5565df95 Add the ability to select region for Roborock (#160898) 2026-02-20 19:36:55 +00:00
Franck Nijhof
3e6bc29a6a 2026.2.2 (#162950) 2026-02-13 21:05:06 +01:00
Franck Nijhof
ec8067a5a8 Bump version to 2026.2.2 2026-02-13 19:25:16 +00:00
Josef Zweck
6f47716d0a Log remaining token duration in onedrive (#162933) 2026-02-13 19:24:25 +00:00
puddly
efba5c6bcc Bump ZHA to 0.0.90 (#162894) 2026-02-13 19:24:24 +00:00
Sammy [Andrei Marinache]
d10e78079f Add Miele TQ1000WP tumble dryer programs and program phases (#162871)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
Co-authored-by: Åke Strandberg <ake@strandberg.eu>
2026-02-13 19:24:23 +00:00
Jon Seager
6d4581580f Bump pytouchlinesl to 0.6.0 (#162856) 2026-02-13 19:24:21 +00:00
Yoshi Walsh
0d9a41a540 Bump pydaikin to 2.17.2 (#162846) 2026-02-13 19:24:20 +00:00
Vicx
cd69e6db73 Bump slixmpp to 1.13.2 (#162837) 2026-02-13 19:24:19 +00:00
Xitee
1320367d0d Filter out transient zero values from qBittorrent alltime stats (#162821)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-13 19:24:18 +00:00
Joost Lekkerkerker
dfa4698887 Bump pySmartThings to 3.5.2 (#162809)
Co-authored-by: Josef Zweck <josef@zweck.dev>
2026-02-13 19:24:17 +00:00
Robert Resch
b426115de7 Bump cryptography to 46.0.5 (#162783) 2026-02-13 19:24:15 +00:00
hanwg
fb79fa37f8 Fix bug in edit_message_media action for Telegram bot (#162762) 2026-02-13 19:24:14 +00:00
Simone Chemelli
6a5f7bf424 Fix image platform state for Vodafone Station (#162747)
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-02-13 19:24:13 +00:00
Simone Chemelli
142ca6dec1 Fix alarm refresh warning for Comelit SimpleHome (#162710) 2026-02-13 19:24:12 +00:00
epenet
0f986c24d0 Fix unavailable status in Tuya (#162709) 2026-02-13 19:24:11 +00:00
Josef Zweck
01f2b7b6f6 Bump onedrive-personal-sdk to 0.1.2 (#162689) 2026-02-13 19:24:09 +00:00
Michael
b9469027f5 Fix handling when FRITZ!Box reboots in FRITZ!Box Tools (#162679) 2026-02-13 19:24:08 +00:00
Tomás Correia
fbb94af748 Fix Cloudflare R2 setup screen info (#162677) 2026-02-13 19:24:07 +00:00
Michael
148bdf6e3a Fix handling when FRITZ!Box reboots in FRITZ!Smarthome (#162676) 2026-02-13 19:24:05 +00:00
starkillerOG
91999f8871 Bump reolink-aio to 0.19.0 (#162672) 2026-02-13 19:24:04 +00:00
Jeef
aecca4eb99 Bump intellifire4py to 4.3.1 (#162659) 2026-02-13 19:24:03 +00:00
Allen Porter
bf8aa49bae Improve MCP SSE fallback error handling (#162655) 2026-02-13 19:24:02 +00:00
Joost Lekkerkerker
4423425683 Pin setuptools to 81.0.0 (#162589)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-13 19:24:01 +00:00
Aaron Godfrey
44202da53d Increase max tasks retrieved per page to prevent timeout (#162587) 2026-02-13 19:23:59 +00:00
Thomas55555
9f7dfb72c4 Bump aioautomower to 2.7.3 (#162583)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-13 19:23:58 +00:00
Michael
de07a69e4f Bump aioimmich to 0.12.0 (#162573)
Co-authored-by: Joost Lekkerkerker <joostlek@outlook.com>
2026-02-13 19:23:57 +00:00
Maikel Punie
bbf4c38115 Migrate velbus config entries (#162565) 2026-02-13 19:23:56 +00:00
ElCruncharino
e1bb5d52ef Add timeout to B2 metadata downloads to prevent backup hang (#162562) 2026-02-13 19:23:54 +00:00
hanwg
eb64b6bdee Fix config flow bug for Telegram bot (#162555) 2026-02-13 19:23:53 +00:00
Andrea Turri
ecb288b735 Add new Miele mappings (#162544) 2026-02-13 19:23:52 +00:00
Norbert Rittel
a419c9c420 Sentence-case "speech-to-text" in google_cloud (#162534) 2026-02-13 19:23:51 +00:00
Brett Adams
dd29133324 Fix Tesla Fleet partner registration to use all regions (#162525)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-13 19:23:50 +00:00
Allen Porter
90f22ea516 Bump grpc to 1.78.0 (#162520) 2026-02-13 19:23:48 +00:00
Peter Grauvogel
9db1428265 Fix Green Planet Energy price unit conversion (#162511) 2026-02-13 19:23:47 +00:00
Denis Shulyaka
a696b05b0d Fix JSON serialization of time objects in Cloud conversation tool results (#162506) 2026-02-13 19:23:46 +00:00
Denis Shulyaka
77ddb63b73 Fix JSON serialization of time objects in Open Router tool results (#162505) 2026-02-13 19:23:44 +00:00
Denis Shulyaka
4180a6e176 Fix JSON serialization of time objects in Ollama tool results (#162502)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-13 19:23:43 +00:00
Denis Shulyaka
6d74c912d2 Fix JSON serialization of datetime objects in Google Generative AI tool results (#162495) 2026-02-13 19:23:42 +00:00
Denis Shulyaka
8a01dfcc00 Fix JSON serialization of time objects in OpenAI tool results (#162490) 2026-02-13 19:23:40 +00:00
Brett Adams
9722898dc6 Fix device_class of backup reserve sensor in Tessie (#162459) 2026-02-13 19:23:39 +00:00
Brett Adams
7438c71fcb Fix device_class of backup reserve sensor in teslemetry (#162458) 2026-02-13 19:23:38 +00:00
Christian Lackas
0b5e55b923 Fix absolute humidity sensor on HmIP-WGT glass thermostats (#162455) 2026-02-13 19:23:37 +00:00
ElCruncharino
61ed959e8e Fix AsyncIteratorReader blocking after stream exhaustion (#161731) 2026-02-13 19:17:20 +00:00
Jaap Pieroen
3989532465 Bump essent-dynamic-pricing to 0.3.1 (#160958)
Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>
2026-02-13 19:17:18 +00:00
Franck Nijhof
28027ddca4 2026.2.1 (#162450) 2026-02-06 22:44:07 +01:00
Franck Nijhof
fe0d7b3cca Bump version to 2026.2.1 2026-02-06 20:49:26 +00:00
jameson_uk
0dcc4e9527 dep: bump aioamazondevices to 11.1.3 (#162437) 2026-02-06 20:47:38 +00:00
Artur Pragacz
b13b189703 Make bad entity ID detection more lenient (#162425) 2026-02-06 20:47:37 +00:00
epenet
150829f599 Fix invalid yardian snapshots (#162422) 2026-02-06 20:47:36 +00:00
Joost Lekkerkerker
57dd9d9c23 Remove double unit of measurement for yardian (#162412) 2026-02-06 20:47:34 +00:00
Sab44
e2056cb12c Bump librehardwaremonitor-api to version 1.9.1 (#162409) 2026-02-06 20:47:33 +00:00
Joost Lekkerkerker
fa2c8992cf Remove entity id overwrite for ambient station (#162403) 2026-02-06 20:47:32 +00:00
Matt Zimmerman
ddf5c7fe3a Add missing config flow strings to SmartTub (#162375)
Co-authored-by: Cursor <cursoragent@cursor.com>
2026-02-06 20:47:31 +00:00
Matt Zimmerman
7034ed6d3f Bump python-smarttub to 0.0.47 (#162367)
Co-authored-by: Cursor <cursoragent@cursor.com>
2026-02-06 20:47:29 +00:00
Aaron Godfrey
9015b53c1b Fix conversion of data for todo.* actions (#162366) 2026-02-06 20:47:28 +00:00
Jordan Harvey
1cfa6561f7 Update pynintendoparental requirement to version 2.3.2.1 (#162362) 2026-02-06 20:47:27 +00:00
Shay Levy
eead02dcca Fix Shelly Linkedgo Thermostat status update (#162339) 2026-02-06 20:47:26 +00:00
Arie Catsman
456e51a221 Bump pyenphase to 2.4.5 (#162324) 2026-02-06 20:47:25 +00:00
Luo Chen
5d984ce186 Fix unicode escaping in MCP server tool response (#162319)
Co-authored-by: Franck Nijhof <git@frenck.dev>
Co-authored-by: Franck Nijhof <frenck@frenck.nl>
Co-authored-by: epenet <6771947+epenet@users.noreply.github.com>
2026-02-06 20:47:24 +00:00
Oliver
61f45489ac Add mapping for stopped state to denonavr media player (#162283) 2026-02-06 20:47:23 +00:00
Tomás Correia
f72c643b38 Fix multipart upload to use consistent part sizes for R2/S3 (#162278) 2026-02-06 20:47:22 +00:00
Oliver
27bc26e886 Bump denonavr to 1.3.2 (#162271) 2026-02-06 20:47:20 +00:00
Thomas55555
0e9f03cbc1 Bump google_air_quality_api to 3.0.1 (#162233) 2026-02-06 20:47:19 +00:00
David Bonnes
9480c33fb0 Bump evohome-async to 1.1.3 (#162232) 2026-02-06 20:47:18 +00:00
Jonathan
3e6b8663e8 Fix device_class of backup reserve sensor (#161178) 2026-02-06 20:47:17 +00:00
epenet
1c69a83793 Fix redundant off preset in Tuya climate (#161040) 2026-02-06 20:47:16 +00:00
85 changed files with 4951 additions and 834 deletions

View File

@@ -57,10 +57,10 @@ jobs:
with:
type: ${{ env.BUILD_TYPE }}
- name: Verify version
uses: home-assistant/actions/helpers/verify-version@master # zizmor: ignore[unpinned-uses]
with:
ignore-dev: true
# - name: Verify version
# uses: home-assistant/actions/helpers/verify-version@master # zizmor: ignore[unpinned-uses]
# with:
# ignore-dev: true
- name: Fail if translations files are checked in
run: |
@@ -272,7 +272,7 @@ jobs:
name: Build ${{ matrix.machine }} machine core image
if: github.repository_owner == 'home-assistant'
needs: ["init", "build_base"]
runs-on: ubuntu-latest
runs-on: ${{ matrix.runs-on }}
permissions:
contents: read # To check out the repository
packages: write # To push to GHCR
@@ -294,6 +294,21 @@ jobs:
- raspberrypi5-64
- yellow
- green
include:
# Default: aarch64 on native ARM runner
- arch: aarch64
runs-on: ubuntu-24.04-arm
# Overrides for amd64 machines
- machine: generic-x86-64
arch: amd64
runs-on: ubuntu-24.04
- machine: qemux86-64
arch: amd64
runs-on: ubuntu-24.04
# TODO: remove, intel-nuc is a legacy name for x86-64, renamed in 2021
- machine: intel-nuc
arch: amd64
runs-on: ubuntu-24.04
steps:
- name: Checkout the repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
@@ -321,286 +336,288 @@ jobs:
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build base image
uses: home-assistant/builder@21bc64d76dad7a5184c67826aab41c6b6f89023a # 2025.11.0
uses: home-assistant/builder@6cb4fd3d1338b6e22d0958a4bcb53e0965ea63b4 # 2026.02.1
with:
image: ${{ matrix.arch }}
args: |
$BUILD_ARGS \
--test \
--target /data/machine \
--cosign \
--machine "${{ needs.init.outputs.version }}=${{ matrix.machine }}"
publish_ha:
name: Publish version files
environment: ${{ needs.init.outputs.channel }}
if: github.repository_owner == 'home-assistant'
needs: ["init", "build_machine"]
runs-on: ubuntu-latest
permissions:
contents: read
steps:
- name: Checkout the repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Initialize git
uses: home-assistant/actions/helpers/git-init@master # zizmor: ignore[unpinned-uses]
with:
name: ${{ secrets.GIT_NAME }}
email: ${{ secrets.GIT_EMAIL }}
token: ${{ secrets.GIT_TOKEN }}
- name: Update version file
uses: home-assistant/actions/helpers/version-push@master # zizmor: ignore[unpinned-uses]
with:
key: "homeassistant[]"
key-description: "Home Assistant Core"
version: ${{ needs.init.outputs.version }}
channel: ${{ needs.init.outputs.channel }}
exclude-list: '["odroid-xu","qemuarm","qemux86","raspberrypi","raspberrypi2","raspberrypi3","raspberrypi4","tinker"]'
- name: Update version file (stable -> beta)
if: needs.init.outputs.channel == 'stable'
uses: home-assistant/actions/helpers/version-push@master # zizmor: ignore[unpinned-uses]
with:
key: "homeassistant[]"
key-description: "Home Assistant Core"
version: ${{ needs.init.outputs.version }}
channel: beta
exclude-list: '["odroid-xu","qemuarm","qemux86","raspberrypi","raspberrypi2","raspberrypi3","raspberrypi4","tinker"]'
publish_container:
name: Publish meta container for ${{ matrix.registry }}
environment: ${{ needs.init.outputs.channel }}
if: github.repository_owner == 'home-assistant'
needs: ["init", "build_base"]
runs-on: ubuntu-latest
permissions:
contents: read # To check out the repository
packages: write # To push to GHCR
id-token: write # For cosign signing
strategy:
fail-fast: false
matrix:
registry: ["ghcr.io/home-assistant", "docker.io/homeassistant"]
steps:
- name: Install Cosign
uses: sigstore/cosign-installer@faadad0cce49287aee09b3a48701e75088a2c6ad # v4.0.0
with:
cosign-release: "v2.5.3"
- name: Login to DockerHub
if: matrix.registry == 'docker.io/homeassistant'
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Login to GitHub Container Registry
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Verify architecture image signatures
shell: bash
env:
ARCHITECTURES: ${{ needs.init.outputs.architectures }}
VERSION: ${{ needs.init.outputs.version }}
run: |
ARCHS=$(echo "${ARCHITECTURES}" | jq -r '.[]')
for arch in $ARCHS; do
echo "Verifying ${arch} image signature..."
cosign verify \
--certificate-oidc-issuer https://token.actions.githubusercontent.com \
--certificate-identity-regexp https://github.com/home-assistant/core/.* \
"ghcr.io/home-assistant/${arch}-homeassistant:${VERSION}"
done
echo "✓ All images verified successfully"
# Generate all Docker tags based on version string
# Version format: YYYY.MM.PATCH, YYYY.MM.PATCHbN (beta), or YYYY.MM.PATCH.devYYYYMMDDHHMM (dev)
# Examples:
# 2025.12.1 (stable) -> tags: 2025.12.1, 2025.12, stable, latest, beta, rc
# 2025.12.0b3 (beta) -> tags: 2025.12.0b3, beta, rc
# 2025.12.0.dev202511250240 -> tags: 2025.12.0.dev202511250240, dev
- name: Generate Docker metadata
id: meta
uses: docker/metadata-action@c299e40c65443455700f0fdfc63efafe5b349051 # v5.10.0
with:
images: ${{ matrix.registry }}/home-assistant
sep-tags: ","
tags: |
type=raw,value=${{ needs.init.outputs.version }},priority=9999
type=raw,value=dev,enable=${{ contains(needs.init.outputs.version, 'd') }}
type=raw,value=beta,enable=${{ !contains(needs.init.outputs.version, 'd') }}
type=raw,value=rc,enable=${{ !contains(needs.init.outputs.version, 'd') }}
type=raw,value=stable,enable=${{ !contains(needs.init.outputs.version, 'd') && !contains(needs.init.outputs.version, 'b') }}
type=raw,value=latest,enable=${{ !contains(needs.init.outputs.version, 'd') && !contains(needs.init.outputs.version, 'b') }}
type=semver,pattern={{major}}.{{minor}},value=${{ needs.init.outputs.version }},enable=${{ !contains(needs.init.outputs.version, 'd') && !contains(needs.init.outputs.version, 'b') }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.7.1
- name: Copy architecture images to DockerHub
if: matrix.registry == 'docker.io/homeassistant'
shell: bash
env:
ARCHITECTURES: ${{ needs.init.outputs.architectures }}
VERSION: ${{ needs.init.outputs.version }}
run: |
# Use imagetools to copy image blobs directly between registries
# This preserves provenance/attestations and seems to be much faster than pull/push
ARCHS=$(echo "${ARCHITECTURES}" | jq -r '.[]')
for arch in $ARCHS; do
echo "Copying ${arch} image to DockerHub..."
for attempt in 1 2 3; do
if docker buildx imagetools create \
--tag "docker.io/homeassistant/${arch}-homeassistant:${VERSION}" \
"ghcr.io/home-assistant/${arch}-homeassistant:${VERSION}"; then
break
fi
echo "Attempt ${attempt} failed, retrying in 10 seconds..."
sleep 10
if [ "${attempt}" -eq 3 ]; then
echo "Failed after 3 attempts"
exit 1
fi
done
cosign sign --yes "docker.io/homeassistant/${arch}-homeassistant:${VERSION}"
done
- name: Create and push multi-arch manifests
shell: bash
env:
ARCHITECTURES: ${{ needs.init.outputs.architectures }}
REGISTRY: ${{ matrix.registry }}
VERSION: ${{ needs.init.outputs.version }}
META_TAGS: ${{ steps.meta.outputs.tags }}
run: |
# Build list of architecture images dynamically
ARCHS=$(echo "${ARCHITECTURES}" | jq -r '.[]')
ARCH_IMAGES=()
for arch in $ARCHS; do
ARCH_IMAGES+=("${REGISTRY}/${arch}-homeassistant:${VERSION}")
done
# Build list of all tags for single manifest creation
# Note: Using sep-tags=',' in metadata-action for easier parsing
TAG_ARGS=()
IFS=',' read -ra TAGS <<< "${META_TAGS}"
for tag in "${TAGS[@]}"; do
TAG_ARGS+=("--tag" "${tag}")
done
# Create manifest with ALL tags in a single operation (much faster!)
echo "Creating multi-arch manifest with tags: ${TAGS[*]}"
docker buildx imagetools create "${TAG_ARGS[@]}" "${ARCH_IMAGES[@]}"
# Sign each tag separately (signing requires individual tag names)
echo "Signing all tags..."
for tag in "${TAGS[@]}"; do
echo "Signing ${tag}"
cosign sign --yes "${tag}"
done
echo "All manifests created and signed successfully"
build_python:
name: Build PyPi package
environment: ${{ needs.init.outputs.channel }}
needs: ["init", "build_base"]
runs-on: ubuntu-latest
permissions:
contents: read # To check out the repository
id-token: write # For PyPI trusted publishing
if: github.repository_owner == 'home-assistant' && needs.init.outputs.publish == 'true'
steps:
- name: Checkout the repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
- name: Download translations
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
name: translations
- name: Extract translations
run: |
tar xvf translations.tar.gz
rm translations.tar.gz
- name: Build package
shell: bash
run: |
# Remove dist, build, and homeassistant.egg-info
# when build locally for testing!
pip install build
python -m build
- name: Upload package to PyPI
uses: pypa/gh-action-pypi-publish@ed0c53931b1dc9bd32cbe73a98c7f6766f8a527e # v1.13.0
with:
skip-existing: true
hassfest-image:
name: Build and test hassfest image
runs-on: ubuntu-latest
permissions:
contents: read # To check out the repository
packages: write # To push to GHCR
attestations: write # For build provenance attestation
id-token: write # For build provenance attestation
needs: ["init"]
if: github.repository_owner == 'home-assistant'
env:
HASSFEST_IMAGE_NAME: ghcr.io/home-assistant/hassfest
HASSFEST_IMAGE_TAG: ghcr.io/home-assistant/hassfest:${{ needs.init.outputs.version }}
steps:
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Login to GitHub Container Registry
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build Docker image
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
with:
context: . # So action will not pull the repository again
file: ./script/hassfest/docker/Dockerfile
load: true
tags: ${{ env.HASSFEST_IMAGE_TAG }}
- name: Run hassfest against core
run: docker run --rm -v "${GITHUB_WORKSPACE}":/github/workspace "${HASSFEST_IMAGE_TAG}" --core-path=/github/workspace
- name: Push Docker image
if: needs.init.outputs.channel != 'dev' && needs.init.outputs.publish == 'true'
id: push
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
with:
context: . # So action will not pull the repository again
file: ./script/hassfest/docker/Dockerfile
push: true
tags: ${{ env.HASSFEST_IMAGE_TAG }},${{ env.HASSFEST_IMAGE_NAME }}:latest
- name: Generate artifact attestation
if: needs.init.outputs.channel != 'dev' && needs.init.outputs.publish == 'true'
uses: actions/attest-build-provenance@96278af6caaf10aea03fd8d33a09a777ca52d62f # v3.2.0
with:
subject-name: ${{ env.HASSFEST_IMAGE_NAME }}
subject-digest: ${{ steps.push.outputs.digest }}
push-to-registry: true
# publish_ha:
# name: Publish version files
# environment: ${{ needs.init.outputs.channel }}
# if: github.repository_owner == 'home-assistant'
# needs: ["init", "build_machine"]
# runs-on: ubuntu-latest
# permissions:
# contents: read
# steps:
# - name: Checkout the repository
# uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
# with:
# persist-credentials: false
#
# - name: Initialize git
# uses: home-assistant/actions/helpers/git-init@master # zizmor: ignore[unpinned-uses]
# with:
# name: ${{ secrets.GIT_NAME }}
# email: ${{ secrets.GIT_EMAIL }}
# token: ${{ secrets.GIT_TOKEN }}
#
# - name: Update version file
# uses: home-assistant/actions/helpers/version-push@master # zizmor: ignore[unpinned-uses]
# with:
# key: "homeassistant[]"
# key-description: "Home Assistant Core"
# version: ${{ needs.init.outputs.version }}
# channel: ${{ needs.init.outputs.channel }}
# exclude-list: '["odroid-xu","qemuarm","qemux86","raspberrypi","raspberrypi2","raspberrypi3","raspberrypi4","tinker"]'
#
# - name: Update version file (stable -> beta)
# if: needs.init.outputs.channel == 'stable'
# uses: home-assistant/actions/helpers/version-push@master # zizmor: ignore[unpinned-uses]
# with:
# key: "homeassistant[]"
# key-description: "Home Assistant Core"
# version: ${{ needs.init.outputs.version }}
# channel: beta
# exclude-list: '["odroid-xu","qemuarm","qemux86","raspberrypi","raspberrypi2","raspberrypi3","raspberrypi4","tinker"]'
#
# publish_container:
# name: Publish meta container for ${{ matrix.registry }}
# environment: ${{ needs.init.outputs.channel }}
# if: github.repository_owner == 'home-assistant'
# needs: ["init", "build_base"]
# runs-on: ubuntu-latest
# permissions:
# contents: read # To check out the repository
# packages: write # To push to GHCR
# id-token: write # For cosign signing
# strategy:
# fail-fast: false
# matrix:
# registry: ["ghcr.io/home-assistant", "docker.io/homeassistant"]
# steps:
# - name: Install Cosign
# uses: sigstore/cosign-installer@faadad0cce49287aee09b3a48701e75088a2c6ad # v4.0.0
# with:
# cosign-release: "v2.5.3"
#
# - name: Login to DockerHub
# if: matrix.registry == 'docker.io/homeassistant'
# uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
# with:
# username: ${{ secrets.DOCKERHUB_USERNAME }}
# password: ${{ secrets.DOCKERHUB_TOKEN }}
#
# - name: Login to GitHub Container Registry
# uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
# with:
# registry: ghcr.io
# username: ${{ github.repository_owner }}
# password: ${{ secrets.GITHUB_TOKEN }}
#
# - name: Verify architecture image signatures
# shell: bash
# env:
# ARCHITECTURES: ${{ needs.init.outputs.architectures }}
# VERSION: ${{ needs.init.outputs.version }}
# run: |
# ARCHS=$(echo "${ARCHITECTURES}" | jq -r '.[]')
# for arch in $ARCHS; do
# echo "Verifying ${arch} image signature..."
# cosign verify \
# --certificate-oidc-issuer https://token.actions.githubusercontent.com \
# --certificate-identity-regexp https://github.com/home-assistant/core/.* \
# "ghcr.io/home-assistant/${arch}-homeassistant:${VERSION}"
# done
# echo "✓ All images verified successfully"
#
# # Generate all Docker tags based on version string
# # Version format: YYYY.MM.PATCH, YYYY.MM.PATCHbN (beta), or YYYY.MM.PATCH.devYYYYMMDDHHMM (dev)
# # Examples:
# # 2025.12.1 (stable) -> tags: 2025.12.1, 2025.12, stable, latest, beta, rc
# # 2025.12.0b3 (beta) -> tags: 2025.12.0b3, beta, rc
# # 2025.12.0.dev202511250240 -> tags: 2025.12.0.dev202511250240, dev
# - name: Generate Docker metadata
# id: meta
# uses: docker/metadata-action@c299e40c65443455700f0fdfc63efafe5b349051 # v5.10.0
# with:
# images: ${{ matrix.registry }}/home-assistant
# sep-tags: ","
# tags: |
# type=raw,value=${{ needs.init.outputs.version }},priority=9999
# type=raw,value=dev,enable=${{ contains(needs.init.outputs.version, 'd') }}
# type=raw,value=beta,enable=${{ !contains(needs.init.outputs.version, 'd') }}
# type=raw,value=rc,enable=${{ !contains(needs.init.outputs.version, 'd') }}
# type=raw,value=stable,enable=${{ !contains(needs.init.outputs.version, 'd') && !contains(needs.init.outputs.version, 'b') }}
# type=raw,value=latest,enable=${{ !contains(needs.init.outputs.version, 'd') && !contains(needs.init.outputs.version, 'b') }}
# type=semver,pattern={{major}}.{{minor}},value=${{ needs.init.outputs.version }},enable=${{ !contains(needs.init.outputs.version, 'd') && !contains(needs.init.outputs.version, 'b') }}
#
# - name: Set up Docker Buildx
# uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.7.1
#
# - name: Copy architecture images to DockerHub
# if: matrix.registry == 'docker.io/homeassistant'
# shell: bash
# env:
# ARCHITECTURES: ${{ needs.init.outputs.architectures }}
# VERSION: ${{ needs.init.outputs.version }}
# run: |
# # Use imagetools to copy image blobs directly between registries
# # This preserves provenance/attestations and seems to be much faster than pull/push
# ARCHS=$(echo "${ARCHITECTURES}" | jq -r '.[]')
# for arch in $ARCHS; do
# echo "Copying ${arch} image to DockerHub..."
# for attempt in 1 2 3; do
# if docker buildx imagetools create \
# --tag "docker.io/homeassistant/${arch}-homeassistant:${VERSION}" \
# "ghcr.io/home-assistant/${arch}-homeassistant:${VERSION}"; then
# break
# fi
# echo "Attempt ${attempt} failed, retrying in 10 seconds..."
# sleep 10
# if [ "${attempt}" -eq 3 ]; then
# echo "Failed after 3 attempts"
# exit 1
# fi
# done
# cosign sign --yes "docker.io/homeassistant/${arch}-homeassistant:${VERSION}"
# done
#
# - name: Create and push multi-arch manifests
# shell: bash
# env:
# ARCHITECTURES: ${{ needs.init.outputs.architectures }}
# REGISTRY: ${{ matrix.registry }}
# VERSION: ${{ needs.init.outputs.version }}
# META_TAGS: ${{ steps.meta.outputs.tags }}
# run: |
# # Build list of architecture images dynamically
# ARCHS=$(echo "${ARCHITECTURES}" | jq -r '.[]')
# ARCH_IMAGES=()
# for arch in $ARCHS; do
# ARCH_IMAGES+=("${REGISTRY}/${arch}-homeassistant:${VERSION}")
# done
#
# # Build list of all tags for single manifest creation
# # Note: Using sep-tags=',' in metadata-action for easier parsing
# TAG_ARGS=()
# IFS=',' read -ra TAGS <<< "${META_TAGS}"
# for tag in "${TAGS[@]}"; do
# TAG_ARGS+=("--tag" "${tag}")
# done
#
# # Create manifest with ALL tags in a single operation (much faster!)
# echo "Creating multi-arch manifest with tags: ${TAGS[*]}"
# docker buildx imagetools create "${TAG_ARGS[@]}" "${ARCH_IMAGES[@]}"
#
# # Sign each tag separately (signing requires individual tag names)
# echo "Signing all tags..."
# for tag in "${TAGS[@]}"; do
# echo "Signing ${tag}"
# cosign sign --yes "${tag}"
# done
#
# echo "All manifests created and signed successfully"
#
# build_python:
# name: Build PyPi package
# environment: ${{ needs.init.outputs.channel }}
# needs: ["init", "build_base"]
# runs-on: ubuntu-latest
# permissions:
# contents: read # To check out the repository
# id-token: write # For PyPI trusted publishing
# if: github.repository_owner == 'home-assistant' && needs.init.outputs.publish == 'true'
# steps:
# - name: Checkout the repository
# uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
# with:
# persist-credentials: false
#
# - name: Set up Python ${{ env.DEFAULT_PYTHON }}
# uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
# with:
# python-version: ${{ env.DEFAULT_PYTHON }}
#
# - name: Download translations
# uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
# with:
# name: translations
#
# - name: Extract translations
# run: |
# tar xvf translations.tar.gz
# rm translations.tar.gz
#
# - name: Build package
# shell: bash
# run: |
# # Remove dist, build, and homeassistant.egg-info
#           # when building locally for testing!
# pip install build
# python -m build
#
# - name: Upload package to PyPI
# uses: pypa/gh-action-pypi-publish@ed0c53931b1dc9bd32cbe73a98c7f6766f8a527e # v1.13.0
# with:
# skip-existing: true
#
# hassfest-image:
# name: Build and test hassfest image
# runs-on: ubuntu-latest
# permissions:
# contents: read # To check out the repository
# packages: write # To push to GHCR
# attestations: write # For build provenance attestation
# id-token: write # For build provenance attestation
# needs: ["init"]
# if: github.repository_owner == 'home-assistant'
# env:
# HASSFEST_IMAGE_NAME: ghcr.io/home-assistant/hassfest
# HASSFEST_IMAGE_TAG: ghcr.io/home-assistant/hassfest:${{ needs.init.outputs.version }}
# steps:
# - name: Checkout repository
# uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
# with:
# persist-credentials: false
#
# - name: Login to GitHub Container Registry
# uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
# with:
# registry: ghcr.io
# username: ${{ github.repository_owner }}
# password: ${{ secrets.GITHUB_TOKEN }}
#
# - name: Build Docker image
# uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
# with:
# context: . # So action will not pull the repository again
# file: ./script/hassfest/docker/Dockerfile
# load: true
# tags: ${{ env.HASSFEST_IMAGE_TAG }}
#
# - name: Run hassfest against core
# run: docker run --rm -v "${GITHUB_WORKSPACE}":/github/workspace "${HASSFEST_IMAGE_TAG}" --core-path=/github/workspace
#
# - name: Push Docker image
# if: needs.init.outputs.channel != 'dev' && needs.init.outputs.publish == 'true'
# id: push
# uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
# with:
# context: . # So action will not pull the repository again
# file: ./script/hassfest/docker/Dockerfile
# push: true
# tags: ${{ env.HASSFEST_IMAGE_TAG }},${{ env.HASSFEST_IMAGE_NAME }}:latest
#
# - name: Generate artifact attestation
# if: needs.init.outputs.channel != 'dev' && needs.init.outputs.publish == 'true'
# uses: actions/attest-build-provenance@96278af6caaf10aea03fd8d33a09a777ca52d62f # v3.2.0
# with:
# subject-name: ${{ env.HASSFEST_IMAGE_NAME }}
# subject-digest: ${{ steps.push.outputs.digest }}
# push-to-registry: true
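The tag-handling loop in the commented-out manifest step splits the comma-separated metadata-action output into repeated `--tag` arguments. A minimal Python sketch of the same transformation (the tag values here are hypothetical):

```python
# Hypothetical metadata-action output; the workflow configures sep-tags=','
# so the tag list arrives as a single comma-separated string.
meta_tags = "ghcr.io/home-assistant/home-assistant:2026.3.0,ghcr.io/home-assistant/home-assistant:stable"

tag_args: list[str] = []
for tag in meta_tags.split(","):
    # Mirrors TAG_ARGS+=("--tag" "${tag}") in the shell loop above
    tag_args += ["--tag", tag]

print(tag_args)
```

The resulting list is what gets spliced into a single `docker buildx imagetools create` invocation so all tags are created in one operation.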

CODEOWNERS (generated)

@@ -555,8 +555,6 @@ build.json @home-assistant/supervisor
/tests/components/fritz/ @AaronDavidSchneider @chemelli74 @mib1185
/homeassistant/components/fritzbox/ @mib1185 @flabbamann
/tests/components/fritzbox/ @mib1185 @flabbamann
/homeassistant/components/fritzbox_callmonitor/ @cdce8p
/tests/components/fritzbox_callmonitor/ @cdce8p
/homeassistant/components/fronius/ @farmio
/tests/components/fronius/ @farmio
/homeassistant/components/frontend/ @home-assistant/frontend


@@ -34,11 +34,13 @@ from homeassistant.const import (
)
from homeassistant.data_entry_flow import section
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.device_registry import format_mac
from homeassistant.helpers.selector import (
TextSelector,
TextSelectorConfig,
TextSelectorType,
)
from homeassistant.helpers.service_info.dhcp import DhcpServiceInfo
from .const import (
DEFAULT_SSL,
@@ -392,6 +394,18 @@ class AirOSConfigFlow(ConfigFlow, domain=DOMAIN):
except asyncio.CancelledError:
pass
async def async_step_dhcp(
self, discovery_info: DhcpServiceInfo
) -> ConfigFlowResult:
"""Automatically handle a DHCP discovered IP change."""
ip_address = discovery_info.ip
# python-airos defaults to upper for derived mac_address
normalized_mac = format_mac(discovery_info.macaddress).upper()
await self.async_set_unique_id(normalized_mac)
self._abort_if_unique_id_configured(updates={CONF_HOST: ip_address})
return self.async_abort(reason="unreachable")
async def async_step_discovery_no_devices(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
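The MAC normalization in `async_step_dhcp` above can be sketched without Home Assistant's helper. This stand-in mimics `format_mac(...).upper()` under the assumption that `format_mac` emits lowercase colon-separated octets:

```python
def normalize_mac(macaddress: str) -> str:
    # Stand-in for homeassistant's format_mac followed by .upper():
    # strip common separators, then emit colon-separated uppercase
    # octets, matching python-airos's derived mac_address format.
    raw = macaddress.replace(":", "").replace("-", "").replace(".", "").lower()
    return ":".join(raw[i : i + 2] for i in range(0, 12, 2)).upper()

print(normalize_mac("aa-bb-cc-dd-ee-ff"))
```

The uppercase unique ID then matches what python-airos derives, so the DHCP flow can find and update the existing entry.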


@@ -3,6 +3,7 @@
"name": "Ubiquiti airOS",
"codeowners": ["@CoMPaTech"],
"config_flow": true,
"dhcp": [{ "registered_devices": true }],
"documentation": "https://www.home-assistant.io/integrations/airos",
"integration_type": "device",
"iot_class": "local_polling",


@@ -16,6 +16,7 @@ from typing import IO, Any, cast
import aiohttp
from securetar import (
InvalidPasswordError,
SecureTarArchive,
SecureTarError,
SecureTarFile,
@@ -165,7 +166,7 @@ def validate_password(path: Path, password: str | None) -> bool:
):
# If we can read the tar file, the password is correct
return True
except (tarfile.ReadError, SecureTarReadError):
except (tarfile.ReadError, InvalidPasswordError, SecureTarReadError):
LOGGER.debug("Invalid password")
return False
except Exception: # noqa: BLE001
@@ -192,13 +193,14 @@ def validate_password_stream(
for obj in input_archive.tar:
if not obj.name.endswith((".tar", ".tgz", ".tar.gz")):
continue
with input_archive.extract_tar(obj) as decrypted:
if decrypted.plaintext_size is None:
raise UnsupportedSecureTarVersion
try:
try:
with input_archive.extract_tar(obj) as decrypted:
if decrypted.plaintext_size is None:
raise UnsupportedSecureTarVersion
decrypted.read(1) # Read a single byte to trigger the decryption
except SecureTarReadError as err:
raise IncorrectPassword from err
except (InvalidPasswordError, SecureTarReadError) as err:
raise IncorrectPassword from err
else:
return
raise BackupEmpty
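The widened exception tuple in `validate_password` follows a probe-and-catch pattern: attempt to read the archive and translate specific read errors into a validation failure. A minimal standalone sketch, with `InvalidPasswordError` as a stand-in for securetar's exception of the same name:

```python
import io
import tarfile


class InvalidPasswordError(Exception):
    """Stand-in for securetar's InvalidPasswordError (name taken from the diff)."""


def validate_archive(data: bytes) -> bool:
    # Mirrors validate_password: if the tar can be read, accept it;
    # treat the specific read/password errors as a failed validation.
    try:
        with tarfile.open(fileobj=io.BytesIO(data)) as tar:
            tar.getmembers()
    except (tarfile.ReadError, InvalidPasswordError):
        return False
    return True
```

A wrong password on an encrypted archive surfaces as one of the caught exceptions, while any other failure propagates for separate handling.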


@@ -29,8 +29,13 @@ if TYPE_CHECKING:
# Filter lists for optimized API calls - only fetch parameters we actually use
# This significantly reduces response time (~0.2s per parameter saved)
STATE_INCLUDE = ["current_temperature", "target_temperature", "hvac_mode"]
SENSOR_INCLUDE = ["current_temperature", "outside_temperature"]
STATE_INCLUDE = [
"current_temperature",
"target_temperature",
"hvac_mode",
"hvac_action",
]
SENSOR_INCLUDE = ["current_temperature", "outside_temperature", "total_energy"]
DHW_STATE_INCLUDE = [
"operating_mode",
"nominal_setpoint",


@@ -11,7 +11,7 @@ from homeassistant.components.sensor import (
SensorEntityDescription,
SensorStateClass,
)
from homeassistant.const import UnitOfTemperature
from homeassistant.const import UnitOfEnergy, UnitOfTemperature
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import StateType
@@ -58,6 +58,19 @@ SENSOR_TYPES: tuple[BSBLanSensorEntityDescription, ...] = (
),
exists_fn=lambda data: data.sensor.outside_temperature is not None,
),
BSBLanSensorEntityDescription(
key="total_energy",
translation_key="total_energy",
device_class=SensorDeviceClass.ENERGY,
native_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
state_class=SensorStateClass.TOTAL_INCREASING,
value_fn=lambda data: (
data.sensor.total_energy.value
if data.sensor.total_energy is not None
else None
),
exists_fn=lambda data: data.sensor.total_energy is not None,
),
)


@@ -66,6 +66,9 @@
},
"outside_temperature": {
"name": "Outside temperature"
},
"total_energy": {
"name": "Total energy"
}
}
},


@@ -4,17 +4,23 @@ from typing import Any
import voluptuous as vol
from homeassistant.components import usb
from homeassistant.components.usb import (
human_readable_device_name,
usb_unique_id_from_service_info,
)
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_DEVICE
from homeassistant.const import ATTR_MANUFACTURER, CONF_DEVICE, CONF_NAME
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.selector import (
SelectSelector,
SelectSelectorConfig,
SelectSelectorMode,
)
from homeassistant.helpers.service_info.usb import UsbServiceInfo
from . import dongle
from .const import DOMAIN, ERROR_INVALID_DONGLE_PATH, LOGGER
from .const import DOMAIN, ERROR_INVALID_DONGLE_PATH, LOGGER, MANUFACTURER
MANUAL_SCHEMA = vol.Schema(
{
@@ -31,8 +37,48 @@ class EnOceanFlowHandler(ConfigFlow, domain=DOMAIN):
def __init__(self) -> None:
"""Initialize the EnOcean config flow."""
self.dongle_path = None
self.discovery_info = None
self.data: dict[str, Any] = {}
async def async_step_usb(self, discovery_info: UsbServiceInfo) -> ConfigFlowResult:
"""Handle usb discovery."""
unique_id = usb_unique_id_from_service_info(discovery_info)
await self.async_set_unique_id(unique_id)
self._abort_if_unique_id_configured(
updates={CONF_DEVICE: discovery_info.device}
)
discovery_info.device = await self.hass.async_add_executor_job(
usb.get_serial_by_id, discovery_info.device
)
self.data[CONF_DEVICE] = discovery_info.device
self.context["title_placeholders"] = {
CONF_NAME: human_readable_device_name(
discovery_info.device,
discovery_info.serial_number,
discovery_info.manufacturer,
discovery_info.description,
discovery_info.vid,
discovery_info.pid,
)
}
return await self.async_step_usb_confirm()
async def async_step_usb_confirm(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle USB Discovery confirmation."""
if user_input is not None:
return await self.async_step_manual({CONF_DEVICE: self.data[CONF_DEVICE]})
self._set_confirm_only()
return self.async_show_form(
step_id="usb_confirm",
description_placeholders={
ATTR_MANUFACTURER: MANUFACTURER,
CONF_DEVICE: self.data.get(CONF_DEVICE, ""),
},
)
async def async_step_import(self, import_data: dict[str, Any]) -> ConfigFlowResult:
"""Import a yaml configuration."""
@@ -104,4 +150,4 @@ class EnOceanFlowHandler(ConfigFlow, domain=DOMAIN):
def create_enocean_entry(self, user_input):
"""Create an entry for the provided configuration."""
return self.async_create_entry(title="EnOcean", data=user_input)
return self.async_create_entry(title=MANUFACTURER, data=user_input)


@@ -6,6 +6,8 @@ from homeassistant.const import Platform
DOMAIN = "enocean"
MANUFACTURER = "EnOcean"
ERROR_INVALID_DONGLE_PATH = "invalid_dongle_path"
SIGNAL_RECEIVE_MESSAGE = "enocean.receive_message"


@@ -3,10 +3,19 @@
"name": "EnOcean",
"codeowners": [],
"config_flow": true,
"dependencies": ["usb"],
"documentation": "https://www.home-assistant.io/integrations/enocean",
"integration_type": "hub",
"iot_class": "local_push",
"loggers": ["enocean"],
"requirements": ["enocean==0.50"],
"single_config_entry": true
"single_config_entry": true,
"usb": [
{
"description": "*usb 300*",
"manufacturer": "*enocean*",
"pid": "6001",
"vid": "0403"
}
]
}


@@ -25,6 +25,9 @@
"device": "[%key:component::enocean::config::step::detect::data_description::device%]"
},
"description": "Enter the path to your EnOcean USB dongle."
},
"usb_confirm": {
"description": "{manufacturer} USB dongle detected at {device}. Do you want to set up this device?"
}
}
},


@@ -300,16 +300,23 @@ class RuntimeEntryData:
needed_platforms.add(Platform.BINARY_SENSOR)
needed_platforms.add(Platform.SELECT)
needed_platforms.update(INFO_TYPE_TO_PLATFORM[type(info)] for info in infos)
await self._ensure_platforms_loaded(hass, entry, needed_platforms)
# Make a dict of the EntityInfo by type and send
# them to the listeners for each specific EntityInfo type
info_types_to_platform = INFO_TYPE_TO_PLATFORM
infos_by_type: defaultdict[type[EntityInfo], list[EntityInfo]] = defaultdict(
list
)
for info in infos:
infos_by_type[type(info)].append(info)
info_type = type(info)
if platform := info_types_to_platform.get(info_type):
needed_platforms.add(platform)
infos_by_type[info_type].append(info)
else:
_LOGGER.warning(
"Entity type %s is not supported in this version of Home Assistant",
info_type,
)
await self._ensure_platforms_loaded(hass, entry, needed_platforms)
for type_, callbacks in self.entity_info_callbacks.items():
# If all entities for a type are removed, we


@@ -1,7 +1,7 @@
{
"domain": "fritzbox_callmonitor",
"name": "FRITZ!Box Call Monitor",
"codeowners": ["@cdce8p"],
"codeowners": [],
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/fritzbox_callmonitor",
"integration_type": "device",


@@ -1752,15 +1752,15 @@ class FanSpeedTrait(_Trait):
"""Initialize a trait for a state."""
super().__init__(hass, state, config)
if state.domain == fan.DOMAIN:
speed_count = min(
FAN_SPEED_MAX_SPEED_COUNT,
round(
100 / (self.state.attributes.get(fan.ATTR_PERCENTAGE_STEP) or 1.0)
),
speed_count = round(
100 / (self.state.attributes.get(fan.ATTR_PERCENTAGE_STEP) or 1.0)
)
self._ordered_speed = [
f"{speed}/{speed_count}" for speed in range(1, speed_count + 1)
]
if speed_count <= FAN_SPEED_MAX_SPEED_COUNT:
self._ordered_speed = [
f"{speed}/{speed_count}" for speed in range(1, speed_count + 1)
]
else:
self._ordered_speed = []
@staticmethod
def supported(domain, features, device_class, _):
@@ -1786,7 +1786,11 @@ class FanSpeedTrait(_Trait):
result.update(
{
"reversible": reversible,
"supportsFanSpeedPercent": True,
# supportsFanSpeedPercent is mutually exclusive with
# availableFanSpeeds, where supportsFanSpeedPercent takes
# precedence. Report it only when step speeds are not
# supported so Google renders a percent slider (1-100%).
"supportsFanSpeedPercent": not self._ordered_speed,
}
)
@@ -1832,10 +1836,12 @@ class FanSpeedTrait(_Trait):
if domain == fan.DOMAIN:
percent = attrs.get(fan.ATTR_PERCENTAGE) or 0
response["currentFanSpeedPercent"] = percent
response["currentFanSpeedSetting"] = percentage_to_ordered_list_item(
self._ordered_speed, percent
)
if self._ordered_speed:
response["currentFanSpeedSetting"] = percentage_to_ordered_list_item(
self._ordered_speed, percent
)
else:
response["currentFanSpeedPercent"] = percent
return response
@@ -1855,7 +1861,7 @@ class FanSpeedTrait(_Trait):
)
if domain == fan.DOMAIN:
if fan_speed := params.get("fanSpeed"):
if self._ordered_speed and (fan_speed := params.get("fanSpeed")):
fan_speed_percent = ordered_list_item_to_percentage(
self._ordered_speed, fan_speed
)


@@ -181,8 +181,7 @@ class HassIOIngress(HomeAssistantView):
skip_auto_headers={hdrs.CONTENT_TYPE},
) as result:
headers = _response_header(result)
content_length_int = 0
content_length = result.headers.get(hdrs.CONTENT_LENGTH, UNDEFINED)
# Avoid parsing content_type in simple cases for better performance
if maybe_content_type := result.headers.get(hdrs.CONTENT_TYPE):
content_type: str = (maybe_content_type.partition(";"))[0].strip()
@@ -190,17 +189,30 @@ class HassIOIngress(HomeAssistantView):
# default value according to RFC 2616
content_type = "application/octet-stream"
# Empty body responses (304, 204, HEAD, etc.) should not be streamed,
# otherwise aiohttp < 3.9.0 may generate an invalid "0\r\n\r\n" chunk
# This also avoids setting content_type for empty responses.
if must_be_empty_body(request.method, result.status):
# If upstream contains content-type, preserve it (e.g. for HEAD requests)
# Note: This still is omitting content-length. We can't simply forward
# the upstream length since the proxy might change the body length
# (e.g. due to compression).
if maybe_content_type:
headers[hdrs.CONTENT_TYPE] = content_type
return web.Response(
headers=headers,
status=result.status,
)
# Simple request
if (empty_body := must_be_empty_body(result.method, result.status)) or (
content_length_int = 0
content_length = result.headers.get(hdrs.CONTENT_LENGTH, UNDEFINED)
if (
content_length is not UNDEFINED
and (content_length_int := int(content_length))
<= MAX_SIMPLE_RESPONSE_SIZE
):
# Return Response
if empty_body:
body = None
else:
body = await result.read()
body = await result.read()
simple_response = web.Response(
headers=headers,
status=result.status,


@@ -6,6 +6,7 @@ from enum import Enum
import logging
from typing import Any
from bleak.backends.scanner import AdvertisementData
from HueBLE import ConnectionError, HueBleError, HueBleLight, PairingError
import voluptuous as vol
@@ -26,6 +27,17 @@ from .light import get_available_color_modes
_LOGGER = logging.getLogger(__name__)
SERVICE_UUID = SERVICE_DATA_UUID = "0000fe0f-0000-1000-8000-00805f9b34fb"
def device_filter(advertisement_data: AdvertisementData) -> bool:
"""Return True if the device is supported."""
return (
SERVICE_UUID in advertisement_data.service_uuids
and SERVICE_DATA_UUID in advertisement_data.service_data
)
async def validate_input(hass: HomeAssistant, address: str) -> Error | None:
"""Return error if cannot connect and validate."""
@@ -70,28 +82,66 @@ class HueBleConfigFlow(ConfigFlow, domain=DOMAIN):
def __init__(self) -> None:
"""Initialize the config flow."""
self._discovered_devices: dict[str, bluetooth.BluetoothServiceInfoBleak] = {}
self._discovery_info: bluetooth.BluetoothServiceInfoBleak | None = None
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle the user step to pick discovered device."""
errors: dict[str, str] = {}
if user_input is not None:
unique_id = dr.format_mac(user_input[CONF_MAC])
# Don't raise on progress because there may be discovery flows
await self.async_set_unique_id(unique_id, raise_on_progress=False)
# Guard against the user selecting a device which has been configured by
# another flow.
self._abort_if_unique_id_configured()
self._discovery_info = self._discovered_devices[user_input[CONF_MAC]]
return await self.async_step_confirm()
current_addresses = self._async_current_ids(include_ignore=False)
for discovery in bluetooth.async_discovered_service_info(self.hass):
if (
discovery.address in current_addresses
or discovery.address in self._discovered_devices
or not device_filter(discovery.advertisement)
):
continue
self._discovered_devices[discovery.address] = discovery
if not self._discovered_devices:
return self.async_abort(reason="no_devices_found")
data_schema = vol.Schema(
{
vol.Required(CONF_MAC): vol.In(
{
service_info.address: (
f"{service_info.name} ({service_info.address})"
)
for service_info in self._discovered_devices.values()
}
),
}
)
return self.async_show_form(
step_id="user",
data_schema=data_schema,
errors=errors,
)
async def async_step_bluetooth(
self, discovery_info: bluetooth.BluetoothServiceInfoBleak
) -> ConfigFlowResult:
"""Handle a flow initialized by the home assistant scanner."""
_LOGGER.debug(
"HA found light %s. Will show in UI but not auto connect",
"HA found light %s. Use user flow to show in UI and connect",
discovery_info.name,
)
unique_id = dr.format_mac(discovery_info.address)
await self.async_set_unique_id(unique_id)
self._abort_if_unique_id_configured()
name = f"{discovery_info.name} ({discovery_info.address})"
self.context.update({"title_placeholders": {CONF_NAME: name}})
self._discovery_info = discovery_info
return await self.async_step_confirm()
return self.async_abort(reason="discovery_unsupported")
async def async_step_confirm(
self, user_input: dict[str, Any] | None = None
@@ -103,7 +153,10 @@ class HueBleConfigFlow(ConfigFlow, domain=DOMAIN):
if user_input is not None:
unique_id = dr.format_mac(self._discovery_info.address)
await self.async_set_unique_id(unique_id)
# Don't raise on progress because there may be discovery flows
await self.async_set_unique_id(unique_id, raise_on_progress=False)
# Guard against the user selecting a device which has been configured by
# another flow.
self._abort_if_unique_id_configured()
error = await validate_input(self.hass, unique_id)
if error:


@@ -2,7 +2,8 @@
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"not_implemented": "This integration can only be set up via discovery."
"discovery_unsupported": "Discovery flow is not supported by the Hue BLE integration.",
"no_devices_found": "[%key:common::config_flow::abort::no_devices_found%]"
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
@@ -14,7 +15,16 @@
},
"step": {
"confirm": {
"description": "Do you want to set up {name} ({mac})?. Make sure the light is [made discoverable to voice assistants]({url_pairing_mode}) or has been [factory reset]({url_factory_reset})."
"description": "Do you want to set up {name} ({mac})?\nMake sure the light is [made discoverable to voice assistants]({url_pairing_mode}) or has been [factory reset]({url_factory_reset})."
},
"user": {
"data": {
"mac": "[%key:common::config_flow::data::device%]"
},
"data_description": {
"mac": "Select the Hue device you want to set up"
},
"description": "[%key:component::bluetooth::config::step::user::description%]"
}
}
}


@@ -109,14 +109,18 @@ class LunatoneLight(
return self._device is not None and self._device.is_on
@property
def brightness(self) -> int:
def brightness(self) -> int | None:
"""Return the brightness of this light between 0..255."""
return value_to_brightness(self.BRIGHTNESS_SCALE, self._device.brightness)
return (
value_to_brightness(self.BRIGHTNESS_SCALE, self._device.brightness)
if self._device.brightness is not None
else None
)
@property
def color_mode(self) -> ColorMode:
"""Return the color mode of the light."""
if self._device is not None and self._device.is_dimmable:
if self._device is not None and self._device.brightness is not None:
return ColorMode.BRIGHTNESS
return ColorMode.ONOFF
@@ -149,7 +153,8 @@ class LunatoneLight(
async def async_turn_off(self, **kwargs: Any) -> None:
"""Instruct the light to turn off."""
if brightness_supported(self.supported_color_modes):
self._last_brightness = self.brightness
if self.brightness:
self._last_brightness = self.brightness
await self._device.fade_to_brightness(0)
else:
await self._device.switch_off()


@@ -7,5 +7,5 @@
"integration_type": "hub",
"iot_class": "local_polling",
"quality_scale": "silver",
"requirements": ["lunatone-rest-api-client==0.6.3"]
"requirements": ["lunatone-rest-api-client==0.7.0"]
}


@@ -8,6 +8,6 @@
"iot_class": "calculated",
"loggers": ["yt_dlp"],
"quality_scale": "internal",
"requirements": ["yt-dlp[default]==2026.02.04"],
"requirements": ["yt-dlp[default]==2026.02.21"],
"single_config_entry": true
}


@@ -2,10 +2,12 @@
from __future__ import annotations
import asyncio
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from .const import DOMAIN as DOMAIN
from .const import DOMAIN as DOMAIN, SUBENTRY_TYPE_BUS, SUBENTRY_TYPE_SUBWAY
from .coordinator import MTAConfigEntry, MTADataUpdateCoordinator
PLATFORMS = [Platform.SENSOR]
@@ -13,16 +15,36 @@ PLATFORMS = [Platform.SENSOR]
async def async_setup_entry(hass: HomeAssistant, entry: MTAConfigEntry) -> bool:
"""Set up MTA from a config entry."""
coordinator = MTADataUpdateCoordinator(hass, entry)
await coordinator.async_config_entry_first_refresh()
coordinators: dict[str, MTADataUpdateCoordinator] = {}
entry.runtime_data = coordinator
for subentry_id, subentry in entry.subentries.items():
if subentry.subentry_type not in (SUBENTRY_TYPE_SUBWAY, SUBENTRY_TYPE_BUS):
continue
coordinators[subentry_id] = MTADataUpdateCoordinator(hass, entry, subentry)
# Refresh all coordinators in parallel
await asyncio.gather(
*(
coordinator.async_config_entry_first_refresh()
for coordinator in coordinators.values()
)
)
entry.runtime_data = coordinators
entry.async_on_unload(entry.add_update_listener(async_update_entry))
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True
async def async_update_entry(hass: HomeAssistant, entry: MTAConfigEntry) -> None:
"""Handle config entry update (e.g., subentry changes)."""
await hass.config_entries.async_reload(entry.entry_id)
async def async_unload_entry(hass: HomeAssistant, entry: MTAConfigEntry) -> bool:
"""Unload a config entry."""
return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
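The per-subentry coordinators above are refreshed concurrently with `asyncio.gather`. The pattern can be exercised with a stand-in coordinator class:

```python
import asyncio


class StubCoordinator:
    """Stand-in for MTADataUpdateCoordinator (real one talks to the MTA feeds)."""

    def __init__(self) -> None:
        self.refreshed = False

    async def async_config_entry_first_refresh(self) -> None:
        await asyncio.sleep(0)  # simulate the network fetch
        self.refreshed = True


async def refresh_all(coordinators: dict[str, StubCoordinator]) -> None:
    # Mirrors the diff: kick off every subentry's first refresh in parallel
    await asyncio.gather(
        *(c.async_config_entry_first_refresh() for c in coordinators.values())
    )


coordinators = {"subway_1": StubCoordinator(), "bus_1": StubCoordinator()}
asyncio.run(refresh_all(coordinators))
```

Gathering the refreshes means entry setup time is bounded by the slowest feed rather than the sum of all of them.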


@@ -2,22 +2,43 @@
from __future__ import annotations
from collections.abc import Mapping
import logging
from typing import Any
from pymta import LINE_TO_FEED, MTAFeedError, SubwayFeed
from pymta import LINE_TO_FEED, BusFeed, MTAFeedError, SubwayFeed
import voluptuous as vol
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.helpers import aiohttp_client
from homeassistant.config_entries import (
SOURCE_REAUTH,
ConfigEntry,
ConfigFlow,
ConfigFlowResult,
ConfigSubentryFlow,
SubentryFlowResult,
)
from homeassistant.const import CONF_API_KEY
from homeassistant.core import callback
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.selector import (
SelectOptionDict,
SelectSelector,
SelectSelectorConfig,
SelectSelectorMode,
TextSelector,
TextSelectorConfig,
TextSelectorType,
)
from .const import CONF_LINE, CONF_STOP_ID, CONF_STOP_NAME, DOMAIN
from .const import (
CONF_LINE,
CONF_ROUTE,
CONF_STOP_ID,
CONF_STOP_NAME,
DOMAIN,
SUBENTRY_TYPE_BUS,
SUBENTRY_TYPE_SUBWAY,
)
_LOGGER = logging.getLogger(__name__)
@@ -28,17 +49,79 @@ class MTAConfigFlow(ConfigFlow, domain=DOMAIN):
VERSION = 1
MINOR_VERSION = 1
def __init__(self) -> None:
"""Initialize the config flow."""
self.data: dict[str, Any] = {}
self.stops: dict[str, str] = {}
@classmethod
@callback
def async_get_supported_subentry_types(
cls, config_entry: ConfigEntry
) -> dict[str, type[ConfigSubentryFlow]]:
"""Return subentries supported by this handler."""
return {
SUBENTRY_TYPE_SUBWAY: SubwaySubentryFlowHandler,
SUBENTRY_TYPE_BUS: BusSubentryFlowHandler,
}
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle the initial step."""
errors: dict[str, str] = {}
if user_input is not None:
api_key = user_input.get(CONF_API_KEY)
self._async_abort_entries_match({CONF_API_KEY: api_key})
if api_key:
# Test the API key by trying to fetch bus data
session = async_get_clientsession(self.hass)
bus_feed = BusFeed(api_key=api_key, session=session)
try:
# Try to get stops for a known route to validate the key
await bus_feed.get_stops(route_id="M15")
except MTAFeedError:
errors["base"] = "cannot_connect"
except Exception:
_LOGGER.exception("Unexpected error validating API key")
errors["base"] = "unknown"
if not errors:
if self.source == SOURCE_REAUTH:
return self.async_update_reload_and_abort(
self._get_reauth_entry(),
data_updates={CONF_API_KEY: api_key or None},
)
return self.async_create_entry(
title="MTA",
data={CONF_API_KEY: api_key or None},
)
return self.async_show_form(
step_id="user",
data_schema=vol.Schema(
{
vol.Optional(CONF_API_KEY): TextSelector(
TextSelectorConfig(type=TextSelectorType.PASSWORD)
),
}
),
errors=errors,
)
async def async_step_reauth(
self, _entry_data: Mapping[str, Any]
) -> ConfigFlowResult:
"""Handle reauth when user wants to add or update API key."""
return await self.async_step_user()
class SubwaySubentryFlowHandler(ConfigSubentryFlow):
"""Handle subway stop subentry flow."""
def __init__(self) -> None:
"""Initialize the subentry flow."""
self.data: dict[str, Any] = {}
self.stops: dict[str, str] = {}
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> SubentryFlowResult:
"""Handle the line selection step."""
if user_input is not None:
self.data[CONF_LINE] = user_input[CONF_LINE]
return await self.async_step_stop()
@@ -58,13 +141,12 @@ class MTAConfigFlow(ConfigFlow, domain=DOMAIN):
),
}
),
errors=errors,
)
async def async_step_stop(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle the stop step."""
) -> SubentryFlowResult:
"""Handle the stop selection step."""
errors: dict[str, str] = {}
if user_input is not None:
@@ -74,25 +156,30 @@ class MTAConfigFlow(ConfigFlow, domain=DOMAIN):
self.data[CONF_STOP_NAME] = stop_name
unique_id = f"{self.data[CONF_LINE]}_{stop_id}"
await self.async_set_unique_id(unique_id)
self._abort_if_unique_id_configured()
# Test connection to real-time GTFS-RT feed (different from static GTFS used by get_stops)
# Check for duplicate subentries across all entries
for entry in self.hass.config_entries.async_entries(DOMAIN):
for subentry in entry.subentries.values():
if subentry.unique_id == unique_id:
return self.async_abort(reason="already_configured")
# Test connection to real-time GTFS-RT feed
try:
await self._async_test_connection()
except MTAFeedError:
errors["base"] = "cannot_connect"
else:
title = f"{self.data[CONF_LINE]} Line - {stop_name}"
title = f"{self.data[CONF_LINE]} - {stop_name}"
return self.async_create_entry(
title=title,
data=self.data,
unique_id=unique_id,
)
try:
self.stops = await self._async_get_stops(self.data[CONF_LINE])
except MTAFeedError:
_LOGGER.exception("Error fetching stops for line %s", self.data[CONF_LINE])
_LOGGER.debug("Error fetching stops for line %s", self.data[CONF_LINE])
return self.async_abort(reason="cannot_connect")
if not self.stops:
@@ -123,7 +210,7 @@ class MTAConfigFlow(ConfigFlow, domain=DOMAIN):
async def _async_get_stops(self, line: str) -> dict[str, str]:
"""Get stops for a line from the library."""
feed_id = SubwayFeed.get_feed_id_for_route(line)
session = aiohttp_client.async_get_clientsession(self.hass)
session = async_get_clientsession(self.hass)
subway_feed = SubwayFeed(feed_id=feed_id, session=session)
stops_list = await subway_feed.get_stops(route_id=line)
@@ -141,7 +228,7 @@ class MTAConfigFlow(ConfigFlow, domain=DOMAIN):
async def _async_test_connection(self) -> None:
"""Test connection to MTA feed."""
feed_id = SubwayFeed.get_feed_id_for_route(self.data[CONF_LINE])
session = aiohttp_client.async_get_clientsession(self.hass)
session = async_get_clientsession(self.hass)
subway_feed = SubwayFeed(feed_id=feed_id, session=session)
await subway_feed.get_arrivals(
@@ -149,3 +236,133 @@ class MTAConfigFlow(ConfigFlow, domain=DOMAIN):
stop_id=self.data[CONF_STOP_ID],
max_arrivals=1,
)
class BusSubentryFlowHandler(ConfigSubentryFlow):
"""Handle bus stop subentry flow."""
def __init__(self) -> None:
"""Initialize the subentry flow."""
self.data: dict[str, Any] = {}
self.stops: dict[str, str] = {}
def _get_api_key(self) -> str:
"""Get API key from parent entry."""
return self._get_entry().data.get(CONF_API_KEY) or ""
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> SubentryFlowResult:
"""Handle the route input step."""
errors: dict[str, str] = {}
if user_input is not None:
route = user_input[CONF_ROUTE].upper().strip()
self.data[CONF_ROUTE] = route
# Validate route by fetching stops
try:
self.stops = await self._async_get_stops(route)
if not self.stops:
errors["base"] = "invalid_route"
else:
return await self.async_step_stop()
except MTAFeedError:
_LOGGER.debug("Error fetching stops for route %s", route)
errors["base"] = "invalid_route"
return self.async_show_form(
step_id="user",
data_schema=vol.Schema(
{
vol.Required(CONF_ROUTE): TextSelector(),
}
),
errors=errors,
)
async def async_step_stop(
self, user_input: dict[str, Any] | None = None
) -> SubentryFlowResult:
"""Handle the stop selection step."""
errors: dict[str, str] = {}
if user_input is not None:
stop_id = user_input[CONF_STOP_ID]
self.data[CONF_STOP_ID] = stop_id
stop_name = self.stops.get(stop_id, stop_id)
self.data[CONF_STOP_NAME] = stop_name
unique_id = f"bus_{self.data[CONF_ROUTE]}_{stop_id}"
# Check for duplicate subentries across all entries
for entry in self.hass.config_entries.async_entries(DOMAIN):
for subentry in entry.subentries.values():
if subentry.unique_id == unique_id:
return self.async_abort(reason="already_configured")
# Test connection to real-time feed
try:
await self._async_test_connection()
except MTAFeedError:
errors["base"] = "cannot_connect"
else:
title = f"{self.data[CONF_ROUTE]} - {stop_name}"
return self.async_create_entry(
title=title,
data=self.data,
unique_id=unique_id,
)
stop_options = [
SelectOptionDict(value=stop_id, label=stop_name)
for stop_id, stop_name in sorted(self.stops.items(), key=lambda x: x[1])
]
return self.async_show_form(
step_id="stop",
data_schema=vol.Schema(
{
vol.Required(CONF_STOP_ID): SelectSelector(
SelectSelectorConfig(
options=stop_options,
mode=SelectSelectorMode.DROPDOWN,
)
),
}
),
errors=errors,
description_placeholders={"route": self.data[CONF_ROUTE]},
)
async def _async_get_stops(self, route: str) -> dict[str, str]:
"""Get stops for a bus route from the library."""
session = async_get_clientsession(self.hass)
api_key = self._get_api_key()
bus_feed = BusFeed(api_key=api_key, session=session)
stops_list = await bus_feed.get_stops(route_id=route)
stops = {}
for stop in stops_list:
stop_id = stop["stop_id"]
stop_name = stop["stop_name"]
# Add direction if available (e.g., "to South Ferry")
if direction := stop.get("direction_name"):
stops[stop_id] = f"{stop_name} (to {direction})"
else:
stops[stop_id] = stop_name
return stops
async def _async_test_connection(self) -> None:
"""Test connection to MTA bus feed."""
session = async_get_clientsession(self.hass)
api_key = self._get_api_key()
bus_feed = BusFeed(api_key=api_key, session=session)
await bus_feed.get_arrivals(
route_id=self.data[CONF_ROUTE],
stop_id=self.data[CONF_STOP_ID],
max_arrivals=1,
)
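The stop step above composes a `bus_{route}_{stop_id}` unique ID and scans every entry's subentries before creating a new one. A minimal stand-alone sketch of that duplicate check, using a hypothetical list-of-lists in place of `hass.config_entries` (which the real flow walks via `async_entries` and `entry.subentries`):

```python
def build_unique_id(route: str, stop_id: str) -> str:
    """Compose the bus subentry unique ID, mirroring f"bus_{route}_{stop_id}"."""
    return f"bus_{route}_{stop_id}"


def is_duplicate(entries: list[list[str]], unique_id: str) -> bool:
    """Return True if any entry already holds a subentry with this unique ID.

    `entries` is a hypothetical stand-in: one list of subentry unique IDs
    per config entry, checked across all entries as in the flow above.
    """
    return any(unique_id in subentry_ids for subentry_ids in entries)


existing = [["bus_M15_401234"], ["bus_B46_302001"]]
assert is_duplicate(existing, build_unique_id("M15", "401234"))
assert not is_duplicate(existing, build_unique_id("Q10", "550000"))
```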


@@ -7,5 +7,9 @@ DOMAIN = "mta"
CONF_LINE = "line"
CONF_STOP_ID = "stop_id"
CONF_STOP_NAME = "stop_name"
CONF_ROUTE = "route"
SUBENTRY_TYPE_SUBWAY = "subway"
SUBENTRY_TYPE_BUS = "bus"
UPDATE_INTERVAL = timedelta(seconds=30)


@@ -6,22 +6,30 @@ from dataclasses import dataclass
from datetime import datetime
import logging
from pymta import MTAFeedError, SubwayFeed
from pymta import BusFeed, MTAFeedError, SubwayFeed
from homeassistant.config_entries import ConfigEntry
from homeassistant.config_entries import ConfigEntry, ConfigSubentry
from homeassistant.const import CONF_API_KEY
from homeassistant.core import HomeAssistant
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from homeassistant.util import dt as dt_util
from .const import CONF_LINE, CONF_STOP_ID, DOMAIN, UPDATE_INTERVAL
from .const import (
CONF_LINE,
CONF_ROUTE,
CONF_STOP_ID,
DOMAIN,
SUBENTRY_TYPE_BUS,
UPDATE_INTERVAL,
)
_LOGGER = logging.getLogger(__name__)
@dataclass
class MTAArrival:
"""Represents a single train arrival."""
"""Represents a single transit arrival."""
arrival_time: datetime
minutes_until: int
@@ -36,7 +44,7 @@ class MTAData:
arrivals: list[MTAArrival]
type MTAConfigEntry = ConfigEntry[MTADataUpdateCoordinator]
type MTAConfigEntry = ConfigEntry[dict[str, MTADataUpdateCoordinator]]
class MTADataUpdateCoordinator(DataUpdateCoordinator[MTAData]):
@@ -44,35 +52,48 @@ class MTADataUpdateCoordinator(DataUpdateCoordinator[MTAData]):
config_entry: MTAConfigEntry
def __init__(self, hass: HomeAssistant, config_entry: MTAConfigEntry) -> None:
def __init__(
self,
hass: HomeAssistant,
config_entry: MTAConfigEntry,
subentry: ConfigSubentry,
) -> None:
"""Initialize."""
self.line = config_entry.data[CONF_LINE]
self.stop_id = config_entry.data[CONF_STOP_ID]
self.subentry = subentry
self.stop_id = subentry.data[CONF_STOP_ID]
self.feed_id = SubwayFeed.get_feed_id_for_route(self.line)
session = async_get_clientsession(hass)
self.subway_feed = SubwayFeed(feed_id=self.feed_id, session=session)
if subentry.subentry_type == SUBENTRY_TYPE_BUS:
api_key = config_entry.data.get(CONF_API_KEY) or ""
self.feed: BusFeed | SubwayFeed = BusFeed(api_key=api_key, session=session)
self.route_id = subentry.data[CONF_ROUTE]
else:
# Subway feed
line = subentry.data[CONF_LINE]
feed_id = SubwayFeed.get_feed_id_for_route(line)
self.feed = SubwayFeed(feed_id=feed_id, session=session)
self.route_id = line
super().__init__(
hass,
_LOGGER,
config_entry=config_entry,
name=DOMAIN,
name=f"{DOMAIN}_{subentry.subentry_id}",
update_interval=UPDATE_INTERVAL,
)
async def _async_update_data(self) -> MTAData:
"""Fetch data from MTA."""
_LOGGER.debug(
"Fetching data for line=%s, stop=%s, feed=%s",
self.line,
"Fetching data for route=%s, stop=%s",
self.route_id,
self.stop_id,
self.feed_id,
)
try:
library_arrivals = await self.subway_feed.get_arrivals(
route_id=self.line,
library_arrivals = await self.feed.get_arrivals(
route_id=self.route_id,
stop_id=self.stop_id,
max_arrivals=3,
)
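`MTAArrival` above carries both an `arrival_time` and a derived `minutes_until`. A sketch of how that derivation might look, with the flooring-to-zero and rounding behavior being assumptions here rather than taken from the integration:

```python
from datetime import datetime, timedelta, timezone


def minutes_until(arrival_time: datetime, now: datetime) -> int:
    """Whole minutes from `now` until `arrival_time`, floored at zero.

    Arrivals already in the past report 0 rather than a negative count.
    """
    delta = arrival_time - now
    return max(0, int(delta.total_seconds() // 60))


now = datetime(2026, 2, 24, 12, 0, tzinfo=timezone.utc)
assert minutes_until(now + timedelta(minutes=7, seconds=30), now) == 7
assert minutes_until(now - timedelta(minutes=1), now) == 0
```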


@@ -38,9 +38,7 @@ rules:
integration-owner: done
log-when-unavailable: done
parallel-updates: done
reauthentication-flow:
status: exempt
comment: No authentication required.
reauthentication-flow: done
test-coverage: done
# Gold


@@ -11,12 +11,13 @@ from homeassistant.components.sensor import (
SensorEntity,
SensorEntityDescription,
)
from homeassistant.config_entries import ConfigSubentry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.device_registry import DeviceEntryType, DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import CONF_LINE, CONF_STOP_ID, CONF_STOP_NAME, DOMAIN
from .const import CONF_LINE, CONF_ROUTE, CONF_STOP_NAME, DOMAIN, SUBENTRY_TYPE_BUS
from .coordinator import MTAArrival, MTAConfigEntry, MTADataUpdateCoordinator
PARALLEL_UPDATES = 0
@@ -97,16 +98,19 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up MTA sensor based on a config entry."""
coordinator = entry.runtime_data
async_add_entities(
MTASensor(coordinator, entry, description)
for description in SENSOR_DESCRIPTIONS
)
for subentry_id, coordinator in entry.runtime_data.items():
subentry = entry.subentries[subentry_id]
async_add_entities(
(
MTASensor(coordinator, subentry, description)
for description in SENSOR_DESCRIPTIONS
),
config_subentry_id=subentry_id,
)
class MTASensor(CoordinatorEntity[MTADataUpdateCoordinator], SensorEntity):
"""Sensor for MTA train arrivals."""
"""Sensor for MTA transit arrivals."""
_attr_has_entity_name = True
entity_description: MTASensorEntityDescription
@@ -114,24 +118,32 @@ class MTASensor(CoordinatorEntity[MTADataUpdateCoordinator], SensorEntity):
def __init__(
self,
coordinator: MTADataUpdateCoordinator,
entry: MTAConfigEntry,
subentry: ConfigSubentry,
description: MTASensorEntityDescription,
) -> None:
"""Initialize the sensor."""
super().__init__(coordinator)
self.entity_description = description
line = entry.data[CONF_LINE]
stop_id = entry.data[CONF_STOP_ID]
stop_name = entry.data.get(CONF_STOP_NAME, stop_id)
self._attr_unique_id = f"{entry.unique_id}-{description.key}"
is_bus = subentry.subentry_type == SUBENTRY_TYPE_BUS
if is_bus:
route = subentry.data[CONF_ROUTE]
model = "Bus"
else:
route = subentry.data[CONF_LINE]
model = "Subway"
stop_name = subentry.data.get(CONF_STOP_NAME, subentry.subentry_id)
unique_id = subentry.unique_id or subentry.subentry_id
self._attr_unique_id = f"{unique_id}-{description.key}"
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, entry.entry_id)},
name=f"{line} Line - {stop_name} ({stop_id})",
identifiers={(DOMAIN, unique_id)},
name=f"{route} - {stop_name}",
manufacturer="MTA",
model="Subway",
model=model,
entry_type=DeviceEntryType.SERVICE,
)


@@ -2,32 +2,95 @@
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_service%]",
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"no_stops": "No stops found for this line. The line may not be currently running."
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]"
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]"
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"unknown": "[%key:common::config_flow::error::unknown%]"
},
"step": {
"stop": {
"data": {
"stop_id": "Stop and direction"
},
"data_description": {
"stop_id": "Select the stop and direction you want to track"
},
"description": "Choose a stop on the {line} line. The direction is included with each stop.",
"title": "Select stop and direction"
},
"user": {
"data": {
"line": "Line"
"api_key": "[%key:common::config_flow::data::api_key%]"
},
"data_description": {
"line": "The subway line to track"
"api_key": "API key from MTA Bus Time. Required for bus tracking, optional for subway only."
},
"description": "Choose the subway line you want to track.",
"title": "Select subway line"
"description": "Enter your MTA Bus Time API key to enable bus tracking. Leave blank if you only want to track subways."
}
}
},
"config_subentries": {
"bus": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_service%]",
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]"
},
"entry_type": "Bus stop",
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"invalid_route": "Invalid bus route. Please check the route name and try again."
},
"initiate_flow": {
"user": "Add bus stop"
},
"step": {
"stop": {
"data": {
"stop_id": "Stop"
},
"data_description": {
"stop_id": "Select the stop you want to track"
},
"description": "Choose a stop on the {route} route.",
"title": "Select stop"
},
"user": {
"data": {
"route": "Route"
},
"data_description": {
"route": "The bus route identifier"
},
"description": "Enter the bus route you want to track (for example, M15, B46, Q10).",
"title": "Enter bus route"
}
}
},
"subway": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_service%]",
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"no_stops": "No stops found for this line. The line may not be currently running."
},
"entry_type": "Subway stop",
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]"
},
"initiate_flow": {
"user": "Add subway stop"
},
"step": {
"stop": {
"data": {
"stop_id": "Stop and direction"
},
"data_description": {
"stop_id": "Select the stop and direction you want to track"
},
"description": "Choose a stop on the {line} line. The direction is included with each stop.",
"title": "Select stop and direction"
},
"user": {
"data": {
"line": "Line"
},
"data_description": {
"line": "The subway line to track"
},
"description": "Choose the subway line you want to track.",
"title": "Select subway line"
}
}
}
},


@@ -394,10 +394,10 @@
"name": "Delete notification"
},
"publish": {
"description": "Publishes a notification message to a ntfy topic",
"description": "Publishes a notification message to a ntfy topic.",
"fields": {
"actions": {
"description": "Up to three actions ('view', 'broadcast', or 'http') can be added as buttons below the notification. Actions are executed when the corresponding button is tapped or clicked.",
"description": "Up to three actions (`view`, `broadcast`, `http`, or `copy`) can be added as buttons below the notification. Actions are executed when the corresponding button is tapped or clicked.",
"name": "Action buttons"
},
"attach": {


@@ -1,6 +1,6 @@
{
"domain": "powerfox",
"name": "Powerfox",
"name": "Powerfox Cloud",
"codeowners": ["@klaasnicolaas"],
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/powerfox",


@@ -37,7 +37,10 @@ from .const import (
)
from .coordinator import ProxmoxConfigEntry, ProxmoxCoordinator
PLATFORMS = [Platform.BINARY_SENSOR]
PLATFORMS = [
Platform.BINARY_SENSOR,
Platform.BUTTON,
]
CONFIG_SCHEMA = vol.Schema(


@@ -0,0 +1,339 @@
"""Button platform for Proxmox VE."""
from __future__ import annotations
from abc import abstractmethod
from collections.abc import Callable
from dataclasses import dataclass
from typing import Any
from proxmoxer import AuthenticationError
from proxmoxer.core import ResourceException
import requests
from requests.exceptions import ConnectTimeout, SSLError
from homeassistant.components.button import (
ButtonDeviceClass,
ButtonEntity,
ButtonEntityDescription,
)
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN
from .coordinator import ProxmoxConfigEntry, ProxmoxCoordinator, ProxmoxNodeData
from .entity import ProxmoxContainerEntity, ProxmoxNodeEntity, ProxmoxVMEntity
@dataclass(frozen=True, kw_only=True)
class ProxmoxNodeButtonNodeEntityDescription(ButtonEntityDescription):
"""Class to hold Proxmox node button description."""
press_action: Callable[[ProxmoxCoordinator, str], None]
@dataclass(frozen=True, kw_only=True)
class ProxmoxVMButtonEntityDescription(ButtonEntityDescription):
"""Class to hold Proxmox VM button description."""
press_action: Callable[[ProxmoxCoordinator, str, int], None]
@dataclass(frozen=True, kw_only=True)
class ProxmoxContainerButtonEntityDescription(ButtonEntityDescription):
"""Class to hold Proxmox container button description."""
press_action: Callable[[ProxmoxCoordinator, str, int], None]
NODE_BUTTONS: tuple[ProxmoxNodeButtonNodeEntityDescription, ...] = (
ProxmoxNodeButtonNodeEntityDescription(
key="reboot",
press_action=lambda coordinator, node: coordinator.proxmox.nodes(
node
).status.post(command="reboot"),
entity_category=EntityCategory.CONFIG,
device_class=ButtonDeviceClass.RESTART,
),
ProxmoxNodeButtonNodeEntityDescription(
key="shutdown",
translation_key="shutdown",
press_action=lambda coordinator, node: coordinator.proxmox.nodes(
node
).status.post(command="shutdown"),
entity_category=EntityCategory.CONFIG,
),
ProxmoxNodeButtonNodeEntityDescription(
key="start_all",
translation_key="start_all",
press_action=lambda coordinator, node: coordinator.proxmox.nodes(
node
).startall.post(),
entity_category=EntityCategory.CONFIG,
),
ProxmoxNodeButtonNodeEntityDescription(
key="stop_all",
translation_key="stop_all",
press_action=lambda coordinator, node: coordinator.proxmox.nodes(
node
).stopall.post(),
entity_category=EntityCategory.CONFIG,
),
)
VM_BUTTONS: tuple[ProxmoxVMButtonEntityDescription, ...] = (
ProxmoxVMButtonEntityDescription(
key="start",
translation_key="start",
press_action=lambda coordinator, node, vmid: (
coordinator.proxmox.nodes(node).qemu(vmid).status.start.post()
),
entity_category=EntityCategory.CONFIG,
),
ProxmoxVMButtonEntityDescription(
key="stop",
translation_key="stop",
press_action=lambda coordinator, node, vmid: (
coordinator.proxmox.nodes(node).qemu(vmid).status.stop.post()
),
entity_category=EntityCategory.CONFIG,
),
ProxmoxVMButtonEntityDescription(
key="restart",
press_action=lambda coordinator, node, vmid: (
coordinator.proxmox.nodes(node).qemu(vmid).status.restart.post()
),
entity_category=EntityCategory.CONFIG,
device_class=ButtonDeviceClass.RESTART,
),
ProxmoxVMButtonEntityDescription(
key="hibernate",
translation_key="hibernate",
press_action=lambda coordinator, node, vmid: (
coordinator.proxmox.nodes(node).qemu(vmid).status.hibernate.post()
),
entity_category=EntityCategory.CONFIG,
),
ProxmoxVMButtonEntityDescription(
key="reset",
translation_key="reset",
press_action=lambda coordinator, node, vmid: (
coordinator.proxmox.nodes(node).qemu(vmid).status.reset.post()
),
entity_category=EntityCategory.CONFIG,
),
)
CONTAINER_BUTTONS: tuple[ProxmoxContainerButtonEntityDescription, ...] = (
ProxmoxContainerButtonEntityDescription(
key="start",
translation_key="start",
press_action=lambda coordinator, node, vmid: (
coordinator.proxmox.nodes(node).lxc(vmid).status.start.post()
),
entity_category=EntityCategory.CONFIG,
),
ProxmoxContainerButtonEntityDescription(
key="stop",
translation_key="stop",
press_action=lambda coordinator, node, vmid: (
coordinator.proxmox.nodes(node).lxc(vmid).status.stop.post()
),
entity_category=EntityCategory.CONFIG,
),
ProxmoxContainerButtonEntityDescription(
key="restart",
press_action=lambda coordinator, node, vmid: (
coordinator.proxmox.nodes(node).lxc(vmid).status.restart.post()
),
entity_category=EntityCategory.CONFIG,
device_class=ButtonDeviceClass.RESTART,
),
)
async def async_setup_entry(
hass: HomeAssistant,
entry: ProxmoxConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up ProxmoxVE buttons."""
coordinator = entry.runtime_data
def _async_add_new_nodes(nodes: list[ProxmoxNodeData]) -> None:
"""Add new node buttons."""
async_add_entities(
ProxmoxNodeButtonEntity(coordinator, entity_description, node)
for node in nodes
for entity_description in NODE_BUTTONS
)
def _async_add_new_vms(
vms: list[tuple[ProxmoxNodeData, dict[str, Any]]],
) -> None:
"""Add new VM buttons."""
async_add_entities(
ProxmoxVMButtonEntity(coordinator, entity_description, vm, node_data)
for (node_data, vm) in vms
for entity_description in VM_BUTTONS
)
def _async_add_new_containers(
containers: list[tuple[ProxmoxNodeData, dict[str, Any]]],
) -> None:
"""Add new container buttons."""
async_add_entities(
ProxmoxContainerButtonEntity(
coordinator, entity_description, container, node_data
)
for (node_data, container) in containers
for entity_description in CONTAINER_BUTTONS
)
coordinator.new_nodes_callbacks.append(_async_add_new_nodes)
coordinator.new_vms_callbacks.append(_async_add_new_vms)
coordinator.new_containers_callbacks.append(_async_add_new_containers)
_async_add_new_nodes(
[
node_data
for node_data in coordinator.data.values()
if node_data.node["node"] in coordinator.known_nodes
]
)
_async_add_new_vms(
[
(node_data, vm_data)
for node_data in coordinator.data.values()
for vmid, vm_data in node_data.vms.items()
if (node_data.node["node"], vmid) in coordinator.known_vms
]
)
_async_add_new_containers(
[
(node_data, container_data)
for node_data in coordinator.data.values()
for vmid, container_data in node_data.containers.items()
if (node_data.node["node"], vmid) in coordinator.known_containers
]
)
class ProxmoxBaseButton(ButtonEntity):
"""Common base for Proxmox buttons. Basically to ensure the async_press logic isn't duplicated."""
entity_description: ButtonEntityDescription
coordinator: ProxmoxCoordinator
@abstractmethod
async def _async_press_call(self) -> None:
"""Abstract method used per Proxmox button class."""
async def async_press(self) -> None:
"""Trigger the Proxmox button press service."""
try:
await self._async_press_call()
except AuthenticationError as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="cannot_connect_no_details",
) from err
except SSLError as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="invalid_auth_no_details",
) from err
except ConnectTimeout as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="timeout_connect_no_details",
) from err
except (ResourceException, requests.exceptions.ConnectionError) as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="api_error_no_details",
) from err
class ProxmoxNodeButtonEntity(ProxmoxNodeEntity, ProxmoxBaseButton):
"""Represents a Proxmox Node button entity."""
entity_description: ProxmoxNodeButtonNodeEntityDescription
def __init__(
self,
coordinator: ProxmoxCoordinator,
entity_description: ProxmoxNodeButtonNodeEntityDescription,
node_data: ProxmoxNodeData,
) -> None:
"""Initialize the Proxmox Node button entity."""
self.entity_description = entity_description
super().__init__(coordinator, node_data)
self._attr_unique_id = f"{coordinator.config_entry.entry_id}_{node_data.node['id']}_{entity_description.key}"
async def _async_press_call(self) -> None:
"""Execute the node button action via executor."""
await self.hass.async_add_executor_job(
self.entity_description.press_action,
self.coordinator,
self._node_data.node["node"],
)
class ProxmoxVMButtonEntity(ProxmoxVMEntity, ProxmoxBaseButton):
"""Represents a Proxmox VM button entity."""
entity_description: ProxmoxVMButtonEntityDescription
def __init__(
self,
coordinator: ProxmoxCoordinator,
entity_description: ProxmoxVMButtonEntityDescription,
vm_data: dict[str, Any],
node_data: ProxmoxNodeData,
) -> None:
"""Initialize the Proxmox VM button entity."""
self.entity_description = entity_description
super().__init__(coordinator, vm_data, node_data)
self._attr_unique_id = f"{coordinator.config_entry.entry_id}_{self.device_id}_{entity_description.key}"
async def _async_press_call(self) -> None:
"""Execute the VM button action via executor."""
await self.hass.async_add_executor_job(
self.entity_description.press_action,
self.coordinator,
self._node_name,
self.vm_data["vmid"],
)
class ProxmoxContainerButtonEntity(ProxmoxContainerEntity, ProxmoxBaseButton):
"""Represents a Proxmox Container button entity."""
entity_description: ProxmoxContainerButtonEntityDescription
def __init__(
self,
coordinator: ProxmoxCoordinator,
entity_description: ProxmoxContainerButtonEntityDescription,
container_data: dict[str, Any],
node_data: ProxmoxNodeData,
) -> None:
"""Initialize the Proxmox Container button entity."""
self.entity_description = entity_description
super().__init__(coordinator, container_data, node_data)
self._attr_unique_id = f"{coordinator.config_entry.entry_id}_{self.device_id}_{entity_description.key}"
async def _async_press_call(self) -> None:
"""Execute the container button action via executor."""
await self.hass.async_add_executor_job(
self.entity_description.press_action,
self.coordinator,
self._node_name,
self.container_data["vmid"],
)
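`ProxmoxBaseButton.async_press` above converts low-level client exceptions into `HomeAssistantError` with per-case translation keys. The pairing can be sketched as a lookup table; the exception classes below are hypothetical stand-ins for the proxmoxer/requests types, and the table mirrors the pairing shown in the diff above (exact-type matching is a simplification of the `except` clauses, which also catch subclasses):

```python
class AuthenticationError(Exception):
    """Stand-in for proxmoxer.AuthenticationError."""


class SSLError(Exception):
    """Stand-in for requests.exceptions.SSLError."""


class ConnectTimeout(Exception):
    """Stand-in for requests.exceptions.ConnectTimeout."""


# Exception type -> translation key, as paired in the diff above; anything
# else falls through to the generic API-error key.
_ERROR_KEYS: dict[type, str] = {
    AuthenticationError: "cannot_connect_no_details",
    SSLError: "invalid_auth_no_details",
    ConnectTimeout: "timeout_connect_no_details",
}


def translation_key_for(err: Exception) -> str:
    """Map a low-level exception to the translation key raised to the user."""
    return _ERROR_KEYS.get(type(err), "api_error_no_details")


assert translation_key_for(ConnectTimeout()) == "timeout_connect_no_details"
assert translation_key_for(RuntimeError()) == "api_error_no_details"
```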


@@ -0,0 +1,18 @@
{
"entity": {
"button": {
"hibernate": {
"default": "mdi:power-sleep"
},
"reset": {
"default": "mdi:restart"
},
"start": {
"default": "mdi:play"
},
"stop": {
"default": "mdi:stop"
}
}
}
}


@@ -54,15 +54,47 @@
"status": {
"name": "Status"
}
},
"button": {
"hibernate": {
"name": "Hibernate"
},
"reset": {
"name": "Reset"
},
"shutdown": {
"name": "Shutdown"
},
"start": {
"name": "Start"
},
"start_all": {
"name": "Start all"
},
"stop": {
"name": "Stop"
},
"stop_all": {
"name": "Stop all"
}
}
},
"exceptions": {
"api_error_no_details": {
"message": "An error occurred while communicating with the Proxmox VE instance."
},
"cannot_connect": {
"message": "An error occurred while trying to connect to the Proxmox VE instance: {error}"
},
"cannot_connect_no_details": {
"message": "Could not connect to the Proxmox VE instance."
},
"invalid_auth": {
"message": "An error occurred while trying to authenticate: {error}"
},
"invalid_auth_no_details": {
"message": "Authentication failed for the Proxmox VE instance."
},
"no_nodes_found": {
"message": "No active nodes were found on the Proxmox VE server."
},
@@ -71,6 +103,9 @@
},
"timeout_connect": {
"message": "A timeout occurred while trying to connect to the Proxmox VE instance: {error}"
},
"timeout_connect_no_details": {
"message": "A timeout occurred while trying to connect to the Proxmox VE instance."
}
},
"issues": {


@@ -7,7 +7,7 @@
"integration_type": "service",
"iot_class": "local_push",
"loggers": ["hass_splunk"],
"quality_scale": "legacy",
"quality_scale": "bronze",
"requirements": ["hass-splunk==0.1.4"],
"single_config_entry": true
}


@@ -18,18 +18,9 @@ rules:
status: exempt
comment: |
Integration does not provide custom actions.
docs-high-level-description:
status: todo
comment: |
Verify integration docs at https://www.home-assistant.io/integrations/splunk/ include a high-level description of Splunk with a link to https://www.splunk.com/ and explain the integration's purpose for users unfamiliar with Splunk.
docs-installation-instructions:
status: todo
comment: |
Verify integration docs include clear prerequisites and step-by-step setup instructions including how to configure Splunk HTTP Event Collector and obtain the required token.
docs-removal-instructions:
status: todo
comment: |
Verify integration docs include instructions on how to remove the integration and clarify what happens to data already in Splunk.
docs-high-level-description: done
docs-installation-instructions: done
docs-removal-instructions: done
entity-event-setup:
status: exempt
comment: |


@@ -2,7 +2,7 @@
from abc import abstractmethod
import asyncio
from collections.abc import Callable, Sequence
from collections.abc import Awaitable, Callable, Sequence
import io
import logging
import os
@@ -430,48 +430,35 @@ class TelegramNotificationService:
params[ATTR_PARSER] = None
return params
async def _send_msgs(
async def _send_msg_formatted(
self,
func_send: Callable,
func_send: Callable[..., Awaitable[Message]],
message_tag: str | None,
*args_msg: Any,
context: Context | None = None,
**kwargs_msg: Any,
) -> dict[str, JsonValueType]:
"""Sends a message to each of the targets.
If there is only 1 target, an error is raised if the send fails.
For multiple targets, errors are logged and the caller is responsible for checking which target is successful/failed based on the return value.
"""Sends a message and formats the response.
:return: dict with chat_id keys and message_id values for successful sends
"""
chat_ids = [kwargs_msg.pop(ATTR_CHAT_ID)]
msg_ids: dict[str, JsonValueType] = {}
for chat_id in chat_ids:
_LOGGER.debug("%s to chat ID %s", func_send.__name__, chat_id)
chat_id: int = kwargs_msg.pop(ATTR_CHAT_ID)
_LOGGER.debug("%s to chat ID %s", func_send.__name__, chat_id)
for file_type in _FILE_TYPES:
if file_type in kwargs_msg and isinstance(
kwargs_msg[file_type], io.BytesIO
):
kwargs_msg[file_type].seek(0)
response: Message = await self._send_msg(
func_send,
message_tag,
chat_id,
*args_msg,
context=context,
**kwargs_msg,
)
response: Message = await self._send_msg(
func_send,
message_tag,
chat_id,
*args_msg,
context=context,
**kwargs_msg,
)
if response:
msg_ids[str(chat_id)] = response.id
return msg_ids
return {str(chat_id): response.id}
async def _send_msg(
self,
func_send: Callable,
func_send: Callable[..., Awaitable[Any]],
message_tag: str | None,
*args_msg: Any,
context: Context | None = None,
@@ -518,7 +505,7 @@ class TelegramNotificationService:
title = kwargs.get(ATTR_TITLE)
text = f"{title}\n{message}" if title else message
params = self._get_msg_kwargs(kwargs)
return await self._send_msgs(
return await self._send_msg_formatted(
self.bot.send_message,
params[ATTR_MESSAGE_TAG],
text,
@@ -759,7 +746,7 @@ class TelegramNotificationService:
)
if file_type == SERVICE_SEND_PHOTO:
return await self._send_msgs(
return await self._send_msg_formatted(
self.bot.send_photo,
params[ATTR_MESSAGE_TAG],
chat_id=kwargs[ATTR_CHAT_ID],
@@ -775,7 +762,7 @@ class TelegramNotificationService:
)
if file_type == SERVICE_SEND_STICKER:
return await self._send_msgs(
return await self._send_msg_formatted(
self.bot.send_sticker,
params[ATTR_MESSAGE_TAG],
chat_id=kwargs[ATTR_CHAT_ID],
@@ -789,7 +776,7 @@ class TelegramNotificationService:
)
if file_type == SERVICE_SEND_VIDEO:
return await self._send_msgs(
return await self._send_msg_formatted(
self.bot.send_video,
params[ATTR_MESSAGE_TAG],
chat_id=kwargs[ATTR_CHAT_ID],
@@ -805,7 +792,7 @@ class TelegramNotificationService:
)
if file_type == SERVICE_SEND_DOCUMENT:
return await self._send_msgs(
return await self._send_msg_formatted(
self.bot.send_document,
params[ATTR_MESSAGE_TAG],
chat_id=kwargs[ATTR_CHAT_ID],
@@ -821,7 +808,7 @@ class TelegramNotificationService:
)
if file_type == SERVICE_SEND_VOICE:
return await self._send_msgs(
return await self._send_msg_formatted(
self.bot.send_voice,
params[ATTR_MESSAGE_TAG],
chat_id=kwargs[ATTR_CHAT_ID],
@@ -836,7 +823,7 @@ class TelegramNotificationService:
)
# SERVICE_SEND_ANIMATION
return await self._send_msgs(
return await self._send_msg_formatted(
self.bot.send_animation,
params[ATTR_MESSAGE_TAG],
chat_id=kwargs[ATTR_CHAT_ID],
@@ -861,7 +848,7 @@ class TelegramNotificationService:
stickerid = kwargs.get(ATTR_STICKER_ID)
if stickerid:
return await self._send_msgs(
return await self._send_msg_formatted(
self.bot.send_sticker,
params[ATTR_MESSAGE_TAG],
chat_id=kwargs[ATTR_CHAT_ID],
@@ -886,7 +873,7 @@ class TelegramNotificationService:
latitude = float(latitude)
longitude = float(longitude)
params = self._get_msg_kwargs(kwargs)
return await self._send_msgs(
return await self._send_msg_formatted(
self.bot.send_location,
params[ATTR_MESSAGE_TAG],
chat_id=kwargs[ATTR_CHAT_ID],
@@ -911,7 +898,7 @@ class TelegramNotificationService:
"""Send a poll."""
params = self._get_msg_kwargs(kwargs)
openperiod = kwargs.get(ATTR_OPEN_PERIOD)
return await self._send_msgs(
return await self._send_msg_formatted(
self.bot.send_poll,
params[ATTR_MESSAGE_TAG],
chat_id=kwargs[ATTR_CHAT_ID],


@@ -17,6 +17,10 @@ DHCP: Final[list[dict[str, str | bool]]] = [
"domain": "airobot",
"hostname": "airobot-thermostat-*",
},
{
"domain": "airos",
"registered_devices": True,
},
{
"domain": "airthings",
"hostname": "airthings-view",


@@ -5302,7 +5302,7 @@
"integration_type": "hub",
"config_flow": true,
"iot_class": "cloud_polling",
"name": "Powerfox"
"name": "Powerfox Cloud"
},
"powerfox_local": {
"integration_type": "device",


@@ -4,6 +4,13 @@ To update, run python3 -m script.hassfest
"""
USB = [
{
"description": "*usb 300*",
"domain": "enocean",
"manufacturer": "*enocean*",
"pid": "6001",
"vid": "0403",
},
{
"description": "*zbt-2*",
"domain": "homeassistant_connect_zbt2",


@@ -838,6 +838,7 @@ class DurationSelectorConfig(BaseSelectorConfig, total=False):
"""Class to represent a duration selector config."""
enable_day: bool
enable_second: bool
enable_millisecond: bool
allow_negative: bool
@@ -853,6 +854,8 @@ class DurationSelector(Selector[DurationSelectorConfig]):
# Enable day field in frontend. A selection with `days` set is allowed
# even if `enable_day` is not set
vol.Optional("enable_day"): cv.boolean,
# Enable seconds field in frontend.
vol.Optional("enable_second", default=True): cv.boolean,
# Enable millisecond field in frontend.
vol.Optional("enable_millisecond"): cv.boolean,
# Allow negative durations.

requirements_all.txt generated

@@ -1452,7 +1452,7 @@ loqedAPI==2.1.10
luftdaten==0.7.4
# homeassistant.components.lunatone
lunatone-rest-api-client==0.6.3
lunatone-rest-api-client==0.7.0
# homeassistant.components.lupusec
lupupy==0.3.2
@@ -3326,7 +3326,7 @@ youless-api==2.2.0
youtubeaio==2.1.1
# homeassistant.components.media_extractor
yt-dlp[default]==2026.02.04
yt-dlp[default]==2026.02.21
# homeassistant.components.zabbix
zabbix-utils==2.0.3


@@ -1271,7 +1271,7 @@ loqedAPI==2.1.10
luftdaten==0.7.4
# homeassistant.components.lunatone
lunatone-rest-api-client==0.6.3
lunatone-rest-api-client==0.7.0
# homeassistant.components.lupusec
lupupy==0.3.2
@@ -2799,7 +2799,7 @@ youless-api==2.2.0
youtubeaio==2.1.1
# homeassistant.components.media_extractor
yt-dlp[default]==2026.02.04
yt-dlp[default]==2026.02.21
# homeassistant.components.zamg
zamg==0.3.6


@@ -1895,7 +1895,6 @@ INTEGRATIONS_WITHOUT_SCALE = [
"spc",
"speedtestdotnet",
"spider",
"splunk",
"spotify",
"sql",
"srp_energy",


@@ -22,7 +22,7 @@ from homeassistant.components.airos.const import (
MAC_ADDRESS,
SECTION_ADVANCED_SETTINGS,
)
from homeassistant.config_entries import SOURCE_RECONFIGURE, SOURCE_USER
from homeassistant.config_entries import SOURCE_DHCP, SOURCE_RECONFIGURE, SOURCE_USER
from homeassistant.const import (
CONF_HOST,
CONF_PASSWORD,
@@ -32,6 +32,7 @@ from homeassistant.const import (
)
from homeassistant.core import HomeAssistant
from homeassistant.data_entry_flow import FlowResultType
from homeassistant.helpers.service_info.dhcp import DhcpServiceInfo
from tests.common import MockConfigEntry
@@ -680,3 +681,72 @@ async def test_configure_device_flow_exceptions(
assert result["type"] is FlowResultType.FORM
assert result["errors"] == {"base": "cannot_connect"}
async def test_dhcp_ip_changed_updates_entry(
hass: HomeAssistant,
mock_config_entry: MockConfigEntry,
) -> None:
"""DHCP event with new IP should update the config entry and reload."""
mock_config_entry.add_to_hass(hass)
macaddress = mock_config_entry.unique_id.lower().replace(":", "").replace("-", "")
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": SOURCE_DHCP},
data=DhcpServiceInfo(
ip="1.1.1.2",
hostname="airos",
macaddress=macaddress,
),
)
assert result["type"] is FlowResultType.ABORT
assert result["reason"] == "already_configured"
assert mock_config_entry.data[CONF_HOST] == "1.1.1.2"
async def test_dhcp_mac_mismatch(
hass: HomeAssistant,
mock_config_entry: MockConfigEntry,
) -> None:
"""DHCP event with non-matching MAC should abort."""
mock_config_entry.add_to_hass(hass)
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": SOURCE_DHCP},
data=DhcpServiceInfo(
ip="1.1.1.2",
hostname="airos",
macaddress="aabbccddeeff",
),
)
assert result["type"] is FlowResultType.ABORT
assert result["reason"] == "unreachable"
async def test_dhcp_ip_unchanged(
hass: HomeAssistant,
mock_config_entry: MockConfigEntry,
) -> None:
"""DHCP event with same IP should abort."""
mock_config_entry.add_to_hass(hass)
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": SOURCE_DHCP},
data=DhcpServiceInfo(
ip=mock_config_entry.data[CONF_HOST],
hostname="airos",
macaddress=mock_config_entry.unique_id.lower()
.replace(":", "")
.replace("-", ""),
),
)
assert result["type"] is FlowResultType.ABORT
assert result["reason"] == "already_configured"


@@ -188,7 +188,7 @@
'type': 'result',
})
# ---
# name: test_can_decrypt_on_download[backup.local-c0cb53bd-hunter2]
# name: test_can_decrypt_on_download[backup.local-backup_compressed_protected_v2-hunter2]
dict({
'id': 1,
'result': None,
@@ -196,7 +196,26 @@
'type': 'result',
})
# ---
# name: test_can_decrypt_on_download[backup.local-c0cb53bd-wrong_password]
# name: test_can_decrypt_on_download[backup.local-backup_compressed_protected_v2-wrong_password]
dict({
'error': dict({
'code': 'password_incorrect',
'message': 'Incorrect password',
}),
'id': 1,
'success': False,
'type': 'result',
})
# ---
# name: test_can_decrypt_on_download[backup.local-backup_compressed_protected_v3-hunter2]
dict({
'id': 1,
'result': None,
'success': True,
'type': 'result',
})
# ---
# name: test_can_decrypt_on_download[backup.local-backup_compressed_protected_v3-wrong_password]
dict({
'error': dict({
'code': 'password_incorrect',


@@ -131,32 +131,75 @@ def test_read_backup(backup_json_content: bytes, expected_backup: AgentBackup) -
@pytest.mark.parametrize(
("backup", "password", "validation_result"),
("backup", "password", "validation_result", "expected_messages"),
[
# Backup not protected, no password provided -> validation passes
(Path("backup_v2_compressed.tar"), None, True),
(Path("backup_v2_uncompressed.tar"), None, True),
(Path("backup_compressed.tar"), None, True, []),
(Path("backup_uncompressed.tar"), None, True, []),
# Backup not protected, password provided -> validation fails
(Path("backup_v2_compressed.tar"), "hunter2", False),
(Path("backup_v2_uncompressed.tar"), "hunter2", False),
(Path("backup_compressed.tar"), "hunter2", False, ["Invalid password"]),
(Path("backup_uncompressed.tar"), "hunter2", False, ["Invalid password"]),
# Backup protected, correct password provided -> validation passes
(Path("backup_v2_compressed_protected.tar"), "hunter2", True),
(Path("backup_v2_uncompressed_protected.tar"), "hunter2", True),
(Path("backup_compressed_protected_v2.tar"), "hunter2", True, []),
(Path("backup_uncompressed_protected_v2.tar"), "hunter2", True, []),
(Path("backup_compressed_protected_v3.tar"), "hunter2", True, []),
(Path("backup_uncompressed_protected_v3.tar"), "hunter2", True, []),
# Backup protected, no password provided -> validation fails
(Path("backup_v2_compressed_protected.tar"), None, False),
(Path("backup_v2_uncompressed_protected.tar"), None, False),
(Path("backup_compressed_protected_v2.tar"), None, False, ["Invalid password"]),
(
Path("backup_uncompressed_protected_v2.tar"),
None,
False,
["Invalid password"],
),
(Path("backup_compressed_protected_v3.tar"), None, False, ["Invalid password"]),
(
Path("backup_uncompressed_protected_v3.tar"),
None,
False,
["Invalid password"],
),
# Backup protected, wrong password provided -> validation fails
(Path("backup_v2_compressed_protected.tar"), "wrong_password", False),
(Path("backup_v2_uncompressed_protected.tar"), "wrong_password", False),
(
Path("backup_compressed_protected_v2.tar"),
"wrong_password",
False,
["Invalid password"],
),
(
Path("backup_uncompressed_protected_v2.tar"),
"wrong_password",
False,
["Invalid password"],
),
(
Path("backup_compressed_protected_v3.tar"),
"wrong_password",
False,
["Invalid password"],
),
(
Path("backup_uncompressed_protected_v3.tar"),
"wrong_password",
False,
["Invalid password"],
),
],
)
def test_validate_password(
password: str | None, backup: Path, validation_result: bool
password: str | None,
backup: Path,
validation_result: bool,
expected_messages: list[str],
caplog: pytest.LogCaptureFixture,
) -> None:
"""Test validating a password."""
test_backups = get_fixture_path("test_backups", DOMAIN)
assert validate_password(test_backups / backup, password) == validation_result
for message in expected_messages:
assert message in caplog.text
assert "Unexpected error validating password" not in caplog.text
@pytest.mark.parametrize("password", [None, "hunter2"])


@@ -4048,8 +4048,10 @@ async def test_subscribe_event(
# Legacy backup, which can't be streamed
("backup.local", "2bcb3113", "hunter2"),
# New backups, which can be streamed; try with correct and wrong passwords
("backup.local", "c0cb53bd", "hunter2"),
("backup.local", "c0cb53bd", "wrong_password"),
("backup.local", "backup_compressed_protected_v2", "hunter2"),
("backup.local", "backup_compressed_protected_v2", "wrong_password"),
("backup.local", "backup_compressed_protected_v3", "hunter2"),
("backup.local", "backup_compressed_protected_v3", "wrong_password"),
],
)
@pytest.mark.usefixtures("mock_backups")


@@ -16,5 +16,14 @@
"dataType": 0,
"readonly": 1,
"unit": "&deg;C"
},
"total_energy": {
"name": "Total energy",
"error": 0,
"value": "7968",
"desc": "",
"dataType": 0,
"readonly": 1,
"unit": "kWh"
}
}


@@ -102,7 +102,19 @@
'unit': '&deg;C',
'value': 6.1,
}),
'total_energy': None,
'total_energy': dict({
'data_type': 0,
'data_type_family': '',
'data_type_name': '',
'desc': '',
'error': 0,
'name': 'Total energy',
'precision': None,
'readonly': 1,
'readwrite': 0,
'unit': 'kWh',
'value': 7968,
}),
}),
'state': dict({
'current_temperature': dict({


@@ -113,3 +113,60 @@
'state': '6.1',
})
# ---
# name: test_sensor_entity_properties[sensor.bsb_lan_total_energy-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'area_id': None,
'capabilities': dict({
'state_class': <SensorStateClass.TOTAL_INCREASING: 'total_increasing'>,
}),
'config_entry_id': <ANY>,
'config_subentry_id': <ANY>,
'device_class': None,
'device_id': <ANY>,
'disabled_by': None,
'domain': 'sensor',
'entity_category': None,
'entity_id': 'sensor.bsb_lan_total_energy',
'has_entity_name': True,
'hidden_by': None,
'icon': None,
'id': <ANY>,
'labels': set({
}),
'name': None,
'object_id_base': 'Total energy',
'options': dict({
'sensor': dict({
'suggested_display_precision': 2,
}),
}),
'original_device_class': <SensorDeviceClass.ENERGY: 'energy'>,
'original_icon': None,
'original_name': 'Total energy',
'platform': 'bsblan',
'previous_unique_id': None,
'suggested_object_id': None,
'supported_features': 0,
'translation_key': 'total_energy',
'unique_id': '00:80:41:19:69:90-total_energy',
'unit_of_measurement': <UnitOfEnergy.KILO_WATT_HOUR: 'kWh'>,
})
# ---
# name: test_sensor_entity_properties[sensor.bsb_lan_total_energy-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'device_class': 'energy',
'friendly_name': 'BSB-LAN Total energy',
'state_class': <SensorStateClass.TOTAL_INCREASING: 'total_increasing'>,
'unit_of_measurement': <UnitOfEnergy.KILO_WATT_HOUR: 'kWh'>,
}),
'context': <ANY>,
'entity_id': 'sensor.bsb_lan_total_energy',
'last_changed': <ANY>,
'last_reported': <ANY>,
'last_updated': <ANY>,
'state': '7968',
})
# ---


@@ -15,6 +15,7 @@ from tests.common import MockConfigEntry, snapshot_platform
ENTITY_CURRENT_TEMP = "sensor.bsb_lan_current_temperature"
ENTITY_OUTSIDE_TEMP = "sensor.bsb_lan_outside_temperature"
ENTITY_TOTAL_ENERGY = "sensor.bsb_lan_total_energy"
async def test_sensor_entity_properties(
@@ -40,6 +41,7 @@ async def test_sensors_not_created_when_data_unavailable(
# Set all sensor data to None to simulate no sensors available
mock_bsblan.sensor.return_value.current_temperature = None
mock_bsblan.sensor.return_value.outside_temperature = None
mock_bsblan.sensor.return_value.total_energy = None
await setup_with_selected_platforms(hass, mock_config_entry, [Platform.SENSOR])
@@ -58,8 +60,9 @@ async def test_partial_sensors_created_when_some_data_available(
entity_registry: er.EntityRegistry,
) -> None:
"""Test only available sensors are created when some sensor data is available."""
# Only current temperature available, outside temperature not
# Only current temperature available, outside temperature and energy not
mock_bsblan.sensor.return_value.outside_temperature = None
mock_bsblan.sensor.return_value.total_energy = None
await setup_with_selected_platforms(hass, mock_config_entry, [Platform.SENSOR])


@@ -1,18 +1,25 @@
"""Tests for EnOcean config flow."""
from unittest.mock import Mock, patch
from unittest.mock import AsyncMock, Mock, patch
from homeassistant import config_entries
from homeassistant.components.enocean.config_flow import EnOceanFlowHandler
from homeassistant.components.enocean.const import DOMAIN
from homeassistant.components.enocean.const import DOMAIN, MANUFACTURER
from homeassistant.config_entries import (
SOURCE_IMPORT,
SOURCE_USB,
SOURCE_USER,
ConfigEntryState,
)
from homeassistant.const import CONF_DEVICE
from homeassistant.core import HomeAssistant
from homeassistant.data_entry_flow import FlowResultType
from homeassistant.helpers.service_info.usb import UsbServiceInfo
from tests.common import MockConfigEntry
DONGLE_VALIDATE_PATH_METHOD = "homeassistant.components.enocean.dongle.validate_path"
DONGLE_DETECT_METHOD = "homeassistant.components.enocean.dongle.detect"
SETUP_ENTRY_METHOD = "homeassistant.components.enocean.async_setup_entry"
async def test_user_flow_cannot_create_multiple_instances(hass: HomeAssistant) -> None:
@@ -24,7 +31,7 @@ async def test_user_flow_cannot_create_multiple_instances(hass: HomeAssistant) -
with patch(DONGLE_VALIDATE_PATH_METHOD, Mock(return_value=True)):
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
DOMAIN, context={"source": SOURCE_USER}
)
assert result["type"] is FlowResultType.ABORT
@@ -37,7 +44,7 @@ async def test_user_flow_with_detected_dongle(hass: HomeAssistant) -> None:
with patch(DONGLE_DETECT_METHOD, Mock(return_value=[FAKE_DONGLE_PATH])):
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
DOMAIN, context={"source": SOURCE_USER}
)
assert result["type"] is FlowResultType.FORM
@@ -51,7 +58,7 @@ async def test_user_flow_with_no_detected_dongle(hass: HomeAssistant) -> None:
"""Test the user flow with a detected EnOcean dongle."""
with patch(DONGLE_DETECT_METHOD, Mock(return_value=[])):
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
DOMAIN, context={"source": SOURCE_USER}
)
assert result["type"] is FlowResultType.FORM
@@ -147,7 +154,7 @@ async def test_import_flow_with_valid_path(hass: HomeAssistant) -> None:
with patch(DONGLE_VALIDATE_PATH_METHOD, Mock(return_value=True)):
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": config_entries.SOURCE_IMPORT},
context={"source": SOURCE_IMPORT},
data=DATA_TO_IMPORT,
)
@@ -165,9 +172,86 @@ async def test_import_flow_with_invalid_path(hass: HomeAssistant) -> None:
):
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": config_entries.SOURCE_IMPORT},
context={"source": SOURCE_IMPORT},
data=DATA_TO_IMPORT,
)
assert result["type"] is FlowResultType.ABORT
assert result["reason"] == "invalid_dongle_path"
async def test_usb_discovery(
hass: HomeAssistant,
) -> None:
"""Test usb discovery success path."""
usb_discovery_info = UsbServiceInfo(
device="/dev/enocean0",
pid="6001",
vid="0403",
serial_number="1234",
description="USB 300",
manufacturer="EnOcean GmbH",
)
device = "/dev/enocean0"
# test discovery step
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": SOURCE_USB},
data=usb_discovery_info,
)
assert result["type"] is FlowResultType.FORM
assert result["step_id"] == "usb_confirm"
assert result["errors"] is None
# test device path
with (
patch(DONGLE_VALIDATE_PATH_METHOD, Mock(return_value=True)),
patch(SETUP_ENTRY_METHOD, AsyncMock(return_value=True)),
patch(
"homeassistant.components.usb.get_serial_by_id",
side_effect=lambda x: x,
),
):
result = await hass.config_entries.flow.async_configure(result["flow_id"], {})
assert result["type"] is FlowResultType.CREATE_ENTRY
assert result["title"] == MANUFACTURER
assert result["data"] == {"device": device}
assert result["context"]["unique_id"] == "0403:6001_1234_EnOcean GmbH_USB 300"
assert result["context"]["title_placeholders"] == {
"name": "USB 300 - /dev/enocean0, s/n: 1234 - EnOcean GmbH - 0403:6001"
}
assert result["result"].state is ConfigEntryState.LOADED
async def test_usb_discovery_already_configured_updates_path(
hass: HomeAssistant,
) -> None:
"""Test usb discovery aborts when already configured and updates device path."""
# Existing entry with the same unique_id but an old device path
existing_entry = MockConfigEntry(
domain=DOMAIN,
data={CONF_DEVICE: "/dev/enocean-old"},
unique_id="0403:6001_1234_EnOcean GmbH_USB 300",
)
existing_entry.add_to_hass(hass)
# New USB discovery for the same dongle but with an updated device path
usb_discovery_info = UsbServiceInfo(
device="/dev/enocean-new",
pid="6001",
vid="0403",
serial_number="1234",
description="USB 300",
manufacturer="EnOcean GmbH",
)
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": SOURCE_USB},
data=usb_discovery_info,
)
assert result["type"] is FlowResultType.ABORT
assert result["reason"] == "single_instance_allowed"


@@ -5,9 +5,11 @@ from unittest.mock import Mock, patch
from aioesphomeapi import (
APIClient,
EntityCategory as ESPHomeEntityCategory,
EntityInfo,
SensorInfo,
SensorState,
)
import pytest
from homeassistant.components.esphome import DOMAIN
from homeassistant.components.esphome.entry_data import RuntimeEntryData
@@ -152,3 +154,42 @@ async def test_discover_zwave_without_home_id() -> None:
)
# Verify async_create_flow was NOT called when zwave_home_id is 0
mock_create_flow.assert_not_called()
async def test_unknown_entity_type_skipped(
hass: HomeAssistant,
mock_client: APIClient,
mock_generic_device_entry: MockGenericDeviceEntryType,
caplog: pytest.LogCaptureFixture,
) -> None:
"""Test that unknown entity types are skipped gracefully."""
class UnknownInfo(EntityInfo):
"""Mock unknown entity info type."""
entity_info = [
SensorInfo(
object_id="mysensor",
key=1,
name="my sensor",
),
UnknownInfo(
object_id="unknown",
key=2,
name="unknown entity",
),
]
states = [SensorState(key=1, state=42)]
await mock_generic_device_entry(
mock_client=mock_client,
entity_info=entity_info,
states=states,
)
assert "UnknownInfo" in caplog.text
assert "not supported in this version of Home Assistant" in caplog.text
# Known entity still works
state = hass.states.get("sensor.test_my_sensor")
assert state is not None
assert state.state == "42"


@@ -2,7 +2,7 @@
from datetime import datetime, timedelta
from typing import Any
from unittest.mock import ANY, patch
from unittest.mock import patch
from freezegun.api import FrozenDateTimeFactory
import pytest
@@ -2291,12 +2291,10 @@ async def test_fan_speed(hass: HomeAssistant) -> None:
assert trt.sync_attributes() == {
"reversible": False,
"supportsFanSpeedPercent": True,
"availableFanSpeeds": ANY,
}
assert trt.query_attributes() == {
"currentFanSpeedPercent": 33,
"currentFanSpeedSetting": ANY,
}
assert trt.can_execute(trait.COMMAND_SET_FAN_SPEED, params={"fanSpeedPercent": 10})
@@ -2311,7 +2309,7 @@ async def test_fan_speed(hass: HomeAssistant) -> None:
async def test_fan_speed_without_percentage_step(hass: HomeAssistant) -> None:
"""Test FanSpeed trait speed control percentage step for fan domain."""
"""Test FanSpeed trait falls back to percent-only when percentage_step is missing."""
assert helpers.get_google_type(fan.DOMAIN, None) is not None
assert trait.FanSpeedTrait.supported(
fan.DOMAIN, FanEntityFeature.SET_SPEED, None, None
@@ -2322,6 +2320,9 @@ async def test_fan_speed_without_percentage_step(hass: HomeAssistant) -> None:
State(
"fan.living_room_fan",
STATE_ON,
attributes={
"percentage": 50,
},
),
BASIC_CONFIG,
)
@@ -2329,12 +2330,10 @@ async def test_fan_speed_without_percentage_step(hass: HomeAssistant) -> None:
assert trt.sync_attributes() == {
"reversible": False,
"supportsFanSpeedPercent": True,
"availableFanSpeeds": ANY,
}
# If a fan state temporarily has no percentage_step attribute, return 1 available speed
assert trt.query_attributes() == {
"currentFanSpeedPercent": 0,
"currentFanSpeedSetting": "1/5",
"currentFanSpeedPercent": 50,
}
@@ -2343,7 +2342,7 @@ async def test_fan_speed_without_percentage_step(hass: HomeAssistant) -> None:
[
(
33,
1.0,
20.0,
"2/5",
[
["Low", "Min", "Slow", "1"],
@@ -2356,7 +2355,7 @@ async def test_fan_speed_without_percentage_step(hass: HomeAssistant) -> None:
),
(
40,
1.0,
20.0,
"2/5",
[
["Low", "Min", "Slow", "1"],
@@ -2421,7 +2420,7 @@ async def test_fan_speed_ordered(
assert trt.sync_attributes() == {
"reversible": False,
"supportsFanSpeedPercent": True,
"supportsFanSpeedPercent": False,
"availableFanSpeeds": {
"ordered": True,
"speeds": [
@@ -2435,7 +2434,6 @@ async def test_fan_speed_ordered(
}
assert trt.query_attributes() == {
"currentFanSpeedPercent": percentage,
"currentFanSpeedSetting": speed,
}
@@ -2484,12 +2482,10 @@ async def test_fan_reverse(
assert trt.sync_attributes() == {
"reversible": True,
"supportsFanSpeedPercent": True,
"availableFanSpeeds": ANY,
}
assert trt.query_attributes() == {
"currentFanSpeedPercent": 33,
"currentFanSpeedSetting": ANY,
}
assert trt.can_execute(trait.COMMAND_REVERSE, params={})


@@ -3,7 +3,12 @@
from http import HTTPStatus
from unittest.mock import MagicMock, patch
from aiohttp.hdrs import X_FORWARDED_FOR, X_FORWARDED_HOST, X_FORWARDED_PROTO
from aiohttp.hdrs import (
CONTENT_TYPE,
X_FORWARDED_FOR,
X_FORWARDED_HOST,
X_FORWARDED_PROTO,
)
from multidict import CIMultiDict
import pytest
@@ -324,6 +329,106 @@ async def test_ingress_request_head(
assert aioclient_mock.mock_calls[-1][3][X_FORWARDED_PROTO]
async def test_ingress_request_head_with_content_type(
hassio_noauth_client, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test HEAD request preserves content-type from upstream."""
aioclient_mock.head(
"http://127.0.0.1/ingress/core/index.html",
text="",
headers={"Content-Type": "text/html; charset=utf-8"},
)
resp = await hassio_noauth_client.head(
"/api/hassio_ingress/core/index.html",
)
assert resp.status == HTTPStatus.OK
body = await resp.text()
assert body == ""
assert resp.headers[CONTENT_TYPE] == "text/html"
async def test_ingress_request_head_without_content_type(
hassio_noauth_client, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test HEAD request without upstream content-type omits it."""
aioclient_mock.head(
"http://127.0.0.1/ingress/core/index.html",
text="",
)
resp = await hassio_noauth_client.head(
"/api/hassio_ingress/core/index.html",
)
assert resp.status == HTTPStatus.OK
body = await resp.text()
assert body == ""
assert CONTENT_TYPE not in resp.headers
async def test_ingress_request_304_no_content_type(
hassio_noauth_client, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test 304 Not Modified does not include content-type when upstream omits it."""
aioclient_mock.get(
"http://127.0.0.1/ingress/core/index.html",
text="",
status=HTTPStatus.NOT_MODIFIED,
)
resp = await hassio_noauth_client.get(
"/api/hassio_ingress/core/index.html",
)
assert resp.status == HTTPStatus.NOT_MODIFIED
body = await resp.text()
assert body == ""
assert CONTENT_TYPE not in resp.headers
async def test_ingress_request_304_with_content_type(
hassio_noauth_client, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test 304 Not Modified preserves content-type when upstream provides it."""
aioclient_mock.get(
"http://127.0.0.1/ingress/core/index.html",
text="",
status=HTTPStatus.NOT_MODIFIED,
headers={"Content-Type": "text/html"},
)
resp = await hassio_noauth_client.get(
"/api/hassio_ingress/core/index.html",
)
assert resp.status == HTTPStatus.NOT_MODIFIED
body = await resp.text()
assert body == ""
assert resp.headers[CONTENT_TYPE] == "text/html"
async def test_ingress_request_204_no_content(
hassio_noauth_client, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test 204 No Content does not include content-type."""
aioclient_mock.get(
"http://127.0.0.1/ingress/core/api/status",
text="",
status=HTTPStatus.NO_CONTENT,
)
resp = await hassio_noauth_client.get(
"/api/hassio_ingress/core/api/status",
)
assert resp.status == HTTPStatus.NO_CONTENT
body = await resp.text()
assert body == ""
assert CONTENT_TYPE not in resp.headers
@pytest.mark.parametrize(
"build_type",
[


@@ -42,3 +42,21 @@ HUE_BLE_SERVICE_INFO = BluetoothServiceInfoBleak(
connectable=True,
tx_power=-127,
)
NOT_HUE_BLE_DISCOVERY_INFO = BluetoothServiceInfoBleak(
name="Not",
address="AA:BB:CC:DD:EE:F2",
rssi=-60,
manufacturer_data={
33: b"\x00\x00\xd1\xf0b;\xd8\x1dE\xd6\xba\xeeL\xdd]\xf5\xb2\xe9",
21: b"\x061\x00Z\x8f\x93\xb2\xec\x85\x06\x00i\x00\x02\x02Q\xed\x1d\xf0",
},
service_uuids=[],
service_data={},
source="local",
device=generate_ble_device(address="AA:BB:CC:DD:EE:F2", name="Not"),
advertisement=generate_advertisement_data(),
time=0,
connectable=True,
tx_power=-127,
)


@@ -2,23 +2,28 @@
from unittest.mock import AsyncMock, PropertyMock, patch
from habluetooth import BluetoothServiceInfoBleak
from HueBLE import ConnectionError, HueBleError, PairingError
import pytest
from homeassistant import config_entries
from homeassistant.components.hue_ble.config_flow import Error
from homeassistant.components.hue_ble.const import (
DOMAIN,
URL_FACTORY_RESET,
URL_PAIRING_MODE,
)
from homeassistant.config_entries import SOURCE_BLUETOOTH
from homeassistant.config_entries import SOURCE_BLUETOOTH, SOURCE_USER
from homeassistant.const import CONF_MAC, CONF_NAME
from homeassistant.core import HomeAssistant
from homeassistant.data_entry_flow import FlowResultType
from homeassistant.helpers import device_registry as dr
from . import HUE_BLE_SERVICE_INFO, TEST_DEVICE_MAC, TEST_DEVICE_NAME
from . import (
HUE_BLE_SERVICE_INFO,
NOT_HUE_BLE_DISCOVERY_INFO,
TEST_DEVICE_MAC,
TEST_DEVICE_NAME,
)
from tests.common import MockConfigEntry
from tests.components.bluetooth import BLEDevice, generate_ble_device
@@ -27,17 +32,34 @@ AUTH_ERROR = ConnectionError()
AUTH_ERROR.__cause__ = PairingError()
async def test_bluetooth_form(
async def test_user_form(
hass: HomeAssistant,
mock_setup_entry: AsyncMock,
) -> None:
"""Test bluetooth discovery form."""
"""Test user form."""
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": SOURCE_BLUETOOTH},
data=HUE_BLE_SERVICE_INFO,
with patch(
"homeassistant.components.hue_ble.config_flow.bluetooth.async_discovered_service_info",
return_value=[NOT_HUE_BLE_DISCOVERY_INFO, HUE_BLE_SERVICE_INFO],
):
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": SOURCE_USER},
)
assert result["type"] is FlowResultType.FORM
assert result["step_id"] == "user"
assert result["data_schema"].schema[CONF_MAC].container == {
HUE_BLE_SERVICE_INFO.address: (
f"{HUE_BLE_SERVICE_INFO.name} ({HUE_BLE_SERVICE_INFO.address})"
),
}
result = await hass.config_entries.flow.async_configure(
result["flow_id"],
{CONF_MAC: HUE_BLE_SERVICE_INFO.address},
)
assert result["type"] is FlowResultType.FORM
assert result["step_id"] == "confirm"
assert result["description_placeholders"] == {
@@ -78,6 +100,27 @@ async def test_bluetooth_form(
assert len(mock_setup_entry.mock_calls) == 1
@pytest.mark.parametrize("discovery_info", [[NOT_HUE_BLE_DISCOVERY_INFO], []])
async def test_user_form_no_device(
hass: HomeAssistant,
mock_setup_entry: AsyncMock,
discovery_info: list[BluetoothServiceInfoBleak],
) -> None:
"""Test user form with no devices."""
with patch(
"homeassistant.components.hue_ble.config_flow.bluetooth.async_discovered_service_info",
return_value=discovery_info,
):
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": SOURCE_USER},
)
assert result["type"] is FlowResultType.ABORT
assert result["reason"] == "no_devices_found"
@pytest.mark.parametrize(
(
"mock_return_device",
@@ -155,7 +198,7 @@ async def test_bluetooth_form(
"unknown",
],
)
async def test_bluetooth_form_exception(
async def test_user_form_exception(
hass: HomeAssistant,
mock_setup_entry: AsyncMock,
mock_return_device: BLEDevice | None,
@@ -165,13 +208,30 @@ async def test_bluetooth_form_exception(
mock_poll_state: Exception | None,
error: Error,
) -> None:
"""Test bluetooth discovery form with errors."""
"""Test user form with errors."""
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": SOURCE_BLUETOOTH},
data=HUE_BLE_SERVICE_INFO,
with patch(
"homeassistant.components.hue_ble.config_flow.bluetooth.async_discovered_service_info",
return_value=[NOT_HUE_BLE_DISCOVERY_INFO, HUE_BLE_SERVICE_INFO],
):
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": SOURCE_USER},
)
assert result["type"] is FlowResultType.FORM
assert result["step_id"] == "user"
assert result["data_schema"].schema[CONF_MAC].container == {
HUE_BLE_SERVICE_INFO.address: (
f"{HUE_BLE_SERVICE_INFO.name} ({HUE_BLE_SERVICE_INFO.address})"
),
}
result = await hass.config_entries.flow.async_configure(
result["flow_id"],
{CONF_MAC: HUE_BLE_SERVICE_INFO.address},
)
assert result["type"] is FlowResultType.FORM
assert result["step_id"] == "confirm"
@@ -232,17 +292,19 @@ async def test_bluetooth_form_exception(
assert result["type"] is FlowResultType.CREATE_ENTRY
async def test_user_form_exception(
async def test_bluetooth_discovery_aborts(
hass: HomeAssistant,
mock_setup_entry: AsyncMock,
) -> None:
"""Test the user form raises a discovery only error."""
"""Test bluetooth form aborts."""
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
DOMAIN,
context={"source": SOURCE_BLUETOOTH},
data=HUE_BLE_SERVICE_INFO,
)
assert result["type"] is FlowResultType.ABORT
assert result["reason"] == "not_implemented"
assert result["reason"] == "discovery_unsupported"
async def test_bluetooth_form_exception_already_set_up(
@@ -260,4 +322,38 @@ async def test_bluetooth_form_exception_already_set_up(
data=HUE_BLE_SERVICE_INFO,
)
assert result["type"] is FlowResultType.ABORT
assert result["reason"] == "discovery_unsupported"
async def test_user_form_exception_already_set_up(
hass: HomeAssistant,
mock_setup_entry: AsyncMock,
mock_config_entry: MockConfigEntry,
) -> None:
"""Test user form when device is already set up."""
mock_config_entry.add_to_hass(hass)
with patch(
"homeassistant.components.hue_ble.config_flow.bluetooth.async_discovered_service_info",
return_value=[NOT_HUE_BLE_DISCOVERY_INFO, HUE_BLE_SERVICE_INFO],
):
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": SOURCE_USER},
)
assert result["type"] is FlowResultType.FORM
assert result["step_id"] == "user"
assert result["data_schema"].schema[CONF_MAC].container == {
HUE_BLE_SERVICE_INFO.address: (
f"{HUE_BLE_SERVICE_INFO.name} ({HUE_BLE_SERVICE_INFO.address})"
),
}
result = await hass.config_entries.flow.async_configure(
result["flow_id"],
{CONF_MAC: HUE_BLE_SERVICE_INFO.address},
)
assert result["type"] is FlowResultType.ABORT
assert result["reason"] == "already_configured"


@@ -11,7 +11,7 @@ from lunatone_rest_api_client.models import (
InfoData,
LineStatus,
)
from lunatone_rest_api_client.models.common import ColorRGBData, ColorWAFData, Status
from lunatone_rest_api_client.models.common import Status
from lunatone_rest_api_client.models.devices import DeviceStatus
from homeassistant.core import HomeAssistant
@@ -77,13 +77,7 @@ def build_device_data_list() -> list[DeviceData]:
name="Device 1",
available=True,
status=DeviceStatus(),
features=FeaturesStatus(
switchable=Status[bool](status=False),
dimmable=Status[float](status=0.0),
colorKelvin=Status[int](status=1000),
colorRGB=Status[ColorRGBData](status=ColorRGBData(r=0, g=0, b=0)),
colorWAF=Status[ColorWAFData](status=ColorWAFData(w=0, a=0, f=0)),
),
features=FeaturesStatus(switchable=Status[bool](status=False)),
address=0,
line=0,
),
@@ -95,9 +89,6 @@ def build_device_data_list() -> list[DeviceData]:
features=FeaturesStatus(
switchable=Status[bool](status=False),
dimmable=Status[float](status=0.0),
colorKelvin=Status[int](status=1000),
colorRGB=Status[ColorRGBData](status=ColorRGBData(r=0, g=0, b=0)),
colorWAF=Status[ColorWAFData](status=ColorWAFData(w=0, a=0, f=0)),
),
address=1,
line=0,


@@ -27,7 +27,6 @@ def mock_setup_entry() -> Generator[AsyncMock]:
@pytest.fixture
def mock_lunatone_devices() -> Generator[AsyncMock]:
"""Mock a Lunatone devices object."""
state = {"is_dimmable": False}
def build_devices_mock(devices: Devices):
device_list = []
@@ -39,9 +38,10 @@ def mock_lunatone_devices() -> Generator[AsyncMock]:
device.id = device.data.id
device.name = device.data.name
device.is_on = device.data.features.switchable.status
device.brightness = device.data.features.dimmable.status
type(device).is_dimmable = PropertyMock(
side_effect=lambda s=state: s["is_dimmable"]
device.brightness = (
device.data.features.dimmable.status
if device.data.features.dimmable
else None
)
device_list.append(device)
return device_list
@@ -54,7 +54,6 @@ def mock_lunatone_devices() -> Generator[AsyncMock]:
type(devices).devices = PropertyMock(
side_effect=lambda d=devices: build_devices_mock(d)
)
devices.set_is_dimmable = lambda value, s=state: s.update(is_dimmable=value)
yield devices


@@ -8,34 +8,18 @@
'dali_types': list([
]),
'features': dict({
'color_kelvin': dict({
'status': 1000.0,
}),
'color_kelvin': None,
'color_kelvin_with_fade': None,
'color_rgb': dict({
'status': dict({
'blue': 0.0,
'green': 0.0,
'red': 0.0,
}),
}),
'color_rgb': None,
'color_rgb_with_fade': None,
'color_waf': dict({
'status': dict({
'amber': 0.0,
'free_color': 0.0,
'white': 0.0,
}),
}),
'color_waf': None,
'color_waf_with_fade': None,
'color_xy': None,
'color_xy_with_fade': None,
'dali_cmd16': None,
'dim_down': None,
'dim_up': None,
'dimmable': dict({
'status': 0.0,
}),
'dimmable': None,
'dimmable_kelvin': None,
'dimmable_rgb': None,
'dimmable_waf': None,
@@ -79,25 +63,11 @@
'dali_types': list([
]),
'features': dict({
'color_kelvin': dict({
'status': 1000.0,
}),
'color_kelvin': None,
'color_kelvin_with_fade': None,
'color_rgb': dict({
'status': dict({
'blue': 0.0,
'green': 0.0,
'red': 0.0,
}),
}),
'color_rgb': None,
'color_rgb_with_fade': None,
'color_waf': dict({
'status': dict({
'amber': 0.0,
'free_color': 0.0,
'white': 0.0,
}),
}),
'color_waf': None,
'color_waf_with_fade': None,
'color_xy': None,
'color_xy_with_fade': None,
@@ -208,6 +178,7 @@
'node_red': False,
'startup_mode': 'normal',
'tier': 'basic',
'uid': None,
'version': 'v1.14.1/1.4.3',
}),
})


@@ -64,7 +64,7 @@
'area_id': None,
'capabilities': dict({
'supported_color_modes': list([
<ColorMode.ONOFF: 'onoff'>,
<ColorMode.BRIGHTNESS: 'brightness'>,
]),
}),
'config_entry_id': <ANY>,
@@ -100,10 +100,11 @@
# name: test_setup[light.device_2-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'brightness': None,
'color_mode': None,
'friendly_name': 'Device 2',
'supported_color_modes': list([
<ColorMode.ONOFF: 'onoff'>,
<ColorMode.BRIGHTNESS: 'brightness'>,
]),
'supported_features': <LightEntityFeature: 0>,
}),


@@ -22,8 +22,6 @@ from . import setup_integration
from tests.common import MockConfigEntry
TEST_ENTITY_ID = "light.device_1"
async def test_setup(
hass: HomeAssistant,
@@ -52,10 +50,13 @@ async def test_turn_on_off(
mock_config_entry: MockConfigEntry,
) -> None:
"""Test the light can be turned on and off."""
device_id = 1
entity_id = f"light.device_{device_id}"
await setup_integration(hass, mock_config_entry)
async def fake_update():
device = mock_lunatone_devices.data.devices[0]
device = mock_lunatone_devices.data.devices[device_id - 1]
device.features.switchable.status = not device.features.switchable.status
mock_lunatone_devices.async_update.side_effect = fake_update
@@ -63,22 +64,22 @@ async def test_turn_on_off(
await hass.services.async_call(
LIGHT_DOMAIN,
SERVICE_TURN_ON,
{ATTR_ENTITY_ID: TEST_ENTITY_ID},
{ATTR_ENTITY_ID: entity_id},
blocking=True,
)
state = hass.states.get(TEST_ENTITY_ID)
state = hass.states.get(entity_id)
assert state
assert state.state == STATE_ON
await hass.services.async_call(
LIGHT_DOMAIN,
SERVICE_TURN_OFF,
{ATTR_ENTITY_ID: TEST_ENTITY_ID},
{ATTR_ENTITY_ID: entity_id},
blocking=True,
)
state = hass.states.get(TEST_ENTITY_ID)
state = hass.states.get(entity_id)
assert state
assert state.state == STATE_OFF
@@ -90,16 +91,16 @@ async def test_turn_on_off_with_brightness(
mock_config_entry: MockConfigEntry,
) -> None:
"""Test the light can be turned on with brightness."""
device_id = 2
entity_id = f"light.device_{device_id}"
expected_brightness = 128
brightness_percentages = iter([50.0, 0.0, 50.0])
mock_lunatone_devices.set_is_dimmable(True)
await setup_integration(hass, mock_config_entry)
async def fake_update():
brightness = next(brightness_percentages)
device = mock_lunatone_devices.data.devices[0]
device = mock_lunatone_devices.data.devices[device_id - 1]
device.features.switchable.status = brightness > 0
device.features.dimmable.status = brightness
@@ -108,11 +109,11 @@ async def test_turn_on_off_with_brightness(
await hass.services.async_call(
LIGHT_DOMAIN,
SERVICE_TURN_ON,
{ATTR_ENTITY_ID: TEST_ENTITY_ID, ATTR_BRIGHTNESS: expected_brightness},
{ATTR_ENTITY_ID: entity_id, ATTR_BRIGHTNESS: expected_brightness},
blocking=True,
)
state = hass.states.get(TEST_ENTITY_ID)
state = hass.states.get(entity_id)
assert state
assert state.state == STATE_ON
assert state.attributes["brightness"] == expected_brightness
@@ -120,11 +121,11 @@ async def test_turn_on_off_with_brightness(
await hass.services.async_call(
LIGHT_DOMAIN,
SERVICE_TURN_OFF,
{ATTR_ENTITY_ID: TEST_ENTITY_ID},
{ATTR_ENTITY_ID: entity_id},
blocking=True,
)
state = hass.states.get(TEST_ENTITY_ID)
state = hass.states.get(entity_id)
assert state
assert state.state == STATE_OFF
assert not state.attributes["brightness"]
@@ -132,11 +133,11 @@ async def test_turn_on_off_with_brightness(
await hass.services.async_call(
LIGHT_DOMAIN,
SERVICE_TURN_ON,
{ATTR_ENTITY_ID: TEST_ENTITY_ID},
{ATTR_ENTITY_ID: entity_id},
blocking=True,
)
state = hass.states.get(TEST_ENTITY_ID)
state = hass.states.get(entity_id)
assert state
assert state.state == STATE_ON
assert state.attributes["brightness"] == expected_brightness


@@ -2,32 +2,206 @@
from collections.abc import Generator
from datetime import UTC, datetime
from types import MappingProxyType
from unittest.mock import AsyncMock, MagicMock, patch
from pymta import Arrival
import pytest
from homeassistant.components.mta.const import CONF_LINE, CONF_STOP_ID, CONF_STOP_NAME
from homeassistant.components.mta.const import (
CONF_LINE,
CONF_ROUTE,
CONF_STOP_ID,
CONF_STOP_NAME,
DOMAIN,
SUBENTRY_TYPE_BUS,
SUBENTRY_TYPE_SUBWAY,
)
from homeassistant.config_entries import ConfigSubentry
from homeassistant.const import CONF_API_KEY
from tests.common import MockConfigEntry
MOCK_SUBWAY_ARRIVALS = [
Arrival(
arrival_time=datetime(2023, 10, 21, 0, 5, 0, tzinfo=UTC),
route_id="1",
stop_id="127N",
destination="Van Cortlandt Park - 242 St",
),
Arrival(
arrival_time=datetime(2023, 10, 21, 0, 10, 0, tzinfo=UTC),
route_id="1",
stop_id="127N",
destination="Van Cortlandt Park - 242 St",
),
Arrival(
arrival_time=datetime(2023, 10, 21, 0, 15, 0, tzinfo=UTC),
route_id="1",
stop_id="127N",
destination="Van Cortlandt Park - 242 St",
),
]
MOCK_SUBWAY_STOPS = [
{
"stop_id": "127N",
"stop_name": "Times Sq - 42 St",
"stop_sequence": 1,
},
{
"stop_id": "127S",
"stop_name": "Times Sq - 42 St",
"stop_sequence": 2,
},
]
MOCK_BUS_ARRIVALS = [
Arrival(
arrival_time=datetime(2023, 10, 21, 0, 5, 0, tzinfo=UTC),
route_id="M15",
stop_id="400561",
destination="South Ferry",
),
Arrival(
arrival_time=datetime(2023, 10, 21, 0, 12, 0, tzinfo=UTC),
route_id="M15",
stop_id="400561",
destination="South Ferry",
),
Arrival(
arrival_time=datetime(2023, 10, 21, 0, 20, 0, tzinfo=UTC),
route_id="M15",
stop_id="400561",
destination="South Ferry",
),
]
MOCK_BUS_STOPS = [
{
"stop_id": "400561",
"stop_name": "1 Av/E 79 St",
"stop_sequence": 1,
},
{
"stop_id": "400562",
"stop_name": "1 Av/E 72 St",
"stop_sequence": 2,
},
]
# Bus stops with direction info (from updated library)
MOCK_BUS_STOPS_WITH_DIRECTION = [
{
"stop_id": "400561",
"stop_name": "1 Av/E 79 St",
"stop_sequence": 1,
"direction_id": 0,
"direction_name": "South Ferry",
},
{
"stop_id": "400570",
"stop_name": "1 Av/E 79 St",
"stop_sequence": 15,
"direction_id": 1,
"direction_name": "Harlem",
},
{
"stop_id": "400562",
"stop_name": "1 Av/E 72 St",
"stop_sequence": 2,
"direction_id": 0,
"direction_name": "South Ferry",
},
]
@pytest.fixture
def mock_config_entry() -> MockConfigEntry:
"""Return a mock config entry."""
"""Return a mock config entry (main entry without subentries)."""
return MockConfigEntry(
domain="mta",
data={
CONF_LINE: "1",
CONF_STOP_ID: "127N",
CONF_STOP_NAME: "Times Sq - 42 St (N direction)",
},
unique_id="1_127N",
domain=DOMAIN,
data={CONF_API_KEY: None},
version=1,
minor_version=1,
entry_id="01J0000000000000000000000",
title="1 Line - Times Sq - 42 St (N direction)",
title="MTA",
)
@pytest.fixture
def mock_config_entry_with_api_key() -> MockConfigEntry:
"""Return a mock config entry with API key."""
return MockConfigEntry(
domain=DOMAIN,
data={CONF_API_KEY: "test_api_key"},
version=1,
minor_version=1,
entry_id="01J0000000000000000000001",
title="MTA",
)
@pytest.fixture
def mock_subway_subentry() -> ConfigSubentry:
"""Return a mock subway subentry."""
return ConfigSubentry(
data=MappingProxyType(
{
CONF_LINE: "1",
CONF_STOP_ID: "127N",
CONF_STOP_NAME: "Times Sq - 42 St (N direction)",
}
),
subentry_id="01JSUBWAY00000000000000001",
subentry_type=SUBENTRY_TYPE_SUBWAY,
title="1 - Times Sq - 42 St (N direction)",
unique_id="1_127N",
)
@pytest.fixture
def mock_bus_subentry() -> ConfigSubentry:
"""Return a mock bus subentry."""
return ConfigSubentry(
data=MappingProxyType(
{
CONF_ROUTE: "M15",
CONF_STOP_ID: "400561",
CONF_STOP_NAME: "1 Av/E 79 St",
}
),
subentry_id="01JBUS0000000000000000001",
subentry_type=SUBENTRY_TYPE_BUS,
title="M15 - 1 Av/E 79 St",
unique_id="bus_M15_400561",
)
@pytest.fixture
def mock_config_entry_with_subway_subentry(
mock_config_entry: MockConfigEntry,
mock_subway_subentry: ConfigSubentry,
) -> MockConfigEntry:
"""Return a mock config entry with a subway subentry."""
mock_config_entry.subentries = {
mock_subway_subentry.subentry_id: mock_subway_subentry
}
return mock_config_entry
@pytest.fixture
def mock_config_entry_with_bus_subentry(
mock_config_entry_with_api_key: MockConfigEntry,
mock_bus_subentry: ConfigSubentry,
) -> MockConfigEntry:
"""Return a mock config entry with a bus subentry."""
mock_config_entry_with_api_key.subentries = {
mock_bus_subentry.subentry_id: mock_bus_subentry
}
return mock_config_entry_with_api_key
@pytest.fixture
def mock_setup_entry() -> Generator[AsyncMock]:
"""Mock setting up a config entry."""
@@ -40,41 +214,6 @@ def mock_setup_entry() -> Generator[AsyncMock]:
@pytest.fixture
def mock_subway_feed() -> Generator[MagicMock]:
"""Create a mock SubwayFeed for both coordinator and config flow."""
# Fixed arrival times: 5, 10, and 15 minutes after test frozen time (2023-10-21 00:00:00 UTC)
mock_arrivals = [
Arrival(
arrival_time=datetime(2023, 10, 21, 0, 5, 0, tzinfo=UTC),
route_id="1",
stop_id="127N",
destination="Van Cortlandt Park - 242 St",
),
Arrival(
arrival_time=datetime(2023, 10, 21, 0, 10, 0, tzinfo=UTC),
route_id="1",
stop_id="127N",
destination="Van Cortlandt Park - 242 St",
),
Arrival(
arrival_time=datetime(2023, 10, 21, 0, 15, 0, tzinfo=UTC),
route_id="1",
stop_id="127N",
destination="Van Cortlandt Park - 242 St",
),
]
mock_stops = [
{
"stop_id": "127N",
"stop_name": "Times Sq - 42 St",
"stop_sequence": 1,
},
{
"stop_id": "127S",
"stop_name": "Times Sq - 42 St",
"stop_sequence": 2,
},
]
with (
patch(
"homeassistant.components.mta.coordinator.SubwayFeed", autospec=True
@@ -86,7 +225,45 @@ def mock_subway_feed() -> Generator[MagicMock]:
):
mock_instance = mock_feed.return_value
mock_feed.get_feed_id_for_route.return_value = "1"
mock_instance.get_arrivals.return_value = mock_arrivals
mock_instance.get_stops.return_value = mock_stops
mock_instance.get_arrivals.return_value = MOCK_SUBWAY_ARRIVALS
mock_instance.get_stops.return_value = MOCK_SUBWAY_STOPS
yield mock_feed
@pytest.fixture
def mock_bus_feed() -> Generator[MagicMock]:
"""Create a mock BusFeed for both coordinator and config flow."""
with (
patch(
"homeassistant.components.mta.coordinator.BusFeed", autospec=True
) as mock_feed,
patch(
"homeassistant.components.mta.config_flow.BusFeed",
new=mock_feed,
),
):
mock_instance = mock_feed.return_value
mock_instance.get_arrivals.return_value = MOCK_BUS_ARRIVALS
mock_instance.get_stops.return_value = MOCK_BUS_STOPS
yield mock_feed
@pytest.fixture
def mock_bus_feed_with_direction() -> Generator[MagicMock]:
"""Create a mock BusFeed with direction info."""
with (
patch(
"homeassistant.components.mta.coordinator.BusFeed", autospec=True
) as mock_feed,
patch(
"homeassistant.components.mta.config_flow.BusFeed",
new=mock_feed,
),
):
mock_instance = mock_feed.return_value
mock_instance.get_arrivals.return_value = MOCK_BUS_ARRIVALS
mock_instance.get_stops.return_value = MOCK_BUS_STOPS_WITH_DIRECTION
yield mock_feed
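The `mock_bus_feed` fixture above patches the same mock object into two import sites (the coordinator and the config flow) so both observe identical stubbed data. A minimal standalone sketch of that double-patch technique, using throwaway modules — every `fake_pkg` name here is hypothetical scaffolding, not a real package:

```python
import sys
import types
from unittest.mock import patch

# Build two fake modules that each hold their own reference to BusFeed,
# mimicking `coordinator` and `config_flow` importing the class separately.
pkg = types.ModuleType("fake_pkg")
coordinator = types.ModuleType("fake_pkg.coordinator")
config_flow = types.ModuleType("fake_pkg.config_flow")
coordinator.BusFeed = type("BusFeed", (), {})
config_flow.BusFeed = coordinator.BusFeed
pkg.coordinator = coordinator
pkg.config_flow = config_flow
sys.modules.update(
    {
        "fake_pkg": pkg,
        "fake_pkg.coordinator": coordinator,
        "fake_pkg.config_flow": config_flow,
    }
)

with (
    patch("fake_pkg.coordinator.BusFeed") as mock_feed,
    # `new=mock_feed` reuses the first patch's MagicMock, so stubbing a
    # return value once affects both import sites.
    patch("fake_pkg.config_flow.BusFeed", new=mock_feed),
):
    mock_feed.return_value.get_stops.return_value = ["1 Av/E 79 St"]
    same_object = coordinator.BusFeed is config_flow.BusFeed
    stops = config_flow.BusFeed().get_stops()

assert same_object
assert stops == ["1 Av/E 79 St"]
```

Without the second patch, code that imported `BusFeed` through the other module would bypass the mock entirely, which is why the fixture patches both paths with one object.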


@@ -1,5 +1,5 @@
# serializer version: 1
# name: test_sensor[sensor.1_line_times_sq_42_st_n_direction_127n_next_arrival-entry]
# name: test_bus_sensor[sensor.m15_1_av_e_79_st_next_arrival-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
@@ -12,7 +12,451 @@
'disabled_by': None,
'domain': 'sensor',
'entity_category': None,
'entity_id': 'sensor.1_line_times_sq_42_st_n_direction_127n_next_arrival',
'entity_id': 'sensor.m15_1_av_e_79_st_next_arrival',
'has_entity_name': True,
'hidden_by': None,
'icon': None,
'id': <ANY>,
'labels': set({
}),
'name': None,
'object_id_base': 'Next arrival',
'options': dict({
}),
'original_device_class': <SensorDeviceClass.TIMESTAMP: 'timestamp'>,
'original_icon': None,
'original_name': 'Next arrival',
'platform': 'mta',
'previous_unique_id': None,
'suggested_object_id': None,
'supported_features': 0,
'translation_key': 'next_arrival',
'unique_id': 'bus_M15_400561-next_arrival',
'unit_of_measurement': None,
})
# ---
# name: test_bus_sensor[sensor.m15_1_av_e_79_st_next_arrival-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'device_class': 'timestamp',
'friendly_name': 'M15 - 1 Av/E 79 St Next arrival',
}),
'context': <ANY>,
'entity_id': 'sensor.m15_1_av_e_79_st_next_arrival',
'last_changed': <ANY>,
'last_reported': <ANY>,
'last_updated': <ANY>,
'state': '2023-10-21T00:05:00+00:00',
})
# ---
# name: test_bus_sensor[sensor.m15_1_av_e_79_st_next_arrival_destination-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
'config_subentry_id': <ANY>,
'device_class': None,
'device_id': <ANY>,
'disabled_by': None,
'domain': 'sensor',
'entity_category': None,
'entity_id': 'sensor.m15_1_av_e_79_st_next_arrival_destination',
'has_entity_name': True,
'hidden_by': None,
'icon': None,
'id': <ANY>,
'labels': set({
}),
'name': None,
'object_id_base': 'Next arrival destination',
'options': dict({
}),
'original_device_class': None,
'original_icon': None,
'original_name': 'Next arrival destination',
'platform': 'mta',
'previous_unique_id': None,
'suggested_object_id': None,
'supported_features': 0,
'translation_key': 'next_arrival_destination',
'unique_id': 'bus_M15_400561-next_arrival_destination',
'unit_of_measurement': None,
})
# ---
# name: test_bus_sensor[sensor.m15_1_av_e_79_st_next_arrival_destination-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'friendly_name': 'M15 - 1 Av/E 79 St Next arrival destination',
}),
'context': <ANY>,
'entity_id': 'sensor.m15_1_av_e_79_st_next_arrival_destination',
'last_changed': <ANY>,
'last_reported': <ANY>,
'last_updated': <ANY>,
'state': 'South Ferry',
})
# ---
# name: test_bus_sensor[sensor.m15_1_av_e_79_st_next_arrival_route-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
'config_subentry_id': <ANY>,
'device_class': None,
'device_id': <ANY>,
'disabled_by': None,
'domain': 'sensor',
'entity_category': None,
'entity_id': 'sensor.m15_1_av_e_79_st_next_arrival_route',
'has_entity_name': True,
'hidden_by': None,
'icon': None,
'id': <ANY>,
'labels': set({
}),
'name': None,
'object_id_base': 'Next arrival route',
'options': dict({
}),
'original_device_class': None,
'original_icon': None,
'original_name': 'Next arrival route',
'platform': 'mta',
'previous_unique_id': None,
'suggested_object_id': None,
'supported_features': 0,
'translation_key': 'next_arrival_route',
'unique_id': 'bus_M15_400561-next_arrival_route',
'unit_of_measurement': None,
})
# ---
# name: test_bus_sensor[sensor.m15_1_av_e_79_st_next_arrival_route-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'friendly_name': 'M15 - 1 Av/E 79 St Next arrival route',
}),
'context': <ANY>,
'entity_id': 'sensor.m15_1_av_e_79_st_next_arrival_route',
'last_changed': <ANY>,
'last_reported': <ANY>,
'last_updated': <ANY>,
'state': 'M15',
})
# ---
# name: test_bus_sensor[sensor.m15_1_av_e_79_st_second_arrival-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
'config_subentry_id': <ANY>,
'device_class': None,
'device_id': <ANY>,
'disabled_by': None,
'domain': 'sensor',
'entity_category': None,
'entity_id': 'sensor.m15_1_av_e_79_st_second_arrival',
'has_entity_name': True,
'hidden_by': None,
'icon': None,
'id': <ANY>,
'labels': set({
}),
'name': None,
'object_id_base': 'Second arrival',
'options': dict({
}),
'original_device_class': <SensorDeviceClass.TIMESTAMP: 'timestamp'>,
'original_icon': None,
'original_name': 'Second arrival',
'platform': 'mta',
'previous_unique_id': None,
'suggested_object_id': None,
'supported_features': 0,
'translation_key': 'second_arrival',
'unique_id': 'bus_M15_400561-second_arrival',
'unit_of_measurement': None,
})
# ---
# name: test_bus_sensor[sensor.m15_1_av_e_79_st_second_arrival-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'device_class': 'timestamp',
'friendly_name': 'M15 - 1 Av/E 79 St Second arrival',
}),
'context': <ANY>,
'entity_id': 'sensor.m15_1_av_e_79_st_second_arrival',
'last_changed': <ANY>,
'last_reported': <ANY>,
'last_updated': <ANY>,
'state': '2023-10-21T00:12:00+00:00',
})
# ---
# name: test_bus_sensor[sensor.m15_1_av_e_79_st_second_arrival_destination-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
'config_subentry_id': <ANY>,
'device_class': None,
'device_id': <ANY>,
'disabled_by': None,
'domain': 'sensor',
'entity_category': None,
'entity_id': 'sensor.m15_1_av_e_79_st_second_arrival_destination',
'has_entity_name': True,
'hidden_by': None,
'icon': None,
'id': <ANY>,
'labels': set({
}),
'name': None,
'object_id_base': 'Second arrival destination',
'options': dict({
}),
'original_device_class': None,
'original_icon': None,
'original_name': 'Second arrival destination',
'platform': 'mta',
'previous_unique_id': None,
'suggested_object_id': None,
'supported_features': 0,
'translation_key': 'second_arrival_destination',
'unique_id': 'bus_M15_400561-second_arrival_destination',
'unit_of_measurement': None,
})
# ---
# name: test_bus_sensor[sensor.m15_1_av_e_79_st_second_arrival_destination-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'friendly_name': 'M15 - 1 Av/E 79 St Second arrival destination',
}),
'context': <ANY>,
'entity_id': 'sensor.m15_1_av_e_79_st_second_arrival_destination',
'last_changed': <ANY>,
'last_reported': <ANY>,
'last_updated': <ANY>,
'state': 'South Ferry',
})
# ---
# name: test_bus_sensor[sensor.m15_1_av_e_79_st_second_arrival_route-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
'config_subentry_id': <ANY>,
'device_class': None,
'device_id': <ANY>,
'disabled_by': None,
'domain': 'sensor',
'entity_category': None,
'entity_id': 'sensor.m15_1_av_e_79_st_second_arrival_route',
'has_entity_name': True,
'hidden_by': None,
'icon': None,
'id': <ANY>,
'labels': set({
}),
'name': None,
'object_id_base': 'Second arrival route',
'options': dict({
}),
'original_device_class': None,
'original_icon': None,
'original_name': 'Second arrival route',
'platform': 'mta',
'previous_unique_id': None,
'suggested_object_id': None,
'supported_features': 0,
'translation_key': 'second_arrival_route',
'unique_id': 'bus_M15_400561-second_arrival_route',
'unit_of_measurement': None,
})
# ---
# name: test_bus_sensor[sensor.m15_1_av_e_79_st_second_arrival_route-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'friendly_name': 'M15 - 1 Av/E 79 St Second arrival route',
}),
'context': <ANY>,
'entity_id': 'sensor.m15_1_av_e_79_st_second_arrival_route',
'last_changed': <ANY>,
'last_reported': <ANY>,
'last_updated': <ANY>,
'state': 'M15',
})
# ---
# name: test_bus_sensor[sensor.m15_1_av_e_79_st_third_arrival-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
'config_subentry_id': <ANY>,
'device_class': None,
'device_id': <ANY>,
'disabled_by': None,
'domain': 'sensor',
'entity_category': None,
'entity_id': 'sensor.m15_1_av_e_79_st_third_arrival',
'has_entity_name': True,
'hidden_by': None,
'icon': None,
'id': <ANY>,
'labels': set({
}),
'name': None,
'object_id_base': 'Third arrival',
'options': dict({
}),
'original_device_class': <SensorDeviceClass.TIMESTAMP: 'timestamp'>,
'original_icon': None,
'original_name': 'Third arrival',
'platform': 'mta',
'previous_unique_id': None,
'suggested_object_id': None,
'supported_features': 0,
'translation_key': 'third_arrival',
'unique_id': 'bus_M15_400561-third_arrival',
'unit_of_measurement': None,
})
# ---
# name: test_bus_sensor[sensor.m15_1_av_e_79_st_third_arrival-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'device_class': 'timestamp',
'friendly_name': 'M15 - 1 Av/E 79 St Third arrival',
}),
'context': <ANY>,
'entity_id': 'sensor.m15_1_av_e_79_st_third_arrival',
'last_changed': <ANY>,
'last_reported': <ANY>,
'last_updated': <ANY>,
'state': '2023-10-21T00:20:00+00:00',
})
# ---
# name: test_bus_sensor[sensor.m15_1_av_e_79_st_third_arrival_destination-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
'config_subentry_id': <ANY>,
'device_class': None,
'device_id': <ANY>,
'disabled_by': None,
'domain': 'sensor',
'entity_category': None,
'entity_id': 'sensor.m15_1_av_e_79_st_third_arrival_destination',
'has_entity_name': True,
'hidden_by': None,
'icon': None,
'id': <ANY>,
'labels': set({
}),
'name': None,
'object_id_base': 'Third arrival destination',
'options': dict({
}),
'original_device_class': None,
'original_icon': None,
'original_name': 'Third arrival destination',
'platform': 'mta',
'previous_unique_id': None,
'suggested_object_id': None,
'supported_features': 0,
'translation_key': 'third_arrival_destination',
'unique_id': 'bus_M15_400561-third_arrival_destination',
'unit_of_measurement': None,
})
# ---
# name: test_bus_sensor[sensor.m15_1_av_e_79_st_third_arrival_destination-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'friendly_name': 'M15 - 1 Av/E 79 St Third arrival destination',
}),
'context': <ANY>,
'entity_id': 'sensor.m15_1_av_e_79_st_third_arrival_destination',
'last_changed': <ANY>,
'last_reported': <ANY>,
'last_updated': <ANY>,
'state': 'South Ferry',
})
# ---
# name: test_bus_sensor[sensor.m15_1_av_e_79_st_third_arrival_route-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
'config_subentry_id': <ANY>,
'device_class': None,
'device_id': <ANY>,
'disabled_by': None,
'domain': 'sensor',
'entity_category': None,
'entity_id': 'sensor.m15_1_av_e_79_st_third_arrival_route',
'has_entity_name': True,
'hidden_by': None,
'icon': None,
'id': <ANY>,
'labels': set({
}),
'name': None,
'object_id_base': 'Third arrival route',
'options': dict({
}),
'original_device_class': None,
'original_icon': None,
'original_name': 'Third arrival route',
'platform': 'mta',
'previous_unique_id': None,
'suggested_object_id': None,
'supported_features': 0,
'translation_key': 'third_arrival_route',
'unique_id': 'bus_M15_400561-third_arrival_route',
'unit_of_measurement': None,
})
# ---
# name: test_bus_sensor[sensor.m15_1_av_e_79_st_third_arrival_route-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'friendly_name': 'M15 - 1 Av/E 79 St Third arrival route',
}),
'context': <ANY>,
'entity_id': 'sensor.m15_1_av_e_79_st_third_arrival_route',
'last_changed': <ANY>,
'last_reported': <ANY>,
'last_updated': <ANY>,
'state': 'M15',
})
# ---
# name: test_subway_sensor[sensor.1_times_sq_42_st_n_direction_next_arrival-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
'config_subentry_id': <ANY>,
'device_class': None,
'device_id': <ANY>,
'disabled_by': None,
'domain': 'sensor',
'entity_category': None,
'entity_id': 'sensor.1_times_sq_42_st_n_direction_next_arrival',
'has_entity_name': True,
'hidden_by': None,
'icon': None,
@@ -35,21 +479,21 @@
'unit_of_measurement': None,
})
# ---
# name: test_sensor[sensor.1_line_times_sq_42_st_n_direction_127n_next_arrival-state]
# name: test_subway_sensor[sensor.1_times_sq_42_st_n_direction_next_arrival-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'device_class': 'timestamp',
'friendly_name': '1 Line - Times Sq - 42 St (N direction) (127N) Next arrival',
'friendly_name': '1 - Times Sq - 42 St (N direction) Next arrival',
}),
'context': <ANY>,
'entity_id': 'sensor.1_line_times_sq_42_st_n_direction_127n_next_arrival',
'entity_id': 'sensor.1_times_sq_42_st_n_direction_next_arrival',
'last_changed': <ANY>,
'last_reported': <ANY>,
'last_updated': <ANY>,
'state': '2023-10-21T00:05:00+00:00',
})
# ---
# name: test_sensor[sensor.1_line_times_sq_42_st_n_direction_127n_next_arrival_destination-entry]
# name: test_subway_sensor[sensor.1_times_sq_42_st_n_direction_next_arrival_destination-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
@@ -62,7 +506,7 @@
'disabled_by': None,
'domain': 'sensor',
'entity_category': None,
'entity_id': 'sensor.1_line_times_sq_42_st_n_direction_127n_next_arrival_destination',
'entity_id': 'sensor.1_times_sq_42_st_n_direction_next_arrival_destination',
'has_entity_name': True,
'hidden_by': None,
'icon': None,
@@ -85,20 +529,20 @@
'unit_of_measurement': None,
})
# ---
# name: test_sensor[sensor.1_line_times_sq_42_st_n_direction_127n_next_arrival_destination-state]
# name: test_subway_sensor[sensor.1_times_sq_42_st_n_direction_next_arrival_destination-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'friendly_name': '1 Line - Times Sq - 42 St (N direction) (127N) Next arrival destination',
'friendly_name': '1 - Times Sq - 42 St (N direction) Next arrival destination',
}),
'context': <ANY>,
'entity_id': 'sensor.1_line_times_sq_42_st_n_direction_127n_next_arrival_destination',
'entity_id': 'sensor.1_times_sq_42_st_n_direction_next_arrival_destination',
'last_changed': <ANY>,
'last_reported': <ANY>,
'last_updated': <ANY>,
'state': 'Van Cortlandt Park - 242 St',
})
# ---
# name: test_sensor[sensor.1_line_times_sq_42_st_n_direction_127n_next_arrival_route-entry]
# name: test_subway_sensor[sensor.1_times_sq_42_st_n_direction_next_arrival_route-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
@@ -111,7 +555,7 @@
'disabled_by': None,
'domain': 'sensor',
'entity_category': None,
'entity_id': 'sensor.1_line_times_sq_42_st_n_direction_127n_next_arrival_route',
'entity_id': 'sensor.1_times_sq_42_st_n_direction_next_arrival_route',
'has_entity_name': True,
'hidden_by': None,
'icon': None,
@@ -134,20 +578,20 @@
'unit_of_measurement': None,
})
# ---
# name: test_sensor[sensor.1_line_times_sq_42_st_n_direction_127n_next_arrival_route-state]
# name: test_subway_sensor[sensor.1_times_sq_42_st_n_direction_next_arrival_route-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'friendly_name': '1 Line - Times Sq - 42 St (N direction) (127N) Next arrival route',
'friendly_name': '1 - Times Sq - 42 St (N direction) Next arrival route',
}),
'context': <ANY>,
'entity_id': 'sensor.1_line_times_sq_42_st_n_direction_127n_next_arrival_route',
'entity_id': 'sensor.1_times_sq_42_st_n_direction_next_arrival_route',
'last_changed': <ANY>,
'last_reported': <ANY>,
'last_updated': <ANY>,
'state': '1',
})
# ---
# name: test_sensor[sensor.1_line_times_sq_42_st_n_direction_127n_second_arrival-entry]
# name: test_subway_sensor[sensor.1_times_sq_42_st_n_direction_second_arrival-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
@@ -160,7 +604,7 @@
'disabled_by': None,
'domain': 'sensor',
'entity_category': None,
'entity_id': 'sensor.1_line_times_sq_42_st_n_direction_127n_second_arrival',
'entity_id': 'sensor.1_times_sq_42_st_n_direction_second_arrival',
'has_entity_name': True,
'hidden_by': None,
'icon': None,
@@ -183,21 +627,21 @@
'unit_of_measurement': None,
})
# ---
# name: test_sensor[sensor.1_line_times_sq_42_st_n_direction_127n_second_arrival-state]
# name: test_subway_sensor[sensor.1_times_sq_42_st_n_direction_second_arrival-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'device_class': 'timestamp',
'friendly_name': '1 Line - Times Sq - 42 St (N direction) (127N) Second arrival',
'friendly_name': '1 - Times Sq - 42 St (N direction) Second arrival',
}),
'context': <ANY>,
'entity_id': 'sensor.1_line_times_sq_42_st_n_direction_127n_second_arrival',
'entity_id': 'sensor.1_times_sq_42_st_n_direction_second_arrival',
'last_changed': <ANY>,
'last_reported': <ANY>,
'last_updated': <ANY>,
'state': '2023-10-21T00:10:00+00:00',
})
# ---
# name: test_sensor[sensor.1_line_times_sq_42_st_n_direction_127n_second_arrival_destination-entry]
# name: test_subway_sensor[sensor.1_times_sq_42_st_n_direction_second_arrival_destination-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
@@ -210,7 +654,7 @@
'disabled_by': None,
'domain': 'sensor',
'entity_category': None,
'entity_id': 'sensor.1_line_times_sq_42_st_n_direction_127n_second_arrival_destination',
'entity_id': 'sensor.1_times_sq_42_st_n_direction_second_arrival_destination',
'has_entity_name': True,
'hidden_by': None,
'icon': None,
@@ -233,20 +677,20 @@
'unit_of_measurement': None,
})
# ---
# name: test_sensor[sensor.1_line_times_sq_42_st_n_direction_127n_second_arrival_destination-state]
# name: test_subway_sensor[sensor.1_times_sq_42_st_n_direction_second_arrival_destination-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'friendly_name': '1 Line - Times Sq - 42 St (N direction) (127N) Second arrival destination',
'friendly_name': '1 - Times Sq - 42 St (N direction) Second arrival destination',
}),
'context': <ANY>,
'entity_id': 'sensor.1_line_times_sq_42_st_n_direction_127n_second_arrival_destination',
'entity_id': 'sensor.1_times_sq_42_st_n_direction_second_arrival_destination',
'last_changed': <ANY>,
'last_reported': <ANY>,
'last_updated': <ANY>,
'state': 'Van Cortlandt Park - 242 St',
})
# ---
# name: test_sensor[sensor.1_line_times_sq_42_st_n_direction_127n_second_arrival_route-entry]
# name: test_subway_sensor[sensor.1_times_sq_42_st_n_direction_second_arrival_route-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
@@ -259,7 +703,7 @@
'disabled_by': None,
'domain': 'sensor',
'entity_category': None,
'entity_id': 'sensor.1_line_times_sq_42_st_n_direction_127n_second_arrival_route',
'entity_id': 'sensor.1_times_sq_42_st_n_direction_second_arrival_route',
'has_entity_name': True,
'hidden_by': None,
'icon': None,
@@ -282,20 +726,20 @@
'unit_of_measurement': None,
})
# ---
# name: test_sensor[sensor.1_line_times_sq_42_st_n_direction_127n_second_arrival_route-state]
# name: test_subway_sensor[sensor.1_times_sq_42_st_n_direction_second_arrival_route-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'friendly_name': '1 Line - Times Sq - 42 St (N direction) (127N) Second arrival route',
'friendly_name': '1 - Times Sq - 42 St (N direction) Second arrival route',
}),
'context': <ANY>,
'entity_id': 'sensor.1_line_times_sq_42_st_n_direction_127n_second_arrival_route',
'entity_id': 'sensor.1_times_sq_42_st_n_direction_second_arrival_route',
'last_changed': <ANY>,
'last_reported': <ANY>,
'last_updated': <ANY>,
'state': '1',
})
# ---
# name: test_sensor[sensor.1_line_times_sq_42_st_n_direction_127n_third_arrival-entry]
# name: test_subway_sensor[sensor.1_times_sq_42_st_n_direction_third_arrival-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
@@ -308,7 +752,7 @@
'disabled_by': None,
'domain': 'sensor',
'entity_category': None,
'entity_id': 'sensor.1_line_times_sq_42_st_n_direction_127n_third_arrival',
'entity_id': 'sensor.1_times_sq_42_st_n_direction_third_arrival',
'has_entity_name': True,
'hidden_by': None,
'icon': None,
@@ -331,21 +775,21 @@
'unit_of_measurement': None,
})
# ---
# name: test_sensor[sensor.1_line_times_sq_42_st_n_direction_127n_third_arrival-state]
# name: test_subway_sensor[sensor.1_times_sq_42_st_n_direction_third_arrival-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'device_class': 'timestamp',
'friendly_name': '1 Line - Times Sq - 42 St (N direction) (127N) Third arrival',
'friendly_name': '1 - Times Sq - 42 St (N direction) Third arrival',
}),
'context': <ANY>,
'entity_id': 'sensor.1_line_times_sq_42_st_n_direction_127n_third_arrival',
'entity_id': 'sensor.1_times_sq_42_st_n_direction_third_arrival',
'last_changed': <ANY>,
'last_reported': <ANY>,
'last_updated': <ANY>,
'state': '2023-10-21T00:15:00+00:00',
})
# ---
# name: test_sensor[sensor.1_line_times_sq_42_st_n_direction_127n_third_arrival_destination-entry]
# name: test_subway_sensor[sensor.1_times_sq_42_st_n_direction_third_arrival_destination-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
@@ -358,7 +802,7 @@
'disabled_by': None,
'domain': 'sensor',
'entity_category': None,
'entity_id': 'sensor.1_line_times_sq_42_st_n_direction_127n_third_arrival_destination',
'entity_id': 'sensor.1_times_sq_42_st_n_direction_third_arrival_destination',
'has_entity_name': True,
'hidden_by': None,
'icon': None,
@@ -381,20 +825,20 @@
'unit_of_measurement': None,
})
# ---
# name: test_sensor[sensor.1_line_times_sq_42_st_n_direction_127n_third_arrival_destination-state]
# name: test_subway_sensor[sensor.1_times_sq_42_st_n_direction_third_arrival_destination-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'friendly_name': '1 Line - Times Sq - 42 St (N direction) (127N) Third arrival destination',
'friendly_name': '1 - Times Sq - 42 St (N direction) Third arrival destination',
}),
'context': <ANY>,
'entity_id': 'sensor.1_line_times_sq_42_st_n_direction_127n_third_arrival_destination',
'entity_id': 'sensor.1_times_sq_42_st_n_direction_third_arrival_destination',
'last_changed': <ANY>,
'last_reported': <ANY>,
'last_updated': <ANY>,
'state': 'Van Cortlandt Park - 242 St',
})
# ---
# name: test_sensor[sensor.1_line_times_sq_42_st_n_direction_127n_third_arrival_route-entry]
# name: test_subway_sensor[sensor.1_times_sq_42_st_n_direction_third_arrival_route-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
@@ -407,7 +851,7 @@
'disabled_by': None,
'domain': 'sensor',
'entity_category': None,
'entity_id': 'sensor.1_line_times_sq_42_st_n_direction_127n_third_arrival_route',
'entity_id': 'sensor.1_times_sq_42_st_n_direction_third_arrival_route',
'has_entity_name': True,
'hidden_by': None,
'icon': None,
@@ -430,13 +874,13 @@
'unit_of_measurement': None,
})
# ---
# name: test_sensor[sensor.1_line_times_sq_42_st_n_direction_127n_third_arrival_route-state]
# name: test_subway_sensor[sensor.1_times_sq_42_st_n_direction_third_arrival_route-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'friendly_name': '1 Line - Times Sq - 42 St (N direction) (127N) Third arrival route',
'friendly_name': '1 - Times Sq - 42 St (N direction) Third arrival route',
}),
'context': <ANY>,
'entity_id': 'sensor.1_line_times_sq_42_st_n_direction_127n_third_arrival_route',
'entity_id': 'sensor.1_times_sq_42_st_n_direction_third_arrival_route',
'last_changed': <ANY>,
'last_reported': <ANY>,
'last_updated': <ANY>,


@@ -3,159 +3,522 @@
from unittest.mock import AsyncMock, MagicMock
from pymta import MTAFeedError
import pytest
from homeassistant.components.mta.const import (
CONF_LINE,
CONF_ROUTE,
CONF_STOP_ID,
CONF_STOP_NAME,
DOMAIN,
SUBENTRY_TYPE_BUS,
SUBENTRY_TYPE_SUBWAY,
)
from homeassistant.config_entries import SOURCE_USER
from homeassistant.const import CONF_API_KEY
from homeassistant.core import HomeAssistant
from homeassistant.data_entry_flow import FlowResultType
from tests.common import MockConfigEntry
async def test_form(
async def test_main_entry_flow_without_token(
hass: HomeAssistant,
mock_subway_feed: MagicMock,
mock_setup_entry: AsyncMock,
) -> None:
"""Test the complete config flow."""
# Start the flow
"""Test the main config flow without API key."""
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": SOURCE_USER}
)
assert result["type"] is FlowResultType.FORM
assert result["step_id"] == "user"
assert result["errors"] == {}
# Select line
result = await hass.config_entries.flow.async_configure(
result["flow_id"], {CONF_LINE: "1"}
)
assert result["type"] is FlowResultType.FORM
assert result["step_id"] == "stop"
assert result["errors"] == {}
# Select stop and complete
result = await hass.config_entries.flow.async_configure(
result["flow_id"],
{CONF_STOP_ID: "127N"},
)
result = await hass.config_entries.flow.async_configure(result["flow_id"], {})
assert result["type"] is FlowResultType.CREATE_ENTRY
assert result["title"] == "1 Line - Times Sq - 42 St (N direction)"
assert result["data"] == {
CONF_LINE: "1",
CONF_STOP_ID: "127N",
CONF_STOP_NAME: "Times Sq - 42 St (N direction)",
}
assert result["result"].unique_id == "1_127N"
assert result["title"] == "MTA"
assert result["data"] == {CONF_API_KEY: None}
assert len(mock_setup_entry.mock_calls) == 1
async def test_form_already_configured(
async def test_main_entry_flow_with_token(
hass: HomeAssistant,
mock_setup_entry: AsyncMock,
mock_bus_feed: MagicMock,
) -> None:
"""Test the main config flow with API key."""
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": SOURCE_USER}
)
assert result["type"] is FlowResultType.FORM
assert result["step_id"] == "user"
result = await hass.config_entries.flow.async_configure(
result["flow_id"], {CONF_API_KEY: "test_api_key"}
)
assert result["type"] is FlowResultType.CREATE_ENTRY
assert result["title"] == "MTA"
assert result["data"] == {CONF_API_KEY: "test_api_key"}
assert len(mock_setup_entry.mock_calls) == 1
async def test_main_entry_already_configured(
hass: HomeAssistant,
mock_subway_feed: MagicMock,
mock_config_entry: MockConfigEntry,
) -> None:
"""Test we handle already configured."""
"""Test we abort if MTA is already configured."""
mock_config_entry.add_to_hass(hass)
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": SOURCE_USER}
)
result = await hass.config_entries.flow.async_configure(
result["flow_id"],
{CONF_LINE: "1"},
)
assert result["type"] is FlowResultType.FORM
assert result["step_id"] == "user"
result = await hass.config_entries.flow.async_configure(result["flow_id"], {})
assert result["type"] is FlowResultType.ABORT
assert result["reason"] == "already_configured"
async def test_reauth_flow(
hass: HomeAssistant,
mock_config_entry: MockConfigEntry,
mock_bus_feed: MagicMock,
) -> None:
"""Test the reauth flow."""
mock_config_entry.add_to_hass(hass)
result = await mock_config_entry.start_reauth_flow(hass)
assert result["type"] is FlowResultType.FORM
assert result["step_id"] == "user"
result = await hass.config_entries.flow.async_configure(
result["flow_id"],
{CONF_STOP_ID: "127N"},
result["flow_id"], {CONF_API_KEY: "new_api_key"}
)
assert result["type"] is FlowResultType.ABORT
assert result["reason"] == "reauth_successful"
assert mock_config_entry.data[CONF_API_KEY] == "new_api_key"
@pytest.mark.parametrize(
("side_effect", "expected_error"),
[
(MTAFeedError("Connection error"), "cannot_connect"),
(RuntimeError("Unexpected error"), "unknown"),
],
)
async def test_reauth_flow_errors(
hass: HomeAssistant,
mock_config_entry: MockConfigEntry,
mock_bus_feed: MagicMock,
side_effect: Exception,
expected_error: str,
) -> None:
"""Test the reauth flow with connection error."""
mock_config_entry.add_to_hass(hass)
mock_bus_feed.return_value.get_stops.side_effect = side_effect
result = await mock_config_entry.start_reauth_flow(hass)
assert result["type"] is FlowResultType.FORM
assert result["step_id"] == "user"
result = await hass.config_entries.flow.async_configure(
result["flow_id"], {CONF_API_KEY: "bad_api_key"}
)
assert result["type"] is FlowResultType.FORM
assert result["errors"] == {"base": expected_error}
mock_bus_feed.return_value.get_stops.side_effect = None
result = await hass.config_entries.flow.async_configure(
result["flow_id"], {CONF_API_KEY: "api_key"}
)
assert result["type"] is FlowResultType.ABORT
assert result["reason"] == "reauth_successful"
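The recover-after-error pattern these parametrized reauth tests rely on comes down to toggling a mock's `side_effect`. A minimal stand-alone sketch, with a hypothetical `FeedError` standing in for pymta's `MTAFeedError`:

```python
from unittest.mock import MagicMock


class FeedError(Exception):
    """Hypothetical stand-in for pymta's MTAFeedError."""


feed = MagicMock()
feed.get_stops.side_effect = FeedError("Connection error")

# First attempt raises, which is what drives the {"base": expected_error} branch.
try:
    feed.get_stops()
    failed = False
except FeedError:
    failed = True
assert failed

# Clearing side_effect lets the very same mock succeed on retry,
# so the flow can finish with reauth_successful.
feed.get_stops.side_effect = None
feed.get_stops.return_value = ["stop"]
assert feed.get_stops() == ["stop"]
```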
# Subway subentry tests
async def test_subway_subentry_flow(
hass: HomeAssistant,
mock_config_entry: MockConfigEntry,
mock_subway_feed: MagicMock,
) -> None:
"""Test the subway subentry flow."""
mock_config_entry.add_to_hass(hass)
await hass.config_entries.async_setup(mock_config_entry.entry_id)
await hass.async_block_till_done()
result = await hass.config_entries.subentries.async_init(
(mock_config_entry.entry_id, SUBENTRY_TYPE_SUBWAY),
context={"source": SOURCE_USER},
)
assert result["type"] is FlowResultType.FORM
assert result["step_id"] == "user"
result = await hass.config_entries.subentries.async_configure(
result["flow_id"], {CONF_LINE: "1"}
)
assert result["type"] is FlowResultType.FORM
assert result["step_id"] == "stop"
result = await hass.config_entries.subentries.async_configure(
result["flow_id"], {CONF_STOP_ID: "127N"}
)
assert result["type"] is FlowResultType.CREATE_ENTRY
assert result["title"] == "1 - Times Sq - 42 St (N direction)"
assert result["data"] == {
CONF_LINE: "1",
CONF_STOP_ID: "127N",
CONF_STOP_NAME: "Times Sq - 42 St (N direction)",
}
async def test_subway_subentry_already_configured(
hass: HomeAssistant,
mock_config_entry_with_subway_subentry: MockConfigEntry,
mock_subway_feed: MagicMock,
) -> None:
"""Test subway subentry already configured."""
mock_config_entry_with_subway_subentry.add_to_hass(hass)
await hass.config_entries.async_setup(
mock_config_entry_with_subway_subentry.entry_id
)
await hass.async_block_till_done()
result = await hass.config_entries.subentries.async_init(
(mock_config_entry_with_subway_subentry.entry_id, SUBENTRY_TYPE_SUBWAY),
context={"source": SOURCE_USER},
)
result = await hass.config_entries.subentries.async_configure(
result["flow_id"], {CONF_LINE: "1"}
)
result = await hass.config_entries.subentries.async_configure(
result["flow_id"], {CONF_STOP_ID: "127N"}
)
assert result["type"] is FlowResultType.ABORT
assert result["reason"] == "already_configured"
async def test_form_connection_error(
async def test_subway_subentry_connection_error(
hass: HomeAssistant,
mock_config_entry: MockConfigEntry,
mock_subway_feed: MagicMock,
mock_setup_entry: AsyncMock,
) -> None:
"""Test we handle connection errors and can recover."""
mock_instance = mock_subway_feed.return_value
mock_instance.get_arrivals.side_effect = MTAFeedError("Connection error")
"""Test subway subentry flow with connection error."""
mock_config_entry.add_to_hass(hass)
await hass.config_entries.async_setup(mock_config_entry.entry_id)
await hass.async_block_till_done()
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": SOURCE_USER}
mock_subway_feed.return_value.get_arrivals.side_effect = MTAFeedError(
"Connection error"
)
result = await hass.config_entries.flow.async_configure(
result["flow_id"],
{CONF_LINE: "1"},
result = await hass.config_entries.subentries.async_init(
(mock_config_entry.entry_id, SUBENTRY_TYPE_SUBWAY),
context={"source": SOURCE_USER},
)
result = await hass.config_entries.flow.async_configure(
result["flow_id"],
{CONF_STOP_ID: "127S"},
result = await hass.config_entries.subentries.async_configure(
result["flow_id"], {CONF_LINE: "1"}
)
result = await hass.config_entries.subentries.async_configure(
result["flow_id"], {CONF_STOP_ID: "127N"}
)
assert result["type"] is FlowResultType.FORM
assert result["errors"] == {"base": "cannot_connect"}
# Test recovery - reset mock to succeed
mock_instance.get_arrivals.side_effect = None
mock_instance.get_arrivals.return_value = []
result = await hass.config_entries.flow.async_configure(
result["flow_id"],
{CONF_STOP_ID: "127S"},
)
assert result["type"] is FlowResultType.CREATE_ENTRY
assert len(mock_setup_entry.mock_calls) == 1
async def test_form_cannot_get_stops(
hass: HomeAssistant, mock_subway_feed: MagicMock
async def test_subway_subentry_cannot_get_stops(
hass: HomeAssistant,
mock_config_entry: MockConfigEntry,
mock_subway_feed: MagicMock,
) -> None:
"""Test we abort when we cannot get stops."""
mock_instance = mock_subway_feed.return_value
mock_instance.get_stops.side_effect = MTAFeedError("Feed error")
"""Test subway subentry flow when cannot get stops."""
mock_config_entry.add_to_hass(hass)
await hass.config_entries.async_setup(mock_config_entry.entry_id)
await hass.async_block_till_done()
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": SOURCE_USER}
mock_subway_feed.return_value.get_stops.side_effect = MTAFeedError("Feed error")
result = await hass.config_entries.subentries.async_init(
(mock_config_entry.entry_id, SUBENTRY_TYPE_SUBWAY),
context={"source": SOURCE_USER},
)
result = await hass.config_entries.flow.async_configure(
result["flow_id"],
{CONF_LINE: "1"},
result = await hass.config_entries.subentries.async_configure(
result["flow_id"], {CONF_LINE: "1"}
)
assert result["type"] is FlowResultType.ABORT
assert result["reason"] == "cannot_connect"
async def test_form_no_stops_found(
hass: HomeAssistant, mock_subway_feed: MagicMock
async def test_subway_subentry_no_stops_found(
hass: HomeAssistant,
mock_config_entry: MockConfigEntry,
mock_subway_feed: MagicMock,
) -> None:
"""Test we abort when no stops are found."""
mock_instance = mock_subway_feed.return_value
mock_instance.get_stops.return_value = []
"""Test subway subentry flow when no stops are found."""
mock_config_entry.add_to_hass(hass)
await hass.config_entries.async_setup(mock_config_entry.entry_id)
await hass.async_block_till_done()
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": SOURCE_USER}
mock_subway_feed.return_value.get_stops.return_value = []
result = await hass.config_entries.subentries.async_init(
(mock_config_entry.entry_id, SUBENTRY_TYPE_SUBWAY),
context={"source": SOURCE_USER},
)
result = await hass.config_entries.flow.async_configure(
result["flow_id"],
{CONF_LINE: "1"},
result = await hass.config_entries.subentries.async_configure(
result["flow_id"], {CONF_LINE: "1"}
)
assert result["type"] is FlowResultType.ABORT
assert result["reason"] == "no_stops"
# Bus subentry tests
async def test_bus_subentry_flow(
hass: HomeAssistant,
mock_config_entry_with_api_key: MockConfigEntry,
mock_bus_feed: MagicMock,
) -> None:
"""Test the bus subentry flow."""
mock_config_entry_with_api_key.add_to_hass(hass)
await hass.config_entries.async_setup(mock_config_entry_with_api_key.entry_id)
await hass.async_block_till_done()
result = await hass.config_entries.subentries.async_init(
(mock_config_entry_with_api_key.entry_id, SUBENTRY_TYPE_BUS),
context={"source": SOURCE_USER},
)
assert result["type"] is FlowResultType.FORM
assert result["step_id"] == "user"
result = await hass.config_entries.subentries.async_configure(
result["flow_id"], {CONF_ROUTE: "M15"}
)
assert result["type"] is FlowResultType.FORM
assert result["step_id"] == "stop"
result = await hass.config_entries.subentries.async_configure(
result["flow_id"], {CONF_STOP_ID: "400561"}
)
assert result["type"] is FlowResultType.CREATE_ENTRY
assert result["title"] == "M15 - 1 Av/E 79 St"
assert result["data"] == {
CONF_ROUTE: "M15",
CONF_STOP_ID: "400561",
CONF_STOP_NAME: "1 Av/E 79 St",
}
async def test_bus_subentry_flow_without_token(
hass: HomeAssistant,
mock_config_entry: MockConfigEntry,
mock_bus_feed: MagicMock,
) -> None:
"""Test the bus subentry flow without API token (space workaround)."""
mock_config_entry.add_to_hass(hass)
await hass.config_entries.async_setup(mock_config_entry.entry_id)
await hass.async_block_till_done()
result = await hass.config_entries.subentries.async_init(
(mock_config_entry.entry_id, SUBENTRY_TYPE_BUS),
context={"source": SOURCE_USER},
)
assert result["type"] is FlowResultType.FORM
assert result["step_id"] == "user"
result = await hass.config_entries.subentries.async_configure(
result["flow_id"], {CONF_ROUTE: "M15"}
)
assert result["type"] is FlowResultType.FORM
assert result["step_id"] == "stop"
result = await hass.config_entries.subentries.async_configure(
result["flow_id"], {CONF_STOP_ID: "400561"}
)
assert result["type"] is FlowResultType.CREATE_ENTRY
async def test_bus_subentry_already_configured(
hass: HomeAssistant,
mock_config_entry_with_bus_subentry: MockConfigEntry,
mock_bus_feed: MagicMock,
) -> None:
"""Test bus subentry already configured."""
mock_config_entry_with_bus_subentry.add_to_hass(hass)
await hass.config_entries.async_setup(mock_config_entry_with_bus_subentry.entry_id)
await hass.async_block_till_done()
result = await hass.config_entries.subentries.async_init(
(mock_config_entry_with_bus_subentry.entry_id, SUBENTRY_TYPE_BUS),
context={"source": SOURCE_USER},
)
result = await hass.config_entries.subentries.async_configure(
result["flow_id"], {CONF_ROUTE: "M15"}
)
result = await hass.config_entries.subentries.async_configure(
result["flow_id"], {CONF_STOP_ID: "400561"}
)
assert result["type"] is FlowResultType.ABORT
assert result["reason"] == "already_configured"
async def test_bus_subentry_invalid_route(
hass: HomeAssistant,
mock_config_entry_with_api_key: MockConfigEntry,
mock_bus_feed: MagicMock,
) -> None:
"""Test bus subentry flow with invalid route."""
mock_config_entry_with_api_key.add_to_hass(hass)
await hass.config_entries.async_setup(mock_config_entry_with_api_key.entry_id)
await hass.async_block_till_done()
mock_bus_feed.return_value.get_stops.return_value = []
result = await hass.config_entries.subentries.async_init(
(mock_config_entry_with_api_key.entry_id, SUBENTRY_TYPE_BUS),
context={"source": SOURCE_USER},
)
result = await hass.config_entries.subentries.async_configure(
result["flow_id"], {CONF_ROUTE: "INVALID"}
)
assert result["type"] is FlowResultType.FORM
assert result["errors"] == {"base": "invalid_route"}
async def test_bus_subentry_route_fetch_error(
hass: HomeAssistant,
mock_config_entry_with_api_key: MockConfigEntry,
mock_bus_feed: MagicMock,
) -> None:
"""Test bus subentry flow when route fetch fails (treated as invalid route)."""
mock_config_entry_with_api_key.add_to_hass(hass)
await hass.config_entries.async_setup(mock_config_entry_with_api_key.entry_id)
await hass.async_block_till_done()
mock_bus_feed.return_value.get_stops.side_effect = MTAFeedError("Connection error")
result = await hass.config_entries.subentries.async_init(
(mock_config_entry_with_api_key.entry_id, SUBENTRY_TYPE_BUS),
context={"source": SOURCE_USER},
)
result = await hass.config_entries.subentries.async_configure(
result["flow_id"], {CONF_ROUTE: "M15"}
)
assert result["type"] is FlowResultType.FORM
assert result["errors"] == {"base": "invalid_route"}
mock_bus_feed.return_value.get_stops.side_effect = None
result = await hass.config_entries.subentries.async_configure(
result["flow_id"], {CONF_ROUTE: "M15"}
)
result = await hass.config_entries.subentries.async_configure(
result["flow_id"], {CONF_STOP_ID: "400561"}
)
assert result["type"] is FlowResultType.CREATE_ENTRY
async def test_bus_subentry_connection_test_error(
hass: HomeAssistant,
mock_config_entry_with_api_key: MockConfigEntry,
mock_bus_feed: MagicMock,
) -> None:
"""Test bus subentry flow when connection test fails after route validation."""
mock_config_entry_with_api_key.add_to_hass(hass)
await hass.config_entries.async_setup(mock_config_entry_with_api_key.entry_id)
await hass.async_block_till_done()
# get_stops succeeds but get_arrivals fails
mock_bus_feed.return_value.get_arrivals.side_effect = MTAFeedError(
"Connection error"
)
result = await hass.config_entries.subentries.async_init(
(mock_config_entry_with_api_key.entry_id, SUBENTRY_TYPE_BUS),
context={"source": SOURCE_USER},
)
result = await hass.config_entries.subentries.async_configure(
result["flow_id"], {CONF_ROUTE: "M15"}
)
assert result["type"] is FlowResultType.FORM
assert result["step_id"] == "stop"
result = await hass.config_entries.subentries.async_configure(
result["flow_id"], {CONF_STOP_ID: "400561"}
)
assert result["type"] is FlowResultType.FORM
assert result["errors"] == {"base": "cannot_connect"}
mock_bus_feed.return_value.get_arrivals.side_effect = None
result = await hass.config_entries.subentries.async_configure(
result["flow_id"], {CONF_STOP_ID: "400561"}
)
assert result["type"] is FlowResultType.CREATE_ENTRY
async def test_bus_subentry_with_direction(
hass: HomeAssistant,
mock_config_entry_with_api_key: MockConfigEntry,
mock_bus_feed_with_direction: MagicMock,
) -> None:
"""Test bus subentry flow shows direction for stops."""
mock_config_entry_with_api_key.add_to_hass(hass)
await hass.config_entries.async_setup(mock_config_entry_with_api_key.entry_id)
await hass.async_block_till_done()
result = await hass.config_entries.subentries.async_init(
(mock_config_entry_with_api_key.entry_id, SUBENTRY_TYPE_BUS),
context={"source": SOURCE_USER},
)
result = await hass.config_entries.subentries.async_configure(
result["flow_id"], {CONF_ROUTE: "M15"}
)
assert result["type"] is FlowResultType.FORM
assert result["step_id"] == "stop"
# Select a stop with direction info
result = await hass.config_entries.subentries.async_configure(
result["flow_id"], {CONF_STOP_ID: "400561"}
)
assert result["type"] is FlowResultType.CREATE_ENTRY
# Stop name should include direction
assert result["title"] == "M15 - 1 Av/E 79 St (to South Ferry)"
assert result["data"][CONF_STOP_NAME] == "1 Av/E 79 St (to South Ferry)"
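The `already_configured` aborts asserted throughout these subentry flows amount to a per-subentry unique-ID check. A reduced sketch of that idea, using the `1_127N` ID seen earlier; the function name, set, and the bus ID format are hypothetical:

```python
configured: set[str] = {"1_127N"}  # e.g. an existing subway subentry


def add_subentry(line: str, stop_id: str) -> str:
    """Return the flow outcome for a (line, stop) pair."""
    unique_id = f"{line}_{stop_id}"
    if unique_id in configured:
        return "abort: already_configured"
    configured.add(unique_id)
    return "create_entry"


# A duplicate of the existing subentry aborts; a new stop creates an entry.
assert add_subentry("1", "127N") == "abort: already_configured"
assert add_subentry("M15", "400561") == "create_entry"
```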


@@ -1,20 +1,29 @@
"""Test the MTA New York City Transit init."""
from types import MappingProxyType
from unittest.mock import MagicMock
from homeassistant.components.mta.const import DOMAIN
from homeassistant.config_entries import ConfigEntryState
from pymta import MTAFeedError
import pytest
from homeassistant.components.mta.const import (
CONF_LINE,
CONF_STOP_ID,
CONF_STOP_NAME,
DOMAIN,
)
from homeassistant.config_entries import ConfigEntryState, ConfigSubentry
from homeassistant.core import HomeAssistant
from homeassistant.helpers import entity_registry as er
from tests.common import MockConfigEntry
async def test_setup_and_unload_entry(
async def test_setup_entry_no_subentries(
hass: HomeAssistant,
mock_config_entry: MockConfigEntry,
mock_subway_feed: MagicMock,
) -> None:
"""Test setting up and unloading an entry."""
"""Test setting up an entry without subentries."""
mock_config_entry.add_to_hass(hass)
assert await hass.config_entries.async_setup(mock_config_entry.entry_id)
@@ -23,7 +32,146 @@ async def test_setup_and_unload_entry(
assert mock_config_entry.state is ConfigEntryState.LOADED
assert DOMAIN in hass.config_entries.async_domains()
assert await hass.config_entries.async_unload(mock_config_entry.entry_id)
async def test_setup_entry_with_subway_subentry(
hass: HomeAssistant,
mock_config_entry_with_subway_subentry: MockConfigEntry,
mock_subway_feed: MagicMock,
) -> None:
"""Test setting up an entry with a subway subentry."""
mock_config_entry_with_subway_subentry.add_to_hass(hass)
assert await hass.config_entries.async_setup(
mock_config_entry_with_subway_subentry.entry_id
)
await hass.async_block_till_done()
assert mock_config_entry.state is ConfigEntryState.NOT_LOADED
assert mock_config_entry_with_subway_subentry.state is ConfigEntryState.LOADED
assert DOMAIN in hass.config_entries.async_domains()
# Verify coordinator was created for the subentry
assert len(mock_config_entry_with_subway_subentry.runtime_data) == 1
async def test_setup_entry_with_bus_subentry(
hass: HomeAssistant,
mock_config_entry_with_bus_subentry: MockConfigEntry,
mock_bus_feed: MagicMock,
) -> None:
"""Test setting up an entry with a bus subentry."""
mock_config_entry_with_bus_subentry.add_to_hass(hass)
assert await hass.config_entries.async_setup(
mock_config_entry_with_bus_subentry.entry_id
)
await hass.async_block_till_done()
assert mock_config_entry_with_bus_subentry.state is ConfigEntryState.LOADED
assert DOMAIN in hass.config_entries.async_domains()
# Verify coordinator was created for the subentry
assert len(mock_config_entry_with_bus_subentry.runtime_data) == 1
async def test_unload_entry(
hass: HomeAssistant,
mock_config_entry_with_subway_subentry: MockConfigEntry,
mock_subway_feed: MagicMock,
) -> None:
"""Test unloading an entry."""
mock_config_entry_with_subway_subentry.add_to_hass(hass)
assert await hass.config_entries.async_setup(
mock_config_entry_with_subway_subentry.entry_id
)
await hass.async_block_till_done()
assert mock_config_entry_with_subway_subentry.state is ConfigEntryState.LOADED
assert await hass.config_entries.async_unload(
mock_config_entry_with_subway_subentry.entry_id
)
await hass.async_block_till_done()
assert mock_config_entry_with_subway_subentry.state is ConfigEntryState.NOT_LOADED
async def test_setup_entry_with_unknown_subentry_type(
hass: HomeAssistant,
mock_config_entry: MockConfigEntry,
) -> None:
"""Test that unknown subentry types are skipped."""
# Add a subentry with an unknown type
unknown_subentry = ConfigSubentry(
data=MappingProxyType(
{
CONF_LINE: "1",
CONF_STOP_ID: "127N",
CONF_STOP_NAME: "Times Sq - 42 St",
}
),
subentry_id="01JUNKNOWN000000000000001",
subentry_type="unknown_type", # Unknown subentry type
title="Unknown Subentry",
unique_id="unknown_1",
)
mock_config_entry.subentries = {unknown_subentry.subentry_id: unknown_subentry}
mock_config_entry.add_to_hass(hass)
assert await hass.config_entries.async_setup(mock_config_entry.entry_id)
await hass.async_block_till_done()
assert mock_config_entry.state is ConfigEntryState.LOADED
# No coordinators should be created for unknown subentry type
assert len(mock_config_entry.runtime_data) == 0
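The `MappingProxyType` wrapper used when building the `ConfigSubentry` above gives a read-only view of the data dict. A quick stdlib illustration of why writes through it fail:

```python
from types import MappingProxyType

data = MappingProxyType({"line": "1", "stop_id": "127N"})
assert data["line"] == "1"

# Assignment through the proxy raises TypeError, keeping subentry data immutable.
try:
    data["line"] = "2"  # type: ignore[index]
    mutated = True
except TypeError:
    mutated = False
assert not mutated
```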
async def test_setup_entry_coordinator_fetch_error(
hass: HomeAssistant,
mock_config_entry_with_subway_subentry: MockConfigEntry,
mock_subway_feed: MagicMock,
) -> None:
"""Test that coordinator raises ConfigEntryNotReady on fetch error."""
mock_subway_feed.return_value.get_arrivals.side_effect = MTAFeedError("API error")
mock_config_entry_with_subway_subentry.add_to_hass(hass)
assert not await hass.config_entries.async_setup(
mock_config_entry_with_subway_subentry.entry_id
)
await hass.async_block_till_done()
assert mock_config_entry_with_subway_subentry.state is ConfigEntryState.SETUP_RETRY
@pytest.mark.freeze_time("2023-10-21")
async def test_sensor_no_arrivals(
hass: HomeAssistant,
mock_config_entry_with_subway_subentry: MockConfigEntry,
mock_subway_feed: MagicMock,
entity_registry: er.EntityRegistry,
) -> None:
"""Test sensor values when there are no arrivals."""
await hass.config.async_set_time_zone("UTC")
# Return empty arrivals list
mock_subway_feed.return_value.get_arrivals.return_value = []
mock_config_entry_with_subway_subentry.add_to_hass(hass)
await hass.config_entries.async_setup(
mock_config_entry_with_subway_subentry.entry_id
)
await hass.async_block_till_done()
# All arrival sensors should have state "unknown" (native_value is None)
state = hass.states.get("sensor.1_times_sq_42_st_n_direction_next_arrival")
assert state is not None
assert state.state == "unknown"
state = hass.states.get("sensor.1_times_sq_42_st_n_direction_second_arrival")
assert state is not None
assert state.state == "unknown"
state = hass.states.get("sensor.1_times_sq_42_st_n_direction_third_arrival")
assert state is not None
assert state.state == "unknown"


@@ -13,18 +13,43 @@ from tests.common import MockConfigEntry, snapshot_platform
@pytest.mark.freeze_time("2023-10-21")
@pytest.mark.usefixtures("entity_registry_enabled_by_default")
async def test_sensor(
async def test_subway_sensor(
hass: HomeAssistant,
mock_config_entry: MockConfigEntry,
mock_config_entry_with_subway_subentry: MockConfigEntry,
mock_subway_feed: MagicMock,
entity_registry: er.EntityRegistry,
snapshot: SnapshotAssertion,
) -> None:
"""Test the sensor entity."""
"""Test the subway sensor entities."""
await hass.config.async_set_time_zone("UTC")
mock_config_entry.add_to_hass(hass)
await hass.config_entries.async_setup(mock_config_entry.entry_id)
mock_config_entry_with_subway_subentry.add_to_hass(hass)
await hass.config_entries.async_setup(
mock_config_entry_with_subway_subentry.entry_id
)
await hass.async_block_till_done()
await snapshot_platform(hass, entity_registry, snapshot, mock_config_entry.entry_id)
await snapshot_platform(
hass, entity_registry, snapshot, mock_config_entry_with_subway_subentry.entry_id
)
@pytest.mark.freeze_time("2023-10-21")
@pytest.mark.usefixtures("entity_registry_enabled_by_default")
async def test_bus_sensor(
hass: HomeAssistant,
mock_config_entry_with_bus_subentry: MockConfigEntry,
mock_bus_feed: MagicMock,
entity_registry: er.EntityRegistry,
snapshot: SnapshotAssertion,
) -> None:
"""Test the bus sensor entities."""
await hass.config.async_set_time_zone("UTC")
mock_config_entry_with_bus_subentry.add_to_hass(hass)
await hass.config_entries.async_setup(mock_config_entry_with_bus_subentry.entry_id)
await hass.async_block_till_done()
await snapshot_platform(
hass, entity_registry, snapshot, mock_config_entry_with_bus_subentry.entry_id
)


@@ -215,10 +215,7 @@ async def test_doortag_opening_status_change(
for _ in range(11):
freezer.tick(timedelta(seconds=30))
async_fire_time_changed(hass)
await hass.async_block_till_done()
await hass.async_block_till_done()
await hass.async_block_till_done()
await hass.async_block_till_done()
await hass.async_block_till_done(wait_background_tasks=True)
# Change mocked status
doortag_entity_id = "12:34:56:00:86:99"
@@ -231,10 +228,7 @@ async def test_doortag_opening_status_change(
for _ in range(11):
freezer.tick(timedelta(seconds=30))
async_fire_time_changed(hass)
await hass.async_block_till_done()
await hass.async_block_till_done()
await hass.async_block_till_done()
await hass.async_block_till_done()
await hass.async_block_till_done(wait_background_tasks=True)
# Check connectivity mocked state
assert hass.states.get(_doortag_entity_connectivity).state == "on"
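The change above swaps four back-to-back `async_block_till_done()` calls for a single `wait_background_tasks=True` wait. The underlying idea is to await the background work itself rather than repeatedly polling until the event loop looks idle; a simplified stdlib analogue (names are illustrative, not Home Assistant internals):

```python
import asyncio


async def main() -> list[str]:
    results: list[str] = []

    async def background() -> None:
        await asyncio.sleep(0)
        results.append("done")

    task = asyncio.create_task(background())
    # Awaiting the task is deterministic; stacking several "settle" waits
    # only works if the task happens to get scheduled in between them.
    await task
    return results


assert asyncio.run(main()) == ["done"]
```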


@@ -89,31 +89,41 @@ def mock_proxmox_client():
qemu_by_vmid = {vm["vmid"]: vm for vm in qemu_list}
lxc_by_vmid = {vm["vmid"]: vm for vm in lxc_list}
# Note to reviewer: I will expand on these fixtures in a next PR
# Necessary evil to handle the binary_sensor tests properly
# Cache resource mocks by vmid so callers (e.g. button tests) can
# inspect specific call counts after pressing a button.
qemu_mocks: dict[int, MagicMock] = {}
lxc_mocks: dict[int, MagicMock] = {}
def _qemu_resource(vmid: int) -> MagicMock:
"""Return a mock resource the QEMU."""
resource = MagicMock()
vm = qemu_by_vmid[vmid]
resource.status.current.get.return_value = {
"name": vm["name"],
"status": vm["status"],
}
return resource
"""Return a cached mock resource for a QEMU VM."""
if vmid not in qemu_mocks:
resource = MagicMock()
vm = qemu_by_vmid[vmid]
resource.status.current.get.return_value = {
"name": vm["name"],
"status": vm["status"],
}
qemu_mocks[vmid] = resource
return qemu_mocks[vmid]
def _lxc_resource(vmid: int) -> MagicMock:
"""Return a mock resource the LXC."""
resource = MagicMock()
ct = lxc_by_vmid[vmid]
resource.status.current.get.return_value = {
"name": ct["name"],
"status": ct["status"],
}
return resource
"""Return a cached mock resource for an LXC container."""
if vmid not in lxc_mocks:
resource = MagicMock()
ct = lxc_by_vmid[vmid]
resource.status.current.get.return_value = {
"name": ct["name"],
"status": ct["status"],
}
lxc_mocks[vmid] = resource
return lxc_mocks[vmid]
node_mock.qemu.side_effect = _qemu_resource
node_mock.lxc.side_effect = _lxc_resource
mock_instance._qemu_mocks = qemu_mocks
mock_instance._lxc_mocks = lxc_mocks
nodes_mock = MagicMock()
nodes_mock.get.return_value = load_json_array_fixture(
"nodes/nodes.json", DOMAIN

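The caching rewrite in this conftest hunk exists so repeated `node_mock.qemu(vmid)` lookups hit one shared mock instead of a fresh one each time. A stand-alone sketch of the same pattern:

```python
from unittest.mock import MagicMock

mocks: dict[int, MagicMock] = {}


def resource(vmid: int) -> MagicMock:
    """Create the mock once, then hand back the cached instance."""
    if vmid not in mocks:
        mocks[vmid] = MagicMock()
    return mocks[vmid]


# Because lookups share one mock, call counts accumulate across callers,
# which is what lets the button tests assert on specific counts.
resource(100).status.start.post()
assert resource(100) is resource(100)
assert len(resource(100).status.start.post.mock_calls) == 1
```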
File diff suppressed because it is too large


@@ -0,0 +1,315 @@
"""Tests for the ProxmoxVE button platform."""
from __future__ import annotations
from unittest.mock import MagicMock, patch
from proxmoxer import AuthenticationError
from proxmoxer.core import ResourceException
import pytest
from requests.exceptions import ConnectTimeout, SSLError
from syrupy.assertion import SnapshotAssertion
from homeassistant.components.button import SERVICE_PRESS
from homeassistant.const import ATTR_ENTITY_ID, Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import entity_registry as er
from . import setup_integration
from tests.common import MockConfigEntry, snapshot_platform
BUTTON_DOMAIN = "button"
@pytest.fixture(autouse=True)
def enable_all_entities(entity_registry_enabled_by_default: None) -> None:
"""Enable all entities for button tests."""
async def test_all_button_entities(
hass: HomeAssistant,
snapshot: SnapshotAssertion,
mock_proxmox_client: MagicMock,
mock_config_entry: MockConfigEntry,
entity_registry: er.EntityRegistry,
) -> None:
"""Snapshot test for all ProxmoxVE button entities."""
with patch(
"homeassistant.components.proxmoxve.PLATFORMS",
[Platform.BUTTON],
):
await setup_integration(hass, mock_config_entry)
await snapshot_platform(
hass, entity_registry, snapshot, mock_config_entry.entry_id
)
@pytest.mark.parametrize(
("entity_id", "command"),
[
("button.pve1_restart", "reboot"),
("button.pve1_shutdown", "shutdown"),
],
)
async def test_node_buttons(
hass: HomeAssistant,
mock_proxmox_client: MagicMock,
mock_config_entry: MockConfigEntry,
entity_id: str,
command: str,
) -> None:
"""Test pressing a ProxmoxVE node action button triggers the correct API call."""
await setup_integration(hass, mock_config_entry)
method_mock = mock_proxmox_client._node_mock.status.post
pre_calls = len(method_mock.mock_calls)
await hass.services.async_call(
BUTTON_DOMAIN,
SERVICE_PRESS,
{ATTR_ENTITY_ID: entity_id},
blocking=True,
)
assert len(method_mock.mock_calls) == pre_calls + 1
method_mock.assert_called_with(command=command)
@pytest.mark.parametrize(
("entity_id", "attr"),
[
("button.pve1_start_all", "startall"),
("button.pve1_stop_all", "stopall"),
],
)
async def test_node_startall_stopall_buttons(
hass: HomeAssistant,
mock_proxmox_client: MagicMock,
mock_config_entry: MockConfigEntry,
entity_id: str,
attr: str,
) -> None:
"""Test pressing a ProxmoxVE node start all / stop all button triggers the correct API call."""
await setup_integration(hass, mock_config_entry)
method_mock = getattr(mock_proxmox_client._node_mock, attr).post
pre_calls = len(method_mock.mock_calls)
await hass.services.async_call(
BUTTON_DOMAIN,
SERVICE_PRESS,
{ATTR_ENTITY_ID: entity_id},
blocking=True,
)
assert len(method_mock.mock_calls) == pre_calls + 1
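The `getattr`-based indirection in these parametrized tests resolves each button's endpoint attribute on the mock dynamically. In isolation, the pattern looks like this:

```python
from unittest.mock import MagicMock

node = MagicMock()

for attr in ("startall", "stopall"):
    # getattr walks to node.startall.post / node.stopall.post on the fly,
    # so one test body covers every parametrized endpoint.
    method = getattr(node, attr).post
    before = len(method.mock_calls)
    method()
    assert len(method.mock_calls) == before + 1
```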


@pytest.mark.parametrize(
    ("entity_id", "vmid", "action"),
    [
        ("button.vm_web_start", 100, "start"),
        ("button.vm_web_stop", 100, "stop"),
        ("button.vm_web_restart", 100, "restart"),
        ("button.vm_web_hibernate", 100, "hibernate"),
        ("button.vm_web_reset", 100, "reset"),
    ],
)
async def test_vm_buttons(
    hass: HomeAssistant,
    mock_proxmox_client: MagicMock,
    mock_config_entry: MockConfigEntry,
    entity_id: str,
    vmid: int,
    action: str,
) -> None:
    """Test pressing a ProxmoxVE VM action button triggers the correct API call."""
    await setup_integration(hass, mock_config_entry)

    mock_proxmox_client._node_mock.qemu(vmid)
    method_mock = getattr(mock_proxmox_client._qemu_mocks[vmid].status, action).post
    pre_calls = len(method_mock.mock_calls)

    await hass.services.async_call(
        BUTTON_DOMAIN,
        SERVICE_PRESS,
        {ATTR_ENTITY_ID: entity_id},
        blocking=True,
    )

    assert len(method_mock.mock_calls) == pre_calls + 1


@pytest.mark.parametrize(
    ("entity_id", "vmid", "action"),
    [
        ("button.ct_nginx_start", 200, "start"),
        ("button.ct_nginx_stop", 200, "stop"),
        ("button.ct_nginx_restart", 200, "restart"),
    ],
)
async def test_container_buttons(
    hass: HomeAssistant,
    mock_proxmox_client: MagicMock,
    mock_config_entry: MockConfigEntry,
    entity_id: str,
    vmid: int,
    action: str,
) -> None:
    """Test pressing a ProxmoxVE container action button triggers the correct API call."""
    await setup_integration(hass, mock_config_entry)

    mock_proxmox_client._node_mock.lxc(vmid)
    method_mock = getattr(mock_proxmox_client._lxc_mocks[vmid].status, action).post
    pre_calls = len(method_mock.mock_calls)

    await hass.services.async_call(
        BUTTON_DOMAIN,
        SERVICE_PRESS,
        {ATTR_ENTITY_ID: entity_id},
        blocking=True,
    )

    assert len(method_mock.mock_calls) == pre_calls + 1


@pytest.mark.parametrize(
    ("entity_id", "exception"),
    [
        ("button.pve1_restart", AuthenticationError("auth failed")),
        ("button.pve1_restart", SSLError("ssl error")),
        ("button.pve1_restart", ConnectTimeout("timeout")),
        ("button.pve1_shutdown", ResourceException(500, "error", {})),
    ],
)
async def test_node_buttons_exceptions(
    hass: HomeAssistant,
    mock_proxmox_client: MagicMock,
    mock_config_entry: MockConfigEntry,
    entity_id: str,
    exception: Exception,
) -> None:
    """Test that ProxmoxVE node button errors are raised as HomeAssistantError."""
    await setup_integration(hass, mock_config_entry)

    mock_proxmox_client._node_mock.status.post.side_effect = exception

    with pytest.raises(HomeAssistantError):
        await hass.services.async_call(
            BUTTON_DOMAIN,
            SERVICE_PRESS,
            {ATTR_ENTITY_ID: entity_id},
            blocking=True,
        )


@pytest.mark.parametrize(
    ("entity_id", "vmid", "action", "exception"),
    [
        (
            "button.vm_web_start",
            100,
            "start",
            AuthenticationError("auth failed"),
        ),
        (
            "button.vm_web_start",
            100,
            "start",
            SSLError("ssl error"),
        ),
        (
            "button.vm_web_hibernate",
            100,
            "hibernate",
            ConnectTimeout("timeout"),
        ),
        (
            "button.vm_web_reset",
            100,
            "reset",
            ResourceException(500, "error", {}),
        ),
    ],
)
async def test_vm_buttons_exceptions(
    hass: HomeAssistant,
    mock_proxmox_client: MagicMock,
    mock_config_entry: MockConfigEntry,
    entity_id: str,
    vmid: int,
    action: str,
    exception: Exception,
) -> None:
    """Test that ProxmoxVE VM button errors are raised as HomeAssistantError."""
    await setup_integration(hass, mock_config_entry)

    mock_proxmox_client._node_mock.qemu(vmid)
    getattr(
        mock_proxmox_client._qemu_mocks[vmid].status, action
    ).post.side_effect = exception

    with pytest.raises(HomeAssistantError):
        await hass.services.async_call(
            BUTTON_DOMAIN,
            SERVICE_PRESS,
            {ATTR_ENTITY_ID: entity_id},
            blocking=True,
        )


@pytest.mark.parametrize(
    ("entity_id", "vmid", "action", "exception"),
    [
        (
            "button.ct_nginx_start",
            200,
            "start",
            AuthenticationError("auth failed"),
        ),
        (
            "button.ct_nginx_start",
            200,
            "start",
            SSLError("ssl error"),
        ),
        (
            "button.ct_nginx_restart",
            200,
            "restart",
            ConnectTimeout("timeout"),
        ),
        (
            "button.ct_nginx_stop",
            200,
            "stop",
            ResourceException(500, "error", {}),
        ),
    ],
)
async def test_container_buttons_exceptions(
    hass: HomeAssistant,
    mock_proxmox_client: MagicMock,
    mock_config_entry: MockConfigEntry,
    entity_id: str,
    vmid: int,
    action: str,
    exception: Exception,
) -> None:
    """Test that ProxmoxVE container button errors are raised as HomeAssistantError."""
    await setup_integration(hass, mock_config_entry)

    mock_proxmox_client._node_mock.lxc(vmid)
    getattr(
        mock_proxmox_client._lxc_mocks[vmid].status, action
    ).post.side_effect = exception

    with pytest.raises(HomeAssistantError):
        await hass.services.async_call(
            BUTTON_DOMAIN,
            SERVICE_PRESS,
            {ATTR_ENTITY_ID: entity_id},
            blocking=True,
        )
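
The fixtures above hand the tests pre-built child mocks (`_node_mock`, `_qemu_mocks`, `_lxc_mocks`); the underlying mechanism is MagicMock's auto-created attribute and call chains. A minimal standalone sketch of that pattern — the `client.nodes("pve1")` layout is an illustrative assumption about a proxmoxer-style client, not taken from these fixtures:

```python
# Standalone sketch: asserting a proxmoxer-style attribute/call chain on a MagicMock.
from unittest.mock import MagicMock

client = MagicMock()

# Code under test would issue something like:
client.nodes("pve1").status.post(command="reboot")

# Calling a MagicMock always returns the same child mock (return_value),
# so the chain can be re-derived for assertions afterwards:
node_mock = client.nodes.return_value
node_mock.status.post.assert_called_once_with(command="reboot")
```

This is why the tests can grab `method_mock` before pressing the button and compare call counts afterwards: the same mock object is reached by both the entity code and the test.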


@@ -1304,14 +1304,16 @@ def test_attribute_selector_schema(
     (
         {},
         (
-            {"seconds": 10},
+            {
+                "seconds": 10
+            },  # Seconds is allowed also if `enable_second` is not set
             {"days": 10},  # Days is allowed also if `enable_day` is not set
             {"milliseconds": 500},
         ),
         (None, {}, {"seconds": -1}),
     ),
     (
-        {"enable_day": True, "enable_millisecond": True},
+        {"enable_day": True, "enable_millisecond": True, "enable_second": True},
         ({"seconds": 10}, {"days": 10}, {"milliseconds": 500}),
         (None, {}, {"seconds": -1}),
     ),

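The comments added in this hunk record that every duration field validates regardless of the `enable_*` flags, which only control what the frontend renders. A hypothetical reduction of that rule — an illustration, not the selector's actual implementation:

```python
# Illustrative validator (not Home Assistant's actual DurationSelector code):
# every duration key is accepted whether or not its enable_* flag is set;
# the flags are a rendering hint, not a validation constraint.
def validate_duration(value):
    allowed = {"days", "hours", "minutes", "seconds", "milliseconds"}
    if not isinstance(value, dict) or not value or not set(value) <= allowed:
        raise ValueError(f"invalid duration: {value!r}")
    if any(v < 0 for v in value.values()):
        raise ValueError("duration fields must be non-negative")
    return value

validate_duration({"seconds": 10})  # accepted even when enable_second is unset
validate_duration({"days": 10})     # accepted even when enable_day is unset
```

This mirrors the parametrized data above: the same invalid selections (`None`, `{}`, `{"seconds": -1}`) fail under every flag combination.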

@@ -240,6 +240,14 @@ def test_aborting_for_older_versions(restore_config: str, tmp_path: Path) -> Non
     }


+@pytest.mark.parametrize(
+    ("backup", "password"),
+    [
+        ("backup_with_database.tar", None),
+        ("backup_with_database_protected_v2.tar", "hunter2"),
+        ("backup_with_database_protected_v3.tar", "hunter2"),
+    ],
+)
 @pytest.mark.parametrize(
     (
         "restore_backup_content",
@@ -287,6 +295,8 @@ def test_aborting_for_older_versions(restore_config: str, tmp_path: Path) -> Non
     ],
 )
 def test_restore_backup(
+    backup: str,
+    password: str | None,
     restore_backup_content: backup_restore.RestoreBackupFileContent,
     expected_kept_files: set[str],
     expected_restored_files: set[str],
@@ -321,9 +331,7 @@ def test_restore_backup(
     for f in existing_files:
         (tmp_path / f).write_text("before_restore")

-    get_fixture_path(
-        "core/backup_restore/empty_backup_database_included.tar", None
-    ).copy(backup_file_path)
+    get_fixture_path(f"core/backup_restore/{backup}", None).copy(backup_file_path)

     files_before_restore = get_files(tmp_path)
     assert files_before_restore == {
@@ -341,6 +349,7 @@ def test_restore_backup(
         kept_files_data[file] = (tmp_path / file).read_bytes()

     restore_backup_content.backup_file_path = backup_file_path
+    restore_backup_content.password = password

     with (
         mock.patch(
@@ -378,7 +387,7 @@ def test_restore_backup_filter_files(tmp_path: Path) -> None:
     backup_file_path = tmp_path / "backups" / "test.tar"
     backup_file_path.parent.mkdir()
     get_fixture_path(
-        "core/backup_restore/empty_backup_database_included.tar", None
+        "core/backup_restore/malicious_backup_with_database.tar", None
     ).copy(backup_file_path)

     with (
@@ -440,9 +449,9 @@ def test_remove_backup_file_after_restore(
     """Test removing a backup file after restore."""
     backup_file_path = tmp_path / "backups" / "test.tar"
     backup_file_path.parent.mkdir()
-    get_fixture_path(
-        "core/backup_restore/empty_backup_database_included.tar", None
-    ).copy(backup_file_path)
+    get_fixture_path("core/backup_restore/backup_with_database.tar", None).copy(
+        backup_file_path
+    )

     with (
         mock.patch(