Compare commits


1 commit

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Erik | 659bab1ca7 | Deprecate unused helper function cover.is_closed | 2026-03-04 14:00:55 +01:00 |
253 changed files with 2243 additions and 6408 deletions


@@ -7,19 +7,330 @@ This repository contains the core of Home Assistant, a Python 3 based home autom
## Code Review Guidelines
**When reviewing code, do NOT comment on:**
- **Missing imports** - We use static analysis tooling to catch those
- **Code formatting** - Ruff formats the code and will flag formatting issues if needed (unless specifically instructed otherwise in these instructions)
**Git commit practices during review:**
- **Do NOT amend, squash, or rebase commits after review has started** - Reviewers need to see what changed since their last review
## Python Requirements
- **Compatibility**: Python 3.13+
- **Language Features**: Use the newest features when possible:
  - Pattern matching
  - Type hints
  - f-strings (preferred over `%` or `.format()`)
  - Dataclasses
  - Walrus operator
### Strict Typing (Platinum)
- **Comprehensive Type Hints**: Add type hints to all functions, methods, and variables
- **Custom Config Entry Types**: When using `runtime_data`:
```python
type MyIntegrationConfigEntry = ConfigEntry[MyClient]
```
- **Library Requirements**: Include a `py.typed` file for PEP 561 compliance
## Code Quality Standards
- **Formatting**: Ruff
- **Linting**: PyLint and Ruff
- **Type Checking**: MyPy
- **Lint/Type/Format Fixes**: Always prefer addressing the underlying issue (e.g., import the typed source, update shared stubs, align with Ruff expectations, or correct formatting at the source) before disabling a rule, adding `# type: ignore`, or skipping a formatter. Treat suppressions and `noqa` comments as a last resort once no compliant fix exists
- **Testing**: pytest with plain functions and fixtures
- **Language**: American English for all code, comments, and documentation (use sentence case, including titles)
### Writing Style Guidelines
- **Tone**: Friendly and informative
- **Perspective**: Use second-person ("you" and "your") for user-facing messages
- **Inclusivity**: Use objective, non-discriminatory language
- **Clarity**: Write for non-native English speakers
- **Formatting in Messages**:
  - Use backticks for: file paths, filenames, variable names, field entries
  - Use sentence case for titles and messages (capitalize only the first word and proper nouns)
- Avoid abbreviations when possible
### Documentation Standards
- **File Headers**: Short and concise
```python
"""Integration for Peblar EV chargers."""
```
- **Method/Function Docstrings**: Required for all
```python
async def async_setup_entry(hass: HomeAssistant, entry: PeblarConfigEntry) -> bool:
    """Set up Peblar from a config entry."""
```
- **Comment Style**:
  - Use clear, descriptive comments
  - Explain the "why" not just the "what"
  - Keep code block lines under 80 characters when possible
  - Use progressive disclosure (simple explanation first, complex details later)
## Async Programming
- All external I/O operations must be async
- **Best Practices**:
  - Avoid sleeping in loops
  - Avoid awaiting in loops - use `asyncio.gather` instead
  - No blocking calls
  - Group executor jobs when possible - switching between event loop and executor is expensive
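For example, awaiting one device at a time serializes the I/O, while `asyncio.gather` runs the calls concurrently (the `fetch` coroutine below is a stand-in for real device I/O):

```python
import asyncio


async def fetch(device_id: str) -> dict[str, str]:
    """Stand-in for an async I/O call to a single device."""
    await asyncio.sleep(0)  # simulates network latency without blocking
    return {"id": device_id, "state": "on"}


async def poll_all(device_ids: list[str]) -> list[dict[str, str]]:
    """Poll all devices concurrently instead of awaiting in a loop."""
    # ❌ Sequential: results = [await fetch(d) for d in device_ids]
    # ✅ Concurrent: the calls overlap, total time is the slowest call
    return await asyncio.gather(*(fetch(d) for d in device_ids))


results = asyncio.run(poll_all(["a", "b", "c"]))
```

`asyncio.gather` preserves the order of its arguments, so the results line up with the input IDs.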
### Blocking Operations
- **Use Executor**: For blocking I/O operations
```python
result = await hass.async_add_executor_job(blocking_function, args)
```
- **Never Block Event Loop**: Avoid file operations, `time.sleep()`, blocking HTTP calls
- **Replace with Async**: Use `asyncio.sleep()` instead of `time.sleep()`
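`hass.async_add_executor_job` is only available inside Home Assistant, but it delegates to the event loop's executor, so the mechanism can be sketched with plain `asyncio` (the `blocking_read` function is hypothetical):

```python
import asyncio
import time


def blocking_read() -> str:
    """A blocking call that must never run directly in the event loop."""
    time.sleep(0.01)  # stands in for file or network I/O
    return "ok"


async def main() -> str:
    loop = asyncio.get_running_loop()
    # In Home Assistant: result = await hass.async_add_executor_job(blocking_read)
    # which hands the call off to a thread pool, roughly like this:
    return await loop.run_in_executor(None, blocking_read)


result = asyncio.run(main())
```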
### Thread Safety
- **@callback Decorator**: For event loop safe functions
```python
@callback
def async_update_callback(self, event):
    """Safe to run in event loop."""
    self.async_write_ha_state()
```
- **Sync APIs from Threads**: Use sync versions when calling from non-event loop threads
- **Registry Changes**: Must be done in event loop thread
### Error Handling
- **Exception Types**: Choose the most specific exception available
  - `ServiceValidationError`: User input errors (preferred over `ValueError`)
  - `HomeAssistantError`: Device communication failures
  - `ConfigEntryNotReady`: Temporary setup issues (device offline)
  - `ConfigEntryAuthFailed`: Authentication problems
  - `ConfigEntryError`: Permanent setup issues
- **Try/Except Best Practices**:
  - Only wrap code that can raise exceptions
  - Keep try blocks minimal - process data after the try/except
- **Avoid bare exceptions** except in specific cases:
  - ❌ Generally not allowed: `except:` or `except Exception:`
  - ✅ Allowed in config flows to ensure robustness
  - ✅ Allowed in functions/methods that run in background tasks
- Bad pattern:
```python
try:
    data = await device.get_data()  # Can raise
    # ❌ Don't process data inside try block
    processed = data.get("value", 0) * 100
    self._attr_native_value = processed
except DeviceError:
    _LOGGER.error("Failed to get data")
```
- Good pattern:
```python
try:
    data = await device.get_data()  # Can raise
except DeviceError:
    _LOGGER.error("Failed to get data")
    return

# ✅ Process data outside try block
processed = data.get("value", 0) * 100
self._attr_native_value = processed
```
- **Bare Exception Usage**:
```python
# ❌ Not allowed in regular code
try:
    data = await device.get_data()
except Exception:  # Too broad
    _LOGGER.error("Failed")

# ✅ Allowed in config flow for robustness
async def async_step_user(self, user_input=None):
    try:
        await self._test_connection(user_input)
    except Exception:  # Allowed here
        errors["base"] = "unknown"

# ✅ Allowed in background tasks
async def _background_refresh():
    try:
        await coordinator.async_refresh()
    except Exception:  # Allowed in task
        _LOGGER.exception("Unexpected error in background task")
```
- **Setup Failure Patterns**:
```python
try:
    await device.async_setup()
except (asyncio.TimeoutError, TimeoutException) as ex:
    raise ConfigEntryNotReady(f"Timeout connecting to {device.host}") from ex
except AuthFailed as ex:
    raise ConfigEntryAuthFailed(f"Credentials expired for {device.name}") from ex
```
### Logging
- **Format Guidelines**:
  - No periods at end of messages
  - No integration names/domains (added automatically)
  - No sensitive data (keys, tokens, passwords)
  - Use debug level for non-user-facing messages
- **Use Lazy Logging**:
```python
_LOGGER.debug("This is a log message with %s", variable)
```
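The point of lazy `%s` formatting is that the arguments are only rendered when the message is actually emitted. A minimal sketch (the `Expensive` class is contrived to make the difference observable):

```python
import logging

_LOGGER = logging.getLogger("demo")
_LOGGER.setLevel(logging.INFO)  # debug messages are filtered out

calls = 0


class Expensive:
    """Counts how often its string form is computed."""

    def __str__(self) -> str:
        global calls
        calls += 1
        return "expensive"


# Lazy: the %s placeholder is never formatted because debug is disabled
_LOGGER.debug("value: %s", Expensive())
lazy_calls = calls

# Eager: the f-string formats immediately, even though the message is dropped
_LOGGER.debug(f"value: {Expensive()}")
eager_calls = calls
```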
### Unavailability Logging
- **Log Once**: When device/service becomes unavailable (info level)
- **Log Recovery**: When device/service comes back online
- **Implementation Pattern**:
```python
# The entity keeps a flag so each transition is logged only once
_unavailable_logged: bool = False

# When the device becomes unavailable:
if not self._unavailable_logged:
    _LOGGER.info("The sensor is unavailable: %s", ex)
    self._unavailable_logged = True

# On recovery:
if self._unavailable_logged:
    _LOGGER.info("The sensor is back online")
    self._unavailable_logged = False
```
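The pattern above can be expanded into a small self-contained class. The class and method names below are hypothetical, and the boolean return values exist only to make the log-once behavior easy to verify:

```python
import logging

_LOGGER = logging.getLogger(__name__)


class AvailabilityLogger:
    """Log unavailability once and recovery once (illustrative sketch)."""

    _unavailable_logged: bool = False

    def mark_unavailable(self, reason: str) -> bool:
        """Log only the first failure; return True if a message was logged."""
        if self._unavailable_logged:
            return False
        _LOGGER.info("The sensor is unavailable: %s", reason)
        self._unavailable_logged = True
        return True

    def mark_available(self) -> bool:
        """Log recovery only if unavailability was logged before."""
        if not self._unavailable_logged:
            return False
        _LOGGER.info("The sensor is back online")
        self._unavailable_logged = False
        return True
```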
## Development Commands
`.vscode/tasks.json` contains useful commands for development.
### Environment
- **Local development (non-container)**: Activate the project venv before running commands: `source .venv/bin/activate`
- **Dev container**: No activation needed, the environment is pre-configured
## Good practices
### Code Quality & Linting
- **Run all linters on all files**: `prek run --all-files`
- **Run linters on staged files only**: `prek run`
- **PyLint on everything** (slow): `pylint homeassistant`
- **PyLint on specific folder**: `pylint homeassistant/components/my_integration`
- **MyPy type checking (whole project)**: `mypy homeassistant/`
- **MyPy on specific integration**: `mypy homeassistant/components/my_integration`
Integrations with Platinum or Gold level in the Integration Quality Scale reflect a high standard of code quality and maintainability. When looking for examples, these are good places to start. The level is indicated in the integration's `manifest.json`.
### Testing
- **Quick test of changed files**: `pytest --timeout=10 --picked`
- **Update test snapshots**: Add `--snapshot-update` to the pytest command
  - ⚠️ Ignore test results from a `--snapshot-update` run
  - Always run the tests again without the flag to verify the snapshots
- **Full test suite** (AVOID - very slow): `pytest ./tests`
### Dependencies & Requirements
- **Update generated files after dependency changes**: `python -m script.gen_requirements_all`
- **Install all Python requirements**:
```bash
uv pip install -r requirements_all.txt -r requirements.txt -r requirements_test.txt
```
- **Install test requirements only**:
```bash
uv pip install -r requirements_test_all.txt -r requirements.txt
```
### Translations
- **Update translations after strings.json changes**:
```bash
python -m script.translations develop --all
```
### Project Validation
- **Run hassfest** (checks project structure and updates generated files):
```bash
python -m script.hassfest
```
## Common Anti-Patterns & Best Practices
### ❌ **Avoid These Patterns**
```python
# Blocking operations in event loop
data = requests.get(url)  # ❌ Blocks event loop
time.sleep(5)  # ❌ Blocks event loop

# Reusing BleakClient instances
self.client = BleakClient(address)
await self.client.connect()
# Later...
await self.client.connect()  # ❌ Don't reuse

# Hardcoded strings in code
self._attr_name = "Temperature Sensor"  # ❌ Not translatable

# Missing error handling
data = await self.api.get_data()  # ❌ No exception handling

# Storing sensitive data in diagnostics
return {"api_key": entry.data[CONF_API_KEY]}  # ❌ Exposes secrets

# Accessing hass.data directly in tests
coordinator = hass.data[DOMAIN][entry.entry_id]  # ❌ Don't access hass.data

# User-configurable polling intervals
# In config flow
vol.Optional("scan_interval", default=60): cv.positive_int  # ❌ Not allowed
# In coordinator
update_interval = timedelta(minutes=entry.data.get("scan_interval", 1))  # ❌ Not allowed

# User-configurable config entry names (non-helper integrations)
vol.Optional("name", default="My Device"): cv.string  # ❌ Not allowed in regular integrations

# Too much code in try block
try:
    response = await client.get_data()  # Can raise
    # ❌ Data processing should be outside try block
    temperature = response["temperature"] / 10
    humidity = response["humidity"]
    self._attr_native_value = temperature
except ClientError:
    _LOGGER.error("Failed to fetch data")

# Bare exceptions in regular code
try:
    value = await sensor.read_value()
except Exception:  # ❌ Too broad - catch specific exceptions
    _LOGGER.error("Failed to read sensor")
```
### ✅ **Use These Patterns Instead**
```python
# Async operations with executor
data = await hass.async_add_executor_job(requests.get, url)
await asyncio.sleep(5)  # ✅ Non-blocking

# Fresh BleakClient instances
client = BleakClient(address)  # ✅ New instance each time
await client.connect()

# Translatable entity names
_attr_translation_key = "temperature_sensor"  # ✅ Translatable

# Proper error handling
try:
    data = await self.api.get_data()
except ApiException as err:
    raise UpdateFailed(f"API error: {err}") from err

# Redacted diagnostics data
return async_redact_data(data, {"api_key", "password"})  # ✅ Safe

# Test through proper integration setup and fixtures
@pytest.fixture
async def init_integration(hass, mock_config_entry, mock_api):
    mock_config_entry.add_to_hass(hass)
    await hass.config_entries.async_setup(mock_config_entry.entry_id)  # ✅ Proper setup

# Integration-determined polling intervals (not user-configurable)
SCAN_INTERVAL = timedelta(minutes=5)  # ✅ Common pattern: constant in const.py


class MyCoordinator(DataUpdateCoordinator[MyData]):
    def __init__(self, hass: HomeAssistant, client: MyClient, config_entry: ConfigEntry) -> None:
        # ✅ Integration determines interval based on device capabilities, connection type, etc.
        interval = timedelta(minutes=1) if client.is_local else SCAN_INTERVAL
        super().__init__(
            hass,
            logger=LOGGER,
            name=DOMAIN,
            update_interval=interval,
            config_entry=config_entry,  # ✅ Pass config_entry - it's accepted and recommended
        )
```
# Skills
- github-pr-commenter: .claude/skills/github-pr-commenter/SKILL.md
- Home Assistant Integration knowledge: .claude/skills/integrations/SKILL.md


@@ -10,6 +10,7 @@ on:
env:
BUILD_TYPE: core
DEFAULT_PYTHON: "3.14.2"
PIP_TIMEOUT: 60
UV_HTTP_TIMEOUT: 60
UV_SYSTEM_PYTHON: "true"
@@ -41,10 +42,10 @@ jobs:
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
- name: Get information
id: info
@@ -79,7 +80,7 @@ jobs:
run: find ./homeassistant/components/*/translations -name "*.json" | tar zcvf translations.tar.gz -T -
- name: Upload translations
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: translations
path: translations.tar.gz
@@ -111,7 +112,7 @@ jobs:
- name: Download nightly wheels of frontend
if: needs.init.outputs.channel == 'dev'
uses: dawidd6/action-download-artifact@2536c51d3d126276eb39f74d6bc9c72ac6ef30d3 # v16
uses: dawidd6/action-download-artifact@5c98f0b039f36ef966fdb7dfa9779262785ecb05 # v14
with:
github_token: ${{secrets.GITHUB_TOKEN}}
repo: home-assistant/frontend
@@ -122,7 +123,7 @@ jobs:
- name: Download nightly wheels of intents
if: needs.init.outputs.channel == 'dev'
uses: dawidd6/action-download-artifact@2536c51d3d126276eb39f74d6bc9c72ac6ef30d3 # v16
uses: dawidd6/action-download-artifact@5c98f0b039f36ef966fdb7dfa9779262785ecb05 # v14
with:
github_token: ${{secrets.GITHUB_TOKEN}}
repo: OHF-Voice/intents-package
@@ -131,11 +132,11 @@ jobs:
workflow_conclusion: success
name: package
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
if: needs.init.outputs.channel == 'dev'
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
- name: Adjust nightly version
if: needs.init.outputs.channel == 'dev'
@@ -537,10 +538,10 @@ jobs:
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
- name: Download translations
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0


@@ -41,7 +41,8 @@ env:
UV_CACHE_VERSION: 1
MYPY_CACHE_VERSION: 1
HA_SHORT_VERSION: "2026.4"
ADDITIONAL_PYTHON_VERSIONS: "[]"
DEFAULT_PYTHON: "3.14.2"
ALL_PYTHON_VERSIONS: "['3.14.2']"
# 10.3 is the oldest supported version
# - 10.3.32 is the version currently shipped with Synology (as of 17 Feb 2022)
# 10.6 is the current long-term-support
@@ -165,11 +166,6 @@ jobs:
tests_glob=""
lint_only=""
skip_coverage=""
default_python=$(cat .python-version)
all_python_versions=$(jq -cn \
--arg default_python "${default_python}" \
--argjson additional_python_versions "${ADDITIONAL_PYTHON_VERSIONS}" \
'[$default_python] + $additional_python_versions')
if [[ "${INTEGRATION_CHANGES}" != "[]" ]];
then
@@ -239,8 +235,8 @@ jobs:
echo "mariadb_groups=${mariadb_groups}" >> $GITHUB_OUTPUT
echo "postgresql_groups: ${postgresql_groups}"
echo "postgresql_groups=${postgresql_groups}" >> $GITHUB_OUTPUT
echo "python_versions: ${all_python_versions}"
echo "python_versions=${all_python_versions}" >> $GITHUB_OUTPUT
echo "python_versions: ${ALL_PYTHON_VERSIONS}"
echo "python_versions=${ALL_PYTHON_VERSIONS}" >> $GITHUB_OUTPUT
echo "test_full_suite: ${test_full_suite}"
echo "test_full_suite=${test_full_suite}" >> $GITHUB_OUTPUT
echo "integrations_glob: ${integrations_glob}"
@@ -456,7 +452,7 @@ jobs:
python --version
uv pip freeze >> pip_freeze.txt
- name: Upload pip_freeze artifact
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: pip-freeze-${{ matrix.python-version }}
path: pip_freeze.txt
@@ -507,13 +503,13 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore full Python virtual environment
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
@@ -544,13 +540,13 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore full Python virtual environment
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
@@ -580,11 +576,11 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Run gen_copilot_instructions.py
run: |
@@ -657,7 +653,7 @@ jobs:
. venv/bin/activate
python -m script.licenses extract --output-file=licenses-${PYTHON_VERSION}.json
- name: Upload licenses
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: licenses-${{ github.run_number }}-${{ matrix.python-version }}
path: licenses-${{ matrix.python-version }}.json
@@ -686,13 +682,13 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore full Python virtual environment
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
@@ -739,13 +735,13 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore full Python virtual environment
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
@@ -790,11 +786,11 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Generate partial mypy restore key
id: generate-mypy-key
@@ -802,7 +798,7 @@ jobs:
mypy_version=$(cat requirements_test.txt | grep 'mypy.*=' | cut -d '=' -f 3)
echo "version=${mypy_version}" >> $GITHUB_OUTPUT
echo "key=mypy-${MYPY_CACHE_VERSION}-${mypy_version}-${HA_SHORT_VERSION}-$(date -u '+%Y-%m-%dT%H:%M:%s')" >> $GITHUB_OUTPUT
- name: Restore full Python virtual environment
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
@@ -883,13 +879,13 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore full Python virtual environment
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
@@ -905,7 +901,7 @@ jobs:
. venv/bin/activate
python -m script.split_tests ${TEST_GROUP_COUNT} tests
- name: Upload pytest_buckets
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: pytest_buckets
path: pytest_buckets.txt
@@ -1024,14 +1020,14 @@ jobs:
2>&1 | tee pytest-${PYTHON_VERSION}-${TEST_GROUP}.txt
- name: Upload pytest output
if: success() || failure() && steps.pytest-full.conclusion == 'failure'
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{ matrix.group }}
path: pytest-*.txt
overwrite: true
- name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true'
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: coverage-${{ matrix.python-version }}-${{ matrix.group }}
path: coverage.xml
@@ -1044,7 +1040,7 @@ jobs:
mv "junit.xml-tmp" "junit.xml"
- name: Upload test results artifact
if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: test-results-full-${{ matrix.python-version }}-${{ matrix.group }}
path: junit.xml
@@ -1181,7 +1177,7 @@ jobs:
2>&1 | tee pytest-${PYTHON_VERSION}-${mariadb}.txt
- name: Upload pytest output
if: success() || failure() && steps.pytest-partial.conclusion == 'failure'
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.mariadb }}
@@ -1189,7 +1185,7 @@ jobs:
overwrite: true
- name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true'
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: coverage-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.mariadb }}
@@ -1203,7 +1199,7 @@ jobs:
mv "junit.xml-tmp" "junit.xml"
- name: Upload test results artifact
if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: test-results-mariadb-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.mariadb }}
@@ -1342,7 +1338,7 @@ jobs:
2>&1 | tee pytest-${PYTHON_VERSION}-${postgresql}.txt
- name: Upload pytest output
if: success() || failure() && steps.pytest-partial.conclusion == 'failure'
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.postgresql }}
@@ -1350,7 +1346,7 @@ jobs:
overwrite: true
- name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true'
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: coverage-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.postgresql }}
@@ -1364,7 +1360,7 @@ jobs:
mv "junit.xml-tmp" "junit.xml"
- name: Upload test results artifact
if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: test-results-postgres-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.postgresql }}
@@ -1518,14 +1514,14 @@ jobs:
2>&1 | tee pytest-${PYTHON_VERSION}-${TEST_GROUP}.txt
- name: Upload pytest output
if: success() || failure() && steps.pytest-partial.conclusion == 'failure'
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{ matrix.group }}
path: pytest-*.txt
overwrite: true
- name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true'
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: coverage-${{ matrix.python-version }}-${{ matrix.group }}
path: coverage.xml
@@ -1538,7 +1534,7 @@ jobs:
mv "junit.xml-tmp" "junit.xml"
- name: Upload test results artifact
if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: test-results-partial-${{ matrix.python-version }}-${{ matrix.group }}
path: junit.xml


@@ -15,6 +15,9 @@ concurrency:
group: ${{ github.workflow }}
cancel-in-progress: true
env:
DEFAULT_PYTHON: "3.14.2"
jobs:
upload:
name: Upload
@@ -26,10 +29,10 @@ jobs:
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
- name: Upload Translations
env:


@@ -16,6 +16,9 @@ on:
- "requirements.txt"
- "script/gen_requirements_all.py"
env:
DEFAULT_PYTHON: "3.14.2"
permissions: {}
concurrency:
@@ -33,11 +36,11 @@ jobs:
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Create Python virtual environment
@@ -74,7 +77,7 @@ jobs:
) > .env_file
- name: Upload env_file
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: env_file
path: ./.env_file
@@ -82,7 +85,7 @@ jobs:
overwrite: true
- name: Upload requirements_diff
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: requirements_diff
path: ./requirements_diff.txt
@@ -94,7 +97,7 @@ jobs:
python -m script.gen_requirements_all ci
- name: Upload requirements_all_wheels
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: requirements_all_wheels
path: ./requirements_all_wheels_*.txt


@@ -1 +1 @@
3.14.2
3.14

AGENTS.md

@@ -4,13 +4,325 @@ This repository contains the core of Home Assistant, a Python 3 based home autom
## Code Review Guidelines
**When reviewing code, do NOT comment on:**
- **Missing imports** - We use static analysis tooling to catch that
- **Code formatting** - We have ruff as a formatting tool that will catch those if needed (unless specifically instructed otherwise in these instructions)
**Git commit practices during review:**
- **Do NOT amend, squash, or rebase commits after review has started** - Reviewers need to see what changed since their last review
## Python Requirements
- **Compatibility**: Python 3.13+
- **Language Features**: Use the newest features when possible:
- Pattern matching
- Type hints
- f-strings (preferred over `%` or `.format()`)
- Dataclasses
- Walrus operator
### Strict Typing (Platinum)
- **Comprehensive Type Hints**: Add type hints to all functions, methods, and variables
- **Custom Config Entry Types**: When using runtime_data:
```python
type MyIntegrationConfigEntry = ConfigEntry[MyClient]
```
- **Library Requirements**: Include `py.typed` file for PEP-561 compliance
## Code Quality Standards
- **Formatting**: Ruff
- **Linting**: PyLint and Ruff
- **Type Checking**: MyPy
- **Lint/Type/Format Fixes**: Always prefer addressing the underlying issue (e.g., import the typed source, update shared stubs, align with Ruff expectations, or correct formatting at the source) before disabling a rule, adding `# type: ignore`, or skipping a formatter. Treat suppressions and `noqa` comments as a last resort once no compliant fix exists
- **Testing**: pytest with plain functions and fixtures
- **Language**: American English for all code, comments, and documentation (use sentence case, including titles)
### Writing Style Guidelines
- **Tone**: Friendly and informative
- **Perspective**: Use second-person ("you" and "your") for user-facing messages
- **Inclusivity**: Use objective, non-discriminatory language
- **Clarity**: Write for non-native English speakers
- **Formatting in Messages**:
- Use backticks for: file paths, filenames, variable names, field entries
- Use sentence case for titles and messages (capitalize only the first word and proper nouns)
- Avoid abbreviations when possible
### Documentation Standards
- **File Headers**: Short and concise
```python
"""Integration for Peblar EV chargers."""
```
- **Method/Function Docstrings**: Required for all
```python
async def async_setup_entry(hass: HomeAssistant, entry: PeblarConfigEntry) -> bool:
"""Set up Peblar from a config entry."""
```
- **Comment Style**:
- Use clear, descriptive comments
- Explain the "why" not just the "what"
- Keep code block lines under 80 characters when possible
- Use progressive disclosure (simple explanation first, complex details later)
## Async Programming
- All external I/O operations must be async
- **Best Practices**:
- Avoid sleeping in loops
- Avoid awaiting in loops - use `gather` instead
- No blocking calls
- Group executor jobs when possible - switching between event loop and executor is expensive
### Blocking Operations
- **Use Executor**: For blocking I/O operations
```python
result = await hass.async_add_executor_job(blocking_function, args)
```
- **Never Block Event Loop**: Avoid file operations, `time.sleep()`, blocking HTTP calls
- **Replace with Async**: Use `asyncio.sleep()` instead of `time.sleep()`
### Thread Safety
- **@callback Decorator**: For event loop safe functions
```python
@callback
def async_update_callback(self, event):
"""Safe to run in event loop."""
self.async_write_ha_state()
```
- **Sync APIs from Threads**: Use sync versions when calling from non-event loop threads
- **Registry Changes**: Must be done in event loop thread
### Error Handling
- **Exception Types**: Choose most specific exception available
- `ServiceValidationError`: User input errors (preferred over `ValueError`)
- `HomeAssistantError`: Device communication failures
- `ConfigEntryNotReady`: Temporary setup issues (device offline)
- `ConfigEntryAuthFailed`: Authentication problems
- `ConfigEntryError`: Permanent setup issues
- **Try/Catch Best Practices**:
- Only wrap code that can throw exceptions
- Keep try blocks minimal - process data after the try/catch
- **Avoid bare exceptions** except in specific cases:
- ❌ Generally not allowed: `except:` or `except Exception:`
- ✅ Allowed in config flows to ensure robustness
- ✅ Allowed in functions/methods that run in background tasks
- Bad pattern:
```python
try:
    data = await device.get_data()  # Can throw
    # ❌ Don't process data inside try block
    processed = data.get("value", 0) * 100
    self._attr_native_value = processed
except DeviceError:
    _LOGGER.error("Failed to get data")
```
- Good pattern:
```python
try:
    data = await device.get_data()  # Can throw
except DeviceError:
    _LOGGER.error("Failed to get data")
    return

# ✅ Process data outside try block
processed = data.get("value", 0) * 100
self._attr_native_value = processed
```
- **Bare Exception Usage**:
```python
# ❌ Not allowed in regular code
try:
    data = await device.get_data()
except Exception:  # Too broad
    _LOGGER.error("Failed")


# ✅ Allowed in config flow for robustness
async def async_step_user(self, user_input=None):
    try:
        await self._test_connection(user_input)
    except Exception:  # Allowed here
        errors["base"] = "unknown"


# ✅ Allowed in background tasks
async def _background_refresh():
    try:
        await coordinator.async_refresh()
    except Exception:  # Allowed in task
        _LOGGER.exception("Unexpected error in background task")
```
- **Setup Failure Patterns**:
```python
try:
    await device.async_setup()
except (asyncio.TimeoutError, TimeoutException) as ex:
    raise ConfigEntryNotReady(f"Timeout connecting to {device.host}") from ex
except AuthFailed as ex:
    raise ConfigEntryAuthFailed(f"Credentials expired for {device.name}") from ex
```
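The first bullet above (prefer `ServiceValidationError` over `ValueError` for user input) can be sketched in isolation; the stand-in exception classes below mirror the names in `homeassistant.exceptions` so the example runs without Home Assistant installed, and the temperature bounds are purely illustrative:

```python
# Stand-ins for homeassistant.exceptions (illustration only)
class HomeAssistantError(Exception):
    """Mirrors homeassistant.exceptions.HomeAssistantError."""


class ServiceValidationError(HomeAssistantError):
    """Mirrors homeassistant.exceptions.ServiceValidationError."""


def set_target_temperature(value: float) -> float:
    """Validate a service call argument."""
    # A bad user-supplied value is a validation problem, not a
    # programming error, so raise ServiceValidationError, not ValueError
    if not 7.0 <= value <= 35.0:
        raise ServiceValidationError(f"Temperature {value} is out of range (7-35)")
    return value
```

Because `ServiceValidationError` subclasses `HomeAssistantError`, callers that catch the base class still handle it, while the frontend can show the user a validation message instead of a stack trace.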
### Logging
- **Format Guidelines**:
- No periods at end of messages
- No integration names/domains (added automatically)
- No sensitive data (keys, tokens, passwords)
- Use debug level for non-user-facing messages
- **Use Lazy Logging**:
```python
_LOGGER.debug("This is a log message with %s", variable)
```
### Unavailability Logging
- **Log Once**: When device/service becomes unavailable (info level)
- **Log Recovery**: When device/service comes back online
- **Implementation Pattern**:
```python
# Class attribute on the entity
_unavailable_logged: bool = False

# When an update fails:
if not self._unavailable_logged:
    _LOGGER.info("The sensor is unavailable: %s", ex)
    self._unavailable_logged = True

# On recovery:
if self._unavailable_logged:
    _LOGGER.info("The sensor is back online")
    self._unavailable_logged = False
```
## Development Commands
`.vscode/tasks.json` contains useful commands for development.
### Environment
- **Local development (non-container)**: Activate the project venv before running commands: `source .venv/bin/activate`
- **Dev container**: No activation needed, the environment is pre-configured
## Good practices
### Code Quality & Linting
- **Run all linters on all files**: `prek run --all-files`
- **Run linters on staged files only**: `prek run`
- **PyLint on everything** (slow): `pylint homeassistant`
- **PyLint on specific folder**: `pylint homeassistant/components/my_integration`
- **MyPy type checking (whole project)**: `mypy homeassistant/`
- **MyPy on specific integration**: `mypy homeassistant/components/my_integration`
Integrations with Platinum or Gold level in the Integration Quality Scale reflect a high standard of code quality and maintainability. When looking for examples of something, these are good places to start. The level is indicated in the `manifest.json` of the integration.
### Testing
- **Quick test of changed files**: `pytest --timeout=10 --picked`
- **Update test snapshots**: Add `--snapshot-update` to pytest command
- ⚠️ Test results from a `--snapshot-update` run are not meaningful, so do not report them
- Always run tests again without the flag to verify snapshots
- **Full test suite** (AVOID - very slow): `pytest ./tests`
### Dependencies & Requirements
- **Update generated files after dependency changes**: `python -m script.gen_requirements_all`
- **Install all Python requirements**:
```bash
uv pip install -r requirements_all.txt -r requirements.txt -r requirements_test.txt
```
- **Install test requirements only**:
```bash
uv pip install -r requirements_test_all.txt -r requirements.txt
```
### Translations
- **Update translations after strings.json changes**:
```bash
python -m script.translations develop --all
```
### Project Validation
- **Run hassfest** (checks project structure and updates generated files):
```bash
python -m script.hassfest
```
## Common Anti-Patterns & Best Practices
### ❌ **Avoid These Patterns**
```python
# Blocking operations in event loop
data = requests.get(url)  # ❌ Blocks event loop
time.sleep(5)  # ❌ Blocks event loop

# Reusing BleakClient instances
self.client = BleakClient(address)
await self.client.connect()
# Later...
await self.client.connect()  # ❌ Don't reuse

# Hardcoded strings in code
self._attr_name = "Temperature Sensor"  # ❌ Not translatable

# Missing error handling
data = await self.api.get_data()  # ❌ No exception handling

# Storing sensitive data in diagnostics
return {"api_key": entry.data[CONF_API_KEY]}  # ❌ Exposes secrets

# Accessing hass.data directly in tests
coordinator = hass.data[DOMAIN][entry.entry_id]  # ❌ Don't access hass.data

# User-configurable polling intervals
# In config flow
vol.Optional("scan_interval", default=60): cv.positive_int  # ❌ Not allowed
# In coordinator
update_interval = timedelta(minutes=entry.data.get("scan_interval", 1))  # ❌ Not allowed

# User-configurable config entry names (non-helper integrations)
vol.Optional("name", default="My Device"): cv.string  # ❌ Not allowed in regular integrations

# Too much code in try block
try:
    response = await client.get_data()  # Can throw
    # ❌ Data processing should be outside try block
    temperature = response["temperature"] / 10
    humidity = response["humidity"]
    self._attr_native_value = temperature
except ClientError:
    _LOGGER.error("Failed to fetch data")

# Bare exceptions in regular code
try:
    value = await sensor.read_value()
except Exception:  # ❌ Too broad - catch specific exceptions
    _LOGGER.error("Failed to read sensor")
```
### ✅ **Use These Patterns Instead**
```python
# Async operations with executor
data = await hass.async_add_executor_job(requests.get, url)
await asyncio.sleep(5)  # ✅ Non-blocking

# Fresh BleakClient instances
client = BleakClient(address)  # ✅ New instance each time
await client.connect()

# Translatable entity names
_attr_translation_key = "temperature_sensor"  # ✅ Translatable

# Proper error handling
try:
    data = await self.api.get_data()
except ApiException as err:
    raise UpdateFailed(f"API error: {err}") from err

# Redacted diagnostics data
return async_redact_data(data, {"api_key", "password"})  # ✅ Safe

# Test through proper integration setup and fixtures
@pytest.fixture
async def init_integration(hass, mock_config_entry, mock_api):
    mock_config_entry.add_to_hass(hass)
    await hass.config_entries.async_setup(mock_config_entry.entry_id)  # ✅ Proper setup

# Integration-determined polling intervals (not user-configurable)
SCAN_INTERVAL = timedelta(minutes=5)  # ✅ Common pattern: constant in const.py

class MyCoordinator(DataUpdateCoordinator[MyData]):
    def __init__(self, hass: HomeAssistant, client: MyClient, config_entry: ConfigEntry) -> None:
        # ✅ Integration determines interval based on device capabilities, connection type, etc.
        interval = timedelta(minutes=1) if client.is_local else SCAN_INTERVAL
        super().__init__(
            hass,
            logger=LOGGER,
            name=DOMAIN,
            update_interval=interval,
            config_entry=config_entry,  # ✅ Pass config_entry - it's accepted and recommended
        )
```


@@ -236,14 +236,9 @@ DEFAULT_INTEGRATIONS = {
"input_text",
"schedule",
"timer",
#
# Base platforms:
*BASE_PLATFORMS,
}
DEFAULT_INTEGRATIONS_RECOVERY_MODE = {
# These integrations are set up if recovery mode is activated.
"backup",
"cloud",
"frontend",
}
DEFAULT_INTEGRATIONS_SUPERVISOR = {


@@ -18,10 +18,6 @@ from homeassistant.helpers.schema_config_entry_flow import (
SchemaOptionsFlowHandler,
)
from homeassistant.helpers.selector import BooleanSelector
from homeassistant.helpers.service_info.zeroconf import (
ATTR_PROPERTIES_ID,
ZeroconfServiceInfo,
)
from .const import CONF_CLIP_NEGATIVE, CONF_RETURN_AVERAGE, DOMAIN
@@ -50,9 +46,6 @@ class AirQConfigFlow(ConfigFlow, domain=DOMAIN):
VERSION = 1
_discovered_host: str
_discovered_name: str
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
@@ -97,58 +90,6 @@ class AirQConfigFlow(ConfigFlow, domain=DOMAIN):
step_id="user", data_schema=STEP_USER_DATA_SCHEMA, errors=errors
)
async def async_step_zeroconf(
self, discovery_info: ZeroconfServiceInfo
) -> ConfigFlowResult:
"""Handle zeroconf discovery of an air-Q device."""
self._discovered_host = discovery_info.host
self._discovered_name = discovery_info.properties.get("devicename", "air-Q")
device_id = discovery_info.properties.get(ATTR_PROPERTIES_ID)
if not device_id:
return self.async_abort(reason="incomplete_discovery")
await self.async_set_unique_id(device_id)
self._abort_if_unique_id_configured(
updates={CONF_IP_ADDRESS: self._discovered_host},
reload_on_update=True,
)
self.context["title_placeholders"] = {"name": self._discovered_name}
return await self.async_step_discovery_confirm()
async def async_step_discovery_confirm(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle user confirmation of a discovered air-Q device."""
errors: dict[str, str] = {}
if user_input is not None:
session = async_get_clientsession(self.hass)
airq = AirQ(self._discovered_host, user_input[CONF_PASSWORD], session)
try:
await airq.validate()
except ClientConnectionError:
errors["base"] = "cannot_connect"
except InvalidAuth:
errors["base"] = "invalid_auth"
else:
return self.async_create_entry(
title=self._discovered_name,
data={
CONF_IP_ADDRESS: self._discovered_host,
CONF_PASSWORD: user_input[CONF_PASSWORD],
},
)
return self.async_show_form(
step_id="discovery_confirm",
data_schema=vol.Schema({vol.Required(CONF_PASSWORD): str}),
description_placeholders={"name": self._discovered_name},
errors=errors,
)
@staticmethod
@callback
def async_get_options_flow(


@@ -7,13 +7,5 @@
"integration_type": "hub",
"iot_class": "local_polling",
"loggers": ["aioairq"],
"requirements": ["aioairq==0.4.7"],
"zeroconf": [
{
"properties": {
"device": "air-q"
},
"type": "_http._tcp.local."
}
]
"requirements": ["aioairq==0.4.7"]
}


@@ -1,23 +1,14 @@
{
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"incomplete_discovery": "The discovered air-Q device did not provide a device ID. Ensure the firmware is up to date."
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]"
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
"invalid_auth": "[%key:common::config_flow::error::invalid_auth%]",
"invalid_input": "[%key:common::config_flow::error::invalid_host%]"
},
"flow_title": "{name}",
"step": {
"discovery_confirm": {
"data": {
"password": "[%key:common::config_flow::data::password%]"
},
"description": "Do you want to set up **{name}**?",
"title": "Set up air-Q"
},
"user": {
"data": {
"ip_address": "[%key:common::config_flow::data::ip%]",


@@ -148,11 +148,8 @@ _EXPERIMENTAL_TRIGGER_PLATFORMS = {
"light",
"lock",
"media_player",
"number",
"person",
"remote",
"scene",
"schedule",
"siren",
"switch",
"text",


@@ -804,22 +804,8 @@ class CastMediaPlayerEntity(CastDevice, MediaPlayerEntity):
@property
def state(self) -> MediaPlayerState | None:
"""Return the state of the player."""
if (chromecast := self._chromecast) is None or (
cast_status := self.cast_status
) is None:
# Not connected to any chromecast, or not yet got any status
return None
if (
chromecast.cast_type == pychromecast.const.CAST_TYPE_CHROMECAST
and not chromecast.ignore_cec
and cast_status.is_active_input is False
):
# The display interface for the device has been turned off or switched away
return MediaPlayerState.OFF
# The lovelace app loops media to prevent timing out, don't show that
if self.app_id == CAST_APP_ID_HOMEASSISTANT_LOVELACE:
# The lovelace app loops media to prevent timing out, don't show that
return MediaPlayerState.PLAYING
if (media_status := self._media_status()[0]) is not None:
@@ -836,12 +822,16 @@ class CastMediaPlayerEntity(CastDevice, MediaPlayerEntity):
# Some apps don't report media status, show the player as playing
return MediaPlayerState.PLAYING
if self.app_id in (pychromecast.IDLE_APP_ID, None):
# We have no active app or the home screen app. This is
# same app as APP_BACKDROP.
if self.app_id is not None and self.app_id != pychromecast.config.APP_BACKDROP:
# We have an active app
return MediaPlayerState.IDLE
if self._chromecast is not None and self._chromecast.is_idle:
# If library consider us idle, that is our off state
# it takes HDMI status into account for cast devices.
return MediaPlayerState.OFF
return MediaPlayerState.IDLE
return None
@property
def media_content_id(self) -> str | None:


@@ -27,6 +27,7 @@ from homeassistant.const import (
)
from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.deprecation import deprecated_function
from homeassistant.helpers.entity import Entity, EntityDescription
from homeassistant.helpers.entity_component import EntityComponent
from homeassistant.helpers.typing import ConfigType
@@ -91,11 +92,11 @@ class CoverEntityFeature(IntFlag):
ATTR_CURRENT_POSITION = "current_position"
ATTR_CURRENT_TILT_POSITION = "current_tilt_position"
ATTR_IS_CLOSED = "is_closed"
ATTR_POSITION = "position"
ATTR_TILT_POSITION = "tilt_position"
@deprecated_function("code which checks the state", breaks_in_ha_version="2027.4")
@bind_hass
def is_closed(hass: HomeAssistant, entity_id: str) -> bool:
"""Return if the cover is closed based on the statemachine."""
@@ -268,9 +269,7 @@ class CoverEntity(Entity, cached_properties=CACHED_PROPERTIES_WITH_ATTR_):
@property
def state_attributes(self) -> dict[str, Any]:
"""Return the state attributes."""
data: dict[str, Any] = {}
data[ATTR_IS_CLOSED] = self.is_closed
data = {}
if (current := self.current_cover_position) is not None:
data[ATTR_CURRENT_POSITION] = current


@@ -30,16 +30,9 @@ async def async_setup_entry(
async_add_entities(
[
DemoWaterHeater(
"demo_water_heater",
"Demo Water Heater",
119,
UnitOfTemperature.FAHRENHEIT,
False,
"eco",
1,
"Demo Water Heater", 119, UnitOfTemperature.FAHRENHEIT, False, "eco", 1
),
DemoWaterHeater(
"demo_water_heater_celsius",
"Demo Water Heater Celsius",
45,
UnitOfTemperature.CELSIUS,
@@ -59,7 +52,6 @@ class DemoWaterHeater(WaterHeaterEntity):
def __init__(
self,
unique_id: str,
name: str,
target_temperature: int,
unit_of_measurement: str,
@@ -68,7 +60,6 @@ class DemoWaterHeater(WaterHeaterEntity):
target_temperature_step: float,
) -> None:
"""Initialize the water_heater device."""
self._attr_unique_id = unique_id
self._attr_name = name
if target_temperature is not None:
self._attr_supported_features |= WaterHeaterEntityFeature.TARGET_TEMPERATURE


@@ -13,7 +13,7 @@
},
"user": {
"data": {
"host": "[%key:common::config_flow::data::host%]"
"host": "Host"
},
"description": "Please enter the host name or IP address of the Devialet device."
}


@@ -7,5 +7,5 @@
"integration_type": "hub",
"iot_class": "cloud_push",
"loggers": ["paho_mqtt", "pyeconet"],
"requirements": ["pyeconet==0.2.2"]
"requirements": ["pyeconet==0.2.1"]
}


@@ -1,6 +1,6 @@
"""Support for EnOcean devices."""
from enocean_async import Gateway
from serial import SerialException
import voluptuous as vol
from homeassistant.config_entries import SOURCE_IMPORT, ConfigEntry
@@ -8,15 +8,12 @@ from homeassistant.const import CONF_DEVICE
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.dispatcher import (
async_dispatcher_connect,
async_dispatcher_send,
)
from homeassistant.helpers.typing import ConfigType
from .const import DOMAIN, SIGNAL_RECEIVE_MESSAGE, SIGNAL_SEND_MESSAGE
from .const import DOMAIN
from .dongle import EnOceanDongle
type EnOceanConfigEntry = ConfigEntry[Gateway]
type EnOceanConfigEntry = ConfigEntry[EnOceanDongle]
CONFIG_SCHEMA = vol.Schema(
{DOMAIN: vol.Schema({vol.Required(CONF_DEVICE): cv.string})}, extra=vol.ALLOW_EXTRA
@@ -30,7 +27,7 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
return True
if hass.config_entries.async_entries(DOMAIN):
# We can only have one gateway. If there is already one in the config,
# We can only have one dongle. If there is already one in the config,
# there is no need to import the yaml based config.
return True
@@ -46,31 +43,23 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
async def async_setup_entry(
hass: HomeAssistant, config_entry: EnOceanConfigEntry
) -> bool:
"""Set up an EnOcean gateway for the given entry."""
gateway = Gateway(port=config_entry.data[CONF_DEVICE])
gateway.add_erp1_received_callback(
lambda packet: async_dispatcher_send(hass, SIGNAL_RECEIVE_MESSAGE, packet)
)
"""Set up an EnOcean dongle for the given entry."""
try:
await gateway.start()
except ConnectionError as err:
gateway.stop()
raise ConfigEntryNotReady(f"Failed to start EnOcean gateway: {err}") from err
usb_dongle = EnOceanDongle(hass, config_entry.data[CONF_DEVICE])
except SerialException as err:
raise ConfigEntryNotReady(f"Failed to set up EnOcean dongle: {err}") from err
await usb_dongle.async_setup()
config_entry.runtime_data = usb_dongle
config_entry.runtime_data = gateway
config_entry.async_on_unload(
async_dispatcher_connect(hass, SIGNAL_SEND_MESSAGE, gateway.send_esp3_packet)
)
return True
async def async_unload_entry(
hass: HomeAssistant, config_entry: EnOceanConfigEntry
) -> bool:
"""Unload EnOcean config entry: stop the gateway."""
"""Unload EnOcean config entry."""
enocean_dongle = config_entry.runtime_data
enocean_dongle.unload()
config_entry.runtime_data.stop()
return True


@@ -2,7 +2,7 @@
from __future__ import annotations
from enocean_async import ERP1Telegram
from enocean.utils import combine_hex
import voluptuous as vol
from homeassistant.components.binary_sensor import (
@@ -17,7 +17,7 @@ from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from .entity import EnOceanEntity, combine_hex
from .entity import EnOceanEntity
DEFAULT_NAME = "EnOcean binary sensor"
DEPENDENCIES = ["enocean"]
@@ -68,25 +68,29 @@ class EnOceanBinarySensor(EnOceanEntity, BinarySensorEntity):
self._attr_unique_id = f"{combine_hex(dev_id)}-{device_class}"
self._attr_name = dev_name
def value_changed(self, telegram: ERP1Telegram) -> None:
def value_changed(self, packet):
"""Fire an event with the data that have changed.
This method is called when there is an incoming packet associated
with this platform.
Example packet data:
- 2nd button pressed
['0xf6', '0x10', '0x00', '0x2d', '0xcf', '0x45', '0x30']
- button released
['0xf6', '0x00', '0x00', '0x2d', '0xcf', '0x45', '0x20']
"""
if not self.address:
return
# Energy Bow
pushed = None
if telegram.status == 0x30:
if packet.data[6] == 0x30:
pushed = 1
elif telegram.status == 0x20:
elif packet.data[6] == 0x20:
pushed = 0
self.schedule_update_ha_state()
action = telegram.telegram_data[0]
action = packet.data[1]
if action == 0x70:
self.which = 0
self.onoff = 0
@@ -108,7 +112,7 @@ class EnOceanBinarySensor(EnOceanEntity, BinarySensorEntity):
self.hass.bus.fire(
EVENT_BUTTON_PRESSED,
{
"id": self.address.to_bytelist(),
"id": self.dev_id,
"pushed": pushed,
"which": self.which,
"onoff": self.onoff,


@@ -1,9 +1,7 @@
"""Config flows for the EnOcean integration."""
import glob
from typing import Any
from enocean_async import Gateway
import voluptuous as vol
from homeassistant.components import usb
@@ -21,6 +19,7 @@ from homeassistant.helpers.selector import (
)
from homeassistant.helpers.service_info.usb import UsbServiceInfo
from . import dongle
from .const import DOMAIN, ERROR_INVALID_DONGLE_PATH, LOGGER, MANUFACTURER
MANUAL_SCHEMA = vol.Schema(
@@ -30,24 +29,6 @@ MANUAL_SCHEMA = vol.Schema(
)
def _detect_usb_dongle() -> list[str]:
"""Return a list of candidate paths for USB EnOcean dongles.
This method is currently a bit simplistic, it may need to be
improved to support more configurations and OS.
"""
globs_to_test = [
"/dev/tty*FTOA2PV*",
"/dev/serial/by-id/*EnOcean*",
"/dev/tty.usbserial-*",
]
found_paths = []
for current_glob in globs_to_test:
found_paths.extend(glob.glob(current_glob))
return found_paths
class EnOceanFlowHandler(ConfigFlow, domain=DOMAIN):
"""Handle the enOcean config flows."""
@@ -126,7 +107,7 @@ class EnOceanFlowHandler(ConfigFlow, domain=DOMAIN):
return await self.async_step_manual()
return await self.async_step_manual(user_input)
devices = await self.hass.async_add_executor_job(_detect_usb_dongle)
devices = await self.hass.async_add_executor_job(dongle.detect)
if len(devices) == 0:
return await self.async_step_manual()
devices.append(self.MANUAL_PATH_VALUE)
@@ -165,17 +146,7 @@ class EnOceanFlowHandler(ConfigFlow, domain=DOMAIN):
async def validate_enocean_conf(self, user_input) -> bool:
"""Return True if the user_input contains a valid dongle path."""
dongle_path = user_input[CONF_DEVICE]
try:
# Starting the gateway will raise an exception if it can't connect
gateway = Gateway(port=dongle_path)
await gateway.start()
except ConnectionError as exception:
LOGGER.warning("Dongle path %s is invalid: %s", dongle_path, str(exception))
return False
finally:
gateway.stop()
return True
return await self.hass.async_add_executor_job(dongle.validate_path, dongle_path)
def create_enocean_entry(self, user_input):
"""Create an entry for the provided configuration."""


@@ -0,0 +1,88 @@
"""Representation of an EnOcean dongle."""
import glob
import logging
from os.path import basename, normpath
from enocean.communicators import SerialCommunicator
from enocean.protocol.packet import RadioPacket
import serial
from homeassistant.helpers.dispatcher import async_dispatcher_connect, dispatcher_send
from .const import SIGNAL_RECEIVE_MESSAGE, SIGNAL_SEND_MESSAGE
_LOGGER = logging.getLogger(__name__)
class EnOceanDongle:
"""Representation of an EnOcean dongle.
The dongle is responsible for receiving the EnOcean frames,
creating devices if needed, and dispatching messages to platforms.
"""
def __init__(self, hass, serial_path):
"""Initialize the EnOcean dongle."""
self._communicator = SerialCommunicator(
port=serial_path, callback=self.callback
)
self.serial_path = serial_path
self.identifier = basename(normpath(serial_path))
self.hass = hass
self.dispatcher_disconnect_handle = None
async def async_setup(self):
"""Finish the setup of the bridge and supported platforms."""
self._communicator.start()
self.dispatcher_disconnect_handle = async_dispatcher_connect(
self.hass, SIGNAL_SEND_MESSAGE, self._send_message_callback
)
def unload(self):
"""Disconnect callbacks established at init time."""
if self.dispatcher_disconnect_handle:
self.dispatcher_disconnect_handle()
self.dispatcher_disconnect_handle = None
def _send_message_callback(self, command):
"""Send a command through the EnOcean dongle."""
self._communicator.send(command)
def callback(self, packet):
"""Handle EnOcean device's callback.
This is the callback function called by python-enocean whenever there
is an incoming packet.
"""
if isinstance(packet, RadioPacket):
_LOGGER.debug("Received radio packet: %s", packet)
dispatcher_send(self.hass, SIGNAL_RECEIVE_MESSAGE, packet)
def detect():
"""Return a list of candidate paths for USB EnOcean dongles.
This method is currently a bit simplistic, it may need to be
improved to support more configurations and OS.
"""
globs_to_test = ["/dev/tty*FTOA2PV*", "/dev/serial/by-id/*EnOcean*"]
found_paths = []
for current_glob in globs_to_test:
found_paths.extend(glob.glob(current_glob))
return found_paths
def validate_path(path: str):
"""Return True if the provided path points to a valid serial port, False otherwise."""
try:
# Creating the serial communicator will raise an exception
# if it cannot connect
SerialCommunicator(port=path)
except serial.SerialException as exception:
_LOGGER.warning("Dongle path %s is invalid: %s", path, str(exception))
return False
return True


@@ -1,23 +1,12 @@
"""Representation of an EnOcean device."""
from enocean_async import EURID, Address, BaseAddress, ERP1Telegram, SenderAddress
from enocean_async.esp3.packet import ESP3Packet, ESP3PacketType
from enocean.protocol.packet import Packet
from enocean.utils import combine_hex
from homeassistant.helpers.dispatcher import async_dispatcher_connect, dispatcher_send
from homeassistant.helpers.entity import Entity
from .const import LOGGER, SIGNAL_RECEIVE_MESSAGE, SIGNAL_SEND_MESSAGE
def combine_hex(dev_id: list[int]) -> int:
"""Combine list of integer values to one big integer.
This function replaces the previously used function from the enocean library and is considered tech debt that will have to be replaced.
"""
value = 0
for byte in dev_id:
value = (value << 8) | (byte & 0xFF)
return value
from .const import SIGNAL_RECEIVE_MESSAGE, SIGNAL_SEND_MESSAGE
class EnOceanEntity(Entity):
@@ -25,16 +14,7 @@ class EnOceanEntity(Entity):
def __init__(self, dev_id: list[int]) -> None:
"""Initialize the device."""
self.address: SenderAddress | None = None
try:
address = Address.from_bytelist(dev_id)
if address.is_eurid():
self.address = EURID.from_number(address.to_number())
elif address.is_base_address():
self.address = BaseAddress.from_number(address.to_number())
except ValueError:
self.address = None
self.dev_id = dev_id
async def async_added_to_hass(self) -> None:
"""Register callbacks."""
@@ -44,25 +24,17 @@ class EnOceanEntity(Entity):
)
)
def _message_received_callback(self, telegram: ERP1Telegram) -> None:
def _message_received_callback(self, packet):
"""Handle incoming packets."""
if not self.address:
return
if telegram.sender == self.address:
self.value_changed(telegram)
if packet.sender_int == combine_hex(self.dev_id):
self.value_changed(packet)
def value_changed(self, telegram: ERP1Telegram) -> None:
def value_changed(self, packet):
"""Update the internal state of the device when a packet arrives."""
def send_command(
self, data: list[int], optional: list[int], packet_type: ESP3PacketType
) -> None:
"""Send a command via the EnOcean dongle, if data and optional are valid bytes; otherwise, ignore."""
try:
packet = ESP3Packet(packet_type, data=bytes(data), optional=bytes(optional))
dispatcher_send(self.hass, SIGNAL_SEND_MESSAGE, packet)
except ValueError as err:
LOGGER.warning(
"Failed to send command: invalid data or optional bytes: %s", err
)
def send_command(self, data, optional, packet_type):
"""Send a command via the EnOcean dongle."""
packet = Packet(packet_type, data=data, optional=optional)
dispatcher_send(self.hass, SIGNAL_SEND_MESSAGE, packet)


@@ -5,8 +5,7 @@ from __future__ import annotations
import math
from typing import Any
from enocean_async import ERP1Telegram
from enocean_async.esp3.packet import ESP3PacketType
from enocean.utils import combine_hex
import voluptuous as vol
from homeassistant.components.light import (
@@ -21,7 +20,7 @@ from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from .entity import EnOceanEntity, combine_hex
from .entity import EnOceanEntity
CONF_SENDER_ID = "sender_id"
@@ -76,8 +75,7 @@ class EnOceanLight(EnOceanEntity, LightEntity):
command = [0xA5, 0x02, bval, 0x01, 0x09]
command.extend(self._sender_id)
command.extend([0x00])
packet_type = ESP3PacketType(0x01)
self.send_command(command, [], packet_type)
self.send_command(command, [], 0x01)
self._attr_is_on = True
def turn_off(self, **kwargs: Any) -> None:
@@ -85,18 +83,17 @@ class EnOceanLight(EnOceanEntity, LightEntity):
command = [0xA5, 0x02, 0x00, 0x01, 0x09]
command.extend(self._sender_id)
command.extend([0x00])
packet_type = ESP3PacketType(0x01)
self.send_command(command, [], packet_type)
self.send_command(command, [], 0x01)
self._attr_is_on = False
def value_changed(self, telegram: ERP1Telegram) -> None:
def value_changed(self, packet):
"""Update the internal state of this device.
Dimmer devices like Eltako FUD61 send telegram in different RORGs.
We only care about the 4BS (0xA5).
"""
if telegram.rorg == 0xA5 and telegram.telegram_data[0] == 0x02:
val = telegram.telegram_data[1]
if packet.data[0] == 0xA5 and packet.data[1] == 0x02:
val = packet.data[2]
self._attr_brightness = math.floor(val / 100.0 * 256.0)
self._attr_is_on = bool(val != 0)
self.schedule_update_ha_state()


@@ -7,8 +7,8 @@
"documentation": "https://www.home-assistant.io/integrations/enocean",
"integration_type": "hub",
"iot_class": "local_push",
"loggers": ["enocean_async"],
"requirements": ["enocean-async==0.4.1"],
"loggers": ["enocean"],
"requirements": ["enocean==0.50"],
"single_config_entry": true,
"usb": [
{


@@ -5,7 +5,7 @@ from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass
from enocean_async import EEP, EEP_SPECIFICATIONS, EEPHandler, EEPMessage, ERP1Telegram
from enocean.utils import combine_hex
import voluptuous as vol
from homeassistant.components.sensor import (
@@ -30,7 +30,7 @@ from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from .entity import EnOceanEntity, combine_hex
from .entity import EnOceanEntity
CONF_MAX_TEMP = "max_temp"
CONF_MIN_TEMP = "min_temp"
@@ -166,7 +166,7 @@ class EnOceanSensor(EnOceanEntity, RestoreSensor):
if (sensor_data := await self.async_get_last_sensor_data()) is not None:
self._attr_native_value = sensor_data.native_value
def value_changed(self, telegram: ERP1Telegram) -> None:
def value_changed(self, packet):
"""Update the internal state of the sensor."""
@@ -177,19 +177,15 @@ class EnOceanPowerSensor(EnOceanSensor):
- A5-12-01 (Automated Meter Reading, Electricity)
"""
def value_changed(self, telegram: ERP1Telegram) -> None:
def value_changed(self, packet):
"""Update the internal state of the sensor."""
if telegram.rorg != 0xA5:
if packet.rorg != 0xA5:
return
if (eep := EEP_SPECIFICATIONS.get(EEP(0xA5, 0x12, 0x01))) is None:
return
msg: EEPMessage = EEPHandler(eep).decode(telegram)
if "DT" in msg.values and msg.values["DT"].raw == 1:
packet.parse_eep(0x12, 0x01)
if packet.parsed["DT"]["raw_value"] == 1:
# this packet reports the current value
raw_val = msg.values["MR"].raw
divisor = msg.values["DIV"].raw
raw_val = packet.parsed["MR"]["raw_value"]
divisor = packet.parsed["DIV"]["raw_value"]
self._attr_native_value = raw_val / (10**divisor)
self.schedule_update_ha_state()
@@ -230,13 +226,13 @@ class EnOceanTemperatureSensor(EnOceanSensor):
self.range_from = range_from
self.range_to = range_to
def value_changed(self, telegram: ERP1Telegram) -> None:
def value_changed(self, packet):
"""Update the internal state of the sensor."""
if telegram.rorg != 0xA5:
if packet.data[0] != 0xA5:
return
temp_scale = self._scale_max - self._scale_min
temp_range = self.range_to - self.range_from
raw_val = telegram.telegram_data[2]
raw_val = packet.data[3]
temperature = temp_scale / temp_range * (raw_val - self.range_from)
temperature += self._scale_min
self._attr_native_value = round(temperature, 1)
@@ -252,11 +248,11 @@ class EnOceanHumiditySensor(EnOceanSensor):
- A5-10-10 to A5-10-14 (Room Operating Panels)
"""
def value_changed(self, telegram: ERP1Telegram) -> None:
def value_changed(self, packet):
"""Update the internal state of the sensor."""
if telegram.rorg != 0xA5:
if packet.rorg != 0xA5:
return
humidity = telegram.telegram_data[1] * 100 / 250
humidity = packet.data[2] * 100 / 250
self._attr_native_value = round(humidity, 1)
self.schedule_update_ha_state()
@@ -268,9 +264,9 @@ class EnOceanWindowHandle(EnOceanSensor):
- F6-10-00 (Mechanical handle / Hoppe AG)
"""
def value_changed(self, telegram: ERP1Telegram) -> None:
def value_changed(self, packet):
"""Update the internal state of the sensor."""
action = (telegram.telegram_data[0] & 0x70) >> 4
action = (packet.data[1] & 0x70) >> 4
if action == 0x07:
self._attr_native_value = STATE_CLOSED
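The temperature handler above converts a raw telegram byte to degrees with a linear map between a configured raw range and the sensor's scale. A standalone sketch of that arithmetic; the inverted 255→0 raw range is an assumption, typical of A5-02 style profiles:

```python
def map_raw_temperature(
    raw_val: int,
    range_from: int,
    range_to: int,
    scale_min: float,
    scale_max: float,
) -> float:
    """Linearly map a raw telegram byte onto the configured temperature scale."""
    temp_scale = scale_max - scale_min
    temp_range = range_to - range_from
    temperature = temp_scale / temp_range * (raw_val - range_from)
    return round(temperature + scale_min, 1)


# Assumed A5-02 style profile: raw 255..0 maps to 0..40 °C
assert map_raw_temperature(255, 255, 0, 0.0, 40.0) == 0.0
assert map_raw_temperature(0, 255, 0, 0.0, 40.0) == 40.0
```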


@@ -4,8 +4,7 @@ from __future__ import annotations
from typing import Any
from enocean_async import EEP, EEP_SPECIFICATIONS, EEPHandler, EEPMessage, ERP1Telegram
from enocean_async.esp3.packet import ESP3PacketType
from enocean.utils import combine_hex
import voluptuous as vol
from homeassistant.components.switch import (
@@ -19,7 +18,7 @@ from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from .const import DOMAIN, LOGGER
from .entity import EnOceanEntity, combine_hex
from .entity import EnOceanEntity
CONF_CHANNEL = "channel"
DEFAULT_NAME = "EnOcean Switch"
@@ -87,68 +86,52 @@ class EnOceanSwitch(EnOceanEntity, SwitchEntity):
"""Initialize the EnOcean switch device."""
super().__init__(dev_id)
self._light = None
self.channel: int = channel
self.channel = channel
self._attr_unique_id = generate_unique_id(dev_id, channel)
self._attr_name = dev_name
def turn_on(self, **kwargs: Any) -> None:
"""Turn on the switch."""
if not self.address:
return
optional = [0x03]
optional.extend(self.address.to_bytelist())
optional.extend(self.dev_id)
optional.extend([0xFF, 0x00])
self.send_command(
data=[0xD2, 0x01, self.channel & 0xFF, 0x64, 0x00, 0x00, 0x00, 0x00, 0x00],
optional=optional,
packet_type=ESP3PacketType(0x01),
packet_type=0x01,
)
self._attr_is_on = True
def turn_off(self, **kwargs: Any) -> None:
"""Turn off the switch."""
if not self.address:
return
optional = [0x03]
optional.extend(self.address.to_bytelist())
optional.extend(self.dev_id)
optional.extend([0xFF, 0x00])
self.send_command(
data=[0xD2, 0x01, self.channel & 0xFF, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00],
optional=optional,
packet_type=ESP3PacketType(0x01),
packet_type=0x01,
)
self._attr_is_on = False
def value_changed(self, telegram: ERP1Telegram) -> None:
def value_changed(self, packet):
"""Update the internal state of the switch."""
if telegram.rorg == 0xA5:
# power meter telegram, turn on if > 1 watts
if (eep := EEP_SPECIFICATIONS.get(EEP(0xA5, 0x12, 0x01))) is None:
LOGGER.warning("EEP A5-12-01 cannot be decoded")
return
msg: EEPMessage = EEPHandler(eep).decode(telegram)
if "DT" in msg.values and msg.values["DT"].raw == 1:
# this packet reports the current value
raw_val = msg.values["MR"].raw
divisor = msg.values["DIV"].raw
if packet.data[0] == 0xA5:
# power meter telegram, turn on if > 10 watts
packet.parse_eep(0x12, 0x01)
if packet.parsed["DT"]["raw_value"] == 1:
raw_val = packet.parsed["MR"]["raw_value"]
divisor = packet.parsed["DIV"]["raw_value"]
watts = raw_val / (10**divisor)
if watts > 1:
self._attr_is_on = True
self.schedule_update_ha_state()
elif telegram.rorg == 0xD2:
elif packet.data[0] == 0xD2:
# actuator status telegram
if (eep := EEP_SPECIFICATIONS.get(EEP(0xD2, 0x01, 0x01))) is None:
LOGGER.warning("EEP D2-01-01 cannot be decoded")
return
msg = EEPHandler(eep).decode(telegram)
if msg.values["CMD"].raw == 4:
channel = msg.values["I/O"].raw
output = msg.values["OV"].raw
packet.parse_eep(0x01, 0x01)
if packet.parsed["CMD"]["raw_value"] == 4:
channel = packet.parsed["IO"]["raw_value"]
output = packet.parsed["OV"]["raw_value"]
if channel == self.channel:
self._attr_is_on = output > 0
self.schedule_update_ha_state()
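The power-meter branch above scales the raw reading by a power-of-ten divisor and switches on above 1 W. The same decision as a self-contained sketch, with the EEP decoding replaced by plain parameters:

```python
def is_on_from_power(raw_value: int, divisor: int, threshold_watts: float = 1.0) -> bool:
    """Return True when the scaled A5-12-01 meter reading exceeds the threshold.

    watts = raw_value / 10**divisor, matching the scaling in the handler above.
    """
    watts = raw_value / (10**divisor)
    return watts > threshold_watts


assert is_on_from_power(2500, 1) is True   # 250.0 W
assert is_on_from_power(5, 1) is False     # 0.5 W
```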


@@ -275,11 +275,8 @@ class FibaroController:
# otherwise add the first visible device in the group
# which is a hack, but solves a problem with FGT having
# hidden compatibility devices before the real device
# Second hack is for quickapps which have parent id 0 and no children
if (
last_climate_parent != device.parent_fibaro_id
or (device.has_endpoint_id and last_endpoint != device.endpoint_id)
or device.parent_fibaro_id == 0
if last_climate_parent != device.parent_fibaro_id or (
device.has_endpoint_id and last_endpoint != device.endpoint_id
):
_LOGGER.debug("Handle separately")
self.fibaro_devices[platform].append(device)


@@ -154,7 +154,7 @@
},
"issues": {
"deprecated_fireplace_switch": {
"description": "The fireplace mode switch entity `{entity_id}` is deprecated and will be removed in Home Assistant 2026.9.\n\nFireplace mode has been moved to a climate preset on the climate entity to better match the device interface.\n\nPlease update your automations to use the `climate.set_preset_mode` action with preset mode `fireplace` instead of using the switch entity.\n\nAfter updating your automations, you can safely disable this switch entity.",
"description": "The fireplace mode switch entity `{entity_id}` is deprecated and will be removed in a future version.\n\nFireplace mode has been moved to a climate preset on the climate entity to better match the device interface.\n\nPlease update your automations to use the `climate.set_preset_mode` action with preset mode `fireplace` instead of using the switch entity.\n\nAfter updating your automations, you can safely disable this switch entity.",
"title": "Fireplace mode switch is deprecated"
}
}


@@ -91,7 +91,6 @@ async def async_setup_entry(
hass,
DOMAIN,
f"deprecated_switch_{fireplace_switch_unique_id}",
breaks_in_ha_version="2026.9.0",
is_fixable=False,
issue_domain=DOMAIN,
severity=IssueSeverity.WARNING,
@@ -103,7 +102,7 @@ async def async_setup_entry(
entities.append(FlexitSwitch(coordinator, description))
else:
entities.append(FlexitSwitch(coordinator, description))
async_add_entities(entities)
async_add_entities(entities)
PARALLEL_UPDATES = 1


@@ -21,5 +21,5 @@
"integration_type": "system",
"preview_features": { "winter_mode": {} },
"quality_scale": "internal",
"requirements": ["home-assistant-frontend==20260304.0"]
"requirements": ["home-assistant-frontend==20260302.0"]
}


@@ -2,7 +2,6 @@
from __future__ import annotations
from collections.abc import Mapping
import logging
from typing import Any
@@ -24,64 +23,12 @@ STEP_USER_DATA_SCHEMA = vol.Schema(
}
)
STEP_REAUTH_DATA_SCHEMA = vol.Schema(
{
vol.Required(CONF_ADMIN_API_KEY): str,
}
)
class GhostConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle a config flow for Ghost."""
VERSION = 1
async def async_step_reauth(
self, entry_data: Mapping[str, Any]
) -> ConfigFlowResult:
"""Handle reauthentication."""
return await self.async_step_reauth_confirm()
async def async_step_reauth_confirm(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle reauth confirmation."""
reauth_entry = self._get_reauth_entry()
errors: dict[str, str] = {}
if user_input is not None:
admin_api_key = user_input[CONF_ADMIN_API_KEY]
if ":" not in admin_api_key:
errors["base"] = "invalid_api_key"
else:
try:
await self._validate_credentials(
reauth_entry.data[CONF_API_URL], admin_api_key
)
except GhostAuthError:
errors["base"] = "invalid_auth"
except GhostError:
errors["base"] = "cannot_connect"
except Exception:
_LOGGER.exception("Unexpected error during Ghost reauth")
errors["base"] = "unknown"
else:
return self.async_update_reload_and_abort(
reauth_entry,
data_updates=user_input,
)
return self.async_show_form(
step_id="reauth_confirm",
data_schema=STEP_REAUTH_DATA_SCHEMA,
errors=errors,
description_placeholders={
"title": reauth_entry.title,
"docs_url": "https://account.ghost.org/?r=settings/integrations/new",
},
)
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
@@ -142,7 +89,7 @@ class GhostConfigFlow(ConfigFlow, domain=DOMAIN):
site_title = site["title"]
await self.async_set_unique_id(site["site_uuid"])
await self.async_set_unique_id(site["uuid"])
self._abort_if_unique_id_configured()
return self.async_create_entry(


@@ -7,6 +7,6 @@
"integration_type": "service",
"iot_class": "cloud_polling",
"loggers": ["aioghost"],
"quality_scale": "silver",
"quality_scale": "bronze",
"requirements": ["aioghost==0.4.0"]
}


@@ -38,7 +38,7 @@ rules:
integration-owner: done
log-when-unavailable: done
parallel-updates: done
reauthentication-flow: done
reauthentication-flow: todo
test-coverage: done
# Gold


@@ -1,8 +1,7 @@
{
"config": {
"abort": {
"already_configured": "This Ghost site is already configured.",
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]"
"already_configured": "This Ghost site is already configured."
},
"error": {
"cannot_connect": "Failed to connect to Ghost. Please check your URL.",
@@ -11,16 +10,6 @@
"unknown": "An unexpected error occurred."
},
"step": {
"reauth_confirm": {
"data": {
"admin_api_key": "[%key:component::ghost::config::step::user::data::admin_api_key%]"
},
"data_description": {
"admin_api_key": "[%key:component::ghost::config::step::user::data_description::admin_api_key%]"
},
"description": "Your API key for {title} is invalid. [Create a new integration key]({docs_url}) to reauthenticate.",
"title": "[%key:common::config_flow::title::reauth%]"
},
"user": {
"data": {
"admin_api_key": "Admin API key",


@@ -1,5 +1,7 @@
"""Constants for the Home Connect integration."""
from typing import cast
from aiohomeconnect.model import EventKey, OptionKey, ProgramKey, SettingKey, StatusKey
from homeassistant.const import UnitOfTemperature, UnitOfTime, UnitOfVolume
@@ -74,9 +76,9 @@ AFFECTS_TO_SELECTED_PROGRAM = "selected_program"
TRANSLATION_KEYS_PROGRAMS_MAP = {
bsh_key_to_translation_key(program.value): program
bsh_key_to_translation_key(program.value): cast(ProgramKey, program)
for program in ProgramKey
if program not in (ProgramKey.UNKNOWN, ProgramKey.BSH_COMMON_FAVORITE_001)
if program != ProgramKey.UNKNOWN
}
PROGRAMS_TRANSLATION_KEYS_MAP = {


@@ -23,6 +23,6 @@
"iot_class": "cloud_push",
"loggers": ["aiohomeconnect"],
"quality_scale": "platinum",
"requirements": ["aiohomeconnect==0.30.0"],
"requirements": ["aiohomeconnect==0.28.0"],
"zeroconf": ["_homeconnect._tcp.local."]
}


@@ -403,7 +403,7 @@ class HomeConnectProgramSelectEntity(HomeConnectEntity, SelectEntity):
self._attr_options = [
PROGRAMS_TRANSLATION_KEYS_MAP[program.key]
for program in self.appliance.programs
if program.key in PROGRAMS_TRANSLATION_KEYS_MAP
if program.key != ProgramKey.UNKNOWN
and (
program.constraints is None
or program.constraints.execution


@@ -610,7 +610,6 @@ SENSORS: Final[tuple[HomeWizardSensorEntityDescription, ...]] = (
key="active_liter_lpm",
translation_key="active_liter_lpm",
native_unit_of_measurement=UnitOfVolumeFlowRate.LITERS_PER_MINUTE,
device_class=SensorDeviceClass.VOLUME_FLOW_RATE,
state_class=SensorStateClass.MEASUREMENT,
has_fn=lambda data: data.measurement.active_liter_lpm is not None,
value_fn=lambda data: data.measurement.active_liter_lpm,


@@ -627,17 +627,13 @@ class IntentHandleView(http.HomeAssistantView):
{
vol.Required("name"): cv.string,
vol.Optional("data"): vol.Schema({cv.string: object}),
vol.Optional("language"): cv.string,
vol.Optional("assistant"): vol.Any(cv.string, None),
vol.Optional("device_id"): vol.Any(cv.string, None),
vol.Optional("satellite_id"): vol.Any(cv.string, None),
}
)
)
async def post(self, request: web.Request, data: dict[str, Any]) -> web.Response:
"""Handle intent with name/data."""
hass = request.app[http.KEY_HASS]
language = data.get("language", hass.config.language)
language = hass.config.language
try:
intent_name = data["name"]
@@ -645,21 +641,14 @@ class IntentHandleView(http.HomeAssistantView):
key: {"value": value} for key, value in data.get("data", {}).items()
}
intent_result = await intent.async_handle(
hass,
DOMAIN,
intent_name,
slots,
"",
self.context(request),
language=language,
assistant=data.get("assistant"),
device_id=data.get("device_id"),
satellite_id=data.get("satellite_id"),
hass, DOMAIN, intent_name, slots, "", self.context(request)
)
except (intent.IntentHandleError, intent.MatchFailedError) as err:
intent_result = intent.IntentResponse(language=language)
intent_result.async_set_error(
intent.IntentResponseErrorCode.FAILED_TO_HANDLE, str(err)
)
intent_result.async_set_speech(str(err))
if intent_result is None:
intent_result = intent.IntentResponse(language=language) # type: ignore[unreachable]
intent_result.async_set_speech("Sorry, I couldn't handle that")
return self.json(intent_result)


@@ -2,21 +2,66 @@
from __future__ import annotations
from dataclasses import dataclass
from datetime import timedelta
import logging
import pyiss
import requests
from requests.exceptions import HTTPError
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .coordinator import IssConfigEntry, IssDataUpdateCoordinator
from .const import DOMAIN
_LOGGER = logging.getLogger(__name__)
PLATFORMS = [Platform.SENSOR]
async def async_setup_entry(hass: HomeAssistant, entry: IssConfigEntry) -> bool:
@dataclass
class IssData:
"""Dataclass representation of data returned from pyiss."""
number_of_people_in_space: int
current_location: dict[str, str]
def update(iss: pyiss.ISS) -> IssData:
"""Retrieve data from the pyiss API."""
return IssData(
number_of_people_in_space=iss.number_of_people_in_space(),
current_location=iss.current_location(),
)
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Set up this integration using UI."""
coordinator = IssDataUpdateCoordinator(hass, entry)
hass.data.setdefault(DOMAIN, {})
iss = pyiss.ISS()
async def async_update() -> IssData:
try:
return await hass.async_add_executor_job(update, iss)
except (HTTPError, requests.exceptions.ConnectionError) as ex:
raise UpdateFailed("Unable to retrieve data") from ex
coordinator = DataUpdateCoordinator(
hass,
_LOGGER,
config_entry=entry,
name=DOMAIN,
update_method=async_update,
update_interval=timedelta(seconds=60),
)
await coordinator.async_config_entry_first_refresh()
entry.runtime_data = coordinator
hass.data[DOMAIN] = coordinator
entry.async_on_unload(entry.add_update_listener(update_listener))
@@ -25,11 +70,13 @@ async def async_setup_entry(hass: HomeAssistant, entry: IssConfigEntry) -> bool:
return True
async def async_unload_entry(hass: HomeAssistant, entry: IssConfigEntry) -> bool:
async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Handle removal of an entry."""
return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
if unload_ok := await hass.config_entries.async_unload_platforms(entry, PLATFORMS):
del hass.data[DOMAIN]
return unload_ok
async def update_listener(hass: HomeAssistant, entry: IssConfigEntry) -> None:
async def update_listener(hass: HomeAssistant, entry: ConfigEntry) -> None:
"""Handle options update."""
await hass.config_entries.async_reload(entry.entry_id)


@@ -4,12 +4,16 @@ from __future__ import annotations
import voluptuous as vol
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult, OptionsFlow
from homeassistant.config_entries import (
ConfigEntry,
ConfigFlow,
ConfigFlowResult,
OptionsFlow,
)
from homeassistant.const import CONF_SHOW_ON_MAP
from homeassistant.core import callback
from .const import DEFAULT_NAME, DOMAIN
from .coordinator import IssConfigEntry
class ISSConfigFlow(ConfigFlow, domain=DOMAIN):
@@ -20,7 +24,7 @@ class ISSConfigFlow(ConfigFlow, domain=DOMAIN):
@staticmethod
@callback
def async_get_options_flow(
config_entry: IssConfigEntry,
config_entry: ConfigEntry,
) -> OptionsFlowHandler:
"""Get the options flow for this handler."""
return OptionsFlowHandler()


@@ -3,5 +3,3 @@
DOMAIN = "iss"
DEFAULT_NAME = "ISS"
MAX_CONSECUTIVE_FAILURES = 5


@@ -1,76 +0,0 @@
"""DataUpdateCoordinator for the ISS integration."""
from __future__ import annotations
from dataclasses import dataclass
from datetime import timedelta
import logging
import pyiss
import requests
from requests.exceptions import HTTPError
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import DOMAIN, MAX_CONSECUTIVE_FAILURES
type IssConfigEntry = ConfigEntry[IssDataUpdateCoordinator]
_LOGGER = logging.getLogger(__name__)
@dataclass
class IssData:
"""Dataclass representation of data returned from pyiss."""
number_of_people_in_space: int
current_location: dict[str, str]
class IssDataUpdateCoordinator(DataUpdateCoordinator[IssData]):
"""ISS coordinator that tolerates transient API failures."""
config_entry: IssConfigEntry
def __init__(self, hass: HomeAssistant, entry: IssConfigEntry) -> None:
"""Initialize the ISS coordinator."""
super().__init__(
hass,
_LOGGER,
config_entry=entry,
name=DOMAIN,
update_interval=timedelta(seconds=60),
)
self._consecutive_failures = 0
self.iss = pyiss.ISS()
def _fetch_iss_data(self) -> IssData:
"""Fetch data from ISS API (blocking)."""
return IssData(
number_of_people_in_space=self.iss.number_of_people_in_space(),
current_location=self.iss.current_location(),
)
async def _async_update_data(self) -> IssData:
"""Fetch data from the ISS API, tolerating transient failures."""
try:
data = await self.hass.async_add_executor_job(self._fetch_iss_data)
except (HTTPError, requests.exceptions.ConnectionError) as err:
self._consecutive_failures += 1
if self.data is None:
raise UpdateFailed("Unable to retrieve data") from err
if self._consecutive_failures >= MAX_CONSECUTIVE_FAILURES:
raise UpdateFailed(
f"Unable to retrieve data after {self._consecutive_failures} consecutive update failures"
) from err
_LOGGER.debug(
"Transient API error (%s/%s), using cached data: %s",
self._consecutive_failures,
MAX_CONSECUTIVE_FAILURES,
err,
)
return self.data
self._consecutive_failures = 0
return data
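The deleted coordinator above tolerated transient API failures by serving cached data until a consecutive-failure threshold was hit. The counting logic can be sketched on its own, with hypothetical names and the Home Assistant machinery stripped away:

```python
from collections.abc import Callable

MAX_CONSECUTIVE_FAILURES = 5


class FailureTolerantFetcher:
    """Serve cached data through transient failures, up to a threshold."""

    def __init__(self, fetch: Callable[[], dict]) -> None:
        self._fetch = fetch
        self._consecutive_failures = 0
        self.data: dict | None = None

    def update(self) -> dict:
        try:
            fresh = self._fetch()
        except OSError as err:
            self._consecutive_failures += 1
            # No cached data yet, or too many failures in a row: give up.
            if self.data is None or self._consecutive_failures >= MAX_CONSECUTIVE_FAILURES:
                raise RuntimeError("Unable to retrieve data") from err
            return self.data  # fall back to the last good data
        self._consecutive_failures = 0
        self.data = fresh
        return fresh
```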


@@ -6,32 +6,36 @@ import logging
from typing import Any
from homeassistant.components.sensor import SensorEntity
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import ATTR_LATITUDE, ATTR_LONGITUDE, CONF_SHOW_ON_MAP
from homeassistant.core import HomeAssistant
from homeassistant.helpers.device_registry import DeviceEntryType, DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from homeassistant.helpers.update_coordinator import (
CoordinatorEntity,
DataUpdateCoordinator,
)
from . import IssData
from .const import DEFAULT_NAME, DOMAIN
from .coordinator import IssConfigEntry, IssDataUpdateCoordinator
_LOGGER = logging.getLogger(__name__)
async def async_setup_entry(
hass: HomeAssistant,
entry: IssConfigEntry,
entry: ConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the sensor platform."""
coordinator = entry.runtime_data
coordinator: DataUpdateCoordinator[IssData] = hass.data[DOMAIN]
show_on_map = entry.options.get(CONF_SHOW_ON_MAP, False)
async_add_entities([IssSensor(coordinator, entry, show_on_map)])
class IssSensor(CoordinatorEntity[IssDataUpdateCoordinator], SensorEntity):
class IssSensor(CoordinatorEntity[DataUpdateCoordinator[IssData]], SensorEntity):
"""Implementation of the ISS sensor."""
_attr_has_entity_name = True
@@ -39,8 +43,8 @@ class IssSensor(CoordinatorEntity[IssDataUpdateCoordinator], SensorEntity):
def __init__(
self,
coordinator: IssDataUpdateCoordinator,
entry: IssConfigEntry,
coordinator: DataUpdateCoordinator[IssData],
entry: ConfigEntry,
show: bool,
) -> None:
"""Initialize the sensor."""


@@ -7,5 +7,5 @@
"iot_class": "local_push",
"loggers": ["aionotify", "evdev"],
"quality_scale": "legacy",
"requirements": ["evdev==1.9.3", "asyncinotify==4.4.0"]
"requirements": ["evdev==1.6.1", "asyncinotify==4.2.0"]
}


@@ -8,7 +8,6 @@ from xknx.dpt import DPTBase, DPTComplex, DPTEnum, DPTNumeric
from xknx.dpt.dpt_16 import DPTString
from homeassistant.components.sensor import SensorDeviceClass, SensorStateClass
from homeassistant.const import UnitOfReactiveEnergy
HaDptClass = Literal["numeric", "enum", "complex", "string"]
@@ -37,7 +36,7 @@ def get_supported_dpts() -> Mapping[str, DPTInfo]:
main=dpt_class.dpt_main_number, # type: ignore[typeddict-item] # checked in xknx unit tests
sub=dpt_class.dpt_sub_number,
name=dpt_class.value_type,
unit=_sensor_unit_overrides.get(dpt_number_str, dpt_class.unit),
unit=dpt_class.unit,
sensor_device_class=_sensor_device_classes.get(dpt_number_str),
sensor_state_class=_get_sensor_state_class(ha_dpt_class, dpt_number_str),
)
@@ -78,13 +77,13 @@ _sensor_device_classes: Mapping[str, SensorDeviceClass] = {
"12.1200": SensorDeviceClass.VOLUME,
"12.1201": SensorDeviceClass.VOLUME,
"13.002": SensorDeviceClass.VOLUME_FLOW_RATE,
"13.010": SensorDeviceClass.ENERGY, # DPTActiveEnergy
"13.012": SensorDeviceClass.REACTIVE_ENERGY, # DPTReactiveEnergy
"13.013": SensorDeviceClass.ENERGY, # DPTActiveEnergykWh
"13.015": SensorDeviceClass.REACTIVE_ENERGY, # DPTReactiveEnergykVARh
"13.016": SensorDeviceClass.ENERGY, # DPTActiveEnergyMWh
"13.1200": SensorDeviceClass.VOLUME, # DPTDeltaVolumeLiquidLitre
"13.1201": SensorDeviceClass.VOLUME, # DPTDeltaVolumeM3
"13.010": SensorDeviceClass.ENERGY,
"13.012": SensorDeviceClass.REACTIVE_ENERGY,
"13.013": SensorDeviceClass.ENERGY,
"13.015": SensorDeviceClass.REACTIVE_ENERGY,
"13.016": SensorDeviceClass.ENERGY,
"13.1200": SensorDeviceClass.VOLUME,
"13.1201": SensorDeviceClass.VOLUME,
"14.010": SensorDeviceClass.AREA,
"14.019": SensorDeviceClass.CURRENT,
"14.027": SensorDeviceClass.VOLTAGE,
@@ -92,7 +91,7 @@ _sensor_device_classes: Mapping[str, SensorDeviceClass] = {
"14.030": SensorDeviceClass.VOLTAGE,
"14.031": SensorDeviceClass.ENERGY,
"14.033": SensorDeviceClass.FREQUENCY,
"14.037": SensorDeviceClass.ENERGY_STORAGE, # DPTHeatQuantity
"14.037": SensorDeviceClass.ENERGY_STORAGE,
"14.039": SensorDeviceClass.DISTANCE,
"14.051": SensorDeviceClass.WEIGHT,
"14.056": SensorDeviceClass.POWER,
@@ -102,7 +101,7 @@ _sensor_device_classes: Mapping[str, SensorDeviceClass] = {
"14.068": SensorDeviceClass.TEMPERATURE,
"14.069": SensorDeviceClass.TEMPERATURE,
"14.070": SensorDeviceClass.TEMPERATURE_DELTA,
"14.076": SensorDeviceClass.VOLUME, # DPTVolume
"14.076": SensorDeviceClass.VOLUME,
"14.077": SensorDeviceClass.VOLUME_FLOW_RATE,
"14.080": SensorDeviceClass.APPARENT_POWER,
"14.1200": SensorDeviceClass.VOLUME_FLOW_RATE,
@@ -122,28 +121,17 @@ _sensor_state_class_overrides: Mapping[str, SensorStateClass | None] = {
"13.010": SensorStateClass.TOTAL, # DPTActiveEnergy
"13.011": SensorStateClass.TOTAL, # DPTApparantEnergy
"13.012": SensorStateClass.TOTAL, # DPTReactiveEnergy
"13.013": SensorStateClass.TOTAL, # DPTActiveEnergykWh
"13.015": SensorStateClass.TOTAL, # DPTReactiveEnergykVARh
"13.016": SensorStateClass.TOTAL, # DPTActiveEnergyMWh
"13.1200": SensorStateClass.TOTAL, # DPTDeltaVolumeLiquidLitre
"13.1201": SensorStateClass.TOTAL, # DPTDeltaVolumeM3
"14.007": SensorStateClass.MEASUREMENT_ANGLE, # DPTAngleDeg
"14.037": SensorStateClass.TOTAL, # DPTHeatQuantity
"14.051": SensorStateClass.TOTAL, # DPTMass
"14.055": SensorStateClass.MEASUREMENT_ANGLE, # DPTPhaseAngleDeg
"14.031": SensorStateClass.TOTAL_INCREASING, # DPTEnergy
"14.076": SensorStateClass.TOTAL, # DPTVolume
"17.001": None, # DPTSceneNumber
"29.010": SensorStateClass.TOTAL, # DPTActiveEnergy8Byte
"29.011": SensorStateClass.TOTAL, # DPTApparantEnergy8Byte
"29.012": SensorStateClass.TOTAL, # DPTReactiveEnergy8Byte
}
_sensor_unit_overrides: Mapping[str, str] = {
"13.012": UnitOfReactiveEnergy.VOLT_AMPERE_REACTIVE_HOUR, # DPTReactiveEnergy (VARh in KNX)
"13.015": UnitOfReactiveEnergy.KILO_VOLT_AMPERE_REACTIVE_HOUR, # DPTReactiveEnergykVARh (kVARh in KNX)
"29.012": UnitOfReactiveEnergy.VOLT_AMPERE_REACTIVE_HOUR, # DPTReactiveEnergy8Byte (VARh in KNX)
}
def _get_sensor_state_class(
ha_dpt_class: HaDptClass, dpt_number_str: str


@@ -16,7 +16,7 @@ from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN, MANUFACTURER, MODELS
from .const import DOMAIN
from .coordinator import LaundrifyConfigEntry, LaundrifyUpdateCoordinator
_LOGGER = logging.getLogger(__name__)
@@ -47,14 +47,7 @@ class LaundrifyBaseSensor(SensorEntity):
def __init__(self, device: LaundrifyDevice) -> None:
"""Initialize the sensor."""
self._device = device
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, device.id)},
name=device.name,
manufacturer=MANUFACTURER,
model=MODELS[device.model],
sw_version=device.firmwareVersion,
configuration_url=f"http://{device.internalIP}",
)
self._attr_device_info = DeviceInfo(identifiers={(DOMAIN, device.id)})
self._attr_unique_id = f"{device.id}_{self._attr_device_class}"


@@ -7,6 +7,6 @@
"integration_type": "service",
"iot_class": "cloud_polling",
"loggers": ["mastodon"],
"quality_scale": "gold",
"quality_scale": "silver",
"requirements": ["Mastodon.py==2.1.2"]
}


@@ -49,11 +49,11 @@ rules:
Web service does not support discovery.
docs-data-update: done
docs-examples: done
docs-known-limitations: done
docs-known-limitations: todo
docs-supported-devices: done
docs-supported-functions: done
docs-troubleshooting: done
docs-use-cases: done
docs-troubleshooting: todo
docs-use-cases: todo
dynamic-devices:
status: exempt
comment: |


@@ -4,7 +4,6 @@ from __future__ import annotations
from dataclasses import dataclass
from enum import IntEnum
import logging
from typing import TYPE_CHECKING, Any
from chip.clusters import Objects as clusters
@@ -27,8 +26,6 @@ from .entity import MatterEntity, MatterEntityDescription
from .helpers import get_matter
from .models import MatterDiscoverySchema
_LOGGER = logging.getLogger(__name__)
class OperationalState(IntEnum):
"""Operational State of the vacuum cleaner.
@@ -257,18 +254,9 @@ class MatterVacuum(MatterEntity, StateVacuumEntity):
VacuumEntityFeature.CLEAN_AREA in self.supported_features
and self.registry_entry is not None
and (last_seen_segments := self.last_seen_segments) is not None
# Ignore empty segments; some devices transiently
# report an empty list before sending the real one.
and (current_segments := self._current_segments)
and self._current_segments != {s.id: s for s in last_seen_segments}
):
last_seen_by_id = {s.id: s for s in last_seen_segments}
if current_segments != last_seen_by_id:
_LOGGER.debug(
"Vacuum segments changed: last_seen=%s, current=%s",
last_seen_by_id,
current_segments,
)
self.async_create_segments_issue()
self.async_create_segments_issue()
@callback
def _calculate_features(self) -> None:

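The Matter vacuum change above detects segment changes by comparing a current id-keyed mapping against the last-seen list, ignoring transiently empty reports. A standalone sketch with a hypothetical segment type:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Segment:
    """Hypothetical stand-in for a Matter service-area segment."""

    id: int
    name: str


def segments_changed(current: dict[int, Segment], last_seen: list[Segment]) -> bool:
    """True when a non-empty current mapping differs from the last-seen list.

    Empty current mappings are ignored: some devices transiently report an
    empty list before sending the real one.
    """
    if not current:
        return False
    return current != {s.id: s for s in last_seen}
```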

@@ -120,7 +120,6 @@ class MobileAppNotificationService(BaseNotificationService):
local_push_channels = self.hass.data[DOMAIN][DATA_PUSH_CHANNEL]
failed_targets = []
for target in targets:
registration = self.hass.data[DOMAIN][DATA_CONFIG_ENTRIES][target].data
@@ -135,16 +134,12 @@ class MobileAppNotificationService(BaseNotificationService):
# Test if local push only.
if ATTR_PUSH_URL not in registration[ATTR_APP_DATA]:
failed_targets.append(target)
continue
raise HomeAssistantError(
"Device not connected to local push notifications"
)
await self._async_send_remote_message_target(target, registration, data)
if failed_targets:
raise HomeAssistantError(
f"Device(s) with webhook id(s) {', '.join(failed_targets)} not connected to local push notifications"
)
async def _async_send_remote_message_target(self, target, registration, data):
"""Send a message to a target."""
app_data = registration[ATTR_APP_DATA]

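The mobile_app change above collects unreachable targets during the loop and raises one aggregated error afterwards, instead of failing on the first target. A minimal sketch of that pattern, with hypothetical names standing in for the registration lookup:

```python
def notify_all(targets: dict[str, bool]) -> None:
    """Send to every target; report all push-incapable targets in one error.

    `targets` maps a webhook id to whether it supports remote push
    (a stand-in for checking ATTR_PUSH_URL in the registration data).
    """
    failed: list[str] = []
    for target, has_push_url in targets.items():
        if not has_push_url:
            failed.append(target)
            continue
        # ... send the remote message to this target here ...
    if failed:
        raise RuntimeError(
            f"Device(s) with webhook id(s) {', '.join(failed)} "
            "not connected to local push notifications"
        )
```

Aggregating lets every reachable target still receive its message before the error surfaces.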

@@ -307,25 +307,17 @@ class MotionTiltDevice(MotionPositionDevice):
async def async_open_cover_tilt(self, **kwargs: Any) -> None:
"""Open the cover tilt."""
if self.current_cover_tilt_position is not None:
async with self._api_lock:
await self.hass.async_add_executor_job(self._blind.Set_angle, 0)
async with self._api_lock:
await self.hass.async_add_executor_job(self._blind.Set_angle, 0)
await self.async_request_position_till_stop()
else:
async with self._api_lock:
await self.hass.async_add_executor_job(self._blind.Jog_up)
await self.async_request_position_till_stop()
async def async_close_cover_tilt(self, **kwargs: Any) -> None:
"""Close the cover tilt."""
if self.current_cover_tilt_position is not None:
async with self._api_lock:
await self.hass.async_add_executor_job(self._blind.Set_angle, 180)
async with self._api_lock:
await self.hass.async_add_executor_job(self._blind.Set_angle, 180)
await self.async_request_position_till_stop()
else:
async with self._api_lock:
await self.hass.async_add_executor_job(self._blind.Jog_down)
await self.async_request_position_till_stop()
async def async_set_cover_tilt_position(self, **kwargs: Any) -> None:
"""Move the cover tilt to a specific position."""


@@ -1,18 +1,37 @@
"""The Mullvad VPN integration."""
import asyncio
from datetime import timedelta
import logging
from mullvad_api import MullvadAPI
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .const import DOMAIN
from .coordinator import MullvadCoordinator
PLATFORMS = [Platform.BINARY_SENSOR]
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Set up Mullvad VPN integration."""
coordinator = MullvadCoordinator(hass, entry)
async def async_get_mullvad_api_data():
async with asyncio.timeout(10):
api = await hass.async_add_executor_job(MullvadAPI)
return api.data
coordinator = DataUpdateCoordinator(
hass,
logging.getLogger(__name__),
config_entry=entry,
name=DOMAIN,
update_method=async_get_mullvad_api_data,
update_interval=timedelta(minutes=1),
)
await coordinator.async_config_entry_first_refresh()
hass.data[DOMAIN] = coordinator


@@ -9,10 +9,12 @@ from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from homeassistant.helpers.update_coordinator import (
CoordinatorEntity,
DataUpdateCoordinator,
)
from .const import DOMAIN
from .coordinator import MullvadCoordinator
BINARY_SENSORS = (
BinarySensorEntityDescription(
@@ -37,14 +39,14 @@ async def async_setup_entry(
)
class MullvadBinarySensor(CoordinatorEntity[MullvadCoordinator], BinarySensorEntity):
class MullvadBinarySensor(CoordinatorEntity, BinarySensorEntity):
"""Represents a Mullvad binary sensor."""
_attr_has_entity_name = True
def __init__(
self,
coordinator: MullvadCoordinator,
coordinator: DataUpdateCoordinator,
entity_description: BinarySensorEntityDescription,
config_entry: ConfigEntry,
) -> None:


@@ -1,38 +0,0 @@
"""The Mullvad VPN coordinator."""
import asyncio
from datetime import timedelta
import logging
from typing import Any
from mullvad_api import MullvadAPI
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .const import DOMAIN
_LOGGER = logging.getLogger(__name__)
class MullvadCoordinator(DataUpdateCoordinator[dict[str, Any]]):
"""Mullvad VPN data update coordinator."""
config_entry: ConfigEntry
def __init__(self, hass: HomeAssistant, entry: ConfigEntry) -> None:
"""Initialize the Mullvad coordinator."""
super().__init__(
hass,
_LOGGER,
config_entry=entry,
name=DOMAIN,
update_interval=timedelta(minutes=1),
)
async def _async_update_data(self) -> dict[str, Any]:
"""Fetch data from Mullvad API."""
async with asyncio.timeout(10):
api = await self.hass.async_add_executor_job(MullvadAPI)
return api.data


@@ -2,31 +2,39 @@
from __future__ import annotations
from datetime import timedelta
import logging
from typing import Any
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_PORT, CONF_SSL
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.helpers import device_registry as dr, entity_registry as er
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .const import PLATFORMS
from .coordinator import (
NetgearConfigEntry,
NetgearFirmwareCoordinator,
NetgearLinkCoordinator,
NetgearRuntimeData,
NetgearSpeedTestCoordinator,
NetgearTrackerCoordinator,
NetgearTrafficMeterCoordinator,
NetgearUtilizationCoordinator,
from .const import (
DOMAIN,
KEY_COORDINATOR,
KEY_COORDINATOR_FIRMWARE,
KEY_COORDINATOR_LINK,
KEY_COORDINATOR_SPEED,
KEY_COORDINATOR_TRAFFIC,
KEY_COORDINATOR_UTIL,
KEY_ROUTER,
PLATFORMS,
)
from .errors import CannotLoginException
from .router import NetgearRouter
_LOGGER = logging.getLogger(__name__)
SCAN_INTERVAL = timedelta(seconds=30)
SPEED_TEST_INTERVAL = timedelta(hours=2)
SCAN_INTERVAL_FIRMWARE = timedelta(hours=5)
async def async_setup_entry(hass: HomeAssistant, entry: NetgearConfigEntry) -> bool:
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Set up Netgear component."""
router = NetgearRouter(hass, entry)
try:
@@ -51,41 +59,116 @@ async def async_setup_entry(hass: HomeAssistant, entry: NetgearConfigEntry) -> b
router.ssl,
)
hass.data.setdefault(DOMAIN, {})
async def async_update_devices() -> bool:
"""Fetch data from the router."""
if router.track_devices:
return await router.async_update_device_trackers()
return False
async def async_update_traffic_meter() -> dict[str, Any] | None:
"""Fetch data from the router."""
return await router.async_get_traffic_meter()
async def async_update_speed_test() -> dict[str, Any] | None:
"""Fetch data from the router."""
return await router.async_get_speed_test()
async def async_check_firmware() -> dict[str, Any] | None:
"""Check for new firmware of the router."""
return await router.async_check_new_firmware()
async def async_update_utilization() -> dict[str, Any] | None:
"""Fetch data from the router."""
return await router.async_get_utilization()
async def async_check_link_status() -> dict[str, Any] | None:
"""Fetch data from the router."""
return await router.async_get_link_status()
# Create update coordinators
coordinator_tracker = NetgearTrackerCoordinator(hass, router, entry)
coordinator_traffic_meter = NetgearTrafficMeterCoordinator(hass, router, entry)
coordinator_speed_test = NetgearSpeedTestCoordinator(hass, router, entry)
coordinator_firmware = NetgearFirmwareCoordinator(hass, router, entry)
coordinator_utilization = NetgearUtilizationCoordinator(hass, router, entry)
coordinator_link = NetgearLinkCoordinator(hass, router, entry)
coordinator = DataUpdateCoordinator(
hass,
_LOGGER,
config_entry=entry,
name=f"{router.device_name} Devices",
update_method=async_update_devices,
update_interval=SCAN_INTERVAL,
)
coordinator_traffic_meter = DataUpdateCoordinator(
hass,
_LOGGER,
config_entry=entry,
name=f"{router.device_name} Traffic meter",
update_method=async_update_traffic_meter,
update_interval=SCAN_INTERVAL,
)
coordinator_speed_test = DataUpdateCoordinator(
hass,
_LOGGER,
config_entry=entry,
name=f"{router.device_name} Speed test",
update_method=async_update_speed_test,
update_interval=SPEED_TEST_INTERVAL,
)
coordinator_firmware = DataUpdateCoordinator(
hass,
_LOGGER,
config_entry=entry,
name=f"{router.device_name} Firmware",
update_method=async_check_firmware,
update_interval=SCAN_INTERVAL_FIRMWARE,
)
coordinator_utilization = DataUpdateCoordinator(
hass,
_LOGGER,
config_entry=entry,
name=f"{router.device_name} Utilization",
update_method=async_update_utilization,
update_interval=SCAN_INTERVAL,
)
coordinator_link = DataUpdateCoordinator(
hass,
_LOGGER,
config_entry=entry,
name=f"{router.device_name} Ethernet Link Status",
update_method=async_check_link_status,
update_interval=SCAN_INTERVAL,
)
if router.track_devices:
await coordinator_tracker.async_config_entry_first_refresh()
await coordinator.async_config_entry_first_refresh()
await coordinator_traffic_meter.async_config_entry_first_refresh()
await coordinator_firmware.async_config_entry_first_refresh()
await coordinator_utilization.async_config_entry_first_refresh()
await coordinator_link.async_config_entry_first_refresh()
entry.runtime_data = NetgearRuntimeData(
router=router,
coordinator_tracker=coordinator_tracker,
coordinator_traffic=coordinator_traffic_meter,
coordinator_speed=coordinator_speed_test,
coordinator_firmware=coordinator_firmware,
coordinator_utilization=coordinator_utilization,
coordinator_link=coordinator_link,
)
hass.data[DOMAIN][entry.entry_id] = {
KEY_ROUTER: router,
KEY_COORDINATOR: coordinator,
KEY_COORDINATOR_TRAFFIC: coordinator_traffic_meter,
KEY_COORDINATOR_SPEED: coordinator_speed_test,
KEY_COORDINATOR_FIRMWARE: coordinator_firmware,
KEY_COORDINATOR_UTIL: coordinator_utilization,
KEY_COORDINATOR_LINK: coordinator_link,
}
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True
async def async_unload_entry(hass: HomeAssistant, entry: NetgearConfigEntry) -> bool:
async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Unload a config entry."""
unload_ok = await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
router = entry.runtime_data.router
router = hass.data[DOMAIN][entry.entry_id][KEY_ROUTER]
if unload_ok:
hass.data[DOMAIN].pop(entry.entry_id)
if not hass.data[DOMAIN]:
hass.data.pop(DOMAIN)
if not router.track_devices:
router_id = None
@@ -110,10 +193,10 @@ async def async_unload_entry(hass: HomeAssistant, entry: NetgearConfigEntry) ->
async def async_remove_config_entry_device(
hass: HomeAssistant, config_entry: NetgearConfigEntry, device_entry: dr.DeviceEntry
hass: HomeAssistant, config_entry: ConfigEntry, device_entry: dr.DeviceEntry
) -> bool:
"""Remove a device from a config entry."""
router = config_entry.runtime_data.router
router = hass.data[DOMAIN][config_entry.entry_id][KEY_ROUTER]
device_mac = None
for connection in device_entry.connections:


@@ -9,11 +9,13 @@ from homeassistant.components.button import (
ButtonEntity,
ButtonEntityDescription,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .coordinator import NetgearConfigEntry, NetgearTrackerCoordinator
from .const import DOMAIN, KEY_COORDINATOR, KEY_ROUTER
from .entity import NetgearRouterCoordinatorEntity
from .router import NetgearRouter
@@ -37,13 +39,14 @@ BUTTONS = [
async def async_setup_entry(
hass: HomeAssistant,
entry: NetgearConfigEntry,
entry: ConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up button for Netgear component."""
coordinator_tracker = entry.runtime_data.coordinator_tracker
router = hass.data[DOMAIN][entry.entry_id][KEY_ROUTER]
coordinator = hass.data[DOMAIN][entry.entry_id][KEY_COORDINATOR]
async_add_entities(
NetgearRouterButtonEntity(coordinator_tracker, entity_description)
NetgearRouterButtonEntity(coordinator, router, entity_description)
for entity_description in BUTTONS
)
@@ -55,15 +58,14 @@ class NetgearRouterButtonEntity(NetgearRouterCoordinatorEntity, ButtonEntity):
def __init__(
self,
coordinator: NetgearTrackerCoordinator,
coordinator: DataUpdateCoordinator,
router: NetgearRouter,
entity_description: NetgearButtonEntityDescription,
) -> None:
"""Initialize a Netgear device."""
super().__init__(coordinator)
super().__init__(coordinator, router)
self.entity_description = entity_description
self._attr_unique_id = (
f"{coordinator.router.serial_number}-{entity_description.key}"
)
self._attr_unique_id = f"{router.serial_number}-{entity_description.key}"
async def async_press(self) -> None:
"""Triggers the button press service."""


@@ -16,6 +16,14 @@ PLATFORMS = [
CONF_CONSIDER_HOME = "consider_home"
KEY_ROUTER = "router"
KEY_COORDINATOR = "coordinator"
KEY_COORDINATOR_TRAFFIC = "coordinator_traffic"
KEY_COORDINATOR_SPEED = "coordinator_speed"
KEY_COORDINATOR_FIRMWARE = "coordinator_firmware"
KEY_COORDINATOR_UTIL = "coordinator_utilization"
KEY_COORDINATOR_LINK = "coordinator_link"
DEFAULT_CONSIDER_HOME = timedelta(seconds=180)
DEFAULT_NAME = "Netgear router"


@@ -1,163 +0,0 @@
"""Models for the Netgear integration."""
from __future__ import annotations
from dataclasses import dataclass
from datetime import timedelta
import logging
from typing import Any
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .router import NetgearRouter
_LOGGER = logging.getLogger(__name__)
SCAN_INTERVAL = timedelta(seconds=30)
SCAN_INTERVAL_FIRMWARE = timedelta(hours=5)
SPEED_TEST_INTERVAL = timedelta(hours=2)
@dataclass
class NetgearRuntimeData:
"""Runtime data for the Netgear integration."""
router: NetgearRouter
coordinator_tracker: NetgearTrackerCoordinator
coordinator_traffic: NetgearTrafficMeterCoordinator
coordinator_speed: NetgearSpeedTestCoordinator
coordinator_firmware: NetgearFirmwareCoordinator
coordinator_utilization: NetgearUtilizationCoordinator
coordinator_link: NetgearLinkCoordinator
type NetgearConfigEntry = ConfigEntry[NetgearRuntimeData]
class NetgearDataCoordinator[T](DataUpdateCoordinator[T]):
"""Base coordinator for Netgear."""
config_entry: NetgearConfigEntry
def __init__(
self,
hass: HomeAssistant,
router: NetgearRouter,
entry: NetgearConfigEntry,
*,
name: str,
update_interval: timedelta,
) -> None:
"""Initialize the coordinator."""
super().__init__(
hass,
_LOGGER,
config_entry=entry,
name=f"{router.device_name} {name}",
update_interval=update_interval,
)
self.router = router
class NetgearTrackerCoordinator(NetgearDataCoordinator[bool]):
"""Coordinator for Netgear device tracking."""
def __init__(
self, hass: HomeAssistant, router: NetgearRouter, entry: NetgearConfigEntry
) -> None:
"""Initialize the coordinator."""
super().__init__(
hass, router, entry, name="Devices", update_interval=SCAN_INTERVAL
)
async def _async_update_data(self) -> bool:
"""Fetch data from the router."""
if self.router.track_devices:
return await self.router.async_update_device_trackers()
return False
class NetgearTrafficMeterCoordinator(NetgearDataCoordinator[dict[str, Any] | None]):
"""Coordinator for Netgear traffic meter data."""
def __init__(
self, hass: HomeAssistant, router: NetgearRouter, entry: NetgearConfigEntry
) -> None:
"""Initialize the coordinator."""
super().__init__(
hass, router, entry, name="Traffic meter", update_interval=SCAN_INTERVAL
)
async def _async_update_data(self) -> dict[str, Any] | None:
"""Fetch data from the router."""
return await self.router.async_get_traffic_meter()
class NetgearSpeedTestCoordinator(NetgearDataCoordinator[dict[str, Any] | None]):
"""Coordinator for Netgear speed test data."""
def __init__(
self, hass: HomeAssistant, router: NetgearRouter, entry: NetgearConfigEntry
) -> None:
"""Initialize the coordinator."""
super().__init__(
hass, router, entry, name="Speed test", update_interval=SPEED_TEST_INTERVAL
)
async def _async_update_data(self) -> dict[str, Any] | None:
"""Fetch data from the router."""
return await self.router.async_get_speed_test()
class NetgearFirmwareCoordinator(NetgearDataCoordinator[dict[str, Any] | None]):
"""Coordinator for Netgear firmware updates."""
def __init__(
self, hass: HomeAssistant, router: NetgearRouter, entry: NetgearConfigEntry
) -> None:
"""Initialize the coordinator."""
super().__init__(
hass, router, entry, name="Firmware", update_interval=SCAN_INTERVAL_FIRMWARE
)
async def _async_update_data(self) -> dict[str, Any] | None:
"""Check for new firmware of the router."""
return await self.router.async_check_new_firmware()
class NetgearUtilizationCoordinator(NetgearDataCoordinator[dict[str, Any] | None]):
"""Coordinator for Netgear utilization data."""
def __init__(
self, hass: HomeAssistant, router: NetgearRouter, entry: NetgearConfigEntry
) -> None:
"""Initialize the coordinator."""
super().__init__(
hass, router, entry, name="Utilization", update_interval=SCAN_INTERVAL
)
async def _async_update_data(self) -> dict[str, Any] | None:
"""Fetch data from the router."""
return await self.router.async_get_utilization()
class NetgearLinkCoordinator(NetgearDataCoordinator[dict[str, Any] | None]):
"""Coordinator for Netgear Ethernet link status."""
def __init__(
self, hass: HomeAssistant, router: NetgearRouter, entry: NetgearConfigEntry
) -> None:
"""Initialize the coordinator."""
super().__init__(
hass,
router,
entry,
name="Ethernet Link Status",
update_interval=SCAN_INTERVAL,
)
async def _async_update_data(self) -> dict[str, Any] | None:
"""Fetch data from the router."""
return await self.router.async_get_link_status()


@@ -5,30 +5,32 @@ from __future__ import annotations
import logging
from homeassistant.components.device_tracker import ScannerEntity
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .const import DEVICE_ICONS
from .coordinator import NetgearConfigEntry, NetgearTrackerCoordinator
from .const import DEVICE_ICONS, DOMAIN, KEY_COORDINATOR, KEY_ROUTER
from .entity import NetgearDeviceEntity
from .router import NetgearRouter
_LOGGER = logging.getLogger(__name__)
async def async_setup_entry(
hass: HomeAssistant,
entry: NetgearConfigEntry,
entry: ConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up device tracker for Netgear component."""
router = entry.runtime_data.router
coordinator_tracker = entry.runtime_data.coordinator_tracker
router = hass.data[DOMAIN][entry.entry_id][KEY_ROUTER]
coordinator = hass.data[DOMAIN][entry.entry_id][KEY_COORDINATOR]
tracked = set()
@callback
def new_device_callback() -> None:
"""Add new devices if needed."""
if not coordinator_tracker.data:
if not coordinator.data:
return
new_entities = []
@@ -37,14 +39,14 @@ async def async_setup_entry(
if mac in tracked:
continue
new_entities.append(NetgearScannerEntity(coordinator_tracker, device))
new_entities.append(NetgearScannerEntity(coordinator, router, device))
tracked.add(mac)
async_add_entities(new_entities)
entry.async_on_unload(coordinator_tracker.async_add_listener(new_device_callback))
entry.async_on_unload(coordinator.async_add_listener(new_device_callback))
coordinator_tracker.data = True
coordinator.data = True
new_device_callback()
@@ -54,12 +56,10 @@ class NetgearScannerEntity(NetgearDeviceEntity, ScannerEntity):
_attr_has_entity_name = False
def __init__(
self,
coordinator: NetgearTrackerCoordinator,
device: dict,
self, coordinator: DataUpdateCoordinator, router: NetgearRouter, device: dict
) -> None:
"""Initialize a Netgear device."""
super().__init__(coordinator, device)
super().__init__(coordinator, router, device)
self._hostname = self.get_hostname()
self._icon = DEVICE_ICONS.get(device["device_type"], "mdi:help-network")
self._attr_name = self._device_name


@@ -3,33 +3,32 @@
from __future__ import annotations
from abc import abstractmethod
from typing import Any
from homeassistant.const import CONF_HOST
from homeassistant.core import callback
from homeassistant.helpers import device_registry as dr
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity import Entity
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from homeassistant.helpers.update_coordinator import (
CoordinatorEntity,
DataUpdateCoordinator,
)
from .const import DOMAIN
from .coordinator import NetgearDataCoordinator, NetgearTrackerCoordinator
from .router import NetgearRouter
class NetgearDeviceEntity(CoordinatorEntity[NetgearTrackerCoordinator]):
class NetgearDeviceEntity(CoordinatorEntity):
"""Base class for a device connected to a Netgear router."""
_attr_has_entity_name = True
def __init__(
self,
coordinator: NetgearTrackerCoordinator,
device: dict,
self, coordinator: DataUpdateCoordinator, router: NetgearRouter, device: dict
) -> None:
"""Initialize a Netgear device."""
super().__init__(coordinator)
self._router = coordinator.router
self._router = router
self._device = device
self._mac = device["mac"]
self._device_name = self.get_device_name()
@@ -39,7 +38,7 @@ class NetgearDeviceEntity(CoordinatorEntity[NetgearTrackerCoordinator]):
connections={(dr.CONNECTION_NETWORK_MAC, self._mac)},
default_name=self._device_name,
default_model=device["device_model"],
via_device=(DOMAIN, coordinator.router.unique_id),
via_device=(DOMAIN, router.unique_id),
)
def get_device_name(self):
@@ -87,15 +86,15 @@ class NetgearRouterEntity(Entity):
)
class NetgearRouterCoordinatorEntity[T: NetgearDataCoordinator[Any]](
NetgearRouterEntity, CoordinatorEntity[T]
):
class NetgearRouterCoordinatorEntity(NetgearRouterEntity, CoordinatorEntity):
"""Base class for a Netgear router entity."""
def __init__(self, coordinator: T) -> None:
def __init__(
self, coordinator: DataUpdateCoordinator, router: NetgearRouter
) -> None:
"""Initialize a Netgear device."""
CoordinatorEntity.__init__(self, coordinator)
NetgearRouterEntity.__init__(self, coordinator.router)
NetgearRouterEntity.__init__(self, router)
@abstractmethod
@callback


@@ -7,7 +7,6 @@ from dataclasses import dataclass
from datetime import date, datetime
from decimal import Decimal
import logging
from typing import Any
from homeassistant.components.sensor import (
RestoreSensor,
@@ -16,6 +15,7 @@ from homeassistant.components.sensor import (
SensorEntityDescription,
SensorStateClass,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
PERCENTAGE,
EntityCategory,
@@ -26,13 +26,19 @@ from homeassistant.const import (
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import StateType
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .coordinator import (
NetgearConfigEntry,
NetgearDataCoordinator,
NetgearTrackerCoordinator,
from .const import (
DOMAIN,
KEY_COORDINATOR,
KEY_COORDINATOR_LINK,
KEY_COORDINATOR_SPEED,
KEY_COORDINATOR_TRAFFIC,
KEY_COORDINATOR_UTIL,
KEY_ROUTER,
)
from .entity import NetgearDeviceEntity, NetgearRouterCoordinatorEntity
from .router import NetgearRouter
_LOGGER = logging.getLogger(__name__)
@@ -269,19 +275,19 @@ SENSOR_LINK_TYPES = [
async def async_setup_entry(
hass: HomeAssistant,
entry: NetgearConfigEntry,
entry: ConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Netgear sensors from a config entry."""
router = entry.runtime_data.router
coordinator_tracker = entry.runtime_data.coordinator_tracker
coordinator_traffic = entry.runtime_data.coordinator_traffic
coordinator_speed = entry.runtime_data.coordinator_speed
coordinator_utilization = entry.runtime_data.coordinator_utilization
coordinator_link = entry.runtime_data.coordinator_link
"""Set up device tracker for Netgear component."""
router = hass.data[DOMAIN][entry.entry_id][KEY_ROUTER]
coordinator = hass.data[DOMAIN][entry.entry_id][KEY_COORDINATOR]
coordinator_traffic = hass.data[DOMAIN][entry.entry_id][KEY_COORDINATOR_TRAFFIC]
coordinator_speed = hass.data[DOMAIN][entry.entry_id][KEY_COORDINATOR_SPEED]
coordinator_utilization = hass.data[DOMAIN][entry.entry_id][KEY_COORDINATOR_UTIL]
coordinator_link = hass.data[DOMAIN][entry.entry_id][KEY_COORDINATOR_LINK]
async_add_entities(
NetgearRouterSensorEntity(coordinator, description)
NetgearRouterSensorEntity(coordinator, router, description)
for (coordinator, descriptions) in (
(coordinator_traffic, SENSOR_TRAFFIC_TYPES),
(coordinator_speed, SENSOR_SPEED_TYPES),
@@ -300,7 +306,7 @@ async def async_setup_entry(
@callback
def new_device_callback() -> None:
"""Add new devices if needed."""
if not coordinator_tracker.data:
if not coordinator.data:
return
new_entities: list[NetgearSensorEntity] = []
@@ -310,16 +316,16 @@ async def async_setup_entry(
continue
new_entities.extend(
NetgearSensorEntity(coordinator_tracker, device, attribute)
NetgearSensorEntity(coordinator, router, device, attribute)
for attribute in sensors
)
tracked.add(mac)
async_add_entities(new_entities)
entry.async_on_unload(coordinator_tracker.async_add_listener(new_device_callback))
entry.async_on_unload(coordinator.async_add_listener(new_device_callback))
coordinator_tracker.data = True
coordinator.data = True
new_device_callback()
@@ -328,12 +334,13 @@ class NetgearSensorEntity(NetgearDeviceEntity, SensorEntity):
def __init__(
self,
coordinator: NetgearTrackerCoordinator,
coordinator: DataUpdateCoordinator,
router: NetgearRouter,
device: dict,
attribute: str,
) -> None:
"""Initialize a Netgear device."""
super().__init__(coordinator, device)
super().__init__(coordinator, router, device)
self._attribute = attribute
self.entity_description = SENSOR_TYPES[attribute]
self._attr_unique_id = f"{self._mac}-{attribute}"
@@ -366,13 +373,14 @@ class NetgearRouterSensorEntity(NetgearRouterCoordinatorEntity, RestoreSensor):
def __init__(
self,
coordinator: NetgearDataCoordinator[dict[str, Any] | None],
coordinator: DataUpdateCoordinator,
router: NetgearRouter,
entity_description: NetgearSensorEntityDescription,
) -> None:
"""Initialize a Netgear device."""
super().__init__(coordinator)
super().__init__(coordinator, router)
self.entity_description = entity_description
self._attr_unique_id = f"{coordinator.router.serial_number}-{entity_description.key}-{entity_description.index}"
self._attr_unique_id = f"{router.serial_number}-{entity_description.key}-{entity_description.index}"
self._value: StateType | date | datetime | Decimal = None
self.async_update_device()


@@ -9,11 +9,13 @@ from typing import Any
from pynetgear import ALLOW, BLOCK
from homeassistant.components.switch import SwitchEntity, SwitchEntityDescription
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .coordinator import NetgearConfigEntry, NetgearTrackerCoordinator
from .const import DOMAIN, KEY_COORDINATOR, KEY_ROUTER
from .entity import NetgearDeviceEntity, NetgearRouterEntity
from .router import NetgearRouter
@@ -98,11 +100,11 @@ ROUTER_SWITCH_TYPES = [
async def async_setup_entry(
hass: HomeAssistant,
entry: NetgearConfigEntry,
entry: ConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up switches for Netgear component."""
router = entry.runtime_data.router
router = hass.data[DOMAIN][entry.entry_id][KEY_ROUTER]
async_add_entities(
NetgearRouterSwitchEntity(router, description)
@@ -110,14 +112,14 @@ async def async_setup_entry(
)
# Entities per network device
coordinator_tracker = entry.runtime_data.coordinator_tracker
coordinator = hass.data[DOMAIN][entry.entry_id][KEY_COORDINATOR]
tracked = set()
@callback
def new_device_callback() -> None:
"""Add new devices if needed."""
new_entities = []
if not coordinator_tracker.data:
if not coordinator.data:
return
for mac, device in router.devices.items():
@@ -126,7 +128,7 @@ async def async_setup_entry(
new_entities.extend(
[
NetgearAllowBlock(coordinator_tracker, device, entity_description)
NetgearAllowBlock(coordinator, router, device, entity_description)
for entity_description in SWITCH_TYPES
]
)
@@ -134,9 +136,9 @@ async def async_setup_entry(
async_add_entities(new_entities)
entry.async_on_unload(coordinator_tracker.async_add_listener(new_device_callback))
entry.async_on_unload(coordinator.async_add_listener(new_device_callback))
coordinator_tracker.data = True
coordinator.data = True
new_device_callback()
@@ -147,12 +149,13 @@ class NetgearAllowBlock(NetgearDeviceEntity, SwitchEntity):
def __init__(
self,
coordinator: NetgearTrackerCoordinator,
coordinator: DataUpdateCoordinator,
router: NetgearRouter,
device: dict,
entity_description: SwitchEntityDescription,
) -> None:
"""Initialize a Netgear device."""
super().__init__(coordinator, device)
super().__init__(coordinator, router, device)
self.entity_description = entity_description
self._attr_unique_id = f"{self._mac}-{entity_description.key}"
self.async_update_device()


@@ -10,30 +10,32 @@ from homeassistant.components.update import (
UpdateEntity,
UpdateEntityFeature,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .coordinator import NetgearConfigEntry, NetgearFirmwareCoordinator
from .const import DOMAIN, KEY_COORDINATOR_FIRMWARE, KEY_ROUTER
from .entity import NetgearRouterCoordinatorEntity
from .router import NetgearRouter
LOGGER = logging.getLogger(__name__)
async def async_setup_entry(
hass: HomeAssistant,
entry: NetgearConfigEntry,
entry: ConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up update entities for Netgear component."""
coordinator = entry.runtime_data.coordinator_firmware
entities = [NetgearUpdateEntity(coordinator)]
router = hass.data[DOMAIN][entry.entry_id][KEY_ROUTER]
coordinator = hass.data[DOMAIN][entry.entry_id][KEY_COORDINATOR_FIRMWARE]
entities = [NetgearUpdateEntity(coordinator, router)]
async_add_entities(entities)
class NetgearUpdateEntity(
NetgearRouterCoordinatorEntity[NetgearFirmwareCoordinator], UpdateEntity
):
class NetgearUpdateEntity(NetgearRouterCoordinatorEntity, UpdateEntity):
"""Update entity for a Netgear device."""
_attr_device_class = UpdateDeviceClass.FIRMWARE
@@ -41,11 +43,12 @@ class NetgearUpdateEntity(
def __init__(
self,
coordinator: NetgearFirmwareCoordinator,
coordinator: DataUpdateCoordinator,
router: NetgearRouter,
) -> None:
"""Initialize a Netgear device."""
super().__init__(coordinator)
self._attr_unique_id = f"{coordinator.router.serial_number}-update"
super().__init__(coordinator, router)
self._attr_unique_id = f"{router.serial_number}-update"
@property
def installed_version(self) -> str | None:


@@ -12,7 +12,6 @@ from .coordinator import NRGkickConfigEntry, NRGkickDataUpdateCoordinator
PLATFORMS: list[Platform] = [
Platform.BINARY_SENSOR,
Platform.DEVICE_TRACKER,
Platform.NUMBER,
Platform.SENSOR,
Platform.SWITCH,


@@ -1,74 +0,0 @@
"""Device tracker platform for NRGkick."""
from __future__ import annotations
from typing import Any, Final
from homeassistant.components.device_tracker import SourceType
from homeassistant.components.device_tracker.config_entry import TrackerEntity
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import NRGkickConfigEntry, NRGkickDataUpdateCoordinator
from .entity import NRGkickEntity, get_nested_dict_value
PARALLEL_UPDATES = 0
TRACKER_KEY: Final = "gps_tracker"
async def async_setup_entry(
_hass: HomeAssistant,
entry: NRGkickConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up NRGkick device tracker based on a config entry."""
coordinator = entry.runtime_data
data = coordinator.data
assert data is not None
info_data: dict[str, Any] = data.info
general_info: dict[str, Any] = info_data.get("general", {})
model_type = general_info.get("model_type")
# GPS module is only available on SIM-capable models (same check as cellular
# sensors). SIM-capable models include "SIM" in their model type string.
has_sim_module = isinstance(model_type, str) and "SIM" in model_type.upper()
if has_sim_module:
async_add_entities([NRGkickDeviceTracker(coordinator)])
class NRGkickDeviceTracker(NRGkickEntity, TrackerEntity):
"""Representation of a NRGkick GPS device tracker."""
_attr_translation_key = TRACKER_KEY
_attr_source_type = SourceType.GPS
def __init__(
self,
coordinator: NRGkickDataUpdateCoordinator,
) -> None:
"""Initialize the device tracker."""
super().__init__(coordinator, TRACKER_KEY)
def _gps_float(self, key: str) -> float | None:
"""Return a GPS value as float, or None if GPS data is unavailable."""
value = get_nested_dict_value(self.coordinator.data.info, "gps", key)
return float(value) if value is not None else None
@property
def latitude(self) -> float | None:
"""Return latitude value of the device."""
return self._gps_float("latitude")
@property
def longitude(self) -> float | None:
"""Return longitude value of the device."""
return self._gps_float("longitude")
@property
def location_accuracy(self) -> float:
"""Return the location accuracy of the device."""
return self._gps_float("accuracy") or 0.0


@@ -6,20 +6,12 @@ from dataclasses import asdict
from typing import Any
from homeassistant.components.diagnostics import async_redact_data
from homeassistant.const import (
ATTR_LATITUDE,
ATTR_LONGITUDE,
CONF_PASSWORD,
CONF_USERNAME,
)
from homeassistant.const import CONF_PASSWORD, CONF_USERNAME
from homeassistant.core import HomeAssistant
from .coordinator import NRGkickConfigEntry
TO_REDACT = {
ATTR_LATITUDE,
ATTR_LONGITUDE,
"altitude",
CONF_PASSWORD,
CONF_USERNAME,
}


@@ -5,11 +5,6 @@
"default": "mdi:ev-station"
}
},
"device_tracker": {
"gps_tracker": {
"default": "mdi:map-marker"
}
},
"number": {
"current_set": {
"default": "mdi:current-ac"


@@ -83,11 +83,6 @@
"name": "Charge permitted"
}
},
"device_tracker": {
"gps_tracker": {
"name": "GPS tracker"
}
},
"number": {
"current_set": {
"name": "Charging current"


@@ -1,5 +1,6 @@
"""Support for NuHeat thermostats."""
from datetime import timedelta
from http import HTTPStatus
import logging
@@ -10,14 +11,14 @@ from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_PASSWORD, CONF_USERNAME
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .const import CONF_SERIAL_NUMBER, DOMAIN, PLATFORMS
from .coordinator import NuHeatCoordinator
_LOGGER = logging.getLogger(__name__)
def _get_thermostat(api: nuheat.NuHeat, serial_number: str) -> nuheat.NuHeatThermostat:
def _get_thermostat(api, serial_number):
"""Authenticate and create the thermostat object."""
api.authenticate()
return api.get_thermostat(serial_number)
@@ -28,9 +29,9 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
conf = entry.data
username: str = conf[CONF_USERNAME]
password: str = conf[CONF_PASSWORD]
serial_number: str = conf[CONF_SERIAL_NUMBER]
username = conf[CONF_USERNAME]
password = conf[CONF_PASSWORD]
serial_number = conf[CONF_SERIAL_NUMBER]
api = nuheat.NuHeat(username, password)
@@ -52,7 +53,18 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
_LOGGER.error("Failed to login to nuheat: %s", ex)
return False
coordinator = NuHeatCoordinator(hass, entry, thermostat)
async def _async_update_data():
"""Fetch data from API endpoint."""
await hass.async_add_executor_job(thermostat.get_data)
coordinator = DataUpdateCoordinator(
hass,
_LOGGER,
config_entry=entry,
name=f"nuheat {serial_number}",
update_method=_async_update_data,
update_interval=timedelta(minutes=5),
)
hass.data.setdefault(DOMAIN, {})
hass.data[DOMAIN][entry.entry_id] = (thermostat, coordinator)
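The `_async_update_data` closure above pushes the blocking `thermostat.get_data()` call onto an executor via `hass.async_add_executor_job`, which wraps `loop.run_in_executor`. A stdlib-only sketch of that underlying pattern (the function names here are illustrative, not from the integration):

```python
import asyncio


def blocking_fetch(serial_number: str) -> str:
    """Stand-in for a blocking library call such as thermostat.get_data()."""
    return f"thermostat-{serial_number}"


async def update_data() -> str:
    """Run the blocking call in the default thread pool, as the coordinator does."""
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(None, blocking_fetch, "12345")


result = asyncio.run(update_data())
```

Keeping the blocking I/O off the event loop is the whole point: the coordinator awaits the executor future instead of stalling other tasks.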

View File

@@ -27,7 +27,6 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN, MANUFACTURER, NUHEAT_API_STATE_SHIFT_DELAY
from .coordinator import NuHeatCoordinator
_LOGGER = logging.getLogger(__name__)
@@ -70,7 +69,7 @@ async def async_setup_entry(
async_add_entities([entity], True)
class NuHeatThermostat(CoordinatorEntity[NuHeatCoordinator], ClimateEntity):
class NuHeatThermostat(CoordinatorEntity, ClimateEntity):
"""Representation of a NuHeat Thermostat."""
_attr_hvac_modes = OPERATION_LIST

View File

@@ -1,42 +0,0 @@
"""DataUpdateCoordinator for NuHeat thermostats."""
from datetime import timedelta
import logging
import nuheat
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from .const import CONF_SERIAL_NUMBER
_LOGGER = logging.getLogger(__name__)
SCAN_INTERVAL = timedelta(minutes=5)
class NuHeatCoordinator(DataUpdateCoordinator[None]):
"""Coordinator for NuHeat thermostat data."""
config_entry: ConfigEntry
def __init__(
self,
hass: HomeAssistant,
entry: ConfigEntry,
thermostat: nuheat.NuHeatThermostat,
) -> None:
"""Initialize the coordinator."""
super().__init__(
hass,
_LOGGER,
config_entry=entry,
name=f"nuheat {entry.data[CONF_SERIAL_NUMBER]}",
update_interval=SCAN_INTERVAL,
)
self.thermostat = thermostat
async def _async_update_data(self) -> None:
"""Fetch data from API endpoint."""
await self.hass.async_add_executor_job(self.thermostat.get_data)

View File

@@ -173,10 +173,5 @@
"set_value": {
"service": "mdi:numeric"
}
},
"triggers": {
"changed": {
"trigger": "mdi:counter"
}
}
}

View File

@@ -204,11 +204,5 @@
"name": "Set"
}
},
"title": "Number",
"triggers": {
"changed": {
"description": "Triggers when a number value changes.",
"name": "Number changed"
}
}
"title": "Number"
}

View File

@@ -1,21 +0,0 @@
"""Provides triggers for number entities."""
from homeassistant.components.input_number import DOMAIN as INPUT_NUMBER_DOMAIN
from homeassistant.core import HomeAssistant
from homeassistant.helpers.trigger import (
Trigger,
make_entity_numerical_state_changed_trigger,
)
from .const import DOMAIN
TRIGGERS: dict[str, type[Trigger]] = {
"changed": make_entity_numerical_state_changed_trigger(
{DOMAIN, INPUT_NUMBER_DOMAIN}
),
}
async def async_get_triggers(hass: HomeAssistant) -> dict[str, type[Trigger]]:
"""Return the triggers for number entities."""
return TRIGGERS

View File

@@ -1,6 +0,0 @@
changed:
target:
entity:
domain:
- number
- input_number

View File

@@ -10,5 +10,5 @@
"iot_class": "cloud_polling",
"loggers": ["onedrive_personal_sdk"],
"quality_scale": "platinum",
"requirements": ["onedrive-personal-sdk==0.1.5"]
"requirements": ["onedrive-personal-sdk==0.1.4"]
}

View File

@@ -10,5 +10,5 @@
"iot_class": "cloud_polling",
"loggers": ["onedrive_personal_sdk"],
"quality_scale": "platinum",
"requirements": ["onedrive-personal-sdk==0.1.5"]
"requirements": ["onedrive-personal-sdk==0.1.4"]
}

View File

@@ -545,14 +545,8 @@ class OpenAISubentryFlowHandler(ConfigSubentryFlow):
return []
models_reasoning_map: dict[str | tuple[str, ...], list[str]] = {
("gpt-5.2-pro", "gpt-5.4-pro"): ["medium", "high", "xhigh"],
("gpt-5.2", "gpt-5.3", "gpt-5.4"): [
"none",
"low",
"medium",
"high",
"xhigh",
],
"gpt-5.2-pro": ["medium", "high", "xhigh"],
("gpt-5.2", "gpt-5.3"): ["none", "low", "medium", "high", "xhigh"],
"gpt-5.1": ["none", "low", "medium", "high"],
"gpt-5": ["minimal", "low", "medium", "high"],
"": ["low", "medium", "high"], # The default case

View File

@@ -23,7 +23,7 @@ from homeassistant.helpers.device_registry import DeviceEntry
import homeassistant.helpers.entity_registry as er
from homeassistant.helpers.typing import ConfigType
from .const import API_MAX_RETRIES, DOMAIN
from .const import DOMAIN
from .coordinator import PortainerCoordinator
from .services import async_setup_services
@@ -50,7 +50,6 @@ async def async_setup_entry(hass: HomeAssistant, entry: PortainerConfigEntry) ->
session=async_create_clientsession(
hass=hass, verify_ssl=entry.data[CONF_VERIFY_SSL]
),
max_retries=API_MAX_RETRIES,
)
coordinator = PortainerCoordinator(hass, entry, client)

View File

@@ -15,7 +15,7 @@ from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import PortainerConfigEntry
from .const import ContainerState, EndpointStatus, StackStatus
from .const import CONTAINER_STATE_RUNNING, STACK_STATUS_ACTIVE
from .coordinator import PortainerContainerData
from .entity import (
PortainerContainerEntity,
@@ -53,7 +53,7 @@ CONTAINER_SENSORS: tuple[PortainerContainerBinarySensorEntityDescription, ...] =
PortainerContainerBinarySensorEntityDescription(
key="status",
translation_key="status",
state_fn=lambda data: data.container.state == ContainerState.RUNNING,
state_fn=lambda data: data.container.state == CONTAINER_STATE_RUNNING,
device_class=BinarySensorDeviceClass.RUNNING,
entity_category=EntityCategory.DIAGNOSTIC,
),
@@ -63,7 +63,7 @@ ENDPOINT_SENSORS: tuple[PortainerEndpointBinarySensorEntityDescription, ...] = (
PortainerEndpointBinarySensorEntityDescription(
key="status",
translation_key="status",
state_fn=lambda data: data.endpoint.status == EndpointStatus.UP,
state_fn=lambda data: data.endpoint.status == 1, # 1 = Running | 2 = Stopped
device_class=BinarySensorDeviceClass.RUNNING,
entity_category=EntityCategory.DIAGNOSTIC,
),
@@ -73,7 +73,9 @@ STACK_SENSORS: tuple[PortainerStackBinarySensorEntityDescription, ...] = (
PortainerStackBinarySensorEntityDescription(
key="stack_status",
translation_key="status",
state_fn=lambda data: data.stack.status == StackStatus.ACTIVE,
state_fn=lambda data: (
data.stack.status == STACK_STATUS_ACTIVE
), # 1 = Active | 2 = Inactive
device_class=BinarySensorDeviceClass.RUNNING,
entity_category=EntityCategory.DIAGNOSTIC,
),

View File

@@ -1,36 +1,17 @@
"""Constants for the Portainer integration."""
from enum import IntEnum, StrEnum
DOMAIN = "portainer"
DEFAULT_NAME = "Portainer"
API_MAX_RETRIES = 3
ENDPOINT_STATUS_DOWN = 2
CONTAINER_STATE_RUNNING = "running"
STACK_STATUS_ACTIVE = 1
STACK_STATUS_INACTIVE = 2
class EndpointStatus(IntEnum):
"""Portainer endpoint status."""
UP = 1
DOWN = 2
class ContainerState(StrEnum):
"""Portainer container state."""
RUNNING = "running"
class StackStatus(IntEnum):
"""Portainer stack status."""
ACTIVE = 1
INACTIVE = 2
class StackType(IntEnum):
"""Portainer stack type."""
SWARM = 1
COMPOSE = 2
KUBERNETES = 3
STACK_TYPE_SWARM = 1
STACK_TYPE_COMPOSE = 2
STACK_TYPE_KUBERNETES = 3

View File

@@ -29,7 +29,7 @@ from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import DOMAIN, ContainerState, EndpointStatus
from .const import CONTAINER_STATE_RUNNING, DOMAIN, ENDPOINT_STATUS_DOWN
type PortainerConfigEntry = ConfigEntry[PortainerCoordinator]
@@ -154,7 +154,7 @@ class PortainerCoordinator(DataUpdateCoordinator[dict[int, PortainerCoordinatorD
mapped_endpoints: dict[int, PortainerCoordinatorData] = {}
for endpoint in endpoints:
if endpoint.status == EndpointStatus.DOWN:
if endpoint.status == ENDPOINT_STATUS_DOWN:
_LOGGER.debug(
"Skipping offline endpoint: %s (ID: %d)",
endpoint.name,
@@ -215,7 +215,7 @@ class PortainerCoordinator(DataUpdateCoordinator[dict[int, PortainerCoordinatorD
running_containers = [
container
for container in containers
if container.state == ContainerState.RUNNING
if container.state == CONTAINER_STATE_RUNNING
]
if running_containers:
container_stats = dict(

View File

@@ -7,5 +7,5 @@
"integration_type": "service",
"iot_class": "local_polling",
"quality_scale": "bronze",
"requirements": ["pyportainer==1.0.32"]
"requirements": ["pyportainer==1.0.28"]
}

View File

@@ -17,7 +17,7 @@ from homeassistant.const import PERCENTAGE, UnitOfInformation
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import StackType
from .const import STACK_TYPE_COMPOSE, STACK_TYPE_KUBERNETES, STACK_TYPE_SWARM
from .coordinator import (
PortainerConfigEntry,
PortainerContainerData,
@@ -293,11 +293,11 @@ STACK_SENSORS: tuple[PortainerStackSensorEntityDescription, ...] = (
translation_key="stack_type",
value_fn=lambda data: (
"swarm"
if data.stack.type == StackType.SWARM
if data.stack.type == STACK_TYPE_SWARM
else "compose"
if data.stack.type == StackType.COMPOSE
if data.stack.type == STACK_TYPE_COMPOSE
else "kubernetes"
if data.stack.type == StackType.KUBERNETES
if data.stack.type == STACK_TYPE_KUBERNETES
else None
),
device_class=SensorDeviceClass.ENUM,

View File

@@ -23,7 +23,7 @@ from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import PortainerConfigEntry
from .const import DOMAIN, StackStatus
from .const import DOMAIN, STACK_STATUS_ACTIVE
from .coordinator import (
PortainerContainerData,
PortainerCoordinator,
@@ -99,7 +99,7 @@ STACK_SWITCHES: tuple[PortainerStackSwitchEntityDescription, ...] = (
key="stack",
translation_key="stack",
device_class=SwitchDeviceClass.SWITCH,
is_on_fn=lambda data: data.stack.status == StackStatus.ACTIVE,
is_on_fn=lambda data: data.stack.status == STACK_STATUS_ACTIVE,
turn_on_fn=lambda portainer: portainer.start_stack,
turn_off_fn=lambda portainer: portainer.stop_stack,
),

View File

@@ -19,13 +19,12 @@ from homeassistant.components.button import (
)
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError, ServiceValidationError
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN
from .coordinator import ProxmoxConfigEntry, ProxmoxCoordinator, ProxmoxNodeData
from .entity import ProxmoxContainerEntity, ProxmoxNodeEntity, ProxmoxVMEntity
from .helpers import is_granted
@dataclass(frozen=True, kw_only=True)
@@ -265,11 +264,6 @@ class ProxmoxNodeButtonEntity(ProxmoxNodeEntity, ProxmoxBaseButton):
async def _async_press_call(self) -> None:
"""Execute the node button action via executor."""
if not is_granted(self.coordinator.permissions, p_type="nodes"):
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="no_permission_node_power",
)
await self.hass.async_add_executor_job(
self.entity_description.press_action,
self.coordinator,
@@ -284,11 +278,6 @@ class ProxmoxVMButtonEntity(ProxmoxVMEntity, ProxmoxBaseButton):
async def _async_press_call(self) -> None:
"""Execute the VM button action via executor."""
if not is_granted(self.coordinator.permissions, p_type="vms"):
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="no_permission_vm_lxc_power",
)
await self.hass.async_add_executor_job(
self.entity_description.press_action,
self.coordinator,
@@ -304,12 +293,6 @@ class ProxmoxContainerButtonEntity(ProxmoxContainerEntity, ProxmoxBaseButton):
async def _async_press_call(self) -> None:
"""Execute the container button action via executor."""
# Container power actions fall under vms
if not is_granted(self.coordinator.permissions, p_type="vms"):
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="no_permission_vm_lxc_power",
)
await self.hass.async_add_executor_job(
self.entity_description.press_action,
self.coordinator,

View File

@@ -17,5 +17,3 @@ DEFAULT_VERIFY_SSL = True
TYPE_VM = 0
TYPE_CONTAINER = 1
UPDATE_INTERVAL = 60
PERM_POWER = "VM.PowerMgmt"

View File

@@ -70,7 +70,6 @@ class ProxmoxCoordinator(DataUpdateCoordinator[dict[str, ProxmoxNodeData]]):
self.known_nodes: set[str] = set()
self.known_vms: set[tuple[str, int]] = set()
self.known_containers: set[tuple[str, int]] = set()
self.permissions: dict[str, dict[str, int]] = {}
self.new_nodes_callbacks: list[Callable[[list[ProxmoxNodeData]], None]] = []
self.new_vms_callbacks: list[
@@ -102,21 +101,11 @@ class ProxmoxCoordinator(DataUpdateCoordinator[dict[str, ProxmoxNodeData]]):
translation_key="timeout_connect",
translation_placeholders={"error": repr(err)},
) from err
except ProxmoxServerError as err:
raise ConfigEntryNotReady(
translation_domain=DOMAIN,
translation_key="api_error_details",
translation_placeholders={"error": repr(err)},
) from err
except ProxmoxPermissionsError as err:
raise ConfigEntryAuthFailed(
translation_domain=DOMAIN,
translation_key="permissions_error",
) from err
except ProxmoxNodesNotFoundError as err:
except ResourceException as err:
raise ConfigEntryError(
translation_domain=DOMAIN,
translation_key="no_nodes_found",
translation_placeholders={"error": repr(err)},
) from err
except requests.exceptions.ConnectionError as err:
raise ConfigEntryError(
@@ -154,6 +143,7 @@ class ProxmoxCoordinator(DataUpdateCoordinator[dict[str, ProxmoxNodeData]]):
raise UpdateFailed(
translation_domain=DOMAIN,
translation_key="no_nodes_found",
translation_placeholders={"error": repr(err)},
) from err
except requests.exceptions.ConnectionError as err:
raise UpdateFailed(
@@ -190,19 +180,7 @@ class ProxmoxCoordinator(DataUpdateCoordinator[dict[str, ProxmoxNodeData]]):
password=self.config_entry.data[CONF_PASSWORD],
verify_ssl=self.config_entry.data.get(CONF_VERIFY_SSL, DEFAULT_VERIFY_SSL),
)
try:
self.permissions = self.proxmox.access.permissions.get()
except ResourceException as err:
if 400 <= err.status_code < 500:
raise ProxmoxPermissionsError from err
raise ProxmoxServerError from err
try:
self.proxmox.nodes.get()
except ResourceException as err:
if 400 <= err.status_code < 500:
raise ProxmoxNodesNotFoundError from err
raise ProxmoxServerError from err
self.proxmox.nodes.get()
def _fetch_all_nodes(
self,
@@ -252,19 +230,3 @@ class ProxmoxCoordinator(DataUpdateCoordinator[dict[str, ProxmoxNodeData]]):
if new_containers:
_LOGGER.debug("New containers found: %s", new_containers)
self.known_containers.update(new_containers)
class ProxmoxSetupError(Exception):
"""Base exception for Proxmox setup issues."""
class ProxmoxNodesNotFoundError(ProxmoxSetupError):
"""Raised when the API works but no nodes are visible."""
class ProxmoxPermissionsError(ProxmoxSetupError):
"""Raised when failing to retrieve permissions."""
class ProxmoxServerError(ProxmoxSetupError):
"""Raised when the Proxmox server returns an error."""

View File

@@ -1,13 +0,0 @@
"""Helpers for Proxmox VE."""
from .const import PERM_POWER
def is_granted(
permissions: dict[str, dict[str, int]],
p_type: str = "vms",
permission: str = PERM_POWER,
) -> bool:
"""Validate user permissions for the given type and permission."""
path = f"/{p_type}"
return permissions.get(path, {}).get(permission) == 1
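The removed `is_granted` helper reduces to a nested dictionary lookup against ACL paths like `/vms` and `/nodes`. A self-contained version with `PERM_POWER` inlined, plus usage against a sample permissions payload:

```python
def is_granted(
    permissions: dict[str, dict[str, int]],
    p_type: str = "vms",
    permission: str = "VM.PowerMgmt",
) -> bool:
    """Return True if the ACL grants *permission* on the /vms or /nodes path."""
    return permissions.get(f"/{p_type}", {}).get(permission) == 1


perms = {"/vms": {"VM.PowerMgmt": 1}, "/nodes": {}}
assert is_granted(perms)                      # /vms carries VM.PowerMgmt = 1
assert not is_granted(perms, p_type="nodes")  # /nodes lacks the flag
```

The double `.get` with defaults means a missing path, a missing permission key, or a value other than `1` all read as "not granted".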

View File

@@ -175,9 +175,6 @@
}
},
"exceptions": {
"api_error_details": {
"message": "An error occurred while communicating with the Proxmox VE instance: {error}"
},
"api_error_no_details": {
"message": "An error occurred while communicating with the Proxmox VE instance."
},
@@ -196,15 +193,6 @@
"no_nodes_found": {
"message": "No active nodes were found on the Proxmox VE server."
},
"no_permission_node_power": {
"message": "The configured Proxmox VE user does not have permission to manage the power state of nodes. Please grant the user the 'VM.PowerMgmt' permission and try again."
},
"no_permission_vm_lxc_power": {
"message": "The configured Proxmox VE user does not have permission to manage the power state of VMs and containers. Please grant the user the 'VM.PowerMgmt' permission and try again."
},
"permissions_error": {
"message": "Failed to retrieve Proxmox VE permissions. Please check your credentials and try again."
},
"ssl_error": {
"message": "An SSL error occurred: {error}"
},

View File

@@ -3,7 +3,7 @@
"name": "Recovery Mode",
"codeowners": ["@home-assistant/core"],
"config_flow": false,
"dependencies": ["persistent_notification"],
"dependencies": ["frontend", "persistent_notification", "cloud"],
"documentation": "https://www.home-assistant.io/integrations/recovery_mode",
"integration_type": "system",
"quality_scale": "internal"

View File

@@ -26,13 +26,5 @@
"turn_on": {
"service": "mdi:remote"
}
},
"triggers": {
"turned_off": {
"trigger": "mdi:remote-off"
},
"turned_on": {
"trigger": "mdi:remote"
}
}
}

View File

@@ -1,8 +1,4 @@
{
"common": {
"trigger_behavior_description": "The behavior of the targeted remotes to trigger on.",
"trigger_behavior_name": "Behavior"
},
"device_automation": {
"action_type": {
"toggle": "[%key:common::device_automation::action_type::toggle%]",
@@ -31,15 +27,6 @@
}
}
},
"selector": {
"trigger_behavior": {
"options": {
"any": "Any",
"first": "First",
"last": "Last"
}
}
},
"services": {
"delete_command": {
"description": "Deletes a command or a list of commands from the database.",
@@ -126,27 +113,5 @@
"name": "[%key:common::action::turn_on%]"
}
},
"title": "Remote",
"triggers": {
"turned_off": {
"description": "Triggers when one or more remotes turn off.",
"fields": {
"behavior": {
"description": "[%key:component::remote::common::trigger_behavior_description%]",
"name": "[%key:component::remote::common::trigger_behavior_name%]"
}
},
"name": "Remote turned off"
},
"turned_on": {
"description": "Triggers when one or more remotes turn on.",
"fields": {
"behavior": {
"description": "[%key:component::remote::common::trigger_behavior_description%]",
"name": "[%key:component::remote::common::trigger_behavior_name%]"
}
},
"name": "Remote turned on"
}
}
"title": "Remote"
}

View File

@@ -1,17 +0,0 @@
"""Provides triggers for remotes."""
from homeassistant.const import STATE_OFF, STATE_ON
from homeassistant.core import HomeAssistant
from homeassistant.helpers.trigger import Trigger, make_entity_target_state_trigger
from . import DOMAIN
TRIGGERS: dict[str, type[Trigger]] = {
"turned_on": make_entity_target_state_trigger(DOMAIN, STATE_ON),
"turned_off": make_entity_target_state_trigger(DOMAIN, STATE_OFF),
}
async def async_get_triggers(hass: HomeAssistant) -> dict[str, type[Trigger]]:
"""Return the triggers for remotes."""
return TRIGGERS
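`make_entity_target_state_trigger` above is a factory that returns a trigger bound to a domain and a target state. A toy stdlib analogue of that closure-over-parameters pattern (the predicate signature and event shape are invented for illustration, not Home Assistant's API):

```python
from collections.abc import Callable


def make_state_trigger(domain: str, state: str) -> Callable[[str, str], bool]:
    """Build a predicate that fires for entities of *domain* reaching *state*."""

    def matches(entity_id: str, new_state: str) -> bool:
        return entity_id.startswith(f"{domain}.") and new_state == state

    return matches


# A registry mapping trigger names to configured predicates, mirroring TRIGGERS.
TRIGGERS: dict[str, Callable[[str, str], bool]] = {
    "turned_on": make_state_trigger("remote", "on"),
    "turned_off": make_state_trigger("remote", "off"),
}
```

The factory keeps the per-trigger configuration (domain, state) out of the registry, so adding a trigger is one dictionary entry.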

View File

@@ -1,18 +0,0 @@
.trigger_common: &trigger_common
target:
entity:
domain: remote
fields:
behavior:
required: true
default: any
selector:
select:
options:
- first
- last
- any
translation_key: trigger_behavior
turned_off: *trigger_common
turned_on: *trigger_common

View File

@@ -52,13 +52,6 @@
"charging_remaining_time": {
"default": "mdi:timer"
},
"charging_settings_mode": {
"default": "mdi:calendar-remove",
"state": {
"delayed": "mdi:calendar-clock",
"scheduled": "mdi:calendar-month"
}
},
"fuel_autonomy": {
"default": "mdi:gas-station"
},

View File

@@ -11,7 +11,7 @@ import logging
from typing import TYPE_CHECKING, Any, Concatenate, cast
from renault_api.exceptions import RenaultException
from renault_api.kamereon import models
from renault_api.kamereon import models, schemas
from renault_api.renault_vehicle import RenaultVehicle
from homeassistant.core import HomeAssistant
@@ -201,7 +201,18 @@ class RenaultVehicleProxy:
@with_error_wrapping
async def get_charging_settings(self) -> models.KamereonVehicleChargingSettingsData:
"""Get vehicle charging settings."""
return await self._vehicle.get_charging_settings()
full_endpoint = await self._vehicle.get_full_endpoint("charging-settings")
response = await self._vehicle.http_get(full_endpoint)
response_data = cast(
models.KamereonVehicleDataResponse,
schemas.KamereonVehicleDataResponseSchema.load(response.raw_data),
)
return cast(
models.KamereonVehicleChargingSettingsData,
response_data.get_attributes(
schemas.KamereonVehicleChargingSettingsDataSchema
),
)
@with_error_wrapping
async def set_charge_schedules(
@@ -249,12 +260,6 @@ COORDINATORS: tuple[RenaultCoordinatorDescription, ...] = (
requires_electricity=True,
update_method=lambda x: x.get_charge_mode,
),
RenaultCoordinatorDescription(
endpoint="charging-settings",
key="charging_settings",
requires_electricity=True,
update_method=lambda x: x.get_charging_settings,
),
RenaultCoordinatorDescription(
endpoint="lock-status",
key="lock_status",

View File

@@ -9,7 +9,6 @@ from typing import TYPE_CHECKING, Any, Generic, cast
from renault_api.kamereon.models import (
KamereonVehicleBatteryStatusData,
KamereonVehicleChargingSettingsData,
KamereonVehicleCockpitData,
KamereonVehicleHvacStatusData,
KamereonVehicleLocationData,
@@ -129,13 +128,6 @@ def _get_utc_value(entity: RenaultSensor[T]) -> datetime:
return as_utc(original_dt)
def _get_charging_settings_mode_formatted(entity: RenaultSensor[T]) -> str | None:
"""Return the charging_settings mode of this entity."""
data = cast(KamereonVehicleChargingSettingsData, entity.coordinator.data)
charging_mode = data.mode if data else None
return charging_mode.lower() if charging_mode else None
SENSOR_TYPES: tuple[RenaultSensorEntityDescription[Any], ...] = (
RenaultSensorEntityDescription(
key="battery_level",
@@ -347,20 +339,6 @@ SENSOR_TYPES: tuple[RenaultSensorEntityDescription[Any], ...] = (
entity_registry_enabled_default=False,
translation_key="res_state_code",
),
RenaultSensorEntityDescription(
key="charging_settings_mode",
coordinator="charging_settings",
data_key="mode",
translation_key="charging_settings_mode",
entity_class=RenaultSensor[KamereonVehicleChargingSettingsData],
device_class=SensorDeviceClass.ENUM,
options=[
"always",
"delayed",
"scheduled",
],
value_lambda=_get_charging_settings_mode_formatted,
),
RenaultSensorEntityDescription(
key="front_left_pressure",
coordinator="pressure",

View File

@@ -140,14 +140,6 @@
"charging_remaining_time": {
"name": "Charging remaining time"
},
"charging_settings_mode": {
"name": "Charging mode",
"state": {
"always": "Always",
"delayed": "Delayed",
"scheduled": "Scheduled"
}
},
"front_left_pressure": {
"name": "Front left tyre pressure"
},

Some files were not shown because too many files have changed in this diff.