# Skill: Home Assistant Integration knowledge
### File Locations
- **Integration code**: `./homeassistant/components/<integration_domain>/`
- **Integration tests**: `./tests/components/<integration_domain>/`
## Integration Templates
### Standard Integration Structure
```
homeassistant/components/my_integration/
├── __init__.py # Entry point with async_setup_entry
├── manifest.json # Integration metadata and dependencies
├── const.py # Domain and constants
├── config_flow.py # UI configuration flow
├── coordinator.py # Data update coordinator (if needed)
├── entity.py # Base entity class (if shared patterns)
├── sensor.py # Sensor platform
├── strings.json # User-facing text and translations
├── services.yaml # Service definitions (if applicable)
└── quality_scale.yaml # Quality scale rule status
```
An integration adds platform files as needed (e.g., `sensor.py`, `switch.py`). The following platforms have extra guidelines:
- **Diagnostics**: [`platform-diagnostics.md`](platform-diagnostics.md) for diagnostic data collection
<REFERENCE platform-diagnostics.md>
# Integration Diagnostics
Platform exists as `homeassistant/components/<domain>/diagnostics.py`.
- **Required**: Implement diagnostic data collection
- **Implementation**:
```python
TO_REDACT = [CONF_API_KEY, CONF_LATITUDE, CONF_LONGITUDE]

async def async_get_config_entry_diagnostics(
    hass: HomeAssistant, entry: MyConfigEntry
) -> dict[str, Any]:
    """Return diagnostics for a config entry."""
    return {
        "entry_data": async_redact_data(entry.data, TO_REDACT),
        "data": entry.runtime_data.data,
    }
```
- **Security**: Never expose passwords, tokens, or sensitive coordinates
<END REFERENCE platform-diagnostics.md>
- **Repairs**: [`platform-repairs.md`](platform-repairs.md) for user-actionable repair issues
<REFERENCE platform-repairs.md>
# Repairs platform
Platform exists as `homeassistant/components/<domain>/repairs.py`.
- **Actionable Issues Required**: All repair issues must be actionable for end users
- **Issue Content Requirements**:
- Clearly explain what is happening
- Provide specific steps users need to take to resolve the issue
- Use friendly, helpful language
- Include relevant context (device names, error details, etc.)
- **Implementation**:
```python
ir.async_create_issue(
    hass,
    DOMAIN,
    "outdated_version",
    is_fixable=False,
    issue_domain=DOMAIN,
    severity=ir.IssueSeverity.ERROR,
    translation_key="outdated_version",
)
```
- **Translation Strings Requirements**: Must contain user-actionable text in `strings.json`:
```json
{
  "issues": {
    "outdated_version": {
      "title": "Device firmware is outdated",
      "description": "Your device firmware version {current_version} is below the minimum required version {min_version}. To fix this issue: 1) Open the manufacturer's mobile app, 2) Navigate to device settings, 3) Select 'Update Firmware', 4) Wait for the update to complete, then 5) Restart Home Assistant."
    }
  }
}
```
- **String Content Must Include**:
- What the problem is
- Why it matters
- Exact steps to resolve (numbered list when multiple steps)
- What to expect after following the steps
- **Avoid Vague Instructions**: Don't just say "update firmware" - provide specific steps
- **Severity Guidelines**:
- `CRITICAL`: Reserved for extreme scenarios only
- `ERROR`: Requires immediate user attention
- `WARNING`: Indicates future potential breakage
- **Additional Attributes**:
```python
ir.async_create_issue(
    hass, DOMAIN, "issue_id",
    breaks_in_ha_version="2024.1.0",
    is_fixable=True,
    is_persistent=True,
    severity=ir.IssueSeverity.ERROR,
    translation_key="issue_description",
)
```
- Only create issues for problems users can potentially resolve
<END REFERENCE platform-repairs.md>
### Minimal Integration Checklist
- [ ] `manifest.json` with required fields (domain, name, codeowners, etc.)
- [ ] `__init__.py` with `async_setup_entry` and `async_unload_entry`
- [ ] `config_flow.py` with UI configuration support
- [ ] `const.py` with `DOMAIN` constant
- [ ] `strings.json` with at least config flow text
- [ ] Platform files (`sensor.py`, etc.) as needed
- [ ] `quality_scale.yaml` with rule status tracking
## Integration Quality Scale
Home Assistant uses an Integration Quality Scale to ensure code quality and consistency. The quality level determines which rules apply:
### Quality Scale Levels
- **Bronze**: Basic requirements (ALL Bronze rules are mandatory)
- **Silver**: Enhanced functionality
- **Gold**: Advanced features
- **Platinum**: Highest quality standards
### Quality Scale Progression
- **Bronze → Silver**: Add entity unavailability, parallel updates, auth flows
- **Silver → Gold**: Add device management, diagnostics, translations
- **Gold → Platinum**: Add strict typing, async dependencies, websession injection
### How Rules Apply
1. **Check `manifest.json`**: Look for `"quality_scale"` key to determine integration level
2. **Bronze Rules**: Always required for any integration with quality scale
3. **Higher Tier Rules**: Only apply if integration targets that tier or higher
4. **Rule Status**: Check `quality_scale.yaml` in integration folder for:
- `done`: Rule implemented
- `exempt`: Rule doesn't apply (with reason in comment)
- `todo`: Rule needs implementation
### Example `quality_scale.yaml` Structure
```yaml
rules:
  # Bronze (mandatory)
  config-flow: done
  entity-unique-id: done
  action-setup:
    status: exempt
    comment: Integration does not register custom actions.

  # Silver (if targeting Silver+)
  entity-unavailable: done
  parallel-updates: done

  # Gold (if targeting Gold+)
  devices: done
  diagnostics: done

  # Platinum (if targeting Platinum)
  strict-typing: done
```
**When Reviewing/Creating Code**: Always check the integration's quality scale level and exemption status before applying rules.
## Code Organization
### Core Locations
- Shared constants: `homeassistant/const.py` (use these instead of hardcoding)
- Integration structure:
- `homeassistant/components/{domain}/const.py` - Constants
- `homeassistant/components/{domain}/models.py` - Data models
- `homeassistant/components/{domain}/coordinator.py` - Update coordinator
- `homeassistant/components/{domain}/config_flow.py` - Configuration flow
- `homeassistant/components/{domain}/{platform}.py` - Platform implementations
### Common Modules
- **coordinator.py**: Centralize data fetching logic
```python
class MyCoordinator(DataUpdateCoordinator[MyData]):
    def __init__(self, hass: HomeAssistant, client: MyClient, config_entry: ConfigEntry) -> None:
        super().__init__(
            hass,
            logger=LOGGER,
            name=DOMAIN,
            update_interval=timedelta(minutes=1),
            config_entry=config_entry,  # ✅ Pass config_entry - it's accepted and recommended
        )
```
- **entity.py**: Base entity definitions to reduce duplication
```python
class MyEntity(CoordinatorEntity[MyCoordinator]):
_attr_has_entity_name = True
```
### Runtime Data Storage
- **Use ConfigEntry.runtime_data**: Store non-persistent runtime data
```python
type MyIntegrationConfigEntry = ConfigEntry[MyClient]

async def async_setup_entry(hass: HomeAssistant, entry: MyIntegrationConfigEntry) -> bool:
    client = MyClient(entry.data[CONF_HOST])
    entry.runtime_data = client
```
### Manifest Requirements
- **Required Fields**: `domain`, `name`, `codeowners`, `integration_type`, `documentation`, `requirements`
- **Integration Types**: `device`, `hub`, `service`, `system`, `helper`
- **IoT Class**: Always specify connectivity method (e.g., `cloud_polling`, `local_polling`, `local_push`)
- **Discovery Methods**: Add when applicable: `zeroconf`, `dhcp`, `bluetooth`, `ssdp`, `usb`
- **Dependencies**: Include platform dependencies (e.g., `application_credentials`, `bluetooth_adapters`)
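Putting the fields above together, a minimal `manifest.json` might look like this (the domain, codeowner, and requirement values are placeholders, not a real integration):

```json
{
  "domain": "my_integration",
  "name": "My Integration",
  "codeowners": ["@me"],
  "config_flow": true,
  "documentation": "https://www.home-assistant.io/integrations/my_integration",
  "integration_type": "device",
  "iot_class": "local_polling",
  "requirements": ["my-client-lib==1.0.0"]
}
```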
### Config Flow Patterns
- **Version Control**: Always set `VERSION = 1` and `MINOR_VERSION = 1`
- **Unique ID Management**:
```python
await self.async_set_unique_id(device_unique_id)
self._abort_if_unique_id_configured()
```
- **Error Handling**: Define errors in `strings.json` under `config.error`
- **Step Methods**: Use standard naming (`async_step_user`, `async_step_discovery`, etc.)
### Integration Ownership
- **manifest.json**: Add GitHub usernames to `codeowners`:
```json
{
  "domain": "my_integration",
  "name": "My Integration",
  "codeowners": ["@me"]
}
```
### Async Dependencies (Platinum)
- **Requirement**: All dependencies must use asyncio
- Ensures efficient task handling without thread context switching
### WebSession Injection (Platinum)
- **Pass WebSession**: Support passing web sessions to dependencies
```python
async def async_setup_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
    """Set up integration from config entry."""
    client = MyClient(entry.data[CONF_HOST], async_get_clientsession(hass))
```
- For cookies: Use `async_create_clientsession` (aiohttp) or `create_async_httpx_client` (httpx)
### Data Update Coordinator
- **Standard Pattern**: Use for efficient data management
```python
class MyCoordinator(DataUpdateCoordinator):
    def __init__(self, hass: HomeAssistant, client: MyClient, config_entry: ConfigEntry) -> None:
        super().__init__(
            hass,
            logger=LOGGER,
            name=DOMAIN,
            update_interval=timedelta(minutes=5),
            config_entry=config_entry,  # ✅ Pass config_entry - it's accepted and recommended
        )
        self.client = client

    async def _async_update_data(self):
        try:
            return await self.client.fetch_data()
        except ApiError as err:
            raise UpdateFailed(f"API communication error: {err}") from err
```
- **Error Types**: Use `UpdateFailed` for API errors, `ConfigEntryAuthFailed` for auth issues
- **Config Entry**: Always pass `config_entry` parameter to coordinator - it's accepted and recommended
## Integration Guidelines
### Configuration Flow
- **UI Setup Required**: All integrations must support configuration via UI
- **Manifest**: Set `"config_flow": true` in `manifest.json`
- **Data Storage**:
- Connection-critical config: Store in `ConfigEntry.data`
- Non-critical settings: Store in `ConfigEntry.options`
- **Validation**: Always validate user input before creating entries
- **Config Entry Naming**:
- ❌ Do NOT allow users to set config entry names in config flows
- Names are automatically generated or can be customized later in UI
- ✅ Exception: Helper integrations MAY allow custom names in config flow
- **Connection Testing**: Test device/service connection during config flow:
```python
try:
    await client.get_data()
except MyException:
    errors["base"] = "cannot_connect"
```
- **Duplicate Prevention**: Prevent duplicate configurations:
```python
# Using unique ID
await self.async_set_unique_id(identifier)
self._abort_if_unique_id_configured()
# Using unique data
self._async_abort_entries_match({CONF_HOST: user_input[CONF_HOST]})
```
### Reauthentication Support
- **Required Method**: Implement `async_step_reauth` in config flow
- **Credential Updates**: Allow users to update credentials without re-adding
- **Validation**: Verify account matches existing unique ID:
```python
await self.async_set_unique_id(user_id)
self._abort_if_unique_id_mismatch(reason="wrong_account")
return self.async_update_reload_and_abort(
    self._get_reauth_entry(),
    data_updates={CONF_API_TOKEN: user_input[CONF_API_TOKEN]}
)
```
### Reconfiguration Flow
- **Purpose**: Allow configuration updates without removing device
- **Implementation**: Add `async_step_reconfigure` method
- **Validation**: Prevent changing underlying account with `_abort_if_unique_id_mismatch`
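The decision logic behind a reconfigure step can be sketched in plain Python. This models only the validation flow described above, not the real `ConfigFlow` API; `FakeEntry`, `reconfigure`, and the returned abort strings are hypothetical stand-ins:

```python
from dataclasses import dataclass, field

@dataclass
class FakeEntry:
    """Stand-in for a config entry: a unique ID plus stored connection data."""
    unique_id: str
    data: dict = field(default_factory=dict)

def reconfigure(entry: FakeEntry, new_unique_id: str, data_updates: dict) -> str:
    """Mimic _abort_if_unique_id_mismatch + async_update_reload_and_abort."""
    if new_unique_id != entry.unique_id:
        # Reconfiguration must not silently switch to a different account/device
        return "abort:wrong_account"
    entry.data.update(data_updates)  # apply the new connection settings
    return "abort:reconfigure_successful"
```

The same guard is why the reauth example above calls `_abort_if_unique_id_mismatch` before updating credentials.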
### Device Discovery
- **Manifest Configuration**: Add discovery method (zeroconf, dhcp, etc.)
```json
{
  "zeroconf": ["_mydevice._tcp.local."]
}
```
- **Discovery Handler**: Implement appropriate `async_step_*` method:
```python
async def async_step_zeroconf(self, discovery_info):
    """Handle zeroconf discovery."""
    await self.async_set_unique_id(discovery_info.properties["serialno"])
    self._abort_if_unique_id_configured(updates={CONF_HOST: discovery_info.host})
```
- **Network Updates**: Use discovery to update dynamic IP addresses
### Network Discovery Implementation
- **Zeroconf/mDNS**: Use async instances
```python
aiozc = await zeroconf.async_get_async_instance(hass)
```
- **SSDP Discovery**: Register callbacks with cleanup
```python
entry.async_on_unload(
    ssdp.async_register_callback(
        hass, _async_discovered_device,
        {"st": "urn:schemas-upnp-org:device:ZonePlayer:1"}
    )
)
```
### Bluetooth Integration
- **Manifest Dependencies**: Add `bluetooth_adapters` to dependencies
- **Connectable**: Set `"connectable": true` for connection-required devices
- **Scanner Usage**: Always use shared scanner instance
```python
scanner = bluetooth.async_get_scanner()
entry.async_on_unload(
    bluetooth.async_register_callback(
        hass, _async_discovered_device,
        {"service_uuid": "example_uuid"},
        bluetooth.BluetoothScanningMode.ACTIVE
    )
)
```
- **Connection Handling**: Never reuse `BleakClient` instances, use 10+ second timeouts
### Setup Validation
- **Test Before Setup**: Verify integration can be set up in `async_setup_entry`
- **Exception Handling**:
- `ConfigEntryNotReady`: Device offline or temporary failure
- `ConfigEntryAuthFailed`: Authentication issues
- `ConfigEntryError`: Unresolvable setup problems
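The exception mapping can be sketched as follows. The `ConfigEntry*` classes here are stand-ins for `homeassistant.exceptions` so the example runs on its own, and `ApiTimeout`/`ApiAuthError` are hypothetical client errors:

```python
import asyncio

class ConfigEntryNotReady(Exception):
    """Stand-in: device offline or temporary failure (setup is retried)."""

class ConfigEntryAuthFailed(Exception):
    """Stand-in: authentication issue (triggers a reauth flow)."""

class ApiTimeout(Exception): ...
class ApiAuthError(Exception): ...

async def async_setup_entry(connect) -> bool:
    """Translate client errors into the setup exceptions listed above."""
    try:
        await connect()
    except ApiTimeout as err:
        raise ConfigEntryNotReady("Device is offline") from err
    except ApiAuthError as err:
        raise ConfigEntryAuthFailed("Credentials rejected") from err
    return True
```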
### Config Entry Unloading
- **Required**: Implement `async_unload_entry` for runtime removal/reload
- **Platform Unloading**: Use `hass.config_entries.async_unload_platforms`
- **Cleanup**: Register callbacks with `entry.async_on_unload`:
```python
async def async_unload_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
    """Unload a config entry."""
    if unload_ok := await hass.config_entries.async_unload_platforms(entry, PLATFORMS):
        entry.runtime_data.listener()  # Clean up resources
    return unload_ok
```
### Service Actions
- **Registration**: Register all service actions in `async_setup`, NOT in `async_setup_entry`
- **Validation**: Check config entry existence and loaded state:
```python
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
    async def service_action(call: ServiceCall) -> ServiceResponse:
        if not (entry := hass.config_entries.async_get_entry(call.data[ATTR_CONFIG_ENTRY_ID])):
            raise ServiceValidationError("Entry not found")
        if entry.state is not ConfigEntryState.LOADED:
            raise ServiceValidationError("Entry not loaded")
```
- **Exception Handling**: Raise appropriate exceptions:
```python
# For invalid input
if end_date < start_date:
    raise ServiceValidationError("End date must be after start date")

# For service errors
try:
    await client.set_schedule(start_date, end_date)
except MyConnectionError as err:
    raise HomeAssistantError("Could not connect to the schedule") from err
```
### Service Registration Patterns
- **Entity Services**: Register on platform setup
```python
platform.async_register_entity_service(
    "my_entity_service",
    {vol.Required("parameter"): cv.string},
    "handle_service_method"
)
```
- **Service Schema**: Always validate input
```python
SERVICE_SCHEMA = vol.Schema({
    vol.Required("entity_id"): cv.entity_ids,
    vol.Required("parameter"): cv.string,
    vol.Optional("timeout", default=30): cv.positive_int,
})
```
- **Services File**: Create `services.yaml` with descriptions and field definitions
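A matching `services.yaml` entry for the `my_entity_service` example above could look like this (names and selector choice are illustrative):

```yaml
my_entity_service:
  name: My entity service
  description: Apply a parameter to the selected entities.
  target:
    entity:
      domain: sensor
  fields:
    parameter:
      name: Parameter
      description: The value to apply.
      required: true
      example: "some value"
      selector:
        text:
```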
### Polling
- Use update coordinator pattern when possible
- **Polling intervals are NOT user-configurable**: Never add scan_interval, update_interval, or polling frequency options to config flows or config entries
- **Integration determines intervals**: Set `update_interval` programmatically based on integration logic, not user input
- **Minimum Intervals**:
- Local network: 5 seconds
- Cloud services: 60 seconds
- **Parallel Updates**: Specify number of concurrent updates:
```python
PARALLEL_UPDATES = 1 # Serialize updates to prevent overwhelming device
# OR
PARALLEL_UPDATES = 0 # Unlimited (for coordinator-based or read-only)
```
## Entity Development
### Unique IDs
- **Required**: Every entity must have a unique ID for registry tracking
- Must be unique per platform (not per integration)
- Don't include integration domain or platform in ID
- **Implementation**:
```python
class MySensor(SensorEntity):
    def __init__(self, device_id: str) -> None:
        self._attr_unique_id = f"{device_id}_temperature"
```
**Acceptable ID Sources**:
- Device serial numbers
- MAC addresses (formatted using `format_mac` from device registry)
- Physical identifiers (printed/EEPROM)
- Config entry ID as last resort: `f"{entry.entry_id}-battery"`
**Never Use**:
- IP addresses, hostnames, URLs
- Device names
- Email addresses, usernames
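A stable unique ID built from a MAC address might be assembled like this. `normalize_mac` only mimics the normalization `format_mac` performs (lowercase, colon-separated); in a real integration, import `format_mac` from `homeassistant.helpers.device_registry` instead:

```python
def normalize_mac(mac: str) -> str:
    """Normalize any common MAC notation to lowercase colon-separated form."""
    digits = [c for c in mac.lower() if c in "0123456789abcdef"]
    return ":".join("".join(digits[i:i + 2]) for i in range(0, 12, 2))

def unique_id_for(mac: str, key: str) -> str:
    """Combine the formatted MAC with a per-entity key."""
    return f"{normalize_mac(mac)}_{key}"
```

Because the MAC is normalized first, `AA-BB-CC-DD-EE-FF`, `aabb.ccdd.eeff`, and `aa:bb:cc:dd:ee:ff` all yield the same unique ID.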
### Entity Descriptions
- **Lambda/Anonymous Functions**: Often used in EntityDescription for value transformation
- **Multiline Lambdas**: When lambdas exceed line length, wrap in parentheses for readability
- **Bad pattern**:
```python
SensorEntityDescription(
    key="temperature",
    name="Temperature",
    value_fn=lambda data: round(data["temp_value"] * 1.8 + 32, 1) if data.get("temp_value") is not None else None,  # ❌ Too long
)
```
- **Good pattern**:
```python
SensorEntityDescription(
    key="temperature",
    name="Temperature",
    value_fn=lambda data: (  # ✅ Parenthesis on same line as lambda
        round(data["temp_value"] * 1.8 + 32, 1)
        if data.get("temp_value") is not None
        else None
    ),
)
```
### Entity Naming
- **Use has_entity_name**: Set `_attr_has_entity_name = True`
- **For specific fields**:
```python
class MySensor(SensorEntity):
    _attr_has_entity_name = True

    def __init__(self, device: Device, field: str) -> None:
        self._attr_device_info = DeviceInfo(
            identifiers={(DOMAIN, device.id)},
            name=device.name,
        )
        self._attr_name = field  # e.g., "temperature", "humidity"
```
- **For device itself**: Set `_attr_name = None`
### Event Lifecycle Management
- **Subscribe in `async_added_to_hass`**:
```python
async def async_added_to_hass(self) -> None:
    """Subscribe to events."""
    self.async_on_remove(
        self.client.events.subscribe("my_event", self._handle_event)
    )
```
- **Unsubscribe in `async_will_remove_from_hass`** if not using `async_on_remove`
- Never subscribe in `__init__` or other methods
### State Handling
- Unknown values: Use `None` (not "unknown" or "unavailable")
- Availability: Implement `available()` property instead of using "unavailable" state
### Entity Availability
- **Mark Unavailable**: When data cannot be fetched from device/service
- **Coordinator Pattern**:
```python
@property
def available(self) -> bool:
    """Return if entity is available."""
    return super().available and self.identifier in self.coordinator.data
```
- **Direct Update Pattern**:
```python
async def async_update(self) -> None:
    """Update entity."""
    try:
        data = await self.client.get_data()
    except MyException:
        self._attr_available = False
    else:
        self._attr_available = True
        self._attr_native_value = data.value
```
### Extra State Attributes
- All attribute keys must always be present
- Unknown values: Use `None`
- Provide descriptive attributes
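A sketch of the rule above: the attribute dict always contains every key, with `None` for values the device did not report (the keys here are hypothetical):

```python
def build_extra_state_attributes(report: dict) -> dict:
    """Return a dict whose keys are always present, even when data is missing."""
    return {
        "wind_speed": report.get("wind_speed"),      # None when not reported
        "wind_bearing": report.get("wind_bearing"),
        "last_report": report.get("last_report"),
    }
```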
## Device Management
### Device Registry
- **Create Devices**: Group related entities under devices
- **Device Info**: Provide comprehensive metadata:
```python
_attr_device_info = DeviceInfo(
    connections={(CONNECTION_NETWORK_MAC, device.mac)},
    identifiers={(DOMAIN, device.id)},
    name=device.name,
    manufacturer="My Company",
    model="My Sensor",
    sw_version=device.version,
)
```
- For services: Add `entry_type=DeviceEntryType.SERVICE`
### Dynamic Device Addition
- **Auto-detect New Devices**: After initial setup
- **Implementation Pattern**:
```python
def _check_device() -> None:
    current_devices = set(coordinator.data)
    new_devices = current_devices - known_devices
    if new_devices:
        known_devices.update(new_devices)
        async_add_entities([MySensor(coordinator, device_id) for device_id in new_devices])

entry.async_on_unload(coordinator.async_add_listener(_check_device))
```
### Stale Device Removal
- **Auto-remove**: When devices disappear from hub/account
- **Device Registry Update**:
```python
device_registry.async_update_device(
    device_id=device.id,
    remove_config_entry_id=self.config_entry.entry_id,
)
```
- **Manual Deletion**: Implement `async_remove_config_entry_device` when needed
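The diffing behind stale-device removal, reduced to plain Python (`remove` is a callback standing in for the `device_registry.async_update_device` call shown above):

```python
def remove_stale_devices(registry_ids: set, current_ids: set, remove) -> None:
    """Unlink every registered device that no longer appears in fresh data."""
    for device_id in registry_ids - current_ids:
        remove(device_id)  # maps to async_update_device(..., remove_config_entry_id=...)
```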
### Entity Categories
- **Required**: Assign appropriate category to entities
- **Implementation**: Set `_attr_entity_category`
```python
class MySensor(SensorEntity):
    _attr_entity_category = EntityCategory.DIAGNOSTIC
```
- Categories include: `DIAGNOSTIC` for system/technical information
### Device Classes
- **Use When Available**: Set appropriate device class for entity type
```python
class MyTemperatureSensor(SensorEntity):
    _attr_device_class = SensorDeviceClass.TEMPERATURE
```
- Provides context for: unit conversion, voice control, UI representation
### Disabled by Default
- **Disable Noisy/Less Popular Entities**: Reduce resource usage
```python
class MySignalStrengthSensor(SensorEntity):
    _attr_entity_registry_enabled_default = False
```
- Target: frequently changing states, technical diagnostics
### Entity Translations
- **Required with has_entity_name**: Support international users
- **Implementation**:
```python
class MySensor(SensorEntity):
    _attr_has_entity_name = True
    _attr_translation_key = "phase_voltage"
```
- Create `strings.json` with translations:
```json
{
  "entity": {
    "sensor": {
      "phase_voltage": {
        "name": "Phase voltage"
      }
    }
  }
}
```
### Exception Translations (Gold)
- **Translatable Errors**: Use translation keys for user-facing exceptions
- **Implementation**:
```python
raise ServiceValidationError(
    translation_domain=DOMAIN,
    translation_key="end_date_before_start_date",
)
```
- Add to `strings.json`:
```json
{
"exceptions": {
"end_date_before_start_date": {
"message": "The end date cannot be before the start date."
}
}
}
```
### Icon Translations (Gold)
- **Dynamic Icons**: Support state and range-based icon selection
- **State-based Icons**:
```json
{
  "entity": {
    "sensor": {
      "tree_pollen": {
        "default": "mdi:tree",
        "state": {
          "high": "mdi:tree-outline"
        }
      }
    }
  }
}
```
- **Range-based Icons** (for numeric values):
```json
{
"entity": {
"sensor": {
"battery_level": {
"default": "mdi:battery-unknown",
"range": {
"0": "mdi:battery-outline",
"90": "mdi:battery-90",
"100": "mdi:battery"
}
}
}
}
}
```
## Testing Requirements
- **Location**: `tests/components/{domain}/`
- **Coverage Requirement**: Above 95% test coverage for all modules
- **Best Practices**:
- Use pytest fixtures from `tests.common`
- Mock all external dependencies
- Use snapshots for complex data structures
- Follow existing test patterns
### Config Flow Testing
- **100% Coverage Required**: All config flow paths must be tested
- **Test Scenarios**:
- All flow initiation methods (user, discovery, import)
- Successful configuration paths
- Error recovery scenarios
- Prevention of duplicate entries
- Flow completion after errors
### Running Tests
- **Integration-specific tests** (recommended):
```bash
pytest ./tests/components/<integration_domain> \
--cov=homeassistant.components.<integration_domain> \
--cov-report term-missing \
--durations-min=1 \
--durations=0 \
--numprocesses=auto
```
### Testing Best Practices
- **Never access `hass.data` directly** - Use fixtures and proper integration setup instead
- **Use snapshot testing** - For verifying entity states and attributes
- **Test through integration setup** - Don't test entities in isolation
- **Mock external APIs** - Use fixtures with realistic JSON data
- **Verify registries** - Ensure entities are properly registered with devices
### Config Flow Testing Template
```python
async def test_user_flow_success(hass, mock_api):
    """Test successful user flow."""
    result = await hass.config_entries.flow.async_init(
        DOMAIN, context={"source": config_entries.SOURCE_USER}
    )
    assert result["type"] == FlowResultType.FORM
    assert result["step_id"] == "user"

    # Test form submission
    result = await hass.config_entries.flow.async_configure(
        result["flow_id"], user_input=TEST_USER_INPUT
    )
    assert result["type"] == FlowResultType.CREATE_ENTRY
    assert result["title"] == "My Device"
    assert result["data"] == TEST_USER_INPUT


async def test_flow_connection_error(hass, mock_api_error):
    """Test connection error handling."""
    result = await hass.config_entries.flow.async_init(
        DOMAIN, context={"source": config_entries.SOURCE_USER}
    )
    result = await hass.config_entries.flow.async_configure(
        result["flow_id"], user_input=TEST_USER_INPUT
    )
    assert result["type"] == FlowResultType.FORM
    assert result["errors"] == {"base": "cannot_connect"}
```
### Entity Testing Patterns
```python
@pytest.fixture
def platforms() -> list[Platform]:
    """Overridden fixture to specify platforms to test."""
    return [Platform.SENSOR]  # Or another specific platform as needed.


@pytest.mark.usefixtures("entity_registry_enabled_by_default", "init_integration")
async def test_entities(
    hass: HomeAssistant,
    snapshot: SnapshotAssertion,
    entity_registry: er.EntityRegistry,
    device_registry: dr.DeviceRegistry,
    mock_config_entry: MockConfigEntry,
) -> None:
    """Test the sensor entities."""
    await snapshot_platform(hass, entity_registry, snapshot, mock_config_entry.entry_id)

    # Ensure entities are correctly assigned to device
    device_entry = device_registry.async_get_device(
        identifiers={(DOMAIN, "device_unique_id")}
    )
    assert device_entry
    entity_entries = er.async_entries_for_config_entry(
        entity_registry, mock_config_entry.entry_id
    )
    for entity_entry in entity_entries:
        assert entity_entry.device_id == device_entry.id
```
### Mock Patterns
```python
# Modern integration fixture setup
@pytest.fixture
def mock_config_entry() -> MockConfigEntry:
    """Return the default mocked config entry."""
    return MockConfigEntry(
        title="My Integration",
        domain=DOMAIN,
        data={CONF_HOST: "127.0.0.1", CONF_API_KEY: "test_key"},
        unique_id="device_unique_id",
    )


@pytest.fixture
def mock_device_api() -> Generator[MagicMock]:
    """Return a mocked device API."""
    with patch(
        "homeassistant.components.my_integration.MyDeviceAPI", autospec=True
    ) as api_mock:
        api = api_mock.return_value
        api.get_data.return_value = MyDeviceData.from_json(
            load_fixture("device_data.json", DOMAIN)
        )
        yield api


@pytest.fixture
def platforms() -> list[Platform]:
    """Fixture to specify platforms to test."""
    return PLATFORMS


@pytest.fixture
async def init_integration(
    hass: HomeAssistant,
    mock_config_entry: MockConfigEntry,
    mock_device_api: MagicMock,
    platforms: list[Platform],
) -> MockConfigEntry:
    """Set up the integration for testing."""
    mock_config_entry.add_to_hass(hass)
    with patch("homeassistant.components.my_integration.PLATFORMS", platforms):
        await hass.config_entries.async_setup(mock_config_entry.entry_id)
        await hass.async_block_till_done()
    return mock_config_entry
```
## Debugging & Troubleshooting
### Common Issues & Solutions
- **Integration won't load**: Check `manifest.json` syntax and required fields
- **Entities not appearing**: Verify `unique_id` and `has_entity_name` implementation
- **Config flow errors**: Check `strings.json` entries and error handling
- **Discovery not working**: Verify manifest discovery configuration and callbacks
- **Tests failing**: Check mock setup and async context
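As a sketch of the `unique_id` point above: unique IDs are conventionally composed from an immutable device identifier plus the entity key. The helper below is illustrative only, not a Home Assistant API:

```python
def build_unique_id(device_serial: str, key: str) -> str:
    """Compose a stable entity unique_id (illustrative convention, not an HA helper)."""
    # Derive from an immutable hardware identifier (serial number, MAC),
    # never a host/IP, which can change and orphan the registry entry.
    return f"{device_serial}_{key}"
```

With `has_entity_name = True`, the friendly name is built from the device name plus the entity's own name, so the `unique_id` only has to be stable, not human-readable.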
### Debug Logging Setup
```python
# Enable debug logging in tests
caplog.set_level(logging.DEBUG, logger="my_integration")
# In integration code - use proper logging
_LOGGER = logging.getLogger(__name__)
_LOGGER.debug("Processing data: %s", data) # Use lazy logging
```
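A self-contained illustration of why the lazy `%s` style matters, using plain stdlib `logging` (no Home Assistant imports): when the record is filtered out by level, the arguments are never formatted, so expensive `__str__` work is skipped entirely.

```python
import io
import logging


class Expensive:
    """Counts how often it is stringified."""

    calls = 0

    def __str__(self) -> str:
        Expensive.calls += 1
        return "expensive repr"


stream = io.StringIO()
_LOGGER = logging.getLogger("my_integration")
_LOGGER.setLevel(logging.INFO)
_LOGGER.propagate = False
_LOGGER.addHandler(logging.StreamHandler(stream))

data = Expensive()
_LOGGER.debug("Processing data: %s", data)  # below INFO: args never formatted
_LOGGER.info("Processing data: %s", data)   # emitted: __str__ runs exactly once
```

Had the calls used f-strings (`_LOGGER.debug(f"Processing data: {data}")`), the string would be built even for the suppressed debug message.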
### Validation Commands
```bash
# Check specific integration
python -m script.hassfest --integration-path homeassistant/components/my_integration
# Validate quality scale
# Check quality_scale.yaml against current rules
# Run integration tests with coverage
pytest ./tests/components/my_integration \
--cov=homeassistant.components.my_integration \
--cov-report term-missing
```
- Home Assistant Integration knowledge: .claude/skills/integrations/SKILL.md

View File

@@ -182,7 +182,7 @@ jobs:
fi
- name: Download translations
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
with:
name: translations
@@ -544,7 +544,7 @@ jobs:
python-version: ${{ env.DEFAULT_PYTHON }}
- name: Download translations
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
with:
name: translations

View File

@@ -978,7 +978,7 @@ jobs:
run: |
echo "::add-matcher::.github/workflows/matchers/pytest-slow.json"
- name: Download pytest_buckets
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
with:
name: pytest_buckets
- name: Compile English translations
@@ -1387,7 +1387,7 @@ jobs:
with:
persist-credentials: false
- name: Download all coverage artifacts
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
with:
pattern: coverage-*
- name: Upload coverage to Codecov
@@ -1558,7 +1558,7 @@ jobs:
with:
persist-credentials: false
- name: Download all coverage artifacts
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
with:
pattern: coverage-*
- name: Upload coverage to Codecov
@@ -1587,7 +1587,7 @@ jobs:
&& needs.info.outputs.skip_coverage != 'true' && !cancelled()
steps:
- name: Download all coverage artifacts
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
with:
pattern: test-results-*
- name: Upload test results to Codecov

View File

@@ -124,12 +124,12 @@ jobs:
persist-credentials: false
- name: Download env_file
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
with:
name: env_file
- name: Download requirements_diff
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
with:
name: requirements_diff
@@ -175,17 +175,17 @@ jobs:
persist-credentials: false
- name: Download env_file
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
with:
name: env_file
- name: Download requirements_diff
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
with:
name: requirements_diff
- name: Download requirements_all_wheels
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
with:
name: requirements_all_wheels

View File

@@ -70,7 +70,7 @@ from .const import (
SIGNAL_BOOTSTRAP_INTEGRATIONS,
)
from .core_config import async_process_ha_core_config
from .exceptions import HomeAssistantError
from .exceptions import HomeAssistantError, UnsupportedStorageVersionError
from .helpers import (
area_registry,
category_registry,
@@ -433,32 +433,56 @@ def _init_blocking_io_modules_in_executor() -> None:
is_docker_env()
async def async_load_base_functionality(hass: core.HomeAssistant) -> None:
"""Load the registries and modules that will do blocking I/O."""
async def async_load_base_functionality(hass: core.HomeAssistant) -> bool:
"""Load the registries and modules that will do blocking I/O.
Return whether loading succeeded.
"""
if DATA_REGISTRIES_LOADED in hass.data:
return
return True
hass.data[DATA_REGISTRIES_LOADED] = None
entity.async_setup(hass)
frame.async_setup(hass)
template.async_setup(hass)
translation.async_setup(hass)
await asyncio.gather(
create_eager_task(get_internal_store_manager(hass).async_initialize()),
create_eager_task(area_registry.async_load(hass)),
create_eager_task(category_registry.async_load(hass)),
create_eager_task(device_registry.async_load(hass)),
create_eager_task(entity_registry.async_load(hass)),
create_eager_task(floor_registry.async_load(hass)),
create_eager_task(issue_registry.async_load(hass)),
create_eager_task(label_registry.async_load(hass)),
hass.async_add_executor_job(_init_blocking_io_modules_in_executor),
create_eager_task(template.async_load_custom_templates(hass)),
create_eager_task(restore_state.async_load(hass)),
create_eager_task(hass.config_entries.async_initialize()),
create_eager_task(async_get_system_info(hass)),
create_eager_task(condition.async_setup(hass)),
create_eager_task(trigger.async_setup(hass)),
)
recovery = hass.config.recovery_mode
try:
await asyncio.gather(
create_eager_task(get_internal_store_manager(hass).async_initialize()),
create_eager_task(area_registry.async_load(hass, load_empty=recovery)),
create_eager_task(category_registry.async_load(hass, load_empty=recovery)),
create_eager_task(device_registry.async_load(hass, load_empty=recovery)),
create_eager_task(entity_registry.async_load(hass, load_empty=recovery)),
create_eager_task(floor_registry.async_load(hass, load_empty=recovery)),
create_eager_task(issue_registry.async_load(hass, load_empty=recovery)),
create_eager_task(label_registry.async_load(hass, load_empty=recovery)),
hass.async_add_executor_job(_init_blocking_io_modules_in_executor),
create_eager_task(template.async_load_custom_templates(hass)),
create_eager_task(restore_state.async_load(hass, load_empty=recovery)),
create_eager_task(hass.config_entries.async_initialize()),
create_eager_task(async_get_system_info(hass)),
create_eager_task(condition.async_setup(hass)),
create_eager_task(trigger.async_setup(hass)),
)
except UnsupportedStorageVersionError as err:
# If we're already in recovery mode, we don't want to handle the exception
# and activate recovery mode again, as that would lead to an infinite loop.
if recovery:
raise
_LOGGER.error(
"Storage file %s was created by a newer version of Home Assistant"
" (storage version %s > %s); activating recovery mode; on-disk data"
" is preserved; upgrade Home Assistant or restore from a backup",
err.storage_key,
err.found_version,
err.max_supported_version,
)
return False
return True
async def async_from_config_dict(
@@ -475,7 +499,9 @@ async def async_from_config_dict(
# Prime custom component cache early so we know if registry entries are tied
# to a custom integration
await loader.async_get_custom_components(hass)
await async_load_base_functionality(hass)
if not await async_load_base_functionality(hass):
return None
# Set up core.
_LOGGER.debug("Setting up %s", CORE_INTEGRATIONS)

View File

@@ -0,0 +1,5 @@
{
"domain": "ubisys",
"name": "Ubisys",
"iot_standards": ["zigbee"]
}

View File

@@ -44,7 +44,7 @@ def make_entity_state_trigger_required_features(
class CustomTrigger(EntityStateTriggerRequiredFeatures):
"""Trigger for entity state changes."""
_domain = domain
_domains = {domain}
_to_states = {to_state}
_required_features = required_features

View File

@@ -29,12 +29,17 @@ class StoredBackupData(TypedDict):
class _BackupStore(Store[StoredBackupData]):
"""Class to help storing backup data."""
# Maximum version we support reading for forward compatibility.
# This allows reading data written by a newer HA version after downgrade.
_MAX_READABLE_VERSION = 2
def __init__(self, hass: HomeAssistant) -> None:
"""Initialize storage class."""
super().__init__(
hass,
STORAGE_VERSION,
STORAGE_KEY,
max_readable_version=self._MAX_READABLE_VERSION,
minor_version=STORAGE_VERSION_MINOR,
)
@@ -86,8 +91,8 @@ class _BackupStore(Store[StoredBackupData]):
# data["config"]["schedule"]["state"] will be removed. The bump to 2 is
# planned to happen after a 6 month quiet period with no minor version
# changes.
# Reject if major version is higher than 2.
if old_major_version > 2:
# Reject if major version is higher than _MAX_READABLE_VERSION.
if old_major_version > self._MAX_READABLE_VERSION:
raise NotImplementedError
return data

View File

@@ -24,7 +24,7 @@ class BinarySensorOnOffTrigger(EntityTargetStateTriggerBase):
"""Class for binary sensor on/off triggers."""
_device_class: BinarySensorDeviceClass | None
_domain: str = DOMAIN
_domains = {DOMAIN}
def entity_filter(self, entities: set[str]) -> set[str]:
"""Filter entities of this domain."""

View File

@@ -190,7 +190,7 @@ class BitcoinSensor(SensorEntity):
elif sensor_type == "miners_revenue_usd":
self._attr_native_value = f"{stats.miners_revenue_usd:.0f}"
elif sensor_type == "btc_mined":
self._attr_native_value = str(stats.btc_mined * 0.00000001)
self._attr_native_value = str(stats.btc_mined * 1e-8)
elif sensor_type == "trade_volume_usd":
self._attr_native_value = f"{stats.trade_volume_usd:.1f}"
elif sensor_type == "difficulty":
@@ -208,13 +208,13 @@ class BitcoinSensor(SensorEntity):
elif sensor_type == "blocks_size":
self._attr_native_value = f"{stats.blocks_size:.1f}"
elif sensor_type == "total_fees_btc":
self._attr_native_value = f"{stats.total_fees_btc * 0.00000001:.2f}"
self._attr_native_value = f"{stats.total_fees_btc * 1e-8:.2f}"
elif sensor_type == "total_btc_sent":
self._attr_native_value = f"{stats.total_btc_sent * 0.00000001:.2f}"
self._attr_native_value = f"{stats.total_btc_sent * 1e-8:.2f}"
elif sensor_type == "estimated_btc_sent":
self._attr_native_value = f"{stats.estimated_btc_sent * 0.00000001:.2f}"
self._attr_native_value = f"{stats.estimated_btc_sent * 1e-8:.2f}"
elif sensor_type == "total_btc":
self._attr_native_value = f"{stats.total_btc * 0.00000001:.2f}"
self._attr_native_value = f"{stats.total_btc * 1e-8:.2f}"
elif sensor_type == "total_blocks":
self._attr_native_value = f"{stats.total_blocks:.0f}"
elif sensor_type == "next_retarget":
@@ -222,7 +222,7 @@ class BitcoinSensor(SensorEntity):
elif sensor_type == "estimated_transaction_volume_usd":
self._attr_native_value = f"{stats.estimated_transaction_volume_usd:.2f}"
elif sensor_type == "miners_revenue_btc":
self._attr_native_value = f"{stats.miners_revenue_btc * 0.00000001:.1f}"
self._attr_native_value = f"{stats.miners_revenue_btc * 1e-8:.1f}"
elif sensor_type == "market_price_usd":
self._attr_native_value = f"{stats.market_price_usd:.2f}"

View File

@@ -8,7 +8,7 @@
"iot_class": "local_polling",
"loggers": ["bsblan"],
"quality_scale": "silver",
"requirements": ["python-bsblan==5.1.0"],
"requirements": ["python-bsblan==5.1.1"],
"zeroconf": [
{
"name": "bsb-lan*",

View File

@@ -14,7 +14,7 @@ from . import DOMAIN
class ButtonPressedTrigger(EntityTriggerBase):
"""Trigger for button entity presses."""
_domain = DOMAIN
_domains = {DOMAIN}
_schema = ENTITY_STATE_TRIGGER_SCHEMA
def is_valid_transition(self, from_state: State, to_state: State) -> bool:

View File

@@ -29,6 +29,12 @@
"early_update": {
"default": "mdi:update"
},
"equalizer": {
"default": "mdi:equalizer",
"state": {
"off": "mdi:equalizer-outline"
}
},
"pre_amp": {
"default": "mdi:volume-high",
"state": {

View File

@@ -65,6 +65,9 @@
"early_update": {
"name": "Early update"
},
"equalizer": {
"name": "Equalizer"
},
"pre_amp": {
"name": "Pre-Amp"
},

View File

@@ -33,6 +33,13 @@ def room_correction_enabled(client: StreamMagicClient) -> bool:
return client.audio.tilt_eq.enabled
def equalizer_enabled(client: StreamMagicClient) -> bool:
"""Check if equalizer is enabled."""
if TYPE_CHECKING:
assert client.audio.user_eq is not None
return client.audio.user_eq.enabled
CONTROL_ENTITIES: tuple[CambridgeAudioSwitchEntityDescription, ...] = (
CambridgeAudioSwitchEntityDescription(
key="pre_amp",
@@ -56,6 +63,14 @@ CONTROL_ENTITIES: tuple[CambridgeAudioSwitchEntityDescription, ...] = (
value_fn=room_correction_enabled,
set_value_fn=lambda client, value: client.set_room_correction_mode(value),
),
CambridgeAudioSwitchEntityDescription(
key="equalizer",
translation_key="equalizer",
entity_category=EntityCategory.CONFIG,
load_fn=lambda client: client.audio.user_eq is not None,
value_fn=equalizer_enabled,
set_value_fn=lambda client, value: client.set_equalizer_mode(value),
),
)

View File

@@ -43,7 +43,7 @@ HVAC_MODE_CHANGED_TRIGGER_SCHEMA = ENTITY_STATE_TRIGGER_SCHEMA_FIRST_LAST.extend
class HVACModeChangedTrigger(EntityTargetStateTriggerBase):
"""Trigger for entity state changes."""
_domain = DOMAIN
_domains = {DOMAIN}
_schema = HVAC_MODE_CHANGED_TRIGGER_SCHEMA
def __init__(self, hass: HomeAssistant, config: TriggerConfig) -> None:

View File

@@ -48,6 +48,8 @@ def async_setup(hass: HomeAssistant) -> None:
vol.Optional("conversation_id"): vol.Any(str, None),
vol.Optional("language"): str,
vol.Optional("agent_id"): agent_id_validator,
vol.Optional("device_id"): vol.Any(str, None),
vol.Optional("satellite_id"): vol.Any(str, None),
}
)
@websocket_api.async_response
@@ -64,6 +66,8 @@ async def websocket_process(
context=connection.context(msg),
language=msg.get("language"),
agent_id=msg.get("agent_id"),
device_id=msg.get("device_id"),
satellite_id=msg.get("satellite_id"),
)
connection.send_result(msg["id"], result.as_dict())
@@ -248,6 +252,8 @@ class ConversationProcessView(http.HomeAssistantView):
vol.Optional("conversation_id"): str,
vol.Optional("language"): str,
vol.Optional("agent_id"): agent_id_validator,
vol.Optional("device_id"): vol.Any(str, None),
vol.Optional("satellite_id"): vol.Any(str, None),
}
)
)
@@ -262,6 +268,8 @@ class ConversationProcessView(http.HomeAssistantView):
context=self.context(request),
language=data.get("language"),
agent_id=data.get("agent_id"),
device_id=data.get("device_id"),
satellite_id=data.get("satellite_id"),
)
return self.json(result.as_dict())

View File

@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/conversation",
"integration_type": "entity",
"quality_scale": "internal",
"requirements": ["hassil==3.5.0", "home-assistant-intents==2026.2.13"]
"requirements": ["hassil==3.5.0", "home-assistant-intents==2026.3.3"]
}

View File

@@ -112,11 +112,12 @@ def _zone_is_configured(zone: DaikinZone) -> bool:
def _zone_temperature_lists(device: Appliance) -> tuple[list[str], list[str]]:
"""Return the decoded zone temperature lists."""
try:
heating = device.represent(DAIKIN_ZONE_TEMP_HEAT)[1]
cooling = device.represent(DAIKIN_ZONE_TEMP_COOL)[1]
except (AttributeError, KeyError):
values = device.values
if DAIKIN_ZONE_TEMP_HEAT not in values or DAIKIN_ZONE_TEMP_COOL not in values:
return ([], [])
heating = device.represent(DAIKIN_ZONE_TEMP_HEAT)[1]
cooling = device.represent(DAIKIN_ZONE_TEMP_COOL)[1]
return (list(heating or []), list(cooling or []))

View File

@@ -7,5 +7,5 @@
"integration_type": "hub",
"iot_class": "local_push",
"loggers": ["dsmr_parser"],
"requirements": ["dsmr-parser==1.4.3"]
"requirements": ["dsmr-parser==1.5.0"]
}

View File

@@ -2,14 +2,39 @@
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from homeassistant.helpers import device_registry as dr
from .const import DOMAIN
from .coordinator import EafmConfigEntry, EafmCoordinator
PLATFORMS = [Platform.SENSOR]
def _fix_device_registry_identifiers(
hass: HomeAssistant, entry: EafmConfigEntry
) -> None:
"""Fix invalid identifiers in device registry.
Added in 2026.4, can be removed in 2026.10 or later.
"""
device_registry = dr.async_get(hass)
for device_entry in dr.async_entries_for_config_entry(
device_registry, entry.entry_id
):
old_identifier = (DOMAIN, "measure-id", entry.data["station"])
if old_identifier not in device_entry.identifiers: # type: ignore[comparison-overlap]
continue
new_identifiers = device_entry.identifiers.copy()
new_identifiers.discard(old_identifier) # type: ignore[arg-type]
new_identifiers.add((DOMAIN, entry.data["station"]))
device_registry.async_update_device(
device_entry.id, new_identifiers=new_identifiers
)
async def async_setup_entry(hass: HomeAssistant, entry: EafmConfigEntry) -> bool:
"""Set up flood monitoring sensors for this config entry."""
_fix_device_registry_identifiers(hass, entry)
coordinator = EafmCoordinator(hass, entry=entry)
await coordinator.async_config_entry_first_refresh()
entry.runtime_data = coordinator

View File

@@ -94,11 +94,11 @@ class Measurement(CoordinatorEntity, SensorEntity):
return self.coordinator.data["measures"][self.key]["parameterName"]
@property
def device_info(self):
def device_info(self) -> DeviceInfo:
"""Return the device info."""
return DeviceInfo(
entry_type=DeviceEntryType.SERVICE,
identifiers={(DOMAIN, "measure-id", self.station_id)},
identifiers={(DOMAIN, self.station_id)},
manufacturer="https://environment.data.gov.uk/",
model=self.parameter_name,
name=f"{self.station_name} {self.parameter_name} {self.qualifier}",

View File

@@ -189,6 +189,7 @@ async def platform_async_setup_entry(
info_type: type[_InfoT],
entity_type: type[_EntityT],
state_type: type[_StateT],
info_filter: Callable[[_InfoT], bool] | None = None,
) -> None:
"""Set up an esphome platform.
@@ -208,10 +209,22 @@ async def platform_async_setup_entry(
entity_type,
state_type,
)
if info_filter is not None:
def on_filtered_update(infos: list[EntityInfo]) -> None:
on_static_info_update(
[info for info in infos if info_filter(cast(_InfoT, info))]
)
info_callback = on_filtered_update
else:
info_callback = on_static_info_update
entry_data.cleanup_callbacks.append(
entry_data.async_register_static_info_callback(
info_type,
on_static_info_update,
info_callback,
)
)

View File

@@ -29,6 +29,7 @@ from aioesphomeapi import (
Event,
EventInfo,
FanInfo,
InfraredInfo,
LightInfo,
LockInfo,
MediaPlayerInfo,
@@ -85,6 +86,7 @@ INFO_TYPE_TO_PLATFORM: dict[type[EntityInfo], Platform] = {
DateTimeInfo: Platform.DATETIME,
EventInfo: Platform.EVENT,
FanInfo: Platform.FAN,
InfraredInfo: Platform.INFRARED,
LightInfo: Platform.LIGHT,
LockInfo: Platform.LOCK,
MediaPlayerInfo: Platform.MEDIA_PLAYER,

View File

@@ -0,0 +1,59 @@
"""Infrared platform for ESPHome."""
from __future__ import annotations
from functools import partial
import logging
from aioesphomeapi import EntityState, InfraredCapability, InfraredInfo
from homeassistant.components.infrared import InfraredCommand, InfraredEntity
from homeassistant.core import callback
from .entity import (
EsphomeEntity,
convert_api_error_ha_error,
platform_async_setup_entry,
)
_LOGGER = logging.getLogger(__name__)
PARALLEL_UPDATES = 0
class EsphomeInfraredEntity(EsphomeEntity[InfraredInfo, EntityState], InfraredEntity):
"""ESPHome infrared entity using native API."""
@callback
def _on_device_update(self) -> None:
"""Call when device updates or entry data changes."""
super()._on_device_update()
if self._entry_data.available:
# Infrared entities should go available as soon as the device comes online
self.async_write_ha_state()
@convert_api_error_ha_error
async def async_send_command(self, command: InfraredCommand) -> None:
"""Send an IR command."""
timings = [
interval
for timing in command.get_raw_timings()
for interval in (timing.high_us, -timing.low_us)
]
_LOGGER.debug("Sending command: %s", timings)
self._client.infrared_rf_transmit_raw_timings(
self._static_info.key,
carrier_frequency=command.modulation,
timings=timings,
device_id=self._static_info.device_id,
)
async_setup_entry = partial(
platform_async_setup_entry,
info_type=InfraredInfo,
entity_type=EsphomeInfraredEntity,
state_type=EntityState,
info_filter=lambda info: bool(info.capabilities & InfraredCapability.TRANSMITTER),
)

View File

@@ -241,7 +241,7 @@ class EsphomeLight(EsphomeEntity[LightInfo, LightState], LightEntity):
if (color_temp_k := kwargs.get(ATTR_COLOR_TEMP_KELVIN)) is not None:
# Do not use kelvin_to_mired here to prevent precision loss
data["color_temperature"] = 1000000.0 / color_temp_k
data["color_temperature"] = 1_000_000.0 / color_temp_k
if color_temp_modes := _filter_color_modes(
color_modes, LightColorCapability.COLOR_TEMPERATURE
):

View File

@@ -21,5 +21,5 @@
"integration_type": "system",
"preview_features": { "winter_mode": {} },
"quality_scale": "internal",
"requirements": ["home-assistant-frontend==20260226.0"]
"requirements": ["home-assistant-frontend==20260302.0"]
}

View File

@@ -54,6 +54,10 @@
"connectable": false,
"local_name": "GVH5110*"
},
{
"connectable": false,
"local_name": "GV5140*"
},
{
"connectable": false,
"manufacturer_id": 1,

View File

@@ -21,6 +21,7 @@ from homeassistant.components.sensor import (
)
from homeassistant.const import (
CONCENTRATION_MICROGRAMS_PER_CUBIC_METER,
CONCENTRATION_PARTS_PER_MILLION,
PERCENTAGE,
SIGNAL_STRENGTH_DECIBELS_MILLIWATT,
UnitOfTemperature,
@@ -72,6 +73,12 @@ SENSOR_DESCRIPTIONS = {
native_unit_of_measurement=CONCENTRATION_MICROGRAMS_PER_CUBIC_METER,
state_class=SensorStateClass.MEASUREMENT,
),
(DeviceClass.CO2, Units.CONCENTRATION_PARTS_PER_MILLION): SensorEntityDescription(
key=f"{DeviceClass.CO2}_{Units.CONCENTRATION_PARTS_PER_MILLION}",
device_class=SensorDeviceClass.CO2,
native_unit_of_measurement=CONCENTRATION_PARTS_PER_MILLION,
state_class=SensorStateClass.MEASUREMENT,
),
}

View File

@@ -88,6 +88,17 @@ class HomematicipHeatingGroup(HomematicipGenericEntity, ClimateEntity):
if device.actualTemperature is None:
self._simple_heating = self._first_radiator_thermostat
@property
def available(self) -> bool:
"""Heating group available.
A heating group must be available, and should not be affected by the
individual availability of group members.
This allows controlling the temperature even when individual group
members are not available.
"""
return True
@property
def device_info(self) -> DeviceInfo:
"""Return device specific attributes."""

View File

@@ -312,6 +312,17 @@ class HomematicipCoverShutterGroup(HomematicipGenericEntity, CoverEntity):
device.modelType = f"HmIP-{post}"
super().__init__(hap, device, post, is_multi_channel=False)
@property
def available(self) -> bool:
"""Cover shutter group available.
A cover shutter group must be available, and should not be affected by
the individual availability of group members.
This allows controlling the shutters even when individual group
members are not available.
"""
return True
@property
def current_cover_position(self) -> int | None:
"""Return current position of cover."""

View File

@@ -43,6 +43,7 @@ from homeassistant.const import (
STATE_UNKNOWN,
)
from homeassistant.core import Event, HomeAssistant, State, callback
from homeassistant.data_entry_flow import FlowResultType
from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.helpers import config_validation as cv, state as state_helper
from homeassistant.helpers.entity_values import EntityValues
@@ -61,6 +62,7 @@ from .const import (
CLIENT_ERROR_V2,
CODE_INVALID_INPUTS,
COMPONENT_CONFIG_SCHEMA_CONNECTION,
COMPONENT_CONFIG_SCHEMA_CONNECTION_VALIDATORS,
CONF_API_VERSION,
CONF_BUCKET,
CONF_COMPONENT_CONFIG,
@@ -79,7 +81,6 @@ from .const import (
CONF_TAGS_ATTRIBUTES,
CONNECTION_ERROR,
DEFAULT_API_VERSION,
DEFAULT_HOST,
DEFAULT_HOST_V2,
DEFAULT_MEASUREMENT_ATTR,
DEFAULT_SSL_V2,
@@ -104,6 +105,7 @@ from .const import (
WRITE_ERROR,
WROTE_MESSAGE,
)
from .issue import async_create_deprecated_yaml_issue
_LOGGER = logging.getLogger(__name__)
@@ -137,7 +139,7 @@ def create_influx_url(conf: dict) -> dict:
def validate_version_specific_config(conf: dict) -> dict:
"""Ensure correct config fields are provided based on API version used."""
if conf[CONF_API_VERSION] == API_VERSION_2:
if conf.get(CONF_API_VERSION, DEFAULT_API_VERSION) == API_VERSION_2:
if CONF_TOKEN not in conf:
raise vol.Invalid(
f"{CONF_TOKEN} and {CONF_BUCKET} are required when"
@@ -193,32 +195,13 @@ _INFLUX_BASE_SCHEMA = INCLUDE_EXCLUDE_BASE_FILTER_SCHEMA.extend(
}
)
INFLUX_SCHEMA = vol.All(
_INFLUX_BASE_SCHEMA.extend(COMPONENT_CONFIG_SCHEMA_CONNECTION),
validate_version_specific_config,
create_influx_url,
INFLUX_SCHEMA = _INFLUX_BASE_SCHEMA.extend(
COMPONENT_CONFIG_SCHEMA_CONNECTION_VALIDATORS
)
CONFIG_SCHEMA = vol.Schema(
{
DOMAIN: vol.All(
cv.deprecated(CONF_API_VERSION),
cv.deprecated(CONF_HOST),
cv.deprecated(CONF_PATH),
cv.deprecated(CONF_PORT),
cv.deprecated(CONF_SSL),
cv.deprecated(CONF_VERIFY_SSL),
cv.deprecated(CONF_SSL_CA_CERT),
cv.deprecated(CONF_USERNAME),
cv.deprecated(CONF_PASSWORD),
cv.deprecated(CONF_DB_NAME),
cv.deprecated(CONF_TOKEN),
cv.deprecated(CONF_ORG),
cv.deprecated(CONF_BUCKET),
INFLUX_SCHEMA,
)
},
{DOMAIN: vol.All(INFLUX_SCHEMA, validate_version_specific_config)},
extra=vol.ALLOW_EXTRA,
)
@@ -499,23 +482,35 @@ def get_influx_connection( # noqa: C901
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the InfluxDB component."""
conf = config.get(DOMAIN)
if DOMAIN not in config:
return True
if conf is not None:
if CONF_HOST not in conf and conf[CONF_API_VERSION] == DEFAULT_API_VERSION:
conf[CONF_HOST] = DEFAULT_HOST
hass.async_create_task(
hass.config_entries.flow.async_init(
DOMAIN,
context={"source": SOURCE_IMPORT},
data=conf,
)
)
hass.async_create_task(_async_setup(hass, config[DOMAIN]))
return True
async def _async_setup(hass: HomeAssistant, config: dict[str, Any]) -> None:
"""Import YAML configuration into a config entry."""
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": SOURCE_IMPORT},
data=config,
)
if (
result.get("type") is FlowResultType.ABORT
and (reason := result["reason"]) != "single_instance_allowed"
):
async_create_deprecated_yaml_issue(hass, error=reason)
return
# If we are here, the entry already exists (single instance allowed)
if config.keys() & (
{k.schema for k in COMPONENT_CONFIG_SCHEMA_CONNECTION} - {CONF_PRECISION}
):
async_create_deprecated_yaml_issue(hass)
async def async_setup_entry(hass: HomeAssistant, entry: InfluxDBConfigEntry) -> bool:
"""Set up InfluxDB from a config entry."""
data = entry.data

View File

@@ -31,7 +31,7 @@ from homeassistant.helpers.selector import (
)
from homeassistant.helpers.storage import STORAGE_DIR
from . import DOMAIN, get_influx_connection
from . import DOMAIN, create_influx_url, get_influx_connection
from .const import (
API_VERSION_2,
CONF_API_VERSION,
@@ -40,8 +40,11 @@ from .const import (
CONF_ORG,
CONF_SSL_CA_CERT,
DEFAULT_API_VERSION,
DEFAULT_BUCKET,
DEFAULT_DATABASE,
DEFAULT_HOST,
DEFAULT_PORT,
DEFAULT_VERIFY_SSL,
)
_LOGGER = logging.getLogger(__name__)
@@ -240,14 +243,17 @@ class InfluxDBConfigFlow(ConfigFlow, domain=DOMAIN):
async def async_step_import(self, import_data: dict[str, Any]) -> ConfigFlowResult:
"""Handle the initial step."""
host = import_data.get(CONF_HOST)
database = import_data.get(CONF_DB_NAME)
bucket = import_data.get(CONF_BUCKET)
import_data = {**import_data}
import_data.setdefault(CONF_API_VERSION, DEFAULT_API_VERSION)
import_data.setdefault(CONF_VERIFY_SSL, DEFAULT_VERIFY_SSL)
import_data.setdefault(CONF_DB_NAME, DEFAULT_DATABASE)
import_data.setdefault(CONF_BUCKET, DEFAULT_BUCKET)
api_version = import_data.get(CONF_API_VERSION)
ssl = import_data.get(CONF_SSL)
api_version = import_data[CONF_API_VERSION]
if api_version == DEFAULT_API_VERSION:
host = import_data.get(CONF_HOST, DEFAULT_HOST)
database = import_data[CONF_DB_NAME]
title = f"{database} ({host})"
data = {
CONF_API_VERSION: api_version,
@@ -256,21 +262,23 @@ class InfluxDBConfigFlow(ConfigFlow, domain=DOMAIN):
CONF_USERNAME: import_data.get(CONF_USERNAME),
CONF_PASSWORD: import_data.get(CONF_PASSWORD),
CONF_DB_NAME: database,
CONF_SSL: ssl,
CONF_SSL: import_data.get(CONF_SSL),
CONF_PATH: import_data.get(CONF_PATH),
CONF_VERIFY_SSL: import_data.get(CONF_VERIFY_SSL),
CONF_VERIFY_SSL: import_data[CONF_VERIFY_SSL],
CONF_SSL_CA_CERT: import_data.get(CONF_SSL_CA_CERT),
}
else:
create_influx_url(import_data) # Only modifies dict for api_version == 2
bucket = import_data[CONF_BUCKET]
url = import_data.get(CONF_URL)
title = f"{bucket} ({url})"
data = {
CONF_API_VERSION: api_version,
CONF_URL: import_data.get(CONF_URL),
CONF_URL: url,
CONF_TOKEN: import_data.get(CONF_TOKEN),
CONF_ORG: import_data.get(CONF_ORG),
CONF_BUCKET: bucket,
CONF_VERIFY_SSL: import_data.get(CONF_VERIFY_SSL),
CONF_VERIFY_SSL: import_data[CONF_VERIFY_SSL],
CONF_SSL_CA_CERT: import_data.get(CONF_SSL_CA_CERT),
}

View File

@@ -154,3 +154,14 @@ COMPONENT_CONFIG_SCHEMA_CONNECTION = {
vol.Inclusive(CONF_ORG, "v2_authentication"): cv.string,
vol.Optional(CONF_BUCKET, default=DEFAULT_BUCKET): cv.string,
}
# Same keys without defaults, used in CONFIG_SCHEMA to validate
# without injecting default values (so we can detect explicit keys).
COMPONENT_CONFIG_SCHEMA_CONNECTION_VALIDATORS = {
(
vol.Optional(k.schema)
if isinstance(k, vol.Optional) and k.default is not vol.UNDEFINED
else k
): v
for k, v in COMPONENT_CONFIG_SCHEMA_CONNECTION.items()
}

View File

@@ -0,0 +1,34 @@
"""Issues for InfluxDB integration."""
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.issue_registry import IssueSeverity, async_create_issue
from .const import DOMAIN
@callback
def async_create_deprecated_yaml_issue(
hass: HomeAssistant, *, error: str | None = None
) -> None:
"""Create a repair issue for deprecated YAML connection configuration."""
if error is None:
issue_id = "deprecated_yaml"
severity = IssueSeverity.WARNING
else:
issue_id = f"deprecated_yaml_import_issue_{error}"
severity = IssueSeverity.ERROR
async_create_issue(
hass,
DOMAIN,
issue_id,
is_fixable=False,
issue_domain=DOMAIN,
breaks_in_ha_version="2026.9.0",
severity=severity,
translation_key=issue_id,
translation_placeholders={
"domain": DOMAIN,
"url": f"/config/integrations/dashboard/add?domain={DOMAIN}",
},
)

View File

@@ -7,7 +7,6 @@
"documentation": "https://www.home-assistant.io/integrations/influxdb",
"iot_class": "local_push",
"loggers": ["influxdb", "influxdb_client"],
"quality_scale": "legacy",
"requirements": ["influxdb==5.3.1", "influxdb-client==1.50.0"],
"single_config_entry": true
}

View File

@@ -54,5 +54,31 @@
"title": "Choose InfluxDB version"
}
}
},
"issues": {
"deprecated_yaml": {
"description": "Configuring InfluxDB connection settings using YAML is being removed. Your existing YAML connection configuration has been imported into the UI automatically.\n\nRemove the `{domain}` connection and authentication keys from your `configuration.yaml` file and restart Home Assistant to fix this issue. Other options like `include`, `exclude`, and `tags` remain in YAML for now. \n\nThe following keys should be removed:\n- `api_version`\n- `host`\n- `port`\n- `ssl`\n- `verify_ssl`\n- `ssl_ca_cert`\n- `username`\n- `password`\n- `database`\n- `token`\n- `organization`\n- `bucket`\n- `path`",
"title": "The InfluxDB YAML configuration is being removed"
},
"deprecated_yaml_import_issue_cannot_connect": {
"description": "Configuring InfluxDB connection settings using YAML is being removed but the import failed because Home Assistant could not connect to the InfluxDB server.\n\nPlease correct your YAML configuration and restart Home Assistant.\n\nAlternatively you can remove the `{domain}` connection and authentication keys from your `configuration.yaml` file and continue to [set up the integration]({url}) manually. \n\nThe following keys should be removed:\n- `api_version`\n- `host`\n- `port`\n- `ssl`\n- `verify_ssl`\n- `ssl_ca_cert`\n- `username`\n- `password`\n- `database`\n- `token`\n- `organization`\n- `bucket`\n- `path`",
"title": "Failed to import InfluxDB YAML configuration"
},
"deprecated_yaml_import_issue_invalid_auth": {
"description": "Configuring InfluxDB connection settings using YAML is being removed but the import failed because the provided credentials are invalid.\n\nPlease correct your YAML configuration and restart Home Assistant.\n\nAlternatively you can remove the `{domain}` connection and authentication keys from your `configuration.yaml` file and continue to [set up the integration]({url}) manually. \n\nThe following keys should be removed:\n- `api_version`\n- `host`\n- `port`\n- `ssl`\n- `verify_ssl`\n- `ssl_ca_cert`\n- `username`\n- `password`\n- `database`\n- `token`\n- `organization`\n- `bucket`\n- `path`",
"title": "[%key:component::influxdb::issues::deprecated_yaml_import_issue_cannot_connect::title%]"
},
"deprecated_yaml_import_issue_invalid_database": {
"description": "Configuring InfluxDB connection settings using YAML is being removed but the import failed because the specified database was not found.\n\nPlease correct your YAML configuration and restart Home Assistant.\n\nAlternatively you can remove the `{domain}` connection and authentication keys from your `configuration.yaml` file and continue to [set up the integration]({url}) manually. \n\nThe following keys should be removed:\n- `api_version`\n- `host`\n- `port`\n- `ssl`\n- `verify_ssl`\n- `ssl_ca_cert`\n- `username`\n- `password`\n- `database`\n- `token`\n- `organization`\n- `bucket`\n- `path`",
"title": "[%key:component::influxdb::issues::deprecated_yaml_import_issue_cannot_connect::title%]"
},
"deprecated_yaml_import_issue_ssl_error": {
"description": "Configuring InfluxDB connection settings using YAML is being removed but the import failed due to an SSL certificate error.\n\nPlease correct your YAML configuration and restart Home Assistant.\n\nAlternatively you can remove the `{domain}` connection and authentication keys from your `configuration.yaml` file and continue to [set up the integration]({url}) manually. \n\nThe following keys should be removed:\n- `api_version`\n- `host`\n- `port`\n- `ssl`\n- `verify_ssl`\n- `ssl_ca_cert`\n- `username`\n- `password`\n- `database`\n- `token`\n- `organization`\n- `bucket`\n- `path`",
"title": "[%key:component::influxdb::issues::deprecated_yaml_import_issue_cannot_connect::title%]"
},
"deprecated_yaml_import_issue_unknown": {
"description": "Configuring InfluxDB connection settings using YAML is being removed but the import failed due to an unknown error.\n\nPlease correct your YAML configuration and restart Home Assistant.\n\nAlternatively you can remove the `{domain}` connection and authentication keys from your `configuration.yaml` file and continue to [set up the integration]({url}) manually. \n\nThe following keys should be removed:\n- `api_version`\n- `host`\n- `port`\n- `ssl`\n- `verify_ssl`\n- `ssl_ca_cert`\n- `username`\n- `password`\n- `database`\n- `token`\n- `organization`\n- `bucket`\n- `path`",
"title": "[%key:component::influxdb::issues::deprecated_yaml_import_issue_cannot_connect::title%]"
}
}
}

View File

@@ -627,13 +627,17 @@ class IntentHandleView(http.HomeAssistantView):
{
vol.Required("name"): cv.string,
vol.Optional("data"): vol.Schema({cv.string: object}),
vol.Optional("language"): cv.string,
vol.Optional("assistant"): vol.Any(cv.string, None),
vol.Optional("device_id"): vol.Any(cv.string, None),
vol.Optional("satellite_id"): vol.Any(cv.string, None),
}
)
)
async def post(self, request: web.Request, data: dict[str, Any]) -> web.Response:
"""Handle intent with name/data."""
hass = request.app[http.KEY_HASS]
language = hass.config.language
language = data.get("language", hass.config.language)
try:
intent_name = data["name"]
@@ -641,14 +645,21 @@ class IntentHandleView(http.HomeAssistantView):
key: {"value": value} for key, value in data.get("data", {}).items()
}
intent_result = await intent.async_handle(
hass, DOMAIN, intent_name, slots, "", self.context(request)
hass,
DOMAIN,
intent_name,
slots,
"",
self.context(request),
language=language,
assistant=data.get("assistant"),
device_id=data.get("device_id"),
satellite_id=data.get("satellite_id"),
)
except (intent.IntentHandleError, intent.MatchFailedError) as err:
intent_result = intent.IntentResponse(language=language)
intent_result.async_set_speech(str(err))
if intent_result is None:
intent_result = intent.IntentResponse(language=language) # type: ignore[unreachable]
intent_result.async_set_speech("Sorry, I couldn't handle that")
intent_result.async_set_error(
intent.IntentResponseErrorCode.FAILED_TO_HANDLE, str(err)
)
return self.json(intent_result)
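The change above forwards the new optional request fields (`language`, `assistant`, `device_id`, `satellite_id`) through to `intent.async_handle` and falls back to the instance language when no `language` is supplied. A sketch of how a client payload maps onto those parameters; the payload values and the default language are illustrative, and the slot wrapping mirrors the `{key: {"value": value}}` comprehension in the handler:

```python
# Hypothetical client payload following the vol.Schema in the diff.
payload = {
    "name": "HassTurnOn",
    "data": {"name": "kitchen light"},  # free-form slot data
    "language": "en",                   # new: overrides hass.config.language
    "assistant": None,                  # new optional, nullable fields
    "device_id": None,
    "satellite_id": None,
}

default_language = "en-GB"  # stand-in for hass.config.language
language = payload.get("language", default_language)

# Same transformation as the handler: wrap each data item as a slot value.
slots = {key: {"value": value} for key, value in payload.get("data", {}).items()}
```

With `language` present it wins over the instance default; omit it and `hass.config.language` is used, matching the old behavior.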

View File

@@ -13,7 +13,7 @@
"requirements": [
"xknx==3.15.0",
"xknxproject==3.8.2",
"knx-frontend==2026.2.25.165736"
"knx-frontend==2026.3.2.183756"
],
"single_config_entry": true
}

View File

@@ -23,7 +23,7 @@ def _convert_uint8_to_percentage(value: Any) -> float:
class BrightnessChangedTrigger(EntityNumericalStateAttributeChangedTriggerBase):
"""Trigger for brightness changed."""
_domain = DOMAIN
_domains = {DOMAIN}
_attribute = ATTR_BRIGHTNESS
_converter = staticmethod(_convert_uint8_to_percentage)
@@ -34,7 +34,7 @@ class BrightnessCrossedThresholdTrigger(
):
"""Trigger for brightness crossed threshold."""
_domain = DOMAIN
_domains = {DOMAIN}
_attribute = ATTR_BRIGHTNESS
_converter = staticmethod(_convert_uint8_to_percentage)

View File

@@ -10,7 +10,7 @@
"integration_type": "hub",
"iot_class": "local_push",
"loggers": ["pylutron_caseta"],
"requirements": ["pylutron-caseta==0.26.0"],
"requirements": ["pylutron-caseta==0.27.0"],
"zeroconf": [
{
"properties": {

View File

@@ -14,7 +14,6 @@ from chip.clusters.Types import NullValue
from homeassistant.exceptions import HomeAssistantError, ServiceValidationError
from .const import (
CLEAR_ALL_INDEX,
CRED_TYPE_FACE,
CRED_TYPE_FINGER_VEIN,
CRED_TYPE_FINGERPRINT,
@@ -222,42 +221,6 @@ def _format_user_response(user_data: Any) -> LockUserData | None:
# --- Credential management helpers ---
async def _clear_user_credentials(
matter_client: MatterClient,
node_id: int,
endpoint_id: int,
user_index: int,
) -> None:
"""Clear all credentials for a specific user.
Fetches the user to get credential list, then clears each credential.
"""
get_user_response = await matter_client.send_device_command(
node_id=node_id,
endpoint_id=endpoint_id,
command=clusters.DoorLock.Commands.GetUser(userIndex=user_index),
)
creds = _get_attr(get_user_response, "credentials")
if not creds:
return
for cred in creds:
cred_type = _get_attr(cred, "credentialType")
cred_index = _get_attr(cred, "credentialIndex")
await matter_client.send_device_command(
node_id=node_id,
endpoint_id=endpoint_id,
command=clusters.DoorLock.Commands.ClearCredential(
credential=clusters.DoorLock.Structs.CredentialStruct(
credentialType=cred_type,
credentialIndex=cred_index,
),
),
timed_request_timeout_ms=LOCK_TIMED_REQUEST_TIMEOUT_MS,
)
class LockEndpointNotFoundError(HomeAssistantError):
"""Lock endpoint not found on node."""
@@ -557,33 +520,16 @@ async def clear_lock_user(
node: MatterNode,
user_index: int,
) -> None:
"""Clear a user from the lock, cleaning up credentials first.
"""Clear a user from the lock.
Per the Matter spec, ClearUser also clears all associated credentials
and schedules for the user.
Use index 0xFFFE (CLEAR_ALL_INDEX) to clear all users.
Raises HomeAssistantError on failure.
"""
lock_endpoint = _get_lock_endpoint_or_raise(node)
_ensure_usr_support(lock_endpoint)
if user_index == CLEAR_ALL_INDEX:
# Clear all: clear all credentials first, then all users
await matter_client.send_device_command(
node_id=node.node_id,
endpoint_id=lock_endpoint.endpoint_id,
command=clusters.DoorLock.Commands.ClearCredential(
credential=None,
),
timed_request_timeout_ms=LOCK_TIMED_REQUEST_TIMEOUT_MS,
)
else:
# Clear credentials for this specific user before deleting them
await _clear_user_credentials(
matter_client,
node.node_id,
lock_endpoint.endpoint_id,
user_index,
)
await matter_client.send_device_command(
node_id=node.node_id,
endpoint_id=lock_endpoint.endpoint_id,

View File

@@ -642,7 +642,7 @@
},
"services": {
"clear_lock_credential": {
"description": "Removes a credential from the lock.",
"description": "Removes a credential from a lock.",
"fields": {
"credential_index": {
"description": "The credential slot index to clear.",
@@ -666,7 +666,7 @@
"name": "Clear lock user"
},
"get_lock_credential_status": {
"description": "Returns the status of a credential slot on the lock.",
"description": "Returns the status of a credential slot on a lock.",
"fields": {
"credential_index": {
"description": "The credential slot index to query.",
@@ -684,7 +684,7 @@
"name": "Get lock info"
},
"get_lock_users": {
"description": "Returns all users configured on the lock with their credentials.",
"description": "Returns all users configured on a lock with their credentials.",
"name": "Get lock users"
},
"open_commissioning_window": {
@@ -698,7 +698,7 @@
"name": "Open commissioning window"
},
"set_lock_credential": {
"description": "Adds or updates a credential on the lock.",
"description": "Adds or updates a credential on a lock.",
"fields": {
"credential_data": {
"description": "The credential data. For PIN: digits only. For RFID: hexadecimal string.",

View File

@@ -1,32 +1,29 @@
"""The met_eireann component."""
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from .const import DOMAIN
from .coordinator import MetEireannUpdateCoordinator
from .coordinator import MetEireannConfigEntry, MetEireannUpdateCoordinator
PLATFORMS = [Platform.WEATHER]
async def async_setup_entry(hass: HomeAssistant, config_entry: ConfigEntry) -> bool:
async def async_setup_entry(
hass: HomeAssistant, config_entry: MetEireannConfigEntry
) -> bool:
"""Set up Met Éireann as config entry."""
coordinator = MetEireannUpdateCoordinator(hass, config_entry=config_entry)
await coordinator.async_refresh()
hass.data.setdefault(DOMAIN, {})[config_entry.entry_id] = coordinator
config_entry.runtime_data = coordinator
await hass.config_entries.async_forward_entry_setups(config_entry, PLATFORMS)
return True
async def async_unload_entry(hass: HomeAssistant, config_entry: ConfigEntry) -> bool:
async def async_unload_entry(
hass: HomeAssistant, config_entry: MetEireannConfigEntry
) -> bool:
"""Unload a config entry."""
unload_ok = await hass.config_entries.async_unload_platforms(
config_entry, PLATFORMS
)
hass.data[DOMAIN].pop(config_entry.entry_id)
return unload_ok
return await hass.config_entries.async_unload_platforms(config_entry, PLATFORMS)
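The migration pattern is the same across these runtime_data commits: drop the `hass.data[DOMAIN]` dict keyed by `entry_id` and store the coordinator on the typed entry itself, which also removes the manual cleanup in `async_unload_entry`. A self-contained sketch with stand-in classes (the real `ConfigEntry` lives in `homeassistant.config_entries`, and the real alias uses PEP 695 `type` syntax):

```python
from dataclasses import dataclass, field
from typing import Generic, TypeVar

_T = TypeVar("_T")

@dataclass
class ConfigEntry(Generic[_T]):
    """Stand-in for HA's ConfigEntry; runtime_data is typed per alias."""
    entry_id: str
    runtime_data: _T = field(default=None)  # type: ignore[assignment]

class UpdateCoordinator:
    """Stand-in for MetEireannUpdateCoordinator."""
    def __init__(self) -> None:
        self.data = {"condition": "rainy"}

# Equivalent of `type MetEireannConfigEntry = ConfigEntry[MetEireannUpdateCoordinator]`
entry: "ConfigEntry[UpdateCoordinator]" = ConfigEntry("abc123")
entry.runtime_data = UpdateCoordinator()  # replaces hass.data[DOMAIN][entry.entry_id]
```

Because the entry owns its runtime data, unloading the entry discards the coordinator automatically; there is no dict entry left to `pop`.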

View File

@@ -22,6 +22,8 @@ _LOGGER = logging.getLogger(__name__)
UPDATE_INTERVAL = timedelta(minutes=60)
type MetEireannConfigEntry = ConfigEntry[MetEireannUpdateCoordinator]
class MetEireannWeatherData:
"""Keep data for Met Éireann weather entities."""

View File

@@ -11,7 +11,6 @@ from homeassistant.components.weather import (
SingleCoordinatorWeatherEntity,
WeatherEntityFeature,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
CONF_LATITUDE,
CONF_LONGITUDE,
@@ -25,11 +24,10 @@ from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import entity_registry as er
from homeassistant.helpers.device_registry import DeviceEntryType, DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from homeassistant.util import dt as dt_util
from .const import CONDITION_MAP, DEFAULT_NAME, DOMAIN, FORECAST_MAP
from .coordinator import MetEireannWeatherData
from .coordinator import MetEireannConfigEntry, MetEireannUpdateCoordinator
def format_condition(condition: str | None) -> str | None:
@@ -43,11 +41,11 @@ def format_condition(condition: str | None) -> str | None:
async def async_setup_entry(
hass: HomeAssistant,
config_entry: ConfigEntry,
config_entry: MetEireannConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Add a weather entity from a config_entry."""
coordinator = hass.data[DOMAIN][config_entry.entry_id]
coordinator = config_entry.runtime_data
entity_registry = er.async_get(hass)
# Remove hourly entity from legacy config entries
@@ -70,9 +68,7 @@ def _calculate_unique_id(config: Mapping[str, Any], hourly: bool) -> str:
return f"{config[CONF_LATITUDE]}-{config[CONF_LONGITUDE]}{name_appendix}"
class MetEireannWeather(
SingleCoordinatorWeatherEntity[DataUpdateCoordinator[MetEireannWeatherData]]
):
class MetEireannWeather(SingleCoordinatorWeatherEntity[MetEireannUpdateCoordinator]):
"""Implementation of a Met Éireann weather condition."""
_attr_attribution = "Data provided by Met Éireann"
@@ -86,7 +82,7 @@ class MetEireannWeather(
def __init__(
self,
coordinator: DataUpdateCoordinator[MetEireannWeatherData],
coordinator: MetEireannUpdateCoordinator,
config: Mapping[str, Any],
) -> None:
"""Initialise the platform with a data instance and site."""

View File

@@ -1,9 +1,8 @@
"""Support for Meteoclimatic weather data."""
import logging
from typing import Any
from meteoclimatic import MeteoclimaticClient
from meteoclimatic import MeteoclimaticClient, Observation
from meteoclimatic.exceptions import MeteoclimaticError
from homeassistant.config_entries import ConfigEntry
@@ -17,7 +16,7 @@ _LOGGER = logging.getLogger(__name__)
type MeteoclimaticConfigEntry = ConfigEntry[MeteoclimaticUpdateCoordinator]
class MeteoclimaticUpdateCoordinator(DataUpdateCoordinator[dict[str, Any]]):
class MeteoclimaticUpdateCoordinator(DataUpdateCoordinator[Observation]):
"""Coordinator for Meteoclimatic weather data."""
config_entry: MeteoclimaticConfigEntry
@@ -34,12 +33,11 @@ class MeteoclimaticUpdateCoordinator(DataUpdateCoordinator[dict[str, Any]]):
)
self._meteoclimatic_client = MeteoclimaticClient()
async def _async_update_data(self) -> dict[str, Any]:
async def _async_update_data(self) -> Observation:
"""Obtain the latest data from Meteoclimatic."""
try:
data = await self.hass.async_add_executor_job(
return await self.hass.async_add_executor_job(
self._meteoclimatic_client.weather_at_station, self._station_code
)
except MeteoclimaticError as err:
raise UpdateFailed(f"Error while retrieving data: {err}") from err
return data.__dict__
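The change above returns the typed `Observation` directly instead of a `__dict__` copy, while still keeping the blocking client call off the event loop. The executor pattern in isolation; `blocking_fetch`, its return shape, and the station code are placeholders for the real `MeteoclimaticClient.weather_at_station` call:

```python
import asyncio

def blocking_fetch(station_code: str) -> dict:
    """Placeholder for the synchronous weather_at_station client call."""
    return {"station": station_code, "temp_current": 18.2}

async def async_update(station_code: str) -> dict:
    # Run the blocking call in the default thread-pool executor and return
    # its result directly, as the coordinator now does via
    # hass.async_add_executor_job.
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(None, blocking_fetch, station_code)

result = asyncio.run(async_update("STATION01"))
```

Returning the result directly also lets `DataUpdateCoordinator[Observation]` type `coordinator.data` precisely, which is what enables the attribute access (`data.station`, `data.weather`) in the sensor and weather platforms below.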

View File

@@ -1,5 +1,7 @@
"""Support for Meteoclimatic sensor."""
from typing import TYPE_CHECKING
from homeassistant.components.sensor import (
SensorDeviceClass,
SensorEntity,
@@ -139,26 +141,24 @@ class MeteoclimaticSensor(
"""Initialize the Meteoclimatic sensor."""
super().__init__(coordinator)
self.entity_description = description
station = self.coordinator.data["station"]
station = coordinator.data.station
self._attr_name = f"{station.name} {description.name}"
self._attr_unique_id = f"{station.code}_{description.key}"
@property
def device_info(self):
"""Return the device info."""
return DeviceInfo(
if TYPE_CHECKING:
assert coordinator.config_entry.unique_id is not None
self._attr_device_info = DeviceInfo(
entry_type=DeviceEntryType.SERVICE,
identifiers={(DOMAIN, self.platform.config_entry.unique_id)},
identifiers={(DOMAIN, coordinator.config_entry.unique_id)},
manufacturer=MANUFACTURER,
model=MODEL,
name=self.coordinator.name,
name=coordinator.name,
)
@property
def native_value(self):
def native_value(self) -> float | None:
"""Return the state of the sensor."""
return (
getattr(self.coordinator.data["weather"], self.entity_description.key)
getattr(self.coordinator.data.weather, self.entity_description.key)
if self.coordinator.data
else None
)

View File

@@ -48,49 +48,44 @@ class MeteoclimaticWeather(
def __init__(self, coordinator: MeteoclimaticUpdateCoordinator) -> None:
"""Initialise the weather platform."""
super().__init__(coordinator)
self._attr_unique_id = self.coordinator.data["station"].code
self._attr_name = self.coordinator.data["station"].name
@property
def device_info(self) -> DeviceInfo:
"""Return the device info."""
unique_id = self.coordinator.config_entry.unique_id
self._attr_unique_id = coordinator.data.station.code
self._attr_name = coordinator.data.station.name
if TYPE_CHECKING:
assert unique_id is not None
return DeviceInfo(
assert coordinator.config_entry.unique_id is not None
self._attr_device_info = DeviceInfo(
entry_type=DeviceEntryType.SERVICE,
identifiers={(DOMAIN, unique_id)},
identifiers={(DOMAIN, coordinator.config_entry.unique_id)},
manufacturer=MANUFACTURER,
model=MODEL,
name=self.coordinator.name,
name=coordinator.name,
)
@property
def condition(self) -> str | None:
"""Return the current condition."""
return format_condition(self.coordinator.data["weather"].condition)
return format_condition(self.coordinator.data.weather.condition)
@property
def native_temperature(self) -> float | None:
"""Return the temperature."""
return self.coordinator.data["weather"].temp_current
return self.coordinator.data.weather.temp_current
@property
def humidity(self) -> float | None:
"""Return the humidity."""
return self.coordinator.data["weather"].humidity_current
return self.coordinator.data.weather.humidity_current
@property
def native_pressure(self) -> float | None:
"""Return the pressure."""
return self.coordinator.data["weather"].pressure_current
return self.coordinator.data.weather.pressure_current
@property
def native_wind_speed(self) -> float | None:
"""Return the wind speed."""
return self.coordinator.data["weather"].wind_current
return self.coordinator.data.weather.wind_current
@property
def wind_bearing(self) -> float | None:
"""Return the wind bearing."""
return self.coordinator.data["weather"].wind_bearing
return self.coordinator.data.weather.wind_bearing

View File

@@ -18,20 +18,17 @@ from homeassistant.core import HomeAssistant
from homeassistant.helpers import device_registry as dr
from homeassistant.helpers.device_registry import DeviceInfo
from .const import (
DOMAIN,
METOFFICE_COORDINATES,
METOFFICE_DAILY_COORDINATOR,
METOFFICE_HOURLY_COORDINATOR,
METOFFICE_NAME,
METOFFICE_TWICE_DAILY_COORDINATOR,
from .const import DOMAIN
from .coordinator import (
MetOfficeConfigEntry,
MetOfficeRuntimeData,
MetOfficeUpdateCoordinator,
)
from .coordinator import MetOfficeUpdateCoordinator
PLATFORMS = [Platform.SENSOR, Platform.WEATHER]
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_setup_entry(hass: HomeAssistant, entry: MetOfficeConfigEntry) -> bool:
"""Set up a Met Office entry."""
latitude: float = entry.data[CONF_LATITUDE]
@@ -39,8 +36,6 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
api_key: str = entry.data[CONF_API_KEY]
site_name: str = entry.data[CONF_NAME]
coordinates = f"{latitude}_{longitude}"
connection = Manager(api_key=api_key)
metoffice_hourly_coordinator = MetOfficeUpdateCoordinator(
@@ -73,21 +68,20 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
frequency="twice-daily",
)
metoffice_hass_data = hass.data.setdefault(DOMAIN, {})
metoffice_hass_data[entry.entry_id] = {
METOFFICE_HOURLY_COORDINATOR: metoffice_hourly_coordinator,
METOFFICE_DAILY_COORDINATOR: metoffice_daily_coordinator,
METOFFICE_TWICE_DAILY_COORDINATOR: metoffice_twice_daily_coordinator,
METOFFICE_NAME: site_name,
METOFFICE_COORDINATES: coordinates,
}
# Fetch initial data so we have data when entities subscribe
await asyncio.gather(
metoffice_hourly_coordinator.async_config_entry_first_refresh(),
metoffice_daily_coordinator.async_config_entry_first_refresh(),
)
entry.runtime_data = MetOfficeRuntimeData(
coordinates=f"{latitude}_{longitude}",
hourly_coordinator=metoffice_hourly_coordinator,
daily_coordinator=metoffice_daily_coordinator,
twice_daily_coordinator=metoffice_twice_daily_coordinator,
name=site_name,
)
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True
@@ -95,12 +89,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Unload a config entry."""
unload_ok = await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
if unload_ok:
hass.data[DOMAIN].pop(entry.entry_id)
if not hass.data[DOMAIN]:
hass.data.pop(DOMAIN)
return unload_ok
return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
def get_device_info(coordinates: str, name: str) -> DeviceInfo:

View File

@@ -38,13 +38,6 @@ ATTRIBUTION = "Data provided by the Met Office"
DEFAULT_SCAN_INTERVAL = timedelta(minutes=15)
METOFFICE_COORDINATES = "metoffice_coordinates"
METOFFICE_HOURLY_COORDINATOR = "metoffice_hourly_coordinator"
METOFFICE_DAILY_COORDINATOR = "metoffice_daily_coordinator"
METOFFICE_TWICE_DAILY_COORDINATOR = "metoffice_twice_daily_coordinator"
METOFFICE_MONITORED_CONDITIONS = "metoffice_monitored_conditions"
METOFFICE_NAME = "metoffice_name"
CONDITION_CLASSES: dict[str, list[int]] = {
ATTR_CONDITION_CLEAR_NIGHT: [0],
ATTR_CONDITION_CLOUDY: [7, 8],

View File

@@ -2,6 +2,7 @@
from __future__ import annotations
from dataclasses import dataclass
import logging
from typing import Literal
@@ -22,6 +23,19 @@ from .const import DEFAULT_SCAN_INTERVAL
_LOGGER = logging.getLogger(__name__)
type MetOfficeConfigEntry = ConfigEntry[MetOfficeRuntimeData]
@dataclass
class MetOfficeRuntimeData:
"""Met Office config entry."""
coordinates: str
hourly_coordinator: MetOfficeUpdateCoordinator
daily_coordinator: MetOfficeUpdateCoordinator
twice_daily_coordinator: MetOfficeUpdateCoordinator
name: str
class MetOfficeUpdateCoordinator(TimestampDataUpdateCoordinator[Forecast]):
"""Coordinator for Met Office forecast data."""

View File

@@ -13,7 +13,6 @@ from homeassistant.components.sensor import (
SensorEntityDescription,
SensorStateClass,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
DEGREE,
PERCENTAGE,
@@ -30,15 +29,12 @@ from homeassistant.helpers.typing import StateType
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from . import get_device_info
from .const import (
ATTRIBUTION,
CONDITION_MAP,
DOMAIN,
METOFFICE_COORDINATES,
METOFFICE_HOURLY_COORDINATOR,
METOFFICE_NAME,
from .const import ATTRIBUTION, CONDITION_MAP, DOMAIN
from .coordinator import (
MetOfficeConfigEntry,
MetOfficeRuntimeData,
MetOfficeUpdateCoordinator,
)
from .coordinator import MetOfficeUpdateCoordinator
from .helpers import get_attribute
ATTR_LAST_UPDATE = "last_update"
@@ -172,19 +168,19 @@ SENSOR_TYPES: tuple[MetOfficeSensorEntityDescription, ...] = (
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: MetOfficeConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the Met Office weather sensor platform."""
entity_registry = er.async_get(hass)
hass_data = hass.data[DOMAIN][entry.entry_id]
hass_data = entry.runtime_data
# Remove daily entities from legacy config entries
for description in SENSOR_TYPES:
if entity_id := entity_registry.async_get_entity_id(
SENSOR_DOMAIN,
DOMAIN,
f"{description.key}_{hass_data[METOFFICE_COORDINATES]}_daily",
f"{description.key}_{hass_data.coordinates}_daily",
):
entity_registry.async_remove(entity_id)
@@ -192,20 +188,20 @@ async def async_setup_entry(
if entity_id := entity_registry.async_get_entity_id(
SENSOR_DOMAIN,
DOMAIN,
f"visibility_distance_{hass_data[METOFFICE_COORDINATES]}_daily",
f"visibility_distance_{hass_data.coordinates}_daily",
):
entity_registry.async_remove(entity_id)
if entity_id := entity_registry.async_get_entity_id(
SENSOR_DOMAIN,
DOMAIN,
f"visibility_distance_{hass_data[METOFFICE_COORDINATES]}",
f"visibility_distance_{hass_data.coordinates}",
):
entity_registry.async_remove(entity_id)
async_add_entities(
[
MetOfficeCurrentSensor(
hass_data[METOFFICE_HOURLY_COORDINATOR],
hass_data.hourly_coordinator,
hass_data,
description,
)
@@ -228,7 +224,7 @@ class MetOfficeCurrentSensor(
def __init__(
self,
coordinator: MetOfficeUpdateCoordinator,
hass_data: dict[str, Any],
hass_data: MetOfficeRuntimeData,
description: MetOfficeSensorEntityDescription,
) -> None:
"""Initialize the sensor."""
@@ -237,9 +233,9 @@ class MetOfficeCurrentSensor(
self.entity_description = description
self._attr_device_info = get_device_info(
coordinates=hass_data[METOFFICE_COORDINATES], name=hass_data[METOFFICE_NAME]
coordinates=hass_data.coordinates, name=hass_data.name
)
self._attr_unique_id = f"{description.key}_{hass_data[METOFFICE_COORDINATES]}"
self._attr_unique_id = f"{description.key}_{hass_data.coordinates}"
@property
def native_value(self) -> StateType:

View File

@@ -23,7 +23,6 @@ from homeassistant.components.weather import (
Forecast as WeatherForecast,
WeatherEntityFeature,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
UnitOfLength,
UnitOfPressure,
@@ -42,40 +41,39 @@ from .const import (
DAY_FORECAST_ATTRIBUTE_MAP,
DOMAIN,
HOURLY_FORECAST_ATTRIBUTE_MAP,
METOFFICE_COORDINATES,
METOFFICE_DAILY_COORDINATOR,
METOFFICE_HOURLY_COORDINATOR,
METOFFICE_NAME,
METOFFICE_TWICE_DAILY_COORDINATOR,
NIGHT_FORECAST_ATTRIBUTE_MAP,
)
from .coordinator import MetOfficeUpdateCoordinator
from .coordinator import (
MetOfficeConfigEntry,
MetOfficeRuntimeData,
MetOfficeUpdateCoordinator,
)
from .helpers import get_attribute
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: MetOfficeConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the Met Office weather sensor platform."""
entity_registry = er.async_get(hass)
hass_data = hass.data[DOMAIN][entry.entry_id]
hass_data = entry.runtime_data
# Remove daily entity from legacy config entries
if entity_id := entity_registry.async_get_entity_id(
WEATHER_DOMAIN,
DOMAIN,
f"{hass_data[METOFFICE_COORDINATES]}_daily",
f"{hass_data.coordinates}_daily",
):
entity_registry.async_remove(entity_id)
async_add_entities(
[
MetOfficeWeather(
hass_data[METOFFICE_DAILY_COORDINATOR],
hass_data[METOFFICE_HOURLY_COORDINATOR],
hass_data[METOFFICE_TWICE_DAILY_COORDINATOR],
hass_data.daily_coordinator,
hass_data.hourly_coordinator,
hass_data.twice_daily_coordinator,
hass_data,
)
],
@@ -178,7 +176,7 @@ class MetOfficeWeather(
coordinator_daily: MetOfficeUpdateCoordinator,
coordinator_hourly: MetOfficeUpdateCoordinator,
coordinator_twice_daily: MetOfficeUpdateCoordinator,
hass_data: dict[str, Any],
hass_data: MetOfficeRuntimeData,
) -> None:
"""Initialise the platform with a data instance."""
observation_coordinator = coordinator_hourly
@@ -190,9 +188,9 @@ class MetOfficeWeather(
)
self._attr_device_info = get_device_info(
coordinates=hass_data[METOFFICE_COORDINATES], name=hass_data[METOFFICE_NAME]
coordinates=hass_data.coordinates, name=hass_data.name
)
self._attr_unique_id = hass_data[METOFFICE_COORDINATES]
self._attr_unique_id = hass_data.coordinates
@property
def condition(self) -> str | None:

View File

@@ -14,27 +14,26 @@ from homeassistant.config_entries import ConfigEntry
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from .const import DOMAIN
PLATFORMS: list[Platform] = [Platform.SENSOR]
_LOGGER = logging.getLogger(__name__)
type MoatConfigEntry = ConfigEntry[PassiveBluetoothProcessorCoordinator]
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_setup_entry(hass: HomeAssistant, entry: MoatConfigEntry) -> bool:
"""Set up Moat BLE device from a config entry."""
address = entry.unique_id
assert address is not None
data = MoatBluetoothDeviceData()
coordinator = hass.data.setdefault(DOMAIN, {})[entry.entry_id] = (
PassiveBluetoothProcessorCoordinator(
hass,
_LOGGER,
address=address,
mode=BluetoothScanningMode.PASSIVE,
update_method=data.update,
)
coordinator = PassiveBluetoothProcessorCoordinator(
hass,
_LOGGER,
address=address,
mode=BluetoothScanningMode.PASSIVE,
update_method=data.update,
)
entry.runtime_data = coordinator
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
entry.async_on_unload(
coordinator.async_start()
@@ -42,9 +41,6 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
return True
async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_unload_entry(hass: HomeAssistant, entry: MoatConfigEntry) -> bool:
"""Unload a config entry."""
if unload_ok := await hass.config_entries.async_unload_platforms(entry, PLATFORMS):
hass.data[DOMAIN].pop(entry.entry_id)
return unload_ok
return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)

View File

@@ -4,12 +4,10 @@ from __future__ import annotations
from moat_ble import DeviceClass, DeviceKey, SensorUpdate, Units
from homeassistant import config_entries
from homeassistant.components.bluetooth.passive_update_processor import (
PassiveBluetoothDataProcessor,
PassiveBluetoothDataUpdate,
PassiveBluetoothEntityKey,
PassiveBluetoothProcessorCoordinator,
PassiveBluetoothProcessorEntity,
)
from homeassistant.components.sensor import (
@@ -28,7 +26,7 @@ from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.sensor import sensor_device_info_to_hass_device_info
from .const import DOMAIN
from . import MoatConfigEntry
SENSOR_DESCRIPTIONS = {
(DeviceClass.TEMPERATURE, Units.TEMP_CELSIUS): SensorEntityDescription(
@@ -104,13 +102,11 @@ def sensor_update_to_bluetooth_data_update(
async def async_setup_entry(
hass: HomeAssistant,
entry: config_entries.ConfigEntry,
entry: MoatConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the Moat BLE sensors."""
coordinator: PassiveBluetoothProcessorCoordinator = hass.data[DOMAIN][
entry.entry_id
]
coordinator = entry.runtime_data
processor = PassiveBluetoothDataProcessor(sensor_update_to_bluetooth_data_update)
entry.async_on_unload(
processor.async_add_entities_listener(

View File

@@ -13,6 +13,7 @@ from homeassistant.const import (
STATE_UNKNOWN,
)
from homeassistant.core import State, callback
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.dispatcher import async_dispatcher_connect
from homeassistant.helpers.restore_state import RestoreEntity
@@ -95,7 +96,7 @@ class MobileAppEntity(RestoreEntity):
config[ATTR_SENSOR_ICON] = last_state.attributes[ATTR_ICON]
@property
def device_info(self):
def device_info(self) -> DeviceInfo:
"""Return device registry information for this entity."""
return device_info(self._registration)

View File

@@ -193,7 +193,7 @@ def webhook_response(
)
def device_info(registration: dict) -> DeviceInfo:
def device_info(registration: Mapping[str, Any]) -> DeviceInfo:
"""Return the device info for this registration."""
return DeviceInfo(
identifiers={(DOMAIN, registration[ATTR_DEVICE_ID])},

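Widening the parameter from `dict` to `Mapping[str, Any]` lets read-only views (such as a config entry's `MappingProxyType` data) be passed without copying. A hedged illustration — the returned dict shape and the `device_id` key are simplified stand-ins, not the real `DeviceInfo`:

```python
from collections.abc import Mapping
from types import MappingProxyType
from typing import Any


def device_info(registration: Mapping[str, Any]) -> dict[str, Any]:
    # The function only reads, so Mapping is the honest parameter type;
    # both plain dicts and read-only proxies satisfy it.
    return {"identifiers": {("mobile_app", registration["device_id"])}}


# A read-only proxy is accepted without conversion:
info = device_info(MappingProxyType({"device_id": "abc123"}))
```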
View File

@@ -1,8 +1,11 @@
"""The Monoprice 6-Zone Amplifier integration."""
from __future__ import annotations
from dataclasses import dataclass
import logging
from pymonoprice import get_monoprice
from pymonoprice import Monoprice, get_monoprice
from serial import SerialException
from homeassistant.config_entries import ConfigEntry
@@ -10,14 +13,24 @@ from homeassistant.const import CONF_PORT, Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryNotReady
from .const import CONF_NOT_FIRST_RUN, DOMAIN, FIRST_RUN, MONOPRICE_OBJECT
from .const import CONF_NOT_FIRST_RUN
PLATFORMS = [Platform.MEDIA_PLAYER]
_LOGGER = logging.getLogger(__name__)
type MonopriceConfigEntry = ConfigEntry[MonopriceRuntimeData]
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
@dataclass
class MonopriceRuntimeData:
"""Data stored in the config entry for a Monoprice entry."""
client: Monoprice
first_run: bool
async def async_setup_entry(hass: HomeAssistant, entry: MonopriceConfigEntry) -> bool:
"""Set up Monoprice 6-Zone Amplifier from a config entry."""
port = entry.data[CONF_PORT]
@@ -37,17 +50,17 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
entry.async_on_unload(entry.add_update_listener(_update_listener))
hass.data.setdefault(DOMAIN, {})[entry.entry_id] = {
MONOPRICE_OBJECT: monoprice,
FIRST_RUN: first_run,
}
entry.runtime_data = MonopriceRuntimeData(
client=monoprice,
first_run=first_run,
)
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True
async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_unload_entry(hass: HomeAssistant, entry: MonopriceConfigEntry) -> bool:
"""Unload a config entry."""
unload_ok = await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
if not unload_ok:
@@ -61,10 +74,7 @@ async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""
del monoprice
monoprice = hass.data[DOMAIN][entry.entry_id][MONOPRICE_OBJECT]
hass.data[DOMAIN].pop(entry.entry_id)
await hass.async_add_executor_job(_cleanup, monoprice)
await hass.async_add_executor_job(_cleanup, entry.runtime_data.client)
return True
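The migration above replaces `hass.data[DOMAIN][entry.entry_id]` lookups with a typed `runtime_data` attribute on the entry itself, declared via `type MonopriceConfigEntry = ConfigEntry[MonopriceRuntimeData]`. A simplified, framework-free sketch of why the generic alias helps (this `ConfigEntry` is a stand-in for illustration only):

```python
from dataclasses import dataclass
from typing import Any, Generic, TypeVar

_T = TypeVar("_T")


@dataclass
class ConfigEntry(Generic[_T]):
    """Simplified stand-in for Home Assistant's ConfigEntry."""

    entry_id: str
    runtime_data: _T = None  # type: ignore[assignment]  # set during setup


@dataclass
class MonopriceRuntimeData:
    client: Any
    first_run: bool


# With the alias, a type checker knows runtime_data is MonopriceRuntimeData,
# so attribute access is verified instead of going through an untyped dict.
MonopriceConfigEntry = ConfigEntry[MonopriceRuntimeData]

entry: MonopriceConfigEntry = ConfigEntry(entry_id="abc")
entry.runtime_data = MonopriceRuntimeData(client=object(), first_run=True)
```

Platforms then read `entry.runtime_data.client` directly, and unload no longer needs to pop anything from `hass.data`.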

View File

@@ -15,6 +15,3 @@ CONF_NOT_FIRST_RUN = "not_first_run"
SERVICE_SNAPSHOT = "snapshot"
SERVICE_RESTORE = "restore"
FIRST_RUN = "first_run"
MONOPRICE_OBJECT = "monoprice_object"

View File

@@ -11,21 +11,14 @@ from homeassistant.components.media_player import (
MediaPlayerEntityFeature,
MediaPlayerState,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_PORT
from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_validation as cv, entity_platform, service
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import (
CONF_SOURCES,
DOMAIN,
FIRST_RUN,
MONOPRICE_OBJECT,
SERVICE_RESTORE,
SERVICE_SNAPSHOT,
)
from . import MonopriceConfigEntry
from .const import CONF_SOURCES, DOMAIN, SERVICE_RESTORE, SERVICE_SNAPSHOT
_LOGGER = logging.getLogger(__name__)
@@ -57,13 +50,13 @@ def _get_sources(config_entry):
async def async_setup_entry(
hass: HomeAssistant,
config_entry: ConfigEntry,
config_entry: MonopriceConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the Monoprice 6-zone amplifier platform."""
port = config_entry.data[CONF_PORT]
monoprice = hass.data[DOMAIN][config_entry.entry_id][MONOPRICE_OBJECT]
monoprice = config_entry.runtime_data.client
sources = _get_sources(config_entry)
@@ -77,8 +70,7 @@ async def async_setup_entry(
)
# only call update before add if it's the first run so we can try to detect zones
first_run = hass.data[DOMAIN][config_entry.entry_id][FIRST_RUN]
async_add_entities(entities, first_run)
async_add_entities(entities, config_entry.runtime_data.first_run)
platform = entity_platform.async_get_current_platform()

View File

@@ -4,7 +4,6 @@ from __future__ import annotations
import logging
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from homeassistant.helpers.aiohttp_client import async_get_clientsession
@@ -14,15 +13,14 @@ from homeassistant.helpers.config_entry_oauth2_flow import (
)
from .api import AuthenticatedMonzoAPI
from .const import DOMAIN
from .coordinator import MonzoCoordinator
from .coordinator import MonzoConfigEntry, MonzoCoordinator
_LOGGER = logging.getLogger(__name__)
PLATFORMS: list[Platform] = [Platform.SENSOR]
async def async_migrate_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_migrate_entry(hass: HomeAssistant, entry: MonzoConfigEntry) -> bool:
"""Migrate entry."""
_LOGGER.debug("Migrating from version %s.%s", entry.version, entry.minor_version)
@@ -39,7 +37,7 @@ async def async_migrate_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
return True
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_setup_entry(hass: HomeAssistant, entry: MonzoConfigEntry) -> bool:
"""Set up Monzo from a config entry."""
implementation = await async_get_config_entry_implementation(hass, entry)
@@ -51,15 +49,12 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
await coordinator.async_config_entry_first_refresh()
hass.data.setdefault(DOMAIN, {})[entry.entry_id] = coordinator
entry.runtime_data = coordinator
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
return True
async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_unload_entry(hass: HomeAssistant, entry: MonzoConfigEntry) -> bool:
"""Unload a config entry."""
unload_ok = await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
if unload_ok:
hass.data[DOMAIN].pop(entry.entry_id)
return unload_ok
return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)

View File

@@ -1,5 +1,7 @@
"""The Monzo integration."""
from __future__ import annotations
from dataclasses import dataclass
from datetime import timedelta
import logging
@@ -18,6 +20,8 @@ from .const import DOMAIN
_LOGGER = logging.getLogger(__name__)
type MonzoConfigEntry = ConfigEntry[MonzoCoordinator]
@dataclass
class MonzoData:
@@ -30,10 +34,13 @@ class MonzoData:
class MonzoCoordinator(DataUpdateCoordinator[MonzoData]):
"""Class to manage fetching Monzo data from the API."""
config_entry: ConfigEntry
config_entry: MonzoConfigEntry
def __init__(
self, hass: HomeAssistant, config_entry: ConfigEntry, api: AuthenticatedMonzoAPI
self,
hass: HomeAssistant,
config_entry: MonzoConfigEntry,
api: AuthenticatedMonzoAPI,
) -> None:
"""Initialize."""
super().__init__(

View File

@@ -11,14 +11,11 @@ from homeassistant.components.sensor import (
SensorEntity,
SensorEntityDescription,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import StateType
from . import MonzoCoordinator
from .const import DOMAIN
from .coordinator import MonzoData
from .coordinator import MonzoConfigEntry, MonzoCoordinator, MonzoData
from .entity import MonzoBaseEntity
@@ -64,11 +61,11 @@ MODEL_POT = "Pot"
async def async_setup_entry(
hass: HomeAssistant,
config_entry: ConfigEntry,
config_entry: MonzoConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Defer sensor setup to the shared sensor module."""
coordinator: MonzoCoordinator = hass.data[DOMAIN][config_entry.entry_id]
coordinator = config_entry.runtime_data
accounts = [
MonzoSensor(

View File

@@ -268,6 +268,26 @@ class MotionTiltDevice(MotionPositionDevice):
_restore_tilt = True
@property
def supported_features(self) -> CoverEntityFeature:
"""Flag supported features."""
supported_features = (
CoverEntityFeature.OPEN
| CoverEntityFeature.CLOSE
| CoverEntityFeature.STOP
| CoverEntityFeature.OPEN_TILT
| CoverEntityFeature.CLOSE_TILT
| CoverEntityFeature.STOP_TILT
)
if self.current_cover_position is not None:
supported_features |= CoverEntityFeature.SET_POSITION
if self.current_cover_tilt_position is not None:
supported_features |= CoverEntityFeature.SET_TILT_POSITION
return supported_features
@property
def current_cover_tilt_position(self) -> int | None:
"""Return current angle of cover.

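Supported features are composed as bit flags, which is what lets the property above add `SET_POSITION` and `SET_TILT_POSITION` conditionally at runtime. A sketch using `enum.IntFlag` — the flag values below mirror the general pattern and are assumptions, not necessarily the library's exact constants:

```python
from enum import IntFlag


class CoverEntityFeature(IntFlag):
    """Illustrative flag set; real values live in homeassistant.components.cover."""

    OPEN = 1
    CLOSE = 2
    SET_POSITION = 4
    STOP = 8
    OPEN_TILT = 16
    CLOSE_TILT = 32
    STOP_TILT = 64
    SET_TILT_POSITION = 128


# Features combine with bitwise OR and are tested with `in`:
features = CoverEntityFeature.OPEN | CoverEntityFeature.CLOSE
current_position = 50  # pretend the device reported a position
if current_position is not None:
    features |= CoverEntityFeature.SET_POSITION
```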
View File

@@ -43,6 +43,8 @@ PLATFORMS: list[Platform] = [
CONFIG_SCHEMA = cv.empty_config_schema(DOMAIN)
type MotionConfigEntry = ConfigEntry[MotionDevice]
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up Motionblinds Bluetooth integration."""
@@ -56,7 +58,7 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
return True
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_setup_entry(hass: HomeAssistant, entry: MotionConfigEntry) -> bool:
"""Set up Motionblinds Bluetooth device from a config entry."""
_LOGGER.debug("(%s) Setting up device", entry.data[CONF_MAC_CODE])
@@ -95,11 +97,11 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
)
)
hass.data.setdefault(DOMAIN, {})[entry.entry_id] = device
# Register OptionsFlow update listener
entry.async_on_unload(entry.add_update_listener(options_update_listener))
entry.runtime_data = device
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
# Apply options
@@ -112,7 +114,9 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
return True
async def options_update_listener(hass: HomeAssistant, entry: ConfigEntry) -> None:
async def options_update_listener(
hass: HomeAssistant, entry: MotionConfigEntry
) -> None:
"""Handle options update."""
_LOGGER.debug(
"(%s) Updated device options: %s", entry.data[CONF_MAC_CODE], entry.options
@@ -120,10 +124,10 @@ async def options_update_listener(hass: HomeAssistant, entry: ConfigEntry) -> No
await apply_options(hass, entry)
async def apply_options(hass: HomeAssistant, entry: ConfigEntry) -> None:
async def apply_options(hass: HomeAssistant, entry: MotionConfigEntry) -> None:
"""Apply the options from the OptionsFlow."""
device: MotionDevice = hass.data[DOMAIN][entry.entry_id]
device = entry.runtime_data
disconnect_time: float | None = entry.options.get(OPTION_DISCONNECT_TIME, None)
permanent_connection: bool = entry.options.get(OPTION_PERMANENT_CONNECTION, False)
@@ -131,10 +135,7 @@ async def apply_options(hass: HomeAssistant, entry: ConfigEntry) -> None:
await device.set_permanent_connection(permanent_connection)
async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_unload_entry(hass: HomeAssistant, entry: MotionConfigEntry) -> bool:
"""Unload Motionblinds Bluetooth device from a config entry."""
if unload_ok := await hass.config_entries.async_unload_platforms(entry, PLATFORMS):
hass.data[DOMAIN].pop(entry.entry_id)
return unload_ok
return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)

View File

@@ -10,12 +10,12 @@ from typing import Any
from motionblindsble.device import MotionDevice
from homeassistant.components.button import ButtonEntity, ButtonEntityDescription
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import ATTR_CONNECT, ATTR_DISCONNECT, ATTR_FAVORITE, CONF_MAC_CODE, DOMAIN
from . import MotionConfigEntry
from .const import ATTR_CONNECT, ATTR_DISCONNECT, ATTR_FAVORITE, CONF_MAC_CODE
from .entity import MotionblindsBLEEntity
_LOGGER = logging.getLogger(__name__)
@@ -54,12 +54,12 @@ BUTTON_TYPES: list[MotionblindsBLEButtonEntityDescription] = [
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: MotionConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up button entities based on a config entry."""
device: MotionDevice = hass.data[DOMAIN][entry.entry_id]
device = entry.runtime_data
async_add_entities(
MotionblindsBLEButtonEntity(

View File

@@ -12,12 +12,7 @@ import voluptuous as vol
from homeassistant.components import bluetooth
from homeassistant.components.bluetooth import BluetoothServiceInfoBleak
from homeassistant.config_entries import (
ConfigEntry,
ConfigFlow,
ConfigFlowResult,
OptionsFlow,
)
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult, OptionsFlow
from homeassistant.const import CONF_ADDRESS
from homeassistant.core import callback
from homeassistant.exceptions import HomeAssistantError
@@ -27,6 +22,7 @@ from homeassistant.helpers.selector import (
SelectSelectorMode,
)
from . import MotionConfigEntry
from .const import (
CONF_BLIND_TYPE,
CONF_LOCAL_NAME,
@@ -185,7 +181,7 @@ class FlowHandler(ConfigFlow, domain=DOMAIN):
@staticmethod
@callback
def async_get_options_flow(
config_entry: ConfigEntry,
config_entry: MotionConfigEntry,
) -> OptionsFlow:
"""Create the options flow."""
return OptionsFlowHandler()

View File

@@ -7,7 +7,6 @@ import logging
from typing import Any
from motionblindsble.const import MotionBlindType, MotionRunningType
from motionblindsble.device import MotionDevice
from homeassistant.components.cover import (
ATTR_POSITION,
@@ -17,11 +16,11 @@ from homeassistant.components.cover import (
CoverEntityDescription,
CoverEntityFeature,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import CONF_BLIND_TYPE, CONF_MAC_CODE, DOMAIN, ICON_VERTICAL_BLIND
from . import MotionConfigEntry
from .const import CONF_BLIND_TYPE, CONF_MAC_CODE, ICON_VERTICAL_BLIND
from .entity import MotionblindsBLEEntity
_LOGGER = logging.getLogger(__name__)
@@ -62,7 +61,7 @@ BLIND_TYPE_TO_ENTITY_DESCRIPTION: dict[str, MotionblindsBLECoverEntityDescriptio
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: MotionConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up cover entity based on a config entry."""
@@ -70,7 +69,7 @@ async def async_setup_entry(
cover_class: type[MotionblindsBLECoverEntity] = BLIND_TYPE_TO_CLASS[
entry.data[CONF_BLIND_TYPE].upper()
]
device: MotionDevice = hass.data[DOMAIN][entry.entry_id]
device = entry.runtime_data
entity_description: MotionblindsBLECoverEntityDescription = (
BLIND_TYPE_TO_ENTITY_DESCRIPTION[entry.data[CONF_BLIND_TYPE].upper()]
)

View File

@@ -5,14 +5,11 @@ from __future__ import annotations
from collections.abc import Iterable
from typing import Any
from motionblindsble.device import MotionDevice
from homeassistant.components.diagnostics import async_redact_data
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_UNIQUE_ID
from homeassistant.core import HomeAssistant
from .const import DOMAIN
from . import MotionConfigEntry
CONF_TITLE = "title"
@@ -24,10 +21,10 @@ TO_REDACT: Iterable[Any] = {
async def async_get_config_entry_diagnostics(
hass: HomeAssistant, entry: ConfigEntry
hass: HomeAssistant, entry: MotionConfigEntry
) -> dict[str, Any]:
"""Return diagnostics for a config entry."""
device: MotionDevice = hass.data[DOMAIN][entry.entry_id]
device = entry.runtime_data
return async_redact_data(
{

View File

@@ -5,11 +5,11 @@ import logging
from motionblindsble.const import MotionBlindType
from motionblindsble.device import MotionDevice
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_ADDRESS
from homeassistant.helpers.device_registry import CONNECTION_BLUETOOTH, DeviceInfo
from homeassistant.helpers.entity import Entity, EntityDescription
from . import MotionConfigEntry
from .const import CONF_BLIND_TYPE, CONF_MAC_CODE, MANUFACTURER
_LOGGER = logging.getLogger(__name__)
@@ -21,13 +21,10 @@ class MotionblindsBLEEntity(Entity):
_attr_has_entity_name = True
_attr_should_poll = False
device: MotionDevice
entry: ConfigEntry
def __init__(
self,
device: MotionDevice,
entry: ConfigEntry,
entry: MotionConfigEntry,
entity_description: EntityDescription,
unique_id_suffix: str | None = None,
) -> None:

View File

@@ -8,12 +8,12 @@ from motionblindsble.const import MotionBlindType, MotionSpeedLevel
from motionblindsble.device import MotionDevice
from homeassistant.components.select import SelectEntity, SelectEntityDescription
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import ATTR_SPEED, CONF_MAC_CODE, DOMAIN
from . import MotionConfigEntry
from .const import ATTR_SPEED, CONF_MAC_CODE
from .entity import MotionblindsBLEEntity
_LOGGER = logging.getLogger(__name__)
@@ -33,12 +33,12 @@ SELECT_TYPES: dict[str, SelectEntityDescription] = {
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: MotionConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up select entities based on a config entry."""
device: MotionDevice = hass.data[DOMAIN][entry.entry_id]
device = entry.runtime_data
if device.blind_type not in {MotionBlindType.CURTAIN, MotionBlindType.VERTICAL}:
async_add_entities([SpeedSelect(device, entry, SELECT_TYPES[ATTR_SPEED])])
@@ -50,7 +50,7 @@ class SpeedSelect(MotionblindsBLEEntity, SelectEntity):
def __init__(
self,
device: MotionDevice,
entry: ConfigEntry,
entry: MotionConfigEntry,
entity_description: SelectEntityDescription,
) -> None:
"""Initialize the speed select entity."""

View File

@@ -20,7 +20,6 @@ from homeassistant.components.sensor import (
SensorEntityDescription,
SensorStateClass,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
PERCENTAGE,
SIGNAL_STRENGTH_DECIBELS_MILLIWATT,
@@ -30,13 +29,13 @@ from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import StateType
from . import MotionConfigEntry
from .const import (
ATTR_BATTERY,
ATTR_CALIBRATION,
ATTR_CONNECTION,
ATTR_SIGNAL_STRENGTH,
CONF_MAC_CODE,
DOMAIN,
)
from .entity import MotionblindsBLEEntity
@@ -94,12 +93,12 @@ SENSORS: tuple[MotionblindsBLESensorEntityDescription, ...] = (
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: MotionConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up sensor entities based on a config entry."""
device: MotionDevice = hass.data[DOMAIN][entry.entry_id]
device = entry.runtime_data
entities: list[SensorEntity] = [
MotionblindsBLESensorEntity(device, entry, description)
@@ -118,7 +117,7 @@ class MotionblindsBLESensorEntity[_T](MotionblindsBLEEntity, SensorEntity):
def __init__(
self,
device: MotionDevice,
entry: ConfigEntry,
entry: MotionConfigEntry,
entity_description: MotionblindsBLESensorEntityDescription[_T],
) -> None:
"""Initialize the sensor entity."""
@@ -149,7 +148,7 @@ class BatterySensor(MotionblindsBLEEntity, SensorEntity):
def __init__(
self,
device: MotionDevice,
entry: ConfigEntry,
entry: MotionConfigEntry,
) -> None:
"""Initialize the sensor entity."""
entity_description = SensorEntityDescription(

View File

@@ -62,8 +62,6 @@ from .const import (
ATTR_WEBHOOK_ID,
CONF_ADMIN_PASSWORD,
CONF_ADMIN_USERNAME,
CONF_CLIENT,
CONF_COORDINATOR,
CONF_SURVEILLANCE_PASSWORD,
CONF_SURVEILLANCE_USERNAME,
CONF_WEBHOOK_SET,
@@ -308,10 +306,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
)
coordinator = MotionEyeUpdateCoordinator(hass, entry, client)
hass.data[DOMAIN][entry.entry_id] = {
CONF_CLIENT: client,
CONF_COORDINATOR: coordinator,
}
hass.data[DOMAIN][entry.entry_id] = coordinator
current_cameras: set[tuple[str, str]] = set()
device_registry = dr.async_get(hass)
@@ -373,8 +368,8 @@ async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
unload_ok = await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
if unload_ok:
config_data = hass.data[DOMAIN].pop(entry.entry_id)
await config_data[CONF_CLIENT].async_client_close()
coordinator = hass.data[DOMAIN].pop(entry.entry_id)
await coordinator.client.async_client_close()
return unload_ok
@@ -446,9 +441,8 @@ def _get_media_event_data(
if not config_entry_id or config_entry_id not in hass.data[DOMAIN]:
return {}
config_entry_data = hass.data[DOMAIN][config_entry_id]
client = config_entry_data[CONF_CLIENT]
coordinator = config_entry_data[CONF_COORDINATOR]
coordinator = hass.data[DOMAIN][config_entry_id]
client = coordinator.client
for identifier in device.identifiers:
data = split_motioneye_device_identifier(identifier)

View File

@@ -47,8 +47,6 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import get_camera_from_cameras, is_acceptable_camera, listen_for_new_cameras
from .const import (
CONF_ACTION,
CONF_CLIENT,
CONF_COORDINATOR,
CONF_STREAM_URL_TEMPLATE,
CONF_SURVEILLANCE_PASSWORD,
CONF_SURVEILLANCE_USERNAME,
@@ -98,7 +96,7 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up motionEye from a config entry."""
entry_data = hass.data[DOMAIN][entry.entry_id]
coordinator = hass.data[DOMAIN][entry.entry_id]
@callback
def camera_add(camera: dict[str, Any]) -> None:
@@ -112,8 +110,8 @@ async def async_setup_entry(
),
entry.data.get(CONF_SURVEILLANCE_PASSWORD, ""),
camera,
entry_data[CONF_CLIENT],
entry_data[CONF_COORDINATOR],
coordinator.client,
coordinator,
entry.options,
)
]

View File

@@ -30,8 +30,6 @@ ATTR_EVENT_TYPE: Final = "event_type"
ATTR_WEBHOOK_ID: Final = "webhook_id"
CONF_ACTION: Final = "action"
CONF_CLIENT: Final = "client"
CONF_COORDINATOR: Final = "coordinator"
CONF_ADMIN_PASSWORD: Final = "admin_password"
CONF_ADMIN_USERNAME: Final = "admin_username"
CONF_STREAM_URL_TEMPLATE: Final = "stream_url_template"

View File

@@ -22,7 +22,7 @@ from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import device_registry as dr
from . import get_media_url, split_motioneye_device_identifier
from .const import CONF_CLIENT, DOMAIN
from .const import DOMAIN
MIME_TYPE_MAP = {
"movies": "video/mp4",
@@ -74,7 +74,7 @@ class MotionEyeMediaSource(MediaSource):
self._verify_kind_or_raise(kind)
url = get_media_url(
self.hass.data[DOMAIN][config.entry_id][CONF_CLIENT],
self.hass.data[DOMAIN][config.entry_id].client,
self._get_camera_id_or_raise(config, device),
self._get_path_or_raise(path),
kind == "images",
@@ -276,7 +276,7 @@ class MotionEyeMediaSource(MediaSource):
base.children = []
client = self.hass.data[DOMAIN][config.entry_id][CONF_CLIENT]
client = self.hass.data[DOMAIN][config.entry_id].client
camera_id = self._get_camera_id_or_raise(config, device)
if kind == "movies":

View File

@@ -15,7 +15,7 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import StateType
from . import get_camera_from_cameras, listen_for_new_cameras
from .const import CONF_CLIENT, CONF_COORDINATOR, DOMAIN, TYPE_MOTIONEYE_ACTION_SENSOR
from .const import DOMAIN, TYPE_MOTIONEYE_ACTION_SENSOR
from .coordinator import MotionEyeUpdateCoordinator
from .entity import MotionEyeEntity
@@ -26,7 +26,7 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up motionEye from a config entry."""
entry_data = hass.data[DOMAIN][entry.entry_id]
coordinator = hass.data[DOMAIN][entry.entry_id]
@callback
def camera_add(camera: dict[str, Any]) -> None:
@@ -36,8 +36,8 @@ async def async_setup_entry(
MotionEyeActionSensor(
entry.entry_id,
camera,
entry_data[CONF_CLIENT],
entry_data[CONF_COORDINATOR],
coordinator.client,
coordinator,
entry.options,
)
]

View File

@@ -22,7 +22,7 @@ from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import get_camera_from_cameras, listen_for_new_cameras
from .const import CONF_CLIENT, CONF_COORDINATOR, DOMAIN, TYPE_MOTIONEYE_SWITCH_BASE
from .const import DOMAIN, TYPE_MOTIONEYE_SWITCH_BASE
from .coordinator import MotionEyeUpdateCoordinator
from .entity import MotionEyeEntity
@@ -72,7 +72,7 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up motionEye from a config entry."""
entry_data = hass.data[DOMAIN][entry.entry_id]
coordinator = hass.data[DOMAIN][entry.entry_id]
@callback
def camera_add(camera: dict[str, Any]) -> None:
@@ -82,8 +82,8 @@ async def async_setup_entry(
MotionEyeSwitch(
entry.entry_id,
camera,
entry_data[CONF_CLIENT],
entry_data[CONF_COORDINATOR],
coordinator.client,
coordinator,
entry.options,
entity_description,
)

View File

@@ -2,54 +2,20 @@
from __future__ import annotations
import asyncio
import logging
import mutesync
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from homeassistant.helpers import update_coordinator
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .const import DOMAIN, UPDATE_INTERVAL_IN_MEETING, UPDATE_INTERVAL_NOT_IN_MEETING
from .const import DOMAIN
from .coordinator import MutesyncUpdateCoordinator
PLATFORMS = [Platform.BINARY_SENSOR]
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Set up mütesync from a config entry."""
client = mutesync.PyMutesync(
entry.data["token"],
entry.data["host"],
async_get_clientsession(hass),
)
async def update_data():
"""Update the data."""
async with asyncio.timeout(2.5):
state = await client.get_state()
if state["muted"] is None or state["in_meeting"] is None:
raise update_coordinator.UpdateFailed("Got invalid response")
if state["in_meeting"]:
coordinator.update_interval = UPDATE_INTERVAL_IN_MEETING
else:
coordinator.update_interval = UPDATE_INTERVAL_NOT_IN_MEETING
return state
coordinator = hass.data.setdefault(DOMAIN, {})[entry.entry_id] = (
update_coordinator.DataUpdateCoordinator(
hass,
logging.getLogger(__name__),
config_entry=entry,
name=DOMAIN,
update_interval=UPDATE_INTERVAL_NOT_IN_MEETING,
update_method=update_data,
)
MutesyncUpdateCoordinator(hass, entry)
)
await coordinator.async_config_entry_first_refresh()

View File

@@ -3,11 +3,12 @@
from homeassistant.components.binary_sensor import BinarySensorEntity
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers import update_coordinator
from homeassistant.helpers.device_registry import DeviceEntryType, DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN
from .coordinator import MutesyncUpdateCoordinator
SENSORS = (
"in_meeting",
@@ -27,7 +28,7 @@ async def async_setup_entry(
)
class MuteStatus(update_coordinator.CoordinatorEntity, BinarySensorEntity):
class MuteStatus(CoordinatorEntity[MutesyncUpdateCoordinator], BinarySensorEntity):
"""Mütesync binary sensors."""
_attr_has_entity_name = True

View File

@@ -0,0 +1,58 @@
"""Coordinator for the mütesync integration."""
from __future__ import annotations
import asyncio
import logging
from typing import Any
import mutesync
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import DOMAIN, UPDATE_INTERVAL_IN_MEETING, UPDATE_INTERVAL_NOT_IN_MEETING
_LOGGER = logging.getLogger(__name__)
class MutesyncUpdateCoordinator(DataUpdateCoordinator[dict[str, Any]]):
"""Coordinator for the mütesync integration."""
config_entry: ConfigEntry
def __init__(
self,
hass: HomeAssistant,
entry: ConfigEntry,
) -> None:
"""Initialize the coordinator."""
super().__init__(
hass,
_LOGGER,
name=DOMAIN,
config_entry=entry,
update_interval=UPDATE_INTERVAL_NOT_IN_MEETING,
)
self._client = mutesync.PyMutesync(
entry.data["token"],
entry.data["host"],
async_get_clientsession(hass),
)
async def _async_update_data(self) -> dict[str, Any]:
"""Get data from the mütesync client."""
async with asyncio.timeout(2.5):
state = await self._client.get_state()
if state["muted"] is None or state["in_meeting"] is None:
raise UpdateFailed("Got invalid response")
if state["in_meeting"]:
self.update_interval = UPDATE_INTERVAL_IN_MEETING
else:
self.update_interval = UPDATE_INTERVAL_NOT_IN_MEETING
return state

View File

@@ -17,7 +17,6 @@ from homeassistant.config_entries import (
ConfigEntry,
ConfigFlow,
ConfigFlowResult,
ConfigSubentryData,
ConfigSubentryFlow,
SubentryFlowResult,
)
@@ -30,15 +29,7 @@ from homeassistant.helpers.selector import (
TimeSelector,
)
from .const import (
CONF_FROM,
CONF_ROUTES,
CONF_TIME,
CONF_TO,
CONF_VIA,
DOMAIN,
INTEGRATION_TITLE,
)
from .const import CONF_FROM, CONF_TIME, CONF_TO, CONF_VIA, DOMAIN, INTEGRATION_TITLE
_LOGGER = logging.getLogger(__name__)
@@ -133,47 +124,6 @@ class NSConfigFlow(ConfigFlow, domain=DOMAIN):
errors=errors,
)
async def async_step_import(self, import_data: dict[str, Any]) -> ConfigFlowResult:
"""Handle import from YAML configuration."""
self._async_abort_entries_match({CONF_API_KEY: import_data[CONF_API_KEY]})
client = NSAPI(import_data[CONF_API_KEY])
try:
stations = await self.hass.async_add_executor_job(client.get_stations)
except HTTPError:
return self.async_abort(reason="invalid_auth")
except (RequestsConnectionError, Timeout):
return self.async_abort(reason="cannot_connect")
except Exception:
_LOGGER.exception("Unexpected exception validating API key")
return self.async_abort(reason="unknown")
station_codes = {station.code for station in stations}
subentries: list[ConfigSubentryData] = []
for route in import_data.get(CONF_ROUTES, []):
# Convert station codes to uppercase for consistency with UI routes
for key in (CONF_FROM, CONF_TO, CONF_VIA):
if key in route:
route[key] = route[key].upper()
if route[key] not in station_codes:
return self.async_abort(reason="invalid_station")
subentries.append(
ConfigSubentryData(
title=route[CONF_NAME],
subentry_type="route",
data=route,
unique_id=None,
)
)
return self.async_create_entry(
title=INTEGRATION_TITLE,
data={CONF_API_KEY: import_data[CONF_API_KEY]},
subentries=subentries,
)
@classmethod
@callback
def async_get_supported_subentry_types(

View File

@@ -12,7 +12,6 @@ AMS_TZ = ZoneInfo("Europe/Amsterdam")
# Update every 2 minutes
SCAN_INTERVAL = timedelta(minutes=2)
CONF_ROUTES = "routes"
CONF_FROM = "from"
CONF_TO = "to"
CONF_VIA = "via"

View File

@@ -5,42 +5,24 @@ from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass
from datetime import datetime
import logging
from typing import Any
from ns_api import Trip
import voluptuous as vol
from homeassistant.components.sensor import (
PLATFORM_SCHEMA as SENSOR_PLATFORM_SCHEMA,
SensorDeviceClass,
SensorEntity,
SensorEntityDescription,
)
from homeassistant.config_entries import SOURCE_IMPORT
from homeassistant.const import CONF_API_KEY, CONF_NAME, EntityCategory
from homeassistant.core import DOMAIN as HOMEASSISTANT_DOMAIN, HomeAssistant
from homeassistant.data_entry_flow import FlowResultType
from homeassistant.helpers import config_validation as cv, issue_registry as ir
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity_platform import (
AddConfigEntryEntitiesCallback,
AddEntitiesCallback,
)
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType, StateType
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import StateType
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .binary_sensor import get_delay
from .const import (
CONF_FROM,
CONF_ROUTES,
CONF_TIME,
CONF_TO,
CONF_VIA,
DOMAIN,
INTEGRATION_TITLE,
ROUTE_MODEL,
)
from .const import DOMAIN, INTEGRATION_TITLE, ROUTE_MODEL
from .coordinator import NSConfigEntry, NSDataUpdateCoordinator
@@ -70,26 +52,9 @@ TRIP_STATUS = {
"CANCELLED": "cancelled",
}
_LOGGER = logging.getLogger(__name__)
PARALLEL_UPDATES = 0 # since we use coordinator pattern
ROUTE_SCHEMA = vol.Schema(
{
vol.Required(CONF_NAME): cv.string,
vol.Required(CONF_FROM): cv.string,
vol.Required(CONF_TO): cv.string,
vol.Optional(CONF_VIA): cv.string,
vol.Optional(CONF_TIME): cv.time,
}
)
ROUTES_SCHEMA = vol.All(cv.ensure_list, [ROUTE_SCHEMA])
PLATFORM_SCHEMA = SENSOR_PLATFORM_SCHEMA.extend(
{vol.Required(CONF_API_KEY): cv.string, vol.Optional(CONF_ROUTES): ROUTES_SCHEMA}
)
@dataclass(frozen=True, kw_only=True)
class NSSensorEntityDescription(SensorEntityDescription):
@@ -195,55 +160,6 @@ SENSOR_DESCRIPTIONS: tuple[NSSensorEntityDescription, ...] = (
)
async def async_setup_platform(
hass: HomeAssistant,
config: ConfigType,
async_add_entities: AddEntitiesCallback,
discovery_info: DiscoveryInfoType | None = None,
) -> None:
"""Set up the departure sensor."""
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": SOURCE_IMPORT},
data=config,
)
if (
result.get("type") is FlowResultType.ABORT
and result.get("reason") != "already_configured"
):
ir.async_create_issue(
hass,
DOMAIN,
f"deprecated_yaml_import_issue_{result.get('reason')}",
breaks_in_ha_version="2026.4.0",
is_fixable=False,
issue_domain=DOMAIN,
severity=ir.IssueSeverity.WARNING,
translation_key=f"deprecated_yaml_import_issue_{result.get('reason')}",
translation_placeholders={
"domain": DOMAIN,
"integration_title": INTEGRATION_TITLE,
},
)
return
ir.async_create_issue(
hass,
HOMEASSISTANT_DOMAIN,
"deprecated_yaml",
breaks_in_ha_version="2026.4.0",
is_fixable=False,
issue_domain=DOMAIN,
severity=ir.IssueSeverity.WARNING,
translation_key="deprecated_yaml",
translation_placeholders={
"domain": DOMAIN,
"integration_title": INTEGRATION_TITLE,
},
)
async def async_setup_entry(
hass: HomeAssistant,
config_entry: NSConfigEntry,

View File

@@ -127,23 +127,5 @@
"name": "Transfers"
}
}
},
"issues": {
"deprecated_yaml_import_issue_cannot_connect": {
"description": "Configuring Nederlandse Spoorwegen using the YAML sensor platform is deprecated.\n\nWhile importing your configuration, Home Assistant could not connect to the NS API. Please check your internet connection and the status of the NS API, then restart Home Assistant to try again, or remove the existing YAML configuration and set the integration up via the UI.",
"title": "[%key:component::nederlandse_spoorwegen::issues::deprecated_yaml_import_issue_invalid_auth::title%]"
},
"deprecated_yaml_import_issue_invalid_auth": {
"description": "Configuring Nederlandse Spoorwegen using the YAML sensor platform is deprecated.\n\nWhile importing your configuration, an invalid API key was found. Please update your YAML configuration, or remove the existing YAML configuration and set the integration up via the UI.",
"title": "Nederlandse Spoorwegen YAML configuration deprecated"
},
"deprecated_yaml_import_issue_invalid_station": {
"description": "Configuring Nederlandse Spoorwegen using the YAML sensor platform is deprecated.\n\nWhile importing your configuration, an invalid station was found. Please update your YAML configuration, or remove the existing YAML configuration and set the integration up via the UI.",
"title": "[%key:component::nederlandse_spoorwegen::issues::deprecated_yaml_import_issue_invalid_auth::title%]"
},
"deprecated_yaml_import_issue_unknown": {
"description": "Configuring Nederlandse Spoorwegen using the YAML sensor platform is deprecated.\n\nWhile importing your configuration, an unknown error occurred. Please restart Home Assistant to try again, or remove the existing YAML configuration and set the integration up via the UI.",
"title": "[%key:component::nederlandse_spoorwegen::issues::deprecated_yaml_import_issue_invalid_auth::title%]"
}
}
}

View File

@@ -218,7 +218,7 @@ def fix_coordinates(user_input: dict) -> dict:
# Ensure coordinates have acceptable length for the Netatmo API
for coordinate in (CONF_LAT_NE, CONF_LAT_SW, CONF_LON_NE, CONF_LON_SW):
if len(str(user_input[coordinate]).split(".")[1]) < 7:
user_input[coordinate] = user_input[coordinate] + 0.0000001
user_input[coordinate] = user_input[coordinate] + 1e-7
# Swap coordinates if entered in wrong order
if user_input[CONF_LAT_NE] < user_input[CONF_LAT_SW]:

View File

@@ -11,6 +11,7 @@ from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .coordinator import NRGkickConfigEntry, NRGkickDataUpdateCoordinator
PLATFORMS: list[Platform] = [
Platform.BINARY_SENSOR,
Platform.NUMBER,
Platform.SENSOR,
Platform.SWITCH,

View File

@@ -0,0 +1,76 @@
"""Binary sensor platform for NRGkick."""
from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass
from homeassistant.components.binary_sensor import (
BinarySensorEntity,
BinarySensorEntityDescription,
)
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import NRGkickConfigEntry, NRGkickData, NRGkickDataUpdateCoordinator
from .entity import NRGkickEntity, get_nested_dict_value
PARALLEL_UPDATES = 0
@dataclass(frozen=True, kw_only=True)
class NRGkickBinarySensorEntityDescription(BinarySensorEntityDescription):
"""Class describing NRGkick binary sensor entities."""
is_on_fn: Callable[[NRGkickData], bool | None]
BINARY_SENSORS: tuple[NRGkickBinarySensorEntityDescription, ...] = (
NRGkickBinarySensorEntityDescription(
key="charge_permitted",
translation_key="charge_permitted",
is_on_fn=lambda data: (
bool(value)
if (
value := get_nested_dict_value(
data.values, "general", "charge_permitted"
)
)
is not None
else None
),
),
)
async def async_setup_entry(
_hass: HomeAssistant,
entry: NRGkickConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up NRGkick binary sensors based on a config entry."""
coordinator = entry.runtime_data
async_add_entities(
NRGkickBinarySensor(coordinator, description) for description in BINARY_SENSORS
)
class NRGkickBinarySensor(NRGkickEntity, BinarySensorEntity):
"""Representation of a NRGkick binary sensor."""
entity_description: NRGkickBinarySensorEntityDescription
def __init__(
self,
coordinator: NRGkickDataUpdateCoordinator,
entity_description: NRGkickBinarySensorEntityDescription,
) -> None:
"""Initialize the binary sensor."""
super().__init__(coordinator, entity_description.key)
self.entity_description = entity_description
@property
def is_on(self) -> bool | None:
"""Return the state of the binary sensor."""
return self.entity_description.is_on_fn(self.coordinator.data)

View File

@@ -14,6 +14,17 @@ from .const import DOMAIN
from .coordinator import NRGkickDataUpdateCoordinator
def get_nested_dict_value(data: Any, *keys: str) -> Any:
"""Safely get a nested value from dict-like API responses."""
current: Any = data
for key in keys:
try:
current = current.get(key)
except AttributeError:
return None
return current
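The helper above degrades gracefully at any depth: once a lookup returns a non-mapping value, the next `.get` raises `AttributeError` and the whole chain yields `None`. A quick standalone check (copy of the shown implementation; the example payload shape is hypothetical):

```python
from typing import Any


def get_nested_dict_value(data: Any, *keys: str) -> Any:
    """Safely get a nested value from dict-like API responses."""
    current: Any = data
    for key in keys:
        try:
            current = current.get(key)
        except AttributeError:
            return None
    return current


payload = {"general": {"charge_permitted": 1}}
print(get_nested_dict_value(payload, "general", "charge_permitted"))  # 1
print(get_nested_dict_value(payload, "general", "missing"))           # None
# Descending past a leaf value (1 has no .get) also yields None:
print(get_nested_dict_value(payload, "general", "charge_permitted", "deeper"))  # None
```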
class NRGkickEntity(CoordinatorEntity[NRGkickDataUpdateCoordinator]):
"""Base class for NRGkick entities with common device info setup."""

View File

@@ -1,5 +1,10 @@
{
"entity": {
"binary_sensor": {
"charge_permitted": {
"default": "mdi:ev-station"
}
},
"number": {
"current_set": {
"default": "mdi:current-ac"

View File

@@ -45,22 +45,11 @@ from .const import (
WARNING_CODE_MAP,
)
from .coordinator import NRGkickConfigEntry, NRGkickData, NRGkickDataUpdateCoordinator
from .entity import NRGkickEntity
from .entity import NRGkickEntity, get_nested_dict_value
PARALLEL_UPDATES = 0
def _get_nested_dict_value(data: Any, *keys: str) -> Any:
"""Safely get a nested value from dict-like API responses."""
current: Any = data
for key in keys:
try:
current = current.get(key)
except AttributeError:
return None
return current
@dataclass(frozen=True, kw_only=True)
class NRGkickSensorEntityDescription(SensorEntityDescription):
"""Class describing NRGkick sensor entities."""
@@ -159,7 +148,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfElectricCurrent.AMPERE,
suggested_display_precision=2,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.info, "general", "rated_current"
),
),
@@ -167,7 +156,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
NRGkickSensorEntityDescription(
key="connector_phase_count",
translation_key="connector_phase_count",
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.info, "connector", "phase_count"
),
),
@@ -178,7 +167,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfElectricCurrent.AMPERE,
suggested_display_precision=2,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.info, "connector", "max_current"
),
),
@@ -189,7 +178,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
options=_enum_options_from_mapping(CONNECTOR_TYPE_MAP),
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=lambda data: _map_code_to_translation_key(
cast(StateType, _get_nested_dict_value(data.info, "connector", "type")),
cast(StateType, get_nested_dict_value(data.info, "connector", "type")),
CONNECTOR_TYPE_MAP,
),
),
@@ -198,7 +187,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
translation_key="connector_serial",
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
value_fn=lambda data: _get_nested_dict_value(data.info, "connector", "serial"),
value_fn=lambda data: get_nested_dict_value(data.info, "connector", "serial"),
),
# INFO - Grid
NRGkickSensorEntityDescription(
@@ -208,7 +197,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfElectricPotential.VOLT,
suggested_display_precision=2,
value_fn=lambda data: _get_nested_dict_value(data.info, "grid", "voltage"),
value_fn=lambda data: get_nested_dict_value(data.info, "grid", "voltage"),
),
NRGkickSensorEntityDescription(
key="grid_frequency",
@@ -217,7 +206,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfFrequency.HERTZ,
suggested_display_precision=2,
value_fn=lambda data: _get_nested_dict_value(data.info, "grid", "frequency"),
value_fn=lambda data: get_nested_dict_value(data.info, "grid", "frequency"),
),
# INFO - Network
NRGkickSensorEntityDescription(
@@ -225,7 +214,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
translation_key="network_ssid",
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
value_fn=lambda data: _get_nested_dict_value(data.info, "network", "ssid"),
value_fn=lambda data: get_nested_dict_value(data.info, "network", "ssid"),
),
NRGkickSensorEntityDescription(
key="network_rssi",
@@ -234,7 +223,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=SIGNAL_STRENGTH_DECIBELS_MILLIWATT,
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=lambda data: _get_nested_dict_value(data.info, "network", "rssi"),
value_fn=lambda data: get_nested_dict_value(data.info, "network", "rssi"),
),
# INFO - Cellular (optional, only if cellular module is available)
NRGkickSensorEntityDescription(
@@ -246,7 +235,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
entity_registry_enabled_default=False,
requires_sim_module=True,
value_fn=lambda data: _map_code_to_translation_key(
cast(StateType, _get_nested_dict_value(data.info, "cellular", "mode")),
cast(StateType, get_nested_dict_value(data.info, "cellular", "mode")),
CELLULAR_MODE_MAP,
),
),
@@ -259,7 +248,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
requires_sim_module=True,
value_fn=lambda data: _get_nested_dict_value(data.info, "cellular", "rssi"),
value_fn=lambda data: get_nested_dict_value(data.info, "cellular", "rssi"),
),
NRGkickSensorEntityDescription(
key="cellular_operator",
@@ -267,7 +256,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
requires_sim_module=True,
value_fn=lambda data: _get_nested_dict_value(data.info, "cellular", "operator"),
value_fn=lambda data: get_nested_dict_value(data.info, "cellular", "operator"),
),
# VALUES - Energy
NRGkickSensorEntityDescription(
@@ -278,7 +267,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
native_unit_of_measurement=UnitOfEnergy.WATT_HOUR,
suggested_display_precision=3,
suggested_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "energy", "total_charged_energy"
),
),
@@ -290,7 +279,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
native_unit_of_measurement=UnitOfEnergy.WATT_HOUR,
suggested_display_precision=3,
suggested_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "energy", "charged_energy"
),
),
@@ -302,7 +291,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfElectricPotential.VOLT,
suggested_display_precision=2,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "charging_voltage"
),
),
@@ -313,7 +302,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfElectricCurrent.AMPERE,
suggested_display_precision=2,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "charging_current"
),
),
@@ -326,7 +315,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
suggested_display_precision=2,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "grid_frequency"
),
),
@@ -339,7 +328,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
suggested_display_precision=2,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "peak_power"
),
),
@@ -350,7 +339,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfPower.WATT,
suggested_display_precision=2,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "total_active_power"
),
),
@@ -362,7 +351,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
native_unit_of_measurement=UnitOfReactivePower.VOLT_AMPERE_REACTIVE,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "total_reactive_power"
),
),
@@ -374,7 +363,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
native_unit_of_measurement=UnitOfApparentPower.VOLT_AMPERE,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "total_apparent_power"
),
),
@@ -386,7 +375,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
native_unit_of_measurement=PERCENTAGE,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "total_power_factor"
),
),
@@ -400,7 +389,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
suggested_display_precision=2,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "l1", "voltage"
),
),
@@ -411,7 +400,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfElectricCurrent.AMPERE,
suggested_display_precision=2,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "l1", "current"
),
),
@@ -422,7 +411,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfPower.WATT,
suggested_display_precision=2,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "l1", "active_power"
),
),
@@ -434,7 +423,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
native_unit_of_measurement=UnitOfReactivePower.VOLT_AMPERE_REACTIVE,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "l1", "reactive_power"
),
),
@@ -446,7 +435,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
native_unit_of_measurement=UnitOfApparentPower.VOLT_AMPERE,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "l1", "apparent_power"
),
),
@@ -458,7 +447,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
native_unit_of_measurement=PERCENTAGE,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "l1", "power_factor"
),
),
@@ -472,7 +461,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
suggested_display_precision=2,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "l2", "voltage"
),
),
@@ -483,7 +472,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfElectricCurrent.AMPERE,
suggested_display_precision=2,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "l2", "current"
),
),
@@ -494,7 +483,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfPower.WATT,
suggested_display_precision=2,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "l2", "active_power"
),
),
@@ -506,7 +495,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
native_unit_of_measurement=UnitOfReactivePower.VOLT_AMPERE_REACTIVE,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "l2", "reactive_power"
),
),
@@ -518,7 +507,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
native_unit_of_measurement=UnitOfApparentPower.VOLT_AMPERE,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "l2", "apparent_power"
),
),
@@ -530,7 +519,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
native_unit_of_measurement=PERCENTAGE,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "l2", "power_factor"
),
),
@@ -544,7 +533,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
suggested_display_precision=2,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "l3", "voltage"
),
),
@@ -555,7 +544,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfElectricCurrent.AMPERE,
suggested_display_precision=2,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "l3", "current"
),
),
@@ -566,7 +555,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfPower.WATT,
suggested_display_precision=2,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "l3", "active_power"
),
),
@@ -578,7 +567,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
native_unit_of_measurement=UnitOfReactivePower.VOLT_AMPERE_REACTIVE,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "l3", "reactive_power"
),
),
@@ -590,7 +579,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
native_unit_of_measurement=UnitOfApparentPower.VOLT_AMPERE,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "l3", "apparent_power"
),
),
@@ -602,7 +591,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
native_unit_of_measurement=PERCENTAGE,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "l3", "power_factor"
),
),
@@ -616,7 +605,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
suggested_display_precision=2,
entity_category=EntityCategory.DIAGNOSTIC,
entity_registry_enabled_default=False,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "powerflow", "n", "current"
),
),
@@ -626,7 +615,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
translation_key="charging_rate",
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfSpeed.KILOMETERS_PER_HOUR,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "general", "charging_rate"
),
),
@@ -638,12 +627,12 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
_seconds_to_stable_timestamp(
cast(
StateType,
_get_nested_dict_value(
get_nested_dict_value(
data.values, "general", "vehicle_connect_time"
),
)
)
if _get_nested_dict_value(data.values, "general", "status")
if get_nested_dict_value(data.values, "general", "status")
!= ChargingStatus.STANDBY
else None
),
@@ -655,7 +644,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfTime.SECONDS,
suggested_unit_of_measurement=UnitOfTime.MINUTES,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "general", "vehicle_charging_time"
),
),
@@ -665,7 +654,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
device_class=SensorDeviceClass.ENUM,
options=_enum_options_from_mapping(STATUS_MAP),
value_fn=lambda data: _map_code_to_translation_key(
cast(StateType, _get_nested_dict_value(data.values, "general", "status")),
cast(StateType, get_nested_dict_value(data.values, "general", "status")),
STATUS_MAP,
),
),
@@ -675,7 +664,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
entity_category=EntityCategory.DIAGNOSTIC,
state_class=SensorStateClass.TOTAL_INCREASING,
suggested_display_precision=0,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "general", "charge_count"
),
),
@@ -687,7 +676,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=lambda data: _map_code_to_translation_key(
cast(
StateType, _get_nested_dict_value(data.values, "general", "rcd_trigger")
StateType, get_nested_dict_value(data.values, "general", "rcd_trigger")
),
RCD_TRIGGER_MAP,
),
@@ -700,8 +689,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=lambda data: _map_code_to_translation_key(
cast(
StateType,
_get_nested_dict_value(data.values, "general", "warning_code"),
StateType, get_nested_dict_value(data.values, "general", "warning_code")
),
WARNING_CODE_MAP,
),
@@ -714,7 +702,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=lambda data: _map_code_to_translation_key(
cast(
StateType, _get_nested_dict_value(data.values, "general", "error_code")
StateType, get_nested_dict_value(data.values, "general", "error_code")
),
ERROR_CODE_MAP,
),
@@ -727,7 +715,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "temperatures", "housing"
),
),
@@ -738,7 +726,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "temperatures", "connector_l1"
),
),
@@ -749,7 +737,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "temperatures", "connector_l2"
),
),
@@ -760,7 +748,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "temperatures", "connector_l3"
),
),
@@ -771,7 +759,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "temperatures", "domestic_plug_1"
),
),
@@ -782,7 +770,7 @@ SENSORS: tuple[NRGkickSensorEntityDescription, ...] = (
state_class=SensorStateClass.MEASUREMENT,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=lambda data: _get_nested_dict_value(
value_fn=lambda data: get_nested_dict_value(
data.values, "temperatures", "domestic_plug_2"
),
),

View File

@@ -78,6 +78,11 @@
}
},
"entity": {
"binary_sensor": {
"charge_permitted": {
"name": "Charge permitted"
}
},
"number": {
"current_set": {
"name": "Charging current"

View File

@@ -16,23 +16,30 @@ from onvif.client import (
)
from onvif.exceptions import ONVIFError
from onvif.util import stringify_onvif_error
import onvif_parsers
from zeep.exceptions import Fault, TransportError, ValidationError, XMLParseError
from homeassistant.components import webhook
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import EntityCategory
from homeassistant.core import CALLBACK_TYPE, HassJob, HomeAssistant, callback
from homeassistant.helpers.device_registry import format_mac
from homeassistant.helpers.event import async_call_later
from homeassistant.helpers.network import NoURLAvailableError, get_url
from homeassistant.util import dt as dt_util
from .const import DOMAIN, LOGGER
from .models import Event, PullPointManagerState, WebHookManagerState
from .parsers import PARSERS
# Topics in this list are ignored because we do not want to create
# entities for them.
UNHANDLED_TOPICS: set[str] = {"tns1:MediaControl/VideoEncoderConfiguration"}
ENTITY_CATEGORY_MAPPING: dict[str, EntityCategory] = {
"diagnostic": EntityCategory.DIAGNOSTIC,
"config": EntityCategory.CONFIG,
}
SUBSCRIPTION_ERRORS = (Fault, TimeoutError, TransportError)
CREATE_ERRORS = (
ONVIFError,
@@ -81,6 +88,18 @@ PULLPOINT_MESSAGE_LIMIT = 100
PULLPOINT_COOLDOWN_TIME = 0.75
def _local_datetime_or_none(value: str) -> dt.datetime | None:
"""Convert a string to a datetime, returning None if invalid."""
# Handle cameras that return times like '0000-00-00T00:00:00Z' (e.g. Hikvision)
try:
ret = dt_util.parse_datetime(value)
except ValueError:
return None
if ret is not None:
return dt_util.as_local(ret)
return None
class EventManager:
"""ONVIF Event Manager."""
@@ -176,7 +195,10 @@ class EventManager:
# tns1:RuleEngine/CellMotionDetector/Motion
topic = msg.Topic._value_1.rstrip("/.") # noqa: SLF001
if not (parser := PARSERS.get(topic)):
try:
event = await onvif_parsers.parse(topic, unique_id, msg)
error = None
except onvif_parsers.errors.UnknownTopicError:
if topic not in UNHANDLED_TOPICS:
LOGGER.warning(
"%s: No registered handler for event from %s: %s",
@@ -186,10 +208,6 @@ class EventManager:
)
UNHANDLED_TOPICS.add(topic)
continue
try:
event = await parser(unique_id, msg)
error = None
except (AttributeError, KeyError) as e:
event = None
error = e
@@ -202,10 +220,26 @@ class EventManager:
error,
msg,
)
return
continue
self.get_uids_by_platform(event.platform).add(event.uid)
self._events[event.uid] = event
value = event.value
if event.device_class == "timestamp" and isinstance(value, str):
value = _local_datetime_or_none(value)
ha_event = Event(
uid=event.uid,
name=event.name,
platform=event.platform,
device_class=event.device_class,
unit_of_measurement=event.unit_of_measurement,
value=value,
entity_category=ENTITY_CATEGORY_MAPPING.get(
event.entity_category or ""
),
entity_enabled=event.entity_enabled,
)
self.get_uids_by_platform(ha_event.platform).add(ha_event.uid)
self._events[ha_event.uid] = ha_event
def get_uid(self, uid: str) -> Event | None:
"""Retrieve event for given id."""


@@ -13,5 +13,9 @@
"integration_type": "device",
"iot_class": "local_push",
"loggers": ["onvif", "wsdiscovery", "zeep"],
"requirements": ["onvif-zeep-async==4.0.4", "WSDiscovery==2.1.2"]
"requirements": [
"onvif-zeep-async==4.0.4",
"onvif_parsers==1.2.2",
"WSDiscovery==2.1.2"
]
}


@@ -1,755 +0,0 @@
"""ONVIF event parsers."""
from __future__ import annotations
from collections.abc import Callable, Coroutine
import dataclasses
import datetime
from typing import Any
from homeassistant.const import EntityCategory
from homeassistant.util import dt as dt_util
from homeassistant.util.decorator import Registry
from .models import Event
PARSERS: Registry[str, Callable[[str, Any], Coroutine[Any, Any, Event | None]]] = (
Registry()
)
VIDEO_SOURCE_MAPPING = {
"vsconf": "VideoSourceToken",
}
def extract_message(msg: Any) -> tuple[str, Any]:
"""Extract the message content and the topic."""
return msg.Topic._value_1, msg.Message._value_1 # noqa: SLF001
def _normalize_video_source(source: str) -> str:
"""Normalize video source.
Some cameras do not set the VideoSourceToken correctly so we get duplicate
sensors, so we need to normalize it to the correct value.
"""
return VIDEO_SOURCE_MAPPING.get(source, source)
def local_datetime_or_none(value: str) -> datetime.datetime | None:
"""Convert strings to datetimes, if invalid, return None."""
# To handle cameras that return times like '0000-00-00T00:00:00Z' (e.g. hikvision)
try:
ret = dt_util.parse_datetime(value)
except ValueError:
return None
if ret is not None:
return dt_util.as_local(ret)
return None
@PARSERS.register("tns1:VideoSource/MotionAlarm")
@PARSERS.register("tns1:Device/Trigger/tnshik:AlarmIn")
async def async_parse_motion_alarm(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:VideoSource/MotionAlarm
"""
topic, payload = extract_message(msg)
source = payload.Source.SimpleItem[0].Value
return Event(
f"{uid}_{topic}_{source}",
"Motion Alarm",
"binary_sensor",
"motion",
None,
payload.Data.SimpleItem[0].Value == "true",
)
@PARSERS.register("tns1:VideoSource/ImageTooBlurry/AnalyticsService")
@PARSERS.register("tns1:VideoSource/ImageTooBlurry/ImagingService")
@PARSERS.register("tns1:VideoSource/ImageTooBlurry/RecordingService")
async def async_parse_image_too_blurry(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:VideoSource/ImageTooBlurry/*
"""
topic, payload = extract_message(msg)
source = payload.Source.SimpleItem[0].Value
return Event(
f"{uid}_{topic}_{source}",
"Image Too Blurry",
"binary_sensor",
"problem",
None,
payload.Data.SimpleItem[0].Value == "true",
EntityCategory.DIAGNOSTIC,
)
@PARSERS.register("tns1:VideoSource/ImageTooDark/AnalyticsService")
@PARSERS.register("tns1:VideoSource/ImageTooDark/ImagingService")
@PARSERS.register("tns1:VideoSource/ImageTooDark/RecordingService")
async def async_parse_image_too_dark(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:VideoSource/ImageTooDark/*
"""
topic, payload = extract_message(msg)
source = payload.Source.SimpleItem[0].Value
return Event(
f"{uid}_{topic}_{source}",
"Image Too Dark",
"binary_sensor",
"problem",
None,
payload.Data.SimpleItem[0].Value == "true",
EntityCategory.DIAGNOSTIC,
)
@PARSERS.register("tns1:VideoSource/ImageTooBright/AnalyticsService")
@PARSERS.register("tns1:VideoSource/ImageTooBright/ImagingService")
@PARSERS.register("tns1:VideoSource/ImageTooBright/RecordingService")
async def async_parse_image_too_bright(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:VideoSource/ImageTooBright/*
"""
topic, payload = extract_message(msg)
source = payload.Source.SimpleItem[0].Value
return Event(
f"{uid}_{topic}_{source}",
"Image Too Bright",
"binary_sensor",
"problem",
None,
payload.Data.SimpleItem[0].Value == "true",
EntityCategory.DIAGNOSTIC,
)
@PARSERS.register("tns1:VideoSource/GlobalSceneChange/AnalyticsService")
@PARSERS.register("tns1:VideoSource/GlobalSceneChange/ImagingService")
@PARSERS.register("tns1:VideoSource/GlobalSceneChange/RecordingService")
async def async_parse_scene_change(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:VideoSource/GlobalSceneChange/*
"""
topic, payload = extract_message(msg)
source = payload.Source.SimpleItem[0].Value
return Event(
f"{uid}_{topic}_{source}",
"Global Scene Change",
"binary_sensor",
"problem",
None,
payload.Data.SimpleItem[0].Value == "true",
)
@PARSERS.register("tns1:AudioAnalytics/Audio/DetectedSound")
async def async_parse_detected_sound(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:AudioAnalytics/Audio/DetectedSound
"""
audio_source = ""
audio_analytics = ""
rule = ""
topic, payload = extract_message(msg)
for source in payload.Source.SimpleItem:
if source.Name == "AudioSourceConfigurationToken":
audio_source = source.Value
if source.Name == "AudioAnalyticsConfigurationToken":
audio_analytics = source.Value
if source.Name == "Rule":
rule = source.Value
return Event(
f"{uid}_{topic}_{audio_source}_{audio_analytics}_{rule}",
"Detected Sound",
"binary_sensor",
"sound",
None,
payload.Data.SimpleItem[0].Value == "true",
)
@PARSERS.register("tns1:RuleEngine/FieldDetector/ObjectsInside")
async def async_parse_field_detector(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:RuleEngine/FieldDetector/ObjectsInside
"""
video_source = ""
video_analytics = ""
rule = ""
topic, payload = extract_message(msg)
for source in payload.Source.SimpleItem:
if source.Name == "VideoSourceConfigurationToken":
video_source = _normalize_video_source(source.Value)
if source.Name == "VideoAnalyticsConfigurationToken":
video_analytics = source.Value
if source.Name == "Rule":
rule = source.Value
return Event(
f"{uid}_{topic}_{video_source}_{video_analytics}_{rule}",
"Field Detection",
"binary_sensor",
"motion",
None,
payload.Data.SimpleItem[0].Value == "true",
)
@PARSERS.register("tns1:RuleEngine/CellMotionDetector/Motion")
async def async_parse_cell_motion_detector(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:RuleEngine/CellMotionDetector/Motion
"""
video_source = ""
video_analytics = ""
rule = ""
topic, payload = extract_message(msg)
for source in payload.Source.SimpleItem:
if source.Name == "VideoSourceConfigurationToken":
video_source = _normalize_video_source(source.Value)
if source.Name == "VideoAnalyticsConfigurationToken":
video_analytics = source.Value
if source.Name == "Rule":
rule = source.Value
return Event(
f"{uid}_{topic}_{video_source}_{video_analytics}_{rule}",
"Cell Motion Detection",
"binary_sensor",
"motion",
None,
payload.Data.SimpleItem[0].Value == "true",
)
@PARSERS.register("tns1:RuleEngine/MotionRegionDetector/Motion")
async def async_parse_motion_region_detector(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:RuleEngine/MotionRegionDetector/Motion
"""
video_source = ""
video_analytics = ""
rule = ""
topic, payload = extract_message(msg)
for source in payload.Source.SimpleItem:
if source.Name == "VideoSourceConfigurationToken":
video_source = _normalize_video_source(source.Value)
if source.Name == "VideoAnalyticsConfigurationToken":
video_analytics = source.Value
if source.Name == "Rule":
rule = source.Value
return Event(
f"{uid}_{topic}_{video_source}_{video_analytics}_{rule}",
"Motion Region Detection",
"binary_sensor",
"motion",
None,
payload.Data.SimpleItem[0].Value in ["1", "true"],
)
@PARSERS.register("tns1:RuleEngine/TamperDetector/Tamper")
async def async_parse_tamper_detector(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:RuleEngine/TamperDetector/Tamper
"""
video_source = ""
video_analytics = ""
rule = ""
topic, payload = extract_message(msg)
for source in payload.Source.SimpleItem:
if source.Name == "VideoSourceConfigurationToken":
video_source = _normalize_video_source(source.Value)
if source.Name == "VideoAnalyticsConfigurationToken":
video_analytics = source.Value
if source.Name == "Rule":
rule = source.Value
return Event(
f"{uid}_{topic}_{video_source}_{video_analytics}_{rule}",
"Tamper Detection",
"binary_sensor",
"problem",
None,
payload.Data.SimpleItem[0].Value == "true",
EntityCategory.DIAGNOSTIC,
)
@PARSERS.register("tns1:RuleEngine/MyRuleDetector/DogCatDetect")
async def async_parse_dog_cat_detector(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:RuleEngine/MyRuleDetector/DogCatDetect
"""
video_source = ""
topic, payload = extract_message(msg)
for source in payload.Source.SimpleItem:
if source.Name == "Source":
video_source = _normalize_video_source(source.Value)
return Event(
f"{uid}_{topic}_{video_source}",
"Pet Detection",
"binary_sensor",
"motion",
None,
payload.Data.SimpleItem[0].Value == "true",
)
@PARSERS.register("tns1:RuleEngine/MyRuleDetector/VehicleDetect")
async def async_parse_vehicle_detector(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:RuleEngine/MyRuleDetector/VehicleDetect
"""
video_source = ""
topic, payload = extract_message(msg)
for source in payload.Source.SimpleItem:
if source.Name == "Source":
video_source = _normalize_video_source(source.Value)
return Event(
f"{uid}_{topic}_{video_source}",
"Vehicle Detection",
"binary_sensor",
"motion",
None,
payload.Data.SimpleItem[0].Value == "true",
)
_TAPO_EVENT_TEMPLATES: dict[str, Event] = {
"IsVehicle": Event(
uid="",
name="Vehicle Detection",
platform="binary_sensor",
device_class="motion",
),
"IsPeople": Event(
uid="", name="Person Detection", platform="binary_sensor", device_class="motion"
),
"IsPet": Event(
uid="", name="Pet Detection", platform="binary_sensor", device_class="motion"
),
"IsLineCross": Event(
uid="",
name="Line Detector Crossed",
platform="binary_sensor",
device_class="motion",
),
"IsTamper": Event(
uid="", name="Tamper Detection", platform="binary_sensor", device_class="tamper"
),
"IsIntrusion": Event(
uid="",
name="Intrusion Detection",
platform="binary_sensor",
device_class="safety",
),
}
@PARSERS.register("tns1:RuleEngine/CellMotionDetector/Intrusion")
@PARSERS.register("tns1:RuleEngine/CellMotionDetector/LineCross")
@PARSERS.register("tns1:RuleEngine/CellMotionDetector/People")
@PARSERS.register("tns1:RuleEngine/CellMotionDetector/Tamper")
@PARSERS.register("tns1:RuleEngine/CellMotionDetector/TpSmartEvent")
@PARSERS.register("tns1:RuleEngine/PeopleDetector/People")
@PARSERS.register("tns1:RuleEngine/TPSmartEventDetector/TPSmartEvent")
async def async_parse_tplink_detector(uid: str, msg) -> Event | None:
"""Handle parsing tplink smart event messages.
Topic: tns1:RuleEngine/CellMotionDetector/Intrusion
Topic: tns1:RuleEngine/CellMotionDetector/LineCross
Topic: tns1:RuleEngine/CellMotionDetector/People
Topic: tns1:RuleEngine/CellMotionDetector/Tamper
Topic: tns1:RuleEngine/CellMotionDetector/TpSmartEvent
Topic: tns1:RuleEngine/PeopleDetector/People
Topic: tns1:RuleEngine/TPSmartEventDetector/TPSmartEvent
"""
video_source = ""
video_analytics = ""
rule = ""
topic, payload = extract_message(msg)
for source in payload.Source.SimpleItem:
if source.Name == "VideoSourceConfigurationToken":
video_source = _normalize_video_source(source.Value)
if source.Name == "VideoAnalyticsConfigurationToken":
video_analytics = source.Value
if source.Name == "Rule":
rule = source.Value
for item in payload.Data.SimpleItem:
event_template = _TAPO_EVENT_TEMPLATES.get(item.Name)
if event_template is None:
continue
return dataclasses.replace(
event_template,
uid=f"{uid}_{topic}_{video_source}_{video_analytics}_{rule}",
value=item.Value == "true",
)
return None
@PARSERS.register("tns1:RuleEngine/MyRuleDetector/PeopleDetect")
async def async_parse_person_detector(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:RuleEngine/MyRuleDetector/PeopleDetect
"""
video_source = ""
topic, payload = extract_message(msg)
for source in payload.Source.SimpleItem:
if source.Name == "Source":
video_source = _normalize_video_source(source.Value)
return Event(
f"{uid}_{topic}_{video_source}",
"Person Detection",
"binary_sensor",
"motion",
None,
payload.Data.SimpleItem[0].Value == "true",
)
@PARSERS.register("tns1:RuleEngine/MyRuleDetector/FaceDetect")
async def async_parse_face_detector(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:RuleEngine/MyRuleDetector/FaceDetect
"""
video_source = ""
topic, payload = extract_message(msg)
for source in payload.Source.SimpleItem:
if source.Name == "Source":
video_source = _normalize_video_source(source.Value)
return Event(
f"{uid}_{topic}_{video_source}",
"Face Detection",
"binary_sensor",
"motion",
None,
payload.Data.SimpleItem[0].Value == "true",
)
@PARSERS.register("tns1:RuleEngine/MyRuleDetector/Visitor")
async def async_parse_visitor_detector(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:RuleEngine/MyRuleDetector/Visitor
"""
video_source = ""
topic, payload = extract_message(msg)
for source in payload.Source.SimpleItem:
if source.Name == "Source":
video_source = _normalize_video_source(source.Value)
return Event(
f"{uid}_{topic}_{video_source}",
"Visitor Detection",
"binary_sensor",
"occupancy",
None,
payload.Data.SimpleItem[0].Value == "true",
)
@PARSERS.register("tns1:RuleEngine/MyRuleDetector/Package")
async def async_parse_package_detector(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:RuleEngine/MyRuleDetector/Package
"""
video_source = ""
topic, payload = extract_message(msg)
for source in payload.Source.SimpleItem:
if source.Name == "Source":
video_source = _normalize_video_source(source.Value)
return Event(
f"{uid}_{topic}_{video_source}",
"Package Detection",
"binary_sensor",
"occupancy",
None,
payload.Data.SimpleItem[0].Value == "true",
)
@PARSERS.register("tns1:Device/Trigger/DigitalInput")
async def async_parse_digital_input(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:Device/Trigger/DigitalInput
"""
topic, payload = extract_message(msg)
source = payload.Source.SimpleItem[0].Value
return Event(
f"{uid}_{topic}_{source}",
"Digital Input",
"binary_sensor",
None,
None,
payload.Data.SimpleItem[0].Value == "true",
)
@PARSERS.register("tns1:Device/Trigger/Relay")
async def async_parse_relay(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:Device/Trigger/Relay
"""
topic, payload = extract_message(msg)
source = payload.Source.SimpleItem[0].Value
return Event(
f"{uid}_{topic}_{source}",
"Relay Triggered",
"binary_sensor",
None,
None,
payload.Data.SimpleItem[0].Value == "active",
)
@PARSERS.register("tns1:Device/HardwareFailure/StorageFailure")
async def async_parse_storage_failure(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:Device/HardwareFailure/StorageFailure
"""
topic, payload = extract_message(msg)
source = payload.Source.SimpleItem[0].Value
return Event(
f"{uid}_{topic}_{source}",
"Storage Failure",
"binary_sensor",
"problem",
None,
payload.Data.SimpleItem[0].Value == "true",
EntityCategory.DIAGNOSTIC,
)
@PARSERS.register("tns1:Monitoring/ProcessorUsage")
async def async_parse_processor_usage(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:Monitoring/ProcessorUsage
"""
topic, payload = extract_message(msg)
usage = float(payload.Data.SimpleItem[0].Value)
if usage <= 1:
usage *= 100
return Event(
f"{uid}_{topic}",
"Processor Usage",
"sensor",
None,
"percent",
int(usage),
EntityCategory.DIAGNOSTIC,
)
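The processor-usage parser scales the raw value because some cameras report usage as a 0–1 fraction and others as a 0–100 percentage. A sketch of just that normalization (the function name here is hypothetical; the real logic is inline in the parser above):

```python
def normalize_usage(raw: str) -> int:
    """Return processor usage as an integer percentage, whether the
    camera reported a fraction (0.42) or a percentage (42)."""
    usage = float(raw)
    if usage <= 1:
        # Values at or below 1 are assumed to be fractions of 1.0.
        usage *= 100
    return int(usage)


assert normalize_usage("0.42") == 42
assert normalize_usage("42") == 42
```

Note the heuristic's edge case: a camera genuinely reporting 1% usage as `"1"` is scaled to 100, which the original inline code accepts as a trade-off.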
@PARSERS.register("tns1:Monitoring/OperatingTime/LastReboot")
async def async_parse_last_reboot(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:Monitoring/OperatingTime/LastReboot
"""
topic, payload = extract_message(msg)
date_time = local_datetime_or_none(payload.Data.SimpleItem[0].Value)
return Event(
f"{uid}_{topic}",
"Last Reboot",
"sensor",
"timestamp",
None,
date_time,
EntityCategory.DIAGNOSTIC,
)
@PARSERS.register("tns1:Monitoring/OperatingTime/LastReset")
async def async_parse_last_reset(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:Monitoring/OperatingTime/LastReset
"""
topic, payload = extract_message(msg)
date_time = local_datetime_or_none(payload.Data.SimpleItem[0].Value)
return Event(
f"{uid}_{topic}",
"Last Reset",
"sensor",
"timestamp",
None,
date_time,
EntityCategory.DIAGNOSTIC,
entity_enabled=False,
)
@PARSERS.register("tns1:Monitoring/Backup/Last")
async def async_parse_backup_last(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:Monitoring/Backup/Last
"""
topic, payload = extract_message(msg)
date_time = local_datetime_or_none(payload.Data.SimpleItem[0].Value)
return Event(
f"{uid}_{topic}",
"Last Backup",
"sensor",
"timestamp",
None,
date_time,
EntityCategory.DIAGNOSTIC,
entity_enabled=False,
)
@PARSERS.register("tns1:Monitoring/OperatingTime/LastClockSynchronization")
async def async_parse_last_clock_sync(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:Monitoring/OperatingTime/LastClockSynchronization
"""
topic, payload = extract_message(msg)
date_time = local_datetime_or_none(payload.Data.SimpleItem[0].Value)
return Event(
f"{uid}_{topic}",
"Last Clock Synchronization",
"sensor",
"timestamp",
None,
date_time,
EntityCategory.DIAGNOSTIC,
entity_enabled=False,
)
@PARSERS.register("tns1:RecordingConfig/JobState")
async def async_parse_jobstate(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:RecordingConfig/JobState
"""
topic, payload = extract_message(msg)
source = payload.Source.SimpleItem[0].Value
return Event(
f"{uid}_{topic}_{source}",
"Recording Job State",
"binary_sensor",
None,
None,
payload.Data.SimpleItem[0].Value == "Active",
EntityCategory.DIAGNOSTIC,
)
@PARSERS.register("tns1:RuleEngine/LineDetector/Crossed")
async def async_parse_linedetector_crossed(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:RuleEngine/LineDetector/Crossed
"""
video_source = ""
video_analytics = ""
rule = ""
topic, payload = extract_message(msg)
for source in payload.Source.SimpleItem:
if source.Name == "VideoSourceConfigurationToken":
video_source = source.Value
if source.Name == "VideoAnalyticsConfigurationToken":
video_analytics = source.Value
if source.Name == "Rule":
rule = source.Value
return Event(
f"{uid}_{topic}_{video_source}_{video_analytics}_{rule}",
"Line Detector Crossed",
"sensor",
None,
None,
payload.Data.SimpleItem[0].Value,
EntityCategory.DIAGNOSTIC,
)
@PARSERS.register("tns1:RuleEngine/CountAggregation/Counter")
async def async_parse_count_aggregation_counter(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:RuleEngine/CountAggregation/Counter
"""
video_source = ""
video_analytics = ""
rule = ""
topic, payload = extract_message(msg)
for source in payload.Source.SimpleItem:
if source.Name == "VideoSourceConfigurationToken":
video_source = _normalize_video_source(source.Value)
if source.Name == "VideoAnalyticsConfigurationToken":
video_analytics = source.Value
if source.Name == "Rule":
rule = source.Value
return Event(
f"{uid}_{topic}_{video_source}_{video_analytics}_{rule}",
"Count Aggregation Counter",
"sensor",
None,
None,
payload.Data.SimpleItem[0].Value,
EntityCategory.DIAGNOSTIC,
)
@PARSERS.register("tns1:UserAlarm/IVA/HumanShapeDetect")
async def async_parse_human_shape_detect(uid: str, msg) -> Event | None:
"""Handle parsing event message.
Topic: tns1:UserAlarm/IVA/HumanShapeDetect
"""
topic, payload = extract_message(msg)
video_source = ""
for source in payload.Source.SimpleItem:
if source.Name == "VideoSourceConfigurationToken":
video_source = _normalize_video_source(source.Value)
break
return Event(
f"{uid}_{topic}_{video_source}",
"Human Shape Detect",
"binary_sensor",
"motion",
None,
payload.Data.SimpleItem[0].Value == "true",
)


@@ -9,6 +9,6 @@
"iot_class": "local_push",
"loggers": ["openevsehttp"],
"quality_scale": "bronze",
"requirements": ["python-openevse-http==0.2.1"],
"requirements": ["python-openevse-http==0.2.5"],
"zeroconf": ["_openevse._tcp.local."]
}


@@ -8,5 +8,5 @@
"iot_class": "cloud_polling",
"loggers": ["renault_api"],
"quality_scale": "silver",
"requirements": ["renault-api==0.5.5"]
"requirements": ["renault-api==0.5.6"]
}


@@ -143,8 +143,13 @@ async def async_setup_entry(
min_timeout = host.api.timeout * (RETRY_ATTEMPTS + 2)
update_timeout = max(min_timeout, min_timeout * host.api.num_cameras / 10)
# Track firmware versions to detect external updates (e.g., via Reolink app)
last_known_firmware: dict[int | None, str | None] = {}
async def async_device_config_update() -> None:
"""Update the host state cache and renew the ONVIF-subscription."""
nonlocal last_known_firmware
async with asyncio.timeout(update_timeout):
try:
await host.update_states()
@@ -162,6 +167,23 @@ async def async_setup_entry(
host.credential_errors = 0
# Check for firmware version changes (external update detection)
firmware_changed = False
for ch in (*host.api.channels, None):
new_version = host.api.camera_sw_version(ch)
old_version = last_known_firmware.get(ch)
if (
old_version is not None
and new_version is not None
and new_version != old_version
):
firmware_changed = True
last_known_firmware[ch] = new_version
# Notify firmware coordinator if firmware changed externally
if firmware_changed and firmware_coordinator is not None:
firmware_coordinator.async_set_updated_data(None)
async with asyncio.timeout(min_timeout):
await host.renew()


@@ -159,6 +159,15 @@ class ReolinkFlowHandler(ConfigFlow, domain=DOMAIN):
"""Handle discovery via dhcp."""
mac_address = format_mac(discovery_info.macaddress)
existing_entry = await self.async_set_unique_id(mac_address)
if existing_entry and CONF_HOST not in existing_entry.data:
_LOGGER.debug(
"Reolink DHCP discovered device with MAC '%s' and IP '%s', "
"but existing config entry does not have host, ignoring",
mac_address,
discovery_info.ip,
)
raise AbortFlow("already_configured")
if (
existing_entry
and CONF_PASSWORD in existing_entry.data
