Compare commits


25 Commits

Author SHA1 Message Date
Simone Chemelli
5dad64e54c Bump aioamazondevices to 13.0.0 (#164618) 2026-03-03 22:16:07 +00:00
Robert Resch
c311ff0464 Fix wheels building by using arch dependent requirements_all file (#164675) 2026-03-03 21:55:59 +01:00
Dave T
c45675a01f Add additional diagnostic sensors to aurora_abb_powerone PV inverter (#164622) 2026-03-03 21:34:44 +01:00
erikbadman
9d92141812 Add support for active power limit in Kostal Plenticore (#164674) 2026-03-03 21:33:54 +01:00
Robin Lintermann
501b973a98 Add send diagnostics button to smarla (#164335) 2026-03-03 21:31:31 +01:00
Kamil Breguła
fd4d8137da Change reconfiguration-flow status to 'todo' in WebDAV (#164637)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-03-03 21:23:24 +01:00
Miguel Angel Nubla
33881c1912 Fix infinite loop in esphome assist_satellite (#163097)
Co-authored-by: Artur Pragacz <artur@pragacz.com>
2026-03-03 20:44:36 +01:00
Robin Lintermann
9bdb03dbe8 Set device classes and measurement units for Smarla (#164682) 2026-03-03 18:36:02 +00:00
epenet
d2178ba458 Cleanup deprecated tuya entities (#164657) 2026-03-03 19:31:09 +01:00
Abílio Costa
06cdf3c5d2 Add PR review Claude skill (#164626) 2026-03-03 18:21:51 +00:00
r2xj
84c994ab80 Add support for samsungce.lamp as light entity and when not under main component (#164448)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Joostlek <joostlek@outlook.com>
2026-03-03 18:29:36 +01:00
Abílio Costa
1d5913d7a5 Simplify copilot-instructions.md script to use file refs (#164686) 2026-03-03 17:17:25 +00:00
epenet
05acba37c7 Remove deprecated YAML import from nederlandse_spoorwegen (#164662) 2026-03-03 17:59:29 +01:00
Samuel Xiao
7496406156 Bumb switchbot api to v2.11.0 (#164663) 2026-03-03 17:59:03 +01:00
epenet
543f2b1396 Improve type hints in meteoclimatic (#164651) 2026-03-03 17:57:54 +01:00
epenet
3df2bbda80 Bump tuya-device-handlers to 0.0.11 (#164586) 2026-03-03 17:57:36 +01:00
epenet
b661d37a86 Move mutesync coordinator to separate module (#164600) 2026-03-03 17:57:11 +01:00
Ariel Ebersberger
2102babc6d Influxdb repair issue follow up (#164684) 2026-03-03 17:57:09 +01:00
epenet
f3a1cab582 Migrate motionblinds_ble to runtime_data (#164601) 2026-03-03 17:56:54 +01:00
epenet
03c9ce25c8 Simplify access to motioneye client (#164599) 2026-03-03 17:56:16 +01:00
Christian Lackas
8fcabcec16 Fix HomematicIP heating group availability with unreachable members (#162571) 2026-03-03 17:34:14 +01:00
Michael Hansen
2a33096074 Bump intents to 2026.3.3 (#164676) 2026-03-03 17:26:44 +01:00
Ariel Ebersberger
14a9eada09 Add repair issue after importing influxdb yaml config (#164145)
Co-authored-by: Norbert Rittel <norbert@rittel.de>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-03-03 16:33:25 +01:00
tobiaswaldvogel
4a00f78e90 Add missing cover entity features to motion_blinds (#164673)
Signed-off-by: Tobias Waldvogel <tobias.waldvogel@gmail.com>
2026-03-03 16:30:55 +01:00
starkillerOG
abef46864e Fix key error in Reolink DHCP if still setting up (#164619) 2026-03-03 16:12:30 +01:00
103 changed files with 2009 additions and 2125 deletions


@@ -0,0 +1,46 @@
---
name: github-pr-reviewer
description: Review a GitHub pull request and provide feedback comments. Use when the user says "review the current PR" or asks to review a specific PR.
---
# Review GitHub Pull Request
## Preparation:
- Check if the local commit matches the last one in the PR. If not, check out the PR locally using 'gh pr checkout'.
- CRITICAL: If 'gh pr checkout' fails for ANY reason, you MUST immediately STOP.
- Do NOT attempt any workarounds.
- Do NOT proceed with the review.
- ALERT about the failure and WAIT for instructions.
- This is a hard requirement - no exceptions.
## Follow these steps:
1. Use 'gh pr view' to get the PR details and description.
2. Use 'gh pr diff' to see all the changes in the PR.
3. Analyze the code changes for:
- Code quality and style consistency
- Potential bugs or issues
- Performance implications
- Security concerns
- Test coverage
- Documentation updates if needed
4. Ensure any existing review comments have been addressed.
5. Generate constructive review comments in the CONSOLE. DO NOT POST TO GITHUB YOURSELF.
## IMPORTANT:
- Just review. DO NOT make any changes
- Be constructive and specific in your comments
- Suggest improvements where appropriate
- Only provide review feedback in the CONSOLE. DO NOT ACT ON GITHUB.
- No need to run tests or linters; just review the code changes.
- No need to highlight things that are already good.
## Output format:
- List specific comments for each file/line that needs attention
- In the end, summarize with an overall assessment (approve, request changes, or comment) and bullet point list of changes suggested, if any.
- Example output:
```
Overall assessment: request changes.
- [CRITICAL] Memory leak in homeassistant/components/sensor/my_sensor.py:143
- [PROBLEM] Inefficient algorithm in homeassistant/helpers/data_processing.py:87
- [SUGGESTION] Improve variable naming in homeassistant/helpers/config_validation.py:45
```


@@ -331,864 +331,6 @@ class MyCoordinator(DataUpdateCoordinator[MyData]):
# Skill: Home Assistant Integration knowledge
# Skills
### File Locations
- **Integration code**: `./homeassistant/components/<integration_domain>/`
- **Integration tests**: `./tests/components/<integration_domain>/`
## Integration Templates
### Standard Integration Structure
```
homeassistant/components/my_integration/
├── __init__.py # Entry point with async_setup_entry
├── manifest.json # Integration metadata and dependencies
├── const.py # Domain and constants
├── config_flow.py # UI configuration flow
├── coordinator.py # Data update coordinator (if needed)
├── entity.py # Base entity class (if shared patterns)
├── sensor.py # Sensor platform
├── strings.json # User-facing text and translations
├── services.yaml # Service definitions (if applicable)
└── quality_scale.yaml # Quality scale rule status
```
An integration can have platforms as needed (`sensor.py`, `switch.py`, etc.). The following platforms have extra guidelines:
- **Diagnostics**: [`platform-diagnostics.md`](platform-diagnostics.md) for diagnostic data collection
<REFERENCE platform-diagnostics.md>
# Integration Diagnostics
Platform exists as `homeassistant/components/<domain>/diagnostics.py`.
- **Required**: Implement diagnostic data collection
- **Implementation**:
```python
TO_REDACT = [CONF_API_KEY, CONF_LATITUDE, CONF_LONGITUDE]

async def async_get_config_entry_diagnostics(
    hass: HomeAssistant, entry: MyConfigEntry
) -> dict[str, Any]:
    """Return diagnostics for a config entry."""
    return {
        "entry_data": async_redact_data(entry.data, TO_REDACT),
        "data": entry.runtime_data.data,
    }
```
- **Security**: Never expose passwords, tokens, or sensitive coordinates
<END REFERENCE platform-diagnostics.md>
- **Repairs**: [`platform-repairs.md`](platform-repairs.md) for user-actionable repair issues
<REFERENCE platform-repairs.md>
# Repairs platform
Platform exists as `homeassistant/components/<domain>/repairs.py`.
- **Actionable Issues Required**: All repair issues must be actionable for end users
- **Issue Content Requirements**:
- Clearly explain what is happening
- Provide specific steps users need to take to resolve the issue
- Use friendly, helpful language
- Include relevant context (device names, error details, etc.)
- **Implementation**:
```python
ir.async_create_issue(
    hass,
    DOMAIN,
    "outdated_version",
    is_fixable=False,
    issue_domain=DOMAIN,
    severity=ir.IssueSeverity.ERROR,
    translation_key="outdated_version",
)
```
- **Translation Strings Requirements**: Must contain user-actionable text in `strings.json`:
```json
{
  "issues": {
    "outdated_version": {
      "title": "Device firmware is outdated",
      "description": "Your device firmware version {current_version} is below the minimum required version {min_version}. To fix this issue: 1) Open the manufacturer's mobile app, 2) Navigate to device settings, 3) Select 'Update Firmware', 4) Wait for the update to complete, then 5) Restart Home Assistant."
    }
  }
}
```
- **String Content Must Include**:
- What the problem is
- Why it matters
- Exact steps to resolve (numbered list when multiple steps)
- What to expect after following the steps
- **Avoid Vague Instructions**: Don't just say "update firmware" - provide specific steps
- **Severity Guidelines**:
- `CRITICAL`: Reserved for extreme scenarios only
- `ERROR`: Requires immediate user attention
- `WARNING`: Indicates future potential breakage
- **Additional Attributes**:
```python
ir.async_create_issue(
    hass, DOMAIN, "issue_id",
    breaks_in_ha_version="2024.1.0",
    is_fixable=True,
    is_persistent=True,
    severity=ir.IssueSeverity.ERROR,
    translation_key="issue_description",
)
```
- Only create issues for problems users can potentially resolve
<END REFERENCE platform-repairs.md>
### Minimal Integration Checklist
- [ ] `manifest.json` with required fields (domain, name, codeowners, etc.)
- [ ] `__init__.py` with `async_setup_entry` and `async_unload_entry`
- [ ] `config_flow.py` with UI configuration support
- [ ] `const.py` with `DOMAIN` constant
- [ ] `strings.json` with at least config flow text
- [ ] Platform files (`sensor.py`, etc.) as needed
- [ ] `quality_scale.yaml` with rule status tracking
## Integration Quality Scale
Home Assistant uses an Integration Quality Scale to ensure code quality and consistency. The quality level determines which rules apply:
### Quality Scale Levels
- **Bronze**: Basic requirements (ALL Bronze rules are mandatory)
- **Silver**: Enhanced functionality
- **Gold**: Advanced features
- **Platinum**: Highest quality standards
### Quality Scale Progression
- **Bronze → Silver**: Add entity unavailability, parallel updates, auth flows
- **Silver → Gold**: Add device management, diagnostics, translations
- **Gold → Platinum**: Add strict typing, async dependencies, websession injection
### How Rules Apply
1. **Check `manifest.json`**: Look for `"quality_scale"` key to determine integration level
2. **Bronze Rules**: Always required for any integration with quality scale
3. **Higher Tier Rules**: Only apply if integration targets that tier or higher
4. **Rule Status**: Check `quality_scale.yaml` in integration folder for:
- `done`: Rule implemented
- `exempt`: Rule doesn't apply (with reason in comment)
- `todo`: Rule needs implementation
### Example `quality_scale.yaml` Structure
```yaml
rules:
  # Bronze (mandatory)
  config-flow: done
  entity-unique-id: done
  action-setup:
    status: exempt
    comment: Integration does not register custom actions.
  # Silver (if targeting Silver+)
  entity-unavailable: done
  parallel-updates: done
  # Gold (if targeting Gold+)
  devices: done
  diagnostics: done
  # Platinum (if targeting Platinum)
  strict-typing: done
```
**When Reviewing/Creating Code**: Always check the integration's quality scale level and exemption status before applying rules.
## Code Organization
### Core Locations
- Shared constants: `homeassistant/const.py` (use these instead of hardcoding)
- Integration structure:
- `homeassistant/components/{domain}/const.py` - Constants
- `homeassistant/components/{domain}/models.py` - Data models
- `homeassistant/components/{domain}/coordinator.py` - Update coordinator
- `homeassistant/components/{domain}/config_flow.py` - Configuration flow
- `homeassistant/components/{domain}/{platform}.py` - Platform implementations
### Common Modules
- **coordinator.py**: Centralize data fetching logic
```python
class MyCoordinator(DataUpdateCoordinator[MyData]):
    def __init__(self, hass: HomeAssistant, client: MyClient, config_entry: ConfigEntry) -> None:
        super().__init__(
            hass,
            logger=LOGGER,
            name=DOMAIN,
            update_interval=timedelta(minutes=1),
            config_entry=config_entry,  # ✅ Pass config_entry - it's accepted and recommended
        )
```
- **entity.py**: Base entity definitions to reduce duplication
```python
class MyEntity(CoordinatorEntity[MyCoordinator]):
    _attr_has_entity_name = True
```
### Runtime Data Storage
- **Use ConfigEntry.runtime_data**: Store non-persistent runtime data
```python
type MyIntegrationConfigEntry = ConfigEntry[MyClient]

async def async_setup_entry(hass: HomeAssistant, entry: MyIntegrationConfigEntry) -> bool:
    client = MyClient(entry.data[CONF_HOST])
    entry.runtime_data = client
```
### Manifest Requirements
- **Required Fields**: `domain`, `name`, `codeowners`, `integration_type`, `documentation`, `requirements`
- **Integration Types**: `device`, `hub`, `service`, `system`, `helper`
- **IoT Class**: Always specify connectivity method (e.g., `cloud_polling`, `local_polling`, `local_push`)
- **Discovery Methods**: Add when applicable: `zeroconf`, `dhcp`, `bluetooth`, `ssdp`, `usb`
- **Dependencies**: Include platform dependencies (e.g., `application_credentials`, `bluetooth_adapters`)
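A minimal `manifest.json` combining the fields above (a hedged sketch — the domain, documentation URL, and requirement pin are illustrative):

```json
{
  "domain": "my_integration",
  "name": "My Integration",
  "codeowners": ["@me"],
  "config_flow": true,
  "documentation": "https://www.home-assistant.io/integrations/my_integration",
  "integration_type": "hub",
  "iot_class": "local_polling",
  "requirements": ["my-client-lib==1.0.0"]
}
```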
### Config Flow Patterns
- **Version Control**: Always set `VERSION = 1` and `MINOR_VERSION = 1`
- **Unique ID Management**:
```python
await self.async_set_unique_id(device_unique_id)
self._abort_if_unique_id_configured()
```
- **Error Handling**: Define errors in `strings.json` under `config.error`
- **Step Methods**: Use standard naming (`async_step_user`, `async_step_discovery`, etc.)
### Integration Ownership
- **manifest.json**: Add GitHub usernames to `codeowners`:
```json
{
  "domain": "my_integration",
  "name": "My Integration",
  "codeowners": ["@me"]
}
```
### Async Dependencies (Platinum)
- **Requirement**: All dependencies must use asyncio
- Ensures efficient task handling without thread context switching
### WebSession Injection (Platinum)
- **Pass WebSession**: Support passing web sessions to dependencies
```python
async def async_setup_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
    """Set up integration from config entry."""
    client = MyClient(entry.data[CONF_HOST], async_get_clientsession(hass))
```
- For cookies: Use `async_create_clientsession` (aiohttp) or `create_async_httpx_client` (httpx)
### Data Update Coordinator
- **Standard Pattern**: Use for efficient data management
```python
class MyCoordinator(DataUpdateCoordinator):
    def __init__(self, hass: HomeAssistant, client: MyClient, config_entry: ConfigEntry) -> None:
        super().__init__(
            hass,
            logger=LOGGER,
            name=DOMAIN,
            update_interval=timedelta(minutes=5),
            config_entry=config_entry,  # ✅ Pass config_entry - it's accepted and recommended
        )
        self.client = client

    async def _async_update_data(self):
        try:
            return await self.client.fetch_data()
        except ApiError as err:
            raise UpdateFailed(f"API communication error: {err}") from err
```
- **Error Types**: Use `UpdateFailed` for API errors, `ConfigEntryAuthFailed` for auth issues
- **Config Entry**: Always pass `config_entry` parameter to coordinator - it's accepted and recommended
## Integration Guidelines
### Configuration Flow
- **UI Setup Required**: All integrations must support configuration via UI
- **Manifest**: Set `"config_flow": true` in `manifest.json`
- **Data Storage**:
- Connection-critical config: Store in `ConfigEntry.data`
- Non-critical settings: Store in `ConfigEntry.options`
- **Validation**: Always validate user input before creating entries
- **Config Entry Naming**:
- ❌ Do NOT allow users to set config entry names in config flows
- Names are automatically generated or can be customized later in UI
- ✅ Exception: Helper integrations MAY allow custom names in config flow
- **Connection Testing**: Test device/service connection during config flow:
```python
try:
    await client.get_data()
except MyException:
    errors["base"] = "cannot_connect"
```
- **Duplicate Prevention**: Prevent duplicate configurations:
```python
# Using unique ID
await self.async_set_unique_id(identifier)
self._abort_if_unique_id_configured()
# Using unique data
self._async_abort_entries_match({CONF_HOST: user_input[CONF_HOST]})
```
### Reauthentication Support
- **Required Method**: Implement `async_step_reauth` in config flow
- **Credential Updates**: Allow users to update credentials without re-adding
- **Validation**: Verify account matches existing unique ID:
```python
await self.async_set_unique_id(user_id)
self._abort_if_unique_id_mismatch(reason="wrong_account")
return self.async_update_reload_and_abort(
    self._get_reauth_entry(),
    data_updates={CONF_API_TOKEN: user_input[CONF_API_TOKEN]}
)
```
### Reconfiguration Flow
- **Purpose**: Allow configuration updates without removing device
- **Implementation**: Add `async_step_reconfigure` method
- **Validation**: Prevent changing underlying account with `_abort_if_unique_id_mismatch`
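A hedged sketch of the step, modeled on the reauth example above (non-runnable fragment; `validate_input`, `device.serial`, and `RECONFIGURE_SCHEMA` are hypothetical names):

```python
async def async_step_reconfigure(self, user_input=None):
    """Handle reconfiguration of an existing entry."""
    if user_input is not None:
        # Re-identify the device so the entry cannot be pointed at a
        # different account/device (validate_input is a hypothetical helper).
        device = await validate_input(self.hass, user_input)
        await self.async_set_unique_id(device.serial)
        self._abort_if_unique_id_mismatch(reason="wrong_device")
        return self.async_update_reload_and_abort(
            self._get_reconfigure_entry(), data_updates=user_input
        )
    return self.async_show_form(step_id="reconfigure", data_schema=RECONFIGURE_SCHEMA)
```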
### Device Discovery
- **Manifest Configuration**: Add discovery method (zeroconf, dhcp, etc.)
```json
{
  "zeroconf": ["_mydevice._tcp.local."]
}
```
- **Discovery Handler**: Implement appropriate `async_step_*` method:
```python
async def async_step_zeroconf(self, discovery_info):
    """Handle zeroconf discovery."""
    await self.async_set_unique_id(discovery_info.properties["serialno"])
    self._abort_if_unique_id_configured(updates={CONF_HOST: discovery_info.host})
```
- **Network Updates**: Use discovery to update dynamic IP addresses
### Network Discovery Implementation
- **Zeroconf/mDNS**: Use async instances
```python
aiozc = await zeroconf.async_get_async_instance(hass)
```
- **SSDP Discovery**: Register callbacks with cleanup
```python
entry.async_on_unload(
    ssdp.async_register_callback(
        hass, _async_discovered_device,
        {"st": "urn:schemas-upnp-org:device:ZonePlayer:1"}
    )
)
```
### Bluetooth Integration
- **Manifest Dependencies**: Add `bluetooth_adapters` to dependencies
- **Connectable**: Set `"connectable": true` for connection-required devices
- **Scanner Usage**: Always use shared scanner instance
```python
scanner = bluetooth.async_get_scanner()
entry.async_on_unload(
    bluetooth.async_register_callback(
        hass, _async_discovered_device,
        {"service_uuid": "example_uuid"},
        bluetooth.BluetoothScanningMode.ACTIVE
    )
)
```
- **Connection Handling**: Never reuse `BleakClient` instances, use 10+ second timeouts
### Setup Validation
- **Test Before Setup**: Verify integration can be set up in `async_setup_entry`
- **Exception Handling**:
- `ConfigEntryNotReady`: Device offline or temporary failure
- `ConfigEntryAuthFailed`: Authentication issues
- `ConfigEntryError`: Unresolvable setup problems
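The exception mapping above can be sketched in plain Python. The exception classes here are local stand-ins so the snippet is self-contained — a real integration imports them from `homeassistant.exceptions` — and `ApiTimeoutError`/`ApiAuthError` are hypothetical client-library errors:

```python
# Stand-ins for homeassistant.exceptions (assumption: imported in real code).
class ConfigEntryNotReady(Exception):
    """Device offline or temporary failure; setup will be retried."""

class ConfigEntryAuthFailed(Exception):
    """Credentials rejected; triggers a reauth flow."""

class ConfigEntryError(Exception):
    """Unresolvable setup problem."""

# Hypothetical client-library errors.
class ApiTimeoutError(Exception): ...
class ApiAuthError(Exception): ...

def translate_setup_error(err: Exception) -> Exception:
    """Map a client-library error onto the matching config entry exception."""
    if isinstance(err, ApiTimeoutError):
        return ConfigEntryNotReady("Device is offline; setup will be retried")
    if isinstance(err, ApiAuthError):
        return ConfigEntryAuthFailed("Invalid credentials; reauthentication required")
    return ConfigEntryError(f"Unrecoverable setup error: {err}")
```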
### Config Entry Unloading
- **Required**: Implement `async_unload_entry` for runtime removal/reload
- **Platform Unloading**: Use `hass.config_entries.async_unload_platforms`
- **Cleanup**: Register callbacks with `entry.async_on_unload`:
```python
async def async_unload_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
    """Unload a config entry."""
    if unload_ok := await hass.config_entries.async_unload_platforms(entry, PLATFORMS):
        entry.runtime_data.listener()  # Clean up resources
    return unload_ok
```
### Service Actions
- **Registration**: Register all service actions in `async_setup`, NOT in `async_setup_entry`
- **Validation**: Check config entry existence and loaded state:
```python
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
    async def service_action(call: ServiceCall) -> ServiceResponse:
        if not (entry := hass.config_entries.async_get_entry(call.data[ATTR_CONFIG_ENTRY_ID])):
            raise ServiceValidationError("Entry not found")
        if entry.state is not ConfigEntryState.LOADED:
            raise ServiceValidationError("Entry not loaded")
```
- **Exception Handling**: Raise appropriate exceptions:
```python
# For invalid input
if end_date < start_date:
    raise ServiceValidationError("End date must be after start date")

# For service errors
try:
    await client.set_schedule(start_date, end_date)
except MyConnectionError as err:
    raise HomeAssistantError("Could not connect to the schedule") from err
```
### Service Registration Patterns
- **Entity Services**: Register on platform setup
```python
platform.async_register_entity_service(
    "my_entity_service",
    {vol.Required("parameter"): cv.string},
    "handle_service_method"
)
```
- **Service Schema**: Always validate input
```python
SERVICE_SCHEMA = vol.Schema({
    vol.Required("entity_id"): cv.entity_ids,
    vol.Required("parameter"): cv.string,
    vol.Optional("timeout", default=30): cv.positive_int,
})
```
- **Services File**: Create `services.yaml` with descriptions and field definitions
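A hedged sketch of a matching `services.yaml` entry (the service name mirrors the registration example above; the selector choice is an assumption):

```yaml
my_entity_service:
  name: My entity service
  description: Send a parameter to the device.
  fields:
    parameter:
      name: Parameter
      description: Value to send to the device.
      required: true
      example: "eco"
      selector:
        text:
```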
### Polling
- Use update coordinator pattern when possible
- **Polling intervals are NOT user-configurable**: Never add scan_interval, update_interval, or polling frequency options to config flows or config entries
- **Integration determines intervals**: Set `update_interval` programmatically based on integration logic, not user input
- **Minimum Intervals**:
- Local network: 5 seconds
- Cloud services: 60 seconds
- **Parallel Updates**: Specify number of concurrent updates:
```python
PARALLEL_UPDATES = 1 # Serialize updates to prevent overwhelming device
# OR
PARALLEL_UPDATES = 0 # Unlimited (for coordinator-based or read-only)
```
## Entity Development
### Unique IDs
- **Required**: Every entity must have a unique ID for registry tracking
- Must be unique per platform (not per integration)
- Don't include integration domain or platform in ID
- **Implementation**:
```python
class MySensor(SensorEntity):
    def __init__(self, device_id: str) -> None:
        self._attr_unique_id = f"{device_id}_temperature"
```
**Acceptable ID Sources**:
- Device serial numbers
- MAC addresses (formatted using `format_mac` from device registry)
- Physical identifiers (printed/EEPROM)
- Config entry ID as last resort: `f"{entry.entry_id}-battery"`
**Never Use**:
- IP addresses, hostnames, URLs
- Device names
- Email addresses, usernames
### Entity Descriptions
- **Lambda/Anonymous Functions**: Often used in EntityDescription for value transformation
- **Multiline Lambdas**: When lambdas exceed line length, wrap in parentheses for readability
- **Bad pattern**:
```python
SensorEntityDescription(
    key="temperature",
    name="Temperature",
    value_fn=lambda data: round(data["temp_value"] * 1.8 + 32, 1) if data.get("temp_value") is not None else None,  # ❌ Too long
)
```
- **Good pattern**:
```python
SensorEntityDescription(
    key="temperature",
    name="Temperature",
    value_fn=lambda data: (  # ✅ Parenthesis on same line as lambda
        round(data["temp_value"] * 1.8 + 32, 1)
        if data.get("temp_value") is not None
        else None
    ),
)
```
### Entity Naming
- **Use has_entity_name**: Set `_attr_has_entity_name = True`
- **For specific fields**:
```python
class MySensor(SensorEntity):
    _attr_has_entity_name = True

    def __init__(self, device: Device, field: str) -> None:
        self._attr_device_info = DeviceInfo(
            identifiers={(DOMAIN, device.id)},
            name=device.name,
        )
        self._attr_name = field  # e.g., "temperature", "humidity"
```
- **For device itself**: Set `_attr_name = None`
### Event Lifecycle Management
- **Subscribe in `async_added_to_hass`**:
```python
async def async_added_to_hass(self) -> None:
    """Subscribe to events."""
    self.async_on_remove(
        self.client.events.subscribe("my_event", self._handle_event)
    )
```
- **Unsubscribe in `async_will_remove_from_hass`** if not using `async_on_remove`
- Never subscribe in `__init__` or other methods
### State Handling
- Unknown values: Use `None` (not "unknown" or "unavailable")
- Availability: Implement the `available` property instead of using an "unavailable" state
### Entity Availability
- **Mark Unavailable**: When data cannot be fetched from device/service
- **Coordinator Pattern**:
```python
@property
def available(self) -> bool:
    """Return if entity is available."""
    return super().available and self.identifier in self.coordinator.data
```
- **Direct Update Pattern**:
```python
async def async_update(self) -> None:
    """Update entity."""
    try:
        data = await self.client.get_data()
    except MyException:
        self._attr_available = False
    else:
        self._attr_available = True
        self._attr_native_value = data.value
```
### Extra State Attributes
- All attribute keys must always be present
- Unknown values: Use `None`
- Provide descriptive attributes
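The stable-keys rule can be illustrated in plain Python (a stand-alone sketch; `build_attributes` and the attribute names are illustrative, not an HA API):

```python
def build_attributes(reading: dict) -> dict:
    """Return extra state attributes with every key always present.

    Unknown values are reported as None rather than being omitted,
    so the attribute set stays stable across updates.
    """
    return {
        "wind_speed": reading.get("wind_speed"),      # None when not reported
        "wind_bearing": reading.get("wind_bearing"),  # None when not reported
    }

# Keys remain present even when the device omitted a value:
assert build_attributes({"wind_speed": 12.5}) == {
    "wind_speed": 12.5,
    "wind_bearing": None,
}
```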
## Device Management
### Device Registry
- **Create Devices**: Group related entities under devices
- **Device Info**: Provide comprehensive metadata:
```python
_attr_device_info = DeviceInfo(
    connections={(CONNECTION_NETWORK_MAC, device.mac)},
    identifiers={(DOMAIN, device.id)},
    name=device.name,
    manufacturer="My Company",
    model="My Sensor",
    sw_version=device.version,
)
```
- For services: Add `entry_type=DeviceEntryType.SERVICE`
### Dynamic Device Addition
- **Auto-detect New Devices**: After initial setup
- **Implementation Pattern**:
```python
def _check_device() -> None:
    current_devices = set(coordinator.data)
    new_devices = current_devices - known_devices
    if new_devices:
        known_devices.update(new_devices)
        async_add_entities([MySensor(coordinator, device_id) for device_id in new_devices])

entry.async_on_unload(coordinator.async_add_listener(_check_device))
```
### Stale Device Removal
- **Auto-remove**: When devices disappear from hub/account
- **Device Registry Update**:
```python
device_registry.async_update_device(
    device_id=device.id,
    remove_config_entry_id=self.config_entry.entry_id,
)
```
- **Manual Deletion**: Implement `async_remove_config_entry_device` when needed
### Entity Categories
- **Required**: Assign appropriate category to entities
- **Implementation**: Set `_attr_entity_category`
```python
class MySensor(SensorEntity):
    _attr_entity_category = EntityCategory.DIAGNOSTIC
```
- Categories include: `DIAGNOSTIC` for system/technical information
### Device Classes
- **Use When Available**: Set appropriate device class for entity type
```python
class MyTemperatureSensor(SensorEntity):
    _attr_device_class = SensorDeviceClass.TEMPERATURE
```
- Provides context for: unit conversion, voice control, UI representation
### Disabled by Default
- **Disable Noisy/Less Popular Entities**: Reduce resource usage
```python
class MySignalStrengthSensor(SensorEntity):
    _attr_entity_registry_enabled_default = False
```
- Target: frequently changing states, technical diagnostics
### Entity Translations
- **Required with has_entity_name**: Support international users
- **Implementation**:
```python
class MySensor(SensorEntity):
    _attr_has_entity_name = True
    _attr_translation_key = "phase_voltage"
```
- Create `strings.json` with translations:
```json
{
  "entity": {
    "sensor": {
      "phase_voltage": {
        "name": "Phase voltage"
      }
    }
  }
}
```
### Exception Translations (Gold)
- **Translatable Errors**: Use translation keys for user-facing exceptions
- **Implementation**:
```python
raise ServiceValidationError(
    translation_domain=DOMAIN,
    translation_key="end_date_before_start_date",
)
```
- Add to `strings.json`:
```json
{
  "exceptions": {
    "end_date_before_start_date": {
      "message": "The end date cannot be before the start date."
    }
  }
}
```
### Icon Translations (Gold)
- **Dynamic Icons**: Support state and range-based icon selection
- **State-based Icons**:
```json
{
  "entity": {
    "sensor": {
      "tree_pollen": {
        "default": "mdi:tree",
        "state": {
          "high": "mdi:tree-outline"
        }
      }
    }
  }
}
```
- **Range-based Icons** (for numeric values):
```json
{
  "entity": {
    "sensor": {
      "battery_level": {
        "default": "mdi:battery-unknown",
        "range": {
          "0": "mdi:battery-outline",
          "90": "mdi:battery-90",
          "100": "mdi:battery"
        }
      }
    }
  }
}
```
## Testing Requirements
- **Location**: `tests/components/{domain}/`
- **Coverage Requirement**: Above 95% test coverage for all modules
- **Best Practices**:
- Use pytest fixtures from `tests.common`
- Mock all external dependencies
- Use snapshots for complex data structures
- Follow existing test patterns
### Config Flow Testing
- **100% Coverage Required**: All config flow paths must be tested
- **Test Scenarios**:
- All flow initiation methods (user, discovery, import)
- Successful configuration paths
- Error recovery scenarios
- Prevention of duplicate entries
- Flow completion after errors
### Testing
- **Integration-specific tests** (recommended):
```bash
pytest ./tests/components/<integration_domain> \
  --cov=homeassistant.components.<integration_domain> \
  --cov-report term-missing \
  --durations-min=1 \
  --durations=0 \
  --numprocesses=auto
```
### Testing Best Practices
- **Never access `hass.data` directly** - Use fixtures and proper integration setup instead
- **Use snapshot testing** - For verifying entity states and attributes
- **Test through integration setup** - Don't test entities in isolation
- **Mock external APIs** - Use fixtures with realistic JSON data
- **Verify registries** - Ensure entities are properly registered with devices
### Config Flow Testing Template
```python
async def test_user_flow_success(hass, mock_api):
    """Test successful user flow."""
    result = await hass.config_entries.flow.async_init(
        DOMAIN, context={"source": config_entries.SOURCE_USER}
    )
    assert result["type"] == FlowResultType.FORM
    assert result["step_id"] == "user"

    # Test form submission
    result = await hass.config_entries.flow.async_configure(
        result["flow_id"], user_input=TEST_USER_INPUT
    )
    assert result["type"] == FlowResultType.CREATE_ENTRY
    assert result["title"] == "My Device"
    assert result["data"] == TEST_USER_INPUT

async def test_flow_connection_error(hass, mock_api_error):
    """Test connection error handling."""
    result = await hass.config_entries.flow.async_init(
        DOMAIN, context={"source": config_entries.SOURCE_USER}
    )
    result = await hass.config_entries.flow.async_configure(
        result["flow_id"], user_input=TEST_USER_INPUT
    )
    assert result["type"] == FlowResultType.FORM
    assert result["errors"] == {"base": "cannot_connect"}
```
### Entity Testing Patterns
```python
@pytest.fixture
def platforms() -> list[Platform]:
    """Overridden fixture to specify platforms to test."""
    return [Platform.SENSOR]  # Or another specific platform as needed.

@pytest.mark.usefixtures("entity_registry_enabled_by_default", "init_integration")
async def test_entities(
    hass: HomeAssistant,
    snapshot: SnapshotAssertion,
    entity_registry: er.EntityRegistry,
    device_registry: dr.DeviceRegistry,
    mock_config_entry: MockConfigEntry,
) -> None:
    """Test the sensor entities."""
    await snapshot_platform(hass, entity_registry, snapshot, mock_config_entry.entry_id)

    # Ensure entities are correctly assigned to device
    device_entry = device_registry.async_get_device(
        identifiers={(DOMAIN, "device_unique_id")}
    )
    assert device_entry
    entity_entries = er.async_entries_for_config_entry(
        entity_registry, mock_config_entry.entry_id
    )
    for entity_entry in entity_entries:
        assert entity_entry.device_id == device_entry.id
```
### Mock Patterns
```python
# Modern integration fixture setup
@pytest.fixture
def mock_config_entry() -> MockConfigEntry:
    """Return the default mocked config entry."""
    return MockConfigEntry(
        title="My Integration",
        domain=DOMAIN,
        data={CONF_HOST: "127.0.0.1", CONF_API_KEY: "test_key"},
        unique_id="device_unique_id",
    )

@pytest.fixture
def mock_device_api() -> Generator[MagicMock]:
    """Return a mocked device API."""
    with patch("homeassistant.components.my_integration.MyDeviceAPI", autospec=True) as api_mock:
        api = api_mock.return_value
        api.get_data.return_value = MyDeviceData.from_json(
            load_fixture("device_data.json", DOMAIN)
        )
        yield api

@pytest.fixture
def platforms() -> list[Platform]:
    """Fixture to specify platforms to test."""
    return PLATFORMS

@pytest.fixture
async def init_integration(
    hass: HomeAssistant,
    mock_config_entry: MockConfigEntry,
    mock_device_api: MagicMock,
    platforms: list[Platform],
) -> MockConfigEntry:
    """Set up the integration for testing."""
    mock_config_entry.add_to_hass(hass)
    with patch("homeassistant.components.my_integration.PLATFORMS", platforms):
        await hass.config_entries.async_setup(mock_config_entry.entry_id)
        await hass.async_block_till_done()
    return mock_config_entry
```
## Debugging & Troubleshooting
### Common Issues & Solutions
- **Integration won't load**: Check `manifest.json` syntax and required fields
- **Entities not appearing**: Verify `unique_id` and `has_entity_name` implementation
- **Config flow errors**: Check `strings.json` entries and error handling
- **Discovery not working**: Verify manifest discovery configuration and callbacks
- **Tests failing**: Check mock setup and async context
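For the second issue, a minimal plain-Python sketch of how `unique_id` and `has_entity_name` fit together — `MySensor` and the serial/key values here are hypothetical placeholders illustrating the pattern, not a runnable integration:

```python
# Hedged sketch of the unique_id / has_entity_name pattern;
# the class and attribute values are illustrative only.

class MySensor:
    # With has_entity_name=True, the frontend combines the device name
    # with the entity's own (translated) name.
    _attr_has_entity_name = True

    def __init__(self, serial_num: str, key: str) -> None:
        # A unique_id that never changes lets the entity registry keep
        # user customizations across restarts; "<serial>-<key>" is a
        # common shape for it.
        self._attr_unique_id = f"{serial_num}-{key}"
        self._attr_translation_key = key


sensor = MySensor("device_unique_id", "temperature")
print(sensor._attr_unique_id)  # -> device_unique_id-temperature
```

If an entity never appears, checking that this identifier is stable and collision-free is usually the first step.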
### Debug Logging Setup
```python
# Enable debug logging in tests; the integration logger lives under
# homeassistant.components.<domain>, not the bare domain name
caplog.set_level(logging.DEBUG, logger="homeassistant.components.my_integration")

# In integration code - use proper logging
_LOGGER = logging.getLogger(__name__)
_LOGGER.debug("Processing data: %s", data)  # Use lazy %-style formatting
```
### Validation Commands
```bash
# Check a specific integration
python -m script.hassfest --integration-path homeassistant/components/my_integration

# Validate the quality scale: check quality_scale.yaml against the current rules

# Run integration tests with coverage
pytest ./tests/components/my_integration \
  --cov=homeassistant.components.my_integration \
  --cov-report term-missing
```
- Home Assistant Integration knowledge: .claude/skills/integrations/SKILL.md

View File

@@ -40,8 +40,6 @@ env:
CACHE_VERSION: 3
UV_CACHE_VERSION: 1
MYPY_CACHE_VERSION: 1
PYTEST_DURATIONS_CACHE_VERSION: 1
PYTEST_DURATIONS_FILE: .ci/pytest_durations.json
HA_SHORT_VERSION: "2026.4"
DEFAULT_PYTHON: "3.14.2"
ALL_PYTHON_VERSIONS: "['3.14.2']"
@@ -896,27 +894,12 @@ jobs:
key: >-
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-${{
needs.info.outputs.python_cache_key }}
- name: Restore pytest durations cache
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
path: ${{ env.PYTEST_DURATIONS_FILE }}
key: >-
${{ runner.os }}-${{ runner.arch }}-pytest-durations-${{ env.PYTEST_DURATIONS_CACHE_VERSION }}-${{
github.base_ref || github.ref_name }}
restore-keys: |
${{ runner.os }}-${{ runner.arch }}-pytest-durations-${{ env.PYTEST_DURATIONS_CACHE_VERSION }}-${{ github.base_ref || github.ref_name }}-
${{ runner.os }}-${{ runner.arch }}-pytest-durations-${{ env.PYTEST_DURATIONS_CACHE_VERSION }}-dev
- name: Run split_tests.py
env:
TEST_GROUP_COUNT: ${{ needs.info.outputs.test_group_count }}
PYTEST_DURATIONS_FILE: ${{ env.PYTEST_DURATIONS_FILE }}
run: |
. venv/bin/activate
split_args=("${TEST_GROUP_COUNT}" tests)
if [[ -f "${PYTEST_DURATIONS_FILE}" ]]; then
split_args+=(--durations-file "${PYTEST_DURATIONS_FILE}")
fi
python -m script.split_tests "${split_args[@]}"
python -m script.split_tests ${TEST_GROUP_COUNT} tests
- name: Upload pytest_buckets
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
@@ -1014,10 +997,11 @@ jobs:
. venv/bin/activate
python --version
set -o pipefail
params=(--junitxml=junit.xml -o junit_family=legacy)
cov_params=()
if [[ "${SKIP_COVERAGE}" != "true" ]]; then
params+=(--cov="homeassistant")
params+=(--cov-report=xml)
cov_params+=(--cov="homeassistant")
cov_params+=(--cov-report=xml)
cov_params+=(--junitxml=junit.xml -o junit_family=legacy)
fi
echo "Test group ${TEST_GROUP}: $(sed -n "${TEST_GROUP},1p" pytest_buckets.txt)"
@@ -1028,7 +1012,7 @@ jobs:
--numprocesses auto \
--snapshot-details \
--dist=loadfile \
${params[@]} \
${cov_params[@]} \
-o console_output_style=count \
-p no:sugar \
--exclude-warning-annotations \
@@ -1048,25 +1032,6 @@ jobs:
name: coverage-${{ matrix.python-version }}-${{ matrix.group }}
path: coverage.xml
overwrite: true
- name: Collect pytest durations
env:
TEST_GROUP: ${{ matrix.group }}
PYTHON_VERSION: ${{ matrix.python-version }}
run: |
. venv/bin/activate
output="pytest-durations-${PYTHON_VERSION}-${TEST_GROUP}.json"
if [[ -f junit.xml ]]; then
python -m script.collect_test_durations --output "${output}" junit.xml
else
echo "::error::Missing junit.xml, cannot collect pytest durations"
exit 1
fi
- name: Upload pytest durations
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: pytest-durations-${{ matrix.python-version }}-${{ matrix.group }}
path: pytest-durations-*.json
overwrite: true
- name: Beautify test results
# For easier identification of parsing errors
if: needs.info.outputs.skip_coverage != 'true'
@@ -1085,68 +1050,6 @@ jobs:
run: |
./script/check_dirty
update-pytest-duration-cache:
name: Update pytest durations cache
runs-on: ubuntu-24.04
permissions:
contents: read
needs:
- info
- prepare-pytest-full
- pytest-full
if: |
needs.info.outputs.lint_only != 'true'
&& needs.info.outputs.test_full_suite == 'true'
steps:
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore pytest durations cache
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
path: ${{ env.PYTEST_DURATIONS_FILE }}
key: >-
${{ runner.os }}-${{ runner.arch }}-pytest-durations-${{ env.PYTEST_DURATIONS_CACHE_VERSION }}-${{
github.base_ref || github.ref_name }}
restore-keys: |
${{ runner.os }}-${{ runner.arch }}-pytest-durations-${{ env.PYTEST_DURATIONS_CACHE_VERSION }}-${{ github.base_ref || github.ref_name }}-
${{ runner.os }}-${{ runner.arch }}-pytest-durations-${{ env.PYTEST_DURATIONS_CACHE_VERSION }}-dev
- name: Download pytest durations artifacts
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
with:
pattern: pytest-durations-*
merge-multiple: true
path: .ci/pytest-durations
- name: Merge pytest durations
run: |
input_files=()
for file in .ci/pytest-durations/*.json; do
[[ -f "${file}" ]] || continue
input_files+=("${file}")
done
if [[ ${#input_files[@]} -eq 0 ]]; then
mkdir -p "$(dirname "${PYTEST_DURATIONS_FILE}")"
if [[ ! -f "${PYTEST_DURATIONS_FILE}" ]]; then
echo "{}" > "${PYTEST_DURATIONS_FILE}"
fi
exit 0
fi
python -m script.collect_test_durations \
--existing "${PYTEST_DURATIONS_FILE}" \
--output "${PYTEST_DURATIONS_FILE}" \
"${input_files[@]}"
- name: Save pytest durations cache
uses: actions/cache/save@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
path: ${{ env.PYTEST_DURATIONS_FILE }}
key: >-
${{ runner.os }}-${{ runner.arch }}-pytest-durations-${{ env.PYTEST_DURATIONS_CACHE_VERSION }}-${{
github.base_ref || github.ref_name }}-${{ github.run_id }}-${{ github.run_attempt }}
pytest-mariadb:
name: Run ${{ matrix.mariadb-group }} tests Python ${{ matrix.python-version }}
runs-on: ubuntu-24.04

View File

@@ -209,4 +209,4 @@ jobs:
skip-binary: aiohttp;charset-normalizer;grpcio;multidict;SQLAlchemy;propcache;protobuf;pymicro-vad;yarl
constraints: "homeassistant/package_constraints.txt"
requirements-diff: "requirements_diff.txt"
requirements: "requirements_all.txt"
requirements: "requirements_all_wheels_${{ matrix.arch }}.txt"

View File

@@ -1,6 +1,6 @@
"""Defines a base Alexa Devices entity."""
from aioamazondevices.const.devices import SPEAKER_GROUP_MODEL
from aioamazondevices.const.devices import SPEAKER_GROUP_DEVICE_TYPE
from aioamazondevices.structures import AmazonDevice
from homeassistant.helpers.device_registry import DeviceInfo
@@ -25,19 +25,20 @@ class AmazonEntity(CoordinatorEntity[AmazonDevicesCoordinator]):
"""Initialize the entity."""
super().__init__(coordinator)
self._serial_num = serial_num
model_details = coordinator.api.get_model_details(self.device) or {}
model = model_details.get("model")
model = self.device.model
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, serial_num)},
name=self.device.account_name,
model=model,
model_id=self.device.device_type,
manufacturer=model_details.get("manufacturer", "Amazon"),
hw_version=model_details.get("hw_version"),
manufacturer=self.device.manufacturer or "Amazon",
hw_version=self.device.hardware_version,
sw_version=(
self.device.software_version if model != SPEAKER_GROUP_MODEL else None
self.device.software_version
if model != SPEAKER_GROUP_DEVICE_TYPE
else None
),
serial_number=serial_num if model != SPEAKER_GROUP_MODEL else None,
serial_number=serial_num if model != SPEAKER_GROUP_DEVICE_TYPE else None,
)
self.entity_description = description
self._attr_unique_id = f"{serial_num}-{description.key}"

View File

@@ -8,5 +8,5 @@
"iot_class": "cloud_polling",
"loggers": ["aioamazondevices"],
"quality_scale": "platinum",
"requirements": ["aioamazondevices==12.0.2"]
"requirements": ["aioamazondevices==13.0.0"]
}

View File

@@ -61,7 +61,13 @@ class AuroraAbbDataUpdateCoordinator(DataUpdateCoordinator[dict[str, float]]):
frequency = self.client.measure(4)
i_leak_dcdc = self.client.measure(6)
i_leak_inverter = self.client.measure(7)
power_in_1 = self.client.measure(8)
power_in_2 = self.client.measure(9)
temperature_c = self.client.measure(21)
voltage_in_1 = self.client.measure(23)
current_in_1 = self.client.measure(25)
voltage_in_2 = self.client.measure(26)
current_in_2 = self.client.measure(27)
r_iso = self.client.measure(30)
energy_wh = self.client.cumulated_energy(5)
[alarm, *_] = self.client.alarms()
@@ -87,7 +93,13 @@ class AuroraAbbDataUpdateCoordinator(DataUpdateCoordinator[dict[str, float]]):
data["grid_frequency"] = round(frequency, 1)
data["i_leak_dcdc"] = i_leak_dcdc
data["i_leak_inverter"] = i_leak_inverter
data["power_in_1"] = round(power_in_1, 1)
data["power_in_2"] = round(power_in_2, 1)
data["temp"] = round(temperature_c, 1)
data["voltage_in_1"] = round(voltage_in_1, 1)
data["current_in_1"] = round(current_in_1, 1)
data["voltage_in_2"] = round(voltage_in_2, 1)
data["current_in_2"] = round(current_in_2, 1)
data["r_iso"] = r_iso
data["totalenergy"] = round(energy_wh / 1000, 2)
data["alarm"] = alarm

View File

@@ -68,6 +68,7 @@ SENSOR_TYPES = [
entity_category=EntityCategory.DIAGNOSTIC,
native_unit_of_measurement=UnitOfFrequency.HERTZ,
state_class=SensorStateClass.MEASUREMENT,
translation_key="grid_frequency",
entity_registry_enabled_default=False,
),
SensorEntityDescription(
@@ -88,6 +89,60 @@ SENSOR_TYPES = [
translation_key="i_leak_inverter",
entity_registry_enabled_default=False,
),
SensorEntityDescription(
key="power_in_1",
device_class=SensorDeviceClass.POWER,
entity_category=EntityCategory.DIAGNOSTIC,
native_unit_of_measurement=UnitOfPower.WATT,
state_class=SensorStateClass.MEASUREMENT,
translation_key="power_in_1",
entity_registry_enabled_default=False,
),
SensorEntityDescription(
key="power_in_2",
device_class=SensorDeviceClass.POWER,
entity_category=EntityCategory.DIAGNOSTIC,
native_unit_of_measurement=UnitOfPower.WATT,
state_class=SensorStateClass.MEASUREMENT,
translation_key="power_in_2",
entity_registry_enabled_default=False,
),
SensorEntityDescription(
key="voltage_in_1",
device_class=SensorDeviceClass.VOLTAGE,
entity_category=EntityCategory.DIAGNOSTIC,
native_unit_of_measurement=UnitOfElectricPotential.VOLT,
state_class=SensorStateClass.MEASUREMENT,
translation_key="voltage_in_1",
entity_registry_enabled_default=False,
),
SensorEntityDescription(
key="current_in_1",
device_class=SensorDeviceClass.CURRENT,
entity_category=EntityCategory.DIAGNOSTIC,
native_unit_of_measurement=UnitOfElectricCurrent.AMPERE,
state_class=SensorStateClass.MEASUREMENT,
translation_key="current_in_1",
entity_registry_enabled_default=False,
),
SensorEntityDescription(
key="voltage_in_2",
device_class=SensorDeviceClass.VOLTAGE,
entity_category=EntityCategory.DIAGNOSTIC,
native_unit_of_measurement=UnitOfElectricPotential.VOLT,
state_class=SensorStateClass.MEASUREMENT,
translation_key="voltage_in_2",
entity_registry_enabled_default=False,
),
SensorEntityDescription(
key="current_in_2",
device_class=SensorDeviceClass.CURRENT,
entity_category=EntityCategory.DIAGNOSTIC,
native_unit_of_measurement=UnitOfElectricCurrent.AMPERE,
state_class=SensorStateClass.MEASUREMENT,
translation_key="current_in_2",
entity_registry_enabled_default=False,
),
SensorEntityDescription(
key="alarm",
device_class=SensorDeviceClass.ENUM,

View File

@@ -24,9 +24,18 @@
"alarm": {
"name": "Alarm status"
},
"current_in_1": {
"name": "String 1 current"
},
"current_in_2": {
"name": "String 2 current"
},
"grid_current": {
"name": "Grid current"
},
"grid_frequency": {
"name": "Grid frequency"
},
"grid_voltage": {
"name": "Grid voltage"
},
@@ -36,6 +45,12 @@
"i_leak_inverter": {
"name": "Inverter leak current"
},
"power_in_1": {
"name": "String 1 power"
},
"power_in_2": {
"name": "String 2 power"
},
"power_output": {
"name": "Power output"
},
@@ -44,6 +59,12 @@
},
"total_energy": {
"name": "Total energy"
},
"voltage_in_1": {
"name": "String 1 voltage"
},
"voltage_in_2": {
"name": "String 2 voltage"
}
}
}

View File

@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/conversation",
"integration_type": "entity",
"quality_scale": "internal",
"requirements": ["hassil==3.5.0", "home-assistant-intents==2026.2.13"]
"requirements": ["hassil==3.5.0", "home-assistant-intents==2026.3.3"]
}

View File

@@ -524,14 +524,10 @@ class EsphomeAssistSatellite(
self._active_pipeline_index = 0
maybe_pipeline_index = 0
while True:
if not (ww_entity_id := self.get_wake_word_entity(maybe_pipeline_index)):
break
if not (ww_state := self.hass.states.get(ww_entity_id)):
continue
if ww_state.state == wake_word_phrase:
while ww_entity_id := self.get_wake_word_entity(maybe_pipeline_index):
if (
ww_state := self.hass.states.get(ww_entity_id)
) and ww_state.state == wake_word_phrase:
# First match
self._active_pipeline_index = maybe_pipeline_index
break

View File

@@ -88,6 +88,17 @@ class HomematicipHeatingGroup(HomematicipGenericEntity, ClimateEntity):
if device.actualTemperature is None:
self._simple_heating = self._first_radiator_thermostat
@property
def available(self) -> bool:
"""Heating group available.
A heating group must be available, and should not be affected by the
individual availability of group members.
This allows controlling the temperature even when individual group
members are not available.
"""
return True
@property
def device_info(self) -> DeviceInfo:
"""Return device specific attributes."""

View File

@@ -312,6 +312,17 @@ class HomematicipCoverShutterGroup(HomematicipGenericEntity, CoverEntity):
device.modelType = f"HmIP-{post}"
super().__init__(hap, device, post, is_multi_channel=False)
@property
def available(self) -> bool:
"""Cover shutter group available.
A cover shutter group must be available, and should not be affected by
the individual availability of group members.
This allows controlling the shutters even when individual group
members are not available.
"""
return True
@property
def current_cover_position(self) -> int | None:
"""Return current position of cover."""

View File

@@ -43,6 +43,7 @@ from homeassistant.const import (
STATE_UNKNOWN,
)
from homeassistant.core import Event, HomeAssistant, State, callback
from homeassistant.data_entry_flow import FlowResultType
from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.helpers import config_validation as cv, state as state_helper
from homeassistant.helpers.entity_values import EntityValues
@@ -61,6 +62,7 @@ from .const import (
CLIENT_ERROR_V2,
CODE_INVALID_INPUTS,
COMPONENT_CONFIG_SCHEMA_CONNECTION,
COMPONENT_CONFIG_SCHEMA_CONNECTION_VALIDATORS,
CONF_API_VERSION,
CONF_BUCKET,
CONF_COMPONENT_CONFIG,
@@ -79,7 +81,6 @@ from .const import (
CONF_TAGS_ATTRIBUTES,
CONNECTION_ERROR,
DEFAULT_API_VERSION,
DEFAULT_HOST,
DEFAULT_HOST_V2,
DEFAULT_MEASUREMENT_ATTR,
DEFAULT_SSL_V2,
@@ -104,6 +105,7 @@ from .const import (
WRITE_ERROR,
WROTE_MESSAGE,
)
from .issue import async_create_deprecated_yaml_issue
_LOGGER = logging.getLogger(__name__)
@@ -137,7 +139,7 @@ def create_influx_url(conf: dict) -> dict:
def validate_version_specific_config(conf: dict) -> dict:
"""Ensure correct config fields are provided based on API version used."""
if conf[CONF_API_VERSION] == API_VERSION_2:
if conf.get(CONF_API_VERSION, DEFAULT_API_VERSION) == API_VERSION_2:
if CONF_TOKEN not in conf:
raise vol.Invalid(
f"{CONF_TOKEN} and {CONF_BUCKET} are required when"
@@ -193,32 +195,13 @@ _INFLUX_BASE_SCHEMA = INCLUDE_EXCLUDE_BASE_FILTER_SCHEMA.extend(
}
)
INFLUX_SCHEMA = vol.All(
_INFLUX_BASE_SCHEMA.extend(COMPONENT_CONFIG_SCHEMA_CONNECTION),
validate_version_specific_config,
create_influx_url,
INFLUX_SCHEMA = _INFLUX_BASE_SCHEMA.extend(
COMPONENT_CONFIG_SCHEMA_CONNECTION_VALIDATORS
)
CONFIG_SCHEMA = vol.Schema(
{
DOMAIN: vol.All(
cv.deprecated(CONF_API_VERSION),
cv.deprecated(CONF_HOST),
cv.deprecated(CONF_PATH),
cv.deprecated(CONF_PORT),
cv.deprecated(CONF_SSL),
cv.deprecated(CONF_VERIFY_SSL),
cv.deprecated(CONF_SSL_CA_CERT),
cv.deprecated(CONF_USERNAME),
cv.deprecated(CONF_PASSWORD),
cv.deprecated(CONF_DB_NAME),
cv.deprecated(CONF_TOKEN),
cv.deprecated(CONF_ORG),
cv.deprecated(CONF_BUCKET),
INFLUX_SCHEMA,
)
},
{DOMAIN: vol.All(INFLUX_SCHEMA, validate_version_specific_config)},
extra=vol.ALLOW_EXTRA,
)
@@ -499,23 +482,35 @@ def get_influx_connection( # noqa: C901
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the InfluxDB component."""
conf = config.get(DOMAIN)
if DOMAIN not in config:
return True
if conf is not None:
if CONF_HOST not in conf and conf[CONF_API_VERSION] == DEFAULT_API_VERSION:
conf[CONF_HOST] = DEFAULT_HOST
hass.async_create_task(
hass.config_entries.flow.async_init(
DOMAIN,
context={"source": SOURCE_IMPORT},
data=conf,
)
)
hass.async_create_task(_async_setup(hass, config[DOMAIN]))
return True
async def _async_setup(hass: HomeAssistant, config: dict[str, Any]) -> None:
"""Import YAML configuration into a config entry."""
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": SOURCE_IMPORT},
data=config,
)
if (
result.get("type") is FlowResultType.ABORT
and (reason := result["reason"]) != "single_instance_allowed"
):
async_create_deprecated_yaml_issue(hass, error=reason)
return
# If we are here, the entry already exists (single instance allowed)
if config.keys() & (
{k.schema for k in COMPONENT_CONFIG_SCHEMA_CONNECTION} - {CONF_PRECISION}
):
async_create_deprecated_yaml_issue(hass)
async def async_setup_entry(hass: HomeAssistant, entry: InfluxDBConfigEntry) -> bool:
"""Set up InfluxDB from a config entry."""
data = entry.data

View File

@@ -31,7 +31,7 @@ from homeassistant.helpers.selector import (
)
from homeassistant.helpers.storage import STORAGE_DIR
from . import DOMAIN, get_influx_connection
from . import DOMAIN, create_influx_url, get_influx_connection
from .const import (
API_VERSION_2,
CONF_API_VERSION,
@@ -40,8 +40,11 @@ from .const import (
CONF_ORG,
CONF_SSL_CA_CERT,
DEFAULT_API_VERSION,
DEFAULT_BUCKET,
DEFAULT_DATABASE,
DEFAULT_HOST,
DEFAULT_PORT,
DEFAULT_VERIFY_SSL,
)
_LOGGER = logging.getLogger(__name__)
@@ -240,14 +243,17 @@ class InfluxDBConfigFlow(ConfigFlow, domain=DOMAIN):
async def async_step_import(self, import_data: dict[str, Any]) -> ConfigFlowResult:
"""Handle the initial step."""
host = import_data.get(CONF_HOST)
database = import_data.get(CONF_DB_NAME)
bucket = import_data.get(CONF_BUCKET)
import_data = {**import_data}
import_data.setdefault(CONF_API_VERSION, DEFAULT_API_VERSION)
import_data.setdefault(CONF_VERIFY_SSL, DEFAULT_VERIFY_SSL)
import_data.setdefault(CONF_DB_NAME, DEFAULT_DATABASE)
import_data.setdefault(CONF_BUCKET, DEFAULT_BUCKET)
api_version = import_data.get(CONF_API_VERSION)
ssl = import_data.get(CONF_SSL)
api_version = import_data[CONF_API_VERSION]
if api_version == DEFAULT_API_VERSION:
host = import_data.get(CONF_HOST, DEFAULT_HOST)
database = import_data[CONF_DB_NAME]
title = f"{database} ({host})"
data = {
CONF_API_VERSION: api_version,
@@ -256,21 +262,23 @@ class InfluxDBConfigFlow(ConfigFlow, domain=DOMAIN):
CONF_USERNAME: import_data.get(CONF_USERNAME),
CONF_PASSWORD: import_data.get(CONF_PASSWORD),
CONF_DB_NAME: database,
CONF_SSL: ssl,
CONF_SSL: import_data.get(CONF_SSL),
CONF_PATH: import_data.get(CONF_PATH),
CONF_VERIFY_SSL: import_data.get(CONF_VERIFY_SSL),
CONF_VERIFY_SSL: import_data[CONF_VERIFY_SSL],
CONF_SSL_CA_CERT: import_data.get(CONF_SSL_CA_CERT),
}
else:
create_influx_url(import_data) # Only modifies dict for api_version == 2
bucket = import_data[CONF_BUCKET]
url = import_data.get(CONF_URL)
title = f"{bucket} ({url})"
data = {
CONF_API_VERSION: api_version,
CONF_URL: import_data.get(CONF_URL),
CONF_URL: url,
CONF_TOKEN: import_data.get(CONF_TOKEN),
CONF_ORG: import_data.get(CONF_ORG),
CONF_BUCKET: bucket,
CONF_VERIFY_SSL: import_data.get(CONF_VERIFY_SSL),
CONF_VERIFY_SSL: import_data[CONF_VERIFY_SSL],
CONF_SSL_CA_CERT: import_data.get(CONF_SSL_CA_CERT),
}

View File

@@ -154,3 +154,14 @@ COMPONENT_CONFIG_SCHEMA_CONNECTION = {
vol.Inclusive(CONF_ORG, "v2_authentication"): cv.string,
vol.Optional(CONF_BUCKET, default=DEFAULT_BUCKET): cv.string,
}
# Same keys without defaults, used in CONFIG_SCHEMA to validate
# without injecting default values (so we can detect explicit keys).
COMPONENT_CONFIG_SCHEMA_CONNECTION_VALIDATORS = {
(
vol.Optional(k.schema)
if isinstance(k, vol.Optional) and k.default is not vol.UNDEFINED
else k
): v
for k, v in COMPONENT_CONFIG_SCHEMA_CONNECTION.items()
}

View File

@@ -0,0 +1,34 @@
"""Issues for InfluxDB integration."""
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.issue_registry import IssueSeverity, async_create_issue
from .const import DOMAIN
@callback
def async_create_deprecated_yaml_issue(
hass: HomeAssistant, *, error: str | None = None
) -> None:
"""Create a repair issue for deprecated YAML connection configuration."""
if error is None:
issue_id = "deprecated_yaml"
severity = IssueSeverity.WARNING
else:
issue_id = f"deprecated_yaml_import_issue_{error}"
severity = IssueSeverity.ERROR
async_create_issue(
hass,
DOMAIN,
issue_id,
is_fixable=False,
issue_domain=DOMAIN,
breaks_in_ha_version="2026.9.0",
severity=severity,
translation_key=issue_id,
translation_placeholders={
"domain": DOMAIN,
"url": f"/config/integrations/dashboard/add?domain={DOMAIN}",
},
)

View File

@@ -7,7 +7,6 @@
"documentation": "https://www.home-assistant.io/integrations/influxdb",
"iot_class": "local_push",
"loggers": ["influxdb", "influxdb_client"],
"quality_scale": "legacy",
"requirements": ["influxdb==5.3.1", "influxdb-client==1.50.0"],
"single_config_entry": true
}

View File

@@ -54,5 +54,31 @@
"title": "Choose InfluxDB version"
}
}
},
"issues": {
"deprecated_yaml": {
"description": "Configuring InfluxDB connection settings using YAML is being removed. Your existing YAML connection configuration has been imported into the UI automatically.\n\nRemove the `{domain}` connection and authentication keys from your `configuration.yaml` file and restart Home Assistant to fix this issue. Other options like `include`, `exclude`, and `tags` remain in YAML for now. \n\nThe following keys should be removed:\n- `api_version`\n- `host`\n- `port`\n- `ssl`\n- `verify_ssl`\n- `ssl_ca_cert`\n- `username`\n- `password`\n- `database`\n- `token`\n- `organization`\n- `bucket`\n- `path`",
"title": "The InfluxDB YAML configuration is being removed"
},
"deprecated_yaml_import_issue_cannot_connect": {
"description": "Configuring InfluxDB connection settings using YAML is being removed but the import failed because Home Assistant could not connect to the InfluxDB server.\n\nPlease correct your YAML configuration and restart Home Assistant.\n\nAlternatively you can remove the `{domain}` connection and authentication keys from your `configuration.yaml` file and continue to [set up the integration]({url}) manually. \n\nThe following keys should be removed:\n- `api_version`\n- `host`\n- `port`\n- `ssl`\n- `verify_ssl`\n- `ssl_ca_cert`\n- `username`\n- `password`\n- `database`\n- `token`\n- `organization`\n- `bucket`\n- `path`",
"title": "Failed to import InfluxDB YAML configuration"
},
"deprecated_yaml_import_issue_invalid_auth": {
"description": "Configuring InfluxDB connection settings using YAML is being removed but the import failed because the provided credentials are invalid.\n\nPlease correct your YAML configuration and restart Home Assistant.\n\nAlternatively you can remove the `{domain}` connection and authentication keys from your `configuration.yaml` file and continue to [set up the integration]({url}) manually. \n\nThe following keys should be removed:\n- `api_version`\n- `host`\n- `port`\n- `ssl`\n- `verify_ssl`\n- `ssl_ca_cert`\n- `username`\n- `password`\n- `database`\n- `token`\n- `organization`\n- `bucket`\n- `path`",
"title": "[%key:component::influxdb::issues::deprecated_yaml_import_issue_cannot_connect::title%]"
},
"deprecated_yaml_import_issue_invalid_database": {
"description": "Configuring InfluxDB connection settings using YAML is being removed but the import failed because the specified database was not found.\n\nPlease correct your YAML configuration and restart Home Assistant.\n\nAlternatively you can remove the `{domain}` connection and authentication keys from your `configuration.yaml` file and continue to [set up the integration]({url}) manually. \n\nThe following keys should be removed:\n- `api_version`\n- `host`\n- `port`\n- `ssl`\n- `verify_ssl`\n- `ssl_ca_cert`\n- `username`\n- `password`\n- `database`\n- `token`\n- `organization`\n- `bucket`\n- `path`",
"title": "[%key:component::influxdb::issues::deprecated_yaml_import_issue_cannot_connect::title%]"
},
"deprecated_yaml_import_issue_ssl_error": {
"description": "Configuring InfluxDB connection settings using YAML is being removed but the import failed due to an SSL certificate error.\n\nPlease correct your YAML configuration and restart Home Assistant.\n\nAlternatively you can remove the `{domain}` connection and authentication keys from your `configuration.yaml` file and continue to [set up the integration]({url}) manually. \n\nThe following keys should be removed:\n- `api_version`\n- `host`\n- `port`\n- `ssl`\n- `verify_ssl`\n- `ssl_ca_cert`\n- `username`\n- `password`\n- `database`\n- `token`\n- `organization`\n- `bucket`\n- `path`",
"title": "[%key:component::influxdb::issues::deprecated_yaml_import_issue_cannot_connect::title%]"
},
"deprecated_yaml_import_issue_unknown": {
"description": "Configuring InfluxDB connection settings using YAML is being removed but the import failed due to an unknown error.\n\nPlease correct your YAML configuration and restart Home Assistant.\n\nAlternatively you can remove the `{domain}` connection and authentication keys from your `configuration.yaml` file and continue to [set up the integration]({url}) manually. \n\nThe following keys should be removed:\n- `api_version`\n- `host`\n- `port`\n- `ssl`\n- `verify_ssl`\n- `ssl_ca_cert`\n- `username`\n- `password`\n- `database`\n- `token`\n- `organization`\n- `bucket`\n- `path`",
"title": "[%key:component::influxdb::issues::deprecated_yaml_import_issue_cannot_connect::title%]"
}
}
}

View File

@@ -67,6 +67,22 @@ NUMBER_SETTINGS_DATA = [
fmt_from="format_round",
fmt_to="format_round_back",
),
PlenticoreNumberEntityDescription(
key="active_power_limitation",
device_class=NumberDeviceClass.POWER,
entity_category=EntityCategory.CONFIG,
entity_registry_enabled_default=False,
icon="mdi:solar-power",
name="Active Power Limitation",
native_unit_of_measurement=UnitOfPower.WATT,
native_max_value=10000,
native_min_value=0,
native_step=1,
module_id="devices:local",
data_id="Inverter:ActivePowerLimitation",
fmt_from="format_round",
fmt_to="format_round_back",
),
]

View File

@@ -1,9 +1,8 @@
"""Support for Meteoclimatic weather data."""
import logging
from typing import Any
from meteoclimatic import MeteoclimaticClient
from meteoclimatic import MeteoclimaticClient, Observation
from meteoclimatic.exceptions import MeteoclimaticError
from homeassistant.config_entries import ConfigEntry
@@ -17,7 +16,7 @@ _LOGGER = logging.getLogger(__name__)
type MeteoclimaticConfigEntry = ConfigEntry[MeteoclimaticUpdateCoordinator]
class MeteoclimaticUpdateCoordinator(DataUpdateCoordinator[dict[str, Any]]):
class MeteoclimaticUpdateCoordinator(DataUpdateCoordinator[Observation]):
"""Coordinator for Meteoclimatic weather data."""
config_entry: MeteoclimaticConfigEntry
@@ -34,12 +33,11 @@ class MeteoclimaticUpdateCoordinator(DataUpdateCoordinator[dict[str, Any]]):
)
self._meteoclimatic_client = MeteoclimaticClient()
async def _async_update_data(self) -> dict[str, Any]:
async def _async_update_data(self) -> Observation:
"""Obtain the latest data from Meteoclimatic."""
try:
data = await self.hass.async_add_executor_job(
return await self.hass.async_add_executor_job(
self._meteoclimatic_client.weather_at_station, self._station_code
)
except MeteoclimaticError as err:
raise UpdateFailed(f"Error while retrieving data: {err}") from err
return data.__dict__

View File

@@ -1,5 +1,7 @@
"""Support for Meteoclimatic sensor."""
from typing import TYPE_CHECKING
from homeassistant.components.sensor import (
SensorDeviceClass,
SensorEntity,
@@ -139,26 +141,24 @@ class MeteoclimaticSensor(
"""Initialize the Meteoclimatic sensor."""
super().__init__(coordinator)
self.entity_description = description
station = self.coordinator.data["station"]
station = coordinator.data.station
self._attr_name = f"{station.name} {description.name}"
self._attr_unique_id = f"{station.code}_{description.key}"
@property
def device_info(self):
"""Return the device info."""
return DeviceInfo(
if TYPE_CHECKING:
assert coordinator.config_entry.unique_id is not None
self._attr_device_info = DeviceInfo(
entry_type=DeviceEntryType.SERVICE,
identifiers={(DOMAIN, self.platform.config_entry.unique_id)},
identifiers={(DOMAIN, coordinator.config_entry.unique_id)},
manufacturer=MANUFACTURER,
model=MODEL,
name=self.coordinator.name,
name=coordinator.name,
)
@property
def native_value(self):
def native_value(self) -> float | None:
"""Return the state of the sensor."""
return (
getattr(self.coordinator.data["weather"], self.entity_description.key)
getattr(self.coordinator.data.weather, self.entity_description.key)
if self.coordinator.data
else None
)

View File

@@ -48,49 +48,44 @@ class MeteoclimaticWeather(
def __init__(self, coordinator: MeteoclimaticUpdateCoordinator) -> None:
"""Initialise the weather platform."""
super().__init__(coordinator)
self._attr_unique_id = self.coordinator.data["station"].code
self._attr_name = self.coordinator.data["station"].name
@property
def device_info(self) -> DeviceInfo:
"""Return the device info."""
unique_id = self.coordinator.config_entry.unique_id
self._attr_unique_id = coordinator.data.station.code
self._attr_name = coordinator.data.station.name
if TYPE_CHECKING:
assert unique_id is not None
return DeviceInfo(
assert coordinator.config_entry.unique_id is not None
self._attr_device_info = DeviceInfo(
entry_type=DeviceEntryType.SERVICE,
identifiers={(DOMAIN, unique_id)},
identifiers={(DOMAIN, coordinator.config_entry.unique_id)},
manufacturer=MANUFACTURER,
model=MODEL,
name=self.coordinator.name,
name=coordinator.name,
)
@property
def condition(self) -> str | None:
"""Return the current condition."""
return format_condition(self.coordinator.data["weather"].condition)
return format_condition(self.coordinator.data.weather.condition)
@property
def native_temperature(self) -> float | None:
"""Return the temperature."""
return self.coordinator.data["weather"].temp_current
return self.coordinator.data.weather.temp_current
@property
def humidity(self) -> float | None:
"""Return the humidity."""
return self.coordinator.data["weather"].humidity_current
return self.coordinator.data.weather.humidity_current
@property
def native_pressure(self) -> float | None:
"""Return the pressure."""
return self.coordinator.data["weather"].pressure_current
return self.coordinator.data.weather.pressure_current
@property
def native_wind_speed(self) -> float | None:
"""Return the wind speed."""
return self.coordinator.data["weather"].wind_current
return self.coordinator.data.weather.wind_current
@property
def wind_bearing(self) -> float | None:
"""Return the wind bearing."""
return self.coordinator.data["weather"].wind_bearing
return self.coordinator.data.weather.wind_bearing

View File

@@ -268,6 +268,26 @@ class MotionTiltDevice(MotionPositionDevice):
_restore_tilt = True
@property
def supported_features(self) -> CoverEntityFeature:
"""Flag supported features."""
supported_features = (
CoverEntityFeature.OPEN
| CoverEntityFeature.CLOSE
| CoverEntityFeature.STOP
| CoverEntityFeature.OPEN_TILT
| CoverEntityFeature.CLOSE_TILT
| CoverEntityFeature.STOP_TILT
)
if self.current_cover_position is not None:
supported_features |= CoverEntityFeature.SET_POSITION
if self.current_cover_tilt_position is not None:
supported_features |= CoverEntityFeature.SET_TILT_POSITION
return supported_features
@property
def current_cover_tilt_position(self) -> int | None:
"""Return current angle of cover.

View File

@@ -43,6 +43,8 @@ PLATFORMS: list[Platform] = [
CONFIG_SCHEMA = cv.empty_config_schema(DOMAIN)
type MotionConfigEntry = ConfigEntry[MotionDevice]
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up Motionblinds Bluetooth integration."""
@@ -56,7 +58,7 @@ async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
return True
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_setup_entry(hass: HomeAssistant, entry: MotionConfigEntry) -> bool:
"""Set up Motionblinds Bluetooth device from a config entry."""
_LOGGER.debug("(%s) Setting up device", entry.data[CONF_MAC_CODE])
@@ -95,11 +97,11 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
)
)
hass.data.setdefault(DOMAIN, {})[entry.entry_id] = device
# Register OptionsFlow update listener
entry.async_on_unload(entry.add_update_listener(options_update_listener))
entry.runtime_data = device
await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
# Apply options
@@ -112,7 +114,9 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
return True
async def options_update_listener(hass: HomeAssistant, entry: ConfigEntry) -> None:
async def options_update_listener(
hass: HomeAssistant, entry: MotionConfigEntry
) -> None:
"""Handle options update."""
_LOGGER.debug(
"(%s) Updated device options: %s", entry.data[CONF_MAC_CODE], entry.options
@@ -120,10 +124,10 @@ async def options_update_listener(hass: HomeAssistant, entry: ConfigEntry) -> No
await apply_options(hass, entry)
async def apply_options(hass: HomeAssistant, entry: ConfigEntry) -> None:
async def apply_options(hass: HomeAssistant, entry: MotionConfigEntry) -> None:
"""Apply the options from the OptionsFlow."""
device: MotionDevice = hass.data[DOMAIN][entry.entry_id]
device = entry.runtime_data
disconnect_time: float | None = entry.options.get(OPTION_DISCONNECT_TIME, None)
permanent_connection: bool = entry.options.get(OPTION_PERMANENT_CONNECTION, False)
@@ -131,10 +135,7 @@ async def apply_options(hass: HomeAssistant, entry: ConfigEntry) -> None:
await device.set_permanent_connection(permanent_connection)
async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
async def async_unload_entry(hass: HomeAssistant, entry: MotionConfigEntry) -> bool:
"""Unload Motionblinds Bluetooth device from a config entry."""
if unload_ok := await hass.config_entries.async_unload_platforms(entry, PLATFORMS):
hass.data[DOMAIN].pop(entry.entry_id)
return unload_ok
return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)

View File
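The `__init__.py` diff above replaces `hass.data[DOMAIN][entry.entry_id]` bookkeeping with a typed `entry.runtime_data` (via `type MotionConfigEntry = ConfigEntry[MotionDevice]`), so unload no longer needs manual dict cleanup. The shape of the pattern, in a stdlib-only sketch with toy stand-in classes:

```python
from dataclasses import dataclass
from typing import Generic, TypeVar

T = TypeVar("T")


class ConfigEntry(Generic[T]):
    """Toy stand-in: the entry itself carries its runtime object."""

    runtime_data: T


@dataclass
class MotionDevice:
    """Hypothetical device object stored on the entry at setup time."""

    mac: str


entry: ConfigEntry[MotionDevice] = ConfigEntry()

# Setup stores the device on the entry ...
entry.runtime_data = MotionDevice(mac="AA:BB:CC")

# ... and platforms read it back, fully typed, with no hass.data lookup.
device = entry.runtime_data
print(device.mac)  # AA:BB:CC
```

Because the entry owns its runtime object, it is garbage-collected with the entry, which is why the diff can drop the `hass.data[DOMAIN].pop(entry.entry_id)` step from `async_unload_entry`.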

@@ -10,12 +10,12 @@ from typing import Any
from motionblindsble.device import MotionDevice
from homeassistant.components.button import ButtonEntity, ButtonEntityDescription
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import ATTR_CONNECT, ATTR_DISCONNECT, ATTR_FAVORITE, CONF_MAC_CODE, DOMAIN
from . import MotionConfigEntry
from .const import ATTR_CONNECT, ATTR_DISCONNECT, ATTR_FAVORITE, CONF_MAC_CODE
from .entity import MotionblindsBLEEntity
_LOGGER = logging.getLogger(__name__)
@@ -54,12 +54,12 @@ BUTTON_TYPES: list[MotionblindsBLEButtonEntityDescription] = [
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: MotionConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up button entities based on a config entry."""
device: MotionDevice = hass.data[DOMAIN][entry.entry_id]
device = entry.runtime_data
async_add_entities(
MotionblindsBLEButtonEntity(

View File

@@ -12,12 +12,7 @@ import voluptuous as vol
from homeassistant.components import bluetooth
from homeassistant.components.bluetooth import BluetoothServiceInfoBleak
from homeassistant.config_entries import (
ConfigEntry,
ConfigFlow,
ConfigFlowResult,
OptionsFlow,
)
from homeassistant.config_entries import ConfigFlow, ConfigFlowResult, OptionsFlow
from homeassistant.const import CONF_ADDRESS
from homeassistant.core import callback
from homeassistant.exceptions import HomeAssistantError
@@ -27,6 +22,7 @@ from homeassistant.helpers.selector import (
SelectSelectorMode,
)
from . import MotionConfigEntry
from .const import (
CONF_BLIND_TYPE,
CONF_LOCAL_NAME,
@@ -185,7 +181,7 @@ class FlowHandler(ConfigFlow, domain=DOMAIN):
@staticmethod
@callback
def async_get_options_flow(
config_entry: ConfigEntry,
config_entry: MotionConfigEntry,
) -> OptionsFlow:
"""Create the options flow."""
return OptionsFlowHandler()

View File

@@ -7,7 +7,6 @@ import logging
from typing import Any
from motionblindsble.const import MotionBlindType, MotionRunningType
from motionblindsble.device import MotionDevice
from homeassistant.components.cover import (
ATTR_POSITION,
@@ -17,11 +16,11 @@ from homeassistant.components.cover import (
CoverEntityDescription,
CoverEntityFeature,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import CONF_BLIND_TYPE, CONF_MAC_CODE, DOMAIN, ICON_VERTICAL_BLIND
from . import MotionConfigEntry
from .const import CONF_BLIND_TYPE, CONF_MAC_CODE, ICON_VERTICAL_BLIND
from .entity import MotionblindsBLEEntity
_LOGGER = logging.getLogger(__name__)
@@ -62,7 +61,7 @@ BLIND_TYPE_TO_ENTITY_DESCRIPTION: dict[str, MotionblindsBLECoverEntityDescriptio
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: MotionConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up cover entity based on a config entry."""
@@ -70,7 +69,7 @@ async def async_setup_entry(
cover_class: type[MotionblindsBLECoverEntity] = BLIND_TYPE_TO_CLASS[
entry.data[CONF_BLIND_TYPE].upper()
]
device: MotionDevice = hass.data[DOMAIN][entry.entry_id]
device = entry.runtime_data
entity_description: MotionblindsBLECoverEntityDescription = (
BLIND_TYPE_TO_ENTITY_DESCRIPTION[entry.data[CONF_BLIND_TYPE].upper()]
)

View File

@@ -5,14 +5,11 @@ from __future__ import annotations
from collections.abc import Iterable
from typing import Any
from motionblindsble.device import MotionDevice
from homeassistant.components.diagnostics import async_redact_data
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_UNIQUE_ID
from homeassistant.core import HomeAssistant
from .const import DOMAIN
from . import MotionConfigEntry
CONF_TITLE = "title"
@@ -24,10 +21,10 @@ TO_REDACT: Iterable[Any] = {
async def async_get_config_entry_diagnostics(
hass: HomeAssistant, entry: ConfigEntry
hass: HomeAssistant, entry: MotionConfigEntry
) -> dict[str, Any]:
"""Return diagnostics for a config entry."""
device: MotionDevice = hass.data[DOMAIN][entry.entry_id]
device = entry.runtime_data
return async_redact_data(
{

View File

@@ -5,11 +5,11 @@ import logging
from motionblindsble.const import MotionBlindType
from motionblindsble.device import MotionDevice
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_ADDRESS
from homeassistant.helpers.device_registry import CONNECTION_BLUETOOTH, DeviceInfo
from homeassistant.helpers.entity import Entity, EntityDescription
from . import MotionConfigEntry
from .const import CONF_BLIND_TYPE, CONF_MAC_CODE, MANUFACTURER
_LOGGER = logging.getLogger(__name__)
@@ -21,13 +21,10 @@ class MotionblindsBLEEntity(Entity):
_attr_has_entity_name = True
_attr_should_poll = False
device: MotionDevice
entry: ConfigEntry
def __init__(
self,
device: MotionDevice,
entry: ConfigEntry,
entry: MotionConfigEntry,
entity_description: EntityDescription,
unique_id_suffix: str | None = None,
) -> None:

View File

@@ -8,12 +8,12 @@ from motionblindsble.const import MotionBlindType, MotionSpeedLevel
from motionblindsble.device import MotionDevice
from homeassistant.components.select import SelectEntity, SelectEntityDescription
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import ATTR_SPEED, CONF_MAC_CODE, DOMAIN
from . import MotionConfigEntry
from .const import ATTR_SPEED, CONF_MAC_CODE
from .entity import MotionblindsBLEEntity
_LOGGER = logging.getLogger(__name__)
@@ -33,12 +33,12 @@ SELECT_TYPES: dict[str, SelectEntityDescription] = {
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: MotionConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up select entities based on a config entry."""
device: MotionDevice = hass.data[DOMAIN][entry.entry_id]
device = entry.runtime_data
if device.blind_type not in {MotionBlindType.CURTAIN, MotionBlindType.VERTICAL}:
async_add_entities([SpeedSelect(device, entry, SELECT_TYPES[ATTR_SPEED])])
@@ -50,7 +50,7 @@ class SpeedSelect(MotionblindsBLEEntity, SelectEntity):
def __init__(
self,
device: MotionDevice,
entry: ConfigEntry,
entry: MotionConfigEntry,
entity_description: SelectEntityDescription,
) -> None:
"""Initialize the speed select entity."""

View File

@@ -20,7 +20,6 @@ from homeassistant.components.sensor import (
SensorEntityDescription,
SensorStateClass,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
PERCENTAGE,
SIGNAL_STRENGTH_DECIBELS_MILLIWATT,
@@ -30,13 +29,13 @@ from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import StateType
from . import MotionConfigEntry
from .const import (
ATTR_BATTERY,
ATTR_CALIBRATION,
ATTR_CONNECTION,
ATTR_SIGNAL_STRENGTH,
CONF_MAC_CODE,
DOMAIN,
)
from .entity import MotionblindsBLEEntity
@@ -94,12 +93,12 @@ SENSORS: tuple[MotionblindsBLESensorEntityDescription, ...] = (
async def async_setup_entry(
hass: HomeAssistant,
entry: ConfigEntry,
entry: MotionConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up sensor entities based on a config entry."""
device: MotionDevice = hass.data[DOMAIN][entry.entry_id]
device = entry.runtime_data
entities: list[SensorEntity] = [
MotionblindsBLESensorEntity(device, entry, description)
@@ -118,7 +117,7 @@ class MotionblindsBLESensorEntity[_T](MotionblindsBLEEntity, SensorEntity):
def __init__(
self,
device: MotionDevice,
entry: ConfigEntry,
entry: MotionConfigEntry,
entity_description: MotionblindsBLESensorEntityDescription[_T],
) -> None:
"""Initialize the sensor entity."""
@@ -149,7 +148,7 @@ class BatterySensor(MotionblindsBLEEntity, SensorEntity):
def __init__(
self,
device: MotionDevice,
entry: ConfigEntry,
entry: MotionConfigEntry,
) -> None:
"""Initialize the sensor entity."""
entity_description = SensorEntityDescription(

View File

@@ -62,8 +62,6 @@ from .const import (
ATTR_WEBHOOK_ID,
CONF_ADMIN_PASSWORD,
CONF_ADMIN_USERNAME,
CONF_CLIENT,
CONF_COORDINATOR,
CONF_SURVEILLANCE_PASSWORD,
CONF_SURVEILLANCE_USERNAME,
CONF_WEBHOOK_SET,
@@ -308,10 +306,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
)
coordinator = MotionEyeUpdateCoordinator(hass, entry, client)
hass.data[DOMAIN][entry.entry_id] = {
CONF_CLIENT: client,
CONF_COORDINATOR: coordinator,
}
hass.data[DOMAIN][entry.entry_id] = coordinator
current_cameras: set[tuple[str, str]] = set()
device_registry = dr.async_get(hass)
@@ -373,8 +368,8 @@ async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
unload_ok = await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
if unload_ok:
config_data = hass.data[DOMAIN].pop(entry.entry_id)
await config_data[CONF_CLIENT].async_client_close()
coordinator = hass.data[DOMAIN].pop(entry.entry_id)
await coordinator.client.async_client_close()
return unload_ok
@@ -446,9 +441,8 @@ def _get_media_event_data(
if not config_entry_id or config_entry_id not in hass.data[DOMAIN]:
return {}
config_entry_data = hass.data[DOMAIN][config_entry_id]
client = config_entry_data[CONF_CLIENT]
coordinator = config_entry_data[CONF_COORDINATOR]
coordinator = hass.data[DOMAIN][config_entry_id]
client = coordinator.client
for identifier in device.identifiers:
data = split_motioneye_device_identifier(identifier)

View File

@@ -47,8 +47,6 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import get_camera_from_cameras, is_acceptable_camera, listen_for_new_cameras
from .const import (
CONF_ACTION,
CONF_CLIENT,
CONF_COORDINATOR,
CONF_STREAM_URL_TEMPLATE,
CONF_SURVEILLANCE_PASSWORD,
CONF_SURVEILLANCE_USERNAME,
@@ -98,7 +96,7 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up motionEye from a config entry."""
entry_data = hass.data[DOMAIN][entry.entry_id]
coordinator = hass.data[DOMAIN][entry.entry_id]
@callback
def camera_add(camera: dict[str, Any]) -> None:
@@ -112,8 +110,8 @@ async def async_setup_entry(
),
entry.data.get(CONF_SURVEILLANCE_PASSWORD, ""),
camera,
entry_data[CONF_CLIENT],
entry_data[CONF_COORDINATOR],
coordinator.client,
coordinator,
entry.options,
)
]

View File

@@ -30,8 +30,6 @@ ATTR_EVENT_TYPE: Final = "event_type"
ATTR_WEBHOOK_ID: Final = "webhook_id"
CONF_ACTION: Final = "action"
CONF_CLIENT: Final = "client"
CONF_COORDINATOR: Final = "coordinator"
CONF_ADMIN_PASSWORD: Final = "admin_password"
CONF_ADMIN_USERNAME: Final = "admin_username"
CONF_STREAM_URL_TEMPLATE: Final = "stream_url_template"

View File

@@ -22,7 +22,7 @@ from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import device_registry as dr
from . import get_media_url, split_motioneye_device_identifier
from .const import CONF_CLIENT, DOMAIN
from .const import DOMAIN
MIME_TYPE_MAP = {
"movies": "video/mp4",
@@ -74,7 +74,7 @@ class MotionEyeMediaSource(MediaSource):
self._verify_kind_or_raise(kind)
url = get_media_url(
self.hass.data[DOMAIN][config.entry_id][CONF_CLIENT],
self.hass.data[DOMAIN][config.entry_id].client,
self._get_camera_id_or_raise(config, device),
self._get_path_or_raise(path),
kind == "images",
@@ -276,7 +276,7 @@ class MotionEyeMediaSource(MediaSource):
base.children = []
client = self.hass.data[DOMAIN][config.entry_id][CONF_CLIENT]
client = self.hass.data[DOMAIN][config.entry_id].client
camera_id = self._get_camera_id_or_raise(config, device)
if kind == "movies":

View File

@@ -15,7 +15,7 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import StateType
from . import get_camera_from_cameras, listen_for_new_cameras
from .const import CONF_CLIENT, CONF_COORDINATOR, DOMAIN, TYPE_MOTIONEYE_ACTION_SENSOR
from .const import DOMAIN, TYPE_MOTIONEYE_ACTION_SENSOR
from .coordinator import MotionEyeUpdateCoordinator
from .entity import MotionEyeEntity
@@ -26,7 +26,7 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up motionEye from a config entry."""
entry_data = hass.data[DOMAIN][entry.entry_id]
coordinator = hass.data[DOMAIN][entry.entry_id]
@callback
def camera_add(camera: dict[str, Any]) -> None:
@@ -36,8 +36,8 @@ async def async_setup_entry(
MotionEyeActionSensor(
entry.entry_id,
camera,
entry_data[CONF_CLIENT],
entry_data[CONF_COORDINATOR],
coordinator.client,
coordinator,
entry.options,
)
]

View File

@@ -22,7 +22,7 @@ from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import get_camera_from_cameras, listen_for_new_cameras
from .const import CONF_CLIENT, CONF_COORDINATOR, DOMAIN, TYPE_MOTIONEYE_SWITCH_BASE
from .const import DOMAIN, TYPE_MOTIONEYE_SWITCH_BASE
from .coordinator import MotionEyeUpdateCoordinator
from .entity import MotionEyeEntity
@@ -72,7 +72,7 @@ async def async_setup_entry(
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up motionEye from a config entry."""
entry_data = hass.data[DOMAIN][entry.entry_id]
coordinator = hass.data[DOMAIN][entry.entry_id]
@callback
def camera_add(camera: dict[str, Any]) -> None:
@@ -82,8 +82,8 @@ async def async_setup_entry(
MotionEyeSwitch(
entry.entry_id,
camera,
entry_data[CONF_CLIENT],
entry_data[CONF_COORDINATOR],
coordinator.client,
coordinator,
entry.options,
entity_description,
)

View File

@@ -2,54 +2,20 @@
from __future__ import annotations
import asyncio
import logging
import mutesync
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from homeassistant.helpers import update_coordinator
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from .const import DOMAIN, UPDATE_INTERVAL_IN_MEETING, UPDATE_INTERVAL_NOT_IN_MEETING
from .const import DOMAIN
from .coordinator import MutesyncUpdateCoordinator
PLATFORMS = [Platform.BINARY_SENSOR]
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
"""Set up mütesync from a config entry."""
client = mutesync.PyMutesync(
entry.data["token"],
entry.data["host"],
async_get_clientsession(hass),
)
async def update_data():
"""Update the data."""
async with asyncio.timeout(2.5):
state = await client.get_state()
if state["muted"] is None or state["in_meeting"] is None:
raise update_coordinator.UpdateFailed("Got invalid response")
if state["in_meeting"]:
coordinator.update_interval = UPDATE_INTERVAL_IN_MEETING
else:
coordinator.update_interval = UPDATE_INTERVAL_NOT_IN_MEETING
return state
coordinator = hass.data.setdefault(DOMAIN, {})[entry.entry_id] = (
update_coordinator.DataUpdateCoordinator(
hass,
logging.getLogger(__name__),
config_entry=entry,
name=DOMAIN,
update_interval=UPDATE_INTERVAL_NOT_IN_MEETING,
update_method=update_data,
)
MutesyncUpdateCoordinator(hass, entry)
)
await coordinator.async_config_entry_first_refresh()

View File

@@ -3,11 +3,12 @@
from homeassistant.components.binary_sensor import BinarySensorEntity
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers import update_coordinator
from homeassistant.helpers.device_registry import DeviceEntryType, DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN
from .coordinator import MutesyncUpdateCoordinator
SENSORS = (
"in_meeting",
@@ -27,7 +28,7 @@ async def async_setup_entry(
)
class MuteStatus(update_coordinator.CoordinatorEntity, BinarySensorEntity):
class MuteStatus(CoordinatorEntity[MutesyncUpdateCoordinator], BinarySensorEntity):
"""Mütesync binary sensors."""
_attr_has_entity_name = True

View File

@@ -0,0 +1,58 @@
"""Coordinator for the mütesync integration."""
from __future__ import annotations
import asyncio
import logging
from typing import Any
import mutesync
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import DOMAIN, UPDATE_INTERVAL_IN_MEETING, UPDATE_INTERVAL_NOT_IN_MEETING
_LOGGER = logging.getLogger(__name__)
class MutesyncUpdateCoordinator(DataUpdateCoordinator[dict[str, Any]]):
"""Coordinator for the mütesync integration."""
config_entry: ConfigEntry
def __init__(
self,
hass: HomeAssistant,
entry: ConfigEntry,
) -> None:
"""Initialize the coordinator."""
super().__init__(
hass,
_LOGGER,
name=DOMAIN,
config_entry=entry,
update_interval=UPDATE_INTERVAL_NOT_IN_MEETING,
)
self._client = mutesync.PyMutesync(
entry.data["token"],
entry.data["host"],
async_get_clientsession(hass),
)
async def _async_update_data(self) -> dict[str, Any]:
"""Get data from the mütesync client."""
async with asyncio.timeout(2.5):
state = await self._client.get_state()
if state["muted"] is None or state["in_meeting"] is None:
raise UpdateFailed("Got invalid response")
if state["in_meeting"]:
self.update_interval = UPDATE_INTERVAL_IN_MEETING
else:
self.update_interval = UPDATE_INTERVAL_NOT_IN_MEETING
return state

View File
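The new `MutesyncUpdateCoordinator` keeps the adaptive-polling behavior: each refresh validates the payload and then shortens or lengthens the coordinator's own `update_interval` depending on whether a meeting is active. The branching can be sketched with the stdlib only (interval values here are illustrative, not the integration's constants):

```python
from datetime import timedelta

UPDATE_INTERVAL_IN_MEETING = timedelta(seconds=10)
UPDATE_INTERVAL_NOT_IN_MEETING = timedelta(seconds=60)


class AdaptiveCoordinator:
    """Minimal stand-in for the DataUpdateCoordinator subclass."""

    def __init__(self) -> None:
        self.update_interval = UPDATE_INTERVAL_NOT_IN_MEETING

    def process(self, state: dict) -> dict:
        """Validate the payload and adjust the polling interval."""
        if state["muted"] is None or state["in_meeting"] is None:
            raise ValueError("Got invalid response")
        self.update_interval = (
            UPDATE_INTERVAL_IN_MEETING
            if state["in_meeting"]
            else UPDATE_INTERVAL_NOT_IN_MEETING
        )
        return state


coordinator = AdaptiveCoordinator()
coordinator.process({"muted": False, "in_meeting": True})
print(coordinator.update_interval)  # 0:00:10
```

Polling faster only while a meeting is in progress keeps mute state responsive without hammering the device the rest of the day.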

@@ -17,7 +17,6 @@ from homeassistant.config_entries import (
ConfigEntry,
ConfigFlow,
ConfigFlowResult,
ConfigSubentryData,
ConfigSubentryFlow,
SubentryFlowResult,
)
@@ -30,15 +29,7 @@ from homeassistant.helpers.selector import (
TimeSelector,
)
from .const import (
CONF_FROM,
CONF_ROUTES,
CONF_TIME,
CONF_TO,
CONF_VIA,
DOMAIN,
INTEGRATION_TITLE,
)
from .const import CONF_FROM, CONF_TIME, CONF_TO, CONF_VIA, DOMAIN, INTEGRATION_TITLE
_LOGGER = logging.getLogger(__name__)
@@ -133,47 +124,6 @@ class NSConfigFlow(ConfigFlow, domain=DOMAIN):
errors=errors,
)
async def async_step_import(self, import_data: dict[str, Any]) -> ConfigFlowResult:
"""Handle import from YAML configuration."""
self._async_abort_entries_match({CONF_API_KEY: import_data[CONF_API_KEY]})
client = NSAPI(import_data[CONF_API_KEY])
try:
stations = await self.hass.async_add_executor_job(client.get_stations)
except HTTPError:
return self.async_abort(reason="invalid_auth")
except (RequestsConnectionError, Timeout):
return self.async_abort(reason="cannot_connect")
except Exception:
_LOGGER.exception("Unexpected exception validating API key")
return self.async_abort(reason="unknown")
station_codes = {station.code for station in stations}
subentries: list[ConfigSubentryData] = []
for route in import_data.get(CONF_ROUTES, []):
# Convert station codes to uppercase for consistency with UI routes
for key in (CONF_FROM, CONF_TO, CONF_VIA):
if key in route:
route[key] = route[key].upper()
if route[key] not in station_codes:
return self.async_abort(reason="invalid_station")
subentries.append(
ConfigSubentryData(
title=route[CONF_NAME],
subentry_type="route",
data=route,
unique_id=None,
)
)
return self.async_create_entry(
title=INTEGRATION_TITLE,
data={CONF_API_KEY: import_data[CONF_API_KEY]},
subentries=subentries,
)
@classmethod
@callback
def async_get_supported_subentry_types(

View File

@@ -12,7 +12,6 @@ AMS_TZ = ZoneInfo("Europe/Amsterdam")
# Update every 2 minutes
SCAN_INTERVAL = timedelta(minutes=2)
CONF_ROUTES = "routes"
CONF_FROM = "from"
CONF_TO = "to"
CONF_VIA = "via"

View File

@@ -5,42 +5,24 @@ from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass
from datetime import datetime
import logging
from typing import Any
from ns_api import Trip
import voluptuous as vol
from homeassistant.components.sensor import (
PLATFORM_SCHEMA as SENSOR_PLATFORM_SCHEMA,
SensorDeviceClass,
SensorEntity,
SensorEntityDescription,
)
from homeassistant.config_entries import SOURCE_IMPORT
from homeassistant.const import CONF_API_KEY, CONF_NAME, EntityCategory
from homeassistant.core import DOMAIN as HOMEASSISTANT_DOMAIN, HomeAssistant
from homeassistant.data_entry_flow import FlowResultType
from homeassistant.helpers import config_validation as cv, issue_registry as ir
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity_platform import (
AddConfigEntryEntitiesCallback,
AddEntitiesCallback,
)
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType, StateType
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.typing import StateType
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .binary_sensor import get_delay
from .const import (
CONF_FROM,
CONF_ROUTES,
CONF_TIME,
CONF_TO,
CONF_VIA,
DOMAIN,
INTEGRATION_TITLE,
ROUTE_MODEL,
)
from .const import DOMAIN, INTEGRATION_TITLE, ROUTE_MODEL
from .coordinator import NSConfigEntry, NSDataUpdateCoordinator
@@ -70,26 +52,9 @@ TRIP_STATUS = {
"CANCELLED": "cancelled",
}
_LOGGER = logging.getLogger(__name__)
PARALLEL_UPDATES = 0 # since we use coordinator pattern
ROUTE_SCHEMA = vol.Schema(
{
vol.Required(CONF_NAME): cv.string,
vol.Required(CONF_FROM): cv.string,
vol.Required(CONF_TO): cv.string,
vol.Optional(CONF_VIA): cv.string,
vol.Optional(CONF_TIME): cv.time,
}
)
ROUTES_SCHEMA = vol.All(cv.ensure_list, [ROUTE_SCHEMA])
PLATFORM_SCHEMA = SENSOR_PLATFORM_SCHEMA.extend(
{vol.Required(CONF_API_KEY): cv.string, vol.Optional(CONF_ROUTES): ROUTES_SCHEMA}
)
@dataclass(frozen=True, kw_only=True)
class NSSensorEntityDescription(SensorEntityDescription):
@@ -195,55 +160,6 @@ SENSOR_DESCRIPTIONS: tuple[NSSensorEntityDescription, ...] = (
)
async def async_setup_platform(
hass: HomeAssistant,
config: ConfigType,
async_add_entities: AddEntitiesCallback,
discovery_info: DiscoveryInfoType | None = None,
) -> None:
"""Set up the departure sensor."""
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": SOURCE_IMPORT},
data=config,
)
if (
result.get("type") is FlowResultType.ABORT
and result.get("reason") != "already_configured"
):
ir.async_create_issue(
hass,
DOMAIN,
f"deprecated_yaml_import_issue_{result.get('reason')}",
breaks_in_ha_version="2026.4.0",
is_fixable=False,
issue_domain=DOMAIN,
severity=ir.IssueSeverity.WARNING,
translation_key=f"deprecated_yaml_import_issue_{result.get('reason')}",
translation_placeholders={
"domain": DOMAIN,
"integration_title": INTEGRATION_TITLE,
},
)
return
ir.async_create_issue(
hass,
HOMEASSISTANT_DOMAIN,
"deprecated_yaml",
breaks_in_ha_version="2026.4.0",
is_fixable=False,
issue_domain=DOMAIN,
severity=ir.IssueSeverity.WARNING,
translation_key="deprecated_yaml",
translation_placeholders={
"domain": DOMAIN,
"integration_title": INTEGRATION_TITLE,
},
)
async def async_setup_entry(
hass: HomeAssistant,
config_entry: NSConfigEntry,

View File

@@ -127,23 +127,5 @@
"name": "Transfers"
}
}
},
"issues": {
"deprecated_yaml_import_issue_cannot_connect": {
"description": "Configuring Nederlandse Spoorwegen using YAML sensor platform is deprecated.\n\nWhile importing your configuration, Home Assistant could not connect to the NS API. Please check your internet connection and the status of the NS API, then restart Home Assistant to try again, or remove the existing YAML configuration and set the integration up via the UI.",
"title": "[%key:component::nederlandse_spoorwegen::issues::deprecated_yaml_import_issue_invalid_auth::title%]"
},
"deprecated_yaml_import_issue_invalid_auth": {
"description": "Configuring Nederlandse Spoorwegen using YAML sensor platform is deprecated.\n\nWhile importing your configuration, an invalid API key was found. Please update your YAML configuration, or remove the existing YAML configuration and set the integration up via the UI.",
"title": "Nederlandse Spoorwegen YAML configuration deprecated"
},
"deprecated_yaml_import_issue_invalid_station": {
"description": "Configuring Nederlandse Spoorwegen using YAML sensor platform is deprecated.\n\nWhile importing your configuration an invalid station was found. Please update your YAML configuration, or remove the existing YAML configuration and set the integration up via the UI.",
"title": "[%key:component::nederlandse_spoorwegen::issues::deprecated_yaml_import_issue_invalid_auth::title%]"
},
"deprecated_yaml_import_issue_unknown": {
"description": "Configuring Nederlandse Spoorwegen using YAML sensor platform is deprecated.\n\nWhile importing your configuration, an unknown error occurred. Please restart Home Assistant to try again, or remove the existing YAML configuration and set the integration up via the UI.",
"title": "[%key:component::nederlandse_spoorwegen::issues::deprecated_yaml_import_issue_invalid_auth::title%]"
}
}
}

View File

@@ -159,6 +159,15 @@ class ReolinkFlowHandler(ConfigFlow, domain=DOMAIN):
"""Handle discovery via dhcp."""
mac_address = format_mac(discovery_info.macaddress)
existing_entry = await self.async_set_unique_id(mac_address)
if existing_entry and CONF_HOST not in existing_entry.data:
_LOGGER.debug(
"Reolink DHCP discovered device with MAC '%s' and IP '%s', "
"but existing config entry does not have host, ignoring",
mac_address,
discovery_info.ip,
)
raise AbortFlow("already_configured")
if (
existing_entry
and CONF_PASSWORD in existing_entry.data

View File

@@ -0,0 +1,53 @@
"""Support for the Swing2Sleep Smarla button entities."""
from dataclasses import dataclass
from pysmarlaapi.federwiege.services.classes import Property
from homeassistant.components.button import ButtonEntity, ButtonEntityDescription
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import FederwiegeConfigEntry
from .entity import SmarlaBaseEntity, SmarlaEntityDescription
PARALLEL_UPDATES = 0
@dataclass(frozen=True, kw_only=True)
class SmarlaButtonEntityDescription(SmarlaEntityDescription, ButtonEntityDescription):
"""Class describing Swing2Sleep Smarla button entity."""
BUTTONS: list[SmarlaButtonEntityDescription] = [
SmarlaButtonEntityDescription(
key="send_diagnostics",
translation_key="send_diagnostics",
service="system",
property="send_diagnostic_data",
entity_category=EntityCategory.CONFIG,
),
]
async def async_setup_entry(
hass: HomeAssistant,
config_entry: FederwiegeConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up the Smarla buttons from config entry."""
federwiege = config_entry.runtime_data
async_add_entities(SmarlaButton(federwiege, desc) for desc in BUTTONS)
class SmarlaButton(SmarlaBaseEntity, ButtonEntity):
"""Representation of a Smarla button."""
entity_description: SmarlaButtonEntityDescription
_property: Property[str]
def press(self) -> None:
"""Press the button."""
self._property.set("Sent from Home Assistant")

View File

@@ -6,7 +6,13 @@ DOMAIN = "smarla"
HOST = "https://devices.swing2sleep.de"
PLATFORMS = [Platform.NUMBER, Platform.SENSOR, Platform.SWITCH, Platform.UPDATE]
PLATFORMS = [
Platform.BUTTON,
Platform.NUMBER,
Platform.SENSOR,
Platform.SWITCH,
Platform.UPDATE,
]
DEVICE_MODEL_NAME = "Smarla"
MANUFACTURER_NAME = "Swing2Sleep"

View File

@@ -9,6 +9,7 @@ from homeassistant.components.number import (
NumberEntityDescription,
NumberMode,
)
from homeassistant.const import PERCENTAGE
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
@@ -32,6 +33,7 @@ NUMBERS: list[SmarlaNumberEntityDescription] = [
native_max_value=100,
native_min_value=0,
native_step=1,
native_unit_of_measurement=PERCENTAGE,
mode=NumberMode.SLIDER,
),
]

View File

@@ -5,6 +5,7 @@ from dataclasses import dataclass
from pysmarlaapi.federwiege.services.classes import Property
from homeassistant.components.sensor import (
SensorDeviceClass,
SensorEntity,
SensorEntityDescription,
SensorStateClass,
@@ -35,6 +36,7 @@ SENSORS: list[SmarlaSensorEntityDescription] = [
property="oscillation",
multiple=True,
value_pos=0,
device_class=SensorDeviceClass.DISTANCE,
native_unit_of_measurement=UnitOfLength.MILLIMETERS,
state_class=SensorStateClass.MEASUREMENT,
),
@@ -45,6 +47,7 @@ SENSORS: list[SmarlaSensorEntityDescription] = [
property="oscillation",
multiple=True,
value_pos=1,
device_class=SensorDeviceClass.DURATION,
native_unit_of_measurement=UnitOfTime.MILLISECONDS,
state_class=SensorStateClass.MEASUREMENT,
),

View File

@@ -30,6 +30,11 @@
}
},
"entity": {
"button": {
"send_diagnostics": {
"name": "Send diagnostics"
}
},
"number": {
"intensity": {
"name": "Intensity"

View File

@@ -5,7 +5,11 @@ from typing import Any
from pysmarlaapi.federwiege.services.classes import Property
from homeassistant.components.switch import SwitchEntity, SwitchEntityDescription
from homeassistant.components.switch import (
SwitchDeviceClass,
SwitchEntity,
SwitchEntityDescription,
)
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
@@ -26,12 +30,14 @@ SWITCHES: list[SmarlaSwitchEntityDescription] = [
name=None,
service="babywiege",
property="swing_active",
device_class=SwitchDeviceClass.SWITCH,
),
SmarlaSwitchEntityDescription(
key="smart_mode",
translation_key="smart_mode",
service="babywiege",
property="smart_mode",
device_class=SwitchDeviceClass.SWITCH,
),
]

View File

@@ -592,7 +592,8 @@ def process_status(status: dict[str, ComponentStatus]) -> dict[str, ComponentSta
if "burner" in component:
burner_id = int(component.split("-")[-1])
component = f"burner-0{burner_id}"
if component in status:
# Don't delete 'lamp' component even when disabled
if component in status and component != "lamp":
del status[component]
for component_status in status.values():
process_component_status(component_status)
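The guard added in this hunk keeps the `lamp` component alive while pruning everything else that is absent from the reported status. A standalone sketch of that rule — `prune_disabled_component` is a hypothetical helper for illustration, not the integration's actual API:

```python
def prune_disabled_component(status: dict, component: str) -> dict:
    """Remove a disabled component from the status map, except 'lamp'.

    The samsungce.lamp component must survive pruning so it can still be
    exposed as a light entity even when the device reports it disabled.
    """
    if component in status and component != "lamp":
        del status[component]
    return status
```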

View File

@@ -3,9 +3,18 @@
from __future__ import annotations
import asyncio
from collections.abc import Callable
from typing import Any, cast
from pysmartthings import Attribute, Capability, Command, DeviceEvent, SmartThings
from pysmartthings import (
Attribute,
Capability,
Category,
Command,
ComponentStatus,
DeviceEvent,
SmartThings,
)
from homeassistant.components.light import (
ATTR_BRIGHTNESS,
@@ -21,6 +30,10 @@ from homeassistant.components.light import (
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.restore_state import RestoreEntity
from homeassistant.util.percentage import (
ordered_list_item_to_percentage,
percentage_to_ordered_list_item,
)
from . import FullDevice, SmartThingsConfigEntry
from .const import MAIN
@@ -32,6 +45,22 @@ CAPABILITIES = (
Capability.COLOR_TEMPERATURE,
)
LAMP_CAPABILITY_EXISTS: dict[str, Callable[[FullDevice, ComponentStatus], bool]] = {
"lamp": lambda _, __: True,
"hood": lambda device, component: (
Capability.SAMSUNG_CE_CONNECTION_STATE not in component
or component[Capability.SAMSUNG_CE_CONNECTION_STATE][
Attribute.CONNECTION_STATE
].value
!= "disconnected"
),
"cavity-02": lambda _, __: True,
"main": lambda device, component: (
device.device.components[MAIN].manufacturer_category
in {Category.MICROWAVE, Category.OVEN, Category.RANGE}
),
}
async def async_setup_entry(
hass: HomeAssistant,
@@ -40,12 +69,25 @@ async def async_setup_entry(
) -> None:
"""Add lights for a config entry."""
entry_data = entry.runtime_data
async_add_entities(
SmartThingsLight(entry_data.client, device)
entities: list[LightEntity] = [
SmartThingsLight(entry_data.client, device, component)
for device in entry_data.devices.values()
if Capability.SWITCH in device.status[MAIN]
and any(capability in device.status[MAIN] for capability in CAPABILITIES)
for component in device.status
if (
Capability.SWITCH in device.status[MAIN]
and any(capability in device.status[MAIN] for capability in CAPABILITIES)
and Capability.SAMSUNG_CE_LAMP not in device.status[component]
)
]
entities.extend(
SmartThingsLamp(entry_data.client, device, component)
for device in entry_data.devices.values()
for component, exists_fn in LAMP_CAPABILITY_EXISTS.items()
if component in device.status
and Capability.SAMSUNG_CE_LAMP in device.status[component]
and exists_fn(device, device.status[component])
)
async_add_entities(entities)
def convert_scale(
@@ -71,7 +113,9 @@ class SmartThingsLight(SmartThingsEntity, LightEntity, RestoreEntity):
# highest kelvin found supported across 20+ handlers.
_attr_max_color_temp_kelvin = 9000 # 111 mireds
def __init__(self, client: SmartThings, device: FullDevice) -> None:
def __init__(
self, client: SmartThings, device: FullDevice, component: str = MAIN
) -> None:
"""Initialize a SmartThingsLight."""
super().__init__(
client,
@@ -82,6 +126,7 @@ class SmartThingsLight(SmartThingsEntity, LightEntity, RestoreEntity):
Capability.SWITCH_LEVEL,
Capability.SWITCH,
},
component=component,
)
color_modes = set()
if self.supports_capability(Capability.COLOR_TEMPERATURE):
@@ -236,3 +281,117 @@ class SmartThingsLight(SmartThingsEntity, LightEntity, RestoreEntity):
) is None:
return None
return state == "on"
class SmartThingsLamp(SmartThingsEntity, LightEntity):
"""Define a SmartThings lamp component as a light entity."""
_attr_translation_key = "light"
def __init__(
self, client: SmartThings, device: FullDevice, component: str = MAIN
) -> None:
"""Initialize a SmartThingsLamp."""
super().__init__(
client,
device,
{Capability.SWITCH, Capability.SAMSUNG_CE_LAMP},
component=component,
)
levels = (
self.get_attribute_value(
Capability.SAMSUNG_CE_LAMP, Attribute.SUPPORTED_BRIGHTNESS_LEVEL
)
or []
)
color_modes = set()
if "off" not in levels or len(levels) > 2:
color_modes.add(ColorMode.BRIGHTNESS)
if not color_modes:
color_modes.add(ColorMode.ONOFF)
self._attr_color_mode = list(color_modes)[0]
self._attr_supported_color_modes = color_modes
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn the lamp on."""
# Switch/brightness/transition
if ATTR_BRIGHTNESS in kwargs:
await self.async_set_level(kwargs[ATTR_BRIGHTNESS])
return
if self.supports_capability(Capability.SWITCH):
await self.execute_device_command(Capability.SWITCH, Command.ON)
# if no switch, turn on via brightness level
else:
await self.async_set_level(255)
async def async_turn_off(self, **kwargs: Any) -> None:
"""Turn the lamp off."""
if self.supports_capability(Capability.SWITCH):
await self.execute_device_command(Capability.SWITCH, Command.OFF)
return
await self.execute_device_command(
Capability.SAMSUNG_CE_LAMP,
Command.SET_BRIGHTNESS_LEVEL,
argument="off",
)
async def async_set_level(self, brightness: int) -> None:
"""Set lamp brightness via supported levels."""
levels = (
self.get_attribute_value(
Capability.SAMSUNG_CE_LAMP, Attribute.SUPPORTED_BRIGHTNESS_LEVEL
)
or []
)
# remove 'off' for brightness mapping
if "off" in levels:
levels = [level for level in levels if level != "off"]
level = percentage_to_ordered_list_item(
levels, int(round(brightness * 100 / 255))
)
await self.execute_device_command(
Capability.SAMSUNG_CE_LAMP,
Command.SET_BRIGHTNESS_LEVEL,
argument=level,
)
# turn on switch separately if needed
if (
self.supports_capability(Capability.SWITCH)
and not self.is_on
and brightness > 0
):
await self.execute_device_command(Capability.SWITCH, Command.ON)
def _update_attr(self) -> None:
"""Update lamp-specific attributes."""
level = self.get_attribute_value(
Capability.SAMSUNG_CE_LAMP, Attribute.BRIGHTNESS_LEVEL
)
if level is None:
self._attr_brightness = None
return
levels = (
self.get_attribute_value(
Capability.SAMSUNG_CE_LAMP, Attribute.SUPPORTED_BRIGHTNESS_LEVEL
)
or []
)
if "off" in levels:
if level == "off":
self._attr_brightness = 0
return
levels = [level for level in levels if level != "off"]
percent = ordered_list_item_to_percentage(levels, level)
self._attr_brightness = int(convert_scale(percent, 100, 255))
@property
def is_on(self) -> bool | None:
"""Return true if lamp is on."""
if self.supports_capability(Capability.SWITCH):
state = self.get_attribute_value(Capability.SWITCH, Attribute.SWITCH)
if state is None:
return None
return state == "on"
if (brightness := self.brightness) is not None:
return brightness > 0
return None
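The brightness handling in `SmartThingsLamp` maps Home Assistant's 0-255 scale onto the lamp's discrete level list. A minimal sketch of that mapping — a simplified stand-in for `percentage_to_ordered_list_item`, assuming evenly spaced levels:

```python
from math import ceil


def brightness_to_level(levels: list[str], brightness: int) -> str:
    """Map a 0-255 brightness onto an ordered list of lamp levels."""
    # Drop 'off' before mapping, mirroring async_set_level above.
    usable = [level for level in levels if level != "off"]
    percentage = round(brightness * 100 / 255)
    # Evenly partition 1-100% across the remaining levels.
    index = ceil(percentage / (100 / len(usable)))
    return usable[max(0, index - 1)]
```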

View File

@@ -165,6 +165,11 @@
}
}
},
"light": {
"light": {
"name": "[%key:component::light::title%]"
}
},
"number": {
"cool_select_plus_temperature": {
"name": "CoolSelect+ temperature"

View File

@@ -13,5 +13,5 @@
"integration_type": "hub",
"iot_class": "cloud_polling",
"loggers": ["switchbot_api"],
"requirements": ["switchbot-api==2.10.0"]
"requirements": ["switchbot-api==2.11.0"]
}

View File

@@ -139,10 +139,10 @@ async def async_setup_entry(
action_wrapper=_AlarmActionWrapper(
master_mode.dpcode, master_mode
),
changed_by_wrapper=_AlarmChangedByWrapper.find_dpcode(
changed_by_wrapper=_AlarmChangedByWrapper.find_dpcode( # type: ignore[arg-type]
device, DPCode.ALARM_MSG
),
state_wrapper=_AlarmStateWrapper(
state_wrapper=_AlarmStateWrapper( # type: ignore[arg-type]
master_mode.dpcode, master_mode
),
)

View File

@@ -177,7 +177,7 @@ class _HvacModeWrapper(DPCodeEnumWrapper):
return None
return TUYA_HVAC_TO_HA[raw]
def _convert_value_to_raw_value( # type: ignore[override]
def _convert_value_to_raw_value(
self,
device: CustomerDevice,
value: HVACMode,
@@ -358,7 +358,7 @@ async def async_setup_entry(
device,
manager,
CLIMATE_DESCRIPTIONS[device.category],
current_humidity_wrapper=_RoundedIntegerWrapper.find_dpcode(
current_humidity_wrapper=_RoundedIntegerWrapper.find_dpcode( # type: ignore[arg-type]
device, DPCode.HUMIDITY_CURRENT
),
current_temperature_wrapper=temperature_wrappers[0],
@@ -367,7 +367,7 @@ async def async_setup_entry(
(DPCode.FAN_SPEED_ENUM, DPCode.LEVEL, DPCode.WINDSPEED),
prefer_function=True,
),
hvac_mode_wrapper=_HvacModeWrapper.find_dpcode(
hvac_mode_wrapper=_HvacModeWrapper.find_dpcode( # type: ignore[arg-type]
device, DPCode.MODE, prefer_function=True
),
preset_wrapper=_PresetWrapper.find_dpcode(
@@ -378,7 +378,7 @@ async def async_setup_entry(
switch_wrapper=DPCodeBooleanWrapper.find_dpcode(
device, DPCode.SWITCH, prefer_function=True
),
target_humidity_wrapper=_RoundedIntegerWrapper.find_dpcode(
target_humidity_wrapper=_RoundedIntegerWrapper.find_dpcode( # type: ignore[arg-type]
device, DPCode.HUMIDITY_SET, prefer_function=True
),
temperature_unit=temperature_wrappers[2],

View File

@@ -87,7 +87,7 @@ class _InstructionBooleanWrapper(DPCodeBooleanWrapper):
options = ["open", "close"]
_ACTION_MAPPINGS = {"open": True, "close": False}
def _convert_value_to_raw_value(self, device: CustomerDevice, value: str) -> bool: # type: ignore[override]
def _convert_value_to_raw_value(self, device: CustomerDevice, value: str) -> bool:
return self._ACTION_MAPPINGS[value]
@@ -291,19 +291,19 @@ async def async_setup_entry(
device,
manager,
description,
current_position=description.position_wrapper.find_dpcode(
current_position=description.position_wrapper.find_dpcode( # type: ignore[arg-type]
device, description.current_position
),
current_state_wrapper=description.current_state_wrapper.find_dpcode(
current_state_wrapper=description.current_state_wrapper.find_dpcode( # type: ignore[arg-type]
device, description.current_state
),
instruction_wrapper=_get_instruction_wrapper(
device, description
),
set_position=description.position_wrapper.find_dpcode(
set_position=description.position_wrapper.find_dpcode( # type: ignore[arg-type]
device, description.set_position, prefer_function=True
),
tilt_position=description.position_wrapper.find_dpcode(
tilt_position=description.position_wrapper.find_dpcode( # type: ignore[arg-type]
device,
(DPCode.ANGLE_HORIZONTAL, DPCode.ANGLE_VERTICAL),
prefer_function=True,

View File

@@ -49,7 +49,7 @@ class _AlarmMessageWrapper(DPCodeStringWrapper):
super().__init__(dpcode, type_information)
self.options = ["triggered"]
def read_device_status(
def read_device_status( # type: ignore[override]
self, device: CustomerDevice
) -> tuple[str, dict[str, Any]] | None:
"""Return the event attributes for the alarm message."""

View File

@@ -154,7 +154,7 @@ async def async_setup_entry(
oscillate_wrapper=DPCodeBooleanWrapper.find_dpcode(
device, _OSCILLATE_DPCODES, prefer_function=True
),
speed_wrapper=_get_speed_wrapper(device),
speed_wrapper=_get_speed_wrapper(device), # type: ignore[arg-type]
switch_wrapper=DPCodeBooleanWrapper.find_dpcode(
device, _SWITCH_DPCODES, prefer_function=True
),

View File

@@ -104,7 +104,7 @@ async def async_setup_entry(
device,
manager,
description,
current_humidity_wrapper=_RoundedIntegerWrapper.find_dpcode(
current_humidity_wrapper=_RoundedIntegerWrapper.find_dpcode( # type: ignore[arg-type]
device, description.current_humidity
),
mode_wrapper=DPCodeEnumWrapper.find_dpcode(
@@ -115,7 +115,7 @@ async def async_setup_entry(
description.dpcode or description.key,
prefer_function=True,
),
target_humidity_wrapper=_RoundedIntegerWrapper.find_dpcode(
target_humidity_wrapper=_RoundedIntegerWrapper.find_dpcode( # type: ignore[arg-type]
device, description.humidity, prefer_function=True
),
)

View File

@@ -633,17 +633,17 @@ async def async_setup_entry(
manager,
description,
brightness_wrapper=(
brightness_wrapper := _get_brightness_wrapper(
brightness_wrapper := _get_brightness_wrapper( # type: ignore[arg-type]
device, description
)
),
color_data_wrapper=_get_color_data_wrapper(
color_data_wrapper=_get_color_data_wrapper( # type: ignore[arg-type]
device, description, brightness_wrapper
),
color_mode_wrapper=DPCodeEnumWrapper.find_dpcode(
device, description.color_mode, prefer_function=True
),
color_temp_wrapper=_ColorTempWrapper.find_dpcode(
color_temp_wrapper=_ColorTempWrapper.find_dpcode( # type: ignore[arg-type]
device, description.color_temp, prefer_function=True
),
switch_wrapper=switch_wrapper,

View File

@@ -44,7 +44,7 @@
"iot_class": "cloud_push",
"loggers": ["tuya_sharing"],
"requirements": [
"tuya-device-handlers==0.0.10",
"tuya-device-handlers==0.0.11",
"tuya-device-sharing-sdk==0.2.8"
]
}

View File

@@ -1097,11 +1097,5 @@
"action_dpcode_not_found": {
"message": "Unable to process action as the device does not provide a corresponding function code (expected one of {expected} in {available})."
}
},
"issues": {
"deprecated_entity_new_valve": {
"description": "The Tuya entity `{entity}` is deprecated, replaced by a new valve entity.\nPlease update your dashboards, automations and scripts, disable `{entity}` and reload the integration/restart Home Assistant to fix this issue.",
"title": "{name} is deprecated"
}
}
}

View File

@@ -2,7 +2,6 @@
from __future__ import annotations
from dataclasses import dataclass
from typing import Any
from tuya_device_handlers.device_wrapper.base import DeviceWrapper
@@ -10,35 +9,19 @@ from tuya_device_handlers.device_wrapper.common import DPCodeBooleanWrapper
from tuya_sharing import CustomerDevice, Manager
from homeassistant.components.switch import (
DOMAIN as SWITCH_DOMAIN,
SwitchDeviceClass,
SwitchEntity,
SwitchEntityDescription,
)
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import entity_registry as er
from homeassistant.helpers.dispatcher import async_dispatcher_connect
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.issue_registry import (
IssueSeverity,
async_create_issue,
async_delete_issue,
)
from . import TuyaConfigEntry
from .const import DOMAIN, TUYA_DISCOVERY_NEW, DeviceCategory, DPCode
from .const import TUYA_DISCOVERY_NEW, DeviceCategory, DPCode
from .entity import TuyaEntity
@dataclass(frozen=True, kw_only=True)
class TuyaDeprecatedSwitchEntityDescription(SwitchEntityDescription):
"""Describes Tuya deprecated switch entity."""
deprecated: str
breaks_in_ha_version: str
# All descriptions can be found here. Mostly the Boolean data types in the
# default instruction set of each category end up being a Switch.
# https://developer.tuya.com/en/docs/iot/standarddescription?id=K9i5ql6waswzq
@@ -664,14 +647,6 @@ SWITCHES: dict[DeviceCategory, tuple[SwitchEntityDescription, ...]] = {
entity_category=EntityCategory.CONFIG,
),
),
DeviceCategory.SFKZQ: (
TuyaDeprecatedSwitchEntityDescription(
key=DPCode.SWITCH,
translation_key="switch",
deprecated="deprecated_entity_new_valve",
breaks_in_ha_version="2026.4.0",
),
),
DeviceCategory.SGBJ: (
SwitchEntityDescription(
key=DPCode.MUFFLING,
@@ -937,7 +912,6 @@ async def async_setup_entry(
) -> None:
"""Set up tuya sensors dynamically through tuya discovery."""
manager = entry.runtime_data.manager
entity_registry = er.async_get(hass)
@callback
def async_discover_device(device_ids: list[str]) -> None:
@@ -954,12 +928,6 @@ async def async_setup_entry(
device, description.key, prefer_function=True
)
)
and _check_deprecation(
hass,
device,
description,
entity_registry,
)
)
async_add_entities(entities)
@@ -971,55 +939,6 @@ async def async_setup_entry(
)
def _check_deprecation(
hass: HomeAssistant,
device: CustomerDevice,
description: SwitchEntityDescription,
entity_registry: er.EntityRegistry,
) -> bool:
"""Check entity deprecation.
Returns:
`True` if the entity should be created, `False` otherwise.
"""
# Not deprecated, just create it
if not isinstance(description, TuyaDeprecatedSwitchEntityDescription):
return True
unique_id = f"tuya.{device.id}{description.key}"
entity_id = entity_registry.async_get_entity_id(SWITCH_DOMAIN, DOMAIN, unique_id)
# Deprecated and not present in registry, skip creation
if not entity_id or not (entity_entry := entity_registry.async_get(entity_id)):
return False
# Deprecated and present in registry but disabled, remove it and skip creation
if entity_entry.disabled:
entity_registry.async_remove(entity_id)
async_delete_issue(
hass,
DOMAIN,
f"deprecated_entity_{unique_id}",
)
return False
# Deprecated and present in registry and enabled, raise issue and create it
async_create_issue(
hass,
DOMAIN,
f"deprecated_entity_{unique_id}",
breaks_in_ha_version=description.breaks_in_ha_version,
is_fixable=False,
severity=IssueSeverity.WARNING,
translation_key=description.deprecated,
translation_placeholders={
"name": f"{device.name} {entity_entry.name or entity_entry.original_name}",
"entity": entity_id,
},
)
return True
class TuyaSwitchEntity(TuyaEntity, SwitchEntity):
"""Tuya Switch Device."""

View File

@@ -129,10 +129,7 @@ rules:
status: exempt
comment: |
This integration does not have entities.
reconfiguration-flow:
status: exempt
comment: |
Nothing to reconfigure.
reconfiguration-flow: todo
repair-issues: todo
stale-devices:
status: exempt

View File

@@ -41,7 +41,7 @@ hass-nabucasa==1.15.0
hassil==3.5.0
home-assistant-bluetooth==1.13.1
home-assistant-frontend==20260302.0
home-assistant-intents==2026.2.13
home-assistant-intents==2026.3.3
httpx==0.28.1
ifaddr==0.2.0
Jinja2==3.1.6

requirements.txt generated
View File

@@ -28,7 +28,7 @@ ha-ffmpeg==3.2.2
hass-nabucasa==1.15.0
hassil==3.5.0
home-assistant-bluetooth==1.13.1
home-assistant-intents==2026.2.13
home-assistant-intents==2026.3.3
httpx==0.28.1
ifaddr==0.2.0
infrared-protocols==1.0.0

requirements_all.txt generated
View File

@@ -190,7 +190,7 @@ aioairzone-cloud==0.7.2
aioairzone==1.0.5
# homeassistant.components.alexa_devices
aioamazondevices==12.0.2
aioamazondevices==13.0.0
# homeassistant.components.ambient_network
# homeassistant.components.ambient_station
@@ -1226,7 +1226,7 @@ holidays==0.84
home-assistant-frontend==20260302.0
# homeassistant.components.conversation
home-assistant-intents==2026.2.13
home-assistant-intents==2026.3.3
# homeassistant.components.gentex_homelink
homelink-integration-api==0.0.1
@@ -3014,7 +3014,7 @@ surepy==0.9.0
swisshydrodata==0.1.0
# homeassistant.components.switchbot_cloud
switchbot-api==2.10.0
switchbot-api==2.11.0
# homeassistant.components.synology_srm
synology-srm==0.2.0
@@ -3124,7 +3124,7 @@ ttls==1.8.3
ttn_client==1.2.3
# homeassistant.components.tuya
tuya-device-handlers==0.0.10
tuya-device-handlers==0.0.11
# homeassistant.components.tuya
tuya-device-sharing-sdk==0.2.8

View File

@@ -181,7 +181,7 @@ aioairzone-cloud==0.7.2
aioairzone==1.0.5
# homeassistant.components.alexa_devices
aioamazondevices==12.0.2
aioamazondevices==13.0.0
# homeassistant.components.ambient_network
# homeassistant.components.ambient_station
@@ -1087,7 +1087,7 @@ holidays==0.84
home-assistant-frontend==20260302.0
# homeassistant.components.conversation
home-assistant-intents==2026.2.13
home-assistant-intents==2026.3.3
# homeassistant.components.gentex_homelink
homelink-integration-api==0.0.1
@@ -2544,7 +2544,7 @@ subarulink==0.7.15
surepy==0.9.0
# homeassistant.components.switchbot_cloud
switchbot-api==2.10.0
switchbot-api==2.11.0
# homeassistant.components.system_bridge
systembridgeconnector==5.4.3
@@ -2627,7 +2627,7 @@ ttls==1.8.3
ttn_client==1.2.3
# homeassistant.components.tuya
tuya-device-handlers==0.0.10
tuya-device-handlers==0.0.11
# homeassistant.components.tuya
tuya-device-sharing-sdk==0.2.8

View File

@@ -1,148 +0,0 @@
#!/usr/bin/env python3
"""Collect and merge pytest durations per test file."""
from __future__ import annotations
import argparse
import json
from pathlib import Path
from defusedxml import ElementTree as ET
def _load_json(path: Path) -> dict[str, float]:
"""Load durations from a JSON file."""
with path.open("r", encoding="utf-8") as file:
payload = json.load(file)
if not isinstance(payload, dict):
raise TypeError(f"Expected JSON object in {path}")
result: dict[str, float] = {}
for file_path, duration in payload.items():
if not isinstance(file_path, str) or not isinstance(duration, int | float):
continue
if duration <= 0:
continue
result[file_path] = float(duration)
return result
def _load_junit(path: Path) -> dict[str, float]:
"""Load durations from a JUnit XML file."""
tree = ET.parse(path)
root = tree.getroot()
result: dict[str, float] = {}
for testcase in root.iter("testcase"):
file_path = testcase.attrib.get("file")
if not file_path:
continue
raw_duration = testcase.attrib.get("time", "0")
try:
duration = float(raw_duration)
except ValueError:
continue
if duration <= 0:
continue
normalized = Path(file_path).as_posix()
result[normalized] = result.get(normalized, 0.0) + duration
return result
def _load_input(path: Path) -> dict[str, float]:
"""Load durations from either JSON or XML input."""
suffix = path.suffix.lower()
if suffix == ".json":
return _load_json(path)
if suffix == ".xml":
return _load_junit(path)
raise ValueError(f"Unsupported file type for {path}")
def merge_durations(
existing: dict[str, float],
incoming: dict[str, float],
smoothing: float,
) -> dict[str, float]:
"""Merge durations by smoothing with historical values.
Formula: merged = old * (1 - smoothing) + new * smoothing
"""
merged = dict(existing)
for file_path, duration in incoming.items():
previous = merged.get(file_path)
if previous is None:
merged[file_path] = duration
continue
merged[file_path] = (previous * (1 - smoothing)) + (duration * smoothing)
return merged
def main() -> None:
"""Run the duration collector."""
parser = argparse.ArgumentParser(
description="Collect and merge test durations from JUnit XML or JSON files"
)
parser.add_argument(
"inputs",
nargs="*",
type=Path,
help="Input files (.xml or .json)",
)
parser.add_argument(
"--existing",
type=Path,
help="Existing durations JSON file",
)
parser.add_argument(
"--output",
required=True,
type=Path,
help="Output JSON file",
)
parser.add_argument(
"--smoothing",
type=float,
default=0.35,
help="Weight for newly measured durations (0.0 to 1.0)",
)
args = parser.parse_args()
if not 0 <= args.smoothing <= 1:
raise ValueError("--smoothing must be between 0.0 and 1.0")
merged: dict[str, float] = {}
if args.existing and args.existing.exists():
merged = _load_json(args.existing)
incoming: dict[str, float] = {}
for input_file in args.inputs:
if not input_file.exists():
continue
for file_path, duration in _load_input(input_file).items():
incoming[file_path] = incoming.get(file_path, 0.0) + duration
merged = merge_durations(merged, incoming, args.smoothing)
args.output.parent.mkdir(parents=True, exist_ok=True)
with args.output.open("w", encoding="utf-8") as file:
json.dump(dict(sorted(merged.items())), file, indent=2)
file.write("\n")
print(
f"Wrote {len(merged)} file durations "
f"(updated {len(incoming)} from current run) to {args.output}"
)
if __name__ == "__main__":
main()

script/gen_copilot_instructions.py Normal file → Executable file
View File

@@ -7,7 +7,6 @@ Necessary until copilot can handle skills.
from __future__ import annotations
from pathlib import Path
import re
import sys
GENERATED_MESSAGE = (
@@ -18,54 +17,15 @@ SKILLS_DIR = Path(".claude/skills")
AGENTS_FILE = Path("AGENTS.md")
OUTPUT_FILE = Path(".github/copilot-instructions.md")
# Pattern to match markdown links to local files: [text](filename)
# Excludes URLs (http://, https://) and anchors (#)
LOCAL_LINK_PATTERN = re.compile(r"\[([^\]]+)\]\(([^)]+)\)")
EXCLUDED_SKILLS = {"github-pr-reviewer"}
def expand_file_references(content: str, skill_dir: Path) -> str:
"""Expand file references in skill content.
Finds markdown links to local files and replaces them with the file content
wrapped in reference tags.
"""
lines = content.split("\n")
result_lines: list[str] = []
for line in lines:
result_lines.append(line)
matches = list(LOCAL_LINK_PATTERN.finditer(line))
if not matches:
continue
# Check if any match is a local file reference
for match in matches:
link_path = match.group(2)
# Skip URLs and anchors
if link_path.startswith(("http://", "https://", "#", "/")):
continue
# Try to find the referenced file
ref_file = skill_dir / link_path
if ref_file.exists():
ref_content = ref_file.read_text().strip()
result_lines.append(f"<REFERENCE {ref_file.name}>")
result_lines.append(ref_content)
result_lines.append(f"<END REFERENCE {ref_file.name}>")
result_lines.append("")
break
return "\n".join(result_lines)
def gather_skills() -> list[tuple[str, str]]:
def gather_skills() -> list[tuple[str, Path]]:
"""Gather all skills from the skills directory.
Returns a list of tuples (skill_name, skill_content).
Returns a list of tuples (skill_name, skill_file_path).
"""
skills: list[tuple[str, str]] = []
skills: list[tuple[str, Path]] = []
if not SKILLS_DIR.exists():
return skills
@@ -74,6 +34,9 @@ def gather_skills() -> list[tuple[str, str]]:
if not skill_dir.is_dir():
continue
if skill_dir.name in EXCLUDED_SKILLS:
continue
skill_file = skill_dir / "SKILL.md"
if not skill_file.exists():
continue
@@ -91,13 +54,8 @@ def gather_skills() -> list[tuple[str, str]]:
if line.startswith("name:"):
skill_name = line[5:].strip()
break
# Remove frontmatter from content
skill_content = skill_content[end_idx + 3 :].strip()
# Expand file references in the skill content
skill_content = expand_file_references(skill_content, skill_dir)
skills.append((skill_name, skill_content))
skills.append((skill_name, skill_file))
return skills
@@ -115,13 +73,14 @@ def generate_output() -> str:
output_parts.append(agents_content.strip())
output_parts.append("")
# Add each skill
# Add skills section as a bullet list of name: path
skills = gather_skills()
for skill_name, skill_content in skills:
if skills:
output_parts.append("")
output_parts.append(f"# Skill: {skill_name}")
output_parts.append("# Skills")
output_parts.append("")
output_parts.append(skill_content)
for skill_name, skill_file in skills:
output_parts.append(f"- {skill_name}: {skill_file}")
output_parts.append("")
return "\n".join(output_parts)
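The removed `LOCAL_LINK_PATTERN` can be exercised on its own; note the script filtered URLs and anchors in a separate step rather than in the regex itself:

```python
import re

# Matches markdown links like [text](target); URL/anchor filtering
# happened afterwards in expand_file_references.
LOCAL_LINK_PATTERN = re.compile(r"\[([^\]]+)\]\(([^)]+)\)")

match = LOCAL_LINK_PATTERN.search("See [the guide](references.md) for details.")
```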

View File

@@ -5,9 +5,8 @@ from __future__ import annotations
import argparse
from dataclasses import dataclass, field
import json
from math import ceil
from pathlib import Path
from statistics import fmean
import subprocess
import sys
from typing import Final
@@ -21,14 +20,12 @@ class Bucket:
):
"""Initialize bucket."""
self.total_tests = 0
self.total_duration = 0.0
self._paths: list[str] = []
def add(self, part: TestFolder | TestFile) -> None:
"""Add tests to bucket."""
part.add_to_bucket()
self.total_tests += part.total_tests
self.total_duration += part.total_duration
self._paths.append(str(part.path))
def get_paths_line(self) -> str:
@@ -39,9 +36,9 @@ class Bucket:
class BucketHolder:
"""Class to hold buckets."""
def __init__(self, duration_per_bucket: float, bucket_count: int) -> None:
def __init__(self, tests_per_bucket: int, bucket_count: int) -> None:
"""Initialize bucket holder."""
self._duration_per_bucket = duration_per_bucket
self._tests_per_bucket = tests_per_bucket
self._bucket_count = bucket_count
self._buckets: list[Bucket] = [Bucket() for _ in range(bucket_count)]
@@ -49,26 +46,18 @@ class BucketHolder:
"""Split tests into buckets."""
digits = len(str(test_folder.total_tests))
sorted_tests = sorted(
test_folder.get_all_flatten(),
reverse=True,
key=lambda test: (test.total_duration, test.total_tests),
test_folder.get_all_flatten(), reverse=True, key=lambda x: x.total_tests
)
for tests in sorted_tests:
if tests.added_to_bucket:
# Already added to bucket
continue
print(
f"{tests.total_tests:>{digits}} tests in {tests.path} "
f"(~{tests.total_duration:.2f}s)"
)
smallest_bucket = min(
self._buckets, key=lambda bucket: bucket.total_duration
)
print(f"{tests.total_tests:>{digits}} tests in {tests.path}")
smallest_bucket = min(self._buckets, key=lambda x: x.total_tests)
is_file = isinstance(tests, TestFile)
if (
smallest_bucket.total_duration + tests.total_duration
< self._duration_per_bucket
smallest_bucket.total_tests + tests.total_tests < self._tests_per_bucket
) or is_file:
smallest_bucket.add(tests)
# Ensure all files from the same folder are in the same bucket
@@ -78,9 +67,7 @@ class BucketHolder:
if other_test is tests or isinstance(other_test, TestFolder):
continue
print(
f"{other_test.total_tests:>{digits}} tests in "
f"{other_test.path} (same bucket, "
f"~{other_test.total_duration:.2f}s)"
f"{other_test.total_tests:>{digits}} tests in {other_test.path} (same bucket)"
)
smallest_bucket.add(other_test)
@@ -92,10 +79,7 @@ class BucketHolder:
"""Create output file."""
with Path("pytest_buckets.txt").open("w") as file:
for idx, bucket in enumerate(self._buckets):
print(
f"Bucket {idx + 1} has {bucket.total_tests} tests "
f"(~{bucket.total_duration:.2f}s)"
)
print(f"Bucket {idx + 1} has {bucket.total_tests} tests")
file.write(bucket.get_paths_line())
@@ -104,7 +88,6 @@ class TestFile:
"""Class represents a single test file and the number of tests it has."""
total_tests: int
total_duration: float
path: Path
added_to_bucket: bool = field(default=False, init=False)
parent: TestFolder | None = field(default=None, init=False)
@@ -117,7 +100,7 @@ class TestFile:
def __gt__(self, other: TestFile) -> bool:
"""Return if greater than."""
return self.total_duration > other.total_duration
return self.total_tests > other.total_tests
class TestFolder:
@@ -133,11 +116,6 @@ class TestFolder:
"""Return total tests."""
return sum([test.total_tests for test in self.children.values()])
@property
def total_duration(self) -> float:
"""Return total estimated duration in seconds."""
return sum(test.total_duration for test in self.children.values())
@property
def added_to_bucket(self) -> bool:
"""Return if added to bucket."""
@@ -211,66 +189,12 @@ def collect_tests(path: Path) -> TestFolder:
print(f"Unexpected line: {line}")
sys.exit(1)
file = TestFile(int(total_tests), 0.0, Path(file_path))
file = TestFile(int(total_tests), Path(file_path))
folder.add_test_file(file)
return folder
def load_test_durations(path: Path | None) -> dict[str, float]:
"""Load known test durations keyed by file path."""
if path is None or not path.exists():
return {}
with path.open("r", encoding="utf-8") as file:
raw_data = json.load(file)
if not isinstance(raw_data, dict):
raise TypeError("Durations file should contain a JSON object")
durations: dict[str, float] = {}
for file_path, duration in raw_data.items():
if not isinstance(file_path, str) or not isinstance(duration, int | float):
continue
if duration <= 0:
continue
durations[file_path] = float(duration)
return durations
def assign_estimated_durations(
tests: TestFolder, known_durations: dict[str, float]
) -> tuple[float, int, int]:
"""Assign estimated durations to all test files.
Files with known timings use those values. New files (without timings)
receive an estimate based on average seconds per collected test.
"""
all_files = [file for file in tests.get_all_flatten() if isinstance(file, TestFile)]
known_seconds_per_test: list[float] = []
files_without_durations = []
for test_file in all_files:
if test_file.total_tests <= 0:
continue
duration = known_durations.get(str(test_file.path))
if duration is None:
files_without_durations.append(test_file)
continue
known_seconds_per_test.append(duration / test_file.total_tests)
test_file.total_duration = duration
default_seconds_per_test = (
fmean(known_seconds_per_test) if known_seconds_per_test else 0.1
)
for test_file in files_without_durations:
test_file.total_duration = test_file.total_tests * default_seconds_per_test
return default_seconds_per_test, len(files_without_durations), len(all_files)
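The estimation step above reduces to a mean of known seconds-per-test rates with a fixed fallback. A hedged sketch with made-up numbers (not measured timings):

```python
from statistics import fmean


def estimate_seconds_per_test(known_rates: list[float]) -> float:
    """Average known per-test rates; fall back to 0.1 s/test when none exist."""
    return fmean(known_rates) if known_rates else 0.1


# Two files with known timings: 25 s over 100 tests and 75 s over 100 tests.
rates = [25 / 100, 75 / 100]
assert estimate_seconds_per_test(rates) == 0.5
assert estimate_seconds_per_test([]) == 0.1

# A new file with 40 collected tests is then estimated at 40 * 0.5 = 20 s.
estimated_duration = 40 * estimate_seconds_per_test(rates)
print(estimated_duration)  # 20.0
```

Only files without known timings receive this estimate; files with recorded durations keep their measured values.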
def main() -> None:
"""Execute script."""
parser = argparse.ArgumentParser(description="Split tests into n buckets.")
@@ -293,33 +217,19 @@ def main() -> None:
help="Path to the test files to split into buckets",
type=Path,
)
parser.add_argument(
"--durations-file",
help="JSON file with per-test-file durations in seconds",
type=Path,
)
arguments = parser.parse_args()
print("Collecting tests...")
tests = collect_tests(arguments.path)
known_durations = load_test_durations(arguments.durations_file)
default_seconds_per_test, files_missing_durations, total_files = (
assign_estimated_durations(tests, known_durations)
)
tests_per_bucket = ceil(tests.total_tests / arguments.bucket_count)
duration_per_bucket = tests.total_duration / arguments.bucket_count
bucket_holder = BucketHolder(duration_per_bucket, arguments.bucket_count)
bucket_holder = BucketHolder(tests_per_bucket, arguments.bucket_count)
print("Splitting tests...")
bucket_holder.split_tests(tests)
print(f"Total tests: {tests.total_tests}")
print(f"Files missing durations: {files_missing_durations}")
print(f"Total files: {total_files}")
print(f"Fallback seconds per test: {default_seconds_per_test:.4f}")
print(f"Estimated total duration: {tests.total_duration:.2f}s")
print(f"Estimated duration per bucket: {duration_per_bucket:.2f}s")
print(f"Estimated tests per bucket: {tests_per_bucket}")
bucket_holder.create_ouput_file()
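The per-bucket targets computed in `main()` are plain arithmetic; a sketch with hypothetical totals (the real values come from collection and the durations file):

```python
from math import ceil

# Hypothetical totals, not taken from a real run.
total_tests = 1234
estimated_total_duration = 3600.0  # seconds
bucket_count = 10

# Ceiling division guarantees every test lands in some bucket even when
# the total does not divide evenly.
tests_per_bucket = ceil(total_tests / bucket_count)
duration_per_bucket = estimated_total_duration / bucket_count

print(tests_per_bucket)  # 124
print(duration_per_bucket)  # 360.0
```

Note the two branches visible in the diff: the duration-based variant feeds `duration_per_bucket` to `BucketHolder`, while the count-based variant feeds `tests_per_bucket`.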

View File

@@ -4,7 +4,6 @@ from collections.abc import Generator
from copy import deepcopy
from unittest.mock import AsyncMock, patch
from aioamazondevices.const.devices import DEVICE_TYPE_TO_MODEL
import pytest
from homeassistant.components.alexa_devices.const import (
@@ -51,9 +50,6 @@ def mock_amazon_devices_client() -> Generator[AsyncMock]:
client.get_devices_data.return_value = {
TEST_DEVICE_1_SN: deepcopy(TEST_DEVICE_1)
}
client.get_model_details = lambda device: DEVICE_TYPE_TO_MODEL.get(
device.device_type
)
client.send_sound_notification = AsyncMock()
yield client

View File

@@ -22,9 +22,12 @@ TEST_DEVICE_1 = AmazonDevice(
device_type="echo",
household_device=False,
device_owner_customer_id="amazon_ower_id",
device_cluster_members=[TEST_DEVICE_1_SN],
device_cluster_members={TEST_DEVICE_1_SN: TEST_DEVICE_1_ID},
online=True,
serial_number=TEST_DEVICE_1_SN,
manufacturer="Test manufacturer",
model="Test model",
hardware_version="1.0",
software_version="echo_test_software_version",
entity_id="11111111-2222-3333-4444-555555555555",
endpoint_id="G1234567890123456789012345678A",
@@ -78,9 +81,12 @@ TEST_DEVICE_2 = AmazonDevice(
device_type="echo",
household_device=True,
device_owner_customer_id="amazon_ower_id",
device_cluster_members=[TEST_DEVICE_2_SN],
device_cluster_members={TEST_DEVICE_2_SN: TEST_DEVICE_2_ID},
online=True,
serial_number=TEST_DEVICE_2_SN,
manufacturer="Test manufacturer 2",
model="Test model 2",
hardware_version="2.0",
software_version="echo_test_2_software_version",
entity_id="11111111-2222-3333-4444-555555555555",
endpoint_id="G1234567890123456789012345678A",

View File

@@ -6,9 +6,9 @@
'AUDIO_PLAYER',
'MICROPHONE',
]),
'device cluster members': list([
'echo_test_serial_number',
]),
'device cluster members': dict({
'echo_test_serial_number': 'echo_test_device_id',
}),
'device family': 'mine',
'device type': 'echo',
'online': True,
@@ -44,9 +44,9 @@
'AUDIO_PLAYER',
'MICROPHONE',
]),
'device cluster members': list([
'echo_test_serial_number',
]),
'device cluster members': dict({
'echo_test_serial_number': 'echo_test_device_id',
}),
'device family': 'mine',
'device type': 'echo',
'online': True,

View File

@@ -9,7 +9,7 @@
}),
'disabled_by': None,
'entry_type': None,
'hw_version': None,
'hw_version': '1.0',
'id': <ANY>,
'identifiers': set({
tuple(
@@ -19,8 +19,8 @@
}),
'labels': set({
}),
'manufacturer': 'Amazon',
'model': None,
'manufacturer': 'Test manufacturer',
'model': 'Test model',
'model_id': 'echo',
'name': 'Echo Test',
'name_by_user': None,

View File

@@ -8,15 +8,18 @@
'AUDIO_PLAYER',
'MICROPHONE',
]),
'device_cluster_members': list([
'echo_test_serial_number',
]),
'device_cluster_members': dict({
'echo_test_serial_number': 'echo_test_device_id',
}),
'device_family': 'mine',
'device_owner_customer_id': 'amazon_ower_id',
'device_type': 'echo',
'endpoint_id': 'G1234567890123456789012345678A',
'entity_id': '11111111-2222-3333-4444-555555555555',
'hardware_version': '1.0',
'household_device': False,
'manufacturer': 'Test manufacturer',
'model': 'Test model',
'notifications': dict({
'Alarm': dict({
'label': 'Morning Alarm',
@@ -75,15 +78,18 @@
'AUDIO_PLAYER',
'MICROPHONE',
]),
'device_cluster_members': list([
'echo_test_serial_number',
]),
'device_cluster_members': dict({
'echo_test_serial_number': 'echo_test_device_id',
}),
'device_family': 'mine',
'device_owner_customer_id': 'amazon_ower_id',
'device_type': 'echo',
'endpoint_id': 'G1234567890123456789012345678A',
'entity_id': '11111111-2222-3333-4444-555555555555',
'hardware_version': '1.0',
'household_device': False,
'manufacturer': 'Test manufacturer',
'model': 'Test model',
'notifications': dict({
'Alarm': dict({
'label': 'Morning Alarm',
@@ -142,15 +148,18 @@
'AUDIO_PLAYER',
'MICROPHONE',
]),
'device_cluster_members': list([
'echo_test_serial_number',
]),
'device_cluster_members': dict({
'echo_test_serial_number': 'echo_test_device_id',
}),
'device_family': 'mine',
'device_owner_customer_id': 'amazon_ower_id',
'device_type': 'echo',
'endpoint_id': 'G1234567890123456789012345678A',
'entity_id': '11111111-2222-3333-4444-555555555555',
'hardware_version': '1.0',
'household_device': False,
'manufacturer': 'Test manufacturer',
'model': 'Test model',
'notifications': dict({
'Alarm': dict({
'label': 'Morning Alarm',

View File

@@ -2,7 +2,10 @@
from unittest.mock import AsyncMock
from aioamazondevices.const.devices import SPEAKER_GROUP_FAMILY, SPEAKER_GROUP_MODEL
from aioamazondevices.const.devices import (
SPEAKER_GROUP_DEVICE_TYPE,
SPEAKER_GROUP_FAMILY,
)
from aioamazondevices.exceptions import CannotConnect, CannotRetrieveData
import pytest
@@ -114,7 +117,7 @@ async def test_alexa_dnd_group_removal(
identifiers={(DOMAIN, mock_config_entry.entry_id)},
name=mock_config_entry.title,
manufacturer="Amazon",
model=SPEAKER_GROUP_MODEL,
model=SPEAKER_GROUP_DEVICE_TYPE,
entry_type=dr.DeviceEntryType.SERVICE,
)
@@ -153,7 +156,7 @@ async def test_alexa_unsupported_notification_sensor_removal(
identifiers={(DOMAIN, mock_config_entry.entry_id)},
name=mock_config_entry.title,
manufacturer="Amazon",
model=SPEAKER_GROUP_MODEL,
model=SPEAKER_GROUP_DEVICE_TYPE,
entry_type=dr.DeviceEntryType.SERVICE,
)

View File

@@ -38,7 +38,13 @@ def _simulated_returns(index, global_measure=None):
4: 50.789, # frequency
6: 1.2345, # leak dcdc
7: 2.3456, # leak inverter
8: 12.345, # power in 1
9: 23.456, # power in 2
21: 9.876, # temperature
23: 123.456, # voltage in 1
25: 0.9876, # current in 1
26: 234.567, # voltage in 2
27: 1.234, # current in 2
30: 0.1234, # Isolation resistance
5: 12345, # energy
}
@@ -116,9 +122,15 @@ async def test_sensors(hass: HomeAssistant, entity_registry: EntityRegistry) ->
sensors = [
("sensor.mydevicename_grid_voltage", "235.9"),
("sensor.mydevicename_grid_current", "2.8"),
("sensor.mydevicename_frequency", "50.8"),
("sensor.mydevicename_grid_frequency", "50.8"),
("sensor.mydevicename_dc_dc_leak_current", "1.2345"),
("sensor.mydevicename_inverter_leak_current", "2.3456"),
("sensor.mydevicename_string_1_power", "12.3"),
("sensor.mydevicename_string_2_power", "23.5"),
("sensor.mydevicename_string_1_voltage", "123.5"),
("sensor.mydevicename_string_1_current", "1.0"),
("sensor.mydevicename_string_2_voltage", "234.6"),
("sensor.mydevicename_string_2_current", "1.2"),
("sensor.mydevicename_isolation_resistance", "0.1234"),
]
for entity_id, _ in sensors:

View File

@@ -43,6 +43,7 @@
'nb',
'ne',
'nl',
'pa',
'pl',
'pt',
'pt-BR',

View File

@@ -2091,6 +2091,129 @@ async def test_secondary_pipeline(
assert (await get_pipeline(None)) == "Primary Pipeline"
@pytest.mark.timeout(5)
async def test_pipeline_start_missing_wake_word_entity_state(
hass: HomeAssistant,
mock_client: APIClient,
mock_esphome_device: MockESPHomeDeviceType,
) -> None:
"""Test pipeline selection when a wake word entity has no state.
Regression test for an infinite loop that occurred when a wake word entity
existed in the entity registry but had no state in the state machine.
"""
assert await async_setup_component(hass, "assist_pipeline", {})
pipeline_data = hass.data[KEY_ASSIST_PIPELINE]
pipeline_id_to_name: dict[str, str] = {}
for pipeline_name in ("Primary Pipeline", "Secondary Pipeline"):
pipeline = await pipeline_data.pipeline_store.async_create_item(
{
"name": pipeline_name,
"language": "en-US",
"conversation_engine": None,
"conversation_language": "en-US",
"tts_engine": None,
"tts_language": None,
"tts_voice": None,
"stt_engine": None,
"stt_language": None,
"wake_word_entity": None,
"wake_word_id": None,
}
)
pipeline_id_to_name[pipeline.id] = pipeline_name
device_config = AssistSatelliteConfiguration(
available_wake_words=[
AssistSatelliteWakeWord("okay_nabu", "Okay Nabu", ["en"]),
AssistSatelliteWakeWord("hey_jarvis", "Hey Jarvis", ["en"]),
],
active_wake_words=["hey_jarvis"],
max_active_wake_words=2,
)
mock_client.get_voice_assistant_configuration.return_value = device_config
configuration_set = asyncio.Event()
async def wrapper(*args, **kwargs):
device_config.active_wake_words = kwargs["active_wake_words"]
configuration_set.set()
mock_client.set_voice_assistant_configuration = AsyncMock(side_effect=wrapper)
mock_device = await mock_esphome_device(
mock_client=mock_client,
device_info={
"voice_assistant_feature_flags": VoiceAssistantFeature.VOICE_ASSISTANT
| VoiceAssistantFeature.ANNOUNCE
},
)
await hass.async_block_till_done()
satellite = get_satellite_entity(hass, mock_device.device_info.mac_address)
assert satellite is not None
# Set primary/secondary wake words and assistants
await hass.services.async_call(
SELECT_DOMAIN,
SERVICE_SELECT_OPTION,
{ATTR_ENTITY_ID: "select.test_wake_word", "option": "Okay Nabu"},
blocking=True,
)
await hass.services.async_call(
SELECT_DOMAIN,
SERVICE_SELECT_OPTION,
{ATTR_ENTITY_ID: "select.test_assistant", "option": "Primary Pipeline"},
blocking=True,
)
await hass.services.async_call(
SELECT_DOMAIN,
SERVICE_SELECT_OPTION,
{ATTR_ENTITY_ID: "select.test_wake_word_2", "option": "Hey Jarvis"},
blocking=True,
)
await hass.services.async_call(
SELECT_DOMAIN,
SERVICE_SELECT_OPTION,
{
ATTR_ENTITY_ID: "select.test_assistant_2",
"option": "Secondary Pipeline",
},
blocking=True,
)
await hass.async_block_till_done()
# Remove state for primary wake word entity to simulate the bug scenario:
# entity exists in the registry but has no state in the state machine.
hass.states.async_remove("select.test_wake_word")
async def get_pipeline(wake_word_phrase):
with patch(
"homeassistant.components.assist_satellite.entity.async_pipeline_from_audio_stream",
) as mock_pipeline_from_audio_stream:
await satellite.handle_pipeline_start(
conversation_id="",
flags=0,
audio_settings=VoiceAssistantAudioSettings(),
wake_word_phrase=wake_word_phrase,
)
mock_pipeline_from_audio_stream.assert_called_once()
kwargs = mock_pipeline_from_audio_stream.call_args_list[0].kwargs
return pipeline_id_to_name[kwargs["pipeline_id"]]
# The primary wake word entity has no state, so the loop must skip it.
# The secondary wake word entity still has state, so "Hey Jarvis" matches.
assert (await get_pipeline("Hey Jarvis")) == "Secondary Pipeline"
# "Okay Nabu" can't match because its entity has no state — falls back to
# default pipeline (index 0).
assert (await get_pipeline("Okay Nabu")) == "Primary Pipeline"
# No wake word phrase also falls back to default.
assert (await get_pipeline(None)) == "Primary Pipeline"
async def test_custom_wake_words(
hass: HomeAssistant,
mock_client: APIClient,

View File

@@ -23,6 +23,10 @@ from homeassistant.components.homematicip_cloud.climate import (
ATTR_PRESET_END_TIME,
PERMANENT_END_TIME,
)
from homeassistant.components.homematicip_cloud.entity import (
ATTR_GROUP_MEMBER_UNREACHABLE,
)
from homeassistant.const import STATE_UNAVAILABLE
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ServiceValidationError
@@ -428,6 +432,30 @@ async def test_hmip_heating_group_heat_with_radiator(
]
async def test_hmip_heating_group_availability(
hass: HomeAssistant, default_mock_hap_factory: HomeFactory
) -> None:
"""Test heating group stays available when group member is unreachable."""
entity_id = "climate.badezimmer"
entity_name = "Badezimmer"
device_model = None
mock_hap = await default_mock_hap_factory.async_get_mock_hap(
test_groups=[entity_name]
)
ha_state, hmip_device = get_and_check_entity_basics(
hass, mock_hap, entity_id, entity_name, device_model
)
assert ha_state.state != STATE_UNAVAILABLE
assert not ha_state.attributes.get(ATTR_GROUP_MEMBER_UNREACHABLE)
await async_manipulate_test_data(hass, hmip_device, "unreach", True)
ha_state = hass.states.get(entity_id)
assert ha_state.state != STATE_UNAVAILABLE
assert ha_state.attributes[ATTR_GROUP_MEMBER_UNREACHABLE]
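The availability contract encoded by this test can be sketched in isolation (hypothetical class, not the integration's entity code): a group entity never goes unavailable just because a member is unreachable; the condition surfaces as a state attribute instead.

```python
class GroupEntitySketch:
    """Minimal model of a group that stays available despite unreachable members."""

    def __init__(self) -> None:
        self.member_unreachable = False

    @property
    def available(self) -> bool:
        # Group availability does not track individual member reachability.
        return True

    @property
    def extra_state_attributes(self) -> dict[str, bool]:
        return {"group_member_unreachable": self.member_unreachable}


group = GroupEntitySketch()
assert group.available
assert group.extra_state_attributes["group_member_unreachable"] is False

group.member_unreachable = True  # simulate "unreach" flipping to True
assert group.available  # still available
assert group.extra_state_attributes["group_member_unreachable"] is True
```

This matches both the climate and cover group tests: the state never becomes `unavailable`, only the `ATTR_GROUP_MEMBER_UNREACHABLE` attribute changes.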
async def test_hmip_heating_profile_default_name(
hass: HomeAssistant, default_mock_hap_factory: HomeFactory
) -> None:

View File

@@ -7,7 +7,10 @@ from homeassistant.components.cover import (
ATTR_CURRENT_TILT_POSITION,
CoverState,
)
from homeassistant.const import STATE_UNKNOWN
from homeassistant.components.homematicip_cloud.entity import (
ATTR_GROUP_MEMBER_UNREACHABLE,
)
from homeassistant.const import STATE_UNAVAILABLE, STATE_UNKNOWN
from homeassistant.core import HomeAssistant
from .helper import HomeFactory, async_manipulate_test_data, get_and_check_entity_basics
@@ -596,3 +599,25 @@ async def test_hmip_cover_slats_group(
assert len(hmip_device.mock_calls) == service_call_counter + 9
assert hmip_device.mock_calls[-1][0] == "set_shutter_stop_async"
assert hmip_device.mock_calls[-1][1] == ()
async def test_hmip_cover_shutter_group_availability(
hass: HomeAssistant, default_mock_hap_factory: HomeFactory
) -> None:
"""Test cover shutter group stays available when group member is unreachable."""
entity_id = "cover.rollos_shuttergroup"
entity_name = "Rollos ShutterGroup"
device_model = None
mock_hap = await default_mock_hap_factory.async_get_mock_hap(test_groups=["Rollos"])
ha_state, hmip_device = get_and_check_entity_basics(
hass, mock_hap, entity_id, entity_name, device_model
)
assert ha_state.state != STATE_UNAVAILABLE
assert not ha_state.attributes.get(ATTR_GROUP_MEMBER_UNREACHABLE)
await async_manipulate_test_data(hass, hmip_device, "unreach", True)
ha_state = hass.states.get(entity_id)
assert ha_state.state != STATE_UNAVAILABLE
assert ha_state.attributes[ATTR_GROUP_MEMBER_UNREACHABLE]

View File

@@ -479,6 +479,34 @@ async def test_setup_v2_ssl_cert(
ApiException("token"),
"invalid_config",
),
(
DEFAULT_API_VERSION,
{
CONF_URL: "http://localhost:8086",
CONF_VERIFY_SSL: False,
CONF_DB_NAME: "home_assistant",
CONF_USERNAME: "user",
CONF_PASSWORD: "pass",
},
DEFAULT_API_VERSION,
_get_write_api_mock_v1,
InfluxDBClientError("some other error"),
"cannot_connect",
),
(
DEFAULT_API_VERSION,
{
CONF_URL: "http://localhost:8086",
CONF_VERIFY_SSL: False,
CONF_DB_NAME: "home_assistant",
CONF_USERNAME: "user",
CONF_PASSWORD: "pass",
},
DEFAULT_API_VERSION,
_get_write_api_mock_v1,
Exception("unexpected"),
"unknown",
),
],
indirect=["mock_client"],
)

View File

@@ -5,6 +5,7 @@ from dataclasses import dataclass
import datetime
from http import HTTPStatus
import logging
from typing import Any
from unittest.mock import ANY, MagicMock, Mock, call, patch
import pytest
@@ -14,6 +15,7 @@ from homeassistant.components.influxdb.const import DEFAULT_BUCKET, DOMAIN
from homeassistant.config_entries import ConfigEntryState
from homeassistant.const import PERCENTAGE, STATE_OFF, STATE_ON, STATE_STANDBY
from homeassistant.core import HomeAssistant, split_entity_id
from homeassistant.helpers import issue_registry as ir
from homeassistant.setup import async_setup_component
from . import (
@@ -139,6 +141,7 @@ async def test_setup_config_full(
config_ext,
config_update,
get_write_api,
issue_registry: ir.IssueRegistry,
) -> None:
"""Test the setup with full configuration."""
config = {
@@ -166,6 +169,10 @@ async def test_setup_config_full(
assert entry.state == ConfigEntryState.LOADED
assert entry.data == full_config
assert issue_registry.async_get_issue(
domain=DOMAIN,
issue_id="deprecated_yaml",
)
@pytest.mark.parametrize(
@@ -315,31 +322,66 @@ async def test_setup_config_ssl(
@pytest.mark.parametrize(
("mock_client", "config_base", "config_ext", "get_write_api"),
("mock_client", "get_write_api", "config_ext"),
[
(influxdb.DEFAULT_API_VERSION, _get_write_api_mock_v1, {}),
(influxdb.DEFAULT_API_VERSION, _get_write_api_mock_v1, {"precision": "s"}),
],
indirect=["mock_client"],
)
async def test_setup_minimal_config_no_connection_keys(
hass: HomeAssistant,
mock_client,
get_write_api,
config_ext,
issue_registry: ir.IssueRegistry,
) -> None:
"""Test the setup with non-connection YAML keys creates no deprecation issue."""
config = {"influxdb": {}}
config["influxdb"].update(config_ext)
assert await async_setup_component(hass, influxdb.DOMAIN, config)
await hass.async_block_till_done()
assert get_write_api(mock_client).call_count == 2
conf_entries = hass.config_entries.async_entries(DOMAIN)
assert len(conf_entries) == 1
entry = conf_entries[0]
assert entry.state == ConfigEntryState.LOADED
assert entry.data == BASE_V1_CONFIG
assert not issue_registry.async_get_issue(domain=DOMAIN, issue_id="deprecated_yaml")
@pytest.mark.parametrize(
("mock_client", "config_ext", "config_base", "get_write_api"),
[
(
influxdb.DEFAULT_API_VERSION,
BASE_V1_CONFIG,
{},
_get_write_api_mock_v1,
),
(
influxdb.API_VERSION_2,
BASE_V2_CONFIG,
{
"api_version": influxdb.API_VERSION_2,
"organization": "org",
"token": "token",
},
BASE_V2_CONFIG,
_get_write_api_mock_v2,
),
],
indirect=["mock_client"],
)
async def test_setup_minimal_config(
hass: HomeAssistant, mock_client, config_base, config_ext, get_write_api
async def test_setup_minimal_config_with_connection_keys(
hass: HomeAssistant,
mock_client,
config_ext,
config_base,
get_write_api,
issue_registry: ir.IssueRegistry,
) -> None:
"""Test the setup with minimal configuration and defaults."""
"""Test the setup with connection keys creates a deprecation issue."""
config = {"influxdb": {}}
config["influxdb"].update(config_ext)
@@ -357,44 +399,30 @@ async def test_setup_minimal_config(
assert entry.state == ConfigEntryState.LOADED
assert entry.data == config_base
assert issue_registry.async_get_issue(domain=DOMAIN, issue_id="deprecated_yaml")
@pytest.mark.parametrize(
("mock_client", "config_ext", "get_write_api"),
"config_ext",
[
(influxdb.DEFAULT_API_VERSION, {"username": "user"}, _get_write_api_mock_v1),
(
influxdb.DEFAULT_API_VERSION,
{"token": "token", "organization": "organization"},
_get_write_api_mock_v1,
),
(
influxdb.API_VERSION_2,
{"api_version": influxdb.API_VERSION_2},
_get_write_api_mock_v2,
),
(
influxdb.API_VERSION_2,
{"api_version": influxdb.API_VERSION_2, "organization": "organization"},
_get_write_api_mock_v2,
),
(
influxdb.API_VERSION_2,
{
"api_version": influxdb.API_VERSION_2,
"token": "token",
"organization": "organization",
"username": "user",
"password": "pass",
},
_get_write_api_mock_v2,
),
{"username": "user"},
{"api_version": influxdb.API_VERSION_2, "organization": "organization"},
{"token": "token", "organization": "organization"},
{"api_version": influxdb.API_VERSION_2},
{
"api_version": influxdb.API_VERSION_2,
"token": "token",
"organization": "organization",
"username": "user",
"password": "pass",
},
],
indirect=["mock_client"],
)
async def test_invalid_config(
hass: HomeAssistant, mock_client, config_ext, get_write_api
async def test_invalid_config_schema(
hass: HomeAssistant,
config_ext,
) -> None:
"""Test the setup with invalid config or config options specified for wrong version."""
"""Test that invalid schema configs are rejected at setup."""
config = {"influxdb": {}}
config["influxdb"].update(config_ext)
@@ -2104,3 +2132,97 @@ async def test_precision(
assert write_api.call_count == 1
assert write_api.call_args == get_mock_call(body, precision)
write_api.reset_mock()
@pytest.mark.parametrize(
("mock_client", "config_ext", "get_write_api"),
[
(
influxdb.DEFAULT_API_VERSION,
{
"api_version": influxdb.DEFAULT_API_VERSION,
"host": "host",
"port": 123,
"username": "user",
"password": "password",
"database": "db",
"ssl": False,
"verify_ssl": False,
},
_get_write_api_mock_v1,
),
(
influxdb.API_VERSION_2,
{
"api_version": influxdb.API_VERSION_2,
"token": "token",
"organization": "organization",
"bucket": "bucket",
},
_get_write_api_mock_v2,
),
],
indirect=["mock_client"],
)
async def test_setup_import_connection_error(
hass: HomeAssistant,
mock_client: MagicMock,
config_ext: dict[str, Any],
get_write_api,
issue_registry: ir.IssueRegistry,
) -> None:
"""Test that a repair issue is created on import connection error."""
write_api = get_write_api(mock_client)
write_api.side_effect = ConnectionError("fail")
config = {"influxdb": {}}
config["influxdb"].update(config_ext)
assert await async_setup_component(hass, influxdb.DOMAIN, config)
await hass.async_block_till_done()
assert issue_registry.async_get_issue(
domain=DOMAIN,
issue_id="deprecated_yaml_import_issue_cannot_connect",
)
@pytest.mark.parametrize(
("mock_client", "config_ext", "get_write_api"),
[
(
influxdb.DEFAULT_API_VERSION,
{
"host": "localhost",
"username": "user",
"password": "password",
"database": "db",
},
_get_write_api_mock_v1,
),
],
indirect=["mock_client"],
)
async def test_setup_import_already_exists(
hass: HomeAssistant,
mock_client: MagicMock,
config_ext: dict[str, Any],
get_write_api,
issue_registry: ir.IssueRegistry,
) -> None:
"""Test that no error issue is created when a config entry already exists."""
mock_entry = MockConfigEntry(domain=DOMAIN, data=BASE_V1_CONFIG)
mock_entry.add_to_hass(hass)
config = {"influxdb": {}}
config["influxdb"].update(config_ext)
assert await async_setup_component(hass, influxdb.DOMAIN, config)
await hass.async_block_till_done()
# No error issue should be created for single_instance_allowed
for issue in issue_registry.issues.values():
assert "deprecated_yaml_import_issue" not in issue.issue_id
# Deprecation warning should still be shown
assert issue_registry.async_get_issue(domain=DOMAIN, issue_id="deprecated_yaml")

View File

@@ -25,6 +25,7 @@ DEFAULT_SETTING_VALUES = {
"Properties:VersionMC": "01.46",
"Battery:MinSoc": "5",
"Battery:MinHomeComsumption": "50",
"Inverter:ActivePowerLimitation": "8000",
},
"scb:network": {"Hostname": "scb"},
}
@@ -49,6 +50,15 @@ DEFAULT_SETTINGS = {
id="Battery:MinHomeComsumption",
type="byte",
),
SettingsData(
min="0",
max="10000",
default=None,
access="readwrite",
unit="W",
id="Inverter:ActivePowerLimitation",
type="byte",
),
],
"scb:network": [
SettingsData(

View File

@@ -52,6 +52,7 @@ async def test_entry_diagnostics(
"devices:local": [
"min='5' max='100' default=None access='readwrite' unit='%' id='Battery:MinSoc' type='byte'",
"min='50' max='38000' default=None access='readwrite' unit='W' id='Battery:MinHomeComsumption' type='byte'",
"min='0' max='10000' default=None access='readwrite' unit='W' id='Inverter:ActivePowerLimitation' type='byte'",
],
"scb:network": [
"min='1' max='63' default=None access='readwrite' unit=None id='Hostname' type='string'"

View File

@@ -41,6 +41,7 @@ async def test_setup_all_entries(
assert (
entity_registry.async_get("number.scb_battery_min_home_consumption") is not None
)
assert entity_registry.async_get("number.scb_active_power_limitation") is not None
@pytest.mark.usefixtures("entity_registry_enabled_by_default")
@@ -77,6 +78,7 @@ async def test_setup_no_entries(
assert entity_registry.async_get("number.scb_battery_min_soc") is None
assert entity_registry.async_get("number.scb_battery_min_home_consumption") is None
assert entity_registry.async_get("number.scb_active_power_limitation") is None
@pytest.mark.usefixtures("entity_registry_enabled_by_default")

View File

@@ -1,7 +1,5 @@
"""Test config flow for Nederlandse Spoorwegen integration."""
from datetime import time
from typing import Any
from unittest.mock import AsyncMock
import pytest
@@ -9,13 +7,12 @@ from requests import ConnectionError as RequestsConnectionError, HTTPError, Time
from homeassistant.components.nederlandse_spoorwegen.const import (
CONF_FROM,
CONF_ROUTES,
CONF_TIME,
CONF_TO,
CONF_VIA,
DOMAIN,
)
from homeassistant.config_entries import SOURCE_IMPORT, SOURCE_RECONFIGURE, SOURCE_USER
from homeassistant.config_entries import SOURCE_RECONFIGURE, SOURCE_USER
from homeassistant.const import CONF_API_KEY, CONF_NAME
from homeassistant.core import HomeAssistant
from homeassistant.data_entry_flow import FlowResultType
@@ -165,174 +162,6 @@ async def test_already_configured(
assert result["reason"] == "already_configured"
async def test_config_flow_import_success(
hass: HomeAssistant, mock_nsapi: AsyncMock, mock_setup_entry: AsyncMock
) -> None:
"""Test successful import flow from YAML configuration."""
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": SOURCE_IMPORT},
data={CONF_API_KEY: API_KEY},
)
assert result["type"] is FlowResultType.CREATE_ENTRY
assert result["title"] == "Nederlandse Spoorwegen"
assert result["data"] == {CONF_API_KEY: API_KEY}
assert not result["result"].subentries
@pytest.mark.parametrize(
("routes_data", "expected_routes_data"),
[
(
# Test with uppercase station codes (UI behavior)
[
{
CONF_NAME: "Home to Work",
CONF_FROM: "ASD",
CONF_TO: "RTD",
CONF_VIA: "HT",
CONF_TIME: time(hour=8, minute=30),
}
],
[
{
CONF_NAME: "Home to Work",
CONF_FROM: "ASD",
CONF_TO: "RTD",
CONF_VIA: "HT",
CONF_TIME: time(hour=8, minute=30),
}
],
),
(
# Test with lowercase station codes (converted to uppercase)
[
{
CONF_NAME: "Rotterdam-Amsterdam",
CONF_FROM: "rtd", # lowercase input
CONF_TO: "asd", # lowercase input
},
{
CONF_NAME: "Amsterdam-Haarlem",
CONF_FROM: "asd", # lowercase input
CONF_TO: "ht", # lowercase input
CONF_VIA: "rtd", # lowercase input
},
],
[
{
CONF_NAME: "Rotterdam-Amsterdam",
CONF_FROM: "RTD", # converted to uppercase
CONF_TO: "ASD", # converted to uppercase
},
{
CONF_NAME: "Amsterdam-Haarlem",
CONF_FROM: "ASD", # converted to uppercase
CONF_TO: "HT", # converted to uppercase
CONF_VIA: "RTD", # converted to uppercase
},
],
),
],
)
async def test_config_flow_import_with_routes(
hass: HomeAssistant,
mock_nsapi: AsyncMock,
mock_setup_entry: AsyncMock,
routes_data: list[dict[str, Any]],
expected_routes_data: list[dict[str, Any]],
) -> None:
"""Test import flow with routes from YAML configuration."""
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": SOURCE_IMPORT},
data={
CONF_API_KEY: API_KEY,
CONF_ROUTES: routes_data,
},
)
assert result["type"] is FlowResultType.CREATE_ENTRY
assert result["title"] == "Nederlandse Spoorwegen"
assert result["data"] == {CONF_API_KEY: API_KEY}
assert len(result["result"].subentries) == len(expected_routes_data)
subentries = list(result["result"].subentries.values())
for expected_route in expected_routes_data:
route_entry = next(
entry for entry in subentries if entry.title == expected_route[CONF_NAME]
)
assert route_entry.data == expected_route
assert route_entry.subentry_type == "route"
async def test_config_flow_import_with_unknown_station(
hass: HomeAssistant, mock_nsapi: AsyncMock, mock_setup_entry: AsyncMock
) -> None:
"""Test import flow aborts with unknown station in routes."""
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": SOURCE_IMPORT},
data={
CONF_API_KEY: API_KEY,
CONF_ROUTES: [
{
CONF_NAME: "Home to Work",
CONF_FROM: "HRM",
CONF_TO: "RTD",
CONF_VIA: "HT",
CONF_TIME: time(hour=8, minute=30),
}
],
},
)
assert result["type"] is FlowResultType.ABORT
assert result["reason"] == "invalid_station"
async def test_config_flow_import_already_configured(
hass: HomeAssistant, mock_config_entry: MockConfigEntry
) -> None:
"""Test import flow when integration is already configured."""
mock_config_entry.add_to_hass(hass)
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": SOURCE_IMPORT},
data={CONF_API_KEY: API_KEY},
)
assert result["type"] is FlowResultType.ABORT
assert result["reason"] == "already_configured"
@pytest.mark.parametrize(
("exception", "expected_error"),
[
(HTTPError("Invalid API key"), "invalid_auth"),
(Timeout("Cannot connect"), "cannot_connect"),
(RequestsConnectionError("Cannot connect"), "cannot_connect"),
(Exception("Unexpected error"), "unknown"),
],
)
async def test_import_flow_exceptions(
hass: HomeAssistant,
mock_nsapi: AsyncMock,
exception: Exception,
expected_error: str,
) -> None:
"""Test config flow handling different exceptions."""
mock_nsapi.get_stations.side_effect = exception
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": SOURCE_IMPORT}, data={CONF_API_KEY: API_KEY}
)
assert result["type"] is FlowResultType.ABORT
assert result["reason"] == expected_error
async def test_reconfigure_success(
hass: HomeAssistant, mock_nsapi: AsyncMock, mock_config_entry: MockConfigEntry
) -> None:

View File

@@ -11,7 +11,6 @@ from syrupy.assertion import SnapshotAssertion
from homeassistant.components.nederlandse_spoorwegen.const import (
CONF_FROM,
CONF_ROUTES,
CONF_TIME,
CONF_TO,
CONF_VIA,
@@ -19,19 +18,10 @@ from homeassistant.components.nederlandse_spoorwegen.const import (
INTEGRATION_TITLE,
SUBENTRY_TYPE_ROUTE,
)
from homeassistant.components.sensor import DOMAIN as SENSOR_DOMAIN
from homeassistant.config_entries import ConfigSubentryDataWithId
from homeassistant.const import (
CONF_API_KEY,
CONF_NAME,
CONF_PLATFORM,
STATE_UNKNOWN,
Platform,
)
from homeassistant.core import DOMAIN as HOMEASSISTANT_DOMAIN, HomeAssistant
from homeassistant.const import CONF_API_KEY, CONF_NAME, STATE_UNKNOWN, Platform
from homeassistant.core import HomeAssistant
import homeassistant.helpers.entity_registry as er
import homeassistant.helpers.issue_registry as ir
from homeassistant.setup import async_setup_component
from . import setup_integration
from .const import API_KEY
@@ -49,41 +39,6 @@ def mock_sensor_platform() -> Generator:
yield mock_platform
async def test_config_import(
hass: HomeAssistant,
mock_nsapi,
mock_setup_entry: AsyncMock,
issue_registry: ir.IssueRegistry,
) -> None:
"""Test sensor initialization."""
await async_setup_component(
hass,
SENSOR_DOMAIN,
{
SENSOR_DOMAIN: [
{
CONF_PLATFORM: DOMAIN,
CONF_API_KEY: API_KEY,
CONF_ROUTES: [
{
CONF_NAME: "Spoorwegen Nederlande Station",
CONF_FROM: "ASD",
CONF_TO: "RTD",
CONF_VIA: "HT",
}
],
}
]
},
)
await hass.async_block_till_done()
assert len(issue_registry.issues) == 1
assert (HOMEASSISTANT_DOMAIN, "deprecated_yaml") in issue_registry.issues
assert len(hass.config_entries.async_entries(DOMAIN)) == 1
@pytest.mark.freeze_time("2025-09-15 14:30:00+00:00")
@pytest.mark.usefixtures("entity_registry_enabled_by_default")
async def test_sensor(

View File

@@ -608,6 +608,33 @@ async def test_dhcp_ip_update_aborted_if_wrong_mac(
assert config_entry.data[CONF_HOST] == TEST_HOST
async def test_dhcp_ip_update_aborted_if_no_host(hass: HomeAssistant) -> None:
"""Test dhcp discovery does not update the IP if the config entry has no host."""
config_entry = MockConfigEntry(
domain=DOMAIN,
unique_id=format_mac(TEST_MAC),
data={},
options={
CONF_PROTOCOL: DEFAULT_PROTOCOL,
},
title=TEST_NVR_NAME,
)
config_entry.add_to_hass(hass)
dhcp_data = DhcpServiceInfo(
ip=TEST_HOST2,
hostname="Reolink",
macaddress=DHCP_FORMATTED_MAC,
)
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_DHCP}, data=dhcp_data
)
assert result["type"] is FlowResultType.ABORT
assert result["reason"] == "already_configured"
@pytest.mark.parametrize(
("attr", "value", "expected", "host_call_list"),
[

View File

@@ -105,6 +105,7 @@ def _mock_system_service() -> MagicMock:
mock_system_service.props = {
"firmware_update": MagicMock(spec=Property),
"firmware_update_status": MagicMock(spec=Property),
"send_diagnostic_data": MagicMock(spec=Property),
}
mock_system_service.props["firmware_update"].get.return_value = 0


@@ -0,0 +1,50 @@
# serializer version: 1
# name: test_entities[button.smarla_send_diagnostics-entry]
EntityRegistryEntrySnapshot({
'aliases': set({
}),
'area_id': None,
'capabilities': None,
'config_entry_id': <ANY>,
'config_subentry_id': <ANY>,
'device_class': None,
'device_id': <ANY>,
'disabled_by': None,
'domain': 'button',
'entity_category': <EntityCategory.CONFIG: 'config'>,
'entity_id': 'button.smarla_send_diagnostics',
'has_entity_name': True,
'hidden_by': None,
'icon': None,
'id': <ANY>,
'labels': set({
}),
'name': None,
'object_id_base': 'Send diagnostics',
'options': dict({
}),
'original_device_class': None,
'original_icon': None,
'original_name': 'Send diagnostics',
'platform': 'smarla',
'previous_unique_id': None,
'suggested_object_id': None,
'supported_features': 0,
'translation_key': 'send_diagnostics',
'unique_id': 'ABCD-send_diagnostics',
'unit_of_measurement': None,
})
# ---
# name: test_entities[button.smarla_send_diagnostics-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'friendly_name': 'Smarla Send diagnostics',
}),
'context': <ANY>,
'entity_id': 'button.smarla_send_diagnostics',
'last_changed': <ANY>,
'last_reported': <ANY>,
'last_updated': <ANY>,
'state': 'unknown',
})
# ---


@@ -37,7 +37,7 @@
'supported_features': 0,
'translation_key': 'intensity',
'unique_id': 'ABCD-intensity',
'unit_of_measurement': None,
'unit_of_measurement': '%',
})
# ---
# name: test_entities[number.smarla_intensity-state]
@@ -48,6 +48,7 @@
'min': 0,
'mode': <NumberMode.SLIDER: 'slider'>,
'step': 1,
'unit_of_measurement': '%',
}),
'context': <ANY>,
'entity_id': 'number.smarla_intensity',


@@ -76,8 +76,11 @@
'name': None,
'object_id_base': 'Amplitude',
'options': dict({
'sensor': dict({
'suggested_display_precision': 0,
}),
}),
'original_device_class': None,
'original_device_class': <SensorDeviceClass.DISTANCE: 'distance'>,
'original_icon': None,
'original_name': 'Amplitude',
'platform': 'smarla',
@@ -92,6 +95,7 @@
# name: test_entities[sensor.smarla_amplitude-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'device_class': 'distance',
'friendly_name': 'Smarla Amplitude',
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
'unit_of_measurement': <UnitOfLength.MILLIMETERS: 'mm'>,
@@ -129,8 +133,11 @@
'name': None,
'object_id_base': 'Period',
'options': dict({
'sensor': dict({
'suggested_display_precision': 0,
}),
}),
'original_device_class': None,
'original_device_class': <SensorDeviceClass.DURATION: 'duration'>,
'original_icon': None,
'original_name': 'Period',
'platform': 'smarla',
@@ -145,6 +152,7 @@
# name: test_entities[sensor.smarla_period-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'device_class': 'duration',
'friendly_name': 'Smarla Period',
'state_class': <SensorStateClass.MEASUREMENT: 'measurement'>,
'unit_of_measurement': <UnitOfTime.MILLISECONDS: 'ms'>,


@@ -23,7 +23,7 @@
'object_id_base': None,
'options': dict({
}),
'original_device_class': None,
'original_device_class': <SwitchDeviceClass.SWITCH: 'switch'>,
'original_icon': None,
'original_name': None,
'platform': 'smarla',
@@ -38,6 +38,7 @@
# name: test_entities[switch.smarla-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'device_class': 'switch',
'friendly_name': 'Smarla',
}),
'context': <ANY>,
@@ -72,7 +73,7 @@
'object_id_base': 'Smart Mode',
'options': dict({
}),
'original_device_class': None,
'original_device_class': <SwitchDeviceClass.SWITCH: 'switch'>,
'original_icon': None,
'original_name': 'Smart Mode',
'platform': 'smarla',
@@ -87,6 +88,7 @@
# name: test_entities[switch.smarla_smart_mode-state]
StateSnapshot({
'attributes': ReadOnlyDict({
'device_class': 'switch',
'friendly_name': 'Smarla Smart Mode',
}),
'context': <ANY>,


@@ -0,0 +1,67 @@
"""Test button platform for Swing2Sleep Smarla integration."""
from unittest.mock import MagicMock, patch
import pytest
from syrupy.assertion import SnapshotAssertion
from homeassistant.components.button import DOMAIN as BUTTON_DOMAIN, SERVICE_PRESS
from homeassistant.const import ATTR_ENTITY_ID, Platform
from homeassistant.core import HomeAssistant
from homeassistant.helpers import entity_registry as er
from . import setup_integration
from tests.common import MockConfigEntry, snapshot_platform
BUTTON_ENTITIES = [
{
"entity_id": "button.smarla_send_diagnostics",
"service": "system",
"property": "send_diagnostic_data",
},
]
@pytest.mark.usefixtures("mock_federwiege")
async def test_entities(
hass: HomeAssistant,
mock_config_entry: MockConfigEntry,
entity_registry: er.EntityRegistry,
snapshot: SnapshotAssertion,
) -> None:
"""Test the Smarla entities."""
with (
patch("homeassistant.components.smarla.PLATFORMS", [Platform.BUTTON]),
):
assert await setup_integration(hass, mock_config_entry)
await snapshot_platform(
hass, entity_registry, snapshot, mock_config_entry.entry_id
)
@pytest.mark.parametrize("entity_info", BUTTON_ENTITIES)
async def test_button_action(
hass: HomeAssistant,
mock_config_entry: MockConfigEntry,
mock_federwiege: MagicMock,
entity_info: dict[str, str],
) -> None:
"""Test Smarla Button press behavior."""
assert await setup_integration(hass, mock_config_entry)
mock_button_property = mock_federwiege.get_property(
entity_info["service"], entity_info["property"]
)
entity_id = entity_info["entity_id"]
# Press the button
await hass.services.async_call(
BUTTON_DOMAIN,
SERVICE_PRESS,
{ATTR_ENTITY_ID: entity_id},
blocking=True,
)
mock_button_property.set.assert_called_once()


@@ -471,13 +471,13 @@
"lamp": {
"switch": {
"switch": {
"value": "off",
"value": "on",
"timestamp": "2025-11-12T00:04:46.554Z"
}
},
"samsungce.lamp": {
"brightnessLevel": {
"value": "high",
"value": "low",
"timestamp": "2025-11-12T00:04:44.863Z"
},
"supportedBrightnessLevel": {

Some files were not shown because too many files have changed in this diff.