Compare commits

..

3 Commits

Author SHA1 Message Date
Paulus Schoutsen 25e0c26317 Reduce diff 2025-12-22 17:10:06 +01:00
Paulus Schoutsen 9180486291 Address comments 2025-12-22 17:08:45 +01:00
Paulus Schoutsen 310730a69d represent ThinQ hoods as fans instead of number entities 2025-12-22 16:54:15 +01:00
2736 changed files with 31890 additions and 132844 deletions

View File

@@ -1,784 +0,0 @@
---
name: Home Assistant Integration knowledge
description: Everything you need to know to build, test and review Home Assistant Integrations. If you're looking at an integration, you must use this as your primary reference.
---
### File Locations
- **Integration code**: `./homeassistant/components/<integration_domain>/`
- **Integration tests**: `./tests/components/<integration_domain>/`
## Integration Templates
### Standard Integration Structure
```
homeassistant/components/my_integration/
├── __init__.py # Entry point with async_setup_entry
├── manifest.json # Integration metadata and dependencies
├── const.py # Domain and constants
├── config_flow.py # UI configuration flow
├── coordinator.py # Data update coordinator (if needed)
├── entity.py # Base entity class (if shared patterns)
├── sensor.py # Sensor platform
├── strings.json # User-facing text and translations
├── services.yaml # Service definitions (if applicable)
└── quality_scale.yaml # Quality scale rule status
```
An integration can have platforms as needed (e.g., `sensor.py`, `switch.py`). The following platforms have extra guidelines:
- **Diagnostics**: [`platform-diagnostics.md`](platform-diagnostics.md) for diagnostic data collection
- **Repairs**: [`platform-repairs.md`](platform-repairs.md) for user-actionable repair issues
### Minimal Integration Checklist
- [ ] `manifest.json` with required fields (domain, name, codeowners, etc.)
- [ ] `__init__.py` with `async_setup_entry` and `async_unload_entry`
- [ ] `config_flow.py` with UI configuration support
- [ ] `const.py` with `DOMAIN` constant
- [ ] `strings.json` with at least config flow text
- [ ] Platform files (`sensor.py`, etc.) as needed
- [ ] `quality_scale.yaml` with rule status tracking
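A minimal `__init__.py` covering the setup and unload items above might look like this sketch (`MyClient` is a placeholder for the integration's library client, not a real dependency):
```python
"""My integration."""
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_HOST, Platform
from homeassistant.core import HomeAssistant

PLATFORMS = [Platform.SENSOR]

type MyConfigEntry = ConfigEntry[MyClient]  # MyClient: hypothetical library client


async def async_setup_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
    """Set up the integration from a config entry."""
    client = MyClient(entry.data[CONF_HOST])  # hypothetical client
    entry.runtime_data = client
    await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
    return True


async def async_unload_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
    """Unload a config entry."""
    return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
```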
## Integration Quality Scale
Home Assistant uses an Integration Quality Scale to ensure code quality and consistency. The quality level determines which rules apply:
### Quality Scale Levels
- **Bronze**: Basic requirements (ALL Bronze rules are mandatory)
- **Silver**: Enhanced functionality
- **Gold**: Advanced features
- **Platinum**: Highest quality standards
### Quality Scale Progression
- **Bronze → Silver**: Add entity unavailability, parallel updates, auth flows
- **Silver → Gold**: Add device management, diagnostics, translations
- **Gold → Platinum**: Add strict typing, async dependencies, websession injection
### How Rules Apply
1. **Check `manifest.json`**: Look for `"quality_scale"` key to determine integration level
2. **Bronze Rules**: Always required for any integration with quality scale
3. **Higher Tier Rules**: Only apply if integration targets that tier or higher
4. **Rule Status**: Check `quality_scale.yaml` in integration folder for:
- `done`: Rule implemented
- `exempt`: Rule doesn't apply (with reason in comment)
- `todo`: Rule needs implementation
### Example `quality_scale.yaml` Structure
```yaml
rules:
# Bronze (mandatory)
config-flow: done
entity-unique-id: done
action-setup:
status: exempt
comment: Integration does not register custom actions.
# Silver (if targeting Silver+)
entity-unavailable: done
parallel-updates: done
# Gold (if targeting Gold+)
devices: done
diagnostics: done
# Platinum (if targeting Platinum)
strict-typing: done
```
**When Reviewing/Creating Code**: Always check the integration's quality scale level and exemption status before applying rules.
## Code Organization
### Core Locations
- Shared constants: `homeassistant/const.py` (use these instead of hardcoding)
- Integration structure:
- `homeassistant/components/{domain}/const.py` - Constants
- `homeassistant/components/{domain}/models.py` - Data models
- `homeassistant/components/{domain}/coordinator.py` - Update coordinator
- `homeassistant/components/{domain}/config_flow.py` - Configuration flow
- `homeassistant/components/{domain}/{platform}.py` - Platform implementations
### Common Modules
- **coordinator.py**: Centralize data fetching logic
```python
class MyCoordinator(DataUpdateCoordinator[MyData]):
def __init__(self, hass: HomeAssistant, client: MyClient, config_entry: ConfigEntry) -> None:
super().__init__(
hass,
logger=LOGGER,
name=DOMAIN,
update_interval=timedelta(minutes=1),
config_entry=config_entry, # ✅ Pass config_entry - it's accepted and recommended
)
```
- **entity.py**: Base entity definitions to reduce duplication
```python
class MyEntity(CoordinatorEntity[MyCoordinator]):
_attr_has_entity_name = True
```
### Runtime Data Storage
- **Use ConfigEntry.runtime_data**: Store non-persistent runtime data
```python
type MyIntegrationConfigEntry = ConfigEntry[MyClient]
async def async_setup_entry(hass: HomeAssistant, entry: MyIntegrationConfigEntry) -> bool:
client = MyClient(entry.data[CONF_HOST])
entry.runtime_data = client
```
### Manifest Requirements
- **Required Fields**: `domain`, `name`, `codeowners`, `integration_type`, `documentation`, `requirements`
- **Integration Types**: `device`, `hub`, `service`, `system`, `helper`
- **IoT Class**: Always specify connectivity method (e.g., `cloud_polling`, `local_polling`, `local_push`)
- **Discovery Methods**: Add when applicable: `zeroconf`, `dhcp`, `bluetooth`, `ssdp`, `usb`
- **Dependencies**: Include platform dependencies (e.g., `application_credentials`, `bluetooth_adapters`)
### Config Flow Patterns
- **Version Control**: Always set `VERSION = 1` and `MINOR_VERSION = 1`
- **Unique ID Management**:
```python
await self.async_set_unique_id(device_unique_id)
self._abort_if_unique_id_configured()
```
- **Error Handling**: Define errors in `strings.json` under `config.error`
- **Step Methods**: Use standard naming (`async_step_user`, `async_step_discovery`, etc.)
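Putting these patterns together, a minimal user step could look like the following sketch (`validate_input`, `CannotConnect`, and `USER_SCHEMA` are illustrative placeholders, not Home Assistant APIs):
```python
class MyConfigFlow(ConfigFlow, domain=DOMAIN):
    """Handle a config flow for my_integration."""

    VERSION = 1
    MINOR_VERSION = 1

    async def async_step_user(
        self, user_input: dict[str, Any] | None = None
    ) -> ConfigFlowResult:
        """Handle the initial step."""
        errors: dict[str, str] = {}
        if user_input is not None:
            try:
                device = await validate_input(self.hass, user_input)  # hypothetical helper
            except CannotConnect:
                errors["base"] = "cannot_connect"
            else:
                await self.async_set_unique_id(device.serial)
                self._abort_if_unique_id_configured()
                return self.async_create_entry(title=device.name, data=user_input)
        return self.async_show_form(
            step_id="user", data_schema=USER_SCHEMA, errors=errors
        )
```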
### Integration Ownership
- **manifest.json**: Add GitHub usernames to `codeowners`:
```json
{
"domain": "my_integration",
"name": "My Integration",
"codeowners": ["@me"]
}
```
### Async Dependencies (Platinum)
- **Requirement**: All dependencies must use asyncio
- Ensures efficient task handling without thread context switching
### WebSession Injection (Platinum)
- **Pass WebSession**: Support passing web sessions to dependencies
```python
async def async_setup_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
"""Set up integration from config entry."""
client = MyClient(entry.data[CONF_HOST], async_get_clientsession(hass))
```
- For cookies: Use `async_create_clientsession` (aiohttp) or `create_async_httpx_client` (httpx)
### Data Update Coordinator
- **Standard Pattern**: Use for efficient data management
```python
class MyCoordinator(DataUpdateCoordinator):
def __init__(self, hass: HomeAssistant, client: MyClient, config_entry: ConfigEntry) -> None:
super().__init__(
hass,
logger=LOGGER,
name=DOMAIN,
update_interval=timedelta(minutes=5),
config_entry=config_entry, # ✅ Pass config_entry - it's accepted and recommended
)
self.client = client
async def _async_update_data(self):
try:
return await self.client.fetch_data()
except ApiError as err:
raise UpdateFailed(f"API communication error: {err}")
```
- **Error Types**: Use `UpdateFailed` for API errors, `ConfigEntryAuthFailed` for auth issues
- **Config Entry**: Always pass `config_entry` parameter to coordinator - it's accepted and recommended
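For example, an update method that maps library exceptions to those error types might look like this sketch (`AuthError` and `ApiError` stand in for library-specific exceptions):
```python
    async def _async_update_data(self) -> MyData:
        """Fetch data from the API, translating library errors."""
        try:
            return await self.client.fetch_data()
        except AuthError as err:
            # Triggers the reauth flow instead of retrying forever
            raise ConfigEntryAuthFailed("Credentials are no longer valid") from err
        except ApiError as err:
            raise UpdateFailed(f"API communication error: {err}") from err
```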
## Integration Guidelines
### Configuration Flow
- **UI Setup Required**: All integrations must support configuration via UI
- **Manifest**: Set `"config_flow": true` in `manifest.json`
- **Data Storage**:
- Connection-critical config: Store in `ConfigEntry.data`
- Non-critical settings: Store in `ConfigEntry.options`
- **Validation**: Always validate user input before creating entries
- **Config Entry Naming**:
- ❌ Do NOT allow users to set config entry names in config flows
- Names are automatically generated or can be customized later in UI
- ✅ Exception: Helper integrations MAY allow custom names in config flow
- **Connection Testing**: Test device/service connection during config flow:
```python
try:
await client.get_data()
except MyException:
errors["base"] = "cannot_connect"
```
- **Duplicate Prevention**: Prevent duplicate configurations:
```python
# Using unique ID
await self.async_set_unique_id(identifier)
self._abort_if_unique_id_configured()
# Using unique data
self._async_abort_entries_match({CONF_HOST: user_input[CONF_HOST]})
```
### Reauthentication Support
- **Required Method**: Implement `async_step_reauth` in config flow
- **Credential Updates**: Allow users to update credentials without re-adding
- **Validation**: Verify account matches existing unique ID:
```python
await self.async_set_unique_id(user_id)
self._abort_if_unique_id_mismatch(reason="wrong_account")
return self.async_update_reload_and_abort(
self._get_reauth_entry(),
data_updates={CONF_API_TOKEN: user_input[CONF_API_TOKEN]}
)
```
### Reconfiguration Flow
- **Purpose**: Allow configuration updates without removing device
- **Implementation**: Add `async_step_reconfigure` method
- **Validation**: Prevent changing underlying account with `_abort_if_unique_id_mismatch`
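A reconfigure step added to the config flow class could look roughly like this sketch (reusing the hypothetical `validate_input` helper and `USER_SCHEMA` from the user step):
```python
    async def async_step_reconfigure(
        self, user_input: dict[str, Any] | None = None
    ) -> ConfigFlowResult:
        """Handle reconfiguration of an existing entry."""
        if user_input is not None:
            device = await validate_input(self.hass, user_input)  # hypothetical helper
            await self.async_set_unique_id(device.serial)
            self._abort_if_unique_id_mismatch(reason="wrong_device")
            return self.async_update_reload_and_abort(
                self._get_reconfigure_entry(), data_updates=user_input
            )
        return self.async_show_form(step_id="reconfigure", data_schema=USER_SCHEMA)
```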
### Device Discovery
- **Manifest Configuration**: Add discovery method (zeroconf, dhcp, etc.)
```json
{
"zeroconf": ["_mydevice._tcp.local."]
}
```
- **Discovery Handler**: Implement appropriate `async_step_*` method:
```python
async def async_step_zeroconf(self, discovery_info):
"""Handle zeroconf discovery."""
await self.async_set_unique_id(discovery_info.properties["serialno"])
self._abort_if_unique_id_configured(updates={CONF_HOST: discovery_info.host})
```
- **Network Updates**: Use discovery to update dynamic IP addresses
### Network Discovery Implementation
- **Zeroconf/mDNS**: Use async instances
```python
aiozc = await zeroconf.async_get_async_instance(hass)
```
- **SSDP Discovery**: Register callbacks with cleanup
```python
entry.async_on_unload(
ssdp.async_register_callback(
hass, _async_discovered_device,
{"st": "urn:schemas-upnp-org:device:ZonePlayer:1"}
)
)
```
### Bluetooth Integration
- **Manifest Dependencies**: Add `bluetooth_adapters` to dependencies
- **Connectable**: Set `"connectable": true` for connection-required devices
- **Scanner Usage**: Always use shared scanner instance
```python
scanner = bluetooth.async_get_scanner(hass)
entry.async_on_unload(
bluetooth.async_register_callback(
hass, _async_discovered_device,
{"service_uuid": "example_uuid"},
bluetooth.BluetoothScanningMode.ACTIVE
)
)
```
- **Connection Handling**: Never reuse `BleakClient` instances, use 10+ second timeouts
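A connection sketch along these lines, creating a fresh `BleakClient` per attempt with a generous timeout (many integrations additionally wrap this in `bleak-retry-connector` for robustness):
```python
from bleak import BleakClient


async def async_read_value(ble_device, characteristic_uuid: str) -> bytes:
    """Connect with a fresh client and read one characteristic."""
    # New BleakClient for every connection attempt; never reuse instances
    async with BleakClient(ble_device, timeout=10.0) as client:
        return await client.read_gatt_char(characteristic_uuid)
```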
### Setup Validation
- **Test Before Setup**: Verify integration can be set up in `async_setup_entry`
- **Exception Handling**:
- `ConfigEntryNotReady`: Device offline or temporary failure
- `ConfigEntryAuthFailed`: Authentication issues
- `ConfigEntryError`: Unresolvable setup problems
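A sketch of mapping library errors to these exceptions in `async_setup_entry` (`MyClient` and its exceptions are placeholders):
```python
async def async_setup_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
    """Set up the integration, validating the connection first."""
    client = MyClient(entry.data[CONF_HOST])  # hypothetical client
    try:
        await client.connect()
    except MyAuthError as err:
        raise ConfigEntryAuthFailed("Invalid credentials") from err
    except MyConnectionError as err:
        raise ConfigEntryNotReady(f"Device at {entry.data[CONF_HOST]} is offline") from err
    entry.runtime_data = client
    return True
```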
### Config Entry Unloading
- **Required**: Implement `async_unload_entry` for runtime removal/reload
- **Platform Unloading**: Use `hass.config_entries.async_unload_platforms`
- **Cleanup**: Register callbacks with `entry.async_on_unload`:
```python
async def async_unload_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
"""Unload a config entry."""
if unload_ok := await hass.config_entries.async_unload_platforms(entry, PLATFORMS):
entry.runtime_data.listener() # Clean up resources
return unload_ok
```
### Service Actions
- **Registration**: Register all service actions in `async_setup`, NOT in `async_setup_entry` (see the registration sketch at the end of this section)
- **Validation**: Check config entry existence and loaded state:
```python
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
async def service_action(call: ServiceCall) -> ServiceResponse:
if not (entry := hass.config_entries.async_get_entry(call.data[ATTR_CONFIG_ENTRY_ID])):
raise ServiceValidationError("Entry not found")
if entry.state is not ConfigEntryState.LOADED:
raise ServiceValidationError("Entry not loaded")
```
- **Exception Handling**: Raise appropriate exceptions:
```python
# For invalid input
if end_date < start_date:
raise ServiceValidationError("End date must be after start date")
# For service errors
try:
await client.set_schedule(start_date, end_date)
except MyConnectionError as err:
raise HomeAssistantError("Could not connect to the schedule") from err
```
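A sketch of the full registration in `async_setup`, including a response-returning action (`_get_loaded_entry` is a hypothetical helper wrapping the validation shown above):
```python
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
    """Register integration-level service actions."""

    async def get_schedule(call: ServiceCall) -> ServiceResponse:
        entry = _get_loaded_entry(hass, call)  # hypothetical helper doing the checks above
        return {"schedule": await entry.runtime_data.fetch_schedule()}

    hass.services.async_register(
        DOMAIN,
        "get_schedule",
        get_schedule,
        schema=vol.Schema({vol.Required(ATTR_CONFIG_ENTRY_ID): cv.string}),
        supports_response=SupportsResponse.ONLY,
    )
    return True
```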
### Service Registration Patterns
- **Entity Services**: Register on platform setup
```python
platform.async_register_entity_service(
"my_entity_service",
{vol.Required("parameter"): cv.string},
"handle_service_method"
)
```
- **Service Schema**: Always validate input
```python
SERVICE_SCHEMA = vol.Schema({
vol.Required("entity_id"): cv.entity_ids,
vol.Required("parameter"): cv.string,
vol.Optional("timeout", default=30): cv.positive_int,
})
```
- **Services File**: Create `services.yaml` with descriptions and field definitions
### Polling
- Use update coordinator pattern when possible
- **Polling intervals are NOT user-configurable**: Never add scan_interval, update_interval, or polling frequency options to config flows or config entries
- **Integration determines intervals**: Set `update_interval` programmatically based on integration logic, not user input
- **Minimum Intervals**:
- Local network: 5 seconds
- Cloud services: 60 seconds
- **Parallel Updates**: Specify number of concurrent updates:
```python
PARALLEL_UPDATES = 1 # Serialize updates to prevent overwhelming device
# OR
PARALLEL_UPDATES = 0 # Unlimited (for coordinator-based or read-only)
```
## Entity Development
### Unique IDs
- **Required**: Every entity must have a unique ID for registry tracking
- Must be unique per platform (not per integration)
- Don't include integration domain or platform in ID
- **Implementation**:
```python
class MySensor(SensorEntity):
def __init__(self, device_id: str) -> None:
self._attr_unique_id = f"{device_id}_temperature"
```
**Acceptable ID Sources**:
- Device serial numbers
- MAC addresses (formatted using `format_mac` from device registry)
- Physical identifiers (printed/EEPROM)
- Config entry ID as last resort: `f"{entry.entry_id}-battery"`
**Never Use**:
- IP addresses, hostnames, URLs
- Device names
- Email addresses, usernames
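For MAC-based IDs, a small sketch using the device registry helper:
```python
from homeassistant.helpers.device_registry import format_mac


class MySensor(SensorEntity):
    def __init__(self, mac_address: str) -> None:
        """Initialize with a normalized MAC-based unique ID."""
        self._attr_unique_id = format_mac(mac_address)
```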
### Entity Descriptions
- **Lambda/Anonymous Functions**: Often used in EntityDescription for value transformation
- **Multiline Lambdas**: When lambdas exceed line length, wrap in parentheses for readability
- **Bad pattern**:
```python
SensorEntityDescription(
key="temperature",
name="Temperature",
value_fn=lambda data: round(data["temp_value"] * 1.8 + 32, 1) if data.get("temp_value") is not None else None, # ❌ Too long
)
```
- **Good pattern**:
```python
SensorEntityDescription(
key="temperature",
name="Temperature",
value_fn=lambda data: ( # ✅ Parenthesis on same line as lambda
round(data["temp_value"] * 1.8 + 32, 1)
if data.get("temp_value") is not None
else None
),
)
```
### Entity Naming
- **Use has_entity_name**: Set `_attr_has_entity_name = True`
- **For specific fields**:
```python
class MySensor(SensorEntity):
_attr_has_entity_name = True
def __init__(self, device: Device, field: str) -> None:
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, device.id)},
name=device.name,
)
self._attr_name = field # e.g., "temperature", "humidity"
```
- **For device itself**: Set `_attr_name = None`
### Event Lifecycle Management
- **Subscribe in `async_added_to_hass`**:
```python
async def async_added_to_hass(self) -> None:
"""Subscribe to events."""
self.async_on_remove(
self.client.events.subscribe("my_event", self._handle_event)
)
```
- **Unsubscribe in `async_will_remove_from_hass`** if not using `async_on_remove`
- Never subscribe in `__init__` or other methods
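If the subscription API does not return a remover callable, a sketch of explicit cleanup (the `subscribe`/`unsubscribe` client API is illustrative):
```python
    async def async_added_to_hass(self) -> None:
        """Subscribe to events when the entity is added."""
        self._subscription = await self.client.subscribe("my_event", self._handle_event)

    async def async_will_remove_from_hass(self) -> None:
        """Unsubscribe when the entity is about to be removed."""
        if self._subscription is not None:
            await self._subscription.unsubscribe()  # hypothetical client API
            self._subscription = None
```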
### State Handling
- Unknown values: Use `None` (not "unknown" or "unavailable")
- Availability: Implement `available()` property instead of using "unavailable" state
### Entity Availability
- **Mark Unavailable**: When data cannot be fetched from device/service
- **Coordinator Pattern**:
```python
@property
def available(self) -> bool:
"""Return if entity is available."""
return super().available and self.identifier in self.coordinator.data
```
- **Direct Update Pattern**:
```python
async def async_update(self) -> None:
"""Update entity."""
try:
data = await self.client.get_data()
except MyException:
self._attr_available = False
else:
self._attr_available = True
self._attr_native_value = data.value
```
### Extra State Attributes
- All attribute keys must always be present
- Unknown values: Use `None`
- Provide descriptive attributes
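A sketch of an attributes property that always returns the same keys, using `None` when data is missing:
```python
    @property
    def extra_state_attributes(self) -> dict[str, Any]:
        """Return additional attributes with a stable set of keys."""
        data = self.coordinator.data.get(self.identifier)
        return {
            "last_reported": data.last_reported if data else None,
            "signal_strength": data.signal_strength if data else None,
        }
```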
## Device Management
### Device Registry
- **Create Devices**: Group related entities under devices
- **Device Info**: Provide comprehensive metadata:
```python
_attr_device_info = DeviceInfo(
connections={(CONNECTION_NETWORK_MAC, device.mac)},
identifiers={(DOMAIN, device.id)},
name=device.name,
manufacturer="My Company",
model="My Sensor",
sw_version=device.version,
)
```
- For services: Add `entry_type=DeviceEntryType.SERVICE`
### Dynamic Device Addition
- **Auto-detect New Devices**: After initial setup
- **Implementation Pattern**:
```python
def _check_device() -> None:
current_devices = set(coordinator.data)
new_devices = current_devices - known_devices
if new_devices:
known_devices.update(new_devices)
async_add_entities([MySensor(coordinator, device_id) for device_id in new_devices])
entry.async_on_unload(coordinator.async_add_listener(_check_device))
```
### Stale Device Removal
- **Auto-remove**: When devices disappear from hub/account
- **Device Registry Update**:
```python
device_registry.async_update_device(
device_id=device.id,
remove_config_entry_id=self.config_entry.entry_id,
)
```
- **Manual Deletion**: Implement `async_remove_config_entry_device` when needed
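A sketch of `async_remove_config_entry_device` in `__init__.py`, allowing manual deletion only for devices the coordinator no longer reports:
```python
async def async_remove_config_entry_device(
    hass: HomeAssistant, config_entry: MyConfigEntry, device_entry: dr.DeviceEntry
) -> bool:
    """Allow removal only if the device is gone from the hub."""
    coordinator = config_entry.runtime_data  # assuming runtime_data stores the coordinator
    return not any(
        identifier
        for identifier in device_entry.identifiers
        if identifier[0] == DOMAIN and identifier[1] in coordinator.data
    )
```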
### Entity Categories
- **Required**: Assign appropriate category to entities
- **Implementation**: Set `_attr_entity_category`
```python
class MySensor(SensorEntity):
_attr_entity_category = EntityCategory.DIAGNOSTIC
```
- Categories include `CONFIG` for configuration entities and `DIAGNOSTIC` for system/technical information
### Device Classes
- **Use When Available**: Set appropriate device class for entity type
```python
class MyTemperatureSensor(SensorEntity):
_attr_device_class = SensorDeviceClass.TEMPERATURE
```
- Provides context for: unit conversion, voice control, UI representation
### Disabled by Default
- **Disable Noisy/Less Popular Entities**: Reduce resource usage
```python
class MySignalStrengthSensor(SensorEntity):
_attr_entity_registry_enabled_default = False
```
- Target: frequently changing states, technical diagnostics
### Entity Translations
- **Required with has_entity_name**: Support international users
- **Implementation**:
```python
class MySensor(SensorEntity):
_attr_has_entity_name = True
_attr_translation_key = "phase_voltage"
```
- Create `strings.json` with translations:
```json
{
"entity": {
"sensor": {
"phase_voltage": {
"name": "Phase voltage"
}
}
}
}
```
### Exception Translations (Gold)
- **Translatable Errors**: Use translation keys for user-facing exceptions
- **Implementation**:
```python
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="end_date_before_start_date",
)
```
- Add to `strings.json`:
```json
{
"exceptions": {
"end_date_before_start_date": {
"message": "The end date cannot be before the start date."
}
}
}
```
### Icon Translations (Gold)
- **Dynamic Icons**: Support state and range-based icon selection
- **State-based Icons**:
```json
{
"entity": {
"sensor": {
"tree_pollen": {
"default": "mdi:tree",
"state": {
"high": "mdi:tree-outline"
}
}
}
}
}
```
- **Range-based Icons** (for numeric values):
```json
{
"entity": {
"sensor": {
"battery_level": {
"default": "mdi:battery-unknown",
"range": {
"0": "mdi:battery-outline",
"90": "mdi:battery-90",
"100": "mdi:battery"
}
}
}
}
}
```
## Testing Requirements
- **Location**: `tests/components/{domain}/`
- **Coverage Requirement**: Above 95% test coverage for all modules
- **Best Practices**:
- Use pytest fixtures from `tests.common`
- Mock all external dependencies
- Use snapshots for complex data structures
- Follow existing test patterns
### Config Flow Testing
- **100% Coverage Required**: All config flow paths must be tested
- **Test Scenarios**:
- All flow initiation methods (user, discovery, import)
- Successful configuration paths
- Error recovery scenarios
- Prevention of duplicate entries
- Flow completion after errors
### Testing
- **Integration-specific tests** (recommended):
```bash
pytest ./tests/components/<integration_domain> \
--cov=homeassistant.components.<integration_domain> \
--cov-report term-missing \
--durations-min=1 \
--durations=0 \
--numprocesses=auto
```
### Testing Best Practices
- **Never access `hass.data` directly** - Use fixtures and proper integration setup instead
- **Use snapshot testing** - For verifying entity states and attributes
- **Test through integration setup** - Don't test entities in isolation
- **Mock external APIs** - Use fixtures with realistic JSON data
- **Verify registries** - Ensure entities are properly registered with devices
### Config Flow Testing Template
```python
async def test_user_flow_success(hass, mock_api):
"""Test successful user flow."""
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)
assert result["type"] == FlowResultType.FORM
assert result["step_id"] == "user"
# Test form submission
result = await hass.config_entries.flow.async_configure(
result["flow_id"], user_input=TEST_USER_INPUT
)
assert result["type"] == FlowResultType.CREATE_ENTRY
assert result["title"] == "My Device"
assert result["data"] == TEST_USER_INPUT
async def test_flow_connection_error(hass, mock_api_error):
"""Test connection error handling."""
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)
result = await hass.config_entries.flow.async_configure(
result["flow_id"], user_input=TEST_USER_INPUT
)
assert result["type"] == FlowResultType.FORM
assert result["errors"] == {"base": "cannot_connect"}
```
### Entity Testing Patterns
```python
@pytest.fixture
def platforms() -> list[Platform]:
"""Overridden fixture to specify platforms to test."""
return [Platform.SENSOR] # Or another specific platform as needed.
@pytest.mark.usefixtures("entity_registry_enabled_by_default", "init_integration")
async def test_entities(
hass: HomeAssistant,
snapshot: SnapshotAssertion,
entity_registry: er.EntityRegistry,
device_registry: dr.DeviceRegistry,
mock_config_entry: MockConfigEntry,
) -> None:
"""Test the sensor entities."""
await snapshot_platform(hass, entity_registry, snapshot, mock_config_entry.entry_id)
# Ensure entities are correctly assigned to device
device_entry = device_registry.async_get_device(
identifiers={(DOMAIN, "device_unique_id")}
)
assert device_entry
entity_entries = er.async_entries_for_config_entry(
entity_registry, mock_config_entry.entry_id
)
for entity_entry in entity_entries:
assert entity_entry.device_id == device_entry.id
```
### Mock Patterns
```python
# Modern integration fixture setup
@pytest.fixture
def mock_config_entry() -> MockConfigEntry:
"""Return the default mocked config entry."""
return MockConfigEntry(
title="My Integration",
domain=DOMAIN,
data={CONF_HOST: "127.0.0.1", CONF_API_KEY: "test_key"},
unique_id="device_unique_id",
)
@pytest.fixture
def mock_device_api() -> Generator[MagicMock]:
"""Return a mocked device API."""
with patch("homeassistant.components.my_integration.MyDeviceAPI", autospec=True) as api_mock:
api = api_mock.return_value
api.get_data.return_value = MyDeviceData.from_json(
load_fixture("device_data.json", DOMAIN)
)
yield api
@pytest.fixture
def platforms() -> list[Platform]:
"""Fixture to specify platforms to test."""
return PLATFORMS
@pytest.fixture
async def init_integration(
hass: HomeAssistant,
mock_config_entry: MockConfigEntry,
mock_device_api: MagicMock,
platforms: list[Platform],
) -> MockConfigEntry:
"""Set up the integration for testing."""
mock_config_entry.add_to_hass(hass)
with patch("homeassistant.components.my_integration.PLATFORMS", platforms):
await hass.config_entries.async_setup(mock_config_entry.entry_id)
await hass.async_block_till_done()
return mock_config_entry
```
## Debugging & Troubleshooting
### Common Issues & Solutions
- **Integration won't load**: Check `manifest.json` syntax and required fields
- **Entities not appearing**: Verify `unique_id` and `has_entity_name` implementation
- **Config flow errors**: Check `strings.json` entries and error handling
- **Discovery not working**: Verify manifest discovery configuration and callbacks
- **Tests failing**: Check mock setup and async context
### Debug Logging Setup
```python
# Enable debug logging in tests
caplog.set_level(logging.DEBUG, logger="my_integration")
# In integration code - use proper logging
_LOGGER = logging.getLogger(__name__)
_LOGGER.debug("Processing data: %s", data) # Use lazy logging
```
### Validation Commands
```bash
# Check specific integration
python -m script.hassfest --integration-path homeassistant/components/my_integration
# Validate quality scale
# Check quality_scale.yaml against current rules
# Run integration tests with coverage
pytest ./tests/components/my_integration \
--cov=homeassistant.components.my_integration \
--cov-report term-missing
```

View File

@@ -1,19 +0,0 @@
# Integration Diagnostics
Platform exists as `homeassistant/components/<domain>/diagnostics.py`.
- **Required**: Implement diagnostic data collection
- **Implementation**:
```python
TO_REDACT = [CONF_API_KEY, CONF_LATITUDE, CONF_LONGITUDE]
async def async_get_config_entry_diagnostics(
hass: HomeAssistant, entry: MyConfigEntry
) -> dict[str, Any]:
"""Return diagnostics for a config entry."""
return {
"entry_data": async_redact_data(entry.data, TO_REDACT),
"data": entry.runtime_data.data,
}
```
- **Security**: Never expose passwords, tokens, or sensitive coordinates

View File

@@ -1,55 +0,0 @@
# Repairs platform
Platform exists as `homeassistant/components/<domain>/repairs.py`.
- **Actionable Issues Required**: All repair issues must be actionable for end users
- **Issue Content Requirements**:
- Clearly explain what is happening
- Provide specific steps users need to take to resolve the issue
- Use friendly, helpful language
- Include relevant context (device names, error details, etc.)
- **Implementation**:
```python
ir.async_create_issue(
hass,
DOMAIN,
"outdated_version",
is_fixable=False,
issue_domain=DOMAIN,
severity=ir.IssueSeverity.ERROR,
translation_key="outdated_version",
)
```
- **Translation Strings Requirements**: Must contain user-actionable text in `strings.json`:
```json
{
"issues": {
"outdated_version": {
"title": "Device firmware is outdated",
"description": "Your device firmware version {current_version} is below the minimum required version {min_version}. To fix this issue: 1) Open the manufacturer's mobile app, 2) Navigate to device settings, 3) Select 'Update Firmware', 4) Wait for the update to complete, then 5) Restart Home Assistant."
}
}
}
```
- **String Content Must Include**:
- What the problem is
- Why it matters
- Exact steps to resolve (numbered list when multiple steps)
- What to expect after following the steps
- **Avoid Vague Instructions**: Don't just say "update firmware" - provide specific steps
- **Severity Guidelines**:
- `CRITICAL`: Reserved for extreme scenarios only
- `ERROR`: Requires immediate user attention
- `WARNING`: Indicates future potential breakage
- **Additional Attributes**:
```python
ir.async_create_issue(
hass, DOMAIN, "issue_id",
breaks_in_ha_version="2024.1.0",
is_fixable=True,
is_persistent=True,
severity=ir.IssueSeverity.ERROR,
translation_key="issue_description",
)
```
- Only create issues for problems users can potentially resolve
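For issues created with `is_fixable=True`, the repairs platform also needs a fix flow; a minimal sketch using the built-in confirmation flow:
```python
from homeassistant.components.repairs import ConfirmRepairFlow, RepairsFlow
from homeassistant.core import HomeAssistant


async def async_create_fix_flow(
    hass: HomeAssistant, issue_id: str, data: dict | None
) -> RepairsFlow:
    """Return the flow used to fix an issue."""
    # A plain confirmation dialog; subclass RepairsFlow for multi-step fixes
    return ConfirmRepairFlow()
```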

View File

@@ -91,7 +91,6 @@ components: &components
- homeassistant/components/input_number/**
- homeassistant/components/input_select/**
- homeassistant/components/input_text/**
- homeassistant/components/labs/**
- homeassistant/components/logbook/**
- homeassistant/components/logger/**
- homeassistant/components/lovelace/**

View File

@@ -8,6 +8,9 @@
"PYTHONASYNCIODEBUG": "1"
},
"features": {
// Node feature required for Claude Code until fixed https://github.com/anthropics/devcontainer-features/issues/28
"ghcr.io/devcontainers/features/node:1": {},
"ghcr.io/anthropics/devcontainer-features/claude-code:1.0": {},
"ghcr.io/devcontainers/features/github-cli:1": {}
},
// Port 5683 udp is used by Shelly integration
@@ -37,8 +40,7 @@
"python.terminal.activateEnvInCurrentTerminal": true,
"python.testing.pytestArgs": ["--no-cov"],
"pylint.importStrategy": "fromEnvironment",
// Pyright type checking is not compatible with mypy which Home Assistant uses for type checking
"python.analysis.typeCheckingMode": "off",
"python.analysis.typeCheckingMode": "basic",
"editor.formatOnPaste": false,
"editor.formatOnSave": true,
"editor.formatOnType": true,

View File

@@ -80,7 +80,7 @@ If the code communicates with devices, web services, or third-party tools:
Updated and included derived files by running: `python3 -m script.hassfest`.
- [ ] New or updated dependencies have been added to `requirements_all.txt`.
Updated by running `python3 -m script.gen_requirements_all`.
- [ ] For the updated dependencies a diff between library versions and ideally a link to the changelog/release notes is added to the PR description.
- [ ] For the updated dependencies - a link to the changelog, or at minimum a diff between library versions is added to the PR description.
<!--
This project is very active and we have a high turnover of pull requests.

File diff suppressed because it is too large

View File

@@ -30,10 +30,10 @@ jobs:
architectures: ${{ env.ARCHITECTURES }}
steps:
- name: Checkout the repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
@@ -96,11 +96,11 @@ jobs:
os: ubuntu-24.04-arm
steps:
- name: Checkout the repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- name: Download nightly wheels of frontend
if: needs.init.outputs.channel == 'dev'
uses: dawidd6/action-download-artifact@0bd50d53a6d7fb5cb921e607957e9cc12b4ce392 # v12
uses: dawidd6/action-download-artifact@ac66b43f0e6a346234dd65d4d0c8fbb31cb316e5 # v11
with:
github_token: ${{secrets.GITHUB_TOKEN}}
repo: home-assistant/frontend
@@ -111,7 +111,7 @@ jobs:
- name: Download nightly wheels of intents
if: needs.init.outputs.channel == 'dev'
uses: dawidd6/action-download-artifact@0bd50d53a6d7fb5cb921e607957e9cc12b4ce392 # v12
uses: dawidd6/action-download-artifact@ac66b43f0e6a346234dd65d4d0c8fbb31cb316e5 # v11
with:
github_token: ${{secrets.GITHUB_TOKEN}}
repo: OHF-Voice/intents-package
@@ -122,7 +122,7 @@ jobs:
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
if: needs.init.outputs.channel == 'dev'
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
@@ -273,7 +273,7 @@ jobs:
- green
steps:
- name: Checkout the repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- name: Set build additional args
run: |
@@ -311,7 +311,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout the repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- name: Initialize git
uses: home-assistant/actions/helpers/git-init@master
@@ -474,10 +474,10 @@ jobs:
if: github.repository_owner == 'home-assistant' && needs.init.outputs.publish == 'true'
steps:
- name: Checkout the repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
@@ -519,7 +519,7 @@ jobs:
HASSFEST_IMAGE_TAG: ghcr.io/home-assistant/hassfest:${{ needs.init.outputs.version }}
steps:
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- name: Login to GitHub Container Registry
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0

View File

@@ -40,7 +40,7 @@ env:
CACHE_VERSION: 2
UV_CACHE_VERSION: 1
MYPY_CACHE_VERSION: 1
HA_SHORT_VERSION: "2026.2"
HA_SHORT_VERSION: "2026.1"
DEFAULT_PYTHON: "3.13.11"
ALL_PYTHON_VERSIONS: "['3.13.11', '3.14.2']"
# 10.3 is the oldest supported version
@@ -59,6 +59,7 @@ env:
# 15 is the latest version
# - 15.2 is the latest (as of 9 Feb 2023)
POSTGRESQL_VERSIONS: "['postgres:12.14','postgres:15.2']"
PRE_COMMIT_CACHE: ~/.cache/pre-commit
UV_CACHE_DIR: /tmp/uv-cache
APT_CACHE_BASE: /home/runner/work/apt
APT_CACHE_DIR: /home/runner/work/apt/cache
@@ -82,6 +83,7 @@ jobs:
integrations_glob: ${{ steps.info.outputs.integrations_glob }}
integrations: ${{ steps.integrations.outputs.changes }}
apt_cache_key: ${{ steps.generate_apt_cache_key.outputs.key }}
pre-commit_cache_key: ${{ steps.generate_pre-commit_cache_key.outputs.key }}
python_cache_key: ${{ steps.generate_python_cache_key.outputs.key }}
requirements: ${{ steps.core.outputs.requirements }}
mariadb_groups: ${{ steps.info.outputs.mariadb_groups }}
@@ -97,7 +99,7 @@ jobs:
steps:
- &checkout
name: Check out code from GitHub
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- name: Generate partial Python venv restore key
id: generate_python_cache_key
run: |
@@ -109,6 +111,11 @@ jobs:
hashFiles('requirements_all.txt') }}-${{
hashFiles('homeassistant/package_constraints.txt') }}-${{
hashFiles('script/gen_requirements_all.py') }}" >> $GITHUB_OUTPUT
- name: Generate partial pre-commit restore key
id: generate_pre-commit_cache_key
run: >-
echo "key=pre-commit-${{ env.CACHE_VERSION }}-${{
hashFiles('.pre-commit-config.yaml') }}" >> $GITHUB_OUTPUT
- name: Generate partial apt restore key
id: generate_apt_cache_key
run: |
@@ -237,8 +244,8 @@ jobs:
echo "skip_coverage: ${skip_coverage}"
echo "skip_coverage=${skip_coverage}" >> $GITHUB_OUTPUT
prek:
name: Run prek checks
pre-commit:
name: Prepare pre-commit base
runs-on: *runs-on-ubuntu
needs: [info]
if: |
@@ -247,17 +254,147 @@ jobs:
&& github.event.inputs.audit-licenses-only != 'true'
steps:
- *checkout
- name: Register problem matchers
- &setup-python-default
name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: &actions-setup-python actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore base Python virtual environment
id: cache-venv
uses: &actions-cache actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
with:
path: venv
key: &key-pre-commit-venv >-
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-venv-${{
needs.info.outputs.pre-commit_cache_key }}
- name: Create Python virtual environment
if: steps.cache-venv.outputs.cache-hit != 'true'
run: |
python -m venv venv
. venv/bin/activate
python --version
pip install "$(grep '^uv' < requirements.txt)"
uv pip install "$(cat requirements_test.txt | grep pre-commit)"
- name: Restore pre-commit environment from cache
id: cache-precommit
uses: *actions-cache
with:
path: ${{ env.PRE_COMMIT_CACHE }}
lookup-only: true
key: &key-pre-commit-env >-
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-${{
needs.info.outputs.pre-commit_cache_key }}
- name: Install pre-commit dependencies
if: steps.cache-precommit.outputs.cache-hit != 'true'
run: |
. venv/bin/activate
pre-commit install-hooks
lint-ruff-format:
name: Check ruff-format
runs-on: *runs-on-ubuntu
needs: &needs-pre-commit
- info
- pre-commit
steps:
- *checkout
- *setup-python-default
- &cache-restore-pre-commit-venv
name: Restore base Python virtual environment
id: cache-venv
uses: &actions-cache-restore actions/cache/restore@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
with:
path: venv
fail-on-cache-miss: true
key: *key-pre-commit-venv
- &cache-restore-pre-commit-env
name: Restore pre-commit environment from cache
id: cache-precommit
uses: *actions-cache-restore
with:
path: ${{ env.PRE_COMMIT_CACHE }}
fail-on-cache-miss: true
key: *key-pre-commit-env
- name: Run ruff-format
run: |
. venv/bin/activate
pre-commit run --hook-stage manual ruff-format --all-files --show-diff-on-failure
env:
RUFF_OUTPUT_FORMAT: github
lint-ruff:
name: Check ruff
runs-on: *runs-on-ubuntu
needs: *needs-pre-commit
steps:
- *checkout
- *setup-python-default
- *cache-restore-pre-commit-venv
- *cache-restore-pre-commit-env
- name: Run ruff
run: |
. venv/bin/activate
pre-commit run --hook-stage manual ruff-check --all-files --show-diff-on-failure
env:
RUFF_OUTPUT_FORMAT: github
lint-other:
name: Check other linters
runs-on: *runs-on-ubuntu
needs: *needs-pre-commit
steps:
- *checkout
- *setup-python-default
- *cache-restore-pre-commit-venv
- *cache-restore-pre-commit-env
- name: Register yamllint problem matcher
run: |
echo "::add-matcher::.github/workflows/matchers/yamllint.json"
- name: Run yamllint
run: |
. venv/bin/activate
pre-commit run --hook-stage manual yamllint --all-files --show-diff-on-failure
- name: Register check-json problem matcher
run: |
echo "::add-matcher::.github/workflows/matchers/check-json.json"
- name: Run check-json
run: |
. venv/bin/activate
pre-commit run --hook-stage manual check-json --all-files --show-diff-on-failure
- name: Run prettier (fully)
if: needs.info.outputs.test_full_suite == 'true'
run: |
. venv/bin/activate
pre-commit run --hook-stage manual prettier --all-files --show-diff-on-failure
- name: Run prettier (partially)
if: needs.info.outputs.test_full_suite == 'false'
shell: bash
run: |
. venv/bin/activate
shopt -s globstar
pre-commit run --hook-stage manual prettier --show-diff-on-failure --files {homeassistant,tests}/components/${{ needs.info.outputs.integrations_glob }}/{*,**/*}
- name: Register check executables problem matcher
run: |
echo "::add-matcher::.github/workflows/matchers/check-executables-have-shebangs.json"
- name: Run executables check
run: |
. venv/bin/activate
pre-commit run --hook-stage manual check-executables-have-shebangs --all-files --show-diff-on-failure
- name: Register codespell problem matcher
run: |
echo "::add-matcher::.github/workflows/matchers/codespell.json"
- name: Run prek
uses: j178/prek-action@9d6a3097e0c1865ecce00cfb89fe80f2ee91b547 # v1.0.12
env:
PREK_SKIP: no-commit-to-branch,mypy,pylint,gen_requirements_all,hassfest,hassfest-metadata,hassfest-mypy-config
RUFF_OUTPUT_FORMAT: github
- name: Run codespell
run: |
. venv/bin/activate
pre-commit run --show-diff-on-failure --hook-stage manual codespell --all-files
lint-hadolint:
name: Check ${{ matrix.file }}
@@ -297,7 +434,7 @@ jobs:
- &setup-python-matrix
name: Set up Python ${{ matrix.python-version }}
id: python
uses: &actions-setup-python actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
uses: *actions-setup-python
with:
python-version: ${{ matrix.python-version }}
check-latest: true
@@ -310,7 +447,7 @@ jobs:
env.HA_SHORT_VERSION }}-$(date -u '+%Y-%m-%dT%H:%M:%s')" >> $GITHUB_OUTPUT
- name: Restore base Python virtual environment
id: cache-venv
uses: &actions-cache actions/cache@8b402f58fbc84540c8b491a91e594a4576fec3d7 # v5.0.2
uses: *actions-cache
with:
path: venv
key: &key-python-venv >-
@@ -374,7 +511,7 @@ jobs:
fi
- name: Save apt cache
if: steps.cache-apt-check.outputs.cache-hit != 'true'
uses: &actions-cache-save actions/cache/save@8b402f58fbc84540c8b491a91e594a4576fec3d7 # v5.0.2
uses: &actions-cache-save actions/cache/save@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
with:
path: *path-apt-cache
key: *key-apt-cache
@@ -425,7 +562,7 @@ jobs:
steps:
- &cache-restore-apt
name: Restore apt cache
uses: &actions-cache-restore actions/cache/restore@8b402f58fbc84540c8b491a91e594a4576fec3d7 # v5.0.2
uses: *actions-cache-restore
with:
path: *path-apt-cache
fail-on-cache-miss: true
@@ -442,13 +579,7 @@ jobs:
-o Dir::State::Lists=${{ env.APT_LIST_CACHE_DIR }} \
libturbojpeg
- *checkout
- &setup-python-default
name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: *actions-setup-python
with:
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- *setup-python-default
- &cache-restore-python-default
name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv
@@ -479,22 +610,6 @@ jobs:
. venv/bin/activate
python -m script.gen_requirements_all validate
gen-copilot-instructions:
name: Check copilot instructions
runs-on: *runs-on-ubuntu
needs:
- info
if: |
github.event.inputs.pylint-only != 'true'
&& github.event.inputs.mypy-only != 'true'
&& github.event.inputs.audit-licenses-only != 'true'
steps:
- *checkout
- *setup-python-default
- name: Run gen_copilot_instructions.py
run: |
python -m script.gen_copilot_instructions validate
dependency-review:
name: Dependency review
runs-on: *runs-on-ubuntu
@@ -667,7 +782,9 @@ jobs:
- base
- gen-requirements-all
- hassfest
- prek
- lint-other
- lint-ruff
- lint-ruff-format
- mypy
steps:
- *cache-restore-apt
@@ -706,7 +823,9 @@ jobs:
- base
- gen-requirements-all
- hassfest
- prek
- lint-other
- lint-ruff
- lint-ruff-format
- mypy
- prepare-pytest-full
if: |
@@ -830,7 +949,9 @@ jobs:
- base
- gen-requirements-all
- hassfest
- prek
- lint-other
- lint-ruff
- lint-ruff-format
- mypy
if: |
needs.info.outputs.lint_only != 'true'
@@ -945,7 +1066,9 @@ jobs:
- base
- gen-requirements-all
- hassfest
- prek
- lint-other
- lint-ruff
- lint-ruff-format
- mypy
if: |
needs.info.outputs.lint_only != 'true'
@@ -1079,7 +1202,9 @@ jobs:
- base
- gen-requirements-all
- hassfest
- prek
- lint-other
- lint-ruff
- lint-ruff-format
- mypy
if: |
needs.info.outputs.lint_only != 'true'
@@ -1203,8 +1328,6 @@ jobs:
- pytest-postgres
- pytest-mariadb
timeout-minutes: 10
permissions:
id-token: write
# codecov/test-results-action currently doesn't support tokenless uploads
# therefore we can't run it on forks
if: |
@@ -1216,9 +1339,8 @@ jobs:
with:
pattern: test-results-*
- name: Upload test results to Codecov
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
uses: codecov/test-results-action@47f89e9acb64b76debcd5ea40642d25a4adced9f # v1.1.1
with:
report_type: test_results
fail_ci_if_error: true
verbose: true
use_oidc: true
token: ${{ secrets.CODECOV_TOKEN }}

View File

@@ -21,14 +21,14 @@ jobs:
steps:
- name: Check out code from GitHub
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- name: Initialize CodeQL
uses: github/codeql-action/init@19b2f06db2b6f5108140aeb04014ef02b648f789 # v4.31.11
uses: github/codeql-action/init@5d4e8d1aca955e8d8589aabd499c5cae939e33c7 # v4.31.9
with:
languages: python
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@19b2f06db2b6f5108140aeb04014ef02b648f789 # v4.31.11
uses: github/codeql-action/analyze@5d4e8d1aca955e8d8589aabd499c5cae939e33c7 # v4.31.9
with:
category: "/language:python"

View File

@@ -231,7 +231,7 @@ jobs:
- name: Detect duplicates using AI
id: ai_detection
if: steps.extract.outputs.should_continue == 'true' && steps.fetch_similar.outputs.has_similar == 'true'
uses: actions/ai-inference@a6101c89c6feaecc585efdd8d461f18bb7896f20 # v2.0.5
uses: actions/ai-inference@334892bb203895caaed82ec52d23c1ed9385151e # v2.0.4
with:
model: openai/gpt-4o
system-prompt: |

View File

@@ -57,7 +57,7 @@ jobs:
- name: Detect language using AI
id: ai_language_detection
if: steps.detect_language.outputs.should_continue == 'true'
uses: actions/ai-inference@a6101c89c6feaecc585efdd8d461f18bb7896f20 # v2.0.5
uses: actions/ai-inference@334892bb203895caaed82ec52d23c1ed9385151e # v2.0.4
with:
model: openai/gpt-4o-mini
system-prompt: |

View File

@@ -4,7 +4,7 @@
"owner": "check-executables-have-shebangs",
"pattern": [
{
"regexp": "^(.+):\\s(marked executable but has no \\(or invalid\\) shebang!.*)$",
"regexp": "^(.+):\\s(.+)$",
"file": 1,
"message": 2
}

View File

@@ -19,10 +19,10 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout the repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}

View File

@@ -31,11 +31,11 @@ jobs:
steps:
- &checkout
name: Checkout the repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true

1
.gitignore vendored
View File

@@ -92,7 +92,6 @@ pip-selfcheck.json
venv
.venv
Pipfile*
uv.lock
share/*
/Scripts/

View File

@@ -1,6 +1,6 @@
repos:
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.14.13
rev: v0.13.0
hooks:
- id: ruff-check
args:
@@ -39,14 +39,14 @@ repos:
- id: prettier
additional_dependencies:
- prettier@3.6.2
- prettier-plugin-sort-json@4.2.0
- prettier-plugin-sort-json@4.1.1
- repo: https://github.com/cdce8p/python-typing-update
rev: v0.6.0
hooks:
# Run `python-typing-update` hook manually from time to time
# to update python typing syntax.
# Will require manual work, before submitting changes!
# prek run --hook-stage manual python-typing-update --all-files
# pre-commit run --hook-stage manual python-typing-update --all-files
- id: python-typing-update
stages: [manual]
args:

View File

@@ -53,7 +53,6 @@ homeassistant.components.air_quality.*
homeassistant.components.airgradient.*
homeassistant.components.airly.*
homeassistant.components.airnow.*
homeassistant.components.airobot.*
homeassistant.components.airos.*
homeassistant.components.airq.*
homeassistant.components.airthings.*
@@ -408,7 +407,6 @@ homeassistant.components.person.*
homeassistant.components.pi_hole.*
homeassistant.components.ping.*
homeassistant.components.plugwise.*
homeassistant.components.pooldose.*
homeassistant.components.portainer.*
homeassistant.components.powerfox.*
homeassistant.components.powerwall.*
@@ -456,7 +454,6 @@ homeassistant.components.russound_rio.*
homeassistant.components.ruuvi_gateway.*
homeassistant.components.ruuvitag_ble.*
homeassistant.components.samsungtv.*
homeassistant.components.saunum.*
homeassistant.components.scene.*
homeassistant.components.schedule.*
homeassistant.components.schlage.*

View File

@@ -7,8 +7,8 @@
"python.testing.pytestEnabled": false,
// https://code.visualstudio.com/docs/python/linting#_general-settings
"pylint.importStrategy": "fromEnvironment",
// Pyright type checking is not compatible with mypy which Home Assistant uses for type checking
"python.analysis.typeCheckingMode": "off",
// Pyright is too pedantic for Home Assistant
"python.analysis.typeCheckingMode": "basic",
"[python]": {
"editor.defaultFormatter": "charliermarsh.ruff",
},

8
.vscode/tasks.json vendored
View File

@@ -45,7 +45,7 @@
{
"label": "Ruff",
"type": "shell",
"command": "prek run ruff-check --all-files",
"command": "pre-commit run ruff-check --all-files",
"group": {
"kind": "test",
"isDefault": true
@@ -57,9 +57,9 @@
"problemMatcher": []
},
{
"label": "Prek",
"label": "Pre-commit",
"type": "shell",
"command": "prek run --show-diff-on-failure",
"command": "pre-commit run --show-diff-on-failure",
"group": {
"kind": "test",
"isDefault": true
@@ -120,7 +120,7 @@
{
"label": "Generate Requirements",
"type": "shell",
"command": "${command:python.interpreterPath} -m script.gen_requirements_all",
"command": "./script/gen_requirements_all.py",
"group": {
"kind": "build",
"isDefault": true

324
AGENTS.md
View File

@@ -1,324 +0,0 @@
# GitHub Copilot & Claude Code Instructions
This repository contains the core of Home Assistant, a Python 3 based home automation application.
## Code Review Guidelines
**When reviewing code, do NOT comment on:**
- **Missing imports** - We use static analysis tooling to catch that
- **Code formatting** - We have ruff as a formatting tool that will catch those if needed (unless specifically instructed otherwise in these instructions)
**Git commit practices during review:**
- **Do NOT amend, squash, or rebase commits after review has started** - Reviewers need to see what changed since their last review
## Python Requirements
- **Compatibility**: Python 3.13+
- **Language Features**: Use the newest features when possible:
- Pattern matching
- Type hints
- f-strings (preferred over `%` or `.format()`)
- Dataclasses
- Walrus operator
### Strict Typing (Platinum)
- **Comprehensive Type Hints**: Add type hints to all functions, methods, and variables
- **Custom Config Entry Types**: When using runtime_data:
```python
type MyIntegrationConfigEntry = ConfigEntry[MyClient]
```
- **Library Requirements**: Include `py.typed` file for PEP-561 compliance
## Code Quality Standards
- **Formatting**: Ruff
- **Linting**: PyLint and Ruff
- **Type Checking**: MyPy
- **Lint/Type/Format Fixes**: Always prefer addressing the underlying issue (e.g., import the typed source, update shared stubs, align with Ruff expectations, or correct formatting at the source) before disabling a rule, adding `# type: ignore`, or skipping a formatter. Treat suppressions and `noqa` comments as a last resort once no compliant fix exists
- **Testing**: pytest with plain functions and fixtures
- **Language**: American English for all code, comments, and documentation (use sentence case, including titles)
### Writing Style Guidelines
- **Tone**: Friendly and informative
- **Perspective**: Use second-person ("you" and "your") for user-facing messages
- **Inclusivity**: Use objective, non-discriminatory language
- **Clarity**: Write for non-native English speakers
- **Formatting in Messages**:
- Use backticks for: file paths, filenames, variable names, field entries
- Use sentence case for titles and messages (capitalize only the first word and proper nouns)
- Avoid abbreviations when possible
### Documentation Standards
- **File Headers**: Short and concise
```python
"""Integration for Peblar EV chargers."""
```
- **Method/Function Docstrings**: Required for all
```python
async def async_setup_entry(hass: HomeAssistant, entry: PeblarConfigEntry) -> bool:
"""Set up Peblar from a config entry."""
```
- **Comment Style**:
- Use clear, descriptive comments
- Explain the "why" not just the "what"
- Keep code block lines under 80 characters when possible
- Use progressive disclosure (simple explanation first, complex details later)
## Async Programming
- All external I/O operations must be async
- **Best Practices**:
- Avoid sleeping in loops
- Avoid awaiting in loops - use `gather` instead (see the sketch after this list)
- No blocking calls
- Group executor jobs when possible - switching between event loop and executor is expensive
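For instance, a sketch of fanning out independent requests with `asyncio.gather` instead of awaiting them one by one (`client.fetch_status` is a placeholder):
```python
# ❌ Awaiting sequentially inside a loop
statuses = []
for device in devices:
    statuses.append(await client.fetch_status(device))

# ✅ Run the requests concurrently
statuses = await asyncio.gather(
    *(client.fetch_status(device) for device in devices)
)
```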
### Blocking Operations
- **Use Executor**: For blocking I/O operations
```python
result = await hass.async_add_executor_job(blocking_function, args)
```
- **Never Block Event Loop**: Avoid file operations, `time.sleep()`, blocking HTTP calls
- **Replace with Async**: Use `asyncio.sleep()` instead of `time.sleep()`
### Thread Safety
- **@callback Decorator**: For event loop safe functions
```python
@callback
def async_update_callback(self, event):
"""Safe to run in event loop."""
self.async_write_ha_state()
```
- **Sync APIs from Threads**: Use sync versions when calling from non-event loop threads
- **Registry Changes**: Must be done in event loop thread
### Error Handling
- **Exception Types**: Choose most specific exception available
- `ServiceValidationError`: User input errors (preferred over `ValueError`)
- `HomeAssistantError`: Device communication failures
- `ConfigEntryNotReady`: Temporary setup issues (device offline)
- `ConfigEntryAuthFailed`: Authentication problems
- `ConfigEntryError`: Permanent setup issues
- **Try/Catch Best Practices**:
- Only wrap code that can throw exceptions
- Keep try blocks minimal - process data after the try/catch
- **Avoid bare exceptions** except in specific cases:
- ❌ Generally not allowed: `except:` or `except Exception:`
- ✅ Allowed in config flows to ensure robustness
- ✅ Allowed in functions/methods that run in background tasks
- Bad pattern:
```python
try:
data = await device.get_data() # Can throw
# ❌ Don't process data inside try block
processed = data.get("value", 0) * 100
self._attr_native_value = processed
except DeviceError:
_LOGGER.error("Failed to get data")
```
- Good pattern:
```python
try:
data = await device.get_data() # Can throw
except DeviceError:
_LOGGER.error("Failed to get data")
return
# ✅ Process data outside try block
processed = data.get("value", 0) * 100
self._attr_native_value = processed
```
- **Bare Exception Usage**:
```python
# ❌ Not allowed in regular code
try:
data = await device.get_data()
except Exception: # Too broad
_LOGGER.error("Failed")
# ✅ Allowed in config flow for robustness
async def async_step_user(self, user_input=None):
try:
await self._test_connection(user_input)
except Exception: # Allowed here
errors["base"] = "unknown"
# ✅ Allowed in background tasks
async def _background_refresh():
try:
await coordinator.async_refresh()
except Exception: # Allowed in task
_LOGGER.exception("Unexpected error in background task")
```
- **Setup Failure Patterns**:
```python
try:
await device.async_setup()
except (asyncio.TimeoutError, TimeoutException) as ex:
raise ConfigEntryNotReady(f"Timeout connecting to {device.host}") from ex
except AuthFailed as ex:
raise ConfigEntryAuthFailed(f"Credentials expired for {device.name}") from ex
```
### Logging
- **Format Guidelines**:
- No periods at end of messages
- No integration names/domains (added automatically)
- No sensitive data (keys, tokens, passwords)
- Use debug level for non-user-facing messages
- **Use Lazy Logging**:
```python
_LOGGER.debug("This is a log message with %s", variable)
```
### Unavailability Logging
- **Log Once**: When device/service becomes unavailable (info level)
- **Log Recovery**: When device/service comes back online
- **Implementation Pattern**:
```python
class MySensor(SensorEntity):
    _unavailable_logged: bool = False

    async def async_update(self) -> None:
        try:
            data = await self.client.get_data()
        except MyException as ex:
            if not self._unavailable_logged:
                _LOGGER.info("The sensor is unavailable: %s", ex)
                self._unavailable_logged = True
            return
        # On recovery:
        if self._unavailable_logged:
            _LOGGER.info("The sensor is back online")
            self._unavailable_logged = False
        self._attr_native_value = data.value
```
## Development Commands
### Code Quality & Linting
- **Run all linters on all files**: `prek run --all-files`
- **Run linters on staged files only**: `prek run`
- **PyLint on everything** (slow): `pylint homeassistant`
- **PyLint on specific folder**: `pylint homeassistant/components/my_integration`
- **MyPy type checking (whole project)**: `mypy homeassistant/`
- **MyPy on specific integration**: `mypy homeassistant/components/my_integration`
### Testing
- **Quick test of changed files**: `pytest --timeout=10 --picked`
- **Update test snapshots**: Add `--snapshot-update` to pytest command
- ⚠️ Omit test results after using `--snapshot-update`
- Always run tests again without the flag to verify snapshots
- **Full test suite** (AVOID - very slow): `pytest ./tests`
### Dependencies & Requirements
- **Update generated files after dependency changes**: `python -m script.gen_requirements_all`
- **Install all Python requirements**:
```bash
uv pip install -r requirements_all.txt -r requirements.txt -r requirements_test.txt
```
- **Install test requirements only**:
```bash
uv pip install -r requirements_test_all.txt -r requirements.txt
```
### Translations
- **Update translations after strings.json changes**:
```bash
python -m script.translations develop --all
```
### Project Validation
- **Run hassfest** (checks project structure and updates generated files):
```bash
python -m script.hassfest
```
## Common Anti-Patterns & Best Practices
### ❌ **Avoid These Patterns**
```python
# Blocking operations in event loop
data = requests.get(url) # ❌ Blocks event loop
time.sleep(5) # ❌ Blocks event loop
# Reusing BleakClient instances
self.client = BleakClient(address)
await self.client.connect()
# Later...
await self.client.connect() # ❌ Don't reuse
# Hardcoded strings in code
self._attr_name = "Temperature Sensor" # ❌ Not translatable
# Missing error handling
data = await self.api.get_data() # ❌ No exception handling
# Storing sensitive data in diagnostics
return {"api_key": entry.data[CONF_API_KEY]} # ❌ Exposes secrets
# Accessing hass.data directly in tests
coordinator = hass.data[DOMAIN][entry.entry_id] # ❌ Don't access hass.data
# User-configurable polling intervals
# In config flow
vol.Optional("scan_interval", default=60): cv.positive_int # ❌ Not allowed
# In coordinator
update_interval = timedelta(minutes=entry.data.get("scan_interval", 1)) # ❌ Not allowed
# User-configurable config entry names (non-helper integrations)
vol.Optional("name", default="My Device"): cv.string # ❌ Not allowed in regular integrations
# Too much code in try block
try:
response = await client.get_data() # Can throw
# ❌ Data processing should be outside try block
temperature = response["temperature"] / 10
humidity = response["humidity"]
self._attr_native_value = temperature
except ClientError:
_LOGGER.error("Failed to fetch data")
# Bare exceptions in regular code
try:
value = await sensor.read_value()
except Exception: # ❌ Too broad - catch specific exceptions
_LOGGER.error("Failed to read sensor")
```
### ✅ **Use These Patterns Instead**
```python
# Async operations with executor
data = await hass.async_add_executor_job(requests.get, url)
await asyncio.sleep(5) # ✅ Non-blocking
# Fresh BleakClient instances
client = BleakClient(address) # ✅ New instance each time
await client.connect()
# Translatable entity names
_attr_translation_key = "temperature_sensor" # ✅ Translatable
# Proper error handling
try:
data = await self.api.get_data()
except ApiException as err:
raise UpdateFailed(f"API error: {err}") from err
# Redacted diagnostics data
return async_redact_data(data, {"api_key", "password"}) # ✅ Safe
# Test through proper integration setup and fixtures
@pytest.fixture
async def init_integration(hass, mock_config_entry, mock_api):
mock_config_entry.add_to_hass(hass)
await hass.config_entries.async_setup(mock_config_entry.entry_id) # ✅ Proper setup
# Integration-determined polling intervals (not user-configurable)
SCAN_INTERVAL = timedelta(minutes=5) # ✅ Common pattern: constant in const.py
class MyCoordinator(DataUpdateCoordinator[MyData]):
def __init__(self, hass: HomeAssistant, client: MyClient, config_entry: ConfigEntry) -> None:
# ✅ Integration determines interval based on device capabilities, connection type, etc.
interval = timedelta(minutes=1) if client.is_local else SCAN_INTERVAL
super().__init__(
hass,
logger=LOGGER,
name=DOMAIN,
update_interval=interval,
config_entry=config_entry, # ✅ Pass config_entry - it's accepted and recommended
)
```

1
AGENTS.md Symbolic link
View File

@@ -0,0 +1 @@
.github/copilot-instructions.md

View File

@@ -1 +1 @@
AGENTS.md
.github/copilot-instructions.md

25
CODEOWNERS generated
View File

@@ -516,8 +516,6 @@ build.json @home-assistant/supervisor
/tests/components/fireservicerota/ @cyberjunky
/homeassistant/components/firmata/ @DaAwesomeP
/tests/components/firmata/ @DaAwesomeP
/homeassistant/components/fish_audio/ @noambav
/tests/components/fish_audio/ @noambav
/homeassistant/components/fitbit/ @allenporter
/tests/components/fitbit/ @allenporter
/homeassistant/components/fivem/ @Sander0542
@@ -532,8 +530,6 @@ build.json @home-assistant/supervisor
/tests/components/flo/ @dmulcahey
/homeassistant/components/flume/ @ChrisMandich @bdraco @jeeftor
/tests/components/flume/ @ChrisMandich @bdraco @jeeftor
/homeassistant/components/fluss/ @fluss
/tests/components/fluss/ @fluss
/homeassistant/components/flux_led/ @icemanch
/tests/components/flux_led/ @icemanch
/homeassistant/components/forecast_solar/ @klaasnicolaas @frenck
@@ -641,8 +637,6 @@ build.json @home-assistant/supervisor
/tests/components/gpsd/ @fabaff @jrieger
/homeassistant/components/gree/ @cmroche
/tests/components/gree/ @cmroche
/homeassistant/components/green_planet_energy/ @petschni
/tests/components/green_planet_energy/ @petschni
/homeassistant/components/greeneye_monitor/ @jkeljo
/tests/components/greeneye_monitor/ @jkeljo
/homeassistant/components/group/ @home-assistant/core
@@ -663,8 +657,6 @@ build.json @home-assistant/supervisor
/tests/components/harmony/ @ehendrix23 @bdraco @mkeesey @Aohzan
/homeassistant/components/hassio/ @home-assistant/supervisor
/tests/components/hassio/ @home-assistant/supervisor
/homeassistant/components/hdfury/ @glenndehaan
/tests/components/hdfury/ @glenndehaan
/homeassistant/components/hdmi_cec/ @inytar
/tests/components/hdmi_cec/ @inytar
/homeassistant/components/heatmiser/ @andylockran
@@ -1019,8 +1011,8 @@ build.json @home-assistant/supervisor
/tests/components/mill/ @danielhiversen
/homeassistant/components/min_max/ @gjohansson-ST
/tests/components/min_max/ @gjohansson-ST
/homeassistant/components/minecraft_server/ @elmurato @zachdeibert
/tests/components/minecraft_server/ @elmurato @zachdeibert
/homeassistant/components/minecraft_server/ @elmurato
/tests/components/minecraft_server/ @elmurato
/homeassistant/components/minio/ @tkislan
/tests/components/minio/ @tkislan
/homeassistant/components/moat/ @bdraco
@@ -1070,8 +1062,6 @@ build.json @home-assistant/supervisor
/tests/components/myuplink/ @pajzo @astrandb
/homeassistant/components/nam/ @bieniu
/tests/components/nam/ @bieniu
/homeassistant/components/namecheapdns/ @tr4nt0r
/tests/components/namecheapdns/ @tr4nt0r
/homeassistant/components/nanoleaf/ @milanmeu @joostlek
/tests/components/nanoleaf/ @milanmeu @joostlek
/homeassistant/components/nasweb/ @nasWebio
@@ -1176,8 +1166,6 @@ build.json @home-assistant/supervisor
/tests/components/open_router/ @joostlek
/homeassistant/components/openerz/ @misialq
/tests/components/openerz/ @misialq
/homeassistant/components/openevse/ @c00w @firstof9
/tests/components/openevse/ @c00w @firstof9
/homeassistant/components/openexchangerates/ @MartinHjelmare
/tests/components/openexchangerates/ @MartinHjelmare
/homeassistant/components/opengarage/ @danielhiversen
@@ -1275,8 +1263,7 @@ build.json @home-assistant/supervisor
/tests/components/prosegur/ @dgomes
/homeassistant/components/proximity/ @mib1185
/tests/components/proximity/ @mib1185
/homeassistant/components/proxmoxve/ @jhollowe @Corbeno @erwindouna
/tests/components/proxmoxve/ @jhollowe @Corbeno @erwindouna
/homeassistant/components/proxmoxve/ @jhollowe @Corbeno
/homeassistant/components/ps4/ @ktnrg45
/tests/components/ps4/ @ktnrg45
/homeassistant/components/pterodactyl/ @elmurato
@@ -1708,8 +1695,8 @@ build.json @home-assistant/supervisor
/tests/components/trafikverket_train/ @gjohansson-ST
/homeassistant/components/trafikverket_weatherstation/ @gjohansson-ST
/tests/components/trafikverket_weatherstation/ @gjohansson-ST
/homeassistant/components/transmission/ @engrbm87 @JPHutchins @andrew-codechimp
/tests/components/transmission/ @engrbm87 @JPHutchins @andrew-codechimp
/homeassistant/components/transmission/ @engrbm87 @JPHutchins
/tests/components/transmission/ @engrbm87 @JPHutchins
/homeassistant/components/trend/ @jpbede
/tests/components/trend/ @jpbede
/homeassistant/components/triggercmd/ @rvmey
@@ -1810,8 +1797,6 @@ build.json @home-assistant/supervisor
/tests/components/waqi/ @joostlek
/homeassistant/components/water_heater/ @home-assistant/core
/tests/components/water_heater/ @home-assistant/core
/homeassistant/components/waterfurnace/ @sdague @masterkoppa
/tests/components/waterfurnace/ @sdague @masterkoppa
/homeassistant/components/watergate/ @adam-the-hero
/tests/components/watergate/ @adam-the-hero
/homeassistant/components/watson_tts/ @rutkai

4
Dockerfile generated
View File

@@ -24,13 +24,13 @@ ENV \
COPY rootfs /
# Add go2rtc binary
COPY --from=ghcr.io/alexxit/go2rtc@sha256:675c318b23c06fd862a61d262240c9a63436b4050d177ffc68a32710d9e05bae /usr/local/bin/go2rtc /bin/go2rtc
COPY --from=ghcr.io/alexxit/go2rtc@sha256:f394f6329f5389a4c9a7fc54b09fdec9621bbb78bf7a672b973440bbdfb02241 /usr/local/bin/go2rtc /bin/go2rtc
RUN \
# Verify go2rtc can be executed
go2rtc --version \
# Install uv
&& pip3 install uv==0.9.26
&& pip3 install uv==0.9.17
WORKDIR /usr/src

View File

@@ -67,6 +67,8 @@ from .const import (
BASE_PLATFORMS,
FORMAT_DATETIME,
KEY_DATA_LOGGING as DATA_LOGGING,
REQUIRED_NEXT_PYTHON_HA_RELEASE,
REQUIRED_NEXT_PYTHON_VER,
SIGNAL_BOOTSTRAP_INTEGRATIONS,
)
from .core_config import async_process_ha_core_config
@@ -514,6 +516,38 @@ async def async_from_config_dict(
stop = monotonic()
_LOGGER.info("Home Assistant initialized in %.2fs", stop - start)
if (
REQUIRED_NEXT_PYTHON_HA_RELEASE
and sys.version_info[:3] < REQUIRED_NEXT_PYTHON_VER
):
current_python_version = ".".join(str(x) for x in sys.version_info[:3])
required_python_version = ".".join(str(x) for x in REQUIRED_NEXT_PYTHON_VER[:2])
_LOGGER.warning(
(
"Support for the running Python version %s is deprecated and "
"will be removed in Home Assistant %s; "
"Please upgrade Python to %s"
),
current_python_version,
REQUIRED_NEXT_PYTHON_HA_RELEASE,
required_python_version,
)
issue_registry.async_create_issue(
hass,
core.DOMAIN,
f"python_version_{required_python_version}",
is_fixable=False,
severity=issue_registry.IssueSeverity.WARNING,
breaks_in_ha_version=REQUIRED_NEXT_PYTHON_HA_RELEASE,
translation_key="python_version",
translation_placeholders={
"current_python_version": current_python_version,
"required_python_version": required_python_version,
"breaks_in_ha_version": REQUIRED_NEXT_PYTHON_HA_RELEASE,
},
)
return hass

View File

@@ -1,6 +1,5 @@
{
"domain": "leviton",
"name": "Leviton",
"integrations": ["decora_wifi"],
"iot_standards": ["zwave"]
}

View File

@@ -7,5 +7,5 @@
"integration_type": "service",
"iot_class": "cloud_polling",
"loggers": ["accuweather"],
"requirements": ["accuweather==5.0.0"]
"requirements": ["accuweather==4.2.2"]
}

View File

@@ -15,10 +15,12 @@ from homeassistant.components.climate import (
)
from homeassistant.const import ATTR_TEMPERATURE, UnitOfTemperature
from homeassistant.core import HomeAssistant
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN
from .coordinator import ActronAirConfigEntry, ActronAirSystemCoordinator
from .entity import ActronAirAcEntity, ActronAirZoneEntity
PARALLEL_UPDATES = 0
@@ -54,7 +56,8 @@ async def async_setup_entry(
for coordinator in system_coordinators.values():
status = coordinator.data
entities.append(ActronSystemClimate(coordinator))
name = status.ac_system.system_name
entities.append(ActronSystemClimate(coordinator, name))
entities.extend(
ActronZoneClimate(coordinator, zone)
@@ -65,9 +68,10 @@ async def async_setup_entry(
async_add_entities(entities)
class ActronAirClimateEntity(ClimateEntity):
class BaseClimateEntity(CoordinatorEntity[ActronAirSystemCoordinator], ClimateEntity):
"""Base class for Actron Air climate entities."""
_attr_has_entity_name = True
_attr_temperature_unit = UnitOfTemperature.CELSIUS
_attr_supported_features = (
ClimateEntityFeature.TARGET_TEMPERATURE
@@ -79,17 +83,43 @@ class ActronAirClimateEntity(ClimateEntity):
_attr_fan_modes = list(FAN_MODE_MAPPING_ACTRONAIR_TO_HA.values())
_attr_hvac_modes = list(HVAC_MODE_MAPPING_ACTRONAIR_TO_HA.values())
def __init__(
self,
coordinator: ActronAirSystemCoordinator,
name: str,
) -> None:
"""Initialize an Actron Air unit."""
super().__init__(coordinator)
self._serial_number = coordinator.serial_number
class ActronSystemClimate(ActronAirAcEntity, ActronAirClimateEntity):
class ActronSystemClimate(BaseClimateEntity):
"""Representation of the Actron Air system."""
_attr_supported_features = (
ClimateEntityFeature.TARGET_TEMPERATURE
| ClimateEntityFeature.FAN_MODE
| ClimateEntityFeature.TURN_ON
| ClimateEntityFeature.TURN_OFF
)
def __init__(
self,
coordinator: ActronAirSystemCoordinator,
name: str,
) -> None:
"""Initialize an Actron Air unit."""
super().__init__(coordinator)
self._attr_unique_id = self._serial_number
super().__init__(coordinator, name)
serial_number = coordinator.serial_number
self._attr_unique_id = serial_number
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, serial_number)},
name=self._status.ac_system.system_name,
manufacturer="Actron Air",
model_id=self._status.ac_system.master_wc_model,
sw_version=self._status.ac_system.master_wc_firmware_version,
serial_number=serial_number,
)
@property
def min_temp(self) -> float:
@@ -138,7 +168,7 @@ class ActronSystemClimate(ActronAirAcEntity, ActronAirClimateEntity):
async def async_set_fan_mode(self, fan_mode: str) -> None:
"""Set a new fan mode."""
api_fan_mode = FAN_MODE_MAPPING_HA_TO_ACTRONAIR.get(fan_mode)
api_fan_mode = FAN_MODE_MAPPING_HA_TO_ACTRONAIR.get(fan_mode.lower())
await self._status.user_aircon_settings.set_fan_mode(api_fan_mode)
async def async_set_hvac_mode(self, hvac_mode: HVACMode) -> None:
@@ -152,7 +182,7 @@ class ActronSystemClimate(ActronAirAcEntity, ActronAirClimateEntity):
await self._status.user_aircon_settings.set_temperature(temperature=temp)
class ActronZoneClimate(ActronAirZoneEntity, ActronAirClimateEntity):
class ActronZoneClimate(BaseClimateEntity):
"""Representation of a zone within the Actron Air system."""
_attr_supported_features = (
@@ -167,8 +197,18 @@ class ActronZoneClimate(ActronAirZoneEntity, ActronAirClimateEntity):
zone: ActronAirZone,
) -> None:
"""Initialize an Actron Air unit."""
super().__init__(coordinator, zone)
self._attr_unique_id: str = self._zone_identifier
super().__init__(coordinator, zone.title)
serial_number = coordinator.serial_number
self._zone_id: int = zone.zone_id
self._attr_unique_id: str = f"{serial_number}_zone_{zone.zone_id}"
self._attr_device_info: DeviceInfo = DeviceInfo(
identifiers={(DOMAIN, self._attr_unique_id)},
name=zone.title,
manufacturer="Actron Air",
model="Zone",
suggested_area=zone.title,
via_device=(DOMAIN, serial_number),
)
@property
def min_temp(self) -> float:
@@ -216,4 +256,4 @@ class ActronZoneClimate(ActronAirZoneEntity, ActronAirClimateEntity):
async def async_set_temperature(self, **kwargs: Any) -> None:
"""Set the temperature."""
await self._zone.set_temperature(temperature=kwargs.get(ATTR_TEMPERATURE))
await self._zone.set_temperature(temperature=kwargs["temperature"])

View File

@@ -8,7 +8,6 @@ from datetime import timedelta
from actron_neo_api import (
ActronAirACSystem,
ActronAirAPI,
ActronAirAPIError,
ActronAirAuthError,
ActronAirStatus,
)
@@ -16,7 +15,7 @@ from actron_neo_api import (
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from homeassistant.util import dt as dt_util
from .const import _LOGGER, DOMAIN
@@ -71,12 +70,6 @@ class ActronAirSystemCoordinator(DataUpdateCoordinator[ActronAirACSystem]):
translation_domain=DOMAIN,
translation_key="auth_error",
) from err
except ActronAirAPIError as err:
raise UpdateFailed(
translation_domain=DOMAIN,
translation_key="update_error",
translation_placeholders={"error": repr(err)},
) from err
self.status = self.api.state_manager.get_status(self.serial_number)
self.last_seen = dt_util.utcnow()

View File

@@ -1,63 +0,0 @@
"""Base entity classes for Actron Air integration."""
from actron_neo_api import ActronAirZone
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN
from .coordinator import ActronAirSystemCoordinator
class ActronAirEntity(CoordinatorEntity[ActronAirSystemCoordinator]):
"""Base class for Actron Air entities."""
_attr_has_entity_name = True
def __init__(self, coordinator: ActronAirSystemCoordinator) -> None:
"""Initialize the entity."""
super().__init__(coordinator)
self._serial_number = coordinator.serial_number
@property
def available(self) -> bool:
"""Return True if entity is available."""
return not self.coordinator.is_device_stale()
class ActronAirAcEntity(ActronAirEntity):
"""Base class for Actron Air entities."""
def __init__(self, coordinator: ActronAirSystemCoordinator) -> None:
"""Initialize the entity."""
super().__init__(coordinator)
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, self._serial_number)},
name=coordinator.data.ac_system.system_name,
manufacturer="Actron Air",
model_id=coordinator.data.ac_system.master_wc_model,
sw_version=coordinator.data.ac_system.master_wc_firmware_version,
serial_number=self._serial_number,
)
class ActronAirZoneEntity(ActronAirEntity):
"""Base class for Actron Air zone entities."""
def __init__(
self,
coordinator: ActronAirSystemCoordinator,
zone: ActronAirZone,
) -> None:
"""Initialize the entity."""
super().__init__(coordinator)
self._zone_id: int = zone.zone_id
self._zone_identifier = f"{self._serial_number}_zone_{zone.zone_id}"
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, self._zone_identifier)},
name=zone.title,
manufacturer="Actron Air",
model="Zone",
suggested_area=zone.title,
via_device=(DOMAIN, self._serial_number),
)

View File

@@ -51,9 +51,6 @@
"exceptions": {
"auth_error": {
"message": "Authentication failed, please reauthenticate"
},
"update_error": {
"message": "An error occurred while retrieving data from the Actron Air API: {error}"
}
}
}

View File

@@ -7,10 +7,12 @@ from typing import Any
from homeassistant.components.switch import SwitchEntity, SwitchEntityDescription
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from .const import DOMAIN
from .coordinator import ActronAirConfigEntry, ActronAirSystemCoordinator
from .entity import ActronAirAcEntity
PARALLEL_UPDATES = 0
@@ -72,9 +74,10 @@ async def async_setup_entry(
)
class ActronAirSwitch(ActronAirAcEntity, SwitchEntity):
class ActronAirSwitch(CoordinatorEntity[ActronAirSystemCoordinator], SwitchEntity):
"""Actron Air switch."""
_attr_has_entity_name = True
_attr_entity_category = EntityCategory.CONFIG
entity_description: ActronAirSwitchEntityDescription
@@ -87,6 +90,11 @@ class ActronAirSwitch(ActronAirAcEntity, SwitchEntity):
super().__init__(coordinator)
self.entity_description = description
self._attr_unique_id = f"{coordinator.serial_number}_{description.key}"
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, coordinator.serial_number)},
manufacturer="Actron Air",
name=coordinator.data.ac_system.system_name,
)
@property
def is_on(self) -> bool:

View File

@@ -87,7 +87,7 @@ class AdaxConfigFlow(ConfigFlow, domain=DOMAIN):
data_schema=data_schema,
)
wifi_ssid = user_input[WIFI_SSID]
wifi_ssid = user_input[WIFI_SSID].replace(" ", "")
wifi_pswd = user_input[WIFI_PSWD].replace(" ", "")
configurator = adax_local.AdaxConfig(wifi_ssid, wifi_pswd)

View File

@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/adax",
"iot_class": "local_polling",
"loggers": ["adax", "adax_local"],
"requirements": ["adax==0.4.0", "Adax-local==0.3.0"]
"requirements": ["adax==0.4.0", "Adax-local==0.2.0"]
}

View File

@@ -52,7 +52,7 @@ class AdGuardHomeEntity(Entity):
def device_info(self) -> DeviceInfo:
"""Return device information about this AdGuard Home instance."""
if self._entry.source == SOURCE_HASSIO:
config_url = "homeassistant://app/a0d7b954_adguard"
config_url = "homeassistant://hassio/ingress/a0d7b954_adguard"
elif self.adguard.tls:
config_url = f"https://{self.adguard.host}:{self.adguard.port}"
else:

View File

@@ -8,15 +8,12 @@ from advantage_air import ApiError, advantage_air
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_IP_ADDRESS, CONF_PORT, Platform
from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.debounce import Debouncer
from homeassistant.helpers.typing import ConfigType
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import ADVANTAGE_AIR_RETRY, DOMAIN
from .const import ADVANTAGE_AIR_RETRY
from .models import AdvantageAirData
from .services import async_setup_services
type AdvantageAirDataConfigEntry = ConfigEntry[AdvantageAirData]
@@ -35,14 +32,6 @@ PLATFORMS = [
_LOGGER = logging.getLogger(__name__)
REQUEST_REFRESH_DELAY = 0.5
CONFIG_SCHEMA = cv.config_entry_only_config_schema(DOMAIN)
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the component."""
async_setup_services(hass)
return True
async def async_setup_entry(
hass: HomeAssistant, entry: AdvantageAirDataConfigEntry

View File

@@ -5,6 +5,8 @@ from __future__ import annotations
from decimal import Decimal
from typing import Any
import voluptuous as vol
from homeassistant.components.sensor import (
SensorDeviceClass,
SensorEntity,
@@ -12,6 +14,7 @@ from homeassistant.components.sensor import (
)
from homeassistant.const import PERCENTAGE, EntityCategory, UnitOfTemperature
from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_validation as cv, entity_platform
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import AdvantageAirDataConfigEntry
@@ -21,6 +24,7 @@ from .models import AdvantageAirData
ADVANTAGE_AIR_SET_COUNTDOWN_VALUE = "minutes"
ADVANTAGE_AIR_SET_COUNTDOWN_UNIT = "min"
ADVANTAGE_AIR_SERVICE_SET_TIME_TO = "set_time_to"
PARALLEL_UPDATES = 0
@@ -49,6 +53,13 @@ async def async_setup_entry(
entities.append(AdvantageAirZoneSignal(instance, ac_key, zone_key))
async_add_entities(entities)
platform = entity_platform.async_get_current_platform()
platform.async_register_entity_service(
ADVANTAGE_AIR_SERVICE_SET_TIME_TO,
{vol.Required("minutes"): cv.positive_int},
"set_time_to",
)
class AdvantageAirTimeTo(AdvantageAirAcEntity, SensorEntity):
"""Representation of Advantage Air timer control."""

View File

@@ -1,27 +0,0 @@
"""Services for Advantage Air integration."""
from __future__ import annotations
import voluptuous as vol
from homeassistant.components.sensor import DOMAIN as SENSOR_DOMAIN
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers import config_validation as cv, service
from .const import DOMAIN
ADVANTAGE_AIR_SERVICE_SET_TIME_TO = "set_time_to"
@callback
def async_setup_services(hass: HomeAssistant) -> None:
"""Home Assistant services."""
service.async_register_platform_entity_service(
hass,
DOMAIN,
ADVANTAGE_AIR_SERVICE_SET_TIME_TO,
entity_domain=SENSOR_DOMAIN,
schema={vol.Required("minutes"): cv.positive_int},
func="set_time_to",
)

View File

@@ -7,13 +7,7 @@ from homeassistant.core import HomeAssistant
from .coordinator import AirobotConfigEntry, AirobotDataUpdateCoordinator
PLATFORMS: list[Platform] = [
Platform.BUTTON,
Platform.CLIMATE,
Platform.NUMBER,
Platform.SENSOR,
Platform.SWITCH,
]
PLATFORMS: list[Platform] = [Platform.CLIMATE, Platform.SENSOR]
async def async_setup_entry(hass: HomeAssistant, entry: AirobotConfigEntry) -> bool:

View File

@@ -1,96 +0,0 @@
"""Button platform for Airobot integration."""
from __future__ import annotations
from collections.abc import Callable, Coroutine
from dataclasses import dataclass
from typing import Any
from pyairobotrest.exceptions import (
AirobotConnectionError,
AirobotError,
AirobotTimeoutError,
)
from homeassistant.components.button import (
ButtonDeviceClass,
ButtonEntity,
ButtonEntityDescription,
)
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN
from .coordinator import AirobotConfigEntry, AirobotDataUpdateCoordinator
from .entity import AirobotEntity
PARALLEL_UPDATES = 0
@dataclass(frozen=True, kw_only=True)
class AirobotButtonEntityDescription(ButtonEntityDescription):
"""Describes Airobot button entity."""
press_fn: Callable[[AirobotDataUpdateCoordinator], Coroutine[Any, Any, None]]
BUTTON_TYPES: tuple[AirobotButtonEntityDescription, ...] = (
AirobotButtonEntityDescription(
key="restart",
device_class=ButtonDeviceClass.RESTART,
entity_category=EntityCategory.CONFIG,
press_fn=lambda coordinator: coordinator.client.reboot_thermostat(),
),
AirobotButtonEntityDescription(
key="recalibrate_co2",
translation_key="recalibrate_co2",
entity_category=EntityCategory.CONFIG,
entity_registry_enabled_default=False,
press_fn=lambda coordinator: coordinator.client.recalibrate_co2_sensor(),
),
)
async def async_setup_entry(
hass: HomeAssistant,
entry: AirobotConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Airobot button entities."""
coordinator = entry.runtime_data
async_add_entities(
AirobotButton(coordinator, description) for description in BUTTON_TYPES
)
class AirobotButton(AirobotEntity, ButtonEntity):
"""Representation of an Airobot button."""
entity_description: AirobotButtonEntityDescription
def __init__(
self,
coordinator: AirobotDataUpdateCoordinator,
description: AirobotButtonEntityDescription,
) -> None:
"""Initialize the button."""
super().__init__(coordinator)
self.entity_description = description
self._attr_unique_id = f"{coordinator.data.status.device_id}_{description.key}"
async def async_press(self) -> None:
"""Handle the button press."""
try:
await self.entity_description.press_fn(self.coordinator)
except (AirobotConnectionError, AirobotTimeoutError):
# Connection errors during reboot are expected as device restarts
pass
except AirobotError as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="button_press_failed",
translation_placeholders={"button": self.entity_description.key},
) from err

View File

@@ -29,7 +29,6 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import AirobotConfigEntry
from .const import DOMAIN
from .coordinator import AirobotDataUpdateCoordinator
from .entity import AirobotEntity
PARALLEL_UPDATES = 1
@@ -64,11 +63,6 @@ class AirobotClimate(AirobotEntity, ClimateEntity):
_attr_min_temp = SETPOINT_TEMP_MIN
_attr_max_temp = SETPOINT_TEMP_MAX
def __init__(self, coordinator: AirobotDataUpdateCoordinator) -> None:
"""Initialize the climate entity."""
super().__init__(coordinator)
self._attr_unique_id = coordinator.data.status.device_id
@property
def _status(self) -> ThermostatStatus:
"""Get status from coordinator data."""

View File

@@ -2,7 +2,6 @@
from __future__ import annotations
import asyncio
from collections.abc import Mapping
from dataclasses import dataclass
import logging
@@ -61,17 +60,11 @@ async def validate_input(hass: HomeAssistant, data: dict[str, Any]) -> DeviceInf
try:
# Try to fetch data to validate connection and authentication
status, settings = await asyncio.gather(
client.get_statuses(), client.get_settings()
)
status = await client.get_statuses()
settings = await client.get_settings()
except AirobotAuthError as err:
raise InvalidAuth from err
except (
AirobotConnectionError,
AirobotTimeoutError,
AirobotError,
TimeoutError,
) as err:
except (AirobotConnectionError, AirobotTimeoutError, AirobotError) as err:
raise CannotConnect from err
# Use device name or device ID as title
@@ -182,42 +175,6 @@ class AirobotConfigFlow(BaseConfigFlow, domain=DOMAIN):
step_id="user", data_schema=STEP_USER_DATA_SCHEMA, errors=errors
)
async def async_step_reconfigure(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle reconfiguration of the integration."""
errors: dict[str, str] = {}
reconfigure_entry = self._get_reconfigure_entry()
if user_input is not None:
try:
info = await validate_input(self.hass, user_input)
except CannotConnect:
errors["base"] = "cannot_connect"
except InvalidAuth:
errors["base"] = "invalid_auth"
except Exception:
_LOGGER.exception("Unexpected exception")
errors["base"] = "unknown"
else:
# Verify the device ID matches the existing config entry
await self.async_set_unique_id(info.device_id)
self._abort_if_unique_id_mismatch(reason="wrong_device")
return self.async_update_reload_and_abort(
reconfigure_entry,
data_updates=user_input,
title=info.title,
)
return self.async_show_form(
step_id="reconfigure",
data_schema=self.add_suggested_values_to_schema(
STEP_USER_DATA_SCHEMA, reconfigure_entry.data
),
errors=errors,
)
async def async_step_reauth(
self, entry_data: Mapping[str, Any]
) -> ConfigFlowResult:

View File

@@ -2,7 +2,6 @@
from __future__ import annotations
import asyncio
from datetime import timedelta
import logging
@@ -53,10 +52,8 @@ class AirobotDataUpdateCoordinator(DataUpdateCoordinator[AirobotData]):
async def _async_update_data(self) -> AirobotData:
"""Fetch data from API endpoint."""
try:
status, settings = await asyncio.gather(
self.client.get_statuses(),
self.client.get_settings(),
)
status = await self.client.get_statuses()
settings = await self.client.get_settings()
except AirobotAuthError as err:
raise ConfigEntryAuthFailed(
translation_domain=DOMAIN,

View File

@@ -24,6 +24,8 @@ class AirobotEntity(CoordinatorEntity[AirobotDataUpdateCoordinator]):
status = coordinator.data.status
settings = coordinator.data.settings
self._attr_unique_id = status.device_id
connections = set()
if (mac := coordinator.config_entry.data.get(CONF_MAC)) is not None:
connections.add((CONNECTION_NETWORK_MAC, mac))
@@ -35,6 +37,6 @@ class AirobotEntity(CoordinatorEntity[AirobotDataUpdateCoordinator]):
manufacturer="Airobot",
model="Thermostat",
model_id="TE1",
sw_version=status.fw_version_string,
hw_version=status.hw_version_string,
sw_version=str(status.fw_version),
hw_version=str(status.hw_version),
)

View File

@@ -1,22 +0,0 @@
{
"entity": {
"button": {
"recalibrate_co2": {
"default": "mdi:molecule-co2"
}
},
"number": {
"hysteresis_band": {
"default": "mdi:delta"
}
},
"switch": {
"actuator_exercise_disabled": {
"default": "mdi:valve"
},
"child_lock": {
"default": "mdi:lock"
}
}
}
}

View File

@@ -12,6 +12,6 @@
"integration_type": "device",
"iot_class": "local_polling",
"loggers": ["pyairobotrest"],
"quality_scale": "platinum",
"requirements": ["pyairobotrest==0.3.0"]
"quality_scale": "silver",
"requirements": ["pyairobotrest==0.1.0"]
}

View File

@@ -1,99 +0,0 @@
"""Number platform for Airobot thermostat."""
from __future__ import annotations
from collections.abc import Awaitable, Callable
from dataclasses import dataclass
from pyairobotrest.const import HYSTERESIS_BAND_MAX, HYSTERESIS_BAND_MIN
from pyairobotrest.exceptions import AirobotError
from homeassistant.components.number import (
NumberDeviceClass,
NumberEntity,
NumberEntityDescription,
)
from homeassistant.const import EntityCategory, UnitOfTemperature
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ServiceValidationError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import AirobotConfigEntry
from .const import DOMAIN
from .coordinator import AirobotDataUpdateCoordinator
from .entity import AirobotEntity
PARALLEL_UPDATES = 0
@dataclass(frozen=True, kw_only=True)
class AirobotNumberEntityDescription(NumberEntityDescription):
"""Describes Airobot number entity."""
value_fn: Callable[[AirobotDataUpdateCoordinator], float]
set_value_fn: Callable[[AirobotDataUpdateCoordinator, float], Awaitable[None]]
NUMBERS: tuple[AirobotNumberEntityDescription, ...] = (
AirobotNumberEntityDescription(
key="hysteresis_band",
translation_key="hysteresis_band",
device_class=NumberDeviceClass.TEMPERATURE,
entity_category=EntityCategory.CONFIG,
entity_registry_enabled_default=False,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
native_min_value=HYSTERESIS_BAND_MIN / 10.0,
native_max_value=HYSTERESIS_BAND_MAX / 10.0,
native_step=0.1,
value_fn=lambda coordinator: coordinator.data.settings.hysteresis_band,
set_value_fn=lambda coordinator, value: coordinator.client.set_hysteresis_band(
value
),
),
)
async def async_setup_entry(
hass: HomeAssistant,
entry: AirobotConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Airobot number platform."""
coordinator = entry.runtime_data
async_add_entities(
AirobotNumber(coordinator, description) for description in NUMBERS
)
class AirobotNumber(AirobotEntity, NumberEntity):
"""Representation of an Airobot number entity."""
entity_description: AirobotNumberEntityDescription
def __init__(
self,
coordinator: AirobotDataUpdateCoordinator,
description: AirobotNumberEntityDescription,
) -> None:
"""Initialize the number entity."""
super().__init__(coordinator)
self.entity_description = description
self._attr_unique_id = f"{coordinator.data.status.device_id}_{description.key}"
@property
def native_value(self) -> float:
"""Return the current value."""
return self.entity_description.value_fn(self.coordinator)
async def async_set_native_value(self, value: float) -> None:
"""Set the value."""
try:
await self.entity_description.set_value_fn(self.coordinator, value)
except AirobotError as err:
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="set_value_failed",
translation_placeholders={"error": str(err)},
) from err
else:
await self.coordinator.async_request_refresh()

View File

@@ -43,12 +43,12 @@ rules:
discovery-update-info: done
discovery: done
docs-data-update: done
docs-examples: done
docs-examples: todo
docs-known-limitations: done
docs-supported-devices: done
docs-supported-functions: done
docs-troubleshooting: done
docs-use-cases: done
docs-use-cases: todo
dynamic-devices:
status: exempt
comment: Single device integration, no dynamic device discovery needed.
@@ -57,8 +57,8 @@ rules:
entity-disabled-by-default: done
entity-translations: done
exception-translations: done
icon-translations: done
reconfiguration-flow: done
icon-translations: todo
reconfiguration-flow: todo
repair-issues:
status: exempt
comment: This integration doesn't have any cases where raising an issue is needed.
@@ -69,4 +69,4 @@ rules:
# Platinum
async-dependency: done
inject-websession: done
strict-typing: done
strict-typing: todo

View File

@@ -28,7 +28,6 @@ from homeassistant.util.dt import utcnow
from homeassistant.util.variance import ignore_variance
from . import AirobotConfigEntry
from .coordinator import AirobotDataUpdateCoordinator
from .entity import AirobotEntity
PARALLEL_UPDATES = 0
@@ -54,7 +53,6 @@ SENSOR_TYPES: tuple[AirobotSensorEntityDescription, ...] = (
device_class=SensorDeviceClass.TEMPERATURE,
native_unit_of_measurement=UnitOfTemperature.CELSIUS,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=1,
value_fn=lambda status: status.temp_air,
),
AirobotSensorEntityDescription(
@@ -138,7 +136,7 @@ class AirobotSensor(AirobotEntity, SensorEntity):
def __init__(
self,
coordinator: AirobotDataUpdateCoordinator,
coordinator,
description: AirobotSensorEntityDescription,
) -> None:
"""Initialize the sensor."""

View File

@@ -2,9 +2,7 @@
"config": {
"abort": {
"already_configured": "[%key:common::config_flow::abort::already_configured_device%]",
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]",
"reconfigure_successful": "[%key:common::config_flow::abort::reconfigure_successful%]",
"wrong_device": "Device ID does not match the existing configuration. Please use the correct device credentials."
"reauth_successful": "[%key:common::config_flow::abort::reauth_successful%]"
},
"error": {
"cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
@@ -30,19 +28,6 @@
},
"description": "The authentication for Airobot thermostat at {host} (Device ID: {username}) has expired. Please enter the password to reauthenticate. Find the password in the thermostat settings menu under Connectivity → Mobile app."
},
"reconfigure": {
"data": {
"host": "[%key:common::config_flow::data::host%]",
"password": "[%key:common::config_flow::data::password%]",
"username": "Device ID"
},
"data_description": {
"host": "[%key:component::airobot::config::step::user::data_description::host%]",
"password": "[%key:component::airobot::config::step::user::data_description::password%]",
"username": "[%key:component::airobot::config::step::user::data_description::username%]"
},
"description": "Update your Airobot thermostat connection details. Note: The Device ID must remain the same as the original configuration."
},
"user": {
"data": {
"host": "[%key:common::config_flow::data::host%]",
@@ -59,16 +44,6 @@
}
},
"entity": {
"button": {
"recalibrate_co2": {
"name": "Recalibrate CO2 sensor"
}
},
"number": {
"hysteresis_band": {
"name": "Hysteresis band"
}
},
"sensor": {
"air_temperature": {
"name": "Air temperature"
@@ -85,23 +60,12 @@
"heating_uptime": {
"name": "Heating uptime"
}
},
"switch": {
"actuator_exercise_disabled": {
"name": "Actuator exercise disabled"
},
"child_lock": {
"name": "Child lock"
}
}
},
"exceptions": {
"authentication_failed": {
"message": "Authentication failed, please reauthenticate."
},
"button_press_failed": {
"message": "Failed to press {button} button."
},
"connection_failed": {
"message": "Failed to communicate with device."
},
@@ -110,15 +74,6 @@
},
"set_temperature_failed": {
"message": "Failed to set temperature to {temperature}."
},
"set_value_failed": {
"message": "Failed to set value: {error}"
},
"switch_turn_off_failed": {
"message": "Failed to turn off {switch}."
},
"switch_turn_on_failed": {
"message": "Failed to turn on {switch}."
}
}
}

View File

@@ -1,118 +0,0 @@
"""Switch platform for Airobot thermostat."""
from __future__ import annotations
from collections.abc import Callable, Coroutine
from dataclasses import dataclass
from typing import Any
from pyairobotrest.exceptions import AirobotError
from homeassistant.components.switch import SwitchEntity, SwitchEntityDescription
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import AirobotConfigEntry
from .const import DOMAIN
from .coordinator import AirobotDataUpdateCoordinator
from .entity import AirobotEntity
PARALLEL_UPDATES = 0
@dataclass(frozen=True, kw_only=True)
class AirobotSwitchEntityDescription(SwitchEntityDescription):
"""Describes Airobot switch entity."""
is_on_fn: Callable[[AirobotDataUpdateCoordinator], bool]
turn_on_fn: Callable[[AirobotDataUpdateCoordinator], Coroutine[Any, Any, None]]
turn_off_fn: Callable[[AirobotDataUpdateCoordinator], Coroutine[Any, Any, None]]
SWITCH_TYPES: tuple[AirobotSwitchEntityDescription, ...] = (
AirobotSwitchEntityDescription(
key="child_lock",
translation_key="child_lock",
entity_category=EntityCategory.CONFIG,
is_on_fn=lambda coordinator: (
coordinator.data.settings.setting_flags.childlock_enabled
),
turn_on_fn=lambda coordinator: coordinator.client.set_child_lock(True),
turn_off_fn=lambda coordinator: coordinator.client.set_child_lock(False),
),
AirobotSwitchEntityDescription(
key="actuator_exercise_disabled",
translation_key="actuator_exercise_disabled",
entity_category=EntityCategory.CONFIG,
entity_registry_enabled_default=False,
is_on_fn=lambda coordinator: (
coordinator.data.settings.setting_flags.actuator_exercise_disabled
),
turn_on_fn=lambda coordinator: coordinator.client.toggle_actuator_exercise(
True
),
turn_off_fn=lambda coordinator: coordinator.client.toggle_actuator_exercise(
False
),
),
)
async def async_setup_entry(
hass: HomeAssistant,
entry: AirobotConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Airobot switch entities."""
coordinator = entry.runtime_data
async_add_entities(
AirobotSwitch(coordinator, description) for description in SWITCH_TYPES
)
class AirobotSwitch(AirobotEntity, SwitchEntity):
"""Representation of an Airobot switch."""
entity_description: AirobotSwitchEntityDescription
def __init__(
self,
coordinator: AirobotDataUpdateCoordinator,
description: AirobotSwitchEntityDescription,
) -> None:
"""Initialize the switch."""
super().__init__(coordinator)
self.entity_description = description
self._attr_unique_id = f"{coordinator.data.status.device_id}_{description.key}"
@property
def is_on(self) -> bool:
"""Return true if the switch is on."""
return self.entity_description.is_on_fn(self.coordinator)
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn the switch on."""
try:
await self.entity_description.turn_on_fn(self.coordinator)
except AirobotError as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="switch_turn_on_failed",
translation_placeholders={"switch": self.entity_description.key},
) from err
await self.coordinator.async_request_refresh()
async def async_turn_off(self, **kwargs: Any) -> None:
"""Turn the switch off."""
try:
await self.entity_description.turn_off_fn(self.coordinator)
except AirobotError as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="switch_turn_off_failed",
translation_placeholders={"switch": self.entity_description.key},
) from err
await self.coordinator.async_request_refresh()

View File

@@ -7,5 +7,5 @@
"integration_type": "device",
"iot_class": "local_polling",
"quality_scale": "silver",
"requirements": ["airos==0.6.3"]
"requirements": ["airos==0.6.0"]
}

View File

@@ -4,9 +4,6 @@
"health_index": {
"default": "mdi:heart-pulse"
},
"mold": {
"default": "mdi:water-check"
},
"oxygen": {
"default": "mdi:leaf"
},

View File

@@ -219,13 +219,6 @@ SENSOR_TYPES: list[AirQEntityDescription] = [
state_class=SensorStateClass.MEASUREMENT,
value=lambda data: data.get("ch4_MIPEX"),
),
AirQEntityDescription(
key="mold",
translation_key="mold",
native_unit_of_measurement=PERCENTAGE,
state_class=SensorStateClass.MEASUREMENT,
value=lambda data: data.get("mold"),
),
AirQEntityDescription(
key="n2o",
device_class=SensorDeviceClass.NITROUS_OXIDE,

View File

@@ -101,9 +101,6 @@
"methanethiol": {
"name": "Methanethiol"
},
"mold": {
"name": "Mold index"
},
"noise": {
"name": "Noise"
},

View File

@@ -85,22 +85,6 @@ class AirzoneSystemEntity(AirzoneEntity):
value = system[key]
return value
async def _async_update_sys_params(self, params: dict[str, Any]) -> None:
"""Send system parameters to API."""
_params = {
API_SYSTEM_ID: self.system_id,
**params,
}
_LOGGER.debug("update_sys_params=%s", _params)
try:
await self.coordinator.airzone.set_sys_parameters(_params)
except AirzoneError as error:
raise HomeAssistantError(
f"Failed to set system {self.entity_id}: {error}"
) from error
self.coordinator.async_set_updated_data(self.coordinator.airzone.data())
class AirzoneHotWaterEntity(AirzoneEntity):
"""Define an Airzone Hot Water entity."""

View File

@@ -12,5 +12,5 @@
"integration_type": "hub",
"iot_class": "local_polling",
"loggers": ["aioairzone"],
"requirements": ["aioairzone==1.0.5"]
"requirements": ["aioairzone==1.0.4"]
}

View File

@@ -20,7 +20,6 @@ from aioairzone.const import (
AZD_MODES,
AZD_Q_ADAPT,
AZD_SLEEP,
AZD_SYSTEMS,
AZD_ZONES,
)
@@ -31,7 +30,7 @@ from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import AirzoneConfigEntry, AirzoneUpdateCoordinator
from .entity import AirzoneEntity, AirzoneSystemEntity, AirzoneZoneEntity
from .entity import AirzoneEntity, AirzoneZoneEntity
@dataclass(frozen=True, kw_only=True)
@@ -86,18 +85,6 @@ def main_zone_options(
return [k for k, v in options.items() if v in modes]
SYSTEM_SELECT_TYPES: Final[tuple[AirzoneSelectDescription, ...]] = (
AirzoneSelectDescription(
api_param=API_Q_ADAPT,
entity_category=EntityCategory.CONFIG,
key=AZD_Q_ADAPT,
options=list(Q_ADAPT_DICT),
options_dict=Q_ADAPT_DICT,
translation_key="q_adapt",
),
)
MAIN_ZONE_SELECT_TYPES: Final[tuple[AirzoneSelectDescription, ...]] = (
AirzoneSelectDescription(
api_param=API_MODE,
@@ -106,6 +93,14 @@ MAIN_ZONE_SELECT_TYPES: Final[tuple[AirzoneSelectDescription, ...]] = (
options_fn=main_zone_options,
translation_key="modes",
),
AirzoneSelectDescription(
api_param=API_Q_ADAPT,
entity_category=EntityCategory.CONFIG,
key=AZD_Q_ADAPT,
options=list(Q_ADAPT_DICT),
options_dict=Q_ADAPT_DICT,
translation_key="q_adapt",
),
)
@@ -145,37 +140,16 @@ async def async_setup_entry(
"""Add Airzone select from a config_entry."""
coordinator = entry.runtime_data
added_systems: set[str] = set()
added_zones: set[str] = set()
def _async_entity_listener() -> None:
"""Handle additions of select."""
entities: list[AirzoneBaseSelect] = []
systems_data = coordinator.data.get(AZD_SYSTEMS, {})
received_systems = set(systems_data)
new_systems = received_systems - added_systems
if new_systems:
entities.extend(
AirzoneSystemSelect(
coordinator,
description,
entry,
system_id,
systems_data.get(system_id),
)
for system_id in new_systems
for description in SYSTEM_SELECT_TYPES
if description.key in systems_data.get(system_id)
)
added_systems.update(new_systems)
zones_data = coordinator.data.get(AZD_ZONES, {})
received_zones = set(zones_data)
new_zones = received_zones - added_zones
if new_zones:
entities.extend(
entities: list[AirzoneZoneSelect] = [
AirzoneZoneSelect(
coordinator,
description,
@@ -187,8 +161,8 @@ async def async_setup_entry(
for description in MAIN_ZONE_SELECT_TYPES
if description.key in zones_data.get(system_zone_id)
and zones_data.get(system_zone_id).get(AZD_MASTER) is True
)
entities.extend(
]
entities += [
AirzoneZoneSelect(
coordinator,
description,
@@ -199,11 +173,10 @@ async def async_setup_entry(
for system_zone_id in new_zones
for description in ZONE_SELECT_TYPES
if description.key in zones_data.get(system_zone_id)
)
]
async_add_entities(entities)
added_zones.update(new_zones)
async_add_entities(entities)
entry.async_on_unload(coordinator.async_add_listener(_async_entity_listener))
_async_entity_listener()
@@ -230,38 +203,6 @@ class AirzoneBaseSelect(AirzoneEntity, SelectEntity):
self._attr_current_option = self._get_current_option()
class AirzoneSystemSelect(AirzoneSystemEntity, AirzoneBaseSelect):
"""Define an Airzone System select."""
def __init__(
self,
coordinator: AirzoneUpdateCoordinator,
description: AirzoneSelectDescription,
entry: ConfigEntry,
system_id: str,
system_data: dict[str, Any],
) -> None:
"""Initialize."""
super().__init__(coordinator, entry, system_data)
self._attr_unique_id = f"{self._attr_unique_id}_{system_id}_{description.key}"
self.entity_description = description
self._attr_options = self.entity_description.options_fn(
system_data, description.options_dict
)
self.values_dict = {v: k for k, v in description.options_dict.items()}
self._async_update_attrs()
async def async_select_option(self, option: str) -> None:
"""Change the selected option."""
param = self.entity_description.api_param
value = self.entity_description.options_dict[option]
await self._async_update_sys_params({param: value})
class AirzoneZoneSelect(AirzoneZoneEntity, AirzoneBaseSelect):
"""Define an Airzone Zone select."""

View File

@@ -1,93 +0,0 @@
"""Provides conditions for alarm control panels."""
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.condition import (
Condition,
EntityStateConditionBase,
make_entity_state_condition,
)
from homeassistant.helpers.entity import get_supported_features
from .const import DOMAIN, AlarmControlPanelEntityFeature, AlarmControlPanelState
def supports_feature(hass: HomeAssistant, entity_id: str, features: int) -> bool:
"""Test if an entity supports the specified features."""
try:
return bool(get_supported_features(hass, entity_id) & features)
except HomeAssistantError:
return False
class EntityStateRequiredFeaturesCondition(EntityStateConditionBase):
"""State condition."""
_required_features: int
def entity_filter(self, entities: set[str]) -> set[str]:
"""Filter entities of this domain with the required features."""
entities = super().entity_filter(entities)
return {
entity_id
for entity_id in entities
if supports_feature(self._hass, entity_id, self._required_features)
}
def make_entity_state_required_features_condition(
domain: str, to_state: str, required_features: int
) -> type[EntityStateRequiredFeaturesCondition]:
"""Create an entity state condition class with required feature filtering."""
class CustomCondition(EntityStateRequiredFeaturesCondition):
"""Condition for entity state changes."""
_domain = domain
_states = {to_state}
_required_features = required_features
return CustomCondition
CONDITIONS: dict[str, type[Condition]] = {
"is_armed": make_entity_state_condition(
DOMAIN,
{
AlarmControlPanelState.ARMED_AWAY,
AlarmControlPanelState.ARMED_CUSTOM_BYPASS,
AlarmControlPanelState.ARMED_HOME,
AlarmControlPanelState.ARMED_NIGHT,
AlarmControlPanelState.ARMED_VACATION,
},
),
"is_armed_away": make_entity_state_required_features_condition(
DOMAIN,
AlarmControlPanelState.ARMED_AWAY,
AlarmControlPanelEntityFeature.ARM_AWAY,
),
"is_armed_home": make_entity_state_required_features_condition(
DOMAIN,
AlarmControlPanelState.ARMED_HOME,
AlarmControlPanelEntityFeature.ARM_HOME,
),
"is_armed_night": make_entity_state_required_features_condition(
DOMAIN,
AlarmControlPanelState.ARMED_NIGHT,
AlarmControlPanelEntityFeature.ARM_NIGHT,
),
"is_armed_vacation": make_entity_state_required_features_condition(
DOMAIN,
AlarmControlPanelState.ARMED_VACATION,
AlarmControlPanelEntityFeature.ARM_VACATION,
),
"is_disarmed": make_entity_state_condition(DOMAIN, AlarmControlPanelState.DISARMED),
"is_triggered": make_entity_state_condition(
DOMAIN, AlarmControlPanelState.TRIGGERED
),
}
async def async_get_conditions(hass: HomeAssistant) -> dict[str, type[Condition]]:
"""Return the alarm control panel conditions."""
return CONDITIONS

View File

@@ -1,52 +0,0 @@
.condition_common: &condition_common
target:
entity:
domain: alarm_control_panel
fields: &condition_common_fields
behavior:
required: true
default: any
selector:
select:
translation_key: condition_behavior
options:
- all
- any
is_armed: *condition_common
is_armed_away:
fields: *condition_common_fields
target:
entity:
domain: alarm_control_panel
supported_features:
- alarm_control_panel.AlarmControlPanelEntityFeature.ARM_AWAY
is_armed_home:
fields: *condition_common_fields
target:
entity:
domain: alarm_control_panel
supported_features:
- alarm_control_panel.AlarmControlPanelEntityFeature.ARM_HOME
is_armed_night:
fields: *condition_common_fields
target:
entity:
domain: alarm_control_panel
supported_features:
- alarm_control_panel.AlarmControlPanelEntityFeature.ARM_NIGHT
is_armed_vacation:
fields: *condition_common_fields
target:
entity:
domain: alarm_control_panel
supported_features:
- alarm_control_panel.AlarmControlPanelEntityFeature.ARM_VACATION
is_disarmed: *condition_common
is_triggered: *condition_common

View File

@@ -1,27 +1,4 @@
{
"conditions": {
"is_armed": {
"condition": "mdi:shield"
},
"is_armed_away": {
"condition": "mdi:shield-lock"
},
"is_armed_home": {
"condition": "mdi:shield-home"
},
"is_armed_night": {
"condition": "mdi:shield-moon"
},
"is_armed_vacation": {
"condition": "mdi:shield-airplane"
},
"is_disarmed": {
"condition": "mdi:shield-off"
},
"is_triggered": {
"condition": "mdi:bell-ring"
}
},
"entity_component": {
"_": {
"default": "mdi:shield",

View File

@@ -1,82 +1,8 @@
{
"common": {
"condition_behavior_description": "How the state should match on the targeted alarms.",
"condition_behavior_name": "Behavior",
"trigger_behavior_description": "The behavior of the targeted alarms to trigger on.",
"trigger_behavior_name": "Behavior"
},
"conditions": {
"is_armed": {
"description": "Tests if one or more alarms are armed.",
"fields": {
"behavior": {
"description": "[%key:component::alarm_control_panel::common::condition_behavior_description%]",
"name": "[%key:component::alarm_control_panel::common::condition_behavior_name%]"
}
},
"name": "Alarm is armed"
},
"is_armed_away": {
"description": "Tests if one or more alarms are armed in away mode.",
"fields": {
"behavior": {
"description": "[%key:component::alarm_control_panel::common::condition_behavior_description%]",
"name": "[%key:component::alarm_control_panel::common::condition_behavior_name%]"
}
},
"name": "Alarm is armed away"
},
"is_armed_home": {
"description": "Tests if one or more alarms are armed in home mode.",
"fields": {
"behavior": {
"description": "[%key:component::alarm_control_panel::common::condition_behavior_description%]",
"name": "[%key:component::alarm_control_panel::common::condition_behavior_name%]"
}
},
"name": "Alarm is armed home"
},
"is_armed_night": {
"description": "Tests if one or more alarms are armed in night mode.",
"fields": {
"behavior": {
"description": "[%key:component::alarm_control_panel::common::condition_behavior_description%]",
"name": "[%key:component::alarm_control_panel::common::condition_behavior_name%]"
}
},
"name": "Alarm is armed night"
},
"is_armed_vacation": {
"description": "Tests if one or more alarms are armed in vacation mode.",
"fields": {
"behavior": {
"description": "[%key:component::alarm_control_panel::common::condition_behavior_description%]",
"name": "[%key:component::alarm_control_panel::common::condition_behavior_name%]"
}
},
"name": "Alarm is armed vacation"
},
"is_disarmed": {
"description": "Tests if one or more alarms are disarmed.",
"fields": {
"behavior": {
"description": "[%key:component::alarm_control_panel::common::condition_behavior_description%]",
"name": "[%key:component::alarm_control_panel::common::condition_behavior_name%]"
}
},
"name": "Alarm is disarmed"
},
"is_triggered": {
"description": "Tests if one or more alarms are triggered.",
"fields": {
"behavior": {
"description": "[%key:component::alarm_control_panel::common::condition_behavior_description%]",
"name": "[%key:component::alarm_control_panel::common::condition_behavior_name%]"
}
},
"name": "Alarm is triggered"
}
},
"device_automation": {
"action_type": {
"arm_away": "Arm {entity_name} away",
@@ -150,12 +76,6 @@
}
},
"selector": {
"condition_behavior": {
"options": {
"all": "All",
"any": "Any"
}
},
"trigger_behavior": {
"options": {
"any": "Any",

View File

@@ -14,7 +14,7 @@ from .const import DOMAIN, AlarmControlPanelEntityFeature, AlarmControlPanelStat
def supports_feature(hass: HomeAssistant, entity_id: str, features: int) -> bool:
"""Test if an entity supports the specified features."""
"""Get the device class of an entity or UNDEFINED if not found."""
try:
return bool(get_supported_features(hass, entity_id) & features)
except HomeAssistantError:
@@ -39,7 +39,7 @@ class EntityStateTriggerRequiredFeatures(EntityTargetStateTriggerBase):
def make_entity_state_trigger_required_features(
domain: str, to_state: str, required_features: int
) -> type[EntityTargetStateTriggerBase]:
"""Create an entity state trigger class with required feature filtering."""
"""Create an entity state trigger class."""
class CustomTrigger(EntityStateTriggerRequiredFeatures):
"""Trigger for entity state changes."""

View File

@@ -8,5 +8,5 @@
"iot_class": "cloud_polling",
"loggers": ["aioamazondevices"],
"quality_scale": "platinum",
"requirements": ["aioamazondevices==11.0.2"]
"requirements": ["aioamazondevices==10.0.0"]
}

View File

@@ -3,7 +3,7 @@
from __future__ import annotations
import logging
from typing import TYPE_CHECKING, Any
from typing import Any
from aiohttp import CookieJar
from pyanglianwater import AnglianWater
@@ -30,11 +30,14 @@ STEP_USER_DATA_SCHEMA = vol.Schema(
vol.Required(CONF_PASSWORD): selector.TextSelector(
selector.TextSelectorConfig(type=selector.TextSelectorType.PASSWORD)
),
vol.Required(CONF_ACCOUNT_NUMBER): selector.TextSelector(),
}
)
async def validate_credentials(auth: MSOB2CAuth) -> str | MSOB2CAuth:
async def validate_credentials(
auth: MSOB2CAuth, account_number: str
) -> str | MSOB2CAuth:
"""Validate the provided credentials."""
try:
await auth.send_login_request()
@@ -43,33 +46,6 @@ async def validate_credentials(auth: MSOB2CAuth) -> str | MSOB2CAuth:
except Exception:
_LOGGER.exception("Unexpected exception")
return "unknown"
return auth
def humanize_account_data(account: dict) -> str:
"""Convert an account data into a human-readable format."""
if account["address"]["company_name"] != "":
return f"{account['account_number']} - {account['address']['company_name']}"
if account["address"]["building_name"] != "":
return f"{account['account_number']} - {account['address']['building_name']}"
return f"{account['account_number']} - {account['address']['postcode']}"
async def get_accounts(auth: MSOB2CAuth) -> list[selector.SelectOptionDict]:
"""Retrieve the list of accounts associated with the authenticated user."""
_aw = AnglianWater(authenticator=auth)
accounts = await _aw.api.get_associated_accounts()
return [
selector.SelectOptionDict(
value=str(account["account_number"]),
label=humanize_account_data(account),
)
for account in accounts["result"]["active"]
]
async def validate_account(auth: MSOB2CAuth, account_number: str) -> str | MSOB2CAuth:
"""Validate the provided account number."""
_aw = AnglianWater(authenticator=auth)
try:
await _aw.validate_smart_meter(account_number)
@@ -81,91 +57,36 @@ async def validate_account(auth: MSOB2CAuth, account_number: str) -> str | MSOB2
class AnglianWaterConfigFlow(ConfigFlow, domain=DOMAIN):
"""Handle a config flow for Anglian Water."""
def __init__(self) -> None:
"""Initialize the config flow."""
self.authenticator: MSOB2CAuth | None = None
self.accounts: list[selector.SelectOptionDict] = []
self.user_input: dict[str, Any] | None = None
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle the initial step."""
errors: dict[str, str] = {}
if user_input is not None:
self.authenticator = MSOB2CAuth(
username=user_input[CONF_USERNAME],
password=user_input[CONF_PASSWORD],
session=async_create_clientsession(
self.hass,
cookie_jar=CookieJar(quote_cookie=False),
validation_response = await validate_credentials(
MSOB2CAuth(
username=user_input[CONF_USERNAME],
password=user_input[CONF_PASSWORD],
session=async_create_clientsession(
self.hass,
cookie_jar=CookieJar(quote_cookie=False),
),
),
user_input[CONF_ACCOUNT_NUMBER],
)
validation_response = await validate_credentials(self.authenticator)
if isinstance(validation_response, str):
errors["base"] = validation_response
else:
self.accounts = await get_accounts(self.authenticator)
if len(self.accounts) > 1:
self.user_input = user_input
return await self.async_step_select_account()
account_number = self.accounts[0]["value"]
self.user_input = user_input
return await self.async_step_complete(
{
CONF_ACCOUNT_NUMBER: account_number,
}
await self.async_set_unique_id(user_input[CONF_ACCOUNT_NUMBER])
self._abort_if_unique_id_configured()
return self.async_create_entry(
title=user_input[CONF_ACCOUNT_NUMBER],
data={
**user_input,
CONF_ACCESS_TOKEN: validation_response.refresh_token,
},
)
return self.async_show_form(
step_id="user", data_schema=STEP_USER_DATA_SCHEMA, errors=errors
)
async def async_step_select_account(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle the account selection step."""
errors = {}
if user_input is not None:
if TYPE_CHECKING:
assert self.authenticator
validation_result = await validate_account(
self.authenticator,
user_input[CONF_ACCOUNT_NUMBER],
)
if isinstance(validation_result, str):
errors["base"] = validation_result
else:
return await self.async_step_complete(user_input)
return self.async_show_form(
step_id="select_account",
data_schema=vol.Schema(
{
vol.Required(CONF_ACCOUNT_NUMBER): selector.SelectSelector(
selector.SelectSelectorConfig(
options=self.accounts,
multiple=False,
mode=selector.SelectSelectorMode.DROPDOWN,
)
)
}
),
errors=errors,
)
async def async_step_complete(self, user_input: dict[str, Any]) -> ConfigFlowResult:
"""Handle the final configuration step."""
await self.async_set_unique_id(user_input[CONF_ACCOUNT_NUMBER])
self._abort_if_unique_id_configured()
if TYPE_CHECKING:
assert self.authenticator
assert self.user_input
config_entry_data = {
**self.user_input,
CONF_ACCOUNT_NUMBER: user_input[CONF_ACCOUNT_NUMBER],
CONF_ACCESS_TOKEN: self.authenticator.refresh_token,
}
return self.async_create_entry(
title=user_input[CONF_ACCOUNT_NUMBER],
data=config_entry_data,
)


@@ -10,21 +10,14 @@
"unknown": "[%key:common::config_flow::error::unknown%]"
},
"step": {
"select_account": {
"data": {
"account_number": "Billing account number"
},
"data_description": {
"account_number": "Select the billing account you wish to use."
},
"description": "Multiple active billing accounts were found with your credentials. Please select the account you wish to use. If this is unexpected, contact Anglian Water to confirm your active accounts."
},
"user": {
"data": {
"account_number": "Billing Account Number",
"password": "[%key:common::config_flow::data::password%]",
"username": "[%key:common::config_flow::data::username%]"
},
"data_description": {
"account_number": "Your account number found on your latest bill.",
"password": "Your password",
"username": "Username or email used to log in to the Anglian Water website."
},


@@ -201,13 +201,10 @@ class ConversationSubentryFlowHandler(ConfigSubentryFlow):
)
for api in llm.async_get_apis(self.hass)
]
if suggested_llm_apis := self.options.get(CONF_LLM_HASS_API):
if isinstance(suggested_llm_apis, str):
suggested_llm_apis = [suggested_llm_apis]
known_apis = {api.id for api in llm.async_get_apis(self.hass)}
self.options[CONF_LLM_HASS_API] = [
api for api in suggested_llm_apis if api in known_apis
]
if (suggested_llm_apis := self.options.get(CONF_LLM_HASS_API)) and isinstance(
suggested_llm_apis, str
):
self.options[CONF_LLM_HASS_API] = [suggested_llm_apis]
step_schema: VolDictType = {}
errors: dict[str, str] = {}


@@ -69,7 +69,6 @@ from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import device_registry as dr, llm
from homeassistant.helpers.entity import Entity
from homeassistant.helpers.json import json_dumps
from homeassistant.util import slugify
from . import AnthropicConfigEntry
@@ -194,7 +193,7 @@ def _convert_content(
tool_result_block = ToolResultBlockParam(
type="tool_result",
tool_use_id=content.tool_call_id,
content=json_dumps(content.tool_result),
content=json.dumps(content.tool_result),
)
external_tool = False
if not messages or messages[-1]["role"] != (


@@ -4,8 +4,6 @@ from __future__ import annotations
import logging
import dateutil
from homeassistant.components.automation import automations_with_entity
from homeassistant.components.script import scripts_with_entity
from homeassistant.components.sensor import (
@@ -181,7 +179,6 @@ SENSORS: dict[str, SensorEntityDescription] = {
LAST_S_TEST: SensorEntityDescription(
key=LAST_S_TEST,
translation_key="last_self_test",
device_class=SensorDeviceClass.TIMESTAMP,
),
"lastxfer": SensorEntityDescription(
key="lastxfer",
@@ -235,7 +232,6 @@ SENSORS: dict[str, SensorEntityDescription] = {
"masterupd": SensorEntityDescription(
key="masterupd",
translation_key="master_update",
device_class=SensorDeviceClass.TIMESTAMP,
entity_category=EntityCategory.DIAGNOSTIC,
),
"maxlinev": SensorEntityDescription(
@@ -369,7 +365,6 @@ SENSORS: dict[str, SensorEntityDescription] = {
"starttime": SensorEntityDescription(
key="starttime",
translation_key="startup_time",
device_class=SensorDeviceClass.TIMESTAMP,
entity_category=EntityCategory.DIAGNOSTIC,
),
"statflag": SensorEntityDescription(
@@ -421,19 +416,16 @@ SENSORS: dict[str, SensorEntityDescription] = {
"xoffbat": SensorEntityDescription(
key="xoffbat",
translation_key="transfer_from_battery",
device_class=SensorDeviceClass.TIMESTAMP,
entity_category=EntityCategory.DIAGNOSTIC,
),
"xoffbatt": SensorEntityDescription(
key="xoffbatt",
translation_key="transfer_from_battery",
device_class=SensorDeviceClass.TIMESTAMP,
entity_category=EntityCategory.DIAGNOSTIC,
),
"xonbatt": SensorEntityDescription(
key="xonbatt",
translation_key="transfer_to_battery",
device_class=SensorDeviceClass.TIMESTAMP,
entity_category=EntityCategory.DIAGNOSTIC,
),
}
@@ -537,13 +529,7 @@ class APCUPSdSensor(APCUPSdEntity, SensorEntity):
self._attr_native_value = None
return
data = self.coordinator.data[key]
if self.entity_description.device_class == SensorDeviceClass.TIMESTAMP:
self._attr_native_value = dateutil.parser.parse(data)
return
self._attr_native_value, inferred_unit = infer_unit(data)
self._attr_native_value, inferred_unit = infer_unit(self.coordinator.data[key])
if not self.native_unit_of_measurement:
self._attr_native_unit_of_measurement = inferred_unit


@@ -5,14 +5,9 @@ from __future__ import annotations
import asyncio
import logging
from random import randrange
import sys
from typing import Any, cast
from pyatv import connect, exceptions, scan
from pyatv.conf import AppleTV
from pyatv.const import DeviceModel, Protocol
from pyatv.convert import model_str
from pyatv.interface import AppleTV as AppleTVInterface, DeviceListener
from homeassistant.components import zeroconf
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
@@ -29,7 +24,11 @@ from homeassistant.const import (
Platform,
)
from homeassistant.core import Event, HomeAssistant, callback
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.exceptions import (
ConfigEntryAuthFailed,
ConfigEntryNotReady,
HomeAssistantError,
)
from homeassistant.helpers import device_registry as dr
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.dispatcher import async_dispatcher_send
@@ -43,6 +42,18 @@ from .const import (
SIGNAL_DISCONNECTED,
)
if sys.version_info < (3, 14):
from pyatv import connect, exceptions, scan
from pyatv.conf import AppleTV
from pyatv.const import DeviceModel, Protocol
from pyatv.convert import model_str
from pyatv.interface import AppleTV as AppleTVInterface, DeviceListener
else:
class DeviceListener:
"""Dummy class."""
_LOGGER = logging.getLogger(__name__)
DEFAULT_NAME_TV = "Apple TV"
@@ -51,27 +62,32 @@ DEFAULT_NAME_HP = "HomePod"
BACKOFF_TIME_LOWER_LIMIT = 15 # seconds
BACKOFF_TIME_UPPER_LIMIT = 300 # Five minutes
PLATFORMS = [Platform.BINARY_SENSOR, Platform.MEDIA_PLAYER, Platform.REMOTE]
PLATFORMS = [Platform.MEDIA_PLAYER, Platform.REMOTE]
AUTH_EXCEPTIONS = (
exceptions.AuthenticationError,
exceptions.InvalidCredentialsError,
exceptions.NoCredentialsError,
)
CONNECTION_TIMEOUT_EXCEPTIONS = (
OSError,
asyncio.CancelledError,
TimeoutError,
exceptions.ConnectionLostError,
exceptions.ConnectionFailedError,
)
DEVICE_EXCEPTIONS = (
exceptions.ProtocolError,
exceptions.NoServiceError,
exceptions.PairingError,
exceptions.BackOffError,
exceptions.DeviceIdMissingError,
)
if sys.version_info < (3, 14):
AUTH_EXCEPTIONS = (
exceptions.AuthenticationError,
exceptions.InvalidCredentialsError,
exceptions.NoCredentialsError,
)
CONNECTION_TIMEOUT_EXCEPTIONS = (
OSError,
asyncio.CancelledError,
TimeoutError,
exceptions.ConnectionLostError,
exceptions.ConnectionFailedError,
)
DEVICE_EXCEPTIONS = (
exceptions.ProtocolError,
exceptions.NoServiceError,
exceptions.PairingError,
exceptions.BackOffError,
exceptions.DeviceIdMissingError,
)
else:
AUTH_EXCEPTIONS = ()
CONNECTION_TIMEOUT_EXCEPTIONS = ()
DEVICE_EXCEPTIONS = ()
type AppleTvConfigEntry = ConfigEntry[AppleTVManager]
@@ -79,6 +95,10 @@ type AppleTvConfigEntry = ConfigEntry[AppleTVManager]
async def async_setup_entry(hass: HomeAssistant, entry: AppleTvConfigEntry) -> bool:
"""Set up a config entry for Apple TV."""
if sys.version_info >= (3, 14):
raise HomeAssistantError(
"Apple TV is not supported on Python 3.14. Please use Python 3.13."
)
manager = AppleTVManager(hass, entry)
if manager.is_on:


@@ -1,63 +0,0 @@
"""Binary sensor support for Apple TV."""
from __future__ import annotations
from pyatv.const import KeyboardFocusState
from pyatv.interface import AppleTV, KeyboardListener
from homeassistant.components.binary_sensor import BinarySensorEntity
from homeassistant.const import CONF_NAME
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import AppleTvConfigEntry
from .entity import AppleTVEntity
async def async_setup_entry(
hass: HomeAssistant,
config_entry: AppleTvConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Load Apple TV binary sensor based on a config entry."""
# apple_tv config entries always have a unique id
assert config_entry.unique_id is not None
name: str = config_entry.data[CONF_NAME]
manager = config_entry.runtime_data
async_add_entities([AppleTVKeyboardFocused(name, config_entry.unique_id, manager)])
class AppleTVKeyboardFocused(AppleTVEntity, BinarySensorEntity, KeyboardListener):
"""Binary sensor for Text input focused."""
_attr_translation_key = "keyboard_focused"
_attr_available = True
@callback
def async_device_connected(self, atv: AppleTV) -> None:
"""Handle when connection is made to device."""
self._attr_available = True
# Listen to keyboard updates
atv.keyboard.listener = self
# Set initial state based on current focus state
self._update_state(atv.keyboard.text_focus_state == KeyboardFocusState.Focused)
@callback
def async_device_disconnected(self) -> None:
"""Handle when connection was lost to device."""
self._attr_available = False
self._update_state(False)
def focusstate_update(
self, old_state: KeyboardFocusState, new_state: KeyboardFocusState
) -> None:
"""Update keyboard state when it changes.
This is a callback function from pyatv.interface.KeyboardListener.
"""
self._update_state(new_state == KeyboardFocusState.Focused)
def _update_state(self, new_state: bool) -> None:
"""Update and report."""
self._attr_is_on = new_state
self.async_write_ha_state()

View File

@@ -18,6 +18,7 @@ class AppleTVEntity(Entity):
_attr_should_poll = False
_attr_has_entity_name = True
_attr_name = None
atv: AppleTVInterface | None = None
def __init__(self, name: str, identifier: str, manager: AppleTVManager) -> None:


@@ -1,12 +0,0 @@
{
"entity": {
"binary_sensor": {
"keyboard_focused": {
"default": "mdi:keyboard",
"state": {
"off": "mdi:keyboard-off"
}
}
}
}
}


@@ -8,7 +8,7 @@
"integration_type": "device",
"iot_class": "local_push",
"loggers": ["pyatv", "srptools"],
"requirements": ["pyatv==0.17.0"],
"requirements": ["pyatv==0.16.1;python_version<'3.14'"],
"zeroconf": [
"_mediaremotetv._tcp.local.",
"_companion-link._tcp.local.",


@@ -115,7 +115,6 @@ class AppleTvMediaPlayer(
"""Representation of an Apple TV media player."""
_attr_supported_features = SUPPORT_APPLE_TV
_attr_name = None
def __init__(self, name: str, identifier: str, manager: AppleTVManager) -> None:
"""Initialize the Apple TV media player."""
@@ -240,15 +239,6 @@ class AppleTvMediaPlayer(
"""
self.async_write_ha_state()
@callback
def volume_device_update(
self, output_device: OutputDevice, old_level: float, new_level: float
) -> None:
"""Output device volume was updated.
This is a callback function from pyatv.interface.AudioListener.
"""
@callback
def outputdevices_update(
self, old_devices: list[OutputDevice], new_devices: list[OutputDevice]


@@ -51,8 +51,6 @@ async def async_setup_entry(
class AppleTVRemote(AppleTVEntity, RemoteEntity):
"""Device that sends commands to an Apple TV."""
_attr_name = None
@property
def is_on(self) -> bool:
"""Return true if device is on."""


@@ -62,13 +62,6 @@
}
}
},
"entity": {
"binary_sensor": {
"keyboard_focused": {
"name": "Keyboard focus"
}
}
},
"options": {
"step": {
"init": {


@@ -2,35 +2,14 @@
from __future__ import annotations
import logging
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant
from .coordinator import ArveConfigEntry, ArveCoordinator
_LOGGER = logging.getLogger(__name__)
PLATFORMS: list[Platform] = [Platform.SENSOR]
async def async_migrate_entry(hass: HomeAssistant, entry: ArveConfigEntry) -> bool:
"""Migrate entry."""
_LOGGER.debug("Migrating from version %s.%s", entry.version, entry.minor_version)
if entry.version == 1:
# 1 -> 1.2: Unique ID from integer to string
if entry.minor_version == 1:
minor_version = 2
hass.config_entries.async_update_entry(
entry, unique_id=str(entry.unique_id), minor_version=minor_version
)
_LOGGER.debug("Migration successful")
return True
async def async_setup_entry(hass: HomeAssistant, entry: ArveConfigEntry) -> bool:
"""Set up Arve from a config entry."""


@@ -19,9 +19,6 @@ _LOGGER = logging.getLogger(__name__)
class ArveConfigFlowHandler(ConfigFlow, domain=DOMAIN):
"""Handle a config flow for Arve."""
VERSION = 1
MINOR_VERSION = 2
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
@@ -38,7 +35,7 @@ class ArveConfigFlowHandler(ConfigFlow, domain=DOMAIN):
except ArveConnectionError:
errors["base"] = "cannot_connect"
else:
await self.async_set_unique_id(str(customer.customerId))
await self.async_set_unique_id(customer.customerId)
self._abort_if_unique_id_configured()
return self.async_create_entry(
title="Arve",


@@ -3,8 +3,9 @@
from abc import ABC, abstractmethod
from dataclasses import dataclass
import logging
import math
from pymicro_vad import MicroVad
from pysilero_vad import SileroVoiceActivityDetector
from pyspeex_noise import AudioProcessor
from .const import BYTES_PER_CHUNK
@@ -42,8 +43,8 @@ class AudioEnhancer(ABC):
"""Enhance chunk of PCM audio @ 16Khz with 16-bit mono samples."""
class MicroVadSpeexEnhancer(AudioEnhancer):
"""Audio enhancer that runs microVAD and speex."""
class SileroVadSpeexEnhancer(AudioEnhancer):
"""Audio enhancer that runs Silero VAD and speex."""
def __init__(
self, auto_gain: int, noise_suppression: int, is_vad_enabled: bool
@@ -69,21 +70,49 @@ class MicroVadSpeexEnhancer(AudioEnhancer):
self.noise_suppression,
)
self.vad: MicroVad | None = None
self.vad: SileroVoiceActivityDetector | None = None
# We get 10ms chunks but Silero works on 32ms chunks, so we have to
# buffer audio. The previous speech probability is used until enough
# audio has been buffered.
self._vad_buffer: bytearray | None = None
self._vad_buffer_chunks = 0
self._vad_buffer_chunk_idx = 0
self._last_speech_probability: float | None = None
if self.is_vad_enabled:
self.vad = MicroVad()
_LOGGER.debug("Initialized microVAD")
self.vad = SileroVoiceActivityDetector()
# VAD buffer is a multiple of 10ms, but Silero VAD needs 32ms.
self._vad_buffer_chunks = int(
math.ceil(self.vad.chunk_bytes() / BYTES_PER_CHUNK)
)
self._vad_leftover_bytes = self.vad.chunk_bytes() - BYTES_PER_CHUNK
self._vad_buffer = bytearray(self.vad.chunk_bytes())
_LOGGER.debug("Initialized Silero VAD")
def enhance_chunk(self, audio: bytes, timestamp_ms: int) -> EnhancedAudioChunk:
"""Enhance 10ms chunk of PCM audio @ 16Khz with 16-bit mono samples."""
speech_probability: float | None = None
assert len(audio) == BYTES_PER_CHUNK
if self.vad is not None:
# Run VAD
speech_probability = self.vad.Process10ms(audio)
assert self._vad_buffer is not None
start_idx = self._vad_buffer_chunk_idx * BYTES_PER_CHUNK
self._vad_buffer[start_idx : start_idx + BYTES_PER_CHUNK] = audio
self._vad_buffer_chunk_idx += 1
if self._vad_buffer_chunk_idx >= self._vad_buffer_chunks:
# We have enough data to run Silero VAD (32 ms)
self._last_speech_probability = self.vad.process_chunk(
self._vad_buffer[: self.vad.chunk_bytes()]
)
# Copy leftover audio that wasn't processed to start
self._vad_buffer[: self._vad_leftover_bytes] = self._vad_buffer[
-self._vad_leftover_bytes :
]
self._vad_buffer_chunk_idx = 0
if self.audio_processor is not None:
# Run noise suppression and auto gain
@@ -92,5 +121,5 @@ class MicroVadSpeexEnhancer(AudioEnhancer):
return EnhancedAudioChunk(
audio=audio,
timestamp_ms=timestamp_ms,
speech_probability=speech_probability,
speech_probability=self._last_speech_probability,
)


@@ -8,5 +8,5 @@
"integration_type": "system",
"iot_class": "local_push",
"quality_scale": "internal",
"requirements": ["pymicro-vad==1.0.1", "pyspeex-noise==1.0.2"]
"requirements": ["pysilero-vad==3.0.1", "pyspeex-noise==1.0.2"]
}


@@ -55,7 +55,7 @@ from homeassistant.util import (
from homeassistant.util.hass_dict import HassKey
from homeassistant.util.limited_size_dict import LimitedSizeDict
from .audio_enhancer import AudioEnhancer, EnhancedAudioChunk, MicroVadSpeexEnhancer
from .audio_enhancer import AudioEnhancer, EnhancedAudioChunk, SileroVadSpeexEnhancer
from .const import (
ACKNOWLEDGE_PATH,
BYTES_PER_CHUNK,
@@ -633,7 +633,7 @@ class PipelineRun:
# Initialize with audio settings
if self.audio_settings.needs_processor and (self.audio_enhancer is None):
# Default audio enhancer
self.audio_enhancer = MicroVadSpeexEnhancer(
self.audio_enhancer = SileroVadSpeexEnhancer(
self.audio_settings.auto_gain_dbfs,
self.audio_settings.noise_suppression_level,
self.audio_settings.is_vad_enabled,


@@ -1,23 +0,0 @@
"""Provides conditions for assist satellites."""
from homeassistant.core import HomeAssistant
from homeassistant.helpers.condition import Condition, make_entity_state_condition
from .const import DOMAIN
from .entity import AssistSatelliteState
CONDITIONS: dict[str, type[Condition]] = {
"is_idle": make_entity_state_condition(DOMAIN, AssistSatelliteState.IDLE),
"is_listening": make_entity_state_condition(DOMAIN, AssistSatelliteState.LISTENING),
"is_processing": make_entity_state_condition(
DOMAIN, AssistSatelliteState.PROCESSING
),
"is_responding": make_entity_state_condition(
DOMAIN, AssistSatelliteState.RESPONDING
),
}
async def async_get_conditions(hass: HomeAssistant) -> dict[str, type[Condition]]:
"""Return the assist satellite conditions."""
return CONDITIONS


@@ -1,19 +0,0 @@
.condition_common: &condition_common
target:
entity:
domain: assist_satellite
fields:
behavior:
required: true
default: any
selector:
select:
translation_key: condition_behavior
options:
- all
- any
is_idle: *condition_common
is_listening: *condition_common
is_processing: *condition_common
is_responding: *condition_common


@@ -1,18 +1,4 @@
{
"conditions": {
"is_idle": {
"condition": "mdi:chat-sleep"
},
"is_listening": {
"condition": "mdi:chat-question"
},
"is_processing": {
"condition": "mdi:chat-processing"
},
"is_responding": {
"condition": "mdi:chat-alert"
}
},
"entity_component": {
"_": {
"default": "mdi:account-voice"


@@ -1,52 +1,8 @@
{
"common": {
"condition_behavior_description": "How the state should match on the targeted Assist satellites.",
"condition_behavior_name": "Behavior",
"trigger_behavior_description": "The behavior of the targeted Assist satellites to trigger on.",
"trigger_behavior_name": "Behavior"
},
"conditions": {
"is_idle": {
"description": "Tests if one or more Assist satellites are idle.",
"fields": {
"behavior": {
"description": "[%key:component::assist_satellite::common::condition_behavior_description%]",
"name": "[%key:component::assist_satellite::common::condition_behavior_name%]"
}
},
"name": "Satellite is idle"
},
"is_listening": {
"description": "Tests if one or more Assist satellites are listening.",
"fields": {
"behavior": {
"description": "[%key:component::assist_satellite::common::condition_behavior_description%]",
"name": "[%key:component::assist_satellite::common::condition_behavior_name%]"
}
},
"name": "Satellite is listening"
},
"is_processing": {
"description": "Tests if one or more Assist satellites are processing.",
"fields": {
"behavior": {
"description": "[%key:component::assist_satellite::common::condition_behavior_description%]",
"name": "[%key:component::assist_satellite::common::condition_behavior_name%]"
}
},
"name": "Satellite is processing"
},
"is_responding": {
"description": "Tests if one or more Assist satellites are responding.",
"fields": {
"behavior": {
"description": "[%key:component::assist_satellite::common::condition_behavior_description%]",
"name": "[%key:component::assist_satellite::common::condition_behavior_name%]"
}
},
"name": "Satellite is responding"
}
},
"entity_component": {
"_": {
"name": "Assist satellite",
@@ -65,12 +21,6 @@
"sentences": "Sentences"
}
},
"condition_behavior": {
"options": {
"all": "All",
"any": "Any"
}
},
"trigger_behavior": {
"options": {
"any": "Any",


@@ -7,7 +7,7 @@ import asyncio
from collections.abc import Callable, Mapping
from dataclasses import dataclass
import logging
from typing import Any, Literal, Protocol, cast
from typing import Any, Protocol, cast
from propcache.api import cached_property
import voluptuous as vol
@@ -16,10 +16,7 @@ from homeassistant.components import labs, websocket_api
from homeassistant.components.blueprint import CONF_USE_BLUEPRINT
from homeassistant.components.labs import async_listen as async_labs_listen
from homeassistant.const import (
ATTR_AREA_ID,
ATTR_ENTITY_ID,
ATTR_FLOOR_ID,
ATTR_LABEL_ID,
ATTR_MODE,
ATTR_NAME,
CONF_ACTIONS,
@@ -33,7 +30,6 @@ from homeassistant.const import (
CONF_OPTIONS,
CONF_PATH,
CONF_PLATFORM,
CONF_TARGET,
CONF_TRIGGERS,
CONF_VARIABLES,
CONF_ZONE,
@@ -56,7 +52,7 @@ from homeassistant.core import (
valid_entity_id,
)
from homeassistant.exceptions import HomeAssistantError, ServiceNotFound, TemplateError
from homeassistant.helpers import condition as condition_helper, config_validation as cv
from homeassistant.helpers import condition, config_validation as cv
from homeassistant.helpers.entity import ToggleEntity
from homeassistant.helpers.entity_component import EntityComponent
from homeassistant.helpers.issue_registry import (
@@ -123,14 +119,7 @@ SERVICE_TRIGGER = "trigger"
NEW_TRIGGERS_CONDITIONS_FEATURE_FLAG = "new_triggers_conditions"
_EXPERIMENTAL_CONDITION_PLATFORMS = {
"alarm_control_panel",
"assist_satellite",
"device_tracker",
"fan",
"light",
"lock",
"siren",
"switch",
}
_EXPERIMENTAL_TRIGGER_PLATFORMS = {
@@ -147,7 +136,6 @@ _EXPERIMENTAL_TRIGGER_PLATFORMS = {
"light",
"lock",
"media_player",
"person",
"scene",
"siren",
"switch",
@@ -557,7 +545,7 @@ class AutomationEntity(BaseAutomationEntity, RestoreEntity):
automation_id: str | None,
name: str,
trigger_config: list[ConfigType],
condition: IfAction | None,
cond_func: IfAction | None,
action_script: Script,
initial_state: bool | None,
variables: ScriptVariables | None,
@@ -570,7 +558,7 @@ class AutomationEntity(BaseAutomationEntity, RestoreEntity):
self._attr_name = name
self._trigger_config = trigger_config
self._async_detach_triggers: CALLBACK_TYPE | None = None
self._condition = condition
self._cond_func = cond_func
self.action_script = action_script
self.action_script.change_listener = self.async_write_ha_state
self._initial_state = initial_state
@@ -600,48 +588,20 @@ class AutomationEntity(BaseAutomationEntity, RestoreEntity):
"""Return True if entity is on."""
return self._async_detach_triggers is not None or self._is_enabled
@cached_property
@property
def referenced_labels(self) -> set[str]:
"""Return a set of referenced labels."""
referenced = self.action_script.referenced_labels
return self.action_script.referenced_labels
if self._condition is not None:
for conf in self._condition.config:
referenced |= condition_helper.async_extract_targets(
conf, ATTR_LABEL_ID
)
for conf in self._trigger_config:
referenced |= set(_get_targets_from_trigger_config(conf, ATTR_LABEL_ID))
return referenced
@cached_property
@property
def referenced_floors(self) -> set[str]:
"""Return a set of referenced floors."""
referenced = self.action_script.referenced_floors
if self._condition is not None:
for conf in self._condition.config:
referenced |= condition_helper.async_extract_targets(
conf, ATTR_FLOOR_ID
)
for conf in self._trigger_config:
referenced |= set(_get_targets_from_trigger_config(conf, ATTR_FLOOR_ID))
return referenced
return self.action_script.referenced_floors
@cached_property
def referenced_areas(self) -> set[str]:
"""Return a set of referenced areas."""
referenced = self.action_script.referenced_areas
if self._condition is not None:
for conf in self._condition.config:
referenced |= condition_helper.async_extract_targets(conf, ATTR_AREA_ID)
for conf in self._trigger_config:
referenced |= set(_get_targets_from_trigger_config(conf, ATTR_AREA_ID))
return referenced
return self.action_script.referenced_areas
@property
def referenced_blueprint(self) -> str | None:
@@ -655,9 +615,9 @@ class AutomationEntity(BaseAutomationEntity, RestoreEntity):
"""Return a set of referenced devices."""
referenced = self.action_script.referenced_devices
if self._condition is not None:
for conf in self._condition.config:
referenced |= condition_helper.async_extract_devices(conf)
if self._cond_func is not None:
for conf in self._cond_func.config:
referenced |= condition.async_extract_devices(conf)
for conf in self._trigger_config:
referenced |= set(_trigger_extract_devices(conf))
@@ -669,9 +629,9 @@ class AutomationEntity(BaseAutomationEntity, RestoreEntity):
"""Return a set of referenced entities."""
referenced = self.action_script.referenced_entities
if self._condition is not None:
for conf in self._condition.config:
referenced |= condition_helper.async_extract_entities(conf)
if self._cond_func is not None:
for conf in self._cond_func.config:
referenced |= condition.async_extract_entities(conf)
for conf in self._trigger_config:
for entity_id in _trigger_extract_entities(conf):
@@ -791,8 +751,8 @@ class AutomationEntity(BaseAutomationEntity, RestoreEntity):
if (
not skip_condition
and self._condition is not None
and not self._condition(variables)
and self._cond_func is not None
and not self._cond_func(variables)
):
self._logger.debug(
"Conditions not met, aborting automation. Condition summary: %s",
@@ -1054,12 +1014,12 @@ async def _create_automation_entities(
)
if CONF_CONDITIONS in config_block:
condition = await _async_process_if(hass, name, config_block)
cond_func = await _async_process_if(hass, name, config_block)
if condition is None:
if cond_func is None:
continue
else:
condition = None
cond_func = None
# Add trigger variables to variables
variables = None
@@ -1077,7 +1037,7 @@ async def _create_automation_entities(
automation_id,
name,
config_block[CONF_TRIGGERS],
condition,
cond_func,
action_script,
initial_state,
variables,
@@ -1219,7 +1179,7 @@ async def _async_process_if(
if_configs = config[CONF_CONDITIONS]
try:
if_action = await condition_helper.async_conditions_from_config(
if_action = await condition.async_conditions_from_config(
hass, if_configs, LOGGER, name
)
except HomeAssistantError as ex:
@@ -1249,9 +1209,6 @@ def _trigger_extract_devices(trigger_conf: dict) -> list[str]:
if trigger_conf[CONF_PLATFORM] == "tag" and CONF_DEVICE_ID in trigger_conf:
return trigger_conf[CONF_DEVICE_ID] # type: ignore[no-any-return]
if target_devices := _get_targets_from_trigger_config(trigger_conf, CONF_DEVICE_ID):
return target_devices
return []
@@ -1282,28 +1239,9 @@ def _trigger_extract_entities(trigger_conf: dict) -> list[str]:
):
return [trigger_conf[CONF_EVENT_DATA][CONF_ENTITY_ID]]
if target_entities := _get_targets_from_trigger_config(
trigger_conf, CONF_ENTITY_ID
):
return target_entities
return []
@callback
def _get_targets_from_trigger_config(
config: dict,
target: Literal["entity_id", "device_id", "area_id", "floor_id", "label_id"],
) -> list[str]:
"""Extract targets from a target config."""
if not (target_conf := config.get(CONF_TARGET)):
return []
if not (targets := target_conf.get(target)):
return []
return [targets] if isinstance(targets, str) else targets
@websocket_api.websocket_command({"type": "automation/config", "entity_id": str})
def websocket_config(
hass: HomeAssistant,


@@ -29,7 +29,7 @@
"integration_type": "device",
"iot_class": "local_push",
"loggers": ["axis"],
"requirements": ["axis==66"],
"requirements": ["axis==65"],
"ssdp": [
{
"manufacturer": "AXIS"


@@ -80,7 +80,7 @@ class AzureDataExplorerClient:
def test_connection(self) -> None:
"""Test connection, will throw Exception if it cannot connect."""
query = f"['{self._table}'] | take 1"
query = f"{self._table} | take 1"
self.query_client.execute_query(self._database, query)


@@ -45,7 +45,7 @@ class ADXConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
VERSION = 1
async def validate_input(self, data: dict[str, Any]) -> dict[str, str]:
async def validate_input(self, data: dict[str, Any]) -> dict[str, Any] | None:
"""Validate the user input allows us to connect.
Data has the keys from STEP_USER_DATA_SCHEMA with values provided by the user.
@@ -54,40 +54,36 @@ class ADXConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
try:
await self.hass.async_add_executor_job(client.test_connection)
except KustoAuthenticationError as err:
_LOGGER.error("Authentication failed: %s", err)
except KustoAuthenticationError as exp:
_LOGGER.error(exp)
return {"base": "invalid_auth"}
except KustoServiceError as err:
_LOGGER.error("Could not connect: %s", err)
except KustoServiceError as exp:
_LOGGER.error(exp)
return {"base": "cannot_connect"}
return {}
return None
async def async_step_user(
self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
"""Handle the initial step."""
errors: dict[str, str] = {}
data_schema = STEP_USER_DATA_SCHEMA
if user_input is not None:
errors = await self.validate_input(user_input)
errors: dict = {}
if user_input:
errors = await self.validate_input(user_input) # type: ignore[assignment]
if not errors:
return self.async_create_entry(
data=user_input,
title=f"{user_input[CONF_ADX_CLUSTER_INGEST_URI].replace('https://', '')} / {user_input[CONF_ADX_DATABASE_NAME]} ({user_input[CONF_ADX_TABLE_NAME]})",
title=user_input[CONF_ADX_CLUSTER_INGEST_URI].replace(
"https://", ""
),
options=DEFAULT_OPTIONS,
)
# Keep previously entered values when we re-show the form after an error.
data_schema = self.add_suggested_values_to_schema(
STEP_USER_DATA_SCHEMA, user_input
)
return self.async_show_form(
step_id="user",
data_schema=data_schema,
data_schema=STEP_USER_DATA_SCHEMA,
errors=errors,
last_step=True,
)


@@ -20,7 +20,6 @@
"use_queued_ingestion": "Use queued ingestion"
},
"data_description": {
"authority_id": "In Azure portal this is also known as Directory (tenant) ID",
"cluster_ingest_uri": "Ingestion URI of the cluster",
"use_queued_ingestion": "Must be enabled when using ADX free cluster"
},


@@ -4,7 +4,6 @@ from __future__ import annotations
import json
import logging
from typing import Any
from azure.servicebus import ServiceBusMessage
from azure.servicebus.aio import ServiceBusClient, ServiceBusSender
@@ -93,7 +92,7 @@ class ServiceBusNotificationService(BaseNotificationService):
"""Initialize the service."""
self._client = client
async def async_send_message(self, message: str, **kwargs: Any) -> None:
async def async_send_message(self, message, **kwargs):
"""Send a message."""
dto = {ATTR_ASB_MESSAGE: message}


@@ -6,15 +6,13 @@ from datetime import timedelta
import logging
from typing import Any
from b2sdk.v2 import Bucket, exception
from b2sdk.v2 import B2Api, Bucket, InMemoryAccountInfo, exception
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.helpers.event import async_track_time_interval
# Import from b2_client to ensure timeout configuration is applied
from .b2_client import B2Api, InMemoryAccountInfo
from .const import (
BACKBLAZE_REALM,
CONF_APPLICATION_KEY,
@@ -74,11 +72,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: BackblazeConfigEntry) ->
translation_domain=DOMAIN,
translation_key="invalid_bucket_name",
) from err
except (
exception.B2ConnectionError,
exception.B2RequestTimeout,
exception.ConnectionReset,
) as err:
except exception.ConnectionReset as err:
raise ConfigEntryNotReady(
translation_domain=DOMAIN,
translation_key="cannot_connect",


@@ -1,39 +0,0 @@
"""Backblaze B2 client with extended timeouts.
The b2sdk library uses class-level timeout attributes. To avoid modifying
global library state, we subclass the relevant classes to provide extended
timeouts suitable for backup operations involving large files.
"""
from b2sdk.v2 import B2Api as BaseB2Api, InMemoryAccountInfo
from b2sdk.v2.b2http import B2Http as BaseB2Http
from b2sdk.v2.session import B2Session as BaseB2Session
# Extended timeouts for Home Assistant backup operations
# Default CONNECTION_TIMEOUT is 46 seconds, which can be too short for slow connections
CONNECTION_TIMEOUT = 120 # 2 minutes
# Default TIMEOUT_FOR_UPLOAD is 128 seconds, which is too short for large backups
TIMEOUT_FOR_UPLOAD = 43200 # 12 hours
class B2Http(BaseB2Http): # type: ignore[misc]
"""B2Http with extended timeouts for backup operations."""
CONNECTION_TIMEOUT = CONNECTION_TIMEOUT
TIMEOUT_FOR_UPLOAD = TIMEOUT_FOR_UPLOAD
class B2Session(BaseB2Session): # type: ignore[misc]
"""B2Session using custom B2Http with extended timeouts."""
B2HTTP_CLASS = B2Http
class B2Api(BaseB2Api): # type: ignore[misc]
"""B2Api using custom session with extended timeouts."""
SESSION_CLASS = B2Session
__all__ = ["B2Api", "InMemoryAccountInfo"]


@@ -36,10 +36,6 @@ _LOGGER = logging.getLogger(__name__)
# Cache TTL for backup list (in seconds)
CACHE_TTL = 300
# Timeout for upload operations (in seconds)
# This prevents uploads from hanging indefinitely
UPLOAD_TIMEOUT = 43200 # 12 hours (matches B2 HTTP timeout)
def suggested_filenames(backup: AgentBackup) -> tuple[str, str]:
"""Return the suggested filenames for the backup and metadata files."""
@@ -333,28 +329,13 @@ class BackblazeBackupAgent(BackupAgent):
_LOGGER.debug("Uploading backup file %s with streaming", filename)
try:
content_type, _ = mimetypes.guess_type(filename)
file_version = await asyncio.wait_for(
self._hass.async_add_executor_job(
self._upload_unbound_stream_sync,
reader,
filename,
content_type or "application/x-tar",
file_info,
),
timeout=UPLOAD_TIMEOUT,
file_version = await self._hass.async_add_executor_job(
self._upload_unbound_stream_sync,
reader,
filename,
content_type or "application/x-tar",
file_info,
)
except TimeoutError:
_LOGGER.error(
"Upload of %s timed out after %s seconds", filename, UPLOAD_TIMEOUT
)
reader.abort()
raise BackupAgentError(
f"Upload timed out after {UPLOAD_TIMEOUT} seconds"
) from None
except asyncio.CancelledError:
_LOGGER.warning("Upload of %s was cancelled", filename)
reader.abort()
raise
finally:
reader.close()


@@ -6,7 +6,7 @@ from collections.abc import Mapping
import logging
from typing import Any
from b2sdk.v2 import exception
from b2sdk.v2 import B2Api, InMemoryAccountInfo, exception
import voluptuous as vol
from homeassistant.config_entries import ConfigEntry, ConfigFlow, ConfigFlowResult
@@ -17,8 +17,6 @@ from homeassistant.helpers.selector import (
TextSelectorType,
)
# Import from b2_client to ensure timeout configuration is applied
from .b2_client import B2Api, InMemoryAccountInfo
from .const import (
BACKBLAZE_REALM,
CONF_APPLICATION_KEY,
@@ -174,12 +172,8 @@ class BackblazeConfigFlow(ConfigFlow, domain=DOMAIN):
"Backblaze B2 bucket '%s' does not exist", user_input[CONF_BUCKET]
)
errors[CONF_BUCKET] = "invalid_bucket_name"
except (
exception.B2ConnectionError,
exception.B2RequestTimeout,
exception.ConnectionReset,
) as err:
_LOGGER.error("Failed to connect to Backblaze B2: %s", err)
except exception.ConnectionReset:
_LOGGER.error("Failed to connect to Backblaze B2. Connection reset")
errors["base"] = "cannot_connect"
except exception.MissingAccountData:
# This generally indicates an issue with how InMemoryAccountInfo is used


@@ -8,5 +8,5 @@
"iot_class": "cloud_push",
"loggers": ["b2sdk"],
"quality_scale": "bronze",
"requirements": ["b2sdk==2.10.1"]
"requirements": ["b2sdk==2.8.1"]
}


@@ -17,12 +17,10 @@ from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_HOST, CONF_MODEL, Platform
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.helpers import config_validation as cv, device_registry as dr
from homeassistant.helpers.typing import ConfigType
from homeassistant.helpers import device_registry as dr
from homeassistant.util.ssl import get_default_context
from .const import DOMAIN
from .services import async_setup_services
from .websocket import BeoWebsocket
@@ -36,20 +34,7 @@ class BeoData:
type BeoConfigEntry = ConfigEntry[BeoData]
PLATFORMS = [
Platform.BINARY_SENSOR,
Platform.EVENT,
Platform.MEDIA_PLAYER,
Platform.SENSOR,
]
CONFIG_SCHEMA = cv.config_entry_only_config_schema(DOMAIN)
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the component."""
async_setup_services(hass)
return True
PLATFORMS = [Platform.EVENT, Platform.MEDIA_PLAYER]
async def async_setup_entry(hass: HomeAssistant, entry: BeoConfigEntry) -> bool:


@@ -1,63 +0,0 @@
"""Binary Sensor entities for the Bang & Olufsen integration."""
from __future__ import annotations
from mozart_api.models import BatteryState
from homeassistant.components.binary_sensor import (
BinarySensorDeviceClass,
BinarySensorEntity,
)
from homeassistant.core import HomeAssistant
from homeassistant.helpers.dispatcher import async_dispatcher_connect
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import BeoConfigEntry
from .const import CONNECTION_STATUS, DOMAIN, WebsocketNotification
from .entity import BeoEntity
from .util import supports_battery
async def async_setup_entry(
hass: HomeAssistant,
config_entry: BeoConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Binary Sensor entities from config entry."""
if await supports_battery(config_entry.runtime_data.client):
async_add_entities(new_entities=[BeoBinarySensorBatteryCharging(config_entry)])
class BeoBinarySensorBatteryCharging(BinarySensorEntity, BeoEntity):
"""Battery charging Binary Sensor."""
_attr_device_class = BinarySensorDeviceClass.BATTERY_CHARGING
_attr_is_on = False
def __init__(self, config_entry: BeoConfigEntry) -> None:
"""Init the battery charging Binary Sensor."""
super().__init__(config_entry, config_entry.runtime_data.client)
self._attr_unique_id = f"{self._unique_id}_charging"
async def async_added_to_hass(self) -> None:
"""Turn on the dispatchers."""
self.async_on_remove(
async_dispatcher_connect(
self.hass,
f"{DOMAIN}_{self._unique_id}_{CONNECTION_STATUS}",
self._async_update_connection_state,
)
)
self.async_on_remove(
async_dispatcher_connect(
self.hass,
f"{DOMAIN}_{self._unique_id}_{WebsocketNotification.BATTERY}",
self._update_battery_charging,
)
)
async def _update_battery_charging(self, data: BatteryState) -> None:
"""Update battery charging."""
self._attr_is_on = bool(data.is_charging)
self.async_write_ha_state()

Some files were not shown because too many files have changed in this diff.