Compare commits


1 Commit

Author: Daniel Hjelseth Høyer
SHA1: d2cb3928e9
Message: Homevolt select
Signed-off-by: Daniel Hjelseth Høyer <github@dahoiv.net>
Date: 2026-02-27 10:46:46 +01:00
1216 changed files with 19767 additions and 79359 deletions

View File

@@ -1,46 +0,0 @@
---
name: github-pr-reviewer
description: Review a GitHub pull request and provide feedback comments. Use when the user says "review the current PR" or asks to review a specific PR.
---
# Review GitHub Pull Request
## Preparation:
- Check if the local commit matches the last one in the PR. If not, checkout the PR locally using 'gh pr checkout'.
- CRITICAL: If 'gh pr checkout' fails for ANY reason, you MUST immediately STOP.
- Do NOT attempt any workarounds.
- Do NOT proceed with the review.
- ALERT about the failure and WAIT for instructions.
- This is a hard requirement - no exceptions.
## Follow these steps:
1. Use 'gh pr view' to get the PR details and description.
2. Use 'gh pr diff' to see all the changes in the PR.
3. Analyze the code changes for:
- Code quality and style consistency
- Potential bugs or issues
- Performance implications
- Security concerns
- Test coverage
- Documentation updates if needed
4. Ensure any existing review comments have been addressed.
5. Generate constructive review comments in the CONSOLE. DO NOT POST TO GITHUB YOURSELF.
## IMPORTANT:
- Just review. DO NOT make any changes
- Be constructive and specific in your comments
- Suggest improvements where appropriate
- Only provide review feedback in the CONSOLE. DO NOT ACT ON GITHUB.
- No need to run tests or linters, just review the code changes.
- No need to highlight things that are already good.
## Output format:
- List specific comments for each file/line that needs attention
- In the end, summarize with an overall assessment (approve, request changes, or comment) and bullet point list of changes suggested, if any.
- Example output:
```
Overall assessment: request changes.
- [CRITICAL] Memory leak in homeassistant/components/sensor/my_sensor.py:143
- [PROBLEM] Inefficient algorithm in homeassistant/helpers/data_processing.py:87
- [SUGGESTION] Improve variable naming in homeassistant/helpers/config_validation.py:45
```

View File

@@ -34,7 +34,6 @@ base_platforms: &base_platforms
- homeassistant/components/humidifier/**
- homeassistant/components/image/**
- homeassistant/components/image_processing/**
- homeassistant/components/infrared/**
- homeassistant/components/lawn_mower/**
- homeassistant/components/light/**
- homeassistant/components/lock/**

View File

@@ -331,6 +331,864 @@ class MyCoordinator(DataUpdateCoordinator[MyData]):
```
# Skills
# Skill: Home Assistant Integration knowledge
- Home Assistant Integration knowledge: .claude/skills/integrations/SKILL.md
### File Locations
- **Integration code**: `./homeassistant/components/<integration_domain>/`
- **Integration tests**: `./tests/components/<integration_domain>/`
## Integration Templates
### Standard Integration Structure
```
homeassistant/components/my_integration/
├── __init__.py # Entry point with async_setup_entry
├── manifest.json # Integration metadata and dependencies
├── const.py # Domain and constants
├── config_flow.py # UI configuration flow
├── coordinator.py # Data update coordinator (if needed)
├── entity.py # Base entity class (if shared patterns)
├── sensor.py # Sensor platform
├── strings.json # User-facing text and translations
├── services.yaml # Service definitions (if applicable)
└── quality_scale.yaml # Quality scale rule status
```
An integration can add platform files as needed (e.g., `sensor.py`, `switch.py`). The following platforms have extra guidelines:
- **Diagnostics**: [`platform-diagnostics.md`](platform-diagnostics.md) for diagnostic data collection
<REFERENCE platform-diagnostics.md>
# Integration Diagnostics
Platform exists as `homeassistant/components/<domain>/diagnostics.py`.
- **Required**: Implement diagnostic data collection
- **Implementation**:
```python
TO_REDACT = [CONF_API_KEY, CONF_LATITUDE, CONF_LONGITUDE]
async def async_get_config_entry_diagnostics(
hass: HomeAssistant, entry: MyConfigEntry
) -> dict[str, Any]:
"""Return diagnostics for a config entry."""
return {
"entry_data": async_redact_data(entry.data, TO_REDACT),
"data": entry.runtime_data.data,
}
```
- **Security**: Never expose passwords, tokens, or sensitive coordinates
<END REFERENCE platform-diagnostics.md>
- **Repairs**: [`platform-repairs.md`](platform-repairs.md) for user-actionable repair issues
<REFERENCE platform-repairs.md>
# Repairs platform
Platform exists as `homeassistant/components/<domain>/repairs.py`.
- **Actionable Issues Required**: All repair issues must be actionable for end users
- **Issue Content Requirements**:
- Clearly explain what is happening
- Provide specific steps users need to take to resolve the issue
- Use friendly, helpful language
- Include relevant context (device names, error details, etc.)
- **Implementation**:
```python
ir.async_create_issue(
hass,
DOMAIN,
"outdated_version",
is_fixable=False,
issue_domain=DOMAIN,
severity=ir.IssueSeverity.ERROR,
translation_key="outdated_version",
)
```
- **Translation Strings Requirements**: Must contain user-actionable text in `strings.json`:
```json
{
"issues": {
"outdated_version": {
"title": "Device firmware is outdated",
"description": "Your device firmware version {current_version} is below the minimum required version {min_version}. To fix this issue: 1) Open the manufacturer's mobile app, 2) Navigate to device settings, 3) Select 'Update Firmware', 4) Wait for the update to complete, then 5) Restart Home Assistant."
}
}
}
```
- **String Content Must Include**:
- What the problem is
- Why it matters
- Exact steps to resolve (numbered list when multiple steps)
- What to expect after following the steps
- **Avoid Vague Instructions**: Don't just say "update firmware" - provide specific steps
- **Severity Guidelines**:
- `CRITICAL`: Reserved for extreme scenarios only
- `ERROR`: Requires immediate user attention
- `WARNING`: Indicates future potential breakage
- **Additional Attributes**:
```python
ir.async_create_issue(
hass, DOMAIN, "issue_id",
breaks_in_ha_version="2024.1.0",
is_fixable=True,
is_persistent=True,
severity=ir.IssueSeverity.ERROR,
translation_key="issue_description",
)
```
- Only create issues for problems users can potentially resolve
<END REFERENCE platform-repairs.md>
### Minimal Integration Checklist
- [ ] `manifest.json` with required fields (domain, name, codeowners, etc.)
- [ ] `__init__.py` with `async_setup_entry` and `async_unload_entry`
- [ ] `config_flow.py` with UI configuration support
- [ ] `const.py` with `DOMAIN` constant
- [ ] `strings.json` with at least config flow text
- [ ] Platform files (`sensor.py`, etc.) as needed
- [ ] `quality_scale.yaml` with rule status tracking
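- **Sketch** (illustrative): a minimal `__init__.py` covering the entry-point items in this checklist; the `MyClient` library, its module name, and the `CONF_HOST` data key are assumptions, not fixed requirements:
```python
"""Minimal integration entry point (illustrative)."""
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_HOST, Platform
from homeassistant.core import HomeAssistant

from my_integration_lib import MyClient  # hypothetical library client

PLATFORMS: list[Platform] = [Platform.SENSOR]

type MyConfigEntry = ConfigEntry[MyClient]


async def async_setup_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
    """Set up the integration from a config entry."""
    entry.runtime_data = MyClient(entry.data[CONF_HOST])
    await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
    return True


async def async_unload_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
    """Unload the config entry and its platforms."""
    return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
```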
## Integration Quality Scale
Home Assistant uses an Integration Quality Scale to ensure code quality and consistency. The quality level determines which rules apply:
### Quality Scale Levels
- **Bronze**: Basic requirements (ALL Bronze rules are mandatory)
- **Silver**: Enhanced functionality
- **Gold**: Advanced features
- **Platinum**: Highest quality standards
### Quality Scale Progression
- **Bronze → Silver**: Add entity unavailability, parallel updates, auth flows
- **Silver → Gold**: Add device management, diagnostics, translations
- **Gold → Platinum**: Add strict typing, async dependencies, websession injection
### How Rules Apply
1. **Check `manifest.json`**: Look for `"quality_scale"` key to determine integration level
2. **Bronze Rules**: Always required for any integration with quality scale
3. **Higher Tier Rules**: Only apply if integration targets that tier or higher
4. **Rule Status**: Check `quality_scale.yaml` in integration folder for:
- `done`: Rule implemented
- `exempt`: Rule doesn't apply (with reason in comment)
- `todo`: Rule needs implementation
### Example `quality_scale.yaml` Structure
```yaml
rules:
# Bronze (mandatory)
config-flow: done
entity-unique-id: done
action-setup:
status: exempt
comment: Integration does not register custom actions.
# Silver (if targeting Silver+)
entity-unavailable: done
parallel-updates: done
# Gold (if targeting Gold+)
devices: done
diagnostics: done
# Platinum (if targeting Platinum)
strict-typing: done
```
**When Reviewing/Creating Code**: Always check the integration's quality scale level and exemption status before applying rules.
## Code Organization
### Core Locations
- Shared constants: `homeassistant/const.py` (use these instead of hardcoding)
- Integration structure:
- `homeassistant/components/{domain}/const.py` - Constants
- `homeassistant/components/{domain}/models.py` - Data models
- `homeassistant/components/{domain}/coordinator.py` - Update coordinator
- `homeassistant/components/{domain}/config_flow.py` - Configuration flow
- `homeassistant/components/{domain}/{platform}.py` - Platform implementations
### Common Modules
- **coordinator.py**: Centralize data fetching logic
```python
class MyCoordinator(DataUpdateCoordinator[MyData]):
def __init__(self, hass: HomeAssistant, client: MyClient, config_entry: ConfigEntry) -> None:
super().__init__(
hass,
logger=LOGGER,
name=DOMAIN,
update_interval=timedelta(minutes=1),
config_entry=config_entry, # ✅ Pass config_entry - it's accepted and recommended
)
```
- **entity.py**: Base entity definitions to reduce duplication
```python
class MyEntity(CoordinatorEntity[MyCoordinator]):
_attr_has_entity_name = True
```
### Runtime Data Storage
- **Use ConfigEntry.runtime_data**: Store non-persistent runtime data
```python
type MyIntegrationConfigEntry = ConfigEntry[MyClient]
async def async_setup_entry(hass: HomeAssistant, entry: MyIntegrationConfigEntry) -> bool:
client = MyClient(entry.data[CONF_HOST])
entry.runtime_data = client
```
### Manifest Requirements
- **Required Fields**: `domain`, `name`, `codeowners`, `integration_type`, `documentation`, `requirements`
- **Integration Types**: `device`, `hub`, `service`, `system`, `helper`
- **IoT Class**: Always specify connectivity method (e.g., `cloud_polling`, `local_polling`, `local_push`)
- **Discovery Methods**: Add when applicable: `zeroconf`, `dhcp`, `bluetooth`, `ssdp`, `usb`
- **Dependencies**: Include platform dependencies (e.g., `application_credentials`, `bluetooth_adapters`)
### Config Flow Patterns
- **Version Control**: Always set `VERSION = 1` and `MINOR_VERSION = 1`
- **Unique ID Management**:
```python
await self.async_set_unique_id(device_unique_id)
self._abort_if_unique_id_configured()
```
- **Error Handling**: Define errors in `strings.json` under `config.error`
- **Step Methods**: Use standard naming (`async_step_user`, `async_step_discovery`, etc.)
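- **Sketch** (illustrative): a minimal flow tying these patterns together, assuming a hypothetical `MyClient` library that raises `CannotConnect`:
```python
import voluptuous as vol

from homeassistant.config_entries import ConfigFlow, ConfigFlowResult
from homeassistant.const import CONF_HOST

from .const import DOMAIN


class MyConfigFlow(ConfigFlow, domain=DOMAIN):
    """Handle the UI configuration flow."""

    VERSION = 1
    MINOR_VERSION = 1

    async def async_step_user(self, user_input: dict | None = None) -> ConfigFlowResult:
        """Handle the step initiated by the user."""
        errors: dict[str, str] = {}
        if user_input is not None:
            client = MyClient(user_input[CONF_HOST])  # hypothetical library client
            try:
                device_id = await client.get_device_id()
            except CannotConnect:
                errors["base"] = "cannot_connect"  # defined in strings.json under config.error
            else:
                await self.async_set_unique_id(device_id)
                self._abort_if_unique_id_configured()
                return self.async_create_entry(title="My Device", data=user_input)
        return self.async_show_form(
            step_id="user",
            data_schema=vol.Schema({vol.Required(CONF_HOST): str}),
            errors=errors,
        )
```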
### Integration Ownership
- **manifest.json**: Add GitHub usernames to `codeowners`:
```json
{
"domain": "my_integration",
"name": "My Integration",
"codeowners": ["@me"]
}
```
### Async Dependencies (Platinum)
- **Requirement**: All dependencies must use asyncio
- Ensures efficient task handling without thread context switching
### WebSession Injection (Platinum)
- **Pass WebSession**: Support passing web sessions to dependencies
```python
async def async_setup_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
"""Set up integration from config entry."""
client = MyClient(entry.data[CONF_HOST], async_get_clientsession(hass))
```
- For cookies: Use `async_create_clientsession` (aiohttp) or `create_async_httpx_client` (httpx)
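- **Sketch** (illustrative): when the dependency needs its own cookie jar, create a dedicated session instead of reusing the shared one; `MyClient` is a placeholder:
```python
from homeassistant.helpers.aiohttp_client import async_create_clientsession

# Dedicated aiohttp session for this config entry (can hold cookies)
session = async_create_clientsession(hass)
client = MyClient(entry.data[CONF_HOST], session)
```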
### Data Update Coordinator
- **Standard Pattern**: Use for efficient data management
```python
class MyCoordinator(DataUpdateCoordinator):
def __init__(self, hass: HomeAssistant, client: MyClient, config_entry: ConfigEntry) -> None:
super().__init__(
hass,
logger=LOGGER,
name=DOMAIN,
update_interval=timedelta(minutes=5),
config_entry=config_entry, # ✅ Pass config_entry - it's accepted and recommended
)
self.client = client
async def _async_update_data(self):
try:
return await self.client.fetch_data()
except ApiError as err:
raise UpdateFailed(f"API communication error: {err}") from err
```
- **Error Types**: Use `UpdateFailed` for API errors, `ConfigEntryAuthFailed` for auth issues
- **Config Entry**: Always pass `config_entry` parameter to coordinator - it's accepted and recommended
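- **Sketch** (illustrative): mapping hypothetical library errors (`MyAuthError`, `ApiError`) onto these exception types inside the coordinator:
```python
class MyCoordinator(DataUpdateCoordinator[MyData]):
    async def _async_update_data(self) -> MyData:
        """Fetch data and translate library errors."""
        try:
            return await self.client.fetch_data()
        except MyAuthError as err:
            # Triggers the reauthentication flow instead of endlessly failing updates
            raise ConfigEntryAuthFailed("Credentials are no longer valid") from err
        except ApiError as err:
            # Marks entities unavailable until the next successful refresh
            raise UpdateFailed(f"API communication error: {err}") from err
```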
## Integration Guidelines
### Configuration Flow
- **UI Setup Required**: All integrations must support configuration via UI
- **Manifest**: Set `"config_flow": true` in `manifest.json`
- **Data Storage**:
- Connection-critical config: Store in `ConfigEntry.data`
- Non-critical settings: Store in `ConfigEntry.options`
- **Validation**: Always validate user input before creating entries
- **Config Entry Naming**:
- ❌ Do NOT allow users to set config entry names in config flows
- Names are automatically generated or can be customized later in UI
- ✅ Exception: Helper integrations MAY allow custom names in config flow
- **Connection Testing**: Test device/service connection during config flow:
```python
try:
await client.get_data()
except MyException:
errors["base"] = "cannot_connect"
```
- **Duplicate Prevention**: Prevent duplicate configurations:
```python
# Using unique ID
await self.async_set_unique_id(identifier)
self._abort_if_unique_id_configured()
# Using unique data
self._async_abort_entries_match({CONF_HOST: user_input[CONF_HOST]})
```
### Reauthentication Support
- **Required Method**: Implement `async_step_reauth` in config flow
- **Credential Updates**: Allow users to update credentials without re-adding
- **Validation**: Verify account matches existing unique ID:
```python
await self.async_set_unique_id(user_id)
self._abort_if_unique_id_mismatch(reason="wrong_account")
return self.async_update_reload_and_abort(
self._get_reauth_entry(),
data_updates={CONF_API_TOKEN: user_input[CONF_API_TOKEN]}
)
```
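- **Sketch** (illustrative): the surrounding step methods on the config flow class, assuming the new token is validated by a hypothetical `MyClient`:
```python
async def async_step_reauth(self, entry_data: Mapping[str, Any]) -> ConfigFlowResult:
    """Start a reauthentication flow."""
    return await self.async_step_reauth_confirm()

async def async_step_reauth_confirm(self, user_input: dict[str, Any] | None = None) -> ConfigFlowResult:
    """Ask for new credentials and apply the unique-ID check shown above."""
    if user_input is not None:
        user_id = await MyClient(user_input[CONF_API_TOKEN]).get_user_id()
        await self.async_set_unique_id(user_id)
        self._abort_if_unique_id_mismatch(reason="wrong_account")
        return self.async_update_reload_and_abort(
            self._get_reauth_entry(),
            data_updates={CONF_API_TOKEN: user_input[CONF_API_TOKEN]},
        )
    return self.async_show_form(
        step_id="reauth_confirm",
        data_schema=vol.Schema({vol.Required(CONF_API_TOKEN): str}),
    )
```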
### Reconfiguration Flow
- **Purpose**: Allow configuration updates without removing device
- **Implementation**: Add `async_step_reconfigure` method
- **Validation**: Prevent changing underlying account with `_abort_if_unique_id_mismatch`
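- **Sketch** (illustrative): a reconfigure step that revalidates the account with a hypothetical `validate_and_get_id` helper before updating the entry:
```python
async def async_step_reconfigure(self, user_input: dict[str, Any] | None = None) -> ConfigFlowResult:
    """Let the user change settings without removing the entry."""
    if user_input is not None:
        # validate_and_get_id is a placeholder for the integration's own validation
        await self.async_set_unique_id(await validate_and_get_id(user_input))
        self._abort_if_unique_id_mismatch(reason="wrong_account")
        return self.async_update_reload_and_abort(
            self._get_reconfigure_entry(),
            data_updates=user_input,
        )
    return self.async_show_form(
        step_id="reconfigure",
        data_schema=vol.Schema({vol.Required(CONF_HOST): str}),
    )
```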
### Device Discovery
- **Manifest Configuration**: Add discovery method (zeroconf, dhcp, etc.)
```json
{
"zeroconf": ["_mydevice._tcp.local."]
}
```
- **Discovery Handler**: Implement appropriate `async_step_*` method:
```python
async def async_step_zeroconf(self, discovery_info):
"""Handle zeroconf discovery."""
await self.async_set_unique_id(discovery_info.properties["serialno"])
self._abort_if_unique_id_configured(updates={CONF_HOST: discovery_info.host})
```
- **Network Updates**: Use discovery to update dynamic IP addresses
### Network Discovery Implementation
- **Zeroconf/mDNS**: Use async instances
```python
aiozc = await zeroconf.async_get_async_instance(hass)
```
- **SSDP Discovery**: Register callbacks with cleanup
```python
entry.async_on_unload(
ssdp.async_register_callback(
hass, _async_discovered_device,
{"st": "urn:schemas-upnp-org:device:ZonePlayer:1"}
)
)
```
### Bluetooth Integration
- **Manifest Dependencies**: Add `bluetooth_adapters` to dependencies
- **Connectable**: Set `"connectable": true` for connection-required devices
- **Scanner Usage**: Always use shared scanner instance
```python
scanner = bluetooth.async_get_scanner()
entry.async_on_unload(
bluetooth.async_register_callback(
hass, _async_discovered_device,
{"service_uuid": "example_uuid"},
bluetooth.BluetoothScanningMode.ACTIVE
)
)
```
- **Connection Handling**: Never reuse `BleakClient` instances, use 10+ second timeouts
### Setup Validation
- **Test Before Setup**: Verify integration can be set up in `async_setup_entry`
- **Exception Handling**:
- `ConfigEntryNotReady`: Device offline or temporary failure
- `ConfigEntryAuthFailed`: Authentication issues
- `ConfigEntryError`: Unresolvable setup problems
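- **Sketch** (illustrative): validating the connection in `async_setup_entry`, assuming hypothetical `MyAuthError`/`MyConnectionError` library exceptions:
```python
async def async_setup_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
    """Validate the connection before completing setup."""
    client = MyClient(entry.data[CONF_HOST])
    try:
        await client.get_data()
    except MyAuthError as err:
        raise ConfigEntryAuthFailed("Credentials are no longer valid") from err
    except MyConnectionError as err:
        # Home Assistant retries setup with backoff while the device stays offline
        raise ConfigEntryNotReady(f"Device not reachable: {err}") from err
    entry.runtime_data = client
    return True
```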
### Config Entry Unloading
- **Required**: Implement `async_unload_entry` for runtime removal/reload
- **Platform Unloading**: Use `hass.config_entries.async_unload_platforms`
- **Cleanup**: Register callbacks with `entry.async_on_unload`:
```python
async def async_unload_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
"""Unload a config entry."""
if unload_ok := await hass.config_entries.async_unload_platforms(entry, PLATFORMS):
entry.runtime_data.listener() # Clean up resources
return unload_ok
```
### Service Actions
- **Registration**: Register all service actions in `async_setup`, NOT in `async_setup_entry`
- **Validation**: Check config entry existence and loaded state:
```python
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
async def service_action(call: ServiceCall) -> ServiceResponse:
if not (entry := hass.config_entries.async_get_entry(call.data[ATTR_CONFIG_ENTRY_ID])):
raise ServiceValidationError("Entry not found")
if entry.state is not ConfigEntryState.LOADED:
raise ServiceValidationError("Entry not loaded")
```
- **Exception Handling**: Raise appropriate exceptions:
```python
# For invalid input
if end_date < start_date:
raise ServiceValidationError("End date must be after start date")
# For service errors
try:
await client.set_schedule(start_date, end_date)
except MyConnectionError as err:
raise HomeAssistantError("Could not connect to the schedule") from err
```
### Service Registration Patterns
- **Entity Services**: Register on platform setup
```python
platform.async_register_entity_service(
"my_entity_service",
{vol.Required("parameter"): cv.string},
"handle_service_method"
)
```
- **Service Schema**: Always validate input
```python
SERVICE_SCHEMA = vol.Schema({
vol.Required("entity_id"): cv.entity_ids,
vol.Required("parameter"): cv.string,
vol.Optional("timeout", default=30): cv.positive_int,
})
```
- **Services File**: Create `services.yaml` with descriptions and field definitions
### Polling
- Use update coordinator pattern when possible
- **Polling intervals are NOT user-configurable**: Never add scan_interval, update_interval, or polling frequency options to config flows or config entries
- **Integration determines intervals**: Set `update_interval` programmatically based on integration logic, not user input
- **Minimum Intervals**:
- Local network: 5 seconds
- Cloud services: 60 seconds
- **Parallel Updates**: Specify number of concurrent updates:
```python
PARALLEL_UPDATES = 1 # Serialize updates to prevent overwhelming device
# OR
PARALLEL_UPDATES = 0 # Unlimited (for coordinator-based or read-only)
```
## Entity Development
### Unique IDs
- **Required**: Every entity must have a unique ID for registry tracking
- Must be unique per platform (not per integration)
- Don't include integration domain or platform in ID
- **Implementation**:
```python
class MySensor(SensorEntity):
def __init__(self, device_id: str) -> None:
self._attr_unique_id = f"{device_id}_temperature"
```
**Acceptable ID Sources**:
- Device serial numbers
- MAC addresses (formatted using `format_mac` from device registry)
- Physical identifiers (printed/EEPROM)
- Config entry ID as last resort: `f"{entry.entry_id}-battery"`
**Never Use**:
- IP addresses, hostnames, URLs
- Device names
- Email addresses, usernames
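- **Sketch** (illustrative): deriving a unique ID from a MAC address with `format_mac`:
```python
from homeassistant.helpers.device_registry import format_mac


class MySensor(SensorEntity):
    def __init__(self, mac: str) -> None:
        # format_mac normalizes case and separators so the ID stays stable
        self._attr_unique_id = f"{format_mac(mac)}_temperature"
```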
### Entity Descriptions
- **Lambda/Anonymous Functions**: Often used in EntityDescription for value transformation
- **Multiline Lambdas**: When lambdas exceed line length, wrap in parentheses for readability
- **Bad pattern**:
```python
SensorEntityDescription(
key="temperature",
name="Temperature",
value_fn=lambda data: round(data["temp_value"] * 1.8 + 32, 1) if data.get("temp_value") is not None else None, # ❌ Too long
)
```
- **Good pattern**:
```python
SensorEntityDescription(
key="temperature",
name="Temperature",
value_fn=lambda data: ( # ✅ Parenthesis on same line as lambda
round(data["temp_value"] * 1.8 + 32, 1)
if data.get("temp_value") is not None
else None
),
)
```
### Entity Naming
- **Use has_entity_name**: Set `_attr_has_entity_name = True`
- **For specific fields**:
```python
class MySensor(SensorEntity):
_attr_has_entity_name = True
def __init__(self, device: Device, field: str) -> None:
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, device.id)},
name=device.name,
)
self._attr_name = field # e.g., "temperature", "humidity"
```
- **For device itself**: Set `_attr_name = None`
### Event Lifecycle Management
- **Subscribe in `async_added_to_hass`**:
```python
async def async_added_to_hass(self) -> None:
"""Subscribe to events."""
self.async_on_remove(
self.client.events.subscribe("my_event", self._handle_event)
)
```
- **Unsubscribe in `async_will_remove_from_hass`** if not using `async_on_remove`
- Never subscribe in `__init__` or other methods
### State Handling
- Unknown values: Use `None` (not "unknown" or "unavailable")
- Availability: Implement `available()` property instead of using "unavailable" state
### Entity Availability
- **Mark Unavailable**: When data cannot be fetched from device/service
- **Coordinator Pattern**:
```python
@property
def available(self) -> bool:
"""Return if entity is available."""
return super().available and self.identifier in self.coordinator.data
```
- **Direct Update Pattern**:
```python
async def async_update(self) -> None:
"""Update entity."""
try:
data = await self.client.get_data()
except MyException:
self._attr_available = False
else:
self._attr_available = True
self._attr_native_value = data.value
```
### Extra State Attributes
- All attribute keys must always be present
- Unknown values: Use `None`
- Provide descriptive attributes
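- **Sketch** (illustrative): an `extra_state_attributes` property on a coordinator entity; the attribute names and `self.identifier` are placeholders:
```python
@property
def extra_state_attributes(self) -> dict[str, Any]:
    """Return extra attributes; keys are always present, unknown values are None."""
    data = self.coordinator.data.get(self.identifier)
    return {
        "last_reported": data.last_reported if data else None,
        "signal_strength": data.signal_strength if data else None,
    }
```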
## Device Management
### Device Registry
- **Create Devices**: Group related entities under devices
- **Device Info**: Provide comprehensive metadata:
```python
_attr_device_info = DeviceInfo(
connections={(CONNECTION_NETWORK_MAC, device.mac)},
identifiers={(DOMAIN, device.id)},
name=device.name,
manufacturer="My Company",
model="My Sensor",
sw_version=device.version,
)
```
- For services: Add `entry_type=DeviceEntryType.SERVICE`
### Dynamic Device Addition
- **Auto-detect New Devices**: After initial setup
- **Implementation Pattern**:
```python
def _check_device() -> None:
current_devices = set(coordinator.data)
new_devices = current_devices - known_devices
if new_devices:
known_devices.update(new_devices)
async_add_entities([MySensor(coordinator, device_id) for device_id in new_devices])
entry.async_on_unload(coordinator.async_add_listener(_check_device))
```
### Stale Device Removal
- **Auto-remove**: When devices disappear from hub/account
- **Device Registry Update**:
```python
device_registry.async_update_device(
device_id=device.id,
remove_config_entry_id=self.config_entry.entry_id,
)
```
- **Manual Deletion**: Implement `async_remove_config_entry_device` when needed
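- **Sketch** (illustrative): an `async_remove_config_entry_device` hook in `__init__.py` that only allows deleting devices the coordinator no longer reports:
```python
async def async_remove_config_entry_device(
    hass: HomeAssistant, config_entry: MyConfigEntry, device_entry: dr.DeviceEntry
) -> bool:
    """Allow manual removal only for devices that are no longer known."""
    return not any(
        identifier
        for identifier in device_entry.identifiers
        if identifier[0] == DOMAIN and identifier[1] in config_entry.runtime_data.data
    )
```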
### Entity Categories
- **Required**: Assign appropriate category to entities
- **Implementation**: Set `_attr_entity_category`
```python
class MySensor(SensorEntity):
_attr_entity_category = EntityCategory.DIAGNOSTIC
```
- Categories include: `DIAGNOSTIC` for system/technical information
### Device Classes
- **Use When Available**: Set appropriate device class for entity type
```python
class MyTemperatureSensor(SensorEntity):
_attr_device_class = SensorDeviceClass.TEMPERATURE
```
- Provides context for: unit conversion, voice control, UI representation
### Disabled by Default
- **Disable Noisy/Less Popular Entities**: Reduce resource usage
```python
class MySignalStrengthSensor(SensorEntity):
_attr_entity_registry_enabled_default = False
```
- Target: frequently changing states, technical diagnostics
### Entity Translations
- **Required with has_entity_name**: Support international users
- **Implementation**:
```python
class MySensor(SensorEntity):
_attr_has_entity_name = True
_attr_translation_key = "phase_voltage"
```
- Create `strings.json` with translations:
```json
{
"entity": {
"sensor": {
"phase_voltage": {
"name": "Phase voltage"
}
}
}
}
```
### Exception Translations (Gold)
- **Translatable Errors**: Use translation keys for user-facing exceptions
- **Implementation**:
```python
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="end_date_before_start_date",
)
```
- Add to `strings.json`:
```json
{
"exceptions": {
"end_date_before_start_date": {
"message": "The end date cannot be before the start date."
}
}
}
```
### Icon Translations (Gold)
- **Dynamic Icons**: Support state and range-based icon selection
- **State-based Icons**:
```json
{
"entity": {
"sensor": {
"tree_pollen": {
"default": "mdi:tree",
"state": {
"high": "mdi:tree-outline"
}
}
}
}
}
```
- **Range-based Icons** (for numeric values):
```json
{
"entity": {
"sensor": {
"battery_level": {
"default": "mdi:battery-unknown",
"range": {
"0": "mdi:battery-outline",
"90": "mdi:battery-90",
"100": "mdi:battery"
}
}
}
}
}
```
## Testing Requirements
- **Location**: `tests/components/{domain}/`
- **Coverage Requirement**: Above 95% test coverage for all modules
- **Best Practices**:
- Use pytest fixtures from `tests.common`
- Mock all external dependencies
- Use snapshots for complex data structures
- Follow existing test patterns
### Config Flow Testing
- **100% Coverage Required**: All config flow paths must be tested
- **Test Scenarios**:
- All flow initiation methods (user, discovery, import)
- Successful configuration paths
- Error recovery scenarios
- Prevention of duplicate entries
- Flow completion after errors
### Running Tests
- **Integration-specific tests** (recommended):
```bash
pytest ./tests/components/<integration_domain> \
--cov=homeassistant.components.<integration_domain> \
--cov-report term-missing \
--durations-min=1 \
--durations=0 \
--numprocesses=auto
```
### Testing Best Practices
- **Never access `hass.data` directly** - Use fixtures and proper integration setup instead
- **Use snapshot testing** - For verifying entity states and attributes
- **Test through integration setup** - Don't test entities in isolation
- **Mock external APIs** - Use fixtures with realistic JSON data
- **Verify registries** - Ensure entities are properly registered with devices
### Config Flow Testing Template
```python
async def test_user_flow_success(hass, mock_api):
"""Test successful user flow."""
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)
assert result["type"] == FlowResultType.FORM
assert result["step_id"] == "user"
# Test form submission
result = await hass.config_entries.flow.async_configure(
result["flow_id"], user_input=TEST_USER_INPUT
)
assert result["type"] == FlowResultType.CREATE_ENTRY
assert result["title"] == "My Device"
assert result["data"] == TEST_USER_INPUT
async def test_flow_connection_error(hass, mock_api_error):
"""Test connection error handling."""
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)
result = await hass.config_entries.flow.async_configure(
result["flow_id"], user_input=TEST_USER_INPUT
)
assert result["type"] == FlowResultType.FORM
assert result["errors"] == {"base": "cannot_connect"}
```
### Entity Testing Patterns
```python
@pytest.fixture
def platforms() -> list[Platform]:
"""Overridden fixture to specify platforms to test."""
return [Platform.SENSOR] # Or another specific platform as needed.
@pytest.mark.usefixtures("entity_registry_enabled_by_default", "init_integration")
async def test_entities(
hass: HomeAssistant,
snapshot: SnapshotAssertion,
entity_registry: er.EntityRegistry,
device_registry: dr.DeviceRegistry,
mock_config_entry: MockConfigEntry,
) -> None:
"""Test the sensor entities."""
await snapshot_platform(hass, entity_registry, snapshot, mock_config_entry.entry_id)
# Ensure entities are correctly assigned to device
device_entry = device_registry.async_get_device(
identifiers={(DOMAIN, "device_unique_id")}
)
assert device_entry
entity_entries = er.async_entries_for_config_entry(
entity_registry, mock_config_entry.entry_id
)
for entity_entry in entity_entries:
assert entity_entry.device_id == device_entry.id
```
### Mock Patterns
```python
# Modern integration fixture setup
@pytest.fixture
def mock_config_entry() -> MockConfigEntry:
"""Return the default mocked config entry."""
return MockConfigEntry(
title="My Integration",
domain=DOMAIN,
data={CONF_HOST: "127.0.0.1", CONF_API_KEY: "test_key"},
unique_id="device_unique_id",
)
@pytest.fixture
def mock_device_api() -> Generator[MagicMock]:
"""Return a mocked device API."""
with patch("homeassistant.components.my_integration.MyDeviceAPI", autospec=True) as api_mock:
api = api_mock.return_value
api.get_data.return_value = MyDeviceData.from_json(
load_fixture("device_data.json", DOMAIN)
)
yield api
@pytest.fixture
def platforms() -> list[Platform]:
"""Fixture to specify platforms to test."""
return PLATFORMS
@pytest.fixture
async def init_integration(
hass: HomeAssistant,
mock_config_entry: MockConfigEntry,
mock_device_api: MagicMock,
platforms: list[Platform],
) -> MockConfigEntry:
"""Set up the integration for testing."""
mock_config_entry.add_to_hass(hass)
with patch("homeassistant.components.my_integration.PLATFORMS", platforms):
await hass.config_entries.async_setup(mock_config_entry.entry_id)
await hass.async_block_till_done()
return mock_config_entry
```
## Debugging & Troubleshooting
### Common Issues & Solutions
- **Integration won't load**: Check `manifest.json` syntax and required fields
- **Entities not appearing**: Verify `unique_id` and `has_entity_name` implementation
- **Config flow errors**: Check `strings.json` entries and error handling
- **Discovery not working**: Verify manifest discovery configuration and callbacks
- **Tests failing**: Check mock setup and async context
### Debug Logging Setup
```python
# Enable debug logging in tests
caplog.set_level(logging.DEBUG, logger="homeassistant.components.my_integration")
# In integration code - use proper logging
_LOGGER = logging.getLogger(__name__)
_LOGGER.debug("Processing data: %s", data) # Use lazy logging
```
### Validation Commands
```bash
# Check specific integration
python -m script.hassfest --integration-path homeassistant/components/my_integration
# Validate quality scale
# Check quality_scale.yaml against current rules
# Run integration tests with coverage
pytest ./tests/components/my_integration \
--cov=homeassistant.components.my_integration \
--cov-report term-missing
```

View File

@@ -10,6 +10,7 @@ on:
env:
BUILD_TYPE: core
DEFAULT_PYTHON: "3.14.2"
PIP_TIMEOUT: 60
UV_HTTP_TIMEOUT: 60
UV_SYSTEM_PYTHON: "true"
@@ -35,17 +36,16 @@ jobs:
channel: ${{ steps.version.outputs.channel }}
publish: ${{ steps.version.outputs.publish }}
architectures: ${{ env.ARCHITECTURES }}
base_image_version: ${{ env.BASE_IMAGE_VERSION }}
steps:
- name: Checkout the repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
- name: Get information
id: info
@@ -75,9 +75,44 @@ jobs:
env:
LOKALISE_TOKEN: ${{ secrets.LOKALISE_TOKEN }}
- name: Archive translations
shell: bash
run: find ./homeassistant/components/*/translations -name "*.json" | tar zcvf translations.tar.gz -T -
- name: Upload translations
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: translations
path: translations.tar.gz
if-no-files-found: error
build_base:
name: Build ${{ matrix.arch }} base core image
if: github.repository_owner == 'home-assistant'
needs: init
runs-on: ${{ matrix.os }}
permissions:
contents: read # To check out the repository
packages: write # To push to GHCR
id-token: write # For cosign signing
strategy:
fail-fast: false
matrix:
arch: ${{ fromJson(needs.init.outputs.architectures) }}
include:
- arch: amd64
os: ubuntu-latest
- arch: aarch64
os: ubuntu-24.04-arm
steps:
- name: Checkout the repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Download nightly wheels of frontend
if: steps.version.outputs.channel == 'dev'
uses: dawidd6/action-download-artifact@2536c51d3d126276eb39f74d6bc9c72ac6ef30d3 # v16
if: needs.init.outputs.channel == 'dev'
uses: dawidd6/action-download-artifact@5c98f0b039f36ef966fdb7dfa9779262785ecb05 # v14
with:
github_token: ${{secrets.GITHUB_TOKEN}}
repo: home-assistant/frontend
@@ -87,8 +122,8 @@ jobs:
name: wheels
- name: Download nightly wheels of intents
if: steps.version.outputs.channel == 'dev'
uses: dawidd6/action-download-artifact@2536c51d3d126276eb39f74d6bc9c72ac6ef30d3 # v16
if: needs.init.outputs.channel == 'dev'
uses: dawidd6/action-download-artifact@5c98f0b039f36ef966fdb7dfa9779262785ecb05 # v14
with:
github_token: ${{secrets.GITHUB_TOKEN}}
repo: OHF-Voice/intents-package
@@ -97,12 +132,18 @@ jobs:
workflow_conclusion: success
name: package
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
if: needs.init.outputs.channel == 'dev'
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
- name: Adjust nightly version
if: steps.version.outputs.channel == 'dev'
if: needs.init.outputs.channel == 'dev'
shell: bash
env:
UV_PRERELEASE: allow
VERSION: ${{ steps.version.outputs.version }}
VERSION: ${{ needs.init.outputs.version }}
run: |
python3 -m pip install "$(grep '^uv' < requirements.txt)"
uv pip install packaging tomli
@@ -140,78 +181,98 @@ jobs:
sed -i "s|home-assistant-intents==.*||" requirements_all.txt requirements.txt
fi
- name: Download translations
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
name: translations
- name: Extract translations
run: |
tar xvf translations.tar.gz
rm translations.tar.gz
- name: Write meta info file
shell: bash
run: |
echo "${GITHUB_SHA};${GITHUB_REF};${GITHUB_EVENT_NAME};${GITHUB_ACTOR}" > rootfs/OFFICIAL_IMAGE
- name: Upload build context overlay
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
- name: Login to GitHub Container Registry
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
with:
name: build-context
if-no-files-found: ignore
path: |
homeassistant/components/*/translations/
rootfs/OFFICIAL_IMAGE
home_assistant_frontend-*.whl
home_assistant_intents-*.whl
homeassistant/const.py
homeassistant/components/frontend/manifest.json
homeassistant/components/conversation/manifest.json
homeassistant/package_constraints.txt
requirements_all.txt
requirements.txt
pyproject.toml
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
build_base:
name: Build ${{ matrix.arch }} base core image
if: github.repository_owner == 'home-assistant'
needs: init
runs-on: ${{ matrix.os }}
permissions:
contents: read # To check out the repository
packages: write # To push to GHCR
id-token: write # For cosign signing
strategy:
fail-fast: false
matrix:
include:
- arch: amd64
os: ubuntu-24.04
- arch: aarch64
os: ubuntu-24.04-arm
steps:
- name: Checkout the repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Install Cosign
uses: sigstore/cosign-installer@faadad0cce49287aee09b3a48701e75088a2c6ad # v4.0.0
with:
persist-credentials: false
cosign-release: "v2.5.3"
- name: Download build context overlay
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
with:
name: build-context
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Build variables
id: vars
shell: bash
env:
ARCH: ${{ matrix.arch }}
run: |
echo "base_image=ghcr.io/home-assistant/${ARCH}-homeassistant-base:${BASE_IMAGE_VERSION}" >> "$GITHUB_OUTPUT"
echo "cache_image=ghcr.io/home-assistant/${ARCH}-homeassistant:latest" >> "$GITHUB_OUTPUT"
echo "created=$(date --rfc-3339=seconds --utc)" >> "$GITHUB_OUTPUT"
- name: Verify base image signature
env:
BASE_IMAGE: ${{ steps.vars.outputs.base_image }}
run: |
cosign verify \
--certificate-oidc-issuer https://token.actions.githubusercontent.com \
--certificate-identity-regexp "https://github.com/home-assistant/docker/.*" \
"${BASE_IMAGE}"
- name: Verify cache image signature
id: cache
continue-on-error: true
env:
CACHE_IMAGE: ${{ steps.vars.outputs.cache_image }}
run: |
cosign verify \
--certificate-oidc-issuer https://token.actions.githubusercontent.com \
--certificate-identity-regexp "https://github.com/home-assistant/core/.*" \
"${CACHE_IMAGE}"
- name: Build base image
uses: home-assistant/builder/actions/build-image@gha-builder # zizmor: ignore[unpinned-uses]
id: build
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
with:
arch: ${{ matrix.arch }}
build-args: |
BUILD_FROM=ghcr.io/home-assistant/${{ matrix.arch }}-homeassistant-base:${{ needs.init.outputs.base_image_version }}
cache-gha: false
container-registry-password: ${{ secrets.GITHUB_TOKEN }}
context: .
cosign-base-identity: "https://github.com/home-assistant/docker/.*"
cosign-base-verify: ghcr.io/home-assistant/${{ matrix.arch }}-homeassistant-base:${{ needs.init.outputs.base_image_version }}
image: ghcr.io/home-assistant/${{ matrix.arch }}-homeassistant
image-tags: ${{ needs.init.outputs.version }}
file: ./Dockerfile
platforms: ${{ steps.vars.outputs.platform }}
push: true
version: ${{ needs.init.outputs.version }}
cache-from: ${{ steps.cache.outcome == 'success' && steps.vars.outputs.cache_image || '' }}
build-args: |
BUILD_FROM=${{ steps.vars.outputs.base_image }}
tags: ghcr.io/home-assistant/${{ matrix.arch }}-homeassistant:${{ needs.init.outputs.version }}
outputs: type=image,push=true,compression=zstd,compression-level=9,force-compression=true,oci-mediatypes=true
labels: |
io.hass.arch=${{ matrix.arch }}
io.hass.version=${{ needs.init.outputs.version }}
org.opencontainers.image.created=${{ steps.vars.outputs.created }}
org.opencontainers.image.version=${{ needs.init.outputs.version }}
- name: Sign image
env:
ARCH: ${{ matrix.arch }}
VERSION: ${{ needs.init.outputs.version }}
DIGEST: ${{ steps.build.outputs.digest }}
run: |
cosign sign --yes "ghcr.io/home-assistant/${ARCH}-homeassistant:${VERSION}@${DIGEST}"
build_machine:
name: Build ${{ matrix.machine }} machine core image
if: github.repository_owner == 'home-assistant'
needs: ["init", "build_base"]
runs-on: ${{ matrix.runs-on }}
runs-on: ubuntu-latest
permissions:
contents: read # To check out the repository
packages: write # To push to GHCR
@@ -233,59 +294,40 @@ jobs:
- raspberrypi5-64
- yellow
- green
include:
# Default: aarch64 on native ARM runner
- arch: aarch64
runs-on: ubuntu-24.04-arm
# Overrides for amd64 machines
- machine: generic-x86-64
arch: amd64
runs-on: ubuntu-24.04
- machine: qemux86-64
arch: amd64
runs-on: ubuntu-24.04
# TODO: remove, intel-nuc is a legacy name for x86-64, renamed in 2021
- machine: intel-nuc
arch: amd64
runs-on: ubuntu-24.04
steps:
- name: Checkout the repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Compute extra tags
id: tags
shell: bash
- name: Set build additional args
env:
VERSION: ${{ needs.init.outputs.version }}
run: |
# Create general tags
if [[ "${VERSION}" =~ d ]]; then
echo "extra_tags=dev" >> "$GITHUB_OUTPUT"
echo "BUILD_ARGS=--additional-tag dev" >> $GITHUB_ENV
elif [[ "${VERSION}" =~ b ]]; then
echo "extra_tags=beta" >> "$GITHUB_OUTPUT"
echo "BUILD_ARGS=--additional-tag beta" >> $GITHUB_ENV
else
echo "extra_tags=stable" >> "$GITHUB_OUTPUT"
echo "BUILD_ARGS=--additional-tag stable" >> $GITHUB_ENV
fi
- name: Build machine image
uses: home-assistant/builder/actions/build-image@gha-builder # zizmor: ignore[unpinned-uses]
- name: Login to GitHub Container Registry
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
with:
arch: ${{ matrix.arch }}
build-args: |
BUILD_FROM=ghcr.io/home-assistant/${{ matrix.arch }}-homeassistant:${{ needs.init.outputs.version }}
cache-gha: false
container-registry-password: ${{ secrets.GITHUB_TOKEN }}
context: machine/
cosign-base-identity: "https://github.com/home-assistant/core/.*"
cosign-base-verify: ghcr.io/home-assistant/${{ matrix.arch }}-homeassistant:${{ needs.init.outputs.version }}
file: machine/${{ matrix.machine }}
image: ghcr.io/home-assistant/${{ matrix.machine }}-homeassistant
image-tags: |
${{ needs.init.outputs.version }}
${{ steps.tags.outputs.extra_tags }}
push: true
version: ${{ needs.init.outputs.version }}
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build base image
uses: home-assistant/builder@2025.11.0 # zizmor: ignore[unpinned-uses]
with:
args: |
$BUILD_ARGS \
--target /data/machine \
--cosign \
--machine "${{ needs.init.outputs.version }}=${{ matrix.machine }}"
publish_ha:
name: Publish version files
@@ -480,15 +522,20 @@ jobs:
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
- name: Download build context overlay
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
- name: Download translations
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
name: build-context
name: translations
- name: Extract translations
run: |
tar xvf translations.tar.gz
rm translations.tar.gz
- name: Build package
shell: bash

View File

@@ -40,8 +40,9 @@ env:
CACHE_VERSION: 3
UV_CACHE_VERSION: 1
MYPY_CACHE_VERSION: 1
HA_SHORT_VERSION: "2026.4"
ADDITIONAL_PYTHON_VERSIONS: "[]"
HA_SHORT_VERSION: "2026.3"
DEFAULT_PYTHON: "3.14.2"
ALL_PYTHON_VERSIONS: "['3.14.2']"
# 10.3 is the oldest supported version
# - 10.3.32 is the version currently shipped with Synology (as of 17 Feb 2022)
# 10.6 is the current long-term-support
@@ -165,11 +166,6 @@ jobs:
tests_glob=""
lint_only=""
skip_coverage=""
default_python=$(cat .python-version)
all_python_versions=$(jq -cn \
--arg default_python "${default_python}" \
--argjson additional_python_versions "${ADDITIONAL_PYTHON_VERSIONS}" \
'[$default_python] + $additional_python_versions')
if [[ "${INTEGRATION_CHANGES}" != "[]" ]];
then
@@ -239,8 +235,8 @@ jobs:
echo "mariadb_groups=${mariadb_groups}" >> $GITHUB_OUTPUT
echo "postgresql_groups: ${postgresql_groups}"
echo "postgresql_groups=${postgresql_groups}" >> $GITHUB_OUTPUT
echo "python_versions: ${all_python_versions}"
echo "python_versions=${all_python_versions}" >> $GITHUB_OUTPUT
echo "python_versions: ${ALL_PYTHON_VERSIONS}"
echo "python_versions=${ALL_PYTHON_VERSIONS}" >> $GITHUB_OUTPUT
echo "test_full_suite: ${test_full_suite}"
echo "test_full_suite=${test_full_suite}" >> $GITHUB_OUTPUT
echo "integrations_glob: ${integrations_glob}"
@@ -456,7 +452,7 @@ jobs:
python --version
uv pip freeze >> pip_freeze.txt
- name: Upload pip_freeze artifact
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: pip-freeze-${{ matrix.python-version }}
path: pip_freeze.txt
@@ -507,13 +503,13 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore full Python virtual environment
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
@@ -544,13 +540,13 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore full Python virtual environment
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
@@ -580,11 +576,11 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Run gen_copilot_instructions.py
run: |
@@ -609,7 +605,7 @@ jobs:
with:
persist-credentials: false
- name: Dependency review
uses: actions/dependency-review-action@05fe4576374b728f0c523d6a13d64c25081e0803 # v4.8.3
uses: actions/dependency-review-action@3c4e3dcb1aa7874d2c16be7d79418e9b7efd6261 # v4.8.2
with:
license-check: false # We use our own license audit checks
@@ -657,7 +653,7 @@ jobs:
. venv/bin/activate
python -m script.licenses extract --output-file=licenses-${PYTHON_VERSION}.json
- name: Upload licenses
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: licenses-${{ github.run_number }}-${{ matrix.python-version }}
path: licenses-${{ matrix.python-version }}.json
@@ -686,13 +682,13 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore full Python virtual environment
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
@@ -739,13 +735,13 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore full Python virtual environment
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
@@ -790,11 +786,11 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Generate partial mypy restore key
id: generate-mypy-key
@@ -802,7 +798,7 @@ jobs:
mypy_version=$(cat requirements_test.txt | grep 'mypy.*=' | cut -d '=' -f 3)
echo "version=${mypy_version}" >> $GITHUB_OUTPUT
echo "key=mypy-${MYPY_CACHE_VERSION}-${mypy_version}-${HA_SHORT_VERSION}-$(date -u '+%Y-%m-%dT%H:%M:%s')" >> $GITHUB_OUTPUT
- name: Restore full Python virtual environment
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
@@ -883,13 +879,13 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore full Python virtual environment
- name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
@@ -905,7 +901,7 @@ jobs:
. venv/bin/activate
python -m script.split_tests ${TEST_GROUP_COUNT} tests
- name: Upload pytest_buckets
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: pytest_buckets
path: pytest_buckets.txt
@@ -982,7 +978,7 @@ jobs:
run: |
echo "::add-matcher::.github/workflows/matchers/pytest-slow.json"
- name: Download pytest_buckets
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
name: pytest_buckets
- name: Compile English translations
@@ -1024,14 +1020,14 @@ jobs:
2>&1 | tee pytest-${PYTHON_VERSION}-${TEST_GROUP}.txt
- name: Upload pytest output
if: success() || failure() && steps.pytest-full.conclusion == 'failure'
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{ matrix.group }}
path: pytest-*.txt
overwrite: true
- name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true'
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: coverage-${{ matrix.python-version }}-${{ matrix.group }}
path: coverage.xml
@@ -1044,7 +1040,7 @@ jobs:
mv "junit.xml-tmp" "junit.xml"
- name: Upload test results artifact
if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: test-results-full-${{ matrix.python-version }}-${{ matrix.group }}
path: junit.xml
@@ -1181,7 +1177,7 @@ jobs:
2>&1 | tee pytest-${PYTHON_VERSION}-${mariadb}.txt
- name: Upload pytest output
if: success() || failure() && steps.pytest-partial.conclusion == 'failure'
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.mariadb }}
@@ -1189,7 +1185,7 @@ jobs:
overwrite: true
- name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true'
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: coverage-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.mariadb }}
@@ -1203,7 +1199,7 @@ jobs:
mv "junit.xml-tmp" "junit.xml"
- name: Upload test results artifact
if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: test-results-mariadb-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.mariadb }}
@@ -1342,7 +1338,7 @@ jobs:
2>&1 | tee pytest-${PYTHON_VERSION}-${postgresql}.txt
- name: Upload pytest output
if: success() || failure() && steps.pytest-partial.conclusion == 'failure'
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.postgresql }}
@@ -1350,7 +1346,7 @@ jobs:
overwrite: true
- name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true'
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: coverage-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.postgresql }}
@@ -1364,7 +1360,7 @@ jobs:
mv "junit.xml-tmp" "junit.xml"
- name: Upload test results artifact
if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: test-results-postgres-${{ matrix.python-version }}-${{
steps.pytest-partial.outputs.postgresql }}
@@ -1391,7 +1387,7 @@ jobs:
with:
persist-credentials: false
- name: Download all coverage artifacts
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
pattern: coverage-*
- name: Upload coverage to Codecov
@@ -1518,14 +1514,14 @@ jobs:
2>&1 | tee pytest-${PYTHON_VERSION}-${TEST_GROUP}.txt
- name: Upload pytest output
if: success() || failure() && steps.pytest-partial.conclusion == 'failure'
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: pytest-${{ github.run_number }}-${{ matrix.python-version }}-${{ matrix.group }}
path: pytest-*.txt
overwrite: true
- name: Upload coverage artifact
if: needs.info.outputs.skip_coverage != 'true'
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: coverage-${{ matrix.python-version }}-${{ matrix.group }}
path: coverage.xml
@@ -1538,7 +1534,7 @@ jobs:
mv "junit.xml-tmp" "junit.xml"
- name: Upload test results artifact
if: needs.info.outputs.skip_coverage != 'true' && !cancelled()
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: test-results-partial-${{ matrix.python-version }}-${{ matrix.group }}
path: junit.xml
@@ -1562,7 +1558,7 @@ jobs:
with:
persist-credentials: false
- name: Download all coverage artifacts
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
pattern: coverage-*
- name: Upload coverage to Codecov
@@ -1591,7 +1587,7 @@ jobs:
&& needs.info.outputs.skip_coverage != 'true' && !cancelled()
steps:
- name: Download all coverage artifacts
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
pattern: test-results-*
- name: Upload test results to Codecov

View File

@@ -28,11 +28,11 @@ jobs:
persist-credentials: false
- name: Initialize CodeQL
uses: github/codeql-action/init@89a39a4e59826350b863aa6b6252a07ad50cf83e # v4.32.4
uses: github/codeql-action/init@9e907b5e64f6b83e7804b09294d44122997950d6 # v4.32.3
with:
languages: python
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@89a39a4e59826350b863aa6b6252a07ad50cf83e # v4.32.4
uses: github/codeql-action/analyze@9e907b5e64f6b83e7804b09294d44122997950d6 # v4.32.3
with:
category: "/language:python"

View File

@@ -236,7 +236,7 @@ jobs:
- name: Detect duplicates using AI
id: ai_detection
if: steps.extract.outputs.should_continue == 'true' && steps.fetch_similar.outputs.has_similar == 'true'
uses: actions/ai-inference@e09e65981758de8b2fdab13c2bfb7c7d5493b0b6 # v2.0.7
uses: actions/ai-inference@a380166897b5408b8fb7dddd148142794cb5624a # v2.0.6
with:
model: openai/gpt-4o
system-prompt: |

View File

@@ -62,7 +62,7 @@ jobs:
- name: Detect language using AI
id: ai_language_detection
if: steps.detect_language.outputs.should_continue == 'true'
uses: actions/ai-inference@e09e65981758de8b2fdab13c2bfb7c7d5493b0b6 # v2.0.7
uses: actions/ai-inference@a380166897b5408b8fb7dddd148142794cb5624a # v2.0.6
with:
model: openai/gpt-4o-mini
system-prompt: |

View File

@@ -15,6 +15,9 @@ concurrency:
group: ${{ github.workflow }}
cancel-in-progress: true
env:
DEFAULT_PYTHON: "3.14.2"
jobs:
upload:
name: Upload
@@ -26,10 +29,10 @@ jobs:
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
- name: Upload Translations
env:

View File

@@ -16,6 +16,9 @@ on:
- "requirements.txt"
- "script/gen_requirements_all.py"
env:
DEFAULT_PYTHON: "3.14.2"
permissions: {}
concurrency:
@@ -33,11 +36,11 @@ jobs:
with:
persist-credentials: false
- name: Set up Python
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
with:
python-version-file: ".python-version"
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Create Python virtual environment
@@ -74,7 +77,7 @@ jobs:
) > .env_file
- name: Upload env_file
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: env_file
path: ./.env_file
@@ -82,7 +85,7 @@ jobs:
overwrite: true
- name: Upload requirements_diff
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: requirements_diff
path: ./requirements_diff.txt
@@ -94,7 +97,7 @@ jobs:
python -m script.gen_requirements_all ci
- name: Upload requirements_all_wheels
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: requirements_all_wheels
path: ./requirements_all_wheels_*.txt
@@ -107,7 +110,7 @@ jobs:
strategy:
fail-fast: false
matrix:
abi: ["cp314"]
abi: ["cp313", "cp314"]
arch: ["amd64", "aarch64"]
include:
- arch: amd64
@@ -121,12 +124,12 @@ jobs:
persist-credentials: false
- name: Download env_file
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
name: env_file
- name: Download requirements_diff
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
name: requirements_diff
@@ -158,7 +161,7 @@ jobs:
strategy:
fail-fast: false
matrix:
abi: ["cp314"]
abi: ["cp313", "cp314"]
arch: ["amd64", "aarch64"]
include:
- arch: amd64
@@ -172,17 +175,17 @@ jobs:
persist-credentials: false
- name: Download env_file
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
name: env_file
- name: Download requirements_diff
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
name: requirements_diff
- name: Download requirements_all_wheels
uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
with:
name: requirements_all_wheels
@@ -206,4 +209,4 @@ jobs:
skip-binary: aiohttp;charset-normalizer;grpcio;multidict;SQLAlchemy;propcache;protobuf;pymicro-vad;yarl
constraints: "homeassistant/package_constraints.txt"
requirements-diff: "requirements_diff.txt"
requirements: "requirements_all_wheels_${{ matrix.arch }}.txt"
requirements: "requirements_all.txt"
View File
@@ -1 +1 @@
3.14.2
3.14
View File
@@ -289,7 +289,6 @@ homeassistant.components.imgw_pib.*
homeassistant.components.immich.*
homeassistant.components.incomfort.*
homeassistant.components.inels.*
homeassistant.components.infrared.*
homeassistant.components.input_button.*
homeassistant.components.input_select.*
homeassistant.components.input_text.*
@@ -545,7 +544,6 @@ homeassistant.components.tcp.*
homeassistant.components.technove.*
homeassistant.components.tedee.*
homeassistant.components.telegram_bot.*
homeassistant.components.teslemetry.*
homeassistant.components.text.*
homeassistant.components.thethingsnetwork.*
homeassistant.components.threshold.*
CODEOWNERS generated
View File
@@ -242,8 +242,6 @@ build.json @home-assistant/supervisor
/tests/components/bosch_alarm/ @mag1024 @sanjay900
/homeassistant/components/bosch_shc/ @tschamm
/tests/components/bosch_shc/ @tschamm
/homeassistant/components/brands/ @home-assistant/core
/tests/components/brands/ @home-assistant/core
/homeassistant/components/braviatv/ @bieniu @Drafteed
/tests/components/braviatv/ @bieniu @Drafteed
/homeassistant/components/bring/ @miaucl @tr4nt0r
@@ -401,6 +399,8 @@ build.json @home-assistant/supervisor
/tests/components/dsmr_reader/ @sorted-bits @glodenox @erwindouna
/homeassistant/components/duckdns/ @tr4nt0r
/tests/components/duckdns/ @tr4nt0r
/homeassistant/components/duke_energy/ @hunterjm
/tests/components/duke_energy/ @hunterjm
/homeassistant/components/duotecno/ @cereal2nd
/tests/components/duotecno/ @cereal2nd
/homeassistant/components/dwd_weather_warnings/ @runningman84 @stephan192
@@ -717,8 +717,8 @@ build.json @home-assistant/supervisor
/tests/components/homematic/ @pvizeli
/homeassistant/components/homematicip_cloud/ @hahn-th @lackas
/tests/components/homematicip_cloud/ @hahn-th @lackas
/homeassistant/components/homevolt/ @danielhiversen @liudger
/tests/components/homevolt/ @danielhiversen @liudger
/homeassistant/components/homevolt/ @danielhiversen
/tests/components/homevolt/ @danielhiversen
/homeassistant/components/homewizard/ @DCSBL
/tests/components/homewizard/ @DCSBL
/homeassistant/components/honeywell/ @rdfurman @mkmer
@@ -792,8 +792,6 @@ build.json @home-assistant/supervisor
/tests/components/inels/ @epdevlab
/homeassistant/components/influxdb/ @mdegat01 @Robbie1221
/tests/components/influxdb/ @mdegat01 @Robbie1221
/homeassistant/components/infrared/ @home-assistant/core
/tests/components/infrared/ @home-assistant/core
/homeassistant/components/inkbird/ @bdraco
/tests/components/inkbird/ @bdraco
/homeassistant/components/input_boolean/ @home-assistant/core
@@ -1899,8 +1897,8 @@ build.json @home-assistant/supervisor
/tests/components/withings/ @joostlek
/homeassistant/components/wiz/ @sbidy @arturpragacz
/tests/components/wiz/ @sbidy @arturpragacz
/homeassistant/components/wled/ @frenck @mik-laj
/tests/components/wled/ @frenck @mik-laj
/homeassistant/components/wled/ @frenck
/tests/components/wled/ @frenck
/homeassistant/components/wmspro/ @mback2k
/tests/components/wmspro/ @mback2k
/homeassistant/components/wolflink/ @adamkrol93 @mtielen
@@ -1968,7 +1966,6 @@ build.json @home-assistant/supervisor
/homeassistant/components/zone/ @home-assistant/core
/tests/components/zone/ @home-assistant/core
/homeassistant/components/zoneminder/ @rohankapoorcom @nabbi
/tests/components/zoneminder/ @rohankapoorcom @nabbi
/homeassistant/components/zwave_js/ @home-assistant/z-wave
/tests/components/zwave_js/ @home-assistant/z-wave
/homeassistant/components/zwave_me/ @lawfulchaos @Z-Wave-Me @PoltoS
Dockerfile generated
View File
@@ -1,9 +1,19 @@
# Automatically generated by hassfest.
#
# To update, run python3 -m script.hassfest -p docker
ARG BUILD_FROM=ghcr.io/home-assistant/amd64-homeassistant-base:latest
ARG BUILD_FROM
FROM ${BUILD_FROM}
LABEL \
io.hass.type="core" \
org.opencontainers.image.authors="The Home Assistant Authors" \
org.opencontainers.image.description="Open-source home automation platform running on Python 3" \
org.opencontainers.image.documentation="https://www.home-assistant.io/docs/" \
org.opencontainers.image.licenses="Apache-2.0" \
org.opencontainers.image.source="https://github.com/home-assistant/core" \
org.opencontainers.image.title="Home Assistant" \
org.opencontainers.image.url="https://www.home-assistant.io/"
# Synchronize with homeassistant/core.py:async_stop
ENV \
S6_SERVICES_GRACETIME=240000 \
@@ -20,7 +30,7 @@ RUN \
# Verify go2rtc can be executed
go2rtc --version \
# Install uv
&& pip3 install uv==0.10.6
&& pip3 install uv==0.9.26
WORKDIR /usr/src
@@ -50,22 +60,3 @@ RUN \
homeassistant/homeassistant
WORKDIR /config
ARG BUILD_ARCH=amd64
ARG BUILD_DATE="1970-01-01 00:00:00+00:00"
ARG BUILD_REPOSITORY
ARG BUILD_VERSION=0.0.0-local
LABEL \
io.hass.type="core" \
io.hass.arch="${BUILD_ARCH}" \
io.hass.version="${BUILD_VERSION}" \
org.opencontainers.image.created="${BUILD_DATE}" \
org.opencontainers.image.version="${BUILD_VERSION}" \
org.opencontainers.image.source="${BUILD_REPOSITORY}" \
org.opencontainers.image.authors="The Home Assistant Authors" \
org.opencontainers.image.description="Open-source home automation platform running on Python 3" \
org.opencontainers.image.documentation="https://www.home-assistant.io/docs/" \
org.opencontainers.image.licenses="Apache-2.0" \
org.opencontainers.image.title="Home Assistant" \
org.opencontainers.image.url="https://www.home-assistant.io/"
View File
@@ -10,7 +10,6 @@ coverage:
target: auto
threshold: 1
paths:
- homeassistant/components/*/backup.py
- homeassistant/components/*/config_flow.py
- homeassistant/components/*/device_action.py
- homeassistant/components/*/device_condition.py
@@ -29,7 +28,6 @@ coverage:
target: 100
threshold: 0
paths:
- homeassistant/components/*/backup.py
- homeassistant/components/*/config_flow.py
- homeassistant/components/*/device_action.py
- homeassistant/components/*/device_condition.py
View File
@@ -70,7 +70,7 @@ from .const import (
SIGNAL_BOOTSTRAP_INTEGRATIONS,
)
from .core_config import async_process_ha_core_config
from .exceptions import HomeAssistantError, UnsupportedStorageVersionError
from .exceptions import HomeAssistantError
from .helpers import (
area_registry,
category_registry,
@@ -210,7 +210,6 @@ DEFAULT_INTEGRATIONS = {
"analytics", # Needed for onboarding
"application_credentials",
"backup",
"brands",
"frontend",
"hardware",
"labs",
@@ -239,8 +238,6 @@ DEFAULT_INTEGRATIONS = {
}
DEFAULT_INTEGRATIONS_RECOVERY_MODE = {
# These integrations are set up if recovery mode is activated.
"backup",
"cloud",
"frontend",
}
DEFAULT_INTEGRATIONS_SUPERVISOR = {
@@ -435,56 +432,32 @@ def _init_blocking_io_modules_in_executor() -> None:
is_docker_env()
async def async_load_base_functionality(hass: core.HomeAssistant) -> bool:
"""Load the registries and modules that will do blocking I/O.
Return whether loading succeeded.
"""
async def async_load_base_functionality(hass: core.HomeAssistant) -> None:
"""Load the registries and modules that will do blocking I/O."""
if DATA_REGISTRIES_LOADED in hass.data:
return True
return
hass.data[DATA_REGISTRIES_LOADED] = None
entity.async_setup(hass)
frame.async_setup(hass)
template.async_setup(hass)
translation.async_setup(hass)
recovery = hass.config.recovery_mode
try:
await asyncio.gather(
create_eager_task(get_internal_store_manager(hass).async_initialize()),
create_eager_task(area_registry.async_load(hass, load_empty=recovery)),
create_eager_task(category_registry.async_load(hass, load_empty=recovery)),
create_eager_task(device_registry.async_load(hass, load_empty=recovery)),
create_eager_task(entity_registry.async_load(hass, load_empty=recovery)),
create_eager_task(floor_registry.async_load(hass, load_empty=recovery)),
create_eager_task(issue_registry.async_load(hass, load_empty=recovery)),
create_eager_task(label_registry.async_load(hass, load_empty=recovery)),
hass.async_add_executor_job(_init_blocking_io_modules_in_executor),
create_eager_task(template.async_load_custom_templates(hass)),
create_eager_task(restore_state.async_load(hass, load_empty=recovery)),
create_eager_task(hass.config_entries.async_initialize()),
create_eager_task(async_get_system_info(hass)),
create_eager_task(condition.async_setup(hass)),
create_eager_task(trigger.async_setup(hass)),
)
except UnsupportedStorageVersionError as err:
# If we're already in recovery mode, we don't want to handle the exception
# and activate recovery mode again, as that would lead to an infinite loop.
if recovery:
raise
_LOGGER.error(
"Storage file %s was created by a newer version of Home Assistant"
" (storage version %s > %s); activating recovery mode; on-disk data"
" is preserved; upgrade Home Assistant or restore from a backup",
err.storage_key,
err.found_version,
err.max_supported_version,
)
return False
return True
await asyncio.gather(
create_eager_task(get_internal_store_manager(hass).async_initialize()),
create_eager_task(area_registry.async_load(hass)),
create_eager_task(category_registry.async_load(hass)),
create_eager_task(device_registry.async_load(hass)),
create_eager_task(entity_registry.async_load(hass)),
create_eager_task(floor_registry.async_load(hass)),
create_eager_task(issue_registry.async_load(hass)),
create_eager_task(label_registry.async_load(hass)),
hass.async_add_executor_job(_init_blocking_io_modules_in_executor),
create_eager_task(template.async_load_custom_templates(hass)),
create_eager_task(restore_state.async_load(hass)),
create_eager_task(hass.config_entries.async_initialize()),
create_eager_task(async_get_system_info(hass)),
create_eager_task(condition.async_setup(hass)),
create_eager_task(trigger.async_setup(hass)),
)
async def async_from_config_dict(
@@ -501,9 +474,7 @@ async def async_from_config_dict(
# Prime custom component cache early so we know if registry entries are tied
# to a custom integration
await loader.async_get_custom_components(hass)
if not await async_load_base_functionality(hass):
return None
await async_load_base_functionality(hass)
# Set up core.
_LOGGER.debug("Setting up %s", CORE_INTEGRATIONS)
View File
@@ -1,5 +0,0 @@
{
"domain": "ubisys",
"name": "Ubisys",
"iot_standards": ["zigbee"]
}
View File
@@ -12,6 +12,10 @@ from homeassistant.helpers.dispatcher import dispatcher_send
from .const import DOMAIN, DOMAIN_DATA, LOGGER
SERVICE_SETTINGS = "change_setting"
SERVICE_CAPTURE_IMAGE = "capture_image"
SERVICE_TRIGGER_AUTOMATION = "trigger_automation"
ATTR_SETTING = "setting"
ATTR_VALUE = "value"
@@ -71,13 +75,16 @@ def async_setup_services(hass: HomeAssistant) -> None:
"""Home Assistant services."""
hass.services.async_register(
DOMAIN, "change_setting", _change_setting, schema=CHANGE_SETTING_SCHEMA
DOMAIN, SERVICE_SETTINGS, _change_setting, schema=CHANGE_SETTING_SCHEMA
)
hass.services.async_register(
DOMAIN, "capture_image", _capture_image, schema=CAPTURE_IMAGE_SCHEMA
DOMAIN, SERVICE_CAPTURE_IMAGE, _capture_image, schema=CAPTURE_IMAGE_SCHEMA
)
hass.services.async_register(
DOMAIN, "trigger_automation", _trigger_automation, schema=AUTOMATION_SCHEMA
DOMAIN,
SERVICE_TRIGGER_AUTOMATION,
_trigger_automation,
schema=AUTOMATION_SCHEMA,
)
View File
@@ -7,5 +7,5 @@
"integration_type": "service",
"iot_class": "cloud_polling",
"loggers": ["accuweather"],
"requirements": ["accuweather==5.1.0"]
"requirements": ["accuweather==5.0.0"]
}
View File
@@ -30,8 +30,6 @@ async def system_health_info(hass: HomeAssistant) -> dict[str, Any]:
)
return {
"can_reach_server": system_health.async_check_can_reach_url(
hass, str(ENDPOINT)
),
"can_reach_server": system_health.async_check_can_reach_url(hass, ENDPOINT),
"remaining_requests": remaining_requests,
}
View File
@@ -191,7 +191,7 @@ class AccuWeatherEntity(
{
ATTR_FORECAST_TIME: utc_from_timestamp(item["EpochDate"]).isoformat(),
ATTR_FORECAST_CLOUD_COVERAGE: item["CloudCoverDay"],
ATTR_FORECAST_HUMIDITY: item["RelativeHumidityDay"].get("Average"),
ATTR_FORECAST_HUMIDITY: item["RelativeHumidityDay"]["Average"],
ATTR_FORECAST_NATIVE_TEMP: item["TemperatureMax"][ATTR_VALUE],
ATTR_FORECAST_NATIVE_TEMP_LOW: item["TemperatureMin"][ATTR_VALUE],
ATTR_FORECAST_NATIVE_APPARENT_TEMP: item["RealFeelTemperatureMax"][
View File
@@ -10,6 +10,8 @@ from homeassistant.helpers import config_validation as cv, service
from .const import DOMAIN
ADVANTAGE_AIR_SERVICE_SET_TIME_TO = "set_time_to"
@callback
def async_setup_services(hass: HomeAssistant) -> None:
@@ -18,7 +20,7 @@ def async_setup_services(hass: HomeAssistant) -> None:
service.async_register_platform_entity_service(
hass,
DOMAIN,
"set_time_to",
ADVANTAGE_AIR_SERVICE_SET_TIME_TO,
entity_domain=SENSOR_DOMAIN,
schema={vol.Required("minutes"): cv.positive_int},
func="set_time_to",
View File
@@ -8,12 +8,18 @@ from homeassistant.helpers import service
from .const import DOMAIN
_DEV_EN_ALT = "enable_alerts"
_DEV_DS_ALT = "disable_alerts"
_DEV_EN_REC = "start_recording"
_DEV_DS_REC = "stop_recording"
_DEV_SNAP = "snapshot"
CAMERA_SERVICES = {
"enable_alerts": "async_enable_alerts",
"disable_alerts": "async_disable_alerts",
"start_recording": "async_start_recording",
"stop_recording": "async_stop_recording",
"snapshot": "async_snapshot",
_DEV_EN_ALT: "async_enable_alerts",
_DEV_DS_ALT: "async_disable_alerts",
_DEV_EN_REC: "async_start_recording",
_DEV_DS_REC: "async_stop_recording",
_DEV_SNAP: "async_snapshot",
}
View File
@@ -93,6 +93,7 @@ class AirobotNumber(AirobotEntity, NumberEntity):
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="set_value_failed",
translation_placeholders={"error": str(err)},
) from err
else:
await self.coordinator.async_request_refresh()
View File
@@ -112,7 +112,7 @@
"message": "Failed to set temperature to {temperature}."
},
"set_value_failed": {
"message": "Failed to set value."
"message": "Failed to set value: {error}"
},
"switch_turn_off_failed": {
"message": "Failed to turn off {switch}."
View File
@@ -4,16 +4,7 @@ from __future__ import annotations
import logging
from airos.airos6 import AirOS6
from airos.airos8 import AirOS8
from airos.exceptions import (
AirOSConnectionAuthenticationError,
AirOSConnectionSetupError,
AirOSDataMissingError,
AirOSDeviceConnectionError,
AirOSKeyDataMissingError,
)
from airos.helpers import DetectDeviceData, async_get_firmware_data
from homeassistant.const import (
CONF_HOST,
@@ -24,11 +15,6 @@ from homeassistant.const import (
Platform,
)
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import (
ConfigEntryAuthFailed,
ConfigEntryError,
ConfigEntryNotReady,
)
from homeassistant.helpers import device_registry as dr, entity_registry as er
from homeassistant.helpers.aiohttp_client import async_get_clientsession
@@ -53,40 +39,15 @@ async def async_setup_entry(hass: HomeAssistant, entry: AirOSConfigEntry) -> boo
hass, verify_ssl=entry.data[SECTION_ADVANCED_SETTINGS][CONF_VERIFY_SSL]
)
conn_data = {
CONF_HOST: entry.data[CONF_HOST],
CONF_USERNAME: entry.data[CONF_USERNAME],
CONF_PASSWORD: entry.data[CONF_PASSWORD],
"use_ssl": entry.data[SECTION_ADVANCED_SETTINGS][CONF_SSL],
"session": session,
}
# Determine firmware version before creating the device instance
try:
device_data: DetectDeviceData = await async_get_firmware_data(**conn_data)
except (
AirOSConnectionSetupError,
AirOSDeviceConnectionError,
TimeoutError,
) as err:
raise ConfigEntryNotReady from err
except (
AirOSConnectionAuthenticationError,
AirOSDataMissingError,
) as err:
raise ConfigEntryAuthFailed from err
except AirOSKeyDataMissingError as err:
raise ConfigEntryError("key_data_missing") from err
except Exception as err:
raise ConfigEntryError("unknown") from err
airos_class: type[AirOS8 | AirOS6] = (
AirOS8 if device_data["fw_major"] == 8 else AirOS6
airos_device = AirOS8(
host=entry.data[CONF_HOST],
username=entry.data[CONF_USERNAME],
password=entry.data[CONF_PASSWORD],
session=session,
use_ssl=entry.data[SECTION_ADVANCED_SETTINGS][CONF_SSL],
)
airos_device = airos_class(**conn_data)
coordinator = AirOSDataUpdateCoordinator(hass, entry, device_data, airos_device)
coordinator = AirOSDataUpdateCoordinator(hass, entry, airos_device)
await coordinator.async_config_entry_first_refresh()
entry.runtime_data = coordinator
View File
@@ -4,9 +4,7 @@ from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass
from typing import Generic, TypeVar
from airos.data import AirOSDataBaseClass
import logging
from homeassistant.components.binary_sensor import (
BinarySensorDeviceClass,
@@ -20,24 +18,25 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import AirOS8Data, AirOSConfigEntry, AirOSDataUpdateCoordinator
from .entity import AirOSEntity
PARALLEL_UPDATES = 0
_LOGGER = logging.getLogger(__name__)
AirOSDataModel = TypeVar("AirOSDataModel", bound=AirOSDataBaseClass)
PARALLEL_UPDATES = 0
@dataclass(frozen=True, kw_only=True)
class AirOSBinarySensorEntityDescription(
BinarySensorEntityDescription,
Generic[AirOSDataModel],
):
class AirOSBinarySensorEntityDescription(BinarySensorEntityDescription):
"""Describe an AirOS binary sensor."""
value_fn: Callable[[AirOSDataModel], bool]
value_fn: Callable[[AirOS8Data], bool]
AirOS8BinarySensorEntityDescription = AirOSBinarySensorEntityDescription[AirOS8Data]
COMMON_BINARY_SENSORS: tuple[AirOSBinarySensorEntityDescription, ...] = (
BINARY_SENSORS: tuple[AirOSBinarySensorEntityDescription, ...] = (
AirOSBinarySensorEntityDescription(
key="portfw",
translation_key="port_forwarding",
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=lambda data: data.portfw,
),
AirOSBinarySensorEntityDescription(
key="dhcp_client",
translation_key="dhcp_client",
@@ -54,23 +53,6 @@ COMMON_BINARY_SENSORS: tuple[AirOSBinarySensorEntityDescription, ...] = (
entity_registry_enabled_default=False,
),
AirOSBinarySensorEntityDescription(
key="pppoe",
translation_key="pppoe",
device_class=BinarySensorDeviceClass.CONNECTIVITY,
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=lambda data: data.services.pppoe,
entity_registry_enabled_default=False,
),
)
AIROS8_BINARY_SENSORS: tuple[AirOS8BinarySensorEntityDescription, ...] = (
AirOS8BinarySensorEntityDescription(
key="portfw",
translation_key="port_forwarding",
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=lambda data: data.portfw,
),
AirOS8BinarySensorEntityDescription(
key="dhcp6_server",
translation_key="dhcp6_server",
device_class=BinarySensorDeviceClass.RUNNING,
@@ -78,6 +60,14 @@ AIROS8_BINARY_SENSORS: tuple[AirOS8BinarySensorEntityDescription, ...] = (
value_fn=lambda data: data.services.dhcp6d_stateful,
entity_registry_enabled_default=False,
),
AirOSBinarySensorEntityDescription(
key="pppoe",
translation_key="pppoe",
device_class=BinarySensorDeviceClass.CONNECTIVITY,
entity_category=EntityCategory.DIAGNOSTIC,
value_fn=lambda data: data.services.pppoe,
entity_registry_enabled_default=False,
),
)
@@ -89,18 +79,9 @@ async def async_setup_entry(
"""Set up the AirOS binary sensors from a config entry."""
coordinator = config_entry.runtime_data
entities = [
AirOSBinarySensor(coordinator, description)
for description in COMMON_BINARY_SENSORS
]
if coordinator.device_data["fw_major"] == 8:
entities.extend(
AirOSBinarySensor(coordinator, description)
for description in AIROS8_BINARY_SENSORS
)
async_add_entities(entities)
async_add_entities(
AirOSBinarySensor(coordinator, description) for description in BINARY_SENSORS
)
class AirOSBinarySensor(AirOSEntity, BinarySensorEntity):
View File
@@ -2,6 +2,8 @@
from __future__ import annotations
import logging
from airos.exceptions import AirOSException
from homeassistant.components.button import (
@@ -16,6 +18,8 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import DOMAIN, AirOSConfigEntry, AirOSDataUpdateCoordinator
from .entity import AirOSEntity
_LOGGER = logging.getLogger(__name__)
PARALLEL_UPDATES = 0
REBOOT_BUTTON = ButtonEntityDescription(
View File
@@ -7,8 +7,6 @@ from collections.abc import Mapping
import logging
from typing import Any
from airos.airos6 import AirOS6
from airos.airos8 import AirOS8
from airos.discovery import airos_discover_devices
from airos.exceptions import (
AirOSConnectionAuthenticationError,
@@ -19,7 +17,6 @@ from airos.exceptions import (
AirOSKeyDataMissingError,
AirOSListenerError,
)
from airos.helpers import DetectDeviceData, async_get_firmware_data
import voluptuous as vol
from homeassistant.config_entries import (
@@ -56,11 +53,10 @@ from .const import (
MAC_ADDRESS,
SECTION_ADVANCED_SETTINGS,
)
from .coordinator import AirOS8
_LOGGER = logging.getLogger(__name__)
AirOSDeviceDetect = AirOS8 | AirOS6
# Discovery duration in seconds, airOS announces every 20 seconds
DISCOVER_INTERVAL: int = 30
@@ -96,7 +92,7 @@ class AirOSConfigFlow(ConfigFlow, domain=DOMAIN):
def __init__(self) -> None:
"""Initialize the config flow."""
super().__init__()
self.airos_device: AirOSDeviceDetect
self.airos_device: AirOS8
self.errors: dict[str, str] = {}
self.discovered_devices: dict[str, dict[str, Any]] = {}
self.discovery_abort_reason: str | None = None
@@ -139,14 +135,16 @@ class AirOSConfigFlow(ConfigFlow, domain=DOMAIN):
verify_ssl=config_data[SECTION_ADVANCED_SETTINGS][CONF_VERIFY_SSL],
)
airos_device = AirOS8(
host=config_data[CONF_HOST],
username=config_data[CONF_USERNAME],
password=config_data[CONF_PASSWORD],
session=session,
use_ssl=config_data[SECTION_ADVANCED_SETTINGS][CONF_SSL],
)
try:
device_data: DetectDeviceData = await async_get_firmware_data(
host=config_data[CONF_HOST],
username=config_data[CONF_USERNAME],
password=config_data[CONF_PASSWORD],
session=session,
use_ssl=config_data[SECTION_ADVANCED_SETTINGS][CONF_SSL],
)
await airos_device.login()
airos_data = await airos_device.status()
except (
AirOSConnectionSetupError,
@@ -161,14 +159,14 @@ class AirOSConfigFlow(ConfigFlow, domain=DOMAIN):
_LOGGER.exception("Unexpected exception during credential validation")
self.errors["base"] = "unknown"
else:
await self.async_set_unique_id(device_data["mac"])
await self.async_set_unique_id(airos_data.derived.mac)
if self.source in [SOURCE_REAUTH, SOURCE_RECONFIGURE]:
self._abort_if_unique_id_mismatch()
else:
self._abort_if_unique_id_configured()
return {"title": device_data["hostname"], "data": config_data}
return {"title": airos_data.host.hostname, "data": config_data}
return None
View File
@@ -4,7 +4,6 @@ from __future__ import annotations
import logging
from airos.airos6 import AirOS6, AirOS6Data
from airos.airos8 import AirOS8, AirOS8Data
from airos.exceptions import (
AirOSConnectionAuthenticationError,
@@ -12,7 +11,6 @@ from airos.exceptions import (
AirOSDataMissingError,
AirOSDeviceConnectionError,
)
from airos.helpers import DetectDeviceData
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
@@ -23,28 +21,19 @@ from .const import DOMAIN, SCAN_INTERVAL
_LOGGER = logging.getLogger(__name__)
AirOSDeviceDetect = AirOS8 | AirOS6
AirOSDataDetect = AirOS8Data | AirOS6Data
type AirOSConfigEntry = ConfigEntry[AirOSDataUpdateCoordinator]
class AirOSDataUpdateCoordinator(DataUpdateCoordinator[AirOSDataDetect]):
class AirOSDataUpdateCoordinator(DataUpdateCoordinator[AirOS8Data]):
"""Class to manage fetching AirOS data from single endpoint."""
airos_device: AirOSDeviceDetect
config_entry: AirOSConfigEntry
def __init__(
self,
hass: HomeAssistant,
config_entry: AirOSConfigEntry,
device_data: DetectDeviceData,
airos_device: AirOSDeviceDetect,
self, hass: HomeAssistant, config_entry: AirOSConfigEntry, airos_device: AirOS8
) -> None:
"""Initialize the coordinator."""
self.airos_device = airos_device
self.device_data = device_data
super().__init__(
hass,
_LOGGER,
@@ -53,7 +42,7 @@ class AirOSDataUpdateCoordinator(DataUpdateCoordinator[AirOSDataDetect]):
update_interval=SCAN_INTERVAL,
)
async def _async_update_data(self) -> AirOSDataDetect:
async def _async_update_data(self) -> AirOS8Data:
"""Fetch data from AirOS."""
try:
await self.airos_device.login()
@@ -73,7 +62,7 @@ class AirOSDataUpdateCoordinator(DataUpdateCoordinator[AirOSDataDetect]):
translation_domain=DOMAIN,
translation_key="cannot_connect",
) from err
except AirOSDataMissingError as err:
except (AirOSDataMissingError,) as err:
_LOGGER.error("Expected data not returned by airOS device: %s", err)
raise UpdateFailed(
translation_domain=DOMAIN,
View File
@@ -7,6 +7,6 @@
"documentation": "https://www.home-assistant.io/integrations/airos",
"integration_type": "device",
"iot_class": "local_polling",
"quality_scale": "platinum",
"quality_scale": "silver",
"requirements": ["airos==0.6.4"]
}
View File
@@ -42,20 +42,16 @@ rules:
# Gold
devices: done
diagnostics: done
discovery-update-info: done
discovery:
status: exempt
comment: No way to detect device on the network
discovery-update-info: todo
discovery: todo
docs-data-update: done
docs-examples: done
docs-examples: todo
docs-known-limitations: done
docs-supported-devices: done
docs-supported-functions: done
docs-troubleshooting: done
docs-use-cases: done
dynamic-devices:
status: exempt
comment: single airOS device per config entry; peer/remote endpoints are not modeled as child devices/entities at this time
dynamic-devices: todo
entity-category: done
entity-device-class: done
entity-disabled-by-default: done
@@ -65,10 +61,8 @@ rules:
status: exempt
comment: no (custom) icons used or envisioned
reconfiguration-flow: done
repair-issues: done
stale-devices:
status: exempt
comment: single airOS device per config entry; peer/remote endpoints are not modeled as child devices/entities at this time
repair-issues: todo
stale-devices: todo
# Platinum
async-dependency: done
View File
@@ -5,14 +5,8 @@ from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass
import logging
from typing import Generic, TypeVar
from airos.data import (
AirOSDataBaseClass,
DerivedWirelessMode,
DerivedWirelessRole,
NetRole,
)
from airos.data import DerivedWirelessMode, DerivedWirelessRole, NetRole
from homeassistant.components.sensor import (
SensorDeviceClass,
@@ -43,19 +37,15 @@ WIRELESS_ROLE_OPTIONS = [mode.value for mode in DerivedWirelessRole]
PARALLEL_UPDATES = 0
AirOSDataModel = TypeVar("AirOSDataModel", bound=AirOSDataBaseClass)
@dataclass(frozen=True, kw_only=True)
class AirOSSensorEntityDescription(SensorEntityDescription, Generic[AirOSDataModel]):
class AirOSSensorEntityDescription(SensorEntityDescription):
"""Describe an AirOS sensor."""
value_fn: Callable[[AirOSDataModel], StateType]
value_fn: Callable[[AirOS8Data], StateType]
AirOS8SensorEntityDescription = AirOSSensorEntityDescription[AirOS8Data]
COMMON_SENSORS: tuple[AirOSSensorEntityDescription, ...] = (
SENSORS: tuple[AirOSSensorEntityDescription, ...] = (
AirOSSensorEntityDescription(
key="host_cpuload",
translation_key="host_cpuload",
@@ -85,6 +75,54 @@ COMMON_SENSORS: tuple[AirOSSensorEntityDescription, ...] = (
translation_key="wireless_essid",
value_fn=lambda data: data.wireless.essid,
),
AirOSSensorEntityDescription(
key="wireless_antenna_gain",
translation_key="wireless_antenna_gain",
native_unit_of_measurement=SIGNAL_STRENGTH_DECIBELS,
device_class=SensorDeviceClass.SIGNAL_STRENGTH,
state_class=SensorStateClass.MEASUREMENT,
value_fn=lambda data: data.wireless.antenna_gain,
),
AirOSSensorEntityDescription(
key="wireless_throughput_tx",
translation_key="wireless_throughput_tx",
native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
value_fn=lambda data: data.wireless.throughput.tx,
),
AirOSSensorEntityDescription(
key="wireless_throughput_rx",
translation_key="wireless_throughput_rx",
native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
value_fn=lambda data: data.wireless.throughput.rx,
),
AirOSSensorEntityDescription(
key="wireless_polling_dl_capacity",
translation_key="wireless_polling_dl_capacity",
native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
value_fn=lambda data: data.wireless.polling.dl_capacity,
),
AirOSSensorEntityDescription(
key="wireless_polling_ul_capacity",
translation_key="wireless_polling_ul_capacity",
native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
value_fn=lambda data: data.wireless.polling.ul_capacity,
),
AirOSSensorEntityDescription(
key="host_uptime",
translation_key="host_uptime",
@@ -120,57 +158,6 @@ COMMON_SENSORS: tuple[AirOSSensorEntityDescription, ...] = (
options=WIRELESS_ROLE_OPTIONS,
entity_registry_enabled_default=False,
),
AirOSSensorEntityDescription(
key="wireless_antenna_gain",
translation_key="wireless_antenna_gain",
native_unit_of_measurement=SIGNAL_STRENGTH_DECIBELS,
device_class=SensorDeviceClass.SIGNAL_STRENGTH,
state_class=SensorStateClass.MEASUREMENT,
value_fn=lambda data: data.wireless.antenna_gain,
),
AirOSSensorEntityDescription(
key="wireless_polling_dl_capacity",
translation_key="wireless_polling_dl_capacity",
native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
value_fn=lambda data: data.wireless.polling.dl_capacity,
),
AirOSSensorEntityDescription(
key="wireless_polling_ul_capacity",
translation_key="wireless_polling_ul_capacity",
native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
value_fn=lambda data: data.wireless.polling.ul_capacity,
),
)
AIROS8_SENSORS: tuple[AirOS8SensorEntityDescription, ...] = (
AirOS8SensorEntityDescription(
key="wireless_throughput_tx",
translation_key="wireless_throughput_tx",
native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
value_fn=lambda data: data.wireless.throughput.tx,
),
AirOS8SensorEntityDescription(
key="wireless_throughput_rx",
translation_key="wireless_throughput_rx",
native_unit_of_measurement=UnitOfDataRate.KILOBITS_PER_SECOND,
device_class=SensorDeviceClass.DATA_RATE,
state_class=SensorStateClass.MEASUREMENT,
suggested_display_precision=0,
suggested_unit_of_measurement=UnitOfDataRate.MEGABITS_PER_SECOND,
value_fn=lambda data: data.wireless.throughput.rx,
),
)
@@ -182,14 +169,7 @@ async def async_setup_entry(
"""Set up the AirOS sensors from a config entry."""
coordinator = config_entry.runtime_data
entities = [AirOSSensor(coordinator, description) for description in COMMON_SENSORS]
if coordinator.device_data["fw_major"] == 8:
entities.extend(
AirOSSensor(coordinator, description) for description in AIROS8_SENSORS
)
async_add_entities(entities)
async_add_entities(AirOSSensor(coordinator, description) for description in SENSORS)
class AirOSSensor(AirOSEntity, SensorEntity):
View File
@@ -5,13 +5,12 @@ from __future__ import annotations
from datetime import timedelta
import logging
import aiohttp
from genie_partner_sdk.client import AladdinConnectClient
from genie_partner_sdk.model import GarageDoor
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
_LOGGER = logging.getLogger(__name__)
type AladdinConnectConfigEntry = ConfigEntry[dict[str, AladdinConnectCoordinator]]
@@ -41,10 +40,7 @@ class AladdinConnectCoordinator(DataUpdateCoordinator[GarageDoor]):
async def _async_update_data(self) -> GarageDoor:
"""Fetch data from the Aladdin Connect API."""
try:
await self.client.update_door(self.data.device_id, self.data.door_number)
except aiohttp.ClientError as err:
raise UpdateFailed(f"Error communicating with API: {err}") from err
await self.client.update_door(self.data.device_id, self.data.door_number)
self.data.status = self.client.get_door_status(
self.data.device_id, self.data.door_number
)
View File
@@ -4,19 +4,14 @@ from __future__ import annotations
from typing import Any
import aiohttp
from homeassistant.components.cover import CoverDeviceClass, CoverEntity
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DOMAIN, SUPPORTED_FEATURES
from .const import SUPPORTED_FEATURES
from .coordinator import AladdinConnectConfigEntry, AladdinConnectCoordinator
from .entity import AladdinConnectEntity
PARALLEL_UPDATES = 1
async def async_setup_entry(
hass: HomeAssistant,
@@ -45,23 +40,11 @@ class AladdinCoverEntity(AladdinConnectEntity, CoverEntity):
async def async_open_cover(self, **kwargs: Any) -> None:
"""Issue open command to cover."""
try:
await self.client.open_door(self._device_id, self._number)
except aiohttp.ClientError as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="open_door_failed",
) from err
await self.client.open_door(self._device_id, self._number)
async def async_close_cover(self, **kwargs: Any) -> None:
"""Issue close command to cover."""
try:
await self.client.close_door(self._device_id, self._number)
except aiohttp.ClientError as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="close_door_failed",
) from err
await self.client.close_door(self._device_id, self._number)
@property
def is_closed(self) -> bool | None:
View File
@@ -1,32 +0,0 @@
"""Diagnostics support for Aladdin Connect."""
from __future__ import annotations
from typing import Any
from homeassistant.components.diagnostics import async_redact_data
from homeassistant.core import HomeAssistant
from .coordinator import AladdinConnectConfigEntry
TO_REDACT = {"access_token", "refresh_token"}
async def async_get_config_entry_diagnostics(
hass: HomeAssistant, config_entry: AladdinConnectConfigEntry
) -> dict[str, Any]:
"""Return diagnostics for a config entry."""
return {
"config_entry": async_redact_data(config_entry.as_dict(), TO_REDACT),
"doors": {
uid: {
"device_id": coordinator.data.device_id,
"door_number": coordinator.data.door_number,
"name": coordinator.data.name,
"status": coordinator.data.status,
"link_status": coordinator.data.link_status,
"battery_level": coordinator.data.battery_level,
}
for uid, coordinator in config_entry.runtime_data.items()
},
}
View File
@@ -26,26 +26,24 @@ rules:
unique-config-entry: done
# Silver
action-exceptions: done
action-exceptions: todo
config-entry-unloading: done
docs-configuration-parameters:
status: exempt
comment: Integration does not have an options flow.
docs-installation-parameters: done
entity-unavailable:
status: done
comment: Handled by the coordinator.
entity-unavailable: todo
integration-owner: done
log-when-unavailable:
status: done
comment: Handled by the coordinator.
parallel-updates: done
log-when-unavailable: todo
parallel-updates: todo
reauthentication-flow: done
test-coverage: done
test-coverage:
status: todo
comment: Platform tests for cover and sensor need to be implemented to reach 95% coverage.
# Gold
devices: done
diagnostics: done
diagnostics: todo
discovery: done
discovery-update-info:
status: exempt
View File
@@ -20,8 +20,6 @@ from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import AladdinConnectConfigEntry, AladdinConnectCoordinator
from .entity import AladdinConnectEntity
PARALLEL_UPDATES = 0
@dataclass(frozen=True, kw_only=True)
class AladdinConnectSensorEntityDescription(SensorEntityDescription):
View File
@@ -32,13 +32,5 @@
"title": "[%key:common::config_flow::title::reauth%]"
}
}
},
"exceptions": {
"close_door_failed": {
"message": "Failed to close the garage door"
},
"open_door_failed": {
"message": "Failed to open the garage door"
}
}
}
View File
@@ -44,7 +44,7 @@ def make_entity_state_trigger_required_features(
class CustomTrigger(EntityStateTriggerRequiredFeatures):
"""Trigger for entity state changes."""
_domains = {domain}
_domain = domain
_to_states = {to_state}
_required_features = required_features
View File
@@ -13,6 +13,9 @@ from homeassistant.helpers import config_validation as cv, service
from .const import DOMAIN
SERVICE_ALARM_TOGGLE_CHIME = "alarm_toggle_chime"
SERVICE_ALARM_KEYPRESS = "alarm_keypress"
ATTR_KEYPRESS = "keypress"
@@ -23,7 +26,7 @@ def async_setup_services(hass: HomeAssistant) -> None:
service.async_register_platform_entity_service(
hass,
DOMAIN,
"alarm_toggle_chime",
SERVICE_ALARM_TOGGLE_CHIME,
entity_domain=ALARM_CONTROL_PANEL_DOMAIN,
schema={
vol.Required(ATTR_CODE): cv.string,
@@ -34,7 +37,7 @@ def async_setup_services(hass: HomeAssistant) -> None:
service.async_register_platform_entity_service(
hass,
DOMAIN,
"alarm_keypress",
SERVICE_ALARM_KEYPRESS,
entity_domain=ALARM_CONTROL_PANEL_DOMAIN,
schema={
vol.Required(ATTR_KEYPRESS): cv.string,
View File
@@ -1,6 +1,6 @@
"""Defines a base Alexa Devices entity."""
from aioamazondevices.const.devices import SPEAKER_GROUP_DEVICE_TYPE
from aioamazondevices.const.devices import SPEAKER_GROUP_MODEL
from aioamazondevices.structures import AmazonDevice
from homeassistant.helpers.device_registry import DeviceInfo
@@ -25,20 +25,19 @@ class AmazonEntity(CoordinatorEntity[AmazonDevicesCoordinator]):
"""Initialize the entity."""
super().__init__(coordinator)
self._serial_num = serial_num
model = self.device.model
model_details = coordinator.api.get_model_details(self.device) or {}
model = model_details.get("model")
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, serial_num)},
name=self.device.account_name,
model=model,
model_id=self.device.device_type,
manufacturer=self.device.manufacturer or "Amazon",
hw_version=self.device.hardware_version,
manufacturer=model_details.get("manufacturer", "Amazon"),
hw_version=model_details.get("hw_version"),
sw_version=(
self.device.software_version
if model != SPEAKER_GROUP_DEVICE_TYPE
else None
self.device.software_version if model != SPEAKER_GROUP_MODEL else None
),
serial_number=serial_num if model != SPEAKER_GROUP_DEVICE_TYPE else None,
serial_number=serial_num if model != SPEAKER_GROUP_MODEL else None,
)
self.entity_description = description
self._attr_unique_id = f"{serial_num}-{description.key}"
View File
@@ -8,5 +8,5 @@
"iot_class": "cloud_polling",
"loggers": ["aioamazondevices"],
"quality_scale": "platinum",
"requirements": ["aioamazondevices==13.0.0"]
"requirements": ["aioamazondevices==12.0.0"]
}
View File
@@ -16,6 +16,9 @@ from .coordinator import AmazonConfigEntry
ATTR_TEXT_COMMAND = "text_command"
ATTR_SOUND = "sound"
ATTR_INFO_SKILL = "info_skill"
SERVICE_TEXT_COMMAND = "send_text_command"
SERVICE_SOUND_NOTIFICATION = "send_sound"
SERVICE_INFO_SKILL = "send_info_skill"
SCHEMA_SOUND_SERVICE = vol.Schema(
{
@@ -125,17 +128,17 @@ def async_setup_services(hass: HomeAssistant) -> None:
"""Set up the services for the Amazon Devices integration."""
for service_name, method, schema in (
(
"send_sound",
SERVICE_SOUND_NOTIFICATION,
async_send_sound_notification,
SCHEMA_SOUND_SERVICE,
),
(
"send_text_command",
SERVICE_TEXT_COMMAND,
async_send_text_command,
SCHEMA_CUSTOM_COMMAND,
),
(
"send_info_skill",
SERVICE_INFO_SKILL,
async_send_info_skill,
SCHEMA_INFO_SKILL,
),
View File
@@ -16,6 +16,8 @@ ATTRIBUTION = "Data provided by Amber Electric"
LOGGER = logging.getLogger(__package__)
PLATFORMS = [Platform.BINARY_SENSOR, Platform.SENSOR]
SERVICE_GET_FORECASTS = "get_forecasts"
GENERAL_CHANNEL = "general"
CONTROLLED_LOAD_CHANNEL = "controlled_load"
FEED_IN_CHANNEL = "feed_in"
View File
@@ -22,6 +22,7 @@ from .const import (
DOMAIN,
FEED_IN_CHANNEL,
GENERAL_CHANNEL,
SERVICE_GET_FORECASTS,
)
from .coordinator import AmberConfigEntry
from .helpers import format_cents_to_dollars, normalize_descriptor
@@ -100,7 +101,7 @@ def async_setup_services(hass: HomeAssistant) -> None:
hass.services.async_register(
DOMAIN,
"get_forecasts",
SERVICE_GET_FORECASTS,
handle_get_forecasts,
GET_FORECASTS_SCHEMA,
supports_response=SupportsResponse.ONLY,
View File
@@ -49,6 +49,18 @@ SCAN_INTERVAL = timedelta(seconds=15)
STREAM_SOURCE_LIST = ["snapshot", "mjpeg", "rtsp"]
_SRV_EN_REC = "enable_recording"
_SRV_DS_REC = "disable_recording"
_SRV_EN_AUD = "enable_audio"
_SRV_DS_AUD = "disable_audio"
_SRV_EN_MOT_REC = "enable_motion_recording"
_SRV_DS_MOT_REC = "disable_motion_recording"
_SRV_GOTO = "goto_preset"
_SRV_CBW = "set_color_bw"
_SRV_TOUR_ON = "start_tour"
_SRV_TOUR_OFF = "stop_tour"
_SRV_PTZ_CTRL = "ptz_control"
_ATTR_PTZ_TT = "travel_time"
_ATTR_PTZ_MOV = "movement"
_MOV = [
@@ -91,17 +103,17 @@ _SRV_PTZ_SCHEMA = _SRV_SCHEMA.extend(
)
CAMERA_SERVICES = {
"enable_recording": (_SRV_SCHEMA, "async_enable_recording", ()),
"disable_recording": (_SRV_SCHEMA, "async_disable_recording", ()),
"enable_audio": (_SRV_SCHEMA, "async_enable_audio", ()),
"disable_audio": (_SRV_SCHEMA, "async_disable_audio", ()),
"enable_motion_recording": (_SRV_SCHEMA, "async_enable_motion_recording", ()),
"disable_motion_recording": (_SRV_SCHEMA, "async_disable_motion_recording", ()),
"goto_preset": (_SRV_GOTO_SCHEMA, "async_goto_preset", (_ATTR_PRESET,)),
"set_color_bw": (_SRV_CBW_SCHEMA, "async_set_color_bw", (_ATTR_COLOR_BW,)),
"start_tour": (_SRV_SCHEMA, "async_start_tour", ()),
"stop_tour": (_SRV_SCHEMA, "async_stop_tour", ()),
"ptz_control": (
_SRV_EN_REC: (_SRV_SCHEMA, "async_enable_recording", ()),
_SRV_DS_REC: (_SRV_SCHEMA, "async_disable_recording", ()),
_SRV_EN_AUD: (_SRV_SCHEMA, "async_enable_audio", ()),
_SRV_DS_AUD: (_SRV_SCHEMA, "async_disable_audio", ()),
_SRV_EN_MOT_REC: (_SRV_SCHEMA, "async_enable_motion_recording", ()),
_SRV_DS_MOT_REC: (_SRV_SCHEMA, "async_disable_motion_recording", ()),
_SRV_GOTO: (_SRV_GOTO_SCHEMA, "async_goto_preset", (_ATTR_PRESET,)),
_SRV_CBW: (_SRV_CBW_SCHEMA, "async_set_color_bw", (_ATTR_COLOR_BW,)),
_SRV_TOUR_ON: (_SRV_SCHEMA, "async_start_tour", ()),
_SRV_TOUR_OFF: (_SRV_SCHEMA, "async_stop_tour", ()),
_SRV_PTZ_CTRL: (
_SRV_PTZ_SCHEMA,
"async_ptz_control",
(_ATTR_PTZ_MOV, _ATTR_PTZ_TT),
View File
@@ -36,7 +36,7 @@ from .const import (
SIGNAL_CONFIG_ENTITY,
)
from .entity import AndroidTVEntity, adb_decorator
from .services import ATTR_ADB_RESPONSE, ATTR_HDMI_INPUT
from .services import ATTR_ADB_RESPONSE, ATTR_HDMI_INPUT, SERVICE_LEARN_SENDEVENT
_LOGGER = logging.getLogger(__name__)
@@ -271,7 +271,7 @@ class ADBDevice(AndroidTVEntity, MediaPlayerEntity):
self.async_write_ha_state()
msg = (
f"Output from service 'learn_sendevent' from"
f"Output from service '{SERVICE_LEARN_SENDEVENT}' from"
f" {self.entity_id}: '{output}'"
)
persistent_notification.async_create(
View File
@@ -16,6 +16,11 @@ ATTR_DEVICE_PATH = "device_path"
ATTR_HDMI_INPUT = "hdmi_input"
ATTR_LOCAL_PATH = "local_path"
SERVICE_ADB_COMMAND = "adb_command"
SERVICE_DOWNLOAD = "download"
SERVICE_LEARN_SENDEVENT = "learn_sendevent"
SERVICE_UPLOAD = "upload"
@callback
def async_setup_services(hass: HomeAssistant) -> None:
@@ -24,7 +29,7 @@ def async_setup_services(hass: HomeAssistant) -> None:
service.async_register_platform_entity_service(
hass,
DOMAIN,
"adb_command",
SERVICE_ADB_COMMAND,
entity_domain=MEDIA_PLAYER_DOMAIN,
schema={vol.Required(ATTR_COMMAND): cv.string},
func="adb_command",
@@ -32,7 +37,7 @@ def async_setup_services(hass: HomeAssistant) -> None:
service.async_register_platform_entity_service(
hass,
DOMAIN,
"learn_sendevent",
SERVICE_LEARN_SENDEVENT,
entity_domain=MEDIA_PLAYER_DOMAIN,
schema=None,
func="learn_sendevent",
@@ -40,7 +45,7 @@ def async_setup_services(hass: HomeAssistant) -> None:
service.async_register_platform_entity_service(
hass,
DOMAIN,
"download",
SERVICE_DOWNLOAD,
entity_domain=MEDIA_PLAYER_DOMAIN,
schema={
vol.Required(ATTR_DEVICE_PATH): cv.string,
@@ -51,7 +56,7 @@ def async_setup_services(hass: HomeAssistant) -> None:
service.async_register_platform_entity_service(
hass,
DOMAIN,
"upload",
SERVICE_UPLOAD,
entity_domain=MEDIA_PLAYER_DOMAIN,
schema={
vol.Required(ATTR_DEVICE_PATH): cv.string,
View File
@@ -46,7 +46,6 @@ class AnthropicTaskEntity(
ai_task.AITaskEntityFeature.GENERATE_DATA
| ai_task.AITaskEntityFeature.SUPPORT_ATTACHMENTS
)
_attr_translation_key = "ai_task_data"
async def _async_generate_data(
self,
View File
@@ -43,9 +43,7 @@ from homeassistant.helpers.selector import (
from homeassistant.helpers.typing import VolDictType
from .const import (
CODE_EXECUTION_UNSUPPORTED_MODELS,
CONF_CHAT_MODEL,
CONF_CODE_EXECUTION,
CONF_MAX_TOKENS,
CONF_PROMPT,
CONF_RECOMMENDED,
@@ -417,16 +415,6 @@ class ConversationSubentryFlowHandler(ConfigSubentryFlow):
else:
self.options.pop(CONF_THINKING_EFFORT, None)
if not model.startswith(tuple(CODE_EXECUTION_UNSUPPORTED_MODELS)):
step_schema[
vol.Optional(
CONF_CODE_EXECUTION,
default=DEFAULT[CONF_CODE_EXECUTION],
)
] = bool
else:
self.options.pop(CONF_CODE_EXECUTION, None)
if not model.startswith(tuple(WEB_SEARCH_UNSUPPORTED_MODELS)):
step_schema.update(
{
View File
@@ -11,7 +11,6 @@ DEFAULT_AI_TASK_NAME = "Claude AI Task"
CONF_RECOMMENDED = "recommended"
CONF_PROMPT = "prompt"
CONF_CHAT_MODEL = "chat_model"
CONF_CODE_EXECUTION = "code_execution"
CONF_MAX_TOKENS = "max_tokens"
CONF_TEMPERATURE = "temperature"
CONF_THINKING_BUDGET = "thinking_budget"
@@ -26,7 +25,6 @@ CONF_WEB_SEARCH_TIMEZONE = "timezone"
DEFAULT = {
CONF_CHAT_MODEL: "claude-haiku-4-5",
CONF_CODE_EXECUTION: False,
CONF_MAX_TOKENS: 3000,
CONF_TEMPERATURE: 1.0,
CONF_THINKING_BUDGET: 0,
@@ -67,10 +65,6 @@ WEB_SEARCH_UNSUPPORTED_MODELS = [
"claude-3-haiku",
]
CODE_EXECUTION_UNSUPPORTED_MODELS = [
"claude-3-haiku",
]
DEPRECATED_MODELS = [
"claude-3",
]
View File
@@ -37,7 +37,6 @@ class AnthropicConversationEntity(
"""Anthropic conversation agent."""
_attr_supports_streaming = True
_attr_translation_key = "conversation"
def __init__(self, entry: AnthropicConfigEntry, subentry: ConfigSubentry) -> None:
"""Initialize the agent."""
View File
@@ -3,23 +3,19 @@
import base64
from collections.abc import AsyncGenerator, Callable, Iterable
from dataclasses import dataclass, field
from datetime import UTC, datetime
import json
from mimetypes import guess_file_type
from pathlib import Path
from typing import Any, Literal, cast
from typing import Any
import anthropic
from anthropic import AsyncStream
from anthropic.types import (
Base64ImageSourceParam,
Base64PDFSourceParam,
BashCodeExecutionToolResultBlock,
CitationsDelta,
CitationsWebSearchResultLocation,
CitationWebSearchResultLocationParam,
CodeExecutionTool20250825Param,
Container,
ContentBlockParam,
DocumentBlockParam,
ImageBlockParam,
@@ -45,7 +41,6 @@ from anthropic.types import (
TextCitation,
TextCitationParam,
TextDelta,
TextEditorCodeExecutionToolResultBlock,
ThinkingBlock,
ThinkingBlockParam,
ThinkingConfigAdaptiveParam,
@@ -56,21 +51,18 @@ from anthropic.types import (
ToolChoiceAutoParam,
ToolChoiceToolParam,
ToolParam,
ToolResultBlockParam,
ToolUnionParam,
ToolUseBlock,
ToolUseBlockParam,
Usage,
WebSearchTool20250305Param,
WebSearchToolRequestErrorParam,
WebSearchToolResultBlock,
WebSearchToolResultBlockParamContentParam,
)
from anthropic.types.bash_code_execution_tool_result_block_param import (
Content as BashCodeExecutionToolResultContentParam,
WebSearchToolResultBlockParam,
WebSearchToolResultError,
)
from anthropic.types.message_create_params import MessageCreateParamsStreaming
from anthropic.types.text_editor_code_execution_tool_result_block_param import (
Content as TextEditorCodeExecutionToolResultContentParam,
)
import voluptuous as vol
from voluptuous_openapi import convert
@@ -82,12 +74,10 @@ from homeassistant.helpers import device_registry as dr, llm
from homeassistant.helpers.entity import Entity
from homeassistant.helpers.json import json_dumps
from homeassistant.util import slugify
from homeassistant.util.json import JsonObjectType
from . import AnthropicConfigEntry
from .const import (
CONF_CHAT_MODEL,
CONF_CODE_EXECUTION,
CONF_MAX_TOKENS,
CONF_TEMPERATURE,
CONF_THINKING_BUDGET,
@@ -144,7 +134,6 @@ class ContentDetails:
citation_details: list[CitationDetails] = field(default_factory=list)
thinking_signature: str | None = None
redacted_thinking: str | None = None
container: Container | None = None
def has_content(self) -> bool:
"""Check if there is any text content."""
@@ -155,7 +144,6 @@ class ContentDetails:
return (
self.thinking_signature is not None
or self.redacted_thinking is not None
or self.container is not None
or self.has_citations()
)
@@ -200,53 +188,30 @@ class ContentDetails:
def _convert_content(
chat_content: Iterable[conversation.Content],
) -> tuple[list[MessageParam], str | None]:
) -> list[MessageParam]:
"""Transform HA chat_log content into Anthropic API format."""
messages: list[MessageParam] = []
container_id: str | None = None
for content in chat_content:
if isinstance(content, conversation.ToolResultContent):
external_tool = True
if content.tool_name == "web_search":
tool_result_block: ContentBlockParam = {
"type": "web_search_tool_result",
"tool_use_id": content.tool_call_id,
"content": cast(
WebSearchToolResultBlockParamContentParam,
content.tool_result["content"]
if "content" in content.tool_result
else {
"type": "web_search_tool_result_error",
"error_code": content.tool_result.get(
"error_code", "unavailable"
),
},
tool_result_block: ContentBlockParam = WebSearchToolResultBlockParam(
type="web_search_tool_result",
tool_use_id=content.tool_call_id,
content=content.tool_result["content"]
if "content" in content.tool_result
else WebSearchToolRequestErrorParam(
type="web_search_tool_result_error",
error_code=content.tool_result.get("error_code", "unavailable"), # type: ignore[typeddict-item]
),
}
elif content.tool_name == "bash_code_execution":
tool_result_block = {
"type": "bash_code_execution_tool_result",
"tool_use_id": content.tool_call_id,
"content": cast(
BashCodeExecutionToolResultContentParam, content.tool_result
),
}
elif content.tool_name == "text_editor_code_execution":
tool_result_block = {
"type": "text_editor_code_execution_tool_result",
"tool_use_id": content.tool_call_id,
"content": cast(
TextEditorCodeExecutionToolResultContentParam,
content.tool_result,
),
}
)
external_tool = True
else:
tool_result_block = {
"type": "tool_result",
"tool_use_id": content.tool_call_id,
"content": json_dumps(content.tool_result),
}
tool_result_block = ToolResultBlockParam(
type="tool_result",
tool_use_id=content.tool_call_id,
content=json_dumps(content.tool_result),
)
external_tool = False
if not messages or messages[-1]["role"] != (
"assistant" if external_tool else "user"
@@ -312,11 +277,6 @@ def _convert_content(
data=content.native.redacted_thinking,
)
)
if (
content.native.container is not None
and content.native.container.expires_at > datetime.now(UTC)
):
container_id = content.native.container.id
if content.content:
current_index = 0
@@ -365,23 +325,10 @@ def _convert_content(
ServerToolUseBlockParam(
type="server_tool_use",
id=tool_call.id,
name=cast(
Literal[
"web_search",
"bash_code_execution",
"text_editor_code_execution",
],
tool_call.tool_name,
),
name="web_search",
input=tool_call.tool_args,
)
if tool_call.external
and tool_call.tool_name
in [
"web_search",
"bash_code_execution",
"text_editor_code_execution",
]
if tool_call.external and tool_call.tool_name == "web_search"
else ToolUseBlockParam(
type="tool_use",
id=tool_call.id,
@@ -400,10 +347,10 @@ def _convert_content(
# If there is only one text block, simplify the content to a string
messages[-1]["content"] = messages[-1]["content"][0]["text"]
else:
# Note: We don't pass SystemContent here as it's passed to the API as the prompt
raise HomeAssistantError("Unexpected content type in chat log")
# Note: We don't pass SystemContent here as its passed to the API as the prompt
raise TypeError(f"Unexpected content type: {type(content)}")
return messages, container_id
return messages
async def _transform_stream( # noqa: C901 - This is complex, but better to have it in one place
@@ -442,8 +389,8 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
Each message could contain multiple blocks of the same type.
"""
if stream is None or not hasattr(stream, "__aiter__"):
raise HomeAssistantError("Expected a stream of messages")
if stream is None:
raise TypeError("Expected a stream of messages")
current_tool_block: ToolUseBlockParam | ServerToolUseBlockParam | None = None
current_tool_args: str
@@ -456,6 +403,8 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
LOGGER.debug("Received response: %s", response)
if isinstance(response, RawMessageStartEvent):
if response.message.role != "assistant":
raise ValueError("Unexpected message role")
input_usage = response.message.usage
first_block = True
elif isinstance(response, RawContentBlockStartEvent):
@@ -529,14 +478,7 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
input={},
)
current_tool_args = ""
elif isinstance(
response.content_block,
(
WebSearchToolResultBlock,
BashCodeExecutionToolResultBlock,
TextEditorCodeExecutionToolResultBlock,
),
):
elif isinstance(response.content_block, WebSearchToolResultBlock):
if content_details:
content_details.delete_empty()
yield {"native": content_details}
@@ -545,16 +487,26 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
yield {
"role": "tool_result",
"tool_call_id": response.content_block.tool_use_id,
"tool_name": response.content_block.type.removesuffix(
"_tool_result"
),
"tool_name": "web_search",
"tool_result": {
"content": cast(
JsonObjectType, response.content_block.to_dict()["content"]
)
"type": "web_search_tool_result_error",
"error_code": response.content_block.content.error_code,
}
if isinstance(response.content_block.content, list)
else cast(JsonObjectType, response.content_block.content.to_dict()),
if isinstance(
response.content_block.content, WebSearchToolResultError
)
else {
"content": [
{
"type": "web_search_result",
"encrypted_content": block.encrypted_content,
"page_age": block.page_age,
"title": block.title,
"url": block.url,
}
for block in response.content_block.content
]
},
}
first_block = True
elif isinstance(response, RawContentBlockDeltaEvent):
@@ -603,7 +555,6 @@ async def _transform_stream( # noqa: C901 - This is complex, but better to have
elif isinstance(response, RawMessageDeltaEvent):
if (usage := response.usage) is not None:
chat_log.async_trace(_create_token_stats(input_usage, usage))
content_details.container = response.delta.container
if response.delta.stop_reason == "refusal":
raise HomeAssistantError("Potential policy violation detected")
elif isinstance(response, RawMessageStopEvent):
@@ -664,7 +615,7 @@ class AnthropicBaseLLMEntity(Entity):
system = chat_log.content[0]
if not isinstance(system, conversation.SystemContent):
raise HomeAssistantError("First message must be a system message")
raise TypeError("First message must be a system message")
# System prompt with caching enabled
system_prompt: list[TextBlockParam] = [
@@ -675,7 +626,7 @@ class AnthropicBaseLLMEntity(Entity):
)
]
messages, container_id = _convert_content(chat_log.content[1:])
messages = _convert_content(chat_log.content[1:])
model = options.get(CONF_CHAT_MODEL, DEFAULT[CONF_CHAT_MODEL])
@@ -685,7 +636,6 @@ class AnthropicBaseLLMEntity(Entity):
max_tokens=options.get(CONF_MAX_TOKENS, DEFAULT[CONF_MAX_TOKENS]),
system=system_prompt,
stream=True,
container=container_id,
)
if not model.startswith(tuple(NON_ADAPTIVE_THINKING_MODELS)):
@@ -724,14 +674,6 @@ class AnthropicBaseLLMEntity(Entity):
for tool in chat_log.llm_api.tools
]
if options.get(CONF_CODE_EXECUTION):
tools.append(
CodeExecutionTool20250825Param(
name="code_execution",
type="code_execution_20250825",
),
)
if options.get(CONF_WEB_SEARCH):
web_search = WebSearchTool20250305Param(
name="web_search",
@@ -842,25 +784,21 @@ class AnthropicBaseLLMEntity(Entity):
try:
stream = await client.messages.create(**model_args)
new_messages, model_args["container"] = _convert_content(
[
content
async for content in chat_log.async_add_delta_content_stream(
self.entity_id,
_transform_stream(
chat_log,
stream,
output_tool=structure_name or None,
),
)
]
messages.extend(
_convert_content(
[
content
async for content in chat_log.async_add_delta_content_stream(
self.entity_id,
_transform_stream(
chat_log,
stream,
output_tool=structure_name or None,
),
)
]
)
)
messages.extend(new_messages)
except anthropic.AuthenticationError as err:
self.entry.async_start_reauth(self.hass)
raise HomeAssistantError(
"Authentication error with Anthropic API, reauthentication required"
) from err
except anthropic.AnthropicError as err:
raise HomeAssistantError(
f"Sorry, I had a problem talking to Anthropic: {err}"

View File

@@ -1,14 +0,0 @@
{
"entity": {
"ai_task": {
"ai_task_data": {
"default": "mdi:asterisk"
}
},
"conversation": {
"conversation": {
"default": "mdi:asterisk"
}
}
}
}

View File

@@ -1,6 +1,6 @@
{
"domain": "anthropic",
"name": "Anthropic",
"name": "Anthropic Conversation",
"after_dependencies": ["assist_pipeline", "intent"],
"codeowners": ["@Shulyaka"],
"config_flow": true,

View File

@@ -31,7 +31,10 @@ rules:
test-before-setup: done
unique-config-entry: done
# Silver
action-exceptions: done
action-exceptions:
status: todo
comment: |
Reevaluate exceptions for entity services.
config-entry-unloading: done
docs-configuration-parameters: done
docs-installation-parameters: done
@@ -89,7 +92,7 @@ rules:
No entities disabled by default.
entity-translations: todo
exception-translations: todo
icon-translations: done
icon-translations: todo
reconfiguration-flow: done
repair-issues: done
stale-devices:

View File

@@ -69,7 +69,6 @@
},
"model": {
"data": {
"code_execution": "[%key:component::anthropic::config_subentries::conversation::step::model::data::code_execution%]",
"thinking_budget": "[%key:component::anthropic::config_subentries::conversation::step::model::data::thinking_budget%]",
"thinking_effort": "[%key:component::anthropic::config_subentries::conversation::step::model::data::thinking_effort%]",
"user_location": "[%key:component::anthropic::config_subentries::conversation::step::model::data::user_location%]",
@@ -77,7 +76,6 @@
"web_search_max_uses": "[%key:component::anthropic::config_subentries::conversation::step::model::data::web_search_max_uses%]"
},
"data_description": {
"code_execution": "[%key:component::anthropic::config_subentries::conversation::step::model::data_description::code_execution%]",
"thinking_budget": "[%key:component::anthropic::config_subentries::conversation::step::model::data_description::thinking_budget%]",
"thinking_effort": "[%key:component::anthropic::config_subentries::conversation::step::model::data_description::thinking_effort%]",
"user_location": "[%key:component::anthropic::config_subentries::conversation::step::model::data_description::user_location%]",
@@ -129,7 +127,6 @@
},
"model": {
"data": {
"code_execution": "Code execution",
"thinking_budget": "Thinking budget",
"thinking_effort": "Thinking effort",
"user_location": "Include home location",
@@ -137,7 +134,6 @@
"web_search_max_uses": "Maximum web searches"
},
"data_description": {
"code_execution": "Allow the model to execute code in a secure sandbox environment, enabling it to analyze data and perform complex calculations.",
"thinking_budget": "The number of tokens the model can use to think about the response out of the total maximum number of tokens. Set to 1024 or greater to enable extended thinking.",
"thinking_effort": "Control how many tokens Claude uses when responding, trading off between response thoroughness and token efficiency",
"user_location": "Localize search results based on home location",

View File

@@ -117,7 +117,6 @@ class SharpAquosTVDevice(MediaPlayerEntity):
| MediaPlayerEntityFeature.VOLUME_SET
| MediaPlayerEntityFeature.PLAY
)
_attr_volume_step = 2 / 60
def __init__(
self, name: str, remote: sharp_aquos_rc.TV, power_on_enabled: bool = False
@@ -162,6 +161,22 @@ class SharpAquosTVDevice(MediaPlayerEntity):
"""Turn off tvplayer."""
self._remote.power(0)
@_retry
def volume_up(self) -> None:
"""Volume up the media player."""
if self.volume_level is None:
_LOGGER.debug("Unknown volume in volume_up")
return
self._remote.volume(int(self.volume_level * 60) + 2)
@_retry
def volume_down(self) -> None:
"""Volume down media player."""
if self.volume_level is None:
_LOGGER.debug("Unknown volume in volume_down")
return
self._remote.volume(int(self.volume_level * 60) - 2)
@_retry
def set_volume_level(self, volume: float) -> None:
"""Set Volume media player."""

View File

@@ -30,5 +30,5 @@
"integration_type": "hub",
"iot_class": "cloud_push",
"loggers": ["pubnub", "yalexs"],
"requirements": ["yalexs==9.2.0", "yalexs-ble==3.2.7"]
"requirements": ["yalexs==9.2.0", "yalexs-ble==3.2.4"]
}

View File

@@ -61,13 +61,7 @@ class AuroraAbbDataUpdateCoordinator(DataUpdateCoordinator[dict[str, float]]):
frequency = self.client.measure(4)
i_leak_dcdc = self.client.measure(6)
i_leak_inverter = self.client.measure(7)
power_in_1 = self.client.measure(8)
power_in_2 = self.client.measure(9)
temperature_c = self.client.measure(21)
voltage_in_1 = self.client.measure(23)
current_in_1 = self.client.measure(25)
voltage_in_2 = self.client.measure(26)
current_in_2 = self.client.measure(27)
r_iso = self.client.measure(30)
energy_wh = self.client.cumulated_energy(5)
[alarm, *_] = self.client.alarms()
@@ -93,13 +87,7 @@ class AuroraAbbDataUpdateCoordinator(DataUpdateCoordinator[dict[str, float]]):
data["grid_frequency"] = round(frequency, 1)
data["i_leak_dcdc"] = i_leak_dcdc
data["i_leak_inverter"] = i_leak_inverter
data["power_in_1"] = round(power_in_1, 1)
data["power_in_2"] = round(power_in_2, 1)
data["temp"] = round(temperature_c, 1)
data["voltage_in_1"] = round(voltage_in_1, 1)
data["current_in_1"] = round(current_in_1, 1)
data["voltage_in_2"] = round(voltage_in_2, 1)
data["current_in_2"] = round(current_in_2, 1)
data["r_iso"] = r_iso
data["totalenergy"] = round(energy_wh / 1000, 2)
data["alarm"] = alarm

View File

@@ -68,7 +68,6 @@ SENSOR_TYPES = [
entity_category=EntityCategory.DIAGNOSTIC,
native_unit_of_measurement=UnitOfFrequency.HERTZ,
state_class=SensorStateClass.MEASUREMENT,
translation_key="grid_frequency",
entity_registry_enabled_default=False,
),
SensorEntityDescription(
@@ -89,60 +88,6 @@ SENSOR_TYPES = [
translation_key="i_leak_inverter",
entity_registry_enabled_default=False,
),
SensorEntityDescription(
key="power_in_1",
device_class=SensorDeviceClass.POWER,
entity_category=EntityCategory.DIAGNOSTIC,
native_unit_of_measurement=UnitOfPower.WATT,
state_class=SensorStateClass.MEASUREMENT,
translation_key="power_in_1",
entity_registry_enabled_default=False,
),
SensorEntityDescription(
key="power_in_2",
device_class=SensorDeviceClass.POWER,
entity_category=EntityCategory.DIAGNOSTIC,
native_unit_of_measurement=UnitOfPower.WATT,
state_class=SensorStateClass.MEASUREMENT,
translation_key="power_in_2",
entity_registry_enabled_default=False,
),
SensorEntityDescription(
key="voltage_in_1",
device_class=SensorDeviceClass.VOLTAGE,
entity_category=EntityCategory.DIAGNOSTIC,
native_unit_of_measurement=UnitOfElectricPotential.VOLT,
state_class=SensorStateClass.MEASUREMENT,
translation_key="voltage_in_1",
entity_registry_enabled_default=False,
),
SensorEntityDescription(
key="current_in_1",
device_class=SensorDeviceClass.CURRENT,
entity_category=EntityCategory.DIAGNOSTIC,
native_unit_of_measurement=UnitOfElectricCurrent.AMPERE,
state_class=SensorStateClass.MEASUREMENT,
translation_key="current_in_1",
entity_registry_enabled_default=False,
),
SensorEntityDescription(
key="voltage_in_2",
device_class=SensorDeviceClass.VOLTAGE,
entity_category=EntityCategory.DIAGNOSTIC,
native_unit_of_measurement=UnitOfElectricPotential.VOLT,
state_class=SensorStateClass.MEASUREMENT,
translation_key="voltage_in_2",
entity_registry_enabled_default=False,
),
SensorEntityDescription(
key="current_in_2",
device_class=SensorDeviceClass.CURRENT,
entity_category=EntityCategory.DIAGNOSTIC,
native_unit_of_measurement=UnitOfElectricCurrent.AMPERE,
state_class=SensorStateClass.MEASUREMENT,
translation_key="current_in_2",
entity_registry_enabled_default=False,
),
SensorEntityDescription(
key="alarm",
device_class=SensorDeviceClass.ENUM,

View File

@@ -24,18 +24,9 @@
"alarm": {
"name": "Alarm status"
},
"current_in_1": {
"name": "String 1 current"
},
"current_in_2": {
"name": "String 2 current"
},
"grid_current": {
"name": "Grid current"
},
"grid_frequency": {
"name": "Grid frequency"
},
"grid_voltage": {
"name": "Grid voltage"
},
@@ -45,12 +36,6 @@
"i_leak_inverter": {
"name": "Inverter leak current"
},
"power_in_1": {
"name": "String 1 power"
},
"power_in_2": {
"name": "String 2 power"
},
"power_output": {
"name": "Power output"
},
@@ -59,12 +44,6 @@
},
"total_energy": {
"name": "Total energy"
},
"voltage_in_1": {
"name": "String 1 voltage"
},
"voltage_in_2": {
"name": "String 2 voltage"
}
}
}

View File

@@ -149,7 +149,6 @@ _EXPERIMENTAL_TRIGGER_PLATFORMS = {
"lock",
"media_player",
"person",
"remote",
"scene",
"siren",
"switch",

View File

@@ -19,7 +19,7 @@ from homeassistant.components.backup import (
from homeassistant.core import HomeAssistant, callback
from . import S3ConfigEntry
from .const import CONF_BUCKET, CONF_PREFIX, DATA_BACKUP_AGENT_LISTENERS, DOMAIN
from .const import CONF_BUCKET, DATA_BACKUP_AGENT_LISTENERS, DOMAIN
from .helpers import async_list_backups_from_s3
_LOGGER = logging.getLogger(__name__)
@@ -100,13 +100,6 @@ class S3BackupAgent(BackupAgent):
self.unique_id = entry.entry_id
self._backup_cache: dict[str, AgentBackup] = {}
self._cache_expiration = time()
self._prefix: str = entry.data.get(CONF_PREFIX, "")
def _with_prefix(self, key: str) -> str:
"""Add prefix to a key if configured."""
if not self._prefix:
return key
return f"{self._prefix}/{key}"
@handle_boto_errors
async def async_download_backup(
@@ -122,9 +115,7 @@ class S3BackupAgent(BackupAgent):
backup = await self._find_backup_by_id(backup_id)
tar_filename, _ = suggested_filenames(backup)
response = await self._client.get_object(
Bucket=self._bucket, Key=self._with_prefix(tar_filename)
)
response = await self._client.get_object(Bucket=self._bucket, Key=tar_filename)
return response["Body"].iter_chunks()
async def async_upload_backup(
@@ -151,7 +142,7 @@ class S3BackupAgent(BackupAgent):
metadata_content = json.dumps(backup.as_dict())
await self._client.put_object(
Bucket=self._bucket,
Key=self._with_prefix(metadata_filename),
Key=metadata_filename,
Body=metadata_content,
)
except BotoCoreError as err:
@@ -178,7 +169,7 @@ class S3BackupAgent(BackupAgent):
await self._client.put_object(
Bucket=self._bucket,
Key=self._with_prefix(tar_filename),
Key=tar_filename,
Body=bytes(file_data),
)
@@ -195,7 +186,7 @@ class S3BackupAgent(BackupAgent):
_LOGGER.debug("Starting multipart upload for %s", tar_filename)
multipart_upload = await self._client.create_multipart_upload(
Bucket=self._bucket,
Key=self._with_prefix(tar_filename),
Key=tar_filename,
)
upload_id = multipart_upload["UploadId"]
try:
@@ -225,7 +216,7 @@ class S3BackupAgent(BackupAgent):
)
part = await cast(Any, self._client).upload_part(
Bucket=self._bucket,
Key=self._with_prefix(tar_filename),
Key=tar_filename,
PartNumber=part_number,
UploadId=upload_id,
Body=part_data.tobytes(),
@@ -253,7 +244,7 @@ class S3BackupAgent(BackupAgent):
)
part = await cast(Any, self._client).upload_part(
Bucket=self._bucket,
Key=self._with_prefix(tar_filename),
Key=tar_filename,
PartNumber=part_number,
UploadId=upload_id,
Body=remaining_data.tobytes(),
@@ -262,7 +253,7 @@ class S3BackupAgent(BackupAgent):
await cast(Any, self._client).complete_multipart_upload(
Bucket=self._bucket,
Key=self._with_prefix(tar_filename),
Key=tar_filename,
UploadId=upload_id,
MultipartUpload={"Parts": parts},
)
@@ -271,7 +262,7 @@ class S3BackupAgent(BackupAgent):
try:
await self._client.abort_multipart_upload(
Bucket=self._bucket,
Key=self._with_prefix(tar_filename),
Key=tar_filename,
UploadId=upload_id,
)
except BotoCoreError:
@@ -292,12 +283,8 @@ class S3BackupAgent(BackupAgent):
tar_filename, metadata_filename = suggested_filenames(backup)
# Delete both the backup file and its metadata file
await self._client.delete_object(
Bucket=self._bucket, Key=self._with_prefix(tar_filename)
)
await self._client.delete_object(
Bucket=self._bucket, Key=self._with_prefix(metadata_filename)
)
await self._client.delete_object(Bucket=self._bucket, Key=tar_filename)
await self._client.delete_object(Bucket=self._bucket, Key=metadata_filename)
# Reset cache after successful deletion
self._cache_expiration = time()
@@ -330,9 +317,7 @@ class S3BackupAgent(BackupAgent):
if time() <= self._cache_expiration:
return self._backup_cache
backups_list = await async_list_backups_from_s3(
self._client, self._bucket, self._prefix
)
backups_list = await async_list_backups_from_s3(self._client, self._bucket)
self._backup_cache = {b.backup_id: b for b in backups_list}
self._cache_expiration = time() + CACHE_TTL

View File

@@ -22,7 +22,6 @@ from .const import (
CONF_ACCESS_KEY_ID,
CONF_BUCKET,
CONF_ENDPOINT_URL,
CONF_PREFIX,
CONF_SECRET_ACCESS_KEY,
DEFAULT_ENDPOINT_URL,
DESCRIPTION_AWS_S3_DOCS_URL,
@@ -40,7 +39,6 @@ STEP_USER_DATA_SCHEMA = vol.Schema(
vol.Required(CONF_ENDPOINT_URL, default=DEFAULT_ENDPOINT_URL): TextSelector(
config=TextSelectorConfig(type=TextSelectorType.URL)
),
vol.Optional(CONF_PREFIX, default=""): cv.string,
}
)
@@ -55,20 +53,16 @@ class S3ConfigFlow(ConfigFlow, domain=DOMAIN):
errors: dict[str, str] = {}
if user_input is not None:
normalized_prefix = user_input.get(CONF_PREFIX, "").strip("/")
# Check for existing entries, treating missing prefix as empty
for entry in self._async_current_entries(include_ignore=False):
entry_prefix = (entry.data.get(CONF_PREFIX) or "").strip("/")
if (
entry.data.get(CONF_BUCKET) == user_input[CONF_BUCKET]
and entry.data.get(CONF_ENDPOINT_URL)
== user_input[CONF_ENDPOINT_URL]
and entry_prefix == normalized_prefix
):
return self.async_abort(reason="already_configured")
self._async_abort_entries_match(
{
CONF_BUCKET: user_input[CONF_BUCKET],
CONF_ENDPOINT_URL: user_input[CONF_ENDPOINT_URL],
}
)
hostname = urlparse(user_input[CONF_ENDPOINT_URL]).hostname
if not hostname or not hostname.endswith(AWS_DOMAIN):
if not urlparse(user_input[CONF_ENDPOINT_URL]).hostname.endswith(
AWS_DOMAIN
):
errors[CONF_ENDPOINT_URL] = "invalid_endpoint_url"
else:
try:
@@ -90,18 +84,9 @@ class S3ConfigFlow(ConfigFlow, domain=DOMAIN):
except ConnectionError:
errors[CONF_ENDPOINT_URL] = "cannot_connect"
else:
data = dict(user_input)
if not normalized_prefix:
# Do not persist empty optional values
data.pop(CONF_PREFIX, None)
else:
data[CONF_PREFIX] = normalized_prefix
title = user_input[CONF_BUCKET]
if normalized_prefix:
title = f"{title} - {normalized_prefix}"
return self.async_create_entry(title=title, data=data)
return self.async_create_entry(
title=user_input[CONF_BUCKET], data=user_input
)
return self.async_show_form(
step_id="user",

View File

@@ -11,7 +11,6 @@ CONF_ACCESS_KEY_ID = "access_key_id"
CONF_SECRET_ACCESS_KEY = "secret_access_key"
CONF_ENDPOINT_URL = "endpoint_url"
CONF_BUCKET = "bucket"
CONF_PREFIX = "prefix"
AWS_DOMAIN = "amazonaws.com"
DEFAULT_ENDPOINT_URL = f"https://s3.eu-central-1.{AWS_DOMAIN}/"

View File

@@ -13,7 +13,7 @@ from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import CONF_BUCKET, CONF_PREFIX, DOMAIN
from .const import CONF_BUCKET, DOMAIN
from .helpers import async_list_backups_from_s3
SCAN_INTERVAL = timedelta(hours=6)
@@ -53,14 +53,11 @@ class S3DataUpdateCoordinator(DataUpdateCoordinator[SensorData]):
)
self.client = client
self._bucket: str = entry.data[CONF_BUCKET]
self._prefix: str = entry.data.get(CONF_PREFIX, "")
async def _async_update_data(self) -> SensorData:
"""Fetch data from AWS S3."""
try:
backups = await async_list_backups_from_s3(
self.client, self._bucket, self._prefix
)
backups = await async_list_backups_from_s3(self.client, self._bucket)
except BotoCoreError as error:
raise UpdateFailed(
translation_domain=DOMAIN,

View File

@@ -1,55 +0,0 @@
"""Diagnostics support for AWS S3."""
from __future__ import annotations
import dataclasses
from typing import Any
from homeassistant.components.backup import (
DATA_MANAGER as BACKUP_DATA_MANAGER,
BackupManager,
)
from homeassistant.components.diagnostics import async_redact_data
from homeassistant.core import HomeAssistant
from .const import (
CONF_ACCESS_KEY_ID,
CONF_BUCKET,
CONF_PREFIX,
CONF_SECRET_ACCESS_KEY,
DOMAIN,
)
from .coordinator import S3ConfigEntry
from .helpers import async_list_backups_from_s3
TO_REDACT = (CONF_ACCESS_KEY_ID, CONF_SECRET_ACCESS_KEY)
async def async_get_config_entry_diagnostics(
hass: HomeAssistant,
entry: S3ConfigEntry,
) -> dict[str, Any]:
"""Return diagnostics for a config entry."""
coordinator = entry.runtime_data
backup_manager: BackupManager = hass.data[BACKUP_DATA_MANAGER]
backups = await async_list_backups_from_s3(
coordinator.client,
bucket=entry.data[CONF_BUCKET],
prefix=entry.data.get(CONF_PREFIX, ""),
)
data = {
"coordinator_data": dataclasses.asdict(coordinator.data),
"config": {
**entry.data,
**entry.options,
},
"backup_agents": [
{"name": agent.name}
for agent in backup_manager.backup_agents.values()
if agent.domain == DOMAIN
],
"backup": [backup.as_dict() for backup in backups],
}
return async_redact_data(data, TO_REDACT)

View File

@@ -17,17 +17,11 @@ _LOGGER = logging.getLogger(__name__)
async def async_list_backups_from_s3(
client: S3Client,
bucket: str,
prefix: str,
) -> list[AgentBackup]:
"""List backups from an S3 bucket by reading metadata files."""
paginator = client.get_paginator("list_objects_v2")
metadata_files: list[dict[str, Any]] = []
list_kwargs: dict[str, Any] = {"Bucket": bucket}
if prefix:
list_kwargs["Prefix"] = prefix + "/"
async for page in paginator.paginate(**list_kwargs):
async for page in paginator.paginate(Bucket=bucket):
metadata_files.extend(
obj
for obj in page.get("Contents", [])

View File

@@ -23,9 +23,7 @@ rules:
runtime-data: done
test-before-configure: done
test-before-setup: done
unique-config-entry:
status: exempt
comment: Hassfest does not recognize the duplicate prevention logic. Duplicate entries are prevented by checking bucket, endpoint URL, and prefix in the config flow.
unique-config-entry: done
# Silver
action-exceptions:
@@ -38,14 +36,14 @@ rules:
docs-installation-parameters: done
entity-unavailable: done
integration-owner: done
log-when-unavailable: done
log-when-unavailable: todo
parallel-updates: done
reauthentication-flow: todo
test-coverage: done
# Gold
devices: done
diagnostics: done
diagnostics: todo
discovery-update-info:
status: exempt
comment: S3 is a cloud service that is not discovered on the network.

View File

@@ -15,14 +15,12 @@
"access_key_id": "Access key ID",
"bucket": "Bucket name",
"endpoint_url": "Endpoint URL",
"prefix": "Prefix",
"secret_access_key": "Secret access key"
},
"data_description": {
"access_key_id": "Access key ID to connect to AWS S3 API",
"bucket": "Bucket must already exist and be writable by the provided credentials.",
"endpoint_url": "Endpoint URL provided to [Boto3 Session]({boto3_docs_url}). Region-specific [AWS S3 endpoints]({aws_s3_docs_url}) are available in their docs.",
"prefix": "Folder or prefix to store backups in, for example `backups`",
"secret_access_key": "Secret access key to connect to AWS S3 API"
},
"title": "Add AWS S3 bucket"

View File

@@ -29,17 +29,12 @@ class StoredBackupData(TypedDict):
class _BackupStore(Store[StoredBackupData]):
"""Class to help storing backup data."""
# Maximum version we support reading for forward compatibility.
# This allows reading data written by a newer HA version after downgrade.
_MAX_READABLE_VERSION = 2
def __init__(self, hass: HomeAssistant) -> None:
"""Initialize storage class."""
super().__init__(
hass,
STORAGE_VERSION,
STORAGE_KEY,
max_readable_version=self._MAX_READABLE_VERSION,
minor_version=STORAGE_VERSION_MINOR,
)
@@ -91,8 +86,8 @@ class _BackupStore(Store[StoredBackupData]):
# data["config"]["schedule"]["state"] will be removed. The bump to 2 is
# planned to happen after a 6 month quiet period with no minor version
# changes.
# Reject if major version is higher than _MAX_READABLE_VERSION.
if old_major_version > self._MAX_READABLE_VERSION:
# Reject if major version is higher than 2.
if old_major_version > 2:
raise NotImplementedError
return data
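The pattern on both sides of this hunk is a forward-compatibility guard in the storage migration: data written by a newer major schema version than this build understands is rejected instead of being silently misread. A minimal sketch, with an illustrative constant name and data shape:

```python
from typing import Any

MAX_READABLE_VERSION = 2  # illustrative; mirrors the hard-coded "2" above


def migrate(old_major_version: int, data: dict[str, Any]) -> dict[str, Any]:
    """Accept data written by the same or an older major version only."""
    if old_major_version > MAX_READABLE_VERSION:
        # Data written by a newer schema, e.g. after a downgrade
        raise NotImplementedError
    return data
```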

View File

@@ -43,11 +43,11 @@
"title": "The backup location {agent_id} is unavailable"
},
"automatic_backup_failed_addons": {
"description": "Apps {failed_addons} could not be included in automatic backup. Please check the Supervisor logs for more information. Another attempt will be made at the next scheduled time if a backup schedule is configured.",
"title": "Not all apps could be included in automatic backup"
"description": "Add-ons {failed_addons} could not be included in automatic backup. Please check the Supervisor logs for more information. Another attempt will be made at the next scheduled time if a backup schedule is configured.",
"title": "Not all add-ons could be included in automatic backup"
},
"automatic_backup_failed_agents_addons_folders": {
"description": "The automatic backup was created with errors:\n* Locations which the backup could not be uploaded to: {failed_agents}\n* Apps which could not be backed up: {failed_addons}\n* Folders which could not be backed up: {failed_folders}\n\nPlease check the Core and Supervisor logs for more information. Another attempt will be made at the next scheduled time if a backup schedule is configured.",
"description": "The automatic backup was created with errors:\n* Locations which the backup could not be uploaded to: {failed_agents}\n* Add-ons which could not be backed up: {failed_addons}\n* Folders which could not be backed up: {failed_folders}\n\nPlease check the Core and Supervisor logs for more information. Another attempt will be made at the next scheduled time if a backup schedule is configured.",
"title": "Automatic backup was created with errors"
},
"automatic_backup_failed_create": {

View File

@@ -24,7 +24,7 @@ class BinarySensorOnOffTrigger(EntityTargetStateTriggerBase):
"""Class for binary sensor on/off triggers."""
_device_class: BinarySensorDeviceClass | None
_domains = {DOMAIN}
_domain: str = DOMAIN
def entity_filter(self, entities: set[str]) -> set[str]:
"""Filter entities of this domain."""

View File

@@ -190,7 +190,7 @@ class BitcoinSensor(SensorEntity):
elif sensor_type == "miners_revenue_usd":
self._attr_native_value = f"{stats.miners_revenue_usd:.0f}"
elif sensor_type == "btc_mined":
self._attr_native_value = str(stats.btc_mined * 1e-8)
self._attr_native_value = str(stats.btc_mined * 0.00000001)
elif sensor_type == "trade_volume_usd":
self._attr_native_value = f"{stats.trade_volume_usd:.1f}"
elif sensor_type == "difficulty":
@@ -208,13 +208,13 @@ class BitcoinSensor(SensorEntity):
elif sensor_type == "blocks_size":
self._attr_native_value = f"{stats.blocks_size:.1f}"
elif sensor_type == "total_fees_btc":
self._attr_native_value = f"{stats.total_fees_btc * 1e-8:.2f}"
self._attr_native_value = f"{stats.total_fees_btc * 0.00000001:.2f}"
elif sensor_type == "total_btc_sent":
self._attr_native_value = f"{stats.total_btc_sent * 1e-8:.2f}"
self._attr_native_value = f"{stats.total_btc_sent * 0.00000001:.2f}"
elif sensor_type == "estimated_btc_sent":
self._attr_native_value = f"{stats.estimated_btc_sent * 1e-8:.2f}"
self._attr_native_value = f"{stats.estimated_btc_sent * 0.00000001:.2f}"
elif sensor_type == "total_btc":
self._attr_native_value = f"{stats.total_btc * 1e-8:.2f}"
self._attr_native_value = f"{stats.total_btc * 0.00000001:.2f}"
elif sensor_type == "total_blocks":
self._attr_native_value = f"{stats.total_blocks:.0f}"
elif sensor_type == "next_retarget":
@@ -222,7 +222,7 @@ class BitcoinSensor(SensorEntity):
elif sensor_type == "estimated_transaction_volume_usd":
self._attr_native_value = f"{stats.estimated_transaction_volume_usd:.2f}"
elif sensor_type == "miners_revenue_btc":
self._attr_native_value = f"{stats.miners_revenue_btc * 1e-8:.1f}"
self._attr_native_value = f"{stats.miners_revenue_btc * 0.00000001:.1f}"
elif sensor_type == "market_price_usd":
self._attr_native_value = f"{stats.market_price_usd:.2f}"
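The only change in this hunk is the spelling of the satoshi-to-BTC factor; `1e-8` and `0.00000001` parse to the same float. A quick sanity check with an arbitrary amount:

```python
satoshi = 2_512_345_678  # hypothetical on-chain amount
assert 1e-8 == 0.00000001
print(f"{satoshi * 1e-8:.2f}")  # -> 25.12 (BTC)
```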

View File

@@ -85,7 +85,6 @@ class BluesoundPlayer(CoordinatorEntity[BluesoundCoordinator], MediaPlayerEntity
_attr_media_content_type = MediaType.MUSIC
_attr_has_entity_name = True
_attr_name = None
_attr_volume_step = 0.01
def __init__(
self,
@@ -689,6 +688,24 @@ class BluesoundPlayer(CoordinatorEntity[BluesoundCoordinator], MediaPlayerEntity
await self._player.play_url(url)
async def async_volume_up(self) -> None:
"""Volume up the media player."""
if self.volume_level is None:
return
new_volume = self.volume_level + 0.01
new_volume = min(1, new_volume)
await self.async_set_volume_level(new_volume)
async def async_volume_down(self) -> None:
"""Volume down the media player."""
if self.volume_level is None:
return
new_volume = self.volume_level - 0.01
new_volume = max(0, new_volume)
await self.async_set_volume_level(new_volume)
async def async_set_volume_level(self, volume: float) -> None:
"""Send volume_up command to media player."""
volume = int(round(volume * 100))

View File

@@ -1,291 +0,0 @@
"""The Brands integration."""
from __future__ import annotations
from collections import deque
from http import HTTPStatus
import logging
from pathlib import Path
from random import SystemRandom
import time
from typing import Any, Final
from aiohttp import ClientError, hdrs, web
import voluptuous as vol
from homeassistant.components import websocket_api
from homeassistant.components.http import KEY_AUTHENTICATED, HomeAssistantView
from homeassistant.core import HomeAssistant, callback, valid_domain
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.event import async_track_time_interval
from homeassistant.helpers.typing import ConfigType
from homeassistant.loader import async_get_custom_components
from .const import (
ALLOWED_IMAGES,
BRANDS_CDN_URL,
CACHE_TTL,
CATEGORY_RE,
CDN_TIMEOUT,
DOMAIN,
HARDWARE_IMAGE_RE,
IMAGE_FALLBACKS,
PLACEHOLDER,
TOKEN_CHANGE_INTERVAL,
)
_LOGGER = logging.getLogger(__name__)
_RND: Final = SystemRandom()
CONFIG_SCHEMA = cv.empty_config_schema(DOMAIN)
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the Brands integration."""
access_tokens: deque[str] = deque([], 2)
access_tokens.append(hex(_RND.getrandbits(256))[2:])
hass.data[DOMAIN] = access_tokens
@callback
def _rotate_token(_now: Any) -> None:
"""Rotate the access token."""
access_tokens.append(hex(_RND.getrandbits(256))[2:])
async_track_time_interval(hass, _rotate_token, TOKEN_CHANGE_INTERVAL)
hass.http.register_view(BrandsIntegrationView(hass))
hass.http.register_view(BrandsHardwareView(hass))
websocket_api.async_register_command(hass, ws_access_token)
return True
@callback
@websocket_api.websocket_command({vol.Required("type"): "brands/access_token"})
def ws_access_token(
hass: HomeAssistant,
connection: websocket_api.ActiveConnection,
msg: dict[str, Any],
) -> None:
"""Return the current brands access token."""
access_tokens: deque[str] = hass.data[DOMAIN]
connection.send_result(msg["id"], {"token": access_tokens[-1]})
def _read_cached_file_with_marker(
cache_path: Path,
) -> tuple[bytes | None, float] | None:
"""Read a cached file, distinguishing between content and 404 markers.
Returns (content, mtime) where content is None for 404 markers (empty files).
Returns None if the file does not exist at all.
"""
if not cache_path.is_file():
return None
mtime = cache_path.stat().st_mtime
data = cache_path.read_bytes()
if not data:
# Empty file is a 404 marker
return (None, mtime)
return (data, mtime)
def _write_cache_file(cache_path: Path, data: bytes) -> None:
"""Write data to cache file, creating directories as needed."""
cache_path.parent.mkdir(parents=True, exist_ok=True)
cache_path.write_bytes(data)
def _read_brand_file(brand_dir: Path, image: str) -> bytes | None:
"""Read a brand image, trying fallbacks in a single I/O pass."""
for candidate in (image, *IMAGE_FALLBACKS.get(image, ())):
file_path = brand_dir / candidate
if file_path.is_file():
return file_path.read_bytes()
return None
class _BrandsBaseView(HomeAssistantView):
"""Base view for serving brand images."""
requires_auth = False
def __init__(self, hass: HomeAssistant) -> None:
"""Initialize the view."""
self._hass = hass
self._cache_dir = Path(hass.config.cache_path(DOMAIN))
def _authenticate(self, request: web.Request) -> None:
"""Authenticate the request using Bearer token or query token."""
access_tokens: deque[str] = self._hass.data[DOMAIN]
authenticated = (
request[KEY_AUTHENTICATED] or request.query.get("token") in access_tokens
)
if not authenticated:
if hdrs.AUTHORIZATION in request.headers:
raise web.HTTPUnauthorized
raise web.HTTPForbidden
async def _serve_from_custom_integration(
self,
domain: str,
image: str,
) -> web.Response | None:
"""Try to serve a brand image from a custom integration."""
custom_components = await async_get_custom_components(self._hass)
if (integration := custom_components.get(domain)) is None:
return None
if not integration.has_branding:
return None
brand_dir = Path(integration.file_path) / "brand"
data = await self._hass.async_add_executor_job(
_read_brand_file, brand_dir, image
)
if data is not None:
return self._build_response(data)
return None
async def _serve_from_cache_or_cdn(
self,
cdn_path: str,
cache_subpath: str,
*,
fallback_placeholder: bool = True,
) -> web.Response:
"""Serve from disk cache, fetching from CDN if needed."""
cache_path = self._cache_dir / cache_subpath
now = time.time()
# Try disk cache
result = await self._hass.async_add_executor_job(
_read_cached_file_with_marker, cache_path
)
if result is not None:
data, mtime = result
# Schedule background refresh if stale
if now - mtime > CACHE_TTL:
self._hass.async_create_background_task(
self._fetch_and_cache(cdn_path, cache_path),
f"brands_refresh_{cache_subpath}",
)
else:
# Cache miss - fetch from CDN
data = await self._fetch_and_cache(cdn_path, cache_path)
if data is None:
if fallback_placeholder:
return await self._serve_placeholder(
image=cache_subpath.rsplit("/", 1)[-1]
)
return web.Response(status=HTTPStatus.NOT_FOUND)
return self._build_response(data)
async def _fetch_and_cache(
self,
cdn_path: str,
cache_path: Path,
) -> bytes | None:
"""Fetch from CDN and write to cache. Returns data or None on 404."""
url = f"{BRANDS_CDN_URL}/{cdn_path}"
session = async_get_clientsession(self._hass)
try:
resp = await session.get(url, timeout=CDN_TIMEOUT)
except (ClientError, TimeoutError):
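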
_LOGGER.debug("Failed to fetch brand from CDN: %s", cdn_path)
return None
if resp.status == HTTPStatus.NOT_FOUND:
# Cache the 404 as empty file
await self._hass.async_add_executor_job(_write_cache_file, cache_path, b"")
return None
if resp.status != HTTPStatus.OK:
_LOGGER.debug("Unexpected CDN response %s for %s", resp.status, cdn_path)
return None
data = await resp.read()
await self._hass.async_add_executor_job(_write_cache_file, cache_path, data)
return data
async def _serve_placeholder(self, image: str) -> web.Response:
"""Serve a placeholder image."""
return await self._serve_from_cache_or_cdn(
cdn_path=f"_/{PLACEHOLDER}/{image}",
cache_subpath=f"integrations/{PLACEHOLDER}/{image}",
fallback_placeholder=False,
)
@staticmethod
def _build_response(data: bytes) -> web.Response:
"""Build a response with proper headers."""
return web.Response(
body=data,
content_type="image/png",
)
class BrandsIntegrationView(_BrandsBaseView):
"""Serve integration brand images."""
name = "api:brands:integration"
url = "/api/brands/integration/{domain}/{image}"
async def get(
self,
request: web.Request,
domain: str,
image: str,
) -> web.Response:
"""Handle GET request for an integration brand image."""
self._authenticate(request)
if not valid_domain(domain) or image not in ALLOWED_IMAGES:
return web.Response(status=HTTPStatus.NOT_FOUND)
use_placeholder = request.query.get("placeholder") != "no"
# 1. Try custom integration local files
if (
response := await self._serve_from_custom_integration(domain, image)
) is not None:
return response
# 2. Try cache / CDN (always use direct path for proper 404 caching)
return await self._serve_from_cache_or_cdn(
cdn_path=f"brands/{domain}/{image}",
cache_subpath=f"integrations/{domain}/{image}",
fallback_placeholder=use_placeholder,
)
class BrandsHardwareView(_BrandsBaseView):
"""Serve hardware brand images."""
name = "api:brands:hardware"
url = "/api/brands/hardware/{category}/{image:.+}"
async def get(
self,
request: web.Request,
category: str,
image: str,
) -> web.Response:
"""Handle GET request for a hardware brand image."""
self._authenticate(request)
if not CATEGORY_RE.match(category):
return web.Response(status=HTTPStatus.NOT_FOUND)
# Hardware images have dynamic names like "manufacturer_model.png"
# Validate it ends with .png and contains only safe characters
if not HARDWARE_IMAGE_RE.match(image):
return web.Response(status=HTTPStatus.NOT_FOUND)
cache_subpath = f"hardware/{category}/{image}"
return await self._serve_from_cache_or_cdn(
cdn_path=cache_subpath,
cache_subpath=cache_subpath,
)

View File

@@ -1,57 +0,0 @@
"""Constants for the Brands integration."""
from __future__ import annotations
from datetime import timedelta
import re
from typing import Final
from aiohttp import ClientTimeout
DOMAIN: Final = "brands"
# CDN
BRANDS_CDN_URL: Final = "https://brands.home-assistant.io"
CDN_TIMEOUT: Final = ClientTimeout(total=10)
PLACEHOLDER: Final = "_placeholder"
# Caching
CACHE_TTL: Final = 30 * 24 * 60 * 60 # 30 days in seconds
# Access token
TOKEN_CHANGE_INTERVAL: Final = timedelta(minutes=30)
# Validation
CATEGORY_RE: Final = re.compile(r"^[a-z0-9_]+$")
HARDWARE_IMAGE_RE: Final = re.compile(r"^[a-z0-9_-]+\.png$")
# Images and fallback chains
ALLOWED_IMAGES: Final = frozenset(
{
"icon.png",
"logo.png",
"icon@2x.png",
"logo@2x.png",
"dark_icon.png",
"dark_logo.png",
"dark_icon@2x.png",
"dark_logo@2x.png",
}
)
# Fallback chains for image resolution, mirroring the brands CDN build logic.
# When a requested image is not found, we try each fallback in order.
IMAGE_FALLBACKS: Final[dict[str, list[str]]] = {
"logo.png": ["icon.png"],
"icon@2x.png": ["icon.png"],
"logo@2x.png": ["logo.png", "icon.png"],
"dark_icon.png": ["icon.png"],
"dark_logo.png": ["dark_icon.png", "logo.png", "icon.png"],
"dark_icon@2x.png": ["icon@2x.png", "icon.png"],
"dark_logo@2x.png": [
"dark_icon@2x.png",
"logo@2x.png",
"logo.png",
"icon.png",
],
}
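For illustration, this is how one request would walk the chain encoded above; the set of available files is hypothetical:

```python
available = {"icon.png"}  # hypothetical: only the base icon exists for this brand
requested = "dark_logo@2x.png"
candidates = (requested, *IMAGE_FALLBACKS[requested])
served = next((name for name in candidates if name in available), None)
# candidates == ("dark_logo@2x.png", "dark_icon@2x.png", "logo@2x.png", "logo.png", "icon.png")
# served == "icon.png"
```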

View File

@@ -1,10 +0,0 @@
{
"domain": "brands",
"name": "Brands",
"codeowners": ["@home-assistant/core"],
"config_flow": false,
"dependencies": ["http", "websocket_api"],
"documentation": "https://www.home-assistant.io/integrations/brands",
"integration_type": "system",
"quality_scale": "internal"
}

View File

@@ -7,8 +7,7 @@
"integration_type": "device",
"iot_class": "local_polling",
"loggers": ["bsblan"],
"quality_scale": "silver",
"requirements": ["python-bsblan==5.1.1"],
"requirements": ["python-bsblan==5.0.1"],
"zeroconf": [
{
"name": "bsb-lan*",

View File

@@ -1,74 +0,0 @@
rules:
# Bronze
action-setup: done
appropriate-polling: done
brands: done
common-modules: done
config-flow-test-coverage: done
config-flow: done
dependency-transparency: done
docs-actions: done
docs-high-level-description: done
docs-installation-instructions: done
docs-removal-instructions: done
entity-event-setup:
status: exempt
comment: |
Entities of this integration does not explicitly subscribe to events.
entity-unique-id: done
has-entity-name: done
runtime-data: done
test-before-configure: done
test-before-setup: done
unique-config-entry: done
# Silver
action-exceptions: done
config-entry-unloading: done
docs-configuration-parameters: done
docs-installation-parameters: done
entity-unavailable: done
integration-owner: done
log-when-unavailable: done
parallel-updates: done
reauthentication-flow: done
test-coverage: done
# Gold
devices: done
diagnostics: done
discovery-update-info: done
discovery: done
docs-data-update: todo
docs-examples: todo
docs-known-limitations: todo
docs-supported-devices: todo
docs-supported-functions: todo
docs-troubleshooting: todo
docs-use-cases: todo
dynamic-devices:
status: exempt
comment: |
This integration has a fixed single device.
entity-category: done
entity-device-class: done
entity-disabled-by-default:
status: exempt
comment: |
This integration provides a limited number of entities, all of which are useful to users.
entity-translations: done
exception-translations: done
icon-translations: todo
reconfiguration-flow: todo
repair-issues:
status: exempt
comment: |
This integration doesn't have any cases where raising an issue is needed.
stale-devices:
status: exempt
comment: |
This integration has a fixed single device.
# Platinum
async-dependency: done
inject-websession: done
strict-typing: done

View File

@@ -64,8 +64,6 @@ SENSOR_TYPES: tuple[BSBLanSensorEntityDescription, ...] = (
device_class=SensorDeviceClass.ENERGY,
native_unit_of_measurement=UnitOfEnergy.KILO_WATT_HOUR,
state_class=SensorStateClass.TOTAL_INCREASING,
suggested_display_precision=0,
entity_registry_enabled_default=False,
value_fn=lambda data: (
data.sensor.total_energy.value
if data.sensor.total_energy is not None

View File

@@ -31,6 +31,10 @@ ATTR_FRIDAY_SLOTS = "friday_slots"
ATTR_SATURDAY_SLOTS = "saturday_slots"
ATTR_SUNDAY_SLOTS = "sunday_slots"
# Service names
SERVICE_SET_HOT_WATER_SCHEDULE = "set_hot_water_schedule"
SERVICE_SYNC_TIME = "sync_time"
# Schema for a single time slot
_SLOT_SCHEMA = vol.Schema(
@@ -256,14 +260,14 @@ def async_setup_services(hass: HomeAssistant) -> None:
"""Register the BSB-LAN services."""
hass.services.async_register(
DOMAIN,
"set_hot_water_schedule",
SERVICE_SET_HOT_WATER_SCHEDULE,
set_hot_water_schedule,
schema=SERVICE_SET_HOT_WATER_SCHEDULE_SCHEMA,
)
hass.services.async_register(
DOMAIN,
"sync_time",
SERVICE_SYNC_TIME,
async_sync_time,
schema=SYNC_TIME_SCHEMA,
)

View File

@@ -14,7 +14,7 @@ from . import DOMAIN
class ButtonPressedTrigger(EntityTriggerBase):
"""Trigger for button entity presses."""
_domains = {DOMAIN}
_domain = DOMAIN
_schema = ENTITY_STATE_TRIGGER_SCHEMA
def is_valid_transition(self, from_state: State, to_state: State) -> bool:

View File

@@ -29,12 +29,6 @@
"early_update": {
"default": "mdi:update"
},
"equalizer": {
"default": "mdi:equalizer",
"state": {
"off": "mdi:equalizer-outline"
}
},
"pre_amp": {
"default": "mdi:volume-high",
"state": {

View File

@@ -38,7 +38,7 @@ async def _root_payload(
media_class=MediaClass.DIRECTORY,
media_content_id="",
media_content_type="presets",
thumbnail="/api/brands/integration/cambridge_audio/logo.png",
thumbnail="https://brands.home-assistant.io/_/cambridge_audio/logo.png",
can_play=False,
can_expand=True,
)

View File

@@ -65,9 +65,6 @@
"early_update": {
"name": "Early update"
},
"equalizer": {
"name": "Equalizer"
},
"pre_amp": {
"name": "Pre-Amp"
},

View File

@@ -33,13 +33,6 @@ def room_correction_enabled(client: StreamMagicClient) -> bool:
return client.audio.tilt_eq.enabled
def equalizer_enabled(client: StreamMagicClient) -> bool:
"""Check if equalizer is enabled."""
if TYPE_CHECKING:
assert client.audio.user_eq is not None
return client.audio.user_eq.enabled
CONTROL_ENTITIES: tuple[CambridgeAudioSwitchEntityDescription, ...] = (
CambridgeAudioSwitchEntityDescription(
key="pre_amp",
@@ -63,14 +56,6 @@ CONTROL_ENTITIES: tuple[CambridgeAudioSwitchEntityDescription, ...] = (
value_fn=room_correction_enabled,
set_value_fn=lambda client, value: client.set_room_correction_mode(value),
),
CambridgeAudioSwitchEntityDescription(
key="equalizer",
translation_key="equalizer",
entity_category=EntityCategory.CONFIG,
load_fn=lambda client: client.audio.user_eq is not None,
value_fn=equalizer_enabled,
set_value_fn=lambda client, value: client.set_equalizer_mode(value),
),
)

View File

@@ -804,24 +804,9 @@ class CastMediaPlayerEntity(CastDevice, MediaPlayerEntity):
@property
def state(self) -> MediaPlayerState | None:
"""Return the state of the player."""
if (chromecast := self._chromecast) is None or (
cast_status := self.cast_status
) is None:
# Not connected to any chromecast, or not yet got any status
return None
if (
chromecast.cast_type == pychromecast.const.CAST_TYPE_CHROMECAST
and not chromecast.ignore_cec
and cast_status.is_active_input is False
):
# The display interface for the device has been turned off or switched away
return MediaPlayerState.OFF
# The lovelace app loops media to prevent timing out, don't show that
if self.app_id == CAST_APP_ID_HOMEASSISTANT_LOVELACE:
# The lovelace app loops media to prevent timing out, don't show that
return MediaPlayerState.PLAYING
if (media_status := self._media_status()[0]) is not None:
if media_status.player_state == MEDIA_PLAYER_STATE_PLAYING:
return MediaPlayerState.PLAYING
@@ -832,16 +817,20 @@ class CastMediaPlayerEntity(CastDevice, MediaPlayerEntity):
if media_status.player_is_idle:
return MediaPlayerState.IDLE
if self._chromecast is not None and self._chromecast.is_idle:
# If library consider us idle, that is our off state
# it takes HDMI status into account for cast devices.
return MediaPlayerState.OFF
if self.app_id in APP_IDS_UNRELIABLE_MEDIA_INFO:
# Some apps don't report media status, show the player as playing
return MediaPlayerState.PLAYING
if self.app_id in (pychromecast.IDLE_APP_ID, None):
# We have no active app or the home screen app. This is
# same app as APP_BACKDROP.
return MediaPlayerState.OFF
if self.app_id is not None:
# We have an active app
return MediaPlayerState.IDLE
return MediaPlayerState.IDLE
return None
@property
def media_content_id(self) -> str | None:

View File

@@ -43,7 +43,7 @@ HVAC_MODE_CHANGED_TRIGGER_SCHEMA = ENTITY_STATE_TRIGGER_SCHEMA_FIRST_LAST.extend
class HVACModeChangedTrigger(EntityTargetStateTriggerBase):
"""Trigger for entity state changes."""
_domains = {DOMAIN}
_domain = DOMAIN
_schema = HVAC_MODE_CHANGED_TRIGGER_SCHEMA
def __init__(self, hass: HomeAssistant, config: TriggerConfig) -> None:

View File

@@ -12,10 +12,8 @@ from .coordinator import CompitConfigEntry, CompitDataUpdateCoordinator
PLATFORMS = [
Platform.BINARY_SENSOR,
Platform.CLIMATE,
Platform.FAN,
Platform.NUMBER,
Platform.SELECT,
Platform.SENSOR,
Platform.WATER_HEATER,
]

View File

@@ -1,172 +0,0 @@
"""Fan platform for Compit integration."""
from typing import Any
from compit_inext_api import PARAM_VALUES
from compit_inext_api.consts import CompitParameter
from homeassistant.components.fan import (
FanEntity,
FanEntityDescription,
FanEntityFeature,
)
from homeassistant.const import STATE_OFF, STATE_ON
from homeassistant.core import HomeAssistant
from homeassistant.helpers.device_registry import DeviceInfo
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from homeassistant.helpers.update_coordinator import CoordinatorEntity
from homeassistant.util.percentage import (
ordered_list_item_to_percentage,
percentage_to_ordered_list_item,
)
from .const import DOMAIN, MANUFACTURER_NAME
from .coordinator import CompitConfigEntry, CompitDataUpdateCoordinator
PARALLEL_UPDATES = 0
COMPIT_GEAR_TO_HA = PARAM_VALUES[CompitParameter.VENTILATION_GEAR_TARGET]
HA_STATE_TO_COMPIT = {value: key for key, value in COMPIT_GEAR_TO_HA.items()}
DEVICE_DEFINITIONS: dict[int, FanEntityDescription] = {
223: FanEntityDescription(
key="Nano Color 2",
translation_key="ventilation",
),
12: FanEntityDescription(
key="Nano Color",
translation_key="ventilation",
),
}
async def async_setup_entry(
hass: HomeAssistant,
entry: CompitConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Compit fan entities from a config entry."""
coordinator = entry.runtime_data
async_add_entities(
CompitFan(
coordinator,
device_id,
device_definition,
)
for device_id, device in coordinator.connector.all_devices.items()
if (device_definition := DEVICE_DEFINITIONS.get(device.definition.code))
)
class CompitFan(CoordinatorEntity[CompitDataUpdateCoordinator], FanEntity):
"""Representation of a Compit fan entity."""
_attr_speed_count = len(COMPIT_GEAR_TO_HA)
_attr_has_entity_name = True
_attr_name = None
_attr_supported_features = (
FanEntityFeature.TURN_ON
| FanEntityFeature.TURN_OFF
| FanEntityFeature.SET_SPEED
)
def __init__(
self,
coordinator: CompitDataUpdateCoordinator,
device_id: int,
entity_description: FanEntityDescription,
) -> None:
"""Initialize the fan entity."""
super().__init__(coordinator)
self.device_id = device_id
self.entity_description = entity_description
self._attr_unique_id = f"{device_id}_{entity_description.key}"
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, str(device_id))},
name=entity_description.key,
manufacturer=MANUFACTURER_NAME,
model=entity_description.key,
)
@property
def available(self) -> bool:
"""Return if entity is available."""
return (
super().available
and self.coordinator.connector.get_device(self.device_id) is not None
)
@property
def is_on(self) -> bool | None:
"""Return true if the fan is on."""
value = self.coordinator.connector.get_current_option(
self.device_id, CompitParameter.VENTILATION_ON_OFF
)
return True if value == STATE_ON else False if value == STATE_OFF else None
async def async_turn_on(
self,
percentage: int | None = None,
preset_mode: str | None = None,
**kwargs: Any,
) -> None:
"""Turn on the fan."""
await self.coordinator.connector.select_device_option(
self.device_id, CompitParameter.VENTILATION_ON_OFF, STATE_ON
)
if percentage is None:
self.async_write_ha_state()
return
await self.async_set_percentage(percentage)
async def async_turn_off(self, **kwargs: Any) -> None:
"""Turn off the fan."""
await self.coordinator.connector.select_device_option(
self.device_id, CompitParameter.VENTILATION_ON_OFF, STATE_OFF
)
self.async_write_ha_state()
@property
def percentage(self) -> int | None:
"""Return the current fan speed as a percentage."""
if self.is_on is False:
return 0
mode = self.coordinator.connector.get_current_option(
self.device_id, CompitParameter.VENTILATION_GEAR_TARGET
)
if mode is None:
return None
gear = COMPIT_GEAR_TO_HA.get(mode)
return (
None
if gear is None
else ordered_list_item_to_percentage(
list(COMPIT_GEAR_TO_HA.values()),
gear,
)
)
async def async_set_percentage(self, percentage: int) -> None:
"""Set the fan speed."""
if percentage == 0:
await self.async_turn_off()
return
gear = int(
percentage_to_ordered_list_item(
list(COMPIT_GEAR_TO_HA.values()),
percentage,
)
)
mode = HA_STATE_TO_COMPIT.get(gear)
if mode is None:
return
await self.coordinator.connector.select_device_option(
self.device_id, CompitParameter.VENTILATION_GEAR_TARGET, mode
)
self.async_write_ha_state()
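The gear-to-percentage mapping above relies on the ordered-list helpers from `homeassistant.util.percentage`. A small illustration of how those helpers behave for a hypothetical three-gear list:

```python
from homeassistant.util.percentage import (
    ordered_list_item_to_percentage,
    percentage_to_ordered_list_item,
)

gears = ["low", "medium", "high"]  # hypothetical gear names
assert ordered_list_item_to_percentage(gears, "medium") == 66
assert percentage_to_ordered_list_item(gears, 50) == "medium"
```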

View File

@@ -20,14 +20,6 @@
"default": "mdi:alert"
}
},
"fan": {
"ventilation": {
"default": "mdi:fan",
"state": {
"off": "mdi:fan-off"
}
}
},
"number": {
"boiler_target_temperature": {
"default": "mdi:water-boiler"
@@ -166,119 +158,6 @@
"winter": "mdi:snowflake"
}
}
},
"sensor": {
"alarm_code": {
"default": "mdi:alert-circle",
"state": {
"no_alarm": "mdi:check-circle"
}
},
"battery_level": {
"default": "mdi:battery"
},
"boiler_temperature": {
"default": "mdi:thermometer"
},
"calculated_heating_temperature": {
"default": "mdi:thermometer"
},
"calculated_target_temperature": {
"default": "mdi:thermometer"
},
"charging_power": {
"default": "mdi:flash"
},
"circuit_target_temperature": {
"default": "mdi:thermometer"
},
"co2_percent": {
"default": "mdi:molecule-co2"
},
"collector_power": {
"default": "mdi:solar-power"
},
"collector_temperature": {
"default": "mdi:thermometer"
},
"dhw_measured_temperature": {
"default": "mdi:thermometer"
},
"energy_consumption": {
"default": "mdi:lightning-bolt"
},
"energy_smart_grid_yesterday": {
"default": "mdi:lightning-bolt"
},
"energy_today": {
"default": "mdi:lightning-bolt"
},
"energy_total": {
"default": "mdi:lightning-bolt"
},
"energy_yesterday": {
"default": "mdi:lightning-bolt"
},
"fuel_level": {
"default": "mdi:gauge"
},
"humidity": {
"default": "mdi:water-percent"
},
"mixer_temperature": {
"default": "mdi:thermometer"
},
"outdoor_temperature": {
"default": "mdi:thermometer"
},
"pk1_function": {
"default": "mdi:cog",
"state": {
"cooling": "mdi:snowflake-thermometer",
"off": "mdi:cog-off",
"summer": "mdi:weather-sunny",
"winter": "mdi:snowflake"
}
},
"pm10_level": {
"default": "mdi:air-filter",
"state": {
"exceeded": "mdi:alert",
"no_sensor": "mdi:cancel",
"normal": "mdi:air-filter",
"warning": "mdi:alert-circle-outline"
}
},
"pm25_level": {
"default": "mdi:air-filter",
"state": {
"exceeded": "mdi:alert",
"no_sensor": "mdi:cancel",
"normal": "mdi:air-filter",
"warning": "mdi:alert-circle-outline"
}
},
"return_circuit_temperature": {
"default": "mdi:thermometer"
},
"tank_temperature_t2": {
"default": "mdi:thermometer"
},
"tank_temperature_t3": {
"default": "mdi:thermometer"
},
"tank_temperature_t4": {
"default": "mdi:thermometer"
},
"target_heating_temperature": {
"default": "mdi:thermometer"
},
"ventilation_alarm": {
"default": "mdi:alert",
"state": {
"no_alarm": "mdi:check-circle"
}
}
}
}
}

File diff suppressed because it is too large.

View File

@@ -53,11 +53,6 @@
"name": "Temperature alert"
}
},
"fan": {
"ventilation": {
"name": "[%key:component::fan::title%]"
}
},
"number": {
"boiler_target_temperature": {
"name": "Boiler target temperature"
@@ -208,219 +203,6 @@
"winter": "Winter"
}
}
},
"sensor": {
"actual_buffer_temp": {
"name": "Actual buffer temperature"
},
"actual_dhw_temp": {
"name": "Actual DHW temperature"
},
"actual_hc_temperature_zone": {
"name": "Actual heating circuit {zone} temperature"
},
"actual_upper_source_temp": {
"name": "Actual upper source temperature"
},
"alarm_code": {
"name": "Alarm code",
"state": {
"battery_fault": "Battery fault",
"damaged_outdoor_temp": "Damaged outdoor temperature sensor",
"damaged_return_temp": "Damaged return temperature sensor",
"discharged_battery": "Discharged battery",
"internal_af": "Internal fault",
"low_battery_level": "Low battery level",
"no_alarm": "No alarm",
"no_battery": "No battery",
"no_power": "No power",
"no_pump": "No pump",
"pump_fault": "Pump fault"
}
},
"battery_level": {
"name": "Battery level"
},
"boiler_temperature": {
"name": "Boiler temperature"
},
"buffer_return_temperature": {
"name": "Buffer return temperature"
},
"buffer_set_temperature": {
"name": "Buffer set temperature"
},
"calculated_buffer_temp": {
"name": "Calculated buffer temperature"
},
"calculated_dhw_temp": {
"name": "Calculated DHW temperature"
},
"calculated_heating_temperature": {
"name": "Calculated heating temperature"
},
"calculated_target_temperature": {
"name": "Calculated target temperature"
},
"calculated_upper_source_temp": {
"name": "Calculated upper source temperature"
},
"charging_power": {
"name": "Charging power"
},
"circuit_target_temperature": {
"name": "Circuit target temperature"
},
"co2_percent": {
"name": "CO2 percent"
},
"collector_power": {
"name": "Collector power"
},
"collector_temperature": {
"name": "Collector temperature"
},
"dhw_measured_temperature": {
"name": "DHW measured temperature"
},
"dhw_temperature": {
"name": "DHW temperature"
},
"energy_consumption": {
"name": "Energy consumption"
},
"energy_smart_grid_yesterday": {
"name": "Energy smart grid yesterday"
},
"energy_today": {
"name": "Energy today"
},
"energy_total": {
"name": "Energy total"
},
"energy_yesterday": {
"name": "Energy yesterday"
},
"fuel_level": {
"name": "Fuel level"
},
"heating_target_temperature_zone": {
"name": "Heating circuit {zone} target temperature"
},
"lower_source_temperature": {
"name": "Lower source temperature"
},
"mixer_temperature": {
"name": "Mixer temperature"
},
"mixer_temperature_zone": {
"name": "Mixer {zone} temperature"
},
"outdoor_temperature": {
"name": "Outdoor temperature"
},
"pk1_function": {
"name": "PK1 function",
"state": {
"cooling": "Cooling",
"holiday": "Holiday",
"nano_nr_1": "Nano 1",
"nano_nr_2": "Nano 2",
"nano_nr_3": "Nano 3",
"nano_nr_4": "Nano 4",
"nano_nr_5": "Nano 5",
"off": "[%key:common::state::off%]",
"on": "[%key:common::state::on%]",
"summer": "Summer",
"winter": "Winter"
}
},
"pm10_level": {
"name": "PM10 level",
"state": {
"exceeded": "Exceeded",
"no_sensor": "No sensor",
"normal": "Normal",
"warning": "Warning"
}
},
"pm1_level": {
"name": "PM1 level"
},
"pm25_level": {
"name": "PM2.5 level",
"state": {
"exceeded": "Exceeded",
"no_sensor": "No sensor",
"normal": "Normal",
"warning": "Warning"
}
},
"pm4_level": {
"name": "PM4 level"
},
"preset_mode": {
"name": "Preset mode"
},
"protection_temperature": {
"name": "Protection temperature"
},
"pump_status": {
"name": "Pump status",
"state": {
"off": "[%key:common::state::off%]",
"on": "[%key:common::state::on%]"
}
},
"return_circuit_temperature": {
"name": "Return circuit temperature"
},
"set_target_temperature": {
"name": "Set target temperature"
},
"tank_temperature_t2": {
"name": "Tank T2 bottom temperature"
},
"tank_temperature_t3": {
"name": "Tank T3 top temperature"
},
"tank_temperature_t4": {
"name": "Tank T4 temperature"
},
"target_heating_temperature": {
"name": "Target heating temperature"
},
"target_temperature": {
"name": "Target temperature"
},
"temperature_alert": {
"name": "Temperature alert",
"state": {
"alert": "Alert",
"no_alert": "No alert"
}
},
"upper_source_temperature": {
"name": "Upper source temperature"
},
"ventilation_alarm": {
"name": "Ventilation alarm",
"state": {
"ahu_alarm": "AHU alarm",
"bot_alarm": "BOT alarm",
"damaged_exhaust_sensor": "Damaged exhaust sensor",
"damaged_preheater_sensor": "Damaged preheater sensor",
"damaged_supply_and_exhaust_sensors": "Damaged supply and exhaust sensors",
"damaged_supply_sensor": "Damaged supply sensor",
"no_alarm": "No alarm"
}
},
"ventilation_gear": {
"name": "Ventilation gear"
},
"weather_curve": {
"name": "Weather curve"
}
}
}
}

View File

@@ -48,8 +48,6 @@ def async_setup(hass: HomeAssistant) -> None:
vol.Optional("conversation_id"): vol.Any(str, None),
vol.Optional("language"): str,
vol.Optional("agent_id"): agent_id_validator,
vol.Optional("device_id"): vol.Any(str, None),
vol.Optional("satellite_id"): vol.Any(str, None),
}
)
@websocket_api.async_response
@@ -66,8 +64,6 @@ async def websocket_process(
context=connection.context(msg),
language=msg.get("language"),
agent_id=msg.get("agent_id"),
device_id=msg.get("device_id"),
satellite_id=msg.get("satellite_id"),
)
connection.send_result(msg["id"], result.as_dict())
@@ -252,8 +248,6 @@ class ConversationProcessView(http.HomeAssistantView):
vol.Optional("conversation_id"): str,
vol.Optional("language"): str,
vol.Optional("agent_id"): agent_id_validator,
vol.Optional("device_id"): vol.Any(str, None),
vol.Optional("satellite_id"): vol.Any(str, None),
}
)
)
@@ -268,8 +262,6 @@ class ConversationProcessView(http.HomeAssistantView):
context=self.context(request),
language=data.get("language"),
agent_id=data.get("agent_id"),
device_id=data.get("device_id"),
satellite_id=data.get("satellite_id"),
)
return self.json(result.as_dict())

View File

@@ -6,5 +6,5 @@
"documentation": "https://www.home-assistant.io/integrations/conversation",
"integration_type": "entity",
"quality_scale": "internal",
"requirements": ["hassil==3.5.0", "home-assistant-intents==2026.3.3"]
"requirements": ["hassil==3.5.0", "home-assistant-intents==2026.2.13"]
}

View File

@@ -91,7 +91,6 @@ class CoverEntityFeature(IntFlag):
ATTR_CURRENT_POSITION = "current_position"
ATTR_CURRENT_TILT_POSITION = "current_tilt_position"
ATTR_IS_CLOSED = "is_closed"
ATTR_POSITION = "position"
ATTR_TILT_POSITION = "tilt_position"
@@ -268,9 +267,7 @@ class CoverEntity(Entity, cached_properties=CACHED_PROPERTIES_WITH_ATTR_):
@property
def state_attributes(self) -> dict[str, Any]:
"""Return the state attributes."""
data: dict[str, Any] = {}
data[ATTR_IS_CLOSED] = self.is_closed
data = {}
if (current := self.current_cover_position) is not None:
data[ATTR_CURRENT_POSITION] = current

View File

@@ -112,12 +112,11 @@ def _zone_is_configured(zone: DaikinZone) -> bool:
def _zone_temperature_lists(device: Appliance) -> tuple[list[str], list[str]]:
"""Return the decoded zone temperature lists."""
values = device.values
if DAIKIN_ZONE_TEMP_HEAT not in values or DAIKIN_ZONE_TEMP_COOL not in values:
try:
heating = device.represent(DAIKIN_ZONE_TEMP_HEAT)[1]
cooling = device.represent(DAIKIN_ZONE_TEMP_COOL)[1]
except AttributeError:
return ([], [])
heating = device.represent(DAIKIN_ZONE_TEMP_HEAT)[1]
cooling = device.represent(DAIKIN_ZONE_TEMP_COOL)[1]
return (list(heating or []), list(cooling or []))

Some files were not shown because too many files have changed in this diff Show More