Compare commits

..

29 Commits

Author SHA1 Message Date
Jan Bouwhuis
ea767c934f Merge branch 'dev' into add-include_entities-attribute 2026-01-12 09:34:50 +01:00
jbouwh
bb97822db9 Cleanup 2025-11-13 06:48:59 +00:00
jbouwh
33ffccabd1 Refactor 2025-11-13 06:48:59 +00:00
jbouwh
56de03ce33 Rework private _included_entities attribute 2025-11-13 06:48:59 +00:00
jbouwh
0cbf7002a8 Add docstring 2025-11-13 06:48:59 +00:00
jbouwh
cffceffe04 Move setup code to add_to_platform_finish 2025-11-13 06:48:59 +00:00
jbouwh
253189805e Remove final 2025-11-13 06:48:59 +00:00
jbouwh
2e91725ac0 Use cached_properties 2025-11-13 06:48:58 +00:00
jbouwh
3b54dddc08 Fix attrbute check - make property final 2025-11-13 06:48:58 +00:00
jbouwh
9bc3d83a55 Update docstring 2025-11-13 06:48:58 +00:00
jbouwh
d62a554cbf Remove the need to manually call async_set_included_entities 2025-11-13 06:48:58 +00:00
jbouwh
f071b7cd46 Improve docstring 2025-11-13 06:48:58 +00:00
jbouwh
37f34f6189 Remove _included_entities property 2025-11-13 06:48:58 +00:00
jbouwh
27dc5b6d18 Do not set included entities if no unique IDs are set 2025-11-13 06:48:58 +00:00
jbouwh
0bbc2f49a6 Upfdate docstr 2025-11-13 06:48:58 +00:00
jbouwh
c121fa25e8 Call async_set_included_entities from add_to_platform_finish 2025-11-13 06:48:58 +00:00
jbouwh
660cea8b65 Handle the entity_id attribute in the Entity base class 2025-11-13 06:48:58 +00:00
jbouwh
c7749ebae1 Fix device tracker 2025-11-13 06:48:58 +00:00
jbouwh
a2acb744b3 Use platform name 2025-11-13 06:48:58 +00:00
jbouwh
0d9158689d Fix device tracker state attrs 2025-11-13 06:48:58 +00:00
jbouwh
f85e8d6c1f Also implement as default in base entity 2025-11-13 06:48:58 +00:00
jbouwh
9be4cc5af1 Integrate with base entity component state attributes 2025-11-13 06:48:58 +00:00
jbouwh
a141eedf2c Update docstr 2025-11-13 06:48:58 +00:00
jbouwh
03040c131c Move logic into Entity class 2025-11-13 06:48:58 +00:00
jbouwh
3eef50632c Use platform domain attribute 2025-11-13 06:48:58 +00:00
jbouwh
eff150cd54 Fix typo 2025-11-13 06:48:58 +00:00
jbouwh
6dcc94b0a1 Follow up on code review 2025-11-13 06:48:58 +00:00
jbouwh
7201903877 Implement mixin class and add feature to maintain included entities from unique IDs 2025-11-13 06:48:58 +00:00
jbouwh
5b776307ea Add included_entities attribute to base Entity class 2025-11-13 06:48:57 +00:00
1626 changed files with 20960 additions and 54020 deletions

View File

@@ -1,784 +0,0 @@
---
name: Home Assistant Integration knowledge
description: Everything you need to know to build, test, and review Home Assistant integrations. When working on an integration, use this file as your primary reference.
---
### File Locations
- **Integration code**: `./homeassistant/components/<integration_domain>/`
- **Integration tests**: `./tests/components/<integration_domain>/`
## Integration Templates
### Standard Integration Structure
```
homeassistant/components/my_integration/
├── __init__.py # Entry point with async_setup_entry
├── manifest.json # Integration metadata and dependencies
├── const.py # Domain and constants
├── config_flow.py # UI configuration flow
├── coordinator.py # Data update coordinator (if needed)
├── entity.py # Base entity class (if shared patterns)
├── sensor.py # Sensor platform
├── strings.json # User-facing text and translations
├── services.yaml # Service definitions (if applicable)
└── quality_scale.yaml # Quality scale rule status
```
An integration can add platforms as needed (e.g., `sensor.py`, `switch.py`). The following platforms have extra guidelines:
- **Diagnostics**: [`platform-diagnostics.md`](platform-diagnostics.md) for diagnostic data collection
- **Repairs**: [`platform-repairs.md`](platform-repairs.md) for user-actionable repair issues
### Minimal Integration Checklist
- [ ] `manifest.json` with required fields (domain, name, codeowners, etc.)
- [ ] `__init__.py` with `async_setup_entry` and `async_unload_entry`
- [ ] `config_flow.py` with UI configuration support
- [ ] `const.py` with `DOMAIN` constant
- [ ] `strings.json` with at least config flow text
- [ ] Platform files (`sensor.py`, etc.) as needed
- [ ] `quality_scale.yaml` with rule status tracking
## Integration Quality Scale
Home Assistant uses an Integration Quality Scale to ensure code quality and consistency. The quality level determines which rules apply:
### Quality Scale Levels
- **Bronze**: Basic requirements (ALL Bronze rules are mandatory)
- **Silver**: Enhanced functionality
- **Gold**: Advanced features
- **Platinum**: Highest quality standards
### Quality Scale Progression
- **Bronze → Silver**: Add entity unavailability, parallel updates, auth flows
- **Silver → Gold**: Add device management, diagnostics, translations
- **Gold → Platinum**: Add strict typing, async dependencies, websession injection
### How Rules Apply
1. **Check `manifest.json`**: Look for `"quality_scale"` key to determine integration level
2. **Bronze Rules**: Always required for any integration with quality scale
3. **Higher Tier Rules**: Only apply if integration targets that tier or higher
4. **Rule Status**: Check `quality_scale.yaml` in integration folder for:
- `done`: Rule implemented
- `exempt`: Rule doesn't apply (with reason in comment)
- `todo`: Rule needs implementation
### Example `quality_scale.yaml` Structure
```yaml
rules:
# Bronze (mandatory)
config-flow: done
entity-unique-id: done
action-setup:
status: exempt
comment: Integration does not register custom actions.
# Silver (if targeting Silver+)
entity-unavailable: done
parallel-updates: done
# Gold (if targeting Gold+)
devices: done
diagnostics: done
# Platinum (if targeting Platinum)
strict-typing: done
```
**When Reviewing/Creating Code**: Always check the integration's quality scale level and exemption status before applying rules.
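The target tier itself is read from the integration's `manifest.json`; a minimal illustration (domain and tier values are placeholders):
```json
{
  "domain": "my_integration",
  "name": "My Integration",
  "quality_scale": "silver"
}
```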
## Code Organization
### Core Locations
- Shared constants: `homeassistant/const.py` (use these instead of hardcoding)
- Integration structure:
- `homeassistant/components/{domain}/const.py` - Constants
- `homeassistant/components/{domain}/models.py` - Data models
- `homeassistant/components/{domain}/coordinator.py` - Update coordinator
- `homeassistant/components/{domain}/config_flow.py` - Configuration flow
- `homeassistant/components/{domain}/{platform}.py` - Platform implementations
### Common Modules
- **coordinator.py**: Centralize data fetching logic
```python
class MyCoordinator(DataUpdateCoordinator[MyData]):
def __init__(self, hass: HomeAssistant, client: MyClient, config_entry: ConfigEntry) -> None:
super().__init__(
hass,
logger=LOGGER,
name=DOMAIN,
update_interval=timedelta(minutes=1),
config_entry=config_entry, # ✅ Pass config_entry - it's accepted and recommended
)
```
- **entity.py**: Base entity definitions to reduce duplication
```python
class MyEntity(CoordinatorEntity[MyCoordinator]):
_attr_has_entity_name = True
```
### Runtime Data Storage
- **Use ConfigEntry.runtime_data**: Store non-persistent runtime data
```python
type MyIntegrationConfigEntry = ConfigEntry[MyClient]
async def async_setup_entry(hass: HomeAssistant, entry: MyIntegrationConfigEntry) -> bool:
client = MyClient(entry.data[CONF_HOST])
entry.runtime_data = client
```
### Manifest Requirements
- **Required Fields**: `domain`, `name`, `codeowners`, `integration_type`, `documentation`, `requirements`
- **Integration Types**: `device`, `hub`, `service`, `system`, `helper`
- **IoT Class**: Always specify connectivity method (e.g., `cloud_polling`, `local_polling`, `local_push`)
- **Discovery Methods**: Add when applicable: `zeroconf`, `dhcp`, `bluetooth`, `ssdp`, `usb`
- **Dependencies**: Include platform dependencies (e.g., `application_credentials`, `bluetooth_adapters`)
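A minimal `manifest.json` covering these requirements might look like the following sketch (all values are placeholders):
```json
{
  "domain": "my_integration",
  "name": "My Integration",
  "codeowners": ["@me"],
  "config_flow": true,
  "documentation": "https://www.home-assistant.io/integrations/my_integration",
  "integration_type": "hub",
  "iot_class": "local_polling",
  "requirements": ["my-client-library==1.0.0"]
}
```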
### Config Flow Patterns
- **Version Control**: Always set `VERSION = 1` and `MINOR_VERSION = 1`
- **Unique ID Management**:
```python
await self.async_set_unique_id(device_unique_id)
self._abort_if_unique_id_configured()
```
- **Error Handling**: Define errors in `strings.json` under `config.error` (see the sketch after this list)
- **Step Methods**: Use standard naming (`async_step_user`, `async_step_discovery`, etc.)
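The error keys referenced from the flow map to entries under `config.error` in `strings.json`; a minimal sketch (step, data, and error keys are illustrative):
```json
{
  "config": {
    "step": {
      "user": {
        "data": {
          "host": "Host"
        }
      }
    },
    "error": {
      "cannot_connect": "Failed to connect",
      "invalid_auth": "Invalid authentication"
    },
    "abort": {
      "already_configured": "Device is already configured"
    }
  }
}
```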
### Integration Ownership
- **manifest.json**: Add GitHub usernames to `codeowners`:
```json
{
"domain": "my_integration",
"name": "My Integration",
"codeowners": ["@me"]
}
```
### Async Dependencies (Platinum)
- **Requirement**: All dependencies must use asyncio
- Ensures efficient task handling without thread context switching
### WebSession Injection (Platinum)
- **Pass WebSession**: Support passing web sessions to dependencies
```python
async def async_setup_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
"""Set up integration from config entry."""
client = MyClient(entry.data[CONF_HOST], async_get_clientsession(hass))
```
- For cookies: Use `async_create_clientsession` (aiohttp) or `create_async_httpx_client` (httpx)
### Data Update Coordinator
- **Standard Pattern**: Use for efficient data management
```python
class MyCoordinator(DataUpdateCoordinator):
def __init__(self, hass: HomeAssistant, client: MyClient, config_entry: ConfigEntry) -> None:
super().__init__(
hass,
logger=LOGGER,
name=DOMAIN,
update_interval=timedelta(minutes=5),
config_entry=config_entry, # ✅ Pass config_entry - it's accepted and recommended
)
self.client = client
async def _async_update_data(self):
try:
return await self.client.fetch_data()
except ApiError as err:
raise UpdateFailed(f"API communication error: {err}")
```
- **Error Types**: Use `UpdateFailed` for API errors, `ConfigEntryAuthFailed` for auth issues
- **Config Entry**: Always pass `config_entry` parameter to coordinator - it's accepted and recommended
## Integration Guidelines
### Configuration Flow
- **UI Setup Required**: All integrations must support configuration via UI
- **Manifest**: Set `"config_flow": true` in `manifest.json`
- **Data Storage**:
- Connection-critical config: Store in `ConfigEntry.data`
- Non-critical settings: Store in `ConfigEntry.options`
- **Validation**: Always validate user input before creating entries
- **Config Entry Naming**:
- ❌ Do NOT allow users to set config entry names in config flows
- Names are automatically generated or can be customized later in UI
- ✅ Exception: Helper integrations MAY allow custom names in config flow
- **Connection Testing**: Test device/service connection during config flow:
```python
try:
await client.get_data()
except MyException:
errors["base"] = "cannot_connect"
```
- **Duplicate Prevention**: Prevent duplicate configurations:
```python
# Using unique ID
await self.async_set_unique_id(identifier)
self._abort_if_unique_id_configured()
# Using unique data
self._async_abort_entries_match({CONF_HOST: user_input[CONF_HOST]})
```
### Reauthentication Support
- **Required Method**: Implement `async_step_reauth` in config flow
- **Credential Updates**: Allow users to update credentials without re-adding
- **Validation**: Verify account matches existing unique ID:
```python
await self.async_set_unique_id(user_id)
self._abort_if_unique_id_mismatch(reason="wrong_account")
return self.async_update_reload_and_abort(
self._get_reauth_entry(),
data_updates={CONF_API_TOKEN: user_input[CONF_API_TOKEN]}
)
```
### Reconfiguration Flow
- **Purpose**: Allow configuration updates without removing device
- **Implementation**: Add `async_step_reconfigure` method
- **Validation**: Prevent changing underlying account with `_abort_if_unique_id_mismatch`
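A minimal sketch of a reconfigure step; `validate_input`, `CannotConnect`, `STEP_USER_DATA_SCHEMA`, and the returned `device` object are hypothetical helpers defined elsewhere in the config flow module:
```python
async def async_step_reconfigure(
    self, user_input: dict[str, Any] | None = None
) -> ConfigFlowResult:
    """Handle a reconfiguration flow."""
    errors: dict[str, str] = {}
    if user_input is not None:
        try:
            device = await validate_input(self.hass, user_input)  # hypothetical helper
        except CannotConnect:
            errors["base"] = "cannot_connect"
        else:
            # Ensure the user is not switching to a different device/account
            await self.async_set_unique_id(device.serial)
            self._abort_if_unique_id_mismatch(reason="wrong_device")
            return self.async_update_reload_and_abort(
                self._get_reconfigure_entry(), data_updates=user_input
            )
    return self.async_show_form(
        step_id="reconfigure", data_schema=STEP_USER_DATA_SCHEMA, errors=errors
    )
```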
### Device Discovery
- **Manifest Configuration**: Add discovery method (zeroconf, dhcp, etc.)
```json
{
"zeroconf": ["_mydevice._tcp.local."]
}
```
- **Discovery Handler**: Implement appropriate `async_step_*` method:
```python
async def async_step_zeroconf(self, discovery_info):
"""Handle zeroconf discovery."""
await self.async_set_unique_id(discovery_info.properties["serialno"])
self._abort_if_unique_id_configured(updates={CONF_HOST: discovery_info.host})
```
- **Network Updates**: Use discovery to update dynamic IP addresses
### Network Discovery Implementation
- **Zeroconf/mDNS**: Use async instances
```python
aiozc = await zeroconf.async_get_async_instance(hass)
```
- **SSDP Discovery**: Register callbacks with cleanup
```python
entry.async_on_unload(
ssdp.async_register_callback(
hass, _async_discovered_device,
{"st": "urn:schemas-upnp-org:device:ZonePlayer:1"}
)
)
```
### Bluetooth Integration
- **Manifest Dependencies**: Add `bluetooth_adapters` to dependencies
- **Connectable**: Set `"connectable": true` for connection-required devices
- **Scanner Usage**: Always use shared scanner instance
```python
scanner = bluetooth.async_get_scanner()
entry.async_on_unload(
bluetooth.async_register_callback(
hass, _async_discovered_device,
{"service_uuid": "example_uuid"},
bluetooth.BluetoothScanningMode.ACTIVE
)
)
```
- **Connection Handling**: Never reuse `BleakClient` instances, use 10+ second timeouts
### Setup Validation
- **Test Before Setup**: Verify integration can be set up in `async_setup_entry`
- **Exception Handling**:
- `ConfigEntryNotReady`: Device offline or temporary failure
- `ConfigEntryAuthFailed`: Authentication issues
- `ConfigEntryError`: Unresolvable setup problems
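A sketch of how these exceptions are typically raised from `async_setup_entry` (client class and its exceptions are placeholders):
```python
async def async_setup_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
    """Set up the integration from a config entry."""
    client = MyClient(entry.data[CONF_HOST])
    try:
        await client.async_connect()
    except MyAuthError as err:
        raise ConfigEntryAuthFailed("Invalid credentials") from err
    except MyConnectionError as err:
        raise ConfigEntryNotReady(f"Device is not reachable: {err}") from err
    entry.runtime_data = client
    await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
    return True
```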
### Config Entry Unloading
- **Required**: Implement `async_unload_entry` for runtime removal/reload
- **Platform Unloading**: Use `hass.config_entries.async_unload_platforms`
- **Cleanup**: Register callbacks with `entry.async_on_unload`:
```python
async def async_unload_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
"""Unload a config entry."""
if unload_ok := await hass.config_entries.async_unload_platforms(entry, PLATFORMS):
entry.runtime_data.listener() # Clean up resources
return unload_ok
```
### Service Actions
- **Registration**: Register all service actions in `async_setup`, NOT in `async_setup_entry`
- **Validation**: Check config entry existence and loaded state:
```python
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
async def service_action(call: ServiceCall) -> ServiceResponse:
if not (entry := hass.config_entries.async_get_entry(call.data[ATTR_CONFIG_ENTRY_ID])):
raise ServiceValidationError("Entry not found")
if entry.state is not ConfigEntryState.LOADED:
raise ServiceValidationError("Entry not loaded")
```
- **Exception Handling**: Raise appropriate exceptions:
```python
# For invalid input
if end_date < start_date:
raise ServiceValidationError("End date must be after start date")
# For service errors
try:
await client.set_schedule(start_date, end_date)
except MyConnectionError as err:
raise HomeAssistantError("Could not connect to the schedule") from err
```
### Service Registration Patterns
- **Entity Services**: Register on platform setup
```python
platform.async_register_entity_service(
"my_entity_service",
{vol.Required("parameter"): cv.string},
"handle_service_method"
)
```
- **Service Schema**: Always validate input
```python
SERVICE_SCHEMA = vol.Schema({
vol.Required("entity_id"): cv.entity_ids,
vol.Required("parameter"): cv.string,
vol.Optional("timeout", default=30): cv.positive_int,
})
```
- **Services File**: Create `services.yaml` with descriptions and field definitions
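A minimal sketch of the matching `services.yaml` entry for the entity service registered above (field names are illustrative; user-facing names and descriptions may also live in `strings.json`, depending on the Home Assistant version):
```yaml
my_entity_service:
  fields:
    parameter:
      required: true
      example: "some value"
      selector:
        text:
```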
### Polling
- Use update coordinator pattern when possible
- **Polling intervals are NOT user-configurable**: Never add scan_interval, update_interval, or polling frequency options to config flows or config entries
- **Integration determines intervals**: Set `update_interval` programmatically based on integration logic, not user input
- **Minimum Intervals**:
- Local network: 5 seconds
- Cloud services: 60 seconds
- **Parallel Updates**: Specify number of concurrent updates:
```python
PARALLEL_UPDATES = 1 # Serialize updates to prevent overwhelming device
# OR
PARALLEL_UPDATES = 0 # Unlimited (for coordinator-based or read-only)
```
## Entity Development
### Unique IDs
- **Required**: Every entity must have a unique ID for registry tracking
- Must be unique per platform (not per integration)
- Don't include integration domain or platform in ID
- **Implementation**:
```python
class MySensor(SensorEntity):
def __init__(self, device_id: str) -> None:
self._attr_unique_id = f"{device_id}_temperature"
```
**Acceptable ID Sources**:
- Device serial numbers
- MAC addresses (formatted using `format_mac` from device registry)
- Physical identifiers (printed/EEPROM)
- Config entry ID as last resort: `f"{entry.entry_id}-battery"`
**Never Use**:
- IP addresses, hostnames, URLs
- Device names
- Email addresses, usernames
### Entity Descriptions
- **Lambda/Anonymous Functions**: Often used in EntityDescription for value transformation
- **Multiline Lambdas**: When lambdas exceed line length, wrap in parentheses for readability
- **Bad pattern**:
```python
SensorEntityDescription(
key="temperature",
name="Temperature",
value_fn=lambda data: round(data["temp_value"] * 1.8 + 32, 1) if data.get("temp_value") is not None else None, # ❌ Too long
)
```
- **Good pattern**:
```python
SensorEntityDescription(
key="temperature",
name="Temperature",
value_fn=lambda data: ( # ✅ Parenthesis on same line as lambda
round(data["temp_value"] * 1.8 + 32, 1)
if data.get("temp_value") is not None
else None
),
)
```
### Entity Naming
- **Use has_entity_name**: Set `_attr_has_entity_name = True`
- **For specific fields**:
```python
class MySensor(SensorEntity):
_attr_has_entity_name = True
def __init__(self, device: Device, field: str) -> None:
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, device.id)},
name=device.name,
)
self._attr_name = field # e.g., "temperature", "humidity"
```
- **For device itself**: Set `_attr_name = None`
### Event Lifecycle Management
- **Subscribe in `async_added_to_hass`**:
```python
async def async_added_to_hass(self) -> None:
"""Subscribe to events."""
self.async_on_remove(
self.client.events.subscribe("my_event", self._handle_event)
)
```
- **Unsubscribe in `async_will_remove_from_hass`** if not using `async_on_remove`
- Never subscribe in `__init__` or other methods
### State Handling
- Unknown values: Use `None` (not "unknown" or "unavailable")
- Availability: Implement the `available` property instead of reporting an "unavailable" state value
### Entity Availability
- **Mark Unavailable**: When data cannot be fetched from device/service
- **Coordinator Pattern**:
```python
@property
def available(self) -> bool:
"""Return if entity is available."""
return super().available and self.identifier in self.coordinator.data
```
- **Direct Update Pattern**:
```python
async def async_update(self) -> None:
"""Update entity."""
try:
data = await self.client.get_data()
except MyException:
self._attr_available = False
else:
self._attr_available = True
self._attr_native_value = data.value
```
### Extra State Attributes
- All attribute keys must always be present
- Unknown values: Use `None`
- Provide descriptive attributes
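A sketch of a coordinator-backed sensor exposing extra state attributes, keeping every key present and using `None` for unknown values (attribute and data keys are illustrative):
```python
class MySensor(CoordinatorEntity[MyCoordinator], SensorEntity):
    @property
    def extra_state_attributes(self) -> dict[str, Any]:
        """Return additional, always-present state attributes."""
        data = self.coordinator.data.get(self.identifier, {})
        return {
            "last_reported": data.get("last_reported"),  # None when unknown
            "signal_strength": data.get("signal_strength"),
        }
```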
## Device Management
### Device Registry
- **Create Devices**: Group related entities under devices
- **Device Info**: Provide comprehensive metadata:
```python
_attr_device_info = DeviceInfo(
connections={(CONNECTION_NETWORK_MAC, device.mac)},
identifiers={(DOMAIN, device.id)},
name=device.name,
manufacturer="My Company",
model="My Sensor",
sw_version=device.version,
)
```
- For services: Add `entry_type=DeviceEntryType.SERVICE`
### Dynamic Device Addition
- **Auto-detect New Devices**: After initial setup
- **Implementation Pattern**:
```python
def _check_device() -> None:
current_devices = set(coordinator.data)
new_devices = current_devices - known_devices
if new_devices:
known_devices.update(new_devices)
async_add_entities([MySensor(coordinator, device_id) for device_id in new_devices])
entry.async_on_unload(coordinator.async_add_listener(_check_device))
```
### Stale Device Removal
- **Auto-remove**: When devices disappear from hub/account
- **Device Registry Update**:
```python
device_registry.async_update_device(
device_id=device.id,
remove_config_entry_id=self.config_entry.entry_id,
)
```
- **Manual Deletion**: Implement `async_remove_config_entry_device` when needed
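A sketch of `async_remove_config_entry_device` in `__init__.py`, allowing manual deletion only for devices the coordinator no longer reports (the `runtime_data` coordinator and identifier layout are assumptions):
```python
async def async_remove_config_entry_device(
    hass: HomeAssistant, config_entry: MyConfigEntry, device_entry: dr.DeviceEntry
) -> bool:
    """Allow removal of a device that is no longer reported by the hub."""
    coordinator = config_entry.runtime_data
    return not any(
        identifier
        for domain, identifier in device_entry.identifiers
        if domain == DOMAIN and identifier in coordinator.data
    )
```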
### Entity Categories
- **Required**: Assign appropriate category to entities
- **Implementation**: Set `_attr_entity_category`
```python
class MySensor(SensorEntity):
_attr_entity_category = EntityCategory.DIAGNOSTIC
```
- Categories include: `DIAGNOSTIC` for system/technical information
### Device Classes
- **Use When Available**: Set appropriate device class for entity type
```python
class MyTemperatureSensor(SensorEntity):
_attr_device_class = SensorDeviceClass.TEMPERATURE
```
- Provides context for: unit conversion, voice control, UI representation
### Disabled by Default
- **Disable Noisy/Less Popular Entities**: Reduce resource usage
```python
class MySignalStrengthSensor(SensorEntity):
_attr_entity_registry_enabled_default = False
```
- Target: frequently changing states, technical diagnostics
### Entity Translations
- **Required with has_entity_name**: Support international users
- **Implementation**:
```python
class MySensor(SensorEntity):
_attr_has_entity_name = True
_attr_translation_key = "phase_voltage"
```
- Create `strings.json` with translations:
```json
{
"entity": {
"sensor": {
"phase_voltage": {
"name": "Phase voltage"
}
}
}
}
```
### Exception Translations (Gold)
- **Translatable Errors**: Use translation keys for user-facing exceptions
- **Implementation**:
```python
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="end_date_before_start_date",
)
```
- Add to `strings.json`:
```json
{
"exceptions": {
"end_date_before_start_date": {
"message": "The end date cannot be before the start date."
}
}
}
```
### Icon Translations (Gold)
- **Dynamic Icons**: Support state and range-based icon selection
- **State-based Icons**:
```json
{
"entity": {
"sensor": {
"tree_pollen": {
"default": "mdi:tree",
"state": {
"high": "mdi:tree-outline"
}
}
}
}
}
```
- **Range-based Icons** (for numeric values):
```json
{
"entity": {
"sensor": {
"battery_level": {
"default": "mdi:battery-unknown",
"range": {
"0": "mdi:battery-outline",
"90": "mdi:battery-90",
"100": "mdi:battery"
}
}
}
}
}
```
## Testing Requirements
- **Location**: `tests/components/{domain}/`
- **Coverage Requirement**: Above 95% test coverage for all modules
- **Best Practices**:
- Use pytest fixtures from `tests.common`
- Mock all external dependencies
- Use snapshots for complex data structures
- Follow existing test patterns
### Config Flow Testing
- **100% Coverage Required**: All config flow paths must be tested
- **Test Scenarios**:
- All flow initiation methods (user, discovery, import)
- Successful configuration paths
- Error recovery scenarios
- Prevention of duplicate entries
- Flow completion after errors
### Testing
- **Integration-specific tests** (recommended):
```bash
pytest ./tests/components/<integration_domain> \
--cov=homeassistant.components.<integration_domain> \
--cov-report term-missing \
--durations-min=1 \
--durations=0 \
--numprocesses=auto
```
### Testing Best Practices
- **Never access `hass.data` directly** - Use fixtures and proper integration setup instead
- **Use snapshot testing** - For verifying entity states and attributes
- **Test through integration setup** - Don't test entities in isolation
- **Mock external APIs** - Use fixtures with realistic JSON data
- **Verify registries** - Ensure entities are properly registered with devices
### Config Flow Testing Template
```python
async def test_user_flow_success(hass, mock_api):
"""Test successful user flow."""
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)
assert result["type"] == FlowResultType.FORM
assert result["step_id"] == "user"
# Test form submission
result = await hass.config_entries.flow.async_configure(
result["flow_id"], user_input=TEST_USER_INPUT
)
assert result["type"] == FlowResultType.CREATE_ENTRY
assert result["title"] == "My Device"
assert result["data"] == TEST_USER_INPUT
async def test_flow_connection_error(hass, mock_api_error):
"""Test connection error handling."""
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)
result = await hass.config_entries.flow.async_configure(
result["flow_id"], user_input=TEST_USER_INPUT
)
assert result["type"] == FlowResultType.FORM
assert result["errors"] == {"base": "cannot_connect"}
```
### Entity Testing Patterns
```python
@pytest.fixture
def platforms() -> list[Platform]:
"""Overridden fixture to specify platforms to test."""
return [Platform.SENSOR] # Or another specific platform as needed.
@pytest.mark.usefixtures("entity_registry_enabled_by_default", "init_integration")
async def test_entities(
hass: HomeAssistant,
snapshot: SnapshotAssertion,
entity_registry: er.EntityRegistry,
device_registry: dr.DeviceRegistry,
mock_config_entry: MockConfigEntry,
) -> None:
"""Test the sensor entities."""
await snapshot_platform(hass, entity_registry, snapshot, mock_config_entry.entry_id)
# Ensure entities are correctly assigned to device
device_entry = device_registry.async_get_device(
identifiers={(DOMAIN, "device_unique_id")}
)
assert device_entry
entity_entries = er.async_entries_for_config_entry(
entity_registry, mock_config_entry.entry_id
)
for entity_entry in entity_entries:
assert entity_entry.device_id == device_entry.id
```
### Mock Patterns
```python
# Modern integration fixture setup
@pytest.fixture
def mock_config_entry() -> MockConfigEntry:
"""Return the default mocked config entry."""
return MockConfigEntry(
title="My Integration",
domain=DOMAIN,
data={CONF_HOST: "127.0.0.1", CONF_API_KEY: "test_key"},
unique_id="device_unique_id",
)
@pytest.fixture
def mock_device_api() -> Generator[MagicMock]:
"""Return a mocked device API."""
with patch("homeassistant.components.my_integration.MyDeviceAPI", autospec=True) as api_mock:
api = api_mock.return_value
api.get_data.return_value = MyDeviceData.from_json(
load_fixture("device_data.json", DOMAIN)
)
yield api
@pytest.fixture
def platforms() -> list[Platform]:
"""Fixture to specify platforms to test."""
return PLATFORMS
@pytest.fixture
async def init_integration(
hass: HomeAssistant,
mock_config_entry: MockConfigEntry,
mock_device_api: MagicMock,
platforms: list[Platform],
) -> MockConfigEntry:
"""Set up the integration for testing."""
mock_config_entry.add_to_hass(hass)
with patch("homeassistant.components.my_integration.PLATFORMS", platforms):
await hass.config_entries.async_setup(mock_config_entry.entry_id)
await hass.async_block_till_done()
return mock_config_entry
```
## Debugging & Troubleshooting
### Common Issues & Solutions
- **Integration won't load**: Check `manifest.json` syntax and required fields
- **Entities not appearing**: Verify `unique_id` and `has_entity_name` implementation
- **Config flow errors**: Check `strings.json` entries and error handling
- **Discovery not working**: Verify manifest discovery configuration and callbacks
- **Tests failing**: Check mock setup and async context
### Debug Logging Setup
```python
# Enable debug logging in tests
caplog.set_level(logging.DEBUG, logger="my_integration")
# In integration code - use proper logging
_LOGGER = logging.getLogger(__name__)
_LOGGER.debug("Processing data: %s", data) # Use lazy logging
```
### Validation Commands
```bash
# Check specific integration
python -m script.hassfest --integration-path homeassistant/components/my_integration
# Validate quality scale
# Check quality_scale.yaml against current rules
# Run integration tests with coverage
pytest ./tests/components/my_integration \
--cov=homeassistant.components.my_integration \
--cov-report term-missing
```

View File

@@ -1,19 +0,0 @@
# Integration Diagnostics
Platform exists as `homeassistant/components/<domain>/diagnostics.py`.
- **Required**: Implement diagnostic data collection
- **Implementation**:
```python
TO_REDACT = [CONF_API_KEY, CONF_LATITUDE, CONF_LONGITUDE]
async def async_get_config_entry_diagnostics(
hass: HomeAssistant, entry: MyConfigEntry
) -> dict[str, Any]:
"""Return diagnostics for a config entry."""
return {
"entry_data": async_redact_data(entry.data, TO_REDACT),
"data": entry.runtime_data.data,
}
```
- **Security**: Never expose passwords, tokens, or sensitive coordinates

View File

@@ -1,55 +0,0 @@
# Repairs platform
Platform exists as `homeassistant/components/<domain>/repairs.py`.
- **Actionable Issues Required**: All repair issues must be actionable for end users
- **Issue Content Requirements**:
- Clearly explain what is happening
- Provide specific steps users need to take to resolve the issue
- Use friendly, helpful language
- Include relevant context (device names, error details, etc.)
- **Implementation**:
```python
ir.async_create_issue(
hass,
DOMAIN,
"outdated_version",
is_fixable=False,
issue_domain=DOMAIN,
severity=ir.IssueSeverity.ERROR,
translation_key="outdated_version",
)
```
- **Translation Strings Requirements**: Must contain user-actionable text in `strings.json`:
```json
{
"issues": {
"outdated_version": {
"title": "Device firmware is outdated",
"description": "Your device firmware version {current_version} is below the minimum required version {min_version}. To fix this issue: 1) Open the manufacturer's mobile app, 2) Navigate to device settings, 3) Select 'Update Firmware', 4) Wait for the update to complete, then 5) Restart Home Assistant."
}
}
}
```
- **String Content Must Include**:
- What the problem is
- Why it matters
- Exact steps to resolve (numbered list when multiple steps)
- What to expect after following the steps
- **Avoid Vague Instructions**: Don't just say "update firmware" - provide specific steps
- **Severity Guidelines**:
- `CRITICAL`: Reserved for extreme scenarios only
- `ERROR`: Requires immediate user attention
- `WARNING`: Indicates future potential breakage
- **Additional Attributes**:
```python
ir.async_create_issue(
hass, DOMAIN, "issue_id",
breaks_in_ha_version="2024.1.0",
is_fixable=True,
is_persistent=True,
severity=ir.IssueSeverity.ERROR,
translation_key="issue_description",
)
```
- Only create issues for problems users can potentially resolve
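When an issue is created with `is_fixable=True`, the repairs platform is expected to provide a fix flow; a minimal sketch using the confirm-only helper (assuming no extra user input is required to apply the fix):
```python
from homeassistant.components.repairs import ConfirmRepairFlow, RepairsFlow
from homeassistant.core import HomeAssistant


async def async_create_fix_flow(
    hass: HomeAssistant,
    issue_id: str,
    data: dict[str, str | int | float | None] | None,
) -> RepairsFlow:
    """Create a flow that resolves the issue once the user confirms."""
    return ConfirmRepairFlow()
```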

View File

@@ -91,7 +91,6 @@ components: &components
- homeassistant/components/input_number/**
- homeassistant/components/input_select/**
- homeassistant/components/input_text/**
- homeassistant/components/labs/**
- homeassistant/components/logbook/**
- homeassistant/components/logger/**
- homeassistant/components/lovelace/**

View File

@@ -40,8 +40,7 @@
"python.terminal.activateEnvInCurrentTerminal": true,
"python.testing.pytestArgs": ["--no-cov"],
"pylint.importStrategy": "fromEnvironment",
// Pyright type checking is not compatible with mypy which Home Assistant uses for type checking
"python.analysis.typeCheckingMode": "off",
"python.analysis.typeCheckingMode": "basic",
"editor.formatOnPaste": false,
"editor.formatOnSave": true,
"editor.formatOnType": true,

View File

@@ -2,6 +2,49 @@
This repository contains the core of Home Assistant, a Python 3 based home automation application.
## Integration Quality Scale
Home Assistant uses an Integration Quality Scale to ensure code quality and consistency. The quality level determines which rules apply:
### Quality Scale Levels
- **Bronze**: Basic requirements (ALL Bronze rules are mandatory)
- **Silver**: Enhanced functionality
- **Gold**: Advanced features
- **Platinum**: Highest quality standards
### How Rules Apply
1. **Check `manifest.json`**: Look for `"quality_scale"` key to determine integration level
2. **Bronze Rules**: Always required for any integration with quality scale
3. **Higher Tier Rules**: Only apply if integration targets that tier or higher
4. **Rule Status**: Check `quality_scale.yaml` in integration folder for:
- `done`: Rule implemented
- `exempt`: Rule doesn't apply (with reason in comment)
- `todo`: Rule needs implementation
### Example `quality_scale.yaml` Structure
```yaml
rules:
# Bronze (mandatory)
config-flow: done
entity-unique-id: done
action-setup:
status: exempt
comment: Integration does not register custom actions.
# Silver (if targeting Silver+)
entity-unavailable: done
parallel-updates: done
# Gold (if targeting Gold+)
devices: done
diagnostics: done
# Platinum (if targeting Platinum)
strict-typing: done
```
**When Reviewing/Creating Code**: Always check the integration's quality scale level and exemption status before applying rules.
## Code Review Guidelines
**When reviewing code, do NOT comment on:**
@@ -48,6 +91,73 @@ This repository contains the core of Home Assistant, a Python 3 based home autom
- Use sentence case for titles and messages (capitalize only the first word and proper nouns)
- Avoid abbreviations when possible
## Code Organization
### Core Locations
- Shared constants: `homeassistant/const.py` (use these instead of hardcoding)
- Integration structure:
- `homeassistant/components/{domain}/const.py` - Constants
- `homeassistant/components/{domain}/models.py` - Data models
- `homeassistant/components/{domain}/coordinator.py` - Update coordinator
- `homeassistant/components/{domain}/config_flow.py` - Configuration flow
- `homeassistant/components/{domain}/{platform}.py` - Platform implementations
### Common Modules
- **coordinator.py**: Centralize data fetching logic
```python
class MyCoordinator(DataUpdateCoordinator[MyData]):
def __init__(self, hass: HomeAssistant, client: MyClient, config_entry: ConfigEntry) -> None:
super().__init__(
hass,
logger=LOGGER,
name=DOMAIN,
update_interval=timedelta(minutes=1),
config_entry=config_entry, # ✅ Pass config_entry - it's accepted and recommended
)
```
- **entity.py**: Base entity definitions to reduce duplication
```python
class MyEntity(CoordinatorEntity[MyCoordinator]):
_attr_has_entity_name = True
```
### Runtime Data Storage
- **Use ConfigEntry.runtime_data**: Store non-persistent runtime data
```python
type MyIntegrationConfigEntry = ConfigEntry[MyClient]
async def async_setup_entry(hass: HomeAssistant, entry: MyIntegrationConfigEntry) -> bool:
client = MyClient(entry.data[CONF_HOST])
entry.runtime_data = client
```
### Manifest Requirements
- **Required Fields**: `domain`, `name`, `codeowners`, `integration_type`, `documentation`, `requirements`
- **Integration Types**: `device`, `hub`, `service`, `system`, `helper`
- **IoT Class**: Always specify connectivity method (e.g., `cloud_polling`, `local_polling`, `local_push`)
- **Discovery Methods**: Add when applicable: `zeroconf`, `dhcp`, `bluetooth`, `ssdp`, `usb`
- **Dependencies**: Include platform dependencies (e.g., `application_credentials`, `bluetooth_adapters`)
### Config Flow Patterns
- **Version Control**: Always set `VERSION = 1` and `MINOR_VERSION = 1`
- **Unique ID Management**:
```python
await self.async_set_unique_id(device_unique_id)
self._abort_if_unique_id_configured()
```
- **Error Handling**: Define errors in `strings.json` under `config.error`
- **Step Methods**: Use standard naming (`async_step_user`, `async_step_discovery`, etc.)
### Integration Ownership
- **manifest.json**: Add GitHub usernames to `codeowners`:
```json
{
"domain": "my_integration",
"name": "My Integration",
"codeowners": ["@me"]
}
```
### Documentation Standards
- **File Headers**: Short and concise
```python
@@ -73,6 +183,19 @@ This repository contains the core of Home Assistant, a Python 3 based home autom
- No blocking calls
- Group executor jobs when possible - switching between event loop and executor is expensive
### Async Dependencies (Platinum)
- **Requirement**: All dependencies must use asyncio
- Ensures efficient task handling without thread context switching
### WebSession Injection (Platinum)
- **Pass WebSession**: Support passing web sessions to dependencies
```python
async def async_setup_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
"""Set up integration from config entry."""
client = MyClient(entry.data[CONF_HOST], async_get_clientsession(hass))
```
- For cookies: Use `async_create_clientsession` (aiohttp) or `create_async_httpx_client` (httpx)
### Blocking Operations
- **Use Executor**: For blocking I/O operations
```python
@@ -92,6 +215,200 @@ This repository contains the core of Home Assistant, a Python 3 based home autom
- **Sync APIs from Threads**: Use sync versions when calling from non-event loop threads
- **Registry Changes**: Must be done in event loop thread
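As a general sketch, a blocking library call is moved off the event loop with `hass.async_add_executor_job` (`fetch_data_sync` is a placeholder for a synchronous client method):
```python
async def async_update(self) -> None:
    """Fetch data without blocking the event loop."""
    # Run the blocking client call in the executor thread pool
    data = await self.hass.async_add_executor_job(self.client.fetch_data_sync)
    self._attr_native_value = data.value
```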
### Data Update Coordinator
- **Standard Pattern**: Use for efficient data management
```python
class MyCoordinator(DataUpdateCoordinator):
def __init__(self, hass: HomeAssistant, client: MyClient, config_entry: ConfigEntry) -> None:
super().__init__(
hass,
logger=LOGGER,
name=DOMAIN,
update_interval=timedelta(minutes=5),
config_entry=config_entry, # ✅ Pass config_entry - it's accepted and recommended
)
self.client = client
async def _async_update_data(self):
try:
return await self.client.fetch_data()
except ApiError as err:
raise UpdateFailed(f"API communication error: {err}")
```
- **Error Types**: Use `UpdateFailed` for API errors, `ConfigEntryAuthFailed` for auth issues
- **Config Entry**: Always pass `config_entry` parameter to coordinator - it's accepted and recommended
## Integration Guidelines
### Configuration Flow
- **UI Setup Required**: All integrations must support configuration via UI
- **Manifest**: Set `"config_flow": true` in `manifest.json`
- **Data Storage**:
- Connection-critical config: Store in `ConfigEntry.data`
- Non-critical settings: Store in `ConfigEntry.options`
- **Validation**: Always validate user input before creating entries
- **Config Entry Naming**:
- ❌ Do NOT allow users to set config entry names in config flows
- Names are automatically generated or can be customized later in UI
- ✅ Exception: Helper integrations MAY allow custom names in config flow
- **Connection Testing**: Test device/service connection during config flow:
```python
try:
await client.get_data()
except MyException:
errors["base"] = "cannot_connect"
```
- **Duplicate Prevention**: Prevent duplicate configurations:
```python
# Using unique ID
await self.async_set_unique_id(identifier)
self._abort_if_unique_id_configured()
# Using unique data
self._async_abort_entries_match({CONF_HOST: user_input[CONF_HOST]})
```
### Reauthentication Support
- **Required Method**: Implement `async_step_reauth` in config flow
- **Credential Updates**: Allow users to update credentials without re-adding
- **Validation**: Verify account matches existing unique ID:
```python
await self.async_set_unique_id(user_id)
self._abort_if_unique_id_mismatch(reason="wrong_account")
return self.async_update_reload_and_abort(
self._get_reauth_entry(),
data_updates={CONF_API_TOKEN: user_input[CONF_API_TOKEN]}
)
```
### Reconfiguration Flow
- **Purpose**: Allow configuration updates without removing device
- **Implementation**: Add `async_step_reconfigure` method
- **Validation**: Prevent changing underlying account with `_abort_if_unique_id_mismatch`
### Device Discovery
- **Manifest Configuration**: Add discovery method (zeroconf, dhcp, etc.)
```json
{
"zeroconf": ["_mydevice._tcp.local."]
}
```
- **Discovery Handler**: Implement appropriate `async_step_*` method:
```python
async def async_step_zeroconf(self, discovery_info):
"""Handle zeroconf discovery."""
await self.async_set_unique_id(discovery_info.properties["serialno"])
self._abort_if_unique_id_configured(updates={CONF_HOST: discovery_info.host})
```
- **Network Updates**: Use discovery to update dynamic IP addresses
### Network Discovery Implementation
- **Zeroconf/mDNS**: Use async instances
```python
aiozc = await zeroconf.async_get_async_instance(hass)
```
- **SSDP Discovery**: Register callbacks with cleanup
```python
entry.async_on_unload(
ssdp.async_register_callback(
hass, _async_discovered_device,
{"st": "urn:schemas-upnp-org:device:ZonePlayer:1"}
)
)
```
### Bluetooth Integration
- **Manifest Dependencies**: Add `bluetooth_adapters` to dependencies
- **Connectable**: Set `"connectable": true` for connection-required devices
- **Scanner Usage**: Always use shared scanner instance
```python
scanner = bluetooth.async_get_scanner()
entry.async_on_unload(
bluetooth.async_register_callback(
hass, _async_discovered_device,
{"service_uuid": "example_uuid"},
bluetooth.BluetoothScanningMode.ACTIVE
)
)
```
- **Connection Handling**: Never reuse `BleakClient` instances, use 10+ second timeouts
### Setup Validation
- **Test Before Setup**: Verify integration can be set up in `async_setup_entry`
- **Exception Handling**:
- `ConfigEntryNotReady`: Device offline or temporary failure
- `ConfigEntryAuthFailed`: Authentication issues
- `ConfigEntryError`: Unresolvable setup problems
### Config Entry Unloading
- **Required**: Implement `async_unload_entry` for runtime removal/reload
- **Platform Unloading**: Use `hass.config_entries.async_unload_platforms`
- **Cleanup**: Register callbacks with `entry.async_on_unload`:
```python
async def async_unload_entry(hass: HomeAssistant, entry: MyConfigEntry) -> bool:
"""Unload a config entry."""
if unload_ok := await hass.config_entries.async_unload_platforms(entry, PLATFORMS):
entry.runtime_data.listener() # Clean up resources
return unload_ok
```
### Service Actions
- **Registration**: Register all service actions in `async_setup`, NOT in `async_setup_entry`
- **Validation**: Check config entry existence and loaded state:
```python
async def async_setup(hass: HomeAssistant, config: ConfigType) -> bool:
async def service_action(call: ServiceCall) -> ServiceResponse:
if not (entry := hass.config_entries.async_get_entry(call.data[ATTR_CONFIG_ENTRY_ID])):
raise ServiceValidationError("Entry not found")
if entry.state is not ConfigEntryState.LOADED:
raise ServiceValidationError("Entry not loaded")
```
- **Exception Handling**: Raise appropriate exceptions:
```python
# For invalid input
if end_date < start_date:
raise ServiceValidationError("End date must be after start date")
# For service errors
try:
await client.set_schedule(start_date, end_date)
except MyConnectionError as err:
raise HomeAssistantError("Could not connect to the schedule") from err
```
### Service Registration Patterns
- **Entity Services**: Register on platform setup
```python
platform.async_register_entity_service(
"my_entity_service",
{vol.Required("parameter"): cv.string},
"handle_service_method"
)
```
- **Service Schema**: Always validate input
```python
SERVICE_SCHEMA = vol.Schema({
vol.Required("entity_id"): cv.entity_ids,
vol.Required("parameter"): cv.string,
vol.Optional("timeout", default=30): cv.positive_int,
})
```
- **Services File**: Create `services.yaml` with descriptions and field definitions
### Polling
- Use update coordinator pattern when possible
- **Polling intervals are NOT user-configurable**: Never add scan_interval, update_interval, or polling frequency options to config flows or config entries
- **Integration determines intervals**: Set `update_interval` programmatically based on integration logic, not user input
- **Minimum Intervals**:
- Local network: 5 seconds
- Cloud services: 60 seconds
- **Parallel Updates**: Specify number of concurrent updates:
```python
PARALLEL_UPDATES = 1 # Serialize updates to prevent overwhelming device
# OR
PARALLEL_UPDATES = 0 # Unlimited (for coordinator-based or read-only)
```
### Error Handling
- **Exception Types**: Choose most specific exception available
- `ServiceValidationError`: User input errors (preferred over `ValueError`)
@@ -123,7 +440,7 @@ This repository contains the core of Home Assistant, a Python 3 based home autom
except DeviceError:
_LOGGER.error("Failed to get data")
return
# ✅ Process data outside try block
processed = data.get("value", 0) * 100
self._attr_native_value = processed
@@ -135,14 +452,14 @@ This repository contains the core of Home Assistant, a Python 3 based home autom
data = await device.get_data()
except Exception: # Too broad
_LOGGER.error("Failed")
# ✅ Allowed in config flow for robustness
async def async_step_user(self, user_input=None):
try:
await self._test_connection(user_input)
except Exception: # Allowed here
errors["base"] = "unknown"
# ✅ Allowed in background tasks
async def _background_refresh():
try:
@@ -177,7 +494,7 @@ This repository contains the core of Home Assistant, a Python 3 based home autom
- **Implementation Pattern**:
```python
_unavailable_logged: bool = False
if not self._unavailable_logged:
_LOGGER.info("The sensor is unavailable: %s", ex)
self._unavailable_logged = True
@@ -187,17 +504,366 @@ This repository contains the core of Home Assistant, a Python 3 based home autom
self._unavailable_logged = False
```
## Entity Development
### Unique IDs
- **Required**: Every entity must have a unique ID for registry tracking
- Must be unique per platform (not per integration)
- Don't include integration domain or platform in ID
- **Implementation**:
```python
class MySensor(SensorEntity):
def __init__(self, device_id: str) -> None:
self._attr_unique_id = f"{device_id}_temperature"
```
**Acceptable ID Sources**:
- Device serial numbers
- MAC addresses (formatted using `format_mac` from device registry)
- Physical identifiers (printed/EEPROM)
- Config entry ID as last resort: `f"{entry.entry_id}-battery"`
**Never Use**:
- IP addresses, hostnames, URLs
- Device names
- Email addresses, usernames
### Entity Descriptions
- **Lambda/Anonymous Functions**: Often used in EntityDescription for value transformation
- **Multiline Lambdas**: When lambdas exceed line length, wrap in parentheses for readability
- **Bad pattern**:
```python
SensorEntityDescription(
key="temperature",
name="Temperature",
value_fn=lambda data: round(data["temp_value"] * 1.8 + 32, 1) if data.get("temp_value") is not None else None, # ❌ Too long
)
```
- **Good pattern**:
```python
SensorEntityDescription(
key="temperature",
name="Temperature",
value_fn=lambda data: ( # ✅ Parenthesis on same line as lambda
round(data["temp_value"] * 1.8 + 32, 1)
if data.get("temp_value") is not None
else None
),
)
```
### Entity Naming
- **Use has_entity_name**: Set `_attr_has_entity_name = True`
- **For specific fields**:
```python
class MySensor(SensorEntity):
_attr_has_entity_name = True
def __init__(self, device: Device, field: str) -> None:
self._attr_device_info = DeviceInfo(
identifiers={(DOMAIN, device.id)},
name=device.name,
)
self._attr_name = field # e.g., "temperature", "humidity"
```
- **For device itself**: Set `_attr_name = None`
### Event Lifecycle Management
- **Subscribe in `async_added_to_hass`**:
```python
async def async_added_to_hass(self) -> None:
"""Subscribe to events."""
self.async_on_remove(
self.client.events.subscribe("my_event", self._handle_event)
)
```
- **Unsubscribe in `async_will_remove_from_hass`** if not using `async_on_remove`
- Never subscribe in `__init__` or other methods
### State Handling
- Unknown values: Use `None` (not "unknown" or "unavailable")
- Availability: Implement the `available` property instead of reporting an "unavailable" state value
### Entity Availability
- **Mark Unavailable**: When data cannot be fetched from device/service
- **Coordinator Pattern**:
```python
@property
def available(self) -> bool:
"""Return if entity is available."""
return super().available and self.identifier in self.coordinator.data
```
- **Direct Update Pattern**:
```python
async def async_update(self) -> None:
"""Update entity."""
try:
data = await self.client.get_data()
except MyException:
self._attr_available = False
else:
self._attr_available = True
self._attr_native_value = data.value
```
### Extra State Attributes
- All attribute keys must always be present
- Unknown values: Use `None`
- Provide descriptive attributes
## Device Management
### Device Registry
- **Create Devices**: Group related entities under devices
- **Device Info**: Provide comprehensive metadata:
```python
_attr_device_info = DeviceInfo(
connections={(CONNECTION_NETWORK_MAC, device.mac)},
identifiers={(DOMAIN, device.id)},
name=device.name,
manufacturer="My Company",
model="My Sensor",
sw_version=device.version,
)
```
- For services: Add `entry_type=DeviceEntryType.SERVICE`
### Dynamic Device Addition
- **Auto-detect New Devices**: After initial setup
- **Implementation Pattern**:
```python
def _check_device() -> None:
current_devices = set(coordinator.data)
new_devices = current_devices - known_devices
if new_devices:
known_devices.update(new_devices)
async_add_entities([MySensor(coordinator, device_id) for device_id in new_devices])
entry.async_on_unload(coordinator.async_add_listener(_check_device))
```
### Stale Device Removal
- **Auto-remove**: When devices disappear from hub/account
- **Device Registry Update**:
```python
device_registry.async_update_device(
device_id=device.id,
remove_config_entry_id=self.config_entry.entry_id,
)
```
- **Manual Deletion**: Implement `async_remove_config_entry_device` when needed
## Diagnostics and Repairs
### Integration Diagnostics
- **Required**: Implement diagnostic data collection
- **Implementation**:
```python
TO_REDACT = [CONF_API_KEY, CONF_LATITUDE, CONF_LONGITUDE]
async def async_get_config_entry_diagnostics(
hass: HomeAssistant, entry: MyConfigEntry
) -> dict[str, Any]:
"""Return diagnostics for a config entry."""
return {
"entry_data": async_redact_data(entry.data, TO_REDACT),
"data": entry.runtime_data.data,
}
```
- **Security**: Never expose passwords, tokens, or sensitive coordinates
### Repair Issues
- **Actionable Issues Required**: All repair issues must be actionable for end users
- **Issue Content Requirements**:
- Clearly explain what is happening
- Provide specific steps users need to take to resolve the issue
- Use friendly, helpful language
- Include relevant context (device names, error details, etc.)
- **Implementation**:
```python
ir.async_create_issue(
hass,
DOMAIN,
"outdated_version",
is_fixable=False,
issue_domain=DOMAIN,
severity=ir.IssueSeverity.ERROR,
translation_key="outdated_version",
)
```
- **Translation Strings Requirements**: Must contain user-actionable text in `strings.json`:
```json
{
"issues": {
"outdated_version": {
"title": "Device firmware is outdated",
"description": "Your device firmware version {current_version} is below the minimum required version {min_version}. To fix this issue: 1) Open the manufacturer's mobile app, 2) Navigate to device settings, 3) Select 'Update Firmware', 4) Wait for the update to complete, then 5) Restart Home Assistant."
}
}
}
```
- **String Content Must Include**:
- What the problem is
- Why it matters
- Exact steps to resolve (numbered list when multiple steps)
- What to expect after following the steps
- **Avoid Vague Instructions**: Don't just say "update firmware" - provide specific steps
- **Severity Guidelines**:
- `CRITICAL`: Reserved for extreme scenarios only
- `ERROR`: Requires immediate user attention
- `WARNING`: Indicates future potential breakage
- **Additional Attributes**:
```python
ir.async_create_issue(
hass, DOMAIN, "issue_id",
breaks_in_ha_version="2024.1.0",
is_fixable=True,
is_persistent=True,
severity=ir.IssueSeverity.ERROR,
translation_key="issue_description",
)
```
- Only create issues for problems users can potentially resolve
### Entity Categories
- **Required**: Assign appropriate category to entities
- **Implementation**: Set `_attr_entity_category`
```python
class MySensor(SensorEntity):
_attr_entity_category = EntityCategory.DIAGNOSTIC
```
- Categories include: `DIAGNOSTIC` for system/technical information
### Device Classes
- **Use When Available**: Set appropriate device class for entity type
```python
class MyTemperatureSensor(SensorEntity):
_attr_device_class = SensorDeviceClass.TEMPERATURE
```
- Provides context for: unit conversion, voice control, UI representation
### Disabled by Default
- **Disable Noisy/Less Popular Entities**: Reduce resource usage
```python
class MySignalStrengthSensor(SensorEntity):
_attr_entity_registry_enabled_default = False
```
- Target: frequently changing states, technical diagnostics
### Entity Translations
- **Required with has_entity_name**: Support international users
- **Implementation**:
```python
class MySensor(SensorEntity):
_attr_has_entity_name = True
_attr_translation_key = "phase_voltage"
```
- Create `strings.json` with translations:
```json
{
"entity": {
"sensor": {
"phase_voltage": {
"name": "Phase voltage"
}
}
}
}
```
### Exception Translations (Gold)
- **Translatable Errors**: Use translation keys for user-facing exceptions
- **Implementation**:
```python
raise ServiceValidationError(
translation_domain=DOMAIN,
translation_key="end_date_before_start_date",
)
```
- Add to `strings.json`:
```json
{
"exceptions": {
"end_date_before_start_date": {
"message": "The end date cannot be before the start date."
}
}
}
```
### Icon Translations (Gold)
- **Dynamic Icons**: Support state and range-based icon selection
- **State-based Icons**:
```json
{
"entity": {
"sensor": {
"tree_pollen": {
"default": "mdi:tree",
"state": {
"high": "mdi:tree-outline"
}
}
}
}
}
```
- **Range-based Icons** (for numeric values):
```json
{
"entity": {
"sensor": {
"battery_level": {
"default": "mdi:battery-unknown",
"range": {
"0": "mdi:battery-outline",
"90": "mdi:battery-90",
"100": "mdi:battery"
}
}
}
}
}
```
## Testing Requirements
- **Location**: `tests/components/{domain}/`
- **Coverage Requirement**: Above 95% test coverage for all modules
- **Best Practices**:
- Use pytest fixtures from `tests.common`
- Mock all external dependencies
- Use snapshots for complex data structures
- Follow existing test patterns
### Config Flow Testing
- **100% Coverage Required**: All config flow paths must be tested
- **Test Scenarios**:
- All flow initiation methods (user, discovery, import)
- Successful configuration paths
- Error recovery scenarios
- Prevention of duplicate entries
- Flow completion after errors
## Development Commands
### Code Quality & Linting
- **Run all linters on all files**: `prek run --all-files`
- **Run linters on staged files only**: `prek run`
- **Run all linters on all files**: `pre-commit run --all-files`
- **Run linters on staged files only**: `pre-commit run`
- **PyLint on everything** (slow): `pylint homeassistant`
- **PyLint on specific folder**: `pylint homeassistant/components/my_integration`
- **MyPy type checking (whole project)**: `mypy homeassistant/`
- **MyPy on specific integration**: `mypy homeassistant/components/my_integration`
### Testing
- **Integration-specific tests** (recommended):
```bash
pytest ./tests/components/<integration_domain> \
--cov=homeassistant.components.<integration_domain> \
--cov-report term-missing \
--durations-min=1 \
--durations=0 \
--numprocesses=auto
```
- **Quick test of changed files**: `pytest --timeout=10 --picked`
- **Update test snapshots**: Add `--snapshot-update` to pytest command
- ⚠️ Do not rely on test results from a `--snapshot-update` run; re-run the tests without the flag to verify the snapshots
@@ -206,27 +872,62 @@ This repository contains the core of Home Assistant, a Python 3 based home autom
### Dependencies & Requirements
- **Update generated files after dependency changes**: `python -m script.gen_requirements_all`
- **Install all Python requirements**:
```bash
uv pip install -r requirements_all.txt -r requirements.txt -r requirements_test.txt
```
- **Install test requirements only**:
```bash
uv pip install -r requirements_test_all.txt -r requirements.txt
```
### Translations
- **Update translations after strings.json changes**:
```bash
python -m script.translations develop --all
```
### Project Validation
- **Run hassfest** (checks project structure and updates generated files):
```bash
python -m script.hassfest
```
### File Locations
- **Integration code**: `./homeassistant/components/<integration_domain>/`
- **Integration tests**: `./tests/components/<integration_domain>/`
## Integration Templates
### Standard Integration Structure
```
homeassistant/components/my_integration/
├── __init__.py # Entry point with async_setup_entry
├── manifest.json # Integration metadata and dependencies
├── const.py # Domain and constants
├── config_flow.py # UI configuration flow
├── coordinator.py # Data update coordinator (if needed)
├── entity.py # Base entity class (if shared patterns)
├── sensor.py # Sensor platform
├── strings.json # User-facing text and translations
├── services.yaml # Service definitions (if applicable)
└── quality_scale.yaml # Quality scale rule status
```
### Quality Scale Progression
- **Bronze → Silver**: Add entity unavailability, parallel updates, auth flows
- **Silver → Gold**: Add device management, diagnostics, translations
- **Gold → Platinum**: Add strict typing, async dependencies, websession injection
### Minimal Integration Checklist
- [ ] `manifest.json` with required fields (domain, name, codeowners, etc.)
- [ ] `__init__.py` with `async_setup_entry` and `async_unload_entry`
- [ ] `config_flow.py` with UI configuration support
- [ ] `const.py` with `DOMAIN` constant
- [ ] `strings.json` with at least config flow text
- [ ] Platform files (`sensor.py`, etc.) as needed
- [ ] `quality_scale.yaml` with rule status tracking
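A minimal `__init__.py` tying the checklist together might look like this sketch (real integrations usually also create a coordinator or store runtime data; `DOMAIN` lives in `const.py`):
```python
"""The my_integration integration."""

from homeassistant.config_entries import ConfigEntry
from homeassistant.const import Platform
from homeassistant.core import HomeAssistant

PLATFORMS: list[Platform] = [Platform.SENSOR]


async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
    """Set up my_integration from a config entry."""
    # Create API clients/coordinators here, then forward setup to the platforms.
    await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
    return True


async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
    """Unload a config entry."""
    return await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
```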
## Common Anti-Patterns & Best Practices
### ❌ **Avoid These Patterns**
@@ -267,7 +968,7 @@ try:
response = await client.get_data() # Can throw
# ❌ Data processing should be outside try block
temperature = response["temperature"] / 10
humidity = response["humidity"]
self._attr_native_value = temperature
except ClientError:
_LOGGER.error("Failed to fetch data")
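# ✅ Preferred shape (a sketch reusing the hypothetical client and fields above):
# only the I/O call sits inside the try block
try:
    response = await client.get_data()
except ClientError:
    _LOGGER.error("Failed to fetch data")
    return
temperature = response["temperature"] / 10
humidity = response["humidity"]
self._attr_native_value = temperature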
@@ -315,10 +1016,169 @@ class MyCoordinator(DataUpdateCoordinator[MyData]):
# ✅ Integration determines interval based on device capabilities, connection type, etc.
interval = timedelta(minutes=1) if client.is_local else SCAN_INTERVAL
super().__init__(
hass,
logger=LOGGER,
name=DOMAIN,
update_interval=interval,
config_entry=config_entry, # ✅ Pass config_entry - it's accepted and recommended
)
```
### Entity Performance Optimization
```python
# Use __slots__ for memory efficiency
class MySensor(SensorEntity):
__slots__ = ("_attr_native_value", "_attr_available")
@property
def should_poll(self) -> bool:
"""Disable polling when using coordinator."""
return False # ✅ Let coordinator handle updates
```
## Testing Patterns
### Testing Best Practices
- **Never access `hass.data` directly** - Use fixtures and proper integration setup instead
- **Use snapshot testing** - For verifying entity states and attributes
- **Test through integration setup** - Don't test entities in isolation
- **Mock external APIs** - Use fixtures with realistic JSON data
- **Verify registries** - Ensure entities are properly registered with devices
### Config Flow Testing Template
```python
async def test_user_flow_success(hass, mock_api):
"""Test successful user flow."""
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)
assert result["type"] == FlowResultType.FORM
assert result["step_id"] == "user"
# Test form submission
result = await hass.config_entries.flow.async_configure(
result["flow_id"], user_input=TEST_USER_INPUT
)
assert result["type"] == FlowResultType.CREATE_ENTRY
assert result["title"] == "My Device"
assert result["data"] == TEST_USER_INPUT
async def test_flow_connection_error(hass, mock_api_error):
"""Test connection error handling."""
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)
result = await hass.config_entries.flow.async_configure(
result["flow_id"], user_input=TEST_USER_INPUT
)
assert result["type"] == FlowResultType.FORM
assert result["errors"] == {"base": "cannot_connect"}
```
### Entity Testing Patterns
```python
@pytest.fixture
def platforms() -> list[Platform]:
"""Overridden fixture to specify platforms to test."""
return [Platform.SENSOR] # Or another specific platform as needed.
@pytest.mark.usefixtures("entity_registry_enabled_by_default", "init_integration")
async def test_entities(
hass: HomeAssistant,
snapshot: SnapshotAssertion,
entity_registry: er.EntityRegistry,
device_registry: dr.DeviceRegistry,
mock_config_entry: MockConfigEntry,
) -> None:
"""Test the sensor entities."""
await snapshot_platform(hass, entity_registry, snapshot, mock_config_entry.entry_id)
# Ensure entities are correctly assigned to device
device_entry = device_registry.async_get_device(
identifiers={(DOMAIN, "device_unique_id")}
)
assert device_entry
entity_entries = er.async_entries_for_config_entry(
entity_registry, mock_config_entry.entry_id
)
for entity_entry in entity_entries:
assert entity_entry.device_id == device_entry.id
```
### Mock Patterns
```python
# Modern integration fixture setup
@pytest.fixture
def mock_config_entry() -> MockConfigEntry:
"""Return the default mocked config entry."""
return MockConfigEntry(
title="My Integration",
domain=DOMAIN,
data={CONF_HOST: "127.0.0.1", CONF_API_KEY: "test_key"},
unique_id="device_unique_id",
)
@pytest.fixture
def mock_device_api() -> Generator[MagicMock]:
"""Return a mocked device API."""
with patch("homeassistant.components.my_integration.MyDeviceAPI", autospec=True) as api_mock:
api = api_mock.return_value
api.get_data.return_value = MyDeviceData.from_json(
load_fixture("device_data.json", DOMAIN)
)
yield api
@pytest.fixture
def platforms() -> list[Platform]:
"""Fixture to specify platforms to test."""
return PLATFORMS
@pytest.fixture
async def init_integration(
hass: HomeAssistant,
mock_config_entry: MockConfigEntry,
mock_device_api: MagicMock,
platforms: list[Platform],
) -> MockConfigEntry:
"""Set up the integration for testing."""
mock_config_entry.add_to_hass(hass)
with patch("homeassistant.components.my_integration.PLATFORMS", platforms):
await hass.config_entries.async_setup(mock_config_entry.entry_id)
await hass.async_block_till_done()
return mock_config_entry
```
## Debugging & Troubleshooting
### Common Issues & Solutions
- **Integration won't load**: Check `manifest.json` syntax and required fields
- **Entities not appearing**: Verify `unique_id` and `has_entity_name` implementation
- **Config flow errors**: Check `strings.json` entries and error handling
- **Discovery not working**: Verify manifest discovery configuration and callbacks
- **Tests failing**: Check mock setup and async context
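For the "entities not appearing" case, the usual fix is giving every entity a stable unique ID and opting into entity naming; a sketch (names are illustrative):
```python
from homeassistant.components.sensor import SensorEntity


class MyTemperatureSensor(SensorEntity):
    """Sensor registered under a stable unique ID."""

    _attr_has_entity_name = True
    _attr_translation_key = "temperature"

    def __init__(self, device_id: str) -> None:
        """Initialize the sensor with a unique ID derived from the device."""
        self._attr_unique_id = f"{device_id}_temperature"
```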
### Debug Logging Setup
```python
# Enable debug logging in tests
caplog.set_level(logging.DEBUG, logger="my_integration")
# In integration code - use proper logging
_LOGGER = logging.getLogger(__name__)
_LOGGER.debug("Processing data: %s", data) # Use lazy logging
```
### Validation Commands
```bash
# Check specific integration
python -m script.hassfest --integration-path homeassistant/components/my_integration
# Validate quality scale
# Check quality_scale.yaml against current rules
# Run integration tests with coverage
pytest ./tests/components/my_integration \
--cov=homeassistant.components.my_integration \
--cov-report term-missing
```

@@ -59,6 +59,7 @@ env:
# 15 is the latest version
# - 15.2 is the latest (as of 9 Feb 2023)
POSTGRESQL_VERSIONS: "['postgres:12.14','postgres:15.2']"
PRE_COMMIT_CACHE: ~/.cache/pre-commit
UV_CACHE_DIR: /tmp/uv-cache
APT_CACHE_BASE: /home/runner/work/apt
APT_CACHE_DIR: /home/runner/work/apt/cache
@@ -82,6 +83,7 @@ jobs:
integrations_glob: ${{ steps.info.outputs.integrations_glob }}
integrations: ${{ steps.integrations.outputs.changes }}
apt_cache_key: ${{ steps.generate_apt_cache_key.outputs.key }}
pre-commit_cache_key: ${{ steps.generate_pre-commit_cache_key.outputs.key }}
python_cache_key: ${{ steps.generate_python_cache_key.outputs.key }}
requirements: ${{ steps.core.outputs.requirements }}
mariadb_groups: ${{ steps.info.outputs.mariadb_groups }}
@@ -109,6 +111,11 @@ jobs:
hashFiles('requirements_all.txt') }}-${{
hashFiles('homeassistant/package_constraints.txt') }}-${{
hashFiles('script/gen_requirements_all.py') }}" >> $GITHUB_OUTPUT
- name: Generate partial pre-commit restore key
id: generate_pre-commit_cache_key
run: >-
echo "key=pre-commit-${{ env.CACHE_VERSION }}-${{
hashFiles('.pre-commit-config.yaml') }}" >> $GITHUB_OUTPUT
- name: Generate partial apt restore key
id: generate_apt_cache_key
run: |
@@ -237,8 +244,8 @@ jobs:
echo "skip_coverage: ${skip_coverage}"
echo "skip_coverage=${skip_coverage}" >> $GITHUB_OUTPUT
prek:
name: Run prek checks
pre-commit:
name: Prepare pre-commit base
runs-on: *runs-on-ubuntu
needs: [info]
if: |
@@ -247,17 +254,147 @@ jobs:
&& github.event.inputs.audit-licenses-only != 'true'
steps:
- *checkout
- name: Register problem matchers
- &setup-python-default
name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: &actions-setup-python actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- name: Restore base Python virtual environment
id: cache-venv
uses: &actions-cache actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
with:
path: venv
key: &key-pre-commit-venv >-
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-venv-${{
needs.info.outputs.pre-commit_cache_key }}
- name: Create Python virtual environment
if: steps.cache-venv.outputs.cache-hit != 'true'
run: |
python -m venv venv
. venv/bin/activate
python --version
pip install "$(grep '^uv' < requirements.txt)"
uv pip install "$(cat requirements_test.txt | grep pre-commit)"
- name: Restore pre-commit environment from cache
id: cache-precommit
uses: *actions-cache
with:
path: ${{ env.PRE_COMMIT_CACHE }}
lookup-only: true
key: &key-pre-commit-env >-
${{ runner.os }}-${{ runner.arch }}-${{ steps.python.outputs.python-version }}-${{
needs.info.outputs.pre-commit_cache_key }}
- name: Install pre-commit dependencies
if: steps.cache-precommit.outputs.cache-hit != 'true'
run: |
. venv/bin/activate
pre-commit install-hooks
lint-ruff-format:
name: Check ruff-format
runs-on: *runs-on-ubuntu
needs: &needs-pre-commit
- info
- pre-commit
steps:
- *checkout
- *setup-python-default
- &cache-restore-pre-commit-venv
name: Restore base Python virtual environment
id: cache-venv
uses: &actions-cache-restore actions/cache/restore@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
with:
path: venv
fail-on-cache-miss: true
key: *key-pre-commit-venv
- &cache-restore-pre-commit-env
name: Restore pre-commit environment from cache
id: cache-precommit
uses: *actions-cache-restore
with:
path: ${{ env.PRE_COMMIT_CACHE }}
fail-on-cache-miss: true
key: *key-pre-commit-env
- name: Run ruff-format
run: |
. venv/bin/activate
pre-commit run --hook-stage manual ruff-format --all-files --show-diff-on-failure
env:
RUFF_OUTPUT_FORMAT: github
lint-ruff:
name: Check ruff
runs-on: *runs-on-ubuntu
needs: *needs-pre-commit
steps:
- *checkout
- *setup-python-default
- *cache-restore-pre-commit-venv
- *cache-restore-pre-commit-env
- name: Run ruff
run: |
. venv/bin/activate
pre-commit run --hook-stage manual ruff-check --all-files --show-diff-on-failure
env:
RUFF_OUTPUT_FORMAT: github
lint-other:
name: Check other linters
runs-on: *runs-on-ubuntu
needs: *needs-pre-commit
steps:
- *checkout
- *setup-python-default
- *cache-restore-pre-commit-venv
- *cache-restore-pre-commit-env
- name: Register yamllint problem matcher
run: |
echo "::add-matcher::.github/workflows/matchers/yamllint.json"
- name: Run yamllint
run: |
. venv/bin/activate
pre-commit run --hook-stage manual yamllint --all-files --show-diff-on-failure
- name: Register check-json problem matcher
run: |
echo "::add-matcher::.github/workflows/matchers/check-json.json"
- name: Run check-json
run: |
. venv/bin/activate
pre-commit run --hook-stage manual check-json --all-files --show-diff-on-failure
- name: Run prettier (fully)
if: needs.info.outputs.test_full_suite == 'true'
run: |
. venv/bin/activate
pre-commit run --hook-stage manual prettier --all-files --show-diff-on-failure
- name: Run prettier (partially)
if: needs.info.outputs.test_full_suite == 'false'
shell: bash
run: |
. venv/bin/activate
shopt -s globstar
pre-commit run --hook-stage manual prettier --show-diff-on-failure --files {homeassistant,tests}/components/${{ needs.info.outputs.integrations_glob }}/{*,**/*}
- name: Register check executables problem matcher
run: |
echo "::add-matcher::.github/workflows/matchers/check-executables-have-shebangs.json"
- name: Run executables check
run: |
. venv/bin/activate
pre-commit run --hook-stage manual check-executables-have-shebangs --all-files --show-diff-on-failure
- name: Register codespell problem matcher
run: |
echo "::add-matcher::.github/workflows/matchers/codespell.json"
- name: Run prek
uses: j178/prek-action@9d6a3097e0c1865ecce00cfb89fe80f2ee91b547 # v1.0.12
env:
PREK_SKIP: no-commit-to-branch,mypy,pylint,gen_requirements_all,hassfest,hassfest-metadata,hassfest-mypy-config
RUFF_OUTPUT_FORMAT: github
- name: Run codespell
run: |
. venv/bin/activate
pre-commit run --show-diff-on-failure --hook-stage manual codespell --all-files
lint-hadolint:
name: Check ${{ matrix.file }}
@@ -297,7 +434,7 @@ jobs:
- &setup-python-matrix
name: Set up Python ${{ matrix.python-version }}
id: python
uses: &actions-setup-python actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
uses: *actions-setup-python
with:
python-version: ${{ matrix.python-version }}
check-latest: true
@@ -310,7 +447,7 @@ jobs:
env.HA_SHORT_VERSION }}-$(date -u '+%Y-%m-%dT%H:%M:%s')" >> $GITHUB_OUTPUT
- name: Restore base Python virtual environment
id: cache-venv
uses: &actions-cache actions/cache@8b402f58fbc84540c8b491a91e594a4576fec3d7 # v5.0.2
uses: *actions-cache
with:
path: venv
key: &key-python-venv >-
@@ -374,7 +511,7 @@ jobs:
fi
- name: Save apt cache
if: steps.cache-apt-check.outputs.cache-hit != 'true'
uses: &actions-cache-save actions/cache/save@8b402f58fbc84540c8b491a91e594a4576fec3d7 # v5.0.2
uses: &actions-cache-save actions/cache/save@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
with:
path: *path-apt-cache
key: *key-apt-cache
@@ -425,7 +562,7 @@ jobs:
steps:
- &cache-restore-apt
name: Restore apt cache
uses: &actions-cache-restore actions/cache/restore@8b402f58fbc84540c8b491a91e594a4576fec3d7 # v5.0.2
uses: *actions-cache-restore
with:
path: *path-apt-cache
fail-on-cache-miss: true
@@ -442,13 +579,7 @@ jobs:
-o Dir::State::Lists=${{ env.APT_LIST_CACHE_DIR }} \
libturbojpeg
- *checkout
- &setup-python-default
name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: *actions-setup-python
with:
python-version: ${{ env.DEFAULT_PYTHON }}
check-latest: true
- *setup-python-default
- &cache-restore-python-default
name: Restore full Python ${{ env.DEFAULT_PYTHON }} virtual environment
id: cache-venv
@@ -651,7 +782,9 @@ jobs:
- base
- gen-requirements-all
- hassfest
- prek
- lint-other
- lint-ruff
- lint-ruff-format
- mypy
steps:
- *cache-restore-apt
@@ -690,7 +823,9 @@ jobs:
- base
- gen-requirements-all
- hassfest
- prek
- lint-other
- lint-ruff
- lint-ruff-format
- mypy
- prepare-pytest-full
if: |
@@ -814,7 +949,9 @@ jobs:
- base
- gen-requirements-all
- hassfest
- prek
- lint-other
- lint-ruff
- lint-ruff-format
- mypy
if: |
needs.info.outputs.lint_only != 'true'
@@ -929,7 +1066,9 @@ jobs:
- base
- gen-requirements-all
- hassfest
- prek
- lint-other
- lint-ruff
- lint-ruff-format
- mypy
if: |
needs.info.outputs.lint_only != 'true'
@@ -1063,7 +1202,9 @@ jobs:
- base
- gen-requirements-all
- hassfest
- prek
- lint-other
- lint-ruff
- lint-ruff-format
- mypy
if: |
needs.info.outputs.lint_only != 'true'
@@ -1187,8 +1328,6 @@ jobs:
- pytest-postgres
- pytest-mariadb
timeout-minutes: 10
permissions:
id-token: write
# codecov/test-results-action currently doesn't support tokenless uploads
# therefore we can't run it on forks
if: |
@@ -1200,9 +1339,8 @@ jobs:
with:
pattern: test-results-*
- name: Upload test results to Codecov
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
uses: codecov/test-results-action@47f89e9acb64b76debcd5ea40642d25a4adced9f # v1.1.1
with:
report_type: test_results
fail_ci_if_error: true
verbose: true
use_oidc: true
token: ${{ secrets.CODECOV_TOKEN }}

@@ -24,11 +24,11 @@ jobs:
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- name: Initialize CodeQL
uses: github/codeql-action/init@cdefb33c0f6224e58673d9004f47f7cb3e328b89 # v4.31.10
uses: github/codeql-action/init@5d4e8d1aca955e8d8589aabd499c5cae939e33c7 # v4.31.9
with:
languages: python
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@cdefb33c0f6224e58673d9004f47f7cb3e328b89 # v4.31.10
uses: github/codeql-action/analyze@5d4e8d1aca955e8d8589aabd499c5cae939e33c7 # v4.31.9
with:
category: "/language:python"

@@ -231,7 +231,7 @@ jobs:
- name: Detect duplicates using AI
id: ai_detection
if: steps.extract.outputs.should_continue == 'true' && steps.fetch_similar.outputs.has_similar == 'true'
uses: actions/ai-inference@a6101c89c6feaecc585efdd8d461f18bb7896f20 # v2.0.5
uses: actions/ai-inference@334892bb203895caaed82ec52d23c1ed9385151e # v2.0.4
with:
model: openai/gpt-4o
system-prompt: |

@@ -57,7 +57,7 @@ jobs:
- name: Detect language using AI
id: ai_language_detection
if: steps.detect_language.outputs.should_continue == 'true'
uses: actions/ai-inference@a6101c89c6feaecc585efdd8d461f18bb7896f20 # v2.0.5
uses: actions/ai-inference@334892bb203895caaed82ec52d23c1ed9385151e # v2.0.4
with:
model: openai/gpt-4o-mini
system-prompt: |

@@ -4,7 +4,7 @@
"owner": "check-executables-have-shebangs",
"pattern": [
{
"regexp": "^(.+):\\s(marked executable but has no \\(or invalid\\) shebang!.*)$",
"regexp": "^(.+):\\s(.+)$",
"file": 1,
"message": 2
}

@@ -1,6 +1,6 @@
repos:
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.14.13
rev: v0.13.0
hooks:
- id: ruff-check
args:
@@ -39,14 +39,14 @@ repos:
- id: prettier
additional_dependencies:
- prettier@3.6.2
- prettier-plugin-sort-json@4.2.0
- prettier-plugin-sort-json@4.1.1
- repo: https://github.com/cdce8p/python-typing-update
rev: v0.6.0
hooks:
# Run `python-typing-update` hook manually from time to time
# to update python typing syntax.
# Will require manual work, before submitting changes!
# prek run --hook-stage manual python-typing-update --all-files
# pre-commit run --hook-stage manual python-typing-update --all-files
- id: python-typing-update
stages: [manual]
args:

@@ -407,7 +407,6 @@ homeassistant.components.person.*
homeassistant.components.pi_hole.*
homeassistant.components.ping.*
homeassistant.components.plugwise.*
homeassistant.components.pooldose.*
homeassistant.components.portainer.*
homeassistant.components.powerfox.*
homeassistant.components.powerwall.*
@@ -455,7 +454,6 @@ homeassistant.components.russound_rio.*
homeassistant.components.ruuvi_gateway.*
homeassistant.components.ruuvitag_ble.*
homeassistant.components.samsungtv.*
homeassistant.components.saunum.*
homeassistant.components.scene.*
homeassistant.components.schedule.*
homeassistant.components.schlage.*

@@ -7,8 +7,8 @@
"python.testing.pytestEnabled": false,
// https://code.visualstudio.com/docs/python/linting#_general-settings
"pylint.importStrategy": "fromEnvironment",
// Pyright type checking is not compatible with mypy which Home Assistant uses for type checking
"python.analysis.typeCheckingMode": "off",
// Pyright is too pedantic for Home Assistant
"python.analysis.typeCheckingMode": "basic",
"[python]": {
"editor.defaultFormatter": "charliermarsh.ruff",
},

.vscode/tasks.json
@@ -45,7 +45,7 @@
{
"label": "Ruff",
"type": "shell",
"command": "prek run ruff-check --all-files",
"command": "pre-commit run ruff-check --all-files",
"group": {
"kind": "test",
"isDefault": true
@@ -57,9 +57,9 @@
"problemMatcher": []
},
{
"label": "Prek",
"label": "Pre-commit",
"type": "shell",
"command": "prek run --show-diff-on-failure",
"command": "pre-commit run --show-diff-on-failure",
"group": {
"kind": "test",
"isDefault": true
@@ -120,7 +120,7 @@
{
"label": "Generate Requirements",
"type": "shell",
"command": "${command:python.interpreterPath} -m script.gen_requirements_all",
"command": "./script/gen_requirements_all.py",
"group": {
"kind": "build",
"isDefault": true

CODEOWNERS
@@ -1017,8 +1017,8 @@ build.json @home-assistant/supervisor
/tests/components/mill/ @danielhiversen
/homeassistant/components/min_max/ @gjohansson-ST
/tests/components/min_max/ @gjohansson-ST
/homeassistant/components/minecraft_server/ @elmurato @zachdeibert
/tests/components/minecraft_server/ @elmurato @zachdeibert
/homeassistant/components/minecraft_server/ @elmurato
/tests/components/minecraft_server/ @elmurato
/homeassistant/components/minio/ @tkislan
/tests/components/minio/ @tkislan
/homeassistant/components/moat/ @bdraco
@@ -1068,8 +1068,6 @@ build.json @home-assistant/supervisor
/tests/components/myuplink/ @pajzo @astrandb
/homeassistant/components/nam/ @bieniu
/tests/components/nam/ @bieniu
/homeassistant/components/namecheapdns/ @tr4nt0r
/tests/components/namecheapdns/ @tr4nt0r
/homeassistant/components/nanoleaf/ @milanmeu @joostlek
/tests/components/nanoleaf/ @milanmeu @joostlek
/homeassistant/components/nasweb/ @nasWebio
@@ -1273,8 +1271,7 @@ build.json @home-assistant/supervisor
/tests/components/prosegur/ @dgomes
/homeassistant/components/proximity/ @mib1185
/tests/components/proximity/ @mib1185
/homeassistant/components/proxmoxve/ @jhollowe @Corbeno @erwindouna
/tests/components/proxmoxve/ @jhollowe @Corbeno @erwindouna
/homeassistant/components/proxmoxve/ @jhollowe @Corbeno
/homeassistant/components/ps4/ @ktnrg45
/tests/components/ps4/ @ktnrg45
/homeassistant/components/pterodactyl/ @elmurato

@@ -52,7 +52,7 @@ class AdGuardHomeEntity(Entity):
def device_info(self) -> DeviceInfo:
"""Return device information about this AdGuard Home instance."""
if self._entry.source == SOURCE_HASSIO:
config_url = "homeassistant://app/a0d7b954_adguard"
config_url = "homeassistant://hassio/ingress/a0d7b954_adguard"
elif self.adguard.tls:
config_url = f"https://{self.adguard.host}:{self.adguard.port}"
else:

@@ -12,7 +12,6 @@ PLATFORMS: list[Platform] = [
Platform.CLIMATE,
Platform.NUMBER,
Platform.SENSOR,
Platform.SWITCH,
]

@@ -63,11 +63,6 @@ class AirobotClimate(AirobotEntity, ClimateEntity):
_attr_min_temp = SETPOINT_TEMP_MIN
_attr_max_temp = SETPOINT_TEMP_MAX
def __init__(self, coordinator) -> None:
"""Initialize the climate entity."""
super().__init__(coordinator)
self._attr_unique_id = coordinator.data.status.device_id
@property
def _status(self) -> ThermostatStatus:
"""Get status from coordinator data."""

@@ -24,6 +24,8 @@ class AirobotEntity(CoordinatorEntity[AirobotDataUpdateCoordinator]):
status = coordinator.data.status
settings = coordinator.data.settings
self._attr_unique_id = status.device_id
connections = set()
if (mac := coordinator.config_entry.data.get(CONF_MAC)) is not None:
connections.add((CONNECTION_NETWORK_MAC, mac))

@@ -9,14 +9,6 @@
"hysteresis_band": {
"default": "mdi:delta"
}
},
"switch": {
"actuator_exercise_disabled": {
"default": "mdi:valve"
},
"child_lock": {
"default": "mdi:lock"
}
}
}
}

@@ -85,14 +85,6 @@
"heating_uptime": {
"name": "Heating uptime"
}
},
"switch": {
"actuator_exercise_disabled": {
"name": "Actuator exercise disabled"
},
"child_lock": {
"name": "Child lock"
}
}
},
"exceptions": {
@@ -113,12 +105,6 @@
},
"set_value_failed": {
"message": "Failed to set value: {error}"
},
"switch_turn_off_failed": {
"message": "Failed to turn off {switch}."
},
"switch_turn_on_failed": {
"message": "Failed to turn on {switch}."
}
}
}

@@ -1,118 +0,0 @@
"""Switch platform for Airobot thermostat."""
from __future__ import annotations
from collections.abc import Callable, Coroutine
from dataclasses import dataclass
from typing import Any
from pyairobotrest.exceptions import AirobotError
from homeassistant.components.switch import SwitchEntity, SwitchEntityDescription
from homeassistant.const import EntityCategory
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from . import AirobotConfigEntry
from .const import DOMAIN
from .coordinator import AirobotDataUpdateCoordinator
from .entity import AirobotEntity
PARALLEL_UPDATES = 0
@dataclass(frozen=True, kw_only=True)
class AirobotSwitchEntityDescription(SwitchEntityDescription):
"""Describes Airobot switch entity."""
is_on_fn: Callable[[AirobotDataUpdateCoordinator], bool]
turn_on_fn: Callable[[AirobotDataUpdateCoordinator], Coroutine[Any, Any, None]]
turn_off_fn: Callable[[AirobotDataUpdateCoordinator], Coroutine[Any, Any, None]]
SWITCH_TYPES: tuple[AirobotSwitchEntityDescription, ...] = (
AirobotSwitchEntityDescription(
key="child_lock",
translation_key="child_lock",
entity_category=EntityCategory.CONFIG,
is_on_fn=lambda coordinator: (
coordinator.data.settings.setting_flags.childlock_enabled
),
turn_on_fn=lambda coordinator: coordinator.client.set_child_lock(True),
turn_off_fn=lambda coordinator: coordinator.client.set_child_lock(False),
),
AirobotSwitchEntityDescription(
key="actuator_exercise_disabled",
translation_key="actuator_exercise_disabled",
entity_category=EntityCategory.CONFIG,
entity_registry_enabled_default=False,
is_on_fn=lambda coordinator: (
coordinator.data.settings.setting_flags.actuator_exercise_disabled
),
turn_on_fn=lambda coordinator: coordinator.client.toggle_actuator_exercise(
True
),
turn_off_fn=lambda coordinator: coordinator.client.toggle_actuator_exercise(
False
),
),
)
async def async_setup_entry(
hass: HomeAssistant,
entry: AirobotConfigEntry,
async_add_entities: AddConfigEntryEntitiesCallback,
) -> None:
"""Set up Airobot switch entities."""
coordinator = entry.runtime_data
async_add_entities(
AirobotSwitch(coordinator, description) for description in SWITCH_TYPES
)
class AirobotSwitch(AirobotEntity, SwitchEntity):
"""Representation of an Airobot switch."""
entity_description: AirobotSwitchEntityDescription
def __init__(
self,
coordinator: AirobotDataUpdateCoordinator,
description: AirobotSwitchEntityDescription,
) -> None:
"""Initialize the switch."""
super().__init__(coordinator)
self.entity_description = description
self._attr_unique_id = f"{coordinator.data.status.device_id}_{description.key}"
@property
def is_on(self) -> bool:
"""Return true if the switch is on."""
return self.entity_description.is_on_fn(self.coordinator)
async def async_turn_on(self, **kwargs: Any) -> None:
"""Turn the switch on."""
try:
await self.entity_description.turn_on_fn(self.coordinator)
except AirobotError as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="switch_turn_on_failed",
translation_placeholders={"switch": self.entity_description.key},
) from err
await self.coordinator.async_request_refresh()
async def async_turn_off(self, **kwargs: Any) -> None:
"""Turn the switch off."""
try:
await self.entity_description.turn_off_fn(self.coordinator)
except AirobotError as err:
raise HomeAssistantError(
translation_domain=DOMAIN,
translation_key="switch_turn_off_failed",
translation_placeholders={"switch": self.entity_description.key},
) from err
await self.coordinator.async_request_refresh()

@@ -85,22 +85,6 @@ class AirzoneSystemEntity(AirzoneEntity):
value = system[key]
return value
async def _async_update_sys_params(self, params: dict[str, Any]) -> None:
"""Send system parameters to API."""
_params = {
API_SYSTEM_ID: self.system_id,
**params,
}
_LOGGER.debug("update_sys_params=%s", _params)
try:
await self.coordinator.airzone.set_sys_parameters(_params)
except AirzoneError as error:
raise HomeAssistantError(
f"Failed to set system {self.entity_id}: {error}"
) from error
self.coordinator.async_set_updated_data(self.coordinator.airzone.data())
class AirzoneHotWaterEntity(AirzoneEntity):
"""Define an Airzone Hot Water entity."""

@@ -20,7 +20,6 @@ from aioairzone.const import (
AZD_MODES,
AZD_Q_ADAPT,
AZD_SLEEP,
AZD_SYSTEMS,
AZD_ZONES,
)
@@ -31,7 +30,7 @@ from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .coordinator import AirzoneConfigEntry, AirzoneUpdateCoordinator
from .entity import AirzoneEntity, AirzoneSystemEntity, AirzoneZoneEntity
from .entity import AirzoneEntity, AirzoneZoneEntity
@dataclass(frozen=True, kw_only=True)
@@ -86,18 +85,6 @@ def main_zone_options(
return [k for k, v in options.items() if v in modes]
SYSTEM_SELECT_TYPES: Final[tuple[AirzoneSelectDescription, ...]] = (
AirzoneSelectDescription(
api_param=API_Q_ADAPT,
entity_category=EntityCategory.CONFIG,
key=AZD_Q_ADAPT,
options=list(Q_ADAPT_DICT),
options_dict=Q_ADAPT_DICT,
translation_key="q_adapt",
),
)
MAIN_ZONE_SELECT_TYPES: Final[tuple[AirzoneSelectDescription, ...]] = (
AirzoneSelectDescription(
api_param=API_MODE,
@@ -106,6 +93,14 @@ MAIN_ZONE_SELECT_TYPES: Final[tuple[AirzoneSelectDescription, ...]] = (
options_fn=main_zone_options,
translation_key="modes",
),
AirzoneSelectDescription(
api_param=API_Q_ADAPT,
entity_category=EntityCategory.CONFIG,
key=AZD_Q_ADAPT,
options=list(Q_ADAPT_DICT),
options_dict=Q_ADAPT_DICT,
translation_key="q_adapt",
),
)
@@ -145,37 +140,16 @@ async def async_setup_entry(
"""Add Airzone select from a config_entry."""
coordinator = entry.runtime_data
added_systems: set[str] = set()
added_zones: set[str] = set()
def _async_entity_listener() -> None:
"""Handle additions of select."""
entities: list[AirzoneBaseSelect] = []
systems_data = coordinator.data.get(AZD_SYSTEMS, {})
received_systems = set(systems_data)
new_systems = received_systems - added_systems
if new_systems:
entities.extend(
AirzoneSystemSelect(
coordinator,
description,
entry,
system_id,
systems_data.get(system_id),
)
for system_id in new_systems
for description in SYSTEM_SELECT_TYPES
if description.key in systems_data.get(system_id)
)
added_systems.update(new_systems)
zones_data = coordinator.data.get(AZD_ZONES, {})
received_zones = set(zones_data)
new_zones = received_zones - added_zones
if new_zones:
entities.extend(
entities: list[AirzoneZoneSelect] = [
AirzoneZoneSelect(
coordinator,
description,
@@ -187,8 +161,8 @@ async def async_setup_entry(
for description in MAIN_ZONE_SELECT_TYPES
if description.key in zones_data.get(system_zone_id)
and zones_data.get(system_zone_id).get(AZD_MASTER) is True
)
entities.extend(
]
entities += [
AirzoneZoneSelect(
coordinator,
description,
@@ -199,11 +173,10 @@ async def async_setup_entry(
for system_zone_id in new_zones
for description in ZONE_SELECT_TYPES
if description.key in zones_data.get(system_zone_id)
)
]
async_add_entities(entities)
added_zones.update(new_zones)
async_add_entities(entities)
entry.async_on_unload(coordinator.async_add_listener(_async_entity_listener))
_async_entity_listener()
@@ -230,38 +203,6 @@ class AirzoneBaseSelect(AirzoneEntity, SelectEntity):
self._attr_current_option = self._get_current_option()
class AirzoneSystemSelect(AirzoneSystemEntity, AirzoneBaseSelect):
"""Define an Airzone System select."""
def __init__(
self,
coordinator: AirzoneUpdateCoordinator,
description: AirzoneSelectDescription,
entry: ConfigEntry,
system_id: str,
system_data: dict[str, Any],
) -> None:
"""Initialize."""
super().__init__(coordinator, entry, system_data)
self._attr_unique_id = f"{self._attr_unique_id}_{system_id}_{description.key}"
self.entity_description = description
self._attr_options = self.entity_description.options_fn(
system_data, description.options_dict
)
self.values_dict = {v: k for k, v in description.options_dict.items()}
self._async_update_attrs()
async def async_select_option(self, option: str) -> None:
"""Change the selected option."""
param = self.entity_description.api_param
value = self.entity_description.options_dict[option]
await self._async_update_sys_params({param: value})
class AirzoneZoneSelect(AirzoneZoneEntity, AirzoneBaseSelect):
"""Define an Airzone Zone select."""

@@ -1,93 +0,0 @@
"""Provides conditions for alarm control panels."""
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers.condition import (
Condition,
EntityStateConditionBase,
make_entity_state_condition,
)
from homeassistant.helpers.entity import get_supported_features
from .const import DOMAIN, AlarmControlPanelEntityFeature, AlarmControlPanelState
def supports_feature(hass: HomeAssistant, entity_id: str, features: int) -> bool:
"""Test if an entity supports the specified features."""
try:
return bool(get_supported_features(hass, entity_id) & features)
except HomeAssistantError:
return False
class EntityStateRequiredFeaturesCondition(EntityStateConditionBase):
"""State condition."""
_required_features: int
def entity_filter(self, entities: set[str]) -> set[str]:
"""Filter entities of this domain with the required features."""
entities = super().entity_filter(entities)
return {
entity_id
for entity_id in entities
if supports_feature(self._hass, entity_id, self._required_features)
}
def make_entity_state_required_features_condition(
domain: str, to_state: str, required_features: int
) -> type[EntityStateRequiredFeaturesCondition]:
"""Create an entity state condition class with required feature filtering."""
class CustomCondition(EntityStateRequiredFeaturesCondition):
"""Condition for entity state changes."""
_domain = domain
_states = {to_state}
_required_features = required_features
return CustomCondition
CONDITIONS: dict[str, type[Condition]] = {
"is_armed": make_entity_state_condition(
DOMAIN,
{
AlarmControlPanelState.ARMED_AWAY,
AlarmControlPanelState.ARMED_CUSTOM_BYPASS,
AlarmControlPanelState.ARMED_HOME,
AlarmControlPanelState.ARMED_NIGHT,
AlarmControlPanelState.ARMED_VACATION,
},
),
"is_armed_away": make_entity_state_required_features_condition(
DOMAIN,
AlarmControlPanelState.ARMED_AWAY,
AlarmControlPanelEntityFeature.ARM_AWAY,
),
"is_armed_home": make_entity_state_required_features_condition(
DOMAIN,
AlarmControlPanelState.ARMED_HOME,
AlarmControlPanelEntityFeature.ARM_HOME,
),
"is_armed_night": make_entity_state_required_features_condition(
DOMAIN,
AlarmControlPanelState.ARMED_NIGHT,
AlarmControlPanelEntityFeature.ARM_NIGHT,
),
"is_armed_vacation": make_entity_state_required_features_condition(
DOMAIN,
AlarmControlPanelState.ARMED_VACATION,
AlarmControlPanelEntityFeature.ARM_VACATION,
),
"is_disarmed": make_entity_state_condition(DOMAIN, AlarmControlPanelState.DISARMED),
"is_triggered": make_entity_state_condition(
DOMAIN, AlarmControlPanelState.TRIGGERED
),
}
async def async_get_conditions(hass: HomeAssistant) -> dict[str, type[Condition]]:
"""Return the alarm control panel conditions."""
return CONDITIONS

@@ -1,52 +0,0 @@
.condition_common: &condition_common
target:
entity:
domain: alarm_control_panel
fields: &condition_common_fields
behavior:
required: true
default: any
selector:
select:
translation_key: condition_behavior
options:
- all
- any
is_armed: *condition_common
is_armed_away:
fields: *condition_common_fields
target:
entity:
domain: alarm_control_panel
supported_features:
- alarm_control_panel.AlarmControlPanelEntityFeature.ARM_AWAY
is_armed_home:
fields: *condition_common_fields
target:
entity:
domain: alarm_control_panel
supported_features:
- alarm_control_panel.AlarmControlPanelEntityFeature.ARM_HOME
is_armed_night:
fields: *condition_common_fields
target:
entity:
domain: alarm_control_panel
supported_features:
- alarm_control_panel.AlarmControlPanelEntityFeature.ARM_NIGHT
is_armed_vacation:
fields: *condition_common_fields
target:
entity:
domain: alarm_control_panel
supported_features:
- alarm_control_panel.AlarmControlPanelEntityFeature.ARM_VACATION
is_disarmed: *condition_common
is_triggered: *condition_common

@@ -1,27 +1,4 @@
{
"conditions": {
"is_armed": {
"condition": "mdi:shield"
},
"is_armed_away": {
"condition": "mdi:shield-lock"
},
"is_armed_home": {
"condition": "mdi:shield-home"
},
"is_armed_night": {
"condition": "mdi:shield-moon"
},
"is_armed_vacation": {
"condition": "mdi:shield-airplane"
},
"is_disarmed": {
"condition": "mdi:shield-off"
},
"is_triggered": {
"condition": "mdi:bell-ring"
}
},
"entity_component": {
"_": {
"default": "mdi:shield",

@@ -1,82 +1,8 @@
{
"common": {
"condition_behavior_description": "How the state should match on the targeted alarms.",
"condition_behavior_name": "Behavior",
"trigger_behavior_description": "The behavior of the targeted alarms to trigger on.",
"trigger_behavior_name": "Behavior"
},
"conditions": {
"is_armed": {
"description": "Tests if one or more alarms are armed.",
"fields": {
"behavior": {
"description": "[%key:component::alarm_control_panel::common::condition_behavior_description%]",
"name": "[%key:component::alarm_control_panel::common::condition_behavior_name%]"
}
},
"name": "Alarm is armed"
},
"is_armed_away": {
"description": "Tests if one or more alarms are armed in away mode.",
"fields": {
"behavior": {
"description": "[%key:component::alarm_control_panel::common::condition_behavior_description%]",
"name": "[%key:component::alarm_control_panel::common::condition_behavior_name%]"
}
},
"name": "Alarm is armed away"
},
"is_armed_home": {
"description": "Tests if one or more alarms are armed in home mode.",
"fields": {
"behavior": {
"description": "[%key:component::alarm_control_panel::common::condition_behavior_description%]",
"name": "[%key:component::alarm_control_panel::common::condition_behavior_name%]"
}
},
"name": "Alarm is armed home"
},
"is_armed_night": {
"description": "Tests if one or more alarms are armed in night mode.",
"fields": {
"behavior": {
"description": "[%key:component::alarm_control_panel::common::condition_behavior_description%]",
"name": "[%key:component::alarm_control_panel::common::condition_behavior_name%]"
}
},
"name": "Alarm is armed night"
},
"is_armed_vacation": {
"description": "Tests if one or more alarms are armed in vacation mode.",
"fields": {
"behavior": {
"description": "[%key:component::alarm_control_panel::common::condition_behavior_description%]",
"name": "[%key:component::alarm_control_panel::common::condition_behavior_name%]"
}
},
"name": "Alarm is armed vacation"
},
"is_disarmed": {
"description": "Tests if one or more alarms are disarmed.",
"fields": {
"behavior": {
"description": "[%key:component::alarm_control_panel::common::condition_behavior_description%]",
"name": "[%key:component::alarm_control_panel::common::condition_behavior_name%]"
}
},
"name": "Alarm is disarmed"
},
"is_triggered": {
"description": "Tests if one or more alarms are triggered.",
"fields": {
"behavior": {
"description": "[%key:component::alarm_control_panel::common::condition_behavior_description%]",
"name": "[%key:component::alarm_control_panel::common::condition_behavior_name%]"
}
},
"name": "Alarm is triggered"
}
},
"device_automation": {
"action_type": {
"arm_away": "Arm {entity_name} away",
@@ -150,12 +76,6 @@
}
},
"selector": {
"condition_behavior": {
"options": {
"all": "All",
"any": "Any"
}
},
"trigger_behavior": {
"options": {
"any": "Any",

@@ -14,7 +14,7 @@ from .const import DOMAIN, AlarmControlPanelEntityFeature, AlarmControlPanelStat
def supports_feature(hass: HomeAssistant, entity_id: str, features: int) -> bool:
"""Test if an entity supports the specified features."""
"""Get the device class of an entity or UNDEFINED if not found."""
try:
return bool(get_supported_features(hass, entity_id) & features)
except HomeAssistantError:
@@ -39,7 +39,7 @@ class EntityStateTriggerRequiredFeatures(EntityTargetStateTriggerBase):
def make_entity_state_trigger_required_features(
domain: str, to_state: str, required_features: int
) -> type[EntityTargetStateTriggerBase]:
"""Create an entity state trigger class with required feature filtering."""
"""Create an entity state trigger class."""
class CustomTrigger(EntityStateTriggerRequiredFeatures):
"""Trigger for entity state changes."""

@@ -4,8 +4,6 @@ from __future__ import annotations
import logging
import dateutil
from homeassistant.components.automation import automations_with_entity
from homeassistant.components.script import scripts_with_entity
from homeassistant.components.sensor import (
@@ -181,7 +179,6 @@ SENSORS: dict[str, SensorEntityDescription] = {
LAST_S_TEST: SensorEntityDescription(
key=LAST_S_TEST,
translation_key="last_self_test",
device_class=SensorDeviceClass.TIMESTAMP,
),
"lastxfer": SensorEntityDescription(
key="lastxfer",
@@ -235,7 +232,6 @@ SENSORS: dict[str, SensorEntityDescription] = {
"masterupd": SensorEntityDescription(
key="masterupd",
translation_key="master_update",
device_class=SensorDeviceClass.TIMESTAMP,
entity_category=EntityCategory.DIAGNOSTIC,
),
"maxlinev": SensorEntityDescription(
@@ -369,7 +365,6 @@ SENSORS: dict[str, SensorEntityDescription] = {
"starttime": SensorEntityDescription(
key="starttime",
translation_key="startup_time",
device_class=SensorDeviceClass.TIMESTAMP,
entity_category=EntityCategory.DIAGNOSTIC,
),
"statflag": SensorEntityDescription(
@@ -421,19 +416,16 @@ SENSORS: dict[str, SensorEntityDescription] = {
"xoffbat": SensorEntityDescription(
key="xoffbat",
translation_key="transfer_from_battery",
device_class=SensorDeviceClass.TIMESTAMP,
entity_category=EntityCategory.DIAGNOSTIC,
),
"xoffbatt": SensorEntityDescription(
key="xoffbatt",
translation_key="transfer_from_battery",
device_class=SensorDeviceClass.TIMESTAMP,
entity_category=EntityCategory.DIAGNOSTIC,
),
"xonbatt": SensorEntityDescription(
key="xonbatt",
translation_key="transfer_to_battery",
device_class=SensorDeviceClass.TIMESTAMP,
entity_category=EntityCategory.DIAGNOSTIC,
),
}
@@ -537,13 +529,7 @@ class APCUPSdSensor(APCUPSdEntity, SensorEntity):
self._attr_native_value = None
return
data = self.coordinator.data[key]
if self.entity_description.device_class == SensorDeviceClass.TIMESTAMP:
self._attr_native_value = dateutil.parser.parse(data)
return
self._attr_native_value, inferred_unit = infer_unit(data)
self._attr_native_value, inferred_unit = infer_unit(self.coordinator.data[key])
if not self.native_unit_of_measurement:
self._attr_native_unit_of_measurement = inferred_unit

@@ -5,14 +5,9 @@ from __future__ import annotations
import asyncio
import logging
from random import randrange
import sys
from typing import Any, cast
from pyatv import connect, exceptions, scan
from pyatv.conf import AppleTV
from pyatv.const import DeviceModel, Protocol
from pyatv.convert import model_str
from pyatv.interface import AppleTV as AppleTVInterface, DeviceListener
from homeassistant.components import zeroconf
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
@@ -29,7 +24,11 @@ from homeassistant.const import (
Platform,
)
from homeassistant.core import Event, HomeAssistant, callback
from homeassistant.exceptions import ConfigEntryAuthFailed, ConfigEntryNotReady
from homeassistant.exceptions import (
ConfigEntryAuthFailed,
ConfigEntryNotReady,
HomeAssistantError,
)
from homeassistant.helpers import device_registry as dr
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.dispatcher import async_dispatcher_send
@@ -43,6 +42,18 @@ from .const import (
SIGNAL_DISCONNECTED,
)
if sys.version_info < (3, 14):
from pyatv import connect, exceptions, scan
from pyatv.conf import AppleTV
from pyatv.const import DeviceModel, Protocol
from pyatv.convert import model_str
from pyatv.interface import AppleTV as AppleTVInterface, DeviceListener
else:
class DeviceListener:
"""Dummy class."""
_LOGGER = logging.getLogger(__name__)
DEFAULT_NAME_TV = "Apple TV"
@@ -53,25 +64,30 @@ BACKOFF_TIME_UPPER_LIMIT = 300 # Five minutes
PLATFORMS = [Platform.MEDIA_PLAYER, Platform.REMOTE]
AUTH_EXCEPTIONS = (
exceptions.AuthenticationError,
exceptions.InvalidCredentialsError,
exceptions.NoCredentialsError,
)
CONNECTION_TIMEOUT_EXCEPTIONS = (
OSError,
asyncio.CancelledError,
TimeoutError,
exceptions.ConnectionLostError,
exceptions.ConnectionFailedError,
)
DEVICE_EXCEPTIONS = (
exceptions.ProtocolError,
exceptions.NoServiceError,
exceptions.PairingError,
exceptions.BackOffError,
exceptions.DeviceIdMissingError,
)
if sys.version_info < (3, 14):
AUTH_EXCEPTIONS = (
exceptions.AuthenticationError,
exceptions.InvalidCredentialsError,
exceptions.NoCredentialsError,
)
CONNECTION_TIMEOUT_EXCEPTIONS = (
OSError,
asyncio.CancelledError,
TimeoutError,
exceptions.ConnectionLostError,
exceptions.ConnectionFailedError,
)
DEVICE_EXCEPTIONS = (
exceptions.ProtocolError,
exceptions.NoServiceError,
exceptions.PairingError,
exceptions.BackOffError,
exceptions.DeviceIdMissingError,
)
else:
AUTH_EXCEPTIONS = ()
CONNECTION_TIMEOUT_EXCEPTIONS = ()
DEVICE_EXCEPTIONS = ()
type AppleTvConfigEntry = ConfigEntry[AppleTVManager]
@@ -79,6 +95,10 @@ type AppleTvConfigEntry = ConfigEntry[AppleTVManager]
async def async_setup_entry(hass: HomeAssistant, entry: AppleTvConfigEntry) -> bool:
"""Set up a config entry for Apple TV."""
if sys.version_info >= (3, 14):
raise HomeAssistantError(
"Apple TV is not supported on Python 3.14. Please use Python 3.13."
)
manager = AppleTVManager(hass, entry)
if manager.is_on:

@@ -8,7 +8,7 @@
"integration_type": "device",
"iot_class": "local_push",
"loggers": ["pyatv", "srptools"],
"requirements": ["pyatv==0.17.0"],
"requirements": ["pyatv==0.16.1;python_version<'3.14'"],
"zeroconf": [
"_mediaremotetv._tcp.local.",
"_companion-link._tcp.local.",

@@ -239,15 +239,6 @@ class AppleTvMediaPlayer(
"""
self.async_write_ha_state()
@callback
def volume_device_update(
self, output_device: OutputDevice, old_level: float, new_level: float
) -> None:
"""Output device volume was updated.
This is a callback function from pyatv.interface.AudioListener.
"""
@callback
def outputdevices_update(
self, old_devices: list[OutputDevice], new_devices: list[OutputDevice]

@@ -3,8 +3,9 @@
from abc import ABC, abstractmethod
from dataclasses import dataclass
import logging
import math
from pymicro_vad import MicroVad
from pysilero_vad import SileroVoiceActivityDetector
from pyspeex_noise import AudioProcessor
from .const import BYTES_PER_CHUNK
@@ -42,8 +43,8 @@ class AudioEnhancer(ABC):
"""Enhance chunk of PCM audio @ 16Khz with 16-bit mono samples."""
class MicroVadSpeexEnhancer(AudioEnhancer):
"""Audio enhancer that runs microVAD and speex."""
class SileroVadSpeexEnhancer(AudioEnhancer):
"""Audio enhancer that runs Silero VAD and speex."""
def __init__(
self, auto_gain: int, noise_suppression: int, is_vad_enabled: bool
@@ -69,21 +70,49 @@ class MicroVadSpeexEnhancer(AudioEnhancer):
self.noise_suppression,
)
self.vad: MicroVad | None = None
self.vad: SileroVoiceActivityDetector | None = None
# We get 10ms chunks but Silero works on 32ms chunks, so we have to
# buffer audio. The previous speech probability is used until enough
# audio has been buffered.
self._vad_buffer: bytearray | None = None
self._vad_buffer_chunks = 0
self._vad_buffer_chunk_idx = 0
self._last_speech_probability: float | None = None
if self.is_vad_enabled:
self.vad = MicroVad()
_LOGGER.debug("Initialized microVAD")
self.vad = SileroVoiceActivityDetector()
# VAD buffer is a multiple of 10ms, but Silero VAD needs 32ms.
self._vad_buffer_chunks = int(
math.ceil(self.vad.chunk_bytes() / BYTES_PER_CHUNK)
)
self._vad_leftover_bytes = self.vad.chunk_bytes() - BYTES_PER_CHUNK
self._vad_buffer = bytearray(self.vad.chunk_bytes())
_LOGGER.debug("Initialized Silero VAD")
def enhance_chunk(self, audio: bytes, timestamp_ms: int) -> EnhancedAudioChunk:
"""Enhance 10ms chunk of PCM audio @ 16Khz with 16-bit mono samples."""
speech_probability: float | None = None
assert len(audio) == BYTES_PER_CHUNK
if self.vad is not None:
# Run VAD
speech_probability = self.vad.Process10ms(audio)
assert self._vad_buffer is not None
start_idx = self._vad_buffer_chunk_idx * BYTES_PER_CHUNK
self._vad_buffer[start_idx : start_idx + BYTES_PER_CHUNK] = audio
self._vad_buffer_chunk_idx += 1
if self._vad_buffer_chunk_idx >= self._vad_buffer_chunks:
# We have enough data to run Silero VAD (32 ms)
self._last_speech_probability = self.vad.process_chunk(
self._vad_buffer[: self.vad.chunk_bytes()]
)
# Copy leftover audio that wasn't processed to start
self._vad_buffer[: self._vad_leftover_bytes] = self._vad_buffer[
-self._vad_leftover_bytes :
]
self._vad_buffer_chunk_idx = 0
if self.audio_processor is not None:
# Run noise suppression and auto gain
@@ -92,5 +121,5 @@ class MicroVadSpeexEnhancer(AudioEnhancer):
return EnhancedAudioChunk(
audio=audio,
timestamp_ms=timestamp_ms,
speech_probability=speech_probability,
speech_probability=self._last_speech_probability,
)

@@ -8,5 +8,5 @@
"integration_type": "system",
"iot_class": "local_push",
"quality_scale": "internal",
"requirements": ["pymicro-vad==1.0.1", "pyspeex-noise==1.0.2"]
"requirements": ["pysilero-vad==3.2.0", "pyspeex-noise==1.0.2"]
}

@@ -55,7 +55,7 @@ from homeassistant.util import (
from homeassistant.util.hass_dict import HassKey
from homeassistant.util.limited_size_dict import LimitedSizeDict
from .audio_enhancer import AudioEnhancer, EnhancedAudioChunk, MicroVadSpeexEnhancer
from .audio_enhancer import AudioEnhancer, EnhancedAudioChunk, SileroVadSpeexEnhancer
from .const import (
ACKNOWLEDGE_PATH,
BYTES_PER_CHUNK,
@@ -633,7 +633,7 @@ class PipelineRun:
# Initialize with audio settings
if self.audio_settings.needs_processor and (self.audio_enhancer is None):
# Default audio enhancer
self.audio_enhancer = MicroVadSpeexEnhancer(
self.audio_enhancer = SileroVadSpeexEnhancer(
self.audio_settings.auto_gain_dbfs,
self.audio_settings.noise_suppression_level,
self.audio_settings.is_vad_enabled,

@@ -1,23 +0,0 @@
"""Provides conditions for assist satellites."""
from homeassistant.core import HomeAssistant
from homeassistant.helpers.condition import Condition, make_entity_state_condition
from .const import DOMAIN
from .entity import AssistSatelliteState
CONDITIONS: dict[str, type[Condition]] = {
"is_idle": make_entity_state_condition(DOMAIN, AssistSatelliteState.IDLE),
"is_listening": make_entity_state_condition(DOMAIN, AssistSatelliteState.LISTENING),
"is_processing": make_entity_state_condition(
DOMAIN, AssistSatelliteState.PROCESSING
),
"is_responding": make_entity_state_condition(
DOMAIN, AssistSatelliteState.RESPONDING
),
}
async def async_get_conditions(hass: HomeAssistant) -> dict[str, type[Condition]]:
"""Return the assist satellite conditions."""
return CONDITIONS

@@ -1,19 +0,0 @@
.condition_common: &condition_common
target:
entity:
domain: assist_satellite
fields:
behavior:
required: true
default: any
selector:
select:
translation_key: condition_behavior
options:
- all
- any
is_idle: *condition_common
is_listening: *condition_common
is_processing: *condition_common
is_responding: *condition_common

@@ -1,18 +1,4 @@
{
"conditions": {
"is_idle": {
"condition": "mdi:chat-sleep"
},
"is_listening": {
"condition": "mdi:chat-question"
},
"is_processing": {
"condition": "mdi:chat-processing"
},
"is_responding": {
"condition": "mdi:chat-alert"
}
},
"entity_component": {
"_": {
"default": "mdi:account-voice"

@@ -1,52 +1,8 @@
{
"common": {
"condition_behavior_description": "How the state should match on the targeted Assist satellites.",
"condition_behavior_name": "Behavior",
"trigger_behavior_description": "The behavior of the targeted Assist satellites to trigger on.",
"trigger_behavior_name": "Behavior"
},
"conditions": {
"is_idle": {
"description": "Tests if one or more Assist satellites are idle.",
"fields": {
"behavior": {
"description": "[%key:component::assist_satellite::common::condition_behavior_description%]",
"name": "[%key:component::assist_satellite::common::condition_behavior_name%]"
}
},
"name": "Satellite is idle"
},
"is_listening": {
"description": "Tests if one or more Assist satellites are listening.",
"fields": {
"behavior": {
"description": "[%key:component::assist_satellite::common::condition_behavior_description%]",
"name": "[%key:component::assist_satellite::common::condition_behavior_name%]"
}
},
"name": "Satellite is listening"
},
"is_processing": {
"description": "Tests if one or more Assist satellites are processing.",
"fields": {
"behavior": {
"description": "[%key:component::assist_satellite::common::condition_behavior_description%]",
"name": "[%key:component::assist_satellite::common::condition_behavior_name%]"
}
},
"name": "Satellite is processing"
},
"is_responding": {
"description": "Tests if one or more Assist satellites are responding.",
"fields": {
"behavior": {
"description": "[%key:component::assist_satellite::common::condition_behavior_description%]",
"name": "[%key:component::assist_satellite::common::condition_behavior_name%]"
}
},
"name": "Satellite is responding"
}
},
"entity_component": {
"_": {
"name": "Assist satellite",
@@ -65,12 +21,6 @@
"sentences": "Sentences"
}
},
"condition_behavior": {
"options": {
"all": "All",
"any": "Any"
}
},
"trigger_behavior": {
"options": {
"any": "Any",

@@ -56,7 +56,7 @@ from homeassistant.core import (
valid_entity_id,
)
from homeassistant.exceptions import HomeAssistantError, ServiceNotFound, TemplateError
from homeassistant.helpers import condition as condition_helper, config_validation as cv
from homeassistant.helpers import condition, config_validation as cv
from homeassistant.helpers.entity import ToggleEntity
from homeassistant.helpers.entity_component import EntityComponent
from homeassistant.helpers.issue_registry import (
@@ -123,11 +123,7 @@ SERVICE_TRIGGER = "trigger"
NEW_TRIGGERS_CONDITIONS_FEATURE_FLAG = "new_triggers_conditions"
_EXPERIMENTAL_CONDITION_PLATFORMS = {
"alarm_control_panel",
"assist_satellite",
"fan",
"light",
"siren",
}
_EXPERIMENTAL_TRIGGER_PLATFORMS = {
@@ -554,7 +550,7 @@ class AutomationEntity(BaseAutomationEntity, RestoreEntity):
automation_id: str | None,
name: str,
trigger_config: list[ConfigType],
condition: IfAction | None,
cond_func: IfAction | None,
action_script: Script,
initial_state: bool | None,
variables: ScriptVariables | None,
@@ -567,7 +563,7 @@ class AutomationEntity(BaseAutomationEntity, RestoreEntity):
self._attr_name = name
self._trigger_config = trigger_config
self._async_detach_triggers: CALLBACK_TYPE | None = None
self._condition = condition
self._cond_func = cond_func
self.action_script = action_script
self.action_script.change_listener = self.async_write_ha_state
self._initial_state = initial_state
@@ -602,12 +598,6 @@ class AutomationEntity(BaseAutomationEntity, RestoreEntity):
"""Return a set of referenced labels."""
referenced = self.action_script.referenced_labels
if self._condition is not None:
for conf in self._condition.config:
referenced |= condition_helper.async_extract_targets(
conf, ATTR_LABEL_ID
)
for conf in self._trigger_config:
referenced |= set(_get_targets_from_trigger_config(conf, ATTR_LABEL_ID))
return referenced
@@ -617,12 +607,6 @@ class AutomationEntity(BaseAutomationEntity, RestoreEntity):
"""Return a set of referenced floors."""
referenced = self.action_script.referenced_floors
if self._condition is not None:
for conf in self._condition.config:
referenced |= condition_helper.async_extract_targets(
conf, ATTR_FLOOR_ID
)
for conf in self._trigger_config:
referenced |= set(_get_targets_from_trigger_config(conf, ATTR_FLOOR_ID))
return referenced
@@ -632,10 +616,6 @@ class AutomationEntity(BaseAutomationEntity, RestoreEntity):
"""Return a set of referenced areas."""
referenced = self.action_script.referenced_areas
if self._condition is not None:
for conf in self._condition.config:
referenced |= condition_helper.async_extract_targets(conf, ATTR_AREA_ID)
for conf in self._trigger_config:
referenced |= set(_get_targets_from_trigger_config(conf, ATTR_AREA_ID))
return referenced
@@ -652,9 +632,9 @@ class AutomationEntity(BaseAutomationEntity, RestoreEntity):
"""Return a set of referenced devices."""
referenced = self.action_script.referenced_devices
if self._condition is not None:
for conf in self._condition.config:
referenced |= condition_helper.async_extract_devices(conf)
if self._cond_func is not None:
for conf in self._cond_func.config:
referenced |= condition.async_extract_devices(conf)
for conf in self._trigger_config:
referenced |= set(_trigger_extract_devices(conf))
@@ -666,9 +646,9 @@ class AutomationEntity(BaseAutomationEntity, RestoreEntity):
"""Return a set of referenced entities."""
referenced = self.action_script.referenced_entities
if self._condition is not None:
for conf in self._condition.config:
referenced |= condition_helper.async_extract_entities(conf)
if self._cond_func is not None:
for conf in self._cond_func.config:
referenced |= condition.async_extract_entities(conf)
for conf in self._trigger_config:
for entity_id in _trigger_extract_entities(conf):
@@ -788,8 +768,8 @@ class AutomationEntity(BaseAutomationEntity, RestoreEntity):
if (
not skip_condition
and self._condition is not None
and not self._condition(variables)
and self._cond_func is not None
and not self._cond_func(variables)
):
self._logger.debug(
"Conditions not met, aborting automation. Condition summary: %s",
@@ -1051,12 +1031,12 @@ async def _create_automation_entities(
)
if CONF_CONDITIONS in config_block:
condition = await _async_process_if(hass, name, config_block)
cond_func = await _async_process_if(hass, name, config_block)
if condition is None:
if cond_func is None:
continue
else:
condition = None
cond_func = None
# Add trigger variables to variables
variables = None
@@ -1074,7 +1054,7 @@ async def _create_automation_entities(
automation_id,
name,
config_block[CONF_TRIGGERS],
condition,
cond_func,
action_script,
initial_state,
variables,
@@ -1216,7 +1196,7 @@ async def _async_process_if(
if_configs = config[CONF_CONDITIONS]
try:
if_action = await condition_helper.async_conditions_from_config(
if_action = await condition.async_conditions_from_config(
hass, if_configs, LOGGER, name
)
except HomeAssistantError as ex:

@@ -4,7 +4,6 @@ from __future__ import annotations
import json
import logging
from typing import Any
from azure.servicebus import ServiceBusMessage
from azure.servicebus.aio import ServiceBusClient, ServiceBusSender
@@ -93,7 +92,7 @@ class ServiceBusNotificationService(BaseNotificationService):
"""Initialize the service."""
self._client = client
async def async_send_message(self, message: str, **kwargs: Any) -> None:
async def async_send_message(self, message, **kwargs):
"""Send a message."""
dto = {ATTR_ASB_MESSAGE: message}

View File

@@ -85,9 +85,9 @@
}
},
"moving": {
"default": "mdi:octagon",
"default": "mdi:arrow-right",
"state": {
"on": "mdi:arrow-right"
"on": "mdi:octagon"
}
},
"occupancy": {

View File

@@ -1,7 +1,5 @@
"""BleBox sensor entities."""
from datetime import datetime
import blebox_uniapi.sensor
from homeassistant.components.sensor import (
@@ -148,7 +146,7 @@ class BleBoxSensorEntity(BleBoxEntity[blebox_uniapi.sensor.BaseSensor], SensorEn
return self._feature.native_value
@property
def last_reset(self) -> datetime | None:
def last_reset(self):
"""Return the time when the sensor was last reset, if implemented."""
native_implementation = getattr(self._feature, "last_reset", None)

View File

@@ -64,7 +64,6 @@ def _ws_with_blueprint_domain(
return with_domain_blueprints
@websocket_api.require_admin
@websocket_api.websocket_command(
{
vol.Required("type"): "blueprint/list",
@@ -98,7 +97,6 @@ async def ws_list_blueprints(
connection.send_result(msg["id"], results)
@websocket_api.require_admin
@websocket_api.websocket_command(
{
vol.Required("type"): "blueprint/import",
@@ -152,7 +150,6 @@ async def ws_import_blueprint(
)
@websocket_api.require_admin
@websocket_api.websocket_command(
{
vol.Required("type"): "blueprint/save",
@@ -209,7 +206,6 @@ async def ws_save_blueprint(
)
@websocket_api.require_admin
@websocket_api.websocket_command(
{
vol.Required("type"): "blueprint/delete",
@@ -237,7 +233,6 @@ async def ws_delete_blueprint(
)
@websocket_api.require_admin
@websocket_api.websocket_command(
{
vol.Required("type"): "blueprint/substitute",

View File

@@ -1,6 +1,5 @@
"""The BSB-Lan integration."""
import asyncio
import dataclasses
from bsblan import (
@@ -78,16 +77,12 @@ async def async_setup_entry(hass: HomeAssistant, entry: BSBLanConfigEntry) -> bo
bsblan = BSBLAN(config, session)
try:
# Initialize the client first - this sets up internal caches and validates
# the connection by fetching firmware version
# Initialize the client first - this sets up internal caches and validates the connection
await bsblan.initialize()
# Fetch device metadata in parallel for faster startup
device, info, static = await asyncio.gather(
bsblan.device(),
bsblan.info(),
bsblan.static_values(),
)
# Fetch all required device metadata
device = await bsblan.device()
info = await bsblan.info()
static = await bsblan.static_values()
except BSBLANConnectionError as err:
raise ConfigEntryNotReady(
translation_domain=DOMAIN,
@@ -115,10 +110,10 @@ async def async_setup_entry(hass: HomeAssistant, entry: BSBLanConfigEntry) -> bo
fast_coordinator = BSBLanFastCoordinator(hass, entry, bsblan)
slow_coordinator = BSBLanSlowCoordinator(hass, entry, bsblan)
# Perform first refresh of fast coordinator (required for entities)
# Perform first refresh of both coordinators
await fast_coordinator.async_config_entry_first_refresh()
# Refresh slow coordinator - don't fail if DHW is not available
# Try to refresh slow coordinator, but don't fail if DHW is not available
# This allows the integration to work even if the device doesn't support DHW
await slow_coordinator.async_refresh()
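
One side of this hunk awaits the three metadata calls one after another, the other batches them with `asyncio.gather`. A minimal, standalone sketch of the concurrent variant, with the real `bsblan` calls replaced by placeholder coroutines:

```python
import asyncio


async def fetch_device() -> str:
    # Placeholder for bsblan.device(); only simulates latency.
    await asyncio.sleep(0.2)
    return "device"


async def fetch_info() -> str:
    # Placeholder for bsblan.info().
    await asyncio.sleep(0.2)
    return "info"


async def fetch_static() -> str:
    # Placeholder for bsblan.static_values().
    await asyncio.sleep(0.2)
    return "static"


async def main() -> None:
    # The three requests are independent, so gather() runs them concurrently:
    # total wall time is roughly the slowest call, not the sum of all three.
    device, info, static = await asyncio.gather(
        fetch_device(), fetch_info(), fetch_static()
    )
    print(device, info, static)


asyncio.run(main())
```
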

View File

@@ -111,17 +111,11 @@ class BSBLANClimate(BSBLanEntity, ClimateEntity):
return None
return self.coordinator.data.state.target_temperature.value
@property
def _hvac_mode_value(self) -> int | str | None:
"""Return the raw hvac_mode value from the coordinator."""
if (hvac_mode := self.coordinator.data.state.hvac_mode) is None:
return None
return hvac_mode.value
@property
def hvac_mode(self) -> HVACMode | None:
"""Return hvac operation ie. heat, cool mode."""
if (hvac_mode_value := self._hvac_mode_value) is None:
hvac_mode_value = self.coordinator.data.state.hvac_mode.value
if hvac_mode_value is None:
return None
# BSB-Lan returns integer values: 0=off, 1=auto, 2=eco, 3=heat
if isinstance(hvac_mode_value, int):
@@ -131,8 +125,9 @@ class BSBLANClimate(BSBLanEntity, ClimateEntity):
@property
def preset_mode(self) -> str | None:
"""Return the current preset mode."""
hvac_mode_value = self.coordinator.data.state.hvac_mode.value
# BSB-Lan mode 2 is eco/reduced mode
if self._hvac_mode_value == 2:
if hvac_mode_value == 2:
return PRESET_ECO
return PRESET_NONE
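
The hunk's comment says BSB-Lan reports the operating mode as an integer (0=off, 1=auto, 2=eco, 3=heat) and the climate entity surfaces mode 2 as an eco preset. A rough sketch of that translation; the integer-to-mode table is an assumption read from the comment, and the enum here is a trimmed stand-in for Home Assistant's `HVACMode`:

```python
from enum import StrEnum


class HVACMode(StrEnum):
    # Trimmed stand-in for homeassistant.components.climate.HVACMode.
    OFF = "off"
    AUTO = "auto"
    HEAT = "heat"


PRESET_ECO = "eco"
PRESET_NONE = "none"

# Assumed from the inline comment: 0=off, 1=auto, 2=eco (reduced), 3=heat.
# Eco is exposed as a preset, so the mode itself maps to AUTO here.
_INT_TO_HVAC = {0: HVACMode.OFF, 1: HVACMode.AUTO, 2: HVACMode.AUTO, 3: HVACMode.HEAT}


def hvac_mode(raw: int | str | None) -> HVACMode | None:
    """Translate the raw BSB-Lan hvac_mode value."""
    if raw is None:
        return None
    if isinstance(raw, int):
        return _INT_TO_HVAC.get(raw)
    return HVACMode(raw)


def preset_mode(raw: int | None) -> str:
    """Mode 2 is the eco/reduced program; anything else has no preset."""
    return PRESET_ECO if raw == 2 else PRESET_NONE


print(hvac_mode(3), preset_mode(2))  # heat eco
```
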

View File

@@ -2,6 +2,7 @@
from dataclasses import dataclass
from datetime import timedelta
from random import randint
from bsblan import (
BSBLAN,
@@ -22,17 +23,6 @@ from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, Upda
from .const import DOMAIN, LOGGER, SCAN_INTERVAL_FAST, SCAN_INTERVAL_SLOW
# Filter lists for optimized API calls - only fetch parameters we actually use
# This significantly reduces response time (~0.2s per parameter saved)
STATE_INCLUDE = ["current_temperature", "target_temperature", "hvac_mode"]
SENSOR_INCLUDE = ["current_temperature", "outside_temperature"]
DHW_STATE_INCLUDE = [
"operating_mode",
"nominal_setpoint",
"dhw_actual_value_top_temperature",
]
DHW_CONFIG_INCLUDE = ["reduced_setpoint", "nominal_setpoint_max"]
@dataclass
class BSBLanFastData:
@@ -90,18 +80,26 @@ class BSBLanFastCoordinator(BSBLanCoordinator[BSBLanFastData]):
config_entry,
client,
name=f"{DOMAIN}_fast_{config_entry.data[CONF_HOST]}",
update_interval=SCAN_INTERVAL_FAST,
update_interval=self._get_update_interval(),
)
def _get_update_interval(self) -> timedelta:
"""Get the update interval with a random offset.
Add a random number of seconds to avoid timeouts when
the BSB-Lan device is already/still busy retrieving data,
e.g. for MQTT or internal logging.
"""
return SCAN_INTERVAL_FAST + timedelta(seconds=randint(1, 8))
async def _async_update_data(self) -> BSBLanFastData:
"""Fetch fast-changing data from the BSB-Lan device."""
try:
# Client is already initialized in async_setup_entry
# Use include filtering to only fetch parameters we actually use
# This reduces response time significantly (~0.2s per parameter)
state = await self.client.state(include=STATE_INCLUDE)
sensor = await self.client.sensor(include=SENSOR_INCLUDE)
dhw = await self.client.hot_water_state(include=DHW_STATE_INCLUDE)
# Fetch fast-changing data (state, sensor, DHW state)
state = await self.client.state()
sensor = await self.client.sensor()
dhw = await self.client.hot_water_state()
except BSBLANAuthError as err:
raise ConfigEntryAuthFailed(
@@ -113,6 +111,9 @@ class BSBLanFastCoordinator(BSBLanCoordinator[BSBLanFastData]):
f"Error while establishing connection with BSB-Lan device at {host}"
) from err
# Update the interval with random jitter for next update
self.update_interval = self._get_update_interval()
return BSBLanFastData(
state=state,
sensor=sensor,
@@ -142,8 +143,8 @@ class BSBLanSlowCoordinator(BSBLanCoordinator[BSBLanSlowData]):
"""Fetch slow-changing data from the BSB-Lan device."""
try:
# Client is already initialized in async_setup_entry
# Use include filtering to only fetch parameters we actually use
dhw_config = await self.client.hot_water_config(include=DHW_CONFIG_INCLUDE)
# Fetch slow-changing configuration data
dhw_config = await self.client.hot_water_config()
dhw_schedule = await self.client.hot_water_schedule()
except AttributeError:
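
The coordinator docstring explains the polling offset: a few random seconds are added so Home Assistant does not poll while the BSB-Lan device is busy with MQTT publishing or internal logging. A standalone sketch of that jitter pattern (the 30-second base interval is an assumed value, not the integration's actual `SCAN_INTERVAL_FAST`):

```python
from datetime import timedelta
from random import randint

SCAN_INTERVAL_FAST = timedelta(seconds=30)  # assumed base interval


def jittered_interval(base: timedelta = SCAN_INTERVAL_FAST) -> timedelta:
    """Add 1-8 random seconds so polls don't always land while the
    device is busy retrieving data for MQTT or internal logging."""
    return base + timedelta(seconds=randint(1, 8))


# Recomputing the offset after every refresh (as the hunk does by resetting
# update_interval) keeps polls from settling into a fixed collision pattern.
for _ in range(3):
    print(jittered_interval())
```
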

View File

@@ -29,11 +29,7 @@ class BSBLanEntityBase[_T: BSBLanCoordinator](CoordinatorEntity[_T]):
connections={(CONNECTION_NETWORK_MAC, format_mac(mac))},
name=data.device.name,
manufacturer="BSBLAN Inc.",
model=(
data.info.device_identification.value
if data.info.device_identification
else None
),
model=data.info.device_identification.value,
sw_version=data.device.version,
configuration_url=f"http://{host}",
)

View File

@@ -7,7 +7,7 @@
"integration_type": "device",
"iot_class": "local_polling",
"loggers": ["bsblan"],
"requirements": ["python-bsblan==4.1.0"],
"requirements": ["python-bsblan==3.1.6"],
"zeroconf": [
{
"name": "bsb-lan*",

View File

@@ -15,13 +15,5 @@
"get_events": {
"service": "mdi:calendar-month"
}
},
"triggers": {
"event_ended": {
"trigger": "mdi:calendar-end"
},
"event_started": {
"trigger": "mdi:calendar-start"
}
}
}

View File

@@ -45,14 +45,6 @@
"title": "Detected use of deprecated action calendar.list_events"
}
},
"selector": {
"trigger_offset_type": {
"options": {
"after": "After",
"before": "Before"
}
}
},
"services": {
"create_event": {
"description": "Adds a new calendar event.",
@@ -111,35 +103,5 @@
"name": "Get events"
}
},
"title": "Calendar",
"triggers": {
"event_ended": {
"description": "Triggers when a calendar event ends.",
"fields": {
"offset": {
"description": "Offset from the end of the event.",
"name": "Offset"
},
"offset_type": {
"description": "Whether to trigger before or after the end of the event, if an offset is defined.",
"name": "Offset type"
}
},
"name": "Calendar event ended"
},
"event_started": {
"description": "Triggers when a calendar event starts.",
"fields": {
"offset": {
"description": "Offset from the start of the event.",
"name": "Offset"
},
"offset_type": {
"description": "Whether to trigger before or after the start of the event, if an offset is defined.",
"name": "Offset type"
}
},
"name": "Calendar event started"
}
}
"title": "Calendar"
}

View File

@@ -2,7 +2,6 @@
from __future__ import annotations
import asyncio
from collections.abc import Awaitable, Callable
from dataclasses import dataclass
import datetime
@@ -11,15 +10,8 @@ from typing import TYPE_CHECKING, Any, cast
import voluptuous as vol
from homeassistant.const import (
ATTR_ENTITY_ID,
CONF_ENTITY_ID,
CONF_EVENT,
CONF_OFFSET,
CONF_OPTIONS,
CONF_TARGET,
)
from homeassistant.core import CALLBACK_TYPE, HomeAssistant, callback, split_entity_id
from homeassistant.const import CONF_ENTITY_ID, CONF_EVENT, CONF_OFFSET, CONF_OPTIONS
from homeassistant.core import CALLBACK_TYPE, HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.automation import move_top_level_schema_fields_to_options
@@ -28,13 +20,12 @@ from homeassistant.helpers.event import (
async_track_point_in_time,
async_track_time_interval,
)
from homeassistant.helpers.target import TargetEntityChangeTracker, TargetSelection
from homeassistant.helpers.trigger import Trigger, TriggerActionRunner, TriggerConfig
from homeassistant.helpers.typing import ConfigType
from homeassistant.util import dt as dt_util
from . import CalendarEntity, CalendarEvent
from .const import DATA_COMPONENT, DOMAIN
from .const import DATA_COMPONENT
_LOGGER = logging.getLogger(__name__)
@@ -42,35 +33,19 @@ EVENT_START = "start"
EVENT_END = "end"
UPDATE_INTERVAL = datetime.timedelta(minutes=15)
CONF_OFFSET_TYPE = "offset_type"
OFFSET_TYPE_BEFORE = "before"
OFFSET_TYPE_AFTER = "after"
_SINGLE_ENTITY_EVENT_OPTIONS_SCHEMA = {
_OPTIONS_SCHEMA_DICT = {
vol.Required(CONF_ENTITY_ID): cv.entity_id,
vol.Optional(CONF_EVENT, default=EVENT_START): vol.In({EVENT_START, EVENT_END}),
vol.Optional(CONF_OFFSET, default=datetime.timedelta(0)): cv.time_period,
}
_SINGLE_ENTITY_EVENT_TRIGGER_SCHEMA = vol.Schema(
_CONFIG_SCHEMA = vol.Schema(
{
vol.Required(CONF_OPTIONS): _SINGLE_ENTITY_EVENT_OPTIONS_SCHEMA,
vol.Required(CONF_OPTIONS): _OPTIONS_SCHEMA_DICT,
},
)
_EVENT_TRIGGER_SCHEMA = vol.Schema(
{
vol.Required(CONF_OPTIONS, default={}): {
vol.Required(CONF_OFFSET, default=datetime.timedelta(0)): cv.time_period,
vol.Required(CONF_OFFSET_TYPE, default=OFFSET_TYPE_BEFORE): vol.In(
{OFFSET_TYPE_BEFORE, OFFSET_TYPE_AFTER}
),
},
vol.Required(CONF_TARGET): cv.TARGET_FIELDS,
}
)
# mypy: disallow-any-generics
@@ -80,7 +55,6 @@ class QueuedCalendarEvent:
trigger_time: datetime.datetime
event: CalendarEvent
entity_id: str
@dataclass
@@ -120,7 +94,7 @@ class Timespan:
return f"[{self.start}, {self.end})"
type EventFetcher = Callable[[Timespan], Awaitable[list[tuple[str, CalendarEvent]]]]
type EventFetcher = Callable[[Timespan], Awaitable[list[CalendarEvent]]]
type QueuedEventFetcher = Callable[[Timespan], Awaitable[list[QueuedCalendarEvent]]]
@@ -136,24 +110,15 @@ def get_entity(hass: HomeAssistant, entity_id: str) -> CalendarEntity:
return entity
def event_fetcher(hass: HomeAssistant, entity_ids: set[str]) -> EventFetcher:
def event_fetcher(hass: HomeAssistant, entity_id: str) -> EventFetcher:
"""Build an async_get_events wrapper to fetch events during a time span."""
async def async_get_events(timespan: Timespan) -> list[tuple[str, CalendarEvent]]:
async def async_get_events(timespan: Timespan) -> list[CalendarEvent]:
"""Return events active in the specified time span."""
entity = get_entity(hass, entity_id)
# Expand by one second to make the end time exclusive
end_time = timespan.end + datetime.timedelta(seconds=1)
events: list[tuple[str, CalendarEvent]] = []
for entity_id in entity_ids:
entity = get_entity(hass, entity_id)
events.extend(
(entity_id, event)
for event in await entity.async_get_events(
hass, timespan.start, end_time
)
)
return events
return await entity.async_get_events(hass, timespan.start, end_time)
return async_get_events
@@ -177,11 +142,12 @@ def queued_event_fetcher(
# Example: For an EVENT_END trigger the event may start during this
# time span, but need to be triggered later when the end happens.
results = []
for entity_id, event in active_events:
trigger_time = get_trigger_time(event)
for trigger_time, event in zip(
map(get_trigger_time, active_events), active_events, strict=False
):
if trigger_time not in offset_timespan:
continue
results.append(QueuedCalendarEvent(trigger_time + offset, event, entity_id))
results.append(QueuedCalendarEvent(trigger_time + offset, event))
_LOGGER.debug(
"Scan events @ %s%s found %s eligible of %s active",
@@ -274,7 +240,6 @@ class CalendarEventListener:
_LOGGER.debug("Dispatching event: %s", queued_event.event)
payload = {
**self._trigger_payload,
ATTR_ENTITY_ID: queued_event.entity_id,
"calendar_event": queued_event.event.as_dict(),
}
self._action_runner(payload, "calendar event state change")
@@ -295,77 +260,8 @@ class CalendarEventListener:
self._listen_next_calendar_event()
class TargetCalendarEventListener(TargetEntityChangeTracker):
"""Helper class to listen to calendar events for target entity changes."""
def __init__(
self,
hass: HomeAssistant,
target_selection: TargetSelection,
event_type: str,
offset: datetime.timedelta,
run_action: TriggerActionRunner,
) -> None:
"""Initialize the state change tracker."""
def entity_filter(entities: set[str]) -> set[str]:
return {
entity_id
for entity_id in entities
if split_entity_id(entity_id)[0] == DOMAIN
}
super().__init__(hass, target_selection, entity_filter)
self._event_type = event_type
self._offset = offset
self._run_action = run_action
self._trigger_data = {
"event": event_type,
"offset": offset,
}
self._pending_listener_task: asyncio.Task[None] | None = None
self._calendar_event_listener: CalendarEventListener | None = None
@callback
def _handle_entities_update(self, tracked_entities: set[str]) -> None:
"""Restart the listeners when the list of entities of the tracked targets is updated."""
if self._pending_listener_task:
self._pending_listener_task.cancel()
self._pending_listener_task = self._hass.async_create_task(
self._start_listening(tracked_entities)
)
async def _start_listening(self, tracked_entities: set[str]) -> None:
"""Start listening for calendar events."""
_LOGGER.debug("Tracking events for calendars: %s", tracked_entities)
if self._calendar_event_listener:
self._calendar_event_listener.async_detach()
self._calendar_event_listener = CalendarEventListener(
self._hass,
self._run_action,
self._trigger_data,
queued_event_fetcher(
event_fetcher(self._hass, tracked_entities),
self._event_type,
self._offset,
),
)
await self._calendar_event_listener.async_attach()
def _unsubscribe(self) -> None:
"""Unsubscribe from all events."""
super()._unsubscribe()
if self._pending_listener_task:
self._pending_listener_task.cancel()
self._pending_listener_task = None
if self._calendar_event_listener:
self._calendar_event_listener.async_detach()
self._calendar_event_listener = None
class SingleEntityEventTrigger(Trigger):
"""Legacy single calendar entity event trigger."""
class EventTrigger(Trigger):
"""Calendar event trigger."""
_options: dict[str, Any]
@@ -375,7 +271,7 @@ class SingleEntityEventTrigger(Trigger):
) -> ConfigType:
"""Validate complete config."""
complete_config = move_top_level_schema_fields_to_options(
complete_config, _SINGLE_ENTITY_EVENT_OPTIONS_SCHEMA
complete_config, _OPTIONS_SCHEMA_DICT
)
return await super().async_validate_complete_config(hass, complete_config)
@@ -384,7 +280,7 @@ class SingleEntityEventTrigger(Trigger):
cls, hass: HomeAssistant, config: ConfigType
) -> ConfigType:
"""Validate config."""
return cast(ConfigType, _SINGLE_ENTITY_EVENT_TRIGGER_SCHEMA(config))
return cast(ConfigType, _CONFIG_SCHEMA(config))
def __init__(self, hass: HomeAssistant, config: TriggerConfig) -> None:
"""Initialize trigger."""
@@ -415,72 +311,15 @@ class SingleEntityEventTrigger(Trigger):
run_action,
trigger_data,
queued_event_fetcher(
event_fetcher(self._hass, {entity_id}), event_type, offset
event_fetcher(self._hass, entity_id), event_type, offset
),
)
await listener.async_attach()
return listener.async_detach
class EventTrigger(Trigger):
"""Calendar event trigger."""
_options: dict[str, Any]
_event_type: str
@classmethod
async def async_validate_config(
cls, hass: HomeAssistant, config: ConfigType
) -> ConfigType:
"""Validate config."""
return cast(ConfigType, _EVENT_TRIGGER_SCHEMA(config))
def __init__(self, hass: HomeAssistant, config: TriggerConfig) -> None:
"""Initialize trigger."""
super().__init__(hass, config)
if TYPE_CHECKING:
assert config.target is not None
assert config.options is not None
self._target = config.target
self._options = config.options
async def async_attach_runner(
self, run_action: TriggerActionRunner
) -> CALLBACK_TYPE:
"""Attach a trigger."""
offset = self._options[CONF_OFFSET]
offset_type = self._options[CONF_OFFSET_TYPE]
if offset_type == OFFSET_TYPE_BEFORE:
offset = -offset
target_selection = TargetSelection(self._target)
if not target_selection.has_any_target:
raise HomeAssistantError(f"No target defined in {self._target}")
listener = TargetCalendarEventListener(
self._hass, target_selection, self._event_type, offset, run_action
)
return listener.async_setup()
class EventStartedTrigger(EventTrigger):
"""Calendar event started trigger."""
_event_type = EVENT_START
class EventEndedTrigger(EventTrigger):
"""Calendar event ended trigger."""
_event_type = EVENT_END
TRIGGERS: dict[str, type[Trigger]] = {
"_": SingleEntityEventTrigger,
"event_started": EventStartedTrigger,
"event_ended": EventEndedTrigger,
"_": EventTrigger,
}
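
A central change in this file is that the event fetcher now takes a set of calendar entity IDs and returns `(entity_id, event)` pairs, so the trigger payload can report which calendar fired. A self-contained sketch of that shape, with a hard-coded dict standing in for `entity.async_get_events()`:

```python
import asyncio
from dataclasses import dataclass
from datetime import datetime


@dataclass
class CalendarEvent:
    summary: str
    start: datetime
    end: datetime


# Stand-in for the real calendar entities and their async_get_events().
FAKE_CALENDARS: dict[str, list[CalendarEvent]] = {
    "calendar.home": [
        CalendarEvent("Trash pickup", datetime(2026, 1, 13, 8), datetime(2026, 1, 13, 9))
    ],
    "calendar.work": [],
}


async def fetch_events(
    entity_ids: set[str], start: datetime, end: datetime
) -> list[tuple[str, CalendarEvent]]:
    """Return (entity_id, event) pairs for every tracked calendar whose
    events overlap the span, so downstream code knows the source entity."""
    results: list[tuple[str, CalendarEvent]] = []
    for entity_id in sorted(entity_ids):
        overlapping = [
            event
            for event in FAKE_CALENDARS[entity_id]
            if event.start < end and event.end > start
        ]
        results.extend((entity_id, event) for event in overlapping)
    return results


pairs = asyncio.run(
    fetch_events(
        {"calendar.home", "calendar.work"},
        datetime(2026, 1, 13),
        datetime(2026, 1, 14),
    )
)
print(pairs)
```
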

View File

@@ -1,27 +0,0 @@
.trigger_common: &trigger_common
target:
entity:
domain: calendar
fields:
offset:
required: true
default:
days: 0
hours: 0
minutes: 0
seconds: 0
selector:
duration:
enable_day: true
offset_type:
required: true
default: before
selector:
select:
translation_key: trigger_offset_type
options:
- before
- after
event_started: *trigger_common
event_ended: *trigger_common

View File

@@ -3,7 +3,6 @@
from __future__ import annotations
import logging
from typing import Any
import voluptuous as vol
from webexpythonsdk import ApiError, WebexAPI, exceptions
@@ -52,7 +51,7 @@ class CiscoWebexNotificationService(BaseNotificationService):
self.room = room
self.client = client
def send_message(self, message: str = "", **kwargs: Any) -> None:
def send_message(self, message="", **kwargs):
"""Send a message to a user."""
title = ""

View File

@@ -5,7 +5,6 @@ from __future__ import annotations
from http import HTTPStatus
import json
import logging
from typing import Any
import requests
import voluptuous as vol
@@ -82,7 +81,7 @@ class ClicksendNotificationService(BaseNotificationService):
self.language = config[CONF_LANGUAGE]
self.voice = config[CONF_VOICE]
def send_message(self, message: str = "", **kwargs: Any) -> None:
def send_message(self, message="", **kwargs):
"""Send a voice call to a user."""
data = {
"messages": [

View File

@@ -50,6 +50,7 @@ from . import (
from .client import CloudClient
from .const import (
CONF_ACCOUNT_LINK_SERVER,
CONF_ACCOUNTS_SERVER,
CONF_ACME_SERVER,
CONF_ALEXA,
CONF_ALIASES,
@@ -137,6 +138,7 @@ _BASE_CONFIG_SCHEMA = vol.Schema(
vol.Optional(CONF_ALEXA): ALEXA_SCHEMA,
vol.Optional(CONF_GOOGLE_ACTIONS): GACTIONS_SCHEMA,
vol.Optional(CONF_ACCOUNT_LINK_SERVER): str,
vol.Optional(CONF_ACCOUNTS_SERVER): str,
vol.Optional(CONF_ACME_SERVER): str,
vol.Optional(CONF_API_SERVER): str,
vol.Optional(CONF_RELAYER_SERVER): str,

View File

@@ -76,6 +76,7 @@ CONF_GOOGLE_ACTIONS = "google_actions"
CONF_USER_POOL_ID = "user_pool_id"
CONF_ACCOUNT_LINK_SERVER = "account_link_server"
CONF_ACCOUNTS_SERVER = "accounts_server"
CONF_ACME_SERVER = "acme_server"
CONF_API_SERVER = "api_server"
CONF_DISCOVERY_SERVICE_ACTIONS = "discovery_service_actions"

View File

@@ -13,6 +13,6 @@
"integration_type": "system",
"iot_class": "cloud_push",
"loggers": ["acme", "hass_nabucasa", "snitun"],
"requirements": ["hass-nabucasa==1.11.0"],
"requirements": ["hass-nabucasa==1.7.0"],
"single_config_entry": true
}

View File

@@ -8,5 +8,5 @@
"iot_class": "cloud_polling",
"loggers": ["compit"],
"quality_scale": "bronze",
"requirements": ["compit-inext-api==0.4.2"]
"requirements": ["compit-inext-api==0.3.4"]
}

View File

@@ -49,11 +49,11 @@ def setup_platform(
discovery_info: DiscoveryInfoType | None = None,
) -> None:
"""Set up the Concord232 alarm control panel platform."""
name: str = config[CONF_NAME]
code: str | None = config.get(CONF_CODE)
mode: str = config[CONF_MODE]
host: str = config[CONF_HOST]
port: int = config[CONF_PORT]
name = config[CONF_NAME]
code = config.get(CONF_CODE)
mode = config[CONF_MODE]
host = config[CONF_HOST]
port = config[CONF_PORT]
url = f"http://{host}:{port}"
@@ -72,7 +72,7 @@ class Concord232Alarm(AlarmControlPanelEntity):
| AlarmControlPanelEntityFeature.ARM_AWAY
)
def __init__(self, url: str, name: str, code: str | None, mode: str) -> None:
def __init__(self, url, name, code, mode):
"""Initialize the Concord232 alarm panel."""
self._attr_name = name
@@ -125,7 +125,7 @@ class Concord232Alarm(AlarmControlPanelEntity):
return
self._alarm.arm("away")
def _validate_code(self, code: str | None, state: AlarmControlPanelState) -> bool:
def _validate_code(self, code, state):
"""Validate given code."""
if self._code is None:
return True

View File

@@ -4,7 +4,6 @@ from __future__ import annotations
import datetime
import logging
from typing import Any
from concord232 import client as concord232_client
import requests
@@ -30,7 +29,8 @@ CONF_ZONE_TYPES = "zone_types"
DEFAULT_HOST = "localhost"
DEFAULT_NAME = "Alarm"
DEFAULT_PORT = 5007
DEFAULT_PORT = "5007"
DEFAULT_SSL = False
SCAN_INTERVAL = datetime.timedelta(seconds=10)
@@ -56,10 +56,10 @@ def setup_platform(
) -> None:
"""Set up the Concord232 binary sensor platform."""
host: str = config[CONF_HOST]
port: int = config[CONF_PORT]
exclude: list[int] = config[CONF_EXCLUDE_ZONES]
zone_types: dict[int, BinarySensorDeviceClass] = config[CONF_ZONE_TYPES]
host = config[CONF_HOST]
port = config[CONF_PORT]
exclude = config[CONF_EXCLUDE_ZONES]
zone_types = config[CONF_ZONE_TYPES]
sensors = []
try:
@@ -84,6 +84,7 @@ def setup_platform(
if zone["number"] not in exclude:
sensors.append(
Concord232ZoneSensor(
hass,
client,
zone,
zone_types.get(zone["number"], get_opening_type(zone)),
@@ -109,25 +110,26 @@ def get_opening_type(zone):
class Concord232ZoneSensor(BinarySensorEntity):
"""Representation of a Concord232 zone as a sensor."""
def __init__(
self,
client: concord232_client.Client,
zone: dict[str, Any],
zone_type: BinarySensorDeviceClass,
) -> None:
def __init__(self, hass, client, zone, zone_type):
"""Initialize the Concord232 binary sensor."""
self._hass = hass
self._client = client
self._zone = zone
self._number = zone["number"]
self._attr_device_class = zone_type
self._zone_type = zone_type
@property
def name(self) -> str:
def device_class(self):
"""Return the class of this sensor, from DEVICE_CLASSES."""
return self._zone_type
@property
def name(self):
"""Return the name of the binary sensor."""
return self._zone["name"]
@property
def is_on(self) -> bool:
def is_on(self):
"""Return true if the binary sensor is on."""
# True means "faulted" or "open" or "abnormal state"
return bool(self._zone["state"] != "Normal")
@@ -143,5 +145,5 @@ class Concord232ZoneSensor(BinarySensorEntity):
if hasattr(self._client, "zones"):
self._zone = next(
x for x in self._client.zones if x["number"] == self._number
(x for x in self._client.zones if x["number"] == self._number), None
)
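
These two hunk lines differ only in whether `next()` is given a default: without one it raises `StopIteration` when no zone matches, with `None` as the second argument the lookup degrades to a `None` result. A quick illustration of the difference:

```python
zones = [{"number": 1, "state": "Normal"}, {"number": 2, "state": "Tripped"}]

# With a default, a missing zone yields None...
print(next((z for z in zones if z["number"] == 3), None))

# ...without one, the same lookup raises StopIteration.
try:
    next(z for z in zones if z["number"] == 3)
except StopIteration:
    print("no matching zone")
```
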

View File

@@ -11,11 +11,13 @@ from homeassistant import config_entries
from homeassistant.components import websocket_api
from homeassistant.components.websocket_api import ERR_NOT_FOUND, require_admin
from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import (
config_validation as cv,
device_registry as dr,
entity_registry as er,
)
from homeassistant.helpers.entity_component import async_get_entity_suggested_object_id
from homeassistant.helpers.json import json_dumps
_LOGGER = logging.getLogger(__name__)
@@ -349,12 +351,26 @@ def websocket_get_automatic_entity_ids(
if not (entry := registry.entities.get(entity_id)):
automatic_entity_ids[entity_id] = None
continue
new_entity_id = registry.async_regenerate_entity_id(
entry,
try:
suggested = async_get_entity_suggested_object_id(hass, entity_id)
except HomeAssistantError as err:
# This is raised if the entity has no object.
_LOGGER.debug(
"Unable to get suggested object ID for %s, entity ID: %s (%s)",
entry.entity_id,
entity_id,
err,
)
automatic_entity_ids[entity_id] = None
continue
suggested_entity_id = registry.async_generate_entity_id(
entry.domain,
suggested or f"{entry.platform}_{entry.unique_id}",
current_entity_id=entity_id,
reserved_entity_ids=reserved_entity_ids,
)
automatic_entity_ids[entity_id] = new_entity_id
reserved_entity_ids.add(new_entity_id)
automatic_entity_ids[entity_id] = suggested_entity_id
reserved_entity_ids.add(suggested_entity_id)
connection.send_message(
websocket_api.result_message(msg["id"], automatic_entity_ids)
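
Both versions of this handler keep a `reserved_entity_ids` set so that two entities renamed in the same batch cannot be offered the same ID. A simplified, hypothetical stand-in for the registry's ID generation that shows why the set matters (the real `async_generate_entity_id` also slugifies and enforces length limits):

```python
def generate_entity_id(suggested: str, existing: set[str], reserved: set[str]) -> str:
    """Return the suggested ID, or append _2, _3, ... until it is free."""
    candidate = suggested
    suffix = 2
    while candidate in existing or candidate in reserved:
        candidate = f"{suggested}_{suffix}"
        suffix += 1
    return candidate


existing_ids = {"sensor.kitchen_temperature"}
reserved: set[str] = set()

for _ in range(2):  # two entities that both suggest the same object id
    new_id = generate_entity_id("sensor.kitchen_temperature", existing_ids, reserved)
    reserved.add(new_id)
    print(new_id)
# sensor.kitchen_temperature_2, then sensor.kitchen_temperature_3
```
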

View File

@@ -7,5 +7,6 @@
"integration_type": "service",
"iot_class": "local_push",
"loggers": ["datadog"],
"quality_scale": "legacy",
"requirements": ["datadog==0.52.0"]
}

View File

@@ -10,7 +10,7 @@ LOGGER = logging.getLogger(__package__)
DOMAIN = "deconz"
HASSIO_CONFIGURATION_URL = "homeassistant://app/core_deconz"
HASSIO_CONFIGURATION_URL = "homeassistant://hassio/ingress/core_deconz"
CONF_BRIDGE_ID = "bridgeid"
CONF_GROUP_ID_BASE = "group_id_base"

View File

@@ -1,7 +1,6 @@
"""Support for Digital Ocean."""
from __future__ import annotations
from datetime import timedelta
import logging
import digitalocean
@@ -13,12 +12,27 @@ from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.typing import ConfigType
from homeassistant.util import Throttle
from .const import DATA_DIGITAL_OCEAN, DOMAIN, MIN_TIME_BETWEEN_UPDATES
_LOGGER = logging.getLogger(__name__)
ATTR_CREATED_AT = "created_at"
ATTR_DROPLET_ID = "droplet_id"
ATTR_DROPLET_NAME = "droplet_name"
ATTR_FEATURES = "features"
ATTR_IPV4_ADDRESS = "ipv4_address"
ATTR_IPV6_ADDRESS = "ipv6_address"
ATTR_MEMORY = "memory"
ATTR_REGION = "region"
ATTR_VCPUS = "vcpus"
ATTRIBUTION = "Data provided by Digital Ocean"
CONF_DROPLETS = "droplets"
DATA_DIGITAL_OCEAN = "data_do"
DIGITAL_OCEAN_PLATFORMS = [Platform.SWITCH, Platform.BINARY_SENSOR]
DOMAIN = "digital_ocean"
MIN_TIME_BETWEEN_UPDATES = timedelta(seconds=60)
CONFIG_SCHEMA = vol.Schema(
{DOMAIN: vol.Schema({vol.Required(CONF_ACCESS_TOKEN): cv.string})},

View File

@@ -3,7 +3,6 @@
from __future__ import annotations
import logging
from typing import Any
import voluptuous as vol
@@ -17,7 +16,7 @@ from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from .const import (
from . import (
ATTR_CREATED_AT,
ATTR_DROPLET_ID,
ATTR_DROPLET_NAME,
@@ -66,7 +65,6 @@ class DigitalOceanBinarySensor(BinarySensorEntity):
"""Representation of a Digital Ocean droplet sensor."""
_attr_attribution = ATTRIBUTION
_attr_device_class = BinarySensorDeviceClass.MOVING
def __init__(self, do, droplet_id):
"""Initialize a new Digital Ocean sensor."""
@@ -81,12 +79,17 @@ class DigitalOceanBinarySensor(BinarySensorEntity):
return self.data.name
@property
def is_on(self) -> bool:
def is_on(self):
"""Return true if the binary sensor is on."""
return self.data.status == "active"
@property
def extra_state_attributes(self) -> dict[str, Any]:
def device_class(self):
"""Return the class of this sensor."""
return BinarySensorDeviceClass.MOVING
@property
def extra_state_attributes(self):
"""Return the state attributes of the Digital Ocean droplet."""
return {
ATTR_CREATED_AT: self.data.created_at,

View File

@@ -1,30 +0,0 @@
"""Support for Digital Ocean."""
from __future__ import annotations
from datetime import timedelta
from typing import TYPE_CHECKING
from homeassistant.util.hass_dict import HassKey
if TYPE_CHECKING:
from . import DigitalOcean
ATTR_CREATED_AT = "created_at"
ATTR_DROPLET_ID = "droplet_id"
ATTR_DROPLET_NAME = "droplet_name"
ATTR_FEATURES = "features"
ATTR_IPV4_ADDRESS = "ipv4_address"
ATTR_IPV6_ADDRESS = "ipv6_address"
ATTR_MEMORY = "memory"
ATTR_REGION = "region"
ATTR_VCPUS = "vcpus"
ATTRIBUTION = "Data provided by Digital Ocean"
CONF_DROPLETS = "droplets"
DOMAIN = "digital_ocean"
DATA_DIGITAL_OCEAN: HassKey[DigitalOcean] = HassKey(DOMAIN)
MIN_TIME_BETWEEN_UPDATES = timedelta(seconds=60)
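
The deleted constants module typed its `hass.data` key with `HassKey`, which at runtime is just a string but lets type checkers know what the stored object is. A rough stand-in sketch of the idea (the real class lives in `homeassistant.util.hass_dict` and is paired with a dict type that overloads `__getitem__`):

```python
from typing import Generic, TypeVar

_T = TypeVar("_T")


class HassKey(str, Generic[_T]):
    """Simplified stand-in: a plain string that carries a value type."""

    __slots__ = ()


class DigitalOcean:
    """Placeholder for the integration's API wrapper object."""


DOMAIN = "digital_ocean"
DATA_DIGITAL_OCEAN: HassKey[DigitalOcean] = HassKey(DOMAIN)

data: dict[str, object] = {}
data[DATA_DIGITAL_OCEAN] = DigitalOcean()

# With the real HassDict, a type checker infers DigitalOcean for this lookup;
# here the typed key simply documents what the entry is expected to hold.
print(type(data[DATA_DIGITAL_OCEAN]).__name__)
```
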

View File

@@ -16,7 +16,7 @@ from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from .const import (
from . import (
ATTR_CREATED_AT,
ATTR_DROPLET_ID,
ATTR_DROPLET_NAME,
@@ -80,12 +80,12 @@ class DigitalOceanSwitch(SwitchEntity):
return self.data.name
@property
def is_on(self) -> bool:
def is_on(self):
"""Return true if switch is on."""
return self.data.status == "active"
@property
def extra_state_attributes(self) -> dict[str, Any]:
def extra_state_attributes(self):
"""Return the state attributes of the Digital Ocean droplet."""
return {
ATTR_CREATED_AT: self.data.created_at,

View File

@@ -3,7 +3,6 @@
from __future__ import annotations
import logging
from typing import Any
from homeassistant.components.notify import ATTR_TARGET, BaseNotificationService
from homeassistant.core import HomeAssistant
@@ -30,7 +29,7 @@ class DovadoSMSNotificationService(BaseNotificationService):
"""Initialize the service."""
self._client = client
def send_message(self, message: str, **kwargs: Any) -> None:
def send_message(self, message, **kwargs):
"""Send SMS to the specified target phone number."""
if not (target := kwargs.get(ATTR_TARGET)):
_LOGGER.error("One target is required")

View File

@@ -1,7 +1,6 @@
"""Support for Ebusd daemon for communication with eBUS heating systems."""
import logging
from typing import Any
import ebusdpy
import voluptuous as vol
@@ -18,7 +17,7 @@ from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.discovery import load_platform
from homeassistant.helpers.typing import ConfigType
from .const import DOMAIN, EBUSD_DATA, SENSOR_TYPES
from .const import DOMAIN, SENSOR_TYPES
_LOGGER = logging.getLogger(__name__)
@@ -29,9 +28,9 @@ CACHE_TTL = 900
SERVICE_EBUSD_WRITE = "ebusd_write"
def verify_ebusd_config(config: ConfigType) -> ConfigType:
def verify_ebusd_config(config):
"""Verify eBusd config."""
circuit: str = config[CONF_CIRCUIT]
circuit = config[CONF_CIRCUIT]
for condition in config[CONF_MONITORED_CONDITIONS]:
if condition not in SENSOR_TYPES[circuit]:
raise vol.Invalid(f"Condition '{condition}' not in '{circuit}'.")
@@ -60,17 +59,17 @@ CONFIG_SCHEMA = vol.Schema(
def setup(hass: HomeAssistant, config: ConfigType) -> bool:
"""Set up the eBusd component."""
_LOGGER.debug("Integration setup started")
conf: ConfigType = config[DOMAIN]
name: str = conf[CONF_NAME]
circuit: str = conf[CONF_CIRCUIT]
monitored_conditions: list[str] = conf[CONF_MONITORED_CONDITIONS]
server_address: tuple[str, int] = (conf[CONF_HOST], conf[CONF_PORT])
conf = config[DOMAIN]
name = conf[CONF_NAME]
circuit = conf[CONF_CIRCUIT]
monitored_conditions = conf.get(CONF_MONITORED_CONDITIONS)
server_address = (conf.get(CONF_HOST), conf.get(CONF_PORT))
try:
ebusdpy.init(server_address)
except (TimeoutError, OSError):
return False
hass.data[EBUSD_DATA] = EbusdData(server_address, circuit)
hass.data[DOMAIN] = EbusdData(server_address, circuit)
sensor_config = {
CONF_MONITORED_CONDITIONS: monitored_conditions,
"client_name": name,
@@ -78,7 +77,7 @@ def setup(hass: HomeAssistant, config: ConfigType) -> bool:
}
load_platform(hass, Platform.SENSOR, DOMAIN, sensor_config, config)
hass.services.register(DOMAIN, SERVICE_EBUSD_WRITE, hass.data[EBUSD_DATA].write)
hass.services.register(DOMAIN, SERVICE_EBUSD_WRITE, hass.data[DOMAIN].write)
_LOGGER.debug("Ebusd integration setup completed")
return True
@@ -87,13 +86,13 @@ def setup(hass: HomeAssistant, config: ConfigType) -> bool:
class EbusdData:
"""Get the latest data from Ebusd."""
def __init__(self, address: tuple[str, int], circuit: str) -> None:
def __init__(self, address, circuit):
"""Initialize the data object."""
self._circuit = circuit
self._address = address
self.value: dict[str, Any] = {}
self.value = {}
def update(self, name: str, stype: int) -> None:
def update(self, name, stype):
"""Call the Ebusd API to update the data."""
try:
_LOGGER.debug("Opening socket to ebusd %s", name)

View File

@@ -1,9 +1,5 @@
"""Constants for ebus component."""
from __future__ import annotations
from typing import TYPE_CHECKING
from homeassistant.components.sensor import SensorDeviceClass
from homeassistant.const import (
PERCENTAGE,
@@ -12,283 +8,277 @@ from homeassistant.const import (
UnitOfTemperature,
UnitOfTime,
)
from homeassistant.util.hass_dict import HassKey
if TYPE_CHECKING:
from . import EbusdData
DOMAIN = "ebusd"
EBUSD_DATA: HassKey[EbusdData] = HassKey(DOMAIN)
# SensorTypes from ebusdpy module :
# 0='decimal', 1='time-schedule', 2='switch', 3='string', 4='value;status'
type SensorSpecs = tuple[str, str | None, str | None, int, SensorDeviceClass | None]
SENSOR_TYPES: dict[str, dict[str, SensorSpecs]] = {
SENSOR_TYPES = {
"700": {
"ActualFlowTemperatureDesired": (
"ActualFlowTemperatureDesired": [
"Hc1ActualFlowTempDesired",
UnitOfTemperature.CELSIUS,
None,
0,
SensorDeviceClass.TEMPERATURE,
),
"MaxFlowTemperatureDesired": (
],
"MaxFlowTemperatureDesired": [
"Hc1MaxFlowTempDesired",
UnitOfTemperature.CELSIUS,
None,
0,
SensorDeviceClass.TEMPERATURE,
),
"MinFlowTemperatureDesired": (
],
"MinFlowTemperatureDesired": [
"Hc1MinFlowTempDesired",
UnitOfTemperature.CELSIUS,
None,
0,
SensorDeviceClass.TEMPERATURE,
),
"PumpStatus": ("Hc1PumpStatus", None, "mdi:toggle-switch", 2, None),
"HCSummerTemperatureLimit": (
],
"PumpStatus": ["Hc1PumpStatus", None, "mdi:toggle-switch", 2, None],
"HCSummerTemperatureLimit": [
"Hc1SummerTempLimit",
UnitOfTemperature.CELSIUS,
"mdi:weather-sunny",
0,
SensorDeviceClass.TEMPERATURE,
),
"HolidayTemperature": (
],
"HolidayTemperature": [
"HolidayTemp",
UnitOfTemperature.CELSIUS,
None,
0,
SensorDeviceClass.TEMPERATURE,
),
"HWTemperatureDesired": (
],
"HWTemperatureDesired": [
"HwcTempDesired",
UnitOfTemperature.CELSIUS,
None,
0,
SensorDeviceClass.TEMPERATURE,
),
"HWActualTemperature": (
],
"HWActualTemperature": [
"HwcStorageTemp",
UnitOfTemperature.CELSIUS,
None,
0,
SensorDeviceClass.TEMPERATURE,
),
"HWTimerMonday": ("hwcTimer.Monday", None, "mdi:timer-outline", 1, None),
"HWTimerTuesday": ("hwcTimer.Tuesday", None, "mdi:timer-outline", 1, None),
"HWTimerWednesday": ("hwcTimer.Wednesday", None, "mdi:timer-outline", 1, None),
"HWTimerThursday": ("hwcTimer.Thursday", None, "mdi:timer-outline", 1, None),
"HWTimerFriday": ("hwcTimer.Friday", None, "mdi:timer-outline", 1, None),
"HWTimerSaturday": ("hwcTimer.Saturday", None, "mdi:timer-outline", 1, None),
"HWTimerSunday": ("hwcTimer.Sunday", None, "mdi:timer-outline", 1, None),
"HWOperativeMode": ("HwcOpMode", None, "mdi:math-compass", 3, None),
"WaterPressure": (
],
"HWTimerMonday": ["hwcTimer.Monday", None, "mdi:timer-outline", 1, None],
"HWTimerTuesday": ["hwcTimer.Tuesday", None, "mdi:timer-outline", 1, None],
"HWTimerWednesday": ["hwcTimer.Wednesday", None, "mdi:timer-outline", 1, None],
"HWTimerThursday": ["hwcTimer.Thursday", None, "mdi:timer-outline", 1, None],
"HWTimerFriday": ["hwcTimer.Friday", None, "mdi:timer-outline", 1, None],
"HWTimerSaturday": ["hwcTimer.Saturday", None, "mdi:timer-outline", 1, None],
"HWTimerSunday": ["hwcTimer.Sunday", None, "mdi:timer-outline", 1, None],
"HWOperativeMode": ["HwcOpMode", None, "mdi:math-compass", 3, None],
"WaterPressure": [
"WaterPressure",
UnitOfPressure.BAR,
"mdi:water-pump",
0,
SensorDeviceClass.PRESSURE,
),
"Zone1RoomZoneMapping": ("z1RoomZoneMapping", None, "mdi:label", 0, None),
"Zone1NightTemperature": (
],
"Zone1RoomZoneMapping": ["z1RoomZoneMapping", None, "mdi:label", 0, None],
"Zone1NightTemperature": [
"z1NightTemp",
UnitOfTemperature.CELSIUS,
"mdi:weather-night",
0,
SensorDeviceClass.TEMPERATURE,
),
"Zone1DayTemperature": (
],
"Zone1DayTemperature": [
"z1DayTemp",
UnitOfTemperature.CELSIUS,
"mdi:weather-sunny",
0,
SensorDeviceClass.TEMPERATURE,
),
"Zone1HolidayTemperature": (
],
"Zone1HolidayTemperature": [
"z1HolidayTemp",
UnitOfTemperature.CELSIUS,
None,
0,
SensorDeviceClass.TEMPERATURE,
),
"Zone1RoomTemperature": (
],
"Zone1RoomTemperature": [
"z1RoomTemp",
UnitOfTemperature.CELSIUS,
None,
0,
SensorDeviceClass.TEMPERATURE,
),
"Zone1ActualRoomTemperatureDesired": (
],
"Zone1ActualRoomTemperatureDesired": [
"z1ActualRoomTempDesired",
UnitOfTemperature.CELSIUS,
None,
0,
SensorDeviceClass.TEMPERATURE,
),
"Zone1TimerMonday": ("z1Timer.Monday", None, "mdi:timer-outline", 1, None),
"Zone1TimerTuesday": ("z1Timer.Tuesday", None, "mdi:timer-outline", 1, None),
"Zone1TimerWednesday": (
],
"Zone1TimerMonday": ["z1Timer.Monday", None, "mdi:timer-outline", 1, None],
"Zone1TimerTuesday": ["z1Timer.Tuesday", None, "mdi:timer-outline", 1, None],
"Zone1TimerWednesday": [
"z1Timer.Wednesday",
None,
"mdi:timer-outline",
1,
None,
),
"Zone1TimerThursday": ("z1Timer.Thursday", None, "mdi:timer-outline", 1, None),
"Zone1TimerFriday": ("z1Timer.Friday", None, "mdi:timer-outline", 1, None),
"Zone1TimerSaturday": ("z1Timer.Saturday", None, "mdi:timer-outline", 1, None),
"Zone1TimerSunday": ("z1Timer.Sunday", None, "mdi:timer-outline", 1, None),
"Zone1OperativeMode": ("z1OpMode", None, "mdi:math-compass", 3, None),
"ContinuosHeating": (
],
"Zone1TimerThursday": ["z1Timer.Thursday", None, "mdi:timer-outline", 1, None],
"Zone1TimerFriday": ["z1Timer.Friday", None, "mdi:timer-outline", 1, None],
"Zone1TimerSaturday": ["z1Timer.Saturday", None, "mdi:timer-outline", 1, None],
"Zone1TimerSunday": ["z1Timer.Sunday", None, "mdi:timer-outline", 1, None],
"Zone1OperativeMode": ["z1OpMode", None, "mdi:math-compass", 3, None],
"ContinuosHeating": [
"ContinuosHeating",
UnitOfTemperature.CELSIUS,
"mdi:weather-snowy",
0,
SensorDeviceClass.TEMPERATURE,
),
"PowerEnergyConsumptionLastMonth": (
],
"PowerEnergyConsumptionLastMonth": [
"PrEnergySumHcLastMonth",
UnitOfEnergy.KILO_WATT_HOUR,
"mdi:flash",
0,
SensorDeviceClass.ENERGY,
),
"PowerEnergyConsumptionThisMonth": (
],
"PowerEnergyConsumptionThisMonth": [
"PrEnergySumHcThisMonth",
UnitOfEnergy.KILO_WATT_HOUR,
"mdi:flash",
0,
SensorDeviceClass.ENERGY,
),
],
},
"ehp": {
"HWTemperature": (
"HWTemperature": [
"HwcTemp",
UnitOfTemperature.CELSIUS,
None,
4,
SensorDeviceClass.TEMPERATURE,
),
"OutsideTemp": (
],
"OutsideTemp": [
"OutsideTemp",
UnitOfTemperature.CELSIUS,
None,
4,
SensorDeviceClass.TEMPERATURE,
),
],
},
"bai": {
"HotWaterTemperature": (
"HotWaterTemperature": [
"HwcTemp",
UnitOfTemperature.CELSIUS,
None,
4,
SensorDeviceClass.TEMPERATURE,
),
"StorageTemperature": (
],
"StorageTemperature": [
"StorageTemp",
UnitOfTemperature.CELSIUS,
None,
4,
SensorDeviceClass.TEMPERATURE,
),
"DesiredStorageTemperature": (
],
"DesiredStorageTemperature": [
"StorageTempDesired",
UnitOfTemperature.CELSIUS,
None,
0,
SensorDeviceClass.TEMPERATURE,
),
"OutdoorsTemperature": (
],
"OutdoorsTemperature": [
"OutdoorstempSensor",
UnitOfTemperature.CELSIUS,
None,
4,
SensorDeviceClass.TEMPERATURE,
),
"WaterPressure": (
],
"WaterPressure": [
"WaterPressure",
UnitOfPressure.BAR,
"mdi:pipe",
4,
SensorDeviceClass.PRESSURE,
),
"AverageIgnitionTime": (
],
"AverageIgnitionTime": [
"averageIgnitiontime",
UnitOfTime.SECONDS,
"mdi:av-timer",
0,
SensorDeviceClass.DURATION,
),
"MaximumIgnitionTime": (
],
"MaximumIgnitionTime": [
"maxIgnitiontime",
UnitOfTime.SECONDS,
"mdi:av-timer",
0,
SensorDeviceClass.DURATION,
),
"MinimumIgnitionTime": (
],
"MinimumIgnitionTime": [
"minIgnitiontime",
UnitOfTime.SECONDS,
"mdi:av-timer",
0,
SensorDeviceClass.DURATION,
),
"ReturnTemperature": (
],
"ReturnTemperature": [
"ReturnTemp",
UnitOfTemperature.CELSIUS,
None,
4,
SensorDeviceClass.TEMPERATURE,
),
"CentralHeatingPump": ("WP", None, "mdi:toggle-switch", 2, None),
"HeatingSwitch": ("HeatingSwitch", None, "mdi:toggle-switch", 2, None),
"DesiredFlowTemperature": (
],
"CentralHeatingPump": ["WP", None, "mdi:toggle-switch", 2, None],
"HeatingSwitch": ["HeatingSwitch", None, "mdi:toggle-switch", 2, None],
"DesiredFlowTemperature": [
"FlowTempDesired",
UnitOfTemperature.CELSIUS,
None,
0,
SensorDeviceClass.TEMPERATURE,
),
"FlowTemperature": (
],
"FlowTemperature": [
"FlowTemp",
UnitOfTemperature.CELSIUS,
None,
4,
SensorDeviceClass.TEMPERATURE,
),
"Flame": ("Flame", None, "mdi:toggle-switch", 2, None),
"PowerEnergyConsumptionHeatingCircuit": (
],
"Flame": ["Flame", None, "mdi:toggle-switch", 2, None],
"PowerEnergyConsumptionHeatingCircuit": [
"PrEnergySumHc1",
UnitOfEnergy.KILO_WATT_HOUR,
"mdi:flash",
0,
SensorDeviceClass.ENERGY,
),
"PowerEnergyConsumptionHotWaterCircuit": (
],
"PowerEnergyConsumptionHotWaterCircuit": [
"PrEnergySumHwc1",
UnitOfEnergy.KILO_WATT_HOUR,
"mdi:flash",
0,
SensorDeviceClass.ENERGY,
),
"RoomThermostat": ("DCRoomthermostat", None, "mdi:toggle-switch", 2, None),
"HeatingPartLoad": (
],
"RoomThermostat": ["DCRoomthermostat", None, "mdi:toggle-switch", 2, None],
"HeatingPartLoad": [
"PartloadHcKW",
UnitOfEnergy.KILO_WATT_HOUR,
"mdi:flash",
0,
SensorDeviceClass.ENERGY,
),
"StateNumber": ("StateNumber", None, "mdi:fire", 3, None),
"ModulationPercentage": (
],
"StateNumber": ["StateNumber", None, "mdi:fire", 3, None],
"ModulationPercentage": [
"ModulationTempDesired",
PERCENTAGE,
"mdi:percent",
0,
None,
),
],
},
}
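
One side of this table uses lists, the other fixed-length tuples described by a `SensorSpecs` type alias, which matches how the sensor entity unpacks each spec positionally. A small sketch of the shape (the device-class slot holds a Home Assistant `SensorDeviceClass` in the real table; a plain string stands in here):

```python
# Assumed layout, following the alias in the hunk:
# (ebusd field name, unit, icon, sensor type, device class).
type SensorSpecs = tuple[str, str | None, str | None, int, str | None]

WATER_PRESSURE: SensorSpecs = ("WaterPressure", "bar", "mdi:water-pump", 0, "pressure")

# The entity unpacks the spec positionally, so a fixed-length tuple both
# documents and enforces the expected layout better than a mutable list.
name, unit, icon, sensor_type, device_class = WATER_PRESSURE
print(name, unit, icon, sensor_type, device_class)
```
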

View File

@@ -4,16 +4,14 @@ from __future__ import annotations
import datetime
import logging
from typing import Any, cast
from homeassistant.components.sensor import SensorDeviceClass, SensorEntity
from homeassistant.components.sensor import SensorEntity
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from homeassistant.util import Throttle, dt as dt_util
from . import EbusdData
from .const import EBUSD_DATA, SensorSpecs
from .const import DOMAIN
TIME_FRAME1_BEGIN = "time_frame1_begin"
TIME_FRAME1_END = "time_frame1_end"
@@ -35,9 +33,9 @@ def setup_platform(
"""Set up the Ebus sensor."""
if not discovery_info:
return
ebusd_api = hass.data[EBUSD_DATA]
monitored_conditions: list[str] = discovery_info["monitored_conditions"]
name: str = discovery_info["client_name"]
ebusd_api = hass.data[DOMAIN]
monitored_conditions = discovery_info["monitored_conditions"]
name = discovery_info["client_name"]
add_entities(
(
@@ -51,8 +49,9 @@ def setup_platform(
class EbusdSensor(SensorEntity):
"""Ebusd component sensor methods definition."""
def __init__(self, data: EbusdData, sensor: SensorSpecs, name: str) -> None:
def __init__(self, data, sensor, name):
"""Initialize the sensor."""
self._state = None
self._client_name = name
(
self._name,
@@ -64,15 +63,20 @@ class EbusdSensor(SensorEntity):
self.data = data
@property
def name(self) -> str:
def name(self):
"""Return the name of the sensor."""
return f"{self._client_name} {self._name}"
@property
def extra_state_attributes(self) -> dict[str, Any] | None:
def native_value(self):
"""Return the state of the sensor."""
return self._state
@property
def extra_state_attributes(self):
"""Return the device state attributes."""
if self._type == 1 and (native_value := self.native_value) is not None:
schedule: dict[str, str | None] = {
if self._type == 1 and self._state is not None:
schedule = {
TIME_FRAME1_BEGIN: None,
TIME_FRAME1_END: None,
TIME_FRAME2_BEGIN: None,
@@ -80,7 +84,7 @@ class EbusdSensor(SensorEntity):
TIME_FRAME3_BEGIN: None,
TIME_FRAME3_END: None,
}
time_frame = cast(str, native_value).split(";")
time_frame = self._state.split(";")
for index, item in enumerate(sorted(schedule.items())):
if index < len(time_frame):
parsed = datetime.datetime.strptime(time_frame[index], "%H:%M")
@@ -92,17 +96,17 @@ class EbusdSensor(SensorEntity):
return None
@property
def device_class(self) -> SensorDeviceClass | None:
def device_class(self):
"""Return the class of this device, from component DEVICE_CLASSES."""
return self._device_class
@property
def icon(self) -> str | None:
def icon(self):
"""Icon to use in the frontend, if any."""
return self._icon
@property
def native_unit_of_measurement(self) -> str | None:
def native_unit_of_measurement(self):
"""Return the unit of measurement."""
return self._unit_of_measurement
@@ -114,6 +118,6 @@ class EbusdSensor(SensorEntity):
if self._name not in self.data.value:
return
self._attr_native_value = self.data.value[self._name]
self._state = self.data.value[self._name]
except RuntimeError:
_LOGGER.debug("EbusdData.update exception")
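
For time-schedule sensors (type 1) the hunk splits the raw value on `;` and parses each piece with `%H:%M` into six begin/end attributes. A standalone sketch of that parsing; the exact attribute formatting used by the integration is cut off in the hunk, so the `HH:MM` output here is an assumption:

```python
import datetime

FRAME_KEYS = [
    "time_frame1_begin", "time_frame1_end",
    "time_frame2_begin", "time_frame2_end",
    "time_frame3_begin", "time_frame3_end",
]


def parse_schedule(raw: str) -> dict[str, str | None]:
    """Split an ebusd schedule like '06:00;08:00;17:00;22:00' into
    begin/end attributes; frames that are not present stay None."""
    schedule: dict[str, str | None] = dict.fromkeys(FRAME_KEYS)
    parts = raw.split(";")
    for key, value in zip(sorted(schedule), parts):
        parsed = datetime.datetime.strptime(value, "%H:%M")
        schedule[key] = parsed.time().isoformat(timespec="minutes")
    return schedule


print(parse_schedule("06:00;08:00;17:00;22:00"))
```
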

View File

@@ -18,7 +18,6 @@ from homeassistant.const import (
from homeassistant.core import HomeAssistant
from homeassistant.helpers import config_validation as cv, discovery
from homeassistant.helpers.typing import ConfigType
from homeassistant.util.hass_dict import HassKey
_LOGGER = logging.getLogger(__name__)
@@ -36,7 +35,7 @@ DEFAULT_REPORT_SERVER_PORT = 52010
DEFAULT_VERSION = "GATE-01"
DOMAIN = "egardia"
EGARDIA_DEVICE: HassKey[egardiadevice.EgardiaDevice] = HassKey(DOMAIN)
EGARDIA_DEVICE = "egardiadevice"
EGARDIA_NAME = "egardianame"
EGARDIA_REPORT_SERVER_CODES = "egardia_rs_codes"
EGARDIA_REPORT_SERVER_ENABLED = "egardia_rs_enabled"

View File

@@ -4,7 +4,6 @@ from __future__ import annotations
import logging
from pythonegardia.egardiadevice import EgardiaDevice
import requests
from homeassistant.components.alarm_control_panel import (
@@ -12,7 +11,6 @@ from homeassistant.components.alarm_control_panel import (
AlarmControlPanelEntityFeature,
AlarmControlPanelState,
)
from homeassistant.const import CONF_NAME
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
@@ -49,10 +47,10 @@ def setup_platform(
if discovery_info is None:
return
device = EgardiaAlarm(
discovery_info[CONF_NAME],
discovery_info["name"],
hass.data[EGARDIA_DEVICE],
discovery_info[CONF_REPORT_SERVER_ENABLED],
discovery_info[CONF_REPORT_SERVER_CODES],
discovery_info.get(CONF_REPORT_SERVER_CODES),
discovery_info[CONF_REPORT_SERVER_PORT],
)
@@ -69,13 +67,8 @@ class EgardiaAlarm(AlarmControlPanelEntity):
)
def __init__(
self,
name: str,
egardiasystem: EgardiaDevice,
rs_enabled: bool,
rs_codes: dict[str, list[str]],
rs_port: int,
) -> None:
self, name, egardiasystem, rs_enabled=False, rs_codes=None, rs_port=52010
):
"""Initialize the Egardia alarm."""
self._attr_name = name
self._egardiasystem = egardiasystem
@@ -92,7 +85,9 @@ class EgardiaAlarm(AlarmControlPanelEntity):
@property
def should_poll(self) -> bool:
"""Poll if no report server is enabled."""
return not self._rs_enabled
if not self._rs_enabled:
return True
return False
def handle_status_event(self, event):
"""Handle the Egardia system status event."""

View File

@@ -2,12 +2,11 @@
from __future__ import annotations
from pythonegardia.egardiadevice import EgardiaDevice
from homeassistant.components.binary_sensor import (
BinarySensorDeviceClass,
BinarySensorEntity,
)
from homeassistant.const import STATE_OFF, STATE_ON
from homeassistant.core import HomeAssistant
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
@@ -52,20 +51,30 @@ async def async_setup_platform(
class EgardiaBinarySensor(BinarySensorEntity):
"""Represents a sensor based on an Egardia sensor (IR, Door Contact)."""
def __init__(
self,
sensor_id: str,
name: str,
egardia_system: EgardiaDevice,
device_class: BinarySensorDeviceClass | None,
) -> None:
def __init__(self, sensor_id, name, egardia_system, device_class):
"""Initialize the sensor device."""
self._id = sensor_id
self._attr_name = name
self._attr_device_class = device_class
self._name = name
self._state = None
self._device_class = device_class
self._egardia_system = egardia_system
def update(self) -> None:
"""Update the status."""
egardia_input = self._egardia_system.getsensorstate(self._id)
self._attr_is_on = bool(egardia_input)
self._state = STATE_ON if egardia_input else STATE_OFF
@property
def name(self):
"""Return the name of the device."""
return self._name
@property
def is_on(self):
"""Whether the device is switched on."""
return self._state == STATE_ON
@property
def device_class(self):
"""Return the device class."""
return self._device_class

View File

@@ -8,7 +8,7 @@
"iot_class": "local_polling",
"loggers": ["pyenphase"],
"quality_scale": "platinum",
"requirements": ["pyenphase==2.4.3"],
"requirements": ["pyenphase==2.4.2"],
"zeroconf": [
{
"type": "_enphase-envoy._tcp.local."

View File

@@ -18,13 +18,12 @@ from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.discovery import async_load_platform
from homeassistant.helpers.dispatcher import async_dispatcher_send
from homeassistant.helpers.typing import ConfigType
from homeassistant.util.hass_dict import HassKey
_LOGGER = logging.getLogger(__name__)
DOMAIN = "envisalink"
DATA_EVL: HassKey[EnvisalinkAlarmPanel] = HassKey(DOMAIN)
DATA_EVL = "envisalink"
CONF_EVL_KEEPALIVE = "keepalive_interval"
CONF_EVL_PORT = "port"

View File

@@ -3,9 +3,7 @@
from __future__ import annotations
import logging
from typing import Any
from pyenvisalink import EnvisalinkAlarmPanel
import voluptuous as vol
from homeassistant.components.alarm_control_panel import (
@@ -24,7 +22,6 @@ from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from . import (
CONF_PANIC,
CONF_PARTITIONNAME,
CONF_PARTITIONS,
DATA_EVL,
DOMAIN,
PARTITION_SCHEMA,
@@ -54,14 +51,15 @@ async def async_setup_platform(
"""Perform the setup for Envisalink alarm panels."""
if not discovery_info:
return
configured_partitions: dict[int, dict[str, Any]] = discovery_info[CONF_PARTITIONS]
code: str | None = discovery_info[CONF_CODE]
panic_type: str = discovery_info[CONF_PANIC]
configured_partitions = discovery_info["partitions"]
code = discovery_info[CONF_CODE]
panic_type = discovery_info[CONF_PANIC]
entities = []
for part_num, part_config in configured_partitions.items():
entity_config_data = PARTITION_SCHEMA(part_config)
for part_num in configured_partitions:
entity_config_data = PARTITION_SCHEMA(configured_partitions[part_num])
entity = EnvisalinkAlarm(
hass,
part_num,
entity_config_data[CONF_PARTITIONNAME],
code,
@@ -105,14 +103,8 @@ class EnvisalinkAlarm(EnvisalinkEntity, AlarmControlPanelEntity):
)
def __init__(
self,
partition_number: int,
alarm_name: str,
code: str | None,
panic_type: str,
info: dict[str, Any],
controller: EnvisalinkAlarmPanel,
) -> None:
self, hass, partition_number, alarm_name, code, panic_type, info, controller
):
"""Initialize the alarm panel."""
self._partition_number = partition_number
self._panic_type = panic_type

View File

@@ -4,14 +4,8 @@ from __future__ import annotations
import datetime
import logging
from typing import Any
from pyenvisalink import EnvisalinkAlarmPanel
from homeassistant.components.binary_sensor import (
BinarySensorDeviceClass,
BinarySensorEntity,
)
from homeassistant.components.binary_sensor import BinarySensorEntity
from homeassistant.const import ATTR_LAST_TRIP_TIME
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.dispatcher import async_dispatcher_connect
@@ -19,14 +13,7 @@ from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from homeassistant.util import dt as dt_util
from . import (
CONF_ZONENAME,
CONF_ZONES,
CONF_ZONETYPE,
DATA_EVL,
SIGNAL_ZONE_UPDATE,
ZONE_SCHEMA,
)
from . import CONF_ZONENAME, CONF_ZONETYPE, DATA_EVL, SIGNAL_ZONE_UPDATE, ZONE_SCHEMA
from .entity import EnvisalinkEntity
_LOGGER = logging.getLogger(__name__)
@@ -41,12 +28,13 @@ async def async_setup_platform(
"""Set up the Envisalink binary sensor entities."""
if not discovery_info:
return
configured_zones: dict[int, dict[str, Any]] = discovery_info[CONF_ZONES]
configured_zones = discovery_info["zones"]
entities = []
for zone_num, zone_data in configured_zones.items():
entity_config_data = ZONE_SCHEMA(zone_data)
for zone_num in configured_zones:
entity_config_data = ZONE_SCHEMA(configured_zones[zone_num])
entity = EnvisalinkBinarySensor(
hass,
zone_num,
entity_config_data[CONF_ZONENAME],
entity_config_data[CONF_ZONETYPE],
@@ -61,16 +49,9 @@ async def async_setup_platform(
class EnvisalinkBinarySensor(EnvisalinkEntity, BinarySensorEntity):
"""Representation of an Envisalink binary sensor."""
def __init__(
self,
zone_number: int,
zone_name: str,
zone_type: BinarySensorDeviceClass,
info: dict[str, Any],
controller: EnvisalinkAlarmPanel,
) -> None:
def __init__(self, hass, zone_number, zone_name, zone_type, info, controller):
"""Initialize the binary_sensor."""
self._attr_device_class = zone_type
self._zone_type = zone_type
self._zone_number = zone_number
_LOGGER.debug("Setting up zone: %s", zone_name)
@@ -85,9 +66,9 @@ class EnvisalinkBinarySensor(EnvisalinkEntity, BinarySensorEntity):
)
@property
def extra_state_attributes(self) -> dict[str, Any]:
def extra_state_attributes(self):
"""Return the state attributes."""
attr: dict[str, Any] = {}
attr = {}
# The Envisalink library returns a "last_fault" value that's the
# number of seconds since the last fault, up to a maximum of 327680
@@ -120,6 +101,11 @@ class EnvisalinkBinarySensor(EnvisalinkEntity, BinarySensorEntity):
"""Return true if sensor is on."""
return self._info["status"]["open"]
@property
def device_class(self):
"""Return the class of this sensor, from DEVICE_CLASSES."""
return self._zone_type
@callback
def async_update_callback(self, zone):
"""Update the zone's state, if needed."""

View File

@@ -1,9 +1,5 @@
"""Support for Envisalink devices."""
from typing import Any
from pyenvisalink import EnvisalinkAlarmPanel
from homeassistant.helpers.entity import Entity
@@ -12,10 +8,13 @@ class EnvisalinkEntity(Entity):
_attr_should_poll = False
def __init__(
self, name: str, info: dict[str, Any], controller: EnvisalinkAlarmPanel
) -> None:
def __init__(self, name, info, controller):
"""Initialize the device."""
self._controller = controller
self._info = info
self._attr_name = name
self._name = name
@property
def name(self):
"""Return the name of the device."""
return self._name

View File

@@ -3,9 +3,6 @@
from __future__ import annotations
import logging
from typing import Any
from pyenvisalink import EnvisalinkAlarmPanel
from homeassistant.components.sensor import SensorEntity
from homeassistant.core import HomeAssistant, callback
@@ -15,7 +12,6 @@ from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from . import (
CONF_PARTITIONNAME,
CONF_PARTITIONS,
DATA_EVL,
PARTITION_SCHEMA,
SIGNAL_KEYPAD_UPDATE,
@@ -35,12 +31,13 @@ async def async_setup_platform(
"""Perform the setup for Envisalink sensor entities."""
if not discovery_info:
return
configured_partitions: dict[int, dict[str, Any]] = discovery_info[CONF_PARTITIONS]
configured_partitions = discovery_info["partitions"]
entities = []
for part_num, part_config in configured_partitions.items():
entity_config_data = PARTITION_SCHEMA(part_config)
for part_num in configured_partitions:
entity_config_data = PARTITION_SCHEMA(configured_partitions[part_num])
entity = EnvisalinkSensor(
hass,
entity_config_data[CONF_PARTITIONNAME],
part_num,
hass.data[DATA_EVL].alarm_state["partition"][part_num],
@@ -55,16 +52,9 @@ async def async_setup_platform(
class EnvisalinkSensor(EnvisalinkEntity, SensorEntity):
"""Representation of an Envisalink keypad."""
_attr_icon = "mdi:alarm"
def __init__(
self,
partition_name: str,
partition_number: int,
info: dict[str, Any],
controller: EnvisalinkAlarmPanel,
) -> None:
def __init__(self, hass, partition_name, partition_number, info, controller):
"""Initialize the sensor."""
self._icon = "mdi:alarm"
self._partition_number = partition_number
_LOGGER.debug("Setting up sensor for partition: %s", partition_name)
@@ -83,6 +73,11 @@ class EnvisalinkSensor(EnvisalinkEntity, SensorEntity):
)
)
@property
def icon(self):
"""Return the icon if any."""
return self._icon
@property
def native_value(self):
"""Return the overall state."""

View File

@@ -5,21 +5,13 @@ from __future__ import annotations
import logging
from typing import Any
from pyenvisalink import EnvisalinkAlarmPanel
from homeassistant.components.switch import SwitchEntity
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.dispatcher import async_dispatcher_connect
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.typing import ConfigType, DiscoveryInfoType
from . import (
CONF_ZONENAME,
CONF_ZONES,
DATA_EVL,
SIGNAL_ZONE_BYPASS_UPDATE,
ZONE_SCHEMA,
)
from . import CONF_ZONENAME, DATA_EVL, SIGNAL_ZONE_BYPASS_UPDATE, ZONE_SCHEMA
from .entity import EnvisalinkEntity
_LOGGER = logging.getLogger(__name__)
@@ -34,15 +26,16 @@ async def async_setup_platform(
"""Set up the Envisalink switch entities."""
if not discovery_info:
return
configured_zones: dict[int, dict[str, Any]] = discovery_info[CONF_ZONES]
configured_zones = discovery_info["zones"]
entities = []
for zone_num, zone_data in configured_zones.items():
entity_config_data = ZONE_SCHEMA(zone_data)
for zone_num in configured_zones:
entity_config_data = ZONE_SCHEMA(configured_zones[zone_num])
zone_name = f"{entity_config_data[CONF_ZONENAME]}_bypass"
_LOGGER.debug("Setting up zone_bypass switch: %s", zone_name)
entity = EnvisalinkSwitch(
hass,
zone_num,
zone_name,
hass.data[DATA_EVL].alarm_state["zone"][zone_num],
@@ -56,13 +49,7 @@ async def async_setup_platform(
class EnvisalinkSwitch(EnvisalinkEntity, SwitchEntity):
"""Representation of an Envisalink switch."""
def __init__(
self,
zone_number: int,
zone_name: str,
info: dict[str, Any],
controller: EnvisalinkAlarmPanel,
) -> None:
def __init__(self, hass, zone_number, zone_name, info, controller):
"""Initialize the switch."""
self._zone_number = zone_number

View File

@@ -1034,7 +1034,7 @@ def _async_setup_device_registry(
and dashboard.data
and dashboard.data.get(device_info.name)
):
configuration_url = f"homeassistant://app/{dashboard.addon_slug}"
configuration_url = f"homeassistant://hassio/ingress/{dashboard.addon_slug}"
manufacturer = "espressif"
if device_info.manufacturer:
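
The ESPHome hunk only toggles the deep-link format of `configuration_url` between `homeassistant://app/...` and `homeassistant://hassio/ingress/...`. A hedged sketch of where such a URL typically ends up — a generic device-registry call, not the ESPHome code; the identifier and slug values are illustrative and the ingress form is used purely for illustration.

```python
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.helpers import device_registry as dr


def register_device(hass: HomeAssistant, entry: ConfigEntry, addon_slug: str) -> None:
    """Create or refresh a device entry that deep-links to an add-on page."""
    device_registry = dr.async_get(hass)
    device_registry.async_get_or_create(
        config_entry_id=entry.entry_id,
        identifiers={("esphome", "example-device")},  # illustrative identifier
        manufacturer="espressif",
        configuration_url=f"homeassistant://hassio/ingress/{addon_slug}",
    )
```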

View File

@@ -7,7 +7,7 @@ from enum import StrEnum
from typing import Final
DOMAIN: Final = "essent"
UPDATE_INTERVAL: Final = timedelta(hours=1)
UPDATE_INTERVAL: Final = timedelta(hours=12)
ATTRIBUTION: Final = "Data provided by Essent"

View File

@@ -5,7 +5,6 @@ from __future__ import annotations
from http import HTTPStatus
import json
import logging
from typing import Any
import requests
import voluptuous as vol
@@ -47,7 +46,7 @@ class FacebookNotificationService(BaseNotificationService):
"""Initialize the service."""
self.page_access_token = access_token
def send_message(self, message: str = "", **kwargs: Any) -> None:
def send_message(self, message="", **kwargs):
"""Send some message."""
payload = {"access_token": self.page_access_token}
targets = kwargs.get(ATTR_TARGET)
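
The notify hunks (here and in the Flock/Free Mobile files below) annotate the service's `send_message` signature. A minimal sketch of a typed notification service with the same signature; the class name and delivery logic are illustrative placeholders.

```python
from typing import Any

from homeassistant.components.notify import ATTR_TARGET, BaseNotificationService


class ExampleNotificationService(BaseNotificationService):
    """Notification service with a typed send_message."""

    def send_message(self, message: str = "", **kwargs: Any) -> None:
        """Send a message to the configured targets."""
        targets = kwargs.get(ATTR_TARGET)
        # Deliver `message` to `targets` via the provider's API (omitted).
```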

View File

@@ -1,17 +0,0 @@
"""Provides conditions for fans."""
from homeassistant.const import STATE_OFF, STATE_ON
from homeassistant.core import HomeAssistant
from homeassistant.helpers.condition import Condition, make_entity_state_condition
from . import DOMAIN
CONDITIONS: dict[str, type[Condition]] = {
"is_off": make_entity_state_condition(DOMAIN, STATE_OFF),
"is_on": make_entity_state_condition(DOMAIN, STATE_ON),
}
async def async_get_conditions(hass: HomeAssistant) -> dict[str, type[Condition]]:
"""Return the fan conditions."""
return CONDITIONS

View File

@@ -1,17 +0,0 @@
.condition_common: &condition_common
target:
entity:
domain: fan
fields:
behavior:
required: true
default: any
selector:
select:
translation_key: condition_behavior
options:
- all
- any
is_off: *condition_common
is_on: *condition_common

View File

@@ -1,12 +1,4 @@
{
"conditions": {
"is_off": {
"condition": "mdi:fan-off"
},
"is_on": {
"condition": "mdi:fan"
}
},
"entity_component": {
"_": {
"default": "mdi:fan",

View File

@@ -1,32 +1,8 @@
{
"common": {
"condition_behavior_description": "How the state should match on the targeted fans.",
"condition_behavior_name": "Behavior",
"trigger_behavior_description": "The behavior of the targeted fans to trigger on.",
"trigger_behavior_name": "Behavior"
},
"conditions": {
"is_off": {
"description": "Tests if one or more fans are off.",
"fields": {
"behavior": {
"description": "[%key:component::fan::common::condition_behavior_description%]",
"name": "[%key:component::fan::common::condition_behavior_name%]"
}
},
"name": "Fan is off"
},
"is_on": {
"description": "Tests if one or more fans are on.",
"fields": {
"behavior": {
"description": "[%key:component::fan::common::condition_behavior_description%]",
"name": "[%key:component::fan::common::condition_behavior_name%]"
}
},
"name": "Fan is on"
}
},
"device_automation": {
"action_type": {
"toggle": "[%key:common::device_automation::action_type::toggle%]",
@@ -89,12 +65,6 @@
}
},
"selector": {
"condition_behavior": {
"options": {
"all": "All",
"any": "Any"
}
},
"direction": {
"options": {
"forward": "Forward",

View File

@@ -2,7 +2,6 @@
from __future__ import annotations
import asyncio
from dataclasses import dataclass
from datetime import datetime, timedelta
import logging
@@ -98,30 +97,17 @@ class FireflyDataUpdateCoordinator(DataUpdateCoordinator[FireflyCoordinatorData]
end_date = now
try:
(
accounts,
categories,
primary_currency,
budgets,
bills,
) = await asyncio.gather(
self.firefly.get_accounts(),
self.firefly.get_categories(),
self.firefly.get_currency_primary(),
self.firefly.get_budgets(start=start_date, end=end_date),
self.firefly.get_bills(),
)
category_details = await asyncio.gather(
*(
self.firefly.get_category(
category_id=int(category.id),
start=start_date,
end=end_date,
)
for category in categories
accounts = await self.firefly.get_accounts()
categories = await self.firefly.get_categories()
category_details = [
await self.firefly.get_category(
category_id=int(category.id), start=start_date, end=end_date
)
)
for category in categories
]
primary_currency = await self.firefly.get_currency_primary()
budgets = await self.firefly.get_budgets(start=start_date, end=end_date)
bills = await self.firefly.get_bills()
except FireflyAuthenticationError as err:
raise ConfigEntryAuthFailed(
translation_domain=DOMAIN,
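
The coordinator hunk contrasts awaiting each Firefly request in sequence with batching the independent requests into a single `asyncio.gather`. A generic sketch of that pattern, assuming a client with the method names shown in the diff; the `client` object itself is a hypothetical placeholder for `self.firefly`.

```python
import asyncio


async def fetch_all(client, start_date, end_date):
    """Fetch independent resources concurrently instead of one by one."""
    accounts, categories, currency, budgets, bills = await asyncio.gather(
        client.get_accounts(),
        client.get_categories(),
        client.get_currency_primary(),
        client.get_budgets(start=start_date, end=end_date),
        client.get_bills(),
    )
    # The per-category detail requests depend on `categories`, so they are
    # gathered in a second round once the IDs are known.
    category_details = await asyncio.gather(
        *(
            client.get_category(category_id=int(c.id), start=start_date, end=end_date)
            for c in categories
        )
    )
    return accounts, categories, category_details, currency, budgets, bills
```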

View File

@@ -7,5 +7,5 @@
"integration_type": "service",
"iot_class": "local_polling",
"quality_scale": "bronze",
"requirements": ["pyfirefly==0.1.12"]
"requirements": ["pyfirefly==0.1.10"]
}

View File

@@ -461,7 +461,7 @@ FITBIT_RESOURCES_LIST: Final[tuple[FitbitSensorEntityDescription, ...]] = (
key="sleep/timeInBed",
translation_key="sleep_time_in_bed",
native_unit_of_measurement=UnitOfTime.MINUTES,
icon="mdi:bed",
icon="mdi:hotel",
device_class=SensorDeviceClass.DURATION,
scope=FitbitScope.SLEEP,
state_class=SensorStateClass.TOTAL_INCREASING,

View File

@@ -5,7 +5,6 @@ from __future__ import annotations
import asyncio
from http import HTTPStatus
import logging
from typing import Any
import voluptuous as vol
@@ -48,7 +47,7 @@ class FlockNotificationService(BaseNotificationService):
self._url = url
self._session = session
async def async_send_message(self, message: str, **kwargs: Any) -> None:
async def async_send_message(self, message, **kwargs):
"""Send the message to the user."""
payload = {"text": message}

View File

@@ -4,7 +4,6 @@ from __future__ import annotations
from http import HTTPStatus
import logging
from typing import Any
from freesms import FreeClient
import voluptuous as vol
@@ -41,7 +40,7 @@ class FreeSMSNotificationService(BaseNotificationService):
"""Initialize the service."""
self.free_client = FreeClient(username, access_token)
def send_message(self, message: str = "", **kwargs: Any) -> None:
def send_message(self, message="", **kwargs):
"""Send a message to the Free Mobile user cell."""
resp = self.free_client.send_sms(message)

View File

@@ -31,7 +31,7 @@ STEP_USER_DATA_SCHEMA = vol.Schema(
)
STEP_SMS_CODE_DATA_SCHEMA = vol.Schema(
{
vol.Required(CONF_SMS_CODE): str,
vol.Required(CONF_SMS_CODE): int,
}
)
@@ -75,7 +75,7 @@ class FressnapfTrackerConfigFlow(ConfigFlow, domain=DOMAIN):
return errors, False
async def _async_verify_sms_code(
self, sms_code: str
self, sms_code: int
) -> tuple[dict[str, str], str | None]:
"""Verify SMS code and return errors and access_token."""
errors: dict[str, str] = {}
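
The config-flow hunk flips the SMS-code schema field between `int` and `str`. Whichever direction the change goes, the practical difference is that a string validator preserves leading zeros that an integer would silently drop. A small sketch using the constant name from the diff; the example value is made up.

```python
import voluptuous as vol

CONF_SMS_CODE = "sms_code"  # constant name taken from the diff

STEP_SMS_CODE_DATA_SCHEMA = vol.Schema({vol.Required(CONF_SMS_CODE): str})

validated = STEP_SMS_CODE_DATA_SCHEMA({CONF_SMS_CODE: "012345"})
assert validated[CONF_SMS_CODE] == "012345"  # the leading "0" survives as a str
```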

View File

@@ -7,5 +7,5 @@
"integration_type": "hub",
"iot_class": "cloud_polling",
"quality_scale": "bronze",
"requirements": ["fressnapftracker==0.2.1"]
"requirements": ["fressnapftracker==0.2.0"]
}

View File

@@ -164,12 +164,13 @@ def _async_wol_buttons_list(
class FritzBoxWOLButton(FritzDeviceBase, ButtonEntity):
"""Defines a FRITZ!Box Tools Wake On LAN button."""
_attr_icon = "mdi:lan-pending"
_attr_entity_registry_enabled_default = False
_attr_translation_key = "wake_on_lan"
def __init__(self, avm_wrapper: AvmWrapper, device: FritzDevice) -> None:
"""Initialize Fritz!Box WOL button."""
super().__init__(avm_wrapper, device)
self._name = f"{self.hostname} Wake on LAN"
self._attr_unique_id = f"{self._mac}_wake_on_lan"
self._is_available = True
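
The FRITZ!Box button hunk moves static entity settings (registry-enabled default, translation key) onto the class while per-device values stay in `__init__`. A hedged sketch of that split on a bare `ButtonEntity`; the class name, MAC handling, and stubbed press handler are illustrative.

```python
from homeassistant.components.button import ButtonEntity


class ExampleWolButton(ButtonEntity):
    """Wake-on-LAN button with class-level entity defaults."""

    # Static defaults shared by every instance of this entity type.
    _attr_entity_registry_enabled_default = False
    _attr_translation_key = "wake_on_lan"

    def __init__(self, mac: str) -> None:
        """Initialize the button for one device."""
        self._attr_unique_id = f"{mac}_wake_on_lan"

    async def async_press(self) -> None:
        """Handle the button press (left as a stub here)."""
```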

View File

@@ -10,7 +10,6 @@ from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.dispatcher import async_dispatcher_connect
from homeassistant.helpers.entity_platform import AddConfigEntryEntitiesCallback
from .const import DEFAULT_DEVICE_NAME
from .coordinator import FRITZ_DATA_KEY, AvmWrapper, FritzConfigEntry, FritzData
from .entity import FritzDeviceBase
from .helpers import device_filter_out_from_trackers
@@ -72,7 +71,6 @@ class FritzBoxTracker(FritzDeviceBase, ScannerEntity):
def __init__(self, avm_wrapper: AvmWrapper, device: FritzDevice) -> None:
"""Initialize a FRITZ!Box device."""
super().__init__(avm_wrapper, device)
self._attr_name: str = device.hostname or DEFAULT_DEVICE_NAME
self._last_activity: datetime.datetime | None = device.last_activity
@property

Some files were not shown because too many files have changed in this diff.