Compare commits

...

45 Commits

Author SHA1 Message Date
Bram Kragten
d81a19b5a2 Bumped version to 0.110.0b5 2020-05-19 20:21:58 +02:00
Bram Kragten
39ce063500 Updated frontend to 20200519.0 (#35813) 2020-05-19 20:19:45 +02:00
Alexei Chetroi
af34d130b6 Bump up ZHA dependencies. (#35797) 2020-05-19 20:11:22 +02:00
uvjustin
0cb5ccd492 Change version check in forked-daapd zeroconf step (#35796) 2020-05-19 20:11:22 +02:00
Anders Melchiorsen
7433fa9265 Upgrade pysonos to 0.0.30 (#35793) 2020-05-19 20:11:21 +02:00
Jason Hunter
d629a55134 Fix ONVIF subscription renewal (#35792)
* fix subscription renewal

* catch ValueError for #35762
2020-05-19 20:11:20 +02:00
Franck Nijhof
7a72ada8b2 Bumped version to 0.110.0b4 2020-05-18 22:36:58 +02:00
Bram Kragten
a28646bc24 Updated frontend to 20200518.0 (#35785) 2020-05-18 22:31:35 +02:00
MatsNl
ca4433bd70 Bump Atag dependency to 0.3.1.2 (#35776) 2020-05-18 22:31:29 +02:00
uvjustin
c61bcbf982 Skip forked_daapd ignored entries with empty entry.data (#35772) 2020-05-18 22:31:24 +02:00
Fredrik Erlandsson
e9f398ac28 Fix daikin discovery flow (#35767) 2020-05-18 22:31:19 +02:00
J. Nick Koston
7417b3be66 Handle UPS disconnects in NUT (#35758) 2020-05-18 22:31:14 +02:00
Daniel Høyer Iversen
99afc17b3f Update mill manifest to reflect config flow (#35748) 2020-05-18 22:31:10 +02:00
J. Nick Koston
cc5fc2baa4 Ensure homekit version strings conform to spec (#35741)
HomeKit requires all version strings to be in the
format MAJOR.MINOR.REVISION
2020-05-18 22:31:05 +02:00
Daniel Høyer Iversen
e2f0520028 Upgrade opengarage lib to 0.1.4 (#35729) 2020-05-18 22:31:00 +02:00
uvjustin
5cb1924290 Abort forked-daapd zeroconf flow if version < 27 (#35709)
* Change MediaPlayerDevice to MediaPlayerEntity

* Abort zeroconf if mtd-version < 27.0
2020-05-18 22:30:55 +02:00
Alexei Chetroi
aa176aab07 Bump up ZHA dependencies (#35706) 2020-05-18 22:30:51 +02:00
Franck Nijhof
5695a63e59 Fix handling of additional data in core config storage (#35660) 2020-05-18 22:30:45 +02:00
Franck Nijhof
4a9a004de0 Bumped version to 0.110.0b3 2020-05-16 11:06:22 +02:00
uvjustin
c270d5edcf Change MediaPlayerDevice to MediaPlayerEntity (#35692) 2020-05-16 11:05:15 +02:00
Bram Kragten
cf034ee729 Updated frontend to 20200515.0 (#35677) 2020-05-16 11:05:11 +02:00
Jason Hunter
316d44cf33 ONVIF: Add check around media capabilities (#35667) 2020-05-16 11:05:07 +02:00
Bram Kragten
dbd30d571d Fix caldav event for calendar panel (#35653) 2020-05-16 11:05:03 +02:00
Xiaonan Shen
78c9411dde Bump roombapy to 1.6.1 (#35650)
* Bump roombapy to 1.6.1

* Improve roomba error handling
2020-05-16 11:04:59 +02:00
Glenn Waters
2f999dd77e Update Universal Powerline Bus event name (#35644) 2020-05-16 11:04:55 +02:00
Chris Talkington
e8ee3c7d4d Prevent discovery of IPP printers lacking identifier (#35630) 2020-05-16 11:04:51 +02:00
Bram Kragten
5496a8ca05 Bumped version to 0.110.0b2 2020-05-15 09:12:26 +02:00
Quentame
3928fe9578 Bump python-synology to 0.8.1 (#35640)
* Bump python-synology to 0.8.1

* Fix tests
2020-05-15 09:10:28 +02:00
Bram Kragten
592ecd479f Updated frontend to 20200514.1 (#35632) 2020-05-15 09:10:27 +02:00
Franck Nijhof
20188a36de Bumped version to 0.110.0b1 2020-05-14 23:49:29 +02:00
Franck Nijhof
d66856dd17 Update translations for OZW and ONVIF 2020-05-14 23:48:43 +02:00
Franck Nijhof
bcf068f66f Rename zwave_mqtt to ozw (#35631) 2020-05-14 23:33:33 +02:00
Alexei Chetroi
618ce2ff0a Don't remove deprecated ZHA config option yet (#35627) 2020-05-14 23:26:17 +02:00
Jason Hunter
05778ad307 additional log info and strings fix (#35622) 2020-05-14 23:26:13 +02:00
Ville Skyttä
bc0109256f Upgrade huawei-lte-api to 1.4.12 (#35618)
https://github.com/Salamek/huawei-lte-api/releases/tag/1.4.12
2020-05-14 23:26:09 +02:00
Anders Melchiorsen
fa487c7c2f Upgrade to pysonos 0.0.29 (#35617) 2020-05-14 23:26:05 +02:00
Bram Kragten
b3a0270acd Add check for HTML in translations (#35615)
* Add check for HTML in translations

and remove existing html

* Add test
2020-05-14 23:26:02 +02:00
zacpotts
a930175c55 Fix zwave thermostat specific device type (#35609) 2020-05-14 23:25:58 +02:00
Marcel van der Veldt
45d3bb7da2 Fix zwave_mqtt creating the device name (#35603)
* Fix for creating the device name

Creating the device name included a typo and was missing the custom (preferred) name set by the OZW Admin tool.

* update comments
2020-05-14 23:25:55 +02:00
Jason Hunter
b83adad417 Additional checks for ONVIF event capabilities (#35599)
catch any exceptions when pulling event capabilities and assume it is not supported
2020-05-14 23:25:51 +02:00
Bram Kragten
856c0e6a15 Updated frontend to 20200514.0 (#35598) 2020-05-14 23:25:48 +02:00
J. Nick Koston
eeaef5731f Fix reversed logic in zeroconf homekit pairing check (#35596)
* Fix reversed logic in zeroconf homekit pairing check

* s/server_info/service_info/
2020-05-14 23:25:44 +02:00
uvjustin
cc431b9f14 Clean up forked_daapd volume saving/setting in async_play_media (#35584)
* Clean up volume saving/setting in async_play_media

* Set source to pipe when queued externally

* Add server version requirement to error string
2020-05-14 23:25:40 +02:00
Bouwe Westerdijk
506dd1d423 Bump haanna to 0.15.0 (#35579)
* Link to haanna v0.15.0

* Update requirements_all.txt
2020-05-14 23:25:35 +02:00
Steven Looman
0e79b47b43 Properly handle incomplete upnp ssdp discovery (#35553) 2020-05-14 23:25:29 +02:00
121 changed files with 660 additions and 386 deletions

View File

@@ -920,10 +920,10 @@ omit =
homeassistant/components/zoneminder/*
homeassistant/components/supla/*
homeassistant/components/zwave/util.py
homeassistant/components/zwave_mqtt/__init__.py
homeassistant/components/zwave_mqtt/discovery.py
homeassistant/components/zwave_mqtt/entity.py
homeassistant/components/zwave_mqtt/services.py
homeassistant/components/ozw/__init__.py
homeassistant/components/ozw/discovery.py
homeassistant/components/ozw/entity.py
homeassistant/components/ozw/services.py
[report]
# Regexes for lines to exclude from consideration

View File

@@ -293,6 +293,7 @@ homeassistant/components/openweathermap/* @fabaff
homeassistant/components/opnsense/* @mtreinish
homeassistant/components/orangepi_gpio/* @pascallj
homeassistant/components/oru/* @bvlaicu
homeassistant/components/ozw/* @cgarwood @marcelveldt @MartinHjelmare
homeassistant/components/panasonic_viera/* @joogps
homeassistant/components/panel_custom/* @home-assistant/frontend
homeassistant/components/panel_iframe/* @home-assistant/frontend
@@ -475,7 +476,6 @@ homeassistant/components/zha/* @dmulcahey @adminiuga
homeassistant/components/zone/* @home-assistant/core
homeassistant/components/zoneminder/* @rohankapoorcom
homeassistant/components/zwave/* @home-assistant/z-wave
homeassistant/components/zwave_mqtt/* @cgarwood @marcelveldt @MartinHjelmare
# Individual files
homeassistant/components/demo/weather @fabaff

View File

@@ -3,7 +3,7 @@
"step": {
"auth": {
"title": "Authenticate Ambiclimate",
"description": "Please follow this [link]({authorization_url}) and <b>Allow</b> access to your Ambiclimate account, then come back and press <b>Submit</b> below.\n(Make sure the specified callback url is {cb_url})"
"description": "Please follow this [link]({authorization_url}) and **Allow** access to your Ambiclimate account, then come back and press **Submit** below.\n(Make sure the specified callback url is {cb_url})"
}
},
"create_entry": {

View File

@@ -3,6 +3,6 @@
"name": "Atag",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/atag/",
"requirements": ["pyatag==0.3.1.1"],
"requirements": ["pyatag==0.3.1.2"],
"codeowners": ["@MatsNL"]
}

View File

@@ -174,7 +174,7 @@ class WebDavCalendarData:
uid = vevent.uid.value
data = {
"uid": uid,
"title": vevent.summary.value,
"summary": vevent.summary.value,
"start": self.get_hass_date(vevent.dtstart.value),
"end": self.get_hass_date(self.get_end_date(vevent)),
"location": self.get_attr_value(vevent, "location"),

View File

@@ -15,14 +15,6 @@ from .const import CONF_KEY, CONF_UUID, KEY_IP, KEY_MAC, TIMEOUT
_LOGGER = logging.getLogger(__name__)
DATA_SCHEMA = vol.Schema(
{
vol.Required(CONF_HOST): str,
vol.Optional(CONF_KEY): str,
vol.Optional(CONF_PASSWORD): str,
}
)
@config_entries.HANDLERS.register("daikin")
class FlowHandler(config_entries.ConfigFlow):
@@ -31,12 +23,26 @@ class FlowHandler(config_entries.ConfigFlow):
VERSION = 1
CONNECTION_CLASS = config_entries.CONN_CLASS_LOCAL_POLL
def _create_entry(self, host, mac, key=None, uuid=None, password=None):
def __init__(self):
"""Initialize the Daikin config flow."""
self.host = None
@property
def schema(self):
"""Return current schema."""
return vol.Schema(
{
vol.Required(CONF_HOST, default=self.host): str,
vol.Optional(CONF_KEY): str,
vol.Optional(CONF_PASSWORD): str,
}
)
async def _create_entry(self, host, mac, key=None, uuid=None, password=None):
"""Register new entry."""
# Check if mac already is registered
for entry in self._async_current_entries():
if entry.data[KEY_MAC] == mac:
return self.async_abort(reason="already_configured")
await self.async_set_unique_id(mac)
self._abort_if_unique_id_configured()
return self.async_create_entry(
title=host,
@@ -73,31 +79,31 @@ class FlowHandler(config_entries.ConfigFlow):
except asyncio.TimeoutError:
return self.async_show_form(
step_id="user",
data_schema=DATA_SCHEMA,
data_schema=self.schema,
errors={"base": "device_timeout"},
)
except web_exceptions.HTTPForbidden:
return self.async_show_form(
step_id="user", data_schema=DATA_SCHEMA, errors={"base": "forbidden"},
step_id="user", data_schema=self.schema, errors={"base": "forbidden"},
)
except ClientError:
_LOGGER.exception("ClientError")
return self.async_show_form(
step_id="user", data_schema=DATA_SCHEMA, errors={"base": "device_fail"},
step_id="user", data_schema=self.schema, errors={"base": "device_fail"},
)
except Exception: # pylint: disable=broad-except
_LOGGER.exception("Unexpected error creating device")
return self.async_show_form(
step_id="user", data_schema=DATA_SCHEMA, errors={"base": "device_fail"},
step_id="user", data_schema=self.schema, errors={"base": "device_fail"},
)
mac = device.mac
return self._create_entry(host, mac, key, uuid, password)
return await self._create_entry(host, mac, key, uuid, password)
async def async_step_user(self, user_input=None):
"""User initiated config flow."""
if user_input is None:
return self.async_show_form(step_id="user", data_schema=DATA_SCHEMA,)
return self.async_show_form(step_id="user", data_schema=self.schema)
return await self._create_device(
user_input[CONF_HOST],
user_input.get(CONF_KEY),
@@ -111,7 +117,10 @@ class FlowHandler(config_entries.ConfigFlow):
return await self.async_step_user()
return await self._create_device(host)
async def async_step_discovery(self, user_input):
async def async_step_discovery(self, discovery_info):
"""Initialize step from discovery."""
_LOGGER.info("Discovered device: %s", user_input)
return self._create_entry(user_input[KEY_IP], user_input[KEY_MAC])
_LOGGER.debug("Discovered device: %s", discovery_info)
await self.async_set_unique_id(discovery_info[KEY_MAC])
self._abort_if_unique_id_configured()
self.host = discovery_info[KEY_IP]
return await self.async_step_user()
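
The switch above from a manual scan of existing entries to async_set_unique_id / _abort_if_unique_id_configured is the stock config-entry deduplication pattern. A minimal sketch of that pattern, using a hypothetical ExampleFlowHandler and "example" domain rather than the real Daikin class, might look roughly like this:

from homeassistant import config_entries


class ExampleFlowHandler(config_entries.ConfigFlow, domain="example"):
    """Hypothetical flow handler illustrating unique-ID based deduplication."""

    async def async_step_discovery(self, discovery_info):
        # Register the device MAC as the unique ID; a second discovery of the
        # same device then aborts with "already_configured" instead of needing
        # a manual loop over self._async_current_entries().
        await self.async_set_unique_id(discovery_info["mac"])
        self._abort_if_unique_id_configured()
        return await self.async_step_user()

    async def async_step_user(self, user_input=None):
        # Placeholder user step, only to keep the sketch self-contained.
        return self.async_create_entry(title="Example device", data=user_input or {})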

View File

@@ -3,7 +3,7 @@
"name": "Daikin AC",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/daikin",
"requirements": ["pydaikin==2.0.1"],
"requirements": ["pydaikin==2.0.2"],
"codeowners": ["@fredrike"],
"quality_scale": "platinum"
}

View File

@@ -3,7 +3,7 @@
"step": {
"user": {
"title": "Configure Daikin AC",
"description": "Enter IP address of your Daikin AC.",
"description": "Enter IP address of your Daikin AC.\n\nNote that [%key:common::config_flow::data::api_key%] and [%key:common::config_flow::data::password%] are used by BRP072Cxx and SKYFi devices respectively.",
"data": {
"host": "[%key:common::config_flow::data::host%]",
"key": "[%key:common::config_flow::data::api_key%]",

View File

@@ -158,23 +158,24 @@ class ForkedDaapdFlowHandler(config_entries.ConfigFlow, domain=DOMAIN):
"""Prepare configuration for a discovered forked-daapd device."""
if not (
discovery_info.get("properties")
and discovery_info["properties"].get("mtd-version")
and int(discovery_info["properties"].get("mtd-version", "0").split(".")[0])
>= 27
and discovery_info["properties"].get("Machine Name")
):
return self.async_abort(reason="not_forked_daapd")
await self.async_set_unique_id(discovery_info["properties"]["Machine Name"])
self._abort_if_unique_id_configured()
# Update title and abort if we already have an entry for this host
for entry in self._async_current_entries():
if entry.data[CONF_HOST] != discovery_info["host"]:
if entry.data.get(CONF_HOST) != discovery_info["host"]:
continue
self.hass.config_entries.async_update_entry(
entry, title=discovery_info["properties"]["Machine Name"],
)
return self.async_abort(reason="already_configured")
await self.async_set_unique_id(discovery_info["properties"]["Machine Name"])
self._abort_if_unique_id_configured()
zeroconf_data = {
CONF_HOST: discovery_info["host"],
CONF_PORT: int(discovery_info["port"]),
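
The new gate above only inspects the major component of the zeroconf mtd-version property before comparing it against 27. A standalone sketch of that check (the helper name and sample inputs are illustrative, not part of the integration) behaves as follows:

def has_min_mtd_version(properties, minimum=27):
    """Illustrative helper: same major-version parse as the flow above."""
    return int(properties.get("mtd-version", "0").split(".")[0]) >= minimum


has_min_mtd_version({"mtd-version": "27.0"})     # True  -> flow continues
has_min_mtd_version({"mtd-version": "26.3"})     # False -> abort with "not_forked_daapd"
has_min_mtd_version({"mtd-version": "0.2.4.1"})  # False -> verbose Firefly-style versions are rejected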

View File

@@ -6,7 +6,7 @@ import logging
from pyforked_daapd import ForkedDaapdAPI
from pylibrespot_java import LibrespotJavaAPI
from homeassistant.components.media_player import MediaPlayerDevice
from homeassistant.components.media_player import MediaPlayerEntity
from homeassistant.components.media_player.const import MEDIA_TYPE_MUSIC
from homeassistant.const import (
CONF_HOST,
@@ -116,7 +116,7 @@ async def update_listener(hass, entry):
)
class ForkedDaapdZone(MediaPlayerDevice):
class ForkedDaapdZone(MediaPlayerEntity):
"""Representation of a forked-daapd output."""
def __init__(self, api, output, entry_id):
@@ -221,7 +221,7 @@ class ForkedDaapdZone(MediaPlayerDevice):
return SUPPORTED_FEATURES_ZONE
class ForkedDaapdMaster(MediaPlayerDevice):
class ForkedDaapdMaster(MediaPlayerEntity):
"""Representation of the main forked-daapd device."""
def __init__(
@@ -237,7 +237,7 @@ class ForkedDaapdMaster(MediaPlayerDevice):
self._track_info = defaultdict(
str
) # _track info is found by matching _player data with _queue data
self._last_outputs = None # used for device on/off
self._last_outputs = [] # used for device on/off
self._last_volume = DEFAULT_UNMUTE_VOLUME
self._player_last_updated = None
self._pipe_control_api = {}
@@ -349,6 +349,13 @@ class ForkedDaapdMaster(MediaPlayerDevice):
):
self._tts_requested = False
self._tts_queued = True
if (
self._queue["count"] >= 1
and self._queue["items"][0]["data_kind"] == "pipe"
and self._queue["items"][0]["title"] in KNOWN_PIPES
): # if we're playing a pipe, set the source automatically so we can forward controls
self._source = f"{self._queue['items'][0]['title']} (pipe)"
self._update_track_info()
event.set()
@@ -407,6 +414,7 @@ class ForkedDaapdMaster(MediaPlayerDevice):
async def async_turn_on(self):
"""Restore the last on outputs state."""
# restore state
await self._api.set_volume(volume=self._last_volume * 100)
if self._last_outputs:
futures = []
for output in self._last_outputs:
@@ -418,19 +426,16 @@ class ForkedDaapdMaster(MediaPlayerDevice):
)
)
await asyncio.wait(futures)
else:
selected = []
for output in self._outputs:
selected.append(output["id"])
await self._api.set_enabled_outputs(selected)
else: # enable all outputs
await self._api.set_enabled_outputs(
[output["id"] for output in self._outputs]
)
async def async_turn_off(self):
"""Pause player and store outputs state."""
await self.async_media_pause()
if any(
[output["selected"] for output in self._outputs]
): # only store output state if some output is selected
self._last_outputs = self._outputs
self._last_outputs = self._outputs
if any([output["selected"] for output in self._outputs]):
await self._api.set_enabled_outputs([])
async def async_toggle(self):
@@ -613,8 +618,12 @@ class ForkedDaapdMaster(MediaPlayerDevice):
url = self._api.full_url(url)
return url
async def _set_tts_volumes(self):
async def _save_and_set_tts_volumes(self):
if self.volume_level: # save master volume
self._last_volume = self.volume_level
self._last_outputs = self._outputs
if self._outputs:
await self._api.set_volume(volume=self._tts_volume * 100)
futures = []
for output in self._outputs:
futures.append(
@@ -623,7 +632,6 @@ class ForkedDaapdMaster(MediaPlayerDevice):
)
)
await asyncio.wait(futures)
await self._api.set_volume(volume=self._tts_volume * 100)
async def _pause_and_wait_for_callback(self):
"""Send pause and wait for the pause callback to be received."""
@@ -641,14 +649,12 @@ class ForkedDaapdMaster(MediaPlayerDevice):
"""Play a URI."""
if media_type == MEDIA_TYPE_MUSIC:
saved_state = self.state # save play state
if any([output["selected"] for output in self._outputs]): # save outputs
self._last_outputs = self._outputs
await self._api.set_enabled_outputs([]) # turn off outputs
saved_mute = self.is_volume_muted
sleep_future = asyncio.create_task(
asyncio.sleep(self._tts_pause_time)
) # start timing now, but not exact because of fd buffer + tts latency
await self._pause_and_wait_for_callback()
await self._set_tts_volumes()
await self._save_and_set_tts_volumes()
# save position
saved_song_position = self._player["item_progress_ms"]
saved_queue = (
@@ -678,7 +684,9 @@ class ForkedDaapdMaster(MediaPlayerDevice):
_LOGGER.warning("TTS request timed out")
self._tts_playing_event.clear()
# TTS done, return to normal
await self.async_turn_on() # restores outputs
await self.async_turn_on() # restore outputs and volumes
if saved_mute: # mute if we were muted
await self.async_mute_volume(True)
if self._use_pipe_control(): # resume pipe
await self._api.add_to_queue(
uris=self._sources_uris[self._source], clear=True

View File

@@ -16,7 +16,7 @@
"websocket_not_enabled": "forked-daapd server websocket not enabled.",
"wrong_host_or_port": "Unable to connect. Please check host and port.",
"wrong_password": "Incorrect password.",
"wrong_server_type": "Not a forked-daapd server.",
"wrong_server_type": "The forked-daapd integration requires a forked-daapd server with version >= 27.0.",
"unknown_error": "Unknown error."
},
"abort": {

View File

@@ -9,7 +9,7 @@
"websocket_not_enabled": "forked-daapd server websocket not enabled.",
"wrong_host_or_port": "Unable to connect. Please check host and port.",
"wrong_password": "Incorrect password.",
"wrong_server_type": "Not a forked-daapd server."
"wrong_server_type": "The forked-daapd integration requires a forked-daapd server with version >= 27.0."
},
"flow_title": "forked-daapd server: {name} ({host})",
"step": {

View File

@@ -2,7 +2,7 @@
"domain": "frontend",
"name": "Home Assistant Frontend",
"documentation": "https://www.home-assistant.io/integrations/frontend",
"requirements": ["home-assistant-frontend==20200513.0"],
"requirements": ["home-assistant-frontend==20200519.0"],
"dependencies": [
"api",
"auth",

View File

@@ -75,6 +75,7 @@ from .const import (
from .util import (
convert_to_float,
dismiss_setup_message,
format_sw_version,
show_setup_message,
validate_media_player_features,
)
@@ -253,7 +254,7 @@ class HomeAccessory(Accessory):
else:
model = domain.title()
if ATTR_SOFTWARE_VERSION in self.config:
sw_version = self.config[ATTR_SOFTWARE_VERSION]
sw_version = format_sw_version(self.config[ATTR_SOFTWARE_VERSION])
else:
sw_version = __version__

View File

@@ -4,6 +4,7 @@ import io
import ipaddress
import logging
import os
import re
import secrets
import socket
@@ -415,6 +416,14 @@ def get_aid_storage_fullpath_for_entry_id(hass: HomeAssistant, entry_id: str):
)
def format_sw_version(version):
"""Extract the version string in a format homekit can consume."""
match = re.search(r"([0-9]+)(\.[0-9]+)?(\.[0-9]+)?", str(version).replace("-", "."))
if match:
return match.group(0)
return None
def migrate_filesystem_state_data_for_primary_imported_entry_id(
hass: HomeAssistant, entry_id: str
):
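
Because the new helper above is self-contained, it can be exercised on its own; the expected outputs below mirror the test_format_sw_version cases added further down in this changeset:

import re


def format_sw_version(version):
    """Copy of the helper above so the example runs standalone."""
    match = re.search(r"([0-9]+)(\.[0-9]+)?(\.[0-9]+)?", str(version).replace("-", "."))
    if match:
        return match.group(0)
    return None


format_sw_version("soho+3.6.8+soho-release-rt120+10")  # "3.6.8"
format_sw_version("56.0-76060")                        # "56.0.76060" (dash treated as a dot)
format_sw_version(3.6)                                 # "3.6"
format_sw_version("unknown")                           # None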

View File

@@ -5,7 +5,7 @@
"documentation": "https://www.home-assistant.io/integrations/huawei_lte",
"requirements": [
"getmac==0.8.2",
"huawei-lte-api==1.4.11",
"huawei-lte-api==1.4.12",
"stringcase==1.2.0",
"url-normalize==1.4.1"
],

View File

@@ -152,6 +152,7 @@ class IPPFlowHandler(ConfigFlow, domain=DOMAIN):
_LOGGER.debug(
"Unable to determine unique id from discovery info and IPP response"
)
return self.async_abort(reason="unique_id_required")
await self.async_set_unique_id(unique_id)
self._abort_if_unique_id_configured(

View File

@@ -28,7 +28,8 @@
"connection_upgrade": "Failed to connect to printer due to connection upgrade being required.",
"ipp_error": "Encountered IPP error.",
"ipp_version_error": "IPP version not supported by printer.",
"parse_error": "Failed to parse response from printer."
"parse_error": "Failed to parse response from printer.",
"unique_id_required": "Device missing unique identification required for discovery."
}
}
}

View File

@@ -8,7 +8,7 @@
},
"auth": {
"title": "Authenticate with Logi Circle",
"description": "Please follow the link below and <b>Accept</b> access to your Logi Circle account, then come back and press <b>Submit</b> below.\n\n[Link]({authorization_url})"
"description": "Please follow the link below and **Accept** access to your Logi Circle account, then come back and press **Submit** below.\n\n[Link]({authorization_url})"
}
},
"create_entry": {

View File

@@ -3,5 +3,6 @@
"name": "Mill",
"documentation": "https://www.home-assistant.io/integrations/mill",
"requirements": ["millheater==0.3.4"],
"codeowners": ["@danielhiversen"]
"codeowners": ["@danielhiversen"],
"config_flow": true
}

View File

@@ -18,7 +18,7 @@ from homeassistant.const import (
)
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from .const import (
COORDINATOR,
@@ -61,7 +61,9 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry):
async def async_update_data():
"""Fetch data from NUT."""
async with async_timeout.timeout(10):
return await hass.async_add_executor_job(data.update)
await hass.async_add_executor_job(data.update)
if not data.status:
raise UpdateFailed("Error fetching UPS state")
coordinator = DataUpdateCoordinator(
hass,
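
Raising UpdateFailed from the coordinator's update method is the standard DataUpdateCoordinator failure signal used above. A minimal, generic sketch of the pattern (the fetch_status callable and the coordinator name are hypothetical, not the NUT code) is:

from datetime import timedelta
import logging

from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed


def make_coordinator(hass, fetch_status):
    """Hypothetical factory: fetch_status is any blocking callable returning UPS data."""

    async def async_update_data():
        data = await hass.async_add_executor_job(fetch_status)
        if not data:
            # Marks this refresh as failed: coordinator.last_update_success becomes
            # False, which entities typically expose as unavailable.
            raise UpdateFailed("Error fetching UPS state")
        return data

    return DataUpdateCoordinator(
        hass,
        logging.getLogger(__name__),
        name="ups_status",
        update_method=async_update_data,
        update_interval=timedelta(seconds=60),
    )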

View File

@@ -189,6 +189,8 @@ class NUTSensor(Entity):
@property
def state(self):
"""Return entity state from ups."""
if not self._data.status:
return None
if self._type == KEY_STATUS_DISPLAY:
return _format_display_state(self._data.status)
return self._data.status.get(self._type)

View File

@@ -54,6 +54,8 @@ class ONVIFDevice:
self.profiles: List[Profile] = []
self.max_resolution: int = 0
self._dt_diff_seconds: int = 0
@property
def name(self) -> str:
"""Return the name of this device."""
@@ -100,6 +102,16 @@ class ONVIFDevice:
if self.capabilities.ptz:
self.device.create_ptz_service()
if self._dt_diff_seconds > 300 and self.capabilities.events:
self.capabilities.events = False
LOGGER.warning(
"The system clock on '%s' is more than 5 minutes off. "
"Although this device supports events, they will be "
"disabled until the device clock is fixed as we will "
"not be able to renew the subscription.",
self.name,
)
if self.capabilities.events:
self.events = EventManager(
self.hass, self.device, self.config_entry.unique_id
@@ -179,9 +191,9 @@ class ONVIFDevice:
)
dt_diff = cam_date - system_date
dt_diff_seconds = dt_diff.total_seconds()
self._dt_diff_seconds = dt_diff.total_seconds()
if dt_diff_seconds > 5:
if self._dt_diff_seconds > 5:
LOGGER.warning(
"The date/time on the device (UTC) is '%s', "
"which is different from the system '%s', "
@@ -207,19 +219,30 @@ class ONVIFDevice:
async def async_get_capabilities(self):
"""Obtain information about the available services on the device."""
media_service = self.device.create_media_service()
media_capabilities = await media_service.GetServiceCapabilities()
event_service = self.device.create_events_service()
event_capabilities = await event_service.GetServiceCapabilities()
snapshot = False
try:
media_service = self.device.create_media_service()
media_capabilities = await media_service.GetServiceCapabilities()
snapshot = media_capabilities.SnapshotUri
except (ONVIFError, Fault):
pass
pullpoint = False
try:
event_service = self.device.create_events_service()
event_capabilities = await event_service.GetServiceCapabilities()
pullpoint = event_capabilities.WSPullPointSupport
except (ONVIFError, Fault):
pass
ptz = False
try:
self.device.get_definition("ptz")
ptz = True
except ONVIFError:
pass
return Capabilities(
media_capabilities.SnapshotUri, event_capabilities.WSPullPointSupport, ptz
)
return Capabilities(snapshot, pullpoint, ptz)
async def async_get_profiles(self) -> List[Profile]:
"""Obtain media profiles for this device."""

View File

@@ -104,7 +104,8 @@ class EventManager:
if not self._subscription:
return
await self._subscription.Renew(dt_util.utcnow() + dt.timedelta(minutes=10))
termination_time = (dt_util.utcnow() + dt.timedelta(minutes=30)).isoformat()
await self._subscription.Renew(termination_time)
async def async_pull_messages(self, _now: dt = None) -> None:
"""Pull messages from device."""
@@ -143,19 +144,22 @@ class EventManager:
async def async_parse_messages(self, messages) -> None:
"""Parse notification message."""
for msg in messages:
# LOGGER.debug("ONVIF Event Message %s: %s", self.device.host, pformat(msg))
topic = msg.Topic._value_1
parser = PARSERS.get(topic)
if not parser:
if topic not in UNHANDLED_TOPICS:
LOGGER.info("No registered handler for event: %s", msg)
LOGGER.info(
"No registered handler for event from %s: %s",
self.unique_id,
msg,
)
UNHANDLED_TOPICS.add(topic)
continue
event = await parser(self.unique_id, msg)
if not event:
LOGGER.warning("Unable to parse event: %s", msg)
LOGGER.warning("Unable to parse event from %s: %s", self.unique_id, msg)
return
self._events[event.uid] = event

View File

@@ -308,7 +308,7 @@ async def async_parse_last_reboot(uid: str, msg) -> Event:
dt_util.parse_datetime(msg.Message._value_1.Data.SimpleItem[0].Value)
),
)
except (AttributeError, KeyError):
except (AttributeError, KeyError, ValueError):
return None
@@ -331,7 +331,7 @@ async def async_parse_last_reset(uid: str, msg) -> Event:
),
entity_enabled=False,
)
except (AttributeError, KeyError):
except (AttributeError, KeyError, ValueError):
return None
@@ -354,5 +354,5 @@ async def async_parse_last_clock_sync(uid: str, msg) -> Event:
),
entity_enabled=False,
)
except (AttributeError, KeyError):
except (AttributeError, KeyError, ValueError):
return None

View File

@@ -23,6 +23,7 @@
},
"manual_input": {
"data": {
"name": "Name",
"host": "[%key:common::config_flow::data::host%]",
"port": "[%key:common::config_flow::data::port%]"
},
@@ -55,4 +56,4 @@
}
}
}
}
}

View File

@@ -34,6 +34,7 @@
"manual_input": {
"data": {
"host": "Amfitri\u00f3",
"name": "Nom",
"port": "Port"
},
"title": "Configura el dispositiu ONVIF"

View File

@@ -34,6 +34,7 @@
"manual_input": {
"data": {
"host": "Host",
"name": "Name",
"port": "Port"
},
"title": "Configure ONVIF device"

View File

@@ -34,6 +34,7 @@
"manual_input": {
"data": {
"host": "Host",
"name": "Nombre",
"port": "Puerto"
},
"title": "Configurar el dispositivo ONVIF"

View File

@@ -34,6 +34,7 @@
"manual_input": {
"data": {
"host": "\ud638\uc2a4\ud2b8",
"name": "\uc774\ub984",
"port": "\ud3ec\ud2b8"
},
"title": "ONVIF \uae30\uae30 \uad6c\uc131\ud558\uae30"

View File

@@ -34,6 +34,7 @@
"manual_input": {
"data": {
"host": "Vert",
"name": "Navn",
"port": "Port"
},
"title": "Konfigurere ONVIF-enhet"

View File

@@ -1,7 +1,7 @@
{
"config": {
"abort": {
"already_configured": "[%key_id:common::config_flow::abort::already_configured_device%]",
"already_configured": "Urz\u0105dzenie jest ju\u017c skonfigurowane.",
"already_in_progress": "Proces konfiguracji dla urz\u0105dzenia ONVIF jest ju\u017c w toku.",
"no_h264": "Nie by\u0142o dost\u0119pnych \u017cadnych strumieni H264. Sprawd\u017a konfiguracj\u0119 profilu w swoim urz\u0105dzeniu.",
"no_mac": "Nie mo\u017cna utworzy\u0107 unikalnego identyfikatora urz\u0105dzenia ONVIF.",
@@ -13,8 +13,8 @@
"step": {
"auth": {
"data": {
"password": "[%key_id:common::config_flow::data::password%]",
"username": "[%key_id:common::config_flow::data::username%]"
"password": "Has\u0142o",
"username": "Nazwa u\u017cytkownika"
},
"title": "Konfigurowanie uwierzytelniania"
},
@@ -33,8 +33,8 @@
},
"manual_input": {
"data": {
"host": "[%key_id:common::config_flow::data::host%]",
"port": "[%key_id:common::config_flow::data::port%]"
"host": "Nazwa hosta lub adres IP",
"port": "Port"
},
"title": "Konfigurowanie urz\u0105dzenia ONVIF"
},

View File

@@ -34,6 +34,7 @@
"manual_input": {
"data": {
"host": "\u0425\u043e\u0441\u0442",
"name": "\u041d\u0430\u0437\u0432\u0430\u043d\u0438\u0435",
"port": "\u041f\u043e\u0440\u0442"
},
"title": "\u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0430 \u0443\u0441\u0442\u0440\u043e\u0439\u0441\u0442\u0432\u0430 ONVIF"

View File

@@ -5,5 +5,5 @@
"codeowners": [
"@danielhiversen"
],
"requirements": ["open-garage==0.1.3"]
"requirements": ["open-garage==0.1.4"]
}

View File

@@ -1,4 +1,4 @@
"""The zwave_mqtt integration."""
"""The ozw integration."""
import asyncio
import json
import logging
@@ -43,7 +43,7 @@ DATA_DEVICES = "zwave-mqtt-devices"
async def async_setup(hass: HomeAssistant, config: dict):
"""Initialize basic config of zwave_mqtt component."""
"""Initialize basic config of ozw component."""
if "mqtt" not in hass.config.components:
_LOGGER.error("MQTT integration is not set up")
return False
@@ -52,9 +52,9 @@ async def async_setup(hass: HomeAssistant, config: dict):
async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry):
"""Set up zwave_mqtt from a config entry."""
zwave_mqtt_data = hass.data[DOMAIN][entry.entry_id] = {}
zwave_mqtt_data[DATA_UNSUBSCRIBE] = []
"""Set up ozw from a config entry."""
ozw_data = hass.data[DOMAIN][entry.entry_id] = {}
ozw_data[DATA_UNSUBSCRIBE] = []
data_nodes = {}
data_values = {}
@@ -216,7 +216,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry):
for component in PLATFORMS
]
)
zwave_mqtt_data[DATA_UNSUBSCRIBE].append(
ozw_data[DATA_UNSUBSCRIBE].append(
await mqtt.async_subscribe(
hass, f"{TOPIC_OPENZWAVE}/#", async_receive_message
)

View File

@@ -1,13 +1,13 @@
"""Config flow for zwave_mqtt integration."""
"""Config flow for ozw integration."""
from homeassistant import config_entries
from .const import DOMAIN # pylint:disable=unused-import
TITLE = "Z-Wave MQTT"
TITLE = "OpenZWave"
class DomainConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
"""Handle a config flow for zwave_mqtt."""
"""Handle a config flow for ozw."""
VERSION = 1
CONNECTION_CLASS = config_entries.CONN_CLASS_LOCAL_PUSH

View File

@@ -1,10 +1,10 @@
"""Constants for the zwave_mqtt integration."""
"""Constants for the ozw integration."""
from homeassistant.components.binary_sensor import DOMAIN as BINARY_SENSOR_DOMAIN
from homeassistant.components.light import DOMAIN as LIGHT_DOMAIN
from homeassistant.components.sensor import DOMAIN as SENSOR_DOMAIN
from homeassistant.components.switch import DOMAIN as SWITCH_DOMAIN
DOMAIN = "zwave_mqtt"
DOMAIN = "ozw"
DATA_UNSUBSCRIBE = "unsubscribe"
PLATFORMS = [BINARY_SENSOR_DOMAIN, LIGHT_DOMAIN, SENSOR_DOMAIN, SWITCH_DOMAIN]

View File

@@ -265,15 +265,21 @@ class ZWaveDeviceEntity(Entity):
def create_device_name(node: OZWNode):
"""Generate sensible (short) default device name from a OZWNode."""
if node.meta_data["Name"]:
dev_name = node.meta_data["Name"]
elif node.node_product_name:
dev_name = node.node_product_name
elif node.node_device_type_string:
dev_name = node.node_device_type_string
else:
dev_name = node.specific_string
return dev_name
# Prefer custom name set by OZWAdmin if present
if node.node_name:
return node.node_name
# Prefer short devicename from metadata if present
if node.meta_data and node.meta_data.get("Name"):
return node.meta_data["Name"]
# Fallback to productname or devicetype strings
if node.node_product_name:
return node.node_product_name
if node.node_device_type_string:
return node.node_device_type_string
if node.node_specific_string:
return node.node_specific_string
# Last resort: use Node id (should never happen, but just in case)
return f"Node {node.id}"
def create_device_id(node: OZWNode, node_instance: int = 1):

View File

@@ -0,0 +1,9 @@
{
"domain": "ozw",
"name": "OpenZWave (beta)",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/ozw",
"requirements": ["python-openzwave-mqtt==1.0.1"],
"after_dependencies": ["mqtt"],
"codeowners": ["@cgarwood", "@marcelveldt", "@MartinHjelmare"]
}

View File

@@ -1,5 +1,4 @@
{
"title": "Z-Wave over MQTT",
"config": {
"step": {
"user": {

View File

@@ -9,6 +9,5 @@
"title": "Confirmaci\u00f3 de configuraci\u00f3"
}
}
},
"title": "Z-Wave sobre MQTT"
}
}

View File

@@ -9,6 +9,5 @@
"title": "Einrichtung best\u00e4tigen"
}
}
},
"title": "Z-Wave \u00fcber MQTT"
}
}

View File

@@ -9,6 +9,5 @@
"title": "Confirm set up"
}
}
},
"title": "Z-Wave over MQTT"
}
}

View File

@@ -9,6 +9,5 @@
"title": "Confirmar configuraci\u00f3n"
}
}
},
"title": "Z-Wave sobre MQTT"
}
}

View File

@@ -9,6 +9,5 @@
"title": "Confirmer la configuration"
}
}
},
"title": "Z-Wave sur MQTT"
}
}

View File

@@ -9,6 +9,5 @@
"title": "Confermare la configurazione"
}
}
},
"title": "Z-Wave su MQTT"
}
}

View File

@@ -9,6 +9,5 @@
"title": "\uc124\uc815 \ub0b4\uc6a9 \ud655\uc778\ud558\uae30"
}
}
},
"title": "MQTT \ub97c \ud1b5\ud55c Z-Wave"
}
}

View File

@@ -9,6 +9,5 @@
"title": "Installatioun konfirm\u00e9ieren"
}
}
},
"title": "Z-Wave iwwer MQTT"
}
}

View File

@@ -9,6 +9,5 @@
"title": "Bekreft oppsett"
}
}
},
"title": ""
}
}

View File

@@ -9,6 +9,5 @@
"title": "Potwierd\u017a konfiguracj\u0119"
}
}
},
"title": "Z-Wave poprzez MQTT"
}
}

View File

@@ -9,6 +9,5 @@
"title": "\u041f\u043e\u0434\u0442\u0432\u0435\u0440\u0436\u0434\u0435\u043d\u0438\u0435 \u043d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0438"
}
}
},
"title": "Z-Wave \u0447\u0435\u0440\u0435\u0437 MQTT"
}
}

View File

@@ -9,6 +9,5 @@
"title": "Potrdite nastavitev"
}
}
},
"title": "Z-Wave \u010dez MQTT"
}
}

View File

@@ -5,6 +5,5 @@
"title": "Bekr\u00e4fta inst\u00e4llningen"
}
}
},
"title": "Z-Wave \u00f6ver MQTT"
}
}

View File

@@ -9,6 +9,5 @@
"title": "\u78ba\u8a8d\u8a2d\u5b9a"
}
}
},
"title": "Z-Wave over MQTT"
}
}

View File

@@ -3,5 +3,5 @@
"name": "Plugwise Anna",
"documentation": "https://www.home-assistant.io/integrations/plugwise",
"codeowners": ["@laetificat", "@CoMPaTech", "@bouwew"],
"requirements": ["haanna==0.14.3"]
"requirements": ["haanna==0.15.0"]
}

View File

@@ -8,7 +8,7 @@
},
"auth": {
"title": "Authenticate Point",
"description": "Please follow the link below and <b>Accept</b> access to your Minut account, then come back and press <b>Submit</b> below.\n\n[Link]({authorization_url})"
"description": "Please follow the link below and **Accept** access to your Minut account, then come back and press **Submit** below.\n\n[Link]({authorization_url})"
}
},
"create_entry": {

View File

@@ -31,6 +31,7 @@ _LOGGER = logging.getLogger(__name__)
ATTR_CLEANING_TIME = "cleaning_time"
ATTR_CLEANED_AREA = "cleaned_area"
ATTR_ERROR = "error"
ATTR_ERROR_CODE = "error_code"
ATTR_POSITION = "position"
ATTR_SOFTWARE_VERSION = "software_version"
@@ -174,11 +175,6 @@ class IRobotVacuum(IRobotEntity, StateVacuumEntity):
# Roomba software version
software_version = state.get("softwareVer")
# Error message in plain english
error_msg = "None"
if hasattr(self.vacuum, "error_message"):
error_msg = self.vacuum.error_message
# Set properties that are to appear in the GUI
state_attrs = {ATTR_SOFTWARE_VERSION: software_version}
@@ -198,9 +194,10 @@ class IRobotVacuum(IRobotEntity, StateVacuumEntity):
state_attrs[ATTR_CLEANING_TIME] = cleaning_time
state_attrs[ATTR_CLEANED_AREA] = cleaned_area
# Skip error attr if there is none
if error_msg and error_msg != "None":
state_attrs[ATTR_ERROR] = error_msg
# Error
if self.vacuum.error_code != 0:
state_attrs[ATTR_ERROR] = self.vacuum.error_message
state_attrs[ATTR_ERROR_CODE] = self.vacuum.error_code
# Not all Roombas expose position data
# https://github.com/koalazak/dorita980/issues/48

View File

@@ -3,7 +3,7 @@
"name": "iRobot Roomba",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/roomba",
"requirements": ["roombapy==1.5.3"],
"requirements": ["roombapy==1.6.1"],
"dependencies": [],
"codeowners": ["@pschmitt", "@cyr-ius", "@shenxn"]
}

View File

@@ -3,7 +3,7 @@
"name": "Sonos",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/sonos",
"requirements": ["pysonos==0.0.28"],
"requirements": ["pysonos==0.0.30"],
"ssdp": [
{
"st": "urn:schemas-upnp-org:device:ZonePlayer:1"

View File

@@ -3,7 +3,7 @@
"step": {
"auth_app": {
"title": "Application credentials",
"description": "Application ID and secret code from <a href=\"https://my.starline.ru/developer\" target=\"_blank\">StarLine developer account</a>",
"description": "Application ID and secret code from [StarLine developer account](https://my.starline.ru/developer)",
"data": {
"app_id": "App ID",
"app_secret": "Secret"

View File

@@ -2,7 +2,7 @@
"domain": "synology_dsm",
"name": "Synology DSM",
"documentation": "https://www.home-assistant.io/integrations/synology_dsm",
"requirements": ["python-synology==0.8.0"],
"requirements": ["python-synology==0.8.1"],
"codeowners": ["@ProtoThis", "@Quentame"],
"config_flow": true,
"ssdp": [

View File

@@ -14,7 +14,7 @@ from .const import (
ATTR_COMMAND,
ATTR_RATE,
DOMAIN,
EVENT_UPB_LINK_CHANGED,
EVENT_UPB_SCENE_CHANGED,
)
UPB_PLATFORMS = ["light", "scene"]
@@ -49,7 +49,7 @@ async def async_setup_entry(hass, config_entry):
return
hass.bus.async_fire(
EVENT_UPB_LINK_CHANGED,
EVENT_UPB_SCENE_CHANGED,
{
ATTR_COMMAND: change["command"],
ATTR_ADDRESS: element.addr.index,

View File

@@ -13,7 +13,7 @@ ATTR_BRIGHTNESS_PCT = "brightness_pct"
ATTR_COMMAND = "command"
ATTR_RATE = "rate"
CONF_NETWORK = "network"
EVENT_UPB_LINK_CHANGED = "upb.link_changed"
EVENT_UPB_SCENE_CHANGED = "upb.scene_changed"
VALID_BRIGHTNESS = vol.All(vol.Coerce(int), vol.Clamp(min=0, max=255))
VALID_BRIGHTNESS_PCT = vol.All(vol.Coerce(float), vol.Range(min=0, max=100))

View File

@@ -134,6 +134,14 @@ class UpnpFlowHandler(config_entries.ConfigFlow, domain=DOMAIN):
"""
_LOGGER.debug("async_step_ssdp: discovery_info: %s", discovery_info)
# Ensure complete discovery.
if (
ssdp.ATTR_UPNP_UDN not in discovery_info
or ssdp.ATTR_SSDP_ST not in discovery_info
):
_LOGGER.debug("Incomplete discovery, ignoring")
return self.async_abort(reason="incomplete_discovery")
# Ensure not already configuring/configured.
udn = discovery_info[ssdp.ATTR_UPNP_UDN]
st = discovery_info[ssdp.ATTR_SSDP_ST] # pylint: disable=invalid-name
@@ -218,6 +226,7 @@ class UpnpOptionsFlowHandler(config_entries.OptionsFlow):
CONFIG_ENTRY_SCAN_INTERVAL, DEFAULT_SCAN_INTERVAL
)
update_interval = timedelta(seconds=update_interval_sec)
_LOGGER.debug("Updating coordinator, update_interval: %s", update_interval)
coordinator.update_interval = update_interval
return self.async_create_entry(title="", data=user_input)

View File

@@ -17,7 +17,8 @@
"abort": {
"already_configured": "UPnP/IGD is already configured",
"no_devices_discovered": "No UPnP/IGDs discovered",
"no_devices_found": "No UPnP/IGD devices found on the network."
"no_devices_found": "No UPnP/IGD devices found on the network.",
"incomplete_discovery": "Incomplete discovery"
}
}
}

View File

@@ -178,6 +178,11 @@ def setup(hass, config):
return
service_info = zeroconf.get_service_info(service_type, name)
if not service_info:
# Prevent the browser thread from collapsing as
# service_info can be None
return
info = info_from_service(service_info)
_LOGGER.debug("Discovered new device %s %s", name, info)
@@ -196,7 +201,8 @@ def setup(hass, config):
and HOMEKIT_PAIRED_STATUS_FLAG in info[HOMEKIT_PROPERTIES]
):
try:
if not int(info[HOMEKIT_PROPERTIES][HOMEKIT_PAIRED_STATUS_FLAG]):
# 0 means paired and not discoverable by iOS clients)
if int(info[HOMEKIT_PROPERTIES][HOMEKIT_PAIRED_STATUS_FLAG]):
return
except ValueError:
# HomeKit pairing status unknown

View File

@@ -44,6 +44,8 @@ ZHA_CONFIG_SCHEMA = {
),
vol.Optional(CONF_ENABLE_QUIRKS, default=True): cv.boolean,
vol.Optional(CONF_ZIGPY): dict,
vol.Optional(CONF_RADIO_TYPE): cv.enum(RadioType),
vol.Optional(CONF_USB_PATH): cv.string,
}
CONFIG_SCHEMA = vol.Schema(
{

View File

@@ -4,12 +4,12 @@
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/zha",
"requirements": [
"bellows==0.16.1",
"bellows==0.16.2",
"pyserial==3.4",
"zha-quirks==0.0.39",
"zigpy-cc==0.4.2",
"zigpy-deconz==0.9.2",
"zigpy==0.20.3",
"zigpy==0.20.4",
"zigpy-xbee==0.12.1",
"zigpy-zigate==0.6.1"
],

View File

@@ -56,6 +56,7 @@ DISCOVERY_SCHEMAS = [
const.DISC_SPECIFIC_DEVICE_CLASS: [
const.SPECIFIC_TYPE_THERMOSTAT_HEATING,
const.SPECIFIC_TYPE_SETPOINT_THERMOSTAT,
const.SPECIFIC_TYPE_NOT_USED,
],
const.DISC_VALUES: dict(
DEFAULT_VALUES_SCHEMA,

View File

@@ -1,17 +0,0 @@
{
"domain": "zwave_mqtt",
"name": "Z-Wave over MQTT",
"config_flow": true,
"documentation": "https://www.home-assistant.io/integrations/zwave_mqtt",
"requirements": [
"python-openzwave-mqtt==1.0.1"
],
"after_dependencies": [
"mqtt"
],
"codeowners": [
"@cgarwood",
"@marcelveldt",
"@MartinHjelmare"
]
}

View File

@@ -1,7 +1,7 @@
"""Constants used by Home Assistant components."""
MAJOR_VERSION = 0
MINOR_VERSION = 110
PATCH_VERSION = "0b0"
PATCH_VERSION = "0b5"
__short_version__ = f"{MAJOR_VERSION}.{MINOR_VERSION}"
__version__ = f"{__short_version__}.{PATCH_VERSION}"
REQUIRED_PYTHON_VER = (3, 7, 0)

View File

@@ -1454,10 +1454,6 @@ class Config:
)
data = await store.async_load()
if data and "external_url" in data:
self._update(source=SOURCE_STORAGE, **data)
return
async def migrate_base_url(_: Event) -> None:
"""Migrate base_url to internal_url/external_url."""
if self.hass.config.api is None:
@@ -1484,11 +1480,24 @@ class Config:
external_url=network.normalize_url(str(base_url))
)
# Try to migrate base_url to internal_url/external_url
self.hass.bus.async_listen_once(EVENT_HOMEASSISTANT_START, migrate_base_url)
if data:
self._update(source=SOURCE_STORAGE, **data)
# Try to migrate base_url to internal_url/external_url
if "external_url" not in data:
self.hass.bus.async_listen_once(
EVENT_HOMEASSISTANT_START, migrate_base_url
)
self._update(
source=SOURCE_STORAGE,
latitude=data.get("latitude"),
longitude=data.get("longitude"),
elevation=data.get("elevation"),
unit_system=data.get("unit_system"),
location_name=data.get("location_name"),
time_zone=data.get("time_zone"),
external_url=data.get("external_url", _UNDEF),
internal_url=data.get("internal_url", _UNDEF),
)
async def async_store(self) -> None:
"""Store [homeassistant] core config."""

View File

@@ -89,6 +89,7 @@ FLOWS = [
"met",
"meteo_france",
"mikrotik",
"mill",
"minecraft_server",
"mobile_app",
"monoprice",
@@ -106,6 +107,7 @@ FLOWS = [
"opentherm_gw",
"openuv",
"owntracks",
"ozw",
"panasonic_viera",
"pi_hole",
"plaato",
@@ -164,6 +166,5 @@ FLOWS = [
"xiaomi_miio",
"zerproc",
"zha",
"zwave",
"zwave_mqtt"
"zwave"
]

View File

@@ -465,6 +465,15 @@ def string(value: Any) -> str:
return str(value)
def string_with_no_html(value: Any) -> str:
"""Validate that the value is a string without HTML."""
value = string(value)
regex = re.compile(r"<[a-z][\s\S]*>")
if regex.search(value):
raise vol.Invalid("the string should not contain HTML")
return str(value)
def temperature_unit(value: Any) -> str:
"""Validate and transform temperature unit."""
value = str(value).upper()
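
The new validator above is what the translation HTML check (#35615) wires into the hassfest strings schema further down. A standalone copy shows how it treats the markdown replacements made elsewhere in this changeset versus the old HTML tags:

import re

import voluptuous as vol


def string_with_no_html(value):
    """Copy of the validator above so the example runs standalone."""
    value = str(value)
    if re.compile(r"<[a-z][\s\S]*>").search(value):
        raise vol.Invalid("the string should not contain HTML")
    return value


string_with_no_html("press **Submit** below")     # returns the string: markdown bold is fine
string_with_no_html("press <b>Submit</b> below")  # raises vol.Invalid("the string should not contain HTML")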

View File

@@ -12,7 +12,7 @@ cryptography==2.9.2
defusedxml==0.6.0
distro==1.5.0
hass-nabucasa==0.34.2
home-assistant-frontend==20200513.0
home-assistant-frontend==20200519.0
importlib-metadata==1.6.0
jinja2>=2.11.1
netdisco==2.6.0

View File

@@ -333,7 +333,7 @@ beautifulsoup4==4.9.0
beewi_smartclim==0.0.7
# homeassistant.components.zha
bellows==0.16.1
bellows==0.16.2
# homeassistant.components.bmw_connected_drive
bimmer_connected==0.7.5
@@ -692,7 +692,7 @@ ha-ffmpeg==2.0
ha-philipsjs==0.0.8
# homeassistant.components.plugwise
haanna==0.14.3
haanna==0.15.0
# homeassistant.components.habitica
habitipy==0.2.0
@@ -731,7 +731,7 @@ hole==0.5.1
holidays==0.10.2
# homeassistant.components.frontend
home-assistant-frontend==20200513.0
home-assistant-frontend==20200519.0
# homeassistant.components.zwave
homeassistant-pyozw==0.1.10
@@ -750,7 +750,7 @@ horimote==0.4.1
httplib2==0.10.3
# homeassistant.components.huawei_lte
huawei-lte-api==1.4.11
huawei-lte-api==1.4.12
# homeassistant.components.hydrawise
hydrawiser==0.1.1
@@ -997,7 +997,7 @@ onkyo-eiscp==1.2.7
onvif-zeep-async==0.3.0
# homeassistant.components.opengarage
open-garage==0.1.3
open-garage==0.1.4
# homeassistant.components.opencv
# opencv-python-headless==4.2.0.32
@@ -1212,7 +1212,7 @@ pyalmond==0.0.2
pyarlo==0.2.3
# homeassistant.components.atag
pyatag==0.3.1.1
pyatag==0.3.1.2
# homeassistant.components.netatmo
pyatmo==3.3.1
@@ -1263,7 +1263,7 @@ pycsspeechtts==1.0.3
# pycups==1.9.73
# homeassistant.components.daikin
pydaikin==2.0.1
pydaikin==2.0.2
# homeassistant.components.danfoss_air
pydanfossair==0.1.0
@@ -1615,7 +1615,7 @@ pysnmp==4.4.12
pysoma==0.0.10
# homeassistant.components.sonos
pysonos==0.0.28
pysonos==0.0.30
# homeassistant.components.spc
pyspcwebgw==0.4.0
@@ -1710,7 +1710,7 @@ python-nest==4.1.0
# homeassistant.components.nmap_tracker
python-nmap==0.6.1
# homeassistant.components.zwave_mqtt
# homeassistant.components.ozw
python-openzwave-mqtt==1.0.1
# homeassistant.components.qbittorrent
@@ -1726,7 +1726,7 @@ python-sochain-api==0.0.2
python-songpal==0.12
# homeassistant.components.synology_dsm
python-synology==0.8.0
python-synology==0.8.1
# homeassistant.components.tado
python-tado==0.8.1
@@ -1874,7 +1874,7 @@ rocketchat-API==0.6.1
rokuecp==0.4.0
# homeassistant.components.roomba
roombapy==1.5.3
roombapy==1.6.1
# homeassistant.components.rova
rova==0.1.0
@@ -2263,7 +2263,7 @@ zigpy-xbee==0.12.1
zigpy-zigate==0.6.1
# homeassistant.components.zha
zigpy==0.20.3
zigpy==0.20.4
# homeassistant.components.zoneminder
zm-py==0.4.0

View File

@@ -147,7 +147,7 @@ axis==25
base36==0.1.1
# homeassistant.components.zha
bellows==0.16.1
bellows==0.16.2
# homeassistant.components.blebox
blebox_uniapi==1.3.2
@@ -312,7 +312,7 @@ hole==0.5.1
holidays==0.10.2
# homeassistant.components.frontend
home-assistant-frontend==20200513.0
home-assistant-frontend==20200519.0
# homeassistant.components.zwave
homeassistant-pyozw==0.1.10
@@ -328,7 +328,7 @@ homematicip==0.10.17
httplib2==0.10.3
# homeassistant.components.huawei_lte
huawei-lte-api==1.4.11
huawei-lte-api==1.4.12
# homeassistant.components.iaqualink
iaqualink==0.3.1
@@ -515,7 +515,7 @@ pyalmond==0.0.2
pyarlo==0.2.3
# homeassistant.components.atag
pyatag==0.3.1.1
pyatag==0.3.1.2
# homeassistant.components.netatmo
pyatmo==3.3.1
@@ -533,7 +533,7 @@ pychromecast==5.1.0
pycoolmasternet==0.0.4
# homeassistant.components.daikin
pydaikin==2.0.1
pydaikin==2.0.2
# homeassistant.components.deconz
pydeconz==70
@@ -681,7 +681,7 @@ pysmartthings==0.7.1
pysoma==0.0.10
# homeassistant.components.sonos
pysonos==0.0.28
pysonos==0.0.30
# homeassistant.components.spc
pyspcwebgw==0.4.0
@@ -704,14 +704,14 @@ python-miio==0.5.0.1
# homeassistant.components.nest
python-nest==4.1.0
# homeassistant.components.zwave_mqtt
# homeassistant.components.ozw
python-openzwave-mqtt==1.0.1
# homeassistant.components.songpal
python-songpal==0.12
# homeassistant.components.synology_dsm
python-synology==0.8.0
python-synology==0.8.1
# homeassistant.components.tado
python-tado==0.8.1
@@ -765,7 +765,7 @@ ring_doorbell==0.6.0
rokuecp==0.4.0
# homeassistant.components.roomba
roombapy==1.5.3
roombapy==1.6.1
# homeassistant.components.yamaha
rxv==0.6.0
@@ -918,4 +918,4 @@ zigpy-xbee==0.12.1
zigpy-zigate==0.6.1
# homeassistant.components.zha
zigpy==0.20.3
zigpy==0.20.4

View File

@@ -91,20 +91,20 @@ def gen_data_entry_schema(
"""Generate a data entry schema."""
step_title_class = vol.Required if require_step_title else vol.Optional
schema = {
vol.Optional("flow_title"): str,
vol.Optional("flow_title"): cv.string_with_no_html,
vol.Required("step"): {
str: {
step_title_class("title"): str,
vol.Optional("description"): str,
vol.Optional("data"): {str: str},
step_title_class("title"): cv.string_with_no_html,
vol.Optional("description"): cv.string_with_no_html,
vol.Optional("data"): {str: cv.string_with_no_html},
}
},
vol.Optional("error"): {str: str},
vol.Optional("abort"): {str: str},
vol.Optional("create_entry"): {str: str},
vol.Optional("error"): {str: cv.string_with_no_html},
vol.Optional("abort"): {str: cv.string_with_no_html},
vol.Optional("create_entry"): {str: cv.string_with_no_html},
}
if flow_title == REQUIRED:
schema[vol.Required("title")] = str
schema[vol.Required("title")] = cv.string_with_no_html
elif flow_title == REMOVED:
schema[vol.Optional("title", msg=REMOVED_TITLE_MSG)] = partial(
removed_title_validator, config, integration
@@ -117,7 +117,7 @@ def gen_strings_schema(config: Config, integration: Integration):
"""Generate a strings schema."""
return vol.Schema(
{
vol.Optional("title"): str,
vol.Optional("title"): cv.string_with_no_html,
vol.Optional("config"): gen_data_entry_schema(
config=config,
integration=integration,
@@ -131,10 +131,10 @@ def gen_strings_schema(config: Config, integration: Integration):
require_step_title=False,
),
vol.Optional("device_automation"): {
vol.Optional("action_type"): {str: str},
vol.Optional("condition_type"): {str: str},
vol.Optional("trigger_type"): {str: str},
vol.Optional("trigger_subtype"): {str: str},
vol.Optional("action_type"): {str: cv.string_with_no_html},
vol.Optional("condition_type"): {str: cv.string_with_no_html},
vol.Optional("trigger_type"): {str: cv.string_with_no_html},
vol.Optional("trigger_subtype"): {str: cv.string_with_no_html},
},
vol.Optional("state"): cv.schema_with_slug_keys(
cv.schema_with_slug_keys(str, slug_validator=lowercase_validator),
@@ -203,7 +203,7 @@ def gen_platform_strings_schema(config: Config, integration: Integration):
)
ONBOARDING_SCHEMA = vol.Schema({vol.Required("area"): {str: str}})
ONBOARDING_SCHEMA = vol.Schema({vol.Required("area"): {str: cv.string_with_no_html}})
def validate_translation_file(config: Config, integration: Integration, all_strings):

View File

@@ -6,8 +6,8 @@ from aiohttp import ClientError
from aiohttp.web_exceptions import HTTPForbidden
import pytest
from homeassistant.components.daikin import config_flow
from homeassistant.components.daikin.const import KEY_IP, KEY_MAC
from homeassistant.config_entries import SOURCE_DISCOVERY, SOURCE_IMPORT, SOURCE_USER
from homeassistant.const import CONF_HOST
from homeassistant.data_entry_flow import (
RESULT_TYPE_ABORT,
@@ -22,13 +22,6 @@ MAC = "AABBCCDDEEFF"
HOST = "127.0.0.1"
def init_config_flow(hass):
"""Init a configuration flow."""
flow = config_flow.FlowHandler()
flow.hass = hass
return flow
@pytest.fixture
def mock_daikin():
"""Mock pydaikin."""
@@ -45,13 +38,16 @@ def mock_daikin():
async def test_user(hass, mock_daikin):
"""Test user config."""
flow = init_config_flow(hass)
result = await hass.config_entries.flow.async_init(
"daikin", context={"source": SOURCE_USER},
)
result = await flow.async_step_user()
assert result["type"] == RESULT_TYPE_FORM
assert result["step_id"] == "user"
result = await flow.async_step_user({CONF_HOST: HOST})
result = await hass.config_entries.flow.async_init(
"daikin", context={"source": SOURCE_USER}, data={CONF_HOST: HOST},
)
assert result["type"] == RESULT_TYPE_CREATE_ENTRY
assert result["title"] == HOST
assert result["data"][CONF_HOST] == HOST
@@ -60,34 +56,26 @@ async def test_user(hass, mock_daikin):
async def test_abort_if_already_setup(hass, mock_daikin):
"""Test we abort if Daikin is already setup."""
flow = init_config_flow(hass)
MockConfigEntry(domain="daikin", data={KEY_MAC: MAC}).add_to_hass(hass)
MockConfigEntry(domain="daikin", unique_id=MAC).add_to_hass(hass)
result = await hass.config_entries.flow.async_init(
"daikin", context={"source": SOURCE_USER}, data={CONF_HOST: HOST, KEY_MAC: MAC},
)
result = await flow.async_step_user({CONF_HOST: HOST})
assert result["type"] == RESULT_TYPE_ABORT
assert result["reason"] == "already_configured"
async def test_import(hass, mock_daikin):
"""Test import step."""
flow = init_config_flow(hass)
result = await flow.async_step_import({})
result = await hass.config_entries.flow.async_init(
"daikin", context={"source": SOURCE_IMPORT}, data={},
)
assert result["type"] == RESULT_TYPE_FORM
assert result["step_id"] == "user"
result = await flow.async_step_import({CONF_HOST: HOST})
assert result["type"] == RESULT_TYPE_CREATE_ENTRY
assert result["title"] == HOST
assert result["data"][CONF_HOST] == HOST
assert result["data"][KEY_MAC] == MAC
async def test_discovery(hass, mock_daikin):
"""Test discovery step."""
flow = init_config_flow(hass)
result = await flow.async_step_discovery({KEY_IP: HOST, KEY_MAC: MAC})
result = await hass.config_entries.flow.async_init(
"daikin", context={"source": SOURCE_IMPORT}, data={CONF_HOST: HOST},
)
assert result["type"] == RESULT_TYPE_CREATE_ENTRY
assert result["title"] == HOST
assert result["data"][CONF_HOST] == HOST
@@ -105,10 +93,31 @@ async def test_discovery(hass, mock_daikin):
)
async def test_device_abort(hass, mock_daikin, s_effect, reason):
"""Test device abort."""
flow = init_config_flow(hass)
mock_daikin.factory.side_effect = s_effect
result = await flow.async_step_user({CONF_HOST: HOST})
result = await hass.config_entries.flow.async_init(
"daikin", context={"source": SOURCE_USER}, data={CONF_HOST: HOST, KEY_MAC: MAC},
)
assert result["type"] == RESULT_TYPE_FORM
assert result["errors"] == {"base": reason}
assert result["step_id"] == "user"
@pytest.mark.parametrize(
"source, data, unique_id", [(SOURCE_DISCOVERY, {KEY_IP: HOST, KEY_MAC: MAC}, MAC)],
)
async def test_discovery(hass, mock_daikin, source, data, unique_id):
"""Test discovery/zeroconf step."""
result = await hass.config_entries.flow.async_init(
"daikin", context={"source": source}, data=data,
)
assert result["type"] == RESULT_TYPE_FORM
assert result["step_id"] == "user"
MockConfigEntry(domain="daikin", unique_id=unique_id).add_to_hass(hass)
result = await hass.config_entries.flow.async_init(
"daikin", context={"source": source}, data=data,
)
assert result["type"] == RESULT_TYPE_ABORT
assert result["reason"] == "already_in_progress"

View File

@@ -103,7 +103,7 @@ async def test_zeroconf_updates_title(hass, config_entry):
discovery_info = {
"host": "192.168.1.1",
"port": 23,
"properties": {"mtd-version": 1, "Machine Name": "zeroconf_test"},
"properties": {"mtd-version": "27.0", "Machine Name": "zeroconf_test"},
}
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": SOURCE_ZEROCONF}, data=discovery_info
@@ -129,12 +129,35 @@ async def test_config_flow_no_websocket(hass, config_entry):
async def test_config_flow_zeroconf_invalid(hass):
"""Test that an invalid zeroconf entry doesn't work."""
# test with no discovery properties
discovery_info = {"host": "127.0.0.1", "port": 23}
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": SOURCE_ZEROCONF}, data=discovery_info
) # doesn't create the entry, tries to show form but gets abort
assert result["type"] == data_entry_flow.RESULT_TYPE_ABORT
assert result["reason"] == "not_forked_daapd"
# test with forked-daapd version < 27
discovery_info = {
"host": "127.0.0.1",
"port": 23,
"properties": {"mtd-version": "26.3", "Machine Name": "forked-daapd"},
}
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": SOURCE_ZEROCONF}, data=discovery_info
) # doesn't create the entry, tries to show form but gets abort
assert result["type"] == data_entry_flow.RESULT_TYPE_ABORT
assert result["reason"] == "not_forked_daapd"
# test with verbose mtd-version from Firefly
discovery_info = {
"host": "127.0.0.1",
"port": 23,
"properties": {"mtd-version": "0.2.4.1", "Machine Name": "firefly"},
}
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": SOURCE_ZEROCONF}, data=discovery_info
) # doesn't create the entry, tries to show form but gets abort
assert result["type"] == data_entry_flow.RESULT_TYPE_ABORT
assert result["reason"] == "not_forked_daapd"
async def test_config_flow_zeroconf_valid(hass):
@@ -143,7 +166,7 @@ async def test_config_flow_zeroconf_valid(hass):
"host": "192.168.1.1",
"port": 23,
"properties": {
"mtd-version": 1,
"mtd-version": "27.0",
"Machine Name": "zeroconf_test",
"Machine ID": "5E55EEFF",
},

View File

@@ -111,7 +111,7 @@ SAMPLE_PLAYER_STOPPED = {
"item_progress_ms": 5,
}
SAMPLE_TTS_QUEUE = {
SAMPLE_QUEUE_TTS = {
"version": 833,
"count": 1,
"items": [
@@ -127,11 +127,31 @@ SAMPLE_TTS_QUEUE = {
"length_ms": 0,
"track_number": 1,
"media_kind": "music",
"data_kind": "url",
"uri": "tts_proxy_somefile.mp3",
}
],
}
SAMPLE_QUEUE_PIPE = {
"version": 833,
"count": 1,
"items": [
{
"id": 12322,
"title": "librespot-java",
"artist": "some artist",
"album": "some album",
"album_artist": "The xx",
"length_ms": 0,
"track_number": 1,
"media_kind": "music",
"data_kind": "pipe",
"uri": "pipeuri",
}
],
}
SAMPLE_CONFIG = {
"websocket_port": 3688,
"version": "25.0",
@@ -272,7 +292,7 @@ async def get_request_return_values_fixture():
"config": SAMPLE_CONFIG,
"outputs": SAMPLE_OUTPUTS_ON,
"player": SAMPLE_PLAYER_PAUSED,
"queue": SAMPLE_TTS_QUEUE,
"queue": SAMPLE_QUEUE_TTS,
}
@@ -630,7 +650,9 @@ async def pipe_control_api_object_fixture(
return pipe_control_api.return_value
async def test_librespot_java_stuff(hass, pipe_control_api_object):
async def test_librespot_java_stuff(
hass, get_request_return_values, mock_api_object, pipe_control_api_object
):
"""Test options update and librespot-java stuff."""
state = hass.states.get(TEST_MASTER_ENTITY_NAME)
assert state.attributes[ATTR_INPUT_SOURCE] == "librespot-java (pipe)"
@@ -652,6 +674,13 @@ async def test_librespot_java_stuff(hass, pipe_control_api_object):
)
state = hass.states.get(TEST_MASTER_ENTITY_NAME)
assert state.attributes[ATTR_INPUT_SOURCE] == SOURCE_NAME_DEFAULT
# test pipe getting queued externally changes source
get_request_return_values["queue"] = SAMPLE_QUEUE_PIPE
updater_update = mock_api_object.start_websocket_handler.call_args[0][2]
await updater_update(["queue"])
await hass.async_block_till_done()
state = hass.states.get(TEST_MASTER_ENTITY_NAME)
assert state.attributes[ATTR_INPUT_SOURCE] == "librespot-java (pipe)"
async def test_librespot_java_play_media(hass, pipe_control_api_object):


@@ -28,6 +28,7 @@ from homeassistant.components.homekit.util import (
density_to_air_quality,
dismiss_setup_message,
find_next_available_port,
format_sw_version,
port_is_available,
show_setup_message,
temperature_to_homekit,
@@ -315,3 +316,12 @@ async def test_port_is_available(hass):
assert next_port
assert await hass.async_add_executor_job(port_is_available, next_port)
async def test_format_sw_version():
"""Test format_sw_version method."""
assert format_sw_version("soho+3.6.8+soho-release-rt120+10") == "3.6.8"
assert format_sw_version("undefined-undefined-1.6.8") == "1.6.8"
assert format_sw_version("56.0-76060") == "56.0.76060"
assert format_sw_version(3.6) == "3.6"
assert format_sw_version("unknown") is None
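These assertions spell out the normalization HomeKit needs (MAJOR.MINOR.REVISION). A minimal sketch consistent with them; the real helper lives in homeassistant.components.homekit.util and may be implemented differently:

import re


def format_sw_version(version):
    """Return the first dotted version number, treating '-' as '.' (sketch)."""
    match = re.search(r"\d+\.\d+(\.\d+)?", str(version).replace("-", "."))
    if match:
        return match.group(0)
    return None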


@@ -1,6 +1,9 @@
"""Tests for the IPP integration."""
import os
import aiohttp
from pyipp import IPPConnectionUpgradeRequired, IPPError
from homeassistant.components.ipp.const import CONF_BASE_PATH, CONF_UUID, DOMAIN
from homeassistant.const import (
CONF_HOST,
@@ -18,21 +21,25 @@ from tests.test_util.aiohttp import AiohttpClientMocker
ATTR_HOSTNAME = "hostname"
ATTR_PROPERTIES = "properties"
HOST = "192.168.1.31"
PORT = 631
BASE_PATH = "/ipp/print"
IPP_ZEROCONF_SERVICE_TYPE = "_ipp._tcp.local."
IPPS_ZEROCONF_SERVICE_TYPE = "_ipps._tcp.local."
ZEROCONF_NAME = "EPSON XP-6000 Series"
ZEROCONF_HOST = "192.168.1.31"
ZEROCONF_HOST = HOST
ZEROCONF_HOSTNAME = "EPSON123456.local."
ZEROCONF_PORT = 631
ZEROCONF_PORT = PORT
ZEROCONF_RP = "ipp/print"
MOCK_USER_INPUT = {
CONF_HOST: "192.168.1.31",
CONF_PORT: 361,
CONF_HOST: HOST,
CONF_PORT: PORT,
CONF_SSL: False,
CONF_VERIFY_SSL: False,
CONF_BASE_PATH: "/ipp/print",
CONF_BASE_PATH: BASE_PATH,
}
MOCK_ZEROCONF_IPP_SERVICE_INFO = {
@@ -41,7 +48,7 @@ MOCK_ZEROCONF_IPP_SERVICE_INFO = {
CONF_HOST: ZEROCONF_HOST,
ATTR_HOSTNAME: ZEROCONF_HOSTNAME,
CONF_PORT: ZEROCONF_PORT,
ATTR_PROPERTIES: {"rp": "ipp/print"},
ATTR_PROPERTIES: {"rp": ZEROCONF_RP},
}
MOCK_ZEROCONF_IPPS_SERVICE_INFO = {
@@ -50,7 +57,7 @@ MOCK_ZEROCONF_IPPS_SERVICE_INFO = {
CONF_HOST: ZEROCONF_HOST,
ATTR_HOSTNAME: ZEROCONF_HOSTNAME,
CONF_PORT: ZEROCONF_PORT,
ATTR_PROPERTIES: {"rp": "ipp/print"},
ATTR_PROPERTIES: {"rp": ZEROCONF_RP},
}
@@ -61,30 +68,75 @@ def load_fixture_binary(filename):
return fptr.read()
def mock_connection(
aioclient_mock: AiohttpClientMocker,
host: str = HOST,
port: int = PORT,
ssl: bool = False,
base_path: str = BASE_PATH,
conn_error: bool = False,
conn_upgrade_error: bool = False,
ipp_error: bool = False,
no_unique_id: bool = False,
parse_error: bool = False,
version_not_supported: bool = False,
):
"""Mock the IPP connection."""
scheme = "https" if ssl else "http"
ipp_url = f"{scheme}://{host}:{port}"
if ipp_error:
aioclient_mock.post(f"{ipp_url}{base_path}", exc=IPPError)
return
if conn_error:
aioclient_mock.post(f"{ipp_url}{base_path}", exc=aiohttp.ClientError)
return
if conn_upgrade_error:
aioclient_mock.post(f"{ipp_url}{base_path}", exc=IPPConnectionUpgradeRequired)
return
fixture = "ipp/get-printer-attributes.bin"
if no_unique_id:
fixture = "ipp/get-printer-attributes-success-nodata.bin"
elif version_not_supported:
fixture = "ipp/get-printer-attributes-error-0x0503.bin"
if parse_error:
content = "BAD"
else:
content = load_fixture_binary(fixture)
aioclient_mock.post(
f"{ipp_url}{base_path}",
content=content,
headers={"Content-Type": "application/ipp"},
)
async def init_integration(
hass: HomeAssistant,
aioclient_mock: AiohttpClientMocker,
skip_setup: bool = False,
host: str = HOST,
port: int = PORT,
ssl: bool = False,
base_path: str = BASE_PATH,
uuid: str = "cfe92100-67c4-11d4-a45f-f8d027761251",
unique_id: str = "cfe92100-67c4-11d4-a45f-f8d027761251",
conn_error: bool = False,
) -> MockConfigEntry:
"""Set up the IPP integration in Home Assistant."""
fixture = "ipp/get-printer-attributes.bin"
aioclient_mock.post(
"http://192.168.1.31:631/ipp/print",
content=load_fixture_binary(fixture),
headers={"Content-Type": "application/ipp"},
)
entry = MockConfigEntry(
domain=DOMAIN,
unique_id=unique_id,
data={
CONF_HOST: "192.168.1.31",
CONF_PORT: 631,
CONF_SSL: False,
CONF_HOST: host,
CONF_PORT: port,
CONF_SSL: ssl,
CONF_VERIFY_SSL: True,
CONF_BASE_PATH: "/ipp/print",
CONF_BASE_PATH: base_path,
CONF_UUID: uuid,
},
)
@@ -92,6 +144,14 @@ async def init_integration(
entry.add_to_hass(hass)
if not skip_setup:
mock_connection(
aioclient_mock,
host=host,
port=port,
ssl=ssl,
base_path=base_path,
conn_error=conn_error,
)
await hass.config_entries.async_setup(entry.entry_id)
await hass.async_block_till_done()
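With the request mocking centralized in mock_connection and init_integration, the tests further down only pick a failure mode; both calls below appear verbatim in those tests (inside async test functions):

# Simulate an IPP protocol error before starting a config flow ...
mock_connection(aioclient_mock, ipp_error=True)

# ... or set the integration up against a failing connection and expect a retry.
entry = await init_integration(hass, aioclient_mock, conn_error=True)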


@@ -1,7 +1,4 @@
"""Tests for the IPP config flow."""
import aiohttp
from pyipp import IPPConnectionUpgradeRequired, IPPError
from homeassistant.components.ipp.const import CONF_BASE_PATH, CONF_UUID, DOMAIN
from homeassistant.config_entries import SOURCE_USER, SOURCE_ZEROCONF
from homeassistant.const import CONF_HOST, CONF_NAME, CONF_SSL
@@ -17,7 +14,7 @@ from . import (
MOCK_ZEROCONF_IPP_SERVICE_INFO,
MOCK_ZEROCONF_IPPS_SERVICE_INFO,
init_integration,
load_fixture_binary,
mock_connection,
)
from tests.test_util.aiohttp import AiohttpClientMocker
@@ -37,11 +34,7 @@ async def test_show_zeroconf_form(
hass: HomeAssistant, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test that the zeroconf confirmation form is served."""
aioclient_mock.post(
"http://192.168.1.31:631/ipp/print",
content=load_fixture_binary("ipp/get-printer-attributes.bin"),
headers={"Content-Type": "application/ipp"},
)
mock_connection(aioclient_mock)
discovery_info = MOCK_ZEROCONF_IPP_SERVICE_INFO.copy()
result = await hass.config_entries.flow.async_init(
@@ -57,7 +50,7 @@ async def test_connection_error(
hass: HomeAssistant, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test we show user form on IPP connection error."""
aioclient_mock.post("http://192.168.1.31:631/ipp/print", exc=aiohttp.ClientError)
mock_connection(aioclient_mock, conn_error=True)
user_input = MOCK_USER_INPUT.copy()
result = await hass.config_entries.flow.async_init(
@@ -73,7 +66,7 @@ async def test_zeroconf_connection_error(
hass: HomeAssistant, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test we abort zeroconf flow on IPP connection error."""
aioclient_mock.post("http://192.168.1.31:631/ipp/print", exc=aiohttp.ClientError)
mock_connection(aioclient_mock, conn_error=True)
discovery_info = MOCK_ZEROCONF_IPP_SERVICE_INFO.copy()
result = await hass.config_entries.flow.async_init(
@@ -88,7 +81,7 @@ async def test_zeroconf_confirm_connection_error(
hass: HomeAssistant, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test we abort zeroconf flow on IPP connection error."""
aioclient_mock.post("http://192.168.1.31:631/ipp/print", exc=aiohttp.ClientError)
mock_connection(aioclient_mock, conn_error=True)
discovery_info = MOCK_ZEROCONF_IPP_SERVICE_INFO.copy()
result = await hass.config_entries.flow.async_init(
@@ -103,9 +96,7 @@ async def test_user_connection_upgrade_required(
hass: HomeAssistant, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test we show the user form if connection upgrade required by server."""
aioclient_mock.post(
"http://192.168.1.31:631/ipp/print", exc=IPPConnectionUpgradeRequired
)
mock_connection(aioclient_mock, conn_upgrade_error=True)
user_input = MOCK_USER_INPUT.copy()
result = await hass.config_entries.flow.async_init(
@@ -121,9 +112,7 @@ async def test_zeroconf_connection_upgrade_required(
hass: HomeAssistant, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test we abort zeroconf flow on IPP connection error."""
aioclient_mock.post(
"http://192.168.1.31:631/ipp/print", exc=IPPConnectionUpgradeRequired
)
mock_connection(aioclient_mock, conn_upgrade_error=True)
discovery_info = MOCK_ZEROCONF_IPP_SERVICE_INFO.copy()
result = await hass.config_entries.flow.async_init(
@@ -138,11 +127,7 @@ async def test_user_parse_error(
hass: HomeAssistant, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test we abort user flow on IPP parse error."""
aioclient_mock.post(
"http://192.168.1.31:631/ipp/print",
content="BAD",
headers={"Content-Type": "application/ipp"},
)
mock_connection(aioclient_mock, parse_error=True)
user_input = MOCK_USER_INPUT.copy()
result = await hass.config_entries.flow.async_init(
@@ -157,11 +142,7 @@ async def test_zeroconf_parse_error(
hass: HomeAssistant, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test we abort zeroconf flow on IPP parse error."""
aioclient_mock.post(
"http://192.168.1.31:631/ipp/print",
content="BAD",
headers={"Content-Type": "application/ipp"},
)
mock_connection(aioclient_mock, parse_error=True)
discovery_info = MOCK_ZEROCONF_IPP_SERVICE_INFO.copy()
result = await hass.config_entries.flow.async_init(
@@ -176,7 +157,7 @@ async def test_user_ipp_error(
hass: HomeAssistant, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test we abort the user flow on IPP error."""
aioclient_mock.post("http://192.168.1.31:631/ipp/print", exc=IPPError)
mock_connection(aioclient_mock, ipp_error=True)
user_input = MOCK_USER_INPUT.copy()
result = await hass.config_entries.flow.async_init(
@@ -191,7 +172,7 @@ async def test_zeroconf_ipp_error(
hass: HomeAssistant, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test we abort zeroconf flow on IPP error."""
aioclient_mock.post("http://192.168.1.31:631/ipp/print", exc=IPPError)
mock_connection(aioclient_mock, ipp_error=True)
discovery_info = MOCK_ZEROCONF_IPP_SERVICE_INFO.copy()
result = await hass.config_entries.flow.async_init(
@@ -206,11 +187,7 @@ async def test_user_ipp_version_error(
hass: HomeAssistant, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test we abort user flow on IPP version not supported error."""
aioclient_mock.post(
"http://192.168.1.31:631/ipp/print",
content=load_fixture_binary("ipp/get-printer-attributes-error-0x0503.bin"),
headers={"Content-Type": "application/ipp"},
)
mock_connection(aioclient_mock, version_not_supported=True)
user_input = {**MOCK_USER_INPUT}
result = await hass.config_entries.flow.async_init(
@@ -225,11 +202,7 @@ async def test_zeroconf_ipp_version_error(
hass: HomeAssistant, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test we abort zeroconf flow on IPP version not supported error."""
aioclient_mock.post(
"http://192.168.1.31:631/ipp/print",
content=load_fixture_binary("ipp/get-printer-attributes-error-0x0503.bin"),
headers={"Content-Type": "application/ipp"},
)
mock_connection(aioclient_mock, version_not_supported=True)
discovery_info = {**MOCK_ZEROCONF_IPP_SERVICE_INFO}
result = await hass.config_entries.flow.async_init(
@@ -291,15 +264,26 @@ async def test_zeroconf_with_uuid_device_exists_abort(
assert result["reason"] == "already_configured"
async def test_zeroconf_unique_id_required_abort(
hass: HomeAssistant, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test we abort zeroconf flow if printer lacks unique identification."""
mock_connection(aioclient_mock, no_unique_id=True)
discovery_info = MOCK_ZEROCONF_IPP_SERVICE_INFO.copy()
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": SOURCE_ZEROCONF}, data=discovery_info,
)
assert result["type"] == RESULT_TYPE_ABORT
assert result["reason"] == "unique_id_required"
async def test_full_user_flow_implementation(
hass: HomeAssistant, aioclient_mock
) -> None:
"""Test the full manual user flow from start to finish."""
aioclient_mock.post(
"http://192.168.1.31:631/ipp/print",
content=load_fixture_binary("ipp/get-printer-attributes.bin"),
headers={"Content-Type": "application/ipp"},
)
mock_connection(aioclient_mock)
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": SOURCE_USER},
@@ -328,11 +312,7 @@ async def test_full_zeroconf_flow_implementation(
hass: HomeAssistant, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test the full manual user flow from start to finish."""
aioclient_mock.post(
"http://192.168.1.31:631/ipp/print",
content=load_fixture_binary("ipp/get-printer-attributes.bin"),
headers={"Content-Type": "application/ipp"},
)
mock_connection(aioclient_mock)
discovery_info = MOCK_ZEROCONF_IPP_SERVICE_INFO.copy()
result = await hass.config_entries.flow.async_init(
@@ -363,11 +343,7 @@ async def test_full_zeroconf_tls_flow_implementation(
hass: HomeAssistant, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test the full manual user flow from start to finish."""
aioclient_mock.post(
"https://192.168.1.31:631/ipp/print",
content=load_fixture_binary("ipp/get-printer-attributes.bin"),
headers={"Content-Type": "application/ipp"},
)
mock_connection(aioclient_mock, ssl=True)
discovery_info = MOCK_ZEROCONF_IPPS_SERVICE_INFO.copy()
result = await hass.config_entries.flow.async_init(


@@ -1,6 +1,4 @@
"""Tests for the IPP integration."""
import aiohttp
from homeassistant.components.ipp.const import DOMAIN
from homeassistant.config_entries import (
ENTRY_STATE_LOADED,
@@ -17,9 +15,7 @@ async def test_config_entry_not_ready(
hass: HomeAssistant, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test the IPP configuration entry not ready."""
aioclient_mock.post("http://192.168.1.31:631/ipp/print", exc=aiohttp.ClientError)
entry = await init_integration(hass, aioclient_mock)
entry = await init_integration(hass, aioclient_mock, conn_error=True)
assert entry.state == ENTRY_STATE_SETUP_RETRY


@@ -8,7 +8,7 @@ from homeassistant.core import HomeAssistant
from homeassistant.util import dt as dt_util
from tests.async_mock import patch
from tests.components.ipp import init_integration
from tests.components.ipp import init_integration, mock_connection
from tests.test_util.aiohttp import AiohttpClientMocker
@@ -16,6 +16,8 @@ async def test_sensors(
hass: HomeAssistant, aioclient_mock: AiohttpClientMocker
) -> None:
"""Test the creation and values of the IPP sensors."""
mock_connection(aioclient_mock)
entry = await init_integration(hass, aioclient_mock, skip_setup=True)
registry = await hass.helpers.entity_registry.async_get_registry()


@@ -0,0 +1 @@
"""Tests for the OZW integration."""


@@ -2,14 +2,14 @@
import json
from homeassistant import config_entries
from homeassistant.components.zwave_mqtt.const import DOMAIN
from homeassistant.components.ozw.const import DOMAIN
from tests.async_mock import Mock, patch
from tests.common import MockConfigEntry
async def setup_zwave(hass, entry=None, fixture=None):
"""Set up Z-Wave and load a dump."""
async def setup_ozw(hass, entry=None, fixture=None):
"""Set up OZW and load a dump."""
hass.config.components.add("mqtt")
if entry is None:
@@ -26,7 +26,7 @@ async def setup_zwave(hass, entry=None, fixture=None):
assert await hass.config_entries.async_setup(entry.entry_id)
await hass.async_block_till_done()
assert "zwave_mqtt" in hass.config.components
assert "ozw" in hass.config.components
assert len(mock_subscribe.mock_calls) == 1
receive_message = mock_subscribe.mock_calls[0][1][2]


@@ -12,13 +12,13 @@ from tests.common import load_fixture
@pytest.fixture(name="generic_data", scope="session")
def generic_data_fixture():
"""Load generic MQTT data and return it."""
return load_fixture(f"zwave_mqtt/generic_network_dump.csv")
return load_fixture("ozw/generic_network_dump.csv")
@pytest.fixture(name="light_data", scope="session")
def light_data_fixture():
"""Load light dimmer MQTT data and return it."""
return load_fixture(f"zwave_mqtt/light_network_dump.csv")
return load_fixture("ozw/light_network_dump.csv")
@pytest.fixture(name="sent_messages")
@@ -39,7 +39,7 @@ def sent_messages_fixture():
async def light_msg_fixture(hass):
"""Return a mock MQTT msg with a light actuator message."""
light_json = json.loads(
await hass.async_add_executor_job(load_fixture, "zwave_mqtt/light.json")
await hass.async_add_executor_job(load_fixture, "ozw/light.json")
)
message = MQTTMessage(topic=light_json["topic"], payload=light_json["payload"])
message.encode()
@@ -50,7 +50,7 @@ async def light_msg_fixture(hass):
async def switch_msg_fixture(hass):
"""Return a mock MQTT msg with a switch actuator message."""
switch_json = json.loads(
await hass.async_add_executor_job(load_fixture, "zwave_mqtt/switch.json")
await hass.async_add_executor_job(load_fixture, "ozw/switch.json")
)
message = MQTTMessage(topic=switch_json["topic"], payload=switch_json["payload"])
message.encode()
@@ -61,7 +61,7 @@ async def switch_msg_fixture(hass):
async def sensor_msg_fixture(hass):
"""Return a mock MQTT msg with a sensor change message."""
sensor_json = json.loads(
await hass.async_add_executor_job(load_fixture, "zwave_mqtt/sensor.json")
await hass.async_add_executor_job(load_fixture, "ozw/sensor.json")
)
message = MQTTMessage(topic=sensor_json["topic"], payload=sensor_json["payload"])
message.encode()
@@ -72,7 +72,7 @@ async def sensor_msg_fixture(hass):
async def binary_sensor_msg_fixture(hass):
"""Return a mock MQTT msg with a binary_sensor change message."""
sensor_json = json.loads(
await hass.async_add_executor_job(load_fixture, "zwave_mqtt/binary_sensor.json")
await hass.async_add_executor_job(load_fixture, "ozw/binary_sensor.json")
)
message = MQTTMessage(topic=sensor_json["topic"], payload=sensor_json["payload"])
message.encode()
@@ -83,9 +83,7 @@ async def binary_sensor_msg_fixture(hass):
async def binary_sensor_alt_msg_fixture(hass):
"""Return a mock MQTT msg with a binary_sensor change message."""
sensor_json = json.loads(
await hass.async_add_executor_job(
load_fixture, "zwave_mqtt/binary_sensor_alt.json"
)
await hass.async_add_executor_job(load_fixture, "ozw/binary_sensor_alt.json")
)
message = MQTTMessage(topic=sensor_json["topic"], payload=sensor_json["payload"])
message.encode()


@@ -3,15 +3,15 @@ from homeassistant.components.binary_sensor import (
DEVICE_CLASS_MOTION,
DOMAIN as BINARY_SENSOR_DOMAIN,
)
from homeassistant.components.zwave_mqtt.const import DOMAIN
from homeassistant.components.ozw.const import DOMAIN
from homeassistant.const import ATTR_DEVICE_CLASS
from .common import setup_zwave
from .common import setup_ozw
async def test_binary_sensor(hass, generic_data, binary_sensor_msg):
"""Test setting up config entry."""
receive_msg = await setup_zwave(hass, fixture=generic_data)
receive_msg = await setup_ozw(hass, fixture=generic_data)
# Test Legacy sensor (disabled by default)
registry = await hass.helpers.entity_registry.async_get_registry()
@@ -57,7 +57,7 @@ async def test_sensor_enabled(hass, generic_data, binary_sensor_alt_msg):
)
assert entry.disabled is False
receive_msg = await setup_zwave(hass, fixture=generic_data)
receive_msg = await setup_ozw(hass, fixture=generic_data)
receive_msg(binary_sensor_alt_msg)
await hass.async_block_till_done()

Some files were not shown because too many files have changed in this diff.