Compare commits

...

650 Commits

Author SHA1 Message Date
Paulus Schoutsen
207cf18a46 Bump version to 0.97.0b0 2019-07-31 16:19:46 -07:00
Paulus Schoutsen
5961fbb710 Merge remote-tracking branch 'origin/master' into dev 2019-07-31 16:17:17 -07:00
Paulus Schoutsen
37d78af42c Add translations 2019-07-31 16:16:40 -07:00
Paulus Schoutsen
05ecc5a135 Merge pull request #25584 from home-assistant/black
Black
2019-07-31 15:36:42 -07:00
Paulus Schoutsen
dcdbd08d23 Add constraints to key files 2019-07-31 14:49:38 -07:00
Paulus Schoutsen
11a4d36c69 Confirm deletion 2019-07-31 14:49:00 -07:00
Paulus Schoutsen
a4920d3afb Uninstall typing 2019-07-31 14:02:12 -07:00
Paulus Schoutsen
6b2d40327c Make sure typing matches stdlib 2019-07-31 13:46:57 -07:00
Paulus Schoutsen
620cb74050 Type 2019-07-31 13:08:31 -07:00
Paulus Schoutsen
93c0db2328 Lint 2019-07-31 12:53:07 -07:00
Paulus Schoutsen
0ccffc3e55 Lint 2019-07-31 12:46:17 -07:00
Paulus Schoutsen
4de97abc3a Black 2019-07-31 12:25:30 -07:00
Paulus Schoutsen
da05dfe708 Add Black 2019-07-31 12:23:23 -07:00
Ville Skyttä
0490167a12 Azure flake8 dep, docstring fixes (#25605)
* Fix Azure CI flake8 deps

* Docstyle fixes
2019-07-31 12:21:36 -07:00
Ville Skyttä
3cf8964c06 Python < 3.6 remainder cleanups (#25607) 2019-07-31 12:21:15 -07:00
Thomas Lovén
671cb0d092 Return history for entities in the order they were requested (#25560)
* Return history for entities in the order they were requested

* Fix problems. Add tests

* Update __init__.py
2019-07-31 11:59:25 -07:00
Paulus Schoutsen
fcdd66b33b Bump frontend to 20190731.0 2019-07-31 11:22:48 -07:00
cgtobi
4bef2412d2 Netatmo climate refactor (#25457)
* Refactor climate component to use home id rather than name

* Bump pyatmo version

* Add new exception

* Update pyatmo version
2019-07-31 11:13:12 -07:00
Daniyar Yeralin
e1d884a484 Introduce support for color temperature (#25503)
* Introduce support for color temperature

* Fix linter errors

* Remove extra whitespace for pylint
2019-07-31 11:10:52 -07:00
Ross Dargan
5e7465a261 Change how ring polls for changes to allow more platforms to be added (#25534)
* Add in a switch to control lights and sirens

* Improve the way sensors are updated

* fixes following flake8

* remove light platform, and fix breaking test.

* Resolve issues with tests

* add tests for the switch platform

* fix up flake8 errors

* fix the long strings

* fix naming on private method.

* updates following p/r

* further fixes following pr

* removed import

* add additional tests to improve code coverage

* forgot to check this in
2019-07-31 11:08:40 -07:00
ChristianKuehnel
92991b53c4 remove myself from CODEOWNERS (#25593)
* remove myself from CODEOWNERS

Sorry, I do not have time right now to work on HomeAssistant, so I'm removing myself from the CODEOWNERS.

* Update manifest.json
2019-07-31 10:59:56 -07:00
Tobias Perschon
11ebd8546c implemented timeout setting for telnet switch (#25602) 2019-07-31 10:59:12 -07:00
David Bonnes
16a98359c3 Fix bug and bump geniushub client (#25599)
Fix bug, delint and bump client
2019-07-31 18:44:09 +01:00
croghostrider
8ffc6c05b7 Fix wrong exposed light for emulated hue (#25581)
* Add auto detect if brightness is supported

* Fix

* Fix tests

* Cleanup
2019-07-31 10:34:37 -07:00
Aaron Bach
3a3f70ef21 Add migration notification for Ambient PWS (#25561)
* Add migration notification for Ambient PWS

* Remove unnecessary kwarg

Co-Authored-By: Andrew Sayre <6730289+andrewsayre@users.noreply.github.com>

* Automatically recreate config entry

* Delete entities and devices

* Move imports
2019-07-31 10:31:40 -07:00
Martin Eberhardt
90dc81c1b3 Improve and align Rejseplanen with other transport components (Breaking) (#25375)
* Improve and align Rejseplanen with other transport components (Breaking)

* Fix empty list check

* Remove pointless list cast

* Clean up redundant definition of transport types

* Add docstring

* Simplify _times check
2019-07-31 18:48:54 +02:00
Joakim Plate
3d11c45edd Log platform import errors and correct reqs for config check (#25425)
* Log failures to load platforms

* Drop unused exception variable

* Also load pip requirements with check config

* Fix lint

* More lint

* Drop invalid parameters for error call
2019-07-31 09:09:00 -07:00
Pascal Vizeli
35c048fe6b Update azure-pipelines-ci.yml for Azure Pipelines 2019-07-31 09:26:47 +02:00
Pascal Vizeli
1c0d847353 Update azure-pipelines-ci.yml for Azure Pipelines 2019-07-31 09:19:45 +02:00
Pascal Vizeli
96e84692ef Update azure-pipelines-ci.yml for Azure Pipelines 2019-07-31 09:00:50 +02:00
Tyler Page
2e05431642 Bump venstarcolortouch to 0.9 (#25585)
* Update manifest.json

* Update requirements_all.txt
2019-07-31 08:56:37 +02:00
michaeldavie
255332aca8 Bump env_canada to 0.0.20 (#25594) 2019-07-31 08:56:11 +02:00
Manuel Díez
42c50c5b5e Add support for Roku TVs to be powered on or off (#25590) 2019-07-31 06:01:51 +02:00
Jon Gilmore
1e8a4dd0bc Fix status of lutron switches/lights after HA reboot (#25592) 2019-07-31 05:48:43 +02:00
Tom Harris
ffe6ddeba7 Bump insteonplm to 0.16.5 (#25580) 2019-07-31 04:40:22 +02:00
Paulus Schoutsen
39b8102ce6 Bump Python support to min Python 3.6.0 (#25582)
* Bump Python support to min Python 3.6.0

* Fix type
2019-07-30 16:44:39 -07:00
Aaron Bach
fe1e761a7a Add last event data (including "changed_by") to SimpliSafe (#25569)
* Add "last event" sensor for SimpliSafe

* Functionality round 1

* Cleanup

* Whitespace

* Whitespace

* Updated requirements

* Removed unused constants

* Member comments
2019-07-30 17:23:42 -06:00
Paulus Schoutsen
0257fe0375 Fix Ecobee HVAC action + available presets (#25488)
* Fix Ecobee HVAC action + available presets

* Update python-ecobee-api to 0.0.21

* Include proper operation list.

* Allows pass on preset to set_climate_hold

* Remove aux heat as a preset

* Fix test
2019-07-30 15:25:03 -07:00
Quentame
f8bb0e1229 Fix : Velbus translation error (#25575) 2019-07-30 14:26:06 -07:00
Alexei Chetroi
5aa35b52cc ZHA log helper (#25543)
* Logging helper.
* Use log helper for ZHA entities.
* Use log helper for ZHA core device.
* Log helper for ZHA core channels.
* Lint
* ZHA fixture fix.
2019-07-30 15:19:24 -04:00
Ben Lebherz
2c144bc412 add cleaning state code for roborock s6 (#25500) 2019-07-30 13:34:05 -04:00
Orson
2d10e61c23 Updated Workday Binary Sensor to use Holidays 0.9.11 and added support for Aruba Holidays. (#25568)
* Updated Workday Binary Sensor to use Holidays 0.9.11 and added support for Aruba Holidays.

* updated requirements_all.txts
2019-07-30 12:58:23 +02:00
Maikel Punie
15ae970941 Make the velbus component more robust in handling errors (#25567)
* Add some try excepts for velbus as the python-velbus lib is not good in handling these

* Only catch velbusExceptions

* only wrap the lines that can cause the exception

* Fix indentation mixup
2019-07-30 12:56:40 +02:00
Jon Gilmore
67cae00caa pylutron PyPI update (#25557)
* pylutron PyPI update

We've been working with the original maintainer of pylutron, and they've published an update to PyPI to support a couple different things: homeowner keypads, main repeater keypads

* added requirements
2019-07-30 12:18:26 +02:00
Robert Svensson
35900964cb UniFi - Track devices (#25570) 2019-07-30 10:05:51 +02:00
Aaron Bach
71acc6d3f8 Transition SimpliSafe data retrieval to its own object (#25546)
* Transition SimpliSafe data retrieval to its own object

* Don't overwrite a variable

* Member comments

* Member comments
2019-07-29 15:52:30 -06:00
Robert Svensson
891f19b43f UniFi device tracker restore clients (#25532) 2019-07-29 23:13:04 +02:00
Tobias Perschon
3a91c8f285 updated telegram trusted IPs (#25564)
updated telegram trusted IPs (Source: https://core.telegram.org/bots/webhooks)
2019-07-29 14:51:05 -05:00
Andre Lengwenus
c4f673c894 LCN cover control via output ports (#25511)
* LCN motor control via output ports

* Remove default value from cover validator
2019-07-29 20:49:44 +02:00
Robert Svensson
dc722adbb5 UniFi POE control restore clients (#25558)
* Restore POE controls on restart
2019-07-29 19:48:38 +02:00
ktnrg45
2e300aec5a Add PS4 tests for media player (#25415)
* Add tests for media_player

* remove ps4/media_player.py

* Add unsubscribe method to unload

* Add_to_hass instead of add_to_manager

* Use hass.states for states

* Fix assertions

* fix tests

* Add schedule update

* Remove entity assertions
2019-07-29 09:38:17 -04:00
Josh Anderson
b87b29dff6 Quiet noisy tado query logging (#25529)
* Quiet noisy tado query logging

* Lint fix
2019-07-29 09:37:19 -04:00
Yaroslav
65a29e3371 Expose last_video_id as property for Ring camera (#25553) 2019-07-29 07:38:53 -05:00
Finbarr Brady
7e6d47d64a Update Cisco Mobility Express module version (#25422)
* Update requirements_all.txt

* Update manifest.json

* Bump to remove f-strings

* Bump to remove f-strings
2019-07-29 13:47:59 +02:00
Anders Melchiorsen
a90ec88e5c Update eternalegypt to 0.0.8 (#25551) 2019-07-29 11:34:58 +02:00
Ville Skyttä
cc74b22ce8 Ignore .dmypy.json (#25528) 2019-07-29 11:34:19 +02:00
Ville Skyttä
f379bb4016 huawei_lte: try unsupported data retrievals only once (#25524)
* huawei_lte: try unsupported data retrievals only once

Refs https://github.com/home-assistant/home-assistant/pull/23809

* Move huawei_lte_api imports to top level
2019-07-29 10:08:49 +02:00
Maikel Punie
1f9f201571 Enable velbus config entries (#25308)
* Initial work on config_flow

* Finish config flow

* Pylint checks, make sure the import only happens once

* Added support for unloading, small fixes

* Check in the hassfest output files

* Flake8 fixes

* pylint mistake after flake8 fixes

* Work on comments

* Abort the import if it is already imported

* More comments resolved

* Added testcases for velbus config flow

* Fix pylint and flake8

* Added connection test to the config flow

* More sugestions

* renamed the abort reason

* excluded all but the config_flow.py from the velbus component in coveragerc

* Rewrote testcases with a patched version of _test_connection

* Docstyle fixes

* Updated the velbus testcases

* just yield

* flake8 fixes
2019-07-29 09:21:26 +02:00
Daniel Høyer Iversen
03052802a4 Tibber, off peak values (#25320)
* Tibber, off peak values

* style

* style

* refactor

* style
2019-07-29 09:29:45 +03:00
michaeldavie
1aae84173a Bump env_canada to 0.0.19 (#25548)
* Bump env_canada to 0.0.19

* Update requirements_all.txt
2019-07-28 21:52:16 -05:00
William Scanlon
bc38d394d5 Make cool_on and heat_on method calls. They aren't properties in python-wink (#25549) 2019-07-28 19:47:32 -05:00
Cameron Morris
e225243bc5 Fix WinkAC mode API calls to correct methods (#25545) 2019-07-28 16:27:02 -05:00
Aaron Bach
7ceedd15b3 Fix bug with WWLLN update interval (#25498)
* Fix bug with WWLLN update interval

* Match the docs
2019-07-28 14:09:14 -06:00
MJJ
14c3b38461 Update to buienradar json api; and additional monitored_conditions (#24463)
* Update to json api; and additional sensors

* remove unneeded coordinate rounding

* minor optimizations

* Update CODEOWNERS

* Update sensor.py

unit conversion (see the sketch after this entry) for:
- windgust (m/s to km/h)
- windspeed (m/s to km/h)
- windspeed_1d (m/s to km/h)
- visibility (m to km)

* Update weather.py

unit conversion for windspeed (m/s to km/h)

* Update sensor.py

* Update weather.py

* Update weather.py

* Update CODEOWNERS

* Update manifest.json

* Update CODEOWNERS

* Update CODEOWNERS

yet another try to get CODEOWNERS accepted...

* update codeowners

* update codeowners

* update

* updated codeowners to match manifest
2019-07-28 22:08:20 +02:00
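
The unit conversions listed in the commit body above are plain multiplicative factors; a minimal illustrative sketch in Python (function names are made up for this sketch, not the buienradar component's actual code):

    # Illustrative conversions only, matching the notes in the commit body.
    def ms_to_kmh(speed_ms: float) -> float:
        """Convert wind speed from m/s to km/h (1 m/s = 3.6 km/h)."""
        return speed_ms * 3.6

    def m_to_km(distance_m: float) -> float:
        """Convert visibility from metres to kilometres."""
        return distance_m / 1000

    print(ms_to_kmh(5.0))  # 18.0
    print(m_to_km(8500))   # 8.5
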
Robert Svensson
c6c4c07f2d deCONZ - cleanup sensor attributes (#25540)
* Improve attributes
2019-07-28 21:01:52 +02:00
Anglac
a14c299a78 Roombalocate (#25508)
* Add Roomba Locate

* Update vacuum.py
2019-07-28 20:37:25 +02:00
Thomas Germain
4936e55979 Improve seventeentrack (#25454)
* Code improvement + tests

* review

* review + moving to pytest test function

* move test to async

* remove code comment
2019-07-28 19:55:46 +02:00
Robert Svensson
da53e0a836 deCONZ - Add power attribute for consumption sensors (#25512)
* Add power attribute for consumption sensors

* Bump dependency to v62
2019-07-28 18:25:38 +02:00
Pascal Vizeli
3672a5f881 Update azure-pipelines-ci.yml for Azure Pipelines 2019-07-28 12:39:06 +02:00
manonstreet
0be0353eed Add last_run_success boolean attribute to Switchbot for error trapping (#25474)
* Add last_run_error boolean attribute to Switchbot entity to allow for trapping of errors

* Add last_run_success boolean attribute to Switchbot for error trapping

* Add last_run_success boolean attribute to Switchbot for error trapping
2019-07-27 16:13:48 +02:00
Oncleben31
4d7fd8ae17 Add container settings for YAML extension to avoid Hass specific custom tags errors (#25504)
* Add settings for YAML extension to avoid !secret tag errors

* add other HA custom tags.
2019-07-27 10:43:18 +02:00
Pascal Vizeli
1f09967abb Update azure-pipelines-ci.yml for Azure Pipelines 2019-07-27 10:41:17 +02:00
gjbadros
7d68def303 Support multiple Elk instances (#23839)
* Support multiple Elk instances

* Allow more than one Elk M1 alarm system to be integrated into a single hass instance.

* Introduces new "devices" schema at the top level, each of which has
  the prior configuration schema.

* Requires new version of elkm1, 0.7.14, that gwww and I just updated (thanks Glen!)

QUESTION: Should the "devices" section be optional to avoid breaking
old configuration files?  I chose not to do that for simplicity and
because I was following the doorbird code which requires the "devices"
section for all configurations even with only one device.

* Fixed a bunch of hound-raised issues

Fixed issues raised by hound -- there was clearly
a tool I was supposed to run to get those warnings
before submitting the PR.  Sorry!

Updated REQUIREMENTS.

* Fixed whitespace and line-length mistakes

Also fixed unused prefix local variable lint warning.

* Fixed missing blank line

* Fixed more lint warnings.

Not sure if I missed these on the first pass or if the linter stopped
after a certain number of warnings or something else.

Switched logging to use %d and %s instead of string concatenation (per
lint request and because I imagine it might be better performing
in some (oldish, I presume) implementations of Python).

* Fixed typo in last commit.

* Eliminate devices subsection in config schema

This eliminates the breaking change for configurations wanting a
singleton elk m1 instance (the majority of users, no doubt).  I did
not do it like this before because I was following the lead of the
doorbird component which introduced a devices: section when moving
to support multiple doorbells.  But Rohan Kapoor kindly pointed me
at the zoneminder component which sets the other (IMO) preferable
precedent. Will update the docs change shortly.

* Call async_add_entities once for all the elk controllers.

Just move async_add_entities() outside of the loops across the elk m1
controllers, so it's called once for each platform.

* Call async_add_entities only once per platform.

Move it to after the loop, so it's called only once
per platform even when there are multiple elk m1 controllers.

* Various improvements to be more idiomatic python + bug fixes

Thanks to Martin Hjelmare for the careful review and suggestions.
(All mistaken improvements and new bugs are my own.)

* Removed semicolon that lint caught.

* Idiomatic python improvements

Use dict.values() (instead of making it easier to add local looping variable
on the keys by using _, bar = ...items())

Use [] when the key is known to exist.

* Use dict[key] instead of .get (incl. fixing typo). Use .values() instead of .items() when ignoring keys.

* Gotta use devices.get(prefix) since we use no prefix for the singleton elk instance

* fix requirement to use newer elkm1 that supports my changes for multiple elk devices

* Removed spurious + between a string broken between two lines for formatting; was failing a lint check about logging needing to use %s

* Remove REQUIREMENTS and DEPENDENCIES since those are now taken care of by the manifest.json file.

* Add configuration check that the prefixes are all unique

* Use new dependency 'getmac' to get mac address of Elk M1 controllers and use that for uniqueid if possible, else use None.  Also removed some procedural checking of unique prefix since that's now handled at schema check time.

* Whitespace changes to make style checker happy and code more consistent

* Removed unused variable, added blank line

* Make getmac a requirement not dependency

I should've RTFM.

* ws only change; I really need to get Emacs to understand these style guidelines

* Ran script/gen_requirements_all.py; script/setup needed to be run so that was failing.

* More style check fixes and one bug fix.

* Incomplete set of changes from last push

* More conform-to-hass-style changes: use caps to start log messages (and do not use the function name, even for debug messages). And do not use string concatenation; prefer new-style .format().

* Style fixes.

* Switch back to using the prefix config field for setting the unique_id since the mac address approach has numerous shortcomings including: 1) new dependency; 2) lack of reliability; 3) doesn't work for serial connections; 4) breaks when a layer 4+ networking entity intermediates the elk m1 connection.

* Reran to update (removing getmac dependency)

* Skipped trailing ','; keep forgetting which languages are forgiving about this practical nicety of allowing trailing commas without changing the semantics.

* Validate uniqueness on lowercase versions of the prefix since we're gonna use .lower() on creating the entity id that has to be unique; do the _has_all_unique_prefixes check last so we get errors from the device schema before complaining about the uniqueness problem, if any

* Use vol.Lower to convert to lowercase instead of the map.  Also fixed a pair of bugs for the alarm control panel display message service -- since data templates always generate strings, the values subject to range/set restrictions need to be coerced to their proper type before the check

* Fix some flake8 warnings.

* Fixed typo; it's Coerce not coerce.

* Use elkm1m_ string to start unique_id when and only when there is a non-empty prefix given; this enables backward compatibility to avoid a breaking change by letting the elkm1_ start to unique_id keep working exactly as it used to.

* minor comment tweak to force automation tests to run again since they failed for unrelated reasons last time

* There's actually been a 0.7.15 release which was meta-information and tidying only so we might as well depend on it

* Forgot to update this with gen_requirements_all.py
2019-07-27 10:36:09 +02:00
Paulus Schoutsen
9efb759a98 Fire lovelace updated event when update detected (#25507) 2019-07-26 20:30:51 -07:00
Charles Garwood
0a6d49b293 Improve handling of Z-Wave config entry vs yaml config (#25112)
* Improved handling of config entry vs yaml config

* Address review comment
2019-07-26 20:27:17 -07:00
Paulus Schoutsen
297cd3dc13 Fix deprecation warning in test (#25506) 2019-07-26 17:40:40 -07:00
Paulus Schoutsen
0df1bb5029 Fix python 3.5 test 2019-07-26 16:15:46 -07:00
Pascal Vizeli
1c3e5988db Update azure-pipelines-ci.yml for Azure Pipelines 2019-07-26 20:16:29 +02:00
Pascal Vizeli
230ca9b89d Update azure-pipelines-ci.yml for Azure Pipelines 2019-07-26 20:03:30 +02:00
Pascal Vizeli
3c7be11c31 Update azure-pipelines-ci.yml for Azure Pipelines 2019-07-26 18:48:29 +02:00
Pascal Vizeli
668deeb7bd Update azure-pipelines-ci.yml for Azure Pipelines 2019-07-26 17:54:20 +02:00
Pascal Vizeli
8c61808ce4 Update azure-pipelines-ci.yml for Azure Pipelines 2019-07-26 17:03:58 +02:00
Pascal Vizeli
36f3940c85 Update azure-pipelines-ci.yml for Azure Pipelines 2019-07-26 16:56:38 +02:00
Pascal Vizeli
fc36927468 [skip ci] Update azure-pipelines-ci.yml for Azure Pipelines 2019-07-26 16:52:42 +02:00
Bram Goolaerts
0fc2813177 Add De Lijn (Flemish Public Transport) component (#24265)
* Initial commit of De Lijn (Flemish Public Transport) component

* Code corrections as per HA dev's requests

* changes to variable naming, setting attribution and states, plus some smaller optimizations

* Overlooked some linting issues, these are now fixed

* Updated pydelijn version requirement to 0.5.1 so UTC timestamps can be used instead of relative/local times.
Removed unused definition

* Updated pydelijn version requirement to 0.5.1 in requirements_all.txt

* Update the self._attributes dict directly instead of replacing it
Assign ATTRIBUTION while creating the _attributes dict
Remove the ATTRIBUTION assignment in device_state_attributes as it's updated in the async_update now.

* Linting issue (length of 2 lines) solved

* Removed a relative time attribute
Updated a linting issue in the LOGGER (used % instead of the format)
2019-07-26 16:41:02 +02:00
Robin Wohlers-Reichel
a17e28cc78 Update solax to 0.1.2 (#25497) 2019-07-26 16:24:04 +02:00
Pascal Vizeli
2c8c8009ff Update azure-pipelines-ci.yml for Azure Pipelines 2019-07-26 16:20:59 +02:00
Pascal Vizeli
2087358d58 Update azure-pipelines-ci.yml for Azure Pipelines 2019-07-26 16:10:22 +02:00
aschamberger
3512d05467 Add ord() to template filters (#25398)
* Add ord() to template filters

* Remove trailing whitespace

* add test
2019-07-25 15:06:51 -07:00
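
For context, a template filter like the one added above can be registered on a plain Jinja2 environment; a minimal sketch (illustrative only, not Home Assistant's actual template helper):

    # Minimal Jinja2 example of exposing Python's built-in ord() as a template filter.
    from jinja2 import Environment

    env = Environment()
    env.filters["ord"] = ord
    print(env.from_string("{{ 'A' | ord }}").render())  # -> 65
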
Paulus Schoutsen
4f8a93fb3e Merge pull request #25486 from home-assistant/rc
0.96.5
2019-07-25 10:51:55 -07:00
Paulus Schoutsen
47f3be1fe4 Bumped version to 0.96.5 2019-07-25 09:51:43 -07:00
Paulus Schoutsen
e79af97fdc Fix Nest turning off eco (#25472)
* Fix Nest turning off eco

* Add hvac action
2019-07-25 09:51:25 -07:00
Paulus Schoutsen
46b9e9cdfb Allow cors for static files (#25468) 2019-07-25 09:51:24 -07:00
David Bonnes
94f5b262be Bump geniushub client (#25458) 2019-07-25 09:51:23 -07:00
Robert Svensson
ea5d3ce85a Add a unique identifier to deCONZ groups (#25485)
* Add a unique identifier to deconz groups
2019-07-25 18:39:38 +02:00
Robert Svensson
b6934f0cd0 UniFi block clients (#25478)
* Allow blocking clients on UniFi networks
2019-07-25 16:56:56 +02:00
michaeldavie
59c62a261b Add scan interval to config of Environment Canada sensor (#25414)
* Make refresh period configurable

* Remove throttle, set SCAN_INTERVAL

* Move SCAN_INTERVAL to module
2019-07-25 13:56:33 +02:00
Paulus Schoutsen
fae3546910 Allow cors for static files (#25468) 2019-07-25 13:52:27 +02:00
Ville Skyttä
b230562c76 Mypy config cleanups (#25475)
* Move file specific config to inline comments

* Disallow untyped defs by default everywhere
2019-07-25 08:08:20 +02:00
Paulus Schoutsen
a50f1ae614 Fix Nest turning off eco (#25472)
* Fix Nest turning off eco

* Add hvac action
2019-07-24 21:53:51 -07:00
Ville Skyttä
e8e84fb764 Type check homeassistant.scripts (#25464)
* Run mypy on homeassistant.scripts, disabling bunch of checks for now

* Declare async_initialize in AuthProvider

* Add some type hints

* Remove unreachable code

* Help mypy out

* Script docstring fixes
2019-07-24 13:18:40 -07:00
Santobert
10b120f11f Fix bloomsky unit system (#25460)
* initial commit - fix bloomsky unit system

* Add another warning

* Fix linting error

* Include metric sensor units

* Shorten a too long line
2019-07-24 19:37:36 +02:00
Ville Skyttä
408af6e842 Install requirements_test.txt for flake8 in Azure CI (#25463)
Was missing specified versions, as well as flake8-docstrings altogether.
2019-07-24 08:36:52 -07:00
Ville Skyttä
cd0277c2c3 Lint fixes (#25462) 2019-07-24 08:26:41 -07:00
Aaron Bach
00a5a5f3c0 Add area support to group service schemas (#25410) 2019-07-24 09:40:34 -05:00
plafü
18ba2f986e Allow configuring sources for older Pioneer receivers (#25305)
* Allow configuring sources for older Pioneer receivers

* Replace config.get calls with dict[key] syntax
2019-07-24 09:13:12 -05:00
David Bonnes
b0e4260562 Bump geniushub client (#25458) 2019-07-24 08:39:34 -04:00
Fredrik Erlandsson
f799bbf2a7 Daikin simplification and code cleanup (#25416)
* fix preset documentation

* Use pydaikin set holiday method

* update temperature readings, code simplification

* more temperature cleanup

* cleanup HVAC_MODE

* remove get() method and move code to respective place

* remove string constant in code

* remove get() method and move code to respective place

* isort results

* fixes in state method
2019-07-24 13:48:08 +02:00
Aaron Bach
9e36448f03 Add area support to vacuum service schemas (#25443)
* Add area support to vacuum service schemas

* Fixed tests

* De-couple platform schemas
2019-07-24 08:29:08 +02:00
Aaron Bach
561bbecd25 Remove unnecessary REMOTE_SERVICE_SCHEMA (#25453) 2019-07-24 08:27:37 +02:00
Aaron Bach
df51c07a88 Add area support to timer service schemas (#25440)
* Add area support to timer service schemas

* Base

* Remove unnecessary TIMER_SERVICE_SCHEMA
2019-07-24 08:26:25 +02:00
Pascal Vizeli
fb9ca0d4da Update azure-pipelines-release.yml for Azure Pipelines 2019-07-24 08:19:16 +02:00
Ville Skyttä
f07c714c01 Huawei LTE misc improvements (#25377)
* Rename traffic_statistics to monitoring_traffic_statistics

For better consistency with huawei-lte-api.

* Add default device name for sensors

In case the actual device name cannot be accessed for some reason.

* Support device class in sensor metadata

* Mark known signal strength sensors as such
2019-07-24 07:24:22 +03:00
Paulus Schoutsen
8a2fdb5045 Merge pull request #25452 from home-assistant/rc
0.96.4
2019-07-23 18:24:24 -07:00
Aaron Bach
f1e4153b2c Add area support to media player service schemas (#25436)
* Add area support to media player service schemas

* Re-establish MEDIA_PLAYER_SCHEMA

* Comment

* Localize platforms that used MEDIA_PLAYER_SCHEMA
2019-07-23 18:54:59 -06:00
Alexei Chetroi
c886d00bab Bump up ZHA dependencies. (#25450) 2019-07-23 20:51:40 -04:00
Paulus Schoutsen
065a5c5df6 Bumped version to 0.96.4 2019-07-23 17:12:08 -07:00
David Bonnes
e2e7d39527 [climate] Correct evohome hvac_action (#25407)
* initial commit

* take account of rounding up of curr_temp
2019-07-23 17:11:40 -07:00
Anders Melchiorsen
9fc4b878e2 Update pysonos to 0.0.22 (#25399) 2019-07-23 17:11:40 -07:00
Fredrik Erlandsson
638f5b1932 Update Daikin preset modes (#25395)
* fix preset documentation

* Use pydaikin set holiday method
2019-07-23 17:11:39 -07:00
David Bonnes
956cdba588 [climate] Bugfix/Tweak honeywell migration (#25369)
* refactor supported_features

add dr_data

delint

simplify code

refactor target_temps, and delinting

delint

fix typo

add min/max temps

delint

refactor target temps

refactor target temps 2

correct typo

tweak

fix potential bug

fix potential bug 2

bugfix entity_id

fix typo

* remove explicit entity_id

* refactor hvac_action

* refactor hvac_action

* bugfix - HVAC_MODE_HEAT_COOL incorrectly added to hvac_modes
2019-07-23 17:11:38 -07:00
cgtobi
f11f0956d3 Bump pyatmo version to 2.1.2 (#25296) 2019-07-23 17:11:37 -07:00
David Bonnes
1c861b9732 Tweak evohome migration (#25281)
* Initial commit

add hvac_action to zones, remove target_temp from controller

fix incorrect hvac_action

de-lint

Initial commit

de-lint & minor refactor

tweak docstrings, and type hints

tweak docstrings

* refactor setpoints property

* tweak docstring

* tweak docstring

* avoid an unnecessary I/O

* avoid unnecessary I/O

* refactor schedule/setpoints

* remove type hint

* remove type hint 2

* tweak code

* delint type hints

* fix regression
2019-07-23 17:11:37 -07:00
cgtobi
86d1bb651e Fix Netatmo climate battery level (#25165)
* Interpolate battery level

* Sort list
2019-07-23 17:11:36 -07:00
Fredrik Erlandsson
828f76ca57 Update Daikin preset modes (#25395)
* fix preset documentation

* Use pydaikin set holiday method
2019-07-23 17:10:28 -07:00
Kim Frellsen
f5d0f36caf Add new device tracker supporting Fortinet FortiGate (#23078)
* Create device_tracker.py

initial version

* Update device_tracker.py

set verify SSL to false as default. Normally users do not have a verified certificate at home

* Update device_tracker.py

pep8 compliant

* Update device_tracker.py

upgraded fortiosapi requirements

* Create __init__.py

* tox compliant

* Update device_tracker.py

* Create manifest.json

* Update .coveragerc

added fortios

* Update device_tracker.py

circle ci, blank line required

* Update manifest.json

removed code owners

* Update manifest.json

removed dependencies

* Update manifest.json

removed codeowners

* Update requirements_all.txt

added fortios

* Update requirements_all.txt

* Update device_tracker.py

pylint corrections

* Update device_tracker.py

pylint exceptions

* Update device_tracker.py

disable pylint broad exceptions

* Update device_tracker.py

pylint

* Update device_tracker.py

removed pointless string statements

* Update device_tracker.py

removed blank line

* Update device_tracker.py

* Update device_tracker.py

* Update device_tracker.py

* Update device_tracker.py

* Update device_tracker.py

* Update device_tracker.py

* Update device_tracker.py

* Update device_tracker.py

* Update device_tracker.py

* Update device_tracker.py

* Update device_tracker.py

* Update device_tracker.py

* Update device_tracker.py

* Update manifest.json

added codeowners

* Update CODEOWNERS

added kimfrellsen as codeowner

* fortiosapi 0.10.8

Updated to use latest version of fortiosapi 0.10.8

* Update requirements_all.txt

updated fortiosapi to 0.10.8

* Update device_tracker.py

fixed some requests.

* Update device_tracker.py

better exception handling.

* Update device_tracker.py

exception handling

* Update CODEOWNERS

* Update device_tracker.py

corrected exception handling

* Update device_tracker.py

exception handling.

* Update device_tracker.py

lint corrections

* Update device_tracker.py

removed broad exception.

* Update device_tracker.py

fix lint errors

* Update device_tracker.py

minor changes, mostly cosmetic
2019-07-24 01:18:58 +02:00
Farid
4fb1937f65 Suez water (#23844)
* Add suez water sensor

* flake8 test

* pylint test

* edition to fix flake8 and pylint issues

* edition to be okay with the musts

* Added a blank line to __init__.py for flake8

* added blank line for flake8

* change scan interval from 10 to 720 minutes

* use of pysuez

* bug fix and isort

* use of pysuez

* fixed flake8 and pylint errors

* update requirements_all.txt

* added a method to test login/password before adding device

* flake8 edition

* update requirements_all.txt

* add of .coveragerc file with untested files

* update of .coveragerc

* Update homeassistant/components/suez_water/__init__.py

Co-Authored-By: Fabian Affolter <mail@fabian-affolter.ch>

* Update homeassistant/components/suez_water/sensor.py

Co-Authored-By: Fabian Affolter <mail@fabian-affolter.ch>

* Update homeassistant/components/suez_water/sensor.py

Co-Authored-By: Fabian Affolter <mail@fabian-affolter.ch>

* Update homeassistant/components/suez_water/sensor.py

Co-Authored-By: Fabian Affolter <mail@fabian-affolter.ch>

* Update homeassistant/components/suez_water/sensor.py

Co-Authored-By: Fabian Affolter <mail@fabian-affolter.ch>

* Update homeassistant/components/suez_water/sensor.py

Co-Authored-By: Fabian Affolter <mail@fabian-affolter.ch>

* bug fix in check credentials

* flake8 and pylint fixes

* fix codeowner

* update requirements_all.txt

* Sorted suez_water line

* edition to answer comments from @MartinHjelmare on #23844

* Attribute keys formatting to lowercase snakecase, name and icon constants returned directly, and remove of  attribute. Update of .

* pylint edition

* correction wrong keys in client attributes

* remove unneeded return and move add_entities
2019-07-24 01:14:41 +02:00
Aaron Bach
e4b4551b35 Add area support to remote service schemas (#25437)
* Add area support to remote service schemas

* Base
2019-07-23 16:05:55 -07:00
Aaron Bach
5e2dfb14fb Add area support to lock service schemas (#25435)
* Add area support to lock service schemas

* extend

* Fixed tests

* Fixed tests
2019-07-23 16:05:21 -07:00
Sören
4c067ecff7 Add Elgato Avea integration (#24281)
* Adds Elgato Avea integration

* Revert "Adds Elgato Avea integration"

This reverts commit 8607a685eb.

* Adds Elgato Avea integration

* Removed debug

* Improved readability

Co-Authored-By: Otto Winter <otto@otto-winter.com>

* Adds Elgato Avea integration

* Fixes for flake8

* More Fixes for flake8

* Hopefully last fixes for flake8

* Unnecessary rounding and typo removed

* Duplicate calls removed

* raise PlatformNotReady if communication with bulb fails

* Fixes: flake8, missing import of exception and better handling of ble problems

* Update requirements_all.txt

Add requirements_all.txt file

* Revert "Update requirements_all.txt"

This reverts commit 2856025ed3.

* Update requirements_all.txt

* conform with snake_case naming style

https://circleci.com/gh/home-assistant/home-assistant/31823

* Fixed variable rename

* Unnecessary calculation removed

Co-Authored-By: Otto Winter <otto@otto-winter.com>

* Better Exception Handling

* Changed position of import, renamed add_entities to add_devices, remove unnecessary comment

* Unnecessary comments removed.
2019-07-24 01:02:00 +02:00
Aaron Bach
2fb03106ea Add area support to utility meter service schemas (#25442)
* Add area support to utility meter service schemas

* Reverted mistaken deletion
2019-07-23 16:43:48 -06:00
Joe Trabulsy
a8ec826ef7 Add Support for VeSync Devices - Outlets and Switches (#24953)
* Change dependency to pyvesync-v2 for vesync switch

* async vesync component

* FInish data_entry_flow

* Update config flow

* strings.json

* Minor fix

* Syntax fix

* Minor Fixs

* UI Fix

* Minor Correct

* Debug lines

* fix device dictionaries

* Light switch fix

* Cleanup

* pylint fixes

* Hassfest and setup scripts

* Flake8 fixes

* Add vesync light platform

* Fix typo

* Update Devices Service

* Fix update devices service

* Add initial test

* Add Config Flow Tests

* Remove Extra Platforms

* Fix requirements

* Update pypi package

* Add login to config_flow

Avoid setting up component if login credentials are invalid

* Fix variable import

* Update config_flow.py

* Update config_flow.py

* Put VS object into hass.data instead of config entry

* Update __init__.py

* Handle Login Error

* Fix invalid login error

* Fix typo

* Remove line

* PEP fixes

* Fix change requests

* Fix typo

* Update __init__.py

* Update switch.py

* Flake8 fix

* Update test requirements

* Fix permission

* Address change requests

* Address change requests

* Fix device discovery indent, add MockConfigEntry

* Fix vesynclightswitch class

* Remove active time attribute

* Remove time_zone, grammar check
2019-07-23 23:40:55 +02:00
Andre Richter
738d00fb05 Increase vallox robustness on startup (#25382)
* Vallox: Increase robustness on startup

Experiments showed that timing of websocket requests to the Vallox firmware is
critical when fetching new metrics. Tests on different Raspberry Pis and x86
machines showed that those machines with little processing power tend to fail
the timing requirements during the busy startup phase of Home Assistant,
resulting in the Vallox integration failing to set itself up.

This patch catches the websocket InvalidMessage exception, which is a symptom of
failing the timing requirements. Experiments again showed that on the Raspberry Pis
this exception is caught once at startup, but the integration runs fine
afterwards (see the sketch after this entry).

* Update __init__.py

* Bump to new 2.1.0 version of api.

* Bump to api 2.2.0
2019-07-23 23:32:48 +02:00
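
A minimal sketch of catching a websocket InvalidMessage as a transient startup failure, in the spirit of the commit body above (this assumes the websockets library and its InvalidMessage exception; it is not the integration's actual code, and the retry helper name is made up):

    # Treat websockets' InvalidMessage during a busy startup as transient and retry.
    import asyncio
    from websockets.exceptions import InvalidMessage

    async def fetch_with_retry(fetch, retries=3, delay=2.0):
        """Call an async fetch() again if the handshake fails with InvalidMessage."""
        for attempt in range(retries):
            try:
                return await fetch()
            except InvalidMessage:
                if attempt == retries - 1:
                    raise
                await asyncio.sleep(delay)
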
Johnny Moore
5ce6ea2df5 Upgrade HPILO requirement to v4.3 (#25444)
* Update Python-HPILO to 4.3

Update of Python-HPILO requirement to 4.3 to resolve outstanding SSL connections for older HP servers (ILO 3)

* Update requirements_all.txt

Update HPILO to 4.3
2019-07-23 16:20:05 -05:00
Aaron Bach
2850f9d19e Add area support to climate service schemas (#25441)
* Add area support to climate service schemas

* Incorrect Voluptuous usage

* Fix typo
2019-07-23 14:33:41 -06:00
Aaron Bach
20ed07cc5c Add area support to input text service schemas (#25434) 2019-07-23 14:32:28 -06:00
Aaron Bach
2354108e6f Add area support to image processing service schemas (#25428) 2019-07-23 14:08:23 -06:00
Aaron Bach
a5c2a80db3 Add area support to input boolean service schemas (#25429) 2019-07-23 14:05:15 -06:00
Aaron Bach
bd2b107575 Add area support to Wink service schemas (#25445) 2019-07-23 14:04:59 -06:00
Aaron Bach
c92f287c73 Add area support to input datetime service schemas (#25430)
* Add area support to input datetime service schemas

* Fixed tests
2019-07-23 13:39:07 -06:00
Aaron Bach
3af77eb594 Add area support to input number service schemas (#25431) 2019-07-23 13:39:02 -06:00
Aaron Bach
8e4a234bbf Add area support to input select service schemas (#25432) 2019-07-23 13:38:20 -06:00
Aaron Bach
ee7ec5f234 Add area support to scene service schemas (#25438) 2019-07-23 13:38:08 -06:00
Aaron Bach
80051b7fc2 Add area support to script service schemas (#25439) 2019-07-23 13:37:47 -06:00
cgtobi
ca989cba44 Bump pyatmo version to 2.1.2 (#25296) 2019-07-23 21:29:04 +02:00
cgtobi
aa062176ca Clean up Netatmo sensor code (#25390)
* Clean up code

* Add parameter

* Make it a list

* Move loop and add debug message

* Further clean up

* Move manual config
2019-07-23 21:27:54 +02:00
David Bonnes
a1bccb1934 [climate] Bugfix/Tweak honeywell migration (#25369)
* refactor supported_features

add dr_data

delint

simplify code

refactor target_temps, and delinting

delint

fix typo

add min/max temps

delint

refactor target temps

refactor target temps 2

correct typo

tweak

fix potential bug

fix potential bug 2

bugfix entity_id

fix typo

* remove explicit entity_id

* refactor hvac_action

* refactor hvac_action

* bugfix - HVAC_MODE_HEAT_COOL incorrectly added to hvac_modes
2019-07-23 11:23:22 -07:00
Aaron Bach
9470829978 Add area support to alarm_control_panel service schemas (#25402)
* Add area support to alarm_control_panel service schemas

* Corrected import
2019-07-23 11:09:09 -06:00
Aaron Bach
b71cb73c80 Add area support to counter service schemas (#25401)
* Add area support to counter service schemas

* Updates
2019-07-23 11:08:32 -06:00
Aaron Bach
0caab133e6 Add area support to automation service schemas (#25403)
* Add area support to automation service schemas

* Fixed import

* Mispelling
2019-07-23 11:07:13 -06:00
Aaron Bach
e6445a602b Add area support to cover service schemas (#25408)
* Add area support to cover service schemas

* Linting
2019-07-23 11:05:53 -06:00
Aaron Bach
8f2de2bf1b Add area support to fan service schemas (#25409) 2019-07-23 11:05:28 -06:00
David Bonnes
7cf0684aa1 [climate] Correct evohome hvac_action (#25407)
* initial commit

* take account of rounding up of curr_temp
2019-07-23 07:14:42 +02:00
Anders Melchiorsen
5e805768aa Update pysonos to 0.0.22 (#25399) 2019-07-23 06:31:05 +02:00
Jc2k
8c69fd91ff Only poll HomeKit connection once for all entities on a single bridge/pairing (#25249)
* Stub for polling from a central location

* Allow connection to know the entity objects attached to it

* Move polling logic to connection

* Don't poll if no characteristics selected

* Loosen coupling between entity and HKDevice

* Disable track_time_interval when removing entry

* Revert self.entities changes

* Use @callback for async_state_changed

* Split out unload and remove and add a test

* Test that entity is gone and fix docstring
2019-07-22 09:22:44 -07:00
Paulus Schoutsen
58f946e452 Merge remote-tracking branch 'origin/master' into dev 2019-07-22 08:31:02 -07:00
David Bonnes
bf37cc8371 Tweak evohome migration (#25281)
* Initial commit

add hvac_action to zones, remove target_temp from controller

fix incorrect hvac_action

de-lint

Initial commit

de-lint & minor refactor

tweak docstrings, and type hints

tweak docstrings

* refactor setpoints property

* tweak docstring

* tweak docstring

* avoid an unnecessary I/O

* avoid unnecessary I/O

* refactor schedule/setpoints

* remove type hint

* remove type hint 2

* tweak code

* delint type hints

* fix regression
2019-07-22 10:45:31 +02:00
David Radcliffe
11c74cd0d7 Add support for contact binary sensors in homekit_controller (#25355) 2019-07-22 08:40:55 +01:00
Pierre Ståhl
797196dce9 Add add_torrent service to Transmission (#25144)
* Add add_torrent service to Transmission

* Fix services.yaml format

* Verify that torrent is whitelisted

* Add logging if adding failed

* Change warn to warning
2019-07-21 22:31:11 +02:00
Paulus Schoutsen
81bde77c04 Updated frontend to 20190721.1 2019-07-21 13:02:26 -07:00
Paulus Schoutsen
0be2dad651 Updated frontend to 20190721.1 2019-07-21 13:02:16 -07:00
Paulus Schoutsen
a652a4d9e9 Merge pull request #25376 from home-assistant/rc
0.96.3
2019-07-21 12:02:38 -07:00
cgtobi
2189cb0ee7 Add Netatmo climate battery level (#25143)
* Add battery level sensor

* Only update battery level if lower or nonexistent
2019-07-21 12:00:56 -07:00
Paulus Schoutsen
15064e83b4 Bumped version to 0.96.3 2019-07-21 11:10:36 -07:00
David F. Mulcahey
8538e69e28 change and condition to or condition (#25374) 2019-07-21 11:10:26 -07:00
David F. Mulcahey
35c719628d fix remove and re-add scenario (#25370) 2019-07-21 11:10:25 -07:00
Otto Winter
0eab89c8f4 Fix ESPHome climate migration (#25366) 2019-07-21 11:10:25 -07:00
David F. Mulcahey
a3043b9a90 bump quirks version (#25362) 2019-07-21 11:10:24 -07:00
Paulus Schoutsen
c795c93034 Introduce PRESET_NONE for climate (#25360)
* Introduce PRESET_NONE for climate

* Require preset mode to be a string

* Lint

* Fix tests
2019-07-21 11:10:24 -07:00
David Bonnes
2228a0dcac Improve geniushub logging and bump client (#25359)
* add debug logging

* bump geniushub client library

* delint

* bump again

* bump again, again
2019-07-21 11:10:23 -07:00
cgtobi
68e7f4ca5a Fix preset service call (#25358) 2019-07-21 11:09:59 -07:00
David F. Mulcahey
e052bcb03b add available to device info (#25349) 2019-07-21 11:08:21 -07:00
Michael Scherer
a56b604936 Fix for hvac_modes list being null (#25347)
* Fix for empty hvac_modes list

* Empty list instead of default value for hvac_modes
2019-07-21 11:08:21 -07:00
Fredrik Erlandsson
f6b6818fb0 Restore Daikin A/C on/off services (#25332) 2019-07-21 11:08:20 -07:00
eyager1
93a65bf507 Update zwave climate mappings (#25327)
hvac_action should be idle when thermostat is in Pending Heat or Pending Cool.
2019-07-21 11:08:19 -07:00
Paulus Schoutsen
ec302912a3 Restore sensibo turn on/off methods (#25321) 2019-07-21 11:08:19 -07:00
Paulus Schoutsen
50d4921d0a Introduce PRESET_NONE for climate (#25360)
* Introduce PRESET_NONE for climate

* Require preset mode to be a string

* Lint

* Fix tests
2019-07-21 11:00:42 -07:00
Ville Skyttä
17d754dbbf Optional and Union simplifications (#25365) 2019-07-21 10:59:51 -07:00
eyager1
5e29d4d098 Update zwave climate mappings (#25327)
hvac_action should be idle when thermostat is in Pending Heat or Pending Cool.
2019-07-21 10:44:15 -07:00
David Bonnes
97a13bdcd4 Improve geniushub logging and bump client (#25359)
* add debug logging

* bump geniushub client library

* delint

* bump again

* bump again, again
2019-07-21 10:07:03 -07:00
Paulus Schoutsen
7f5607c918 Updated frontend to 20190721.0 2019-07-21 10:03:25 -07:00
David F. Mulcahey
d7bd7f2c4c bump quirks version (#25362) 2019-07-21 10:02:22 -07:00
Otto Winter
b6ba24de5d Fix ESPHome climate migration (#25366) 2019-07-21 10:01:16 -07:00
David F. Mulcahey
7a8130cd2b fix remove and re-add scenario (#25370) 2019-07-21 10:00:27 -07:00
Ville Skyttä
d64f1e767c Type check all helpers (#25373)
* Type check all helpers, add inline exclusions for work in progress

* Remove unused Script._template_cache

* Add some missing type hints

* Remove unneeded type: ignore

* Type hint fixes

* Mypy assistance tweaks

* Don't look for None in deprecated config "at most once" check

* Avoid None name slugify attempt when generating entity id

* Avoid None state store attempt on entity remove
2019-07-21 09:59:02 -07:00
David F. Mulcahey
0653f57fb4 change and condition to or condition (#25374) 2019-07-21 09:57:40 -07:00
Paulus Schoutsen
615af773e5 Updated frontend to 20190721.0 2019-07-21 09:56:12 -07:00
Aaron Bach
aa27e22b17 Automatically expand WWLLN window to 1 hour (if necessary) (#25357)
* Expand default window for WWLLN

* Fleshed out conditions

* Fixed tests

* Removed unused import

* Linting
2019-07-21 08:58:50 +02:00
Paulus Schoutsen
9c51650ea3 Updated frontend to 20190720.0 2019-07-20 18:01:47 -07:00
Paulus Schoutsen
95223cb9ea Updated frontend to 20190720.0 2019-07-20 18:01:28 -07:00
Michael Scherer
ba04ff17b2 Fix for hvac_modes list being null (#25347)
* Fix for empty hvac_modes list

* Empty list instead of default value for hvac_modes
2019-07-20 18:00:16 -07:00
cgtobi
b0b2b0d654 Fix preset service call (#25358) 2019-07-20 17:58:06 -07:00
Matthias Alphart
01430262cd temporary patch to fix KNX climate devices (#25356)
This is a temporary patch for knx climate devices. It should be reverted when #24738 is merged to release. 
It should fix https://github.com/home-assistant/home-assistant/issues/25247 for 0.96
2019-07-20 17:57:38 -07:00
shbatm
83581be4d5 Update Google Maps Location Tracker to use locationsharinglib==4.0.2 (#25316)
* Update google_maps: Bump locationsharinglib to 4.0.2

* Remove unused import.

* Corrections based on review.
2019-07-21 00:49:10 +02:00
Ville Skyttä
fc5b1c7005 Mypy config improvements (#25340)
* Specify python version

So that it type checks against the lowest common denominator version,
not the runtime one.

* Disallow incomplete definitions
2019-07-20 14:35:59 -07:00
Ville Skyttä
56e4a2aea6 Fix util.ruamel_yaml type errors (#25338) 2019-07-20 14:35:22 -07:00
David F. Mulcahey
7ea27c0f2a add available to device info (#25349) 2019-07-20 14:44:47 -04:00
Ville Skyttä
258dc80fbd Remove some Python 2 compatibility code (#25341) 2019-07-20 08:04:18 -05:00
Ville Skyttä
22d9a73e8e Doc lint fixes (#25339) 2019-07-20 15:30:36 +03:00
9R
1fddf47e8f Fix missing Nachteule in mvglive component (#25304) 2019-07-20 14:02:00 +02:00
Fredrik Erlandsson
0da0dda39c Restore Daikin A/C on/off services (#25332) 2019-07-20 13:41:33 +02:00
ktnrg45
48540fc21e Ps4 reformat media data (#25172)
* Reformat saved media data/ fix load + save helpers

* Add url constant

* Reformat saved media data

* Add tests for media data

* Refactor

* Revert deleted lines

* Set attrs after checking for lock

* Patch load games.

* remove unneeded imports

* fix tests

* Correct condition

* Handle errors with loading games

* Correct condition

* Fix select source

* add test

* Remove unneeded vars

* line break

* cleanup loading json

* remove test

* move check for dict

* Set games to {}
2019-07-20 07:36:45 +02:00
Aaron Bach
693fa15924 Return Ambient PWS brightness sensor unit and remove CONF_MONITORED_CONDITIONS (#25284)
* Revert "Change Ambient solar radiation units to lx (#24690)"

This reverts commit 40fa4463de.

* Re-add sensor for Ambient PWS outdoor brightness (W/m^2)

* Corrected available and comments

* Member feedback

* Member comments
2019-07-20 06:32:33 +02:00
Paulus Schoutsen
c8abbf6d76 Restore sensibo turn on/off methods (#25321) 2019-07-19 21:04:13 -04:00
nierob
979f801488 Avoid creating temporary lists (#25317)
That gives nano performance improvements, as *() is slightly faster
than *[].
2019-07-19 13:36:18 -07:00
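
A tiny illustrative comparison of the micro-optimization mentioned above, unpacking a tuple literal instead of a list literal (the helper name is made up for this sketch):

    # Compare argument unpacking from a tuple literal vs. a list literal.
    import timeit

    def consume(*args):
        return len(args)

    list_time = timeit.timeit(lambda: consume(*["a", "b", "c"]), number=1_000_000)
    tuple_time = timeit.timeit(lambda: consume(*("a", "b", "c")), number=1_000_000)
    print(f"list unpacking:  {list_time:.3f}s")
    print(f"tuple unpacking: {tuple_time:.3f}s")
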
lyghtnox
caa7a3a3d6 Multiroom support for snapcast (#24061)
* Multiroom support for snapcast

* Moving services to the snapcast domain

* Passing asyncio event via dispatcher instead of hass.data

* Fixing lint
2019-07-19 12:43:44 -07:00
Paulus Schoutsen
290d32267e Merge remote-tracking branch 'origin/master' into dev 2019-07-19 11:55:51 -07:00
Paulus Schoutsen
a6e3cc6617 Merge pull request #25313 from home-assistant/rc
0.96.2
2019-07-19 11:42:01 -07:00
Andrew Sayre
b4481269ec Turn on device before setting mode (#25314) 2019-07-19 10:19:51 -07:00
Andrew Sayre
752d0deb97 Turn on device before setting mode (#25314) 2019-07-19 10:19:34 -07:00
Paulus Schoutsen
662c33af85 Bumped version to 0.96.2 2019-07-19 09:44:53 -07:00
cgtobi
fc384ca6d5 Fix plant error when adding new value (#25302)
* Only add value if int or float

* Simplify check

* Simplify check
2019-07-19 09:44:44 -07:00
Pascal Vizeli
49e2583b08 Fix HM with use wrong datapoint for off (#25298) 2019-07-19 09:44:43 -07:00
David Bonnes
8629b86186 [climate] Correct honeywell supported_features (#25292)
* Initial commit

* delint
2019-07-19 09:44:43 -07:00
William Scanlon
68c4e5c0c9 Fixed python-wink method names (#25285)
* Fixed python-wink method names

* Fixed aux heat
2019-07-19 09:44:42 -07:00
cgtobi
c4d1cd0e03 Fix fritzbox climate HVAC mode / temperature (#25275)
* Set the target temperature

* Update tests

* Update tests

* Fix linter complaints
2019-07-19 09:44:41 -07:00
Paulus Schoutsen
8b020ea5e6 Updated frontend to 20190719.0 2019-07-19 09:43:24 -07:00
Paulus Schoutsen
fd2d6c8a74 Updated frontend to 20190719.0 2019-07-19 09:43:03 -07:00
cgtobi
5552a5be70 Fix plant error when adding new value (#25302)
* Only add value if int or float

* Simplify check

* Simplify check
2019-07-19 16:09:29 +02:00
Pascal Vizeli
b9c6758dba Fix HM with use wrong datapoint for off (#25298) 2019-07-19 11:18:13 +02:00
David Bonnes
5015311d6b [climate] Correct honeywell supported_features (#25292)
* Initial commit

* delint
2019-07-19 09:51:02 +02:00
cgtobi
1eb66f3657 Fix fritzbox climate HVAC mode / temperature (#25275)
* Set the target temperature

* Update tests

* Update tests

* Fix linter complaints
2019-07-19 09:49:28 +02:00
William Scanlon
bc7e1a3797 Fixed python-wink method names (#25285)
* Fixed python-wink method names

* Fixed aux heat
2019-07-19 08:54:09 +02:00
Aaron Bach
003f7865a9 Add services to set and remove Simplisafe PINs (#25207)
* Add services to set and remove SimpliSafe PINs

* Added services spec

* Add services to set and remove SimpliSafe PINs

* Member comments

* Missed one
2019-07-18 22:07:07 -06:00
Paulus Schoutsen
33cba4da85 Merge pull request #25280 from home-assistant/rc
0.96.1
2019-07-18 15:22:49 -07:00
Philip Rosenberg-Watt
1c6d55e51b Add MQTT climate precision (#25265)
* Add MQTT climate precision

* Remove stale code
2019-07-18 15:21:50 -07:00
Greg
3fd138bbdd Add support for Rainforest Eagle-200 (#24919)
* Add support for Rainforest Eagle-200

* Removed direct access selector to monitored conditions

* Refactored code to use throttle on the update function

* Fixed issue in new code to use only one EagleReader instance

* Resolve comments

* Resolved comments

* Resolved comments and added Debug statement

* Added return statements

* Fixed typo

* Resolved comments and added debug statements

* Moved get_status method into Data object and decorated it with @staticmethod

* Resolved comments
2019-07-18 23:37:26 +02:00
Pascal Vizeli
3db106c562 Update azure-pipelines-ci.yml for Azure Pipelines 2019-07-18 23:20:56 +02:00
Pascal Vizeli
34c3d1ce47 Update azure-pipelines-ci.yml 2019-07-18 23:19:52 +02:00
Paulus Schoutsen
cc595632bd Bumped version to 0.96.1 2019-07-18 14:08:50 -07:00
stboch
f76700567e Added states and modes for zwave climate (#25274)
* Update climate.py

Added support for Fan Only State
Added additional missing modes and states; this should correct issue #25216

* Correct line lint error

* Corrected mode spelling

* Lint
2019-07-18 14:08:43 -07:00
Paulus Schoutsen
86cf02739b Add hvac modes back to opentherm (#25268) 2019-07-18 14:08:42 -07:00
Andrew Sayre
46cdbd273a Restore SmartThings A/C on/off services (#25259)
* Restore ST A/C on/off services

* Use correct OFF const

* Support AC HVAC_MODE_OFF
2019-07-18 14:08:42 -07:00
William Sutton
ec3cb11e2f Update CT80 Humidity call (#25258)
Last PR was from a few versions before, not sure how I had it working, but functioning properly now on .96
2019-07-18 14:08:41 -07:00
geekofweek
2016cf872e ecobee Preset Fix (#25256)
* ecobee Preset Fix

* Celsius Fix

* Checks Fix

* Check Fix #2

* Check Fix #3
2019-07-18 14:08:40 -07:00
cgtobi
37810e010a Fix the unit of measurement for ecobee climate (#25246)
* Fix the unit of measurement

* Remove unused const
2019-07-18 14:08:40 -07:00
cgtobi
2b69904b94 Make presets prettier (#25245) 2019-07-18 14:08:39 -07:00
Pascal Vizeli
59cf6a0c79 Fix eq3btsmart (#25238) 2019-07-18 14:08:39 -07:00
Pascal Vizeli
39b249d202 Show off value (#25236) 2019-07-18 14:08:38 -07:00
Paulus Schoutsen
d57cf01cf2 Updated frontend to 20190718.0 2019-07-18 14:08:06 -07:00
Paulus Schoutsen
997187c7d3 Updated frontend to 20190718.0 2019-07-18 14:07:55 -07:00
stboch
217da36c86 Added states and modes for zwave climate (#25274)
* Update climate.py

Added support for Fan Only State
Added additional missing modes and states; this should correct issue #25216

* Correct line lint error

* Corrected mode spelling

* Lint
2019-07-18 13:54:05 -07:00
William Sutton
be56851feb Update CT80 Humidity call (#25258)
Last PR was from a few versions before, not sure how I had it working, but functioning properly now on .96
2019-07-18 21:36:17 +02:00
Paulus Schoutsen
53954d6f8f Add hvac modes back to opentherm (#25268) 2019-07-18 21:32:17 +02:00
geekofweek
5c53257c23 ecobee Preset Fix (#25256)
* ecobee Preset Fix

* Celsius Fix

* Checks Fix

* Check Fix #2

* Check Fix #3
2019-07-18 10:39:53 -07:00
cgtobi
c7ebd109b8 Make presets prettier (#25245) 2019-07-18 10:27:04 -07:00
Daniel Shokouhi
32e89dcbb6 Add vendor support for vorwerk robots and fix zone retrieval (#25200)
* Add vendor support for vorwerk robots and fix zone retrieval

* Lint

* Review comments

* Lint

* Review comment

* Remove unused variable

* Review comment

* Remove unused variable
2019-07-18 10:22:05 -07:00
definitio
93970b5621 Add hvac_action support for MQTT HVAC (#25260)
* Add hvac_action support

* Add tests
2019-07-18 10:17:26 -07:00
Andrew Sayre
cfc2c58fe0 Restore SmartThings A/C on/off services (#25259)
* Restore ST A/C on/off services

* Use correct OFF const

* Support AC HVAC_MODE_OFF
2019-07-18 10:14:58 -07:00
Finbarr Brady
516bab9969 OpenWrt Luci RPC Device Tracker Module Bump (#25234)
* Update manifest.json

* Update requirements_all.txt
2019-07-18 15:54:04 +02:00
cgtobi
89ed26eb86 Fix the unit of measurement for ecobee climate (#25246)
* Fix the unit of measurement

* Remove unused const
2019-07-18 13:27:56 +02:00
Pascal Vizeli
e13e4376f8 Fix eq3btsmart (#25238) 2019-07-18 11:25:11 +02:00
Pascal Vizeli
75ad5f8c9e Show off value (#25236) 2019-07-18 11:24:07 +02:00
Pascal Vizeli
2accd8ed1c Update azure-pipelines-ci.yml for Azure Pipelines 2019-07-18 10:02:49 +02:00
Pascal Vizeli
d9e4050cdf Update azure-pipelines-ci.yml for Azure Pipelines 2019-07-18 09:58:36 +02:00
Pascal Vizeli
bbf1ee4c68 Update azure-pipelines-ci.yml 2019-07-18 09:50:21 +02:00
Teemu R
70cab201db Add myself to songpal codeowners (#25221) 2019-07-18 08:05:17 +02:00
Paulus Schoutsen
9a79a0aa90 Merge pull request #25205 from home-assistant/rc
0.96.0
2019-07-17 16:23:24 -07:00
Paulus Schoutsen
7089188fd5 Updated frontend to 20190717.1 2019-07-17 15:27:25 -07:00
Khole
ccc4f628f1 Hive water heater - Remove Duplication of appending entities (#25210)
* climate_water_heater

* updated names

* Update water_heater

* Update requirements

* Updated requirements

* Version update

* Updated version

* Update device list

* Removed unused Attributes

* removed duplicate appending entities

* re-added missing hotwater

* Move call to async_added_to_hass

* Move session append to async_added_to_hass

* White space
2019-07-17 15:19:03 -07:00
cgtobi
90231c5e07 Fix schema validation for service calls (#25204)
* Fix schema validation for service calls

* No need for get

* No need for get
2019-07-17 15:19:02 -07:00
cgtobi
5b24e46a29 Fix schema validation for service calls (#25204)
* Fix schema validation for service calls

* No need for get

* No need for get
2019-07-17 15:18:07 -07:00
Khole
1215398aef Hive water heater - Remove Duplication of appending entities (#25210)
* climate_water_heater

* updated names

* Update water_heater

* Update requirements

* Updated requirements

* Version update

* Updated version

* Update device list

* Removed unused Attributes

* removed duplicate appending entities

* re-added missing hotwater

* Move call to async_added_to_hass

* Move session append to async_added_to_hass

* White space
2019-07-17 15:17:44 -07:00
Franck Nijhof
9550a38f22 Upgrades Dockerfiles to Debian Buster (#25208) 2019-07-17 15:16:15 -07:00
Aaron Bach
4e20e4964e Bump simplisafe-python to 4.0.0 + add additional SimpliSafe attributes (#25202)
* Bump simplisafe-python to 4.0.0 + add additional SimpliSafe attributes

* Fixed incorrect attr assignment

* Member comments

* Add system ID as a state attribute
2019-07-17 16:13:03 -06:00
Paulus Schoutsen
ff5dd0cf42 Updated frontend to 20190717.1 2019-07-17 15:10:57 -07:00
Paulus Schoutsen
5d7f420821 Fix ecobee missing preset mode support flag (#25211) 2019-07-17 15:08:14 -07:00
Paulus Schoutsen
a5012f39da Fix ecobee missing preset mode support flag (#25211) 2019-07-17 15:07:14 -07:00
Paulus Schoutsen
e4bb955498 Pin Docker to Debian Stretch (#25206)
* Pin Docker to Debian Stretch

* Update dev docker too
2019-07-17 14:05:00 -07:00
Paulus Schoutsen
8f7767d5e5 Pin Docker to Debian Stretch (#25206)
* Pin Docker to Debian Stretch

* Update dev docker too
2019-07-17 14:04:13 -07:00
Paulus Schoutsen
74d0e65958 Bumped version to 0.96.0 2019-07-17 13:42:32 -07:00
Paulus Schoutsen
3cfbbdc720 Only include target temp if has right support flag (#25193)
* Only include target temp if has right support flag

* Remove comma
2019-07-17 13:09:07 -07:00
David Bonnes
3d5c773670 [climate] Tweak evohome migration (#25187)
* de-lint

* use _evo_tcs instead of _evo_device for TCS

* add hvac_action to zones, remove target_temp from controller

* fix incorrect hvac_action

* de-lint
2019-07-17 13:09:06 -07:00
David F. Mulcahey
b5b0f56ae7 Fix device name customization on ZHA add devices page (#25180)
* ensure new device exists

* clean up dev reg handling

* update test

* fix tests
2019-07-17 13:09:05 -07:00
Paulus Schoutsen
c03d5f1a73 Correctly set property decorator on preset modes (#25151) 2019-07-17 13:09:05 -07:00
Paulus Schoutsen
5abe4dd1f7 Updated frontend to 20190717.0 2019-07-17 13:08:13 -07:00
Paulus Schoutsen
b507822280 Updated frontend to 20190717.0 2019-07-17 13:08:02 -07:00
Markus Jankowski
ded9eb89bb Add HmIP-PCBS2, HmIP-PCBS-BAT to Homematic IP Cloud (#25201)
* Add HmIP-PCBS2, HmIP-PCBS-BAT to Homematic IP Cloud

* fix lint
2019-07-17 21:29:25 +02:00
Kees Schollaart
bc4f91a89a Simplify cache restore (#25186)
* Simplify cache restore

* Missed the task version

* RestoreAndSaveCache1@1 > RestoreAndSaveCache@1

* Revert changes on cache saving & add comment

* Trim whitespaces
2019-07-17 21:23:36 +02:00
Paulus Schoutsen
971223de19 Only include target temp if has right support flag (#25193)
* Only include target temp if has right support flag

* Remove comma
2019-07-17 12:09:44 -07:00
tgermain
60ca8b95a4 Fix issue #24495 (#25199) 2019-07-17 13:05:26 -06:00
tetienne
a012c61762 Handle somfy expired token (#25195)
* HANDLE expired token

* RENAME constant

* FIX typo
2019-07-17 11:09:46 -04:00
bouni
21f68b80ea Add login_method config option to fix login issue with RouterOS Version > 6.43 (#25194)
* added login_method config option to fix login issue with RouterOS Version > 6.43

* minor changes so that users don't have to change their config

* removed default config value to make the fallback without config change work as expected
2019-07-17 15:02:15 +02:00
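A minimal sketch of the fallback behaviour described in #25194 above: if no `login_method` is configured, both RouterOS login routines are attempted in turn. The `_login_plain`/`_login_token` callables are hypothetical stand-ins, not the integration's actual client API.

```python
from typing import Callable, Dict, Optional

# Hypothetical stand-ins for the two RouterOS login routines; the real
# integration delegates these calls to its API client library.
def _login_plain(host: str, username: str, password: str):
    raise ConnectionError("plain login rejected")   # e.g. not accepted by this router

def _login_token(host: str, username: str, password: str):
    return object()                                 # pretend API handle

LOGIN_METHODS: Dict[str, Callable] = {"plain": _login_plain, "token": _login_token}

def connect_with_fallback(host, username, password, login_method: Optional[str] = None):
    """Use the configured login method, or try every known one in turn."""
    order = (login_method,) if login_method else tuple(LOGIN_METHODS)
    last_error: Optional[Exception] = None
    for name in order:
        try:
            return LOGIN_METHODS[name](host, username, password)
        except ConnectionError as err:
            last_error = err
    raise last_error
```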
Markus Jankowski
8bae7a45a5 Add HMIP-FCI / HMIP-FBL / HmIP-BBL (#25188) 2019-07-16 18:05:57 -07:00
David Bonnes
2bac24fbb7 [climate] Tweak evohome migration (#25187)
* de-lint

* use _evo_tcs instead of _evo_device for TCS

* add hvac_action to zones, remove target_temp from controller

* fix incorrect hvac_action

* de-lint
2019-07-16 15:18:21 -07:00
David F. Mulcahey
ac91423d71 Fix device name customization on ZHA add devices page (#25180)
* ensure new device exists

* clean up dev reg handling

* update test

* fix tests
2019-07-16 15:16:49 -07:00
Ville Skyttä
56841da2d3 Upgrade mypy to 0.720, turn on unreachability warnings (#25157)
* Upgrade mypy to 0.720

* Turn on mypy unreachability warnings, address raised issues
2019-07-16 15:11:38 -07:00
Paulus Schoutsen
026dbffa77 Bumped version to 0.96.0b4 2019-07-16 14:59:46 -07:00
Fabian Affolter
e74fc9836d Upgrade luftdaten to 0.6.2 (#25177) 2019-07-16 14:59:38 -07:00
Alexei Chetroi
c7dfec702d Fix climate is_aux_heat type hint. (#25170) 2019-07-16 14:59:37 -07:00
Anders Melchiorsen
0f8f9db319 Update pysonos to 0.0.21 (#25168) 2019-07-16 14:59:37 -07:00
Daniel Perna
366ad8202a Fix device types for some HomeMatic IP sensors (#25167)
* Update pyhomematic to 0.1.60

* Devicetype for pyhomematic classes, fixes #24080
2019-07-16 14:59:36 -07:00
Andrew Sayre
20301ae888 Use MockConfigEntry (#25190) 2019-07-16 14:51:30 -07:00
Ryan Claussen
de3d28d9d5 Add severe weather sensor to Dark Sky (#22701)
* Add severe weather alert sensor to Dark Sky

* fixup test case

* address review comments and fixup testcases

* address comments, fix assertion order

* remove extra line

* remove index increment
2019-07-16 18:03:05 +02:00
Paulus Schoutsen
b52848d376 Fix typo in azure-pipelines-ci.yml 2019-07-16 08:47:07 -07:00
cgtobi
9c2625f0a5 Raise not ready when no data from API is retrieved (#25182) 2019-07-16 17:16:35 +02:00
Fran
4afc19ff3a Improve Nuki lock (#22888)
Using port on bridge initialization
Service: check_connection
Attribute: available
Updated requirements_all.txt
Change unlatch service for open service

Removed extra info

nuki_lock_n_go renamed to lock_n_go
nuki_check_connection renamed to check_connection
2019-07-16 17:06:47 +02:00
Pascal Vizeli
91d065314c Delete config.yml (#25181) 2019-07-16 14:13:44 +02:00
Fabian Affolter
8a6515936d Upgrade luftdaten to 0.6.2 (#25177) 2019-07-16 11:32:38 +02:00
Fabian Affolter
3381fa0ac4 Upgrade Mastodon.py to 1.4.5 (#25176) 2019-07-16 11:26:07 +02:00
Fabian Affolter
aac01aaa50 Upgrade ruamel.yaml to 0.15.99 (#25175) 2019-07-16 11:16:43 +02:00
Fabian Affolter
a096858426 Upgrade discord.py to 1.2.3 (#25174) 2019-07-16 11:16:34 +02:00
Thomas Le Gentil
3d3dd05789 Add Fortigate integration (#24908)
* Add Fortigate integration

* added feedback changes

* removed the only case

* fixed a description

* removed the CONFIG_PLATFORM

* deleted README

* added return from setup

* added return from setup

* fixed reviews

* Link updated

* Rename var and a couple of other minor changes

* Typos
2019-07-16 11:15:59 +02:00
Fabian Affolter
f9ae6f6ce7 Upgrade youtube_dl to 2019.07.16 (#25173) 2019-07-16 10:48:10 +02:00
Alexei Chetroi
e8fd01bea5 Fix climate is_aux_heat type hint. (#25170) 2019-07-16 09:32:09 +02:00
Leonardo Merza
64b9102206 Add travel time attribution/coordinates (#24956)
* add google travel time attribution

* add origin/destination

* update waze origin/destination

* add attribution and origin/destination

* add google attribution
2019-07-16 04:51:51 +02:00
Leandro Loureiro
dcb12a992a Add spotify service to allow to play music from playlist (#24991)
* adding a custom service to the Spotify component to allow playing random playlist music

* fixing findings

* improving naming

* improving way of using required parameters
2019-07-16 04:41:16 +02:00
cgtobi
25285ef6a7 Fix Netatmo climate battery level (#25165)
* Interpolate battery level

* Sort list
2019-07-16 04:27:28 +02:00
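The "Interpolate battery level" change in #25165 above maps a raw battery reading onto a percentage. A generic sketch of that kind of interpolation is shown below; the voltage breakpoints are purely illustrative, not Netatmo's actual thresholds.

```python
# Illustrative breakpoints only: (raw reading, percent) pairs in ascending order.
LEVELS = [(2200, 0), (2400, 25), (2700, 50), (3000, 75), (3200, 100)]

def battery_percent(vp: int) -> int:
    """Linearly interpolate a raw battery reading onto a 0-100% scale."""
    if vp <= LEVELS[0][0]:
        return 0
    if vp >= LEVELS[-1][0]:
        return 100
    for (lo_v, lo_pct), (hi_v, hi_pct) in zip(LEVELS, LEVELS[1:]):
        if lo_v <= vp <= hi_v:
            return round(lo_pct + (vp - lo_v) * (hi_pct - lo_pct) / (hi_v - lo_v))
    return 0

print(battery_percent(2550))  # -> 38
```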
Anders Melchiorsen
5e5abf77da Update pysonos to 0.0.21 (#25168) 2019-07-16 04:22:41 +02:00
Austin Drummond
c2f4f06005 Add HomeKit Reset Accessory (#25158)
* added the ability to reset homekit accessories

* added tests for homekit reset accessory

* minor fixes
2019-07-16 03:43:37 +02:00
Paulus Schoutsen
28bd7b6a4e Bumped version to 0.96.0b3 2019-07-15 13:57:41 -07:00
Paulus Schoutsen
c04049d6f6 Make dev tools title translatable (#25166) 2019-07-15 13:57:35 -07:00
Joakim Sørensen
ff79e437d2 Version sensor update (#25162)
* component -> integration

* Bump pyhaversion to 3.0.2

* Update requirements

* Formatting
2019-07-15 13:57:34 -07:00
Daniel Perna
c8b495f224 Update pyhomematic to 0.1.60 (#25152) 2019-07-15 13:57:33 -07:00
Josh Anderson
842c1a2274 Remove check and restore temp/mode changes (#25149) 2019-07-15 13:57:33 -07:00
Khole
50b145cf05 [Climate] Hive Add water heater Component post the refresh of the climate component. (#25148)
* climate_water_heater

* updated names

* Update water_heater

* Update requirements

* Updated requirements

* Version update

* Updated version

* Update device list

* Removed unused Attributes
2019-07-15 13:57:32 -07:00
David Bonnes
78a0d72a5c [climate-1.0] Add RoundThermostat to evohome (#25141)
* initial commit

* improve enumeration of zone(s)

* remove unused self._config

* remove unused self._config 2

* remove unused self._id

* clean up device_state_attributes

* remove some pylint: disable=protected-access

* remove LOGGER.warn(

* refactor for RoundThermostat

* ready for review

* small tweak

* small tweak 2

* fix regression, tweak

* tidy up docstring

* simplify code
2019-07-15 13:57:32 -07:00
Markus Jankowski
97ca0d81e7 remove comfort mode (#25140) 2019-07-15 13:57:31 -07:00
David Bonnes
02e8ee137f [climate-1.0] Bugfix evohome showstopper (#25139)
* initial commit

* small tweak
2019-07-15 13:57:31 -07:00
Anders Melchiorsen
2643bbc228 Handle Sonos connection errors during setup (#25135) 2019-07-15 13:57:30 -07:00
Joakim Plate
c8d7e1346c Load requirements for platforms (#25133)
Fixes #25124 and fixes #25126
2019-07-15 13:57:30 -07:00
Paulus Schoutsen
7dedf173ad Allow area ID in service call schemas (#25121)
* Allow area ID in service call schemas

* Remove ATTR_ENTITY_ID from service light turn off schema
2019-07-15 13:57:29 -07:00
Paulus Schoutsen
65593e36b1 Verify cloud user exists during boot (#25119) 2019-07-15 13:57:28 -07:00
Paulus Schoutsen
8996e330b8 Simplify Alexa/Google for new climate turn_on/off (#25115) 2019-07-15 13:57:28 -07:00
Markus Jankowski
e85d434f4e Add climate related services to Homematic IP Cloud (#25079)
* add hmip climate services

* Rename accesspoint_id to hapid

to comply with config

* Revert "Rename accesspoint_id to hapid"

This reverts commit 4a3cd14e1482fb508273c728ad8020945b02e426.
2019-07-15 13:57:27 -07:00
Paulus Schoutsen
82d9488ec8 Updated frontend to 20190715.0 2019-07-15 13:55:59 -07:00
Paulus Schoutsen
cca50a8339 Updated frontend to 20190715.0 2019-07-15 13:54:30 -07:00
Daniel Perna
84373ce754 Fix device types for some HomeMatic IP sensors (#25167)
* Update pyhomematic to 0.1.60

* Devicetype for pyhomematic classes, fixes #24080
2019-07-15 13:39:52 -07:00
Paulus Schoutsen
67546ce0b1 Make dev tools title translatable (#25166) 2019-07-15 13:39:04 -07:00
Pascal Vizeli
4fc302b67a Update azure-pipelines-release.yml for Azure Pipelines 2019-07-15 22:38:48 +02:00
Pascal Vizeli
ac33c22689 Update azure-pipelines-release.yml for Azure Pipelines 2019-07-15 22:38:12 +02:00
Paulus Schoutsen
7aae490a85 Allow area ID in service call schemas (#25121)
* Allow area ID in service call schemas

* Remove ATTR_ENTITY_ID from service light turn off schema
2019-07-15 11:31:53 -07:00
Joakim Sørensen
50f9117982 Version sensor update (#25162)
* component -> integration

* Bump pyhaversion to 3.0.2

* Update requirements

* Formatting
2019-07-15 19:38:21 +02:00
ktnrg45
99c6c60bec PS4 Add tests for init (#25161)
* Add some tests for init

* Remove init

* Add config entry version

* Use const for version

* Remove var
2019-07-15 08:47:47 -07:00
Pascal Vizeli
9548345ed0 Update azure-pipelines-wheels.yml for Azure Pipelines 2019-07-15 15:23:08 +02:00
Pascal Vizeli
d444ba397b Update azure-pipelines-wheels.yml for Azure Pipelines 2019-07-15 15:22:41 +02:00
Pascal Vizeli
62df3c00df Update azure-pipelines-wheels.yml for Azure Pipelines 2019-07-15 15:22:19 +02:00
cgtobi
831564784a Add Netatmo climate battery level (#25143)
* Add battery level sensor

* Only update battery level if lower or nonexistent
2019-07-15 09:46:48 +02:00
Daniel Perna
17013c7c2c Update pyhomematic to 0.1.60 (#25152) 2019-07-14 20:21:37 -07:00
David Bonnes
3ddd482cc1 [climate-1.0] Add RoundThermostat to evohome (#25141)
* initial commit

* improve enumeration of zone(s)

* remove unused self._config

* remove unused self._config 2

* remove unused self._id

* clean up device_state_attributes

* remove some pylint: disable=protected-access

* remove LOGGER.warn(

* refactor for RoundThermostat

* ready for review

* small tweak

* small tweak 2

* fix regression, tweak

* tidy up docstring

* simplify code
2019-07-14 20:14:24 -07:00
Khole
bcf85a0df1 [Climate] Hive Add water heater Component post the refresh of the climate component. (#25148)
* climate_water_heater

* updated names

* Update water_heater

* Update requirements

* Updated requirements

* Version update

* Updated version

* Update device list

* Removed unused Attributes
2019-07-14 23:54:07 +02:00
Paulus Schoutsen
0a8b68fd4d Correctly set property decorator on preset modes (#25151) 2019-07-14 14:45:44 -07:00
Josh Anderson
08f12750f1 Remove check and restore temp/mode changes (#25149) 2019-07-14 17:38:57 -04:00
Anders Melchiorsen
1798522ec8 Handle Sonos connection errors during setup (#25135) 2019-07-14 14:36:05 -07:00
Markus Jankowski
d91e5a6b66 remove comfort mode (#25140) 2019-07-14 14:31:32 -07:00
Joakim Plate
b57c60ad7a Load requirements for platforms (#25133)
Fixes #25124 and fixes #25126
2019-07-14 14:13:36 -07:00
Frederik Bolding
fa8ae0865e Small changes to bluetooth RSSI tracking (#25056)
* Updated bt_proximity dependency

* Closed bluetooth socket after RSSI request

* Updated bt_proximity requirement in manifest
2019-07-14 23:11:54 +02:00
Robert Svensson
01b890f426 Merge UniFi device tracker to config entry (#24367)
* Move device tracker to use config entry

* Remove monitored conditions attributes based on ADR0003

* Add support for import of device tracker config to be backwards compatible

* Remove unnecessary configuration options from device tracker

* Add component configuration support
2019-07-14 21:57:09 +02:00
David Bonnes
3480e6229a [climate-1.0] Bugfix evohome showstopper (#25139)
* initial commit

* small tweak
2019-07-14 09:40:06 -07:00
cgtobi
e6a2dde19a Fix aggregation in Netatmo public sensor (#25132)
* Clean up values

* Fix divisor
2019-07-14 12:46:17 +02:00
Franck Nijhof
9d4b5ee58d Add Twente Milieu integration (#25129)
* Adds Twente Milieu integration

* Addresses flake8 warnings

* Adds required test deps

* Fixes path typo in coveragerc

* dispatcher_send -> async_dispatcher_send

Signed-off-by: Franck Nijhof <frenck@addons.community>

* Removes not needed __init__

Signed-off-by: Franck Nijhof <frenck@addons.community>

* Remove explicitly setting None default value on get call

Signed-off-by: Franck Nijhof <frenck@addons.community>

* Correct typo in comment

Signed-off-by: Franck Nijhof <frenck@addons.community>

* Clean storage for only the unloaded entry

Signed-off-by: Franck Nijhof <frenck@addons.community>

* asyncio.wait on updating all integrations

Signed-off-by: Franck Nijhof <frenck@addons.community>

* Use string formatting

Signed-off-by: Franck Nijhof <frenck@addons.community>

* Set a more sane SCAN_INTERVAL

Signed-off-by: Franck Nijhof <frenck@addons.community>

* Small refactor around services

Signed-off-by: Franck Nijhof <frenck@addons.community>

* Small styling correction

* Extract update logic into own function

Signed-off-by: Franck Nijhof <frenck@addons.community>

* Addresses flake8 warnings
2019-07-14 12:30:23 +02:00
Austin Mroczek
369e6a3905 Move totalconnect from platform to component config (#24427)
* Move totalconnect component toward being a multi-platform integration.  Bump total_connect_client to 0.28.

* add missing total-connect alarm state mappings

* Made changes recommended by MartinHjelmare at
https://github.com/home-assistant/home-assistant/pull/24427

* Update __init__.py

* Updates per MartinHjelmare comments

* flake8/pydocstyle fixes

* removed . at end of log message

* added blank line between logging and voluptuous

* more fixes
2019-07-14 09:24:40 +02:00
ktnrg45
b77d060304 PS4 move load_games and save_games helpers to init from media_player (#25127)
* Add constant for games_file

* move load and save games to init from media_player

* Move save and load games to init

* Missed arg

* missed arg
2019-07-13 20:11:19 +02:00
michaeldavie
a147a189ca Update Environment Canada platforms (#24884)
* Add support for French

* Move labels to env_canada

* Bump env_canada to 0.0.17, change update frequency to 1 minute

* Update requirements_all.txt

* Set entity IDs separate from labels

* Flake error

* Remove monitored conditions

* Use next hourly forecast for missing conditions

* Switch sensors to unique_id

* Flake error

* Requested changes

* Simplify setting location parameters
2019-07-13 18:14:29 +02:00
Fabian Affolter
1e474bb5da Upgrade youtube_dl to 2019.07.12 (#25128) 2019-07-13 18:10:09 +02:00
Paulus Schoutsen
d37d1ce4ad Simplify Alexa/Google for new climate turn_on/off (#25115) 2019-07-13 10:27:50 +02:00
Paulus Schoutsen
8ec75cf883 Verify cloud user exists during boot (#25119) 2019-07-13 09:33:31 +02:00
Ville Skyttä
59f6fd7630 Upgrade flake8 to 3.7.8 (#25120)
http://flake8.pycqa.org/en/latest/release-notes/3.7.8.html
2019-07-13 09:32:08 +02:00
ktnrg45
68edf10270 PS4 handle no connection/ fix spamming of logs when device is off (#25091)
* Bump 0.8.7

* Bump 0.8.7

* 0.8.7

* Handle exception. Handle  device unavailable.

* Typo

* Blank line
2019-07-12 20:45:04 -06:00
cgtobi
c6b63b15b8 Add more public rain sensors (#25117) 2019-07-12 22:05:54 -04:00
Alex S
f705a1e62e Splunk component filter support (#25071)
* Added code to support entity and domain filters in the config for splunk component, and the code to enforce the filter.

* * Moved code for posting splunk request to separate function, primarily to more easily write a test case where I can mock the point where the post would occur and validate that filtering is working correctly.
* Test cases created for full config check and to test the filtering

* Correcting static check errors/issues

* Correcting flake8 static check issue (introduced when addressing prior static check issues)

* Removing unused parameter to setup function - cleanup from reviewer request.
2019-07-13 00:35:23 +02:00
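The filter support added in #25071 above gates which state-change events are posted to Splunk. A simplified, self-contained stand-in for the include/exclude predicate is sketched below; the real integration uses Home Assistant's entityfilter helper, whose exact precedence rules may differ.

```python
def generate_filter(include_domains=(), include_entities=(),
                    exclude_domains=(), exclude_entities=()):
    """Return a predicate deciding whether an entity's events are forwarded."""
    def entity_filter(entity_id: str) -> bool:
        domain = entity_id.split(".", 1)[0]
        if entity_id in include_entities:
            return True
        if entity_id in exclude_entities:
            return False
        if include_domains:
            return domain in include_domains
        if exclude_domains:
            return domain not in exclude_domains
        return True
    return entity_filter

# Only events passing the filter would be posted to Splunk.
allowed = generate_filter(include_domains=["sensor"],
                          exclude_entities=["sensor.noisy_debug"])
assert allowed("sensor.kitchen_temperature")
assert not allowed("sensor.noisy_debug")
assert not allowed("light.kitchen")
```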
Paulus Schoutsen
c884f9edbc Bumped version to 0.96.0b2 2019-07-12 15:09:02 -07:00
Aaron Bach
d0af73efe1 Fix missing sensor unit in RainMachine (#25101) 2019-07-12 15:08:56 -07:00
Aaron Bach
5eb7268ae7 Fix window exception in WWLLN (#25100)
* Beta fix: handle window exception in WWLLN

* Fixed test

* Fix bug

* Member comments

* Removed unused import
2019-07-12 15:08:55 -07:00
On Freund
60c2e5e2e2 Add turn on/off to coolmaster (#25097) 2019-07-12 15:08:54 -07:00
cgtobi
4e69b5b45f Fix Netatmo climate issue when device out of reach (#25096)
* Fix valve/thermostat out of reach

* Fix boost for valves

* Set netatmo default max temp to 30

* Remove unnecessary get

* Remove unnecessary default value

* Readd get
2019-07-12 15:08:54 -07:00
David Bonnes
1d784bdc05 [climate] Add water_heater to evohome (#25035)
* initial commit

* refactor for sync

* minor tweak

* refactor convert code

* fix regression

* remove bad await

* de-lint

* de-lint 2

* address edge case - invalid tokens

* address edge case - delint

* handle no schedule

* improve support for RoundThermostat

* tweak logging

* delint

* refactor for greatness

* use time_zone: for state attributes

* small tweak

* small tweak 2

* have datetime state attributes as UTC

* have datetime state attributes as UTC - delint

* have datetime state attributes as UTC - tweak

* missed this - remove

* de-lint type hint

* use parse_datetime instead of datetime.strptime

* remove debug code

* state attribute datetimes are UTC now

* revert

* de-lint (again)

* tweak type hints

* de-lint (again, again)

* tweak type hints

* Convert datetime closer to sending it out
2019-07-12 15:08:53 -07:00
Paulus Schoutsen
9181660497 Updated frontend to 20190712.0 2019-07-12 15:05:02 -07:00
Paulus Schoutsen
f7aa1b026f Updated frontend to 20190712.0 2019-07-12 14:58:50 -07:00
Aaron Bach
c73fa6157d Add additional WWLLN test (#25111) 2019-07-12 14:36:49 -06:00
David Bonnes
de43237f6d [climate] Add water_heater to evohome (#25035)
* initial commit

* refactor for sync

* minor tweak

* refactor convert code

* fix regression

* remove bad await

* de-lint

* de-lint 2

* address edge case - invalid tokens

* address edge case - delint

* handle no schedule

* improve support for RoundThermostat

* tweak logging

* delint

* refactor for greatness

* use time_zone: for state attributes

* small tweak

* small tweak 2

* have datetime state attributes as UTC

* have datetime state attributes as UTC - delint

* have datetime state attributes as UTC - tweak

* missed this - remove

* de-lint type hint

* use parse_datetime instead of datetime.strptime

* remove debug code

* state attribute datetimes are UTC now

* revert

* de-lint (again)

* tweak type hints

* de-lint (again, again)

* tweak type hints

* Convert datetime closer to sending it out
2019-07-12 21:29:45 +02:00
escoand
49abda2d49 Use more compatible samsungtv TV key (#25083)
* use more compatible TV key

* Remove extra spaces
2019-07-12 11:28:30 -07:00
Tom Harris
1368501cba Bump insteonplm to 0.16.3 (#25108) 2019-07-12 19:47:59 +02:00
Victor Vostrikov
eae63cd231 Add support for multiple N26 accounts (#25086)
* Added support of multiple accounts for n26

* Code cleanup

* Added check for proper config

* Fixed lints
2019-07-12 18:59:40 +02:00
Aaron Bach
b69663857b Fix missing sensor unit in RainMachine (#25101) 2019-07-12 17:59:04 +02:00
cgtobi
a9980c8be0 Fix Netatmo climate issue when device out of reach (#25096)
* Fix valve/thermostat out of reach

* Fix boost for valves

* Set netatmo default max temp to 30

* Remove unnecessary get

* Remove unnecessary default value

* Readd get
2019-07-12 17:43:18 +02:00
Aaron Bach
31dd6364c3 Fix window exception in WWLLN (#25100)
* Beta fix: handle window exception in WWLLN

* Fixed test

* Fix bug

* Member comments

* Removed unused import
2019-07-12 17:41:47 +02:00
On Freund
0478e7f41d Add turn on/off to coolmaster (#25097) 2019-07-12 17:40:28 +02:00
Wim Haanstra
f25f44a75b Rename RitAssist to FleetGO (#25093) 2019-07-12 16:14:58 +02:00
ktnrg45
bbe45cbd4b Ps4 move send_command service to init (#25094)
* Move services from media_player

* Move services to init

* add COMMANDS to const

* change service handler to sync
2019-07-12 13:14:35 +02:00
Aaron Bach
69cc6affd5 Add support for recording history to Apache Kafka (#25085)
* Add support for Apache Kafka

* Simplified

* Revert "Simplified"

This reverts commit fde4624e07.

* Revert "Revert "Simplified""

This reverts commit 5ae57e64c2.

* Completed

* Updated requirements

* Updated .coveragerc

* Removed unused import

* Updated codeowner
2019-07-12 13:13:51 +02:00
Pascal Vizeli
8fdebf4f8f Update azure-pipelines-ci.yml for Azure Pipelines 2019-07-12 12:08:53 +02:00
Pascal Vizeli
f2ae2c128d Update azure-pipelines-ci.yml for Azure Pipelines 2019-07-12 11:27:31 +02:00
Pascal Vizeli
95abd91354 Update azure-pipelines-wheels.yml for Azure Pipelines 2019-07-12 09:42:21 +02:00
Pascal Vizeli
aaeca69bd5 Update azure-pipelines-ci.yml for Azure Pipelines 2019-07-12 09:29:55 +02:00
Pascal Vizeli
7fc8ff982b Version bump to 0.96.0b1 2019-07-12 07:24:30 +00:00
Paulus Schoutsen
155c75c54a Guard module being None (#25077) 2019-07-12 07:23:30 +00:00
Anders Melchiorsen
afade4e997 Support podcast episodes as Sonos favorites (#25087) 2019-07-12 07:22:02 +00:00
Pascal Vizeli
53111f6426 Fix powercontrol media player alexa (#25080) 2019-07-12 07:21:59 +00:00
Aaron Bach
53a701b12c Change unique_id formula for Notion entities (#25076)
* Change unique_id formula for Notion entities

* Don't use name
2019-07-12 07:21:57 +00:00
Pascal Vizeli
a0e45cce79 Add support for on/off climate (#25026)
* Add support for on/off climate

* address comments

* Add test for sync overwrite

* Add more tests
2019-07-12 07:21:54 +00:00
Paulus Schoutsen
2b62ea1f0e Do not reverse open/close calls (#24879) 2019-07-12 07:19:26 +00:00
Pascal Vizeli
cc7b65a6c8 Support hass-release inside devcontainer (#25090) 2019-07-12 07:18:31 +00:00
Pascal Vizeli
60fe4c9ae0 Support hass-release inside devcontainer (#25090) 2019-07-12 09:16:14 +02:00
Anders Melchiorsen
6173d7c8a0 Support podcast episodes as Sonos favorites (#25087) 2019-07-12 07:08:57 +02:00
Pascal Vizeli
d47905d119 Add support for on/off climate (#25026)
* Add support for on/off climate

* address comments

* Add test for sync overwrite

* Add more tests
2019-07-11 15:28:11 -07:00
Pascal Vizeli
f5a4af40ee Update azure-pipelines-wheels.yml 2019-07-11 22:10:40 +02:00
Matthias Alphart
e299d7b3d6 Update KNX component to xknx 0.11 (#24738)
* update component for xknx 0.11.0

- expose sensor state is not casted to float anymore
- climate mode operation list has no more None values
- light supports white_value (rgbw)
- sensor expects `group_address_state` now instead of `group_address`
- sensor forwards device_class if available

* update manifest to use xknx 0.11.0

* update requirements_all for xknx 0.11.0

* update for xknx 0.11.1

- require xknx 0.11.1
- use 'state_address' instead of 'address' in sensor and binary_sensor configuration
- optional 'sync_state' for sensors and binary_sensors

* remove questionable `del kwargs`
2019-07-11 22:01:37 +02:00
Pascal Vizeli
78a5dc71ac Fix powercontrol media player alexa (#25080) 2019-07-11 08:35:46 -07:00
Markus Jankowski
04b4284746 Add climate related services to Homematic IP Cloud (#25079)
* add hmip climate services

* Rename accesspoint_id to hapid

to comply with config

* Revert "Rename accesspoint_id to hapid"

This reverts commit 4a3cd14e1482fb508273c728ad8020945b02e426.
2019-07-11 15:14:05 +02:00
Pascal Vizeli
2be5e0dcf9 Update azure-pipelines-wheels.yml for Azure Pipelines 2019-07-11 12:21:01 +02:00
Niels Mündler
71ddebbf41 Remove monitored conditions from syncthru (#25052) 2019-07-11 11:13:34 +02:00
Paulus Schoutsen
27d750db1c Guard module being None (#25077) 2019-07-11 09:38:58 +02:00
Pascal Vizeli
3b6b421152 Update azure-pipelines-release.yml for Azure Pipelines 2019-07-11 09:24:45 +02:00
Pascal Vizeli
b8ee3536b3 Update azure-pipelines-release.yml for Azure Pipelines 2019-07-11 09:24:15 +02:00
Pascal Vizeli
fcb1783f56 Update azure-pipelines-release.yml for Azure Pipelines 2019-07-11 09:22:37 +02:00
Pascal Vizeli
8937d44399 Update azure-pipelines-release.yml for Azure Pipelines 2019-07-11 09:21:58 +02:00
Pascal Vizeli
8041339052 Update azure-pipelines-release.yml for Azure Pipelines 2019-07-11 09:17:44 +02:00
Pascal Vizeli
f0fe865798 Update azure-pipelines-release.yml for Azure Pipelines 2019-07-11 09:15:14 +02:00
Paulus Schoutsen
2eecb08b51 Do not reverse open/close calls (#24879) 2019-07-10 23:33:38 -07:00
Aaron Bach
51a40c0441 Change unique_id formula for Notion entities (#25076)
* Change unique_id formula for Notion entities

* Don't use name
2019-07-11 07:31:03 +02:00
Martin Hjelmare
177f5a35ae Rewrite calendar component (#24950)
* Correct google calendar test name

* Rewrite calendar component

* Save component in hass.data.
* Rename device_state_attributes to state_attributes.
* Remove offset attribute from base state_attributes.
* Extract offset helpers to calendar component.
* Clean imports.
* Remove stale constants.
* Remove name and add async_get_events.
* Add normalize_event helper function. Copied from #21495.
* Add event property to base entity.
* Use event property for calendar state.
* Ensure event start and end.
* Remove entity init.
* Add comment about event data class.
* Temporary keep old start and end datetime format.

* Convert demo calendar

* Convert google calendar

* Convert google calendar.
* Clean up google component.
* Keep offset feature by using offset helpers.

* Convert caldav calendar

* Clean up caldav calendar.
* Update caldav cal on addition.
* Bring back offset to caldav calendar.
* Copy caldav event on update.

* Convert todoist calendar
2019-07-10 20:59:37 -07:00
Paulus Schoutsen
87d3680630 Merge commit 'df920b4eda1d64368ed3bf166bcb0a90aeec6c44' into rc 2019-07-10 20:54:06 -07:00
Paulus Schoutsen
c6af8811fb Version bump to 0.97.0dev0 2019-07-10 20:50:31 -07:00
Paulus Schoutsen
bd7c0e87d5 Version bump to 0.96.0b0 2019-07-10 20:49:56 -07:00
Paulus Schoutsen
df920b4eda Merge remote-tracking branch 'origin/master' into dev 2019-07-10 20:48:54 -07:00
Paulus Schoutsen
cde3f670c2 pylint 2019-07-10 20:47:27 -07:00
Phil Bruckner
c80683bb15 Restore automation last_triggered as datetime & fix test (#24951)
* Restore automation last_triggered as datetime & fix test

* last_triggered is always a string
2019-07-10 20:42:38 -07:00
Paulus Schoutsen
073327831f Correctly store removed entities for restore state (#25073)
* Correctly store removed entities for restore state

* Lint

* Do not assume about set encoding
2019-07-10 20:41:03 -07:00
Charles Garwood
312fceeaf6 Add websocket API command for Z-Wave network status (#25066)
* Add websocket API command for Z-Wave network status

* lint

* Add callback decorator

* Remove state_str, fix lint
2019-07-10 19:50:42 -07:00
monte-monte
42d2f30ab8 Complete OPERATION_MODES (#25069)
The XKNX library has a complete list of KNX controller modes, but the current version of the HA KNX climate plugin uses only two of them, and one is named incorrectly ("Dehumidification" instead of "Dry"): https://github.com/XKNX/xknx/blob/master/xknx/knx/dpt_hvac_mode.py
I've added the missing control modes that have corresponding operation modes in HA. I tested this patch on my KNX IntesisBox, which is used with a Mitsubishi split AC; all modes were detected correctly and work as expected.
I've also corrected the datapoint number in a comment, because it was pointing to the wrong one: http://www.sti.uniurb.it/romanell/Domotica_e_Edifici_Intelligenti/110504-Lez10a-KNX-Datapoint%20Types%20v1.5.00%20AS.pdf (see page 94).
2019-07-10 15:59:43 -07:00
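For context on #25069 above, a rough sketch of the mode translation is shown below; the names are illustrative, and the authoritative list lives in xknx's dpt_hvac_mode module and in Home Assistant's climate constants.

```python
# Illustrative mapping only; real member names live in xknx's HVACOperationMode
# enum (xknx/knx/dpt_hvac_mode.py) and Home Assistant's climate constants.
KNX_CONTROLLER_MODE_TO_HA = {
    "Heat": "heat",
    "Cool": "cool",
    "Fan only": "fan_only",
    "Dry": "dry",        # previously surfaced as "Dehumidification"
    "Auto": "auto",
    "Off": "off",
}

def ha_operation_mode(knx_mode: str) -> str:
    """Translate a KNX controller mode name into an HA operation mode."""
    return KNX_CONTROLLER_MODE_TO_HA.get(knx_mode, "auto")
```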
Jeff Irion
4844477d3a Make sure volume level is valid when incrementing/decrementing (#25061)
* Make sure volume level is not None before incrementing/decrementing

* Pass linting checks
2019-07-10 15:58:29 -07:00
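A minimal sketch of the guard added in #25061 above, assuming a media-player-like entity whose volume_level is a float in [0, 1] or None while unknown:

```python
class VolumeStepper:
    """Toy stand-in for a media player entity's volume handling."""

    def __init__(self, volume_level=None):
        self.volume_level = volume_level  # None until the device reports a level

    def volume_up(self, step: float = 0.1) -> None:
        # Skip the adjustment entirely while the current level is unknown.
        if self.volume_level is None:
            return
        self.volume_level = min(1.0, self.volume_level + step)

    def volume_down(self, step: float = 0.1) -> None:
        if self.volume_level is None:
            return
        self.volume_level = max(0.0, self.volume_level - step)
```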
Martijn van Zal
ca8118138c Change phrases in the logbook component for persons and binary_sensors (#25053)
Persons are now treated the same as device trackers, so the logbook states
"<name> is at <location>" or "<name> is away" instead of "<name> changed to <location|not_home>"

Binary sensors now show phrases that relate to their device_class attribute.
So "Front door is closed" instead of "Front door turned off" or "Hallway PIR detected movement"
instead of "Hallway PIR turned on"
2019-07-10 15:56:41 -07:00
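A rough sketch of the phrasing logic described in #25053 above; the device_class vocabulary here is a small illustrative subset, not the component's full mapping.

```python
from typing import Optional

BINARY_SENSOR_PHRASES = {
    "door": ("is opened", "is closed"),
    "motion": ("detected movement", "cleared (no movement detected)"),
}

def logbook_phrase(domain: str, name: str, state: str,
                   device_class: Optional[str] = None) -> str:
    """Build a human-friendly logbook line for a state change."""
    if domain == "person":
        return f"{name} is away" if state == "not_home" else f"{name} is at {state}"
    if domain == "binary_sensor" and device_class in BINARY_SENSOR_PHRASES:
        on_phrase, off_phrase = BINARY_SENSOR_PHRASES[device_class]
        return f"{name} {on_phrase if state == 'on' else off_phrase}"
    return f"{name} changed to {state}"

print(logbook_phrase("binary_sensor", "Front door", "off", "door"))  # Front door is closed
print(logbook_phrase("person", "Paulus", "home"))                    # Paulus is at home
```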
Johann Kellerman
e51b5e801e SMA catch error (#25045)
* SMA small fix

* lib update

* req
2019-07-10 15:55:40 -07:00
Aaron Bach
9ccb85d959 Add support for World Wide Lightning Location Network (#25001)
* Add support for World Wide Lightning Location Network

* Updated .coveragerc

* Added test

* Updated requirements

* Fixed tests

* Use local time for nearest strike

* Base geo location in place

* Finished geolocation work

* Fixed tests

* Cleanup

* Removed no-longer-needed method

* Updated requirements

* Add support for window and attrs

* Add strike ID to entity name

* Member comments
2019-07-10 16:40:11 -06:00
Alexei Chetroi
cea857e18a Bump up ZHA dependencies. (#25062)
Bump zigpy-homeassistant to 0.7.0
Bump zigpy-deconz to 0.2.1
Bump zigpy-xbee-homeassistant to 0.4.0
2019-07-10 12:20:37 -07:00
Anders Melchiorsen
1afa136fc0 Fix for Sonos debug logging (#25064)
* Fix for Sonos debug logging

* Start logging messages with capital letters
2019-07-10 12:19:28 -07:00
Paulus Schoutsen
7d33b0a259 Fix broken test in Python 3.7 (#25067) 2019-07-10 12:17:10 -07:00
David F. Mulcahey
777e1ca832 bump zha-quirks version (#25059) 2019-07-10 11:59:06 -07:00
Johann Kellerman
2e26f0bd2b Add check_config helper (#24557)
* check_config

* no ignore

* tests

* try tests again
2019-07-10 11:56:50 -07:00
Penny Wood
236debb455 Avoid flooding steam API (#23941) 2019-07-10 11:15:42 -07:00
Paulus Schoutsen
5f5c541f2f Update translations 2019-07-10 10:50:50 -07:00
Paulus Schoutsen
f0f7dc4884 Updated frontend to 20190710.0 2019-07-10 10:49:07 -07:00
Anders Melchiorsen
18d27c997d Add Sonos debug logging (#25063) 2019-07-10 09:30:45 -07:00
David Bonnes
a44686389c [climate] Bugfix honeywell misleading error message (#25048)
* initial commit

* refactor for sync

* minor tweak

* refactor convert code

* fix regression

* remove bad await

* de-lint

* de-lint 2

* improve error message

* rebase

* tweak

* de-lint
2019-07-10 08:38:31 -07:00
cdce8p
98ba015f06 Remove myself as codeowner (#25043) 2019-07-10 08:36:17 -07:00
Matte23
c1c2159dee Added marker sensor to CUPS integration (#25037) 2019-07-10 08:35:30 -07:00
Paul Annekov
a30c37017b Update tuyaha to 0.0.2 to catch API exceptions (#25050)
* Update tuyaha to 0.0.2 to catch API exceptions

* Updated tuyaha version in requirements
2019-07-10 01:54:19 +02:00
Aaron Bach
195b034abc Add config flow support to Geolocation (#25046) 2019-07-10 00:50:16 +02:00
William Sutton
c5239c6176 Add radiotherm CT80 current humidity support (#25024)
* Added CT80 Current Humidity Support

Added a check for whether the device is a CT80; if so, it queries the humidity object to get the current measured humidity reading.

* Update climate.py

Removed whitespace on line 229

* Update climate.py

Added humidity property. Version on local machine had that from previous tinkering.

* Update climate.py

Removed whitespace

* Update climate.py

Fixed tstat error handling for humidity data.
2019-07-09 21:18:05 +02:00
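A sketch of the guarded humidity read from #25024 above; the `model`/`humidity` attribute shapes mirror, but are not guaranteed to match, the radiotherm library's interface.

```python
def current_humidity(device):
    """Return the measured humidity for CT80 models, else None."""
    model = getattr(device, "model", {}).get("raw", "")
    if not model.startswith("CT80"):
        return None  # non-CT80 thermostats do not report humidity
    try:
        return device.humidity["raw"]
    except (KeyError, AttributeError, OSError):
        # Treat communication hiccups with the tstat as "no reading".
        return None
```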
jlrgraham
5be695c49c Bump pyvera to 0.3.2, null/missing value protection (#25041)
* Bump pyvera to 0.3.2, null/missing value protection.

* Add another place where the pyvera version is set.
2019-07-09 20:06:45 +02:00
cgtobi
8652c84745 Fix Netatmo rain gauge precision (#25036) 2019-07-09 19:57:29 +02:00
Franck Nijhof
36ed725ab4 Improve toon climate (#25040)
* Renames internal climate state variable to preset

* Shorten function comments

* Updates local variables on preset and temp changes

* Adds support for hvac_action
2019-07-09 19:52:38 +02:00
Malte Franken
cf5a35a421 updated geojson_client library to version 0.4 (#25039) 2019-07-09 13:06:10 -04:00
Fabian Affolter
8256d72f6d Upgrade youtube_dl to 2019.07.02 (#24990)
* Upgrade youtube_dl to 2019.07.01

* Update homeassistant/components/media_extractor/manifest.json

Co-Authored-By: Josef Schlehofer <pepe.schlehofer@gmail.com>

* Update requirements_all.txt

Co-Authored-By: Josef Schlehofer <pepe.schlehofer@gmail.com>
2019-07-09 13:03:52 -04:00
Pascal Vizeli
25745e9e27 Update build pipeline 2019-07-09 15:32:09 +02:00
Franck Nijhof
3ce1049d21 Centralizes Toon data, reducing API calls (#23988)
* Centralizes Toon data, reducing API calls

Fixes #21825

Signed-off-by: Franck Nijhof <frenck@addons.community>

* Fixes bad copy past action in services.yaml

Signed-off-by: Franck Nijhof <frenck@addons.community>

* Addresses review comments

Signed-off-by: Franck Nijhof <frenck@addons.community>

* 👕 Fixes too many blank lines

* Unsub dispatcher
2019-07-09 14:18:51 +02:00
arigilder
f3e542542a Add missing support for jewish_calendar.omer_count sensor (#24958)
* Add missing support for omer_count to jewish_calendar

* Add tests for omer sensor

* Add tests for omer after tzeit hakochavim

* Lint fixes
2019-07-09 11:58:57 +02:00
cgtobi
07b635e7aa Fix Netatmo climate presets (#25029)
* Fix netatmo presets

* Remove off mode for valves

* Revert usage of global const

* Flip values

* Remove try...except block
2019-07-09 10:40:02 +02:00
Aaron Bach
c2e843cbc3 Add support for Notion Home Monitoring (#24634)
* Add support for Notion Home Monitoring

* Updated coverage

* Removed auto-generated translations

* Stale docstrings

* Corrected hardware version

* Fixed binary sensor representation

* Cleanup and update protection

* Updated log message

* Cleaned up is_on

* Updated docstring

* Modified which data is updated during async_update

* Added more checks during update

* More cleanup

* Fixed unhandled exception

* Owner-requested changes (round 1)

* Fixed incorrect scan interval retrieval

* Ugh

* Removed unnecessary import

* Simplified everything via dict lookups

* Ensure bridges are properly registered

* Fixed tests

* Added catch for invalid credentials

* Ensure bridge ID is updated as necessary

* Updated method name

* Simplified bridge update

* Add support for updating bridge via_device_id

* Device update guard clause

* Removed excess whitespace

* Whitespace

* Owner comments

* Member comments
2019-07-09 10:29:06 +02:00
Andrew Sayre
7a5fca69af Add hvac fan state (#25030) 2019-07-09 09:59:48 +02:00
Franck Nijhof
3016d3a186 Toon fixes for Climate 1.0 (#25027) 2019-07-09 08:44:30 +02:00
Andrew Sayre
a31e49c857 Improve SmartThings test mocking (#25028)
* Migrate to asynctest

* Simplify mock access

* Use mocks
2019-07-08 22:39:55 -04:00
Joakim Plate
2fbbcafaed Support config flow on custom components (#24946)
* Support populating list of flows from custom components

* Re-allow custom component config flows

* Add tests for custom component retrieval

* Don't crash view if no handler exist

* Use get_custom_components instead fo resolve_from_root

* Switch to using an event instead of lock

* Leave list of integrations as set

* The returned list is not guaranteed to be ordered

Backend uses a set to represent them.
2019-07-09 01:19:37 +02:00
Pascal Vizeli
a2237ce5d4 homematic add off support for climate (#25017)
* homematic add off support for climate

* fix lint
2019-07-09 00:00:25 +02:00
Daniel Høyer Iversen
af7f61fec2 ambiclimate hvac_modes (#25015)
* ambiclimate hvac_modes

* style
2019-07-08 14:12:23 -07:00
Paulus Schoutsen
26a66276cd Fix Nest sensor (#25023) 2019-07-08 14:12:02 -07:00
Phil Bruckner
9944e675a5 Add template support to state trigger's for option (#24912) 2019-07-08 13:59:58 -07:00
Phil Bruckner
f9b9883aba Add template support to numeric_state trigger's for option (#24955) 2019-07-08 13:58:50 -07:00
Phil Bruckner
1431fd6fbd Add datetime option to input_datetime.set_datetime service (#24975) 2019-07-08 13:18:42 -07:00
Paulus Schoutsen
b11171aaeb Fix mimetypes on borked Windows machines (#25018) 2019-07-08 13:16:22 -07:00
Paulus Schoutsen
0b7a901c81 Fix ecobee flaky test (#25019) 2019-07-08 13:10:01 -07:00
Daniel Høyer Iversen
662e0dde80 Sensibo, add HVAC_MODE_OFF (#25016) 2019-07-08 13:17:59 -04:00
Joakim Plate
ab832cda71 Add support for arcam fmj receivers (#24621)
* Add arcam_fmj support

* Just use use state in player avoid direct client access

* Avoid leaking exceptions on invalid data

* Fix return value for volume in case of 0

* Mark component as having no coverage

* Add new requirement

* Add myself as maintainer

* Correct linting errors

* Use async_create_task instead of async_add_job

* Use new style string format instead of concat

* Don't call init of base class without init

* Annotate callbacks with @callback

Otherwise they won't be called in loop

* Reduce log level to debug

* Use async_timeout instead of wait_for

* Bump to version of arcam_fmj supporting 3.5

* Fix extra spaces

* Drop somewhat flaky unique_id

* Un-blackify indent to satisfy pylint

* Un-blackify indent to satisfy pylint

* Move default name calculation to config validation

* Add test folder

* Drop unused code

* Add tests for config flow import
2019-07-08 17:14:19 +02:00
Jesse Rizzo
f90fe7e628 Enphase envoy individual inverter production (#24445)
* bump envoy_reader version to 0.4

* bump dependency envoy_reader to 0.4

* Enphase envoy get individual inverter production

* Add period in function description

* Fix dumb typo

* Define _attributes in __init__

* Better error messages, make update async

* Fix format error

* Fix pylint errors

* set unknown state to None

* Bump envoy_reader version to 0.8

* Change attributes to separate sensors

* Fix dumb thing

* Improve platform_setup for inverters

* Remove unneeded self._attributes, refactor platform setup

* Refactor platform setup
2019-07-08 10:21:08 -04:00
Chris Johnston
32685f16bf Implement Twilio SMS notify MediaUrl support (#24971)
* Implement Twilio SMS notify MediaUrl support

Adds support for setting the `media_url` parameter of the twilio API
client with an optional attribute under the notify `data`
attribute.

Per the twilio docs (https://www.twilio.com/docs/sms/send-messages#include-medi$
this feature is only available in the US and Canada, for
GIF, PNG, or JPEG content.

* lint: fix 80 char ruler

* use kwargs to set the media_url

after testing locally, seems like the previous way of using
object() was not working. this seems to be working

* re-use the ATTR_MEDIAURL attribute
2019-07-08 14:05:15 +02:00
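An example of the service-data shape enabled by #24971 above (the notifier name, recipient, and URL are placeholders):

```python
# Payload for a notify service call; per the commit message, MMS media is
# US/Canada only and must be GIF, PNG, or JPEG content at a public URL.
service_data = {
    "message": "Motion detected at the front door",
    "target": ["+15551234567"],              # placeholder recipient
    "data": {
        "media_url": "https://example.com/snapshots/front_door.jpg",
    },
}
# e.g. await hass.services.async_call("notify", "twilio_sms", service_data)
```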
Pascal Vizeli
84cf76ba36 Climate 1.0 (#23899)
* Climate 1.0 / part 1/2/3

* fix flake

* Lint

* Update Google Assistant

* ambiclimate to climate 1.0 (#24911)

* Fix Alexa

* Lint

* Migrate zhong_hong

* Migrate tuya

* Migrate honeywell to new climate schema (#24257)

* Update one

* Fix model climate v2

* Cleanup p4

* Add comfort hold mode

* Fix old code

* Update homeassistant/components/climate/__init__.py

Co-Authored-By: Paulus Schoutsen <paulus@home-assistant.io>

* Update homeassistant/components/climate/const.py

Co-Authored-By: Paulus Schoutsen <paulus@home-assistant.io>

* First renaming

* Rename operation to hvac for paulus

* Rename hold mode to preset mode

* Cleanup & update comments

* Remove on/off

* Fix supported feature count

* Update services

* Update demo

* Fix tests & use current_hvac

* Update comment

* Fix tests & add typing

* Add more typing

* Update modes

* Fix tests

* Cleanup low/high with range

* Update homematic part 1

* Finish homematic

* Fix lint

* fix hm mapping

* Support simple devices

* convert lcn

* migrate oem

* Fix xs1

* update hive

* update mil

* Update toon

* migrate deconz

* cleanup

* update tesla

* Fix lint

* Fix vera

* Migrate zwave

* Migrate velbus

* Cleanup humidity feature

* Cleanup

* Migrate wink

* migrate dyson

* Fix current hvac

* Renaming

* Fix lint

* Migrate tfiac

* migrate tado

* Fix PRESET can be None

* apply PR#23913 from dev

* remove EU component, etc.

* remove EU component, etc.

* ready to test now

* de-linted

* some tweaks

* de-lint

* better handling of edge cases

* delint

* fix set_mode typos

* apply PR#23913 from dev

* remove EU component, etc.

* ready to test now

* de-linted

* some tweaks

* de-lint

* better handling of edge cases

* delint

* fix set_mode typos

* delint, move debug code

* away preset now working

* code tidy-up

* code tidy-up 2

* code tidy-up 3

* address issues #18932, #15063

* address issues #18932, #15063 - 2/2

* refactor MODE_AUTO to MODE_HEAT_COOL and use F not C

* add low/high to set_temp

* add low/high to set_temp 2

* add low/high to set_temp - delint

* run HA scripts

* port changes from PR #24402

* manual rebase

* manual rebase 2

* delint

* minor change

* remove SUPPORT_HVAC_ACTION

* Migrate radiotherm

* Convert touchline

* Migrate flexit

* Migrate nuheat

* Migrate maxcube

* Fix names maxcube const

* Migrate proliphix

* Migrate heatmiser

* Migrate fritzbox

* Migrate opentherm_gw

* Migrate venstar

* Migrate daikin

* Migrate modbus

* Fix elif

* Migrate Homematic IP Cloud to climate-1.0 (#24913)

* hmip climate fix

* Update hvac_mode and preset_mode

* fix lint

* Fix lint

* Migrate generic_thermostat

* Migrate incomfort to new climate schema (#24915)

* initial commit

* Update climate.py

* Migrate eq3btsmart

* Lint

* cleanup PRESET_MANUAL

* Migrate ecobee

* No conditional features

* KNX: Migrate climate component to new climate platform (#24931)

* Migrate climate component

* Remove unused code

* Corrected line length

* Lint

* Lint

* fix tests

* Fix value

* Migrate geniushub to new climate schema (#24191)

* Update one

* Fix model climate v2

* Cleanup p4

* Add comfort hold mode

* Fix old code

* Update homeassistant/components/climate/__init__.py

Co-Authored-By: Paulus Schoutsen <paulus@home-assistant.io>

* Update homeassistant/components/climate/const.py

Co-Authored-By: Paulus Schoutsen <paulus@home-assistant.io>

* First renaming

* Rename operation to hvac for paulus

* Rename hold mode to preset mode

* Cleanup & update comments

* Remove on/off

* Fix supported feature count

* Update services

* Update demo

* Fix tests & use current_hvac

* Update comment

* Fix tests & add typing

* Add more typing

* Update modes

* Fix tests

* Cleanup low/high with range

* Update homematic part 1

* Finish homematic

* Fix lint

* fix hm mapping

* Support simple devices

* convert lcn

* migrate oem

* Fix xs1

* update hive

* update mil

* Update toon

* migrate deconz

* cleanup

* update tesla

* Fix lint

* Fix vera

* Migrate zwave

* Migrate velbus

* Cleanup humidity feature

* Cleanup

* Migrate wink

* migrate dyson

* Fix current hvac

* Renaming

* Fix lint

* Migrate tfiac

* migrate tado

* delinted

* delinted

* use latest client

* clean up mappings

* clean up mappings

* add duration to set_temperature

* add duration to set_temperature

* manual rebase

* tweak

* fix regression

* small fix

* fix rebase mixup

* address comments

* finish refactor

* fix regression

* tweak type hints

* delint

* manual rebase

* WIP: Fixes for honeywell migration to climate-1.0 (#24938)

* add type hints

* code tidy-up

* Fixes for incomfort migration to climate-1.0 (#24936)

* delint type hints

* no async unless await

* revert: no async unless await

* revert: no async unless await 2

* delint

* fix typo

* Fix homekit_controller on climate-1.0 (#24948)

* Fix tests on climate-1.0 branch

* As part of climate-1.0, make state return the heating-cooling.current characteristic

* Fixes from review

* lint

* Fix imports

* Migrate stibel_eltron

* Fix lint

* Migrate coolmaster to climate 1.0 (#24967)

* Migrate coolmaster to climate 1.0

* fix lint errors

* More lint fixes

* Fix demo to work with UI

* Migrate spider

* Demo update

* Updated frontend to 20190705.0

* Fix boost mode (#24980)

* Prepare Netatmo for climate 1.0 (#24973)

* Migration Netatmo

* Address comments

* Update climate.py

* Migrate ephember

* Migrate Sensibo

* Implemented review comments (#24942)

* Migrate ESPHome

* Migrate MQTT

* Migrate Nest

* Migrate melissa

* Initial/partial migration of ST

* Migrate ST

* Remove Away mode (#24995)

* Migrate evohome, cache access tokens (#24491)

* add water_heater, add storage - initial commit

* add water_heater, add storage - initial commit

delint

add missing code

desiderata

update honeywell client library & CODEOWNER

add auth_tokens code, refactor & delint

refactor for broker

delint

* Add Broker, Water Heater & Refactor

add missing code

desiderata

* update honeywell client library & CODEOWNER

add auth_tokens code, refactor & delint

refactor for broker

* bugfix - loc_idx may not be 0

more refactor - ensure pure async

more refactoring

appears all r/o attributes are working

tweak precision, DHW & delint

remove unused code

remove unused code 2

remove unused code, refactor _save_auth_tokens()

* support RoundThermostat

bugfix opmode, switch to util.dt, add until=1h

revert breaking change

* store at_expires as naive UTC

remove debug code

delint

tidy up exception handling

delint

add water_heater, add storage - initial commit

delint

add missing code

desiderata

update honeywell client library & CODEOWNER

add auth_tokens code, refactor & delint

refactor for broker

add water_heater, add storage - initial commit

delint

add missing code

desiderata

update honeywell client library & CODEOWNER

add auth_tokens code, refactor & delint

refactor for broker

delint

bugfix - loc_idx may not be 0

more refactor - ensure pure async

more refactoring

appears all r/o attributes are working

tweak precision, DHW & delint

remove unused code

remove unused code 2

remove unused code, refactor _save_auth_tokens()

support RoundThermostat

bugfix opmode, switch to util.dt, add until=1h

revert breaking change

store at_expires as naive UTC

remove debug code

delint

tidy up exception handling

delint

* update CODEOWNERS

* fix regression

* fix requirements

* migrate to climate-1.0

* tweaking

* de-lint

* TCS working? & delint

* tweaking

* TCS code finalised

* remove available() logic

* refactor _switchpoints()

* tidy up switchpoint code

* tweak

* tweaking device_state_attributes

* some refactoring

* move PRESET_CUSTOM back to evohome

* move CONF_ACCESS_TOKEN_EXPIRES CONF_REFRESH_TOKEN back to evohome

* refactor SP code and dt conversion

* delinted

* delinted

* remove water_heater

* fix regression

* Migrate homekit

* Cleanup away mode

* Fix tests

* add helpers

* fix tests melissa

* Fix nehueat

* fix zwave

* add more tests

* fix deconz

* Fix climate test emulate_hue

* fix tests

* fix dyson tests

* fix demo with new layout

* fix honeywell

* Switch homekit_controller to use HVAC_MODE_HEAT_COOL instead of HVAC_MODE_AUTO (#25009)

* Lint

* PyLint

* Pylint

* fix fritzbox tests

* Fix google

* Fix all tests

* Fix lint

* Fix auto for homekit-like controller

* Fix lint

* fix lint
2019-07-08 14:00:24 +02:00
Joakim Plate
c2f1c4b981 Correct socket use in cert_expiry platform (#25011)
* Make sure we use same family for ssl socket and connection

getaddrinfo result could be different from what connection
was made with. It also blocks potential use of
happy eye balls algorithm

This also fixes lingering sockets until python garbage
collection.

* Add availability value if unable to get expiry

* Fix lint issue
2019-07-08 11:33:23 +02:00
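A standard-library sketch of the approach described in #25011 above: reuse a single getaddrinfo result for both the plain socket and the TLS connection, and close the socket deterministically instead of leaving it to garbage collection.

```python
import socket
import ssl

def get_cert(host: str, port: int = 443, timeout: float = 10.0):
    """Fetch the peer certificate using one consistent getaddrinfo result."""
    family, socktype, proto, _, sockaddr = socket.getaddrinfo(
        host, port, proto=socket.IPPROTO_TCP
    )[0]
    ctx = ssl.create_default_context()
    # The plain socket is created with the same family as the resolved address.
    with socket.socket(family, socktype, proto) as sock:
        sock.settimeout(timeout)
        sock.connect(sockaddr)
        with ctx.wrap_socket(sock, server_hostname=host) as ssock:
            return ssock.getpeercert()
```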
Seweryn Zeman
31d7b702a6 Added missing yeelight models mapping (#24963) 2019-07-07 23:50:48 -04:00
Joakim Sørensen
df4caf41d0 Install requirements for integrations in packages before importing them. (#25005)
* Process requirements for integrations in packages before loading

* trigger build
2019-07-07 12:04:30 -07:00
Tom Harris
0595fc3097 Upgrade insteonplm to 0.16.0 and add INSTEON scene triggering (#24765)
* Upgrade insteonplm to 0.16.0 and add INSTEON scene triggering

* Fix spacing issue

* Dummy commit to trigger CLA

* Remove dummy change

* Code review changes

* Use ENTITY_MATCH_ALL keyword from const and lint cleanup

* Make entity method print_aldb private
2019-07-07 20:31:04 +02:00
Tsvi Mostovicz
b0dc782c98 Upgrade hdate==0.8.8 (#25008)
This should fix inconsistencies between the issur_melacha_in_effect sensor and candle_lighting time.

Probably fixes #24479 and #23852
2019-07-07 17:32:54 +02:00
Daniel Høyer Iversen
ecd7f86df0 upgrade switchmate to latest lib (#25006) 2019-07-07 13:02:13 +02:00
Ville Skyttä
b834671555 Test dependency updates (#25004)
* Upgrade pytest to 5.0.1

https://docs.pytest.org/en/latest/changelog.html#pytest-5-0-1-2019-07-04

* Upgrade asynctest to 0.13.0

* Upgrade requests_mock to 1.6.0
2019-07-07 12:30:31 +02:00
Dave T
6e24b52a7e Add support for aurora ABB Powerone solar photovoltaic inverter (#24809)
* Add support for aurora ABB Powerone solar photovoltaic inverter

* Add support for aurora ABB Powerone solar photovoltaic inverter

* Update stale docstring

* Fixed whitespace lint errors

* Remove test code

* Delete README.md

Website documentation contains setup instructions.  README not needed here.

* Only close the serial line once.

* Correct newlines between imports

* Change add_devices to add_entites and remove unnecessary logging.

* Use new style string formatting instead of concatenation

* Directly access variables rather than via config.get

* Update sensor.py
2019-07-07 11:22:21 +02:00
David Winn
628e12c944 Sleepiq single sleeper crash (#24941)
* Update sleepyq to 0.7

Fixes crash when working with a single sleeper.

* sleepiq: Handle null side definitions

These happen if no sleeper is defined for a side of the bed. Don't
create sensors for null sides; they'll crash every time we try to use
them.

* sleepiq: Fix urls mocked to match sleepyq 0.7

* sleepiq: Fix test_sensor.TestSleepIQSensorSetup

Sleepyq 0.7 throws on empty strings, so we have to specify them.

* sleepiq: Test for ValueError thrown by sleepyq 0.7

* sleepiq: Drop no longer used HTTPError import

* sleepiq: Add tests for single sleeper case

* sleepiq: Shorten comments to not overflow line length

* sleepiq: Use formatted string literals for adding suffixes to test files

* sleepiq: Use str.format() for test suffixing
2019-07-07 08:40:02 +02:00
Penny Wood
adbec5bffc Changes as per code review of #24646 (#24917) 2019-07-07 07:36:57 +02:00
Ville Skyttä
e8a5306c23 Upgrade mypy to 0.711, drop no longer needed workarounds (#24998)
https://mypy-lang.blogspot.com/2019/06/mypy-0711-released.html
2019-07-07 03:58:33 +02:00
Franck Nijhof
b274b10f38 Adds Stale Probot for issues (#24985)
* Adds Stale Probot for issues

* Do not ignore assigned issues

* Small language tweak in mark comment
2019-07-06 20:18:20 +02:00
Franck Nijhof
ac4f2c9f73 Adds Lock Threads Probot (#24984) 2019-07-06 19:48:08 +02:00
Paul Annekov
97ed7fbb3f Switched from tuyapy to tuyaha as 1st one is not maintained (#24821) 2019-07-06 10:39:49 -07:00
Robert Dunmire III
003ca655ee Fix errors if rest source becomes unavailable (#24986)
* Fix errors if rest source becomes unavailable

* Remove exclamation mark
2019-07-06 19:33:37 +02:00
Adriaan Peeters
412910ca65 Add sonos.play_queue service (#24974)
* Add sonos.play_queue service

* Add SERVICE_PLAY_QUEUE import in alphabetical order

* Add queue_position parameter for sonos.play_queue service

* Move queue_position default to schema definition
2019-07-06 17:19:03 +02:00
Franck Nijhof
31f569ada9 Batch of Component(s) -> Integration(s) (#24972) 2019-07-05 15:24:26 -07:00
Niels Mündler
e75c9efb3f Fix monitoring of trays in syncthru component (#24961) 2019-07-05 11:23:17 +02:00
cgtobi
e93919673e Implement ADR0003 for Netatmo sensor (#24944)
* Remove configurable monitored conditions

* Only process existing modules

* Remove unused import

* Fix linter error
2019-07-05 09:41:18 +02:00
John Mihalic
c814b39fdb Update pyHik library to 0.2.3 (#24957) 2019-07-05 09:29:35 +02:00
Aaron Bach
a491f97eb9 Allow updating of via_device in device registry (#24921)
* Allow updating of via_device in device registry

* Added test
2019-07-04 19:10:23 -04:00
David F. Mulcahey
3c487928d4 New scanner device tracker and ZHA device tracker support (#24584)
* initial implementation for zha device trackers

* constant

* review comments

* Revert "review comments"

This reverts commit 2130823566820dfc114dbeda08fcdf76ed47a4e7.

* rename device tracker entity

* update trackers

* raise when not implemented

* Update homeassistant/components/device_tracker/config_entry.py

Review comment

Co-Authored-By: Martin Hjelmare <marhje52@kth.se>

* move source type to base state attrs

* review comments

* review comments

* review comments

* fix super call

* fix battery and use last seen from device

* add test

* cleanup and add more to test

* cleanup post zha entity removal PR

* add tests for base entities

* rework entity tests
2019-07-04 12:44:39 +02:00
Steven Rollason
e824c553ca Fix exclusion of routes with excl_filter (#24928)
Fix exclusion of routes with excl_filter (was including instead of excluding)
2019-07-03 19:48:01 -04:00
Chris Soyars
2634f35b4e Add support for Yale YRL256 lock (#24932) 2019-07-03 19:29:21 -04:00
Anders Melchiorsen
a1aaeab33a Update pysonos to 0.0.19 (#24930) 2019-07-03 19:26:16 -04:00
Jeff Irion
e9816f7e30 Bump androidtv to 0.0.18 (#24927)
* Bump androidtv to 0.0.18

* Bump androidtv to 0.0.18
2019-07-03 20:18:37 +02:00
David F. Mulcahey
a9459c6d92 Remove ZHA device entity (#24909)
* move availability handling to device

* update last_seen format

* add battery sensor

* fix interval

* fix battery reporting now that it is a sensor

* remove zha entities and add battery sensor
2019-07-03 13:36:36 -04:00
Дубовик Максим
eec67d8b1a New languages that look like they are supported by Google but are not documented: (#24881)
* cs-CZ – Czech, Czech Republic
* el-GR – Modern Greek (1453-), Greece
* en-IN – English, India
* fi-FI – Finnish, Finland
* fil-PH – Filipino, Philippines
* hi-IN – Hindi, India
* id-ID – Indonesian, Indonesia
* vi-VN – Vietnamese, Viet Nam
Fixed the regex to match language codes like fil-PH
2019-07-03 16:40:14 +02:00
cgtobi
e8d9fe0aa8 Fix home coach discovery (#24902)
* Fix home coach discovery

* Update requirements file
2019-07-02 21:55:01 -04:00
Paulus Schoutsen
aa03550f6b Updated frontend to 20190702.0 2019-07-02 10:34:22 -07:00
kreegahbundolo
61c88db8a1 Add ability to send attachments in pushover notifications (#24806)
* Added ability to send attachments in pushover notifications

* Added full name for exception to satisfy static check

* Fixed hanging indent lint problem

* Added path checking, removed import re, changed url check method to use
startswith.

* Removed argument from logging statement.

* Changed IOError to OSError, fixed logging, added logging statement.
2019-07-02 17:56:12 +02:00
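A hedged sketch of how the new attachment support might be invoked; the nested `attachment` key name and the file path are assumptions inferred from the commit messages above (path checking, URL check via startswith), not verified against the integration documentation.
- service: notify.pushover
  data:
    message: "Motion detected at the front door"
    data:
      attachment: /config/www/front_door.jpg   # assumed key; an http(s) URL may also be accepted per the URL-check commit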
Phil Bruckner
8dca73d08e Add missing trigger.for variable to template trigger (#24893) 2019-07-02 17:46:26 +02:00
Phil Bruckner
3f4ce70414 Fix 'same state' monitoring in state trigger (#24904) 2019-07-02 17:29:38 +02:00
Phil Bruckner
945afbc6d4 Fix 'same state' monitoring in numeric_state trigger (#24910) 2019-07-02 17:28:02 +02:00
Anders Melchiorsen
c0a342d790 Stability improvements for Sonos availability (#24880)
* Stability improvements for Sonos availability

* Handle seen reentrancy
2019-07-02 09:25:02 -04:00
Pascal Vizeli
6de6c10bc3 Update devcontainer.json 2019-07-02 14:31:06 +02:00
Pascal Vizeli
6c25c9760a Update devcontainer.json 2019-07-02 13:34:50 +02:00
Pascal Vizeli
7bf140f921 Update devcontainer.json 2019-07-02 13:32:35 +02:00
Phil Bruckner
e3d281b3c4 Bump life360 package to 4.0.1 (#24905) 2019-07-02 12:14:46 +02:00
Pascal Vizeli
0c43c4b5e1 Add git editor / app port 2019-07-02 10:39:02 +02:00
Penny Wood
23dd644f4a Update IDs for rename node/value (#24646)
* Update IDs for rename node/value

* Rename devices and entities

* Improved coverage
2019-07-01 15:54:19 -07:00
David F. Mulcahey
7f90a1cab2 go back to signals and no hard entity references (#24894) 2019-07-01 16:32:57 -04:00
kevank
b6e0f538c5 Update tts.py (#24892) 2019-07-01 10:49:27 -07:00
Dennis Keitzel
8cd138608c Support mqtt discovery topic prefix with slashes (#24840) 2019-07-01 10:23:01 -07:00
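For context, a minimal sketch of an MQTT configuration that relies on #24840, i.e. a discovery prefix that itself contains slashes; the broker address and prefix value are made up.
mqtt:
  broker: 192.168.1.10
  discovery: true
  discovery_prefix: site/main/homeassistant   # prefixes with slashes are now accepted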
David Bonnes
846575b7fb Tweak geniushub battery icons according to device state (#24798)
* tweak battery icons according to device state/availability

* tweak battery icons according to device state/availability 2

* make dt objects aware

* make dt objects aware 2

* woops - use util.dt in favour of datetime

* woops - use util.dt in favour of datetime 2

* refactor battery icon code, remove parallel_updates
2019-07-01 10:19:14 -07:00
Daniel Høyer Iversen
3d2f843c1d Upgrade pytest to 5.0.0 (#24885)
* Upgrade pytest to 5.0.0

* exception message for pytest 5
2019-07-01 10:47:42 -04:00
Jeff Irion
5ba83d4dfb Bump androidtv to 0.0.17 (#24886)
* Bump androidtv to 0.0.17

* Bump androidtv to 0.0.17
2019-07-01 10:47:21 -04:00
Paulus Schoutsen
0dd19ed49c Updated frontend to 20190630.0 2019-06-30 22:53:35 -07:00
Paulus Schoutsen
77b83b9e4d Update translations 2019-06-30 22:53:27 -07:00
Andrew Sayre
7db4eeaf7f Move SmartThings imports to top (#24878)
* Move imports to top

* use lib constants

* Add missing three_axis mapping
2019-06-30 22:29:21 -04:00
David F. Mulcahey
7d651e2b7a Fix traceback during ZHA device removal (#24882)
* fix device remove lifecycle
* clean up remove signal
* add guard
2019-06-30 21:12:27 -04:00
Fabian Affolter
40c424e793 Upgrade bcrypt to 3.1.7 (#24850) 2019-06-30 20:23:47 -04:00
Fabian Affolter
a6ea5d43b4 Upgrade importlib-metadata to 0.18 (#24848) 2019-06-30 20:23:27 -04:00
Maikel Punie
bf70e91a0d Velbus: autodiscover covers (#24877)
* Added covers to the velbus component with autodiscovery, bumped python velbus version

* Fixed some pylint stuff
2019-06-30 13:02:07 -07:00
Fabian Affolter
5cf923ead6 Upgrade youtube_dl to 2019.06.27 (#24875) 2019-06-30 13:52:08 -04:00
realthk
fec2461e0e Hungarian is also supported in Google Cloud TTS (#24861)
* Hungarian is also a supported language

* Hungarian is also a supported language

* Hungarian is also a supported language
2019-06-30 13:50:06 -04:00
Fabian Affolter
c71a5643ff Update praw to 6.3.1 (#23737)
* Upgrade praw to 6.3.1

* Update praw to 6.3.1
2019-06-30 16:49:16 +02:00
zewelor
b0387c4428 Fix mysensors icon name (#24871) 2019-06-30 12:15:29 +02:00
Fabian Affolter
1e149a704b Upgrade cryptography to 2.7 (#24852) 2019-06-30 07:21:35 +02:00
Fabian Affolter
cb71b4a657 Upgrade psutil to 5.6.3 (#24854) 2019-06-29 11:40:57 -04:00
Fabian Affolter
26cc41094d Upgrade jinja2 to >=2.10.1 (#24851) 2019-06-29 15:47:22 +02:00
Fabian Affolter
9946b19735 Upgrade pyyaml to 5.1.1 (#24847) 2019-06-29 14:34:55 +02:00
Fabian Affolter
6ad9a97f0d Upgrade certifi to >= 2019.6.16 (#24846) 2019-06-29 14:34:27 +02:00
Fabian Affolter
a91ad0189e Upgrade numpy to 1.16.4 (#24845) 2019-06-29 07:15:32 -04:00
Fabian Affolter
67b6657bcd Upgrade sqlalchemy to 1.3.5 (#24844) 2019-06-29 07:14:47 -04:00
Fabian Affolter
e1a34c8030 Upgrade luftdaten to 0.6.1 (#24842)
* Upgrade luftdaten to 0.6.0

* Upgrade luftdaten to 0.6.1
2019-06-29 11:03:38 +02:00
zewelor
b70f907d25 Fix yeelight color temp getter (#24830)
* Fix yeelight color temp getter

* Remove wrong types
2019-06-28 22:56:11 -07:00
Jonathan Keljo
cde855f67d Upgrade sisyphus-control to 2.2 (#24837)
PR #22457 added some code that used new methods in `sisyphus-control` 2.2.
Unfortunately, because of the move to manifests, it was merged while still depending on 2.1.

Fixes #24834
2019-06-28 22:45:57 -07:00
Paulus Schoutsen
9cf43dd8ff Merge pull request #24839 from home-assistant/rc
0.95.4
2019-06-28 22:42:23 -07:00
Phil Bruckner
03e6a92cf3 Add template support to template trigger's for option (#24810) 2019-06-28 22:30:47 -07:00
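A rough sketch of what #24810 enables, i.e. rendering the trigger's `for` option from a template; the entities and the rendered value format below are illustrative assumptions, not taken from the PR.
trigger:
  - platform: template
    value_template: "{{ is_state('binary_sensor.motion', 'off') }}"
    # 'for' may now be a template as well (here assumed to render to a number of seconds)
    for: "{{ states('input_number.no_motion_delay') | int }}"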
Paulus Schoutsen
21c2e8da6e Bumped version to 0.95.4 2019-06-28 22:23:22 -07:00
Paulus Schoutsen
b3963e56ec Guard for None entity config (#24838) 2019-06-28 22:23:17 -07:00
zewelor
c6e8e2398c Improve autodiscovered yeelights model detection (#24671)
* Improve autodiscovered yeelights model detection

* Lint fixes

* Logger warn fix
2019-06-28 22:23:17 -07:00
Paulus Schoutsen
4b5718431d Guard for None entity config (#24838) 2019-06-28 22:23:00 -07:00
Niels Mündler
333e1d6789 Fronius (solar energy and inverter) component (#22316)
* Introduced fronius component that adds ability to track Fronius devices from Home Assistant

* Use device parameter for fetching inverter data

* Fixed handling of default scope

* Handle exceptions from yield

* Fulfill PR requirements

* Fixed houndci violations

* Found the last hound violation

* Fixed docstring (https://github.com/home-assistant/home-assistant/pull/11446#discussion_r165776934)

* Fixed import order with isort (https://github.com/home-assistant/home-assistant/pull/11446#discussion_r165776957)

* CONF_DEVICE is now CONF_DEVICEID (https://github.com/home-assistant/home-assistant/pull/11446#discussion_r165777161)

* Added docstring to class FroniusSensor (https://github.com/home-assistant/home-assistant/pull/11446#discussion_r165777792)

* Fixed docstring for state (https://github.com/home-assistant/home-assistant/pull/11446#discussion_r165777885)

* Added/fixed docstrings (https://github.com/home-assistant/home-assistant/pull/11446#discussion_r165778108 & https://github.com/home-assistant/home-assistant/pull/11446#discussion_r165778125)

* Remove redundant log entry (https://github.com/home-assistant/home-assistant/pull/11446#discussion_r165779213)

* Fixed error message if sensor update fails (https://github.com/home-assistant/home-assistant/pull/11446#discussion_r165779435)

* Fixed error log messages (https://github.com/home-assistant/home-assistant/pull/11446#discussion_r165779751 & https://github.com/home-assistant/home-assistant/pull/11446#discussion_r165779761)

* Satisfy hound

* Handle exceptions explicit (https://github.com/home-assistant/home-assistant/pull/11446#discussion_r168940902)

* Removed unnecessary call of update (https://github.com/home-assistant/home-assistant/pull/11446#discussion_r168940894)

* The point makes the difference.

* Removed unrelated requirements

* Remove config logging (https://github.com/home-assistant/home-assistant/pull/11446#discussion_r168968748)

* Reorder and fix imports (https://github.com/home-assistant/home-assistant/pull/11446#discussion_r168968725, https://github.com/home-assistant/home-assistant/pull/11446#discussion_r168968691)

* Update fronius requirement

* Various small fixes

* Small fixes

* Formatting

* Add fronius to coverage

* New structure and formatting

* Add manifest.json

* Fix data loading

* Make pylint happy

* Fix issues

* Fix parse_attributes

* Fix docstring and platform schema

* Make use of default HA-Const config values

* Change configuration setup, introducing list of monitored conditions

* Change the structure slightly, allowing for a list of sensors

* Remove periods from logging

* Formatting

* Change name generation, use variable instead of string

* small fixes

* Update sensor.py

* Incorporate correction proposals

* Setting default device inside validation

* Move import on top and small format

* Formatting fix

* Rename validation method to _device_id_validator
2019-06-28 20:48:52 -07:00
Paulus Schoutsen
2e9c71f2c0 Merge pull request #24836 from home-assistant/rc
0.95.3
2019-06-28 20:45:21 -07:00
Paulus Schoutsen
072879cc6e Bumped version to 0.95.3 2019-06-28 20:44:23 -07:00
Paulus Schoutsen
cc75adfed6 Alexa sync state report (#24835)
* Do a sync after changing state reporting

* Fix entity config being None
2019-06-28 20:44:19 -07:00
Paulus Schoutsen
3cafc1f2c6 Alexa sync state report (#24835)
* Do a sync after changing state reporting

* Fix entity config being None
2019-06-28 20:43:57 -07:00
Aaron Bach
19a65f8db6 Remove temperature attribute from SimpliSafe alarm control panel (#24833) 2019-06-28 20:38:07 -07:00
Paulus Schoutsen
e8d1d28fdd Make sure alert is set up after notify (#24829) 2019-06-28 16:28:33 -06:00
Pascal Vizeli
9ad063ce03 Update azure-pipelines-ci.yml for Azure Pipelines 2019-06-28 22:55:04 +02:00
Paulus Schoutsen
f67693c56c Fix vacuum tests 2019-06-28 13:41:25 -07:00
Pascal Vizeli
9616fbdc36 Update azure-pipelines-ci.yml for Azure Pipelines 2019-06-28 22:31:27 +02:00
Pascal Vizeli
48dd5af9e3 Update azure-pipelines-ci.yml for Azure Pipelines 2019-06-28 22:18:52 +02:00
Pascal Vizeli
bc0fb5e3d9 Update azure-pipelines-ci.yml for Azure Pipelines 2019-06-28 22:08:29 +02:00
Pascal Vizeli
8e2bbf8c82 Update azure-pipelines-ci.yml for Azure Pipelines 2019-06-28 19:38:13 +02:00
Pascal Vizeli
538caafac2 Full speed azure 2019-06-28 17:35:17 +00:00
Paulus Schoutsen
0871d6c9c6 Merge pull request #24828 from home-assistant/rc
0.95.2
2019-06-28 10:30:01 -07:00
Luuk
468b0e8934 Add template vacuum support (#22904)
* Add template vacuum component

* Fix linting issues

* Make vacuum state optional

* Fix pylint issues

* Add context to template vacuum service calls

* Added tests to template vacuum

* Fix indent

* Fix docstrings

* Move files for new component folder structure

* Revert additions for template_vacuum tests to common.py

* Use existing constants for template vacuum config

* Handle invalid templates

* Add tests for unused services

* Add test for invalid templates

* Fix line too long

* Do not start template change tracking in case of MATCH_ALL

* Resolve review comments
2019-06-28 12:19:00 -04:00
Paulus Schoutsen
6cbfc63311 Bumped version to 0.95.2 2019-06-28 08:50:23 -07:00
Paulus Schoutsen
2886b217ab Fix calling empty script turn off (#24827) 2019-06-28 08:50:16 -07:00
Phil Bruckner
fafc68673a Fix another Life360 bug (#24805) 2019-06-28 08:50:16 -07:00
David F. Mulcahey
1990df63aa Bump ZHA quirks module (#24802)
* bump quirks version

* bump version - mija magnet
2019-06-28 08:50:15 -07:00
Paulus Schoutsen
e39f0f3e25 Make sure entity config is never none (#24801) 2019-06-28 08:50:14 -07:00
Pascal Vizeli
1f5e2fa3ce Update azure-pipelines-release.yml for Azure Pipelines (#24800) 2019-06-28 08:50:14 -07:00
cgtobi
204dd77404 Fix netatmo weatherstation setup error (#24788)
* Check if station data exists and reduce calls

* Fix module names list

* Add warning

* Remove dead code
2019-06-28 08:50:13 -07:00
Paulus Schoutsen
4e5b1ccde6 Fix calling empty script turn off (#24827) 2019-06-28 08:49:33 -07:00
Paulus Schoutsen
80844ae2ee Add developer tools panel (#24812) 2019-06-28 08:34:53 -07:00
cgtobi
a69a00785f Fix netatmo weatherstation setup error (#24788)
* Check if station data exists and reduce calls

* Fix module names list

* Add warning

* Remove dead code
2019-06-27 20:16:46 -07:00
Tejpal Sahota
41dd70f644 Changed default encoding to mp3 (#24808) 2019-06-27 20:16:22 -07:00
Paulus Schoutsen
e5b8d5f7ea Updated frontend to 20190627.0 2019-06-27 17:57:02 -07:00
Josh Anderson
c49869160b Use step from tado rather than assuming 0.1 (#24807) 2019-06-27 16:17:15 -07:00
Josh Anderson
69089da88e Use climate device's target temp step value (#24804) 2019-06-27 15:14:23 -07:00
Phil Bruckner
e43a733017 Fix another Life360 bug (#24805) 2019-06-27 15:11:32 -07:00
dreed47
3eb6b9d297 Zestimate fix for issue #23837 (#23838)
* Zestimate fix for issue #23837

removed references to MIN_TIME_BETWEEN_UPDATES
and replaced with SCAN_INTERVAL

* Zestimate fix for issue #23837

removed references to MIN_TIME_BETWEEN_UPDATES
and replaced with SCAN_INTERVAL
2019-06-27 15:09:33 -07:00
David F. Mulcahey
ac5ab52d01 Bump ZHA quirks module (#24802)
* bump quirks version

* bump version - mija magnet
2019-06-27 15:28:56 -04:00
Paulus Schoutsen
0d89b82bff Make sure entity config is never none (#24801) 2019-06-27 15:17:42 -04:00
Pascal Vizeli
0cde24e103 Update azure-pipelines-release.yml for Azure Pipelines (#24800) 2019-06-27 18:26:13 +02:00
h3ndrik
e932fc832c Add time delta option when searching for deutsche_bahn connections (#24600)
* Add time delta option when searching for connections

Add another option 'in' to search for upcoming connections in the future.
Handy if you need a few minutes to get to the train station and need to add that to the queried departure time.

* correct style errors

* rename new option

* rename new option (2/2)

* add offset correctly
2019-06-27 15:53:05 +02:00
Paulus Schoutsen
c87d6e4720 Catch uncaught Alexa error (#24785) 2019-06-26 20:24:20 -07:00
Ville Skyttä
71346760d0 Upgrade pytest to 4.6.3 (#24782)
* Upgrade pytest to 4.6.3

https://docs.pytest.org/en/latest/changelog.html#pytest-4-6-2-2019-06-03
https://docs.pytest.org/en/latest/changelog.html#pytest-4-6-3-2019-06-11

* Make litejet switch test work with pytest 4.6.3

Essentially reverts the corresponding change that was made for pytest
4.2 compatibility.
2019-06-26 20:01:03 -07:00
William Scanlon
f6c1f336d4 Pubnub to 1.0.8 (#24781) 2019-06-26 16:14:00 -07:00
Phil Bruckner
638c958acd Fix life360 exception when no location provided (#24777) 2019-06-26 16:03:11 -07:00
Paulus Schoutsen
b2231945dc Merge branch 'master' into dev 2019-06-26 10:42:25 -07:00
Andre Richter
56b8da133c Upgrade vallox to async client API (#24774) 2019-06-26 18:40:34 +02:00
Paulus Schoutsen
06af6f19a3 Entity to handle updates via events (#24733)
* Entity to handle updates via events

* Fix a bug

* Update entity.py
2019-06-26 09:22:51 -07:00
Paulus Schoutsen
9e0636eefa Updated frontend to 20190626.0 2019-06-26 09:15:54 -07:00
Alexei Chetroi
6ae1228e61 Enhancement/zha model manuf (#24771)
* Cleanup ZHA entities model and manufacturer usage.
Zigpy includes manufacturer and model as attributes of a zigpy
Device class, which simplifies handling of manufacturer and/or model
derived properties for the ZHA platform.

* Sort ZHA imports.
* Lint.
2019-06-26 09:31:19 -04:00
Matte23
29311e6391 Add support for IPP Printers to the CUPS integration (#24756)
* Add support for IPP Printers to the CUPS integration

* Fixed lint error

* Addressed comments, removed redundant check

* Simplified check, improved code readability
2019-06-25 16:13:08 -07:00
John Dyer
bd4f66fda3 Update Waze route dependency to 0.10 (#24754)
* Update manifest.json

Update the Waze calculator to 0.10; this was supposed to have been done in #22428 but was missed. See the discussion [here](https://community.home-assistant.io/t/waze-travel-time-update/50955/201)

* Update requirements_all.txt
2019-06-25 15:25:53 -07:00
Daniel Høyer Iversen
dc89499116 Return correct name for met.no (#24763) 2019-06-25 13:09:04 -07:00
Alain Tavan
41b58b8bc1 fix an error in the description (#24735) 2019-06-25 10:37:25 -07:00
Emilv2
58df05a7e7 Remove obsolete comments in Dockerfile (#24748)
The relevant lines were removed in e49b970665.
2019-06-25 10:16:05 -07:00
Andre Richter
fb940e4269 Vallox: Fix missing hass member (#24753) 2019-06-25 10:15:41 -07:00
Paulus Schoutsen
26fc57d1b3 Ignore duplicate tradfri discovery (#24759)
* Ignore duplicate tradfri discovery

* Update name
2019-06-25 09:54:40 -07:00
cgtobi
da57f92796 Handle timeouts gracefully (#24752) 2019-06-25 08:57:43 -07:00
Andre Richter
236820d093 Add integration for Vallox Ventilation Units (#24660)
* Add integration for Vallox Ventilation Units.

* Address review comments #1

* Address review comments #2

* Replace IOError with OSError.

* Bump to fixed version of vallox_websocket_api.
2019-06-25 11:38:24 +02:00
Paulus Schoutsen
9813396880 Updated frontend to 20190624.1 2019-06-24 22:07:39 -07:00
Paulus Schoutsen
f5f86993f1 Improve Alexa error handling (#24745) 2019-06-24 22:04:31 -07:00
Martin Hjelmare
d4fc22add4 Fix locative device update (#24744)
* Add a test for two devices

* Fix locative updating all devices

* Add a guard clause that checks if correct device is passed.
2019-06-24 20:00:28 -07:00
Anders Melchiorsen
6e14e8ed91 Update pysonos to 0.0.17 (#24740) 2019-06-24 14:59:15 -07:00
Paulus Schoutsen
4aedd3a09a AdGuard to update entry (#24737) 2019-06-24 14:46:32 -07:00
Alexei Chetroi
26dea0f247 Update ZHA dependencies. (#24736) 2019-06-24 16:57:07 -04:00
Conrad Juhl Andersen
0792e72f71 Add support for sensor state STATE_UNAVAILABLE (#24641)
* Fixed the integration with ESPHome, which caused an error if ESPHome did not update fast enough on startup

* Set state to problem if sensor is unavailable

* Fix line length.
2019-06-24 11:30:44 -07:00
David F. Mulcahey
d9420c1f73 Remove device and entity registry entries when removing a ZHA device (#24369)
* cleanup when device is removed

fixes

* cleanup
2019-06-24 11:26:44 -07:00
Evan Bruhn
ee1884423a Save cached logi_circle tokens in config folder (#24726)
Instead of the working directory, which is what it currently uses. Matches the pattern observed in the Abode, Ring, and Skybell integrations.
2019-06-24 09:36:39 -07:00
Robin Wohlers-Reichel
17480a0398 Add 'unique_id' Property to Inverter Sensors (#24707)
* Option to change sensor names

* Python 3.5 compatibility

* Oops

* Get serial number at start

* Remove config opportunity

* Oops comma

* Changes from review

* Check yourself before you commit.
2019-06-24 08:34:20 -07:00
Paulus Schoutsen
e841f568c1 Update translations 2019-06-24 08:27:46 -07:00
Paulus Schoutsen
df32a81165 Updated frontend to 20190624.0 2019-06-24 08:26:50 -07:00
Phil Bruckner
8924d657a4 Add show_as_state options to Life360 (#24725) 2019-06-24 08:05:34 -07:00
endor
98ba529ead Add Trafikverket train component (#23470)
* Added Trafikverket train component

* Updated manifest with proper name and codeowner

* Updated requirements and manifest

* Updated CODEOWNERS

* Corrected requirements

* Added trafikverket_train/sensor.py to .coveragerc

* Added error handling and log if API call fails

* Corrected styles, removed dev log, improved validation

* Method calls to async_update(), improved error handling

* Minor cleanup/reorg for efficiency

* Added station cache and corrected to fit standards

* Simplified trainstop id and cleaned up dict.get

* Corrected mistake after change from dict to array

* Change device class to timestamp
2019-06-24 10:38:50 +02:00
cgtobi
9a01cd84c2 Bump pyatmo to v2.1.0 (#24724) 2019-06-24 07:43:49 +02:00
John Luetke
09c6f57364 Expose ports 8123, 8300 and 51827 in Dockerfile (#24389) 2019-06-23 21:44:26 +02:00
Pascal Vizeli
a807572382 Add initial support for remote dev container (#24681)
* Add initial support for remote container

* Use constrain
2019-06-23 12:18:33 -07:00
Oleg Kurapov
dc6a44d0eb Extend websocket method usage to port 8002 in Samsung TV media player (#24716) 2019-06-23 12:11:25 -07:00
Paulus Schoutsen
c296e9b9bb Update owner stream integration 2019-06-23 12:00:06 -07:00
David F. Mulcahey
d22bb8fc7d Update ZHA dependencies (#24718)
* update deps and remove legacy constants bridge

* run deps script and fix test import
2019-06-23 13:43:19 -04:00
ktnrg45
b99275f6a5 Fix PS4 entities with shared host not updating and latency with multiple connections (#24642)
* correct assume info call

* 0.8.4

* 0.8.4

* 0.8.4

* 0.8.5

* 0.8.5

* 0.8.5

* revert condition
2019-06-23 09:52:53 -06:00
Robin Wohlers-Reichel
57502bc911 Solax update 0.1.0 (#24708)
* Update to solax 0.0.6

* Library version 0.1.0
2019-06-23 11:16:39 +02:00
cgtobi
128e66fa24 Bump version pyatmo to 2.0.1 (#24703) 2019-06-23 07:50:04 +02:00
Fabian Affolter
0132ac3c27 Upgrade Sphinx to 2.1.2 (#24693) 2019-06-23 07:49:40 +02:00
David F. Mulcahey
cfd8d70890 ZHA fix device type mappings (#24699) 2019-06-22 15:05:35 -04:00
Fabian Affolter
44d2871dc9 Upgrade youtube_dl to 2019.06.08 (#24692) 2019-06-22 14:45:39 +02:00
Fabian Affolter
821e3beab0 Upgrade discord.py to 1.2.2 (#24695) 2019-06-22 14:44:24 +02:00
Anders Melchiorsen
a439e087e1 Fix time expression parsing (#24696) 2019-06-22 13:39:33 +02:00
Andre Lengwenus
b8acbf3c3a Corrected number of default LCN segment coupler scan tryouts (#24678)
* Bump to pypck==0.6.2

* Set default segment coupler scan tryouts to 0
2019-06-22 13:27:41 +02:00
Jonathan
d25214beb1 Add aml_thermal label (#24665)
Added a label for the CPU temperature of Amlogic ARM chips.
2019-06-22 12:58:37 +02:00
Penny Wood
22d9bee41a Template: Expand method to expand groups, and closest as filter (#23691)
* Implement expand method

* Allow expand and closest to be used as filters

* Correct patch

* Addresses review comments
2019-06-22 00:32:32 -07:00
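As a sketch of what the new expand support enables in templates, a template sensor could count the lights that are on in a group; the group and sensor names here are invented. The same PR also allows expand and closest to be used as Jinja filters.
sensor:
  - platform: template
    sensors:
      lights_on_count:
        value_template: >
          {{ expand('group.all_lights')
             | selectattr('state', 'eq', 'on')
             | list | count }}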
Ville Skyttä
a6eef22fbc Upgrade mypy to 0.710 (#24666)
* Upgrade mypy to 0.710

* Address mypy 0.710 errors
2019-06-22 10:19:36 +03:00
Steven Looman
f189367c02 Upgrade to async_upnp_client==0.14.10 and increase search timeout (#24685) 2019-06-22 09:12:27 +02:00
Aaron Bach
40fa4463de Change Ambient solar radiation units to lx (#24690) 2019-06-21 23:12:16 -06:00
Aaron Bach
729df112a7 Add RainMachine device classes where appropriate (#24682) 2019-06-21 17:12:28 -06:00
Thomas Lovén
9b52b9bf66 Allow extra js modules to be included in frontend (#24675)
* Add extra_module_url and extra_module_url_es5 to frontend options

* Address review comments
2019-06-21 13:16:28 -07:00
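A minimal sketch of the new options named in #24675; the module paths are placeholders.
frontend:
  extra_module_url:
    - /local/my-module.js
  extra_module_url_es5:
    - /local/my-module-es5.js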
zewelor
c6d5a5a6cc Improve autodiscovered yeelights model detection (#24671)
* Improve autodiscovered yeelights model detection

* Lint fixes

* Logger warn fix
2019-06-21 15:50:25 -04:00
Pascal Vizeli
560161bdbb Update azure-pipelines-release.yml for Azure Pipelines 2019-06-21 20:08:06 +02:00
Aaron Bach
3da3612c7b Add device class support for Ambient PWS sensors (#24677) 2019-06-21 10:27:53 -06:00
Paulus Schoutsen
8f243ad59d Updated frontend to 20190620.0 2019-06-21 09:27:02 -07:00
Pascal Vizeli
c9453bab19 Prefer binary with wheels (#24669) 2019-06-21 08:47:56 -07:00
Paulus Schoutsen
78b7ed0ebe Clean up Google Config (#24663)
* Clean up Google Config

* Lint

* pylint

* pylint2
2019-06-21 11:17:21 +02:00
Rodrigo Pérez
d468d0f71b Vlc telnet (#24290)
* Vlc telnet first commit

First functional version, remains to add more functionality.

* New functions added and bugfixes

* Compliance with dev checklist

* Compliance with dev checklist

* Compliance with pydocstyle

* Removed unused import

* Fixed wrong reference for exception

* Module renamed

* Fixed module rename in other

* Fixed wrong reference for exception


Module renamed


Fixed module rename in other

* Update homeassistant/components/vlc_telnet/media_player.py

Accepted suggestion by @OttoWinter

Co-Authored-By: Otto Winter <otto@otto-winter.com>

* Update homeassistant/components/vlc_telnet/media_player.py

Accepted suggestion by @OttoWinter

Co-Authored-By: Otto Winter <otto@otto-winter.com>

* Update homeassistant/components/vlc_telnet/media_player.py

Accepted suggestion by @OttoWinter

Co-Authored-By: Otto Winter <otto@otto-winter.com>

* Update homeassistant/components/vlc_telnet/media_player.py

Accepted suggestion by @OttoWinter

Co-Authored-By: Otto Winter <otto@otto-winter.com>

* Suggestions by @OttoWinter

+ Manage the error when VLC disappears by showing the unavailable status.

* Removed error log, instead set unavailable state

* Changes suggested by @pvizeli

-Import location
-Use of constants

* Implemented available method

* Improved available method
2019-06-21 11:13:47 +02:00
Pascal Vizeli
6bc636c2f2 Update azure-pipelines-wheels.yml for Azure Pipelines 2019-06-21 11:07:42 +02:00
mvn23
43a6be6471 Multiple devices support for opentherm_gw (#22932)
* Breaking change: Rewrite opentherm_gw to add support for more than one OpenTherm Gateway.
Breaks config layout and child entity ids and adds a required parameter to all service calls (gateway_id).

* Add schema and parameter description for service opentherm_gw.reset_gateway.

* Add optional name attribute in config to be used for friendly names.
Fix bugs in binary_sensor and climate platforms.

* pylint fixes

* Remove unused variables.

* Update manifest.json, remove REQUIREMENTS from .py file

* Update CODEOWNERS

* Address issues that were brought up (requested changes):
- Move imports to module level
- Change certain functions from async to sync
- Move constants to const.py (new file)
- Call gateway setup from outside of __init__()
- Move validation of monitored_variables to config schema

* Address requested changes:
- Make module imports relative
- Move more functions from async to sync, decorate with @callback where necessary
- Remove monitored_variables option, add all sensors by default
2019-06-21 10:52:25 +02:00
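Since #22932 makes `gateway_id` a required parameter on every opentherm_gw service call, a call to the reset service mentioned above would now look roughly like this; the gateway id value is hypothetical.
service: opentherm_gw.reset_gateway
data:
  gateway_id: central_heating   # assumed to match the id given to the gateway in the new per-gateway config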
Pascal Vizeli
d9f2a406f6 Update azure-pipelines-wheels.yml for Azure Pipelines 2019-06-21 10:46:35 +02:00
sfjes
0bdbf007b2 Fix downloader_download_failed event not firing for HTTP response errors (#24640) 2019-06-20 13:59:17 -07:00
Ville Skyttä
f1cbb2a0b3 braviatv, nmap_tracker: use getmac for getting MAC addresses (#24628)
* braviatv, nmap_tracker: use getmac for getting MAC addresses

Refs https://github.com/home-assistant/home-assistant/pull/24601

* Move getmac imports to top level
2019-06-20 23:35:02 +03:00
foreign-sub
ecfbfb4527 Fix AttributeError: 'NoneType' object has no attribute 'group' with sytadin component (#24652)
* Fix AttributeError: 'NoneType' object has no attribute 'group'

* Update sensor.py
2019-06-20 13:28:39 -07:00
Andrew Sayre
d8690f426c Bump pysmartthings (#24659) 2019-06-20 13:25:32 -07:00
Anders Melchiorsen
39f2e49451 Update LIFX brightness during long transitions (#24653) 2019-06-20 13:24:45 -07:00
Kevin Fronczak
58f14c5fe2 Upgrade blinkpy==0.14.1 for startup bugfix (#24656) 2019-06-20 13:24:02 -07:00
Paulus Schoutsen
319ac23736 Warn when user tries run custom config flow (#24657) 2019-06-20 13:22:12 -07:00
Alexei Chetroi
86e50530b0 Bump ZHA dependencies. (#24637) 2019-06-19 22:32:31 -04:00
Martin Hjelmare
7881081207 Fix device tracker see for entity registry entities (#24633)
* Add a test for see service guard

* Guard from seeing devices part of entity registry

* Await registry task early

* Lint

* Correct comment

* Clean up wait for registry

* Fix spelling

Co-Authored-By: Paulus Schoutsen <paulus@home-assistant.io>

* Fix spelling

Co-Authored-By: Paulus Schoutsen <paulus@home-assistant.io>
2019-06-20 03:22:33 +02:00
3125 changed files with 184695 additions and 145023 deletions


@@ -1,272 +0,0 @@
# Python CircleCI 2.0 configuration file
#
# Check https://circleci.com/docs/2.0/language-python/ for more details
#
version: 2.1
executors:
python:
parameters:
tag:
type: string
default: latest
docker:
- image: circleci/python:<< parameters.tag >>
- image: circleci/buildpack-deps:stretch
working_directory: ~/repo
commands:
docker-prereqs:
description: Set up docker prerequisite requirement
steps:
- run: sudo apt-get update && sudo apt-get install -y --no-install-recommends
libudev-dev libavformat-dev libavcodec-dev libavdevice-dev libavutil-dev
libswscale-dev libswresample-dev libavfilter-dev
install-requirements:
description: Set up venv and install requirements python packages with cache support
parameters:
python:
type: string
default: latest
all:
description: pip install -r requirements_all.txt
type: boolean
default: false
test:
description: pip install -r requirements_test.txt
type: boolean
default: false
test_all:
description: pip install -r requirements_test_all.txt
type: boolean
default: false
steps:
- restore_cache:
keys:
- v1-<< parameters.python >>-{{ checksum "homeassistant/package_constraints.txt" }}-<<# parameters.all >>{{ checksum "requirements_all.txt" }}<</ parameters.all>>-<<# parameters.test >>{{ checksum "requirements_test.txt" }}<</ parameters.test>>-<<# parameters.test_all >>{{ checksum "requirements_test_all.txt" }}<</ parameters.test_all>>
- run:
name: install dependencies
command: |
python3 -m venv venv
. venv/bin/activate
pip install -q -U pip
pip install -q -U setuptools
<<# parameters.all >>pip install -q --progress-bar off -r requirements_all.txt -c homeassistant/package_constraints.txt<</ parameters.all>>
<<# parameters.test >>pip install -q --progress-bar off -r requirements_test.txt -c homeassistant/package_constraints.txt<</ parameters.test>>
<<# parameters.test_all >>pip install -q --progress-bar off -r requirements_test_all.txt -c homeassistant/package_constraints.txt<</ parameters.test_all>>
no_output_timeout: 15m
- save_cache:
paths:
- ./venv
key: v1-<< parameters.python >>-{{ checksum "homeassistant/package_constraints.txt" }}-<<# parameters.all >>{{ checksum "requirements_all.txt" }}<</ parameters.all>>-<<# parameters.test >>{{ checksum "requirements_test.txt" }}<</ parameters.test>>-<<# parameters.test_all >>{{ checksum "requirements_test_all.txt" }}<</ parameters.test_all>>
install:
description: Install Home Assistant
steps:
- run:
name: install
command: |
. venv/bin/activate
pip install -q --progress-bar off -e .
jobs:
static-check:
executor:
name: python
tag: 3.5.5-stretch
steps:
- checkout
- docker-prereqs
- install-requirements:
python: 3.5.5-stretch
test: true
- run:
name: run static check
command: |
. venv/bin/activate
flake8 homeassistant tests script
- run:
name: run static type check
command: |
. venv/bin/activate
TYPING_FILES=$(cat mypyrc)
mypy $TYPING_FILES
- install
- run:
name: validate manifests
command: |
. venv/bin/activate
python -m script.hassfest validate
- run:
name: run gen_requirements_all
command: |
. venv/bin/activate
python script/gen_requirements_all.py validate
pre-install-all-requirements:
executor:
name: python
tag: 3.5.5-stretch
steps:
- checkout
- docker-prereqs
- install-requirements:
python: 3.5.5-stretch
all: true
test: true
pylint:
executor:
name: python
tag: 3.5.5-stretch
parallelism: 2
steps:
- checkout
- docker-prereqs
- install-requirements:
python: 3.5.5-stretch
all: true
test: true
- install
- run:
name: run pylint
command: |
. venv/bin/activate
PYFILES=$(circleci tests glob "homeassistant/**/*.py" | circleci tests split)
pylint ${PYFILES}
no_output_timeout: 15m
pre-test:
parameters:
python:
type: string
executor:
name: python
tag: << parameters.python >>
steps:
- checkout
- docker-prereqs
- install-requirements:
python: << parameters.python >>
test_all: true
test:
parameters:
python:
type: string
executor:
name: python
tag: << parameters.python >>
parallelism: 2
steps:
- checkout
- docker-prereqs
- install-requirements:
python: << parameters.python >>
test_all: true
- install
- run:
name: run tests with code coverage
command: |
. venv/bin/activate
CC_SWITCH="--cov --cov-report="
TESTFILES=$(circleci tests glob "tests/**/test_*.py" | circleci tests split --split-by=timings)
pytest --timeout=9 --durations=10 --junitxml=test-reports/homeassistant/results.xml -qq -o junit_family=xunit2 -o junit_suite_name=homeassistant -o console_output_style=count -p no:sugar $CC_SWITCH -- ${TESTFILES}
script/check_dirty
codecov
- store_test_results:
path: test-reports
- store_artifacts:
path: htmlcov
destination: cov-reports
- store_artifacts:
path: test-reports
destination: test-reports
# This job uses the machine executor, i.e. a classic CircleCI VM, because we need both lokalise-cli and a Python runtime.
# Classic CircleCI includes Python 2.7.12 and Python 3.5.2 managed by pyenv; the Python version may need to change if
# CircleCI changes its VM in the future.
upload-translations:
machine: true
steps:
- checkout
- run:
name: upload english translations
command: |
pyenv versions
pyenv global 3.5.2
docker pull lokalise/lokalise-cli@sha256:2198814ebddfda56ee041a4b427521757dd57f75415ea9693696a64c550cef21
script/translations_upload
workflows:
version: 2
build:
jobs:
- static-check
- pre-install-all-requirements:
requires:
- static-check
- pylint:
requires:
- pre-install-all-requirements
- pre-test:
name: pre-test 3.5.5
requires:
- static-check
python: 3.5.5-stretch
- pre-test:
name: pre-test 3.6
requires:
- static-check
python: 3.6-stretch
- pre-test:
name: pre-test 3.7
requires:
- static-check
python: 3.7-stretch
- test:
name: test 3.5.5
requires:
- pre-test 3.5.5
python: 3.5.5-stretch
- test:
name: test 3.6
requires:
- pre-test 3.6
python: 3.6-stretch
- test:
name: test 3.7
requires:
- pre-test 3.7
python: 3.7-stretch
# CircleCI does not allow failure yet
# - test:
# name: test 3.8
# python: 3.8-rc-stretch
- upload-translations:
requires:
- static-check
filters:
branches:
only: dev


@@ -34,10 +34,13 @@ omit =
homeassistant/components/androidtv/*
homeassistant/components/anel_pwrctrl/switch.py
homeassistant/components/anthemav/media_player.py
homeassistant/components/apache_kafka/*
homeassistant/components/apcupsd/*
homeassistant/components/apple_tv/*
homeassistant/components/aqualogic/*
homeassistant/components/aquostv/media_player.py
homeassistant/components/arcam_fmj/media_player.py
homeassistant/components/arcam_fmj/__init__.py
homeassistant/components/arduino/*
homeassistant/components/arest/binary_sensor.py
homeassistant/components/arest/sensor.py
@@ -49,7 +52,9 @@ omit =
homeassistant/components/asterisk_mbox/*
homeassistant/components/asuswrt/device_tracker.py
homeassistant/components/august/*
homeassistant/components/aurora_abb_powerone/sensor.py
homeassistant/components/automatic/device_tracker.py
homeassistant/components/avea/light.py
homeassistant/components/avion/light.py
homeassistant/components/azure_event_hub/*
homeassistant/components/baidu/tts.py
@@ -118,6 +123,7 @@ omit =
homeassistant/components/ddwrt/device_tracker.py
homeassistant/components/decora/light.py
homeassistant/components/decora_wifi/light.py
homeassistant/components/delijn/*
homeassistant/components/deluge/sensor.py
homeassistant/components/deluge/switch.py
homeassistant/components/denon/media_player.py
@@ -197,6 +203,7 @@ omit =
homeassistant/components/fints/sensor.py
homeassistant/components/fitbit/sensor.py
homeassistant/components/fixer/sensor.py
homeassistant/components/fleetgo/device_tracker.py
homeassistant/components/flexit/climate.py
homeassistant/components/flic/binary_sensor.py
homeassistant/components/flock/notify.py
@@ -205,6 +212,8 @@ omit =
homeassistant/components/folder/sensor.py
homeassistant/components/folder_watcher/*
homeassistant/components/foobot/sensor.py
homeassistant/components/fortios/device_tracker.py
homeassistant/components/fortigate/*
homeassistant/components/foscam/camera.py
homeassistant/components/foursquare/*
homeassistant/components/free_mobile/notify.py
@@ -214,6 +223,7 @@ omit =
homeassistant/components/fritzbox_callmonitor/sensor.py
homeassistant/components/fritzbox_netmonitor/sensor.py
homeassistant/components/fritzdect/switch.py
homeassistant/components/fronius/sensor.py
homeassistant/components/frontier_silicon/media_player.py
homeassistant/components/futurenow/light.py
homeassistant/components/garadget/cover.py
@@ -405,6 +415,8 @@ omit =
homeassistant/components/nissan_leaf/*
homeassistant/components/nmap_tracker/device_tracker.py
homeassistant/components/nmbs/sensor.py
homeassistant/components/notion/binary_sensor.py
homeassistant/components/notion/sensor.py
homeassistant/components/noaa_tides/sensor.py
homeassistant/components/norway_air/air_quality.py
homeassistant/components/nsw_fuel_station/sensor.py
@@ -464,8 +476,6 @@ omit =
homeassistant/components/prometheus/*
homeassistant/components/prowl/notify.py
homeassistant/components/proxy/camera.py
homeassistant/components/ps4/__init__.py
homeassistant/components/ps4/media_player.py
homeassistant/components/ptvsd/*
homeassistant/components/pulseaudio_loopback/switch.py
homeassistant/components/pushbullet/notify.py
@@ -491,6 +501,7 @@ omit =
homeassistant/components/rainmachine/binary_sensor.py
homeassistant/components/rainmachine/sensor.py
homeassistant/components/rainmachine/switch.py
homeassistant/components/rainforest_eagle/sensor.py
homeassistant/components/raspihats/*
homeassistant/components/raspyrfm/*
homeassistant/components/recollect_waste/sensor.py
@@ -507,7 +518,6 @@ omit =
homeassistant/components/rfxtrx/*
homeassistant/components/ring/camera.py
homeassistant/components/ripple/sensor.py
homeassistant/components/ritassist/device_tracker.py
homeassistant/components/rocketchat/notify.py
homeassistant/components/roku/*
homeassistant/components/roomba/vacuum.py
@@ -581,6 +591,7 @@ omit =
homeassistant/components/stiebel_eltron/*
homeassistant/components/streamlabswater/*
homeassistant/components/stride/notify.py
homeassistant/components/suez_water/*
homeassistant/components/supervisord/sensor.py
homeassistant/components/swiss_hydrological_data/sensor.py
homeassistant/components/swiss_public_transport/sensor.py
@@ -626,7 +637,7 @@ omit =
homeassistant/components/tomato/device_tracker.py
homeassistant/components/toon/*
homeassistant/components/torque/sensor.py
homeassistant/components/totalconnect/alarm_control_panel.py
homeassistant/components/totalconnect/*
homeassistant/components/touchline/climate.py
homeassistant/components/tplink/device_tracker.py
homeassistant/components/tplink/light.py
@@ -637,10 +648,13 @@ omit =
homeassistant/components/trackr/device_tracker.py
homeassistant/components/tradfri/*
homeassistant/components/tradfri/light.py
homeassistant/components/trafikverket_train/sensor.py
homeassistant/components/trafikverket_weatherstation/sensor.py
homeassistant/components/transmission/*
homeassistant/components/travisci/sensor.py
homeassistant/components/tuya/*
homeassistant/components/twentemilieu/const.py
homeassistant/components/twentemilieu/sensor.py
homeassistant/components/twilio_call/notify.py
homeassistant/components/twilio_sms/notify.py
homeassistant/components/twitch/sensor.py
@@ -655,12 +669,22 @@ omit =
homeassistant/components/uptimerobot/binary_sensor.py
homeassistant/components/uscis/sensor.py
homeassistant/components/usps/*
homeassistant/components/vallox/*
homeassistant/components/vasttrafik/sensor.py
homeassistant/components/velbus/*
homeassistant/components/velbus/__init__.py
homeassistant/components/velbus/binary_sensor.py
homeassistant/components/velbus/climate.py
homeassistant/components/velbus/const.py
homeassistant/components/velbus/cover.py
homeassistant/components/velbus/sensor.py
homeassistant/components/velbus/switch.py
homeassistant/components/velux/*
homeassistant/components/venstar/climate.py
homeassistant/components/vera/*
homeassistant/components/verisure/*
homeassistant/components/vesync/__init__.py
homeassistant/components/vesync/common.py
homeassistant/components/vesync/const.py
homeassistant/components/vesync/switch.py
homeassistant/components/viaggiatreno/sensor.py
homeassistant/components/vizio/media_player.py
@@ -684,6 +708,8 @@ omit =
homeassistant/components/worldtidesinfo/sensor.py
homeassistant/components/worxlandroid/sensor.py
homeassistant/components/wunderlist/*
homeassistant/components/wwlln/__init__.py
homeassistant/components/wwlln/geo_location.py
homeassistant/components/x10/light.py
homeassistant/components/xbox_live/sensor.py
homeassistant/components/xeoma/camera.py

.devcontainer/Dockerfile

@@ -0,0 +1,30 @@
FROM python:3.7
RUN apt-get update \
&& apt-get install -y --no-install-recommends \
libudev-dev \
libavformat-dev \
libavcodec-dev \
libavdevice-dev \
libavutil-dev \
libswscale-dev \
libswresample-dev \
libavfilter-dev \
git \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/*
WORKDIR /usr/src
RUN git clone --depth 1 https://github.com/home-assistant/hass-release \
&& cd hass-release \
&& pip3 install -e .
WORKDIR /workspace
# Install Python dependencies from requirements.txt if it exists
COPY requirements_test_all.txt homeassistant/package_constraints.txt /workspace/
RUN pip3 install -r requirements_test_all.txt -c package_constraints.txt
# Set the default shell to bash instead of sh
ENV SHELL /bin/bash


@@ -0,0 +1,35 @@
// See https://aka.ms/vscode-remote/devcontainer.json for format details.
{
"name": "Home Assistant Dev",
"context": "..",
"dockerFile": "Dockerfile",
"postCreateCommand": "pip3 install -e .",
"appPort": 8123,
"runArgs": [
"-e", "GIT_EDTIOR='code --wait'"
],
"extensions": [
"ms-python.python",
"ms-azure-devops.azure-pipelines",
"redhat.vscode-yaml"
],
"settings": {
"python.pythonPath": "/usr/local/bin/python",
"python.linting.pylintEnabled": true,
"python.linting.enabled": true,
"python.formatting.provider": "black",
"editor.formatOnPaste": false,
"editor.formatOnSave": true,
"editor.formatOnType": true,
"files.trimTrailingWhitespace": true,
"editor.rulers": [80],
"terminal.integrated.shell.linux": "/bin/bash",
"yaml.customTags": [
"!secret scalar",
"!include_dir_named scalar",
"!include_dir_list scalar",
"!include_dir_merge_list scalar",
"!include_dir_merge_named scalar"
]
}
}


@@ -3,7 +3,7 @@
- Make sure you are running the latest version of Home Assistant before reporting an issue: https://github.com/home-assistant/home-assistant/releases
- Frontend issues should be submitted to the home-assistant-polymer repository: https://github.com/home-assistant/home-assistant-polymer/issues
- iOS issues should be submitted to the home-assistant-iOS repository: https://github.com/home-assistant/home-assistant-iOS/issues
- Do not report issues for components if you are using custom components: files in <config-dir>/custom_components
- Do not report issues for integrations if you are using a custom integration: files in <config-dir>/custom_components
- This is for bugs only. Feature and enhancement requests should go in our community forum: https://community.home-assistant.io/c/feature-requests
- Provide as many details as possible. Paste logs, configuration sample and code into the backticks. Do not delete any text from this template!
-->


@@ -9,7 +9,7 @@ about: Create a report to help us improve
- Make sure you are running the latest version of Home Assistant before reporting an issue: https://github.com/home-assistant/home-assistant/releases
- Frontend issues should be submitted to the home-assistant-polymer repository: https://github.com/home-assistant/home-assistant-polymer/issues
- iOS issues should be submitted to the home-assistant-iOS repository: https://github.com/home-assistant/home-assistant-iOS/issues
- Do not report issues for components if you are using custom components: files in <config-dir>/custom_components
- Do not report issues for integrations if you are using a custom integration: files in <config-dir>/custom_components
- This is for bugs only. Feature and enhancement requests should go in our community forum: https://community.home-assistant.io/c/feature-requests
- Provide as many details as possible. Paste logs, configuration sample and code into the backticks. Do not delete any text from this template!
-->

.github/lock.yml

@@ -0,0 +1,27 @@
# Configuration for Lock Threads - https://github.com/dessant/lock-threads
# Number of days of inactivity before a closed issue or pull request is locked
daysUntilLock: 1
# Skip issues and pull requests created before a given timestamp. Timestamp must
# follow ISO 8601 (`YYYY-MM-DD`). Set to `false` to disable
skipCreatedBefore: 2019-07-01
# Issues and pull requests with these labels will be ignored. Set to `[]` to disable
exemptLabels: []
# Label to add before locking, such as `outdated`. Set to `false` to disable
lockLabel: false
# Comment to post before locking. Set to `false` to disable
lockComment: false
# Assign `resolved` as the reason for locking. Set to `false` to disable
setLockReason: false
# Limit to only `issues` or `pulls`
only: pulls
# Optionally, specify configuration settings just for `issues` or `pulls`
issues:
daysUntilLock: 30

.github/stale.yml

@@ -0,0 +1,54 @@
# Configuration for probot-stale - https://github.com/probot/stale
# Number of days of inactivity before an Issue or Pull Request becomes stale
daysUntilStale: 90
# Number of days of inactivity before an Issue or Pull Request with the stale label is closed.
# Set to false to disable. If disabled, issues still need to be closed manually, but will remain marked as stale.
daysUntilClose: 7
# Only issues or pull requests with all of these labels are check if stale. Defaults to `[]` (disabled)
onlyLabels: []
# Issues or Pull Requests with these labels will never be considered stale. Set to `[]` to disable
exemptLabels:
- under investigation
# Set to true to ignore issues in a project (defaults to false)
exemptProjects: true
# Set to true to ignore issues in a milestone (defaults to false)
exemptMilestones: true
# Set to true to ignore issues with an assignee (defaults to false)
exemptAssignees: false
# Label to use when marking as stale
staleLabel: stale
# Comment to post when marking as stale. Set to `false` to disable
markComment: >
There hasn't been any activity on this issue recently. Due to the high number
of incoming GitHub notifications, we have to clean some of the old issues,
as many of them have already been resolved with the latest updates.
Please make sure to update to the latest Home Assistant version and check
if that solves the issue. Let us know if that works for you by adding a
comment 👍
This issue now has been marked as stale and will be closed if no further
activity occurs. Thank you for your contributions.
# Comment to post when removing the stale label.
# unmarkComment: >
# Your comment here.
# Comment to post when closing a stale Issue or Pull Request.
# closeComment: >
# Your comment here.
# Limit the number of actions per hour, from 1-30. Default is 30
limitPerRun: 30
# Limit to only `issues` or `pulls`
only: issues

.gitignore

@@ -4,6 +4,10 @@ config2/*
tests/testing_config/deps
tests/testing_config/home-assistant.log
# hass-release
data/
.token
# Hide sublime text stuff
*.sublime-project
*.sublime-workspace
@@ -94,8 +98,10 @@ virtualization/vagrant/.vagrant
virtualization/vagrant/config
# Visual Studio Code
.vscode
.devcontainer
.vscode/*
!.vscode/cSpell.json
!.vscode/extensions.json
!.vscode/tasks.json
# Built docs
docs/build
@@ -108,6 +114,7 @@ desktop.ini
# mypy
/.mypy_cache/*
/.dmypy.json
# Secrets
.lokalise_token

.pre-commit-config.yaml

@@ -0,0 +1,8 @@
repos:
- repo: https://github.com/python/black
rev: 19.3b0
hooks:
- id: black
args:
- --safe
- --quiet
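With this configuration in place, the Black hook can typically be activated locally by running `pre-commit install` once and exercised across the whole tree with `pre-commit run --all-files`; both are standard pre-commit CLI commands, not something defined by this repository.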


@@ -16,14 +16,14 @@ addons:
matrix:
fast_finish: true
include:
- python: "3.5.3"
- python: "3.6"
env: TOXENV=lint
- python: "3.5.3"
- python: "3.6"
env: TOXENV=pylint
- python: "3.5.3"
- python: "3.6"
env: TOXENV=typing
- python: "3.5.3"
env: TOXENV=py35
- python: "3.6"
env: TOXENV=py36
- python: "3.7"
env: TOXENV=py37

.vscode/tasks.json

@@ -0,0 +1,92 @@
{
"version": "2.0.0",
"tasks": [
{
"label": "Preview",
"type": "shell",
"command": "hass -c ./config",
"group": {
"kind": "test",
"isDefault": true,
},
"presentation": {
"reveal": "always",
"panel": "new"
},
"problemMatcher": []
},
{
"label": "Pytest",
"type": "shell",
"command": "pytest --timeout=10 tests",
"group": {
"kind": "test",
"isDefault": true,
},
"presentation": {
"reveal": "always",
"panel": "new"
},
"problemMatcher": []
},
{
"label": "Flake8",
"type": "shell",
"command": "flake8 homeassistant tests",
"group": {
"kind": "test",
"isDefault": true,
},
"presentation": {
"reveal": "always",
"panel": "new"
},
"problemMatcher": []
},
{
"label": "Pylint",
"type": "shell",
"command": "pylint homeassistant",
"dependsOn": [
"Install all Requirements"
],
"group": {
"kind": "test",
"isDefault": true,
},
"presentation": {
"reveal": "always",
"panel": "new"
},
"problemMatcher": []
},
{
"label": "Generate Requirements",
"type": "shell",
"command": "./script/gen_requirements_all.py",
"group": {
"kind": "build",
"isDefault": true
},
"presentation": {
"reveal": "always",
"panel": "new"
},
"problemMatcher": []
},
{
"label": "Install all Requirements",
"type": "shell",
"command": "pip3 install -r requirements_all.txt -c homeassistant/package_constraints.txt",
"group": {
"kind": "build",
"isDefault": true
},
"presentation": {
"reveal": "always",
"panel": "new"
},
"problemMatcher": []
}
]
}


@@ -24,14 +24,18 @@ homeassistant/components/alpha_vantage/* @fabaff
homeassistant/components/amazon_polly/* @robbiet480
homeassistant/components/ambiclimate/* @danielhiversen
homeassistant/components/ambient_station/* @bachya
homeassistant/components/apache_kafka/* @bachya
homeassistant/components/api/* @home-assistant/core
homeassistant/components/aprs/* @PhilRW
homeassistant/components/arcam_fmj/* @elupus
homeassistant/components/arduino/* @fabaff
homeassistant/components/arest/* @fabaff
homeassistant/components/asuswrt/* @kennedyshead
homeassistant/components/aurora_abb_powerone/* @davet2001
homeassistant/components/auth/* @home-assistant/core
homeassistant/components/automatic/* @armills
homeassistant/components/automation/* @home-assistant/core
homeassistant/components/avea/* @pattyland
homeassistant/components/awair/* @danielsjf
homeassistant/components/aws/* @awarecan @robbiet480
homeassistant/components/axis/* @kane610
@@ -39,12 +43,11 @@ homeassistant/components/azure_event_hub/* @eavanvalkenburg
homeassistant/components/bitcoin/* @fabaff
homeassistant/components/bizkaibus/* @UgaitzEtxebarria
homeassistant/components/blink/* @fronzbot
homeassistant/components/bmw_connected_drive/* @ChristianKuehnel
homeassistant/components/braviatv/* @robbiet480
homeassistant/components/broadlink/* @danielhiversen
homeassistant/components/brunt/* @eavanvalkenburg
homeassistant/components/bt_smarthub/* @jxwolstenholme
homeassistant/components/buienradar/* @ties
homeassistant/components/buienradar/* @mjj4791 @ties
homeassistant/components/cisco_ios/* @fbradyirl
homeassistant/components/cisco_mobility_express/* @fbradyirl
homeassistant/components/cisco_webex_teams/* @fbradyirl
@@ -62,6 +65,7 @@ homeassistant/components/cups/* @fabaff
homeassistant/components/daikin/* @fredrike @rofrantz
homeassistant/components/darksky/* @fabaff
homeassistant/components/deconz/* @kane610
homeassistant/components/delijn/* @bollewolle
homeassistant/components/demo/* @home-assistant/core
homeassistant/components/device_automation/* @home-assistant/core
homeassistant/components/digital_ocean/* @fabaff
@@ -89,8 +93,11 @@ homeassistant/components/fitbit/* @robbiet480
homeassistant/components/fixer/* @fabaff
homeassistant/components/flock/* @fabaff
homeassistant/components/flunearyou/* @bachya
homeassistant/components/fortigate/* @kifeo
homeassistant/components/fortios/* @kimfrellsen
homeassistant/components/foursquare/* @robbiet480
homeassistant/components/freebox/* @snoof85
homeassistant/components/fronius/* @nielstron
homeassistant/components/frontend/* @home-assistant/frontend
homeassistant/components/gearbest/* @HerrHofrat
homeassistant/components/geniushub/* @zxdavb
@@ -113,7 +120,6 @@ homeassistant/components/history/* @home-assistant/core
homeassistant/components/history_graph/* @andrey-git
homeassistant/components/hive/* @Rendili @KJonline
homeassistant/components/homeassistant/* @home-assistant/core
homeassistant/components/homekit/* @cdce8p
homeassistant/components/homekit_controller/* @Jc2k
homeassistant/components/homematic/* @pvizeli @danielperna84
homeassistant/components/honeywell/* @zxdavb
@@ -180,10 +186,12 @@ homeassistant/components/nissan_leaf/* @filcole
homeassistant/components/nmbs/* @thibmaek
homeassistant/components/no_ip/* @fabaff
homeassistant/components/notify/* @home-assistant/core
homeassistant/components/notion/* @bachya
homeassistant/components/nsw_fuel_station/* @nickw444
homeassistant/components/nuki/* @pschmitt
homeassistant/components/ohmconnect/* @robbiet480
homeassistant/components/onboarding/* @home-assistant/core
homeassistant/components/opentherm_gw/* @mvn23
homeassistant/components/openuv/* @bachya
homeassistant/components/openweathermap/* @fabaff
homeassistant/components/orangepi_gpio/* @pascallj
@@ -205,6 +213,7 @@ homeassistant/components/qnap/* @colinodell
homeassistant/components/quantum_gateway/* @cisasteelersfan
homeassistant/components/qwikswitch/* @kellerza
homeassistant/components/raincloud/* @vanstinator
homeassistant/components/rainforest_eagle/* @gtdiehl
homeassistant/components/rainmachine/* @bachya
homeassistant/components/random/* @fabaff
homeassistant/components/repetier/* @MTrab
@@ -231,12 +240,15 @@ homeassistant/components/smtp/* @fabaff
homeassistant/components/solaredge_local/* @drobtravels
homeassistant/components/solax/* @squishykid
homeassistant/components/somfy/* @tetienne
homeassistant/components/songpal/* @rytilahti
homeassistant/components/sonos/* @amelchio
homeassistant/components/spaceapi/* @fabaff
homeassistant/components/spider/* @peternijssen
homeassistant/components/sql/* @dgomes
homeassistant/components/statistics/* @fabaff
homeassistant/components/stiebel_eltron/* @fucm
homeassistant/components/stream/* @hunterjm
homeassistant/components/suez_water/* @ooii
homeassistant/components/sun/* @Swamp-Ig
homeassistant/components/supla/* @mwegrzynek
homeassistant/components/swiss_hydrological_data/* @fabaff
@@ -263,7 +275,9 @@ homeassistant/components/toon/* @frenck
homeassistant/components/tplink/* @rytilahti
homeassistant/components/traccar/* @ludeeus
homeassistant/components/tradfri/* @ggravlingen
homeassistant/components/trafikverket_train/* @endor-force
homeassistant/components/tts/* @robbiet480
homeassistant/components/twentemilieu/* @frenck
homeassistant/components/twilio_call/* @robbiet480
homeassistant/components/twilio_sms/* @robbiet480
homeassistant/components/unifi/* @kane610
@@ -272,8 +286,10 @@ homeassistant/components/updater/* @home-assistant/core
homeassistant/components/upnp/* @robbiet480
homeassistant/components/uptimerobot/* @ludeeus
homeassistant/components/utility_meter/* @dgomes
homeassistant/components/velbus/* @ceral2nd
homeassistant/components/velux/* @Julius2342
homeassistant/components/version/* @fabaff
homeassistant/components/vesync/* @markperdue @webdjoe
homeassistant/components/vizio/* @raman325
homeassistant/components/vlc_telnet/* @rodripf
homeassistant/components/waqi/* @andrey-git
@@ -283,6 +299,7 @@ homeassistant/components/weblink/* @home-assistant/core
homeassistant/components/websocket_api/* @home-assistant/core
homeassistant/components/wemo/* @sqldiablo
homeassistant/components/worldclock/* @fabaff
homeassistant/components/wwlln/* @bachya
homeassistant/components/xfinity/* @cisasteelersfan
homeassistant/components/xiaomi_aqara/* @danielhiversen @syssi
homeassistant/components/xiaomi_miio/* @rytilahti @syssi
@@ -301,5 +318,4 @@ homeassistant/components/zoneminder/* @rohankapoorcom
homeassistant/components/zwave/* @home-assistant/z-wave
# Individual files
homeassistant/components/group/cover @cdce8p
homeassistant/components/demo/weather @fabaff
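The CODEOWNERS entries above map path patterns to reviewer handles. As a rough illustration only (not GitHub's actual matcher, which uses gitignore-style patterns where the last matching entry wins), a minimal sketch of resolving owners for a changed file might look like the following; `parse_codeowners` and `owners_for` are hypothetical helpers, not part of the repository:

```python
# Illustrative only: resolve owners for a changed path from CODEOWNERS-style
# "<pattern> <owner...>" lines. Simplified matching; GitHub's real semantics
# are gitignore-style with "last matching entry wins".
from fnmatch import fnmatch
from typing import List, Tuple


def parse_codeowners(text: str) -> List[Tuple[str, List[str]]]:
    """Parse 'pattern owner...' lines, skipping blanks and comments."""
    entries = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        pattern, *owners = line.split()
        entries.append((pattern, owners))
    return entries


def owners_for(path: str, entries: List[Tuple[str, List[str]]]) -> List[str]:
    """Return the owners of the last entry whose pattern matches the path."""
    matched: List[str] = []
    for pattern, owners in entries:
        if fnmatch(path, pattern) or path.startswith(pattern.rstrip("*")):
            matched = owners
    return matched


if __name__ == "__main__":
    sample = (
        "homeassistant/components/openuv/* @bachya\n"
        "homeassistant/components/group/cover @cdce8p\n"
    )
    entries = parse_codeowners(sample)
    print(owners_for("homeassistant/components/openuv/sensor.py", entries))
    # ['@bachya']
```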


@@ -2,7 +2,7 @@
# When updating this file, please also update virtualization/Docker/Dockerfile.dev
# This way, the development image and the production image are kept in sync.
FROM python:3.7
FROM python:3.7-buster
LABEL maintainer="Paulus Schoutsen <Paulus@PaulusSchoutsen.nl>"
# Uncomment any of the following lines to disable the installation.
@@ -24,12 +24,14 @@ RUN virtualization/Docker/setup_docker_prereqs
# Install hass component dependencies
COPY requirements_all.txt requirements_all.txt
# Uninstall enum34 because some dependencies install it, but it breaks Python 3.4+.
# See PR #8103 for more info.
RUN pip3 install --no-cache-dir -r requirements_all.txt && \
pip3 install --no-cache-dir mysqlclient psycopg2 uvloop==0.12.2 cchardet cython tensorflow
# Copy source
COPY . .
EXPOSE 8123
EXPOSE 8300
EXPOSE 51827
CMD [ "python", "-m", "homeassistant", "--config", "/config" ]


@@ -4,147 +4,212 @@ trigger:
batch: true
branches:
include:
- rc
- dev
pr: none
- master
pr:
- rc
- dev
- master
resources:
containers:
- container: 35
image: homeassistant/ci-azure:3.5
- container: 36
image: homeassistant/ci-azure:3.6
- container: 37
image: homeassistant/ci-azure:3.7
variables:
- name: ArtifactFeed
value: '2df3ae11-3bf6-49bc-a809-ba0d340d6a6d'
- name: PythonMain
value: '35'
value: '36'
- group: codecov
stages:
jobs:
- stage: 'Overview'
jobs:
- job: 'Lint'
pool:
vmImage: 'ubuntu-latest'
container: $[ variables['PythonMain'] ]
steps:
- script: |
python -m venv venv
- job: 'Lint'
pool:
vmImage: 'ubuntu-latest'
container: $[ variables['PythonMain'] ]
steps:
- script: |
python -m venv lint
. lint/bin/activate
pip install flake8
flake8 homeassistant tests script
displayName: 'Run flake8'
. venv/bin/activate
pip install -r requirements_test.txt -c homeassistant/package_constraints.txt
displayName: 'Setup Env'
- script: |
. venv/bin/activate
flake8 homeassistant tests script
displayName: 'Run flake8'
- job: 'Validate'
pool:
vmImage: 'ubuntu-latest'
container: $[ variables['PythonMain'] ]
steps:
- script: |
python -m venv venv
. venv/bin/activate
pip install -e .
displayName: 'Setup Env'
- script: |
. venv/bin/activate
python -m script.hassfest validate
displayName: 'Validate manifests'
- script: |
. venv/bin/activate
./script/gen_requirements_all.py validate
displayName: 'requirements_all validate'
- job: 'CheckFormat'
pool:
vmImage: 'ubuntu-latest'
container: $[ variables['PythonMain'] ]
steps:
- script: |
python -m venv venv
- job: 'Check'
. venv/bin/activate
pip install -r requirements_test.txt -c homeassistant/package_constraints.txt
displayName: 'Setup Env'
- script: |
. venv/bin/activate
./script/check_format
displayName: 'Check Black formatting'
- stage: 'Tests'
dependsOn:
- Lint
pool:
vmImage: 'ubuntu-latest'
strategy:
maxParallel: 1
matrix:
Python35:
python.version: '3.5'
python.container: '35'
Python36:
python.version: '3.6'
python.container: '36'
Python37:
python.version: '3.7'
python.container: '37'
container: $[ variables['python.container'] ]
steps:
- script: |
echo "$(python.version)" > .cache
displayName: 'Set python $(python.version) for requirement cache'
- 'Overview'
jobs:
- job: 'PyTest'
pool:
vmImage: 'ubuntu-latest'
strategy:
maxParallel: 3
matrix:
Python36:
python.container: '36'
Python37:
python.container: '37'
container: $[ variables['python.container'] ]
steps:
- script: |
python --version > .cache
displayName: 'Set python $(python.container) for requirement cache'
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
displayName: 'Restore artifacts based on Requirements'
inputs:
keyfile: 'requirements_test_all.txt, .cache, homeassistant/package_constraints.txt'
targetfolder: './venv'
vstsFeed: '$(ArtifactFeed)'
- script: |
set -e
python -m venv venv
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
displayName: 'Restore artifacts based on Requirements'
inputs:
keyfile: 'requirements_test_all.txt, .cache'
targetfolder: './venv'
vstsFeed: '$(ArtifactFeed)'
. venv/bin/activate
pip install -U pip setuptools pytest-azurepipelines -c homeassistant/package_constraints.txt
pip install -r requirements_test_all.txt -c homeassistant/package_constraints.txt
# This is temporary. Eventually we should make sure our 4 dependencies drop typing.
# Find offending deps with `pipdeptree -r -p typing`
pip uninstall -y typing
displayName: 'Create Virtual Environment & Install Requirements'
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
# Explicit Cache Save (instead of using RestoreAndSaveCache)
# Don't wait for all the other tasks in this job to complete (±30 minutes) before saving the cache; other parallel jobs might utilize it
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
displayName: 'Save artifacts based on Requirements'
inputs:
keyfile: 'requirements_test_all.txt, .cache, homeassistant/package_constraints.txt'
targetfolder: './venv'
vstsFeed: '$(ArtifactFeed)'
- script: |
. venv/bin/activate
pip install -e .
displayName: 'Install Home Assistant for python $(python.container)'
- script: |
. venv/bin/activate
pytest --timeout=9 --durations=10 --junitxml=test-results.xml -qq -o console_output_style=count -p no:sugar tests
displayName: 'Run pytest for python $(python.container)'
condition: and(succeeded(), ne(variables['python.container'], variables['PythonMain']))
- script: |
set -e
- script: |
set -e
python -m venv venv
. venv/bin/activate
pip install -U pip setuptools
pip install -r requirements_test_all.txt -c homeassistant/package_constraints.txt
displayName: 'Create Virtual Environment & Install Requirements'
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
. venv/bin/activate
pytest --timeout=9 --durations=10 --junitxml=test-results.xml --cov --cov-report=xml -qq -o console_output_style=count -p no:sugar tests
codecov --token $(codecovToken)
displayName: 'Run pytest for python $(python.container) / coverage'
condition: and(succeeded(), eq(variables['python.container'], variables['PythonMain']))
- task: PublishTestResults@2
condition: succeededOrFailed()
inputs:
testResultsFiles: 'test-results.xml'
testRunTitle: 'Publish test results for Python $(python.container)'
- task: PublishCodeCoverageResults@1
inputs:
codeCoverageTool: cobertura
summaryFileLocation: coverage.xml
displayName: 'publish coverage artifact'
condition: and(succeeded(), eq(variables['python.container'], variables['PythonMain']))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
displayName: 'Save artifacts based on Requirements'
inputs:
keyfile: 'requirements_test_all.txt, .cache'
targetfolder: './venv'
vstsFeed: '$(ArtifactFeed)'
- script: |
. venv/bin/activate
pip install -e .
displayName: 'Install Home Assistant for python $(python.version)'
- script: |
. venv/bin/activate
pytest --timeout=9 --durations=10 --junitxml=junit/test-results.xml -qq -o console_output_style=count -p no:sugar tests
displayName: 'Run pytest for python $(python.version)'
- task: PublishTestResults@2
condition: succeededOrFailed()
inputs:
testResultsFiles: '**/test-*.xml'
testRunTitle: 'Publish test results for Python $(python.version)'
- job: 'FullCheck'
- stage: 'FullCheck'
dependsOn:
- Check
pool:
vmImage: 'ubuntu-latest'
container: $[ variables['PythonMain'] ]
steps:
- script: |
echo "$(PythonMain)" > .cache
displayName: 'Set python $(python.version) for requirement cache'
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
displayName: 'Restore artifacts based on Requirements'
inputs:
keyfile: 'requirements_all.txt, requirements_test.txt, .cache'
targetfolder: './venv'
vstsFeed: '$(ArtifactFeed)'
- 'Overview'
jobs:
- job: 'Pylint'
pool:
vmImage: 'ubuntu-latest'
container: $[ variables['PythonMain'] ]
steps:
- script: |
python --version > .cache
displayName: 'Set python $(PythonMain) for requirement cache'
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
displayName: 'Restore artifacts based on Requirements'
inputs:
keyfile: 'requirements_all.txt, requirements_test.txt, .cache, homeassistant/package_constraints.txt'
targetfolder: './venv'
vstsFeed: '$(ArtifactFeed)'
- script: |
set -e
python -m venv venv
- script: |
set -e
python -m venv venv
. venv/bin/activate
pip install -U pip setuptools
pip install -r requirements_all.txt -c homeassistant/package_constraints.txt
pip install -r requirements_test.txt -c homeassistant/package_constraints.txt
displayName: 'Create Virtual Environment & Install Requirements'
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
. venv/bin/activate
pip install -U pip setuptools
pip install -r requirements_all.txt -c homeassistant/package_constraints.txt
pip install -r requirements_test.txt -c homeassistant/package_constraints.txt
displayName: 'Create Virtual Environment & Install Requirements'
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
displayName: 'Save artifacts based on Requirements'
inputs:
keyfile: 'requirements_all.txt, requirements_test.txt, .cache, homeassistant/package_constraints.txt'
targetfolder: './venv'
vstsFeed: '$(ArtifactFeed)'
- script: |
. venv/bin/activate
pip install -e .
displayName: 'Install Home Assistant for python $(PythonMain)'
- script: |
. venv/bin/activate
pylint homeassistant
displayName: 'Run pylint'
- job: 'Mypy'
pool:
vmImage: 'ubuntu-latest'
container: $[ variables['PythonMain'] ]
steps:
- script: |
python -m venv venv
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
displayName: 'Save artifacts based on Requirements'
inputs:
keyfile: 'requirements_all.txt, requirements_test.txt, .cache'
targetfolder: './venv'
vstsFeed: '$(ArtifactFeed)'
- script: |
. venv/bin/activate
pip install -e .
displayName: 'Install Home Assistant for python $(python.version)'
- script: |
. venv/bin/activate
pylint homeassistant
displayName: 'Run pylint'
. venv/bin/activate
pip install -r requirements_test.txt -c homeassistant/package_constraints.txt
displayName: 'Setup Env'
- script: |
TYPING_FILES=$(cat mypyrc)
echo -e "Run mypy on: \n$TYPING_FILES"
. venv/bin/activate
mypy $TYPING_FILES
displayName: 'Run mypy'
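The RestoreCache/SaveCache steps above key the cached `./venv` on `requirements_test_all.txt`, the `.cache` file holding the interpreter version, and `homeassistant/package_constraints.txt`. The actual hashing happens inside the 1ES pipeline task; purely as a conceptual sketch of the idea, a hypothetical `cache_key` helper could look like this:

```python
# Conceptual sketch only: derive a stable key from the requirement files plus
# the ".cache" file that records the interpreter version, so the cached venv is
# rebuilt only when either changes. Not the real Azure task implementation.
import hashlib
from pathlib import Path
from typing import Iterable


def cache_key(keyfiles: Iterable[str]) -> str:
    """Hash the names and contents of every key file into one hex digest."""
    digest = hashlib.sha256()
    for name in keyfiles:
        path = Path(name)
        digest.update(name.encode())        # the file name is part of the key
        if path.is_file():
            digest.update(path.read_bytes())  # so is its content, when present
    return digest.hexdigest()


if __name__ == "__main__":
    key = cache_key(
        ["requirements_test_all.txt", ".cache", "homeassistant/package_constraints.txt"]
    )
    print(f"venv cache key: {key}")
```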


@@ -1,168 +1,158 @@
# https://dev.azure.com/home-assistant
trigger:
batch: true
tags:
include:
- '*'
pr: none
variables:
- name: versionBuilder
value: '4.2'
value: '5.2'
- group: docker
- group: github
- group: twine
jobs:
stages:
- stage: 'Validate'
jobs:
- job: 'VersionValidate'
pool:
vmImage: 'ubuntu-latest'
steps:
- task: UsePythonVersion@0
displayName: 'Use Python 3.7'
inputs:
versionSpec: '3.7'
- script: |
setup_version="$(python setup.py -V)"
branch_version="$(Build.SourceBranchName)"
- job: 'VersionValidate'
condition: startsWith(variables['Build.SourceBranch'], 'refs/tags')
pool:
vmImage: 'ubuntu-latest'
steps:
- task: UsePythonVersion@0
displayName: 'Use Python 3.7'
inputs:
versionSpec: '3.7'
- script: |
setup_version="$(python setup.py -V)"
branch_version="$(Build.SourceBranchName)"
if [ "${setup_version}" != "${branch_version}" ]; then
echo "Version of tag ${branch_version} don't match with ${setup_version}!"
exit 1
fi
displayName: 'Check version of branch/tag'
- script: |
sudo apt-get install -y --no-install-recommends \
jq curl
if [ "${setup_version}" != "${branch_version}" ]; then
echo "Version of tag ${branch_version} don't match with ${setup_version}!"
release="$(Build.SourceBranchName)"
created_by="$(curl -s https://api.github.com/repos/home-assistant/home-assistant/releases/tags/${release} | jq --raw-output '.author.login')"
if [[ "${created_by}" =~ ^(balloob|pvizeli|fabaff|robbiet480)$ ]]; then
exit 0
fi
echo "${created_by} is not allowed to create an release!"
exit 1
fi
displayName: 'Check version of branch/tag'
- script: |
sudo apt-get install -y --no-install-recommends \
jq curl
displayName: 'Check rights'
release="$(Build.SourceBranchName)"
created_by="$(curl -s https://api.github.com/repos/home-assistant/home-assistant/releases/tags/${release} | jq --raw-output '.author.login')"
- stage: 'Build'
jobs:
- job: 'ReleasePython'
pool:
vmImage: 'ubuntu-latest'
steps:
- task: UsePythonVersion@0
displayName: 'Use Python 3.7'
inputs:
versionSpec: '3.7'
- script: pip install twine wheel
displayName: 'Install tools'
- script: python setup.py sdist bdist_wheel
displayName: 'Build package'
- script: |
export TWINE_USERNAME="$(twineUser)"
export TWINE_PASSWORD="$(twinePassword)"
twine upload dist/* --skip-existing
displayName: 'Upload pypi'
- job: 'ReleaseDocker'
timeoutInMinutes: 240
pool:
vmImage: 'ubuntu-latest'
strategy:
maxParallel: 5
matrix:
amd64:
buildArch: 'amd64'
buildMachine: 'qemux86-64,intel-nuc'
i386:
buildArch: 'i386'
buildMachine: 'qemux86'
armhf:
buildArch: 'armhf'
buildMachine: 'qemuarm,raspberrypi'
armv7:
buildArch: 'armv7'
buildMachine: 'raspberrypi2,raspberrypi3,raspberrypi4,odroid-xu,tinker'
aarch64:
buildArch: 'aarch64'
buildMachine: 'qemuarm-64,raspberrypi3-64,raspberrypi4-64,odroid-c2,orangepi-prime'
steps:
- script: sudo docker login -u $(dockerUser) -p $(dockerPassword)
displayName: 'Docker hub login'
- script: sudo docker pull homeassistant/amd64-builder:$(versionBuilder)
displayName: 'Install Builder'
- script: |
set -e
if [[ "${created_by}" =~ ^(balloob|pvizeli|fabaff|robbiet480)$ ]]; then
exit 0
fi
sudo docker run --rm --privileged \
-v ~/.docker:/root/.docker \
-v /run/docker.sock:/run/docker.sock:rw \
homeassistant/amd64-builder:$(versionBuilder) \
--homeassistant $(Build.SourceBranchName) "--$(buildArch)" \
-r https://github.com/home-assistant/hassio-homeassistant \
-t generic --docker-hub homeassistant
echo "${created_by} is not allowed to create an release!"
exit 1
displayName: 'Check rights'
sudo docker run --rm --privileged \
-v ~/.docker:/root/.docker \
-v /run/docker.sock:/run/docker.sock:rw \
homeassistant/amd64-builder:$(versionBuilder) \
--homeassistant-machine "$(Build.SourceBranchName)=$(buildMachine)" \
-r https://github.com/home-assistant/hassio-homeassistant \
-t machine --docker-hub homeassistant
displayName: 'Build Release'
- stage: 'Publish'
jobs:
- job: 'ReleaseHassio'
pool:
vmImage: 'ubuntu-latest'
steps:
- script: |
sudo apt-get install -y --no-install-recommends \
git jq curl
- job: 'ReleasePython'
condition: and(startsWith(variables['Build.SourceBranch'], 'refs/tags'), succeeded('VersionValidate'))
dependsOn:
- 'VersionValidate'
pool:
vmImage: 'ubuntu-latest'
steps:
- task: UsePythonVersion@0
displayName: 'Use Python 3.7'
inputs:
versionSpec: '3.7'
- script: pip install twine wheel
displayName: 'Install tools'
- script: python setup.py sdist bdist_wheel
displayName: 'Build package'
- script: |
export TWINE_USERNAME="$(twineUser)"
export TWINE_PASSWORD="$(twinePassword)"
twine upload dist/* --skip-existing
displayName: 'Upload pypi'
git config --global user.name "Pascal Vizeli"
git config --global user.email "pvizeli@syshack.ch"
git config --global credential.helper store
echo "https://$(githubToken):x-oauth-basic@github.com" > $HOME/.git-credentials
displayName: 'Install requirements'
- script: |
set -e
- job: 'ReleaseDocker'
condition: and(startsWith(variables['Build.SourceBranch'], 'refs/tags'), succeeded('VersionValidate'))
dependsOn:
- 'VersionValidate'
timeoutInMinutes: 240
pool:
vmImage: 'ubuntu-latest'
strategy:
maxParallel: 5
matrix:
amd64:
buildArch: 'amd64'
buildMachine: 'qemux86-64,intel-nuc'
i386:
buildArch: 'i386'
buildMachine: 'qemux86'
armhf:
buildArch: 'armhf'
buildMachine: 'qemuarm,raspberrypi'
armv7:
buildArch: 'armv7'
buildMachine: 'raspberrypi2,raspberrypi3,odroid-xu,tinker'
aarch64:
buildArch: 'aarch64'
buildMachine: 'qemuarm-64,raspberrypi3-64,odroid-c2,orangepi-prime'
steps:
- script: sudo docker login -u $(dockerUser) -p $(dockerPassword)
displayName: 'Docker hub login'
- script: sudo docker pull homeassistant/amd64-builder:$(versionBuilder)
displayName: 'Install Builder'
- script: |
set -e
version="$(Build.SourceBranchName)"
sudo docker run --rm --privileged \
-v ~/.docker:/root/.docker \
-v /run/docker.sock:/run/docker.sock:rw \
homeassistant/amd64-builder:$(versionBuilder) \
--homeassistant $(Build.SourceBranchName) "--$(buildArch)" \
-r https://github.com/home-assistant/hassio-homeassistant \
-t generic --docker-hub homeassistant
git clone https://github.com/home-assistant/hassio-version
cd hassio-version
sudo docker run --rm --privileged \
-v ~/.docker:/root/.docker \
-v /run/docker.sock:/run/docker.sock:rw \
homeassistant/amd64-builder:$(versionBuilder) \
--homeassistant-machine "$(Build.SourceBranchName)=$(buildMachine)" \
-r https://github.com/home-assistant/hassio-homeassistant \
-t machine --docker-hub homeassistant
displayName: 'Build Release'
dev_version="$(jq --raw-output '.homeassistant.default' dev.json)"
beta_version="$(jq --raw-output '.homeassistant.default' beta.json)"
stable_version="$(jq --raw-output '.homeassistant.default' stable.json)"
if [[ "$version" =~ b ]]; then
sed -i "s|$dev_version|$version|g" dev.json
sed -i "s|$beta_version|$version|g" beta.json
else
sed -i "s|$dev_version|$version|g" dev.json
sed -i "s|$beta_version|$version|g" beta.json
sed -i "s|$stable_version|$version|g" stable.json
fi
- job: 'ReleaseHassio'
condition: and(startsWith(variables['Build.SourceBranch'], 'refs/tags'), succeeded('ReleaseDocker'))
dependsOn:
- 'ReleaseDocker'
pool:
vmImage: 'ubuntu-latest'
steps:
- script: |
sudo apt-get install -y --no-install-recommends \
git jq curl
git config --global user.name "Pascal Vizeli"
git config --global user.email "pvizeli@syshack.ch"
git config --global credential.helper store
echo "https://$(githubToken):x-oauth-basic@github.com" > $HOME/.git-credentials
displayName: 'Install requirements'
- script: |
set -e
version="$(Build.SourceBranchName)"
git clone https://github.com/home-assistant/hassio-version
cd hassio-version
dev_version="$(jq --raw-output '.homeassistant.default' dev.json)"
beta_version="$(jq --raw-output '.homeassistant.default' beta.json)"
stable_version="$(jq --raw-output '.homeassistant.default' stable.json)"
if [[ "$version" =~ b ]]; then
sed -i "s|$dev_version|$version|g" dev.json
sed -i "s|$beta_version|$version|g" beta.json
else
sed -i "s|$dev_version|$version|g" dev.json
sed -i "s|$beta_version|$version|g" beta.json
sed -i "s|$stable_version|$version|g" stable.json
fi
git commit -am "Bump Home Assistant $version"
git push
displayName: 'Update version files'
git commit -am "Bump Home Assistant $version"
git push
displayName: 'Update version files'
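The 'Update version files' step above uses jq and sed to bump the channel files in hassio-version: a tag containing "b" is treated as a beta and only touches dev.json and beta.json, while a final release also updates stable.json. A minimal Python rendition of that decision (rewriting the JSON rather than sed-ing the string in place, and assuming the `.homeassistant.default` key seen in the jq expressions) might look like:

```python
# Sketch of the version-bump logic from the 'Update version files' step above.
# Assumes the channel files expose the version under ["homeassistant"]["default"],
# mirroring the jq path '.homeassistant.default'.
import json
from pathlib import Path


def bump_version_files(version: str, repo_dir: str = "hassio-version") -> None:
    """Write the new Home Assistant version into the channel files."""
    channels = ["dev.json", "beta.json"]
    if "b" not in version:          # e.g. "0.97.0" (final) vs "0.97.0b0" (beta)
        channels.append("stable.json")
    for name in channels:
        path = Path(repo_dir) / name
        data = json.loads(path.read_text())
        data["homeassistant"]["default"] = version
        path.write_text(json.dumps(data, indent=2) + "\n")


# Example: bump_version_files("0.97.0b0") updates dev.json and beta.json only.
```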


@@ -11,19 +11,18 @@ trigger:
pr: none
variables:
- name: versionWheels
value: '0.7'
value: '1.0-3.7-alpine3.10'
- group: wheels
jobs:
- job: 'Wheels'
condition: or(eq(variables['Build.SourceBranchName'], 'dev'), eq(variables['Build.SourceBranchName'], 'master'))
timeoutInMinutes: 360
pool:
vmImage: 'ubuntu-latest'
strategy:
maxParallel: 3
maxParallel: 5
matrix:
amd64:
buildArch: 'amd64'


@@ -7,9 +7,7 @@ import platform
import subprocess
import sys
import threading
from typing import ( # noqa pylint: disable=unused-import
List, Dict, Any, TYPE_CHECKING
)
from typing import List, Dict, Any, TYPE_CHECKING # noqa pylint: disable=unused-import
from homeassistant import monkey_patch
from homeassistant.const import (
@@ -30,11 +28,12 @@ def set_loop() -> None:
policy = None
if sys.platform == 'win32':
if hasattr(asyncio, 'WindowsProactorEventLoopPolicy'):
if sys.platform == "win32":
if hasattr(asyncio, "WindowsProactorEventLoopPolicy"):
# pylint: disable=no-member
policy = asyncio.WindowsProactorEventLoopPolicy()
else:
class ProactorPolicy(BaseDefaultEventLoopPolicy):
"""Event loop policy to create proactor loops."""
@@ -56,28 +55,40 @@ def set_loop() -> None:
def validate_python() -> None:
"""Validate that the right Python version is running."""
if sys.version_info[:3] < REQUIRED_PYTHON_VER:
print("Home Assistant requires at least Python {}.{}.{}".format(
*REQUIRED_PYTHON_VER))
print(
"Home Assistant requires at least Python {}.{}.{}".format(
*REQUIRED_PYTHON_VER
)
)
sys.exit(1)
def ensure_config_path(config_dir: str) -> None:
"""Validate the configuration directory."""
import homeassistant.config as config_util
lib_dir = os.path.join(config_dir, 'deps')
lib_dir = os.path.join(config_dir, "deps")
# Test if configuration directory exists
if not os.path.isdir(config_dir):
if config_dir != config_util.get_default_config_dir():
print(('Fatal Error: Specified configuration directory does '
'not exist {} ').format(config_dir))
print(
(
"Fatal Error: Specified configuration directory does "
"not exist {} "
).format(config_dir)
)
sys.exit(1)
try:
os.mkdir(config_dir)
except OSError:
print(('Fatal Error: Unable to create default configuration '
'directory {} ').format(config_dir))
print(
(
"Fatal Error: Unable to create default configuration "
"directory {} "
).format(config_dir)
)
sys.exit(1)
# Test if library directory exists
@@ -85,20 +96,22 @@ def ensure_config_path(config_dir: str) -> None:
try:
os.mkdir(lib_dir)
except OSError:
print(('Fatal Error: Unable to create library '
'directory {} ').format(lib_dir))
print(
("Fatal Error: Unable to create library " "directory {} ").format(
lib_dir
)
)
sys.exit(1)
async def ensure_config_file(hass: 'core.HomeAssistant', config_dir: str) \
-> str:
async def ensure_config_file(hass: "core.HomeAssistant", config_dir: str) -> str:
"""Ensure configuration file exists."""
import homeassistant.config as config_util
config_path = await config_util.async_ensure_config_exists(
hass, config_dir)
config_path = await config_util.async_ensure_config_exists(hass, config_dir)
if config_path is None:
print('Error getting configuration path')
print("Error getting configuration path")
sys.exit(1)
return config_path
@@ -107,71 +120,72 @@ async def ensure_config_file(hass: 'core.HomeAssistant', config_dir: str) \
def get_arguments() -> argparse.Namespace:
"""Get parsed passed in arguments."""
import homeassistant.config as config_util
parser = argparse.ArgumentParser(
description="Home Assistant: Observe, Control, Automate.")
parser.add_argument('--version', action='version', version=__version__)
description="Home Assistant: Observe, Control, Automate."
)
parser.add_argument("--version", action="version", version=__version__)
parser.add_argument(
'-c', '--config',
metavar='path_to_config_dir',
"-c",
"--config",
metavar="path_to_config_dir",
default=config_util.get_default_config_dir(),
help="Directory that contains the Home Assistant configuration")
help="Directory that contains the Home Assistant configuration",
)
parser.add_argument(
'--demo-mode',
action='store_true',
help='Start Home Assistant in demo mode')
"--demo-mode", action="store_true", help="Start Home Assistant in demo mode"
)
parser.add_argument(
'--debug',
action='store_true',
help='Start Home Assistant in debug mode')
"--debug", action="store_true", help="Start Home Assistant in debug mode"
)
parser.add_argument(
'--open-ui',
action='store_true',
help='Open the webinterface in a browser')
"--open-ui", action="store_true", help="Open the webinterface in a browser"
)
parser.add_argument(
'--skip-pip',
action='store_true',
help='Skips pip install of required packages on startup')
"--skip-pip",
action="store_true",
help="Skips pip install of required packages on startup",
)
parser.add_argument(
'-v', '--verbose',
action='store_true',
help="Enable verbose logging to file.")
"-v", "--verbose", action="store_true", help="Enable verbose logging to file."
)
parser.add_argument(
'--pid-file',
metavar='path_to_pid_file',
"--pid-file",
metavar="path_to_pid_file",
default=None,
help='Path to PID file useful for running as daemon')
help="Path to PID file useful for running as daemon",
)
parser.add_argument(
'--log-rotate-days',
"--log-rotate-days",
type=int,
default=None,
help='Enables daily log rotation and keeps up to the specified days')
help="Enables daily log rotation and keeps up to the specified days",
)
parser.add_argument(
'--log-file',
"--log-file",
type=str,
default=None,
help='Log file to write to. If not set, CONFIG/home-assistant.log '
'is used')
help="Log file to write to. If not set, CONFIG/home-assistant.log " "is used",
)
parser.add_argument(
'--log-no-color',
action='store_true',
help="Disable color logs")
"--log-no-color", action="store_true", help="Disable color logs"
)
parser.add_argument(
'--runner',
action='store_true',
help='On restart exit with code {}'.format(RESTART_EXIT_CODE))
"--runner",
action="store_true",
help="On restart exit with code {}".format(RESTART_EXIT_CODE),
)
parser.add_argument(
'--script',
nargs=argparse.REMAINDER,
help='Run one of the embedded scripts')
"--script", nargs=argparse.REMAINDER, help="Run one of the embedded scripts"
)
if os.name == "posix":
parser.add_argument(
'--daemon',
action='store_true',
help='Run Home Assistant as daemon')
"--daemon", action="store_true", help="Run Home Assistant as daemon"
)
arguments = parser.parse_args()
if os.name != "posix" or arguments.debug or arguments.runner:
setattr(arguments, 'daemon', False)
setattr(arguments, "daemon", False)
return arguments
@@ -192,8 +206,8 @@ def daemonize() -> None:
sys.exit(0)
# redirect standard file descriptors to devnull
infd = open(os.devnull, 'r')
outfd = open(os.devnull, 'a+')
infd = open(os.devnull, "r")
outfd = open(os.devnull, "a+")
sys.stdout.flush()
sys.stderr.flush()
os.dup2(infd.fileno(), sys.stdin.fileno())
@@ -205,7 +219,7 @@ def check_pid(pid_file: str) -> None:
"""Check that Home Assistant is not already running."""
# Check pid file
try:
with open(pid_file, 'r') as file:
with open(pid_file, "r") as file:
pid = int(file.readline())
except IOError:
# PID File does not exist
@@ -220,7 +234,7 @@ def check_pid(pid_file: str) -> None:
except OSError:
# PID does not exist
return
print('Fatal Error: HomeAssistant is already running.')
print("Fatal Error: HomeAssistant is already running.")
sys.exit(1)
@@ -228,10 +242,10 @@ def write_pid(pid_file: str) -> None:
"""Create a PID File."""
pid = os.getpid()
try:
with open(pid_file, 'w') as file:
with open(pid_file, "w") as file:
file.write(str(pid))
except IOError:
print('Fatal Error: Unable to write pid file {}'.format(pid_file))
print("Fatal Error: Unable to write pid file {}".format(pid_file))
sys.exit(1)
@@ -255,17 +269,15 @@ def closefds_osx(min_fd: int, max_fd: int) -> None:
def cmdline() -> List[str]:
"""Collect path and arguments to re-execute the current hass instance."""
if os.path.basename(sys.argv[0]) == '__main__.py':
if os.path.basename(sys.argv[0]) == "__main__.py":
modulepath = os.path.dirname(sys.argv[0])
os.environ['PYTHONPATH'] = os.path.dirname(modulepath)
return [sys.executable] + [arg for arg in sys.argv if
arg != '--daemon']
os.environ["PYTHONPATH"] = os.path.dirname(modulepath)
return [sys.executable] + [arg for arg in sys.argv if arg != "--daemon"]
return [arg for arg in sys.argv if arg != '--daemon']
return [arg for arg in sys.argv if arg != "--daemon"]
async def setup_and_run_hass(config_dir: str,
args: argparse.Namespace) -> int:
async def setup_and_run_hass(config_dir: str, args: argparse.Namespace) -> int:
"""Set up HASS and run."""
# pylint: disable=redefined-outer-name
from homeassistant import bootstrap, core
@@ -273,21 +285,29 @@ async def setup_and_run_hass(config_dir: str,
hass = core.HomeAssistant()
if args.demo_mode:
config = {
'frontend': {},
'demo': {}
} # type: Dict[str, Any]
config = {"frontend": {}, "demo": {}} # type: Dict[str, Any]
bootstrap.async_from_config_dict(
config, hass, config_dir=config_dir, verbose=args.verbose,
skip_pip=args.skip_pip, log_rotate_days=args.log_rotate_days,
log_file=args.log_file, log_no_color=args.log_no_color)
config,
hass,
config_dir=config_dir,
verbose=args.verbose,
skip_pip=args.skip_pip,
log_rotate_days=args.log_rotate_days,
log_file=args.log_file,
log_no_color=args.log_no_color,
)
else:
config_file = await ensure_config_file(hass, config_dir)
print('Config directory:', config_dir)
print("Config directory:", config_dir)
await bootstrap.async_from_config_file(
config_file, hass, verbose=args.verbose, skip_pip=args.skip_pip,
log_rotate_days=args.log_rotate_days, log_file=args.log_file,
log_no_color=args.log_no_color)
config_file,
hass,
verbose=args.verbose,
skip_pip=args.skip_pip,
log_rotate_days=args.log_rotate_days,
log_file=args.log_file,
log_no_color=args.log_no_color,
)
if args.open_ui:
# Imported here to avoid importing asyncio before monkey patch
@@ -297,12 +317,14 @@ async def setup_and_run_hass(config_dir: str,
"""Open the web interface in a browser."""
if hass.config.api is not None:
import webbrowser
webbrowser.open(hass.config.api.base_url)
run_callback_threadsafe(
hass.loop,
hass.bus.async_listen_once,
EVENT_HOMEASSISTANT_START, open_browser
EVENT_HOMEASSISTANT_START,
open_browser,
)
return await hass.async_run()
@@ -312,17 +334,17 @@ def try_to_restart() -> None:
"""Attempt to clean up state and start a new Home Assistant instance."""
# Things should be mostly shut down already at this point, now just try
# to clean up things that may have been left behind.
sys.stderr.write('Home Assistant attempting to restart.\n')
sys.stderr.write("Home Assistant attempting to restart.\n")
# Count remaining threads, ideally there should only be one non-daemonized
# thread left (which is us). We don't really do anything with it, but it might be
# useful when debugging shutdown/restart issues.
try:
nthreads = sum(thread.is_alive() and not thread.daemon
for thread in threading.enumerate())
nthreads = sum(
thread.is_alive() and not thread.daemon for thread in threading.enumerate()
)
if nthreads > 1:
sys.stderr.write(
"Found {} non-daemonic threads.\n".format(nthreads))
sys.stderr.write("Found {} non-daemonic threads.\n".format(nthreads))
# Somehow we sometimes seem to trigger an assertion in the python threading
# module. It seems we find threads that have no associated OS level thread
@@ -336,7 +358,7 @@ def try_to_restart() -> None:
except ValueError:
max_fd = 256
if platform.system() == 'Darwin':
if platform.system() == "Darwin":
closefds_osx(3, max_fd)
else:
os.closerange(3, max_fd)
@@ -355,16 +377,15 @@ def main() -> int:
validate_python()
monkey_patch_needed = sys.version_info[:3] < (3, 6, 3)
if monkey_patch_needed and os.environ.get('HASS_NO_MONKEY') != '1':
if sys.version_info[:2] >= (3, 6):
monkey_patch.disable_c_asyncio()
if monkey_patch_needed and os.environ.get("HASS_NO_MONKEY") != "1":
monkey_patch.disable_c_asyncio()
monkey_patch.patch_weakref_tasks()
set_loop()
# Run a simple daemon runner process on Windows to handle restarts
if os.name == 'nt' and '--runner' not in sys.argv:
nt_args = cmdline() + ['--runner']
if os.name == "nt" and "--runner" not in sys.argv:
nt_args = cmdline() + ["--runner"]
while True:
try:
subprocess.check_call(nt_args)
@@ -379,6 +400,7 @@ def main() -> int:
if args.script is not None:
from homeassistant import scripts
return scripts.run(args.script)
config_dir = os.path.join(os.getcwd(), args.config)
@@ -393,6 +415,7 @@ def main() -> int:
write_pid(args.pid_file)
from homeassistant.util.async_ import asyncio_run
exit_code = asyncio_run(setup_and_run_hass(config_dir, args))
if exit_code == RESTART_EXIT_CODE and not args.runner:
try_to_restart()
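The `check_pid`/`write_pid` changes above are pure Black reformatting; the underlying PID-file logic is unchanged. As a standalone sketch of that logic (simplified, with a hypothetical `already_running` helper), it amounts to:

```python
# Standalone sketch of the PID-file handling shown in check_pid()/write_pid():
# signal 0 is a liveness probe -- os.kill(pid, 0) raises OSError when no process
# with that PID exists, so an unreadable or stale PID file is simply ignored.
import os
import sys


def already_running(pid_file: str) -> bool:
    """Return True if the PID stored in pid_file belongs to a live process."""
    try:
        with open(pid_file, "r") as file:
            pid = int(file.readline())
    except (IOError, ValueError):
        return False          # no PID file, or garbage inside it
    if pid == os.getpid():
        return False          # the file points at the current process
    try:
        os.kill(pid, 0)       # sends no signal, only checks existence
    except OSError:
        return False          # stale PID
    return True


if __name__ == "__main__":
    if already_running("/tmp/example-hass.pid"):
        print("Fatal Error: HomeAssistant is already running.")
        sys.exit(1)
```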


@@ -17,8 +17,8 @@ from .const import GROUP_ID_ADMIN
from .mfa_modules import auth_mfa_module_from_config, MultiFactorAuthModule
from .providers import auth_provider_from_config, AuthProvider, LoginFlow
EVENT_USER_ADDED = 'user_added'
EVENT_USER_REMOVED = 'user_removed'
EVENT_USER_ADDED = "user_added"
EVENT_USER_REMOVED = "user_removed"
_LOGGER = logging.getLogger(__name__)
_MfaModuleDict = Dict[str, MultiFactorAuthModule]
@@ -27,9 +27,10 @@ _ProviderDict = Dict[_ProviderKey, AuthProvider]
async def auth_manager_from_config(
hass: HomeAssistant,
provider_configs: List[Dict[str, Any]],
module_configs: List[Dict[str, Any]]) -> 'AuthManager':
hass: HomeAssistant,
provider_configs: List[Dict[str, Any]],
module_configs: List[Dict[str, Any]],
) -> "AuthManager":
"""Initialize an auth manager from config.
CORE_CONFIG_SCHEMA will make sure do duplicated auth providers or
@@ -38,8 +39,11 @@ async def auth_manager_from_config(
store = auth_store.AuthStore(hass)
if provider_configs:
providers = await asyncio.gather(
*[auth_provider_from_config(hass, store, config)
for config in provider_configs])
*(
auth_provider_from_config(hass, store, config)
for config in provider_configs
)
)
else:
providers = ()
# So returned auth providers are in same order as config
@@ -50,8 +54,8 @@ async def auth_manager_from_config(
if module_configs:
modules = await asyncio.gather(
*[auth_mfa_module_from_config(hass, config)
for config in module_configs])
*(auth_mfa_module_from_config(hass, config) for config in module_configs)
)
else:
modules = ()
# So returned auth modules are in same order as config
@@ -66,17 +70,21 @@ async def auth_manager_from_config(
class AuthManager:
"""Manage the authentication for Home Assistant."""
def __init__(self, hass: HomeAssistant, store: auth_store.AuthStore,
providers: _ProviderDict, mfa_modules: _MfaModuleDict) \
-> None:
def __init__(
self,
hass: HomeAssistant,
store: auth_store.AuthStore,
providers: _ProviderDict,
mfa_modules: _MfaModuleDict,
) -> None:
"""Initialize the auth manager."""
self.hass = hass
self._store = store
self._providers = providers
self._mfa_modules = mfa_modules
self.login_flow = data_entry_flow.FlowManager(
hass, self._async_create_login_flow,
self._async_finish_login_flow)
hass, self._async_create_login_flow, self._async_finish_login_flow
)
@property
def support_legacy(self) -> bool:
@@ -86,7 +94,7 @@ class AuthManager:
Should be removed when we removed legacy_api_password auth providers.
"""
for provider_type, _ in self._providers:
if provider_type == 'legacy_api_password':
if provider_type == "legacy_api_password":
return True
return False
@@ -100,20 +108,21 @@ class AuthManager:
"""Return a list of available auth modules."""
return list(self._mfa_modules.values())
def get_auth_provider(self, provider_type: str, provider_id: str) \
-> Optional[AuthProvider]:
def get_auth_provider(
self, provider_type: str, provider_id: str
) -> Optional[AuthProvider]:
"""Return an auth provider, None if not found."""
return self._providers.get((provider_type, provider_id))
def get_auth_providers(self, provider_type: str) \
-> List[AuthProvider]:
def get_auth_providers(self, provider_type: str) -> List[AuthProvider]:
"""Return a List of auth provider of one type, Empty if not found."""
return [provider
for (p_type, _), provider in self._providers.items()
if p_type == provider_type]
return [
provider
for (p_type, _), provider in self._providers.items()
if p_type == provider_type
]
def get_auth_mfa_module(self, module_id: str) \
-> Optional[MultiFactorAuthModule]:
def get_auth_mfa_module(self, module_id: str) -> Optional[MultiFactorAuthModule]:
"""Return a multi-factor auth module, None if not found."""
return self._mfa_modules.get(module_id)
@@ -135,7 +144,8 @@ class AuthManager:
return await self._store.async_get_group(group_id)
async def async_get_user_by_credentials(
self, credentials: models.Credentials) -> Optional[models.User]:
self, credentials: models.Credentials
) -> Optional[models.User]:
"""Get a user by credential, return None if not found."""
for user in await self.async_get_users():
for creds in user.credentials:
@@ -145,57 +155,50 @@ class AuthManager:
return None
async def async_create_system_user(
self, name: str,
group_ids: Optional[List[str]] = None) -> models.User:
self, name: str, group_ids: Optional[List[str]] = None
) -> models.User:
"""Create a system user."""
user = await self._store.async_create_user(
name=name,
system_generated=True,
is_active=True,
group_ids=group_ids or [],
name=name, system_generated=True, is_active=True, group_ids=group_ids or []
)
self.hass.bus.async_fire(EVENT_USER_ADDED, {
'user_id': user.id
})
self.hass.bus.async_fire(EVENT_USER_ADDED, {"user_id": user.id})
return user
async def async_create_user(self, name: str) -> models.User:
"""Create a user."""
kwargs = {
'name': name,
'is_active': True,
'group_ids': [GROUP_ID_ADMIN]
"name": name,
"is_active": True,
"group_ids": [GROUP_ID_ADMIN],
} # type: Dict[str, Any]
if await self._user_should_be_owner():
kwargs['is_owner'] = True
kwargs["is_owner"] = True
user = await self._store.async_create_user(**kwargs)
self.hass.bus.async_fire(EVENT_USER_ADDED, {
'user_id': user.id
})
self.hass.bus.async_fire(EVENT_USER_ADDED, {"user_id": user.id})
return user
async def async_get_or_create_user(self, credentials: models.Credentials) \
-> models.User:
async def async_get_or_create_user(
self, credentials: models.Credentials
) -> models.User:
"""Get or create a user."""
if not credentials.is_new:
user = await self.async_get_user_by_credentials(credentials)
if user is None:
raise ValueError('Unable to find the user.')
raise ValueError("Unable to find the user.")
return user
auth_provider = self._async_get_auth_provider(credentials)
if auth_provider is None:
raise RuntimeError('Credential with unknown provider encountered')
raise RuntimeError("Credential with unknown provider encountered")
info = await auth_provider.async_user_meta_for_credentials(
credentials)
info = await auth_provider.async_user_meta_for_credentials(credentials)
user = await self._store.async_create_user(
credentials=credentials,
@@ -204,14 +207,13 @@ class AuthManager:
group_ids=[GROUP_ID_ADMIN],
)
self.hass.bus.async_fire(EVENT_USER_ADDED, {
'user_id': user.id
})
self.hass.bus.async_fire(EVENT_USER_ADDED, {"user_id": user.id})
return user
async def async_link_user(self, user: models.User,
credentials: models.Credentials) -> None:
async def async_link_user(
self, user: models.User, credentials: models.Credentials
) -> None:
"""Link credentials to an existing user."""
await self._store.async_link_user(user, credentials)
@@ -227,19 +229,20 @@ class AuthManager:
await self._store.async_remove_user(user)
self.hass.bus.async_fire(EVENT_USER_REMOVED, {
'user_id': user.id
})
self.hass.bus.async_fire(EVENT_USER_REMOVED, {"user_id": user.id})
async def async_update_user(self, user: models.User,
name: Optional[str] = None,
group_ids: Optional[List[str]] = None) -> None:
async def async_update_user(
self,
user: models.User,
name: Optional[str] = None,
group_ids: Optional[List[str]] = None,
) -> None:
"""Update a user."""
kwargs = {} # type: Dict[str,Any]
if name is not None:
kwargs['name'] = name
kwargs["name"] = name
if group_ids is not None:
kwargs['group_ids'] = group_ids
kwargs["group_ids"] = group_ids
await self._store.async_update_user(user, **kwargs)
async def async_activate_user(self, user: models.User) -> None:
@@ -249,47 +252,52 @@ class AuthManager:
async def async_deactivate_user(self, user: models.User) -> None:
"""Deactivate a user."""
if user.is_owner:
raise ValueError('Unable to deactivate the owner')
raise ValueError("Unable to deactivate the owner")
await self._store.async_deactivate_user(user)
async def async_remove_credentials(
self, credentials: models.Credentials) -> None:
async def async_remove_credentials(self, credentials: models.Credentials) -> None:
"""Remove credentials."""
provider = self._async_get_auth_provider(credentials)
if (provider is not None and
hasattr(provider, 'async_will_remove_credentials')):
if provider is not None and hasattr(provider, "async_will_remove_credentials"):
# https://github.com/python/mypy/issues/1424
await provider.async_will_remove_credentials( # type: ignore
credentials)
credentials
)
await self._store.async_remove_credentials(credentials)
async def async_enable_user_mfa(self, user: models.User,
mfa_module_id: str, data: Any) -> None:
async def async_enable_user_mfa(
self, user: models.User, mfa_module_id: str, data: Any
) -> None:
"""Enable a multi-factor auth module for user."""
if user.system_generated:
raise ValueError('System generated users cannot enable '
'multi-factor auth module.')
raise ValueError(
"System generated users cannot enable " "multi-factor auth module."
)
module = self.get_auth_mfa_module(mfa_module_id)
if module is None:
raise ValueError('Unable to find multi-factor auth module: {}'
.format(mfa_module_id))
raise ValueError(
"Unable find multi-factor auth module: {}".format(mfa_module_id)
)
await module.async_setup_user(user.id, data)
async def async_disable_user_mfa(self, user: models.User,
mfa_module_id: str) -> None:
async def async_disable_user_mfa(
self, user: models.User, mfa_module_id: str
) -> None:
"""Disable a multi-factor auth module for user."""
if user.system_generated:
raise ValueError('System generated users cannot disable '
'multi-factor auth module.')
raise ValueError(
"System generated users cannot disable " "multi-factor auth module."
)
module = self.get_auth_mfa_module(mfa_module_id)
if module is None:
raise ValueError('Unable to find multi-factor auth module: {}'
.format(mfa_module_id))
raise ValueError(
"Unable find multi-factor auth module: {}".format(mfa_module_id)
)
await module.async_depose_user(user.id)
@@ -302,20 +310,23 @@ class AuthManager:
return modules
async def async_create_refresh_token(
self, user: models.User, client_id: Optional[str] = None,
client_name: Optional[str] = None,
client_icon: Optional[str] = None,
token_type: Optional[str] = None,
access_token_expiration: timedelta = ACCESS_TOKEN_EXPIRATION) \
-> models.RefreshToken:
self,
user: models.User,
client_id: Optional[str] = None,
client_name: Optional[str] = None,
client_icon: Optional[str] = None,
token_type: Optional[str] = None,
access_token_expiration: timedelta = ACCESS_TOKEN_EXPIRATION,
) -> models.RefreshToken:
"""Create a new refresh token for a user."""
if not user.is_active:
raise ValueError('User is not active')
raise ValueError("User is not active")
if user.system_generated and client_id is not None:
raise ValueError(
'System generated users cannot have refresh tokens connected '
'to a client.')
"System generated users cannot have refresh tokens connected "
"to a client."
)
if token_type is None:
if user.system_generated:
@@ -325,61 +336,76 @@ class AuthManager:
if user.system_generated != (token_type == models.TOKEN_TYPE_SYSTEM):
raise ValueError(
'System generated users can only have system type '
'refresh tokens')
"System generated users can only have system type " "refresh tokens"
)
if token_type == models.TOKEN_TYPE_NORMAL and client_id is None:
raise ValueError('Client is required to generate a refresh token.')
raise ValueError("Client is required to generate a refresh token.")
if (token_type == models.TOKEN_TYPE_LONG_LIVED_ACCESS_TOKEN and
client_name is None):
raise ValueError('Client_name is required for long-lived access '
'token')
if (
token_type == models.TOKEN_TYPE_LONG_LIVED_ACCESS_TOKEN
and client_name is None
):
raise ValueError("Client_name is required for long-lived access " "token")
if token_type == models.TOKEN_TYPE_LONG_LIVED_ACCESS_TOKEN:
for token in user.refresh_tokens.values():
if (token.client_name == client_name and token.token_type ==
models.TOKEN_TYPE_LONG_LIVED_ACCESS_TOKEN):
if (
token.client_name == client_name
and token.token_type == models.TOKEN_TYPE_LONG_LIVED_ACCESS_TOKEN
):
# Each client_name can only have one
# long_lived_access_token type of refresh token
raise ValueError('{} already exists'.format(client_name))
raise ValueError("{} already exists".format(client_name))
return await self._store.async_create_refresh_token(
user, client_id, client_name, client_icon,
token_type, access_token_expiration)
user,
client_id,
client_name,
client_icon,
token_type,
access_token_expiration,
)
async def async_get_refresh_token(
self, token_id: str) -> Optional[models.RefreshToken]:
self, token_id: str
) -> Optional[models.RefreshToken]:
"""Get refresh token by id."""
return await self._store.async_get_refresh_token(token_id)
async def async_get_refresh_token_by_token(
self, token: str) -> Optional[models.RefreshToken]:
self, token: str
) -> Optional[models.RefreshToken]:
"""Get refresh token by token."""
return await self._store.async_get_refresh_token_by_token(token)
async def async_remove_refresh_token(self,
refresh_token: models.RefreshToken) \
-> None:
async def async_remove_refresh_token(
self, refresh_token: models.RefreshToken
) -> None:
"""Delete a refresh token."""
await self._store.async_remove_refresh_token(refresh_token)
@callback
def async_create_access_token(self,
refresh_token: models.RefreshToken,
remote_ip: Optional[str] = None) -> str:
def async_create_access_token(
self, refresh_token: models.RefreshToken, remote_ip: Optional[str] = None
) -> str:
"""Create a new access token."""
self._store.async_log_refresh_token_usage(refresh_token, remote_ip)
now = dt_util.utcnow()
return jwt.encode({
'iss': refresh_token.id,
'iat': now,
'exp': now + refresh_token.access_token_expiration,
}, refresh_token.jwt_key, algorithm='HS256').decode()
return jwt.encode(
{
"iss": refresh_token.id,
"iat": now,
"exp": now + refresh_token.access_token_expiration,
},
refresh_token.jwt_key,
algorithm="HS256",
).decode()
async def async_validate_access_token(
self, token: str) -> Optional[models.RefreshToken]:
self, token: str
) -> Optional[models.RefreshToken]:
"""Return refresh token if an access token is valid."""
try:
unverif_claims = jwt.decode(token, verify=False)
@@ -387,23 +413,18 @@ class AuthManager:
return None
refresh_token = await self.async_get_refresh_token(
cast(str, unverif_claims.get('iss')))
cast(str, unverif_claims.get("iss"))
)
if refresh_token is None:
jwt_key = ''
issuer = ''
jwt_key = ""
issuer = ""
else:
jwt_key = refresh_token.jwt_key
issuer = refresh_token.id
try:
jwt.decode(
token,
jwt_key,
leeway=10,
issuer=issuer,
algorithms=['HS256']
)
jwt.decode(token, jwt_key, leeway=10, issuer=issuer, algorithms=["HS256"])
except jwt.InvalidTokenError:
return None
@@ -413,31 +434,32 @@ class AuthManager:
return refresh_token
async def _async_create_login_flow(
self, handler: _ProviderKey, *, context: Optional[Dict],
data: Optional[Any]) -> data_entry_flow.FlowHandler:
self, handler: _ProviderKey, *, context: Optional[Dict], data: Optional[Any]
) -> data_entry_flow.FlowHandler:
"""Create a login flow."""
auth_provider = self._providers[handler]
return await auth_provider.async_login_flow(context)
async def _async_finish_login_flow(
self, flow: LoginFlow, result: Dict[str, Any]) \
-> Dict[str, Any]:
self, flow: LoginFlow, result: Dict[str, Any]
) -> Dict[str, Any]:
"""Return a user as result of login flow."""
if result['type'] != data_entry_flow.RESULT_TYPE_CREATE_ENTRY:
if result["type"] != data_entry_flow.RESULT_TYPE_CREATE_ENTRY:
return result
# we got final result
if isinstance(result['data'], models.User):
result['result'] = result['data']
if isinstance(result["data"], models.User):
result["result"] = result["data"]
return result
auth_provider = self._providers[result['handler']]
auth_provider = self._providers[result["handler"]]
credentials = await auth_provider.async_get_or_create_credentials(
result['data'])
result["data"]
)
if flow.context is not None and flow.context.get('credential_only'):
result['result'] = credentials
if flow.context is not None and flow.context.get("credential_only"):
result["result"] = credentials
return result
# a multi-factor module cannot be enabled for a new credential
@@ -452,15 +474,18 @@ class AuthManager:
flow.available_mfa_modules = modules
return await flow.async_step_select_mfa_module()
result['result'] = await self.async_get_or_create_user(credentials)
result["result"] = await self.async_get_or_create_user(credentials)
return result
@callback
def _async_get_auth_provider(
self, credentials: models.Credentials) -> Optional[AuthProvider]:
self, credentials: models.Credentials
) -> Optional[AuthProvider]:
"""Get auth provider from a set of credentials."""
auth_provider_key = (credentials.auth_provider_type,
credentials.auth_provider_id)
auth_provider_key = (
credentials.auth_provider_type,
credentials.auth_provider_id,
)
return self._providers.get(auth_provider_key)
async def _user_should_be_owner(self) -> bool:
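The access-token changes above are likewise formatting-only: `async_create_access_token` signs `iss`/`iat`/`exp` with the refresh token's own `jwt_key`, and `async_validate_access_token` first reads `iss` without verification to find that key, then verifies with a 10-second leeway. A self-contained sketch of the same round trip, written against the PyJWT 2.x API (the code above targets the older 1.x forms, where encode() returns bytes and unverified decode uses verify=False) and with a hypothetical in-memory key store, could look like:

```python
# Sketch of the access-token round trip, assuming PyJWT 2.x and a hypothetical
# JWT_KEYS mapping of refresh-token id -> per-token signing key.
from datetime import datetime, timedelta, timezone

import jwt  # PyJWT

JWT_KEYS = {"refresh-token-id-1": "per-refresh-token-secret"}  # hypothetical store


def create_access_token(refresh_token_id: str, expiration: timedelta) -> str:
    """Encode iss/iat/exp with the refresh token's own signing key (HS256)."""
    now = datetime.now(timezone.utc)
    return jwt.encode(
        {"iss": refresh_token_id, "iat": now, "exp": now + expiration},
        JWT_KEYS[refresh_token_id],
        algorithm="HS256",
    )


def validate_access_token(token: str) -> bool:
    """Read 'iss' without verification to find the key, then fully verify."""
    try:
        unverified = jwt.decode(token, options={"verify_signature": False})
    except jwt.InvalidTokenError:
        return False
    issuer = unverified.get("iss", "")
    key = JWT_KEYS.get(issuer, "")
    try:
        jwt.decode(token, key, leeway=10, issuer=issuer, algorithms=["HS256"])
    except jwt.InvalidTokenError:
        return False
    return True


if __name__ == "__main__":
    token = create_access_token("refresh-token-id-1", timedelta(minutes=30))
    print(validate_access_token(token))  # True
```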


@@ -16,10 +16,10 @@ from .permissions import PermissionLookup, system_policies
from .permissions.types import PolicyType # noqa: F401
STORAGE_VERSION = 1
STORAGE_KEY = 'auth'
GROUP_NAME_ADMIN = 'Administrators'
STORAGE_KEY = "auth"
GROUP_NAME_ADMIN = "Administrators"
GROUP_NAME_USER = "Users"
GROUP_NAME_READ_ONLY = 'Read Only'
GROUP_NAME_READ_ONLY = "Read Only"
class AuthStore:
@@ -37,8 +37,9 @@ class AuthStore:
self._users = None # type: Optional[Dict[str, models.User]]
self._groups = None # type: Optional[Dict[str, models.Group]]
self._perm_lookup = None # type: Optional[PermissionLookup]
self._store = hass.helpers.storage.Store(STORAGE_VERSION, STORAGE_KEY,
private=True)
self._store = hass.helpers.storage.Store(
STORAGE_VERSION, STORAGE_KEY, private=True
)
self._lock = asyncio.Lock()
async def async_get_groups(self) -> List[models.Group]:
@@ -74,11 +75,14 @@ class AuthStore:
return self._users.get(user_id)
async def async_create_user(
self, name: Optional[str], is_owner: Optional[bool] = None,
is_active: Optional[bool] = None,
system_generated: Optional[bool] = None,
credentials: Optional[models.Credentials] = None,
group_ids: Optional[List[str]] = None) -> models.User:
self,
name: Optional[str],
is_owner: Optional[bool] = None,
is_active: Optional[bool] = None,
system_generated: Optional[bool] = None,
credentials: Optional[models.Credentials] = None,
group_ids: Optional[List[str]] = None,
) -> models.User:
"""Create a new user."""
if self._users is None:
await self._async_load()
@@ -87,28 +91,28 @@ class AuthStore:
assert self._groups is not None
groups = []
for group_id in (group_ids or []):
for group_id in group_ids or []:
group = self._groups.get(group_id)
if group is None:
raise ValueError('Invalid group specified {}'.format(group_id))
raise ValueError("Invalid group specified {}".format(group_id))
groups.append(group)
kwargs = {
'name': name,
"name": name,
# Until we get group management, we just put everyone in the
# same group.
'groups': groups,
'perm_lookup': self._perm_lookup,
"groups": groups,
"perm_lookup": self._perm_lookup,
} # type: Dict[str, Any]
if is_owner is not None:
kwargs['is_owner'] = is_owner
kwargs["is_owner"] = is_owner
if is_active is not None:
kwargs['is_active'] = is_active
kwargs["is_active"] = is_active
if system_generated is not None:
kwargs['system_generated'] = system_generated
kwargs["system_generated"] = system_generated
new_user = models.User(**kwargs)
@@ -122,8 +126,9 @@ class AuthStore:
await self.async_link_user(new_user, credentials)
return new_user
async def async_link_user(self, user: models.User,
credentials: models.Credentials) -> None:
async def async_link_user(
self, user: models.User, credentials: models.Credentials
) -> None:
"""Add credentials to an existing user."""
user.credentials.append(credentials)
self._async_schedule_save()
@@ -139,9 +144,12 @@ class AuthStore:
self._async_schedule_save()
async def async_update_user(
self, user: models.User, name: Optional[str] = None,
is_active: Optional[bool] = None,
group_ids: Optional[List[str]] = None) -> None:
self,
user: models.User,
name: Optional[str] = None,
is_active: Optional[bool] = None,
group_ids: Optional[List[str]] = None,
) -> None:
"""Update a user."""
assert self._groups is not None
@@ -156,10 +164,7 @@ class AuthStore:
user.groups = groups
user.invalidate_permission_cache()
for attr_name, value in (
('name', name),
('is_active', is_active),
):
for attr_name, value in (("name", name), ("is_active", is_active)):
if value is not None:
setattr(user, attr_name, value)
@@ -175,8 +180,7 @@ class AuthStore:
user.is_active = False
self._async_schedule_save()
async def async_remove_credentials(
self, credentials: models.Credentials) -> None:
async def async_remove_credentials(self, credentials: models.Credentials) -> None:
"""Remove credentials."""
if self._users is None:
await self._async_load()
@@ -197,23 +201,25 @@ class AuthStore:
self._async_schedule_save()
async def async_create_refresh_token(
self, user: models.User, client_id: Optional[str] = None,
client_name: Optional[str] = None,
client_icon: Optional[str] = None,
token_type: str = models.TOKEN_TYPE_NORMAL,
access_token_expiration: timedelta = ACCESS_TOKEN_EXPIRATION) \
-> models.RefreshToken:
self,
user: models.User,
client_id: Optional[str] = None,
client_name: Optional[str] = None,
client_icon: Optional[str] = None,
token_type: str = models.TOKEN_TYPE_NORMAL,
access_token_expiration: timedelta = ACCESS_TOKEN_EXPIRATION,
) -> models.RefreshToken:
"""Create a new token for a user."""
kwargs = {
'user': user,
'client_id': client_id,
'token_type': token_type,
'access_token_expiration': access_token_expiration
"user": user,
"client_id": client_id,
"token_type": token_type,
"access_token_expiration": access_token_expiration,
} # type: Dict[str, Any]
if client_name:
kwargs['client_name'] = client_name
kwargs["client_name"] = client_name
if client_icon:
kwargs['client_icon'] = client_icon
kwargs["client_icon"] = client_icon
refresh_token = models.RefreshToken(**kwargs)
user.refresh_tokens[refresh_token.id] = refresh_token
@@ -222,7 +228,8 @@ class AuthStore:
return refresh_token
async def async_remove_refresh_token(
self, refresh_token: models.RefreshToken) -> None:
self, refresh_token: models.RefreshToken
) -> None:
"""Remove a refresh token."""
if self._users is None:
await self._async_load()
@@ -234,7 +241,8 @@ class AuthStore:
break
async def async_get_refresh_token(
self, token_id: str) -> Optional[models.RefreshToken]:
self, token_id: str
) -> Optional[models.RefreshToken]:
"""Get refresh token by id."""
if self._users is None:
await self._async_load()
@@ -248,7 +256,8 @@ class AuthStore:
return None
async def async_get_refresh_token_by_token(
self, token: str) -> Optional[models.RefreshToken]:
self, token: str
) -> Optional[models.RefreshToken]:
"""Get refresh token by token."""
if self._users is None:
await self._async_load()
@@ -265,8 +274,8 @@ class AuthStore:
@callback
def async_log_refresh_token_usage(
self, refresh_token: models.RefreshToken,
remote_ip: Optional[str] = None) -> None:
self, refresh_token: models.RefreshToken, remote_ip: Optional[str] = None
) -> None:
"""Update refresh token last used information."""
refresh_token.last_used_at = dt_util.utcnow()
refresh_token.last_used_ip = remote_ip
@@ -292,9 +301,7 @@ class AuthStore:
if self._users is not None:
return
self._perm_lookup = perm_lookup = PermissionLookup(
ent_reg, dev_reg
)
self._perm_lookup = perm_lookup = PermissionLookup(ent_reg, dev_reg)
if data is None:
self._set_defaults()
@@ -317,24 +324,24 @@ class AuthStore:
# prevents crashing if user rolls back HA version after a new property
# was added.
for group_dict in data.get('groups', []):
for group_dict in data.get("groups", []):
policy = None # type: Optional[PolicyType]
if group_dict['id'] == GROUP_ID_ADMIN:
if group_dict["id"] == GROUP_ID_ADMIN:
has_admin_group = True
name = GROUP_NAME_ADMIN
policy = system_policies.ADMIN_POLICY
system_generated = True
elif group_dict['id'] == GROUP_ID_USER:
elif group_dict["id"] == GROUP_ID_USER:
has_user_group = True
name = GROUP_NAME_USER
policy = system_policies.USER_POLICY
system_generated = True
elif group_dict['id'] == GROUP_ID_READ_ONLY:
elif group_dict["id"] == GROUP_ID_READ_ONLY:
has_read_only_group = True
name = GROUP_NAME_READ_ONLY
@@ -342,18 +349,18 @@ class AuthStore:
system_generated = True
else:
name = group_dict['name']
policy = group_dict.get('policy')
name = group_dict["name"]
policy = group_dict.get("policy")
system_generated = False
# We don't want groups without a policy that are not system groups
# This is part of migrating from state 1
if policy is None:
group_without_policy = group_dict['id']
group_without_policy = group_dict["id"]
continue
groups[group_dict['id']] = models.Group(
id=group_dict['id'],
groups[group_dict["id"]] = models.Group(
id=group_dict["id"],
name=name,
policy=policy,
system_generated=system_generated,
@@ -361,8 +368,7 @@ class AuthStore:
# If there are no groups, add all existing users to the admin group.
# This is part of migrating from state 2
migrate_users_to_admin_group = (not groups and
group_without_policy is None)
migrate_users_to_admin_group = not groups and group_without_policy is None
# If we find a no_policy_group, we need to migrate all users to the
# admin group. We only do this if there are no other groups, as is
@@ -385,82 +391,86 @@ class AuthStore:
user_group = _system_user_group()
groups[user_group.id] = user_group
for user_dict in data['users']:
for user_dict in data["users"]:
# Collect the user's groups.
user_groups = []
for group_id in user_dict.get('group_ids', []):
for group_id in user_dict.get("group_ids", []):
# This is part of migrating from state 1
if group_id == group_without_policy:
group_id = GROUP_ID_ADMIN
user_groups.append(groups[group_id])
# This is part of migrating from state 2
if (not user_dict['system_generated'] and
migrate_users_to_admin_group):
if not user_dict["system_generated"] and migrate_users_to_admin_group:
user_groups.append(groups[GROUP_ID_ADMIN])
users[user_dict['id']] = models.User(
name=user_dict['name'],
users[user_dict["id"]] = models.User(
name=user_dict["name"],
groups=user_groups,
id=user_dict['id'],
is_owner=user_dict['is_owner'],
is_active=user_dict['is_active'],
system_generated=user_dict['system_generated'],
id=user_dict["id"],
is_owner=user_dict["is_owner"],
is_active=user_dict["is_active"],
system_generated=user_dict["system_generated"],
perm_lookup=perm_lookup,
)
for cred_dict in data['credentials']:
users[cred_dict['user_id']].credentials.append(models.Credentials(
id=cred_dict['id'],
is_new=False,
auth_provider_type=cred_dict['auth_provider_type'],
auth_provider_id=cred_dict['auth_provider_id'],
data=cred_dict['data'],
))
for cred_dict in data["credentials"]:
users[cred_dict["user_id"]].credentials.append(
models.Credentials(
id=cred_dict["id"],
is_new=False,
auth_provider_type=cred_dict["auth_provider_type"],
auth_provider_id=cred_dict["auth_provider_id"],
data=cred_dict["data"],
)
)
for rt_dict in data['refresh_tokens']:
for rt_dict in data["refresh_tokens"]:
# Filter out the old keys that don't have jwt_key (pre-0.76)
if 'jwt_key' not in rt_dict:
if "jwt_key" not in rt_dict:
continue
created_at = dt_util.parse_datetime(rt_dict['created_at'])
created_at = dt_util.parse_datetime(rt_dict["created_at"])
if created_at is None:
getLogger(__name__).error(
'Ignoring refresh token %(id)s with invalid created_at '
'%(created_at)s for user_id %(user_id)s', rt_dict)
"Ignoring refresh token %(id)s with invalid created_at "
"%(created_at)s for user_id %(user_id)s",
rt_dict,
)
continue
token_type = rt_dict.get('token_type')
token_type = rt_dict.get("token_type")
if token_type is None:
if rt_dict['client_id'] is None:
if rt_dict["client_id"] is None:
token_type = models.TOKEN_TYPE_SYSTEM
else:
token_type = models.TOKEN_TYPE_NORMAL
# old refresh_tokens don't have last_used_at (pre-0.78)
last_used_at_str = rt_dict.get('last_used_at')
last_used_at_str = rt_dict.get("last_used_at")
if last_used_at_str:
last_used_at = dt_util.parse_datetime(last_used_at_str)
else:
last_used_at = None
token = models.RefreshToken(
id=rt_dict['id'],
user=users[rt_dict['user_id']],
client_id=rt_dict['client_id'],
id=rt_dict["id"],
user=users[rt_dict["user_id"]],
client_id=rt_dict["client_id"],
# use dict.get to keep backward compatibility
client_name=rt_dict.get('client_name'),
client_icon=rt_dict.get('client_icon'),
client_name=rt_dict.get("client_name"),
client_icon=rt_dict.get("client_icon"),
token_type=token_type,
created_at=created_at,
access_token_expiration=timedelta(
seconds=rt_dict['access_token_expiration']),
token=rt_dict['token'],
jwt_key=rt_dict['jwt_key'],
seconds=rt_dict["access_token_expiration"]
),
token=rt_dict["token"],
jwt_key=rt_dict["jwt_key"],
last_used_at=last_used_at,
last_used_ip=rt_dict.get('last_used_ip'),
last_used_ip=rt_dict.get("last_used_ip"),
)
users[rt_dict['user_id']].refresh_tokens[token.id] = token
users[rt_dict["user_id"]].refresh_tokens[token.id] = token
self._groups = groups
self._users = users
@@ -481,12 +491,12 @@ class AuthStore:
users = [
{
'id': user.id,
'group_ids': [group.id for group in user.groups],
'is_owner': user.is_owner,
'is_active': user.is_active,
'name': user.name,
'system_generated': user.system_generated,
"id": user.id,
"group_ids": [group.id for group in user.groups],
"is_owner": user.is_owner,
"is_active": user.is_active,
"name": user.name,
"system_generated": user.system_generated,
}
for user in self._users.values()
]
@@ -494,23 +504,23 @@ class AuthStore:
groups = []
for group in self._groups.values():
g_dict = {
'id': group.id,
"id": group.id,
# Name not read for sys groups. Kept here for backwards compat
'name': group.name
"name": group.name,
} # type: Dict[str, Any]
if not group.system_generated:
g_dict['policy'] = group.policy
g_dict["policy"] = group.policy
groups.append(g_dict)
credentials = [
{
'id': credential.id,
'user_id': user.id,
'auth_provider_type': credential.auth_provider_type,
'auth_provider_id': credential.auth_provider_id,
'data': credential.data,
"id": credential.id,
"user_id": user.id,
"auth_provider_type": credential.auth_provider_type,
"auth_provider_id": credential.auth_provider_id,
"data": credential.data,
}
for user in self._users.values()
for credential in user.credentials
@@ -518,36 +528,35 @@ class AuthStore:
refresh_tokens = [
{
'id': refresh_token.id,
'user_id': user.id,
'client_id': refresh_token.client_id,
'client_name': refresh_token.client_name,
'client_icon': refresh_token.client_icon,
'token_type': refresh_token.token_type,
'created_at': refresh_token.created_at.isoformat(),
'access_token_expiration':
refresh_token.access_token_expiration.total_seconds(),
'token': refresh_token.token,
'jwt_key': refresh_token.jwt_key,
'last_used_at':
refresh_token.last_used_at.isoformat()
if refresh_token.last_used_at else None,
'last_used_ip': refresh_token.last_used_ip,
"id": refresh_token.id,
"user_id": user.id,
"client_id": refresh_token.client_id,
"client_name": refresh_token.client_name,
"client_icon": refresh_token.client_icon,
"token_type": refresh_token.token_type,
"created_at": refresh_token.created_at.isoformat(),
"access_token_expiration": refresh_token.access_token_expiration.total_seconds(),
"token": refresh_token.token,
"jwt_key": refresh_token.jwt_key,
"last_used_at": refresh_token.last_used_at.isoformat()
if refresh_token.last_used_at
else None,
"last_used_ip": refresh_token.last_used_ip,
}
for user in self._users.values()
for refresh_token in user.refresh_tokens.values()
]
return {
'users': users,
'groups': groups,
'credentials': credentials,
'refresh_tokens': refresh_tokens,
"users": users,
"groups": groups,
"credentials": credentials,
"refresh_tokens": refresh_tokens,
}
def _set_defaults(self) -> None:
"""Set default values for auth store."""
self._users = OrderedDict() # type: Dict[str, models.User]
self._users = OrderedDict()
groups = OrderedDict() # type: Dict[str, models.Group]
admin_group = _system_admin_group()
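The quote changes throughout this file are purely mechanical: Python treats single- and double-quoted string literals identically, so the stored auth data is unaffected. A minimal standalone sketch, using the serialization keys shown above, makes the point:

before = {'users': [], 'groups': [], 'credentials': [], 'refresh_tokens': []}
after = {"users": [], "groups": [], "credentials": [], "refresh_tokens": []}
assert before == after  # only the source text differs; the resulting objects are identical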


@@ -4,6 +4,6 @@ from datetime import timedelta
ACCESS_TOKEN_EXPIRATION = timedelta(minutes=30)
MFA_SESSION_EXPIRATION = timedelta(minutes=5)
GROUP_ID_ADMIN = 'system-admin'
GROUP_ID_USER = 'system-users'
GROUP_ID_READ_ONLY = 'system-read-only'
GROUP_ID_ADMIN = "system-admin"
GROUP_ID_USER = "system-users"
GROUP_ID_READ_ONLY = "system-read-only"


@@ -15,14 +15,17 @@ from homeassistant.util.decorator import Registry
MULTI_FACTOR_AUTH_MODULES = Registry()
MULTI_FACTOR_AUTH_MODULE_SCHEMA = vol.Schema({
vol.Required(CONF_TYPE): str,
vol.Optional(CONF_NAME): str,
# Specify ID if you have two mfa auth module for same type.
vol.Optional(CONF_ID): str,
}, extra=vol.ALLOW_EXTRA)
MULTI_FACTOR_AUTH_MODULE_SCHEMA = vol.Schema(
{
vol.Required(CONF_TYPE): str,
vol.Optional(CONF_NAME): str,
# Specify ID if you have two mfa auth module for same type.
vol.Optional(CONF_ID): str,
},
extra=vol.ALLOW_EXTRA,
)
DATA_REQS = 'mfa_auth_module_reqs_processed'
DATA_REQS = "mfa_auth_module_reqs_processed"
_LOGGER = logging.getLogger(__name__)
@@ -30,7 +33,7 @@ _LOGGER = logging.getLogger(__name__)
class MultiFactorAuthModule:
"""Multi-factor Auth Module of validation function."""
DEFAULT_TITLE = 'Unnamed auth module'
DEFAULT_TITLE = "Unnamed auth module"
MAX_RETRY_TIME = 3
def __init__(self, hass: HomeAssistant, config: Dict[str, Any]) -> None:
@@ -63,7 +66,7 @@ class MultiFactorAuthModule:
"""Return a voluptuous schema to define mfa auth module's input."""
raise NotImplementedError
async def async_setup_flow(self, user_id: str) -> 'SetupFlow':
async def async_setup_flow(self, user_id: str) -> "SetupFlow":
"""Return a data entry flow handler for setup module.
Mfa module should extend SetupFlow
@@ -82,8 +85,7 @@ class MultiFactorAuthModule:
"""Return whether user is setup."""
raise NotImplementedError
async def async_validate(
self, user_id: str, user_input: Dict[str, Any]) -> bool:
async def async_validate(self, user_id: str, user_input: Dict[str, Any]) -> bool:
"""Return True if validation passed."""
raise NotImplementedError
@@ -91,17 +93,17 @@ class MultiFactorAuthModule:
class SetupFlow(data_entry_flow.FlowHandler):
"""Handler for the setup flow."""
def __init__(self, auth_module: MultiFactorAuthModule,
setup_schema: vol.Schema,
user_id: str) -> None:
def __init__(
self, auth_module: MultiFactorAuthModule, setup_schema: vol.Schema, user_id: str
) -> None:
"""Initialize the setup flow."""
self._auth_module = auth_module
self._setup_schema = setup_schema
self._user_id = user_id
async def async_step_init(
self, user_input: Optional[Dict[str, str]] = None) \
-> Dict[str, Any]:
self, user_input: Optional[Dict[str, str]] = None
) -> Dict[str, Any]:
"""Handle the first step of setup flow.
Return self.async_show_form(step_id='init') if user_input is None.
@@ -110,23 +112,19 @@ class SetupFlow(data_entry_flow.FlowHandler):
errors = {} # type: Dict[str, str]
if user_input:
result = await self._auth_module.async_setup_user(
self._user_id, user_input)
result = await self._auth_module.async_setup_user(self._user_id, user_input)
return self.async_create_entry(
title=self._auth_module.name,
data={'result': result}
title=self._auth_module.name, data={"result": result}
)
return self.async_show_form(
step_id='init',
data_schema=self._setup_schema,
errors=errors
step_id="init", data_schema=self._setup_schema, errors=errors
)
async def auth_mfa_module_from_config(
hass: HomeAssistant, config: Dict[str, Any]) \
-> MultiFactorAuthModule:
hass: HomeAssistant, config: Dict[str, Any]
) -> MultiFactorAuthModule:
"""Initialize an auth module from a config."""
module_name = config[CONF_TYPE]
module = await _load_mfa_module(hass, module_name)
@@ -134,26 +132,29 @@ async def auth_mfa_module_from_config(
try:
config = module.CONFIG_SCHEMA(config) # type: ignore
except vol.Invalid as err:
_LOGGER.error('Invalid configuration for multi-factor module %s: %s',
module_name, humanize_error(config, err))
_LOGGER.error(
"Invalid configuration for multi-factor module %s: %s",
module_name,
humanize_error(config, err),
)
raise
return MULTI_FACTOR_AUTH_MODULES[module_name](hass, config) # type: ignore
async def _load_mfa_module(hass: HomeAssistant, module_name: str) \
-> types.ModuleType:
async def _load_mfa_module(hass: HomeAssistant, module_name: str) -> types.ModuleType:
"""Load an mfa auth module."""
module_path = 'homeassistant.auth.mfa_modules.{}'.format(module_name)
module_path = "homeassistant.auth.mfa_modules.{}".format(module_name)
try:
module = importlib.import_module(module_path)
except ImportError as err:
_LOGGER.error('Unable to load mfa module %s: %s', module_name, err)
raise HomeAssistantError('Unable to load mfa module {}: {}'.format(
module_name, err))
_LOGGER.error("Unable to load mfa module %s: %s", module_name, err)
raise HomeAssistantError(
"Unable to load mfa module {}: {}".format(module_name, err)
)
if hass.config.skip_pip or not hasattr(module, 'REQUIREMENTS'):
if hass.config.skip_pip or not hasattr(module, "REQUIREMENTS"):
return module
processed = hass.data.get(DATA_REQS)
@@ -164,12 +165,13 @@ async def _load_mfa_module(hass: HomeAssistant, module_name: str) \
# https://github.com/python/mypy/issues/1424
req_success = await requirements.async_process_requirements(
hass, module_path, module.REQUIREMENTS) # type: ignore
hass, module_path, module.REQUIREMENTS # type: ignore
)
if not req_success:
raise HomeAssistantError(
'Unable to process requirements of mfa module {}'.format(
module_name))
"Unable to process requirements of mfa module {}".format(module_name)
)
processed.add(module_name)
return module
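For orientation, here is a minimal standalone sketch of the register-then-import pattern used by MULTI_FACTOR_AUTH_MODULES and _load_mfa_module above. The names below are hypothetical stand-ins, not Home Assistant APIs:

registry = {}  # stands in for the Registry() instance above

def register(name):
    """Register a class under a module type name."""
    def decorator(cls):
        registry[name] = cls
        return cls
    return decorator

@register("insecure_example")
class ExampleModule:
    pass

# the loader derives the dotted import path from the same type name
module_path = "homeassistant.auth.mfa_modules.{}".format("insecure_example")
assert registry["insecure_example"] is ExampleModule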


@@ -6,39 +6,45 @@ import voluptuous as vol
from homeassistant.core import HomeAssistant
from . import MultiFactorAuthModule, MULTI_FACTOR_AUTH_MODULES, \
MULTI_FACTOR_AUTH_MODULE_SCHEMA, SetupFlow
from . import (
MultiFactorAuthModule,
MULTI_FACTOR_AUTH_MODULES,
MULTI_FACTOR_AUTH_MODULE_SCHEMA,
SetupFlow,
)
CONFIG_SCHEMA = MULTI_FACTOR_AUTH_MODULE_SCHEMA.extend({
vol.Required('data'): [vol.Schema({
vol.Required('user_id'): str,
vol.Required('pin'): str,
})]
}, extra=vol.PREVENT_EXTRA)
CONFIG_SCHEMA = MULTI_FACTOR_AUTH_MODULE_SCHEMA.extend(
{
vol.Required("data"): [
vol.Schema({vol.Required("user_id"): str, vol.Required("pin"): str})
]
},
extra=vol.PREVENT_EXTRA,
)
_LOGGER = logging.getLogger(__name__)
@MULTI_FACTOR_AUTH_MODULES.register('insecure_example')
@MULTI_FACTOR_AUTH_MODULES.register("insecure_example")
class InsecureExampleModule(MultiFactorAuthModule):
"""Example auth module validate pin."""
DEFAULT_TITLE = 'Insecure Personal Identify Number'
DEFAULT_TITLE = "Insecure Personal Identify Number"
def __init__(self, hass: HomeAssistant, config: Dict[str, Any]) -> None:
"""Initialize the user data store."""
super().__init__(hass, config)
self._data = config['data']
self._data = config["data"]
@property
def input_schema(self) -> vol.Schema:
"""Validate login flow input data."""
return vol.Schema({'pin': str})
return vol.Schema({"pin": str})
@property
def setup_schema(self) -> vol.Schema:
"""Validate async_setup_user input data."""
return vol.Schema({'pin': str})
return vol.Schema({"pin": str})
async def async_setup_flow(self, user_id: str) -> SetupFlow:
"""Return a data entry flow handler for setup module.
@@ -50,21 +56,21 @@ class InsecureExampleModule(MultiFactorAuthModule):
async def async_setup_user(self, user_id: str, setup_data: Any) -> Any:
"""Set up user to use mfa module."""
# data should have been validated by the caller
pin = setup_data['pin']
pin = setup_data["pin"]
for data in self._data:
if data['user_id'] == user_id:
if data["user_id"] == user_id:
# already setup, override
data['pin'] = pin
data["pin"] = pin
return
self._data.append({'user_id': user_id, 'pin': pin})
self._data.append({"user_id": user_id, "pin": pin})
async def async_depose_user(self, user_id: str) -> None:
"""Remove user from mfa module."""
found = None
for data in self._data:
if data['user_id'] == user_id:
if data["user_id"] == user_id:
found = data
break
if found:
@@ -73,17 +79,16 @@ class InsecureExampleModule(MultiFactorAuthModule):
async def async_is_user_setup(self, user_id: str) -> bool:
"""Return whether user is setup."""
for data in self._data:
if data['user_id'] == user_id:
if data["user_id"] == user_id:
return True
return False
async def async_validate(
self, user_id: str, user_input: Dict[str, Any]) -> bool:
async def async_validate(self, user_id: str, user_input: Dict[str, Any]) -> bool:
"""Return True if validation passed."""
for data in self._data:
if data['user_id'] == user_id:
if data["user_id"] == user_id:
# user_input has been validated by the caller
if data['pin'] == user_input['pin']:
if data["pin"] == user_input["pin"]:
return True
return False
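A minimal standalone sketch of a configuration accepted by the CONFIG_SCHEMA above; the schema is mirrored here rather than imported, and the values are made up:

import voluptuous as vol

schema = vol.Schema(
    {
        vol.Required("type"): str,
        vol.Required("data"): [
            vol.Schema({vol.Required("user_id"): str, vol.Required("pin"): str})
        ],
    },
    extra=vol.PREVENT_EXTRA,
)

validated = schema(
    {"type": "insecure_example", "data": [{"user_id": "some-user-id", "pin": "1234"}]}
)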


@@ -15,26 +15,32 @@ from homeassistant.core import HomeAssistant, callback
from homeassistant.exceptions import ServiceNotFound
from homeassistant.helpers import config_validation as cv
from . import MultiFactorAuthModule, MULTI_FACTOR_AUTH_MODULES, \
MULTI_FACTOR_AUTH_MODULE_SCHEMA, SetupFlow
from . import (
MultiFactorAuthModule,
MULTI_FACTOR_AUTH_MODULES,
MULTI_FACTOR_AUTH_MODULE_SCHEMA,
SetupFlow,
)
REQUIREMENTS = ['pyotp==2.2.7']
REQUIREMENTS = ["pyotp==2.2.7"]
CONF_MESSAGE = 'message'
CONF_MESSAGE = "message"
CONFIG_SCHEMA = MULTI_FACTOR_AUTH_MODULE_SCHEMA.extend({
vol.Optional(CONF_INCLUDE): vol.All(cv.ensure_list, [cv.string]),
vol.Optional(CONF_EXCLUDE): vol.All(cv.ensure_list, [cv.string]),
vol.Optional(CONF_MESSAGE,
default='{} is your Home Assistant login code'): str
}, extra=vol.PREVENT_EXTRA)
CONFIG_SCHEMA = MULTI_FACTOR_AUTH_MODULE_SCHEMA.extend(
{
vol.Optional(CONF_INCLUDE): vol.All(cv.ensure_list, [cv.string]),
vol.Optional(CONF_EXCLUDE): vol.All(cv.ensure_list, [cv.string]),
vol.Optional(CONF_MESSAGE, default="{} is your Home Assistant login code"): str,
},
extra=vol.PREVENT_EXTRA,
)
STORAGE_VERSION = 1
STORAGE_KEY = 'auth_module.notify'
STORAGE_USERS = 'users'
STORAGE_USER_ID = 'user_id'
STORAGE_KEY = "auth_module.notify"
STORAGE_USERS = "users"
STORAGE_USER_ID = "user_id"
INPUT_FIELD_CODE = 'code'
INPUT_FIELD_CODE = "code"
_LOGGER = logging.getLogger(__name__)
@@ -42,24 +48,28 @@ _LOGGER = logging.getLogger(__name__)
def _generate_secret() -> str:
"""Generate a secret."""
import pyotp
return str(pyotp.random_base32())
def _generate_random() -> int:
"""Generate a 8 digit number."""
import pyotp
return int(pyotp.random_base32(length=8, chars=list('1234567890')))
return int(pyotp.random_base32(length=8, chars=list("1234567890")))
def _generate_otp(secret: str, count: int) -> str:
"""Generate one time password."""
import pyotp
return str(pyotp.HOTP(secret).at(count))
def _verify_otp(secret: str, otp: str, count: int) -> bool:
"""Verify one time password."""
import pyotp
return bool(pyotp.HOTP(secret).verify(otp, count))
@@ -67,7 +77,7 @@ def _verify_otp(secret: str, otp: str, count: int) -> bool:
class NotifySetting:
"""Store notify setting for one user."""
secret = attr.ib(type=str, factory=_generate_secret) # not persistent
secret = attr.ib(type=str, factory=_generate_secret) # not persistent
counter = attr.ib(type=int, factory=_generate_random) # not persistent
notify_service = attr.ib(type=Optional[str], default=None)
target = attr.ib(type=Optional[str], default=None)
@@ -76,18 +86,19 @@ class NotifySetting:
_UsersDict = Dict[str, NotifySetting]
@MULTI_FACTOR_AUTH_MODULES.register('notify')
@MULTI_FACTOR_AUTH_MODULES.register("notify")
class NotifyAuthModule(MultiFactorAuthModule):
"""Auth module send hmac-based one time password by notify service."""
DEFAULT_TITLE = 'Notify One-Time Password'
DEFAULT_TITLE = "Notify One-Time Password"
def __init__(self, hass: HomeAssistant, config: Dict[str, Any]) -> None:
"""Initialize the user data store."""
super().__init__(hass, config)
self._user_settings = None # type: Optional[_UsersDict]
self._user_store = hass.helpers.storage.Store(
STORAGE_VERSION, STORAGE_KEY, private=True)
STORAGE_VERSION, STORAGE_KEY, private=True
)
self._include = config.get(CONF_INCLUDE, [])
self._exclude = config.get(CONF_EXCLUDE, [])
self._message_template = config[CONF_MESSAGE]
@@ -119,22 +130,27 @@ class NotifyAuthModule(MultiFactorAuthModule):
if self._user_settings is None:
return
await self._user_store.async_save({STORAGE_USERS: {
user_id: attr.asdict(
notify_setting, filter=attr.filters.exclude(
attr.fields(NotifySetting).secret,
attr.fields(NotifySetting).counter,
))
for user_id, notify_setting
in self._user_settings.items()
}})
await self._user_store.async_save(
{
STORAGE_USERS: {
user_id: attr.asdict(
notify_setting,
filter=attr.filters.exclude(
attr.fields(NotifySetting).secret,
attr.fields(NotifySetting).counter,
),
)
for user_id, notify_setting in self._user_settings.items()
}
}
)
@callback
def aync_get_available_notify_services(self) -> List[str]:
"""Return list of notify services."""
unordered_services = set()
for service in self.hass.services.async_services().get('notify', {}):
for service in self.hass.services.async_services().get("notify", {}):
if service not in self._exclude:
unordered_services.add(service)
@@ -149,8 +165,8 @@ class NotifyAuthModule(MultiFactorAuthModule):
Mfa module should extend SetupFlow
"""
return NotifySetupFlow(
self, self.input_schema, user_id,
self.aync_get_available_notify_services())
self, self.input_schema, user_id, self.aync_get_available_notify_services()
)
async def async_setup_user(self, user_id: str, setup_data: Any) -> Any:
"""Set up auth module for user."""
@@ -159,8 +175,8 @@ class NotifyAuthModule(MultiFactorAuthModule):
assert self._user_settings is not None
self._user_settings[user_id] = NotifySetting(
notify_service=setup_data.get('notify_service'),
target=setup_data.get('target'),
notify_service=setup_data.get("notify_service"),
target=setup_data.get("target"),
)
await self._async_save()
@@ -182,8 +198,7 @@ class NotifyAuthModule(MultiFactorAuthModule):
return user_id in self._user_settings
async def async_validate(
self, user_id: str, user_input: Dict[str, Any]) -> bool:
async def async_validate(self, user_id: str, user_input: Dict[str, Any]) -> bool:
"""Return True if validation passed."""
if self._user_settings is None:
await self._async_load()
@@ -195,9 +210,11 @@ class NotifyAuthModule(MultiFactorAuthModule):
# user_input has been validated by the caller
return await self.hass.async_add_executor_job(
_verify_otp, notify_setting.secret,
user_input.get(INPUT_FIELD_CODE, ''),
notify_setting.counter)
_verify_otp,
notify_setting.secret,
user_input.get(INPUT_FIELD_CODE, ""),
notify_setting.counter,
)
async def async_initialize_login_mfa_step(self, user_id: str) -> None:
"""Generate code and notify user."""
@@ -207,7 +224,7 @@ class NotifyAuthModule(MultiFactorAuthModule):
notify_setting = self._user_settings.get(user_id, None)
if notify_setting is None:
raise ValueError('Cannot find user_id')
raise ValueError("Cannot find user_id")
def generate_secret_and_one_time_password() -> str:
"""Generate and send one time password."""
@@ -215,11 +232,11 @@ class NotifyAuthModule(MultiFactorAuthModule):
# secret and counter are not persistent
notify_setting.secret = _generate_secret()
notify_setting.counter = _generate_random()
return _generate_otp(
notify_setting.secret, notify_setting.counter)
return _generate_otp(notify_setting.secret, notify_setting.counter)
code = await self.hass.async_add_executor_job(
generate_secret_and_one_time_password)
generate_secret_and_one_time_password
)
await self.async_notify_user(user_id, code)
@@ -231,105 +248,107 @@ class NotifyAuthModule(MultiFactorAuthModule):
notify_setting = self._user_settings.get(user_id, None)
if notify_setting is None:
_LOGGER.error('Cannot find user %s', user_id)
_LOGGER.error("Cannot find user %s", user_id)
return
await self.async_notify( # type: ignore
code, notify_setting.notify_service, notify_setting.target)
await self.async_notify( # type: ignore
code, notify_setting.notify_service, notify_setting.target
)
async def async_notify(self, code: str, notify_service: str,
target: Optional[str] = None) -> None:
async def async_notify(
self, code: str, notify_service: str, target: Optional[str] = None
) -> None:
"""Send code by notify service."""
data = {'message': self._message_template.format(code)}
data = {"message": self._message_template.format(code)}
if target:
data['target'] = [target]
data["target"] = [target]
await self.hass.services.async_call('notify', notify_service, data)
await self.hass.services.async_call("notify", notify_service, data)
class NotifySetupFlow(SetupFlow):
"""Handler for the setup flow."""
def __init__(self, auth_module: NotifyAuthModule,
setup_schema: vol.Schema,
user_id: str,
available_notify_services: List[str]) -> None:
def __init__(
self,
auth_module: NotifyAuthModule,
setup_schema: vol.Schema,
user_id: str,
available_notify_services: List[str],
) -> None:
"""Initialize the setup flow."""
super().__init__(auth_module, setup_schema, user_id)
# to fix typing complaint
self._auth_module = auth_module # type: NotifyAuthModule
self._available_notify_services = available_notify_services
self._secret = None # type: Optional[str]
self._count = None # type: Optional[int]
self._count = None # type: Optional[int]
self._notify_service = None # type: Optional[str]
self._target = None # type: Optional[str]
async def async_step_init(
self, user_input: Optional[Dict[str, str]] = None) \
-> Dict[str, Any]:
self, user_input: Optional[Dict[str, str]] = None
) -> Dict[str, Any]:
"""Let user select available notify services."""
errors = {} # type: Dict[str, str]
hass = self._auth_module.hass
if user_input:
self._notify_service = user_input['notify_service']
self._target = user_input.get('target')
self._notify_service = user_input["notify_service"]
self._target = user_input.get("target")
self._secret = await hass.async_add_executor_job(_generate_secret)
self._count = await hass.async_add_executor_job(_generate_random)
return await self.async_step_setup()
if not self._available_notify_services:
return self.async_abort(reason='no_available_service')
return self.async_abort(reason="no_available_service")
schema = OrderedDict() # type: Dict[str, Any]
schema['notify_service'] = vol.In(self._available_notify_services)
schema['target'] = vol.Optional(str)
schema["notify_service"] = vol.In(self._available_notify_services)
schema["target"] = vol.Optional(str)
return self.async_show_form(
step_id='init',
data_schema=vol.Schema(schema),
errors=errors
step_id="init", data_schema=vol.Schema(schema), errors=errors
)
async def async_step_setup(
self, user_input: Optional[Dict[str, str]] = None) \
-> Dict[str, Any]:
self, user_input: Optional[Dict[str, str]] = None
) -> Dict[str, Any]:
"""Verify user can recevie one-time password."""
errors = {} # type: Dict[str, str]
hass = self._auth_module.hass
if user_input:
verified = await hass.async_add_executor_job(
_verify_otp, self._secret, user_input['code'], self._count)
_verify_otp, self._secret, user_input["code"], self._count
)
if verified:
await self._auth_module.async_setup_user(
self._user_id, {
'notify_service': self._notify_service,
'target': self._target,
})
return self.async_create_entry(
title=self._auth_module.name,
data={}
self._user_id,
{"notify_service": self._notify_service, "target": self._target},
)
return self.async_create_entry(title=self._auth_module.name, data={})
errors['base'] = 'invalid_code'
errors["base"] = "invalid_code"
# generate code every time, no retry logic
assert self._secret and self._count
code = await hass.async_add_executor_job(
_generate_otp, self._secret, self._count)
_generate_otp, self._secret, self._count
)
assert self._notify_service
try:
await self._auth_module.async_notify(
code, self._notify_service, self._target)
code, self._notify_service, self._target
)
except ServiceNotFound:
return self.async_abort(reason='notify_service_not_exist')
return self.async_abort(reason="notify_service_not_exist")
return self.async_show_form(
step_id='setup',
step_id="setup",
data_schema=self._setup_schema,
description_placeholders={'notify_service': self._notify_service},
description_placeholders={"notify_service": self._notify_service},
errors=errors,
)
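A minimal standalone sketch of the HOTP flow the notify module implements above, assuming pyotp==2.2.7 as pinned in REQUIREMENTS; the calls mirror _generate_secret, _generate_random, _generate_otp and _verify_otp:

import pyotp

secret = pyotp.random_base32()                                           # _generate_secret
counter = int(pyotp.random_base32(length=8, chars=list("1234567890")))   # _generate_random
code = pyotp.HOTP(secret).at(counter)                                    # code sent to the user
assert pyotp.HOTP(secret).verify(code, counter)                          # check done in async_validate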


@@ -9,23 +9,26 @@ import voluptuous as vol
from homeassistant.auth.models import User
from homeassistant.core import HomeAssistant
from . import MultiFactorAuthModule, MULTI_FACTOR_AUTH_MODULES, \
MULTI_FACTOR_AUTH_MODULE_SCHEMA, SetupFlow
from . import (
MultiFactorAuthModule,
MULTI_FACTOR_AUTH_MODULES,
MULTI_FACTOR_AUTH_MODULE_SCHEMA,
SetupFlow,
)
REQUIREMENTS = ['pyotp==2.2.7', 'PyQRCode==1.2.1']
REQUIREMENTS = ["pyotp==2.2.7", "PyQRCode==1.2.1"]
CONFIG_SCHEMA = MULTI_FACTOR_AUTH_MODULE_SCHEMA.extend({
}, extra=vol.PREVENT_EXTRA)
CONFIG_SCHEMA = MULTI_FACTOR_AUTH_MODULE_SCHEMA.extend({}, extra=vol.PREVENT_EXTRA)
STORAGE_VERSION = 1
STORAGE_KEY = 'auth_module.totp'
STORAGE_USERS = 'users'
STORAGE_USER_ID = 'user_id'
STORAGE_OTA_SECRET = 'ota_secret'
STORAGE_KEY = "auth_module.totp"
STORAGE_USERS = "users"
STORAGE_USER_ID = "user_id"
STORAGE_OTA_SECRET = "ota_secret"
INPUT_FIELD_CODE = 'code'
INPUT_FIELD_CODE = "code"
DUMMY_SECRET = 'FPPTH34D4E3MI2HG'
DUMMY_SECRET = "FPPTH34D4E3MI2HG"
_LOGGER = logging.getLogger(__name__)
@@ -38,10 +41,15 @@ def _generate_qr_code(data: str) -> str:
with BytesIO() as buffer:
qr_code.svg(file=buffer, scale=4)
return '{}'.format(
buffer.getvalue().decode("ascii").replace('\n', '')
.replace('<?xml version="1.0" encoding="UTF-8"?>'
'<svg xmlns="http://www.w3.org/2000/svg"', '<svg')
return "{}".format(
buffer.getvalue()
.decode("ascii")
.replace("\n", "")
.replace(
'<?xml version="1.0" encoding="UTF-8"?>'
'<svg xmlns="http://www.w3.org/2000/svg"',
"<svg",
)
)
@@ -51,16 +59,17 @@ def _generate_secret_and_qr_code(username: str) -> Tuple[str, str, str]:
ota_secret = pyotp.random_base32()
url = pyotp.totp.TOTP(ota_secret).provisioning_uri(
username, issuer_name="Home Assistant")
username, issuer_name="Home Assistant"
)
image = _generate_qr_code(url)
return ota_secret, url, image
@MULTI_FACTOR_AUTH_MODULES.register('totp')
@MULTI_FACTOR_AUTH_MODULES.register("totp")
class TotpAuthModule(MultiFactorAuthModule):
"""Auth module validate time-based one time password."""
DEFAULT_TITLE = 'Time-based One Time Password'
DEFAULT_TITLE = "Time-based One Time Password"
MAX_RETRY_TIME = 5
def __init__(self, hass: HomeAssistant, config: Dict[str, Any]) -> None:
@@ -68,7 +77,8 @@ class TotpAuthModule(MultiFactorAuthModule):
super().__init__(hass, config)
self._users = None # type: Optional[Dict[str, str]]
self._user_store = hass.helpers.storage.Store(
STORAGE_VERSION, STORAGE_KEY, private=True)
STORAGE_VERSION, STORAGE_KEY, private=True
)
self._init_lock = asyncio.Lock()
@property
@@ -93,14 +103,13 @@ class TotpAuthModule(MultiFactorAuthModule):
"""Save data."""
await self._user_store.async_save({STORAGE_USERS: self._users})
def _add_ota_secret(self, user_id: str,
secret: Optional[str] = None) -> str:
def _add_ota_secret(self, user_id: str, secret: Optional[str] = None) -> str:
"""Create a ota_secret for user."""
import pyotp
ota_secret = secret or pyotp.random_base32() # type: str
self._users[user_id] = ota_secret # type: ignore
self._users[user_id] = ota_secret # type: ignore
return ota_secret
async def async_setup_flow(self, user_id: str) -> SetupFlow:
@@ -108,7 +117,7 @@ class TotpAuthModule(MultiFactorAuthModule):
Mfa module should extend SetupFlow
"""
user = await self.hass.auth.async_get_user(user_id) # type: ignore
user = await self.hass.auth.async_get_user(user_id) # type: ignore
return TotpSetupFlow(self, self.input_schema, user)
async def async_setup_user(self, user_id: str, setup_data: Any) -> str:
@@ -117,7 +126,8 @@ class TotpAuthModule(MultiFactorAuthModule):
await self._async_load()
result = await self.hass.async_add_executor_job(
self._add_ota_secret, user_id, setup_data.get('secret'))
self._add_ota_secret, user_id, setup_data.get("secret")
)
await self._async_save()
return result
@@ -127,7 +137,7 @@ class TotpAuthModule(MultiFactorAuthModule):
if self._users is None:
await self._async_load()
if self._users.pop(user_id, None): # type: ignore
if self._users.pop(user_id, None): # type: ignore
await self._async_save()
async def async_is_user_setup(self, user_id: str) -> bool:
@@ -135,10 +145,9 @@ class TotpAuthModule(MultiFactorAuthModule):
if self._users is None:
await self._async_load()
return user_id in self._users # type: ignore
return user_id in self._users # type: ignore
async def async_validate(
self, user_id: str, user_input: Dict[str, Any]) -> bool:
async def async_validate(self, user_id: str, user_input: Dict[str, Any]) -> bool:
"""Return True if validation passed."""
if self._users is None:
await self._async_load()
@@ -146,7 +155,8 @@ class TotpAuthModule(MultiFactorAuthModule):
# user_input has been validated by the caller
# set INPUT_FIELD_CODE as vol.Required is not user friendly
return await self.hass.async_add_executor_job(
self._validate_2fa, user_id, user_input.get(INPUT_FIELD_CODE, ''))
self._validate_2fa, user_id, user_input.get(INPUT_FIELD_CODE, "")
)
def _validate_2fa(self, user_id: str, code: str) -> bool:
"""Validate two factor authentication code."""
@@ -165,9 +175,9 @@ class TotpAuthModule(MultiFactorAuthModule):
class TotpSetupFlow(SetupFlow):
"""Handler for the setup flow."""
def __init__(self, auth_module: TotpAuthModule,
setup_schema: vol.Schema,
user: User) -> None:
def __init__(
self, auth_module: TotpAuthModule, setup_schema: vol.Schema, user: User
) -> None:
"""Initialize the setup flow."""
super().__init__(auth_module, setup_schema, user.id)
# to fix typing complaint
@@ -178,8 +188,8 @@ class TotpSetupFlow(SetupFlow):
self._image = None  # type: Optional[str]
async def async_step_init(
self, user_input: Optional[Dict[str, str]] = None) \
-> Dict[str, Any]:
self, user_input: Optional[Dict[str, str]] = None
) -> Dict[str, Any]:
"""Handle the first step of setup flow.
Return self.async_show_form(step_id='init') if user_input is None.
@@ -191,30 +201,31 @@ class TotpSetupFlow(SetupFlow):
if user_input:
verified = await self.hass.async_add_executor_job( # type: ignore
pyotp.TOTP(self._ota_secret).verify, user_input['code'])
pyotp.TOTP(self._ota_secret).verify, user_input["code"]
)
if verified:
result = await self._auth_module.async_setup_user(
self._user_id, {'secret': self._ota_secret})
self._user_id, {"secret": self._ota_secret}
)
return self.async_create_entry(
title=self._auth_module.name,
data={'result': result}
title=self._auth_module.name, data={"result": result}
)
errors['base'] = 'invalid_code'
errors["base"] = "invalid_code"
else:
hass = self._auth_module.hass
self._ota_secret, self._url, self._image = \
await hass.async_add_executor_job( # type: ignore
_generate_secret_and_qr_code, str(self._user.name))
self._ota_secret, self._url, self._image = await hass.async_add_executor_job( # type: ignore
_generate_secret_and_qr_code, str(self._user.name)
)
return self.async_show_form(
step_id='init',
step_id="init",
data_schema=self._setup_schema,
description_placeholders={
'code': self._ota_secret,
'url': self._url,
'qr_code': self._image
"code": self._ota_secret,
"url": self._url,
"qr_code": self._image,
},
errors=errors
errors=errors,
)
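A minimal standalone sketch of the TOTP enrollment and check pair used above, assuming pyotp==2.2.7; "example_user" is a placeholder name:

import pyotp

ota_secret = pyotp.random_base32()
url = pyotp.totp.TOTP(ota_secret).provisioning_uri(
    "example_user", issuer_name="Home Assistant"
)  # this URL is what gets encoded into the QR code during setup
code = pyotp.TOTP(ota_secret).now()          # what the authenticator app displays
assert pyotp.TOTP(ota_secret).verify(code)   # what _validate_2fa checks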


@@ -11,9 +11,9 @@ from . import permissions as perm_mdl
from .const import GROUP_ID_ADMIN
from .util import generate_secret
TOKEN_TYPE_NORMAL = 'normal'
TOKEN_TYPE_SYSTEM = 'system'
TOKEN_TYPE_LONG_LIVED_ACCESS_TOKEN = 'long_lived_access_token'
TOKEN_TYPE_NORMAL = "normal"
TOKEN_TYPE_SYSTEM = "system"
TOKEN_TYPE_LONG_LIVED_ACCESS_TOKEN = "long_lived_access_token"
@attr.s(slots=True)
@@ -32,7 +32,7 @@ class User:
name = attr.ib(type=str) # type: Optional[str]
perm_lookup = attr.ib(
type=perm_mdl.PermissionLookup, cmp=False,
type=perm_mdl.PermissionLookup, cmp=False
) # type: perm_mdl.PermissionLookup
id = attr.ib(type=str, factory=lambda: uuid.uuid4().hex)
is_owner = attr.ib(type=bool, default=False)
@@ -42,9 +42,7 @@ class User:
groups = attr.ib(type=List, factory=list, cmp=False) # type: List[Group]
# List of credentials of a user.
credentials = attr.ib(
type=list, factory=list, cmp=False
) # type: List[Credentials]
credentials = attr.ib(type=list, factory=list, cmp=False) # type: List[Credentials]
# Tokens associated with a user.
refresh_tokens = attr.ib(
@@ -52,10 +50,7 @@ class User:
) # type: Dict[str, RefreshToken]
_permissions = attr.ib(
type=Optional[perm_mdl.PolicyPermissions],
init=False,
cmp=False,
default=None,
type=Optional[perm_mdl.PolicyPermissions], init=False, cmp=False, default=None
)
@property
@@ -68,9 +63,9 @@ class User:
return self._permissions
self._permissions = perm_mdl.PolicyPermissions(
perm_mdl.merge_policies([
group.policy for group in self.groups]),
self.perm_lookup)
perm_mdl.merge_policies([group.policy for group in self.groups]),
self.perm_lookup,
)
return self._permissions
@@ -80,8 +75,7 @@ class User:
if self.is_owner:
return True
return self.is_active and any(
gr.id == GROUP_ID_ADMIN for gr in self.groups)
return self.is_active and any(gr.id == GROUP_ID_ADMIN for gr in self.groups)
def invalidate_permission_cache(self) -> None:
"""Invalidate permission cache."""
@@ -97,10 +91,13 @@ class RefreshToken:
access_token_expiration = attr.ib(type=timedelta)
client_name = attr.ib(type=Optional[str], default=None)
client_icon = attr.ib(type=Optional[str], default=None)
token_type = attr.ib(type=str, default=TOKEN_TYPE_NORMAL,
validator=attr.validators.in_((
TOKEN_TYPE_NORMAL, TOKEN_TYPE_SYSTEM,
TOKEN_TYPE_LONG_LIVED_ACCESS_TOKEN)))
token_type = attr.ib(
type=str,
default=TOKEN_TYPE_NORMAL,
validator=attr.validators.in_(
(TOKEN_TYPE_NORMAL, TOKEN_TYPE_SYSTEM, TOKEN_TYPE_LONG_LIVED_ACCESS_TOKEN)
),
)
id = attr.ib(type=str, factory=lambda: uuid.uuid4().hex)
created_at = attr.ib(type=datetime, factory=dt_util.utcnow)
token = attr.ib(type=str, factory=lambda: generate_secret(64))
@@ -124,5 +121,4 @@ class Credentials:
is_new = attr.ib(type=bool, default=True)
UserMeta = NamedTuple("UserMeta",
[('name', Optional[str]), ('is_active', bool)])
UserMeta = NamedTuple("UserMeta", [("name", Optional[str]), ("is_active", bool)])


@@ -1,8 +1,17 @@
"""Permissions for Home Assistant."""
import logging
from typing import ( # noqa: F401
cast, Any, Callable, Dict, List, Mapping, Set, Tuple, Union,
TYPE_CHECKING)
cast,
Any,
Callable,
Dict,
List,
Mapping,
Set,
Tuple,
Union,
TYPE_CHECKING,
)
import voluptuous as vol
@@ -14,9 +23,7 @@ from .merge import merge_policies # noqa
from .util import test_all
POLICY_SCHEMA = vol.Schema({
vol.Optional(CAT_ENTITIES): ENTITY_POLICY_SCHEMA
})
POLICY_SCHEMA = vol.Schema({vol.Optional(CAT_ENTITIES): ENTITY_POLICY_SCHEMA})
_LOGGER = logging.getLogger(__name__)
@@ -47,8 +54,7 @@ class AbstractPermissions:
class PolicyPermissions(AbstractPermissions):
"""Handle permissions."""
def __init__(self, policy: PolicyType,
perm_lookup: PermissionLookup) -> None:
def __init__(self, policy: PolicyType, perm_lookup: PermissionLookup) -> None:
"""Initialize the permission class."""
self._policy = policy
self._perm_lookup = perm_lookup
@@ -59,14 +65,12 @@ class PolicyPermissions(AbstractPermissions):
def _entity_func(self) -> Callable[[str, str], bool]:
"""Return a function that can test entity access."""
return compile_entities(self._policy.get(CAT_ENTITIES),
self._perm_lookup)
return compile_entities(self._policy.get(CAT_ENTITIES), self._perm_lookup)
def __eq__(self, other: Any) -> bool:
"""Equals check."""
# pylint: disable=protected-access
return (isinstance(other, PolicyPermissions) and
other._policy == self._policy)
return isinstance(other, PolicyPermissions) and other._policy == self._policy
class _OwnerPermissions(AbstractPermissions):


@@ -1,8 +1,8 @@
"""Permission constants."""
CAT_ENTITIES = 'entities'
CAT_CONFIG_ENTRIES = 'config_entries'
SUBCAT_ALL = 'all'
CAT_ENTITIES = "entities"
CAT_CONFIG_ENTRIES = "config_entries"
SUBCAT_ALL = "all"
POLICY_READ = 'read'
POLICY_CONTROL = 'control'
POLICY_EDIT = 'edit'
POLICY_READ = "read"
POLICY_CONTROL = "control"
POLICY_EDIT = "edit"


@@ -7,51 +7,59 @@ import voluptuous as vol
from .const import SUBCAT_ALL, POLICY_READ, POLICY_CONTROL, POLICY_EDIT
from .models import PermissionLookup
from .types import CategoryType, SubCategoryDict, ValueType
# pylint: disable=unused-import
from .util import SubCatLookupType, lookup_all, compile_policy # noqa
SINGLE_ENTITY_SCHEMA = vol.Any(True, vol.Schema({
vol.Optional(POLICY_READ): True,
vol.Optional(POLICY_CONTROL): True,
vol.Optional(POLICY_EDIT): True,
}))
SINGLE_ENTITY_SCHEMA = vol.Any(
True,
vol.Schema(
{
vol.Optional(POLICY_READ): True,
vol.Optional(POLICY_CONTROL): True,
vol.Optional(POLICY_EDIT): True,
}
),
)
ENTITY_DOMAINS = 'domains'
ENTITY_AREAS = 'area_ids'
ENTITY_DEVICE_IDS = 'device_ids'
ENTITY_ENTITY_IDS = 'entity_ids'
ENTITY_DOMAINS = "domains"
ENTITY_AREAS = "area_ids"
ENTITY_DEVICE_IDS = "device_ids"
ENTITY_ENTITY_IDS = "entity_ids"
ENTITY_VALUES_SCHEMA = vol.Any(True, vol.Schema({
str: SINGLE_ENTITY_SCHEMA
}))
ENTITY_VALUES_SCHEMA = vol.Any(True, vol.Schema({str: SINGLE_ENTITY_SCHEMA}))
ENTITY_POLICY_SCHEMA = vol.Any(True, vol.Schema({
vol.Optional(SUBCAT_ALL): SINGLE_ENTITY_SCHEMA,
vol.Optional(ENTITY_AREAS): ENTITY_VALUES_SCHEMA,
vol.Optional(ENTITY_DEVICE_IDS): ENTITY_VALUES_SCHEMA,
vol.Optional(ENTITY_DOMAINS): ENTITY_VALUES_SCHEMA,
vol.Optional(ENTITY_ENTITY_IDS): ENTITY_VALUES_SCHEMA,
}))
ENTITY_POLICY_SCHEMA = vol.Any(
True,
vol.Schema(
{
vol.Optional(SUBCAT_ALL): SINGLE_ENTITY_SCHEMA,
vol.Optional(ENTITY_AREAS): ENTITY_VALUES_SCHEMA,
vol.Optional(ENTITY_DEVICE_IDS): ENTITY_VALUES_SCHEMA,
vol.Optional(ENTITY_DOMAINS): ENTITY_VALUES_SCHEMA,
vol.Optional(ENTITY_ENTITY_IDS): ENTITY_VALUES_SCHEMA,
}
),
)
def _lookup_domain(perm_lookup: PermissionLookup,
domains_dict: SubCategoryDict,
entity_id: str) -> Optional[ValueType]:
def _lookup_domain(
perm_lookup: PermissionLookup, domains_dict: SubCategoryDict, entity_id: str
) -> Optional[ValueType]:
"""Look up entity permissions by domain."""
return domains_dict.get(entity_id.split(".", 1)[0])
def _lookup_area(perm_lookup: PermissionLookup, area_dict: SubCategoryDict,
entity_id: str) -> Optional[ValueType]:
def _lookup_area(
perm_lookup: PermissionLookup, area_dict: SubCategoryDict, entity_id: str
) -> Optional[ValueType]:
"""Look up entity permissions by area."""
entity_entry = perm_lookup.entity_registry.async_get(entity_id)
if entity_entry is None or entity_entry.device_id is None:
return None
device_entry = perm_lookup.device_registry.async_get(
entity_entry.device_id
)
device_entry = perm_lookup.device_registry.async_get(entity_entry.device_id)
if device_entry is None or device_entry.area_id is None:
return None
@@ -59,9 +67,9 @@ def _lookup_area(perm_lookup: PermissionLookup, area_dict: SubCategoryDict,
return area_dict.get(device_entry.area_id)
def _lookup_device(perm_lookup: PermissionLookup,
devices_dict: SubCategoryDict,
entity_id: str) -> Optional[ValueType]:
def _lookup_device(
perm_lookup: PermissionLookup, devices_dict: SubCategoryDict, entity_id: str
) -> Optional[ValueType]:
"""Look up entity permissions by device."""
entity_entry = perm_lookup.entity_registry.async_get(entity_id)
@@ -71,15 +79,16 @@ def _lookup_device(perm_lookup: PermissionLookup,
return devices_dict.get(entity_entry.device_id)
def _lookup_entity_id(perm_lookup: PermissionLookup,
entities_dict: SubCategoryDict,
entity_id: str) -> Optional[ValueType]:
def _lookup_entity_id(
perm_lookup: PermissionLookup, entities_dict: SubCategoryDict, entity_id: str
) -> Optional[ValueType]:
"""Look up entity permission by entity id."""
return entities_dict.get(entity_id)
def compile_entities(policy: CategoryType, perm_lookup: PermissionLookup) \
-> Callable[[str, str], bool]:
def compile_entities(
policy: CategoryType, perm_lookup: PermissionLookup
) -> Callable[[str, str], bool]:
"""Compile policy into a function that tests policy."""
subcategories = OrderedDict() # type: SubCatLookupType
subcategories[ENTITY_ENTITY_IDS] = _lookup_entity_id


@@ -1,6 +1,5 @@
"""Merging of policies."""
from typing import ( # noqa: F401
cast, Dict, List, Set)
from typing import cast, Dict, List, Set # noqa: F401
from .types import PolicyType, CategoryType
@@ -14,8 +13,9 @@ def merge_policies(policies: List[PolicyType]) -> PolicyType:
if category in seen:
continue
seen.add(category)
new_policy[category] = _merge_policies([
policy.get(category) for policy in policies])
new_policy[category] = _merge_policies(
[policy.get(category) for policy in policies]
)
cast(PolicyType, new_policy)
return new_policy


@@ -5,17 +5,13 @@ import attr
if TYPE_CHECKING:
# pylint: disable=unused-import
from homeassistant.helpers import ( # noqa
entity_registry as ent_reg,
)
from homeassistant.helpers import ( # noqa
device_registry as dev_reg,
)
from homeassistant.helpers import entity_registry as ent_reg # noqa
from homeassistant.helpers import device_registry as dev_reg # noqa
@attr.s(slots=True)
class PermissionLookup:
"""Class to hold data for permission lookups."""
entity_registry = attr.ib(type='ent_reg.EntityRegistry')
device_registry = attr.ib(type='dev_reg.DeviceRegistry')
entity_registry = attr.ib(type="ent_reg.EntityRegistry")
device_registry = attr.ib(type="dev_reg.DeviceRegistry")


@@ -1,18 +1,8 @@
"""System policies."""
from .const import CAT_ENTITIES, SUBCAT_ALL, POLICY_READ
ADMIN_POLICY = {
CAT_ENTITIES: True,
}
ADMIN_POLICY = {CAT_ENTITIES: True}
USER_POLICY = {
CAT_ENTITIES: True,
}
USER_POLICY = {CAT_ENTITIES: True}
READ_ONLY_POLICY = {
CAT_ENTITIES: {
SUBCAT_ALL: {
POLICY_READ: True
}
}
}
READ_ONLY_POLICY = {CAT_ENTITIES: {SUBCAT_ALL: {POLICY_READ: True}}}
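A minimal sketch of how these policy dictionaries read, using the constants defined above; it mirrors the observable behaviour of compile_policy for the "all" subcategory, not the actual implementation:

READ_ONLY_POLICY = {"entities": {"all": {"read": True}}}

def allowed(policy, key):
    """True grants everything; nested dicts narrow access per key."""
    category = policy.get("entities")
    if category is True:
        return True
    subcategory = (category or {}).get("all")
    if subcategory is True:
        return True
    return bool((subcategory or {}).get(key))

assert allowed(READ_ONLY_POLICY, "read") is True
assert allowed(READ_ONLY_POLICY, "control") is False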


@@ -7,17 +7,13 @@ ValueType = Union[
# Example: entities.all = { read: true, control: true }
Mapping[str, bool],
bool,
None
None,
]
# Example: entities.domains = { light: … }
SubCategoryDict = Mapping[str, ValueType]
SubCategoryType = Union[
SubCategoryDict,
bool,
None
]
SubCategoryType = Union[SubCategoryDict, bool, None]
CategoryType = Union[
# Example: entities.domains
@@ -25,7 +21,7 @@ CategoryType = Union[
# Example: entities.all
Mapping[str, ValueType],
bool,
None
None,
]
# Example: { entities: … }


@@ -1,34 +1,34 @@
"""Helpers to deal with permissions."""
from functools import wraps
from typing import Callable, Dict, List, Optional, Union, cast # noqa: F401
from typing import Callable, Dict, List, Optional, cast # noqa: F401
from .const import SUBCAT_ALL
from .models import PermissionLookup
from .types import CategoryType, SubCategoryDict, ValueType
LookupFunc = Callable[[PermissionLookup, SubCategoryDict, str],
Optional[ValueType]]
LookupFunc = Callable[[PermissionLookup, SubCategoryDict, str], Optional[ValueType]]
SubCatLookupType = Dict[str, LookupFunc]
def lookup_all(perm_lookup: PermissionLookup, lookup_dict: SubCategoryDict,
object_id: str) -> ValueType:
def lookup_all(
perm_lookup: PermissionLookup, lookup_dict: SubCategoryDict, object_id: str
) -> ValueType:
"""Look up permission for all."""
# In case of ALL category, lookup_dict IS the schema.
return cast(ValueType, lookup_dict)
def compile_policy(
policy: CategoryType, subcategories: SubCatLookupType,
perm_lookup: PermissionLookup
) -> Callable[[str, str], bool]: # noqa
policy: CategoryType, subcategories: SubCatLookupType, perm_lookup: PermissionLookup
) -> Callable[[str, str], bool]: # noqa
"""Compile policy into a function that tests policy.
Subcategories are mapping key -> lookup function, ordered by highest
priority first.
"""
# None, False, empty dict
if not policy:
def apply_policy_deny_all(entity_id: str, key: str) -> bool:
"""Decline all."""
return False
@@ -36,6 +36,7 @@ def compile_policy(
return apply_policy_deny_all
if policy is True:
def apply_policy_allow_all(entity_id: str, key: str) -> bool:
"""Approve all."""
return True
@@ -44,7 +45,7 @@ def compile_policy(
assert isinstance(policy, dict)
funcs = [] # type: List[Callable[[str, str], Union[None, bool]]]
funcs = [] # type: List[Callable[[str, str], Optional[bool]]]
for key, lookup_func in subcategories.items():
lookup_value = policy.get(key)
@@ -54,8 +55,7 @@ def compile_policy(
return lambda object_id, key: True
if lookup_value is not None:
funcs.append(_gen_dict_test_func(
perm_lookup, lookup_func, lookup_value))
funcs.append(_gen_dict_test_func(perm_lookup, lookup_func, lookup_value))
if len(funcs) == 1:
func = funcs[0]
@@ -79,15 +79,13 @@ def compile_policy(
def _gen_dict_test_func(
perm_lookup: PermissionLookup,
lookup_func: LookupFunc,
lookup_dict: SubCategoryDict
) -> Callable[[str, str], Optional[bool]]: # noqa
perm_lookup: PermissionLookup, lookup_func: LookupFunc, lookup_dict: SubCategoryDict
) -> Callable[[str, str], Optional[bool]]: # noqa
"""Generate a lookup function."""
def test_value(object_id: str, key: str) -> Optional[bool]:
"""Test if permission is allowed based on the keys."""
schema = lookup_func(
perm_lookup, lookup_dict, object_id) # type: ValueType
schema = lookup_func(perm_lookup, lookup_dict, object_id) # type: ValueType
if schema is None or isinstance(schema, bool):
return schema


@@ -19,25 +19,29 @@ from ..const import MFA_SESSION_EXPIRATION
from ..models import Credentials, User, UserMeta # noqa: F401
_LOGGER = logging.getLogger(__name__)
DATA_REQS = 'auth_prov_reqs_processed'
DATA_REQS = "auth_prov_reqs_processed"
AUTH_PROVIDERS = Registry()
AUTH_PROVIDER_SCHEMA = vol.Schema({
vol.Required(CONF_TYPE): str,
vol.Optional(CONF_NAME): str,
# Specify ID if you have two auth providers for same type.
vol.Optional(CONF_ID): str,
}, extra=vol.ALLOW_EXTRA)
AUTH_PROVIDER_SCHEMA = vol.Schema(
{
vol.Required(CONF_TYPE): str,
vol.Optional(CONF_NAME): str,
# Specify ID if you have two auth providers for same type.
vol.Optional(CONF_ID): str,
},
extra=vol.ALLOW_EXTRA,
)
class AuthProvider:
"""Provider of user authentication."""
DEFAULT_TITLE = 'Unnamed auth provider'
DEFAULT_TITLE = "Unnamed auth provider"
def __init__(self, hass: HomeAssistant, store: AuthStore,
config: Dict[str, Any]) -> None:
def __init__(
self, hass: HomeAssistant, store: AuthStore, config: Dict[str, Any]
) -> None:
"""Initialize an auth provider."""
self.hass = hass
self.store = store
@@ -73,22 +77,22 @@ class AuthProvider:
credentials
for user in users
for credentials in user.credentials
if (credentials.auth_provider_type == self.type and
credentials.auth_provider_id == self.id)
if (
credentials.auth_provider_type == self.type
and credentials.auth_provider_id == self.id
)
]
@callback
def async_create_credentials(self, data: Dict[str, str]) -> Credentials:
"""Create credentials."""
return Credentials(
auth_provider_type=self.type,
auth_provider_id=self.id,
data=data,
auth_provider_type=self.type, auth_provider_id=self.id, data=data
)
# Implement by extending class
async def async_login_flow(self, context: Optional[Dict]) -> 'LoginFlow':
async def async_login_flow(self, context: Optional[Dict]) -> "LoginFlow":
"""Return the data flow for logging in with auth provider.
Auth provider should extend LoginFlow and return an instance.
@@ -96,22 +100,28 @@ class AuthProvider:
raise NotImplementedError
async def async_get_or_create_credentials(
self, flow_result: Dict[str, str]) -> Credentials:
self, flow_result: Dict[str, str]
) -> Credentials:
"""Get credentials based on the flow result."""
raise NotImplementedError
async def async_user_meta_for_credentials(
self, credentials: Credentials) -> UserMeta:
self, credentials: Credentials
) -> UserMeta:
"""Return extra user metadata for credentials.
Will be used to populate info when creating a new user.
"""
raise NotImplementedError
async def async_initialize(self) -> None:
"""Initialize the auth provider."""
pass
async def auth_provider_from_config(
hass: HomeAssistant, store: AuthStore,
config: Dict[str, Any]) -> AuthProvider:
hass: HomeAssistant, store: AuthStore, config: Dict[str, Any]
) -> AuthProvider:
"""Initialize an auth provider from a config."""
provider_name = config[CONF_TYPE]
module = await load_auth_provider_module(hass, provider_name)
@@ -119,25 +129,31 @@ async def auth_provider_from_config(
try:
config = module.CONFIG_SCHEMA(config) # type: ignore
except vol.Invalid as err:
_LOGGER.error('Invalid configuration for auth provider %s: %s',
provider_name, humanize_error(config, err))
_LOGGER.error(
"Invalid configuration for auth provider %s: %s",
provider_name,
humanize_error(config, err),
)
raise
return AUTH_PROVIDERS[provider_name](hass, store, config) # type: ignore
async def load_auth_provider_module(
hass: HomeAssistant, provider: str) -> types.ModuleType:
hass: HomeAssistant, provider: str
) -> types.ModuleType:
"""Load an auth provider."""
try:
module = importlib.import_module(
'homeassistant.auth.providers.{}'.format(provider))
"homeassistant.auth.providers.{}".format(provider)
)
except ImportError as err:
_LOGGER.error('Unable to load auth provider %s: %s', provider, err)
raise HomeAssistantError('Unable to load auth provider {}: {}'.format(
provider, err))
_LOGGER.error("Unable to load auth provider %s: %s", provider, err)
raise HomeAssistantError(
"Unable to load auth provider {}: {}".format(provider, err)
)
if hass.config.skip_pip or not hasattr(module, 'REQUIREMENTS'):
if hass.config.skip_pip or not hasattr(module, "REQUIREMENTS"):
return module
processed = hass.data.get(DATA_REQS)
@@ -150,12 +166,13 @@ async def load_auth_provider_module(
# https://github.com/python/mypy/issues/1424
reqs = module.REQUIREMENTS # type: ignore
req_success = await requirements.async_process_requirements(
hass, 'auth provider {}'.format(provider), reqs)
hass, "auth provider {}".format(provider), reqs
)
if not req_success:
raise HomeAssistantError(
'Unable to process requirements of auth provider {}'.format(
provider))
"Unable to process requirements of auth provider {}".format(provider)
)
processed.add(provider)
return module
@@ -175,8 +192,8 @@ class LoginFlow(data_entry_flow.FlowHandler):
self.user = None # type: Optional[User]
async def async_step_init(
self, user_input: Optional[Dict[str, str]] = None) \
-> Dict[str, Any]:
self, user_input: Optional[Dict[str, str]] = None
) -> Dict[str, Any]:
"""Handle the first step of login flow.
Return self.async_show_form(step_id='init') if user_input is None.
@@ -185,80 +202,75 @@ class LoginFlow(data_entry_flow.FlowHandler):
raise NotImplementedError
async def async_step_select_mfa_module(
self, user_input: Optional[Dict[str, str]] = None) \
-> Dict[str, Any]:
self, user_input: Optional[Dict[str, str]] = None
) -> Dict[str, Any]:
"""Handle the step of select mfa module."""
errors = {}
if user_input is not None:
auth_module = user_input.get('multi_factor_auth_module')
auth_module = user_input.get("multi_factor_auth_module")
if auth_module in self.available_mfa_modules:
self._auth_module_id = auth_module
return await self.async_step_mfa()
errors['base'] = 'invalid_auth_module'
errors["base"] = "invalid_auth_module"
if len(self.available_mfa_modules) == 1:
self._auth_module_id = list(self.available_mfa_modules.keys())[0]
return await self.async_step_mfa()
return self.async_show_form(
step_id='select_mfa_module',
data_schema=vol.Schema({
'multi_factor_auth_module': vol.In(self.available_mfa_modules)
}),
step_id="select_mfa_module",
data_schema=vol.Schema(
{"multi_factor_auth_module": vol.In(self.available_mfa_modules)}
),
errors=errors,
)
async def async_step_mfa(
self, user_input: Optional[Dict[str, str]] = None) \
-> Dict[str, Any]:
self, user_input: Optional[Dict[str, str]] = None
) -> Dict[str, Any]:
"""Handle the step of mfa validation."""
assert self.user
errors = {}
auth_module = self._auth_manager.get_auth_mfa_module(
self._auth_module_id)
auth_module = self._auth_manager.get_auth_mfa_module(self._auth_module_id)
if auth_module is None:
# Given an invalid input to async_step_select_mfa_module
# will show invalid_auth_module error
return await self.async_step_select_mfa_module(user_input={})
if user_input is None and hasattr(auth_module,
'async_initialize_login_mfa_step'):
if user_input is None and hasattr(
auth_module, "async_initialize_login_mfa_step"
):
try:
await auth_module.async_initialize_login_mfa_step(self.user.id)
except HomeAssistantError:
_LOGGER.exception('Error initializing MFA step')
return self.async_abort(reason='unknown_error')
_LOGGER.exception("Error initializing MFA step")
return self.async_abort(reason="unknown_error")
if user_input is not None:
expires = self.created_at + MFA_SESSION_EXPIRATION
if dt_util.utcnow() > expires:
return self.async_abort(
reason='login_expired'
)
return self.async_abort(reason="login_expired")
result = await auth_module.async_validate(
self.user.id, user_input)
result = await auth_module.async_validate(self.user.id, user_input)
if not result:
errors['base'] = 'invalid_code'
errors["base"] = "invalid_code"
self.invalid_mfa_times += 1
if self.invalid_mfa_times >= auth_module.MAX_RETRY_TIME > 0:
return self.async_abort(
reason='too_many_retry'
)
return self.async_abort(reason="too_many_retry")
if not errors:
return await self.async_finish(self.user)
description_placeholders = {
'mfa_module_name': auth_module.name,
'mfa_module_id': auth_module.id,
"mfa_module_name": auth_module.name,
"mfa_module_id": auth_module.id,
} # type: Dict[str, Optional[str]]
return self.async_show_form(
step_id='mfa',
step_id="mfa",
data_schema=auth_module.input_schema,
description_placeholders=description_placeholders,
errors=errors,
@@ -266,7 +278,4 @@ class LoginFlow(data_entry_flow.FlowHandler):
async def async_finish(self, flow_result: Any) -> Dict:
"""Handle the pass of login flow."""
return self.async_create_entry(
title=self._auth_provider.name,
data=flow_result
)
return self.async_create_entry(title=self._auth_provider.name, data=flow_result)


@@ -19,15 +19,16 @@ CONF_COMMAND = "command"
CONF_ARGS = "args"
CONF_META = "meta"
CONFIG_SCHEMA = AUTH_PROVIDER_SCHEMA.extend({
vol.Required(CONF_COMMAND): vol.All(
str,
os.path.normpath,
msg="must be an absolute path"
),
vol.Optional(CONF_ARGS, default=None): vol.Any(vol.DefaultTo(list), [str]),
vol.Optional(CONF_META, default=False): bool,
}, extra=vol.PREVENT_EXTRA)
CONFIG_SCHEMA = AUTH_PROVIDER_SCHEMA.extend(
{
vol.Required(CONF_COMMAND): vol.All(
str, os.path.normpath, msg="must be an absolute path"
),
vol.Optional(CONF_ARGS, default=None): vol.Any(vol.DefaultTo(list), [str]),
vol.Optional(CONF_META, default=False): bool,
},
extra=vol.PREVENT_EXTRA,
)
_LOGGER = logging.getLogger(__name__)
@@ -60,29 +61,27 @@ class CommandLineAuthProvider(AuthProvider):
async def async_validate_login(self, username: str, password: str) -> None:
"""Validate a username and password."""
env = {
"username": username,
"password": password,
}
env = {"username": username, "password": password}
try:
# pylint: disable=no-member
process = await asyncio.subprocess.create_subprocess_exec(
self.config[CONF_COMMAND], *self.config[CONF_ARGS],
self.config[CONF_COMMAND],
*self.config[CONF_ARGS],
env=env,
stdout=asyncio.subprocess.PIPE
if self.config[CONF_META] else None,
stdout=asyncio.subprocess.PIPE if self.config[CONF_META] else None,
)
stdout, _ = (await process.communicate())
stdout, _ = await process.communicate()
except OSError as err:
# happens when command doesn't exist or permission is denied
_LOGGER.error("Error while authenticating %r: %s",
username, err)
_LOGGER.error("Error while authenticating %r: %s", username, err)
raise InvalidAuthError
if process.returncode != 0:
_LOGGER.error("User %r failed to authenticate, command exited "
"with code %d.",
username, process.returncode)
_LOGGER.error(
"User %r failed to authenticate, command exited " "with code %d.",
username,
process.returncode,
)
raise InvalidAuthError
if self.config[CONF_META]:
@@ -103,7 +102,7 @@ class CommandLineAuthProvider(AuthProvider):
self._user_meta[username] = meta
async def async_get_or_create_credentials(
self, flow_result: Dict[str, str]
self, flow_result: Dict[str, str]
) -> Credentials:
"""Get credentials based on the flow result."""
username = flow_result["username"]
@@ -112,29 +111,24 @@ class CommandLineAuthProvider(AuthProvider):
return credential
# Create new credentials.
return self.async_create_credentials({
"username": username,
})
return self.async_create_credentials({"username": username})
async def async_user_meta_for_credentials(
self, credentials: Credentials
self, credentials: Credentials
) -> UserMeta:
"""Return extra user metadata for credentials.
Currently, only name is supported.
"""
meta = self._user_meta.get(credentials.data["username"], {})
return UserMeta(
name=meta.get("name"),
is_active=True,
)
return UserMeta(name=meta.get("name"), is_active=True)
class CommandLineLoginFlow(LoginFlow):
"""Handler for the login flow."""
async def async_step_init(
self, user_input: Optional[Dict[str, str]] = None
self, user_input: Optional[Dict[str, str]] = None
) -> Dict[str, Any]:
"""Handle the step of the form."""
errors = {}
@@ -142,10 +136,9 @@ class CommandLineLoginFlow(LoginFlow):
if user_input is not None:
user_input["username"] = user_input["username"].strip()
try:
await cast(CommandLineAuthProvider, self._auth_provider) \
.async_validate_login(
user_input["username"], user_input["password"]
)
await cast(
CommandLineAuthProvider, self._auth_provider
).async_validate_login(user_input["username"], user_input["password"])
except InvalidAuthError:
errors["base"] = "invalid_auth"
@@ -158,7 +151,5 @@ class CommandLineLoginFlow(LoginFlow):
schema["password"] = str
return self.async_show_form(
step_id="init",
data_schema=vol.Schema(schema),
errors=errors,
step_id="init", data_schema=vol.Schema(schema), errors=errors
)
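The command-line provider above runs an external program with the credentials passed as environment variables and treats a zero exit status as a valid login; stdout is only captured when meta variables are wanted. A self-contained sketch of that pattern with a hypothetical helper path (not part of the diff):

import asyncio
from typing import Optional


async def check_credentials(
    command: str, username: str, password: str
) -> Optional[bytes]:
    """Run an auth helper; return its stdout on success, None on failure."""
    process = await asyncio.create_subprocess_exec(
        command,
        env={"username": username, "password": password},
        stdout=asyncio.subprocess.PIPE,
    )
    stdout, _ = await process.communicate()
    if process.returncode != 0:
        # Mirrors the InvalidAuthError path above.
        return None
    # stdout may carry "key = value" lines, which is what CONF_META parses.
    return stdout


# Hypothetical usage:
# asyncio.get_event_loop().run_until_complete(
#     check_credentials("/usr/local/bin/check_login", "alice", "hunter2")
# )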


@@ -19,14 +19,13 @@ from ..models import Credentials, UserMeta
STORAGE_VERSION = 1
STORAGE_KEY = 'auth_provider.homeassistant'
STORAGE_KEY = "auth_provider.homeassistant"
def _disallow_id(conf: Dict[str, Any]) -> Dict[str, Any]:
"""Disallow ID in config."""
if CONF_ID in conf:
raise vol.Invalid(
'ID is not allowed for the homeassistant auth provider.')
raise vol.Invalid("ID is not allowed for the homeassistant auth provider.")
return conf
@@ -51,8 +50,9 @@ class Data:
def __init__(self, hass: HomeAssistant) -> None:
"""Initialize the user data store."""
self.hass = hass
self._store = hass.helpers.storage.Store(STORAGE_VERSION, STORAGE_KEY,
private=True)
self._store = hass.helpers.storage.Store(
STORAGE_VERSION, STORAGE_KEY, private=True
)
self._data = None # type: Optional[Dict[str, Any]]
# Legacy mode will allow usernames to start/end with whitespace
# and will compare usernames case-insensitive.
@@ -72,14 +72,12 @@ class Data:
data = await self._store.async_load()
if data is None:
data = {
'users': []
}
data = {"users": []}
seen = set() # type: Set[str]
for user in data['users']:
username = user['username']
for user in data["users"]:
username = user["username"]
# check if we have duplicates
folded = username.casefold()
@@ -90,7 +88,9 @@ class Data:
logging.getLogger(__name__).warning(
"Home Assistant auth provider is running in legacy mode "
"because we detected usernames that are case-insensitive"
"equivalent. Please change the username: '%s'.", username)
"equivalent. Please change the username: '%s'.",
username,
)
break
@@ -103,7 +103,9 @@ class Data:
logging.getLogger(__name__).warning(
"Home Assistant auth provider is running in legacy mode "
"because we detected usernames that start or end in a "
"space. Please change the username: '%s'.", username)
"space. Please change the username: '%s'.",
username,
)
break
@@ -112,7 +114,7 @@ class Data:
@property
def users(self) -> List[Dict[str, str]]:
"""Return users."""
return self._data['users'] # type: ignore
return self._data["users"] # type: ignore
def validate_login(self, username: str, password: str) -> None:
"""Validate a username and password.
@@ -120,32 +122,30 @@ class Data:
Raises InvalidAuth if auth invalid.
"""
username = self.normalize_username(username)
dummy = b'$2b$12$CiuFGszHx9eNHxPuQcwBWez4CwDTOcLTX5CbOpV6gef2nYuXkY7BO'
dummy = b"$2b$12$CiuFGszHx9eNHxPuQcwBWez4CwDTOcLTX5CbOpV6gef2nYuXkY7BO"
found = None
# Compare all users to avoid timing attacks.
for user in self.users:
if self.normalize_username(user['username']) == username:
if self.normalize_username(user["username"]) == username:
found = user
if found is None:
# check a hash to make timing the same as if user was found
bcrypt.checkpw(b'foo',
dummy)
bcrypt.checkpw(b"foo", dummy)
raise InvalidAuth
user_hash = base64.b64decode(found['password'])
user_hash = base64.b64decode(found["password"])
# bcrypt.checkpw is timing-safe
if not bcrypt.checkpw(password.encode(),
user_hash):
if not bcrypt.checkpw(password.encode(), user_hash):
raise InvalidAuth
# pylint: disable=no-self-use
def hash_password(self, password: str, for_storage: bool = False) -> bytes:
"""Encode a password."""
hashed = bcrypt.hashpw(password.encode(), bcrypt.gensalt(rounds=12)) \
# type: bytes
hashed: bytes = bcrypt.hashpw(password.encode(), bcrypt.gensalt(rounds=12))
if for_storage:
hashed = base64.b64encode(hashed)
return hashed
@@ -154,14 +154,17 @@ class Data:
"""Add a new authenticated user/pass."""
username = self.normalize_username(username)
if any(self.normalize_username(user['username']) == username
for user in self.users):
if any(
self.normalize_username(user["username"]) == username for user in self.users
):
raise InvalidUser
self.users.append({
'username': username,
'password': self.hash_password(password, True).decode(),
})
self.users.append(
{
"username": username,
"password": self.hash_password(password, True).decode(),
}
)
@callback
def async_remove_auth(self, username: str) -> None:
@@ -170,7 +173,7 @@ class Data:
index = None
for i, user in enumerate(self.users):
if self.normalize_username(user['username']) == username:
if self.normalize_username(user["username"]) == username:
index = i
break
@@ -187,9 +190,8 @@ class Data:
username = self.normalize_username(username)
for user in self.users:
if self.normalize_username(user['username']) == username:
user['password'] = self.hash_password(
new_password, True).decode()
if self.normalize_username(user["username"]) == username:
user["password"] = self.hash_password(new_password, True).decode()
break
else:
raise InvalidUser
@@ -199,11 +201,11 @@ class Data:
await self._store.async_save(self._data)
@AUTH_PROVIDERS.register('homeassistant')
@AUTH_PROVIDERS.register("homeassistant")
class HassAuthProvider(AuthProvider):
"""Auth provider based on a local storage of users in HASS config dir."""
DEFAULT_TITLE = 'Home Assistant Local'
DEFAULT_TITLE = "Home Assistant Local"
def __init__(self, *args: Any, **kwargs: Any) -> None:
"""Initialize an Home Assistant auth provider."""
@@ -221,8 +223,7 @@ class HassAuthProvider(AuthProvider):
await data.async_load()
self.data = data
async def async_login_flow(
self, context: Optional[Dict]) -> LoginFlow:
async def async_login_flow(self, context: Optional[Dict]) -> LoginFlow:
"""Return a flow to login."""
return HassLoginFlow(self)
@@ -233,41 +234,41 @@ class HassAuthProvider(AuthProvider):
assert self.data is not None
await self.hass.async_add_executor_job(
self.data.validate_login, username, password)
self.data.validate_login, username, password
)
async def async_get_or_create_credentials(
self, flow_result: Dict[str, str]) -> Credentials:
self, flow_result: Dict[str, str]
) -> Credentials:
"""Get credentials based on the flow result."""
if self.data is None:
await self.async_initialize()
assert self.data is not None
norm_username = self.data.normalize_username
username = norm_username(flow_result['username'])
username = norm_username(flow_result["username"])
for credential in await self.async_credentials():
if norm_username(credential.data['username']) == username:
if norm_username(credential.data["username"]) == username:
return credential
# Create new credentials.
return self.async_create_credentials({
'username': username
})
return self.async_create_credentials({"username": username})
async def async_user_meta_for_credentials(
self, credentials: Credentials) -> UserMeta:
self, credentials: Credentials
) -> UserMeta:
"""Get extra info for this credential."""
return UserMeta(name=credentials.data['username'], is_active=True)
return UserMeta(name=credentials.data["username"], is_active=True)
async def async_will_remove_credentials(
self, credentials: Credentials) -> None:
async def async_will_remove_credentials(self, credentials: Credentials) -> None:
"""When credentials get removed, also remove the auth."""
if self.data is None:
await self.async_initialize()
assert self.data is not None
try:
self.data.async_remove_auth(credentials.data['username'])
self.data.async_remove_auth(credentials.data["username"])
await self.data.async_save()
except InvalidUser:
# Can happen if somehow we didn't clean up a credential
@@ -278,29 +279,27 @@ class HassLoginFlow(LoginFlow):
"""Handler for the login flow."""
async def async_step_init(
self, user_input: Optional[Dict[str, str]] = None) \
-> Dict[str, Any]:
self, user_input: Optional[Dict[str, str]] = None
) -> Dict[str, Any]:
"""Handle the step of the form."""
errors = {}
if user_input is not None:
try:
await cast(HassAuthProvider, self._auth_provider)\
.async_validate_login(user_input['username'],
user_input['password'])
await cast(HassAuthProvider, self._auth_provider).async_validate_login(
user_input["username"], user_input["password"]
)
except InvalidAuth:
errors['base'] = 'invalid_auth'
errors["base"] = "invalid_auth"
if not errors:
user_input.pop('password')
user_input.pop("password")
return await self.async_finish(user_input)
schema = OrderedDict() # type: Dict[str, type]
schema['username'] = str
schema['password'] = str
schema["username"] = str
schema["password"] = str
return self.async_show_form(
step_id='init',
data_schema=vol.Schema(schema),
errors=errors,
step_id="init", data_schema=vol.Schema(schema), errors=errors
)
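The Data store above keeps base64-encoded bcrypt hashes and always performs one bcrypt check, even when the username is unknown, so timing does not reveal whether an account exists. A minimal standalone sketch of that idea (the bcrypt package is assumed):

import base64
from typing import Optional

import bcrypt

# Checked when the username is unknown so both paths cost one bcrypt round.
_DUMMY_HASH = bcrypt.hashpw(b"dummy", bcrypt.gensalt(rounds=12))


def hash_for_storage(password: str) -> str:
    """Hash a password and base64-encode it for a text or JSON store."""
    hashed = bcrypt.hashpw(password.encode(), bcrypt.gensalt(rounds=12))
    return base64.b64encode(hashed).decode()


def verify(stored: Optional[str], password: str) -> bool:
    """Check a password, doing equal work whether or not the user exists."""
    if stored is None:
        bcrypt.checkpw(password.encode(), _DUMMY_HASH)
        return False
    return bcrypt.checkpw(password.encode(), base64.b64decode(stored))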


@@ -12,23 +12,25 @@ from . import AuthProvider, AUTH_PROVIDER_SCHEMA, AUTH_PROVIDERS, LoginFlow
from ..models import Credentials, UserMeta
USER_SCHEMA = vol.Schema({
vol.Required('username'): str,
vol.Required('password'): str,
vol.Optional('name'): str,
})
USER_SCHEMA = vol.Schema(
{
vol.Required("username"): str,
vol.Required("password"): str,
vol.Optional("name"): str,
}
)
CONFIG_SCHEMA = AUTH_PROVIDER_SCHEMA.extend({
vol.Required('users'): [USER_SCHEMA]
}, extra=vol.PREVENT_EXTRA)
CONFIG_SCHEMA = AUTH_PROVIDER_SCHEMA.extend(
{vol.Required("users"): [USER_SCHEMA]}, extra=vol.PREVENT_EXTRA
)
class InvalidAuthError(HomeAssistantError):
"""Raised when submitting invalid authentication."""
@AUTH_PROVIDERS.register('insecure_example')
@AUTH_PROVIDERS.register("insecure_example")
class ExampleAuthProvider(AuthProvider):
"""Example auth provider based on hardcoded usernames and passwords."""
@@ -42,47 +44,48 @@ class ExampleAuthProvider(AuthProvider):
user = None
# Compare all users to avoid timing attacks.
for usr in self.config['users']:
if hmac.compare_digest(username.encode('utf-8'),
usr['username'].encode('utf-8')):
for usr in self.config["users"]:
if hmac.compare_digest(
username.encode("utf-8"), usr["username"].encode("utf-8")
):
user = usr
if user is None:
# Do one more compare to make timing the same as if user was found.
hmac.compare_digest(password.encode('utf-8'),
password.encode('utf-8'))
hmac.compare_digest(password.encode("utf-8"), password.encode("utf-8"))
raise InvalidAuthError
if not hmac.compare_digest(user['password'].encode('utf-8'),
password.encode('utf-8')):
if not hmac.compare_digest(
user["password"].encode("utf-8"), password.encode("utf-8")
):
raise InvalidAuthError
async def async_get_or_create_credentials(
self, flow_result: Dict[str, str]) -> Credentials:
self, flow_result: Dict[str, str]
) -> Credentials:
"""Get credentials based on the flow result."""
username = flow_result['username']
username = flow_result["username"]
for credential in await self.async_credentials():
if credential.data['username'] == username:
if credential.data["username"] == username:
return credential
# Create new credentials.
return self.async_create_credentials({
'username': username
})
return self.async_create_credentials({"username": username})
async def async_user_meta_for_credentials(
self, credentials: Credentials) -> UserMeta:
self, credentials: Credentials
) -> UserMeta:
"""Return extra user metadata for credentials.
Will be used to populate info when creating a new user.
"""
username = credentials.data['username']
username = credentials.data["username"]
name = None
for user in self.config['users']:
if user['username'] == username:
name = user.get('name')
for user in self.config["users"]:
if user["username"] == username:
name = user.get("name")
break
return UserMeta(name=name, is_active=True)
@@ -92,29 +95,27 @@ class ExampleLoginFlow(LoginFlow):
"""Handler for the login flow."""
async def async_step_init(
self, user_input: Optional[Dict[str, str]] = None) \
-> Dict[str, Any]:
self, user_input: Optional[Dict[str, str]] = None
) -> Dict[str, Any]:
"""Handle the step of the form."""
errors = {}
if user_input is not None:
try:
cast(ExampleAuthProvider, self._auth_provider)\
.async_validate_login(user_input['username'],
user_input['password'])
cast(ExampleAuthProvider, self._auth_provider).async_validate_login(
user_input["username"], user_input["password"]
)
except InvalidAuthError:
errors['base'] = 'invalid_auth'
errors["base"] = "invalid_auth"
if not errors:
user_input.pop('password')
user_input.pop("password")
return await self.async_finish(user_input)
schema = OrderedDict() # type: Dict[str, type]
schema['username'] = str
schema['password'] = str
schema["username"] = str
schema["password"] = str
return self.async_show_form(
step_id='init',
data_schema=vol.Schema(schema),
errors=errors,
step_id="init", data_schema=vol.Schema(schema), errors=errors
)
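The example provider compares every configured user with hmac.compare_digest so a failed lookup does not short-circuit on the first mismatching byte. The lookup pattern in isolation, independent of the provider classes:

import hmac
from typing import Dict, List, Optional

User = Dict[str, str]


def find_user(users: List[User], username: str, password: str) -> Optional[User]:
    """Scan the full user list and compare without early exits."""
    match = None
    for user in users:
        if hmac.compare_digest(
            username.encode("utf-8"), user["username"].encode("utf-8")
        ):
            match = user
    if match is None:
        # Burn one comparison so the unknown-user path is not faster.
        hmac.compare_digest(password.encode("utf-8"), password.encode("utf-8"))
        return None
    if not hmac.compare_digest(
        match["password"].encode("utf-8"), password.encode("utf-8")
    ):
        return None
    return match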


@@ -16,27 +16,26 @@ from . import AuthProvider, AUTH_PROVIDER_SCHEMA, AUTH_PROVIDERS, LoginFlow
from .. import AuthManager
from ..models import Credentials, UserMeta, User
AUTH_PROVIDER_TYPE = 'legacy_api_password'
CONF_API_PASSWORD = 'api_password'
AUTH_PROVIDER_TYPE = "legacy_api_password"
CONF_API_PASSWORD = "api_password"
CONFIG_SCHEMA = AUTH_PROVIDER_SCHEMA.extend({
vol.Required(CONF_API_PASSWORD): cv.string,
}, extra=vol.PREVENT_EXTRA)
CONFIG_SCHEMA = AUTH_PROVIDER_SCHEMA.extend(
{vol.Required(CONF_API_PASSWORD): cv.string}, extra=vol.PREVENT_EXTRA
)
LEGACY_USER_NAME = 'Legacy API password user'
LEGACY_USER_NAME = "Legacy API password user"
class InvalidAuthError(HomeAssistantError):
"""Raised when submitting invalid authentication."""
async def async_validate_password(hass: HomeAssistant, password: str)\
-> Optional[User]:
async def async_validate_password(hass: HomeAssistant, password: str) -> Optional[User]:
"""Return a user if password is valid. None if not."""
auth = cast(AuthManager, hass.auth) # type: ignore
providers = auth.get_auth_providers(AUTH_PROVIDER_TYPE)
if not providers:
raise ValueError('Legacy API password provider not found')
raise ValueError("Legacy API password provider not found")
try:
provider = cast(LegacyApiPasswordAuthProvider, providers[0])
@@ -52,7 +51,7 @@ async def async_validate_password(hass: HomeAssistant, password: str)\
class LegacyApiPasswordAuthProvider(AuthProvider):
"""An auth provider support legacy api_password."""
DEFAULT_TITLE = 'Legacy API Password'
DEFAULT_TITLE = "Legacy API Password"
@property
def api_password(self) -> str:
@@ -68,12 +67,14 @@ class LegacyApiPasswordAuthProvider(AuthProvider):
"""Validate password."""
api_password = str(self.config[CONF_API_PASSWORD])
if not hmac.compare_digest(api_password.encode('utf-8'),
password.encode('utf-8')):
if not hmac.compare_digest(
api_password.encode("utf-8"), password.encode("utf-8")
):
raise InvalidAuthError
async def async_get_or_create_credentials(
self, flow_result: Dict[str, str]) -> Credentials:
self, flow_result: Dict[str, str]
) -> Credentials:
"""Return credentials for this login."""
credentials = await self.async_credentials()
if credentials:
@@ -82,7 +83,8 @@ class LegacyApiPasswordAuthProvider(AuthProvider):
return self.async_create_credentials({})
async def async_user_meta_for_credentials(
self, credentials: Credentials) -> UserMeta:
self, credentials: Credentials
) -> UserMeta:
"""
Return info for the user.
@@ -95,23 +97,22 @@ class LegacyLoginFlow(LoginFlow):
"""Handler for the login flow."""
async def async_step_init(
self, user_input: Optional[Dict[str, str]] = None) \
-> Dict[str, Any]:
self, user_input: Optional[Dict[str, str]] = None
) -> Dict[str, Any]:
"""Handle the step of the form."""
errors = {}
if user_input is not None:
try:
cast(LegacyApiPasswordAuthProvider, self._auth_provider)\
.async_validate_login(user_input['password'])
cast(
LegacyApiPasswordAuthProvider, self._auth_provider
).async_validate_login(user_input["password"])
except InvalidAuthError:
errors['base'] = 'invalid_auth'
errors["base"] = "invalid_auth"
if not errors:
return await self.async_finish({})
return self.async_show_form(
step_id='init',
data_schema=vol.Schema({'password': str}),
errors=errors,
step_id="init", data_schema=vol.Schema({"password": str}), errors=errors
)


@@ -3,8 +3,7 @@
It shows list of users if access from trusted network.
Abort login flow if not access from trusted network.
"""
from ipaddress import ip_network, IPv4Address, IPv6Address, IPv4Network,\
IPv6Network
from ipaddress import ip_network, IPv4Address, IPv6Address, IPv4Network, IPv6Network
from typing import Any, Dict, List, Optional, Union, cast
import voluptuous as vol
@@ -18,27 +17,32 @@ from ..models import Credentials, UserMeta
IPAddress = Union[IPv4Address, IPv6Address]
IPNetwork = Union[IPv4Network, IPv6Network]
CONF_TRUSTED_NETWORKS = 'trusted_networks'
CONF_TRUSTED_USERS = 'trusted_users'
CONF_GROUP = 'group'
CONF_ALLOW_BYPASS_LOGIN = 'allow_bypass_login'
CONF_TRUSTED_NETWORKS = "trusted_networks"
CONF_TRUSTED_USERS = "trusted_users"
CONF_GROUP = "group"
CONF_ALLOW_BYPASS_LOGIN = "allow_bypass_login"
CONFIG_SCHEMA = AUTH_PROVIDER_SCHEMA.extend({
vol.Required(CONF_TRUSTED_NETWORKS): vol.All(
cv.ensure_list, [ip_network]
),
vol.Optional(CONF_TRUSTED_USERS, default={}): vol.Schema(
# we only validate the format of user_id or group_id
{ip_network: vol.All(
cv.ensure_list,
[vol.Or(
cv.uuid4_hex,
vol.Schema({vol.Required(CONF_GROUP): cv.uuid4_hex}),
)],
)}
),
vol.Optional(CONF_ALLOW_BYPASS_LOGIN, default=False): cv.boolean,
}, extra=vol.PREVENT_EXTRA)
CONFIG_SCHEMA = AUTH_PROVIDER_SCHEMA.extend(
{
vol.Required(CONF_TRUSTED_NETWORKS): vol.All(cv.ensure_list, [ip_network]),
vol.Optional(CONF_TRUSTED_USERS, default={}): vol.Schema(
# we only validate the format of user_id or group_id
{
ip_network: vol.All(
cv.ensure_list,
[
vol.Or(
cv.uuid4_hex,
vol.Schema({vol.Required(CONF_GROUP): cv.uuid4_hex}),
)
],
)
}
),
vol.Optional(CONF_ALLOW_BYPASS_LOGIN, default=False): cv.boolean,
},
extra=vol.PREVENT_EXTRA,
)
class InvalidAuthError(HomeAssistantError):
@@ -49,14 +53,14 @@ class InvalidUserError(HomeAssistantError):
"""Raised when try to login as invalid user."""
@AUTH_PROVIDERS.register('trusted_networks')
@AUTH_PROVIDERS.register("trusted_networks")
class TrustedNetworksAuthProvider(AuthProvider):
"""Trusted Networks auth provider.
Allow passwordless access from trusted network.
"""
DEFAULT_TITLE = 'Trusted Networks'
DEFAULT_TITLE = "Trusted Networks"
@property
def trusted_networks(self) -> List[IPNetwork]:
@@ -76,49 +80,58 @@ class TrustedNetworksAuthProvider(AuthProvider):
async def async_login_flow(self, context: Optional[Dict]) -> LoginFlow:
"""Return a flow to login."""
assert context is not None
ip_addr = cast(IPAddress, context.get('ip_address'))
ip_addr = cast(IPAddress, context.get("ip_address"))
users = await self.store.async_get_users()
available_users = [user for user in users
if not user.system_generated and user.is_active]
available_users = [
user for user in users if not user.system_generated and user.is_active
]
for ip_net, user_or_group_list in self.trusted_users.items():
if ip_addr in ip_net:
user_list = [user_id for user_id in user_or_group_list
if isinstance(user_id, str)]
group_list = [group[CONF_GROUP] for group in user_or_group_list
if isinstance(group, dict)]
flattened_group_list = [group for sublist in group_list
for group in sublist]
user_list = [
user_id
for user_id in user_or_group_list
if isinstance(user_id, str)
]
group_list = [
group[CONF_GROUP]
for group in user_or_group_list
if isinstance(group, dict)
]
flattened_group_list = [
group for sublist in group_list for group in sublist
]
available_users = [
user for user in available_users
if (user.id in user_list or
any([group.id in flattened_group_list
for group in user.groups]))
user
for user in available_users
if (
user.id in user_list
or any(
[group.id in flattened_group_list for group in user.groups]
)
)
]
break
return TrustedNetworksLoginFlow(
self,
ip_addr,
{
user.id: user.name for user in available_users
},
{user.id: user.name for user in available_users},
self.config[CONF_ALLOW_BYPASS_LOGIN],
)
async def async_get_or_create_credentials(
self, flow_result: Dict[str, str]) -> Credentials:
self, flow_result: Dict[str, str]
) -> Credentials:
"""Get credentials based on the flow result."""
user_id = flow_result['user']
user_id = flow_result["user"]
users = await self.store.async_get_users()
for user in users:
if (not user.system_generated and
user.is_active and
user.id == user_id):
if not user.system_generated and user.is_active and user.id == user_id:
for credential in await self.async_credentials():
if credential.data['user_id'] == user_id:
if credential.data["user_id"] == user_id:
return credential
cred = self.async_create_credentials({'user_id': user_id})
cred = self.async_create_credentials({"user_id": user_id})
await self.store.async_link_user(user, cred)
return cred
@@ -126,7 +139,8 @@ class TrustedNetworksAuthProvider(AuthProvider):
raise InvalidUserError
async def async_user_meta_for_credentials(
self, credentials: Credentials) -> UserMeta:
self, credentials: Credentials
) -> UserMeta:
"""Return extra user metadata for credentials.
Trusted network auth provider should never create new user.
@@ -141,20 +155,24 @@ class TrustedNetworksAuthProvider(AuthProvider):
Raise InvalidAuthError if trusted_networks is not configured.
"""
if not self.trusted_networks:
raise InvalidAuthError('trusted_networks is not configured')
raise InvalidAuthError("trusted_networks is not configured")
if not any(ip_addr in trusted_network for trusted_network
in self.trusted_networks):
raise InvalidAuthError('Not in trusted_networks')
if not any(
ip_addr in trusted_network for trusted_network in self.trusted_networks
):
raise InvalidAuthError("Not in trusted_networks")
class TrustedNetworksLoginFlow(LoginFlow):
"""Handler for the login flow."""
def __init__(self, auth_provider: TrustedNetworksAuthProvider,
ip_addr: IPAddress,
available_users: Dict[str, Optional[str]],
allow_bypass_login: bool) -> None:
def __init__(
self,
auth_provider: TrustedNetworksAuthProvider,
ip_addr: IPAddress,
available_users: Dict[str, Optional[str]],
allow_bypass_login: bool,
) -> None:
"""Initialize the login flow."""
super().__init__(auth_provider)
self._available_users = available_users
@@ -162,27 +180,26 @@ class TrustedNetworksLoginFlow(LoginFlow):
self._allow_bypass_login = allow_bypass_login
async def async_step_init(
self, user_input: Optional[Dict[str, str]] = None) \
-> Dict[str, Any]:
self, user_input: Optional[Dict[str, str]] = None
) -> Dict[str, Any]:
"""Handle the step of the form."""
try:
cast(TrustedNetworksAuthProvider, self._auth_provider)\
.async_validate_access(self._ip_address)
cast(
TrustedNetworksAuthProvider, self._auth_provider
).async_validate_access(self._ip_address)
except InvalidAuthError:
return self.async_abort(
reason='not_whitelisted'
)
return self.async_abort(reason="not_whitelisted")
if user_input is not None:
return await self.async_finish(user_input)
if self._allow_bypass_login and len(self._available_users) == 1:
return await self.async_finish({
'user': next(iter(self._available_users.keys()))
})
return await self.async_finish(
{"user": next(iter(self._available_users.keys()))}
)
return self.async_show_form(
step_id='init',
data_schema=vol.Schema({'user': vol.In(self._available_users)}),
step_id="init",
data_schema=vol.Schema({"user": vol.In(self._available_users)}),
)
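The trusted_networks schema above is the kind of nested voluptuous structure Black explodes one level per line. A reduced, self-contained sketch of validating a similar mapping, using only voluptuous and the standard library rather than the Home Assistant cv helpers:

from ipaddress import ip_network

import voluptuous as vol

CONF_TRUSTED_NETWORKS = "trusted_networks"
CONF_ALLOW_BYPASS_LOGIN = "allow_bypass_login"

SCHEMA = vol.Schema(
    {
        vol.Required(CONF_TRUSTED_NETWORKS): [ip_network],
        vol.Optional(CONF_ALLOW_BYPASS_LOGIN, default=False): bool,
    },
    extra=vol.PREVENT_EXTRA,
)

# Valid input: the networks are parsed into IPv4Network/IPv6Network objects
# and the bypass flag falls back to its default.
config = SCHEMA({"trusted_networks": ["192.168.1.0/24", "fd00::/8"]})
assert config["allow_bypass_login"] is False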


@@ -10,4 +10,4 @@ def generate_secret(entropy: int = 32) -> str:
Event loop friendly.
"""
return binascii.hexlify(os.urandom(entropy)).decode('ascii')
return binascii.hexlify(os.urandom(entropy)).decode("ascii")
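For reference, the hexlify-of-urandom idiom above yields the same kind of token as the standard library's secrets.token_hex; a quick equivalence check:

import binascii
import os
import secrets


def generate_secret(entropy: int = 32) -> str:
    """Same construction as the helper above: 2 * entropy hex characters."""
    return binascii.hexlify(os.urandom(entropy)).decode("ascii")


# Both produce 64 lowercase hex characters for 32 bytes of entropy.
assert len(generate_secret()) == len(secrets.token_hex(32)) == 64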


@@ -20,32 +20,33 @@ from homeassistant.exceptions import HomeAssistantError
_LOGGER = logging.getLogger(__name__)
ERROR_LOG_FILENAME = 'home-assistant.log'
ERROR_LOG_FILENAME = "home-assistant.log"
# hass.data key for logging information.
DATA_LOGGING = 'logging'
DATA_LOGGING = "logging"
DEBUGGER_INTEGRATIONS = {'ptvsd', }
CORE_INTEGRATIONS = ('homeassistant', 'persistent_notification')
LOGGING_INTEGRATIONS = {'logger', 'system_log'}
DEBUGGER_INTEGRATIONS = {"ptvsd"}
CORE_INTEGRATIONS = ("homeassistant", "persistent_notification")
LOGGING_INTEGRATIONS = {"logger", "system_log"}
STAGE_1_INTEGRATIONS = {
# To record data
'recorder',
"recorder",
# To make sure we forward data to other instances
'mqtt_eventstream',
"mqtt_eventstream",
}
async def async_from_config_dict(config: Dict[str, Any],
hass: core.HomeAssistant,
config_dir: Optional[str] = None,
enable_log: bool = True,
verbose: bool = False,
skip_pip: bool = False,
log_rotate_days: Any = None,
log_file: Any = None,
log_no_color: bool = False) \
-> Optional[core.HomeAssistant]:
async def async_from_config_dict(
config: Dict[str, Any],
hass: core.HomeAssistant,
config_dir: Optional[str] = None,
enable_log: bool = True,
verbose: bool = False,
skip_pip: bool = False,
log_rotate_days: Any = None,
log_file: Any = None,
log_no_color: bool = False,
) -> Optional[core.HomeAssistant]:
"""Try to configure Home Assistant from a configuration dictionary.
Dynamically loads required components and its dependencies.
@@ -54,28 +55,30 @@ async def async_from_config_dict(config: Dict[str, Any],
start = time()
if enable_log:
async_enable_logging(hass, verbose, log_rotate_days, log_file,
log_no_color)
async_enable_logging(hass, verbose, log_rotate_days, log_file, log_no_color)
hass.config.skip_pip = skip_pip
if skip_pip:
_LOGGER.warning("Skipping pip installation of required modules. "
"This may cause issues")
_LOGGER.warning(
"Skipping pip installation of required modules. " "This may cause issues"
)
core_config = config.get(core.DOMAIN, {})
api_password = config.get('http', {}).get('api_password')
trusted_networks = config.get('http', {}).get('trusted_networks')
api_password = config.get("http", {}).get("api_password")
trusted_networks = config.get("http", {}).get("trusted_networks")
try:
await conf_util.async_process_ha_core_config(
hass, core_config, api_password, trusted_networks)
hass, core_config, api_password, trusted_networks
)
except vol.Invalid as config_err:
conf_util.async_log_exception(
config_err, 'homeassistant', core_config, hass)
conf_util.async_log_exception(config_err, "homeassistant", core_config, hass)
return None
except HomeAssistantError:
_LOGGER.error("Home Assistant core failed to initialize. "
"Further initialization aborted")
_LOGGER.error(
"Home Assistant core failed to initialize. "
"Further initialization aborted"
)
return None
# Make a copy because we are mutating it.
@@ -83,7 +86,8 @@ async def async_from_config_dict(config: Dict[str, Any],
# Merge packages
await conf_util.merge_packages_config(
hass, config, core_config.get(conf_util.CONF_PACKAGES, {}))
hass, config, core_config.get(conf_util.CONF_PACKAGES, {})
)
hass.config_entries = config_entries.ConfigEntries(hass, config)
await hass.config_entries.async_initialize()
@@ -91,26 +95,20 @@ async def async_from_config_dict(config: Dict[str, Any],
await _async_set_up_integrations(hass, config)
stop = time()
_LOGGER.info("Home Assistant initialized in %.2fs", stop-start)
if sys.version_info[:3] < (3, 6, 0):
hass.components.persistent_notification.async_create(
"Python 3.5 support is deprecated and will "
"be removed in the first release after August 1. Please "
"upgrade Python.", "Python version", "python_version"
)
_LOGGER.info("Home Assistant initialized in %.2fs", stop - start)
return hass
async def async_from_config_file(config_path: str,
hass: core.HomeAssistant,
verbose: bool = False,
skip_pip: bool = True,
log_rotate_days: Any = None,
log_file: Any = None,
log_no_color: bool = False)\
-> Optional[core.HomeAssistant]:
async def async_from_config_file(
config_path: str,
hass: core.HomeAssistant,
verbose: bool = False,
skip_pip: bool = True,
log_rotate_days: Any = None,
log_file: Any = None,
log_no_color: bool = False,
) -> Optional[core.HomeAssistant]:
"""Read the configuration file and try to start all the functionality.
Will add functionality to 'hass' parameter.
@@ -123,15 +121,14 @@ async def async_from_config_file(config_path: str,
if not is_virtual_env():
await async_mount_local_lib_path(config_dir)
async_enable_logging(hass, verbose, log_rotate_days, log_file,
log_no_color)
async_enable_logging(hass, verbose, log_rotate_days, log_file, log_no_color)
await hass.async_add_executor_job(
conf_util.process_ha_config_upgrade, hass)
await hass.async_add_executor_job(conf_util.process_ha_config_upgrade, hass)
try:
config_dict = await hass.async_add_executor_job(
conf_util.load_yaml_config_file, config_path)
conf_util.load_yaml_config_file, config_path
)
except HomeAssistantError as err:
_LOGGER.error("Error loading %s: %s", config_path, err)
return None
@@ -139,43 +136,48 @@ async def async_from_config_file(config_path: str,
clear_secret_cache()
return await async_from_config_dict(
config_dict, hass, enable_log=False, skip_pip=skip_pip)
config_dict, hass, enable_log=False, skip_pip=skip_pip
)
@core.callback
def async_enable_logging(hass: core.HomeAssistant,
verbose: bool = False,
log_rotate_days: Optional[int] = None,
log_file: Optional[str] = None,
log_no_color: bool = False) -> None:
def async_enable_logging(
hass: core.HomeAssistant,
verbose: bool = False,
log_rotate_days: Optional[int] = None,
log_file: Optional[str] = None,
log_no_color: bool = False,
) -> None:
"""Set up the logging.
This method must be run in the event loop.
"""
fmt = ("%(asctime)s %(levelname)s (%(threadName)s) "
"[%(name)s] %(message)s")
datefmt = '%Y-%m-%d %H:%M:%S'
fmt = "%(asctime)s %(levelname)s (%(threadName)s) " "[%(name)s] %(message)s"
datefmt = "%Y-%m-%d %H:%M:%S"
if not log_no_color:
try:
from colorlog import ColoredFormatter
# basicConfig must be called after importing colorlog in order to
# ensure that the handlers it sets up wraps the correct streams.
logging.basicConfig(level=logging.INFO)
colorfmt = "%(log_color)s{}%(reset)s".format(fmt)
logging.getLogger().handlers[0].setFormatter(ColoredFormatter(
colorfmt,
datefmt=datefmt,
reset=True,
log_colors={
'DEBUG': 'cyan',
'INFO': 'green',
'WARNING': 'yellow',
'ERROR': 'red',
'CRITICAL': 'red',
}
))
logging.getLogger().handlers[0].setFormatter(
ColoredFormatter(
colorfmt,
datefmt=datefmt,
reset=True,
log_colors={
"DEBUG": "cyan",
"INFO": "green",
"WARNING": "yellow",
"ERROR": "red",
"CRITICAL": "red",
},
)
)
except ImportError:
pass
@@ -184,9 +186,9 @@ def async_enable_logging(hass: core.HomeAssistant,
logging.basicConfig(format=fmt, datefmt=datefmt, level=logging.INFO)
# Suppress overly verbose logs from libraries that aren't helpful
logging.getLogger('requests').setLevel(logging.WARNING)
logging.getLogger('urllib3').setLevel(logging.WARNING)
logging.getLogger('aiohttp.access').setLevel(logging.WARNING)
logging.getLogger("requests").setLevel(logging.WARNING)
logging.getLogger("urllib3").setLevel(logging.WARNING)
logging.getLogger("aiohttp.access").setLevel(logging.WARNING)
# Log errors to a file if we have write access to file or config dir
if log_file is None:
@@ -199,16 +201,16 @@ def async_enable_logging(hass: core.HomeAssistant,
# Check if we can write to the error log if it exists or that
# we can create files in the containing directory if not.
if (err_path_exists and os.access(err_log_path, os.W_OK)) or \
(not err_path_exists and os.access(err_dir, os.W_OK)):
if (err_path_exists and os.access(err_log_path, os.W_OK)) or (
not err_path_exists and os.access(err_dir, os.W_OK)
):
if log_rotate_days:
err_handler = logging.handlers.TimedRotatingFileHandler(
err_log_path, when='midnight',
backupCount=log_rotate_days) # type: logging.FileHandler
err_log_path, when="midnight", backupCount=log_rotate_days
) # type: logging.FileHandler
else:
err_handler = logging.FileHandler(
err_log_path, mode='w', delay=True)
err_handler = logging.FileHandler(err_log_path, mode="w", delay=True)
err_handler.setLevel(logging.INFO if verbose else logging.WARNING)
err_handler.setFormatter(logging.Formatter(fmt, datefmt=datefmt))
@@ -217,21 +219,19 @@ def async_enable_logging(hass: core.HomeAssistant,
async def async_stop_async_handler(_: Any) -> None:
"""Cleanup async handler."""
logging.getLogger('').removeHandler(async_handler) # type: ignore
logging.getLogger("").removeHandler(async_handler) # type: ignore
await async_handler.async_close(blocking=True)
hass.bus.async_listen_once(
EVENT_HOMEASSISTANT_CLOSE, async_stop_async_handler)
hass.bus.async_listen_once(EVENT_HOMEASSISTANT_CLOSE, async_stop_async_handler)
logger = logging.getLogger('')
logger = logging.getLogger("")
logger.addHandler(async_handler) # type: ignore
logger.setLevel(logging.INFO)
# Save the log file location for access by other components.
hass.data[DATA_LOGGING] = err_log_path
else:
_LOGGER.error(
"Unable to set up error log %s (access denied)", err_log_path)
_LOGGER.error("Unable to set up error log %s (access denied)", err_log_path)
async def async_mount_local_lib_path(config_dir: str) -> str:
@@ -239,7 +239,7 @@ async def async_mount_local_lib_path(config_dir: str) -> str:
This function is a coroutine.
"""
deps_dir = os.path.join(config_dir, 'deps')
deps_dir = os.path.join(config_dir, "deps")
lib_dir = await async_get_user_site(deps_dir)
if lib_dir not in sys.path:
sys.path.insert(0, lib_dir)
@@ -250,21 +250,21 @@ async def async_mount_local_lib_path(config_dir: str) -> str:
def _get_domains(hass: core.HomeAssistant, config: Dict[str, Any]) -> Set[str]:
"""Get domains of components to set up."""
# Filter out the repeating and common config section [homeassistant]
domains = set(key.split(' ')[0] for key in config.keys()
if key != core.DOMAIN)
domains = set(key.split(" ")[0] for key in config.keys() if key != core.DOMAIN)
# Add config entry domains
domains.update(hass.config_entries.async_domains()) # type: ignore
# Make sure the Hass.io component is loaded
if 'HASSIO' in os.environ:
domains.add('hassio')
if "HASSIO" in os.environ:
domains.add("hassio")
return domains
async def _async_set_up_integrations(
hass: core.HomeAssistant, config: Dict[str, Any]) -> None:
hass: core.HomeAssistant, config: Dict[str, Any]
) -> None:
"""Set up all the integrations."""
domains = _get_domains(hass, config)
@@ -272,27 +272,33 @@ async def _async_set_up_integrations(
debuggers = domains & DEBUGGER_INTEGRATIONS
if debuggers:
_LOGGER.debug("Starting up debuggers %s", debuggers)
await asyncio.gather(*[
async_setup_component(hass, domain, config)
for domain in debuggers])
await asyncio.gather(
*(async_setup_component(hass, domain, config) for domain in debuggers)
)
domains -= DEBUGGER_INTEGRATIONS
# Resolve all dependencies of all components so we can find the logging
# and integrations that need faster initialization.
resolved_domains_task = asyncio.gather(*[
loader.async_component_dependencies(hass, domain)
for domain in domains
], return_exceptions=True)
resolved_domains_task = asyncio.gather(
*(loader.async_component_dependencies(hass, domain) for domain in domains),
return_exceptions=True,
)
# Set up core.
_LOGGER.debug("Setting up %s", CORE_INTEGRATIONS)
if not all(await asyncio.gather(*[
async_setup_component(hass, domain, config)
for domain in CORE_INTEGRATIONS
])):
_LOGGER.error("Home Assistant core failed to initialize. "
"Further initialization aborted")
if not all(
await asyncio.gather(
*(
async_setup_component(hass, domain, config)
for domain in CORE_INTEGRATIONS
)
)
):
_LOGGER.error(
"Home Assistant core failed to initialize. "
"Further initialization aborted"
)
return
_LOGGER.debug("Home Assistant core initialized")
@@ -312,36 +318,32 @@ async def _async_set_up_integrations(
if logging_domains:
_LOGGER.info("Setting up %s", logging_domains)
await asyncio.gather(*[
async_setup_component(hass, domain, config)
for domain in logging_domains
])
await asyncio.gather(
*(async_setup_component(hass, domain, config) for domain in logging_domains)
)
# Kick off loading the registries. They don't need to be awaited.
asyncio.gather(
hass.helpers.device_registry.async_get_registry(),
hass.helpers.entity_registry.async_get_registry(),
hass.helpers.area_registry.async_get_registry())
hass.helpers.area_registry.async_get_registry(),
)
if stage_1_domains:
await asyncio.gather(*[
async_setup_component(hass, domain, config)
for domain in stage_1_domains
])
await asyncio.gather(
*(async_setup_component(hass, domain, config) for domain in stage_1_domains)
)
# Load all integrations
after_dependencies = {} # type: Dict[str, Set[str]]
for int_or_exc in await asyncio.gather(*[
loader.async_get_integration(hass, domain)
for domain in stage_2_domains
], return_exceptions=True):
for int_or_exc in await asyncio.gather(
*(loader.async_get_integration(hass, domain) for domain in stage_2_domains),
return_exceptions=True,
):
# Exceptions are handled in async_setup_component.
if (isinstance(int_or_exc, loader.Integration) and
int_or_exc.after_dependencies):
after_dependencies[int_or_exc.domain] = set(
int_or_exc.after_dependencies
)
if isinstance(int_or_exc, loader.Integration) and int_or_exc.after_dependencies:
after_dependencies[int_or_exc.domain] = set(int_or_exc.after_dependencies)
last_load = None
while stage_2_domains:
@@ -351,8 +353,7 @@ async def _async_set_up_integrations(
after_deps = after_dependencies.get(domain)
# Load if integration has no after_dependencies or they are
# all loaded
if (not after_deps or
not after_deps-hass.config.components):
if not after_deps or not after_deps - hass.config.components:
domains_to_load.add(domain)
if not domains_to_load or domains_to_load == last_load:
@@ -360,10 +361,9 @@ async def _async_set_up_integrations(
_LOGGER.debug("Setting up %s", domains_to_load)
await asyncio.gather(*[
async_setup_component(hass, domain, config)
for domain in domains_to_load
])
await asyncio.gather(
*(async_setup_component(hass, domain, config) for domain in domains_to_load)
)
last_load = domains_to_load
stage_2_domains -= domains_to_load
@@ -373,10 +373,9 @@ async def _async_set_up_integrations(
if stage_2_domains:
_LOGGER.debug("Final set up: %s", stage_2_domains)
await asyncio.gather(*[
async_setup_component(hass, domain, config)
for domain in stage_2_domains
])
await asyncio.gather(
*(async_setup_component(hass, domain, config) for domain in stage_2_domains)
)
# Wrap up startup
await hass.async_block_till_done()
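Several bootstrap hunks above replace list comprehensions passed to asyncio.gather with unpacked generator expressions. A small sketch of that call shape on its own, with a stand-in setup coroutine and placeholder domain names:

import asyncio
from typing import Set


async def async_setup_component(domain: str) -> bool:
    """Stand-in for the real setup call; pretend every domain succeeds."""
    await asyncio.sleep(0)
    return True


async def set_up(domains: Set[str]) -> bool:
    # The * unpacks the generator into positional awaitables, matching
    # gather(*(async_setup_component(hass, d, config) for d in domains)) above.
    results = await asyncio.gather(
        *(async_setup_component(domain) for domain in domains)
    )
    return all(results)


# Hypothetical usage:
# asyncio.get_event_loop().run_until_complete(
#     set_up({"recorder", "mqtt_eventstream"})
# )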


@@ -31,12 +31,11 @@ def is_on(hass, entity_id=None):
component = getattr(hass.components, domain)
except ImportError:
_LOGGER.error('Failed to call %s.is_on: component not found',
domain)
_LOGGER.error("Failed to call %s.is_on: component not found", domain)
continue
if not hasattr(component, 'is_on'):
_LOGGER.warning("Component %s has no is_on method.", domain)
if not hasattr(component, "is_on"):
_LOGGER.warning("Integration %s has no is_on method.", domain)
continue
if component.is_on(ent_id):


@@ -6,9 +6,18 @@ from requests.exceptions import HTTPError, ConnectTimeout
import voluptuous as vol
from homeassistant.const import (
ATTR_ATTRIBUTION, ATTR_DATE, ATTR_TIME, ATTR_ENTITY_ID, CONF_USERNAME,
CONF_PASSWORD, CONF_EXCLUDE, CONF_NAME, CONF_LIGHTS,
EVENT_HOMEASSISTANT_STOP, EVENT_HOMEASSISTANT_START)
ATTR_ATTRIBUTION,
ATTR_DATE,
ATTR_TIME,
ATTR_ENTITY_ID,
CONF_USERNAME,
CONF_PASSWORD,
CONF_EXCLUDE,
CONF_NAME,
CONF_LIGHTS,
EVENT_HOMEASSISTANT_STOP,
EVENT_HOMEASSISTANT_START,
)
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers import discovery
from homeassistant.helpers.entity import Entity
@@ -17,77 +26,88 @@ _LOGGER = logging.getLogger(__name__)
ATTRIBUTION = "Data provided by goabode.com"
CONF_POLLING = 'polling'
CONF_POLLING = "polling"
DOMAIN = 'abode'
DEFAULT_CACHEDB = './abodepy_cache.pickle'
DOMAIN = "abode"
DEFAULT_CACHEDB = "./abodepy_cache.pickle"
NOTIFICATION_ID = 'abode_notification'
NOTIFICATION_TITLE = 'Abode Security Setup'
NOTIFICATION_ID = "abode_notification"
NOTIFICATION_TITLE = "Abode Security Setup"
EVENT_ABODE_ALARM = 'abode_alarm'
EVENT_ABODE_ALARM_END = 'abode_alarm_end'
EVENT_ABODE_AUTOMATION = 'abode_automation'
EVENT_ABODE_FAULT = 'abode_panel_fault'
EVENT_ABODE_RESTORE = 'abode_panel_restore'
EVENT_ABODE_ALARM = "abode_alarm"
EVENT_ABODE_ALARM_END = "abode_alarm_end"
EVENT_ABODE_AUTOMATION = "abode_automation"
EVENT_ABODE_FAULT = "abode_panel_fault"
EVENT_ABODE_RESTORE = "abode_panel_restore"
SERVICE_SETTINGS = 'change_setting'
SERVICE_CAPTURE_IMAGE = 'capture_image'
SERVICE_TRIGGER = 'trigger_quick_action'
SERVICE_SETTINGS = "change_setting"
SERVICE_CAPTURE_IMAGE = "capture_image"
SERVICE_TRIGGER = "trigger_quick_action"
ATTR_DEVICE_ID = 'device_id'
ATTR_DEVICE_NAME = 'device_name'
ATTR_DEVICE_TYPE = 'device_type'
ATTR_EVENT_CODE = 'event_code'
ATTR_EVENT_NAME = 'event_name'
ATTR_EVENT_TYPE = 'event_type'
ATTR_EVENT_UTC = 'event_utc'
ATTR_SETTING = 'setting'
ATTR_USER_NAME = 'user_name'
ATTR_VALUE = 'value'
ATTR_DEVICE_ID = "device_id"
ATTR_DEVICE_NAME = "device_name"
ATTR_DEVICE_TYPE = "device_type"
ATTR_EVENT_CODE = "event_code"
ATTR_EVENT_NAME = "event_name"
ATTR_EVENT_TYPE = "event_type"
ATTR_EVENT_UTC = "event_utc"
ATTR_SETTING = "setting"
ATTR_USER_NAME = "user_name"
ATTR_VALUE = "value"
ABODE_DEVICE_ID_LIST_SCHEMA = vol.Schema([str])
CONFIG_SCHEMA = vol.Schema({
DOMAIN: vol.Schema({
vol.Required(CONF_USERNAME): cv.string,
vol.Required(CONF_PASSWORD): cv.string,
vol.Optional(CONF_NAME): cv.string,
vol.Optional(CONF_POLLING, default=False): cv.boolean,
vol.Optional(CONF_EXCLUDE, default=[]): ABODE_DEVICE_ID_LIST_SCHEMA,
vol.Optional(CONF_LIGHTS, default=[]): ABODE_DEVICE_ID_LIST_SCHEMA
}),
}, extra=vol.ALLOW_EXTRA)
CONFIG_SCHEMA = vol.Schema(
{
DOMAIN: vol.Schema(
{
vol.Required(CONF_USERNAME): cv.string,
vol.Required(CONF_PASSWORD): cv.string,
vol.Optional(CONF_NAME): cv.string,
vol.Optional(CONF_POLLING, default=False): cv.boolean,
vol.Optional(CONF_EXCLUDE, default=[]): ABODE_DEVICE_ID_LIST_SCHEMA,
vol.Optional(CONF_LIGHTS, default=[]): ABODE_DEVICE_ID_LIST_SCHEMA,
}
)
},
extra=vol.ALLOW_EXTRA,
)
CHANGE_SETTING_SCHEMA = vol.Schema({
vol.Required(ATTR_SETTING): cv.string,
vol.Required(ATTR_VALUE): cv.string
})
CHANGE_SETTING_SCHEMA = vol.Schema(
{vol.Required(ATTR_SETTING): cv.string, vol.Required(ATTR_VALUE): cv.string}
)
CAPTURE_IMAGE_SCHEMA = vol.Schema({
ATTR_ENTITY_ID: cv.entity_ids,
})
CAPTURE_IMAGE_SCHEMA = vol.Schema({ATTR_ENTITY_ID: cv.entity_ids})
TRIGGER_SCHEMA = vol.Schema({
ATTR_ENTITY_ID: cv.entity_ids,
})
TRIGGER_SCHEMA = vol.Schema({ATTR_ENTITY_ID: cv.entity_ids})
ABODE_PLATFORMS = [
'alarm_control_panel', 'binary_sensor', 'lock', 'switch', 'cover',
'camera', 'light', 'sensor'
"alarm_control_panel",
"binary_sensor",
"lock",
"switch",
"cover",
"camera",
"light",
"sensor",
]
class AbodeSystem:
"""Abode System class."""
def __init__(self, username, password, cache,
name, polling, exclude, lights):
def __init__(self, username, password, cache, name, polling, exclude, lights):
"""Initialize the system."""
import abodepy
self.abode = abodepy.Abode(
username, password, auto_login=True, get_devices=True,
get_automations=True, cache_path=cache)
username,
password,
auto_login=True,
get_devices=True,
get_automations=True,
cache_path=cache,
)
self.name = name
self.polling = polling
self.exclude = exclude
@@ -106,9 +126,9 @@ class AbodeSystem:
"""Check if a switch device is configured as a light."""
import abodepy.helpers.constants as CONST
return (device.generic_type == CONST.TYPE_LIGHT or
(device.generic_type == CONST.TYPE_SWITCH and
device.device_id in self.lights))
return device.generic_type == CONST.TYPE_LIGHT or (
device.generic_type == CONST.TYPE_SWITCH and device.device_id in self.lights
)
def setup(hass, config):
@@ -126,16 +146,18 @@ def setup(hass, config):
try:
cache = hass.config.path(DEFAULT_CACHEDB)
hass.data[DOMAIN] = AbodeSystem(
username, password, cache, name, polling, exclude, lights)
username, password, cache, name, polling, exclude, lights
)
except (AbodeException, ConnectTimeout, HTTPError) as ex:
_LOGGER.error("Unable to connect to Abode: %s", str(ex))
hass.components.persistent_notification.create(
'Error: {}<br />'
'You will need to restart hass after fixing.'
''.format(ex),
"Error: {}<br />"
"You will need to restart hass after fixing."
"".format(ex),
title=NOTIFICATION_TITLE,
notification_id=NOTIFICATION_ID)
notification_id=NOTIFICATION_ID,
)
return False
setup_hass_services(hass)
@@ -166,8 +188,11 @@ def setup_hass_services(hass):
"""Capture a new image."""
entity_ids = call.data.get(ATTR_ENTITY_ID)
target_devices = [device for device in hass.data[DOMAIN].devices
if device.entity_id in entity_ids]
target_devices = [
device
for device in hass.data[DOMAIN].devices
if device.entity_id in entity_ids
]
for device in target_devices:
device.capture()
@@ -176,27 +201,31 @@ def setup_hass_services(hass):
"""Trigger a quick action."""
entity_ids = call.data.get(ATTR_ENTITY_ID, None)
target_devices = [device for device in hass.data[DOMAIN].devices
if device.entity_id in entity_ids]
target_devices = [
device
for device in hass.data[DOMAIN].devices
if device.entity_id in entity_ids
]
for device in target_devices:
device.trigger()
hass.services.register(
DOMAIN, SERVICE_SETTINGS, change_setting,
schema=CHANGE_SETTING_SCHEMA)
DOMAIN, SERVICE_SETTINGS, change_setting, schema=CHANGE_SETTING_SCHEMA
)
hass.services.register(
DOMAIN, SERVICE_CAPTURE_IMAGE, capture_image,
schema=CAPTURE_IMAGE_SCHEMA)
DOMAIN, SERVICE_CAPTURE_IMAGE, capture_image, schema=CAPTURE_IMAGE_SCHEMA
)
hass.services.register(
DOMAIN, SERVICE_TRIGGER, trigger_quick_action,
schema=TRIGGER_SCHEMA)
DOMAIN, SERVICE_TRIGGER, trigger_quick_action, schema=TRIGGER_SCHEMA
)
def setup_hass_events(hass):
"""Home Assistant start and stop callbacks."""
def startup(event):
"""Listen for push events."""
hass.data[DOMAIN].abode.events.start()
@@ -222,28 +251,32 @@ def setup_abode_events(hass):
def event_callback(event, event_json):
"""Handle an event callback from Abode."""
data = {
ATTR_DEVICE_ID: event_json.get(ATTR_DEVICE_ID, ''),
ATTR_DEVICE_NAME: event_json.get(ATTR_DEVICE_NAME, ''),
ATTR_DEVICE_TYPE: event_json.get(ATTR_DEVICE_TYPE, ''),
ATTR_EVENT_CODE: event_json.get(ATTR_EVENT_CODE, ''),
ATTR_EVENT_NAME: event_json.get(ATTR_EVENT_NAME, ''),
ATTR_EVENT_TYPE: event_json.get(ATTR_EVENT_TYPE, ''),
ATTR_EVENT_UTC: event_json.get(ATTR_EVENT_UTC, ''),
ATTR_USER_NAME: event_json.get(ATTR_USER_NAME, ''),
ATTR_DATE: event_json.get(ATTR_DATE, ''),
ATTR_TIME: event_json.get(ATTR_TIME, ''),
ATTR_DEVICE_ID: event_json.get(ATTR_DEVICE_ID, ""),
ATTR_DEVICE_NAME: event_json.get(ATTR_DEVICE_NAME, ""),
ATTR_DEVICE_TYPE: event_json.get(ATTR_DEVICE_TYPE, ""),
ATTR_EVENT_CODE: event_json.get(ATTR_EVENT_CODE, ""),
ATTR_EVENT_NAME: event_json.get(ATTR_EVENT_NAME, ""),
ATTR_EVENT_TYPE: event_json.get(ATTR_EVENT_TYPE, ""),
ATTR_EVENT_UTC: event_json.get(ATTR_EVENT_UTC, ""),
ATTR_USER_NAME: event_json.get(ATTR_USER_NAME, ""),
ATTR_DATE: event_json.get(ATTR_DATE, ""),
ATTR_TIME: event_json.get(ATTR_TIME, ""),
}
hass.bus.fire(event, data)
events = [TIMELINE.ALARM_GROUP, TIMELINE.ALARM_END_GROUP,
TIMELINE.PANEL_FAULT_GROUP, TIMELINE.PANEL_RESTORE_GROUP,
TIMELINE.AUTOMATION_GROUP]
events = [
TIMELINE.ALARM_GROUP,
TIMELINE.ALARM_END_GROUP,
TIMELINE.PANEL_FAULT_GROUP,
TIMELINE.PANEL_RESTORE_GROUP,
TIMELINE.AUTOMATION_GROUP,
]
for event in events:
hass.data[DOMAIN].abode.events.add_event_callback(
event,
partial(event_callback, event))
event, partial(event_callback, event)
)
class AbodeDevice(Entity):
@@ -258,7 +291,8 @@ class AbodeDevice(Entity):
"""Subscribe Abode events."""
self.hass.async_add_job(
self._data.abode.events.add_device_callback,
self._device.device_id, self._update_callback
self._device.device_id,
self._update_callback,
)
@property
@@ -280,10 +314,10 @@ class AbodeDevice(Entity):
"""Return the state attributes."""
return {
ATTR_ATTRIBUTION: ATTRIBUTION,
'device_id': self._device.device_id,
'battery_low': self._device.battery_low,
'no_response': self._device.no_response,
'device_type': self._device.type
"device_id": self._device.device_id,
"battery_low": self._device.battery_low,
"no_response": self._device.no_response,
"device_type": self._device.type,
}
def _update_callback(self, device):
@@ -305,7 +339,8 @@ class AbodeAutomation(Entity):
if self._event:
self.hass.async_add_job(
self._data.abode.events.add_event_callback,
self._event, self._update_callback
self._event,
self._update_callback,
)
@property
@@ -327,9 +362,9 @@ class AbodeAutomation(Entity):
"""Return the state attributes."""
return {
ATTR_ATTRIBUTION: ATTRIBUTION,
'automation_id': self._automation.automation_id,
'type': self._automation.type,
'sub_type': self._automation.sub_type
"automation_id": self._automation.automation_id,
"type": self._automation.type,
"sub_type": self._automation.sub_type,
}
def _update_callback(self, device):


@@ -3,14 +3,17 @@ import logging
import homeassistant.components.alarm_control_panel as alarm
from homeassistant.const import (
ATTR_ATTRIBUTION, STATE_ALARM_ARMED_AWAY, STATE_ALARM_ARMED_HOME,
STATE_ALARM_DISARMED)
ATTR_ATTRIBUTION,
STATE_ALARM_ARMED_AWAY,
STATE_ALARM_ARMED_HOME,
STATE_ALARM_DISARMED,
)
from . import ATTRIBUTION, DOMAIN as ABODE_DOMAIN, AbodeDevice
_LOGGER = logging.getLogger(__name__)
ICON = 'mdi:security'
ICON = "mdi:security"
def setup_platform(hass, config, add_entities, discovery_info=None):
@@ -72,7 +75,7 @@ class AbodeAlarm(AbodeDevice, alarm.AlarmControlPanel):
"""Return the state attributes."""
return {
ATTR_ATTRIBUTION: ATTRIBUTION,
'device_id': self._device.device_id,
'battery_backup': self._device.battery,
'cellular_backup': self._device.is_cellular,
"device_id": self._device.device_id,
"battery_backup": self._device.battery,
"cellular_backup": self._device.is_cellular,
}


@@ -15,9 +15,13 @@ def setup_platform(hass, config, add_entities, discovery_info=None):
data = hass.data[ABODE_DOMAIN]
device_types = [CONST.TYPE_CONNECTIVITY, CONST.TYPE_MOISTURE,
CONST.TYPE_MOTION, CONST.TYPE_OCCUPANCY,
CONST.TYPE_OPENING]
device_types = [
CONST.TYPE_CONNECTIVITY,
CONST.TYPE_MOISTURE,
CONST.TYPE_MOTION,
CONST.TYPE_OCCUPANCY,
CONST.TYPE_OPENING,
]
devices = []
for device in data.abode.get_devices(generic_type=device_types):
@@ -26,13 +30,15 @@ def setup_platform(hass, config, add_entities, discovery_info=None):
devices.append(AbodeBinarySensor(data, device))
for automation in data.abode.get_automations(
generic_type=CONST.TYPE_QUICK_ACTION):
for automation in data.abode.get_automations(generic_type=CONST.TYPE_QUICK_ACTION):
if data.is_automation_excluded(automation):
continue
devices.append(AbodeQuickActionBinarySensor(
data, automation, TIMELINE.AUTOMATION_EDIT_GROUP))
devices.append(
AbodeQuickActionBinarySensor(
data, automation, TIMELINE.AUTOMATION_EDIT_GROUP
)
)
data.devices.extend(devices)


@@ -49,7 +49,8 @@ class AbodeCamera(AbodeDevice, Camera):
self.hass.async_add_job(
self._data.abode.events.add_timeline_callback,
self._event, self._capture_callback
self._event,
self._capture_callback,
)
def capture(self):
@@ -66,8 +67,7 @@ class AbodeCamera(AbodeDevice, Camera):
"""Attempt to download the most recent capture."""
if self._device.image_url:
try:
self._response = requests.get(
self._device.image_url, stream=True)
self._response = requests.get(self._device.image_url, stream=True)
self._response.raise_for_status()
except requests.HTTPError as err:


@@ -3,10 +3,18 @@ import logging
from math import ceil
from homeassistant.components.light import (
ATTR_BRIGHTNESS, ATTR_COLOR_TEMP, ATTR_HS_COLOR, SUPPORT_BRIGHTNESS,
SUPPORT_COLOR, SUPPORT_COLOR_TEMP, Light)
ATTR_BRIGHTNESS,
ATTR_COLOR_TEMP,
ATTR_HS_COLOR,
SUPPORT_BRIGHTNESS,
SUPPORT_COLOR,
SUPPORT_COLOR_TEMP,
Light,
)
from homeassistant.util.color import (
color_temperature_kelvin_to_mired, color_temperature_mired_to_kelvin)
color_temperature_kelvin_to_mired,
color_temperature_mired_to_kelvin,
)
from . import DOMAIN as ABODE_DOMAIN, AbodeDevice
@@ -42,8 +50,8 @@ class AbodeLight(AbodeDevice, Light):
"""Turn on the light."""
if ATTR_COLOR_TEMP in kwargs and self._device.is_color_capable:
self._device.set_color_temp(
int(color_temperature_mired_to_kelvin(
kwargs[ATTR_COLOR_TEMP])))
int(color_temperature_mired_to_kelvin(kwargs[ATTR_COLOR_TEMP]))
)
if ATTR_HS_COLOR in kwargs and self._device.is_color_capable:
self._device.set_color(kwargs[ATTR_HS_COLOR])


@@ -2,7 +2,10 @@
import logging
from homeassistant.const import (
DEVICE_CLASS_HUMIDITY, DEVICE_CLASS_ILLUMINANCE, DEVICE_CLASS_TEMPERATURE)
DEVICE_CLASS_HUMIDITY,
DEVICE_CLASS_ILLUMINANCE,
DEVICE_CLASS_TEMPERATURE,
)
from . import DOMAIN as ABODE_DOMAIN, AbodeDevice
@@ -10,9 +13,9 @@ _LOGGER = logging.getLogger(__name__)
# Sensor types: Name, icon
SENSOR_TYPES = {
'temp': ['Temperature', DEVICE_CLASS_TEMPERATURE],
'humidity': ['Humidity', DEVICE_CLASS_HUMIDITY],
'lux': ['Lux', DEVICE_CLASS_ILLUMINANCE],
"temp": ["Temperature", DEVICE_CLASS_TEMPERATURE],
"humidity": ["Humidity", DEVICE_CLASS_HUMIDITY],
"lux": ["Lux", DEVICE_CLASS_ILLUMINANCE],
}
@@ -42,8 +45,9 @@ class AbodeSensor(AbodeDevice):
"""Initialize a sensor for an Abode device."""
super().__init__(data, device)
self._sensor_type = sensor_type
self._name = '{0} {1}'.format(
self._device.name, SENSOR_TYPES[self._sensor_type][0])
self._name = "{0} {1}".format(
self._device.name, SENSOR_TYPES[self._sensor_type][0]
)
self._device_class = SENSOR_TYPES[self._sensor_type][1]
@property
@@ -59,19 +63,19 @@ class AbodeSensor(AbodeDevice):
@property
def state(self):
"""Return the state of the sensor."""
if self._sensor_type == 'temp':
if self._sensor_type == "temp":
return self._device.temp
if self._sensor_type == 'humidity':
if self._sensor_type == "humidity":
return self._device.humidity
if self._sensor_type == 'lux':
if self._sensor_type == "lux":
return self._device.lux
@property
def unit_of_measurement(self):
"""Return the units of measurement."""
if self._sensor_type == 'temp':
if self._sensor_type == "temp":
return self._device.temp_unit
if self._sensor_type == 'humidity':
if self._sensor_type == "humidity":
return self._device.humidity_unit
if self._sensor_type == 'lux':
if self._sensor_type == "lux":
return self._device.lux_unit


@@ -25,13 +25,13 @@ def setup_platform(hass, config, add_entities, discovery_info=None):
devices.append(AbodeSwitch(data, device))
# Get all Abode automations that can be enabled/disabled
for automation in data.abode.get_automations(
generic_type=CONST.TYPE_AUTOMATION):
for automation in data.abode.get_automations(generic_type=CONST.TYPE_AUTOMATION):
if data.is_automation_excluded(automation):
continue
devices.append(AbodeAutomationSwitch(
data, automation, TIMELINE.AUTOMATION_EDIT_GROUP))
devices.append(
AbodeAutomationSwitch(data, automation, TIMELINE.AUTOMATION_EDIT_GROUP)
)
data.devices.extend(devices)


@@ -4,50 +4,58 @@ import re
import voluptuous as vol
from homeassistant.components.switch import (SwitchDevice, PLATFORM_SCHEMA)
from homeassistant.components.switch import SwitchDevice, PLATFORM_SCHEMA
from homeassistant.const import (
STATE_ON, STATE_OFF, STATE_UNKNOWN, CONF_NAME, CONF_FILENAME)
STATE_ON,
STATE_OFF,
STATE_UNKNOWN,
CONF_NAME,
CONF_FILENAME,
)
import homeassistant.helpers.config_validation as cv
_LOGGER = logging.getLogger(__name__)
CONF_TIMEOUT = 'timeout'
CONF_WRITE_TIMEOUT = 'write_timeout'
CONF_TIMEOUT = "timeout"
CONF_WRITE_TIMEOUT = "write_timeout"
DEFAULT_NAME = 'Acer Projector'
DEFAULT_NAME = "Acer Projector"
DEFAULT_TIMEOUT = 1
DEFAULT_WRITE_TIMEOUT = 1
ECO_MODE = 'ECO Mode'
ECO_MODE = "ECO Mode"
ICON = 'mdi:projector'
ICON = "mdi:projector"
INPUT_SOURCE = 'Input Source'
INPUT_SOURCE = "Input Source"
LAMP = 'Lamp'
LAMP_HOURS = 'Lamp Hours'
LAMP = "Lamp"
LAMP_HOURS = "Lamp Hours"
MODEL = 'Model'
MODEL = "Model"
# Commands known to the projector
CMD_DICT = {
LAMP: '* 0 Lamp ?\r',
LAMP_HOURS: '* 0 Lamp\r',
INPUT_SOURCE: '* 0 Src ?\r',
ECO_MODE: '* 0 IR 052\r',
MODEL: '* 0 IR 035\r',
STATE_ON: '* 0 IR 001\r',
STATE_OFF: '* 0 IR 002\r',
LAMP: "* 0 Lamp ?\r",
LAMP_HOURS: "* 0 Lamp\r",
INPUT_SOURCE: "* 0 Src ?\r",
ECO_MODE: "* 0 IR 052\r",
MODEL: "* 0 IR 035\r",
STATE_ON: "* 0 IR 001\r",
STATE_OFF: "* 0 IR 002\r",
}
PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend({
vol.Required(CONF_FILENAME): cv.isdevice,
vol.Optional(CONF_NAME, default=DEFAULT_NAME): cv.string,
vol.Optional(CONF_TIMEOUT, default=DEFAULT_TIMEOUT): cv.positive_int,
vol.Optional(CONF_WRITE_TIMEOUT, default=DEFAULT_WRITE_TIMEOUT):
cv.positive_int,
})
PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(
{
vol.Required(CONF_FILENAME): cv.isdevice,
vol.Optional(CONF_NAME, default=DEFAULT_NAME): cv.string,
vol.Optional(CONF_TIMEOUT, default=DEFAULT_TIMEOUT): cv.positive_int,
vol.Optional(
CONF_WRITE_TIMEOUT, default=DEFAULT_WRITE_TIMEOUT
): cv.positive_int,
}
)
def setup_platform(hass, config, add_entities, discovery_info=None):
@@ -66,9 +74,10 @@ class AcerSwitch(SwitchDevice):
def __init__(self, serial_port, name, timeout, write_timeout, **kwargs):
"""Init of the Acer projector."""
import serial
self.ser = serial.Serial(
port=serial_port, timeout=timeout, write_timeout=write_timeout,
**kwargs)
port=serial_port, timeout=timeout, write_timeout=write_timeout, **kwargs
)
self._serial_port = serial_port
self._name = name
self._state = False
@@ -82,6 +91,7 @@ class AcerSwitch(SwitchDevice):
def _write_read(self, msg):
"""Write to the projector and read the return."""
import serial
ret = ""
# Sometimes the projector won't answer for no reason or the projector
# was disconnected during runtime.
@@ -89,14 +99,14 @@ class AcerSwitch(SwitchDevice):
try:
if not self.ser.is_open:
self.ser.open()
msg = msg.encode('utf-8')
msg = msg.encode("utf-8")
self.ser.write(msg)
# Size is an empirical value; there is no real limit.
# AFAIK there is no limit and no end character, so we will usually
# need to wait for the timeout.
ret = self.ser.read_until(size=20).decode('utf-8')
ret = self.ser.read_until(size=20).decode("utf-8")
except serial.SerialException:
_LOGGER.error('Problem communicating with %s', self._serial_port)
_LOGGER.error("Problem communicating with %s", self._serial_port)
self.ser.close()
return ret
@@ -104,7 +114,7 @@ class AcerSwitch(SwitchDevice):
"""Write msg, obtain answer and format output."""
# answers are formatted as ***\answer\r***
awns = self._write_read(msg)
match = re.search(r'\r(.+)\r', awns)
match = re.search(r"\r(.+)\r", awns)
if match:
return match.group(1)
return STATE_UNKNOWN
@@ -133,10 +143,10 @@ class AcerSwitch(SwitchDevice):
"""Get the latest state from the projector."""
msg = CMD_DICT[LAMP]
awns = self._write_read_format(msg)
if awns == 'Lamp 1':
if awns == "Lamp 1":
self._state = True
self._available = True
elif awns == 'Lamp 0':
elif awns == "Lamp 0":
self._state = False
self._available = True
else:
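The projector hunks above hinge on the reply-parsing step: the serial answer arrives framed by carriage returns, is extracted with re.search(r"\r(.+)\r", awns), and the result is compared against "Lamp 1" / "Lamp 0". A minimal sketch of that extraction, assuming a hypothetical framed reply string rather than output captured from real hardware:

import re

raw = "*000\rLamp 1\r000*"  # hypothetical framed reply
match = re.search(r"\r(.+)\r", raw)
answer = match.group(1) if match else None
print(answer)  # Lamp 1 -> update() marks the projector as on and available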


@@ -8,22 +8,28 @@ import voluptuous as vol
import homeassistant.helpers.config_validation as cv
import homeassistant.util.dt as dt_util
from homeassistant.components.device_tracker import (
DOMAIN, PLATFORM_SCHEMA, DeviceScanner)
DOMAIN,
PLATFORM_SCHEMA,
DeviceScanner,
)
from homeassistant.const import CONF_HOST, CONF_PASSWORD, CONF_USERNAME
_LOGGER = logging.getLogger(__name__)
_LEASES_REGEX = re.compile(
r'(?P<ip>([0-9]{1,3}[\.]){3}[0-9]{1,3})' +
r'\smac:\s(?P<mac>([0-9a-f]{2}[:-]){5}([0-9a-f]{2}))' +
r'\svalid\sfor:\s(?P<timevalid>(-?\d+))' +
r'\ssec')
r"(?P<ip>([0-9]{1,3}[\.]){3}[0-9]{1,3})"
+ r"\smac:\s(?P<mac>([0-9a-f]{2}[:-]){5}([0-9a-f]{2}))"
+ r"\svalid\sfor:\s(?P<timevalid>(-?\d+))"
+ r"\ssec"
)
PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend({
vol.Required(CONF_HOST): cv.string,
vol.Required(CONF_PASSWORD): cv.string,
vol.Required(CONF_USERNAME): cv.string
})
PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(
{
vol.Required(CONF_HOST): cv.string,
vol.Required(CONF_PASSWORD): cv.string,
vol.Required(CONF_USERNAME): cv.string,
}
)
def get_scanner(hass, config):
@@ -32,7 +38,7 @@ def get_scanner(hass, config):
return scanner if scanner.success_init else None
Device = namedtuple('Device', ['mac', 'ip', 'last_update'])
Device = namedtuple("Device", ["mac", "ip", "last_update"])
class ActiontecDeviceScanner(DeviceScanner):
@@ -75,9 +81,11 @@ class ActiontecDeviceScanner(DeviceScanner):
actiontec_data = self.get_actiontec_data()
if not actiontec_data:
return False
self.last_results = [Device(data['mac'], name, now)
for name, data in actiontec_data.items()
if data['timevalid'] > -60]
self.last_results = [
Device(data["mac"], name, now)
for name, data in actiontec_data.items()
if data["timevalid"] > -60
]
_LOGGER.info("Scan successful")
return True
@@ -85,17 +93,16 @@ class ActiontecDeviceScanner(DeviceScanner):
"""Retrieve data from Actiontec MI424WR and return parsed result."""
try:
telnet = telnetlib.Telnet(self.host)
telnet.read_until(b'Username: ')
telnet.write((self.username + '\n').encode('ascii'))
telnet.read_until(b'Password: ')
telnet.write((self.password + '\n').encode('ascii'))
prompt = telnet.read_until(
b'Wireless Broadband Router> ').split(b'\n')[-1]
telnet.write('firewall mac_cache_dump\n'.encode('ascii'))
telnet.write('\n'.encode('ascii'))
telnet.read_until(b"Username: ")
telnet.write((self.username + "\n").encode("ascii"))
telnet.read_until(b"Password: ")
telnet.write((self.password + "\n").encode("ascii"))
prompt = telnet.read_until(b"Wireless Broadband Router> ").split(b"\n")[-1]
telnet.write("firewall mac_cache_dump\n".encode("ascii"))
telnet.write("\n".encode("ascii"))
telnet.read_until(prompt)
leases_result = telnet.read_until(prompt).split(b'\n')[1:-1]
telnet.write('exit\n'.encode('ascii'))
leases_result = telnet.read_until(prompt).split(b"\n")[1:-1]
telnet.write("exit\n".encode("ascii"))
except EOFError:
_LOGGER.exception("Unexpected response from router")
return
@@ -105,11 +112,11 @@ class ActiontecDeviceScanner(DeviceScanner):
devices = {}
for lease in leases_result:
match = _LEASES_REGEX.search(lease.decode('utf-8'))
match = _LEASES_REGEX.search(lease.decode("utf-8"))
if match is not None:
devices[match.group('ip')] = {
'ip': match.group('ip'),
'mac': match.group('mac').upper(),
'timevalid': int(match.group('timevalid'))
}
devices[match.group("ip")] = {
"ip": match.group("ip"),
"mac": match.group("mac").upper(),
"timevalid": int(match.group("timevalid")),
}
return devices


@@ -0,0 +1,30 @@
{
"config": {
"abort": {
"existing_instance_updated": "\u0410\u043a\u0442\u0443\u0430\u043b\u0438\u0437\u0438\u0440\u0430\u043d\u0435 \u043d\u0430 \u0441\u044a\u0449\u0435\u0441\u0442\u0432\u0443\u0432\u0430\u0449\u0430\u0442\u0430 \u043a\u043e\u043d\u0444\u0438\u0433\u0443\u0440\u0430\u0446\u0438\u044f.",
"single_instance_allowed": "\u0420\u0430\u0437\u0440\u0435\u0448\u0435\u043d\u0430 \u0435 \u0441\u0430\u043c\u043e \u0435\u0434\u043d\u0430 \u043a\u043e\u043d\u0444\u0438\u0433\u0443\u0440\u0430\u0446\u0438\u044f \u043d\u0430 AdGuard Home."
},
"error": {
"connection_error": "\u041d\u0435\u0443\u0441\u043f\u0435\u0448\u043d\u043e \u0441\u0432\u044a\u0440\u0437\u0432\u0430\u043d\u0435."
},
"step": {
"hassio_confirm": {
"description": "\u0418\u0441\u043a\u0430\u0442\u0435 \u043b\u0438 \u0434\u0430 \u043a\u043e\u043d\u0444\u0438\u0433\u0443\u0440\u0438\u0440\u0430\u0442\u0435 Home Assistant \u0434\u0430 \u0441\u0435 \u0441\u0432\u044a\u0440\u0437\u0432\u0430 \u0441 AdGuard Home, \u043f\u0440\u0435\u0434\u043e\u0441\u0442\u0430\u0432\u0435\u043d \u043e\u0442 Hass.io \u0434\u043e\u0431\u0430\u0432\u043a\u0430\u0442\u0430: {addon} ?",
"title": "AdGuard Home \u0447\u0440\u0435\u0437 Hass.io \u0434\u043e\u0431\u0430\u0432\u043a\u0430"
},
"user": {
"data": {
"host": "\u0410\u0434\u0440\u0435\u0441",
"password": "\u041f\u0430\u0440\u043e\u043b\u0430",
"port": "\u041f\u043e\u0440\u0442",
"ssl": "AdGuard Home \u0438\u0437\u043f\u043e\u043b\u0437\u0432\u0430 SSL \u0441\u0435\u0440\u0442\u0438\u0444\u0438\u043a\u0430\u0442",
"username": "\u041f\u043e\u0442\u0440\u0435\u0431\u0438\u0442\u0435\u043b\u0441\u043a\u043e \u0438\u043c\u0435",
"verify_ssl": "AdGuard Home \u0438\u0437\u043f\u043e\u043b\u0437\u0432\u0430 \u043d\u0430\u0434\u0435\u0436\u0434\u0435\u043d \u0441\u0435\u0440\u0442\u0438\u0444\u0438\u043a\u0430\u0442"
},
"description": "\u041d\u0430\u0441\u0442\u0440\u043e\u0439\u0442\u0435 \u0412\u0430\u0448\u0438\u044f AdGuard Home, \u0437\u0430 \u0434\u0430 \u043f\u043e\u0437\u0432\u043e\u043b\u0438\u0442\u0435 \u043d\u0430\u0431\u043b\u044e\u0434\u0435\u043d\u0438\u0435 \u0438 \u043a\u043e\u043d\u0442\u0440\u043e\u043b.",
"title": "\u0421\u0432\u044a\u0440\u0436\u0435\u0442\u0435 \u0412\u0430\u0448\u0438\u044f AdGuard Home."
}
},
"title": "AdGuard Home"
}
}


@@ -1,6 +1,7 @@
{
"config": {
"abort": {
"existing_instance_updated": "S'ha actualitzat la configuraci\u00f3 existent.",
"single_instance_allowed": "Nom\u00e9s es permet una \u00fanica configuraci\u00f3 d'AdGuard Home."
},
"error": {


@@ -0,0 +1,30 @@
{
"config": {
"abort": {
"existing_instance_updated": "Opdaterede eksisterende konfiguration.",
"single_instance_allowed": "Det er kun n\u00f8dvendigt med en ops\u00e6tning af AdGuard Home."
},
"error": {
"connection_error": "Forbindelse mislykkedes."
},
"step": {
"hassio_confirm": {
"description": "Vil du konfigurere Home Assistant til at oprette forbindelse til Adguard Home, der leveres af Hass.io add-on: {addon}?",
"title": "AdGuard Home via Hass.io add-on"
},
"user": {
"data": {
"host": "V\u00e6rt",
"password": "Adgangskode",
"port": "Port",
"ssl": "AdGuard Home bruger et SSL-certifikat",
"username": "Brugernavn",
"verify_ssl": "AdGuard Home bruger et korrekt certifikat"
},
"description": "Konfigurer din AdGuard Home instans for at tillade overv\u00e5gning og kontrol.",
"title": "Link AdGuard Home."
}
},
"title": "AdGuard Home"
}
}


@@ -1,6 +1,7 @@
{
"config": {
"abort": {
"existing_instance_updated": "Bestehende Konfiguration wurde aktualisiert.",
"single_instance_allowed": "Es ist nur eine einzige Konfiguration von AdGuard Home zul\u00e4ssig."
},
"error": {
@@ -23,6 +24,7 @@
"description": "Richte deine AdGuard Home-Instanz ein um sie zu \u00dcberwachen und zu Steuern.",
"title": "Verkn\u00fcpfe AdGuard Home."
}
}
},
"title": "AdGuard Home"
}
}


@@ -1,6 +1,7 @@
{
"config": {
"abort": {
"existing_instance_updated": "Updated existing configuration.",
"single_instance_allowed": "Only a single configuration of AdGuard Home is allowed."
},
"error": {


@@ -1,6 +1,7 @@
{
"config": {
"abort": {
"existing_instance_updated": "Se actualiz\u00f3 la configuraci\u00f3n existente.",
"single_instance_allowed": "Solo se permite una \u00fanica configuraci\u00f3n de AdGuard Home."
},
"error": {


@@ -0,0 +1,7 @@
{
"config": {
"abort": {
"existing_instance_updated": "Se ha actualizado la configuraci\u00f3n existente."
}
}
}


@@ -0,0 +1,24 @@
{
"config": {
"abort": {
"existing_instance_updated": "La configuration existante a \u00e9t\u00e9 mise \u00e0 jour."
},
"error": {
"connection_error": "\u00c9chec de connexion."
},
"step": {
"hassio_confirm": {
"title": "AdGuard Home via le module compl\u00e9mentaire Hass.io"
},
"user": {
"data": {
"host": "H\u00f4te",
"password": "Mot de passe",
"port": "Port",
"ssl": "AdGuard Home utilise un certificat SSL",
"username": "Nom d'utilisateur"
}
}
}
}
}


@@ -1,6 +1,7 @@
{
"config": {
"abort": {
"existing_instance_updated": "\uae30\uc874 \uad6c\uc131\uc744 \uc5c5\ub370\uc774\ud2b8\ud588\uc2b5\ub2c8\ub2e4.",
"single_instance_allowed": "\ud558\ub098\uc758 AdGuard Home \ub9cc \uad6c\uc131 \ud560 \uc218 \uc788\uc2b5\ub2c8\ub2e4."
},
"error": {
@@ -16,7 +17,7 @@
"host": "\ud638\uc2a4\ud2b8",
"password": "\ube44\ubc00\ubc88\ud638",
"port": "\ud3ec\ud2b8",
"ssl": "AdGuard Home \uc740 SSL \uc778\uc99d\uc11c\ub97c \uc0ac\uc6a9\ud569\ub2c8\ub2e4",
"ssl": "AdGuard Home \uc740 SSL \uc778\uc99d\uc11c\ub97c \uc0ac\uc6a9\ud558\uace0 \uc788\uc2b5\ub2c8\ub2e4",
"username": "\uc0ac\uc6a9\uc790 \uc774\ub984",
"verify_ssl": "AdGuard Home \uc740 \uc62c\ubc14\ub978 \uc778\uc99d\uc11c\ub97c \uc0ac\uc6a9\ud558\uace0 \uc788\uc2b5\ub2c8\ub2e4"
},


@@ -1,6 +1,7 @@
{
"config": {
"abort": {
"existing_instance_updated": "D\u00e9i bestehend Konfiguratioun ass ge\u00e4nnert.",
"single_instance_allowed": "N\u00ebmmen eng eenzeg Konfiguratioun vun AdGuard Home ass erlaabt."
},
"error": {


@@ -1,6 +1,7 @@
{
"config": {
"abort": {
"existing_instance_updated": "Bestaande configuratie bijgewerkt.",
"single_instance_allowed": "Slechts \u00e9\u00e9n configuratie van AdGuard Home is toegestaan."
},
"error": {


@@ -1,6 +1,7 @@
{
"config": {
"abort": {
"existing_instance_updated": "Oppdatert eksisterende konfigurasjon.",
"single_instance_allowed": "Kun \u00e9n enkelt konfigurasjon av AdGuard Hjemer tillatt."
},
"error": {


@@ -1,6 +1,7 @@
{
"config": {
"abort": {
"existing_instance_updated": "Zaktualizowano istniej\u0105c\u0105 konfiguracj\u0119.",
"single_instance_allowed": "Dozwolona jest tylko jedna konfiguracja AdGuard Home."
},
"error": {


@@ -1,6 +1,7 @@
{
"config": {
"abort": {
"existing_instance_updated": "Configura\u00e7\u00e3o existente atualizada.",
"single_instance_allowed": "Apenas uma \u00fanica configura\u00e7\u00e3o do AdGuard Home \u00e9 permitida."
},
"error": {


@@ -1,6 +1,7 @@
{
"config": {
"abort": {
"existing_instance_updated": "\u041a\u043e\u043d\u0444\u0438\u0433\u0443\u0440\u0430\u0446\u0438\u044f \u043e\u0431\u043d\u043e\u0432\u043b\u0435\u043d\u0430.",
"single_instance_allowed": "\u041d\u0430\u0441\u0442\u0440\u043e\u0439\u043a\u0430 \u043a\u043e\u043c\u043f\u043e\u043d\u0435\u043d\u0442\u0430 \u0443\u0436\u0435 \u0432\u044b\u043f\u043e\u043b\u043d\u0435\u043d\u0430."
},
"error": {


@@ -1,6 +1,7 @@
{
"config": {
"abort": {
"existing_instance_updated": "Posodobljena obstoje\u010da konfiguracija.",
"single_instance_allowed": "Dovoljena je samo ena konfiguracija AdGuard Home."
},
"error": {


@@ -1,6 +1,7 @@
{
"config": {
"abort": {
"existing_instance_updated": "Uppdaterade existerande konfiguration.",
"single_instance_allowed": "Endast en enda konfiguration av AdGuard Home \u00e4r till\u00e5ten."
},
"error": {


@@ -0,0 +1,16 @@
{
"config": {
"abort": {
"existing_instance_updated": "\u66f4\u65b0\u4e86\u73b0\u6709\u914d\u7f6e\u3002"
},
"step": {
"user": {
"data": {
"password": "\u5bc6\u7801",
"port": "\u7aef\u53e3",
"username": "\u7528\u6237\u540d"
}
}
}
}
}


@@ -1,6 +1,7 @@
{
"config": {
"abort": {
"existing_instance_updated": "\u5df2\u66f4\u65b0\u73fe\u6709\u8a2d\u5b9a\u3002",
"single_instance_allowed": "\u50c5\u5141\u8a31\u8a2d\u5b9a\u4e00\u7d44 AdGuard Home\u3002"
},
"error": {


@@ -6,13 +6,27 @@ from adguardhome import AdGuardHome, AdGuardHomeError
import voluptuous as vol
from homeassistant.components.adguard.const import (
CONF_FORCE, DATA_ADGUARD_CLIENT, DATA_ADGUARD_VERION, DOMAIN,
SERVICE_ADD_URL, SERVICE_DISABLE_URL, SERVICE_ENABLE_URL, SERVICE_REFRESH,
SERVICE_REMOVE_URL)
CONF_FORCE,
DATA_ADGUARD_CLIENT,
DATA_ADGUARD_VERION,
DOMAIN,
SERVICE_ADD_URL,
SERVICE_DISABLE_URL,
SERVICE_ENABLE_URL,
SERVICE_REFRESH,
SERVICE_REMOVE_URL,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import (
CONF_HOST, CONF_NAME, CONF_PASSWORD, CONF_PORT, CONF_SSL, CONF_URL,
CONF_USERNAME, CONF_VERIFY_SSL)
CONF_HOST,
CONF_NAME,
CONF_PASSWORD,
CONF_PORT,
CONF_SSL,
CONF_URL,
CONF_USERNAME,
CONF_VERIFY_SSL,
)
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.aiohttp_client import async_get_clientsession
from homeassistant.helpers.entity import Entity
@@ -34,9 +48,7 @@ async def async_setup(hass: HomeAssistantType, config: ConfigType) -> bool:
return True
async def async_setup_entry(
hass: HomeAssistantType, entry: ConfigEntry
) -> bool:
async def async_setup_entry(hass: HomeAssistantType, entry: ConfigEntry) -> bool:
"""Set up AdGuard Home from a config entry."""
session = async_get_clientsession(hass, entry.data[CONF_VERIFY_SSL])
adguard = AdGuardHome(
@@ -52,7 +64,7 @@ async def async_setup_entry(
hass.data.setdefault(DOMAIN, {})[DATA_ADGUARD_CLIENT] = adguard
for component in 'sensor', 'switch':
for component in "sensor", "switch":
hass.async_create_task(
hass.config_entries.async_forward_entry_setup(entry, component)
)
@@ -98,9 +110,7 @@ async def async_setup_entry(
return True
async def async_unload_entry(
hass: HomeAssistantType, entry: ConfigType
) -> bool:
async def async_unload_entry(hass: HomeAssistantType, entry: ConfigType) -> bool:
"""Unload AdGuard Home config entry."""
hass.services.async_remove(DOMAIN, SERVICE_ADD_URL)
hass.services.async_remove(DOMAIN, SERVICE_REMOVE_URL)
@@ -108,7 +118,7 @@ async def async_unload_entry(
hass.services.async_remove(DOMAIN, SERVICE_DISABLE_URL)
hass.services.async_remove(DOMAIN, SERVICE_REFRESH)
for component in 'sensor', 'switch':
for component in "sensor", "switch":
await hass.config_entries.async_forward_entry_unload(entry, component)
del hass.data[DOMAIN]
@@ -166,15 +176,10 @@ class AdGuardHomeDeviceEntity(AdGuardHomeEntity):
def device_info(self) -> Dict[str, Any]:
"""Return device information about this AdGuard Home instance."""
return {
'identifiers': {
(
DOMAIN,
self.adguard.host,
self.adguard.port,
self.adguard.base_path,
)
"identifiers": {
(DOMAIN, self.adguard.host, self.adguard.port, self.adguard.base_path)
},
'name': 'AdGuard Home',
'manufacturer': 'AdGuard Team',
'sw_version': self.hass.data[DOMAIN].get(DATA_ADGUARD_VERION),
"name": "AdGuard Home",
"manufacturer": "AdGuard Team",
"sw_version": self.hass.data[DOMAIN].get(DATA_ADGUARD_VERION),
}


@@ -8,8 +8,13 @@ from homeassistant import config_entries
from homeassistant.components.adguard.const import DOMAIN
from homeassistant.config_entries import ConfigFlow
from homeassistant.const import (
CONF_HOST, CONF_PASSWORD, CONF_PORT, CONF_SSL, CONF_USERNAME,
CONF_VERIFY_SSL)
CONF_HOST,
CONF_PASSWORD,
CONF_PORT,
CONF_SSL,
CONF_USERNAME,
CONF_VERIFY_SSL,
)
from homeassistant.helpers.aiohttp_client import async_get_clientsession
_LOGGER = logging.getLogger(__name__)
@@ -31,7 +36,7 @@ class AdGuardHomeFlowHandler(ConfigFlow):
async def _show_setup_form(self, errors=None):
"""Show the setup form to the user."""
return self.async_show_form(
step_id='user',
step_id="user",
data_schema=vol.Schema(
{
vol.Required(CONF_HOST): str,
@@ -48,10 +53,8 @@ class AdGuardHomeFlowHandler(ConfigFlow):
async def _show_hassio_form(self, errors=None):
"""Show the Hass.io confirmation form to the user."""
return self.async_show_form(
step_id='hassio_confirm',
description_placeholders={
'addon': self._hassio_discovery['addon']
},
step_id="hassio_confirm",
description_placeholders={"addon": self._hassio_discovery["addon"]},
data_schema=vol.Schema({}),
errors=errors or {},
)
@@ -59,16 +62,14 @@ class AdGuardHomeFlowHandler(ConfigFlow):
async def async_step_user(self, user_input=None):
"""Handle a flow initiated by the user."""
if self._async_current_entries():
return self.async_abort(reason='single_instance_allowed')
return self.async_abort(reason="single_instance_allowed")
if user_input is None:
return await self._show_setup_form(user_input)
errors = {}
session = async_get_clientsession(
self.hass, user_input[CONF_VERIFY_SSL]
)
session = async_get_clientsession(self.hass, user_input[CONF_VERIFY_SSL])
adguard = AdGuardHome(
user_input[CONF_HOST],
@@ -84,7 +85,7 @@ class AdGuardHomeFlowHandler(ConfigFlow):
try:
await adguard.version()
except AdGuardHomeConnectionError:
errors['base'] = 'connection_error'
errors["base"] = "connection_error"
return await self._show_setup_form(errors)
return self.async_create_entry(
@@ -112,25 +113,30 @@ class AdGuardHomeFlowHandler(ConfigFlow):
cur_entry = entries[0]
if (cur_entry.data[CONF_HOST] == user_input[CONF_HOST] and
cur_entry.data[CONF_PORT] == user_input[CONF_PORT]):
return self.async_abort(reason='single_instance_allowed')
if (
cur_entry.data[CONF_HOST] == user_input[CONF_HOST]
and cur_entry.data[CONF_PORT] == user_input[CONF_PORT]
):
return self.async_abort(reason="single_instance_allowed")
is_loaded = cur_entry.state == config_entries.ENTRY_STATE_LOADED
if is_loaded:
await self.hass.config_entries.async_unload(cur_entry.entry_id)
self.hass.config_entries.async_update_entry(cur_entry, data={
**cur_entry.data,
CONF_HOST: user_input[CONF_HOST],
CONF_PORT: user_input[CONF_PORT],
})
self.hass.config_entries.async_update_entry(
cur_entry,
data={
**cur_entry.data,
CONF_HOST: user_input[CONF_HOST],
CONF_PORT: user_input[CONF_PORT],
},
)
if is_loaded:
await self.hass.config_entries.async_setup(cur_entry.entry_id)
return self.async_abort(reason='existing_instance_updated')
return self.async_abort(reason="existing_instance_updated")
async def async_step_hassio_confirm(self, user_input=None):
"""Confirm Hass.io discovery."""
@@ -152,11 +158,11 @@ class AdGuardHomeFlowHandler(ConfigFlow):
try:
await adguard.version()
except AdGuardHomeConnectionError:
errors['base'] = 'connection_error'
errors["base"] = "connection_error"
return await self._show_hassio_form(errors)
return self.async_create_entry(
title=self._hassio_discovery['addon'],
title=self._hassio_discovery["addon"],
data={
CONF_HOST: self._hassio_discovery[CONF_HOST],
CONF_PORT: self._hassio_discovery[CONF_PORT],


@@ -1,14 +1,14 @@
"""Constants for the AdGuard Home integration."""
DOMAIN = 'adguard'
DOMAIN = "adguard"
DATA_ADGUARD_CLIENT = 'adguard_client'
DATA_ADGUARD_VERION = 'adguard_version'
DATA_ADGUARD_CLIENT = "adguard_client"
DATA_ADGUARD_VERION = "adguard_version"
CONF_FORCE = 'force'
CONF_FORCE = "force"
SERVICE_ADD_URL = 'add_url'
SERVICE_DISABLE_URL = 'disable_url'
SERVICE_ENABLE_URL = 'enable_url'
SERVICE_REFRESH = 'refresh'
SERVICE_REMOVE_URL = 'remove_url'
SERVICE_ADD_URL = "add_url"
SERVICE_DISABLE_URL = "disable_url"
SERVICE_ENABLE_URL = "enable_url"
SERVICE_REFRESH = "refresh"
SERVICE_REMOVE_URL = "remove_url"


@@ -6,7 +6,10 @@ from adguardhome import AdGuardHomeConnectionError
from homeassistant.components.adguard import AdGuardHomeDeviceEntity
from homeassistant.components.adguard.const import (
DATA_ADGUARD_CLIENT, DATA_ADGUARD_VERION, DOMAIN)
DATA_ADGUARD_CLIENT,
DATA_ADGUARD_VERION,
DOMAIN,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.exceptions import PlatformNotReady
from homeassistant.helpers.typing import HomeAssistantType
@@ -18,7 +21,7 @@ PARALLEL_UPDATES = 4
async def async_setup_entry(
hass: HomeAssistantType, entry: ConfigEntry, async_add_entities
) -> None:
"""Set up AdGuard Home sensor based on a config entry."""
adguard = hass.data[DOMAIN][DATA_ADGUARD_CLIENT]
@@ -48,12 +51,7 @@ class AdGuardHomeSensor(AdGuardHomeDeviceEntity):
"""Defines a AdGuard Home sensor."""
def __init__(
self,
adguard,
name: str,
icon: str,
measurement: str,
unit_of_measurement: str,
self, adguard, name: str, icon: str, measurement: str, unit_of_measurement: str
) -> None:
"""Initialize AdGuard Home sensor."""
self._state = None
@@ -65,12 +63,12 @@ class AdGuardHomeSensor(AdGuardHomeDeviceEntity):
@property
def unique_id(self) -> str:
"""Return the unique ID for this sensor."""
return '_'.join(
return "_".join(
[
DOMAIN,
self.adguard.host,
str(self.adguard.port),
'sensor',
"sensor",
self.measurement,
]
)
@@ -92,11 +90,7 @@ class AdGuardHomeDNSQueriesSensor(AdGuardHomeSensor):
def __init__(self, adguard):
"""Initialize AdGuard Home sensor."""
super().__init__(
adguard,
'AdGuard DNS Queries',
'mdi:magnify',
'dns_queries',
'queries',
adguard, "AdGuard DNS Queries", "mdi:magnify", "dns_queries", "queries"
)
async def _adguard_update(self) -> None:
@@ -111,10 +105,10 @@ class AdGuardHomeBlockedFilteringSensor(AdGuardHomeSensor):
"""Initialize AdGuard Home sensor."""
super().__init__(
adguard,
'AdGuard DNS Queries Blocked',
'mdi:magnify-close',
'blocked_filtering',
'queries',
"AdGuard DNS Queries Blocked",
"mdi:magnify-close",
"blocked_filtering",
"queries",
)
async def _adguard_update(self) -> None:
@@ -129,10 +123,10 @@ class AdGuardHomePercentageBlockedSensor(AdGuardHomeSensor):
"""Initialize AdGuard Home sensor."""
super().__init__(
adguard,
'AdGuard DNS Queries Blocked Ratio',
'mdi:magnify-close',
'blocked_percentage',
'%',
"AdGuard DNS Queries Blocked Ratio",
"mdi:magnify-close",
"blocked_percentage",
"%",
)
async def _adguard_update(self) -> None:
@@ -148,10 +142,10 @@ class AdGuardHomeReplacedParentalSensor(AdGuardHomeSensor):
"""Initialize AdGuard Home sensor."""
super().__init__(
adguard,
'AdGuard Parental Control Blocked',
'mdi:human-male-girl',
'blocked_parental',
'requests',
"AdGuard Parental Control Blocked",
"mdi:human-male-girl",
"blocked_parental",
"requests",
)
async def _adguard_update(self) -> None:
@@ -166,10 +160,10 @@ class AdGuardHomeReplacedSafeBrowsingSensor(AdGuardHomeSensor):
"""Initialize AdGuard Home sensor."""
super().__init__(
adguard,
'AdGuard Safe Browsing Blocked',
'mdi:shield-half-full',
'blocked_safebrowsing',
'requests',
"AdGuard Safe Browsing Blocked",
"mdi:shield-half-full",
"blocked_safebrowsing",
"requests",
)
async def _adguard_update(self) -> None:
@@ -184,10 +178,10 @@ class AdGuardHomeReplacedSafeSearchSensor(AdGuardHomeSensor):
"""Initialize AdGuard Home sensor."""
super().__init__(
adguard,
'Searches Safe Search Enforced',
'mdi:shield-search',
'enforced_safesearch',
'requests',
"Searches Safe Search Enforced",
"mdi:shield-search",
"enforced_safesearch",
"requests",
)
async def _adguard_update(self) -> None:
@@ -202,10 +196,10 @@ class AdGuardHomeAverageProcessingTimeSensor(AdGuardHomeSensor):
"""Initialize AdGuard Home sensor."""
super().__init__(
adguard,
'AdGuard Average Processing Speed',
'mdi:speedometer',
'average_speed',
'ms',
"AdGuard Average Processing Speed",
"mdi:speedometer",
"average_speed",
"ms",
)
async def _adguard_update(self) -> None:
@@ -220,11 +214,7 @@ class AdGuardHomeRulesCountSensor(AdGuardHomeSensor):
def __init__(self, adguard):
"""Initialize AdGuard Home sensor."""
super().__init__(
adguard,
'AdGuard Rules Count',
'mdi:counter',
'rules_count',
'rules',
adguard, "AdGuard Rules Count", "mdi:counter", "rules_count", "rules"
)
async def _adguard_update(self) -> None:


@@ -6,7 +6,10 @@ from adguardhome import AdGuardHomeConnectionError, AdGuardHomeError
from homeassistant.components.adguard import AdGuardHomeDeviceEntity
from homeassistant.components.adguard.const import (
DATA_ADGUARD_CLIENT, DATA_ADGUARD_VERION, DOMAIN)
DATA_ADGUARD_CLIENT,
DATA_ADGUARD_VERION,
DOMAIN,
)
from homeassistant.config_entries import ConfigEntry
from homeassistant.exceptions import PlatformNotReady
from homeassistant.helpers.entity import ToggleEntity
@@ -19,7 +22,7 @@ PARALLEL_UPDATES = 1
async def async_setup_entry(
hass: HomeAssistantType, entry: ConfigEntry, async_add_entities
) -> None:
"""Set up AdGuard Home switch based on a config entry."""
adguard = hass.data[DOMAIN][DATA_ADGUARD_CLIENT]
@@ -54,14 +57,8 @@ class AdGuardHomeSwitch(ToggleEntity, AdGuardHomeDeviceEntity):
@property
def unique_id(self) -> str:
"""Return the unique ID for this sensor."""
return '_'.join(
[
DOMAIN,
self.adguard.host,
str(self.adguard.port),
'switch',
self._key,
]
return "_".join(
[DOMAIN, self.adguard.host, str(self.adguard.port), "switch", self._key]
)
@property
@@ -74,9 +71,7 @@ class AdGuardHomeSwitch(ToggleEntity, AdGuardHomeDeviceEntity):
try:
await self._adguard_turn_off()
except AdGuardHomeError:
_LOGGER.error(
"An error occurred while turning off AdGuard Home switch."
)
_LOGGER.error("An error occurred while turning off AdGuard Home switch.")
self._available = False
async def _adguard_turn_off(self) -> None:
@@ -88,9 +83,7 @@ class AdGuardHomeSwitch(ToggleEntity, AdGuardHomeDeviceEntity):
try:
await self._adguard_turn_on()
except AdGuardHomeError:
_LOGGER.error(
"An error occurred while turning on AdGuard Home switch."
)
_LOGGER.error("An error occurred while turning on AdGuard Home switch.")
self._available = False
async def _adguard_turn_on(self) -> None:
@@ -104,7 +97,7 @@ class AdGuardHomeProtectionSwitch(AdGuardHomeSwitch):
def __init__(self, adguard) -> None:
"""Initialize AdGuard Home switch."""
super().__init__(
adguard, "AdGuard Protection", 'mdi:shield-check', 'protection'
adguard, "AdGuard Protection", "mdi:shield-check", "protection"
)
async def _adguard_turn_off(self) -> None:
@@ -126,7 +119,7 @@ class AdGuardHomeParentalSwitch(AdGuardHomeSwitch):
def __init__(self, adguard) -> None:
"""Initialize AdGuard Home switch."""
super().__init__(
adguard, "AdGuard Parental Control", 'mdi:shield-check', 'parental'
adguard, "AdGuard Parental Control", "mdi:shield-check", "parental"
)
async def _adguard_turn_off(self) -> None:
@@ -148,7 +141,7 @@ class AdGuardHomeSafeSearchSwitch(AdGuardHomeSwitch):
def __init__(self, adguard) -> None:
"""Initialize AdGuard Home switch."""
super().__init__(
adguard, "AdGuard Safe Search", 'mdi:shield-check', 'safesearch'
adguard, "AdGuard Safe Search", "mdi:shield-check", "safesearch"
)
async def _adguard_turn_off(self) -> None:
@@ -170,10 +163,7 @@ class AdGuardHomeSafeBrowsingSwitch(AdGuardHomeSwitch):
def __init__(self, adguard) -> None:
"""Initialize AdGuard Home switch."""
super().__init__(
adguard,
"AdGuard Safe Browsing",
'mdi:shield-check',
'safebrowsing',
adguard, "AdGuard Safe Browsing", "mdi:shield-check", "safebrowsing"
)
async def _adguard_turn_off(self) -> None:
@@ -194,9 +184,7 @@ class AdGuardHomeFilteringSwitch(AdGuardHomeSwitch):
def __init__(self, adguard) -> None:
"""Initialize AdGuard Home switch."""
super().__init__(
adguard, "AdGuard Filtering", 'mdi:shield-check', 'filtering'
)
super().__init__(adguard, "AdGuard Filtering", "mdi:shield-check", "filtering")
async def _adguard_turn_off(self) -> None:
"""Turn off the switch."""
@@ -216,9 +204,7 @@ class AdGuardHomeQueryLogSwitch(AdGuardHomeSwitch):
def __init__(self, adguard) -> None:
"""Initialize AdGuard Home switch."""
super().__init__(
adguard, "AdGuard Query Log", 'mdi:shield-check', 'querylog'
)
super().__init__(adguard, "AdGuard Query Log", "mdi:shield-check", "querylog")
async def _adguard_turn_off(self) -> None:
"""Turn off the switch."""


@@ -10,57 +10,76 @@ import async_timeout
import voluptuous as vol
from homeassistant.const import (
CONF_DEVICE, CONF_IP_ADDRESS, CONF_PORT, EVENT_HOMEASSISTANT_STOP)
CONF_DEVICE,
CONF_IP_ADDRESS,
CONF_PORT,
EVENT_HOMEASSISTANT_STOP,
)
import homeassistant.helpers.config_validation as cv
from homeassistant.helpers.entity import Entity
_LOGGER = logging.getLogger(__name__)
DATA_ADS = 'data_ads'
DATA_ADS = "data_ads"
# Supported Types
ADSTYPE_BOOL = 'bool'
ADSTYPE_BYTE = 'byte'
ADSTYPE_DINT = 'dint'
ADSTYPE_INT = 'int'
ADSTYPE_UDINT = 'udint'
ADSTYPE_UINT = 'uint'
ADSTYPE_BOOL = "bool"
ADSTYPE_BYTE = "byte"
ADSTYPE_DINT = "dint"
ADSTYPE_INT = "int"
ADSTYPE_UDINT = "udint"
ADSTYPE_UINT = "uint"
CONF_ADS_FACTOR = 'factor'
CONF_ADS_TYPE = 'adstype'
CONF_ADS_VALUE = 'value'
CONF_ADS_VAR = 'adsvar'
CONF_ADS_VAR_BRIGHTNESS = 'adsvar_brightness'
CONF_ADS_VAR_POSITION = 'adsvar_position'
CONF_ADS_FACTOR = "factor"
CONF_ADS_TYPE = "adstype"
CONF_ADS_VALUE = "value"
CONF_ADS_VAR = "adsvar"
CONF_ADS_VAR_BRIGHTNESS = "adsvar_brightness"
CONF_ADS_VAR_POSITION = "adsvar_position"
STATE_KEY_STATE = 'state'
STATE_KEY_BRIGHTNESS = 'brightness'
STATE_KEY_POSITION = 'position'
STATE_KEY_STATE = "state"
STATE_KEY_BRIGHTNESS = "brightness"
STATE_KEY_POSITION = "position"
DOMAIN = 'ads'
DOMAIN = "ads"
SERVICE_WRITE_DATA_BY_NAME = 'write_data_by_name'
SERVICE_WRITE_DATA_BY_NAME = "write_data_by_name"
CONFIG_SCHEMA = vol.Schema({
DOMAIN: vol.Schema({
vol.Required(CONF_DEVICE): cv.string,
vol.Required(CONF_PORT): cv.port,
vol.Optional(CONF_IP_ADDRESS): cv.string,
})
}, extra=vol.ALLOW_EXTRA)
CONFIG_SCHEMA = vol.Schema(
{
DOMAIN: vol.Schema(
{
vol.Required(CONF_DEVICE): cv.string,
vol.Required(CONF_PORT): cv.port,
vol.Optional(CONF_IP_ADDRESS): cv.string,
}
)
},
extra=vol.ALLOW_EXTRA,
)
SCHEMA_SERVICE_WRITE_DATA_BY_NAME = vol.Schema({
vol.Required(CONF_ADS_TYPE):
vol.In([ADSTYPE_INT, ADSTYPE_UINT, ADSTYPE_BYTE, ADSTYPE_BOOL,
ADSTYPE_DINT, ADSTYPE_UDINT]),
vol.Required(CONF_ADS_VALUE): vol.Coerce(int),
vol.Required(CONF_ADS_VAR): cv.string,
})
SCHEMA_SERVICE_WRITE_DATA_BY_NAME = vol.Schema(
{
vol.Required(CONF_ADS_TYPE): vol.In(
[
ADSTYPE_INT,
ADSTYPE_UINT,
ADSTYPE_BYTE,
ADSTYPE_BOOL,
ADSTYPE_DINT,
ADSTYPE_UDINT,
]
),
vol.Required(CONF_ADS_VALUE): vol.Coerce(int),
vol.Required(CONF_ADS_VAR): cv.string,
}
)
def setup(hass, config):
"""Set up the ADS component."""
import pyads
conf = config[DOMAIN]
net_id = conf.get(CONF_DEVICE)
@@ -91,7 +110,10 @@ def setup(hass, config):
except pyads.ADSError:
_LOGGER.error(
"Could not connect to ADS host (netid=%s, ip=%s, port=%s)",
net_id, ip_address, port)
net_id,
ip_address,
port,
)
return False
hass.data[DATA_ADS] = ads
@@ -109,15 +131,18 @@ def setup(hass, config):
_LOGGER.error(err)
hass.services.register(
DOMAIN, SERVICE_WRITE_DATA_BY_NAME, handle_write_data_by_name,
schema=SCHEMA_SERVICE_WRITE_DATA_BY_NAME)
DOMAIN,
SERVICE_WRITE_DATA_BY_NAME,
handle_write_data_by_name,
schema=SCHEMA_SERVICE_WRITE_DATA_BY_NAME,
)
return True
# Tuple to hold data needed for notification
NotificationItem = namedtuple(
'NotificationItem', 'hnotify huser name plc_datatype callback'
"NotificationItem", "hnotify huser name plc_datatype callback"
)
@@ -137,15 +162,17 @@ class AdsHub:
def shutdown(self, *args, **kwargs):
"""Shutdown ADS connection."""
import pyads
_LOGGER.debug("Shutting down ADS")
for notification_item in self._notification_items.values():
_LOGGER.debug(
"Deleting device notification %d, %d",
notification_item.hnotify, notification_item.huser)
notification_item.hnotify,
notification_item.huser,
)
try:
self._client.del_device_notification(
notification_item.hnotify,
notification_item.huser
notification_item.hnotify, notification_item.huser
)
except pyads.ADSError as err:
_LOGGER.error(err)
@@ -161,6 +188,7 @@ class AdsHub:
def write_by_name(self, name, value, plc_datatype):
"""Write a value to the device."""
import pyads
with self._lock:
try:
return self._client.write_by_name(name, value, plc_datatype)
@@ -170,6 +198,7 @@ class AdsHub:
def read_by_name(self, name, plc_datatype):
"""Read a value from the device."""
import pyads
with self._lock:
try:
return self._client.read_by_name(name, plc_datatype)
@@ -179,22 +208,25 @@ class AdsHub:
def add_device_notification(self, name, plc_datatype, callback):
"""Add a notification to the ADS devices."""
import pyads
attr = pyads.NotificationAttrib(ctypes.sizeof(plc_datatype))
with self._lock:
try:
hnotify, huser = self._client.add_device_notification(
name, attr, self._device_notification_callback)
name, attr, self._device_notification_callback
)
except pyads.ADSError as err:
_LOGGER.error("Error subscribing to %s: %s", name, err)
else:
hnotify = int(hnotify)
self._notification_items[hnotify] = NotificationItem(
hnotify, huser, name, plc_datatype, callback)
hnotify, huser, name, plc_datatype, callback
)
_LOGGER.debug(
"Added device notification %d for variable %s",
hnotify, name)
"Added device notification %d for variable %s", hnotify, name
)
def _device_notification_callback(self, notification, name):
"""Handle device notifications."""
@@ -213,17 +245,17 @@ class AdsHub:
# Parse data to desired datatype
if notification_item.plc_datatype == self.PLCTYPE_BOOL:
value = bool(struct.unpack('<?', bytearray(data)[:1])[0])
value = bool(struct.unpack("<?", bytearray(data)[:1])[0])
elif notification_item.plc_datatype == self.PLCTYPE_INT:
value = struct.unpack('<h', bytearray(data)[:2])[0]
value = struct.unpack("<h", bytearray(data)[:2])[0]
elif notification_item.plc_datatype == self.PLCTYPE_BYTE:
value = struct.unpack('<B', bytearray(data)[:1])[0]
value = struct.unpack("<B", bytearray(data)[:1])[0]
elif notification_item.plc_datatype == self.PLCTYPE_UINT:
value = struct.unpack('<H', bytearray(data)[:2])[0]
value = struct.unpack("<H", bytearray(data)[:2])[0]
elif notification_item.plc_datatype == self.PLCTYPE_DINT:
value = struct.unpack('<i', bytearray(data)[:4])[0]
value = struct.unpack("<i", bytearray(data)[:4])[0]
elif notification_item.plc_datatype == self.PLCTYPE_UDINT:
value = struct.unpack('<I', bytearray(data)[:4])[0]
value = struct.unpack("<I", bytearray(data)[:4])[0]
else:
value = bytearray(data)
_LOGGER.warning("No callback available for this datatype")
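The notification callback above decodes the raw ADS payload with struct.unpack and little-endian format codes ("<?", "<h", "<B", "<H", "<i", "<I"). A minimal sketch of the PLCTYPE_INT branch, assuming a hypothetical two-byte payload rather than data from a real PLC:

import struct

data = b"\x39\x05"  # hypothetical notification payload
value = struct.unpack("<h", bytearray(data)[:2])[0]
print(value)  # 1337 (0x0539 read as a little-endian signed 16-bit integer)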
@@ -245,11 +277,13 @@ class AdsEntity(Entity):
self._event = None
async def async_initialize_device(
self, ads_var, plctype, state_key=STATE_KEY_STATE, factor=None):
self, ads_var, plctype, state_key=STATE_KEY_STATE, factor=None
):
"""Register device notification."""
def update(name, value):
"""Handle device notifications."""
_LOGGER.debug('Variable %s changed its value to %d', name, value)
_LOGGER.debug("Variable %s changed its value to %d", name, value)
if factor is None:
self._state_dict[state_key] = value
@@ -266,14 +300,13 @@ class AdsEntity(Entity):
self._event = asyncio.Event()
await self.hass.async_add_executor_job(
self._ads_hub.add_device_notification,
ads_var, plctype, update)
self._ads_hub.add_device_notification, ads_var, plctype, update
)
try:
with async_timeout.timeout(10):
await self._event.wait()
except asyncio.TimeoutError:
_LOGGER.debug('Variable %s: Timeout during first update',
ads_var)
_LOGGER.debug("Variable %s: Timeout during first update", ads_var)
@property
def name(self):


@@ -4,7 +4,10 @@ import logging
import voluptuous as vol
from homeassistant.components.binary_sensor import (
DEVICE_CLASSES_SCHEMA, PLATFORM_SCHEMA, BinarySensorDevice)
DEVICE_CLASSES_SCHEMA,
PLATFORM_SCHEMA,
BinarySensorDevice,
)
from homeassistant.const import CONF_DEVICE_CLASS, CONF_NAME
import homeassistant.helpers.config_validation as cv
@@ -12,12 +15,14 @@ from . import CONF_ADS_VAR, DATA_ADS, AdsEntity, STATE_KEY_STATE
_LOGGER = logging.getLogger(__name__)
DEFAULT_NAME = 'ADS binary sensor'
PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend({
vol.Required(CONF_ADS_VAR): cv.string,
vol.Optional(CONF_NAME, default=DEFAULT_NAME): cv.string,
vol.Optional(CONF_DEVICE_CLASS): DEVICE_CLASSES_SCHEMA,
})
DEFAULT_NAME = "ADS binary sensor"
PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(
{
vol.Required(CONF_ADS_VAR): cv.string,
vol.Optional(CONF_NAME, default=DEFAULT_NAME): cv.string,
vol.Optional(CONF_DEVICE_CLASS): DEVICE_CLASSES_SCHEMA,
}
)
def setup_platform(hass, config, add_entities, discovery_info=None):
@@ -38,12 +43,11 @@ class AdsBinarySensor(AdsEntity, BinarySensorDevice):
def __init__(self, ads_hub, name, ads_var, device_class):
"""Initialize ADS binary sensor."""
super().__init__(ads_hub, name, ads_var)
self._device_class = device_class or 'moving'
self._device_class = device_class or "moving"
async def async_added_to_hass(self):
"""Register device notification."""
await self.async_initialize_device(self._ads_var,
self._ads_hub.PLCTYPE_BOOL)
await self.async_initialize_device(self._ads_var, self._ads_hub.PLCTYPE_BOOL)
@property
def is_on(self):


@@ -4,35 +4,48 @@ import logging
import voluptuous as vol
from homeassistant.components.cover import (
PLATFORM_SCHEMA, SUPPORT_OPEN, SUPPORT_CLOSE, SUPPORT_STOP,
SUPPORT_SET_POSITION, ATTR_POSITION, DEVICE_CLASSES_SCHEMA,
CoverDevice)
from homeassistant.const import (
CONF_NAME, CONF_DEVICE_CLASS)
PLATFORM_SCHEMA,
SUPPORT_OPEN,
SUPPORT_CLOSE,
SUPPORT_STOP,
SUPPORT_SET_POSITION,
ATTR_POSITION,
DEVICE_CLASSES_SCHEMA,
CoverDevice,
)
from homeassistant.const import CONF_NAME, CONF_DEVICE_CLASS
import homeassistant.helpers.config_validation as cv
from . import CONF_ADS_VAR, CONF_ADS_VAR_POSITION, DATA_ADS, \
AdsEntity, STATE_KEY_STATE, STATE_KEY_POSITION
from . import (
CONF_ADS_VAR,
CONF_ADS_VAR_POSITION,
DATA_ADS,
AdsEntity,
STATE_KEY_STATE,
STATE_KEY_POSITION,
)
_LOGGER = logging.getLogger(__name__)
DEFAULT_NAME = 'ADS Cover'
DEFAULT_NAME = "ADS Cover"
CONF_ADS_VAR_SET_POS = 'adsvar_set_position'
CONF_ADS_VAR_OPEN = 'adsvar_open'
CONF_ADS_VAR_CLOSE = 'adsvar_close'
CONF_ADS_VAR_STOP = 'adsvar_stop'
CONF_ADS_VAR_SET_POS = "adsvar_set_position"
CONF_ADS_VAR_OPEN = "adsvar_open"
CONF_ADS_VAR_CLOSE = "adsvar_close"
CONF_ADS_VAR_STOP = "adsvar_stop"
PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend({
vol.Optional(CONF_ADS_VAR): cv.string,
vol.Optional(CONF_ADS_VAR_POSITION): cv.string,
vol.Optional(CONF_ADS_VAR_SET_POS): cv.string,
vol.Optional(CONF_ADS_VAR_CLOSE): cv.string,
vol.Optional(CONF_ADS_VAR_OPEN): cv.string,
vol.Optional(CONF_ADS_VAR_STOP): cv.string,
vol.Optional(CONF_NAME, default=DEFAULT_NAME): cv.string,
vol.Optional(CONF_DEVICE_CLASS): DEVICE_CLASSES_SCHEMA
})
PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(
{
vol.Optional(CONF_ADS_VAR): cv.string,
vol.Optional(CONF_ADS_VAR_POSITION): cv.string,
vol.Optional(CONF_ADS_VAR_SET_POS): cv.string,
vol.Optional(CONF_ADS_VAR_CLOSE): cv.string,
vol.Optional(CONF_ADS_VAR_OPEN): cv.string,
vol.Optional(CONF_ADS_VAR_STOP): cv.string,
vol.Optional(CONF_NAME, default=DEFAULT_NAME): cv.string,
vol.Optional(CONF_DEVICE_CLASS): DEVICE_CLASSES_SCHEMA,
}
)
def setup_platform(hass, config, add_entities, discovery_info=None):
@@ -48,24 +61,38 @@ def setup_platform(hass, config, add_entities, discovery_info=None):
name = config[CONF_NAME]
device_class = config.get(CONF_DEVICE_CLASS)
add_entities([AdsCover(ads_hub,
ads_var_is_closed,
ads_var_position,
ads_var_pos_set,
ads_var_open,
ads_var_close,
ads_var_stop,
name,
device_class)])
add_entities(
[
AdsCover(
ads_hub,
ads_var_is_closed,
ads_var_position,
ads_var_pos_set,
ads_var_open,
ads_var_close,
ads_var_stop,
name,
device_class,
)
]
)
class AdsCover(AdsEntity, CoverDevice):
"""Representation of ADS cover."""
def __init__(self, ads_hub,
ads_var_is_closed, ads_var_position,
ads_var_pos_set, ads_var_open,
ads_var_close, ads_var_stop, name, device_class):
def __init__(
self,
ads_hub,
ads_var_is_closed,
ads_var_position,
ads_var_pos_set,
ads_var_open,
ads_var_close,
ads_var_stop,
name,
device_class,
):
"""Initialize AdsCover entity."""
super().__init__(ads_hub, name, ads_var_is_closed)
if self._ads_var is None:
@@ -87,13 +114,14 @@ class AdsCover(AdsEntity, CoverDevice):
async def async_added_to_hass(self):
"""Register device notification."""
if self._ads_var is not None:
await self.async_initialize_device(self._ads_var,
self._ads_hub.PLCTYPE_BOOL)
await self.async_initialize_device(
self._ads_var, self._ads_hub.PLCTYPE_BOOL
)
if self._ads_var_position is not None:
await self.async_initialize_device(self._ads_var_position,
self._ads_hub.PLCTYPE_BYTE,
STATE_KEY_POSITION)
await self.async_initialize_device(
self._ads_var_position, self._ads_hub.PLCTYPE_BYTE, STATE_KEY_POSITION
)
@property
def device_class(self):
@@ -130,29 +158,33 @@ class AdsCover(AdsEntity, CoverDevice):
def stop_cover(self, **kwargs):
"""Fire the stop action."""
if self._ads_var_stop:
self._ads_hub.write_by_name(self._ads_var_stop, True,
self._ads_hub.PLCTYPE_BOOL)
self._ads_hub.write_by_name(
self._ads_var_stop, True, self._ads_hub.PLCTYPE_BOOL
)
def set_cover_position(self, **kwargs):
"""Set cover position."""
position = kwargs[ATTR_POSITION]
if self._ads_var_pos_set is not None:
self._ads_hub.write_by_name(self._ads_var_pos_set, position,
self._ads_hub.PLCTYPE_BYTE)
self._ads_hub.write_by_name(
self._ads_var_pos_set, position, self._ads_hub.PLCTYPE_BYTE
)
def open_cover(self, **kwargs):
"""Move the cover up."""
if self._ads_var_open is not None:
self._ads_hub.write_by_name(self._ads_var_open, True,
self._ads_hub.PLCTYPE_BOOL)
self._ads_hub.write_by_name(
self._ads_var_open, True, self._ads_hub.PLCTYPE_BOOL
)
elif self._ads_var_pos_set is not None:
self.set_cover_position(position=100)
def close_cover(self, **kwargs):
"""Move the cover down."""
if self._ads_var_close is not None:
self._ads_hub.write_by_name(self._ads_var_close, True,
self._ads_hub.PLCTYPE_BOOL)
self._ads_hub.write_by_name(
self._ads_var_close, True, self._ads_hub.PLCTYPE_BOOL
)
elif self._ads_var_pos_set is not None:
self.set_cover_position(position=0)
@@ -160,6 +192,8 @@ class AdsCover(AdsEntity, CoverDevice):
def available(self):
"""Return False if state has not been updated yet."""
if self._ads_var is not None or self._ads_var_position is not None:
return self._state_dict[STATE_KEY_STATE] is not None or \
self._state_dict[STATE_KEY_POSITION] is not None
return (
self._state_dict[STATE_KEY_STATE] is not None
or self._state_dict[STATE_KEY_POSITION] is not None
)
return True


@@ -4,20 +4,32 @@ import logging
import voluptuous as vol
from homeassistant.components.light import (
ATTR_BRIGHTNESS, PLATFORM_SCHEMA, SUPPORT_BRIGHTNESS, Light)
ATTR_BRIGHTNESS,
PLATFORM_SCHEMA,
SUPPORT_BRIGHTNESS,
Light,
)
from homeassistant.const import CONF_NAME
import homeassistant.helpers.config_validation as cv
from . import CONF_ADS_VAR, CONF_ADS_VAR_BRIGHTNESS, DATA_ADS, \
AdsEntity, STATE_KEY_BRIGHTNESS, STATE_KEY_STATE
from . import (
CONF_ADS_VAR,
CONF_ADS_VAR_BRIGHTNESS,
DATA_ADS,
AdsEntity,
STATE_KEY_BRIGHTNESS,
STATE_KEY_STATE,
)
_LOGGER = logging.getLogger(__name__)
DEFAULT_NAME = 'ADS Light'
PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend({
vol.Required(CONF_ADS_VAR): cv.string,
vol.Optional(CONF_ADS_VAR_BRIGHTNESS): cv.string,
vol.Optional(CONF_NAME, default=DEFAULT_NAME): cv.string
})
DEFAULT_NAME = "ADS Light"
PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(
{
vol.Required(CONF_ADS_VAR): cv.string,
vol.Optional(CONF_ADS_VAR_BRIGHTNESS): cv.string,
vol.Optional(CONF_NAME, default=DEFAULT_NAME): cv.string,
}
)
def setup_platform(hass, config, add_entities, discovery_info=None):
@@ -28,8 +40,7 @@ def setup_platform(hass, config, add_entities, discovery_info=None):
ads_var_brightness = config.get(CONF_ADS_VAR_BRIGHTNESS)
name = config.get(CONF_NAME)
add_entities([AdsLight(ads_hub, ads_var_enable, ads_var_brightness,
name)])
add_entities([AdsLight(ads_hub, ads_var_enable, ads_var_brightness, name)])
class AdsLight(AdsEntity, Light):
@@ -43,13 +54,14 @@ class AdsLight(AdsEntity, Light):
async def async_added_to_hass(self):
"""Register device notification."""
await self.async_initialize_device(self._ads_var,
self._ads_hub.PLCTYPE_BOOL)
await self.async_initialize_device(self._ads_var, self._ads_hub.PLCTYPE_BOOL)
if self._ads_var_brightness is not None:
await self.async_initialize_device(self._ads_var_brightness,
self._ads_hub.PLCTYPE_UINT,
STATE_KEY_BRIGHTNESS)
await self.async_initialize_device(
self._ads_var_brightness,
self._ads_hub.PLCTYPE_UINT,
STATE_KEY_BRIGHTNESS,
)
@property
def brightness(self):
@@ -72,14 +84,13 @@ class AdsLight(AdsEntity, Light):
def turn_on(self, **kwargs):
"""Turn the light on or set a specific dimmer value."""
brightness = kwargs.get(ATTR_BRIGHTNESS)
self._ads_hub.write_by_name(self._ads_var, True,
self._ads_hub.PLCTYPE_BOOL)
self._ads_hub.write_by_name(self._ads_var, True, self._ads_hub.PLCTYPE_BOOL)
if self._ads_var_brightness is not None and brightness is not None:
self._ads_hub.write_by_name(self._ads_var_brightness, brightness,
self._ads_hub.PLCTYPE_UINT)
self._ads_hub.write_by_name(
self._ads_var_brightness, brightness, self._ads_hub.PLCTYPE_UINT
)
def turn_off(self, **kwargs):
"""Turn the light off."""
self._ads_hub.write_by_name(self._ads_var, False,
self._ads_hub.PLCTYPE_BOOL)
self._ads_hub.write_by_name(self._ads_var, False, self._ads_hub.PLCTYPE_BOOL)


@@ -8,21 +8,28 @@ from homeassistant.components.sensor import PLATFORM_SCHEMA
from homeassistant.const import CONF_NAME, CONF_UNIT_OF_MEASUREMENT
import homeassistant.helpers.config_validation as cv
from . import CONF_ADS_FACTOR, CONF_ADS_TYPE, CONF_ADS_VAR, \
AdsEntity, STATE_KEY_STATE
from . import CONF_ADS_FACTOR, CONF_ADS_TYPE, CONF_ADS_VAR, AdsEntity, STATE_KEY_STATE
_LOGGER = logging.getLogger(__name__)
DEFAULT_NAME = "ADS sensor"
PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend({
vol.Required(CONF_ADS_VAR): cv.string,
vol.Optional(CONF_ADS_FACTOR): cv.positive_int,
vol.Optional(CONF_ADS_TYPE, default=ads.ADSTYPE_INT):
vol.In([ads.ADSTYPE_INT, ads.ADSTYPE_UINT, ads.ADSTYPE_BYTE,
ads.ADSTYPE_DINT, ads.ADSTYPE_UDINT]),
vol.Optional(CONF_NAME, default=DEFAULT_NAME): cv.string,
vol.Optional(CONF_UNIT_OF_MEASUREMENT, default=''): cv.string,
})
PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(
{
vol.Required(CONF_ADS_VAR): cv.string,
vol.Optional(CONF_ADS_FACTOR): cv.positive_int,
vol.Optional(CONF_ADS_TYPE, default=ads.ADSTYPE_INT): vol.In(
[
ads.ADSTYPE_INT,
ads.ADSTYPE_UINT,
ads.ADSTYPE_BYTE,
ads.ADSTYPE_DINT,
ads.ADSTYPE_UDINT,
]
),
vol.Optional(CONF_NAME, default=DEFAULT_NAME): cv.string,
vol.Optional(CONF_UNIT_OF_MEASUREMENT, default=""): cv.string,
}
)
def setup_platform(hass, config, add_entities, discovery_info=None):
@@ -35,8 +42,7 @@ def setup_platform(hass, config, add_entities, discovery_info=None):
unit_of_measurement = config.get(CONF_UNIT_OF_MEASUREMENT)
factor = config.get(CONF_ADS_FACTOR)
entity = AdsSensor(
ads_hub, ads_var, ads_type, name, unit_of_measurement, factor)
entity = AdsSensor(ads_hub, ads_var, ads_type, name, unit_of_measurement, factor)
add_entities([entity])
@@ -44,8 +50,7 @@ def setup_platform(hass, config, add_entities, discovery_info=None):
class AdsSensor(AdsEntity):
"""Representation of an ADS sensor entity."""
def __init__(self, ads_hub, ads_var, ads_type, name, unit_of_measurement,
factor):
def __init__(self, ads_hub, ads_var, ads_type, name, unit_of_measurement, factor):
"""Initialize AdsSensor entity."""
super().__init__(ads_hub, name, ads_var)
self._unit_of_measurement = unit_of_measurement
@@ -58,7 +63,8 @@ class AdsSensor(AdsEntity):
self._ads_var,
self._ads_hub.ADS_TYPEMAP[self._ads_type],
STATE_KEY_STATE,
self._factor)
self._factor,
)
@property
def state(self):


@@ -11,12 +11,11 @@ from . import CONF_ADS_VAR, DATA_ADS, AdsEntity, STATE_KEY_STATE
_LOGGER = logging.getLogger(__name__)
DEFAULT_NAME = 'ADS Switch'
DEFAULT_NAME = "ADS Switch"
PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend({
vol.Required(CONF_ADS_VAR): cv.string,
vol.Optional(CONF_NAME): cv.string,
})
PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(
{vol.Required(CONF_ADS_VAR): cv.string, vol.Optional(CONF_NAME): cv.string}
)
def setup_platform(hass, config, add_entities, discovery_info=None):
@@ -34,8 +33,7 @@ class AdsSwitch(AdsEntity, SwitchDevice):
async def async_added_to_hass(self):
"""Register device notification."""
await self.async_initialize_device(self._ads_var,
self._ads_hub.PLCTYPE_BOOL)
await self.async_initialize_device(self._ads_var, self._ads_hub.PLCTYPE_BOOL)
@property
def is_on(self):
@@ -44,10 +42,8 @@ class AdsSwitch(AdsEntity, SwitchDevice):
def turn_on(self, **kwargs):
"""Turn the switch on."""
self._ads_hub.write_by_name(
self._ads_var, True, self._ads_hub.PLCTYPE_BOOL)
self._ads_hub.write_by_name(self._ads_var, True, self._ads_hub.PLCTYPE_BOOL)
def turn_off(self, **kwargs):
"""Turn the switch off."""
self._ads_hub.write_by_name(
self._ads_var, False, self._ads_hub.PLCTYPE_BOOL)
self._ads_hub.write_by_name(self._ads_var, False, self._ads_hub.PLCTYPE_BOOL)


@@ -1,2 +1,2 @@
"""Constants for the Aftership integration."""
DOMAIN = 'aftership'
DOMAIN = "aftership"


@@ -15,24 +15,24 @@ from .const import DOMAIN
_LOGGER = logging.getLogger(__name__)
ATTRIBUTION = 'Information provided by AfterShip'
ATTR_TRACKINGS = 'trackings'
ATTRIBUTION = "Information provided by AfterShip"
ATTR_TRACKINGS = "trackings"
BASE = 'https://track.aftership.com/'
BASE = "https://track.aftership.com/"
CONF_SLUG = 'slug'
CONF_TITLE = 'title'
CONF_TRACKING_NUMBER = 'tracking_number'
CONF_SLUG = "slug"
CONF_TITLE = "title"
CONF_TRACKING_NUMBER = "tracking_number"
DEFAULT_NAME = 'aftership'
UPDATE_TOPIC = DOMAIN + '_update'
DEFAULT_NAME = "aftership"
UPDATE_TOPIC = DOMAIN + "_update"
ICON = 'mdi:package-variant-closed'
ICON = "mdi:package-variant-closed"
MIN_TIME_BETWEEN_UPDATES = timedelta(minutes=5)
SERVICE_ADD_TRACKING = 'add_tracking'
SERVICE_REMOVE_TRACKING = 'remove_tracking'
SERVICE_ADD_TRACKING = "add_tracking"
SERVICE_REMOVE_TRACKING = "remove_tracking"
ADD_TRACKING_SERVICE_SCHEMA = vol.Schema(
{
@@ -43,18 +43,18 @@ ADD_TRACKING_SERVICE_SCHEMA = vol.Schema(
)
REMOVE_TRACKING_SERVICE_SCHEMA = vol.Schema(
{vol.Required(CONF_SLUG): cv.string,
vol.Required(CONF_TRACKING_NUMBER): cv.string}
{vol.Required(CONF_SLUG): cv.string, vol.Required(CONF_TRACKING_NUMBER): cv.string}
)
PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend({
vol.Required(CONF_API_KEY): cv.string,
vol.Optional(CONF_NAME, default=DEFAULT_NAME): cv.string,
})
PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(
{
vol.Required(CONF_API_KEY): cv.string,
vol.Optional(CONF_NAME, default=DEFAULT_NAME): cv.string,
}
)
async def async_setup_platform(
hass, config, async_add_entities, discovery_info=None):
async def async_setup_platform(hass, config, async_add_entities, discovery_info=None):
"""Set up the AfterShip sensor platform."""
from pyaftership.tracker import Tracking
@@ -66,9 +66,10 @@ async def async_setup_platform(
await aftership.get_trackings()
if not aftership.meta or aftership.meta['code'] != 200:
_LOGGER.error("No tracking data found. Check API key is correct: %s",
aftership.meta)
if not aftership.meta or aftership.meta["code"] != 200:
_LOGGER.error(
"No tracking data found. Check API key is correct: %s", aftership.meta
)
return
instance = AfterShipSensor(aftership, name)
@@ -130,7 +131,7 @@ class AfterShipSensor(Entity):
@property
def unit_of_measurement(self):
"""Return the unit of measurement of this entity, if any."""
return 'packages'
return "packages"
@property
def device_state_attributes(self):
@@ -145,7 +146,8 @@ class AfterShipSensor(Entity):
async def async_added_to_hass(self):
"""Register callbacks."""
self.hass.helpers.dispatcher.async_dispatcher_connect(
UPDATE_TOPIC, self.force_update)
UPDATE_TOPIC, self.force_update
)
async def force_update(self):
"""Force update of data."""
@@ -160,40 +162,40 @@ class AfterShipSensor(Entity):
if not self.aftership.meta:
_LOGGER.error("Unknown errors when querying")
return
if self.aftership.meta['code'] != 200:
if self.aftership.meta["code"] != 200:
_LOGGER.error(
"Errors when querying AfterShip. %s", str(self.aftership.meta))
"Errors when querying AfterShip. %s", str(self.aftership.meta)
)
return
status_to_ignore = {'delivered'}
status_to_ignore = {"delivered"}
status_counts = {}
trackings = []
not_delivered_count = 0
for track in self.aftership.trackings['trackings']:
status = track['tag'].lower()
for track in self.aftership.trackings["trackings"]:
status = track["tag"].lower()
name = (
track['tracking_number']
if track['title'] is None
else track['title']
track["tracking_number"] if track["title"] is None else track["title"]
)
last_checkpoint = (
"Shipment pending"
if track['tag'] == "Pending"
else track['checkpoints'][-1]
if track["tag"] == "Pending"
else track["checkpoints"][-1]
)
status_counts[status] = status_counts.get(status, 0) + 1
trackings.append({
'name': name,
'tracking_number': track['tracking_number'],
'slug': track['slug'],
'link': '%s%s/%s' %
(BASE, track['slug'], track['tracking_number']),
'last_update': track['updated_at'],
'expected_delivery': track['expected_delivery'],
'status': track['tag'],
'last_checkpoint': last_checkpoint
})
trackings.append(
{
"name": name,
"tracking_number": track["tracking_number"],
"slug": track["slug"],
"link": "%s%s/%s" % (BASE, track["slug"], track["tracking_number"]),
"last_update": track["updated_at"],
"expected_delivery": track["expected_delivery"],
"status": track["tag"],
"last_checkpoint": last_checkpoint,
}
)
if status not in status_to_ignore:
not_delivered_count += 1


@@ -4,50 +4,53 @@ import logging
from homeassistant.helpers.entity_component import EntityComponent
from homeassistant.helpers.config_validation import ( # noqa
PLATFORM_SCHEMA, PLATFORM_SCHEMA_BASE)
PLATFORM_SCHEMA,
PLATFORM_SCHEMA_BASE,
)
from homeassistant.helpers.entity import Entity
_LOGGER = logging.getLogger(__name__)
ATTR_AQI = 'air_quality_index'
ATTR_ATTRIBUTION = 'attribution'
ATTR_CO2 = 'carbon_dioxide'
ATTR_CO = 'carbon_monoxide'
ATTR_N2O = 'nitrogen_oxide'
ATTR_NO = 'nitrogen_monoxide'
ATTR_NO2 = 'nitrogen_dioxide'
ATTR_OZONE = 'ozone'
ATTR_PM_0_1 = 'particulate_matter_0_1'
ATTR_PM_10 = 'particulate_matter_10'
ATTR_PM_2_5 = 'particulate_matter_2_5'
ATTR_SO2 = 'sulphur_dioxide'
ATTR_AQI = "air_quality_index"
ATTR_ATTRIBUTION = "attribution"
ATTR_CO2 = "carbon_dioxide"
ATTR_CO = "carbon_monoxide"
ATTR_N2O = "nitrogen_oxide"
ATTR_NO = "nitrogen_monoxide"
ATTR_NO2 = "nitrogen_dioxide"
ATTR_OZONE = "ozone"
ATTR_PM_0_1 = "particulate_matter_0_1"
ATTR_PM_10 = "particulate_matter_10"
ATTR_PM_2_5 = "particulate_matter_2_5"
ATTR_SO2 = "sulphur_dioxide"
DOMAIN = 'air_quality'
DOMAIN = "air_quality"
ENTITY_ID_FORMAT = DOMAIN + '.{}'
ENTITY_ID_FORMAT = DOMAIN + ".{}"
SCAN_INTERVAL = timedelta(seconds=30)
PROP_TO_ATTR = {
'air_quality_index': ATTR_AQI,
'attribution': ATTR_ATTRIBUTION,
'carbon_dioxide': ATTR_CO2,
'carbon_monoxide': ATTR_CO,
'nitrogen_oxide': ATTR_N2O,
'nitrogen_monoxide': ATTR_NO,
'nitrogen_dioxide': ATTR_NO2,
'ozone': ATTR_OZONE,
'particulate_matter_0_1': ATTR_PM_0_1,
'particulate_matter_10': ATTR_PM_10,
'particulate_matter_2_5': ATTR_PM_2_5,
'sulphur_dioxide': ATTR_SO2,
"air_quality_index": ATTR_AQI,
"attribution": ATTR_ATTRIBUTION,
"carbon_dioxide": ATTR_CO2,
"carbon_monoxide": ATTR_CO,
"nitrogen_oxide": ATTR_N2O,
"nitrogen_monoxide": ATTR_NO,
"nitrogen_dioxide": ATTR_NO2,
"ozone": ATTR_OZONE,
"particulate_matter_0_1": ATTR_PM_0_1,
"particulate_matter_10": ATTR_PM_10,
"particulate_matter_2_5": ATTR_PM_2_5,
"sulphur_dioxide": ATTR_SO2,
}
async def async_setup(hass, config):
"""Set up the air quality component."""
component = hass.data[DOMAIN] = EntityComponent(
_LOGGER, DOMAIN, hass, SCAN_INTERVAL)
_LOGGER, DOMAIN, hass, SCAN_INTERVAL
)
await component.async_setup(config)
return True
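PROP_TO_ATTR above maps entity property names to the attribute keys that end up in the state machine. A rough sketch of how such a mapping is typically consumed, with a subset of the constants inlined as plain strings and a made-up DemoAirQuality class standing in for the real entity:

PROP_TO_ATTR = {
    "air_quality_index": "air_quality_index",
    "particulate_matter_2_5": "particulate_matter_2_5",
    "attribution": "attribution",
}


class DemoAirQuality:
    """Stand-in entity exposing a few of the mapped properties."""

    @property
    def air_quality_index(self):
        return 42

    @property
    def particulate_matter_2_5(self):
        return 7.0

    @property
    def attribution(self):
        return "Demo data"

    @property
    def state_attributes(self):
        """Collect every mapped property that currently has a value."""
        return {
            attr: getattr(self, prop)
            for prop, attr in PROP_TO_ATTR.items()
            if getattr(self, prop) is not None
        }


assert DemoAirQuality().state_attributes["air_quality_index"] == 42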

@@ -6,118 +6,96 @@ import voluptuous as vol
from homeassistant.components.sensor import PLATFORM_SCHEMA
from homeassistant.const import (
ATTR_ATTRIBUTION, ATTR_LATITUDE, ATTR_LONGITUDE, CONF_API_KEY,
CONF_LATITUDE, CONF_LONGITUDE, CONF_MONITORED_CONDITIONS,
CONF_SCAN_INTERVAL, CONF_STATE, CONF_SHOW_ON_MAP)
ATTR_ATTRIBUTION,
ATTR_LATITUDE,
ATTR_LONGITUDE,
CONF_API_KEY,
CONF_LATITUDE,
CONF_LONGITUDE,
CONF_MONITORED_CONDITIONS,
CONF_SCAN_INTERVAL,
CONF_STATE,
CONF_SHOW_ON_MAP,
)
from homeassistant.helpers import aiohttp_client, config_validation as cv
from homeassistant.helpers.entity import Entity
from homeassistant.util import Throttle
_LOGGER = getLogger(__name__)
ATTR_CITY = 'city'
ATTR_COUNTRY = 'country'
ATTR_POLLUTANT_SYMBOL = 'pollutant_symbol'
ATTR_POLLUTANT_UNIT = 'pollutant_unit'
ATTR_REGION = 'region'
ATTR_CITY = "city"
ATTR_COUNTRY = "country"
ATTR_POLLUTANT_SYMBOL = "pollutant_symbol"
ATTR_POLLUTANT_UNIT = "pollutant_unit"
ATTR_REGION = "region"
CONF_CITY = 'city'
CONF_COUNTRY = 'country'
CONF_CITY = "city"
CONF_COUNTRY = "country"
DEFAULT_ATTRIBUTION = "Data provided by AirVisual"
DEFAULT_SCAN_INTERVAL = timedelta(minutes=10)
MASS_PARTS_PER_MILLION = 'ppm'
MASS_PARTS_PER_BILLION = 'ppb'
VOLUME_MICROGRAMS_PER_CUBIC_METER = 'µg/m3'
MASS_PARTS_PER_MILLION = "ppm"
MASS_PARTS_PER_BILLION = "ppb"
VOLUME_MICROGRAMS_PER_CUBIC_METER = "µg/m3"
SENSOR_TYPE_LEVEL = 'air_pollution_level'
SENSOR_TYPE_AQI = 'air_quality_index'
SENSOR_TYPE_POLLUTANT = 'main_pollutant'
SENSOR_TYPE_LEVEL = "air_pollution_level"
SENSOR_TYPE_AQI = "air_quality_index"
SENSOR_TYPE_POLLUTANT = "main_pollutant"
SENSORS = [
(SENSOR_TYPE_LEVEL, 'Air Pollution Level', 'mdi:gauge', None),
(SENSOR_TYPE_AQI, 'Air Quality Index', 'mdi:chart-line', 'AQI'),
(SENSOR_TYPE_POLLUTANT, 'Main Pollutant', 'mdi:chemical-weapon', None),
(SENSOR_TYPE_LEVEL, "Air Pollution Level", "mdi:gauge", None),
(SENSOR_TYPE_AQI, "Air Quality Index", "mdi:chart-line", "AQI"),
(SENSOR_TYPE_POLLUTANT, "Main Pollutant", "mdi:chemical-weapon", None),
]
POLLUTANT_LEVEL_MAPPING = [{
'label': 'Good',
'icon': 'mdi:emoticon-excited',
'minimum': 0,
'maximum': 50
}, {
'label': 'Moderate',
'icon': 'mdi:emoticon-happy',
'minimum': 51,
'maximum': 100
}, {
'label': 'Unhealthy for sensitive groups',
'icon': 'mdi:emoticon-neutral',
'minimum': 101,
'maximum': 150
}, {
'label': 'Unhealthy',
'icon': 'mdi:emoticon-sad',
'minimum': 151,
'maximum': 200
}, {
'label': 'Very Unhealthy',
'icon': 'mdi:emoticon-dead',
'minimum': 201,
'maximum': 300
}, {
'label': 'Hazardous',
'icon': 'mdi:biohazard',
'minimum': 301,
'maximum': 10000
}]
POLLUTANT_LEVEL_MAPPING = [
{"label": "Good", "icon": "mdi:emoticon-excited", "minimum": 0, "maximum": 50},
{"label": "Moderate", "icon": "mdi:emoticon-happy", "minimum": 51, "maximum": 100},
{
"label": "Unhealthy for sensitive groups",
"icon": "mdi:emoticon-neutral",
"minimum": 101,
"maximum": 150,
},
{"label": "Unhealthy", "icon": "mdi:emoticon-sad", "minimum": 151, "maximum": 200},
{
"label": "Very Unhealthy",
"icon": "mdi:emoticon-dead",
"minimum": 201,
"maximum": 300,
},
{"label": "Hazardous", "icon": "mdi:biohazard", "minimum": 301, "maximum": 10000},
]
POLLUTANT_MAPPING = {
'co': {
'label': 'Carbon Monoxide',
'unit': MASS_PARTS_PER_MILLION
},
'n2': {
'label': 'Nitrogen Dioxide',
'unit': MASS_PARTS_PER_BILLION
},
'o3': {
'label': 'Ozone',
'unit': MASS_PARTS_PER_BILLION
},
'p1': {
'label': 'PM10',
'unit': VOLUME_MICROGRAMS_PER_CUBIC_METER
},
'p2': {
'label': 'PM2.5',
'unit': VOLUME_MICROGRAMS_PER_CUBIC_METER
},
's2': {
'label': 'Sulfur Dioxide',
'unit': MASS_PARTS_PER_BILLION
},
"co": {"label": "Carbon Monoxide", "unit": MASS_PARTS_PER_MILLION},
"n2": {"label": "Nitrogen Dioxide", "unit": MASS_PARTS_PER_BILLION},
"o3": {"label": "Ozone", "unit": MASS_PARTS_PER_BILLION},
"p1": {"label": "PM10", "unit": VOLUME_MICROGRAMS_PER_CUBIC_METER},
"p2": {"label": "PM2.5", "unit": VOLUME_MICROGRAMS_PER_CUBIC_METER},
"s2": {"label": "Sulfur Dioxide", "unit": MASS_PARTS_PER_BILLION},
}
SENSOR_LOCALES = {'cn': 'Chinese', 'us': 'U.S.'}
SENSOR_LOCALES = {"cn": "Chinese", "us": "U.S."}
PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend({
vol.Required(CONF_API_KEY): cv.string,
vol.Required(CONF_MONITORED_CONDITIONS, default=list(SENSOR_LOCALES)):
vol.All(cv.ensure_list, [vol.In(SENSOR_LOCALES)]),
vol.Inclusive(CONF_CITY, 'city'): cv.string,
vol.Inclusive(CONF_COUNTRY, 'city'): cv.string,
vol.Inclusive(CONF_LATITUDE, 'coords'): cv.latitude,
vol.Inclusive(CONF_LONGITUDE, 'coords'): cv.longitude,
vol.Optional(CONF_SHOW_ON_MAP, default=True): cv.boolean,
vol.Inclusive(CONF_STATE, 'city'): cv.string,
vol.Optional(CONF_SCAN_INTERVAL, default=DEFAULT_SCAN_INTERVAL):
cv.time_period
})
PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(
{
vol.Required(CONF_API_KEY): cv.string,
vol.Required(CONF_MONITORED_CONDITIONS, default=list(SENSOR_LOCALES)): vol.All(
cv.ensure_list, [vol.In(SENSOR_LOCALES)]
),
vol.Inclusive(CONF_CITY, "city"): cv.string,
vol.Inclusive(CONF_COUNTRY, "city"): cv.string,
vol.Inclusive(CONF_LATITUDE, "coords"): cv.latitude,
vol.Inclusive(CONF_LONGITUDE, "coords"): cv.longitude,
vol.Optional(CONF_SHOW_ON_MAP, default=True): cv.boolean,
vol.Inclusive(CONF_STATE, "city"): cv.string,
vol.Optional(CONF_SCAN_INTERVAL, default=DEFAULT_SCAN_INTERVAL): cv.time_period,
}
)
async def async_setup_platform(
hass, config, async_add_entities, discovery_info=None):
async def async_setup_platform(hass, config, async_add_entities, discovery_info=None):
"""Configure the platform and add the sensors."""
from pyairvisual import Client
@@ -132,25 +110,27 @@ async def async_setup_platform(
if city and state and country:
_LOGGER.debug(
"Using city, state, and country: %s, %s, %s", city, state, country)
location_id = ','.join((city, state, country))
"Using city, state, and country: %s, %s, %s", city, state, country
)
location_id = ",".join((city, state, country))
data = AirVisualData(
Client(websession, api_key=config[CONF_API_KEY]),
city=city,
state=state,
country=country,
show_on_map=config[CONF_SHOW_ON_MAP],
scan_interval=config[CONF_SCAN_INTERVAL])
scan_interval=config[CONF_SCAN_INTERVAL],
)
else:
_LOGGER.debug(
"Using latitude and longitude: %s, %s", latitude, longitude)
location_id = ','.join((str(latitude), str(longitude)))
_LOGGER.debug("Using latitude and longitude: %s, %s", latitude, longitude)
location_id = ",".join((str(latitude), str(longitude)))
data = AirVisualData(
Client(websession, api_key=config[CONF_API_KEY]),
latitude=latitude,
longitude=longitude,
show_on_map=config[CONF_SHOW_ON_MAP],
scan_interval=config[CONF_SCAN_INTERVAL])
scan_interval=config[CONF_SCAN_INTERVAL],
)
await data.async_update()
@@ -158,8 +138,8 @@ async def async_setup_platform(
for locale in config[CONF_MONITORED_CONDITIONS]:
for kind, name, icon, unit in SENSORS:
sensors.append(
AirVisualSensor(
data, kind, name, icon, unit, locale, location_id))
AirVisualSensor(data, kind, name, icon, unit, locale, location_id)
)
async_add_entities(sensors, True)
@@ -186,8 +166,8 @@ class AirVisualSensor(Entity):
self._attrs[ATTR_LATITUDE] = self.airvisual.latitude
self._attrs[ATTR_LONGITUDE] = self.airvisual.longitude
else:
self._attrs['lati'] = self.airvisual.latitude
self._attrs['long'] = self.airvisual.longitude
self._attrs["lati"] = self.airvisual.latitude
self._attrs["long"] = self.airvisual.longitude
return self._attrs
@@ -204,7 +184,7 @@ class AirVisualSensor(Entity):
@property
def name(self):
"""Return the name."""
return '{0} {1}'.format(SENSOR_LOCALES[self._locale], self._name)
return "{0} {1}".format(SENSOR_LOCALES[self._locale], self._name)
@property
def state(self):
@@ -214,8 +194,7 @@ class AirVisualSensor(Entity):
@property
def unique_id(self):
"""Return a unique, HASS-friendly identifier for this entity."""
return '{0}_{1}_{2}'.format(
self._location_id, self._locale, self._type)
return "{0}_{1}_{2}".format(self._location_id, self._locale, self._type)
@property
def unit_of_measurement(self):
@@ -231,22 +210,25 @@ class AirVisualSensor(Entity):
return
if self._type == SENSOR_TYPE_LEVEL:
aqi = data['aqi{0}'.format(self._locale)]
aqi = data["aqi{0}".format(self._locale)]
[level] = [
i for i in POLLUTANT_LEVEL_MAPPING
if i['minimum'] <= aqi <= i['maximum']
i
for i in POLLUTANT_LEVEL_MAPPING
if i["minimum"] <= aqi <= i["maximum"]
]
self._state = level['label']
self._icon = level['icon']
self._state = level["label"]
self._icon = level["icon"]
elif self._type == SENSOR_TYPE_AQI:
self._state = data['aqi{0}'.format(self._locale)]
self._state = data["aqi{0}".format(self._locale)]
elif self._type == SENSOR_TYPE_POLLUTANT:
symbol = data['main{0}'.format(self._locale)]
self._state = POLLUTANT_MAPPING[symbol]['label']
self._attrs.update({
ATTR_POLLUTANT_SYMBOL: symbol,
ATTR_POLLUTANT_UNIT: POLLUTANT_MAPPING[symbol]['unit']
})
symbol = data["main{0}".format(self._locale)]
self._state = POLLUTANT_MAPPING[symbol]["label"]
self._attrs.update(
{
ATTR_POLLUTANT_SYMBOL: symbol,
ATTR_POLLUTANT_UNIT: POLLUTANT_MAPPING[symbol]["unit"],
}
)
class AirVisualData:
@@ -263,8 +245,7 @@ class AirVisualData:
self.show_on_map = kwargs.get(CONF_SHOW_ON_MAP)
self.state = kwargs.get(CONF_STATE)
self.async_update = Throttle(
kwargs[CONF_SCAN_INTERVAL])(self._async_update)
self.async_update = Throttle(kwargs[CONF_SCAN_INTERVAL])(self._async_update)
async def _async_update(self):
"""Update AirVisual data."""
@@ -272,23 +253,21 @@ class AirVisualData:
try:
if self.city and self.state and self.country:
resp = await self._client.api.city(
self.city, self.state, self.country)
self.longitude, self.latitude = resp['location']['coordinates']
resp = await self._client.api.city(self.city, self.state, self.country)
self.longitude, self.latitude = resp["location"]["coordinates"]
else:
resp = await self._client.api.nearest_city(
self.latitude, self.longitude)
self.latitude, self.longitude
)
_LOGGER.debug("New data retrieved: %s", resp)
self.pollution_info = resp['current']['pollution']
self.pollution_info = resp["current"]["pollution"]
except (KeyError, AirVisualError) as err:
if self.city and self.state and self.country:
location = (self.city, self.state, self.country)
else:
location = (self.latitude, self.longitude)
_LOGGER.error(
"Can't retrieve data for location: %s (%s)", location,
err)
_LOGGER.error("Can't retrieve data for location: %s (%s)", location, err)
self.pollution_info = {}
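The AirVisualData constructor above wraps its update coroutine with Throttle so that calls arriving inside the configured scan interval become no-ops. A standalone approximation of that per-instance pattern (the throttle decorator and DemoData class below are simplifications written for illustration, not the homeassistant.util.Throttle implementation):

import asyncio
from datetime import datetime, timedelta


def throttle(interval):
    """Return a decorator that skips calls made within `interval` of the last run."""

    def decorator(func):
        last_run = None

        async def wrapper(*args, **kwargs):
            nonlocal last_run
            now = datetime.utcnow()
            if last_run is not None and now - last_run < interval:
                return None  # Too soon; skip this update.
            last_run = now
            return await func(*args, **kwargs)

        return wrapper

    return decorator


class DemoData:
    """Mirrors the shape of Throttle(kwargs[CONF_SCAN_INTERVAL])(self._async_update)."""

    def __init__(self, scan_interval):
        self.calls = 0
        # Each instance gets its own independently throttled bound method.
        self.async_update = throttle(scan_interval)(self._async_update)

    async def _async_update(self):
        self.calls += 1


async def main():
    data = DemoData(scan_interval=timedelta(minutes=10))
    await data.async_update()
    await data.async_update()  # Skipped: still inside the scan interval.
    assert data.calls == 1


asyncio.run(main())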

@@ -3,30 +3,39 @@ import logging
import voluptuous as vol
from homeassistant.components.cover import (CoverDevice, PLATFORM_SCHEMA,
SUPPORT_OPEN, SUPPORT_CLOSE)
from homeassistant.const import (CONF_USERNAME, CONF_PASSWORD, STATE_CLOSED,
STATE_OPENING, STATE_CLOSING, STATE_OPEN)
from homeassistant.components.cover import (
CoverDevice,
PLATFORM_SCHEMA,
SUPPORT_OPEN,
SUPPORT_CLOSE,
)
from homeassistant.const import (
CONF_USERNAME,
CONF_PASSWORD,
STATE_CLOSED,
STATE_OPENING,
STATE_CLOSING,
STATE_OPEN,
)
import homeassistant.helpers.config_validation as cv
_LOGGER = logging.getLogger(__name__)
NOTIFICATION_ID = 'aladdin_notification'
NOTIFICATION_TITLE = 'Aladdin Connect Cover Setup'
NOTIFICATION_ID = "aladdin_notification"
NOTIFICATION_TITLE = "Aladdin Connect Cover Setup"
STATES_MAP = {
'open': STATE_OPEN,
'opening': STATE_OPENING,
'closed': STATE_CLOSED,
'closing': STATE_CLOSING
"open": STATE_OPEN,
"opening": STATE_OPENING,
"closed": STATE_CLOSED,
"closing": STATE_CLOSING,
}
SUPPORTED_FEATURES = SUPPORT_OPEN | SUPPORT_CLOSE
PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend({
vol.Required(CONF_USERNAME): cv.string,
vol.Required(CONF_PASSWORD): cv.string
})
PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(
{vol.Required(CONF_USERNAME): cv.string, vol.Required(CONF_PASSWORD): cv.string}
)
def setup_platform(hass, config, add_entities, discovery_info=None):
@@ -44,11 +53,12 @@ def setup_platform(hass, config, add_entities, discovery_info=None):
except (TypeError, KeyError, NameError, ValueError) as ex:
_LOGGER.error("%s", ex)
hass.components.persistent_notification.create(
'Error: {}<br />'
'You will need to restart hass after fixing.'
''.format(ex),
"Error: {}<br />"
"You will need to restart hass after fixing."
"".format(ex),
title=NOTIFICATION_TITLE,
notification_id=NOTIFICATION_ID)
notification_id=NOTIFICATION_ID,
)
class AladdinDevice(CoverDevice):
@@ -57,15 +67,15 @@ class AladdinDevice(CoverDevice):
def __init__(self, acc, device):
"""Initialize the cover."""
self._acc = acc
self._device_id = device['device_id']
self._number = device['door_number']
self._name = device['name']
self._status = STATES_MAP.get(device['status'])
self._device_id = device["device_id"]
self._number = device["door_number"]
self._name = device["name"]
self._status = STATES_MAP.get(device["status"])
@property
def device_class(self):
"""Define this cover as a garage door."""
return 'garage'
return "garage"
@property
def supported_features(self):
@@ -75,7 +85,7 @@ class AladdinDevice(CoverDevice):
@property
def unique_id(self):
"""Return a unique ID."""
return '{}-{}'.format(self._device_id, self._number)
return "{}-{}".format(self._device_id, self._number)
@property
def name(self):

@@ -5,60 +5,65 @@ import logging
import voluptuous as vol
from homeassistant.const import (
ATTR_CODE, ATTR_CODE_FORMAT, ATTR_ENTITY_ID, SERVICE_ALARM_TRIGGER,
SERVICE_ALARM_DISARM, SERVICE_ALARM_ARM_HOME, SERVICE_ALARM_ARM_AWAY,
SERVICE_ALARM_ARM_NIGHT, SERVICE_ALARM_ARM_CUSTOM_BYPASS)
ATTR_CODE,
ATTR_CODE_FORMAT,
SERVICE_ALARM_TRIGGER,
SERVICE_ALARM_DISARM,
SERVICE_ALARM_ARM_HOME,
SERVICE_ALARM_ARM_AWAY,
SERVICE_ALARM_ARM_NIGHT,
SERVICE_ALARM_ARM_CUSTOM_BYPASS,
)
from homeassistant.helpers.config_validation import ( # noqa
PLATFORM_SCHEMA, PLATFORM_SCHEMA_BASE)
ENTITY_SERVICE_SCHEMA,
PLATFORM_SCHEMA,
PLATFORM_SCHEMA_BASE,
)
import homeassistant.helpers.config_validation as cv
from homeassistant.helpers.entity import Entity
from homeassistant.helpers.entity_component import EntityComponent
DOMAIN = 'alarm_control_panel'
DOMAIN = "alarm_control_panel"
SCAN_INTERVAL = timedelta(seconds=30)
ATTR_CHANGED_BY = 'changed_by'
FORMAT_TEXT = 'text'
FORMAT_NUMBER = 'number'
ATTR_CODE_ARM_REQUIRED = 'code_arm_required'
ATTR_CHANGED_BY = "changed_by"
FORMAT_TEXT = "text"
FORMAT_NUMBER = "number"
ATTR_CODE_ARM_REQUIRED = "code_arm_required"
ENTITY_ID_FORMAT = DOMAIN + '.{}'
ENTITY_ID_FORMAT = DOMAIN + ".{}"
ALARM_SERVICE_SCHEMA = vol.Schema({
vol.Optional(ATTR_ENTITY_ID): cv.comp_entity_ids,
vol.Optional(ATTR_CODE): cv.string,
})
ALARM_SERVICE_SCHEMA = ENTITY_SERVICE_SCHEMA.extend(
{vol.Optional(ATTR_CODE): cv.string}
)
async def async_setup(hass, config):
"""Track states and offer events for sensors."""
component = hass.data[DOMAIN] = EntityComponent(
logging.getLogger(__name__), DOMAIN, hass, SCAN_INTERVAL)
logging.getLogger(__name__), DOMAIN, hass, SCAN_INTERVAL
)
await component.async_setup(config)
component.async_register_entity_service(
SERVICE_ALARM_DISARM, ALARM_SERVICE_SCHEMA,
'async_alarm_disarm'
SERVICE_ALARM_DISARM, ALARM_SERVICE_SCHEMA, "async_alarm_disarm"
)
component.async_register_entity_service(
SERVICE_ALARM_ARM_HOME, ALARM_SERVICE_SCHEMA,
'async_alarm_arm_home'
SERVICE_ALARM_ARM_HOME, ALARM_SERVICE_SCHEMA, "async_alarm_arm_home"
)
component.async_register_entity_service(
SERVICE_ALARM_ARM_AWAY, ALARM_SERVICE_SCHEMA,
'async_alarm_arm_away'
SERVICE_ALARM_ARM_AWAY, ALARM_SERVICE_SCHEMA, "async_alarm_arm_away"
)
component.async_register_entity_service(
SERVICE_ALARM_ARM_NIGHT, ALARM_SERVICE_SCHEMA,
'async_alarm_arm_night'
SERVICE_ALARM_ARM_NIGHT, ALARM_SERVICE_SCHEMA, "async_alarm_arm_night"
)
component.async_register_entity_service(
SERVICE_ALARM_ARM_CUSTOM_BYPASS, ALARM_SERVICE_SCHEMA,
'async_alarm_arm_custom_bypass'
SERVICE_ALARM_ARM_CUSTOM_BYPASS,
ALARM_SERVICE_SCHEMA,
"async_alarm_arm_custom_bypass",
)
component.async_register_entity_service(
SERVICE_ALARM_TRIGGER, ALARM_SERVICE_SCHEMA,
'async_alarm_trigger'
SERVICE_ALARM_TRIGGER, ALARM_SERVICE_SCHEMA, "async_alarm_trigger"
)
return True
@@ -157,8 +162,7 @@ class AlarmControlPanel(Entity):
This method must be run in the event loop and returns a coroutine.
"""
return self.hass.async_add_executor_job(
self.alarm_arm_custom_bypass, code)
return self.hass.async_add_executor_job(self.alarm_arm_custom_bypass, code)
@property
def state_attributes(self):
@@ -166,6 +170,6 @@ class AlarmControlPanel(Entity):
state_attr = {
ATTR_CODE_FORMAT: self.code_format,
ATTR_CHANGED_BY: self.changed_by,
ATTR_CODE_ARM_REQUIRED: self.code_arm_required
ATTR_CODE_ARM_REQUIRED: self.code_arm_required,
}
return state_attr

@@ -12,85 +12,105 @@ from homeassistant.components.binary_sensor import DEVICE_CLASSES_SCHEMA
_LOGGER = logging.getLogger(__name__)
DOMAIN = 'alarmdecoder'
DOMAIN = "alarmdecoder"
DATA_AD = 'alarmdecoder'
DATA_AD = "alarmdecoder"
CONF_DEVICE = 'device'
CONF_DEVICE_BAUD = 'baudrate'
CONF_DEVICE_PATH = 'path'
CONF_DEVICE_PORT = 'port'
CONF_DEVICE_TYPE = 'type'
CONF_PANEL_DISPLAY = 'panel_display'
CONF_ZONE_NAME = 'name'
CONF_ZONE_TYPE = 'type'
CONF_ZONE_LOOP = 'loop'
CONF_ZONE_RFID = 'rfid'
CONF_ZONES = 'zones'
CONF_RELAY_ADDR = 'relayaddr'
CONF_RELAY_CHAN = 'relaychan'
CONF_DEVICE = "device"
CONF_DEVICE_BAUD = "baudrate"
CONF_DEVICE_PATH = "path"
CONF_DEVICE_PORT = "port"
CONF_DEVICE_TYPE = "type"
CONF_PANEL_DISPLAY = "panel_display"
CONF_ZONE_NAME = "name"
CONF_ZONE_TYPE = "type"
CONF_ZONE_LOOP = "loop"
CONF_ZONE_RFID = "rfid"
CONF_ZONES = "zones"
CONF_RELAY_ADDR = "relayaddr"
CONF_RELAY_CHAN = "relaychan"
DEFAULT_DEVICE_TYPE = 'socket'
DEFAULT_DEVICE_HOST = 'localhost'
DEFAULT_DEVICE_TYPE = "socket"
DEFAULT_DEVICE_HOST = "localhost"
DEFAULT_DEVICE_PORT = 10000
DEFAULT_DEVICE_PATH = '/dev/ttyUSB0'
DEFAULT_DEVICE_PATH = "/dev/ttyUSB0"
DEFAULT_DEVICE_BAUD = 115200
DEFAULT_PANEL_DISPLAY = False
DEFAULT_ZONE_TYPE = 'opening'
DEFAULT_ZONE_TYPE = "opening"
SIGNAL_PANEL_MESSAGE = 'alarmdecoder.panel_message'
SIGNAL_PANEL_ARM_AWAY = 'alarmdecoder.panel_arm_away'
SIGNAL_PANEL_ARM_HOME = 'alarmdecoder.panel_arm_home'
SIGNAL_PANEL_DISARM = 'alarmdecoder.panel_disarm'
SIGNAL_PANEL_MESSAGE = "alarmdecoder.panel_message"
SIGNAL_PANEL_ARM_AWAY = "alarmdecoder.panel_arm_away"
SIGNAL_PANEL_ARM_HOME = "alarmdecoder.panel_arm_home"
SIGNAL_PANEL_DISARM = "alarmdecoder.panel_disarm"
SIGNAL_ZONE_FAULT = 'alarmdecoder.zone_fault'
SIGNAL_ZONE_RESTORE = 'alarmdecoder.zone_restore'
SIGNAL_RFX_MESSAGE = 'alarmdecoder.rfx_message'
SIGNAL_REL_MESSAGE = 'alarmdecoder.rel_message'
SIGNAL_ZONE_FAULT = "alarmdecoder.zone_fault"
SIGNAL_ZONE_RESTORE = "alarmdecoder.zone_restore"
SIGNAL_RFX_MESSAGE = "alarmdecoder.rfx_message"
SIGNAL_REL_MESSAGE = "alarmdecoder.rel_message"
DEVICE_SOCKET_SCHEMA = vol.Schema({
vol.Required(CONF_DEVICE_TYPE): 'socket',
vol.Optional(CONF_HOST, default=DEFAULT_DEVICE_HOST): cv.string,
vol.Optional(CONF_DEVICE_PORT, default=DEFAULT_DEVICE_PORT): cv.port})
DEVICE_SOCKET_SCHEMA = vol.Schema(
{
vol.Required(CONF_DEVICE_TYPE): "socket",
vol.Optional(CONF_HOST, default=DEFAULT_DEVICE_HOST): cv.string,
vol.Optional(CONF_DEVICE_PORT, default=DEFAULT_DEVICE_PORT): cv.port,
}
)
DEVICE_SERIAL_SCHEMA = vol.Schema({
vol.Required(CONF_DEVICE_TYPE): 'serial',
vol.Optional(CONF_DEVICE_PATH, default=DEFAULT_DEVICE_PATH): cv.string,
vol.Optional(CONF_DEVICE_BAUD, default=DEFAULT_DEVICE_BAUD): cv.string})
DEVICE_SERIAL_SCHEMA = vol.Schema(
{
vol.Required(CONF_DEVICE_TYPE): "serial",
vol.Optional(CONF_DEVICE_PATH, default=DEFAULT_DEVICE_PATH): cv.string,
vol.Optional(CONF_DEVICE_BAUD, default=DEFAULT_DEVICE_BAUD): cv.string,
}
)
DEVICE_USB_SCHEMA = vol.Schema({
vol.Required(CONF_DEVICE_TYPE): 'usb'})
DEVICE_USB_SCHEMA = vol.Schema({vol.Required(CONF_DEVICE_TYPE): "usb"})
ZONE_SCHEMA = vol.Schema({
vol.Required(CONF_ZONE_NAME): cv.string,
vol.Optional(CONF_ZONE_TYPE,
default=DEFAULT_ZONE_TYPE): vol.Any(DEVICE_CLASSES_SCHEMA),
vol.Optional(CONF_ZONE_RFID): cv.string,
vol.Optional(CONF_ZONE_LOOP):
vol.All(vol.Coerce(int), vol.Range(min=1, max=4)),
vol.Inclusive(CONF_RELAY_ADDR, 'relaylocation',
'Relay address and channel must exist together'): cv.byte,
vol.Inclusive(CONF_RELAY_CHAN, 'relaylocation',
'Relay address and channel must exist together'): cv.byte})
ZONE_SCHEMA = vol.Schema(
{
vol.Required(CONF_ZONE_NAME): cv.string,
vol.Optional(CONF_ZONE_TYPE, default=DEFAULT_ZONE_TYPE): vol.Any(
DEVICE_CLASSES_SCHEMA
),
vol.Optional(CONF_ZONE_RFID): cv.string,
vol.Optional(CONF_ZONE_LOOP): vol.All(vol.Coerce(int), vol.Range(min=1, max=4)),
vol.Inclusive(
CONF_RELAY_ADDR,
"relaylocation",
"Relay address and channel must exist together",
): cv.byte,
vol.Inclusive(
CONF_RELAY_CHAN,
"relaylocation",
"Relay address and channel must exist together",
): cv.byte,
}
)
CONFIG_SCHEMA = vol.Schema({
DOMAIN: vol.Schema({
vol.Required(CONF_DEVICE): vol.Any(
DEVICE_SOCKET_SCHEMA, DEVICE_SERIAL_SCHEMA,
DEVICE_USB_SCHEMA),
vol.Optional(CONF_PANEL_DISPLAY,
default=DEFAULT_PANEL_DISPLAY): cv.boolean,
vol.Optional(CONF_ZONES): {vol.Coerce(int): ZONE_SCHEMA},
}),
}, extra=vol.ALLOW_EXTRA)
CONFIG_SCHEMA = vol.Schema(
{
DOMAIN: vol.Schema(
{
vol.Required(CONF_DEVICE): vol.Any(
DEVICE_SOCKET_SCHEMA, DEVICE_SERIAL_SCHEMA, DEVICE_USB_SCHEMA
),
vol.Optional(
CONF_PANEL_DISPLAY, default=DEFAULT_PANEL_DISPLAY
): cv.boolean,
vol.Optional(CONF_ZONES): {vol.Coerce(int): ZONE_SCHEMA},
}
)
},
extra=vol.ALLOW_EXTRA,
)
def setup(hass, config):
"""Set up for the AlarmDecoder devices."""
from alarmdecoder import AlarmDecoder
from alarmdecoder.devices import (SocketDevice, SerialDevice, USBDevice)
from alarmdecoder.devices import SocketDevice, SerialDevice, USBDevice
conf = config.get(DOMAIN)
@@ -115,13 +135,15 @@ def setup(hass, config):
def open_connection(now=None):
"""Open a connection to AlarmDecoder."""
from alarmdecoder.util import NoDeviceError
nonlocal restart
try:
controller.open(baud)
except NoDeviceError:
_LOGGER.debug("Failed to connect. Retrying in 5 seconds")
hass.helpers.event.track_point_in_time(
open_connection, dt_util.utcnow() + timedelta(seconds=5))
open_connection, dt_util.utcnow() + timedelta(seconds=5)
)
return
_LOGGER.debug("Established a connection with the alarmdecoder")
restart = True
@@ -137,39 +159,34 @@ def setup(hass, config):
def handle_message(sender, message):
"""Handle message from AlarmDecoder."""
hass.helpers.dispatcher.dispatcher_send(
SIGNAL_PANEL_MESSAGE, message)
hass.helpers.dispatcher.dispatcher_send(SIGNAL_PANEL_MESSAGE, message)
def handle_rfx_message(sender, message):
"""Handle RFX message from AlarmDecoder."""
hass.helpers.dispatcher.dispatcher_send(
SIGNAL_RFX_MESSAGE, message)
hass.helpers.dispatcher.dispatcher_send(SIGNAL_RFX_MESSAGE, message)
def zone_fault_callback(sender, zone):
"""Handle zone fault from AlarmDecoder."""
hass.helpers.dispatcher.dispatcher_send(
SIGNAL_ZONE_FAULT, zone)
hass.helpers.dispatcher.dispatcher_send(SIGNAL_ZONE_FAULT, zone)
def zone_restore_callback(sender, zone):
"""Handle zone restore from AlarmDecoder."""
hass.helpers.dispatcher.dispatcher_send(
SIGNAL_ZONE_RESTORE, zone)
hass.helpers.dispatcher.dispatcher_send(SIGNAL_ZONE_RESTORE, zone)
def handle_rel_message(sender, message):
"""Handle relay message from AlarmDecoder."""
hass.helpers.dispatcher.dispatcher_send(
SIGNAL_REL_MESSAGE, message)
hass.helpers.dispatcher.dispatcher_send(SIGNAL_REL_MESSAGE, message)
controller = False
if device_type == 'socket':
if device_type == "socket":
host = device.get(CONF_HOST)
port = device.get(CONF_DEVICE_PORT)
controller = AlarmDecoder(SocketDevice(interface=(host, port)))
elif device_type == 'serial':
elif device_type == "serial":
path = device.get(CONF_DEVICE_PATH)
baud = device.get(CONF_DEVICE_BAUD)
controller = AlarmDecoder(SerialDevice(interface=path))
elif device_type == 'usb':
elif device_type == "usb":
AlarmDecoder(USBDevice.find())
return False
@@ -186,13 +203,12 @@ def setup(hass, config):
hass.bus.async_listen_once(EVENT_HOMEASSISTANT_STOP, stop_alarmdecoder)
load_platform(hass, 'alarm_control_panel', DOMAIN, conf, config)
load_platform(hass, "alarm_control_panel", DOMAIN, conf, config)
if zones:
load_platform(
hass, 'binary_sensor', DOMAIN, {CONF_ZONES: zones}, config)
load_platform(hass, "binary_sensor", DOMAIN, {CONF_ZONES: zones}, config)
if display:
load_platform(hass, 'sensor', DOMAIN, conf, config)
load_platform(hass, "sensor", DOMAIN, conf, config)
return True
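open_connection above handles NoDeviceError by scheduling another attempt five seconds later with track_point_in_time, retrying until the AlarmDecoder answers. A plain-asyncio sketch of that retry shape (connect_once, MAX_ATTEMPTS and the simulated failure are inventions for the example):

import asyncio

RETRY_DELAY = 5  # seconds, mirroring the timedelta(seconds=5) above
MAX_ATTEMPTS = 3  # only used so the fake device eventually answers


class NoDeviceError(Exception):
    """Stand-in for alarmdecoder.util.NoDeviceError."""


async def connect_once(attempt):
    """Pretend the device only answers on the final attempt."""
    if attempt < MAX_ATTEMPTS:
        raise NoDeviceError
    return "connected"


async def open_connection():
    attempt = 0
    while True:
        attempt += 1
        try:
            return await connect_once(attempt)
        except NoDeviceError:
            # Equivalent in spirit to rescheduling open_connection with
            # track_point_in_time(..., utcnow() + timedelta(seconds=5)).
            await asyncio.sleep(RETRY_DELAY)


print(asyncio.run(open_connection()))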

@@ -5,18 +5,20 @@ import voluptuous as vol
import homeassistant.components.alarm_control_panel as alarm
from homeassistant.const import (
ATTR_CODE, STATE_ALARM_ARMED_AWAY, STATE_ALARM_ARMED_HOME,
STATE_ALARM_DISARMED, STATE_ALARM_TRIGGERED)
ATTR_CODE,
STATE_ALARM_ARMED_AWAY,
STATE_ALARM_ARMED_HOME,
STATE_ALARM_DISARMED,
STATE_ALARM_TRIGGERED,
)
import homeassistant.helpers.config_validation as cv
from . import DATA_AD, SIGNAL_PANEL_MESSAGE
_LOGGER = logging.getLogger(__name__)
SERVICE_ALARM_TOGGLE_CHIME = 'alarmdecoder_alarm_toggle_chime'
ALARM_TOGGLE_CHIME_SCHEMA = vol.Schema({
vol.Required(ATTR_CODE): cv.string,
})
SERVICE_ALARM_TOGGLE_CHIME = "alarmdecoder_alarm_toggle_chime"
ALARM_TOGGLE_CHIME_SCHEMA = vol.Schema({vol.Required(ATTR_CODE): cv.string})
def setup_platform(hass, config, add_entities, discovery_info=None):
@@ -30,8 +32,11 @@ def setup_platform(hass, config, add_entities, discovery_info=None):
device.alarm_toggle_chime(code)
hass.services.register(
alarm.DOMAIN, SERVICE_ALARM_TOGGLE_CHIME, alarm_toggle_chime_handler,
schema=ALARM_TOGGLE_CHIME_SCHEMA)
alarm.DOMAIN,
SERVICE_ALARM_TOGGLE_CHIME,
alarm_toggle_chime_handler,
schema=ALARM_TOGGLE_CHIME_SCHEMA,
)
class AlarmDecoderAlarmPanel(alarm.AlarmControlPanel):
@@ -55,7 +60,8 @@ class AlarmDecoderAlarmPanel(alarm.AlarmControlPanel):
async def async_added_to_hass(self):
"""Register callbacks."""
self.hass.helpers.dispatcher.async_dispatcher_connect(
SIGNAL_PANEL_MESSAGE, self._message_callback)
SIGNAL_PANEL_MESSAGE, self._message_callback
)
def _message_callback(self, message):
"""Handle received messages."""
@@ -104,15 +110,15 @@ class AlarmDecoderAlarmPanel(alarm.AlarmControlPanel):
def device_state_attributes(self):
"""Return the state attributes."""
return {
'ac_power': self._ac_power,
'backlight_on': self._backlight_on,
'battery_low': self._battery_low,
'check_zone': self._check_zone,
'chime': self._chime,
'entry_delay_off': self._entry_delay_off,
'programming_mode': self._programming_mode,
'ready': self._ready,
'zone_bypassed': self._zone_bypassed,
"ac_power": self._ac_power,
"backlight_on": self._backlight_on,
"battery_low": self._battery_low,
"check_zone": self._check_zone,
"chime": self._chime,
"entry_delay_off": self._entry_delay_off,
"programming_mode": self._programming_mode,
"ready": self._ready,
"zone_bypassed": self._zone_bypassed,
}
def alarm_disarm(self, code=None):

@@ -4,20 +4,30 @@ import logging
from homeassistant.components.binary_sensor import BinarySensorDevice
from . import (
CONF_RELAY_ADDR, CONF_RELAY_CHAN, CONF_ZONE_LOOP, CONF_ZONE_NAME,
CONF_ZONE_RFID, CONF_ZONE_TYPE, CONF_ZONES, SIGNAL_REL_MESSAGE,
SIGNAL_RFX_MESSAGE, SIGNAL_ZONE_FAULT, SIGNAL_ZONE_RESTORE, ZONE_SCHEMA)
CONF_RELAY_ADDR,
CONF_RELAY_CHAN,
CONF_ZONE_LOOP,
CONF_ZONE_NAME,
CONF_ZONE_RFID,
CONF_ZONE_TYPE,
CONF_ZONES,
SIGNAL_REL_MESSAGE,
SIGNAL_RFX_MESSAGE,
SIGNAL_ZONE_FAULT,
SIGNAL_ZONE_RESTORE,
ZONE_SCHEMA,
)
_LOGGER = logging.getLogger(__name__)
ATTR_RF_BIT0 = 'rf_bit0'
ATTR_RF_LOW_BAT = 'rf_low_battery'
ATTR_RF_SUPERVISED = 'rf_supervised'
ATTR_RF_BIT3 = 'rf_bit3'
ATTR_RF_LOOP3 = 'rf_loop3'
ATTR_RF_LOOP2 = 'rf_loop2'
ATTR_RF_LOOP4 = 'rf_loop4'
ATTR_RF_LOOP1 = 'rf_loop1'
ATTR_RF_BIT0 = "rf_bit0"
ATTR_RF_LOW_BAT = "rf_low_battery"
ATTR_RF_SUPERVISED = "rf_supervised"
ATTR_RF_BIT3 = "rf_bit3"
ATTR_RF_LOOP3 = "rf_loop3"
ATTR_RF_LOOP2 = "rf_loop2"
ATTR_RF_LOOP4 = "rf_loop4"
ATTR_RF_LOOP1 = "rf_loop1"
def setup_platform(hass, config, add_entities, discovery_info=None):
@@ -34,8 +44,8 @@ def setup_platform(hass, config, add_entities, discovery_info=None):
relay_addr = device_config_data.get(CONF_RELAY_ADDR)
relay_chan = device_config_data.get(CONF_RELAY_CHAN)
device = AlarmDecoderBinarySensor(
zone_num, zone_name, zone_type, zone_rfid, zone_loop, relay_addr,
relay_chan)
zone_num, zone_name, zone_type, zone_rfid, zone_loop, relay_addr, relay_chan
)
devices.append(device)
add_entities(devices)
@@ -46,8 +56,16 @@ def setup_platform(hass, config, add_entities, discovery_info=None):
class AlarmDecoderBinarySensor(BinarySensorDevice):
"""Representation of an AlarmDecoder binary sensor."""
def __init__(self, zone_number, zone_name, zone_type, zone_rfid, zone_loop,
relay_addr, relay_chan):
def __init__(
self,
zone_number,
zone_name,
zone_type,
zone_rfid,
zone_loop,
relay_addr,
relay_chan,
):
"""Initialize the binary_sensor."""
self._zone_number = zone_number
self._zone_type = zone_type
@@ -62,16 +80,20 @@ class AlarmDecoderBinarySensor(BinarySensorDevice):
async def async_added_to_hass(self):
"""Register callbacks."""
self.hass.helpers.dispatcher.async_dispatcher_connect(
SIGNAL_ZONE_FAULT, self._fault_callback)
SIGNAL_ZONE_FAULT, self._fault_callback
)
self.hass.helpers.dispatcher.async_dispatcher_connect(
SIGNAL_ZONE_RESTORE, self._restore_callback)
SIGNAL_ZONE_RESTORE, self._restore_callback
)
self.hass.helpers.dispatcher.async_dispatcher_connect(
SIGNAL_RFX_MESSAGE, self._rfx_message_callback)
SIGNAL_RFX_MESSAGE, self._rfx_message_callback
)
self.hass.helpers.dispatcher.async_dispatcher_connect(
SIGNAL_REL_MESSAGE, self._rel_message_callback)
SIGNAL_REL_MESSAGE, self._rel_message_callback
)
@property
def name(self):
@@ -130,9 +152,9 @@ class AlarmDecoderBinarySensor(BinarySensorDevice):
def _rel_message_callback(self, message):
"""Update relay state."""
if (self._relay_addr == message.address and
self._relay_chan == message.channel):
_LOGGER.debug("Relay %d:%d value:%d", message.address,
message.channel, message.value)
if self._relay_addr == message.address and self._relay_chan == message.channel:
_LOGGER.debug(
"Relay %d:%d value:%d", message.address, message.channel, message.value
)
self._state = message.value
self.schedule_update_ha_state()

@@ -24,13 +24,14 @@ class AlarmDecoderSensor(Entity):
"""Initialize the alarm panel."""
self._display = ""
self._state = None
self._icon = 'mdi:alarm-check'
self._name = 'Alarm Panel Display'
self._icon = "mdi:alarm-check"
self._name = "Alarm Panel Display"
async def async_added_to_hass(self):
"""Register callbacks."""
self.hass.helpers.dispatcher.async_dispatcher_connect(
SIGNAL_PANEL_MESSAGE, self._message_callback)
SIGNAL_PANEL_MESSAGE, self._message_callback
)
def _message_callback(self, message):
if self._display != message.text:

@@ -7,25 +7,32 @@ import voluptuous as vol
import homeassistant.components.alarm_control_panel as alarm
from homeassistant.components.alarm_control_panel import PLATFORM_SCHEMA
from homeassistant.const import (
CONF_CODE, CONF_NAME, CONF_PASSWORD, CONF_USERNAME, STATE_ALARM_ARMED_AWAY,
STATE_ALARM_ARMED_HOME, STATE_ALARM_DISARMED)
CONF_CODE,
CONF_NAME,
CONF_PASSWORD,
CONF_USERNAME,
STATE_ALARM_ARMED_AWAY,
STATE_ALARM_ARMED_HOME,
STATE_ALARM_DISARMED,
)
from homeassistant.helpers.aiohttp_client import async_get_clientsession
import homeassistant.helpers.config_validation as cv
_LOGGER = logging.getLogger(__name__)
DEFAULT_NAME = 'Alarm.com'
DEFAULT_NAME = "Alarm.com"
PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend({
vol.Required(CONF_PASSWORD): cv.string,
vol.Required(CONF_USERNAME): cv.string,
vol.Optional(CONF_CODE): cv.positive_int,
vol.Optional(CONF_NAME, default=DEFAULT_NAME): cv.string,
})
PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(
{
vol.Required(CONF_PASSWORD): cv.string,
vol.Required(CONF_USERNAME): cv.string,
vol.Optional(CONF_CODE): cv.positive_int,
vol.Optional(CONF_NAME, default=DEFAULT_NAME): cv.string,
}
)
async def async_setup_platform(hass, config, async_add_entities,
discovery_info=None):
async def async_setup_platform(hass, config, async_add_entities, discovery_info=None):
"""Set up a Alarm.com control panel."""
name = config.get(CONF_NAME)
code = config.get(CONF_CODE)
@@ -43,7 +50,8 @@ class AlarmDotCom(alarm.AlarmControlPanel):
def __init__(self, hass, name, code, username, password):
"""Initialize the Alarm.com status."""
from pyalarmdotcom import Alarmdotcom
_LOGGER.debug('Setting up Alarm.com...')
_LOGGER.debug("Setting up Alarm.com...")
self._hass = hass
self._name = name
self._code = str(code) if code else None
@@ -51,8 +59,7 @@ class AlarmDotCom(alarm.AlarmControlPanel):
self._password = password
self._websession = async_get_clientsession(self._hass)
self._state = None
self._alarm = Alarmdotcom(
username, password, self._websession, hass.loop)
self._alarm = Alarmdotcom(username, password, self._websession, hass.loop)
async def async_login(self):
"""Login to Alarm.com."""
@@ -73,27 +80,25 @@ class AlarmDotCom(alarm.AlarmControlPanel):
"""Return one or more digits/characters."""
if self._code is None:
return None
if isinstance(self._code, str) and re.search('^\\d+$', self._code):
if isinstance(self._code, str) and re.search("^\\d+$", self._code):
return alarm.FORMAT_NUMBER
return alarm.FORMAT_TEXT
@property
def state(self):
"""Return the state of the device."""
if self._alarm.state.lower() == 'disarmed':
if self._alarm.state.lower() == "disarmed":
return STATE_ALARM_DISARMED
if self._alarm.state.lower() == 'armed stay':
if self._alarm.state.lower() == "armed stay":
return STATE_ALARM_ARMED_HOME
if self._alarm.state.lower() == 'armed away':
if self._alarm.state.lower() == "armed away":
return STATE_ALARM_ARMED_AWAY
return None
@property
def device_state_attributes(self):
"""Return the state attributes."""
return {
'sensor_status': self._alarm.sensor_status
}
return {"sensor_status": self._alarm.sensor_status}
async def async_alarm_disarm(self, code=None):
"""Send disarm command."""

@@ -7,51 +7,65 @@ import voluptuous as vol
import homeassistant.helpers.config_validation as cv
from homeassistant.components.notify import (
ATTR_MESSAGE, ATTR_TITLE, ATTR_DATA, DOMAIN as DOMAIN_NOTIFY)
ATTR_MESSAGE,
ATTR_TITLE,
ATTR_DATA,
DOMAIN as DOMAIN_NOTIFY,
)
from homeassistant.const import (
CONF_ENTITY_ID, STATE_IDLE, CONF_NAME, CONF_STATE, STATE_ON, STATE_OFF,
SERVICE_TURN_ON, SERVICE_TURN_OFF, SERVICE_TOGGLE, ATTR_ENTITY_ID)
CONF_ENTITY_ID,
STATE_IDLE,
CONF_NAME,
CONF_STATE,
STATE_ON,
STATE_OFF,
SERVICE_TURN_ON,
SERVICE_TURN_OFF,
SERVICE_TOGGLE,
ATTR_ENTITY_ID,
)
from homeassistant.helpers import service, event
from homeassistant.helpers.entity import ToggleEntity
from homeassistant.util.dt import now
_LOGGER = logging.getLogger(__name__)
DOMAIN = 'alert'
ENTITY_ID_FORMAT = DOMAIN + '.{}'
DOMAIN = "alert"
ENTITY_ID_FORMAT = DOMAIN + ".{}"
CONF_CAN_ACK = 'can_acknowledge'
CONF_NOTIFIERS = 'notifiers'
CONF_REPEAT = 'repeat'
CONF_SKIP_FIRST = 'skip_first'
CONF_ALERT_MESSAGE = 'message'
CONF_DONE_MESSAGE = 'done_message'
CONF_TITLE = 'title'
CONF_DATA = 'data'
CONF_CAN_ACK = "can_acknowledge"
CONF_NOTIFIERS = "notifiers"
CONF_REPEAT = "repeat"
CONF_SKIP_FIRST = "skip_first"
CONF_ALERT_MESSAGE = "message"
CONF_DONE_MESSAGE = "done_message"
CONF_TITLE = "title"
CONF_DATA = "data"
DEFAULT_CAN_ACK = True
DEFAULT_SKIP_FIRST = False
ALERT_SCHEMA = vol.Schema({
vol.Required(CONF_NAME): cv.string,
vol.Required(CONF_ENTITY_ID): cv.entity_id,
vol.Required(CONF_STATE, default=STATE_ON): cv.string,
vol.Required(CONF_REPEAT): vol.All(cv.ensure_list, [vol.Coerce(float)]),
vol.Required(CONF_CAN_ACK, default=DEFAULT_CAN_ACK): cv.boolean,
vol.Required(CONF_SKIP_FIRST, default=DEFAULT_SKIP_FIRST): cv.boolean,
vol.Optional(CONF_ALERT_MESSAGE): cv.template,
vol.Optional(CONF_DONE_MESSAGE): cv.template,
vol.Optional(CONF_TITLE): cv.template,
vol.Optional(CONF_DATA): dict,
vol.Required(CONF_NOTIFIERS): cv.ensure_list})
ALERT_SCHEMA = vol.Schema(
{
vol.Required(CONF_NAME): cv.string,
vol.Required(CONF_ENTITY_ID): cv.entity_id,
vol.Required(CONF_STATE, default=STATE_ON): cv.string,
vol.Required(CONF_REPEAT): vol.All(cv.ensure_list, [vol.Coerce(float)]),
vol.Required(CONF_CAN_ACK, default=DEFAULT_CAN_ACK): cv.boolean,
vol.Required(CONF_SKIP_FIRST, default=DEFAULT_SKIP_FIRST): cv.boolean,
vol.Optional(CONF_ALERT_MESSAGE): cv.template,
vol.Optional(CONF_DONE_MESSAGE): cv.template,
vol.Optional(CONF_TITLE): cv.template,
vol.Optional(CONF_DATA): dict,
vol.Required(CONF_NOTIFIERS): cv.ensure_list,
}
)
CONFIG_SCHEMA = vol.Schema({
DOMAIN: cv.schema_with_slug_keys(ALERT_SCHEMA),
}, extra=vol.ALLOW_EXTRA)
CONFIG_SCHEMA = vol.Schema(
{DOMAIN: cv.schema_with_slug_keys(ALERT_SCHEMA)}, extra=vol.ALLOW_EXTRA
)
ALERT_SERVICE_SCHEMA = vol.Schema({
vol.Required(ATTR_ENTITY_ID): cv.entity_ids,
})
ALERT_SERVICE_SCHEMA = vol.Schema({vol.Required(ATTR_ENTITY_ID): cv.entity_ids})
def is_on(hass, entity_id):
@@ -79,11 +93,23 @@ async def async_setup(hass, config):
title_template = cfg.get(CONF_TITLE)
data = cfg.get(CONF_DATA)
entities.append(Alert(hass, object_id, name,
watched_entity_id, alert_state, repeat,
skip_first, message_template,
done_message_template, notifiers,
can_ack, title_template, data))
entities.append(
Alert(
hass,
object_id,
name,
watched_entity_id,
alert_state,
repeat,
skip_first,
message_template,
done_message_template,
notifiers,
can_ack,
title_template,
data,
)
)
if not entities:
return False
@@ -107,14 +133,17 @@ async def async_setup(hass, config):
# Setup service calls
hass.services.async_register(
DOMAIN, SERVICE_TURN_OFF, async_handle_alert_service,
schema=ALERT_SERVICE_SCHEMA)
DOMAIN,
SERVICE_TURN_OFF,
async_handle_alert_service,
schema=ALERT_SERVICE_SCHEMA,
)
hass.services.async_register(
DOMAIN, SERVICE_TURN_ON, async_handle_alert_service,
schema=ALERT_SERVICE_SCHEMA)
DOMAIN, SERVICE_TURN_ON, async_handle_alert_service, schema=ALERT_SERVICE_SCHEMA
)
hass.services.async_register(
DOMAIN, SERVICE_TOGGLE, async_handle_alert_service,
schema=ALERT_SERVICE_SCHEMA)
DOMAIN, SERVICE_TOGGLE, async_handle_alert_service, schema=ALERT_SERVICE_SCHEMA
)
tasks = [alert.async_update_ha_state() for alert in entities]
if tasks:
@@ -126,10 +155,22 @@ async def async_setup(hass, config):
class Alert(ToggleEntity):
"""Representation of an alert."""
def __init__(self, hass, entity_id, name, watched_entity_id,
state, repeat, skip_first, message_template,
done_message_template, notifiers, can_ack, title_template,
data):
def __init__(
self,
hass,
entity_id,
name,
watched_entity_id,
state,
repeat,
skip_first,
message_template,
done_message_template,
notifiers,
can_ack,
title_template,
data,
):
"""Initialize the alert."""
self.hass = hass
self._name = name
@@ -162,7 +203,8 @@ class Alert(ToggleEntity):
self.entity_id = ENTITY_ID_FORMAT.format(entity_id)
event.async_track_state_change(
hass, watched_entity_id, self.watched_entity_change)
hass, watched_entity_id, self.watched_entity_change
)
@property
def name(self):
@@ -224,8 +266,9 @@ class Alert(ToggleEntity):
"""Schedule a notification."""
delay = self._delay[self._next_delay]
next_msg = now() + delay
self._cancel = \
event.async_track_point_in_time(self.hass, self._notify, next_msg)
self._cancel = event.async_track_point_in_time(
self.hass, self._notify, next_msg
)
self._next_delay = min(self._next_delay + 1, len(self._delay) - 1)
async def _notify(self, *args):
@@ -270,8 +313,7 @@ class Alert(ToggleEntity):
_LOGGER.debug(msg_payload)
for target in self._notifiers:
await self.hass.services.async_call(
DOMAIN_NOTIFY, target, msg_payload)
await self.hass.services.async_call(DOMAIN_NOTIFY, target, msg_payload)
async def async_turn_on(self, **kwargs):
"""Async Unacknowledge alert."""

@@ -4,5 +4,8 @@
"documentation": "https://www.home-assistant.io/components/alert",
"requirements": [],
"dependencies": [],
"after_dependencies": [
"notify"
],
"codeowners": []
}

@@ -9,45 +9,68 @@ from homeassistant.const import CONF_NAME
from . import flash_briefings, intent, smart_home_http
from .const import (
CONF_AUDIO, CONF_CLIENT_ID, CONF_CLIENT_SECRET, CONF_DISPLAY_URL,
CONF_ENDPOINT, CONF_TEXT, CONF_TITLE, CONF_UID, DOMAIN, CONF_FILTER,
CONF_ENTITY_CONFIG, CONF_DESCRIPTION, CONF_DISPLAY_CATEGORIES)
CONF_AUDIO,
CONF_CLIENT_ID,
CONF_CLIENT_SECRET,
CONF_DISPLAY_URL,
CONF_ENDPOINT,
CONF_TEXT,
CONF_TITLE,
CONF_UID,
DOMAIN,
CONF_FILTER,
CONF_ENTITY_CONFIG,
CONF_DESCRIPTION,
CONF_DISPLAY_CATEGORIES,
)
_LOGGER = logging.getLogger(__name__)
CONF_FLASH_BRIEFINGS = 'flash_briefings'
CONF_SMART_HOME = 'smart_home'
CONF_FLASH_BRIEFINGS = "flash_briefings"
CONF_SMART_HOME = "smart_home"
ALEXA_ENTITY_SCHEMA = vol.Schema({
vol.Optional(CONF_DESCRIPTION): cv.string,
vol.Optional(CONF_DISPLAY_CATEGORIES): cv.string,
vol.Optional(CONF_NAME): cv.string,
})
SMART_HOME_SCHEMA = vol.Schema({
vol.Optional(CONF_ENDPOINT): cv.string,
vol.Optional(CONF_CLIENT_ID): cv.string,
vol.Optional(CONF_CLIENT_SECRET): cv.string,
vol.Optional(CONF_FILTER, default={}): entityfilter.FILTER_SCHEMA,
vol.Optional(CONF_ENTITY_CONFIG): {cv.entity_id: ALEXA_ENTITY_SCHEMA}
})
CONFIG_SCHEMA = vol.Schema({
DOMAIN: {
CONF_FLASH_BRIEFINGS: {
cv.string: vol.All(cv.ensure_list, [{
vol.Optional(CONF_UID): cv.string,
vol.Required(CONF_TITLE): cv.template,
vol.Optional(CONF_AUDIO): cv.template,
vol.Required(CONF_TEXT, default=""): cv.template,
vol.Optional(CONF_DISPLAY_URL): cv.template,
}]),
},
# vol.Optional here would mean we couldn't distinguish between an empty
# smart_home: and none at all.
CONF_SMART_HOME: vol.Any(SMART_HOME_SCHEMA, None),
ALEXA_ENTITY_SCHEMA = vol.Schema(
{
vol.Optional(CONF_DESCRIPTION): cv.string,
vol.Optional(CONF_DISPLAY_CATEGORIES): cv.string,
vol.Optional(CONF_NAME): cv.string,
}
}, extra=vol.ALLOW_EXTRA)
)
SMART_HOME_SCHEMA = vol.Schema(
{
vol.Optional(CONF_ENDPOINT): cv.string,
vol.Optional(CONF_CLIENT_ID): cv.string,
vol.Optional(CONF_CLIENT_SECRET): cv.string,
vol.Optional(CONF_FILTER, default={}): entityfilter.FILTER_SCHEMA,
vol.Optional(CONF_ENTITY_CONFIG): {cv.entity_id: ALEXA_ENTITY_SCHEMA},
}
)
CONFIG_SCHEMA = vol.Schema(
{
DOMAIN: {
CONF_FLASH_BRIEFINGS: {
cv.string: vol.All(
cv.ensure_list,
[
{
vol.Optional(CONF_UID): cv.string,
vol.Required(CONF_TITLE): cv.template,
vol.Optional(CONF_AUDIO): cv.template,
vol.Required(CONF_TEXT, default=""): cv.template,
vol.Optional(CONF_DISPLAY_URL): cv.template,
}
],
)
},
# vol.Optional here would mean we couldn't distinguish between an empty
# smart_home: and none at all.
CONF_SMART_HOME: vol.Any(SMART_HOME_SCHEMA, None),
}
},
extra=vol.ALLOW_EXTRA,
)
async def async_setup(hass, config):

@@ -13,12 +13,10 @@ from homeassistant.util import dt
_LOGGER = logging.getLogger(__name__)
LWA_TOKEN_URI = "https://api.amazon.com/auth/o2/token"
LWA_HEADERS = {
"Content-Type": "application/x-www-form-urlencoded;charset=UTF-8"
}
LWA_HEADERS = {"Content-Type": "application/x-www-form-urlencoded;charset=UTF-8"}
PREEMPTIVE_REFRESH_TTL_IN_SECONDS = 300
STORAGE_KEY = 'alexa_auth'
STORAGE_KEY = "alexa_auth"
STORAGE_VERSION = 1
STORAGE_EXPIRE_TIME = "expire_time"
STORAGE_ACCESS_TOKEN = "access_token"
@@ -49,10 +47,12 @@ class Auth:
"grant_type": "authorization_code",
"code": accept_grant_code,
"client_id": self.client_id,
"client_secret": self.client_secret
"client_secret": self.client_secret,
}
_LOGGER.debug("Calling LWA to get the access token (first time), "
"with: %s", json.dumps(lwa_params))
_LOGGER.debug(
"Calling LWA to get the access token (first time), " "with: %s",
json.dumps(lwa_params),
)
return await self._async_request_new_token(lwa_params)
@@ -74,7 +74,7 @@ class Auth:
"grant_type": "refresh_token",
"refresh_token": self._prefs[STORAGE_REFRESH_TOKEN],
"client_id": self.client_id,
"client_secret": self.client_secret
"client_secret": self.client_secret,
}
_LOGGER.debug("Calling LWA to refresh the access token.")
@@ -88,7 +88,8 @@ class Auth:
expire_time = dt.parse_datetime(self._prefs[STORAGE_EXPIRE_TIME])
preemptive_expire_time = expire_time - timedelta(
seconds=PREEMPTIVE_REFRESH_TTL_IN_SECONDS)
seconds=PREEMPTIVE_REFRESH_TTL_IN_SECONDS
)
return dt.utcnow() < preemptive_expire_time
@@ -97,10 +98,12 @@ class Auth:
try:
session = aiohttp_client.async_get_clientsession(self.hass)
with async_timeout.timeout(10):
response = await session.post(LWA_TOKEN_URI,
headers=LWA_HEADERS,
data=lwa_params,
allow_redirects=True)
response = await session.post(
LWA_TOKEN_URI,
headers=LWA_HEADERS,
data=lwa_params,
allow_redirects=True,
)
except (asyncio.TimeoutError, aiohttp.ClientError):
_LOGGER.error("Timeout calling LWA to get auth token.")
@@ -121,8 +124,9 @@ class Auth:
expires_in = response_json["expires_in"]
expire_time = dt.utcnow() + timedelta(seconds=expires_in)
await self._async_update_preferences(access_token, refresh_token,
expire_time.isoformat())
await self._async_update_preferences(
access_token, refresh_token, expire_time.isoformat()
)
return access_token
@@ -134,11 +138,10 @@ class Auth:
self._prefs = {
STORAGE_ACCESS_TOKEN: None,
STORAGE_REFRESH_TOKEN: None,
STORAGE_EXPIRE_TIME: None
STORAGE_EXPIRE_TIME: None,
}
async def _async_update_preferences(self, access_token, refresh_token,
expire_time):
async def _async_update_preferences(self, access_token, refresh_token, expire_time):
"""Update user preferences."""
if self._prefs is None:
await self.async_load_preferences()
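The preemptive_expire_time calculation above treats a stored token as expired PREEMPTIVE_REFRESH_TTL_IN_SECONDS (five minutes) before its real expiry, so a refresh happens with margin to spare. A minimal sketch of that check (token_is_usable is an invented helper name, not the component's API):

from datetime import datetime, timedelta

PREEMPTIVE_REFRESH_TTL_IN_SECONDS = 300


def token_is_usable(expire_time_iso, now=None):
    """Return True only while the token is valid with a five-minute safety margin."""
    now = now or datetime.utcnow()
    expire_time = datetime.fromisoformat(expire_time_iso)
    preemptive_expire_time = expire_time - timedelta(
        seconds=PREEMPTIVE_REFRESH_TTL_IN_SECONDS
    )
    return now < preemptive_expire_time


# A token that expires in four minutes is already considered stale.
soon = (datetime.utcnow() + timedelta(minutes=4)).isoformat()
assert not token_is_usable(soon)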

@@ -13,16 +13,13 @@ from homeassistant.const import (
STATE_UNLOCKED,
)
import homeassistant.components.climate.const as climate
from homeassistant.components import (
light,
fan,
cover,
)
from homeassistant.components import light, fan, cover
import homeassistant.util.color as color_util
from .const import (
API_TEMP_UNITS,
API_THERMOSTAT_MODES,
API_THERMOSTAT_PRESETS,
DATE_FORMAT,
PERCENTAGE_FAN_MAP,
)
@@ -84,35 +81,35 @@ class AlexaCapibility:
def serialize_discovery(self):
"""Serialize according to the Discovery API."""
result = {
'type': 'AlexaInterface',
'interface': self.name(),
'version': '3',
'properties': {
'supported': self.properties_supported(),
'proactivelyReported': self.properties_proactively_reported(),
'retrievable': self.properties_retrievable(),
"type": "AlexaInterface",
"interface": self.name(),
"version": "3",
"properties": {
"supported": self.properties_supported(),
"proactivelyReported": self.properties_proactively_reported(),
"retrievable": self.properties_retrievable(),
},
}
# pylint: disable=assignment-from-none
supports_deactivation = self.supports_deactivation()
if supports_deactivation is not None:
result['supportsDeactivation'] = supports_deactivation
result["supportsDeactivation"] = supports_deactivation
return result
def serialize_properties(self):
"""Return properties serialized for an API response."""
for prop in self.properties_supported():
prop_name = prop['name']
prop_name = prop["name"]
# pylint: disable=assignment-from-no-return
prop_value = self.get_property(prop_name)
if prop_value is not None:
yield {
'name': prop_name,
'namespace': self.name(),
'value': prop_value,
'timeOfSample': datetime.now().strftime(DATE_FORMAT),
'uncertaintyInMilliseconds': 0
"name": prop_name,
"namespace": self.name(),
"value": prop_value,
"timeOfSample": datetime.now().strftime(DATE_FORMAT),
"uncertaintyInMilliseconds": 0,
}
@@ -129,11 +126,11 @@ class AlexaEndpointHealth(AlexaCapibility):
def name(self):
"""Return the Alexa API name of this interface."""
return 'Alexa.EndpointHealth'
return "Alexa.EndpointHealth"
def properties_supported(self):
"""Return what properties this entity supports."""
return [{'name': 'connectivity'}]
return [{"name": "connectivity"}]
def properties_proactively_reported(self):
"""Return True if properties asynchronously reported."""
@@ -145,12 +142,12 @@ class AlexaEndpointHealth(AlexaCapibility):
def get_property(self, name):
"""Read and return a property."""
if name != 'connectivity':
if name != "connectivity":
raise UnsupportedProperty(name)
if self.entity.state == STATE_UNAVAILABLE:
return {'value': 'UNREACHABLE'}
return {'value': 'OK'}
return {"value": "UNREACHABLE"}
return {"value": "OK"}
class AlexaPowerController(AlexaCapibility):
@@ -161,11 +158,11 @@ class AlexaPowerController(AlexaCapibility):
def name(self):
"""Return the Alexa API name of this interface."""
return 'Alexa.PowerController'
return "Alexa.PowerController"
def properties_supported(self):
"""Return what properties this entity supports."""
return [{'name': 'powerState'}]
return [{"name": "powerState"}]
def properties_proactively_reported(self):
"""Return True if properties asynchronously reported."""
@@ -177,12 +174,16 @@ class AlexaPowerController(AlexaCapibility):
def get_property(self, name):
"""Read and return a property."""
if name != 'powerState':
if name != "powerState":
raise UnsupportedProperty(name)
if self.entity.state == STATE_OFF:
return 'OFF'
return 'ON'
if self.entity.domain == climate.DOMAIN:
is_on = self.entity.state != climate.HVAC_MODE_OFF
else:
is_on = self.entity.state != STATE_OFF
return "ON" if is_on else "OFF"
class AlexaLockController(AlexaCapibility):
@@ -193,11 +194,11 @@ class AlexaLockController(AlexaCapibility):
def name(self):
"""Return the Alexa API name of this interface."""
return 'Alexa.LockController'
return "Alexa.LockController"
def properties_supported(self):
"""Return what properties this entity supports."""
return [{'name': 'lockState'}]
return [{"name": "lockState"}]
def properties_retrievable(self):
"""Return True if properties can be retrieved."""
@@ -209,14 +210,14 @@ class AlexaLockController(AlexaCapibility):
def get_property(self, name):
"""Read and return a property."""
if name != 'lockState':
if name != "lockState":
raise UnsupportedProperty(name)
if self.entity.state == STATE_LOCKED:
return 'LOCKED'
return "LOCKED"
if self.entity.state == STATE_UNLOCKED:
return 'UNLOCKED'
return 'JAMMED'
return "UNLOCKED"
return "JAMMED"
class AlexaSceneController(AlexaCapibility):
@@ -232,7 +233,7 @@ class AlexaSceneController(AlexaCapibility):
def name(self):
"""Return the Alexa API name of this interface."""
return 'Alexa.SceneController'
return "Alexa.SceneController"
class AlexaBrightnessController(AlexaCapibility):
@@ -243,11 +244,11 @@ class AlexaBrightnessController(AlexaCapibility):
def name(self):
"""Return the Alexa API name of this interface."""
return 'Alexa.BrightnessController'
return "Alexa.BrightnessController"
def properties_supported(self):
"""Return what properties this entity supports."""
return [{'name': 'brightness'}]
return [{"name": "brightness"}]
def properties_proactively_reported(self):
"""Return True if properties asynchronously reported."""
@@ -259,10 +260,10 @@ class AlexaBrightnessController(AlexaCapibility):
def get_property(self, name):
"""Read and return a property."""
if name != 'brightness':
if name != "brightness":
raise UnsupportedProperty(name)
if 'brightness' in self.entity.attributes:
return round(self.entity.attributes['brightness'] / 255.0 * 100)
if "brightness" in self.entity.attributes:
return round(self.entity.attributes["brightness"] / 255.0 * 100)
return 0
@@ -274,11 +275,11 @@ class AlexaColorController(AlexaCapibility):
def name(self):
"""Return the Alexa API name of this interface."""
return 'Alexa.ColorController'
return "Alexa.ColorController"
def properties_supported(self):
"""Return what properties this entity supports."""
return [{'name': 'color'}]
return [{"name": "color"}]
def properties_retrievable(self):
"""Return True if properties can be retrieved."""
@@ -286,17 +287,15 @@ class AlexaColorController(AlexaCapibility):
def get_property(self, name):
"""Read and return a property."""
if name != 'color':
if name != "color":
raise UnsupportedProperty(name)
hue, saturation = self.entity.attributes.get(
light.ATTR_HS_COLOR, (0, 0))
hue, saturation = self.entity.attributes.get(light.ATTR_HS_COLOR, (0, 0))
return {
'hue': hue,
'saturation': saturation / 100.0,
'brightness': self.entity.attributes.get(
light.ATTR_BRIGHTNESS, 0) / 255.0,
"hue": hue,
"saturation": saturation / 100.0,
"brightness": self.entity.attributes.get(light.ATTR_BRIGHTNESS, 0) / 255.0,
}
@@ -308,11 +307,11 @@ class AlexaColorTemperatureController(AlexaCapibility):
def name(self):
"""Return the Alexa API name of this interface."""
return 'Alexa.ColorTemperatureController'
return "Alexa.ColorTemperatureController"
def properties_supported(self):
"""Return what properties this entity supports."""
return [{'name': 'colorTemperatureInKelvin'}]
return [{"name": "colorTemperatureInKelvin"}]
def properties_retrievable(self):
"""Return True if properties can be retrieved."""
@@ -320,11 +319,12 @@ class AlexaColorTemperatureController(AlexaCapibility):
def get_property(self, name):
"""Read and return a property."""
if name != 'colorTemperatureInKelvin':
if name != "colorTemperatureInKelvin":
raise UnsupportedProperty(name)
if 'color_temp' in self.entity.attributes:
if "color_temp" in self.entity.attributes:
return color_util.color_temperature_mired_to_kelvin(
self.entity.attributes['color_temp'])
self.entity.attributes["color_temp"]
)
return 0
@@ -336,11 +336,11 @@ class AlexaPercentageController(AlexaCapibility):
def name(self):
"""Return the Alexa API name of this interface."""
return 'Alexa.PercentageController'
return "Alexa.PercentageController"
def properties_supported(self):
"""Return what properties this entity supports."""
return [{'name': 'percentage'}]
return [{"name": "percentage"}]
def properties_retrievable(self):
"""Return True if properties can be retrieved."""
@@ -348,7 +348,7 @@ class AlexaPercentageController(AlexaCapibility):
def get_property(self, name):
"""Read and return a property."""
if name != 'percentage':
if name != "percentage":
raise UnsupportedProperty(name)
if self.entity.domain == fan.DOMAIN:
@@ -370,7 +370,7 @@ class AlexaSpeaker(AlexaCapibility):
def name(self):
"""Return the Alexa API name of this interface."""
return 'Alexa.Speaker'
return "Alexa.Speaker"
class AlexaStepSpeaker(AlexaCapibility):
@@ -381,7 +381,7 @@ class AlexaStepSpeaker(AlexaCapibility):
def name(self):
"""Return the Alexa API name of this interface."""
return 'Alexa.StepSpeaker'
return "Alexa.StepSpeaker"
class AlexaPlaybackController(AlexaCapibility):
@@ -392,7 +392,7 @@ class AlexaPlaybackController(AlexaCapibility):
def name(self):
"""Return the Alexa API name of this interface."""
return 'Alexa.PlaybackController'
return "Alexa.PlaybackController"
class AlexaInputController(AlexaCapibility):
@@ -403,7 +403,7 @@ class AlexaInputController(AlexaCapibility):
def name(self):
"""Return the Alexa API name of this interface."""
return 'Alexa.InputController'
return "Alexa.InputController"
class AlexaTemperatureSensor(AlexaCapibility):
@@ -419,11 +419,11 @@ class AlexaTemperatureSensor(AlexaCapibility):
def name(self):
"""Return the Alexa API name of this interface."""
return 'Alexa.TemperatureSensor'
return "Alexa.TemperatureSensor"
def properties_supported(self):
"""Return what properties this entity supports."""
return [{'name': 'temperature'}]
return [{"name": "temperature"}]
def properties_proactively_reported(self):
"""Return True if properties asynchronously reported."""
@@ -435,19 +435,15 @@ class AlexaTemperatureSensor(AlexaCapibility):
def get_property(self, name):
"""Read and return a property."""
if name != 'temperature':
if name != "temperature":
raise UnsupportedProperty(name)
unit = self.entity.attributes.get(ATTR_UNIT_OF_MEASUREMENT)
temp = self.entity.state
if self.entity.domain == climate.DOMAIN:
unit = self.hass.config.units.temperature_unit
temp = self.entity.attributes.get(
climate.ATTR_CURRENT_TEMPERATURE)
return {
'value': float(temp),
'scale': API_TEMP_UNITS[unit],
}
temp = self.entity.attributes.get(climate.ATTR_CURRENT_TEMPERATURE)
return {"value": float(temp), "scale": API_TEMP_UNITS[unit]}
class AlexaContactSensor(AlexaCapibility):
@@ -468,11 +464,11 @@ class AlexaContactSensor(AlexaCapibility):
def name(self):
"""Return the Alexa API name of this interface."""
return 'Alexa.ContactSensor'
return "Alexa.ContactSensor"
def properties_supported(self):
"""Return what properties this entity supports."""
return [{'name': 'detectionState'}]
return [{"name": "detectionState"}]
def properties_proactively_reported(self):
"""Return True if properties asynchronously reported."""
@@ -484,12 +480,12 @@ class AlexaContactSensor(AlexaCapibility):
def get_property(self, name):
"""Read and return a property."""
if name != 'detectionState':
if name != "detectionState":
raise UnsupportedProperty(name)
if self.entity.state == STATE_ON:
return 'DETECTED'
return 'NOT_DETECTED'
return "DETECTED"
return "NOT_DETECTED"
class AlexaMotionSensor(AlexaCapibility):
@@ -505,11 +501,11 @@ class AlexaMotionSensor(AlexaCapibility):
def name(self):
"""Return the Alexa API name of this interface."""
return 'Alexa.MotionSensor'
return "Alexa.MotionSensor"
def properties_supported(self):
"""Return what properties this entity supports."""
return [{'name': 'detectionState'}]
return [{"name": "detectionState"}]
def properties_proactively_reported(self):
"""Return True if properties asynchronously reported."""
@@ -521,12 +517,12 @@ class AlexaMotionSensor(AlexaCapibility):
def get_property(self, name):
"""Read and return a property."""
if name != 'detectionState':
if name != "detectionState":
raise UnsupportedProperty(name)
if self.entity.state == STATE_ON:
return 'DETECTED'
return 'NOT_DETECTED'
return "DETECTED"
return "NOT_DETECTED"
class AlexaThermostatController(AlexaCapibility):
@@ -542,20 +538,17 @@ class AlexaThermostatController(AlexaCapibility):
def name(self):
"""Return the Alexa API name of this interface."""
return 'Alexa.ThermostatController'
return "Alexa.ThermostatController"
def properties_supported(self):
"""Return what properties this entity supports."""
properties = []
properties = [{"name": "thermostatMode"}]
supported = self.entity.attributes.get(ATTR_SUPPORTED_FEATURES, 0)
if supported & climate.SUPPORT_TARGET_TEMPERATURE:
properties.append({'name': 'targetSetpoint'})
if supported & climate.SUPPORT_TARGET_TEMPERATURE_LOW:
properties.append({'name': 'lowerSetpoint'})
if supported & climate.SUPPORT_TARGET_TEMPERATURE_HIGH:
properties.append({'name': 'upperSetpoint'})
if supported & climate.SUPPORT_OPERATION_MODE:
properties.append({'name': 'thermostatMode'})
properties.append({"name": "targetSetpoint"})
if supported & climate.SUPPORT_TARGET_TEMPERATURE_RANGE:
properties.append({"name": "lowerSetpoint"})
properties.append({"name": "upperSetpoint"})
return properties
def properties_proactively_reported(self):
@@ -568,22 +561,29 @@ class AlexaThermostatController(AlexaCapibility):
def get_property(self, name):
"""Read and return a property."""
if name == 'thermostatMode':
ha_mode = self.entity.attributes.get(climate.ATTR_OPERATION_MODE)
mode = API_THERMOSTAT_MODES.get(ha_mode)
if mode is None:
_LOGGER.error("%s (%s) has unsupported %s value '%s'",
self.entity.entity_id, type(self.entity),
climate.ATTR_OPERATION_MODE, ha_mode)
raise UnsupportedProperty(name)
if name == "thermostatMode":
preset = self.entity.attributes.get(climate.ATTR_PRESET_MODE)
if preset in API_THERMOSTAT_PRESETS:
mode = API_THERMOSTAT_PRESETS[preset]
else:
mode = API_THERMOSTAT_MODES.get(self.entity.state)
if mode is None:
_LOGGER.error(
"%s (%s) has unsupported state value '%s'",
self.entity.entity_id,
type(self.entity),
self.entity.state,
)
raise UnsupportedProperty(name)
return mode
unit = self.hass.config.units.temperature_unit
if name == 'targetSetpoint':
if name == "targetSetpoint":
temp = self.entity.attributes.get(ATTR_TEMPERATURE)
elif name == 'lowerSetpoint':
elif name == "lowerSetpoint":
temp = self.entity.attributes.get(climate.ATTR_TARGET_TEMP_LOW)
elif name == 'upperSetpoint':
elif name == "upperSetpoint":
temp = self.entity.attributes.get(climate.ATTR_TARGET_TEMP_HIGH)
else:
raise UnsupportedProperty(name)
@@ -591,7 +591,4 @@ class AlexaThermostatController(AlexaCapibility):
if temp is None:
return None
return {
'value': float(temp),
'scale': API_TEMP_UNITS[unit],
}
return {"value": float(temp), "scale": API_TEMP_UNITS[unit]}


@@ -1,78 +1,68 @@
"""Constants for the Alexa integration."""
from collections import OrderedDict
from homeassistant.const import (
STATE_OFF,
TEMP_CELSIUS,
TEMP_FAHRENHEIT,
)
from homeassistant.const import TEMP_CELSIUS, TEMP_FAHRENHEIT
from homeassistant.components.climate import const as climate
from homeassistant.components import fan
DOMAIN = 'alexa'
DOMAIN = "alexa"
# Flash briefing constants
CONF_UID = 'uid'
CONF_TITLE = 'title'
CONF_AUDIO = 'audio'
CONF_TEXT = 'text'
CONF_DISPLAY_URL = 'display_url'
CONF_UID = "uid"
CONF_TITLE = "title"
CONF_AUDIO = "audio"
CONF_TEXT = "text"
CONF_DISPLAY_URL = "display_url"
CONF_FILTER = 'filter'
CONF_ENTITY_CONFIG = 'entity_config'
CONF_ENDPOINT = 'endpoint'
CONF_CLIENT_ID = 'client_id'
CONF_CLIENT_SECRET = 'client_secret'
CONF_FILTER = "filter"
CONF_ENTITY_CONFIG = "entity_config"
CONF_ENDPOINT = "endpoint"
CONF_CLIENT_ID = "client_id"
CONF_CLIENT_SECRET = "client_secret"
ATTR_UID = 'uid'
ATTR_UPDATE_DATE = 'updateDate'
ATTR_TITLE_TEXT = 'titleText'
ATTR_STREAM_URL = 'streamUrl'
ATTR_MAIN_TEXT = 'mainText'
ATTR_REDIRECTION_URL = 'redirectionURL'
ATTR_UID = "uid"
ATTR_UPDATE_DATE = "updateDate"
ATTR_TITLE_TEXT = "titleText"
ATTR_STREAM_URL = "streamUrl"
ATTR_MAIN_TEXT = "mainText"
ATTR_REDIRECTION_URL = "redirectionURL"
SYN_RESOLUTION_MATCH = 'ER_SUCCESS_MATCH'
SYN_RESOLUTION_MATCH = "ER_SUCCESS_MATCH"
DATE_FORMAT = '%Y-%m-%dT%H:%M:%S.0Z'
DATE_FORMAT = "%Y-%m-%dT%H:%M:%S.0Z"
API_DIRECTIVE = 'directive'
API_ENDPOINT = 'endpoint'
API_EVENT = 'event'
API_CONTEXT = 'context'
API_HEADER = 'header'
API_PAYLOAD = 'payload'
API_SCOPE = 'scope'
API_CHANGE = 'change'
API_DIRECTIVE = "directive"
API_ENDPOINT = "endpoint"
API_EVENT = "event"
API_CONTEXT = "context"
API_HEADER = "header"
API_PAYLOAD = "payload"
API_SCOPE = "scope"
API_CHANGE = "change"
CONF_DESCRIPTION = 'description'
CONF_DISPLAY_CATEGORIES = 'display_categories'
CONF_DESCRIPTION = "description"
CONF_DISPLAY_CATEGORIES = "display_categories"
API_TEMP_UNITS = {
TEMP_FAHRENHEIT: 'FAHRENHEIT',
TEMP_CELSIUS: 'CELSIUS',
}
API_TEMP_UNITS = {TEMP_FAHRENHEIT: "FAHRENHEIT", TEMP_CELSIUS: "CELSIUS"}
# Needs to be an ordered dict for `async_api_set_thermostat_mode`, which does a
# reverse mapping of this dict, and we want to map the first occurrence of OFF
# back to the HA state.
API_THERMOSTAT_MODES = OrderedDict([
(climate.STATE_HEAT, 'HEAT'),
(climate.STATE_COOL, 'COOL'),
(climate.STATE_AUTO, 'AUTO'),
(climate.STATE_ECO, 'ECO'),
(climate.STATE_MANUAL, 'AUTO'),
(STATE_OFF, 'OFF'),
(climate.STATE_IDLE, 'OFF'),
(climate.STATE_FAN_ONLY, 'OFF'),
(climate.STATE_DRY, 'OFF'),
])
API_THERMOSTAT_MODES = OrderedDict(
[
(climate.HVAC_MODE_HEAT, "HEAT"),
(climate.HVAC_MODE_COOL, "COOL"),
(climate.HVAC_MODE_HEAT_COOL, "AUTO"),
(climate.HVAC_MODE_AUTO, "AUTO"),
(climate.HVAC_MODE_OFF, "OFF"),
(climate.HVAC_MODE_FAN_ONLY, "OFF"),
(climate.HVAC_MODE_DRY, "OFF"),
]
)
API_THERMOSTAT_PRESETS = {climate.PRESET_ECO: "ECO"}
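A small illustration of why the ordering matters (a sketch, not the actual `async_api_set_thermostat_mode` code): several HA modes map to the Alexa value "OFF", and a reverse lookup that takes the first matching key lands on the off mode rather than fan_only or dry. Plain strings stand in for the climate constants.

from collections import OrderedDict

API_THERMOSTAT_MODES = OrderedDict(
    [
        ("heat", "HEAT"),
        ("cool", "COOL"),
        ("off", "OFF"),
        ("fan_only", "OFF"),
        ("dry", "OFF"),
    ]
)

def alexa_to_ha_mode(alexa_mode):
    """Return the first HA mode whose Alexa mapping matches (sketch only)."""
    return next(
        (ha for ha, alexa in API_THERMOSTAT_MODES.items() if alexa == alexa_mode),
        None,
    )

assert alexa_to_ha_mode("OFF") == "off"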
PERCENTAGE_FAN_MAP = {
fan.SPEED_LOW: 33,
fan.SPEED_MEDIUM: 66,
fan.SPEED_HIGH: 100,
}
PERCENTAGE_FAN_MAP = {fan.SPEED_LOW: 33, fan.SPEED_MEDIUM: 66, fan.SPEED_HIGH: 100}
class Cause:
@@ -84,25 +74,25 @@ class Cause:
# Indicates that the event was caused by a customer interaction with an
# application. For example, a customer switches on a light, or locks a door
# using the Alexa app or an app provided by a device vendor.
APP_INTERACTION = 'APP_INTERACTION'
APP_INTERACTION = "APP_INTERACTION"
# Indicates that the event was caused by a physical interaction with an
# endpoint. For example, manually switching on a light or manually locking a
# door lock.
PHYSICAL_INTERACTION = 'PHYSICAL_INTERACTION'
PHYSICAL_INTERACTION = "PHYSICAL_INTERACTION"
# Indicates that the event was caused by the periodic poll of an appliance,
# which found a change in value. For example, you might poll a temperature
# sensor every hour, and send the updated temperature to Alexa.
PERIODIC_POLL = 'PERIODIC_POLL'
PERIODIC_POLL = "PERIODIC_POLL"
# Indicates that the event was caused by the application of a device rule.
# For example, a customer configures a rule to switch on a light if a
# motion sensor detects motion. In this case, Alexa receives an event from
# the motion sensor, and another event from the light to indicate that its
# state change was caused by the rule.
RULE_TRIGGER = 'RULE_TRIGGER'
RULE_TRIGGER = "RULE_TRIGGER"
# Indicates that the event was caused by a voice interaction with Alexa.
# For example, a user speaking to their Echo device.
VOICE_INTERACTION = 'VOICE_INTERACTION'
VOICE_INTERACTION = "VOICE_INTERACTION"


@@ -14,8 +14,21 @@ from homeassistant.const import (
from homeassistant.util.decorator import Registry
from homeassistant.components.climate import const as climate
from homeassistant.components import (
alert, automation, binary_sensor, cover, fan, group,
input_boolean, light, lock, media_player, scene, script, sensor, switch)
alert,
automation,
binary_sensor,
cover,
fan,
group,
input_boolean,
light,
lock,
media_player,
scene,
script,
sensor,
switch,
)
from .const import CONF_DESCRIPTION, CONF_DISPLAY_CATEGORIES
from .capabilities import (
@@ -129,7 +142,7 @@ class AlexaEntity:
def alexa_id(self):
"""Return the Alexa API entity id."""
return self.entity.entity_id.replace('.', '#')
return self.entity.entity_id.replace(".", "#")
def display_categories(self):
"""Return a list of display categories."""
@@ -171,15 +184,13 @@ class AlexaEntity:
def serialize_discovery(self):
"""Serialize the entity for discovery."""
return {
'displayCategories': self.display_categories(),
'cookie': {},
'endpointId': self.alexa_id(),
'friendlyName': self.friendly_name(),
'description': self.description(),
'manufacturerName': 'Home Assistant',
'capabilities': [
i.serialize_discovery() for i in self.interfaces()
]
"displayCategories": self.display_categories(),
"cookie": {},
"endpointId": self.alexa_id(),
"friendlyName": self.friendly_name(),
"description": self.description(),
"manufacturerName": "Home Assistant",
"capabilities": [i.serialize_discovery() for i in self.interfaces()],
}
@@ -220,8 +231,10 @@ class GenericCapabilities(AlexaEntity):
def interfaces(self):
"""Yield the supported interfaces."""
return [AlexaPowerController(self.entity),
AlexaEndpointHealth(self.hass, self.entity)]
return [
AlexaPowerController(self.entity),
AlexaEndpointHealth(self.hass, self.entity),
]
@ENTITY_ADAPTERS.register(switch.DOMAIN)
@@ -234,8 +247,10 @@ class SwitchCapabilities(AlexaEntity):
def interfaces(self):
"""Yield the supported interfaces."""
return [AlexaPowerController(self.entity),
AlexaEndpointHealth(self.hass, self.entity)]
return [
AlexaPowerController(self.entity),
AlexaEndpointHealth(self.hass, self.entity),
]
@ENTITY_ADAPTERS.register(climate.DOMAIN)
@@ -248,9 +263,10 @@ class ClimateCapabilities(AlexaEntity):
def interfaces(self):
"""Yield the supported interfaces."""
supported = self.entity.attributes.get(ATTR_SUPPORTED_FEATURES, 0)
if supported & climate.SUPPORT_ON_OFF:
# If we support two modes, one being off, we allow turning on too.
if climate.HVAC_MODE_OFF in self.entity.attributes[climate.ATTR_HVAC_MODES]:
yield AlexaPowerController(self.entity)
yield AlexaThermostatController(self.hass, self.entity)
yield AlexaTemperatureSensor(self.hass, self.entity)
yield AlexaEndpointHealth(self.hass, self.entity)
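To make the change of gate explicit, a standalone sketch (not part of the diff; strings stand in for the climate constants): the power interface is now offered whenever "off" is among the entity's advertised hvac_modes, instead of checking the removed SUPPORT_ON_OFF feature bit.

def climate_interfaces(hvac_modes):
    """Sketch of the interface selection above for a climate entity."""
    interfaces = []
    if "off" in hvac_modes:  # stands in for climate.HVAC_MODE_OFF
        interfaces.append("Alexa.PowerController")
    interfaces += [
        "Alexa.ThermostatController",
        "Alexa.TemperatureSensor",
        "Alexa.EndpointHealth",
    ]
    return interfaces

assert "Alexa.PowerController" in climate_interfaces(["off", "heat"])
assert "Alexa.PowerController" not in climate_interfaces(["heat", "cool"])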
@@ -322,8 +338,10 @@ class LockCapabilities(AlexaEntity):
def interfaces(self):
"""Yield the supported interfaces."""
return [AlexaLockController(self.entity),
AlexaEndpointHealth(self.hass, self.entity)]
return [
AlexaLockController(self.entity),
AlexaEndpointHealth(self.hass, self.entity),
]
@ENTITY_ADAPTERS.register(media_player.const.DOMAIN)
@@ -337,26 +355,26 @@ class MediaPlayerCapabilities(AlexaEntity):
def interfaces(self):
"""Yield the supported interfaces."""
yield AlexaEndpointHealth(self.hass, self.entity)
yield AlexaPowerController(self.entity)
supported = self.entity.attributes.get(ATTR_SUPPORTED_FEATURES, 0)
if supported & media_player.const.SUPPORT_VOLUME_SET:
yield AlexaSpeaker(self.entity)
power_features = (media_player.SUPPORT_TURN_ON |
media_player.SUPPORT_TURN_OFF)
if supported & power_features:
yield AlexaPowerController(self.entity)
step_volume_features = (media_player.const.SUPPORT_VOLUME_MUTE |
media_player.const.SUPPORT_VOLUME_STEP)
step_volume_features = (
media_player.const.SUPPORT_VOLUME_MUTE
| media_player.const.SUPPORT_VOLUME_STEP
)
if supported & step_volume_features:
yield AlexaStepSpeaker(self.entity)
playback_features = (media_player.const.SUPPORT_PLAY |
media_player.const.SUPPORT_PAUSE |
media_player.const.SUPPORT_STOP |
media_player.const.SUPPORT_NEXT_TRACK |
media_player.const.SUPPORT_PREVIOUS_TRACK)
playback_features = (
media_player.const.SUPPORT_PLAY
| media_player.const.SUPPORT_PAUSE
| media_player.const.SUPPORT_STOP
| media_player.const.SUPPORT_NEXT_TRACK
| media_player.const.SUPPORT_PREVIOUS_TRACK
)
if supported & playback_features:
yield AlexaPlaybackController(self.entity)
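The feature checks above are plain bitmask tests; a tiny self-contained example follows (the flag values are illustrative, not the real media_player constants).

SUPPORT_PLAY, SUPPORT_PAUSE, SUPPORT_STOP = 1, 2, 4  # illustrative bit values

playback_features = SUPPORT_PLAY | SUPPORT_PAUSE | SUPPORT_STOP

supported = SUPPORT_PAUSE  # a hypothetical entity's supported_features mask
if supported & playback_features:  # truthy if any playback bit is set
    print("yield AlexaPlaybackController")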
@@ -371,7 +389,7 @@ class SceneCapabilities(AlexaEntity):
def description(self):
"""Return the description of the entity."""
# Required description as per Amazon Scene docs
scene_fmt = '{} (Scene connected via Home Assistant)'
scene_fmt = "{} (Scene connected via Home Assistant)"
return scene_fmt.format(AlexaEntity.description(self))
def default_display_categories(self):
@@ -380,8 +398,7 @@ class SceneCapabilities(AlexaEntity):
def interfaces(self):
"""Yield the supported interfaces."""
return [AlexaSceneController(self.entity,
supports_deactivation=False)]
return [AlexaSceneController(self.entity, supports_deactivation=False)]
@ENTITY_ADAPTERS.register(script.DOMAIN)
@@ -394,9 +411,8 @@ class ScriptCapabilities(AlexaEntity):
def interfaces(self):
"""Yield the supported interfaces."""
can_cancel = bool(self.entity.attributes.get('can_cancel'))
return [AlexaSceneController(self.entity,
supports_deactivation=can_cancel)]
can_cancel = bool(self.entity.attributes.get("can_cancel"))
return [AlexaSceneController(self.entity, supports_deactivation=can_cancel)]
@ENTITY_ADAPTERS.register(sensor.DOMAIN)
@@ -412,10 +428,7 @@ class SensorCapabilities(AlexaEntity):
def interfaces(self):
"""Yield the supported interfaces."""
attrs = self.entity.attributes
if attrs.get(ATTR_UNIT_OF_MEASUREMENT) in (
TEMP_FAHRENHEIT,
TEMP_CELSIUS,
):
if attrs.get(ATTR_UNIT_OF_MEASUREMENT) in (TEMP_FAHRENHEIT, TEMP_CELSIUS):
yield AlexaTemperatureSensor(self.hass, self.entity)
yield AlexaEndpointHealth(self.hass, self.entity)
@@ -424,8 +437,8 @@ class SensorCapabilities(AlexaEntity):
class BinarySensorCapabilities(AlexaEntity):
"""Class to represent BinarySensor capabilities."""
TYPE_CONTACT = 'contact'
TYPE_MOTION = 'motion'
TYPE_CONTACT = "contact"
TYPE_MOTION = "motion"
def default_display_categories(self):
"""Return the display categories for this entity."""
@@ -448,12 +461,7 @@ class BinarySensorCapabilities(AlexaEntity):
def get_type(self):
"""Return the type of binary sensor."""
attrs = self.entity.attributes
if attrs.get(ATTR_DEVICE_CLASS) in (
'door',
'garage_door',
'opening',
'window',
):
if attrs.get(ATTR_DEVICE_CLASS) in ("door", "garage_door", "opening", "window"):
return self.TYPE_CONTACT
if attrs.get(ATTR_DEVICE_CLASS) == 'motion':
if attrs.get(ATTR_DEVICE_CLASS) == "motion":
return self.TYPE_MOTION
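For completeness, a sketch of the device-class routing above (not part of the diff; plain strings in place of the HA constants and the class attributes).

CONTACT_CLASSES = ("door", "garage_door", "opening", "window")

def binary_sensor_type(device_class):
    """Mirror of get_type(): contact classes first, then motion, else None."""
    if device_class in CONTACT_CLASSES:
        return "contact"
    if device_class == "motion":
        return "motion"
    return None

assert binary_sensor_type("window") == "contact"
assert binary_sensor_type("motion") == "motion"
assert binary_sensor_type("smoke") is None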
