Merge branch 'release/v5.1.0'

Ivan Kravets
2021-01-28 19:23:14 +02:00
54 changed files with 884 additions and 572 deletions

View File

@@ -8,7 +8,7 @@ jobs:
       fail-fast: false
       matrix:
         os: [ubuntu-latest, windows-latest, macos-latest]
-        python-version: [2.7, 3.7, 3.8]
+        python-version: [3.6, 3.7, 3.8, 3.9]
     runs-on: ${{ matrix.os }}
     steps:
       - uses: actions/checkout@v2

View File

@@ -8,7 +8,7 @@ jobs:
       fail-fast: false
       matrix:
         os: [ubuntu-16.04, windows-latest, macos-latest]
-        python-version: [2.7, 3.7]
+        python-version: [3.7]
     runs-on: ${{ matrix.os }}
     steps:
       - uses: actions/checkout@v2

View File

@@ -1,3 +1,3 @@
 [settings]
 line_length=88
-known_third_party=OpenSSL, SCons, autobahn, jsonrpc, twisted, zope
+known_third_party=OpenSSL, SCons, jsonrpc, twisted, zope

View File

@@ -14,7 +14,6 @@ disable=
     too-few-public-methods,
     useless-object-inheritance,
     useless-import-alias,
-    fixme,
     bad-option-value,
     ; PY2 Compat

View File

@@ -8,15 +8,54 @@ PlatformIO Core 5

 **A professional collaborative platform for embedded development**

+5.1.0 (2021-01-28)
+~~~~~~~~~~~~~~~~~~
+
+* **PlatformIO Home**
+
+  - Boosted PlatformIO Home performance thanks to migrating the codebase to the pure Python 3 Asynchronous I/O stack
+  - Added a new ``--session-id`` option to `pio home <https://docs.platformio.org/page/core/userguide/cmd_home.html>`__ command that helps to keep PlatformIO Home isolated from other instances and protect from 3rd party access (`issue #3397 <https://github.com/platformio/platformio-core/issues/3397>`_)
+
+* **Build System**
+
+  - Upgraded build engine to the SCons 4.1 (`release notes <https://scons.org/scons-410-is-available.html>`_)
+  - Refactored a workaround for a maximum command line character limitation (`issue #3792 <https://github.com/platformio/platformio-core/issues/3792>`_)
+  - Fixed an issue with Python 3.8+ on Windows when a network drive is used (`issue #3417 <https://github.com/platformio/platformio-core/issues/3417>`_)
+
+* **Package Management**
+
+  - New options for `pio system prune <https://docs.platformio.org/page/core/userguide/system/cmd_prune.html>`__ command:
+
+    + ``--dry-run`` option to show data that will be removed
+    + ``--core-packages`` option to remove unnecessary core packages
+    + ``--platform-packages`` option to remove unnecessary development platform packages (`issue #923 <https://github.com/platformio/platformio-core/issues/923>`_)
+
+  - Added new `check_prune_system_threshold <https://docs.platformio.org/page/core/userguide/cmd_settings.html#check-prune-system-threshold>`__ setting
+  - Disabled automatic removal of unnecessary development platform packages (`issue #3708 <https://github.com/platformio/platformio-core/issues/3708>`_, `issue #3770 <https://github.com/platformio/platformio-core/issues/3770>`_)
+  - Fixed an issue when unnecessary packages were removed in ``update --dry-run`` mode (`issue #3809 <https://github.com/platformio/platformio-core/issues/3809>`_)
+  - Fixed a "ValueError: Invalid simple block" when uninstalling a package with a custom name and external source (`issue #3816 <https://github.com/platformio/platformio-core/issues/3816>`_)
+
+* **Debugging**
+
+  - Configure a custom debug adapter speed using a new `debug_speed <https://docs.platformio.org/page/projectconf/section_env_debug.html#debug-speed>`__ option (`issue #3799 <https://github.com/platformio/platformio-core/issues/3799>`_)
+  - Handle debugging server's "ready_pattern" in "stderr" output
+
+* **Miscellaneous**
+
+  - Improved listing of `multicast DNS services <https://docs.platformio.org/page/core/userguide/device/cmd_list.html>`_
+  - Fixed a "UnicodeDecodeError: 'utf-8' codec can't decode byte" when using J-Link for firmware uploading on Linux (`issue #3804 <https://github.com/platformio/platformio-core/issues/3804>`_)
+  - Fixed an issue with a compiler driver for ".ccls" language server (`issue #3808 <https://github.com/platformio/platformio-core/issues/3808>`_)
+  - Fixed an issue when `pio device monitor --eol <https://docs.platformio.org/en/latest/core/userguide/device/cmd_monitor.html#cmdoption-pio-device-monitor-eol>`__ and "send_on_enter" filter do not work properly (`issue #3787 <https://github.com/platformio/platformio-core/issues/3787>`_)
+
 5.0.4 (2020-12-30)
 ~~~~~~~~~~~~~~~~~~

 - Added "Core" suffix when showing PlatformIO Core version using ``pio --version`` command
-- Improved ``.ccls`` configuration file for Emacs, Vim, and Sublime Text integrations
+- Improved ".ccls" configuration file for Emacs, Vim, and Sublime Text integrations
 - Updated analysis tools:

   * `Cppcheck <https://docs.platformio.org/page/plus/check-tools/cppcheck.html>`__ v2.3 with improved C++ parser and several new MISRA rules
   * `PVS-Studio <https://docs.platformio.org/page/plus/check-tools/pvs-studio.html>`__ v7.11 with new diagnostics and updated mass suppression mechanism

 - Show a warning message about deprecated support for Python 2 and Python 3.5
 - Do not provide "intelliSenseMode" option when generating configuration for VSCode C/C++ extension

View File

@@ -3,8 +3,8 @@ lint:
 	pylint -j 6 --rcfile=./.pylintrc ./tests

 isort:
-	isort -rc ./platformio
-	isort -rc ./tests
+	isort ./platformio
+	isort ./tests

 format:
 	black --target-version py27 ./platformio


Submodule docs updated: 9db46dccef...25edd66d55

View File

@@ -14,7 +14,7 @@
 import sys

-VERSION = (5, 0, 4)
+VERSION = (5, 1, 0)
 __version__ = ".".join([str(s) for s in VERSION])

 __title__ = "platformio"
@@ -47,10 +47,10 @@ __pioremote_endpoint__ = "ssl:host=remote.platformio.org:port=4413"
 __default_requests_timeout__ = (10, None)  # (connect, read)

 __core_packages__ = {
-    "contrib-piohome": "~3.3.1",
+    "contrib-piohome": "~3.3.3",
     "contrib-pysite": "~2.%d%d.0" % (sys.version_info.major, sys.version_info.minor),
     "tool-unity": "~1.20500.0",
-    "tool-scons": "~2.20501.7" if sys.version_info.major == 2 else "~4.40001.0",
+    "tool-scons": "~2.20501.7" if sys.version_info.major == 2 else "~4.40100.2",
     "tool-cppcheck": "~1.230.0",
     "tool-clangtidy": "~1.100000.0",
     "tool-pvs-studio": "~7.11.0",

View File

@@ -55,6 +55,10 @@ DEFAULT_SETTINGS = {
         "description": "Check for the platform updates interval (days)",
         "value": 7,
     },
+    "check_prune_system_threshold": {
+        "description": "Check for pruning unnecessary data threshold (megabytes)",
+        "value": 1024,
+    },
     "enable_cache": {
         "description": "Enable caching for HTTP API requests",
         "value": True,

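The new "check_prune_system_threshold" value is expressed in megabytes and controls when PlatformIO starts suggesting a prune of unnecessary data. For illustration, a minimal Python sketch (not part of this commit, assuming the existing platformio.app.get_setting()/set_setting() helpers) of reading and overriding it:

    # Illustrative sketch: read and override the new setting programmatically.
    # Assumes the existing platformio.app.get_setting()/set_setting() helpers.
    from platformio import app

    threshold_mb = app.get_setting("check_prune_system_threshold")  # default: 1024
    print("Prune reminder threshold: %d MB" % threshold_mb)

    # Raise the threshold to 4 GB so the prune reminder appears less often
    app.set_setting("check_prune_system_threshold", 4096)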
View File

@@ -81,12 +81,19 @@ DEFAULT_ENV_OPTIONS = dict(
     IDE_EXTRA_DATA={},
 )

+# Declare command verbose messages
+command_strings = dict(
+    ARCOM="Archiving",
+    LINKCOM="Linking",
+    RANLIBCOM="Indexing",
+    ASCOM="Compiling",
+    ASPPCOM="Compiling",
+    CCCOM="Compiling",
+    CXXCOM="Compiling",
+)
+
 if not int(ARGUMENTS.get("PIOVERBOSE", 0)):
-    DEFAULT_ENV_OPTIONS["ARCOMSTR"] = "Archiving $TARGET"
-    DEFAULT_ENV_OPTIONS["LINKCOMSTR"] = "Linking $TARGET"
-    DEFAULT_ENV_OPTIONS["RANLIBCOMSTR"] = "Indexing $TARGET"
-    for k in ("ASCOMSTR", "ASPPCOMSTR", "CCCOMSTR", "CXXCOMSTR"):
-        DEFAULT_ENV_OPTIONS[k] = "Compiling $TARGET"
+    for name, value in command_strings.items():
+        DEFAULT_ENV_OPTIONS["%sSTR" % name] = "%s $TARGET" % (value)

 env = DefaultEnvironment(**DEFAULT_ENV_OPTIONS)
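For illustration, a standalone Python sketch (not part of this commit) of what the loop above produces in non-verbose mode: each SCons "*COMSTR" option becomes a short "<verb> $TARGET" template, while the real "*COM" command lines stay untouched:

    # Illustrative sketch of the substitution performed above (plain Python, no SCons needed).
    command_strings = dict(
        ARCOM="Archiving",
        LINKCOM="Linking",
        RANLIBCOM="Indexing",
        ASCOM="Compiling",
        ASPPCOM="Compiling",
        CCCOM="Compiling",
        CXXCOM="Compiling",
    )
    options = {}
    for name, value in command_strings.items():
        options["%sSTR" % name] = "%s $TARGET" % value

    assert options["ARCOMSTR"] == "Archiving $TARGET"
    assert options["CXXCOMSTR"] == "Compiling $TARGET"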

View File

@@ -41,7 +41,7 @@ from platformio.proc import where_is_program
 # should hold the compilation database, otherwise, the file defaults to compile_commands.json,
 # which is the name that most clang tools search for by default.

-# TODO: Is there a better way to do this than this global? Right now this exists so that the
+# Is there a better way to do this than this global? Right now this exists so that the
 # emitter we add can record all of the things it emits, so that the scanner for the top level
 # compilation database can access the complete list, and also so that the writer has easy
 # access to write all of the files. But it seems clunky. How can the emitter and the scanner
@@ -104,7 +104,7 @@ def makeEmitCompilationDbEntry(comstr):
         __COMPILATIONDB_ENV=env,
     )

-    # TODO: Technically, these next two lines should not be required: it should be fine to
+    # Technically, these next two lines should not be required: it should be fine to
     # cache the entries. However, they don't seem to update properly. Since they are quick
     # to re-generate disable caching and sidestep this problem.
     env.AlwaysBuild(entry)

View File

@@ -17,7 +17,8 @@ from __future__ import absolute_import
 import os
 from glob import glob

-from SCons.Defaults import processDefines  # pylint: disable=import-error
+import SCons.Defaults  # pylint: disable=import-error
+import SCons.Subst  # pylint: disable=import-error

 from platformio.compat import glob_escape
 from platformio.package.manager.core import get_core_package_dir
@@ -58,8 +59,16 @@ def _dump_includes(env):
     for g in toolchain_incglobs:
         includes["toolchain"].extend([os.path.realpath(inc) for inc in glob(g)])

+    # include Unity framework if there are tests in project
     includes["unity"] = []
-    unity_dir = get_core_package_dir("tool-unity")
+    auto_install_unity = False
+    test_dir = env.GetProjectConfig().get_optional_dir("test")
+    if os.path.isdir(test_dir) and os.listdir(test_dir) != ["README"]:
+        auto_install_unity = True
+    unity_dir = get_core_package_dir(
+        "tool-unity",
+        auto_install=auto_install_unity,
+    )
     if unity_dir:
         includes["unity"].append(unity_dir)

@@ -92,7 +101,7 @@ def _get_gcc_defines(env):
 def _dump_defines(env):
     defines = []
     # global symbols
-    for item in processDefines(env.get("CPPDEFINES", [])):
+    for item in SCons.Defaults.processDefines(env.get("CPPDEFINES", [])):
         item = item.strip()
         if item:
             defines.append(env.subst(item).replace("\\", ""))
@@ -141,25 +150,17 @@ def _get_svd_path(env):
     return None


-def _escape_build_flag(flags):
-    return [flag if " " not in flag else '"%s"' % flag for flag in flags]
+def _subst_cmd(env, cmd):
+    args = env.subst_list(cmd, SCons.Subst.SUBST_CMD)[0]
+    return " ".join([SCons.Subst.quote_spaces(arg) for arg in args])


 def DumpIDEData(env, globalenv):
     """ env here is `projenv`"""

-    env["__escape_build_flag"] = _escape_build_flag
-
-    LINTCCOM = (
-        "${__escape_build_flag(CFLAGS)} ${__escape_build_flag(CCFLAGS)} $CPPFLAGS"
-    )
-    LINTCXXCOM = (
-        "${__escape_build_flag(CXXFLAGS)} ${__escape_build_flag(CCFLAGS)} $CPPFLAGS"
-    )
-
     data = {
         "env_name": env["PIOENV"],
-        "libsource_dirs": [env.subst(l) for l in env.GetLibSourceDirs()],
+        "libsource_dirs": [env.subst(item) for item in env.GetLibSourceDirs()],
         "defines": _dump_defines(env),
         "includes": _dump_includes(env),
         "cc_path": where_is_program(env.subst("$CC"), env.subst("${ENV['PATH']}")),
@@ -181,7 +182,7 @@ def DumpIDEData(env, globalenv):
     env_ = env.Clone()
     # https://github.com/platformio/platformio-atom-ide/issues/34
     _new_defines = []
-    for item in processDefines(env_.get("CPPDEFINES", [])):
+    for item in SCons.Defaults.processDefines(env_.get("CPPDEFINES", [])):
         item = item.replace('\\"', '"')
         if " " in item:
             _new_defines.append(item.replace(" ", "\\\\ "))
@@ -189,7 +190,13 @@ def DumpIDEData(env, globalenv):
         _new_defines.append(item)
     env_.Replace(CPPDEFINES=_new_defines)

-    data.update({"cc_flags": env_.subst(LINTCCOM), "cxx_flags": env_.subst(LINTCXXCOM)})
+    # export C/C++ build flags
+    data.update(
+        {
+            "cc_flags": _subst_cmd(env_, "$CFLAGS $CCFLAGS $CPPFLAGS"),
+            "cxx_flags": _subst_cmd(env_, "$CXXFLAGS $CCFLAGS $CPPFLAGS"),
+        }
+    )

     return data

View File

@@ -14,15 +14,30 @@
 from __future__ import absolute_import

-from hashlib import md5
-from os import makedirs
-from os.path import isdir, isfile, join
+import hashlib
+import os
+import re
+
+from SCons.Platform import TempFileMunge  # pylint: disable=import-error
+from SCons.Subst import quote_spaces  # pylint: disable=import-error

 from platformio.compat import WINDOWS, hashlib_encode_data

-# Windows CLI has limit with command length to 8192
-# Leave 2000 chars for flags and other options
-MAX_LINE_LENGTH = 6000 if WINDOWS else 128072
+# There are the next limits depending on a platform:
+# - Windows = 8192
+# - Unix = 131072
+# We need ~256 characters for a temporary file path
+MAX_LINE_LENGTH = (8192 if WINDOWS else 131072) - 256
+
+WINPATHSEP_RE = re.compile(r"\\([^\"'\\]|$)")
+
+
+def tempfile_arg_esc_func(arg):
+    arg = quote_spaces(arg)
+    if not WINDOWS:
+        return arg
+    # GCC requires double Windows slashes, let's use UNIX separator
+    return WINPATHSEP_RE.sub(r"/\1", arg)


 def long_sources_hook(env, sources):
@@ -41,30 +56,14 @@ def long_sources_hook(env, sources):
     return '@"%s"' % _file_long_data(env, " ".join(data))


-def long_incflags_hook(env, incflags):
-    _incflags = env.subst(incflags).replace("\\", "/")
-    if len(_incflags) < MAX_LINE_LENGTH:
-        return incflags
-
-    # fix space in paths
-    data = []
-    for line in _incflags.split(" -I"):
-        line = line.strip()
-        if not line.startswith("-I"):
-            line = "-I" + line
-        data.append('-I"%s"' % line[2:])
-
-    return '@"%s"' % _file_long_data(env, " ".join(data))
-
-
 def _file_long_data(env, data):
     build_dir = env.subst("$BUILD_DIR")
-    if not isdir(build_dir):
-        makedirs(build_dir)
-    tmp_file = join(
-        build_dir, "longcmd-%s" % md5(hashlib_encode_data(data)).hexdigest()
+    if not os.path.isdir(build_dir):
+        os.makedirs(build_dir)
+    tmp_file = os.path.join(
+        build_dir, "longcmd-%s" % hashlib.md5(hashlib_encode_data(data)).hexdigest()
     )
-    if isfile(tmp_file):
+    if os.path.isfile(tmp_file):
         return tmp_file
     with open(tmp_file, "w") as fp:
         fp.write(data)
@@ -76,17 +75,21 @@ def exists(_):

 def generate(env):
-    env.Replace(_long_sources_hook=long_sources_hook)
-    env.Replace(_long_incflags_hook=long_incflags_hook)
-    coms = {}
-    for key in ("ARCOM", "LINKCOM"):
-        coms[key] = env.get(key, "").replace(
-            "$SOURCES", "${_long_sources_hook(__env__, SOURCES)}"
-        )
-    for key in ("_CCCOMCOM", "ASPPCOM"):
-        coms[key] = env.get(key, "").replace(
-            "$_CPPINCFLAGS", "${_long_incflags_hook(__env__, _CPPINCFLAGS)}"
-        )
-    env.Replace(**coms)
+    kwargs = dict(
+        _long_sources_hook=long_sources_hook,
+        TEMPFILE=TempFileMunge,
+        MAXLINELENGTH=MAX_LINE_LENGTH,
+        TEMPFILEARGESCFUNC=tempfile_arg_esc_func,
+        TEMPFILESUFFIX=".tmp",
+        TEMPFILEDIR="$BUILD_DIR",
+    )
+
+    for name in ("LINKCOM", "ASCOM", "ASPPCOM", "CCCOM", "CXXCOM"):
+        kwargs[name] = "${TEMPFILE('%s','$%sSTR')}" % (env.get(name), name)
+
+    kwargs["ARCOM"] = env.get("ARCOM", "").replace(
+        "$SOURCES", "${_long_sources_hook(__env__, SOURCES)}"
+    )
+    env.Replace(**kwargs)

     return env
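The refactored workaround hands oversized command lines to SCons' TempFileMunge response-file mechanism, and the escape function above additionally rewrites Windows backslashes so GCC accepts paths read back from the temporary file. For illustration, a standalone Python sketch (not part of this commit, with a simplified stand-in for SCons.Subst.quote_spaces) of that escaping:

    # Standalone sketch of the escaping performed by tempfile_arg_esc_func() above:
    # quote arguments containing spaces and turn Windows path separators into
    # forward slashes for GCC response files.
    import re

    WINPATHSEP_RE = re.compile(r"\\([^\"'\\]|$)")

    def quote_spaces(arg):  # simplified stand-in for SCons.Subst.quote_spaces
        return '"%s"' % arg if " " in arg else arg

    def esc(arg, windows=True):
        arg = quote_spaces(arg)
        return WINPATHSEP_RE.sub(r"/\1", arg) if windows else arg

    print(esc(r"C:\Users\dev\My Project\src\main.cpp"))
    # -> "C:/Users/dev/My Project/src/main.cpp"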

View File

@@ -176,6 +176,7 @@ def configure_initial_debug_options(platform, env_options):
             tool_name,
             tool_settings,
         ),
+        speed=env_options.get("debug_speed", tool_settings.get("speed")),
         server=server_options,
     )
     return result

View File

@@ -124,16 +124,25 @@ class DebugServer(BaseProcess):
     @defer.inlineCallbacks
     def _wait_until_ready(self):
-        timeout = 10
+        ready_pattern = self.debug_options.get("server", {}).get("ready_pattern")
+        timeout = 60 if ready_pattern else 10
         elapsed = 0
         delay = 0.5
         auto_ready_delay = 0.5
         while not self._ready and not self._process_ended and elapsed < timeout:
             yield self.async_sleep(delay)
-            if not self.debug_options.get("server", {}).get("ready_pattern"):
+            if not ready_pattern:
                 self._ready = self._last_activity < (time.time() - auto_ready_delay)
             elapsed += delay

+    def _check_ready_by_pattern(self, data):
+        if self._ready:
+            return self._ready
+        ready_pattern = self.debug_options.get("server", {}).get("ready_pattern")
+        if ready_pattern:
+            self._ready = ready_pattern.encode() in data
+        return self._ready
+
     @staticmethod
     def async_sleep(secs):
         d = defer.Deferred()
@@ -147,11 +156,11 @@ class DebugServer(BaseProcess):
         super(DebugServer, self).outReceived(
             escape_gdbmi_stream("@", data) if is_gdbmi_mode() else data
         )
-        if self._ready:
-            return
-        ready_pattern = self.debug_options.get("server", {}).get("ready_pattern")
-        if ready_pattern:
-            self._ready = ready_pattern.encode() in data
+        self._check_ready_by_pattern(data)
+
+    def errReceived(self, data):
+        super(DebugServer, self).errReceived(data)
+        self._check_ready_by_pattern(data)

     def processEnded(self, reason):
         self._process_ended = True
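With errReceived wired up, the same pattern check now covers both stdout and stderr of the debug server. For illustration, a tiny standalone Python sketch (not part of this commit; the OpenOCD-style message is only an example pattern) of the detection logic:

    # Standalone sketch of the ready-pattern check above: the server counts as ready
    # once the configured substring appears in any output chunk.
    def check_ready_by_pattern(state, ready_pattern, data):
        if state["ready"]:
            return True
        if ready_pattern:
            state["ready"] = ready_pattern.encode() in data
        return state["ready"]

    state = {"ready": False}
    check_ready_by_pattern(state, "Listening on port 3333", b"Open On-Chip Debugger\n")
    check_ready_by_pattern(state, "Listening on port 3333", b"Info : Listening on port 3333 for gdb connections\n")
    assert state["ready"]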

View File

@@ -179,7 +179,9 @@ def device_monitor(**kwargs):  # pylint: disable=too-many-branches
         for name in os.listdir(filters_dir):
             if not name.endswith(".py"):
                 continue
-            device_helpers.load_monitor_filter(os.path.join(filters_dir, name))
+            device_helpers.load_monitor_filter(
+                os.path.join(filters_dir, name), options=kwargs
+            )

     project_options = {}
     try:
@@ -193,9 +195,7 @@ def device_monitor(**kwargs):  # pylint: disable=too-many-branches
     if "platform" in project_options:
         with fs.cd(kwargs["project_dir"]):
             platform = PlatformFactory.new(project_options["platform"])
-            device_helpers.register_platform_filters(
-                platform, kwargs["project_dir"], kwargs["environment"]
-            )
+            device_helpers.register_platform_filters(platform, options=kwargs)

     if not kwargs["port"]:
         ports = util.get_serial_ports(filter_hwid=True)

View File

@@ -18,12 +18,13 @@ from platformio.project.config import ProjectConfig


 class DeviceMonitorFilter(miniterm.Transform):
-    def __init__(self, project_dir=None, environment=None):
+    def __init__(self, options=None):
         """ Called by PlatformIO to pass context """
         miniterm.Transform.__init__(self)

-        self.project_dir = project_dir
-        self.environment = environment
+        self.options = options or {}
+        self.project_dir = self.options.get("project_dir")
+        self.environment = self.options.get("environment")

         self.config = ProjectConfig.get_instance()
         if not self.environment:

View File

@@ -22,10 +22,17 @@ class SendOnEnter(DeviceMonitorFilter):
         super(SendOnEnter, self).__init__(*args, **kwargs)
         self._buffer = ""

+        if self.options.get("eol") == "CR":
+            self._eol = "\r"
+        elif self.options.get("eol") == "LF":
+            self._eol = "\n"
+        else:
+            self._eol = "\r\n"
+
     def tx(self, text):
         self._buffer += text
-        if self._buffer.endswith("\r\n"):
-            text = self._buffer[:-2]
+        if self._buffer.endswith(self._eol):
+            text = self._buffer[: len(self._eol) * -1]
             self._buffer = ""
             return text
         return ""
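The filter now honours the monitor's --eol option instead of hard-coding CRLF. For illustration, a self-contained Python sketch (not part of this commit) of the buffering behaviour:

    # Standalone sketch of the send_on_enter buffering above: characters are held
    # back until the configured end-of-line sequence arrives, then the line is
    # released without that terminator.
    class SendOnEnterSketch:
        def __init__(self, eol="CRLF"):
            self._eol = {"CR": "\r", "LF": "\n"}.get(eol, "\r\n")
            self._buffer = ""

        def tx(self, text):
            self._buffer += text
            if self._buffer.endswith(self._eol):
                text = self._buffer[: len(self._eol) * -1]
                self._buffer = ""
                return text
            return ""

    f = SendOnEnterSketch(eol="LF")
    assert f.tx("help") == ""     # nothing sent yet
    assert f.tx("\n") == "help"   # full line released on LF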

View File

@@ -76,7 +76,7 @@ def get_board_hwids(project_dir, platform, board):
     return platform.board_config(board).get("build.hwids", [])


-def load_monitor_filter(path, project_dir=None, environment=None):
+def load_monitor_filter(path, options=None):
     name = os.path.basename(path)
     name = name[: name.find(".")]
     module = load_python_module("platformio.commands.device.filters.%s" % name, path)
@@ -87,12 +87,12 @@ def load_monitor_filter(path, project_dir=None, environment=None):
             or cls == DeviceMonitorFilter
         ):
             continue
-        obj = cls(project_dir, environment)
+        obj = cls(options)
         miniterm.TRANSFORMATIONS[obj.NAME] = obj
     return True


-def register_platform_filters(platform, project_dir, environment):
+def register_platform_filters(platform, options=None):
     monitor_dir = os.path.join(platform.get_dir(), "monitor")
     if not os.path.isdir(monitor_dir):
         return
@@ -103,4 +103,4 @@ def register_platform_filters(platform, project_dir, environment):
         path = os.path.join(monitor_dir, name)
         if not os.path.isfile(path):
             continue
-        load_monitor_filter(path, project_dir, environment)
+        load_monitor_filter(path, options)

View File

@@ -12,20 +12,15 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-# pylint: disable=too-many-locals,too-many-statements
-
 import mimetypes
-import socket
-from os.path import isdir

 import click

-from platformio import exception
-from platformio.compat import WINDOWS
-from platformio.package.manager.core import get_core_package_dir, inject_contrib_pysite
+from platformio.commands.home.helpers import is_port_used
+from platformio.compat import ensure_python3


-@click.command("home", short_help="UI to manage PlatformIO")
+@click.command("home", short_help="GUI to manage PlatformIO")
 @click.option("--port", type=int, default=8008, help="HTTP port, default=8008")
 @click.option(
     "--host",
@@ -45,61 +40,30 @@ from platformio.package.manager.core import get_core_package_dir, inject_contrib_pysite
         "are connected. Default is 0 which means never auto shutdown"
     ),
 )
-def cli(port, host, no_open, shutdown_timeout):
-    # pylint: disable=import-error, import-outside-toplevel
-
-    # import contrib modules
-    inject_contrib_pysite()
-
-    from autobahn.twisted.resource import WebSocketResource
-    from twisted.internet import reactor
-    from twisted.web import server
-    from twisted.internet.error import CannotListenError
-
-    from platformio.commands.home.rpc.handlers.app import AppRPC
-    from platformio.commands.home.rpc.handlers.ide import IDERPC
-    from platformio.commands.home.rpc.handlers.misc import MiscRPC
-    from platformio.commands.home.rpc.handlers.os import OSRPC
-    from platformio.commands.home.rpc.handlers.piocore import PIOCoreRPC
-    from platformio.commands.home.rpc.handlers.project import ProjectRPC
-    from platformio.commands.home.rpc.handlers.account import AccountRPC
-    from platformio.commands.home.rpc.server import JSONRPCServerFactory
-    from platformio.commands.home.web import WebRoot
-
-    factory = JSONRPCServerFactory(shutdown_timeout)
-    factory.addHandler(AppRPC(), namespace="app")
-    factory.addHandler(IDERPC(), namespace="ide")
-    factory.addHandler(MiscRPC(), namespace="misc")
-    factory.addHandler(OSRPC(), namespace="os")
-    factory.addHandler(PIOCoreRPC(), namespace="core")
-    factory.addHandler(ProjectRPC(), namespace="project")
-    factory.addHandler(AccountRPC(), namespace="account")
-
-    contrib_dir = get_core_package_dir("contrib-piohome")
-    if not isdir(contrib_dir):
-        raise exception.PlatformioException("Invalid path to PIO Home Contrib")
+@click.option(
+    "--session-id",
+    help=(
+        "A unique session identifier to keep PIO Home isolated from other instances "
+        "and protect from 3rd party access"
+    ),
+)
+def cli(port, host, no_open, shutdown_timeout, session_id):
+    ensure_python3()

     # Ensure PIO Home mimetypes are known
     mimetypes.add_type("text/html", ".html")
     mimetypes.add_type("text/css", ".css")
     mimetypes.add_type("application/javascript", ".js")

-    root = WebRoot(contrib_dir)
-    root.putChild(b"wsrpc", WebSocketResource(factory))
-    site = server.Site(root)
-
     # hook for `platformio-node-helpers`
     if host == "__do_not_start__":
         return

-    already_started = is_port_used(host, port)
-    home_url = "http://%s:%d" % (host, port)
-    if not no_open:
-        if already_started:
-            click.launch(home_url)
-        else:
-            reactor.callLater(1, lambda: click.launch(home_url))
+    home_url = "http://%s:%d%s" % (
+        host,
+        port,
+        ("/session/%s/" % session_id) if session_id else "/",
+    )
     click.echo(
         "\n".join(
             [
@@ -108,45 +72,28 @@ def cli(port, host, no_open, shutdown_timeout):
                 " /\\-_--\\ PlatformIO Home",
                 "/ \\_-__\\",
                 "|[]| [] | %s" % home_url,
-                "|__|____|______________%s" % ("_" * len(host)),
+                "|__|____|__%s" % ("_" * len(home_url)),
             ]
         )
     )
     click.echo("")
     click.echo("Open PlatformIO Home in your browser by this URL => %s" % home_url)

-    try:
-        reactor.listenTCP(port, site, interface=host)
-    except CannotListenError as e:
-        click.secho(str(e), fg="red", err=True)
-        already_started = True
-
-    if already_started:
+    if is_port_used(host, port):
         click.secho(
             "PlatformIO Home server is already started in another process.", fg="yellow"
         )
+        if not no_open:
+            click.launch(home_url)
         return

-    click.echo("PIO Home has been started. Press Ctrl+C to shutdown.")
-
-    reactor.run()
-
-
-def is_port_used(host, port):
-    socket.setdefaulttimeout(1)
-    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
-    if WINDOWS:
-        try:
-            s.bind((host, port))
-            s.close()
-            return False
-        except (OSError, socket.error):
-            pass
-    else:
-        try:
-            s.connect((host, port))
-            s.close()
-        except socket.error:
-            return False
-
-    return True
+    # pylint: disable=import-outside-toplevel
+    from platformio.commands.home.run import run_server
+
+    run_server(
+        host=host,
+        port=port,
+        no_open=no_open,
+        shutdown_timeout=shutdown_timeout,
+        home_url=home_url,
+    )

View File

@@ -12,36 +12,27 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-# pylint: disable=keyword-arg-before-vararg,arguments-differ,signature-differs
+import socket

 import requests
-from twisted.internet import defer  # pylint: disable=import-error
-from twisted.internet import reactor  # pylint: disable=import-error
-from twisted.internet import threads  # pylint: disable=import-error
+from starlette.concurrency import run_in_threadpool

 from platformio import util
+from platformio.compat import WINDOWS
 from platformio.proc import where_is_program


 class AsyncSession(requests.Session):
-    def __init__(self, n=None, *args, **kwargs):
-        if n:
-            pool = reactor.getThreadPool()
-            pool.adjustPoolsize(0, n)
-
-        super(AsyncSession, self).__init__(*args, **kwargs)
-
-    def request(self, *args, **kwargs):
+    async def request(  # pylint: disable=signature-differs,invalid-overridden-method
+        self, *args, **kwargs
+    ):
         func = super(AsyncSession, self).request
-        return threads.deferToThread(func, *args, **kwargs)
-
-    def wrap(self, *args, **kwargs):  # pylint: disable=no-self-use
-        return defer.ensureDeferred(*args, **kwargs)
+        return await run_in_threadpool(func, *args, **kwargs)


 @util.memoized(expire="60s")
 def requests_session():
-    return AsyncSession(n=5)
+    return AsyncSession()


 @util.memoized(expire="60s")
@@ -49,3 +40,23 @@ def get_core_fullpath():
     return where_is_program(
         "platformio" + (".exe" if "windows" in util.get_systype() else "")
     )
+
+
+def is_port_used(host, port):
+    socket.setdefaulttimeout(1)
+    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
+    if WINDOWS:
+        try:
+            s.bind((host, port))
+            s.close()
+            return False
+        except (OSError, socket.error):
+            pass
+    else:
+        try:
+            s.connect((host, port))
+            s.close()
+        except socket.error:
+            return False
+
+    return True
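With the Twisted thread helpers gone, blocking requests calls are pushed onto Starlette's threadpool and awaited. For illustration, a minimal standalone Python sketch (not part of this commit; the URL is a placeholder) of the same pattern:

    # Illustrative sketch: await a blocking requests call via Starlette's threadpool,
    # mirroring AsyncSession.request() above. Requires requests and starlette.
    import asyncio

    import requests
    from starlette.concurrency import run_in_threadpool

    async def fetch(url):
        # requests.get() is blocking; running it in the threadpool keeps the
        # asyncio event loop responsive.
        return await run_in_threadpool(requests.get, url, timeout=10)

    async def main():
        r = await fetch("https://example.com")  # placeholder URL
        print(r.status_code)

    asyncio.run(main())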

View File

@@ -12,12 +12,12 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-import jsonrpc  # pylint: disable=import-error
+import jsonrpc

 from platformio.clients.account import AccountClient


-class AccountRPC(object):
+class AccountRPC:
     @staticmethod
     def call_client(method, *args, **kwargs):
         try:

View File

@@ -20,7 +20,7 @@ from platformio import __version__, app, fs, util
 from platformio.project.helpers import get_project_core_dir, is_platformio_project


-class AppRPC(object):
+class AppRPC:
     APPSTATE_PATH = join(get_project_core_dir(), "homestate.json")

View File

@@ -14,11 +14,12 @@
 import time

-import jsonrpc  # pylint: disable=import-error
-from twisted.internet import defer  # pylint: disable=import-error
+import jsonrpc

+from platformio.compat import get_running_loop

-class IDERPC(object):
+
+class IDERPC:
     def __init__(self):
         self._queue = {}
@@ -28,14 +29,14 @@ class IDERPC(object):
                 code=4005, message="PIO Home IDE agent is not started"
             )
         while self._queue[sid]:
-            self._queue[sid].pop().callback(
+            self._queue[sid].pop().set_result(
                 {"id": time.time(), "method": command, "params": params}
             )

     def listen_commands(self, sid=0):
         if sid not in self._queue:
             self._queue[sid] = []
-        self._queue[sid].append(defer.Deferred())
+        self._queue[sid].append(get_running_loop().create_future())
         return self._queue[sid][-1]

     def open_project(self, sid, project_dir):
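The handler now hands out plain asyncio Futures instead of Twisted Deferreds. For illustration, a minimal self-contained Python sketch (not part of this commit; names and the project path are hypothetical) of that listen/resolve pattern:

    # Standalone sketch of the Future-based queue above: a caller awaits a Future
    # from the queue, and another callback resolves it later with set_result(),
    # which is what send_command() does per session id.
    import asyncio

    queue = {}

    def listen_commands(sid=0):
        queue.setdefault(sid, []).append(asyncio.get_running_loop().create_future())
        return queue[sid][-1]

    def send_command(sid, command, params):
        while queue.get(sid):
            queue[sid].pop().set_result({"method": command, "params": params})

    async def main():
        fut = listen_commands()
        asyncio.get_running_loop().call_later(0.1, send_command, 0, "open_project", "/tmp/demo")
        print(await fut)  # {'method': 'open_project', 'params': '/tmp/demo'}

    asyncio.run(main())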

View File

@@ -15,14 +15,13 @@
 import json
 import time

-from twisted.internet import defer, reactor  # pylint: disable=import-error
-
 from platformio.cache import ContentCache
 from platformio.commands.home.rpc.handlers.os import OSRPC
+from platformio.compat import create_task


-class MiscRPC(object):
-    def load_latest_tweets(self, data_url):
+class MiscRPC:
+    async def load_latest_tweets(self, data_url):
         cache_key = ContentCache.key_from_args(data_url, "tweets")
         cache_valid = "180d"
         with ContentCache() as cc:
@@ -31,22 +30,20 @@ class MiscRPC(object):
                 cache_data = json.loads(cache_data)
                 # automatically update cache in background every 12 hours
                 if cache_data["time"] < (time.time() - (3600 * 12)):
-                    reactor.callLater(
-                        5, self._preload_latest_tweets, data_url, cache_key, cache_valid
+                    create_task(
+                        self._preload_latest_tweets(data_url, cache_key, cache_valid)
                     )
                 return cache_data["result"]

-        result = self._preload_latest_tweets(data_url, cache_key, cache_valid)
-        return result
+        return await self._preload_latest_tweets(data_url, cache_key, cache_valid)

     @staticmethod
-    @defer.inlineCallbacks
-    def _preload_latest_tweets(data_url, cache_key, cache_valid):
-        result = json.loads((yield OSRPC.fetch_content(data_url)))
+    async def _preload_latest_tweets(data_url, cache_key, cache_valid):
+        result = json.loads((await OSRPC.fetch_content(data_url)))
         with ContentCache() as cc:
             cc.set(
                 cache_key,
                 json.dumps({"time": int(time.time()), "result": result}),
                 cache_valid,
             )
-        defer.returnValue(result)
+        return result
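Stale cache entries are now refreshed with a fire-and-forget asyncio task instead of reactor.callLater. For illustration, a minimal standalone Python sketch (not part of this commit) of that pattern:

    # Standalone sketch: return cached data immediately and refresh it in the
    # background with asyncio.create_task(), mirroring the create_task() call above.
    import asyncio

    cache = {"value": "stale", "age": 999}

    async def refresh_cache():
        await asyncio.sleep(0.1)  # stands in for the network fetch
        cache.update(value="fresh", age=0)

    async def load():
        if cache["age"] > 100:
            asyncio.create_task(refresh_cache())  # not awaited; caller gets old data now
        return cache["value"]

    async def main():
        print(await load())       # -> stale (served instantly)
        await asyncio.sleep(0.2)
        print(cache["value"])     # -> fresh (updated in the background)

    asyncio.run(main())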

View File

@@ -14,25 +14,23 @@
 from __future__ import absolute_import

+import glob
 import io
 import os
 import shutil
 from functools import cmp_to_key

 import click
-from twisted.internet import defer  # pylint: disable=import-error

 from platformio import __default_requests_timeout__, fs, util
 from platformio.cache import ContentCache
 from platformio.clients.http import ensure_internet_on
 from platformio.commands.home import helpers
-from platformio.compat import PY2, get_filesystem_encoding, glob_recursive


-class OSRPC(object):
+class OSRPC:
     @staticmethod
-    @defer.inlineCallbacks
-    def fetch_content(uri, data=None, headers=None, cache_valid=None):
+    async def fetch_content(uri, data=None, headers=None, cache_valid=None):
         if not headers:
             headers = {
                 "User-Agent": (
@@ -46,18 +44,18 @@ class OSRPC(object):
             if cache_key:
                 result = cc.get(cache_key)
                 if result is not None:
-                    defer.returnValue(result)
+                    return result

         # check internet before and resolve issue with 60 seconds timeout
         ensure_internet_on(raise_exception=True)

         session = helpers.requests_session()
         if data:
-            r = yield session.post(
+            r = await session.post(
                 uri, data=data, headers=headers, timeout=__default_requests_timeout__
             )
         else:
-            r = yield session.get(
+            r = await session.get(
                 uri, headers=headers, timeout=__default_requests_timeout__
             )
@@ -66,11 +64,11 @@ class OSRPC(object):
         if cache_valid:
             with ContentCache() as cc:
                 cc.set(cache_key, result, cache_valid)
-        defer.returnValue(result)
+        return result

-    def request_content(self, uri, data=None, headers=None, cache_valid=None):
+    async def request_content(self, uri, data=None, headers=None, cache_valid=None):
         if uri.startswith("http"):
-            return self.fetch_content(uri, data, headers, cache_valid)
+            return await self.fetch_content(uri, data, headers, cache_valid)
         if os.path.isfile(uri):
             with io.open(uri, encoding="utf-8") as fp:
                 return fp.read()
@@ -82,13 +80,11 @@ class OSRPC(object):
     @staticmethod
     def reveal_file(path):
-        return click.launch(
-            path.encode(get_filesystem_encoding()) if PY2 else path, locate=True
-        )
+        return click.launch(path, locate=True)

     @staticmethod
     def open_file(path):
-        return click.launch(path.encode(get_filesystem_encoding()) if PY2 else path)
+        return click.launch(path)

     @staticmethod
     def is_file(path):
@@ -121,7 +117,9 @@ class OSRPC(object):
         result = set()
         for pathname in pathnames:
             result |= set(
-                glob_recursive(os.path.join(root, pathname) if root else pathname)
+                glob.glob(
+                    os.path.join(root, pathname) if root else pathname, recursive=True
+                )
             )
         return list(result)

View File

@@ -17,23 +17,15 @@ from __future__ import absolute_import
 import json
 import os
 import sys
-from io import BytesIO, StringIO
+from io import StringIO

 import click
-import jsonrpc  # pylint: disable=import-error
-from twisted.internet import defer  # pylint: disable=import-error
-from twisted.internet import threads  # pylint: disable=import-error
-from twisted.internet import utils  # pylint: disable=import-error
+import jsonrpc
+from starlette.concurrency import run_in_threadpool

-from platformio import __main__, __version__, fs
+from platformio import __main__, __version__, fs, proc
 from platformio.commands.home import helpers
-from platformio.compat import (
-    PY2,
-    get_filesystem_encoding,
-    get_locale_encoding,
-    is_bytes,
-    string_types,
-)
+from platformio.compat import get_locale_encoding, is_bytes

 try:
     from thread import get_ident as thread_get_ident
@@ -52,13 +44,11 @@ class MultiThreadingStdStream(object):
     def _ensure_thread_buffer(self, thread_id):
         if thread_id not in self._buffers:
-            self._buffers[thread_id] = BytesIO() if PY2 else StringIO()
+            self._buffers[thread_id] = StringIO()

     def write(self, value):
         thread_id = thread_get_ident()
         self._ensure_thread_buffer(thread_id)
-        if PY2 and isinstance(value, unicode):  # pylint: disable=undefined-variable
-            value = value.encode()
         return self._buffers[thread_id].write(
             value.decode() if is_bytes(value) else value
         )
@@ -74,7 +64,7 @@ class MultiThreadingStdStream(object):
         return result


-class PIOCoreRPC(object):
+class PIOCoreRPC:
     @staticmethod
     def version():
         return __version__
@@ -89,16 +79,9 @@ class PIOCoreRPC(object):
         sys.stderr = PIOCoreRPC.thread_stderr

     @staticmethod
-    def call(args, options=None):
-        return defer.maybeDeferred(PIOCoreRPC._call_generator, args, options)
-
-    @staticmethod
-    @defer.inlineCallbacks
-    def _call_generator(args, options=None):
+    async def call(args, options=None):
         for i, arg in enumerate(args):
-            if isinstance(arg, string_types):
-                args[i] = arg.encode(get_filesystem_encoding()) if PY2 else arg
-            else:
+            if not isinstance(arg, str):
                 args[i] = str(arg)

         options = options or {}
@@ -106,27 +89,34 @@ class PIOCoreRPC(object):
         try:
             if options.get("force_subprocess"):
-                result = yield PIOCoreRPC._call_subprocess(args, options)
-                defer.returnValue(PIOCoreRPC._process_result(result, to_json))
-            else:
-                result = yield PIOCoreRPC._call_inline(args, options)
-                try:
-                    defer.returnValue(PIOCoreRPC._process_result(result, to_json))
-                except ValueError:
-                    # fall-back to subprocess method
-                    result = yield PIOCoreRPC._call_subprocess(args, options)
-                    defer.returnValue(PIOCoreRPC._process_result(result, to_json))
+                result = await PIOCoreRPC._call_subprocess(args, options)
+                return PIOCoreRPC._process_result(result, to_json)
+            result = await PIOCoreRPC._call_inline(args, options)
+            try:
+                return PIOCoreRPC._process_result(result, to_json)
+            except ValueError:
+                # fall-back to subprocess method
+                result = await PIOCoreRPC._call_subprocess(args, options)
+                return PIOCoreRPC._process_result(result, to_json)
         except Exception as e:  # pylint: disable=bare-except
             raise jsonrpc.exceptions.JSONRPCDispatchException(
                 code=4003, message="PIO Core Call Error", data=str(e)
             )

     @staticmethod
-    def _call_inline(args, options):
-        PIOCoreRPC.setup_multithreading_std_streams()
-        cwd = options.get("cwd") or os.getcwd()
+    async def _call_subprocess(args, options):
+        result = await run_in_threadpool(
+            proc.exec_command,
+            [helpers.get_core_fullpath()] + args,
+            cwd=options.get("cwd") or os.getcwd(),
+        )
+        return (result["out"], result["err"], result["returncode"])

-        def _thread_task():
+    @staticmethod
+    async def _call_inline(args, options):
+        PIOCoreRPC.setup_multithreading_std_streams()
+
+        def _thread_safe_call(args, cwd):
             with fs.cd(cwd):
                 exit_code = __main__.main(["-c"] + args)
             return (
@@ -135,16 +125,8 @@ class PIOCoreRPC(object):
                 exit_code,
             )

-        return threads.deferToThread(_thread_task)
-
-    @staticmethod
-    def _call_subprocess(args, options):
-        cwd = (options or {}).get("cwd") or os.getcwd()
-        return utils.getProcessOutputAndValue(
-            helpers.get_core_fullpath(),
-            args,
-            path=cwd,
-            env={k: v for k, v in os.environ.items() if "%" not in k},
-        )
+        return await run_in_threadpool(
+            _thread_safe_call, args=args, cwd=options.get("cwd") or os.getcwd()
+        )

     @staticmethod

View File

@@ -18,12 +18,11 @@ import os
 import shutil
 import time

-import jsonrpc  # pylint: disable=import-error
+import jsonrpc

 from platformio import exception, fs
 from platformio.commands.home.rpc.handlers.app import AppRPC
 from platformio.commands.home.rpc.handlers.piocore import PIOCoreRPC
-from platformio.compat import PY2, get_filesystem_encoding
 from platformio.ide.projectgenerator import ProjectGenerator
 from platformio.package.manager.platform import PlatformPackageManager
 from platformio.project.config import ProjectConfig
@@ -32,7 +31,7 @@ from platformio.project.helpers import get_project_dir, is_platformio_project
 from platformio.project.options import get_config_options_schema


-class ProjectRPC(object):
+class ProjectRPC:
     @staticmethod
     def config_call(init_kwargs, method, *args):
         assert isinstance(init_kwargs, dict)
@@ -185,7 +184,7 @@ class ProjectRPC(object):
         )
         return sorted(result, key=lambda data: data["platform"]["title"])

-    def init(self, board, framework, project_dir):
+    async def init(self, board, framework, project_dir):
         assert project_dir
         state = AppRPC.load_state()
         if not os.path.isdir(project_dir):
@@ -198,14 +197,13 @@ class ProjectRPC(object):
             and state["storage"]["coreCaller"] in ProjectGenerator.get_supported_ides()
         ):
             args.extend(["--ide", state["storage"]["coreCaller"]])
-        d = PIOCoreRPC.call(
+        await PIOCoreRPC.call(
             args, options={"cwd": project_dir, "force_subprocess": True}
         )
-        d.addCallback(self._generate_project_main, project_dir, framework)
-        return d
+        return self._generate_project_main(project_dir, framework)

     @staticmethod
-    def _generate_project_main(_, project_dir, framework):
+    def _generate_project_main(project_dir, framework):
         main_content = None
         if framework == "arduino":
             main_content = "\n".join(
@@ -252,10 +250,8 @@ class ProjectRPC(object):
             fp.write(main_content.strip())
         return project_dir

-    def import_arduino(self, board, use_arduino_libs, arduino_project_dir):
+    async def import_arduino(self, board, use_arduino_libs, arduino_project_dir):
         board = str(board)
-        if arduino_project_dir and PY2:
-            arduino_project_dir = arduino_project_dir.encode(get_filesystem_encoding())
         # don't import PIO Project
         if is_platformio_project(arduino_project_dir):
             return arduino_project_dir
@@ -293,14 +289,9 @@ class ProjectRPC(object):
             and state["storage"]["coreCaller"] in ProjectGenerator.get_supported_ides()
         ):
             args.extend(["--ide", state["storage"]["coreCaller"]])
-        d = PIOCoreRPC.call(
+        await PIOCoreRPC.call(
             args, options={"cwd": project_dir, "force_subprocess": True}
         )
-        d.addCallback(self._finalize_arduino_import, project_dir, arduino_project_dir)
-        return d
-
-    @staticmethod
-    def _finalize_arduino_import(_, project_dir, arduino_project_dir):
         with fs.cd(project_dir):
             config = ProjectConfig()
             src_dir = config.get_optional_dir("src")
@@ -310,7 +301,7 @@ class ProjectRPC(object):
         return project_dir

     @staticmethod
-    def import_pio(project_dir):
+    async def import_pio(project_dir):
         if not project_dir or not is_platformio_project(project_dir):
             raise jsonrpc.exceptions.JSONRPCDispatchException(
                 code=4001, message="Not an PlatformIO project: %s" % project_dir
@@ -328,8 +319,7 @@ class ProjectRPC(object):
             and state["storage"]["coreCaller"] in ProjectGenerator.get_supported_ides()
         ):
             args.extend(["--ide", state["storage"]["coreCaller"]])
-        d = PIOCoreRPC.call(
+        await PIOCoreRPC.call(
             args, options={"cwd": new_project_dir, "force_subprocess": True}
         )
-        d.addCallback(lambda _: new_project_dir)
-        return d
+        return new_project_dir

View File

@@ -12,90 +12,107 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-# pylint: disable=import-error
+import inspect
+import json

 import click
 import jsonrpc
-from autobahn.twisted.websocket import WebSocketServerFactory, WebSocketServerProtocol
-from jsonrpc.exceptions import JSONRPCDispatchException
-from twisted.internet import defer, reactor
+from starlette.endpoints import WebSocketEndpoint

-from platformio.compat import PY2, dump_json_to_unicode, is_bytes
+from platformio.compat import create_task, get_running_loop, is_bytes
+from platformio.proc import force_exit


-class JSONRPCServerProtocol(WebSocketServerProtocol):
-    def onOpen(self):
-        self.factory.connection_nums += 1
-        if self.factory.shutdown_timer:
-            self.factory.shutdown_timer.cancel()
-            self.factory.shutdown_timer = None
-
-    def onClose(self, wasClean, code, reason):  # pylint: disable=unused-argument
-        self.factory.connection_nums -= 1
-        if self.factory.connection_nums == 0:
-            self.factory.shutdownByTimeout()
-
-    def onMessage(self, payload, isBinary):  # pylint: disable=unused-argument
-        # click.echo("> %s" % payload)
-        response = jsonrpc.JSONRPCResponseManager.handle(
-            payload, self.factory.dispatcher
-        ).data
-        # if error
-        if "result" not in response:
-            self.sendJSONResponse(response)
-            return None
-
-        d = defer.maybeDeferred(lambda: response["result"])
-        d.addCallback(self._callback, response)
-        d.addErrback(self._errback, response)
-
-        return None
-
-    def _callback(self, result, response):
-        response["result"] = result
-        self.sendJSONResponse(response)
-
-    def _errback(self, failure, response):
-        if isinstance(failure.value, JSONRPCDispatchException):
-            e = failure.value
-        else:
-            e = JSONRPCDispatchException(code=4999, message=failure.getErrorMessage())
-        del response["result"]
-        response["error"] = e.error._data  # pylint: disable=protected-access
-        self.sendJSONResponse(response)
-
-    def sendJSONResponse(self, response):
-        # click.echo("< %s" % response)
-        if "error" in response:
-            click.secho("Error: %s" % response["error"], fg="red", err=True)
-        response = dump_json_to_unicode(response)
-        if not PY2 and not is_bytes(response):
-            response = response.encode("utf-8")
-        self.sendMessage(response)
-
-
-class JSONRPCServerFactory(WebSocketServerFactory):
-    protocol = JSONRPCServerProtocol
+class JSONRPCServerFactoryBase:

     connection_nums = 0
-    shutdown_timer = 0
+    shutdown_timer = None

     def __init__(self, shutdown_timeout=0):
-        super(JSONRPCServerFactory, self).__init__()
         self.shutdown_timeout = shutdown_timeout
         self.dispatcher = jsonrpc.Dispatcher()

-    def shutdownByTimeout(self):
+    def __call__(self, *args, **kwargs):
+        raise NotImplementedError
+
+    def addHandler(self, handler, namespace):
+        self.dispatcher.build_method_map(handler, prefix="%s." % namespace)
+
+    def on_client_connect(self):
+        self.connection_nums += 1
+        if self.shutdown_timer:
+            self.shutdown_timer.cancel()
+            self.shutdown_timer = None
+
+    def on_client_disconnect(self):
+        self.connection_nums -= 1
+        if self.connection_nums < 1:
+            self.connection_nums = 0
+
+        if self.connection_nums == 0:
+            self.shutdown_by_timeout()
+
+    async def on_shutdown(self):
+        pass
+
+    def shutdown_by_timeout(self):
         if self.shutdown_timeout < 1:
             return

         def _auto_shutdown_server():
             click.echo("Automatically shutdown server on timeout")
-            reactor.stop()
+            force_exit()

-        self.shutdown_timer = reactor.callLater(
+        self.shutdown_timer = get_running_loop().call_later(
             self.shutdown_timeout, _auto_shutdown_server
         )

-    def addHandler(self, handler, namespace):
-        self.dispatcher.build_method_map(handler, prefix="%s." % namespace)
+
+class WebSocketJSONRPCServerFactory(JSONRPCServerFactoryBase):
+    def __call__(self, *args, **kwargs):
+        ws = WebSocketJSONRPCServer(*args, **kwargs)
+        ws.factory = self
+        return ws
+
+
+class WebSocketJSONRPCServer(WebSocketEndpoint):
+    encoding = "text"
+    factory: WebSocketJSONRPCServerFactory = None
+
+    async def on_connect(self, websocket):
+        await websocket.accept()
+        self.factory.on_client_connect()  # pylint: disable=no-member
+
+    async def on_receive(self, websocket, data):
+        create_task(self._handle_rpc(websocket, data))
+
+    async def on_disconnect(self, websocket, close_code):
+        self.factory.on_client_disconnect()  # pylint: disable=no-member
+
+    async def _handle_rpc(self, websocket, data):
+        response = jsonrpc.JSONRPCResponseManager.handle(
+            data, self.factory.dispatcher  # pylint: disable=no-member
+        )
+        if response.result and inspect.isawaitable(response.result):
+            try:
+                response.result = await response.result
+                response.data["result"] = response.result
+                response.error = None
+            except Exception as exc:  # pylint: disable=broad-except
+                if not isinstance(exc, jsonrpc.exceptions.JSONRPCDispatchException):
+                    exc = jsonrpc.exceptions.JSONRPCDispatchException(
+                        code=4999, message=str(exc)
+                    )
+                response.result = None
+                response.error = exc.error._data  # pylint: disable=protected-access
+                new_data = response.data.copy()
+                new_data["error"] = response.error
+                del new_data["result"]
+                response.data = new_data
+        if response.error:
+            click.secho("Error: %s" % response.error, fg="red", err=True)
+        if "result" in response.data and is_bytes(response.data["result"]):
+            response.data["result"] = response.data["result"].decode("utf-8")
+        await websocket.send_text(json.dumps(response.data))
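Because the RPC handlers are now coroutines, the dispatcher may hand back an awaitable that still has to be resolved before the JSON response is sent. For illustration, a minimal standalone Python sketch (not part of this commit) of that check:

    # Standalone sketch of resolving possibly-async RPC results, mirroring the
    # inspect.isawaitable() check in _handle_rpc() above.
    import asyncio
    import inspect

    def sync_handler():
        return {"status": "ok"}

    async def async_handler():
        await asyncio.sleep(0)
        return {"status": "ok (async)"}

    async def dispatch(handler):
        result = handler()
        if inspect.isawaitable(result):
            result = await result  # coroutine handlers are awaited transparently
        return result

    async def main():
        print(await dispatch(sync_handler))
        print(await dispatch(async_handler))

    asyncio.run(main())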

View File

@ -0,0 +1,99 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
from urllib.parse import urlparse
import click
import uvicorn
from starlette.applications import Starlette
from starlette.middleware import Middleware
from starlette.responses import PlainTextResponse
from starlette.routing import Mount, Route, WebSocketRoute
from starlette.staticfiles import StaticFiles
from starlette.status import HTTP_403_FORBIDDEN
from platformio.commands.home.rpc.handlers.account import AccountRPC
from platformio.commands.home.rpc.handlers.app import AppRPC
from platformio.commands.home.rpc.handlers.ide import IDERPC
from platformio.commands.home.rpc.handlers.misc import MiscRPC
from platformio.commands.home.rpc.handlers.os import OSRPC
from platformio.commands.home.rpc.handlers.piocore import PIOCoreRPC
from platformio.commands.home.rpc.handlers.project import ProjectRPC
from platformio.commands.home.rpc.server import WebSocketJSONRPCServerFactory
from platformio.compat import get_running_loop
from platformio.exception import PlatformioException
from platformio.package.manager.core import get_core_package_dir
from platformio.proc import force_exit
class ShutdownMiddleware:
def __init__(self, app):
self.app = app
async def __call__(self, scope, receive, send):
if scope["type"] == "http" and b"__shutdown__" in scope.get("query_string", {}):
await shutdown_server()
await self.app(scope, receive, send)
async def shutdown_server(_=None):
get_running_loop().call_later(0.5, force_exit)
return PlainTextResponse("Server has been shutdown!")
async def protected_page(_):
return PlainTextResponse(
"Protected PlatformIO Home session", status_code=HTTP_403_FORBIDDEN
)
def run_server(host, port, no_open, shutdown_timeout, home_url):
contrib_dir = get_core_package_dir("contrib-piohome")
if not os.path.isdir(contrib_dir):
raise PlatformioException("Invalid path to PIO Home Contrib")
ws_rpc_factory = WebSocketJSONRPCServerFactory(shutdown_timeout)
ws_rpc_factory.addHandler(AccountRPC(), namespace="account")
ws_rpc_factory.addHandler(AppRPC(), namespace="app")
ws_rpc_factory.addHandler(IDERPC(), namespace="ide")
ws_rpc_factory.addHandler(MiscRPC(), namespace="misc")
ws_rpc_factory.addHandler(OSRPC(), namespace="os")
ws_rpc_factory.addHandler(PIOCoreRPC(), namespace="core")
ws_rpc_factory.addHandler(ProjectRPC(), namespace="project")
path = urlparse(home_url).path
routes = [
WebSocketRoute(path + "wsrpc", ws_rpc_factory, name="wsrpc"),
Route(path + "__shutdown__", shutdown_server, methods=["POST"]),
Mount(path, StaticFiles(directory=contrib_dir, html=True), name="static"),
]
if path != "/":
routes.append(Route("/", protected_page))
uvicorn.run(
Starlette(
middleware=[Middleware(ShutdownMiddleware)],
routes=routes,
on_startup=[
lambda: click.echo(
"PIO Home has been started. Press Ctrl+C to shutdown."
),
lambda: None if no_open else click.launch(home_url),
],
),
host=host,
port=port,
log_level="warning",
)
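
For orientation, a hedged sketch of calling this entry point directly; in practice `pio home` supplies these values (including the session path derived from --session-id), and the module path is assumed from this diff:

from platformio.commands.home.run import run_server  # module path assumed from this diff

# Serve PIO Home on a session-scoped URL; requests outside that path hit
# protected_page and receive HTTP 403, which is what isolates sessions.
run_server(
    host="127.0.0.1",
    port=8008,
    no_open=True,        # do not launch the browser automatically
    shutdown_timeout=0,  # idle auto-shutdown timer, in seconds (illustrative value)
    home_url="http://127.0.0.1:8008/session-abc123/",
)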

View File

@ -1,28 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from twisted.internet import reactor # pylint: disable=import-error
from twisted.web import static # pylint: disable=import-error
class WebRoot(static.File):
def render_GET(self, request):
if request.args.get(b"__shutdown__", False):
reactor.stop()
return "Server has been stopped"
request.setHeader("cache-control", "no-cache, no-store, must-revalidate")
request.setHeader("pragma", "no-cache")
request.setHeader("expires", "0")
return static.File.render_GET(self, request)

View File

@ -13,7 +13,6 @@
# limitations under the License. # limitations under the License.
import json import json
import os
import platform import platform
import subprocess import subprocess
import sys import sys
@ -27,11 +26,15 @@ from platformio.commands.system.completion import (
install_completion_code, install_completion_code,
uninstall_completion_code, uninstall_completion_code,
) )
from platformio.commands.system.prune import (
prune_cached_data,
prune_core_packages,
prune_platform_packages,
)
from platformio.package.manager.library import LibraryPackageManager from platformio.package.manager.library import LibraryPackageManager
from platformio.package.manager.platform import PlatformPackageManager from platformio.package.manager.platform import PlatformPackageManager
from platformio.package.manager.tool import ToolPackageManager from platformio.package.manager.tool import ToolPackageManager
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
from platformio.project.helpers import get_project_cache_dir
@click.group("system", short_help="Miscellaneous system commands") @click.group("system", short_help="Miscellaneous system commands")
@ -99,22 +102,49 @@ def system_info(json_output):
@cli.command("prune", short_help="Remove unused data") @cli.command("prune", short_help="Remove unused data")
@click.option("--force", "-f", is_flag=True, help="Do not prompt for confirmation") @click.option("--force", "-f", is_flag=True, help="Do not prompt for confirmation")
def system_prune(force): @click.option(
click.secho("WARNING! This will remove:", fg="yellow") "--dry-run", is_flag=True, help="Do not prune, only show data that will be removed"
click.echo(" - cached API requests") )
click.echo(" - cached package downloads") @click.option("--cache", is_flag=True, help="Prune only cached data")
click.echo(" - temporary data") @click.option(
if not force: "--core-packages", is_flag=True, help="Prune only unnecessary core packages"
click.confirm("Do you want to continue?", abort=True) )
@click.option(
"--platform-packages",
is_flag=True,
help="Prune only unnecessary development platform packages",
)
def system_prune(force, dry_run, cache, core_packages, platform_packages):
if dry_run:
click.secho(
"Dry run mode (do not prune, only show data that will be removed)",
fg="yellow",
)
click.echo()
reclaimed_total = 0 reclaimed_cache = 0
cache_dir = get_project_cache_dir() reclaimed_core_packages = 0
if os.path.isdir(cache_dir): reclaimed_platform_packages = 0
reclaimed_total += fs.calculate_folder_size(cache_dir) prune_all = not any([cache, core_packages, platform_packages])
fs.rmtree(cache_dir)
if cache or prune_all:
reclaimed_cache = prune_cached_data(force, dry_run)
click.echo()
if core_packages or prune_all:
reclaimed_core_packages = prune_core_packages(force, dry_run)
click.echo()
if platform_packages or prune_all:
reclaimed_platform_packages = prune_platform_packages(force, dry_run)
click.echo()
click.secho( click.secho(
"Total reclaimed space: %s" % fs.humanize_file_size(reclaimed_total), fg="green" "Total reclaimed space: %s"
% fs.humanize_file_size(
reclaimed_cache + reclaimed_core_packages + reclaimed_platform_packages
),
fg="green",
) )

View File

@ -0,0 +1,98 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
from operator import itemgetter
import click
from tabulate import tabulate
from platformio import fs
from platformio.package.manager.core import remove_unnecessary_core_packages
from platformio.package.manager.platform import remove_unnecessary_platform_packages
from platformio.project.helpers import get_project_cache_dir
def prune_cached_data(force=False, dry_run=False, silent=False):
reclaimed_space = 0
if not silent:
click.secho("Prune cached data:", bold=True)
click.echo(" - cached API requests")
click.echo(" - cached package downloads")
click.echo(" - temporary data")
cache_dir = get_project_cache_dir()
if os.path.isdir(cache_dir):
reclaimed_space += fs.calculate_folder_size(cache_dir)
if not dry_run:
if not force:
click.confirm("Do you want to continue?", abort=True)
fs.rmtree(cache_dir)
if not silent:
click.secho("Space on disk: %s" % fs.humanize_file_size(reclaimed_space))
return reclaimed_space
def prune_core_packages(force=False, dry_run=False, silent=False):
if not silent:
click.secho("Prune unnecessary core packages:", bold=True)
return _prune_packages(force, dry_run, silent, remove_unnecessary_core_packages)
def prune_platform_packages(force=False, dry_run=False, silent=False):
if not silent:
click.secho("Prune unnecessary development platform packages:", bold=True)
return _prune_packages(force, dry_run, silent, remove_unnecessary_platform_packages)
def _prune_packages(force, dry_run, silent, handler):
if not silent:
click.echo("Calculating...")
items = [
(
pkg,
fs.calculate_folder_size(pkg.path),
)
for pkg in handler(dry_run=True)
]
items = sorted(items, key=itemgetter(1), reverse=True)
reclaimed_space = sum([item[1] for item in items])
if items and not silent:
click.echo(
tabulate(
[
(
pkg.metadata.spec.humanize(),
str(pkg.metadata.version),
fs.humanize_file_size(size),
)
for (pkg, size) in items
],
headers=["Package", "Version", "Size"],
)
)
if not dry_run:
if not force:
click.confirm("Do you want to continue?", abort=True)
handler(dry_run=False)
if not silent:
click.secho("Space on disk: %s" % fs.humanize_file_size(reclaimed_space))
return reclaimed_space
def calculate_unnecessary_system_data():
return (
prune_cached_data(force=True, dry_run=True, silent=True)
+ prune_core_packages(force=True, dry_run=True, silent=True)
+ prune_platform_packages(force=True, dry_run=True, silent=True)
)
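
These helpers can also be driven programmatically; a short sketch of a report-only pass followed by a non-interactive cache prune:

from platformio import fs
from platformio.commands.system.prune import (
    calculate_unnecessary_system_data,
    prune_cached_data,
)

# dry_run only measures; nothing is deleted.
reclaimable = calculate_unnecessary_system_data()
print("Reclaimable space:", fs.humanize_file_size(reclaimable))

# force=True skips the confirmation prompt; dry_run=False actually removes the cache.
prune_cached_data(force=True, dry_run=False)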

View File

@ -67,9 +67,9 @@ def ensure_python3(raise_exception=True):
return compatible return compatible
raise UserSideException( raise UserSideException(
"Python 3.6 or later is required for this operation. \n" "Python 3.6 or later is required for this operation. \n"
"Please install the latest Python 3 and reinstall PlatformIO Core using " "Please check a migration guide:\n"
"installation script:\n" "https://docs.platformio.org/en/latest/core/migration.html"
"https://docs.platformio.org/page/core/installation.html" "#drop-support-for-python-2-and-3-5"
) )
@ -78,6 +78,12 @@ if PY2:
string_types = (str, unicode) string_types = (str, unicode)
def create_task(coro, name=None):
raise NotImplementedError
def get_running_loop():
raise NotImplementedError
def is_bytes(x): def is_bytes(x):
return isinstance(x, (buffer, bytearray)) return isinstance(x, (buffer, bytearray))
@ -129,6 +135,12 @@ else:
import importlib.util import importlib.util
from glob import escape as glob_escape from glob import escape as glob_escape
if sys.version_info >= (3, 7):
from asyncio import create_task, get_running_loop
else:
from asyncio import ensure_future as create_task
from asyncio import get_event_loop as get_running_loop
string_types = (str,) string_types = (str,)
def is_bytes(x): def is_bytes(x):
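
A minimal sketch of what the shim buys callers: the same two names work on Python 3.6 (ensure_future/get_event_loop) and on 3.7+ (create_task/get_running_loop); the coroutine below is illustrative:

import asyncio

from platformio.compat import create_task, get_running_loop


async def demo():
    create_task(asyncio.sleep(0.1))  # fire-and-forget background task
    get_running_loop().call_later(0.2, lambda: print("tick"))
    await asyncio.sleep(0.5)


asyncio.get_event_loop().run_until_complete(demo())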

View File

@ -1,4 +1,4 @@
{{ cxx_path }} clang
{{"%c"}} {{ !cc_flags }} {{"%c"}} {{ !cc_flags }}
{{"%cpp"}} {{ !cxx_flags }} {{"%cpp"}} {{ !cxx_flags }}

View File

@ -1,4 +1,4 @@
{{ cxx_path }} clang
{{"%c"}} {{ !cc_flags }} {{"%c"}} {{ !cc_flags }}
{{"%cpp"}} {{ !cxx_flags }} {{"%cpp"}} {{ !cxx_flags }}

View File

@ -1,4 +1,4 @@
{{ cxx_path }} clang
{{"%c"}} {{ !cc_flags }} {{"%c"}} {{ !cc_flags }}
{{"%cpp"}} {{ !cxx_flags }} {{"%cpp"}} {{ !cxx_flags }}

View File

@ -26,6 +26,7 @@ from platformio.commands import PlatformioCLI
from platformio.commands.lib.command import CTX_META_STORAGE_DIRS_KEY from platformio.commands.lib.command import CTX_META_STORAGE_DIRS_KEY
from platformio.commands.lib.command import lib_update as cmd_lib_update from platformio.commands.lib.command import lib_update as cmd_lib_update
from platformio.commands.platform import platform_update as cmd_platform_update from platformio.commands.platform import platform_update as cmd_platform_update
from platformio.commands.system.prune import calculate_unnecessary_system_data
from platformio.commands.upgrade import get_latest_version from platformio.commands.upgrade import get_latest_version
from platformio.compat import ensure_python3 from platformio.compat import ensure_python3
from platformio.package.manager.core import update_core_packages from platformio.package.manager.core import update_core_packages
@ -39,6 +40,8 @@ from platformio.proc import is_container
def on_platformio_start(ctx, force, caller): def on_platformio_start(ctx, force, caller):
ensure_python3(raise_exception=True)
app.set_session_var("command_ctx", ctx) app.set_session_var("command_ctx", ctx)
app.set_session_var("force_option", force) app.set_session_var("force_option", force)
set_caller(caller) set_caller(caller)
@ -46,24 +49,8 @@ def on_platformio_start(ctx, force, caller):
if PlatformioCLI.in_silence(): if PlatformioCLI.in_silence():
return return
after_upgrade(ctx) after_upgrade(ctx)
if not ensure_python3(raise_exception=False):
click.secho(
"""
Python 2 and Python 3.5 are not compatible with PlatformIO Core 5.0.
Please check the migration guide on how to fix this warning message:
""",
fg="yellow",
)
click.secho(
"https://docs.platformio.org/en/latest/core/migration.html"
"#drop-support-for-python-2-and-3-5",
fg="blue",
)
click.echo("")
def on_platformio_end(ctx, result): # pylint: disable=unused-argument def on_platformio_end(ctx, result): # pylint: disable=unused-argument
if PlatformioCLI.in_silence(): if PlatformioCLI.in_silence():
@ -73,6 +60,7 @@ def on_platformio_end(ctx, result): # pylint: disable=unused-argument
check_platformio_upgrade() check_platformio_upgrade()
check_internal_updates(ctx, "platforms") check_internal_updates(ctx, "platforms")
check_internal_updates(ctx, "libraries") check_internal_updates(ctx, "libraries")
check_prune_system()
except ( except (
http.HTTPClientError, http.HTTPClientError,
http.InternetIsOffline, http.InternetIsOffline,
@ -347,3 +335,31 @@ def check_internal_updates(ctx, what): # pylint: disable=too-many-branches
click.echo("*" * terminal_width) click.echo("*" * terminal_width)
click.echo("") click.echo("")
def check_prune_system():
last_check = app.get_state_item("last_check", {})
interval = 30 * 3600 * 24 # 1 time per month
if (time() - interval) < last_check.get("prune_system", 0):
return
last_check["prune_system"] = int(time())
app.set_state_item("last_check", last_check)
threshold_mb = int(app.get_setting("check_prune_system_threshold") or 0)
if threshold_mb <= 0:
return
unnecessary_mb = calculate_unnecessary_system_data() / 1024
if unnecessary_mb < threshold_mb:
return
terminal_width, _ = click.get_terminal_size()
click.echo()
click.echo("*" * terminal_width)
click.secho(
"We found %s of unnecessary PlatformIO system data (temporary files, "
"unnecessary packages, etc.).\nUse `pio system prune --dry-run` to list "
"them or `pio system prune` to save disk space."
% fs.humanize_file_size(unnecessary_mb),
fg="yellow",
)

View File

@ -26,7 +26,10 @@ class PackageManagerUpdateMixin(object):
def outdated(self, pkg, spec=None): def outdated(self, pkg, spec=None):
assert isinstance(pkg, PackageItem) assert isinstance(pkg, PackageItem)
assert not spec or isinstance(spec, PackageSpec) assert not spec or isinstance(spec, PackageSpec)
assert os.path.isdir(pkg.path) and pkg.metadata assert pkg.metadata
if not os.path.isdir(pkg.path):
return PackageOutdatedResult(current=pkg.metadata.version)
# skip detached package to a specific version # skip detached package to a specific version
detached_conditions = [ detached_conditions = [

View File

@ -27,7 +27,18 @@ from platformio.package.meta import PackageItem, PackageSpec
from platformio.proc import get_pythonexe_path from platformio.proc import get_pythonexe_path
def get_core_package_dir(name): def get_installed_core_packages():
result = []
pm = ToolPackageManager()
for name, requirements in __core_packages__.items():
spec = PackageSpec(owner="platformio", name=name, requirements=requirements)
pkg = pm.get_package(spec)
if pkg:
result.append(pkg)
return result
def get_core_package_dir(name, auto_install=True):
if name not in __core_packages__: if name not in __core_packages__:
raise exception.PlatformioException("Please upgrade PlatformIO Core") raise exception.PlatformioException("Please upgrade PlatformIO Core")
pm = ToolPackageManager() pm = ToolPackageManager()
@ -37,8 +48,10 @@ def get_core_package_dir(name):
pkg = pm.get_package(spec) pkg = pm.get_package(spec)
if pkg: if pkg:
return pkg.path return pkg.path
if not auto_install:
return None
assert pm.install(spec) assert pm.install(spec)
_remove_unnecessary_packages() remove_unnecessary_core_packages()
return pm.get_package(spec).path return pm.get_package(spec).path
@ -52,24 +65,40 @@ def update_core_packages(only_check=False, silent=False):
if not silent or pm.outdated(pkg, spec).is_outdated(): if not silent or pm.outdated(pkg, spec).is_outdated():
pm.update(pkg, spec, only_check=only_check) pm.update(pkg, spec, only_check=only_check)
if not only_check: if not only_check:
_remove_unnecessary_packages() remove_unnecessary_core_packages()
return True return True
def _remove_unnecessary_packages(): def remove_unnecessary_core_packages(dry_run=False):
candidates = []
pm = ToolPackageManager() pm = ToolPackageManager()
best_pkg_versions = {} best_pkg_versions = {}
for name, requirements in __core_packages__.items(): for name, requirements in __core_packages__.items():
spec = PackageSpec(owner="platformio", name=name, requirements=requirements) spec = PackageSpec(owner="platformio", name=name, requirements=requirements)
pkg = pm.get_package(spec) pkg = pm.get_package(spec)
if not pkg: if not pkg:
continue continue
best_pkg_versions[pkg.metadata.name] = pkg.metadata.version best_pkg_versions[pkg.metadata.name] = pkg.metadata.version
for pkg in pm.get_installed(): for pkg in pm.get_installed():
if pkg.metadata.name not in best_pkg_versions: skip_conds = [
continue os.path.isfile(os.path.join(pkg.path, ".piokeep")),
if pkg.metadata.version != best_pkg_versions[pkg.metadata.name]: pkg.metadata.spec.owner != "platformio",
pm.uninstall(pkg) pkg.metadata.name not in best_pkg_versions,
pkg.metadata.name in best_pkg_versions
and pkg.metadata.version == best_pkg_versions[pkg.metadata.name],
]
if not any(skip_conds):
candidates.append(pkg)
if dry_run:
return candidates
for pkg in candidates:
pm.uninstall(pkg)
return candidates
def inject_contrib_pysite(verify_openssl=False): def inject_contrib_pysite(verify_openssl=False):
@ -160,7 +189,6 @@ def build_contrib_pysite_package(target_dir, with_metadata=True):
pkg.dump_meta() pkg.dump_meta()
# remove unused files # remove unused files
shutil.rmtree(os.path.join(target_dir, "autobahn", "xbr", "contracts"))
for root, dirs, files in os.walk(target_dir): for root, dirs, files in os.walk(target_dir):
for t in ("_test", "test", "tests"): for t in ("_test", "test", "tests"):
if t in dirs: if t in dirs:
@ -169,19 +197,6 @@ def build_contrib_pysite_package(target_dir, with_metadata=True):
if name.endswith((".chm", ".pyc")): if name.endswith((".chm", ".pyc")):
os.remove(os.path.join(root, name)) os.remove(os.path.join(root, name))
# apply patches
with open(
os.path.join(target_dir, "autobahn", "twisted", "__init__.py"), "r+"
) as fp:
contents = fp.read()
contents = contents.replace(
"from autobahn.twisted.wamp import ApplicationSession",
"# from autobahn.twisted.wamp import ApplicationSession",
)
fp.seek(0)
fp.truncate()
fp.write(contents)
return target_dir return target_dir
@ -192,8 +207,6 @@ def get_contrib_pysite_deps():
twisted_version = "19.10.0" if PY2 else "20.3.0" twisted_version = "19.10.0" if PY2 else "20.3.0"
result = [ result = [
"twisted == %s" % twisted_version, "twisted == %s" % twisted_version,
"autobahn == %s" % ("19.11.2" if PY2 else "20.7.1"),
"json-rpc == 1.13.0",
] ]
# twisted[tls], see setup.py for %twisted_version% # twisted[tls], see setup.py for %twisted_version%
@ -201,14 +214,6 @@ def get_contrib_pysite_deps():
["pyopenssl >= 16.0.0", "service_identity >= 18.1.0", "idna >= 0.6, != 2.3"] ["pyopenssl >= 16.0.0", "service_identity >= 18.1.0", "idna >= 0.6, != 2.3"]
) )
# zeroconf
if PY2:
result.append(
"https://github.com/ivankravets/python-zeroconf/" "archive/pio-py27.zip"
)
else:
result.append("zeroconf == 0.26.0")
if "windows" in sys_type: if "windows" in sys_type:
result.append("pypiwin32 == 223") result.append("pypiwin32 == 223")
# workaround for twisted wheels # workaround for twisted wheels
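
A brief illustration of the new dry-run flow for core packages; nothing is uninstalled, only removal candidates are listed:

from platformio.package.manager.core import remove_unnecessary_core_packages

for pkg in remove_unnecessary_core_packages(dry_run=True):
    print(pkg.metadata.name, pkg.metadata.version, pkg.path)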

View File

@ -112,16 +112,11 @@ class LibraryPackageManager(BasePackageManager): # pylint: disable=too-many-anc
) )
def _install_dependency(self, dependency, silent=False): def _install_dependency(self, dependency, silent=False):
if set(["name", "version"]) <= set(dependency.keys()) and any( spec = PackageSpec(
c in dependency["version"] for c in (":", "/", "@") owner=dependency.get("owner"),
): name=dependency.get("name"),
spec = PackageSpec("%s=%s" % (dependency["name"], dependency["version"])) requirements=dependency.get("version"),
else: )
spec = PackageSpec(
owner=dependency.get("owner"),
name=dependency.get("name"),
requirements=dependency.get("version"),
)
search_filters = { search_filters = {
key: value key: value
for key, value in dependency.items() for key, value in dependency.items()
@ -143,11 +138,12 @@ class LibraryPackageManager(BasePackageManager): # pylint: disable=too-many-anc
if not silent: if not silent:
self.print_message("Removing dependencies...", fg="yellow") self.print_message("Removing dependencies...", fg="yellow")
for dependency in manifest.get("dependencies"): for dependency in manifest.get("dependencies"):
pkg = self.get_package( spec = PackageSpec(
PackageSpec( owner=dependency.get("owner"),
name=dependency.get("name"), requirements=dependency.get("version") name=dependency.get("name"),
) requirements=dependency.get("version"),
) )
pkg = self.get_package(spec)
if not pkg: if not pkg:
continue continue
self._uninstall(pkg, silent=silent) self._uninstall(pkg, silent=silent)
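
The simplification above works because PackageSpec now absorbs the URL-in-version case itself (see the meta.py hunk below); a sketch with illustrative dependency entries:

from platformio.package.meta import PackageSpec

# A plain semver dependency, e.g. from library.json
spec = PackageSpec(owner="example-owner", name="ExampleLib", requirements="^1.2.3")

# A VCS/URL in the "version" field no longer needs special-casing in the manager;
# the spec parser falls back to "name=<url>" internally.
url_spec = PackageSpec(
    name="ExternalLib",
    requirements="https://github.com/example/external-lib.git",
)
print(bool(url_spec.url))  # True: treated as an external source, not a semver range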

View File

@ -12,10 +12,13 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import os
from platformio import util from platformio import util
from platformio.clients.http import HTTPClientError, InternetIsOffline from platformio.clients.http import HTTPClientError, InternetIsOffline
from platformio.package.exception import UnknownPackageError from platformio.package.exception import UnknownPackageError
from platformio.package.manager.base import BasePackageManager from platformio.package.manager.base import BasePackageManager
from platformio.package.manager.core import get_installed_core_packages
from platformio.package.manager.tool import ToolPackageManager from platformio.package.manager.tool import ToolPackageManager
from platformio.package.meta import PackageType from platformio.package.meta import PackageType
from platformio.platform.exception import IncompatiblePlatform, UnknownBoard from platformio.platform.exception import IncompatiblePlatform, UnknownBoard
@ -69,7 +72,6 @@ class PlatformPackageManager(BasePackageManager): # pylint: disable=too-many-an
) )
p.install_python_packages() p.install_python_packages()
p.on_installed() p.on_installed()
self.cleanup_packages(list(p.packages))
return pkg return pkg
def uninstall(self, spec, silent=False, skip_dependencies=False): def uninstall(self, spec, silent=False, skip_dependencies=False):
@ -83,7 +85,6 @@ class PlatformPackageManager(BasePackageManager): # pylint: disable=too-many-an
if not skip_dependencies: if not skip_dependencies:
p.uninstall_python_packages() p.uninstall_python_packages()
p.on_uninstalled() p.on_uninstalled()
self.cleanup_packages(list(p.packages))
return pkg return pkg
def update( # pylint: disable=arguments-differ, too-many-arguments def update( # pylint: disable=arguments-differ, too-many-arguments
@ -118,7 +119,6 @@ class PlatformPackageManager(BasePackageManager): # pylint: disable=too-many-an
) )
p.update_packages(only_check) p.update_packages(only_check)
self.cleanup_packages(list(p.packages))
if missed_pkgs: if missed_pkgs:
p.install_packages( p.install_packages(
@ -127,32 +127,6 @@ class PlatformPackageManager(BasePackageManager): # pylint: disable=too-many-an
return new_pkg or pkg return new_pkg or pkg
def cleanup_packages(self, names):
self.memcache_reset()
deppkgs = {}
for platform in PlatformPackageManager().get_installed():
p = PlatformFactory.new(platform)
for pkg in p.get_installed_packages():
if pkg.metadata.name not in deppkgs:
deppkgs[pkg.metadata.name] = set()
deppkgs[pkg.metadata.name].add(pkg.metadata.version)
pm = ToolPackageManager()
for pkg in pm.get_installed():
if pkg.metadata.name not in names:
continue
if (
pkg.metadata.name not in deppkgs
or pkg.metadata.version not in deppkgs[pkg.metadata.name]
):
try:
pm.uninstall(pkg.metadata.spec)
except UnknownPackageError:
pass
self.memcache_reset()
return True
@util.memoized(expire="5s") @util.memoized(expire="5s")
def get_installed_boards(self): def get_installed_boards(self):
boards = [] boards = []
@ -193,3 +167,37 @@ class PlatformPackageManager(BasePackageManager): # pylint: disable=too-many-an
): ):
return manifest return manifest
raise UnknownBoard(id_) raise UnknownBoard(id_)
#
# Helpers
#
def remove_unnecessary_platform_packages(dry_run=False):
candidates = []
required = set()
core_packages = get_installed_core_packages()
for platform in PlatformPackageManager().get_installed():
p = PlatformFactory.new(platform)
for pkg in p.get_installed_packages(with_optional=True):
required.add(pkg)
pm = ToolPackageManager()
for pkg in pm.get_installed():
skip_conds = [
pkg.metadata.spec.url,
os.path.isfile(os.path.join(pkg.path, ".piokeep")),
pkg in required,
pkg in core_packages,
]
if not any(skip_conds):
candidates.append(pkg)
if dry_run:
return candidates
for pkg in candidates:
pm.uninstall(pkg)
return candidates
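
A short sketch of the new opt-out and dry-run behavior; the pinned package name is illustrative. Dropping a .piokeep marker file into a tool package directory excludes it from pruning:

import os

from platformio.package.manager.platform import remove_unnecessary_platform_packages
from platformio.package.manager.tool import ToolPackageManager

pm = ToolPackageManager()
for pkg in pm.get_installed():
    if pkg.metadata.name == "tool-openocd":  # illustrative package to pin
        open(os.path.join(pkg.path, ".piokeep"), "w").close()

# Pinned (.piokeep), required, external-source and core packages are all skipped.
print([p.metadata.name for p in remove_unnecessary_platform_packages(dry_run=True)])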

View File

@ -107,16 +107,21 @@ class PackageSpec(object): # pylint: disable=too-many-instance-attributes
def __init__( # pylint: disable=redefined-builtin,too-many-arguments def __init__( # pylint: disable=redefined-builtin,too-many-arguments
self, raw=None, owner=None, id=None, name=None, requirements=None, url=None self, raw=None, owner=None, id=None, name=None, requirements=None, url=None
): ):
self._requirements = None
self.owner = owner self.owner = owner
self.id = id self.id = id
self.name = name self.name = name
self._requirements = None
self.url = url self.url = url
self.raw = raw self.raw = raw
if requirements: if requirements:
self.requirements = requirements try:
self.requirements = requirements
except ValueError as exc:
if not self.name or self.url or self.raw:
raise exc
self.raw = "%s=%s" % (self.name, requirements)
self._name_is_custom = False self._name_is_custom = False
self._parse(raw) self._parse(self.raw)
def __eq__(self, other): def __eq__(self, other):
return all( return all(
@ -405,7 +410,12 @@ class PackageItem(object):
) )
def __eq__(self, other): def __eq__(self, other):
return all([self.path == other.path, self.metadata == other.metadata]) if not self.path or not other.path:
return self.path == other.path
return os.path.realpath(self.path) == os.path.realpath(other.path)
def __hash__(self):
return hash(os.path.realpath(self.path))
def exists(self): def exists(self):
return os.path.isdir(self.path) return os.path.isdir(self.path)
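
The realpath-based equality and hashing above is what lets the pruning code collect installed packages in a set; a tiny sketch with an illustrative path:

import os

from platformio.package.meta import PackageItem

a = PackageItem(os.path.join("packages", "toolchain-example"))
b = PackageItem(os.path.join("packages", "toolchain-example"))

# Both items resolve to the same real path, so they compare equal and hash together.
required = {a}
print(b in required)  # True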

View File

@ -17,18 +17,17 @@ from platformio.package.meta import PackageSpec
class PlatformPackagesMixin(object): class PlatformPackagesMixin(object):
def get_package_spec(self, name): def get_package_spec(self, name, version=None):
version = self.packages[name].get("version", "")
if any(c in version for c in (":", "/", "@")):
return PackageSpec("%s=%s" % (name, version))
return PackageSpec( return PackageSpec(
owner=self.packages[name].get("owner"), name=name, requirements=version owner=self.packages[name].get("owner"),
name=name,
requirements=version or self.packages[name].get("version"),
) )
def get_package(self, name): def get_package(self, name, spec=None):
if not name: if not name:
return None return None
return self.pm.get_package(self.get_package_spec(name)) return self.pm.get_package(spec or self.get_package_spec(name))
def get_package_dir(self, name): def get_package_dir(self, name):
pkg = self.get_package(name) pkg = self.get_package(name)
@ -38,12 +37,18 @@ class PlatformPackagesMixin(object):
pkg = self.get_package(name) pkg = self.get_package(name)
return str(pkg.metadata.version) if pkg else None return str(pkg.metadata.version) if pkg else None
def get_installed_packages(self): def get_installed_packages(self, with_optional=False):
result = [] result = []
for name in self.packages: for name, options in self.packages.items():
pkg = self.get_package(name) versions = [options.get("version")]
if pkg: if with_optional:
result.append(pkg) versions.extend(options.get("optionalVersions", []))
for version in versions:
if not version:
continue
pkg = self.get_package(name, self.get_package_spec(name, version))
if pkg:
result.append(pkg)
return result return result
def dump_used_packages(self): def dump_used_packages(self):

View File

@ -37,6 +37,8 @@ class PlatformFactory(object):
@classmethod @classmethod
def new(cls, pkg_or_spec): def new(cls, pkg_or_spec):
# pylint: disable=import-outside-toplevel
platform_dir = None platform_dir = None
platform_name = None platform_name = None
if isinstance(pkg_or_spec, PackageItem): if isinstance(pkg_or_spec, PackageItem):
@ -45,9 +47,7 @@ class PlatformFactory(object):
elif os.path.isdir(pkg_or_spec): elif os.path.isdir(pkg_or_spec):
platform_dir = pkg_or_spec platform_dir = pkg_or_spec
else: else:
from platformio.package.manager.platform import ( # pylint: disable=import-outside-toplevel from platformio.package.manager.platform import PlatformPackageManager
PlatformPackageManager,
)
pkg = PlatformPackageManager().get_package(pkg_or_spec) pkg = PlatformPackageManager().get_package(pkg_or_spec)
if not pkg: if not pkg:

View File

@ -24,6 +24,7 @@ from platformio.compat import (
WINDOWS, WINDOWS,
get_filesystem_encoding, get_filesystem_encoding,
get_locale_encoding, get_locale_encoding,
get_running_loop,
string_types, string_types,
) )
@ -31,7 +32,10 @@ from platformio.compat import (
class AsyncPipeBase(object): class AsyncPipeBase(object):
def __init__(self): def __init__(self):
self._fd_read, self._fd_write = os.pipe() self._fd_read, self._fd_write = os.pipe()
self._pipe_reader = os.fdopen(self._fd_read) if PY2:
self._pipe_reader = os.fdopen(self._fd_read)
else:
self._pipe_reader = os.fdopen(self._fd_read, errors="backslashreplace")
self._buffer = "" self._buffer = ""
self._thread = Thread(target=self.run) self._thread = Thread(target=self.run)
self._thread.start() self._thread.start()
@ -67,10 +71,10 @@ class BuildAsyncPipe(AsyncPipeBase):
line = "" line = ""
print_immediately = False print_immediately = False
for byte in iter(lambda: self._pipe_reader.read(1), ""): for char in iter(lambda: self._pipe_reader.read(1), ""):
self._buffer += byte self._buffer += char
if line and byte.strip() and line[-3:] == (byte * 3): if line and char.strip() and line[-3:] == (char * 3):
print_immediately = True print_immediately = True
if print_immediately: if print_immediately:
@ -78,12 +82,12 @@ class BuildAsyncPipe(AsyncPipeBase):
if line: if line:
self.data_callback(line) self.data_callback(line)
line = "" line = ""
self.data_callback(byte) self.data_callback(char)
if byte == "\n": if char == "\n":
print_immediately = False print_immediately = False
else: else:
line += byte line += char
if byte != "\n": if char != "\n":
continue continue
self.line_callback(line) self.line_callback(line)
line = "" line = ""
@ -214,3 +218,12 @@ def append_env_path(name, value):
return cur_value return cur_value
os.environ[name] = os.pathsep.join([cur_value, value]) os.environ[name] = os.pathsep.join([cur_value, value])
return os.environ[name] return os.environ[name]
def force_exit(code=0):
try:
get_running_loop().stop()
except: # pylint: disable=bare-except
pass
finally:
sys.exit(code)
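
The errors="backslashreplace" reader matters for tools that emit non-UTF-8 bytes; a small standalone illustration of the decoding behavior:

import os

r, w = os.pipe()
reader = os.fdopen(r, errors="backslashreplace")  # mirrors the new AsyncPipeBase setup
os.write(w, b"warning: \xff stray byte from a build tool\n")
os.close(w)
print(reader.read())  # the undecodable byte shows up as "\xff" instead of raising
reader.close()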

View File

@ -681,6 +681,11 @@ ProjectOptions = OrderedDict(
"network address)" "network address)"
), ),
), ),
ConfigEnvOption(
group="debug",
name="debug_speed",
description="A debug adapter speed (JTAG speed)",
),
ConfigEnvOption( ConfigEnvOption(
group="debug", group="debug",
name="debug_svd_path", name="debug_svd_path",

View File

@ -25,7 +25,7 @@ from glob import glob
import click import click
from platformio import __version__, exception, proc from platformio import __version__, compat, exception, proc
from platformio.compat import PY2, WINDOWS from platformio.compat import PY2, WINDOWS
from platformio.fs import cd, load_json # pylint: disable=unused-import from platformio.fs import cd, load_json # pylint: disable=unused-import
from platformio.proc import exec_command # pylint: disable=unused-import from platformio.proc import exec_command # pylint: disable=unused-import
@ -162,14 +162,10 @@ def get_logical_devices():
def get_mdns_services(): def get_mdns_services():
# pylint: disable=import-outside-toplevel compat.ensure_python3()
try:
import zeroconf
except ImportError:
from platformio.package.manager.core import inject_contrib_pysite
inject_contrib_pysite() # pylint: disable=import-outside-toplevel
import zeroconf # pylint: disable=import-outside-toplevel import zeroconf
class mDNSListener(object): class mDNSListener(object):
def __init__(self): def __init__(self):
@ -178,15 +174,20 @@ def get_mdns_services():
self._found_services = [] self._found_services = []
def __enter__(self): def __enter__(self):
zeroconf.ServiceBrowser(self._zc, "_services._dns-sd._udp.local.", self) zeroconf.ServiceBrowser(
self._zc,
[
"_http._tcp.local.",
"_hap._tcp.local.",
"_services._dns-sd._udp.local.",
],
self,
)
return self return self
def __exit__(self, etype, value, traceback): def __exit__(self, etype, value, traceback):
self._zc.close() self._zc.close()
def remove_service(self, zc, type_, name):
pass
def add_service(self, zc, type_, name): def add_service(self, zc, type_, name):
try: try:
assert zeroconf.service_type_name(name) assert zeroconf.service_type_name(name)
@ -201,6 +202,12 @@ def get_mdns_services():
if s: if s:
self._found_services.append(s) self._found_services.append(s)
def remove_service(self, zc, type_, name):
pass
def update_service(self, zc, type_, name):
pass
def get_services(self): def get_services(self):
return self._found_services return self._found_services
@ -225,12 +232,7 @@ def get_mdns_services():
{ {
"type": service.type, "type": service.type,
"name": service.name, "name": service.name,
"ip": ".".join( "ip": ", ".join(service.parsed_addresses()),
[
str(c if isinstance(c, int) else ord(c))
for c in service.address
]
),
"port": service.port, "port": service.port,
"properties": properties, "properties": properties,
} }
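
A stripped-down sketch of the zeroconf pattern used here, with the service types taken from this hunk and an illustrative wait time:

import time

import zeroconf


class Listener:
    def __init__(self):
        self.names = []

    def add_service(self, zc, type_, name):
        self.names.append(name)

    def remove_service(self, zc, type_, name):
        pass

    def update_service(self, zc, type_, name):  # required by newer zeroconf releases
        pass


zc = zeroconf.Zeroconf()
listener = Listener()
zeroconf.ServiceBrowser(zc, ["_http._tcp.local.", "_hap._tcp.local."], listener)
time.sleep(3)  # give services time to respond
print(listener.names)
zc.close()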

View File

@ -167,3 +167,6 @@ ATTRS{idVendor}=="c251", ATTRS{idProduct}=="2710", MODE="0666", ENV{ID_MM_DEVICE
# CMSIS-DAP compatible adapters # CMSIS-DAP compatible adapters
ATTRS{product}=="*CMSIS-DAP*", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1" ATTRS{product}=="*CMSIS-DAP*", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
# Atmel AVR Dragon
ATTRS{idVendor}=="03eb", ATTRS{idProduct}=="2107", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"

View File

@ -26,18 +26,28 @@ from platformio import (
from platformio.compat import PY2, WINDOWS from platformio.compat import PY2, WINDOWS
install_requires = [ minimal_requirements = [
"bottle<0.13", "bottle==0.12.*",
"click>=5,<8%s" % (",!=7.1,!=7.1.1" if WINDOWS else ""), "click>=5,<8%s" % (",!=7.1,!=7.1.1" if WINDOWS else ""),
"colorama", "colorama",
"pyserial>=3,<4,!=3.3", "marshmallow%s" % (">=2,<3" if PY2 else ">=2,<4"),
"requests>=2.4.0,<3", "pyelftools>=0.27,<1",
"semantic_version>=2.8.1,<3", "pyserial==3.*",
"tabulate>=0.8.3,<1", "requests==2.*",
"pyelftools>=0.25,<1", "semantic_version==2.8.*",
"marshmallow%s" % (">=2,<3" if PY2 else ">=2"), "tabulate==0.8.*",
] ]
if not PY2:
minimal_requirements.append("zeroconf==0.28.*")
home_requirements = [
"aiofiles==0.6.*",
"json-rpc==1.13.*",
"starlette==0.14.*",
"uvicorn==0.13.*",
"wsproto==1.0.*",
]
setup( setup(
name=__title__, name=__title__,
@ -48,10 +58,7 @@ setup(
author_email=__email__, author_email=__email__,
url=__url__, url=__url__,
license=__license__, license=__license__,
python_requires=", ".join( install_requires=minimal_requirements + ([] if PY2 else home_requirements),
[">=2.7", "!=3.0.*", "!=3.1.*", "!=3.2.*", "!=3.3.*", "!=3.4.*"]
),
install_requires=install_requires,
packages=find_packages(exclude=["tests.*", "tests"]) + ["scripts"], packages=find_packages(exclude=["tests.*", "tests"]) + ["scripts"],
package_data={ package_data={
"platformio": [ "platformio": [
@ -77,7 +84,6 @@ setup(
"Operating System :: OS Independent", "Operating System :: OS Independent",
"Programming Language :: C", "Programming Language :: C",
"Programming Language :: Python", "Programming Language :: Python",
"Programming Language :: Python :: 2",
"Programming Language :: Python :: 3", "Programming Language :: Python :: 3",
"Topic :: Software Development", "Topic :: Software Development",
"Topic :: Software Development :: Build Tools", "Topic :: Software Development :: Build Tools",

View File

@ -154,7 +154,7 @@ def test_check_includes_passed(clirunner, check_dir):
inc_count = l.count("-I") inc_count = l.count("-I")
# at least 1 include path for default mode # at least 1 include path for default mode
assert inc_count > 1 assert inc_count > 0
def test_check_silent_mode(clirunner, validate_cliresult, check_dir): def test_check_silent_mode(clirunner, validate_cliresult, check_dir):

View File

@ -169,6 +169,15 @@ def test_spec_vcs_urls():
url="git+git@github.com:platformio/platformio-core.git", url="git+git@github.com:platformio/platformio-core.git",
requirements="^1.2.3,!=5", requirements="^1.2.3,!=5",
) )
assert PackageSpec(
owner="platformio",
name="external-repo",
requirements="https://github.com/platformio/platformio-core",
) == PackageSpec(
owner="platformio",
name="external-repo",
url="git+https://github.com/platformio/platformio-core",
)
def test_spec_as_dict(): def test_spec_as_dict():

View File

@ -13,14 +13,14 @@
# limitations under the License. # limitations under the License.
[tox] [tox]
envlist = py27,py37,py38,py39 envlist = py36,py37,py38,py39
[testenv] [testenv]
passenv = * passenv = *
usedevelop = True usedevelop = True
deps = deps =
py36,py37,py38,py39: black black
isort<5 isort
pylint pylint
pytest pytest
pytest-xdist pytest-xdist