Merge branch 'release/v4.1.0'

Ivan Kravets committed 2019-11-07 16:54:20 +02:00
119 changed files with 8521 additions and 4147 deletions


@@ -6,14 +6,15 @@ platform:
 environment:
   matrix:
     - TOXENV: "py27"
-      PLATFORMIO_BUILD_CACHE_DIR: C:/Temp/PIO_Build_Cache_P2_{build}
+      PLATFORMIO_BUILD_CACHE_DIR: C:\Temp\PIO_Build_Cache_P2_{build}
     - TOXENV: "py36"
-      PLATFORMIO_BUILD_CACHE_DIR: C:/Temp/PIO_Build_Cache_P3_{build}
+      PLATFORMIO_BUILD_CACHE_DIR: C:\Temp\PIO_Build_Cache_P3_{build}

 install:
   - cmd: git submodule update --init --recursive
   - cmd: SET PATH=C:\MinGW\bin;%PATH%
+  - cmd: SET PLATFORMIO_CORE_DIR=C:\.pio
   - cmd: pip install --force-reinstall tox

 test_script:


@@ -1,3 +1,3 @@
 [settings]
-line_length=79
-known_third_party=bottle,click,pytest,requests,SCons,semantic_version,serial,twisted,autobahn,jsonrpc,tabulate
+line_length=88
+known_third_party=SCons, twisted, autobahn, jsonrpc


@@ -1,5 +1,7 @@
 [MESSAGES CONTROL]
 disable=
+    bad-continuation,
+    bad-whitespace,
     missing-docstring,
     ungrouped-imports,
     invalid-name,
@@ -9,4 +11,5 @@ disable=
     too-few-public-methods,
     useless-object-inheritance,
     useless-import-alias,
-    fixme
+    fixme,
+    bad-option-value

.readthedocs.yml (new file, 12 lines)

@@ -0,0 +1,12 @@
+# See https://docs.readthedocs.io/en/stable/config-file/index.html
+version: 2
+sphinx:
+  configuration: docs/conf.py
+formats:
+  - pdf
+submodules:
+  include: all


@@ -3,8 +3,44 @@ Release Notes
 .. _release_notes_4_0:

-PlatformIO 4.0
---------------
+PlatformIO Core 4.0
+-------------------
+
+4.1.0 (2019-11-07)
+~~~~~~~~~~~~~~~~~~
+
+* `PIO Check <http://docs.platformio.org/page/plus/pio-check.html>`__ - automated code analysis without hassle:
+
+  - Potential NULL pointer dereferences
+  - Possible indexing beyond array bounds
+  - Suspicious assignments
+  - Reads of potentially uninitialized objects
+  - Unused variables or functions
+  - Out of scope memory usage.
+
+* `PlatformIO Home 3.0 <http://docs.platformio.org/page/home/index.html>`__ and Project Inspection
+
+  - Static Code Analysis
+  - Firmware File Explorer
+  - Firmware Memory Inspection
+  - Firmware Sections & Symbols Viewer.
+
+* Added support for `Build Middlewares <http://docs.platformio.org/page/projectconf/advanced_scripting.html#build-middlewares>`__: configure custom build flags per specific file, skip any build nodes from a framework, replace a build file with another on the fly, etc.
+* Extend project environment configuration in "platformio.ini" with other sections using a new `extends <http://docs.platformio.org/page/projectconf/section_env_advanced.html#extends>`__ option (`issue #2953 <https://github.com/platformio/platformio-core/issues/2953>`_)
+* Generate a ``.ccls`` LSP file for `Emacs <https://docs.platformio.org/page/ide/emacs.html>`__ cross references, hierarchies, completion and semantic highlighting
+* Added a ``--no-ansi`` flag for `PIO Core <http://docs.platformio.org/page/userguide/index.html>`__ to disable ANSI control characters
+* Added a ``--shutdown-timeout`` option to `PIO Home Server <http://docs.platformio.org/page/userguide/cmd_home.html>`__
+* Fixed an issue with the project generator for `CLion IDE <http://docs.platformio.org/page/ide/clion.html>`__ when 2 environments were used (`issue #2824 <https://github.com/platformio/platformio-core/issues/2824>`_)
+* Fixed the default PIO Unified Debugger configuration for the `J-Link probe <http://docs.platformio.org/page/plus/debug-tools/jlink.html>`__
+* Fixed an issue when configuration file options were partly ignored when using a custom ``--project-conf`` (`issue #3034 <https://github.com/platformio/platformio-core/issues/3034>`_)
+* Fixed an issue when installing a package using a custom Git tag and submodules were not updated correctly (`issue #3060 <https://github.com/platformio/platformio-core/issues/3060>`_)
+* Fixed an issue with the linking process when ``$LDSCRIPT`` contains a space in its path
+* Fixed a security issue when extracting items from a TAR archive (`issue #2995 <https://github.com/platformio/platformio-core/issues/2995>`_)
+* Fixed an issue with the project generator when ``src_build_flags`` were not respected (`issue #3137 <https://github.com/platformio/platformio-core/issues/3137>`_)
+* Fixed an issue when booleans in "platformio.ini" were not parsed properly (`issue #3022 <https://github.com/platformio/platformio-core/issues/3022>`_)
+* Fixed an issue with invalid encoding when generating a project for Visual Studio (`issue #3183 <https://github.com/platformio/platformio-core/issues/3183>`_)
+* Fixed an issue when the Project Config Parser did not remove in-line comments under Python 3 (`issue #3213 <https://github.com/platformio/platformio-core/issues/3213>`_)
+* Fixed an issue with the GCC Linter for PlatformIO IDE for Atom (`issue #3218 <https://github.com/platformio/platformio-core/issues/3218>`_)

 4.0.3 (2019-08-30)
 ~~~~~~~~~~~~~~~~~~

@@ -104,8 +140,8 @@ PlatformIO 4.0
 - Fixed "systemd-udevd" warnings in `99-platformio-udev.rules <http://docs.platformio.org/page/faq.html#platformio-udev-rules>`__ (`issue #2442 <https://github.com/platformio/platformio-core/issues/2442>`_)
 - Fixed an issue when package cache (Library Manager) expires too fast (`issue #2559 <https://github.com/platformio/platformio-core/issues/2559>`_)

-PlatformIO 3.0
---------------
+PlatformIO Core 3.0
+-------------------

 3.6.7 (2019-04-23)
 ~~~~~~~~~~~~~~~~~~

@@ -705,8 +741,8 @@ PlatformIO 3.0
   (`issue #742 <https://github.com/platformio/platformio-core/issues/742>`_)
 * Stopped supporting Python 2.6

-PlatformIO 2.0
---------------
+PlatformIO Core 2.0
+--------------------

 2.11.2 (2016-08-02)
 ~~~~~~~~~~~~~~~~~~~

@@ -1491,8 +1527,8 @@ PlatformIO 2.0
 * Fixed bug with creating copies of source files
   (`issue #177 <https://github.com/platformio/platformio-core/issues/177>`_)

-PlatformIO 1.0
---------------
+PlatformIO Core 1.0
+-------------------

 1.5.0 (2015-05-15)
 ~~~~~~~~~~~~~~~~~~

@@ -1682,8 +1718,8 @@ PlatformIO 1.0
   error (`issue #81 <https://github.com/platformio/platformio-core/issues/81>`_)
 * Several bug fixes, increased stability and performance improvements

-PlatformIO 0.0
---------------
+PlatformIO Core 0.0
+-------------------

 0.10.2 (2015-01-06)
 ~~~~~~~~~~~~~~~~~~~


@@ -5,13 +5,14 @@ isort:
        isort -rc ./platformio
        isort -rc ./tests

-yapf:
-       yapf --recursive --in-place platformio/
+black:
+       black --target-version py27 ./platformio
+       black --target-version py27 ./tests

 test:
-       py.test --verbose --capture=no --exitfirst -n 3 --dist=loadscope tests --ignore tests/test_examples.py --ignore tests/test_pkgmanifest.py
+       py.test --verbose --capture=no --exitfirst -n 3 --dist=loadscope tests --ignore tests/test_examples.py

-before-commit: isort yapf lint test
+before-commit: isort black lint test

 clean-docs:
        rm -rf docs/_build


@@ -34,16 +34,19 @@ PlatformIO
 .. image:: https://raw.githubusercontent.com/platformio/platformio-web/develop/app/images/platformio-ide-laptop.png
   :target: https://platformio.org?utm_source=github&utm_medium=core

-`PlatformIO <https://platformio.org?utm_source=github&utm_medium=core>`_ is an open source ecosystem for IoT
-development. Cross-platform IDE and unified debugger. Remote unit testing and
-firmware updates.
+`PlatformIO <https://platformio.org?utm_source=github&utm_medium=core>`_ an open source ecosystem for embedded development
+
+* **Cross-platform IDE** and **Unified Debugger**
+* **Static Code Analyzer** and **Remote Unit Testing**
+* **Multi-platform** and **Multi-architecture Build System**
+* **Firmware File Explorer** and **Memory Inspection**.

 Get Started
 -----------

 * `What is PlatformIO? <https://docs.platformio.org/en/latest/what-is-platformio.html?utm_source=github&utm_medium=core>`_

-Open Source
+Instruments
 -----------

 * `PlatformIO IDE <https://platformio.org/platformio-ide?utm_source=github&utm_medium=core>`_

@@ -57,11 +60,10 @@ Open Source
 PIO Plus
 --------

+* `PIO Check <https://docs.platformio.org/page/plus/pio-check.html?utm_source=github&utm_medium=core>`_
 * `PIO Remote <https://docs.platformio.org/page/plus/pio-remote.html?utm_source=github&utm_medium=core>`_
 * `PIO Unified Debugger <https://docs.platformio.org/page/plus/debugging.html?utm_source=github&utm_medium=core>`_
 * `PIO Unit Testing <https://docs.platformio.org/en/latest/plus/unit-testing.html?utm_source=github&utm_medium=core>`_
-* `Cloud IDEs Integration <https://docs.platformio.org/en/latest/ide.html?utm_source=github&utm_medium=core#solution-pio-delivery>`_
-* `Integration Services <https://platformio.org/pricing?utm_source=github&utm_medium=core#enterprise-features>`_

 Registry
 --------

@@ -93,6 +95,7 @@ Development Platforms
 * `RISC-V <https://platformio.org/platforms/riscv?utm_source=github&utm_medium=core>`_
 * `RISC-V GAP <https://platformio.org/platforms/riscv_gap?utm_source=github&utm_medium=core>`_
 * `Samsung ARTIK <https://platformio.org/platforms/samsung_artik?utm_source=github&utm_medium=core>`_
+* `Shakti <https://platformio.org/platforms/shakti?utm_source=github&utm_medium=core>`_
 * `Silicon Labs EFM32 <https://platformio.org/platforms/siliconlabsefm32?utm_source=github&utm_medium=core>`_
 * `ST STM32 <https://platformio.org/platforms/ststm32?utm_source=github&utm_medium=core>`_
 * `ST STM8 <https://platformio.org/platforms/ststm8?utm_source=github&utm_medium=core>`_

@@ -117,6 +120,7 @@ Frameworks
 * `mbed <https://platformio.org/frameworks/mbed?utm_source=github&utm_medium=core>`_
 * `PULP OS <https://platformio.org/frameworks/pulp-os?utm_source=github&utm_medium=core>`_
 * `Pumbaa <https://platformio.org/frameworks/pumbaa?utm_source=github&utm_medium=core>`_
+* `Shakti <https://platformio.org/frameworks/shakti?utm_source=github&utm_medium=core>`_
 * `Simba <https://platformio.org/frameworks/simba?utm_source=github&utm_medium=core>`_
 * `SPL <https://platformio.org/frameworks/spl?utm_source=github&utm_medium=core>`_
 * `STM32Cube <https://platformio.org/frameworks/stm32cube?utm_source=github&utm_medium=core>`_

Submodule docs updated: 704ff85c7d...28f91efb24


@@ -12,16 +12,19 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-VERSION = (4, 0, 3)
+VERSION = (4, 1, 0)
 __version__ = ".".join([str(s) for s in VERSION])

 __title__ = "platformio"
 __description__ = (
-    "An open source ecosystem for IoT development. "
-    "Cross-platform IDE and unified debugger. "
-    "Remote unit testing and firmware updates. "
+    "An open source ecosystem for embedded development. "
+    "Cross-platform IDE and Unified Debugger. "
+    "Static Code Analyzer and Remote Unit Testing. "
+    "Multi-platform and Multi-architecture Build System. "
+    "Firmware File Explorer and Memory Inspection. "
     "Arduino, ARM mbed, Espressif (ESP8266/ESP32), STM32, PIC32, nRF51/nRF52, "
-    "FPGA, CMSIS, SPL, AVR, Samsung ARTIK, libOpenCM3")
+    "RISC-V, FPGA, CMSIS, SPL, AVR, Samsung ARTIK, libOpenCM3"
+)

 __url__ = "https://platformio.org"
 __author__ = "PlatformIO"
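The version bump above is the only functional change in this file; the rest is the Black reformat plus the refreshed description string. A minimal sketch, assuming only the public ``platformio.VERSION`` / ``platformio.__version__`` attributes shown here, of how an external script could gate on this release (the helper name is made up for illustration):

    # require_core.py - hypothetical guard for tooling that needs 4.1+ features
    import platformio

    def require_core(minimum=(4, 1)):
        # VERSION is the tuple defined above; compare only major/minor so a
        # dev build with a non-integer third element cannot break the check.
        if platformio.VERSION[:2] < minimum:
            raise RuntimeError(
                "PlatformIO Core %s is too old, need >= %s.x"
                % (platformio.__version__, ".".join(map(str, minimum)))
            )

    require_core()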


@@ -23,22 +23,42 @@ from platformio.commands import PlatformioCLI
 from platformio.compat import CYGWIN

-@click.command(cls=PlatformioCLI,
-               context_settings=dict(help_option_names=["-h", "--help"]))
+@click.command(
+    cls=PlatformioCLI, context_settings=dict(help_option_names=["-h", "--help"])
+)
 @click.version_option(__version__, prog_name="PlatformIO")
-@click.option("--force",
-              "-f",
-              is_flag=True,
-              help="Force to accept any confirmation prompts.")
-@click.option("--caller", "-c", help="Caller ID (service).")
+@click.option("--force", "-f", is_flag=True, help="DEPRECATE")
+@click.option("--caller", "-c", help="Caller ID (service)")
+@click.option("--no-ansi", is_flag=True, help="Do not print ANSI control characters")
 @click.pass_context
-def cli(ctx, force, caller):
+def cli(ctx, force, caller, no_ansi):
+    try:
+        if (
+            no_ansi
+            or str(
+                os.getenv("PLATFORMIO_NO_ANSI", os.getenv("PLATFORMIO_DISABLE_COLOR"))
+            ).lower()
+            == "true"
+        ):
+            # pylint: disable=protected-access
+            click._compat.isatty = lambda stream: False
+        elif (
+            str(
+                os.getenv("PLATFORMIO_FORCE_ANSI", os.getenv("PLATFORMIO_FORCE_COLOR"))
+            ).lower()
+            == "true"
+        ):
+            # pylint: disable=protected-access
+            click._compat.isatty = lambda stream: True
+    except:  # pylint: disable=bare-except
+        pass
     maintenance.on_platformio_start(ctx, force, caller)

 @cli.resultcallback()
 @click.pass_context
-def process_result(ctx, result, force, caller):  # pylint: disable=W0613
+def process_result(ctx, result, *_, **__):
     maintenance.on_platformio_end(ctx, result)

@@ -50,21 +70,12 @@ def configure():
     # https://urllib3.readthedocs.org
     # /en/latest/security.html#insecureplatformwarning
     try:
-        import urllib3
+        import urllib3  # pylint: disable=import-outside-toplevel
         urllib3.disable_warnings()
     except (AttributeError, ImportError):
         pass

-    try:
-        if str(os.getenv("PLATFORMIO_DISABLE_COLOR", "")).lower() == "true":
-            # pylint: disable=protected-access
-            click._compat.isatty = lambda stream: False
-        elif str(os.getenv("PLATFORMIO_FORCE_COLOR", "")).lower() == "true":
-            # pylint: disable=protected-access
-            click._compat.isatty = lambda stream: True
-    except:  # pylint: disable=bare-except
-        pass

     # Handle IOError issue with VSCode's Terminal (Windows)
     click_echo_origin = [click.echo, click.secho]

@@ -73,7 +84,8 @@ def configure():
             click_echo_origin[origin](*args, **kwargs)
         except IOError:
             (sys.stderr.write if kwargs.get("err") else sys.stdout.write)(
-                "%s\n" % (args[0] if args else ""))
+                "%s\n" % (args[0] if args else "")
+            )

     click.echo = lambda *args, **kwargs: _safe_echo(0, *args, **kwargs)
     click.secho = lambda *args, **kwargs: _safe_echo(1, *args, **kwargs)

@@ -87,7 +99,7 @@ def main(argv=None):
         sys.argv = argv
     try:
         configure()
-        cli(None, None, None)
+        cli()  # pylint: disable=no-value-for-parameter
     except SystemExit:
         pass
     except Exception as e:  # pylint: disable=broad-except
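The hunk above moves ANSI/color handling out of ``configure()`` into the ``cli`` entry point, adding the ``--no-ansi`` flag plus the ``PLATFORMIO_NO_ANSI`` / ``PLATFORMIO_FORCE_ANSI`` variables (the older ``PLATFORMIO_DISABLE_COLOR`` / ``PLATFORMIO_FORCE_COLOR`` names remain as fallbacks). A minimal standalone sketch of the same mechanism; only the click internals mirror the diff, the function name and the final two lines are illustrative:

    import os
    import click

    def apply_ansi_preference(no_ansi_flag=False):
        # Make click treat stdout as a non-TTY (or force a TTY) so that styled
        # output is emitted without ANSI escape codes when the user asked for that.
        # pylint: disable=protected-access
        if no_ansi_flag or os.getenv("PLATFORMIO_NO_ANSI", "").lower() == "true":
            click._compat.isatty = lambda stream: False
        elif os.getenv("PLATFORMIO_FORCE_ANSI", "").lower() == "true":
            click._compat.isatty = lambda stream: True

    apply_ansi_preference(no_ansi_flag=True)
    click.secho("plain text, no escape codes", fg="green")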


@@ -17,30 +17,19 @@ import hashlib
 import os
 import uuid
 from os import environ, getenv, listdir, remove
-from os.path import abspath, dirname, expanduser, isdir, isfile, join
+from os.path import abspath, dirname, isdir, isfile, join
 from time import time

 import requests

 from platformio import exception, fs, lockfile
-from platformio.compat import (WINDOWS, dump_json_to_unicode,
-                               hashlib_encode_data)
+from platformio.compat import WINDOWS, dump_json_to_unicode, hashlib_encode_data
 from platformio.proc import is_ci
-from platformio.project.helpers import (get_project_cache_dir,
-                                        get_project_core_dir)
-
-
-def get_default_projects_dir():
-    docs_dir = join(expanduser("~"), "Documents")
-    try:
-        assert WINDOWS
-        import ctypes.wintypes
-        buf = ctypes.create_unicode_buffer(ctypes.wintypes.MAX_PATH)
-        ctypes.windll.shell32.SHGetFolderPathW(None, 5, None, 0, buf)
-        docs_dir = buf.value
-    except:  # pylint: disable=bare-except
-        pass
-    return join(docs_dir, "PlatformIO", "Projects")
+from platformio.project.helpers import (
+    get_default_projects_dir,
+    get_project_cache_dir,
+    get_project_core_dir,
+)

 def projects_dir_validate(projects_dir):

@@ -51,53 +40,53 @@ def projects_dir_validate(projects_dir):
 DEFAULT_SETTINGS = {
     "auto_update_libraries": {
         "description": "Automatically update libraries (Yes/No)",
-        "value": False
+        "value": False,
     },
     "auto_update_platforms": {
         "description": "Automatically update platforms (Yes/No)",
-        "value": False
+        "value": False,
     },
     "check_libraries_interval": {
         "description": "Check for the library updates interval (days)",
-        "value": 7
+        "value": 7,
     },
     "check_platformio_interval": {
         "description": "Check for the new PlatformIO interval (days)",
-        "value": 3
+        "value": 3,
     },
     "check_platforms_interval": {
         "description": "Check for the platform updates interval (days)",
-        "value": 7
+        "value": 7,
     },
     "enable_cache": {
         "description": "Enable caching for API requests and Library Manager",
-        "value": True
+        "value": True,
     },
-    "strict_ssl": {
-        "description": "Strict SSL for PlatformIO Services",
-        "value": False
-    },
+    "strict_ssl": {"description": "Strict SSL for PlatformIO Services", "value": False},
     "enable_telemetry": {
-        "description":
-        ("Telemetry service <http://bit.ly/pio-telemetry> (Yes/No)"),
-        "value": True
+        "description": ("Telemetry service <http://bit.ly/pio-telemetry> (Yes/No)"),
+        "value": True,
     },
     "force_verbose": {
         "description": "Force verbose output when processing environments",
-        "value": False
+        "value": False,
     },
     "projects_dir": {
         "description": "Default location for PlatformIO projects (PIO Home)",
         "value": get_default_projects_dir(),
-        "validator": projects_dir_validate
+        "validator": projects_dir_validate,
     },
 }

-SESSION_VARS = {"command_ctx": None, "force_option": False, "caller_id": None}
+SESSION_VARS = {
+    "command_ctx": None,
+    "force_option": False,
+    "caller_id": None,
+    "custom_project_conf": None,
+}

 class State(object):
     def __init__(self, path=None, lock=False):
         self.path = path
         self.lock = lock

@@ -113,8 +102,12 @@ class State(object):
             if isfile(self.path):
                 self._storage = fs.load_json(self.path)
             assert isinstance(self._storage, dict)
-        except (AssertionError, ValueError, UnicodeDecodeError,
-                exception.InvalidJSONFile):
+        except (
+            AssertionError,
+            ValueError,
+            UnicodeDecodeError,
+            exception.InvalidJSONFile,
+        ):
             self._storage = {}
         return self

@@ -174,7 +167,6 @@ class State(object):

 class ContentCache(object):
-
     def __init__(self, cache_dir=None):
         self.cache_dir = None
         self._db_path = None

@@ -277,8 +269,11 @@ class ContentCache(object):
                 continue
             expire, path = line.split("=")
             try:
-                if time() < int(expire) and isfile(path) and \
-                        path not in paths_for_delete:
+                if (
+                    time() < int(expire)
+                    and isfile(path)
+                    and path not in paths_for_delete
+                ):
                     newlines.append(line)
                     continue
             except ValueError:

@@ -317,11 +312,11 @@ def sanitize_setting(name, value):
     defdata = DEFAULT_SETTINGS[name]
     try:
         if "validator" in defdata:
-            value = defdata['validator'](value)
-        elif isinstance(defdata['value'], bool):
+            value = defdata["validator"](value)
+        elif isinstance(defdata["value"], bool):
             if not isinstance(value, bool):
                 value = str(value).lower() in ("true", "yes", "y", "1")
-        elif isinstance(defdata['value'], int):
+        elif isinstance(defdata["value"], int):
             value = int(value)
     except Exception:
         raise exception.InvalidSettingValue(value, name)

@@ -351,24 +346,24 @@ def get_setting(name):
         return sanitize_setting(name, getenv(_env_name))

     with State() as state:
-        if "settings" in state and name in state['settings']:
-            return state['settings'][name]
+        if "settings" in state and name in state["settings"]:
+            return state["settings"][name]

-    return DEFAULT_SETTINGS[name]['value']
+    return DEFAULT_SETTINGS[name]["value"]

 def set_setting(name, value):
     with State(lock=True) as state:
         if "settings" not in state:
-            state['settings'] = {}
-        state['settings'][name] = sanitize_setting(name, value)
+            state["settings"] = {}
+        state["settings"][name] = sanitize_setting(name, value)
         state.modified = True

 def reset_settings():
     with State(lock=True) as state:
         if "settings" in state:
-            del state['settings']
+            del state["settings"]

 def get_session_var(name, default=None):

@@ -381,11 +376,13 @@ def set_session_var(name, value):

 def is_disabled_progressbar():
-    return any([
-        get_session_var("force_option"),
-        is_ci(),
-        getenv("PLATFORMIO_DISABLE_PROGRESSBAR") == "true"
-    ])
+    return any(
+        [
+            get_session_var("force_option"),
+            is_ci(),
+            getenv("PLATFORMIO_DISABLE_PROGRESSBAR") == "true",
+        ]
+    )

 def get_cid():

@@ -397,15 +394,22 @@ def get_cid():
         uid = getenv("C9_UID")
     elif getenv("CHE_API", getenv("CHE_API_ENDPOINT")):
         try:
-            uid = requests.get("{api}/user?token={token}".format(
-                api=getenv("CHE_API", getenv("CHE_API_ENDPOINT")),
-                token=getenv("USER_TOKEN"))).json().get("id")
+            uid = (
+                requests.get(
+                    "{api}/user?token={token}".format(
+                        api=getenv("CHE_API", getenv("CHE_API_ENDPOINT")),
+                        token=getenv("USER_TOKEN"),
+                    )
+                )
+                .json()
+                .get("id")
+            )
         except:  # pylint: disable=bare-except
             pass
     if not uid:
         uid = uuid.getnode()
     cid = uuid.UUID(bytes=hashlib.md5(hashlib_encode_data(uid)).digest())
     cid = str(cid)
-    if WINDOWS or os.getuid() > 0:  # yapf: disable pylint: disable=no-member
+    if WINDOWS or os.getuid() > 0:  # pylint: disable=no-member
         set_state_item("cid", cid)
     return cid
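``sanitize_setting()`` above is what keeps ``pio settings set`` and the environment-variable overrides of these settings type-safe: a value is passed to the optional validator, otherwise coerced to the type of the default in ``DEFAULT_SETTINGS``. A trimmed-down sketch of that coercion rule in isolation (standalone names, not the module's API):

    DEFAULTS = {"enable_cache": True, "check_platformio_interval": 3}

    def coerce(name, value):
        # Booleans accept "true/yes/y/1" (case-insensitive); integers are cast;
        # anything else is passed through unchanged - mirrors sanitize_setting().
        default = DEFAULTS[name]
        if isinstance(default, bool):
            if isinstance(value, bool):
                return value
            return str(value).lower() in ("true", "yes", "y", "1")
        if isinstance(default, int):
            return int(value)
        return value

    assert coerce("enable_cache", "Yes") is True
    assert coerce("check_platformio_interval", "7") == 7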


@@ -12,6 +12,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

+import sys
 from os import environ, makedirs
 from os.path import isdir, join
 from time import time

@@ -28,10 +29,10 @@ from SCons.Script import Import  # pylint: disable=import-error
 from SCons.Script import Variables  # pylint: disable=import-error

 from platformio import fs
-from platformio.compat import PY2, dump_json_to_unicode
+from platformio.compat import dump_json_to_unicode
 from platformio.managers.platform import PlatformBase
 from platformio.proc import get_pythonexe_path
-from platformio.project import helpers as project_helpers
+from platformio.project.helpers import get_project_dir

 AllowSubstExceptions(NameError)

@@ -43,48 +44,44 @@ clivars.AddVariables(
     ("PROJECT_CONFIG",),
     ("PIOENV",),
     ("PIOTEST_RUNNING_NAME",),
-    ("UPLOAD_PORT",)
-)  # yapf: disable
+    ("UPLOAD_PORT",),
+)

 DEFAULT_ENV_OPTIONS = dict(
     tools=[
-        "ar", "gas", "gcc", "g++", "gnulink", "platformio", "pioplatform",
-        "pioproject", "piowinhooks", "piolib", "pioupload", "piomisc", "pioide"
+        "ar",
+        "gas",
+        "gcc",
+        "g++",
+        "gnulink",
+        "platformio",
+        "pioplatform",
+        "pioproject",
+        "piomaxlen",
+        "piolib",
+        "pioupload",
+        "piomisc",
+        "pioide",
+        "piosize",
     ],
     toolpath=[join(fs.get_source_dir(), "builder", "tools")],
     variables=clivars,
     # Propagating External Environment
     ENV=environ,
     UNIX_TIME=int(time()),
-    PROJECT_DIR=project_helpers.get_project_dir(),
-    PROJECTCORE_DIR=project_helpers.get_project_core_dir(),
-    PROJECTPACKAGES_DIR=project_helpers.get_project_packages_dir(),
-    PROJECTWORKSPACE_DIR=project_helpers.get_project_workspace_dir(),
-    PROJECTLIBDEPS_DIR=project_helpers.get_project_libdeps_dir(),
-    PROJECTINCLUDE_DIR=project_helpers.get_project_include_dir(),
-    PROJECTSRC_DIR=project_helpers.get_project_src_dir(),
-    PROJECTTEST_DIR=project_helpers.get_project_test_dir(),
-    PROJECTDATA_DIR=project_helpers.get_project_data_dir(),
-    PROJECTBUILD_DIR=project_helpers.get_project_build_dir(),
-    BUILDCACHE_DIR=project_helpers.get_project_optional_dir("build_cache_dir"),
-    BUILD_DIR=join("$PROJECTBUILD_DIR", "$PIOENV"),
-    BUILDSRC_DIR=join("$BUILD_DIR", "src"),
-    BUILDTEST_DIR=join("$BUILD_DIR", "test"),
+    BUILD_DIR=join("$PROJECT_BUILD_DIR", "$PIOENV"),
+    BUILD_SRC_DIR=join("$BUILD_DIR", "src"),
+    BUILD_TEST_DIR=join("$BUILD_DIR", "test"),
     LIBPATH=["$BUILD_DIR"],
-    LIBSOURCE_DIRS=[
-        project_helpers.get_project_lib_dir(),
-        join("$PROJECTLIBDEPS_DIR", "$PIOENV"),
-        project_helpers.get_project_global_lib_dir()
-    ],
     PROGNAME="program",
     PROG_PATH=join("$BUILD_DIR", "$PROGNAME$PROGSUFFIX"),
-    PYTHONEXE=get_pythonexe_path())
+    PYTHONEXE=get_pythonexe_path(),
+)

 if not int(ARGUMENTS.get("PIOVERBOSE", 0)):
-    DEFAULT_ENV_OPTIONS['ARCOMSTR'] = "Archiving $TARGET"
-    DEFAULT_ENV_OPTIONS['LINKCOMSTR'] = "Linking $TARGET"
-    DEFAULT_ENV_OPTIONS['RANLIBCOMSTR'] = "Indexing $TARGET"
+    DEFAULT_ENV_OPTIONS["ARCOMSTR"] = "Archiving $TARGET"
+    DEFAULT_ENV_OPTIONS["LINKCOMSTR"] = "Linking $TARGET"
+    DEFAULT_ENV_OPTIONS["RANLIBCOMSTR"] = "Indexing $TARGET"
     for k in ("ASCOMSTR", "ASPPCOMSTR", "CCCOMSTR", "CXXCOMSTR"):
         DEFAULT_ENV_OPTIONS[k] = "Compiling $TARGET"

@@ -94,31 +91,59 @@ env = DefaultEnvironment(**DEFAULT_ENV_OPTIONS)
 env.Replace(
     **{
         key: PlatformBase.decode_scons_arg(env[key])
-        for key in list(clivars.keys()) if key in env
-    })
+        for key in list(clivars.keys())
+        if key in env
+    }
+)

-if env.subst("$BUILDCACHE_DIR"):
-    if not isdir(env.subst("$BUILDCACHE_DIR")):
-        makedirs(env.subst("$BUILDCACHE_DIR"))
-    env.CacheDir("$BUILDCACHE_DIR")
+# Setup project optional directories
+config = env.GetProjectConfig()
+env.Replace(
+    PROJECT_DIR=get_project_dir(),
+    PROJECT_CORE_DIR=config.get_optional_dir("core"),
+    PROJECT_PACKAGES_DIR=config.get_optional_dir("packages"),
+    PROJECT_WORKSPACE_DIR=config.get_optional_dir("workspace"),
+    PROJECT_LIBDEPS_DIR=config.get_optional_dir("libdeps"),
+    PROJECT_INCLUDE_DIR=config.get_optional_dir("include"),
+    PROJECT_SRC_DIR=config.get_optional_dir("src"),
+    PROJECTSRC_DIR=config.get_optional_dir("src"),  # legacy for dev/platform
+    PROJECT_TEST_DIR=config.get_optional_dir("test"),
+    PROJECT_DATA_DIR=config.get_optional_dir("data"),
+    PROJECTDATA_DIR=config.get_optional_dir("data"),  # legacy for dev/platform
+    PROJECT_BUILD_DIR=config.get_optional_dir("build"),
+    BUILD_CACHE_DIR=config.get_optional_dir("build_cache"),
+    LIBSOURCE_DIRS=[
+        config.get_optional_dir("lib"),
+        join("$PROJECT_LIBDEPS_DIR", "$PIOENV"),
+        config.get_optional_dir("globallib"),
+    ],
+)
+
+if env.subst("$BUILD_CACHE_DIR"):
+    if not isdir(env.subst("$BUILD_CACHE_DIR")):
+        makedirs(env.subst("$BUILD_CACHE_DIR"))
+    env.CacheDir("$BUILD_CACHE_DIR")

 if int(ARGUMENTS.get("ISATTY", 0)):
     # pylint: disable=protected-access
     click._compat.isatty = lambda stream: True

-if env.GetOption('clean'):
+if env.GetOption("clean"):
     env.PioClean(env.subst("$BUILD_DIR"))
     env.Exit(0)
 elif not int(ARGUMENTS.get("PIOVERBOSE", 0)):
-    print("Verbose mode can be enabled via `-v, --verbose` option")
+    click.echo("Verbose mode can be enabled via `-v, --verbose` option")
+
+if not isdir(env.subst("$BUILD_DIR")):
+    makedirs(env.subst("$BUILD_DIR"))

 env.LoadProjectOptions()
 env.LoadPioPlatform()

 env.SConscriptChdir(0)
 env.SConsignFile(
-    join("$PROJECTBUILD_DIR",
-         ".sconsign.dblite" if PY2 else ".sconsign3.dblite"))
+    join("$BUILD_DIR", ".sconsign.py%d%d" % (sys.version_info[0], sys.version_info[1]))
+)

 for item in env.GetExtraScripts("pre"):
     env.SConscript(item, exports="env")

@@ -145,10 +170,13 @@ if env.get("SIZETOOL") and "nobuild" not in COMMAND_LINE_TARGETS:
     Default("checkprogsize")

 # Print configured protocols
-env.AddPreAction(["upload", "program"],
-                 env.VerboseAction(
-                     lambda source, target, env: env.PrintUploadInfo(),
-                     "Configuring upload protocol..."))
+env.AddPreAction(
+    ["upload", "program"],
+    env.VerboseAction(
+        lambda source, target, env: env.PrintUploadInfo(),
+        "Configuring upload protocol...",
+    ),
+)

 AlwaysBuild(env.Alias("debug", DEFAULT_TARGETS))
 AlwaysBuild(env.Alias("__test", DEFAULT_TARGETS))

@@ -156,12 +184,26 @@ AlwaysBuild(env.Alias("__test", DEFAULT_TARGETS))
 ##############################################################################

 if "envdump" in COMMAND_LINE_TARGETS:
-    print(env.Dump())
+    click.echo(env.Dump())
     env.Exit(0)

 if "idedata" in COMMAND_LINE_TARGETS:
     Import("projenv")
-    print("\n%s\n" % dump_json_to_unicode(
-        env.DumpIDEData(projenv)  # pylint: disable=undefined-variable
-    ))
+    click.echo(
+        "\n%s\n"
+        % dump_json_to_unicode(
+            projenv.DumpIDEData()  # pylint: disable=undefined-variable
+        )
+    )
     env.Exit(0)
+
+if "sizedata" in COMMAND_LINE_TARGETS:
+    AlwaysBuild(
+        env.Alias(
+            "sizedata",
+            DEFAULT_TARGETS,
+            env.VerboseAction(env.DumpSizeData, "Generating memory usage report..."),
+        )
+    )
+
+    Default("sizedata")
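The build script above renames the SCons construction variables (``$PROJECTSRC_DIR`` becomes ``$PROJECT_SRC_DIR``, ``$BUILDCACHE_DIR`` becomes ``$BUILD_CACHE_DIR``, and so on), resolving them from the project configuration instead of module-level helpers, while two legacy aliases are kept for existing dev-platforms. A minimal sketch of an ``extra_scripts`` entry that reads the new names (only the variable names come from this diff; the script itself is illustrative):

    # extra_script.py - attach in platformio.ini via: extra_scripts = post:extra_script.py
    Import("env")  # the SCons/PlatformIO construction environment

    # Resolve the renamed project/build locations, e.g. for logging or custom steps.
    print("source dir :", env.subst("$PROJECT_SRC_DIR"))
    print("build dir  :", env.subst("$BUILD_DIR"))
    print("build cache:", env.subst("$BUILD_CACHE_DIR") or "<not configured>")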


@@ -25,11 +25,11 @@ from platformio.managers.core import get_core_package_dir
 from platformio.proc import exec_command, where_is_program

-def _dump_includes(env, projenv):
+def _dump_includes(env):
     includes = []

-    for item in projenv.get("CPPPATH", []):
-        includes.append(projenv.subst(item))
+    for item in env.get("CPPPATH", []):
+        includes.append(env.subst(item))

     # installed libs
     for lb in env.GetLibBuilders():

@@ -45,7 +45,7 @@ def _dump_includes(env, projenv):
         join(toolchain_dir, "*", "include*"),
         join(toolchain_dir, "*", "include", "c++", "*"),
         join(toolchain_dir, "*", "include", "c++", "*", "*-*-*"),
-        join(toolchain_dir, "lib", "gcc", "*", "*", "include*")
+        join(toolchain_dir, "lib", "gcc", "*", "*", "include*"),
     ]
     for g in toolchain_incglobs:
         includes.extend(glob(g))

@@ -54,9 +54,7 @@ def _dump_includes(env, projenv):
     if unity_dir:
         includes.append(unity_dir)

-    includes.extend(
-        [env.subst("$PROJECTINCLUDE_DIR"),
-         env.subst("$PROJECTSRC_DIR")])
+    includes.extend([env.subst("$PROJECT_INCLUDE_DIR"), env.subst("$PROJECT_SRC_DIR")])

     # remove duplicates
     result = []

@@ -71,15 +69,15 @@ def _get_gcc_defines(env):
     items = []
     try:
         sysenv = environ.copy()
-        sysenv['PATH'] = str(env['ENV']['PATH'])
-        result = exec_command("echo | %s -dM -E -" % env.subst("$CC"),
-                              env=sysenv,
-                              shell=True)
+        sysenv["PATH"] = str(env["ENV"]["PATH"])
+        result = exec_command(
+            "echo | %s -dM -E -" % env.subst("$CC"), env=sysenv, shell=True
+        )
     except OSError:
         return items
-    if result['returncode'] != 0:
+    if result["returncode"] != 0:
         return items
-    for line in result['out'].split("\n"):
+    for line in result["out"].split("\n"):
         tokens = line.strip().split(" ", 2)
         if not tokens or tokens[0] != "#define":
             continue

@@ -94,17 +92,22 @@ def _dump_defines(env):
     defines = []
     # global symbols
     for item in processDefines(env.get("CPPDEFINES", [])):
-        defines.append(env.subst(item).replace('\\', ''))
+        defines.append(env.subst(item).replace("\\", ""))

     # special symbol for Atmel AVR MCU
-    if env['PIOPLATFORM'] == "atmelavr":
+    if env["PIOPLATFORM"] == "atmelavr":
         board_mcu = env.get("BOARD_MCU")
         if not board_mcu and "BOARD" in env:
             board_mcu = env.BoardConfig().get("build.mcu")
         if board_mcu:
             defines.append(
-                str("__AVR_%s__" % board_mcu.upper().replace(
-                    "ATMEGA", "ATmega").replace("ATTINY", "ATtiny")))
+                str(
+                    "__AVR_%s__"
+                    % board_mcu.upper()
+                    .replace("ATMEGA", "ATmega")
+                    .replace("ATTINY", "ATtiny")
+                )
+            )

     # built-in GCC marcos
     # if env.GetCompilerType() == "gcc":

@@ -135,38 +138,27 @@ def _get_svd_path(env):
     return None

-def DumpIDEData(env, projenv):
+def DumpIDEData(env):
     LINTCCOM = "$CFLAGS $CCFLAGS $CPPFLAGS"
     LINTCXXCOM = "$CXXFLAGS $CCFLAGS $CPPFLAGS"

     data = {
-        "env_name":
-        env['PIOENV'],
+        "env_name": env["PIOENV"],
         "libsource_dirs": [env.subst(l) for l in env.GetLibSourceDirs()],
-        "defines":
-        _dump_defines(env),
-        "includes":
-        _dump_includes(env, projenv),
-        "cc_flags":
-        env.subst(LINTCCOM),
-        "cxx_flags":
-        env.subst(LINTCXXCOM),
-        "cc_path":
-        where_is_program(env.subst("$CC"), env.subst("${ENV['PATH']}")),
-        "cxx_path":
-        where_is_program(env.subst("$CXX"), env.subst("${ENV['PATH']}")),
-        "gdb_path":
-        where_is_program(env.subst("$GDB"), env.subst("${ENV['PATH']}")),
-        "prog_path":
-        env.subst("$PROG_PATH"),
-        "flash_extra_images": [{
-            "offset": item[0],
-            "path": env.subst(item[1])
-        } for item in env.get("FLASH_EXTRA_IMAGES", [])],
-        "svd_path":
-        _get_svd_path(env),
-        "compiler_type":
-        env.GetCompilerType()
+        "defines": _dump_defines(env),
+        "includes": _dump_includes(env),
+        "cc_flags": env.subst(LINTCCOM),
+        "cxx_flags": env.subst(LINTCXXCOM),
+        "cc_path": where_is_program(env.subst("$CC"), env.subst("${ENV['PATH']}")),
+        "cxx_path": where_is_program(env.subst("$CXX"), env.subst("${ENV['PATH']}")),
+        "gdb_path": where_is_program(env.subst("$GDB"), env.subst("${ENV['PATH']}")),
+        "prog_path": env.subst("$PROG_PATH"),
+        "flash_extra_images": [
+            {"offset": item[0], "path": env.subst(item[1])}
+            for item in env.get("FLASH_EXTRA_IMAGES", [])
+        ],
+        "svd_path": _get_svd_path(env),
+        "compiler_type": env.GetCompilerType(),
     }

     env_ = env.Clone()

@@ -180,10 +172,7 @@ def DumpIDEData(env, projenv):
             _new_defines.append(item)
     env_.Replace(CPPDEFINES=_new_defines)

-    data.update({
-        "cc_flags": env_.subst(LINTCCOM),
-        "cxx_flags": env_.subst(LINTCXXCOM)
-    })
+    data.update({"cc_flags": env_.subst(LINTCCOM), "cxx_flags": env_.subst(LINTCXXCOM)})

     return data
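``_get_gcc_defines()`` above shells out to the toolchain to collect its built-in macros. The same trick works standalone; a small sketch assuming a host ``gcc`` on ``PATH`` (the function name is illustrative, while the command and the ``split(" ", 2)`` parsing mirror the code above):

    import subprocess

    def builtin_defines(cc="gcc"):
        # Ask the compiler to dump its predefined macros and parse the
        # "#define NAME VALUE" lines into a dict.
        out = subprocess.run(
            "echo | %s -dM -E -" % cc,
            shell=True, capture_output=True, text=True, check=True,
        ).stdout
        defines = {}
        for line in out.splitlines():
            tokens = line.strip().split(" ", 2)
            if len(tokens) < 2 or tokens[0] != "#define":
                continue
            defines[tokens[1]] = tokens[2] if len(tokens) > 2 else ""
        return defines

    print(builtin_defines().get("__VERSION__"))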


@@ -17,13 +17,11 @@
from __future__ import absolute_import from __future__ import absolute_import
import codecs
import hashlib import hashlib
import os import os
import re import re
import sys import sys
from os.path import (basename, commonprefix, expanduser, isdir, isfile, join, from os.path import basename, commonprefix, isdir, isfile, join, realpath, sep
realpath, sep)
import click import click
import SCons.Scanner # pylint: disable=import-error import SCons.Scanner # pylint: disable=import-error
@@ -33,13 +31,13 @@ from SCons.Script import DefaultEnvironment # pylint: disable=import-error
from platformio import exception, fs, util from platformio import exception, fs, util
from platformio.builder.tools import platformio as piotool from platformio.builder.tools import platformio as piotool
from platformio.compat import (WINDOWS, get_file_contents, hashlib_encode_data, from platformio.compat import WINDOWS, hashlib_encode_data, string_types
string_types)
from platformio.managers.lib import LibraryManager from platformio.managers.lib import LibraryManager
from platformio.package.manifest.parser import ManifestParserFactory
from platformio.project.options import ProjectOptions
class LibBuilderFactory(object): class LibBuilderFactory(object):
@staticmethod @staticmethod
def new(env, path, verbose=int(ARGUMENTS.get("PIOVERBOSE", 0))): def new(env, path, verbose=int(ARGUMENTS.get("PIOVERBOSE", 0))):
clsname = "UnknownLibBuilder" clsname = "UnknownLibBuilder"
@@ -47,31 +45,30 @@ class LibBuilderFactory(object):
clsname = "PlatformIOLibBuilder" clsname = "PlatformIOLibBuilder"
else: else:
used_frameworks = LibBuilderFactory.get_used_frameworks(env, path) used_frameworks = LibBuilderFactory.get_used_frameworks(env, path)
common_frameworks = (set(env.get("PIOFRAMEWORK", [])) common_frameworks = set(env.get("PIOFRAMEWORK", [])) & set(used_frameworks)
& set(used_frameworks))
if common_frameworks: if common_frameworks:
clsname = "%sLibBuilder" % list(common_frameworks)[0].title() clsname = "%sLibBuilder" % list(common_frameworks)[0].title()
elif used_frameworks: elif used_frameworks:
clsname = "%sLibBuilder" % used_frameworks[0].title() clsname = "%sLibBuilder" % used_frameworks[0].title()
obj = getattr(sys.modules[__name__], clsname)(env, obj = getattr(sys.modules[__name__], clsname)(env, path, verbose=verbose)
path,
verbose=verbose)
assert isinstance(obj, LibBuilderBase) assert isinstance(obj, LibBuilderBase)
return obj return obj
@staticmethod @staticmethod
def get_used_frameworks(env, path): def get_used_frameworks(env, path):
if any( if any(
isfile(join(path, fname)) isfile(join(path, fname))
for fname in ("library.properties", "keywords.txt")): for fname in ("library.properties", "keywords.txt")
):
return ["arduino"] return ["arduino"]
if isfile(join(path, "module.json")): if isfile(join(path, "module.json")):
return ["mbed"] return ["mbed"]
include_re = re.compile(r'^#include\s+(<|")(Arduino|mbed)\.h(<|")', include_re = re.compile(
flags=re.MULTILINE) r'^#include\s+(<|")(Arduino|mbed)\.h(<|")', flags=re.MULTILINE
)
# check source files # check source files
for root, _, files in os.walk(path, followlinks=True): for root, _, files in os.walk(path, followlinks=True):
@@ -79,9 +76,10 @@ class LibBuilderFactory(object):
return ["mbed"] return ["mbed"]
for fname in files: for fname in files:
if not fs.path_endswith_ext( if not fs.path_endswith_ext(
fname, piotool.SRC_BUILD_EXT + piotool.SRC_HEADER_EXT): fname, piotool.SRC_BUILD_EXT + piotool.SRC_HEADER_EXT
):
continue continue
content = get_file_contents(join(root, fname)) content = fs.get_file_contents(join(root, fname))
if not content: if not content:
continue continue
if "Arduino.h" in content and include_re.search(content): if "Arduino.h" in content and include_re.search(content):
@@ -93,12 +91,6 @@ class LibBuilderFactory(object):
class LibBuilderBase(object): class LibBuilderBase(object):
LDF_MODES = ["off", "chain", "deep", "chain+", "deep+"]
LDF_MODE_DEFAULT = "chain"
COMPAT_MODES = ["off", "soft", "strict"]
COMPAT_MODE_DEFAULT = "soft"
CLASSIC_SCANNER = SCons.Scanner.C.CScanner() CLASSIC_SCANNER = SCons.Scanner.C.CScanner()
CCONDITIONAL_SCANNER = SCons.Scanner.C.CConditionalScanner() CCONDITIONAL_SCANNER = SCons.Scanner.C.CConditionalScanner()
# Max depth of nested includes: # Max depth of nested includes:
@@ -124,7 +116,7 @@ class LibBuilderBase(object):
self._processed_files = list() self._processed_files = list()
# reset source filter, could be overridden with extra script # reset source filter, could be overridden with extra script
self.env['SRC_FILTER'] = "" self.env["SRC_FILTER"] = ""
# process extra options and append to build environment # process extra options and append to build environment
self.process_extra_options() self.process_extra_options()
@@ -153,7 +145,8 @@ class LibBuilderBase(object):
@property @property
def dependencies(self): def dependencies(self):
return LibraryManager.normalize_dependencies( return LibraryManager.normalize_dependencies(
self._manifest.get("dependencies", [])) self._manifest.get("dependencies", [])
)
@property @property
def src_filter(self): def src_filter(self):
@@ -161,7 +154,7 @@ class LibBuilderBase(object):
"-<example%s>" % os.sep, "-<example%s>" % os.sep,
"-<examples%s>" % os.sep, "-<examples%s>" % os.sep,
"-<test%s>" % os.sep, "-<test%s>" % os.sep,
"-<tests%s>" % os.sep "-<tests%s>" % os.sep,
] ]
@property @property
@@ -172,8 +165,7 @@ class LibBuilderBase(object):
@property @property
def src_dir(self): def src_dir(self):
return (join(self.path, "src") return join(self.path, "src") if isdir(join(self.path, "src")) else self.path
if isdir(join(self.path, "src")) else self.path)
def get_include_dirs(self): def get_include_dirs(self):
items = [] items = []
@@ -214,40 +206,41 @@ class LibBuilderBase(object):
@property @property
def lib_archive(self): def lib_archive(self):
return self.env.GetProjectOption("lib_archive", True) return self.env.GetProjectOption("lib_archive")
@property @property
def lib_ldf_mode(self): def lib_ldf_mode(self):
return self.env.GetProjectOption("lib_ldf_mode", self.LDF_MODE_DEFAULT) return self.env.GetProjectOption("lib_ldf_mode")
@staticmethod @staticmethod
def validate_ldf_mode(mode): def validate_ldf_mode(mode):
ldf_modes = ProjectOptions["env.lib_ldf_mode"].type.choices
if isinstance(mode, string_types): if isinstance(mode, string_types):
mode = mode.strip().lower() mode = mode.strip().lower()
if mode in LibBuilderBase.LDF_MODES: if mode in ldf_modes:
return mode return mode
try: try:
return LibBuilderBase.LDF_MODES[int(mode)] return ldf_modes[int(mode)]
except (IndexError, ValueError): except (IndexError, ValueError):
pass pass
return LibBuilderBase.LDF_MODE_DEFAULT return ProjectOptions["env.lib_ldf_mode"].default
@property @property
def lib_compat_mode(self): def lib_compat_mode(self):
return self.env.GetProjectOption("lib_compat_mode", return self.env.GetProjectOption("lib_compat_mode")
self.COMPAT_MODE_DEFAULT)
@staticmethod @staticmethod
def validate_compat_mode(mode): def validate_compat_mode(mode):
compat_modes = ProjectOptions["env.lib_compat_mode"].type.choices
if isinstance(mode, string_types): if isinstance(mode, string_types):
mode = mode.strip().lower() mode = mode.strip().lower()
if mode in LibBuilderBase.COMPAT_MODES: if mode in compat_modes:
return mode return mode
try: try:
return LibBuilderBase.COMPAT_MODES[int(mode)] return compat_modes[int(mode)]
except (IndexError, ValueError): except (IndexError, ValueError):
pass pass
return LibBuilderBase.COMPAT_MODE_DEFAULT return ProjectOptions["env.lib_compat_mode"].default
def is_platforms_compatible(self, platforms): def is_platforms_compatible(self, platforms):
return True return True
@@ -263,11 +256,10 @@ class LibBuilderBase(object):
self.env.ProcessFlags(self.build_flags) self.env.ProcessFlags(self.build_flags)
if self.extra_script: if self.extra_script:
self.env.SConscriptChdir(1) self.env.SConscriptChdir(1)
self.env.SConscript(realpath(self.extra_script), self.env.SConscript(
exports={ realpath(self.extra_script),
"env": self.env, exports={"env": self.env, "pio_lib_builder": self},
"pio_lib_builder": self )
})
self.env.ProcessUnFlags(self.build_unflags) self.env.ProcessUnFlags(self.build_unflags)
def process_dependencies(self): def process_dependencies(self):
@@ -276,7 +268,7 @@ class LibBuilderBase(object):
for item in self.dependencies: for item in self.dependencies:
found = False found = False
for lb in self.env.GetLibBuilders(): for lb in self.env.GetLibBuilders():
if item['name'] != lb.name: if item["name"] != lb.name:
continue continue
found = True found = True
if lb not in self.depbuilders: if lb not in self.depbuilders:
@@ -284,37 +276,48 @@ class LibBuilderBase(object):
break break
if not found and self.verbose: if not found and self.verbose:
sys.stderr.write("Warning: Ignored `%s` dependency for `%s` " sys.stderr.write(
"library\n" % (item['name'], self.name)) "Warning: Ignored `%s` dependency for `%s` "
"library\n" % (item["name"], self.name)
)
def get_search_files(self): def get_search_files(self):
items = [ items = [
join(self.src_dir, item) for item in self.env.MatchSourceFiles( join(self.src_dir, item)
self.src_dir, self.src_filter) for item in self.env.MatchSourceFiles(self.src_dir, self.src_filter)
] ]
include_dir = self.include_dir include_dir = self.include_dir
if include_dir: if include_dir:
items.extend([ items.extend(
join(include_dir, item) [
for item in self.env.MatchSourceFiles(include_dir) join(include_dir, item)
]) for item in self.env.MatchSourceFiles(include_dir)
]
)
return items return items
def _get_found_includes( # pylint: disable=too-many-branches def _get_found_includes( # pylint: disable=too-many-branches
self, search_files=None): self, search_files=None
):
# all include directories # all include directories
if not LibBuilderBase._INCLUDE_DIRS_CACHE: if not LibBuilderBase._INCLUDE_DIRS_CACHE:
LibBuilderBase._INCLUDE_DIRS_CACHE = [] LibBuilderBase._INCLUDE_DIRS_CACHE = [
self.env.Dir(d)
for d in ProjectAsLibBuilder(
self.envorigin, "$PROJECT_DIR"
).get_include_dirs()
]
for lb in self.env.GetLibBuilders(): for lb in self.env.GetLibBuilders():
LibBuilderBase._INCLUDE_DIRS_CACHE.extend( LibBuilderBase._INCLUDE_DIRS_CACHE.extend(
[self.env.Dir(d) for d in lb.get_include_dirs()]) [self.env.Dir(d) for d in lb.get_include_dirs()]
)
# append self include directories # append self include directories
include_dirs = [self.env.Dir(d) for d in self.get_include_dirs()] include_dirs = [self.env.Dir(d) for d in self.get_include_dirs()]
include_dirs.extend(LibBuilderBase._INCLUDE_DIRS_CACHE) include_dirs.extend(LibBuilderBase._INCLUDE_DIRS_CACHE)
result = [] result = []
for path in (search_files or []): for path in search_files or []:
if path in self._processed_files: if path in self._processed_files:
continue continue
self._processed_files.append(path) self._processed_files.append(path)
@@ -325,21 +328,27 @@ class LibBuilderBase(object):
self.env.File(path), self.env.File(path),
self.env, self.env,
tuple(include_dirs), tuple(include_dirs),
depth=self.CCONDITIONAL_SCANNER_DEPTH) depth=self.CCONDITIONAL_SCANNER_DEPTH,
)
# mark candidates already processed via Conditional Scanner # mark candidates already processed via Conditional Scanner
self._processed_files.extend([ self._processed_files.extend(
c.get_abspath() for c in candidates [
if c.get_abspath() not in self._processed_files c.get_abspath()
]) for c in candidates
if c.get_abspath() not in self._processed_files
]
)
except Exception as e: # pylint: disable=broad-except except Exception as e: # pylint: disable=broad-except
if self.verbose and "+" in self.lib_ldf_mode: if self.verbose and "+" in self.lib_ldf_mode:
sys.stderr.write( sys.stderr.write(
"Warning! Classic Pre Processor is used for `%s`, " "Warning! Classic Pre Processor is used for `%s`, "
"advanced has failed with `%s`\n" % (path, e)) "advanced has failed with `%s`\n" % (path, e)
)
candidates = LibBuilderBase.CLASSIC_SCANNER( candidates = LibBuilderBase.CLASSIC_SCANNER(
self.env.File(path), self.env, tuple(include_dirs)) self.env.File(path), self.env, tuple(include_dirs)
)
# print(path, map(lambda n: n.get_abspath(), candidates)) # print(path, [c.get_abspath() for c in candidates])
for item in candidates: for item in candidates:
if item not in result: if item not in result:
result.append(item) result.append(item)
@@ -348,7 +357,7 @@ class LibBuilderBase(object):
_h_path = item.get_abspath() _h_path = item.get_abspath()
if not fs.path_endswith_ext(_h_path, piotool.SRC_HEADER_EXT): if not fs.path_endswith_ext(_h_path, piotool.SRC_HEADER_EXT):
continue continue
_f_part = _h_path[:_h_path.rindex(".")] _f_part = _h_path[: _h_path.rindex(".")]
for ext in piotool.SRC_C_EXT: for ext in piotool.SRC_C_EXT:
if not isfile("%s.%s" % (_f_part, ext)): if not isfile("%s.%s" % (_f_part, ext)):
continue continue
@@ -359,7 +368,6 @@ class LibBuilderBase(object):
return result return result
def depend_recursive(self, lb, search_files=None): def depend_recursive(self, lb, search_files=None):
def _already_depends(_lb): def _already_depends(_lb):
if self in _lb.depbuilders: if self in _lb.depbuilders:
return True return True
@@ -372,9 +380,10 @@ class LibBuilderBase(object):
if self != lb: if self != lb:
if _already_depends(lb): if _already_depends(lb):
if self.verbose: if self.verbose:
sys.stderr.write("Warning! Circular dependencies detected " sys.stderr.write(
"between `%s` and `%s`\n" % "Warning! Circular dependencies detected "
(self.path, lb.path)) "between `%s` and `%s`\n" % (self.path, lb.path)
)
self._circular_deps.append(lb) self._circular_deps.append(lb)
elif lb not in self._depbuilders: elif lb not in self._depbuilders:
self._depbuilders.append(lb) self._depbuilders.append(lb)
@@ -431,11 +440,10 @@ class LibBuilderBase(object):
if self.lib_archive: if self.lib_archive:
libs.append( libs.append(
self.env.BuildLibrary(self.build_dir, self.src_dir, self.env.BuildLibrary(self.build_dir, self.src_dir, self.src_filter)
self.src_filter)) )
else: else:
self.env.BuildSources(self.build_dir, self.src_dir, self.env.BuildSources(self.build_dir, self.src_dir, self.src_filter)
self.src_filter)
return libs return libs
@@ -444,19 +452,11 @@ class UnknownLibBuilder(LibBuilderBase):
class ArduinoLibBuilder(LibBuilderBase): class ArduinoLibBuilder(LibBuilderBase):
def load_manifest(self): def load_manifest(self):
manifest = {}
if not isfile(join(self.path, "library.properties")):
return manifest
manifest_path = join(self.path, "library.properties") manifest_path = join(self.path, "library.properties")
with codecs.open(manifest_path, encoding="utf-8") as fp: if not isfile(manifest_path):
for line in fp.readlines(): return {}
if "=" not in line: return ManifestParserFactory.new_from_file(manifest_path).as_dict()
continue
key, value = line.split("=", 1)
manifest[key.strip()] = value.strip()
return manifest
def get_include_dirs(self): def get_include_dirs(self):
include_dirs = LibBuilderBase.get_include_dirs(self) include_dirs = LibBuilderBase.get_include_dirs(self)
@@ -500,35 +500,18 @@ class ArduinoLibBuilder(LibBuilderBase):
return util.items_in_list(frameworks, ["arduino", "energia"]) return util.items_in_list(frameworks, ["arduino", "energia"])
def is_platforms_compatible(self, platforms): def is_platforms_compatible(self, platforms):
platforms_map = { items = self._manifest.get("platforms", [])
"avr": ["atmelavr"],
"sam": ["atmelsam"],
"samd": ["atmelsam"],
"esp8266": ["espressif8266"],
"esp32": ["espressif32"],
"arc32": ["intel_arc32"],
"stm32": ["ststm32"],
"nrf5": ["nordicnrf51", "nordicnrf52"]
}
items = []
for arch in self._manifest.get("architectures", "").split(","):
arch = arch.strip().lower()
if arch == "*":
items = "*"
break
if arch in platforms_map:
items.extend(platforms_map[arch])
if not items: if not items:
return LibBuilderBase.is_platforms_compatible(self, platforms) return LibBuilderBase.is_platforms_compatible(self, platforms)
return util.items_in_list(platforms, items) return util.items_in_list(platforms, items)
class MbedLibBuilder(LibBuilderBase): class MbedLibBuilder(LibBuilderBase):
def load_manifest(self): def load_manifest(self):
if not isfile(join(self.path, "module.json")): manifest_path = join(self.path, "module.json")
if not isfile(manifest_path):
return {} return {}
return fs.load_json(join(self.path, "module.json")) return ManifestParserFactory.new_from_file(manifest_path).as_dict()
@property @property
def include_dir(self): def include_dir(self):
@@ -583,8 +566,7 @@ class MbedLibBuilder(LibBuilderBase):
mbed_config_path = join(self.env.subst(p), "mbed_config.h") mbed_config_path = join(self.env.subst(p), "mbed_config.h")
if isfile(mbed_config_path): if isfile(mbed_config_path):
break break
else: mbed_config_path = None
mbed_config_path = None
if not mbed_config_path: if not mbed_config_path:
return None return None
@@ -611,14 +593,15 @@ class MbedLibBuilder(LibBuilderBase):
# default macros # default macros
for macro in manifest.get("macros", []): for macro in manifest.get("macros", []):
macro = self._mbed_normalize_macro(macro) macro = self._mbed_normalize_macro(macro)
macros[macro['name']] = macro macros[macro["name"]] = macro
# configuration items # configuration items
for key, options in manifest.get("config", {}).items(): for key, options in manifest.get("config", {}).items():
if "value" not in options: if "value" not in options:
continue continue
macros[key] = dict(name=options.get("macro_name"), macros[key] = dict(
value=options.get("value")) name=options.get("macro_name"), value=options.get("value")
)
# overrode items per target # overrode items per target
for target, options in manifest.get("target_overrides", {}).items(): for target, options in manifest.get("target_overrides", {}).items():
@@ -626,25 +609,23 @@ class MbedLibBuilder(LibBuilderBase):
continue continue
for macro in options.get("target.macros_add", []): for macro in options.get("target.macros_add", []):
macro = self._mbed_normalize_macro(macro) macro = self._mbed_normalize_macro(macro)
macros[macro['name']] = macro macros[macro["name"]] = macro
for key, value in options.items(): for key, value in options.items():
if not key.startswith("target.") and key in macros: if not key.startswith("target.") and key in macros:
macros[key]['value'] = value macros[key]["value"] = value
# normalize macro names # normalize macro names
for key, macro in macros.items(): for key, macro in macros.items():
if not macro['name']: if not macro["name"]:
macro['name'] = key macro["name"] = key
if "." not in macro['name']: if "." not in macro["name"]:
macro['name'] = "%s.%s" % (manifest.get("name"), macro["name"] = "%s.%s" % (manifest.get("name"), macro["name"])
macro['name']) macro["name"] = re.sub(
macro['name'] = re.sub(r"[^a-z\d]+", r"[^a-z\d]+", "_", macro["name"], flags=re.I
"_", ).upper()
macro['name'], macro["name"] = "MBED_CONF_" + macro["name"]
flags=re.I).upper() if isinstance(macro["value"], bool):
macro['name'] = "MBED_CONF_" + macro['name'] macro["value"] = 1 if macro["value"] else 0
if isinstance(macro['value'], bool):
macro['value'] = 1 if macro['value'] else 0
return {macro["name"]: macro["value"] for macro in macros.values()} return {macro["name"]: macro["value"] for macro in macros.values()}
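To make the normalization above concrete: each config key from `module.json` is prefixed with the module name when it has no dot, squashed to `[A-Z0-9_]`, and given an `MBED_CONF_` prefix. A tiny sketch with a hypothetical helper and sample values:

# --- sketch: mbed config macro name normalization (illustrative only) ---
import re

def normalize_mbed_macro_name(module_name, key):
    name = key if "." in key else "%s.%s" % (module_name, key)
    name = re.sub(r"[^a-z\d]+", "_", name, flags=re.I).upper()
    return "MBED_CONF_" + name

print(normalize_mbed_macro_name("my-lib", "buffer-size"))
# MBED_CONF_MY_LIB_BUFFER_SIZE
# --- end sketch ---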
@@ -654,13 +635,13 @@ class MbedLibBuilder(LibBuilderBase):
for line in fp.readlines(): for line in fp.readlines():
line = line.strip() line = line.strip()
if line == "#endif": if line == "#endif":
lines.append( lines.append("// PlatformIO Library Dependency Finder (LDF)")
"// PlatformIO Library Dependency Finder (LDF)") lines.extend(
lines.extend([ [
"#define %s %s" % "#define %s %s" % (name, value if value is not None else "")
(name, value if value is not None else "") for name, value in macros.items()
for name, value in macros.items() ]
]) )
lines.append("") lines.append("")
if not line.startswith("#define"): if not line.startswith("#define"):
lines.append(line) lines.append(line)
@@ -674,22 +655,13 @@ class MbedLibBuilder(LibBuilderBase):
class PlatformIOLibBuilder(LibBuilderBase): class PlatformIOLibBuilder(LibBuilderBase):
def load_manifest(self): def load_manifest(self):
assert isfile(join(self.path, "library.json")) manifest_path = join(self.path, "library.json")
manifest = fs.load_json(join(self.path, "library.json")) if not isfile(manifest_path):
assert "name" in manifest return {}
return ManifestParserFactory.new_from_file(manifest_path).as_dict()
# replace "espressif" old name dev/platform with ESP8266 def _has_arduino_manifest(self):
if "platforms" in manifest:
manifest['platforms'] = [
"espressif8266" if p == "espressif" else p
for p in util.items_to_list(manifest['platforms'])
]
return manifest
def _is_arduino_manifest(self):
return isfile(join(self.path, "library.properties")) return isfile(join(self.path, "library.properties"))
@property @property
@@ -710,9 +682,9 @@ class PlatformIOLibBuilder(LibBuilderBase):
def src_filter(self): def src_filter(self):
if "srcFilter" in self._manifest.get("build", {}): if "srcFilter" in self._manifest.get("build", {}):
return self._manifest.get("build").get("srcFilter") return self._manifest.get("build").get("srcFilter")
if self.env['SRC_FILTER']: if self.env["SRC_FILTER"]:
return self.env['SRC_FILTER'] return self.env["SRC_FILTER"]
if self._is_arduino_manifest(): if self._has_arduino_manifest():
return ArduinoLibBuilder.src_filter.fget(self) return ArduinoLibBuilder.src_filter.fget(self)
return LibBuilderBase.src_filter.fget(self) return LibBuilderBase.src_filter.fget(self)
@@ -736,11 +708,13 @@ class PlatformIOLibBuilder(LibBuilderBase):
@property @property
def lib_archive(self): def lib_archive(self):
global_value = self.env.GetProjectOption("lib_archive") unique_value = "_not_declared_%s" % id(self)
if global_value is not None: global_value = self.env.GetProjectOption("lib_archive", unique_value)
if global_value != unique_value:
return global_value return global_value
return self._manifest.get("build", {}).get( return self._manifest.get("build", {}).get(
"libArchive", LibBuilderBase.lib_archive.fget(self)) "libArchive", LibBuilderBase.lib_archive.fget(self)
)
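The reworked `lib_archive` getter passes a per-instance sentinel as the default so that an explicit but falsy `lib_archive = no` in `platformio.ini` still wins over the manifest's `libArchive`. A generic sketch of the sentinel-default pattern (names are hypothetical):

# --- sketch: sentinel default to tell "unset" apart from "set to a falsy value" ---
_UNSET = object()  # unique object, never equal to any real option value

def resolve_option(project_value=_UNSET, manifest_value=None, default=True):
    if project_value is not _UNSET:   # declared in the project config, even if falsy
        return project_value
    if manifest_value is not None:    # fall back to the library manifest
        return manifest_value
    return default

print(resolve_option(project_value=False, manifest_value=True))  # False
print(resolve_option(manifest_value=False))                      # False
print(resolve_option())                                          # True
# --- end sketch ---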
@property @property
def lib_ldf_mode(self): def lib_ldf_mode(self):
@@ -748,7 +722,10 @@ class PlatformIOLibBuilder(LibBuilderBase):
self.env.GetProjectOption( self.env.GetProjectOption(
"lib_ldf_mode", "lib_ldf_mode",
self._manifest.get("build", {}).get( self._manifest.get("build", {}).get(
"libLDFMode", LibBuilderBase.lib_ldf_mode.fget(self)))) "libLDFMode", LibBuilderBase.lib_ldf_mode.fget(self)
),
)
)
@property @property
def lib_compat_mode(self): def lib_compat_mode(self):
@@ -756,8 +733,10 @@ class PlatformIOLibBuilder(LibBuilderBase):
self.env.GetProjectOption( self.env.GetProjectOption(
"lib_compat_mode", "lib_compat_mode",
self._manifest.get("build", {}).get( self._manifest.get("build", {}).get(
"libCompatMode", "libCompatMode", LibBuilderBase.lib_compat_mode.fget(self)
LibBuilderBase.lib_compat_mode.fget(self)))) ),
)
)
def is_platforms_compatible(self, platforms): def is_platforms_compatible(self, platforms):
items = self._manifest.get("platforms") items = self._manifest.get("platforms")
@@ -775,9 +754,12 @@ class PlatformIOLibBuilder(LibBuilderBase):
include_dirs = LibBuilderBase.get_include_dirs(self) include_dirs = LibBuilderBase.get_include_dirs(self)
# backwards compatibility with PlatformIO 2.0 # backwards compatibility with PlatformIO 2.0
if ("build" not in self._manifest and self._is_arduino_manifest() if (
and not isdir(join(self.path, "src")) "build" not in self._manifest
and isdir(join(self.path, "utility"))): and self._has_arduino_manifest()
and not isdir(join(self.path, "src"))
and isdir(join(self.path, "utility"))
):
include_dirs.append(join(self.path, "utility")) include_dirs.append(join(self.path, "utility"))
for path in self.env.get("CPPPATH", []): for path in self.env.get("CPPPATH", []):
@@ -788,25 +770,24 @@ class PlatformIOLibBuilder(LibBuilderBase):
class ProjectAsLibBuilder(LibBuilderBase): class ProjectAsLibBuilder(LibBuilderBase):
def __init__(self, env, *args, **kwargs): def __init__(self, env, *args, **kwargs):
# backup original value, will be reset in base.__init__ # backup original value, will be reset in base.__init__
project_src_filter = env.get("SRC_FILTER") project_src_filter = env.get("SRC_FILTER")
super(ProjectAsLibBuilder, self).__init__(env, *args, **kwargs) super(ProjectAsLibBuilder, self).__init__(env, *args, **kwargs)
self.env['SRC_FILTER'] = project_src_filter self.env["SRC_FILTER"] = project_src_filter
@property @property
def include_dir(self): def include_dir(self):
include_dir = self.env.subst("$PROJECTINCLUDE_DIR") include_dir = self.env.subst("$PROJECT_INCLUDE_DIR")
return include_dir if isdir(include_dir) else None return include_dir if isdir(include_dir) else None
@property @property
def src_dir(self): def src_dir(self):
return self.env.subst("$PROJECTSRC_DIR") return self.env.subst("$PROJECT_SRC_DIR")
def get_include_dirs(self): def get_include_dirs(self):
include_dirs = [] include_dirs = []
project_include_dir = self.env.subst("$PROJECTINCLUDE_DIR") project_include_dir = self.env.subst("$PROJECT_INCLUDE_DIR")
if isdir(project_include_dir): if isdir(project_include_dir):
include_dirs.append(project_include_dir) include_dirs.append(project_include_dir)
for include_dir in LibBuilderBase.get_include_dirs(self): for include_dir in LibBuilderBase.get_include_dirs(self):
@@ -819,11 +800,14 @@ class ProjectAsLibBuilder(LibBuilderBase):
items = LibBuilderBase.get_search_files(self) items = LibBuilderBase.get_search_files(self)
# test files # test files
if "__test" in COMMAND_LINE_TARGETS: if "__test" in COMMAND_LINE_TARGETS:
items.extend([ items.extend(
join("$PROJECTTEST_DIR", [
item) for item in self.env.MatchSourceFiles( join("$PROJECT_TEST_DIR", item)
"$PROJECTTEST_DIR", "$PIOTEST_SRC_FILTER") for item in self.env.MatchSourceFiles(
]) "$PROJECT_TEST_DIR", "$PIOTEST_SRC_FILTER"
)
]
)
return items return items
@property @property
@@ -836,8 +820,7 @@ class ProjectAsLibBuilder(LibBuilderBase):
@property @property
def src_filter(self): def src_filter(self):
return (self.env.get("SRC_FILTER") return self.env.get("SRC_FILTER") or LibBuilderBase.src_filter.fget(self)
or LibBuilderBase.src_filter.fget(self))
@property @property
def dependencies(self): def dependencies(self):
@@ -848,7 +831,6 @@ class ProjectAsLibBuilder(LibBuilderBase):
pass pass
def install_dependencies(self): def install_dependencies(self):
def _is_builtin(uri): def _is_builtin(uri):
for lb in self.env.GetLibBuilders(): for lb in self.env.GetLibBuilders():
if lb.name == uri: if lb.name == uri:
@@ -871,8 +853,7 @@ class ProjectAsLibBuilder(LibBuilderBase):
not_found_uri.append(uri) not_found_uri.append(uri)
did_install = False did_install = False
lm = LibraryManager( lm = LibraryManager(self.env.subst(join("$PROJECT_LIBDEPS_DIR", "$PIOENV")))
self.env.subst(join("$PROJECTLIBDEPS_DIR", "$PIOENV")))
for uri in not_found_uri: for uri in not_found_uri:
try: try:
lm.install(uri) lm.install(uri)
@@ -923,28 +904,26 @@ class ProjectAsLibBuilder(LibBuilderBase):
def GetLibSourceDirs(env): def GetLibSourceDirs(env):
items = env.GetProjectOption("lib_extra_dirs", []) items = env.GetProjectOption("lib_extra_dirs", [])
items.extend(env['LIBSOURCE_DIRS']) items.extend(env["LIBSOURCE_DIRS"])
return [ return [
env.subst(expanduser(item) if item.startswith("~") else item) env.subst(fs.expanduser(item) if item.startswith("~") else item)
for item in items for item in items
] ]
def IsCompatibleLibBuilder(env, def IsCompatibleLibBuilder(env, lb, verbose=int(ARGUMENTS.get("PIOVERBOSE", 0))):
lb,
verbose=int(ARGUMENTS.get("PIOVERBOSE", 0))):
compat_mode = lb.lib_compat_mode compat_mode = lb.lib_compat_mode
if lb.name in env.GetProjectOption("lib_ignore", []): if lb.name in env.GetProjectOption("lib_ignore", []):
if verbose: if verbose:
sys.stderr.write("Ignored library %s\n" % lb.path) sys.stderr.write("Ignored library %s\n" % lb.path)
return None return None
if compat_mode == "strict" and not lb.is_platforms_compatible( if compat_mode == "strict" and not lb.is_platforms_compatible(env["PIOPLATFORM"]):
env['PIOPLATFORM']):
if verbose: if verbose:
sys.stderr.write("Platform incompatible library %s\n" % lb.path) sys.stderr.write("Platform incompatible library %s\n" % lb.path)
return False return False
if (compat_mode in ("soft", "strict") and "PIOFRAMEWORK" in env if compat_mode in ("soft", "strict") and not lb.is_frameworks_compatible(
and not lb.is_frameworks_compatible(env.get("PIOFRAMEWORK", []))): env.get("PIOFRAMEWORK", [])
):
if verbose: if verbose:
sys.stderr.write("Framework incompatible library %s\n" % lb.path) sys.stderr.write("Framework incompatible library %s\n" % lb.path)
return False return False
@@ -953,8 +932,10 @@ def IsCompatibleLibBuilder(env,
def GetLibBuilders(env): # pylint: disable=too-many-branches def GetLibBuilders(env): # pylint: disable=too-many-branches
if DefaultEnvironment().get("__PIO_LIB_BUILDERS", None) is not None: if DefaultEnvironment().get("__PIO_LIB_BUILDERS", None) is not None:
return sorted(DefaultEnvironment()['__PIO_LIB_BUILDERS'], return sorted(
key=lambda lb: 0 if lb.dependent else 1) DefaultEnvironment()["__PIO_LIB_BUILDERS"],
key=lambda lb: 0 if lb.dependent else 1,
)
DefaultEnvironment().Replace(__PIO_LIB_BUILDERS=[]) DefaultEnvironment().Replace(__PIO_LIB_BUILDERS=[])
@@ -974,7 +955,8 @@ def GetLibBuilders(env): # pylint: disable=too-many-branches
except exception.InvalidJSONFile: except exception.InvalidJSONFile:
if verbose: if verbose:
sys.stderr.write( sys.stderr.write(
"Skip library with broken manifest: %s\n" % lib_dir) "Skip library with broken manifest: %s\n" % lib_dir
)
continue continue
if env.IsCompatibleLibBuilder(lb): if env.IsCompatibleLibBuilder(lb):
DefaultEnvironment().Append(__PIO_LIB_BUILDERS=[lb]) DefaultEnvironment().Append(__PIO_LIB_BUILDERS=[lb])
@@ -989,15 +971,15 @@ def GetLibBuilders(env): # pylint: disable=too-many-branches
if verbose and found_incompat: if verbose and found_incompat:
sys.stderr.write( sys.stderr.write(
"More details about \"Library Compatibility Mode\": " 'More details about "Library Compatibility Mode": '
"https://docs.platformio.org/page/librarymanager/ldf.html#" "https://docs.platformio.org/page/librarymanager/ldf.html#"
"ldf-compat-mode\n") "ldf-compat-mode\n"
)
return DefaultEnvironment()['__PIO_LIB_BUILDERS'] return DefaultEnvironment()["__PIO_LIB_BUILDERS"]
def ConfigureProjectLibBuilder(env): def ConfigureProjectLibBuilder(env):
def _get_vcs_info(lb): def _get_vcs_info(lb):
path = LibraryManager.get_src_manifest_path(lb.path) path = LibraryManager.get_src_manifest_path(lb.path)
return fs.load_json(path) if path else None return fs.load_json(path) if path else None
@@ -1022,40 +1004,42 @@ def ConfigureProjectLibBuilder(env):
title += " %s" % lb.version title += " %s" % lb.version
if vcs_info and vcs_info.get("version"): if vcs_info and vcs_info.get("version"):
title += " #%s" % vcs_info.get("version") title += " #%s" % vcs_info.get("version")
sys.stdout.write("%s|-- %s" % (margin, title)) click.echo("%s|-- %s" % (margin, title), nl=False)
if int(ARGUMENTS.get("PIOVERBOSE", 0)): if int(ARGUMENTS.get("PIOVERBOSE", 0)):
if vcs_info: if vcs_info:
sys.stdout.write(" [%s]" % vcs_info.get("url")) click.echo(" [%s]" % vcs_info.get("url"), nl=False)
sys.stdout.write(" (") click.echo(" (", nl=False)
sys.stdout.write(lb.path) click.echo(lb.path, nl=False)
sys.stdout.write(")") click.echo(")", nl=False)
sys.stdout.write("\n") click.echo("")
if lb.depbuilders: if lb.depbuilders:
_print_deps_tree(lb, level + 1) _print_deps_tree(lb, level + 1)
project = ProjectAsLibBuilder(env, "$PROJECT_DIR") project = ProjectAsLibBuilder(env, "$PROJECT_DIR")
ldf_mode = LibBuilderBase.lib_ldf_mode.fget(project) ldf_mode = LibBuilderBase.lib_ldf_mode.fget(project)
print("LDF: Library Dependency Finder -> http://bit.ly/configure-pio-ldf") click.echo("LDF: Library Dependency Finder -> http://bit.ly/configure-pio-ldf")
print("LDF Modes: Finder ~ %s, Compatibility ~ %s" % click.echo(
(ldf_mode, project.lib_compat_mode)) "LDF Modes: Finder ~ %s, Compatibility ~ %s"
% (ldf_mode, project.lib_compat_mode)
)
project.install_dependencies() project.install_dependencies()
lib_builders = env.GetLibBuilders() lib_builders = env.GetLibBuilders()
print("Found %d compatible libraries" % len(lib_builders)) click.echo("Found %d compatible libraries" % len(lib_builders))
print("Scanning dependencies...") click.echo("Scanning dependencies...")
project.search_deps_recursive() project.search_deps_recursive()
if ldf_mode.startswith("chain") and project.depbuilders: if ldf_mode.startswith("chain") and project.depbuilders:
_correct_found_libs(lib_builders) _correct_found_libs(lib_builders)
if project.depbuilders: if project.depbuilders:
print("Dependency Graph") click.echo("Dependency Graph")
_print_deps_tree(project) _print_deps_tree(project)
else: else:
print("No dependencies") click.echo("No dependencies")
return project return project
View File
@@ -18,16 +18,17 @@ from hashlib import md5
from os import makedirs from os import makedirs
from os.path import isdir, isfile, join from os.path import isdir, isfile, join
from platformio import fs
from platformio.compat import WINDOWS, hashlib_encode_data from platformio.compat import WINDOWS, hashlib_encode_data
# Windows CLI has limit with command length to 8192 # Windows CLI has limit with command length to 8192
# Leave 2000 chars for flags and other options # Leave 2000 chars for flags and other options
MAX_SOURCES_LENGTH = 6000 MAX_LINE_LENGTH = 6000 if WINDOWS else 128072
def long_sources_hook(env, sources): def long_sources_hook(env, sources):
_sources = str(sources).replace("\\", "/") _sources = str(sources).replace("\\", "/")
if len(str(_sources)) < MAX_SOURCES_LENGTH: if len(str(_sources)) < MAX_LINE_LENGTH:
return sources return sources
# fix space in paths # fix space in paths
@@ -43,7 +44,7 @@ def long_sources_hook(env, sources):
def long_incflags_hook(env, incflags): def long_incflags_hook(env, incflags):
_incflags = env.subst(incflags).replace("\\", "/") _incflags = env.subst(incflags).replace("\\", "/")
if len(_incflags) < MAX_SOURCES_LENGTH: if len(_incflags) < MAX_LINE_LENGTH:
return incflags return incflags
# fix space in paths # fix space in paths
@@ -61,12 +62,12 @@ def _file_long_data(env, data):
build_dir = env.subst("$BUILD_DIR") build_dir = env.subst("$BUILD_DIR")
if not isdir(build_dir): if not isdir(build_dir):
makedirs(build_dir) makedirs(build_dir)
tmp_file = join(build_dir, tmp_file = join(
"longcmd-%s" % md5(hashlib_encode_data(data)).hexdigest()) build_dir, "longcmd-%s" % md5(hashlib_encode_data(data)).hexdigest()
)
if isfile(tmp_file): if isfile(tmp_file):
return tmp_file return tmp_file
with open(tmp_file, "w") as fp: fs.write_file_contents(tmp_file, data)
fp.write(data)
return tmp_file return tmp_file
@@ -75,18 +76,17 @@ def exists(_):
def generate(env): def generate(env):
if not WINDOWS:
return None
env.Replace(_long_sources_hook=long_sources_hook) env.Replace(_long_sources_hook=long_sources_hook)
env.Replace(_long_incflags_hook=long_incflags_hook) env.Replace(_long_incflags_hook=long_incflags_hook)
coms = {} coms = {}
for key in ("ARCOM", "LINKCOM"): for key in ("ARCOM", "LINKCOM"):
coms[key] = env.get(key, "").replace( coms[key] = env.get(key, "").replace(
"$SOURCES", "${_long_sources_hook(__env__, SOURCES)}") "$SOURCES", "${_long_sources_hook(__env__, SOURCES)}"
)
for key in ("_CCCOMCOM", "ASPPCOM"): for key in ("_CCCOMCOM", "ASPPCOM"):
coms[key] = env.get(key, "").replace( coms[key] = env.get(key, "").replace(
"$_CPPINCFLAGS", "${_long_incflags_hook(__env__, _CPPINCFLAGS)}") "$_CPPINCFLAGS", "${_long_incflags_hook(__env__, _CPPINCFLAGS)}"
)
env.Replace(**coms) env.Replace(**coms)
return env return env
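This hunk only renames the length limit and switches the temp-file write to `fs.write_file_contents`; the underlying trick (partly outside the shown lines) is to spill an over-long `$SOURCES`/`$_CPPINCFLAGS` expansion into a `longcmd-<md5>` file that GCC-style tools read back through the `@file` response-file syntax. A stand-alone sketch of the idea with hypothetical names:

# --- sketch: passing an over-long argument list via an @response-file (illustrative only) ---
import hashlib
import os
import subprocess
import tempfile

def call_with_response_file(tool, fixed_args, long_args, limit=6000):
    data = " ".join(long_args)
    if len(data) < limit:
        return subprocess.call([tool] + fixed_args + long_args)
    rsp = os.path.join(
        tempfile.gettempdir(), "longcmd-%s" % hashlib.md5(data.encode()).hexdigest()
    )
    with open(rsp, "w") as fp:
        fp.write(data)
    # gcc, g++, ar and ld expand "@file" into the file's whitespace-separated contents
    return subprocess.call([tool] + fixed_args + ["@" + rsp])
# --- end sketch ---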
View File
@@ -25,7 +25,7 @@ from SCons.Action import Action # pylint: disable=import-error
from SCons.Script import ARGUMENTS # pylint: disable=import-error from SCons.Script import ARGUMENTS # pylint: disable=import-error
from platformio import fs, util from platformio import fs, util
from platformio.compat import get_file_contents, glob_escape from platformio.compat import glob_escape
from platformio.managers.core import get_core_package_dir from platformio.managers.core import get_core_package_dir
from platformio.proc import exec_command from platformio.proc import exec_command
@@ -39,7 +39,9 @@ class InoToCPPConverter(object):
([a-z_\d]+\s*) # name of prototype ([a-z_\d]+\s*) # name of prototype
\([a-z_,\.\*\&\[\]\s\d]*\) # arguments \([a-z_,\.\*\&\[\]\s\d]*\) # arguments
)\s*(\{|;) # must end with `{` or `;` )\s*(\{|;) # must end with `{` or `;`
""", re.X | re.M | re.I) """,
re.X | re.M | re.I,
)
DETECTMAIN_RE = re.compile(r"void\s+(setup|loop)\s*\(", re.M | re.I) DETECTMAIN_RE = re.compile(r"void\s+(setup|loop)\s*\(", re.M | re.I)
PROTOPTRS_TPLRE = r"\([^&\(]*&(%s)[^\)]*\)" PROTOPTRS_TPLRE = r"\([^&\(]*&(%s)[^\)]*\)"
@@ -60,10 +62,8 @@ class InoToCPPConverter(object):
assert nodes assert nodes
lines = [] lines = []
for node in nodes: for node in nodes:
contents = get_file_contents(node.get_path()) contents = fs.get_file_contents(node.get_path())
_lines = [ _lines = ['# 1 "%s"' % node.get_path().replace("\\", "/"), contents]
'# 1 "%s"' % node.get_path().replace("\\", "/"), contents
]
if self.is_main_node(contents): if self.is_main_node(contents):
lines = _lines + lines lines = _lines + lines
self._main_ino = node.get_path() self._main_ino = node.get_path()
@@ -78,21 +78,24 @@ class InoToCPPConverter(object):
def process(self, contents): def process(self, contents):
out_file = self._main_ino + ".cpp" out_file = self._main_ino + ".cpp"
assert self._gcc_preprocess(contents, out_file) assert self._gcc_preprocess(contents, out_file)
contents = get_file_contents(out_file) contents = fs.get_file_contents(out_file)
contents = self._join_multiline_strings(contents) contents = self._join_multiline_strings(contents)
with open(out_file, "w") as fp: fs.write_file_contents(
fp.write(self.append_prototypes(contents)) out_file, self.append_prototypes(contents), errors="backslashreplace"
)
return out_file return out_file
def _gcc_preprocess(self, contents, out_file): def _gcc_preprocess(self, contents, out_file):
tmp_path = mkstemp()[1] tmp_path = mkstemp()[1]
with open(tmp_path, "w") as fp: fs.write_file_contents(tmp_path, contents, errors="backslashreplace")
fp.write(contents)
self.env.Execute( self.env.Execute(
self.env.VerboseAction( self.env.VerboseAction(
'$CXX -o "{0}" -x c++ -fpreprocessed -dD -E "{1}"'.format( '$CXX -o "{0}" -x c++ -fpreprocessed -dD -E "{1}"'.format(
out_file, tmp_path), out_file, tmp_path
"Converting " + basename(out_file[:-4]))) ),
"Converting " + basename(out_file[:-4]),
)
)
atexit.register(_delete_file, tmp_path) atexit.register(_delete_file, tmp_path)
return isfile(out_file) return isfile(out_file)
@@ -114,14 +117,15 @@ class InoToCPPConverter(object):
stropen = True stropen = True
newlines.append(line[:-1]) newlines.append(line[:-1])
continue continue
elif stropen: if stropen:
newlines[len(newlines) - 1] += line[:-1] newlines[len(newlines) - 1] += line[:-1]
continue continue
elif stropen and line.endswith(('",', '";')): elif stropen and line.endswith(('",', '";')):
newlines[len(newlines) - 1] += line newlines[len(newlines) - 1] += line
stropen = False stropen = False
newlines.append('#line %d "%s"' % newlines.append(
(linenum, self._main_ino.replace("\\", "/"))) '#line %d "%s"' % (linenum, self._main_ino.replace("\\", "/"))
)
continue continue
newlines.append(line) newlines.append(line)
@@ -141,8 +145,10 @@ class InoToCPPConverter(object):
prototypes = [] prototypes = []
reserved_keywords = set(["if", "else", "while"]) reserved_keywords = set(["if", "else", "while"])
for match in self.PROTOTYPE_RE.finditer(contents): for match in self.PROTOTYPE_RE.finditer(contents):
if (set([match.group(2).strip(), if (
match.group(3).strip()]) & reserved_keywords): set([match.group(2).strip(), match.group(3).strip()])
& reserved_keywords
):
continue continue
prototypes.append(match) prototypes.append(match)
return prototypes return prototypes
@@ -162,11 +168,8 @@ class InoToCPPConverter(object):
prototypes = self._parse_prototypes(contents) or [] prototypes = self._parse_prototypes(contents) or []
# skip already declared prototypes # skip already declared prototypes
declared = set( declared = set(m.group(1).strip() for m in prototypes if m.group(4) == ";")
m.group(1).strip() for m in prototypes if m.group(4) == ";") prototypes = [m for m in prototypes if m.group(1).strip() not in declared]
prototypes = [
m for m in prototypes if m.group(1).strip() not in declared
]
if not prototypes: if not prototypes:
return contents return contents
@@ -175,23 +178,29 @@ class InoToCPPConverter(object):
split_pos = prototypes[0].start() split_pos = prototypes[0].start()
match_ptrs = re.search( match_ptrs = re.search(
self.PROTOPTRS_TPLRE % ("|".join(prototype_names)), self.PROTOPTRS_TPLRE % ("|".join(prototype_names)),
contents[:split_pos], re.M) contents[:split_pos],
re.M,
)
if match_ptrs: if match_ptrs:
split_pos = contents.rfind("\n", 0, match_ptrs.start()) + 1 split_pos = contents.rfind("\n", 0, match_ptrs.start()) + 1
result = [] result = []
result.append(contents[:split_pos].strip()) result.append(contents[:split_pos].strip())
result.append("%s;" % ";\n".join([m.group(1) for m in prototypes])) result.append("%s;" % ";\n".join([m.group(1) for m in prototypes]))
result.append('#line %d "%s"' % (self._get_total_lines( result.append(
contents[:split_pos]), self._main_ino.replace("\\", "/"))) '#line %d "%s"'
% (
self._get_total_lines(contents[:split_pos]),
self._main_ino.replace("\\", "/"),
)
)
result.append(contents[split_pos:].strip()) result.append(contents[split_pos:].strip())
return "\n".join(result) return "\n".join(result)
def ConvertInoToCpp(env): def ConvertInoToCpp(env):
src_dir = glob_escape(env.subst("$PROJECTSRC_DIR")) src_dir = glob_escape(env.subst("$PROJECT_SRC_DIR"))
ino_nodes = (env.Glob(join(src_dir, "*.ino")) + ino_nodes = env.Glob(join(src_dir, "*.ino")) + env.Glob(join(src_dir, "*.pde"))
env.Glob(join(src_dir, "*.pde")))
if not ino_nodes: if not ino_nodes:
return return
c = InoToCPPConverter(env) c = InoToCPPConverter(env)
@@ -214,13 +223,13 @@ def _get_compiler_type(env):
return "gcc" return "gcc"
try: try:
sysenv = environ.copy() sysenv = environ.copy()
sysenv['PATH'] = str(env['ENV']['PATH']) sysenv["PATH"] = str(env["ENV"]["PATH"])
result = exec_command([env.subst("$CC"), "-v"], env=sysenv) result = exec_command([env.subst("$CC"), "-v"], env=sysenv)
except OSError: except OSError:
return None return None
if result['returncode'] != 0: if result["returncode"] != 0:
return None return None
output = "".join([result['out'], result['err']]).lower() output = "".join([result["out"], result["err"]]).lower()
if "clang" in output and "LLVM" in output: if "clang" in output and "LLVM" in output:
return "clang" return "clang"
if "gcc" in output: if "gcc" in output:
@@ -233,7 +242,6 @@ def GetCompilerType(env):
def GetActualLDScript(env): def GetActualLDScript(env):
def _lookup_in_ldpath(script): def _lookup_in_ldpath(script):
for d in env.get("LIBPATH", []): for d in env.get("LIBPATH", []):
path = join(env.subst(d), script) path = join(env.subst(d), script)
@@ -248,7 +256,7 @@ def GetActualLDScript(env):
if f == "-T": if f == "-T":
script_in_next = True script_in_next = True
continue continue
elif script_in_next: if script_in_next:
script_in_next = False script_in_next = False
raw_script = f raw_script = f
elif f.startswith("-Wl,-T"): elif f.startswith("-Wl,-T"):
@@ -264,12 +272,13 @@ def GetActualLDScript(env):
if script: if script:
sys.stderr.write( sys.stderr.write(
"Error: Could not find '%s' LD script in LDPATH '%s'\n" % "Error: Could not find '%s' LD script in LDPATH '%s'\n"
(script, env.subst("$LIBPATH"))) % (script, env.subst("$LIBPATH"))
)
env.Exit(1) env.Exit(1)
if not script and "LDSCRIPT_PATH" in env: if not script and "LDSCRIPT_PATH" in env:
path = _lookup_in_ldpath(env['LDSCRIPT_PATH']) path = _lookup_in_ldpath(env["LDSCRIPT_PATH"])
if path: if path:
return path return path
@@ -292,35 +301,45 @@ def PioClean(env, clean_dir):
for f in files: for f in files:
dst = join(root, f) dst = join(root, f)
remove(dst) remove(dst)
print("Removed %s" % print(
(dst if clean_rel_path.startswith(".") else relpath(dst))) "Removed %s" % (dst if clean_rel_path.startswith(".") else relpath(dst))
)
print("Done cleaning") print("Done cleaning")
fs.rmtree(clean_dir) fs.rmtree(clean_dir)
env.Exit(0) env.Exit(0)
def ProcessDebug(env): def ConfigureDebugFlags(env):
if not env.subst("$PIODEBUGFLAGS"): def _cleanup_debug_flags(scope):
env.Replace(PIODEBUGFLAGS=["-Og", "-g3", "-ggdb3"]) if scope not in env:
env.Append(BUILD_FLAGS=list(env['PIODEBUGFLAGS']) + return
["-D__PLATFORMIO_BUILD_DEBUG__"]) unflags = ["-Os", "-g"]
unflags = ["-Os"] for level in [0, 1, 2]:
for level in [0, 1, 2]: for flag in ("O", "g", "ggdb"):
for flag in ("O", "g", "ggdb"): unflags.append("-%s%d" % (flag, level))
unflags.append("-%s%d" % (flag, level)) env[scope] = [f for f in env.get(scope, []) if f not in unflags]
env.Append(BUILD_UNFLAGS=unflags)
env.Append(CPPDEFINES=["__PLATFORMIO_BUILD_DEBUG__"])
debug_flags = ["-Og", "-g2", "-ggdb2"]
for scope in ("ASFLAGS", "CCFLAGS", "LINKFLAGS"):
_cleanup_debug_flags(scope)
env.Append(**{scope: debug_flags})
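For clarity: the renamed `ConfigureDebugFlags` now scrubs any existing optimization/debug-level flags from `ASFLAGS`, `CCFLAGS` and `LINKFLAGS` before appending `-Og -g2 -ggdb2` and the `__PLATFORMIO_BUILD_DEBUG__` define, instead of routing everything through `BUILD_FLAGS`/`BUILD_UNFLAGS`. A minimal sketch of just the cleanup step:

# --- sketch: the debug-flag cleanup in isolation (illustrative only) ---
def cleanup_debug_flags(flags):
    unflags = ["-Os", "-g"]
    for level in (0, 1, 2):
        for flag in ("O", "g", "ggdb"):
            unflags.append("-%s%d" % (flag, level))
    return [f for f in flags if f not in unflags]

print(cleanup_debug_flags(["-Os", "-g0", "-Wall", "-O2", "-ggdb1"]))
# ['-Wall']
# --- end sketch ---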
def ProcessTest(env): def ConfigureTestTarget(env):
env.Append(CPPDEFINES=["UNIT_TEST", "UNITY_INCLUDE_CONFIG_H"], env.Append(
CPPPATH=[join("$BUILD_DIR", "UnityTestLib")]) CPPDEFINES=["UNIT_TEST", "UNITY_INCLUDE_CONFIG_H"],
unitylib = env.BuildLibrary(join("$BUILD_DIR", "UnityTestLib"), CPPPATH=[join("$BUILD_DIR", "UnityTestLib")],
get_core_package_dir("tool-unity")) )
unitylib = env.BuildLibrary(
join("$BUILD_DIR", "UnityTestLib"), get_core_package_dir("tool-unity")
)
env.Prepend(LIBS=[unitylib]) env.Prepend(LIBS=[unitylib])
src_filter = ["+<*.cpp>", "+<*.c>"] src_filter = ["+<*.cpp>", "+<*.c>"]
if "PIOTEST_RUNNING_NAME" in env: if "PIOTEST_RUNNING_NAME" in env:
src_filter.append("+<%s%s>" % (env['PIOTEST_RUNNING_NAME'], sep)) src_filter.append("+<%s%s>" % (env["PIOTEST_RUNNING_NAME"], sep))
env.Replace(PIOTEST_SRC_FILTER=src_filter) env.Replace(PIOTEST_SRC_FILTER=src_filter)
@@ -330,7 +349,7 @@ def GetExtraScripts(env, scope):
if scope == "post" and ":" not in item: if scope == "post" and ":" not in item:
items.append(item) items.append(item)
elif item.startswith("%s:" % scope): elif item.startswith("%s:" % scope):
items.append(item[len(scope) + 1:]) items.append(item[len(scope) + 1 :])
if not items: if not items:
return items return items
with fs.cd(env.subst("$PROJECT_DIR")): with fs.cd(env.subst("$PROJECT_DIR")):
@@ -347,7 +366,7 @@ def generate(env):
env.AddMethod(GetActualLDScript) env.AddMethod(GetActualLDScript)
env.AddMethod(VerboseAction) env.AddMethod(VerboseAction)
env.AddMethod(PioClean) env.AddMethod(PioClean)
env.AddMethod(ProcessDebug) env.AddMethod(ConfigureDebugFlags)
env.AddMethod(ProcessTest) env.AddMethod(ConfigureTestTarget)
env.AddMethod(GetExtraScripts) env.AddMethod(GetExtraScripts)
return env return env
View File
@@ -33,8 +33,8 @@ def PioPlatform(env):
variables = env.GetProjectOptions(as_dict=True) variables = env.GetProjectOptions(as_dict=True)
if "framework" in variables: if "framework" in variables:
# support PIO Core 3.0 dev/platforms # support PIO Core 3.0 dev/platforms
variables['pioframework'] = variables['framework'] variables["pioframework"] = variables["framework"]
p = PlatformFactory.newPlatform(env['PLATFORM_MANIFEST']) p = PlatformFactory.newPlatform(env["PLATFORM_MANIFEST"])
p.configure_default_packages(variables, COMMAND_LINE_TARGETS) p.configure_default_packages(variables, COMMAND_LINE_TARGETS)
return p return p
@@ -54,7 +54,7 @@ def BoardConfig(env, board=None):
def GetFrameworkScript(env, framework): def GetFrameworkScript(env, framework):
p = env.PioPlatform() p = env.PioPlatform()
assert p.frameworks and framework in p.frameworks assert p.frameworks and framework in p.frameworks
script_path = env.subst(p.frameworks[framework]['script']) script_path = env.subst(p.frameworks[framework]["script"])
if not isfile(script_path): if not isfile(script_path):
script_path = join(p.get_dir(), script_path) script_path = join(p.get_dir(), script_path)
return script_path return script_path
@@ -65,7 +65,7 @@ def LoadPioPlatform(env):
installed_packages = p.get_installed_packages() installed_packages = p.get_installed_packages()
# Ensure real platform name # Ensure real platform name
env['PIOPLATFORM'] = p.name env["PIOPLATFORM"] = p.name
# Add toolchains and uploaders to $PATH and $*_LIBRARY_PATH # Add toolchains and uploaders to $PATH and $*_LIBRARY_PATH
systype = util.get_systype() systype = util.get_systype()
@@ -75,14 +75,13 @@ def LoadPioPlatform(env):
continue continue
pkg_dir = p.get_package_dir(name) pkg_dir = p.get_package_dir(name)
env.PrependENVPath( env.PrependENVPath(
"PATH", "PATH", join(pkg_dir, "bin") if isdir(join(pkg_dir, "bin")) else pkg_dir
join(pkg_dir, "bin") if isdir(join(pkg_dir, "bin")) else pkg_dir) )
if (not WINDOWS and isdir(join(pkg_dir, "lib")) if not WINDOWS and isdir(join(pkg_dir, "lib")) and type_ != "toolchain":
and type_ != "toolchain"):
env.PrependENVPath( env.PrependENVPath(
"DYLD_LIBRARY_PATH" "DYLD_LIBRARY_PATH" if "darwin" in systype else "LD_LIBRARY_PATH",
if "darwin" in systype else "LD_LIBRARY_PATH", join(pkg_dir, "lib"),
join(pkg_dir, "lib")) )
# Platform specific LD Scripts # Platform specific LD Scripts
if isdir(join(p.get_dir(), "ldscripts")): if isdir(join(p.get_dir(), "ldscripts")):
@@ -94,16 +93,27 @@ def LoadPioPlatform(env):
# update board manifest with overridden data from INI config # update board manifest with overridden data from INI config
board_config = env.BoardConfig() board_config = env.BoardConfig()
for option, value in env.GetProjectOptions(): for option, value in env.GetProjectOptions():
if option.startswith("board_"): if not option.startswith("board_"):
board_config.update(option.lower()[6:], value) continue
option = option.lower()[6:]
try:
if isinstance(board_config.get(option), bool):
value = str(value).lower() in ("1", "yes", "true")
elif isinstance(board_config.get(option), int):
value = int(value)
except KeyError:
pass
board_config.update(option, value)
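The added coercion keeps `board_*` overrides from `platformio.ini` (which the config parser hands over as strings) type-compatible with whatever the board manifest already stores. A simplified sketch without the `KeyError` guard:

# --- sketch: coercing a board option override to the manifest's type (illustrative only) ---
def coerce_board_override(current_value, raw_value):
    if isinstance(current_value, bool):       # bool must be checked before int
        return str(raw_value).lower() in ("1", "yes", "true")
    if isinstance(current_value, int):
        return int(raw_value)
    return raw_value

print(coerce_board_override(True, "no"))           # False
print(coerce_board_override(1048576, "2097152"))   # 2097152
print(coerce_board_override("dio", "qio"))         # qio
# --- end sketch ---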
# load default variables from board config # load default variables from board config
for option_meta in ProjectOptions.values(): for option_meta in ProjectOptions.values():
if not option_meta.buildenvvar or option_meta.buildenvvar in env: if not option_meta.buildenvvar or option_meta.buildenvvar in env:
continue continue
data_path = (option_meta.name[6:] data_path = (
if option_meta.name.startswith("board_") else option_meta.name[6:]
option_meta.name.replace("_", ".")) if option_meta.name.startswith("board_")
else option_meta.name.replace("_", ".")
)
try: try:
env[option_meta.buildenvvar] = board_config.get(data_path) env[option_meta.buildenvvar] = board_config.get(data_path)
except KeyError: except KeyError:
@@ -118,22 +128,25 @@ def PrintConfiguration(env): # pylint: disable=too-many-statements
board_config = env.BoardConfig() if "BOARD" in env else None board_config = env.BoardConfig() if "BOARD" in env else None
def _get_configuration_data(): def _get_configuration_data():
return None if not board_config else [ return (
"CONFIGURATION:", None
"https://docs.platformio.org/page/boards/%s/%s.html" % if not board_config
(platform.name, board_config.id) else [
] "CONFIGURATION:",
"https://docs.platformio.org/page/boards/%s/%s.html"
% (platform.name, board_config.id),
]
)
def _get_plaform_data(): def _get_plaform_data():
data = ["PLATFORM: %s %s" % (platform.title, platform.version)] data = ["PLATFORM: %s %s" % (platform.title, platform.version)]
src_manifest_path = platform.pm.get_src_manifest_path( src_manifest_path = platform.pm.get_src_manifest_path(platform.get_dir())
platform.get_dir())
if src_manifest_path: if src_manifest_path:
src_manifest = fs.load_json(src_manifest_path) src_manifest = fs.load_json(src_manifest_path)
if "version" in src_manifest: if "version" in src_manifest:
data.append("#" + src_manifest['version']) data.append("#" + src_manifest["version"])
if int(ARGUMENTS.get("PIOVERBOSE", 0)): if int(ARGUMENTS.get("PIOVERBOSE", 0)):
data.append("(%s)" % src_manifest['url']) data.append("(%s)" % src_manifest["url"])
if board_config: if board_config:
data.extend([">", board_config.get("name")]) data.extend([">", board_config.get("name")])
return data return data
@@ -151,19 +164,22 @@ def PrintConfiguration(env): # pylint: disable=too-many-statements
return data return data
ram = board_config.get("upload", {}).get("maximum_ram_size") ram = board_config.get("upload", {}).get("maximum_ram_size")
flash = board_config.get("upload", {}).get("maximum_size") flash = board_config.get("upload", {}).get("maximum_size")
data.append("%s RAM, %s Flash" % data.append(
(fs.format_filesize(ram), fs.format_filesize(flash))) "%s RAM, %s Flash" % (fs.format_filesize(ram), fs.format_filesize(flash))
)
return data return data
def _get_debug_data(): def _get_debug_data():
debug_tools = board_config.get( debug_tools = (
"debug", {}).get("tools") if board_config else None board_config.get("debug", {}).get("tools") if board_config else None
)
if not debug_tools: if not debug_tools:
return None return None
data = [ data = [
"DEBUG:", "Current", "DEBUG:",
"(%s)" % board_config.get_debug_tool_name( "Current",
env.GetProjectOption("debug_tool")) "(%s)"
% board_config.get_debug_tool_name(env.GetProjectOption("debug_tool")),
] ]
onboard = [] onboard = []
external = [] external = []
@@ -187,21 +203,25 @@ def PrintConfiguration(env): # pylint: disable=too-many-statements
if not pkg_dir: if not pkg_dir:
continue continue
manifest = platform.pm.load_manifest(pkg_dir) manifest = platform.pm.load_manifest(pkg_dir)
original_version = util.get_original_version(manifest['version']) original_version = util.get_original_version(manifest["version"])
info = "%s %s" % (manifest['name'], manifest['version']) info = "%s %s" % (manifest["name"], manifest["version"])
extra = [] extra = []
if original_version: if original_version:
extra.append(original_version) extra.append(original_version)
if "__src_url" in manifest and int(ARGUMENTS.get("PIOVERBOSE", 0)): if "__src_url" in manifest and int(ARGUMENTS.get("PIOVERBOSE", 0)):
extra.append(manifest['__src_url']) extra.append(manifest["__src_url"])
if extra: if extra:
info += " (%s)" % ", ".join(extra) info += " (%s)" % ", ".join(extra)
data.append(info) data.append(info)
return ["PACKAGES:", ", ".join(data)] return ["PACKAGES:", ", ".join(data)]
for data in (_get_configuration_data(), _get_plaform_data(), for data in (
_get_hardware_data(), _get_debug_data(), _get_configuration_data(),
_get_packages_data()): _get_plaform_data(),
_get_hardware_data(),
_get_debug_data(),
_get_packages_data(),
):
if data and len(data) > 1: if data and len(data) > 1:
print(" ".join(data)) print(" ".join(data))
View File
@@ -18,22 +18,25 @@ from platformio.project.config import ProjectConfig, ProjectOptions
def GetProjectConfig(env): def GetProjectConfig(env):
return ProjectConfig.get_instance(env['PROJECT_CONFIG']) return ProjectConfig.get_instance(env["PROJECT_CONFIG"])
def GetProjectOptions(env, as_dict=False): def GetProjectOptions(env, as_dict=False):
return env.GetProjectConfig().items(env=env['PIOENV'], as_dict=as_dict) return env.GetProjectConfig().items(env=env["PIOENV"], as_dict=as_dict)
def GetProjectOption(env, option, default=None): def GetProjectOption(env, option, default=None):
return env.GetProjectConfig().get("env:" + env['PIOENV'], option, default) return env.GetProjectConfig().get("env:" + env["PIOENV"], option, default)
def LoadProjectOptions(env): def LoadProjectOptions(env):
for option, value in env.GetProjectOptions(): for option, value in env.GetProjectOptions():
option_meta = ProjectOptions.get("env." + option) option_meta = ProjectOptions.get("env." + option)
if (not option_meta or not option_meta.buildenvvar if (
or option_meta.buildenvvar in env): not option_meta
or not option_meta.buildenvvar
or option_meta.buildenvvar in env
):
continue continue
env[option_meta.buildenvvar] = value env[option_meta.buildenvvar] = value
View File
@@ -0,0 +1,254 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=too-many-locals
from __future__ import absolute_import
import sys
from os import environ, makedirs, remove
from os.path import isdir, join, splitdrive
from elftools.elf.descriptions import describe_sh_flags
from elftools.elf.elffile import ELFFile
from platformio.compat import dump_json_to_unicode
from platformio.proc import exec_command
from platformio.util import get_systype
def _run_tool(cmd, env, tool_args):
sysenv = environ.copy()
sysenv["PATH"] = str(env["ENV"]["PATH"])
build_dir = env.subst("$BUILD_DIR")
if not isdir(build_dir):
makedirs(build_dir)
tmp_file = join(build_dir, "size-data-longcmd.txt")
with open(tmp_file, "w") as fp:
fp.write("\n".join(tool_args))
cmd.append("@" + tmp_file)
result = exec_command(cmd, env=sysenv)
remove(tmp_file)
return result
def _get_symbol_locations(env, elf_path, addrs):
if not addrs:
return {}
cmd = [env.subst("$CC").replace("-gcc", "-addr2line"), "-e", elf_path]
result = _run_tool(cmd, env, addrs)
locations = [line for line in result["out"].split("\n") if line]
assert len(addrs) == len(locations)
return dict(zip(addrs, [l.strip() for l in locations]))
def _get_demangled_names(env, mangled_names):
if not mangled_names:
return {}
result = _run_tool(
[env.subst("$CC").replace("-gcc", "-c++filt")], env, mangled_names
)
demangled_names = [line for line in result["out"].split("\n") if line]
assert len(mangled_names) == len(demangled_names)
return dict(
zip(
mangled_names,
[dn.strip().replace("::__FUNCTION__", "") for dn in demangled_names],
)
)
def _determine_section(sections, symbol_addr):
for section, info in sections.items():
if not _is_flash_section(info) and not _is_ram_section(info):
continue
if symbol_addr in range(info["start_addr"], info["start_addr"] + info["size"]):
return section
return "unknown"
def _is_ram_section(section):
return (
section.get("type", "") in ("SHT_NOBITS", "SHT_PROGBITS")
and section.get("flags", "") == "WA"
)
def _is_flash_section(section):
return section.get("type", "") == "SHT_PROGBITS" and "A" in section.get("flags", "")
def _is_valid_symbol(symbol_name, symbol_type, symbol_address):
return symbol_name and symbol_address != 0 and symbol_type != "STT_NOTYPE"
def _collect_sections_info(elffile):
sections = {}
for section in elffile.iter_sections():
if section.is_null() or section.name.startswith(".debug"):
continue
section_type = section["sh_type"]
section_flags = describe_sh_flags(section["sh_flags"])
section_size = section.data_size
sections[section.name] = {
"size": section_size,
"start_addr": section["sh_addr"],
"type": section_type,
"flags": section_flags,
}
return sections
def _collect_symbols_info(env, elffile, elf_path, sections):
symbols = []
symbol_section = elffile.get_section_by_name(".symtab")
if symbol_section.is_null():
sys.stderr.write("Couldn't find symbol table. Is ELF file stripped?")
env.Exit(1)
sysenv = environ.copy()
sysenv["PATH"] = str(env["ENV"]["PATH"])
symbol_addrs = []
mangled_names = []
for s in symbol_section.iter_symbols():
symbol_info = s.entry["st_info"]
symbol_addr = s["st_value"]
symbol_size = s["st_size"]
symbol_type = symbol_info["type"]
if not _is_valid_symbol(s.name, symbol_type, symbol_addr):
continue
symbol = {
"addr": symbol_addr,
"bind": symbol_info["bind"],
"name": s.name,
"type": symbol_type,
"size": symbol_size,
"section": _determine_section(sections, symbol_addr),
}
if s.name.startswith("_Z"):
mangled_names.append(s.name)
symbol_addrs.append(hex(symbol_addr))
symbols.append(symbol)
symbol_locations = _get_symbol_locations(env, elf_path, symbol_addrs)
demangled_names = _get_demangled_names(env, mangled_names)
for symbol in symbols:
if symbol["name"].startswith("_Z"):
symbol["demangled_name"] = demangled_names.get(symbol["name"])
location = symbol_locations.get(hex(symbol["addr"]))
if not location or "?" in location:
continue
if "windows" in get_systype():
drive, tail = splitdrive(location)
location = join(drive.upper(), tail)
symbol["file"] = location
symbol["line"] = 0
if ":" in location:
file_, line = location.rsplit(":", 1)
if line.isdigit():
symbol["file"] = file_
symbol["line"] = int(line)
return symbols
def _calculate_firmware_size(sections):
flash_size = ram_size = 0
for section_info in sections.values():
if _is_flash_section(section_info):
flash_size += section_info.get("size", 0)
if _is_ram_section(section_info):
ram_size += section_info.get("size", 0)
return ram_size, flash_size
def DumpSizeData(_, target, source, env): # pylint: disable=unused-argument
data = {"device": {}, "memory": {}, "version": 1}
board = env.BoardConfig()
if board:
data["device"] = {
"mcu": board.get("build.mcu", ""),
"cpu": board.get("build.cpu", ""),
"frequency": board.get("build.f_cpu"),
"flash": int(board.get("upload.maximum_size", 0)),
"ram": int(board.get("upload.maximum_ram_size", 0)),
}
if data["device"]["frequency"] and data["device"]["frequency"].endswith("L"):
data["device"]["frequency"] = int(data["device"]["frequency"][0:-1])
elf_path = env.subst("$PIOMAINPROG")
with open(elf_path, "rb") as fp:
elffile = ELFFile(fp)
if not elffile.has_dwarf_info():
sys.stderr.write("Elf file doesn't contain DWARF information")
env.Exit(1)
sections = _collect_sections_info(elffile)
firmware_ram, firmware_flash = _calculate_firmware_size(sections)
data["memory"]["total"] = {
"ram_size": firmware_ram,
"flash_size": firmware_flash,
"sections": sections,
}
files = dict()
for symbol in _collect_symbols_info(env, elffile, elf_path, sections):
file_path = symbol.get("file") or "unknown"
if not files.get(file_path, {}):
files[file_path] = {"symbols": [], "ram_size": 0, "flash_size": 0}
symbol_size = symbol.get("size", 0)
section = sections.get(symbol.get("section", ""), {})
if _is_ram_section(section):
files[file_path]["ram_size"] += symbol_size
if _is_flash_section(section):
files[file_path]["flash_size"] += symbol_size
files[file_path]["symbols"].append(symbol)
data["memory"]["files"] = list()
for k, v in files.items():
file_data = {"path": k}
file_data.update(v)
data["memory"]["files"].append(file_data)
with open(join(env.subst("$BUILD_DIR"), "sizedata.json"), "w") as fp:
fp.write(dump_json_to_unicode(data))
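For orientation, the `sizedata.json` written above comes out roughly shaped as below (all numbers, names and paths are made up; presumably this is the input for the new Firmware Memory Inspection views in PlatformIO Home):

# --- sketch: approximate shape of sizedata.json, as a Python literal (illustrative only) ---
example_sizedata = {
    "version": 1,
    "device": {
        "mcu": "atmega328p", "cpu": "", "frequency": 16000000,
        "flash": 32256, "ram": 2048,
    },
    "memory": {
        "total": {"ram_size": 512, "flash_size": 10616, "sections": {"...": {}}},
        "files": [
            {
                "path": "src/main.cpp", "ram_size": 64, "flash_size": 1200,
                "symbols": [
                    {"addr": 256, "name": "loop", "size": 116, "section": ".text",
                     "type": "STT_FUNC", "bind": "STB_GLOBAL",
                     "file": "src/main.cpp", "line": 12},
                ],
            },
        ],
    },
}
# --- end sketch ---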
def exists(_):
return True
def generate(env):
env.AddMethod(DumpSizeData)
return env
View File
@@ -60,9 +60,9 @@ def WaitForNewSerialPort(env, before):
prev_port = env.subst("$UPLOAD_PORT") prev_port = env.subst("$UPLOAD_PORT")
new_port = None new_port = None
elapsed = 0 elapsed = 0
before = [p['port'] for p in before] before = [p["port"] for p in before]
while elapsed < 5 and new_port is None: while elapsed < 5 and new_port is None:
now = [p['port'] for p in util.get_serial_ports()] now = [p["port"] for p in util.get_serial_ports()]
for p in now: for p in now:
if p not in before: if p not in before:
new_port = p new_port = p
@@ -84,10 +84,12 @@ def WaitForNewSerialPort(env, before):
sleep(1) sleep(1)
if not new_port: if not new_port:
sys.stderr.write("Error: Couldn't find a board on the selected port. " sys.stderr.write(
"Check that you have the correct port selected. " "Error: Couldn't find a board on the selected port. "
"If it is correct, try pressing the board's reset " "Check that you have the correct port selected. "
"button after initiating the upload.\n") "If it is correct, try pressing the board's reset "
"button after initiating the upload.\n"
)
env.Exit(1) env.Exit(1)
return new_port return new_port
@@ -99,8 +101,8 @@ def AutodetectUploadPort(*args, **kwargs):
def _get_pattern(): def _get_pattern():
if "UPLOAD_PORT" not in env: if "UPLOAD_PORT" not in env:
return None return None
if set(["*", "?", "[", "]"]) & set(env['UPLOAD_PORT']): if set(["*", "?", "[", "]"]) & set(env["UPLOAD_PORT"]):
return env['UPLOAD_PORT'] return env["UPLOAD_PORT"]
return None return None
def _is_match_pattern(port): def _is_match_pattern(port):
@@ -112,17 +114,13 @@ def AutodetectUploadPort(*args, **kwargs):
def _look_for_mbed_disk(): def _look_for_mbed_disk():
msdlabels = ("mbed", "nucleo", "frdm", "microbit") msdlabels = ("mbed", "nucleo", "frdm", "microbit")
for item in util.get_logical_devices(): for item in util.get_logical_devices():
if item['path'].startswith("/net") or not _is_match_pattern( if item["path"].startswith("/net") or not _is_match_pattern(item["path"]):
item['path']):
continue continue
mbed_pages = [ mbed_pages = [join(item["path"], n) for n in ("mbed.htm", "mbed.html")]
join(item['path'], n) for n in ("mbed.htm", "mbed.html")
]
if any(isfile(p) for p in mbed_pages): if any(isfile(p) for p in mbed_pages):
return item['path'] return item["path"]
if item['name'] \ if item["name"] and any(l in item["name"].lower() for l in msdlabels):
and any(l in item['name'].lower() for l in msdlabels): return item["path"]
return item['path']
return None return None
def _look_for_serial_port(): def _look_for_serial_port():
@@ -132,17 +130,17 @@ def AutodetectUploadPort(*args, **kwargs):
if "BOARD" in env and "build.hwids" in env.BoardConfig(): if "BOARD" in env and "build.hwids" in env.BoardConfig():
board_hwids = env.BoardConfig().get("build.hwids") board_hwids = env.BoardConfig().get("build.hwids")
for item in util.get_serial_ports(filter_hwid=True): for item in util.get_serial_ports(filter_hwid=True):
if not _is_match_pattern(item['port']): if not _is_match_pattern(item["port"]):
continue continue
port = item['port'] port = item["port"]
if upload_protocol.startswith("blackmagic"): if upload_protocol.startswith("blackmagic"):
if WINDOWS and port.startswith("COM") and len(port) > 4: if WINDOWS and port.startswith("COM") and len(port) > 4:
port = "\\\\.\\%s" % port port = "\\\\.\\%s" % port
if "GDB" in item['description']: if "GDB" in item["description"]:
return port return port
for hwid in board_hwids: for hwid in board_hwids:
hwid_str = ("%s:%s" % (hwid[0], hwid[1])).replace("0x", "") hwid_str = ("%s:%s" % (hwid[0], hwid[1])).replace("0x", "")
if hwid_str in item['hwid']: if hwid_str in item["hwid"]:
return port return port
return port return port
@@ -150,9 +148,9 @@ def AutodetectUploadPort(*args, **kwargs):
print(env.subst("Use manually specified: $UPLOAD_PORT")) print(env.subst("Use manually specified: $UPLOAD_PORT"))
return return
if (env.subst("$UPLOAD_PROTOCOL") == "mbed" if env.subst("$UPLOAD_PROTOCOL") == "mbed" or (
or ("mbed" in env.subst("$PIOFRAMEWORK") "mbed" in env.subst("$PIOFRAMEWORK") and not env.subst("$UPLOAD_PROTOCOL")
and not env.subst("$UPLOAD_PROTOCOL"))): ):
env.Replace(UPLOAD_PORT=_look_for_mbed_disk()) env.Replace(UPLOAD_PORT=_look_for_mbed_disk())
else: else:
try: try:
@@ -168,7 +166,8 @@ def AutodetectUploadPort(*args, **kwargs):
"Error: Please specify `upload_port` for environment or use " "Error: Please specify `upload_port` for environment or use "
"global `--upload-port` option.\n" "global `--upload-port` option.\n"
"For some development platforms it can be a USB flash " "For some development platforms it can be a USB flash "
"drive (i.e. /media/<user>/<device name>)\n") "drive (i.e. /media/<user>/<device name>)\n"
)
env.Exit(1) env.Exit(1)
@@ -179,16 +178,17 @@ def UploadToDisk(_, target, source, env):
fpath = join(env.subst("$BUILD_DIR"), "%s.%s" % (progname, ext)) fpath = join(env.subst("$BUILD_DIR"), "%s.%s" % (progname, ext))
if not isfile(fpath): if not isfile(fpath):
continue continue
copyfile(fpath, copyfile(fpath, join(env.subst("$UPLOAD_PORT"), "%s.%s" % (progname, ext)))
join(env.subst("$UPLOAD_PORT"), "%s.%s" % (progname, ext))) print(
print("Firmware has been successfully uploaded.\n" "Firmware has been successfully uploaded.\n"
"(Some boards may require manual hard reset)") "(Some boards may require manual hard reset)"
)
def CheckUploadSize(_, target, source, env): def CheckUploadSize(_, target, source, env):
check_conditions = [ check_conditions = [
env.get("BOARD"), env.get("BOARD"),
env.get("SIZETOOL") or env.get("SIZECHECKCMD") env.get("SIZETOOL") or env.get("SIZECHECKCMD"),
] ]
if not all(check_conditions): if not all(check_conditions):
return return
@@ -198,9 +198,11 @@ def CheckUploadSize(_, target, source, env):
return return
def _configure_defaults(): def _configure_defaults():
env.Replace(SIZECHECKCMD="$SIZETOOL -B -d $SOURCES", env.Replace(
SIZEPROGREGEXP=r"^(\d+)\s+(\d+)\s+\d+\s", SIZECHECKCMD="$SIZETOOL -B -d $SOURCES",
SIZEDATAREGEXP=r"^\d+\s+(\d+)\s+(\d+)\s+\d+") SIZEPROGREGEXP=r"^(\d+)\s+(\d+)\s+\d+\s",
SIZEDATAREGEXP=r"^\d+\s+(\d+)\s+(\d+)\s+\d+",
)
def _get_size_output(): def _get_size_output():
cmd = env.get("SIZECHECKCMD") cmd = env.get("SIZECHECKCMD")
@@ -210,11 +212,11 @@ def CheckUploadSize(_, target, source, env):
cmd = cmd.split() cmd = cmd.split()
cmd = [arg.replace("$SOURCES", str(source[0])) for arg in cmd if arg] cmd = [arg.replace("$SOURCES", str(source[0])) for arg in cmd if arg]
sysenv = environ.copy() sysenv = environ.copy()
sysenv['PATH'] = str(env['ENV']['PATH']) sysenv["PATH"] = str(env["ENV"]["PATH"])
result = exec_command(env.subst(cmd), env=sysenv) result = exec_command(env.subst(cmd), env=sysenv)
if result['returncode'] != 0: if result["returncode"] != 0:
return None return None
return result['out'].strip() return result["out"].strip()
def _calculate_size(output, pattern): def _calculate_size(output, pattern):
if not output or not pattern: if not output or not pattern:
@@ -238,7 +240,8 @@ def CheckUploadSize(_, target, source, env):
if used_blocks > blocks_per_progress: if used_blocks > blocks_per_progress:
used_blocks = blocks_per_progress used_blocks = blocks_per_progress
return "[{:{}}] {: 6.1%} (used {:d} bytes from {:d} bytes)".format( return "[{:{}}] {: 6.1%} (used {:d} bytes from {:d} bytes)".format(
"=" * used_blocks, blocks_per_progress, percent_raw, value, total) "=" * used_blocks, blocks_per_progress, percent_raw, value, total
)
if not env.get("SIZECHECKCMD") and not env.get("SIZEPROGREGEXP"): if not env.get("SIZECHECKCMD") and not env.get("SIZEPROGREGEXP"):
_configure_defaults() _configure_defaults()
@@ -246,12 +249,11 @@ def CheckUploadSize(_, target, source, env):
program_size = _calculate_size(output, env.get("SIZEPROGREGEXP")) program_size = _calculate_size(output, env.get("SIZEPROGREGEXP"))
data_size = _calculate_size(output, env.get("SIZEDATAREGEXP")) data_size = _calculate_size(output, env.get("SIZEDATAREGEXP"))
print("Memory Usage -> http://bit.ly/pio-memory-usage") print('Advanced Memory Usage is available via "PlatformIO Home > Project Inspect"')
if data_max_size and data_size > -1: if data_max_size and data_size > -1:
print("DATA: %s" % _format_availale_bytes(data_size, data_max_size)) print("DATA: %s" % _format_availale_bytes(data_size, data_max_size))
if program_size > -1: if program_size > -1:
print("PROGRAM: %s" % print("PROGRAM: %s" % _format_availale_bytes(program_size, program_max_size))
_format_availale_bytes(program_size, program_max_size))
if int(ARGUMENTS.get("PIOVERBOSE", 0)): if int(ARGUMENTS.get("PIOVERBOSE", 0)):
print(output) print(output)
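The default regexes assume Berkeley-style `size -B -d` output; a quick sketch of how program size (text + data) and data size (data + bss) are pulled out of it, with made-up numbers:

# --- sketch: parsing Berkeley `size -B -d` output with the default regexes (illustrative only) ---
import re

size_output = """   text    data     bss     dec     hex filename
  10432     184     328   10944    2ac0 .pio/build/uno/firmware.elf"""

SIZEPROGREGEXP = r"^(\d+)\s+(\d+)\s+\d+\s"      # text + data -> program (flash)
SIZEDATAREGEXP = r"^\d+\s+(\d+)\s+(\d+)\s+\d+"  # data + bss  -> data (RAM)

def calc(pattern):
    total = 0
    for line in size_output.splitlines():
        match = re.search(pattern, line.strip())
        if match:
            total += sum(int(value) for value in match.groups())
    return total

print(calc(SIZEPROGREGEXP), calc(SIZEDATAREGEXP))  # 10616 512
# --- end sketch ---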
@@ -262,9 +264,10 @@ def CheckUploadSize(_, target, source, env):
# "than maximum allowed (%s bytes)\n" % (data_size, data_max_size)) # "than maximum allowed (%s bytes)\n" % (data_size, data_max_size))
# env.Exit(1) # env.Exit(1)
if program_size > program_max_size: if program_size > program_max_size:
sys.stderr.write("Error: The program size (%d bytes) is greater " sys.stderr.write(
"than maximum allowed (%s bytes)\n" % "Error: The program size (%d bytes) is greater "
(program_size, program_max_size)) "than maximum allowed (%s bytes)\n" % (program_size, program_max_size)
)
env.Exit(1) env.Exit(1)
@@ -272,8 +275,7 @@ def PrintUploadInfo(env):
configured = env.subst("$UPLOAD_PROTOCOL") configured = env.subst("$UPLOAD_PROTOCOL")
available = [configured] if configured else [] available = [configured] if configured else []
if "BOARD" in env: if "BOARD" in env:
available.extend(env.BoardConfig().get("upload", available.extend(env.BoardConfig().get("upload", {}).get("protocols", []))
{}).get("protocols", []))
if available: if available:
print("AVAILABLE: %s" % ", ".join(sorted(set(available)))) print("AVAILABLE: %s" % ", ".join(sorted(set(available))))
if configured: if configured:
View File
@@ -14,11 +14,12 @@
from __future__ import absolute_import from __future__ import absolute_import
import fnmatch
import os import os
import sys import sys
from os.path import basename, dirname, isdir, join, realpath
from SCons import Builder, Util # pylint: disable=import-error from SCons import Builder, Util # pylint: disable=import-error
from SCons.Node import FS # pylint: disable=import-error
from SCons.Script import COMMAND_LINE_TARGETS # pylint: disable=import-error from SCons.Script import COMMAND_LINE_TARGETS # pylint: disable=import-error
from SCons.Script import AlwaysBuild # pylint: disable=import-error from SCons.Script import AlwaysBuild # pylint: disable=import-error
from SCons.Script import DefaultEnvironment # pylint: disable=import-error from SCons.Script import DefaultEnvironment # pylint: disable=import-error
@@ -54,7 +55,8 @@ def _build_project_deps(env):
key: project_lib_builder.env.get(key) key: project_lib_builder.env.get(key)
for key in ("LIBS", "LIBPATH", "LINKFLAGS") for key in ("LIBS", "LIBPATH", "LINKFLAGS")
if project_lib_builder.env.get(key) if project_lib_builder.env.get(key)
}) }
)
projenv = env.Clone() projenv = env.Clone()
@@ -65,27 +67,34 @@ def _build_project_deps(env):
is_test = "__test" in COMMAND_LINE_TARGETS is_test = "__test" in COMMAND_LINE_TARGETS
if is_test: if is_test:
projenv.BuildSources("$BUILDTEST_DIR", "$PROJECTTEST_DIR", projenv.BuildSources(
"$PIOTEST_SRC_FILTER") "$BUILD_TEST_DIR", "$PROJECT_TEST_DIR", "$PIOTEST_SRC_FILTER"
if not is_test or env.GetProjectOption("test_build_project_src", False): )
projenv.BuildSources("$BUILDSRC_DIR", "$PROJECTSRC_DIR", if not is_test or env.GetProjectOption("test_build_project_src"):
env.get("SRC_FILTER")) projenv.BuildSources(
"$BUILD_SRC_DIR", "$PROJECT_SRC_DIR", env.get("SRC_FILTER")
)
if not env.get("PIOBUILDFILES") and not COMMAND_LINE_TARGETS: if not env.get("PIOBUILDFILES") and not COMMAND_LINE_TARGETS:
sys.stderr.write( sys.stderr.write(
"Error: Nothing to build. Please put your source code files " "Error: Nothing to build. Please put your source code files "
"to '%s' folder\n" % env.subst("$PROJECTSRC_DIR")) "to '%s' folder\n" % env.subst("$PROJECT_SRC_DIR")
)
env.Exit(1) env.Exit(1)
Export("projenv") Export("projenv")
def BuildProgram(env): def BuildProgram(env):
def _append_pio_macros(): def _append_pio_macros():
env.AppendUnique(CPPDEFINES=[( env.AppendUnique(
"PLATFORMIO", CPPDEFINES=[
int("{0:02d}{1:02d}{2:02d}".format(*pioversion_to_intstr())))]) (
"PLATFORMIO",
int("{0:02d}{1:02d}{2:02d}".format(*pioversion_to_intstr())),
)
]
)
_append_pio_macros() _append_pio_macros()
@@ -95,10 +104,6 @@ def BuildProgram(env):
if not Util.case_sensitive_suffixes(".s", ".S"): if not Util.case_sensitive_suffixes(".s", ".S"):
env.Replace(AS="$CC", ASCOM="$ASPPCOM") env.Replace(AS="$CC", ASCOM="$ASPPCOM")
if ("debug" in COMMAND_LINE_TARGETS
or env.GetProjectOption("build_type") == "debug"):
env.ProcessDebug()
# process extra flags from board # process extra flags from board
if "BOARD" in env and "build.extra_flags" in env.BoardConfig(): if "BOARD" in env and "build.extra_flags" in env.BoardConfig():
env.ProcessFlags(env.BoardConfig().get("build.extra_flags")) env.ProcessFlags(env.BoardConfig().get("build.extra_flags"))
@@ -109,37 +114,45 @@ def BuildProgram(env):
# process framework scripts # process framework scripts
env.BuildFrameworks(env.get("PIOFRAMEWORK")) env.BuildFrameworks(env.get("PIOFRAMEWORK"))
# restore PIO macros if it was deleted by framework is_build_type_debug = (
_append_pio_macros() set(["debug", "sizedata"]) & set(COMMAND_LINE_TARGETS)
or env.GetProjectOption("build_type") == "debug"
)
if is_build_type_debug:
env.ConfigureDebugFlags()
# remove specified flags # remove specified flags
env.ProcessUnFlags(env.get("BUILD_UNFLAGS")) env.ProcessUnFlags(env.get("BUILD_UNFLAGS"))
if "__test" in COMMAND_LINE_TARGETS: if "__test" in COMMAND_LINE_TARGETS:
env.ProcessTest() env.ConfigureTestTarget()
# build project with dependencies
_build_project_deps(env)
# append into the beginning a main LD script # append into the beginning a main LD script
if (env.get("LDSCRIPT_PATH") if env.get("LDSCRIPT_PATH") and not any("-Wl,-T" in f for f in env["LINKFLAGS"]):
and not any("-Wl,-T" in f for f in env['LINKFLAGS'])): env.Prepend(LINKFLAGS=["-T", env.subst("$LDSCRIPT_PATH")])
env.Prepend(LINKFLAGS=["-T", "$LDSCRIPT_PATH"])
# enable "cyclic reference" for linker # enable "cyclic reference" for linker
if env.get("LIBS") and env.GetCompilerType() == "gcc": if env.get("LIBS") and env.GetCompilerType() == "gcc":
env.Prepend(_LIBFLAGS="-Wl,--start-group ") env.Prepend(_LIBFLAGS="-Wl,--start-group ")
env.Append(_LIBFLAGS=" -Wl,--end-group") env.Append(_LIBFLAGS=" -Wl,--end-group")
program = env.Program(join("$BUILD_DIR", env.subst("$PROGNAME")), # build project with dependencies
env['PIOBUILDFILES']) _build_project_deps(env)
program = env.Program(
os.path.join("$BUILD_DIR", env.subst("$PROGNAME")), env["PIOBUILDFILES"]
)
env.Replace(PIOMAINPROG=program) env.Replace(PIOMAINPROG=program)
AlwaysBuild( AlwaysBuild(
env.Alias( env.Alias(
"checkprogsize", program, "checkprogsize",
env.VerboseAction(env.CheckUploadSize, program,
"Checking size $PIOMAINPROG"))) env.VerboseAction(env.CheckUploadSize, "Checking size $PIOMAINPROG"),
)
)
print("Building in %s mode" % ("debug" if is_build_type_debug else "release"))
return program return program
@@ -155,30 +168,30 @@ def ParseFlagsExtended(env, flags): # pylint: disable=too-many-branches
result[key].extend(value) result[key].extend(value)
cppdefines = [] cppdefines = []
for item in result['CPPDEFINES']: for item in result["CPPDEFINES"]:
if not Util.is_Sequence(item): if not Util.is_Sequence(item):
cppdefines.append(item) cppdefines.append(item)
continue continue
name, value = item[:2] name, value = item[:2]
if '\"' in value: if '"' in value:
value = value.replace('\"', '\\\"') value = value.replace('"', '\\"')
elif value.isdigit(): elif value.isdigit():
value = int(value) value = int(value)
elif value.replace(".", "", 1).isdigit(): elif value.replace(".", "", 1).isdigit():
value = float(value) value = float(value)
cppdefines.append((name, value)) cppdefines.append((name, value))
result['CPPDEFINES'] = cppdefines result["CPPDEFINES"] = cppdefines
# fix relative CPPPATH & LIBPATH # fix relative CPPPATH & LIBPATH
for k in ("CPPPATH", "LIBPATH"): for k in ("CPPPATH", "LIBPATH"):
for i, p in enumerate(result.get(k, [])): for i, p in enumerate(result.get(k, [])):
if isdir(p): if os.path.isdir(p):
result[k][i] = realpath(p) result[k][i] = os.path.realpath(p)
# fix relative path for "-include" # fix relative path for "-include"
for i, f in enumerate(result.get("CCFLAGS", [])): for i, f in enumerate(result.get("CCFLAGS", [])):
if isinstance(f, tuple) and f[0] == "-include": if isinstance(f, tuple) and f[0] == "-include":
result['CCFLAGS'][i] = (f[0], env.File(realpath(f[1].get_path()))) result["CCFLAGS"][i] = (f[0], env.File(os.path.realpath(f[1].get_path())))
return result return result
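A quick illustration of the CPPDEFINES value coercion above, pulled out as a standalone helper (the define names and values are made up):

    def coerce_define(item):
        # mirrors the value handling inside ParseFlagsExtended
        if not isinstance(item, (tuple, list)):
            return item
        name, value = item[:2]
        if '"' in value:
            value = value.replace('"', '\\"')   # keep quoted string defines escaped
        elif value.isdigit():
            value = int(value)                  # ("BUFFER_SIZE", "128") -> ("BUFFER_SIZE", 128)
        elif value.replace(".", "", 1).isdigit():
            value = float(value)                # ("VREF", "3.3") -> ("VREF", 3.3)
        return (name, value)

    print(coerce_define(("BUFFER_SIZE", "128")))   # ('BUFFER_SIZE', 128)
    print(coerce_define(("VREF", "3.3")))          # ('VREF', 3.3)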
@@ -191,14 +204,15 @@ def ProcessFlags(env, flags): # pylint: disable=too-many-branches
# Cancel any previous definition of name, either built in or # Cancel any previous definition of name, either built in or
# provided with a -U option // Issue #191 # provided with a -U option // Issue #191
undefines = [ undefines = [
u for u in env.get("CCFLAGS", []) u
for u in env.get("CCFLAGS", [])
if isinstance(u, string_types) and u.startswith("-U") if isinstance(u, string_types) and u.startswith("-U")
] ]
if undefines: if undefines:
for undef in undefines: for undef in undefines:
env['CCFLAGS'].remove(undef) env["CCFLAGS"].remove(undef)
if undef[2:] in env['CPPDEFINES']: if undef[2:] in env["CPPDEFINES"]:
env['CPPDEFINES'].remove(undef[2:]) env["CPPDEFINES"].remove(undef[2:])
env.Append(_CPPDEFFLAGS=" %s" % " ".join(undefines)) env.Append(_CPPDEFFLAGS=" %s" % " ".join(undefines))
@@ -221,8 +235,7 @@ def ProcessUnFlags(env, flags):
for current in env.get(key, []): for current in env.get(key, []):
conditions = [ conditions = [
unflag == current, unflag == current,
isinstance(current, (tuple, list)) isinstance(current, (tuple, list)) and unflag[0] == current[0],
and unflag[0] == current[0]
] ]
if any(conditions): if any(conditions):
env[key].remove(current) env[key].remove(current)
@@ -231,15 +244,14 @@ def ProcessUnFlags(env, flags):
def MatchSourceFiles(env, src_dir, src_filter=None): def MatchSourceFiles(env, src_dir, src_filter=None):
src_filter = env.subst(src_filter) if src_filter else None src_filter = env.subst(src_filter) if src_filter else None
src_filter = src_filter or SRC_FILTER_DEFAULT src_filter = src_filter or SRC_FILTER_DEFAULT
return fs.match_src_files(env.subst(src_dir), src_filter, return fs.match_src_files(
SRC_BUILD_EXT + SRC_HEADER_EXT) env.subst(src_dir), src_filter, SRC_BUILD_EXT + SRC_HEADER_EXT
)
def CollectBuildFiles(env, def CollectBuildFiles(
variant_dir, env, variant_dir, src_dir, src_filter=None, duplicate=False
src_dir, ): # pylint: disable=too-many-locals
src_filter=None,
duplicate=False):
sources = [] sources = []
variants = [] variants = []
@@ -248,27 +260,44 @@ def CollectBuildFiles(env,
src_dir = src_dir[:-1] src_dir = src_dir[:-1]
for item in env.MatchSourceFiles(src_dir, src_filter): for item in env.MatchSourceFiles(src_dir, src_filter):
_reldir = dirname(item) _reldir = os.path.dirname(item)
_src_dir = join(src_dir, _reldir) if _reldir else src_dir _src_dir = os.path.join(src_dir, _reldir) if _reldir else src_dir
_var_dir = join(variant_dir, _reldir) if _reldir else variant_dir _var_dir = os.path.join(variant_dir, _reldir) if _reldir else variant_dir
if _var_dir not in variants: if _var_dir not in variants:
variants.append(_var_dir) variants.append(_var_dir)
env.VariantDir(_var_dir, _src_dir, duplicate) env.VariantDir(_var_dir, _src_dir, duplicate)
if fs.path_endswith_ext(item, SRC_BUILD_EXT): if fs.path_endswith_ext(item, SRC_BUILD_EXT):
sources.append(env.File(join(_var_dir, basename(item)))) sources.append(env.File(os.path.join(_var_dir, os.path.basename(item))))
for callback, pattern in env.get("__PIO_BUILD_MIDDLEWARES", []):
tmp = []
for node in sources:
if pattern and not fnmatch.fnmatch(node.get_path(), pattern):
tmp.append(node)
continue
n = callback(node)
if n:
tmp.append(n)
sources = tmp
return sources return sources
def AddBuildMiddleware(env, callback, pattern=None):
env.Append(__PIO_BUILD_MIDDLEWARES=[(callback, pattern)])
def BuildFrameworks(env, frameworks): def BuildFrameworks(env, frameworks):
if not frameworks: if not frameworks:
return return
if "BOARD" not in env: if "BOARD" not in env:
sys.stderr.write("Please specify `board` in `platformio.ini` to use " sys.stderr.write(
"with '%s' framework\n" % ", ".join(frameworks)) "Please specify `board` in `platformio.ini` to use "
"with '%s' framework\n" % ", ".join(frameworks)
)
env.Exit(1) env.Exit(1)
board_frameworks = env.BoardConfig().get("frameworks", []) board_frameworks = env.BoardConfig().get("frameworks", [])
@@ -276,8 +305,7 @@ def BuildFrameworks(env, frameworks):
if board_frameworks: if board_frameworks:
frameworks.insert(0, board_frameworks[0]) frameworks.insert(0, board_frameworks[0])
else: else:
sys.stderr.write( sys.stderr.write("Error: Please specify `board` in `platformio.ini`\n")
"Error: Please specify `board` in `platformio.ini`\n")
env.Exit(1) env.Exit(1)
for f in frameworks: for f in frameworks:
@@ -290,22 +318,24 @@ def BuildFrameworks(env, frameworks):
if f in board_frameworks: if f in board_frameworks:
SConscript(env.GetFrameworkScript(f), exports="env") SConscript(env.GetFrameworkScript(f), exports="env")
else: else:
sys.stderr.write( sys.stderr.write("Error: This board doesn't support %s framework!\n" % f)
"Error: This board doesn't support %s framework!\n" % f)
env.Exit(1) env.Exit(1)
def BuildLibrary(env, variant_dir, src_dir, src_filter=None): def BuildLibrary(env, variant_dir, src_dir, src_filter=None):
env.ProcessUnFlags(env.get("BUILD_UNFLAGS")) env.ProcessUnFlags(env.get("BUILD_UNFLAGS"))
return env.StaticLibrary( return env.StaticLibrary(
env.subst(variant_dir), env.subst(variant_dir), env.CollectBuildFiles(variant_dir, src_dir, src_filter)
env.CollectBuildFiles(variant_dir, src_dir, src_filter)) )
def BuildSources(env, variant_dir, src_dir, src_filter=None): def BuildSources(env, variant_dir, src_dir, src_filter=None):
nodes = env.CollectBuildFiles(variant_dir, src_dir, src_filter) nodes = env.CollectBuildFiles(variant_dir, src_dir, src_filter)
DefaultEnvironment().Append( DefaultEnvironment().Append(
PIOBUILDFILES=[env.Object(node) for node in nodes]) PIOBUILDFILES=[
env.Object(node) if isinstance(node, FS.File) else node for node in nodes
]
)
def exists(_): def exists(_):
@@ -319,6 +349,7 @@ def generate(env):
env.AddMethod(ProcessUnFlags) env.AddMethod(ProcessUnFlags)
env.AddMethod(MatchSourceFiles) env.AddMethod(MatchSourceFiles)
env.AddMethod(CollectBuildFiles) env.AddMethod(CollectBuildFiles)
env.AddMethod(AddBuildMiddleware)
env.AddMethod(BuildFrameworks) env.AddMethod(BuildFrameworks)
env.AddMethod(BuildLibrary) env.AddMethod(BuildLibrary)
env.AddMethod(BuildSources) env.AddMethod(BuildSources)
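The new AddBuildMiddleware/CollectBuildFiles hooks are what the "Build Middlewares" entry in the release notes refers to: middlewares are applied to collected source nodes, the pattern is matched against the node path with fnmatch, returning a falsy value drops the node and returning another node replaces it. A hedged sketch of how such a middleware might be registered from an extra script (the file globs and flags below are hypothetical):

    # extra_script.py, referenced via extra_scripts in platformio.ini
    Import("env")

    def add_debug_flags(node):
        # replace the node with an object compiled with extra flags
        return env.Object(node, CCFLAGS=env["CCFLAGS"] + ["-Og", "-g3"])

    def skip_node(node):
        # returning None excludes the node from PIOBUILDFILES
        return None

    env.AddBuildMiddleware(add_debug_flags, "*/src/driver_*.cpp")
    env.AddBuildMiddleware(skip_node, "*/ThirdParty/benchmarks/*")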

View File

@@ -13,7 +13,6 @@
# limitations under the License. # limitations under the License.
import os import os
from os.path import dirname, isfile, join
import click import click
@@ -22,13 +21,21 @@ class PlatformioCLI(click.MultiCommand):
leftover_args = [] leftover_args = []
def __init__(self, *args, **kwargs):
super(PlatformioCLI, self).__init__(*args, **kwargs)
self._pio_cmds_dir = os.path.dirname(__file__)
@staticmethod @staticmethod
def in_silence(): def in_silence():
args = PlatformioCLI.leftover_args args = PlatformioCLI.leftover_args
return args and any([ return args and any(
args[0] == "debug" and "--interpreter" in " ".join(args), [
args[0] == "upgrade", "--json-output" in args, "--version" in args args[0] == "debug" and "--interpreter" in " ".join(args),
]) args[0] == "upgrade",
"--json-output" in args,
"--version" in args,
]
)
def invoke(self, ctx): def invoke(self, ctx):
PlatformioCLI.leftover_args = ctx.args PlatformioCLI.leftover_args = ctx.args
@@ -38,35 +45,23 @@ class PlatformioCLI(click.MultiCommand):
def list_commands(self, ctx): def list_commands(self, ctx):
cmds = [] cmds = []
cmds_dir = dirname(__file__) for cmd_name in os.listdir(self._pio_cmds_dir):
for name in os.listdir(cmds_dir): if cmd_name.startswith("__init__"):
if name.startswith("__init__"):
continue continue
if isfile(join(cmds_dir, name, "command.py")): if os.path.isfile(os.path.join(self._pio_cmds_dir, cmd_name, "command.py")):
cmds.append(name) cmds.append(cmd_name)
elif name.endswith(".py"): elif cmd_name.endswith(".py"):
cmds.append(name[:-3]) cmds.append(cmd_name[:-3])
cmds.sort() cmds.sort()
return cmds return cmds
def get_command(self, ctx, cmd_name): def get_command(self, ctx, cmd_name):
mod = None mod = None
try: try:
mod = __import__("platformio.commands." + cmd_name, None, None, mod_path = "platformio.commands." + cmd_name
["cli"]) if os.path.isfile(os.path.join(self._pio_cmds_dir, cmd_name, "command.py")):
mod_path = "platformio.commands.%s.command" % cmd_name
mod = __import__(mod_path, None, None, ["cli"])
except ImportError: except ImportError:
try: raise click.UsageError('No such command "%s"' % cmd_name, ctx)
return self._handle_obsolate_command(cmd_name)
except AttributeError:
raise click.UsageError('No such command "%s"' % cmd_name, ctx)
return mod.cli return mod.cli
@staticmethod
def _handle_obsolate_command(name):
if name == "platforms":
from platformio.commands import platform
return platform.cli
if name == "serialports":
from platformio.commands import device
return device.cli
raise AttributeError()
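For clarity, the reworked get_command now resolves both command layouts, single-module commands and packages with a command.py entry point. A rough standalone sketch of that resolution (module names are examples taken from this diff):

    import os

    def resolve_command_module(cmds_dir, cmd_name):
        # mirrors PlatformioCLI.get_command above
        mod_path = "platformio.commands." + cmd_name
        if os.path.isfile(os.path.join(cmds_dir, cmd_name, "command.py")):
            mod_path = "platformio.commands.%s.command" % cmd_name
        return mod_path

    # e.g. "boards" -> "platformio.commands.boards"         (single module)
    #      "check"  -> "platformio.commands.check.command"  (package with command.py)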

View File

@@ -34,9 +34,9 @@ def cli(query, installed, json_output): # pylint: disable=R0912
for board in _get_boards(installed): for board in _get_boards(installed):
if query and query.lower() not in json.dumps(board).lower(): if query and query.lower() not in json.dumps(board).lower():
continue continue
if board['platform'] not in grpboards: if board["platform"] not in grpboards:
grpboards[board['platform']] = [] grpboards[board["platform"]] = []
grpboards[board['platform']].append(board) grpboards[board["platform"]].append(board)
terminal_width, _ = click.get_terminal_size() terminal_width, _ = click.get_terminal_size()
for (platform, boards) in sorted(grpboards.items()): for (platform, boards) in sorted(grpboards.items()):
@@ -50,11 +50,21 @@ def cli(query, installed, json_output): # pylint: disable=R0912
def print_boards(boards): def print_boards(boards):
click.echo( click.echo(
tabulate([(click.style(b['id'], fg="cyan"), b['mcu'], "%dMHz" % tabulate(
(b['fcpu'] / 1000000), fs.format_filesize( [
b['rom']), fs.format_filesize(b['ram']), b['name']) (
for b in boards], click.style(b["id"], fg="cyan"),
headers=["ID", "MCU", "Frequency", "Flash", "RAM", "Name"])) b["mcu"],
"%dMHz" % (b["fcpu"] / 1000000),
fs.format_filesize(b["rom"]),
fs.format_filesize(b["ram"]),
b["name"],
)
for b in boards
],
headers=["ID", "MCU", "Frequency", "Flash", "RAM", "Name"],
)
)
def _get_boards(installed=False): def _get_boards(installed=False):
@@ -66,7 +76,7 @@ def _print_boards_json(query, installed=False):
result = [] result = []
for board in _get_boards(installed): for board in _get_boards(installed):
if query: if query:
search_data = "%s %s" % (board['id'], json.dumps(board).lower()) search_data = "%s %s" % (board["id"], json.dumps(board).lower())
if query.lower() not in search_data.lower(): if query.lower() not in search_data.lower():
continue continue
result.append(board) result.append(board)

View File

@@ -0,0 +1,13 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

View File

@@ -0,0 +1,316 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=too-many-arguments,too-many-locals,too-many-branches
# pylint: disable=redefined-builtin,too-many-statements
import os
from collections import Counter
from os.path import dirname, isfile
from time import time
import click
from tabulate import tabulate
from platformio import app, exception, fs, util
from platformio.commands.check.defect import DefectItem
from platformio.commands.check.tools import CheckToolFactory
from platformio.compat import dump_json_to_unicode
from platformio.project.config import ProjectConfig
from platformio.project.helpers import find_project_dir_above, get_project_dir
@click.command("check", short_help="Run a static analysis tool on code")
@click.option("-e", "--environment", multiple=True)
@click.option(
"-d",
"--project-dir",
default=os.getcwd,
type=click.Path(
exists=True, file_okay=True, dir_okay=True, writable=True, resolve_path=True
),
)
@click.option(
"-c",
"--project-conf",
type=click.Path(
exists=True, file_okay=True, dir_okay=False, readable=True, resolve_path=True
),
)
@click.option("--pattern", multiple=True)
@click.option("--flags", multiple=True)
@click.option(
"--severity", multiple=True, type=click.Choice(DefectItem.SEVERITY_LABELS.values())
)
@click.option("-s", "--silent", is_flag=True)
@click.option("-v", "--verbose", is_flag=True)
@click.option("--json-output", is_flag=True)
@click.option(
"--fail-on-defect",
multiple=True,
type=click.Choice(DefectItem.SEVERITY_LABELS.values()),
)
def cli(
environment,
project_dir,
project_conf,
pattern,
flags,
severity,
silent,
verbose,
json_output,
fail_on_defect,
):
app.set_session_var("custom_project_conf", project_conf)
# find project directory on upper level
if isfile(project_dir):
project_dir = find_project_dir_above(project_dir)
results = []
with fs.cd(project_dir):
config = ProjectConfig.get_instance(project_conf)
config.validate(environment)
default_envs = config.default_envs()
for envname in config.envs():
skipenv = any(
[
environment and envname not in environment,
not environment and default_envs and envname not in default_envs,
]
)
env_options = config.items(env=envname, as_dict=True)
env_dump = []
for k, v in env_options.items():
if k not in ("platform", "framework", "board"):
continue
env_dump.append(
"%s: %s" % (k, ", ".join(v) if isinstance(v, list) else v)
)
default_patterns = [
config.get_optional_dir("src"),
config.get_optional_dir("include"),
]
tool_options = dict(
verbose=verbose,
silent=silent,
patterns=pattern or env_options.get("check_patterns", default_patterns),
flags=flags or env_options.get("check_flags"),
severity=[DefectItem.SEVERITY_LABELS[DefectItem.SEVERITY_HIGH]]
if silent
else severity or config.get("env:" + envname, "check_severity"),
)
for tool in config.get("env:" + envname, "check_tool"):
if skipenv:
results.append({"env": envname, "tool": tool})
continue
if not silent and not json_output:
print_processing_header(tool, envname, env_dump)
ct = CheckToolFactory.new(
tool, project_dir, config, envname, tool_options
)
result = {"env": envname, "tool": tool, "duration": time()}
rc = ct.check(
on_defect_callback=None
if (json_output or verbose)
else lambda defect: click.echo(repr(defect))
)
result["defects"] = ct.get_defects()
result["duration"] = time() - result["duration"]
result["succeeded"] = rc == 0
if fail_on_defect:
result["succeeded"] = rc == 0 and not any(
DefectItem.SEVERITY_LABELS[d.severity] in fail_on_defect
for d in result["defects"]
)
result["stats"] = collect_component_stats(result)
results.append(result)
if verbose:
click.echo("\n".join(repr(d) for d in result["defects"]))
if not json_output and not silent:
if rc != 0:
click.echo(
"Error: %s failed to perform check! Please "
"examine tool output in verbose mode." % tool
)
elif not result["defects"]:
click.echo("No defects found")
print_processing_footer(result)
if json_output:
click.echo(dump_json_to_unicode(results_to_json(results)))
elif not silent:
print_check_summary(results)
command_failed = any(r.get("succeeded") is False for r in results)
if command_failed:
raise exception.ReturnErrorCode(1)
def results_to_json(raw):
results = []
for item in raw:
if item.get("succeeded") is None:
continue
item.update(
{
"succeeded": bool(item.get("succeeded")),
"defects": [d.as_dict() for d in item.get("defects", [])],
}
)
results.append(item)
return results
def print_processing_header(tool, envname, envdump):
click.echo(
"Checking %s > %s (%s)"
% (click.style(envname, fg="cyan", bold=True), tool, "; ".join(envdump))
)
terminal_width, _ = click.get_terminal_size()
click.secho("-" * terminal_width, bold=True)
def print_processing_footer(result):
is_failed = not result.get("succeeded")
util.print_labeled_bar(
"[%s] Took %.2f seconds"
% (
(
click.style("FAILED", fg="red", bold=True)
if is_failed
else click.style("PASSED", fg="green", bold=True)
),
result["duration"],
),
is_error=is_failed,
)
def collect_component_stats(result):
components = dict()
def _append_defect(component, defect):
if not components.get(component):
components[component] = Counter()
components[component].update({DefectItem.SEVERITY_LABELS[defect.severity]: 1})
for defect in result.get("defects", []):
component = dirname(defect.file) or defect.file
_append_defect(component, defect)
if component.startswith(get_project_dir()):
while os.sep in component:
component = dirname(component)
_append_defect(component, defect)
return components
def print_defects_stats(results):
if not results:
return
component_stats = {}
for r in results:
for k, v in r.get("stats", {}).items():
if not component_stats.get(k):
component_stats[k] = Counter()
component_stats[k].update(r["stats"][k])
if not component_stats:
return
severity_labels = list(DefectItem.SEVERITY_LABELS.values())
severity_labels.reverse()
tabular_data = list()
for k, v in component_stats.items():
tool_defect = [v.get(s, 0) for s in severity_labels]
tabular_data.append([k] + tool_defect)
total = ["Total"] + [sum(d) for d in list(zip(*tabular_data))[1:]]
tabular_data.sort()
tabular_data.append([]) # Empty line as delimiter
tabular_data.append(total)
headers = ["Component"]
headers.extend([l.upper() for l in severity_labels])
headers = [click.style(h, bold=True) for h in headers]
click.echo(tabulate(tabular_data, headers=headers, numalign="center"))
click.echo()
def print_check_summary(results):
click.echo()
tabular_data = []
succeeded_nums = 0
failed_nums = 0
duration = 0
print_defects_stats(results)
for result in results:
duration += result.get("duration", 0)
if result.get("succeeded") is False:
failed_nums += 1
status_str = click.style("FAILED", fg="red")
elif result.get("succeeded") is None:
status_str = "IGNORED"
else:
succeeded_nums += 1
status_str = click.style("PASSED", fg="green")
tabular_data.append(
(
click.style(result["env"], fg="cyan"),
result["tool"],
status_str,
util.humanize_duration_time(result.get("duration")),
)
)
click.echo(
tabulate(
tabular_data,
headers=[
click.style(s, bold=True)
for s in ("Environment", "Tool", "Status", "Duration")
],
),
err=failed_nums,
)
util.print_labeled_bar(
"%s%d succeeded in %s"
% (
"%d failed, " % failed_nums if failed_nums else "",
succeeded_nums,
util.humanize_duration_time(duration),
),
is_error=failed_nums,
fg="red" if failed_nums else "green",
)
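Each per-environment run above is summarized as a plain dict before being rendered or serialized by results_to_json; a hedged sketch of the shape (all values invented, skipped environments carry only "env" and "tool"):

    from collections import Counter

    result = {
        "env": "uno",                        # environment name from platformio.ini
        "tool": "cppcheck",                  # one of the configured check_tool values
        "duration": 3.42,                    # seconds spent inside ct.check()
        "succeeded": True,                   # rc == 0, optionally AND-ed with --fail-on-defect
        "defects": [],                       # DefectItem instances from ct.get_defects()
        "stats": {"src": Counter(low=2)},    # per-component counters from collect_component_stats
    }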

View File

@@ -0,0 +1,95 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from os.path import abspath, relpath
import click
from platformio.project.helpers import get_project_dir
# pylint: disable=too-many-instance-attributes, redefined-builtin
# pylint: disable=too-many-arguments
class DefectItem(object):
SEVERITY_HIGH = 1
SEVERITY_MEDIUM = 2
SEVERITY_LOW = 4
SEVERITY_LABELS = {4: "low", 2: "medium", 1: "high"}
def __init__(
self,
severity,
category,
message,
file="unknown",
line=0,
column=0,
id=None,
callstack=None,
cwe=None,
):
assert severity in (self.SEVERITY_HIGH, self.SEVERITY_MEDIUM, self.SEVERITY_LOW)
self.severity = severity
self.category = category
self.message = message
self.line = int(line)
self.column = int(column)
self.callstack = callstack
self.cwe = cwe
self.id = id
self.file = file
if file.startswith(get_project_dir()):
self.file = relpath(file, get_project_dir())
def __repr__(self):
defect_color = None
if self.severity == self.SEVERITY_HIGH:
defect_color = "red"
elif self.severity == self.SEVERITY_MEDIUM:
defect_color = "yellow"
format_str = "{file}:{line}: [{severity}:{category}] {message} {id}"
return format_str.format(
severity=click.style(self.SEVERITY_LABELS[self.severity], fg=defect_color),
category=click.style(self.category.lower(), fg=defect_color),
file=click.style(self.file, bold=True),
message=self.message,
line=self.line,
id="%s" % "[%s]" % self.id if self.id else "",
)
def __or__(self, defect):
return self.severity | defect.severity
@staticmethod
def severity_to_int(label):
for key, value in DefectItem.SEVERITY_LABELS.items():
if label == value:
return key
raise Exception("Unknown severity label -> %s" % label)
def as_dict(self):
return {
"severity": self.SEVERITY_LABELS[self.severity],
"category": self.category,
"message": self.message,
"file": abspath(self.file),
"line": self.line,
"column": self.column,
"callstack": self.callstack,
"id": self.id,
"cwe": self.cwe,
}
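A minimal usage illustration of the class above (file path, message and defect id are invented; run inside a project so get_project_dir() resolves):

    from platformio.commands.check.defect import DefectItem

    d = DefectItem(
        severity=DefectItem.SEVERITY_MEDIUM,
        category="warning",
        message="Variable 'x' is assigned a value that is never used",
        file="src/main.cpp",
        line=42,
        column=9,
        id="unreadVariable",
    )
    print(repr(d))
    # roughly: src/main.cpp:42: [medium:warning] Variable 'x' is assigned a value
    # that is never used [unreadVariable]   (plus ANSI color codes from click.style)
    print(d.as_dict()["severity"])   # medium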

View File

@@ -0,0 +1,30 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio import exception
from platformio.commands.check.tools.clangtidy import ClangtidyCheckTool
from platformio.commands.check.tools.cppcheck import CppcheckCheckTool
class CheckToolFactory(object):
@staticmethod
def new(tool, project_dir, config, envname, options):
cls = None
if tool == "cppcheck":
cls = CppcheckCheckTool
elif tool == "clangtidy":
cls = ClangtidyCheckTool
else:
raise exception.PlatformioException("Unknown check tool `%s`" % tool)
return cls(project_dir, config, envname, options)
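Roughly how the check command above drives this factory; a non-authoritative sketch where the project path, environment name and the ProjectConfig instance are placeholders:

    from platformio.commands.check.tools import CheckToolFactory
    from platformio.project.config import ProjectConfig

    config = ProjectConfig.get_instance("platformio.ini")   # assumes cwd is the project
    options = dict(
        verbose=False,
        silent=False,
        patterns=["src", "include"],       # normally check_patterns or the src/include dirs
        flags=[],
        severity=["high", "medium", "low"],
    )
    ct = CheckToolFactory.new("cppcheck", "/path/to/project", config, "uno", options)
    rc = ct.check(on_defect_callback=lambda d: print(repr(d)))
    defects = ct.get_defects()             # DefectItem list collected during the run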

View File

@@ -0,0 +1,171 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import glob
import os
import click
from platformio import fs, proc
from platformio.commands.check.defect import DefectItem
from platformio.project.helpers import get_project_dir, load_project_ide_data
class CheckToolBase(object): # pylint: disable=too-many-instance-attributes
def __init__(self, project_dir, config, envname, options):
self.config = config
self.envname = envname
self.options = options
self.cpp_defines = []
self.cpp_flags = []
self.cpp_includes = []
self._defects = []
self._on_defect_callback = None
self._bad_input = False
self._load_cpp_data(project_dir, envname)
# detect all defects by default
if not self.options.get("severity"):
self.options["severity"] = [
DefectItem.SEVERITY_LOW,
DefectItem.SEVERITY_MEDIUM,
DefectItem.SEVERITY_HIGH,
]
# cast to severity by ids
self.options["severity"] = [
s if isinstance(s, int) else DefectItem.severity_to_int(s)
for s in self.options["severity"]
]
def _load_cpp_data(self, project_dir, envname):
data = load_project_ide_data(project_dir, envname)
if not data:
return
self.cpp_flags = data.get("cxx_flags", "").split(" ")
self.cpp_includes = data.get("includes", [])
self.cpp_defines = data.get("defines", [])
self.cpp_defines.extend(self._get_toolchain_defines(data.get("cc_path")))
def get_flags(self, tool):
result = []
flags = self.options.get("flags") or []
for flag in flags:
if ":" not in flag:
result.extend([f for f in flag.split(" ") if f])
elif flag.startswith("%s:" % tool):
result.extend([f for f in flag.split(":", 1)[1].split(" ") if f])
return result
@staticmethod
def _get_toolchain_defines(cc_path):
defines = []
result = proc.exec_command("echo | %s -dM -E -x c++ -" % cc_path, shell=True)
for line in result["out"].split("\n"):
tokens = line.strip().split(" ", 2)
if not tokens or tokens[0] != "#define":
continue
if len(tokens) > 2:
defines.append("%s=%s" % (tokens[1], tokens[2]))
else:
defines.append(tokens[1])
return defines
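_get_toolchain_defines above asks the compiler for its built-in macros ("echo | <cc> -dM -E -x c++ -") and tokenizes each "#define" line; a standalone illustration of that tokenization (the sample lines are typical GCC output, actual values vary by toolchain):

    def parse_dump_line(line):
        # same tokenization as in _get_toolchain_defines
        tokens = line.strip().split(" ", 2)
        if not tokens or tokens[0] != "#define":
            return None
        return "%s=%s" % (tokens[1], tokens[2]) if len(tokens) > 2 else tokens[1]

    print(parse_dump_line("#define __GNUC__ 7"))            # __GNUC__=7
    print(parse_dump_line("#define __cplusplus 201402L"))   # __cplusplus=201402L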
@staticmethod
def is_flag_set(flag, flags):
return any(flag in f for f in flags)
def get_defects(self):
return self._defects
def configure_command(self):
raise NotImplementedError
def on_tool_output(self, line):
line = self.tool_output_filter(line)
if not line:
return
defect = self.parse_defect(line)
if not isinstance(defect, DefectItem):
if self.options.get("verbose"):
click.echo(line)
return
if defect.severity not in self.options["severity"]:
return
self._defects.append(defect)
if self._on_defect_callback:
self._on_defect_callback(defect)
@staticmethod
def tool_output_filter(line):
return line
@staticmethod
def parse_defect(raw_line):
return raw_line
def clean_up(self):
pass
def get_project_target_files(self):
allowed_extensions = (".h", ".hpp", ".c", ".cc", ".cpp", ".ino")
result = []
def _add_file(path):
if not path.endswith(allowed_extensions):
return
result.append(os.path.abspath(path))
for pattern in self.options["patterns"]:
for item in glob.glob(pattern):
if not os.path.isdir(item):
_add_file(item)
for root, _, files in os.walk(item, followlinks=True):
for f in files:
_add_file(os.path.join(root, f))
return result
def get_source_language(self):
with fs.cd(get_project_dir()):
for _, __, files in os.walk(self.config.get_optional_dir("src")):
for name in files:
if "." not in name:
continue
if os.path.splitext(name)[1].lower() in (".cpp", ".cxx", ".ino"):
return "c++"
return "c"
def check(self, on_defect_callback=None):
self._on_defect_callback = on_defect_callback
cmd = self.configure_command()
if self.options.get("verbose"):
click.echo(" ".join(cmd))
proc.exec_command(
cmd,
stdout=proc.LineBufferedAsyncPipe(self.on_tool_output),
stderr=proc.LineBufferedAsyncPipe(self.on_tool_output),
)
self.clean_up()
return self._bad_input
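CheckToolBase leaves the tool-specific pieces to subclasses: configure_command() builds the command line, while tool_output_filter() and parse_defect() turn output lines into DefectItem objects, as the clangtidy and cppcheck adapters that follow show. A skeletal, purely hypothetical adapter for some other linter might look roughly like this ("mylint" is a made-up tool):

    from platformio.commands.check.tools.base import CheckToolBase

    class MyLintCheckTool(CheckToolBase):
        def configure_command(self):
            cmd = ["mylint", "--machine-readable"]
            cmd.extend("-D%s" % d for d in self.cpp_defines)
            cmd.extend("-I%s" % inc for inc in self.cpp_includes)
            cmd.extend(self.get_project_target_files())
            return cmd

        def parse_defect(self, raw_line):
            # return a DefectItem to record the line as a defect;
            # returning the raw line just echoes it in verbose mode
            return raw_line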

View File

@@ -0,0 +1,67 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import re
from os.path import join
from platformio.commands.check.defect import DefectItem
from platformio.commands.check.tools.base import CheckToolBase
from platformio.managers.core import get_core_package_dir
class ClangtidyCheckTool(CheckToolBase):
def tool_output_filter(self, line):
if not self.options.get("verbose") and "[clang-diagnostic-error]" in line:
return ""
if "[CommonOptionsParser]" in line:
self._bad_input = True
return line
if any(d in line for d in ("note: ", "error: ", "warning: ")):
return line
return ""
def parse_defect(self, raw_line):
match = re.match(r"^(.*):(\d+):(\d+):\s+([^:]+):\s(.+)\[([^]]+)\]$", raw_line)
if not match:
return raw_line
file_, line, column, category, message, defect_id = match.groups()
severity = DefectItem.SEVERITY_LOW
if category == "error":
severity = DefectItem.SEVERITY_HIGH
elif category == "warning":
severity = DefectItem.SEVERITY_MEDIUM
return DefectItem(severity, category, message, file_, line, column, defect_id)
def configure_command(self):
tool_path = join(get_core_package_dir("tool-clangtidy"), "clang-tidy")
cmd = [tool_path, "--quiet"]
flags = self.get_flags("clangtidy")
if not self.is_flag_set("--checks", flags):
cmd.append("--checks=*")
cmd.extend(flags)
cmd.extend(self.get_project_target_files())
cmd.append("--")
cmd.extend(["-D%s" % d for d in self.cpp_defines])
cmd.extend(["-I%s" % inc for inc in self.cpp_includes])
return cmd
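For reference, the parse_defect regex above targets the usual clang-tidy diagnostic layout; a runnable check against a sample line (the diagnostic text itself is invented):

    import re

    sample = ("src/main.cpp:12:5: warning: Value stored to 'x' is never read "
              "[clang-analyzer-deadcode.DeadStores]")
    groups = re.match(
        r"^(.*):(\d+):(\d+):\s+([^:]+):\s(.+)\[([^]]+)\]$", sample
    ).groups()
    print(groups)
    # ('src/main.cpp', '12', '5', 'warning',
    #  "Value stored to 'x' is never read ", 'clang-analyzer-deadcode.DeadStores')
    # "warning" maps to DefectItem.SEVERITY_MEDIUM in parse_defect above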

View File

@@ -0,0 +1,158 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from os import remove
from os.path import isfile, join
from tempfile import NamedTemporaryFile
from platformio.commands.check.defect import DefectItem
from platformio.commands.check.tools.base import CheckToolBase
from platformio.managers.core import get_core_package_dir
class CppcheckCheckTool(CheckToolBase):
def __init__(self, *args, **kwargs):
self._tmp_files = []
self.defect_fields = [
"severity",
"message",
"file",
"line",
"column",
"callstack",
"cwe",
"id",
]
super(CppcheckCheckTool, self).__init__(*args, **kwargs)
def tool_output_filter(self, line):
if (
not self.options.get("verbose")
and "--suppress=unmatchedSuppression:" in line
):
return ""
if any(
msg in line
for msg in (
"No C or C++ source files found",
"unrecognized command line option",
)
):
self._bad_input = True
return line
def parse_defect(self, raw_line):
if "<&PIO&>" not in raw_line or any(
f not in raw_line for f in self.defect_fields
):
return None
args = dict()
for field in raw_line.split("<&PIO&>"):
field = field.strip().replace('"', "")
name, value = field.split("=", 1)
args[name] = value
args["category"] = args["severity"]
if args["severity"] == "error":
args["severity"] = DefectItem.SEVERITY_HIGH
elif args["severity"] == "warning":
args["severity"] = DefectItem.SEVERITY_MEDIUM
else:
args["severity"] = DefectItem.SEVERITY_LOW
return DefectItem(**args)
def configure_command(self):
tool_path = join(get_core_package_dir("tool-cppcheck"), "cppcheck")
cmd = [
tool_path,
"--error-exitcode=1",
"--verbose" if self.options.get("verbose") else "--quiet",
]
cmd.append(
'--template="%s"'
% "<&PIO&>".join(["{0}={{{0}}}".format(f) for f in self.defect_fields])
)
flags = self.get_flags("cppcheck")
if not flags:
# by default user can suppress reporting individual defects
# directly in code // cppcheck-suppress warningID
cmd.append("--inline-suppr")
if not self.is_flag_set("--platform", flags):
cmd.append("--platform=unspecified")
if not self.is_flag_set("--enable", flags):
enabled_checks = [
"warning",
"style",
"performance",
"portability",
"unusedFunction",
]
cmd.append("--enable=%s" % ",".join(enabled_checks))
if not self.is_flag_set("--language", flags):
if self.get_source_language() == "c++":
cmd.append("--language=c++")
if not self.is_flag_set("--std", flags):
for f in self.cpp_flags:
if "-std" in f:
# Standards with GNU extensions are not allowed
cmd.append("-" + f.replace("gnu", "c"))
cmd.extend(["-D%s" % d for d in self.cpp_defines])
cmd.extend(flags)
cmd.append("--file-list=%s" % self._generate_src_file())
cmd.append("--includes-file=%s" % self._generate_inc_file())
core_dir = self.config.get_optional_dir("core")
cmd.append("--suppress=*:%s*" % core_dir)
cmd.append("--suppress=unmatchedSuppression:%s*" % core_dir)
return cmd
def _create_tmp_file(self, data):
with NamedTemporaryFile("w", delete=False) as fp:
fp.write(data)
self._tmp_files.append(fp.name)
return fp.name
def _generate_src_file(self):
src_files = [
f for f in self.get_project_target_files() if not f.endswith((".h", ".hpp"))
]
return self._create_tmp_file("\n".join(src_files))
def _generate_inc_file(self):
return self._create_tmp_file("\n".join(self.cpp_includes))
def clean_up(self):
for f in self._tmp_files:
if isfile(f):
remove(f)
# delete temporary dump files generated by addons
if not self.is_flag_set("--addon", self.get_flags("cppcheck")):
return
for f in self.get_project_target_files():
dump_file = f + ".dump"
if isfile(dump_file):
remove(dump_file)
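The custom --template above makes every cppcheck finding arrive as one machine-parseable line; a runnable illustration of the round trip (the defect values are invented):

    defect_fields = ["severity", "message", "file", "line",
                     "column", "callstack", "cwe", "id"]
    template = "<&PIO&>".join("{0}={{{0}}}".format(f) for f in defect_fields)
    # severity={severity}<&PIO&>message={message}<&PIO&>file={file}<&PIO&>...

    raw_line = ("severity=style<&PIO&>message=The scope of the variable 'i' can be reduced"
                "<&PIO&>file=src/main.c<&PIO&>line=10<&PIO&>column=9"
                "<&PIO&>callstack=<&PIO&>cwe=398<&PIO&>id=variableScope")
    args = dict(f.strip().replace('"', "").split("=", 1) for f in raw_line.split("<&PIO&>"))
    print(args["severity"], args["id"])   # style variableScope  ("style" -> SEVERITY_LOW)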

View File

@@ -14,7 +14,7 @@
from glob import glob from glob import glob
from os import getenv, makedirs, remove from os import getenv, makedirs, remove
from os.path import abspath, basename, expanduser, isdir, isfile, join from os.path import abspath, basename, isdir, isfile, join
from shutil import copyfile, copytree from shutil import copyfile, copytree
from tempfile import mkdtemp from tempfile import mkdtemp
@@ -23,7 +23,7 @@ import click
from platformio import app, fs from platformio import app, fs
from platformio.commands.init import cli as cmd_init from platformio.commands.init import cli as cmd_init
from platformio.commands.init import validate_boards from platformio.commands.init import validate_boards
from platformio.commands.run import cli as cmd_run from platformio.commands.run.command import cli as cmd_run
from platformio.compat import glob_escape from platformio.compat import glob_escape
from platformio.exception import CIBuildEnvsEmpty from platformio.exception import CIBuildEnvsEmpty
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
@@ -34,7 +34,7 @@ def validate_path(ctx, param, value): # pylint: disable=unused-argument
value = list(value) value = list(value)
for i, p in enumerate(value): for i, p in enumerate(value):
if p.startswith("~"): if p.startswith("~"):
value[i] = expanduser(p) value[i] = fs.expanduser(p)
value[i] = abspath(value[i]) value[i] = abspath(value[i])
if not glob(value[i]): if not glob(value[i]):
invalid_path = p invalid_path = p
@@ -48,37 +48,37 @@ def validate_path(ctx, param, value): # pylint: disable=unused-argument
@click.command("ci", short_help="Continuous Integration") @click.command("ci", short_help="Continuous Integration")
@click.argument("src", nargs=-1, callback=validate_path) @click.argument("src", nargs=-1, callback=validate_path)
@click.option("-l", @click.option("-l", "--lib", multiple=True, callback=validate_path, metavar="DIRECTORY")
"--lib",
multiple=True,
callback=validate_path,
metavar="DIRECTORY")
@click.option("--exclude", multiple=True) @click.option("--exclude", multiple=True)
@click.option("-b", @click.option("-b", "--board", multiple=True, metavar="ID", callback=validate_boards)
"--board", @click.option(
multiple=True, "--build-dir",
metavar="ID", default=mkdtemp,
callback=validate_boards) type=click.Path(file_okay=False, dir_okay=True, writable=True, resolve_path=True),
@click.option("--build-dir", )
default=mkdtemp,
type=click.Path(file_okay=False,
dir_okay=True,
writable=True,
resolve_path=True))
@click.option("--keep-build-dir", is_flag=True) @click.option("--keep-build-dir", is_flag=True)
@click.option("-c", @click.option(
"--project-conf", "-c",
type=click.Path(exists=True, "--project-conf",
file_okay=True, type=click.Path(
dir_okay=False, exists=True, file_okay=True, dir_okay=False, readable=True, resolve_path=True
readable=True, ),
resolve_path=True)) )
@click.option("-O", "--project-option", multiple=True) @click.option("-O", "--project-option", multiple=True)
@click.option("-v", "--verbose", is_flag=True) @click.option("-v", "--verbose", is_flag=True)
@click.pass_context @click.pass_context
def cli( # pylint: disable=too-many-arguments, too-many-branches def cli( # pylint: disable=too-many-arguments, too-many-branches
ctx, src, lib, exclude, board, build_dir, keep_build_dir, project_conf, ctx,
project_option, verbose): src,
lib,
exclude,
board,
build_dir,
keep_build_dir,
project_conf,
project_option,
verbose,
):
if not src and getenv("PLATFORMIO_CI_SRC"): if not src and getenv("PLATFORMIO_CI_SRC"):
src = validate_path(ctx, None, getenv("PLATFORMIO_CI_SRC").split(":")) src = validate_path(ctx, None, getenv("PLATFORMIO_CI_SRC").split(":"))
@@ -110,10 +110,9 @@ def cli( # pylint: disable=too-many-arguments, too-many-branches
_exclude_contents(build_dir, exclude) _exclude_contents(build_dir, exclude)
# initialise project # initialise project
ctx.invoke(cmd_init, ctx.invoke(
project_dir=build_dir, cmd_init, project_dir=build_dir, board=board, project_option=project_option
board=board, )
project_option=project_option)
# process project # process project
ctx.invoke(cmd_run, project_dir=build_dir, verbose=verbose) ctx.invoke(cmd_run, project_dir=build_dir, verbose=verbose)
@@ -127,27 +126,27 @@ def _copy_contents(dst_dir, contents):
for path in contents: for path in contents:
if isdir(path): if isdir(path):
items['dirs'].add(path) items["dirs"].add(path)
elif isfile(path): elif isfile(path):
items['files'].add(path) items["files"].add(path)
dst_dir_name = basename(dst_dir) dst_dir_name = basename(dst_dir)
if dst_dir_name == "src" and len(items['dirs']) == 1: if dst_dir_name == "src" and len(items["dirs"]) == 1:
copytree(list(items['dirs']).pop(), dst_dir, symlinks=True) copytree(list(items["dirs"]).pop(), dst_dir, symlinks=True)
else: else:
if not isdir(dst_dir): if not isdir(dst_dir):
makedirs(dst_dir) makedirs(dst_dir)
for d in items['dirs']: for d in items["dirs"]:
copytree(d, join(dst_dir, basename(d)), symlinks=True) copytree(d, join(dst_dir, basename(d)), symlinks=True)
if not items['files']: if not items["files"]:
return return
if dst_dir_name == "lib": if dst_dir_name == "lib":
dst_dir = join(dst_dir, mkdtemp(dir=dst_dir)) dst_dir = join(dst_dir, mkdtemp(dir=dst_dir))
for f in items['files']: for f in items["files"]:
dst_file = join(dst_dir, basename(f)) dst_file = join(dst_dir, basename(f))
if f == dst_file: if f == dst_file:
continue continue

View File

@@ -11,5 +11,3 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from platformio.commands.debug.command import cli

View File

@@ -30,7 +30,7 @@ from platformio import app, exception, fs, proc, util
from platformio.commands.debug import helpers, initcfgs from platformio.commands.debug import helpers, initcfgs
from platformio.commands.debug.process import BaseProcess from platformio.commands.debug.process import BaseProcess
from platformio.commands.debug.server import DebugServer from platformio.commands.debug.server import DebugServer
from platformio.compat import hashlib_encode_data from platformio.compat import hashlib_encode_data, is_bytes
from platformio.project.helpers import get_project_cache_dir from platformio.project.helpers import get_project_cache_dir
from platformio.telemetry import MeasurementProtocol from platformio.telemetry import MeasurementProtocol
@@ -53,8 +53,7 @@ class GDBClient(BaseProcess): # pylint: disable=too-many-instance-attributes
if not isdir(get_project_cache_dir()): if not isdir(get_project_cache_dir()):
os.makedirs(get_project_cache_dir()) os.makedirs(get_project_cache_dir())
self._gdbsrc_dir = mkdtemp(dir=get_project_cache_dir(), self._gdbsrc_dir = mkdtemp(dir=get_project_cache_dir(), prefix=".piodebug-")
prefix=".piodebug-")
self._target_is_run = False self._target_is_run = False
self._last_server_activity = 0 self._last_server_activity = 0
@@ -70,39 +69,40 @@ class GDBClient(BaseProcess): # pylint: disable=too-many-instance-attributes
"PROG_PATH": prog_path, "PROG_PATH": prog_path,
"PROG_DIR": dirname(prog_path), "PROG_DIR": dirname(prog_path),
"PROG_NAME": basename(splitext(prog_path)[0]), "PROG_NAME": basename(splitext(prog_path)[0]),
"DEBUG_PORT": self.debug_options['port'], "DEBUG_PORT": self.debug_options["port"],
"UPLOAD_PROTOCOL": self.debug_options['upload_protocol'], "UPLOAD_PROTOCOL": self.debug_options["upload_protocol"],
"INIT_BREAK": self.debug_options['init_break'] or "", "INIT_BREAK": self.debug_options["init_break"] or "",
"LOAD_CMDS": "\n".join(self.debug_options['load_cmds'] or []), "LOAD_CMDS": "\n".join(self.debug_options["load_cmds"] or []),
} }
self._debug_server.spawn(patterns) self._debug_server.spawn(patterns)
if not patterns['DEBUG_PORT']: if not patterns["DEBUG_PORT"]:
patterns['DEBUG_PORT'] = self._debug_server.get_debug_port() patterns["DEBUG_PORT"] = self._debug_server.get_debug_port()
self.generate_pioinit(self._gdbsrc_dir, patterns) self.generate_pioinit(self._gdbsrc_dir, patterns)
# start GDB client # start GDB client
args = [ args = [
"piogdb", "piogdb",
"-q", "-q",
"--directory", self._gdbsrc_dir, "--directory",
"--directory", self.project_dir, self._gdbsrc_dir,
"-l", "10" "--directory",
] # yapf: disable self.project_dir,
"-l",
"10",
]
args.extend(self.args) args.extend(self.args)
if not gdb_path: if not gdb_path:
raise exception.DebugInvalidOptions("GDB client is not configured") raise exception.DebugInvalidOptions("GDB client is not configured")
gdb_data_dir = self._get_data_dir(gdb_path) gdb_data_dir = self._get_data_dir(gdb_path)
if gdb_data_dir: if gdb_data_dir:
args.extend(["--data-directory", gdb_data_dir]) args.extend(["--data-directory", gdb_data_dir])
args.append(patterns['PROG_PATH']) args.append(patterns["PROG_PATH"])
return reactor.spawnProcess(self, return reactor.spawnProcess(
gdb_path, self, gdb_path, args, path=self.project_dir, env=os.environ
args, )
path=self.project_dir,
env=os.environ)
@staticmethod @staticmethod
def _get_data_dir(gdb_path): def _get_data_dir(gdb_path):
@@ -112,8 +112,9 @@ class GDBClient(BaseProcess): # pylint: disable=too-many-instance-attributes
return gdb_data_dir if isdir(gdb_data_dir) else None return gdb_data_dir if isdir(gdb_data_dir) else None
def generate_pioinit(self, dst_dir, patterns): def generate_pioinit(self, dst_dir, patterns):
server_exe = (self.debug_options.get("server") server_exe = (
or {}).get("executable", "").lower() (self.debug_options.get("server") or {}).get("executable", "").lower()
)
if "jlink" in server_exe: if "jlink" in server_exe:
cfg = initcfgs.GDB_JLINK_INIT_CONFIG cfg = initcfgs.GDB_JLINK_INIT_CONFIG
elif "st-util" in server_exe: elif "st-util" in server_exe:
@@ -122,43 +123,43 @@ class GDBClient(BaseProcess): # pylint: disable=too-many-instance-attributes
cfg = initcfgs.GDB_MSPDEBUG_INIT_CONFIG cfg = initcfgs.GDB_MSPDEBUG_INIT_CONFIG
elif "qemu" in server_exe: elif "qemu" in server_exe:
cfg = initcfgs.GDB_QEMU_INIT_CONFIG cfg = initcfgs.GDB_QEMU_INIT_CONFIG
elif self.debug_options['require_debug_port']: elif self.debug_options["require_debug_port"]:
cfg = initcfgs.GDB_BLACKMAGIC_INIT_CONFIG cfg = initcfgs.GDB_BLACKMAGIC_INIT_CONFIG
else: else:
cfg = initcfgs.GDB_DEFAULT_INIT_CONFIG cfg = initcfgs.GDB_DEFAULT_INIT_CONFIG
commands = cfg.split("\n") commands = cfg.split("\n")
if self.debug_options['init_cmds']: if self.debug_options["init_cmds"]:
commands = self.debug_options['init_cmds'] commands = self.debug_options["init_cmds"]
commands.extend(self.debug_options['extra_cmds']) commands.extend(self.debug_options["extra_cmds"])
if not any("define pio_reset_target" in cmd for cmd in commands): if not any("define pio_reset_run_target" in cmd for cmd in commands):
commands = [ commands = [
"define pio_reset_target", "define pio_reset_run_target",
" echo Warning! Undefined pio_reset_target command\\n", " echo Warning! Undefined pio_reset_run_target command\\n",
" mon reset", " monitor reset",
"end" "end",
] + commands # yapf: disable ] + commands
if not any("define pio_reset_halt_target" in cmd for cmd in commands): if not any("define pio_reset_halt_target" in cmd for cmd in commands):
commands = [ commands = [
"define pio_reset_halt_target", "define pio_reset_halt_target",
" echo Warning! Undefined pio_reset_halt_target command\\n", " echo Warning! Undefined pio_reset_halt_target command\\n",
" mon reset halt", " monitor reset halt",
"end" "end",
] + commands # yapf: disable ] + commands
if not any("define pio_restart_target" in cmd for cmd in commands): if not any("define pio_restart_target" in cmd for cmd in commands):
commands += [ commands += [
"define pio_restart_target", "define pio_restart_target",
" pio_reset_halt_target", " pio_reset_halt_target",
" $INIT_BREAK", " $INIT_BREAK",
" %s" % ("continue" if patterns['INIT_BREAK'] else "next"), " %s" % ("continue" if patterns["INIT_BREAK"] else "next"),
"end" "end",
] # yapf: disable ]
banner = [ banner = [
"echo PlatformIO Unified Debugger -> http://bit.ly/pio-debug\\n", "echo PlatformIO Unified Debugger -> http://bit.ly/pio-debug\\n",
"echo PlatformIO: debug_tool = %s\\n" % self.debug_options['tool'], "echo PlatformIO: debug_tool = %s\\n" % self.debug_options["tool"],
"echo PlatformIO: Initializing remote target...\\n" "echo PlatformIO: Initializing remote target...\\n",
] ]
footer = ["echo %s\\n" % self.INIT_COMPLETED_BANNER] footer = ["echo %s\\n" % self.INIT_COMPLETED_BANNER]
commands = banner + commands + footer commands = banner + commands + footer
@@ -192,7 +193,7 @@ class GDBClient(BaseProcess): # pylint: disable=too-many-instance-attributes
if b"-gdb-exit" in data or data.strip() in (b"q", b"quit"): if b"-gdb-exit" in data or data.strip() in (b"q", b"quit"):
# Allow terminating via SIGINT/CTRL+C # Allow terminating via SIGINT/CTRL+C
signal.signal(signal.SIGINT, signal.default_int_handler) signal.signal(signal.SIGINT, signal.default_int_handler)
self.transport.write(b"pio_reset_target\n") self.transport.write(b"pio_reset_run_target\n")
self.transport.write(data) self.transport.write(data)
def processEnded(self, reason): # pylint: disable=unused-argument def processEnded(self, reason): # pylint: disable=unused-argument
@@ -214,8 +215,7 @@ class GDBClient(BaseProcess): # pylint: disable=too-many-instance-attributes
self._handle_error(data) self._handle_error(data)
# go to init break automatically # go to init break automatically
if self.INIT_COMPLETED_BANNER.encode() in data: if self.INIT_COMPLETED_BANNER.encode() in data:
self._auto_continue_timer = task.LoopingCall( self._auto_continue_timer = task.LoopingCall(self._auto_exec_continue)
self._auto_exec_continue)
self._auto_continue_timer.start(0.1) self._auto_continue_timer.start(0.1)
def errReceived(self, data): def errReceived(self, data):
@@ -223,10 +223,9 @@ class GDBClient(BaseProcess): # pylint: disable=too-many-instance-attributes
self._handle_error(data) self._handle_error(data)
def console_log(self, msg): def console_log(self, msg):
if helpers.is_mi_mode(self.args): if helpers.is_gdbmi_mode():
self.outReceived(('~"%s\\n"\n' % msg).encode()) msg = helpers.escape_gdbmi_stream("~", msg)
else: self.outReceived(msg if is_bytes(msg) else msg.encode())
self.outReceived(("%s\n" % msg).encode())
def _auto_exec_continue(self): def _auto_exec_continue(self):
auto_exec_delay = 0.5 # in seconds auto_exec_delay = 0.5 # in seconds
@@ -236,29 +235,34 @@ class GDBClient(BaseProcess): # pylint: disable=too-many-instance-attributes
self._auto_continue_timer.stop() self._auto_continue_timer.stop()
self._auto_continue_timer = None self._auto_continue_timer = None
if not self.debug_options['init_break'] or self._target_is_run: if not self.debug_options["init_break"] or self._target_is_run:
return return
self.console_log( self.console_log(
"PlatformIO: Resume the execution to `debug_init_break = %s`" % "PlatformIO: Resume the execution to `debug_init_break = %s`\n"
self.debug_options['init_break']) % self.debug_options["init_break"]
self.console_log("PlatformIO: More configuration options -> " )
"http://bit.ly/pio-debug") self.console_log(
self.transport.write(b"0-exec-continue\n" if helpers. "PlatformIO: More configuration options -> http://bit.ly/pio-debug\n"
is_mi_mode(self.args) else b"continue\n") )
self.transport.write(
b"0-exec-continue\n" if helpers.is_gdbmi_mode() else b"continue\n"
)
self._target_is_run = True self._target_is_run = True
def _handle_error(self, data): def _handle_error(self, data):
if (self.PIO_SRC_NAME.encode() not in data if self.PIO_SRC_NAME.encode() not in data or b"Error in sourced" not in data:
or b"Error in sourced" not in data):
return return
configuration = {"debug": self.debug_options, "env": self.env_options} configuration = {"debug": self.debug_options, "env": self.env_options}
exd = re.sub(r'\\(?!")', "/", json.dumps(configuration)) exd = re.sub(r'\\(?!")', "/", json.dumps(configuration))
exd = re.sub(r'"(?:[a-z]\:)?((/[^"/]+)+)"', exd = re.sub(
lambda m: '"%s"' % join(*m.group(1).split("/")[-2:]), exd, r'"(?:[a-z]\:)?((/[^"/]+)+)"',
re.I | re.M) lambda m: '"%s"' % join(*m.group(1).split("/")[-2:]),
exd,
re.I | re.M,
)
mp = MeasurementProtocol() mp = MeasurementProtocol()
mp['exd'] = "DebugGDBPioInitError: %s" % exd mp["exd"] = "DebugGDBPioInitError: %s" % exd
mp['exf'] = 1 mp["exf"] = 1
mp.send("exception") mp.send("exception")
self.transport.loseConnection() self.transport.loseConnection()

View File

@@ -17,43 +17,45 @@
import os import os
import signal import signal
from os.path import isfile, join from os.path import isfile
import click import click
from platformio import exception, fs, proc, util from platformio import app, exception, fs, proc, util
from platformio.commands.debug import helpers from platformio.commands.debug import helpers
from platformio.managers.core import inject_contrib_pysite from platformio.managers.core import inject_contrib_pysite
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
from platformio.project.helpers import (is_platformio_project, from platformio.project.helpers import is_platformio_project, load_project_ide_data
load_project_ide_data)
@click.command("debug", @click.command(
context_settings=dict(ignore_unknown_options=True), "debug",
short_help="PIO Unified Debugger") context_settings=dict(ignore_unknown_options=True),
@click.option("-d", short_help="PIO Unified Debugger",
"--project-dir", )
default=os.getcwd, @click.option(
type=click.Path(exists=True, "-d",
file_okay=False, "--project-dir",
dir_okay=True, default=os.getcwd,
writable=True, type=click.Path(
resolve_path=True)) exists=True, file_okay=False, dir_okay=True, writable=True, resolve_path=True
@click.option("-c", ),
"--project-conf", )
type=click.Path(exists=True, @click.option(
file_okay=True, "-c",
dir_okay=False, "--project-conf",
readable=True, type=click.Path(
resolve_path=True)) exists=True, file_okay=True, dir_okay=False, readable=True, resolve_path=True
),
)
@click.option("--environment", "-e", metavar="<environment>") @click.option("--environment", "-e", metavar="<environment>")
@click.option("--verbose", "-v", is_flag=True) @click.option("--verbose", "-v", is_flag=True)
@click.option("--interface", type=click.Choice(["gdb"])) @click.option("--interface", type=click.Choice(["gdb"]))
@click.argument("__unprocessed", nargs=-1, type=click.UNPROCESSED) @click.argument("__unprocessed", nargs=-1, type=click.UNPROCESSED)
@click.pass_context @click.pass_context
def cli(ctx, project_dir, project_conf, environment, verbose, interface, def cli(ctx, project_dir, project_conf, environment, verbose, interface, __unprocessed):
__unprocessed): app.set_session_var("custom_project_conf", project_conf)
# use env variables from Eclipse or CLion # use env variables from Eclipse or CLion
for sysenv in ("CWD", "PWD", "PLATFORMIO_PROJECT_DIR"): for sysenv in ("CWD", "PWD", "PLATFORMIO_PROJECT_DIR"):
if is_platformio_project(project_dir): if is_platformio_project(project_dir):
@@ -62,8 +64,7 @@ def cli(ctx, project_dir, project_conf, environment, verbose, interface,
project_dir = os.getenv(sysenv) project_dir = os.getenv(sysenv)
with fs.cd(project_dir): with fs.cd(project_dir):
config = ProjectConfig.get_instance( config = ProjectConfig.get_instance(project_conf)
project_conf or join(project_dir, "platformio.ini"))
config.validate(envs=[environment] if environment else None) config.validate(envs=[environment] if environment else None)
env_name = environment or helpers.get_default_debug_env(config) env_name = environment or helpers.get_default_debug_env(config)
@@ -74,76 +75,81 @@ def cli(ctx, project_dir, project_conf, environment, verbose, interface,
assert debug_options assert debug_options
if not interface: if not interface:
return helpers.predebug_project(ctx, project_dir, env_name, False, return helpers.predebug_project(ctx, project_dir, env_name, False, verbose)
verbose)
configuration = load_project_ide_data(project_dir, env_name) configuration = load_project_ide_data(project_dir, env_name)
if not configuration: if not configuration:
raise exception.DebugInvalidOptions( raise exception.DebugInvalidOptions("Could not load debug configuration")
"Could not load debug configuration")
if "--version" in __unprocessed: if "--version" in __unprocessed:
result = proc.exec_command([configuration['gdb_path'], "--version"]) result = proc.exec_command([configuration["gdb_path"], "--version"])
if result['returncode'] == 0: if result["returncode"] == 0:
return click.echo(result['out']) return click.echo(result["out"])
raise exception.PlatformioException("\n".join( raise exception.PlatformioException("\n".join([result["out"], result["err"]]))
[result['out'], result['err']]))
try: try:
fs.ensure_udev_rules() fs.ensure_udev_rules()
except exception.InvalidUdevRules as e: except exception.InvalidUdevRules as e:
for line in str(e).split("\n") + [""]: click.echo(
click.echo( helpers.escape_gdbmi_stream("~", str(e) + "\n")
('~"%s\\n"' if helpers.is_mi_mode(__unprocessed) else "%s") % if helpers.is_gdbmi_mode()
line) else str(e) + "\n",
nl=False,
)
debug_options['load_cmds'] = helpers.configure_esp32_load_cmds( debug_options["load_cmds"] = helpers.configure_esp32_load_cmds(
debug_options, configuration) debug_options, configuration
)
rebuild_prog = False rebuild_prog = False
preload = debug_options['load_cmds'] == ["preload"] preload = debug_options["load_cmds"] == ["preload"]
load_mode = debug_options['load_mode'] load_mode = debug_options["load_mode"]
if load_mode == "always": if load_mode == "always":
rebuild_prog = ( rebuild_prog = preload or not helpers.has_debug_symbols(
preload configuration["prog_path"]
or not helpers.has_debug_symbols(configuration['prog_path'])) )
elif load_mode == "modified": elif load_mode == "modified":
rebuild_prog = ( rebuild_prog = helpers.is_prog_obsolete(
helpers.is_prog_obsolete(configuration['prog_path']) configuration["prog_path"]
or not helpers.has_debug_symbols(configuration['prog_path'])) ) or not helpers.has_debug_symbols(configuration["prog_path"])
else: else:
rebuild_prog = not isfile(configuration['prog_path']) rebuild_prog = not isfile(configuration["prog_path"])
if preload or (not rebuild_prog and load_mode != "always"): if preload or (not rebuild_prog and load_mode != "always"):
# don't load firmware through debug server # don't load firmware through debug server
debug_options['load_cmds'] = [] debug_options["load_cmds"] = []
if rebuild_prog: if rebuild_prog:
if helpers.is_mi_mode(__unprocessed): if helpers.is_gdbmi_mode():
click.echo('~"Preparing firmware for debugging...\\n"') click.echo(
output = helpers.GDBBytesIO() helpers.escape_gdbmi_stream(
with util.capture_std_streams(output): "~", "Preparing firmware for debugging...\n"
helpers.predebug_project(ctx, project_dir, env_name, preload, ),
verbose) nl=False,
output.close() )
stream = helpers.GDBMIConsoleStream()
with util.capture_std_streams(stream):
helpers.predebug_project(ctx, project_dir, env_name, preload, verbose)
stream.close()
else: else:
click.echo("Preparing firmware for debugging...") click.echo("Preparing firmware for debugging...")
helpers.predebug_project(ctx, project_dir, env_name, preload, helpers.predebug_project(ctx, project_dir, env_name, preload, verbose)
verbose)
# save SHA sum of newly created prog # save SHA sum of newly created prog
if load_mode == "modified": if load_mode == "modified":
helpers.is_prog_obsolete(configuration['prog_path']) helpers.is_prog_obsolete(configuration["prog_path"])
if not isfile(configuration['prog_path']): if not isfile(configuration["prog_path"]):
raise exception.DebugInvalidOptions("Program/firmware is missed") raise exception.DebugInvalidOptions("Program/firmware is missed")
# run debugging client # run debugging client
inject_contrib_pysite() inject_contrib_pysite()
# pylint: disable=import-outside-toplevel
from platformio.commands.debug.client import GDBClient, reactor from platformio.commands.debug.client import GDBClient, reactor
client = GDBClient(project_dir, __unprocessed, debug_options, env_options) client = GDBClient(project_dir, __unprocessed, debug_options, env_options)
client.spawn(configuration['gdb_path'], configuration['prog_path']) client.spawn(configuration["gdb_path"], configuration["prog_path"])
signal.signal(signal.SIGINT, lambda *args, **kwargs: None) signal.signal(signal.SIGINT, lambda *args, **kwargs: None)
reactor.run() reactor.run()
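Restated outside of the reflowed branches above, the rebuild decision depends only on debug_load_mode, the preload flag and the state of the current binary. A standalone sketch (illustrative; the predicates are passed in rather than imported):

    import os

    def needs_rebuild(load_mode, preload, prog_path,
                      has_debug_symbols, is_prog_obsolete):
        if load_mode == "always":
            # rebuild when preloading or when the binary lacks debug symbols
            return preload or not has_debug_symbols(prog_path)
        if load_mode == "modified":
            # rebuild when the binary changed since the last debug session
            # or lacks debug symbols
            return is_prog_obsolete(prog_path) or not has_debug_symbols(prog_path)
        # any other mode: rebuild only if the binary is missing
        return not os.path.isfile(prog_path)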
View File
@@ -12,6 +12,7 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import re
import sys import sys
import time import time
from fnmatch import fnmatch from fnmatch import fnmatch
@@ -20,28 +21,46 @@ from io import BytesIO
from os.path import isfile from os.path import isfile
from platformio import exception, fs, util from platformio import exception, fs, util
from platformio.commands.platform import \ from platformio.commands import PlatformioCLI
platform_install as cmd_platform_install from platformio.commands.platform import platform_install as cmd_platform_install
from platformio.commands.run import cli as cmd_run from platformio.commands.run.command import cli as cmd_run
from platformio.compat import is_bytes
from platformio.managers.platform import PlatformFactory from platformio.managers.platform import PlatformFactory
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
from platformio.project.options import ProjectOptions
class GDBBytesIO(BytesIO): # pylint: disable=too-few-public-methods class GDBMIConsoleStream(BytesIO): # pylint: disable=too-few-public-methods
STDOUT = sys.stdout STDOUT = sys.stdout
def write(self, text): def write(self, text):
if "\n" in text: self.STDOUT.write(escape_gdbmi_stream("~", text))
for line in text.strip().split("\n"):
self.STDOUT.write('~"%s\\n"\n' % line)
else:
self.STDOUT.write('~"%s"' % text)
self.STDOUT.flush() self.STDOUT.flush()
def is_mi_mode(args): def is_gdbmi_mode():
return "--interpreter" in " ".join(args) return "--interpreter" in " ".join(PlatformioCLI.leftover_args)
def escape_gdbmi_stream(prefix, stream):
bytes_stream = False
if is_bytes(stream):
bytes_stream = True
stream = stream.decode()
if not stream:
return b"" if bytes_stream else ""
ends_nl = stream.endswith("\n")
stream = re.sub(r"\\+", "\\\\\\\\", stream)
stream = stream.replace('"', '\\"')
stream = stream.replace("\n", "\\n")
stream = '%s"%s"' % (prefix, stream)
if ends_nl:
stream += "\n"
return stream.encode() if bytes_stream else stream
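For example (values shown as Python reprs; the trailing real newline is kept only because the input ended with one):

    >>> escape_gdbmi_stream("~", "Preparing firmware for debugging...\n")
    '~"Preparing firmware for debugging...\\n"\n'
    >>> escape_gdbmi_stream("@", b"Open On-Chip Debugger\n")
    b'@"Open On-Chip Debugger\\n"\n'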
def get_default_debug_env(config): def get_default_debug_env(config):
@@ -57,41 +76,41 @@ def get_default_debug_env(config):
def predebug_project(ctx, project_dir, env_name, preload, verbose): def predebug_project(ctx, project_dir, env_name, preload, verbose):
ctx.invoke(cmd_run, ctx.invoke(
project_dir=project_dir, cmd_run,
environment=[env_name], project_dir=project_dir,
target=["debug"] + (["upload"] if preload else []), environment=[env_name],
verbose=verbose) target=["debug"] + (["upload"] if preload else []),
verbose=verbose,
)
if preload: if preload:
time.sleep(5) time.sleep(5)
def validate_debug_options(cmd_ctx, env_options): def validate_debug_options(cmd_ctx, env_options):
def _cleanup_cmds(items): def _cleanup_cmds(items):
items = ProjectConfig.parse_multi_values(items) items = ProjectConfig.parse_multi_values(items)
return [ return ["$LOAD_CMDS" if item == "$LOAD_CMD" else item for item in items]
"$LOAD_CMDS" if item == "$LOAD_CMD" else item for item in items
]
try: try:
platform = PlatformFactory.newPlatform(env_options['platform']) platform = PlatformFactory.newPlatform(env_options["platform"])
except exception.UnknownPlatform: except exception.UnknownPlatform:
cmd_ctx.invoke(cmd_platform_install, cmd_ctx.invoke(
platforms=[env_options['platform']], cmd_platform_install,
skip_default_package=True) platforms=[env_options["platform"]],
platform = PlatformFactory.newPlatform(env_options['platform']) skip_default_package=True,
)
platform = PlatformFactory.newPlatform(env_options["platform"])
board_config = platform.board_config(env_options['board']) board_config = platform.board_config(env_options["board"])
tool_name = board_config.get_debug_tool_name(env_options.get("debug_tool")) tool_name = board_config.get_debug_tool_name(env_options.get("debug_tool"))
tool_settings = board_config.get("debug", {}).get("tools", tool_settings = board_config.get("debug", {}).get("tools", {}).get(tool_name, {})
{}).get(tool_name, {})
server_options = None server_options = None
# specific server per a system # specific server per a system
if isinstance(tool_settings.get("server", {}), list): if isinstance(tool_settings.get("server", {}), list):
for item in tool_settings['server'][:]: for item in tool_settings["server"][:]:
tool_settings['server'] = item tool_settings["server"] = item
if util.get_systype() in item.get("system", []): if util.get_systype() in item.get("system", []):
break break
@@ -100,76 +119,98 @@ def validate_debug_options(cmd_ctx, env_options):
server_options = { server_options = {
"cwd": None, "cwd": None,
"executable": None, "executable": None,
"arguments": env_options.get("debug_server") "arguments": env_options.get("debug_server"),
} }
server_options['executable'] = server_options['arguments'][0] server_options["executable"] = server_options["arguments"][0]
server_options['arguments'] = server_options['arguments'][1:] server_options["arguments"] = server_options["arguments"][1:]
elif "server" in tool_settings: elif "server" in tool_settings:
server_package = tool_settings['server'].get("package") server_package = tool_settings["server"].get("package")
server_package_dir = platform.get_package_dir( server_package_dir = (
server_package) if server_package else None platform.get_package_dir(server_package) if server_package else None
)
if server_package and not server_package_dir: if server_package and not server_package_dir:
platform.install_packages(with_packages=[server_package], platform.install_packages(
skip_default_package=True, with_packages=[server_package], skip_default_package=True, silent=True
silent=True) )
server_package_dir = platform.get_package_dir(server_package) server_package_dir = platform.get_package_dir(server_package)
server_options = dict( server_options = dict(
cwd=server_package_dir if server_package else None, cwd=server_package_dir if server_package else None,
executable=tool_settings['server'].get("executable"), executable=tool_settings["server"].get("executable"),
arguments=[ arguments=[
a.replace("$PACKAGE_DIR", server_package_dir) a.replace("$PACKAGE_DIR", server_package_dir)
if server_package_dir else a if server_package_dir
for a in tool_settings['server'].get("arguments", []) else a
]) for a in tool_settings["server"].get("arguments", [])
],
)
extra_cmds = _cleanup_cmds(env_options.get("debug_extra_cmds")) extra_cmds = _cleanup_cmds(env_options.get("debug_extra_cmds"))
extra_cmds.extend(_cleanup_cmds(tool_settings.get("extra_cmds"))) extra_cmds.extend(_cleanup_cmds(tool_settings.get("extra_cmds")))
result = dict( result = dict(
tool=tool_name, tool=tool_name,
upload_protocol=env_options.get( upload_protocol=env_options.get(
"upload_protocol", "upload_protocol", board_config.get("upload", {}).get("protocol")
board_config.get("upload", {}).get("protocol")), ),
load_cmds=_cleanup_cmds( load_cmds=_cleanup_cmds(
env_options.get( env_options.get(
"debug_load_cmds", "debug_load_cmds",
tool_settings.get("load_cmds", tool_settings.get(
tool_settings.get("load_cmd", "load")))), "load_cmds",
load_mode=env_options.get("debug_load_mode", tool_settings.get(
tool_settings.get("load_mode", "always")), "load_cmd", ProjectOptions["env.debug_load_cmds"].default
),
),
)
),
load_mode=env_options.get(
"debug_load_mode",
tool_settings.get(
"load_mode", ProjectOptions["env.debug_load_mode"].default
),
),
init_break=env_options.get( init_break=env_options.get(
"debug_init_break", tool_settings.get("init_break", "debug_init_break",
"tbreak main")), tool_settings.get(
"init_break", ProjectOptions["env.debug_init_break"].default
),
),
init_cmds=_cleanup_cmds( init_cmds=_cleanup_cmds(
env_options.get("debug_init_cmds", env_options.get("debug_init_cmds", tool_settings.get("init_cmds"))
tool_settings.get("init_cmds"))), ),
extra_cmds=extra_cmds, extra_cmds=extra_cmds,
require_debug_port=tool_settings.get("require_debug_port", False), require_debug_port=tool_settings.get("require_debug_port", False),
port=reveal_debug_port( port=reveal_debug_port(
env_options.get("debug_port", tool_settings.get("port")), env_options.get("debug_port", tool_settings.get("port")),
tool_name, tool_settings), tool_name,
server=server_options) tool_settings,
),
server=server_options,
)
return result return result
def configure_esp32_load_cmds(debug_options, configuration): def configure_esp32_load_cmds(debug_options, configuration):
ignore_conds = [ ignore_conds = [
debug_options['load_cmds'] != ["load"], debug_options["load_cmds"] != ["load"],
"xtensa-esp32" not in configuration.get("cc_path", ""), "xtensa-esp32" not in configuration.get("cc_path", ""),
not configuration.get("flash_extra_images"), not all([ not configuration.get("flash_extra_images"),
isfile(item['path']) not all(
for item in configuration.get("flash_extra_images") [isfile(item["path"]) for item in configuration.get("flash_extra_images")]
]) ),
] ]
if any(ignore_conds): if any(ignore_conds):
return debug_options['load_cmds'] return debug_options["load_cmds"]
mon_cmds = [ mon_cmds = [
'monitor program_esp32 "{{{path}}}" {offset} verify'.format( 'monitor program_esp32 "{{{path}}}" {offset} verify'.format(
path=fs.to_unix_path(item['path']), offset=item['offset']) path=fs.to_unix_path(item["path"]), offset=item["offset"]
)
for item in configuration.get("flash_extra_images") for item in configuration.get("flash_extra_images")
] ]
mon_cmds.append('monitor program_esp32 "{%s.bin}" 0x10000 verify' % mon_cmds.append(
fs.to_unix_path(configuration['prog_path'][:-4])) 'monitor program_esp32 "{%s.bin}" 0x10000 verify'
% fs.to_unix_path(configuration["prog_path"][:-4])
)
return mon_cmds return mon_cmds
@@ -181,7 +222,7 @@ def has_debug_symbols(prog_path):
b".debug_abbrev": False, b".debug_abbrev": False,
b" -Og": False, b" -Og": False,
b" -g": False, b" -g": False,
b"__PLATFORMIO_BUILD_DEBUG__": False b"__PLATFORMIO_BUILD_DEBUG__": False,
} }
with open(prog_path, "rb") as fp: with open(prog_path, "rb") as fp:
last_data = b"" last_data = b""
@@ -210,19 +251,16 @@ def is_prog_obsolete(prog_path):
break break
shasum.update(data) shasum.update(data)
new_digest = shasum.hexdigest() new_digest = shasum.hexdigest()
old_digest = None old_digest = (
if isfile(prog_hash_path): fs.get_file_contents(prog_hash_path) if isfile(prog_hash_path) else None
with open(prog_hash_path, "r") as fp: )
old_digest = fp.read()
if new_digest == old_digest: if new_digest == old_digest:
return False return False
with open(prog_hash_path, "w") as fp: fs.write_file_contents(prog_hash_path, new_digest)
fp.write(new_digest)
return True return True
def reveal_debug_port(env_debug_port, tool_name, tool_settings): def reveal_debug_port(env_debug_port, tool_name, tool_settings):
def _get_pattern(): def _get_pattern():
if not env_debug_port: if not env_debug_port:
return None return None
@@ -238,18 +276,21 @@ def reveal_debug_port(env_debug_port, tool_name, tool_settings):
def _look_for_serial_port(hwids): def _look_for_serial_port(hwids):
for item in util.get_serialports(filter_hwid=True): for item in util.get_serialports(filter_hwid=True):
if not _is_match_pattern(item['port']): if not _is_match_pattern(item["port"]):
continue continue
port = item['port'] port = item["port"]
if tool_name.startswith("blackmagic"): if tool_name.startswith("blackmagic"):
if "windows" in util.get_systype() and \ if (
port.startswith("COM") and len(port) > 4: "windows" in util.get_systype()
and port.startswith("COM")
and len(port) > 4
):
port = "\\\\.\\%s" % port port = "\\\\.\\%s" % port
if "GDB" in item['description']: if "GDB" in item["description"]:
return port return port
for hwid in hwids: for hwid in hwids:
hwid_str = ("%s:%s" % (hwid[0], hwid[1])).replace("0x", "") hwid_str = ("%s:%s" % (hwid[0], hwid[1])).replace("0x", "")
if hwid_str in item['hwid']: if hwid_str in item["hwid"]:
return port return port
return None return None
@@ -261,5 +302,6 @@ def reveal_debug_port(env_debug_port, tool_name, tool_settings):
debug_port = _look_for_serial_port(tool_settings.get("hwids", [])) debug_port = _look_for_serial_port(tool_settings.get("hwids", []))
if not debug_port: if not debug_port:
raise exception.DebugInvalidOptions( raise exception.DebugInvalidOptions(
"Please specify `debug_port` for environment") "Please specify `debug_port` for environment"
)
return debug_port return debug_port
View File
@@ -17,50 +17,51 @@ define pio_reset_halt_target
monitor reset halt monitor reset halt
end end
define pio_reset_target define pio_reset_run_target
monitor reset monitor reset
end end
target extended-remote $DEBUG_PORT target extended-remote $DEBUG_PORT
$INIT_BREAK
pio_reset_halt_target
$LOAD_CMDS
monitor init monitor init
$LOAD_CMDS
pio_reset_halt_target pio_reset_halt_target
$INIT_BREAK
""" """
GDB_STUTIL_INIT_CONFIG = """ GDB_STUTIL_INIT_CONFIG = """
define pio_reset_halt_target define pio_reset_halt_target
monitor halt
monitor reset monitor reset
monitor halt
end end
define pio_reset_target define pio_reset_run_target
monitor reset monitor reset
end end
target extended-remote $DEBUG_PORT target extended-remote $DEBUG_PORT
$INIT_BREAK
pio_reset_halt_target
$LOAD_CMDS $LOAD_CMDS
pio_reset_halt_target pio_reset_halt_target
$INIT_BREAK
""" """
GDB_JLINK_INIT_CONFIG = """ GDB_JLINK_INIT_CONFIG = """
define pio_reset_halt_target define pio_reset_halt_target
monitor halt
monitor reset monitor reset
monitor halt
end end
define pio_reset_target define pio_reset_run_target
monitor clrbp
monitor reset monitor reset
monitor go
end end
target extended-remote $DEBUG_PORT target extended-remote $DEBUG_PORT
$INIT_BREAK monitor clrbp
pio_reset_halt_target monitor speed auto
$LOAD_CMDS $LOAD_CMDS
pio_reset_halt_target pio_reset_halt_target
$INIT_BREAK
""" """
GDB_BLACKMAGIC_INIT_CONFIG = """ GDB_BLACKMAGIC_INIT_CONFIG = """
@@ -74,7 +75,7 @@ define pio_reset_halt_target
set language auto set language auto
end end
define pio_reset_target define pio_reset_run_target
pio_reset_halt_target pio_reset_halt_target
end end
@@ -82,8 +83,8 @@ target extended-remote $DEBUG_PORT
monitor swdp_scan monitor swdp_scan
attach 1 attach 1
set mem inaccessible-by-default off set mem inaccessible-by-default off
$INIT_BREAK
$LOAD_CMDS $LOAD_CMDS
$INIT_BREAK
set language c set language c
set *0xE000ED0C = 0x05FA0004 set *0xE000ED0C = 0x05FA0004
@@ -98,14 +99,14 @@ GDB_MSPDEBUG_INIT_CONFIG = """
define pio_reset_halt_target define pio_reset_halt_target
end end
define pio_reset_target define pio_reset_run_target
end end
target extended-remote $DEBUG_PORT target extended-remote $DEBUG_PORT
$INIT_BREAK
monitor erase monitor erase
$LOAD_CMDS $LOAD_CMDS
pio_reset_halt_target pio_reset_halt_target
$INIT_BREAK
""" """
GDB_QEMU_INIT_CONFIG = """ GDB_QEMU_INIT_CONFIG = """
@@ -113,12 +114,12 @@ define pio_reset_halt_target
monitor system_reset monitor system_reset
end end
define pio_reset_target define pio_reset_run_target
pio_reset_halt_target monitor system_reset
end end
target extended-remote $DEBUG_PORT target extended-remote $DEBUG_PORT
$INIT_BREAK
$LOAD_CMDS $LOAD_CMDS
pio_reset_halt_target pio_reset_halt_target
$INIT_BREAK
""" """
View File
@@ -32,7 +32,7 @@ class BaseProcess(protocol.ProcessProtocol, object):
COMMON_PATTERNS = { COMMON_PATTERNS = {
"PLATFORMIO_HOME_DIR": get_project_core_dir(), "PLATFORMIO_HOME_DIR": get_project_core_dir(),
"PLATFORMIO_CORE_DIR": get_project_core_dir(), "PLATFORMIO_CORE_DIR": get_project_core_dir(),
"PYTHONEXE": get_pythonexe_path() "PYTHONEXE": get_pythonexe_path(),
} }
def apply_patterns(self, source, patterns=None): def apply_patterns(self, source, patterns=None):
@@ -52,8 +52,7 @@ class BaseProcess(protocol.ProcessProtocol, object):
if isinstance(source, string_types): if isinstance(source, string_types):
source = _replace(source) source = _replace(source)
elif isinstance(source, (list, dict)): elif isinstance(source, (list, dict)):
items = enumerate(source) if isinstance(source, items = enumerate(source) if isinstance(source, list) else source.items()
list) else source.items()
for key, value in items: for key, value in items:
if isinstance(value, string_types): if isinstance(value, string_types):
source[key] = _replace(value) source[key] = _replace(value)
@@ -67,9 +66,9 @@ class BaseProcess(protocol.ProcessProtocol, object):
with open(LOG_FILE, "ab") as fp: with open(LOG_FILE, "ab") as fp:
fp.write(data) fp.write(data)
while data: while data:
chunk = data[:self.STDOUT_CHUNK_SIZE] chunk = data[: self.STDOUT_CHUNK_SIZE]
click.echo(chunk, nl=False) click.echo(chunk, nl=False)
data = data[self.STDOUT_CHUNK_SIZE:] data = data[self.STDOUT_CHUNK_SIZE :]
@staticmethod @staticmethod
def errReceived(data): def errReceived(data):
View File
@@ -19,12 +19,12 @@ from twisted.internet import error # pylint: disable=import-error
from twisted.internet import reactor # pylint: disable=import-error from twisted.internet import reactor # pylint: disable=import-error
from platformio import exception, fs, util from platformio import exception, fs, util
from platformio.commands.debug.helpers import escape_gdbmi_stream, is_gdbmi_mode
from platformio.commands.debug.process import BaseProcess from platformio.commands.debug.process import BaseProcess
from platformio.proc import where_is_program from platformio.proc import where_is_program
class DebugServer(BaseProcess): class DebugServer(BaseProcess):
def __init__(self, debug_options, env_options): def __init__(self, debug_options, env_options):
self.debug_options = debug_options self.debug_options = debug_options
self.env_options = env_options self.env_options = env_options
@@ -39,13 +39,16 @@ class DebugServer(BaseProcess):
if not server: if not server:
return None return None
server = self.apply_patterns(server, patterns) server = self.apply_patterns(server, patterns)
server_executable = server['executable'] server_executable = server["executable"]
if not server_executable: if not server_executable:
return None return None
if server['cwd']: if server["cwd"]:
server_executable = join(server['cwd'], server_executable) server_executable = join(server["cwd"], server_executable)
if ("windows" in systype and not server_executable.endswith(".exe") if (
and isfile(server_executable + ".exe")): "windows" in systype
and not server_executable.endswith(".exe")
and isfile(server_executable + ".exe")
):
server_executable = server_executable + ".exe" server_executable = server_executable + ".exe"
if not isfile(server_executable): if not isfile(server_executable):
@@ -55,48 +58,56 @@ class DebugServer(BaseProcess):
"\nCould not launch Debug Server '%s'. Please check that it " "\nCould not launch Debug Server '%s'. Please check that it "
"is installed and is included in a system PATH\n\n" "is installed and is included in a system PATH\n\n"
"See documentation or contact contact@platformio.org:\n" "See documentation or contact contact@platformio.org:\n"
"http://docs.platformio.org/page/plus/debugging.html\n" % "http://docs.platformio.org/page/plus/debugging.html\n"
server_executable) % server_executable
)
self._debug_port = ":3333" self._debug_port = ":3333"
openocd_pipe_allowed = all([ openocd_pipe_allowed = all(
not self.debug_options['port'], [not self.debug_options["port"], "openocd" in server_executable]
"openocd" in server_executable )
]) # yapf: disable
if openocd_pipe_allowed: if openocd_pipe_allowed:
args = [] args = []
if server['cwd']: if server["cwd"]:
args.extend(["-s", server['cwd']]) args.extend(["-s", server["cwd"]])
args.extend([ args.extend(
"-c", "gdb_port pipe; tcl_port disabled; telnet_port disabled" ["-c", "gdb_port pipe; tcl_port disabled; telnet_port disabled"]
]) )
args.extend(server['arguments']) args.extend(server["arguments"])
str_args = " ".join( str_args = " ".join(
[arg if arg.startswith("-") else '"%s"' % arg for arg in args]) [arg if arg.startswith("-") else '"%s"' % arg for arg in args]
)
self._debug_port = '| "%s" %s' % (server_executable, str_args) self._debug_port = '| "%s" %s' % (server_executable, str_args)
self._debug_port = fs.to_unix_path(self._debug_port) self._debug_port = fs.to_unix_path(self._debug_port)
else: else:
env = os.environ.copy() env = os.environ.copy()
# prepend server "lib" folder to LD path # prepend server "lib" folder to LD path
if ("windows" not in systype and server['cwd'] if (
and isdir(join(server['cwd'], "lib"))): "windows" not in systype
ld_key = ("DYLD_LIBRARY_PATH" and server["cwd"]
if "darwin" in systype else "LD_LIBRARY_PATH") and isdir(join(server["cwd"], "lib"))
env[ld_key] = join(server['cwd'], "lib") ):
ld_key = (
"DYLD_LIBRARY_PATH" if "darwin" in systype else "LD_LIBRARY_PATH"
)
env[ld_key] = join(server["cwd"], "lib")
if os.environ.get(ld_key): if os.environ.get(ld_key):
env[ld_key] = "%s:%s" % (env[ld_key], env[ld_key] = "%s:%s" % (env[ld_key], os.environ.get(ld_key))
os.environ.get(ld_key))
# prepend BIN to PATH # prepend BIN to PATH
if server['cwd'] and isdir(join(server['cwd'], "bin")): if server["cwd"] and isdir(join(server["cwd"], "bin")):
env['PATH'] = "%s%s%s" % ( env["PATH"] = "%s%s%s" % (
join(server['cwd'], "bin"), os.pathsep, join(server["cwd"], "bin"),
os.environ.get("PATH", os.environ.get("Path", ""))) os.pathsep,
os.environ.get("PATH", os.environ.get("Path", "")),
)
self._transport = reactor.spawnProcess( self._transport = reactor.spawnProcess(
self, self,
server_executable, [server_executable] + server['arguments'], server_executable,
path=server['cwd'], [server_executable] + server["arguments"],
env=env) path=server["cwd"],
env=env,
)
if "mspdebug" in server_executable.lower(): if "mspdebug" in server_executable.lower():
self._debug_port = ":2000" self._debug_port = ":2000"
elif "jlink" in server_executable.lower(): elif "jlink" in server_executable.lower():
@@ -109,6 +120,11 @@ class DebugServer(BaseProcess):
def get_debug_port(self): def get_debug_port(self):
return self._debug_port return self._debug_port
def outReceived(self, data):
super(DebugServer, self).outReceived(
escape_gdbmi_stream("@", data) if is_gdbmi_mode() else data
)
def processEnded(self, reason): def processEnded(self, reason):
self._process_ended = True self._process_ended = True
super(DebugServer, self).processEnded(reason) super(DebugServer, self).processEnded(reason)
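In the OpenOCD pipe branch above no TCP port is opened at all; _debug_port becomes a GDB pipe target, so the debugger launches the server itself. An illustrative value (paths and the board config file are made up; the real arguments come from the tool settings):

    debug_port = (
        '| "openocd" -s "/path/to/tool-openocd/scripts" '
        '-c "gdb_port pipe; tcl_port disabled; telnet_port disabled" '
        '-f "board/st_nucleo_f4.cfg"'
    )
    # consumed by GDB as:  target extended-remote | "openocd" -s ... -f ...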
View File
@@ -15,12 +15,11 @@
import sys import sys
from fnmatch import fnmatch from fnmatch import fnmatch
from os import getcwd from os import getcwd
from os.path import join
import click import click
from serial.tools import miniterm from serial.tools import miniterm
from platformio import exception, util from platformio import exception, fs, util
from platformio.compat import dump_json_to_unicode from platformio.compat import dump_json_to_unicode
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
@@ -36,27 +35,29 @@ def cli():
@click.option("--mdns", is_flag=True, help="List multicast DNS services") @click.option("--mdns", is_flag=True, help="List multicast DNS services")
@click.option("--json-output", is_flag=True) @click.option("--json-output", is_flag=True)
def device_list( # pylint: disable=too-many-branches def device_list( # pylint: disable=too-many-branches
serial, logical, mdns, json_output): serial, logical, mdns, json_output
):
if not logical and not mdns: if not logical and not mdns:
serial = True serial = True
data = {} data = {}
if serial: if serial:
data['serial'] = util.get_serial_ports() data["serial"] = util.get_serial_ports()
if logical: if logical:
data['logical'] = util.get_logical_devices() data["logical"] = util.get_logical_devices()
if mdns: if mdns:
data['mdns'] = util.get_mdns_services() data["mdns"] = util.get_mdns_services()
single_key = list(data)[0] if len(list(data)) == 1 else None single_key = list(data)[0] if len(list(data)) == 1 else None
if json_output: if json_output:
return click.echo( return click.echo(
dump_json_to_unicode(data[single_key] if single_key else data)) dump_json_to_unicode(data[single_key] if single_key else data)
)
titles = { titles = {
"serial": "Serial Ports", "serial": "Serial Ports",
"logical": "Logical Devices", "logical": "Logical Devices",
"mdns": "Multicast DNS Services" "mdns": "Multicast DNS Services",
} }
for key, value in data.items(): for key, value in data.items():
@@ -66,31 +67,38 @@ def device_list( # pylint: disable=too-many-branches
if key == "serial": if key == "serial":
for item in value: for item in value:
click.secho(item['port'], fg="cyan") click.secho(item["port"], fg="cyan")
click.echo("-" * len(item['port'])) click.echo("-" * len(item["port"]))
click.echo("Hardware ID: %s" % item['hwid']) click.echo("Hardware ID: %s" % item["hwid"])
click.echo("Description: %s" % item['description']) click.echo("Description: %s" % item["description"])
click.echo("") click.echo("")
if key == "logical": if key == "logical":
for item in value: for item in value:
click.secho(item['path'], fg="cyan") click.secho(item["path"], fg="cyan")
click.echo("-" * len(item['path'])) click.echo("-" * len(item["path"]))
click.echo("Name: %s" % item['name']) click.echo("Name: %s" % item["name"])
click.echo("") click.echo("")
if key == "mdns": if key == "mdns":
for item in value: for item in value:
click.secho(item['name'], fg="cyan") click.secho(item["name"], fg="cyan")
click.echo("-" * len(item['name'])) click.echo("-" * len(item["name"]))
click.echo("Type: %s" % item['type']) click.echo("Type: %s" % item["type"])
click.echo("IP: %s" % item['ip']) click.echo("IP: %s" % item["ip"])
click.echo("Port: %s" % item['port']) click.echo("Port: %s" % item["port"])
if item['properties']: if item["properties"]:
click.echo("Properties: %s" % ("; ".join([ click.echo(
"%s=%s" % (k, v) "Properties: %s"
for k, v in item['properties'].items() % (
]))) "; ".join(
[
"%s=%s" % (k, v)
for k, v in item["properties"].items()
]
)
)
)
click.echo("") click.echo("")
if single_key: if single_key:
@@ -102,66 +110,72 @@ def device_list( # pylint: disable=too-many-branches
@cli.command("monitor", short_help="Monitor device (Serial)") @cli.command("monitor", short_help="Monitor device (Serial)")
@click.option("--port", "-p", help="Port, a number or a device name") @click.option("--port", "-p", help="Port, a number or a device name")
@click.option("--baud", "-b", type=int, help="Set baud rate, default=9600") @click.option("--baud", "-b", type=int, help="Set baud rate, default=9600")
@click.option("--parity", @click.option(
default="N", "--parity",
type=click.Choice(["N", "E", "O", "S", "M"]), default="N",
help="Set parity, default=N") type=click.Choice(["N", "E", "O", "S", "M"]),
@click.option("--rtscts", help="Set parity, default=N",
is_flag=True, )
help="Enable RTS/CTS flow control, default=Off") @click.option("--rtscts", is_flag=True, help="Enable RTS/CTS flow control, default=Off")
@click.option("--xonxoff", @click.option(
is_flag=True, "--xonxoff", is_flag=True, help="Enable software flow control, default=Off"
help="Enable software flow control, default=Off") )
@click.option("--rts", @click.option(
default=None, "--rts", default=None, type=click.IntRange(0, 1), help="Set initial RTS line state"
type=click.IntRange(0, 1), )
help="Set initial RTS line state") @click.option(
@click.option("--dtr", "--dtr", default=None, type=click.IntRange(0, 1), help="Set initial DTR line state"
default=None, )
type=click.IntRange(0, 1),
help="Set initial DTR line state")
@click.option("--echo", is_flag=True, help="Enable local echo, default=Off") @click.option("--echo", is_flag=True, help="Enable local echo, default=Off")
@click.option("--encoding", @click.option(
default="UTF-8", "--encoding",
help="Set the encoding for the serial port (e.g. hexlify, " default="UTF-8",
"Latin1, UTF-8), default: UTF-8") help="Set the encoding for the serial port (e.g. hexlify, "
"Latin1, UTF-8), default: UTF-8",
)
@click.option("--filter", "-f", multiple=True, help="Add text transformation") @click.option("--filter", "-f", multiple=True, help="Add text transformation")
@click.option("--eol", @click.option(
default="CRLF", "--eol",
type=click.Choice(["CR", "LF", "CRLF"]), default="CRLF",
help="End of line mode, default=CRLF") type=click.Choice(["CR", "LF", "CRLF"]),
@click.option("--raw", help="End of line mode, default=CRLF",
is_flag=True, )
help="Do not apply any encodings/transformations") @click.option("--raw", is_flag=True, help="Do not apply any encodings/transformations")
@click.option("--exit-char", @click.option(
type=int, "--exit-char",
default=3, type=int,
help="ASCII code of special character that is used to exit " default=3,
"the application, default=3 (Ctrl+C)") help="ASCII code of special character that is used to exit "
@click.option("--menu-char", "the application, default=3 (Ctrl+C)",
type=int, )
default=20, @click.option(
help="ASCII code of special character that is used to " "--menu-char",
"control miniterm (menu), default=20 (DEC)") type=int,
@click.option("--quiet", default=20,
is_flag=True, help="ASCII code of special character that is used to "
help="Diagnostics: suppress non-error messages, default=Off") "control miniterm (menu), default=20 (DEC)",
@click.option("-d", )
"--project-dir", @click.option(
default=getcwd, "--quiet",
type=click.Path(exists=True, is_flag=True,
file_okay=False, help="Diagnostics: suppress non-error messages, default=Off",
dir_okay=True, )
resolve_path=True)) @click.option(
"-d",
"--project-dir",
default=getcwd,
type=click.Path(exists=True, file_okay=False, dir_okay=True, resolve_path=True),
)
@click.option( @click.option(
"-e", "-e",
"--environment", "--environment",
help="Load configuration from `platformio.ini` and specified environment") help="Load configuration from `platformio.ini` and specified environment",
)
def device_monitor(**kwargs): # pylint: disable=too-many-branches def device_monitor(**kwargs): # pylint: disable=too-many-branches
env_options = {} env_options = {}
try: try:
env_options = get_project_options(kwargs['project_dir'], with fs.cd(kwargs["project_dir"]):
kwargs['environment']) env_options = get_project_options(kwargs["environment"])
for k in ("port", "speed", "rts", "dtr"): for k in ("port", "speed", "rts", "dtr"):
k2 = "monitor_%s" % k k2 = "monitor_%s" % k
if k == "speed": if k == "speed":
@@ -173,10 +187,10 @@ def device_monitor(**kwargs): # pylint: disable=too-many-branches
except exception.NotPlatformIOProject: except exception.NotPlatformIOProject:
pass pass
if not kwargs['port']: if not kwargs["port"]:
ports = util.get_serial_ports(filter_hwid=True) ports = util.get_serial_ports(filter_hwid=True)
if len(ports) == 1: if len(ports) == 1:
kwargs['port'] = ports[0]['port'] kwargs["port"] = ports[0]["port"]
sys.argv = ["monitor"] + env_options.get("monitor_flags", []) sys.argv = ["monitor"] + env_options.get("monitor_flags", [])
for k, v in kwargs.items(): for k, v in kwargs.items():
@@ -194,23 +208,25 @@ def device_monitor(**kwargs): # pylint: disable=too-many-branches
else: else:
sys.argv.extend([k, str(v)]) sys.argv.extend([k, str(v)])
if kwargs['port'] and (set(["*", "?", "[", "]"]) & set(kwargs['port'])): if kwargs["port"] and (set(["*", "?", "[", "]"]) & set(kwargs["port"])):
for item in util.get_serial_ports(): for item in util.get_serial_ports():
if fnmatch(item['port'], kwargs['port']): if fnmatch(item["port"], kwargs["port"]):
kwargs['port'] = item['port'] kwargs["port"] = item["port"]
break break
try: try:
miniterm.main(default_port=kwargs['port'], miniterm.main(
default_baudrate=kwargs['baud'] or 9600, default_port=kwargs["port"],
default_rts=kwargs['rts'], default_baudrate=kwargs["baud"] or 9600,
default_dtr=kwargs['dtr']) default_rts=kwargs["rts"],
default_dtr=kwargs["dtr"],
)
except Exception as e: except Exception as e:
raise exception.MinitermException(e) raise exception.MinitermException(e)
def get_project_options(project_dir, environment=None): def get_project_options(environment=None):
config = ProjectConfig.get_instance(join(project_dir, "platformio.ini")) config = ProjectConfig.get_instance()
config.validate(envs=[environment] if environment else None) config.validate(envs=[environment] if environment else None)
if not environment: if not environment:
default_envs = config.default_envs() default_envs = config.default_envs()
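The glob handling above means --port accepts patterns such as /dev/ttyUSB* or COM[45], resolved against the detected serial ports with fnmatch. A minimal sketch of that resolution (port names are illustrative):

    from fnmatch import fnmatch

    def resolve_port(pattern, ports):
        # ports: list of dicts as returned by util.get_serial_ports()
        for item in ports:
            if fnmatch(item["port"], pattern):
                return item["port"]
        return pattern  # no match: keep the original value

    resolve_port("/dev/ttyUSB*", [{"port": "/dev/ttyUSB0"}])  # -> "/dev/ttyUSB0"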
View File
@@ -11,5 +11,3 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from platformio.commands.home.command import cli
View File
@@ -12,6 +12,8 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
# pylint: disable=too-many-locals
import mimetypes import mimetypes
import socket import socket
from os.path import isdir from os.path import isdir
@@ -19,8 +21,7 @@ from os.path import isdir
import click import click
from platformio import exception from platformio import exception
from platformio.managers.core import (get_core_package_dir, from platformio.managers.core import get_core_package_dir, inject_contrib_pysite
inject_contrib_pysite)
@click.command("home", short_help="PIO Home") @click.command("home", short_help="PIO Home")
@@ -28,17 +29,30 @@ from platformio.managers.core import (get_core_package_dir,
@click.option( @click.option(
"--host", "--host",
default="127.0.0.1", default="127.0.0.1",
help="HTTP host, default=127.0.0.1. " help=(
"You can open PIO Home for inbound connections with --host=0.0.0.0") "HTTP host, default=127.0.0.1. You can open PIO Home for inbound "
@click.option("--no-open", is_flag=True) # pylint: disable=too-many-locals "connections with --host=0.0.0.0"
def cli(port, host, no_open): ),
)
@click.option("--no-open", is_flag=True)
@click.option(
"--shutdown-timeout",
default=0,
type=int,
help=(
"Automatically shutdown server on timeout (in seconds) when no clients "
"are connected. Default is 0 which means never auto shutdown"
),
)
def cli(port, host, no_open, shutdown_timeout):
# pylint: disable=import-error, import-outside-toplevel
# import contrib modules # import contrib modules
inject_contrib_pysite() inject_contrib_pysite()
# pylint: disable=import-error
from autobahn.twisted.resource import WebSocketResource from autobahn.twisted.resource import WebSocketResource
from twisted.internet import reactor from twisted.internet import reactor
from twisted.web import server from twisted.web import server
# pylint: enable=import-error
from platformio.commands.home.rpc.handlers.app import AppRPC from platformio.commands.home.rpc.handlers.app import AppRPC
from platformio.commands.home.rpc.handlers.ide import IDERPC from platformio.commands.home.rpc.handlers.ide import IDERPC
from platformio.commands.home.rpc.handlers.misc import MiscRPC from platformio.commands.home.rpc.handlers.misc import MiscRPC
@@ -48,7 +62,7 @@ def cli(port, host, no_open):
from platformio.commands.home.rpc.server import JSONRPCServerFactory from platformio.commands.home.rpc.server import JSONRPCServerFactory
from platformio.commands.home.web import WebRoot from platformio.commands.home.web import WebRoot
factory = JSONRPCServerFactory() factory = JSONRPCServerFactory(shutdown_timeout)
factory.addHandler(AppRPC(), namespace="app") factory.addHandler(AppRPC(), namespace="app")
factory.addHandler(IDERPC(), namespace="ide") factory.addHandler(IDERPC(), namespace="ide")
factory.addHandler(MiscRPC(), namespace="misc") factory.addHandler(MiscRPC(), namespace="misc")
@@ -89,14 +103,18 @@ def cli(port, host, no_open):
else: else:
reactor.callLater(1, lambda: click.launch(home_url)) reactor.callLater(1, lambda: click.launch(home_url))
click.echo("\n".join([ click.echo(
"", "\n".join(
" ___I_", [
" /\\-_--\\ PlatformIO Home", "",
"/ \\_-__\\", " ___I_",
"|[]| [] | %s" % home_url, " /\\-_--\\ PlatformIO Home",
"|__|____|______________%s" % ("_" * len(host)), "/ \\_-__\\",
])) "|[]| [] | %s" % home_url,
"|__|____|______________%s" % ("_" * len(host)),
]
)
)
click.echo("") click.echo("")
click.echo("Open PIO Home in your browser by this URL => %s" % home_url) click.echo("Open PIO Home in your browser by this URL => %s" % home_url)
View File
@@ -27,7 +27,6 @@ from platformio.proc import where_is_program
class AsyncSession(requests.Session): class AsyncSession(requests.Session):
def __init__(self, n=None, *args, **kwargs): def __init__(self, n=None, *args, **kwargs):
if n: if n:
pool = reactor.getThreadPool() pool = reactor.getThreadPool()
@@ -51,7 +50,8 @@ def requests_session():
@util.memoized(expire="60s") @util.memoized(expire="60s")
def get_core_fullpath(): def get_core_fullpath():
return where_is_program( return where_is_program(
"platformio" + (".exe" if "windows" in util.get_systype() else "")) "platformio" + (".exe" if "windows" in util.get_systype() else "")
)
@util.memoized(expire="10s") @util.memoized(expire="10s")
@@ -60,9 +60,7 @@ def is_twitter_blocked():
timeout = 2 timeout = 2
try: try:
if os.getenv("HTTP_PROXY", os.getenv("HTTPS_PROXY")): if os.getenv("HTTP_PROXY", os.getenv("HTTPS_PROXY")):
requests.get("http://%s" % ip, requests.get("http://%s" % ip, allow_redirects=False, timeout=timeout)
allow_redirects=False,
timeout=timeout)
else: else:
socket.socket(socket.AF_INET, socket.SOCK_STREAM).connect((ip, 80)) socket.socket(socket.AF_INET, socket.SOCK_STREAM).connect((ip, 80))
return False return False
View File
@@ -14,11 +14,10 @@
from __future__ import absolute_import from __future__ import absolute_import
from os.path import expanduser, join from os.path import join
from platformio import __version__, app, util from platformio import __version__, app, fs, util
from platformio.project.helpers import (get_project_core_dir, from platformio.project.helpers import get_project_core_dir, is_platformio_project
is_platformio_project)
class AppRPC(object): class AppRPC(object):
@@ -26,8 +25,13 @@ class AppRPC(object):
APPSTATE_PATH = join(get_project_core_dir(), "homestate.json") APPSTATE_PATH = join(get_project_core_dir(), "homestate.json")
IGNORE_STORAGE_KEYS = [ IGNORE_STORAGE_KEYS = [
"cid", "coreVersion", "coreSystype", "coreCaller", "coreSettings", "cid",
"homeDir", "projectsDir" "coreVersion",
"coreSystype",
"coreCaller",
"coreSettings",
"homeDir",
"projectsDir",
] ]
@staticmethod @staticmethod
@@ -37,31 +41,28 @@ class AppRPC(object):
# base data # base data
caller_id = app.get_session_var("caller_id") caller_id = app.get_session_var("caller_id")
storage['cid'] = app.get_cid() storage["cid"] = app.get_cid()
storage['coreVersion'] = __version__ storage["coreVersion"] = __version__
storage['coreSystype'] = util.get_systype() storage["coreSystype"] = util.get_systype()
storage['coreCaller'] = (str(caller_id).lower() storage["coreCaller"] = str(caller_id).lower() if caller_id else None
if caller_id else None) storage["coreSettings"] = {
storage['coreSettings'] = {
name: { name: {
"description": data['description'], "description": data["description"],
"default_value": data['value'], "default_value": data["value"],
"value": app.get_setting(name) "value": app.get_setting(name),
} }
for name, data in app.DEFAULT_SETTINGS.items() for name, data in app.DEFAULT_SETTINGS.items()
} }
storage['homeDir'] = expanduser("~") storage["homeDir"] = fs.expanduser("~")
storage['projectsDir'] = storage['coreSettings']['projects_dir'][ storage["projectsDir"] = storage["coreSettings"]["projects_dir"]["value"]
'value']
# skip non-existing recent projects # skip non-existing recent projects
storage['recentProjects'] = [ storage["recentProjects"] = [
p for p in storage.get("recentProjects", []) p for p in storage.get("recentProjects", []) if is_platformio_project(p)
if is_platformio_project(p)
] ]
state['storage'] = storage state["storage"] = storage
state.modified = False # skip saving extra fields state.modified = False # skip saving extra fields
return state.as_dict() return state.as_dict()
View File
@@ -19,20 +19,18 @@ from twisted.internet import defer # pylint: disable=import-error
class IDERPC(object): class IDERPC(object):
def __init__(self): def __init__(self):
self._queue = {} self._queue = {}
def send_command(self, command, params, sid=0): def send_command(self, sid, command, params):
if not self._queue.get(sid): if not self._queue.get(sid):
raise jsonrpc.exceptions.JSONRPCDispatchException( raise jsonrpc.exceptions.JSONRPCDispatchException(
code=4005, message="PIO Home IDE agent is not started") code=4005, message="PIO Home IDE agent is not started"
)
while self._queue[sid]: while self._queue[sid]:
self._queue[sid].pop().callback({ self._queue[sid].pop().callback(
"id": time.time(), {"id": time.time(), "method": command, "params": params}
"method": command, )
"params": params
})
def listen_commands(self, sid=0): def listen_commands(self, sid=0):
if sid not in self._queue: if sid not in self._queue:
@@ -40,5 +38,10 @@ class IDERPC(object):
self._queue[sid].append(defer.Deferred()) self._queue[sid].append(defer.Deferred())
return self._queue[sid][-1] return self._queue[sid][-1]
def open_project(self, project_dir, sid=0): def open_project(self, sid, project_dir):
return self.send_command("open_project", project_dir, sid) return self.send_command(sid, "open_project", project_dir)
def open_text_document(self, sid, path, line=None, column=None):
return self.send_command(
sid, "open_text_document", dict(path=path, line=line, column=column)
)
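Each command queued above resolves a pending listen_commands deferred with a plain dict; for open_text_document the payload delivered to the IDE agent looks roughly like this (timestamp and path are illustrative):

    payload = {
        "id": 1573123456.789,  # time.time() at send time
        "method": "open_text_document",
        "params": {"path": "/path/to/src/main.cpp", "line": 42, "column": 3},
    }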
View File
@@ -22,33 +22,31 @@ from platformio.commands.home.rpc.handlers.os import OSRPC
class MiscRPC(object): class MiscRPC(object):
def load_latest_tweets(self, data_url):
def load_latest_tweets(self, username): cache_key = data_url
cache_key = "piohome_latest_tweets_" + str(username)
cache_valid = "7d" cache_valid = "7d"
with app.ContentCache() as cc: with app.ContentCache() as cc:
cache_data = cc.get(cache_key) cache_data = cc.get(cache_key)
if cache_data: if cache_data:
cache_data = json.loads(cache_data) cache_data = json.loads(cache_data)
# automatically update cache in background every 12 hours # automatically update cache in background every 12 hours
if cache_data['time'] < (time.time() - (3600 * 12)): if cache_data["time"] < (time.time() - (3600 * 12)):
reactor.callLater(5, self._preload_latest_tweets, username, reactor.callLater(
cache_key, cache_valid) 5, self._preload_latest_tweets, data_url, cache_key, cache_valid
return cache_data['result'] )
return cache_data["result"]
result = self._preload_latest_tweets(username, cache_key, cache_valid) result = self._preload_latest_tweets(data_url, cache_key, cache_valid)
return result return result
@staticmethod @staticmethod
@defer.inlineCallbacks @defer.inlineCallbacks
def _preload_latest_tweets(username, cache_key, cache_valid): def _preload_latest_tweets(data_url, cache_key, cache_valid):
result = yield OSRPC.fetch_content( result = json.loads((yield OSRPC.fetch_content(data_url)))
"https://api.platformio.org/tweets/" + username)
result = json.loads(result)
with app.ContentCache() as cc: with app.ContentCache() as cc:
cc.set(cache_key, cc.set(
json.dumps({ cache_key,
"time": int(time.time()), json.dumps({"time": int(time.time()), "result": result}),
"result": result cache_valid,
}), cache_valid) )
defer.returnValue(result) defer.returnValue(result)
View File
@@ -19,30 +19,29 @@ import glob
import os import os
import shutil import shutil
from functools import cmp_to_key from functools import cmp_to_key
from os.path import expanduser, isdir, isfile, join from os.path import isdir, isfile, join
import click import click
from twisted.internet import defer # pylint: disable=import-error from twisted.internet import defer # pylint: disable=import-error
from platformio import app, util from platformio import app, fs, util
from platformio.commands.home import helpers from platformio.commands.home import helpers
from platformio.compat import PY2, get_filesystem_encoding from platformio.compat import PY2, get_filesystem_encoding
class OSRPC(object): class OSRPC(object):
@staticmethod @staticmethod
@defer.inlineCallbacks @defer.inlineCallbacks
def fetch_content(uri, data=None, headers=None, cache_valid=None): def fetch_content(uri, data=None, headers=None, cache_valid=None):
if not headers: if not headers:
headers = { headers = {
"User-Agent": "User-Agent": (
("Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_6) " "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_6) "
"AppleWebKit/603.3.8 (KHTML, like Gecko) Version/10.1.2 " "AppleWebKit/603.3.8 (KHTML, like Gecko) Version/10.1.2 "
"Safari/603.3.8") "Safari/603.3.8"
)
} }
cache_key = (app.ContentCache.key_from_args(uri, data) cache_key = app.ContentCache.key_from_args(uri, data) if cache_valid else None
if cache_valid else None)
with app.ContentCache() as cc: with app.ContentCache() as cc:
if cache_key: if cache_key:
result = cc.get(cache_key) result = cc.get(cache_key)
@@ -66,7 +65,7 @@ class OSRPC(object):
defer.returnValue(result) defer.returnValue(result)
def request_content(self, uri, data=None, headers=None, cache_valid=None): def request_content(self, uri, data=None, headers=None, cache_valid=None):
if uri.startswith('http'): if uri.startswith("http"):
return self.fetch_content(uri, data, headers, cache_valid) return self.fetch_content(uri, data, headers, cache_valid)
if not isfile(uri): if not isfile(uri):
return None return None
@@ -80,8 +79,12 @@ class OSRPC(object):
@staticmethod @staticmethod
def reveal_file(path): def reveal_file(path):
return click.launch( return click.launch(
path.encode(get_filesystem_encoding()) if PY2 else path, path.encode(get_filesystem_encoding()) if PY2 else path, locate=True
locate=True) )
@staticmethod
def open_file(path):
return click.launch(path.encode(get_filesystem_encoding()) if PY2 else path)
@staticmethod @staticmethod
def is_file(path): def is_file(path):
@@ -109,13 +112,11 @@ class OSRPC(object):
pathnames = [pathnames] pathnames = [pathnames]
result = set() result = set()
for pathname in pathnames: for pathname in pathnames:
result |= set( result |= set(glob.glob(join(root, pathname) if root else pathname))
glob.glob(join(root, pathname) if root else pathname))
return list(result) return list(result)
@staticmethod @staticmethod
def list_dir(path): def list_dir(path):
def _cmp(x, y): def _cmp(x, y):
if x[1] and not y[1]: if x[1] and not y[1]:
return -1 return -1
@@ -129,7 +130,7 @@ class OSRPC(object):
items = [] items = []
if path.startswith("~"): if path.startswith("~"):
path = expanduser(path) path = fs.expanduser(path)
if not isdir(path): if not isdir(path):
return items return items
for item in os.listdir(path): for item in os.listdir(path):
@@ -146,7 +147,7 @@ class OSRPC(object):
def get_logical_devices(): def get_logical_devices():
items = [] items = []
for item in util.get_logical_devices(): for item in util.get_logical_devices():
if item['name']: if item["name"]:
item['name'] = item['name'] item["name"] = item["name"]
items.append(item) items.append(item)
return items return items
View File
@@ -27,8 +27,7 @@ from twisted.internet import utils # pylint: disable=import-error
from platformio import __main__, __version__, fs from platformio import __main__, __version__, fs
from platformio.commands.home import helpers from platformio.commands.home import helpers
from platformio.compat import (PY2, get_filesystem_encoding, is_bytes, from platformio.compat import PY2, get_filesystem_encoding, is_bytes, string_types
string_types)
try: try:
from thread import get_ident as thread_get_ident from thread import get_ident as thread_get_ident
@@ -37,7 +36,6 @@ except ImportError:
class MultiThreadingStdStream(object): class MultiThreadingStdStream(object):
def __init__(self, parent_stream): def __init__(self, parent_stream):
self._buffers = {thread_get_ident(): parent_stream} self._buffers = {thread_get_ident(): parent_stream}
@@ -54,7 +52,8 @@ class MultiThreadingStdStream(object):
thread_id = thread_get_ident() thread_id = thread_get_ident()
self._ensure_thread_buffer(thread_id) self._ensure_thread_buffer(thread_id)
return self._buffers[thread_id].write( return self._buffers[thread_id].write(
value.decode() if is_bytes(value) else value) value.decode() if is_bytes(value) else value
)
def get_value_and_reset(self): def get_value_and_reset(self):
result = "" result = ""
@@ -68,7 +67,6 @@ class MultiThreadingStdStream(object):
class PIOCoreRPC(object): class PIOCoreRPC(object):
@staticmethod @staticmethod
def version(): def version():
return __version__ return __version__
@@ -104,16 +102,15 @@ class PIOCoreRPC(object):
else: else:
result = yield PIOCoreRPC._call_inline(args, options) result = yield PIOCoreRPC._call_inline(args, options)
try: try:
defer.returnValue( defer.returnValue(PIOCoreRPC._process_result(result, to_json))
PIOCoreRPC._process_result(result, to_json))
except ValueError: except ValueError:
# fall-back to subprocess method # fall-back to subprocess method
result = yield PIOCoreRPC._call_subprocess(args, options) result = yield PIOCoreRPC._call_subprocess(args, options)
defer.returnValue( defer.returnValue(PIOCoreRPC._process_result(result, to_json))
PIOCoreRPC._process_result(result, to_json))
except Exception as e: # pylint: disable=bare-except except Exception as e: # pylint: disable=bare-except
raise jsonrpc.exceptions.JSONRPCDispatchException( raise jsonrpc.exceptions.JSONRPCDispatchException(
code=4003, message="PIO Core Call Error", data=str(e)) code=4003, message="PIO Core Call Error", data=str(e)
)
@staticmethod @staticmethod
def _call_inline(args, options): def _call_inline(args, options):
@@ -123,8 +120,11 @@ class PIOCoreRPC(object):
def _thread_task(): def _thread_task():
with fs.cd(cwd): with fs.cd(cwd):
exit_code = __main__.main(["-c"] + args) exit_code = __main__.main(["-c"] + args)
return (PIOCoreRPC.thread_stdout.get_value_and_reset(), return (
PIOCoreRPC.thread_stderr.get_value_and_reset(), exit_code) PIOCoreRPC.thread_stdout.get_value_and_reset(),
PIOCoreRPC.thread_stderr.get_value_and_reset(),
exit_code,
)
return threads.deferToThread(_thread_task) return threads.deferToThread(_thread_task)
@@ -135,8 +135,8 @@ class PIOCoreRPC(object):
helpers.get_core_fullpath(), helpers.get_core_fullpath(),
args, args,
path=cwd, path=cwd,
env={k: v env={k: v for k, v in os.environ.items() if "%" not in k},
for k, v in os.environ.items() if "%" not in k}) )
@staticmethod @staticmethod
def _process_result(result, to_json=False): def _process_result(result, to_json=False):
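
The per-thread stdout/stderr capture that PIOCoreRPC relies on boils down to routing write() calls through a dictionary keyed by thread id. A simplified standard-library sketch, assuming Python 3 (the real MultiThreadingStdStream shown above also falls back to thread.get_ident on Python 2; the class name below is illustrative):

    import io
    import threading


    class ThreadLocalBuffer(object):
        """Route writes to a per-thread buffer, roughly like MultiThreadingStdStream."""

        def __init__(self):
            self._buffers = {}

        def write(self, value):
            buf = self._buffers.setdefault(threading.get_ident(), io.StringIO())
            # Decode bytes on the fly, mirroring the is_bytes() check above
            return buf.write(value.decode() if isinstance(value, bytes) else value)

        def get_value_and_reset(self):
            buf = self._buffers.get(threading.get_ident())
            if buf is None:
                return ""
            result = buf.getvalue()
            buf.seek(0)
            buf.truncate()
            return result


    stream = ThreadLocalBuffer()
    stream.write("captured per thread\n")
    print(stream.get_value_and_reset(), end="")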


@@ -17,8 +17,7 @@ from __future__ import absolute_import
import os import os
import shutil import shutil
import time import time
from os.path import (basename, expanduser, getmtime, isdir, isfile, join, from os.path import basename, getmtime, isdir, isfile, join, realpath, sep
realpath, sep)
import jsonrpc # pylint: disable=import-error import jsonrpc # pylint: disable=import-error
@@ -29,38 +28,75 @@ from platformio.compat import PY2, get_filesystem_encoding
from platformio.ide.projectgenerator import ProjectGenerator from platformio.ide.projectgenerator import ProjectGenerator
from platformio.managers.platform import PlatformManager from platformio.managers.platform import PlatformManager
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
from platformio.project.helpers import (get_project_libdeps_dir, from platformio.project.helpers import get_project_dir, is_platformio_project
get_project_src_dir, from platformio.project.options import get_config_options_schema
is_platformio_project)
class ProjectRPC(object): class ProjectRPC(object):
@staticmethod
def config_call(init_kwargs, method, *args):
assert isinstance(init_kwargs, dict)
assert "path" in init_kwargs
project_dir = get_project_dir()
if isfile(init_kwargs["path"]):
project_dir = os.path.dirname(init_kwargs["path"])
with fs.cd(project_dir):
return getattr(ProjectConfig(**init_kwargs), method)(*args)
@staticmethod
def config_load(path):
return ProjectConfig(
path, parse_extra=False, expand_interpolations=False
).as_tuple()
@staticmethod
def config_dump(path, data):
config = ProjectConfig(path, parse_extra=False, expand_interpolations=False)
config.update(data, clear=True)
return config.save()
@staticmethod
def config_update_description(path, text):
config = ProjectConfig(path, parse_extra=False, expand_interpolations=False)
if not config.has_section("platformio"):
config.add_section("platformio")
if text:
config.set("platformio", "description", text)
else:
if config.has_option("platformio", "description"):
config.remove_option("platformio", "description")
if not config.options("platformio"):
config.remove_section("platformio")
return config.save()
@staticmethod
def get_config_schema():
return get_config_options_schema()
@staticmethod @staticmethod
def _get_projects(project_dirs=None): def _get_projects(project_dirs=None):
def _get_project_data():
def _get_project_data(project_dir):
data = {"boards": [], "envLibdepsDirs": [], "libExtraDirs": []} data = {"boards": [], "envLibdepsDirs": [], "libExtraDirs": []}
config = ProjectConfig(join(project_dir, "platformio.ini")) config = ProjectConfig()
libdeps_dir = get_project_libdeps_dir() data["envs"] = config.envs()
data["description"] = config.get("platformio", "description")
data['libExtraDirs'].extend( data["libExtraDirs"].extend(config.get("platformio", "lib_extra_dirs", []))
config.get("platformio", "lib_extra_dirs", []))
libdeps_dir = config.get_optional_dir("libdeps")
for section in config.sections(): for section in config.sections():
if not section.startswith("env:"): if not section.startswith("env:"):
continue continue
data['envLibdepsDirs'].append(join(libdeps_dir, section[4:])) data["envLibdepsDirs"].append(join(libdeps_dir, section[4:]))
if config.has_option(section, "board"): if config.has_option(section, "board"):
data['boards'].append(config.get(section, "board")) data["boards"].append(config.get(section, "board"))
data['libExtraDirs'].extend( data["libExtraDirs"].extend(config.get(section, "lib_extra_dirs", []))
config.get(section, "lib_extra_dirs", []))
# skip non existing folders and resolve full path # skip non existing folders and resolve full path
for key in ("envLibdepsDirs", "libExtraDirs"): for key in ("envLibdepsDirs", "libExtraDirs"):
data[key] = [ data[key] = [
expanduser(d) if d.startswith("~") else realpath(d) fs.expanduser(d) if d.startswith("~") else realpath(d)
for d in data[key] if isdir(d) for d in data[key]
if isdir(d)
] ]
return data return data
@@ -69,7 +105,7 @@ class ProjectRPC(object):
return (sep).join(path.split(sep)[-2:]) return (sep).join(path.split(sep)[-2:])
if not project_dirs: if not project_dirs:
project_dirs = AppRPC.load_state()['storage']['recentProjects'] project_dirs = AppRPC.load_state()["storage"]["recentProjects"]
result = [] result = []
pm = PlatformManager() pm = PlatformManager()
@@ -78,36 +114,36 @@ class ProjectRPC(object):
boards = [] boards = []
try: try:
with fs.cd(project_dir): with fs.cd(project_dir):
data = _get_project_data(project_dir) data = _get_project_data()
except exception.PlatformIOProjectException: except exception.PlatformIOProjectException:
continue continue
for board_id in data.get("boards", []): for board_id in data.get("boards", []):
name = board_id name = board_id
try: try:
name = pm.board_config(board_id)['name'] name = pm.board_config(board_id)["name"]
except exception.PlatformioException: except exception.PlatformioException:
pass pass
boards.append({"id": board_id, "name": name}) boards.append({"id": board_id, "name": name})
result.append({ result.append(
"path": {
project_dir, "path": project_dir,
"name": "name": _path_to_name(project_dir),
_path_to_name(project_dir), "modified": int(getmtime(project_dir)),
"modified": "boards": boards,
int(getmtime(project_dir)), "description": data.get("description"),
"boards": "envs": data.get("envs", []),
boards, "envLibStorages": [
"envLibStorages": [{ {"name": basename(d), "path": d}
"name": basename(d), for d in data.get("envLibdepsDirs", [])
"path": d ],
} for d in data.get("envLibdepsDirs", [])], "extraLibStorages": [
"extraLibStorages": [{ {"name": _path_to_name(d), "path": d}
"name": _path_to_name(d), for d in data.get("libExtraDirs", [])
"path": d ],
} for d in data.get("libExtraDirs", [])] }
}) )
return result return result
def get_projects(self, project_dirs=None): def get_projects(self, project_dirs=None):
@@ -117,7 +153,7 @@ class ProjectRPC(object):
def get_project_examples(): def get_project_examples():
result = [] result = []
for manifest in PlatformManager().get_installed(): for manifest in PlatformManager().get_installed():
examples_dir = join(manifest['__pkg_dir'], "examples") examples_dir = join(manifest["__pkg_dir"], "examples")
if not isdir(examples_dir): if not isdir(examples_dir):
continue continue
items = [] items = []
@@ -126,28 +162,30 @@ class ProjectRPC(object):
try: try:
config = ProjectConfig(join(project_dir, "platformio.ini")) config = ProjectConfig(join(project_dir, "platformio.ini"))
config.validate(silent=True) config.validate(silent=True)
project_description = config.get("platformio", project_description = config.get("platformio", "description")
"description")
except exception.PlatformIOProjectException: except exception.PlatformIOProjectException:
continue continue
path_tokens = project_dir.split(sep) path_tokens = project_dir.split(sep)
items.append({ items.append(
"name": {
"/".join(path_tokens[path_tokens.index("examples") + 1:]), "name": "/".join(
"path": path_tokens[path_tokens.index("examples") + 1 :]
project_dir, ),
"description": "path": project_dir,
project_description "description": project_description,
}) }
result.append({ )
"platform": { result.append(
"title": manifest['title'], {
"version": manifest['version'] "platform": {
}, "title": manifest["title"],
"items": sorted(items, key=lambda item: item['name']) "version": manifest["version"],
}) },
return sorted(result, key=lambda data: data['platform']['title']) "items": sorted(items, key=lambda item: item["name"]),
}
)
return sorted(result, key=lambda data: data["platform"]["title"])
def init(self, board, framework, project_dir): def init(self, board, framework, project_dir):
assert project_dir assert project_dir
@@ -157,9 +195,11 @@ class ProjectRPC(object):
args = ["init", "--board", board] args = ["init", "--board", board]
if framework: if framework:
args.extend(["--project-option", "framework = %s" % framework]) args.extend(["--project-option", "framework = %s" % framework])
if (state['storage']['coreCaller'] and state['storage']['coreCaller'] if (
in ProjectGenerator.get_supported_ides()): state["storage"]["coreCaller"]
args.extend(["--ide", state['storage']['coreCaller']]) and state["storage"]["coreCaller"] in ProjectGenerator.get_supported_ides()
):
args.extend(["--ide", state["storage"]["coreCaller"]])
d = PIOCoreRPC.call(args, options={"cwd": project_dir}) d = PIOCoreRPC.call(args, options={"cwd": project_dir})
d.addCallback(self._generate_project_main, project_dir, framework) d.addCallback(self._generate_project_main, project_dir, framework)
return d return d
@@ -168,89 +208,99 @@ class ProjectRPC(object):
def _generate_project_main(_, project_dir, framework): def _generate_project_main(_, project_dir, framework):
main_content = None main_content = None
if framework == "arduino": if framework == "arduino":
main_content = "\n".join([ main_content = "\n".join(
"#include <Arduino.h>", [
"", "#include <Arduino.h>",
"void setup() {", "",
" // put your setup code here, to run once:", "void setup() {",
"}", " // put your setup code here, to run once:",
"", "}",
"void loop() {", "",
" // put your main code here, to run repeatedly:", "void loop() {",
"}" " // put your main code here, to run repeatedly:",
"" "}",
]) # yapf: disable "",
]
)
elif framework == "mbed": elif framework == "mbed":
main_content = "\n".join([ main_content = "\n".join(
"#include <mbed.h>", [
"", "#include <mbed.h>",
"int main() {", "",
"", "int main() {",
" // put your setup code here, to run once:", "",
"", " // put your setup code here, to run once:",
" while(1) {", "",
" // put your main code here, to run repeatedly:", " while(1) {",
" }", " // put your main code here, to run repeatedly:",
"}", " }",
"" "}",
]) # yapf: disable "",
]
)
if not main_content: if not main_content:
return project_dir return project_dir
with fs.cd(project_dir): with fs.cd(project_dir):
src_dir = get_project_src_dir() config = ProjectConfig()
src_dir = config.get_optional_dir("src")
main_path = join(src_dir, "main.cpp") main_path = join(src_dir, "main.cpp")
if isfile(main_path): if isfile(main_path):
return project_dir return project_dir
if not isdir(src_dir): if not isdir(src_dir):
os.makedirs(src_dir) os.makedirs(src_dir)
with open(main_path, "w") as f: fs.write_file_contents(main_path, main_content.strip())
f.write(main_content.strip())
return project_dir return project_dir
def import_arduino(self, board, use_arduino_libs, arduino_project_dir): def import_arduino(self, board, use_arduino_libs, arduino_project_dir):
board = str(board) board = str(board)
if arduino_project_dir and PY2: if arduino_project_dir and PY2:
arduino_project_dir = arduino_project_dir.encode( arduino_project_dir = arduino_project_dir.encode(get_filesystem_encoding())
get_filesystem_encoding())
# don't import PIO Project # don't import PIO Project
if is_platformio_project(arduino_project_dir): if is_platformio_project(arduino_project_dir):
return arduino_project_dir return arduino_project_dir
is_arduino_project = any([ is_arduino_project = any(
isfile( [
join(arduino_project_dir, isfile(
"%s.%s" % (basename(arduino_project_dir), ext))) join(
for ext in ("ino", "pde") arduino_project_dir,
]) "%s.%s" % (basename(arduino_project_dir), ext),
)
)
for ext in ("ino", "pde")
]
)
if not is_arduino_project: if not is_arduino_project:
raise jsonrpc.exceptions.JSONRPCDispatchException( raise jsonrpc.exceptions.JSONRPCDispatchException(
code=4000, code=4000, message="Not an Arduino project: %s" % arduino_project_dir
message="Not an Arduino project: %s" % arduino_project_dir) )
state = AppRPC.load_state() state = AppRPC.load_state()
project_dir = join(state['storage']['projectsDir'], project_dir = join(
time.strftime("%y%m%d-%H%M%S-") + board) state["storage"]["projectsDir"], time.strftime("%y%m%d-%H%M%S-") + board
)
if not isdir(project_dir): if not isdir(project_dir):
os.makedirs(project_dir) os.makedirs(project_dir)
args = ["init", "--board", board] args = ["init", "--board", board]
args.extend(["--project-option", "framework = arduino"]) args.extend(["--project-option", "framework = arduino"])
if use_arduino_libs: if use_arduino_libs:
args.extend([ args.extend(
"--project-option", ["--project-option", "lib_extra_dirs = ~/Documents/Arduino/libraries"]
"lib_extra_dirs = ~/Documents/Arduino/libraries" )
]) if (
if (state['storage']['coreCaller'] and state['storage']['coreCaller'] state["storage"]["coreCaller"]
in ProjectGenerator.get_supported_ides()): and state["storage"]["coreCaller"] in ProjectGenerator.get_supported_ides()
args.extend(["--ide", state['storage']['coreCaller']]) ):
args.extend(["--ide", state["storage"]["coreCaller"]])
d = PIOCoreRPC.call(args, options={"cwd": project_dir}) d = PIOCoreRPC.call(args, options={"cwd": project_dir})
d.addCallback(self._finalize_arduino_import, project_dir, d.addCallback(self._finalize_arduino_import, project_dir, arduino_project_dir)
arduino_project_dir)
return d return d
@staticmethod @staticmethod
def _finalize_arduino_import(_, project_dir, arduino_project_dir): def _finalize_arduino_import(_, project_dir, arduino_project_dir):
with fs.cd(project_dir): with fs.cd(project_dir):
src_dir = get_project_src_dir() config = ProjectConfig()
src_dir = config.get_optional_dir("src")
if isdir(src_dir): if isdir(src_dir):
fs.rmtree(src_dir) fs.rmtree(src_dir)
shutil.copytree(arduino_project_dir, src_dir) shutil.copytree(arduino_project_dir, src_dir)
@@ -260,18 +310,21 @@ class ProjectRPC(object):
def import_pio(project_dir): def import_pio(project_dir):
if not project_dir or not is_platformio_project(project_dir): if not project_dir or not is_platformio_project(project_dir):
raise jsonrpc.exceptions.JSONRPCDispatchException( raise jsonrpc.exceptions.JSONRPCDispatchException(
code=4001, code=4001, message="Not an PlatformIO project: %s" % project_dir
message="Not an PlatformIO project: %s" % project_dir) )
new_project_dir = join( new_project_dir = join(
AppRPC.load_state()['storage']['projectsDir'], AppRPC.load_state()["storage"]["projectsDir"],
time.strftime("%y%m%d-%H%M%S-") + basename(project_dir)) time.strftime("%y%m%d-%H%M%S-") + basename(project_dir),
)
shutil.copytree(project_dir, new_project_dir) shutil.copytree(project_dir, new_project_dir)
state = AppRPC.load_state() state = AppRPC.load_state()
args = ["init"] args = ["init"]
if (state['storage']['coreCaller'] and state['storage']['coreCaller'] if (
in ProjectGenerator.get_supported_ides()): state["storage"]["coreCaller"]
args.extend(["--ide", state['storage']['coreCaller']]) and state["storage"]["coreCaller"] in ProjectGenerator.get_supported_ides()
):
args.extend(["--ide", state["storage"]["coreCaller"]])
d = PIOCoreRPC.call(args, options={"cwd": new_project_dir}) d = PIOCoreRPC.call(args, options={"cwd": new_project_dir})
d.addCallback(lambda _: new_project_dir) d.addCallback(lambda _: new_project_dir)
return d return d
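
The new config_update_description RPC is, at its core, ordinary INI manipulation: ensure a [platformio] section exists, set or drop the description option, and prune the section once it becomes empty. A rough standalone approximation with configparser, assuming none of ProjectConfig's extra validation or interpolation (the file name in the usage line is just an example):

    import configparser


    def update_description(path, text):
        # Approximates ProjectRPC.config_update_description with plain configparser
        config = configparser.ConfigParser()
        config.read(path)
        if not config.has_section("platformio"):
            config.add_section("platformio")
        if text:
            config.set("platformio", "description", text)
        else:
            if config.has_option("platformio", "description"):
                config.remove_option("platformio", "description")
            if not config.options("platformio"):
                config.remove_section("platformio")
        with open(path, "w") as fp:
            config.write(fp)


    update_description("platformio.ini", "Blink example for Uno")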


@@ -16,49 +16,58 @@
import click import click
import jsonrpc import jsonrpc
from autobahn.twisted.websocket import (WebSocketServerFactory, from autobahn.twisted.websocket import WebSocketServerFactory, WebSocketServerProtocol
WebSocketServerProtocol)
from jsonrpc.exceptions import JSONRPCDispatchException from jsonrpc.exceptions import JSONRPCDispatchException
from twisted.internet import defer from twisted.internet import defer, reactor
from platformio.compat import PY2, dump_json_to_unicode, is_bytes from platformio.compat import PY2, dump_json_to_unicode, is_bytes
class JSONRPCServerProtocol(WebSocketServerProtocol): class JSONRPCServerProtocol(WebSocketServerProtocol):
def onOpen(self):
self.factory.connection_nums += 1
if self.factory.shutdown_timer:
self.factory.shutdown_timer.cancel()
self.factory.shutdown_timer = None
def onClose(self, wasClean, code, reason): # pylint: disable=unused-argument
self.factory.connection_nums -= 1
if self.factory.connection_nums == 0:
self.factory.shutdownByTimeout()
def onMessage(self, payload, isBinary): # pylint: disable=unused-argument def onMessage(self, payload, isBinary): # pylint: disable=unused-argument
# click.echo("> %s" % payload) # click.echo("> %s" % payload)
response = jsonrpc.JSONRPCResponseManager.handle( response = jsonrpc.JSONRPCResponseManager.handle(
payload, self.factory.dispatcher).data payload, self.factory.dispatcher
).data
# if error # if error
if "result" not in response: if "result" not in response:
self.sendJSONResponse(response) self.sendJSONResponse(response)
return None return None
d = defer.maybeDeferred(lambda: response['result']) d = defer.maybeDeferred(lambda: response["result"])
d.addCallback(self._callback, response) d.addCallback(self._callback, response)
d.addErrback(self._errback, response) d.addErrback(self._errback, response)
return None return None
def _callback(self, result, response): def _callback(self, result, response):
response['result'] = result response["result"] = result
self.sendJSONResponse(response) self.sendJSONResponse(response)
def _errback(self, failure, response): def _errback(self, failure, response):
if isinstance(failure.value, JSONRPCDispatchException): if isinstance(failure.value, JSONRPCDispatchException):
e = failure.value e = failure.value
else: else:
e = JSONRPCDispatchException(code=4999, e = JSONRPCDispatchException(code=4999, message=failure.getErrorMessage())
message=failure.getErrorMessage())
del response["result"] del response["result"]
response['error'] = e.error._data # pylint: disable=protected-access response["error"] = e.error._data # pylint: disable=protected-access
self.sendJSONResponse(response) self.sendJSONResponse(response)
def sendJSONResponse(self, response): def sendJSONResponse(self, response):
# click.echo("< %s" % response) # click.echo("< %s" % response)
if "error" in response: if "error" in response:
click.secho("Error: %s" % response['error'], fg="red", err=True) click.secho("Error: %s" % response["error"], fg="red", err=True)
response = dump_json_to_unicode(response) response = dump_json_to_unicode(response)
if not PY2 and not is_bytes(response): if not PY2 and not is_bytes(response):
response = response.encode("utf-8") response = response.encode("utf-8")
@@ -68,10 +77,25 @@ class JSONRPCServerProtocol(WebSocketServerProtocol):
class JSONRPCServerFactory(WebSocketServerFactory): class JSONRPCServerFactory(WebSocketServerFactory):
protocol = JSONRPCServerProtocol protocol = JSONRPCServerProtocol
connection_nums = 0
shutdown_timer = 0
def __init__(self): def __init__(self, shutdown_timeout=0):
super(JSONRPCServerFactory, self).__init__() super(JSONRPCServerFactory, self).__init__()
self.shutdown_timeout = shutdown_timeout
self.dispatcher = jsonrpc.Dispatcher() self.dispatcher = jsonrpc.Dispatcher()
def shutdownByTimeout(self):
if self.shutdown_timeout < 1:
return
def _auto_shutdown_server():
click.echo("Automatically shutdown server on timeout")
reactor.stop()
self.shutdown_timer = reactor.callLater(
self.shutdown_timeout, _auto_shutdown_server
)
def addHandler(self, handler, namespace): def addHandler(self, handler, namespace):
self.dispatcher.build_method_map(handler, prefix="%s." % namespace) self.dispatcher.build_method_map(handler, prefix="%s." % namespace)
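
The --shutdown-timeout behaviour added to the JSON-RPC factory is an idle timer: count open WebSocket connections, and when the last one closes, schedule reactor.stop() after the configured delay, cancelling the timer if a client reconnects first. A condensed sketch of that pattern, assuming Twisted is installed (class and method names are illustrative):

    from twisted.internet import reactor


    class IdleShutdownMixin(object):
        """Stop the reactor when the last client disconnects and nobody
        reconnects within `shutdown_timeout` seconds (0 disables the timer)."""

        shutdown_timeout = 0
        connection_nums = 0
        shutdown_timer = None

        def client_connected(self):
            self.connection_nums += 1
            # A returning client cancels any pending shutdown
            if self.shutdown_timer:
                self.shutdown_timer.cancel()
                self.shutdown_timer = None

        def client_disconnected(self):
            self.connection_nums -= 1
            if self.connection_nums == 0 and self.shutdown_timeout > 0:
                self.shutdown_timer = reactor.callLater(
                    self.shutdown_timeout, reactor.stop
                )

In the diff above, onOpen and onClose on the protocol adjust the factory's connection counter in exactly this way.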


@@ -17,14 +17,12 @@ from twisted.web import static # pylint: disable=import-error
class WebRoot(static.File): class WebRoot(static.File):
def render_GET(self, request): def render_GET(self, request):
if request.args.get("__shutdown__", False): if request.args.get("__shutdown__", False):
reactor.stop() reactor.stop()
return "Server has been stopped" return "Server has been stopped"
request.setHeader("cache-control", request.setHeader("cache-control", "no-cache, no-store, must-revalidate")
"no-cache, no-store, must-revalidate")
request.setHeader("pragma", "no-cache") request.setHeader("pragma", "no-cache")
request.setHeader("expires", "0") request.setHeader("expires", "0")
return static.File.render_GET(self, request) return static.File.render_GET(self, request)


@@ -20,16 +20,11 @@ from os.path import isdir, isfile, join
import click import click
from platformio import exception, fs from platformio import exception, fs
from platformio.commands.platform import \ from platformio.commands.platform import platform_install as cli_platform_install
platform_install as cli_platform_install
from platformio.ide.projectgenerator import ProjectGenerator from platformio.ide.projectgenerator import ProjectGenerator
from platformio.managers.platform import PlatformManager from platformio.managers.platform import PlatformManager
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
from platformio.project.helpers import (get_project_include_dir, from platformio.project.helpers import is_platformio_project
get_project_lib_dir,
get_project_src_dir,
get_project_test_dir,
is_platformio_project)
def validate_boards(ctx, param, value): # pylint: disable=W0613 def validate_boards(ctx, param, value): # pylint: disable=W0613
@@ -40,66 +35,66 @@ def validate_boards(ctx, param, value): # pylint: disable=W0613
except exception.UnknownBoard: except exception.UnknownBoard:
raise click.BadParameter( raise click.BadParameter(
"`%s`. Please search for board ID using `platformio boards` " "`%s`. Please search for board ID using `platformio boards` "
"command" % id_) "command" % id_
)
return value return value
@click.command("init", @click.command("init", short_help="Initialize PlatformIO project or update existing")
short_help="Initialize PlatformIO project or update existing") @click.option(
@click.option("--project-dir", "--project-dir",
"-d", "-d",
default=getcwd, default=getcwd,
type=click.Path(exists=True, type=click.Path(
file_okay=False, exists=True, file_okay=False, dir_okay=True, writable=True, resolve_path=True
dir_okay=True, ),
writable=True, )
resolve_path=True)) @click.option("-b", "--board", multiple=True, metavar="ID", callback=validate_boards)
@click.option("-b", @click.option("--ide", type=click.Choice(ProjectGenerator.get_supported_ides()))
"--board",
multiple=True,
metavar="ID",
callback=validate_boards)
@click.option("--ide",
type=click.Choice(ProjectGenerator.get_supported_ides()))
@click.option("-O", "--project-option", multiple=True) @click.option("-O", "--project-option", multiple=True)
@click.option("--env-prefix", default="") @click.option("--env-prefix", default="")
@click.option("-s", "--silent", is_flag=True) @click.option("-s", "--silent", is_flag=True)
@click.pass_context @click.pass_context
def cli( def cli(
ctx, # pylint: disable=R0913 ctx, # pylint: disable=R0913
project_dir, project_dir,
board, board,
ide, ide,
project_option, project_option,
env_prefix, env_prefix,
silent): silent,
):
if not silent: if not silent:
if project_dir == getcwd(): if project_dir == getcwd():
click.secho("\nThe current working directory", click.secho("\nThe current working directory", fg="yellow", nl=False)
fg="yellow",
nl=False)
click.secho(" %s " % project_dir, fg="cyan", nl=False) click.secho(" %s " % project_dir, fg="cyan", nl=False)
click.secho("will be used for the project.", fg="yellow") click.secho("will be used for the project.", fg="yellow")
click.echo("") click.echo("")
click.echo("The next files/directories have been created in %s" % click.echo(
click.style(project_dir, fg="cyan")) "The next files/directories have been created in %s"
click.echo("%s - Put project header files here" % % click.style(project_dir, fg="cyan")
click.style("include", fg="cyan")) )
click.echo("%s - Put here project specific (private) libraries" % click.echo(
click.style("lib", fg="cyan")) "%s - Put project header files here" % click.style("include", fg="cyan")
click.echo("%s - Put project source files here" % )
click.style("src", fg="cyan")) click.echo(
click.echo("%s - Project Configuration File" % "%s - Put here project specific (private) libraries"
click.style("platformio.ini", fg="cyan")) % click.style("lib", fg="cyan")
)
click.echo("%s - Put project source files here" % click.style("src", fg="cyan"))
click.echo(
"%s - Project Configuration File" % click.style("platformio.ini", fg="cyan")
)
is_new_project = not is_platformio_project(project_dir) is_new_project = not is_platformio_project(project_dir)
if is_new_project: if is_new_project:
init_base_project(project_dir) init_base_project(project_dir)
if board: if board:
fill_project_envs(ctx, project_dir, board, project_option, env_prefix, fill_project_envs(
ide is not None) ctx, project_dir, board, project_option, env_prefix, ide is not None
)
if ide: if ide:
pg = ProjectGenerator(project_dir, ide, board) pg = ProjectGenerator(project_dir, ide, board)
@@ -115,9 +110,9 @@ def cli(
if ide: if ide:
click.secho( click.secho(
"\nProject has been successfully %s including configuration files " "\nProject has been successfully %s including configuration files "
"for `%s` IDE." % "for `%s` IDE." % ("initialized" if is_new_project else "updated", ide),
("initialized" if is_new_project else "updated", ide), fg="green",
fg="green") )
else: else:
click.secho( click.secho(
"\nProject has been successfully %s! Useful commands:\n" "\nProject has been successfully %s! Useful commands:\n"
@@ -125,19 +120,21 @@ def cli(
"`pio run --target upload` or `pio run -t upload` " "`pio run --target upload` or `pio run -t upload` "
"- upload firmware to a target\n" "- upload firmware to a target\n"
"`pio run --target clean` - clean project (remove compiled files)" "`pio run --target clean` - clean project (remove compiled files)"
"\n`pio run --help` - additional information" % "\n`pio run --help` - additional information"
("initialized" if is_new_project else "updated"), % ("initialized" if is_new_project else "updated"),
fg="green") fg="green",
)
def init_base_project(project_dir): def init_base_project(project_dir):
ProjectConfig(join(project_dir, "platformio.ini")).save()
with fs.cd(project_dir): with fs.cd(project_dir):
config = ProjectConfig()
config.save()
dir_to_readme = [ dir_to_readme = [
(get_project_src_dir(), None), (config.get_optional_dir("src"), None),
(get_project_include_dir(), init_include_readme), (config.get_optional_dir("include"), init_include_readme),
(get_project_lib_dir(), init_lib_readme), (config.get_optional_dir("lib"), init_lib_readme),
(get_project_test_dir(), init_test_readme), (config.get_optional_dir("test"), init_test_readme),
] ]
for (path, cb) in dir_to_readme: for (path, cb) in dir_to_readme:
if isdir(path): if isdir(path):
@@ -148,8 +145,9 @@ def init_base_project(project_dir):
def init_include_readme(include_dir): def init_include_readme(include_dir):
with open(join(include_dir, "README"), "w") as f: fs.write_file_contents(
f.write(""" join(include_dir, "README"),
"""
This directory is intended for project header files. This directory is intended for project header files.
A header file is a file containing C declarations and macro definitions A header file is a file containing C declarations and macro definitions
@@ -188,12 +186,15 @@ Read more about using header files in official GCC documentation:
* Computed Includes * Computed Includes
https://gcc.gnu.org/onlinedocs/cpp/Header-Files.html https://gcc.gnu.org/onlinedocs/cpp/Header-Files.html
""") """,
)
def init_lib_readme(lib_dir): def init_lib_readme(lib_dir):
with open(join(lib_dir, "README"), "w") as f: # pylint: disable=line-too-long
f.write(""" fs.write_file_contents(
join(lib_dir, "README"),
"""
This directory is intended for project specific (private) libraries. This directory is intended for project specific (private) libraries.
PlatformIO will compile them to static libraries and link into executable file. PlatformIO will compile them to static libraries and link into executable file.
@@ -239,12 +240,14 @@ libraries scanning project source files.
More information about PlatformIO Library Dependency Finder More information about PlatformIO Library Dependency Finder
- https://docs.platformio.org/page/librarymanager/ldf.html - https://docs.platformio.org/page/librarymanager/ldf.html
""") """,
)
def init_test_readme(test_dir): def init_test_readme(test_dir):
with open(join(test_dir, "README"), "w") as f: fs.write_file_contents(
f.write(""" join(test_dir, "README"),
"""
This directory is intended for PIO Unit Testing and project tests. This directory is intended for PIO Unit Testing and project tests.
Unit Testing is a software testing method by which individual units of Unit Testing is a software testing method by which individual units of
@@ -255,15 +258,17 @@ in the development cycle.
More information about PIO Unit Testing: More information about PIO Unit Testing:
- https://docs.platformio.org/page/plus/unit-testing.html - https://docs.platformio.org/page/plus/unit-testing.html
""") """,
)
def init_ci_conf(project_dir): def init_ci_conf(project_dir):
conf_path = join(project_dir, ".travis.yml") conf_path = join(project_dir, ".travis.yml")
if isfile(conf_path): if isfile(conf_path):
return return
with open(conf_path, "w") as f: fs.write_file_contents(
f.write("""# Continuous Integration (CI) is the practice, in software conf_path,
"""# Continuous Integration (CI) is the practice, in software
# engineering, of merging all developer working copies with a shared mainline # engineering, of merging all developer working copies with a shared mainline
# several times a day < https://docs.platformio.org/page/ci/index.html > # several times a day < https://docs.platformio.org/page/ci/index.html >
# #
@@ -330,27 +335,24 @@ def init_ci_conf(project_dir):
# #
# script: # script:
# - platformio ci --lib="." --board=ID_1 --board=ID_2 --board=ID_N # - platformio ci --lib="." --board=ID_1 --board=ID_2 --board=ID_N
""") """,
)
def init_cvs_ignore(project_dir): def init_cvs_ignore(project_dir):
conf_path = join(project_dir, ".gitignore") conf_path = join(project_dir, ".gitignore")
if isfile(conf_path): if isfile(conf_path):
return return
with open(conf_path, "w") as fp: fs.write_file_contents(conf_path, ".pio\n")
fp.write(".pio\n")
def fill_project_envs(ctx, project_dir, board_ids, project_option, env_prefix, def fill_project_envs(
force_download): ctx, project_dir, board_ids, project_option, env_prefix, force_download
config = ProjectConfig(join(project_dir, "platformio.ini"), ):
parse_extra=False) config = ProjectConfig(join(project_dir, "platformio.ini"), parse_extra=False)
used_boards = [] used_boards = []
for section in config.sections(): for section in config.sections():
cond = [ cond = [section.startswith("env:"), config.has_option(section, "board")]
section.startswith("env:"),
config.has_option(section, "board")
]
if all(cond): if all(cond):
used_boards.append(config.get(section, "board")) used_boards.append(config.get(section, "board"))
@@ -359,17 +361,17 @@ def fill_project_envs(ctx, project_dir, board_ids, project_option, env_prefix,
modified = False modified = False
for id_ in board_ids: for id_ in board_ids:
board_config = pm.board_config(id_) board_config = pm.board_config(id_)
used_platforms.append(board_config['platform']) used_platforms.append(board_config["platform"])
if id_ in used_boards: if id_ in used_boards:
continue continue
used_boards.append(id_) used_boards.append(id_)
modified = True modified = True
envopts = {"platform": board_config['platform'], "board": id_} envopts = {"platform": board_config["platform"], "board": id_}
# find default framework for board # find default framework for board
frameworks = board_config.get("frameworks") frameworks = board_config.get("frameworks")
if frameworks: if frameworks:
envopts['framework'] = frameworks[0] envopts["framework"] = frameworks[0]
for item in project_option: for item in project_option:
if "=" not in item: if "=" not in item:
@@ -391,10 +393,9 @@ def fill_project_envs(ctx, project_dir, board_ids, project_option, env_prefix,
def _install_dependent_platforms(ctx, platforms): def _install_dependent_platforms(ctx, platforms):
installed_platforms = [ installed_platforms = [p["name"] for p in PlatformManager().get_installed()]
p['name'] for p in PlatformManager().get_installed()
]
if set(platforms) <= set(installed_platforms): if set(platforms) <= set(installed_platforms):
return return
ctx.invoke(cli_platform_install, ctx.invoke(
platforms=list(set(platforms) - set(installed_platforms))) cli_platform_install, platforms=list(set(platforms) - set(installed_platforms))
)
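
init_base_project now resolves every folder through ProjectConfig.get_optional_dir and writes the README stubs via fs.write_file_contents. A simplified sketch of the same scaffolding using only the standard library, hardcoding the default folder names that the real command resolves from the project configuration (README texts are shortened):

    import os


    def scaffold_project(project_dir, dirs=("src", "include", "lib", "test")):
        # Create the standard folders and drop a short README into each one
        # that explains its purpose, skipping folders that already exist.
        notes = {
            "include": "This directory is intended for project header files.\n",
            "lib": "This directory is intended for project specific (private) libraries.\n",
            "test": "This directory is intended for PIO Unit Testing and project tests.\n",
        }
        for name in dirs:
            path = os.path.join(project_dir, name)
            if not os.path.isdir(path):
                os.makedirs(path)
            if name in notes:
                with open(os.path.join(path, "README"), "w") as fp:
                    fp.write(notes[name])


    scaffold_project(".")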


@@ -14,24 +14,22 @@
# pylint: disable=too-many-branches, too-many-locals # pylint: disable=too-many-branches, too-many-locals
import os
import time import time
from os.path import isdir, join
import click import click
import semantic_version import semantic_version
from tabulate import tabulate from tabulate import tabulate
from platformio import exception, fs, util from platformio import exception, util
from platformio.commands import PlatformioCLI from platformio.commands import PlatformioCLI
from platformio.compat import dump_json_to_unicode from platformio.compat import dump_json_to_unicode
from platformio.managers.lib import (LibraryManager, get_builtin_libs, from platformio.managers.lib import LibraryManager, get_builtin_libs, is_builtin_lib
is_builtin_lib) from platformio.package.manifest.parser import ManifestParserFactory
from platformio.package.manifest.schema import ManifestSchema, ManifestValidationError
from platformio.proc import is_ci from platformio.proc import is_ci
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
from platformio.project.helpers import (get_project_dir, from platformio.project.helpers import get_project_dir, is_platformio_project
get_project_global_lib_dir,
get_project_libdeps_dir,
is_platformio_project)
try: try:
from urllib.parse import quote from urllib.parse import quote
@@ -44,36 +42,43 @@ CTX_META_STORAGE_DIRS_KEY = __name__ + ".storage_dirs"
CTX_META_STORAGE_LIBDEPS_KEY = __name__ + ".storage_lib_deps" CTX_META_STORAGE_LIBDEPS_KEY = __name__ + ".storage_lib_deps"
def get_project_global_lib_dir():
return ProjectConfig.get_instance().get_optional_dir("globallib")
@click.group(short_help="Library Manager") @click.group(short_help="Library Manager")
@click.option("-d", @click.option(
"--storage-dir", "-d",
multiple=True, "--storage-dir",
default=None, multiple=True,
type=click.Path(exists=True, default=None,
file_okay=False, type=click.Path(
dir_okay=True, exists=True, file_okay=False, dir_okay=True, writable=True, resolve_path=True
writable=True, ),
resolve_path=True), help="Manage custom library storage",
help="Manage custom library storage") )
@click.option("-g", @click.option(
"--global", "-g", "--global", is_flag=True, help="Manage global PlatformIO library storage"
is_flag=True, )
help="Manage global PlatformIO library storage")
@click.option( @click.option(
"-e", "-e",
"--environment", "--environment",
multiple=True, multiple=True,
help=("Manage libraries for the specific project build environments " help=(
"declared in `platformio.ini`")) "Manage libraries for the specific project build environments "
"declared in `platformio.ini`"
),
)
@click.pass_context @click.pass_context
def cli(ctx, **options): def cli(ctx, **options):
storage_cmds = ("install", "uninstall", "update", "list") storage_cmds = ("install", "uninstall", "update", "list")
# skip commands that don't need storage folder # skip commands that don't need storage folder
if ctx.invoked_subcommand not in storage_cmds or \ if ctx.invoked_subcommand not in storage_cmds or (
(len(ctx.args) == 2 and ctx.args[1] in ("-h", "--help")): len(ctx.args) == 2 and ctx.args[1] in ("-h", "--help")
):
return return
storage_dirs = list(options['storage_dir']) storage_dirs = list(options["storage_dir"])
if options['global']: if options["global"]:
storage_dirs.append(get_project_global_lib_dir()) storage_dirs.append(get_project_global_lib_dir())
if not storage_dirs: if not storage_dirs:
if is_platformio_project(): if is_platformio_project():
@@ -84,15 +89,16 @@ def cli(ctx, **options):
"Warning! Global library storage is used automatically. " "Warning! Global library storage is used automatically. "
"Please use `platformio lib --global %s` command to remove " "Please use `platformio lib --global %s` command to remove "
"this warning." % ctx.invoked_subcommand, "this warning." % ctx.invoked_subcommand,
fg="yellow") fg="yellow",
)
if not storage_dirs: if not storage_dirs:
raise exception.NotGlobalLibDir(get_project_dir(), raise exception.NotGlobalLibDir(
get_project_global_lib_dir(), get_project_dir(), get_project_global_lib_dir(), ctx.invoked_subcommand
ctx.invoked_subcommand) )
in_silence = PlatformioCLI.in_silence() in_silence = PlatformioCLI.in_silence()
ctx.meta[CTX_META_PROJECT_ENVIRONMENTS_KEY] = options['environment'] ctx.meta[CTX_META_PROJECT_ENVIRONMENTS_KEY] = options["environment"]
ctx.meta[CTX_META_INPUT_DIRS_KEY] = storage_dirs ctx.meta[CTX_META_INPUT_DIRS_KEY] = storage_dirs
ctx.meta[CTX_META_STORAGE_DIRS_KEY] = [] ctx.meta[CTX_META_STORAGE_DIRS_KEY] = []
ctx.meta[CTX_META_STORAGE_LIBDEPS_KEY] = {} ctx.meta[CTX_META_STORAGE_LIBDEPS_KEY] = {}
@@ -100,18 +106,17 @@ def cli(ctx, **options):
if not is_platformio_project(storage_dir): if not is_platformio_project(storage_dir):
ctx.meta[CTX_META_STORAGE_DIRS_KEY].append(storage_dir) ctx.meta[CTX_META_STORAGE_DIRS_KEY].append(storage_dir)
continue continue
with fs.cd(storage_dir): config = ProjectConfig.get_instance(os.path.join(storage_dir, "platformio.ini"))
libdeps_dir = get_project_libdeps_dir() config.validate(options["environment"], silent=in_silence)
config = ProjectConfig.get_instance(join(storage_dir, libdeps_dir = config.get_optional_dir("libdeps")
"platformio.ini"))
config.validate(options['environment'], silent=in_silence)
for env in config.envs(): for env in config.envs():
if options['environment'] and env not in options['environment']: if options["environment"] and env not in options["environment"]:
continue continue
storage_dir = join(libdeps_dir, env) storage_dir = os.path.join(libdeps_dir, env)
ctx.meta[CTX_META_STORAGE_DIRS_KEY].append(storage_dir) ctx.meta[CTX_META_STORAGE_DIRS_KEY].append(storage_dir)
ctx.meta[CTX_META_STORAGE_LIBDEPS_KEY][storage_dir] = config.get( ctx.meta[CTX_META_STORAGE_LIBDEPS_KEY][storage_dir] = config.get(
"env:" + env, "lib_deps", []) "env:" + env, "lib_deps", []
)
@cli.command("install", short_help="Install library") @cli.command("install", short_help="Install library")
@@ -119,21 +124,19 @@ def cli(ctx, **options):
@click.option( @click.option(
"--save", "--save",
is_flag=True, is_flag=True,
help="Save installed libraries into the `platformio.ini` dependency list") help="Save installed libraries into the `platformio.ini` dependency list",
@click.option("-s", )
"--silent", @click.option("-s", "--silent", is_flag=True, help="Suppress progress reporting")
is_flag=True, @click.option(
help="Suppress progress reporting") "--interactive", is_flag=True, help="Allow to make a choice for all prompts"
@click.option("--interactive", )
is_flag=True, @click.option(
help="Allow to make a choice for all prompts") "-f", "--force", is_flag=True, help="Reinstall/redownload library if exists"
@click.option("-f", )
"--force",
is_flag=True,
help="Reinstall/redownload library if exists")
@click.pass_context @click.pass_context
def lib_install( # pylint: disable=too-many-arguments def lib_install( # pylint: disable=too-many-arguments
ctx, libraries, save, silent, interactive, force): ctx, libraries, save, silent, interactive, force
):
storage_dirs = ctx.meta[CTX_META_STORAGE_DIRS_KEY] storage_dirs = ctx.meta[CTX_META_STORAGE_DIRS_KEY]
storage_libdeps = ctx.meta.get(CTX_META_STORAGE_LIBDEPS_KEY, []) storage_libdeps = ctx.meta.get(CTX_META_STORAGE_LIBDEPS_KEY, [])
@@ -144,25 +147,22 @@ def lib_install( # pylint: disable=too-many-arguments
lm = LibraryManager(storage_dir) lm = LibraryManager(storage_dir)
if libraries: if libraries:
for library in libraries: for library in libraries:
pkg_dir = lm.install(library, pkg_dir = lm.install(
silent=silent, library, silent=silent, interactive=interactive, force=force
interactive=interactive, )
force=force)
installed_manifests[library] = lm.load_manifest(pkg_dir) installed_manifests[library] = lm.load_manifest(pkg_dir)
elif storage_dir in storage_libdeps: elif storage_dir in storage_libdeps:
builtin_lib_storages = None builtin_lib_storages = None
for library in storage_libdeps[storage_dir]: for library in storage_libdeps[storage_dir]:
try: try:
pkg_dir = lm.install(library, pkg_dir = lm.install(
silent=silent, library, silent=silent, interactive=interactive, force=force
interactive=interactive, )
force=force)
installed_manifests[library] = lm.load_manifest(pkg_dir) installed_manifests[library] = lm.load_manifest(pkg_dir)
except exception.LibNotFound as e: except exception.LibNotFound as e:
if builtin_lib_storages is None: if builtin_lib_storages is None:
builtin_lib_storages = get_builtin_libs() builtin_lib_storages = get_builtin_libs()
if not silent or not is_builtin_lib( if not silent or not is_builtin_lib(builtin_lib_storages, library):
builtin_lib_storages, library):
click.secho("Warning! %s" % e, fg="yellow") click.secho("Warning! %s" % e, fg="yellow")
if not save or not libraries: if not save or not libraries:
@@ -171,7 +171,7 @@ def lib_install( # pylint: disable=too-many-arguments
input_dirs = ctx.meta.get(CTX_META_INPUT_DIRS_KEY, []) input_dirs = ctx.meta.get(CTX_META_INPUT_DIRS_KEY, [])
project_environments = ctx.meta[CTX_META_PROJECT_ENVIRONMENTS_KEY] project_environments = ctx.meta[CTX_META_PROJECT_ENVIRONMENTS_KEY]
for input_dir in input_dirs: for input_dir in input_dirs:
config = ProjectConfig.get_instance(join(input_dir, "platformio.ini")) config = ProjectConfig.get_instance(os.path.join(input_dir, "platformio.ini"))
config.validate(project_environments) config.validate(project_environments)
for env in config.envs(): for env in config.envs():
if project_environments and env not in project_environments: if project_environments and env not in project_environments:
@@ -183,8 +183,8 @@ def lib_install( # pylint: disable=too-many-arguments
continue continue
manifest = installed_manifests[library] manifest = installed_manifests[library]
try: try:
assert library.lower() == manifest['name'].lower() assert library.lower() == manifest["name"].lower()
assert semantic_version.Version(manifest['version']) assert semantic_version.Version(manifest["version"])
lib_deps.append("{name}@^{version}".format(**manifest)) lib_deps.append("{name}@^{version}".format(**manifest))
except (AssertionError, ValueError): except (AssertionError, ValueError):
lib_deps.append(library) lib_deps.append(library)
@@ -206,13 +206,15 @@ def lib_uninstall(ctx, libraries):
@cli.command("update", short_help="Update installed libraries") @cli.command("update", short_help="Update installed libraries")
@click.argument("libraries", required=False, nargs=-1, metavar="[LIBRARY...]") @click.argument("libraries", required=False, nargs=-1, metavar="[LIBRARY...]")
@click.option("-c", @click.option(
"--only-check", "-c",
is_flag=True, "--only-check",
help="DEPRECATED. Please use `--dry-run` instead") is_flag=True,
@click.option("--dry-run", help="DEPRECATED. Please use `--dry-run` instead",
is_flag=True, )
help="Do not update, only check for the new versions") @click.option(
"--dry-run", is_flag=True, help="Do not update, only check for the new versions"
)
@click.option("--json-output", is_flag=True) @click.option("--json-output", is_flag=True)
@click.pass_context @click.pass_context
def lib_update(ctx, libraries, only_check, dry_run, json_output): def lib_update(ctx, libraries, only_check, dry_run, json_output):
@@ -226,14 +228,12 @@ def lib_update(ctx, libraries, only_check, dry_run, json_output):
_libraries = libraries _libraries = libraries
if not _libraries: if not _libraries:
_libraries = [ _libraries = [manifest["__pkg_dir"] for manifest in lm.get_installed()]
manifest['__pkg_dir'] for manifest in lm.get_installed()
]
if only_check and json_output: if only_check and json_output:
result = [] result = []
for library in _libraries: for library in _libraries:
pkg_dir = library if isdir(library) else None pkg_dir = library if os.path.isdir(library) else None
requirements = None requirements = None
url = None url = None
if not pkg_dir: if not pkg_dir:
@@ -245,7 +245,7 @@ def lib_update(ctx, libraries, only_check, dry_run, json_output):
if not latest: if not latest:
continue continue
manifest = lm.load_manifest(pkg_dir) manifest = lm.load_manifest(pkg_dir)
manifest['versionLatest'] = latest manifest["versionLatest"] = latest
result.append(manifest) result.append(manifest)
json_result[storage_dir] = result json_result[storage_dir] = result
else: else:
@@ -254,8 +254,10 @@ def lib_update(ctx, libraries, only_check, dry_run, json_output):
if json_output: if json_output:
return click.echo( return click.echo(
dump_json_to_unicode(json_result[storage_dirs[0]] dump_json_to_unicode(
if len(storage_dirs) == 1 else json_result)) json_result[storage_dirs[0]] if len(storage_dirs) == 1 else json_result
)
)
return True return True
@@ -274,15 +276,17 @@ def lib_list(ctx, json_output):
if json_output: if json_output:
json_result[storage_dir] = items json_result[storage_dir] = items
elif items: elif items:
for item in sorted(items, key=lambda i: i['name']): for item in sorted(items, key=lambda i: i["name"]):
print_lib_item(item) print_lib_item(item)
else: else:
click.echo("No items found") click.echo("No items found")
if json_output: if json_output:
return click.echo( return click.echo(
dump_json_to_unicode(json_result[storage_dirs[0]] dump_json_to_unicode(
if len(storage_dirs) == 1 else json_result)) json_result[storage_dirs[0]] if len(storage_dirs) == 1 else json_result
)
)
return True return True
@@ -298,9 +302,11 @@ def lib_list(ctx, json_output):
@click.option("-f", "--framework", multiple=True) @click.option("-f", "--framework", multiple=True)
@click.option("-p", "--platform", multiple=True) @click.option("-p", "--platform", multiple=True)
@click.option("-i", "--header", multiple=True) @click.option("-i", "--header", multiple=True)
@click.option("--noninteractive", @click.option(
is_flag=True, "--noninteractive",
help="Do not prompt, automatically paginate with delay") is_flag=True,
help="Do not prompt, automatically paginate with delay",
)
def lib_search(query, json_output, page, noninteractive, **filters): def lib_search(query, json_output, page, noninteractive, **filters):
if not query: if not query:
query = [] query = []
@@ -311,55 +317,61 @@ def lib_search(query, json_output, page, noninteractive, **filters):
for value in values: for value in values:
query.append('%s:"%s"' % (key, value)) query.append('%s:"%s"' % (key, value))
result = util.get_api_result("/v2/lib/search", result = util.get_api_result(
dict(query=" ".join(query), page=page), "/v2/lib/search", dict(query=" ".join(query), page=page), cache_valid="1d"
cache_valid="1d") )
if json_output: if json_output:
click.echo(dump_json_to_unicode(result)) click.echo(dump_json_to_unicode(result))
return return
if result['total'] == 0: if result["total"] == 0:
click.secho( click.secho(
"Nothing has been found by your request\n" "Nothing has been found by your request\n"
"Try a less-specific search or use truncation (or wildcard) " "Try a less-specific search or use truncation (or wildcard) "
"operator", "operator",
fg="yellow", fg="yellow",
nl=False) nl=False,
)
click.secho(" *", fg="green") click.secho(" *", fg="green")
click.secho("For example: DS*, PCA*, DHT* and etc.\n", fg="yellow") click.secho("For example: DS*, PCA*, DHT* and etc.\n", fg="yellow")
click.echo("For more examples and advanced search syntax, " click.echo(
"please use documentation:") "For more examples and advanced search syntax, please use documentation:"
)
click.secho( click.secho(
"https://docs.platformio.org/page/userguide/lib/cmd_search.html\n", "https://docs.platformio.org/page/userguide/lib/cmd_search.html\n",
fg="cyan") fg="cyan",
)
return return
click.secho("Found %d libraries:\n" % result['total'], click.secho(
fg="green" if result['total'] else "yellow") "Found %d libraries:\n" % result["total"],
fg="green" if result["total"] else "yellow",
)
while True: while True:
for item in result['items']: for item in result["items"]:
print_lib_item(item) print_lib_item(item)
if (int(result['page']) * int(result['perpage']) >= int( if int(result["page"]) * int(result["perpage"]) >= int(result["total"]):
result['total'])):
break break
if noninteractive: if noninteractive:
click.echo() click.echo()
click.secho("Loading next %d libraries... Press Ctrl+C to stop!" % click.secho(
result['perpage'], "Loading next %d libraries... Press Ctrl+C to stop!"
fg="yellow") % result["perpage"],
fg="yellow",
)
click.echo() click.echo()
time.sleep(5) time.sleep(5)
elif not click.confirm("Show next libraries?"): elif not click.confirm("Show next libraries?"):
break break
result = util.get_api_result("/v2/lib/search", { result = util.get_api_result(
"query": " ".join(query), "/v2/lib/search",
"page": int(result['page']) + 1 {"query": " ".join(query), "page": int(result["page"]) + 1},
}, cache_valid="1d",
cache_valid="1d") )
@cli.command("builtin", short_help="List built-in libraries") @cli.command("builtin", short_help="List built-in libraries")
@@ -371,13 +383,13 @@ def lib_builtin(storage, json_output):
return click.echo(dump_json_to_unicode(items)) return click.echo(dump_json_to_unicode(items))
for storage_ in items: for storage_ in items:
if not storage_['items']: if not storage_["items"]:
continue continue
click.secho(storage_['name'], fg="green") click.secho(storage_["name"], fg="green")
click.echo("*" * len(storage_['name'])) click.echo("*" * len(storage_["name"]))
click.echo() click.echo()
for item in sorted(storage_['items'], key=lambda i: i['name']): for item in sorted(storage_["items"], key=lambda i: i["name"]):
print_lib_item(item) print_lib_item(item)
return True return True
@@ -389,27 +401,29 @@ def lib_builtin(storage, json_output):
def lib_show(library, json_output): def lib_show(library, json_output):
lm = LibraryManager() lm = LibraryManager()
name, requirements, _ = lm.parse_pkg_uri(library) name, requirements, _ = lm.parse_pkg_uri(library)
lib_id = lm.search_lib_id({ lib_id = lm.search_lib_id(
"name": name, {"name": name, "requirements": requirements},
"requirements": requirements silent=json_output,
}, interactive=not json_output,
silent=json_output, )
interactive=not json_output)
lib = util.get_api_result("/lib/info/%d" % lib_id, cache_valid="1d") lib = util.get_api_result("/lib/info/%d" % lib_id, cache_valid="1d")
if json_output: if json_output:
return click.echo(dump_json_to_unicode(lib)) return click.echo(dump_json_to_unicode(lib))
click.secho(lib['name'], fg="cyan") click.secho(lib["name"], fg="cyan")
click.echo("=" * len(lib['name'])) click.echo("=" * len(lib["name"]))
click.secho("#ID: %d" % lib['id'], bold=True) click.secho("#ID: %d" % lib["id"], bold=True)
click.echo(lib['description']) click.echo(lib["description"])
click.echo() click.echo()
click.echo( click.echo(
"Version: %s, released %s" % "Version: %s, released %s"
(lib['version']['name'], % (
time.strftime("%c", util.parse_date(lib['version']['released'])))) lib["version"]["name"],
click.echo("Manifest: %s" % lib['confurl']) time.strftime("%c", util.parse_date(lib["version"]["released"])),
)
)
click.echo("Manifest: %s" % lib["confurl"])
for key in ("homepage", "repository", "license"): for key in ("homepage", "repository", "license"):
if key not in lib or not lib[key]: if key not in lib or not lib[key]:
continue continue
@@ -436,23 +450,33 @@ def lib_show(library, json_output):
if _authors: if _authors:
blocks.append(("Authors", _authors)) blocks.append(("Authors", _authors))
blocks.append(("Keywords", lib['keywords'])) blocks.append(("Keywords", lib["keywords"]))
for key in ("frameworks", "platforms"): for key in ("frameworks", "platforms"):
if key not in lib or not lib[key]: if key not in lib or not lib[key]:
continue continue
blocks.append(("Compatible %s" % key, [i['title'] for i in lib[key]])) blocks.append(("Compatible %s" % key, [i["title"] for i in lib[key]]))
blocks.append(("Headers", lib['headers'])) blocks.append(("Headers", lib["headers"]))
blocks.append(("Examples", lib['examples'])) blocks.append(("Examples", lib["examples"]))
    blocks.append(
        (
            "Versions",
            [
                "%s, released %s"
                % (v["name"], time.strftime("%c", util.parse_date(v["released"])))
                for v in lib["versions"]
            ],
        )
    )
    blocks.append(
        (
            "Unique Downloads",
            [
                "Today: %s" % lib["dlstats"]["day"],
                "Week: %s" % lib["dlstats"]["week"],
                "Month: %s" % lib["dlstats"]["month"],
            ],
        )
    )

    for (title, rows) in blocks:
        click.echo()

@@ -467,16 +491,22 @@ def lib_show(library, json_output):

@cli.command("register", short_help="Register a new library")
@click.argument("config_url")
def lib_register(config_url):
    if not config_url.startswith("http://") and not config_url.startswith("https://"):
        raise exception.InvalidLibConfURL(config_url)

    # Validate manifest
    data, error = ManifestSchema(strict=False).load(
        ManifestParserFactory.new_from_url(config_url).as_dict()
    )
    if error:
        raise ManifestValidationError(error, data)

    result = util.get_api_result("/lib/register", data=dict(config_url=config_url))
    if "message" in result and result["message"]:
        click.secho(
            result["message"],
            fg="green" if "successed" in result and result["successed"] else "red",
        )


@cli.command("stats", short_help="Library Registry Statistics")

@@ -488,46 +518,56 @@ def lib_stats(json_output):
        return click.echo(dump_json_to_unicode(result))

    for key in ("updated", "added"):
        tabular_data = [
            (
                click.style(item["name"], fg="cyan"),
                time.strftime("%c", util.parse_date(item["date"])),
                "https://platformio.org/lib/show/%s/%s"
                % (item["id"], quote(item["name"])),
            )
            for item in result.get(key, [])
        ]
        table = tabulate(
            tabular_data,
            headers=[click.style("RECENTLY " + key.upper(), bold=True), "Date", "URL"],
        )
        click.echo(table)
        click.echo()

    for key in ("lastkeywords", "topkeywords"):
        tabular_data = [
            (
                click.style(name, fg="cyan"),
                "https://platformio.org/lib/search?query=" + quote("keyword:%s" % name),
            )
            for name in result.get(key, [])
        ]
        table = tabulate(
            tabular_data,
            headers=[
                click.style(
                    ("RECENT" if key == "lastkeywords" else "POPULAR") + " KEYWORDS",
                    bold=True,
                ),
                "URL",
            ],
        )
        click.echo(table)
        click.echo()

    for key, title in (("dlday", "Today"), ("dlweek", "Week"), ("dlmonth", "Month")):
        tabular_data = [
            (
                click.style(item["name"], fg="cyan"),
                "https://platformio.org/lib/show/%s/%s"
                % (item["id"], quote(item["name"])),
            )
            for item in result.get(key, [])
        ]
        table = tabulate(
            tabular_data,
            headers=[click.style("FEATURED: " + title.upper(), bold=True), "URL"],
        )
        click.echo(table)
        click.echo()

@@ -538,15 +578,16 @@ def print_storage_header(storage_dirs, storage_dir):
    if storage_dirs and storage_dirs[0] != storage_dir:
        click.echo("")
    click.echo(
        click.style("Library Storage: ", bold=True)
        + click.style(storage_dir, fg="blue")
    )


def print_lib_item(item):
    click.secho(item["name"], fg="cyan")
    click.echo("=" * len(item["name"]))
    if "id" in item:
        click.secho("#ID: %d" % item["id"], bold=True)
    if "description" in item or "url" in item:
        click.echo(item.get("description", item.get("url", "")))
    click.echo()

@@ -562,14 +603,26 @@ def print_lib_item(item):
    for key in ("frameworks", "platforms"):
        if key not in item:
            continue
        click.echo(
            "Compatible %s: %s"
            % (
                key,
                ", ".join(
                    [i["title"] if isinstance(i, dict) else i for i in item[key]]
                ),
            )
        )

    if "authors" in item or "authornames" in item:
        click.echo(
            "Authors: %s"
            % ", ".join(
                item.get(
                    "authornames", [a.get("name", "") for a in item.get("authors", [])]
                )
            )
        )

    if "__src_url" in item:
        click.secho("Source: %s" % item["__src_url"])
    click.echo()
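For reference, a minimal sketch of the manifest check that the new lib_register flow above performs before calling the registry API. The import locations are an assumption (the corresponding import hunk is not part of this excerpt); the parser/schema calls themselves are the ones used above.

# Assumed import locations for the manifest helpers used by lib_register above.
from platformio.package.manifest.parser import ManifestParserFactory
from platformio.package.manifest.schema import ManifestSchema


def validate_library_manifest(config_url):
    # Fetch and parse the remote manifest, then run it through the schema;
    # marshmallow 2.x returns a (data, errors) pair from load().
    data, error = ManifestSchema(strict=False).load(
        ManifestParserFactory.new_from_url(config_url).as_dict()
    )
    if error:
        raise ValueError("Invalid library manifest: %s" % error)
    return data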

View File

@@ -29,24 +29,27 @@ def cli():
def _print_platforms(platforms):
    for platform in platforms:
        click.echo(
            "{name} ~ {title}".format(
                name=click.style(platform["name"], fg="cyan"), title=platform["title"]
            )
        )
        click.echo("=" * (3 + len(platform["name"] + platform["title"])))
        click.echo(platform["description"])
        click.echo()
        if "homepage" in platform:
            click.echo("Home: %s" % platform["homepage"])
        if "frameworks" in platform and platform["frameworks"]:
            click.echo("Frameworks: %s" % ", ".join(platform["frameworks"]))
        if "packages" in platform:
            click.echo("Packages: %s" % ", ".join(platform["packages"]))
        if "version" in platform:
            if "__src_url" in platform:
                click.echo(
                    "Version: #%s (%s)" % (platform["version"], platform["__src_url"])
                )
            else:
                click.echo("Version: " + platform["version"])
        click.echo()

@@ -54,7 +57,7 @@ def _get_registry_platforms():
    platforms = util.get_api_result("/platforms", cache_valid="7d")
    pm = PlatformManager()
    for platform in platforms or []:
        platform["versions"] = pm.get_all_repo_versions(platform["name"])
    return platforms

@@ -65,22 +68,22 @@ def _get_platform_data(*args, **kwargs):
    return _get_registry_platform_data(*args, **kwargs)


def _get_installed_platform_data(platform, with_boards=True, expose_packages=True):
    p = PlatformFactory.newPlatform(platform)
    data = dict(
        name=p.name,
        title=p.title,
        description=p.description,
        version=p.version,
        homepage=p.homepage,
        repository=p.repository_url,
        url=p.vendor_url,
        docs=p.docs_url,
        license=p.license,
        forDesktop=not p.is_embedded(),
        frameworks=sorted(list(p.frameworks) if p.frameworks else []),
        packages=list(p.packages) if p.packages else [],
    )

    # if dump to API
    # del data['version']

@@ -94,18 +97,20 @@ def _get_installed_platform_data(platform,
            data[key] = manifest[key]

    if with_boards:
        data["boards"] = [c.get_brief_data() for c in p.get_boards().values()]

    if not data["packages"] or not expose_packages:
        return data

    data["packages"] = []
    installed_pkgs = p.get_installed_packages()
    for name, opts in p.packages.items():
        item = dict(
            name=name,
            type=p.get_package_type(name),
            requirements=opts.get("version"),
            optional=opts.get("optional") is True,
        )
        if name in installed_pkgs:
            for key, value in installed_pkgs[name].items():
                if key not in ("url", "version", "description"):

@@ -113,40 +118,42 @@ def _get_installed_platform_data(platform,
                    continue
                item[key] = value
                if key == "version":
                    item["originalVersion"] = util.get_original_version(value)
        data["packages"].append(item)

    return data


def _get_registry_platform_data(  # pylint: disable=unused-argument
    platform, with_boards=True, expose_packages=True
):
    _data = None
    for p in _get_registry_platforms():
        if p["name"] == platform:
            _data = p
            break

    if not _data:
        return None

    data = dict(
        name=_data["name"],
        title=_data["title"],
        description=_data["description"],
        homepage=_data["homepage"],
        repository=_data["repository"],
        url=_data["url"],
        license=_data["license"],
        forDesktop=_data["forDesktop"],
        frameworks=_data["frameworks"],
        packages=_data["packages"],
        versions=_data["versions"],
    )

    if with_boards:
        data["boards"] = [
            board
            for board in PlatformManager().get_registered_boards()
            if board["platform"] == _data["name"]
        ]

    return data

@@ -164,9 +171,10 @@ def platform_search(query, json_output):
        if query and query.lower() not in search_data.lower():
            continue
        platforms.append(
            _get_registry_platform_data(
                platform["name"], with_boards=False, expose_packages=False
            )
        )

    if json_output:
        click.echo(dump_json_to_unicode(platforms))

@@ -185,15 +193,15 @@ def platform_frameworks(query, json_output):
        search_data = dump_json_to_unicode(framework)
        if query and query.lower() not in search_data.lower():
            continue
        framework["homepage"] = "https://platformio.org/frameworks/" + framework["name"]
        framework["platforms"] = [
            platform["name"]
            for platform in _get_registry_platforms()
            if framework["name"] in platform["frameworks"]
        ]
        frameworks.append(framework)

    frameworks = sorted(frameworks, key=lambda manifest: manifest["name"])
    if json_output:
        click.echo(dump_json_to_unicode(frameworks))
    else:

@@ -207,11 +215,12 @@ def platform_list(json_output):
    pm = PlatformManager()
    for manifest in pm.get_installed():
        platforms.append(
            _get_installed_platform_data(
                manifest["__pkg_dir"], with_boards=False, expose_packages=False
            )
        )

    platforms = sorted(platforms, key=lambda manifest: manifest["name"])
    if json_output:
        click.echo(dump_json_to_unicode(platforms))
    else:

@@ -228,55 +237,58 @@ def platform_show(platform, json_output):  # pylint: disable=too-many-branches
    if json_output:
        return click.echo(dump_json_to_unicode(data))

    click.echo(
        "{name} ~ {title}".format(
            name=click.style(data["name"], fg="cyan"), title=data["title"]
        )
    )
    click.echo("=" * (3 + len(data["name"] + data["title"])))
    click.echo(data["description"])
    click.echo()
    if "version" in data:
        click.echo("Version: %s" % data["version"])
    if data["homepage"]:
        click.echo("Home: %s" % data["homepage"])
    if data["repository"]:
        click.echo("Repository: %s" % data["repository"])
    if data["url"]:
        click.echo("Vendor: %s" % data["url"])
    if data["license"]:
        click.echo("License: %s" % data["license"])
    if data["frameworks"]:
        click.echo("Frameworks: %s" % ", ".join(data["frameworks"]))

    if not data["packages"]:
        return None

    if not isinstance(data["packages"][0], dict):
        click.echo("Packages: %s" % ", ".join(data["packages"]))
    else:
        click.echo()
        click.secho("Packages", bold=True)
        click.echo("--------")
        for item in data["packages"]:
            click.echo()
            click.echo("Package %s" % click.style(item["name"], fg="yellow"))
            click.echo("-" * (8 + len(item["name"])))
            if item["type"]:
                click.echo("Type: %s" % item["type"])
            click.echo("Requirements: %s" % item["requirements"])
            click.echo(
                "Installed: %s" % ("Yes" if item.get("version") else "No (optional)")
            )
            if "version" in item:
                click.echo("Version: %s" % item["version"])
            if "originalVersion" in item:
                click.echo("Original version: %s" % item["originalVersion"])
            if "description" in item:
                click.echo("Description: %s" % item["description"])

    if data["boards"]:
        click.echo()
        click.secho("Boards", bold=True)
        click.echo("------")
        print_boards(data["boards"])

    return True

@@ -290,20 +302,26 @@ def platform_show(platform, json_output):  # pylint: disable=too-many-branches
    "-f",
    "--force",
    is_flag=True,
    help="Reinstall/redownload dev/platform and its packages if exist",
)
def platform_install(
    platforms, with_package, without_package, skip_default_package, force
):
    pm = PlatformManager()
    for platform in platforms:
        if pm.install(
            name=platform,
            with_packages=with_package,
            without_packages=without_package,
            skip_default_package=skip_default_package,
            force=force,
        ):
            click.secho(
                "The platform '%s' has been successfully installed!\n"
                "The rest of packages will be installed automatically "
                "depending on your build environment." % platform,
                fg="green",
            )


@cli.command("uninstall", short_help="Uninstall development platform")

@@ -312,35 +330,39 @@ def platform_uninstall(platforms):
    pm = PlatformManager()
    for platform in platforms:
        if pm.uninstall(platform):
            click.secho(
                "The platform '%s' has been successfully uninstalled!" % platform,
                fg="green",
            )


@cli.command("update", short_help="Update installed development platforms")
@click.argument("platforms", nargs=-1, required=False, metavar="[PLATFORM...]")
@click.option(
    "-p", "--only-packages", is_flag=True, help="Update only the platform packages"
)
@click.option(
    "-c",
    "--only-check",
    is_flag=True,
    help="DEPRECATED. Please use `--dry-run` instead",
)
@click.option(
    "--dry-run", is_flag=True, help="Do not update, only check for the new versions"
)
@click.option("--json-output", is_flag=True)
def platform_update(  # pylint: disable=too-many-locals
    platforms, only_packages, only_check, dry_run, json_output
):
    pm = PlatformManager()
    pkg_dir_to_name = {}
    if not platforms:
        platforms = []
        for manifest in pm.get_installed():
            platforms.append(manifest["__pkg_dir"])
            pkg_dir_to_name[manifest["__pkg_dir"]] = manifest.get(
                "title", manifest["name"]
            )

    only_check = dry_run or only_check

@@ -356,14 +378,16 @@ def platform_update( # pylint: disable=too-many-locals
            if not pkg_dir:
                continue
            latest = pm.outdated(pkg_dir, requirements)
            if (
                not latest
                and not PlatformFactory.newPlatform(pkg_dir).are_outdated_packages()
            ):
                continue
            data = _get_installed_platform_data(
                pkg_dir, with_boards=False, expose_packages=False
            )
            if latest:
                data["versionLatest"] = latest
            result.append(data)
        return click.echo(dump_json_to_unicode(result))

@@ -371,8 +395,9 @@ def platform_update( # pylint: disable=too-many-locals
    app.clean_cache()
    for platform in platforms:
        click.echo(
            "Platform %s"
            % click.style(pkg_dir_to_name.get(platform, platform), fg="cyan")
        )
        click.echo("--------")
        pm.update(platform, only_packages=only_packages, only_check=only_check)
        click.echo()
View File

@@ -23,7 +23,6 @@ import click
from platformio import exception, fs
from platformio.commands.device import device_monitor as cmd_device_monitor
from platformio.managers.core import pioplus_call

# pylint: disable=unused-argument

@@ -43,13 +42,12 @@ def remote_agent():
@remote_agent.command("start", short_help="Start agent")
@click.option("-n", "--name")
@click.option("-s", "--share", multiple=True, metavar="E-MAIL")
@click.option(
    "-d",
    "--working-dir",
    envvar="PLATFORMIO_REMOTE_AGENT_DIR",
    type=click.Path(file_okay=False, dir_okay=True, writable=True, resolve_path=True),
)
def remote_agent_start(**kwargs):
    pioplus_call(sys.argv[1:])

@@ -64,15 +62,16 @@ def remote_agent_list():
    pioplus_call(sys.argv[1:])


@cli.command("update", short_help="Update installed Platforms, Packages and Libraries")
@click.option(
    "-c",
    "--only-check",
    is_flag=True,
    help="DEPRECATED. Please use `--dry-run` instead",
)
@click.option(
    "--dry-run", is_flag=True, help="Do not update, only check for the new versions"
)
def remote_update(only_check, dry_run):
    pioplus_call(sys.argv[1:])

@@ -81,14 +80,14 @@ def remote_update(only_check, dry_run):
@click.option("-e", "--environment", multiple=True)
@click.option("-t", "--target", multiple=True)
@click.option("--upload-port")
@click.option(
    "-d",
    "--project-dir",
    default=getcwd,
    type=click.Path(
        exists=True, file_okay=True, dir_okay=True, writable=True, resolve_path=True
    ),
)
@click.option("--disable-auto-clean", is_flag=True)
@click.option("-r", "--force-remote", is_flag=True)
@click.option("-s", "--silent", is_flag=True)

@@ -102,14 +101,14 @@ def remote_run(**kwargs):
@click.option("--ignore", "-i", multiple=True, metavar="<pattern>")
@click.option("--upload-port")
@click.option("--test-port")
@click.option(
    "-d",
    "--project-dir",
    default=getcwd,
    type=click.Path(
        exists=True, file_okay=False, dir_okay=True, writable=True, resolve_path=True
    ),
)
@click.option("-r", "--force-remote", is_flag=True)
@click.option("--without-building", is_flag=True)
@click.option("--without-uploading", is_flag=True)

@@ -131,58 +130,61 @@ def device_list(json_output):
@remote_device.command("monitor", short_help="Monitor remote device")
@click.option("--port", "-p", help="Port, a number or a device name")
@click.option(
    "--baud", "-b", type=int, default=9600, help="Set baud rate, default=9600"
)
@click.option(
    "--parity",
    default="N",
    type=click.Choice(["N", "E", "O", "S", "M"]),
    help="Set parity, default=N",
)
@click.option("--rtscts", is_flag=True, help="Enable RTS/CTS flow control, default=Off")
@click.option(
    "--xonxoff", is_flag=True, help="Enable software flow control, default=Off"
)
@click.option(
    "--rts", default=None, type=click.IntRange(0, 1), help="Set initial RTS line state"
)
@click.option(
    "--dtr", default=None, type=click.IntRange(0, 1), help="Set initial DTR line state"
)
@click.option("--echo", is_flag=True, help="Enable local echo, default=Off")
@click.option(
    "--encoding",
    default="UTF-8",
    help="Set the encoding for the serial port (e.g. hexlify, "
    "Latin1, UTF-8), default: UTF-8",
)
@click.option("--filter", "-f", multiple=True, help="Add text transformation")
@click.option(
    "--eol",
    default="CRLF",
    type=click.Choice(["CR", "LF", "CRLF"]),
    help="End of line mode, default=CRLF",
)
@click.option("--raw", is_flag=True, help="Do not apply any encodings/transformations")
@click.option(
    "--exit-char",
    type=int,
    default=3,
    help="ASCII code of special character that is used to exit "
    "the application, default=3 (Ctrl+C)",
)
@click.option(
    "--menu-char",
    type=int,
    default=20,
    help="ASCII code of special character that is used to "
    "control miniterm (menu), default=20 (DEC)",
)
@click.option(
    "--quiet",
    is_flag=True,
    help="Diagnostics: suppress non-error messages, default=Off",
)
@click.pass_context
def device_monitor(ctx, **kwargs):
    def _tx_target(sock_dir):
        try:
            pioplus_call(sys.argv[1:] + ["--sock", sock_dir])

@@ -192,13 +194,13 @@ def device_monitor(ctx, **kwargs):
    sock_dir = mkdtemp(suffix="pioplus")
    sock_file = join(sock_dir, "sock")
    try:
        t = threading.Thread(target=_tx_target, args=(sock_dir,))
        t.start()
        while t.is_alive() and not isfile(sock_file):
            sleep(0.1)
        if not t.is_alive():
            return
        kwargs["port"] = fs.get_file_contents(sock_file)
        ctx.invoke(cmd_device_monitor, **kwargs)
        t.join(2)
    finally:
View File

@@ -11,5 +11,3 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio.commands.run.command import cli
View File

@@ -14,21 +14,19 @@
from multiprocessing import cpu_count
from os import getcwd
from os.path import isfile
from time import time

import click
from tabulate import tabulate

from platformio import app, exception, fs, util
from platformio.commands.device import device_monitor as cmd_device_monitor
from platformio.commands.run.helpers import clean_build_dir, handle_legacy_libdeps
from platformio.commands.run.processor import EnvironmentProcessor
from platformio.commands.test.processor import CTX_META_TEST_IS_RUNNING
from platformio.project.config import ProjectConfig
from platformio.project.helpers import find_project_dir_above

# pylint: disable=too-many-arguments,too-many-locals,too-many-branches

@@ -42,34 +40,49 @@ except NotImplementedError:
@click.option("-e", "--environment", multiple=True)
@click.option("-t", "--target", multiple=True)
@click.option("--upload-port")
@click.option(
    "-d",
    "--project-dir",
    default=getcwd,
    type=click.Path(
        exists=True, file_okay=True, dir_okay=True, writable=True, resolve_path=True
    ),
)
@click.option(
    "-c",
    "--project-conf",
    type=click.Path(
        exists=True, file_okay=True, dir_okay=False, readable=True, resolve_path=True
    ),
)
@click.option(
    "-j",
    "--jobs",
    type=int,
    default=DEFAULT_JOB_NUMS,
    help=(
        "Allow N jobs at once. "
        "Default is a number of CPUs in a system (N=%d)" % DEFAULT_JOB_NUMS
    ),
)
@click.option("-s", "--silent", is_flag=True)
@click.option("-v", "--verbose", is_flag=True)
@click.option("--disable-auto-clean", is_flag=True)
@click.pass_context
def cli(
    ctx,
    environment,
    target,
    upload_port,
    project_dir,
    project_conf,
    jobs,
    silent,
    verbose,
    disable_auto_clean,
):
    app.set_session_var("custom_project_conf", project_conf)

    # find project directory on upper level
    if isfile(project_dir):
        project_dir = find_project_dir_above(project_dir)

@@ -77,47 +90,58 @@ def cli(ctx, environment, target, upload_port, project_dir, project_conf, jobs,
    is_test_running = CTX_META_TEST_IS_RUNNING in ctx.meta

    with fs.cd(project_dir):
        config = ProjectConfig.get_instance(project_conf)
        config.validate(environment)

        # clean obsolete build dir
        if not disable_auto_clean:
            build_dir = config.get_optional_dir("build")
            try:
                clean_build_dir(build_dir, config)
            except:  # pylint: disable=bare-except
                click.secho(
                    "Can not remove temporary directory `%s`. Please remove "
                    "it manually to avoid build issues" % build_dir,
                    fg="yellow",
                )

        handle_legacy_libdeps(project_dir, config)

        default_envs = config.default_envs()
        results = []
        for env in config.envs():
            skipenv = any(
                [
                    environment and env not in environment,
                    not environment and default_envs and env not in default_envs,
                ]
            )
            if skipenv:
                results.append({"env": env})
                continue

            # print empty line between multi environment project
            if not silent and any(r.get("succeeded") is not None for r in results):
                click.echo()

            results.append(
                process_env(
                    ctx,
                    env,
                    config,
                    environment,
                    target,
                    upload_port,
                    silent,
                    verbose,
                    jobs,
                    is_test_running,
                )
            )

        command_failed = any(r.get("succeeded") is False for r in results)

        if not is_test_running and (command_failed or not silent) and len(results) > 1:
            print_processing_summary(results)

        if command_failed:

@@ -125,24 +149,39 @@ def cli(ctx, environment, target, upload_port, project_dir, project_conf, jobs,
        return True


def process_env(
    ctx,
    name,
    config,
    environments,
    targets,
    upload_port,
    silent,
    verbose,
    jobs,
    is_test_running,
):
    if not is_test_running and not silent:
        print_processing_header(name, config, verbose)

    ep = EnvironmentProcessor(
        ctx, name, config, targets, upload_port, silent, verbose, jobs
    )
    result = {"env": name, "duration": time(), "succeeded": ep.process()}
    result["duration"] = time() - result["duration"]

    # print footer on error or when is not unit testing
    if not is_test_running and (not silent or not result["succeeded"]):
        print_processing_footer(result)

    if (
        result["succeeded"]
        and "monitor" in ep.get_build_targets()
        and "nobuild" not in ep.get_build_targets()
    ):
        ctx.invoke(
            cmd_device_monitor, environment=environments[0] if environments else None
        )

    return result

@@ -151,10 +190,11 @@ def print_processing_header(env, config, verbose=False):
    env_dump = []
    for k, v in config.items(env=env):
        if verbose or k in ("platform", "framework", "board"):
            env_dump.append("%s: %s" % (k, ", ".join(v) if isinstance(v, list) else v))
    click.echo(
        "Processing %s (%s)"
        % (click.style(env, fg="cyan", bold=True), "; ".join(env_dump))
    )
    terminal_width, _ = click.get_terminal_size()
    click.secho("-" * terminal_width, bold=True)

@@ -162,10 +202,17 @@ def print_processing_header(env, config, verbose=False):
def print_processing_footer(result):
    is_failed = not result.get("succeeded")
    util.print_labeled_bar(
        "[%s] Took %.2f seconds"
        % (
            (
                click.style("FAILED", fg="red", bold=True)
                if is_failed
                else click.style("SUCCESS", fg="green", bold=True)
            ),
            result["duration"],
        ),
        is_error=is_failed,
    )


def print_processing_summary(results):

@@ -186,20 +233,31 @@ def print_processing_summary(results):
            status_str = click.style("SUCCESS", fg="green")

        tabular_data.append(
            (
                click.style(result["env"], fg="cyan"),
                status_str,
                util.humanize_duration_time(result.get("duration")),
            )
        )

    click.echo()
    click.echo(
        tabulate(
            tabular_data,
            headers=[
                click.style(s, bold=True) for s in ("Environment", "Status", "Duration")
            ],
        ),
        err=failed_nums,
    )

    util.print_labeled_bar(
        "%s%d succeeded in %s"
        % (
            "%d failed, " % failed_nums if failed_nums else "",
            succeeded_nums,
            util.humanize_duration_time(duration),
        ),
        is_error=failed_nums,
        fg="red" if failed_nums else "green",
    )
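The run command above now resolves all project paths through a single ProjectConfig instance instead of the old get_project_*_dir helpers. A minimal sketch of that pattern (the function name and arguments here are illustrative):

from platformio import fs
from platformio.project.config import ProjectConfig


def resolve_build_dir(project_dir, project_conf=None, environments=()):
    # Load platformio.ini (or a custom config passed via --project-conf),
    # validate the requested environments, then ask the config object for
    # the build directory, as the run command does above.
    with fs.cd(project_dir):
        config = ProjectConfig.get_instance(project_conf)
        config.validate(environments)
        return config.get_optional_dir("build")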

View File

@@ -18,15 +18,14 @@ from os.path import isdir, isfile, join
import click

from platformio import fs
from platformio.project.helpers import compute_project_checksum, get_project_dir


def handle_legacy_libdeps(project_dir, config):
    legacy_libdeps_dir = join(project_dir, ".piolibdeps")
    if not isdir(legacy_libdeps_dir) or legacy_libdeps_dir == config.get_optional_dir(
        "libdeps"
    ):
        return
    if not config.has_section("env"):
        config.add_section("env")

@@ -39,7 +38,8 @@ def handle_legacy_libdeps(project_dir, config):
        " file using `lib_deps` option and remove `{0}` folder."
        "\nMore details -> http://docs.platformio.org/page/projectconf/"
        "section_env_library.html#lib-deps".format(legacy_libdeps_dir),
        fg="yellow",
    )


def clean_build_dir(build_dir, config):

@@ -53,12 +53,9 @@ def clean_build_dir(build_dir, config):
    if isdir(build_dir):
        # check project structure
        if isfile(checksum_file) and fs.get_file_contents(checksum_file) == checksum:
            return
        fs.rmtree(build_dir)

    makedirs(build_dir)
    fs.write_file_contents(checksum_file, checksum)
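clean_build_dir above now goes through the fs helpers for the checksum file; a short sketch of that read/compare/write cycle (the checksum file name below is illustrative, the real one is defined outside this hunk):

from os.path import isfile, join

from platformio import fs


def is_build_dir_fresh(build_dir, checksum):
    # Compare the stored checksum with the freshly computed project checksum.
    checksum_file = join(build_dir, "project.checksum")  # illustrative name
    return isfile(checksum_file) and fs.get_file_contents(checksum_file) == checksum


def remember_checksum(build_dir, checksum):
    fs.write_file_contents(join(build_dir, "project.checksum"), checksum)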

View File

@@ -13,8 +13,7 @@
# limitations under the License.

from platformio import exception, telemetry
from platformio.commands.platform import platform_install as cmd_platform_install
from platformio.commands.test.processor import CTX_META_TEST_RUNNING_NAME
from platformio.managers.platform import PlatformFactory

@@ -22,10 +21,9 @@ from platformio.managers.platform import PlatformFactory
class EnvironmentProcessor(object):
    def __init__(  # pylint: disable=too-many-arguments
        self, cmd_ctx, name, config, targets, upload_port, silent, verbose, jobs
    ):
        self.cmd_ctx = cmd_ctx
        self.name = name
        self.config = config

@@ -40,25 +38,28 @@ class EnvironmentProcessor(object):
        variables = {"pioenv": self.name, "project_config": self.config.path}
        if CTX_META_TEST_RUNNING_NAME in self.cmd_ctx.meta:
            variables["piotest_running_name"] = self.cmd_ctx.meta[
                CTX_META_TEST_RUNNING_NAME
            ]
        if self.upload_port:
            # override upload port with a custom from CLI
            variables["upload_port"] = self.upload_port
        return variables

    def get_build_targets(self):
        return (
            self.targets
            if self.targets
            else self.config.get("env:" + self.name, "targets", [])
        )

    def process(self):
        if "platform" not in self.options:
            raise exception.UndefinedEnvPlatform(self.name)

        build_vars = self.get_build_variables()
        build_targets = list(self.get_build_targets())

        telemetry.on_run_environment(self.options, build_targets)

@@ -67,13 +68,14 @@ class EnvironmentProcessor(object):
            build_targets.remove("monitor")
        try:
            p = PlatformFactory.newPlatform(self.options["platform"])
        except exception.UnknownPlatform:
            self.cmd_ctx.invoke(
                cmd_platform_install,
                platforms=[self.options["platform"]],
                skip_default_package=True,
            )
            p = PlatformFactory.newPlatform(self.options["platform"])

        result = p.run(build_vars, build_targets, self.silent, self.verbose, self.jobs)
        return result["returncode"] == 0
View File

@@ -42,20 +42,24 @@ def settings_get(name):
        raw_value = app.get_setting(key)
        formatted_value = format_value(raw_value)

        if raw_value != options["value"]:
            default_formatted_value = format_value(options["value"])
            formatted_value += "%s" % (
                "\n" if len(default_formatted_value) > 10 else " "
            )
            formatted_value += "[%s]" % click.style(
                default_formatted_value, fg="yellow"
            )

        tabular_data.append(
            (click.style(key, fg="cyan"), formatted_value, options["description"])
        )

    click.echo(
        tabulate(
            tabular_data, headers=["Name", "Current value [Default]", "Description"]
        )
    )


@cli.command("set", short_help="Set new value for the setting")
View File

@@ -11,5 +11,3 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio.commands.test.command import cli
View File

@@ -22,70 +22,91 @@ from time import time
import click
from tabulate import tabulate

from platformio import app, exception, fs, util
from platformio.commands.test.embedded import EmbeddedTestProcessor
from platformio.commands.test.native import NativeTestProcessor
from platformio.project.config import ProjectConfig


@click.command("test", short_help="Unit Testing")
@click.option("--environment", "-e", multiple=True, metavar="<environment>")
@click.option(
    "--filter",
    "-f",
    multiple=True,
    metavar="<pattern>",
    help="Filter tests by a pattern",
)
@click.option(
    "--ignore",
    "-i",
    multiple=True,
    metavar="<pattern>",
    help="Ignore tests by a pattern",
)
@click.option("--upload-port")
@click.option("--test-port")
@click.option(
    "-d",
    "--project-dir",
    default=getcwd,
    type=click.Path(
        exists=True, file_okay=False, dir_okay=True, writable=True, resolve_path=True
    ),
)
@click.option(
    "-c",
    "--project-conf",
    type=click.Path(
        exists=True, file_okay=True, dir_okay=False, readable=True, resolve_path=True
    ),
)
@click.option("--without-building", is_flag=True)
@click.option("--without-uploading", is_flag=True)
@click.option("--without-testing", is_flag=True)
@click.option("--no-reset", is_flag=True)
@click.option(
    "--monitor-rts",
    default=None,
    type=click.IntRange(0, 1),
    help="Set initial RTS line state for Serial Monitor",
)
@click.option(
    "--monitor-dtr",
    default=None,
    type=click.IntRange(0, 1),
    help="Set initial DTR line state for Serial Monitor",
)
@click.option("--verbose", "-v", is_flag=True)
@click.pass_context
def cli(  # pylint: disable=redefined-builtin
    ctx,
    environment,
    ignore,
    filter,
    upload_port,
    test_port,
    project_dir,
    project_conf,
    without_building,
    without_uploading,
    without_testing,
    no_reset,
    monitor_rts,
    monitor_dtr,
    verbose,
):
    app.set_session_var("custom_project_conf", project_conf)

    with fs.cd(project_dir):
        config = ProjectConfig.get_instance(project_conf)
        config.validate(envs=environment)

        test_dir = config.get_optional_dir("test")
        if not isdir(test_dir):
            raise exception.TestDirNotExists(test_dir)
        test_names = get_test_names(test_dir)

        click.echo("Verbose mode can be enabled via `-v, --verbose` option")
        click.secho("Collected %d items" % len(test_names), bold=True)

@@ -99,19 +120,16 @@ def cli( # pylint: disable=redefined-builtin
                # filter and ignore patterns
                patterns = dict(filter=list(filter), ignore=list(ignore))
                for key in patterns:
                    patterns[key].extend(config.get(section, "test_%s" % key, []))

                skip_conditions = [
                    environment and envname not in environment,
                    not environment and default_envs and envname not in default_envs,
                    testname != "*"
                    and patterns["filter"]
                    and not any([fnmatch(testname, p) for p in patterns["filter"]]),
                    testname != "*"
                    and any([fnmatch(testname, p) for p in patterns["ignore"]]),
                ]
                if any(skip_conditions):
                    results.append({"env": envname, "test": testname})
                    continue

@@ -120,29 +138,36 @@ def cli( # pylint: disable=redefined-builtin
                click.echo()
                print_processing_header(testname, envname)

                cls = (
                    NativeTestProcessor
                    if config.get(section, "platform") == "native"
                    else EmbeddedTestProcessor
                )
                tp = cls(
                    ctx,
                    testname,
                    envname,
                    dict(
                        project_config=config,
                        project_dir=project_dir,
                        upload_port=upload_port,
                        test_port=test_port,
                        without_building=without_building,
                        without_uploading=without_uploading,
                        without_testing=without_testing,
                        no_reset=no_reset,
                        monitor_rts=monitor_rts,
                        monitor_dtr=monitor_dtr,
                        verbose=verbose,
                    ),
                )
                result = {
                    "env": envname,
                    "test": testname,
                    "duration": time(),
                    "succeeded": tp.process(),
                }
                result["duration"] = time() - result["duration"]
                results.append(result)

                print_processing_footer(result)

@@ -168,8 +193,13 @@ def get_test_names(test_dir):
def print_processing_header(test, env):
    click.echo(
        "Processing %s in %s environment"
        % (
            click.style(test, fg="yellow", bold=True),
            click.style(env, fg="cyan", bold=True),
        )
    )
    terminal_width, _ = click.get_terminal_size()
    click.secho("-" * terminal_width, bold=True)

@@ -177,10 +207,17 @@ def print_processing_header(test, env):
def print_processing_footer(result):
    is_failed = not result.get("succeeded")
    util.print_labeled_bar(
        "[%s] Took %.2f seconds"
        % (
            (
                click.style("FAILED", fg="red", bold=True)
                if is_failed
                else click.style("PASSED", fg="green", bold=True)
            ),
            result["duration"],
        ),
        is_error=is_failed,
    )


def print_testing_summary(results):

@@ -203,20 +240,32 @@ def print_testing_summary(results):
            status_str = click.style("PASSED", fg="green")

        tabular_data.append(
            (
                result["test"],
                click.style(result["env"], fg="cyan"),
                status_str,
                util.humanize_duration_time(result.get("duration")),
            )
        )

    click.echo(
        tabulate(
            tabular_data,
            headers=[
                click.style(s, bold=True)
                for s in ("Test", "Environment", "Status", "Duration")
            ],
        ),
        err=failed_nums,
    )

    util.print_labeled_bar(
        "%s%d succeeded in %s"
        % (
            "%d failed, " % failed_nums if failed_nums else "",
            succeeded_nums,
            util.humanize_duration_time(duration),
        ),
        is_error=failed_nums,
        fg="red" if failed_nums else "green",
    )
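The --filter/--ignore handling above is plain fnmatch-based selection; a simplified, self-contained sketch of the same idea (it drops the special-case "*" test name):

from fnmatch import fnmatch


def select_tests(test_names, filters=(), ignores=()):
    # Keep a test when it matches at least one --filter pattern (if any were
    # given) and matches none of the --ignore patterns.
    selected = []
    for name in test_names:
        if filters and not any(fnmatch(name, p) for p in filters):
            continue
        if any(fnmatch(name, p) for p in ignores):
            continue
        selected.append(name)
    return selected


# Example: select_tests(["test_math", "test_wifi"], filters=("test_*",), ignores=("*_wifi",))
# returns ["test_math"]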

View File

@@ -27,47 +27,50 @@ class EmbeddedTestProcessor(TestProcessorBase):
SERIAL_TIMEOUT = 600 SERIAL_TIMEOUT = 600
def process(self): def process(self):
if not self.options['without_building']: if not self.options["without_building"]:
self.print_progress("Building...") self.print_progress("Building...")
target = ["__test"] target = ["__test"]
if self.options['without_uploading']: if self.options["without_uploading"]:
target.append("checkprogsize") target.append("checkprogsize")
if not self.build_or_upload(target): if not self.build_or_upload(target):
return False return False
if not self.options['without_uploading']: if not self.options["without_uploading"]:
self.print_progress("Uploading...") self.print_progress("Uploading...")
target = ["upload"] target = ["upload"]
if self.options['without_building']: if self.options["without_building"]:
target.append("nobuild") target.append("nobuild")
else: else:
target.append("__test") target.append("__test")
if not self.build_or_upload(target): if not self.build_or_upload(target):
return False return False
if self.options['without_testing']: if self.options["without_testing"]:
return None return None
self.print_progress("Testing...") self.print_progress("Testing...")
return self.run() return self.run()
def run(self): def run(self):
click.echo("If you don't see any output for the first 10 secs, " click.echo(
"please reset board (press reset button)") "If you don't see any output for the first 10 secs, "
"please reset board (press reset button)"
)
click.echo() click.echo()
try: try:
ser = serial.Serial(baudrate=self.get_baudrate(), ser = serial.Serial(
timeout=self.SERIAL_TIMEOUT) baudrate=self.get_baudrate(), timeout=self.SERIAL_TIMEOUT
)
ser.port = self.get_test_port() ser.port = self.get_test_port()
ser.rts = self.options['monitor_rts'] ser.rts = self.options["monitor_rts"]
ser.dtr = self.options['monitor_dtr'] ser.dtr = self.options["monitor_dtr"]
ser.open() ser.open()
except serial.SerialException as e: except serial.SerialException as e:
click.secho(str(e), fg="red", err=True) click.secho(str(e), fg="red", err=True)
return False return False
if not self.options['no_reset']: if not self.options["no_reset"]:
ser.flushInput() ser.flushInput()
ser.setDTR(False) ser.setDTR(False)
ser.setRTS(False) ser.setRTS(False)
@@ -90,7 +93,7 @@ class EmbeddedTestProcessor(TestProcessorBase):
if not line: if not line:
continue continue
if isinstance(line, bytes): if isinstance(line, bytes):
line = line.decode("utf8") line = line.decode("utf8", "ignore")
self.on_run_out(line) self.on_run_out(line)
if all([l in line for l in ("Tests", "Failures", "Ignored")]): if all([l in line for l in ("Tests", "Failures", "Ignored")]):
break break
@@ -105,17 +108,16 @@ class EmbeddedTestProcessor(TestProcessorBase):
return self.env_options.get("test_port") return self.env_options.get("test_port")
assert set(["platform", "board"]) & set(self.env_options.keys()) assert set(["platform", "board"]) & set(self.env_options.keys())
p = PlatformFactory.newPlatform(self.env_options['platform']) p = PlatformFactory.newPlatform(self.env_options["platform"])
board_hwids = p.board_config(self.env_options['board']).get( board_hwids = p.board_config(self.env_options["board"]).get("build.hwids", [])
"build.hwids", [])
port = None port = None
elapsed = 0 elapsed = 0
while elapsed < 5 and not port: while elapsed < 5 and not port:
for item in util.get_serialports(): for item in util.get_serialports():
port = item['port'] port = item["port"]
for hwid in board_hwids: for hwid in board_hwids:
hwid_str = ("%s:%s" % (hwid[0], hwid[1])).replace("0x", "") hwid_str = ("%s:%s" % (hwid[0], hwid[1])).replace("0x", "")
if hwid_str in item['hwid']: if hwid_str in item["hwid"]:
return port return port
# check if port is already configured # check if port is already configured
@@ -131,5 +133,6 @@ class EmbeddedTestProcessor(TestProcessorBase):
if not port: if not port:
raise exception.PlatformioException( raise exception.PlatformioException(
"Please specify `test_port` for environment or use " "Please specify `test_port` for environment or use "
"global `--test-port` option.") "global `--test-port` option."
)
return port return port
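
The test-port lookup above matches the board's USB VID:PID pairs against connected serial devices. A rough standalone sketch of the same idea using pyserial's list_ports; the board_hwids values are made up:

    import serial.tools.list_ports

    # Hypothetical "build.hwids" entries from a board manifest.
    board_hwids = [["0x2341", "0x0043"], ["0x2A03", "0x0043"]]

    def find_board_port():
        for item in serial.tools.list_ports.comports():
            # item.hwid typically looks like "USB VID:PID=2341:0043 ..."
            for vid, pid in board_hwids:
                hwid_str = ("%s:%s" % (vid, pid)).replace("0x", "")
                if hwid_str.lower() in item.hwid.lower():
                    return item.device
        return None
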
@@ -14,30 +14,28 @@
from os.path import join from os.path import join
from platformio import fs, proc from platformio import proc
from platformio.commands.test.processor import TestProcessorBase from platformio.commands.test.processor import TestProcessorBase
from platformio.proc import LineBufferedAsyncPipe from platformio.proc import LineBufferedAsyncPipe
from platformio.project.helpers import get_project_build_dir
class NativeTestProcessor(TestProcessorBase): class NativeTestProcessor(TestProcessorBase):
def process(self): def process(self):
if not self.options['without_building']: if not self.options["without_building"]:
self.print_progress("Building...") self.print_progress("Building...")
if not self.build_or_upload(["__test"]): if not self.build_or_upload(["__test"]):
return False return False
if self.options['without_testing']: if self.options["without_testing"]:
return None return None
self.print_progress("Testing...") self.print_progress("Testing...")
return self.run() return self.run()
def run(self): def run(self):
with fs.cd(self.options['project_dir']): build_dir = self.options["project_config"].get_optional_dir("build")
build_dir = get_project_build_dir()
result = proc.exec_command( result = proc.exec_command(
[join(build_dir, self.env_name, "program")], [join(build_dir, self.env_name, "program")],
stdout=LineBufferedAsyncPipe(self.on_run_out), stdout=LineBufferedAsyncPipe(self.on_run_out),
stderr=LineBufferedAsyncPipe(self.on_run_out)) stderr=LineBufferedAsyncPipe(self.on_run_out),
)
assert "returncode" in result assert "returncode" in result
return result['returncode'] == 0 and not self._run_failed return result["returncode"] == 0 and not self._run_failed
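
As a point of reference, the native runner is essentially "spawn the built test binary and stream its stdout line by line". A simplified sketch using only the standard library; the program path is hypothetical and LineBufferedAsyncPipe is replaced by plain iteration:

    import subprocess

    def on_run_out(line):
        print(line.rstrip())

    def run_native_test(program=".pio/build/native/program"):
        proc = subprocess.Popen(
            [program],
            stdout=subprocess.PIPE,
            stderr=subprocess.STDOUT,
            universal_newlines=True,
        )
        for line in proc.stdout:
            on_run_out(line)
        return proc.wait() == 0
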
@@ -19,8 +19,7 @@ from string import Template
import click import click
from platformio import exception from platformio import exception, fs
from platformio.project.helpers import get_project_test_dir
TRANSPORT_OPTIONS = { TRANSPORT_OPTIONS = {
"arduino": { "arduino": {
@@ -29,7 +28,7 @@ TRANSPORT_OPTIONS = {
"putchar": "Serial.write(c)", "putchar": "Serial.write(c)",
"flush": "Serial.flush()", "flush": "Serial.flush()",
"begin": "Serial.begin($baudrate)", "begin": "Serial.begin($baudrate)",
"end": "Serial.end()" "end": "Serial.end()",
}, },
"mbed": { "mbed": {
"include": "#include <mbed.h>", "include": "#include <mbed.h>",
@@ -37,7 +36,7 @@ TRANSPORT_OPTIONS = {
"putchar": "pc.putc(c)", "putchar": "pc.putc(c)",
"flush": "", "flush": "",
"begin": "pc.baud($baudrate)", "begin": "pc.baud($baudrate)",
"end": "" "end": "",
}, },
"espidf": { "espidf": {
"include": "#include <stdio.h>", "include": "#include <stdio.h>",
@@ -45,7 +44,7 @@ TRANSPORT_OPTIONS = {
"putchar": "putchar(c)", "putchar": "putchar(c)",
"flush": "fflush(stdout)", "flush": "fflush(stdout)",
"begin": "", "begin": "",
"end": "" "end": "",
}, },
"native": { "native": {
"include": "#include <stdio.h>", "include": "#include <stdio.h>",
@@ -53,7 +52,7 @@ TRANSPORT_OPTIONS = {
"putchar": "putchar(c)", "putchar": "putchar(c)",
"flush": "fflush(stdout)", "flush": "fflush(stdout)",
"begin": "", "begin": "",
"end": "" "end": "",
}, },
"custom": { "custom": {
"include": '#include "unittest_transport.h"', "include": '#include "unittest_transport.h"',
@@ -61,8 +60,8 @@ TRANSPORT_OPTIONS = {
"putchar": "unittest_uart_putchar(c)", "putchar": "unittest_uart_putchar(c)",
"flush": "unittest_uart_flush()", "flush": "unittest_uart_flush()",
"begin": "unittest_uart_begin()", "begin": "unittest_uart_begin()",
"end": "unittest_uart_end()" "end": "unittest_uart_end()",
} },
} }
CTX_META_TEST_IS_RUNNING = __name__ + ".test_running" CTX_META_TEST_IS_RUNNING = __name__ + ".test_running"
@@ -79,8 +78,7 @@ class TestProcessorBase(object):
self.test_name = testname self.test_name = testname
self.options = options self.options = options
self.env_name = envname self.env_name = envname
self.env_options = options['project_config'].items(env=envname, self.env_options = options["project_config"].items(env=envname, as_dict=True)
as_dict=True)
self._run_failed = False self._run_failed = False
self._outputcpp_generated = False self._outputcpp_generated = False
@@ -90,10 +88,11 @@ class TestProcessorBase(object):
elif "framework" in self.env_options: elif "framework" in self.env_options:
transport = self.env_options.get("framework")[0] transport = self.env_options.get("framework")[0]
if "test_transport" in self.env_options: if "test_transport" in self.env_options:
transport = self.env_options['test_transport'] transport = self.env_options["test_transport"]
if transport not in TRANSPORT_OPTIONS: if transport not in TRANSPORT_OPTIONS:
raise exception.PlatformioException( raise exception.PlatformioException(
"Unknown Unit Test transport `%s`" % transport) "Unknown Unit Test transport `%s`" % transport
)
return transport.lower() return transport.lower()
def get_baudrate(self): def get_baudrate(self):
@@ -104,21 +103,27 @@ class TestProcessorBase(object):
def build_or_upload(self, target): def build_or_upload(self, target):
if not self._outputcpp_generated: if not self._outputcpp_generated:
self.generate_outputcpp(get_project_test_dir()) self.generate_outputcpp(
self.options["project_config"].get_optional_dir("test")
)
self._outputcpp_generated = True self._outputcpp_generated = True
if self.test_name != "*": if self.test_name != "*":
self.cmd_ctx.meta[CTX_META_TEST_RUNNING_NAME] = self.test_name self.cmd_ctx.meta[CTX_META_TEST_RUNNING_NAME] = self.test_name
try: try:
from platformio.commands.run import cli as cmd_run # pylint: disable=import-outside-toplevel
return self.cmd_ctx.invoke(cmd_run, from platformio.commands.run.command import cli as cmd_run
project_dir=self.options['project_dir'],
upload_port=self.options['upload_port'], return self.cmd_ctx.invoke(
silent=not self.options['verbose'], cmd_run,
environment=[self.env_name], project_dir=self.options["project_dir"],
disable_auto_clean="nobuild" in target, upload_port=self.options["upload_port"],
target=target) silent=not self.options["verbose"],
environment=[self.env_name],
disable_auto_clean="nobuild" in target,
target=target,
)
except exception.ReturnErrorCode: except exception.ReturnErrorCode:
return False return False
@@ -131,8 +136,7 @@ class TestProcessorBase(object):
def on_run_out(self, line): def on_run_out(self, line):
line = line.strip() line = line.strip()
if line.endswith(":PASS"): if line.endswith(":PASS"):
click.echo("%s\t[%s]" % click.echo("%s\t[%s]" % (line[:-5], click.style("PASSED", fg="green")))
(line[:-5], click.style("PASSED", fg="green")))
elif ":FAIL" in line: elif ":FAIL" in line:
self._run_failed = True self._run_failed = True
click.echo("%s\t[%s]" % (line, click.style("FAILED", fg="red"))) click.echo("%s\t[%s]" % (line, click.style("FAILED", fg="red")))
@@ -142,36 +146,38 @@ class TestProcessorBase(object):
def generate_outputcpp(self, test_dir): def generate_outputcpp(self, test_dir):
assert isdir(test_dir) assert isdir(test_dir)
cpp_tpl = "\n".join([ cpp_tpl = "\n".join(
"$include", [
"#include <output_export.h>", "$include",
"", "#include <output_export.h>",
"$object", "",
"", "$object",
"#ifdef __GNUC__", "",
"void output_start(unsigned int baudrate __attribute__((unused)))", "#ifdef __GNUC__",
"#else", "void output_start(unsigned int baudrate __attribute__((unused)))",
"void output_start(unsigned int baudrate)", "#else",
"#endif", "void output_start(unsigned int baudrate)",
"{", "#endif",
" $begin;", "{",
"}", " $begin;",
"", "}",
"void output_char(int c)", "",
"{", "void output_char(int c)",
" $putchar;", "{",
"}", " $putchar;",
"", "}",
"void output_flush(void)", "",
"{", "void output_flush(void)",
" $flush;", "{",
"}", " $flush;",
"", "}",
"void output_complete(void)", "",
"{", "void output_complete(void)",
" $end;", "{",
"}" " $end;",
]) # yapf: disable "}",
]
)
def delete_tmptest_file(file_): def delete_tmptest_file(file_):
try: try:
@@ -181,14 +187,13 @@ class TestProcessorBase(object):
click.secho( click.secho(
"Warning: Could not remove temporary file '%s'. " "Warning: Could not remove temporary file '%s'. "
"Please remove it manually." % file_, "Please remove it manually." % file_,
fg="yellow") fg="yellow",
)
tpl = Template(cpp_tpl).substitute( tpl = Template(cpp_tpl).substitute(TRANSPORT_OPTIONS[self.get_transport()])
TRANSPORT_OPTIONS[self.get_transport()])
data = Template(tpl).substitute(baudrate=self.get_baudrate()) data = Template(tpl).substitute(baudrate=self.get_baudrate())
tmp_file = join(test_dir, "output_export.cpp") tmp_file = join(test_dir, "output_export.cpp")
with open(tmp_file, "w") as f: fs.write_file_contents(tmp_file, data)
f.write(data)
atexit.register(delete_tmptest_file, tmp_file) atexit.register(delete_tmptest_file, tmp_file)
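
The output_export.cpp generation above is two passes of string.Template substitution: transport hooks first, then the baudrate. A self-contained sketch of that mechanism with a made-up transport mapping:

    from string import Template

    cpp_tpl = "\n".join(
        [
            "$include",
            "void output_start(unsigned int baudrate) { $begin; }",
            "void output_char(int c) { $putchar; }",
        ]
    )

    # Hypothetical transport options, mirroring the structure used above.
    transport = {
        "include": "#include <Arduino.h>",
        "begin": "Serial.begin($baudrate)",
        "putchar": "Serial.write(c)",
    }

    stage_one = Template(cpp_tpl).substitute(transport)       # insert hooks
    source = Template(stage_one).substitute(baudrate=115200)  # resolve $baudrate
    print(source)
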
@@ -22,18 +22,19 @@ from platformio.managers.core import update_core_packages
from platformio.managers.lib import LibraryManager from platformio.managers.lib import LibraryManager
@click.command("update", @click.command(
short_help="Update installed platforms, packages and libraries") "update", short_help="Update installed platforms, packages and libraries"
@click.option("--core-packages", )
is_flag=True, @click.option("--core-packages", is_flag=True, help="Update only the core packages")
help="Update only the core packages") @click.option(
@click.option("-c", "-c",
"--only-check", "--only-check",
is_flag=True, is_flag=True,
help="DEPRECATED. Please use `--dry-run` instead") help="DEPRECATED. Please use `--dry-run` instead",
@click.option("--dry-run", )
is_flag=True, @click.option(
help="Do not update, only check for the new versions") "--dry-run", is_flag=True, help="Do not update, only check for the new versions"
)
@click.pass_context @click.pass_context
def cli(ctx, core_packages, only_check, dry_run): def cli(ctx, core_packages, only_check, dry_run):
# cleanup lib search results, cached board and platform lists # cleanup lib search results, cached board and platform lists
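
These decorators are plain Click usage; for readers who have not used Click, a minimal command with the same flag style (the command name is invented):

    import click

    @click.command("refresh", short_help="Hypothetical update-style command")
    @click.option("--dry-run", is_flag=True, help="Only check for new versions")
    def cli(dry_run):
        click.echo("dry run" if dry_run else "updating")

    if __name__ == "__main__":
        cli()  # pylint: disable=no-value-for-parameter
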
@@ -19,27 +19,29 @@ from zipfile import ZipFile
import click import click
import requests import requests
from platformio import VERSION, __version__, exception, util from platformio import VERSION, __version__, app, exception, util
from platformio.compat import WINDOWS from platformio.compat import WINDOWS
from platformio.proc import exec_command, get_pythonexe_path from platformio.proc import exec_command, get_pythonexe_path
from platformio.project.helpers import get_project_cache_dir from platformio.project.helpers import get_project_cache_dir
@click.command("upgrade", @click.command("upgrade", short_help="Upgrade PlatformIO to the latest version")
short_help="Upgrade PlatformIO to the latest version")
@click.option("--dev", is_flag=True, help="Use development branch") @click.option("--dev", is_flag=True, help="Use development branch")
def cli(dev): def cli(dev):
if not dev and __version__ == get_latest_version(): if not dev and __version__ == get_latest_version():
return click.secho( return click.secho(
"You're up-to-date!\nPlatformIO %s is currently the " "You're up-to-date!\nPlatformIO %s is currently the "
"newest version available." % __version__, "newest version available." % __version__,
fg="green") fg="green",
)
click.secho("Please wait while upgrading PlatformIO ...", fg="yellow") click.secho("Please wait while upgrading PlatformIO ...", fg="yellow")
to_develop = dev or not all(c.isdigit() for c in __version__ if c != ".") to_develop = dev or not all(c.isdigit() for c in __version__ if c != ".")
cmds = (["pip", "install", "--upgrade", cmds = (
get_pip_package(to_develop)], ["platformio", "--version"]) ["pip", "install", "--upgrade", get_pip_package(to_develop)],
["platformio", "--version"],
)
cmd = None cmd = None
r = {} r = {}
@@ -49,26 +51,30 @@ def cli(dev):
r = exec_command(cmd) r = exec_command(cmd)
# try pip with disabled cache # try pip with disabled cache
if r['returncode'] != 0 and cmd[2] == "pip": if r["returncode"] != 0 and cmd[2] == "pip":
cmd.insert(3, "--no-cache-dir") cmd.insert(3, "--no-cache-dir")
r = exec_command(cmd) r = exec_command(cmd)
assert r['returncode'] == 0 assert r["returncode"] == 0
assert "version" in r['out'] assert "version" in r["out"]
actual_version = r['out'].strip().split("version", 1)[1].strip() actual_version = r["out"].strip().split("version", 1)[1].strip()
click.secho("PlatformIO has been successfully upgraded to %s" % click.secho(
actual_version, "PlatformIO has been successfully upgraded to %s" % actual_version,
fg="green") fg="green",
)
click.echo("Release notes: ", nl=False) click.echo("Release notes: ", nl=False)
click.secho("https://docs.platformio.org/en/latest/history.html", click.secho("https://docs.platformio.org/en/latest/history.html", fg="cyan")
fg="cyan") if app.get_session_var("caller_id"):
click.secho(
"Warning! Please restart IDE to affect PIO Home changes", fg="yellow"
)
except Exception as e: # pylint: disable=broad-except except Exception as e: # pylint: disable=broad-except
if not r: if not r:
raise exception.UpgradeError("\n".join([str(cmd), str(e)])) raise exception.UpgradeError("\n".join([str(cmd), str(e)]))
permission_errors = ("permission denied", "not permitted") permission_errors = ("permission denied", "not permitted")
if (any(m in r['err'].lower() for m in permission_errors) if any(m in r["err"].lower() for m in permission_errors) and not WINDOWS:
and not WINDOWS): click.secho(
click.secho(""" """
----------------- -----------------
Permission denied Permission denied
----------------- -----------------
@@ -78,10 +84,11 @@ You need the `sudo` permission to install Python packages. Try
WARNING! Don't use `sudo` for the rest PlatformIO commands. WARNING! Don't use `sudo` for the rest PlatformIO commands.
""", """,
fg="yellow", fg="yellow",
err=True) err=True,
)
raise exception.ReturnErrorCode(1) raise exception.ReturnErrorCode(1)
raise exception.UpgradeError("\n".join([str(cmd), r['out'], r['err']])) raise exception.UpgradeError("\n".join([str(cmd), r["out"], r["err"]]))
return True return True
@@ -89,18 +96,17 @@ WARNING! Don't use `sudo` for the rest PlatformIO commands.
def get_pip_package(to_develop): def get_pip_package(to_develop):
if not to_develop: if not to_develop:
return "platformio" return "platformio"
dl_url = ("https://github.com/platformio/" dl_url = "https://github.com/platformio/platformio-core/archive/develop.zip"
"platformio-core/archive/develop.zip")
cache_dir = get_project_cache_dir() cache_dir = get_project_cache_dir()
if not os.path.isdir(cache_dir): if not os.path.isdir(cache_dir):
os.makedirs(cache_dir) os.makedirs(cache_dir)
pkg_name = os.path.join(cache_dir, "piocoredevelop.zip") pkg_name = os.path.join(cache_dir, "piocoredevelop.zip")
try: try:
with open(pkg_name, "w") as fp: with open(pkg_name, "w") as fp:
r = exec_command(["curl", "-fsSL", dl_url], r = exec_command(
stdout=fp, ["curl", "-fsSL", dl_url], stdout=fp, universal_newlines=True
universal_newlines=True) )
assert r['returncode'] == 0 assert r["returncode"] == 0
# check ZIP structure # check ZIP structure
with ZipFile(pkg_name) as zp: with ZipFile(pkg_name) as zp:
assert zp.testzip() is None assert zp.testzip() is None
@@ -127,7 +133,8 @@ def get_develop_latest_version():
r = requests.get( r = requests.get(
"https://raw.githubusercontent.com/platformio/platformio" "https://raw.githubusercontent.com/platformio/platformio"
"/develop/platformio/__init__.py", "/develop/platformio/__init__.py",
headers=util.get_request_defheaders()) headers=util.get_request_defheaders(),
)
r.raise_for_status() r.raise_for_status()
for line in r.text.split("\n"): for line in r.text.split("\n"):
line = line.strip() line = line.strip()
@@ -145,7 +152,8 @@ def get_develop_latest_version():
def get_pypi_latest_version(): def get_pypi_latest_version():
r = requests.get("https://pypi.org/pypi/platformio/json", r = requests.get(
headers=util.get_request_defheaders()) "https://pypi.org/pypi/platformio/json", headers=util.get_request_defheaders()
)
r.raise_for_status() r.raise_for_status()
return r.json()['info']['version'] return r.json()["info"]["version"]
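
For context, the PyPI check is a single request against the JSON API. A trimmed-down sketch without the custom request headers:

    import requests

    def get_pypi_latest_version(package="platformio"):
        r = requests.get("https://pypi.org/pypi/%s/json" % package, timeout=10)
        r.raise_for_status()
        return r.json()["info"]["version"]

    print(get_pypi_latest_version())
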
@@ -12,24 +12,41 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
# pylint: disable=unused-import # pylint: disable=unused-import, no-name-in-module, import-error,
# pylint: disable=no-member, undefined-variable
import inspect
import json import json
import locale
import os import os
import re import re
import sys import sys
PY2 = sys.version_info[0] == 2 PY2 = sys.version_info[0] == 2
CYGWIN = sys.platform.startswith('cygwin') CYGWIN = sys.platform.startswith("cygwin")
WINDOWS = sys.platform.startswith('win') WINDOWS = sys.platform.startswith("win")
def get_filesystem_encoding(): def get_filesystem_encoding():
return sys.getfilesystemencoding() or sys.getdefaultencoding() return sys.getfilesystemencoding() or sys.getdefaultencoding()
def get_locale_encoding():
return locale.getdefaultlocale()[1]
def get_class_attributes(cls):
attributes = inspect.getmembers(cls, lambda a: not (inspect.isroutine(a)))
return {
a[0]: a[1]
for a in attributes
if not (a[0].startswith("__") and a[0].endswith("__"))
}
if PY2: if PY2:
# pylint: disable=undefined-variable import imp
string_types = (str, unicode) string_types = (str, unicode)
def is_bytes(x): def is_bytes(x):
@@ -40,10 +57,6 @@ if PY2:
return path return path
return path.decode(get_filesystem_encoding()).encode("utf-8") return path.decode(get_filesystem_encoding()).encode("utf-8")
def get_file_contents(path):
with open(path) as f:
return f.read()
def hashlib_encode_data(data): def hashlib_encode_data(data):
if is_bytes(data): if is_bytes(data):
return data return data
@@ -56,13 +69,12 @@ if PY2:
def dump_json_to_unicode(obj): def dump_json_to_unicode(obj):
if isinstance(obj, unicode): if isinstance(obj, unicode):
return obj return obj
return json.dumps(obj, return json.dumps(
encoding=get_filesystem_encoding(), obj, encoding=get_filesystem_encoding(), ensure_ascii=False, sort_keys=True
ensure_ascii=False, ).encode("utf8")
sort_keys=True).encode("utf8")
_magic_check = re.compile('([*?[])') _magic_check = re.compile("([*?[])")
_magic_check_bytes = re.compile(b'([*?[])') _magic_check_bytes = re.compile(b"([*?[])")
def glob_escape(pathname): def glob_escape(pathname):
"""Escape all special characters.""" """Escape all special characters."""
@@ -72,14 +84,20 @@ if PY2:
# escaped. # escaped.
drive, pathname = os.path.splitdrive(pathname) drive, pathname = os.path.splitdrive(pathname)
if isinstance(pathname, bytes): if isinstance(pathname, bytes):
pathname = _magic_check_bytes.sub(br'[\1]', pathname) pathname = _magic_check_bytes.sub(br"[\1]", pathname)
else: else:
pathname = _magic_check.sub(r'[\1]', pathname) pathname = _magic_check.sub(r"[\1]", pathname)
return drive + pathname return drive + pathname
else:
from glob import escape as glob_escape # pylint: disable=no-name-in-module
string_types = (str, ) def load_python_module(name, pathname):
return imp.load_source(name, pathname)
else:
import importlib.util
from glob import escape as glob_escape
string_types = (str,)
def is_bytes(x): def is_bytes(x):
return isinstance(x, (bytes, memoryview, bytearray)) return isinstance(x, (bytes, memoryview, bytearray))
@@ -87,14 +105,6 @@ else:
def path_to_unicode(path): def path_to_unicode(path):
return path return path
def get_file_contents(path):
try:
with open(path) as f:
return f.read()
except UnicodeDecodeError:
with open(path, encoding="latin-1") as f:
return f.read()
def hashlib_encode_data(data): def hashlib_encode_data(data):
if is_bytes(data): if is_bytes(data):
return data return data
@@ -106,3 +116,9 @@ else:
if isinstance(obj, string_types): if isinstance(obj, string_types):
return obj return obj
return json.dumps(obj, ensure_ascii=False, sort_keys=True) return json.dumps(obj, ensure_ascii=False, sort_keys=True)
def load_python_module(name, pathname):
spec = importlib.util.spec_from_file_location(name, pathname)
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)
return module
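
The new get_class_attributes() helper simply filters inspect.getmembers() down to non-dunder, non-callable attributes. A small usage sketch; the BoardConfig class is invented for illustration:

    import inspect

    def get_class_attributes(cls):
        attributes = inspect.getmembers(cls, lambda a: not inspect.isroutine(a))
        return {
            a[0]: a[1]
            for a in attributes
            if not (a[0].startswith("__") and a[0].endswith("__"))
        }

    class BoardConfig(object):  # hypothetical example class
        vendor = "Acme"
        f_cpu = 16000000

    print(get_class_attributes(BoardConfig))  # -> {'f_cpu': 16000000, 'vendor': 'Acme'}
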
@@ -22,8 +22,11 @@ import click
import requests import requests
from platformio import util from platformio import util
from platformio.exception import (FDSHASumMismatch, FDSizeMismatch, from platformio.exception import (
FDUnrecognizedStatusCode) FDSHASumMismatch,
FDSizeMismatch,
FDUnrecognizedStatusCode,
)
from platformio.proc import exec_command from platformio.proc import exec_command
@@ -34,17 +37,22 @@ class FileDownloader(object):
def __init__(self, url, dest_dir=None): def __init__(self, url, dest_dir=None):
self._request = None self._request = None
# make connection # make connection
self._request = requests.get(url, self._request = requests.get(
stream=True, url,
headers=util.get_request_defheaders(), stream=True,
verify=version_info >= (2, 7, 9)) headers=util.get_request_defheaders(),
verify=version_info >= (2, 7, 9),
)
if self._request.status_code != 200: if self._request.status_code != 200:
raise FDUnrecognizedStatusCode(self._request.status_code, url) raise FDUnrecognizedStatusCode(self._request.status_code, url)
disposition = self._request.headers.get("content-disposition") disposition = self._request.headers.get("content-disposition")
if disposition and "filename=" in disposition: if disposition and "filename=" in disposition:
self._fname = disposition[disposition.index("filename=") + self._fname = (
9:].replace('"', "").replace("'", "") disposition[disposition.index("filename=") + 9 :]
.replace('"', "")
.replace("'", "")
)
else: else:
self._fname = [p for p in url.split("/") if p][-1] self._fname = [p for p in url.split("/") if p][-1]
self._fname = str(self._fname) self._fname = str(self._fname)
@@ -64,7 +72,7 @@ class FileDownloader(object):
def get_size(self): def get_size(self):
if "content-length" not in self._request.headers: if "content-length" not in self._request.headers:
return -1 return -1
return int(self._request.headers['content-length']) return int(self._request.headers["content-length"])
def start(self, with_progress=True): def start(self, with_progress=True):
label = "Downloading" label = "Downloading"
@@ -101,11 +109,11 @@ class FileDownloader(object):
dlsha1 = None dlsha1 = None
try: try:
result = exec_command(["sha1sum", self._destination]) result = exec_command(["sha1sum", self._destination])
dlsha1 = result['out'] dlsha1 = result["out"]
except (OSError, ValueError): except (OSError, ValueError):
try: try:
result = exec_command(["shasum", "-a", "1", self._destination]) result = exec_command(["shasum", "-a", "1", self._destination])
dlsha1 = result['out'] dlsha1 = result["out"]
except (OSError, ValueError): except (OSError, ValueError):
pass pass
if not dlsha1: if not dlsha1:
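
The checksum step above shells out to sha1sum/shasum. For comparison only, a portable alternative sketch with the standard hashlib module (the file path is hypothetical):

    import hashlib

    def file_sha1(path, chunk_size=64 * 1024):
        digest = hashlib.sha1()
        with open(path, "rb") as fp:
            for chunk in iter(lambda: fp.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # print(file_sha1("downloaded-package.tar.gz"))
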
@@ -64,8 +64,10 @@ class IncompatiblePlatform(PlatformioException):
class PlatformNotInstalledYet(PlatformioException): class PlatformNotInstalledYet(PlatformioException):
MESSAGE = ("The platform '{0}' has not been installed yet. " MESSAGE = (
"Use `platformio platform install {0}` command") "The platform '{0}' has not been installed yet. "
"Use `platformio platform install {0}` command"
)
class UnknownBoard(PlatformioException): class UnknownBoard(PlatformioException):
@@ -102,22 +104,27 @@ class MissingPackageManifest(PlatformIOPackageException):
class UndefinedPackageVersion(PlatformIOPackageException): class UndefinedPackageVersion(PlatformIOPackageException):
MESSAGE = ("Could not find a version that satisfies the requirement '{0}'" MESSAGE = (
" for your system '{1}'") "Could not find a version that satisfies the requirement '{0}'"
" for your system '{1}'"
)
class PackageInstallError(PlatformIOPackageException): class PackageInstallError(PlatformIOPackageException):
MESSAGE = ("Could not install '{0}' with version requirements '{1}' " MESSAGE = (
"for your system '{2}'.\n\n" "Could not install '{0}' with version requirements '{1}' "
"Please try this solution -> http://bit.ly/faq-package-manager") "for your system '{2}'.\n\n"
"Please try this solution -> http://bit.ly/faq-package-manager"
)
class ExtractArchiveItemError(PlatformIOPackageException): class ExtractArchiveItemError(PlatformIOPackageException):
MESSAGE = ( MESSAGE = (
"Could not extract `{0}` to `{1}`. Try to disable antivirus " "Could not extract `{0}` to `{1}`. Try to disable antivirus "
"tool or check this solution -> http://bit.ly/faq-package-manager") "tool or check this solution -> http://bit.ly/faq-package-manager"
)
class UnsupportedArchiveType(PlatformIOPackageException): class UnsupportedArchiveType(PlatformIOPackageException):
@@ -132,14 +139,17 @@ class FDUnrecognizedStatusCode(PlatformIOPackageException):
class FDSizeMismatch(PlatformIOPackageException): class FDSizeMismatch(PlatformIOPackageException):
MESSAGE = ("The size ({0:d} bytes) of downloaded file '{1}' " MESSAGE = (
"is not equal to remote size ({2:d} bytes)") "The size ({0:d} bytes) of downloaded file '{1}' "
"is not equal to remote size ({2:d} bytes)"
)
class FDSHASumMismatch(PlatformIOPackageException): class FDSHASumMismatch(PlatformIOPackageException):
MESSAGE = ("The 'sha1' sum '{0}' of downloaded file '{1}' " MESSAGE = (
"is not equal to remote '{2}'") "The 'sha1' sum '{0}' of downloaded file '{1}' is not equal to remote '{2}'"
)
# #
@@ -156,12 +166,13 @@ class NotPlatformIOProject(PlatformIOProjectException):
MESSAGE = ( MESSAGE = (
"Not a PlatformIO project. `platformio.ini` file has not been " "Not a PlatformIO project. `platformio.ini` file has not been "
"found in current working directory ({0}). To initialize new project " "found in current working directory ({0}). To initialize new project "
"please use `platformio init` command") "please use `platformio init` command"
)
class InvalidProjectConf(PlatformIOProjectException): class InvalidProjectConf(PlatformIOProjectException):
MESSAGE = ("Invalid '{0}' (project configuration file): '{1}'") MESSAGE = "Invalid '{0}' (project configuration file): '{1}'"
class UndefinedEnvPlatform(PlatformIOProjectException): class UndefinedEnvPlatform(PlatformIOProjectException):
@@ -191,9 +202,11 @@ class ProjectOptionValueError(PlatformIOProjectException):
class LibNotFound(PlatformioException): class LibNotFound(PlatformioException):
MESSAGE = ("Library `{0}` has not been found in PlatformIO Registry.\n" MESSAGE = (
"You can ignore this message, if `{0}` is a built-in library " "Library `{0}` has not been found in PlatformIO Registry.\n"
"(included in framework, SDK). E.g., SPI, Wire, etc.") "You can ignore this message, if `{0}` is a built-in library "
"(included in framework, SDK). E.g., SPI, Wire, etc."
)
class NotGlobalLibDir(UserSideException): class NotGlobalLibDir(UserSideException):
@@ -203,7 +216,8 @@ class NotGlobalLibDir(UserSideException):
"To manage libraries in global storage `{1}`,\n" "To manage libraries in global storage `{1}`,\n"
"please use `platformio lib --global {2}` or specify custom storage " "please use `platformio lib --global {2}` or specify custom storage "
"`platformio lib --storage-dir /path/to/storage/ {2}`.\n" "`platformio lib --storage-dir /path/to/storage/ {2}`.\n"
"Check `platformio lib --help` for details.") "Check `platformio lib --help` for details."
)
class InvalidLibConfURL(PlatformioException): class InvalidLibConfURL(PlatformioException):
@@ -224,7 +238,8 @@ class MissedUdevRules(InvalidUdevRules):
MESSAGE = ( MESSAGE = (
"Warning! Please install `99-platformio-udev.rules`. \nMode details: " "Warning! Please install `99-platformio-udev.rules`. \nMode details: "
"https://docs.platformio.org/en/latest/faq.html#platformio-udev-rules") "https://docs.platformio.org/en/latest/faq.html#platformio-udev-rules"
)
class OutdatedUdevRules(InvalidUdevRules): class OutdatedUdevRules(InvalidUdevRules):
@@ -232,7 +247,8 @@ class OutdatedUdevRules(InvalidUdevRules):
MESSAGE = ( MESSAGE = (
"Warning! Your `{0}` are outdated. Please update or reinstall them." "Warning! Your `{0}` are outdated. Please update or reinstall them."
"\n Mode details: https://docs.platformio.org" "\n Mode details: https://docs.platformio.org"
"/en/latest/faq.html#platformio-udev-rules") "/en/latest/faq.html#platformio-udev-rules"
)
# #
@@ -260,7 +276,8 @@ class InternetIsOffline(UserSideException):
MESSAGE = ( MESSAGE = (
"You are not connected to the Internet.\n" "You are not connected to the Internet.\n"
"If you build a project first time, we need Internet connection " "If you build a project first time, we need Internet connection "
"to install all dependencies and toolchains.") "to install all dependencies and toolchains."
)
class BuildScriptNotFound(PlatformioException): class BuildScriptNotFound(PlatformioException):
@@ -285,9 +302,11 @@ class InvalidJSONFile(PlatformioException):
class CIBuildEnvsEmpty(PlatformioException): class CIBuildEnvsEmpty(PlatformioException):
MESSAGE = ("Can't find PlatformIO build environments.\n" MESSAGE = (
"Please specify `--board` or path to `platformio.ini` with " "Can't find PlatformIO build environments.\n"
"predefined environments using `--project-conf` option") "Please specify `--board` or path to `platformio.ini` with "
"predefined environments using `--project-conf` option"
)
class UpgradeError(PlatformioException): class UpgradeError(PlatformioException):
@@ -307,13 +326,16 @@ class HomeDirPermissionsError(PlatformioException):
"current user and PlatformIO can not store configuration data.\n" "current user and PlatformIO can not store configuration data.\n"
"Please check the permissions and owner of that directory.\n" "Please check the permissions and owner of that directory.\n"
"Otherwise, please remove manually `{0}` directory and PlatformIO " "Otherwise, please remove manually `{0}` directory and PlatformIO "
"will create new from the current user.") "will create new from the current user."
)
class CygwinEnvDetected(PlatformioException): class CygwinEnvDetected(PlatformioException):
MESSAGE = ("PlatformIO does not work within Cygwin environment. " MESSAGE = (
"Use native Terminal instead.") "PlatformIO does not work within Cygwin environment. "
"Use native Terminal instead."
)
class DebugSupportError(PlatformioException): class DebugSupportError(PlatformioException):
@@ -322,7 +344,8 @@ class DebugSupportError(PlatformioException):
"Currently, PlatformIO does not support debugging for `{0}`.\n" "Currently, PlatformIO does not support debugging for `{0}`.\n"
"Please request support at https://github.com/platformio/" "Please request support at https://github.com/platformio/"
"platformio-core/issues \nor visit -> https://docs.platformio.org" "platformio-core/issues \nor visit -> https://docs.platformio.org"
"/page/plus/debugging.html") "/page/plus/debugging.html"
)
class DebugInvalidOptions(PlatformioException): class DebugInvalidOptions(PlatformioException):
@@ -331,8 +354,10 @@ class DebugInvalidOptions(PlatformioException):
class TestDirNotExists(PlatformioException): class TestDirNotExists(PlatformioException):
MESSAGE = "A test folder '{0}' does not exist.\nPlease create 'test' "\ MESSAGE = (
"directory in project's root and put a test set.\n"\ "A test folder '{0}' does not exist.\nPlease create 'test' "
"More details about Unit "\ "directory in project's root and put a test set.\n"
"Testing: http://docs.platformio.org/page/plus/"\ "More details about Unit "
"unit-testing.html" "Testing: http://docs.platformio.org/page/plus/"
"unit-testing.html"
)
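
These classes follow a message-template pattern; presumably the base PlatformioException formats MESSAGE with the constructor arguments. A standalone sketch of that pattern, where the base-class behaviour is an assumption rather than a copy of the real implementation:

    class PlatformioError(Exception):  # hypothetical stand-in for the real base class
        MESSAGE = None

        def __str__(self):
            if self.MESSAGE:
                return self.MESSAGE.format(*self.args)
            return super(PlatformioError, self).__str__()

    class UnknownBoardError(PlatformioError):
        MESSAGE = "Unknown board ID '{0}'"

    print(UnknownBoardError("uno_x"))  # -> Unknown board ID 'uno_x'
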
@@ -12,6 +12,7 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import io
import json import json
import os import os
import re import re
@@ -23,11 +24,10 @@ from glob import glob
import click import click
from platformio import exception from platformio import exception
from platformio.compat import WINDOWS, get_file_contents, glob_escape from platformio.compat import WINDOWS, glob_escape
class cd(object): class cd(object):
def __init__(self, new_path): def __init__(self, new_path):
self.new_path = new_path self.new_path = new_path
self.prev_path = os.getcwd() self.prev_path = os.getcwd()
@@ -49,6 +49,30 @@ def get_source_dir():
return os.path.dirname(curpath) return os.path.dirname(curpath)
def get_file_contents(path):
try:
with open(path) as fp:
return fp.read()
except UnicodeDecodeError:
click.secho(
"Unicode decode error has occurred, please remove invalid "
"(non-ASCII or non-UTF8) characters from %s file" % path,
fg="yellow",
err=True,
)
with io.open(path, encoding="latin-1") as fp:
return fp.read()
def write_file_contents(path, contents, errors=None):
try:
with open(path, "w") as fp:
return fp.write(contents)
except UnicodeEncodeError:
with io.open(path, "w", encoding="latin-1", errors=errors) as fp:
return fp.write(contents)
def load_json(file_path): def load_json(file_path):
try: try:
with open(file_path, "r") as f: with open(file_path, "r") as f:
@@ -65,34 +89,37 @@ def format_filesize(filesize):
if filesize < base: if filesize < base:
return "%d%s" % (filesize, suffix) return "%d%s" % (filesize, suffix)
for i, suffix in enumerate("KMGTPEZY"): for i, suffix in enumerate("KMGTPEZY"):
unit = base**(i + 2) unit = base ** (i + 2)
if filesize >= unit: if filesize >= unit:
continue continue
if filesize % (base**(i + 1)): if filesize % (base ** (i + 1)):
return "%.2f%sB" % ((base * filesize / unit), suffix) return "%.2f%sB" % ((base * filesize / unit), suffix)
break break
return "%d%sB" % ((base * filesize / unit), suffix) return "%d%sB" % ((base * filesize / unit), suffix)
def ensure_udev_rules(): def ensure_udev_rules():
from platformio.util import get_systype from platformio.util import get_systype # pylint: disable=import-outside-toplevel
def _rules_to_set(rules_path): def _rules_to_set(rules_path):
return set(l.strip() for l in get_file_contents(rules_path).split("\n") return set(
if l.strip() and not l.startswith("#")) l.strip()
for l in get_file_contents(rules_path).split("\n")
if l.strip() and not l.startswith("#")
)
if "linux" not in get_systype(): if "linux" not in get_systype():
return None return None
installed_rules = [ installed_rules = [
"/etc/udev/rules.d/99-platformio-udev.rules", "/etc/udev/rules.d/99-platformio-udev.rules",
"/lib/udev/rules.d/99-platformio-udev.rules" "/lib/udev/rules.d/99-platformio-udev.rules",
] ]
if not any(os.path.isfile(p) for p in installed_rules): if not any(os.path.isfile(p) for p in installed_rules):
raise exception.MissedUdevRules raise exception.MissedUdevRules
origin_path = os.path.abspath( origin_path = os.path.abspath(
os.path.join(get_source_dir(), "..", "scripts", os.path.join(get_source_dir(), "..", "scripts", "99-platformio-udev.rules")
"99-platformio-udev.rules")) )
if not os.path.isfile(origin_path): if not os.path.isfile(origin_path):
return None return None
@@ -117,7 +144,6 @@ def path_endswith_ext(path, extensions):
def match_src_files(src_dir, src_filter=None, src_exts=None): def match_src_files(src_dir, src_filter=None, src_exts=None):
def _append_build_item(items, item, src_dir): def _append_build_item(items, item, src_dir):
if not src_exts or path_endswith_ext(item, src_exts): if not src_exts or path_endswith_ext(item, src_exts):
items.add(item.replace(src_dir + os.sep, "")) items.add(item.replace(src_dir + os.sep, ""))
@@ -135,8 +161,7 @@ def match_src_files(src_dir, src_filter=None, src_exts=None):
if os.path.isdir(item): if os.path.isdir(item):
for root, _, files in os.walk(item, followlinks=True): for root, _, files in os.walk(item, followlinks=True):
for f in files: for f in files:
_append_build_item(items, os.path.join(root, f), _append_build_item(items, os.path.join(root, f), src_dir)
src_dir)
else: else:
_append_build_item(items, item, src_dir) _append_build_item(items, item, src_dir)
if action == "+": if action == "+":
@@ -152,8 +177,16 @@ def to_unix_path(path):
return re.sub(r"[\\]+", "/", path) return re.sub(r"[\\]+", "/", path)
def rmtree(path): def expanduser(path):
"""
Be compatible with Python 3.8, on Windows skip HOME and check for USERPROFILE
"""
if not WINDOWS or not path.startswith("~") or "USERPROFILE" not in os.environ:
return os.path.expanduser(path)
return os.environ["USERPROFILE"] + path[1:]
def rmtree(path):
def _onerror(func, path, __): def _onerror(func, path, __):
try: try:
st_mode = os.stat(path).st_mode st_mode = os.stat(path).st_mode
@@ -161,9 +194,10 @@ def rmtree(path):
os.chmod(path, st_mode | stat.S_IWRITE) os.chmod(path, st_mode | stat.S_IWRITE)
func(path) func(path)
except Exception as e: # pylint: disable=broad-except except Exception as e: # pylint: disable=broad-except
click.secho("%s \nPlease manually remove the file `%s`" % click.secho(
(str(e), path), "%s \nPlease manually remove the file `%s`" % (str(e), path),
fg="red", fg="red",
err=True) err=True,
)
return shutil.rmtree(path, onerror=_onerror) return shutil.rmtree(path, onerror=_onerror)
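
A brief usage sketch of the relocated helpers (get_file_contents, write_file_contents and the Windows-aware expanduser now live in platformio.fs); the file name is made up:

    from platformio import fs

    # Hypothetical round trip; the helpers fall back to latin-1 when the
    # default encoding cannot handle the contents.
    fs.write_file_contents("notes.txt", "temperature: 25 C\n")
    print(fs.get_file_contents("notes.txt"))

    # On Windows with Python 3.8, "~" expands via USERPROFILE when it is set.
    print(fs.expanduser("~/.platformio"))
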
@@ -12,28 +12,22 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import codecs import io
import os import os
import sys import sys
from os.path import abspath, basename, expanduser, isdir, isfile, join, relpath from os.path import abspath, basename, isdir, isfile, join, relpath
import bottle import bottle
from platformio import fs, util from platformio import fs, util
from platformio.compat import get_file_contents
from platformio.proc import where_is_program from platformio.proc import where_is_program
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
from platformio.project.helpers import (get_project_lib_dir, from platformio.project.helpers import load_project_ide_data
get_project_libdeps_dir,
get_project_src_dir,
load_project_ide_data)
class ProjectGenerator(object): class ProjectGenerator(object):
def __init__(self, project_dir, ide, boards): def __init__(self, project_dir, ide, boards):
self.config = ProjectConfig.get_instance( self.config = ProjectConfig.get_instance(join(project_dir, "platformio.ini"))
join(project_dir, "platformio.ini"))
self.config.validate() self.config.validate()
self.project_dir = project_dir self.project_dir = project_dir
self.ide = str(ide) self.ide = str(ide)
@@ -42,8 +36,7 @@ class ProjectGenerator(object):
@staticmethod @staticmethod
def get_supported_ides(): def get_supported_ides():
tpls_dir = join(fs.get_source_dir(), "ide", "tpls") tpls_dir = join(fs.get_source_dir(), "ide", "tpls")
return sorted( return sorted([d for d in os.listdir(tpls_dir) if isdir(join(tpls_dir, d))])
[d for d in os.listdir(tpls_dir) if isdir(join(tpls_dir, d))])
def get_best_envname(self, boards=None): def get_best_envname(self, boards=None):
envname = None envname = None
@@ -71,29 +64,30 @@ class ProjectGenerator(object):
"project_name": basename(self.project_dir), "project_name": basename(self.project_dir),
"project_dir": self.project_dir, "project_dir": self.project_dir,
"env_name": self.env_name, "env_name": self.env_name,
"user_home_dir": abspath(expanduser("~")), "user_home_dir": abspath(fs.expanduser("~")),
"platformio_path": "platformio_path": sys.argv[0]
sys.argv[0] if isfile(sys.argv[0]) if isfile(sys.argv[0])
else where_is_program("platformio"), else where_is_program("platformio"),
"env_path": os.getenv("PATH"), "env_path": os.getenv("PATH"),
"env_pathsep": os.pathsep "env_pathsep": os.pathsep,
} # yapf: disable }
# default env configuration # default env configuration
tpl_vars.update(self.config.items(env=self.env_name, as_dict=True)) tpl_vars.update(self.config.items(env=self.env_name, as_dict=True))
# build data # build data
tpl_vars.update( tpl_vars.update(load_project_ide_data(self.project_dir, self.env_name) or {})
load_project_ide_data(self.project_dir, self.env_name) or {})
with fs.cd(self.project_dir): with fs.cd(self.project_dir):
tpl_vars.update({ tpl_vars.update(
"src_files": self.get_src_files(), {
"project_src_dir": get_project_src_dir(), "src_files": self.get_src_files(),
"project_lib_dir": get_project_lib_dir(), "project_src_dir": self.config.get_optional_dir("src"),
"project_libdeps_dir": join( "project_lib_dir": self.config.get_optional_dir("lib"),
get_project_libdeps_dir(), self.env_name) "project_libdeps_dir": join(
self.config.get_optional_dir("libdeps"), self.env_name
}) # yapf: disable ),
}
)
for key, value in tpl_vars.items(): for key, value in tpl_vars.items():
if key.endswith(("_path", "_dir")): if key.endswith(("_path", "_dir")):
@@ -103,13 +97,13 @@ class ProjectGenerator(object):
continue continue
tpl_vars[key] = [fs.to_unix_path(inc) for inc in tpl_vars[key]] tpl_vars[key] = [fs.to_unix_path(inc) for inc in tpl_vars[key]]
tpl_vars['to_unix_path'] = fs.to_unix_path tpl_vars["to_unix_path"] = fs.to_unix_path
return tpl_vars return tpl_vars
def get_src_files(self): def get_src_files(self):
result = [] result = []
with fs.cd(self.project_dir): with fs.cd(self.project_dir):
for root, _, files in os.walk(get_project_src_dir()): for root, _, files in os.walk(self.config.get_optional_dir("src")):
for f in files: for f in files:
result.append(relpath(join(root, f))) result.append(relpath(join(root, f)))
return result return result
@@ -142,11 +136,11 @@ class ProjectGenerator(object):
@staticmethod @staticmethod
def _render_tpl(tpl_path, tpl_vars): def _render_tpl(tpl_path, tpl_vars):
return bottle.template(get_file_contents(tpl_path), **tpl_vars) return bottle.template(fs.get_file_contents(tpl_path), **tpl_vars)
@staticmethod @staticmethod
def _merge_contents(dst_path, contents): def _merge_contents(dst_path, contents):
if basename(dst_path) == ".gitignore" and isfile(dst_path): if basename(dst_path) == ".gitignore" and isfile(dst_path):
return return
with codecs.open(dst_path, "w", encoding="utf8") as fp: with io.open(dst_path, "w", encoding="utf8") as fp:
fp.write(contents) fp.write(contents)
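
Project file generation renders these templates with bottle's SimpleTemplate engine; a tiny self-contained sketch of that step (template text and variables are invented):

    import bottle

    tpl = "\n".join(
        [
            '% _defines = " ".join(["-D%s" % d for d in defines])',
            "CXX = {{ cxx_path }}",
            "FLAGS = {{ _defines }}",
        ]
    )

    print(
        bottle.template(
            tpl, cxx_path="/usr/bin/g++", defines=["F_CPU=16000000L", "ARDUINO=10805"]
        )
    )
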
@@ -1,4 +1,4 @@
% _defines = " ".join(["-D%s" % d for d in defines]) % _defines = " ".join(["-D%s" % d.replace(" ", "\\\\ ") for d in defines])
{ {
"execPath": "{{ cxx_path }}", "execPath": "{{ cxx_path }}",
"gccDefaultCFlags": "-fsyntax-only {{! cc_flags.replace(' -MMD ', ' ').replace('"', '\\"') }} {{ !_defines.replace('"', '\\"') }}", "gccDefaultCFlags": "-fsyntax-only {{! cc_flags.replace(' -MMD ', ' ').replace('"', '\\"') }} {{ !_defines.replace('"', '\\"') }}",
@@ -6,7 +6,7 @@
# The `CMakeListsUser.txt` will not be overwritten by PlatformIO. # The `CMakeListsUser.txt` will not be overwritten by PlatformIO.
cmake_minimum_required(VERSION 3.2) cmake_minimum_required(VERSION 3.2)
project({{project_name}}) project("{{project_name}}")
include(CMakeListsPrivate.txt) include(CMakeListsPrivate.txt)
@@ -62,6 +62,12 @@ add_custom_target(
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR} WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
) )
add_custom_target(
PLATFORMIO_BUILD_DEBUG ALL
COMMAND ${PLATFORMIO_CMD} -f -c clion run --target debug "$<$<NOT:$<CONFIG:All>>:-e${CMAKE_BUILD_TYPE}>"
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
)
add_custom_target( add_custom_target(
PLATFORMIO_UPDATE_ALL ALL PLATFORMIO_UPDATE_ALL ALL
COMMAND ${PLATFORMIO_CMD} -f -c clion update COMMAND ${PLATFORMIO_CMD} -f -c clion update
@@ -80,4 +86,4 @@ add_custom_target(
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR} WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
) )
add_executable(${PROJECT_NAME} ${SRC_LIST}) add_executable(Z_DUMMY_TARGET ${SRC_LIST})
@@ -31,6 +31,9 @@ set(CMAKE_CONFIGURATION_TYPES "{{ env_name }}" CACHE STRING "" FORCE)
% end % end
set(PLATFORMIO_CMD "{{ _normalize_path(platformio_path) }}") set(PLATFORMIO_CMD "{{ _normalize_path(platformio_path) }}")
% if svd_path:
set(SVD_PATH "{{ _normalize_path(svd_path) }}")
% end
SET(CMAKE_C_COMPILER "{{ _normalize_path(cc_path) }}") SET(CMAKE_C_COMPILER "{{ _normalize_path(cc_path) }}")
SET(CMAKE_CXX_COMPILER "{{ _normalize_path(cxx_path) }}") SET(CMAKE_CXX_COMPILER "{{ _normalize_path(cxx_path) }}")
@@ -57,8 +60,9 @@ if (CMAKE_BUILD_TYPE MATCHES "{{ env_name }}")
%end %end
endif() endif()
% leftover_envs = set(envs) ^ set([env_name]) % leftover_envs = list(set(envs) ^ set([env_name]))
% %
% ide_data = {}
% if leftover_envs: % if leftover_envs:
% ide_data = load_project_ide_data(project_dir, leftover_envs) % ide_data = load_project_ide_data(project_dir, leftover_envs)
% end % end
@@ -0,0 +1,22 @@
% import re
% STD_RE = re.compile(r"\-std=[a-z\+]+(\d+)")
% cc_stds = STD_RE.findall(cc_flags)
% cxx_stds = STD_RE.findall(cxx_flags)
%
%
clang
% if cc_stds:
{{"%c"}} -std=c{{ cc_stds[-1] }}
% end
% if cxx_stds:
{{"%cpp"}} -std=c++{{ cxx_stds[-1] }}
% end
% for include in includes:
-I{{ include }}
% end
% for define in defines:
-D{{ define }}
% end
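
The .ccls template above extracts the language standard from the compiler flags with a small regular expression; the same logic in plain Python, with made-up flag strings:

    import re

    STD_RE = re.compile(r"\-std=[a-z\+]+(\d+)")

    cc_flags = "-Os -std=gnu11 -Wall"
    cxx_flags = "-Os -std=gnu++14 -fno-exceptions"

    cc_stds = STD_RE.findall(cc_flags)    # ['11']
    cxx_stds = STD_RE.findall(cxx_flags)  # ['14']

    if cc_stds:
        print("%c   -std=c" + cc_stds[-1])     # -> "%c   -std=c11"
    if cxx_stds:
        print("%cpp -std=c++" + cxx_stds[-1])  # -> "%cpp -std=c++14"
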
@@ -1,9 +1,9 @@
% _defines = " ".join(["-D%s" % d for d in defines]) % _defines = " ".join(["-D%s" % d.replace(" ", "\\\\ ") for d in defines])
{ {
"execPath": "{{ cxx_path }}", "execPath": "{{ cxx_path }}",
"gccDefaultCFlags": "-fsyntax-only {{! cc_flags.replace(' -MMD ', ' ').replace('"', '\\"') }} {{ !_defines.replace('"', '\\"') }}", "gccDefaultCFlags": "-fsyntax-only {{! cc_flags.replace(' -MMD ', ' ').replace('"', '\\"') }} {{ !_defines.replace('"', '\\"') }}",
"gccDefaultCppFlags": "-fsyntax-only {{! cxx_flags.replace(' -MMD ', ' ').replace('"', '\\"') }} {{ !_defines.replace('"', '\\"') }}", "gccDefaultCppFlags": "-fsyntax-only {{! cxx_flags.replace(' -MMD ', ' ').replace('"', '\\"') }} {{ !_defines.replace('"', '\\"') }}",
"gccErrorLimit": 15, "gccErrorLimit": 15,
"gccIncludePaths": "{{! ','.join("'{}'".format(inc) for inc in includes)}}", "gccIncludePaths": "{{ ','.join(includes) }}",
"gccSuppressWarnings": false "gccSuppressWarnings": false
} }
@@ -1,4 +1,4 @@
<?xml version="1.0" encoding="utf-8"?> <?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003"> <Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ItemGroup> <ItemGroup>
<Filter Include="Source Files"> <Filter Include="Source Files">
@@ -1,4 +1,4 @@
<?xml version="1.0" encoding="utf-8"?> <?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="12.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003"> <Project DefaultTargets="Build" ToolsVersion="12.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup> <PropertyGroup>
<Path>{{env_path}}</Path> <Path>{{env_path}}</Path>
@@ -26,10 +26,12 @@ LOCKFILE_INTERFACE_MSVCRT = 2
try: try:
import fcntl import fcntl
LOCKFILE_CURRENT_INTERFACE = LOCKFILE_INTERFACE_FCNTL LOCKFILE_CURRENT_INTERFACE = LOCKFILE_INTERFACE_FCNTL
except ImportError: except ImportError:
try: try:
import msvcrt import msvcrt
LOCKFILE_CURRENT_INTERFACE = LOCKFILE_INTERFACE_MSVCRT LOCKFILE_CURRENT_INTERFACE = LOCKFILE_INTERFACE_MSVCRT
except ImportError: except ImportError:
LOCKFILE_CURRENT_INTERFACE = None LOCKFILE_CURRENT_INTERFACE = None
@@ -40,7 +42,6 @@ class LockFileExists(Exception):
class LockFile(object): class LockFile(object):
def __init__(self, path, timeout=LOCKFILE_TIMEOUT, delay=LOCKFILE_DELAY): def __init__(self, path, timeout=LOCKFILE_TIMEOUT, delay=LOCKFILE_DELAY):
self.timeout = timeout self.timeout = timeout
self.delay = delay self.delay = delay
@@ -49,12 +49,16 @@ def on_platformio_end(ctx, result): # pylint: disable=unused-argument
check_platformio_upgrade() check_platformio_upgrade()
check_internal_updates(ctx, "platforms") check_internal_updates(ctx, "platforms")
check_internal_updates(ctx, "libraries") check_internal_updates(ctx, "libraries")
except (exception.InternetIsOffline, exception.GetLatestVersionError, except (
exception.APIRequestError): exception.InternetIsOffline,
exception.GetLatestVersionError,
exception.APIRequestError,
):
click.secho( click.secho(
"Failed to check for PlatformIO upgrades. " "Failed to check for PlatformIO upgrades. "
"Please check your Internet connection.", "Please check your Internet connection.",
fg="red") fg="red",
)
def on_platformio_exception(e): def on_platformio_exception(e):
@@ -78,15 +82,17 @@ def set_caller(caller=None):
class Upgrader(object): class Upgrader(object):
def __init__(self, from_version, to_version): def __init__(self, from_version, to_version):
self.from_version = semantic_version.Version.coerce( self.from_version = semantic_version.Version.coerce(
util.pepver_to_semver(from_version)) util.pepver_to_semver(from_version)
)
self.to_version = semantic_version.Version.coerce( self.to_version = semantic_version.Version.coerce(
util.pepver_to_semver(to_version)) util.pepver_to_semver(to_version)
)
self._upgraders = [(semantic_version.Version("3.5.0-a.2"), self._upgraders = [
self._update_dev_platforms)] (semantic_version.Version("3.5.0-a.2"), self._update_dev_platforms)
]
def run(self, ctx): def run(self, ctx):
if self.from_version > self.to_version: if self.from_version > self.to_version:
@@ -114,19 +120,21 @@ def after_upgrade(ctx):
if last_version == "0.0.0": if last_version == "0.0.0":
app.set_state_item("last_version", __version__) app.set_state_item("last_version", __version__)
elif semantic_version.Version.coerce(util.pepver_to_semver( elif semantic_version.Version.coerce(
last_version)) > semantic_version.Version.coerce( util.pepver_to_semver(last_version)
util.pepver_to_semver(__version__)): ) > semantic_version.Version.coerce(util.pepver_to_semver(__version__)):
click.secho("*" * terminal_width, fg="yellow") click.secho("*" * terminal_width, fg="yellow")
click.secho("Obsolete PIO Core v%s is used (previous was %s)" % click.secho(
(__version__, last_version), "Obsolete PIO Core v%s is used (previous was %s)"
fg="yellow") % (__version__, last_version),
click.secho("Please remove multiple PIO Cores from a system:", fg="yellow",
fg="yellow") )
click.secho("Please remove multiple PIO Cores from a system:", fg="yellow")
click.secho( click.secho(
"https://docs.platformio.org/page/faq.html" "https://docs.platformio.org/page/faq.html"
"#multiple-pio-cores-in-a-system", "#multiple-pio-cores-in-a-system",
fg="cyan") fg="cyan",
)
click.secho("*" * terminal_width, fg="yellow") click.secho("*" * terminal_width, fg="yellow")
return return
else: else:
@@ -139,37 +147,53 @@ def after_upgrade(ctx):
u = Upgrader(last_version, __version__) u = Upgrader(last_version, __version__)
if u.run(ctx): if u.run(ctx):
app.set_state_item("last_version", __version__) app.set_state_item("last_version", __version__)
click.secho("PlatformIO has been successfully upgraded to %s!\n" % click.secho(
__version__, "PlatformIO has been successfully upgraded to %s!\n" % __version__,
fg="green") fg="green",
telemetry.on_event(category="Auto", )
action="Upgrade", telemetry.on_event(
label="%s > %s" % (last_version, __version__)) category="Auto",
action="Upgrade",
label="%s > %s" % (last_version, __version__),
)
else: else:
raise exception.UpgradeError("Auto upgrading...") raise exception.UpgradeError("Auto upgrading...")
click.echo("") click.echo("")
# PlatformIO banner # PlatformIO banner
click.echo("*" * terminal_width) click.echo("*" * terminal_width)
click.echo("If you like %s, please:" % click.echo("If you like %s, please:" % (click.style("PlatformIO", fg="cyan")))
(click.style("PlatformIO", fg="cyan")))
click.echo("- %s us on Twitter to stay up-to-date "
"on the latest project news > %s" %
(click.style("follow", fg="cyan"),
click.style("https://twitter.com/PlatformIO_Org", fg="cyan")))
click.echo( click.echo(
"- %s it on GitHub > %s" % "- %s us on Twitter to stay up-to-date "
(click.style("star", fg="cyan"), "on the latest project news > %s"
click.style("https://github.com/platformio/platformio", fg="cyan"))) % (
click.style("follow", fg="cyan"),
click.style("https://twitter.com/PlatformIO_Org", fg="cyan"),
)
)
click.echo(
"- %s it on GitHub > %s"
% (
click.style("star", fg="cyan"),
click.style("https://github.com/platformio/platformio", fg="cyan"),
)
)
if not getenv("PLATFORMIO_IDE"): if not getenv("PLATFORMIO_IDE"):
click.echo( click.echo(
"- %s PlatformIO IDE for IoT development > %s" % "- %s PlatformIO IDE for embedded development > %s"
(click.style("try", fg="cyan"), % (
click.style("https://platformio.org/platformio-ide", fg="cyan"))) click.style("try", fg="cyan"),
click.style("https://platformio.org/platformio-ide", fg="cyan"),
)
)
if not is_ci(): if not is_ci():
click.echo("- %s us with PlatformIO Plus > %s" % click.echo(
(click.style("support", fg="cyan"), "- %s us with PlatformIO Plus > %s"
click.style("https://pioplus.com", fg="cyan"))) % (
click.style("support", fg="cyan"),
click.style("https://pioplus.com", fg="cyan"),
)
)
click.echo("*" * terminal_width) click.echo("*" * terminal_width)
click.echo("") click.echo("")
@@ -181,7 +205,7 @@ def check_platformio_upgrade():
if (time() - interval) < last_check.get("platformio_upgrade", 0): if (time() - interval) < last_check.get("platformio_upgrade", 0):
return return
last_check['platformio_upgrade'] = int(time()) last_check["platformio_upgrade"] = int(time())
app.set_state_item("last_check", last_check) app.set_state_item("last_check", last_check)
util.internet_on(raise_exception=True) util.internet_on(raise_exception=True)
@@ -190,23 +214,23 @@ def check_platformio_upgrade():
update_core_packages(silent=True) update_core_packages(silent=True)
latest_version = get_latest_version() latest_version = get_latest_version()
if semantic_version.Version.coerce(util.pepver_to_semver( if semantic_version.Version.coerce(
latest_version)) <= semantic_version.Version.coerce( util.pepver_to_semver(latest_version)
util.pepver_to_semver(__version__)): ) <= semantic_version.Version.coerce(util.pepver_to_semver(__version__)):
return return
terminal_width, _ = click.get_terminal_size() terminal_width, _ = click.get_terminal_size()
click.echo("") click.echo("")
click.echo("*" * terminal_width) click.echo("*" * terminal_width)
click.secho("There is a new version %s of PlatformIO available.\n" click.secho(
"Please upgrade it via `" % latest_version, "There is a new version %s of PlatformIO available.\n"
fg="yellow", "Please upgrade it via `" % latest_version,
nl=False) fg="yellow",
nl=False,
)
if getenv("PLATFORMIO_IDE"): if getenv("PLATFORMIO_IDE"):
click.secho("PlatformIO IDE Menu: Upgrade PlatformIO", click.secho("PlatformIO IDE Menu: Upgrade PlatformIO", fg="cyan", nl=False)
fg="cyan",
nl=False)
click.secho("`.", fg="yellow") click.secho("`.", fg="yellow")
elif join("Cellar", "platformio") in fs.get_source_dir(): elif join("Cellar", "platformio") in fs.get_source_dir():
click.secho("brew update && brew upgrade", fg="cyan", nl=False) click.secho("brew update && brew upgrade", fg="cyan", nl=False)
@@ -217,8 +241,7 @@ def check_platformio_upgrade():
click.secho("pip install -U platformio", fg="cyan", nl=False) click.secho("pip install -U platformio", fg="cyan", nl=False)
click.secho("` command.", fg="yellow") click.secho("` command.", fg="yellow")
click.secho("Changes: ", fg="yellow", nl=False) click.secho("Changes: ", fg="yellow", nl=False)
click.secho("https://docs.platformio.org/en/latest/history.html", click.secho("https://docs.platformio.org/en/latest/history.html", fg="cyan")
fg="cyan")
click.echo("*" * terminal_width) click.echo("*" * terminal_width)
click.echo("") click.echo("")
@@ -229,7 +252,7 @@ def check_internal_updates(ctx, what):
if (time() - interval) < last_check.get(what + "_update", 0): if (time() - interval) < last_check.get(what + "_update", 0):
return return
last_check[what + '_update'] = int(time()) last_check[what + "_update"] = int(time())
app.set_state_item("last_check", last_check) app.set_state_item("last_check", last_check)
util.internet_on(raise_exception=True) util.internet_on(raise_exception=True)
@@ -237,15 +260,17 @@ def check_internal_updates(ctx, what):
pm = PlatformManager() if what == "platforms" else LibraryManager() pm = PlatformManager() if what == "platforms" else LibraryManager()
outdated_items = [] outdated_items = []
for manifest in pm.get_installed(): for manifest in pm.get_installed():
if manifest['name'] in outdated_items: if manifest["name"] in outdated_items:
continue continue
conds = [ conds = [
pm.outdated(manifest['__pkg_dir']), what == "platforms" pm.outdated(manifest["__pkg_dir"]),
what == "platforms"
and PlatformFactory.newPlatform( and PlatformFactory.newPlatform(
manifest['__pkg_dir']).are_outdated_packages() manifest["__pkg_dir"]
).are_outdated_packages(),
] ]
if any(conds): if any(conds):
outdated_items.append(manifest['name']) outdated_items.append(manifest["name"])
if not outdated_items: if not outdated_items:
return return
@@ -254,26 +279,32 @@ def check_internal_updates(ctx, what):
click.echo("") click.echo("")
click.echo("*" * terminal_width) click.echo("*" * terminal_width)
click.secho("There are the new updates for %s (%s)" % click.secho(
(what, ", ".join(outdated_items)), "There are the new updates for %s (%s)" % (what, ", ".join(outdated_items)),
fg="yellow") fg="yellow",
)
if not app.get_setting("auto_update_" + what): if not app.get_setting("auto_update_" + what):
click.secho("Please update them via ", fg="yellow", nl=False) click.secho("Please update them via ", fg="yellow", nl=False)
click.secho("`platformio %s update`" % click.secho(
("lib --global" if what == "libraries" else "platform"), "`platformio %s update`"
fg="cyan", % ("lib --global" if what == "libraries" else "platform"),
nl=False) fg="cyan",
nl=False,
)
click.secho(" command.\n", fg="yellow") click.secho(" command.\n", fg="yellow")
click.secho( click.secho(
"If you want to manually check for the new versions " "If you want to manually check for the new versions "
"without updating, please use ", "without updating, please use ",
fg="yellow", fg="yellow",
nl=False) nl=False,
click.secho("`platformio %s update --dry-run`" % )
("lib --global" if what == "libraries" else "platform"), click.secho(
fg="cyan", "`platformio %s update --dry-run`"
nl=False) % ("lib --global" if what == "libraries" else "platform"),
fg="cyan",
nl=False,
)
click.secho(" command.", fg="yellow") click.secho(" command.", fg="yellow")
else: else:
click.secho("Please wait while updating %s ..." % what, fg="yellow") click.secho("Please wait while updating %s ..." % what, fg="yellow")
@@ -284,9 +315,7 @@ def check_internal_updates(ctx, what):
ctx.invoke(cmd_lib_update, libraries=outdated_items) ctx.invoke(cmd_lib_update, libraries=outdated_items)
click.echo() click.echo()
telemetry.on_event(category="Auto", telemetry.on_event(category="Auto", action="Update", label=what.title())
action="Update",
label=what.title())
click.echo("*" * terminal_width) click.echo("*" * terminal_width)
click.echo("") click.echo("")

View File

@@ -21,15 +21,16 @@ from platformio import __version__, exception, fs
from platformio.compat import PY2, WINDOWS from platformio.compat import PY2, WINDOWS
from platformio.managers.package import PackageManager from platformio.managers.package import PackageManager
from platformio.proc import copy_pythonpath_to_osenv, get_pythonexe_path from platformio.proc import copy_pythonpath_to_osenv, get_pythonexe_path
from platformio.project.helpers import get_project_packages_dir from platformio.project.config import ProjectConfig
CORE_PACKAGES = { CORE_PACKAGES = {
"contrib-piohome": "^2.3.2", "contrib-piohome": "~3.0.0",
"contrib-pysite": "contrib-pysite": "~2.%d%d.0" % (sys.version_info[0], sys.version_info[1]),
"~2.%d%d.190418" % (sys.version_info[0], sys.version_info[1]), "tool-pioplus": "^2.5.8",
"tool-pioplus": "^2.5.2",
"tool-unity": "~1.20403.0", "tool-unity": "~1.20403.0",
"tool-scons": "~2.20501.7" if PY2 else "~3.30101.0" "tool-scons": "~2.20501.7" if PY2 else "~3.30101.0",
"tool-cppcheck": "~1.189.0",
"tool-clangtidy": "^1.80000.0",
} }
PIOPLUS_AUTO_UPDATES_MAX = 100 PIOPLUS_AUTO_UPDATES_MAX = 100
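
The requirement strings above ("~3.0.0", "^2.5.8", ...) are semver range specifications that the package manager later parses with semantic_version.SimpleSpec (see the get_package code further down in this diff). A rough illustration with hypothetical versions; the precise range boundaries are those implemented by the semantic_version library:

    import semantic_version

    home_spec = semantic_version.SimpleSpec("~3.0.0")  # roughly: >=3.0.0, <3.1.0
    plus_spec = semantic_version.SimpleSpec("^2.5.8")  # roughly: >=2.5.8, <3.0.0

    print(semantic_version.Version("3.0.4") in home_spec)  # True
    print(semantic_version.Version("3.1.0") in home_spec)  # False
    print(semantic_version.Version("2.6.0") in plus_spec)  # True
    print(semantic_version.Version("3.0.0") in plus_spec)  # False
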
@@ -38,20 +39,21 @@ PIOPLUS_AUTO_UPDATES_MAX = 100
class CorePackageManager(PackageManager): class CorePackageManager(PackageManager):
def __init__(self): def __init__(self):
super(CorePackageManager, self).__init__(get_project_packages_dir(), [ config = ProjectConfig.get_instance()
"https://dl.bintray.com/platformio/dl-packages/manifest.json", packages_dir = config.get_optional_dir("packages")
"http%s://dl.platformio.org/packages/manifest.json" % super(CorePackageManager, self).__init__(
("" if sys.version_info < (2, 7, 9) else "s") packages_dir,
]) [
"https://dl.bintray.com/platformio/dl-packages/manifest.json",
"http%s://dl.platformio.org/packages/manifest.json"
% ("" if sys.version_info < (2, 7, 9) else "s"),
],
)
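
Package, platform, and global-library directories are now resolved through ProjectConfig instead of the old platformio.project.helpers functions. A short usage sketch based on the calls visible in this commit, assuming a PlatformIO 4.x installation (the printed paths are machine-specific examples):

    from platformio.project.config import ProjectConfig

    config = ProjectConfig.get_instance()
    # Directory names used in this commit: "packages", "platforms", "globallib"
    print(config.get_optional_dir("packages"))   # e.g. ~/.platformio/packages
    print(config.get_optional_dir("platforms"))  # e.g. ~/.platformio/platforms
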
def install( # pylint: disable=keyword-arg-before-vararg def install( # pylint: disable=keyword-arg-before-vararg
self, self, name, requirements=None, *args, **kwargs
name, ):
requirements=None,
*args,
**kwargs):
PackageManager.install(self, name, requirements, *args, **kwargs) PackageManager.install(self, name, requirements, *args, **kwargs)
self.cleanup_packages() self.cleanup_packages()
return self.get_package_dir(name, requirements) return self.get_package_dir(name, requirements)
@@ -68,12 +70,12 @@ class CorePackageManager(PackageManager):
pkg_dir = self.get_package_dir(name, requirements) pkg_dir = self.get_package_dir(name, requirements)
if not pkg_dir: if not pkg_dir:
continue continue
best_pkg_versions[name] = self.load_manifest(pkg_dir)['version'] best_pkg_versions[name] = self.load_manifest(pkg_dir)["version"]
for manifest in self.get_installed(): for manifest in self.get_installed():
if manifest['name'] not in best_pkg_versions: if manifest["name"] not in best_pkg_versions:
continue continue
if manifest['version'] != best_pkg_versions[manifest['name']]: if manifest["version"] != best_pkg_versions[manifest["name"]]:
self.uninstall(manifest['__pkg_dir'], after_update=True) self.uninstall(manifest["__pkg_dir"], after_update=True)
self.cache_reset() self.cache_reset()
return True return True
@@ -101,7 +103,8 @@ def update_core_packages(only_check=False, silent=False):
def inject_contrib_pysite(): def inject_contrib_pysite():
from site import addsitedir from site import addsitedir # pylint: disable=import-outside-toplevel
contrib_pysite_dir = get_core_package_dir("contrib-pysite") contrib_pysite_dir = get_core_package_dir("contrib-pysite")
if contrib_pysite_dir in sys.path: if contrib_pysite_dir in sys.path:
return return
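
inject_contrib_pysite extends sys.path with the pre-built "contrib-pysite" package directory through the standard-library site.addsitedir. A tiny stand-alone illustration of that mechanism (the directory path here is hypothetical):

    import sys
    from site import addsitedir

    extra_dir = "/tmp/contrib-pysite"  # hypothetical directory with extra modules
    if extra_dir not in sys.path:
        # addsitedir() appends the directory to sys.path and also processes
        # any .pth files inside it, like a regular site-packages directory.
        addsitedir(extra_dir)
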
@@ -114,16 +117,18 @@ def pioplus_call(args, **kwargs):
raise exception.PlatformioException( raise exception.PlatformioException(
"PlatformIO Core Plus v%s does not run under Python version %s.\n" "PlatformIO Core Plus v%s does not run under Python version %s.\n"
"Minimum supported version is 2.7.6, please upgrade Python.\n" "Minimum supported version is 2.7.6, please upgrade Python.\n"
"Python 3 is not yet supported.\n" % (__version__, sys.version)) "Python 3 is not yet supported.\n" % (__version__, sys.version)
)
pioplus_path = join(get_core_package_dir("tool-pioplus"), "pioplus") pioplus_path = join(get_core_package_dir("tool-pioplus"), "pioplus")
pythonexe_path = get_pythonexe_path() pythonexe_path = get_pythonexe_path()
os.environ['PYTHONEXEPATH'] = pythonexe_path os.environ["PYTHONEXEPATH"] = pythonexe_path
os.environ['PYTHONPYSITEDIR'] = get_core_package_dir("contrib-pysite") os.environ["PYTHONPYSITEDIR"] = get_core_package_dir("contrib-pysite")
os.environ['PIOCOREPYSITEDIR'] = dirname(fs.get_source_dir() or "") os.environ["PIOCOREPYSITEDIR"] = dirname(fs.get_source_dir() or "")
if dirname(pythonexe_path) not in os.environ['PATH'].split(os.pathsep): if dirname(pythonexe_path) not in os.environ["PATH"].split(os.pathsep):
os.environ['PATH'] = (os.pathsep).join( os.environ["PATH"] = (os.pathsep).join(
[dirname(pythonexe_path), os.environ['PATH']]) [dirname(pythonexe_path), os.environ["PATH"]]
)
copy_pythonpath_to_osenv() copy_pythonpath_to_osenv()
code = subprocess.call([pioplus_path] + args, **kwargs) code = subprocess.call([pioplus_path] + args, **kwargs)

View File

@@ -16,7 +16,6 @@
# pylint: disable=too-many-return-statements # pylint: disable=too-many-return-statements
import json import json
import re
from glob import glob from glob import glob
from os.path import isdir, join from os.path import isdir, join
@@ -27,7 +26,7 @@ from platformio import app, exception, util
from platformio.compat import glob_escape, string_types from platformio.compat import glob_escape, string_types
from platformio.managers.package import BasePkgManager from platformio.managers.package import BasePkgManager
from platformio.managers.platform import PlatformFactory, PlatformManager from platformio.managers.platform import PlatformFactory, PlatformManager
from platformio.project.helpers import get_project_global_lib_dir from platformio.project.config import ProjectConfig
class LibraryManager(BasePkgManager): class LibraryManager(BasePkgManager):
@@ -35,16 +34,14 @@ class LibraryManager(BasePkgManager):
FILE_CACHE_VALID = "30d" # 1 month FILE_CACHE_VALID = "30d" # 1 month
def __init__(self, package_dir=None): def __init__(self, package_dir=None):
if not package_dir: self.config = ProjectConfig.get_instance()
package_dir = get_project_global_lib_dir() super(LibraryManager, self).__init__(
super(LibraryManager, self).__init__(package_dir) package_dir or self.config.get_optional_dir("globallib")
)
@property @property
def manifest_names(self): def manifest_names(self):
return [ return [".library.json", "library.json", "library.properties", "module.json"]
".library.json", "library.json", "library.properties",
"module.json"
]
def get_manifest_path(self, pkg_dir): def get_manifest_path(self, pkg_dir):
path = BasePkgManager.get_manifest_path(self, pkg_dir) path = BasePkgManager.get_manifest_path(self, pkg_dir)
@@ -64,75 +61,6 @@ class LibraryManager(BasePkgManager):
return None return None
def load_manifest(self, pkg_dir):
manifest = BasePkgManager.load_manifest(self, pkg_dir)
if not manifest:
return manifest
# if Arduino library.properties
if "sentence" in manifest:
manifest['frameworks'] = ["arduino"]
manifest['description'] = manifest['sentence']
del manifest['sentence']
if "author" in manifest:
if isinstance(manifest['author'], dict):
manifest['authors'] = [manifest['author']]
else:
manifest['authors'] = [{"name": manifest['author']}]
del manifest['author']
if "authors" in manifest and not isinstance(manifest['authors'], list):
manifest['authors'] = [manifest['authors']]
if "keywords" not in manifest:
keywords = []
for keyword in re.split(r"[\s/]+",
manifest.get("category", "Uncategorized")):
keyword = keyword.strip()
if not keyword:
continue
keywords.append(keyword.lower())
manifest['keywords'] = keywords
if "category" in manifest:
del manifest['category']
# don't replace VCS URL
if "url" in manifest and "description" in manifest:
manifest['homepage'] = manifest['url']
del manifest['url']
if "architectures" in manifest:
platforms = []
platforms_map = {
"avr": "atmelavr",
"sam": "atmelsam",
"samd": "atmelsam",
"esp8266": "espressif8266",
"esp32": "espressif32",
"arc32": "intel_arc32"
}
for arch in manifest['architectures'].split(","):
arch = arch.strip()
if arch == "*":
platforms = "*"
break
if arch in platforms_map:
platforms.append(platforms_map[arch])
manifest['platforms'] = platforms
del manifest['architectures']
# convert listed items via comma to array
for key in ("keywords", "frameworks", "platforms"):
if key not in manifest or \
not isinstance(manifest[key], string_types):
continue
manifest[key] = [
i.strip() for i in manifest[key].split(",") if i.strip()
]
return manifest
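
The block removed above normalized Arduino library.properties metadata into PlatformIO's manifest shape (sentence to description, author to authors, category to keywords, architectures to platforms); that responsibility appears to move into the new platformio.package.manifest.parser module introduced elsewhere in this commit. For reference, the transformation it performed on a made-up library looked roughly like this:

    # Fields as parsed from a hypothetical Arduino library.properties
    raw = {
        "name": "FooSensor",
        "version": "1.2.0",
        "author": "Jane Doe",
        "sentence": "Reads the hypothetical Foo sensor.",
        "category": "Sensors",
        "architectures": "avr, esp8266",
    }

    # Shape the removed code produced from it
    normalized = {
        "name": "FooSensor",
        "version": "1.2.0",
        "authors": [{"name": "Jane Doe"}],
        "description": "Reads the hypothetical Foo sensor.",
        "frameworks": ["arduino"],
        "keywords": ["sensors"],
        "platforms": ["atmelavr", "espressif8266"],
    }
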
@staticmethod @staticmethod
def normalize_dependencies(dependencies): def normalize_dependencies(dependencies):
if not dependencies: if not dependencies:
@@ -153,13 +81,10 @@ class LibraryManager(BasePkgManager):
if item[k] == "*": if item[k] == "*":
del item[k] del item[k]
elif isinstance(item[k], string_types): elif isinstance(item[k], string_types):
item[k] = [ item[k] = [i.strip() for i in item[k].split(",") if i.strip()]
i.strip() for i in item[k].split(",") if i.strip()
]
return items return items
def max_satisfying_repo_version(self, versions, requirements=None): def max_satisfying_repo_version(self, versions, requirements=None):
def _cmp_dates(datestr1, datestr2): def _cmp_dates(datestr1, datestr2):
date1 = util.parse_date(datestr1) date1 = util.parse_date(datestr1)
date2 = util.parse_date(datestr2) date2 = util.parse_date(datestr2)
@@ -169,61 +94,66 @@ class LibraryManager(BasePkgManager):
semver_spec = None semver_spec = None
try: try:
semver_spec = semantic_version.SimpleSpec( semver_spec = (
requirements) if requirements else None semantic_version.SimpleSpec(requirements) if requirements else None
)
except ValueError: except ValueError:
pass pass
item = {} item = {}
for v in versions: for v in versions:
semver_new = self.parse_semver_version(v['name']) semver_new = self.parse_semver_version(v["name"])
if semver_spec: if semver_spec:
if not semver_new or semver_new not in semver_spec: if not semver_new or semver_new not in semver_spec:
continue continue
if not item or self.parse_semver_version( if not item or self.parse_semver_version(item["name"]) < semver_new:
item['name']) < semver_new:
item = v item = v
elif requirements: elif requirements:
if requirements == v['name']: if requirements == v["name"]:
return v return v
else: else:
if not item or _cmp_dates(item['released'], if not item or _cmp_dates(item["released"], v["released"]) == -1:
v['released']) == -1:
item = v item = v
return item return item
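
max_satisfying_repo_version keeps the highest registry version whose name satisfies the semver requirement, falling back to an exact name match or, with no requirement, the newest release date. A compact sketch of the semver branch only, with an invented version list:

    import semantic_version

    def pick_best(versions, requirements):
        # "versions" mimics registry items that carry a semver string in "name"
        spec = semantic_version.SimpleSpec(requirements)
        best = None
        for v in versions:
            candidate = semantic_version.Version(v["name"])
            if candidate not in spec:
                continue
            if best is None or candidate > semantic_version.Version(best["name"]):
                best = v
        return best

    versions = [{"name": "6.9.1"}, {"name": "6.11.0"}, {"name": "7.0.0"}]
    print(pick_best(versions, "^6.0.0"))  # {'name': '6.11.0'}
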
def get_latest_repo_version(self, name, requirements, silent=False): def get_latest_repo_version(self, name, requirements, silent=False):
item = self.max_satisfying_repo_version( item = self.max_satisfying_repo_version(
util.get_api_result("/lib/info/%d" % self.search_lib_id( util.get_api_result(
{ "/lib/info/%d"
"name": name, % self.search_lib_id(
"requirements": requirements {"name": name, "requirements": requirements}, silent=silent
}, silent=silent), ),
cache_valid="1h")['versions'], requirements) cache_valid="1h",
return item['name'] if item else None )["versions"],
requirements,
)
return item["name"] if item else None
def _install_from_piorepo(self, name, requirements): def _install_from_piorepo(self, name, requirements):
assert name.startswith("id="), name assert name.startswith("id="), name
version = self.get_latest_repo_version(name, requirements) version = self.get_latest_repo_version(name, requirements)
if not version: if not version:
raise exception.UndefinedPackageVersion(requirements or "latest", raise exception.UndefinedPackageVersion(
util.get_systype()) requirements or "latest", util.get_systype()
dl_data = util.get_api_result("/lib/download/" + str(name[3:]), )
dict(version=version), dl_data = util.get_api_result(
cache_valid="30d") "/lib/download/" + str(name[3:]), dict(version=version), cache_valid="30d"
)
assert dl_data assert dl_data
return self._install_from_url( return self._install_from_url(
name, dl_data['url'].replace("http://", "https://") name,
if app.get_setting("strict_ssl") else dl_data['url'], requirements) dl_data["url"].replace("http://", "https://")
if app.get_setting("strict_ssl")
else dl_data["url"],
requirements,
)
def search_lib_id( # pylint: disable=too-many-branches def search_lib_id( # pylint: disable=too-many-branches
self, self, filters, silent=False, interactive=False
filters, ):
silent=False,
interactive=False):
assert isinstance(filters, dict) assert isinstance(filters, dict)
assert "name" in filters assert "name" in filters
@@ -234,8 +164,10 @@ class LibraryManager(BasePkgManager):
# looking in PIO Library Registry # looking in PIO Library Registry
if not silent: if not silent:
click.echo("Looking for %s library in registry" % click.echo(
click.style(filters['name'], fg="cyan")) "Looking for %s library in registry"
% click.style(filters["name"], fg="cyan")
)
query = [] query = []
for key in filters: for key in filters:
if key not in ("name", "authors", "frameworks", "platforms"): if key not in ("name", "authors", "frameworks", "platforms"):
@@ -244,25 +176,30 @@ class LibraryManager(BasePkgManager):
if not isinstance(values, list): if not isinstance(values, list):
values = [v.strip() for v in values.split(",") if v] values = [v.strip() for v in values.split(",") if v]
for value in values: for value in values:
query.append('%s:"%s"' % query.append(
(key[:-1] if key.endswith("s") else key, value)) '%s:"%s"' % (key[:-1] if key.endswith("s") else key, value)
)
lib_info = None lib_info = None
result = util.get_api_result("/v2/lib/search", result = util.get_api_result(
dict(query=" ".join(query)), "/v2/lib/search", dict(query=" ".join(query)), cache_valid="1h"
cache_valid="1h") )
if result['total'] == 1: if result["total"] == 1:
lib_info = result['items'][0] lib_info = result["items"][0]
elif result['total'] > 1: elif result["total"] > 1:
if silent and not interactive: if silent and not interactive:
lib_info = result['items'][0] lib_info = result["items"][0]
else: else:
click.secho("Conflict: More than one library has been found " click.secho(
"by request %s:" % json.dumps(filters), "Conflict: More than one library has been found "
fg="yellow", "by request %s:" % json.dumps(filters),
err=True) fg="yellow",
err=True,
)
# pylint: disable=import-outside-toplevel
from platformio.commands.lib import print_lib_item from platformio.commands.lib import print_lib_item
for item in result['items']:
for item in result["items"]:
print_lib_item(item) print_lib_item(item)
if not interactive: if not interactive:
@@ -270,36 +207,39 @@ class LibraryManager(BasePkgManager):
"Automatically chose the first available library " "Automatically chose the first available library "
"(use `--interactive` option to make a choice)", "(use `--interactive` option to make a choice)",
fg="yellow", fg="yellow",
err=True) err=True,
lib_info = result['items'][0] )
lib_info = result["items"][0]
else: else:
deplib_id = click.prompt("Please choose library ID", deplib_id = click.prompt(
type=click.Choice([ "Please choose library ID",
str(i['id']) type=click.Choice([str(i["id"]) for i in result["items"]]),
for i in result['items'] )
])) for item in result["items"]:
for item in result['items']: if item["id"] == int(deplib_id):
if item['id'] == int(deplib_id):
lib_info = item lib_info = item
break break
if not lib_info: if not lib_info:
if list(filters) == ["name"]: if list(filters) == ["name"]:
raise exception.LibNotFound(filters['name']) raise exception.LibNotFound(filters["name"])
raise exception.LibNotFound(str(filters)) raise exception.LibNotFound(str(filters))
if not silent: if not silent:
click.echo("Found: %s" % click.style( click.echo(
"https://platformio.org/lib/show/{id}/{name}".format( "Found: %s"
**lib_info), % click.style(
fg="blue")) "https://platformio.org/lib/show/{id}/{name}".format(**lib_info),
return int(lib_info['id']) fg="blue",
)
)
return int(lib_info["id"])
def _get_lib_id_from_installed(self, filters): def _get_lib_id_from_installed(self, filters):
if filters['name'].startswith("id="): if filters["name"].startswith("id="):
return int(filters['name'][3:]) return int(filters["name"][3:])
package_dir = self.get_package_dir( package_dir = self.get_package_dir(
filters['name'], filters.get("requirements", filters["name"], filters.get("requirements", filters.get("version"))
filters.get("version"))) )
if not package_dir: if not package_dir:
return None return None
manifest = self.load_manifest(package_dir) manifest = self.load_manifest(package_dir)
@@ -311,52 +251,55 @@ class LibraryManager(BasePkgManager):
continue continue
if key not in manifest: if key not in manifest:
return None return None
if not util.items_in_list(util.items_to_list(filters[key]), if not util.items_in_list(
util.items_to_list(manifest[key])): util.items_to_list(filters[key]), util.items_to_list(manifest[key])
):
return None return None
if "authors" in filters: if "authors" in filters:
if "authors" not in manifest: if "authors" not in manifest:
return None return None
manifest_authors = manifest['authors'] manifest_authors = manifest["authors"]
if not isinstance(manifest_authors, list): if not isinstance(manifest_authors, list):
manifest_authors = [manifest_authors] manifest_authors = [manifest_authors]
manifest_authors = [ manifest_authors = [
a['name'] for a in manifest_authors a["name"]
for a in manifest_authors
if isinstance(a, dict) and "name" in a if isinstance(a, dict) and "name" in a
] ]
filter_authors = filters['authors'] filter_authors = filters["authors"]
if not isinstance(filter_authors, list): if not isinstance(filter_authors, list):
filter_authors = [filter_authors] filter_authors = [filter_authors]
if not set(filter_authors) <= set(manifest_authors): if not set(filter_authors) <= set(manifest_authors):
return None return None
return int(manifest['id']) return int(manifest["id"])
def install( # pylint: disable=arguments-differ def install( # pylint: disable=arguments-differ
self, self,
name, name,
requirements=None, requirements=None,
silent=False, silent=False,
after_update=False, after_update=False,
interactive=False, interactive=False,
force=False): force=False,
):
_name, _requirements, _url = self.parse_pkg_uri(name, requirements) _name, _requirements, _url = self.parse_pkg_uri(name, requirements)
if not _url: if not _url:
name = "id=%d" % self.search_lib_id( name = "id=%d" % self.search_lib_id(
{ {"name": _name, "requirements": _requirements},
"name": _name,
"requirements": _requirements
},
silent=silent, silent=silent,
interactive=interactive) interactive=interactive,
)
requirements = _requirements requirements = _requirements
pkg_dir = BasePkgManager.install(self, pkg_dir = BasePkgManager.install(
name, self,
requirements, name,
silent=silent, requirements,
after_update=after_update, silent=silent,
force=force) after_update=after_update,
force=force,
)
if not pkg_dir: if not pkg_dir:
return None return None
@@ -369,7 +312,7 @@ class LibraryManager(BasePkgManager):
click.secho("Installing dependencies", fg="yellow") click.secho("Installing dependencies", fg="yellow")
builtin_lib_storages = None builtin_lib_storages = None
for filters in self.normalize_dependencies(manifest['dependencies']): for filters in self.normalize_dependencies(manifest["dependencies"]):
assert "name" in filters assert "name" in filters
# avoid circle dependencies # avoid circle dependencies
@@ -381,35 +324,42 @@ class LibraryManager(BasePkgManager):
self.INSTALL_HISTORY.append(history_key) self.INSTALL_HISTORY.append(history_key)
if any(s in filters.get("version", "") for s in ("\\", "/")): if any(s in filters.get("version", "") for s in ("\\", "/")):
self.install("{name}={version}".format(**filters), self.install(
silent=silent, "{name}={version}".format(**filters),
after_update=after_update, silent=silent,
interactive=interactive, after_update=after_update,
force=force) interactive=interactive,
force=force,
)
else: else:
try: try:
lib_id = self.search_lib_id(filters, silent, interactive) lib_id = self.search_lib_id(filters, silent, interactive)
except exception.LibNotFound as e: except exception.LibNotFound as e:
if builtin_lib_storages is None: if builtin_lib_storages is None:
builtin_lib_storages = get_builtin_libs() builtin_lib_storages = get_builtin_libs()
if not silent or is_builtin_lib(builtin_lib_storages, if not silent or is_builtin_lib(
filters['name']): builtin_lib_storages, filters["name"]
):
click.secho("Warning! %s" % e, fg="yellow") click.secho("Warning! %s" % e, fg="yellow")
continue continue
if filters.get("version"): if filters.get("version"):
self.install(lib_id, self.install(
filters.get("version"), lib_id,
silent=silent, filters.get("version"),
after_update=after_update, silent=silent,
interactive=interactive, after_update=after_update,
force=force) interactive=interactive,
force=force,
)
else: else:
self.install(lib_id, self.install(
silent=silent, lib_id,
after_update=after_update, silent=silent,
interactive=interactive, after_update=after_update,
force=force) interactive=interactive,
force=force,
)
return pkg_dir return pkg_dir
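
When installing a library's dependencies, a "version" field that contains a slash or backslash is treated as a direct URI and installed as "<name>=<uri>", while anything else goes through the registry lookup first. A small sketch of that branching with hypothetical dependencies:

    deps = [
        {"name": "FooLib", "version": "^1.2.0"},
        {"name": "BarLib", "version": "https://github.com/example/BarLib.git"},
    ]

    for dep in deps:
        version = dep.get("version", "")
        if any(s in version for s in ("\\", "/")):
            # URI-style requirement: install it directly as "<name>=<uri>"
            print("install %s=%s" % (dep["name"], version))
        else:
            # plain semver requirement: resolve the library ID in the registry first
            print("resolve %s @ %s in the registry" % (dep["name"], version))
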
@@ -418,21 +368,23 @@ def get_builtin_libs(storage_names=None):
storage_names = storage_names or [] storage_names = storage_names or []
pm = PlatformManager() pm = PlatformManager()
for manifest in pm.get_installed(): for manifest in pm.get_installed():
p = PlatformFactory.newPlatform(manifest['__pkg_dir']) p = PlatformFactory.newPlatform(manifest["__pkg_dir"])
for storage in p.get_lib_storages(): for storage in p.get_lib_storages():
if storage_names and storage['name'] not in storage_names: if storage_names and storage["name"] not in storage_names:
continue continue
lm = LibraryManager(storage['path']) lm = LibraryManager(storage["path"])
items.append({ items.append(
"name": storage['name'], {
"path": storage['path'], "name": storage["name"],
"items": lm.get_installed() "path": storage["path"],
}) "items": lm.get_installed(),
}
)
return items return items
def is_builtin_lib(storages, name): def is_builtin_lib(storages, name):
for storage in storages or []: for storage in storages or []:
if any(l.get("name") == name for l in storage['items']): if any(l.get("name") == name for l in storage["items"]):
return True return True
return False return False

View File

@@ -12,7 +12,6 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import codecs
import hashlib import hashlib
import json import json
import os import os
@@ -29,6 +28,10 @@ from platformio import __version__, app, exception, fs, telemetry, util
from platformio.compat import hashlib_encode_data from platformio.compat import hashlib_encode_data
from platformio.downloader import FileDownloader from platformio.downloader import FileDownloader
from platformio.lockfile import LockFile from platformio.lockfile import LockFile
from platformio.package.manifest.parser import (
ManifestParserError,
ManifestParserFactory,
)
from platformio.unpacker import FileUnpacker from platformio.unpacker import FileUnpacker
from platformio.vcsclient import VCSClientFactory from platformio.vcsclient import VCSClientFactory
@@ -36,7 +39,6 @@ from platformio.vcsclient import VCSClientFactory
class PackageRepoIterator(object): class PackageRepoIterator(object):
def __init__(self, package, repositories): def __init__(self, package, repositories):
assert isinstance(repositories, list) assert isinstance(repositories, list)
self.package = package self.package = package
@@ -77,7 +79,7 @@ class PkgRepoMixin(object):
@staticmethod @staticmethod
def is_system_compatible(valid_systems): def is_system_compatible(valid_systems):
if valid_systems in (None, "all", "*"): if not valid_systems or "*" in valid_systems:
return True return True
if not isinstance(valid_systems, list): if not isinstance(valid_systems, list):
valid_systems = list([valid_systems]) valid_systems = list([valid_systems])
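
The rewritten check above also accepts a list that contains a "*" wildcard. A self-contained sketch of the whole predicate; the tail of the method (the membership test against the current system type) is not shown in this hunk and is assumed here, and the systype strings are illustrative values in the util.get_systype() style:

    def is_system_compatible(valid_systems, systype="linux_x86_64"):
        # no restriction, or an explicit wildcard, means "runs everywhere"
        if not valid_systems or "*" in valid_systems:
            return True
        if not isinstance(valid_systems, list):
            valid_systems = [valid_systems]
        # assumed tail: the package is compatible if the current systype is listed
        return systype in valid_systems

    print(is_system_compatible(None))                               # True
    print(is_system_compatible(["*"]))                              # True
    print(is_system_compatible(["windows_amd64"]))                  # False
    print(is_system_compatible(["linux_x86_64", "darwin_x86_64"]))  # True
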
@@ -87,8 +89,9 @@ class PkgRepoMixin(object):
item = None item = None
reqspec = None reqspec = None
try: try:
reqspec = semantic_version.SimpleSpec( reqspec = (
requirements) if requirements else None semantic_version.SimpleSpec(requirements) if requirements else None
)
except ValueError: except ValueError:
pass pass
@@ -99,33 +102,32 @@ class PkgRepoMixin(object):
# if PkgRepoMixin.PIO_VERSION not in requirements.SimpleSpec( # if PkgRepoMixin.PIO_VERSION not in requirements.SimpleSpec(
# v['engines']['platformio']): # v['engines']['platformio']):
# continue # continue
specver = semantic_version.Version(v['version']) specver = semantic_version.Version(v["version"])
if reqspec and specver not in reqspec: if reqspec and specver not in reqspec:
continue continue
if not item or semantic_version.Version(item['version']) < specver: if not item or semantic_version.Version(item["version"]) < specver:
item = v item = v
return item return item
def get_latest_repo_version( # pylint: disable=unused-argument def get_latest_repo_version( # pylint: disable=unused-argument
self, self, name, requirements, silent=False
name, ):
requirements,
silent=False):
version = None version = None
for versions in PackageRepoIterator(name, self.repositories): for versions in PackageRepoIterator(name, self.repositories):
pkgdata = self.max_satisfying_repo_version(versions, requirements) pkgdata = self.max_satisfying_repo_version(versions, requirements)
if not pkgdata: if not pkgdata:
continue continue
if not version or semantic_version.compare(pkgdata['version'], if (
version) == 1: not version
version = pkgdata['version'] or semantic_version.compare(pkgdata["version"], version) == 1
):
version = pkgdata["version"]
return version return version
def get_all_repo_versions(self, name): def get_all_repo_versions(self, name):
result = [] result = []
for versions in PackageRepoIterator(name, self.repositories): for versions in PackageRepoIterator(name, self.repositories):
result.extend( result.extend([semantic_version.Version(v["version"]) for v in versions])
[semantic_version.Version(v['version']) for v in versions])
return [str(v) for v in sorted(set(result))] return [str(v) for v in sorted(set(result))]
@@ -154,7 +156,8 @@ class PkgInstallerMixin(object):
if result: if result:
return result return result
result = [ result = [
join(src_dir, name) for name in sorted(os.listdir(src_dir)) join(src_dir, name)
for name in sorted(os.listdir(src_dir))
if isdir(join(src_dir, name)) if isdir(join(src_dir, name))
] ]
self.cache_set(cache_key, result) self.cache_set(cache_key, result)
@@ -189,14 +192,17 @@ class PkgInstallerMixin(object):
click.secho( click.secho(
"Error: Please read http://bit.ly/package-manager-ioerror", "Error: Please read http://bit.ly/package-manager-ioerror",
fg="red", fg="red",
err=True) err=True,
)
raise e raise e
if sha1: if sha1:
fd.verify(sha1) fd.verify(sha1)
dst_path = fd.get_filepath() dst_path = fd.get_filepath()
if not self.FILE_CACHE_VALID or getsize( if (
dst_path) > PkgInstallerMixin.FILE_CACHE_MAX_SIZE: not self.FILE_CACHE_VALID
or getsize(dst_path) > PkgInstallerMixin.FILE_CACHE_MAX_SIZE
):
return dst_path return dst_path
with app.ContentCache() as cc: with app.ContentCache() as cc:
@@ -232,15 +238,15 @@ class PkgInstallerMixin(object):
return None return None
@staticmethod @staticmethod
def parse_pkg_uri( # pylint: disable=too-many-branches def parse_pkg_uri(text, requirements=None): # pylint: disable=too-many-branches
text, requirements=None):
text = str(text) text = str(text)
name, url = None, None name, url = None, None
# Parse requirements # Parse requirements
req_conditions = [ req_conditions = [
"@" in text, not requirements, ":" not in text "@" in text,
or text.rfind("/") < text.rfind("@") not requirements,
":" not in text or text.rfind("/") < text.rfind("@"),
] ]
if all(req_conditions): if all(req_conditions):
text, requirements = text.rsplit("@", 1) text, requirements = text.rsplit("@", 1)
@@ -259,17 +265,16 @@ class PkgInstallerMixin(object):
elif "/" in text or "\\" in text: elif "/" in text or "\\" in text:
git_conditions = [ git_conditions = [
# Handle GitHub URL (https://github.com/user/package) # Handle GitHub URL (https://github.com/user/package)
text.startswith("https://github.com/") and not text.endswith( text.startswith("https://github.com/")
(".zip", ".tar.gz")), and not text.endswith((".zip", ".tar.gz")),
(text.split("#", 1)[0] (text.split("#", 1)[0] if "#" in text else text).endswith(".git"),
if "#" in text else text).endswith(".git")
] ]
hg_conditions = [ hg_conditions = [
# Handle Developer Mbed URL # Handle Developer Mbed URL
# (https://developer.mbed.org/users/user/code/package/) # (https://developer.mbed.org/users/user/code/package/)
# (https://os.mbed.com/users/user/code/package/) # (https://os.mbed.com/users/user/code/package/)
text.startswith("https://developer.mbed.org"), text.startswith("https://developer.mbed.org"),
text.startswith("https://os.mbed.com") text.startswith("https://os.mbed.com"),
] ]
if any(git_conditions): if any(git_conditions):
url = "git+" + text url = "git+" + text
@@ -296,9 +301,9 @@ class PkgInstallerMixin(object):
@staticmethod @staticmethod
def get_install_dirname(manifest): def get_install_dirname(manifest):
name = re.sub(r"[^\da-z\_\-\. ]", "_", manifest['name'], flags=re.I) name = re.sub(r"[^\da-z\_\-\. ]", "_", manifest["name"], flags=re.I)
if "id" in manifest: if "id" in manifest:
name += "_ID%d" % manifest['id'] name += "_ID%d" % manifest["id"]
return str(name) return str(name)
@classmethod @classmethod
@@ -322,10 +327,9 @@ class PkgInstallerMixin(object):
return None return None
def manifest_exists(self, pkg_dir): def manifest_exists(self, pkg_dir):
return self.get_manifest_path(pkg_dir) or \ return self.get_manifest_path(pkg_dir) or self.get_src_manifest_path(pkg_dir)
self.get_src_manifest_path(pkg_dir)
def load_manifest(self, pkg_dir): def load_manifest(self, pkg_dir): # pylint: disable=too-many-branches
cache_key = "load_manifest-%s" % pkg_dir cache_key = "load_manifest-%s" % pkg_dir
result = self.cache_get(cache_key) result = self.cache_get(cache_key)
if result: if result:
@@ -341,31 +345,26 @@ class PkgInstallerMixin(object):
if not manifest_path and not src_manifest_path: if not manifest_path and not src_manifest_path:
return None return None
if manifest_path and manifest_path.endswith(".json"): try:
manifest = fs.load_json(manifest_path) manifest = ManifestParserFactory.new_from_file(manifest_path).as_dict()
elif manifest_path and manifest_path.endswith(".properties"): except ManifestParserError:
with codecs.open(manifest_path, encoding="utf-8") as fp: pass
for line in fp.readlines():
if "=" not in line:
continue
key, value = line.split("=", 1)
manifest[key.strip()] = value.strip()
if src_manifest: if src_manifest:
if "version" in src_manifest: if "version" in src_manifest:
manifest['version'] = src_manifest['version'] manifest["version"] = src_manifest["version"]
manifest['__src_url'] = src_manifest['url'] manifest["__src_url"] = src_manifest["url"]
# handle a custom package name # handle a custom package name
autogen_name = self.parse_pkg_uri(manifest['__src_url'])[0] autogen_name = self.parse_pkg_uri(manifest["__src_url"])[0]
if "name" not in manifest or autogen_name != src_manifest['name']: if "name" not in manifest or autogen_name != src_manifest["name"]:
manifest['name'] = src_manifest['name'] manifest["name"] = src_manifest["name"]
if "name" not in manifest: if "name" not in manifest:
manifest['name'] = basename(pkg_dir) manifest["name"] = basename(pkg_dir)
if "version" not in manifest: if "version" not in manifest:
manifest['version'] = "0.0.0" manifest["version"] = "0.0.0"
manifest['__pkg_dir'] = pkg_dir manifest["__pkg_dir"] = pkg_dir
self.cache_set(cache_key, manifest) self.cache_set(cache_key, manifest)
return manifest return manifest
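
load_manifest now delegates file parsing to the new manifest parser package instead of hand-rolling JSON and .properties handling. A minimal usage sketch based only on the calls visible in this diff (the path is hypothetical, and the comment about format detection is an assumption):

    from platformio.package.manifest.parser import (
        ManifestParserError,
        ManifestParserFactory,
    )

    manifest = {}
    try:
        # The factory is expected to pick a parser from the file name
        # (library.json, library.properties, module.json, package.json, ...)
        manifest = ManifestParserFactory.new_from_file("/path/to/library.properties").as_dict()
    except ManifestParserError:
        # fall back to whatever metadata is available elsewhere (see above)
        pass
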
@@ -390,25 +389,24 @@ class PkgInstallerMixin(object):
continue continue
elif pkg_id and manifest.get("id") != pkg_id: elif pkg_id and manifest.get("id") != pkg_id:
continue continue
elif not pkg_id and manifest['name'] != name: elif not pkg_id and manifest["name"] != name:
continue continue
elif not PkgRepoMixin.is_system_compatible(manifest.get("system")): elif not PkgRepoMixin.is_system_compatible(manifest.get("system")):
continue continue
# strict version or VCS HASH # strict version or VCS HASH
if requirements and requirements == manifest['version']: if requirements and requirements == manifest["version"]:
return manifest return manifest
try: try:
if requirements and not semantic_version.SimpleSpec( if requirements and not semantic_version.SimpleSpec(requirements).match(
requirements).match( self.parse_semver_version(manifest["version"], raise_exception=True)
self.parse_semver_version(manifest['version'], ):
raise_exception=True)):
continue continue
elif not best or (self.parse_semver_version( if not best or (
manifest['version'], raise_exception=True) > self.parse_semver_version(manifest["version"], raise_exception=True)
self.parse_semver_version( > self.parse_semver_version(best["version"], raise_exception=True)
best['version'], raise_exception=True)): ):
best = manifest best = manifest
except ValueError: except ValueError:
pass pass
@@ -417,12 +415,15 @@ class PkgInstallerMixin(object):
def get_package_dir(self, name, requirements=None, url=None): def get_package_dir(self, name, requirements=None, url=None):
manifest = self.get_package(name, requirements, url) manifest = self.get_package(name, requirements, url)
return manifest.get("__pkg_dir") if manifest and isdir( return (
manifest.get("__pkg_dir")) else None manifest.get("__pkg_dir")
if manifest and isdir(manifest.get("__pkg_dir"))
else None
)
def get_package_by_dir(self, pkg_dir): def get_package_by_dir(self, pkg_dir):
for manifest in self.get_installed(): for manifest in self.get_installed():
if manifest['__pkg_dir'] == abspath(pkg_dir): if manifest["__pkg_dir"] == abspath(pkg_dir):
return manifest return manifest
return None return None
@@ -443,9 +444,9 @@ class PkgInstallerMixin(object):
if not pkgdata: if not pkgdata:
continue continue
try: try:
pkg_dir = self._install_from_url(name, pkgdata['url'], pkg_dir = self._install_from_url(
requirements, name, pkgdata["url"], requirements, pkgdata.get("sha1")
pkgdata.get("sha1")) )
break break
except Exception as e: # pylint: disable=broad-except except Exception as e: # pylint: disable=broad-except
click.secho("Warning! Package Mirror: %s" % e, fg="yellow") click.secho("Warning! Package Mirror: %s" % e, fg="yellow")
@@ -455,16 +456,12 @@ class PkgInstallerMixin(object):
util.internet_on(raise_exception=True) util.internet_on(raise_exception=True)
raise exception.UnknownPackage(name) raise exception.UnknownPackage(name)
if not pkgdata: if not pkgdata:
raise exception.UndefinedPackageVersion(requirements or "latest", raise exception.UndefinedPackageVersion(
util.get_systype()) requirements or "latest", util.get_systype()
)
return pkg_dir return pkg_dir
def _install_from_url(self, def _install_from_url(self, name, url, requirements=None, sha1=None, track=False):
name,
url,
requirements=None,
sha1=None,
track=False):
tmp_dir = mkdtemp("-package", self.TMP_FOLDER_PREFIX, self.package_dir) tmp_dir = mkdtemp("-package", self.TMP_FOLDER_PREFIX, self.package_dir)
src_manifest_dir = None src_manifest_dir = None
src_manifest = {"name": name, "url": url, "requirements": requirements} src_manifest = {"name": name, "url": url, "requirements": requirements}
@@ -486,7 +483,7 @@ class PkgInstallerMixin(object):
vcs = VCSClientFactory.newClient(tmp_dir, url) vcs = VCSClientFactory.newClient(tmp_dir, url)
assert vcs.export() assert vcs.export()
src_manifest_dir = vcs.storage_dir src_manifest_dir = vcs.storage_dir
src_manifest['version'] = vcs.get_current_revision() src_manifest["version"] = vcs.get_current_revision()
_tmp_dir = tmp_dir _tmp_dir = tmp_dir
if not src_manifest_dir: if not src_manifest_dir:
@@ -515,7 +512,8 @@ class PkgInstallerMixin(object):
json.dump(_data, fp) json.dump(_data, fp)
def _install_from_tmp_dir( # pylint: disable=too-many-branches def _install_from_tmp_dir( # pylint: disable=too-many-branches
self, tmp_dir, requirements=None): self, tmp_dir, requirements=None
):
tmp_manifest = self.load_manifest(tmp_dir) tmp_manifest = self.load_manifest(tmp_dir)
assert set(["name", "version"]) <= set(tmp_manifest) assert set(["name", "version"]) <= set(tmp_manifest)
@@ -523,28 +521,30 @@ class PkgInstallerMixin(object):
pkg_dir = join(self.package_dir, pkg_dirname) pkg_dir = join(self.package_dir, pkg_dirname)
cur_manifest = self.load_manifest(pkg_dir) cur_manifest = self.load_manifest(pkg_dir)
tmp_semver = self.parse_semver_version(tmp_manifest['version']) tmp_semver = self.parse_semver_version(tmp_manifest["version"])
cur_semver = None cur_semver = None
if cur_manifest: if cur_manifest:
cur_semver = self.parse_semver_version(cur_manifest['version']) cur_semver = self.parse_semver_version(cur_manifest["version"])
# package should satisfy requirements # package should satisfy requirements
if requirements: if requirements:
mismatch_error = ( mismatch_error = "Package version %s doesn't satisfy requirements %s" % (
"Package version %s doesn't satisfy requirements %s" % tmp_manifest["version"],
(tmp_manifest['version'], requirements)) requirements,
)
try: try:
assert tmp_semver and tmp_semver in semantic_version.SimpleSpec( assert tmp_semver and tmp_semver in semantic_version.SimpleSpec(
requirements), mismatch_error requirements
), mismatch_error
except (AssertionError, ValueError): except (AssertionError, ValueError):
assert tmp_manifest['version'] == requirements, mismatch_error assert tmp_manifest["version"] == requirements, mismatch_error
# check if package already exists # check if package already exists
if cur_manifest: if cur_manifest:
# 0-overwrite, 1-rename, 2-fix to a version # 0-overwrite, 1-rename, 2-fix to a version
action = 0 action = 0
if "__src_url" in cur_manifest: if "__src_url" in cur_manifest:
if cur_manifest['__src_url'] != tmp_manifest.get("__src_url"): if cur_manifest["__src_url"] != tmp_manifest.get("__src_url"):
action = 1 action = 1
elif "__src_url" in tmp_manifest: elif "__src_url" in tmp_manifest:
action = 2 action = 2
@@ -556,25 +556,25 @@ class PkgInstallerMixin(object):
# rename # rename
if action == 1: if action == 1:
target_dirname = "%s@%s" % (pkg_dirname, target_dirname = "%s@%s" % (pkg_dirname, cur_manifest["version"])
cur_manifest['version'])
if "__src_url" in cur_manifest: if "__src_url" in cur_manifest:
target_dirname = "%s@src-%s" % ( target_dirname = "%s@src-%s" % (
pkg_dirname, pkg_dirname,
hashlib.md5( hashlib.md5(
hashlib_encode_data( hashlib_encode_data(cur_manifest["__src_url"])
cur_manifest['__src_url'])).hexdigest()) ).hexdigest(),
)
shutil.move(pkg_dir, join(self.package_dir, target_dirname)) shutil.move(pkg_dir, join(self.package_dir, target_dirname))
# fix to a version # fix to a version
elif action == 2: elif action == 2:
target_dirname = "%s@%s" % (pkg_dirname, target_dirname = "%s@%s" % (pkg_dirname, tmp_manifest["version"])
tmp_manifest['version'])
if "__src_url" in tmp_manifest: if "__src_url" in tmp_manifest:
target_dirname = "%s@src-%s" % ( target_dirname = "%s@src-%s" % (
pkg_dirname, pkg_dirname,
hashlib.md5( hashlib.md5(
hashlib_encode_data( hashlib_encode_data(tmp_manifest["__src_url"])
tmp_manifest['__src_url'])).hexdigest()) ).hexdigest(),
)
pkg_dir = join(self.package_dir, target_dirname) pkg_dir = join(self.package_dir, target_dirname)
# remove previous/not-satisfied package # remove previous/not-satisfied package
@@ -622,9 +622,9 @@ class BasePkgManager(PkgRepoMixin, PkgInstallerMixin):
if "__src_url" in manifest: if "__src_url" in manifest:
try: try:
vcs = VCSClientFactory.newClient(pkg_dir, vcs = VCSClientFactory.newClient(
manifest['__src_url'], pkg_dir, manifest["__src_url"], silent=True
silent=True) )
except (AttributeError, exception.PlatformioException): except (AttributeError, exception.PlatformioException):
return None return None
if not vcs.can_be_updated: if not vcs.can_be_updated:
@@ -633,10 +633,10 @@ class BasePkgManager(PkgRepoMixin, PkgInstallerMixin):
else: else:
try: try:
latest = self.get_latest_repo_version( latest = self.get_latest_repo_version(
"id=%d" % "id=%d" % manifest["id"] if "id" in manifest else manifest["name"],
manifest['id'] if "id" in manifest else manifest['name'],
requirements, requirements,
silent=True) silent=True,
)
except (exception.PlatformioException, ValueError): except (exception.PlatformioException, ValueError):
return None return None
@@ -646,21 +646,17 @@ class BasePkgManager(PkgRepoMixin, PkgInstallerMixin):
up_to_date = False up_to_date = False
try: try:
assert "__src_url" not in manifest assert "__src_url" not in manifest
up_to_date = (self.parse_semver_version(manifest['version'], up_to_date = self.parse_semver_version(
raise_exception=True) >= manifest["version"], raise_exception=True
self.parse_semver_version(latest, ) >= self.parse_semver_version(latest, raise_exception=True)
raise_exception=True))
except (AssertionError, ValueError): except (AssertionError, ValueError):
up_to_date = latest == manifest['version'] up_to_date = latest == manifest["version"]
return False if up_to_date else latest return False if up_to_date else latest
def install(self, def install(
name, self, name, requirements=None, silent=False, after_update=False, force=False
requirements=None, ):
silent=False,
after_update=False,
force=False):
pkg_dir = None pkg_dir = None
# interprocess lock # interprocess lock
with LockFile(self.package_dir): with LockFile(self.package_dir):
@@ -690,34 +686,38 @@ class BasePkgManager(PkgRepoMixin, PkgInstallerMixin):
if not silent: if not silent:
click.secho( click.secho(
"{name} @ {version} is already installed".format( "{name} @ {version} is already installed".format(
**self.load_manifest(package_dir)), **self.load_manifest(package_dir)
fg="yellow") ),
fg="yellow",
)
return package_dir return package_dir
if url: if url:
pkg_dir = self._install_from_url(name, pkg_dir = self._install_from_url(name, url, requirements, track=True)
url,
requirements,
track=True)
else: else:
pkg_dir = self._install_from_piorepo(name, requirements) pkg_dir = self._install_from_piorepo(name, requirements)
if not pkg_dir or not self.manifest_exists(pkg_dir): if not pkg_dir or not self.manifest_exists(pkg_dir):
raise exception.PackageInstallError(name, requirements or "*", raise exception.PackageInstallError(
util.get_systype()) name, requirements or "*", util.get_systype()
)
manifest = self.load_manifest(pkg_dir) manifest = self.load_manifest(pkg_dir)
assert manifest assert manifest
if not after_update: if not after_update:
telemetry.on_event(category=self.__class__.__name__, telemetry.on_event(
action="Install", category=self.__class__.__name__,
label=manifest['name']) action="Install",
label=manifest["name"],
)
click.secho( click.secho(
"{name} @ {version} has been successfully installed!".format( "{name} @ {version} has been successfully installed!".format(
**manifest), **manifest
fg="green") ),
fg="green",
)
return pkg_dir return pkg_dir
@@ -729,18 +729,20 @@ class BasePkgManager(PkgRepoMixin, PkgInstallerMixin):
if isdir(package) and self.get_package_by_dir(package): if isdir(package) and self.get_package_by_dir(package):
pkg_dir = package pkg_dir = package
else: else:
name, requirements, url = self.parse_pkg_uri( name, requirements, url = self.parse_pkg_uri(package, requirements)
package, requirements)
pkg_dir = self.get_package_dir(name, requirements, url) pkg_dir = self.get_package_dir(name, requirements, url)
if not pkg_dir: if not pkg_dir:
raise exception.UnknownPackage("%s @ %s" % raise exception.UnknownPackage(
(package, requirements or "*")) "%s @ %s" % (package, requirements or "*")
)
manifest = self.load_manifest(pkg_dir) manifest = self.load_manifest(pkg_dir)
click.echo("Uninstalling %s @ %s: \t" % (click.style( click.echo(
manifest['name'], fg="cyan"), manifest['version']), "Uninstalling %s @ %s: \t"
nl=False) % (click.style(manifest["name"], fg="cyan"), manifest["version"]),
nl=False,
)
if islink(pkg_dir): if islink(pkg_dir):
os.unlink(pkg_dir) os.unlink(pkg_dir)
@@ -749,19 +751,21 @@ class BasePkgManager(PkgRepoMixin, PkgInstallerMixin):
self.cache_reset() self.cache_reset()
# unfix package with the same name # unfix package with the same name
pkg_dir = self.get_package_dir(manifest['name']) pkg_dir = self.get_package_dir(manifest["name"])
if pkg_dir and "@" in pkg_dir: if pkg_dir and "@" in pkg_dir:
shutil.move( shutil.move(
pkg_dir, pkg_dir, join(self.package_dir, self.get_install_dirname(manifest))
join(self.package_dir, self.get_install_dirname(manifest))) )
self.cache_reset() self.cache_reset()
click.echo("[%s]" % click.style("OK", fg="green")) click.echo("[%s]" % click.style("OK", fg="green"))
if not after_update: if not after_update:
telemetry.on_event(category=self.__class__.__name__, telemetry.on_event(
action="Uninstall", category=self.__class__.__name__,
label=manifest['name']) action="Uninstall",
label=manifest["name"],
)
return True return True
@@ -773,16 +777,19 @@ class BasePkgManager(PkgRepoMixin, PkgInstallerMixin):
pkg_dir = self.get_package_dir(*self.parse_pkg_uri(package)) pkg_dir = self.get_package_dir(*self.parse_pkg_uri(package))
if not pkg_dir: if not pkg_dir:
raise exception.UnknownPackage("%s @ %s" % raise exception.UnknownPackage("%s @ %s" % (package, requirements or "*"))
(package, requirements or "*"))
manifest = self.load_manifest(pkg_dir) manifest = self.load_manifest(pkg_dir)
name = manifest['name'] name = manifest["name"]
click.echo("{} {:<40} @ {:<15}".format( click.echo(
"Checking" if only_check else "Updating", "{} {:<40} @ {:<15}".format(
click.style(manifest['name'], fg="cyan"), manifest['version']), "Checking" if only_check else "Updating",
nl=False) click.style(manifest["name"], fg="cyan"),
manifest["version"],
),
nl=False,
)
if not util.internet_on(): if not util.internet_on():
click.echo("[%s]" % (click.style("Off-line", fg="yellow"))) click.echo("[%s]" % (click.style("Off-line", fg="yellow")))
return None return None
@@ -799,22 +806,22 @@ class BasePkgManager(PkgRepoMixin, PkgInstallerMixin):
return True return True
if "__src_url" in manifest: if "__src_url" in manifest:
vcs = VCSClientFactory.newClient(pkg_dir, manifest['__src_url']) vcs = VCSClientFactory.newClient(pkg_dir, manifest["__src_url"])
assert vcs.update() assert vcs.update()
self._update_src_manifest(dict(version=vcs.get_current_revision()), self._update_src_manifest(
vcs.storage_dir) dict(version=vcs.get_current_revision()), vcs.storage_dir
)
else: else:
self.uninstall(pkg_dir, after_update=True) self.uninstall(pkg_dir, after_update=True)
self.install(name, latest, after_update=True) self.install(name, latest, after_update=True)
telemetry.on_event(category=self.__class__.__name__, telemetry.on_event(
action="Update", category=self.__class__.__name__, action="Update", label=manifest["name"]
label=manifest['name']) )
return True return True
class PackageManager(BasePkgManager): class PackageManager(BasePkgManager):
@property @property
def manifest_names(self): def manifest_names(self):
return ["package.json"] return ["package.json"]

View File

@@ -12,27 +12,22 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
# pylint: disable=too-many-public-methods, too-many-instance-attributes
import base64 import base64
import os import os
import re import re
import sys import sys
from imp import load_source
from os.path import basename, dirname, isdir, isfile, join from os.path import basename, dirname, isdir, isfile, join
import click import click
import semantic_version import semantic_version
from platformio import __version__, app, exception, fs, util from platformio import __version__, app, exception, fs, proc, util
from platformio.compat import PY2, hashlib_encode_data, is_bytes from platformio.compat import PY2, hashlib_encode_data, is_bytes, load_python_module
from platformio.managers.core import get_core_package_dir from platformio.managers.core import get_core_package_dir
from platformio.managers.package import BasePkgManager, PackageManager from platformio.managers.package import BasePkgManager, PackageManager
from platformio.proc import (BuildAsyncPipe, copy_pythonpath_to_osenv,
exec_command, get_pythonexe_path)
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
from platformio.project.helpers import (get_project_boards_dir,
get_project_core_dir,
get_project_packages_dir,
get_project_platforms_dir)
try: try:
from urllib.parse import quote from urllib.parse import quote
@@ -41,16 +36,18 @@ except ImportError:
class PlatformManager(BasePkgManager): class PlatformManager(BasePkgManager):
def __init__(self, package_dir=None, repositories=None): def __init__(self, package_dir=None, repositories=None):
if not repositories: if not repositories:
repositories = [ repositories = [
"https://dl.bintray.com/platformio/dl-platforms/manifest.json", "https://dl.bintray.com/platformio/dl-platforms/manifest.json",
"{0}://dl.platformio.org/platforms/manifest.json".format( "{0}://dl.platformio.org/platforms/manifest.json".format(
"https" if app.get_setting("strict_ssl") else "http") "https" if app.get_setting("strict_ssl") else "http"
),
] ]
BasePkgManager.__init__(self, package_dir self.config = ProjectConfig.get_instance()
or get_project_platforms_dir(), repositories) BasePkgManager.__init__(
self, package_dir or self.config.get_optional_dir("platforms"), repositories
)
@property @property
def manifest_names(self): def manifest_names(self):
@@ -65,21 +62,21 @@ class PlatformManager(BasePkgManager):
return manifest_path return manifest_path
return None return None
def install(self, def install(
name, self,
requirements=None, name,
with_packages=None, requirements=None,
without_packages=None, with_packages=None,
skip_default_package=False, without_packages=None,
after_update=False, skip_default_package=False,
silent=False, after_update=False,
force=False, silent=False,
**_): # pylint: disable=too-many-arguments, arguments-differ force=False,
platform_dir = BasePkgManager.install(self, **_
name, ): # pylint: disable=too-many-arguments, arguments-differ
requirements, platform_dir = BasePkgManager.install(
silent=silent, self, name, requirements, silent=silent, force=force
force=force) )
p = PlatformFactory.newPlatform(platform_dir) p = PlatformFactory.newPlatform(platform_dir)
# don't cleanup packages or install them after update # don't cleanup packages or install them after update
@@ -87,11 +84,13 @@ class PlatformManager(BasePkgManager):
if after_update: if after_update:
return True return True
p.install_packages(with_packages, p.install_packages(
without_packages, with_packages,
skip_default_package, without_packages,
silent=silent, skip_default_package,
force=force) silent=silent,
force=force,
)
return self.cleanup_packages(list(p.packages)) return self.cleanup_packages(list(p.packages))
def uninstall(self, package, requirements=None, after_update=False): def uninstall(self, package, requirements=None, after_update=False):
@@ -115,11 +114,8 @@ class PlatformManager(BasePkgManager):
return self.cleanup_packages(list(p.packages)) return self.cleanup_packages(list(p.packages))
def update( # pylint: disable=arguments-differ def update( # pylint: disable=arguments-differ
self, self, package, requirements=None, only_check=False, only_packages=False
package, ):
requirements=None,
only_check=False,
only_packages=False):
if isdir(package): if isdir(package):
pkg_dir = package pkg_dir = package
else: else:
@@ -143,8 +139,9 @@ class PlatformManager(BasePkgManager):
self.cleanup_packages(list(p.packages)) self.cleanup_packages(list(p.packages))
if missed_pkgs: if missed_pkgs:
p.install_packages(with_packages=list(missed_pkgs), p.install_packages(
skip_default_package=True) with_packages=list(missed_pkgs), skip_default_package=True
)
return True return True
@@ -152,20 +149,22 @@ class PlatformManager(BasePkgManager):
self.cache_reset() self.cache_reset()
deppkgs = {} deppkgs = {}
for manifest in PlatformManager().get_installed(): for manifest in PlatformManager().get_installed():
p = PlatformFactory.newPlatform(manifest['__pkg_dir']) p = PlatformFactory.newPlatform(manifest["__pkg_dir"])
for pkgname, pkgmanifest in p.get_installed_packages().items(): for pkgname, pkgmanifest in p.get_installed_packages().items():
if pkgname not in deppkgs: if pkgname not in deppkgs:
deppkgs[pkgname] = set() deppkgs[pkgname] = set()
deppkgs[pkgname].add(pkgmanifest['version']) deppkgs[pkgname].add(pkgmanifest["version"])
pm = PackageManager(get_project_packages_dir()) pm = PackageManager(self.config.get_optional_dir("packages"))
for manifest in pm.get_installed(): for manifest in pm.get_installed():
if manifest['name'] not in names: if manifest["name"] not in names:
continue continue
if (manifest['name'] not in deppkgs if (
or manifest['version'] not in deppkgs[manifest['name']]): manifest["name"] not in deppkgs
or manifest["version"] not in deppkgs[manifest["name"]]
):
try: try:
pm.uninstall(manifest['__pkg_dir'], after_update=True) pm.uninstall(manifest["__pkg_dir"], after_update=True)
except exception.UnknownPackage: except exception.UnknownPackage:
pass pass
@@ -176,7 +175,7 @@ class PlatformManager(BasePkgManager):
def get_installed_boards(self): def get_installed_boards(self):
boards = [] boards = []
for manifest in self.get_installed(): for manifest in self.get_installed():
p = PlatformFactory.newPlatform(manifest['__pkg_dir']) p = PlatformFactory.newPlatform(manifest["__pkg_dir"])
for config in p.get_boards().values(): for config in p.get_boards().values():
board = config.get_brief_data() board = config.get_brief_data()
if board not in boards: if board not in boards:
@@ -189,30 +188,31 @@ class PlatformManager(BasePkgManager):
def get_all_boards(self): def get_all_boards(self):
boards = self.get_installed_boards() boards = self.get_installed_boards()
know_boards = ["%s:%s" % (b['platform'], b['id']) for b in boards] know_boards = ["%s:%s" % (b["platform"], b["id"]) for b in boards]
try: try:
for board in self.get_registered_boards(): for board in self.get_registered_boards():
key = "%s:%s" % (board['platform'], board['id']) key = "%s:%s" % (board["platform"], board["id"])
if key not in know_boards: if key not in know_boards:
boards.append(board) boards.append(board)
except (exception.APIRequestError, exception.InternetIsOffline): except (exception.APIRequestError, exception.InternetIsOffline):
pass pass
return sorted(boards, key=lambda b: b['name']) return sorted(boards, key=lambda b: b["name"])
def board_config(self, id_, platform=None): def board_config(self, id_, platform=None):
for manifest in self.get_installed_boards(): for manifest in self.get_installed_boards():
if manifest['id'] == id_ and (not platform if manifest["id"] == id_ and (
or manifest['platform'] == platform): not platform or manifest["platform"] == platform
):
return manifest return manifest
for manifest in self.get_registered_boards(): for manifest in self.get_registered_boards():
if manifest['id'] == id_ and (not platform if manifest["id"] == id_ and (
or manifest['platform'] == platform): not platform or manifest["platform"] == platform
):
return manifest return manifest
raise exception.UnknownBoard(id_) raise exception.UnknownBoard(id_)
class PlatformFactory(object): class PlatformFactory(object):
@staticmethod @staticmethod
def get_clsname(name): def get_clsname(name):
name = re.sub(r"[^\da-z\_]+", "", name, flags=re.I) name = re.sub(r"[^\da-z\_]+", "", name, flags=re.I)
@@ -220,13 +220,10 @@ class PlatformFactory(object):
@staticmethod @staticmethod
def load_module(name, path): def load_module(name, path):
module = None
try: try:
module = load_source("platformio.managers.platform.%s" % name, return load_python_module("platformio.managers.platform.%s" % name, path)
path)
except ImportError: except ImportError:
raise exception.UnknownPlatform(name) raise exception.UnknownPlatform(name)
return module
@classmethod @classmethod
def newPlatform(cls, name, requirements=None): def newPlatform(cls, name, requirements=None):
@@ -234,28 +231,29 @@ class PlatformFactory(object):
platform_dir = None platform_dir = None
if isdir(name): if isdir(name):
platform_dir = name platform_dir = name
name = pm.load_manifest(platform_dir)['name'] name = pm.load_manifest(platform_dir)["name"]
elif name.endswith("platform.json") and isfile(name): elif name.endswith("platform.json") and isfile(name):
platform_dir = dirname(name) platform_dir = dirname(name)
name = fs.load_json(name)['name'] name = fs.load_json(name)["name"]
else: else:
name, requirements, url = pm.parse_pkg_uri(name, requirements) name, requirements, url = pm.parse_pkg_uri(name, requirements)
platform_dir = pm.get_package_dir(name, requirements, url) platform_dir = pm.get_package_dir(name, requirements, url)
if platform_dir: if platform_dir:
name = pm.load_manifest(platform_dir)['name'] name = pm.load_manifest(platform_dir)["name"]
if not platform_dir: if not platform_dir:
raise exception.UnknownPlatform( raise exception.UnknownPlatform(
name if not requirements else "%s@%s" % (name, requirements)) name if not requirements else "%s@%s" % (name, requirements)
)
platform_cls = None platform_cls = None
if isfile(join(platform_dir, "platform.py")): if isfile(join(platform_dir, "platform.py")):
platform_cls = getattr( platform_cls = getattr(
cls.load_module(name, join(platform_dir, "platform.py")), cls.load_module(name, join(platform_dir, "platform.py")),
cls.get_clsname(name)) cls.get_clsname(name),
)
else: else:
platform_cls = type(str(cls.get_clsname(name)), (PlatformBase, ), platform_cls = type(str(cls.get_clsname(name)), (PlatformBase,), {})
{})
_instance = platform_cls(join(platform_dir, "platform.json")) _instance = platform_cls(join(platform_dir, "platform.json"))
assert isinstance(_instance, PlatformBase) assert isinstance(_instance, PlatformBase)
@@ -263,14 +261,14 @@ class PlatformFactory(object):
class PlatformPackagesMixin(object): class PlatformPackagesMixin(object):
def install_packages( # pylint: disable=too-many-arguments def install_packages( # pylint: disable=too-many-arguments
self, self,
with_packages=None, with_packages=None,
without_packages=None, without_packages=None,
skip_default_package=False, skip_default_package=False,
silent=False, silent=False,
force=False): force=False,
):
with_packages = set(self.find_pkg_names(with_packages or [])) with_packages = set(self.find_pkg_names(with_packages or []))
without_packages = set(self.find_pkg_names(without_packages or [])) without_packages = set(self.find_pkg_names(without_packages or []))
@@ -283,12 +281,13 @@ class PlatformPackagesMixin(object):
version = opts.get("version", "") version = opts.get("version", "")
if name in without_packages: if name in without_packages:
continue continue
elif (name in with_packages or if name in with_packages or not (
not (skip_default_package or opts.get("optional", False))): skip_default_package or opts.get("optional", False)
):
if ":" in version: if ":" in version:
self.pm.install("%s=%s" % (name, version), self.pm.install(
silent=silent, "%s=%s" % (name, version), silent=silent, force=force
force=force) )
else: else:
self.pm.install(name, version, silent=silent, force=force) self.pm.install(name, version, silent=silent, force=force)
@@ -305,9 +304,12 @@ class PlatformPackagesMixin(object):
result.append(_name) result.append(_name)
found = True found = True
if (self.frameworks and candidate.startswith("framework-") if (
and candidate[10:] in self.frameworks): self.frameworks
result.append(self.frameworks[candidate[10:]]['package']) and candidate.startswith("framework-")
and candidate[10:] in self.frameworks
):
result.append(self.frameworks[candidate[10:]]["package"])
found = True found = True
if not found: if not found:
@@ -320,7 +322,7 @@ class PlatformPackagesMixin(object):
requirements = self.packages[name].get("version", "") requirements = self.packages[name].get("version", "")
if ":" in requirements: if ":" in requirements:
_, requirements, __ = self.pm.parse_pkg_uri(requirements) _, requirements, __ = self.pm.parse_pkg_uri(requirements)
self.pm.update(manifest['__pkg_dir'], requirements, only_check) self.pm.update(manifest["__pkg_dir"], requirements, only_check)
def get_installed_packages(self): def get_installed_packages(self):
items = {} items = {}
@@ -335,7 +337,7 @@ class PlatformPackagesMixin(object):
requirements = self.packages[name].get("version", "") requirements = self.packages[name].get("version", "")
if ":" in requirements: if ":" in requirements:
_, requirements, __ = self.pm.parse_pkg_uri(requirements) _, requirements, __ = self.pm.parse_pkg_uri(requirements)
if self.pm.outdated(manifest['__pkg_dir'], requirements): if self.pm.outdated(manifest["__pkg_dir"], requirements):
return True return True
return False return False
@@ -343,7 +345,8 @@ class PlatformPackagesMixin(object):
version = self.packages[name].get("version", "") version = self.packages[name].get("version", "")
if ":" in version: if ":" in version:
return self.pm.get_package_dir( return self.pm.get_package_dir(
*self.pm.parse_pkg_uri("%s=%s" % (name, version))) *self.pm.parse_pkg_uri("%s=%s" % (name, version))
)
return self.pm.get_package_dir(name, version) return self.pm.get_package_dir(name, version)
def get_package_version(self, name): def get_package_version(self, name):
@@ -368,15 +371,15 @@ class PlatformRunMixin(object):
return value.decode() if is_bytes(value) else value return value.decode() if is_bytes(value) else value
def run( # pylint: disable=too-many-arguments def run( # pylint: disable=too-many-arguments
self, variables, targets, silent, verbose, jobs): self, variables, targets, silent, verbose, jobs
):
assert isinstance(variables, dict) assert isinstance(variables, dict)
assert isinstance(targets, list) assert isinstance(targets, list)
config = ProjectConfig.get_instance(variables['project_config']) options = self.config.items(env=variables["pioenv"], as_dict=True)
options = config.items(env=variables['pioenv'], as_dict=True)
if "framework" in options: if "framework" in options:
# support PIO Core 3.0 dev/platforms # support PIO Core 3.0 dev/platforms
options['pioframework'] = options['framework'] options["pioframework"] = options["framework"]
self.configure_default_packages(options, targets) self.configure_default_packages(options, targets)
self.install_packages(silent=True) self.install_packages(silent=True)
@@ -386,12 +389,12 @@ class PlatformRunMixin(object):
if "clean" in targets: if "clean" in targets:
targets = ["-c", "."] targets = ["-c", "."]
variables['platform_manifest'] = self.manifest_path variables["platform_manifest"] = self.manifest_path
if "build_script" not in variables: if "build_script" not in variables:
variables['build_script'] = self.get_build_script() variables["build_script"] = self.get_build_script()
if not isfile(variables['build_script']): if not isfile(variables["build_script"]):
raise exception.BuildScriptNotFound(variables['build_script']) raise exception.BuildScriptNotFound(variables["build_script"])
result = self._run_scons(variables, targets, jobs) result = self._run_scons(variables, targets, jobs)
assert "returncode" in result assert "returncode" in result
@@ -400,16 +403,18 @@ class PlatformRunMixin(object):
def _run_scons(self, variables, targets, jobs): def _run_scons(self, variables, targets, jobs):
args = [ args = [
get_pythonexe_path(), proc.get_pythonexe_path(),
join(get_core_package_dir("tool-scons"), "script", "scons"), join(get_core_package_dir("tool-scons"), "script", "scons"),
"-Q", "--warn=no-no-parallel-support", "-Q",
"--jobs", str(jobs), "--warn=no-no-parallel-support",
"--sconstruct", join(fs.get_source_dir(), "builder", "main.py") "--jobs",
] # yapf: disable str(jobs),
"--sconstruct",
join(fs.get_source_dir(), "builder", "main.py"),
]
args.append("PIOVERBOSE=%d" % (1 if self.verbose else 0)) args.append("PIOVERBOSE=%d" % (1 if self.verbose else 0))
# pylint: disable=protected-access # pylint: disable=protected-access
args.append("ISATTY=%d" % args.append("ISATTY=%d" % (1 if click._compat.isatty(sys.stdout) else 0))
(1 if click._compat.isatty(sys.stdout) else 0))
args += targets args += targets
# encode and append variables # encode and append variables
@@ -423,15 +428,25 @@ class PlatformRunMixin(object):
except IOError: except IOError:
pass pass
copy_pythonpath_to_osenv() proc.copy_pythonpath_to_osenv()
result = exec_command( if click._compat.isatty(sys.stdout):
args, result = proc.exec_command(
stdout=BuildAsyncPipe( args,
line_callback=self._on_stdout_line, stdout=proc.BuildAsyncPipe(
data_callback=lambda data: _write_and_flush(sys.stdout, data)), line_callback=self._on_stdout_line,
stderr=BuildAsyncPipe( data_callback=lambda data: _write_and_flush(sys.stdout, data),
line_callback=self._on_stderr_line, ),
data_callback=lambda data: _write_and_flush(sys.stderr, data))) stderr=proc.BuildAsyncPipe(
line_callback=self._on_stderr_line,
data_callback=lambda data: _write_and_flush(sys.stderr, data),
),
)
else:
result = proc.exec_command(
args,
stdout=proc.LineBufferedAsyncPipe(line_callback=self._on_stdout_line),
stderr=proc.LineBufferedAsyncPipe(line_callback=self._on_stderr_line),
)
return result return result
def _on_stdout_line(self, line): def _on_stdout_line(self, line):
@@ -447,7 +462,7 @@ class PlatformRunMixin(object):
b_pos = line.rfind(": No such file or directory") b_pos = line.rfind(": No such file or directory")
if a_pos == -1 or b_pos == -1: if a_pos == -1 or b_pos == -1:
return return
self._echo_missed_dependency(line[a_pos + 12:b_pos].strip()) self._echo_missed_dependency(line[a_pos + 12 : b_pos].strip())
def _echo_line(self, line, level): def _echo_line(self, line, level):
if line.startswith("scons: "): if line.startswith("scons: "):
@@ -472,18 +487,20 @@ class PlatformRunMixin(object):
* Web > {link} * Web > {link}
* *
{dots} {dots}
""".format(filename=filename, """.format(
filename_styled=click.style(filename, fg="cyan"), filename=filename,
link=click.style( filename_styled=click.style(filename, fg="cyan"),
"https://platformio.org/lib/search?query=header:%s" % link=click.style(
quote(filename, safe=""), "https://platformio.org/lib/search?query=header:%s"
fg="blue"), % quote(filename, safe=""),
dots="*" * (56 + len(filename))) fg="blue",
),
dots="*" * (56 + len(filename)),
)
click.echo(banner, err=True) click.echo(banner, err=True)
class PlatformBase( # pylint: disable=too-many-public-methods class PlatformBase(PlatformPackagesMixin, PlatformRunMixin):
PlatformPackagesMixin, PlatformRunMixin):
PIO_VERSION = semantic_version.Version(util.pepver_to_semver(__version__)) PIO_VERSION = semantic_version.Version(util.pepver_to_semver(__version__))
_BOARDS_CACHE = {} _BOARDS_CACHE = {}
@@ -497,9 +514,10 @@ class PlatformBase( # pylint: disable=too-many-public-methods
self._manifest = fs.load_json(manifest_path) self._manifest = fs.load_json(manifest_path)
self._custom_packages = None self._custom_packages = None
self.pm = PackageManager(get_project_packages_dir(), self.config = ProjectConfig.get_instance()
self.package_repositories) self.pm = PackageManager(
self.config.get_optional_dir("packages"), self.package_repositories
)
# if self.engines and "platformio" in self.engines: # if self.engines and "platformio" in self.engines:
# if self.PIO_VERSION not in semantic_version.SimpleSpec( # if self.PIO_VERSION not in semantic_version.SimpleSpec(
# self.engines['platformio']): # self.engines['platformio']):
@@ -508,19 +526,19 @@ class PlatformBase( # pylint: disable=too-many-public-methods
@property @property
def name(self): def name(self):
return self._manifest['name'] return self._manifest["name"]
@property @property
def title(self): def title(self):
return self._manifest['title'] return self._manifest["title"]
@property @property
def description(self): def description(self):
return self._manifest['description'] return self._manifest["description"]
@property @property
def version(self): def version(self):
return self._manifest['version'] return self._manifest["version"]
@property @property
def homepage(self): def homepage(self):
@@ -561,7 +579,7 @@ class PlatformBase( # pylint: disable=too-many-public-methods
@property @property
def packages(self): def packages(self):
packages = self._manifest.get("packages", {}) packages = self._manifest.get("packages", {})
for item in (self._custom_packages or []): for item in self._custom_packages or []:
name = item name = item
version = "*" version = "*"
if "@" in item: if "@" in item:
@@ -569,10 +587,7 @@ class PlatformBase( # pylint: disable=too-many-public-methods
name = name.strip() name = name.strip()
if name not in packages: if name not in packages:
packages[name] = {} packages[name] = {}
packages[name].update({ packages[name].update({"version": version.strip(), "optional": False})
"version": version.strip(),
"optional": False
})
return packages return packages
def get_dir(self): def get_dir(self):
@@ -591,20 +606,18 @@ class PlatformBase( # pylint: disable=too-many-public-methods
return False return False
def get_boards(self, id_=None): def get_boards(self, id_=None):
def _append_board(board_id, manifest_path): def _append_board(board_id, manifest_path):
config = PlatformBoardConfig(manifest_path) config = PlatformBoardConfig(manifest_path)
if "platform" in config and config.get("platform") != self.name: if "platform" in config and config.get("platform") != self.name:
return return
if "platforms" in config \ if "platforms" in config and self.name not in config.get("platforms"):
and self.name not in config.get("platforms"):
return return
config.manifest['platform'] = self.name config.manifest["platform"] = self.name
self._BOARDS_CACHE[board_id] = config self._BOARDS_CACHE[board_id] = config
bdirs = [ bdirs = [
get_project_boards_dir(), self.config.get_optional_dir("boards"),
join(get_project_core_dir(), "boards"), join(self.config.get_optional_dir("core"), "boards"),
join(self.get_dir(), "boards"), join(self.get_dir(), "boards"),
] ]
@@ -649,28 +662,28 @@ class PlatformBase( # pylint: disable=too-many-public-methods
continue continue
_pkg_name = self.frameworks[framework].get("package") _pkg_name = self.frameworks[framework].get("package")
if _pkg_name: if _pkg_name:
self.packages[_pkg_name]['optional'] = False self.packages[_pkg_name]["optional"] = False
# enable upload tools for upload targets # enable upload tools for upload targets
if any(["upload" in t for t in targets] + ["program" in targets]): if any(["upload" in t for t in targets] + ["program" in targets]):
for name, opts in self.packages.items(): for name, opts in self.packages.items():
if opts.get("type") == "uploader": if opts.get("type") == "uploader":
self.packages[name]['optional'] = False self.packages[name]["optional"] = False
# skip all packages in "nobuild" mode # skip all packages in "nobuild" mode
# allow only upload tools and frameworks # allow only upload tools and frameworks
elif "nobuild" in targets and opts.get("type") != "framework": elif "nobuild" in targets and opts.get("type") != "framework":
self.packages[name]['optional'] = True self.packages[name]["optional"] = True
def get_lib_storages(self): def get_lib_storages(self):
storages = [] storages = {}
for opts in (self.frameworks or {}).values(): for opts in (self.frameworks or {}).values():
if "package" not in opts: if "package" not in opts:
continue continue
pkg_dir = self.get_package_dir(opts['package']) pkg_dir = self.get_package_dir(opts["package"])
if not pkg_dir or not isdir(join(pkg_dir, "libraries")): if not pkg_dir or not isdir(join(pkg_dir, "libraries")):
continue continue
libs_dir = join(pkg_dir, "libraries") libs_dir = join(pkg_dir, "libraries")
storages.append({"name": opts['package'], "path": libs_dir}) storages[libs_dir] = opts["package"]
libcores_dir = join(libs_dir, "__cores__") libcores_dir = join(libs_dir, "__cores__")
if not isdir(libcores_dir): if not isdir(libcores_dir):
continue continue
@@ -678,16 +691,12 @@ class PlatformBase( # pylint: disable=too-many-public-methods
libcore_dir = join(libcores_dir, item) libcore_dir = join(libcores_dir, item)
if not isdir(libcore_dir): if not isdir(libcore_dir):
continue continue
storages.append({ storages[libcore_dir] = "%s-core-%s" % (opts["package"], item)
"name": "%s-core-%s" % (opts['package'], item),
"path": libcore_dir
})
return storages return [dict(name=name, path=path) for path, name in storages.items()]
class PlatformBoardConfig(object): class PlatformBoardConfig(object):
def __init__(self, manifest_path): def __init__(self, manifest_path):
self._id = basename(manifest_path)[:-5] self._id = basename(manifest_path)[:-5]
assert isfile(manifest_path) assert isfile(manifest_path)
@@ -698,8 +707,8 @@ class PlatformBoardConfig(object):
raise exception.InvalidBoardManifest(manifest_path) raise exception.InvalidBoardManifest(manifest_path)
if not set(["name", "url", "vendor"]) <= set(self._manifest): if not set(["name", "url", "vendor"]) <= set(self._manifest):
raise exception.PlatformioException( raise exception.PlatformioException(
"Please specify name, url and vendor fields for " + "Please specify name, url and vendor fields for " + manifest_path
manifest_path) )
def get(self, path, default=None): def get(self, path, default=None):
try: try:
@@ -751,41 +760,33 @@ class PlatformBoardConfig(object):
def get_brief_data(self):
    return {
        "id": self.id,
        "name": self._manifest["name"],
        "platform": self._manifest.get("platform"),
        "mcu": self._manifest.get("build", {}).get("mcu", "").upper(),
        "fcpu": int(
            "".join(
                [
                    c
                    for c in str(self._manifest.get("build", {}).get("f_cpu", "0L"))
                    if c.isdigit()
                ]
            )
        ),
        "ram": self._manifest.get("upload", {}).get("maximum_ram_size", 0),
        "rom": self._manifest.get("upload", {}).get("maximum_size", 0),
        "connectivity": self._manifest.get("connectivity"),
        "frameworks": self._manifest.get("frameworks"),
        "debug": self.get_debug_data(),
        "vendor": self._manifest["vendor"],
        "url": self._manifest["url"],
    }
def get_debug_data(self): def get_debug_data(self):
if not self._manifest.get("debug", {}).get("tools"): if not self._manifest.get("debug", {}).get("tools"):
return None return None
tools = {} tools = {}
for name, options in self._manifest['debug']['tools'].items(): for name, options in self._manifest["debug"]["tools"].items():
tools[name] = {} tools[name] = {}
for key, value in options.items(): for key, value in options.items():
if key in ("default", "onboard"): if key in ("default", "onboard"):
@@ -798,22 +799,23 @@ class PlatformBoardConfig(object):
if tool_name == "custom": if tool_name == "custom":
return tool_name return tool_name
if not debug_tools: if not debug_tools:
raise exception.DebugSupportError(self._manifest['name']) raise exception.DebugSupportError(self._manifest["name"])
if tool_name: if tool_name:
if tool_name in debug_tools: if tool_name in debug_tools:
return tool_name return tool_name
raise exception.DebugInvalidOptions( raise exception.DebugInvalidOptions(
"Unknown debug tool `%s`. Please use one of `%s` or `custom`" % "Unknown debug tool `%s`. Please use one of `%s` or `custom`"
(tool_name, ", ".join(sorted(list(debug_tools))))) % (tool_name, ", ".join(sorted(list(debug_tools))))
)
# automatically select best tool # automatically select best tool
data = {"default": [], "onboard": [], "external": []} data = {"default": [], "onboard": [], "external": []}
for key, value in debug_tools.items(): for key, value in debug_tools.items():
if value.get("default"): if value.get("default"):
data['default'].append(key) data["default"].append(key)
elif value.get("onboard"): elif value.get("onboard"):
data['onboard'].append(key) data["onboard"].append(key)
data['external'].append(key) data["external"].append(key)
for key, value in data.items(): for key, value in data.items():
if not value: if not value:

View File

@@ -0,0 +1,13 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

View File

@@ -0,0 +1,36 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio.exception import PlatformioException
class ManifestException(PlatformioException):
pass
class ManifestParserError(ManifestException):
pass
class ManifestValidationError(ManifestException):
def __init__(self, error, data):
super(ManifestValidationError, self).__init__()
self.error = error
self.data = data
def __str__(self):
return (
"Invalid manifest fields: %s. \nPlease check specification -> "
"http://docs.platformio.org/page/librarymanager/config.html" % self.error
)

View File

@@ -0,0 +1,13 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

View File

@@ -0,0 +1,553 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import json
import os
import re
import requests
from platformio.compat import get_class_attributes, string_types
from platformio.fs import get_file_contents
from platformio.package.exception import ManifestParserError
from platformio.project.helpers import is_platformio_project
try:
from urllib.parse import urlparse
except ImportError:
from urlparse import urlparse
class ManifestFileType(object):
PLATFORM_JSON = "platform.json"
LIBRARY_JSON = "library.json"
LIBRARY_PROPERTIES = "library.properties"
MODULE_JSON = "module.json"
PACKAGE_JSON = "package.json"
@classmethod
def from_uri(cls, uri):
if uri.endswith(".properties"):
return ManifestFileType.LIBRARY_PROPERTIES
if uri.endswith("platform.json"):
return ManifestFileType.PLATFORM_JSON
if uri.endswith("module.json"):
return ManifestFileType.MODULE_JSON
if uri.endswith("package.json"):
return ManifestFileType.PACKAGE_JSON
if uri.endswith("library.json"):
return ManifestFileType.LIBRARY_JSON
return None
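# Illustrative sketch (not part of the original commit): from_uri() keys off the
# tail of a download URI, so a URL ending with a known manifest file name resolves
# to the matching type, while anything else yields None. The URLs are hypothetical.
#
#   >>> ManifestFileType.from_uri("https://example.com/lib/library.properties")
#   'library.properties'
#   >>> ManifestFileType.from_uri("https://example.com/lib/archive.zip") is None
#   True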
class ManifestParserFactory(object):
@staticmethod
def type_to_clsname(t):
t = t.replace(".", " ")
t = t.title()
return "%sManifestParser" % t.replace(" ", "")
@staticmethod
def new_from_file(path, remote_url=False):
if not path or not os.path.isfile(path):
raise ManifestParserError("Manifest file does not exist: %s" % path)
for t in get_class_attributes(ManifestFileType).values():
if path.endswith(t):
return ManifestParserFactory.new(get_file_contents(path), t, remote_url)
raise ManifestParserError("Unknown manifest file type %s" % path)
@staticmethod
def new_from_dir(path, remote_url=None):
assert os.path.isdir(path), "Invalid directory %s" % path
type_from_uri = ManifestFileType.from_uri(remote_url) if remote_url else None
if type_from_uri and os.path.isfile(os.path.join(path, type_from_uri)):
return ManifestParserFactory.new(
get_file_contents(os.path.join(path, type_from_uri)),
type_from_uri,
remote_url=remote_url,
package_dir=path,
)
file_order = [
ManifestFileType.PLATFORM_JSON,
ManifestFileType.LIBRARY_JSON,
ManifestFileType.LIBRARY_PROPERTIES,
ManifestFileType.MODULE_JSON,
ManifestFileType.PACKAGE_JSON,
]
for t in file_order:
if not os.path.isfile(os.path.join(path, t)):
continue
return ManifestParserFactory.new(
get_file_contents(os.path.join(path, t)),
t,
remote_url=remote_url,
package_dir=path,
)
raise ManifestParserError("Unknown manifest file type in %s directory" % path)
@staticmethod
def new_from_url(remote_url):
r = requests.get(remote_url)
r.raise_for_status()
return ManifestParserFactory.new(
r.text,
ManifestFileType.from_uri(remote_url) or ManifestFileType.LIBRARY_JSON,
remote_url,
)
@staticmethod
def new(contents, type, remote_url=None, package_dir=None):
# pylint: disable=redefined-builtin
clsname = ManifestParserFactory.type_to_clsname(type)
if clsname not in globals():
raise ManifestParserError("Unknown manifest file type %s" % clsname)
return globals()[clsname](contents, remote_url, package_dir)
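# A minimal usage sketch (assumed, not from this commit): the factory maps a
# manifest type to a parser class name and dispatches to it, so parsing an
# in-memory "library.json" payload looks roughly like this (payload is made up).
#
#   >>> ManifestParserFactory.type_to_clsname(ManifestFileType.LIBRARY_JSON)
#   'LibraryJsonManifestParser'
#   >>> ManifestParserFactory.new(
#   ...     '{"name": "FooLib", "version": "1.0.0"}', ManifestFileType.LIBRARY_JSON
#   ... ).as_dict()
#   {'name': 'FooLib', 'version': '1.0.0'}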
class BaseManifestParser(object):
def __init__(self, contents, remote_url=None, package_dir=None):
self.remote_url = remote_url
self.package_dir = package_dir
try:
self._data = self.parse(contents)
except Exception as e:
raise ManifestParserError("Could not parse manifest -> %s" % e)
self._data = self.parse_examples(self._data)
# remove None fields
for key in list(self._data.keys()):
if self._data[key] is None:
del self._data[key]
def parse(self, contents):
raise NotImplementedError
def as_dict(self):
return self._data
@staticmethod
def cleanup_author(author):
assert isinstance(author, dict)
if author.get("email"):
author["email"] = re.sub(r"\s+[aA][tT]\s+", "@", author["email"])
for key in list(author.keys()):
if author[key] is None:
del author[key]
return author
@staticmethod
def parse_author_name_and_email(raw):
if raw == "None" or "://" in raw:
return (None, None)
name = raw
email = None
for ldel, rdel in [("<", ">"), ("(", ")")]:
if ldel in raw and rdel in raw:
name = raw[: raw.index(ldel)]
email = raw[raw.index(ldel) + 1 : raw.index(rdel)]
return (name.strip(), email.strip() if email else None)
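# Example of the helper above (hypothetical input): both "Name <email>" and
# "Name (email)" forms are split into a (name, email) tuple.
#
#   >>> BaseManifestParser.parse_author_name_and_email("Jane Doe <jane@example.com>")
#   ('Jane Doe', 'jane@example.com')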
def parse_examples(self, data):
examples = data.get("examples")
if (
not examples
or not isinstance(examples, list)
or not all(isinstance(v, dict) for v in examples)
):
examples = None
if not examples and self.package_dir:
data["examples"] = self.parse_examples_from_dir(self.package_dir)
if "examples" in data and not data["examples"]:
del data["examples"]
return data
@staticmethod
def parse_examples_from_dir(package_dir):
assert os.path.isdir(package_dir)
examples_dir = os.path.join(package_dir, "examples")
if not os.path.isdir(examples_dir):
examples_dir = os.path.join(package_dir, "Examples")
if not os.path.isdir(examples_dir):
return None
allowed_exts = (
".c",
".cc",
".cpp",
".h",
".hpp",
".asm",
".ASM",
".s",
".S",
".ino",
".pde",
)
result = {}
last_pio_project = None
for root, _, files in os.walk(examples_dir):
# skip hidden files, symlinks, and folders
files = [
f
for f in files
if not f.startswith(".") and not os.path.islink(os.path.join(root, f))
]
if os.path.basename(root).startswith(".") or not files:
continue
if is_platformio_project(root):
last_pio_project = root
result[last_pio_project] = dict(
name=os.path.relpath(root, examples_dir),
base=os.path.relpath(root, package_dir),
files=files,
)
continue
if last_pio_project:
if root.startswith(last_pio_project):
result[last_pio_project]["files"].extend(
[
os.path.relpath(os.path.join(root, f), last_pio_project)
for f in files
]
)
continue
last_pio_project = None
matched_files = [f for f in files if f.endswith(allowed_exts)]
if not matched_files:
continue
result[root] = dict(
name="Examples"
if root == examples_dir
else os.path.relpath(root, examples_dir),
base=os.path.relpath(root, package_dir),
files=matched_files,
)
result = list(result.values())
# normalize example names
for item in result:
item["name"] = item["name"].replace(os.path.sep, "/")
item["name"] = re.sub(r"[^a-z\d\d\-\_/]+", "_", item["name"], flags=re.I)
return result or None
class LibraryJsonManifestParser(BaseManifestParser):
def parse(self, contents):
data = json.loads(contents)
data = self._process_renamed_fields(data)
# normalize Union[str, list] fields
for k in ("keywords", "platforms", "frameworks"):
if k in data:
data[k] = self._str_to_list(data[k], sep=",")
if "authors" in data:
data["authors"] = self._parse_authors(data["authors"])
if "platforms" in data:
data["platforms"] = self._parse_platforms(data["platforms"]) or None
if "export" in data:
data["export"] = self._parse_export(data["export"])
return data
@staticmethod
def _str_to_list(value, sep=",", lowercase=True):
if isinstance(value, string_types):
value = value.split(sep)
assert isinstance(value, list)
result = []
for item in value:
item = item.strip()
if not item:
continue
if lowercase:
item = item.lower()
result.append(item)
return result
@staticmethod
def _process_renamed_fields(data):
if "url" in data:
data["homepage"] = data["url"]
del data["url"]
for key in ("include", "exclude"):
if key not in data:
continue
if "export" not in data:
data["export"] = {}
data["export"][key] = data[key]
del data[key]
return data
def _parse_authors(self, raw):
if not raw:
return None
# normalize Union[dict, list] fields
if not isinstance(raw, list):
raw = [raw]
return [self.cleanup_author(author) for author in raw]
@staticmethod
def _parse_platforms(raw):
assert isinstance(raw, list)
result = []
# renamed platforms
for item in raw:
if item == "espressif":
item = "espressif8266"
result.append(item)
return result
@staticmethod
def _parse_export(raw):
if not isinstance(raw, dict):
return None
result = {}
for k in ("include", "exclude"):
if k not in raw:
continue
result[k] = raw[k] if isinstance(raw[k], list) else [raw[k]]
return result
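# Sketch of how the normalizations above compose (the manifest is made up):
# "url" is renamed to "homepage", comma-separated keywords become a list, and
# the legacy "espressif" platform name is mapped to "espressif8266".
#
#   >>> LibraryJsonManifestParser(
#   ...     '{"name": "FooLib", "version": "1.0.0", "url": "https://example.com",'
#   ...     ' "keywords": "sensor, i2c", "platforms": "espressif"}'
#   ... ).as_dict()["platforms"]
#   ['espressif8266']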
class ModuleJsonManifestParser(BaseManifestParser):
def parse(self, contents):
data = json.loads(contents)
data["frameworks"] = ["mbed"]
data["platforms"] = ["*"]
data["export"] = {"exclude": ["tests", "test", "*.doxyfile", "*.pdf"]}
if "author" in data:
data["authors"] = self._parse_authors(data.get("author"))
del data["author"]
if "licenses" in data:
data["license"] = self._parse_license(data.get("licenses"))
del data["licenses"]
return data
def _parse_authors(self, raw):
if not raw:
return None
result = []
for author in raw.split(","):
name, email = self.parse_author_name_and_email(author)
if not name:
continue
result.append(self.cleanup_author(dict(name=name, email=email)))
return result
@staticmethod
def _parse_license(raw):
if not raw or not isinstance(raw, list):
return None
return raw[0].get("type")
class LibraryPropertiesManifestParser(BaseManifestParser):
def parse(self, contents):
data = self._parse_properties(contents)
repository = self._parse_repository(data)
homepage = data.get("url")
if repository and repository["url"] == homepage:
homepage = None
data.update(
dict(
frameworks=["arduino"],
homepage=homepage,
repository=repository or None,
description=self._parse_description(data),
platforms=self._parse_platforms(data) or ["*"],
keywords=self._parse_keywords(data),
export=self._parse_export(),
)
)
if "author" in data:
data["authors"] = self._parse_authors(data)
del data["author"]
return data
@staticmethod
def _parse_properties(contents):
data = {}
for line in contents.splitlines():
line = line.strip()
if not line or "=" not in line:
continue
# skip comments
if line.startswith("#"):
continue
key, value = line.split("=", 1)
data[key.strip()] = value.strip()
return data
@staticmethod
def _parse_description(properties):
lines = []
for k in ("sentence", "paragraph"):
if k in properties and properties[k] not in lines:
lines.append(properties[k])
if len(lines) == 2 and not lines[0].endswith("."):
lines[0] += "."
return " ".join(lines)
@staticmethod
def _parse_keywords(properties):
result = []
for item in re.split(r"[\s/]+", properties.get("category", "uncategorized")):
item = item.strip()
if not item:
continue
result.append(item.lower())
return result
@staticmethod
def _parse_platforms(properties):
result = []
platforms_map = {
"avr": "atmelavr",
"sam": "atmelsam",
"samd": "atmelsam",
"esp8266": "espressif8266",
"esp32": "espressif32",
"arc32": "intel_arc32",
"stm32": "ststm32",
}
for arch in properties.get("architectures", "").split(","):
if "particle-" in arch:
raise ManifestParserError("Particle is not supported yet")
arch = arch.strip()
if not arch:
continue
if arch == "*":
return ["*"]
if arch in platforms_map:
result.append(platforms_map[arch])
return result
def _parse_authors(self, properties):
if "author" not in properties:
return None
authors = []
for author in properties["author"].split(","):
name, email = self.parse_author_name_and_email(author)
if not name:
continue
authors.append(self.cleanup_author(dict(name=name, email=email)))
for author in properties.get("maintainer", "").split(","):
name, email = self.parse_author_name_and_email(author)
if not name:
continue
found = False
for item in authors:
if item.get("name", "").lower() != name.lower():
continue
found = True
item["maintainer"] = True
if not item.get("email"):
item["email"] = email
if not found:
authors.append(
self.cleanup_author(dict(name=name, email=email, maintainer=True))
)
return authors
def _parse_repository(self, properties):
if self.remote_url:
repo_parse = urlparse(self.remote_url)
repo_path_tokens = repo_parse.path[1:].split("/")[:-1]
if "github" in repo_parse.netloc:
return dict(
type="git",
url="%s://github.com/%s"
% (repo_parse.scheme, "/".join(repo_path_tokens[:2])),
)
if "raw" in repo_path_tokens:
return dict(
type="git",
url="%s://%s/%s"
% (
repo_parse.scheme,
repo_parse.netloc,
"/".join(repo_path_tokens[: repo_path_tokens.index("raw")]),
),
)
if properties.get("url", "").startswith("https://github.com"):
return dict(type="git", url=properties["url"])
return None
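# Hypothetical example of the URL heuristics above: when a library.properties
# file is fetched from raw.githubusercontent.com, the repository URL is derived
# from the first two path segments of the remote URL.
#
#   remote_url = "https://raw.githubusercontent.com/owner/repo/master/library.properties"
#   => {"type": "git", "url": "https://github.com/owner/repo"}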
def _parse_export(self):
result = {"exclude": ["extras", "docs", "tests", "test", "*.doxyfile", "*.pdf"]}
include = None
if self.remote_url:
repo_parse = urlparse(self.remote_url)
repo_path_tokens = repo_parse.path[1:].split("/")[:-1]
if "github" in repo_parse.netloc:
include = "/".join(repo_path_tokens[3:]) or None
elif "raw" in repo_path_tokens:
include = (
"/".join(repo_path_tokens[repo_path_tokens.index("raw") + 2 :])
or None
)
if include:
result["include"] = [include]
return result
class PlatformJsonManifestParser(BaseManifestParser):
def parse(self, contents):
data = json.loads(contents)
if "frameworks" in data:
data["frameworks"] = self._parse_frameworks(data["frameworks"])
return data
@staticmethod
def _parse_frameworks(raw):
if not isinstance(raw, dict):
return None
return [name.lower() for name in raw.keys()]
class PackageJsonManifestParser(BaseManifestParser):
def parse(self, contents):
data = json.loads(contents)
data = self._parse_system(data)
data = self._parse_homepage(data)
return data
@staticmethod
def _parse_system(data):
if "system" not in data:
return data
if data["system"] in ("*", ["*"], "all"):
del data["system"]
return data
if not isinstance(data["system"], list):
data["system"] = [data["system"]]
data["system"] = [s.strip().lower() for s in data["system"]]
return data
@staticmethod
def _parse_homepage(data):
if "url" in data:
data["homepage"] = data["url"]
del data["url"]
return data
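# Illustrative package.json payload (hypothetical): "system" values are
# normalized to a lowercase list, while "*" / "all" entries are dropped entirely.
#
#   >>> PackageJsonManifestParser(
#   ...     '{"name": "tool-example", "version": "1.0.0", "system": "Linux_x86_64"}'
#   ... ).as_dict()["system"]
#   ['linux_x86_64']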

View File

@@ -0,0 +1,182 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import requests
import semantic_version
from marshmallow import Schema, ValidationError, fields, validate, validates
from platformio.package.exception import ManifestValidationError
from platformio.util import memoized
class StrictSchema(Schema):
def handle_error(self, error, data):
# skip broken records
if self.many:
error.data = [
item for idx, item in enumerate(data) if idx not in error.messages
]
else:
error.data = None
raise error
class StrictListField(fields.List):
def _deserialize(self, value, attr, data):
try:
return super(StrictListField, self)._deserialize(value, attr, data)
except ValidationError as exc:
if exc.data:
exc.data = [item for item in exc.data if item is not None]
raise exc
class AuthorSchema(StrictSchema):
name = fields.Str(required=True, validate=validate.Length(min=1, max=50))
email = fields.Email(validate=validate.Length(min=1, max=50))
maintainer = fields.Bool(default=False)
url = fields.Url(validate=validate.Length(min=1, max=255))
class RepositorySchema(StrictSchema):
type = fields.Str(
required=True,
validate=validate.OneOf(
["git", "hg", "svn"],
error="Invalid repository type, please use one of [git, hg, svn]",
),
)
url = fields.Str(required=True, validate=validate.Length(min=1, max=255))
branch = fields.Str(validate=validate.Length(min=1, max=50))
class ExportSchema(Schema):
include = StrictListField(fields.Str)
exclude = StrictListField(fields.Str)
class ExampleSchema(StrictSchema):
name = fields.Str(
required=True,
validate=[
validate.Length(min=1, max=100),
validate.Regexp(
r"^[a-zA-Z\d\-\_/]+$", error="Only [a-zA-Z0-9-_/] chars are allowed"
),
],
)
base = fields.Str(required=True)
files = StrictListField(fields.Str, required=True)
class ManifestSchema(Schema):
# Required fields
name = fields.Str(required=True, validate=validate.Length(min=1, max=100))
version = fields.Str(required=True, validate=validate.Length(min=1, max=50))
# Optional fields
authors = fields.Nested(AuthorSchema, many=True)
description = fields.Str(validate=validate.Length(min=1, max=1000))
homepage = fields.Url(validate=validate.Length(min=1, max=255))
license = fields.Str(validate=validate.Length(min=1, max=255))
repository = fields.Nested(RepositorySchema)
export = fields.Nested(ExportSchema)
examples = fields.Nested(ExampleSchema, many=True)
keywords = StrictListField(
fields.Str(
validate=[
validate.Length(min=1, max=50),
validate.Regexp(
r"^[a-z\d\-\+\. ]+$", error="Only [a-z0-9-+. ] chars are allowed"
),
]
)
)
platforms = StrictListField(
fields.Str(
validate=[
validate.Length(min=1, max=50),
validate.Regexp(
r"^([a-z\d\-_]+|\*)$", error="Only [a-z0-9-_*] chars are allowed"
),
]
)
)
frameworks = StrictListField(
fields.Str(
validate=[
validate.Length(min=1, max=50),
validate.Regexp(
r"^([a-z\d\-_]+|\*)$", error="Only [a-z0-9-_*] chars are allowed"
),
]
)
)
# platform.json specific
title = fields.Str(validate=validate.Length(min=1, max=100))
# package.json specific
system = StrictListField(
fields.Str(
validate=[
validate.Length(min=1, max=50),
validate.Regexp(
r"^[a-z\d\-_]+$", error="Only [a-z0-9-_] chars are allowed"
),
]
)
)
def handle_error(self, error, data):
if self.strict:
raise ManifestValidationError(error, data)
@validates("version")
def validate_version(self, value): # pylint: disable=no-self-use
try:
value = str(value)
assert "." in value
semantic_version.Version.coerce(value)
except (AssertionError, ValueError):
raise ValidationError(
"Invalid semantic versioning format, see https://semver.org/"
)
@validates("license")
def validate_license(self, value):
try:
spdx = self.load_spdx_licenses()
except requests.exceptions.RequestException:
raise ValidationError("Could not load SPDX licenses for validation")
for item in spdx.get("licenses", []):
if item.get("licenseId") == value:
return
raise ValidationError(
"Invalid SPDX license identifier. See valid identifiers at "
"https://spdx.org/licenses/"
)
@staticmethod
@memoized(expire="1h")
def load_spdx_licenses():
r = requests.get(
"https://raw.githubusercontent.com/spdx/license-list-data"
"/v3.6/json/licenses.json"
)
r.raise_for_status()
return r.json()
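# A rough usage sketch (assumptions: marshmallow 2.x semantics, which the
# strict/handle_error hooks above imply; the manifest source is hypothetical).
# Loading a parsed manifest through the schema either returns cleaned data or,
# in strict mode, raises ManifestValidationError on invalid fields.
#
#   data = ManifestParserFactory.new_from_file("library.json").as_dict()
#   ManifestSchema(strict=True).load(data)  # raises ManifestValidationError on error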

View File

@@ -19,11 +19,15 @@ from os.path import isdir, isfile, join, normpath
from threading import Thread from threading import Thread
from platformio import exception from platformio import exception
from platformio.compat import WINDOWS, string_types from platformio.compat import (
WINDOWS,
get_filesystem_encoding,
get_locale_encoding,
string_types,
)
class AsyncPipeBase(object): class AsyncPipeBase(object):
def __init__(self): def __init__(self):
self._fd_read, self._fd_write = os.pipe() self._fd_read, self._fd_write = os.pipe()
self._pipe_reader = os.fdopen(self._fd_read) self._pipe_reader = os.fdopen(self._fd_read)
@@ -53,7 +57,6 @@ class AsyncPipeBase(object):
class BuildAsyncPipe(AsyncPipeBase): class BuildAsyncPipe(AsyncPipeBase):
def __init__(self, line_callback, data_callback): def __init__(self, line_callback, data_callback):
self.line_callback = line_callback self.line_callback = line_callback
self.data_callback = data_callback self.data_callback = data_callback
@@ -88,7 +91,6 @@ class BuildAsyncPipe(AsyncPipeBase):
class LineBufferedAsyncPipe(AsyncPipeBase): class LineBufferedAsyncPipe(AsyncPipeBase):
def __init__(self, line_callback): def __init__(self, line_callback):
self.line_callback = line_callback self.line_callback = line_callback
super(LineBufferedAsyncPipe, self).__init__() super(LineBufferedAsyncPipe, self).__init__()
@@ -109,8 +111,8 @@ def exec_command(*args, **kwargs):
p = subprocess.Popen(*args, **kwargs) p = subprocess.Popen(*args, **kwargs)
try: try:
result['out'], result['err'] = p.communicate() result["out"], result["err"] = p.communicate()
result['returncode'] = p.returncode result["returncode"] = p.returncode
except KeyboardInterrupt: except KeyboardInterrupt:
raise exception.AbortedByUser() raise exception.AbortedByUser()
finally: finally:
@@ -125,7 +127,9 @@ def exec_command(*args, **kwargs):
for k, v in result.items(): for k, v in result.items():
if isinstance(result[k], bytes): if isinstance(result[k], bytes):
try: try:
result[k] = result[k].decode(sys.getdefaultencoding()) result[k] = result[k].decode(
get_locale_encoding() or get_filesystem_encoding()
)
except UnicodeDecodeError: except UnicodeDecodeError:
result[k] = result[k].decode("latin-1") result[k] = result[k].decode("latin-1")
if v and isinstance(v, string_types): if v and isinstance(v, string_types):
@@ -160,24 +164,22 @@ def copy_pythonpath_to_osenv():
for p in os.sys.path: for p in os.sys.path:
conditions = [p not in _PYTHONPATH] conditions = [p not in _PYTHONPATH]
if not WINDOWS: if not WINDOWS:
conditions.append( conditions.append(isdir(join(p, "click")) or isdir(join(p, "platformio")))
isdir(join(p, "click")) or isdir(join(p, "platformio")))
if all(conditions): if all(conditions):
_PYTHONPATH.append(p) _PYTHONPATH.append(p)
os.environ['PYTHONPATH'] = os.pathsep.join(_PYTHONPATH) os.environ["PYTHONPATH"] = os.pathsep.join(_PYTHONPATH)
def where_is_program(program, envpath=None): def where_is_program(program, envpath=None):
env = os.environ env = os.environ
if envpath: if envpath:
env['PATH'] = envpath env["PATH"] = envpath
# try OS's built-in commands # try OS's built-in commands
try: try:
result = exec_command(["where" if WINDOWS else "which", program], result = exec_command(["where" if WINDOWS else "which", program], env=env)
env=env) if result["returncode"] == 0 and isfile(result["out"].strip()):
if result['returncode'] == 0 and isfile(result['out'].strip()): return result["out"].strip()
return result['out'].strip()
except OSError: except OSError:
pass pass

View File

@@ -16,11 +16,12 @@ import glob
import json import json
import os import os
import re import re
from os.path import expanduser, getmtime, isfile from hashlib import sha1
import click import click
from platformio import exception from platformio import exception, fs
from platformio.compat import PY2, WINDOWS, hashlib_encode_data
from platformio.project.options import ProjectOptions from platformio.project.options import ProjectOptions
try: try:
@@ -41,7 +42,7 @@ CONFIG_HEADER = """;PlatformIO Project Configuration File
""" """
class ProjectConfig(object): class ProjectConfigBase(object):
INLINE_COMMENT_RE = re.compile(r"\s+;.*$") INLINE_COMMENT_RE = re.compile(r"\s+;.*$")
VARTPL_RE = re.compile(r"\$\{([^\.\}]+)\.([^\}]+)\}") VARTPL_RE = re.compile(r"\$\{([^\.\}]+)\.([^\}]+)\}")
@@ -49,7 +50,6 @@ class ProjectConfig(object):
expand_interpolations = True expand_interpolations = True
warnings = [] warnings = []
_instances = {}
_parser = None _parser = None
_parsed = [] _parsed = []
@@ -66,30 +66,34 @@ class ProjectConfig(object):
if not item or item.startswith((";", "#")): if not item or item.startswith((";", "#")):
continue continue
if ";" in item: if ";" in item:
item = ProjectConfig.INLINE_COMMENT_RE.sub("", item).strip() item = ProjectConfigBase.INLINE_COMMENT_RE.sub("", item).strip()
result.append(item) result.append(item)
return result return result
@staticmethod @staticmethod
def get_instance(path): def get_default_path():
mtime = getmtime(path) if isfile(path) else 0 from platformio import app # pylint: disable=import-outside-toplevel
instance = ProjectConfig._instances.get(path)
if instance and instance["mtime"] != mtime:
instance = None
if not instance:
instance = {"mtime": mtime, "config": ProjectConfig(path)}
ProjectConfig._instances[path] = instance
return instance["config"]
def __init__(self, path, parse_extra=True, expand_interpolations=True): return app.get_session_var("custom_project_conf") or os.path.join(
os.getcwd(), "platformio.ini"
)
def __init__(self, path=None, parse_extra=True, expand_interpolations=True):
path = self.get_default_path() if path is None else path
self.path = path self.path = path
self.expand_interpolations = expand_interpolations self.expand_interpolations = expand_interpolations
self.warnings = [] self.warnings = []
self._parsed = [] self._parsed = []
self._parser = ConfigParser.ConfigParser() self._parser = (
if isfile(path): ConfigParser.ConfigParser()
if PY2
else ConfigParser.ConfigParser(inline_comment_prefixes=("#", ";"))
)
if path and os.path.isfile(path):
self.read(path, parse_extra) self.read(path, parse_extra)
self._maintain_renaimed_options()
def __getattr__(self, name): def __getattr__(self, name):
return getattr(self._parser, name) return getattr(self._parser, name)
@@ -108,31 +112,32 @@ class ProjectConfig(object):
# load extra configs # load extra configs
for pattern in self.get("platformio", "extra_configs", []): for pattern in self.get("platformio", "extra_configs", []):
if pattern.startswith("~"): if pattern.startswith("~"):
pattern = expanduser(pattern) pattern = fs.expanduser(pattern)
for item in glob.glob(pattern): for item in glob.glob(pattern):
self.read(item) self.read(item)
self._maintain_renaimed_options()
def _maintain_renaimed_options(self): def _maintain_renaimed_options(self):
# legacy `lib_extra_dirs` in [platformio] # legacy `lib_extra_dirs` in [platformio]
if (self._parser.has_section("platformio") if self._parser.has_section("platformio") and self._parser.has_option(
and self._parser.has_option("platformio", "lib_extra_dirs")): "platformio", "lib_extra_dirs"
):
if not self._parser.has_section("env"): if not self._parser.has_section("env"):
self._parser.add_section("env") self._parser.add_section("env")
self._parser.set("env", "lib_extra_dirs", self._parser.set(
self._parser.get("platformio", "lib_extra_dirs")) "env",
"lib_extra_dirs",
self._parser.get("platformio", "lib_extra_dirs"),
)
self._parser.remove_option("platformio", "lib_extra_dirs") self._parser.remove_option("platformio", "lib_extra_dirs")
self.warnings.append( self.warnings.append(
"`lib_extra_dirs` configuration option is deprecated in " "`lib_extra_dirs` configuration option is deprecated in "
"section [platformio]! Please move it to global `env` section") "section [platformio]! Please move it to global `env` section"
)
renamed_options = {} renamed_options = {}
for option in ProjectOptions.values(): for option in ProjectOptions.values():
if option.oldnames: if option.oldnames:
renamed_options.update( renamed_options.update({name: option.name for name in option.oldnames})
{name: option.name
for name in option.oldnames})
for section in self._parser.sections(): for section in self._parser.sections():
scope = section.split(":", 1)[0] scope = section.split(":", 1)[0]
@@ -143,54 +148,74 @@ class ProjectConfig(object):
self.warnings.append( self.warnings.append(
"`%s` configuration option in section [%s] is " "`%s` configuration option in section [%s] is "
"deprecated and will be removed in the next release! " "deprecated and will be removed in the next release! "
"Please use `%s` instead" % "Please use `%s` instead"
(option, section, renamed_options[option])) % (option, section, renamed_options[option])
)
# rename on-the-fly # rename on-the-fly
self._parser.set(section, renamed_options[option], self._parser.set(
self._parser.get(section, option)) section,
renamed_options[option],
self._parser.get(section, option),
)
self._parser.remove_option(section, option) self._parser.remove_option(section, option)
continue continue
# unknown # unknown
unknown_conditions = [ unknown_conditions = [
("%s.%s" % (scope, option)) not in ProjectOptions, ("%s.%s" % (scope, option)) not in ProjectOptions,
scope != "env" or scope != "env" or not option.startswith(("custom_", "board_")),
not option.startswith(("custom_", "board_")) ]
] # yapf: disable
if all(unknown_conditions): if all(unknown_conditions):
self.warnings.append( self.warnings.append(
"Ignore unknown configuration option `%s` " "Ignore unknown configuration option `%s` "
"in section [%s]" % (option, section)) "in section [%s]" % (option, section)
)
return True return True
def walk_options(self, root_section):
extends_queue = (
["env", root_section] if root_section.startswith("env:") else [root_section]
)
extends_done = []
while extends_queue:
section = extends_queue.pop()
extends_done.append(section)
if not self._parser.has_section(section):
continue
for option in self._parser.options(section):
yield (section, option)
if self._parser.has_option(section, "extends"):
extends_queue.extend(
self.parse_multi_values(self._parser.get(section, "extends"))[::-1]
)
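# Illustrative "platformio.ini" (not from this commit) showing what walk_options()
# traverses for an "env:uno" section: first the section itself, then the sections
# listed in its `extends` option, and finally the global [env] scope that was
# seeded into the queue.
#
#   [common]
#   build_flags = -D VERSION=1
#
#   [env]
#   framework = arduino
#
#   [env:uno]
#   extends = common
#   board = uno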
def options(self, section=None, env=None): def options(self, section=None, env=None):
result = []
assert section or env assert section or env
if not section: if not section:
section = "env:" + env section = "env:" + env
options = self._parser.options(section)
# handle global options from [env] if not self.expand_interpolations:
if ((env or section.startswith("env:")) return self._parser.options(section)
and self._parser.has_section("env")):
for option in self._parser.options("env"): for _, option in self.walk_options(section):
if option not in options: if option not in result:
options.append(option) result.append(option)
# handle system environment variables # handle system environment variables
scope = section.split(":", 1)[0] scope = section.split(":", 1)[0]
for option_meta in ProjectOptions.values(): for option_meta in ProjectOptions.values():
if option_meta.scope != scope or option_meta.name in options: if option_meta.scope != scope or option_meta.name in result:
continue continue
if option_meta.sysenvvar and option_meta.sysenvvar in os.environ: if option_meta.sysenvvar and option_meta.sysenvvar in os.environ:
options.append(option_meta.name) result.append(option_meta.name)
return options return result
def has_option(self, section, option): def has_option(self, section, option):
if self._parser.has_option(section, option): if self._parser.has_option(section, option):
return True return True
return (section.startswith("env:") and self._parser.has_section("env") return option in self.options(section)
and self._parser.has_option("env", option))
    def items(self, section=None, env=None, as_dict=False):
        assert section or env
@@ -198,29 +223,36 @@ class ProjectConfig(object):
            section = "env:" + env
        if as_dict:
            return {
                option: self.get(section, option) for option in self.options(section)
            }
        return [(option, self.get(section, option)) for option in self.options(section)]

    def set(self, section, option, value):
        if isinstance(value, (list, tuple)):
            value = "\n".join(value)
        elif isinstance(value, bool):
            value = "yes" if value else "no"
        elif isinstance(value, (int, float)):
            value = str(value)
        # start multi-line value from a new line
        if "\n" in value and not value.startswith("\n"):
            value = "\n" + value
        self._parser.set(section, option, value)
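A small sketch of the value normalization performed by set(); it continues the config object from the sketch above and assumes the section already exists in the parser.

config.set("env:release", "build_flags", ["-Os", "-DNDEBUG"])  # list -> multi-line string
config.set("env:release", "lib_archive", False)                # bool -> "no"
config.set("env:release", "monitor_speed", 115200)             # number -> "115200"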
    def getraw(self, section, option):
        if not self.expand_interpolations:
            return self._parser.get(section, option)

        value = None
        found = False
        for sec, opt in self.walk_options(section):
            if opt == option:
                value = self._parser.get(sec, option)
                found = True
                break

        if not found:
            value = self._parser.get(section, option)

        if "${" not in value or "}" not in value:
            return value
@@ -232,7 +264,7 @@ class ProjectConfig(object):
            return os.getenv(option)
        return self.getraw(section, option)

    def get(self, section, option, default=None):  # pylint: disable=too-many-branches
        value = None
        try:
            value = self.getraw(section, option)
@@ -241,8 +273,7 @@ class ProjectConfig(object):
        except ConfigParser.Error as e:
            raise exception.InvalidProjectConf(self.path, str(e))

        option_meta = ProjectOptions.get("%s.%s" % (section.split(":", 1)[0], option))
        if not option_meta:
            return value or default
@@ -263,17 +294,20 @@ class ProjectConfig(object):
            value = envvar_value

        # option is not specified by user
        if value is None or (
            option_meta.multiple and value == [] and option_meta.default
        ):
            return default if default is not None else option_meta.default

        try:
            return self.cast_to(value, option_meta.type)
        except click.BadParameter as e:
            if not self.expand_interpolations:
                return value
            raise exception.ProjectOptionValueError(e.format_message(), option, section)

    @staticmethod
    def cast_to(value, to_type):
        items = value
        if not isinstance(value, (list, tuple)):
            items = [value]
@@ -290,7 +324,7 @@ class ProjectConfig(object):
return self.get("platformio", "default_envs", []) return self.get("platformio", "default_envs", [])
def validate(self, envs=None, silent=False): def validate(self, envs=None, silent=False):
if not isfile(self.path): if not os.path.isfile(self.path):
raise exception.NotPlatformIOProject(self.path) raise exception.NotPlatformIOProject(self.path)
# check envs # check envs
known = set(self.envs()) known = set(self.envs())
@@ -298,18 +332,114 @@ class ProjectConfig(object):
            raise exception.ProjectEnvsNotAvailable()
        unknown = set(list(envs or []) + self.default_envs()) - known
        if unknown:
            raise exception.UnknownEnvNames(", ".join(unknown), ", ".join(known))

        if not silent:
            for warning in self.warnings:
                click.secho("Warning! %s" % warning, fg="yellow")

        return True
class ProjectConfigDirsMixin(object):
def _get_core_dir(self, exists=False):
default = ProjectOptions["platformio.core_dir"].default
core_dir = self.get("platformio", "core_dir")
win_core_dir = None
if WINDOWS and core_dir == default:
win_core_dir = os.path.splitdrive(core_dir)[0] + "\\.platformio"
if os.path.isdir(win_core_dir):
core_dir = win_core_dir
if exists and not os.path.isdir(core_dir):
try:
os.makedirs(core_dir)
except OSError as e:
if win_core_dir:
os.makedirs(win_core_dir)
core_dir = win_core_dir
else:
raise e
return core_dir
def get_optional_dir(self, name, exists=False):
if not ProjectOptions.get("platformio.%s_dir" % name):
raise ValueError("Unknown optional directory -> " + name)
if name == "core":
result = self._get_core_dir(exists)
else:
result = self.get("platformio", name + "_dir")
if result is None:
return None
project_dir = os.getcwd()
# patterns
if "$PROJECT_HASH" in result:
result = result.replace(
"$PROJECT_HASH",
"%s-%s"
% (
os.path.basename(project_dir),
sha1(hashlib_encode_data(project_dir)).hexdigest()[:10],
),
)
if "$PROJECT_DIR" in result:
result = result.replace("$PROJECT_DIR", project_dir)
if "$PROJECT_CORE_DIR" in result:
result = result.replace("$PROJECT_CORE_DIR", self.get_optional_dir("core"))
if "$PROJECT_WORKSPACE_DIR" in result:
result = result.replace(
"$PROJECT_WORKSPACE_DIR", self.get_optional_dir("workspace")
)
if result.startswith("~"):
result = fs.expanduser(result)
result = os.path.realpath(result)
if exists and not os.path.isdir(result):
os.makedirs(result)
return result
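A sketch of how the $PROJECT_* placeholders are expected to expand; the build_cache_dir value is hypothetical and only shows the substitution, using the same config object as above.

# with `build_cache_dir = $PROJECT_CORE_DIR/.cache/builds` set in [platformio],
# the placeholder is replaced with the resolved core_dir before the directory
# is created on demand:
path = config.get_optional_dir("build_cache", exists=True)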
class ProjectConfig(ProjectConfigBase, ProjectConfigDirsMixin):
_instances = {}
@staticmethod
def get_instance(path=None):
path = ProjectConfig.get_default_path() if path is None else path
mtime = os.path.getmtime(path) if os.path.isfile(path) else 0
instance = ProjectConfig._instances.get(path)
if instance and instance["mtime"] != mtime:
instance = None
if not instance:
instance = {"mtime": mtime, "config": ProjectConfig(path)}
ProjectConfig._instances[path] = instance
return instance["config"]
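get_instance() memoizes one ProjectConfig per path and drops the cached object when the file's mtime changes; a minimal usage sketch:

c1 = ProjectConfig.get_instance("platformio.ini")
c2 = ProjectConfig.get_instance("platformio.ini")
assert c1 is c2  # same cached instance while platformio.ini is untouched
# rewriting platformio.ini changes its mtime, so the next get_instance() re-parses it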
def __repr__(self):
return "<ProjectConfig %s>" % (self.path or "in-memory")
def as_tuple(self):
return [(s, self.items(s)) for s in self.sections()]
    def to_json(self):
        return json.dumps(self.as_tuple())

    def update(self, data, clear=False):
        assert isinstance(data, list)
if clear:
self._parser = ConfigParser.ConfigParser()
for section, options in data:
if not self._parser.has_section(section):
self._parser.add_section(section)
for option, value in options:
self.set(section, option, value)
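as_tuple() and update() give a serializable round trip, which is what to_json() builds on; a hedged sketch (the clone object is illustrative only):

snapshot = config.as_tuple()        # [(section, [(option, value), ...]), ...]
clone = ProjectConfig("platformio.ini")
clone.update(snapshot, clear=True)  # rebuild the parser state from the snapshot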
    def save(self, path=None):
        path = path or self.path
@@ -318,3 +448,4 @@ class ProjectConfig(object):
        with open(path or self.path, "w") as fp:
            fp.write(CONFIG_HEADER)
            self._parser.write(fp)
return True

View File

@@ -16,12 +16,11 @@ import json
import os
from hashlib import sha1
from os import walk
from os.path import dirname, isdir, isfile, join

from click.testing import CliRunner

from platformio import __version__, exception, fs
from platformio.compat import WINDOWS, hashlib_encode_data
from platformio.project.config import ProjectConfig
@@ -46,123 +45,62 @@ def find_project_dir_above(path):
    return None


def get_project_core_dir():
    """ Deprecated, use ProjectConfig.get_optional_dir("core") instead """
    return ProjectConfig.get_instance(
        join(get_project_dir(), "platformio.ini")
    ).get_optional_dir("core", exists=True)


def get_project_cache_dir():
    """ Deprecated, use ProjectConfig.get_optional_dir("cache") instead """
    return ProjectConfig.get_instance(
        join(get_project_dir(), "platformio.ini")
    ).get_optional_dir("cache")


def get_project_global_lib_dir():
    """
    Deprecated, use ProjectConfig.get_optional_dir("globallib") instead
    "platformio-node-helpers" depends on it
    """
    return ProjectConfig.get_instance(
        join(get_project_dir(), "platformio.ini")
    ).get_optional_dir("globallib")


def get_project_lib_dir():
    """
    Deprecated, use ProjectConfig.get_optional_dir("lib") instead
    "platformio-node-helpers" depends on it
    """
    return ProjectConfig.get_instance(
        join(get_project_dir(), "platformio.ini")
    ).get_optional_dir("lib")


def get_project_libdeps_dir():
    """
    Deprecated, use ProjectConfig.get_optional_dir("libdeps") instead
    "platformio-node-helpers" depends on it
    """
    return ProjectConfig.get_instance(
        join(get_project_dir(), "platformio.ini")
    ).get_optional_dir("libdeps")


def get_default_projects_dir():
    docs_dir = join(fs.expanduser("~"), "Documents")
    try:
        assert WINDOWS
        import ctypes.wintypes  # pylint: disable=import-outside-toplevel

        buf = ctypes.create_unicode_buffer(ctypes.wintypes.MAX_PATH)
        ctypes.windll.shell32.SHGetFolderPathW(None, 5, None, 0, buf)
        docs_dir = buf.value
    except:  # pylint: disable=bare-except
        pass
    return join(docs_dir, "PlatformIO", "Projects")


def compute_project_checksum(config):
@@ -174,8 +112,11 @@ def compute_project_checksum(config):
    # project file structure
    check_suffixes = (".c", ".cc", ".cpp", ".h", ".hpp", ".s", ".S")
    for d in (
        config.get_optional_dir("include"),
        config.get_optional_dir("src"),
        config.get_optional_dir("lib"),
    ):
        if not isdir(d):
            continue
        chunks = []
@@ -194,17 +135,21 @@ def compute_project_checksum(config):
    return checksum.hexdigest()


def load_project_ide_data(project_dir, env_or_envs):
    # pylint: disable=import-outside-toplevel
    from platformio.commands.run.command import cli as cmd_run

    assert env_or_envs
    envs = env_or_envs
    if not isinstance(envs, list):
        envs = [envs]

    args = ["--project-dir", project_dir, "--target", "idedata"]
    for env in envs:
        args.extend(["-e", env])
    result = CliRunner().invoke(cmd_run, args)
    if result.exit_code != 0 and not isinstance(
        result.exception, exception.ReturnErrorCode
    ):
        raise result.exception
    if '"includes":' not in result.output:
        raise exception.PlatformioException(result.output)
@@ -212,11 +157,10 @@ def load_project_ide_data(project_dir, envs):
    data = {}
    for line in result.output.split("\n"):
        line = line.strip()
        if line.startswith('{"') and line.endswith("}") and "env_name" in line:
            _data = json.loads(line)
            if "env_name" in _data:
                data[_data["env_name"]] = _data
    if not isinstance(env_or_envs, list) and env_or_envs in data:
        return data[env_or_envs]
    return data or None
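A hedged usage sketch of the reworked signature, which now accepts either a single environment name or a list; the environment names "uno" and "due" are made up for illustration.

single = load_project_ide_data(".", "uno")          # dict with IDE data for env "uno" only
many = load_project_ide_data(".", ["uno", "due"])   # {"uno": {...}, "due": {...}}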

View File

@@ -14,24 +14,60 @@
# pylint: disable=redefined-builtin, too-many-arguments

import os
from collections import OrderedDict

import click

from platformio import fs


class ConfigOption(object):  # pylint: disable=too-many-instance-attributes
    def __init__(
        self,
        scope,
        group,
        name,
        description,
        type=str,
        multiple=False,
sysenvvar=None,
buildenvvar=None,
oldnames=None,
default=None,
):
self.scope = scope
self.group = group
self.name = name
self.description = description
self.type = type
self.multiple = multiple
self.sysenvvar = sysenvvar
self.buildenvvar = buildenvvar
self.oldnames = oldnames
self.default = default
def as_dict(self):
result = dict(
scope=self.scope,
group=self.group,
name=self.name,
description=self.description,
type="string",
multiple=self.multiple,
sysenvvar=self.sysenvvar,
default=self.default,
)
if isinstance(self.type, click.ParamType):
result["type"] = self.type.name
if isinstance(self.type, (click.IntRange, click.FloatRange)):
result["min"] = self.type.min
result["max"] = self.type.max
if isinstance(self.type, click.Choice):
result["choices"] = self.type.choices
return result
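For a single option, as_dict() flattens the click type into plain JSON-friendly fields; a sketch whose printed shape is approximate, using an option registered further down in this file.

opt = ProjectOptions["env.monitor_rts"]
print(opt.as_dict())
# roughly: {"scope": "env", "group": "monitor", "name": "monitor_rts",
#           "type": "integer range", "min": 0, "max": 1, "multiple": False, ...}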
def ConfigPlatformioOption(*args, **kwargs):
@@ -42,162 +78,610 @@ def ConfigEnvOption(*args, **kwargs):
return ConfigOption("env", *args, **kwargs) return ConfigOption("env", *args, **kwargs)
ProjectOptions = OrderedDict([ ProjectOptions = OrderedDict(
("%s.%s" % (option.scope, option.name), option) for option in [ [
# ("%s.%s" % (option.scope, option.name), option)
# [platformio] for option in [
# #
ConfigPlatformioOption(name="description"), # [platformio]
ConfigPlatformioOption(name="default_envs", #
oldnames=["env_default"], ConfigPlatformioOption(
multiple=True, group="generic",
sysenvvar="PLATFORMIO_DEFAULT_ENVS"), name="description",
ConfigPlatformioOption(name="extra_configs", multiple=True), description="Describe a project with a short information",
),
# Dirs ConfigPlatformioOption(
ConfigPlatformioOption(name="core_dir", group="generic",
oldnames=["home_dir"], name="default_envs",
sysenvvar="PLATFORMIO_CORE_DIR"), description=(
ConfigPlatformioOption(name="globallib_dir", "Configure a list with environments which PlatformIO should "
sysenvvar="PLATFORMIO_GLOBALLIB_DIR"), "process by default"
ConfigPlatformioOption(name="platforms_dir", ),
sysenvvar="PLATFORMIO_PLATFORMS_DIR"), oldnames=["env_default"],
ConfigPlatformioOption(name="packages_dir", multiple=True,
sysenvvar="PLATFORMIO_PACKAGES_DIR"), sysenvvar="PLATFORMIO_DEFAULT_ENVS",
ConfigPlatformioOption(name="cache_dir", ),
sysenvvar="PLATFORMIO_CACHE_DIR"), ConfigPlatformioOption(
ConfigPlatformioOption(name="build_cache_dir", group="generic",
sysenvvar="PLATFORMIO_BUILD_CACHE_DIR"), name="extra_configs",
ConfigPlatformioOption(name="workspace_dir", description=(
sysenvvar="PLATFORMIO_WORKSPACE_DIR"), "Extend main configuration with the extra configuration files"
ConfigPlatformioOption(name="build_dir", ),
sysenvvar="PLATFORMIO_BUILD_DIR"), multiple=True,
ConfigPlatformioOption(name="libdeps_dir", ),
sysenvvar="PLATFORMIO_LIBDEPS_DIR"), # Dirs
ConfigPlatformioOption(name="lib_dir", sysenvvar="PLATFORMIO_LIB_DIR"), ConfigPlatformioOption(
ConfigPlatformioOption(name="include_dir", group="directory",
sysenvvar="PLATFORMIO_INCLUDE_DIR"), name="core_dir",
ConfigPlatformioOption(name="src_dir", sysenvvar="PLATFORMIO_SRC_DIR"), description=(
ConfigPlatformioOption(name="test_dir", "PlatformIO Core location where it keeps installed development "
sysenvvar="PLATFORMIO_TEST_DIR"), "platforms, packages, global libraries, "
ConfigPlatformioOption(name="boards_dir", "and other internal information"
sysenvvar="PLATFORMIO_BOARDS_DIR"), ),
ConfigPlatformioOption(name="data_dir", oldnames=["home_dir"],
sysenvvar="PLATFORMIO_DATA_DIR"), sysenvvar="PLATFORMIO_CORE_DIR",
ConfigPlatformioOption(name="shared_dir", default=os.path.join(fs.expanduser("~"), ".platformio"),
sysenvvar="PLATFORMIO_SHARED_DIR"), ),
ConfigPlatformioOption(
# group="directory",
# [env] name="globallib_dir",
# description=(
"A library folder/storage where PlatformIO Library Dependency "
# Generic "Finder (LDF) looks for global libraries"
ConfigEnvOption(name="platform", buildenvvar="PIOPLATFORM"), ),
ConfigEnvOption(name="platform_packages", multiple=True), sysenvvar="PLATFORMIO_GLOBALLIB_DIR",
ConfigEnvOption( default=os.path.join("$PROJECT_CORE_DIR", "lib"),
name="framework", multiple=True, buildenvvar="PIOFRAMEWORK"), ),
ConfigPlatformioOption(
# Board group="directory",
ConfigEnvOption(name="board", buildenvvar="BOARD"), name="platforms_dir",
ConfigEnvOption(name="board_build.mcu", description=(
oldnames=["board_mcu"], "A location where PlatformIO Core keeps installed development "
buildenvvar="BOARD_MCU"), "platforms"
ConfigEnvOption(name="board_build.f_cpu", ),
oldnames=["board_f_cpu"], sysenvvar="PLATFORMIO_PLATFORMS_DIR",
buildenvvar="BOARD_F_CPU"), default=os.path.join("$PROJECT_CORE_DIR", "platforms"),
ConfigEnvOption(name="board_build.f_flash", ),
oldnames=["board_f_flash"], ConfigPlatformioOption(
buildenvvar="BOARD_F_FLASH"), group="directory",
ConfigEnvOption(name="board_build.flash_mode", name="packages_dir",
oldnames=["board_flash_mode"], description=(
buildenvvar="BOARD_FLASH_MODE"), "A location where PlatformIO Core keeps installed packages"
),
# Build sysenvvar="PLATFORMIO_PACKAGES_DIR",
ConfigEnvOption(name="build_type", default=os.path.join("$PROJECT_CORE_DIR", "packages"),
type=click.Choice(["release", "debug"])), ),
ConfigEnvOption(name="build_flags", ConfigPlatformioOption(
multiple=True, group="directory",
sysenvvar="PLATFORMIO_BUILD_FLAGS", name="cache_dir",
buildenvvar="BUILD_FLAGS"), description=(
ConfigEnvOption(name="src_build_flags", "A location where PlatformIO Core stores caching information "
multiple=True, "(requests to PlatformIO Registry, downloaded packages and "
sysenvvar="PLATFORMIO_SRC_BUILD_FLAGS", "other service information)"
buildenvvar="SRC_BUILD_FLAGS"), ),
ConfigEnvOption(name="build_unflags", sysenvvar="PLATFORMIO_CACHE_DIR",
multiple=True, default=os.path.join("$PROJECT_CORE_DIR", ".cache"),
sysenvvar="PLATFORMIO_BUILD_UNFLAGS", ),
buildenvvar="BUILD_UNFLAGS"), ConfigPlatformioOption(
ConfigEnvOption(name="src_filter", group="directory",
multiple=True, name="build_cache_dir",
sysenvvar="PLATFORMIO_SRC_FILTER", description=(
buildenvvar="SRC_FILTER"), "A location where PlatformIO Core keeps derived files from a "
ConfigEnvOption(name="targets", multiple=True), "build system (objects, firmwares, ELFs) and caches them between "
"build environments"
# Upload ),
ConfigEnvOption(name="upload_port", sysenvvar="PLATFORMIO_BUILD_CACHE_DIR",
sysenvvar="PLATFORMIO_UPLOAD_PORT", ),
buildenvvar="UPLOAD_PORT"), ConfigPlatformioOption(
ConfigEnvOption(name="upload_protocol", buildenvvar="UPLOAD_PROTOCOL"), group="directory",
ConfigEnvOption( name="workspace_dir",
name="upload_speed", type=click.INT, buildenvvar="UPLOAD_SPEED"), description=(
ConfigEnvOption(name="upload_flags", "A path to a project workspace directory where PlatformIO keeps "
multiple=True, "by default compiled objects, static libraries, firmwares, and "
sysenvvar="PLATFORMIO_UPLOAD_FLAGS", "external library dependencies"
buildenvvar="UPLOAD_FLAGS"), ),
ConfigEnvOption(name="upload_resetmethod", sysenvvar="PLATFORMIO_WORKSPACE_DIR",
buildenvvar="UPLOAD_RESETMETHOD"), default=os.path.join("$PROJECT_DIR", ".pio"),
ConfigEnvOption(name="upload_command", buildenvvar="UPLOADCMD"), ),
ConfigPlatformioOption(
# Monitor group="directory",
ConfigEnvOption(name="monitor_port"), name="build_dir",
ConfigEnvOption(name="monitor_speed", oldnames=["monitor_baud"]), description=(
ConfigEnvOption(name="monitor_rts", type=click.IntRange(0, 1)), "PlatformIO Build System uses this folder for project environments"
ConfigEnvOption(name="monitor_dtr", type=click.IntRange(0, 1)), " to store compiled object files, static libraries, firmwares, "
ConfigEnvOption(name="monitor_flags", multiple=True), "and other cached information"
),
# Library sysenvvar="PLATFORMIO_BUILD_DIR",
ConfigEnvOption(name="lib_deps", default=os.path.join("$PROJECT_WORKSPACE_DIR", "build"),
oldnames=["lib_use", "lib_force", "lib_install"], ),
multiple=True), ConfigPlatformioOption(
ConfigEnvOption(name="lib_ignore", multiple=True), group="directory",
ConfigEnvOption(name="lib_extra_dirs", name="libdeps_dir",
multiple=True, description=(
sysenvvar="PLATFORMIO_LIB_EXTRA_DIRS"), "Internal storage where Library Manager will install project "
ConfigEnvOption(name="lib_ldf_mode", "dependencies declared via `lib_deps` option"
type=click.Choice( ),
["off", "chain", "deep", "chain+", "deep+"])), sysenvvar="PLATFORMIO_LIBDEPS_DIR",
ConfigEnvOption(name="lib_compat_mode", default=os.path.join("$PROJECT_WORKSPACE_DIR", "libdeps"),
type=click.Choice(["off", "soft", "strict"])), ),
ConfigEnvOption(name="lib_archive", type=click.BOOL), ConfigPlatformioOption(
group="directory",
# Test name="include_dir",
ConfigEnvOption(name="test_filter", multiple=True), description=(
ConfigEnvOption(name="test_ignore", multiple=True), "A default location for project header files. PlatformIO Build "
ConfigEnvOption(name="test_port"), "System automatically adds this path to CPPPATH scope"
ConfigEnvOption(name="test_speed", type=click.INT), ),
ConfigEnvOption(name="test_transport"), sysenvvar="PLATFORMIO_INCLUDE_DIR",
ConfigEnvOption(name="test_build_project_src", type=click.BOOL), default=os.path.join("$PROJECT_DIR", "include"),
),
# Debug ConfigPlatformioOption(
ConfigEnvOption(name="debug_tool"), group="directory",
ConfigEnvOption(name="debug_init_break"), name="src_dir",
ConfigEnvOption(name="debug_init_cmds", multiple=True), description=(
ConfigEnvOption(name="debug_extra_cmds", multiple=True), "A default location where PlatformIO Build System looks for the "
ConfigEnvOption(name="debug_load_cmds", "project C/C++ source files"
oldnames=["debug_load_cmd"], ),
multiple=True), sysenvvar="PLATFORMIO_SRC_DIR",
ConfigEnvOption(name="debug_load_mode", default=os.path.join("$PROJECT_DIR", "src"),
type=click.Choice(["always", "modified", "manual"])), ),
ConfigEnvOption(name="debug_server", multiple=True), ConfigPlatformioOption(
ConfigEnvOption(name="debug_port"), group="directory",
ConfigEnvOption(name="debug_svd_path", name="lib_dir",
type=click.Path( description="A storage for the custom/private project libraries",
exists=True, file_okay=True, dir_okay=False)), sysenvvar="PLATFORMIO_LIB_DIR",
default=os.path.join("$PROJECT_DIR", "lib"),
# Other ),
ConfigEnvOption(name="extra_scripts", ConfigPlatformioOption(
oldnames=["extra_script"], group="directory",
multiple=True, name="data_dir",
sysenvvar="PLATFORMIO_EXTRA_SCRIPTS") description=(
"A data directory to store contents which can be uploaded to "
"file system (SPIFFS, etc.)"
),
sysenvvar="PLATFORMIO_DATA_DIR",
default=os.path.join("$PROJECT_DIR", "data"),
),
ConfigPlatformioOption(
group="directory",
name="test_dir",
description=(
"A location where PIO Unit Testing engine looks for "
"test source files"
),
sysenvvar="PLATFORMIO_TEST_DIR",
default=os.path.join("$PROJECT_DIR", "test"),
),
ConfigPlatformioOption(
group="directory",
name="boards_dir",
description="A global storage for custom board manifests",
sysenvvar="PLATFORMIO_BOARDS_DIR",
default=os.path.join("$PROJECT_DIR", "boards"),
),
ConfigPlatformioOption(
group="directory",
name="shared_dir",
description=(
"A location which PIO Remote uses to synchronize extra files "
"between remote machines"
),
sysenvvar="PLATFORMIO_SHARED_DIR",
default=os.path.join("$PROJECT_DIR", "shared"),
),
#
# [env]
#
# Platform
ConfigEnvOption(
group="platform",
name="platform",
description="A name or specification of development platform",
buildenvvar="PIOPLATFORM",
),
ConfigEnvOption(
group="platform",
name="platform_packages",
description="Custom packages and specifications",
multiple=True,
),
ConfigEnvOption(
group="platform",
name="framework",
description="A list of project dependent frameworks",
multiple=True,
buildenvvar="PIOFRAMEWORK",
),
# Board
ConfigEnvOption(
group="board",
name="board",
description="A board ID",
buildenvvar="BOARD",
),
ConfigEnvOption(
group="board",
name="board_build.mcu",
description="A custom board MCU",
oldnames=["board_mcu"],
buildenvvar="BOARD_MCU",
),
ConfigEnvOption(
group="board",
name="board_build.f_cpu",
description="A custom MCU frequency",
oldnames=["board_f_cpu"],
buildenvvar="BOARD_F_CPU",
),
ConfigEnvOption(
group="board",
name="board_build.f_flash",
description="A custom flash frequency",
oldnames=["board_f_flash"],
buildenvvar="BOARD_F_FLASH",
),
ConfigEnvOption(
group="board",
name="board_build.flash_mode",
description="A custom flash mode",
oldnames=["board_flash_mode"],
buildenvvar="BOARD_FLASH_MODE",
),
# Build
ConfigEnvOption(
group="build",
name="build_type",
description="Project build configuration",
type=click.Choice(["release", "debug"]),
default="release",
),
ConfigEnvOption(
group="build",
name="build_flags",
description=(
"Custom build flags/options for preprocessing, compilation, "
"assembly, and linking processes"
),
multiple=True,
sysenvvar="PLATFORMIO_BUILD_FLAGS",
buildenvvar="BUILD_FLAGS",
),
ConfigEnvOption(
group="build",
name="src_build_flags",
description=(
"The same as `build_flags` but configures flags the only for "
"project source files (`src` folder)"
),
multiple=True,
sysenvvar="PLATFORMIO_SRC_BUILD_FLAGS",
buildenvvar="SRC_BUILD_FLAGS",
),
ConfigEnvOption(
group="build",
name="build_unflags",
description="A list with flags/option which should be removed",
multiple=True,
sysenvvar="PLATFORMIO_BUILD_UNFLAGS",
buildenvvar="BUILD_UNFLAGS",
),
ConfigEnvOption(
group="build",
name="src_filter",
description=(
"Control which source files should be included/excluded from a "
"build process"
),
multiple=True,
sysenvvar="PLATFORMIO_SRC_FILTER",
buildenvvar="SRC_FILTER",
default="+<*> -<.git/> -<.svn/>",
),
ConfigEnvOption(
group="build",
name="targets",
description="A custom list of targets for PlatformIO Build System",
multiple=True,
),
# Upload
ConfigEnvOption(
group="upload",
name="upload_port",
description=(
"An upload port which `uploader` tool uses for a firmware flashing"
),
sysenvvar="PLATFORMIO_UPLOAD_PORT",
buildenvvar="UPLOAD_PORT",
),
ConfigEnvOption(
group="upload",
name="upload_protocol",
description="A protocol that `uploader` tool uses to talk to a board",
buildenvvar="UPLOAD_PROTOCOL",
),
ConfigEnvOption(
group="upload",
name="upload_speed",
description=(
"A connection speed (baud rate) which `uploader` tool uses when "
"sending firmware to a board"
),
type=click.INT,
buildenvvar="UPLOAD_SPEED",
),
ConfigEnvOption(
group="upload",
name="upload_flags",
description="An extra flags for `uploader` tool",
multiple=True,
sysenvvar="PLATFORMIO_UPLOAD_FLAGS",
buildenvvar="UPLOAD_FLAGS",
),
ConfigEnvOption(
group="upload",
name="upload_resetmethod",
description="A custom reset method",
buildenvvar="UPLOAD_RESETMETHOD",
),
ConfigEnvOption(
group="upload",
name="upload_command",
description=(
"A custom upload command which overwrites a default from "
"development platform"
),
buildenvvar="UPLOADCMD",
),
# Monitor
ConfigEnvOption(
group="monitor",
name="monitor_port",
description="A port, a number or a device name",
),
ConfigEnvOption(
group="monitor",
name="monitor_speed",
description="A monitor speed (baud rate)",
type=click.INT,
oldnames=["monitor_baud"],
default=9600,
),
ConfigEnvOption(
group="monitor",
name="monitor_rts",
description="A monitor initial RTS line state",
type=click.IntRange(0, 1),
),
ConfigEnvOption(
group="monitor",
name="monitor_dtr",
description="A monitor initial DTR line state",
type=click.IntRange(0, 1),
),
ConfigEnvOption(
group="monitor",
name="monitor_flags",
description=(
"The extra flags and options for `platformio device monitor` "
"command"
),
multiple=True,
),
# Library
ConfigEnvOption(
group="library",
name="lib_deps",
description=(
"A list of project library dependencies which should be installed "
"automatically before a build process"
),
oldnames=["lib_use", "lib_force", "lib_install"],
multiple=True,
),
ConfigEnvOption(
group="library",
name="lib_ignore",
description=(
"A list of library names which should be ignored by "
"Library Dependency Finder (LDF)"
),
multiple=True,
),
ConfigEnvOption(
group="library",
name="lib_extra_dirs",
description=(
"A list of extra directories/storages where Library Dependency "
"Finder (LDF) will look for dependencies"
),
multiple=True,
sysenvvar="PLATFORMIO_LIB_EXTRA_DIRS",
),
ConfigEnvOption(
group="library",
name="lib_ldf_mode",
description=(
"Control how Library Dependency Finder (LDF) should analyze "
"dependencies (`#include` directives)"
),
type=click.Choice(["off", "chain", "deep", "chain+", "deep+"]),
default="chain",
),
ConfigEnvOption(
group="library",
name="lib_compat_mode",
description=(
"Configure a strictness (compatibility mode by frameworks, "
"development platforms) of Library Dependency Finder (LDF)"
),
type=click.Choice(["off", "soft", "strict"]),
default="soft",
),
ConfigEnvOption(
group="library",
name="lib_archive",
description=(
"Create an archive (`*.a`, static library) from the object files "
"and link it into a firmware (program)"
),
type=click.BOOL,
default=True,
),
# Check
ConfigEnvOption(
group="check",
name="check_tool",
description="A list of check tools used for analysis",
type=click.Choice(["cppcheck", "clangtidy"]),
multiple=True,
default=["cppcheck"],
),
ConfigEnvOption(
group="check",
name="check_patterns",
description=(
"Configure a list of target files or directories for checking "
"(Unix shell-style wildcards)"
),
multiple=True,
),
ConfigEnvOption(
group="check",
name="check_flags",
description="An extra flags to be passed to a check tool",
multiple=True,
),
ConfigEnvOption(
group="check",
name="check_severity",
description="List of defect severity types for analysis",
multiple=True,
type=click.Choice(["low", "medium", "high"]),
default=["low", "medium", "high"],
),
# Test
ConfigEnvOption(
group="test",
name="test_filter",
description="Process tests where the name matches specified patterns",
multiple=True,
),
ConfigEnvOption(
group="test",
name="test_ignore",
description="Ignore tests where the name matches specified patterns",
multiple=True,
),
ConfigEnvOption(
group="test",
name="test_port",
description="A serial port to communicate with a target device",
),
ConfigEnvOption(
group="test",
name="test_speed",
description="A connection speed (baud rate) to communicate with a target device",
type=click.INT,
),
ConfigEnvOption(group="test", name="test_transport", description="",),
ConfigEnvOption(
group="test",
name="test_build_project_src",
description="",
type=click.BOOL,
default=False,
),
# Debug
ConfigEnvOption(
group="debug",
name="debug_tool",
description="A name of debugging tool",
),
ConfigEnvOption(
group="debug",
name="debug_init_break",
description=(
"An initial breakpoint that makes program stop whenever a "
"certain point in the program is reached"
),
default="tbreak main",
),
ConfigEnvOption(
group="debug",
name="debug_init_cmds",
description="Initial commands to be passed to a back-end debugger",
multiple=True,
),
ConfigEnvOption(
group="debug",
name="debug_extra_cmds",
description="An extra commands to be passed to a back-end debugger",
multiple=True,
),
ConfigEnvOption(
group="debug",
name="debug_load_cmds",
description=(
"A list of commands to be used to load program/firmware "
"to a target device"
),
oldnames=["debug_load_cmd"],
multiple=True,
default=["load"],
),
ConfigEnvOption(
group="debug",
name="debug_load_mode",
description=(
"Allows one to control when PlatformIO should load debugging "
"firmware to the end target"
),
type=click.Choice(["always", "modified", "manual"]),
default="always",
),
ConfigEnvOption(
group="debug",
name="debug_server",
description="Allows one to setup a custom debugging server",
multiple=True,
),
ConfigEnvOption(
group="debug",
name="debug_port",
description=(
"A debugging port of a remote target (a serial device or "
"network address)"
),
),
ConfigEnvOption(
group="debug",
name="debug_svd_path",
description=(
"A custom path to SVD file which contains information about "
"device peripherals"
),
type=click.Path(exists=True, file_okay=True, dir_okay=False),
),
# Advanced
ConfigEnvOption(
group="advanced",
name="extends",
description=(
"Inherit configuration from other sections or build environments"
),
multiple=True,
),
ConfigEnvOption(
group="advanced",
name="extra_scripts",
description="A list of PRE and POST extra scripts",
oldnames=["extra_script"],
multiple=True,
sysenvvar="PLATFORMIO_EXTRA_SCRIPTS",
),
]
] ]
]) )
def get_config_options_schema():
return [opt.as_dict() for opt in ProjectOptions.values()]
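A sketch of how a schema consumer such as PIO Home might filter the output of get_config_options_schema(); the variable names are illustrative only.

schema = get_config_options_schema()
monitor_options = [item for item in schema if item["group"] == "monitor"]
print([item["name"] for item in monitor_options])
# e.g. ["monitor_port", "monitor_speed", "monitor_rts", "monitor_dtr", "monitor_flags"]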

View File

@@ -38,7 +38,6 @@ except ImportError:
class TelemetryBase(object):
    def __init__(self):
        self._params = {}
@@ -64,17 +63,17 @@ class MeasurementProtocol(TelemetryBase):
"event_category": "ec", "event_category": "ec",
"event_action": "ea", "event_action": "ea",
"event_label": "el", "event_label": "el",
"event_value": "ev" "event_value": "ev",
} }
def __init__(self): def __init__(self):
super(MeasurementProtocol, self).__init__() super(MeasurementProtocol, self).__init__()
self['v'] = 1 self["v"] = 1
self['tid'] = self.TID self["tid"] = self.TID
self['cid'] = app.get_cid() self["cid"] = app.get_cid()
try: try:
self['sr'] = "%dx%d" % click.get_terminal_size() self["sr"] = "%dx%d" % click.get_terminal_size()
except ValueError: except ValueError:
pass pass
@@ -93,7 +92,7 @@ class MeasurementProtocol(TelemetryBase):
        super(MeasurementProtocol, self).__setitem__(name, value)

    def _prefill_appinfo(self):
        self["av"] = __version__

        # gather dependent packages
        dpdata = []
@@ -102,10 +101,9 @@ class MeasurementProtocol(TelemetryBase):
        dpdata.append("Caller/%s" % app.get_session_var("caller_id"))
        if getenv("PLATFORMIO_IDE"):
            dpdata.append("IDE/%s" % getenv("PLATFORMIO_IDE"))
        self["an"] = " ".join(dpdata)

    def _prefill_custom_data(self):
        def _filter_args(items):
            result = []
            stop = False
@@ -119,17 +117,16 @@ class MeasurementProtocol(TelemetryBase):
            return result

        caller_id = str(app.get_session_var("caller_id"))
        self["cd1"] = util.get_systype()
        self["cd2"] = "Python/%s %s" % (platform.python_version(), platform.platform())
        # self['cd3'] = " ".join(_filter_args(sys.argv[1:]))
        self["cd4"] = (
            1 if (not util.is_ci() and (caller_id or not is_container())) else 0
        )
        if caller_id:
            self["cd5"] = caller_id.lower()

    def _prefill_screen_name(self):
        def _first_arg_from_list(args_, list_):
            for _arg in args_:
                if _arg in list_:
@@ -146,12 +143,27 @@ class MeasurementProtocol(TelemetryBase):
            return

        cmd_path = args[:1]
        if args[0] in (
            "platform",
            "platforms",
            "serialports",
            "device",
            "settings",
            "account",
        ):
            cmd_path = args[:2]
        if args[0] == "lib" and len(args) > 1:
            lib_subcmds = (
                "builtin",
                "install",
                "list",
                "register",
                "search",
                "show",
                "stats",
                "uninstall",
                "update",
            )
            sub_cmd = _first_arg_from_list(args[1:], lib_subcmds)
            if sub_cmd:
                cmd_path.append(sub_cmd)
@@ -165,24 +177,25 @@ class MeasurementProtocol(TelemetryBase):
                    sub_cmd = _first_arg_from_list(args[2:], remote2_subcmds)
                    if sub_cmd:
                        cmd_path.append(sub_cmd)
        self["screen_name"] = " ".join([p.title() for p in cmd_path])

    @staticmethod
    def _ignore_hit():
        if not app.get_setting("enable_telemetry"):
            return True
        if app.get_session_var("caller_id") and all(
            c in sys.argv for c in ("run", "idedata")
        ):
            return True
        return False

    def send(self, hittype):
        if self._ignore_hit():
            return

        self["t"] = hittype

        # correct queue time
        if "qt" in self._params and isinstance(self["qt"], float):
            self["qt"] = int((time() - self["qt"]) * 1000)

        MPDataPusher().push(self._params)
@@ -202,7 +215,7 @@ class MPDataPusher(object):
        # if network is off-line
        if self._http_offline:
            if "qt" not in item:
                item["qt"] = time()
            self._failedque.append(item)
            return
@@ -243,7 +256,7 @@ class MPDataPusher(object):
            item = self._queue.get()
            _item = item.copy()
            if "qt" not in _item:
                _item["qt"] = time()
            self._failedque.append(_item)
            if self._send_data(item):
                self._failedque.remove(_item)
@@ -259,7 +272,8 @@ class MPDataPusher(object):
"https://ssl.google-analytics.com/collect", "https://ssl.google-analytics.com/collect",
data=data, data=data,
headers=util.get_request_defheaders(), headers=util.get_request_defheaders(),
timeout=1) timeout=1,
)
r.raise_for_status() r.raise_for_status()
return True return True
except requests.exceptions.HTTPError as e: except requests.exceptions.HTTPError as e:
@@ -284,11 +298,10 @@ def on_command():
def measure_ci():
    event = {"category": "CI", "action": "NoName", "label": None}
    known_cis = ("TRAVIS", "APPVEYOR", "GITLAB_CI", "CIRCLECI", "SHIPPABLE", "DRONE")
    for name in known_cis:
        if getenv(name, "false").lower() == "true":
            event["action"] = name
            break
    on_event(**event)
@@ -307,32 +320,37 @@ def on_run_environment(options, targets):
def on_event(category, action, label=None, value=None, screen_name=None):
    mp = MeasurementProtocol()
    mp["event_category"] = category[:150]
    mp["event_action"] = action[:500]
    if label:
        mp["event_label"] = label[:500]
    if value:
        mp["event_value"] = int(value)
    if screen_name:
        mp["screen_name"] = screen_name[:2048]
    mp.send("event")
def on_exception(e):
    def _cleanup_description(text):
        text = text.replace("Traceback (most recent call last):", "")
        text = re.sub(
            r'File "([^"]+)"',
            lambda m: join(*m.group(1).split(sep)[-2:]),
            text,
            flags=re.M,
        )
        text = re.sub(r"\s+", " ", text, flags=re.M)
        return text.strip()

    skip_conditions = [
        isinstance(e, cls)
        for cls in (
            IOError,
            exception.ReturnErrorCode,
            exception.UserSideException,
            exception.PlatformIOProjectException,
        )
    ]
    try:
        skip_conditions.append("[API] Account: " in str(e))
@@ -340,14 +358,16 @@ def on_exception(e):
        e = ue
    if any(skip_conditions):
        return
    is_crash = any(
        [
            not isinstance(e, exception.PlatformioException),
            "Error" in e.__class__.__name__,
        ]
    )
    mp = MeasurementProtocol()
    description = _cleanup_description(format_exc() if is_crash else str(e))
    mp["exd"] = ("%s: %s" % (type(e).__name__, description))[:2048]
    mp["exf"] = 1 if is_crash else 0
    mp.send("exception")
@@ -373,7 +393,7 @@ def backup_reports(items):
    KEEP_MAX_REPORTS = 100
    tm = app.get_state_item("telemetry", {})
    if "backup" not in tm:
        tm["backup"] = []

    for params in items:
        # skip static options
@@ -383,28 +403,28 @@ def backup_reports(items):
        # store time in UNIX format
        if "qt" not in params:
            params["qt"] = time()
        elif not isinstance(params["qt"], float):
            params["qt"] = time() - (params["qt"] / 1000)

        tm["backup"].append(params)

    tm["backup"] = tm["backup"][KEEP_MAX_REPORTS * -1 :]
    app.set_state_item("telemetry", tm)


def resend_backuped_reports():
    tm = app.get_state_item("telemetry", {})
    if "backup" not in tm or not tm["backup"]:
        return False

    for report in tm["backup"]:
        mp = MeasurementProtocol()
        for key, value in report.items():
            mp[key] = value
        mp.send(report["t"])

    # clean
    tm["backup"] = []
    app.set_state_item("telemetry", tm)

    return True

View File

@@ -12,8 +12,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.

import os
from tarfile import open as tarfile_open
from time import mktime
from zipfile import ZipFile
@@ -24,7 +23,6 @@ from platformio import exception, util
class ArchiveBase(object):
    def __init__(self, arhfileobj):
        self._afo = arhfileobj
@@ -34,6 +32,9 @@ class ArchiveBase(object):
    def get_item_filename(self, item):
        raise NotImplementedError()

    def is_link(self, item):
        raise NotImplementedError()

    def extract_item(self, item, dest_dir):
        self._afo.extract(item, dest_dir)
        self.after_extract(item, dest_dir)
@@ -46,7 +47,6 @@ class ArchiveBase(object):
class TARArchive(ArchiveBase):
    def __init__(self, archpath):
        super(TARArchive, self).__init__(tarfile_open(archpath))
@@ -57,12 +57,37 @@ class TARArchive(ArchiveBase):
        return item.name

    @staticmethod
    def is_link(item):
        return item.islnk() or item.issym()
@staticmethod
def resolve_path(path):
return os.path.realpath(os.path.abspath(path))
def is_bad_path(self, path, base):
return not self.resolve_path(os.path.join(base, path)).startswith(base)
def is_bad_link(self, item, base):
return not self.resolve_path(
os.path.join(os.path.join(base, os.path.dirname(item.name)), item.linkname)
).startswith(base)
def extract_item(self, item, dest_dir):
bad_conds = [
self.is_bad_path(item.name, dest_dir),
self.is_link(item) and self.is_bad_link(item, dest_dir),
]
if not any(bad_conds):
super(TARArchive, self).extract_item(item, dest_dir)
else:
click.secho(
"Blocked insecure item `%s` from TAR archive" % item.name,
fg="red",
err=True,
)
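The same path check, restated standalone to show what the new guard is meant to reject; the archive member names and base directory below are hypothetical.

import os

def is_bad_path(path, base):
    # mirrors TARArchive.is_bad_path(): the resolved target must stay inside base
    resolved = os.path.realpath(os.path.abspath(os.path.join(base, path)))
    return not resolved.startswith(base)

print(is_bad_path("src/main.c", "/tmp/pkg"))        # False: extracted normally
print(is_bad_path("../../etc/passwd", "/tmp/pkg"))  # True: blocked as insecure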
class ZIPArchive(ArchiveBase):
    def __init__(self, archpath):
        super(ZIPArchive, self).__init__(ZipFile(archpath))
@@ -70,12 +95,18 @@ class ZIPArchive(ArchiveBase):
    def preserve_permissions(item, dest_dir):
        attrs = item.external_attr >> 16
        if attrs:
            os.chmod(os.path.join(dest_dir, item.filename), attrs)

    @staticmethod
    def preserve_mtime(item, dest_dir):
        util.change_filemtime(
            os.path.join(dest_dir, item.filename),
            mktime(tuple(item.date_time) + tuple([0, 0, 0])),
        )
@staticmethod
def is_link(_):
return False
    def get_items(self):
        return self._afo.infolist()
@@ -83,22 +114,18 @@ class ZIPArchive(ArchiveBase):
    def get_item_filename(self, item):
        return item.filename

    def after_extract(self, item, dest_dir):
        self.preserve_permissions(item, dest_dir)
        self.preserve_mtime(item, dest_dir)


class FileUnpacker(object):
    def __init__(self, archpath):
        self.archpath = archpath
        self._unpacker = None

    def __enter__(self):
        if self.archpath.lower().endswith((".gz", ".bz2", ".tar")):
            self._unpacker = TARArchive(self.archpath)
        elif self.archpath.lower().endswith(".zip"):
            self._unpacker = ZIPArchive(self.archpath)
@@ -110,7 +137,7 @@ class FileUnpacker(object):
        if self._unpacker:
            self._unpacker.close()

    def unpack(self, dest_dir=".", with_progress=True, check_unpacked=True):
        assert self._unpacker
        if not with_progress:
            click.echo("Unpacking...")
@@ -122,12 +149,15 @@ class FileUnpacker(object):
            for item in pb:
                self._unpacker.extract_item(item, dest_dir)

        if not check_unpacked:
            return True

        # check on disk
        for item in self._unpacker.get_items():
            filename = self._unpacker.get_item_filename(item)
            item_path = os.path.join(dest_dir, filename)
            try:
                if not self._unpacker.is_link(item) and not os.path.exists(item_path):
                    raise exception.ExtractArchiveItemError(filename, dest_dir)
            except NotImplementedError:
                pass
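A usage sketch of the extended unpack() signature; the archive name and destination directory are hypothetical.

with FileUnpacker("framework-foo.tar.gz") as fu:
    fu.unpack("packages/framework-foo", with_progress=False, check_unpacked=False)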

View File

@@ -40,7 +40,6 @@ from platformio.proc import is_ci # pylint: disable=unused-import
class memoized(object):
    def __init__(self, expire=0):
        expire = str(expire)
        if expire.isdigit():
@@ -51,13 +50,12 @@ class memoized(object):
        self.cache = {}

    def __call__(self, func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            key = str(args) + str(kwargs)
            if key not in self.cache or (
                self.expire > 0 and self.cache[key][0] < time.time() - self.expire
            ):
                self.cache[key] = (time.time(), func(*args, **kwargs))
            return self.cache[key][1]
@@ -69,13 +67,11 @@ class memoized(object):
class throttle(object):
    def __init__(self, threshhold):
        self.threshhold = threshhold  # milliseconds
        self.last = 0

    def __call__(self, func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            diff = int(round((time.time() - self.last) * 1000))
@@ -130,6 +126,7 @@ def change_filemtime(path, mtime):
def get_serial_ports(filter_hwid=False):
    try:
        # pylint: disable=import-outside-toplevel
        from serial.tools.list_ports import comports
    except ImportError:
        raise exception.GetSerialPortsError(os.name)
@@ -166,17 +163,14 @@ def get_logical_devices():
    if WINDOWS:
        try:
            result = exec_command(
                ["wmic", "logicaldisk", "get", "name,VolumeName"]
            ).get("out", "")
            devicenamere = re.compile(r"^([A-Z]{1}\:)\s*(\S+)?")
            for line in result.split("\n"):
                match = devicenamere.match(line.strip())
                if not match:
                    continue
                items.append({"path": match.group(1) + "\\", "name": match.group(2)})
            return items
        except WindowsError:  # pylint: disable=undefined-variable
            pass
@@ -192,35 +186,31 @@ def get_logical_devices():
        match = devicenamere.match(line.strip())
        if not match:
            continue
        items.append({"path": match.group(1), "name": os.path.basename(match.group(1))})
    return items
def get_mdns_services():
    # pylint: disable=import-outside-toplevel
    try:
        import zeroconf
    except ImportError:
        from site import addsitedir
        from platformio.managers.core import get_core_package_dir

        contrib_pysite_dir = get_core_package_dir("contrib-pysite")
        addsitedir(contrib_pysite_dir)
        sys.path.insert(0, contrib_pysite_dir)
        import zeroconf  # pylint: disable=import-outside-toplevel

    class mDNSListener(object):
        def __init__(self):
            self._zc = zeroconf.Zeroconf(interfaces=zeroconf.InterfaceChoice.All)
            self._found_types = []
            self._found_services = []

        def __enter__(self):
            zeroconf.ServiceBrowser(self._zc, "_services._dns-sd._udp.local.", self)
            return self

        def __exit__(self, etype, value, traceback):
@@ -233,8 +223,7 @@ def get_mdns_services():
             try:
                 assert zeroconf.service_type_name(name)
                 assert str(name)
-            except (AssertionError, UnicodeError,
-                    zeroconf.BadTypeInNameException):
+            except (AssertionError, UnicodeError, zeroconf.BadTypeInNameException):
                 return
             if name not in self._found_types:
                 self._found_types.append(name)
@@ -255,29 +244,29 @@ def get_mdns_services():
             if service.properties:
                 try:
                     properties = {
-                        k.decode("utf8"):
-                        v.decode("utf8") if isinstance(v, bytes) else v
+                        k.decode("utf8"): v.decode("utf8")
+                        if isinstance(v, bytes)
+                        else v
                         for k, v in service.properties.items()
                     }
                     json.dumps(properties)
                 except UnicodeDecodeError:
                     properties = None
-            items.append({
-                "type":
-                service.type,
-                "name":
-                service.name,
-                "ip":
-                ".".join([
-                    str(c if isinstance(c, int) else ord(c))
-                    for c in service.address
-                ]),
-                "port":
-                service.port,
-                "properties":
-                properties
-            })
+            items.append(
+                {
+                    "type": service.type,
+                    "name": service.name,
+                    "ip": ".".join(
+                        [
+                            str(c if isinstance(c, int) else ord(c))
+                            for c in service.address
+                        ]
+                    ),
+                    "port": service.port,
+                    "properties": properties,
+                }
+            )
     return items
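A minimal sketch of how the returned list is typically consumed (the import path matches the file being diffed; a working zeroconf setup is assumed):

    from platformio import util

    for service in util.get_mdns_services():
        # keys are the ones assembled in the hunk above
        print("%s -> %s:%d" % (service["name"], service["ip"], service["port"]))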
@@ -293,11 +282,9 @@ def _api_request_session():
 @throttle(500)
 def _get_api_result(
-        url,  # pylint: disable=too-many-branches
-        params=None,
-        data=None,
-        auth=None):
-    from platformio.app import get_setting
+    url, params=None, data=None, auth=None  # pylint: disable=too-many-branches
+):
+    from platformio.app import get_setting  # pylint: disable=import-outside-toplevel
 
     result = {}
     r = None
@@ -311,30 +298,29 @@ def _get_api_result(
     try:
         if data:
-            r = _api_request_session().post(url,
-                                            params=params,
-                                            data=data,
-                                            headers=headers,
-                                            auth=auth,
-                                            verify=verify_ssl)
+            r = _api_request_session().post(
+                url,
+                params=params,
+                data=data,
+                headers=headers,
+                auth=auth,
+                verify=verify_ssl,
+            )
         else:
-            r = _api_request_session().get(url,
-                                           params=params,
-                                           headers=headers,
-                                           auth=auth,
-                                           verify=verify_ssl)
+            r = _api_request_session().get(
+                url, params=params, headers=headers, auth=auth, verify=verify_ssl
+            )
         result = r.json()
         r.raise_for_status()
         return r.text
     except requests.exceptions.HTTPError as e:
         if result and "message" in result:
-            raise exception.APIRequestError(result['message'])
+            raise exception.APIRequestError(result["message"])
         if result and "errors" in result:
-            raise exception.APIRequestError(result['errors'][0]['title'])
+            raise exception.APIRequestError(result["errors"][0]["title"])
         raise exception.APIRequestError(e)
     except ValueError:
-        raise exception.APIRequestError("Invalid response: %s" %
-                                        r.text.encode("utf-8"))
+        raise exception.APIRequestError("Invalid response: %s" % r.text.encode("utf-8"))
     finally:
         if r:
             r.close()
@@ -342,11 +328,13 @@ def _get_api_result(
 def get_api_result(url, params=None, data=None, auth=None, cache_valid=None):
-    from platformio.app import ContentCache
+    from platformio.app import ContentCache  # pylint: disable=import-outside-toplevel
 
     total = 0
     max_retries = 5
-    cache_key = (ContentCache.key_from_args(url, params, data, auth)
-                 if cache_valid else None)
+    cache_key = (
+        ContentCache.key_from_args(url, params, data, auth) if cache_valid else None
+    )
     while total < max_retries:
         try:
             with ContentCache() as cc:
@@ -363,24 +351,25 @@ def get_api_result(url, params=None, data=None, auth=None, cache_valid=None):
                 with ContentCache() as cc:
                     cc.set(cache_key, result, cache_valid)
             return json.loads(result)
-        except (requests.exceptions.ConnectionError,
-                requests.exceptions.Timeout) as e:
+        except (requests.exceptions.ConnectionError, requests.exceptions.Timeout) as e:
             total += 1
             if not PlatformioCLI.in_silence():
                 click.secho(
                     "[API] ConnectionError: {0} (incremented retry: max={1}, "
                     "total={2})".format(e, max_retries, total),
-                    fg="yellow")
+                    fg="yellow",
+                )
             time.sleep(2 * total)
 
     raise exception.APIRequestError(
-        "Could not connect to PlatformIO API Service. "
-        "Please try later.")
+        "Could not connect to PlatformIO API Service. Please try later."
+    )
 PING_INTERNET_IPS = [
     "192.30.253.113",  # github.com
-    "193.222.52.25"  # dl.platformio.org
+    "31.28.1.238",  # dl.platformio.org
+    "193.222.52.25",  # dl.platformio.org
 ]
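Taken together, get_api_result() retries transient connection errors with a growing sleep (time.sleep(2 * total), up to five attempts) and can serve responses from the local content cache. A sketch of a typical call; the endpoint and cache lifetime are illustrative, not taken from this diff:

    from platformio.util import get_api_result

    # JSON document from the PlatformIO API, cached locally for one day
    boards = get_api_result("/boards", cache_valid="1d")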
@@ -391,12 +380,9 @@ def _internet_on():
     for ip in PING_INTERNET_IPS:
         try:
             if os.getenv("HTTP_PROXY", os.getenv("HTTPS_PROXY")):
-                requests.get("http://%s" % ip,
-                             allow_redirects=False,
-                             timeout=timeout)
+                requests.get("http://%s" % ip, allow_redirects=False, timeout=timeout)
             else:
-                socket.socket(socket.AF_INET, socket.SOCK_STREAM).connect(
-                    (ip, 80))
+                socket.socket(socket.AF_INET, socket.SOCK_STREAM).connect((ip, 80))
             return True
         except:  # pylint: disable=bare-except
             pass
@@ -438,8 +424,7 @@ def merge_dicts(d1, d2, path=None):
     if path is None:
         path = []
     for key in d2:
-        if (key in d1 and isinstance(d1[key], dict)
-                and isinstance(d2[key], dict)):
+        if key in d1 and isinstance(d1[key], dict) and isinstance(d2[key], dict):
             merge_dicts(d1[key], d2[key], path + [str(key)])
         else:
             d1[key] = d2[key]
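merge_dicts() does an in-place recursive merge of nested dictionaries, as the condition above shows; a quick illustration (import path assumed from the file being diffed):

    from platformio.util import merge_dicts

    d1 = {"build": {"flags": "-Os"}, "name": "a"}
    d2 = {"build": {"defines": ["DEBUG"]}, "name": "b"}
    merge_dicts(d1, d2)
    # d1 == {'build': {'flags': '-Os', 'defines': ['DEBUG']}, 'name': 'b'}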
@@ -450,9 +435,7 @@ def print_labeled_bar(label, is_error=False, fg=None):
     terminal_width, _ = click.get_terminal_size()
     width = len(click.unstyle(label))
     half_line = "=" * int((terminal_width - width - 2) / 2)
-    click.secho("%s %s %s" % (half_line, label, half_line),
-                fg=fg,
-                err=is_error)
+    click.secho("%s %s %s" % (half_line, label, half_line), fg=fg, err=is_error)
 
 
 def humanize_duration_time(duration):

View File

@@ -27,7 +27,6 @@ except ImportError:
 class VCSClientFactory(object):
     @staticmethod
     def newClient(src_dir, remote_url, silent=False):
         result = urlparse(remote_url)
@@ -38,15 +37,14 @@ class VCSClientFactory(object):
             remote_url = remote_url[4:]
         elif "+" in result.scheme:
             type_, _ = result.scheme.split("+", 1)
-            remote_url = remote_url[len(type_) + 1:]
+            remote_url = remote_url[len(type_) + 1 :]
         if "#" in remote_url:
             remote_url, tag = remote_url.rsplit("#", 1)
         if not type_:
-            raise PlatformioException("VCS: Unknown repository type %s" %
-                                      remote_url)
-        obj = getattr(modules[__name__],
-                      "%sClient" % type_.title())(src_dir, remote_url, tag,
-                                                  silent)
+            raise PlatformioException("VCS: Unknown repository type %s" % remote_url)
+        obj = getattr(modules[__name__], "%sClient" % type_.title())(
+            src_dir, remote_url, tag, silent
+        )
         assert isinstance(obj, VCSClientBase)
         return obj
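A rough sketch of how the factory resolves a typed URL (the repository URL and target directory are made up for illustration):

    from platformio.vcsclient import VCSClientFactory  # assumed import path

    # "git+https://..." selects GitClient; the "#develop" suffix becomes the tag/branch
    client = VCSClientFactory.newClient(
        "/tmp/pkg-src", "git+https://github.com/example/repo.git#develop"
    )
    client.export()  # clone into /tmp/pkg-src (a later hunk also updates submodules)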
@@ -71,8 +69,8 @@ class VCSClientBase(object):
             assert self.run_cmd(["--version"])
         except (AssertionError, OSError, PlatformioException):
             raise UserSideException(
-                "VCS: `%s` client is not installed in your system" %
-                self.command)
+                "VCS: `%s` client is not installed in your system" % self.command
+            )
         return True
 
     @property
@@ -98,24 +96,23 @@ class VCSClientBase(object):
     def run_cmd(self, args, **kwargs):
         args = [self.command] + args
         if "cwd" not in kwargs:
-            kwargs['cwd'] = self.src_dir
+            kwargs["cwd"] = self.src_dir
         try:
             check_call(args, **kwargs)
             return True
         except CalledProcessError as e:
-            raise PlatformioException("VCS: Could not process command %s" %
-                                      e.cmd)
+            raise PlatformioException("VCS: Could not process command %s" % e.cmd)
 
     def get_cmd_output(self, args, **kwargs):
         args = [self.command] + args
         if "cwd" not in kwargs:
-            kwargs['cwd'] = self.src_dir
+            kwargs["cwd"] = self.src_dir
         result = exec_command(args, **kwargs)
-        if result['returncode'] == 0:
-            return result['out'].strip()
+        if result["returncode"] == 0:
+            return result["out"].strip()
         raise PlatformioException(
-            "VCS: Could not receive an output from `%s` command (%s)" %
-            (args, result))
+            "VCS: Could not receive an output from `%s` command (%s)" % (args, result)
+        )
 class GitClient(VCSClientBase):
@@ -127,7 +124,8 @@ class GitClient(VCSClientBase):
             return VCSClientBase.check_client(self)
         except UserSideException:
             raise UserSideException(
-                "Please install Git client from https://git-scm.com/downloads")
+                "Please install Git client from https://git-scm.com/downloads"
+            )
 
     def get_branches(self):
         output = self.get_cmd_output(["branch"])
@@ -166,7 +164,10 @@ class GitClient(VCSClientBase):
         args += [self.remote_url, self.src_dir]
         assert self.run_cmd(args)
         if is_commit:
-            return self.run_cmd(["reset", "--hard", self.tag])
+            assert self.run_cmd(["reset", "--hard", self.tag])
+            return self.run_cmd(
+                ["submodule", "update", "--init", "--recursive", "--force"]
+            )
         return True
 
     def update(self):
@@ -232,7 +233,8 @@ class SvnClient(VCSClientBase):
     def get_current_revision(self):
         output = self.get_cmd_output(
-            ["info", "--non-interactive", "--trust-server-cert", "-r", "HEAD"])
+            ["info", "--non-interactive", "--trust-server-cert", "-r", "HEAD"]
+        )
         for line in output.split("\n"):
             line = line.strip()
             if line.startswith("Revision:"):

View File

@@ -25,45 +25,35 @@
 #
 # CP210X USB UART
-SUBSYSTEMS=="usb", ATTRS{idVendor}=="10c4", ATTRS{idProduct}=="ea60", MODE:="0666"
+ATTRS{idVendor}=="10c4", ATTRS{idProduct}=="ea60", MODE:="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
-# FT232R USB UART
-SUBSYSTEMS=="usb", ATTRS{idVendor}=="0403", ATTRS{idProduct}=="6001", MODE:="0666"
 # FT231XS USB UART
-SUBSYSTEMS=="usb", ATTRS{idVendor}=="0403", ATTRS{idProduct}=="6015", MODE:="0666"
+ATTRS{idVendor}=="0403", ATTRS{idProduct}=="6015", MODE:="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # Prolific Technology, Inc. PL2303 Serial Port
-SUBSYSTEMS=="usb", ATTRS{idVendor}=="067b", ATTRS{idProduct}=="2303", MODE:="0666"
+ATTRS{idVendor}=="067b", ATTRS{idProduct}=="2303", MODE:="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # QinHeng Electronics HL-340 USB-Serial adapter
-SUBSYSTEMS=="usb", ATTRS{idVendor}=="1a86", ATTRS{idProduct}=="7523", MODE:="0666"
+ATTRS{idVendor}=="1a86", ATTRS{idProduct}=="7523", MODE:="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # Arduino boards
-SUBSYSTEMS=="usb", ATTRS{idVendor}=="2341", ATTRS{idProduct}=="[08][02]*", MODE:="0666"
+ATTRS{idVendor}=="2341", ATTRS{idProduct}=="[08][02]*", MODE:="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
-SUBSYSTEMS=="usb", ATTRS{idVendor}=="2a03", ATTRS{idProduct}=="[08][02]*", MODE:="0666"
+ATTRS{idVendor}=="2a03", ATTRS{idProduct}=="[08][02]*", MODE:="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # Arduino SAM-BA
-ATTRS{idVendor}=="03eb", ATTRS{idProduct}=="6124", ENV{ID_MM_DEVICE_IGNORE}="1"
-ATTRS{idVendor}=="03eb", ATTRS{idProduct}=="6124", ENV{MTP_NO_PROBE}="1"
-SUBSYSTEMS=="usb", ATTRS{idVendor}=="03eb", ATTRS{idProduct}=="6124", MODE:="0666"
-KERNEL=="ttyACM*", ATTRS{idVendor}=="03eb", ATTRS{idProduct}=="6124", MODE:="0666"
+ATTRS{idVendor}=="03eb", ATTRS{idProduct}=="6124", MODE:="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{MTP_NO_PROBE}="1"
 # Digistump boards
-SUBSYSTEMS=="usb", ATTRS{idVendor}=="16d0", ATTRS{idProduct}=="0753", MODE:="0666"
-KERNEL=="ttyACM*", ATTRS{idVendor}=="16d0", ATTRS{idProduct}=="0753", MODE:="0666", ENV{ID_MM_DEVICE_IGNORE}="1"
+ATTRS{idVendor}=="16d0", ATTRS{idProduct}=="0753", MODE:="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
-# STM32 discovery boards, with onboard st/linkv2
-SUBSYSTEMS=="usb", ATTRS{idVendor}=="0483", ATTRS{idProduct}=="374?", MODE:="0666"
 # Maple with DFU
-SUBSYSTEMS=="usb", ATTRS{idVendor}=="1eaf", ATTRS{idProduct}=="000[34]", MODE:="0666"
+ATTRS{idVendor}=="1eaf", ATTRS{idProduct}=="000[34]", MODE:="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # USBtiny
-SUBSYSTEMS=="usb", ATTRS{idProduct}=="0c9f", ATTRS{idVendor}=="1781", MODE="0666"
+ATTRS{idProduct}=="0c9f", ATTRS{idVendor}=="1781", MODE:="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # USBasp V2.0
-SUBSYSTEMS=="usb", ATTRS{idVendor}=="16c0", ATTRS{idProduct}=="05dc", MODE:="0666"
+ATTRS{idVendor}=="16c0", ATTRS{idProduct}=="05dc", MODE:="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # Teensy boards
 ATTRS{idVendor}=="16c0", ATTRS{idProduct}=="04[789B]?", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
@@ -72,191 +62,108 @@ SUBSYSTEMS=="usb", ATTRS{idVendor}=="16c0", ATTRS{idProduct}=="04[789ABCD]?", MO
 KERNEL=="ttyACM*", ATTRS{idVendor}=="16c0", ATTRS{idProduct}=="04[789B]?", MODE:="0666"
 #TI Stellaris Launchpad
-SUBSYSTEMS=="usb", ATTRS{idVendor}=="1cbe", ATTRS{idProduct}=="00fd", MODE="0666"
+ATTRS{idVendor}=="1cbe", ATTRS{idProduct}=="00fd", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 #TI MSP430 Launchpad
-SUBSYSTEMS=="usb", ATTRS{idVendor}=="0451", ATTRS{idProduct}=="f432", MODE="0666"
+ATTRS{idVendor}=="0451", ATTRS{idProduct}=="f432", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
+#GD32V DFU Bootloader
+ATTRS{idVendor}=="28e9", ATTRS{idProduct}=="0189", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 #
 # Debuggers
 #
 # Black Magic Probe
-SUBSYSTEM=="tty", ATTRS{interface}=="Black Magic GDB Server"
+SUBSYSTEM=="tty", ATTRS{interface}=="Black Magic GDB Server", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
-SUBSYSTEM=="tty", ATTRS{interface}=="Black Magic UART Port"
+SUBSYSTEM=="tty", ATTRS{interface}=="Black Magic UART Port", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # opendous and estick
-ATTRS{idVendor}=="03eb", ATTRS{idProduct}=="204f", MODE="0666"
+ATTRS{idVendor}=="03eb", ATTRS{idProduct}=="204f", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # Original FT232/FT245 VID:PID
-ATTRS{idVendor}=="0403", ATTRS{idProduct}=="6001", MODE="0666"
+ATTRS{idVendor}=="0403", ATTRS{idProduct}=="6001", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # Original FT2232 VID:PID
-ATTRS{idVendor}=="0403", ATTRS{idProduct}=="6010", MODE="0666"
+ATTRS{idVendor}=="0403", ATTRS{idProduct}=="6010", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # Original FT4232 VID:PID
-ATTRS{idVendor}=="0403", ATTRS{idProduct}=="6011", MODE="0666"
+ATTRS{idVendor}=="0403", ATTRS{idProduct}=="6011", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # Original FT232H VID:PID
-ATTRS{idVendor}=="0403", ATTRS{idProduct}=="6014", MODE="0666"
+ATTRS{idVendor}=="0403", ATTRS{idProduct}=="6014", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # DISTORTEC JTAG-lock-pick Tiny 2
-ATTRS{idVendor}=="0403", ATTRS{idProduct}=="8220", MODE="0666"
+ATTRS{idVendor}=="0403", ATTRS{idProduct}=="8220", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # TUMPA, TUMPA Lite
-ATTRS{idVendor}=="0403", ATTRS{idProduct}=="8a98", MODE="0666"
+ATTRS{idVendor}=="0403", ATTRS{idProduct}=="8a98", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
-ATTRS{idVendor}=="0403", ATTRS{idProduct}=="8a99", MODE="0666"
+ATTRS{idVendor}=="0403", ATTRS{idProduct}=="8a99", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # XDS100v2
-ATTRS{idVendor}=="0403", ATTRS{idProduct}=="a6d0", MODE="0666"
+ATTRS{idVendor}=="0403", ATTRS{idProduct}=="a6d0", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # Xverve Signalyzer Tool (DT-USB-ST), Signalyzer LITE (DT-USB-SLITE)
-ATTRS{idVendor}=="0403", ATTRS{idProduct}=="bca0", MODE="0666"
+ATTRS{idVendor}=="0403", ATTRS{idProduct}=="bca0", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
-ATTRS{idVendor}=="0403", ATTRS{idProduct}=="bca1", MODE="0666"
+ATTRS{idVendor}=="0403", ATTRS{idProduct}=="bca1", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # TI/Luminary Stellaris Evaluation Board FTDI (several)
-ATTRS{idVendor}=="0403", ATTRS{idProduct}=="bcd9", MODE="0666"
+ATTRS{idVendor}=="0403", ATTRS{idProduct}=="bcd9", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # TI/Luminary Stellaris In-Circuit Debug Interface FTDI (ICDI) Board
-ATTRS{idVendor}=="0403", ATTRS{idProduct}=="bcda", MODE="0666"
+ATTRS{idVendor}=="0403", ATTRS{idProduct}=="bcda", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # egnite Turtelizer 2
-ATTRS{idVendor}=="0403", ATTRS{idProduct}=="bdc8", MODE="0666"
+ATTRS{idVendor}=="0403", ATTRS{idProduct}=="bdc8", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # Section5 ICEbear
-ATTRS{idVendor}=="0403", ATTRS{idProduct}=="c140", MODE="0666"
+ATTRS{idVendor}=="0403", ATTRS{idProduct}=="c140", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
-ATTRS{idVendor}=="0403", ATTRS{idProduct}=="c141", MODE="0666"
+ATTRS{idVendor}=="0403", ATTRS{idProduct}=="c141", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # Amontec JTAGkey and JTAGkey-tiny
-ATTRS{idVendor}=="0403", ATTRS{idProduct}=="cff8", MODE="0666"
+ATTRS{idVendor}=="0403", ATTRS{idProduct}=="cff8", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # TI ICDI
-ATTRS{idVendor}=="0451", ATTRS{idProduct}=="c32a", MODE="0666"
+ATTRS{idVendor}=="0451", ATTRS{idProduct}=="c32a", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
-# STLink v1
-ATTRS{idVendor}=="0483", ATTRS{idProduct}=="3744", MODE="0666"
+# STLink probes
+ATTRS{idVendor}=="0483", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
-# STLink v2
-ATTRS{idVendor}=="0483", ATTRS{idProduct}=="3748", MODE="0666"
-# STLink v2-1
-ATTRS{idVendor}=="0483", ATTRS{idProduct}=="374b", MODE="0666"
 # Hilscher NXHX Boards
-ATTRS{idVendor}=="0640", ATTRS{idProduct}=="0028", MODE="0666"
+ATTRS{idVendor}=="0640", ATTRS{idProduct}=="0028", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
-# Hitex STR9-comStick
-ATTRS{idVendor}=="0640", ATTRS{idProduct}=="002c", MODE="0666"
-# Hitex STM32-PerformanceStick
-ATTRS{idVendor}=="0640", ATTRS{idProduct}=="002d", MODE="0666"
+# Hitex probes
+ATTRS{idVendor}=="0640", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # Altera USB Blaster
-ATTRS{idVendor}=="09fb", ATTRS{idProduct}=="6001", MODE="0666"
+ATTRS{idVendor}=="09fb", ATTRS{idProduct}=="6001", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # Amontec JTAGkey-HiSpeed
-ATTRS{idVendor}=="0fbb", ATTRS{idProduct}=="1000", MODE="0666"
+ATTRS{idVendor}=="0fbb", ATTRS{idProduct}=="1000", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # SEGGER J-Link
-ATTRS{idVendor}=="1366", ATTRS{idProduct}=="0101", MODE="0666"
+ATTRS{idVendor}=="1366", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
-ATTRS{idVendor}=="1366", ATTRS{idProduct}=="0102", MODE="0666"
-ATTRS{idVendor}=="1366", ATTRS{idProduct}=="0103", MODE="0666"
-ATTRS{idVendor}=="1366", ATTRS{idProduct}=="0104", MODE="0666"
-ATTRS{idVendor}=="1366", ATTRS{idProduct}=="0105", MODE="0666"
-ATTRS{idVendor}=="1366", ATTRS{idProduct}=="0107", MODE="0666"
-ATTRS{idVendor}=="1366", ATTRS{idProduct}=="0108", MODE="0666"
-ATTRS{idVendor}=="1366", ATTRS{idProduct}=="1010", MODE="0666"
-ATTRS{idVendor}=="1366", ATTRS{idProduct}=="1011", MODE="0666"
-ATTRS{idVendor}=="1366", ATTRS{idProduct}=="1012", MODE="0666"
-ATTRS{idVendor}=="1366", ATTRS{idProduct}=="1013", MODE="0666"
-ATTRS{idVendor}=="1366", ATTRS{idProduct}=="1014", MODE="0666"
-ATTRS{idVendor}=="1366", ATTRS{idProduct}=="1015", MODE="0666"
-ATTRS{idVendor}=="1366", ATTRS{idProduct}=="1016", MODE="0666"
-ATTRS{idVendor}=="1366", ATTRS{idProduct}=="1017", MODE="0666"
-ATTRS{idVendor}=="1366", ATTRS{idProduct}=="1018", MODE="0666"
 # Raisonance RLink
-ATTRS{idVendor}=="138e", ATTRS{idProduct}=="9000", MODE="0666"
+ATTRS{idVendor}=="138e", ATTRS{idProduct}=="9000", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # Debug Board for Neo1973
-ATTRS{idVendor}=="1457", ATTRS{idProduct}=="5118", MODE="0666"
+ATTRS{idVendor}=="1457", ATTRS{idProduct}=="5118", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
-# Olimex ARM-USB-OCD
-ATTRS{idVendor}=="15ba", ATTRS{idProduct}=="0003", MODE="0666"
+# Olimex probes
+ATTRS{idVendor}=="15ba", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
-# Olimex ARM-USB-OCD-TINY
-ATTRS{idVendor}=="15ba", ATTRS{idProduct}=="0004", MODE="0666"
-# Olimex ARM-JTAG-EW
-ATTRS{idVendor}=="15ba", ATTRS{idProduct}=="001e", MODE="0666"
-# Olimex ARM-USB-OCD-TINY-H
-ATTRS{idVendor}=="15ba", ATTRS{idProduct}=="002a", MODE="0666"
-# Olimex ARM-USB-OCD-H
-ATTRS{idVendor}=="15ba", ATTRS{idProduct}=="002b", MODE="0666"
 # USBprog with OpenOCD firmware
-ATTRS{idVendor}=="1781", ATTRS{idProduct}=="0c63", MODE="0666"
+ATTRS{idVendor}=="1781", ATTRS{idProduct}=="0c63", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # TI/Luminary Stellaris In-Circuit Debug Interface (ICDI) Board
-ATTRS{idVendor}=="1cbe", ATTRS{idProduct}=="00fd", MODE="0666"
+ATTRS{idVendor}=="1cbe", ATTRS{idProduct}=="00fd", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # Marvell Sheevaplug
-ATTRS{idVendor}=="9e88", ATTRS{idProduct}=="9e8f", MODE="0666"
+ATTRS{idVendor}=="9e88", ATTRS{idProduct}=="9e8f", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # Keil Software, Inc. ULink
-ATTRS{idVendor}=="c251", ATTRS{idProduct}=="2710", MODE="0666"
+ATTRS{idVendor}=="c251", ATTRS{idProduct}=="2710", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
 # CMSIS-DAP compatible adapters
-ATTRS{product}=="*CMSIS-DAP*", MODE="0666"
+ATTRS{product}=="*CMSIS-DAP*", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
-#SEGGER J-LIK
-ATTR{idProduct}=="1001", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1002", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1003", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1004", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1005", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1006", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1007", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1008", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1009", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="100a", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="100b", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="100c", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="100d", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="100e", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="100f", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1010", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1011", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1012", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1013", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1014", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1015", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1016", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1017", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1018", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1019", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="101a", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="101b", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="101c", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="101d", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="101e", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="101f", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1020", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1021", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1022", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1023", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1024", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1025", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1026", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1027", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1028", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="1029", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="102a", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="102b", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="102c", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="102d", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="102e", ATTR{idVendor}=="1366", MODE="0666"
-ATTR{idProduct}=="102f", ATTR{idVendor}=="1366", MODE="0666"

View File

@@ -13,16 +13,22 @@
 # limitations under the License.
 
 import os
-import urlparse
 from os.path import dirname, isdir, isfile, join, realpath
 from sys import exit as sys_exit
 from sys import path
 
 path.append("..")
 
+import click
+
 from platformio import fs, util
 from platformio.managers.platform import PlatformFactory, PlatformManager
 
+try:
+    from urlparse import ParseResult, urlparse, urlunparse
+except ImportError:
+    from urllib.parse import ParseResult, urlparse, urlunparse
+
 RST_COPYRIGHT = """.. Copyright (c) 2014-present PlatformIO <contact@platformio.org>
 
     Licensed under the Apache License, Version 2.0 (the "License");
     you may not use this file except in compliance with the License.
@@ -48,14 +54,14 @@ def is_compat_platform_and_framework(platform, framework):
 def campaign_url(url, source="platformio", medium="docs"):
-    data = urlparse.urlparse(url)
+    data = urlparse(url)
     query = data.query
     if query:
         query += "&"
     query += "utm_source=%s&utm_medium=%s" % (source, medium)
-    return urlparse.urlunparse(
-        urlparse.ParseResult(data.scheme, data.netloc, data.path, data.params,
-                             query, data.fragment))
+    return urlunparse(
+        ParseResult(data.scheme, data.netloc, data.path, data.params, query,
+                    data.fragment))
 
 
 def generate_boards_table(boards, skip_columns=None):
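campaign_url() only appends UTM query parameters to whatever URL it is given; called from within this script it behaves roughly like the following (output shown for illustration):

    url = campaign_url("https://docs.platformio.org/page/plus/pio-check.html")
    # -> https://docs.platformio.org/page/plus/pio-check.html?utm_source=platformio&utm_medium=docs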
@@ -64,7 +70,7 @@ def generate_boards_table(boards, skip_columns=None):
         ("Platform", ":ref:`platform_{platform}`"),
         ("Debug", "{debug}"),
         ("MCU", "{mcu}"),
-        ("Frequency", "{f_cpu:d}MHz"),
+        ("Frequency", "{f_cpu}MHz"),
         ("Flash", "{rom}"),
         ("RAM", "{ram}"),
     ]
@@ -90,15 +96,14 @@ def generate_boards_table(boards, skip_columns=None):
         elif data['debug']:
             debug = "External"
 
-        variables = dict(
-            id=data['id'],
-            name=data['name'],
-            platform=data['platform'],
-            debug=debug,
-            mcu=data['mcu'].upper(),
-            f_cpu=int(data['fcpu']) / 1000000,
-            ram=fs.format_filesize(data['ram']),
-            rom=fs.format_filesize(data['rom']))
+        variables = dict(id=data['id'],
+                         name=data['name'],
+                         platform=data['platform'],
+                         debug=debug,
+                         mcu=data['mcu'].upper(),
+                         f_cpu=int(data['fcpu'] / 1000000.0),
+                         ram=fs.format_filesize(data['ram']),
+                         rom=fs.format_filesize(data['rom']))
 
         for (name, template) in columns:
             if skip_columns and name in skip_columns:
@@ -132,8 +137,9 @@ Frameworks
         lines.append("""
     * - :ref:`framework_{name}`
       - {description}""".format(**framework))
 
-    assert known >= set(frameworks), "Unknown frameworks %s " % (
-        set(frameworks) - known)
+    if set(frameworks) - known:
+        click.secho("Unknown frameworks %s " % (
+            set(frameworks) - known), fg="red")
 
     return lines
@@ -208,8 +214,8 @@ Boards listed below have on-board debug probe and **ARE READY** for debugging!
 You do not need to use/buy external debug probe.
 """)
         lines.extend(
-            generate_boards_table(
-                onboard_debug, skip_columns=skip_board_columns))
+            generate_boards_table(onboard_debug,
                                   skip_columns=skip_board_columns))
 
     if external_debug:
         lines.append("""
 External Debug Tools
@@ -220,8 +226,8 @@ external debug probe. They **ARE NOT READY** for debugging.
 Please click on board name for the further details.
 """)
         lines.extend(
-            generate_boards_table(
-                external_debug, skip_columns=skip_board_columns))
+            generate_boards_table(external_debug,
                                   skip_columns=skip_board_columns))
 
     return lines
@@ -239,13 +245,18 @@ Packages
     * - Name
       - Description""")
 
     for name in sorted(packagenames):
-        assert name in API_PACKAGES, name
-        lines.append("""
+        if name not in API_PACKAGES:
+            click.secho("Unknown package `%s`" % name, fg="red")
+            lines.append("""
+    * - {name}
+      -
+""".format(name=name))
+        else:
+            lines.append("""
     * - `{name} <{url}>`__
-      - {description}""".format(
-            name=name,
-            url=campaign_url(API_PACKAGES[name]['url']),
-            description=API_PACKAGES[name]['description']))
+      - {description}""".format(name=name,
                                 url=campaign_url(API_PACKAGES[name]['url']),
                                 description=API_PACKAGES[name]['description']))
 
     if is_embedded:
         lines.append("""
@@ -344,8 +355,9 @@ Examples are listed from `%s development platform repository <%s>`_:
         generate_debug_contents(
             compatible_boards,
             skip_board_columns=["Platform"],
-            extra_rst="%s_debug.rst" % name if isfile(
-                join(rst_dir, "%s_debug.rst" % name)) else None))
+            extra_rst="%s_debug.rst" %
+            name if isfile(join(rst_dir, "%s_debug.rst" %
                                 name)) else None))
 
     #
     # Development version of dev/platform
@@ -483,8 +495,9 @@ For more detailed information please visit `vendor site <%s>`_.
     lines.extend(
         generate_debug_contents(
             compatible_boards,
-            extra_rst="%s_debug.rst" % type_ if isfile(
-                join(rst_dir, "%s_debug.rst" % type_)) else None))
+            extra_rst="%s_debug.rst" %
+            type_ if isfile(join(rst_dir, "%s_debug.rst" %
                                  type_)) else None))
 
     if compatible_platforms:
         # examples
@@ -494,11 +507,10 @@ Examples
 """)
         for manifest in compatible_platforms:
             p = PlatformFactory.newPlatform(manifest['name'])
-            lines.append(
-                "* `%s for %s <%s>`_" %
-                (data['title'], manifest['title'],
-                 campaign_url(
-                     "%s/tree/master/examples" % p.repository_url[:-4])))
+            lines.append("* `%s for %s <%s>`_" %
                          (data['title'], manifest['title'],
                           campaign_url("%s/tree/master/examples" %
                                        p.repository_url[:-4])))
 
     # Platforms
     lines.extend(
@@ -568,7 +580,7 @@ popular embedded boards and IDE.
         else:
             platforms[platform] = [data]
 
-    for platform, boards in sorted(platforms.iteritems()):
+    for platform, boards in sorted(platforms.items()):
         p = PlatformFactory.newPlatform(platform)
         lines.append(p.title)
         lines.append("-" * len(p.title))
@@ -605,21 +617,20 @@ def update_embedded_board(rst_path, board):
         board_manifest_url = board_manifest_url[:-4]
     board_manifest_url += "/blob/master/boards/%s.json" % board['id']
 
-    variables = dict(
-        id=board['id'],
-        name=board['name'],
-        platform=board['platform'],
-        platform_description=platform.description,
-        url=campaign_url(board['url']),
-        mcu=board_config.get("build", {}).get("mcu", ""),
-        mcu_upper=board['mcu'].upper(),
-        f_cpu=board['fcpu'],
-        f_cpu_mhz=int(board['fcpu']) / 1000000,
-        ram=fs.format_filesize(board['ram']),
-        rom=fs.format_filesize(board['rom']),
-        vendor=board['vendor'],
-        board_manifest_url=board_manifest_url,
-        upload_protocol=board_config.get("upload.protocol", ""))
+    variables = dict(id=board['id'],
+                     name=board['name'],
+                     platform=board['platform'],
+                     platform_description=platform.description,
+                     url=campaign_url(board['url']),
+                     mcu=board_config.get("build", {}).get("mcu", ""),
+                     mcu_upper=board['mcu'].upper(),
+                     f_cpu=board['fcpu'],
+                     f_cpu_mhz=int(int(board['fcpu']) / 1000000),
+                     ram=fs.format_filesize(board['ram']),
+                     rom=fs.format_filesize(board['rom']),
+                     vendor=board['vendor'],
+                     board_manifest_url=board_manifest_url,
+                     upload_protocol=board_config.get("upload.protocol", ""))
 
     lines = [RST_COPYRIGHT]
     lines.append(".. _board_{platform}_{id}:".format(**variables))
@@ -639,7 +650,7 @@ Platform :ref:`platform_{platform}`: {platform_description}
     * - **Microcontroller**
       - {mcu_upper}
     * - **Frequency**
-      - {f_cpu_mhz}MHz
+      - {f_cpu_mhz:d}MHz
     * - **Flash**
       - {rom}
     * - **RAM**
@@ -805,15 +816,14 @@ Boards
 .. note::
     For more detailed ``board`` information please scroll tables below by horizontal.
 """)
-    for vendor, boards in sorted(vendors.iteritems()):
+    for vendor, boards in sorted(vendors.items()):
         lines.append(str(vendor))
         lines.append("~" * len(vendor))
         lines.extend(generate_boards_table(boards))
 
     # save
-    with open(
-            join(fs.get_source_dir(), "..", "docs", "plus", "debugging.rst"),
-            "r+") as fp:
+    with open(join(fs.get_source_dir(), "..", "docs", "plus", "debugging.rst"),
               "r+") as fp:
         content = fp.read()
         fp.seek(0)
         fp.truncate()
@@ -823,7 +833,9 @@ Boards
     # Debug tools
     for tool, platforms in tool_to_platforms.items():
         tool_path = join(DOCS_ROOT_DIR, "plus", "debug-tools", "%s.rst" % tool)
-        assert isfile(tool_path), tool
+        if not isfile(tool_path):
+            click.secho("Unknown debug tool `%s`" % tool, fg="red")
+            continue
         platforms = sorted(set(platforms))
         lines = [".. begin_platforms"]

Some files were not shown because too many files have changed in this diff.