forked from platformio/platformio-core
Merge branch 'release/v4.2.0'
HISTORY.rst (44 changes)
@@ -6,6 +6,47 @@ Release Notes
PlatformIO Core 4.0
-------------------

4.2.0 (2020-02-12)
~~~~~~~~~~~~~~~~~~

* `PlatformIO Home 3.1 <http://docs.platformio.org/page/home/index.html>`__:

  - Project Manager
  - Project Configuration UI for `"platformio.ini" <https://docs.platformio.org/page/projectconf.html>`__

* `PIO Check <http://docs.platformio.org/page/plus/pio-check.html>`__ – automated code analysis without hassle:

  - Added support for the `PVS-Studio <https://docs.platformio.org/page/plus/check-tools/pvs-studio.html>`__ static code analyzer

* Initial support for the `Project Manager <https://docs.platformio.org/page/userguide/project/index.html>`_ CLI:

  - Show the computed project configuration with the new `platformio project config <https://docs.platformio.org/page/userguide/project/cmd_config.html>`_ command, or dump it to JSON with ``platformio project config --json-output`` (`issue #3335 <https://github.com/platformio/platformio-core/issues/3335>`_)
  - Moved the ``platformio init`` command to `platformio project init <https://docs.platformio.org/page/userguide/project/cmd_init.html>`_

* Generate a `compilation database "compile_commands.json" <https://docs.platformio.org/page/faq.html#compilation-database-compile-commands-json>`_ (`issue #2990 <https://github.com/platformio/platformio-core/issues/2990>`_)
* Control debug flags and optimization level with the new `debug_build_flags <https://docs.platformio.org/page/projectconf/section_env_debug.html#debug-build-flags>`__ option
* Install a dev-platform with ALL declared packages using the new ``--with-all-packages`` option of the `pio platform install <https://docs.platformio.org/page/userguide/platforms/cmd_install.html>`__ command (`issue #3345 <https://github.com/platformio/platformio-core/issues/3345>`_)
* Added support for "pythonPackages" in the `platform.json <https://docs.platformio.org/page/platforms/creating_platform.html#manifest-file-platform-json>`__ manifest (PlatformIO Package Manager automatically installs dependent Python packages from the PyPI registry when a dev-platform is installed)
* Handle project configuration (monitor, test, and upload options) for PIO Remote commands (`issue #2591 <https://github.com/platformio/platformio-core/issues/2591>`_)
* Added support for Arduino's library.properties ``depends`` field (`issue #2781 <https://github.com/platformio/platformio-core/issues/2781>`_)
* Autodetect the monitor port for boards with specified HWIDs (`issue #3349 <https://github.com/platformio/platformio-core/issues/3349>`_)
* Updated the SCons tool to 3.1.2
* Updated the Unity tool to 2.5.0
* Made the package ManifestSchema compatible with marshmallow >= 3 (`issue #3296 <https://github.com/platformio/platformio-core/issues/3296>`_)
* Warn about a broken library manifest when scanning dependencies (`issue #3268 <https://github.com/platformio/platformio-core/issues/3268>`_)
* Do not overwrite custom items in VSCode's "extensions.json" (`issue #3374 <https://github.com/platformio/platformio-core/issues/3374>`_)
* Fixed an issue where ``env.BoardConfig()`` did not work for custom boards in extra scripts of libraries (`issue #3264 <https://github.com/platformio/platformio-core/issues/3264>`_)
* Fixed an issue with "start-group/end-group" linker flags on the Native development platform (`issue #3282 <https://github.com/platformio/platformio-core/issues/3282>`_)
* Fixed the default PIO Unified Debugger configuration for the `J-Link probe <http://docs.platformio.org/page/plus/debug-tools/jlink.html>`__
* Fixed an issue with the LDF where header files were not found if "libdeps_dir" is within a subdirectory of "lib_extra_dirs" (`issue #3311 <https://github.com/platformio/platformio-core/issues/3311>`_)
* Fixed an "Import of non-existent variable 'projenv'" issue when a development platform does not call "env.BuildProgram()" (`issue #3315 <https://github.com/platformio/platformio-core/issues/3315>`_)
* Fixed an issue where an invalid CLI command did not return a non-zero exit code
* Fixed an issue where the Project Inspector crashed when flash use was > 100% (`issue #3368 <https://github.com/platformio/platformio-core/issues/3368>`_)
* Fixed a "UnicodeDecodeError" when listing built-in libraries on macOS with Python 2.7 (`issue #3370 <https://github.com/platformio/platformio-core/issues/3370>`_)
* Fixed an issue with improperly handled compiler flags containing space characters in the VSCode template (`issue #3364 <https://github.com/platformio/platformio-core/issues/3364>`_)
* Fixed an issue where no error was raised if a referred parameter (interpolation) is missing in a project configuration file (`issue #3279 <https://github.com/platformio/platformio-core/issues/3279>`_)
4.1.0 (2019-11-07)
~~~~~~~~~~~~~~~~~~

@@ -18,8 +59,9 @@ PlatformIO Core 4.0

  - Unused variables or functions
  - Out-of-scope memory usage.

-* `PlatformIO Home 3.0 <http://docs.platformio.org/page/home/index.html>`__ and Project Inspection
+* `PlatformIO Home 3.0 <http://docs.platformio.org/page/home/index.html>`__:

  - Project Inspection
  - Static Code Analysis
  - Firmware File Explorer
  - Firmware Memory Inspection
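The 4.2.0 notes above add ``platformio project config --json-output``, which dumps the computed project configuration as JSON. A minimal sketch of consuming such a dump follows; the exact JSON shape (a list of ``[section, [[option, value], ...]]`` pairs) is an assumption for illustration, not the documented format.

```python
import json

# Hypothetical dump in the assumed shape of "platformio project config --json-output".
raw = """
[
  ["env:uno", [["platform", "atmelavr"], ["board", "uno"], ["framework", ["arduino"]]]]
]
"""


def to_dict(dump):
    """Convert the section/option pair structure into nested dictionaries."""
    return {section: dict(options) for section, options in json.loads(dump)}


config = to_dict(raw)
print(config["env:uno"]["board"])  # -> uno
```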
MANIFEST.in (new file, 1 line)
@@ -0,0 +1 @@
include LICENSE
Makefile (6 changes)
@@ -5,14 +5,14 @@ isort:
	isort -rc ./platformio
	isort -rc ./tests

-black:
+format:
	black --target-version py27 ./platformio
	black --target-version py27 ./tests

test:
-	py.test --verbose --capture=no --exitfirst -n 3 --dist=loadscope tests --ignore tests/test_examples.py
+	py.test --verbose --capture=no --exitfirst -n 6 --dist=loadscope tests --ignore tests/test_examples.py

-before-commit: isort black lint test
+before-commit: isort format lint test

clean-docs:
	rm -rf docs/_build
README.rst (24 changes)
@@ -34,12 +34,13 @@ PlatformIO
.. image:: https://raw.githubusercontent.com/platformio/platformio-web/develop/app/images/platformio-ide-laptop.png
   :target: https://platformio.org?utm_source=github&utm_medium=core

-`PlatformIO <https://platformio.org?utm_source=github&utm_medium=core>`_ an open source ecosystem for embedded development
+`PlatformIO <https://platformio.org?utm_source=github&utm_medium=core>`_ a new generation ecosystem for embedded development

-* **Cross-platform IDE** and **Unified Debugger**
-* **Static Code Analyzer** and **Remote Unit Testing**
-* **Multi-platform** and **Multi-architecture Build System**
-* **Firmware File Explorer** and **Memory Inspection**.
+* Open source, maximum permissive Apache 2.0 license
+* Cross-platform IDE and Unified Debugger
+* Static Code Analyzer and Remote Unit Testing
+* Multi-platform and Multi-architecture Build System
+* Firmware File Explorer and Memory Inspection.

Get Started
-----------

@@ -91,10 +92,10 @@ Development Platforms
* `Microchip PIC32 <https://platformio.org/platforms/microchippic32?utm_source=github&utm_medium=core>`_
* `Nordic nRF51 <https://platformio.org/platforms/nordicnrf51?utm_source=github&utm_medium=core>`_
* `Nordic nRF52 <https://platformio.org/platforms/nordicnrf52?utm_source=github&utm_medium=core>`_
* `Nuclei <https://platformio.org/platforms/nuclei?utm_source=github&utm_medium=core>`_
* `NXP LPC <https://platformio.org/platforms/nxplpc?utm_source=github&utm_medium=core>`_
* `RISC-V <https://platformio.org/platforms/riscv?utm_source=github&utm_medium=core>`_
* `RISC-V GAP <https://platformio.org/platforms/riscv_gap?utm_source=github&utm_medium=core>`_
* `Samsung ARTIK <https://platformio.org/platforms/samsung_artik?utm_source=github&utm_medium=core>`_
* `Shakti <https://platformio.org/platforms/shakti?utm_source=github&utm_medium=core>`_
* `Silicon Labs EFM32 <https://platformio.org/platforms/siliconlabsefm32?utm_source=github&utm_medium=core>`_
* `ST STM32 <https://platformio.org/platforms/ststm32?utm_source=github&utm_medium=core>`_

@@ -108,24 +109,25 @@ Frameworks
----------

* `Arduino <https://platformio.org/frameworks/arduino?utm_source=github&utm_medium=core>`_
* `ARTIK SDK <https://platformio.org/frameworks/artik-sdk?utm_source=github&utm_medium=core>`_
* `CMSIS <https://platformio.org/frameworks/cmsis?utm_source=github&utm_medium=core>`_
* `Energia <https://platformio.org/frameworks/energia?utm_source=github&utm_medium=core>`_
* `ESP-IDF <https://platformio.org/frameworks/espidf?utm_source=github&utm_medium=core>`_
* `ESP8266 Non-OS SDK <https://platformio.org/frameworks/esp8266-nonos-sdk?utm_source=github&utm_medium=core>`_
* `ESP8266 RTOS SDK <https://platformio.org/frameworks/esp8266-rtos-sdk?utm_source=github&utm_medium=core>`_
* `Freedom E SDK <https://platformio.org/frameworks/freedom-e-sdk?utm_source=github&utm_medium=core>`_
* `GigaDevice GD32V SDK <https://platformio.org/frameworks/gd32vf103-sdk?utm_source=github&utm_medium=core>`_
* `Kendryte Standalone SDK <https://platformio.org/frameworks/kendryte-standalone-sdk?utm_source=github&utm_medium=core>`_
* `Kendryte FreeRTOS SDK <https://platformio.org/frameworks/kendryte-freertos-sdk?utm_source=github&utm_medium=core>`_
* `libOpenCM3 <https://platformio.org/frameworks/libopencm3?utm_source=github&utm_medium=core>`_
-* `mbed <https://platformio.org/frameworks/mbed?utm_source=github&utm_medium=core>`_
+* `Mbed <https://platformio.org/frameworks/mbed?utm_source=github&utm_medium=core>`_
* `Nuclei SDK <https://platformio.org/frameworks/nuclei-sdk?utm_source=github&utm_medium=core>`_
* `PULP OS <https://platformio.org/frameworks/pulp-os?utm_source=github&utm_medium=core>`_
* `Pumbaa <https://platformio.org/frameworks/pumbaa?utm_source=github&utm_medium=core>`_
-* `Shakti <https://platformio.org/frameworks/shakti?utm_source=github&utm_medium=core>`_
+* `Shakti SDK <https://platformio.org/frameworks/shakti-sdk?utm_source=github&utm_medium=core>`_
* `Simba <https://platformio.org/frameworks/simba?utm_source=github&utm_medium=core>`_
* `SPL <https://platformio.org/frameworks/spl?utm_source=github&utm_medium=core>`_
* `STM32Cube <https://platformio.org/frameworks/stm32cube?utm_source=github&utm_medium=core>`_
* `Tizen RT <https://platformio.org/frameworks/tizenrt?utm_source=github&utm_medium=core>`_
* `WiringPi <https://platformio.org/frameworks/wiringpi?utm_source=github&utm_medium=core>`_
* `Zephyr <https://platformio.org/frameworks/zephyr?utm_source=github&utm_medium=core>`_

Contributing
------------
docs (submodule)
Submodule docs updated: 28f91efb24...dc25f117fd

examples (submodule)
Submodule examples updated: 9070288cff...e1d641126d
@@ -12,12 +12,12 @@
# See the License for the specific language governing permissions and
# limitations under the License.

-VERSION = (4, 1, 0)
+VERSION = (4, 2, 0)
__version__ = ".".join([str(s) for s in VERSION])

__title__ = "platformio"
__description__ = (
-    "An open source ecosystem for embedded development. "
+    "A new generation ecosystem for embedded development. "
    "Cross-platform IDE and Unified Debugger. "
    "Static Code Analyzer and Remote Unit Testing. "
    "Multi-platform and Multi-architecture Build System. "
@@ -100,8 +100,9 @@ def main(argv=None):
    try:
        configure()
        cli()  # pylint: disable=no-value-for-parameter
-    except SystemExit:
-        pass
+    except SystemExit as e:
+        if e.code and str(e.code).isdigit():
+            exit_code = int(e.code)
    except Exception as e:  # pylint: disable=broad-except
        if not isinstance(e, exception.ReturnErrorCode):
            maintenance.on_platformio_exception(e)
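The new ``except SystemExit as e`` branch above only propagates truthy, purely numeric exit codes. That guard can be isolated as a small helper to see its behavior on edge cases; ``coerce_exit_code`` is a made-up name for illustration, not part of the PlatformIO API.

```python
def coerce_exit_code(code, default=0):
    """Mirror the guard in main(): accept only truthy, purely numeric
    SystemExit codes; anything else keeps the default exit code."""
    if code and str(code).isdigit():
        return int(code)
    return default


print(coerce_exit_code(2))       # numeric code propagates
print(coerce_exit_code(None))    # falsy code is ignored
print(coerce_exit_code("boom"))  # non-numeric message is ignored
```

Note that ``str(code).isdigit()`` also rejects negative codes such as ``-1``, so only positive integers reach ``exit_code``.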
@@ -17,7 +17,7 @@ import hashlib
import os
import uuid
from os import environ, getenv, listdir, remove
-from os.path import abspath, dirname, isdir, isfile, join
+from os.path import dirname, isdir, isfile, join, realpath
from time import time

import requests

@@ -34,7 +34,7 @@ from platformio.project.helpers import (

def projects_dir_validate(projects_dir):
    assert isdir(projects_dir)
-    return abspath(projects_dir)
+    return realpath(projects_dir)


DEFAULT_SETTINGS = {

@@ -199,6 +199,7 @@ class ContentCache(object):
        return True

    def get_cache_path(self, key):
        assert "/" not in key and "\\" not in key
        key = str(key)
        assert len(key) > 3
        return join(self.cache_dir, key[-2:], key)
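``get_cache_path`` above shards cache entries into subdirectories named after the key's last two characters. A standalone sketch of that layout (the ``cache_dir`` value is illustrative):

```python
import os.path


def get_cache_path(cache_dir, key):
    """Two-level layout: key "a1b2c3" lands in <cache_dir>/c3/a1b2c3,
    which keeps any single directory from accumulating every entry."""
    key = str(key)
    assert "/" not in key and "\\" not in key
    assert len(key) > 3
    return os.path.join(cache_dir, key[-2:], key)


print(get_cache_path("/tmp/cache", "a1b2c3"))
```

Spreading entries by suffix bounds per-directory fan-out, a common trick for filesystem caches with many small files.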
|
@ -50,10 +50,10 @@ clivars.AddVariables(
|
||||
DEFAULT_ENV_OPTIONS = dict(
|
||||
tools=[
|
||||
"ar",
|
||||
"gas",
|
||||
"gcc",
|
||||
"g++",
|
||||
"gnulink",
|
||||
"as",
|
||||
"cc",
|
||||
"c++",
|
||||
"link",
|
||||
"platformio",
|
||||
"pioplatform",
|
||||
"pioproject",
|
||||
@ -72,6 +72,7 @@ DEFAULT_ENV_OPTIONS = dict(
|
||||
BUILD_DIR=join("$PROJECT_BUILD_DIR", "$PIOENV"),
|
||||
BUILD_SRC_DIR=join("$BUILD_DIR", "src"),
|
||||
BUILD_TEST_DIR=join("$BUILD_DIR", "test"),
|
||||
COMPILATIONDB_PATH=join("$BUILD_DIR", "compile_commands.json"),
|
||||
LIBPATH=["$BUILD_DIR"],
|
||||
PROGNAME="program",
|
||||
PROG_PATH=join("$BUILD_DIR", "$PROGNAME$PROGSUFFIX"),
|
||||
@ -134,6 +135,10 @@ if env.GetOption("clean"):
|
||||
elif not int(ARGUMENTS.get("PIOVERBOSE", 0)):
|
||||
click.echo("Verbose mode can be enabled via `-v, --verbose` option")
|
||||
|
||||
# Dynamically load dependent tools
|
||||
if "compiledb" in COMMAND_LINE_TARGETS:
|
||||
env.Tool("compilation_db")
|
||||
|
||||
if not isdir(env.subst("$BUILD_DIR")):
|
||||
makedirs(env.subst("$BUILD_DIR"))
|
||||
|
||||
@ -161,7 +166,9 @@ for item in env.GetExtraScripts("post"):
|
||||
##############################################################################
|
||||
|
||||
# Checking program size
|
||||
if env.get("SIZETOOL") and "nobuild" not in COMMAND_LINE_TARGETS:
|
||||
if env.get("SIZETOOL") and not (
|
||||
set(["nobuild", "sizedata"]) & set(COMMAND_LINE_TARGETS)
|
||||
):
|
||||
env.Depends(["upload", "program"], "checkprogsize")
|
||||
# Replace platform's "size" target with our
|
||||
_new_targets = [t for t in DEFAULT_TARGETS if str(t) != "size"]
|
||||
@ -169,6 +176,9 @@ if env.get("SIZETOOL") and "nobuild" not in COMMAND_LINE_TARGETS:
|
||||
Default(_new_targets)
|
||||
Default("checkprogsize")
|
||||
|
||||
if "compiledb" in COMMAND_LINE_TARGETS:
|
||||
env.Alias("compiledb", env.CompilationDatabase("$COMPILATIONDB_PATH"))
|
||||
|
||||
# Print configured protocols
|
||||
env.AddPreAction(
|
||||
["upload", "program"],
|
||||
@ -188,7 +198,10 @@ if "envdump" in COMMAND_LINE_TARGETS:
|
||||
env.Exit(0)
|
||||
|
||||
if "idedata" in COMMAND_LINE_TARGETS:
|
||||
Import("projenv")
|
||||
try:
|
||||
Import("projenv")
|
||||
except: # pylint: disable=bare-except
|
||||
projenv = env
|
||||
click.echo(
|
||||
"\n%s\n"
|
||||
% dump_json_to_unicode(
|
||||
|
platformio/builder/tools/compilation_db.py (new file, 209 lines)
@@ -0,0 +1,209 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
# Copyright 2015 MongoDB Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# pylint: disable=unused-argument, protected-access, unused-variable, import-error
# Original: https://github.com/mongodb/mongo/blob/master/site_scons/site_tools/compilation_db.py

from __future__ import absolute_import

import itertools
import json
import os

import SCons

from platformio.builder.tools.platformio import SRC_ASM_EXT, SRC_C_EXT, SRC_CXX_EXT

# Implements the ability for SCons to emit a compilation database for the MongoDB project. See
# http://clang.llvm.org/docs/JSONCompilationDatabase.html for details on what a compilation
# database is, and why you might want one. The only user-visible entry point here is
# 'env.CompilationDatabase'. This method takes an optional 'target' to name the file that
# should hold the compilation database; otherwise, the file defaults to compile_commands.json,
# which is the name that most clang tools search for by default.

# TODO: Is there a better way to do this than this global? Right now this exists so that the
# emitter we add can record all of the things it emits, so that the scanner for the top-level
# compilation database can access the complete list, and also so that the writer has easy
# access to write all of the files. But it seems clunky. How can the emitter and the scanner
# communicate more gracefully?
__COMPILATION_DB_ENTRIES = []


# We make no effort to avoid rebuilding the entries. Someday, perhaps we could and even
# integrate with the cache, but there doesn't seem to be much call for it.
class __CompilationDbNode(SCons.Node.Python.Value):
    def __init__(self, value):
        SCons.Node.Python.Value.__init__(self, value)
        self.Decider(changed_since_last_build_node)


def changed_since_last_build_node(child, target, prev_ni, node):
    """Dummy decider to force always building"""
    return True


def makeEmitCompilationDbEntry(comstr):
    """
    Effectively this creates a lambda function to capture:
    * command line
    * source
    * target
    :param comstr: unevaluated command line
    :return: an emitter which has captured the above
    """
    user_action = SCons.Action.Action(comstr)

    def EmitCompilationDbEntry(target, source, env):
        """
        This emitter will be added to each c/c++ object build to capture the info needed
        for clang tools
        :param target: target node(s)
        :param source: source node(s)
        :param env: Environment for use building this node
        :return: target(s), source(s)
        """

        dbtarget = __CompilationDbNode(source)

        entry = env.__COMPILATIONDB_Entry(
            target=dbtarget,
            source=[],
            __COMPILATIONDB_UTARGET=target,
            __COMPILATIONDB_USOURCE=source,
            __COMPILATIONDB_UACTION=user_action,
            __COMPILATIONDB_ENV=env,
        )

        # TODO: Technically, these next two lines should not be required: it should be fine to
        # cache the entries. However, they don't seem to update properly. Since they are quick
        # to re-generate, disable caching and sidestep this problem.
        env.AlwaysBuild(entry)
        env.NoCache(entry)

        __COMPILATION_DB_ENTRIES.append(dbtarget)

        return target, source

    return EmitCompilationDbEntry


def CompilationDbEntryAction(target, source, env, **kw):
    """
    Create a dictionary with the evaluated command line, target, and source,
    and store that info as an attribute on the target
    (which has been stored in the __COMPILATION_DB_ENTRIES array)
    :param target: target node(s)
    :param source: source node(s)
    :param env: Environment for use building this node
    :param kw:
    :return: None
    """

    command = env["__COMPILATIONDB_UACTION"].strfunction(
        target=env["__COMPILATIONDB_UTARGET"],
        source=env["__COMPILATIONDB_USOURCE"],
        env=env["__COMPILATIONDB_ENV"],
    )

    entry = {
        "directory": env.Dir("#").abspath,
        "command": command,
        "file": str(env["__COMPILATIONDB_USOURCE"][0]),
    }

    target[0].write(entry)


def WriteCompilationDb(target, source, env):
    entries = []

    for s in __COMPILATION_DB_ENTRIES:
        item = s.read()
        item["file"] = os.path.abspath(item["file"])
        entries.append(item)

    with open(str(target[0]), "w") as target_file:
        json.dump(
            entries, target_file, sort_keys=True, indent=4, separators=(",", ": ")
        )


def ScanCompilationDb(node, env, path):
    return __COMPILATION_DB_ENTRIES


def generate(env, **kwargs):

    static_obj, shared_obj = SCons.Tool.createObjBuilders(env)

    env["COMPILATIONDB_COMSTR"] = kwargs.get(
        "COMPILATIONDB_COMSTR", "Building compilation database $TARGET"
    )

    components_by_suffix = itertools.chain(
        itertools.product(
            [".%s" % ext for ext in SRC_C_EXT],
            [
                (static_obj, SCons.Defaults.StaticObjectEmitter, "$CCCOM"),
                (shared_obj, SCons.Defaults.SharedObjectEmitter, "$SHCCCOM"),
            ],
        ),
        itertools.product(
            [".%s" % ext for ext in SRC_CXX_EXT],
            [
                (static_obj, SCons.Defaults.StaticObjectEmitter, "$CXXCOM"),
                (shared_obj, SCons.Defaults.SharedObjectEmitter, "$SHCXXCOM"),
            ],
        ),
        itertools.product(
            [".%s" % ext for ext in SRC_ASM_EXT],
            [(static_obj, SCons.Defaults.StaticObjectEmitter, "$ASCOM")],
        ),
    )

    for entry in components_by_suffix:
        suffix = entry[0]
        builder, base_emitter, command = entry[1]

        # Assumes a dictionary emitter
        emitter = builder.emitter[suffix]
        builder.emitter[suffix] = SCons.Builder.ListEmitter(
            [emitter, makeEmitCompilationDbEntry(command)]
        )

    env["BUILDERS"]["__COMPILATIONDB_Entry"] = SCons.Builder.Builder(
        action=SCons.Action.Action(CompilationDbEntryAction, None),
    )

    env["BUILDERS"]["__COMPILATIONDB_Database"] = SCons.Builder.Builder(
        action=SCons.Action.Action(WriteCompilationDb, "$COMPILATIONDB_COMSTR"),
        target_scanner=SCons.Scanner.Scanner(
            function=ScanCompilationDb, node_class=None
        ),
    )

    def CompilationDatabase(env, target):
        result = env.__COMPILATIONDB_Database(target=target, source=[])

        env.AlwaysBuild(result)
        env.NoCache(result)

        return result

    env.AddMethod(CompilationDatabase, "CompilationDatabase")


def exists(env):
    return True
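The tool above ultimately serializes JSON Compilation Database records, one per compiled source file. A minimal sketch of producing such an entry outside SCons, using the same serialization settings as ``WriteCompilationDb``; the directory, command, and file paths are illustrative.

```python
import json
import os


def make_entry(directory, command, file):
    """One JSON Compilation Database record, as consumed by clang tools."""
    return {
        "directory": directory,
        "command": command,
        "file": os.path.join(directory, file),
    }


entries = [
    make_entry("/proj", "gcc -O2 -c src/main.c -o main.o", "src/main.c"),
]
# Same json.dump settings WriteCompilationDb uses above.
text = json.dumps(entries, sort_keys=True, indent=4, separators=(",", ": "))
print(text)
```

Tools such as clangd or clang-tidy look for a ``compile_commands.json`` with exactly this three-key record shape (``directory``, ``command``, ``file``).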
@@ -14,9 +14,8 @@

from __future__ import absolute_import

import os
from glob import glob
from os import environ
from os.path import abspath, isfile, join

from SCons.Defaults import processDefines  # pylint: disable=import-error

@@ -42,10 +41,10 @@ def _dump_includes(env):
            continue
        toolchain_dir = glob_escape(p.get_package_dir(name))
        toolchain_incglobs = [
-            join(toolchain_dir, "*", "include*"),
-            join(toolchain_dir, "*", "include", "c++", "*"),
-            join(toolchain_dir, "*", "include", "c++", "*", "*-*-*"),
-            join(toolchain_dir, "lib", "gcc", "*", "*", "include*"),
+            os.path.join(toolchain_dir, "*", "include*"),
+            os.path.join(toolchain_dir, "*", "include", "c++", "*"),
+            os.path.join(toolchain_dir, "*", "include", "c++", "*", "*-*-*"),
+            os.path.join(toolchain_dir, "lib", "gcc", "*", "*", "include*"),
        ]
        for g in toolchain_incglobs:
            includes.extend(glob(g))

@@ -59,8 +58,9 @@ def _dump_includes(env):
    # remove duplicates
    result = []
    for item in includes:
        item = os.path.realpath(item)
        if item not in result:
-            result.append(abspath(item))
+            result.append(item)

    return result

@@ -68,7 +68,7 @@ def _dump_includes(env):
def _get_gcc_defines(env):
    items = []
    try:
-        sysenv = environ.copy()
+        sysenv = os.environ.copy()
        sysenv["PATH"] = str(env["ENV"]["PATH"])
        result = exec_command(
            "echo | %s -dM -E -" % env.subst("$CC"), env=sysenv, shell=True

@@ -119,7 +119,7 @@ def _dump_defines(env):
def _get_svd_path(env):
    svd_path = env.GetProjectOption("debug_svd_path")
    if svd_path:
-        return abspath(svd_path)
+        return os.path.realpath(svd_path)

    if "BOARD" not in env:
        return None

@@ -129,18 +129,29 @@ def _get_svd_path(env):
    except (AssertionError, KeyError):
        return None
    # custom path to SVD file
-    if isfile(svd_path):
+    if os.path.isfile(svd_path):
        return svd_path
    # default file from ./platform/misc/svd folder
    p = env.PioPlatform()
-    if isfile(join(p.get_dir(), "misc", "svd", svd_path)):
-        return abspath(join(p.get_dir(), "misc", "svd", svd_path))
+    if os.path.isfile(os.path.join(p.get_dir(), "misc", "svd", svd_path)):
+        return os.path.realpath(os.path.join(p.get_dir(), "misc", "svd", svd_path))
    return None


def _escape_build_flag(flags):
    return [flag if " " not in flag else '"%s"' % flag for flag in flags]


def DumpIDEData(env):
-    LINTCCOM = "$CFLAGS $CCFLAGS $CPPFLAGS"
-    LINTCXXCOM = "$CXXFLAGS $CCFLAGS $CPPFLAGS"
+    env["__escape_build_flag"] = _escape_build_flag
+
+    LINTCCOM = (
+        "${__escape_build_flag(CFLAGS)} ${__escape_build_flag(CCFLAGS)} $CPPFLAGS"
+    )
+    LINTCXXCOM = (
+        "${__escape_build_flag(CXXFLAGS)} ${__escape_build_flag(CCFLAGS)} $CPPFLAGS"
+    )

    data = {
        "env_name": env["PIOENV"],
@@ -33,7 +33,10 @@ from platformio import exception, fs, util
from platformio.builder.tools import platformio as piotool
from platformio.compat import WINDOWS, hashlib_encode_data, string_types
from platformio.managers.lib import LibraryManager
-from platformio.package.manifest.parser import ManifestParserFactory
+from platformio.package.manifest.parser import (
+    ManifestParserError,
+    ManifestParserFactory,
+)
from platformio.project.options import ProjectOptions


@@ -108,7 +111,14 @@ class LibBuilderBase(object):
        self.path = realpath(env.subst(path))
        self.verbose = verbose

-        self._manifest = manifest if manifest else self.load_manifest()
+        try:
+            self._manifest = manifest if manifest else self.load_manifest()
+        except ManifestParserError:
+            click.secho(
+                "Warning! Ignoring broken library manifest in " + self.path, fg="yellow"
+            )
+            self._manifest = {}

        self._is_dependent = False
        self._is_built = False
        self._depbuilders = list()

@@ -144,9 +154,7 @@ class LibBuilderBase(object):

    @property
    def dependencies(self):
-        return LibraryManager.normalize_dependencies(
-            self._manifest.get("dependencies", [])
-        )
+        return self._manifest.get("dependencies")

    @property
    def src_filter(self):

@@ -358,7 +366,7 @@ class LibBuilderBase(object):
            if not fs.path_endswith_ext(_h_path, piotool.SRC_HEADER_EXT):
                continue
            _f_part = _h_path[: _h_path.rindex(".")]
-            for ext in piotool.SRC_C_EXT:
+            for ext in piotool.SRC_C_EXT + piotool.SRC_CXX_EXT:
                if not isfile("%s.%s" % (_f_part, ext)):
                    continue
                _c_path = self.env.File("%s.%s" % (_f_part, ext))

@@ -876,7 +884,7 @@ class ProjectAsLibBuilder(LibBuilderBase):
            if not lib_dir:
                continue
            for lb in self.env.GetLibBuilders():
-                if lib_dir not in lb:
+                if lib_dir != lb.path:
                    continue
                if lb not in self.depbuilders:
                    self.depend_recursive(lb)
@@ -314,17 +314,24 @@ def ConfigureDebugFlags(env):
        if scope not in env:
            return
        unflags = ["-Os", "-g"]
-        for level in [0, 1, 2]:
+        for level in [0, 1, 2, 3]:
            for flag in ("O", "g", "ggdb"):
                unflags.append("-%s%d" % (flag, level))
        env[scope] = [f for f in env.get(scope, []) if f not in unflags]

    env.Append(CPPDEFINES=["__PLATFORMIO_BUILD_DEBUG__"])

    debug_flags = ["-Og", "-g2", "-ggdb2"]
    for scope in ("ASFLAGS", "CCFLAGS", "LINKFLAGS"):
        _cleanup_debug_flags(scope)
        env.Append(**{scope: debug_flags})

    debug_flags = env.ParseFlags(env.GetProjectOption("debug_build_flags"))
    env.MergeFlags(debug_flags)
    optimization_flags = [
        f for f in debug_flags.get("CCFLAGS", []) if f.startswith(("-O", "-g"))
    ]

    if optimization_flags:
        env.AppendUnique(ASFLAGS=optimization_flags, LINKFLAGS=optimization_flags)


def ConfigureTestTarget(env):
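The updated ``_cleanup_debug_flags`` above now also strips level-3 flags (``-O3``, ``-g3``, ``-ggdb3``) before applying ``debug_build_flags``. The filtering can be reproduced standalone; the sample flag list is illustrative.

```python
def cleanup_debug_flags(flags):
    """Drop prior optimization/debug levels, as the updated
    _cleanup_debug_flags does (levels 0-3 plus -Os and bare -g)."""
    unflags = ["-Os", "-g"]
    for level in [0, 1, 2, 3]:
        for flag in ("O", "g", "ggdb"):
            unflags.append("-%s%d" % (flag, level))
    return [f for f in flags if f not in unflags]


print(cleanup_debug_flags(["-Wall", "-O3", "-g2", "-DFOO"]))  # -> ['-Wall', '-DFOO']
```

Stripping the old levels first is what lets a user-supplied ``debug_build_flags`` value win without duplicate, conflicting ``-O``/``-g`` options on the command line.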
@@ -40,15 +40,15 @@ def PioPlatform(env):


def BoardConfig(env, board=None):
-    p = env.PioPlatform()
-    try:
-        board = board or env.get("BOARD")
-        assert board, "BoardConfig: Board is not defined"
-        config = p.board_config(board)
-    except (AssertionError, exception.UnknownBoard) as e:
-        sys.stderr.write("Error: %s\n" % str(e))
-        env.Exit(1)
-    return config
+    with fs.cd(env.subst("$PROJECT_DIR")):
+        try:
+            p = env.PioPlatform()
+            board = board or env.get("BOARD")
+            assert board, "BoardConfig: Board is not defined"
+            return p.board_config(board)
+        except (AssertionError, exception.UnknownBoard) as e:
+            sys.stderr.write("Error: %s\n" % str(e))
+            env.Exit(1)


def GetFrameworkScript(env, framework):

@@ -213,7 +213,9 @@ def PrintConfiguration(env):  # pylint: disable=too-many-statements
        if extra:
            info += " (%s)" % ", ".join(extra)
        data.append(info)
-    return ["PACKAGES:", ", ".join(data)]
+    if not data:
+        return None
+    return ["PACKAGES:"] + ["\n - %s" % d for d in sorted(data)]

    for data in (
        _get_configuration_data(),
@@ -251,9 +251,9 @@ def CheckUploadSize(_, target, source, env):

    print('Advanced Memory Usage is available via "PlatformIO Home > Project Inspect"')
    if data_max_size and data_size > -1:
-        print("DATA: %s" % _format_availale_bytes(data_size, data_max_size))
+        print("RAM: %s" % _format_availale_bytes(data_size, data_max_size))
    if program_size > -1:
-        print("PROGRAM: %s" % _format_availale_bytes(program_size, program_max_size))
+        print("Flash: %s" % _format_availale_bytes(program_size, program_max_size))
    if int(ARGUMENTS.get("PIOVERBOSE", 0)):
        print(output)
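The renamed ``RAM:`` / ``Flash:`` lines above come from ``_format_availale_bytes`` (spelling as in the source). A sketch of an equivalent usage formatter; the exact format string is an assumption for illustration, not copied from PlatformIO.

```python
def format_used_bytes(used, total):
    """Report usage as a percentage plus raw byte counts; the format
    string is illustrative, not _format_availale_bytes verbatim."""
    percent = 100.0 * used / total
    return "%.1f%% (used %d bytes from %d bytes)" % (percent, used, total)


print(format_used_bytes(1024, 2048))  # -> 50.0% (used 1024 bytes from 2048 bytes)
```

Reporting both the percentage and the absolute byte counts is what lets the related Project Inspector fix above detect and flag the > 100% overflow case.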
@@ -31,8 +31,10 @@ from platformio.compat import string_types
from platformio.util import pioversion_to_intstr

SRC_HEADER_EXT = ["h", "hpp"]
-SRC_C_EXT = ["c", "cc", "cpp"]
-SRC_BUILD_EXT = SRC_C_EXT + ["S", "spp", "SPP", "sx", "s", "asm", "ASM"]
+SRC_ASM_EXT = ["S", "spp", "SPP", "sx", "s", "asm", "ASM"]
+SRC_C_EXT = ["c"]
+SRC_CXX_EXT = ["cc", "cpp", "cxx", "c++"]
+SRC_BUILD_EXT = SRC_C_EXT + SRC_CXX_EXT + SRC_ASM_EXT
SRC_FILTER_DEFAULT = ["+<*>", "-<.git%s>" % os.sep, "-<.svn%s>" % os.sep]
@ -44,7 +46,88 @@ def scons_patched_match_splitext(path, suffixes=None):
|
||||
return tokens
|
||||
|
||||
|
||||
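The hunk above splits the old combined `SRC_C_EXT` into separate C, C++, and assembly extension groups. A standalone sketch of how the new groups partition source files; the `classify` helper is illustrative and not part of the diff:

```python
import os

# Extension groups as introduced in this hunk (builder/tools/platformio.py)
SRC_HEADER_EXT = ["h", "hpp"]
SRC_ASM_EXT = ["S", "spp", "SPP", "sx", "s", "asm", "ASM"]
SRC_C_EXT = ["c"]
SRC_CXX_EXT = ["cc", "cpp", "cxx", "c++"]
SRC_BUILD_EXT = SRC_C_EXT + SRC_CXX_EXT + SRC_ASM_EXT


def classify(path):
    # Illustrative helper: map a file name to its extension group
    ext = os.path.splitext(path)[1][1:]
    if ext in SRC_C_EXT:
        return "c"
    if ext in SRC_CXX_EXT:
        return "c++"
    if ext in SRC_ASM_EXT:
        return "asm"
    if ext in SRC_HEADER_EXT:
        return "header"
    return "other"


print(classify("main.cpp"))   # c++
print(classify("startup.S"))  # asm
```

Keeping C and C++ extensions in distinct lists is what later lets the check tools pass `cc_flags` and `cxx_flags` to the right compiler front end.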
def _build_project_deps(env):
def GetBuildType(env):
    return (
        "debug"
        if (
            set(["debug", "sizedata"]) & set(COMMAND_LINE_TARGETS)
            or env.GetProjectOption("build_type") == "debug"
        )
        else "release"
    )


def BuildProgram(env):
    env.ProcessProgramDeps()
    env.ProcessProjectDeps()

    # append into the beginning a main LD script
    if env.get("LDSCRIPT_PATH") and not any("-Wl,-T" in f for f in env["LINKFLAGS"]):
        env.Prepend(LINKFLAGS=["-T", env.subst("$LDSCRIPT_PATH")])

    # enable "cyclic reference" for linker
    if env.get("LIBS") and env.GetCompilerType() == "gcc":
        env.Prepend(_LIBFLAGS="-Wl,--start-group ")
        env.Append(_LIBFLAGS=" -Wl,--end-group")

    program = env.Program(
        os.path.join("$BUILD_DIR", env.subst("$PROGNAME")), env["PIOBUILDFILES"]
    )
    env.Replace(PIOMAINPROG=program)

    AlwaysBuild(
        env.Alias(
            "checkprogsize",
            program,
            env.VerboseAction(env.CheckUploadSize, "Checking size $PIOMAINPROG"),
        )
    )

    print("Building in %s mode" % env.GetBuildType())

    return program


def ProcessProgramDeps(env):
    def _append_pio_macros():
        env.AppendUnique(
            CPPDEFINES=[
                (
                    "PLATFORMIO",
                    int("{0:02d}{1:02d}{2:02d}".format(*pioversion_to_intstr())),
                )
            ]
        )

    _append_pio_macros()

    env.PrintConfiguration()

    # fix ASM handling under non case-sensitive OS
    if not Util.case_sensitive_suffixes(".s", ".S"):
        env.Replace(AS="$CC", ASCOM="$ASPPCOM")

    # process extra flags from board
    if "BOARD" in env and "build.extra_flags" in env.BoardConfig():
        env.ProcessFlags(env.BoardConfig().get("build.extra_flags"))

    # apply user flags
    env.ProcessFlags(env.get("BUILD_FLAGS"))

    # process framework scripts
    env.BuildFrameworks(env.get("PIOFRAMEWORK"))

    if env.GetBuildType() == "debug":
        env.ConfigureDebugFlags()

    # remove specified flags
    env.ProcessUnFlags(env.get("BUILD_UNFLAGS"))

    if "__test" in COMMAND_LINE_TARGETS:
        env.ConfigureTestTarget()


def ProcessProjectDeps(env):
    project_lib_builder = env.ConfigureProjectLibBuilder()

    # prepend project libs to the beginning of list
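The new `GetBuildType` centralizes the debug/release decision that the old `BuildProgram` computed inline. A standalone sketch of that decision, with plain parameters standing in for the SCons environment and command-line state:

```python
# The build is "debug" when a debug/sizedata target was requested on the
# command line, or when `build_type = debug` is set in platformio.ini.
def get_build_type(command_line_targets, build_type_option):
    return (
        "debug"
        if (
            set(["debug", "sizedata"]) & set(command_line_targets)
            or build_type_option == "debug"
        )
        else "release"
    )


print(get_build_type(["upload"], "release"))    # release
print(get_build_type(["sizedata"], "release"))  # debug
print(get_build_type([], "debug"))              # debug
```

Exposing it as an env method (`env.GetBuildType()`) lets both `ProcessProgramDeps` and the final "Building in ... mode" message share one source of truth instead of a local flag.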
@@ -85,78 +168,6 @@ def _build_project_deps(env):
    Export("projenv")


def BuildProgram(env):
    def _append_pio_macros():
        env.AppendUnique(
            CPPDEFINES=[
                (
                    "PLATFORMIO",
                    int("{0:02d}{1:02d}{2:02d}".format(*pioversion_to_intstr())),
                )
            ]
        )

    _append_pio_macros()

    env.PrintConfiguration()

    # fix ASM handling under non case-sensitive OS
    if not Util.case_sensitive_suffixes(".s", ".S"):
        env.Replace(AS="$CC", ASCOM="$ASPPCOM")

    # process extra flags from board
    if "BOARD" in env and "build.extra_flags" in env.BoardConfig():
        env.ProcessFlags(env.BoardConfig().get("build.extra_flags"))

    # apply user flags
    env.ProcessFlags(env.get("BUILD_FLAGS"))

    # process framework scripts
    env.BuildFrameworks(env.get("PIOFRAMEWORK"))

    is_build_type_debug = (
        set(["debug", "sizedata"]) & set(COMMAND_LINE_TARGETS)
        or env.GetProjectOption("build_type") == "debug"
    )
    if is_build_type_debug:
        env.ConfigureDebugFlags()

    # remove specified flags
    env.ProcessUnFlags(env.get("BUILD_UNFLAGS"))

    if "__test" in COMMAND_LINE_TARGETS:
        env.ConfigureTestTarget()

    # append into the beginning a main LD script
    if env.get("LDSCRIPT_PATH") and not any("-Wl,-T" in f for f in env["LINKFLAGS"]):
        env.Prepend(LINKFLAGS=["-T", env.subst("$LDSCRIPT_PATH")])

    # enable "cyclic reference" for linker
    if env.get("LIBS") and env.GetCompilerType() == "gcc":
        env.Prepend(_LIBFLAGS="-Wl,--start-group ")
        env.Append(_LIBFLAGS=" -Wl,--end-group")

    # build project with dependencies
    _build_project_deps(env)

    program = env.Program(
        os.path.join("$BUILD_DIR", env.subst("$PROGNAME")), env["PIOBUILDFILES"]
    )
    env.Replace(PIOMAINPROG=program)

    AlwaysBuild(
        env.Alias(
            "checkprogsize",
            program,
            env.VerboseAction(env.CheckUploadSize, "Checking size $PIOMAINPROG"),
        )
    )

    print("Building in %s mode" % ("debug" if is_build_type_debug else "release"))

    return program
def ParseFlagsExtended(env, flags):  # pylint: disable=too-many-branches
    if not isinstance(flags, list):
        flags = [flags]
@@ -343,7 +354,10 @@ def exists(_):


def generate(env):
    env.AddMethod(GetBuildType)
    env.AddMethod(BuildProgram)
    env.AddMethod(ProcessProgramDeps)
    env.AddMethod(ProcessProjectDeps)
    env.AddMethod(ParseFlagsExtended)
    env.AddMethod(ProcessFlags)
    env.AddMethod(ProcessUnFlags)
@@ -63,5 +63,18 @@ class PlatformioCLI(click.MultiCommand):
            mod_path = "platformio.commands.%s.command" % cmd_name
            mod = __import__(mod_path, None, None, ["cli"])
        except ImportError:
            try:
                return self._handle_obsolate_command(cmd_name)
            except AttributeError:
                pass
            raise click.UsageError('No such command "%s"' % cmd_name, ctx)
        return mod.cli

    @staticmethod
    def _handle_obsolate_command(name):
        # pylint: disable=import-outside-toplevel
        if name == "init":
            from platformio.commands.project import project_init

            return project_init
        raise AttributeError()
@@ -32,7 +32,10 @@ def cli(query, installed, json_output):  # pylint: disable=R0912

    grpboards = {}
    for board in _get_boards(installed):
        if query and query.lower() not in json.dumps(board).lower():
        if query and not any(
            query.lower() in str(board.get(k, "")).lower()
            for k in ("id", "name", "mcu", "vendor", "platform", "frameworks")
        ):
            continue
        if board["platform"] not in grpboards:
            grpboards[board["platform"]] = []
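The hunk above narrows `pio boards <query>` from substring-matching the whole JSON dump of a board manifest to searching only six named fields. A standalone sketch of the new matcher; the sample board dict is illustrative:

```python
def board_matches(board, query):
    # Only these fields are searched after this change; incidental manifest
    # data (e.g. HWIDs, URLs) no longer produces false positives.
    return any(
        query.lower() in str(board.get(k, "")).lower()
        for k in ("id", "name", "mcu", "vendor", "platform", "frameworks")
    )


board = {
    "id": "uno",
    "name": "Arduino Uno",
    "mcu": "ATMEGA328P",
    "vendor": "Arduino",
    "platform": "atmelavr",
    "frameworks": ["arduino"],
}
print(board_matches(board, "mega328"))  # True (matches the "mcu" field)
print(board_matches(board, "esp32"))    # False
```

Note that list-valued fields such as `frameworks` are matched via their `str()` form, so `"arduino"` still hits `["arduino"]`.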
@@ -12,7 +12,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.

from os.path import abspath, relpath
import os

import click

@@ -52,7 +52,7 @@ class DefectItem(object):
        self.id = id
        self.file = file
        if file.startswith(get_project_dir()):
            self.file = relpath(file, get_project_dir())
            self.file = os.path.relpath(file, get_project_dir())

    def __repr__(self):
        defect_color = None
@@ -86,7 +86,7 @@ class DefectItem(object):
            "severity": self.SEVERITY_LABELS[self.severity],
            "category": self.category,
            "message": self.message,
            "file": abspath(self.file),
            "file": os.path.realpath(self.file),
            "line": self.line,
            "column": self.column,
            "callstack": self.callstack,
@@ -15,6 +15,7 @@
from platformio import exception
from platformio.commands.check.tools.clangtidy import ClangtidyCheckTool
from platformio.commands.check.tools.cppcheck import CppcheckCheckTool
from platformio.commands.check.tools.pvsstudio import PvsStudioCheckTool


class CheckToolFactory(object):
@@ -25,6 +26,8 @@ class CheckToolFactory(object):
            cls = CppcheckCheckTool
        elif tool == "clangtidy":
            cls = ClangtidyCheckTool
        elif tool == "pvs-studio":
            cls = PvsStudioCheckTool
        else:
            raise exception.PlatformioException("Unknown check tool `%s`" % tool)
        return cls(project_dir, config, envname, options)
@@ -27,10 +27,13 @@ class CheckToolBase(object):  # pylint: disable=too-many-instance-attributes
        self.config = config
        self.envname = envname
        self.options = options
        self.cpp_defines = []
        self.cpp_flags = []
        self.cc_flags = []
        self.cxx_flags = []
        self.cpp_includes = []

        self.cpp_defines = []
        self.toolchain_defines = []
        self.cc_path = None
        self.cxx_path = None
        self._defects = []
        self._on_defect_callback = None
        self._bad_input = False
@@ -53,16 +56,19 @@ class CheckToolBase(object):  # pylint: disable=too-many-instance-attributes
        data = load_project_ide_data(project_dir, envname)
        if not data:
            return
        self.cpp_flags = data.get("cxx_flags", "").split(" ")
        self.cc_flags = data.get("cc_flags", "").split(" ")
        self.cxx_flags = data.get("cxx_flags", "").split(" ")
        self.cpp_includes = data.get("includes", [])
        self.cpp_defines = data.get("defines", [])
        self.cpp_defines.extend(self._get_toolchain_defines(data.get("cc_path")))
        self.cc_path = data.get("cc_path")
        self.cxx_path = data.get("cxx_path")
        self.toolchain_defines = self._get_toolchain_defines(self.cc_path)

    def get_flags(self, tool):
        result = []
        flags = self.options.get("flags") or []
        for flag in flags:
            if ":" not in flag:
            if ":" not in flag or flag.startswith("-"):
                result.extend([f for f in flag.split(" ") if f])
            elif flag.startswith("%s:" % tool):
                result.extend([f for f in flag.split(":", 1)[1].split(" ") if f])
@@ -132,7 +138,7 @@ class CheckToolBase(object):  # pylint: disable=too-many-instance-attributes
        def _add_file(path):
            if not path.endswith(allowed_extensions):
                return
            result.append(os.path.abspath(path))
            result.append(os.path.realpath(path))

        for pattern in self.options["patterns"]:
            for item in glob.glob(pattern):
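The updated `get_flags` above scopes a `check_flags` entry to one tool when it carries a `tool:` prefix, and treats it as global when it has no prefix or simply starts with `-` (the new condition, which keeps flags containing colons, such as Windows paths, from being misparsed). A standalone sketch with the environment state reduced to a plain list:

```python
def get_flags(option_flags, tool):
    result = []
    for flag in option_flags:
        # Global flag: no "tool:" prefix, or it plainly starts with "-"
        if ":" not in flag or flag.startswith("-"):
            result.extend([f for f in flag.split(" ") if f])
        # Tool-scoped flag: strip the "<tool>:" prefix for the matching tool
        elif flag.startswith("%s:" % tool):
            result.extend([f for f in flag.split(":", 1)[1].split(" ") if f])
    return result


flags = ["--std=c99", "cppcheck: --enable=performance", "clangtidy: --fix"]
print(get_flags(flags, "cppcheck"))   # ['--std=c99', '--enable=performance']
print(get_flags(flags, "clangtidy"))  # ['--std=c99', '--fix']
```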
@@ -61,7 +61,7 @@ class ClangtidyCheckTool(CheckToolBase):
        cmd.extend(self.get_project_target_files())
        cmd.append("--")

        cmd.extend(["-D%s" % d for d in self.cpp_defines])
        cmd.extend(["-D%s" % d for d in self.cpp_defines + self.toolchain_defines])
        cmd.extend(["-I%s" % inc for inc in self.cpp_includes])

        return cmd
@@ -112,18 +112,18 @@ class CppcheckCheckTool(CheckToolBase):
            cmd.append("--language=c++")

        if not self.is_flag_set("--std", flags):
            for f in self.cpp_flags:
            for f in self.cxx_flags + self.cc_flags:
                if "-std" in f:
                    # Standards with GNU extensions are not allowed
                    cmd.append("-" + f.replace("gnu", "c"))

        cmd.extend(["-D%s" % d for d in self.cpp_defines])
        cmd.extend(["-D%s" % d for d in self.cpp_defines + self.toolchain_defines])
        cmd.extend(flags)

        cmd.append("--file-list=%s" % self._generate_src_file())
        cmd.append("--includes-file=%s" % self._generate_inc_file())

        core_dir = self.config.get_optional_dir("core")
        core_dir = self.config.get_optional_dir("packages")
        cmd.append("--suppress=*:%s*" % core_dir)
        cmd.append("--suppress=unmatchedSuppression:%s*" % core_dir)
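The `-std` handling above now scans both the C++ and C flag lists and rewrites GNU-dialect standards, which cppcheck rejects, into their plain ISO form. A standalone sketch of that normalization (the input flags are illustrative compiler flags):

```python
def normalize_std(compiler_flags):
    # A compiler flag like "-std=gnu++11" becomes the cppcheck option
    # "--std=c++11": prepend a dash and drop the GNU-extension marker.
    cmd = []
    for f in compiler_flags:
        if "-std" in f:
            # Standards with GNU extensions are not allowed
            cmd.append("-" + f.replace("gnu", "c"))
    return cmd


print(normalize_std(["-Os", "-std=gnu++11"]))            # ['--std=c++11']
print(normalize_std(["-mcpu=cortex-m3", "-std=gnu11"]))  # ['--std=c11']
```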
platformio/commands/check/tools/pvsstudio.py (new file, 226 lines)
@@ -0,0 +1,226 @@
# Copyright (c) 2020-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import os
import shutil
import tempfile
from xml.etree.ElementTree import fromstring

import click

from platformio import proc, util
from platformio.commands.check.defect import DefectItem
from platformio.commands.check.tools.base import CheckToolBase
from platformio.managers.core import get_core_package_dir


class PvsStudioCheckTool(CheckToolBase):  # pylint: disable=too-many-instance-attributes
    def __init__(self, *args, **kwargs):
        self._tmp_dir = tempfile.mkdtemp(prefix="piocheck")
        self._tmp_preprocessed_file = self._generate_tmp_file_path() + ".i"
        self._tmp_output_file = self._generate_tmp_file_path() + ".pvs"
        self._tmp_cfg_file = self._generate_tmp_file_path() + ".cfg"
        self._tmp_cmd_file = self._generate_tmp_file_path() + ".cmd"
        self.tool_path = os.path.join(
            get_core_package_dir("tool-pvs-studio"),
            "x64" if "windows" in util.get_systype() else "bin",
            "pvs-studio",
        )
        super(PvsStudioCheckTool, self).__init__(*args, **kwargs)

        with open(self._tmp_cfg_file, "w") as fp:
            fp.write(
                "exclude-path = "
                + self.config.get_optional_dir("packages").replace("\\", "/")
            )

        with open(self._tmp_cmd_file, "w") as fp:
            fp.write(
                " ".join(
                    ['-I"%s"' % inc.replace("\\", "/") for inc in self.cpp_includes]
                )
            )
    def _process_defects(self, defects):
        for defect in defects:
            if not isinstance(defect, DefectItem):
                return
            if defect.severity not in self.options["severity"]:
                return
            self._defects.append(defect)
            if self._on_defect_callback:
                self._on_defect_callback(defect)

    def _demangle_report(self, output_file):
        converter_tool = os.path.join(
            get_core_package_dir("tool-pvs-studio"),
            "HtmlGenerator"
            if "windows" in util.get_systype()
            else os.path.join("bin", "plog-converter"),
        )

        cmd = (
            converter_tool,
            "-t",
            "xml",
            output_file,
            "-m",
            "cwe",
            "-m",
            "misra",
            "-a",
            # Enable all possible analyzers and defect levels
            "GA:1,2,3;64:1,2,3;OP:1,2,3;CS:1,2,3;MISRA:1,2,3",
            "--cerr",
        )

        result = proc.exec_command(cmd)
        if result["returncode"] != 0:
            click.echo(result["err"])
            self._bad_input = True

        return result["err"]

    def parse_defects(self, output_file):
        defects = []

        report = self._demangle_report(output_file)
        if not report:
            self._bad_input = True
            return []

        try:
            defects_data = fromstring(report)
        except:  # pylint: disable=bare-except
            click.echo("Error: Couldn't decode generated report!")
            self._bad_input = True
            return []

        for table in defects_data.iter("PVS-Studio_Analysis_Log"):
            message = table.find("Message").text
            category = table.find("ErrorType").text
            line = table.find("Line").text
            file_ = table.find("File").text
            defect_id = table.find("ErrorCode").text
            cwe = table.find("CWECode")
            cwe_id = None
            if cwe is not None:
                cwe_id = cwe.text.lower().replace("cwe-", "")
            misra = table.find("MISRA")
            if misra is not None:
                message += " [%s]" % misra.text

            severity = DefectItem.SEVERITY_LOW
            if category == "error":
                severity = DefectItem.SEVERITY_HIGH
            elif category == "warning":
                severity = DefectItem.SEVERITY_MEDIUM

            defects.append(
                DefectItem(
                    severity, category, message, file_, line, id=defect_id, cwe=cwe_id
                )
            )

        return defects
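`parse_defects` above folds PVS-Studio's report categories into PlatformIO's three severity levels. A standalone sketch of that mapping; the numeric constants stand in for `DefectItem.SEVERITY_*` and are illustrative, not taken from the diff:

```python
# Stand-ins for DefectItem.SEVERITY_* (illustrative values)
SEVERITY_HIGH, SEVERITY_MEDIUM, SEVERITY_LOW = 1, 2, 4


def map_severity(category):
    # "error" -> high, "warning" -> medium, everything else (e.g. "style",
    # MISRA notes) -> low, the default
    severity = SEVERITY_LOW
    if category == "error":
        severity = SEVERITY_HIGH
    elif category == "warning":
        severity = SEVERITY_MEDIUM
    return severity
```

This is also why `_process_defects` can filter against `self.options["severity"]`: every defect arrives tagged with one of the three levels.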
    def configure_command(self, src_file):  # pylint: disable=arguments-differ
        if os.path.isfile(self._tmp_output_file):
            os.remove(self._tmp_output_file)

        if not os.path.isfile(self._tmp_preprocessed_file):
            click.echo(
                "Error: Missing preprocessed file '%s'" % (self._tmp_preprocessed_file)
            )
            return ""

        cmd = [
            self.tool_path,
            "--skip-cl-exe",
            "yes",
            "--language",
            "C" if src_file.endswith(".c") else "C++",
            "--preprocessor",
            "gcc",
            "--cfg",
            self._tmp_cfg_file,
            "--source-file",
            src_file,
            "--i-file",
            self._tmp_preprocessed_file,
            "--output-file",
            self._tmp_output_file,
        ]

        flags = self.get_flags("pvs-studio")
        if not self.is_flag_set("--platform", flags):
            cmd.append("--platform=arm")
        cmd.extend(flags)

        return cmd

    def _generate_tmp_file_path(self):
        # pylint: disable=protected-access
        return os.path.join(self._tmp_dir, next(tempfile._get_candidate_names()))
    def _prepare_preprocessed_file(self, src_file):
        flags = self.cxx_flags
        compiler = self.cxx_path
        if src_file.endswith(".c"):
            flags = self.cc_flags
            compiler = self.cc_path

        cmd = [compiler, src_file, "-E", "-o", self._tmp_preprocessed_file]
        cmd.extend([f for f in flags if f])
        cmd.extend(["-D%s" % d for d in self.cpp_defines])
        cmd.append('@"%s"' % self._tmp_cmd_file)

        result = proc.exec_command(" ".join(cmd), shell=True)
        if result["returncode"] != 0:
            if self.options.get("verbose"):
                click.echo(" ".join(cmd))
            click.echo(result["err"])
            self._bad_input = True

    def clean_up(self):
        if os.path.isdir(self._tmp_dir):
            shutil.rmtree(self._tmp_dir)

    def check(self, on_defect_callback=None):
        self._on_defect_callback = on_defect_callback
        src_files = [
            f for f in self.get_project_target_files() if not f.endswith((".h", ".hpp"))
        ]

        for src_file in src_files:
            self._prepare_preprocessed_file(src_file)
            cmd = self.configure_command(src_file)
            if self.options.get("verbose"):
                click.echo(" ".join(cmd))
            if not cmd:
                self._bad_input = True
                continue

            result = proc.exec_command(cmd)
            # pylint: disable=unsupported-membership-test
            if result["returncode"] != 0 or "License was not entered" in result["err"]:
                self._bad_input = True
                click.echo(result["err"])
                continue

            self._process_defects(self.parse_defects(self._tmp_output_file))

        self.clean_up()

        return self._bad_input
@@ -14,15 +14,15 @@

from glob import glob
from os import getenv, makedirs, remove
from os.path import abspath, basename, isdir, isfile, join
from os.path import basename, isdir, isfile, join, realpath
from shutil import copyfile, copytree
from tempfile import mkdtemp

import click

from platformio import app, fs
from platformio.commands.init import cli as cmd_init
from platformio.commands.init import validate_boards
from platformio.commands.project import project_init as cmd_project_init
from platformio.commands.project import validate_boards
from platformio.commands.run.command import cli as cmd_run
from platformio.compat import glob_escape
from platformio.exception import CIBuildEnvsEmpty
@@ -35,7 +35,7 @@ def validate_path(ctx, param, value):  # pylint: disable=unused-argument
    for i, p in enumerate(value):
        if p.startswith("~"):
            value[i] = fs.expanduser(p)
        value[i] = abspath(value[i])
        value[i] = realpath(value[i])
        if not glob(value[i]):
            invalid_path = p
            break
@@ -111,7 +111,10 @@ def cli(  # pylint: disable=too-many-arguments, too-many-branches

        # initialise project
        ctx.invoke(
            cmd_init, project_dir=build_dir, board=board, project_option=project_option
            cmd_project_init,
            project_dir=build_dir,
            board=board,
            project_option=project_option,
        )

        # process project
@@ -158,7 +161,7 @@ def _exclude_contents(dst_dir, patterns):
    for p in patterns:
        contents += glob(join(glob_escape(dst_dir), p))
    for path in contents:
        path = abspath(path)
        path = realpath(path)
        if isdir(path):
            fs.rmtree(path)
        elif isfile(path):
@@ -12,13 +12,12 @@
# See the License for the specific language governing permissions and
# limitations under the License.

import json
import os
import re
import signal
import time
from hashlib import sha1
from os.path import abspath, basename, dirname, isdir, join, splitext
from os.path import basename, dirname, isdir, join, realpath, splitext
from tempfile import mkdtemp

from twisted.internet import protocol  # pylint: disable=import-error
@@ -26,13 +25,13 @@ from twisted.internet import reactor  # pylint: disable=import-error
from twisted.internet import stdio  # pylint: disable=import-error
from twisted.internet import task  # pylint: disable=import-error

from platformio import app, exception, fs, proc, util
from platformio import app, fs, proc, telemetry, util
from platformio.commands.debug import helpers, initcfgs
from platformio.commands.debug.exception import DebugInvalidOptionsError
from platformio.commands.debug.process import BaseProcess
from platformio.commands.debug.server import DebugServer
from platformio.compat import hashlib_encode_data, is_bytes
from platformio.project.helpers import get_project_cache_dir
from platformio.telemetry import MeasurementProtocol

LOG_FILE = None
@@ -58,6 +57,7 @@ class GDBClient(BaseProcess):  # pylint: disable=too-many-instance-attributes
        self._target_is_run = False
        self._last_server_activity = 0
        self._auto_continue_timer = None
        self._errors_buffer = b""

    def spawn(self, gdb_path, prog_path):
        session_hash = gdb_path + prog_path
@@ -94,7 +94,7 @@ class GDBClient(BaseProcess):  # pylint: disable=too-many-instance-attributes
        ]
        args.extend(self.args)
        if not gdb_path:
            raise exception.DebugInvalidOptions("GDB client is not configured")
            raise DebugInvalidOptionsError("GDB client is not configured")
        gdb_data_dir = self._get_data_dir(gdb_path)
        if gdb_data_dir:
            args.extend(["--data-directory", gdb_data_dir])
@@ -108,7 +108,7 @@ class GDBClient(BaseProcess):  # pylint: disable=too-many-instance-attributes
    def _get_data_dir(gdb_path):
        if "msp430" in gdb_path:
            return None
        gdb_data_dir = abspath(join(dirname(gdb_path), "..", "share", "gdb"))
        gdb_data_dir = realpath(join(dirname(gdb_path), "..", "share", "gdb"))
        return gdb_data_dir if isdir(gdb_data_dir) else None

    def generate_pioinit(self, dst_dir, patterns):
@@ -215,6 +215,9 @@ class GDBClient(BaseProcess):  # pylint: disable=too-many-instance-attributes
        self._handle_error(data)
        # go to init break automatically
        if self.INIT_COMPLETED_BANNER.encode() in data:
            telemetry.send_event(
                "Debug", "Started", telemetry.encode_run_environment(self.env_options)
            )
            self._auto_continue_timer = task.LoopingCall(self._auto_exec_continue)
            self._auto_continue_timer.start(0.1)

@@ -250,20 +253,19 @@ class GDBClient(BaseProcess):  # pylint: disable=too-many-instance-attributes
        self._target_is_run = True

    def _handle_error(self, data):
        self._errors_buffer += data
        if self.PIO_SRC_NAME.encode() not in data or b"Error in sourced" not in data:
            return
        configuration = {"debug": self.debug_options, "env": self.env_options}
        exd = re.sub(r'\\(?!")', "/", json.dumps(configuration))
        exd = re.sub(
            r'"(?:[a-z]\:)?((/[^"/]+)+)"',
            lambda m: '"%s"' % join(*m.group(1).split("/")[-2:]),
            exd,
            re.I | re.M,

        last_erros = self._errors_buffer.decode()
        last_erros = " ".join(reversed(last_erros.split("\n")))
        last_erros = re.sub(r'((~|&)"|\\n\"|\\t)', " ", last_erros, flags=re.M)

        err = "%s -> %s" % (
            telemetry.encode_run_environment(self.env_options),
            last_erros,
        )
        mp = MeasurementProtocol()
        mp["exd"] = "DebugGDBPioInitError: %s" % exd
        mp["exf"] = 1
        mp.send("exception")
        telemetry.send_exception("DebugInitError: %s" % err)
        self.transport.loseConnection()

    def _kill_previous_session(self):
@@ -23,8 +23,10 @@ import click

from platformio import app, exception, fs, proc, util
from platformio.commands.debug import helpers
from platformio.commands.debug.exception import DebugInvalidOptionsError
from platformio.managers.core import inject_contrib_pysite
from platformio.project.config import ProjectConfig
from platformio.project.exception import ProjectEnvsNotAvailableError
from platformio.project.helpers import is_platformio_project, load_project_ide_data


@@ -70,7 +72,7 @@ def cli(ctx, project_dir, project_conf, environment, verbose, interface, __unprocessed):
    env_name = environment or helpers.get_default_debug_env(config)
    env_options = config.items(env=env_name, as_dict=True)
    if not set(env_options.keys()) >= set(["platform", "board"]):
        raise exception.ProjectEnvsNotAvailable()
        raise ProjectEnvsNotAvailableError()
    debug_options = helpers.validate_debug_options(ctx, env_options)
    assert debug_options

@@ -79,7 +81,7 @@ def cli(ctx, project_dir, project_conf, environment, verbose, interface, __unprocessed):

    configuration = load_project_ide_data(project_dir, env_name)
    if not configuration:
        raise exception.DebugInvalidOptions("Could not load debug configuration")
        raise DebugInvalidOptionsError("Could not load debug configuration")

    if "--version" in __unprocessed:
        result = proc.exec_command([configuration["gdb_path"], "--version"])
@@ -140,7 +142,7 @@ def cli(ctx, project_dir, project_conf, environment, verbose, interface, __unprocessed):
    helpers.is_prog_obsolete(configuration["prog_path"])

    if not isfile(configuration["prog_path"]):
        raise exception.DebugInvalidOptions("Program/firmware is missed")
        raise DebugInvalidOptionsError("Program/firmware is missed")

    # run debugging client
    inject_contrib_pysite()
platformio/commands/debug/exception.py (new file, 33 lines)
@@ -0,0 +1,33 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from platformio.exception import PlatformioException, UserSideException


class DebugError(PlatformioException):
    pass


class DebugSupportError(DebugError, UserSideException):

    MESSAGE = (
        "Currently, PlatformIO does not support debugging for `{0}`.\n"
        "Please request support at https://github.com/platformio/"
        "platformio-core/issues \nor visit -> https://docs.platformio.org"
        "/page/plus/debugging.html"
    )


class DebugInvalidOptionsError(DebugError, UserSideException):
    pass
@@ -22,6 +22,7 @@ from os.path import isfile

from platformio import exception, fs, util
from platformio.commands import PlatformioCLI
from platformio.commands.debug.exception import DebugInvalidOptionsError
from platformio.commands.platform import platform_install as cmd_platform_install
from platformio.commands.run.command import cli as cmd_run
from platformio.compat import is_bytes
@@ -301,7 +302,5 @@ def reveal_debug_port(env_debug_port, tool_name, tool_settings):

    debug_port = _look_for_serial_port(tool_settings.get("hwids", []))
    if not debug_port:
        raise exception.DebugInvalidOptions(
            "Please specify `debug_port` for environment"
        )
        raise DebugInvalidOptionsError("Please specify `debug_port` for environment")
    return debug_port
@@ -59,8 +59,8 @@ end
target extended-remote $DEBUG_PORT
monitor clrbp
monitor speed auto
$LOAD_CMDS
pio_reset_halt_target
$LOAD_CMDS
$INIT_BREAK
"""
@@ -15,10 +15,10 @@
import os
from os.path import isdir, isfile, join

from twisted.internet import error  # pylint: disable=import-error
from twisted.internet import reactor  # pylint: disable=import-error

from platformio import exception, fs, util
from platformio import fs, util
from platformio.commands.debug.exception import DebugInvalidOptionsError
from platformio.commands.debug.helpers import escape_gdbmi_stream, is_gdbmi_mode
from platformio.commands.debug.process import BaseProcess
from platformio.proc import where_is_program
@@ -54,7 +54,7 @@ class DebugServer(BaseProcess):
        if not isfile(server_executable):
            server_executable = where_is_program(server_executable)
            if not isfile(server_executable):
                raise exception.DebugInvalidOptions(
                raise DebugInvalidOptionsError(
                    "\nCould not launch Debug Server '%s'. Please check that it "
                    "is installed and is included in a system PATH\n\n"
                    "See documentation or contact contact@platformio.org:\n"
@@ -134,5 +134,5 @@ class DebugServer(BaseProcess):
            return
        try:
            self._transport.signalProcess("KILL")
        except (OSError, error.ProcessExitedAlready):
        except:  # pylint: disable=bare-except
            pass
@ -21,7 +21,9 @@ from serial.tools import miniterm
|
||||
|
||||
from platformio import exception, fs, util
|
||||
from platformio.compat import dump_json_to_unicode
|
||||
from platformio.managers.platform import PlatformFactory
|
||||
from platformio.project.config import ProjectConfig
|
||||
from platformio.project.exception import NotPlatformIOProjectError
|
||||
|
||||
|
||||
@click.group(short_help="Monitor device or list existing")
|
||||
@@ -172,48 +174,49 @@ def device_list(  # pylint: disable=too-many-branches
    help="Load configuration from `platformio.ini` and specified environment",
)
def device_monitor(**kwargs):  # pylint: disable=too-many-branches
    env_options = {}
    click.echo(
        "Looking for advanced Serial Monitor with UI? "
        "Check http://bit.ly/pio-advanced-monitor"
    )
    project_options = {}
    try:
        with fs.cd(kwargs["project_dir"]):
            env_options = get_project_options(kwargs["environment"])
            for k in ("port", "speed", "rts", "dtr"):
                k2 = "monitor_%s" % k
                if k == "speed":
                    k = "baud"
                if kwargs[k] is None and k2 in env_options:
                    kwargs[k] = env_options[k2]
                    if k != "port":
                        kwargs[k] = int(kwargs[k])
    except exception.NotPlatformIOProject:
            project_options = get_project_options(kwargs["environment"])
        kwargs = apply_project_monitor_options(kwargs, project_options)
    except NotPlatformIOProjectError:
        pass

    if not kwargs["port"]:
        ports = util.get_serial_ports(filter_hwid=True)
        if len(ports) == 1:
            kwargs["port"] = ports[0]["port"]

    sys.argv = ["monitor"] + env_options.get("monitor_flags", [])
    for k, v in kwargs.items():
        if k in ("port", "baud", "rts", "dtr", "environment", "project_dir"):
            continue
        k = "--" + k.replace("_", "-")
        if k in env_options.get("monitor_flags", []):
            continue
        if isinstance(v, bool):
            if v:
                sys.argv.append(k)
        elif isinstance(v, tuple):
            for i in v:
                sys.argv.extend([k, i])
        else:
            sys.argv.extend([k, str(v)])

    if kwargs["port"] and (set(["*", "?", "[", "]"]) & set(kwargs["port"])):
        elif "platform" in project_options and "board" in project_options:
            board_hwids = get_board_hwids(
                kwargs["project_dir"],
                project_options["platform"],
                project_options["board"],
            )
            for item in ports:
                for hwid in board_hwids:
                    hwid_str = ("%s:%s" % (hwid[0], hwid[1])).replace("0x", "")
                    if hwid_str in item["hwid"]:
                        kwargs["port"] = item["port"]
                        break
                if kwargs["port"]:
                    break
    elif kwargs["port"] and (set(["*", "?", "[", "]"]) & set(kwargs["port"])):
        for item in util.get_serial_ports():
            if fnmatch(item["port"], kwargs["port"]):
                kwargs["port"] = item["port"]
                break

    # override system argv with patched options
    sys.argv = ["monitor"] + options_to_argv(
        kwargs,
        project_options,
        ignore=("port", "baud", "rts", "dtr", "environment", "project_dir"),
    )

    try:
        miniterm.main(
            default_port=kwargs["port"],
@@ -225,6 +228,37 @@ def device_monitor(**kwargs):  # pylint: disable=too-many-branches
        raise exception.MinitermException(e)


def apply_project_monitor_options(cli_options, project_options):
    for k in ("port", "speed", "rts", "dtr"):
        k2 = "monitor_%s" % k
        if k == "speed":
            k = "baud"
        if cli_options[k] is None and k2 in project_options:
            cli_options[k] = project_options[k2]
            if k != "port":
                cli_options[k] = int(cli_options[k])
    return cli_options


def options_to_argv(cli_options, project_options, ignore=None):
    result = project_options.get("monitor_flags", [])
    for k, v in cli_options.items():
        if v is None or (ignore and k in ignore):
            continue
        k = "--" + k.replace("_", "-")
        if k in project_options.get("monitor_flags", []):
            continue
        if isinstance(v, bool):
            if v:
                result.append(k)
        elif isinstance(v, tuple):
            for i in v:
                result.extend([k, i])
        else:
            result.extend([k, str(v)])
    return result


def get_project_options(environment=None):
    config = ProjectConfig.get_instance()
    config.validate(envs=[environment] if environment else None)
@@ -235,3 +269,12 @@ def get_project_options(environment=None):
    else:
        environment = config.envs()[0]
    return config.items(env=environment, as_dict=True)


def get_board_hwids(project_dir, platform, board):
    with fs.cd(project_dir):
        return (
            PlatformFactory.newPlatform(platform)
            .board_config(board)
            .get("build.hwids", [])
        )
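The new `options_to_argv` helper above folds configuration-declared flags and CLI keyword options into one argv list. A minimal standalone sketch of the same idea (the function and option names here mirror the diff but are illustrative, not the PlatformIO API):

```python
def options_to_argv(cli_options, project_options, ignore=None):
    # Start from flags already declared in the project configuration.
    result = list(project_options.get("monitor_flags", []))
    for k, v in cli_options.items():
        if v is None or (ignore and k in ignore):
            continue
        flag = "--" + k.replace("_", "-")
        # Don't duplicate a flag the project config already sets.
        if flag in project_options.get("monitor_flags", []):
            continue
        if isinstance(v, bool):
            if v:
                result.append(flag)  # boolean options become bare switches
        elif isinstance(v, tuple):
            for item in v:
                result.extend([flag, item])  # multi-value options repeat the flag
        else:
            result.extend([flag, str(v)])
    return result
```

Note the sketch copies `monitor_flags` with `list(...)` so the caller's config list is not mutated; the diffed code appends to the original list.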
@@ -21,6 +21,7 @@ from os.path import isdir

import click

from platformio import exception
from platformio.compat import WINDOWS
from platformio.managers.core import get_core_package_dir, inject_contrib_pysite
@@ -87,15 +88,7 @@ def cli(port, host, no_open, shutdown_timeout):
    if host == "__do_not_start__":
        return

    # if already started
    already_started = False
    socket.setdefaulttimeout(1)
    try:
        socket.socket(socket.AF_INET, socket.SOCK_STREAM).connect((host, port))
        already_started = True
    except:  # pylint: disable=bare-except
        pass
    already_started = is_port_used(host, port)
    home_url = "http://%s:%d" % (host, port)
    if not no_open:
        if already_started:
@@ -116,12 +109,35 @@ def cli(port, host, no_open, shutdown_timeout):
            )
        )
        click.echo("")
        click.echo("Open PIO Home in your browser by this URL => %s" % home_url)
        click.echo("Open PlatformIO Home in your browser by this URL => %s" % home_url)

    if already_started:
        click.secho(
            "PlatformIO Home server is already started in another process.", fg="yellow"
        )
        return

    click.echo("PIO Home has been started. Press Ctrl+C to shutdown.")

    reactor.listenTCP(port, site, interface=host)
    reactor.run()


def is_port_used(host, port):
    socket.setdefaulttimeout(1)
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    if WINDOWS:
        try:
            s.bind((host, port))
            s.close()
            return False
        except (OSError, socket.error):
            pass
    else:
        try:
            s.connect((host, port))
            s.close()
        except socket.error:
            return False

    return True
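The connect-based branch of `is_port_used` above (the non-Windows path) can be sketched in isolation: if a TCP connect succeeds, something is already listening on that port. This is a simplified standalone version, not the exact PlatformIO implementation:

```python
import socket


def is_port_used(host, port):
    # Try to connect: success means something is already listening.
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(1)
    try:
        s.connect((host, port))
        return True
    except (OSError, socket.error):
        return False
    finally:
        s.close()
```

The bind-based check used on Windows exists because connecting to an unused local port can behave differently there; binding succeeds only if nothing else holds the port.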
@@ -23,7 +23,7 @@ from platformio.commands.home.rpc.handlers.os import OSRPC

class MiscRPC(object):
    def load_latest_tweets(self, data_url):
        cache_key = data_url
        cache_key = app.ContentCache.key_from_args(data_url, "tweets")
        cache_valid = "7d"
        with app.ContentCache() as cc:
            cache_data = cc.get(cache_key)
@@ -14,12 +14,10 @@

from __future__ import absolute_import

import codecs
import glob
import os
import shutil
from functools import cmp_to_key
from os.path import isdir, isfile, join

import click
from twisted.internet import defer  # pylint: disable=import-error
@@ -67,10 +65,9 @@ class OSRPC(object):
    def request_content(self, uri, data=None, headers=None, cache_valid=None):
        if uri.startswith("http"):
            return self.fetch_content(uri, data, headers, cache_valid)
        if not isfile(uri):
            return None
        with codecs.open(uri, encoding="utf-8") as fp:
            return fp.read()
        if os.path.isfile(uri):
            return fs.get_file_contents(uri, encoding="utf8")
        return None
    @staticmethod
    def open_url(url):
@@ -88,16 +85,20 @@ class OSRPC(object):

    @staticmethod
    def is_file(path):
        return isfile(path)
        return os.path.isfile(path)

    @staticmethod
    def is_dir(path):
        return isdir(path)
        return os.path.isdir(path)

    @staticmethod
    def make_dirs(path):
        return os.makedirs(path)

    @staticmethod
    def get_file_mtime(path):
        return os.path.getmtime(path)

    @staticmethod
    def rename(src, dst):
        return os.rename(src, dst)
@@ -112,7 +113,7 @@ class OSRPC(object):
        pathnames = [pathnames]
        result = set()
        for pathname in pathnames:
            result |= set(glob.glob(join(root, pathname) if root else pathname))
            result |= set(glob.glob(os.path.join(root, pathname) if root else pathname))
        return list(result)
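The glob helper above accepts either a single pattern or a list, optionally anchoring each pattern to a root directory, and de-duplicates matches via a set. A self-contained sketch of the same logic (the `glob_patterns` name is illustrative; the sketch sorts the result for determinism, which the original does not):

```python
import glob
import os
import tempfile


def glob_patterns(pathnames, root=None):
    # Accept a single pattern or a list; resolve each against an optional root.
    if not isinstance(pathnames, list):
        pathnames = [pathnames]
    result = set()
    for pathname in pathnames:
        result |= set(glob.glob(os.path.join(root, pathname) if root else pathname))
    return sorted(result)
```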
    @staticmethod
@@ -131,13 +132,13 @@ class OSRPC(object):
        items = []
        if path.startswith("~"):
            path = fs.expanduser(path)
        if not isdir(path):
        if not os.path.isdir(path):
            return items
        for item in os.listdir(path):
            try:
                item_is_dir = isdir(join(path, item))
                item_is_dir = os.path.isdir(os.path.join(path, item))
                if item_is_dir:
                    os.listdir(join(path, item))
                    os.listdir(os.path.join(path, item))
                items.append((item, item_is_dir))
            except OSError:
                pass
@@ -17,7 +17,6 @@ from __future__ import absolute_import
import os
import shutil
import time
from os.path import basename, getmtime, isdir, isfile, join, realpath, sep

import jsonrpc  # pylint: disable=import-error

@@ -28,6 +27,7 @@ from platformio.compat import PY2, get_filesystem_encoding
from platformio.ide.projectgenerator import ProjectGenerator
from platformio.managers.platform import PlatformManager
from platformio.project.config import ProjectConfig
from platformio.project.exception import ProjectError
from platformio.project.helpers import get_project_dir, is_platformio_project
from platformio.project.options import get_config_options_schema
@@ -38,7 +38,7 @@ class ProjectRPC(object):
        assert isinstance(init_kwargs, dict)
        assert "path" in init_kwargs
        project_dir = get_project_dir()
        if isfile(init_kwargs["path"]):
        if os.path.isfile(init_kwargs["path"]):
            project_dir = os.path.dirname(init_kwargs["path"])
        with fs.cd(project_dir):
            return getattr(ProjectConfig(**init_kwargs), method)(*args)
@@ -74,7 +74,7 @@ class ProjectRPC(object):
        return get_config_options_schema()

    @staticmethod
    def _get_projects(project_dirs=None):
    def get_projects():
        def _get_project_data():
            data = {"boards": [], "envLibdepsDirs": [], "libExtraDirs": []}
            config = ProjectConfig()
@@ -86,7 +86,7 @@ class ProjectRPC(object):
            for section in config.sections():
                if not section.startswith("env:"):
                    continue
                data["envLibdepsDirs"].append(join(libdeps_dir, section[4:]))
                data["envLibdepsDirs"].append(os.path.join(libdeps_dir, section[4:]))
                if config.has_option(section, "board"):
                    data["boards"].append(config.get(section, "board"))
                data["libExtraDirs"].extend(config.get(section, "lib_extra_dirs", []))
@@ -94,28 +94,27 @@ class ProjectRPC(object):
            # skip non existing folders and resolve full path
            for key in ("envLibdepsDirs", "libExtraDirs"):
                data[key] = [
                    fs.expanduser(d) if d.startswith("~") else realpath(d)
                    fs.expanduser(d) if d.startswith("~") else os.path.realpath(d)
                    for d in data[key]
                    if isdir(d)
                    if os.path.isdir(d)
                ]

            return data
        def _path_to_name(path):
            return (sep).join(path.split(sep)[-2:])

        if not project_dirs:
            project_dirs = AppRPC.load_state()["storage"]["recentProjects"]
            return (os.path.sep).join(path.split(os.path.sep)[-2:])

        result = []
        pm = PlatformManager()
        for project_dir in project_dirs:
        for project_dir in AppRPC.load_state()["storage"]["recentProjects"]:
            if not os.path.isdir(project_dir):
                continue
            data = {}
            boards = []
            try:
                with fs.cd(project_dir):
                    data = _get_project_data()
            except exception.PlatformIOProjectException:
            except ProjectError:
                continue

            for board_id in data.get("boards", []):
@@ -130,12 +129,12 @@ class ProjectRPC(object):
                {
                    "path": project_dir,
                    "name": _path_to_name(project_dir),
                    "modified": int(getmtime(project_dir)),
                    "modified": int(os.path.getmtime(project_dir)),
                    "boards": boards,
                    "description": data.get("description"),
                    "envs": data.get("envs", []),
                    "envLibStorages": [
                        {"name": basename(d), "path": d}
                        {"name": os.path.basename(d), "path": d}
                        for d in data.get("envLibdepsDirs", [])
                    ],
                    "extraLibStorages": [
@@ -146,27 +145,24 @@ class ProjectRPC(object):
            )
        return result
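The `_path_to_name` helper above shortens a project path to its last two components for display in PIO Home. Standalone sketch of the same transformation (the module-level `path_to_name` name is illustrative):

```python
import os


def path_to_name(path):
    # Keep only the last two path components, e.g. "Projects/blink".
    return os.path.sep.join(path.split(os.path.sep)[-2:])
```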
    def get_projects(self, project_dirs=None):
        return self._get_projects(project_dirs)

    @staticmethod
    def get_project_examples():
        result = []
        for manifest in PlatformManager().get_installed():
            examples_dir = join(manifest["__pkg_dir"], "examples")
            if not isdir(examples_dir):
            examples_dir = os.path.join(manifest["__pkg_dir"], "examples")
            if not os.path.isdir(examples_dir):
                continue
            items = []
            for project_dir, _, __ in os.walk(examples_dir):
                project_description = None
                try:
                    config = ProjectConfig(join(project_dir, "platformio.ini"))
                    config = ProjectConfig(os.path.join(project_dir, "platformio.ini"))
                    config.validate(silent=True)
                    project_description = config.get("platformio", "description")
                except exception.PlatformIOProjectException:
                except ProjectError:
                    continue

                path_tokens = project_dir.split(sep)
                path_tokens = project_dir.split(os.path.sep)
                items.append(
                    {
                        "name": "/".join(
@@ -190,7 +186,7 @@ class ProjectRPC(object):
    def init(self, board, framework, project_dir):
        assert project_dir
        state = AppRPC.load_state()
        if not isdir(project_dir):
        if not os.path.isdir(project_dir):
            os.makedirs(project_dir)
        args = ["init", "--board", board]
        if framework:
@@ -243,10 +239,10 @@ class ProjectRPC(object):
        with fs.cd(project_dir):
            config = ProjectConfig()
            src_dir = config.get_optional_dir("src")
            main_path = join(src_dir, "main.cpp")
            if isfile(main_path):
            main_path = os.path.join(src_dir, "main.cpp")
            if os.path.isfile(main_path):
                return project_dir
            if not isdir(src_dir):
            if not os.path.isdir(src_dir):
                os.makedirs(src_dir)
            fs.write_file_contents(main_path, main_content.strip())
        return project_dir
@@ -261,10 +257,10 @@ class ProjectRPC(object):

        is_arduino_project = any(
            [
                isfile(
                    join(
                os.path.isfile(
                    os.path.join(
                        arduino_project_dir,
                        "%s.%s" % (basename(arduino_project_dir), ext),
                        "%s.%s" % (os.path.basename(arduino_project_dir), ext),
                    )
                )
                for ext in ("ino", "pde")
@@ -276,10 +272,10 @@ class ProjectRPC(object):
        )

        state = AppRPC.load_state()
        project_dir = join(
        project_dir = os.path.join(
            state["storage"]["projectsDir"], time.strftime("%y%m%d-%H%M%S-") + board
        )
        if not isdir(project_dir):
        if not os.path.isdir(project_dir):
            os.makedirs(project_dir)
        args = ["init", "--board", board]
        args.extend(["--project-option", "framework = arduino"])
@@ -301,7 +297,7 @@ class ProjectRPC(object):
        with fs.cd(project_dir):
            config = ProjectConfig()
            src_dir = config.get_optional_dir("src")
            if isdir(src_dir):
            if os.path.isdir(src_dir):
                fs.rmtree(src_dir)
            shutil.copytree(arduino_project_dir, src_dir)
        return project_dir
@@ -312,9 +308,9 @@ class ProjectRPC(object):
        raise jsonrpc.exceptions.JSONRPCDispatchException(
            code=4001, message="Not an PlatformIO project: %s" % project_dir
        )
        new_project_dir = join(
        new_project_dir = os.path.join(
            AppRPC.load_state()["storage"]["projectsDir"],
            time.strftime("%y%m%d-%H%M%S-") + basename(project_dir),
            time.strftime("%y%m%d-%H%M%S-") + os.path.basename(project_dir),
        )
        shutil.copytree(project_dir, new_project_dir)
@@ -26,7 +26,7 @@ from platformio.commands import PlatformioCLI
from platformio.compat import dump_json_to_unicode
from platformio.managers.lib import LibraryManager, get_builtin_libs, is_builtin_lib
from platformio.package.manifest.parser import ManifestParserFactory
from platformio.package.manifest.schema import ManifestSchema, ManifestValidationError
from platformio.package.manifest.schema import ManifestSchema
from platformio.proc import is_ci
from platformio.project.config import ProjectConfig
from platformio.project.helpers import get_project_dir, is_platformio_project
@@ -495,11 +495,9 @@ def lib_register(config_url):
        raise exception.InvalidLibConfURL(config_url)

    # Validate manifest
    data, error = ManifestSchema(strict=False).load(
    ManifestSchema().load_manifest(
        ManifestParserFactory.new_from_url(config_url).as_dict()
    )
    if error:
        raise ManifestValidationError(error, data)

    result = util.get_api_result("/lib/register", data=dict(config_url=config_url))
    if "message" in result and result["message"]:
@@ -20,6 +20,7 @@ from platformio import app, exception, util
from platformio.commands.boards import print_boards
from platformio.compat import dump_json_to_unicode
from platformio.managers.platform import PlatformFactory, PlatformManager
from platformio.package.pack import PackagePacker


@click.group(short_help="Platform Manager")
@@ -298,14 +299,20 @@ def platform_show(platform, json_output):  # pylint: disable=too-many-branches
@click.option("--with-package", multiple=True)
@click.option("--without-package", multiple=True)
@click.option("--skip-default-package", is_flag=True)
@click.option("--with-all-packages", is_flag=True)
@click.option(
    "-f",
    "--force",
    is_flag=True,
    help="Reinstall/redownload dev/platform and its packages if exist",
)
def platform_install(
    platforms, with_package, without_package, skip_default_package, force
def platform_install(  # pylint: disable=too-many-arguments
    platforms,
    with_package,
    without_package,
    skip_default_package,
    with_all_packages,
    force,
):
    pm = PlatformManager()
    for platform in platforms:
@@ -314,6 +321,7 @@ def platform_install(
            with_packages=with_package,
            without_packages=without_package,
            skip_default_package=skip_default_package,
            with_all_packages=with_all_packages,
            force=force,
        ):
            click.secho(
@@ -403,3 +411,13 @@ def platform_update(  # pylint: disable=too-many-locals
        click.echo()

    return True


@cli.command(
    "pack", short_help="Create a tarball from development platform/tool package"
)
@click.argument("package", required=True, metavar="[source directory, tar.gz or zip]")
def platform_pack(package):
    p = PackagePacker(package)
    tarball_path = p.pack()
    click.secho('Wrote a tarball to "%s"' % tarball_path, fg="green")
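The new `pio platform pack` command delegates to `PackagePacker`, whose internals are not part of this diff. As a rough idea of what packing a dev-platform source directory into a tarball involves, here is a generic standard-library sketch (the `pack_directory` helper is hypothetical, not PlatformIO's implementation):

```python
import os
import tarfile
import tempfile


def pack_directory(src_dir, dest_dir=None):
    # Create "<basename>.tar.gz" from src_dir, inside dest_dir (or alongside src_dir).
    dest_dir = dest_dir or os.path.dirname(src_dir)
    tarball_path = os.path.join(dest_dir, os.path.basename(src_dir) + ".tar.gz")
    with tarfile.open(tarball_path, "w:gz") as tar:
        # arcname keeps paths inside the archive relative to the package root
        tar.add(src_dir, arcname=os.path.basename(src_dir))
    return tarball_path
```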
@@ -14,19 +14,61 @@

# pylint: disable=too-many-arguments,too-many-locals, too-many-branches

from os import getcwd, makedirs
from os.path import isdir, isfile, join
import os

import click
from tabulate import tabulate

from platformio import exception, fs
from platformio.commands.platform import platform_install as cli_platform_install
from platformio.ide.projectgenerator import ProjectGenerator
from platformio.managers.platform import PlatformManager
from platformio.project.config import ProjectConfig
from platformio.project.exception import NotPlatformIOProjectError
from platformio.project.helpers import is_platformio_project


@click.group(short_help="Project Manager")
def cli():
    pass


@cli.command("config", short_help="Show computed configuration")
@click.option(
    "-d",
    "--project-dir",
    default=os.getcwd,
    type=click.Path(
        exists=True, file_okay=True, dir_okay=True, writable=True, resolve_path=True
    ),
)
@click.option("--json-output", is_flag=True)
def project_config(project_dir, json_output):
    if not is_platformio_project(project_dir):
        raise NotPlatformIOProjectError(project_dir)
    with fs.cd(project_dir):
        config = ProjectConfig.get_instance()
    if json_output:
        return click.echo(config.to_json())
    click.echo(
        "Computed project configuration for %s" % click.style(project_dir, fg="cyan")
    )
    for section, options in config.as_tuple():
        click.echo()
        click.secho(section, fg="cyan")
        click.echo("-" * len(section))
        click.echo(
            tabulate(
                [
                    (name, "=", "\n".join(value) if isinstance(value, list) else value)
                    for name, value in options
                ],
                tablefmt="plain",
            )
        )
    return None
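`project_config` above renders each config section as a dashed header followed by `name = value` rows (via `tabulate`'s plain format). A dependency-free sketch of equivalent rendering, so the output shape is concrete (the `render_section` helper is illustrative, not part of PlatformIO):

```python
def render_section(section, options):
    # Render one config section as "name = value" lines under a dashed header.
    lines = [section, "-" * len(section)]
    for name, value in options:
        if isinstance(value, list):
            value = ", ".join(value)  # the real command joins list values with newlines
        lines.append("%s = %s" % (name, value))
    return "\n".join(lines)
```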
def validate_boards(ctx, param, value):  # pylint: disable=W0613
    pm = PlatformManager()
    for id_ in value:
@@ -40,11 +82,11 @@ def validate_boards(ctx, param, value):  # pylint: disable=W0613
    return value


@click.command("init", short_help="Initialize PlatformIO project or update existing")
@cli.command("init", short_help="Initialize a project or update existing")
@click.option(
    "--project-dir",
    "-d",
    default=getcwd,
    default=os.getcwd,
    type=click.Path(
        exists=True, file_okay=False, dir_okay=True, writable=True, resolve_path=True
    ),
@@ -55,7 +97,7 @@ def validate_boards(ctx, param, value):  # pylint: disable=W0613
@click.option("--env-prefix", default="")
@click.option("-s", "--silent", is_flag=True)
@click.pass_context
def cli(
def project_init(
    ctx,  # pylint: disable=R0913
    project_dir,
    board,
@@ -65,7 +107,7 @@ def cli(
    silent,
):
    if not silent:
        if project_dir == getcwd():
        if project_dir == os.getcwd():
            click.secho("\nThe current working directory", fg="yellow", nl=False)
            click.secho(" %s " % project_dir, fg="cyan", nl=False)
            click.secho("will be used for the project.", fg="yellow")
@@ -137,16 +179,16 @@ def init_base_project(project_dir):
        (config.get_optional_dir("test"), init_test_readme),
    ]
    for (path, cb) in dir_to_readme:
        if isdir(path):
        if os.path.isdir(path):
            continue
        makedirs(path)
        os.makedirs(path)
        if cb:
            cb(path)


def init_include_readme(include_dir):
    fs.write_file_contents(
        join(include_dir, "README"),
        os.path.join(include_dir, "README"),
        """
This directory is intended for project header files.

@@ -193,7 +235,7 @@ https://gcc.gnu.org/onlinedocs/cpp/Header-Files.html
def init_lib_readme(lib_dir):
    # pylint: disable=line-too-long
    fs.write_file_contents(
        join(lib_dir, "README"),
        os.path.join(lib_dir, "README"),
        """
This directory is intended for project specific (private) libraries.
PlatformIO will compile them to static libraries and link into executable file.
@@ -246,7 +288,7 @@ More information about PlatformIO Library Dependency Finder

def init_test_readme(test_dir):
    fs.write_file_contents(
        join(test_dir, "README"),
        os.path.join(test_dir, "README"),
        """
This directory is intended for PIO Unit Testing and project tests.

@@ -263,8 +305,8 @@ More information about PIO Unit Testing:


def init_ci_conf(project_dir):
    conf_path = join(project_dir, ".travis.yml")
    if isfile(conf_path):
    conf_path = os.path.join(project_dir, ".travis.yml")
    if os.path.isfile(conf_path):
        return
    fs.write_file_contents(
        conf_path,
@@ -340,8 +382,8 @@ def init_ci_conf(project_dir):


def init_cvs_ignore(project_dir):
    conf_path = join(project_dir, ".gitignore")
    if isfile(conf_path):
    conf_path = os.path.join(project_dir, ".gitignore")
    if os.path.isfile(conf_path):
        return
    fs.write_file_contents(conf_path, ".pio\n")

@@ -349,7 +391,9 @@ def init_cvs_ignore(project_dir):
def fill_project_envs(
    ctx, project_dir, board_ids, project_option, env_prefix, force_download
):
    config = ProjectConfig(join(project_dir, "platformio.ini"), parse_extra=False)
    config = ProjectConfig(
        os.path.join(project_dir, "platformio.ini"), parse_extra=False
    )
    used_boards = []
    for section in config.sections():
        cond = [section.startswith("env:"), config.has_option(section, "board")]
@@ -12,18 +12,18 @@
# See the License for the specific language governing permissions and
# limitations under the License.

import os
import sys
import threading
from os import getcwd
from os.path import isfile, join
from tempfile import mkdtemp
from time import sleep

import click

from platformio import exception, fs
from platformio.commands.device import device_monitor as cmd_device_monitor
from platformio.commands import device
from platformio.managers.core import pioplus_call
from platformio.project.exception import NotPlatformIOProjectError

# pylint: disable=unused-argument

@@ -83,7 +83,7 @@ def remote_update(only_check, dry_run):
@click.option(
    "-d",
    "--project-dir",
    default=getcwd,
    default=os.getcwd,
    type=click.Path(
        exists=True, file_okay=True, dir_okay=True, writable=True, resolve_path=True
    ),
@@ -104,7 +104,7 @@ def remote_run(**kwargs):
@click.option(
    "-d",
    "--project-dir",
    default=getcwd,
    default=os.getcwd,
    type=click.Path(
        exists=True, file_okay=False, dir_okay=True, writable=True, resolve_path=True
    ),
@@ -130,9 +130,7 @@ def device_list(json_output):

@remote_device.command("monitor", short_help="Monitor remote device")
@click.option("--port", "-p", help="Port, a number or a device name")
@click.option(
    "--baud", "-b", type=int, default=9600, help="Set baud rate, default=9600"
)
@click.option("--baud", "-b", type=int, help="Set baud rate, default=9600")
@click.option(
    "--parity",
    default="N",
@@ -183,25 +181,49 @@ def device_list(json_output):
    is_flag=True,
    help="Diagnostics: suppress non-error messages, default=Off",
)
@click.option(
    "-d",
    "--project-dir",
    default=os.getcwd,
    type=click.Path(exists=True, file_okay=False, dir_okay=True, resolve_path=True),
)
@click.option(
    "-e",
    "--environment",
    help="Load configuration from `platformio.ini` and specified environment",
)
@click.pass_context
def device_monitor(ctx, **kwargs):
    project_options = {}
    try:
        with fs.cd(kwargs["project_dir"]):
            project_options = device.get_project_options(kwargs["environment"])
        kwargs = device.apply_project_monitor_options(kwargs, project_options)
    except NotPlatformIOProjectError:
        pass

    kwargs["baud"] = kwargs["baud"] or 9600

    def _tx_target(sock_dir):
        pioplus_argv = ["remote", "device", "monitor"]
        pioplus_argv.extend(device.options_to_argv(kwargs, project_options))
        pioplus_argv.extend(["--sock", sock_dir])
        try:
            pioplus_call(sys.argv[1:] + ["--sock", sock_dir])
            pioplus_call(pioplus_argv)
        except exception.ReturnErrorCode:
            pass

    sock_dir = mkdtemp(suffix="pioplus")
    sock_file = join(sock_dir, "sock")
    sock_file = os.path.join(sock_dir, "sock")
    try:
        t = threading.Thread(target=_tx_target, args=(sock_dir,))
        t.start()
        while t.is_alive() and not isfile(sock_file):
        while t.is_alive() and not os.path.isfile(sock_file):
            sleep(0.1)
        if not t.is_alive():
            return
        kwargs["port"] = fs.get_file_contents(sock_file)
        ctx.invoke(cmd_device_monitor, **kwargs)
        ctx.invoke(device.device_monitor, **kwargs)
        t.join(2)
    finally:
        fs.rmtree(sock_dir)
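The remote monitor above hands a socket directory to a worker thread, then polls until the worker writes a `sock` file (or dies), reading the port name from it. That handshake can be sketched in isolation (the `wait_for_sock_file` name is illustrative; the worker in the test writes atomically via rename to avoid reading a half-written file):

```python
import os
import tempfile
import threading
import time


def wait_for_sock_file(sock_file, worker, timeout=5.0):
    # Poll until the worker writes its socket file, dies, or we time out.
    deadline = time.time() + timeout
    while worker.is_alive() and not os.path.isfile(sock_file) and time.time() < deadline:
        time.sleep(0.05)
    if not os.path.isfile(sock_file):
        return None
    with open(sock_file) as fp:
        return fp.read()
```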
@@ -16,6 +16,7 @@ from platformio import exception, telemetry
from platformio.commands.platform import platform_install as cmd_platform_install
from platformio.commands.test.processor import CTX_META_TEST_RUNNING_NAME
from platformio.managers.platform import PlatformFactory
from platformio.project.exception import UndefinedEnvPlatformError

# pylint: disable=too-many-instance-attributes

@@ -56,12 +57,12 @@ class EnvironmentProcessor(object):

    def process(self):
        if "platform" not in self.options:
            raise exception.UndefinedEnvPlatform(self.name)
            raise UndefinedEnvPlatformError(self.name)

        build_vars = self.get_build_variables()
        build_targets = list(self.get_build_targets())

        telemetry.on_run_environment(self.options, build_targets)
        telemetry.send_run_environment(self.options, build_targets)

        # skip monitor target, we call it above
        if "monitor" in build_targets:
@@ -107,7 +107,8 @@ def cli(  # pylint: disable=redefined-builtin
        raise exception.TestDirNotExists(test_dir)
    test_names = get_test_names(test_dir)

    click.echo("Verbose mode can be enabled via `-v, --verbose` option")
    if not verbose:
        click.echo("Verbose mode can be enabled via `-v, --verbose` option")
    click.secho("Collected %d items" % len(test_names), bold=True)

    results = []
@@ -159,6 +160,7 @@ def cli(  # pylint: disable=redefined-builtin
            monitor_rts=monitor_rts,
            monitor_dtr=monitor_dtr,
            verbose=verbose,
            silent=not verbose,
        ),
    )
    result = {
@@ -46,7 +46,7 @@ class EmbeddedTestProcessor(TestProcessorBase):
            return False

        if self.options["without_testing"]:
            return None
            return True

        self.print_progress("Testing...")
        return self.run()
@@ -119,7 +119,8 @@ class TestProcessorBase(object):
            cmd_run,
            project_dir=self.options["project_dir"],
            upload_port=self.options["upload_port"],
            silent=not self.options["verbose"],
            verbose=self.options["verbose"],
            silent=self.options["silent"],
            environment=[self.env_name],
            disable_auto_clean="nobuild" in target,
            target=target,
@ -32,11 +32,14 @@ def get_filesystem_encoding():
|
||||
|
||||
|
||||
def get_locale_encoding():
|
||||
return locale.getdefaultlocale()[1]
|
||||
try:
|
||||
return locale.getdefaultlocale()[1]
|
||||
except ValueError:
|
||||
return None
|
||||
|
||||
|
||||
def get_class_attributes(cls):
|
||||
attributes = inspect.getmembers(cls, lambda a: not (inspect.isroutine(a)))
|
||||
attributes = inspect.getmembers(cls, lambda a: not inspect.isroutine(a))
|
||||
return {
|
||||
a[0]: a[1]
|
||||
for a in attributes
|
||||
|
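The hardened `get_locale_encoding` above can be sketched as a standalone function (this is an illustration of the pattern in the hunk, not the shipped module):

```python
import locale


def get_locale_encoding():
    # locale.getdefaultlocale() may raise ValueError on platforms with
    # an unrecognized LANG/LC_* value; return None instead of letting
    # the lookup crash callers that only need a best-effort encoding.
    try:
        return locale.getdefaultlocale()[1]
    except ValueError:
        return None
```

Callers treat `None` the same as "encoding unknown", so the fallback is safe.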
@@ -12,10 +12,12 @@
# See the License for the specific language governing permissions and
# limitations under the License.

+import hashlib
+import io
+import math
+import sys
from email.utils import parsedate_tz
-from math import ceil
from os.path import getsize, join
-from sys import version_info
from time import mktime

import click

@@ -27,13 +29,9 @@ from platformio.exception import (
FDSizeMismatch,
FDUnrecognizedStatusCode,
)
-from platformio.proc import exec_command


class FileDownloader(object):

-CHUNK_SIZE = 1024

def __init__(self, url, dest_dir=None):
self._request = None
# make connection

@@ -41,7 +39,7 @@ class FileDownloader(object):
url,
stream=True,
headers=util.get_request_defheaders(),
-verify=version_info >= (2, 7, 9),
+verify=sys.version_info >= (2, 7, 9),
)
if self._request.status_code != 200:
raise FDUnrecognizedStatusCode(self._request.status_code, url)
@@ -74,18 +72,19 @@ class FileDownloader(object):
return -1
return int(self._request.headers["content-length"])

-def start(self, with_progress=True):
+def start(self, with_progress=True, silent=False):
label = "Downloading"
-itercontent = self._request.iter_content(chunk_size=self.CHUNK_SIZE)
+itercontent = self._request.iter_content(chunk_size=io.DEFAULT_BUFFER_SIZE)
f = open(self._destination, "wb")
try:
if not with_progress or self.get_size() == -1:
-click.echo("%s..." % label)
+if not silent:
+    click.echo("%s..." % label)
for chunk in itercontent:
if chunk:
f.write(chunk)
else:
-chunks = int(ceil(self.get_size() / float(self.CHUNK_SIZE)))
+chunks = int(math.ceil(self.get_size() / float(io.DEFAULT_BUFFER_SIZE)))
with click.progressbar(length=chunks, label=label) as pb:
for _ in pb:
f.write(next(itercontent))
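The progress-bar length above is just the download size divided by the new chunk size, rounded up so the final partial chunk still gets a tick. A small sketch (`chunk_count` is an illustrative name, not part of the diff):

```python
import io
import math


def chunk_count(total_size, chunk_size=io.DEFAULT_BUFFER_SIZE):
    # One progressbar tick per chunk; a trailing partial chunk still
    # needs a full tick, hence the ceiling division.
    return int(math.ceil(total_size / float(chunk_size)))
```

For example, `chunk_count(16385, 8192)` is `3`: two full 8 KiB chunks plus one byte.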
@@ -102,25 +101,19 @@ class FileDownloader(object):
_dlsize = getsize(self._destination)
if self.get_size() != -1 and _dlsize != self.get_size():
raise FDSizeMismatch(_dlsize, self._fname, self.get_size())

if not sha1:
return None

-dlsha1 = None
-try:
-    result = exec_command(["sha1sum", self._destination])
-    dlsha1 = result["out"]
-except (OSError, ValueError):
-    try:
-        result = exec_command(["shasum", "-a", "1", self._destination])
-        dlsha1 = result["out"]
-    except (OSError, ValueError):
-        pass
-if not dlsha1:
-    return None
-dlsha1 = dlsha1[1:41] if dlsha1.startswith("\\") else dlsha1[:40]
-if sha1.lower() != dlsha1.lower():
-    raise FDSHASumMismatch(dlsha1, self._fname, sha1)
+checksum = hashlib.sha1()
+with io.open(self._destination, "rb", buffering=0) as fp:
+    while True:
+        chunk = fp.read(io.DEFAULT_BUFFER_SIZE)
+        if not chunk:
+            break
+        checksum.update(chunk)
+
+if sha1.lower() != checksum.hexdigest().lower():
+    raise FDSHASumMismatch(checksum.hexdigest(), self._fname, sha1)
return True

def _preserve_filemtime(self, lmdate):
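The replacement drops shelling out to `sha1sum`/`shasum` in favor of hashing in-process with `hashlib`. The same streaming pattern as a self-contained sketch (the function name is mine, for illustration):

```python
import hashlib
import io


def sha1_of_file(path, chunk_size=io.DEFAULT_BUFFER_SIZE):
    # Hash the file in fixed-size chunks so even large archives never
    # have to fit in memory at once; hashlib accepts incremental
    # update() calls and yields the same digest as one-shot hashing.
    checksum = hashlib.sha1()
    with io.open(path, "rb", buffering=0) as fp:
        while True:
            chunk = fp.read(chunk_size)
            if not chunk:
                break
            checksum.update(chunk)
    return checksum.hexdigest()
```

Besides removing the external-tool dependency, this also behaves identically on Windows, where `sha1sum` is usually absent.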
@@ -152,49 +152,6 @@ class FDSHASumMismatch(PlatformIOPackageException):
)


-#
-# Project
-#
-
-
-class PlatformIOProjectException(PlatformioException):
-    pass
-
-
-class NotPlatformIOProject(PlatformIOProjectException):
-
-    MESSAGE = (
-        "Not a PlatformIO project. `platformio.ini` file has not been "
-        "found in current working directory ({0}). To initialize new project "
-        "please use `platformio init` command"
-    )
-
-
-class InvalidProjectConf(PlatformIOProjectException):
-
-    MESSAGE = "Invalid '{0}' (project configuration file): '{1}'"
-
-
-class UndefinedEnvPlatform(PlatformIOProjectException):
-
-    MESSAGE = "Please specify platform for '{0}' environment"
-
-
-class ProjectEnvsNotAvailable(PlatformIOProjectException):
-
-    MESSAGE = "Please setup environments in `platformio.ini` file"
-
-
-class UnknownEnvNames(PlatformIOProjectException):
-
-    MESSAGE = "Unknown environment names '{0}'. Valid names are '{1}'"
-
-
-class ProjectOptionValueError(PlatformIOProjectException):
-
-    MESSAGE = "{0} for option `{1}` in section [{2}]"
-
-
#
# Library
#
@@ -319,7 +276,7 @@ class UpgradeError(PlatformioException):
"""


-class HomeDirPermissionsError(PlatformioException):
+class HomeDirPermissionsError(UserSideException):

MESSAGE = (
"The directory `{0}` or its parent directory is not owned by the "

@@ -338,20 +295,6 @@ class CygwinEnvDetected(PlatformioException):
)


-class DebugSupportError(PlatformioException):
-
-    MESSAGE = (
-        "Currently, PlatformIO does not support debugging for `{0}`.\n"
-        "Please request support at https://github.com/platformio/"
-        "platformio-core/issues \nor visit -> https://docs.platformio.org"
-        "/page/plus/debugging.html"
-    )
-
-
-class DebugInvalidOptions(PlatformioException):
-    pass
-
-
class TestDirNotExists(PlatformioException):

MESSAGE = (
@@ -40,7 +40,7 @@ class cd(object):


def get_source_dir():
-curpath = os.path.abspath(__file__)
+curpath = os.path.realpath(__file__)
if not os.path.isfile(curpath):
for p in sys.path:
if os.path.isfile(os.path.join(p, __file__)):

@@ -49,9 +49,9 @@ def get_source_dir():
return os.path.dirname(curpath)


-def get_file_contents(path):
+def get_file_contents(path, encoding=None):
try:
-with open(path) as fp:
+with io.open(path, encoding=encoding) as fp:
return fp.read()
except UnicodeDecodeError:
click.secho(

@@ -117,7 +117,7 @@ def ensure_udev_rules():
if not any(os.path.isfile(p) for p in installed_rules):
raise exception.MissedUdevRules

-origin_path = os.path.abspath(
+origin_path = os.path.realpath(
os.path.join(get_source_dir(), "..", "scripts", "99-platformio-udev.rules")
)
if not os.path.isfile(origin_path):

@@ -143,10 +143,10 @@ def path_endswith_ext(path, extensions):
return False


-def match_src_files(src_dir, src_filter=None, src_exts=None):
+def match_src_files(src_dir, src_filter=None, src_exts=None, followlinks=True):
def _append_build_item(items, item, src_dir):
if not src_exts or path_endswith_ext(item, src_exts):
-items.add(item.replace(src_dir + os.sep, ""))
+items.add(os.path.relpath(item, src_dir))

src_filter = src_filter or ""
if isinstance(src_filter, (list, tuple)):

@@ -159,7 +159,7 @@ def match_src_files(src_dir, src_filter=None, src_exts=None):
items = set()
for item in glob(os.path.join(glob_escape(src_dir), pattern)):
if os.path.isdir(item):
-for root, _, files in os.walk(item, followlinks=True):
+for root, _, files in os.walk(item, followlinks=followlinks):
for f in files:
_append_build_item(items, os.path.join(root, f), src_dir)
else:
@@ -12,10 +12,10 @@
# See the License for the specific language governing permissions and
# limitations under the License.

-import io
+import codecs
import os
import sys
-from os.path import abspath, basename, isdir, isfile, join, relpath
+from os.path import basename, isdir, isfile, join, realpath, relpath

import bottle

@@ -64,7 +64,7 @@ class ProjectGenerator(object):
"project_name": basename(self.project_dir),
"project_dir": self.project_dir,
"env_name": self.env_name,
-"user_home_dir": abspath(fs.expanduser("~")),
+"user_home_dir": realpath(fs.expanduser("~")),
"platformio_path": sys.argv[0]
if isfile(sys.argv[0])
else where_is_program("platformio"),

@@ -129,18 +129,18 @@ class ProjectGenerator(object):
dst_dir = join(self.project_dir, tpl_relpath)
if not isdir(dst_dir):
os.makedirs(dst_dir)

file_name = basename(tpl_path)[:-4]
contents = self._render_tpl(tpl_path, tpl_vars)
self._merge_contents(join(dst_dir, file_name), contents)

@staticmethod
def _render_tpl(tpl_path, tpl_vars):
-return bottle.template(fs.get_file_contents(tpl_path), **tpl_vars)
+with codecs.open(tpl_path, "r", encoding="utf8") as fp:
+    return bottle.SimpleTemplate(fp.read()).render(**tpl_vars)

@staticmethod
def _merge_contents(dst_path, contents):
if basename(dst_path) == ".gitignore" and isfile(dst_path):
return
-with io.open(dst_path, "w", encoding="utf8") as fp:
+with codecs.open(dst_path, "w", encoding="utf8") as fp:
fp.write(contents)
@@ -1,8 +1,8 @@
% _defines = " ".join(["-D%s" % d.replace(" ", "\\\\ ") for d in defines])
{
"execPath": "{{ cxx_path }}",
-"gccDefaultCFlags": "-fsyntax-only {{! cc_flags.replace(' -MMD ', ' ').replace('"', '\\"') }} {{ !_defines.replace('"', '\\"') }}",
-"gccDefaultCppFlags": "-fsyntax-only {{! cxx_flags.replace(' -MMD ', ' ').replace('"', '\\"') }} {{ !_defines.replace('"', '\\"') }}",
+"gccDefaultCFlags": "-fsyntax-only {{! to_unix_path(cc_flags).replace(' -MMD ', ' ').replace('"', '\\"') }} {{ !_defines.replace('"', '\\"') }}",
+"gccDefaultCppFlags": "-fsyntax-only {{! to_unix_path(cxx_flags).replace(' -MMD ', ' ').replace('"', '\\"') }} {{ !_defines.replace('"', '\\"') }}",
"gccErrorLimit": 15,
"gccIncludePaths": "{{ ','.join(includes) }}",
"gccSuppressWarnings": false

@@ -6,6 +6,7 @@
# The `CMakeListsUser.txt` will not be overwritten by PlatformIO.

cmake_minimum_required(VERSION 3.2)

project("{{project_name}}")

include(CMakeListsPrivate.txt)

@@ -5,7 +5,7 @@
# please create `CMakeListsUser.txt` in the root of project.
# The `CMakeListsUser.txt` will not be overwritten by PlatformIO.

-%from platformio.project.helpers import (load_project_ide_data)
+% from platformio.project.helpers import (load_project_ide_data)
%
% import re
%

@@ -22,10 +22,14 @@
% return path
% end
%
+% def _escape(text):
+% return to_unix_path(text).replace('"', '\\"')
+% end
+%
% envs = config.envs()

% if len(envs) > 1:
-set(CMAKE_CONFIGURATION_TYPES "{{ ";".join(envs) }}" CACHE STRING "" FORCE)
+set(CMAKE_CONFIGURATION_TYPES "{{ ";".join(envs) }};" CACHE STRING "" FORCE)
% else:
set(CMAKE_CONFIGURATION_TYPES "{{ env_name }}" CACHE STRING "" FORCE)
% end

@@ -37,8 +41,8 @@ set(SVD_PATH "{{ _normalize_path(svd_path) }}")

SET(CMAKE_C_COMPILER "{{ _normalize_path(cc_path) }}")
SET(CMAKE_CXX_COMPILER "{{ _normalize_path(cxx_path) }}")
-SET(CMAKE_CXX_FLAGS_DISTRIBUTION "{{cxx_flags}}")
-SET(CMAKE_C_FLAGS_DISTRIBUTION "{{cc_flags}}")
+SET(CMAKE_CXX_FLAGS "{{ _normalize_path(to_unix_path(cxx_flags)) }}")
+SET(CMAKE_C_FLAGS "{{ _normalize_path(to_unix_path(cc_flags)) }}")

% STD_RE = re.compile(r"\-std=[a-z\+]+(\d+)")
% cc_stds = STD_RE.findall(cc_flags)
@@ -13,6 +13,35 @@
% return to_unix_path(text).replace('"', '\\"')
% end
%
+% def _escape_required(flag):
+% return " " in flag and systype == "windows"
+% end
+%
+% def _split_flags(flags):
+% result = []
+% i = 0
+% flags = flags.strip()
+% while i < len(flags):
+% current_arg = []
+% while i < len(flags) and flags[i] != " ":
+% if flags[i] == '"':
+% quotes_idx = flags.find('"', i + 1)
+% current_arg.extend(flags[i + 1:quotes_idx])
+% i = quotes_idx + 1
+% else:
+% current_arg.append(flags[i])
+% i = i + 1
+% end
+% end
+% arg = "".join(current_arg)
+% if arg.strip():
+% result.append(arg.strip())
+% end
+% i = i + 1
+% end
+% return result
+% end
+%
% cleaned_includes = []
% for include in includes:
% if "toolchain-" not in dirname(commonprefix([include, cc_path])) and isdir(include):

@@ -55,17 +84,21 @@
% cc_stds = STD_RE.findall(cc_flags)
% cxx_stds = STD_RE.findall(cxx_flags)
%
-% # pass only architecture specific flags
-% cc_m_flags = " ".join([f.strip() for f in cc_flags.split(" ") if f.strip().startswith("-m")])
-%
% if cc_stds:
"cStandard": "c{{ cc_stds[-1] }}",
% end
% if cxx_stds:
"cppStandard": "c++{{ cxx_stds[-1] }}",
% end
-"compilerPath": "\"{{cc_path}}\" {{! _escape(cc_m_flags) }}"
+"compilerPath": "{{ cc_path }}",
+"compilerArgs": [
+% for flag in [ '"%s"' % _escape(f) if _escape_required(f) else f for f in _split_flags(
+% cc_flags) if f.startswith(("-m", "-i", "@"))]:
+"{{ flag }}",
+% end
+""
+]
}
],
"version": 4
}
}
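The `_split_flags` template helper above splits a compiler flag string into arguments while honoring double-quoted segments (e.g. include paths containing spaces), which a plain `str.split(" ")` would break apart. A plain-Python transcription of the same logic, for illustration:

```python
def split_flags(flags):
    # Split a flag string into arguments; a double-quoted run is
    # consumed as a whole so embedded spaces survive, and the quotes
    # themselves are dropped.
    result = []
    i = 0
    flags = flags.strip()
    while i < len(flags):
        current_arg = []
        while i < len(flags) and flags[i] != " ":
            if flags[i] == '"':
                quotes_idx = flags.find('"', i + 1)
                current_arg.extend(flags[i + 1:quotes_idx])
                i = quotes_idx + 1
            else:
                current_arg.append(flags[i])
                i += 1
        arg = "".join(current_arg)
        if arg.strip():
            result.append(arg.strip())
        i += 1
    return result
```

For example, `split_flags('-mcpu=cortex-m3 "-I C:/My Dir" -Os')` yields `['-mcpu=cortex-m3', '-I C:/My Dir', '-Os']`.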
@@ -1,7 +1,23 @@
+% import json
+% import os
+% import re
+%
+% recommendations = set(["platformio.platformio-ide"])
+% previous_json = os.path.join(project_dir, ".vscode", "extensions.json")
+% if os.path.isfile(previous_json):
+% fp = open(previous_json)
+% contents = re.sub(r"^\s*//.*$", "", fp.read(), flags=re.M).strip()
+% fp.close()
+% if contents:
+% recommendations |= set(json.loads(contents).get("recommendations", []))
+% end
+% end
{
-// See http://go.microsoft.com/fwlink/?LinkId=827846
-// for the documentation about the extensions.json format
-"recommendations": [
-"platformio.platformio-ide"
-]
-}
+// See http://go.microsoft.com/fwlink/?LinkId=827846
+// for the documentation about the extensions.json format
+"recommendations": [
+% for i, item in enumerate(sorted(recommendations)):
+"{{ item }}"{{ ("," if (i + 1) < len(recommendations) else "") }}
+% end
+]
+}
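The template now merges any recommendations a user already has in `.vscode/extensions.json` (which VSCode allows to contain `//` comments) with PlatformIO's default instead of overwriting them. The merge logic, extracted as a standalone sketch (the function name is mine):

```python
import json
import re


def merge_recommendations(previous_contents, defaults=("platformio.platformio-ide",)):
    # json.loads rejects the // comments VSCode tolerates, so strip
    # whole-line comments first, then union the file's existing
    # recommendations with the defaults.
    recommendations = set(defaults)
    contents = re.sub(r"^\s*//.*$", "", previous_contents, flags=re.M).strip()
    if contents:
        recommendations |= set(json.loads(contents).get("recommendations", []))
    return sorted(recommendations)
```

Using a set makes the merge idempotent: regenerating the project never duplicates an entry.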
@@ -12,8 +12,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.

-from os import remove
-from os.path import abspath, exists, getmtime
+import os
from time import sleep, time

from platformio import exception

@@ -45,15 +44,15 @@ class LockFile(object):
def __init__(self, path, timeout=LOCKFILE_TIMEOUT, delay=LOCKFILE_DELAY):
self.timeout = timeout
self.delay = delay
-self._lock_path = abspath(path) + ".lock"
+self._lock_path = os.path.realpath(path) + ".lock"
self._fp = None

def _lock(self):
-if not LOCKFILE_CURRENT_INTERFACE and exists(self._lock_path):
+if not LOCKFILE_CURRENT_INTERFACE and os.path.exists(self._lock_path):
# remove stale lock
-if time() - getmtime(self._lock_path) > 10:
+if time() - os.path.getmtime(self._lock_path) > 10:
try:
-remove(self._lock_path)
+os.remove(self._lock_path)
except:  # pylint: disable=bare-except
pass
else:

@@ -93,9 +92,9 @@ class LockFile(object):

def release(self):
self._unlock()
-if exists(self._lock_path):
+if os.path.exists(self._lock_path):
try:
-remove(self._lock_path)
+os.remove(self._lock_path)
except:  # pylint: disable=bare-except
pass

@@ -151,7 +151,7 @@ def after_upgrade(ctx):
"PlatformIO has been successfully upgraded to %s!\n" % __version__,
fg="green",
)
-telemetry.on_event(
+telemetry.send_event(
category="Auto",
action="Upgrade",
label="%s > %s" % (last_version, __version__),

@@ -315,7 +315,7 @@ def check_internal_updates(ctx, what):
ctx.invoke(cmd_lib_update, libraries=outdated_items)
click.echo()

-telemetry.on_event(category="Auto", action="Update", label=what.title())
+telemetry.send_event(category="Auto", action="Update", label=what.title())

click.echo("*" * terminal_width)
click.echo("")
@@ -24,13 +24,14 @@ from platformio.proc import copy_pythonpath_to_osenv, get_pythonexe_path
from platformio.project.config import ProjectConfig

CORE_PACKAGES = {
-"contrib-piohome": "~3.0.0",
+"contrib-piohome": "~3.1.0",
"contrib-pysite": "~2.%d%d.0" % (sys.version_info[0], sys.version_info[1]),
-"tool-pioplus": "^2.5.8",
-"tool-unity": "~1.20403.0",
-"tool-scons": "~2.20501.7" if PY2 else "~3.30101.0",
+"tool-pioplus": "^2.6.1",
+"tool-unity": "~1.20500.0",
+"tool-scons": "~2.20501.7" if PY2 else "~3.30102.0",
"tool-cppcheck": "~1.189.0",
"tool-clangtidy": "^1.80000.0",
+"tool-pvs-studio": "~7.5.0",
}

PIOPLUS_AUTO_UPDATES_MAX = 100

@@ -23,7 +23,7 @@ import click
import semantic_version

from platformio import app, exception, util
-from platformio.compat import glob_escape, string_types
+from platformio.compat import glob_escape
from platformio.managers.package import BasePkgManager
from platformio.managers.platform import PlatformFactory, PlatformManager
from platformio.project.config import ProjectConfig
@@ -61,29 +61,6 @@ class LibraryManager(BasePkgManager):

return None

-@staticmethod
-def normalize_dependencies(dependencies):
-    if not dependencies:
-        return []
-    items = []
-    if isinstance(dependencies, dict):
-        if "name" in dependencies:
-            items.append(dependencies)
-        else:
-            for name, version in dependencies.items():
-                items.append({"name": name, "version": version})
-    elif isinstance(dependencies, list):
-        items = [d for d in dependencies if "name" in d]
-    for item in items:
-        for k in ("frameworks", "platforms"):
-            if k not in item or isinstance(k, list):
-                continue
-            if item[k] == "*":
-                del item[k]
-            elif isinstance(item[k], string_types):
-                item[k] = [i.strip() for i in item[k].split(",") if i.strip()]
-    return items
-
def max_satisfying_repo_version(self, versions, requirements=None):
def _cmp_dates(datestr1, datestr2):
date1 = util.parse_date(datestr1)

@@ -312,7 +289,7 @@ class LibraryManager(BasePkgManager):
click.secho("Installing dependencies", fg="yellow")

builtin_lib_storages = None
-for filters in self.normalize_dependencies(manifest["dependencies"]):
+for filters in manifest["dependencies"]:
assert "name" in filters

# avoid circle dependencies
@@ -17,21 +17,19 @@ import json
import os
import re
import shutil
-from os.path import abspath, basename, getsize, isdir, isfile, islink, join
+from os.path import basename, getsize, isdir, isfile, islink, join, realpath
from tempfile import mkdtemp

import click
import requests
import semantic_version

-from platformio import __version__, app, exception, fs, telemetry, util
+from platformio import __version__, app, exception, fs, util
from platformio.compat import hashlib_encode_data
from platformio.downloader import FileDownloader
from platformio.lockfile import LockFile
-from platformio.package.manifest.parser import (
-    ManifestParserError,
-    ManifestParserFactory,
-)
+from platformio.package.exception import ManifestException
+from platformio.package.manifest.parser import ManifestParserFactory
from platformio.unpacker import FileUnpacker
from platformio.vcsclient import VCSClientFactory

@@ -347,7 +345,7 @@ class PkgInstallerMixin(object):

try:
manifest = ManifestParserFactory.new_from_file(manifest_path).as_dict()
-except ManifestParserError:
+except ManifestException:
pass

if src_manifest:

@@ -364,7 +362,7 @@ class PkgInstallerMixin(object):
if "version" not in manifest:
manifest["version"] = "0.0.0"

-manifest["__pkg_dir"] = pkg_dir
+manifest["__pkg_dir"] = realpath(pkg_dir)
self.cache_set(cache_key, manifest)
return manifest

@@ -423,7 +421,7 @@ class PkgInstallerMixin(object):

def get_package_by_dir(self, pkg_dir):
for manifest in self.get_installed():
-if manifest["__pkg_dir"] == abspath(pkg_dir):
+if manifest["__pkg_dir"] == realpath(pkg_dir):
return manifest
return None

@@ -439,6 +437,7 @@ class PkgInstallerMixin(object):
pkg_dir = None
pkgdata = None
versions = None
+last_exc = None
for versions in PackageRepoIterator(name, self.repositories):
pkgdata = self.max_satisfying_repo_version(versions, requirements)
if not pkgdata:

@@ -449,12 +448,15 @@ class PkgInstallerMixin(object):
)
break
except Exception as e:  # pylint: disable=broad-except
+last_exc = e
click.secho("Warning! Package Mirror: %s" % e, fg="yellow")
click.secho("Looking for another mirror...", fg="yellow")

if versions is None:
util.internet_on(raise_exception=True)
-raise exception.UnknownPackage(name)
+raise exception.UnknownPackage(
+    name + (". Error -> %s" % last_exc if last_exc else "")
+)
if not pkgdata:
raise exception.UndefinedPackageVersion(
requirements or "latest", util.get_systype()

@@ -656,7 +658,7 @@ class BasePkgManager(PkgRepoMixin, PkgInstallerMixin):

def install(
self, name, requirements=None, silent=False, after_update=False, force=False
-):
+):  # pylint: disable=unused-argument
pkg_dir = None
# interprocess lock
with LockFile(self.package_dir):

@@ -705,13 +707,6 @@ class BasePkgManager(PkgRepoMixin, PkgInstallerMixin):
manifest = self.load_manifest(pkg_dir)
assert manifest

-if not after_update:
-    telemetry.on_event(
-        category=self.__class__.__name__,
-        action="Install",
-        label=manifest["name"],
-    )
-
click.secho(
"{name} @ {version} has been successfully installed!".format(
**manifest

@@ -721,7 +716,9 @@ class BasePkgManager(PkgRepoMixin, PkgInstallerMixin):

return pkg_dir

-def uninstall(self, package, requirements=None, after_update=False):
+def uninstall(
+    self, package, requirements=None, after_update=False
+):  # pylint: disable=unused-argument
# interprocess lock
with LockFile(self.package_dir):
self.cache_reset()

@@ -760,13 +757,6 @@ class BasePkgManager(PkgRepoMixin, PkgInstallerMixin):

click.echo("[%s]" % click.style("OK", fg="green"))

-if not after_update:
-    telemetry.on_event(
-        category=self.__class__.__name__,
-        action="Uninstall",
-        label=manifest["name"],
-    )
-
return True

def update(self, package, requirements=None, only_check=False):

@@ -815,9 +805,6 @@ class BasePkgManager(PkgRepoMixin, PkgInstallerMixin):
self.uninstall(pkg_dir, after_update=True)
self.install(name, latest, after_update=True)

-telemetry.on_event(
-    category=self.__class__.__name__, action="Update", label=manifest["name"]
-)
return True
@@ -17,13 +17,18 @@
import base64
import os
import re
+import subprocess
import sys
from os.path import basename, dirname, isdir, isfile, join

import click
import semantic_version

-from platformio import __version__, app, exception, fs, proc, util
+from platformio import __version__, app, exception, fs, proc, telemetry, util
+from platformio.commands.debug.exception import (
+    DebugInvalidOptionsError,
+    DebugSupportError,
+)
from platformio.compat import PY2, hashlib_encode_data, is_bytes, load_python_module
from platformio.managers.core import get_core_package_dir
from platformio.managers.package import BasePkgManager, PackageManager

@@ -69,6 +74,7 @@ class PlatformManager(BasePkgManager):
with_packages=None,
without_packages=None,
skip_default_package=False,
+with_all_packages=False,
after_update=False,
silent=False,
force=False,

@@ -79,9 +85,14 @@ class PlatformManager(BasePkgManager):
)
p = PlatformFactory.newPlatform(platform_dir)

+if with_all_packages:
+    with_packages = list(p.packages.keys())
+
# don't cleanup packages or install them after update
# we check packages for updates in def update()
if after_update:
+p.install_python_packages()
+p.on_installed()
return True

p.install_packages(

@@ -91,6 +102,8 @@ class PlatformManager(BasePkgManager):
silent=silent,
force=force,
)
+p.install_python_packages()
+p.on_installed()
return self.cleanup_packages(list(p.packages))

def uninstall(self, package, requirements=None, after_update=False):

@@ -105,6 +118,8 @@ class PlatformManager(BasePkgManager):

p = PlatformFactory.newPlatform(pkg_dir)
BasePkgManager.uninstall(self, pkg_dir, requirements)
+p.uninstall_python_packages()
+p.on_uninstalled()

# don't cleanup packages or install them after update
# we check packages for updates in def update()

@@ -590,6 +605,10 @@ class PlatformBase(PlatformPackagesMixin, PlatformRunMixin):
packages[name].update({"version": version.strip(), "optional": False})
return packages

+@property
+def python_packages(self):
+    return self._manifest.get("pythonPackages")
+
def get_dir(self):
return dirname(self.manifest_path)

@@ -695,6 +714,45 @@ class PlatformBase(PlatformPackagesMixin, PlatformRunMixin):

return [dict(name=name, path=path) for path, name in storages.items()]

+def on_installed(self):
+    pass
+
+def on_uninstalled(self):
+    pass
+
+def install_python_packages(self):
+    if not self.python_packages:
+        return None
+    click.echo(
+        "Installing Python packages: %s"
+        % ", ".join(list(self.python_packages.keys())),
+    )
+    args = [proc.get_pythonexe_path(), "-m", "pip", "install", "--upgrade"]
+    for name, requirements in self.python_packages.items():
+        if any(c in requirements for c in ("<", ">", "=")):
+            args.append("%s%s" % (name, requirements))
+        else:
+            args.append("%s==%s" % (name, requirements))
+    try:
+        return subprocess.call(args) == 0
+    except Exception as e:  # pylint: disable=broad-except
+        click.secho(
+            "Could not install Python packages -> %s" % e, fg="red", err=True
+        )
+
+def uninstall_python_packages(self):
+    if not self.python_packages:
+        return
+    click.echo("Uninstalling Python packages")
+    args = [proc.get_pythonexe_path(), "-m", "pip", "uninstall", "--yes"]
+    args.extend(list(self.python_packages.keys()))
+    try:
+        subprocess.call(args)
+    except Exception as e:  # pylint: disable=broad-except
+        click.secho(
+            "Could not uninstall Python packages -> %s" % e, fg="red", err=True
+        )
+

class PlatformBoardConfig(object):
def __init__(self, manifest_path):

@@ -799,11 +857,12 @@ class PlatformBoardConfig(object):
if tool_name == "custom":
return tool_name
if not debug_tools:
-raise exception.DebugSupportError(self._manifest["name"])
+telemetry.send_event("Debug", "Request", self.id)
+raise DebugSupportError(self._manifest["name"])
if tool_name:
if tool_name in debug_tools:
return tool_name
-raise exception.DebugInvalidOptions(
+raise DebugInvalidOptionsError(
"Unknown debug tool `%s`. Please use one of `%s` or `custom`"
% (tool_name, ", ".join(sorted(list(debug_tools))))
)
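The requirement-spec handling in `install_python_packages` above can be isolated for illustration (`pip_requirement_args` is my name, not the diff's): a declared requirement that already carries a comparison operator is passed through verbatim, otherwise it is pinned to an exact version.

```python
def pip_requirement_args(python_packages):
    # Build `pip install` requirement strings from a manifest's
    # {name: requirement} mapping; sorted for deterministic output.
    args = []
    for name, requirements in sorted(python_packages.items()):
        if any(c in requirements for c in ("<", ">", "=")):
            args.append("%s%s" % (name, requirements))
        else:
            args.append("%s==%s" % (name, requirements))
    return args
```

The resulting list is appended to `[python, "-m", "pip", "install", "--upgrade"]` before the subprocess call.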
@@ -15,7 +15,15 @@
from platformio.exception import PlatformioException


-class ManifestException(PlatformioException):
+class PackageException(PlatformioException):
+    pass
+
+
+class ManifestException(PackageException):
pass


class UnknownManifestError(ManifestException):
pass

@@ -24,13 +32,14 @@ class ManifestParserError(ManifestException):


class ManifestValidationError(ManifestException):
-def __init__(self, error, data):
+def __init__(self, messages, data, valid_data):
super(ManifestValidationError, self).__init__()
-self.error = error
+self.messages = messages
self.data = data
+self.valid_data = valid_data

def __str__(self):
return (
"Invalid manifest fields: %s. \nPlease check specification -> "
-"http://docs.platformio.org/page/librarymanager/config.html" % self.error
+"http://docs.platformio.org/page/librarymanager/config.html" % self.messages
)
@ -12,15 +12,17 @@
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
|
||||
import inspect
|
||||
import json
|
||||
import os
|
||||
import re
|
||||
|
||||
import requests
|
||||
|
||||
from platformio import util
|
||||
from platformio.compat import get_class_attributes, string_types
|
||||
from platformio.fs import get_file_contents
|
||||
from platformio.package.exception import ManifestParserError
|
||||
from platformio.package.exception import ManifestParserError, UnknownManifestError
|
||||
from platformio.project.helpers import is_platformio_project
|
||||
|
||||
try:
|
||||
@@ -36,36 +38,36 @@ class ManifestFileType(object):
    MODULE_JSON = "module.json"
    PACKAGE_JSON = "package.json"

    @classmethod
    def items(cls):
        return get_class_attributes(ManifestFileType)

    @classmethod
    def from_uri(cls, uri):
        if uri.endswith(".properties"):
            return ManifestFileType.LIBRARY_PROPERTIES
        if uri.endswith("platform.json"):
            return ManifestFileType.PLATFORM_JSON
        if uri.endswith("module.json"):
            return ManifestFileType.MODULE_JSON
        if uri.endswith("package.json"):
            return ManifestFileType.PACKAGE_JSON
        if uri.endswith("library.json"):
            return ManifestFileType.LIBRARY_JSON
        for t in sorted(cls.items().values()):
            if uri.endswith(t):
                return t
        return None

    @classmethod
    def from_dir(cls, path):
        for t in sorted(cls.items().values()):
            if os.path.isfile(os.path.join(path, t)):
                return t
        return None

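The suffix matching in `from_uri` above can be sketched as a standalone helper. This is a simplified re-implementation for illustration only; `manifest_type_from_uri` and `MANIFEST_TYPES` are not PlatformIO names:

```python
# Known manifest file names, mirroring the ManifestFileType constants above.
MANIFEST_TYPES = (
    "library.json",
    "library.properties",
    "module.json",
    "package.json",
    "platform.json",
)


def manifest_type_from_uri(uri):
    """Return the known manifest file name that the URI ends with, or None."""
    # any *.properties file is treated as an Arduino library.properties
    if uri.endswith(".properties"):
        return "library.properties"
    for t in sorted(MANIFEST_TYPES):
        if uri.endswith(t):
            return t
    return None
```

The same loop over `os.path.isfile` checks gives `from_dir`, which is why both methods iterate the class attributes in sorted order.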
class ManifestParserFactory(object):
    @staticmethod
    def type_to_clsname(t):
        t = t.replace(".", " ")
        t = t.title()
        return "%sManifestParser" % t.replace(" ", "")

    @staticmethod
    def new_from_file(path, remote_url=False):
        if not path or not os.path.isfile(path):
            raise ManifestParserError("Manifest file does not exist %s" % path)
        for t in get_class_attributes(ManifestFileType).values():
            if path.endswith(t):
                return ManifestParserFactory.new(get_file_contents(path), t, remote_url)
        raise ManifestParserError("Unknown manifest file type %s" % path)
            raise UnknownManifestError("Manifest file does not exist %s" % path)
        type_from_uri = ManifestFileType.from_uri(path)
        if not type_from_uri:
            raise UnknownManifestError("Unknown manifest file type %s" % path)
        return ManifestParserFactory.new(
            get_file_contents(path, encoding="utf8"), type_from_uri, remote_url
        )

    @staticmethod
    def new_from_dir(path, remote_url=None):
@@ -74,29 +76,23 @@ class ManifestParserFactory(object):
        type_from_uri = ManifestFileType.from_uri(remote_url) if remote_url else None
        if type_from_uri and os.path.isfile(os.path.join(path, type_from_uri)):
            return ManifestParserFactory.new(
                get_file_contents(os.path.join(path, type_from_uri)),
                get_file_contents(os.path.join(path, type_from_uri), encoding="utf8"),
                type_from_uri,
                remote_url=remote_url,
                package_dir=path,
            )

        file_order = [
            ManifestFileType.PLATFORM_JSON,
            ManifestFileType.LIBRARY_JSON,
            ManifestFileType.LIBRARY_PROPERTIES,
            ManifestFileType.MODULE_JSON,
            ManifestFileType.PACKAGE_JSON,
        ]
        for t in file_order:
            if not os.path.isfile(os.path.join(path, t)):
                continue
            return ManifestParserFactory.new(
                get_file_contents(os.path.join(path, t)),
                t,
                remote_url=remote_url,
                package_dir=path,
        type_from_dir = ManifestFileType.from_dir(path)
        if not type_from_dir:
            raise UnknownManifestError(
                "Unknown manifest file type in %s directory" % path
            )
        raise ManifestParserError("Unknown manifest file type in %s directory" % path)
        return ManifestParserFactory.new(
            get_file_contents(os.path.join(path, type_from_dir), encoding="utf8"),
            type_from_dir,
            remote_url=remote_url,
            package_dir=path,
        )

    @staticmethod
    def new_from_url(remote_url):
@@ -109,12 +105,18 @@ class ManifestParserFactory(object):
        )

    @staticmethod
    def new(contents, type, remote_url=None, package_dir=None):
        # pylint: disable=redefined-builtin
        clsname = ManifestParserFactory.type_to_clsname(type)
        if clsname not in globals():
            raise ManifestParserError("Unknown manifest file type %s" % clsname)
        return globals()[clsname](contents, remote_url, package_dir)
    def new(  # pylint: disable=redefined-builtin
        contents, type, remote_url=None, package_dir=None
    ):
        for _, cls in globals().items():
            if (
                inspect.isclass(cls)
                and issubclass(cls, BaseManifestParser)
                and cls != BaseManifestParser
                and cls.manifest_type == type
            ):
                return cls(contents, remote_url, package_dir)
        raise UnknownManifestError("Unknown manifest file type %s" % type)

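The new `ManifestParserFactory.new` replaces string-built class names with a lookup by a `manifest_type` class attribute. A minimal sketch of that pattern, with hypothetical class names:

```python
import inspect


class BaseParser(object):
    manifest_type = None

    def __init__(self, contents):
        self.contents = contents


class JsonParser(BaseParser):
    manifest_type = "library.json"


class PropertiesParser(BaseParser):
    manifest_type = "library.properties"


def new_parser(contents, manifest_type):
    # scan module globals for the subclass that declares the requested type
    for cls in list(globals().values()):
        if (
            inspect.isclass(cls)
            and issubclass(cls, BaseParser)
            and cls is not BaseParser
            and cls.manifest_type == manifest_type
        ):
            return cls(contents)
    raise ValueError("Unknown manifest file type %s" % manifest_type)
```

Registering parsers by attribute rather than by naming convention means adding a new parser only requires defining the subclass.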
class BaseManifestParser(object):
@@ -125,6 +127,8 @@ class BaseManifestParser(object):
            self._data = self.parse(contents)
        except Exception as e:
            raise ManifestParserError("Could not parse manifest -> %s" % e)

        self._data = self.normalize_repository(self._data)
        self._data = self.parse_examples(self._data)

        # remove None fields
@@ -139,7 +143,7 @@ class BaseManifestParser(object):
        return self._data

    @staticmethod
    def cleanup_author(author):
    def normalize_author(author):
        assert isinstance(author, dict)
        if author.get("email"):
            author["email"] = re.sub(r"\s+[aA][tT]\s+", "@", author["email"])
@@ -160,6 +164,22 @@ class BaseManifestParser(object):
        email = raw[raw.index(ldel) + 1 : raw.index(rdel)]
        return (name.strip(), email.strip() if email else None)

    @staticmethod
    def normalize_repository(data):
        url = (data.get("repository") or {}).get("url")
        if not url or "://" not in url:
            return data
        url_attrs = urlparse(url)
        if url_attrs.netloc not in ("github.com", "bitbucket.org", "gitlab.com"):
            return data
        url = "https://%s%s" % (url_attrs.netloc, url_attrs.path)
        if url.endswith("/"):
            url = url[:-1]
        if not url.endswith(".git"):
            url += ".git"
        data["repository"]["url"] = url
        return data

    def parse_examples(self, data):
        examples = data.get("examples")
        if (
@@ -167,8 +187,8 @@ class BaseManifestParser(object):
            or not isinstance(examples, list)
            or not all(isinstance(v, dict) for v in examples)
        ):
            examples = None
        if not examples and self.package_dir:
            data["examples"] = None
        if not data["examples"] and self.package_dir:
            data["examples"] = self.parse_examples_from_dir(self.package_dir)
        if "examples" in data and not data["examples"]:
            del data["examples"]
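The new `normalize_repository` helper above can be exercised standalone. This is a sketch with the same rules outside the parser class (the function name is illustrative, not a PlatformIO API): force HTTPS, drop a trailing slash, and append `.git` for the well-known hosting services.

```python
try:
    from urllib.parse import urlparse  # Python 3
except ImportError:
    from urlparse import urlparse  # Python 2


def normalize_repo_url(url):
    if not url or "://" not in url:
        return url
    attrs = urlparse(url)
    # only normalize URLs of the big three hosting services
    if attrs.netloc not in ("github.com", "bitbucket.org", "gitlab.com"):
        return url
    url = "https://%s%s" % (attrs.netloc, attrs.path)
    if url.endswith("/"):
        url = url[:-1]
    if not url.endswith(".git"):
        url += ".git"
    return url
```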
@@ -250,6 +270,8 @@ class BaseManifestParser(object):


class LibraryJsonManifestParser(BaseManifestParser):
    manifest_type = ManifestFileType.LIBRARY_JSON

    def parse(self, contents):
        data = json.loads(contents)
        data = self._process_renamed_fields(data)
@@ -265,6 +287,8 @@ class LibraryJsonManifestParser(BaseManifestParser):
            data["platforms"] = self._parse_platforms(data["platforms"]) or None
        if "export" in data:
            data["export"] = self._parse_export(data["export"])
        if "dependencies" in data:
            data["dependencies"] = self._parse_dependencies(data["dependencies"])

        return data

@@ -305,7 +329,7 @@ class LibraryJsonManifestParser(BaseManifestParser):
        # normalize Union[dict, list] fields
        if not isinstance(raw, list):
            raw = [raw]
        return [self.cleanup_author(author) for author in raw]
        return [self.normalize_author(author) for author in raw]

    @staticmethod
    def _parse_platforms(raw):
@@ -324,13 +348,37 @@ class LibraryJsonManifestParser(BaseManifestParser):
            return None
        result = {}
        for k in ("include", "exclude"):
            if k not in raw:
            if not raw.get(k):
                continue
            result[k] = raw[k] if isinstance(raw[k], list) else [raw[k]]
        return result

    @staticmethod
    def _parse_dependencies(raw):
        # compatibility with legacy dependency format
        if isinstance(raw, dict) and "name" in raw:
            raw = [raw]

        if isinstance(raw, dict):
            return [dict(name=name, version=version) for name, version in raw.items()]
        if isinstance(raw, list):
            for i, dependency in enumerate(raw):
                assert isinstance(dependency, dict)
                for k, v in dependency.items():
                    if k not in ("platforms", "frameworks", "authors"):
                        continue
                    if "*" in v:
                        del raw[i][k]
                    raw[i][k] = util.items_to_list(v)
            return raw
        raise ManifestParserError(
            "Invalid dependencies format, should be list or dictionary"
        )

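The dependency normalization above accepts several shapes of input. A simplified sketch of just that shape-handling (hypothetical helper name; the platform/framework cleanup is omitted): a `{"Name": "version"}` mapping, a single legacy dict with a `name` key, and a list of dicts all normalize to a list of dicts.

```python
def parse_dependencies(raw):
    # single legacy record: {"name": ..., "version": ...}
    if isinstance(raw, dict) and "name" in raw:
        raw = [raw]
    # mapping form: {"Name": "version spec", ...}
    if isinstance(raw, dict):
        return [dict(name=name, version=version) for name, version in raw.items()]
    if isinstance(raw, list):
        return raw
    raise ValueError("Invalid dependencies format, should be list or dictionary")
```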
class ModuleJsonManifestParser(BaseManifestParser):
    manifest_type = ManifestFileType.MODULE_JSON

    def parse(self, contents):
        data = json.loads(contents)
        data["frameworks"] = ["mbed"]
@@ -352,7 +400,7 @@ class ModuleJsonManifestParser(BaseManifestParser):
                name, email = self.parse_author_name_and_email(author)
                if not name:
                    continue
                result.append(self.cleanup_author(dict(name=name, email=email)))
                result.append(self.normalize_author(dict(name=name, email=email)))
        return result

    @staticmethod
@@ -363,10 +411,12 @@ class ModuleJsonManifestParser(BaseManifestParser):


class LibraryPropertiesManifestParser(BaseManifestParser):
    manifest_type = ManifestFileType.LIBRARY_PROPERTIES

    def parse(self, contents):
        data = self._parse_properties(contents)
        repository = self._parse_repository(data)
        homepage = data.get("url")
        homepage = data.get("url") or None
        if repository and repository["url"] == homepage:
            homepage = None
        data.update(
@@ -383,6 +433,8 @@ class LibraryPropertiesManifestParser(BaseManifestParser):
        if "author" in data:
            data["authors"] = self._parse_authors(data)
            del data["author"]
        if "depends" in data:
            data["dependencies"] = self._parse_dependencies(data["depends"])
        return data

    @staticmethod
@@ -451,7 +503,7 @@ class LibraryPropertiesManifestParser(BaseManifestParser):
            name, email = self.parse_author_name_and_email(author)
            if not name:
                continue
            authors.append(self.cleanup_author(dict(name=name, email=email)))
            authors.append(self.normalize_author(dict(name=name, email=email)))
        for author in properties.get("maintainer", "").split(","):
            name, email = self.parse_author_name_and_email(author)
            if not name:
@@ -462,31 +514,29 @@ class LibraryPropertiesManifestParser(BaseManifestParser):
                continue
            found = True
            item["maintainer"] = True
            if not item.get("email"):
            if not item.get("email") and email:
                item["email"] = email
        if not found:
            authors.append(
                self.cleanup_author(dict(name=name, email=email, maintainer=True))
                self.normalize_author(dict(name=name, email=email, maintainer=True))
            )
        return authors

    def _parse_repository(self, properties):
        if self.remote_url:
            repo_parse = urlparse(self.remote_url)
            repo_path_tokens = repo_parse.path[1:].split("/")[:-1]
            if "github" in repo_parse.netloc:
            url_attrs = urlparse(self.remote_url)
            repo_path_tokens = url_attrs.path[1:].split("/")[:-1]
            if "github" in url_attrs.netloc:
                return dict(
                    type="git",
                    url="%s://github.com/%s"
                    % (repo_parse.scheme, "/".join(repo_path_tokens[:2])),
                    url="https://github.com/" + "/".join(repo_path_tokens[:2]),
                )
            if "raw" in repo_path_tokens:
                return dict(
                    type="git",
                    url="%s://%s/%s"
                    url="https://%s/%s"
                    % (
                        repo_parse.scheme,
                        repo_parse.netloc,
                        url_attrs.netloc,
                        "/".join(repo_path_tokens[: repo_path_tokens.index("raw")]),
                    ),
                )
@@ -498,9 +548,9 @@ class LibraryPropertiesManifestParser(BaseManifestParser):
        result = {"exclude": ["extras", "docs", "tests", "test", "*.doxyfile", "*.pdf"]}
        include = None
        if self.remote_url:
            repo_parse = urlparse(self.remote_url)
            repo_path_tokens = repo_parse.path[1:].split("/")[:-1]
            if "github" in repo_parse.netloc:
            url_attrs = urlparse(self.remote_url)
            repo_path_tokens = url_attrs.path[1:].split("/")[:-1]
            if "github" in url_attrs.netloc:
                include = "/".join(repo_path_tokens[3:]) or None
            elif "raw" in repo_path_tokens:
                include = (
@@ -511,12 +561,36 @@ class LibraryPropertiesManifestParser(BaseManifestParser):
            result["include"] = [include]
        return result

    @staticmethod
    def _parse_dependencies(raw):
        result = []
        for item in raw.split(","):
            item = item.strip()
            if not item:
                continue
            if item.endswith(")") and "(" in item:
                name, version = item.split("(")
                result.append(
                    dict(
                        name=name.strip(),
                        version=version[:-1].strip(),
                        frameworks=["arduino"],
                    )
                )
            else:
                result.append(dict(name=item, frameworks=["arduino"]))
        return result

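The `depends` parsing above follows the Arduino `library.properties` convention of `Name (version constraint), OtherName`. A standalone sketch of that logic (hypothetical function name, `frameworks` tagging omitted):

```python
def parse_depends(raw):
    result = []
    for item in raw.split(","):
        item = item.strip()
        if not item:
            continue
        if item.endswith(")") and "(" in item:
            # "Servo (>=1.0)" -> name "Servo", version ">=1.0"
            name, version = item.split("(")
            result.append(dict(name=name.strip(), version=version[:-1].strip()))
        else:
            result.append(dict(name=item))
    return result
```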
class PlatformJsonManifestParser(BaseManifestParser):
    manifest_type = ManifestFileType.PLATFORM_JSON

    def parse(self, contents):
        data = json.loads(contents)
        if "frameworks" in data:
            data["frameworks"] = self._parse_frameworks(data["frameworks"])
        if "packages" in data:
            data["dependencies"] = self._parse_dependencies(data["packages"])
        return data

    @staticmethod
@@ -525,8 +599,16 @@ class PlatformJsonManifestParser(BaseManifestParser):
            return None
        return [name.lower() for name in raw.keys()]

    @staticmethod
    def _parse_dependencies(raw):
        return [
            dict(name=name, version=opts.get("version")) for name, opts in raw.items()
        ]


class PackageJsonManifestParser(BaseManifestParser):
    manifest_type = ManifestFileType.PACKAGE_JSON

    def parse(self, contents):
        data = json.loads(contents)
        data = self._parse_system(data)
@@ -12,6 +12,9 @@
# See the License for the specific language governing permissions and
# limitations under the License.

# pylint: disable=too-many-ancestors

import marshmallow
import requests
import semantic_version
from marshmallow import Schema, ValidationError, fields, validate, validates
@@ -19,23 +22,61 @@ from marshmallow import Schema, ValidationError, fields, validate, validates
from platformio.package.exception import ManifestValidationError
from platformio.util import memoized

MARSHMALLOW_2 = marshmallow.__version_info__ < (3,)


class StrictSchema(Schema):
    def handle_error(self, error, data):

if MARSHMALLOW_2:

    class CompatSchema(Schema):
        pass


else:

    class CompatSchema(Schema):
        class Meta(object):  # pylint: disable=no-init
            unknown = marshmallow.EXCLUDE  # pylint: disable=no-member

        def handle_error(self, error, data, **_):  # pylint: disable=arguments-differ
            raise ManifestValidationError(
                error.messages,
                data,
                error.valid_data if hasattr(error, "valid_data") else error.data,
            )


class BaseSchema(CompatSchema):
    def load_manifest(self, data):
        if MARSHMALLOW_2:
            data, errors = self.load(data)
            if errors:
                raise ManifestValidationError(errors, data, data)
            return data
        return self.load(data)


class StrictSchema(BaseSchema):
    def handle_error(self, error, data, **_):  # pylint: disable=arguments-differ
        # skip broken records
        if self.many:
            error.data = [
            error.valid_data = [
                item for idx, item in enumerate(data) if idx not in error.messages
            ]
        else:
            error.data = None
            error.valid_data = None
        if MARSHMALLOW_2:
            error.data = error.valid_data
        raise error


class StrictListField(fields.List):
    def _deserialize(self, value, attr, data):
    def _deserialize(  # pylint: disable=arguments-differ
        self, value, attr, data, **kwargs
    ):
        try:
            return super(StrictListField, self)._deserialize(value, attr, data)
            return super(StrictListField, self)._deserialize(
                value, attr, data, **kwargs
            )
        except ValidationError as exc:
            if exc.data:
                exc.data = [item for item in exc.data if item is not None]
@@ -61,7 +102,33 @@ class RepositorySchema(StrictSchema):
    branch = fields.Str(validate=validate.Length(min=1, max=50))


class ExportSchema(Schema):
class DependencySchema(StrictSchema):
    name = fields.Str(required=True, validate=validate.Length(min=1, max=100))
    version = fields.Str(validate=validate.Length(min=1, max=100))
    authors = StrictListField(fields.Str(validate=validate.Length(min=1, max=50)))
    platforms = StrictListField(
        fields.Str(
            validate=[
                validate.Length(min=1, max=50),
                validate.Regexp(
                    r"^([a-z\d\-_]+|\*)$", error="Only [a-z0-9-_*] chars are allowed"
                ),
            ]
        )
    )
    frameworks = StrictListField(
        fields.Str(
            validate=[
                validate.Length(min=1, max=50),
                validate.Regexp(
                    r"^([a-z\d\-_]+|\*)$", error="Only [a-z0-9-_*] chars are allowed"
                ),
            ]
        )
    )


class ExportSchema(BaseSchema):
    include = StrictListField(fields.Str)
    exclude = StrictListField(fields.Str)

@@ -80,7 +147,7 @@ class ExampleSchema(StrictSchema):
    files = StrictListField(fields.Str, required=True)


class ManifestSchema(Schema):
class ManifestSchema(BaseSchema):
    # Required fields
    name = fields.Str(required=True, validate=validate.Length(min=1, max=100))
    version = fields.Str(required=True, validate=validate.Length(min=1, max=50))
@@ -92,8 +159,12 @@ class ManifestSchema(Schema):
    homepage = fields.Url(validate=validate.Length(min=1, max=255))
    license = fields.Str(validate=validate.Length(min=1, max=255))
    repository = fields.Nested(RepositorySchema)
    dependencies = fields.Nested(DependencySchema, many=True)

    # library.json
    export = fields.Nested(ExportSchema)
    examples = fields.Nested(ExampleSchema, many=True)
    downloadUrl = fields.Url(validate=validate.Length(min=1, max=255))

    keywords = StrictListField(
        fields.Str(
@@ -105,7 +176,6 @@ class ManifestSchema(Schema):
            ]
        )
    )

    platforms = StrictListField(
        fields.Str(
            validate=[
@@ -142,10 +212,6 @@ class ManifestSchema(Schema):
        )
    )

    def handle_error(self, error, data):
        if self.strict:
            raise ManifestValidationError(error, data)

    @validates("version")
    def validate_version(self, value):  # pylint: disable=no-self-use
        try:
@@ -176,7 +242,7 @@ class ManifestSchema(Schema):
    def load_spdx_licenses():
        r = requests.get(
            "https://raw.githubusercontent.com/spdx/license-list-data"
            "/v3.6/json/licenses.json"
            "/v3.8/json/licenses.json"
        )
        r.raise_for_status()
        return r.json()

131 platformio/package/pack.py Normal file
@@ -0,0 +1,131 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import os
import re
import shutil
import tarfile
import tempfile

from platformio import fs
from platformio.package.exception import PackageException
from platformio.package.manifest.parser import ManifestFileType, ManifestParserFactory
from platformio.package.manifest.schema import ManifestSchema
from platformio.unpacker import FileUnpacker


class PackagePacker(object):
    EXCLUDE_DEFAULT = [
        "._*",
        ".DS_Store",
        ".git",
        ".hg",
        ".svn",
        ".pio",
    ]
    INCLUDE_DEFAULT = ManifestFileType.items().values()

    def __init__(self, package, manifest_uri=None):
        self.package = package
        self.manifest_uri = manifest_uri

    def pack(self, dst=None):
        tmp_dir = tempfile.mkdtemp()
        try:
            src = self.package

            # if zip/tar.gz -> unpack to tmp dir
            if not os.path.isdir(src):
                with FileUnpacker(src) as fu:
                    assert fu.unpack(tmp_dir, silent=True)
                src = tmp_dir

            src = self.find_source_root(src)

            manifest = self.load_manifest(src)
            filename = re.sub(
                r"[^\da-zA-Z\-\._]+",
                "",
                "{name}{system}-{version}.tar.gz".format(
                    name=manifest["name"],
                    system="-" + manifest["system"][0] if "system" in manifest else "",
                    version=manifest["version"],
                ),
            )

            if not dst:
                dst = os.path.join(os.getcwd(), filename)
            elif os.path.isdir(dst):
                dst = os.path.join(dst, filename)

            return self._create_tarball(
                src,
                dst,
                include=manifest.get("export", {}).get("include"),
                exclude=manifest.get("export", {}).get("exclude"),
            )
        finally:
            shutil.rmtree(tmp_dir)

    @staticmethod
    def load_manifest(src):
        mp = ManifestParserFactory.new_from_dir(src)
        return ManifestSchema().load_manifest(mp.as_dict())

    def find_source_root(self, src):
        if self.manifest_uri:
            mp = (
                ManifestParserFactory.new_from_file(self.manifest_uri[5:])
                if self.manifest_uri.startswith("file:")
                else ManifestParserFactory.new_from_url(self.manifest_uri)
            )
            manifest = ManifestSchema().load_manifest(mp.as_dict())
            include = manifest.get("export", {}).get("include", [])
            if len(include) == 1:
                if not os.path.isdir(os.path.join(src, include[0])):
                    raise PackageException(
                        "Non existing `include` directory `%s` in a package"
                        % include[0]
                    )
                return os.path.join(src, include[0])

        for root, _, __ in os.walk(src):
            if ManifestFileType.from_dir(root):
                return root

        return src

    def _create_tarball(self, src, dst, include=None, exclude=None):
        # remap root
        if (
            include
            and len(include) == 1
            and os.path.isdir(os.path.join(src, include[0]))
        ):
            src = os.path.join(src, include[0])
            include = None

        src_filters = self.compute_src_filters(include, exclude)
        with tarfile.open(dst, "w:gz") as tar:
            for f in fs.match_src_files(src, src_filters, followlinks=False):
                tar.add(os.path.join(src, f), f)
        return dst

    def compute_src_filters(self, include, exclude):
        result = ["+<%s>" % p for p in include or ["*", ".*"]]
        result += ["-<%s>" % p for p in exclude or []]
        result += ["-<%s>" % p for p in self.EXCLUDE_DEFAULT]
        # automatically include manifests
        result += ["+<%s>" % p for p in self.INCLUDE_DEFAULT]
        return result
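The filter computation in `compute_src_filters` can be sketched as a free function (an assumed simplification of the method above): include/exclude globs become `+<pattern>`/`-<pattern>` entries, the defaults are appended, and manifest files are always re-included so a package never ships without its manifest.

```python
# Defaults mirroring EXCLUDE_DEFAULT above; MANIFESTS stands in for
# ManifestFileType.items().values().
EXCLUDE_DEFAULT = ["._*", ".DS_Store", ".git", ".hg", ".svn", ".pio"]
MANIFESTS = ["library.json", "library.properties"]


def compute_src_filters(include=None, exclude=None):
    result = ["+<%s>" % p for p in include or ["*", ".*"]]
    result += ["-<%s>" % p for p in exclude or []]
    result += ["-<%s>" % p for p in EXCLUDE_DEFAULT]
    result += ["+<%s>" % p for p in MANIFESTS]  # manifests always ship
    return result
```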
@@ -20,8 +20,9 @@ from hashlib import sha1

import click

from platformio import exception, fs
from platformio.compat import PY2, WINDOWS, hashlib_encode_data
from platformio import fs
from platformio.compat import PY2, WINDOWS, hashlib_encode_data, string_types
from platformio.project import exception
from platformio.project.options import ProjectOptions

try:
@@ -29,7 +30,8 @@ try:
except ImportError:
    import configparser as ConfigParser

CONFIG_HEADER = """;PlatformIO Project Configuration File
CONFIG_HEADER = """
; PlatformIO Project Configuration File
;
; Build options: build flags, source filter
; Upload options: custom upload port, speed and extra flags
@@ -38,10 +40,12 @@ CONFIG_HEADER = """;PlatformIO Project Configuration File
;
; Please visit documentation for the other options and examples
; https://docs.platformio.org/page/projectconf.html

"""


MISSING = object()


class ProjectConfigBase(object):

    INLINE_COMMENT_RE = re.compile(r"\s+;.*$")
@@ -104,7 +108,7 @@ class ProjectConfigBase(object):
        try:
            self._parser.read(path)
        except ConfigParser.Error as e:
            raise exception.InvalidProjectConf(path, str(e))
            raise exception.InvalidProjectConfError(path, str(e))

        if not parse_extra:
            return
@@ -228,6 +232,8 @@ class ProjectConfigBase(object):
        return [(option, self.get(section, option)) for option in self.options(section)]

    def set(self, section, option, value):
        if value is None:
            value = ""
        if isinstance(value, (list, tuple)):
            value = "\n".join(value)
        elif isinstance(value, bool):
@@ -239,46 +245,25 @@ class ProjectConfigBase(object):
            value = "\n" + value
        self._parser.set(section, option, value)

    def getraw(self, section, option):
    def getraw(  # pylint: disable=too-many-branches
        self, section, option, default=MISSING
    ):
        if not self.expand_interpolations:
            return self._parser.get(section, option)

        value = None
        found = False
        value = MISSING
        for sec, opt in self.walk_options(section):
            if opt == option:
                value = self._parser.get(sec, option)
                found = True
                break

        if not found:
            value = self._parser.get(section, option)

        if "${" not in value or "}" not in value:
            return value
        return self.VARTPL_RE.sub(self._re_interpolation_handler, value)

    def _re_interpolation_handler(self, match):
        section, option = match.group(1), match.group(2)
        if section == "sysenv":
            return os.getenv(option)
        return self.getraw(section, option)

    def get(self, section, option, default=None):  # pylint: disable=too-many-branches
        value = None
        try:
            value = self.getraw(section, option)
        except (ConfigParser.NoSectionError, ConfigParser.NoOptionError):
            pass  # handle value from system environment
        except ConfigParser.Error as e:
            raise exception.InvalidProjectConf(self.path, str(e))

        option_meta = ProjectOptions.get("%s.%s" % (section.split(":", 1)[0], option))
        if not option_meta:
            return value or default

        if option_meta.multiple:
            value = self.parse_multi_values(value)
        if value == MISSING:
            value = (
                default if default != MISSING else self._parser.get(section, option)
            )
            return self._expand_interpolations(value)

        if option_meta.sysenvvar:
            envvar_value = os.getenv(option_meta.sysenvvar)
@@ -288,17 +273,45 @@ class ProjectConfigBase(object):
            if envvar_value:
                break
        if envvar_value and option_meta.multiple:
            value = value or []
            value.extend(self.parse_multi_values(envvar_value))
        elif envvar_value and not value:
            value += ("" if value == MISSING else "\n") + envvar_value
        elif envvar_value and value == MISSING:
            value = envvar_value

        # option is not specified by user
        if value is None or (
            option_meta.multiple and value == [] and option_meta.default
        ):
            return default if default is not None else option_meta.default
        if value == MISSING:
            value = option_meta.default or default
        if value == MISSING:
            return None

        return self._expand_interpolations(value)

    def _expand_interpolations(self, value):
        if (
            not value
            or not isinstance(value, string_types)
            or not all(["${" in value, "}" in value])
        ):
            return value
        return self.VARTPL_RE.sub(self._re_interpolation_handler, value)

    def _re_interpolation_handler(self, match):
        section, option = match.group(1), match.group(2)
        if section == "sysenv":
            return os.getenv(option)
        return self.getraw(section, option)

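Two ideas carry this hunk: a `MISSING` sentinel that distinguishes "option absent" from falsy values like `""` or `[]`, and regex expansion of `${section.option}` references. A stdlib-only sketch of both (hypothetical `get_option` over a plain dict, not the PlatformIO API):

```python
import os
import re

MISSING = object()
# matches ${section.option}; the section part may not contain "." or "}"
VARTPL_RE = re.compile(r"\$\{([^\.\}]+)\.([^\}]+)\}")


def get_option(options, section, option, default=MISSING):
    value = options.get((section, option), MISSING)
    if value is MISSING:
        value = default
    if value is MISSING:
        return None  # absent and no default: unambiguous, unlike returning ""
    if not isinstance(value, str) or "${" not in value:
        return value

    def handler(match):
        sec, opt = match.group(1), match.group(2)
        if sec == "sysenv":
            return os.getenv(opt, "")
        return get_option(options, sec, opt)

    return VARTPL_RE.sub(handler, value)
```

Recursive expansion through `handler` mirrors how `_re_interpolation_handler` calls back into `getraw`.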
    def get(self, section, option, default=MISSING):
        value = None
        try:
            value = self.getraw(section, option, default)
        except ConfigParser.Error as e:
            raise exception.InvalidProjectConfError(self.path, str(e))

        option_meta = ProjectOptions.get("%s.%s" % (section.split(":", 1)[0], option))
        if not option_meta:
            return value

        if option_meta.multiple:
            value = self.parse_multi_values(value or [])
        try:
            return self.cast_to(value, option_meta.type)
        except click.BadParameter as e:
@@ -325,14 +338,14 @@ class ProjectConfigBase(object):

    def validate(self, envs=None, silent=False):
        if not os.path.isfile(self.path):
            raise exception.NotPlatformIOProject(self.path)
            raise exception.NotPlatformIOProjectError(self.path)
        # check envs
        known = set(self.envs())
        if not known:
            raise exception.ProjectEnvsNotAvailable()
            raise exception.ProjectEnvsNotAvailableError()
        unknown = set(list(envs or []) + self.default_envs()) - known
        if unknown:
            raise exception.UnknownEnvNames(", ".join(unknown), ", ".join(known))
            raise exception.UnknownEnvNamesError(", ".join(unknown), ", ".join(known))
        if not silent:
            for warning in self.warnings:
                click.secho("Warning! %s" % warning, fg="yellow")
@@ -445,7 +458,12 @@ class ProjectConfig(ProjectConfigBase, ProjectConfigDirsMixin):
        path = path or self.path
        if path in self._instances:
            del self._instances[path]
        with open(path or self.path, "w") as fp:
            fp.write(CONFIG_HEADER)
        with open(path or self.path, "w+") as fp:
            fp.write(CONFIG_HEADER.strip() + "\n\n")
            self._parser.write(fp)
            fp.seek(0)
            contents = fp.read()
            fp.seek(0)
            fp.truncate()
            fp.write(contents.strip() + "\n")
        return True

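The new save logic above uses a `w+` handle to write the file, then re-reads and rewrites it with normalized whitespace (exactly one trailing newline). A standalone sketch of that idiom, with hypothetical `HEADER` and `save_config` names:

```python
HEADER = "; PlatformIO Project Configuration File\n"


def save_config(path, body):
    with open(path, "w+") as fp:
        fp.write(HEADER.strip() + "\n\n")
        fp.write(body)
        # re-read the whole file through the same handle
        fp.seek(0)
        contents = fp.read()
        # rewrite it stripped, with a single trailing newline
        fp.seek(0)
        fp.truncate()
        fp.write(contents.strip() + "\n")
    return True
```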
53
platformio/project/exception.py
Normal file
53
platformio/project/exception.py
Normal file
@ -0,0 +1,53 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from platformio.exception import PlatformioException, UserSideException


class ProjectError(PlatformioException):
    pass


class NotPlatformIOProjectError(ProjectError, UserSideException):

    MESSAGE = (
        "Not a PlatformIO project. `platformio.ini` file has not been "
        "found in current working directory ({0}). To initialize new project "
        "please use `platformio project init` command"
    )


class InvalidProjectConfError(ProjectError, UserSideException):

    MESSAGE = "Invalid '{0}' (project configuration file): '{1}'"


class UndefinedEnvPlatformError(ProjectError, UserSideException):

    MESSAGE = "Please specify platform for '{0}' environment"


class ProjectEnvsNotAvailableError(ProjectError, UserSideException):

    MESSAGE = "Please setup environments in `platformio.ini` file"


class UnknownEnvNamesError(ProjectError, UserSideException):

    MESSAGE = "Unknown environment names '{0}'. Valid names are '{1}'"


class ProjectOptionValueError(ProjectError, UserSideException):

    MESSAGE = "{0} for option `{1}` in section [{2}]"
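The new exception module above declares only a `MESSAGE` template per class. A minimal sketch of the formatting convention these classes rely on (the base class here is a stand-in, not PlatformIO's actual `PlatformioException` implementation):

```python
# Stand-in base class illustrating the MESSAGE-formatting pattern used by
# the project exception classes above.
class PlatformioException(Exception):
    MESSAGE = None

    def __str__(self):
        # Format MESSAGE with the positional args passed to the constructor
        if self.MESSAGE:
            return self.MESSAGE.format(*self.args)
        return super().__str__()


class UnknownEnvNamesError(PlatformioException):
    MESSAGE = "Unknown environment names '{0}'. Valid names are '{1}'"


print(str(UnknownEnvNamesError("prod", "dev, native")))
```

With this convention, raising an error needs only the positional values; the human-readable message is assembled in `__str__`.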
@@ -284,6 +284,13 @@ ProjectOptions = OrderedDict(
        description="Custom packages and specifications",
        multiple=True,
    ),
    # Board
    ConfigEnvOption(
        group="platform",
        name="board",
        description="A board ID",
        buildenvvar="BOARD",
    ),
    ConfigEnvOption(
        group="platform",
        name="framework",
@@ -291,36 +298,29 @@ ProjectOptions = OrderedDict(
        multiple=True,
        buildenvvar="PIOFRAMEWORK",
    ),
    # Board
    ConfigEnvOption(
        group="board",
        name="board",
        description="A board ID",
        buildenvvar="BOARD",
    ),
    ConfigEnvOption(
        group="board",
        group="platform",
        name="board_build.mcu",
        description="A custom board MCU",
        oldnames=["board_mcu"],
        buildenvvar="BOARD_MCU",
    ),
    ConfigEnvOption(
        group="board",
        group="platform",
        name="board_build.f_cpu",
        description="A custom MCU frequency",
        oldnames=["board_f_cpu"],
        buildenvvar="BOARD_F_CPU",
    ),
    ConfigEnvOption(
        group="board",
        group="platform",
        name="board_build.f_flash",
        description="A custom flash frequency",
        oldnames=["board_f_flash"],
        buildenvvar="BOARD_F_FLASH",
    ),
    ConfigEnvOption(
        group="board",
        group="platform",
        name="board_build.flash_mode",
        description="A custom flash mode",
        oldnames=["board_flash_mode"],
@@ -531,7 +531,7 @@ ProjectOptions = OrderedDict(
        group="check",
        name="check_tool",
        description="A list of check tools used for analysis",
        type=click.Choice(["cppcheck", "clangtidy"]),
        type=click.Choice(["cppcheck", "clangtidy", "pvs-studio"]),
        multiple=True,
        default=["cppcheck"],
    ),
@@ -582,11 +582,15 @@ ProjectOptions = OrderedDict(
        description="A connection speed (baud rate) to communicate with a target device",
        type=click.INT,
    ),
    ConfigEnvOption(group="test", name="test_transport", description="",),
    ConfigEnvOption(
        group="test",
        name="test_transport",
        description="A transport to communicate with a target device",
    ),
    ConfigEnvOption(
        group="test",
        name="test_build_project_src",
        description="",
        description="Build project source code in a pair with test code",
        type=click.BOOL,
        default=False,
    ),
@@ -596,6 +600,16 @@ ProjectOptions = OrderedDict(
        name="debug_tool",
        description="A name of debugging tool",
    ),
    ConfigEnvOption(
        group="debug",
        name="debug_build_flags",
        description=(
            "Custom debug flags/options for preprocessing, compilation, "
            "assembly, and linking processes"
        ),
        multiple=True,
        default=["-Og", "-g2", "-ggdb2"],
    ),
    ConfigEnvOption(
        group="debug",
        name="debug_init_break",

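The options above carry `oldnames` so legacy `platformio.ini` keys such as `board_mcu` still resolve to their current names. An illustrative resolver for that aliasing (a sketch, not PlatformIO's implementation; the registry contents are taken from the options declared above):

```python
# Minimal option registry keyed by canonical name, with legacy aliases
# recorded under "oldnames", mirroring the ConfigEnvOption declarations.
OPTIONS = {
    "board_build.mcu": {"oldnames": ["board_mcu"]},
    "board_build.f_cpu": {"oldnames": ["board_f_cpu"]},
}


def resolve_option_name(name):
    """Return the canonical option name for `name`, honoring oldnames."""
    if name in OPTIONS:
        return name
    for current, meta in OPTIONS.items():
        if name in meta.get("oldnames", []):
            return current
    return None


print(resolve_option_name("board_mcu"))  # board_build.mcu
```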
@@ -13,13 +13,12 @@
# limitations under the License.

import atexit
import os
import platform
import re
import sys
import threading
from collections import deque
from os import getenv, sep
from os.path import join
from time import sleep, time
from traceback import format_exc

@@ -79,6 +78,7 @@ class MeasurementProtocol(TelemetryBase):

        self._prefill_screen_name()
        self._prefill_appinfo()
        self._prefill_sysargs()
        self._prefill_custom_data()

    def __getitem__(self, name):
@@ -99,10 +99,19 @@ class MeasurementProtocol(TelemetryBase):
        dpdata.append("PlatformIO/%s" % __version__)
        if app.get_session_var("caller_id"):
            dpdata.append("Caller/%s" % app.get_session_var("caller_id"))
        if getenv("PLATFORMIO_IDE"):
            dpdata.append("IDE/%s" % getenv("PLATFORMIO_IDE"))
        if os.getenv("PLATFORMIO_IDE"):
            dpdata.append("IDE/%s" % os.getenv("PLATFORMIO_IDE"))
        self["an"] = " ".join(dpdata)

    def _prefill_sysargs(self):
        args = []
        for arg in sys.argv[1:]:
            arg = str(arg).lower()
            if "@" in arg or os.path.exists(arg):
                arg = "***"
            args.append(arg)
        self["cd3"] = " ".join(args)

    def _prefill_custom_data(self):
        def _filter_args(items):
            result = []
@@ -119,7 +128,6 @@ class MeasurementProtocol(TelemetryBase):
        caller_id = str(app.get_session_var("caller_id"))
        self["cd1"] = util.get_systype()
        self["cd2"] = "Python/%s %s" % (platform.python_version(), platform.platform())
        # self['cd3'] = " ".join(_filter_args(sys.argv[1:]))
        self["cd4"] = (
            1 if (not util.is_ci() and (caller_id or not is_container())) else 0
        )
@@ -143,14 +151,7 @@ class MeasurementProtocol(TelemetryBase):
            return

        cmd_path = args[:1]
        if args[0] in (
            "platform",
            "platforms",
            "serialports",
            "device",
            "settings",
            "account",
        ):
        if args[0] in ("account", "device", "platform", "project", "settings",):
            cmd_path = args[:2]
        if args[0] == "lib" and len(args) > 1:
            lib_subcmds = (
@@ -179,13 +180,10 @@ class MeasurementProtocol(TelemetryBase):
                cmd_path.append(sub_cmd)
        self["screen_name"] = " ".join([p.title() for p in cmd_path])

    @staticmethod
    def _ignore_hit():
    def _ignore_hit(self):
        if not app.get_setting("enable_telemetry"):
            return True
        if app.get_session_var("caller_id") and all(
            c in sys.argv for c in ("run", "idedata")
        ):
        if all(c in sys.argv for c in ("run", "idedata")) or self["ea"] == "Idedata":
            return True
        return False

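The new `_prefill_sysargs` method above masks any CLI argument that contains "@" or points at an existing path before it is recorded. The same rule as a standalone function (extracted for illustration; `mask_sysargs` is a name introduced here, not part of PlatformIO):

```python
import os


def mask_sysargs(argv):
    """Replace sensitive CLI arguments with "***" before reporting them."""
    args = []
    for arg in argv:
        arg = str(arg).lower()
        # Anything that looks like an email/spec ("@") or an existing
        # filesystem path is masked out.
        if "@" in arg or os.path.exists(arg):
            arg = "***"
        args.append(arg)
    return " ".join(args)


print(mask_sysargs(["check", "--fail-on-defect=high", "someone@example.com"]))
```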
@@ -296,29 +294,64 @@ def on_command():
    measure_ci()


def on_exception(e):
    skip_conditions = [
        isinstance(e, cls)
        for cls in (IOError, exception.ReturnErrorCode, exception.UserSideException,)
    ]
    try:
        skip_conditions.append("[API] Account: " in str(e))
    except UnicodeEncodeError as ue:
        e = ue
    if any(skip_conditions):
        return
    is_fatal = any(
        [
            not isinstance(e, exception.PlatformioException),
            "Error" in e.__class__.__name__,
        ]
    )
    description = "%s: %s" % (
        type(e).__name__,
        " ".join(reversed(format_exc().split("\n"))) if is_fatal else str(e),
    )
    send_exception(description, is_fatal)


def measure_ci():
    event = {"category": "CI", "action": "NoName", "label": None}
    known_cis = ("TRAVIS", "APPVEYOR", "GITLAB_CI", "CIRCLECI", "SHIPPABLE", "DRONE")
    for name in known_cis:
        if getenv(name, "false").lower() == "true":
        if os.getenv(name, "false").lower() == "true":
            event["action"] = name
            break
    on_event(**event)
    send_event(**event)


def on_run_environment(options, targets):
    non_sensative_values = ["board", "platform", "framework"]
    safe_options = []
    for key, value in sorted(options.items()):
        if key in non_sensative_values:
            safe_options.append("%s=%s" % (key, value))
        else:
            safe_options.append(key)
    targets = [t.title() for t in targets or ["run"]]
    on_event("Env", " ".join(targets), "&".join(safe_options))
def encode_run_environment(options):
    non_sensative_keys = [
        "platform",
        "framework",
        "board",
        "upload_protocol",
        "check_tool",
        "debug_tool",
    ]
    safe_options = [
        "%s=%s" % (k, v) for k, v in sorted(options.items()) if k in non_sensative_keys
    ]
    return "&".join(safe_options)


def on_event(category, action, label=None, value=None, screen_name=None):
def send_run_environment(options, targets):
    send_event(
        "Env",
        " ".join([t.title() for t in targets or ["run"]]),
        encode_run_environment(options),
    )


def send_event(category, action, label=None, value=None, screen_name=None):
    mp = MeasurementProtocol()
    mp["event_category"] = category[:150]
    mp["event_action"] = action[:500]
@@ -331,43 +364,21 @@ def on_event(category, action, label=None, value=None, screen_name=None):
    mp.send("event")


def on_exception(e):
    def _cleanup_description(text):
        text = text.replace("Traceback (most recent call last):", "")
        text = re.sub(
            r'File "([^"]+)"',
            lambda m: join(*m.group(1).split(sep)[-2:]),
            text,
            flags=re.M,
        )
        text = re.sub(r"\s+", " ", text, flags=re.M)
        return text.strip()

    skip_conditions = [
        isinstance(e, cls)
        for cls in (
            IOError,
            exception.ReturnErrorCode,
            exception.UserSideException,
            exception.PlatformIOProjectException,
        )
    ]
    try:
        skip_conditions.append("[API] Account: " in str(e))
    except UnicodeEncodeError as ue:
        e = ue
    if any(skip_conditions):
        return
    is_crash = any(
        [
            not isinstance(e, exception.PlatformioException),
            "Error" in e.__class__.__name__,
        ]
def send_exception(description, is_fatal=False):
    # cleanup sensitive information, such as paths
    description = description.replace("Traceback (most recent call last):", "")
    description = description.replace("\\", "/")
    description = re.sub(
        r'(^|\s+|")(?:[a-z]\:)?((/[^"/]+)+)(\s+|"|$)',
        lambda m: " %s " % os.path.join(*m.group(2).split("/")[-2:]),
        description,
        re.I | re.M,
    )
    description = re.sub(r"\s+", " ", description, flags=re.M)

    mp = MeasurementProtocol()
    description = _cleanup_description(format_exc() if is_crash else str(e))
    mp["exd"] = ("%s: %s" % (type(e).__name__, description))[:2048]
    mp["exf"] = 1 if is_crash else 0
    mp["exd"] = description[:8192].strip()
    mp["exf"] = 1 if is_fatal else 0
    mp.send("exception")

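The path-scrubbing step in the new `send_exception` above reduces absolute paths in an exception description to their last two components. A standalone demonstration of that rule (the regex is copied from the diff; this sketch joins with "/" and passes `flags=` as a keyword for a platform-neutral, self-contained example, whereas the original uses `os.path.join` and positional flags):

```python
import re


def scrub_paths(description):
    """Strip user directory layouts from an exception description."""
    description = description.replace("\\", "/")
    # Collapse any absolute (optionally drive-prefixed) path down to its
    # last two components, e.g. /home/user/project/main.cpp -> project/main.cpp
    description = re.sub(
        r'(^|\s+|")(?:[a-z]\:)?((/[^"/]+)+)(\s+|"|$)',
        lambda m: " %s " % "/".join(m.group(2).split("/")[-2:]),
        description,
        flags=re.I | re.M,
    )
    return re.sub(r"\s+", " ", description, flags=re.M).strip()


print(scrub_paths('IOError in "/home/user/project/main.cpp"'))
```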
@@ -73,6 +73,7 @@ class TARArchive(ArchiveBase):
        ).startswith(base)

    def extract_item(self, item, dest_dir):
        dest_dir = self.resolve_path(dest_dir)
        bad_conds = [
            self.is_bad_path(item.name, dest_dir),
            self.is_link(item) and self.is_bad_link(item, dest_dir),
@@ -137,10 +138,13 @@ class FileUnpacker(object):
        if self._unpacker:
            self._unpacker.close()

    def unpack(self, dest_dir=".", with_progress=True, check_unpacked=True):
    def unpack(
        self, dest_dir=".", with_progress=True, check_unpacked=True, silent=False
    ):
        assert self._unpacker
        if not with_progress:
            click.echo("Unpacking...")
        if not with_progress or silent:
            if not silent:
                click.echo("Unpacking...")
            for item in self._unpacker.get_items():
                self._unpacker.extract_item(item, dest_dir)
        else:

@@ -366,10 +366,11 @@ def get_api_result(url, params=None, data=None, auth=None, cache_valid=None):
    )


PING_INTERNET_IPS = [
    "192.30.253.113",  # github.com
    "31.28.1.238",  # dl.platformio.org
    "193.222.52.25",  # dl.platformio.org
PING_REMOTE_HOSTS = [
    "140.82.118.3",  # Github.com
    "35.231.145.151",  # Gitlab.com
    "github.com",
    "platformio.org",
]


@@ -377,12 +378,12 @@ PING_INTERNET_IPS = [
def _internet_on():
    timeout = 2
    socket.setdefaulttimeout(timeout)
    for ip in PING_INTERNET_IPS:
    for host in PING_REMOTE_HOSTS:
        try:
            if os.getenv("HTTP_PROXY", os.getenv("HTTPS_PROXY")):
                requests.get("http://%s" % ip, allow_redirects=False, timeout=timeout)
                requests.get("http://%s" % host, allow_redirects=False, timeout=timeout)
            else:
                socket.socket(socket.AF_INET, socket.SOCK_STREAM).connect((ip, 80))
                socket.socket(socket.AF_INET, socket.SOCK_STREAM).connect((host, 80))
            return True
        except:  # pylint: disable=bare-except
            pass
@@ -401,9 +402,9 @@ def pepver_to_semver(pepver):


def items_to_list(items):
    if not isinstance(items, list):
        items = [i.strip() for i in items.split(",")]
    return [i.lower() for i in items if i]
    if isinstance(items, list):
        return items
    return [i.strip() for i in items.split(",") if i.strip()]


def items_in_list(needle, haystack):

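The reworked `items_to_list` above no longer lowercases values: it passes lists through untouched and splits comma-separated strings, dropping empty entries. Copied verbatim for a quick check:

```python
def items_to_list(items):
    # Lists are returned as-is; strings are split on commas, with
    # whitespace stripped and empty items discarded.
    if isinstance(items, list):
        return items
    return [i.strip() for i in items.split(",") if i.strip()]


print(items_to_list("cppcheck, clangtidy,, pvs-studio "))
```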
2
setup.py
@@ -33,7 +33,7 @@ install_requires = [
    "semantic_version>=2.8.1,<3",
    "tabulate>=0.8.3,<1",
    "pyelftools>=0.25,<1",
    "marshmallow>=2.20.5,<3"
    "marshmallow>=2.20.5",
]

setup(

@@ -239,21 +239,30 @@ int main() {


def test_check_individual_flags_passed(clirunner, tmpdir):
    config = DEFAULT_CONFIG + "\ncheck_tool = cppcheck, clangtidy"
    config += "\ncheck_flags = cppcheck: --std=c++11 \n\tclangtidy: --fix-errors"
    config = DEFAULT_CONFIG + "\ncheck_tool = cppcheck, clangtidy, pvs-studio"
    config += """\ncheck_flags =
        cppcheck: --std=c++11
        clangtidy: --fix-errors
        pvs-studio: --analysis-mode=4
    """
    tmpdir.join("platformio.ini").write(config)
    tmpdir.mkdir("src").join("main.cpp").write(TEST_CODE)
    result = clirunner.invoke(cmd_check, ["--project-dir", str(tmpdir), "-v"])

    clang_flags_found = cppcheck_flags_found = False
    clang_flags_found = cppcheck_flags_found = pvs_flags_found = False
    for l in result.output.split("\n"):
        if "--fix" in l and "clang-tidy" in l and "--std=c++11" not in l:
            clang_flags_found = True
        elif "--std=c++11" in l and "cppcheck" in l and "--fix" not in l:
            cppcheck_flags_found = True
        elif (
            "--analysis-mode=4" in l and "pvs-studio" in l.lower() and "--fix" not in l
        ):
            pvs_flags_found = True

    assert clang_flags_found
    assert cppcheck_flags_found
    assert pvs_flags_found


def test_check_cppcheck_misra_addon(clirunner, check_dir):
@@ -344,3 +353,33 @@ int main() {

    assert high_result.exit_code == 0
    assert low_result.exit_code != 0


def test_check_pvs_studio_free_license(clirunner, tmpdir):
    config = """
[env:test]
platform = teensy
board = teensy35
framework = arduino
check_tool = pvs-studio
"""
    code = (
        """// This is an open source non-commercial project. Dear PVS-Studio, please check it.
// PVS-Studio Static Code Analyzer for C, C++, C#, and Java: http://www.viva64.com
"""
        + TEST_CODE
    )

    tmpdir.join("platformio.ini").write(config)
    tmpdir.mkdir("src").join("main.c").write(code)

    result = clirunner.invoke(
        cmd_check, ["--project-dir", str(tmpdir), "--fail-on-defect=high", "-v"]
    )

    errors, warnings, style = count_defects(result.output)

    assert result.exit_code != 0
    assert errors != 0
    assert warnings != 0
    assert style == 0

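The test above exercises the scoped `check_flags` syntax, where each line may be prefixed with `<tool>:` to target a single analyzer. An illustrative parser for that syntax (a sketch of the configuration shape only, not PlatformIO's implementation):

```python
def parse_check_flags(raw):
    """Group 'tool: flags' lines into a per-tool flag mapping."""
    per_tool = {}
    for line in raw.strip().splitlines():
        tool, _, flags = line.strip().partition(":")
        per_tool.setdefault(tool.strip(), []).extend(flags.split())
    return per_tool


flags = parse_check_flags("""
cppcheck: --std=c++11
clangtidy: --fix-errors
pvs-studio: --analysis-mode=4
""")
print(flags["pvs-studio"])  # ['--analysis-mode=4']
```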
@@ -16,10 +16,10 @@ import json
from os import getcwd, makedirs
from os.path import getsize, isdir, isfile, join

from platformio import exception
from platformio.commands.boards import cli as cmd_boards
from platformio.commands.init import cli as cmd_init
from platformio.commands.project import project_init as cmd_init
from platformio.project.config import ProjectConfig
from platformio.project.exception import ProjectEnvsNotAvailableError


def validate_pioproject(pioproject_dir):
@@ -59,7 +59,7 @@ def test_init_ide_without_board(clirunner, tmpdir):
    with tmpdir.as_cwd():
        result = clirunner.invoke(cmd_init, ["--ide", "atom"])
        assert result.exit_code != 0
        assert isinstance(result.exception, exception.ProjectEnvsNotAvailable)
        assert isinstance(result.exception, ProjectEnvsNotAvailableError)


def test_init_ide_atom(clirunner, validate_cliresult, tmpdir):

@@ -230,7 +230,9 @@ def test_global_lib_update_check(clirunner, validate_cliresult):
    )
    validate_cliresult(result)
    output = json.loads(result.output)
    assert set(["RFcontrol", "NeoPixelBus"]) == set([l["name"] for l in output])
    assert set(["RFcontrol", "ESPAsyncTCP", "NeoPixelBus"]) == set(
        [l["name"] for l in output]
    )


def test_global_lib_update(clirunner, validate_cliresult):
@@ -250,7 +252,7 @@ def test_global_lib_update(clirunner, validate_cliresult):
    result = clirunner.invoke(cmd_lib, ["-g", "update"])
    validate_cliresult(result)
    assert result.output.count("[Detached]") == 5
    assert result.output.count("[Up-to-date]") == 11
    assert result.output.count("[Up-to-date]") == 10
    assert "Uninstalling RFcontrol @ 77d4eb3f8a" in result.output

    # update unknown library

@@ -29,25 +29,45 @@ def test_library_json_parser():
        "name": "TestPackage",
        "keywords": "kw1, KW2, kw3",
        "platforms": ["atmelavr", "espressif"],
        "repository": {
            "type": "git",
            "url": "http://github.com/username/repo/"
        },
        "url": "http://old.url.format",
        "exclude": [".gitignore", "tests"],
        "include": "mylib",
        "build": {
            "flags": ["-DHELLO"]
        },
        "examples": ["examples/*/*.pde"],
        "dependencies": {
            "deps1": "1.2.0",
            "deps2": "https://github.com/username/package.git",
            "@owner/deps3": "^2.1.3"
        },
        "customField": "Custom Value"
    }
    """
    mp = parser.LibraryJsonManifestParser(contents)
    raw_data = parser.LibraryJsonManifestParser(contents).as_dict()
    raw_data["dependencies"] = sorted(raw_data["dependencies"], key=lambda a: a["name"])
    assert not jsondiff.diff(
        mp.as_dict(),
        raw_data,
        {
            "name": "TestPackage",
            "platforms": ["atmelavr", "espressif8266"],
            "repository": {
                "type": "git",
                "url": "https://github.com/username/repo.git",
            },
            "export": {"exclude": [".gitignore", "tests"], "include": ["mylib"]},
            "keywords": ["kw1", "kw2", "kw3"],
            "homepage": "http://old.url.format",
            "build": {"flags": ["-DHELLO"]},
            "dependencies": [
                {"name": "@owner/deps3", "version": "^2.1.3"},
                {"name": "deps1", "version": "1.2.0"},
                {"name": "deps2", "version": "https://github.com/username/package.git"},
            ],
            "customField": "Custom Value",
        },
    )
@@ -59,20 +79,43 @@ def test_library_json_parser():
        "platforms": "atmelavr",
        "export": {
            "exclude": "audio_samples"
        }
    },
    "dependencies": [
        {"name": "deps1", "version": "1.0.0"},
        {"name": "@owner/deps2", "version": "1.0.0", "frameworks": "arduino, espidf"},
        {"name": "deps3", "version": "1.0.0", "platforms": ["ststm32", "sifive"]}
    ]
    }
    """
    mp = parser.LibraryJsonManifestParser(contents)
    raw_data = parser.LibraryJsonManifestParser(contents).as_dict()
    raw_data["dependencies"] = sorted(raw_data["dependencies"], key=lambda a: a["name"])
    assert not jsondiff.diff(
        mp.as_dict(),
        raw_data,
        {
            "keywords": ["sound", "audio", "music", "sd", "card", "playback"],
            "frameworks": ["arduino"],
            "export": {"exclude": ["audio_samples"]},
            "platforms": ["atmelavr"],
            "dependencies": [
                {
                    "name": "@owner/deps2",
                    "version": "1.0.0",
                    "frameworks": ["arduino", "espidf"],
                },
                {"name": "deps1", "version": "1.0.0"},
                {
                    "name": "deps3",
                    "version": "1.0.0",
                    "platforms": ["ststm32", "sifive"],
                },
            ],
        },
    )

    # broken dependencies
    with pytest.raises(parser.ManifestParserError):
        parser.LibraryJsonManifestParser({"dependencies": ["deps1", "deps2"]})

def test_module_json_parser():
    contents = """
@@ -128,10 +171,12 @@ version=1.2.3
author=SomeAuthor <info AT author.com>
sentence=This is Arduino library
customField=Custom Value
depends=First Library (=2.0.0), Second Library (>=1.2.0), Third
"""
    mp = parser.LibraryPropertiesManifestParser(contents)
    raw_data = parser.LibraryPropertiesManifestParser(contents).as_dict()
    raw_data["dependencies"] = sorted(raw_data["dependencies"], key=lambda a: a["name"])
    assert not jsondiff.diff(
        mp.as_dict(),
        raw_data,
        {
            "name": "TestPackage",
            "version": "1.2.3",
@@ -145,6 +190,20 @@ customField=Custom Value
            "authors": [{"email": "info@author.com", "name": "SomeAuthor"}],
            "keywords": ["uncategorized"],
            "customField": "Custom Value",
            "depends": "First Library (=2.0.0), Second Library (>=1.2.0), Third",
            "dependencies": [
                {
                    "name": "First Library",
                    "version": "=2.0.0",
                    "frameworks": ["arduino"],
                },
                {
                    "name": "Second Library",
                    "version": ">=1.2.0",
                    "frameworks": ["arduino"],
                },
                {"name": "Third", "frameworks": ["arduino"]},
            ],
        },
    )

@@ -153,6 +212,7 @@ customField=Custom Value
        "architectures=*\n" + contents
    ).as_dict()
    assert data["platforms"] == ["*"]

    # Platforms specific
    data = parser.LibraryPropertiesManifestParser(
        "architectures=avr, esp32\n" + contents
@@ -172,11 +232,11 @@ customField=Custom Value
        "include": ["libraries/TestPackage"],
    }
    assert data["repository"] == {
        "url": "https://github.com/username/reponame",
        "url": "https://github.com/username/reponame.git",
        "type": "git",
    }

    # Hope page
    # Home page
    data = parser.LibraryPropertiesManifestParser(
        "url=https://github.com/username/reponame.git\n" + contents
    ).as_dict()
@@ -185,6 +245,17 @@ customField=Custom Value
        "url": "https://github.com/username/reponame.git",
    }

    # Author + Maintainer
    data = parser.LibraryPropertiesManifestParser(
        """
author=Rocket Scream Electronics
maintainer=Rocket Scream Electronics
"""
    ).as_dict()
    assert data["authors"] == [
        {"name": "Rocket Scream Electronics", "maintainer": True}
    ]

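The expected data above shows how an Arduino `library.properties` `depends` line is turned into structured dependency dicts. An illustrative converter that mirrors the shape of that output (a sketch introduced here, not PlatformIO's parser; the `frameworks` defaulting is omitted):

```python
import re


def parse_depends(value):
    """Split 'Name (constraint), Name, ...' into dependency dicts."""
    deps = []
    for item in value.split(","):
        m = re.match(r"\s*([^()]+?)\s*(?:\(([^)]+)\))?\s*$", item)
        name, version = m.group(1), m.group(2)
        dep = {"name": name}
        if version:
            dep["version"] = version
        deps.append(dep)
    return deps


print(parse_depends("First Library (=2.0.0), Second Library (>=1.2.0), Third"))
```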
def test_library_json_schema():
    contents = """
@@ -202,6 +273,7 @@ def test_library_json_schema():
        "name": "Benoit Blanchon",
        "url": "https://blog.benoitblanchon.fr"
    },
    "downloadUrl": "https://example.com/package.tar.gz",
    "exclude": [
        "fuzzing",
        "scripts",
@@ -222,15 +294,20 @@ def test_library_json_schema():
            "base": "examples/JsonHttpClient",
            "files": ["JsonHttpClient.ino"]
        }
    ],
    "dependencies": [
        {"name": "deps1", "version": "1.0.0"},
        {"name": "@owner/deps2", "version": "1.0.0", "frameworks": "arduino"},
        {"name": "deps3", "version": "1.0.0", "platforms": ["ststm32", "sifive"]}
    ]
    }
    """
    raw_data = parser.ManifestParserFactory.new(
        contents, parser.ManifestFileType.LIBRARY_JSON
    ).as_dict()
    raw_data["dependencies"] = sorted(raw_data["dependencies"], key=lambda a: a["name"])

    data, errors = ManifestSchema(strict=True).load(raw_data)
    assert not errors
    data = ManifestSchema().load_manifest(raw_data)

    assert data["repository"]["url"] == "https://github.com/bblanchon/ArduinoJson.git"
    assert data["examples"][1]["base"] == "examples/JsonHttpClient"
@@ -251,6 +328,7 @@ def test_library_json_schema():
            "authors": [
                {"name": "Benoit Blanchon", "url": "https://blog.benoitblanchon.fr"}
            ],
            "downloadUrl": "https://example.com/package.tar.gz",
            "export": {"exclude": ["fuzzing", "scripts", "test", "third-party"]},
            "frameworks": ["arduino"],
            "platforms": ["*"],
@@ -267,6 +345,45 @@ def test_library_json_schema():
                    "files": ["JsonHttpClient.ino"],
                },
            ],
            "dependencies": [
                {"name": "@owner/deps2", "version": "1.0.0", "frameworks": ["arduino"]},
                {"name": "deps1", "version": "1.0.0"},
                {
                    "name": "deps3",
                    "version": "1.0.0",
                    "platforms": ["ststm32", "sifive"],
                },
            ],
        },
    )

    # legacy dependencies format
    contents = """
    {
        "name": "DallasTemperature",
        "version": "3.8.0",
        "dependencies":
        {
            "name": "OneWire",
            "authors": "Paul Stoffregen",
            "frameworks": "arduino"
        }
    }
    """
    raw_data = parser.LibraryJsonManifestParser(contents).as_dict()
    data = ManifestSchema().load_manifest(raw_data)
    assert not jsondiff.diff(
        data,
        {
            "name": "DallasTemperature",
            "version": "3.8.0",
            "dependencies": [
                {
                    "name": "OneWire",
                    "authors": ["Paul Stoffregen"],
                    "frameworks": ["arduino"],
                }
            ],
        },
    )

@@ -282,13 +399,14 @@ paragraph=Supported display controller: SSD1306, SSD1309, SSD1322, SSD1325
category=Display
url=https://github.com/olikraus/u8glib
architectures=avr,sam
depends=First Library (=2.0.0), Second Library (>=1.2.0), Third
"""
    raw_data = parser.ManifestParserFactory.new(
        contents, parser.ManifestFileType.LIBRARY_PROPERTIES
    ).as_dict()
    raw_data["dependencies"] = sorted(raw_data["dependencies"], key=lambda a: a["name"])

    data, errors = ManifestSchema(strict=True).load(raw_data)
    assert not errors
    data = ManifestSchema().load_manifest(raw_data)

    assert not jsondiff.diff(
        data,
@@ -297,7 +415,10 @@ architectures=avr,sam
                "A library for monochrome TFTs and OLEDs. Supported display "
                "controller: SSD1306, SSD1309, SSD1322, SSD1325"
            ),
            "repository": {"url": "https://github.com/olikraus/u8glib", "type": "git"},
            "repository": {
                "url": "https://github.com/olikraus/u8glib.git",
                "type": "git",
            },
            "frameworks": ["arduino"],
            "platforms": ["atmelavr", "atmelsam"],
            "version": "1.19.1",
@@ -309,6 +430,19 @@ architectures=avr,sam
            ],
            "keywords": ["display"],
            "name": "U8glib",
            "dependencies": [
                {
                    "name": "First Library",
                    "version": "=2.0.0",
                    "frameworks": ["arduino"],
                },
                {
                    "name": "Second Library",
                    "version": ">=1.2.0",
                    "frameworks": ["arduino"],
                },
                {"name": "Third", "frameworks": ["arduino"]},
            ],
        },
    )

@@ -335,7 +469,12 @@ includes=MozziGuts.h
        ),
    ).as_dict()

    data, errors = ManifestSchema(strict=False).load(raw_data)
    try:
        ManifestSchema().load_manifest(raw_data)
    except ManifestValidationError as e:
        data = e.valid_data
        errors = e.messages

    assert errors["authors"]

    assert not jsondiff.diff(
@@ -348,7 +487,10 @@ includes=MozziGuts.h
                "sounds using familiar synthesis units like oscillators, delays, "
                "filters and envelopes."
            ),
            "repository": {"url": "https://github.com/sensorium/Mozzi", "type": "git"},
            "repository": {
                "url": "https://github.com/sensorium/Mozzi.git",
                "type": "git",
            },
            "platforms": ["*"],
            "frameworks": ["arduino"],
            "export": {

@@ -404,11 +546,6 @@ def test_platform_json_schema():
            "optional": true,
            "version": "~4.2.0"
        },
        "framework-simba": {
            "type": "framework",
            "optional": true,
            "version": ">=7.0.0"
        },
        "tool-avrdude": {
            "type": "uploader",
            "optional": true,
@@ -421,8 +558,9 @@ def test_platform_json_schema():
        contents, parser.ManifestFileType.PLATFORM_JSON
    ).as_dict()
    raw_data["frameworks"] = sorted(raw_data["frameworks"])
    data, errors = ManifestSchema(strict=False).load(raw_data)
    assert not errors
    raw_data["dependencies"] = sorted(raw_data["dependencies"], key=lambda a: a["name"])

    data = ManifestSchema().load_manifest(raw_data)

    assert not jsondiff.diff(
        data,
@@ -444,6 +582,11 @@ def test_platform_json_schema():
            },
            "frameworks": sorted(["arduino", "simba"]),
            "version": "1.15.0",
            "dependencies": [
                {"name": "framework-arduinoavr", "version": "~4.2.0"},
                {"name": "tool-avrdude", "version": "~1.60300.0"},
                {"name": "toolchain-atmelavr", "version": "~1.50400.0"},
            ],
        },
    )

@@ -461,8 +604,7 @@ def test_package_json_schema():
        contents, parser.ManifestFileType.PACKAGE_JSON
    ).as_dict()

    data, errors = ManifestSchema(strict=False).load(raw_data)
    assert not errors
    data = ManifestSchema().load_manifest(raw_data)

    assert not jsondiff.diff(
        data,
@@ -492,6 +634,7 @@ def test_package_json_schema():

def test_parser_from_dir(tmpdir_factory):
    pkg_dir = tmpdir_factory.mktemp("package")
    pkg_dir.join("package.json").write('{"name": "package.json"}')
    pkg_dir.join("library.json").write('{"name": "library.json"}')
    pkg_dir.join("library.properties").write("name=library.properties")

@@ -564,8 +707,7 @@ def test_examples_from_dir(tmpdir_factory):

    raw_data["examples"] = _sort_examples(raw_data["examples"])

    data, errors = ManifestSchema(strict=True).load(raw_data)
    assert not errors
    data = ManifestSchema().load_manifest(raw_data)

    assert not jsondiff.diff(
        data,
@@ -621,34 +763,32 @@ def test_examples_from_dir(tmpdir_factory):

def test_broken_schemas():
    # non-strict mode
    data, errors = ManifestSchema(strict=False).load(dict(name="MyPackage"))
    assert set(errors.keys()) == set(["version"])
    assert data.get("version") is None

    # invalid keywords
    data, errors = ManifestSchema(strict=False).load(dict(keywords=["kw1", "*^[]"]))
    assert errors
    assert data["keywords"] == ["kw1"]

    # strict mode

    # missing required field
    with pytest.raises(
        ManifestValidationError, match="Missing data for required field"
    ):
        ManifestSchema(strict=True).load(dict(name="MyPackage"))
        ManifestValidationError, match=("Invalid semantic versioning format")
    ) as exc_info:
        ManifestSchema().load_manifest(dict(name="MyPackage", version="broken_version"))
    assert exc_info.value.valid_data == {"name": "MyPackage"}

    # invalid StrictList
    with pytest.raises(
        ManifestValidationError, match=("Invalid manifest fields.+keywords")
    ) as exc_info:
        ManifestSchema().load_manifest(
            dict(name="MyPackage", version="1.0.0", keywords=["kw1", "*^[]"])
        )
    assert list(exc_info.value.messages.keys()) == ["keywords"]
    assert exc_info.value.valid_data["keywords"] == ["kw1"]

    # broken SemVer
    with pytest.raises(
        ManifestValidationError, match=("Invalid semantic versioning format")
    ):
        ManifestSchema(strict=True).load(
            dict(name="MyPackage", version="broken_version")
        )
        ManifestSchema().load_manifest(dict(name="MyPackage", version="broken_version"))

    # broken value for Nested
    with pytest.raises(ManifestValidationError, match=r"authors.*Invalid input type"):
        ManifestSchema(strict=True).load(
        ManifestSchema().load_manifest(
            dict(
                name="MyPackage",
                description="MyDescription",
149
tests/package/test_pack.py
Normal file
@@ -0,0 +1,149 @@
+# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import json
+import os
+import tarfile
+
+import pytest
+
+from platformio import fs
+from platformio.compat import WINDOWS
+from platformio.package.exception import UnknownManifestError
+from platformio.package.pack import PackagePacker
+
+
+def test_base(tmpdir_factory):
+    pkg_dir = tmpdir_factory.mktemp("package")
+    pkg_dir.join(".git").mkdir().join("file").write("")
+    pkg_dir.join(".gitignore").write("tests")
+    pkg_dir.join("._ignored").write("")
+    pkg_dir.join("main.cpp").write("#include <stdio.h>")
+    p = PackagePacker(str(pkg_dir))
+    # test missed manifest
+    with pytest.raises(UnknownManifestError):
+        p.pack()
+    # minimal package
+    pkg_dir.join("library.json").write('{"name": "foo", "version": "1.0.0"}')
+    pkg_dir.mkdir("include").join("main.h").write("#ifndef")
+    with fs.cd(str(pkg_dir)):
+        p.pack()
+    with tarfile.open(os.path.join(str(pkg_dir), "foo-1.0.0.tar.gz"), "r:gz") as tar:
+        assert set(tar.getnames()) == set(
+            [".gitignore", "include/main.h", "library.json", "main.cpp"]
+        )
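test_base above expects VCS internals (`.git/`) and `._*` junk to be dropped while regular sources survive. A minimal stdlib sketch of that filtering idea (the helper name `pack_dir` and the exclusion list are hypothetical, not PlatformIO's actual rules):

```python
import os
import tarfile
import tempfile

# Hypothetical exclusion list; PlatformIO's real defaults are broader.
EXCLUDED_DIRS = (".git", ".svn", ".hg")


def pack_dir(src_dir, tarball_path):
    """Pack src_dir into a .tar.gz, skipping VCS dirs and '._*' junk."""

    def keep(tarinfo):
        parts = tarinfo.name.split("/")
        if any(p in EXCLUDED_DIRS or p.startswith("._") for p in parts):
            return None  # returning None excludes the member (and its children)
        return tarinfo

    with tarfile.open(tarball_path, "w:gz") as tar:
        for item in sorted(os.listdir(src_dir)):
            tar.add(os.path.join(src_dir, item), arcname=item, filter=keep)


src = tempfile.mkdtemp()
os.mkdir(os.path.join(src, ".git"))
open(os.path.join(src, ".git", "file"), "w").close()
open(os.path.join(src, "._ignored"), "w").close()
with open(os.path.join(src, "main.cpp"), "w") as f:
    f.write("#include <stdio.h>")

tarball = os.path.join(tempfile.mkdtemp(), "pkg.tar.gz")
pack_dir(src, tarball)
with tarfile.open(tarball, "r:gz") as tar:
    names = sorted(tar.getnames())  # only main.cpp survives
```

Returning `None` from the `filter` callable drops the member before it is written, and for a directory it also prevents recursion into its children.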
+
+
+def test_filters(tmpdir_factory):
+    pkg_dir = tmpdir_factory.mktemp("package")
+    src_dir = pkg_dir.mkdir("src")
+    src_dir.join("main.cpp").write("#include <stdio.h>")
+    src_dir.mkdir("util").join("helpers.cpp").write("void")
+    pkg_dir.mkdir("include").join("main.h").write("#ifndef")
+    test_dir = pkg_dir.mkdir("tests")
+    test_dir.join("test_1.h").write("")
+    test_dir.join("test_2.h").write("")
+
+    # test include with remap of root
+    pkg_dir.join("library.json").write(
+        json.dumps(dict(name="bar", version="1.2.3", export={"include": "src"}))
+    )
+    p = PackagePacker(str(pkg_dir))
+    with tarfile.open(p.pack(str(pkg_dir)), "r:gz") as tar:
+        assert set(tar.getnames()) == set(["util/helpers.cpp", "main.cpp"])
+
+    # test include "src" and "include"
+    pkg_dir.join("library.json").write(
+        json.dumps(
+            dict(name="bar", version="1.2.3", export={"include": ["src", "include"]})
+        )
+    )
+    p = PackagePacker(str(pkg_dir))
+    with tarfile.open(p.pack(str(pkg_dir)), "r:gz") as tar:
+        assert set(tar.getnames()) == set(
+            ["include/main.h", "library.json", "src/main.cpp", "src/util/helpers.cpp"]
+        )
+
+    # test include & exclude
+    pkg_dir.join("library.json").write(
+        json.dumps(
+            dict(
+                name="bar",
+                version="1.2.3",
+                export={"include": ["src", "include"], "exclude": ["*/*.h"]},
+            )
+        )
+    )
+    p = PackagePacker(str(pkg_dir))
+    with tarfile.open(p.pack(str(pkg_dir)), "r:gz") as tar:
+        assert set(tar.getnames()) == set(
+            ["library.json", "src/main.cpp", "src/util/helpers.cpp"]
+        )
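The `export.include`/`export.exclude` patterns exercised above act as glob filters over archive paths (with the manifest itself always re-added by the packer). A rough stdlib model of the selection, where `select` is a hypothetical helper, not PackagePacker's actual code:

```python
import fnmatch


def select(paths, include=None, exclude=None):
    """Keep paths under any include pattern, then drop exclude matches.

    A bare directory name like "src" is treated as a prefix; note that
    fnmatch's "*" also matches "/" so "*/*.h" catches nested headers too.
    """

    def match(path, pattern):
        return fnmatch.fnmatch(path, pattern) or path.startswith(pattern + "/")

    if include:
        paths = [p for p in paths if any(match(p, pat) for pat in include)]
    if exclude:
        paths = [p for p in paths if not any(match(p, pat) for pat in exclude)]
    return paths


files = [
    "library.json",
    "src/main.cpp",
    "src/util/helpers.cpp",
    "include/main.h",
    "tests/test_1.h",
]
kept = select(files, include=["src", "include"], exclude=["*/*.h"])
# kept == ["src/main.cpp", "src/util/helpers.cpp"]
```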
+
+
+def test_symlinks(tmpdir_factory):
+    # Windows does not support symbolic links
+    if WINDOWS:
+        return
+    pkg_dir = tmpdir_factory.mktemp("package")
+    src_dir = pkg_dir.mkdir("src")
+    src_dir.join("main.cpp").write("#include <stdio.h>")
+    pkg_dir.mkdir("include").join("main.h").write("#ifndef")
+    src_dir.join("main.h").mksymlinkto(os.path.join("..", "include", "main.h"))
+    pkg_dir.join("library.json").write('{"name": "bar", "version": "2.0.0"}')
+    tarball = pkg_dir.join("bar.tar.gz")
+    with tarfile.open(str(tarball), "w:gz") as tar:
+        for item in pkg_dir.listdir():
+            tar.add(str(item), str(item.relto(pkg_dir)))
+
+    p = PackagePacker(str(tarball))
+    assert p.pack(str(pkg_dir)).endswith("bar-2.0.0.tar.gz")
+    with tarfile.open(os.path.join(str(pkg_dir), "bar-2.0.0.tar.gz"), "r:gz") as tar:
+        assert set(tar.getnames()) == set(
+            ["include/main.h", "library.json", "src/main.cpp", "src/main.h"]
+        )
+        m = tar.getmember("src/main.h")
+        assert m.issym()
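test_symlinks relies on tarfile storing a relative link as a link (so a pack/unpack round trip keeps it intact). The round trip can be shown with the stdlib alone (POSIX only, mirroring the WINDOWS guard above):

```python
import os
import tarfile
import tempfile

root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "include"))
os.makedirs(os.path.join(root, "src"))
with open(os.path.join(root, "include", "main.h"), "w") as f:
    f.write("#ifndef")
# relative symlink, like src/main.h -> ../include/main.h in the test
os.symlink(
    os.path.join("..", "include", "main.h"), os.path.join(root, "src", "main.h")
)

tarball = os.path.join(root, "pkg.tar.gz")
with tarfile.open(tarball, "w:gz") as tar:
    tar.add(os.path.join(root, "src"), arcname="src")

with tarfile.open(tarball, "r:gz") as tar:
    member = tar.getmember("src/main.h")
    is_link = member.issym()  # stored as a link, not as a copied file
    target = member.linkname
```

Because the link target is relative, it stays valid wherever the archive is extracted, as long as `include/main.h` is extracted next to it.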
+
+
+def test_source_root(tmpdir_factory):
+    pkg_dir = tmpdir_factory.mktemp("package")
+    root_dir = pkg_dir.mkdir("root")
+    src_dir = root_dir.mkdir("src")
+    src_dir.join("main.cpp").write("#include <stdio.h>")
+    root_dir.join("library.json").write('{"name": "bar", "version": "2.0.0"}')
+    p = PackagePacker(str(pkg_dir))
+    with tarfile.open(p.pack(str(pkg_dir)), "r:gz") as tar:
+        assert set(tar.getnames()) == set(["library.json", "src/main.cpp"])
+
+
+def test_manifest_uri(tmpdir_factory):
+    pkg_dir = tmpdir_factory.mktemp("package")
+    root_dir = pkg_dir.mkdir("root")
+    src_dir = root_dir.mkdir("src")
+    src_dir.join("main.cpp").write("#include <stdio.h>")
+    root_dir.join("library.json").write('{"name": "foo", "version": "1.0.0"}')
+    bar_dir = root_dir.mkdir("library").mkdir("bar")
+    bar_dir.join("library.json").write('{"name": "bar", "version": "2.0.0"}')
+    bar_dir.mkdir("include").join("bar.h").write("")
+
+    manifest_path = pkg_dir.join("remote_library.json")
+    manifest_path.write(
+        '{"name": "bar", "version": "3.0.0", "export": {"include": "root/library/bar"}}'
+    )
+
+    p = PackagePacker(str(pkg_dir), manifest_uri="file:%s" % manifest_path)
+    p.pack(str(pkg_dir))
+    with tarfile.open(os.path.join(str(pkg_dir), "bar-2.0.0.tar.gz"), "r:gz") as tar:
+        assert set(tar.getnames()) == set(["library.json", "include/bar.h"])
@@ -112,3 +112,67 @@ int main() {
     assert "-DTMP_MACRO1" not in build_output
     assert "-Os" not in build_output
     assert str(tmpdir) not in build_output
+
+
+def test_debug_default_build_flags(clirunner, validate_cliresult, tmpdir):
+    tmpdir.join("platformio.ini").write(
+        """
+[env:native]
+platform = native
+build_type = debug
+"""
+    )
+
+    tmpdir.mkdir("src").join("main.c").write(
+        """
+int main() {
+}
+"""
+    )
+
+    result = clirunner.invoke(cmd_run, ["--project-dir", str(tmpdir), "--verbose"])
+    validate_cliresult(result)
+    build_output = result.output[result.output.find("Scanning dependencies...") :]
+    for line in build_output.split("\n"):
+        if line.startswith("gcc"):
+            assert all(line.count(flag) == 1 for flag in ("-Og", "-g2", "-ggdb2"))
+            assert all(
+                line.count("-%s%d" % (flag, level)) == 0
+                for flag in ("O", "g", "ggdb")
+                for level in (0, 1, 3)
+            )
+            assert "-Os" not in line
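The loop above reduces to one rule per compiler line: each expected debug flag appears exactly once and no competing `-O*`/`-g*` level sneaks in. The same check as a standalone predicate (`check_debug_flags` is hypothetical, written to mirror the test's logic):

```python
def check_debug_flags(cmd_line, expected=("-Og", "-g2", "-ggdb2")):
    """True when every expected flag occurs exactly once and no other
    optimization/debug level (or -Os) is present on the command line."""
    once = all(cmd_line.count(flag) == 1 for flag in expected)
    no_others = all(
        cmd_line.count("-%s%d" % (flag, level)) == 0
        for flag in ("O", "g", "ggdb")
        for level in (0, 1, 3)
    ) and "-Os" not in cmd_line
    return once and no_others


good = "gcc -o main.o -c -Og -g2 -ggdb2 main.c"
bad = "gcc -o main.o -c -Os -g2 -ggdb2 main.c"
```

Substring counting works here because none of the expected flags is a substring of another flag on the same line (e.g. `-g2` does not occur inside `-ggdb2`).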
+
+
+def test_debug_custom_build_flags(clirunner, validate_cliresult, tmpdir):
+    custom_debug_build_flags = ("-O3", "-g3", "-ggdb3")
+
+    tmpdir.join("platformio.ini").write(
+        """
+[env:native]
+platform = native
+build_type = debug
+debug_build_flags = %s
+"""
+        % " ".join(custom_debug_build_flags)
+    )
+
+    tmpdir.mkdir("src").join("main.c").write(
+        """
+int main() {
+}
+"""
+    )
+
+    result = clirunner.invoke(cmd_run, ["--project-dir", str(tmpdir), "--verbose"])
+    validate_cliresult(result)
+    build_output = result.output[result.output.find("Scanning dependencies...") :]
+    for line in build_output.split("\n"):
+        if line.startswith("gcc"):
+            assert all(line.count(f) == 1 for f in custom_debug_build_flags)
+            assert all(
+                line.count("-%s%d" % (flag, level)) == 0
+                for flag in ("O", "g", "ggdb")
+                for level in (0, 1, 2)
+            )
+            assert all("-O%s" % optimization not in line for optimization in ("g", "s"))
@@ -20,6 +20,7 @@ from os.path import basename, dirname, getsize, isdir, isfile, join, normpath
 
 import pytest
 
 from platformio import util
+from platformio.compat import PY2
 from platformio.managers.platform import PlatformFactory, PlatformManager
 from platformio.project.config import ProjectConfig
 
@@ -53,6 +54,8 @@ def pytest_generate_tests(metafunc):
     for root, _, files in walk(examples_dir):
         if "platformio.ini" not in files or ".skiptest" in files:
             continue
+        if "zephyr-" in root and PY2:
+            continue
         group = basename(root)
         if "-" in group:
             group = group.split("-", 1)[0]
@@ -25,8 +25,8 @@ def test_platformio_cli():
 
 
 def test_ping_internet_ips():
-    for ip in util.PING_INTERNET_IPS:
-        requests.get("http://%s" % ip, allow_redirects=False, timeout=2)
+    for host in util.PING_REMOTE_HOSTS:
+        requests.get("http://%s" % host, allow_redirects=False, timeout=2)
 
 
 def test_api_internet_offline(without_internet, isolated_pio_home):
@@ -16,8 +16,8 @@ import os
 
 import pytest
 
-from platformio.exception import UnknownEnvNames
 from platformio.project.config import ConfigParser, ProjectConfig
+from platformio.project.exception import InvalidProjectConfError, UnknownEnvNamesError
 
 BASE_CONFIG = """
 [platformio]
@@ -34,6 +34,7 @@ lib_deps =
     Lib1 ; inline comment in multi-line value
     Lib2
 lib_ignore = ${custom.lib_ignore}
+custom_builtin_option = ${env.build_type}
 
 [strict_ldf]
 lib_ldf_mode = chain+
@@ -54,7 +55,7 @@ lib_ignore = LibIgnoreCustom
 
 [env:base]
 build_flags = ${custom.debug_flags} ${custom.extra_flags}
-lib_compat_mode = ${strict_ldf.strict}
+lib_compat_mode = ${strict_ldf.lib_compat_mode}
 targets =
 
 [env:test_extends]
@@ -65,7 +66,11 @@ extends = strict_settings
 
 EXTRA_ENVS_CONFIG = """
 [env:extra_1]
-build_flags = ${custom.lib_flags} ${custom.debug_flags}
+build_flags =
+    -fdata-sections
+    -Wl,--gc-sections
+    ${custom.lib_flags}
+    ${custom.debug_flags}
 lib_install = 574
 
 [env:extra_2]
@@ -96,13 +101,10 @@ def config(tmpdir_factory):
 
 
 def test_empty_config():
     config = ProjectConfig("/non/existing/platformio.ini")
 
     # unknown section
-    with pytest.raises(ConfigParser.NoSectionError):
-        config.getraw("unknown_section", "unknown_option")
-
+    with pytest.raises(InvalidProjectConfError):
+        config.get("unknown_section", "unknown_option")
     assert config.sections() == []
     assert config.get("section", "option") is None
     assert config.get("section", "option", 13) == 13
 
 
@@ -111,7 +113,7 @@ def test_warnings(config):
     assert len(config.warnings) == 2
     assert "lib_install" in config.warnings[1]
 
-    with pytest.raises(UnknownEnvNames):
+    with pytest.raises(UnknownEnvNamesError):
         config.validate(["non-existing-env"])
 
 
@@ -155,6 +157,7 @@ def test_options(config):
         "custom_monitor_speed",
         "lib_deps",
         "lib_ignore",
+        "custom_builtin_option",
     ]
     assert config.options(env="test_extends") == [
         "extends",
@@ -165,6 +168,7 @@ def test_options(config):
         "custom_monitor_speed",
         "lib_deps",
         "lib_ignore",
+        "custom_builtin_option",
     ]
 
 
@@ -176,7 +180,7 @@ def test_has_option(config):
 
 
 def test_sysenv_options(config):
-    assert config.get("custom", "extra_flags") is None
+    assert config.getraw("custom", "extra_flags") == ""
     assert config.get("env:base", "build_flags") == ["-D DEBUG=1"]
     assert config.get("env:base", "upload_port") is None
     assert config.get("env:extra_2", "upload_port") == "/dev/extra_2/port"
@@ -201,6 +205,7 @@ def test_sysenv_options(config):
         "custom_monitor_speed",
         "lib_deps",
         "lib_ignore",
+        "custom_builtin_option",
         "upload_port",
     ]
 
@@ -223,10 +228,17 @@ def test_getraw_value(config):
     with pytest.raises(ConfigParser.NoOptionError):
         config.getraw("platformio", "monitor_speed")
 
     # default
     assert config.getraw("unknown", "option", "default") == "default"
+    assert config.getraw("env:base", "custom_builtin_option") == "release"
 
     # known
     assert config.getraw("env:base", "targets") == ""
     assert config.getraw("env:extra_1", "lib_deps") == "574"
-    assert config.getraw("env:extra_1", "build_flags") == "-lc -lm -D DEBUG=1"
+    assert (
+        config.getraw("env:extra_1", "build_flags")
+        == "\n-fdata-sections\n-Wl,--gc-sections\n-lc -lm\n-D DEBUG=1"
+    )
 
     # extended
     assert config.getraw("env:test_extends", "lib_ldf_mode") == "chain+"
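`getraw` above returns the interpolated multi-line value verbatim, while `config.get` (as `test_get_value` below asserts) yields one entry per non-empty line. That split can be sketched as follows (the `as_list` helper is illustrative, not ProjectConfig's implementation):

```python
raw = "\n-fdata-sections\n-Wl,--gc-sections\n-lc -lm\n-D DEBUG=1"


def as_list(raw_value):
    # one entry per non-empty line; spaces inside a line are preserved,
    # so "-lc -lm" stays a single entry
    return [line.strip() for line in raw_value.splitlines() if line.strip()]


flags = as_list(raw)
# flags == ["-fdata-sections", "-Wl,--gc-sections", "-lc -lm", "-D DEBUG=1"]
```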
@@ -236,7 +248,12 @@ def test_getraw_value(config):
 
 
 def test_get_value(config):
     assert config.get("custom", "debug_flags") == "-D DEBUG=1"
-    assert config.get("env:extra_1", "build_flags") == ["-lc -lm -D DEBUG=1"]
+    assert config.get("env:extra_1", "build_flags") == [
+        "-fdata-sections",
+        "-Wl,--gc-sections",
+        "-lc -lm",
+        "-D DEBUG=1",
+    ]
     assert config.get("env:extra_2", "build_flags") == ["-Og"]
     assert config.get("env:extra_2", "monitor_speed") == 9600
     assert config.get("env:base", "build_flags") == ["-D DEBUG=1"]
@@ -246,24 +263,29 @@ def test_items(config):
     assert config.items("custom") == [
         ("debug_flags", "-D DEBUG=1"),
         ("lib_flags", "-lc -lm"),
-        ("extra_flags", None),
+        ("extra_flags", ""),
         ("lib_ignore", "LibIgnoreCustom"),
     ]
     assert config.items(env="base") == [
         ("build_flags", ["-D DEBUG=1"]),
-        ("lib_compat_mode", "soft"),
+        ("lib_compat_mode", "strict"),
         ("targets", []),
         ("monitor_speed", 9600),
         ("custom_monitor_speed", "115200"),
         ("lib_deps", ["Lib1", "Lib2"]),
         ("lib_ignore", ["LibIgnoreCustom"]),
+        ("custom_builtin_option", "release"),
     ]
     assert config.items(env="extra_1") == [
-        ("build_flags", ["-lc -lm -D DEBUG=1"]),
+        (
+            "build_flags",
+            ["-fdata-sections", "-Wl,--gc-sections", "-lc -lm", "-D DEBUG=1"],
+        ),
         ("lib_deps", ["574"]),
         ("monitor_speed", 9600),
         ("custom_monitor_speed", "115200"),
         ("lib_ignore", ["LibIgnoreCustom"]),
+        ("custom_builtin_option", "release"),
     ]
     assert config.items(env="extra_2") == [
         ("build_flags", ["-Og"]),
@@ -272,6 +294,7 @@ def test_items(config):
         ("monitor_speed", 9600),
         ("custom_monitor_speed", "115200"),
         ("lib_deps", ["Lib1", "Lib2"]),
+        ("custom_builtin_option", "release"),
     ]
     assert config.items(env="test_extends") == [
         ("extends", ["strict_settings"]),
@@ -282,6 +305,7 @@ def test_items(config):
         ("custom_monitor_speed", "115200"),
         ("lib_deps", ["Lib1", "Lib2"]),
         ("lib_ignore", ["LibIgnoreCustom"]),
+        ("custom_builtin_option", "release"),
     ]
 
 
@@ -315,9 +339,11 @@ board = myboard
     ]
 
     config.save()
+    contents = tmpdir.join("platformio.ini").read()
+    assert contents[-4:] == "yes\n"
     lines = [
         line.strip()
-        for line in tmpdir.join("platformio.ini").readlines()
+        for line in contents.split("\n")
         if line.strip() and not line.startswith((";", "#"))
     ]
     assert lines == [
@@ -376,6 +402,7 @@ def test_dump(tmpdir_factory):
                 ("custom_monitor_speed", "115200"),
                 ("lib_deps", ["Lib1", "Lib2"]),
                 ("lib_ignore", ["${custom.lib_ignore}"]),
+                ("custom_builtin_option", "${env.build_type}"),
             ],
         ),
         ("strict_ldf", [("lib_ldf_mode", "chain+"), ("lib_compat_mode", "strict")]),
@@ -397,7 +424,7 @@ def test_dump(tmpdir_factory):
             "env:base",
             [
                 ("build_flags", ["${custom.debug_flags} ${custom.extra_flags}"]),
-                ("lib_compat_mode", "${strict_ldf.strict}"),
+                ("lib_compat_mode", "${strict_ldf.lib_compat_mode}"),
                 ("targets", []),
             ],
         ),