Merge branch 'release/v4.2.0'

Ivan Kravets
2020-02-12 16:42:13 +02:00
81 changed files with 2373 additions and 801 deletions

View File

@ -6,6 +6,47 @@ Release Notes
PlatformIO Core 4.0
-------------------
4.2.0 (2020-02-12)
~~~~~~~~~~~~~~~~~~
* `PlatformIO Home 3.1 <http://docs.platformio.org/page/home/index.html>`__:
- Project Manager
- Project Configuration UI for `"platformio.ini" <https://docs.platformio.org/page/projectconf.html>`__
* `PIO Check <http://docs.platformio.org/page/plus/pio-check.html>`__ automated code analysis without hassle:
- Added support for `PVS-Studio <https://docs.platformio.org/page/plus/check-tools/pvs-studio.html>`__ static code analyzer
* Initial support for `Project Manager <https://docs.platformio.org/page/userguide/project/index.html>`_ CLI:
- Show computed project configuration with a new `platformio project config <https://docs.platformio.org/page/userguide/project/cmd_config.html>`_ command or dump to JSON with ``platformio project config --json-output`` (`issue #3335 <https://github.com/platformio/platformio-core/issues/3335>`_)
- Moved ``platformio init`` command to `platformio project init <https://docs.platformio.org/page/userguide/project/cmd_init.html>`_
* Generate `compilation database "compile_commands.json" <https://docs.platformio.org/page/faq.html#compilation-database-compile-commands-json>`_ (`issue #2990 <https://github.com/platformio/platformio-core/issues/2990>`_)
* Control debug flags and optimization level with a new `debug_build_flags <https://docs.platformio.org/page/projectconf/section_env_debug.html#debug-build-flags>`__ option
* Install a dev-platform with ALL declared packages using a new ``--with-all-packages`` option for `pio platform install <https://docs.platformio.org/page/userguide/platforms/cmd_install.html>`__ command (`issue #3345 <https://github.com/platformio/platformio-core/issues/3345>`_)
* Added support for "pythonPackages" in `platform.json <https://docs.platformio.org/page/platforms/creating_platform.html#manifest-file-platform-json>`__ manifest (PlatformIO Package Manager will install dependent Python packages from PyPi registry automatically when dev-platform is installed)
* Handle project configuration (monitor, test, and upload options) for PIO Remote commands (`issue #2591 <https://github.com/platformio/platformio-core/issues/2591>`_)
* Added support for Arduino's library.properties ``depends`` field (`issue #2781 <https://github.com/platformio/platformio-core/issues/2781>`_)
* Autodetect monitor port for boards with specified HWIDs (`issue #3349 <https://github.com/platformio/platformio-core/issues/3349>`_)
* Updated SCons tool to 3.1.2
* Updated Unity tool to 2.5.0
* Made package ManifestSchema compatible with marshmallow >= 3 (`issue #3296 <https://github.com/platformio/platformio-core/issues/3296>`_)
* Warn about broken library manifest when scanning dependencies (`issue #3268 <https://github.com/platformio/platformio-core/issues/3268>`_)
* Do not overwrite custom items in VSCode's "extensions.json" (`issue #3374 <https://github.com/platformio/platformio-core/issues/3374>`_)
* Fixed an issue when ``env.BoardConfig()`` does not work for custom boards in extra scripts of libraries (`issue #3264 <https://github.com/platformio/platformio-core/issues/3264>`_)
* Fixed an issue with "start-group/end-group" linker flags on Native development platform (`issue #3282 <https://github.com/platformio/platformio-core/issues/3282>`_)
* Fixed default PIO Unified Debugger configuration for `J-Link probe <http://docs.platformio.org/page/plus/debug-tools/jlink.html>`__
* Fixed an issue with LDF when header files not found if "libdeps_dir" is within a subdirectory of "lib_extra_dirs" (`issue #3311 <https://github.com/platformio/platformio-core/issues/3311>`_)
* Fixed an issue "Import of non-existent variable 'projenv'" when a development platform does not call "env.BuildProgram()" (`issue #3315 <https://github.com/platformio/platformio-core/issues/3315>`_)
* Fixed an issue when invalid CLI command does not return non-zero exit code
* Fixed an issue when Project Inspector crashes when flash use > 100% (`issue #3368 <https://github.com/platformio/platformio-core/issues/3368>`_)
* Fixed a "UnicodeDecodeError" when listing built-in libraries on macOS with Python 2.7 (`issue #3370 <https://github.com/platformio/platformio-core/issues/3370>`_)
* Fixed an issue with improperly handled compiler flags with space symbols in VSCode template (`issue #3364 <https://github.com/platformio/platformio-core/issues/3364>`_)
* Fixed an issue when no error is raised if referred parameter (interpolation) is missing in a project configuration file (`issue #3279 <https://github.com/platformio/platformio-core/issues/3279>`_)
4.1.0 (2019-11-07)
~~~~~~~~~~~~~~~~~~
@ -18,8 +59,9 @@ PlatformIO Core 4.0
- Unused variables or functions
- Out of scope memory usage.
-* `PlatformIO Home 3.0 <http://docs.platformio.org/page/home/index.html>`__ and Project Inspection
+* `PlatformIO Home 3.0 <http://docs.platformio.org/page/home/index.html>`__:
- Project Inspection
- Static Code Analysis
- Firmware File Explorer
- Firmware Memory Inspection

MANIFEST.in Normal file
View File

@ -0,0 +1 @@
include LICENSE

View File

@ -5,14 +5,14 @@ isort:
isort -rc ./platformio
isort -rc ./tests
-black:
+format:
black --target-version py27 ./platformio
black --target-version py27 ./tests
test:
-py.test --verbose --capture=no --exitfirst -n 3 --dist=loadscope tests --ignore tests/test_examples.py
+py.test --verbose --capture=no --exitfirst -n 6 --dist=loadscope tests --ignore tests/test_examples.py
-before-commit: isort black lint test
+before-commit: isort format lint test
clean-docs:
rm -rf docs/_build

View File

@ -34,12 +34,13 @@ PlatformIO
.. image:: https://raw.githubusercontent.com/platformio/platformio-web/develop/app/images/platformio-ide-laptop.png
:target: https://platformio.org?utm_source=github&utm_medium=core
-`PlatformIO <https://platformio.org?utm_source=github&utm_medium=core>`_ an open source ecosystem for embedded development
+`PlatformIO <https://platformio.org?utm_source=github&utm_medium=core>`_ a new generation ecosystem for embedded development
-* **Cross-platform IDE** and **Unified Debugger**
-* **Static Code Analyzer** and **Remote Unit Testing**
-* **Multi-platform** and **Multi-architecture Build System**
-* **Firmware File Explorer** and **Memory Inspection**.
+* Open source, maximum permissive Apache 2.0 license
+* Cross-platform IDE and Unified Debugger
+* Static Code Analyzer and Remote Unit Testing
+* Multi-platform and Multi-architecture Build System
+* Firmware File Explorer and Memory Inspection.
Get Started
-----------
@ -91,10 +92,10 @@ Development Platforms
* `Microchip PIC32 <https://platformio.org/platforms/microchippic32?utm_source=github&utm_medium=core>`_
* `Nordic nRF51 <https://platformio.org/platforms/nordicnrf51?utm_source=github&utm_medium=core>`_
* `Nordic nRF52 <https://platformio.org/platforms/nordicnrf52?utm_source=github&utm_medium=core>`_
* `Nuclei <https://platformio.org/platforms/nuclei?utm_source=github&utm_medium=core>`_
* `NXP LPC <https://platformio.org/platforms/nxplpc?utm_source=github&utm_medium=core>`_
* `RISC-V <https://platformio.org/platforms/riscv?utm_source=github&utm_medium=core>`_
* `RISC-V GAP <https://platformio.org/platforms/riscv_gap?utm_source=github&utm_medium=core>`_
* `Samsung ARTIK <https://platformio.org/platforms/samsung_artik?utm_source=github&utm_medium=core>`_
* `Shakti <https://platformio.org/platforms/shakti?utm_source=github&utm_medium=core>`_
* `Silicon Labs EFM32 <https://platformio.org/platforms/siliconlabsefm32?utm_source=github&utm_medium=core>`_
* `ST STM32 <https://platformio.org/platforms/ststm32?utm_source=github&utm_medium=core>`_
@ -108,24 +109,25 @@ Frameworks
----------
* `Arduino <https://platformio.org/frameworks/arduino?utm_source=github&utm_medium=core>`_
* `ARTIK SDK <https://platformio.org/frameworks/artik-sdk?utm_source=github&utm_medium=core>`_
* `CMSIS <https://platformio.org/frameworks/cmsis?utm_source=github&utm_medium=core>`_
* `Energia <https://platformio.org/frameworks/energia?utm_source=github&utm_medium=core>`_
* `ESP-IDF <https://platformio.org/frameworks/espidf?utm_source=github&utm_medium=core>`_
* `ESP8266 Non-OS SDK <https://platformio.org/frameworks/esp8266-nonos-sdk?utm_source=github&utm_medium=core>`_
* `ESP8266 RTOS SDK <https://platformio.org/frameworks/esp8266-rtos-sdk?utm_source=github&utm_medium=core>`_
* `Freedom E SDK <https://platformio.org/frameworks/freedom-e-sdk?utm_source=github&utm_medium=core>`_
* `GigaDevice GD32V SDK <https://platformio.org/frameworks/gd32vf103-sdk?utm_source=github&utm_medium=core>`_
* `Kendryte Standalone SDK <https://platformio.org/frameworks/kendryte-standalone-sdk?utm_source=github&utm_medium=core>`_
* `Kendryte FreeRTOS SDK <https://platformio.org/frameworks/kendryte-freertos-sdk?utm_source=github&utm_medium=core>`_
* `libOpenCM3 <https://platformio.org/frameworks/libopencm3?utm_source=github&utm_medium=core>`_
-* `mbed <https://platformio.org/frameworks/mbed?utm_source=github&utm_medium=core>`_
+* `Mbed <https://platformio.org/frameworks/mbed?utm_source=github&utm_medium=core>`_
* `Nuclei SDK <https://platformio.org/frameworks/nuclei-sdk?utm_source=github&utm_medium=core>`_
* `PULP OS <https://platformio.org/frameworks/pulp-os?utm_source=github&utm_medium=core>`_
* `Pumbaa <https://platformio.org/frameworks/pumbaa?utm_source=github&utm_medium=core>`_
-* `Shakti <https://platformio.org/frameworks/shakti?utm_source=github&utm_medium=core>`_
+* `Shakti SDK <https://platformio.org/frameworks/shakti-sdk?utm_source=github&utm_medium=core>`_
* `Simba <https://platformio.org/frameworks/simba?utm_source=github&utm_medium=core>`_
* `SPL <https://platformio.org/frameworks/spl?utm_source=github&utm_medium=core>`_
* `STM32Cube <https://platformio.org/frameworks/stm32cube?utm_source=github&utm_medium=core>`_
* `Tizen RT <https://platformio.org/frameworks/tizenrt?utm_source=github&utm_medium=core>`_
* `WiringPi <https://platformio.org/frameworks/wiringpi?utm_source=github&utm_medium=core>`_
* `Zephyr <https://platformio.org/frameworks/zephyr?utm_source=github&utm_medium=core>`_
Contributing
------------

docs

Submodule docs updated: 28f91efb24...dc25f117fd

View File

@ -12,12 +12,12 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-VERSION = (4, 1, 0)
+VERSION = (4, 2, 0)
__version__ = ".".join([str(s) for s in VERSION])
__title__ = "platformio"
__description__ = (
-"An open source ecosystem for embedded development. "
+"A new generation ecosystem for embedded development. "
"Cross-platform IDE and Unified Debugger. "
"Static Code Analyzer and Remote Unit Testing. "
"Multi-platform and Multi-architecture Build System. "

View File

@ -100,8 +100,9 @@ def main(argv=None):
try:
configure()
cli() # pylint: disable=no-value-for-parameter
-except SystemExit:
-pass
+except SystemExit as e:
+if e.code and str(e.code).isdigit():
+exit_code = int(e.code)
except Exception as e: # pylint: disable=broad-except
if not isinstance(e, exception.ReturnErrorCode):
maintenance.on_platformio_exception(e)

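Editor's note: the new SystemExit branch above is what makes an invalid CLI command finally return a non-zero exit code (changelog item above). A minimal standalone sketch of the same pattern, with a hypothetical cli() standing in for the click entry point:

    import sys

    def cli():
        # hypothetical dispatcher; click raises SystemExit for usage errors
        raise SystemExit(2)

    exit_code = 0
    try:
        cli()
    except SystemExit as e:
        # only plain numeric codes are propagated, mirroring the change above
        if e.code and str(e.code).isdigit():
            exit_code = int(e.code)

    sys.exit(exit_code)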
View File

@ -17,7 +17,7 @@ import hashlib
import os
import uuid
from os import environ, getenv, listdir, remove
-from os.path import abspath, dirname, isdir, isfile, join
+from os.path import dirname, isdir, isfile, join, realpath
from time import time
import requests
@ -34,7 +34,7 @@ from platformio.project.helpers import (
def projects_dir_validate(projects_dir):
assert isdir(projects_dir)
-return abspath(projects_dir)
+return realpath(projects_dir)
DEFAULT_SETTINGS = {
@ -199,6 +199,7 @@ class ContentCache(object):
return True
def get_cache_path(self, key):
+assert "/" not in key and "\\" not in key
key = str(key)
assert len(key) > 3
return join(self.cache_dir, key[-2:], key)

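Editor's note: the added assertion rejects cache keys containing path separators, so a key can no longer escape the two-level cache layout. A standalone sketch of the same check (the cache directory is a placeholder):

    import os

    def get_cache_path(cache_dir, key):
        # mirror of ContentCache.get_cache_path: keys must be plain tokens,
        # never paths, and are sharded by their last two characters
        assert "/" not in key and "\\" not in key
        key = str(key)
        assert len(key) > 3
        return os.path.join(cache_dir, key[-2:], key)

    print(get_cache_path("/tmp/pio-cache", "a1b2c3d4"))   # /tmp/pio-cache/d4/a1b2c3d4
    get_cache_path("/tmp/pio-cache", "../../etc/passwd")  # raises AssertionError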
View File

@ -50,10 +50,10 @@ clivars.AddVariables(
DEFAULT_ENV_OPTIONS = dict(
tools=[
"ar",
-"gas",
-"gcc",
-"g++",
-"gnulink",
+"as",
+"cc",
+"c++",
+"link",
"platformio",
"pioplatform",
"pioproject",
@ -72,6 +72,7 @@ DEFAULT_ENV_OPTIONS = dict(
BUILD_DIR=join("$PROJECT_BUILD_DIR", "$PIOENV"),
BUILD_SRC_DIR=join("$BUILD_DIR", "src"),
BUILD_TEST_DIR=join("$BUILD_DIR", "test"),
+COMPILATIONDB_PATH=join("$BUILD_DIR", "compile_commands.json"),
LIBPATH=["$BUILD_DIR"],
PROGNAME="program",
PROG_PATH=join("$BUILD_DIR", "$PROGNAME$PROGSUFFIX"),
@ -134,6 +135,10 @@ if env.GetOption("clean"):
elif not int(ARGUMENTS.get("PIOVERBOSE", 0)):
click.echo("Verbose mode can be enabled via `-v, --verbose` option")
+# Dynamically load dependent tools
+if "compiledb" in COMMAND_LINE_TARGETS:
+env.Tool("compilation_db")
if not isdir(env.subst("$BUILD_DIR")):
makedirs(env.subst("$BUILD_DIR"))
@ -161,7 +166,9 @@ for item in env.GetExtraScripts("post"):
##############################################################################
# Checking program size
-if env.get("SIZETOOL") and "nobuild" not in COMMAND_LINE_TARGETS:
+if env.get("SIZETOOL") and not (
+set(["nobuild", "sizedata"]) & set(COMMAND_LINE_TARGETS)
+):
env.Depends(["upload", "program"], "checkprogsize")
# Replace platform's "size" target with our
_new_targets = [t for t in DEFAULT_TARGETS if str(t) != "size"]
@ -169,6 +176,9 @@ if env.get("SIZETOOL") and "nobuild" not in COMMAND_LINE_TARGETS:
Default(_new_targets)
Default("checkprogsize")
+if "compiledb" in COMMAND_LINE_TARGETS:
+env.Alias("compiledb", env.CompilationDatabase("$COMPILATIONDB_PATH"))
# Print configured protocols
env.AddPreAction(
["upload", "program"],
@ -188,7 +198,10 @@ if "envdump" in COMMAND_LINE_TARGETS:
env.Exit(0)
if "idedata" in COMMAND_LINE_TARGETS:
-Import("projenv")
+try:
+Import("projenv")
+except: # pylint: disable=bare-except
+projenv = env
click.echo(
"\n%s\n"
% dump_json_to_unicode(

View File

@ -0,0 +1,209 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
# Copyright 2015 MongoDB Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=unused-argument, protected-access, unused-variable, import-error
# Original: https://github.com/mongodb/mongo/blob/master/site_scons/site_tools/compilation_db.py
from __future__ import absolute_import
import itertools
import json
import os
import SCons
from platformio.builder.tools.platformio import SRC_ASM_EXT, SRC_C_EXT, SRC_CXX_EXT
# Implements the ability for SCons to emit a compilation database for the MongoDB project. See
# http://clang.llvm.org/docs/JSONCompilationDatabase.html for details on what a compilation
# database is, and why you might want one. The only user visible entry point here is
# 'env.CompilationDatabase'. This method takes an optional 'target' to name the file that
# should hold the compilation database, otherwise, the file defaults to compile_commands.json,
# which is the name that most clang tools search for by default.
# TODO: Is there a better way to do this than this global? Right now this exists so that the
# emitter we add can record all of the things it emits, so that the scanner for the top level
# compilation database can access the complete list, and also so that the writer has easy
# access to write all of the files. But it seems clunky. How can the emitter and the scanner
# communicate more gracefully?
__COMPILATION_DB_ENTRIES = []
# We make no effort to avoid rebuilding the entries. Someday, perhaps we could and even
# integrate with the cache, but there doesn't seem to be much call for it.
class __CompilationDbNode(SCons.Node.Python.Value):
def __init__(self, value):
SCons.Node.Python.Value.__init__(self, value)
self.Decider(changed_since_last_build_node)
def changed_since_last_build_node(child, target, prev_ni, node):
""" Dummy decider to force always building"""
return True
def makeEmitCompilationDbEntry(comstr):
"""
Effectively this creates a lambda function to capture:
* command line
* source
* target
:param comstr: unevaluated command line
:return: an emitter which has captured the above
"""
user_action = SCons.Action.Action(comstr)
def EmitCompilationDbEntry(target, source, env):
"""
This emitter will be added to each c/c++ object build to capture the info needed
for clang tools
:param target: target node(s)
:param source: source node(s)
:param env: Environment for use building this node
:return: target(s), source(s)
"""
dbtarget = __CompilationDbNode(source)
entry = env.__COMPILATIONDB_Entry(
target=dbtarget,
source=[],
__COMPILATIONDB_UTARGET=target,
__COMPILATIONDB_USOURCE=source,
__COMPILATIONDB_UACTION=user_action,
__COMPILATIONDB_ENV=env,
)
# TODO: Technically, these next two lines should not be required: it should be fine to
# cache the entries. However, they don't seem to update properly. Since they are quick
# to re-generate disable caching and sidestep this problem.
env.AlwaysBuild(entry)
env.NoCache(entry)
__COMPILATION_DB_ENTRIES.append(dbtarget)
return target, source
return EmitCompilationDbEntry
def CompilationDbEntryAction(target, source, env, **kw):
"""
Create a dictionary with evaluated command line, target, source
and store that info as an attribute on the target
(Which has been stored in __COMPILATION_DB_ENTRIES array
:param target: target node(s)
:param source: source node(s)
:param env: Environment for use building this node
:param kw:
:return: None
"""
command = env["__COMPILATIONDB_UACTION"].strfunction(
target=env["__COMPILATIONDB_UTARGET"],
source=env["__COMPILATIONDB_USOURCE"],
env=env["__COMPILATIONDB_ENV"],
)
entry = {
"directory": env.Dir("#").abspath,
"command": command,
"file": str(env["__COMPILATIONDB_USOURCE"][0]),
}
target[0].write(entry)
def WriteCompilationDb(target, source, env):
entries = []
for s in __COMPILATION_DB_ENTRIES:
item = s.read()
item["file"] = os.path.abspath(item["file"])
entries.append(item)
with open(str(target[0]), "w") as target_file:
json.dump(
entries, target_file, sort_keys=True, indent=4, separators=(",", ": ")
)
def ScanCompilationDb(node, env, path):
return __COMPILATION_DB_ENTRIES
def generate(env, **kwargs):
static_obj, shared_obj = SCons.Tool.createObjBuilders(env)
env["COMPILATIONDB_COMSTR"] = kwargs.get(
"COMPILATIONDB_COMSTR", "Building compilation database $TARGET"
)
components_by_suffix = itertools.chain(
itertools.product(
[".%s" % ext for ext in SRC_C_EXT],
[
(static_obj, SCons.Defaults.StaticObjectEmitter, "$CCCOM"),
(shared_obj, SCons.Defaults.SharedObjectEmitter, "$SHCCCOM"),
],
),
itertools.product(
[".%s" % ext for ext in SRC_CXX_EXT],
[
(static_obj, SCons.Defaults.StaticObjectEmitter, "$CXXCOM"),
(shared_obj, SCons.Defaults.SharedObjectEmitter, "$SHCXXCOM"),
],
),
itertools.product(
[".%s" % ext for ext in SRC_ASM_EXT],
[(static_obj, SCons.Defaults.StaticObjectEmitter, "$ASCOM")],
),
)
for entry in components_by_suffix:
suffix = entry[0]
builder, base_emitter, command = entry[1]
# Assumes a dictionary emitter
emitter = builder.emitter[suffix]
builder.emitter[suffix] = SCons.Builder.ListEmitter(
[emitter, makeEmitCompilationDbEntry(command)]
)
env["BUILDERS"]["__COMPILATIONDB_Entry"] = SCons.Builder.Builder(
action=SCons.Action.Action(CompilationDbEntryAction, None),
)
env["BUILDERS"]["__COMPILATIONDB_Database"] = SCons.Builder.Builder(
action=SCons.Action.Action(WriteCompilationDb, "$COMPILATIONDB_COMSTR"),
target_scanner=SCons.Scanner.Scanner(
function=ScanCompilationDb, node_class=None
),
)
def CompilationDatabase(env, target):
result = env.__COMPILATIONDB_Database(target=target, source=[])
env.AlwaysBuild(result)
env.NoCache(result)
return result
env.AddMethod(CompilationDatabase, "CompilationDatabase")
def exists(env):
return True

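Editor's note: the builder script above only loads this tool and aliases the "compiledb" target when it is requested, and WriteCompilationDb() then serializes one record per compiled source in the standard Clang JSON compilation-database shape. The entry below is illustrative only (paths and compiler command are invented):

    # one illustrative compile_commands.json entry (values are hypothetical)
    entry = {
        "directory": "/home/user/my-project",
        "command": "arm-none-eabi-gcc -o .pio/build/env/src/main.o -c -Iinclude src/main.c",
        "file": "/home/user/my-project/src/main.c",
    }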
View File

@ -14,9 +14,8 @@
from __future__ import absolute_import
+import os
from glob import glob
-from os import environ
-from os.path import abspath, isfile, join
from SCons.Defaults import processDefines # pylint: disable=import-error
@ -42,10 +41,10 @@ def _dump_includes(env):
continue
toolchain_dir = glob_escape(p.get_package_dir(name))
toolchain_incglobs = [
-join(toolchain_dir, "*", "include*"),
-join(toolchain_dir, "*", "include", "c++", "*"),
-join(toolchain_dir, "*", "include", "c++", "*", "*-*-*"),
-join(toolchain_dir, "lib", "gcc", "*", "*", "include*"),
+os.path.join(toolchain_dir, "*", "include*"),
+os.path.join(toolchain_dir, "*", "include", "c++", "*"),
+os.path.join(toolchain_dir, "*", "include", "c++", "*", "*-*-*"),
+os.path.join(toolchain_dir, "lib", "gcc", "*", "*", "include*"),
]
for g in toolchain_incglobs:
includes.extend(glob(g))
@ -59,8 +58,9 @@ def _dump_includes(env):
# remove duplicates
result = []
for item in includes:
+item = os.path.realpath(item)
if item not in result:
-result.append(abspath(item))
+result.append(item)
return result
@ -68,7 +68,7 @@ def _dump_includes(env):
def _get_gcc_defines(env):
items = []
try:
-sysenv = environ.copy()
+sysenv = os.environ.copy()
sysenv["PATH"] = str(env["ENV"]["PATH"])
result = exec_command(
"echo | %s -dM -E -" % env.subst("$CC"), env=sysenv, shell=True
@ -119,7 +119,7 @@ def _dump_defines(env):
def _get_svd_path(env):
svd_path = env.GetProjectOption("debug_svd_path")
if svd_path:
-return abspath(svd_path)
+return os.path.realpath(svd_path)
if "BOARD" not in env:
return None
@ -129,18 +129,29 @@ def _get_svd_path(env):
except (AssertionError, KeyError):
return None
# custom path to SVD file
-if isfile(svd_path):
+if os.path.isfile(svd_path):
return svd_path
# default file from ./platform/misc/svd folder
p = env.PioPlatform()
-if isfile(join(p.get_dir(), "misc", "svd", svd_path)):
-return abspath(join(p.get_dir(), "misc", "svd", svd_path))
+if os.path.isfile(os.path.join(p.get_dir(), "misc", "svd", svd_path)):
+return os.path.realpath(os.path.join(p.get_dir(), "misc", "svd", svd_path))
return None
+def _escape_build_flag(flags):
+return [flag if " " not in flag else '"%s"' % flag for flag in flags]
def DumpIDEData(env):
-LINTCCOM = "$CFLAGS $CCFLAGS $CPPFLAGS"
-LINTCXXCOM = "$CXXFLAGS $CCFLAGS $CPPFLAGS"
+env["__escape_build_flag"] = _escape_build_flag
+LINTCCOM = (
+"${__escape_build_flag(CFLAGS)} ${__escape_build_flag(CCFLAGS)} $CPPFLAGS"
+)
+LINTCXXCOM = (
+"${__escape_build_flag(CXXFLAGS)} ${__escape_build_flag(CCFLAGS)} $CPPFLAGS"
+)
data = {
"env_name": env["PIOENV"],

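Editor's note: the new _escape_build_flag() helper only quotes flags that contain spaces, which is what fixes the broken VSCode template flags (issue #3364 above). A quick standalone check, with an example include path:

    def _escape_build_flag(flags):
        return [flag if " " not in flag else '"%s"' % flag for flag in flags]

    print(_escape_build_flag(["-Os", "-IC:/My Tools/include", "-fno-exceptions"]))
    # ['-Os', '"-IC:/My Tools/include"', '-fno-exceptions']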
View File

@ -33,7 +33,10 @@ from platformio import exception, fs, util
from platformio.builder.tools import platformio as piotool
from platformio.compat import WINDOWS, hashlib_encode_data, string_types
from platformio.managers.lib import LibraryManager
-from platformio.package.manifest.parser import ManifestParserFactory
+from platformio.package.manifest.parser import (
+ManifestParserError,
+ManifestParserFactory,
+)
from platformio.project.options import ProjectOptions
@ -108,7 +111,14 @@ class LibBuilderBase(object):
self.path = realpath(env.subst(path))
self.verbose = verbose
-self._manifest = manifest if manifest else self.load_manifest()
+try:
+self._manifest = manifest if manifest else self.load_manifest()
+except ManifestParserError:
+click.secho(
+"Warning! Ignoring broken library manifest in " + self.path, fg="yellow"
+)
+self._manifest = {}
self._is_dependent = False
self._is_built = False
self._depbuilders = list()
@ -144,9 +154,7 @@ class LibBuilderBase(object):
@property
def dependencies(self):
-return LibraryManager.normalize_dependencies(
-self._manifest.get("dependencies", [])
-)
+return self._manifest.get("dependencies")
@property
def src_filter(self):
@ -358,7 +366,7 @@ class LibBuilderBase(object):
if not fs.path_endswith_ext(_h_path, piotool.SRC_HEADER_EXT):
continue
_f_part = _h_path[: _h_path.rindex(".")]
-for ext in piotool.SRC_C_EXT:
+for ext in piotool.SRC_C_EXT + piotool.SRC_CXX_EXT:
if not isfile("%s.%s" % (_f_part, ext)):
continue
_c_path = self.env.File("%s.%s" % (_f_part, ext))
@ -876,7 +884,7 @@ class ProjectAsLibBuilder(LibBuilderBase):
if not lib_dir:
continue
for lb in self.env.GetLibBuilders():
-if lib_dir not in lb:
+if lib_dir != lb.path:
continue
if lb not in self.depbuilders:
self.depend_recursive(lb)

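Editor's note: with SRC_C_EXT narrowed to plain C and the new SRC_CXX_EXT list (defined in builder/tools/platformio.py further down in this diff), the header-to-implementation lookup above now probes every C and C++ suffix. A standalone sketch of the candidate file names it tries, for an example header path:

    SRC_C_EXT = ["c"]
    SRC_CXX_EXT = ["cc", "cpp", "cxx", "c++"]

    def companion_candidates(header_path):
        # strip the header extension and append each known source extension
        base = header_path[: header_path.rindex(".")]
        return ["%s.%s" % (base, ext) for ext in SRC_C_EXT + SRC_CXX_EXT]

    print(companion_candidates("lib/Foo/Foo.h"))
    # ['lib/Foo/Foo.c', 'lib/Foo/Foo.cc', 'lib/Foo/Foo.cpp', 'lib/Foo/Foo.cxx', 'lib/Foo/Foo.c++']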
View File

@ -314,17 +314,24 @@ def ConfigureDebugFlags(env):
if scope not in env:
return
unflags = ["-Os", "-g"]
-for level in [0, 1, 2]:
+for level in [0, 1, 2, 3]:
for flag in ("O", "g", "ggdb"):
unflags.append("-%s%d" % (flag, level))
env[scope] = [f for f in env.get(scope, []) if f not in unflags]
env.Append(CPPDEFINES=["__PLATFORMIO_BUILD_DEBUG__"])
-debug_flags = ["-Og", "-g2", "-ggdb2"]
for scope in ("ASFLAGS", "CCFLAGS", "LINKFLAGS"):
_cleanup_debug_flags(scope)
-env.Append(**{scope: debug_flags})
+debug_flags = env.ParseFlags(env.GetProjectOption("debug_build_flags"))
+env.MergeFlags(debug_flags)
+optimization_flags = [
+f for f in debug_flags.get("CCFLAGS", []) if f.startswith(("-O", "-g"))
+]
+if optimization_flags:
+env.AppendUnique(ASFLAGS=optimization_flags, LINKFLAGS=optimization_flags)
def ConfigureTestTarget(env):

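Editor's note: extending the level loop to 3 means -O3/-g3/-ggdb3 are now scrubbed as well before the user's debug_build_flags value is merged in. The full strip list, computed standalone:

    unflags = ["-Os", "-g"]
    for level in [0, 1, 2, 3]:
        for flag in ("O", "g", "ggdb"):
            unflags.append("-%s%d" % (flag, level))
    print(unflags)
    # ['-Os', '-g', '-O0', '-g0', '-ggdb0', '-O1', '-g1', '-ggdb1',
    #  '-O2', '-g2', '-ggdb2', '-O3', '-g3', '-ggdb3']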
View File

@ -40,15 +40,15 @@ def PioPlatform(env):
def BoardConfig(env, board=None):
-p = env.PioPlatform()
-try:
-board = board or env.get("BOARD")
-assert board, "BoardConfig: Board is not defined"
-config = p.board_config(board)
-except (AssertionError, exception.UnknownBoard) as e:
-sys.stderr.write("Error: %s\n" % str(e))
-env.Exit(1)
-return config
+with fs.cd(env.subst("$PROJECT_DIR")):
+try:
+p = env.PioPlatform()
+board = board or env.get("BOARD")
+assert board, "BoardConfig: Board is not defined"
+return p.board_config(board)
+except (AssertionError, exception.UnknownBoard) as e:
+sys.stderr.write("Error: %s\n" % str(e))
+env.Exit(1)
def GetFrameworkScript(env, framework):
@ -213,7 +213,9 @@ def PrintConfiguration(env): # pylint: disable=too-many-statements
if extra:
info += " (%s)" % ", ".join(extra)
data.append(info)
-return ["PACKAGES:", ", ".join(data)]
+if not data:
+return None
+return ["PACKAGES:"] + ["\n - %s" % d for d in sorted(data)]
for data in (
_get_configuration_data(),

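Editor's note: the PACKAGES block is now rendered as one sorted package per line instead of a single comma-joined string. A rough standalone sketch (package names are made up, and the space joiner is an assumption about how PrintConfiguration concatenates the returned items):

    data = [
        "framework-arduinoststm32 3.10700.191028",
        "toolchain-gccarmnoneeabi 1.70201.0 (7.2.1)",
    ]
    print(" ".join(["PACKAGES:"] + ["\n - %s" % d for d in sorted(data)]))
    # PACKAGES:
    #  - framework-arduinoststm32 3.10700.191028
    #  - toolchain-gccarmnoneeabi 1.70201.0 (7.2.1)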
View File

@ -251,9 +251,9 @@ def CheckUploadSize(_, target, source, env):
print('Advanced Memory Usage is available via "PlatformIO Home > Project Inspect"')
if data_max_size and data_size > -1:
-print("DATA: %s" % _format_availale_bytes(data_size, data_max_size))
+print("RAM: %s" % _format_availale_bytes(data_size, data_max_size))
if program_size > -1:
-print("PROGRAM: %s" % _format_availale_bytes(program_size, program_max_size))
+print("Flash: %s" % _format_availale_bytes(program_size, program_max_size))
if int(ARGUMENTS.get("PIOVERBOSE", 0)):
print(output)

View File

@ -31,8 +31,10 @@ from platformio.compat import string_types
from platformio.util import pioversion_to_intstr
SRC_HEADER_EXT = ["h", "hpp"]
-SRC_C_EXT = ["c", "cc", "cpp"]
-SRC_BUILD_EXT = SRC_C_EXT + ["S", "spp", "SPP", "sx", "s", "asm", "ASM"]
+SRC_ASM_EXT = ["S", "spp", "SPP", "sx", "s", "asm", "ASM"]
+SRC_C_EXT = ["c"]
+SRC_CXX_EXT = ["cc", "cpp", "cxx", "c++"]
+SRC_BUILD_EXT = SRC_C_EXT + SRC_CXX_EXT + SRC_ASM_EXT
SRC_FILTER_DEFAULT = ["+<*>", "-<.git%s>" % os.sep, "-<.svn%s>" % os.sep]
@ -44,7 +46,88 @@ def scons_patched_match_splitext(path, suffixes=None):
return tokens
-def _build_project_deps(env):
+def GetBuildType(env):
return (
"debug"
if (
set(["debug", "sizedata"]) & set(COMMAND_LINE_TARGETS)
or env.GetProjectOption("build_type") == "debug"
)
else "release"
)
def BuildProgram(env):
env.ProcessProgramDeps()
env.ProcessProjectDeps()
# append into the beginning a main LD script
if env.get("LDSCRIPT_PATH") and not any("-Wl,-T" in f for f in env["LINKFLAGS"]):
env.Prepend(LINKFLAGS=["-T", env.subst("$LDSCRIPT_PATH")])
# enable "cyclic reference" for linker
if env.get("LIBS") and env.GetCompilerType() == "gcc":
env.Prepend(_LIBFLAGS="-Wl,--start-group ")
env.Append(_LIBFLAGS=" -Wl,--end-group")
program = env.Program(
os.path.join("$BUILD_DIR", env.subst("$PROGNAME")), env["PIOBUILDFILES"]
)
env.Replace(PIOMAINPROG=program)
AlwaysBuild(
env.Alias(
"checkprogsize",
program,
env.VerboseAction(env.CheckUploadSize, "Checking size $PIOMAINPROG"),
)
)
print("Building in %s mode" % env.GetBuildType())
return program
def ProcessProgramDeps(env):
def _append_pio_macros():
env.AppendUnique(
CPPDEFINES=[
(
"PLATFORMIO",
int("{0:02d}{1:02d}{2:02d}".format(*pioversion_to_intstr())),
)
]
)
_append_pio_macros()
env.PrintConfiguration()
# fix ASM handling under non case-sensitive OS
if not Util.case_sensitive_suffixes(".s", ".S"):
env.Replace(AS="$CC", ASCOM="$ASPPCOM")
# process extra flags from board
if "BOARD" in env and "build.extra_flags" in env.BoardConfig():
env.ProcessFlags(env.BoardConfig().get("build.extra_flags"))
# apply user flags
env.ProcessFlags(env.get("BUILD_FLAGS"))
# process framework scripts
env.BuildFrameworks(env.get("PIOFRAMEWORK"))
if env.GetBuildType() == "debug":
env.ConfigureDebugFlags()
# remove specified flags
env.ProcessUnFlags(env.get("BUILD_UNFLAGS"))
if "__test" in COMMAND_LINE_TARGETS:
env.ConfigureTestTarget()
def ProcessProjectDeps(env):
project_lib_builder = env.ConfigureProjectLibBuilder()
# prepend project libs to the beginning of list
@ -85,78 +168,6 @@ def _build_project_deps(env):
Export("projenv")
def BuildProgram(env):
def _append_pio_macros():
env.AppendUnique(
CPPDEFINES=[
(
"PLATFORMIO",
int("{0:02d}{1:02d}{2:02d}".format(*pioversion_to_intstr())),
)
]
)
_append_pio_macros()
env.PrintConfiguration()
# fix ASM handling under non case-sensitive OS
if not Util.case_sensitive_suffixes(".s", ".S"):
env.Replace(AS="$CC", ASCOM="$ASPPCOM")
# process extra flags from board
if "BOARD" in env and "build.extra_flags" in env.BoardConfig():
env.ProcessFlags(env.BoardConfig().get("build.extra_flags"))
# apply user flags
env.ProcessFlags(env.get("BUILD_FLAGS"))
# process framework scripts
env.BuildFrameworks(env.get("PIOFRAMEWORK"))
is_build_type_debug = (
set(["debug", "sizedata"]) & set(COMMAND_LINE_TARGETS)
or env.GetProjectOption("build_type") == "debug"
)
if is_build_type_debug:
env.ConfigureDebugFlags()
# remove specified flags
env.ProcessUnFlags(env.get("BUILD_UNFLAGS"))
if "__test" in COMMAND_LINE_TARGETS:
env.ConfigureTestTarget()
# append into the beginning a main LD script
if env.get("LDSCRIPT_PATH") and not any("-Wl,-T" in f for f in env["LINKFLAGS"]):
env.Prepend(LINKFLAGS=["-T", env.subst("$LDSCRIPT_PATH")])
# enable "cyclic reference" for linker
if env.get("LIBS") and env.GetCompilerType() == "gcc":
env.Prepend(_LIBFLAGS="-Wl,--start-group ")
env.Append(_LIBFLAGS=" -Wl,--end-group")
# build project with dependencies
_build_project_deps(env)
program = env.Program(
os.path.join("$BUILD_DIR", env.subst("$PROGNAME")), env["PIOBUILDFILES"]
)
env.Replace(PIOMAINPROG=program)
AlwaysBuild(
env.Alias(
"checkprogsize",
program,
env.VerboseAction(env.CheckUploadSize, "Checking size $PIOMAINPROG"),
)
)
print("Building in %s mode" % ("debug" if is_build_type_debug else "release"))
return program
def ParseFlagsExtended(env, flags): # pylint: disable=too-many-branches
if not isinstance(flags, list):
flags = [flags]
@ -343,7 +354,10 @@ def exists(_):
def generate(env):
+env.AddMethod(GetBuildType)
env.AddMethod(BuildProgram)
+env.AddMethod(ProcessProgramDeps)
+env.AddMethod(ProcessProjectDeps)
env.AddMethod(ParseFlagsExtended)
env.AddMethod(ProcessFlags)
env.AddMethod(ProcessUnFlags)

View File

@ -63,5 +63,18 @@ class PlatformioCLI(click.MultiCommand):
mod_path = "platformio.commands.%s.command" % cmd_name
mod = __import__(mod_path, None, None, ["cli"])
except ImportError:
+try:
+return self._handle_obsolate_command(cmd_name)
+except AttributeError:
+pass
raise click.UsageError('No such command "%s"' % cmd_name, ctx)
return mod.cli
@staticmethod
def _handle_obsolate_command(name):
# pylint: disable=import-outside-toplevel
if name == "init":
from platformio.commands.project import project_init
return project_init
raise AttributeError()

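Editor's note: a standalone sketch of the fallback above, so `platformio init` keeps working after the move to `platformio project init`. When the per-command module import fails, the obsolete-name handler is consulted before the usage error is raised (the returned string below stands in for the real click command object):

    def _handle_obsolate_command(name):
        # mirrors PlatformioCLI._handle_obsolate_command: "init" maps to project init
        if name == "init":
            return "platformio.commands.project.project_init"  # placeholder
        raise AttributeError()

    def get_command(name):
        try:
            raise ImportError()  # pretend the per-command module does not exist
        except ImportError:
            try:
                return _handle_obsolate_command(name)
            except AttributeError:
                pass
            raise LookupError('No such command "%s"' % name)

    print(get_command("init"))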
View File

@ -32,7 +32,10 @@ def cli(query, installed, json_output): # pylint: disable=R0912
grpboards = {}
for board in _get_boards(installed):
-if query and query.lower() not in json.dumps(board).lower():
+if query and not any(
+query.lower() in str(board.get(k, "")).lower()
+for k in ("id", "name", "mcu", "vendor", "platform", "frameworks")
+):
continue
if board["platform"] not in grpboards:
grpboards[board["platform"]] = []

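Editor's note: the search now matches only meaningful board fields instead of the whole JSON dump (which also matched URLs and file names). A standalone sketch with example board data:

    board = {
        "id": "uno", "name": "Arduino Uno", "mcu": "ATMEGA328P",
        "vendor": "Arduino", "platform": "atmelavr", "frameworks": ["arduino"],
    }
    query = "atmel"
    print(any(
        query.lower() in str(board.get(k, "")).lower()
        for k in ("id", "name", "mcu", "vendor", "platform", "frameworks")
    ))  # True: matches the "platform" field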
View File

@ -12,7 +12,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-from os.path import abspath, relpath
+import os
import click
@ -52,7 +52,7 @@ class DefectItem(object):
self.id = id
self.file = file
if file.startswith(get_project_dir()):
-self.file = relpath(file, get_project_dir())
+self.file = os.path.relpath(file, get_project_dir())
def __repr__(self):
defect_color = None
@ -86,7 +86,7 @@ class DefectItem(object):
"severity": self.SEVERITY_LABELS[self.severity],
"category": self.category,
"message": self.message,
-"file": abspath(self.file),
+"file": os.path.realpath(self.file),
"line": self.line,
"column": self.column,
"callstack": self.callstack,

View File

@ -15,6 +15,7 @@
from platformio import exception
from platformio.commands.check.tools.clangtidy import ClangtidyCheckTool
from platformio.commands.check.tools.cppcheck import CppcheckCheckTool
+from platformio.commands.check.tools.pvsstudio import PvsStudioCheckTool
class CheckToolFactory(object):
@ -25,6 +26,8 @@ class CheckToolFactory(object):
cls = CppcheckCheckTool
elif tool == "clangtidy":
cls = ClangtidyCheckTool
+elif tool == "pvs-studio":
+cls = PvsStudioCheckTool
else:
raise exception.PlatformioException("Unknown check tool `%s`" % tool)
return cls(project_dir, config, envname, options)

View File

@ -27,10 +27,13 @@ class CheckToolBase(object): # pylint: disable=too-many-instance-attributes
self.config = config
self.envname = envname
self.options = options
-self.cpp_defines = []
-self.cpp_flags = []
+self.cc_flags = []
+self.cxx_flags = []
self.cpp_includes = []
+self.cpp_defines = []
+self.toolchain_defines = []
+self.cc_path = None
+self.cxx_path = None
self._defects = []
self._on_defect_callback = None
self._bad_input = False
@ -53,16 +56,19 @@ class CheckToolBase(object): # pylint: disable=too-many-instance-attributes
data = load_project_ide_data(project_dir, envname)
if not data:
return
-self.cpp_flags = data.get("cxx_flags", "").split(" ")
+self.cc_flags = data.get("cc_flags", "").split(" ")
+self.cxx_flags = data.get("cxx_flags", "").split(" ")
self.cpp_includes = data.get("includes", [])
self.cpp_defines = data.get("defines", [])
-self.cpp_defines.extend(self._get_toolchain_defines(data.get("cc_path")))
+self.cc_path = data.get("cc_path")
+self.cxx_path = data.get("cxx_path")
+self.toolchain_defines = self._get_toolchain_defines(self.cc_path)
def get_flags(self, tool):
result = []
flags = self.options.get("flags") or []
for flag in flags:
-if ":" not in flag:
+if ":" not in flag or flag.startswith("-"):
result.extend([f for f in flag.split(" ") if f])
elif flag.startswith("%s:" % tool):
result.extend([f for f in flag.split(":", 1)[1].split(" ") if f])
@ -132,7 +138,7 @@ class CheckToolBase(object): # pylint: disable=too-many-instance-attributes
def _add_file(path):
if not path.endswith(allowed_extensions):
return
-result.append(os.path.abspath(path))
+result.append(os.path.realpath(path))
for pattern in self.options["patterns"]:
for item in glob.glob(pattern):

View File

@ -61,7 +61,7 @@ class ClangtidyCheckTool(CheckToolBase):
cmd.extend(self.get_project_target_files())
cmd.append("--")
-cmd.extend(["-D%s" % d for d in self.cpp_defines])
+cmd.extend(["-D%s" % d for d in self.cpp_defines + self.toolchain_defines])
cmd.extend(["-I%s" % inc for inc in self.cpp_includes])
return cmd

View File

@ -112,18 +112,18 @@ class CppcheckCheckTool(CheckToolBase):
cmd.append("--language=c++") cmd.append("--language=c++")
if not self.is_flag_set("--std", flags): if not self.is_flag_set("--std", flags):
for f in self.cpp_flags: for f in self.cxx_flags + self.cc_flags:
if "-std" in f: if "-std" in f:
# Standards with GNU extensions are not allowed # Standards with GNU extensions are not allowed
cmd.append("-" + f.replace("gnu", "c")) cmd.append("-" + f.replace("gnu", "c"))
cmd.extend(["-D%s" % d for d in self.cpp_defines]) cmd.extend(["-D%s" % d for d in self.cpp_defines + self.toolchain_defines])
cmd.extend(flags) cmd.extend(flags)
cmd.append("--file-list=%s" % self._generate_src_file()) cmd.append("--file-list=%s" % self._generate_src_file())
cmd.append("--includes-file=%s" % self._generate_inc_file()) cmd.append("--includes-file=%s" % self._generate_inc_file())
core_dir = self.config.get_optional_dir("core") core_dir = self.config.get_optional_dir("packages")
cmd.append("--suppress=*:%s*" % core_dir) cmd.append("--suppress=*:%s*" % core_dir)
cmd.append("--suppress=unmatchedSuppression:%s*" % core_dir) cmd.append("--suppress=unmatchedSuppression:%s*" % core_dir)

View File

@ -0,0 +1,226 @@
# Copyright (c) 2020-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import shutil
import tempfile
from xml.etree.ElementTree import fromstring
import click
from platformio import proc, util
from platformio.commands.check.defect import DefectItem
from platformio.commands.check.tools.base import CheckToolBase
from platformio.managers.core import get_core_package_dir
class PvsStudioCheckTool(CheckToolBase): # pylint: disable=too-many-instance-attributes
def __init__(self, *args, **kwargs):
self._tmp_dir = tempfile.mkdtemp(prefix="piocheck")
self._tmp_preprocessed_file = self._generate_tmp_file_path() + ".i"
self._tmp_output_file = self._generate_tmp_file_path() + ".pvs"
self._tmp_cfg_file = self._generate_tmp_file_path() + ".cfg"
self._tmp_cmd_file = self._generate_tmp_file_path() + ".cmd"
self.tool_path = os.path.join(
get_core_package_dir("tool-pvs-studio"),
"x64" if "windows" in util.get_systype() else "bin",
"pvs-studio",
)
super(PvsStudioCheckTool, self).__init__(*args, **kwargs)
with open(self._tmp_cfg_file, "w") as fp:
fp.write(
"exclude-path = "
+ self.config.get_optional_dir("packages").replace("\\", "/")
)
with open(self._tmp_cmd_file, "w") as fp:
fp.write(
" ".join(
['-I"%s"' % inc.replace("\\", "/") for inc in self.cpp_includes]
)
)
def _process_defects(self, defects):
for defect in defects:
if not isinstance(defect, DefectItem):
return
if defect.severity not in self.options["severity"]:
return
self._defects.append(defect)
if self._on_defect_callback:
self._on_defect_callback(defect)
def _demangle_report(self, output_file):
converter_tool = os.path.join(
get_core_package_dir("tool-pvs-studio"),
"HtmlGenerator"
if "windows" in util.get_systype()
else os.path.join("bin", "plog-converter"),
)
cmd = (
converter_tool,
"-t",
"xml",
output_file,
"-m",
"cwe",
"-m",
"misra",
"-a",
# Enable all possible analyzers and defect levels
"GA:1,2,3;64:1,2,3;OP:1,2,3;CS:1,2,3;MISRA:1,2,3",
"--cerr",
)
result = proc.exec_command(cmd)
if result["returncode"] != 0:
click.echo(result["err"])
self._bad_input = True
return result["err"]
def parse_defects(self, output_file):
defects = []
report = self._demangle_report(output_file)
if not report:
self._bad_input = True
return []
try:
defects_data = fromstring(report)
except: # pylint: disable=bare-except
click.echo("Error: Couldn't decode generated report!")
self._bad_input = True
return []
for table in defects_data.iter("PVS-Studio_Analysis_Log"):
message = table.find("Message").text
category = table.find("ErrorType").text
line = table.find("Line").text
file_ = table.find("File").text
defect_id = table.find("ErrorCode").text
cwe = table.find("CWECode")
cwe_id = None
if cwe is not None:
cwe_id = cwe.text.lower().replace("cwe-", "")
misra = table.find("MISRA")
if misra is not None:
message += " [%s]" % misra.text
severity = DefectItem.SEVERITY_LOW
if category == "error":
severity = DefectItem.SEVERITY_HIGH
elif category == "warning":
severity = DefectItem.SEVERITY_MEDIUM
defects.append(
DefectItem(
severity, category, message, file_, line, id=defect_id, cwe=cwe_id
)
)
return defects
def configure_command(self, src_file): # pylint: disable=arguments-differ
if os.path.isfile(self._tmp_output_file):
os.remove(self._tmp_output_file)
if not os.path.isfile(self._tmp_preprocessed_file):
click.echo(
"Error: Missing preprocessed file '%s'" % (self._tmp_preprocessed_file)
)
return ""
cmd = [
self.tool_path,
"--skip-cl-exe",
"yes",
"--language",
"C" if src_file.endswith(".c") else "C++",
"--preprocessor",
"gcc",
"--cfg",
self._tmp_cfg_file,
"--source-file",
src_file,
"--i-file",
self._tmp_preprocessed_file,
"--output-file",
self._tmp_output_file,
]
flags = self.get_flags("pvs-studio")
if not self.is_flag_set("--platform", flags):
cmd.append("--platform=arm")
cmd.extend(flags)
return cmd
def _generate_tmp_file_path(self):
# pylint: disable=protected-access
return os.path.join(self._tmp_dir, next(tempfile._get_candidate_names()))
def _prepare_preprocessed_file(self, src_file):
flags = self.cxx_flags
compiler = self.cxx_path
if src_file.endswith(".c"):
flags = self.cc_flags
compiler = self.cc_path
cmd = [compiler, src_file, "-E", "-o", self._tmp_preprocessed_file]
cmd.extend([f for f in flags if f])
cmd.extend(["-D%s" % d for d in self.cpp_defines])
cmd.append('@"%s"' % self._tmp_cmd_file)
result = proc.exec_command(" ".join(cmd), shell=True)
if result["returncode"] != 0:
if self.options.get("verbose"):
click.echo(" ".join(cmd))
click.echo(result["err"])
self._bad_input = True
def clean_up(self):
if os.path.isdir(self._tmp_dir):
shutil.rmtree(self._tmp_dir)
def check(self, on_defect_callback=None):
self._on_defect_callback = on_defect_callback
src_files = [
f for f in self.get_project_target_files() if not f.endswith((".h", ".hpp"))
]
for src_file in src_files:
self._prepare_preprocessed_file(src_file)
cmd = self.configure_command(src_file)
if self.options.get("verbose"):
click.echo(" ".join(cmd))
if not cmd:
self._bad_input = True
continue
result = proc.exec_command(cmd)
# pylint: disable=unsupported-membership-test
if result["returncode"] != 0 or "License was not entered" in result["err"]:
self._bad_input = True
click.echo(result["err"])
continue
self._process_defects(self.parse_defects(self._tmp_output_file))
self.clean_up()
return self._bad_input

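Editor's note: parse_defects() above walks the <PVS-Studio_Analysis_Log> elements in the XML produced by plog-converter. A minimal hand-made report with only the fields the parser reads; the wrapper element name and the diagnostic text are illustrative, not the converter's exact output:

    from xml.etree.ElementTree import fromstring

    report = """
    <Report>
      <PVS-Studio_Analysis_Log>
        <ErrorType>error</ErrorType>
        <ErrorCode>V512</ErrorCode>
        <Message>Buffer underflow is possible inside the memcpy call.</Message>
        <File>src/main.c</File>
        <Line>42</Line>
      </PVS-Studio_Analysis_Log>
    </Report>
    """

    for table in fromstring(report).iter("PVS-Studio_Analysis_Log"):
        print(table.find("ErrorCode").text, table.find("File").text,
              table.find("Line").text, table.find("Message").text)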
View File

@ -14,15 +14,15 @@
from glob import glob
from os import getenv, makedirs, remove
-from os.path import abspath, basename, isdir, isfile, join
+from os.path import basename, isdir, isfile, join, realpath
from shutil import copyfile, copytree
from tempfile import mkdtemp
import click
from platformio import app, fs
-from platformio.commands.init import cli as cmd_init
-from platformio.commands.init import validate_boards
+from platformio.commands.project import project_init as cmd_project_init
+from platformio.commands.project import validate_boards
from platformio.commands.run.command import cli as cmd_run
from platformio.compat import glob_escape
from platformio.exception import CIBuildEnvsEmpty
@ -35,7 +35,7 @@ def validate_path(ctx, param, value): # pylint: disable=unused-argument
for i, p in enumerate(value):
if p.startswith("~"):
value[i] = fs.expanduser(p)
-value[i] = abspath(value[i])
+value[i] = realpath(value[i])
if not glob(value[i]):
invalid_path = p
break
@ -111,7 +111,10 @@ def cli( # pylint: disable=too-many-arguments, too-many-branches
# initialise project
ctx.invoke(
-cmd_init, project_dir=build_dir, board=board, project_option=project_option
+cmd_project_init,
+project_dir=build_dir,
+board=board,
+project_option=project_option,
)
# process project
@ -158,7 +161,7 @@ def _exclude_contents(dst_dir, patterns):
for p in patterns:
contents += glob(join(glob_escape(dst_dir), p))
for path in contents:
-path = abspath(path)
+path = realpath(path)
if isdir(path):
fs.rmtree(path)
elif isfile(path):

View File

@ -12,13 +12,12 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-import json
import os
import re
import signal
import time
from hashlib import sha1
-from os.path import abspath, basename, dirname, isdir, join, splitext
+from os.path import basename, dirname, isdir, join, realpath, splitext
from tempfile import mkdtemp
from twisted.internet import protocol # pylint: disable=import-error
@ -26,13 +25,13 @@ from twisted.internet import reactor # pylint: disable=import-error
from twisted.internet import stdio # pylint: disable=import-error
from twisted.internet import task # pylint: disable=import-error
-from platformio import app, exception, fs, proc, util
+from platformio import app, fs, proc, telemetry, util
from platformio.commands.debug import helpers, initcfgs
+from platformio.commands.debug.exception import DebugInvalidOptionsError
from platformio.commands.debug.process import BaseProcess
from platformio.commands.debug.server import DebugServer
from platformio.compat import hashlib_encode_data, is_bytes
from platformio.project.helpers import get_project_cache_dir
-from platformio.telemetry import MeasurementProtocol
LOG_FILE = None
@ -58,6 +57,7 @@ class GDBClient(BaseProcess): # pylint: disable=too-many-instance-attributes
self._target_is_run = False
self._last_server_activity = 0
self._auto_continue_timer = None
+self._errors_buffer = b""
def spawn(self, gdb_path, prog_path):
session_hash = gdb_path + prog_path
@ -94,7 +94,7 @@ class GDBClient(BaseProcess): # pylint: disable=too-many-instance-attributes
] ]
args.extend(self.args) args.extend(self.args)
if not gdb_path: if not gdb_path:
raise exception.DebugInvalidOptions("GDB client is not configured") raise DebugInvalidOptionsError("GDB client is not configured")
gdb_data_dir = self._get_data_dir(gdb_path) gdb_data_dir = self._get_data_dir(gdb_path)
if gdb_data_dir: if gdb_data_dir:
args.extend(["--data-directory", gdb_data_dir]) args.extend(["--data-directory", gdb_data_dir])
@ -108,7 +108,7 @@ class GDBClient(BaseProcess): # pylint: disable=too-many-instance-attributes
def _get_data_dir(gdb_path): def _get_data_dir(gdb_path):
if "msp430" in gdb_path: if "msp430" in gdb_path:
return None return None
gdb_data_dir = abspath(join(dirname(gdb_path), "..", "share", "gdb")) gdb_data_dir = realpath(join(dirname(gdb_path), "..", "share", "gdb"))
return gdb_data_dir if isdir(gdb_data_dir) else None return gdb_data_dir if isdir(gdb_data_dir) else None
def generate_pioinit(self, dst_dir, patterns): def generate_pioinit(self, dst_dir, patterns):
@ -215,6 +215,9 @@ class GDBClient(BaseProcess): # pylint: disable=too-many-instance-attributes
self._handle_error(data) self._handle_error(data)
# go to init break automatically # go to init break automatically
if self.INIT_COMPLETED_BANNER.encode() in data: if self.INIT_COMPLETED_BANNER.encode() in data:
telemetry.send_event(
"Debug", "Started", telemetry.encode_run_environment(self.env_options)
)
self._auto_continue_timer = task.LoopingCall(self._auto_exec_continue) self._auto_continue_timer = task.LoopingCall(self._auto_exec_continue)
self._auto_continue_timer.start(0.1) self._auto_continue_timer.start(0.1)
@ -250,20 +253,19 @@ class GDBClient(BaseProcess): # pylint: disable=too-many-instance-attributes
self._target_is_run = True self._target_is_run = True
def _handle_error(self, data): def _handle_error(self, data):
self._errors_buffer += data
if self.PIO_SRC_NAME.encode() not in data or b"Error in sourced" not in data: if self.PIO_SRC_NAME.encode() not in data or b"Error in sourced" not in data:
return return
configuration = {"debug": self.debug_options, "env": self.env_options}
exd = re.sub(r'\\(?!")', "/", json.dumps(configuration)) last_erros = self._errors_buffer.decode()
exd = re.sub( last_erros = " ".join(reversed(last_erros.split("\n")))
r'"(?:[a-z]\:)?((/[^"/]+)+)"', last_erros = re.sub(r'((~|&)"|\\n\"|\\t)', " ", last_erros, flags=re.M)
lambda m: '"%s"' % join(*m.group(1).split("/")[-2:]),
exd, err = "%s -> %s" % (
re.I | re.M, telemetry.encode_run_environment(self.env_options),
last_erros,
) )
mp = MeasurementProtocol() telemetry.send_exception("DebugInitError: %s" % err)
mp["exd"] = "DebugGDBPioInitError: %s" % exd
mp["exf"] = 1
mp.send("exception")
self.transport.loseConnection() self.transport.loseConnection()
def _kill_previous_session(self): def _kill_previous_session(self):

View File

@ -23,8 +23,10 @@ import click
from platformio import app, exception, fs, proc, util from platformio import app, exception, fs, proc, util
from platformio.commands.debug import helpers from platformio.commands.debug import helpers
from platformio.commands.debug.exception import DebugInvalidOptionsError
from platformio.managers.core import inject_contrib_pysite from platformio.managers.core import inject_contrib_pysite
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
from platformio.project.exception import ProjectEnvsNotAvailableError
from platformio.project.helpers import is_platformio_project, load_project_ide_data from platformio.project.helpers import is_platformio_project, load_project_ide_data
@ -70,7 +72,7 @@ def cli(ctx, project_dir, project_conf, environment, verbose, interface, __unpro
env_name = environment or helpers.get_default_debug_env(config) env_name = environment or helpers.get_default_debug_env(config)
env_options = config.items(env=env_name, as_dict=True) env_options = config.items(env=env_name, as_dict=True)
if not set(env_options.keys()) >= set(["platform", "board"]): if not set(env_options.keys()) >= set(["platform", "board"]):
raise exception.ProjectEnvsNotAvailable() raise ProjectEnvsNotAvailableError()
debug_options = helpers.validate_debug_options(ctx, env_options) debug_options = helpers.validate_debug_options(ctx, env_options)
assert debug_options assert debug_options
@ -79,7 +81,7 @@ def cli(ctx, project_dir, project_conf, environment, verbose, interface, __unpro
configuration = load_project_ide_data(project_dir, env_name) configuration = load_project_ide_data(project_dir, env_name)
if not configuration: if not configuration:
raise exception.DebugInvalidOptions("Could not load debug configuration") raise DebugInvalidOptionsError("Could not load debug configuration")
if "--version" in __unprocessed: if "--version" in __unprocessed:
result = proc.exec_command([configuration["gdb_path"], "--version"]) result = proc.exec_command([configuration["gdb_path"], "--version"])
@ -140,7 +142,7 @@ def cli(ctx, project_dir, project_conf, environment, verbose, interface, __unpro
helpers.is_prog_obsolete(configuration["prog_path"]) helpers.is_prog_obsolete(configuration["prog_path"])
if not isfile(configuration["prog_path"]): if not isfile(configuration["prog_path"]):
raise exception.DebugInvalidOptions("Program/firmware is missed") raise DebugInvalidOptionsError("Program/firmware is missed")
# run debugging client # run debugging client
inject_contrib_pysite() inject_contrib_pysite()

View File

@ -0,0 +1,33 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio.exception import PlatformioException, UserSideException
class DebugError(PlatformioException):
pass
class DebugSupportError(DebugError, UserSideException):
MESSAGE = (
"Currently, PlatformIO does not support debugging for `{0}`.\n"
"Please request support at https://github.com/platformio/"
"platformio-core/issues \nor visit -> https://docs.platformio.org"
"/page/plus/debugging.html"
)
class DebugInvalidOptionsError(DebugError, UserSideException):
pass
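
A hedged sketch of how this new exception module is meant to be consumed (the raise mirrors the GDB-client hunk above; the surrounding function and the catching code are illustrative and assume PlatformIO 4.2.0 is installed):

    from platformio.commands.debug.exception import DebugError, DebugInvalidOptionsError

    def configure_gdb(gdb_path):
        # same user-facing error the GDB client raises when no client is configured
        if not gdb_path:
            raise DebugInvalidOptionsError("GDB client is not configured")
        return gdb_path

    try:
        configure_gdb(None)
    except DebugError as exc:  # the base class also catches DebugInvalidOptionsError
        print("debug setup failed:", exc)
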

View File

@ -22,6 +22,7 @@ from os.path import isfile
from platformio import exception, fs, util from platformio import exception, fs, util
from platformio.commands import PlatformioCLI from platformio.commands import PlatformioCLI
from platformio.commands.debug.exception import DebugInvalidOptionsError
from platformio.commands.platform import platform_install as cmd_platform_install from platformio.commands.platform import platform_install as cmd_platform_install
from platformio.commands.run.command import cli as cmd_run from platformio.commands.run.command import cli as cmd_run
from platformio.compat import is_bytes from platformio.compat import is_bytes
@ -301,7 +302,5 @@ def reveal_debug_port(env_debug_port, tool_name, tool_settings):
debug_port = _look_for_serial_port(tool_settings.get("hwids", [])) debug_port = _look_for_serial_port(tool_settings.get("hwids", []))
if not debug_port: if not debug_port:
raise exception.DebugInvalidOptions( raise DebugInvalidOptionsError("Please specify `debug_port` for environment")
"Please specify `debug_port` for environment"
)
return debug_port return debug_port

View File

@ -59,8 +59,8 @@ end
target extended-remote $DEBUG_PORT target extended-remote $DEBUG_PORT
monitor clrbp monitor clrbp
monitor speed auto monitor speed auto
$LOAD_CMDS
pio_reset_halt_target pio_reset_halt_target
$LOAD_CMDS
$INIT_BREAK $INIT_BREAK
""" """

View File

@ -15,10 +15,10 @@
import os import os
from os.path import isdir, isfile, join from os.path import isdir, isfile, join
from twisted.internet import error # pylint: disable=import-error
from twisted.internet import reactor # pylint: disable=import-error from twisted.internet import reactor # pylint: disable=import-error
from platformio import exception, fs, util from platformio import fs, util
from platformio.commands.debug.exception import DebugInvalidOptionsError
from platformio.commands.debug.helpers import escape_gdbmi_stream, is_gdbmi_mode from platformio.commands.debug.helpers import escape_gdbmi_stream, is_gdbmi_mode
from platformio.commands.debug.process import BaseProcess from platformio.commands.debug.process import BaseProcess
from platformio.proc import where_is_program from platformio.proc import where_is_program
@ -54,7 +54,7 @@ class DebugServer(BaseProcess):
if not isfile(server_executable): if not isfile(server_executable):
server_executable = where_is_program(server_executable) server_executable = where_is_program(server_executable)
if not isfile(server_executable): if not isfile(server_executable):
raise exception.DebugInvalidOptions( raise DebugInvalidOptionsError(
"\nCould not launch Debug Server '%s'. Please check that it " "\nCould not launch Debug Server '%s'. Please check that it "
"is installed and is included in a system PATH\n\n" "is installed and is included in a system PATH\n\n"
"See documentation or contact contact@platformio.org:\n" "See documentation or contact contact@platformio.org:\n"
@ -134,5 +134,5 @@ class DebugServer(BaseProcess):
return return
try: try:
self._transport.signalProcess("KILL") self._transport.signalProcess("KILL")
except (OSError, error.ProcessExitedAlready): except: # pylint: disable=bare-except
pass pass

View File

@ -21,7 +21,9 @@ from serial.tools import miniterm
from platformio import exception, fs, util from platformio import exception, fs, util
from platformio.compat import dump_json_to_unicode from platformio.compat import dump_json_to_unicode
from platformio.managers.platform import PlatformFactory
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
from platformio.project.exception import NotPlatformIOProjectError
@click.group(short_help="Monitor device or list existing") @click.group(short_help="Monitor device or list existing")
@ -172,48 +174,49 @@ def device_list( # pylint: disable=too-many-branches
help="Load configuration from `platformio.ini` and specified environment", help="Load configuration from `platformio.ini` and specified environment",
) )
def device_monitor(**kwargs): # pylint: disable=too-many-branches def device_monitor(**kwargs): # pylint: disable=too-many-branches
env_options = {} click.echo(
"Looking for advanced Serial Monitor with UI? "
"Check http://bit.ly/pio-advanced-monitor"
)
project_options = {}
try: try:
with fs.cd(kwargs["project_dir"]): with fs.cd(kwargs["project_dir"]):
env_options = get_project_options(kwargs["environment"]) project_options = get_project_options(kwargs["environment"])
for k in ("port", "speed", "rts", "dtr"): kwargs = apply_project_monitor_options(kwargs, project_options)
k2 = "monitor_%s" % k except NotPlatformIOProjectError:
if k == "speed":
k = "baud"
if kwargs[k] is None and k2 in env_options:
kwargs[k] = env_options[k2]
if k != "port":
kwargs[k] = int(kwargs[k])
except exception.NotPlatformIOProject:
pass pass
if not kwargs["port"]: if not kwargs["port"]:
ports = util.get_serial_ports(filter_hwid=True) ports = util.get_serial_ports(filter_hwid=True)
if len(ports) == 1: if len(ports) == 1:
kwargs["port"] = ports[0]["port"] kwargs["port"] = ports[0]["port"]
elif "platform" in project_options and "board" in project_options:
sys.argv = ["monitor"] + env_options.get("monitor_flags", []) board_hwids = get_board_hwids(
for k, v in kwargs.items(): kwargs["project_dir"],
if k in ("port", "baud", "rts", "dtr", "environment", "project_dir"): project_options["platform"],
continue project_options["board"],
k = "--" + k.replace("_", "-") )
if k in env_options.get("monitor_flags", []): for item in ports:
continue for hwid in board_hwids:
if isinstance(v, bool): hwid_str = ("%s:%s" % (hwid[0], hwid[1])).replace("0x", "")
if v: if hwid_str in item["hwid"]:
sys.argv.append(k) kwargs["port"] = item["port"]
elif isinstance(v, tuple): break
for i in v: if kwargs["port"]:
sys.argv.extend([k, i]) break
else: elif kwargs["port"] and (set(["*", "?", "[", "]"]) & set(kwargs["port"])):
sys.argv.extend([k, str(v)])
if kwargs["port"] and (set(["*", "?", "[", "]"]) & set(kwargs["port"])):
for item in util.get_serial_ports(): for item in util.get_serial_ports():
if fnmatch(item["port"], kwargs["port"]): if fnmatch(item["port"], kwargs["port"]):
kwargs["port"] = item["port"] kwargs["port"] = item["port"]
break break
# override system argv with patched options
sys.argv = ["monitor"] + options_to_argv(
kwargs,
project_options,
ignore=("port", "baud", "rts", "dtr", "environment", "project_dir"),
)
try: try:
miniterm.main( miniterm.main(
default_port=kwargs["port"], default_port=kwargs["port"],
@ -225,6 +228,37 @@ def device_monitor(**kwargs): # pylint: disable=too-many-branches
raise exception.MinitermException(e) raise exception.MinitermException(e)
def apply_project_monitor_options(cli_options, project_options):
for k in ("port", "speed", "rts", "dtr"):
k2 = "monitor_%s" % k
if k == "speed":
k = "baud"
if cli_options[k] is None and k2 in project_options:
cli_options[k] = project_options[k2]
if k != "port":
cli_options[k] = int(cli_options[k])
return cli_options
def options_to_argv(cli_options, project_options, ignore=None):
result = project_options.get("monitor_flags", [])
for k, v in cli_options.items():
if v is None or (ignore and k in ignore):
continue
k = "--" + k.replace("_", "-")
if k in project_options.get("monitor_flags", []):
continue
if isinstance(v, bool):
if v:
result.append(k)
elif isinstance(v, tuple):
for i in v:
result.extend([k, i])
else:
result.extend([k, str(v)])
return result
def get_project_options(environment=None): def get_project_options(environment=None):
config = ProjectConfig.get_instance() config = ProjectConfig.get_instance()
config.validate(envs=[environment] if environment else None) config.validate(envs=[environment] if environment else None)
@ -235,3 +269,12 @@ def get_project_options(environment=None):
else: else:
environment = config.envs()[0] environment = config.envs()[0]
return config.items(env=environment, as_dict=True) return config.items(env=environment, as_dict=True)
def get_board_hwids(project_dir, platform, board):
with fs.cd(project_dir):
return (
PlatformFactory.newPlatform(platform)
.board_config(board)
.get("build.hwids", [])
)
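
A hedged usage sketch for the new options_to_argv() helper (the option values are made up, and the import assumes PlatformIO 4.2.0): CLI keyword arguments are converted back into miniterm-style argv, while any flag already pinned through monitor_flags in platformio.ini keeps its project value.

    from platformio.commands.device import options_to_argv

    cli_options = {"baud": 115200, "echo": True, "filter": ("colorize",), "port": None}
    project_options = {"monitor_flags": ["--encoding", "hexlify"]}

    argv = options_to_argv(cli_options, project_options, ignore=("port",))
    print(argv)
    # on Python 3.7+ dict ordering this gives:
    # ['--encoding', 'hexlify', '--baud', '115200', '--echo', '--filter', 'colorize']
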

View File

@ -21,6 +21,7 @@ from os.path import isdir
import click import click
from platformio import exception from platformio import exception
from platformio.compat import WINDOWS
from platformio.managers.core import get_core_package_dir, inject_contrib_pysite from platformio.managers.core import get_core_package_dir, inject_contrib_pysite
@ -87,15 +88,7 @@ def cli(port, host, no_open, shutdown_timeout):
if host == "__do_not_start__": if host == "__do_not_start__":
return return
# if already started already_started = is_port_used(host, port)
already_started = False
socket.setdefaulttimeout(1)
try:
socket.socket(socket.AF_INET, socket.SOCK_STREAM).connect((host, port))
already_started = True
except: # pylint: disable=bare-except
pass
home_url = "http://%s:%d" % (host, port) home_url = "http://%s:%d" % (host, port)
if not no_open: if not no_open:
if already_started: if already_started:
@ -116,12 +109,35 @@ def cli(port, host, no_open, shutdown_timeout):
) )
) )
click.echo("") click.echo("")
click.echo("Open PIO Home in your browser by this URL => %s" % home_url) click.echo("Open PlatformIO Home in your browser by this URL => %s" % home_url)
if already_started: if already_started:
click.secho(
"PlatformIO Home server is already started in another process.", fg="yellow"
)
return return
click.echo("PIO Home has been started. Press Ctrl+C to shutdown.") click.echo("PIO Home has been started. Press Ctrl+C to shutdown.")
reactor.listenTCP(port, site, interface=host) reactor.listenTCP(port, site, interface=host)
reactor.run() reactor.run()
def is_port_used(host, port):
socket.setdefaulttimeout(1)
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
if WINDOWS:
try:
s.bind((host, port))
s.close()
return False
except (OSError, socket.error):
pass
else:
try:
s.connect((host, port))
s.close()
except socket.error:
return False
return True
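
The connect()-probe idea behind is_port_used(), as a standalone sketch (the helper name, host, and port are illustrative and not part of the change; on Windows the hunk above prefers a bind() probe instead):

    import socket

    def port_has_listener(host, port, timeout=1.0):
        # a successful connect() means some process already accepts connections there
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.settimeout(timeout)
        try:
            s.connect((host, port))
            return True
        except socket.error:
            return False
        finally:
            s.close()

    print(port_has_listener("127.0.0.1", 8008))
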

View File

@ -23,7 +23,7 @@ from platformio.commands.home.rpc.handlers.os import OSRPC
class MiscRPC(object): class MiscRPC(object):
def load_latest_tweets(self, data_url): def load_latest_tweets(self, data_url):
cache_key = data_url cache_key = app.ContentCache.key_from_args(data_url, "tweets")
cache_valid = "7d" cache_valid = "7d"
with app.ContentCache() as cc: with app.ContentCache() as cc:
cache_data = cc.get(cache_key) cache_data = cc.get(cache_key)

View File

@ -14,12 +14,10 @@
from __future__ import absolute_import from __future__ import absolute_import
import codecs
import glob import glob
import os import os
import shutil import shutil
from functools import cmp_to_key from functools import cmp_to_key
from os.path import isdir, isfile, join
import click import click
from twisted.internet import defer # pylint: disable=import-error from twisted.internet import defer # pylint: disable=import-error
@ -67,10 +65,9 @@ class OSRPC(object):
def request_content(self, uri, data=None, headers=None, cache_valid=None): def request_content(self, uri, data=None, headers=None, cache_valid=None):
if uri.startswith("http"): if uri.startswith("http"):
return self.fetch_content(uri, data, headers, cache_valid) return self.fetch_content(uri, data, headers, cache_valid)
if not isfile(uri): if os.path.isfile(uri):
return None return fs.get_file_contents(uri, encoding="utf8")
with codecs.open(uri, encoding="utf-8") as fp: return None
return fp.read()
@staticmethod @staticmethod
def open_url(url): def open_url(url):
@ -88,16 +85,20 @@ class OSRPC(object):
@staticmethod @staticmethod
def is_file(path): def is_file(path):
return isfile(path) return os.path.isfile(path)
@staticmethod @staticmethod
def is_dir(path): def is_dir(path):
return isdir(path) return os.path.isdir(path)
@staticmethod @staticmethod
def make_dirs(path): def make_dirs(path):
return os.makedirs(path) return os.makedirs(path)
@staticmethod
def get_file_mtime(path):
return os.path.getmtime(path)
@staticmethod @staticmethod
def rename(src, dst): def rename(src, dst):
return os.rename(src, dst) return os.rename(src, dst)
@ -112,7 +113,7 @@ class OSRPC(object):
pathnames = [pathnames] pathnames = [pathnames]
result = set() result = set()
for pathname in pathnames: for pathname in pathnames:
result |= set(glob.glob(join(root, pathname) if root else pathname)) result |= set(glob.glob(os.path.join(root, pathname) if root else pathname))
return list(result) return list(result)
@staticmethod @staticmethod
@ -131,13 +132,13 @@ class OSRPC(object):
items = [] items = []
if path.startswith("~"): if path.startswith("~"):
path = fs.expanduser(path) path = fs.expanduser(path)
if not isdir(path): if not os.path.isdir(path):
return items return items
for item in os.listdir(path): for item in os.listdir(path):
try: try:
item_is_dir = isdir(join(path, item)) item_is_dir = os.path.isdir(os.path.join(path, item))
if item_is_dir: if item_is_dir:
os.listdir(join(path, item)) os.listdir(os.path.join(path, item))
items.append((item, item_is_dir)) items.append((item, item_is_dir))
except OSError: except OSError:
pass pass

View File

@ -17,7 +17,6 @@ from __future__ import absolute_import
import os import os
import shutil import shutil
import time import time
from os.path import basename, getmtime, isdir, isfile, join, realpath, sep
import jsonrpc # pylint: disable=import-error import jsonrpc # pylint: disable=import-error
@ -28,6 +27,7 @@ from platformio.compat import PY2, get_filesystem_encoding
from platformio.ide.projectgenerator import ProjectGenerator from platformio.ide.projectgenerator import ProjectGenerator
from platformio.managers.platform import PlatformManager from platformio.managers.platform import PlatformManager
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
from platformio.project.exception import ProjectError
from platformio.project.helpers import get_project_dir, is_platformio_project from platformio.project.helpers import get_project_dir, is_platformio_project
from platformio.project.options import get_config_options_schema from platformio.project.options import get_config_options_schema
@ -38,7 +38,7 @@ class ProjectRPC(object):
assert isinstance(init_kwargs, dict) assert isinstance(init_kwargs, dict)
assert "path" in init_kwargs assert "path" in init_kwargs
project_dir = get_project_dir() project_dir = get_project_dir()
if isfile(init_kwargs["path"]): if os.path.isfile(init_kwargs["path"]):
project_dir = os.path.dirname(init_kwargs["path"]) project_dir = os.path.dirname(init_kwargs["path"])
with fs.cd(project_dir): with fs.cd(project_dir):
return getattr(ProjectConfig(**init_kwargs), method)(*args) return getattr(ProjectConfig(**init_kwargs), method)(*args)
@ -74,7 +74,7 @@ class ProjectRPC(object):
return get_config_options_schema() return get_config_options_schema()
@staticmethod @staticmethod
def _get_projects(project_dirs=None): def get_projects():
def _get_project_data(): def _get_project_data():
data = {"boards": [], "envLibdepsDirs": [], "libExtraDirs": []} data = {"boards": [], "envLibdepsDirs": [], "libExtraDirs": []}
config = ProjectConfig() config = ProjectConfig()
@ -86,7 +86,7 @@ class ProjectRPC(object):
for section in config.sections(): for section in config.sections():
if not section.startswith("env:"): if not section.startswith("env:"):
continue continue
data["envLibdepsDirs"].append(join(libdeps_dir, section[4:])) data["envLibdepsDirs"].append(os.path.join(libdeps_dir, section[4:]))
if config.has_option(section, "board"): if config.has_option(section, "board"):
data["boards"].append(config.get(section, "board")) data["boards"].append(config.get(section, "board"))
data["libExtraDirs"].extend(config.get(section, "lib_extra_dirs", [])) data["libExtraDirs"].extend(config.get(section, "lib_extra_dirs", []))
@ -94,28 +94,27 @@ class ProjectRPC(object):
# skip non existing folders and resolve full path # skip non existing folders and resolve full path
for key in ("envLibdepsDirs", "libExtraDirs"): for key in ("envLibdepsDirs", "libExtraDirs"):
data[key] = [ data[key] = [
fs.expanduser(d) if d.startswith("~") else realpath(d) fs.expanduser(d) if d.startswith("~") else os.path.realpath(d)
for d in data[key] for d in data[key]
if isdir(d) if os.path.isdir(d)
] ]
return data return data
def _path_to_name(path): def _path_to_name(path):
return (sep).join(path.split(sep)[-2:]) return (os.path.sep).join(path.split(os.path.sep)[-2:])
if not project_dirs:
project_dirs = AppRPC.load_state()["storage"]["recentProjects"]
result = [] result = []
pm = PlatformManager() pm = PlatformManager()
for project_dir in project_dirs: for project_dir in AppRPC.load_state()["storage"]["recentProjects"]:
if not os.path.isdir(project_dir):
continue
data = {} data = {}
boards = [] boards = []
try: try:
with fs.cd(project_dir): with fs.cd(project_dir):
data = _get_project_data() data = _get_project_data()
except exception.PlatformIOProjectException: except ProjectError:
continue continue
for board_id in data.get("boards", []): for board_id in data.get("boards", []):
@ -130,12 +129,12 @@ class ProjectRPC(object):
{ {
"path": project_dir, "path": project_dir,
"name": _path_to_name(project_dir), "name": _path_to_name(project_dir),
"modified": int(getmtime(project_dir)), "modified": int(os.path.getmtime(project_dir)),
"boards": boards, "boards": boards,
"description": data.get("description"), "description": data.get("description"),
"envs": data.get("envs", []), "envs": data.get("envs", []),
"envLibStorages": [ "envLibStorages": [
{"name": basename(d), "path": d} {"name": os.path.basename(d), "path": d}
for d in data.get("envLibdepsDirs", []) for d in data.get("envLibdepsDirs", [])
], ],
"extraLibStorages": [ "extraLibStorages": [
@ -146,27 +145,24 @@ class ProjectRPC(object):
) )
return result return result
def get_projects(self, project_dirs=None):
return self._get_projects(project_dirs)
@staticmethod @staticmethod
def get_project_examples(): def get_project_examples():
result = [] result = []
for manifest in PlatformManager().get_installed(): for manifest in PlatformManager().get_installed():
examples_dir = join(manifest["__pkg_dir"], "examples") examples_dir = os.path.join(manifest["__pkg_dir"], "examples")
if not isdir(examples_dir): if not os.path.isdir(examples_dir):
continue continue
items = [] items = []
for project_dir, _, __ in os.walk(examples_dir): for project_dir, _, __ in os.walk(examples_dir):
project_description = None project_description = None
try: try:
config = ProjectConfig(join(project_dir, "platformio.ini")) config = ProjectConfig(os.path.join(project_dir, "platformio.ini"))
config.validate(silent=True) config.validate(silent=True)
project_description = config.get("platformio", "description") project_description = config.get("platformio", "description")
except exception.PlatformIOProjectException: except ProjectError:
continue continue
path_tokens = project_dir.split(sep) path_tokens = project_dir.split(os.path.sep)
items.append( items.append(
{ {
"name": "/".join( "name": "/".join(
@ -190,7 +186,7 @@ class ProjectRPC(object):
def init(self, board, framework, project_dir): def init(self, board, framework, project_dir):
assert project_dir assert project_dir
state = AppRPC.load_state() state = AppRPC.load_state()
if not isdir(project_dir): if not os.path.isdir(project_dir):
os.makedirs(project_dir) os.makedirs(project_dir)
args = ["init", "--board", board] args = ["init", "--board", board]
if framework: if framework:
@ -243,10 +239,10 @@ class ProjectRPC(object):
with fs.cd(project_dir): with fs.cd(project_dir):
config = ProjectConfig() config = ProjectConfig()
src_dir = config.get_optional_dir("src") src_dir = config.get_optional_dir("src")
main_path = join(src_dir, "main.cpp") main_path = os.path.join(src_dir, "main.cpp")
if isfile(main_path): if os.path.isfile(main_path):
return project_dir return project_dir
if not isdir(src_dir): if not os.path.isdir(src_dir):
os.makedirs(src_dir) os.makedirs(src_dir)
fs.write_file_contents(main_path, main_content.strip()) fs.write_file_contents(main_path, main_content.strip())
return project_dir return project_dir
@ -261,10 +257,10 @@ class ProjectRPC(object):
is_arduino_project = any( is_arduino_project = any(
[ [
isfile( os.path.isfile(
join( os.path.join(
arduino_project_dir, arduino_project_dir,
"%s.%s" % (basename(arduino_project_dir), ext), "%s.%s" % (os.path.basename(arduino_project_dir), ext),
) )
) )
for ext in ("ino", "pde") for ext in ("ino", "pde")
@ -276,10 +272,10 @@ class ProjectRPC(object):
) )
state = AppRPC.load_state() state = AppRPC.load_state()
project_dir = join( project_dir = os.path.join(
state["storage"]["projectsDir"], time.strftime("%y%m%d-%H%M%S-") + board state["storage"]["projectsDir"], time.strftime("%y%m%d-%H%M%S-") + board
) )
if not isdir(project_dir): if not os.path.isdir(project_dir):
os.makedirs(project_dir) os.makedirs(project_dir)
args = ["init", "--board", board] args = ["init", "--board", board]
args.extend(["--project-option", "framework = arduino"]) args.extend(["--project-option", "framework = arduino"])
@ -301,7 +297,7 @@ class ProjectRPC(object):
with fs.cd(project_dir): with fs.cd(project_dir):
config = ProjectConfig() config = ProjectConfig()
src_dir = config.get_optional_dir("src") src_dir = config.get_optional_dir("src")
if isdir(src_dir): if os.path.isdir(src_dir):
fs.rmtree(src_dir) fs.rmtree(src_dir)
shutil.copytree(arduino_project_dir, src_dir) shutil.copytree(arduino_project_dir, src_dir)
return project_dir return project_dir
@ -312,9 +308,9 @@ class ProjectRPC(object):
raise jsonrpc.exceptions.JSONRPCDispatchException( raise jsonrpc.exceptions.JSONRPCDispatchException(
code=4001, message="Not an PlatformIO project: %s" % project_dir code=4001, message="Not an PlatformIO project: %s" % project_dir
) )
new_project_dir = join( new_project_dir = os.path.join(
AppRPC.load_state()["storage"]["projectsDir"], AppRPC.load_state()["storage"]["projectsDir"],
time.strftime("%y%m%d-%H%M%S-") + basename(project_dir), time.strftime("%y%m%d-%H%M%S-") + os.path.basename(project_dir),
) )
shutil.copytree(project_dir, new_project_dir) shutil.copytree(project_dir, new_project_dir)

View File

@ -26,7 +26,7 @@ from platformio.commands import PlatformioCLI
from platformio.compat import dump_json_to_unicode from platformio.compat import dump_json_to_unicode
from platformio.managers.lib import LibraryManager, get_builtin_libs, is_builtin_lib from platformio.managers.lib import LibraryManager, get_builtin_libs, is_builtin_lib
from platformio.package.manifest.parser import ManifestParserFactory from platformio.package.manifest.parser import ManifestParserFactory
from platformio.package.manifest.schema import ManifestSchema, ManifestValidationError from platformio.package.manifest.schema import ManifestSchema
from platformio.proc import is_ci from platformio.proc import is_ci
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
from platformio.project.helpers import get_project_dir, is_platformio_project from platformio.project.helpers import get_project_dir, is_platformio_project
@ -495,11 +495,9 @@ def lib_register(config_url):
raise exception.InvalidLibConfURL(config_url) raise exception.InvalidLibConfURL(config_url)
# Validate manifest # Validate manifest
data, error = ManifestSchema(strict=False).load( ManifestSchema().load_manifest(
ManifestParserFactory.new_from_url(config_url).as_dict() ManifestParserFactory.new_from_url(config_url).as_dict()
) )
if error:
raise ManifestValidationError(error, data)
result = util.get_api_result("/lib/register", data=dict(config_url=config_url)) result = util.get_api_result("/lib/register", data=dict(config_url=config_url))
if "message" in result and result["message"]: if "message" in result and result["message"]:

View File

@ -20,6 +20,7 @@ from platformio import app, exception, util
from platformio.commands.boards import print_boards from platformio.commands.boards import print_boards
from platformio.compat import dump_json_to_unicode from platformio.compat import dump_json_to_unicode
from platformio.managers.platform import PlatformFactory, PlatformManager from platformio.managers.platform import PlatformFactory, PlatformManager
from platformio.package.pack import PackagePacker
@click.group(short_help="Platform Manager") @click.group(short_help="Platform Manager")
@ -298,14 +299,20 @@ def platform_show(platform, json_output): # pylint: disable=too-many-branches
@click.option("--with-package", multiple=True) @click.option("--with-package", multiple=True)
@click.option("--without-package", multiple=True) @click.option("--without-package", multiple=True)
@click.option("--skip-default-package", is_flag=True) @click.option("--skip-default-package", is_flag=True)
@click.option("--with-all-packages", is_flag=True)
@click.option( @click.option(
"-f", "-f",
"--force", "--force",
is_flag=True, is_flag=True,
help="Reinstall/redownload dev/platform and its packages if exist", help="Reinstall/redownload dev/platform and its packages if exist",
) )
def platform_install( def platform_install( # pylint: disable=too-many-arguments
platforms, with_package, without_package, skip_default_package, force platforms,
with_package,
without_package,
skip_default_package,
with_all_packages,
force,
): ):
pm = PlatformManager() pm = PlatformManager()
for platform in platforms: for platform in platforms:
@ -314,6 +321,7 @@ def platform_install(
with_packages=with_package, with_packages=with_package,
without_packages=without_package, without_packages=without_package,
skip_default_package=skip_default_package, skip_default_package=skip_default_package,
with_all_packages=with_all_packages,
force=force, force=force,
): ):
click.secho( click.secho(
@ -403,3 +411,13 @@ def platform_update( # pylint: disable=too-many-locals
click.echo() click.echo()
return True return True
@cli.command(
"pack", short_help="Create a tarball from development platform/tool package"
)
@click.argument("package", required=True, metavar="[source directory, tar.gz or zip]")
def platform_pack(package):
p = PackagePacker(package)
tarball_path = p.pack()
click.secho('Wrote a tarball to "%s"' % tarball_path, fg="green")
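
The same packing can be driven programmatically; a hedged sketch that mirrors the command body above (the source directory is illustrative):

    from platformio.package.pack import PackagePacker

    tarball_path = PackagePacker("./my-dev-platform").pack()
    print('Wrote a tarball to "%s"' % tarball_path)
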

View File

@ -14,19 +14,61 @@
# pylint: disable=too-many-arguments,too-many-locals, too-many-branches # pylint: disable=too-many-arguments,too-many-locals, too-many-branches
from os import getcwd, makedirs import os
from os.path import isdir, isfile, join
import click import click
from tabulate import tabulate
from platformio import exception, fs from platformio import exception, fs
from platformio.commands.platform import platform_install as cli_platform_install from platformio.commands.platform import platform_install as cli_platform_install
from platformio.ide.projectgenerator import ProjectGenerator from platformio.ide.projectgenerator import ProjectGenerator
from platformio.managers.platform import PlatformManager from platformio.managers.platform import PlatformManager
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
from platformio.project.exception import NotPlatformIOProjectError
from platformio.project.helpers import is_platformio_project from platformio.project.helpers import is_platformio_project
@click.group(short_help="Project Manager")
def cli():
pass
@cli.command("config", short_help="Show computed configuration")
@click.option(
"-d",
"--project-dir",
default=os.getcwd,
type=click.Path(
exists=True, file_okay=True, dir_okay=True, writable=True, resolve_path=True
),
)
@click.option("--json-output", is_flag=True)
def project_config(project_dir, json_output):
if not is_platformio_project(project_dir):
raise NotPlatformIOProjectError(project_dir)
with fs.cd(project_dir):
config = ProjectConfig.get_instance()
if json_output:
return click.echo(config.to_json())
click.echo(
"Computed project configuration for %s" % click.style(project_dir, fg="cyan")
)
for section, options in config.as_tuple():
click.echo()
click.secho(section, fg="cyan")
click.echo("-" * len(section))
click.echo(
tabulate(
[
(name, "=", "\n".join(value) if isinstance(value, list) else value)
for name, value in options
],
tablefmt="plain",
)
)
return None
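
A hedged sketch of what the command computes under the hood, using the same calls as above (it assumes it is run inside an existing PlatformIO project):

    import os

    from platformio import fs
    from platformio.project.config import ProjectConfig

    with fs.cd(os.getcwd()):
        config = ProjectConfig.get_instance()
        for section, options in config.as_tuple():
            print(section)
            for name, value in options:
                print("  %s = %s" % (name, value))
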
def validate_boards(ctx, param, value): # pylint: disable=W0613 def validate_boards(ctx, param, value): # pylint: disable=W0613
pm = PlatformManager() pm = PlatformManager()
for id_ in value: for id_ in value:
@ -40,11 +82,11 @@ def validate_boards(ctx, param, value): # pylint: disable=W0613
return value return value
@click.command("init", short_help="Initialize PlatformIO project or update existing") @cli.command("init", short_help="Initialize a project or update existing")
@click.option( @click.option(
"--project-dir", "--project-dir",
"-d", "-d",
default=getcwd, default=os.getcwd,
type=click.Path( type=click.Path(
exists=True, file_okay=False, dir_okay=True, writable=True, resolve_path=True exists=True, file_okay=False, dir_okay=True, writable=True, resolve_path=True
), ),
@ -55,7 +97,7 @@ def validate_boards(ctx, param, value): # pylint: disable=W0613
@click.option("--env-prefix", default="") @click.option("--env-prefix", default="")
@click.option("-s", "--silent", is_flag=True) @click.option("-s", "--silent", is_flag=True)
@click.pass_context @click.pass_context
def cli( def project_init(
ctx, # pylint: disable=R0913 ctx, # pylint: disable=R0913
project_dir, project_dir,
board, board,
@ -65,7 +107,7 @@ def cli(
silent, silent,
): ):
if not silent: if not silent:
if project_dir == getcwd(): if project_dir == os.getcwd():
click.secho("\nThe current working directory", fg="yellow", nl=False) click.secho("\nThe current working directory", fg="yellow", nl=False)
click.secho(" %s " % project_dir, fg="cyan", nl=False) click.secho(" %s " % project_dir, fg="cyan", nl=False)
click.secho("will be used for the project.", fg="yellow") click.secho("will be used for the project.", fg="yellow")
@ -137,16 +179,16 @@ def init_base_project(project_dir):
(config.get_optional_dir("test"), init_test_readme), (config.get_optional_dir("test"), init_test_readme),
] ]
for (path, cb) in dir_to_readme: for (path, cb) in dir_to_readme:
if isdir(path): if os.path.isdir(path):
continue continue
makedirs(path) os.makedirs(path)
if cb: if cb:
cb(path) cb(path)
def init_include_readme(include_dir): def init_include_readme(include_dir):
fs.write_file_contents( fs.write_file_contents(
join(include_dir, "README"), os.path.join(include_dir, "README"),
""" """
This directory is intended for project header files. This directory is intended for project header files.
@ -193,7 +235,7 @@ https://gcc.gnu.org/onlinedocs/cpp/Header-Files.html
def init_lib_readme(lib_dir): def init_lib_readme(lib_dir):
# pylint: disable=line-too-long # pylint: disable=line-too-long
fs.write_file_contents( fs.write_file_contents(
join(lib_dir, "README"), os.path.join(lib_dir, "README"),
""" """
This directory is intended for project specific (private) libraries. This directory is intended for project specific (private) libraries.
PlatformIO will compile them to static libraries and link into executable file. PlatformIO will compile them to static libraries and link into executable file.
@ -246,7 +288,7 @@ More information about PlatformIO Library Dependency Finder
def init_test_readme(test_dir): def init_test_readme(test_dir):
fs.write_file_contents( fs.write_file_contents(
join(test_dir, "README"), os.path.join(test_dir, "README"),
""" """
This directory is intended for PIO Unit Testing and project tests. This directory is intended for PIO Unit Testing and project tests.
@ -263,8 +305,8 @@ More information about PIO Unit Testing:
def init_ci_conf(project_dir): def init_ci_conf(project_dir):
conf_path = join(project_dir, ".travis.yml") conf_path = os.path.join(project_dir, ".travis.yml")
if isfile(conf_path): if os.path.isfile(conf_path):
return return
fs.write_file_contents( fs.write_file_contents(
conf_path, conf_path,
@ -340,8 +382,8 @@ def init_ci_conf(project_dir):
def init_cvs_ignore(project_dir): def init_cvs_ignore(project_dir):
conf_path = join(project_dir, ".gitignore") conf_path = os.path.join(project_dir, ".gitignore")
if isfile(conf_path): if os.path.isfile(conf_path):
return return
fs.write_file_contents(conf_path, ".pio\n") fs.write_file_contents(conf_path, ".pio\n")
@ -349,7 +391,9 @@ def init_cvs_ignore(project_dir):
def fill_project_envs( def fill_project_envs(
ctx, project_dir, board_ids, project_option, env_prefix, force_download ctx, project_dir, board_ids, project_option, env_prefix, force_download
): ):
config = ProjectConfig(join(project_dir, "platformio.ini"), parse_extra=False) config = ProjectConfig(
os.path.join(project_dir, "platformio.ini"), parse_extra=False
)
used_boards = [] used_boards = []
for section in config.sections(): for section in config.sections():
cond = [section.startswith("env:"), config.has_option(section, "board")] cond = [section.startswith("env:"), config.has_option(section, "board")]

View File

@ -12,18 +12,18 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import os
import sys import sys
import threading import threading
from os import getcwd
from os.path import isfile, join
from tempfile import mkdtemp from tempfile import mkdtemp
from time import sleep from time import sleep
import click import click
from platformio import exception, fs from platformio import exception, fs
from platformio.commands.device import device_monitor as cmd_device_monitor from platformio.commands import device
from platformio.managers.core import pioplus_call from platformio.managers.core import pioplus_call
from platformio.project.exception import NotPlatformIOProjectError
# pylint: disable=unused-argument # pylint: disable=unused-argument
@ -83,7 +83,7 @@ def remote_update(only_check, dry_run):
@click.option( @click.option(
"-d", "-d",
"--project-dir", "--project-dir",
default=getcwd, default=os.getcwd,
type=click.Path( type=click.Path(
exists=True, file_okay=True, dir_okay=True, writable=True, resolve_path=True exists=True, file_okay=True, dir_okay=True, writable=True, resolve_path=True
), ),
@ -104,7 +104,7 @@ def remote_run(**kwargs):
@click.option( @click.option(
"-d", "-d",
"--project-dir", "--project-dir",
default=getcwd, default=os.getcwd,
type=click.Path( type=click.Path(
exists=True, file_okay=False, dir_okay=True, writable=True, resolve_path=True exists=True, file_okay=False, dir_okay=True, writable=True, resolve_path=True
), ),
@ -130,9 +130,7 @@ def device_list(json_output):
@remote_device.command("monitor", short_help="Monitor remote device") @remote_device.command("monitor", short_help="Monitor remote device")
@click.option("--port", "-p", help="Port, a number or a device name") @click.option("--port", "-p", help="Port, a number or a device name")
@click.option( @click.option("--baud", "-b", type=int, help="Set baud rate, default=9600")
"--baud", "-b", type=int, default=9600, help="Set baud rate, default=9600"
)
@click.option( @click.option(
"--parity", "--parity",
default="N", default="N",
@ -183,25 +181,49 @@ def device_list(json_output):
is_flag=True, is_flag=True,
help="Diagnostics: suppress non-error messages, default=Off", help="Diagnostics: suppress non-error messages, default=Off",
) )
@click.option(
"-d",
"--project-dir",
default=os.getcwd,
type=click.Path(exists=True, file_okay=False, dir_okay=True, resolve_path=True),
)
@click.option(
"-e",
"--environment",
help="Load configuration from `platformio.ini` and specified environment",
)
@click.pass_context @click.pass_context
def device_monitor(ctx, **kwargs): def device_monitor(ctx, **kwargs):
project_options = {}
try:
with fs.cd(kwargs["project_dir"]):
project_options = device.get_project_options(kwargs["environment"])
kwargs = device.apply_project_monitor_options(kwargs, project_options)
except NotPlatformIOProjectError:
pass
kwargs["baud"] = kwargs["baud"] or 9600
def _tx_target(sock_dir): def _tx_target(sock_dir):
pioplus_argv = ["remote", "device", "monitor"]
pioplus_argv.extend(device.options_to_argv(kwargs, project_options))
pioplus_argv.extend(["--sock", sock_dir])
try: try:
pioplus_call(sys.argv[1:] + ["--sock", sock_dir]) pioplus_call(pioplus_argv)
except exception.ReturnErrorCode: except exception.ReturnErrorCode:
pass pass
sock_dir = mkdtemp(suffix="pioplus") sock_dir = mkdtemp(suffix="pioplus")
sock_file = join(sock_dir, "sock") sock_file = os.path.join(sock_dir, "sock")
try: try:
t = threading.Thread(target=_tx_target, args=(sock_dir,)) t = threading.Thread(target=_tx_target, args=(sock_dir,))
t.start() t.start()
while t.is_alive() and not isfile(sock_file): while t.is_alive() and not os.path.isfile(sock_file):
sleep(0.1) sleep(0.1)
if not t.is_alive(): if not t.is_alive():
return return
kwargs["port"] = fs.get_file_contents(sock_file) kwargs["port"] = fs.get_file_contents(sock_file)
ctx.invoke(cmd_device_monitor, **kwargs) ctx.invoke(device.device_monitor, **kwargs)
t.join(2) t.join(2)
finally: finally:
fs.rmtree(sock_dir) fs.rmtree(sock_dir)
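
A hedged standalone sketch of the wait-for-socket-file handoff used above: a worker thread (standing in for the PIO Plus agent) writes the port into a sock file while the main thread polls for it. All paths, timings, and the port value are illustrative.

    import os
    import tempfile
    import threading
    import time

    sock_dir = tempfile.mkdtemp(suffix="pioplus")
    sock_file = os.path.join(sock_dir, "sock")

    def worker():
        time.sleep(0.3)               # stands in for the remote agent starting up
        with open(sock_file, "w") as fp:
            fp.write("/dev/ttyUSB0")  # illustrative port value

    t = threading.Thread(target=worker)
    t.start()
    while t.is_alive() and not os.path.isfile(sock_file):
        time.sleep(0.1)
    if os.path.isfile(sock_file):
        with open(sock_file) as fp:
            print("forwarded port:", fp.read())
    t.join(2)
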

View File

@ -16,6 +16,7 @@ from platformio import exception, telemetry
from platformio.commands.platform import platform_install as cmd_platform_install from platformio.commands.platform import platform_install as cmd_platform_install
from platformio.commands.test.processor import CTX_META_TEST_RUNNING_NAME from platformio.commands.test.processor import CTX_META_TEST_RUNNING_NAME
from platformio.managers.platform import PlatformFactory from platformio.managers.platform import PlatformFactory
from platformio.project.exception import UndefinedEnvPlatformError
# pylint: disable=too-many-instance-attributes # pylint: disable=too-many-instance-attributes
@ -56,12 +57,12 @@ class EnvironmentProcessor(object):
def process(self): def process(self):
if "platform" not in self.options: if "platform" not in self.options:
raise exception.UndefinedEnvPlatform(self.name) raise UndefinedEnvPlatformError(self.name)
build_vars = self.get_build_variables() build_vars = self.get_build_variables()
build_targets = list(self.get_build_targets()) build_targets = list(self.get_build_targets())
telemetry.on_run_environment(self.options, build_targets) telemetry.send_run_environment(self.options, build_targets)
# skip monitor target, we call it above # skip monitor target, we call it above
if "monitor" in build_targets: if "monitor" in build_targets:

View File

@ -107,7 +107,8 @@ def cli( # pylint: disable=redefined-builtin
raise exception.TestDirNotExists(test_dir) raise exception.TestDirNotExists(test_dir)
test_names = get_test_names(test_dir) test_names = get_test_names(test_dir)
click.echo("Verbose mode can be enabled via `-v, --verbose` option") if not verbose:
click.echo("Verbose mode can be enabled via `-v, --verbose` option")
click.secho("Collected %d items" % len(test_names), bold=True) click.secho("Collected %d items" % len(test_names), bold=True)
results = [] results = []
@ -159,6 +160,7 @@ def cli( # pylint: disable=redefined-builtin
monitor_rts=monitor_rts, monitor_rts=monitor_rts,
monitor_dtr=monitor_dtr, monitor_dtr=monitor_dtr,
verbose=verbose, verbose=verbose,
silent=not verbose,
), ),
) )
result = { result = {

View File

@ -46,7 +46,7 @@ class EmbeddedTestProcessor(TestProcessorBase):
return False return False
if self.options["without_testing"]: if self.options["without_testing"]:
return None return True
self.print_progress("Testing...") self.print_progress("Testing...")
return self.run() return self.run()

View File

@ -119,7 +119,8 @@ class TestProcessorBase(object):
cmd_run, cmd_run,
project_dir=self.options["project_dir"], project_dir=self.options["project_dir"],
upload_port=self.options["upload_port"], upload_port=self.options["upload_port"],
silent=not self.options["verbose"], verbose=self.options["verbose"],
silent=self.options["silent"],
environment=[self.env_name], environment=[self.env_name],
disable_auto_clean="nobuild" in target, disable_auto_clean="nobuild" in target,
target=target, target=target,

View File

@ -32,11 +32,14 @@ def get_filesystem_encoding():
def get_locale_encoding(): def get_locale_encoding():
return locale.getdefaultlocale()[1] try:
return locale.getdefaultlocale()[1]
except ValueError:
return None
def get_class_attributes(cls): def get_class_attributes(cls):
attributes = inspect.getmembers(cls, lambda a: not (inspect.isroutine(a))) attributes = inspect.getmembers(cls, lambda a: not inspect.isroutine(a))
return { return {
a[0]: a[1] a[0]: a[1]
for a in attributes for a in attributes

View File

@ -12,10 +12,12 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import hashlib
import io
import math
import sys
from email.utils import parsedate_tz from email.utils import parsedate_tz
from math import ceil
from os.path import getsize, join from os.path import getsize, join
from sys import version_info
from time import mktime from time import mktime
import click import click
@ -27,13 +29,9 @@ from platformio.exception import (
FDSizeMismatch, FDSizeMismatch,
FDUnrecognizedStatusCode, FDUnrecognizedStatusCode,
) )
from platformio.proc import exec_command
class FileDownloader(object): class FileDownloader(object):
CHUNK_SIZE = 1024
def __init__(self, url, dest_dir=None): def __init__(self, url, dest_dir=None):
self._request = None self._request = None
# make connection # make connection
@ -41,7 +39,7 @@ class FileDownloader(object):
url, url,
stream=True, stream=True,
headers=util.get_request_defheaders(), headers=util.get_request_defheaders(),
verify=version_info >= (2, 7, 9), verify=sys.version_info >= (2, 7, 9),
) )
if self._request.status_code != 200: if self._request.status_code != 200:
raise FDUnrecognizedStatusCode(self._request.status_code, url) raise FDUnrecognizedStatusCode(self._request.status_code, url)
@ -74,18 +72,19 @@ class FileDownloader(object):
return -1 return -1
return int(self._request.headers["content-length"]) return int(self._request.headers["content-length"])
def start(self, with_progress=True): def start(self, with_progress=True, silent=False):
label = "Downloading" label = "Downloading"
itercontent = self._request.iter_content(chunk_size=self.CHUNK_SIZE) itercontent = self._request.iter_content(chunk_size=io.DEFAULT_BUFFER_SIZE)
f = open(self._destination, "wb") f = open(self._destination, "wb")
try: try:
if not with_progress or self.get_size() == -1: if not with_progress or self.get_size() == -1:
click.echo("%s..." % label) if not silent:
click.echo("%s..." % label)
for chunk in itercontent: for chunk in itercontent:
if chunk: if chunk:
f.write(chunk) f.write(chunk)
else: else:
chunks = int(ceil(self.get_size() / float(self.CHUNK_SIZE))) chunks = int(math.ceil(self.get_size() / float(io.DEFAULT_BUFFER_SIZE)))
with click.progressbar(length=chunks, label=label) as pb: with click.progressbar(length=chunks, label=label) as pb:
for _ in pb: for _ in pb:
f.write(next(itercontent)) f.write(next(itercontent))
@ -102,25 +101,19 @@ class FileDownloader(object):
_dlsize = getsize(self._destination) _dlsize = getsize(self._destination)
if self.get_size() != -1 and _dlsize != self.get_size(): if self.get_size() != -1 and _dlsize != self.get_size():
raise FDSizeMismatch(_dlsize, self._fname, self.get_size()) raise FDSizeMismatch(_dlsize, self._fname, self.get_size())
if not sha1: if not sha1:
return None return None
dlsha1 = None checksum = hashlib.sha1()
try: with io.open(self._destination, "rb", buffering=0) as fp:
result = exec_command(["sha1sum", self._destination]) while True:
dlsha1 = result["out"] chunk = fp.read(io.DEFAULT_BUFFER_SIZE)
except (OSError, ValueError): if not chunk:
try: break
result = exec_command(["shasum", "-a", "1", self._destination]) checksum.update(chunk)
dlsha1 = result["out"]
except (OSError, ValueError): if sha1.lower() != checksum.hexdigest().lower():
pass raise FDSHASumMismatch(checksum.hexdigest(), self._fname, sha1)
if not dlsha1:
return None
dlsha1 = dlsha1[1:41] if dlsha1.startswith("\\") else dlsha1[:40]
if sha1.lower() != dlsha1.lower():
raise FDSHASumMismatch(dlsha1, self._fname, sha1)
return True return True
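
A hedged usage sketch for the downloader with the new hashlib-based verification; the module path assumes the 4.x layout (platformio.downloader), and the URL and checksum are placeholders rather than real artifacts:

    from platformio.downloader import FileDownloader

    fd = FileDownloader("https://example.com/pkg.tar.gz", dest_dir="/tmp")
    fd.start(with_progress=False, silent=True)
    # raises FDSHASumMismatch if the downloaded file's SHA-1 differs
    fd.verify(sha1="0123456789abcdef0123456789abcdef01234567")
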
def _preserve_filemtime(self, lmdate): def _preserve_filemtime(self, lmdate):

View File

@ -152,49 +152,6 @@ class FDSHASumMismatch(PlatformIOPackageException):
) )
#
# Project
#
class PlatformIOProjectException(PlatformioException):
pass
class NotPlatformIOProject(PlatformIOProjectException):
MESSAGE = (
"Not a PlatformIO project. `platformio.ini` file has not been "
"found in current working directory ({0}). To initialize new project "
"please use `platformio init` command"
)
class InvalidProjectConf(PlatformIOProjectException):
MESSAGE = "Invalid '{0}' (project configuration file): '{1}'"
class UndefinedEnvPlatform(PlatformIOProjectException):
MESSAGE = "Please specify platform for '{0}' environment"
class ProjectEnvsNotAvailable(PlatformIOProjectException):
MESSAGE = "Please setup environments in `platformio.ini` file"
class UnknownEnvNames(PlatformIOProjectException):
MESSAGE = "Unknown environment names '{0}'. Valid names are '{1}'"
class ProjectOptionValueError(PlatformIOProjectException):
MESSAGE = "{0} for option `{1}` in section [{2}]"
# #
# Library # Library
# #
@ -319,7 +276,7 @@ class UpgradeError(PlatformioException):
""" """
class HomeDirPermissionsError(PlatformioException): class HomeDirPermissionsError(UserSideException):
MESSAGE = ( MESSAGE = (
"The directory `{0}` or its parent directory is not owned by the " "The directory `{0}` or its parent directory is not owned by the "
@ -338,20 +295,6 @@ class CygwinEnvDetected(PlatformioException):
) )
class DebugSupportError(PlatformioException):
MESSAGE = (
"Currently, PlatformIO does not support debugging for `{0}`.\n"
"Please request support at https://github.com/platformio/"
"platformio-core/issues \nor visit -> https://docs.platformio.org"
"/page/plus/debugging.html"
)
class DebugInvalidOptions(PlatformioException):
pass
class TestDirNotExists(PlatformioException): class TestDirNotExists(PlatformioException):
MESSAGE = ( MESSAGE = (

View File

@ -40,7 +40,7 @@ class cd(object):
def get_source_dir(): def get_source_dir():
curpath = os.path.abspath(__file__) curpath = os.path.realpath(__file__)
if not os.path.isfile(curpath): if not os.path.isfile(curpath):
for p in sys.path: for p in sys.path:
if os.path.isfile(os.path.join(p, __file__)): if os.path.isfile(os.path.join(p, __file__)):
@ -49,9 +49,9 @@ def get_source_dir():
return os.path.dirname(curpath) return os.path.dirname(curpath)
def get_file_contents(path): def get_file_contents(path, encoding=None):
try: try:
with open(path) as fp: with io.open(path, encoding=encoding) as fp:
return fp.read() return fp.read()
except UnicodeDecodeError: except UnicodeDecodeError:
click.secho( click.secho(
@ -117,7 +117,7 @@ def ensure_udev_rules():
if not any(os.path.isfile(p) for p in installed_rules): if not any(os.path.isfile(p) for p in installed_rules):
raise exception.MissedUdevRules raise exception.MissedUdevRules
origin_path = os.path.abspath( origin_path = os.path.realpath(
os.path.join(get_source_dir(), "..", "scripts", "99-platformio-udev.rules") os.path.join(get_source_dir(), "..", "scripts", "99-platformio-udev.rules")
) )
if not os.path.isfile(origin_path): if not os.path.isfile(origin_path):
@ -143,10 +143,10 @@ def path_endswith_ext(path, extensions):
return False return False
def match_src_files(src_dir, src_filter=None, src_exts=None): def match_src_files(src_dir, src_filter=None, src_exts=None, followlinks=True):
def _append_build_item(items, item, src_dir): def _append_build_item(items, item, src_dir):
if not src_exts or path_endswith_ext(item, src_exts): if not src_exts or path_endswith_ext(item, src_exts):
items.add(item.replace(src_dir + os.sep, "")) items.add(os.path.relpath(item, src_dir))
src_filter = src_filter or "" src_filter = src_filter or ""
if isinstance(src_filter, (list, tuple)): if isinstance(src_filter, (list, tuple)):
@ -159,7 +159,7 @@ def match_src_files(src_dir, src_filter=None, src_exts=None):
items = set() items = set()
for item in glob(os.path.join(glob_escape(src_dir), pattern)): for item in glob(os.path.join(glob_escape(src_dir), pattern)):
if os.path.isdir(item): if os.path.isdir(item):
for root, _, files in os.walk(item, followlinks=True): for root, _, files in os.walk(item, followlinks=followlinks):
for f in files: for f in files:
_append_build_item(items, os.path.join(root, f), src_dir) _append_build_item(items, os.path.join(root, f), src_dir)
else: else:

View File

@ -12,10 +12,10 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import io import codecs
import os import os
import sys import sys
from os.path import abspath, basename, isdir, isfile, join, relpath from os.path import basename, isdir, isfile, join, realpath, relpath
import bottle import bottle
@ -64,7 +64,7 @@ class ProjectGenerator(object):
"project_name": basename(self.project_dir), "project_name": basename(self.project_dir),
"project_dir": self.project_dir, "project_dir": self.project_dir,
"env_name": self.env_name, "env_name": self.env_name,
"user_home_dir": abspath(fs.expanduser("~")), "user_home_dir": realpath(fs.expanduser("~")),
"platformio_path": sys.argv[0] "platformio_path": sys.argv[0]
if isfile(sys.argv[0]) if isfile(sys.argv[0])
else where_is_program("platformio"), else where_is_program("platformio"),
@ -129,18 +129,18 @@ class ProjectGenerator(object):
dst_dir = join(self.project_dir, tpl_relpath) dst_dir = join(self.project_dir, tpl_relpath)
if not isdir(dst_dir): if not isdir(dst_dir):
os.makedirs(dst_dir) os.makedirs(dst_dir)
file_name = basename(tpl_path)[:-4] file_name = basename(tpl_path)[:-4]
contents = self._render_tpl(tpl_path, tpl_vars) contents = self._render_tpl(tpl_path, tpl_vars)
self._merge_contents(join(dst_dir, file_name), contents) self._merge_contents(join(dst_dir, file_name), contents)
@staticmethod @staticmethod
def _render_tpl(tpl_path, tpl_vars): def _render_tpl(tpl_path, tpl_vars):
return bottle.template(fs.get_file_contents(tpl_path), **tpl_vars) with codecs.open(tpl_path, "r", encoding="utf8") as fp:
return bottle.SimpleTemplate(fp.read()).render(**tpl_vars)
@staticmethod @staticmethod
def _merge_contents(dst_path, contents): def _merge_contents(dst_path, contents):
if basename(dst_path) == ".gitignore" and isfile(dst_path): if basename(dst_path) == ".gitignore" and isfile(dst_path):
return return
with io.open(dst_path, "w", encoding="utf8") as fp: with codecs.open(dst_path, "w", encoding="utf8") as fp:
fp.write(contents) fp.write(contents)
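
A reduced sketch of the new template rendering path: read the template file as UTF-8
via codecs and hand it to bottle's SimpleTemplate (the template path and variables
here are examples):

    import codecs

    import bottle

    def render_template(tpl_path, **tpl_vars):
        # read explicitly as UTF-8 instead of the platform default encoding
        with codecs.open(tpl_path, "r", encoding="utf8") as fp:
            return bottle.SimpleTemplate(fp.read()).render(**tpl_vars)

    # render_template("launch.json.tpl", env_name="uno")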


@ -1,8 +1,8 @@
% _defines = " ".join(["-D%s" % d.replace(" ", "\\\\ ") for d in defines]) % _defines = " ".join(["-D%s" % d.replace(" ", "\\\\ ") for d in defines])
{ {
"execPath": "{{ cxx_path }}", "execPath": "{{ cxx_path }}",
"gccDefaultCFlags": "-fsyntax-only {{! cc_flags.replace(' -MMD ', ' ').replace('"', '\\"') }} {{ !_defines.replace('"', '\\"') }}", "gccDefaultCFlags": "-fsyntax-only {{! to_unix_path(cc_flags).replace(' -MMD ', ' ').replace('"', '\\"') }} {{ !_defines.replace('"', '\\"') }}",
"gccDefaultCppFlags": "-fsyntax-only {{! cxx_flags.replace(' -MMD ', ' ').replace('"', '\\"') }} {{ !_defines.replace('"', '\\"') }}", "gccDefaultCppFlags": "-fsyntax-only {{! to_unix_path(cxx_flags).replace(' -MMD ', ' ').replace('"', '\\"') }} {{ !_defines.replace('"', '\\"') }}",
"gccErrorLimit": 15, "gccErrorLimit": 15,
"gccIncludePaths": "{{ ','.join(includes) }}", "gccIncludePaths": "{{ ','.join(includes) }}",
"gccSuppressWarnings": false "gccSuppressWarnings": false


@ -6,6 +6,7 @@
# The `CMakeListsUser.txt` will not be overwritten by PlatformIO. # The `CMakeListsUser.txt` will not be overwritten by PlatformIO.
cmake_minimum_required(VERSION 3.2) cmake_minimum_required(VERSION 3.2)
project("{{project_name}}") project("{{project_name}}")
include(CMakeListsPrivate.txt) include(CMakeListsPrivate.txt)


@ -5,7 +5,7 @@
# please create `CMakeListsUser.txt` in the root of project. # please create `CMakeListsUser.txt` in the root of project.
# The `CMakeListsUser.txt` will not be overwritten by PlatformIO. # The `CMakeListsUser.txt` will not be overwritten by PlatformIO.
%from platformio.project.helpers import (load_project_ide_data) % from platformio.project.helpers import (load_project_ide_data)
% %
% import re % import re
% %
@ -22,10 +22,14 @@
% return path % return path
% end % end
% %
% def _escape(text):
% return to_unix_path(text).replace('"', '\\"')
% end
%
% envs = config.envs() % envs = config.envs()
% if len(envs) > 1: % if len(envs) > 1:
set(CMAKE_CONFIGURATION_TYPES "{{ ";".join(envs) }}" CACHE STRING "" FORCE) set(CMAKE_CONFIGURATION_TYPES "{{ ";".join(envs) }};" CACHE STRING "" FORCE)
% else: % else:
set(CMAKE_CONFIGURATION_TYPES "{{ env_name }}" CACHE STRING "" FORCE) set(CMAKE_CONFIGURATION_TYPES "{{ env_name }}" CACHE STRING "" FORCE)
% end % end
@ -37,8 +41,8 @@ set(SVD_PATH "{{ _normalize_path(svd_path) }}")
SET(CMAKE_C_COMPILER "{{ _normalize_path(cc_path) }}") SET(CMAKE_C_COMPILER "{{ _normalize_path(cc_path) }}")
SET(CMAKE_CXX_COMPILER "{{ _normalize_path(cxx_path) }}") SET(CMAKE_CXX_COMPILER "{{ _normalize_path(cxx_path) }}")
SET(CMAKE_CXX_FLAGS_DISTRIBUTION "{{cxx_flags}}") SET(CMAKE_CXX_FLAGS "{{ _normalize_path(to_unix_path(cxx_flags)) }}")
SET(CMAKE_C_FLAGS_DISTRIBUTION "{{cc_flags}}") SET(CMAKE_C_FLAGS "{{ _normalize_path(to_unix_path(cc_flags)) }}")
% STD_RE = re.compile(r"\-std=[a-z\+]+(\d+)") % STD_RE = re.compile(r"\-std=[a-z\+]+(\d+)")
% cc_stds = STD_RE.findall(cc_flags) % cc_stds = STD_RE.findall(cc_flags)


@ -13,6 +13,35 @@
% return to_unix_path(text).replace('"', '\\"') % return to_unix_path(text).replace('"', '\\"')
% end % end
% %
% def _escape_required(flag):
% return " " in flag and systype == "windows"
% end
%
% def _split_flags(flags):
% result = []
% i = 0
% flags = flags.strip()
% while i < len(flags):
% current_arg = []
% while i < len(flags) and flags[i] != " ":
% if flags[i] == '"':
% quotes_idx = flags.find('"', i + 1)
% current_arg.extend(flags[i + 1:quotes_idx])
% i = quotes_idx + 1
% else:
% current_arg.append(flags[i])
% i = i + 1
% end
% end
% arg = "".join(current_arg)
% if arg.strip():
% result.append(arg.strip())
% end
% i = i + 1
% end
% return result
% end
%
% cleaned_includes = [] % cleaned_includes = []
% for include in includes: % for include in includes:
% if "toolchain-" not in dirname(commonprefix([include, cc_path])) and isdir(include): % if "toolchain-" not in dirname(commonprefix([include, cc_path])) and isdir(include):
@ -55,17 +84,21 @@
% cc_stds = STD_RE.findall(cc_flags) % cc_stds = STD_RE.findall(cc_flags)
% cxx_stds = STD_RE.findall(cxx_flags) % cxx_stds = STD_RE.findall(cxx_flags)
% %
% # pass only architecture specific flags
% cc_m_flags = " ".join([f.strip() for f in cc_flags.split(" ") if f.strip().startswith("-m")])
%
% if cc_stds: % if cc_stds:
"cStandard": "c{{ cc_stds[-1] }}", "cStandard": "c{{ cc_stds[-1] }}",
% end % end
% if cxx_stds: % if cxx_stds:
"cppStandard": "c++{{ cxx_stds[-1] }}", "cppStandard": "c++{{ cxx_stds[-1] }}",
% end % end
"compilerPath": "\"{{cc_path}}\" {{! _escape(cc_m_flags) }}" "compilerPath": "{{ cc_path }}",
"compilerArgs": [
% for flag in [ '"%s"' % _escape(f) if _escape_required(f) else f for f in _split_flags(
% cc_flags) if f.startswith(("-m", "-i", "@"))]:
"{{ flag }}",
% end
""
]
} }
], ],
"version": 4 "version": 4
} }
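
The template above splits the raw compiler flag string into individual arguments while
honouring double quotes, then keeps only flags starting with "-m", "-i" or "@" for
"compilerArgs". A standalone version of the splitter, with a made-up input
(well-formed quoting assumed):

    def split_flags(flags):
        # split on spaces, but treat "quoted values" as a single argument
        result, i = [], 0
        flags = flags.strip()
        while i < len(flags):
            current = []
            while i < len(flags) and flags[i] != " ":
                if flags[i] == '"':
                    end = flags.find('"', i + 1)
                    current.append(flags[i + 1:end])
                    i = end + 1
                else:
                    current.append(flags[i])
                    i += 1
            arg = "".join(current)
            if arg.strip():
                result.append(arg.strip())
            i += 1
        return result

    print(split_flags('-mcpu=cortex-m3 -I "C:/My Tools/include" -Os'))
    # ['-mcpu=cortex-m3', '-I', 'C:/My Tools/include', '-Os']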


@ -1,7 +1,23 @@
% import json
% import os
% import re
%
% recommendations = set(["platformio.platformio-ide"])
% previous_json = os.path.join(project_dir, ".vscode", "extensions.json")
% if os.path.isfile(previous_json):
% fp = open(previous_json)
% contents = re.sub(r"^\s*//.*$", "", fp.read(), flags=re.M).strip()
% fp.close()
% if contents:
% recommendations |= set(json.loads(contents).get("recommendations", []))
% end
% end
{ {
// See http://go.microsoft.com/fwlink/?LinkId=827846 // See http://go.microsoft.com/fwlink/?LinkId=827846
// for the documentation about the extensions.json format // for the documentation about the extensions.json format
"recommendations": [ "recommendations": [
"platformio.platformio-ide" % for i, item in enumerate(sorted(recommendations)):
] "{{ item }}"{{ ("," if (i + 1) < len(recommendations) else "") }}
} % end
]
}
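
Roughly what the template above does, rewritten as a plain function: keep the
PlatformIO IDE extension plus whatever the user already listed, instead of overwriting
the file (the function name is illustrative):

    import json
    import os
    import re

    def merged_recommendations(project_dir):
        recommendations = set(["platformio.platformio-ide"])
        previous = os.path.join(project_dir, ".vscode", "extensions.json")
        if os.path.isfile(previous):
            with open(previous) as fp:
                # VSCode allows // comments in its JSON files; strip them first
                contents = re.sub(r"^\s*//.*$", "", fp.read(), flags=re.M).strip()
            if contents:
                recommendations |= set(json.loads(contents).get("recommendations", []))
        return sorted(recommendations)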


@ -12,8 +12,7 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from os import remove import os
from os.path import abspath, exists, getmtime
from time import sleep, time from time import sleep, time
from platformio import exception from platformio import exception
@ -45,15 +44,15 @@ class LockFile(object):
def __init__(self, path, timeout=LOCKFILE_TIMEOUT, delay=LOCKFILE_DELAY): def __init__(self, path, timeout=LOCKFILE_TIMEOUT, delay=LOCKFILE_DELAY):
self.timeout = timeout self.timeout = timeout
self.delay = delay self.delay = delay
self._lock_path = abspath(path) + ".lock" self._lock_path = os.path.realpath(path) + ".lock"
self._fp = None self._fp = None
def _lock(self): def _lock(self):
if not LOCKFILE_CURRENT_INTERFACE and exists(self._lock_path): if not LOCKFILE_CURRENT_INTERFACE and os.path.exists(self._lock_path):
# remove stale lock # remove stale lock
if time() - getmtime(self._lock_path) > 10: if time() - os.path.getmtime(self._lock_path) > 10:
try: try:
remove(self._lock_path) os.remove(self._lock_path)
except: # pylint: disable=bare-except except: # pylint: disable=bare-except
pass pass
else: else:
@ -93,9 +92,9 @@ class LockFile(object):
def release(self): def release(self):
self._unlock() self._unlock()
if exists(self._lock_path): if os.path.exists(self._lock_path):
try: try:
remove(self._lock_path) os.remove(self._lock_path)
except: # pylint: disable=bare-except except: # pylint: disable=bare-except
pass pass


@ -151,7 +151,7 @@ def after_upgrade(ctx):
"PlatformIO has been successfully upgraded to %s!\n" % __version__, "PlatformIO has been successfully upgraded to %s!\n" % __version__,
fg="green", fg="green",
) )
telemetry.on_event( telemetry.send_event(
category="Auto", category="Auto",
action="Upgrade", action="Upgrade",
label="%s > %s" % (last_version, __version__), label="%s > %s" % (last_version, __version__),
@ -315,7 +315,7 @@ def check_internal_updates(ctx, what):
ctx.invoke(cmd_lib_update, libraries=outdated_items) ctx.invoke(cmd_lib_update, libraries=outdated_items)
click.echo() click.echo()
telemetry.on_event(category="Auto", action="Update", label=what.title()) telemetry.send_event(category="Auto", action="Update", label=what.title())
click.echo("*" * terminal_width) click.echo("*" * terminal_width)
click.echo("") click.echo("")


@ -24,13 +24,14 @@ from platformio.proc import copy_pythonpath_to_osenv, get_pythonexe_path
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
CORE_PACKAGES = { CORE_PACKAGES = {
"contrib-piohome": "~3.0.0", "contrib-piohome": "~3.1.0",
"contrib-pysite": "~2.%d%d.0" % (sys.version_info[0], sys.version_info[1]), "contrib-pysite": "~2.%d%d.0" % (sys.version_info[0], sys.version_info[1]),
"tool-pioplus": "^2.5.8", "tool-pioplus": "^2.6.1",
"tool-unity": "~1.20403.0", "tool-unity": "~1.20500.0",
"tool-scons": "~2.20501.7" if PY2 else "~3.30101.0", "tool-scons": "~2.20501.7" if PY2 else "~3.30102.0",
"tool-cppcheck": "~1.189.0", "tool-cppcheck": "~1.189.0",
"tool-clangtidy": "^1.80000.0", "tool-clangtidy": "^1.80000.0",
"tool-pvs-studio": "~7.5.0",
} }
PIOPLUS_AUTO_UPDATES_MAX = 100 PIOPLUS_AUTO_UPDATES_MAX = 100


@ -23,7 +23,7 @@ import click
import semantic_version import semantic_version
from platformio import app, exception, util from platformio import app, exception, util
from platformio.compat import glob_escape, string_types from platformio.compat import glob_escape
from platformio.managers.package import BasePkgManager from platformio.managers.package import BasePkgManager
from platformio.managers.platform import PlatformFactory, PlatformManager from platformio.managers.platform import PlatformFactory, PlatformManager
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
@ -61,29 +61,6 @@ class LibraryManager(BasePkgManager):
return None return None
@staticmethod
def normalize_dependencies(dependencies):
if not dependencies:
return []
items = []
if isinstance(dependencies, dict):
if "name" in dependencies:
items.append(dependencies)
else:
for name, version in dependencies.items():
items.append({"name": name, "version": version})
elif isinstance(dependencies, list):
items = [d for d in dependencies if "name" in d]
for item in items:
for k in ("frameworks", "platforms"):
if k not in item or isinstance(k, list):
continue
if item[k] == "*":
del item[k]
elif isinstance(item[k], string_types):
item[k] = [i.strip() for i in item[k].split(",") if i.strip()]
return items
def max_satisfying_repo_version(self, versions, requirements=None): def max_satisfying_repo_version(self, versions, requirements=None):
def _cmp_dates(datestr1, datestr2): def _cmp_dates(datestr1, datestr2):
date1 = util.parse_date(datestr1) date1 = util.parse_date(datestr1)
@ -312,7 +289,7 @@ class LibraryManager(BasePkgManager):
click.secho("Installing dependencies", fg="yellow") click.secho("Installing dependencies", fg="yellow")
builtin_lib_storages = None builtin_lib_storages = None
for filters in self.normalize_dependencies(manifest["dependencies"]): for filters in manifest["dependencies"]:
assert "name" in filters assert "name" in filters
# avoid circle dependencies # avoid circle dependencies


@ -17,21 +17,19 @@ import json
import os import os
import re import re
import shutil import shutil
from os.path import abspath, basename, getsize, isdir, isfile, islink, join from os.path import basename, getsize, isdir, isfile, islink, join, realpath
from tempfile import mkdtemp from tempfile import mkdtemp
import click import click
import requests import requests
import semantic_version import semantic_version
from platformio import __version__, app, exception, fs, telemetry, util from platformio import __version__, app, exception, fs, util
from platformio.compat import hashlib_encode_data from platformio.compat import hashlib_encode_data
from platformio.downloader import FileDownloader from platformio.downloader import FileDownloader
from platformio.lockfile import LockFile from platformio.lockfile import LockFile
from platformio.package.manifest.parser import ( from platformio.package.exception import ManifestException
ManifestParserError, from platformio.package.manifest.parser import ManifestParserFactory
ManifestParserFactory,
)
from platformio.unpacker import FileUnpacker from platformio.unpacker import FileUnpacker
from platformio.vcsclient import VCSClientFactory from platformio.vcsclient import VCSClientFactory
@ -347,7 +345,7 @@ class PkgInstallerMixin(object):
try: try:
manifest = ManifestParserFactory.new_from_file(manifest_path).as_dict() manifest = ManifestParserFactory.new_from_file(manifest_path).as_dict()
except ManifestParserError: except ManifestException:
pass pass
if src_manifest: if src_manifest:
@ -364,7 +362,7 @@ class PkgInstallerMixin(object):
if "version" not in manifest: if "version" not in manifest:
manifest["version"] = "0.0.0" manifest["version"] = "0.0.0"
manifest["__pkg_dir"] = pkg_dir manifest["__pkg_dir"] = realpath(pkg_dir)
self.cache_set(cache_key, manifest) self.cache_set(cache_key, manifest)
return manifest return manifest
@ -423,7 +421,7 @@ class PkgInstallerMixin(object):
def get_package_by_dir(self, pkg_dir): def get_package_by_dir(self, pkg_dir):
for manifest in self.get_installed(): for manifest in self.get_installed():
if manifest["__pkg_dir"] == abspath(pkg_dir): if manifest["__pkg_dir"] == realpath(pkg_dir):
return manifest return manifest
return None return None
@ -439,6 +437,7 @@ class PkgInstallerMixin(object):
pkg_dir = None pkg_dir = None
pkgdata = None pkgdata = None
versions = None versions = None
last_exc = None
for versions in PackageRepoIterator(name, self.repositories): for versions in PackageRepoIterator(name, self.repositories):
pkgdata = self.max_satisfying_repo_version(versions, requirements) pkgdata = self.max_satisfying_repo_version(versions, requirements)
if not pkgdata: if not pkgdata:
@ -449,12 +448,15 @@ class PkgInstallerMixin(object):
) )
break break
except Exception as e: # pylint: disable=broad-except except Exception as e: # pylint: disable=broad-except
last_exc = e
click.secho("Warning! Package Mirror: %s" % e, fg="yellow") click.secho("Warning! Package Mirror: %s" % e, fg="yellow")
click.secho("Looking for another mirror...", fg="yellow") click.secho("Looking for another mirror...", fg="yellow")
if versions is None: if versions is None:
util.internet_on(raise_exception=True) util.internet_on(raise_exception=True)
raise exception.UnknownPackage(name) raise exception.UnknownPackage(
name + (". Error -> %s" % last_exc if last_exc else "")
)
if not pkgdata: if not pkgdata:
raise exception.UndefinedPackageVersion( raise exception.UndefinedPackageVersion(
requirements or "latest", util.get_systype() requirements or "latest", util.get_systype()
@ -656,7 +658,7 @@ class BasePkgManager(PkgRepoMixin, PkgInstallerMixin):
def install( def install(
self, name, requirements=None, silent=False, after_update=False, force=False self, name, requirements=None, silent=False, after_update=False, force=False
): ): # pylint: disable=unused-argument
pkg_dir = None pkg_dir = None
# interprocess lock # interprocess lock
with LockFile(self.package_dir): with LockFile(self.package_dir):
@ -705,13 +707,6 @@ class BasePkgManager(PkgRepoMixin, PkgInstallerMixin):
manifest = self.load_manifest(pkg_dir) manifest = self.load_manifest(pkg_dir)
assert manifest assert manifest
if not after_update:
telemetry.on_event(
category=self.__class__.__name__,
action="Install",
label=manifest["name"],
)
click.secho( click.secho(
"{name} @ {version} has been successfully installed!".format( "{name} @ {version} has been successfully installed!".format(
**manifest **manifest
@ -721,7 +716,9 @@ class BasePkgManager(PkgRepoMixin, PkgInstallerMixin):
return pkg_dir return pkg_dir
def uninstall(self, package, requirements=None, after_update=False): def uninstall(
self, package, requirements=None, after_update=False
): # pylint: disable=unused-argument
# interprocess lock # interprocess lock
with LockFile(self.package_dir): with LockFile(self.package_dir):
self.cache_reset() self.cache_reset()
@ -760,13 +757,6 @@ class BasePkgManager(PkgRepoMixin, PkgInstallerMixin):
click.echo("[%s]" % click.style("OK", fg="green")) click.echo("[%s]" % click.style("OK", fg="green"))
if not after_update:
telemetry.on_event(
category=self.__class__.__name__,
action="Uninstall",
label=manifest["name"],
)
return True return True
def update(self, package, requirements=None, only_check=False): def update(self, package, requirements=None, only_check=False):
@ -815,9 +805,6 @@ class BasePkgManager(PkgRepoMixin, PkgInstallerMixin):
self.uninstall(pkg_dir, after_update=True) self.uninstall(pkg_dir, after_update=True)
self.install(name, latest, after_update=True) self.install(name, latest, after_update=True)
telemetry.on_event(
category=self.__class__.__name__, action="Update", label=manifest["name"]
)
return True return True


@ -17,13 +17,18 @@
import base64 import base64
import os import os
import re import re
import subprocess
import sys import sys
from os.path import basename, dirname, isdir, isfile, join from os.path import basename, dirname, isdir, isfile, join
import click import click
import semantic_version import semantic_version
from platformio import __version__, app, exception, fs, proc, util from platformio import __version__, app, exception, fs, proc, telemetry, util
from platformio.commands.debug.exception import (
DebugInvalidOptionsError,
DebugSupportError,
)
from platformio.compat import PY2, hashlib_encode_data, is_bytes, load_python_module from platformio.compat import PY2, hashlib_encode_data, is_bytes, load_python_module
from platformio.managers.core import get_core_package_dir from platformio.managers.core import get_core_package_dir
from platformio.managers.package import BasePkgManager, PackageManager from platformio.managers.package import BasePkgManager, PackageManager
@ -69,6 +74,7 @@ class PlatformManager(BasePkgManager):
with_packages=None, with_packages=None,
without_packages=None, without_packages=None,
skip_default_package=False, skip_default_package=False,
with_all_packages=False,
after_update=False, after_update=False,
silent=False, silent=False,
force=False, force=False,
@ -79,9 +85,14 @@ class PlatformManager(BasePkgManager):
) )
p = PlatformFactory.newPlatform(platform_dir) p = PlatformFactory.newPlatform(platform_dir)
if with_all_packages:
with_packages = list(p.packages.keys())
# don't cleanup packages or install them after update # don't cleanup packages or install them after update
# we check packages for updates in def update() # we check packages for updates in def update()
if after_update: if after_update:
p.install_python_packages()
p.on_installed()
return True return True
p.install_packages( p.install_packages(
@ -91,6 +102,8 @@ class PlatformManager(BasePkgManager):
silent=silent, silent=silent,
force=force, force=force,
) )
p.install_python_packages()
p.on_installed()
return self.cleanup_packages(list(p.packages)) return self.cleanup_packages(list(p.packages))
def uninstall(self, package, requirements=None, after_update=False): def uninstall(self, package, requirements=None, after_update=False):
@ -105,6 +118,8 @@ class PlatformManager(BasePkgManager):
p = PlatformFactory.newPlatform(pkg_dir) p = PlatformFactory.newPlatform(pkg_dir)
BasePkgManager.uninstall(self, pkg_dir, requirements) BasePkgManager.uninstall(self, pkg_dir, requirements)
p.uninstall_python_packages()
p.on_uninstalled()
# don't cleanup packages or install them after update # don't cleanup packages or install them after update
# we check packages for updates in def update() # we check packages for updates in def update()
@ -590,6 +605,10 @@ class PlatformBase(PlatformPackagesMixin, PlatformRunMixin):
packages[name].update({"version": version.strip(), "optional": False}) packages[name].update({"version": version.strip(), "optional": False})
return packages return packages
@property
def python_packages(self):
return self._manifest.get("pythonPackages")
def get_dir(self): def get_dir(self):
return dirname(self.manifest_path) return dirname(self.manifest_path)
@ -695,6 +714,45 @@ class PlatformBase(PlatformPackagesMixin, PlatformRunMixin):
return [dict(name=name, path=path) for path, name in storages.items()] return [dict(name=name, path=path) for path, name in storages.items()]
def on_installed(self):
pass
def on_uninstalled(self):
pass
def install_python_packages(self):
if not self.python_packages:
return None
click.echo(
"Installing Python packages: %s"
% ", ".join(list(self.python_packages.keys())),
)
args = [proc.get_pythonexe_path(), "-m", "pip", "install", "--upgrade"]
for name, requirements in self.python_packages.items():
if any(c in requirements for c in ("<", ">", "=")):
args.append("%s%s" % (name, requirements))
else:
args.append("%s==%s" % (name, requirements))
try:
return subprocess.call(args) == 0
except Exception as e: # pylint: disable=broad-except
click.secho(
"Could not install Python packages -> %s" % e, fg="red", err=True
)
def uninstall_python_packages(self):
if not self.python_packages:
return
click.echo("Uninstalling Python packages")
args = [proc.get_pythonexe_path(), "-m", "pip", "uninstall", "--yes"]
args.extend(list(self.python_packages.keys()))
try:
subprocess.call(args) == 0
except Exception as e: # pylint: disable=broad-except
click.secho(
"Could not install Python packages -> %s" % e, fg="red", err=True
)
class PlatformBoardConfig(object): class PlatformBoardConfig(object):
def __init__(self, manifest_path): def __init__(self, manifest_path):
@ -799,11 +857,12 @@ class PlatformBoardConfig(object):
if tool_name == "custom": if tool_name == "custom":
return tool_name return tool_name
if not debug_tools: if not debug_tools:
raise exception.DebugSupportError(self._manifest["name"]) telemetry.send_event("Debug", "Request", self.id)
raise DebugSupportError(self._manifest["name"])
if tool_name: if tool_name:
if tool_name in debug_tools: if tool_name in debug_tools:
return tool_name return tool_name
raise exception.DebugInvalidOptions( raise DebugInvalidOptionsError(
"Unknown debug tool `%s`. Please use one of `%s` or `custom`" "Unknown debug tool `%s`. Please use one of `%s` or `custom`"
% (tool_name, ", ".join(sorted(list(debug_tools)))) % (tool_name, ", ".join(sorted(list(debug_tools))))
) )
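
A reduced sketch of the new Python-package handling: build a pip command from the
"pythonPackages" section of platform.json and run it (the package names and versions
below are examples; the real code resolves the interpreter via
proc.get_pythonexe_path rather than sys.executable):

    import subprocess
    import sys

    def install_python_packages(python_packages):
        # e.g. python_packages = {"pyserial": ">=3.4,<4", "esptool": "2.8"}
        if not python_packages:
            return None
        args = [sys.executable, "-m", "pip", "install", "--upgrade"]
        for name, requirements in python_packages.items():
            if any(c in requirements for c in ("<", ">", "=")):
                args.append("%s%s" % (name, requirements))
            else:
                args.append("%s==%s" % (name, requirements))
        return subprocess.call(args) == 0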


@ -15,7 +15,15 @@
from platformio.exception import PlatformioException from platformio.exception import PlatformioException
class ManifestException(PlatformioException): class PackageException(PlatformioException):
pass
class ManifestException(PackageException):
pass
class UnknownManifestError(ManifestException):
pass pass
@ -24,13 +32,14 @@ class ManifestParserError(ManifestException):
class ManifestValidationError(ManifestException): class ManifestValidationError(ManifestException):
def __init__(self, error, data): def __init__(self, messages, data, valid_data):
super(ManifestValidationError, self).__init__() super(ManifestValidationError, self).__init__()
self.error = error self.messages = messages
self.data = data self.data = data
self.valid_data = valid_data
def __str__(self): def __str__(self):
return ( return (
"Invalid manifest fields: %s. \nPlease check specification -> " "Invalid manifest fields: %s. \nPlease check specification -> "
"http://docs.platformio.org/page/librarymanager/config.html" % self.error "http://docs.platformio.org/page/librarymanager/config.html" % self.messages
) )


@ -12,15 +12,17 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import inspect
import json import json
import os import os
import re import re
import requests import requests
from platformio import util
from platformio.compat import get_class_attributes, string_types from platformio.compat import get_class_attributes, string_types
from platformio.fs import get_file_contents from platformio.fs import get_file_contents
from platformio.package.exception import ManifestParserError from platformio.package.exception import ManifestParserError, UnknownManifestError
from platformio.project.helpers import is_platformio_project from platformio.project.helpers import is_platformio_project
try: try:
@ -36,36 +38,36 @@ class ManifestFileType(object):
MODULE_JSON = "module.json" MODULE_JSON = "module.json"
PACKAGE_JSON = "package.json" PACKAGE_JSON = "package.json"
@classmethod
def items(cls):
return get_class_attributes(ManifestFileType)
@classmethod @classmethod
def from_uri(cls, uri): def from_uri(cls, uri):
if uri.endswith(".properties"): for t in sorted(cls.items().values()):
return ManifestFileType.LIBRARY_PROPERTIES if uri.endswith(t):
if uri.endswith("platform.json"): return t
return ManifestFileType.PLATFORM_JSON return None
if uri.endswith("module.json"):
return ManifestFileType.MODULE_JSON @classmethod
if uri.endswith("package.json"): def from_dir(cls, path):
return ManifestFileType.PACKAGE_JSON for t in sorted(cls.items().values()):
if uri.endswith("library.json"): if os.path.isfile(os.path.join(path, t)):
return ManifestFileType.LIBRARY_JSON return t
return None return None
class ManifestParserFactory(object): class ManifestParserFactory(object):
@staticmethod
def type_to_clsname(t):
t = t.replace(".", " ")
t = t.title()
return "%sManifestParser" % t.replace(" ", "")
@staticmethod @staticmethod
def new_from_file(path, remote_url=False): def new_from_file(path, remote_url=False):
if not path or not os.path.isfile(path): if not path or not os.path.isfile(path):
raise ManifestParserError("Manifest file does not exist %s" % path) raise UnknownManifestError("Manifest file does not exist %s" % path)
for t in get_class_attributes(ManifestFileType).values(): type_from_uri = ManifestFileType.from_uri(path)
if path.endswith(t): if not type_from_uri:
return ManifestParserFactory.new(get_file_contents(path), t, remote_url) raise UnknownManifestError("Unknown manifest file type %s" % path)
raise ManifestParserError("Unknown manifest file type %s" % path) return ManifestParserFactory.new(
get_file_contents(path, encoding="utf8"), type_from_uri, remote_url
)
@staticmethod @staticmethod
def new_from_dir(path, remote_url=None): def new_from_dir(path, remote_url=None):
@ -74,29 +76,23 @@ class ManifestParserFactory(object):
type_from_uri = ManifestFileType.from_uri(remote_url) if remote_url else None type_from_uri = ManifestFileType.from_uri(remote_url) if remote_url else None
if type_from_uri and os.path.isfile(os.path.join(path, type_from_uri)): if type_from_uri and os.path.isfile(os.path.join(path, type_from_uri)):
return ManifestParserFactory.new( return ManifestParserFactory.new(
get_file_contents(os.path.join(path, type_from_uri)), get_file_contents(os.path.join(path, type_from_uri), encoding="utf8"),
type_from_uri, type_from_uri,
remote_url=remote_url, remote_url=remote_url,
package_dir=path, package_dir=path,
) )
file_order = [ type_from_dir = ManifestFileType.from_dir(path)
ManifestFileType.PLATFORM_JSON, if not type_from_dir:
ManifestFileType.LIBRARY_JSON, raise UnknownManifestError(
ManifestFileType.LIBRARY_PROPERTIES, "Unknown manifest file type in %s directory" % path
ManifestFileType.MODULE_JSON,
ManifestFileType.PACKAGE_JSON,
]
for t in file_order:
if not os.path.isfile(os.path.join(path, t)):
continue
return ManifestParserFactory.new(
get_file_contents(os.path.join(path, t)),
t,
remote_url=remote_url,
package_dir=path,
) )
raise ManifestParserError("Unknown manifest file type in %s directory" % path) return ManifestParserFactory.new(
get_file_contents(os.path.join(path, type_from_dir), encoding="utf8"),
type_from_dir,
remote_url=remote_url,
package_dir=path,
)
@staticmethod @staticmethod
def new_from_url(remote_url): def new_from_url(remote_url):
@ -109,12 +105,18 @@ class ManifestParserFactory(object):
) )
@staticmethod @staticmethod
def new(contents, type, remote_url=None, package_dir=None): def new( # pylint: disable=redefined-builtin
# pylint: disable=redefined-builtin contents, type, remote_url=None, package_dir=None
clsname = ManifestParserFactory.type_to_clsname(type) ):
if clsname not in globals(): for _, cls in globals().items():
raise ManifestParserError("Unknown manifest file type %s" % clsname) if (
return globals()[clsname](contents, remote_url, package_dir) inspect.isclass(cls)
and issubclass(cls, BaseManifestParser)
and cls != BaseManifestParser
and cls.manifest_type == type
):
return cls(contents, remote_url, package_dir)
raise UnknownManifestError("Unknown manifest file type %s" % type)
class BaseManifestParser(object): class BaseManifestParser(object):
@ -125,6 +127,8 @@ class BaseManifestParser(object):
self._data = self.parse(contents) self._data = self.parse(contents)
except Exception as e: except Exception as e:
raise ManifestParserError("Could not parse manifest -> %s" % e) raise ManifestParserError("Could not parse manifest -> %s" % e)
self._data = self.normalize_repository(self._data)
self._data = self.parse_examples(self._data) self._data = self.parse_examples(self._data)
# remove None fields # remove None fields
@ -139,7 +143,7 @@ class BaseManifestParser(object):
return self._data return self._data
@staticmethod @staticmethod
def cleanup_author(author): def normalize_author(author):
assert isinstance(author, dict) assert isinstance(author, dict)
if author.get("email"): if author.get("email"):
author["email"] = re.sub(r"\s+[aA][tT]\s+", "@", author["email"]) author["email"] = re.sub(r"\s+[aA][tT]\s+", "@", author["email"])
@ -160,6 +164,22 @@ class BaseManifestParser(object):
email = raw[raw.index(ldel) + 1 : raw.index(rdel)] email = raw[raw.index(ldel) + 1 : raw.index(rdel)]
return (name.strip(), email.strip() if email else None) return (name.strip(), email.strip() if email else None)
@staticmethod
def normalize_repository(data):
url = (data.get("repository") or {}).get("url")
if not url or "://" not in url:
return data
url_attrs = urlparse(url)
if url_attrs.netloc not in ("github.com", "bitbucket.org", "gitlab.com"):
return data
url = "https://%s%s" % (url_attrs.netloc, url_attrs.path)
if url.endswith("/"):
url = url[:-1]
if not url.endswith(".git"):
url += ".git"
data["repository"]["url"] = url
return data
def parse_examples(self, data): def parse_examples(self, data):
examples = data.get("examples") examples = data.get("examples")
if ( if (
@ -167,8 +187,8 @@ class BaseManifestParser(object):
or not isinstance(examples, list) or not isinstance(examples, list)
or not all(isinstance(v, dict) for v in examples) or not all(isinstance(v, dict) for v in examples)
): ):
examples = None data["examples"] = None
if not examples and self.package_dir: if not data["examples"] and self.package_dir:
data["examples"] = self.parse_examples_from_dir(self.package_dir) data["examples"] = self.parse_examples_from_dir(self.package_dir)
if "examples" in data and not data["examples"]: if "examples" in data and not data["examples"]:
del data["examples"] del data["examples"]
@ -250,6 +270,8 @@ class BaseManifestParser(object):
class LibraryJsonManifestParser(BaseManifestParser): class LibraryJsonManifestParser(BaseManifestParser):
manifest_type = ManifestFileType.LIBRARY_JSON
def parse(self, contents): def parse(self, contents):
data = json.loads(contents) data = json.loads(contents)
data = self._process_renamed_fields(data) data = self._process_renamed_fields(data)
@ -265,6 +287,8 @@ class LibraryJsonManifestParser(BaseManifestParser):
data["platforms"] = self._parse_platforms(data["platforms"]) or None data["platforms"] = self._parse_platforms(data["platforms"]) or None
if "export" in data: if "export" in data:
data["export"] = self._parse_export(data["export"]) data["export"] = self._parse_export(data["export"])
if "dependencies" in data:
data["dependencies"] = self._parse_dependencies(data["dependencies"])
return data return data
@ -305,7 +329,7 @@ class LibraryJsonManifestParser(BaseManifestParser):
# normalize Union[dict, list] fields # normalize Union[dict, list] fields
if not isinstance(raw, list): if not isinstance(raw, list):
raw = [raw] raw = [raw]
return [self.cleanup_author(author) for author in raw] return [self.normalize_author(author) for author in raw]
@staticmethod @staticmethod
def _parse_platforms(raw): def _parse_platforms(raw):
@ -324,13 +348,37 @@ class LibraryJsonManifestParser(BaseManifestParser):
return None return None
result = {} result = {}
for k in ("include", "exclude"): for k in ("include", "exclude"):
if k not in raw: if not raw.get(k):
continue continue
result[k] = raw[k] if isinstance(raw[k], list) else [raw[k]] result[k] = raw[k] if isinstance(raw[k], list) else [raw[k]]
return result return result
@staticmethod
def _parse_dependencies(raw):
# compatibility with legacy dependency format
if isinstance(raw, dict) and "name" in raw:
raw = [raw]
if isinstance(raw, dict):
return [dict(name=name, version=version) for name, version in raw.items()]
if isinstance(raw, list):
for i, dependency in enumerate(raw):
assert isinstance(dependency, dict)
for k, v in dependency.items():
if k not in ("platforms", "frameworks", "authors"):
continue
if "*" in v:
del raw[i][k]
raw[i][k] = util.items_to_list(v)
return raw
raise ManifestParserError(
"Invalid dependencies format, should be list or dictionary"
)
class ModuleJsonManifestParser(BaseManifestParser): class ModuleJsonManifestParser(BaseManifestParser):
manifest_type = ManifestFileType.MODULE_JSON
def parse(self, contents): def parse(self, contents):
data = json.loads(contents) data = json.loads(contents)
data["frameworks"] = ["mbed"] data["frameworks"] = ["mbed"]
@ -352,7 +400,7 @@ class ModuleJsonManifestParser(BaseManifestParser):
name, email = self.parse_author_name_and_email(author) name, email = self.parse_author_name_and_email(author)
if not name: if not name:
continue continue
result.append(self.cleanup_author(dict(name=name, email=email))) result.append(self.normalize_author(dict(name=name, email=email)))
return result return result
@staticmethod @staticmethod
@ -363,10 +411,12 @@ class ModuleJsonManifestParser(BaseManifestParser):
class LibraryPropertiesManifestParser(BaseManifestParser): class LibraryPropertiesManifestParser(BaseManifestParser):
manifest_type = ManifestFileType.LIBRARY_PROPERTIES
def parse(self, contents): def parse(self, contents):
data = self._parse_properties(contents) data = self._parse_properties(contents)
repository = self._parse_repository(data) repository = self._parse_repository(data)
homepage = data.get("url") homepage = data.get("url") or None
if repository and repository["url"] == homepage: if repository and repository["url"] == homepage:
homepage = None homepage = None
data.update( data.update(
@ -383,6 +433,8 @@ class LibraryPropertiesManifestParser(BaseManifestParser):
if "author" in data: if "author" in data:
data["authors"] = self._parse_authors(data) data["authors"] = self._parse_authors(data)
del data["author"] del data["author"]
if "depends" in data:
data["dependencies"] = self._parse_dependencies(data["depends"])
return data return data
@staticmethod @staticmethod
@ -451,7 +503,7 @@ class LibraryPropertiesManifestParser(BaseManifestParser):
name, email = self.parse_author_name_and_email(author) name, email = self.parse_author_name_and_email(author)
if not name: if not name:
continue continue
authors.append(self.cleanup_author(dict(name=name, email=email))) authors.append(self.normalize_author(dict(name=name, email=email)))
for author in properties.get("maintainer", "").split(","): for author in properties.get("maintainer", "").split(","):
name, email = self.parse_author_name_and_email(author) name, email = self.parse_author_name_and_email(author)
if not name: if not name:
@ -462,31 +514,29 @@ class LibraryPropertiesManifestParser(BaseManifestParser):
continue continue
found = True found = True
item["maintainer"] = True item["maintainer"] = True
if not item.get("email"): if not item.get("email") and email:
item["email"] = email item["email"] = email
if not found: if not found:
authors.append( authors.append(
self.cleanup_author(dict(name=name, email=email, maintainer=True)) self.normalize_author(dict(name=name, email=email, maintainer=True))
) )
return authors return authors
def _parse_repository(self, properties): def _parse_repository(self, properties):
if self.remote_url: if self.remote_url:
repo_parse = urlparse(self.remote_url) url_attrs = urlparse(self.remote_url)
repo_path_tokens = repo_parse.path[1:].split("/")[:-1] repo_path_tokens = url_attrs.path[1:].split("/")[:-1]
if "github" in repo_parse.netloc: if "github" in url_attrs.netloc:
return dict( return dict(
type="git", type="git",
url="%s://github.com/%s" url="https://github.com/" + "/".join(repo_path_tokens[:2]),
% (repo_parse.scheme, "/".join(repo_path_tokens[:2])),
) )
if "raw" in repo_path_tokens: if "raw" in repo_path_tokens:
return dict( return dict(
type="git", type="git",
url="%s://%s/%s" url="https://%s/%s"
% ( % (
repo_parse.scheme, url_attrs.netloc,
repo_parse.netloc,
"/".join(repo_path_tokens[: repo_path_tokens.index("raw")]), "/".join(repo_path_tokens[: repo_path_tokens.index("raw")]),
), ),
) )
@ -498,9 +548,9 @@ class LibraryPropertiesManifestParser(BaseManifestParser):
result = {"exclude": ["extras", "docs", "tests", "test", "*.doxyfile", "*.pdf"]} result = {"exclude": ["extras", "docs", "tests", "test", "*.doxyfile", "*.pdf"]}
include = None include = None
if self.remote_url: if self.remote_url:
repo_parse = urlparse(self.remote_url) url_attrs = urlparse(self.remote_url)
repo_path_tokens = repo_parse.path[1:].split("/")[:-1] repo_path_tokens = url_attrs.path[1:].split("/")[:-1]
if "github" in repo_parse.netloc: if "github" in url_attrs.netloc:
include = "/".join(repo_path_tokens[3:]) or None include = "/".join(repo_path_tokens[3:]) or None
elif "raw" in repo_path_tokens: elif "raw" in repo_path_tokens:
include = ( include = (
@ -511,12 +561,36 @@ class LibraryPropertiesManifestParser(BaseManifestParser):
result["include"] = [include] result["include"] = [include]
return result return result
@staticmethod
def _parse_dependencies(raw):
result = []
for item in raw.split(","):
item = item.strip()
if not item:
continue
if item.endswith(")") and "(" in item:
name, version = item.split("(")
result.append(
dict(
name=name.strip(),
version=version[:-1].strip(),
frameworks=["arduino"],
)
)
else:
result.append(dict(name=item, frameworks=["arduino"]))
return result
class PlatformJsonManifestParser(BaseManifestParser): class PlatformJsonManifestParser(BaseManifestParser):
manifest_type = ManifestFileType.PLATFORM_JSON
def parse(self, contents): def parse(self, contents):
data = json.loads(contents) data = json.loads(contents)
if "frameworks" in data: if "frameworks" in data:
data["frameworks"] = self._parse_frameworks(data["frameworks"]) data["frameworks"] = self._parse_frameworks(data["frameworks"])
if "packages" in data:
data["dependencies"] = self._parse_dependencies(data["packages"])
return data return data
@staticmethod @staticmethod
@ -525,8 +599,16 @@ class PlatformJsonManifestParser(BaseManifestParser):
return None return None
return [name.lower() for name in raw.keys()] return [name.lower() for name in raw.keys()]
@staticmethod
def _parse_dependencies(raw):
return [
dict(name=name, version=opts.get("version")) for name, opts in raw.items()
]
class PackageJsonManifestParser(BaseManifestParser): class PackageJsonManifestParser(BaseManifestParser):
manifest_type = ManifestFileType.PACKAGE_JSON
def parse(self, contents): def parse(self, contents):
data = json.loads(contents) data = json.loads(contents)
data = self._parse_system(data) data = self._parse_system(data)
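
The new Arduino "depends" support maps a comma-separated list, optionally with version
constraints in parentheses, onto structured dependencies. A standalone sketch (the
library names are examples):

    def parse_depends(raw):
        # "Adafruit GFX Library, SD (>=1.2.0)" -> list of dependency dicts
        result = []
        for item in raw.split(","):
            item = item.strip()
            if not item:
                continue
            if item.endswith(")") and "(" in item:
                name, version = item.split("(")
                result.append(dict(name=name.strip(),
                                   version=version[:-1].strip(),
                                   frameworks=["arduino"]))
            else:
                result.append(dict(name=item, frameworks=["arduino"]))
        return result

    print(parse_depends("Adafruit GFX Library, SD (>=1.2.0)"))
    # [{'name': 'Adafruit GFX Library', 'frameworks': ['arduino']},
    #  {'name': 'SD', 'version': '>=1.2.0', 'frameworks': ['arduino']}]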


@ -12,6 +12,9 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
# pylint: disable=too-many-ancestors
import marshmallow
import requests import requests
import semantic_version import semantic_version
from marshmallow import Schema, ValidationError, fields, validate, validates from marshmallow import Schema, ValidationError, fields, validate, validates
@ -19,23 +22,61 @@ from marshmallow import Schema, ValidationError, fields, validate, validates
from platformio.package.exception import ManifestValidationError from platformio.package.exception import ManifestValidationError
from platformio.util import memoized from platformio.util import memoized
MARSHMALLOW_2 = marshmallow.__version_info__ < (3,)
class StrictSchema(Schema):
def handle_error(self, error, data): if MARSHMALLOW_2:
class CompatSchema(Schema):
pass
else:
class CompatSchema(Schema):
class Meta(object): # pylint: disable=no-init
unknown = marshmallow.EXCLUDE # pylint: disable=no-member
def handle_error(self, error, data, **_): # pylint: disable=arguments-differ
raise ManifestValidationError(
error.messages,
data,
error.valid_data if hasattr(error, "valid_data") else error.data,
)
class BaseSchema(CompatSchema):
def load_manifest(self, data):
if MARSHMALLOW_2:
data, errors = self.load(data)
if errors:
raise ManifestValidationError(errors, data, data)
return data
return self.load(data)
class StrictSchema(BaseSchema):
def handle_error(self, error, data, **_): # pylint: disable=arguments-differ
# skip broken records # skip broken records
if self.many: if self.many:
error.data = [ error.valid_data = [
item for idx, item in enumerate(data) if idx not in error.messages item for idx, item in enumerate(data) if idx not in error.messages
] ]
else: else:
error.data = None error.valid_data = None
if MARSHMALLOW_2:
error.data = error.valid_data
raise error raise error
class StrictListField(fields.List): class StrictListField(fields.List):
def _deserialize(self, value, attr, data): def _deserialize( # pylint: disable=arguments-differ
self, value, attr, data, **kwargs
):
try: try:
return super(StrictListField, self)._deserialize(value, attr, data) return super(StrictListField, self)._deserialize(
value, attr, data, **kwargs
)
except ValidationError as exc: except ValidationError as exc:
if exc.data: if exc.data:
exc.data = [item for item in exc.data if item is not None] exc.data = [item for item in exc.data if item is not None]
@ -61,7 +102,33 @@ class RepositorySchema(StrictSchema):
branch = fields.Str(validate=validate.Length(min=1, max=50)) branch = fields.Str(validate=validate.Length(min=1, max=50))
class ExportSchema(Schema): class DependencySchema(StrictSchema):
name = fields.Str(required=True, validate=validate.Length(min=1, max=100))
version = fields.Str(validate=validate.Length(min=1, max=100))
authors = StrictListField(fields.Str(validate=validate.Length(min=1, max=50)))
platforms = StrictListField(
fields.Str(
validate=[
validate.Length(min=1, max=50),
validate.Regexp(
r"^([a-z\d\-_]+|\*)$", error="Only [a-z0-9-_*] chars are allowed"
),
]
)
)
frameworks = StrictListField(
fields.Str(
validate=[
validate.Length(min=1, max=50),
validate.Regexp(
r"^([a-z\d\-_]+|\*)$", error="Only [a-z0-9-_*] chars are allowed"
),
]
)
)
class ExportSchema(BaseSchema):
include = StrictListField(fields.Str) include = StrictListField(fields.Str)
exclude = StrictListField(fields.Str) exclude = StrictListField(fields.Str)
@ -80,7 +147,7 @@ class ExampleSchema(StrictSchema):
files = StrictListField(fields.Str, required=True) files = StrictListField(fields.Str, required=True)
class ManifestSchema(Schema): class ManifestSchema(BaseSchema):
# Required fields # Required fields
name = fields.Str(required=True, validate=validate.Length(min=1, max=100)) name = fields.Str(required=True, validate=validate.Length(min=1, max=100))
version = fields.Str(required=True, validate=validate.Length(min=1, max=50)) version = fields.Str(required=True, validate=validate.Length(min=1, max=50))
@ -92,8 +159,12 @@ class ManifestSchema(Schema):
homepage = fields.Url(validate=validate.Length(min=1, max=255)) homepage = fields.Url(validate=validate.Length(min=1, max=255))
license = fields.Str(validate=validate.Length(min=1, max=255)) license = fields.Str(validate=validate.Length(min=1, max=255))
repository = fields.Nested(RepositorySchema) repository = fields.Nested(RepositorySchema)
dependencies = fields.Nested(DependencySchema, many=True)
# library.json
export = fields.Nested(ExportSchema) export = fields.Nested(ExportSchema)
examples = fields.Nested(ExampleSchema, many=True) examples = fields.Nested(ExampleSchema, many=True)
downloadUrl = fields.Url(validate=validate.Length(min=1, max=255))
keywords = StrictListField( keywords = StrictListField(
fields.Str( fields.Str(
@ -105,7 +176,6 @@ class ManifestSchema(Schema):
] ]
) )
) )
platforms = StrictListField( platforms = StrictListField(
fields.Str( fields.Str(
validate=[ validate=[
@ -142,10 +212,6 @@ class ManifestSchema(Schema):
) )
) )
def handle_error(self, error, data):
if self.strict:
raise ManifestValidationError(error, data)
@validates("version") @validates("version")
def validate_version(self, value): # pylint: disable=no-self-use def validate_version(self, value): # pylint: disable=no-self-use
try: try:
@ -176,7 +242,7 @@ class ManifestSchema(Schema):
def load_spdx_licenses(): def load_spdx_licenses():
r = requests.get( r = requests.get(
"https://raw.githubusercontent.com/spdx/license-list-data" "https://raw.githubusercontent.com/spdx/license-list-data"
"/v3.6/json/licenses.json" "/v3.8/json/licenses.json"
) )
r.raise_for_status() r.raise_for_status()
return r.json() return r.json()
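
The schema changes mostly revolve around the marshmallow 2 vs 3 API split: version 2
returns a (data, errors) tuple from load(), version 3 raises a ValidationError. A
reduced sketch of the load_manifest idea (ExampleSchema and the ValueError stand-in
are illustrative):

    import marshmallow
    from marshmallow import Schema, fields

    MARSHMALLOW_2 = marshmallow.__version_info__ < (3,)

    class ExampleSchema(Schema):
        name = fields.Str(required=True)

    def load_manifest(schema, data):
        if MARSHMALLOW_2:
            result, errors = schema.load(data)
            if errors:
                raise ValueError(errors)  # stand-in for ManifestValidationError
            return result
        return schema.load(data)  # marshmallow 3 raises on invalid data

    print(load_manifest(ExampleSchema(), {"name": "FooLib"}))  # {'name': 'FooLib'}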

platformio/package/pack.py (new file, 131 lines)

@ -0,0 +1,131 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import re
import shutil
import tarfile
import tempfile
from platformio import fs
from platformio.package.exception import PackageException
from platformio.package.manifest.parser import ManifestFileType, ManifestParserFactory
from platformio.package.manifest.schema import ManifestSchema
from platformio.unpacker import FileUnpacker
class PackagePacker(object):
EXCLUDE_DEFAULT = [
"._*",
".DS_Store",
".git",
".hg",
".svn",
".pio",
]
INCLUDE_DEFAULT = ManifestFileType.items().values()
def __init__(self, package, manifest_uri=None):
self.package = package
self.manifest_uri = manifest_uri
def pack(self, dst=None):
tmp_dir = tempfile.mkdtemp()
try:
src = self.package
# if zip/tar.gz -> unpack to tmp dir
if not os.path.isdir(src):
with FileUnpacker(src) as fu:
assert fu.unpack(tmp_dir, silent=True)
src = tmp_dir
src = self.find_source_root(src)
manifest = self.load_manifest(src)
filename = re.sub(
r"[^\da-zA-Z\-\._]+",
"",
"{name}{system}-{version}.tar.gz".format(
name=manifest["name"],
system="-" + manifest["system"][0] if "system" in manifest else "",
version=manifest["version"],
),
)
if not dst:
dst = os.path.join(os.getcwd(), filename)
elif os.path.isdir(dst):
dst = os.path.join(dst, filename)
return self._create_tarball(
src,
dst,
include=manifest.get("export", {}).get("include"),
exclude=manifest.get("export", {}).get("exclude"),
)
finally:
shutil.rmtree(tmp_dir)
@staticmethod
def load_manifest(src):
mp = ManifestParserFactory.new_from_dir(src)
return ManifestSchema().load_manifest(mp.as_dict())
def find_source_root(self, src):
if self.manifest_uri:
mp = (
ManifestParserFactory.new_from_file(self.manifest_uri[5:])
if self.manifest_uri.startswith("file:")
else ManifestParserFactory.new_from_url(self.manifest_uri)
)
manifest = ManifestSchema().load_manifest(mp.as_dict())
include = manifest.get("export", {}).get("include", [])
if len(include) == 1:
if not os.path.isdir(os.path.join(src, include[0])):
raise PackageException(
"Non existing `include` directory `%s` in a package"
% include[0]
)
return os.path.join(src, include[0])
for root, _, __ in os.walk(src):
if ManifestFileType.from_dir(root):
return root
return src
def _create_tarball(self, src, dst, include=None, exclude=None):
# remap root
if (
include
and len(include) == 1
and os.path.isdir(os.path.join(src, include[0]))
):
src = os.path.join(src, include[0])
include = None
src_filters = self.compute_src_filters(include, exclude)
with tarfile.open(dst, "w:gz") as tar:
for f in fs.match_src_files(src, src_filters, followlinks=False):
tar.add(os.path.join(src, f), f)
return dst
def compute_src_filters(self, include, exclude):
result = ["+<%s>" % p for p in include or ["*", ".*"]]
result += ["-<%s>" % p for p in exclude or []]
result += ["-<%s>" % p for p in self.EXCLUDE_DEFAULT]
# automatically include manifests
result += ["+<%s>" % p for p in self.INCLUDE_DEFAULT]
return result
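
A minimal usage sketch for the new packer (the library path is an example; the
resulting filename depends on the package manifest):

    from platformio.package.pack import PackagePacker

    packer = PackagePacker("/path/to/SomeLibrary")
    tarball = packer.pack()  # e.g. SomeLibrary-1.0.0.tar.gz in the CWD
    print(tarball)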


@ -20,8 +20,9 @@ from hashlib import sha1
import click import click
from platformio import exception, fs from platformio import fs
from platformio.compat import PY2, WINDOWS, hashlib_encode_data from platformio.compat import PY2, WINDOWS, hashlib_encode_data, string_types
from platformio.project import exception
from platformio.project.options import ProjectOptions from platformio.project.options import ProjectOptions
try: try:
@ -29,7 +30,8 @@ try:
except ImportError: except ImportError:
import configparser as ConfigParser import configparser as ConfigParser
CONFIG_HEADER = """;PlatformIO Project Configuration File CONFIG_HEADER = """
; PlatformIO Project Configuration File
; ;
; Build options: build flags, source filter ; Build options: build flags, source filter
; Upload options: custom upload port, speed and extra flags ; Upload options: custom upload port, speed and extra flags
@ -38,10 +40,12 @@ CONFIG_HEADER = """;PlatformIO Project Configuration File
; ;
; Please visit documentation for the other options and examples ; Please visit documentation for the other options and examples
; https://docs.platformio.org/page/projectconf.html ; https://docs.platformio.org/page/projectconf.html
""" """
MISSING = object()
class ProjectConfigBase(object): class ProjectConfigBase(object):
INLINE_COMMENT_RE = re.compile(r"\s+;.*$") INLINE_COMMENT_RE = re.compile(r"\s+;.*$")
@ -104,7 +108,7 @@ class ProjectConfigBase(object):
try: try:
self._parser.read(path) self._parser.read(path)
except ConfigParser.Error as e: except ConfigParser.Error as e:
raise exception.InvalidProjectConf(path, str(e)) raise exception.InvalidProjectConfError(path, str(e))
if not parse_extra: if not parse_extra:
return return
@ -228,6 +232,8 @@ class ProjectConfigBase(object):
return [(option, self.get(section, option)) for option in self.options(section)] return [(option, self.get(section, option)) for option in self.options(section)]
def set(self, section, option, value): def set(self, section, option, value):
if value is None:
value = ""
if isinstance(value, (list, tuple)): if isinstance(value, (list, tuple)):
value = "\n".join(value) value = "\n".join(value)
elif isinstance(value, bool): elif isinstance(value, bool):
@ -239,46 +245,25 @@ class ProjectConfigBase(object):
value = "\n" + value value = "\n" + value
self._parser.set(section, option, value) self._parser.set(section, option, value)
def getraw(self, section, option): def getraw( # pylint: disable=too-many-branches
self, section, option, default=MISSING
):
if not self.expand_interpolations: if not self.expand_interpolations:
return self._parser.get(section, option) return self._parser.get(section, option)
value = None value = MISSING
found = False
for sec, opt in self.walk_options(section): for sec, opt in self.walk_options(section):
if opt == option: if opt == option:
value = self._parser.get(sec, option) value = self._parser.get(sec, option)
found = True
break break
if not found:
value = self._parser.get(section, option)
if "${" not in value or "}" not in value:
return value
return self.VARTPL_RE.sub(self._re_interpolation_handler, value)
def _re_interpolation_handler(self, match):
section, option = match.group(1), match.group(2)
if section == "sysenv":
return os.getenv(option)
return self.getraw(section, option)
def get(self, section, option, default=None): # pylint: disable=too-many-branches
value = None
try:
value = self.getraw(section, option)
except (ConfigParser.NoSectionError, ConfigParser.NoOptionError):
pass # handle value from system environment
except ConfigParser.Error as e:
raise exception.InvalidProjectConf(self.path, str(e))
option_meta = ProjectOptions.get("%s.%s" % (section.split(":", 1)[0], option)) option_meta = ProjectOptions.get("%s.%s" % (section.split(":", 1)[0], option))
if not option_meta: if not option_meta:
return value or default if value == MISSING:
value = (
if option_meta.multiple: default if default != MISSING else self._parser.get(section, option)
value = self.parse_multi_values(value) )
return self._expand_interpolations(value)
if option_meta.sysenvvar: if option_meta.sysenvvar:
envvar_value = os.getenv(option_meta.sysenvvar) envvar_value = os.getenv(option_meta.sysenvvar)
@ -288,17 +273,45 @@ class ProjectConfigBase(object):
if envvar_value: if envvar_value:
break break
if envvar_value and option_meta.multiple: if envvar_value and option_meta.multiple:
value = value or [] value += ("" if value == MISSING else "\n") + envvar_value
value.extend(self.parse_multi_values(envvar_value)) elif envvar_value and value == MISSING:
elif envvar_value and not value:
value = envvar_value value = envvar_value
# option is not specified by user if value == MISSING:
if value is None or ( value = option_meta.default or default
option_meta.multiple and value == [] and option_meta.default if value == MISSING:
): return None
return default if default is not None else option_meta.default
return self._expand_interpolations(value)
def _expand_interpolations(self, value):
if (
not value
or not isinstance(value, string_types)
or not all(["${" in value, "}" in value])
):
return value
return self.VARTPL_RE.sub(self._re_interpolation_handler, value)
def _re_interpolation_handler(self, match):
section, option = match.group(1), match.group(2)
if section == "sysenv":
return os.getenv(option)
return self.getraw(section, option)
def get(self, section, option, default=MISSING):
value = None
try:
value = self.getraw(section, option, default)
except ConfigParser.Error as e:
raise exception.InvalidProjectConfError(self.path, str(e))
option_meta = ProjectOptions.get("%s.%s" % (section.split(":", 1)[0], option))
if not option_meta:
return value
if option_meta.multiple:
value = self.parse_multi_values(value or [])
try: try:
return self.cast_to(value, option_meta.type) return self.cast_to(value, option_meta.type)
except click.BadParameter as e: except click.BadParameter as e:
@ -325,14 +338,14 @@ class ProjectConfigBase(object):
def validate(self, envs=None, silent=False): def validate(self, envs=None, silent=False):
if not os.path.isfile(self.path): if not os.path.isfile(self.path):
raise exception.NotPlatformIOProject(self.path) raise exception.NotPlatformIOProjectError(self.path)
# check envs # check envs
known = set(self.envs()) known = set(self.envs())
if not known: if not known:
raise exception.ProjectEnvsNotAvailable() raise exception.ProjectEnvsNotAvailableError()
unknown = set(list(envs or []) + self.default_envs()) - known unknown = set(list(envs or []) + self.default_envs()) - known
if unknown: if unknown:
raise exception.UnknownEnvNames(", ".join(unknown), ", ".join(known)) raise exception.UnknownEnvNamesError(", ".join(unknown), ", ".join(known))
if not silent: if not silent:
for warning in self.warnings: for warning in self.warnings:
click.secho("Warning! %s" % warning, fg="yellow") click.secho("Warning! %s" % warning, fg="yellow")
@ -445,7 +458,12 @@ class ProjectConfig(ProjectConfigBase, ProjectConfigDirsMixin):
path = path or self.path path = path or self.path
if path in self._instances: if path in self._instances:
del self._instances[path] del self._instances[path]
with open(path or self.path, "w") as fp: with open(path or self.path, "w+") as fp:
fp.write(CONFIG_HEADER) fp.write(CONFIG_HEADER.strip() + "\n\n")
self._parser.write(fp) self._parser.write(fp)
fp.seek(0)
contents = fp.read()
fp.seek(0)
fp.truncate()
fp.write(contents.strip() + "\n")
return True return True
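The reworked save() above normalizes the file on disk: the header is written once, then the whole file is re-read and rewritten so it ends with exactly one trailing newline (the project-config tests later in this diff assert exactly this). A minimal sketch of the round trip, with a hypothetical path:

from platformio.project.config import ProjectConfig

config = ProjectConfig("platformio.ini")  # hypothetical project file
config.save()
# the file now starts with CONFIG_HEADER and ends with a single "\n"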
@ -0,0 +1,53 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio.exception import PlatformioException, UserSideException
class ProjectError(PlatformioException):
pass
class NotPlatformIOProjectError(ProjectError, UserSideException):
MESSAGE = (
"Not a PlatformIO project. `platformio.ini` file has not been "
"found in current working directory ({0}). To initialize new project "
"please use `platformio project init` command"
)
class InvalidProjectConfError(ProjectError, UserSideException):
MESSAGE = "Invalid '{0}' (project configuration file): '{1}'"
class UndefinedEnvPlatformError(ProjectError, UserSideException):
MESSAGE = "Please specify platform for '{0}' environment"
class ProjectEnvsNotAvailableError(ProjectError, UserSideException):
MESSAGE = "Please setup environments in `platformio.ini` file"
class UnknownEnvNamesError(ProjectError, UserSideException):
MESSAGE = "Unknown environment names '{0}'. Valid names are '{1}'"
class ProjectOptionValueError(ProjectError, UserSideException):
MESSAGE = "{0} for option `{1}` in section [{2}]"
@ -284,6 +284,13 @@ ProjectOptions = OrderedDict(
description="Custom packages and specifications", description="Custom packages and specifications",
multiple=True, multiple=True,
), ),
# Board
ConfigEnvOption(
group="platform",
name="board",
description="A board ID",
buildenvvar="BOARD",
),
ConfigEnvOption( ConfigEnvOption(
group="platform", group="platform",
name="framework", name="framework",
@ -291,36 +298,29 @@ ProjectOptions = OrderedDict(
multiple=True, multiple=True,
buildenvvar="PIOFRAMEWORK", buildenvvar="PIOFRAMEWORK",
), ),
# Board
ConfigEnvOption( ConfigEnvOption(
group="board", group="platform",
name="board",
description="A board ID",
buildenvvar="BOARD",
),
ConfigEnvOption(
group="board",
name="board_build.mcu", name="board_build.mcu",
description="A custom board MCU", description="A custom board MCU",
oldnames=["board_mcu"], oldnames=["board_mcu"],
buildenvvar="BOARD_MCU", buildenvvar="BOARD_MCU",
), ),
ConfigEnvOption( ConfigEnvOption(
group="board", group="platform",
name="board_build.f_cpu", name="board_build.f_cpu",
description="A custom MCU frequency", description="A custom MCU frequency",
oldnames=["board_f_cpu"], oldnames=["board_f_cpu"],
buildenvvar="BOARD_F_CPU", buildenvvar="BOARD_F_CPU",
), ),
ConfigEnvOption( ConfigEnvOption(
group="board", group="platform",
name="board_build.f_flash", name="board_build.f_flash",
description="A custom flash frequency", description="A custom flash frequency",
oldnames=["board_f_flash"], oldnames=["board_f_flash"],
buildenvvar="BOARD_F_FLASH", buildenvvar="BOARD_F_FLASH",
), ),
ConfigEnvOption( ConfigEnvOption(
group="board", group="platform",
name="board_build.flash_mode", name="board_build.flash_mode",
description="A custom flash mode", description="A custom flash mode",
oldnames=["board_flash_mode"], oldnames=["board_flash_mode"],
@ -531,7 +531,7 @@ ProjectOptions = OrderedDict(
group="check", group="check",
name="check_tool", name="check_tool",
description="A list of check tools used for analysis", description="A list of check tools used for analysis",
type=click.Choice(["cppcheck", "clangtidy"]), type=click.Choice(["cppcheck", "clangtidy", "pvs-studio"]),
multiple=True, multiple=True,
default=["cppcheck"], default=["cppcheck"],
), ),
@ -582,11 +582,15 @@ ProjectOptions = OrderedDict(
description="A connection speed (baud rate) to communicate with a target device", description="A connection speed (baud rate) to communicate with a target device",
type=click.INT, type=click.INT,
), ),
ConfigEnvOption(group="test", name="test_transport", description="",), ConfigEnvOption(
group="test",
name="test_transport",
description="A transport to communicate with a target device",
),
ConfigEnvOption( ConfigEnvOption(
group="test", group="test",
name="test_build_project_src", name="test_build_project_src",
description="", description="Build project source code in a pair with test code",
type=click.BOOL, type=click.BOOL,
default=False, default=False,
), ),
@ -596,6 +600,16 @@ ProjectOptions = OrderedDict(
name="debug_tool", name="debug_tool",
description="A name of debugging tool", description="A name of debugging tool",
), ),
ConfigEnvOption(
group="debug",
name="debug_build_flags",
description=(
"Custom debug flags/options for preprocessing, compilation, "
"assembly, and linking processes"
),
multiple=True,
default=["-Og", "-g2", "-ggdb2"],
),
ConfigEnvOption( ConfigEnvOption(
group="debug", group="debug",
name="debug_init_break", name="debug_init_break",
@ -13,13 +13,12 @@
# limitations under the License. # limitations under the License.
import atexit import atexit
import os
import platform import platform
import re import re
import sys import sys
import threading import threading
from collections import deque from collections import deque
from os import getenv, sep
from os.path import join
from time import sleep, time from time import sleep, time
from traceback import format_exc from traceback import format_exc
@ -79,6 +78,7 @@ class MeasurementProtocol(TelemetryBase):
self._prefill_screen_name() self._prefill_screen_name()
self._prefill_appinfo() self._prefill_appinfo()
self._prefill_sysargs()
self._prefill_custom_data() self._prefill_custom_data()
def __getitem__(self, name): def __getitem__(self, name):
@ -99,10 +99,19 @@ class MeasurementProtocol(TelemetryBase):
dpdata.append("PlatformIO/%s" % __version__) dpdata.append("PlatformIO/%s" % __version__)
if app.get_session_var("caller_id"): if app.get_session_var("caller_id"):
dpdata.append("Caller/%s" % app.get_session_var("caller_id")) dpdata.append("Caller/%s" % app.get_session_var("caller_id"))
if getenv("PLATFORMIO_IDE"): if os.getenv("PLATFORMIO_IDE"):
dpdata.append("IDE/%s" % getenv("PLATFORMIO_IDE")) dpdata.append("IDE/%s" % os.getenv("PLATFORMIO_IDE"))
self["an"] = " ".join(dpdata) self["an"] = " ".join(dpdata)
def _prefill_sysargs(self):
args = []
for arg in sys.argv[1:]:
arg = str(arg).lower()
if "@" in arg or os.path.exists(arg):
arg = "***"
args.append(arg)
self["cd3"] = " ".join(args)
def _prefill_custom_data(self): def _prefill_custom_data(self):
def _filter_args(items): def _filter_args(items):
result = [] result = []
@ -119,7 +128,6 @@ class MeasurementProtocol(TelemetryBase):
caller_id = str(app.get_session_var("caller_id")) caller_id = str(app.get_session_var("caller_id"))
self["cd1"] = util.get_systype() self["cd1"] = util.get_systype()
self["cd2"] = "Python/%s %s" % (platform.python_version(), platform.platform()) self["cd2"] = "Python/%s %s" % (platform.python_version(), platform.platform())
# self['cd3'] = " ".join(_filter_args(sys.argv[1:]))
self["cd4"] = ( self["cd4"] = (
1 if (not util.is_ci() and (caller_id or not is_container())) else 0 1 if (not util.is_ci() and (caller_id or not is_container())) else 0
) )
@ -143,14 +151,7 @@ class MeasurementProtocol(TelemetryBase):
return return
cmd_path = args[:1] cmd_path = args[:1]
if args[0] in ( if args[0] in ("account", "device", "platform", "project", "settings",):
"platform",
"platforms",
"serialports",
"device",
"settings",
"account",
):
cmd_path = args[:2] cmd_path = args[:2]
if args[0] == "lib" and len(args) > 1: if args[0] == "lib" and len(args) > 1:
lib_subcmds = ( lib_subcmds = (
@ -179,13 +180,10 @@ class MeasurementProtocol(TelemetryBase):
cmd_path.append(sub_cmd) cmd_path.append(sub_cmd)
self["screen_name"] = " ".join([p.title() for p in cmd_path]) self["screen_name"] = " ".join([p.title() for p in cmd_path])
@staticmethod def _ignore_hit(self):
def _ignore_hit():
if not app.get_setting("enable_telemetry"): if not app.get_setting("enable_telemetry"):
return True return True
if app.get_session_var("caller_id") and all( if all(c in sys.argv for c in ("run", "idedata")) or self["ea"] == "Idedata":
c in sys.argv for c in ("run", "idedata")
):
return True return True
return False return False
@ -296,29 +294,64 @@ def on_command():
measure_ci() measure_ci()
def on_exception(e):
skip_conditions = [
isinstance(e, cls)
for cls in (IOError, exception.ReturnErrorCode, exception.UserSideException,)
]
try:
skip_conditions.append("[API] Account: " in str(e))
except UnicodeEncodeError as ue:
e = ue
if any(skip_conditions):
return
is_fatal = any(
[
not isinstance(e, exception.PlatformioException),
"Error" in e.__class__.__name__,
]
)
description = "%s: %s" % (
type(e).__name__,
" ".join(reversed(format_exc().split("\n"))) if is_fatal else str(e),
)
send_exception(description, is_fatal)
def measure_ci(): def measure_ci():
event = {"category": "CI", "action": "NoName", "label": None} event = {"category": "CI", "action": "NoName", "label": None}
known_cis = ("TRAVIS", "APPVEYOR", "GITLAB_CI", "CIRCLECI", "SHIPPABLE", "DRONE") known_cis = ("TRAVIS", "APPVEYOR", "GITLAB_CI", "CIRCLECI", "SHIPPABLE", "DRONE")
for name in known_cis: for name in known_cis:
if getenv(name, "false").lower() == "true": if os.getenv(name, "false").lower() == "true":
event["action"] = name event["action"] = name
break break
on_event(**event) send_event(**event)
def on_run_environment(options, targets): def encode_run_environment(options):
non_sensative_values = ["board", "platform", "framework"] non_sensative_keys = [
safe_options = [] "platform",
for key, value in sorted(options.items()): "framework",
if key in non_sensative_values: "board",
safe_options.append("%s=%s" % (key, value)) "upload_protocol",
else: "check_tool",
safe_options.append(key) "debug_tool",
targets = [t.title() for t in targets or ["run"]] ]
on_event("Env", " ".join(targets), "&".join(safe_options)) safe_options = [
"%s=%s" % (k, v) for k, v in sorted(options.items()) if k in non_sensative_keys
]
return "&".join(safe_options)
def on_event(category, action, label=None, value=None, screen_name=None): def send_run_environment(options, targets):
send_event(
"Env",
" ".join([t.title() for t in targets or ["run"]]),
encode_run_environment(options),
)
def send_event(category, action, label=None, value=None, screen_name=None):
mp = MeasurementProtocol() mp = MeasurementProtocol()
mp["event_category"] = category[:150] mp["event_category"] = category[:150]
mp["event_action"] = action[:500] mp["event_action"] = action[:500]
@ -331,43 +364,21 @@ def on_event(category, action, label=None, value=None, screen_name=None):
mp.send("event") mp.send("event")
def on_exception(e): def send_exception(description, is_fatal=False):
def _cleanup_description(text): # cleanup sensitive information, such as paths
text = text.replace("Traceback (most recent call last):", "") description = description.replace("Traceback (most recent call last):", "")
text = re.sub( description = description.replace("\\", "/")
r'File "([^"]+)"', description = re.sub(
lambda m: join(*m.group(1).split(sep)[-2:]), r'(^|\s+|")(?:[a-z]\:)?((/[^"/]+)+)(\s+|"|$)',
text, lambda m: " %s " % os.path.join(*m.group(2).split("/")[-2:]),
flags=re.M, description,
) re.I | re.M,
text = re.sub(r"\s+", " ", text, flags=re.M)
return text.strip()
skip_conditions = [
isinstance(e, cls)
for cls in (
IOError,
exception.ReturnErrorCode,
exception.UserSideException,
exception.PlatformIOProjectException,
)
]
try:
skip_conditions.append("[API] Account: " in str(e))
except UnicodeEncodeError as ue:
e = ue
if any(skip_conditions):
return
is_crash = any(
[
not isinstance(e, exception.PlatformioException),
"Error" in e.__class__.__name__,
]
) )
description = re.sub(r"\s+", " ", description, flags=re.M)
mp = MeasurementProtocol() mp = MeasurementProtocol()
description = _cleanup_description(format_exc() if is_crash else str(e)) mp["exd"] = description[:8192].strip()
mp["exd"] = ("%s: %s" % (type(e).__name__, description))[:2048] mp["exf"] = 1 if is_fatal else 0
mp["exf"] = 1 if is_crash else 0
mp.send("exception") mp.send("exception")
@ -73,6 +73,7 @@ class TARArchive(ArchiveBase):
).startswith(base) ).startswith(base)
def extract_item(self, item, dest_dir): def extract_item(self, item, dest_dir):
dest_dir = self.resolve_path(dest_dir)
bad_conds = [ bad_conds = [
self.is_bad_path(item.name, dest_dir), self.is_bad_path(item.name, dest_dir),
self.is_link(item) and self.is_bad_link(item, dest_dir), self.is_link(item) and self.is_bad_link(item, dest_dir),
@ -137,10 +138,13 @@ class FileUnpacker(object):
if self._unpacker: if self._unpacker:
self._unpacker.close() self._unpacker.close()
def unpack(self, dest_dir=".", with_progress=True, check_unpacked=True): def unpack(
self, dest_dir=".", with_progress=True, check_unpacked=True, silent=False
):
assert self._unpacker assert self._unpacker
if not with_progress: if not with_progress or silent:
click.echo("Unpacking...") if not silent:
click.echo("Unpacking...")
for item in self._unpacker.get_items(): for item in self._unpacker.get_items():
self._unpacker.extract_item(item, dest_dir) self._unpacker.extract_item(item, dest_dir)
else: else:
@ -366,10 +366,11 @@ def get_api_result(url, params=None, data=None, auth=None, cache_valid=None):
) )
PING_INTERNET_IPS = [ PING_REMOTE_HOSTS = [
"192.30.253.113", # github.com "140.82.118.3", # Github.com
"31.28.1.238", # dl.platformio.org "35.231.145.151", # Gitlab.com
"193.222.52.25", # dl.platformio.org "github.com",
"platformio.org",
] ]
@ -377,12 +378,12 @@ PING_INTERNET_IPS = [
def _internet_on(): def _internet_on():
timeout = 2 timeout = 2
socket.setdefaulttimeout(timeout) socket.setdefaulttimeout(timeout)
for ip in PING_INTERNET_IPS: for host in PING_REMOTE_HOSTS:
try: try:
if os.getenv("HTTP_PROXY", os.getenv("HTTPS_PROXY")): if os.getenv("HTTP_PROXY", os.getenv("HTTPS_PROXY")):
requests.get("http://%s" % ip, allow_redirects=False, timeout=timeout) requests.get("http://%s" % host, allow_redirects=False, timeout=timeout)
else: else:
socket.socket(socket.AF_INET, socket.SOCK_STREAM).connect((ip, 80)) socket.socket(socket.AF_INET, socket.SOCK_STREAM).connect((host, 80))
return True return True
except: # pylint: disable=bare-except except: # pylint: disable=bare-except
pass pass
@ -401,9 +402,9 @@ def pepver_to_semver(pepver):
def items_to_list(items): def items_to_list(items):
if not isinstance(items, list): if isinstance(items, list):
items = [i.strip() for i in items.split(",")] return items
return [i.lower() for i in items if i] return [i.strip() for i in items.split(",") if i.strip()]
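A quick sketch of the reworked items_to_list() above (inputs are made up): list input is now returned untouched, and string input is split on commas with empties dropped; the old lower-casing is gone.

items_to_list("cppcheck, clangtidy, ")   # -> ["cppcheck", "clangtidy"]
items_to_list(["Arduino", "ESP-IDF"])    # -> ["Arduino", "ESP-IDF"]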
def items_in_list(needle, haystack): def items_in_list(needle, haystack):
@ -33,7 +33,7 @@ install_requires = [
"semantic_version>=2.8.1,<3", "semantic_version>=2.8.1,<3",
"tabulate>=0.8.3,<1", "tabulate>=0.8.3,<1",
"pyelftools>=0.25,<1", "pyelftools>=0.25,<1",
"marshmallow>=2.20.5,<3" "marshmallow>=2.20.5",
] ]
setup( setup(
@ -239,21 +239,30 @@ int main() {
def test_check_individual_flags_passed(clirunner, tmpdir): def test_check_individual_flags_passed(clirunner, tmpdir):
config = DEFAULT_CONFIG + "\ncheck_tool = cppcheck, clangtidy" config = DEFAULT_CONFIG + "\ncheck_tool = cppcheck, clangtidy, pvs-studio"
config += "\ncheck_flags = cppcheck: --std=c++11 \n\tclangtidy: --fix-errors" config += """\ncheck_flags =
cppcheck: --std=c++11
clangtidy: --fix-errors
pvs-studio: --analysis-mode=4
"""
tmpdir.join("platformio.ini").write(config) tmpdir.join("platformio.ini").write(config)
tmpdir.mkdir("src").join("main.cpp").write(TEST_CODE) tmpdir.mkdir("src").join("main.cpp").write(TEST_CODE)
result = clirunner.invoke(cmd_check, ["--project-dir", str(tmpdir), "-v"]) result = clirunner.invoke(cmd_check, ["--project-dir", str(tmpdir), "-v"])
clang_flags_found = cppcheck_flags_found = False clang_flags_found = cppcheck_flags_found = pvs_flags_found = False
for l in result.output.split("\n"): for l in result.output.split("\n"):
if "--fix" in l and "clang-tidy" in l and "--std=c++11" not in l: if "--fix" in l and "clang-tidy" in l and "--std=c++11" not in l:
clang_flags_found = True clang_flags_found = True
elif "--std=c++11" in l and "cppcheck" in l and "--fix" not in l: elif "--std=c++11" in l and "cppcheck" in l and "--fix" not in l:
cppcheck_flags_found = True cppcheck_flags_found = True
elif (
"--analysis-mode=4" in l and "pvs-studio" in l.lower() and "--fix" not in l
):
pvs_flags_found = True
assert clang_flags_found assert clang_flags_found
assert cppcheck_flags_found assert cppcheck_flags_found
assert pvs_flags_found
def test_check_cppcheck_misra_addon(clirunner, check_dir): def test_check_cppcheck_misra_addon(clirunner, check_dir):
@ -344,3 +353,33 @@ int main() {
assert high_result.exit_code == 0 assert high_result.exit_code == 0
assert low_result.exit_code != 0 assert low_result.exit_code != 0
def test_check_pvs_studio_free_license(clirunner, tmpdir):
config = """
[env:test]
platform = teensy
board = teensy35
framework = arduino
check_tool = pvs-studio
"""
code = (
"""// This is an open source non-commercial project. Dear PVS-Studio, please check it.
// PVS-Studio Static Code Analyzer for C, C++, C#, and Java: http://www.viva64.com
"""
+ TEST_CODE
)
tmpdir.join("platformio.ini").write(config)
tmpdir.mkdir("src").join("main.c").write(code)
result = clirunner.invoke(
cmd_check, ["--project-dir", str(tmpdir), "--fail-on-defect=high", "-v"]
)
errors, warnings, style = count_defects(result.output)
assert result.exit_code != 0
assert errors != 0
assert warnings != 0
assert style == 0
@ -16,10 +16,10 @@ import json
from os import getcwd, makedirs from os import getcwd, makedirs
from os.path import getsize, isdir, isfile, join from os.path import getsize, isdir, isfile, join
from platformio import exception
from platformio.commands.boards import cli as cmd_boards from platformio.commands.boards import cli as cmd_boards
from platformio.commands.init import cli as cmd_init from platformio.commands.project import project_init as cmd_init
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
from platformio.project.exception import ProjectEnvsNotAvailableError
def validate_pioproject(pioproject_dir): def validate_pioproject(pioproject_dir):
@ -59,7 +59,7 @@ def test_init_ide_without_board(clirunner, tmpdir):
with tmpdir.as_cwd(): with tmpdir.as_cwd():
result = clirunner.invoke(cmd_init, ["--ide", "atom"]) result = clirunner.invoke(cmd_init, ["--ide", "atom"])
assert result.exit_code != 0 assert result.exit_code != 0
assert isinstance(result.exception, exception.ProjectEnvsNotAvailable) assert isinstance(result.exception, ProjectEnvsNotAvailableError)
def test_init_ide_atom(clirunner, validate_cliresult, tmpdir): def test_init_ide_atom(clirunner, validate_cliresult, tmpdir):
@ -230,7 +230,9 @@ def test_global_lib_update_check(clirunner, validate_cliresult):
) )
validate_cliresult(result) validate_cliresult(result)
output = json.loads(result.output) output = json.loads(result.output)
assert set(["RFcontrol", "NeoPixelBus"]) == set([l["name"] for l in output]) assert set(["RFcontrol", "ESPAsyncTCP", "NeoPixelBus"]) == set(
[l["name"] for l in output]
)
def test_global_lib_update(clirunner, validate_cliresult): def test_global_lib_update(clirunner, validate_cliresult):
@ -250,7 +252,7 @@ def test_global_lib_update(clirunner, validate_cliresult):
result = clirunner.invoke(cmd_lib, ["-g", "update"]) result = clirunner.invoke(cmd_lib, ["-g", "update"])
validate_cliresult(result) validate_cliresult(result)
assert result.output.count("[Detached]") == 5 assert result.output.count("[Detached]") == 5
assert result.output.count("[Up-to-date]") == 11 assert result.output.count("[Up-to-date]") == 10
assert "Uninstalling RFcontrol @ 77d4eb3f8a" in result.output assert "Uninstalling RFcontrol @ 77d4eb3f8a" in result.output
# update unknown library # update unknown library
@ -29,25 +29,45 @@ def test_library_json_parser():
"name": "TestPackage", "name": "TestPackage",
"keywords": "kw1, KW2, kw3", "keywords": "kw1, KW2, kw3",
"platforms": ["atmelavr", "espressif"], "platforms": ["atmelavr", "espressif"],
"repository": {
"type": "git",
"url": "http://github.com/username/repo/"
},
"url": "http://old.url.format", "url": "http://old.url.format",
"exclude": [".gitignore", "tests"], "exclude": [".gitignore", "tests"],
"include": "mylib", "include": "mylib",
"build": { "build": {
"flags": ["-DHELLO"] "flags": ["-DHELLO"]
}, },
"examples": ["examples/*/*.pde"],
"dependencies": {
"deps1": "1.2.0",
"deps2": "https://github.com/username/package.git",
"@owner/deps3": "^2.1.3"
},
"customField": "Custom Value" "customField": "Custom Value"
} }
""" """
mp = parser.LibraryJsonManifestParser(contents) raw_data = parser.LibraryJsonManifestParser(contents).as_dict()
raw_data["dependencies"] = sorted(raw_data["dependencies"], key=lambda a: a["name"])
assert not jsondiff.diff( assert not jsondiff.diff(
mp.as_dict(), raw_data,
{ {
"name": "TestPackage", "name": "TestPackage",
"platforms": ["atmelavr", "espressif8266"], "platforms": ["atmelavr", "espressif8266"],
"repository": {
"type": "git",
"url": "https://github.com/username/repo.git",
},
"export": {"exclude": [".gitignore", "tests"], "include": ["mylib"]}, "export": {"exclude": [".gitignore", "tests"], "include": ["mylib"]},
"keywords": ["kw1", "kw2", "kw3"], "keywords": ["kw1", "kw2", "kw3"],
"homepage": "http://old.url.format", "homepage": "http://old.url.format",
"build": {"flags": ["-DHELLO"]}, "build": {"flags": ["-DHELLO"]},
"dependencies": [
{"name": "@owner/deps3", "version": "^2.1.3"},
{"name": "deps1", "version": "1.2.0"},
{"name": "deps2", "version": "https://github.com/username/package.git"},
],
"customField": "Custom Value", "customField": "Custom Value",
}, },
) )
@ -59,20 +79,43 @@ def test_library_json_parser():
"platforms": "atmelavr", "platforms": "atmelavr",
"export": { "export": {
"exclude": "audio_samples" "exclude": "audio_samples"
} },
"dependencies": [
{"name": "deps1", "version": "1.0.0"},
{"name": "@owner/deps2", "version": "1.0.0", "frameworks": "arduino, espidf"},
{"name": "deps3", "version": "1.0.0", "platforms": ["ststm32", "sifive"]}
]
} }
""" """
mp = parser.LibraryJsonManifestParser(contents) raw_data = parser.LibraryJsonManifestParser(contents).as_dict()
raw_data["dependencies"] = sorted(raw_data["dependencies"], key=lambda a: a["name"])
assert not jsondiff.diff( assert not jsondiff.diff(
mp.as_dict(), raw_data,
{ {
"keywords": ["sound", "audio", "music", "sd", "card", "playback"], "keywords": ["sound", "audio", "music", "sd", "card", "playback"],
"frameworks": ["arduino"], "frameworks": ["arduino"],
"export": {"exclude": ["audio_samples"]}, "export": {"exclude": ["audio_samples"]},
"platforms": ["atmelavr"], "platforms": ["atmelavr"],
"dependencies": [
{
"name": "@owner/deps2",
"version": "1.0.0",
"frameworks": ["arduino", "espidf"],
},
{"name": "deps1", "version": "1.0.0"},
{
"name": "deps3",
"version": "1.0.0",
"platforms": ["ststm32", "sifive"],
},
],
}, },
) )
# broken dependencies
with pytest.raises(parser.ManifestParserError):
parser.LibraryJsonManifestParser({"dependencies": ["deps1", "deps2"]})
def test_module_json_parser(): def test_module_json_parser():
contents = """ contents = """
@ -128,10 +171,12 @@ version=1.2.3
author=SomeAuthor <info AT author.com> author=SomeAuthor <info AT author.com>
sentence=This is Arduino library sentence=This is Arduino library
customField=Custom Value customField=Custom Value
depends=First Library (=2.0.0), Second Library (>=1.2.0), Third
""" """
mp = parser.LibraryPropertiesManifestParser(contents) raw_data = parser.LibraryPropertiesManifestParser(contents).as_dict()
raw_data["dependencies"] = sorted(raw_data["dependencies"], key=lambda a: a["name"])
assert not jsondiff.diff( assert not jsondiff.diff(
mp.as_dict(), raw_data,
{ {
"name": "TestPackage", "name": "TestPackage",
"version": "1.2.3", "version": "1.2.3",
@ -145,6 +190,20 @@ customField=Custom Value
"authors": [{"email": "info@author.com", "name": "SomeAuthor"}], "authors": [{"email": "info@author.com", "name": "SomeAuthor"}],
"keywords": ["uncategorized"], "keywords": ["uncategorized"],
"customField": "Custom Value", "customField": "Custom Value",
"depends": "First Library (=2.0.0), Second Library (>=1.2.0), Third",
"dependencies": [
{
"name": "First Library",
"version": "=2.0.0",
"frameworks": ["arduino"],
},
{
"name": "Second Library",
"version": ">=1.2.0",
"frameworks": ["arduino"],
},
{"name": "Third", "frameworks": ["arduino"]},
],
}, },
) )
@ -153,6 +212,7 @@ customField=Custom Value
"architectures=*\n" + contents "architectures=*\n" + contents
).as_dict() ).as_dict()
assert data["platforms"] == ["*"] assert data["platforms"] == ["*"]
# Platforms specific # Platforms specific
data = parser.LibraryPropertiesManifestParser( data = parser.LibraryPropertiesManifestParser(
"architectures=avr, esp32\n" + contents "architectures=avr, esp32\n" + contents
@ -172,11 +232,11 @@ customField=Custom Value
"include": ["libraries/TestPackage"], "include": ["libraries/TestPackage"],
} }
assert data["repository"] == { assert data["repository"] == {
"url": "https://github.com/username/reponame", "url": "https://github.com/username/reponame.git",
"type": "git", "type": "git",
} }
# Hope page # Home page
data = parser.LibraryPropertiesManifestParser( data = parser.LibraryPropertiesManifestParser(
"url=https://github.com/username/reponame.git\n" + contents "url=https://github.com/username/reponame.git\n" + contents
).as_dict() ).as_dict()
@ -185,6 +245,17 @@ customField=Custom Value
"url": "https://github.com/username/reponame.git", "url": "https://github.com/username/reponame.git",
} }
# Author + Maintainer
data = parser.LibraryPropertiesManifestParser(
"""
author=Rocket Scream Electronics
maintainer=Rocket Scream Electronics
"""
).as_dict()
assert data["authors"] == [
{"name": "Rocket Scream Electronics", "maintainer": True}
]
def test_library_json_schema(): def test_library_json_schema():
contents = """ contents = """
@ -202,6 +273,7 @@ def test_library_json_schema():
"name": "Benoit Blanchon", "name": "Benoit Blanchon",
"url": "https://blog.benoitblanchon.fr" "url": "https://blog.benoitblanchon.fr"
}, },
"downloadUrl": "https://example.com/package.tar.gz",
"exclude": [ "exclude": [
"fuzzing", "fuzzing",
"scripts", "scripts",
@ -222,15 +294,20 @@ def test_library_json_schema():
"base": "examples/JsonHttpClient", "base": "examples/JsonHttpClient",
"files": ["JsonHttpClient.ino"] "files": ["JsonHttpClient.ino"]
} }
],
"dependencies": [
{"name": "deps1", "version": "1.0.0"},
{"name": "@owner/deps2", "version": "1.0.0", "frameworks": "arduino"},
{"name": "deps3", "version": "1.0.0", "platforms": ["ststm32", "sifive"]}
] ]
} }
""" """
raw_data = parser.ManifestParserFactory.new( raw_data = parser.ManifestParserFactory.new(
contents, parser.ManifestFileType.LIBRARY_JSON contents, parser.ManifestFileType.LIBRARY_JSON
).as_dict() ).as_dict()
raw_data["dependencies"] = sorted(raw_data["dependencies"], key=lambda a: a["name"])
data, errors = ManifestSchema(strict=True).load(raw_data) data = ManifestSchema().load_manifest(raw_data)
assert not errors
assert data["repository"]["url"] == "https://github.com/bblanchon/ArduinoJson.git" assert data["repository"]["url"] == "https://github.com/bblanchon/ArduinoJson.git"
assert data["examples"][1]["base"] == "examples/JsonHttpClient" assert data["examples"][1]["base"] == "examples/JsonHttpClient"
@ -251,6 +328,7 @@ def test_library_json_schema():
"authors": [ "authors": [
{"name": "Benoit Blanchon", "url": "https://blog.benoitblanchon.fr"} {"name": "Benoit Blanchon", "url": "https://blog.benoitblanchon.fr"}
], ],
"downloadUrl": "https://example.com/package.tar.gz",
"export": {"exclude": ["fuzzing", "scripts", "test", "third-party"]}, "export": {"exclude": ["fuzzing", "scripts", "test", "third-party"]},
"frameworks": ["arduino"], "frameworks": ["arduino"],
"platforms": ["*"], "platforms": ["*"],
@ -267,6 +345,45 @@ def test_library_json_schema():
"files": ["JsonHttpClient.ino"], "files": ["JsonHttpClient.ino"],
}, },
], ],
"dependencies": [
{"name": "@owner/deps2", "version": "1.0.0", "frameworks": ["arduino"]},
{"name": "deps1", "version": "1.0.0"},
{
"name": "deps3",
"version": "1.0.0",
"platforms": ["ststm32", "sifive"],
},
],
},
)
# legacy dependencies format
contents = """
{
"name": "DallasTemperature",
"version": "3.8.0",
"dependencies":
{
"name": "OneWire",
"authors": "Paul Stoffregen",
"frameworks": "arduino"
}
}
"""
raw_data = parser.LibraryJsonManifestParser(contents).as_dict()
data = ManifestSchema().load_manifest(raw_data)
assert not jsondiff.diff(
data,
{
"name": "DallasTemperature",
"version": "3.8.0",
"dependencies": [
{
"name": "OneWire",
"authors": ["Paul Stoffregen"],
"frameworks": ["arduino"],
}
],
}, },
) )
@ -282,13 +399,14 @@ paragraph=Supported display controller: SSD1306, SSD1309, SSD1322, SSD1325
category=Display category=Display
url=https://github.com/olikraus/u8glib url=https://github.com/olikraus/u8glib
architectures=avr,sam architectures=avr,sam
depends=First Library (=2.0.0), Second Library (>=1.2.0), Third
""" """
raw_data = parser.ManifestParserFactory.new( raw_data = parser.ManifestParserFactory.new(
contents, parser.ManifestFileType.LIBRARY_PROPERTIES contents, parser.ManifestFileType.LIBRARY_PROPERTIES
).as_dict() ).as_dict()
raw_data["dependencies"] = sorted(raw_data["dependencies"], key=lambda a: a["name"])
data, errors = ManifestSchema(strict=True).load(raw_data) data = ManifestSchema().load_manifest(raw_data)
assert not errors
assert not jsondiff.diff( assert not jsondiff.diff(
data, data,
@ -297,7 +415,10 @@ architectures=avr,sam
"A library for monochrome TFTs and OLEDs. Supported display " "A library for monochrome TFTs and OLEDs. Supported display "
"controller: SSD1306, SSD1309, SSD1322, SSD1325" "controller: SSD1306, SSD1309, SSD1322, SSD1325"
), ),
"repository": {"url": "https://github.com/olikraus/u8glib", "type": "git"}, "repository": {
"url": "https://github.com/olikraus/u8glib.git",
"type": "git",
},
"frameworks": ["arduino"], "frameworks": ["arduino"],
"platforms": ["atmelavr", "atmelsam"], "platforms": ["atmelavr", "atmelsam"],
"version": "1.19.1", "version": "1.19.1",
@ -309,6 +430,19 @@ architectures=avr,sam
], ],
"keywords": ["display"], "keywords": ["display"],
"name": "U8glib", "name": "U8glib",
"dependencies": [
{
"name": "First Library",
"version": "=2.0.0",
"frameworks": ["arduino"],
},
{
"name": "Second Library",
"version": ">=1.2.0",
"frameworks": ["arduino"],
},
{"name": "Third", "frameworks": ["arduino"]},
],
}, },
) )
@ -335,7 +469,12 @@ includes=MozziGuts.h
), ),
).as_dict() ).as_dict()
data, errors = ManifestSchema(strict=False).load(raw_data) try:
ManifestSchema().load_manifest(raw_data)
except ManifestValidationError as e:
data = e.valid_data
errors = e.messages
assert errors["authors"] assert errors["authors"]
assert not jsondiff.diff( assert not jsondiff.diff(
@ -348,7 +487,10 @@ includes=MozziGuts.h
"sounds using familiar synthesis units like oscillators, delays, " "sounds using familiar synthesis units like oscillators, delays, "
"filters and envelopes." "filters and envelopes."
), ),
"repository": {"url": "https://github.com/sensorium/Mozzi", "type": "git"}, "repository": {
"url": "https://github.com/sensorium/Mozzi.git",
"type": "git",
},
"platforms": ["*"], "platforms": ["*"],
"frameworks": ["arduino"], "frameworks": ["arduino"],
"export": { "export": {
@ -404,11 +546,6 @@ def test_platform_json_schema():
"optional": true, "optional": true,
"version": "~4.2.0" "version": "~4.2.0"
}, },
"framework-simba": {
"type": "framework",
"optional": true,
"version": ">=7.0.0"
},
"tool-avrdude": { "tool-avrdude": {
"type": "uploader", "type": "uploader",
"optional": true, "optional": true,
@ -421,8 +558,9 @@ def test_platform_json_schema():
contents, parser.ManifestFileType.PLATFORM_JSON contents, parser.ManifestFileType.PLATFORM_JSON
).as_dict() ).as_dict()
raw_data["frameworks"] = sorted(raw_data["frameworks"]) raw_data["frameworks"] = sorted(raw_data["frameworks"])
data, errors = ManifestSchema(strict=False).load(raw_data) raw_data["dependencies"] = sorted(raw_data["dependencies"], key=lambda a: a["name"])
assert not errors
data = ManifestSchema().load_manifest(raw_data)
assert not jsondiff.diff( assert not jsondiff.diff(
data, data,
@ -444,6 +582,11 @@ def test_platform_json_schema():
}, },
"frameworks": sorted(["arduino", "simba"]), "frameworks": sorted(["arduino", "simba"]),
"version": "1.15.0", "version": "1.15.0",
"dependencies": [
{"name": "framework-arduinoavr", "version": "~4.2.0"},
{"name": "tool-avrdude", "version": "~1.60300.0"},
{"name": "toolchain-atmelavr", "version": "~1.50400.0"},
],
}, },
) )
@ -461,8 +604,7 @@ def test_package_json_schema():
contents, parser.ManifestFileType.PACKAGE_JSON contents, parser.ManifestFileType.PACKAGE_JSON
).as_dict() ).as_dict()
data, errors = ManifestSchema(strict=False).load(raw_data) data = ManifestSchema().load_manifest(raw_data)
assert not errors
assert not jsondiff.diff( assert not jsondiff.diff(
data, data,
@ -492,6 +634,7 @@ def test_package_json_schema():
def test_parser_from_dir(tmpdir_factory): def test_parser_from_dir(tmpdir_factory):
pkg_dir = tmpdir_factory.mktemp("package") pkg_dir = tmpdir_factory.mktemp("package")
pkg_dir.join("package.json").write('{"name": "package.json"}')
pkg_dir.join("library.json").write('{"name": "library.json"}') pkg_dir.join("library.json").write('{"name": "library.json"}')
pkg_dir.join("library.properties").write("name=library.properties") pkg_dir.join("library.properties").write("name=library.properties")
@ -564,8 +707,7 @@ def test_examples_from_dir(tmpdir_factory):
raw_data["examples"] = _sort_examples(raw_data["examples"]) raw_data["examples"] = _sort_examples(raw_data["examples"])
data, errors = ManifestSchema(strict=True).load(raw_data) data = ManifestSchema().load_manifest(raw_data)
assert not errors
assert not jsondiff.diff( assert not jsondiff.diff(
data, data,
@ -621,34 +763,32 @@ def test_examples_from_dir(tmpdir_factory):
def test_broken_schemas(): def test_broken_schemas():
# non-strict mode # missing required field
data, errors = ManifestSchema(strict=False).load(dict(name="MyPackage"))
assert set(errors.keys()) == set(["version"])
assert data.get("version") is None
# invalid keywords
data, errors = ManifestSchema(strict=False).load(dict(keywords=["kw1", "*^[]"]))
assert errors
assert data["keywords"] == ["kw1"]
# strict mode
with pytest.raises( with pytest.raises(
ManifestValidationError, match="Missing data for required field" ManifestValidationError, match=("Invalid semantic versioning format")
): ) as exc_info:
ManifestSchema(strict=True).load(dict(name="MyPackage")) ManifestSchema().load_manifest(dict(name="MyPackage", version="broken_version"))
assert exc_info.value.valid_data == {"name": "MyPackage"}
# invalid StrictList
with pytest.raises(
ManifestValidationError, match=("Invalid manifest fields.+keywords")
) as exc_info:
ManifestSchema().load_manifest(
dict(name="MyPackage", version="1.0.0", keywords=["kw1", "*^[]"])
)
assert list(exc_info.value.messages.keys()) == ["keywords"]
assert exc_info.value.valid_data["keywords"] == ["kw1"]
# broken SemVer # broken SemVer
with pytest.raises( with pytest.raises(
ManifestValidationError, match=("Invalid semantic versioning format") ManifestValidationError, match=("Invalid semantic versioning format")
): ):
ManifestSchema(strict=True).load( ManifestSchema().load_manifest(dict(name="MyPackage", version="broken_version"))
dict(name="MyPackage", version="broken_version")
)
# broken value for Nested # broken value for Nested
with pytest.raises(ManifestValidationError, match=r"authors.*Invalid input type"): with pytest.raises(ManifestValidationError, match=r"authors.*Invalid input type"):
ManifestSchema(strict=True).load( ManifestSchema().load_manifest(
dict( dict(
name="MyPackage", name="MyPackage",
description="MyDescription", description="MyDescription",
tests/package/test_pack.py (new file, 149 lines)
@ -0,0 +1,149 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import json
import os
import tarfile
import pytest
from platformio import fs
from platformio.compat import WINDOWS
from platformio.package.exception import UnknownManifestError
from platformio.package.pack import PackagePacker
def test_base(tmpdir_factory):
pkg_dir = tmpdir_factory.mktemp("package")
pkg_dir.join(".git").mkdir().join("file").write("")
pkg_dir.join(".gitignore").write("tests")
pkg_dir.join("._ignored").write("")
pkg_dir.join("main.cpp").write("#include <stdio.h>")
p = PackagePacker(str(pkg_dir))
# test missed manifest
with pytest.raises(UnknownManifestError):
p.pack()
# minimal package
pkg_dir.join("library.json").write('{"name": "foo", "version": "1.0.0"}')
pkg_dir.mkdir("include").join("main.h").write("#ifndef")
with fs.cd(str(pkg_dir)):
p.pack()
with tarfile.open(os.path.join(str(pkg_dir), "foo-1.0.0.tar.gz"), "r:gz") as tar:
assert set(tar.getnames()) == set(
[".gitignore", "include/main.h", "library.json", "main.cpp"]
)
def test_filters(tmpdir_factory):
pkg_dir = tmpdir_factory.mktemp("package")
src_dir = pkg_dir.mkdir("src")
src_dir.join("main.cpp").write("#include <stdio.h>")
src_dir.mkdir("util").join("helpers.cpp").write("void")
pkg_dir.mkdir("include").join("main.h").write("#ifndef")
test_dir = pkg_dir.mkdir("tests")
test_dir.join("test_1.h").write("")
test_dir.join("test_2.h").write("")
# test include with remap of root
pkg_dir.join("library.json").write(
json.dumps(dict(name="bar", version="1.2.3", export={"include": "src"}))
)
p = PackagePacker(str(pkg_dir))
with tarfile.open(p.pack(str(pkg_dir)), "r:gz") as tar:
assert set(tar.getnames()) == set(["util/helpers.cpp", "main.cpp"])
# test include "src" and "include"
pkg_dir.join("library.json").write(
json.dumps(
dict(name="bar", version="1.2.3", export={"include": ["src", "include"]})
)
)
p = PackagePacker(str(pkg_dir))
with tarfile.open(p.pack(str(pkg_dir)), "r:gz") as tar:
assert set(tar.getnames()) == set(
["include/main.h", "library.json", "src/main.cpp", "src/util/helpers.cpp"]
)
# test include & exclude
pkg_dir.join("library.json").write(
json.dumps(
dict(
name="bar",
version="1.2.3",
export={"include": ["src", "include"], "exclude": ["*/*.h"]},
)
)
)
p = PackagePacker(str(pkg_dir))
with tarfile.open(p.pack(str(pkg_dir)), "r:gz") as tar:
assert set(tar.getnames()) == set(
["library.json", "src/main.cpp", "src/util/helpers.cpp"]
)
def test_symlinks(tmpdir_factory):
# Windows does not support symbolic links
if WINDOWS:
return
pkg_dir = tmpdir_factory.mktemp("package")
src_dir = pkg_dir.mkdir("src")
src_dir.join("main.cpp").write("#include <stdio.h>")
pkg_dir.mkdir("include").join("main.h").write("#ifndef")
src_dir.join("main.h").mksymlinkto(os.path.join("..", "include", "main.h"))
pkg_dir.join("library.json").write('{"name": "bar", "version": "2.0.0"}')
tarball = pkg_dir.join("bar.tar.gz")
with tarfile.open(str(tarball), "w:gz") as tar:
for item in pkg_dir.listdir():
tar.add(str(item), str(item.relto(pkg_dir)))
p = PackagePacker(str(tarball))
assert p.pack(str(pkg_dir)).endswith("bar-2.0.0.tar.gz")
with tarfile.open(os.path.join(str(pkg_dir), "bar-2.0.0.tar.gz"), "r:gz") as tar:
assert set(tar.getnames()) == set(
["include/main.h", "library.json", "src/main.cpp", "src/main.h"]
)
m = tar.getmember("src/main.h")
assert m.issym()
def test_source_root(tmpdir_factory):
pkg_dir = tmpdir_factory.mktemp("package")
root_dir = pkg_dir.mkdir("root")
src_dir = root_dir.mkdir("src")
src_dir.join("main.cpp").write("#include <stdio.h>")
root_dir.join("library.json").write('{"name": "bar", "version": "2.0.0"}')
p = PackagePacker(str(pkg_dir))
with tarfile.open(p.pack(str(pkg_dir)), "r:gz") as tar:
assert set(tar.getnames()) == set(["library.json", "src/main.cpp"])
def test_manifest_uri(tmpdir_factory):
pkg_dir = tmpdir_factory.mktemp("package")
root_dir = pkg_dir.mkdir("root")
src_dir = root_dir.mkdir("src")
src_dir.join("main.cpp").write("#include <stdio.h>")
root_dir.join("library.json").write('{"name": "foo", "version": "1.0.0"}')
bar_dir = root_dir.mkdir("library").mkdir("bar")
bar_dir.join("library.json").write('{"name": "bar", "version": "2.0.0"}')
bar_dir.mkdir("include").join("bar.h").write("")
manifest_path = pkg_dir.join("remote_library.json")
manifest_path.write(
'{"name": "bar", "version": "3.0.0", "export": {"include": "root/library/bar"}}'
)
p = PackagePacker(str(pkg_dir), manifest_uri="file:%s" % manifest_path)
p.pack(str(pkg_dir))
with tarfile.open(os.path.join(str(pkg_dir), "bar-2.0.0.tar.gz"), "r:gz") as tar:
assert set(tar.getnames()) == set(["library.json", "include/bar.h"])
@ -112,3 +112,67 @@ int main() {
assert "-DTMP_MACRO1" not in build_output assert "-DTMP_MACRO1" not in build_output
assert "-Os" not in build_output assert "-Os" not in build_output
assert str(tmpdir) not in build_output assert str(tmpdir) not in build_output
def test_debug_default_build_flags(clirunner, validate_cliresult, tmpdir):
tmpdir.join("platformio.ini").write(
"""
[env:native]
platform = native
build_type = debug
"""
)
tmpdir.mkdir("src").join("main.c").write(
"""
int main() {
}
"""
)
result = clirunner.invoke(cmd_run, ["--project-dir", str(tmpdir), "--verbose"])
validate_cliresult(result)
build_output = result.output[result.output.find("Scanning dependencies...") :]
for line in build_output.split("\n"):
if line.startswith("gcc"):
assert all(line.count(flag) == 1 for flag in ("-Og", "-g2", "-ggdb2"))
assert all(
line.count("-%s%d" % (flag, level)) == 0
for flag in ("O", "g", "ggdb")
for level in (0, 1, 3)
)
assert "-Os" not in line
def test_debug_custom_build_flags(clirunner, validate_cliresult, tmpdir):
custom_debug_build_flags = ("-O3", "-g3", "-ggdb3")
tmpdir.join("platformio.ini").write(
"""
[env:native]
platform = native
build_type = debug
debug_build_flags = %s
"""
% " ".join(custom_debug_build_flags)
)
tmpdir.mkdir("src").join("main.c").write(
"""
int main() {
}
"""
)
result = clirunner.invoke(cmd_run, ["--project-dir", str(tmpdir), "--verbose"])
validate_cliresult(result)
build_output = result.output[result.output.find("Scanning dependencies...") :]
for line in build_output.split("\n"):
if line.startswith("gcc"):
assert all(line.count(f) == 1 for f in custom_debug_build_flags)
assert all(
line.count("-%s%d" % (flag, level)) == 0
for flag in ("O", "g", "ggdb")
for level in (0, 1, 2)
)
assert all("-O%s" % optimization not in line for optimization in ("g", "s"))
@ -20,6 +20,7 @@ from os.path import basename, dirname, getsize, isdir, isfile, join, normpath
import pytest import pytest
from platformio import util from platformio import util
from platformio.compat import PY2
from platformio.managers.platform import PlatformFactory, PlatformManager from platformio.managers.platform import PlatformFactory, PlatformManager
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
@ -53,6 +54,8 @@ def pytest_generate_tests(metafunc):
for root, _, files in walk(examples_dir): for root, _, files in walk(examples_dir):
if "platformio.ini" not in files or ".skiptest" in files: if "platformio.ini" not in files or ".skiptest" in files:
continue continue
if "zephyr-" in root and PY2:
continue
group = basename(root) group = basename(root)
if "-" in group: if "-" in group:
group = group.split("-", 1)[0] group = group.split("-", 1)[0]
@ -25,8 +25,8 @@ def test_platformio_cli():
def test_ping_internet_ips(): def test_ping_internet_ips():
for ip in util.PING_INTERNET_IPS: for host in util.PING_REMOTE_HOSTS:
requests.get("http://%s" % ip, allow_redirects=False, timeout=2) requests.get("http://%s" % host, allow_redirects=False, timeout=2)
def test_api_internet_offline(without_internet, isolated_pio_home): def test_api_internet_offline(without_internet, isolated_pio_home):
@ -16,8 +16,8 @@ import os
import pytest import pytest
from platformio.exception import UnknownEnvNames
from platformio.project.config import ConfigParser, ProjectConfig from platformio.project.config import ConfigParser, ProjectConfig
from platformio.project.exception import InvalidProjectConfError, UnknownEnvNamesError
BASE_CONFIG = """ BASE_CONFIG = """
[platformio] [platformio]
@ -34,6 +34,7 @@ lib_deps =
Lib1 ; inline comment in multi-line value Lib1 ; inline comment in multi-line value
Lib2 Lib2
lib_ignore = ${custom.lib_ignore} lib_ignore = ${custom.lib_ignore}
custom_builtin_option = ${env.build_type}
[strict_ldf] [strict_ldf]
lib_ldf_mode = chain+ lib_ldf_mode = chain+
@ -54,7 +55,7 @@ lib_ignore = LibIgnoreCustom
[env:base] [env:base]
build_flags = ${custom.debug_flags} ${custom.extra_flags} build_flags = ${custom.debug_flags} ${custom.extra_flags}
lib_compat_mode = ${strict_ldf.strict} lib_compat_mode = ${strict_ldf.lib_compat_mode}
targets = targets =
[env:test_extends] [env:test_extends]
@ -65,7 +66,11 @@ extends = strict_settings
EXTRA_ENVS_CONFIG = """ EXTRA_ENVS_CONFIG = """
[env:extra_1] [env:extra_1]
build_flags = ${custom.lib_flags} ${custom.debug_flags} build_flags =
-fdata-sections
-Wl,--gc-sections
${custom.lib_flags}
${custom.debug_flags}
lib_install = 574 lib_install = 574
[env:extra_2] [env:extra_2]
@ -96,13 +101,10 @@ def config(tmpdir_factory):
def test_empty_config(): def test_empty_config():
config = ProjectConfig("/non/existing/platformio.ini") config = ProjectConfig("/non/existing/platformio.ini")
# unknown section # unknown section
with pytest.raises(ConfigParser.NoSectionError): with pytest.raises(InvalidProjectConfError):
config.getraw("unknown_section", "unknown_option") config.get("unknown_section", "unknown_option")
assert config.sections() == [] assert config.sections() == []
assert config.get("section", "option") is None
assert config.get("section", "option", 13) == 13 assert config.get("section", "option", 13) == 13
@ -111,7 +113,7 @@ def test_warnings(config):
assert len(config.warnings) == 2 assert len(config.warnings) == 2
assert "lib_install" in config.warnings[1] assert "lib_install" in config.warnings[1]
with pytest.raises(UnknownEnvNames): with pytest.raises(UnknownEnvNamesError):
config.validate(["non-existing-env"]) config.validate(["non-existing-env"])
@ -155,6 +157,7 @@ def test_options(config):
"custom_monitor_speed", "custom_monitor_speed",
"lib_deps", "lib_deps",
"lib_ignore", "lib_ignore",
"custom_builtin_option",
] ]
assert config.options(env="test_extends") == [ assert config.options(env="test_extends") == [
"extends", "extends",
@ -165,6 +168,7 @@ def test_options(config):
"custom_monitor_speed", "custom_monitor_speed",
"lib_deps", "lib_deps",
"lib_ignore", "lib_ignore",
"custom_builtin_option",
] ]
@ -176,7 +180,7 @@ def test_has_option(config):
def test_sysenv_options(config): def test_sysenv_options(config):
assert config.get("custom", "extra_flags") is None assert config.getraw("custom", "extra_flags") == ""
assert config.get("env:base", "build_flags") == ["-D DEBUG=1"] assert config.get("env:base", "build_flags") == ["-D DEBUG=1"]
assert config.get("env:base", "upload_port") is None assert config.get("env:base", "upload_port") is None
assert config.get("env:extra_2", "upload_port") == "/dev/extra_2/port" assert config.get("env:extra_2", "upload_port") == "/dev/extra_2/port"
@ -201,6 +205,7 @@ def test_sysenv_options(config):
"custom_monitor_speed", "custom_monitor_speed",
"lib_deps", "lib_deps",
"lib_ignore", "lib_ignore",
"custom_builtin_option",
"upload_port", "upload_port",
] ]
@ -223,10 +228,17 @@ def test_getraw_value(config):
with pytest.raises(ConfigParser.NoOptionError): with pytest.raises(ConfigParser.NoOptionError):
config.getraw("platformio", "monitor_speed") config.getraw("platformio", "monitor_speed")
# default
assert config.getraw("unknown", "option", "default") == "default"
assert config.getraw("env:base", "custom_builtin_option") == "release"
# known # known
assert config.getraw("env:base", "targets") == "" assert config.getraw("env:base", "targets") == ""
assert config.getraw("env:extra_1", "lib_deps") == "574" assert config.getraw("env:extra_1", "lib_deps") == "574"
assert config.getraw("env:extra_1", "build_flags") == "-lc -lm -D DEBUG=1" assert (
config.getraw("env:extra_1", "build_flags")
== "\n-fdata-sections\n-Wl,--gc-sections\n-lc -lm\n-D DEBUG=1"
)
# extended # extended
assert config.getraw("env:test_extends", "lib_ldf_mode") == "chain+" assert config.getraw("env:test_extends", "lib_ldf_mode") == "chain+"
@ -236,7 +248,12 @@ def test_getraw_value(config):
def test_get_value(config): def test_get_value(config):
assert config.get("custom", "debug_flags") == "-D DEBUG=1" assert config.get("custom", "debug_flags") == "-D DEBUG=1"
assert config.get("env:extra_1", "build_flags") == ["-lc -lm -D DEBUG=1"] assert config.get("env:extra_1", "build_flags") == [
"-fdata-sections",
"-Wl,--gc-sections",
"-lc -lm",
"-D DEBUG=1",
]
assert config.get("env:extra_2", "build_flags") == ["-Og"] assert config.get("env:extra_2", "build_flags") == ["-Og"]
assert config.get("env:extra_2", "monitor_speed") == 9600 assert config.get("env:extra_2", "monitor_speed") == 9600
assert config.get("env:base", "build_flags") == ["-D DEBUG=1"] assert config.get("env:base", "build_flags") == ["-D DEBUG=1"]
@ -246,24 +263,29 @@ def test_items(config):
assert config.items("custom") == [ assert config.items("custom") == [
("debug_flags", "-D DEBUG=1"), ("debug_flags", "-D DEBUG=1"),
("lib_flags", "-lc -lm"), ("lib_flags", "-lc -lm"),
("extra_flags", None), ("extra_flags", ""),
("lib_ignore", "LibIgnoreCustom"), ("lib_ignore", "LibIgnoreCustom"),
] ]
assert config.items(env="base") == [ assert config.items(env="base") == [
("build_flags", ["-D DEBUG=1"]), ("build_flags", ["-D DEBUG=1"]),
("lib_compat_mode", "soft"), ("lib_compat_mode", "strict"),
("targets", []), ("targets", []),
("monitor_speed", 9600), ("monitor_speed", 9600),
("custom_monitor_speed", "115200"), ("custom_monitor_speed", "115200"),
("lib_deps", ["Lib1", "Lib2"]), ("lib_deps", ["Lib1", "Lib2"]),
("lib_ignore", ["LibIgnoreCustom"]), ("lib_ignore", ["LibIgnoreCustom"]),
("custom_builtin_option", "release"),
] ]
assert config.items(env="extra_1") == [ assert config.items(env="extra_1") == [
("build_flags", ["-lc -lm -D DEBUG=1"]), (
"build_flags",
["-fdata-sections", "-Wl,--gc-sections", "-lc -lm", "-D DEBUG=1"],
),
("lib_deps", ["574"]), ("lib_deps", ["574"]),
("monitor_speed", 9600), ("monitor_speed", 9600),
("custom_monitor_speed", "115200"), ("custom_monitor_speed", "115200"),
("lib_ignore", ["LibIgnoreCustom"]), ("lib_ignore", ["LibIgnoreCustom"]),
("custom_builtin_option", "release"),
] ]
assert config.items(env="extra_2") == [ assert config.items(env="extra_2") == [
("build_flags", ["-Og"]), ("build_flags", ["-Og"]),
@ -272,6 +294,7 @@ def test_items(config):
("monitor_speed", 9600), ("monitor_speed", 9600),
("custom_monitor_speed", "115200"), ("custom_monitor_speed", "115200"),
("lib_deps", ["Lib1", "Lib2"]), ("lib_deps", ["Lib1", "Lib2"]),
("custom_builtin_option", "release"),
] ]
assert config.items(env="test_extends") == [ assert config.items(env="test_extends") == [
("extends", ["strict_settings"]), ("extends", ["strict_settings"]),
@ -282,6 +305,7 @@ def test_items(config):
("custom_monitor_speed", "115200"), ("custom_monitor_speed", "115200"),
("lib_deps", ["Lib1", "Lib2"]), ("lib_deps", ["Lib1", "Lib2"]),
("lib_ignore", ["LibIgnoreCustom"]), ("lib_ignore", ["LibIgnoreCustom"]),
("custom_builtin_option", "release"),
] ]
@ -315,9 +339,11 @@ board = myboard
] ]
config.save() config.save()
contents = tmpdir.join("platformio.ini").read()
assert contents[-4:] == "yes\n"
lines = [ lines = [
line.strip() line.strip()
for line in tmpdir.join("platformio.ini").readlines() for line in contents.split("\n")
if line.strip() and not line.startswith((";", "#")) if line.strip() and not line.startswith((";", "#"))
] ]
assert lines == [ assert lines == [
@ -376,6 +402,7 @@ def test_dump(tmpdir_factory):
("custom_monitor_speed", "115200"), ("custom_monitor_speed", "115200"),
("lib_deps", ["Lib1", "Lib2"]), ("lib_deps", ["Lib1", "Lib2"]),
("lib_ignore", ["${custom.lib_ignore}"]), ("lib_ignore", ["${custom.lib_ignore}"]),
("custom_builtin_option", "${env.build_type}"),
], ],
), ),
("strict_ldf", [("lib_ldf_mode", "chain+"), ("lib_compat_mode", "strict")]), ("strict_ldf", [("lib_ldf_mode", "chain+"), ("lib_compat_mode", "strict")]),
@ -397,7 +424,7 @@ def test_dump(tmpdir_factory):
"env:base", "env:base",
[ [
("build_flags", ["${custom.debug_flags} ${custom.extra_flags}"]), ("build_flags", ["${custom.debug_flags} ${custom.extra_flags}"]),
("lib_compat_mode", "${strict_ldf.strict}"), ("lib_compat_mode", "${strict_ldf.lib_compat_mode}"),
("targets", []), ("targets", []),
], ],
), ),